by Shachar Shamir
🚀 Automated LinkedIn Post Generator from Article Links (Telegram → AI → Google Sheets → LinkedIn)

This workflow lets you collect article links through a Telegram bot, automatically analyze and summarize them with AI, store everything neatly in Google Sheets, and generate polished LinkedIn posts on demand whenever the user types “generate”. Perfect for creators, marketers, and founders who want to post consistently without spending hours analyzing articles or writing drafts.

🧠 How It Works

1️⃣ User Sends Articles via Telegram
Your Telegram bot is the main input point. Whenever the user drops a link, the workflow:
- Detects the URL
- Fetches the content
- Sends it to AI for analysis
This keeps the process simple.

2️⃣ AI Analyzes & Summarizes the Article
The workflow uses your LLM (OpenAI, Anthropic, etc.) to:
- Summarize the article
- Extract key insights
- Identify main arguments
- Capture tone and context
It produces a clean, structured dataset for each link.

3️⃣ Everything Is Saved into Google Sheets
Each article becomes a new row in your Google Sheet. The sheet serves as your content library with fields like:
- Date
- Title
- Link
- Summary
- Insights
- Commentary
You can save dozens of articles and generate posts from any of them later.

4️⃣ User Requests a Post with “generate”
When the user types “generate”, the workflow will:
- Pull the latest article(s) from Google Sheets (or any selection logic you choose)
- Build a LinkedIn-ready post using AI
- Apply the requested tone/style
- Format it as a clean, professional post
The final post is sent right back to Telegram — ready to copy/paste into LinkedIn.

🛠️ Setup Steps

🔧 1. Create a Telegram Bot
- Go to @BotFather on Telegram
- Create a new bot
- Copy the API token
- Paste the token into the Telegram Trigger node in n8n

🔧 2. Add Your AI Credentials
- Go to Credentials → OpenAI (or your provider)
- Add your API key
- Select this credential in all AI nodes
You can switch to GPT-4o, GPT-4o-mini, or any model you prefer.

🔧 3. Connect Google Sheets
- Go to Credentials → Google
- Authenticate with your Google account
- Make sure the sheet contains the required columns: Date, Title, Link, Summary, Insights, Commentary
You can customize or add additional columns as needed.

🔧 4. Adjust Workflow Logic (Optional)
You can modify:
- How the AI summarizes
- The LinkedIn post style
- How posts are selected (latest, random, specific tone, etc.)
- Whether you store more metadata
- Multi-language support
Everything is modular.

🔧 5. Test the Flow
- Send yourself a link via the Telegram bot
- Check that it appears in Google Sheets
- Type “generate”
- Receive your LinkedIn post instantly

🎉 You’re Ready!
This workflow helps you build a personal content pipeline that:
- Collects links
- Saves ideas
- Summarizes insights
- Generates LinkedIn posts on demand
All directly from your phone, inside Telegram.

If you remix or extend this template, I’d love to see what you build!
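In n8n, the “detect a URL vs. the generate command” branching is typically a small Code node. A minimal sketch of that routing logic — the regex and the returned field names are illustrative, not the template’s exact code:

```javascript
// Route an incoming Telegram message: first URL found → analyze it,
// the literal word "generate" → build a LinkedIn post, anything else → ignore.
// The message text arrives in the Telegram Trigger payload (message.text).
function routeMessage(text) {
  const urlMatch = text.match(/https?:\/\/[^\s]+/);
  if (urlMatch) {
    return { action: "analyze", url: urlMatch[0] };
  }
  if (text.trim().toLowerCase() === "generate") {
    return { action: "generate" };
  }
  return { action: "ignore" };
}
```

A Switch node downstream can then branch on the `action` field.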
by Shadrack
How it works
This AI agent manages your Streak CRM via WhatsApp. Send messages to create contacts, add boxes to pipelines, update stages, link contacts to boxes, retrieve information, and add or fetch tasks — all through natural conversation.

Setup steps
- Connect your Streak API credentials (Basic Auth)
- Configure WhatsApp Business integration
- Add your Streak pipeline keys to the workflow
- Test with a simple command like "create contact John Doe"

Customization tips
- Add custom Streak fields to the AI's tool descriptions
- Modify pipeline stages to match your workflow
- Adjust AI prompts to recognize your team's terminology
- Enable additional Streak operations (comments, tasks, files)

The agent handles even more complex tasks reliably when you give it access to the Streak API documentation.
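Streak authenticates with HTTP Basic Auth, using your API key as the username and an empty password. A sketch of how an HTTP Request tool node’s call to create a box could be assembled — the base URL and endpoint path here are assumptions, so verify them against the current Streak API documentation before use:

```javascript
// Build the request an HTTP node would send to create a Streak box.
// Basic Auth: "apiKey:" (empty password), base64-encoded.
// URL shape is an assumption — check Streak's API docs for the exact path.
function buildCreateBoxRequest(apiKey, pipelineKey, boxName) {
  return {
    method: "POST",
    url: `https://api.streak.com/api/v1/pipelines/${pipelineKey}/boxes`,
    headers: {
      Authorization: "Basic " + Buffer.from(apiKey + ":").toString("base64"),
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ name: boxName }),
  };
}
```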
by BytezTech
Build a WhatsApp AI shopping bot with virtual try-on using Gemini

📌 Overview
This workflow fully automates your T-shirt store's WhatsApp shopping experience using GPT for intent detection, MongoDB Atlas for vector-based product search, Redis for session management, and Google Gemini for AI-powered virtual try-on. It automatically handles customer messages, finds relevant products, processes orders, and generates realistic try-on images — all inside WhatsApp, with no app or website required.

Customers can search for T-shirts, place orders, and virtually try on items in a single conversation:
- Redis ensures fast product caching and session tracking.
- MongoDB Atlas stores the product catalog and orders.
- Google Sheets logs every order automatically.
- Gemini generates realistic try-on images from customer selfies.

This workflow eliminates manual order handling, improves customer experience, and gives store owners full visibility into orders and product searches.

⚙️ How it works
This workflow runs automatically when a customer sends a WhatsApp message.
🔍 Product search
- 💬 Receives the customer message via WhatsApp Business API
- 🧠 GPT classifies the intent as product search, recommendation, or general query
- ⚡ Checks Redis cache for existing results (TTL: 1 hour)
- 🔎 On a cache miss, runs MongoDB Atlas vector search using OpenAI embeddings
- 🛍️ Sends matching products as interactive WhatsApp cards with Order Now and Virtual Try-On buttons

🛒 Order flow
- 👆 Triggered when the customer taps the Order Now button
- 📦 AI agent fetches product details from MongoDB
- 🗃️ Creates a new order document in MongoDB
- 📊 Logs the order to Google Sheets automatically
- ✅ Sends an order confirmation message to the customer via WhatsApp

👗 Virtual try-on flow
- 👆 Triggered when the customer taps the Virtual Try-On button
- 💾 Stores the product ID in Redis (TTL: 10 minutes)
- 📸 Prompts the customer to send a clear front-facing selfie
- 🔍 Gemini validates that exactly one real person is in the photo
- 🖼️ Merges the product image and selfie and generates a realistic try-on image
- 📩 Sends the try-on result back to the customer via WhatsApp
- 🗑️ Clears the Redis context after delivery

🛠 Setup steps
- Import this workflow into n8n
- Connect your WhatsApp Business Cloud API credentials
- Connect your OpenAI API credentials (for embeddings and GPT model)
- Connect your Google Gemini API credentials
- Connect your MongoDB Atlas credentials and create a vector index named ShoppingBot on the product collection
- Connect your Redis credentials
- Connect your Google Drive service account credentials
- Connect your Google Sheets service account credentials
- Import your product catalog with embeddings into the MongoDB product collection
- Activate the workflow

The workflow will run automatically when customers send WhatsApp messages.
🚀 Features

🧠 AI-powered shopping
- 🤖 Automatically classifies customer intent using GPT
- 🔎 Semantic product search using OpenAI embeddings and MongoDB Atlas vector search
- ⚡ Redis caching for ultra-fast repeated search results (TTL: 1 hour)
- 💬 Interactive WhatsApp product cards with Order Now and Virtual Try-On buttons
- 🔄 Sliding window session memory (last 20 messages per user)

🛒 Order management
- 📦 Fully automated order creation saved to MongoDB
- 📊 Automatic order logging to Google Sheets
- 🤖 AI agent handles the complete order flow without manual input
- ✅ Instant order confirmation sent to the customer via WhatsApp

👗 Virtual try-on
- ✨ AI-powered try-on image generation using Google Gemini
- 📷 Selfie validation ensures exactly one real person is in the photo
- 🖼️ Product and selfie images resized and merged before generation
- 📩 Try-on result delivered directly in the WhatsApp conversation
- 🗑️ Redis TTL automatically clears try-on context after delivery

🔐 Security and reliability
- 🛡️ Advanced message validation with spam and XSS protection
- ❌ Unsupported message types rejected with friendly error messages
- 🔁 Retry logic on critical HTTP request nodes
- 📦 Modular workflow architecture for easy customisation and scaling

📋 Requirements
You need the following accounts and credentials:
- 🔧 n8n
- 📱 WhatsApp Business Cloud API
- 🤖 OpenAI API (embeddings and GPT model)
- ✨ Google Gemini API
- 🍃 MongoDB Atlas (with vector index named ShoppingBot on the product collection)
- ⚡ Redis server
- 📁 Google Drive (service account)
- 📊 Google Sheets (service account)

🎯 Benefits
- 🚀 Fully automated WhatsApp shopping experience
- 🙌 No manual order handling required
- 👗 Customers can try on products before buying
- ⚡ Fast product search with Redis caching
- 📊 All orders automatically tracked in Google Sheets
- 💼 Reduces support workload for store owners
- 🕐 Works 24/7 without human intervention

👨‍💻 Author
BytezTech Pvt Ltd
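The cache-aside pattern in the product-search branch can be sketched as follows. A plain `Map` stands in for Redis here so the logic is visible in one place; in the workflow the same GET / SET-with-TTL steps are performed by Redis nodes, and the key naming is illustrative:

```javascript
// Cache-aside lookup: return cached results while fresh, otherwise run the
// expensive search (the MongoDB Atlas vector search in this workflow) and
// cache its results for one hour.
const cache = new Map(); // key -> { value, expiresAt }

function cachedSearch(query, searchFn, ttlSeconds = 3600, now = Date.now()) {
  const key = `search:${query.trim().toLowerCase()}`; // normalize for repeat hits
  const hit = cache.get(key);
  if (hit && hit.expiresAt > now) {
    return { results: hit.value, fromCache: true };
  }
  const results = searchFn(query);
  cache.set(key, { value: results, expiresAt: now + ttlSeconds * 1000 });
  return { results, fromCache: false };
}
```

The same expiry mechanism is what clears the try-on context: the product ID is stored with a 10-minute TTL and simply disappears if the customer never sends a selfie.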
by Cheng Siong Chin
How It Works
This workflow automates end-to-end financial transaction processing for finance teams managing high-volume bank data. It eliminates manual reconciliation by intelligently classifying transactions, detecting anomalies, and generating executive summaries.

The system pulls transaction data from Fable Bank, routes it through multiple AI models (OpenAI GPT-4, NVIDIA NIM) for classification and analysis, reconciles accounts, and distributes formatted reports via email. Finance managers and accounting teams benefit from reduced processing time, improved accuracy, and real-time anomaly detection. The workflow handles transaction categorization, reconciliation schema generation, account matching, journal entry creation, and comprehensive reporting—transforming hours of manual work into minutes of automated processing with AI-enhanced accuracy.

Setup Steps
- Configure Fable Bank API credentials for transaction data access
- Add OpenAI API key for GPT-4 classification and reconciliation models
- Set up NVIDIA NIM credentials for anomaly detection services
- Connect Google Sheets for reconciliation schema storage
- Configure Gmail account for automated report distribution

Prerequisites
- OpenAI API account with GPT-4 access

Use Cases
- Monthly financial close automation
- Daily transaction monitoring for fraud detection

Customization
- Replace Fable Bank with your banking API

Benefits
- Reduces reconciliation time by 90%; eliminates manual data entry errors
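The account-matching step can be pictured as a simple first pass before the AI review: pair each bank transaction with an unmatched ledger entry of the same amount within a date window, and flag the leftovers as candidate anomalies. This is a sketch under assumed field names and a 3-day tolerance — in the template the actual matching rules come from the AI-generated reconciliation schema:

```javascript
// Naive reconciliation pass: same amount, date within toleranceDays.
// Unmatched transactions are handed to the anomaly-detection step.
function reconcile(bankTxns, ledgerEntries, toleranceDays = 3) {
  const msPerDay = 24 * 60 * 60 * 1000;
  const unmatchedLedger = [...ledgerEntries];
  const matched = [];
  const unmatched = [];
  for (const txn of bankTxns) {
    const i = unmatchedLedger.findIndex(
      (e) =>
        e.amount === txn.amount &&
        Math.abs(new Date(e.date) - new Date(txn.date)) <= toleranceDays * msPerDay
    );
    if (i >= 0) {
      matched.push({ txn, entry: unmatchedLedger.splice(i, 1)[0] });
    } else {
      unmatched.push(txn); // candidate anomaly for GPT-4 / NIM review
    }
  }
  return { matched, unmatched };
}
```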
by Rahul Joshi
Description
Automatically analyze incoming lead replies from Google Sheets using Azure OpenAI GPT-4, classify their intent (Demo Request, Pricing, Objection, etc.), and create actionable follow-up tasks in ClickUp — all without manual intervention. Streamline your sales response workflow and never miss a lead again. 🤖📩📈

What This Template Does
- Triggers every 15 minutes to check for new lead replies in Google Sheets. ⏰
- Prepares lead data for AI analysis by standardizing input fields. 🧩
- Uses Azure OpenAI GPT-4 to classify lead intent (Demo Request, Pricing Inquiry, Objection, etc.). 🧠
- Routes leads based on intent to the corresponding follow-up handler. 🔀
- Creates new ClickUp tasks with calculated due dates, descriptions, and pipeline stages. 🗂️
- Adds structured checklists to each task for consistent sales follow-ups. ✅
- Loops through multiple tasks while respecting ClickUp API rate limits. 🔁

Key Benefits
✅ Saves hours of manual lead qualification and task creation.
✅ Ensures no lead reply is ignored or delayed.
✅ Standardizes intent-based follow-ups for sales teams.
✅ Enhances productivity with AI-driven decision logic.
✅ Maintains clear visibility across CRM and task systems.

Features
- 15-minute recurring trigger to monitor new replies.
- AI-powered intent classification using Azure OpenAI GPT-4.
- Multi-category routing logic for personalized next steps.
- Seamless ClickUp integration for automated task generation.
- Smart checklist creation for follow-up management.
- Batch loop processing to avoid rate-limit errors.

Requirements
- n8n instance (cloud or self-hosted).
- Google Sheets OAuth2 credentials with read access.
- Azure OpenAI GPT-4 API credentials.
- ClickUp API token with workspace permissions.

Target Audience
- Sales and marketing teams managing inbound leads. 💼
- Agencies automating client qualification workflows. 🏢
- Startups improving lead follow-up efficiency. 🚀
- Teams integrating AI-driven insights into CRM processes.
🌐 Step-by-Step Setup Instructions
- Connect Google Sheets with your lead replies document. 📊
- Add Azure OpenAI GPT-4 API credentials for intent analysis. 🧠
- Configure ClickUp workspace details — team, space, folder, and list IDs. ⚙️
- Set your preferred trigger interval (default: every 15 minutes). ⏰
- Run a test with sample data to confirm intent mapping and task creation. ✅
- Activate the workflow to automatically classify leads and create ClickUp tasks. 🚀
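The “route by intent, create a task with a calculated due date” step can be sketched like this. The intent labels come from the template above; the handler names and the hours-until-due SLAs are illustrative defaults, not values taken from the workflow:

```javascript
// Map a classified intent to a follow-up handler and a ClickUp due date.
// SLA hours and handler names are hypothetical placeholders.
const INTENT_RULES = {
  "Demo Request":    { handler: "schedule-demo",   dueInHours: 4 },
  "Pricing Inquiry": { handler: "send-pricing",    dueInHours: 8 },
  "Objection":       { handler: "objection-reply", dueInHours: 24 },
};

function buildTask(lead, intent, now = new Date("2024-01-01T00:00:00Z")) {
  const rule = INTENT_RULES[intent] ?? { handler: "manual-review", dueInHours: 48 };
  return {
    name: `[${intent}] Follow up with ${lead.name}`,
    handler: rule.handler,
    dueDate: new Date(now.getTime() + rule.dueInHours * 3600 * 1000).toISOString(),
  };
}
```

Unrecognized intents fall through to a manual-review bucket rather than being dropped, which is what keeps “no lead reply is ignored” true.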
by Geoffroy
This n8n template demonstrates how to automatically generate and publish SEO/AEO-optimized Shopify blog articles from a list of keywords, using AI for content creation, image generation, and metadata optimization.

Who’s it for
Shopify marketers, content teams, and solo founders who want consistent, hands-off blog production with built-in SEO/AEO hygiene and internal linking.

What it does
The workflow picks a keyword from your Google Sheet based on priority, search volume, and difficulty. It then checks your Shopify blog for existing slugs to avoid duplicates, drafts a 900+ word article optimized for SEO/AEO, generates a hero image, creates the article in Shopify, sets SEO metafields (title/description), and logs the result to your Sheets for tracking and future internal links.

How it works
- **Google Sheets → candidate selection:** Reads the **Keywords**, **Links**, and **Published** tabs and ranks keywords by priority → volume → difficulty. (The workflow explains exactly how to set up the Google Sheets.)
- **De-dupe slugs:** Paginates your blog via Shopify GraphQL to collect existing handles and makes sure a different one is used.
- **OpenAI content + image:** Builds a structured prompt (SEO/AEO and internal linking), calls Chat Completions, and uses Image Generation for a hero image.
- **Shopify publish:** Creates the article via REST and updates title_tag / description_tag metafields via GraphQL.
- **Log + link graph:** Appends to the **Published** tab to keep track of posted articles and to the **Links** tab for ongoing internal-link suggestions.

How to set up
- Open **Set – Config** and fill: shopDomain, siteBaseUrl, blogId, blogHandle, sheetId, author. Optional: autoPublish, maxPerRun, tz.
- Create the Google Sheet with Keywords, Links, Published tabs using the provided column structure. I have personally used Semrush to generate that list of keywords.
- Add credentials: Shopify Admin token (Header/Bearer), OpenAI API key, and Google Service Account.
Requirements
- Shopify store with Blog API access
- OpenAI API key
- Google Service Account with access to the Google Sheets API (can be activated here)

How to customize
- Change the cron in Schedule Trigger for different days/times.
- Adjust maxPerRun, autoPublish, language, or any other variables in the "Set – Config" node.
- Adjust the prompt in the "Code – Build Prompt" node.
- Extend the Sheets schema with extra scoring signals if needed.
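The candidate-selection ranking described above (priority → volume → difficulty, skipping slugs that already exist on the blog) can be sketched in a few lines. Column names mirror the sheet structure described here; your exact headers may differ:

```javascript
// Pick the best unpublished keyword: highest priority first, then highest
// search volume, then lowest difficulty. Slugs already on the blog are skipped.
function pickCandidate(keywords, existingSlugs) {
  const taken = new Set(existingSlugs);
  const ranked = keywords
    .filter((k) => !taken.has(k.slug))
    .sort(
      (a, b) =>
        b.priority - a.priority ||   // higher priority wins
        b.volume - a.volume ||       // then higher volume
        a.difficulty - b.difficulty  // then easier keyword
    );
  return ranked[0] ?? null; // null when everything is already published
}
```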
by Robert Breen
This workflow fetches deals and their notes from Pipedrive, cleans up stage IDs into names, aggregates the information, and uses OpenAI to generate a daily summary of your funnel.

⚙️ Setup Instructions

1️⃣ Set Up OpenAI Connection
- Go to OpenAI Platform
- Navigate to OpenAI Billing
- Add funds to your billing account
- Copy your API key into the OpenAI credentials in n8n

2️⃣ Connect Pipedrive
- In Pipedrive → Personal preferences → API → copy your API token (URL shortcut: https://{your-company}.pipedrive.com/settings/personal/api)
- In n8n → Credentials → New → Pipedrive API
- Company domain: {your-company} (the subdomain in your Pipedrive URL)
- API Token: paste the token from step 1 → Save
- In the Pipedrive nodes, select your Pipedrive credential and (optionally) set filters (e.g., owner, label, created time).

🧠 How It Works
- **Trigger**: Workflow runs on manual execution (can be scheduled).
- **Get many deals**: Pulls all deals from your Pipedrive.
- **Code node**: Maps stage_id numbers into friendly stage names (Prospecting, Qualified, Proposal Sent, etc.).
- **Get many notes**: Fetches notes attached to each deal.
- **Combine Notes**: Groups notes by deal, concatenates content, and keeps deal titles.
- **Set Field Names**: Normalizes the fields for summarization.
- **Aggregate for Agent**: Collects data into one object.
- **Turn Objects to Text**: Prepares text data for AI.
- **OpenAI Chat Model + Summarize Agent**: Generates a daily natural-language summary of deals and their current stage.

💬 Example Prompts
- “Summarize today’s deal activity.”
- “Which deals are still in negotiation?”
- “What updates were added to closed-won deals this week?”

📬 Contact
Need help extending this (e.g., send summaries by Slack/Email, or auto-create tasks in Pipedrive)?
📧 rbreen@ynteractive.com
🔗 Robert Breen
🌐 ynteractive.com
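The stage-mapping Code node boils down to a lookup table keyed by `stage_id`. A minimal sketch — the IDs below are placeholders, so substitute the stage IDs from your own pipeline (Settings → Pipelines in Pipedrive); inside an actual n8n Code node you would run this over `$input.all()`:

```javascript
// Map Pipedrive's numeric stage_id to a readable stage name, keeping
// unknown IDs visible rather than silently dropping them.
const STAGE_NAMES = {
  1: "Prospecting",
  2: "Qualified",
  3: "Proposal Sent",
  4: "Negotiation",
  5: "Closed Won",
};

function mapStages(deals) {
  return deals.map((deal) => ({
    ...deal,
    stage_name: STAGE_NAMES[deal.stage_id] ?? `Unknown stage (${deal.stage_id})`,
  }));
}
```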
by Oneclick AI Squad
This workflow creates a self-improving AI agent inside n8n that can understand natural language tasks, plan steps, use tools (HTTP, code, search, …), reflect on results, and continue until the goal is reached — then deliver the final answer.

How it works
- Webhook or manual trigger receives a task description
- LLM creates the initial plan + first tool call (or finishes immediately)
- Loop:
  • Execute the chosen tool
  • Send the observation back to the LLM
  • LLM reflects → decides the next action or finishes
- When finished → format the final answer, save the result, send a Slack notification

Setup steps
- Connect an OpenAI (or Anthropic/Groq/Gemini) credential
- (Optional) Connect a Slack credential for notifications
- Replace the placeholder “Other Tools” Code node with real tool nodes (Switch + HTTP Request, Google Sheets, Code node, etc.)
- Test with simple tasks first:
  • “What is the current weather in Ahmedabad?”
  • “Calculate 17×42 and explain the steps”
- Adjust max iterations (via SplitInBatches or a custom counter) to prevent infinite loops
- Activate the workflow and send a POST request to the webhook with JSON: {"task": "your task here"}

Requirements
- LLM API access (gpt-4o-mini works well for testing)
- Optional: Slack workspace for alerts

Customization tips
- Upgrade to stronger reasoning models (o1-preview, Claude 3.5/3.7 Sonnet, Gemini 2.0)
- Add real tools: browser automation, vector DB lookup, file read/write, calendar
- Improve memory: append the full history or use an external vector store
- Add cost/safety guardrails (max iterations, forbidden actions)

Contact Us
If you need help setting up this workflow, want custom modifications, or have questions about integrating specific tools/services:
🌐 Website: https://www.oneclickitsolution.com/contact-us/
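The plan → act → reflect loop with an iteration cap can be sketched as a plain function. Here `llm` and `tools` are stand-ins for the LLM node and the tool nodes, and the decision format (`{action, input}` vs. `{finish}`) is an illustrative convention, not the workflow’s exact contract:

```javascript
// Agent loop skeleton: ask the LLM for a decision, execute the chosen
// tool, feed the observation back, stop on finish or on the iteration cap.
function runAgent(task, llm, tools, maxIterations = 5) {
  let observation = null;
  for (let i = 0; i < maxIterations; i++) {
    const decision = llm(task, observation); // plan (first pass) or reflect
    if (decision.finish !== undefined) {
      return { answer: decision.finish, iterations: i };
    }
    observation = tools[decision.action](decision.input); // execute tool
  }
  return { answer: null, error: "max iterations reached" }; // safety guardrail
}
```

In n8n the cap corresponds to the SplitInBatches or custom-counter guard mentioned above; without it, a confused model can loop forever.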
by Shinji Watanabe
Who’s it for
Learners, teachers, and content creators who track German vocabulary in Google Sheets and want automatic enrichment with synonyms, example sentences, and basic lexical info—without copy-and-paste.

How it works / What it does
When a new row is added to your sheet (column vocabulary), the workflow looks up the word in OpenThesaurus and checks if any entries are found. If so, an LLM generates a strict JSON object containing: natural_sentence (a clear German example), part_of_speech, translation_ja (a concise Japanese gloss), and level (a CEFR estimate). The JSON is parsed and written back to the same row, keeping your spreadsheet the single source of truth. If no entry is found, the workflow writes a helpful “not found” note.

How to set up
- Connect Google Sheets and select your spreadsheet/tab. Confirm a vocabulary column exists.
- Configure OpenThesaurus (no API key required).
- Add your LLM credentials and keep the prompt’s “JSON only” constraint.
- Rename nodes clearly and add a yellow sticky note with this description.

Requirements
- Access to Google Sheets
- LLM credentials (e.g., OpenAI)
- A tab containing a vocabulary column

How to customize the workflow
- Adjust the If condition (e.g., require terms.length > 1 or fall back to the headword).
- Tweak the LLM prompt for tone, length, or level policy.
- Map extra fields in the Set node; add columns for difficulty tags or usage notes.
- Follow security best practices (no hardcoded secrets in HTTP nodes).
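Before writing the LLM’s output back to the row, it pays to parse and validate it. A sketch of that guard — the required field names match the schema above, while the markdown-fence stripping is a defensive assumption for models that wrap JSON in a code block despite the “JSON only” prompt:

```javascript
// Validate the LLM's "JSON only" response before it touches the sheet.
const REQUIRED_FIELDS = ["natural_sentence", "part_of_speech", "translation_ja", "level"];

function parseEnrichment(raw) {
  // Strip an optional ```json … ``` wrapper some models emit anyway.
  const cleaned = raw.trim().replace(/^```(?:json)?\s*/, "").replace(/\s*```$/, "");
  let data;
  try {
    data = JSON.parse(cleaned);
  } catch {
    return { ok: false, error: "invalid JSON" };
  }
  const missing = REQUIRED_FIELDS.filter((f) => !(f in data));
  return missing.length
    ? { ok: false, error: `missing fields: ${missing.join(", ")}` }
    : { ok: true, data };
}
```

On `ok: false` the workflow can retry the LLM call or write the “not found” note instead of corrupting the row.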
by Roshan Ramani
Company Knowledge Base Assistant

Who's it for
This workflow is designed for companies looking to onboard new employees and interns efficiently. It's perfect for HR teams, team leaders, and organizations that want to provide instant access to company knowledge without manual intervention. Whether you're a startup or an established company, this assistant helps your team find answers quickly from your existing documentation.

What it does
This AI-powered chatbot automatically learns from your company documents stored in Google Drive and provides accurate, contextual answers to employee questions. The system continuously monitors a designated Drive folder, processes new documents, and makes them instantly searchable through a conversational interface.

Key features:
- Automatic document ingestion from Google Drive
- Intelligent search across all company documents
- Conversational interface with memory
- Source citation for answers
- Real-time updates when new documents are added

How it works
The workflow has two main components:
- Document Processing Pipeline: Monitors your Google Drive folder every minute for new files. When a document is added, it's automatically downloaded, split into searchable chunks, converted into vector embeddings, and stored in an in-memory knowledge base.
- Chat Interface: Users send questions via webhook, the AI agent searches the knowledge base for relevant information, maintains conversation history for context, and returns accurate answers with source citations.
Requirements
- Google Drive account with OAuth2 credentials
- Google Service Account for document downloads
- OpenAI API key for embeddings and chat model
- Designated Google Drive folder for company documents

Setup Instructions
1. Configure Google Drive:
   - Set up Google Drive OAuth2 credentials in the "Watch Company Docs Folder" node
   - Set up Google Service Account credentials in the "Fetch New Document" node
   - Select your company documents folder in the trigger node
2. Configure OpenAI:
   - Add your OpenAI API key to both embedding nodes
   - The workflow uses GPT-4 Mini for cost-effective responses
3. Upload Your Documents:
   - Add company handbooks, policies, procedures, and FAQs to the designated Drive folder
   - Documents will be automatically processed within minutes
4. Test the Chat Interface:
   - The webhook endpoint accepts POST requests with this format:
     { "data": "Your question here", "session_id": "unique-user-id" }
5. Integrate with Your Tools:
   - Connect the webhook to Slack, Teams, or your internal chat platform
   - Each user gets their own conversation history via session_id

How to customize
- **Change check frequency**: Adjust the polling interval in "Watch Company Docs Folder" from every minute to hourly or daily
- **Adjust chunk size**: Modify the "Split into Searchable Chunks" node to change how documents are segmented
- **Increase context**: Change the topK parameter in "Search Company Documents" to retrieve more relevant sections
- **Extend memory**: Adjust contextWindowLength in "Conversation History" to remember more previous messages
- **Switch AI model**: Replace GPT-4 Mini with GPT-4 or other models based on your accuracy needs
- **Add filters**: Modify the system prompt to focus on specific departments or document types
- **Custom responses**: Update the system message in "Company Knowledge Assistant" to match your company's tone

Tips for best results
- Use clear, descriptive file names for documents in Drive
- Organize documents by department or topic in subfolders
- Include FAQ documents with common questions and answers
- Regularly update outdated documents to maintain accuracy
- Monitor the assistant's responses and refine the system prompt as needed
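The “Split into Searchable Chunks” step is essentially fixed-size chunking with overlap, so sentences that straddle a boundary remain retrievable from either side. A minimal sketch — the 500-character size and 50-character overlap are illustrative defaults, tuned in the splitter node rather than hard-coded:

```javascript
// Split a document into overlapping fixed-size chunks for embedding.
function chunkText(text, chunkSize = 500, overlap = 50) {
  const chunks = [];
  const step = chunkSize - overlap; // each chunk starts `step` chars after the last
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // final chunk reached the end
  }
  return chunks;
}
```

Larger chunks give the model more context per hit but dilute the embedding; smaller chunks sharpen retrieval but may split answers across results — which is why the template exposes this as a customization point.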
by Cheng Siong Chin
Title: AI Curriculum Modernisation with Learning Outcome and Industry Demand Alignment

How It Works
This workflow automates higher education curriculum analysis and modernisation using a multi-agent AI system. Designed for academic administrators, curriculum designers, and institutional planners, it eliminates manual effort in aligning course content with graduate employment outcomes and industry demand signals.

The pipeline starts by concurrently loading graduate employment data and enrolment patterns and extracting course syllabi from PDFs. These are merged and fed into a Curriculum Knowledge Base using semantic embeddings and text splitting. A Curriculum Modernisation Agent orchestrates two sub-agents: a Learning Outcome Alignment Agent (using semantic retrieval and cognitive load analysis) and an Industry Demand Forecast Agent (querying live employment data). Outputs are parsed and stored as structured analysis results, enabling institutions to make evidence-based curriculum decisions at scale.

Setup Steps
- Add OpenAI or compatible LLM API credentials to all Chat Model and Embedding nodes.
- Connect graduate employment and enrolment data sources.
- Set up vector store credentials for the Curriculum Knowledge Base node.
- Configure the Employment Data Query Tool with your labour market data source or API.
- Update Store Analysis Results with your target storage destination.

Prerequisites
- Vector store (e.g., Pinecone, Qdrant, or Supabase)
- Graduate employment & enrolment data (CSV or DB)
- Course syllabi in PDF format

Use Cases
- Annual curriculum review aligned to graduate employment trends

Customisation
- Swap embedding models for domain-specific academic corpora

Benefits
- Automates labour-intensive curriculum mapping processes
by Rahul Joshi
📊 Description
Enhance content quality, SEO performance, and editorial consistency using an AI-powered optimization engine that blends OpenAI, Google Sheets history, Pinecone knowledge, and real-time SERP intelligence. This workflow transforms rough drafts into polished, SEO-optimized content while preserving the original meaning — and includes human review before final publication. 🚀✍️

What This Template Does
- Step 1: Trigger the optimization from Chat or a manual run. Starts the optimization process using the Chat Trigger node, passing the topic, content ID, and customization parameters. 💬
- Step 2: Retrieve contextual knowledge. Pulls historical versions from Google Sheets and relevant company information from Pinecone vector storage to guide consistent optimization. 📚
- Step 3: Fetch SERP competitor data. Uses SerpAPI to gather ranking competitors, headings, snippets, PAA questions, and search intent to strengthen the optimized draft. 🔍
- Step 4: Run AI content optimization. An AI Agent (GPT-4o-mini) rewrites the draft without starting from scratch, improving structure, SEO, tone, clarity, and keyword coverage. 🤖
- Step 5: Enforce structured JSON output. Ensures the optimized draft follows a strict JSON schema containing title, meta description, sections, keywords, and metadata. 🧩
- Step 6: Request human review in Slack. Sends the optimized draft to Slack and waits for approval (approve/reject). Team members can refine or confirm before finalization. 💬🧑‍💼
- Step 7: Save the approved version back to Google Sheets. Updates or appends a new version in the content_versions sheet with metadata, SEO fields, and version history. 📊
- Step 8: Send success confirmation to Slack. Posts a notification confirming that the approved draft has been published.
🔔 Key Benefits
✓ Eliminates manual editing and SEO refinement
✓ Produces consistent, high-quality, conversion-focused content
✓ Ensures factual accuracy and tone preservation
✓ Enhances content using SERP-based competitor insights
✓ Maintains version history for auditability
✓ Introduces a structured human approval workflow
✓ Fully automated publishing pipeline

Features
- AI-assisted rewrite using GPT-4o-mini
- Google Sheets version retrieval + updating
- Pinecone knowledge base retrieval
- SERP competitor and keyword intelligence
- Slack approval workflow (sendAndWait)
- Structured JSON output enforcement
- Version incrementing & metadata tracking
- Secure credentials management

Requirements
- OpenAI API Key (GPT-4o-mini or higher)
- Google Sheets OAuth2 credentials
- SerpAPI Key
- Slack Bot Token with chat:write
- Pinecone API and vector index
- Pre-created Google Sheet for versioning
- Optional: Existing company knowledge stored in Pinecone

Target Audience
- SEO content teams optimizing blog drafts
- Marketing teams refining landing pages
- Agencies managing editorial workflows
- Enterprises maintaining knowledge-based content
- Writers/editorial teams that need AI assistance + human QA
- Teams that require version-controlled SEO content

Step-by-Step Setup Instructions
- Connect these credentials in n8n: OpenAI, Slack, Google Sheets, Pinecone, SerpAPI. 🔐
- Replace the Google Sheet ID in the Sheets nodes with your own.
- Ensure your Pinecone index exists and contains embeddings.
- Configure the Slack channel ID for approvals and notifications.
- Update the test topic/content ID in the Set Input Parameters node.
- Run a manual test to confirm SERP retrieval, data context, and AI output.
- Deploy and use the Chat Trigger to start generating optimized content on demand.
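The version-incrementing bookkeeping in the save step can be sketched as follows. Column names (`content_id`, `version`) mirror the content_versions sheet described above, but your exact headers may differ:

```javascript
// Build the next row for the content_versions sheet: find the highest
// existing version for this content ID and append version + 1.
function nextVersionRow(rows, contentId, optimized) {
  const versions = rows
    .filter((r) => r.content_id === contentId)
    .map((r) => r.version);
  const next = versions.length ? Math.max(...versions) + 1 : 1; // first version is 1
  return {
    content_id: contentId,
    version: next,
    title: optimized.title,
    meta_description: optimized.meta_description,
    status: "approved",
  };
}
```

Keeping every version as its own row (rather than overwriting) is what makes the audit trail and rollback mentioned in the benefits possible.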