by Satyam Tripathi
**Try It Out!**

This n8n template demonstrates how to build an autonomous AI news agent using Decodo MCP that automatically finds, scrapes, and delivers fresh industry news to your team via Slack. Use cases are many – automated news monitoring for your industry, competitive intelligence gathering, startup monitoring, regulatory updates, research automation, or daily briefings for your organization.

**How it works**
- Define your news topics using the Set node – AI, MCP, web scraping, whatever matters to your business.
- The AI Agent processes those topics using the Gemini Chat Model, determining which tools to use and when.
- Here's where it gets interesting: Decodo MCP gives your AI agent the tools to search Google, scrape websites, and parse content automatically – all while bypassing geo-restrictions and anti-bot measures.
- The agent hunts for fresh articles from the last 48 hours, extracts clean data, and returns structured JSON results.
- Format Results cleans up the AI's messy output and removes duplicates (see the sketch below).
- Your polished news digest gets delivered to Slack with clickable links and summaries.

**How to use**
- Schedule trigger runs daily at 9 AM – adjust the timing or swap in a webhook trigger as needed.
- Customize topics in the Set node to match your industry.
- Scales effortlessly: add more topics, tweak search criteria, done.

**Requirements**
- Decodo MCP credentials (free trial available) – grab the Smithery connection URL with keys and paste it straight into your n8n MCP node. Done.
- Gemini API key for the AI processing – drop it into the Google Gemini Chat Model node and pick whichever Gemini model fits your needs.
- Slack workspace for delivery – n8n's Slack integration docs have you covered.

**What the final output looks like**
Here's what your team receives in Slack every morning:

**Need help?**
Join the Discord or email support@decodo.com for questions. Happy Automating!
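The Format Results step is essentially a clean-and-deduplicate pass over the agent's JSON output before it reaches Slack. A minimal sketch of that kind of logic follows; the `Article` shape, field names, and example data are assumptions for illustration, not the template's actual schema:

```typescript
// Hypothetical shape of one article in the agent's structured JSON output.
interface Article {
  title: string;
  url: string;
  source?: string;
  summary?: string;
}

// Deduplicate by normalized URL and keep the first occurrence,
// roughly what a "Format Results" pass does before posting to Slack.
function formatResults(raw: Article[]): Article[] {
  const seen = new Set<string>();
  const clean: Article[] = [];
  for (const a of raw) {
    const key = a.url.trim().replace(/\/+$/, "").toLowerCase();
    if (!a.title || !a.url || seen.has(key)) continue;
    seen.add(key);
    clean.push({ ...a, title: a.title.trim(), summary: a.summary?.trim() });
  }
  return clean;
}

// Example: build Slack digest lines with clickable links.
const digest = formatResults([
  { title: "New MCP spec released", url: "https://example.com/mcp", summary: "Short summary." },
  { title: "New MCP spec released", url: "https://example.com/mcp/" }, // duplicate, dropped
]).map((a) => `• <${a.url}|${a.title}> ${a.summary ?? ""}`);
console.log(digest.join("\n"));
```

In the actual workflow this logic lives in an n8n Code node, so the input would come from the previous node's items rather than a literal array.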
by Muhammad Nouman
**How it works**

This workflow turns a Google Drive folder into a fully automated YouTube publishing pipeline. Whenever a new video file is added to the folder, the workflow generates all YouTube metadata using AI, uploads the video to your YouTube channel, deletes the original file from Drive, sends a Telegram confirmation, and can optionally post to Instagram and Facebook using permanent system tokens.

High-level flow:
- Detects new video uploads in a specific Google Drive folder.
- Downloads the file and uses AI to generate:
  • a polished first-person YouTube description
  • an SEO-optimized YouTube title
  • high-ranking YouTube tags
- Uploads the video to YouTube with the generated metadata.
- Deletes the original Drive file after upload.
- Sends a Telegram notification with video details.
- (Optional) Posts to Instagram & Facebook using permanent system user tokens.

**Set up steps**

Setup usually takes a few minutes.
- Add Google Drive OAuth2 credentials for the trigger and download/delete nodes.
- Add your OpenAI (or Gemini) API credentials for title/description/tag generation.
- Add YouTube OAuth2 credentials in the YouTube Upload node.
- Add Facebook/Instagram Graph API credentials if enabling cross-posting.
- Replace placeholder IDs (Drive folder ID, Page ID, IG media endpoint).
- Review sticky notes in the workflow—they contain setup guidance and token info.
- Activate the Google Drive trigger to start automated uploads.
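Since the AI-generated title, description, and tags feed straight into the YouTube Upload node, it can help to validate them against YouTube's limits first. A hedged sketch of such a check; the `VideoMetadata` shape is an assumption, and the template may structure its prompt output differently:

```typescript
// Hypothetical shape of the metadata the AI is asked to return for each video.
interface VideoMetadata {
  title: string;        // YouTube caps titles at roughly 100 characters
  description: string;  // polished first-person description
  tags: string[];       // YouTube allows roughly 500 characters of tags in total
}

// Defensive parse of the model's JSON output before it reaches the upload step.
function parseMetadata(aiOutput: string): VideoMetadata {
  const parsed = JSON.parse(aiOutput) as Partial<VideoMetadata>;
  const title = (parsed.title ?? "Untitled upload").slice(0, 100);
  const tags: string[] = [];
  let budget = 500; // rough total-character budget for tags
  for (const t of parsed.tags ?? []) {
    if (t.length + 1 > budget) break;
    tags.push(t);
    budget -= t.length + 1;
  }
  return { title, description: parsed.description ?? "", tags };
}

console.log(parseMetadata('{"title":"My video","description":"Hi!","tags":["n8n","automation"]}'));
```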
by Ludwig
**How it works:**

This workflow automates tagging for WordPress posts using AI:
- Fetch blog post content and metadata.
- Generate contextually relevant tags using AI.
- Verify existing tags in WordPress and create new ones if necessary (see the sketch below).
- Automatically update posts with accurate and optimized tags.

**Set up steps:**

Estimated time: ~15 minutes.
- Configure the workflow with your WordPress API credentials.
- Connect your content source (e.g., RSS feed or manual input).
- Adjust tag formatting preferences in the workflow settings.
- Run the workflow to ensure proper tag creation and assignment.

This workflow is perfect for marketers and content managers looking to streamline their content categorization and improve SEO efficiency.
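The verify-or-create step maps naturally onto the WordPress REST API's /wp-json/wp/v2/tags endpoint. A minimal sketch of that reconciliation, assuming application-password auth and a placeholder site URL; the workflow's own nodes handle this with their configured credentials:

```typescript
// For each AI-suggested tag: reuse an existing WordPress tag if one matches,
// otherwise create it, then return the tag IDs to assign to the post.
// BASE_URL and the auth string below are placeholders.
const BASE_URL = "https://example.com/wp-json/wp/v2";
const AUTH = "Basic " + Buffer.from("user:application-password").toString("base64");

async function resolveTagIds(suggested: string[]): Promise<number[]> {
  const ids: number[] = [];
  for (const name of suggested) {
    // Search existing tags by name.
    const res = await fetch(`${BASE_URL}/tags?search=${encodeURIComponent(name)}`, {
      headers: { Authorization: AUTH },
    });
    const matches = (await res.json()) as { id: number; name: string }[];
    const existing = matches.find((t) => t.name.toLowerCase() === name.toLowerCase());
    if (existing) {
      ids.push(existing.id);
      continue;
    }
    // Create the tag if it does not exist yet.
    const created = await fetch(`${BASE_URL}/tags`, {
      method: "POST",
      headers: { Authorization: AUTH, "Content-Type": "application/json" },
      body: JSON.stringify({ name }),
    });
    ids.push(((await created.json()) as { id: number }).id);
  }
  return ids;
}

// The returned IDs can then be written back to the post via POST /posts/{id} with { tags: ids }.
resolveTagIds(["automation", "n8n"]).then((ids) => console.log(ids));
```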
by Bhavy Shekhaliya
**Overview**

AI-powered workflow that transforms any article URL into platform-optimized social media posts for LinkedIn, Twitter (X), and Reddit. Uses Mozilla Readability for content extraction, multi-agent AI with RAG from a viral LinkedIn post database, and interactive review forms for content refinement before auto-publishing.

Key Capabilities:
- Extracts article content: title, author, text, images, metadata
- Generates LinkedIn posts using a 3-agent system with viral pattern matching
- Creates Twitter threads under 280 characters with article links
- Auto-posts to Reddit with AI-selected flairs
- Interactive review/regeneration workflow with feedback loops
- Auto-publishes with images or links to all platforms

**How It Works**

Stage 1: Article Content Extraction
- Form Submission: User enters article URL (with basic auth protection)
- URL Validation: Checks if valid URL format
- Article Scraping: HTTP request fetches HTML content
- Readability Parsing: Mozilla Readability extracts clean article text (removes ads, navigation, etc.); title, author, excerpt; word count, site name; featured image (from og:image, twitter:image, or first img tag) (see the extraction sketch after this description)
- Error Handling: Returns a user-friendly error if scraping fails

Stage 2: LinkedIn Post Generation (3-Agent System)

Agent 1: LinkedIn Post Strategist
- **Input**: Extracted article content (title, text, author, excerpt)
- **RAG Process**: Queries Supabase vector database for similar viral LinkedIn posts
- **Analysis**: Identifies patterns, hooks, formatting, engagement triggers
- **Output**: Strategic insights and viral content patterns

Agent 2: LinkedIn Post Generator
- **Input**: Article content + strategist insights
- **Process**: Creates post using viral patterns from database
- **Rule**: Must include article URL in post
- **Output**: Draft LinkedIn post

Agent 3: LinkedIn Post Formatter
- **Input**: Generated post
- **Process**: Removes extraneous content, applies Sans Serif Bold Unicode for emphasis (𝗯𝗼𝗹𝗱 𝘁𝗲𝘅𝘁), removes markdown/em dashes, ensures clean formatting
- **Output**: Polished, ready-to-post LinkedIn content

Review Loop:
- User sees the formatted post in a web form
- Options: "Regenerate" or "Continue"
- If regenerate: provide feedback → agent creates a new version
- Second review form with the same options
- After 2 iterations or approval, proceeds to image selection

Stage 3: Image Handling for LinkedIn
- Image Preview: Shows the extracted article image
- User Choice: "Yes" → downloads the image and posts with text + image; "Continue without Image" → posts with text + article link preview
- Auto-Publish: Posts to LinkedIn with the selected format

Stage 4: Twitter (X) Post Generation (parallel process, runs alongside LinkedIn)
- Twitter Agent: Creates a tweet under 280 characters (including spaces), must include the article URL, uses GPT-4.1 or GPT-5 models
- Tweet Review Form: User reviews the generated tweet
- Regeneration Loop (if requested): user provides feedback, the Re-generate Tweet Agent creates a new version, second review form
- Auto-Tweet: Posts with the article image attachment

Stage 5: Reddit Post Automation (parallel process, runs alongside LinkedIn/Twitter)
- Subreddit Selection: User picks from a dropdown (r/n8n, r/mcp, r/technews)
- Flair Retrieval: Fetches available flairs for the selected subreddit via the Reddit API
- AI Flair Selection: GPT-4o-mini analyzes the article title + available flairs and selects the most appropriate flair
- Auto-Post: Submits a link post to Reddit with the title and selected flair

**How To Use**

Prerequisites: API Credentials Required
- OpenAI API: GPT-4.1, GPT-5, GPT-5-mini, GPT-4o-mini access
- Supabase: Vector database with linkedin_post table (from the previous workflow)
- LinkedIn OAuth2: Developer app with posting permissions
- Twitter OAuth2: Developer account with tweet permissions
- Reddit OAuth2: App credentials with submit permissions
- Basic Auth: For form password protection

Setup Steps

1. Configure Form Access
- Open the "On Article Submission" node
- Set up basic auth credentials for form protection
- Get the form URL from the webhook settings

2. Link Vector Database
- Ensure the Supabase vector store has viral LinkedIn posts (use the previous workflow to populate it)
- Verify "LinkedIn Post Vector Store" credentials
- Check that the "Embedding" node has an OpenAI API key

3. Set Up Social Media APIs
- LinkedIn: Configure the "Text + Image" and "Text + Link" nodes, update the person parameter with your LinkedIn profile ID, and add OAuth2 credentials
- Twitter: Configure the "Tweet" and "Re-generated Tweet" nodes and add Twitter OAuth2 credentials
- Reddit: Update the subreddit list in the "Reddit Form" dropdown (customize to your subreddits), configure the "Get Flair" and "Reddit Post" nodes with OAuth2, and update the subreddit name in the "Reddit Post" query parameters

4. Configure AI Models
- Verify all OpenAI credentials in the language model nodes
- Models used: GPT-4.1, GPT-5, GPT-5-mini (adjust based on your access)
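Stage 1 can be reproduced outside n8n with the same libraries the description names. A sketch using jsdom and @mozilla/readability, including the og:image / twitter:image / first-img fallback; the exact fields the workflow passes downstream may differ:

```typescript
// Requires: npm install jsdom @mozilla/readability
import { JSDOM } from "jsdom";
import { Readability } from "@mozilla/readability";

async function extractArticle(url: string) {
  const html = await (await fetch(url)).text();
  const dom = new JSDOM(html, { url });
  const doc = dom.window.document;

  // Readability returns title, byline, textContent, excerpt, siteName, etc.
  const article = new Readability(doc).parse();

  // Featured image fallback chain: og:image -> twitter:image -> first <img>.
  const image =
    doc.querySelector('meta[property="og:image"]')?.getAttribute("content") ??
    doc.querySelector('meta[name="twitter:image"]')?.getAttribute("content") ??
    doc.querySelector("img")?.getAttribute("src") ??
    null;

  return {
    title: article?.title ?? doc.title,
    author: article?.byline ?? null,
    text: article?.textContent ?? "",
    excerpt: article?.excerpt ?? null,
    siteName: article?.siteName ?? null,
    wordCount: (article?.textContent ?? "").split(/\s+/).filter(Boolean).length,
    image,
  };
}

extractArticle("https://example.com/some-article").then(console.log);
```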
by totoma
**Use Cases**

Receive a newsletter featuring curated, contributor-friendly issues from your favorite repositories. By regularly reviewing active issues and new releases, you'll naturally develop stronger habits around open source contribution as your brain starts recognizing these projects as important.

**How It Works**
- Collects the latest issues, comments, and recent commits using the GitHub API (see the sketch below).
- Uses an AI model to select up to three beginner-friendly issues worth contributing to.
- Summarizes each issue—with contribution guidance and relevance insights—using Deepwiki MCP.
- Converts the summaries into HTML and delivers them as an email newsletter.

**Requirements**
- GitHub Personal Access Token
- OpenRouter API Key
- Google App Password
- Make sure your target open-source project is indexed at https://deepwiki.com/{owner}/{repo} (e.g. https://deepwiki.com/vercel/next.js)

**How to Use**
- Update the “Load repo info” node with your target repository’s owner and name (e.g. owner: vercel, repo: next.js).
- Add your GitHub Personal Access Token to the credentials of the “Get Issues from GitHub” node.
- Connect your OpenRouter API key to all models linked to the Agent node.
- Add your Google App Password to the “Send Email” node credentials.
- Enter the same email address (associated with the Google App Password) in both the “to email” and “from email” fields — the newsletter will be sent to this address.

**Customization**
- Adjust the maximum number of contributor-friendly issues retrieved in the “Get Top Fit Issues” node.
- Improve results by tuning the models connected to the Agent node.
- Refine the criteria for “contributor-friendliness” within the “IssueRank Agent” node.

**Cron Setup**
- Replace the manual trigger with a Schedule Trigger node or another scheduling-capable node.
- If you don't have an n8n Cloud account, use this alternative setup: fork the repository and follow the setup instructions.

**Troubleshooting**
- If there is an issue with the AI model’s response, modify the ai_model setting. (If you want to use a free model, search for models containing “free” and choose one of them.)
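The first collection step (latest issues via the GitHub API) boils down to a single REST call. A sketch of that call for the vercel/next.js example; the filtering and field names are illustrative rather than the exact output of the "Get Issues from GitHub" node:

```typescript
// GITHUB_TOKEN is a placeholder for the Personal Access Token used by the node.
const GITHUB_TOKEN = process.env.GITHUB_TOKEN ?? "";

interface IssueSummary {
  number: number;
  title: string;
  labels: string[];
  comments: number;
  url: string;
}

async function fetchRecentIssues(owner: string, repo: string, perPage = 30): Promise<IssueSummary[]> {
  const res = await fetch(
    `https://api.github.com/repos/${owner}/${repo}/issues?state=open&sort=created&direction=desc&per_page=${perPage}`,
    {
      headers: {
        Accept: "application/vnd.github+json",
        Authorization: `Bearer ${GITHUB_TOKEN}`,
      },
    }
  );
  const issues = (await res.json()) as any[];
  return issues
    .filter((i) => !i.pull_request) // the issues endpoint also returns PRs; drop them
    .map((i) => ({
      number: i.number,
      title: i.title,
      labels: i.labels.map((l: any) => (typeof l === "string" ? l : l.name)),
      comments: i.comments,
      url: i.html_url,
    }));
}

fetchRecentIssues("vercel", "next.js").then((list) => console.log(list.slice(0, 3)));
```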
by Intuz
Disclaimer: Community nodes are used, and this template can only be used on self-hosted n8n instances.

This n8n template from Intuz provides a complete solution to automate your entire B2B lead generation pipeline, from discovering recently funded companies to drafting hyper-personalized outreach emails with AI.

**Who's this workflow for?**
- Sales Development Representatives (SDRs)
- Business Development Teams
- Growth Hackers
- Startup Founders
- Marketing Agencies

**How it works**
1. Scrape Funded Companies: The workflow begins by using Apify to scrape a target list of recently funded companies directly from a Crunchbase search.
2. Enrich with Apollo.io: It takes each company and uses the Apollo.io API to find key decision-makers (like VPs, Directors) and enrich their contact information, including finding their email addresses.
3. Populate Google Sheets: All the gathered lead data—company name, contact name, title, email, LinkedIn URL, etc.—is neatly organized and added to a Google Sheet.
4. AI-Personalized Email Crafting: The workflow sends the lead's information to OpenAI (GPT-4) with a highly specialized prompt, instructing it to write a concise, impactful, and hyper-personalized "first touch" cold email.
5. Update Lead List with Email Content: Finally, the unique, AI-generated email is saved back into the Google Sheet alongside the corresponding lead's information, making it ready for you to send.

**Pre-conditions and Requirements**

Before you can successfully execute this workflow, you must have the following accounts, credentials, and assets in place.
1. n8n Instance: You need an active n8n instance (self-hosted).
2. Apify Account & Crunchbase Access:
   - Apify Account: A registered account on Apify.
   - Crunchbase Account: An active, logged-in Crunchbase account (a paid subscription is recommended for accessing detailed search filters).
3. Apollo.io API: You need an Apollo.io plan that includes API access. You can generate the API key from settings.
4. Google Sheet: Create a new Google Sheet to store your leads. The workflow is configured for two tabs: one for raw data ("HealthCare" in the template) and one for email generation ("Company sheet").
5. OpenAI Account: An account with OpenAI with API access and billing set up.

**Setup Instructions**
1. Apify Connection: Connect your Apify account in the Run an Actor node. You'll need an Apify scraper; here's the link. In the Custom Body field, update the search.url with your target Crunchbase discovery URL and provide a valid cookie for authentication.
2. Apollo.io Connection: Connect your Apollo.io account using HTTP Header Authentication in the three Apollo nodes. You will need to provide your API key.
3. Google Sheets Connection: Connect your Google Sheets account. Create a spreadsheet and update the Document ID and Sheet Name in the three Google Sheets nodes to match yours. Ensure your sheet columns are set up to receive the data.
4. OpenAI Connection: Connect your OpenAI account in the Message a model node. The prompt is pre-engineered for high-quality output, but you can tailor it to better fit your specific value proposition.
5. Activate Workflow: Click "Execute workflow" to run the automation manually and watch your AI-powered lead list build itself.

**Customization Guide**

This workflow is a powerful template. To adapt it to your specific business needs, you should review and modify the following nodes.

1. Changing Your Target Companies (The Source)
   - Node: Run an Actor
   - What to change: The search.url parameter inside the customBody.
   - How to do it: Go to Crunchbase and perform a search for your ideal companies (e.g., filter by different funding rounds, industry, location, keywords, etc.). Copy the URL from your browser's address bar after the search results have loaded. Paste this new URL as the value for "search.url" in the node. You can also adjust "count": 10 to pull more or fewer companies per run. Be mindful of Apify and Apollo credit usage.

2. Defining Your Ideal Contact Persona (see the sketch after this guide)
   - Node: Apollo - Get User
   - What to change: The person_seniorities and person_titles arrays in the jsonBody.
   - How to do it:
     1. Seniority: Modify the person_seniorities list to match who you sell to. Examples: ["c_level", "founder"] or ["manager", "contributor"].
     2. Job Titles: This is crucial. Replace the existing list of titles ("engineering", "technology", etc.) with keywords relevant to your target buyer. For example, if you sell to marketing teams, you might use: ["marketing", "demand generation", "growth", "content", "brand"].

3. Configuring Your Google Sheet Destination
   - Nodes: Append or update row in sheet and Update row in sheet
   - What to change: The documentId and sheetName.
   - How to do it: Open your Google Sheet. The documentId is the long string of characters in the URL between /d/ and /edit. Copy and paste it into the "Document ID" field in both nodes. The sheetName (or Sheet ID/gid) needs to be set for your specific tabs. Make sure the sheet names/IDs in the nodes match the tabs in your document.
   - Column Mapping: If you change the column names in your Google Sheet, you must update the column mapping inside these nodes to ensure the data is written to the correct place.

4. Tailoring the AI Email Generation
   - Node: Message a model (OpenAI)
   - What to change: The prompt, the model, and the input variables.
   - How to do it:
     - The Prompt: This is the heart of your outreach. Read the entire prompt carefully and edit it to reflect your company's value proposition, tone of voice, and specific call-to-action.
       - Value Proposition: Change the line "We help them cut that specific infrastructure spend..." to match what your product does. Use a powerful, single data point if you have one.
       - Call-to-Action (CTA): Modify the final question ("Curious if infra efficiency is on your roadmap...") to something that fits your sales process.
       - Tone: Adjust the initial instructions (e.g., "Your tone is that of a peer...") if you want a different style.
     - The Model: The workflow uses gpt-4.1. You can switch to a different model like gpt-4o (potentially better/faster) or gpt-3.5-turbo (much cheaper, but lower quality) depending on your budget and needs.
     - Input Variables: The prompt uses {{ $json['Company Name'] }}, {{ $json['Person Designation'] }}, and {{ $json.Industry }}. If you want to add more personalization (e.g., based on a company's funding amount), you would first need to ensure that data is passed to this node, then add the new variable (e.g., {{ $json['Funding Amount'] }}) into the prompt.

**Connect with us**
- Website: https://www.intuz.com/services
- Email: getstarted@intuz.com
- LinkedIn: https://www.linkedin.com/company/intuz
- Get Started: https://n8n.partnerlinks.io/intuz
- For Custom Workflow Automation, click here: Get Started
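For context on step 2 of the customization guide, here is an illustrative people-search request showing the person_seniorities and person_titles arrays you are asked to edit. Treat the endpoint path, header, and paging fields as assumptions based on Apollo's public API docs and confirm them against your plan before relying on them:

```typescript
// APOLLO_API_KEY is a placeholder for the key configured via HTTP Header Authentication.
const APOLLO_API_KEY = process.env.APOLLO_API_KEY ?? "";

// The two arrays the customization guide tells you to tailor to your buyer persona.
const peopleSearchBody = {
  person_seniorities: ["vp", "director"],
  person_titles: ["marketing", "demand generation", "growth", "content", "brand"],
  page: 1,
  per_page: 5,
};

async function findDecisionMakers() {
  const res = await fetch("https://api.apollo.io/v1/mixed_people/search", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Api-Key": APOLLO_API_KEY,
    },
    body: JSON.stringify(peopleSearchBody),
  });
  return res.json();
}

findDecisionMakers().then((r) => console.log(r));
```

In the workflow itself this body lives in the "Apollo - Get User" node's jsonBody, with the company domain or name filled in per lead from the Crunchbase scrape.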
by Don Jayamaha Jr
Instantly fetch real-time Bitget spot market data directly in Telegram! This workflow integrates the Bitget REST v2 API with Telegram (plus optional AI-powered formatting) to deliver the latest crypto price, order book, candles, and recent trades. Perfect for crypto traders, analysts, and investors who need reliable market data at their fingertips—no API key required.

Sign up for Bitget for 6,200 USDT in rewards to trade: Collect Now

**How It Works**
- A Telegram bot listens for user requests (e.g., BTCUSDT).
- The workflow connects to Bitget public endpoints to fetch:
  - Ticker (latest price & 24h stats)
  - Order book depth (top bids/asks)
  - Recent trades (price, side, volume, timestamp)
  - Candlestick data (1m, 15m, 1h, 4h, 1d)
  - Historical candles (optional, for backfill before endTime)
- A Calculator node derives useful metrics like mid-price and spread (see the sketch below).
- A Think node reshapes raw JSON into Telegram-ready text.
- A splitter ensures reports over 4000 characters are chunked safely.
- The final market insights are delivered instantly back to Telegram.

**What You Can Do with This Agent**
✅ Track live prices & 24h stats for any Bitget spot pair.
✅ Monitor order book liquidity and spreads in real-time.
✅ Analyze candlesticks across multiple timeframes.
✅ Review recent trades to see execution flow.
✅ Fetch historical candles for extended market context.
✅ Receive clean, structured reports with optional AI-enhanced formatting.

**Set Up Steps**
1. Create a Telegram Bot: Use @BotFather to generate a bot token.
2. Configure in n8n: Import Bitget AI Agent v1.02.json into your n8n instance. Add your Telegram credentials (bot token + your Telegram ID in the User Authentication node). Add an OpenAI key if you want AI-powered formatting. (Optional) Add a Bitget API key.
3. Deploy and Test: Send BTCUSDT to your bot. Get live Bitget spot data instantly in Telegram! 🚀

Unlock powerful, real-time Bitget insights in Telegram—zero setup, zero API keys required!

📺 **Setup Video Tutorial**
Watch the full setup guide on YouTube:

🧾 **Licensing & Attribution**
© 2025 Treasurium Capital Limited Company. Architecture, prompts, and trade report structure are IP-protected. No unauthorized rebranding permitted.

🔗 For support: Don Jayamaha – LinkedIn
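The Calculator and splitter steps are simple enough to sketch directly. The example below derives mid-price and spread from the top of the Bitget order book and chunks long reports for Telegram; the endpoint path follows Bitget's public v2 spot docs but should be verified against the current API reference:

```typescript
// Fetch the order book from a public Bitget v2 spot endpoint (no API key needed)
// and compute mid-price and spread, roughly what the Calculator node derives.
async function midPriceAndSpread(symbol: string) {
  const res = await fetch(
    `https://api.bitget.com/api/v2/spot/market/orderbook?symbol=${symbol}&limit=5`
  );
  const { data } = (await res.json()) as {
    data: { asks: [string, string][]; bids: [string, string][] };
  };

  const bestAsk = parseFloat(data.asks[0][0]);
  const bestBid = parseFloat(data.bids[0][0]);
  const mid = (bestAsk + bestBid) / 2;
  const spread = bestAsk - bestBid;

  return { symbol, bestBid, bestAsk, mid, spreadPct: (spread / mid) * 100 };
}

// Telegram caps messages at 4096 characters; the workflow splits at 4000 to stay safe.
function chunkReport(text: string, size = 4000): string[] {
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += size) chunks.push(text.slice(i, i + size));
  return chunks;
}

midPriceAndSpread("BTCUSDT").then((m) => console.log(chunkReport(JSON.stringify(m, null, 2))));
```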
by Robert Breen
Chat to write or reword a blog post. The workflow stores each result in Google Sheets and uses a sub-workflow “Google tool” to count rows per session (your running context). If a session exceeds a row threshold, the flow can branch (e.g., stop or notify).

⚙️ **Setup Instructions**

1️⃣ Set Up OpenAI Connection
- Go to OpenAI Platform
- Navigate to OpenAI Billing
- Add funds to your billing account
- Copy your API key into the OpenAI credentials in n8n

2️⃣ Prepare Your Google Sheet
- Connect your data in Google Sheets
- Use this format: Sample Sheet
- Row 1 = column names (e.g., session, Rows, output)
- Data in rows 2–100 (or more if you prefer)
- In n8n, use Google Sheets OAuth2 → pick your Spreadsheet and Worksheet
- (Optional) You can adapt this to Airtable, Notion, or a database

🧠 **How It Works**
- **Chat Trigger**: Provide a topic (write) or paste existing text (reword).
- **Code Node (“Choose to Write or Edit Blog”)**: Builds a system_prompt + user_prompt and instructs the agent to call the Google tool (sub-workflow) with only the sessionid to count existing rows.
- **Tool Workflow (“google”)**: Fetches rows from the sheet → filters by session → summarizes row count (see the sketch below).
- **Agent (“Blog Writer & Editor”)**: Returns structured JSON (items/rows, session, blog body).
- **Store (Google Sheets)**: Appends { session, Rows, output } to the sheet.
- **If Node**: Example rule: Rows > 3 → branch/limit/notify as needed.

💬 **Example Prompts**
- “Write a 600-word blog about n8n agents with 3 bullet takeaways. Session: abc123.”
- “Reword this post into a concise LinkedIn article. Session: launchQ3:\n<your text here>”
- “Draft a blog intro and 5 SEO headlines on marketing automation. Session: mkt-01.”

📬 **Contact**
Need help tailoring this to Airtable/Notion/DB, or adding auto-publishing?
📧 rbreen@ynteractive.com
🔗 Robert Breen
🌐 ynteractive.com
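The "google" tool workflow and the If node reduce to a per-session row count compared against a threshold. A minimal sketch of that logic, with an assumed row shape matching the sample sheet columns:

```typescript
// Row shape mirrors the sample sheet's column names (session, Rows, output).
interface SheetRow {
  session: string;
  Rows?: number;
  output?: string;
}

// What the sub-workflow effectively returns: how many rows already exist for this session.
function countRowsForSession(rows: SheetRow[], sessionId: string): number {
  return rows.filter((r) => r.session === sessionId).length;
}

const rows: SheetRow[] = [
  { session: "abc123", output: "post 1" },
  { session: "abc123", output: "post 2" },
  { session: "mkt-01", output: "intro" },
];

const count = countRowsForSession(rows, "abc123");
const overLimit = count > 3; // the If node's example rule: Rows > 3 → branch/limit/notify
console.log({ count, overLimit });
```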
by System Admin
Objective: Automatically categorize incoming emails based on existing Gmail labels, or create a new label if none match.

Tools:
- Get message
- Read all labels
- Create label
- Assign label to message...
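A sketch of what the listed tools do in sequence (read all labels, create one if nothing matches, assign it to the message) using the standard Gmail REST endpoints; the access token, message ID, and category name are placeholders:

```typescript
const ACCESS_TOKEN = process.env.GMAIL_ACCESS_TOKEN ?? "";
const BASE = "https://gmail.googleapis.com/gmail/v1/users/me";
const headers = { Authorization: `Bearer ${ACCESS_TOKEN}`, "Content-Type": "application/json" };

async function applyCategory(messageId: string, category: string): Promise<void> {
  // Read all labels.
  const { labels } = (await (await fetch(`${BASE}/labels`, { headers })).json()) as {
    labels: { id: string; name: string }[];
  };
  let label = labels.find((l) => l.name.toLowerCase() === category.toLowerCase());

  // Create the label when no existing one matches the suggested category.
  if (!label) {
    const res = await fetch(`${BASE}/labels`, {
      method: "POST",
      headers,
      body: JSON.stringify({ name: category }),
    });
    label = (await res.json()) as { id: string; name: string };
  }

  // Assign the (existing or newly created) label to the message.
  await fetch(`${BASE}/messages/${messageId}/modify`, {
    method: "POST",
    headers,
    body: JSON.stringify({ addLabelIds: [label.id] }),
  });
}

applyCategory("MESSAGE_ID", "Invoices").catch(console.error);
```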
by Oneclick AI Squad
A fully automated, AI-powered email assistant built in n8n that reads incoming emails, understands their intent and sentiment, classifies them by category, drafts intelligent context-aware replies, and sends them automatically — all without any human intervention. Built using OpenAI + Gmail/SMTP integration for any business or team.

🎯 **What's the Goal?**
Replace slow, manual email handling with an always-on AI email assistant that reads every incoming message, understands what the sender needs, and responds instantly with a professional, personalized reply — escalating only when truly necessary.

💡 **Why Does It Matter?**
Email overload is one of the biggest productivity killers for businesses. Teams spend hours every day reading, triaging, and replying to repetitive messages — support requests, meeting inquiries, sales questions, and more. This workflow automates the entire cycle: inbox → understand → reply → log → escalate, saving hours per day while improving response times from hours to seconds.

⚙️ **How It Works**
- Gmail trigger polls the inbox for new unread emails
- Email content (subject + body) is extracted and cleaned
- AI classifies the email: Support / Sales / Meeting / Complaint / Spam / General
- Sentiment is analyzed: Positive / Neutral / Negative / Urgent
- AI drafts a professional, context-aware reply based on the classification
- If sentiment is Urgent or Negative → escalate to a human via Slack (see the routing sketch below)
- If routine → auto-send reply via Gmail
- Email thread logged to Google Sheets (sender, category, sentiment, reply)
- Slack notification sent with a summary for team awareness
- Label applied in Gmail for organization & tracking

🔧 **Configuration Requirements**
- **Gmail OAuth2** (for reading the inbox and sending replies)
- **OpenAI API key** (for classification, sentiment & reply generation)
- **Google Sheets OAuth2** (for email log & analytics)
- **Slack Bot Token** (for escalation alerts & team summaries)
- Optional: **SMTP credentials** (if using a non-Gmail provider)
- Optional: **CRM webhook** (to sync contacts & interactions)

🚀 **Setup Guide**
1. Import this workflow into your n8n instance
2. Connect credentials: Gmail OAuth2, OpenAI, Google Sheets, Slack
3. Open the Set Email Config node and configure:
   - check_interval_minutes — how often to poll the inbox (recommended: 5)
   - auto_reply_categories — which categories to auto-reply (e.g. Support, General)
   - escalate_categories — which to escalate (e.g. Complaint, Urgent)
   - sender_name — your name or business name for the reply signature
   - log_sheet_id — Google Sheets document ID for logging
   - slack_channel — channel name for escalation alerts
4. Customize reply templates in the Set Reply Templates node
5. Run a test with a sample email using the manual trigger
6. Verify reply quality in the Gmail Sent folder
7. Check the Google Sheets log for the entry
8. Activate the workflow — it will now monitor the inbox automatically
9. Monitor escalation volume in Slack and tune thresholds weekly

📞 **Contact Us**
Need help setting up or customizing this workflow for your business?
👉 https://www.oneclickitsolution.com/contact-us/
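The escalate-versus-auto-reply decision can be sketched as a small routing function driven by the Set Email Config keys described above. The category and sentiment values mirror the description; the exact node logic in the workflow may differ:

```typescript
// Config keys taken from the Set Email Config description.
interface EmailConfig {
  auto_reply_categories: string[];
  escalate_categories: string[];
}

type Sentiment = "Positive" | "Neutral" | "Negative" | "Urgent";

function routeEmail(
  category: string,
  sentiment: Sentiment,
  config: EmailConfig
): "escalate" | "auto_reply" | "log_only" {
  // Urgent or Negative sentiment always goes to a human via Slack.
  if (sentiment === "Urgent" || sentiment === "Negative") return "escalate";
  if (config.escalate_categories.includes(category)) return "escalate";
  if (config.auto_reply_categories.includes(category)) return "auto_reply";
  return "log_only";
}

const config: EmailConfig = {
  auto_reply_categories: ["Support", "General"],
  escalate_categories: ["Complaint"],
};

console.log(routeEmail("Support", "Neutral", config)); // "auto_reply"
console.log(routeEmail("Sales", "Urgent", config));    // "escalate"
```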
by Tony Adijah
**Who is this for**
This workflow is ideal for marketers, product managers, competitive intelligence teams, and anyone who needs to track changes on web pages — whether it's competitor pricing, job postings, policy updates, product pages, or any content that matters to your business.

**What this workflow does**
It automatically monitors a list of URLs on a schedule, fetches each page, extracts clean text content, compares it against the previous snapshot stored in Google Sheets, generates a detailed line-by-line diff with change percentage and severity rating, and sends instant alerts via Telegram and email when changes are detected.

**How it works**
- Schedule Trigger checks your target URLs at a configurable interval (default: every 4 hours).
- URL List node defines which pages to monitor — easily add or remove URLs without touching other nodes.
- HTTP Request fetches the current HTML content of each page with browser-like headers.
- Content Extractor strips HTML tags, scripts, styles, and navigation to get clean readable text, plus extracts the page title and meta description.
- Load Previous Snapshot reads the last saved version from Google Sheets for comparison.
- Diff Engine compares current vs. previous content line-by-line, calculates a change percentage, assigns a severity level (low/medium/high/critical), and generates a human-readable diff summary showing exactly what was added or removed (see the sketch below).
- Change Filter only passes through pages that actually changed — no noise, no false alerts.
- Save Snapshot stores the new version in Google Sheets for the next comparison cycle.
- Telegram + Email Alerts send formatted notifications with the diff summary, change percentage, severity, and a direct link to the page.

**Setup steps**
1. Add URLs — Edit the "URL List" code node and add your target URLs. Each entry needs a name (friendly label) and url (full URL). Optionally add a selector keyword to focus on specific page sections.
2. Google Sheets — Create a new spreadsheet with a sheet named "website" and columns: Name, url, selector, pageTitle, metaDescription, cleanText, contentHash, contentLength, fetchedAt, httpStatus. Connect your Google Sheets OAuth credential and set the spreadsheet ID in both Sheet nodes.
3. Telegram — Create a bot via @BotFather, get your Bot Token and Chat ID. Connect the Telegram credential and set your Chat ID in the Telegram Alert node.
4. Email (optional) — Connect your SMTP or Gmail credential in the Email Alert node and set your from/to addresses.
5. Schedule — Adjust the cron interval in the trigger node (every 1 hour for critical pages, every 24 hours for low-priority).
6. First Run — Run the workflow manually once to save baseline snapshots. Change detection begins from the second run onward.

**Requirements**
- Google Sheets account with OAuth credentials
- Telegram bot (created via @BotFather) with bot token and chat ID
- SMTP or Gmail credentials for email alerts (optional)
- n8n instance (cloud or self-hosted)

**How to customize**
- Add CSS selector keywords in the URL List to monitor only specific sections of a page (e.g., pricing tables, job listings).
- Increase check frequency to every 15 minutes for critical monitoring.
- Add Slack, Discord, or WhatsApp as additional alert channels by branching from the Save Snapshot node.
- Use AI (OpenAI, Ollama) to summarize changes in plain language instead of raw diff output.
- Monitor API endpoints (JSON responses) by adjusting the Content Extractor logic.
- Add multiple sheets for different monitoring categories (competitors, pricing, legal, careers).
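A minimal sketch of the Diff Engine's core: a line-by-line comparison that yields a change percentage, a severity rating, and a human-readable summary. The severity thresholds are illustrative assumptions, not the workflow's exact cutoffs:

```typescript
type Severity = "low" | "medium" | "high" | "critical";

function diffContent(previous: string, current: string) {
  // Compare trimmed, non-empty lines between the saved snapshot and the fresh fetch.
  const prevLines = new Set(previous.split("\n").map((l) => l.trim()).filter(Boolean));
  const currLines = new Set(current.split("\n").map((l) => l.trim()).filter(Boolean));

  const added = [...currLines].filter((l) => !prevLines.has(l));
  const removed = [...prevLines].filter((l) => !currLines.has(l));

  // Change percentage relative to the previous snapshot size.
  const total = Math.max(prevLines.size, 1);
  const changePct = ((added.length + removed.length) / total) * 100;

  // Illustrative severity cutoffs.
  const severity: Severity =
    changePct > 50 ? "critical" : changePct > 20 ? "high" : changePct > 5 ? "medium" : "low";

  const summary = [...added.map((l) => `+ ${l}`), ...removed.map((l) => `- ${l}`)].join("\n");

  return { changePct: Number(changePct.toFixed(1)), severity, summary };
}

console.log(
  diffContent("Price: $49/mo\nFree trial: 14 days", "Price: $59/mo\nFree trial: 14 days")
);
```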
by Avkash Kakdiya
**How it works**

This workflow monitors user usage via a webhook and automatically triggers an upsell process when limits are exceeded. It formats incoming data, generates a personalized email using AI, and sends it to the user. The workflow then logs the activity in HubSpot and notifies the team via Slack. This ensures timely engagement and improves upgrade conversion rates without manual intervention.

**Step-by-step**

**Trigger & prepare data**
- Webhook – Receives real-time user usage data from your app.
- Edit Fields – Extracts and formats key fields like email, plan, and usage (see the sketch below).

**Generate and send AI email**
- AI Agent – Creates a personalized upsell message based on usage data.
- Groq Chat Model – Sub-node that powers AI text generation.
- Send a message (Gmail) – Sends the generated email to the user.

**Update CRM with activity**
- Search contacts (HubSpot) – Finds the user in HubSpot using email.
- HTTP Request – Logs an upsell note/activity in the contact record.

**Notify team for follow-up**
- Send a message (Slack) – Alerts the team with user and usage details.

**Why use this?**
- Automates upsell outreach at the perfect moment of high intent
- Reduces manual monitoring of user usage and limits
- Ensures CRM is always updated with customer activity
- Improves team visibility with real-time Slack notifications
- Increases upgrade conversions with personalized AI messaging
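A sketch of the trigger-and-prepare stage: normalizing the webhook payload and confirming usage actually exceeds the plan limit before the AI Agent drafts the upsell email. The field names (usage, limit) are assumptions; adapt them to whatever your app sends:

```typescript
// Hypothetical shape of the webhook payload from your app.
interface UsagePayload {
  email: string;
  plan: string;
  usage: number;
  limit: number;
}

// Normalize the raw body and return null when no upsell should be triggered.
function prepareUpsell(
  body: Record<string, unknown>
): (UsagePayload & { overagePct: number }) | null {
  const payload: UsagePayload = {
    email: String(body.email ?? ""),
    plan: String(body.plan ?? "free"),
    usage: Number(body.usage ?? 0),
    limit: Number(body.limit ?? 0),
  };
  if (!payload.email || payload.limit <= 0 || payload.usage <= payload.limit) return null;
  return {
    ...payload,
    overagePct: Math.round(((payload.usage - payload.limit) / payload.limit) * 100),
  };
}

console.log(prepareUpsell({ email: "user@example.com", plan: "starter", usage: 1200, limit: 1000 }));
// → { email: 'user@example.com', plan: 'starter', usage: 1200, limit: 1000, overagePct: 20 }
```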