by Christian Lutz
How it works
This workflow automates the delivery of personalized, AI-generated reports or roadmaps for new leads. When someone submits their information through a form, the workflow:
- Captures and stores the lead data.
- Uses an AI model to generate a customized report or roadmap.
- Formats the output into a professional, email-ready HTML document (see the sketch below).
- Sends the report automatically to the lead via email.
- Optionally sends internal notifications (e.g., via chat or email) for tracking and follow-up.

The process eliminates manual work and ensures every lead receives instant, high-quality output tailored to their input.

Setup steps
- Webhook – Connect your form or website to the webhook endpoint to receive lead data.
- Data Table – Create or link a table to store incoming leads and track delivery status.
- AI Configuration – Add your OpenAI (or compatible) API credentials and customize prompts for your desired output.
- Email Setup – Configure SMTP credentials and define sender/recipient addresses for report delivery.
- Notifications – Optionally connect a chat or messaging service (e.g., Telegram) for internal alerts.
- Activation – Test the workflow, confirm the data flow and email delivery, then activate it for live use.

This workflow transforms manual lead engagement into a fully automated, AI-driven experience that delivers instant, personalized value to every new contact.
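For readers who want to see what the "format into HTML and email it" steps amount to outside of n8n, here is a minimal Python sketch. It assumes the AI step already returned plain text; the field names, sender address, and SMTP host are placeholders, not values taken from this template.

```python
# Minimal sketch of the "format as HTML + email" steps. The sender address and
# SMTP host below are illustrative placeholders; use your real SMTP credentials.
import smtplib
from email.mime.text import MIMEText

def send_report(lead_email: str, lead_name: str, report_text: str) -> None:
    # Wrap the AI-generated text in a simple, email-ready HTML shell.
    html = f"""
    <html><body style="font-family:Arial,sans-serif">
      <h2>Your personalized roadmap, {lead_name}</h2>
      <div>{report_text.replace(chr(10), '<br>')}</div>
    </body></html>
    """
    msg = MIMEText(html, "html")
    msg["Subject"] = "Your personalized report"
    msg["From"] = "reports@example.com"       # placeholder sender
    msg["To"] = lead_email

    with smtplib.SMTP("smtp.example.com", 587) as smtp:   # placeholder SMTP host
        smtp.starttls()
        smtp.login("reports@example.com", "app-password")  # keep credentials in n8n, not code
        smtp.send_message(msg)
```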
by Takuya Ojima
Who’s it for
Remote and distributed teams that schedule across time zones and want to avoid meetings landing on public holidays—PMs, CS/AM teams, and ops leads who own cross-regional calendars.

What it does / How it works
The workflow checks next week’s Google Calendar events, compares event dates against public holidays for selected country codes, and produces a single Slack digest with any conflicts plus suggested alternative dates.
Core steps: Workflow Configuration (Set) → Fetch Public Holidays (via a public holiday API such as Calendarific/Nager.Date) → Get Next Week Calendar Events (Google Calendar) → Detect Holiday Conflicts (compare dates) → Generate Reschedule Suggestions (find the nearest business day that isn’t a holiday or weekend; see the sketch below) → Format Slack Digest → Post Slack Digest.

How to set up
- Open Workflow Configuration (Set) and edit: countryCodes, calendarId, slackChannel, nextWeekStart, nextWeekEnd.
- Connect your own Google Calendar and Slack credentials in n8n (no hardcoded keys).
- (Optional) Adjust the Trigger to run daily or only on Mondays.

Requirements
- n8n (Cloud or self-hosted)
- Google Calendar read access to the target calendar
- Slack app with permission to post to the chosen channel
- A public-holiday API (no secrets needed for Nager.Date; Calendarific requires an API key)

How to customize the workflow
- Time window: Change nextWeekStart/End to scan a different period.
- Holiday sources: Add or swap APIs; merge multiple regions.
- Suggestion logic: Tweak the look-ahead window or rules (e.g., skip Fridays).
- Output: Post per-calendar messages, DM owners, or create tentative reschedule events automatically.
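The reschedule-suggestion step boils down to walking forward from the conflicting date until a day is neither a weekend nor in the holiday set. A minimal sketch of that rule, assuming holidays arrive as ISO date strings from the holiday API; the function name and the look-ahead limit are illustrative, not taken from the workflow:

```python
from datetime import date, timedelta

def suggest_reschedule(conflict_day: date, holidays: set[str], look_ahead: int = 14) -> date | None:
    """Return the nearest later business day that is not a public holiday."""
    candidate = conflict_day
    for _ in range(look_ahead):
        candidate += timedelta(days=1)
        is_weekend = candidate.weekday() >= 5           # 5 = Saturday, 6 = Sunday
        is_holiday = candidate.isoformat() in holidays  # holidays as "YYYY-MM-DD" strings
        if not is_weekend and not is_holiday:
            return candidate
    return None  # nothing suitable inside the look-ahead window

# Example: 2025-01-01 is a public holiday in many of the configured regions
print(suggest_reschedule(date(2024, 12, 31), {"2025-01-01"}))  # -> 2025-01-02
```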
by vanhon
Split Test AI Prompts Using Supabase & Langchain Agent
This workflow allows you to A/B test different prompts for an AI chatbot powered by Langchain and OpenAI. It uses Supabase to persist session state and randomly assigns users to either a baseline or alternative prompt, ensuring consistent prompt usage across the conversation.

🧠 Use Case
Prompt optimization is crucial for maximizing the performance of AI assistants. This workflow helps you run controlled experiments on different prompt versions, giving you a reliable way to compare performance over time.

⚙️ How It Works
When a message is received, the system checks whether the session already exists in the Supabase table. If not, it randomly assigns the session to either the baseline or alternative prompt (see the sketch below). The selected prompt is passed into a Langchain Agent using the OpenAI Chat Model. Postgres is used as chat memory for multi-turn conversation support.

🧪 Features
- Randomized A/B split test per session
- Supabase database for session persistence
- Langchain Agent + OpenAI GPT-4o integration
- PostgreSQL memory for maintaining chat context
- Fully documented with sticky notes

🛠️ Setup Instructions
1. Create a Supabase table named split_test_sessions with the following columns:
   - session_id (text)
   - show_alternative (boolean)
2. Add credentials for:
   - Supabase
   - OpenAI
   - PostgreSQL (for chat memory)
3. Modify the "Define Path Values" node to set your baseline and alternative prompts.
4. Activate the workflow.
5. Send messages to test both prompt paths in action.

🔄 Next Steps
- Add tracking for conversions or feedback scores to compare outcomes.
- Modify the prompt content or model settings (e.g. temperature, model version).
- Expand to multi-variant tests beyond A/B.

📚 Learn More
How This Workflow Uses Supabase + OpenAI for Prompt Testing
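The core of the split test is "look up the session, assign once, then reuse the stored flag". A minimal sketch of that logic, with a plain dict standing in for the split_test_sessions Supabase table; the column names match the setup above, and the prompt texts are illustrative:

```python
import random

# Stand-in for the Supabase split_test_sessions table: session_id -> show_alternative
split_test_sessions: dict[str, bool] = {}

BASELINE_PROMPT = "You are a helpful assistant."                 # illustrative
ALTERNATIVE_PROMPT = "You are a concise, friendly assistant."    # illustrative

def prompt_for_session(session_id: str) -> str:
    """Assign the session to a variant on first contact, then stay consistent."""
    if session_id not in split_test_sessions:
        # 50/50 random assignment, persisted so later turns reuse the same prompt
        split_test_sessions[session_id] = random.random() < 0.5
    show_alternative = split_test_sessions[session_id]
    return ALTERNATIVE_PROMPT if show_alternative else BASELINE_PROMPT

# Every call for the same session returns the same prompt
print(prompt_for_session("abc-123"))
print(prompt_for_session("abc-123"))
```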
by Robert Breen
Use this template to upload an image, run a first-pass OpenAI Vision analysis, then re-attach the original file (binary/base64) to the next step using a Merge node. The pattern ensures your downstream AI Agent (or any node) can access both the original file (data) and the first analysis result (content) at the same time.

✅ What this template does
- **Collects an image file** via **Form Trigger** (binary field labeled data)
- **Analyzes the image** with **OpenAI Vision** (GPT-4o) using **base64** input
- **Merges** the original upload and the analysis result (combine by position) so the next node has **both**
- **Re-analyzes/uses** the image alongside the first analysis in an **AI Agent** step

🧩 How it works (Node-by-node)
- **Form Trigger** – Presents a simple upload form and emits a binary/base64 field named data.
- **Analyze image (OpenAI Vision)** – Reads the same data field as base64 and runs image analysis with GPT-4o. The node outputs a text content (first-pass analysis).
- **Merge (combine by position)** – Combines the two branches so the next node receives both the original upload (data) and the analysis (content) on the same item.
- **AI Agent** – Receives data + content together. The prompt includes the original image (=data) and the first analysis ({{$json.content}}) to compare or refine results.
- **OpenAI Chat Model** – Provides the language model for the Agent (wired as ai_languageModel).

🛠️ Setup Instructions (from the JSON)
> Keep it simple: mirror these settings and you’re good to go.

1) Form Trigger (n8n-nodes-base.formTrigger)
- **Path:** d6f874ec-6cb3-46c7-8507-bd647c2484f0 (you can change this)
- **Form Title:** Image Document Upload Form
- **Description:** Upload an image document for AI analysis
- **Form Fields:** Label: data, Type: file
- **Output:** emits a binary/base64 field named **data**.

2) Analyze image (@n8n/n8n-nodes-langchain.openAi)
- **Resource:** image
- **Operation:** analyze
- **Model:** gpt-4o
- **Text:** =data (use the uploaded file field)
- **Input Type:** base64
- **Credentials:** OpenAI (use your stored **OpenAI API** credential)

3) Merge (n8n-nodes-base.merge)
- **Mode:** combine
- **Combine By:** combineByPosition
- Connect Form Trigger → Merge (input 2)
- Connect Analyze image → Merge (input 1)
- This ensures the original file (data) and the analysis (content) line up on the same item.

4) AI Agent (@n8n/n8n-nodes-langchain.agent)
- **Prompt Type:** define
- **Text:**
- **System Message:** analyze the image again and see if you get the same result.
- **Receives:** merged item containing data + content.

5) OpenAI Chat Model (@n8n/n8n-nodes-langchain.lmChatOpenAi)
- **Model:** gpt-4.1-mini
- **Wiring:** connect as **ai_languageModel** to the **AI Agent**
- **Credentials:** same OpenAI credential as above

> Security Note: Store API keys in Credentials (do not hardcode keys in nodes).

🧠 Why “Combine by Position” fixes the binary issue
Some downstream nodes lose access to the original binary once a branch processes it. By merging the original branch (with data) and the analysis branch (with content) by position, you restore a single item with both fields, so the next step can use the image again while referencing earlier analysis. A small sketch of this pairing follows the Customize section below.

🧪 Test Tips
- Upload a JPG/PNG and execute the workflow from the Form Trigger preview.
- Confirm Merge output contains both data (binary/base64) and content (text).
- In the AI Agent, log or return both fields to verify availability.

🔧 Customize
- Swap GPT-4o for another Vision-capable model if needed.
- Extend the AI Agent to extract structured fields (e.g., objects detected, text, brand cues).
- Add a Router after Merge to branch into storage (S3, GDrive) or notifications (Slack, Email).
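Outside of n8n, "combine by position" is just zipping two item lists index by index and merging each pair into one object. A minimal Python sketch of the idea; the item shapes are simplified, since real n8n items carry binary data separately from their json fields:

```python
def combine_by_position(analysis_items: list[dict], upload_items: list[dict]) -> list[dict]:
    """Pair items index-by-index and merge their fields into a single item."""
    merged = []
    for analysis, upload in zip(analysis_items, upload_items):
        merged.append({**upload, **analysis})  # later keys win on conflict
    return merged

uploads = [{"data": "<base64 image bytes>"}]             # branch 1: original upload
analyses = [{"content": "A photo of a red bicycle."}]    # branch 2: Vision result
print(combine_by_position(analyses, uploads))
# [{'data': '<base64 image bytes>', 'content': 'A photo of a red bicycle.'}]
```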
📝 Requirements
- n8n (cloud or self-hosted) with web UI access
- **OpenAI** credential configured (Vision support)

🩹 Troubleshooting
- **Binary missing downstream?** Ensure **Merge** receives **both** branches and is set to combineByPosition.
- **Wrong field name?** The **Form Trigger** upload field must be labeled **data** to match node expressions.
- **Model errors?** Verify your **OpenAI** credential and that the chosen model supports **image analysis**.

💬 Sticky Note (included in the workflow)
> “Use Binary Field after next step” — This workflow demonstrates how to preserve and reuse an uploaded file (binary/base64) after a downstream step by using a Merge node (combineByPosition). A user uploads an image via Form Trigger → the image is analyzed with OpenAI Vision → results are merged back with the original upload so the next AI Agent step can access both the original file (data) and the first analysis (content) at the same time.

📬 Contact
Need help customizing this (e.g., filtering by campaign, sending reports by email, or formatting your PDF)?
📧 rbreen@ynteractive.com
🔗 https://www.linkedin.com/in/robert-breen-29429625/
🌐 https://ynteractive.com
by Bohdan Saranchuk
This n8n template automates your customer support workflow by connecting Gmail, OpenAI, Supabase, and Slack. It listens for new incoming emails, classifies them using AI, routes them to the appropriate Slack channel based on category (e.g., support or new requests), logs each thread to Supabase for tracking, and marks the email as read once processed.

Good to know
• The OpenAI API is used for automatic email classification, which incurs a small per-request cost. See OpenAI Pricing for up-to-date info.
• You can easily expand the categories or connect more Slack channels to fit your workflow.
• The Supabase integration ensures you don’t process the same thread twice.

How it works
1. Gmail Trigger checks for unread emails.
2. Supabase Get Row verifies if the thread already exists.
3. If it’s a new thread, the OpenAI node classifies the email into categories such as “support” or “new-request” (a classification sketch follows this section).
4. The Switch node routes messages to the correct Slack channel based on classification.
5. Supabase Create Row logs thread details (sender, subject, IDs) to your database.
6. Finally, the Gmail node marks the message as read to prevent duplication.

How to use
• The workflow uses a manual Gmail trigger by default, but you can adjust the polling frequency.
• Modify category names or Slack channels to match your internal setup.
• Extend the workflow to include auto-replies or ticket creation in your CRM.

Requirements
• Gmail account (with OAuth2 credentials)
• Slack workspace (with channel access)
• OpenAI account for classification
• Supabase project for storing thread data

Customizing this workflow
Use this automation to triage incoming requests, route sales leads to specific teams, or even filter internal communications. You can add nodes for auto-responses, CRM logging, or task creation in Notion or ClickUp.
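As a rough illustration of the classification step outside of n8n, here is a minimal sketch using the OpenAI Python SDK. The category list, model name, and prompt wording are assumptions for illustration; the template's actual categories live in its OpenAI node.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

CATEGORIES = ["support", "new-request", "other"]  # illustrative category set

def classify_email(subject: str, body: str) -> str:
    """Return one label from CATEGORIES for an incoming email."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat-capable model works here
        messages=[
            {"role": "system",
             "content": f"Classify the email into exactly one of: {', '.join(CATEGORIES)}. "
                        "Reply with the label only."},
            {"role": "user", "content": f"Subject: {subject}\n\n{body}"},
        ],
    )
    label = response.choices[0].message.content.strip().lower()
    return label if label in CATEGORIES else "other"

print(classify_email("Login broken", "I can't sign in since this morning."))
```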
by Wessel Bulte
Generate Weekly n8n Execution Report and Email Summary

Description:

How it works
- Automatically runs every 7 days to pull all n8n workflow executions from the past week
- Merges execution data with workflow information to provide context
- Generates a professional HTML report with execution statistics (errors, successes, waiting status); a sketch of the aggregation follows this section
- Sends the formatted report with Outlook or Gmail

Set up steps
1. Configure n8n API Credential
   - Go to your n8n instance → Settings → API
   - Create a new API token with read access to workflows and executions
   - In this workflow, add a new "n8n" credential and paste your API token
   - This credential is used by two nodes: "Get all Workflows" and "Get all previous executions"
2. Connect Email Services
   - Configure your Outlook credential in the "Send a message outlook" node
   - Configure your Gmail credential in the "Send a message gmail" node
   - Set your preferred email recipients in both nodes
3. Adjust Schedule (Optional)
   - By default, the workflow runs every 7 days
   - Edit the "Schedule Trigger" node to change the interval if needed

Key features
- Tracks workflow execution status and runtime metrics
- Calculates average and total runtime for each status type
- Provides a visual HTML report with color-coded status indicators
- Dual email delivery (Outlook + Gmail options)
- Requires only n8n API credentials (no external API keys needed)

Need Help
🔗 LinkedIn – Wessel Bulte
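The reporting step is essentially a group-by over executions: count runs per status, compute total and average runtime, then render the result as HTML. A minimal sketch of that aggregation; the field names (status, startedAt, stoppedAt) are assumptions modeled on typical n8n execution data, and the HTML is deliberately bare:

```python
from collections import defaultdict
from datetime import datetime

def summarize(executions: list[dict]) -> dict[str, dict]:
    """Group executions by status and compute count, total and average runtime (seconds)."""
    buckets: dict[str, list[float]] = defaultdict(list)
    for ex in executions:
        started = datetime.fromisoformat(ex["startedAt"])
        stopped = datetime.fromisoformat(ex["stoppedAt"])
        buckets[ex["status"]].append((stopped - started).total_seconds())
    return {
        status: {
            "count": len(runtimes),
            "total_s": round(sum(runtimes), 1),
            "avg_s": round(sum(runtimes) / len(runtimes), 1),
        }
        for status, runtimes in buckets.items()
    }

def to_html(stats: dict[str, dict]) -> str:
    rows = "".join(
        f"<tr><td>{s}</td><td>{v['count']}</td><td>{v['avg_s']}</td><td>{v['total_s']}</td></tr>"
        for s, v in stats.items()
    )
    return ("<table><tr><th>Status</th><th>Count</th><th>Avg (s)</th><th>Total (s)</th></tr>"
            f"{rows}</table>")

sample = [
    {"status": "success", "startedAt": "2025-01-06T08:00:00", "stoppedAt": "2025-01-06T08:00:12"},
    {"status": "error",   "startedAt": "2025-01-06T09:00:00", "stoppedAt": "2025-01-06T09:00:03"},
]
print(to_html(summarize(sample)))
```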
by Ilyass Kanissi
📋 Instant Proposal Generator
Automatically convert sales call transcripts into professional client proposals by extracting key details with AI, dynamically populating Google Slides templates, and tracking progress in Airtable, all in one seamless workflow.

🎯 What does this workflow do?
This end-to-end automation creates client-ready proposals by:
- Taking call transcripts via a chat interface. The AI analyzes the transcript to extract key details like company name, goals, budget, and requirements, then structures this data as JSON for seamless workflow integration.
- Generating customized documents from a Google Slides template with dynamic variables, auto-populating {Company_Name}, {Budget}, etc. from the extracted data.
- Delivering finished proposals: sharing the final document with the client and updating the CRM status automatically.

⚙️ How it works
1. User input: Paste the call transcript into the chat trigger node.
2. AI analysis: The OpenAI node processes the text to extract structured JSON, identifying company name, goals, budget, requirements, etc.
3. Document copy: The workflow copies the template file from Google Drive and names it "{company name} proposal".
4. Variable replacement: Replaces all template variables ({Company_Name}, {Budget}, etc.) with the data extracted by ChatGPT (a small sketch follows this section).
5. Delivery & tracking: Shares the finalized proposal with the client via email and updates the Airtable "Lead Status" to "Proposal Sent".

🔑 Required setup
- OpenAI API Key: Create a key from here
- Google Cloud Credentials: Setup here. Required scopes: Google Slides edit + file creation
- Airtable Access Token: Create one from here
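Step 4 maps the extracted JSON onto the Slides copy by replacing each {Placeholder} token. A minimal sketch of that mapping: it only builds the replaceAllText request payload that the Google Slides API expects and prints it, so authentication and the actual batchUpdate call are left out; the field names and values below are illustrative, not taken from the template:

```python
import json

extracted = {  # illustrative output of the AI-analysis step
    "Company_Name": "Acme Co",
    "Budget": "$12,000",
    "Goals": "Automate proposal creation within 24h of a call",
}

def build_replace_requests(values: dict[str, str]) -> list[dict]:
    """One replaceAllText request per {Placeholder} token in the Slides template."""
    return [
        {
            "replaceAllText": {
                "containsText": {"text": "{" + key + "}", "matchCase": True},
                "replaceText": value,
            }
        }
        for key, value in values.items()
    ]

# This payload would be sent as the body of a presentations.batchUpdate call.
print(json.dumps({"requests": build_replace_requests(extracted)}, indent=2))
```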
by Mohammad
Telegram ticket intake and status tracking with Postgres

Who’s it for
Anyone running support requests through Telegram, Email, Webhooks, and so on who needs a lightweight ticketing system without paying Zendesk prices. Ideal for small teams, freelancers, or businesses that want tickets logged in a structured database (Postgres) and tracked automatically. I'm using Telegram since it's the most convenient one.

How it works / What it does
This workflow turns Telegram into a support desk:
- Receives new requests via a Telegram bot command.
- Creates a ticket in a Postgres database with a correlation ID, requester details, and status.
- Auto-confirms back to the requester with the ticket ID.
- Provides ticket updates (status changes, resolutions) when queried.
- Keeps data clean using dedupe keys and controlled input handling (a dedupe sketch follows this section).

How to set up
1. Create a Telegram bot using @BotFather and grab the token.
2. Connect your Postgres database to n8n and create a tickets table:

```sql
CREATE TABLE tickets (
  id BIGSERIAL PRIMARY KEY,
  correlation_id UUID,
  source TEXT,
  external_id TEXT,
  requester_name TEXT,
  requester_email TEXT,
  requester_phone TEXT,
  subject TEXT,
  description TEXT,
  status TEXT,
  priority TEXT,
  dedupe_key TEXT,
  chat_id TEXT,
  created_at TIMESTAMP DEFAULT NOW(),
  updated_at TIMESTAMP DEFAULT NOW()
);
```

3. Add your Telegram and Postgres credentials in n8n (via the Credentials tab, not hardcoded).
4. Import the workflow JSON and replace the placeholder credentials with yours.
5. Test by sending /new in Telegram and follow the prompts.

Requirements
- n8n (latest version recommended)
- Telegram bot token
- Postgres instance (local, Docker, or cloud)

How to customize the workflow
- Change database fields if you need more requester info.
- Tweak the Switch node and commands for multiple status types.
- Extend with Slack, Discord, or email nodes for broader notifications.
- Integrate with external systems (CRM, project management) by adding more branches.
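The dedupe_key column is what keeps a re-sent or duplicated request from creating a second ticket. One plausible way to derive it (purely illustrative; the template's actual key format isn't spelled out here) is to hash the source, the external message ID, and the subject, then use that value in an idempotent insert:

```python
import hashlib
import uuid

def make_dedupe_key(source: str, external_id: str, subject: str) -> str:
    raw = f"{source}:{external_id}:{subject.strip().lower()}"
    return hashlib.sha256(raw.encode()).hexdigest()

ticket = {
    "correlation_id": str(uuid.uuid4()),
    "source": "telegram",
    "external_id": "msg-48211",   # e.g. the Telegram message ID
    "subject": "Cannot log in",
    "status": "open",
}
ticket["dedupe_key"] = make_dedupe_key(ticket["source"], ticket["external_id"], ticket["subject"])

# Idempotent insert, assuming you add a UNIQUE constraint or index on dedupe_key:
sql = """
INSERT INTO tickets (correlation_id, source, external_id, subject, status, dedupe_key)
VALUES (%(correlation_id)s, %(source)s, %(external_id)s, %(subject)s, %(status)s, %(dedupe_key)s)
ON CONFLICT (dedupe_key) DO NOTHING;
"""
print(ticket["dedupe_key"])
```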
by Rohit Dabra
Google Sheets → Stripe Payment Automation Workflow

📌 Overview
This workflow automates the end-to-end process of generating and sending client payment links using Google Sheets and Stripe. Whenever a new or updated entry is added to the Google Sheet, the workflow will:
- Fetch client and invoice details.
- Create a Stripe Product and Price.
- Generate a Stripe Payment Link.
- Store the link back in the Google Sheet.
- Upload a copy of the invoice to Google Drive.
- Send a professional, formatted email with the payment link to the client using Gmail.

🔗 Demo Video: Watch on YouTube

⚡️ Workflow Steps
1. Trigger – The workflow is triggered on any update in the Google Sheet.
2. Filter – Ensures only relevant rows (e.g., PENDING invoices) proceed.
3. Stripe Automation – Create Stripe Product → Create Stripe Price → Generate Stripe Payment Link (a sketch of these calls follows this section).
4. Google Drive – Store invoice files (if required).
5. Google Sheets – Update the sheet with the generated Stripe Payment Link and timestamp.
6. Gmail – Send a client-facing email with the invoice details and payment link.

🛠 Setup Guide

Prerequisites
- **n8n account**
- **Google Sheets & Google Drive credentials**
- **Gmail API credentials**
- **Stripe API Key**

Steps
1. Clone/Import Workflow
   - Import the workflow JSON file into your n8n instance.
2. Configure Google Sheets
   - Create a Google Sheet with columns: Order ID, Client Name, Client Email, Items Description, Due Date, Amount, Currency, Invoice Status, Invoice Link, Stripe Payment Link, Last Updated
   - Connect your Google Sheets node to this sheet.
3. Set Up Stripe
   - Obtain your Stripe Secret Key from the Stripe Dashboard.
   - Add it in the Stripe nodes for Product, Price, and Payment Link creation.
4. Google Drive
   - Configure to store invoice backups (optional).
5. Gmail
   - Authorize Gmail and set up the Send Email node.
   - Customize the email template with client details and the Stripe link.
6. Test the Workflow
   - Add a sample row in Google Sheets.
   - Run the workflow manually or update the sheet to trigger automatically.
   - Verify that the Stripe link is created, updated in the sheet, and emailed to the client.

✅ Now your workflow is ready to automatically manage client invoices and payments with Stripe + Google Sheets + Gmail + Google Drive.
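For orientation, the three Stripe calls in step 3 map onto the standard Products, Prices, and Payment Links endpoints. A minimal sketch with the official stripe Python library; the description, amount, and currency are example values, whereas in the workflow itself they come from the sheet row, and the sketch assumes a two-decimal currency like USD or EUR:

```python
import stripe

stripe.api_key = "sk_test_..."  # keep this in n8n credentials in practice, never hardcoded

def create_payment_link(description: str, amount: float, currency: str) -> str:
    product = stripe.Product.create(name=description)
    price = stripe.Price.create(
        product=product.id,
        unit_amount=int(round(amount * 100)),  # Stripe expects the smallest currency unit
        currency=currency.lower(),
    )
    link = stripe.PaymentLink.create(line_items=[{"price": price.id, "quantity": 1}])
    return link.url

# Example row: "Website redesign deposit", 1500.00 USD
print(create_payment_link("Website redesign deposit", 1500.00, "USD"))
```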
by Meak
Form Lead Scoring with AI → Google Sheets + Slack
When a new lead fills out your Typeform or any other form, this workflow classifies the message with AI, stores it in the right Google Sheet tab, and can send your team a Slack alert.

Benefits
- Get new leads in real time from any form
- Classify each lead with AI (hot / neutral / cold)
- Save leads to the correct Google Sheets tab automatically
- Send Slack alerts for hot leads so you act fast
- Keep your pipeline clean and easy to scan

How It Works
1. Webhook receives a new form submission
2. Parse name, email, phone, message, and timestamp
3. AI analyzes the message and returns hot / neutral / cold
4. Route to the matching Google Sheets tab (Hot, Neutral, Cold); a routing sketch follows this section
5. (Optional) Post a Slack message with key details

Who Is This For
- Agencies running paid ads and lead forms
- Sales teams that need quick triage
- Coaches, creators, and SaaS teams with waitlists

Setup
1. Connect your form tool (Typeform or other) to a webhook
2. Add Google Gemini (or your AI model) credentials
3. Connect Google Sheets (Spreadsheet ID + Tab names)
4. (Optional) Connect Slack and select a channel
5. Test with a few submissions to check routing

ROI & Monetization
- Respond faster to hot leads → higher close rates
- Save 2–4 hours/week on manual sorting
- Offer as “AI lead scoring” for clients ($500–$2k/month)

Strategy Insights
In the full walkthrough, I show how to:
- Write a short, reliable prompt for clear labels
- Map form fields cleanly into Sheets
- Format Slack alerts for quick reading
- Expand with auto-replies or CRM sync later

Check Out My Channel
For more AI automation systems that get real results, check out my YouTube channel where I share the exact strategies I use to build automation workflows, win clients, and scale to $20k+ monthly revenue.
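Steps 2 to 5 are a small parse-and-route job: normalize the AI label, pick the matching tab, build the row, and flag hot leads for Slack. A minimal sketch of that glue logic; the field names and tab names are illustrative, and the AI call itself is stubbed out as an input:

```python
from datetime import datetime, timezone

TAB_BY_LABEL = {"hot": "Hot", "neutral": "Neutral", "cold": "Cold"}  # illustrative tab names

def route_lead(submission: dict, ai_label: str) -> dict:
    """Turn a form submission plus AI label into a sheet row and a Slack decision."""
    label = ai_label.strip().lower()
    if label not in TAB_BY_LABEL:
        label = "neutral"  # safe default when the model returns something unexpected
    row = [
        submission.get("name", ""),
        submission.get("email", ""),
        submission.get("phone", ""),
        submission.get("message", ""),
        datetime.now(timezone.utc).isoformat(),
        label,
    ]
    return {"tab": TAB_BY_LABEL[label], "row": row, "notify_slack": label == "hot"}

print(route_lead(
    {"name": "Dana", "email": "dana@example.com", "message": "Need this live next week, budget ready"},
    "Hot",
))
```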
by SpaGreen Creative
WooCommerce New Category Alert via WhatsApp Using Rapiwa API
This n8n automation listens for the creation of a new WooCommerce product category, fetches all WooCommerce customers, cleans and formats their phone numbers, verifies them using the Rapiwa WhatsApp validation API, sends a WhatsApp message to verified numbers with the new category info, and logs each interaction into a Google Sheet (separately for verified and unverified customers).

Who this is for
You have a WooCommerce store and want to:
- Send a promotional message when a new product category is added,
- Verify customer WhatsApp numbers in bulk,
- Keep a clear log in Google Sheets of which numbers are verified or not.

What it does (high level)
- Webhook is triggered when a new WooCommerce category is created.
- Fetches all WooCommerce customers via the API.
- Limits processing to the first 10 customers (for performance/testing).
- Cleans phone numbers (removes +, spaces, and non-digits).
- Verifies each number via the Rapiwa WhatsApp Verify API.
- If verified: sends a WhatsApp message with the new category info, logs as Verification = verified, Status = sent.
- If not verified: logs as Verification = unverified, Status = not sent.
- Processes users in batches with delays to avoid rate limiting.

How it works (step-by-step)
1. **Trigger**: Webhook node is triggered by WooCommerce category creation.
2. **Format Data**: Category details (name, slug, description) are parsed.
3. **Get Customers**: Fetch all WooCommerce customers using the WooCommerce API.
4. **Limit**: Only the first 10 are processed.
5. **Loop & Clean**: Loop over each customer, clean phone numbers and extract info (a cleaning/verification sketch follows this section).
6. **Verify Number**: Send an HTTP POST to https://app.rapiwa.com/api/verify-whatsapp.
7. **Decision Node**: Use an If node to check whether exists == true.
8. **Send Message**: If verified, send a WhatsApp message with category details.
9. **Append to Sheet**: Log verified and unverified customers separately in Google Sheets.
10. **Wait + Batch Control**: Use Wait and SplitInBatches nodes to control flow and prevent throttling.

Example verify body (HTTP Request node):

```json
{ "number": "{{ $json['WhatsApp No'] }}" }
```

Customization ideas
- Send images, videos, or template messages if supported by Rapiwa.
- Personalize messages using name or category data.
- Increase delay or reduce batch size to minimize risk of rate limits.
- Add a second sheet to log full API responses for debugging and auditing.

Best practices
- Test on small batches before scaling.
- Only send messages to users who opted in.
- Store API credentials securely using n8n’s credentials manager.
- Ensure your Google Sheet column headers match exactly with what's expected.

Useful Links
- **Dashboard:** https://app.rapiwa.com
- **Official Website:** https://rapiwa.com
- **Documentation:** https://docs.rapiwa.com

Support
- WhatsApp Support: Chat Now
- Discord: Join SpaGreen Community
- Facebook Group: SpaGreen Support
- Website: https://spagreen.net
- Developer Portfolio: Codecanyon SpaGreen
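The clean-then-verify steps amount to stripping everything except digits and posting the result to the verify endpoint shown above. A minimal sketch with requests; the Bearer-token header is an assumption about how Rapiwa authenticates (check the Rapiwa docs), and only the exists flag used by the If node is inspected in the response:

```python
import re
import requests

RAPIWA_VERIFY_URL = "https://app.rapiwa.com/api/verify-whatsapp"
RAPIWA_TOKEN = "your-rapiwa-token"  # assumption: Bearer auth; store this in n8n credentials

def clean_number(raw: str) -> str:
    """Remove '+', spaces, and any other non-digit characters."""
    return re.sub(r"\D", "", raw)

def is_whatsapp_number(raw: str) -> bool:
    number = clean_number(raw)
    resp = requests.post(
        RAPIWA_VERIFY_URL,
        json={"number": number},
        headers={"Authorization": f"Bearer {RAPIWA_TOKEN}"},
        timeout=15,
    )
    resp.raise_for_status()
    return bool(resp.json().get("exists"))  # the If node checks exists == true

print(clean_number("+880 17 1234-5678"))  # -> "8801712345678"
```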
by Ossian Madisson
This n8n template allows you to automatically upload all attached files from incoming emails to Google Drive, with optional filters on sender, receiver, and file types. This template is built to be customized for your specific needs. It has the core logic and n8n node-specific references sorted to work with dynamic file names throughout the workflow.

Use cases
- Store invoices in Google Drive
- Save recurring reports in Google Drive
- Post recurring reports to another n8n workflow for further processing
- Archive files to Google Drive by email
- Save all files received by a client in a dedicated Google Drive folder

Good to know
The workflow is designed to not use custom code, preferring built-in nodes in n8n.

How it works
- Trigger on incoming emails with attachments
- (Optional) Filter on sender/recipient
- Split all attachments of the email into separate items
- (Optional) Filter attachments based on file type (see the sketch below for the kind of check involved)
- (Optional) Treat attachments with different file types through different paths
- Upload each attachment to Google Drive
- Mark the email read and archive it after all attachments have been processed
- Notify in Slack how many attachments were processed in the execution

How to use
- Configure Google credentials (1, 2, 6)
- Configure Slack credentials (7)
- Configure or disable the sender/receiver filter (3)
- Configure or disable the file type filter (4)
- Configure or disable file type paths (5)
- Configure the destination folder (6)
- Build on this to fit your use case

Note: there's a similar template with the same basics but with fewer ready-made modifications and no loop that allows us to archive the email and notify Slack when done.
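As a rough idea of what the optional file-type filter checks, here is a small sketch that keeps only attachments whose MIME type or extension is on an allow-list. The workflow itself does this with built-in IF/Switch nodes rather than code, and the allow-list values below are illustrative:

```python
ALLOWED_MIME_TYPES = {"application/pdf"}            # illustrative allow-list
ALLOWED_EXTENSIONS = {".pdf", ".xlsx", ".csv"}

def keep_attachment(file_name: str, mime_type: str) -> bool:
    """True if the attachment should continue to the Google Drive upload path."""
    extension = "." + file_name.rsplit(".", 1)[-1].lower() if "." in file_name else ""
    return mime_type in ALLOWED_MIME_TYPES or extension in ALLOWED_EXTENSIONS

attachments = [
    {"fileName": "invoice-2025-01.pdf", "mimeType": "application/pdf"},
    {"fileName": "logo.png", "mimeType": "image/png"},
]
for a in attachments:
    print(a["fileName"], "->", keep_attachment(a["fileName"], a["mimeType"]))
# invoice-2025-01.pdf -> True
# logo.png -> False
```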