by Anir Agram
Google Sheets Leads → Random Templates → Personalized Emails → Status Tracking

What this workflow does
- Reads the leads list from Google Sheets (Name, Email, Send Status)
- Filters out already-contacted leads (skips "SENT" status)
- Randomly selects an email template from the template library
- Personalizes the subject and body with the lead's name
- Sends emails one-by-one with delays between sends
- Updates the Google Sheet with send status and timestamp
- Loops through all unsent leads automatically

Why it's useful
- Automate cold outreach without manual copy-paste
- Avoid duplicate sends: tracks who's been contacted
- Rotate email templates for A/B testing and variety
- Personalization makes emails feel human, not spammy
- Built-in delays prevent spam flags and rate limits
- Full audit trail of who received what and when

How it works
- Google Sheets (Leads): reads Name, Email, Send Status
- IF Node: filters leads where Send Status is not "SENT"
- Loop Over Items: processes leads one-by-one
- Google Sheets (Templates): fetches Subject + Body templates
- Code Node: picks a random template
- Merge: combines lead data with the template
- Edit Fields: replaces [Name] with the actual lead name
- Send Email: delivers the personalized message
- Wait: adds a delay between sends (avoids spam flags)
- Google Sheets (Update): marks the lead as "SENT" with a timestamp

What you'll need
- Google Sheet #1: Leads (columns: Name, Email, Send Status, Time)
- Google Sheet #2: Templates (columns: Subject, Body)
- SMTP credentials (SendGrid, Mailgun, etc.)
- Google Sheets OAuth

Setup steps
- Create a "Leads" sheet with columns: Name | Email | Send Status | Time
- Create a "Templates" sheet with columns: Subject | Body (use the [Name] placeholder)
- Connect Google Sheets OAuth credentials
- Add SMTP email credentials
- Update both Google Sheets node IDs to point to your sheets
- Set "From Email" in the Send Email node
- Test with 2-3 test leads first

Customization ideas
- Adjust the Wait time (30s-5min) to control send rate
- Add click tracking with UTM parameters
- Send a Slack/Telegram notification when the campaign completes
- Add lead scoring: prioritize high-value leads first
- Log opens/replies to a separate tracking sheet

Who it's for
- Freelancers doing cold outreach to agencies
- Sales teams running lead generation campaigns
- Startups reaching out to potential customers
- Marketers testing email copy variations
- Business developers nurturing prospect lists

Quick Setup Guide
Before You Start - What You Need:
- Google account for Sheets access
- SMTP email account (Gmail, custom domain, or email service)
- List of leads (names + emails)
- Email templates ready (with [Name] placeholders)

Want help customizing? anirpoke@gmail.com | LinkedIn
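The random-template step described above can be sketched as a plain function. The `Subject`/`Body` column names and the `[Name]` placeholder come from the Templates sheet described in this listing; the function shape itself is an assumption, since an n8n Code node would read these values from its incoming items:

```javascript
// Sketch of the random-template picker plus personalization step.
// Assumes template rows expose Subject and Body columns and use the
// [Name] placeholder, as in the Templates sheet described above.
function pickAndPersonalize(templates, leadName) {
  // Pick one template at random from the library
  const t = templates[Math.floor(Math.random() * templates.length)];
  // Replace every [Name] placeholder with the lead's actual name
  return {
    subject: t.Subject.replaceAll('[Name]', leadName),
    body: t.Body.replaceAll('[Name]', leadName),
  };
}
```

In the actual workflow the Edit Fields node handles the [Name] replacement separately; this sketch just shows the two steps together for clarity.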
by Ahmed Salama
Categories: CRM Automation, Revenue Operations, Sales Operations, Meeting Automation

Build a Revenue Ops Meeting Pipeline with Pipedrive, Calendar, Slack

This workflow creates a CRM-driven revenue operations meeting pipeline that automatically coordinates meetings once a deal reaches a specific stage in Pipedrive. When a deal moves into the Meeting Booking stage, the workflow waits for the SDR to complete the meeting details, creates the event in Google Calendar, sends a confirmation email to the client, and notifies the internal team in Slack. The result is a reliable, no-manual-work system that ensures meetings are scheduled, confirmed, and communicated without human follow-up.

Benefits
- **CRM as the Single Source of Truth**: All automation is triggered directly from deal stage changes.
- **Reduced No-Shows**: Clients receive timely meeting confirmations with correct links and times.
- **Zero Manual Coordination**: No copying links, sending reminders, or checking calendars.
- **Internal Visibility**: Sales teams receive Slack reminders automatically.
- **Time-Zone Safe Scheduling**: Meeting times are calculated and normalized automatically.
How It Works

Deal Stage Trigger (Pipedrive)
- Listens for deal updates in Pipedrive
- Runs only when the deal enters the Meeting Booking stage
- Prevents execution on irrelevant pipeline changes

Controlled Wait Logic
- Pauses execution to allow the SDR to add the meeting link and set the date, time, and duration
- Ensures data completeness before scheduling

Data Extraction & Enrichment
- Fetches full deal details
- Extracts the client name and email, company name, meeting link, and meeting date and time
- Calculates ISO start and end times with time-zone handling

Calendar Event Creation (Google Calendar)
- Creates a calendar event automatically
- Adds the client as an attendee
- Inserts the meeting link as the event location

Client Email Confirmation
- Sends a personalized confirmation email
- Includes meeting date, time, and context
- Reduces rescheduling and confusion

Internal Slack Notification
- Sends a reminder to a selected Slack channel
- Notifies SDRs and sales managers of upcoming meetings
- Keeps teams aligned without CRM checking

Required Setup
- Pipedrive: a deal pipeline with a defined Meeting Booking stage; meeting details stored in deal activities
- Google Calendar: OAuth access enabled; permission to create events
- Gmail: OAuth access enabled; email sending permissions
- Slack: OAuth access enabled; target channel selected

Business Use Cases
- Sales Teams: eliminate missed meetings and manual reminders; reduce admin work for SDRs
- Revenue Operations: standardize meeting execution across pipelines; improve forecasting reliability
- Founders & Managers: increase meeting attendance without micromanagement
- Agencies & Consultants: deliver CRM-based RevOps automation to clients

Difficulty Level: Intermediate
Estimated Build Time: 45-60 minutes

Monthly Operating Cost
- Pipedrive: existing plan
- Google Calendar: free
- Gmail: free
- Slack: free or paid workspace
- n8n: self-hosted or cloud
- Typical range: $0-20/month

Why This Workflow Works
- Deal stage changes represent real sales intent
- Waiting logic prevents broken automations
- Calendar-first execution ensures reliability
- Multi-channel notifications reduce human error

Possible Extensions
- Add SMS or WhatsApp reminders
- Auto-cancel meetings on stage rollback
- Log meeting outcomes back to Pipedrive
- Trigger post-meeting follow-ups
- Add AI-generated meeting summaries
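The time-zone-safe scheduling step can be sketched as a small function. This is a minimal sketch; the listing does not show the actual node expressions or deal field names, so the inputs here are assumptions:

```javascript
// Sketch of the ISO start/end calculation feeding the Google Calendar node.
// Assumes the deal stores a meeting start time as an ISO-8601 string
// (with offset or UTC) and a duration in minutes.
function meetingWindow(startIso, durationMinutes) {
  const start = new Date(startIso); // Date normalizes any offset to UTC
  const end = new Date(start.getTime() + durationMinutes * 60_000);
  return { start: start.toISOString(), end: end.toISOString() };
}
```

Working in UTC internally and letting the calendar render local times is what keeps the scheduling time-zone safe.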
by Milan Vasarhelyi - SmoothWork
Video Introduction

Want to automate your inbox or need a custom workflow? Book a Call | DM me on Linkedin

What this workflow does
This workflow automates repetitive email sending directly from Google Sheets, eliminating hours of manual work each week. It reads email data from your spreadsheet, uses AI to generate personalized content based on recipient context, sends messages through Gmail, and automatically tracks responses back to your sheet. Perfect for sending reminders, follow-ups, or regular customer communications at scale.

Key features
- **Smart filtering**: Only processes rows marked with Status = "To send", preventing duplicate sends
- **AI personalization**: Generates tailored email content using OpenAI based on brief recipient introductions
- **Automatic status updates**: Marks emails as "Sent" after delivery to maintain accurate records
- **Response tracking**: Monitors Gmail replies and logs them back to your spreadsheet automatically

Common use cases
- Event reminders and webinar notifications with personalized context
- Customer follow-up sequences that feel individually crafted
- Regular business communications requiring recipient-specific details
- Sales outreach campaigns with AI-generated personalization at scale

Setup requirements

Credentials needed:
- Google Sheets OAuth2 connection
- Gmail OAuth2 connection
- OpenAI API credentials

Google Sheet structure: Create columns for: To (email address), Subject, Introduction (one-sentence recipient context), Status ("To send" or "Sent"), and Response (auto-filled).

Configuration: Update the Google Sheets document ID and sheet names to match your spreadsheet. Customize the AI prompt template in the AI Agent node to match your desired email tone and format. The response tracking branch runs automatically every minute to capture replies.
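The duplicate-send guard amounts to a simple status filter. A minimal sketch using the Status column from the sheet structure above (in the workflow itself this would typically be a Filter or IF node rather than code):

```javascript
// Keep only rows still marked "To send" so no recipient is emailed twice.
// Column name "Status" matches the sheet structure described above.
function rowsToSend(rows) {
  return rows.filter((row) => row.Status === 'To send');
}
```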
by inderjeet Bhambra
How it Works
This automated workflow transforms Zoom meeting emails into professional summaries and Google Docs. It monitors your Gmail for Zoom meeting notification emails, extracts meeting content using AI-powered analysis, and generates both email-safe HTML summaries and Google Docs-compatible text. The workflow intelligently parses meeting transcripts, creating structured summaries with attendee lists, key discussion points, action items, and next steps. It automatically creates Google Docs for record-keeping and sends formatted email summaries to meeting participants.

Who is it For
Perfect for project managers, team leads, executives, and meeting coordinators who regularly conduct Zoom meetings and need consistent, professional documentation. Ideal for organizations that require structured meeting records, action item tracking, and seamless integration between communication tools. Especially valuable for distributed teams, consulting firms, and corporate environments where meeting accountability and follow-up are critical.

Benefits of Using this Workflow
- **Time Savings**: Eliminates 30-60 minutes of manual meeting summary creation per meeting.
- **Professional Consistency**: Ensures all meeting summaries follow the same structured format with headers, attendees, discussion points, and action items.
- **Automated Documentation**: Creates searchable Google Docs archives and distributes summaries without manual intervention.
- **Enhanced Accountability**: Clear action item tracking with assignees and deadlines improves follow-through.
- **Multi-Format Output**: Provides both email-friendly HTML and plain text formats for maximum compatibility across different platforms and systems.

Setup Requirements
- **Prerequisites**: Active n8n instance, Gmail account, Google Docs access, OpenAI API key.
- **Required Credentials**: Configure Gmail OAuth2 for both trigger and sending, Google Docs OAuth2 for document creation, OpenAI API for GPT-4 processing.
- **Configuration Steps**: 1) Import the workflow and activate the Gmail trigger with a filter for Zoom meeting emails, 2) Set up Google Drive folder permissions for document creation, 3) Configure OpenAI credentials and verify the AI agent tools connection, 4) Test the workflow with a sample Zoom email to ensure proper formatting and delivery.
- **Optional Customization**: Modify Gmail search filters, adjust the Google Docs folder location, customize email templates, or fine-tune the AI prompt for specific meeting formats.
by tsushima ryuto
This n8n workflow is designed to centralize the management and tracking of customer inquiries received through multiple channels (email and web forms).

Who's it for?
- Customer support teams
- Marketing teams
- Sales teams
- Small to medium-sized businesses
- Individuals looking to streamline customer inquiry processes

How it works / What it does
This workflow is designed to automatically collect, process, route, and track customer inquiries from different sources.

Multi-Channel Input: The workflow listens for inquiries from both incoming emails and web form submissions.
- Email Trigger: Monitors a specific inbox for incoming emails.
- Webhook - Web Form: Listens for web form data submitted to a designated endpoint.

Data Extraction and Parsing:
- Extract Email Content: Extracts HTML content from incoming emails to get a clean text message.
- Parse Email Data: Extracts relevant information from the email, such as customer name, email address, subject, message, received timestamp, source ("email"), and inquiry type (e.g., "urgent", "billing", "general") based on the subject line.
- Parse Webhook Data: Extracts customer name, email, subject, message, received timestamp, source ("webform"), and inquiry type from the web form data based on the provided type or a default of "general".

Merge Inquiries: The parsed email and web form inquiry data are combined into a single stream for continued processing.

Route by Inquiry Type: The workflow then routes inquiries based on the extracted inquiryType.
- Urgent Inquiries: Inquiries marked as "urgent" are routed to a specific Slack channel for immediate alerts.
- General Inquiries: Inquiries marked as "general" are notified in another Slack channel.
- Billing Inquiries: Inquiries marked as "billing" are routed to the general inquiries channel, or can be customized for a separate channel if needed.
Save to Google Sheets: All inquiry data is logged into a Google Sheet, which serves as a central repository, including details like customer name, email, subject, message, source, received timestamp, and inquiry type.

Send Auto-Reply Email: Customers receive an automated email reply confirming that their inquiry has been successfully received.

How to set up
- Google Sheets: Create a new spreadsheet in your Google Drive. Name the first sheet "Inquiries" and create the following header row: customerName, customerEmail, subject, message, source, receivedAt, inquiryType. In the 'Save to Google Sheets' node, configure the Spreadsheet ID and Sheet Name. Link your Google Sheets credentials.
- Email Trigger (IMAP): Set up the 'Email Trigger' node to connect to your IMAP email account. Test it to ensure it correctly listens for incoming emails before activating the workflow.
- Webhook - Web Form: Copy the Webhook URL from the 'Webhook - Web Form' node and configure your web form to submit data to it. Ensure your web form sends fields like name, email, subject, message, and type in JSON format.
- Slack: Configure your Slack credentials to connect to your Slack workspace. Update the relevant Slack Channel IDs in both the 'Notify Urgent - Slack' and 'Notify General - Slack' nodes for sending notifications for urgent and general inquiries.
- Gmail: Set up your Gmail credentials to connect to your Gmail account. Ensure the 'Send Auto-Reply Email' node is correctly linked to your sending Gmail account.

Requirements
- An n8n instance
- A Google Sheets account
- An IMAP-enabled email account
- A Slack workspace
- A Gmail account
- A basic web form (to integrate with the Webhook node)

How to customize the workflow
- **Add more Inquiry Types**: You can add more specific inquiry types (e.g., "technical support", "returns") by adding more rules in the 'Route by Inquiry Type' node.
- **Additional Notification Channels**: To integrate other notification systems (e.g., Microsoft Teams, Discord, SMS) beyond Slack, create new routing outputs and add new notification nodes for the desired service.
- **CRM Integration**: Instead of or in addition to saving data to Google Sheets, you can add new nodes to connect to CRM systems like Salesforce, HubSpot, or others.
- **Prioritization and Escalation**: Implement more complex logic to trigger escalation processes or prioritization rules based on inquiry type or keywords.
- **AI Sentiment Analysis**: Integrate an AI node to analyze the sentiment of inquiry messages and route or prioritize them accordingly.
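The subject-line classification used by the 'Parse Email Data' step might look like this. The keyword lists are illustrative assumptions; the listing only states that the type is derived from the subject line:

```javascript
// Derive inquiryType from the email subject line.
// Keyword choices here are examples, not the workflow's actual rules.
function classifyInquiry(subject) {
  const s = subject.toLowerCase();
  if (/\b(urgent|asap|immediately)\b/.test(s)) return 'urgent';
  if (/\b(billing|invoice|payment)\b/.test(s)) return 'billing';
  return 'general'; // default, matching the webhook branch
}
```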
by Madame AI
Create curated industry trend reports from Medium to Google Docs

This workflow automates the process of market research by generating high-quality, curated digests of Medium articles for specific topics. It scrapes recent content, uses AI to filter out spam and duplicates, categorizes the stories into readable buckets, and compiles everything into a formatted Google Doc report.

Target Audience
Content marketers, market researchers, product managers, and investors looking to track industry trends without reading through noise.

How it works
1. Schedule: The workflow runs on a defined schedule (e.g., daily or weekly) via the Schedule Trigger.
2. Define Source: A Set node defines the specific Medium tag URL to track (e.g., /tag/artificial-intelligence).
3. Scrape Content: BrowserAct visits the target Medium page and scrapes the latest article titles, authors, and summaries.
4. Analyze & Filter: An AI Agent (powered by Claude via OpenRouter) analyzes the raw feed. It removes duplicates, filters out spam/clickbait, and categorizes high-quality stories into buckets (e.g., "Must Reads," "Engineering," "Wealth").
5. Create Report: A Google Docs node creates a new document using the digest title generated by the AI.
6. Build Document: The workflow loops through the analyzed items, appending headers and body text to the Google Doc section by section.
7. Notify Team: A Slack node sends a message to your chosen channel confirming the report is ready.

How to set up
1. Configure Credentials: Connect your BrowserAct, Google Docs, Slack, and OpenRouter accounts in n8n.
2. Prepare BrowserAct: Ensure the Automated Industry Trend Scraper & Outline Creator template is saved in your BrowserAct account.
3. Set Target Topic: Open the Target Page node and replace the Target_Medium_Link with the Medium tag archive you wish to track (e.g., https://medium.com/tag/bitcoin/archive).
4. Configure Notification: Open the Send a message node (Slack) and select the channel where you want to receive alerts.
5. Activate: Turn the workflow on.

Requirements
- **BrowserAct** account with the **Automated Industry Trend Scraper & Outline Creator** template.
- **Google Docs** account.
- **Slack** account.
- **OpenRouter** account (or any compatible LLM credentials).

How to customize the workflow
- Adjust the AI Persona: Modify the system prompt in the Analyzer & Script writer node to change the categorization buckets (e.g., change "Engineering" to "Marketing Strategies").
- Change the Output Destination: Replace the Google Docs nodes with Notion or Airtable nodes if you prefer a database format over a document.
- Add Email Delivery: Add a Gmail or Outlook node at the end to email the finished Google Doc link directly to stakeholders.

Need Help?
- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

Workflow Guidance and Showcase Video: Stop Writing Outlines! Use This AI Trend Scraper (BrowserAct + n8n + Gemini)
by Oneclick AI Squad
This enterprise-grade n8n workflow automates the Pharmaceutical Raw Material COA Verification & Vendor Quality Scoring System, from upload to final reporting, using AI-powered document extraction, specification matching, and dynamic vendor scoring. It processes Certificates of Analysis (COAs) to validate compliance, assign quality scores, generate approvals or CAPA requests, and notify stakeholders, ensuring regulatory adherence and vendor accountability with full audit trails and zero manual data entry.

Key Features
- **Webhook-triggered COA Upload** for seamless integration with file-sharing systems
- **AI Document Extraction** to parse test results and data from uploaded COAs
- **Automated Specification Analysis** matching against predefined quality standards
- **Weighted Vendor Scoring** based on compliance metrics and historical performance
- **Compliance Decision Engine** with approve/reject branching and CAPA flagging
- **Dynamic Certificate Generation** for approved materials, including digital signatures
- **Vendor Database Synchronization** to update scores and records in real-time
- **Targeted Email Notifications** for QA, production, and executive teams
- **Executive Reporting Dashboard** with summaries, scores, and verification logs
- **Audit-Ready Logging** for all steps, deviations, and decisions

Workflow Process

| Step | Node | Description |
| ---- | ---- | ----------- |
| 1 | START: Upload COA | Webhook trigger receives the uploaded COA file for the verification process |
| 2 | EXTRACT: Parse COA | Extracts test results and data from the COA document using AI parsing |
| 3 | ANALYZE: Vendor Compliance | Compares extracted data against specifications and flags deviations |
| 4 | SCORE: Vendor Quality Rating | Calculates a weighted compliance score based on test results and history |
| 5 | DECISION: Compliance Route | Evaluates score/status and branches to the approve (green) or reject (red) path |
| 6 | APPROVED: Generate Approval Cert (Approved Path) | Creates a digital approval certificate for compliant materials |
| 7 | Update Vendor Database | Saves the verification record, score, and status to the vendor database |
| 8 | NOTIFY: Email Alert | Sends detailed notifications to QA/production teams |
| 9 | REPORT: Final Report | Generates an executive summary with COA scores and verifications |
| 10 | REJECT: Generate Rejection Report (Reject Path) | Produces a rejection report with deviation details |
| 11 | Request CAPA | Initiates the Corrective and Preventive Action (CAPA) process |
| 12 | PATH REJECTED | Terminates the rejected branch with an audit log entry |

Setup Instructions

1. Import Workflow
Open n8n > Workflows > Import from Clipboard, then paste the JSON workflow.

2. Configure Credentials

| Integration | Details |
| ----------- | ------- |
| File Storage (e.g., Google Drive/AWS S3) | API key or OAuth for COA upload handling |
| AI Extraction (e.g., Claude or OCR Tool) | API key for document parsing (e.g., claude-3-5-sonnet-20241022) |
| Database (e.g., PostgreSQL/Airtable) | Connection string for vendor records and specs |
| Email (SMTP/Gmail) | SMTP credentials or OAuth for notifications |

3. Update Database/Sheet IDs
Ensure your database or Google Sheets include:
- VendorDatabase for scores and history
- Specifications for quality standards

4. Set Triggers
- **Webhook:** /coa-verification (for real-time file uploads)
- **Manual/Scheduled:** for batch processing if needed

5. Run a Test
Use manual execution to confirm:
- COA extraction and analysis
- Score calculation and branching
- Email notifications and report generation (use a sample COA file)

Database/Sheets Structure

VendorDatabase

| vendorId | coaId | score | complianceStatus | lastVerified | deviations | capaRequested |
| -------- | ----- | ----- | ---------------- | ------------ | ---------- | ------------- |
| VEND-123456 | COA-789012 | 92.5 | Approved | 2025-11-04T14:30:00Z | None | No |

Specifications

| materialType | testParam | specMin | specMax | weight |
| ------------ | --------- | ------- | ------- | ------ |
| API Excipient | Purity (%) | 98.0 | 102.0 | 0.4 |

System Requirements

| Requirement | Version/Access |
| ----------- | -------------- |
| n8n | v1.50+ (AI and database integrations supported) |
| AI Parsing API | claude-3-5-sonnet-20241022 or equivalent OCR |
| Database API | SQL connection or Google Sheets API |
| Email API | https://www.googleapis.com/auth/gmail or SMTP |
| File Storage | AWS S3 or Google Drive API access |

Optional Enhancements
- Integrate ERP systems (e.g., SAP) for direct material release
- Add regulatory export to PDF/CSV for FDA audits
- Implement historical trend analysis for vendor performance dashboards
- Use multi-language support for global COA extraction
- Connect Slack/Teams for real-time alerts beyond email
- Enable batch processing for high-volume uploads
- Add AI anomaly detection for predictive non-compliance flagging
- Build custom scoring models via integrated ML tools

Result: A fully automated quality assurance pipeline that verifies COAs, scores vendors, and drives compliance decisions, ensuring pharmaceutical safety and efficiency with AI precision and complete traceability.

Explore More AI Workflows: Get in touch with us for custom n8n automation!
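Using the Specifications table above, the weighted scoring step (step 4) could be sketched like this. Field names follow the sample tables; the exact formula is an assumption, since the listing only says the score is weight-based:

```javascript
// Weighted compliance score: each test contributes its weight when the
// measured value falls inside [specMin, specMax]; the result is a
// percentage rounded to one decimal place.
function vendorScore(results) {
  const totalWeight = results.reduce((sum, r) => sum + r.weight, 0);
  const earned = results.reduce(
    (sum, r) => sum + (r.value >= r.specMin && r.value <= r.specMax ? r.weight : 0),
    0,
  );
  return totalWeight === 0 ? 0 : Math.round((earned / totalWeight) * 1000) / 10;
}
```

A real implementation would also blend in the vendor's historical performance, which this sketch omits.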
by Amit Mehta
Streamline Your Zoom Meetings with Secure, Automated Stripe Payments

This comprehensive workflow automates the entire process of setting up a paid online event, from scheduling a Zoom meeting and creating a Stripe payment link to tracking participants and sending confirmation emails.

How it Works
This workflow has two primary, distinct branches: Event Creation and Participant Registration.

Event Creation Flow (Triggered via Form):
1. An administrator submits details (title, price, date/time) via a form.
2. The workflow creates a new Zoom meeting with a unique password.
3. It creates a Stripe Product and a Payment Link.
4. A dedicated Google Sheet tab is created for tracking participants.
5. An email is sent to the event organizer with all the details, including the Zoom link, payment link, and participant list URL.

Participant Registration Flow (Triggered via Stripe Webhook):
1. A webhook is triggered when a Stripe payment is completed (checkout.session.completed).
2. The participant's details are added to the dedicated Google Sheet tab.
3. A confirmation email is sent to the participant with the Zoom link and password.
4. A notification email is sent to the event organizer about the new registration.

Use Cases
- **Webinar Sales**: Automate setup and registration for paid webinars.
- **Consulting/Coaching Sessions**: Streamline the booking and payment process for group coaching calls.
- **Online Classes**: Handle registration, payment, and access distribution for online courses or classes.

Setup Instructions
1. Credentials: Add credentials for:
   - Zoom: for creating the meeting.
   - Google: you need both Gmail and Google Sheets credentials.
   - Stripe: for creating products and handling payment webhooks.
2. Google Sheet: Create a new, blank Google Sheet to hold meeting and participant information.
3. Config Node: Fill the Config node with:
   - currency (e.g., EUR).
   - sheet_url (the URL of the Google Sheet you created).
   - teacher_email (the organizer/host's email).
Workflow Logic
The workflow splits into two logical parts handled by an if node:

Part A: Event Creation (Triggered by Creation Form)
1. Trigger: Creation Form (Form Trigger).
2. Check: if is creation flow (If) evaluates to true.
3. Zoom: Create Zoom meeting creates the session.
4. Stripe Product: Create Stripe Product creates a product and price in Stripe.
5. Stripe Link: Create payment link generates the public payment link, embedding Zoom and sheet metadata.
6. Google Sheet: Create participant list creates a new sheet tab for the event.
7. Email Host: Send email to teacher notifies the host of the successful setup.

Part B: Participant Registration (Triggered by On payment)
1. Trigger: On payment (Stripe Trigger - checkout.session.completed).
2. Format: Format participant extracts customer details.
3. Google Sheet: Add participant to list appends the new participant's info to the event's sheet.
4. Email Participant: Send confirmation to participant sends the Zoom access details.
5. Email Host: Notify teacher sends a registration alert.

Node Descriptions

| Node Name | Description |
|-----------|-------------|
| Creation Form | A form trigger used to input the event's required details (title, price, start date/time). |
| On payment | A Stripe trigger that listens for the checkout.session.completed event, indicating a successful payment. |
| Create Zoom meeting | Creates a new Zoom meeting, calculating the start time based on the form inputs. |
| Create Stripe Product | Posts to the Stripe API to create a new product and price based on the form data. |
| Create payment link | Creates a Stripe Payment Link, embedding Zoom meeting and Google Sheet ID metadata. |
| Create participant list | Creates a new tab (named dynamically) in the configured Google Sheet for event tracking. |
| Add participant to list | Appends a new row to the event's Google Sheet tab upon payment completion. |
| Send email to teacher / Notify teacher | Sends emails to the host/organizer for creation confirmation and new participant registration, respectively. |
| Send confirmation to participant | Sends the welcome email to the paying customer with the Zoom access details retrieved from the Stripe metadata. |

Customization Tips
- **Email Content**: You are encouraged to adapt the email contents in the Gmail nodes to fit your branding and tone.
- **Currency**: Change the currency in the Config node.
- **Zoom Password**: The password is set to a random 4-character string; you can modify the logic in the Create Zoom meeting node.
- **Stripe Price**: The price is sent to Stripe in the smallest currency unit (e.g., cents, * 100).

Suggested Sticky Notes for Workflow
- **Setup**: "Add Your credentials [Zoom, Google, Stripe]. Note: For Google, you need to add Gmail and Google Sheet. Create a new Google Sheet. Keep this sheet blank for now. And fill the config node."
- **Creation Form**: "Your journey to easy event management starts here. Click this node, copy the production URL, and keep it handy. It's your personal admin tool for quickly creating new meetings."
- **Customize**: "Feel free to adapt email contents to your needs."
- **Config**: "Setup your flow".

Required Files
- 2DT5BW5tOdy87AUl_Streamline_Your_Zoom_Meetings_with_Secure,_Automated_Stripe_Payments.json: The n8n workflow export file.
- A new, blank Google Sheet (URL configured in the Config node).

Testing Tips
- **Test Creation**: Run the Creation Form to trigger the Part A flow. Verify that a Zoom meeting and Stripe Payment Link are created, a new Google Sheet tab appears, and the host receives the setup email.
- **Test Registration**: Simulate a successful payment to the generated Stripe link to trigger the Part B flow. Verify that the participant is added to the Google Sheet, receives the confirmation email with Zoom details, and the host receives the notification.
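The Stripe price conversion mentioned under Customization Tips is a one-liner: Stripe's API expects amounts in the smallest currency unit (cents for EUR/USD), so the form's decimal price must be multiplied by 100 and rounded to avoid floating-point drift:

```javascript
// Convert a decimal price (e.g. 19.99 EUR) into Stripe's smallest
// currency unit (cents). Math.round guards against float artifacts
// such as 19.99 * 100 === 1998.9999999999998 in JavaScript.
function toStripeAmount(price) {
  return Math.round(price * 100);
}
```

Note that Stripe treats some currencies (e.g. JPY) as zero-decimal, which this sketch does not handle.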
Suggested Tags & Categories #Stripe #Zoom #Payment #E-commerce #GoogleSheets #Gmail #Automation #Webinar
by Spiritech Studio
This n8n template demonstrates how to automatically extract text content from PDF documents received via WhatsApp messages using OCR. It is designed for use cases where users submit documents through WhatsApp and the document content needs to be digitized for further processing, such as document analysis, AI-powered workflows, compliance checks, or data ingestion.

Good to know
- This workflow processes PDF documents only.
- OCR is handled using AWS Textract, which supports both scanned and digital PDFs.
- AWS Textract pricing depends on the number of pages processed. Refer to AWS Textract Pricing for up-to-date costs.
- An AWS S3 bucket is required as an intermediate storage layer for the PDF files.
- Processing time may vary depending on PDF size and number of pages.

How it works
1. The workflow is triggered when an incoming WhatsApp message containing a PDF document is received.
2. The PDF file is downloaded from WhatsApp's media endpoint using an HTTP Request node.
3. The downloaded PDF is uploaded to an AWS S3 bucket to make it accessible for OCR processing.
4. AWS Textract is invoked to analyze the PDF stored in S3 and extract all readable text content.
5. The Textract response is parsed and consolidated into a clean, ordered text output representing the PDF's content.

How to use
- The workflow can be triggered using a webhook connected to WhatsApp Cloud API or any compatible WhatsApp integration.
- Ensure your AWS credentials have permission to upload to S3 and invoke Textract.
- Once active, simply send a PDF document via WhatsApp to start the extraction process automatically.

Requirements
- WhatsApp integration (e.g. WhatsApp Cloud API or provider webhook)
- AWS account with S3 bucket access and Textract permissions
- n8n instance with HTTP Request and AWS nodes configured

Customising this workflow
- Store extracted text in a database or document store.
- Pass the extracted content to an AI model for summarization, classification, or validation.
- Split output by pages or sections.
- Add file type validation or size limits.
- Extend the workflow to support additional document formats.
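Consolidating the Textract response (step 5 above) typically means collecting the LINE blocks from the response's Blocks array, which Textract returns in reading order. A minimal sketch:

```javascript
// Flatten an AWS Textract response into ordered plain text by keeping
// only line-level blocks (PAGE and WORD blocks are skipped, since LINE
// blocks already contain the assembled text of their words).
function textractToText(response) {
  return (response.Blocks || [])
    .filter((block) => block.BlockType === 'LINE')
    .map((block) => block.Text)
    .join('\n');
}
```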
by Tristan V
YouTube Video Transcript Summarizer - Discord Bot

> Paste a YouTube URL into a Discord channel and this workflow automatically extracts the transcript, uses an LLM to generate a concise summary, and stores everything in a database, all in seconds.
> Self-hosted n8n only. This workflow uses the Execute Command node to run yt-dlp inside the n8n container. This requires shell access, which is only available on self-hosted instances (Docker, VPS, etc.); it will not work on n8n Cloud.

Import this workflow into n8n

Prerequisites

| Tool | Purpose |
|------|---------|
| Discord Bot | Listens for messages and sends replies |
| yt-dlp | Downloads subtitles and video metadata (must be installed in the n8n container) |
| Google Gemini API | Summarizes video transcripts (Gemini 2.5 Flash) |
| Supabase | Stores video data and run logs |

Credentials

| Node | Credential Type | Notes |
|------|----------------|-------|
| Discord Trigger | Discord Bot Trigger | Bot token with Message Content Intent enabled |
| Discord Reply / Discord Not YouTube Reply / Discord Error Reply | Discord Bot | Same bot, used for sending messages |
| Message a model (Gemini) | Google Gemini (PaLM) API | API key from Google AI Studio |
| Save to Supabase / Log Run / Log Run Error | Supabase | Project URL + anon key |

What It Does
When a user pastes a YouTube URL into a Discord channel, the workflow:
1. Detects the YouTube URL using RegEx (supports youtube.com, youtu.be, shorts, live)
2. Extracts the video's subtitles (English and Vietnamese) and metadata using yt-dlp
3. Cleans the raw VTT subtitle file into a plain-text transcript
4. Summarizes the transcript using an LLM (Gemini 2.5 Flash) into a TLDR + detailed summary (in the original language)
5. Stores the video metadata, full transcript, and AI summary in a Supabase database
6. Logs every run (success or error) to a separate runs table for tracking
7. Chunks long summaries into Discord-safe messages (≤2000 characters each)
8. Replies in Discord with the video title, stats, and the full summary

Non-YouTube messages get a friendly "not a YouTube link" reply. Errors are caught, classified, logged to the database, and reported back to Discord.

How It Works

Main Flow (Happy Path)
Discord Trigger → Extract YouTube URL → Is YouTube URL?
├─ Yes → yt-dlp Get Metadata → Parse Metadata → Read Subtitle File → Parse Transcript
│        → Message a model (Gemini) → Prepare Insert Data → Save to Supabase
│        → Prepare Success Log → Log Run → Prepare Messages for Discord → Discord Reply
└─ No → Discord Not YouTube Reply

Error Flow
Error Trigger → Prepare Error Data → Log Run Error → Discord Error Reply

Node Breakdown

| # | Node | Type | Description |
|---|------|------|-------------|
| 1 | Discord Trigger | Discord Bot Trigger | Fires on every message in the configured channel |
| 2 | Extract YouTube URL | Code | RegEx extracts the video ID from the message content |
| 3 | Is YouTube URL? | IF | Routes YouTube URLs to processing, others to the rejection reply |
| 4 | yt-dlp Get Metadata | Execute Command | Downloads subtitles (.vtt, English/Vietnamese) and prints metadata JSON |
| 5 | Parse Metadata | Code | Extracts title, channel, views, duration via RegEx; decodes Unicode for multi-language support |
| 6 | Read Subtitle File | Execute Command | Dynamically finds and reads the .vtt file (continueOnFail enabled) |
| 7 | Parse Transcript | Code | Strips VTT timestamps/tags, deduplicates lines |
| 8 | Message a model | Google Gemini | Sends the transcript to Gemini 2.5 Flash for a TLDR + detailed summary (in the original language) |
| 9 | Prepare Insert Data | Code | Merges the summary with all metadata fields |
| 10 | Save to Supabase | Supabase | Inserts the full record into the videos table |
| 11 | Prepare Success Log | Code | Builds the success run record |
| 12 | Log Run | Supabase | Inserts into the runs table |
| 13 | Prepare Messages for Discord | Code | Chunks long summaries into Discord-safe messages (≤2000 chars) |
| 14 | Discord Reply | Discord | Posts the summary preview to the channel |
| 15
| Discord Not YouTube Reply | Discord | Replies when message isn't a YouTube link | | 16 | Error Trigger | Error Trigger | Catches any unhandled node failure | | 17 | Prepare Error Data | Code | Classifies error type and extracts context | | 18 | Log Run Error | Supabase | Logs error to runs table | | 19 | Discord Error Reply | Discord | Posts error message to channel | Setup Guide 1. Discord Bot Go to the Discord Developer Portal Create a new Application โ Bot Enable Message Content Intent under Privileged Intents Copy the Bot Token Invite the bot to your server with Send Messages + Read Messages permissions In n8n, create a Discord Bot Trigger credential (for listening) and a Discord Bot credential (for sending replies) Update the guild ID and channel ID in the Discord Trigger node and all Discord reply nodes 2. yt-dlp yt-dlp must be installed in your n8n container. For Docker-based installs: docker exec -it n8n apk add --no-cache python3 py3-pip docker exec -it n8n pip3 install yt-dlp Optional: Place a cookies.txt file at /home/node/.n8n/cookies.txt to avoid age-gated or bot-detection issues. 3. Google Gemini API Go to Google AI Studio Click Create API Key and copy it In n8n, click the Gemini node โ Credential โ Create New Paste your API key and save 4. 
Supabase Create a project at supabase.com Go to Settings โ API and copy the URL and anon key In n8n, create a Supabase credential with your URL and API key Run the SQL below in the Supabase SQL Editor to create the required tables Supabase SQL -- Videos table: stores video metadata, transcript, and AI summary CREATE TABLE videos ( video_id TEXT PRIMARY KEY, title TEXT, channel TEXT, upload_date TEXT, duration INT, view_count INT, description TEXT, transcript TEXT, ai_summary TEXT, thumbnail_url TEXT, channel_id TEXT, date_added TIMESTAMPTZ DEFAULT now() ); -- Runs table: logs every workflow execution (success or error) CREATE TABLE runs ( video_id TEXT PRIMARY KEY, process_status TEXT NOT NULL, error_type TEXT, notes TEXT, date_added TIMESTAMPTZ DEFAULT now() );
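The summary-chunking step (the "Prepare Messages for Discord" Code node) can be sketched as below. This is a minimal illustration, not the workflow's actual code: the function name `chunkForDiscord` and the newline-first splitting strategy are assumptions; only the 2000-character Discord limit comes from the workflow description.

```javascript
// Sketch of the "Prepare Messages for Discord" idea: split a long
// summary into chunks of at most 2000 characters, preferring to break
// on newlines so lines are not cut mid-sentence.
// Function name and splitting strategy are illustrative assumptions.
const DISCORD_LIMIT = 2000;

function chunkForDiscord(text, limit = DISCORD_LIMIT) {
  const chunks = [];
  let remaining = text;
  while (remaining.length > limit) {
    // Break at the last newline within the limit, or hard-cut if none exists.
    let cut = remaining.lastIndexOf("\n", limit);
    if (cut <= 0) cut = limit;
    chunks.push(remaining.slice(0, cut));
    remaining = remaining.slice(cut).replace(/^\n/, "");
  }
  if (remaining.length > 0) chunks.push(remaining);
  return chunks;
}
```

In an n8n Code node, each chunk would typically be returned as a separate item so the Discord Reply node posts them one message at a time.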
by Cheng Siong Chin
How It Works

This workflow automates academic research processing by routing queries through specialized AI models while maintaining contextual memory. Designed for researchers, faculty, and graduate students, it solves the challenge of managing multiple AI models for different research tasks while preserving conversation context across sessions.

The system accepts research queries via webhook, stores them in vector databases for semantic search, and intelligently routes requests to the appropriate AI model (OpenAI, Anthropic Claude, or NVIDIA NIM). Results are consolidated, formatted, and delivered via email with full citation tracking. The workflow maintains conversation history using Pinecone vector storage, enabling follow-up queries that reference previous interactions. This eliminates manual model switching, context loss, and repetitive credential management, streamlining research workflows from literature review to hypothesis generation.

Setup Steps

- Configure Pinecone credentials
- Add OpenAI API key for GPT-4 access and embeddings
- Set up Anthropic Claude API credentials for advanced reasoning
- Configure NVIDIA NIM API key for specialized academic models
- Connect Google Sheets for query logging and result tracking
- Set Gmail OAuth credentials for automated result delivery
- Configure webhook URL for query submission endpoint

Prerequisites

Active accounts and API keys for Pinecone, OpenAI, Anthropic, and NVIDIA NIM, plus Google Sheets and Gmail access

Use Cases

Literature review automation with semantic paper discovery.

Customization

Modify the AI model selection logic for domain-specific optimization.

Benefits

Reduces research processing time by 60% through automated routing.
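The model-routing idea described above could be sketched as a small Code-node function. The template does not publish its actual routing criteria, so the keyword rules and model labels below are purely hypothetical assumptions for illustration.

```javascript
// Hypothetical sketch of query routing across the three model backends
// mentioned above (OpenAI, Anthropic Claude, NVIDIA NIM). The keyword
// rules and return labels are assumptions, not the template's real logic.
function routeQuery(query) {
  const q = query.toLowerCase();
  if (/(prove|derive|reason|hypothes)/.test(q)) {
    return "anthropic-claude"; // multi-step reasoning tasks
  }
  if (/(biomedical|genomics|protein)/.test(q)) {
    return "nvidia-nim"; // specialized academic models
  }
  return "openai-gpt4"; // general-purpose default
}
```

In the workflow, an equivalent decision would feed a Switch node that forwards the query to the matching model node.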
by Cheng Siong Chin
How It Works

This workflow automates legislative compliance analysis by coordinating multiple specialized OpenAI agents to interpret regulatory documents, evaluate organizational impact, and manage stakeholder communication with complete audit traceability. It is built for compliance officers, legal teams, and governance leaders who must process new or amended legislation quickly without the burden of manual document review. The template addresses the core challenge of staying compliant amid rapidly evolving regulations.

When a legislative document is submitted, the workflow retrieves and extracts its full text, then passes it to a Policy Interpretation Agent powered by OpenAI for structured analysis. A Governance Orchestration Agent then activates three parallel specialist agents (Impact Assessment, Compliance Mapping, and Stakeholder Notification) to generate standardized outputs. Decisions are routed based on review status: auto-approved items are logged directly into Google Sheets, while flagged items trigger legal review through Slack alerts, compliance tracker updates, and stakeholder notifications, ensuring every regulatory change is evaluated, documented, and acted upon promptly.
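The review-status routing described above could look like the following sketch. The field names, status labels, and downstream action names are assumptions for illustration; in the template itself this decision lives in the Route by Review Status node, whose thresholds you adjust during setup.

```javascript
// Hypothetical sketch of the "Route by Review Status" decision.
// Field names, status labels, action names, and the risk threshold
// are illustrative assumptions, not the template's actual values.
function routeByReviewStatus(item, riskThreshold = 0.5) {
  if (item.reviewStatus === "auto-approved" && item.riskScore < riskThreshold) {
    return ["log-to-sheets"]; // logged directly into Google Sheets
  }
  // Flagged items fan out to legal review and notifications.
  return ["slack-alert", "update-tracker", "notify-stakeholders"];
}
```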
Setup Steps

- Add OpenAI API key to all OpenAI Model nodes
- Connect Google Sheets OAuth2 credentials; set spreadsheet IDs for Auto-Approved Log
- Configure Slack OAuth2 token; set target channel in Notify Legal Team node
- Set up Gmail/SMTP credentials in Notify Stakeholders node; update recipient addresses
- Configure legislative document source URL or webhook endpoint in Fetch Legislative Document node
- Adjust routing thresholds in Route by Review Status node to match your approval criteria

Prerequisites

OpenAI API key, Google Sheets with OAuth2, Slack workspace with bot token

Use Cases

Regulatory change management, GDPR/financial compliance monitoring, policy impact assessment

Customization

Swap OpenAI for NVIDIA NIM models, add additional specialist agents

Benefits

Cuts manual compliance review time by 70%, ensures no legislation goes unassessed