by Pedro Protes
AI Agent that uses an MCP Server to execute actions requested via the Evolution API. This workflow receives messages and media from WhatsApp via the Evolution API, converts the content into structured inputs, and forwards them to an AI Agent capable of triggering MCP tools to execute external actions.

🔧 How it works

- A Webhook receives messages sent to WhatsApp via the Evolution API.
- The "Message Type" node detects and forwards the received media. It handles the types Text, Image, Audio, and Document. For any other media type, a fallback sends a "media not supported" message to the user.
- The message goes to the system where it retrieves the Base64 of the media.
- The media is converted into binary file(s), and a Gemini node generates a text input for the agent.
- The AI Agent receives the structured input and calls the appropriate MCP Tool. In this example, only one MCP Server was configured.
- The AI Agent generates the output and sends it to the user.

🗒️ Requirements

- Evolution API account, with the instance configured.
- Gemini API.
- Google Calendar API.
- MCP Server (internal or external, whichever you prefer), configured and with a URL to link to the MCP Tool.

✔️ How to set up

- **Configure the Evolution API webhook**: Copy the webhook URL generated in the first node. In the Evolution API panel, go to the instance > webhook > paste the URL into the corresponding field.
- **Configure Google Calendar credentials**: In n8n, go to Credentials → Create New and select Google Calendar OAuth2. Select this credential in all Google Calendar MCP nodes (Get, Create, Update, Delete).
- **Enable MCP Server nodes**: Copy the MCP Server URL and paste it into the "Endpoint" field of the MCP Tool.
- **Configure Evolution API nodes**: In all Evolution API nodes, fill in the "instance" field with the name of your Evolution API instance.

🦾 How to adapt it

- **Customize or extend the MCP Tools**: You can add new MCP tools (e.g., Google Sheets, Notion, ClickUp). Only the agent prompt needs to be updated; the workflow structure remains the same.
- **Memory**: I opted to use simple memory, but if you want the agent to remember the entire conversation, I recommend changing the memory type; as configured, it will only remember the last 8 messages.
- **Other front ends**: If you're going to use a tool like Chatwoot or TypeBot, simply change the webhook URL and pay attention to the objects that the "Message Type" switch uses.
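The "Message Type" routing described above can be sketched as a small function. The messageType values below follow common Evolution API webhook payload conventions, but treat them as assumptions and verify them against your own instance's payloads:

```javascript
// Supported types map to the Text, Image, Audio, and Document branches of
// the switch; anything else falls through to the "media not supported" reply.
// The type names here are assumptions, not guaranteed Evolution API values.
const SUPPORTED = ['conversation', 'imageMessage', 'audioMessage', 'documentMessage'];

function routeMessage(messageType) {
  return SUPPORTED.includes(messageType) ? messageType : 'unsupported';
}
```

A sticker, for example, would return 'unsupported' and trigger the fallback reply to the user.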
by vinci-king-01
How it works

This workflow automatically monitors supplier health and supply chain risks, providing real-time alerts and daily reports to procurement teams.

Key Steps

- **Daily Risk Check** - Runs the workflow every morning at 9:00 AM to assess supplier health.
- **Multi-Source Data Collection** - Scrapes supplier websites, investor relations pages, and industry news for risk indicators.
- **AI-Powered Risk Analysis** - Uses ScrapeGraphAI to extract and analyze financial status, operational issues, and regulatory problems.
- **Risk Scoring Engine** - Calculates comprehensive risk scores (1-10) based on multiple factors including financial health, operational disruptions, and news sentiment.
- **Alternative Supplier Discovery** - Automatically searches for backup suppliers when high-risk situations are detected.
- **Smart Alert System** - Routes notifications based on risk levels: immediate alerts for high-risk suppliers, daily summaries for normal operations.
- **Multi-Channel Notifications** - Sends alerts via Slack and detailed reports via email to procurement teams.

Set up steps

Setup time: 10-15 minutes

- **Configure ScrapeGraphAI credentials** - Add your ScrapeGraphAI API key for web scraping capabilities.
- **Set up Slack integration** - Connect your Slack workspace and configure the #procurement-alerts and #supply-chain-updates channels.
- **Configure email settings** - Set up email credentials for detailed reports to procurement teams.
- **Customize supplier URLs** - Update the supplier website URLs to monitor your specific suppliers.
- **Adjust risk thresholds** - Modify the risk scoring parameters based on your industry and risk tolerance.
- **Set notification preferences** - Configure alert conditions and message formatting for your team's needs.
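The Risk Scoring Engine and alert routing described above could look something like this in an n8n Code node. The factor names, weights, and the high-risk threshold are illustrative assumptions, not the template's actual parameters:

```javascript
// Weighted 1-10 risk score from a few sub-scores (0-10, higher = worse).
// Weights and factor names are assumptions for illustration.
function scoreSupplier(factors) {
  const weights = { financialHealth: 0.4, operationalDisruptions: 0.35, newsSentiment: 0.25 };
  let score = 0;
  for (const [name, weight] of Object.entries(weights)) {
    score += (factors[name] ?? 0) * weight;
  }
  // Clamp to the 1-10 range used by the alert router.
  return Math.min(10, Math.max(1, Math.round(score)));
}

const risk = scoreSupplier({ financialHealth: 8, operationalDisruptions: 6, newsSentiment: 4 });
// Route: immediate alert for high risk, daily summary otherwise.
const channel = risk >= 7 ? '#procurement-alerts' : '#supply-chain-updates';
```

Adjusting the weights or the `risk >= 7` threshold is how you would tune the "risk thresholds" mentioned in the setup steps.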
by Avkash Kakdiya
How it works

This workflow sends WhatsApp messages and emails in bulk using contact data stored in Google Sheets. Contacts are processed in small batches to control throughput and avoid API rate limits. WhatsApp and email are treated as independent channels and are sent only when their status is marked as pending. All success and failure results are written back to Google Sheets to enable tracking, retries, and safe re-runs.

Step-by-step

**Step 1: Fetch contacts & batch processing**
- Manual Trigger – Starts the workflow manually.
- Get Contacts – Reads contact data from Google Sheets.
- Split In Batches – Processes contacts in controlled batch sizes.

**Step 2: Email preparation & sending**
- Has Email Address – Checks whether the contact has an email address.
- IF Mail Pending – Ensures the email is still marked as pending.
- PrepareEmail – Loads the selected InboxPlus email template.
- Build HTML Email – Builds the final HTML email body.
- Fetch Email Image – Downloads images for inline or attachment usage.
- Send Gmail – Sends the email via Gmail.
- Delivered – Confirms successful email delivery.

**Step 3: WhatsApp message sending**
- Has Phone Number – Checks whether the contact has a phone number.
- IF WhatsApp Pending – Ensures the WhatsApp message is still pending.
- Send template – Sends the approved WhatsApp template message.
- Sent – Confirms message acceptance by WhatsApp.

**Step 4: Delivery status updates**
- Update Sheet – Writes successful delivery results back to Google Sheets.
- Mail Failure – Updates Google Sheets if email delivery fails.
- Whatsapp Failure – Updates Google Sheets if WhatsApp delivery fails.

Why use this?

- Prevents duplicate messages with channel-level pending checks
- Handles WhatsApp and email independently in one workflow
- Supports safe retries without resending completed messages
- Keeps Google Sheets as the single source of truth
- Scales bulk outreach safely using batch-based execution
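The channel-level pending checks can be sketched as one function per contact row. The column names (email, phone, email_status, whatsapp_status) are assumptions about the Google Sheet layout, not the template's exact schema:

```javascript
// Decide which channels should fire for a contact. A channel is attempted
// only when the contact has an address AND its status is still "pending",
// which is what makes re-runs safe: completed rows are skipped.
function channelsToSend(contact) {
  const out = [];
  if (contact.email && contact.email_status === 'pending') out.push('email');
  if (contact.phone && contact.whatsapp_status === 'pending') out.push('whatsapp');
  return out;
}
```

On a retry, a contact whose email already succeeded (email_status updated to "sent") but whose WhatsApp failed would yield only ['whatsapp'].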
by Pratyush Kumar Jha
A compact n8n workflow that accepts a YouTube link or uploaded video, pulls a transcript via Supadata.ai, runs a language-model-based video analysis agent to produce a structured report, extracts a title/metadata, then creates and updates a Google Doc with the analysis. It's designed to automate transcription → analysis → document creation for fast, repeatable video reviews.

How it works

- **Trigger – Upload File or YouTube Link**: A form trigger receives a youtube_url or an uploaded file/webhook event.
- **Transcription – Transcription using Supadata.ai**: Calls the transcription API using the x-api-key header to retrieve the video transcript/text.
- **Analysis – Analyser**: The transcript is passed to the Analyser LangChain agent, which runs a tailored prompt (expert video analyst) and generates a plain-text report.
- **Metadata extraction – File Name Detector**: The information extractor parses the analyser output to extract structured attributes such as the Title.
- **Aggregation & Merge**: Merge/Aggregate nodes combine the analysis and extracted fields into a single payload.
- **Document Creation**: Creating New File creates a Google Docs document using the extracted Title, and Updating Content in File inserts the analyser output into the document.
- **Optional Follow-ups**: Additional nodes can forward the document link, send it to Slack, or store metadata in a database.

Quick Setup Guide

👉 Demo & Setup Video
👉 Course

Nodes of interest

- **Upload File or YouTube Link** (formTrigger, webhook) – Entry point for user-supplied links or files.
- **Transcription using Supadata.ai** (httpRequest) – Fetches the transcript from https://api.supadata.ai/... and requires the x-api-key header.
- **OpenRouter Chat Model / OpenRouter Chat Model1** (lmChatOpenRouter) – Language model nodes connected to the Analyser and File Name Detector, using the model deepseek/deepseek-r1-distill-llama-70b.
- **Analyser** – LangChain agent node that contains the expert analysis prompt and generates a full plain-text report from the transcript. Configuration includes hasOutputParser: true and retry enabled.
- **File Name Detector** – LangChain information extractor that extracts structured attributes like Title from the analysis output.
- **Merge / Aggregate** – Combines outputs from analysis and extraction into a single payload used for document creation.
- **Creating New File / Updating Content in File** – Google Docs nodes used to create and update documents using googleDocsOAuth2Api credentials.

What you'll need (credentials)

- **OpenRouter account** – Used by the OpenRouter Chat Model nodes. API key stored in the openRouterApi credential.
- **Supadata.ai API key** – Added in the HTTP header x-api-key in the transcription request.
- **Google Docs OAuth2** – googleDocsOAuth2Api credential used for creating and updating Google Docs.
- **Optional integrations** – Slack webhook, Google Drive, or database credentials if adding notifications or persistent storage.

Recommended settings & best practices

- **Prompt control** – Keep the Analyser prompt explicit about required sections, output style, and how to handle missing transcripts.
- **Retries & timeouts** – Enable retries for long-running model or HTTP calls. Configure proper HTTP request timeouts.
- **Rate limits** – Respect transcription and model provider rate limits. Add throttling if needed.
- **Input validation** – Validate the youtube_url before processing and handle transcript failures gracefully.
- **Chunk transcripts** – Split long transcripts into chunks before sending to the LLM to avoid context limit issues.
- **Logging & audit** – Store transcripts, analysis results, and metadata for debugging and traceability.
- **Security** – Store API keys as n8n credentials rather than plaintext.
- **Document naming** – Sanitize the extracted Title to prevent invalid filename characters.
- **Monitoring** – Add error notifications via email or Slack for failed runs.

Customization ideas

- **Alternative transcription providers** – Replace Supadata.ai with AssemblyAI, Whisper (self-hosted), or YouTube captions.
- **Multiple output formats** – Export results to Google Docs, PDF, or JSON metadata.
- **Speaker diarization** – Include speaker labels and timestamps in the analysis.
- **Summaries & highlights** – Add TL;DR summaries and timestamped key moments.
- **Content classification** – Use additional LLM nodes to detect sentiment, category, or compliance issues.
- **Thumbnail generation** – Capture frames from the video to generate thumbnails.
- **Webhook callbacks** – Send the document link to Slack, email, or other systems.
- **Model routing** – Use smaller models for short videos and higher-quality models for long videos.
- **Human review pipeline** – Create a review queue for manual verification before publishing results.

Tags

video-analysis, transcription, n8n, langchain, automations, google-docs, openrouter, supadata, reporting, workflow
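The "Chunk transcripts" best practice above can be sketched as a simple character-based splitter with overlap. The chunk size and overlap values are illustrative assumptions; tune them to your model's context window:

```javascript
// Split a long transcript into overlapping chunks before sending each chunk
// to the LLM. Overlap keeps sentences that straddle a boundary visible in
// both neighboring chunks.
function chunkTranscript(text, chunkSize = 4000, overlap = 200) {
  const chunks = [];
  for (let start = 0; start < text.length; start += chunkSize - overlap) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // last chunk reached the end
  }
  return chunks;
}
```

A more careful implementation would split on sentence or paragraph boundaries and count tokens rather than characters, but the control flow stays the same.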
by Zain Khan
AI-Powered Customer Feedback: Triage and Insight-Driven Chat

This n8n workflow creates a two-phase system for handling customer feedback received via a Jotform submission. The first agent quickly triages the issue, and the second agent engages in a persistent, conversational exchange over email to collect the information necessary for a resolution.

Phase 1: Triage and Initial Action (AI Agent)

This phase is triggered by a new submission on the Jotform. The goal is to immediately categorize the feedback and take the appropriate initial action.

- **Jotform Trigger**: The workflow starts instantly when a user submits your designated feedback form.
- **AI Agent (Triage)**: This agent (powered by Google Gemini) has two primary jobs:
  - **Sentiment Analysis and Response Drafting**: It reads the feedback (q6_typeA6) and the user's name (q3_name.first). If positive, it uses the Send a message in Gmail tool to send a concise, appreciative thank-you note. If negative, it uses the Send a message in Gmail tool to send an initial, empathetic response acknowledging the issue and stating that a team member will follow up with questions, and it uses the Create an issue in Jira Software tool to log the bug or issue immediately.
  - **Data Structuring**: It uses the Structured Output Parser to extract key data points, most importantly the threadId of the initial email, which is crucial for the follow-up conversation agent.

Phase 2: Conversational Insight Gathering (AI Agent (Chat))

This phase takes over for all negative feedback, engaging the customer in a back-and-forth exchange to collect essential details required for the development or support team.

- **Gmail Trigger**: This node polls for new, unread emails, which are expected to be replies from the customer.
- **Simple Memory**: This node is vital for the conversational aspect. It is configured to use the unique email threadId as its session key, allowing the AI Agent to remember the entire history of the conversation (previous questions asked and details provided) across multiple emails.
- **AI Agent (Chat)**: This is the second agent and the core of the conversational process.
  - **Role**: It acts as a dedicated feedback assistant.
  - **Goal**: Its instruction is to reply and ask for specific, missing information needed for the ticket, such as what device the customer was using and whether they know the steps to reproduce the issue, and to confirm that the team will send a free coupon for credits as a thank-you for their help.
  - **Tool**: It uses the Reply to a message in Gmail tool to continue the conversation directly within the original email thread.
  - **Resolution**: The agent is trained to look for confirmation that all necessary information has been provided. Once it determines the issue details are complete, it sends a final thank-you email and automatically uses the Jira tool to summarize and update the existing Jira issue with the new insights, closing the loop on the data collection process.

Requirements

To implement and run this automated customer feedback workflow, the following accounts and credentials are required:

1. Automation Platform
- **n8n Instance**: A running instance of n8n (Cloud or self-hosted) to host and execute the workflow. Sign up for n8n using: https://n8n.partnerlinks.io/pe6gzwqi3rqw

2. Service Credentials

You must set up and connect the following credentials within your n8n instance:
- **Google Gemini API Key**: Required to power both AI Agent nodes for sentiment analysis and conversational follow-up.
- **Gmail OAuth2/API Key**: Required for the Send a message in Gmail tool (for initial replies), the Gmail Trigger (to detect new replies), and the Reply to a message in Gmail tool (for the ongoing conversation).
- **Jotform API Key**: Required for the Jotform Trigger node to instantly receive and process new form submissions. Sign up for Jotform using: https://www.jotform.com/?partner=zainurrehman
- **Jira Software Credentials**: Required for the Create an issue in Jira Software tool (for the first agent) and the Jira Tool (for the second agent to update the ticket).

3. External Configurations
- **Jotform Setup**: A live Jotform must be configured with specific fields to capture the user's name, email, and the feedback text.
- **Jira Setup**: You need a designated Jira Project and a defined Issue Type for the workflow to create and update tickets.
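The handoff between the two phases hinges on the threadId extracted in Phase 1. A hypothetical shape of the triage agent's structured output (the field names are illustrative, not the template's exact parser schema) makes the mechanism concrete:

```javascript
// Illustrative Phase 1 output, modeled on the description above.
const triageOutput = {
  sentiment: 'negative',              // 'positive' | 'negative'
  customerName: 'Ada',                // from q3_name.first
  feedback: 'App crashes on login',   // from q6_typeA6
  threadId: '18c2f3a9d4e5b6f7',       // Gmail thread of the initial reply
};

// Phase 2's Simple Memory keys its session on the thread, so every email
// thread gets its own isolated conversation history.
const sessionKey = triageOutput.threadId;
```

Because each customer reply arrives in the same Gmail thread, keying memory on threadId is what lets the chat agent pick up exactly where the last email left off.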
by SIÁN Agency
Monitor Idealista Rental Prices and Send Slack Alerts

Who is this for?

Apartment hunters, property managers, and relocation consultants who want instant Slack notifications when new rental listings appear on Idealista matching their exact criteria.

What this workflow does

Never miss a new rental listing on Idealista again. This workflow runs every 6 hours, scrapes the latest rental listings matching your exact criteria, detects genuinely new listings you haven't seen before, and sends Slack alerts with price, size, location, and a direct link.

This workflow uses the n8n-nodes-idealista-scraper community node and requires a self-hosted n8n instance.

How it works

- The Schedule Trigger fires every 6 hours.
- The Idealista Scraper fetches the latest rentals via API-based extraction, which, unlike screen scrapers, doesn't break when the page layout changes.
- The Code node compares results against previously seen listings using n8n's workflow static data.
- Only genuinely new listings pass through; duplicates are automatically filtered out.
- Each new listing is sent as a rich Slack message and logged to Google Sheets.

Setup

1. Install n8n-nodes-idealista-scraper via Settings > Community Nodes
2. Add your Apify API credential (get token here)
3. Add your Slack credential (OAuth2)
4. Create a Slack channel (default: #real-estate-alerts)
5. Create a Google Sheet for listing history and select it in the Sheets node
6. Customize the scraper filters for your target area and budget
7. Activate the workflow!
Customization

- Change locationId for other cities (Barcelona, Rome, Lisbon, Milan)
- Adjust maxPrice, bedroom, and amenity filters (64+ available, including pets, furnished, AC)
- Replace Slack with Email, Telegram, or Discord
- Adjust the schedule interval (default: every 6 hours)

The first run treats all listings as new; subsequent runs only alert on unseen listings.

Cost: $0.10/day (4 runs x 1 page x 40 properties)

Default filters

| Filter | Value |
|--------|-------|
| Operation | Rent |
| Location | Madrid |
| Max Price | 1,500 EUR/month |
| Bedrooms | 1-2 bedrooms |
| Elevator | Required |
| Published Since | Last week |
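The dedup step against previously seen listings can be sketched as follows. In the actual Code node you would persist the seen-ID list with n8n's `$getWorkflowStaticData('global')`; here it's a plain object so the logic runs standalone, and the `propertyCode` field name is an assumption about the scraper's output:

```javascript
// Keep a set of already-alerted listing IDs in workflow static data and
// let only unseen listings through. On the first run everything is new,
// matching the behavior described above.
function filterNewListings(staticData, listings) {
  staticData.seenIds = staticData.seenIds || [];
  const seen = new Set(staticData.seenIds);
  const fresh = listings.filter((l) => !seen.has(l.propertyCode));
  fresh.forEach((l) => seen.add(l.propertyCode));
  staticData.seenIds = [...seen]; // n8n persists this between runs
  return fresh;
}
```

Because the seen-ID list lives in static data rather than an external store, the dedup survives restarts without any extra infrastructure, though the Google Sheet still serves as the human-readable history.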
by Ilyass Kanissi
📋 Instant Proposal Generator

Automatically convert sales call transcripts into professional client proposals by extracting key details with AI, dynamically populating Google Slides templates, and tracking progress in Airtable, all in one seamless workflow.

🎯 What does this workflow do?

This end-to-end automation creates client-ready proposals by:
- Taking call transcripts via a chat interface.
- Analyzing the transcript with AI to extract key details like company name, goals, budget, and requirements, then structuring this data as JSON for seamless workflow integration.
- Generating customized documents using a Google Slides template with dynamic variables, auto-populating {Company_Name}, {Budget}, etc. from the extracted data.
- Delivering finished proposals: sharing the final document with the client and updating the CRM status automatically.

⚙️ How it works

- **User input**: Paste the call transcript into the chat trigger node.
- **AI analysis**: An OpenAI node processes the text to extract structured JSON, identifying company name, goals, budget, requirements, etc.
- **Document copy**: Copies the template file from Google Drive and names it "{company name} proposal".
- **Variable replacement**: Replaces all template variables ({Company_Name}, {Budget}, etc.) with the extracted data from ChatGPT.
- **Delivery & tracking**: Shares the finalized proposal with the client via email and updates the Airtable "Lead Status" to "Proposal Sent".

🔑 Required setup

- **OpenAI API Key**: Create a key from here
- **Google Cloud Credentials**: Setup here. Required scopes: Google Slides edit + file creation
- **Airtable Access Token**: Create one from here
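The variable-replacement step can be sketched as a simple token substitution. In the actual workflow this is done through the Google Slides node's text-replacement operation; the function below just shows the mapping from the extracted JSON onto {Placeholder} tokens, with illustrative field names:

```javascript
// Replace {Token} placeholders with values from the AI-extracted JSON.
// Unknown tokens are left untouched so a missing field is visible in the
// draft rather than silently blanked.
function fillTemplate(text, data) {
  return text.replace(/\{(\w+)\}/g, (match, key) =>
    key in data ? String(data[key]) : match
  );
}

const filled = fillTemplate(
  'Proposal for {Company_Name}, budget {Budget}',
  { Company_Name: 'Acme Corp', Budget: '$12,000' }
);
// → "Proposal for Acme Corp, budget $12,000"
```

Leaving unknown tokens in place is a deliberate choice: a stray {Timeline} in the finished slide flags that the AI extraction missed a field.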
by Bohdan Saranchuk
This n8n template automates your customer support workflow by connecting Gmail, OpenAI, Supabase, and Slack. It listens for new incoming emails, classifies them using AI, routes them to the appropriate Slack channel based on category (e.g., support or new requests), logs each thread to Supabase for tracking, and marks the email as read once processed.

Good to know

- The OpenAI API is used for automatic email classification, which incurs a small per-request cost. See OpenAI Pricing for up-to-date info.
- You can easily expand the categories or connect more Slack channels to fit your workflow.
- The Supabase integration ensures you don't process the same thread twice.

How it works

1. The Gmail Trigger checks for unread emails.
2. Supabase Get Row verifies whether the thread already exists.
3. If it's a new thread, the OpenAI node classifies the email into categories such as "support" or "new-request."
4. The Switch node routes messages to the correct Slack channel based on classification.
5. Supabase Create Row logs thread details (sender, subject, IDs) to your database.
6. Finally, the Gmail node marks the message as read to prevent duplication.

How to use

- The workflow uses a manual Gmail trigger by default, but you can adjust the polling frequency.
- Modify category names or Slack channels to match your internal setup.
- Extend the workflow to include auto-replies or ticket creation in your CRM.

Requirements

- Gmail account (with OAuth2 credentials)
- Slack workspace (with channel access)
- OpenAI account for classification
- Supabase project for storing thread data

Customizing this workflow

Use this automation to triage incoming requests, route sales leads to specific teams, or even filter internal communications. You can add nodes for auto-responses, CRM logging, or task creation in Notion or ClickUp.
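The classify-then-route step can be sketched as a small lookup. The category names come from the description above; the channel mapping and the catch-all fallback are illustrative assumptions you would adapt in the Switch node:

```javascript
// Map AI classification labels to Slack channels. The fallback catches any
// label the model invents that the Switch node doesn't know about.
const channelByCategory = {
  'support': '#support',
  'new-request': '#new-requests',
};

function routeEmail(category) {
  return channelByCategory[category] ?? '#inbox-triage';
}
```

Expanding the workflow to more categories, as suggested above, is then just a matter of adding entries to the map and matching branches in the Switch node.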
by spencer owen
YNAB Super Budget

Ever wish that YNAB was just a little smarter when auto-categorizing your transactions? Now you can supercharge your YNAB budget with ChatGPT! No more manual categorization.

Setup

1. Get a YNAB API Key
2. Get your YNAB Budget ID & Account ID (they are part of the URL): https://app.ynab.com/BUDGETID/accounts/ACCOUNTID

Additional information

Every transaction that this workflow modifies will be tagged with n8n and colored yellow. You can easily review all changes by selecting just that tag.

Customization

- By default it pulls transactions from the last 30 days.
- This workflow will post a message in a Discord channel showing which transactions it modified and what categories it chose. Discord notifications are optional.

Considerations

YNAB allows 200 API calls per hour. If you have more than 200 uncategorized transactions, consider reducing the previous_days value.
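The "last 30 days" window maps to an ISO date that the YNAB transactions endpoint accepts as its since_date query parameter. A minimal sketch, where previous_days mirrors the workflow's setting and the exact request path should be checked against the YNAB API docs:

```javascript
// Compute the since_date value for the YNAB transactions request from the
// previous_days setting. YNAB expects a plain ISO date (YYYY-MM-DD).
function sinceDate(previousDays, now = new Date()) {
  const d = new Date(now.getTime() - previousDays * 24 * 60 * 60 * 1000);
  return d.toISOString().slice(0, 10);
}

// e.g. GET /v1/budgets/BUDGETID/transactions?since_date=2024-05-01
const url = `/budgets/BUDGETID/transactions?since_date=${sinceDate(30)}`;
```

Lowering previous_days shrinks the transaction set per run, which is the lever the Considerations section suggests for staying under the 200-calls-per-hour limit.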
by Omer Fayyaz
This workflow automates multi-platform content creation by transforming form submissions into tailored blog posts, LinkedIn posts, and Facebook posts using AI and web research.

What Makes This Different

- **Multi-Platform Content Generation** - Creates optimized content for blog, LinkedIn, and Facebook simultaneously
- **AI-Powered Content Adaptation** - Uses OpenAI to tailor content for each platform's unique audience and format
- **Web Research Integration** - Leverages the Tavily API to gather relevant, up-to-date information on any topic
- **Form-Based Input** - Simple form interface for content subject and target audience specification
- **Automated Workflow** - End-to-end automation from form submission to content delivery
- **Slack Integration** - Delivers all generated content via Slack notification for easy review and sharing

Key Benefits of Automated Content Creation

- **Time Efficiency** - Generates three different content pieces in one workflow execution
- **Platform Optimization** - Each content piece is specifically crafted for its intended platform
- **Research-Backed Content** - Incorporates current web information for accurate, relevant content
- **Consistent Brand Voice** - AI ensures consistent tone and messaging across all platforms
- **Scalable Content Production** - Handles multiple content requests without manual intervention
- **Centralized Delivery** - All content delivered to one location for easy management

Who's it for

This template is designed for content marketers, social media managers, small business owners, marketing agencies, and content creators who need to produce consistent, high-quality content across multiple platforms. It's perfect for businesses that want to streamline their content creation process, maintain a consistent brand voice, and leverage AI to create platform-specific content that resonates with their target audience.

How it works / What it does

This workflow creates an automated content creation system that transforms form submissions into multi-platform content.
The system:
1. Receives form submissions with content subject and target audience through the n8n form trigger
2. Extracts search parameters from form data to prepare for web research
3. Searches the web using the Tavily API to gather relevant, current information on the topic
4. Processes search results by splitting and aggregating content for AI processing
5. Generates platform-specific content using OpenAI agents for LinkedIn, Facebook, and blog formats
6. Aggregates all content into a single output with all three platform versions
7. Sends a Slack notification with all generated content for review and distribution

**Key Innovation: Multi-Platform AI Content Generation** - Unlike traditional content tools that create one piece of content, this system automatically generates three different versions optimized for each platform's unique audience, format requirements, and engagement patterns, all based on current web research and AI-powered adaptation.

How to set up

1. Configure Form Trigger
- Set up the n8n form trigger with "Content Subject" and "Target Audience" fields
- Configure form settings and validation rules
- Test form submission functionality
- Ensure proper data flow to subsequent nodes

2. Configure OpenAI API
- Set up OpenAI API credentials in n8n
- Ensure proper API access and quota limits
- Configure the OpenAI Chat Model node for content generation
- Test AI model connectivity and response quality

3. Configure Tavily API
- **Get your API key**: Sign up at tavily.com and obtain your API key from the dashboard
- **Add the API key to the workflow**: In the "Search Web" HTTP Request node, replace "ADD YOU API KEY HERE" with your actual Tavily API key
- **Example configuration**:

      { "api_key": "your-actual-api-key-here", "query": "{{ $json.query.replace(/\"/g, '\\\"') }}", "search_depth": "basic", "include_answer": true, "topic": "news", "include_raw_content": true, "max_results": 3 }

- **Configure search parameters**: Ensure max_results is set to 3 and search_depth to "basic" for optimal performance
- **Test API connectivity**: Run a test execution to verify search results are returned correctly

4. Configure Slack Integration
- Set up Slack API credentials in n8n
- Configure the Slack channel ID for content delivery
- Set up proper message formatting for content display
- Test Slack notification delivery

5. Test the Complete Workflow
- Submit a test form with a sample content subject and target audience
- Verify the web search returns relevant results
- Check that the AI generates appropriate content for all three platforms
- Confirm the Slack notification contains all generated content

Requirements

- **n8n instance** with form trigger and HTTP request capabilities
- **OpenAI API** access for AI-powered content generation
- **Tavily API** credentials for web search functionality
- **Slack workspace** with API access for content delivery
- **Active internet connection** for real-time API interactions

How to customize the workflow

Modify Content Generation Parameters
- Adjust the number of web search results (currently set to 3)
- Add more search depth options (basic, advanced, comprehensive)
- Implement content length controls for different platforms
- Add content tone and style preferences

Enhance AI Capabilities
- Customize AI prompts for specific industries or niches
- Add support for multiple languages
- Implement brand voice consistency across all platforms
- Add content quality scoring and optimization

Expand Content Sources
- Integrate with additional research APIs (Google Search, Bing, etc.)
- Add support for internal knowledge base integration
- Implement trending topic detection
- Add competitor content analysis

Improve Content Delivery
- Add email notifications alongside Slack
- Implement content scheduling capabilities
- Add content approval workflows
- Implement content performance tracking

Business Features
- Add content analytics and performance metrics
- Implement A/B testing for different content versions
- Add content calendar integration
- Implement team collaboration features

Key Features

- **Multi-platform content generation** - Creates optimized content for blog, LinkedIn, and Facebook
- **AI-powered content adaptation** - Tailors content for each platform's unique requirements
- **Web research integration** - Incorporates current, relevant information from web searches
- **Form-based input** - Simple interface for content subject and target audience specification
- **Automated workflow** - End-to-end automation from form submission to content delivery
- **Platform-specific optimization** - Each content piece follows platform best practices
- **Slack integration** - Centralized delivery of all generated content
- **Scalable content production** - Handles multiple content requests efficiently

Technical Architecture Highlights

AI-Powered Content Generation
- **OpenAI integration** - Advanced language model for content creation
- **Platform-specific prompts** - Tailored AI instructions for each social platform
- **Content optimization** - AI ensures platform-appropriate formatting and tone
- **Quality consistency** - Maintains brand voice across all generated content

Web Research Integration
- **Tavily API** - Comprehensive web search with content extraction
- **Real-time data** - Access to current, relevant information
- **Content aggregation** - Combines multiple sources for comprehensive coverage
- **Search optimization** - Efficient query construction for better results

Form-Based Input System
- **n8n form trigger** - Simple, user-friendly input interface
- **Data validation** - Ensures required fields are properly filled
- **Parameter extraction** - Converts form data to search and generation parameters
- **Error handling** - Graceful handling of incomplete or invalid inputs

Multi-Platform Output
- **LinkedIn optimization** - Professional tone with industry-specific formatting
- **Facebook adaptation** - Engaging, shareable content with appropriate length
- **Blog formatting** - Comprehensive, SEO-friendly long-form content
- **Unified delivery** - All content delivered through a single Slack notification

Use Cases

- **Content marketing agencies** needing efficient multi-platform content creation
- **Small businesses** requiring a consistent social media presence across platforms
- **Marketing teams** looking to streamline content production workflows
- **Solo entrepreneurs** needing professional content without hiring writers
- **E-commerce brands** requiring product-focused content for multiple channels
- **Professional services** needing thought leadership content across platforms
- **Event organizers** requiring promotional content for different social channels
- **Educational institutions** needing content for student engagement and recruitment

Business Value

- **Time Efficiency** - Reduces content creation time from hours to minutes
- **Cost Savings** - Eliminates the need for multiple content creators or agencies
- **Consistency** - Maintains brand voice and messaging across all platforms
- **Scalability** - Handles unlimited content requests without additional resources
- **Quality Assurance** - AI ensures professional-quality content every time
- **Multi-Platform Reach** - Maximizes content distribution across key social channels
- **Research Integration** - Incorporates current information for relevant, timely content

This template revolutionizes content creation by combining AI-powered writing with real-time web research, creating an automated system that produces high-quality, platform-optimized content for blog, LinkedIn, and Facebook from a simple form submission.
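The Tavily request body shown in the setup section escapes double quotes in the user's query with `$json.query.replace(/\"/g, '\\\"')`. The reason is easy to demonstrate: the query is interpolated into a JSON string, so unescaped quotes would break the body. A standalone sketch of that expression:

```javascript
// Escape double quotes so the user's topic can be embedded in a JSON
// string without invalidating it (the same transformation the n8n
// expression in the request body performs).
function escapeForJsonString(query) {
  return query.replace(/"/g, '\\"');
}

const body = `{ "api_key": "KEY", "query": "${escapeForJsonString('latest "AI" news')}", "max_results": 3 }`;
JSON.parse(body); // parses cleanly despite the embedded quotes
```

In a real Code node you could alternatively build the body as an object and let n8n serialize it with JSON.stringify, which handles escaping for you.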
by Anshul Chauhan
Deploy a Multi-Tool AI Assistant on WhatsApp with Google Gemini Deploy a true AI assistant on WhatsApp. This n8n workflow uses a sophisticated hierarchical agent structure to not only handle conversations but also manage your emails and calendar directly from your chat, all powered by Google Gemini. Key Features Powered by Google Gemini:** Utilizes the advanced capabilities of Google's Gemini models for understanding complex commands and generating natural, human-like responses. Intelligent Task Delegation (Hierarchical Agents):* Features a central *Personal Agent** that understands the user's intent and intelligently delegates tasks to specialized sub-agents for email, calendar, or general chat. Full Email & Calendar Management:** Connects directly to your Google Workspace to send emails, create drafts, apply labels, create/update/delete calendar events, check your availability, and more. Context-Aware Conversations:** Employs memory at multiple levels, allowing the assistant to remember the context of your requests for a coherent and intuitive user experience. Seamless WhatsApp Integration:** Connects directly with the WhatsApp Business API to send and receive messages, engaging users on one of the world's most popular messaging platforms. Easy to Deploy & Customize:** Get your assistant running with minimal configuration and easily extend its capabilities by adding new tools or modifying the prompts of the existing agents. How It Works The workflow uses an advanced agent-based model to process incoming messages: The Whatsapp Trigger node listens for and receives new messages sent to your WhatsApp Business number. The message is passed to the main Manager Agent. The Personal Agent analyzes the message to understand the user's intent (e.g., "send an email," "check my schedule," or just "hello"). Based on the intent, it routes the task to the appropriate sub-agent: the Email Tool, the Calendar Tool, or the general Chatbot Model. 
- The selected sub-agent executes the task using its own dedicated tools (e.g., the Email Tool uses Gmail nodes to send a message).
- The result or response from the sub-agent is passed back to the Send message (WhatsApp) node, which delivers the reply to the user.

Prerequisites
- An active n8n instance.
- A Meta Business Account and a configured Meta App with the "WhatsApp Business" product added.
- A Google Gemini API Key.
- A Google Account with pre-configured OAuth2 credentials in n8n for Gmail and Google Calendar.

Step-by-Step Setup Guide
1. **Configure WhatsApp Credentials:** In your n8n instance, add new "WhatsApp Business" credentials. You will need a Permanent Access Token and a Phone Number ID from your Meta App's "WhatsApp > API Setup" dashboard.
2. **Set Up the WhatsApp Trigger:** Open the Whatsapp Trigger node. In the "Webhook URL" section, copy the Test URL. Go to your Meta App's dashboard under "WhatsApp > Configuration". Click "Edit" in the Webhooks section and paste the n8n Test URL into the Callback URL field. Create and enter a Verify token (a simple password of your choice), and enter this same token in the Whatsapp Trigger node in n8n. Subscribe to the messages webhook event. Once verified, copy the Production URL from n8n and paste it into the same Callback URL field in the Meta dashboard.
3. **Configure the Google Gemini Nodes:** You must add your Google Gemini API Key to the credentials for all the Google Gemini Chat Model nodes. This includes the ones in the Chatbot Model, Email Tool, and Calendar Tool.
4. **Configure the Google Tools (Email & Calendar):** Open the group of nodes labeled **Email Tool** and, for every Gmail node (Send Email, Create Draft, Get Labels, etc.), select your pre-configured Google OAuth2 credential. Then open the group of nodes labeled **Calendar Tool** and do the same for every Google Calendar node (Create Event, Get all event, etc.).
5. **Activate and Test:** Save and activate the workflow.
Send a message to your configured WhatsApp Business number.
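The webhook verification in step 2 follows Meta's standard handshake: Meta sends a GET request to your Callback URL with `hub.mode`, `hub.verify_token`, and `hub.challenge` query parameters, and expects the challenge echoed back when the token matches. n8n's Whatsapp Trigger handles this for you; the sketch below is only an illustration of the check that happens behind the scenes:

```python
def verify_webhook(params: dict, expected_token: str):
    """Return the challenge string if Meta's handshake is valid, else None.

    Meta calls the callback URL with:
      GET ?hub.mode=subscribe&hub.verify_token=<your token>&hub.challenge=<random>
    Echoing the challenge back with HTTP 200 completes verification.
    """
    if (params.get("hub.mode") == "subscribe"
            and params.get("hub.verify_token") == expected_token):
        return params.get("hub.challenge")
    return None  # the caller should respond 403 so the subscription fails

# A valid handshake echoes the challenge back unchanged
print(verify_webhook(
    {"hub.mode": "subscribe", "hub.verify_token": "s3cret", "hub.challenge": "1158201444"},
    "s3cret",
))  # -> 1158201444
```

If verification keeps failing, the most common cause is a mismatch between the token typed in the Meta dashboard and the one entered in the trigger node.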
by Julian Reich
This n8n template demonstrates how to automatically convert voice messages from Telegram into structured, searchable notes in Google Docs using AI transcription and intelligent tagging. Use cases are many: try capturing ideas on the go while walking, recording meeting insights hands-free, creating voice journals, or building a personal knowledge base from spoken thoughts!

Good to know
- OpenAI Whisper transcription costs approximately $0.006 per minute of audio.
- ChatGPT tagging adds roughly $0.001-0.003 per message, depending on length.
- The workflow supports both German and English voice recognition.
- Text messages are also supported - they bypass transcription and go directly to AI tagging.
- Perfect companion: combine with the **Weekly AI Review** workflow for automated weekly summaries of all your notes!

How it works
- Telegram receives your voice message or text and triggers the workflow.
- An IF node intelligently detects whether you sent audio or text content.
- For voice messages: Telegram downloads the audio file and OpenAI Whisper transcribes it to text.
- For text messages: content is passed directly to the next step.
- ChatGPT analyzes the content and generates up to 3 relevant keywords (Work, Ideas, Private, Health, etc.).
- A function node formats everything with Swiss timestamps, message-type indicators, and a clean structure.
- The formatted entry is automatically inserted into your Google Doc with date, keywords, and full content.
- Telegram sends you a confirmation with the transcribed/original text so you can verify accuracy.

How to use
- Simply send a voice message or text to your Telegram bot - the workflow handles everything automatically.
- Manual execution can be used for testing, but in production this runs on every message.
- Voice messages work best with clear speech in quiet environments for optimal transcription.

Requirements
- Telegram Bot Token and configured webhook
- OpenAI API account for Whisper transcription and ChatGPT tagging
- Google Docs API access for document writing
- A dedicated Google Doc where all notes will be collected

Customising this workflow
- Adjust the AI prompt to use different tagging categories relevant to your workflow (e.g., project names, priorities, emotions).
- Add multiple Google Docs for different contexts (work vs. private notes).
- Include additional processing such as sentiment analysis or automatic task extraction.
- Connect to other apps like Notion, Obsidian, or your preferred note-taking system.

And don't forget to also implement the complementary Weekly AI Review workflow!
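The IF-node branching described above keys off the shape of the incoming Telegram update: in the Telegram Bot API, a voice note arrives as a message with a `voice` object, while a plain message carries a `text` field. A minimal sketch of that routing, plus the per-message cost arithmetic quoted in "Good to know" (the field names follow the Telegram Bot API; the cost figures are the approximations given above, with the tagging cost taken at its midpoint):

```python
def message_branch(update: dict) -> str:
    """Mirror the IF node: decide how an incoming Telegram update is handled."""
    msg = update.get("message", {})
    if "voice" in msg:
        return "transcribe"   # download the file, then send it to OpenAI Whisper
    if "text" in msg:
        return "tag"          # skip transcription, go straight to ChatGPT tagging
    return "unsupported"      # stickers, photos, etc. are not handled here

def estimated_cost_usd(audio_minutes: float) -> float:
    """Rough per-note cost: Whisper ~$0.006/min plus ~$0.002 for ChatGPT tagging."""
    whisper = audio_minutes * 0.006
    tagging = 0.002           # midpoint of the $0.001-0.003 range
    return round(whisper + tagging, 4)

print(message_branch({"message": {"voice": {"duration": 90}}}))  # -> transcribe
print(estimated_cost_usd(1.5))  # a 90-second note costs roughly $0.011
```

At these rates, even heavy daily use stays in the cents-per-day range, which is why the template pairs well with an always-on bot.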