by franck fambou
⚠️ IMPORTANT: This template requires self-hosted n8n hosting due to the use of community nodes (MCP tools). It will not work on n8n Cloud. Make sure you have access to a self-hosted n8n instance before using this template.

**Overview**

This workflow lets a Google Gemini-powered AI Agent orchestrate multi-source web intelligence using MCP (Model Context Protocol) tools such as Firecrawl, Brave Search, and Apify. Users interact with the agent in natural language; the agent then calls the appropriate external data-collection tools, processes the results, and automatically organizes them into structured spreadsheets. With built-in memory, flexible tool execution, and conversational capabilities, this workflow acts as a multi-agent research assistant capable of retrieving, synthesizing, and delivering actionable insights in real time.

**How the system works (AI Agent + MCP pipeline)**

1. **User interaction**: A chat message is received and forwarded to the AI Agent.
2. **AI orchestration**: The agent, powered by Google Gemini, decides which MCP tools to invoke based on the query:
   - Firecrawl-MCP: recursive web crawling and content extraction.
   - Brave-MCP: real-time web search with structured results.
   - Apify-MCP: automated web-scraping tasks with scalable execution.
3. **Memory management**: A memory module stores context across conversations, ensuring multi-turn reasoning and task continuity.
4. **Spreadsheet automation**: Results are structured in a new, automatically created Google Spreadsheet, enriched with formatting and additional metadata.
5. **Data processing**: The workflow generates the spreadsheet content, updates the sheet, and refines results via HTTP requests and field edits.
6. **Delivery of results**: Users receive a structured, contextualized dataset ready for review, analysis, or integration into other systems.
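As a reference point for the spreadsheet-content step, here is a minimal TypeScript sketch of the kind of transformation it performs: normalizing results returned by the different MCP tools into the row schema described in the configuration section. The input field names and shapes are assumptions for illustration, not the exact node configuration.

```typescript
// Minimal sketch (assumed shapes, not the exact node configuration) of the
// "Generate Spreadsheet Content" step: normalize results coming back from the
// different MCP tools into the row schema used by the generated spreadsheet.

type McpResult = {
  url?: string;
  title?: string;
  excerpt?: string;                       // assumed field name for snippets
  source: "Brave" | "Firecrawl" | "Apify";
  extra?: Record<string, unknown>;
};

type SheetRow = {
  URL: string;
  Title: string;
  Description: string;
  Source: string;
  Timestamp: string;                      // ISO 8601, as the template describes
  Metadata: string;                       // additional data, JSON-encoded
};

export function toSheetRows(results: McpResult[]): SheetRow[] {
  return results
    .filter((r) => Boolean(r.url))        // keep only entries with a source URL
    .map((r) => ({
      URL: r.url as string,
      Title: r.title ?? "(untitled)",
      Description: (r.excerpt ?? "").trim().slice(0, 500),
      Source: r.source,
      Timestamp: new Date().toISOString(),
      Metadata: JSON.stringify(r.extra ?? {}),
    }));
}
```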
**Configuration instructions**

Estimated setup time: 45 minutes

Prerequisites:
- Self-hosted n8n instance (v0.200.0 or higher recommended)
- Google Gemini API key
- MCP-compatible nodes (Firecrawl, Brave, Apify) configured
- Google Sheets credentials for spreadsheet automation

**Detailed configuration steps**

Step 1: Configure the AI Agent
- **AI Agent node**: Select Google Gemini as the LLM model, configure your Google Gemini API key in the n8n credentials, set the system prompt to guide the agent's behavior, and connect the Simple Memory node to enable context tracking.

Step 2: Integrate the MCP tools
- **Firecrawl-MCP configuration**: Install the @n8n/n8n-nodes-firecrawl-mcp package, configure your Firecrawl API key, and set crawling parameters (depth, CSS selectors).
- **Brave-MCP configuration**: Install the @n8n/n8n-nodes-brave-mcp package, add your Brave Search API key, and configure search filters (region, language, SafeSearch).
- **Apify-MCP configuration**: Install the @n8n/n8n-nodes-apify-mcp package, configure your Apify credentials, and select the appropriate actors for your use cases.

Step 3: Spreadsheet automation
- **"Create Spreadsheet" node**: Configure Google Sheets authentication (OAuth2 or Service Account), set the file name with dynamic timestamps, and specify the destination folder in Google Drive.
- **"Generate Spreadsheet Content" node**: Transform the agent's outputs into tabular format, define the columns (URL, Title, Description, Source, Timestamp), and configure data formatting (dates, links, metadata).
- **"Update Spreadsheet" node**: Insert the data into the created sheet, apply automatic formatting (headers, colors, column widths), and add summary formulas if necessary.

Step 4: Post-processing and delivery
- **"Data Enrichment Request" node** (formerly "HTTP Request1"): Configure optional API calls to enrich the data, add additional metadata (geolocation, sentiment, categorization), and manage errors and timeouts.
- **"Edit Fields" node**: Refine the final dataset (metadata, tags, filters), clean and normalize the data, and prepare the final response for the user.

**Structure of the generated Google Sheets**

Default columns:

| Column | Description | Type |
|---------|-------------|------|
| URL | Data source URL | Hyperlink |
| Title | Page/resource title | Text |
| Description | Description or content excerpt | Long text |
| Source | MCP tool used (Brave/Firecrawl/Apify) | Text |
| Timestamp | Date/time of collection | Date/Time |
| Metadata | Additional data (JSON) | Text |

Automatic formatting:
- **Headings**: Bold font, colored background
- **URLs**: Formatted as clickable links
- **Dates**: Standardized ISO 8601 format
- **Columns**: Width automatically adjusted to content

**Use cases**

Business and enterprise:
- Competitive analysis combining search, crawling, and structured scraping
- Market trend research with multi-source aggregation
- Automated reporting pipelines for business intelligence

Research and academia:
- Literature discovery across multiple sources
- Data collection for research projects
- Automated bibliographic extraction from online sources

Engineering and development:
- Discovery of APIs and documentation
- Aggregation of product information from multiple platforms
- Scalable structured scraping for datasets

Personal productivity:
- Automated creation of newsletters or knowledge hubs
- Personal research assistant compiling spreadsheets from various online data

**Key features**

Multi-source intelligence:
- Firecrawl for deep crawling
- Brave for real-time search
- Apify for structured web scraping

AI-driven orchestration:
- Google Gemini for reasoning and tool selection
- Memory for multi-turn interactions
- Context-based adaptive workflows

Structured data output:
- Automatic spreadsheet creation
- Data enrichment and formatting
- Ready-to-use datasets for reporting

Performance and scalability:
- Handles multiple simultaneous tool calls
- Scalable web data extraction
- Real-time aggregation from multiple MCPs

Security and privacy:
- Secure authentication based on API keys
- Data managed in Google Sheets / n8n
- Configurable retention and deletion policies

**Technical architecture**

Workflow: User query → AI agent (Gemini) → MCP tools (Firecrawl / Brave / Apify) → Aggregated results → Spreadsheet creation → Data processing → Results delivery

Supported data types:
- **Text and metadata** from crawled web pages
- **Search results** from Brave queries
- **Structured data** from Apify scrapers
- **Tabular reports** via Google Sheets

Integration options:
- Chat interfaces: web widget for conversational queries, Slack/Teams chatbot integration, REST API access points
- Data sources: websites (via Firecrawl/Apify), search engines (via Brave), APIs (via HTTP Request enrichment)

Performance specifications:
- Query response: < 5 seconds (search tasks)
- Crawl capacity: thousands of pages per run
- Spreadsheet automation: real-time creation and updates
- Accuracy: > 90% when using combined sources

**Advanced configuration options**

Customization:
- Set custom prompts for the AI Agent
- Adjust the spreadsheet schema for reporting needs
- Configure retries for failed tool runs

Analytics and monitoring:
- Track tool usage and costs
- Monitor crawl and search success rates
- Log queries and outputs for auditing

**Troubleshooting and support**

- **Timeouts**: Manually re-run failed MCP executions
- **Data gaps**: Validate Firecrawl/Apify selectors
- **Spreadsheet errors**: Check Google Sheets API quotas
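To make the "Edit Fields" clean-up step more concrete, here is a small TypeScript sketch of the kind of normalization it performs: de-duplicating rows by URL, trimming text, and standardizing timestamps to ISO 8601. It reuses the assumed SheetRow shape from the earlier sketch and is illustrative only.

```typescript
// Illustrative clean-up pass for the "Edit Fields" step: de-duplicate by URL,
// trim text fields, and normalize timestamps to ISO 8601. The SheetRow shape
// matches the assumed schema from the previous sketch.

type SheetRow = {
  URL: string;
  Title: string;
  Description: string;
  Source: string;
  Timestamp: string;
  Metadata: string;
};

export function cleanRows(rows: SheetRow[]): SheetRow[] {
  const seen = new Set<string>();
  const cleaned: SheetRow[] = [];
  for (const row of rows) {
    const key = row.URL.trim().toLowerCase();
    if (seen.has(key)) continue;          // drop duplicate URLs
    seen.add(key);
    cleaned.push({
      ...row,
      Title: row.Title.trim(),
      Description: row.Description.replace(/\s+/g, " ").trim(),
      Timestamp: new Date(row.Timestamp).toISOString(), // enforce ISO 8601
    });
  }
  return cleaned;
}
```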
by Rahul Joshi
Description Automate Jira backlog management with intelligent cleanup, prioritization, and AI-powered reporting. This workflow scans daily to identify stale issues, missing priorities, and overdue tasks — auto-updates Jira with corrective labels, logs everything into Google Sheets for tracking, and notifies teams via Slack. Every Friday, it sends an AI-generated backlog summary email to project leads for visibility and planning. 🚀📅 What This Template Does Step 1: Triggers automatically every weekday at 7:00 AM to fetch backlog issues from Jira. ⏰ Step 2: Filters issues missing estimates, assignees, or priority values for cleanup. 🧹 Step 3: Applies corrective labels (e.g., “Needs Estimation,” “Unassigned,” “Overdue”). 🏷️ Step 4: Logs all flagged issues into Google Sheets with timestamps for audit tracking. 📊 Step 5: Sends real-time Slack alerts summarizing key backlog insights. 💬 Step 6: Every Friday, uses GPT-4 to generate a summarized backlog health report. 🤖 Step 7: Delivers weekly summary emails to leads and project managers via Gmail. 📧 Key Benefits ✅ Eliminates manual backlog reviews and prioritization. ✅ Ensures consistent Jira hygiene and task visibility. ✅ Provides centralized backlog tracking via Google Sheets. ✅ Sends real-time alerts for overdue and unassigned tasks. ✅ Offers AI-driven insights for better sprint planning. Features Automated daily trigger (Mon–Fri, 7 AM) Jira issue fetching and filtering by priority and assignment Smart labeling for hygiene tracking Slack alerts for backlog anomalies Weekly GPT-4 generated summary reporting Google Sheets integration for historical logging Gmail integration for summary email delivery Requirements Jira API credentials with read/write issue permissions Google Sheets OAuth2 credentials for data logging Slack Bot token with chat:write permissions Gmail OAuth2 credentials for email delivery OpenAI or Azure OpenAI API key for GPT-4 summarization Target Audience Agile and Scrum teams maintaining large backlogs 🧩 Product managers ensuring backlog quality and consistency 📋 Engineering leads seeking proactive backlog hygiene 🛠️ Organizations needing visibility across project tasks 🏢 Remote teams using Slack for daily syncs 🌐 Step-by-Step Setup Instructions Connect Jira credentials and specify your project key(s). 🔑 Link your Google Sheet and replace YOUR_SHEET_ID for backlog tracking. 📊 Configure Slack and replace YOUR_CHANNEL_ID for alert delivery. 💬 Add Gmail credentials and define recipient emails for weekly reports. 📧 Add your GPT-4 API key (OpenAI or Azure) for AI summarization. 🤖 Adjust cron expression (0 7 * * 1-5) to match your local timezone. ⏰ Run manually once to validate all connections, then enable automation. ✅
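For readers who want to see the cleanup rules spelled out, here is a hedged TypeScript sketch of the filtering and labeling logic described in Steps 2-3. It assumes Jira Cloud REST field names (fields.assignee, fields.priority, fields.timeoriginalestimate, fields.duedate); adjust both the fields and the label names to your Jira configuration.

```typescript
// Sketch of the cleanup/labeling logic, assuming Jira Cloud REST field names.
// Label names follow the template ("Needs Estimation", "Unassigned", "Overdue").

type JiraIssue = {
  key: string;
  fields: {
    assignee: { displayName: string } | null;
    priority: { name: string } | null;
    timeoriginalestimate: number | null; // seconds, null when not estimated
    duedate: string | null;              // "YYYY-MM-DD" or null
  };
};

export function labelsToAdd(issue: JiraIssue, today = new Date()): string[] {
  const labels: string[] = [];
  if (issue.fields.timeoriginalestimate == null) labels.push("Needs Estimation");
  if (issue.fields.assignee == null) labels.push("Unassigned");
  if (issue.fields.duedate && new Date(issue.fields.duedate) < today) {
    labels.push("Overdue");
  }
  return labels; // empty array means the issue passes the hygiene check
}
```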
by Miftah Rahmat
Automate Water Bill Calculations with Telegram, Gemini AI, and Google Sheets

This workflow automates the calculation of monthly water bills. Residents send a photo of their water meter along with their name via Telegram. The workflow uses Gemini AI to extract the meter reading, calculates the usage difference compared to the previous month, and updates a Google Sheet with the billing details. Finally, the workflow sends a summary back via Telegram. Don’t hesitate to reach out if you have any questions or run into issues! 🙌

Requirements
- A Telegram bot token (created via BotFather).
- A Google account with access to Google Sheets.
- A Gemini API key.
- A pre-created Google Sheet with the required columns.

Google Sheet Setup
Create a new Google Sheet with the following columns: Nama, Volume Sebelumnya, Volume Saat Ini, Harga/m³, Jumlah Bayar, Beban, Total Bayar, Tanggal Input

Workflow Setup Instructions
1. Connect Google Sheets: Add your Google Sheets credentials in n8n and link the workflow to your sheet with the structure above.
2. Set Up the Telegram Bot: Create a Telegram bot via BotFather and copy your bot token into the Telegram Trigger node.
3. Configure Gemini AI: Obtain a Gemini API key from Google AI Studio and add it to your n8n credentials. The workflow will parse the meter reading from the uploaded image.

Example Calculation
- Previous Volume: 535 m³
- Current Volume: 545 m³
- Usage: 10 m³
- Price per m³: Rp3.000
- Fixed cost: Rp3.000
- Total Bill: Rp33.000

How It Works
1. A resident sends a photo of the water meter with their name as the caption.
2. The Telegram Trigger receives the message.
3. Gemini AI reads the meter number from the photo.
4. The workflow fetches the previous volume from Google Sheets.
5. Usage and total bill are calculated.
6. The data is stored back into Google Sheets.
7. The bot replies in Telegram with detailed bill info.

Customization
- Change Harga/m³ in the sheet to match your community’s water price.
- Update Beban if your community uses a different fixed fee.
- Edit the Telegram reply message node to adjust wording.

With this workflow, you can streamline water billing for residents, ensure accuracy, and save time on manual calculations.
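The example calculation follows a simple formula: usage is the difference between the current and previous readings, and the total adds the fixed fee. A short TypeScript sketch as a worked check (not part of the workflow itself):

```typescript
// Worked version of the example calculation above.

export function waterBill(
  previousVolume: number, // m³, from the Google Sheet ("Volume Sebelumnya")
  currentVolume: number,  // m³, read by Gemini from the photo
  pricePerM3: number,     // "Harga/m³"
  fixedFee: number        // "Beban"
): { usage: number; total: number } {
  const usage = currentVolume - previousVolume;
  return { usage, total: usage * pricePerM3 + fixedFee };
}

// waterBill(535, 545, 3000, 3000) -> { usage: 10, total: 33000 }  (Rp33.000)
```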
by Țugui Dragoș
How It Works
1. Story Generation: your idea is transformed into a narrative split into scenes using the DeepSeek LLM.
2. Visuals: each scene is illustrated with AI images via Replicate, then animated into cinematic video clips with RunwayML.
3. Voice & Music: narration is created using ElevenLabs (text-to-speech), while Replicate audio models generate background music.
4. Final Assembly: all assets are merged into a professional video using Creatomate.
5. Delivery: everything is orchestrated by n8n, triggered from Slack with /render, and the final video link is delivered back instantly.

Workflow in Action
1. Trigger from Slack: type your idea with /render in Slack and the workflow starts automatically.
2. Final Video Output: receive a polished cinematic video link in Slack.
3. Creatomate Template: ⚠️ Important: you must create your own template in Creatomate. This is a one-time setup; the template defines where the voiceover, music, and video clips will be placed. The more detailed and refined your template is, the better the final cinematic result.

Required APIs
To run this workflow, you need accounts and API keys from the following services:
- DeepSeek: story generation (LLM)
- Replicate: images & AI music generation
- RunwayML: image-to-video animations
- ElevenLabs: text-to-speech voiceovers
- Creatomate: video rendering and templates
- Dropbox: file storage and asset syncing
- Slack: workflow trigger and video delivery

Setup Steps
1. Import the JSON workflow into your n8n instance.
2. Add your API credentials for each service above.
3. Create a Creatomate template (only once) and define layers for visuals, voice, and music.
4. Trigger the workflow from Slack with /render Your Story Idea.
5. Receive your final cinematic video link directly in Slack.

Use Cases
- Automated YouTube Shorts / TikToks for faceless content creators.
- Scalable ad creatives and marketing videos for agencies.
- Educational explainers and onboarding videos generated from text.
- Rapid prototyping of cinematic ideas for developers & storytellers.

With this workflow, you’re not just using AI tools – you’re running a full AI-powered studio in n8n.
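As an illustration of how the scene list flows from story generation into the image, video, and voiceover steps, here is a hedged TypeScript sketch of one plausible scene structure plus a small validation guard. The field names are assumptions, not the exact output of the DeepSeek prompt shipped with the template.

```typescript
// Illustrative only: a plausible shape for the scene list produced by the
// story-generation step, plus a guard run before the downstream media steps.

type Scene = {
  index: number;
  narration: string;    // text sent to ElevenLabs for voiceover
  imagePrompt: string;  // prompt sent to Replicate for the still image
};

export function validateScenes(scenes: Scene[]): Scene[] {
  if (scenes.length === 0) {
    throw new Error("Story generation returned no scenes");
  }
  return scenes
    .filter((s) => s.narration.trim() !== "" && s.imagePrompt.trim() !== "")
    .map((s, i) => ({ ...s, index: i + 1 })); // renumber after filtering
}
```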
by Rully Saputra
Who’s it for This workflow is perfect for IT departments, helpdesk teams, or internal service units that manage incoming support requests through Jotform. It automates ticket handling, classification, and response—saving time and ensuring consistent communication. How it works When a new IT service request is submitted through Jotform, this workflow automatically triggers in n8n. The submitted details (name, department, category, comments, etc.) are structured and analyzed using Google Gemini AI to summarize and classify the issue’s priority level (P0–P2). P0 (High): Urgent issues that send an immediate Telegram alert. P1 (Medium) / P2 (Low): Logged in Google Sheets for tracking and reporting. After classification, the workflow sends a confirmation email to the requester via Gmail, providing a summary of their submission and current status. How to set up Connect your Jotform account to the Jotform Trigger node. Add your Google Sheets, Gmail, and (optionally) Telegram credentials. Map your Jotform fields in the “Set” node (Full Name, Department, Category, etc.). Test by submitting a form response. Requirements Jotform account and published IT request form Google Sheets account Gmail account (for replies) Optional: Telegram bot for real-time alerts n8n account (cloud or self-hosted) How to customize the workflow Adjust AI classification logic in the Priority Classifier node. Modify email templates for tone or format. Add filters or additional routing for different departments. Extend to integrate with your internal ticketing or Slack systems.
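For clarity, here is a small TypeScript sketch of the routing decision made after Gemini classifies a ticket. The priority values follow the template (P0 high, P1 medium, P2 low); the input and output shapes are assumptions for illustration, not the exact node configuration.

```typescript
// Sketch of the routing rule: P0 raises an immediate Telegram alert,
// P1/P2 are logged to Google Sheets, and every requester gets a Gmail
// confirmation, as described in the template.

type ClassifiedTicket = {
  fullName: string;
  department: string;
  category: string;
  summary: string;             // AI-generated summary of the issue
  priority: "P0" | "P1" | "P2";
};

export function route(ticket: ClassifiedTicket): {
  telegramAlert: boolean;
  logToSheet: boolean;
  confirmationEmail: boolean;
} {
  return {
    telegramAlert: ticket.priority === "P0",
    logToSheet: ticket.priority !== "P0",
    confirmationEmail: true,
  };
}
```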
by go-surfe
🚀 What this template does
Automatically finds and enriches key contacts in a deal’s buying group by combining the company domain from the HubSpot deal with the buying group criteria you define (departments, seniorities, countries, job titles). It then pushes these contacts into HubSpot and emails your team a clean summary with direct HubSpot links—so no decision-maker falls through the cracks.

Before starting, make sure you have:
- Buying Group Criteria Excel, which contains two sheets:
  - Buying group reference values (reference list)
  - Your Buying Group Criterias (where you define your filters)
You’ll import the Excel file into Google Sheets during setup.

❓ What Problem Does This Solve?
When a new opportunity/deal is created, sales teams often miss adjacent decision-makers (e.g., VP Sales, Head of Marketing). This template searches for those people, enriches their contact data, adds/updates them in HubSpot, and notifies your team with a one-glance table.

🧰 Prerequisites
To use this template, you’ll need:
- A self-hosted or cloud instance of n8n
- A Surfe API Key (Bearer token for People Search & Bulk Enrich)
- A Google Sheets account (OAuth2 or service account) with access to your criteria sheet
- A HubSpot developer account (for the HubSpot Deal Trigger)
- A regular HubSpot account (where your deals, contacts, and companies live)
- A Gmail account to send the enrichment summary email
- The workflow JSON file (included with this tutorial)
- The Buying Group Criteria Excel (included with this tutorial)

📌 Your input (Google Sheets)
This workflow uses a Google Sheet with two tabs:
- Buying group reference values: a read-only reference list of all available options for the departments and seniorities columns. You’ll use this list to choose your search filters.
- Your Buying Group Criterias: the sheet where you define the actual filters used in the workflow.

⚠️ Before you start: import the provided Excel file into your Google Sheets account so both tabs appear exactly as in the template.

How to fill in the “Your Buying Group Criterias” tab:
- departments (Column A): select one or more values from the reference tab. Only rows containing a value will be used in the search.
- seniorities (Column B): select one or more values from the reference tab. Only rows containing a value will be used in the search.
- countries (Column C): enter any ISO Alpha-2 country codes (e.g., fr, gb, de). This is a free-text filter.
- jobTitles (Column D): enter any job title keywords you want to search for (e.g., CTO, Head of Marketing). This is also a free-text filter.

The workflow will read the filled cells from each column, clean duplicates, and pass them to the Surfe People Search API, as sketched below.
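The sketch below shows, in TypeScript, how the filled-in criteria columns can become a search payload: collect non-empty cells per column, de-duplicate them, and combine them with the company domain from the deal. The payload field names are placeholders only; check Surfe's People Search documentation for the exact request schema.

```typescript
// Illustrative payload builder; field names are placeholders, not Surfe's schema.

type CriteriaRow = {
  departments?: string;
  seniorities?: string;
  countries?: string;   // ISO Alpha-2 codes, free text
  jobTitles?: string;   // free-text keywords
};

function column(rows: CriteriaRow[], key: keyof CriteriaRow): string[] {
  return [...new Set(rows.map((r) => (r[key] ?? "").trim()).filter(Boolean))];
}

export function buildSearchPayload(rows: CriteriaRow[], companyDomain: string) {
  return {
    companyDomains: [companyDomain],          // placeholder field name
    departments: column(rows, "departments"),
    seniorities: column(rows, "seniorities"),
    countries: column(rows, "countries"),
    jobTitles: column(rows, "jobTitles"),
    limit: 200,                               // per-run cap mentioned in the template
  };
}
```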
⚙️ Setup Instructions

4.1 🔐 Create Your Credentials in n8n

4.1.1 📊 Google Sheets OAuth2 API
- Go to n8n → Credentials
- Create new credentials of type Google Sheets OAuth2 API
- A pop-up will open where you can log in to the Google account that will be used to read the Google Sheet
- Once done, the new credential should appear in your n8n credentials list

4.1.2 📧 Gmail OAuth2 API
- Go to n8n → Credentials
- Create new credentials of type Gmail OAuth2 API
- A pop-up window will appear where you can log in with the Google account linked to Gmail
- Make sure you grant email send permissions when prompted

4.1.3 🚀 Surfe API
- In your Surfe dashboard → Use Surfe API → copy your API key
- Go to n8n → Credentials → Create Credential
- Choose Credential Type: Bearer Auth
- Name it something like SURFE API Key
- Paste your API key into the Bearer Token field
- Save

4.1.4 🎯 HubSpot OAuth2 API
- Go to n8n → Credentials → Create Credential → HubSpot OAuth2 API
- Make sure to select your regular HubSpot account, where your companies, deals, and contacts are, and not the hubspot-developers-xxx.com account
- Done ✅

4.1.5 🔓 HubSpot Private App Token
- Go to HubSpot → Settings → Integrations → Private Apps
- Create an app with the scopes: crm.objects.contacts.read, crm.objects.contacts.write, crm.schemas.contacts.read
- Save the App token
- Go to n8n → Credentials → Create Credential → HubSpot App Token
- Paste your App Token

4.1.6 🎯 HubSpot Developer API
To use the HubSpot Trigger node, you need to set up the HubSpot Developer API credential. This requires a HubSpot developer account and:
- A Client ID: generated once you create a public app.
- A Client Secret: generated once you create a public app.
- A Developer API Key: generated from your Developer Apps dashboard.
- An App ID: generated once you create a public app.

To create the public app and set up the credential:
1. Log into your HubSpot app developer account.
2. Select Apps from the main navigation bar.
3. Select Get HubSpot API key. You may need to select the option to Show key.
4. Copy the key and enter it in n8n as the Developer API Key.
5. Still on the HubSpot Apps page, select Create app.
6. On the App Info tab, add an App name, Description, Logo, and any support contact info you want to provide. Anyone encountering the app would see these.
7. Open the Auth tab.
8. Copy the App ID and enter it in n8n.
9. Copy the Client ID and enter it in n8n.
10. Copy the Client Secret and enter it in n8n.
11. In the Scopes section, select Add new scope.
12. Add all the scopes listed in Required scopes for HubSpot Trigger node to your app.
13. Select Update.
14. Copy the n8n OAuth Redirect URL and enter it as the Redirect URL in your HubSpot app.
15. Select Create app to finish creating the HubSpot app.

Refer to the HubSpot Public Apps documentation for more detailed instructions. Once complete, the credential should appear in n8n.
✅ You are now all set with the credentials.

4.2 📥 Import and Configure the n8n Workflow
- Import the provided JSON workflow into n8n
- Create a new blank workflow, click the … on the top left, then Import from File

4.2.1 🔗 Link Nodes to Your Credentials
In the workflow, link your newly created credentials to each node in this list:
- Google Sheets → Credentials to connect with → Google Sheets account
- Gmail node → Credentials to connect with → Gmail account
- HubSpot: Create or Update → Credentials to connect with → HubSpot App Token account
- HubSpot Get Company → Credentials to connect with → HubSpot App Token account
- HubSpot get deal → Credentials to connect with → HubSpot App Token account
- HubSpot Trigger → Credentials to connect with → HubSpot Developer account
- HTTP node GET deal associated companies from HUBSPOT → Credential Type → HubSpot OAuth2 API
- Surfe HTTP nodes: Authentication → Generic Credential Type; Generic Auth Type → Bearer Auth; Bearer Auth → select the credentials you created before

4.2.2 🔧 Additional Setup for the Google Sheets node READ CRITERIAS
- Paste the URL of your Google Sheet in Document → By URL
- Select the sheet Your Buying Group Criterias

🔄 How This n8n Workflow Works
1. A new deal is created in HubSpot, which triggers the workflow.
2. The workflow retrieves the company domain linked to that deal.
3. It reads the buying group criteria from your Google Sheet (departments, seniorities, countries, job titles).
4. These criteria are combined with the company domain to create a search payload for Surfe’s People Search API (limited to 200 people per run).
5. Matching contacts are then sent to Surfe’s Bulk Enrichment API to retrieve emails, phone numbers, and other details.
6. n8n polls Surfe until the enrichment job is complete.
7. Enriched contact data is extracted and filtered so that only contacts with at least one valid email or phone number remain (a small sketch of this filter appears after this section).
8. These contacts are created or updated in HubSpot.
9. Finally, a Gmail summary email is sent to your team with a clean table of the new or updated contacts and direct links to view them in HubSpot.

🧩 Use Cases
- Net-new deal created → instantly surface the rest of the buying group and enrich contacts.

🛠 Customization Ideas
- 🔁 Add retry logic for failed Surfe enrichment jobs
- 📤 Log enriched contacts into a Google Sheet or Airtable for auditing
- 📊 Extend the flow to generate a basic summary report of enriched vs rejected contacts
- ⏳ Trigger the enrichment not only on deal creation but also at a specific deal stage change
- 📧 Send the summary email to multiple recipients or a team mailing list

✅ Summary
This template automates buying-group discovery and enrichment off a new HubSpot deal, writes enriched contacts back to HubSpot, and emails a neat table to your team—so reps focus on outreach, not admin. Import it, connect credentials, point it at your criteria sheet, and let Surfe do the rest.
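Here is the filtering sketch referenced in step 7 above: a minimal TypeScript version of the rule that keeps only contacts that came back from enrichment with at least one usable email or phone number. Field names are assumptions for illustration.

```typescript
// Sketch of the post-enrichment filter; field names are illustrative.

type EnrichedContact = {
  firstName: string;
  lastName: string;
  linkedinUrl?: string;
  emails: string[];
  phones: string[];
};

export function keepReachable(contacts: EnrichedContact[]): EnrichedContact[] {
  return contacts.filter((c) => {
    const hasEmail = c.emails.some((e) => e.includes("@"));
    const hasPhone = c.phones.some((p) => p.replace(/\D/g, "").length >= 7);
    return hasEmail || hasPhone; // only these go on to HubSpot and the summary email
  });
}
```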
by Mirai
This n8n template automates targeted lead discovery, AI-driven data structuring, and personalized cold-email sending at controlled intervals. It’s ideal for sales teams, founders, and agencies that want to scale outreach without losing personalization. Good to know Can run on an interval (e.g., every 10 minutes) to fetch and process new leads. Requires API keys for OpenAI (content + parsing) and Apify (lead discovery). Emails are sent one-by-one with delays (the Wait node) to reduce spam risk. Lead data is written to Google Sheets—we recommend separate sheets for leads with and without emails. Works with Gmail, Outlook, or your own SMTP—just plug in your credentials. How it works Form Trigger (START) A form collects: Job Title, Company Size, Keywords, Location. Apollo URL Generator (GPT) The model turns the form fields into a precise Apollo search URL. Run Apify (Actor) Apify fetches contacts/companies that match your preferences for downstream processing. Limit Caps how many records are prepared per run (e.g., max 5). Parse Lead Data (GPT) Extracts key fields (full name, email, title, LinkedIn, company, company links). Synthesizes a short 2–3 sentence sales-ready summary for each lead. Sorting (If) Splits leads into with email vs. without email. With email → main sheet + email pathway Without email → a separate sheet for later enrichment Email Magic (GPT) Uses the parsed data to personalize your fixed email template for each lead (keeps structure/intent, swaps in the right details). Sending Emails (Loop + Wait + Sender) Loop Over sends messages individually. Wait inserts a pause between sends (fully configurable). Delivery via Gmail or SMTP (custom domain / Outlook). Confirmation After the loop finishes, a Gmail node sends a “campaign complete” confirmation. How to use Enable the workflow and open the start form. Enter preferences: job title, company size, keywords, location. Add credentials: OpenAI (for parsing + email generation) Apify (Bearer token in Run Apify) Google (Sheets + optionally Gmail) SMTP/Outlook (if not using Gmail) Set limits (the Limit node) and send interval (the Wait node). Choose sheets for leads with/without email. Run—the workflow will fetch leads, prepare emails, and send them with spacing. Requirements OpenAI API key Apify API token (access to the chosen Actor) Google Sheets for storage Gmail or SMTP/Outlook credentials for sending An operational n8n instance Customising this workflow Email template: Edit the text in “Creating a email” while preserving placeholders. Segmentation: Add more conditions (role, industry, country) and route to different templates/sheets. Follow-ups: Add a second loop that reads statuses and sends timed reminders. Data enrichment: Insert additional APIs before “Parse Lead Data.” Anti-spam: Increase Wait duration, rotate senders, vary subject lines. Reporting: Add a “send status” sheet and an error log. Security & compliance tips Store API keys in n8n Credentials, not plain-text nodes. Respect GDPR/opt-out—track source and first-contact date in your sheet. Start with a small batch, validate deliverability, then scale up. In short Automated lead capture → AI cleaning + summary → personalized emails → spaced sending → completion notice. Scalable, customizable, and ready to plug into your preferred sender and template.
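To make the "Email Magic" step more tangible, here is an illustrative TypeScript placeholder-swap showing what it effectively does with the parsed lead data. The template text and placeholder names are examples only, not the prompt or template shipped with the workflow.

```typescript
// Illustrative personalization: swap parsed lead details into a fixed template
// while keeping its structure and intent, as described above.

type ParsedLead = {
  fullName: string;
  title: string;
  company: string;
  summary: string; // 2-3 sentence sales-ready summary from the parsing step
};

const TEMPLATE =
  "Hi {{firstName}},\n\n" +
  "I noticed your work as {{title}} at {{company}}. {{summary}}\n\n" +
  "Would you be open to a quick call next week?";

export function personalize(lead: ParsedLead): string {
  const firstName = lead.fullName.split(" ")[0] ?? lead.fullName;
  return TEMPLATE
    .replace("{{firstName}}", firstName)
    .replace("{{title}}", lead.title)
    .replace("{{company}}", lead.company)
    .replace("{{summary}}", lead.summary);
}
```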
by Oneclick AI Squad
This automated n8n workflow enables an AI-powered movie recommendation system on WhatsApp. Users send messages like "I want to watch a horror movie" or "Where can I watch the Jumanji movie?" The workflow uses AI to interpret the request, searches relevant APIs (e.g., TMDb, JustWatch), and replies with movie recommendations or streaming platform availability via WhatsApp.

Fundamental Aspects
- **WhatsApp Webhook Trigger**: Initiates the workflow when a WhatsApp message is received.
- **Analyze WhatsApp Message**: Uses AI (e.g., Ollama Model) to interpret the user's intent and extract the request type.
- **Check Request Type**: Determines if the request is for a movie genre or a specific movie title.
- **Check Where Request**: Identifies if the request includes a "where to watch" query.
- **Extract Movie Title**: Extracts the movie title from the message if specified.
- **Extract Genre**: Identifies the movie genre from the message if specified.
- **Search Specific Movie Title**: Queries an API (e.g., TMDb) for details about a specific movie.
- **Search Movies by Genre**: Queries an API (e.g., TMDb) for movies matching the genre.
- **Get Streaming Availability**: Queries an API (e.g., JustWatch) for streaming platforms.
- **Format Streaming Response**: Prepares the response with streaming platform details.
- **Format Genre Recommendations**: Prepares the response with genre-based movie recommendations.
- **Prepare WhatsApp Message**: Formats the final response for WhatsApp.
- **Send WhatsApp Response**: Sends the recommendation or streaming info back to the user via WhatsApp.

Setup Instructions
1. Import the Workflow into n8n: Download the workflow JSON and import it via the n8n interface.
2. Configure API Credentials:
   - Set up WhatsApp Business API credentials with a valid phone number and token.
   - Configure a TMDb API key (e.g., https://api.themoviedb.org).
   - Configure a JustWatch API key (e.g., https://api.watchmode.com).
   - Set up AI model credentials (e.g., Ollama Model).
3. Run the Workflow: Activate the webhook trigger and test with a WhatsApp message.
4. Verify Responses: Check WhatsApp for accurate movie recommendations or streaming info.
5. Adjust Parameters: Fine-tune API endpoints or the AI model as needed.

Features
- **AI Interpretation**: Uses AI to analyze user intents (genre or movie title).
- **API Integration**: Searches TMDb for movie details and JustWatch for streaming availability.
- **Real-Time Responses**: Sends instant replies via WhatsApp.
- **Custom Recommendations**: Provides genre-based or specific movie recommendations.

Technical Dependencies
- **WhatsApp Business API**: For receiving and sending messages.
- **TMDb API**: For movie details and genre searches.
- **JustWatch API**: For streaming availability.
- **Ollama Model**: For AI-based message analysis.
- **n8n**: For workflow automation and integration.

Customization Possibilities
- **Add More APIs**: Integrate additional movie databases (e.g., IMDb).
- **Enhance AI**: Train the Ollama Model for better intent recognition.
- **Support More Languages**: Add multilingual support for WhatsApp responses.
- **Add Email Alerts**: Include email notifications for admin monitoring.
- **Customize Responses**: Adjust the format of recommendations or streaming info.
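As a hedged example of the genre search the workflow performs, here is a TypeScript sketch using TMDb's discover endpoint with an API key. Verify the endpoint, parameters, and genre IDs against TMDb's current documentation (27 is commonly the Horror genre; the /genre/movie/list endpoint returns the full mapping), and treat the environment variable name as a placeholder.

```typescript
// Sketch of a genre-based TMDb lookup; confirm parameters against TMDb docs.

const TMDB_API_KEY = process.env.TMDB_API_KEY ?? ""; // placeholder credential

export async function searchByGenre(genreId: number): Promise<
  { title: string; overview: string; release_date: string }[]
> {
  const url =
    `https://api.themoviedb.org/3/discover/movie` +
    `?with_genres=${genreId}&sort_by=popularity.desc&api_key=${TMDB_API_KEY}`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`TMDb request failed: ${res.status}`);
  const data = (await res.json()) as { results: any[] };
  return data.results.slice(0, 5).map((m) => ({
    title: m.title,
    overview: m.overview,
    release_date: m.release_date,
  }));
}
```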
by Automate With Marc
Automatic Personalized Sales Follow-Up with GPT-5, Pinecone, and Tavily Research Description Never let a lead go cold. This workflow automatically sends personalized follow-up emails to every inbound inquiry. It combines GPT-5, Pinecone Vector DB, and Tavily Research to craft responses that align with your brand’s best practices, tone, and the latest product updates. With embedded research tools, every response is both timely and relevant—helping your sales team convert more leads without manual effort. 👉 Watch step-by-step builds of workflows like these on: www.youtube.com/@automatewithmarc How It Works Form Trigger – Captures inbound lead details (name, company, email, and message). AI Sales Agent (GPT-5) – Researches the lead’s business and problem statement, referencing Pinecone for your brand guidelines and product updates. Uses Tavily research for real-time enrichment. Structured Output Parser – Ensures the subject line and email body are formatted cleanly in JSON. Send Follow-Up Email (Gmail Node) – Delivers a polished, ready-to-go follow-up directly to the lead. Simple Memory – Maintains context across follow-ups for more natural conversations. Why Sales Teams Will Love It ⏱ Faster responses — every lead gets an immediate, high-quality reply. 📝 On-brand every time — Pinecone ensures tone matches your playbook. 🌍 Research-driven — Tavily enriches responses with fresh, relevant context. 📈 Higher conversions — timely, personalized outreach drives more meetings. 🤖 Hands-off automation — sales reps focus on closing, not chasing. Setup Instructions Form Trigger Configure your inbound form to capture lead details (name, email, company, message). Connect it to this workflow. Pinecone Setup Create a Pinecone index and embed your brand guidelines, sales playbook, and product updates. Update the Pinecone Vector Store node with your index name. Tavily Setup Add your Tavily API key to the Tavily Research node. OpenAI Setup Add your OpenAI API key to the GPT-5 Chat Model node. Adjust the system prompt inside the AI Agent to reflect your company’s style and tone. Gmail Node Connect your Gmail account to the Send Follow-Up Email node. Update sender details if you want the emails to come from a shared inbox or a rep’s personal account. Customization Tone of Voice – Modify the system prompt inside the AI Agent to be more professional, casual, or industry-specific. Scheduling Links – Replace the default Calendly link with your own booking tool. Form Fields – Add or remove fields depending on the information you collect (e.g., budget, role, region). Requirements Gmail account (for sending follow-up emails) OpenAI API key (GPT-5) Pinecone account (for storing/retrieving guidelines + updates) Tavily API key (for online research enrichment)
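For reference, here is a minimal TypeScript sketch of the guarantee the Structured Output Parser provides: the agent must return a JSON object with a subject line and an email body before the Gmail node sends anything. The shape comes from the description above; the validation details are illustrative.

```typescript
// Sketch of the structured-output check: { subject, body } must both be present.

type FollowUpEmail = { subject: string; body: string };

export function parseFollowUp(raw: string): FollowUpEmail {
  const parsed = JSON.parse(raw) as Partial<FollowUpEmail>;
  if (!parsed.subject?.trim() || !parsed.body?.trim()) {
    throw new Error("Agent output missing subject or body - retry generation");
  }
  return { subject: parsed.subject.trim(), body: parsed.body.trim() };
}
```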
by Trung Tran
Chat-Based AWS IAM Policy Generator with OpenAI Agent

> Chat-driven workflow that lets IT and DevOps teams generate custom AWS IAM policies via AI, automatically apply them to AWS, and send an email notification with policy details.

👤 Who’s it for
This workflow is designed for:
- **Cloud Engineers / DevOps** who need to quickly generate and apply **custom IAM policies** in AWS.
- **IT Support / Security teams** who want to create IAM policies through a **chat-based interface** without manually writing JSON.
- Teams that want automatic notifications (via email) once new policies are created.

⚙️ How it works / What it does
1. Trigger → Workflow starts when a chat message is received.
2. IAM Policy Creator Agent → Uses OpenAI to interpret user requirements (e.g., service, actions, region) and generate a valid IAM policy JSON following AWS best practices.
3. IAM Policy HTTP Request → Sends the generated policy to the AWS IAM CreatePolicy API.
4. Email Notification → Once AWS responds with a CreatePolicyResponse, an email is sent with policy details (name, ARN, ID, timestamps, etc.) using n8n mapping.

Result: the user can chat with the AI agent, create a policy, and receive an email confirmation with full details.

🛠 How to set up
1. Chat Trigger Node: configure the When chat message received node to connect your preferred chat channel (Slack, MS Teams, Telegram, etc.).
2. IAM Policy Creator Agent:
   - Add OpenAI Chat Model as the LLM.
   - Use a system prompt that enforces AWS IAM JSON best practices (least privilege, correct JSON structure).
   - Connect Memory (Simple Memory) and Structured Output Parser to ensure consistent JSON output.
3. IAM Policy HTTP Request:
   - Set method: POST
   - URL: https://iam.amazonaws.com/
   - Add authentication using AWS Signature v4 (Access Key + Secret Key).
   - Body:
     - Action=CreatePolicy
     - PolicyName={{ $json.CreatePolicyResponse.CreatePolicyResult.Policy.PolicyName }}
     - PolicyDocument={{ $json.policyDocument }}
     - Version=2010-05-08
4. Email for tracking

📋 Requirements
- n8n instance (self-hosted or cloud).
- AWS IAM user/role with permission to iam:CreatePolicy.
- AWS Access Key + Secret Key (for SigV4 signing in the HTTP request).
- OpenAI API key (for the Chat Model).
- Email server credentials (SMTP or provider integration).

🎨 How to customize the workflow
- **Restrict services/actions** → Adjust the IAM Policy Creator Agent system prompt to limit what services/policies can be generated.
- **Notification channels** → Replace the email node with Slack, MS Teams, or PagerDuty to alert other teams.
- **Tagging policies** → Modify the HTTP request to include Tags when creating policies in AWS.
- **Human-readable timestamps** → Add a Function or Set node to convert CreateDate and UpdateDate from Unix epoch to ISO datetime before sending emails.
- **Approval step** → Insert a manual approval node before sending the policy to AWS for compliance workflows.
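The CreatePolicy request body described in the setup can be sketched in TypeScript as below. The form fields (Action, PolicyName, PolicyDocument, Version=2010-05-08) come directly from the template; the example policy document is a hypothetical least-privilege S3 read policy, and SigV4 signing is handled by the n8n AWS credential rather than in this code.

```typescript
// Sketch of the CreatePolicy form body; the policy document is hypothetical.

const policyDocument = {
  Version: "2012-10-17", // IAM policy language version
  Statement: [
    {
      Effect: "Allow",
      Action: ["s3:GetObject"],
      Resource: ["arn:aws:s3:::example-bucket/*"], // hypothetical bucket
    },
  ],
};

export function createPolicyBody(policyName: string): string {
  const params = new URLSearchParams({
    Action: "CreatePolicy",
    PolicyName: policyName,
    PolicyDocument: JSON.stringify(policyDocument), // URL-encoded by URLSearchParams
    Version: "2010-05-08",                          // IAM API version
  });
  return params.toString(); // POST this to https://iam.amazonaws.com/
}
```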
by Krishna Sharma
📄 AI-Powered Document Summarizer & Notifier This workflow monitors a Google Drive folder for new files (Google Docs or PDFs), extracts text, summarizes content with OpenAI, and sends results to Slack or Email. 🔧 How It Works Monitors a Google Drive folder for new files. Detects file type → Google Doc vs PDF. Extracts text (via Google Docs API or PDF extractor). Summarizes & analyzes content using OpenAI. Sends results to Slack and/or Email. 👤 Who Is This For? Business teams → Quick digests of reports, proposals, contracts. Educators / researchers → Summaries of long study materials. Founders / managers → Daily summaries without opening every file. Operations teams → Compliance and documentation tracking. 💡 Use Case / Problem Solved Reading long documents is time-consuming. Sharing key points across teams requires manual effort. Important context (sentiment, action items) is often missed. 👉 This workflow solves it by auto-summarizing documents and notifying teams instantly. ⚙️ What This Workflow Does Monitors Google Drive for new Google Docs or PDFs. Extracts text automatically. Uses OpenAI to generate: Title Summary Key Points Suggested Action Items Language detection Sentiment (positive, neutral, negative) Pushes output to Slack channel and/or Email inbox. 🛠️ Setup Instructions Prerequisites Google Drive (OAuth2) Google Docs (OAuth2) OpenAI API Key Slack (OAuth2) or Gmail (OAuth2) Steps to Configure Connect Google Drive Choose the folder you want to monitor. Set up file type routing Use an IF node to split Docs vs PDFs. For Google Docs Use Google Docs Get → extract text. For PDFs Use Google Drive Download → Extract PDF. Send text to OpenAI Connect to your OpenAI model. Customize the system prompt to generate title, summary, sentiment, etc. Notify Send output to Slack channel or Gmail. Save & activate your workflow. 📌 Notes Adjust OpenAI prompt to suit your context. For large PDFs, consider splitting into smaller chunks.
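Following the note about large PDFs, here is a minimal TypeScript sketch of the chunking approach: split extracted text into overlapping character windows before sending each chunk to OpenAI, then summarize the chunk summaries. The sizes are arbitrary starting points, not values from the workflow.

```typescript
// Simple overlapping chunker for long extracted text (sizes are assumptions).

export function chunkText(
  text: string,
  chunkSize = 6000,   // characters per chunk; tune to your model's context
  overlap = 300       // overlap so sentences aren't cut off between chunks
): string[] {
  const chunks: string[] = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + chunkSize));
    start += chunkSize - overlap;
  }
  return chunks;
}
```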
by Rakin Jakaria
Use cases are many: manage your Gmail inbox, schedule calendar events, and handle contact details — all from one central AI-powered assistant. Perfect for freelancers managing clients, agency owners who need streamlined communication, or busy professionals who want a personal AI secretary handling their email and calendar.

Good to know
- At time of writing, each Gemini request is billed per token. See Gemini Pricing for the latest info.
- The workflow requires Gmail, Calendar, Sheets, and Telegram integrations. Ensure you’ve set up OAuth2 credentials correctly before running.

How it works
- **Triggers**: The workflow listens for new Gmail messages or Telegram commands.
- **Smart AI Processing**: Incoming emails are summarized, classified (Client, Sponsorship, or Not Business), and labeled automatically.
- **Auto-Replies**: Depending on classification, the assistant sends pre-written replies (e.g., client acknowledgment, sponsorship rates, or polite rejection).
- **Calendar Management**: Through natural language requests in Telegram, you can schedule, update, or delete calendar events with conflict-checking in place.
- **Contact Handling**: If you send an email to someone not yet in your database, the agent will prompt you for their email, add it to Google Sheets, and reuse it for future tasks.
- **Memory**: The AI maintains conversation context, so repeated tasks feel seamless and natural.

How to use
Send commands via Telegram like:
- “Schedule a meeting with Sarah on Monday at 3 PM”
- “Send an email to David about the proposal”
Watch as the assistant checks your calendar, sends emails, and keeps your contacts updated — all automatically.

Requirements
- Gmail account (with labels created for Client, Sponsorship Request, and Not Business)
- Google Calendar for scheduling
- Google Sheets for contact management
- Google Gemini API key
- Telegram bot for live interaction

Customising this workflow
You can expand it to:
- Handle Slack or WhatsApp messages in addition to Telegram.
- Add more classification categories (e.g., Invoices, Personal, Leads).
- Extend auto-replies with dynamic templates stored in Google Sheets.
- Log all interactions to Notion or Airtable for a CRM-style history of communications.

👉 Rakin Jakaria
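To illustrate the classification-to-action mapping described under "How it works", here is a small TypeScript sketch. The three categories match the Gmail labels the template asks you to create; the reply texts are placeholders for the pre-written responses, not the wording shipped with the workflow.

```typescript
// Sketch of the label + auto-reply mapping; reply texts are placeholders.

type Category = "Client" | "Sponsorship Request" | "Not Business";

const REPLIES: Record<Category, string> = {
  "Client":
    "Thanks for reaching out - I've received your message and will reply with details shortly.",
  "Sponsorship Request":
    "Thanks for your interest! Here are our current sponsorship rates and formats: ...",
  "Not Business":
    "Thank you for your email. This inbox is reserved for business enquiries.",
};

export function actionFor(category: Category): { label: Category; reply: string } {
  return { label: category, reply: REPLIES[category] };
}
```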