by Anshul Chauhan
Automate Your Life: The Ultimate AI Assistant in Telegram (Powered by Google Gemini)

Transform your Telegram messenger into a powerful, multi-modal personal or team assistant. This n8n workflow creates an intelligent agent that can understand text, voice, images, and documents, and take action by connecting to your favorite tools like Google Calendar, Gmail, Todoist, and more. At its core, a powerful Manager Agent, driven by Google Gemini, interprets your requests, orchestrates a team of specialized sub-agents, and delivers a coherent, final response, all while maintaining a persistent memory of your conversations.

Key Features
- 🧠 Intelligent Automation: Uses Google Gemini as a central "Manager Agent" to understand complex requests and delegate tasks to the appropriate tool.
- 🗣️ Multi-Modal Input: Interact naturally by sending text, voice notes, photos, or documents directly into your Telegram chat.
- 🔌 Integrated Toolset: Comes pre-configured with agents to manage your memory, tasks, emails, calendar, research, and project sheets.
- 🗂️ Persistent Memory: Leverages Airtable as a knowledge base, allowing the assistant to save and recall personal details, company information, or past conversations for context-rich interactions.
- ⚙️ Smart Routing: Automatically detects the type of message you send and routes it through the correct processing pipeline (e.g., voice is transcribed, images are analyzed).
- 🔄 Conversational Context: Utilizes a window buffer to maintain short-term memory, ensuring follow-up questions and commands are understood within the current conversation.

How It Works
1. The Telegram Trigger node acts as the entry point, receiving all incoming messages (text, voice, photo, document).
2. A Switch node intelligently routes the message based on its type (see the routing sketch at the end of this description):
   - **Voice**: The audio file is downloaded and transcribed into text using a voice-to-text service.
   - **Photo**: The image is downloaded, converted to a base64 string, and prepared for visual analysis.
   - **Document**: The file is routed to a document handler that extracts its text content for processing.
   - **Text**: The message is used as-is.
3. A Merge node gathers the processed input into a unified prompt.
4. The Manager Agent receives this prompt. It analyzes the user's intent and orchestrates one or more specialized agents/tools:
   - memory_base (Airtable): For saving and retrieving information from your long-term knowledge base.
   - todo_and_task_manager (Todoist): To create, assign, or check tasks.
   - email_agent (Gmail): To compose, search, or send emails.
   - calendar_agent (Google Calendar): To schedule events or check your agenda.
   - research_agent (Wikipedia/Web Search): To look up information.
   - project_management (Google Sheets): To provide updates on project trackers.
5. After executing the required tasks, the Manager Agent formulates a final response and sends it back to you via the Telegram node.

Setup Instructions
Follow these steps to get your AI assistant up and running.
1. Telegram Bot: Create a new bot using the BotFather in Telegram to get your Bot Token. In the n8n workflow, configure the Telegram Trigger node's webhook. Add your Bot Token to the credentials in all Telegram nodes. For proactive messages, replace the chatId placeholders with your personal Telegram Chat ID.
2. Google Gemini AI: In the Google Gemini nodes, add your credentials by providing your Google Gemini API key.
3. Airtable Knowledge Base: Set up an Airtable base to act as your assistant's long-term memory. In the memory_base nodes (Airtable nodes), configure the credentials and provide the Base ID and Table ID.
4. Google Workspace APIs: Connect your Google account credentials for Gmail, Google Calendar, and Google Sheets. In the relevant nodes, specify the Document/Sheet IDs you want the assistant to manage.
5. Connect Other Tools: Add your credentials for Todoist and any other integrated tool APIs.
6. Configure Conversational Memory: This workflow is designed for multi-user support. Verify that the Session Key in the "Window Buffer Memory" nodes is correctly set to a unique user identifier from Telegram (e.g., {{ $json.chat.id }}). This ensures conversations from different users are kept separate.
7. Review Schedule Triggers: Check any nodes designed to run on a schedule (e.g., "At a regular time"). Adjust their cron expressions, times, and timezone to fit your needs (e.g., for daily summaries).
8. Test the Workflow: Activate the workflow. Send a text message to your bot (e.g., "Hello!").

Estimated Setup Time
- **30–60 minutes:** If you already have your API keys, account credentials, and service IDs (like Sheet IDs) ready.
- **2–3 hours:** For a complete, first-time setup, which includes creating API keys, setting up new spreadsheets or Airtable bases, and configuring detailed permissions.
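For reference, here is a minimal sketch of the Smart Routing step: a hypothetical n8n Code node that tags each incoming Telegram update with a messageType before the Switch node. The field names (message.voice, message.photo, message.document, message.text) follow the Telegram Bot API payload, but your trigger output may be shaped differently, so treat this as an illustration rather than the exact logic shipped with the template.

```javascript
// Hypothetical n8n Code node: tag each incoming Telegram update with a messageType
// so a downstream Switch node can route it (voice / photo / document / text).
// Field names assume the standard Telegram Bot API "message" object.
const results = [];

for (const item of $input.all()) {
  const msg = item.json.message ?? {};
  let messageType = 'text';

  if (msg.voice) {
    messageType = 'voice';          // audio note -> transcription branch
  } else if (msg.photo) {
    messageType = 'photo';          // image -> base64 + visual analysis branch
  } else if (msg.document) {
    messageType = 'document';       // file -> text extraction branch
  }

  results.push({
    json: {
      ...item.json,
      messageType,
      chatId: msg.chat?.id,         // also usable as the session key for the window buffer memory
    },
  });
}

return results;
```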
by Mariela Slavenova
This template enriches a lead list by analyzing each contact's LinkedIn activity and auto-generating a single personalized opening line for cold outreach. Drop a spreadsheet into a Google Drive folder → the workflow parses rows, fetches LinkedIn content (recent post or profile), uses an LLM to craft a one-liner, writes the result back to Google Sheets, and sends a Telegram summary.

⸻

Good to know
• Works with two paths:
  • Recent post found → personalize from the latest LinkedIn post.
  • No recent post → personalize from profile fields (headline, about, current role).
• Requires valid Apify credentials for LinkedIn scrapers and LLM keys (Anthropic and/or OpenAI).
• Costs depend on the LLM(s) you choose and scraping usage.
• Replace all placeholders like [put your token here] and [put your Telegram Bot Chat ID here] before running.
• Respect the target platform's terms of service when scraping LinkedIn data.

What this workflow does
1. Trigger (Google Drive) – Watches a specific folder for newly uploaded lead spreadsheets.
2. Download & Parse – Downloads the file and converts it to structured items (first name, last name, company, LinkedIn URL, email, website).
3. Batch Loop – Processes each row individually.
4. Fetch Activity – Calls Apify LinkedIn Profile Posts (latest post) and records the current date for recency checks.
5. Recency Check (LLM) – An OpenAI node returns true/false for "post is from the current year" (see the deterministic sketch after the How to use steps below).
6. Branching:
   • If TRUE → AI Agent (Anthropic) crafts a single, natural reference line based on the recent post.
   • If FALSE → Apify LinkedIn Profile → AI Agent (Anthropic) crafts a one-liner from profile data (headline/about/current role).
7. Write Back (Google Sheets) – Updates the original sheet by matching on email and writing the personalization field.
8. Notify (Telegram) – Sends a brief completion summary with sheet name and link.

Requirements
• Google Drive & Google Sheets connections
• Apify account + token for LinkedIn scrapers
• LLM keys: Anthropic (Claude) and/or OpenAI (you can use one or both)
• Telegram bot for notifications (bot token + chat ID)

How to use
1. Connect credentials for Google, Apify, OpenAI/Anthropic, and Telegram.
2. Set your folder in the Google Drive Trigger to the one where you'll drop lead sheets.
3. Map sheet columns to the expected headers (e.g., First Name, Last Name, Company Name for Emails, Person Linkedin Url, Email, Website).
4. Replace placeholders ([put your token here], [put your Telegram Bot Chat ID here]) in the respective nodes.
5. Upload a test spreadsheet to the watched folder and run once to validate the flow.
6. Review results in your sheet (new personalization column) and check Telegram for the completion message.

Setup
1. Connect credentials - Google Drive/Sheets, Apify, OpenAI and/or Anthropic, Telegram.
2. Configure the Drive trigger - Select the folder where you'll upload your lead sheets.
3. Map columns - Ensure your sheet has: First Name, Last Name, Company Name for Emails, Person Linkedin Url, Email, Website.
4. Replace placeholders - In HTTP nodes: Bearer [put your token here]. In the Telegram node: [put your Telegram Bot Chat ID here].
5. (Optional) Adjust the recency rule - Current logic checks for current-year posts; change the prompt if you prefer 30-day windows.

How to use
1. Upload a test spreadsheet to the watched Drive folder.
2. Execute the workflow once to validate.
3. Open your Google Sheet to see the new personalization column populated.
4. Check Telegram for the completion summary.
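The template performs the recency check with an OpenAI node that answers true/false. If you ever want a deterministic fallback, here is a minimal Code node sketch that applies the same current-year rule; the postedAt field name is an assumption about the Apify scraper output, so adjust it to the field your run actually returns.

```javascript
// Hypothetical deterministic recency check (the template itself asks an LLM for true/false).
// Assumes the Apify scraper output exposes a post timestamp, e.g. item.json.postedAt.
const now = new Date();

return $input.all().map((item) => {
  const postedAt = item.json.postedAt ? new Date(item.json.postedAt) : null;
  const isRecent = !!postedAt && postedAt.getFullYear() === now.getFullYear();

  return {
    json: {
      ...item.json,
      isRecent, // true -> personalize from the post; false -> fall back to profile data
    },
  };
});
```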
Customizing this template
• Data sources: Add company news, website content, or X/Twitter as fallback signals.
• LLM choices: Use only Anthropic or only OpenAI; tweak temperature for tone.
• Destinations: Write to a CRM (HubSpot/Salesforce/Airtable) instead of Sheets.
• Notifications: Swap Telegram for Slack/Email/Discord.

Who it's for
• Sales & SDR teams needing authentic, scalable personalization for cold outreach.
• Lead gen agencies enriching spreadsheets with ready-to-use openers.
• Marketing & growth teams improving reply rates by referencing real prospect activity.

Limitations & compliance
• LinkedIn scraping may be rate-limited or blocked; follow platform ToS and local laws.
• Costs vary with scraping volume and LLM usage.

Need help customizing? Contact me for consulting and support: LinkedIn
by Will Carlson
What it does:
Collects cybersecurity news from trusted RSS feeds and uses OpenAI's Retrieval-Augmented Generation (RAG) capabilities with Pinecone to filter for content that is directly relevant to your organization's tech stack. "Relevant" means the AI looks for news items that mention your specific tools, vendors, frameworks, cloud platforms, programming languages, operating systems, or security solutions — as described in your .txt scope documents. By training on these documents, the system understands the environment you operate in and can prioritize news that could affect your security posture, compliance, or operational stability. Once filtered, summaries of the most important items are sent to your work email every day.

How it works
- **Pulls in news from multiple cybersecurity-focused RSS feeds:** The workflow automatically collects articles from trusted, high-signal security news sources. These feeds cover threat intelligence, vulnerability disclosures, vendor advisories, and industry updates.
- **Filters articles for recency and direct connection to your documented tech stack:** Using the publish date, it removes stale or outdated content (see the recency sketch at the end of this description). Then, leveraging your .txt scope documents stored in Pinecone, it checks each article for references to your technologies, vendors, platforms, or security tools.
- **Uses OpenAI to generate and review concise summaries:** For each relevant article, OpenAI creates a short, clear summary of the key points. The AI also evaluates whether the article provides actionable or critical information before passing it through.
- **Trains on your scope using Pinecone Vector Store (free) for context-aware filtering:** Your scope documents are embedded into a vector store so the AI can "remember" your environment. This context ensures the filtering process understands indirect or non-obvious connections to your tech stack.
- **Aggregates and sends only the most critical items to your work email:** The system compiles the highest-priority news items into one daily digest, so you can review key developments without wading through irrelevant stories.

What you need to do:
1. Set up your OpenAI and Pinecone credentials in the workflow.
2. Create and configure a Pinecone index (dimension 1536 recommended). Pinecone is free to set up, and a single free index is enough. Use a namespace like: scope. Make sure the embedding model is the same for all of your Pinecone references.
3. Submit .txt scope documents listing your technologies, vendors, platforms, frameworks, and security products. The .txt does not need to be structured; add as much detail as possible.
4. Update AI prompts to accurately describe your company's environment and priorities.
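To make the recency filter concrete, here is a hedged sketch of a Code node that drops RSS items older than a cutoff. The 7-day window and the isoDate/pubDate field names are assumptions (n8n's RSS nodes typically expose one of these), so adjust them to the fields your feeds actually emit and to whatever window the workflow is configured for.

```javascript
// Hypothetical recency filter for RSS items (field names and the 7-day window are assumptions).
const MAX_AGE_DAYS = 7;
const cutoff = Date.now() - MAX_AGE_DAYS * 24 * 60 * 60 * 1000;

return $input.all().filter((item) => {
  // RSS readers usually expose a publish date such as isoDate or pubDate.
  const published = new Date(item.json.isoDate ?? item.json.pubDate ?? 0).getTime();
  return published >= cutoff; // keep only recent articles before the Pinecone relevance check
});
```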
by Razvan Bara
How it works:
This n8n workflow automates communication with meeting invitees to decrease no-show rates by sending timely email and WhatsApp reminders, plus a clarification request if more information is needed to prepare the meeting.

Step-by-step:
1. The workflow is triggered by an incoming email notification from Calendly about a newly scheduled meeting.
2. It uses AI to extract key meeting data from the email content.
3. It checks whether the invitee provided sufficient information and, if more is needed, sends a clarification request email.
4. It calculates the waiting time required for the 24-hour and 1-hour reminders (see the sketch below).
5. It uses an If node to determine the correct waiting path based on the meeting time.
6. It uses Wait nodes to time the reminders correctly.
7. Finally, it sends a reminder email and a WhatsApp reminder before the meeting.

Customization Options:
- Replace Google Gemini with your preferred LLM model (though Gemini works on the free tier).
- Tailor the email and WhatsApp messages to speak your brand's language.
- Replace the Twilio node with a WhatsApp node to make the flow completely free to run.
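Here is a minimal sketch of the wait-time calculation, assuming the AI extraction step produced an ISO-8601 start time in a field named meetingStart (a hypothetical name). It computes how many minutes each Wait node should pause so the reminders fire 24 hours and 1 hour before the meeting.

```javascript
// Hypothetical Code node: compute wait durations for the 24h and 1h reminders.
// Assumes the AI extraction step produced an ISO-8601 timestamp in item.json.meetingStart.
return $input.all().map((item) => {
  const meetingTime = new Date(item.json.meetingStart).getTime();
  const msUntilMeeting = meetingTime - Date.now();

  const DAY_MS = 24 * 60 * 60 * 1000;
  const HOUR_MS = 60 * 60 * 1000;

  return {
    json: {
      ...item.json,
      // Minutes the first Wait node should pause so the reminder fires 24h before the meeting.
      waitFor24hReminderMinutes: Math.max(0, Math.round((msUntilMeeting - DAY_MS) / 60000)),
      // Minutes the second Wait node should pause so the reminder fires 1h before the meeting.
      waitFor1hReminderMinutes: Math.max(0, Math.round((msUntilMeeting - HOUR_MS) / 60000)),
      // Used by the If node: skip the 24h path when the meeting is less than a day away.
      isMoreThanADayAway: msUntilMeeting > DAY_MS,
    },
  };
});
```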
by Meak
Gmail Lead Reply Analyzer → HubSpot Task + Slack Alert

Most sales teams read every email, guess if it's important, and tell teammates manually. This workflow does it automatically: check intent and sentiment with AI, create follow-up tasks, send Slack alerts, and save everything to Google Sheets.

Benefits
- AI checks sentiment, intent, urgency, and priority
- Creates HubSpot tasks only if follow-up is needed
- Sends Slack message with lead summary
- Logs all results to Google Sheets for tracking
- Runs 24/7 with no manual sorting

How It Works
1. Gmail trigger watches a label for new replies
2. Workflow extracts sender, subject, and message
3. AI analyzes message and returns: sentiment, intent, urgency, next step
4. Code step cleans result, adds date, and checks if follow-up is needed (a sketch of this step appears at the end of this description)
5. If follow-up = yes → create HubSpot task, send Slack alert, log to Sheets
6. If follow-up = no → just log to Sheets

Who Is This For
- Sales teams getting many leads by email
- Founders who handle leads themselves
- Agencies needing clear and fast lead triage

Setup
1. Connect Gmail (choose or create label)
2. Add OpenAI API key (model: GPT-4o mini)
3. Connect HubSpot (App Token for tasks)
4. Connect Slack (channel for alerts)
5. Connect Google Sheets (Spreadsheet + Tab)
6. Optional: change how urgency/priority is scored in the code

ROI & Monetization
- Save 3–6 hours per week on email sorting
- Answer faster and close more deals
- Sell as $1k–$3k/month "inbox automation" service

Strategy Insights
In the full walkthrough, I show how to:
- Make sure AI always returns valid JSON
- Adjust what counts as a follow-up lead
- Format Slack messages for quick reading
- Use Google Sheets as a simple dashboard

Check Out My Channel
For more AI automation systems that get real results, check out my YouTube channel where I share exactly how I build automation workflows, sell high-value services, and scale to $20k+ monthly revenue.
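A minimal sketch of the cleanup step, assuming the AI node returns JSON with sentiment, intent, urgency, and next_step fields (the exact field names, including aiOutput, are assumptions). It parses the model output defensively, stamps the date, and derives a needsFollowUp flag for the yes/no branch; adjust the follow-up criteria to your own scoring.

```javascript
// Hypothetical Code node: parse the AI output defensively and decide whether follow-up is needed.
// Field names (aiOutput, sentiment, intent, urgency, next_step) are assumptions for illustration.
return $input.all().map((item) => {
  const raw = String(item.json.aiOutput ?? '');
  let analysis;
  try {
    // Extract the first {...} block in case the model wrapped the JSON in extra text.
    const start = raw.indexOf('{');
    const end = raw.lastIndexOf('}');
    analysis = JSON.parse(raw.slice(start, end + 1));
  } catch (err) {
    analysis = { sentiment: 'unknown', intent: 'unknown', urgency: 'low', next_step: 'review manually' };
  }

  const needsFollowUp =
    ['high', 'medium'].includes(String(analysis.urgency).toLowerCase()) ||
    String(analysis.intent).toLowerCase().includes('interested');

  return {
    json: {
      ...item.json,
      ...analysis,
      needsFollowUp,                         // drives the yes/no branch
      processedAt: new Date().toISOString(), // date stamp for the Sheets log
    },
  };
});
```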
by John Alejandro Silva
🤖💬 Smart Telegram AI Assistant with Memory Summarization & Dynamic Model Selection

> Optimize your AI workflows, cut costs, and get faster, more accurate answers.

📋 Description
Tired of expensive AI calls, slow responses, or bots that forget your context? This Telegram AI Assistant template is designed to optimize cost, speed, and precision in your AI-powered conversations. By combining PostgreSQL chat memory, AI summarization, and dynamic model selection, this workflow ensures you only pay for what you really need. Simple queries get routed to lightweight models, while complex requests automatically trigger more advanced ones. The result? Smarter context, lower costs, and better answers.

This template is perfect for anyone who wants to:
- ⚡ Save money by using cheaper models for easy tasks.
- 🧠 Keep context relevant with AI-powered summarization.
- ⏱️ Respond faster thanks to optimized chat memory storage.
- 💬 Deliver better answers directly inside Telegram.

✨ Key Benefits
- 💸 Cost Optimization: Automatically routes simple requests to Gemini Flash Lite and reserves Gemini Pro only for complex reasoning.
- 🧠 Smarter Context: Summarization ensures only the most relevant chat history is used.
- ⏱️ Faster Workflows: Storing user + agent messages in a single row reduces DB queries by half and saves ~0.3s per response.
- 🎤 Voice Message Support: Convert Telegram voice notes to text and reply intelligently.
- 🛡️ Error-Proof Formatting: Safe MarkdownV2 ensures Telegram-ready answers.

💼 Use Case
This template is for anyone who needs an AI chatbot on Telegram that balances cost, performance, and intelligence. Customer support teams can reduce expenses by using lightweight models for FAQs. Freelancers and consultants can offer faster AI-powered chats without losing context. Power users can handle voice + text seamlessly while keeping conversations memory-aware. Whether you're scaling a business or just want a smarter assistant, this workflow adapts to your needs and budget.

💬 Example Interactions
- **Quick Q&A** → Routed to Gemini Flash Lite for fast, low-cost answers.
- **Complex problem-solving** → Sent to Gemini Pro for in-depth reasoning.
- **Voice messages** → Automatically transcribed, summarized, and answered.
- **Long conversations** → Context is summarized, ensuring precise and efficient replies.

🔑 Required Credentials
- **Telegram Bot API** (Bot Token)
- **PostgreSQL** (Database connection)
- **Google Gemini API** (Flash Lite, Flash, Pro)

⚙️ Setup Instructions
1. 🗄️ Create the PostgreSQL table (chat_memory) from the Gray section SQL.
2. 🔌 Configure the Telegram Trigger with your bot token.
3. 🤖 Connect your Gemini API credentials.
4. 🗂️ Set up PostgreSQL nodes with your DB details.
5. ▶️ Activate the workflow and start chatting with your AI-powered Telegram bot.

🏷 Tags
telegram ai-assistant chatbot postgresql summarization memory gemini dynamic-routing workflow-optimization cost-saving voice-to-text

🙏 Acknowledgement
A special thank you to Davide for the inspiration behind this template. His work on the AI Orchestrator that dynamically selects models based on input type served as a foundational guide for this architecture.

💡 Need Assistance?
Want to customize this workflow for your business or project? Let's connect:
📧 Email: johnsilva11031@gmail.com
🔗 LinkedIn: John Alejandro Silva Rodríguez
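As an illustration of the dynamic model selection described above, here is a hypothetical Code node that picks a Gemini model per message using a rough complexity heuristic. The heuristic, the model IDs, and the modelName field are assumptions, not the exact routing logic this template ships with; match the IDs to the models available to your Gemini credentials.

```javascript
// Hypothetical Code node: pick a Gemini model per message using a rough complexity heuristic.
// The heuristic, the model IDs, and the output field name (modelName) are illustrative assumptions.
return $input.all().map((item) => {
  const text = String(item.json.messageText ?? '');

  const looksComplex =
    text.length > 400 ||                   // long prompts
    text.split('\n').length > 8 ||         // multi-part requests
    (text.match(/\?/g) ?? []).length > 2;  // several questions at once

  return {
    json: {
      ...item.json,
      // Cheap, fast model for simple queries; stronger model for complex reasoning.
      modelName: looksComplex ? 'gemini-2.5-pro' : 'gemini-2.5-flash-lite',
    },
  };
});
```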
by Cheng Siong Chin
Introduction
Automates travel planning by aggregating flights, hotels, activities, and weather via APIs, then uses AI to generate professional itineraries delivered through Gmail and Slack.

How It Works
Webhook receives requests, searches APIs (Skyscanner, Booking.com, Kiwi, Viator, weather), merges data, AI builds itineraries, scores options, generates HTML emails, delivers via Gmail/Slack.

Workflow Template
Webhook → Extract → Parallel Searches (Flights/Hotels/Activities/Weather) → Merge → Build Itinerary → AI Processing → Score → Generate HTML → Gmail → Slack → Response

Workflow Steps
1. Trigger & Extract: Receives destination, dates, preferences, extracts parameters.
2. Data Gathering: Parallel APIs fetch flights, hotels, activities, weather, merges responses.
3. AI Processing: Analyzes data, creates itinerary, ranks recommendations (see the scoring sketch at the end of this description).
4. Delivery: Generates HTML email, sends via Gmail/Slack, confirms completion.

Setup Instructions
1. API Configuration: Add keys for Skyscanner, Booking.com, Kiwi, Viator, OpenWeatherMap, OpenRouter.
2. Communication: Connect Gmail OAuth2, Slack webhook.
3. Customization: Adjust endpoints, AI prompts, HTML template, scoring criteria.

Prerequisites
- API keys: Skyscanner, Booking.com, Kiwi, Viator, OpenWeatherMap, OpenRouter
- Gmail account
- Slack workspace
- n8n instance

Use Cases
- Corporate travel planning
- Vacation itinerary generation
- Group trip coordination

Customization
- Add sources (Airbnb, TripAdvisor)
- Filter by budget preferences
- Add PDF generation
- Customize Slack format

Benefits
- Saves 3-5 hours per trip
- Real-time pricing aggregation
- AI-powered personalization
- Automated multi-channel delivery
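To make the "scores options" step concrete, here is a minimal sketch of how a Code node might rank itinerary options by weighted price, rating, and weather fit. The field names (price, rating, weatherScore) and the weights are assumptions for illustration, not the template's actual scoring criteria; tune them under the Customization step.

```javascript
// Hypothetical scoring step: rank itinerary options by price, rating, and weather fit.
// Field names (price, rating, weatherScore) and the weights are illustrative assumptions.
const WEIGHTS = { price: 0.4, rating: 0.4, weather: 0.2 };

const options = $input.all().map((item) => item.json);
const maxPrice = Math.max(...options.map((o) => o.price ?? 0), 1);

const scored = options.map((o) => {
  const priceScore = 1 - (o.price ?? 0) / maxPrice;  // cheaper is better
  const ratingScore = (o.rating ?? 0) / 5;            // assumes 0-5 ratings
  const weatherScore = o.weatherScore ?? 0.5;         // assumes a 0-1 fit derived from the weather data
  const score =
    WEIGHTS.price * priceScore +
    WEIGHTS.rating * ratingScore +
    WEIGHTS.weather * weatherScore;
  return { ...o, score: Number(score.toFixed(3)) };
});

// Highest-scoring options first, ready for the HTML email builder.
scored.sort((a, b) => b.score - a.score);
return scored.map((o) => ({ json: o }));
```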
by Tsubasa Shukuwa
How it works
This workflow automatically fetches the latest public grant information from the Ministry of Health, Labour and Welfare (MHLW) RSS feed. It uses AI to summarize and structure each grant post into a clear format, stores the results in Google Sheets, and sends a formatted HTML summary via Gmail.

Workflow summary
1. Schedule Trigger – Runs the flow daily or weekly.
2. RSS Feed Reader – Fetches the latest MHLW news and updates.
3. Text Classifier (AI) – Categorizes the item as "Grant/Subsidy", "Labor-related", or "Other".
4. AI Agent – Extracts structured data such as title, summary, deadline, amount, target, and URL.
5. Google Sheets – Appends or updates the database using the grant title as the key.
6. Code Node – Builds an HTML report summarizing new entries (a sketch of this step follows below).
7. Gmail – Sends a daily digest email to your inbox.

Setup steps
1. Add your OpenRouter API key as a credential (used in the AI Agent).
2. Replace the Google Sheets ID and sheet name with your own.
3. Update the recipient email address in the Gmail node.
4. Adjust the schedule trigger to match your preferred frequency.
5. (Optional) Add more RSS feeds if you want to monitor other sources.

Ideal for
- Consultants or administrators tracking subsidy and grant programs
- Small business owners who want automatic updates
- Anyone who wants a daily AI-summarized government grant digest

⚙️ Note: Detailed explanations and setup hints are included as Sticky Notes above each node inside the workflow.
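A minimal sketch of the HTML-building Code node, assuming each incoming item carries the title, summary, deadline, amount, target, and url fields produced by the AI Agent (the field names are assumptions and may differ from the template's actual output keys).

```javascript
// Hypothetical Code node: build a simple HTML digest from the structured grant records.
// Assumes each item has title, summary, deadline, amount, target, and url fields.
const items = $input.all();

const rows = items.map((item) => {
  const g = item.json;
  return `
    <tr>
      <td><a href="${g.url}">${g.title}</a></td>
      <td>${g.summary}</td>
      <td>${g.deadline ?? '-'}</td>
      <td>${g.amount ?? '-'}</td>
      <td>${g.target ?? '-'}</td>
    </tr>`;
}).join('');

const html = `
  <h2>New grant and subsidy updates</h2>
  <table border="1" cellpadding="6" cellspacing="0">
    <tr><th>Title</th><th>Summary</th><th>Deadline</th><th>Amount</th><th>Target</th></tr>
    ${rows}
  </table>`;

// The Gmail node can reference this field as the email body.
return [{ json: { htmlReport: html, newEntryCount: items.length } }];
```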
by Daniel Agrici
This workflow automates business intelligence. Submit one URL, and it scrapes the website, uses AI to perform a comprehensive analysis, and generates a professional report in Google Doc and PDF format. It's perfect for agencies, freelancers, and consultants who need to streamline client research or competitive analysis. How It Works The workflow is triggered by a form input, where you provide a single website URL. Scrape: It uses Firecrawl to scrape the sitemap and get the full content from the target website. Analyze: The main workflow calls a Tools Workflow (included below) which uses Google Gemini and Perplexity AI agents to analyze the scraped content and extract key business information. Generate & Deliver: All the extracted data is formatted and used to populate a template in Google Docs. The final report is saved to Google Drive and delivered via Gmail. What It Generates The final report is a comprehensive business analysis, including: Business Overview: A full company description. Target Audience Personas: Defines the demographic and psychographic profiles of ideal customers. Brand & UVP: Extracts the brand's personality matrix and its Unique Value Proposition (UVP). Customer Journey: Maps the typical customer journey from Awareness to Loyalty. Required Tools This workflow requires n8n and API keys/credentials for the following services: Firecrawl (for scraping) Perplexity (for AI analysis) Google Gemini (for AI analysis) Google Services (for Docs, Drive, and Gmail) ⚠️ Required: Tools Workflow This workflow will not work without its "Tools" sub-workflow. Please create a new, separate workflow in n8n, name it (e.g., "Business Analysis Tools"), and paste the following code into it. { "name": "Business Analysis Workflow Tools", "nodes": [ { "parameters": { "workflowInputs": { "values": [ { "name": "function" }, { "name": "keyword" }, { "name": "url" }, { "name": "location_code" }, { "name": "language_code" } ] } }, "type": "n8n-nodes-base.executeWorkflowTrigger", "typeVersion": 1.1, "position": [ -448, 800 ], "id": "e79e0605-f9ac-4166-894c-e5aa9bd75bac", "name": "When Executed by Another Workflow" }, { "parameters": { "rules": { "values": [ { "conditions": { "options": { "caseSensitive": true, "leftValue": "", "typeValidation": "strict", "version": 2 }, "conditions": [ { "id": "8d7d3035-3a57-47ee-b1d1-dd7bfcab9114", "leftValue": "serp_search", "rightValue": "={{ $json.function }}", "operator": { "type": "string", "operation": "equals", "name": "filter.operator.equals" } } ], "combinator": "and" }, "renameOutput": true, "outputKey": "serp_search" }, { "conditions": { "options": { "caseSensitive": true, "leftValue": "", "typeValidation": "strict", "version": 2 }, "conditions": [ { "id": "bb2c23eb-862d-4582-961e-5a8d8338842c", "leftValue": "ai_mode", "rightValue": "={{ $json.function }}", "operator": { "type": "string", "operation": "equals", "name": "filter.operator.equals" } } ], "combinator": "and" }, "renameOutput": true, "outputKey": "ai_mode" }, { "conditions": { "options": { "caseSensitive": true, "leftValue": "", "typeValidation": "strict", "version": 2 }, "conditions": [ { "id": "4603eee1-3888-4e32-b3b9-4f299dfd6df3", "leftValue": "internal_links", "rightValue": "={{ $json.function }}", "operator": { "type": "string", "operation": "equals", "name": "filter.operator.equals" } } ], "combinator": "and" }, "renameOutput": true, "outputKey": "internal_links" } ] }, "options": {} }, "type": "n8n-nodes-base.switch", "typeVersion": 3.2, "position": [ -208, 784 ], "id": 
"72c37890-7054-48d8-a508-47ed981551d6", "name": "Switch" }, { "parameters": { "method": "POST", "url": "https://api.dataforseo.com/v3/serp/google/organic/live/advanced", "authentication": "genericCredentialType", "genericAuthType": "httpBasicAuth", "sendBody": true, "specifyBody": "json", "jsonBody": "=[\n {\n \"keyword\": \"{{ $json.keyword.replace(/[:'\"\\\\/]/g, '') }}\",\n \"location_code\": {{ $json.location_code }},\n \"language_code\": \"{{ $json.language_code }}\",\n \"depth\": 10,\n \"group_organic_results\": true,\n \"load_async_ai_overview\": true,\n \"people_also_ask_click_depth\": 1\n }\n]", "options": { "redirect": { "redirect": { "followRedirects": false } } } }, "type": "n8n-nodes-base.httpRequest", "typeVersion": 4.2, "position": [ 384, 512 ], "id": "6203f722-b590-4a25-8953-8753a44eb3cb", "name": "SERP Google", "credentials": { "httpBasicAuth": { "id": "n5o00CCWcmHFeI1p", "name": "DataForSEO" } } }, { "parameters": { "content": "## SERP Google", "height": 272, "width": 688, "color": 4 }, "type": "n8n-nodes-base.stickyNote", "typeVersion": 1, "position": [ 288, 432 ], "id": "81593217-034f-466d-9055-03ab6b2d7d08", "name": "Sticky Note3" }, { "parameters": { "assignments": { "assignments": [ { "id": "97ef7ee0-bc97-4089-bc37-c0545e28ed9f", "name": "platform", "value": "={{ $json.tasks[0].data.se }}", "type": "string" }, { "id": "9299e6bb-bd36-4691-bc6c-655795a6226e", "name": "type", "value": "={{ $json.tasks[0].data.se_type }}", "type": "string" }, { "id": "2dc26c8e-713c-4a59-a353-9d9259109e74", "name": "keyword", "value": "={{ $json.tasks[0].data.keyword }}", "type": "string" }, { "id": "84c9be31-8f1d-4a67-9d13-897910d7ec18", "name": "results", "value": "={{ $json.tasks[0].result }}", "type": "array" } ] }, "options": {} }, "type": "n8n-nodes-base.set", "typeVersion": 3.4, "position": [ 592, 512 ], "id": "a916551a-009b-403f-b02e-3951d54d2407", "name": "Prepare SERP output" }, { "parameters": { "content": "# Google Organic Search API\n\nThis API lets you retrieve real-time Google search results with a wide range of parameters and custom settings. \nThe response includes structured data for all available SERP features, along with a direct URL to the search results page. 
\n\n👉 Documentation\n", "height": 272, "width": 496, "color": 4 }, "type": "n8n-nodes-base.stickyNote", "typeVersion": 1, "position": [ 976, 432 ], "id": "87672b01-7477-4b43-9ccc-523ef8d91c64", "name": "Sticky Note17" }, { "parameters": { "method": "POST", "url": "https://api.dataforseo.com/v3/serp/google/ai_mode/live/advanced", "authentication": "genericCredentialType", "genericAuthType": "httpBasicAuth", "sendBody": true, "specifyBody": "json", "jsonBody": "=[\n {\n \"keyword\": \"{{ $json.keyword }}\",\n \"location_code\": {{ $json.location_code }},\n \"language_code\": \"{{ $json.language_code }}\",\n \"device\": \"mobile\",\n \"os\": \"android\"\n }\n]", "options": { "redirect": { "redirect": {} } } }, "type": "n8n-nodes-base.httpRequest", "typeVersion": 4.2, "position": [ 384, 800 ], "id": "fb0001c4-d590-45b3-a3d0-cac7174741d3", "name": "AI Mode", "credentials": { "httpBasicAuth": { "id": "n5o00CCWcmHFeI1p", "name": "DataForSEO" } } }, { "parameters": { "content": "## AI Mode", "height": 272, "width": 512, "color": 6 }, "type": "n8n-nodes-base.stickyNote", "typeVersion": 1, "position": [ 288, 720 ], "id": "2cea3312-31f8-4ff0-b385-5b76b836274c", "name": "Sticky Note11" }, { "parameters": { "assignments": { "assignments": [ { "id": "b822f458-ebf2-4a37-9906-b6a2606e6106", "name": "keyword", "value": "={{ $json.tasks[0].data.keyword }}", "type": "string" }, { "id": "10484675-b107-4157-bc7e-b942d8cdb5d2", "name": "result", "value": "={{ $json.tasks[0].result[0].items }}", "type": "array" } ] }, "options": {} }, "type": "n8n-nodes-base.set", "typeVersion": 3.4, "position": [ 592, 800 ], "id": "6b1e7239-ee2b-4457-8acb-17ce87415729", "name": "Prepare AI Mode Output" }, { "parameters": { "content": "# Google AI Mode API\n\nThis API provides AI-generated search result summaries and insights from Google. \nIt returns detailed explanations, overviews, and related information based on search queries, with parameters to customize the AI overview. 
\n\n👉 Documentation\n", "height": 272, "width": 496, "color": 6 }, "type": "n8n-nodes-base.stickyNote", "typeVersion": 1, "position": [ 800, 720 ], "id": "d761dc57-e35d-4052-a360-71170a155f7b", "name": "Sticky Note18" }, { "parameters": { "content": "## Input", "height": 384, "width": 544, "color": 7 }, "type": "n8n-nodes-base.stickyNote", "typeVersion": 1, "position": [ -528, 672 ], "id": "db90385e-f921-4a9c-89f3-53fc5825b207", "name": "Sticky Note" }, { "parameters": { "assignments": { "assignments": [ { "id": "b865f4a0-b4c3-4dde-bf18-3da933ab21af", "name": "platform", "value": "={{ $json.platform }}", "type": "string" }, { "id": "476e07ca-ccf6-43d4-acb4-4cc905464314", "name": "type", "value": "={{ $json.type }}", "type": "string" }, { "id": "f1a14eb8-9f10-4198-bbc7-17091532b38e", "name": "keyword", "value": "={{ $json.keyword }}", "type": "string" }, { "id": "181791a0-1d88-481c-8d98-a86242bb2135", "name": "results", "value": "={{ $json.results[0].items }}", "type": "array" } ] }, "options": {} }, "type": "n8n-nodes-base.set", "typeVersion": 3.4, "position": [ 800, 512 ], "id": "83fef061-5e0b-417c-b1f6-d34eb712fac6", "name": "Sort Results" }, { "parameters": { "content": "## Internal Links", "height": 272, "width": 272, "color": 5 }, "type": "n8n-nodes-base.stickyNote", "typeVersion": 1, "position": [ 288, 1024 ], "id": "9246601a-f133-4ca3-aac8-989cb45e6cd2", "name": "Sticky Note7" }, { "parameters": { "method": "POST", "url": "https://api.firecrawl.dev/v2/map", "sendHeaders": true, "headerParameters": { "parameters": [ { "name": "Authorization", "value": "Bearer your-firecrawl-apikey" } ] }, "sendBody": true, "specifyBody": "json", "jsonBody": "={\n \"url\": \"https://{{ $json.url }}\",\n \"limit\": 400,\n \"includeSubdomains\": false,\n \"sitemap\": \"include\"\n }", "options": {} }, "type": "n8n-nodes-base.httpRequest", "typeVersion": 4.2, "position": [ 368, 1104 ], "id": "fd6a33ae-6fb3-4331-ab6a-994048659116", "name": "Get Internal Links" }, { "parameters": { "content": "# Firecrawl Map API\n\nThis endpoint maps a website from a single URL and returns the list of discovered URLs (titles and descriptions when available) — extremely fast and useful for selecting which pages to scrape or for quickly enumerating site links. 
(Firecrawl)\n\nIt supports a search parameter to find relevant pages inside a site, location/languages options to emulate country/language (uses proxies when available), and SDK + cURL examples in the docs,\n\n👉 Documentation\n\n[1]: https://docs.firecrawl.dev/features/map \"Map | Firecrawl\"\n", "height": 272, "width": 624, "color": 5 }, "type": "n8n-nodes-base.stickyNote", "typeVersion": 1, "position": [ 560, 1024 ], "id": "08457204-93ff-4586-a76e-03907118be3c", "name": "Sticky Note24" } ], "pinData": { "When Executed by Another Workflow": [ { "json": { "function": "serp_search", "keyword": "villanyszerelő Largo Florida", "url": null, "location_code": 2840, "language_code": "hu" } } ] }, "connections": { "When Executed by Another Workflow": { "main": [ [ { "node": "Switch", "type": "main", "index": 0 } ] ] }, "Switch": { "main": [ [ { "node": "SERP Google", "type": "main", "index": 0 } ], [ { "node": "AI Mode", "type": "main", "index": 0 } ], [ { "node": "Get Internal Links", "type": "main", "index": 0 } ] ] }, "SERP Google": { "main": [ [ { "node": "Prepare SERP output", "type": "main", "index": 0 } ] ] }, "AI Mode": { "main": [ [ { "node": "Prepare AI Mode Output", "type": "main", "index": 0 } ] ] }, "Prepare SERP output": { "main": [ [ { "node": "Sort Results", "type": "main", "index": 0 } ] ] }, "Sort Results": { "main": [ [] ] } }, "active": false, "settings": { "executionOrder": "v1" }, "versionId": "6fce16d1-aa28-4939-9c2d-930d11c1e17f", "meta": { "instanceId": "1ee7b11b3a4bb285563e32fdddf3fbac26379ada529b942ee7cda230735046a1" }, "id": "VjpOW2V2aNV9HpQJ", "tags": [] } `
by Muhammad Ali
Who's it for
Perfect for marketing agencies that manage multiple Facebook ad accounts and want to automate their weekly reporting. It eliminates manual data collection, analysis, and client updates by delivering a ready-to-share PDF report.

How it works
Every Monday, the workflow:
1. Fetches the previous week's campaign metrics from the Facebook Graph API (a date-range sketch follows at the end of this description).
2. Formats and summarizes each campaign's performance using OpenAI.
3. Merges all summaries into one comprehensive report with insights and next-week suggestions.
4. Converts the report into a polished PDF using any PDF generation API.
5. Sends the final PDF report automatically to the client via Gmail.

How to set up
1. Connect your Facebook, OpenAI, and Gmail accounts in n8n.
2. Add credentials for your preferred PDF generator (e.g., PDFCrowd, Placid, etc.).
3. Open the "Set Node" to customize recipient email, date range, or report text.

Requirements
- Facebook Graph API access token
- OpenAI API key
- Gmail credentials
- API key for your PDF generation service

How to customize
You can modify the trigger day, personalize the report design, or include additional analytics such as ROAS, CPC, or conversion data for deeper insights.
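For the "previous week" range, a Set or Code node typically computes the since/until boundaries passed to the Graph API insights request. Here is a hedged sketch under the assumption of a Monday-to-Sunday reporting window and YYYY-MM-DD formatting; adapt it to however your Set node defines the date range.

```javascript
// Hypothetical Code node: compute last week's Monday-Sunday range for the Graph API insights query.
// The Monday-to-Sunday convention and field names are assumptions; adjust to your reporting window.
const now = new Date();
const dayOfWeek = now.getDay() === 0 ? 7 : now.getDay(); // treat Sunday as day 7

// Most recent Monday, then step back one full week.
const thisMonday = new Date(now);
thisMonday.setDate(now.getDate() - (dayOfWeek - 1));
const lastMonday = new Date(thisMonday);
lastMonday.setDate(thisMonday.getDate() - 7);
const lastSunday = new Date(thisMonday);
lastSunday.setDate(thisMonday.getDate() - 1);

// toISOString() is UTC-based; adjust if your reporting timezone differs.
const fmt = (d) => d.toISOString().slice(0, 10); // YYYY-MM-DD

return [{
  json: {
    since: fmt(lastMonday),
    until: fmt(lastSunday),
    // e.g. passed to the insights call as time_range={'since':'...','until':'...'}
  },
}];
```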
by Tsubasa Shukuwa
How it works
This workflow automatically generates a new haiku poem every morning using AI, formats it in 5-7-5 structure, saves it to Google Docs, and sends it to your email inbox.

Workflow steps:
1. Schedule Trigger – Runs daily at 7:00 AM.
2. AI Agent – Asks AI to output four words (kigo, noun, verb1, verb2) in JSON format.
3. Code in JavaScript – Builds a 5-7-5 haiku using the AI-generated words and sets today's title (see the sketch below).
4. Edit Fields – Prepares document fields (title and body) for Google Docs.
5. Create a document – Creates a new Google Document for the haiku.
6. Prepare Append – Collects the document ID and haiku text for appending.
7. Update a document – Inserts the haiku into the existing Google Doc.
8. Send a message – Sends the haiku of the day to your Gmail inbox.
9. OpenRouter Chat Model – Connects the OpenRouter model used by the AI Agent.

Setup steps
1. Connect your OpenRouter API key as a credential (used in the AI Agent node).
2. Update your Google Docs folder ID and Gmail account credentials.
3. Change the email recipient address in the "Send a message" node.
4. Adjust the Schedule Trigger time as you like.
5. Run the workflow once to test and verify document creation and email delivery.

Ideal for
- Writers and poets who want daily creative inspiration.
- Individuals seeking a fun morning ritual.
- Educators demonstrating AI text generation in a practical example.

⚙️ Note: Each node includes an English Sticky Note above it for clarity and documentation.
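A minimal sketch of the "Code in JavaScript" step, assuming the AI Agent returns a JSON object with kigo, noun, verb1, and verb2 fields (as named above). The line templates here are purely illustrative and do not enforce a strict 5-7-5 syllable count; the template's actual assembly logic may differ.

```javascript
// Hypothetical Code node: assemble a haiku-style poem from the four AI-generated words
// and set today's title. The line templates are illustrative, not a strict syllable counter.
const { kigo, noun, verb1, verb2 } = $input.first().json;

const today = new Date().toISOString().slice(0, 10); // YYYY-MM-DD
const title = `Haiku of the Day – ${today}`;

const haiku = [
  `${kigo} morning`,              // line 1: seasonal word (kigo)
  `the ${noun} ${verb1}s softly`, // line 2
  `and ${verb2}s away`,           // line 3
].join('\n');

// Downstream nodes (Edit Fields / Google Docs / Gmail) read these fields.
return [{ json: { title, haiku } }];
```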
by Jitesh Dugar
👤 Who's it for
This workflow is designed for employees who need to submit expense claims for business trips. It automates the process of extracting data from receipts/invoices, logging it to a Google Sheet, and notifying the finance team via email.

Ideal users:
- Employees submitting business trip expense claims
- HR or Admins reviewing travel-related reimbursements
- Finance teams responsible for processing claims

⚙️ How it works / What it does
1. Employee submits a form with trip information (name, department, purpose, dates) and uploads one or more receipts/invoices (PDF).
2. Uploaded files are saved to Google Drive for record-keeping.
3. Each PDF is passed to a DocClaim Assistant agent, which uses GPT-4o and a structured parser to extract structured invoice data.
4. The data is transformed and formatted into a standard JSON structure.
5. Two parallel paths are followed:
   - Invoice records are appended to a Google Sheet for centralized tracking.
   - A detailed HTML email summarizing the trip and expenses is generated and sent to the finance department for claim processing.

🛠 How to set up
1. Create a form to capture: Employee Name, Department, Trip Purpose, From Date / To Date, Receipt/Invoice File Upload (multiple PDFs).
2. Configure the file upload node to store files in a specific Google Drive folder.
3. Set up the DocClaim Agent using GPT-4o or any LLM with document analysis capability, plus an output parser for standardizing extracted receipt data (e.g., vendor, total, tax, date).
4. Transform extracted data into a structured claim record (Code Node); a sketch of this step appears at the end of this description.
5. Path 1: Save records to a Google Sheet (one row per expense).
6. Path 2: Format the employee + claim data into a dynamic HTML email and use the Send Email node to notify the finance department (e.g., finance@yourcompany.com).

✅ Requirements
- Jotform account with expense form setup (sign up for free here)
- n8n running with access to:
  - Google Drive API (for file uploads)
  - Google Sheets API (for logging expenses)
  - Email node (SMTP or Gmail for sending)
  - GPT-4o or equivalent LLM with document parsing ability
- PDF invoices with clear formatting
- Shared Google Sheet for claim tracking
- Optional: Shared inbox for finance team

🧩 How to customize the workflow
- **Add approval steps**: route the email to a manager before finance
- **Attach original PDFs**: include uploaded files in the email as attachments
- **Localize for other languages**: adapt form labels, email content, or parser prompts
- **Sync to ERP or accounting system**: replace Google Sheet with QuickBooks, Xero, etc.
- **Set limits/validation**: enforce max claim per trip or required fields before submission
- **Auto-tag expenses**: add categories (e.g., travel, accommodation) for better reporting
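A minimal sketch of the Code Node transform from step 4, assuming the parser output exposes vendor, total, tax, and date fields and the form provides employee details (all field names here, including fromDate and toDate, are assumptions to adapt to your own parser and form keys).

```javascript
// Hypothetical Code node: turn parsed receipt data + form fields into a standard claim record.
// Field names (vendor, total, tax, date, employeeName, department, tripPurpose, fromDate, toDate)
// are assumptions for illustration.
return $input.all().map((item) => {
  const src = item.json;
  return {
    json: {
      employeeName: src.employeeName,
      department: src.department,
      tripPurpose: src.tripPurpose,
      tripFrom: src.fromDate,
      tripTo: src.toDate,
      vendor: src.vendor,
      invoiceDate: src.date,
      amount: Number(src.total ?? 0),
      tax: Number(src.tax ?? 0),
      currency: src.currency ?? 'USD',        // default currency is an assumption
      submittedAt: new Date().toISOString(),  // one row per expense in the Google Sheet
    },
  };
});
```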