by Jimleuk
This template attempts to replicate OpenAI's DeepResearch feature which, at the time of writing, is only available to their Pro subscribers.

> An agent that uses reasoning to synthesize large amounts of online information and complete multi-step research tasks for you. Source

Though the inner workings of DeepResearch have not been made public, it is presumed the feature relies on the ability to deep-search the web, scrape web content and invoke reasoning models to generate reports. All of which n8n is really good at! Using this workflow, n8n users can enjoy a variation of the Deep Research experience for themselves and their teams at a fraction of the cost. Better yet, they can learn from and customise this Deep Research template for their own businesses and/or organisations.

Check out the generated reports here: https://jimleuk.notion.site/19486dd60c0c80da9cb7eb1468ea9afd?v=19486dd60c0c805c8e0c000ce8c87acf

How it works
A form first captures the user's research query and how deep they'd like the researcher to go. Once submitted, a blank Notion page is created which will later hold the final report, and the researcher gets to work. The user's query goes through a recursive series of web searches and web scraping to collect data on the research topic and generate partial learnings (see the sketch at the end of this description). Once complete, all learnings are combined and given to a reasoning LLM to generate the final report. The report is then written to the placeholder Notion page created earlier.

How to use
- Duplicate this Notion database template and make sure all Notion-related nodes point to it.
- Sign up for an APIFY.com API key for web search and scraping services.
- Ensure you have access to OpenAI's o3-mini model. Alternatively, switch this out for the o1 series.
- You must publish this workflow and ensure the form URL is publicly accessible.

On depth & breadth configuration
For more detailed reports, increase depth and breadth, but be warned the workflow will take exponentially longer and cost more to complete. The recommended defaults are usually good enough.
- Depth=1 & Breadth=2: about 5-10 mins.
- Depth=1 & Breadth=3: about 15-20 mins.
- Depth=3 & Breadth=5: about 2+ hours!

Customising this workflow
I deliberately chose not to use AI-powered scrapers like Firecrawl as I felt these were quite costly and quotas would be quickly exhausted. However, feel free to switch to the web search and scraping services that suit your environment. Maybe you decide not to source the web at all and instead collect data from internal documents; this template gives you the freedom to change this. Experiment with different reasoning/thinking models such as DeepSeek and Google's Gemini 2.0. Finally, the LLM prompts could definitely be improved; refine them to fit your use case.

Credits
This template is largely based on the work by David Zhang (dzhng) and his open source implementation of Deep Research: https://github.com/dzhng/deep-research
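For readers curious how depth and breadth drive runtime and cost, here is a minimal sketch of the recursive research loop the description refers to. This is not the template's actual code; `searchWeb`, `scrapePage` and `extractLearnings` are hypothetical stand-ins for the Apify search/scrape and LLM steps in the workflow.

```typescript
type SearchResult = { url: string };
type Learning = { topic: string; summary: string };

// Hypothetical stand-ins for the Apify search/scrape and LLM nodes.
declare function searchWeb(query: string, limit: number): Promise<SearchResult[]>;
declare function scrapePage(url: string): Promise<string>;
declare function extractLearnings(query: string, content: string): Promise<Learning>;

async function research(query: string, depth: number, breadth: number): Promise<Learning[]> {
  if (depth === 0) return [];

  // Breadth controls how many search results are followed at each level.
  const results = await searchWeb(query, breadth);
  const learnings: Learning[] = [];

  for (const result of results) {
    const content = await scrapePage(result.url);
    const learning = await extractLearnings(query, content);
    learnings.push(learning);

    // Depth controls how many times the loop recurses into follow-up questions,
    // which is why runtime grows roughly as breadth^depth.
    const followUps = await research(learning.summary, depth - 1, breadth);
    learnings.push(...followUps);
  }
  return learnings;
}
```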
by Marcial Ambriz
Remixed from Solomon's "Backup your workflows to GitHub" template. Check out his templates.

How it works
This workflow backs up your workflows to GitHub. It uses the n8n API node to export all workflows, then loops over the data and checks GitHub to see whether a file already exists for each workflow's ID. Once checked, it will:
- update the file on GitHub if it exists;
- create a new file if it doesn't exist;
- skip the file if the content is unchanged.
In addition, it also checks whether any workflows have been deleted from n8n. If a workflow no longer exists in n8n, the corresponding file is removed from the repository to keep everything in sync. A sketch of this decision logic follows below.

Who is this for?
People wanting to back up their workflows outside the server for safety purposes or to migrate to another server.
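As a rough illustration of the sync logic described above, here is a minimal sketch, assuming each workflow is stored as `<workflow id>.json` in the repository. The helper functions are hypothetical stand-ins for the n8n API and GitHub nodes, not the template's actual implementation.

```typescript
type Workflow = { id: string; name: string; json: string };

// Hypothetical helpers standing in for the n8n API and GitHub nodes.
declare function listN8nWorkflows(): Promise<Workflow[]>;
declare function getRepoFile(path: string): Promise<string | null>; // null if missing
declare function createRepoFile(path: string, content: string): Promise<void>;
declare function updateRepoFile(path: string, content: string): Promise<void>;
declare function listRepoFiles(): Promise<string[]>;
declare function deleteRepoFile(path: string): Promise<void>;

async function syncToGitHub(): Promise<void> {
  const workflows = await listN8nWorkflows();
  const keep = new Set<string>();

  for (const wf of workflows) {
    const path = `${wf.id}.json`;
    keep.add(path);
    const existing = await getRepoFile(path);

    if (existing === null) {
      await createRepoFile(path, wf.json);   // new workflow -> create file
    } else if (existing !== wf.json) {
      await updateRepoFile(path, wf.json);   // changed -> update file
    }                                        // identical -> ignore
  }

  // Remove files for workflows that no longer exist in n8n.
  for (const path of await listRepoFiles()) {
    if (!keep.has(path)) await deleteRepoFile(path);
  }
}
```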
by Muhammad Nouman
How it works
This workflow turns a Google Drive folder into a fully automated YouTube publishing pipeline. Whenever a new video file is added to the folder, the workflow generates all YouTube metadata using AI, uploads the video to your YouTube channel, deletes the original file from Drive, sends a Telegram confirmation, and can optionally post to Instagram and Facebook using permanent system tokens.

High-level flow:
- Detects new video uploads in a specific Google Drive folder.
- Downloads the file and uses AI to generate:
  • a polished first-person YouTube description
  • an SEO-optimized YouTube title
  • high-ranking YouTube tags
- Uploads the video to YouTube with the generated metadata.
- Deletes the original Drive file after upload.
- Sends a Telegram notification with video details.
- (Optional) Posts to Instagram & Facebook using permanent system user tokens.

Set up steps
Setup usually takes a few minutes.
- Add Google Drive OAuth2 credentials for the trigger and download/delete nodes.
- Add your OpenAI (or Gemini) API credentials for title/description/tag generation.
- Add YouTube OAuth2 credentials in the YouTube Upload node.
- Add Facebook/Instagram Graph API credentials if enabling cross-posting.
- Replace placeholder IDs (Drive folder ID, Page ID, IG media endpoint).
- Review the sticky notes in the workflow; they contain setup guidance and token info.
- Activate the Google Drive trigger to start automated uploads.
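To illustrate the metadata-generation step, here is a minimal sketch of asking a chat model for the title, description and tags as JSON and parsing the reply. The prompt wording and the `callChatModel` helper are assumptions for illustration, not the template's actual prompt or nodes.

```typescript
// Sketch: request YouTube metadata as structured JSON from an LLM.
// callChatModel is a hypothetical stand-in for the OpenAI/Gemini node.
type VideoMetadata = { title: string; description: string; tags: string[] };

declare function callChatModel(prompt: string): Promise<string>;

async function generateMetadata(filename: string): Promise<VideoMetadata> {
  const prompt = `You are a YouTube publishing assistant.
For the video file "${filename}", return JSON with:
- "title": an SEO-optimized title under 100 characters
- "description": a polished first-person description
- "tags": an array of high-ranking tags
Respond with JSON only.`;

  const reply = await callChatModel(prompt);
  return JSON.parse(reply) as VideoMetadata; // assumes the model returns valid JSON
}
```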
by Jonas
🎧 Daily RSS Digest & Podcast Generation
This workflow automates the creation of a daily sports podcast from your favorite news sources. It fetches articles, uses AI to write a digest and a two-person dialogue, and produces a single merged audio file with Kokoro TTS, ready for listening.

✨ How it works:
📰 Fetch & Filter Daily News: The workflow triggers daily, fetches articles from your chosen RSS feeds, and filters them to keep only the most recent content.
✍️ Generate AI Digest & Script: Using Google Gemini, it first creates a written summary of the day's news. A second AI agent then transforms this news into an engaging, conversational podcast script between two distinct AI speakers.
🗣️ Generate Voices in Chunks: The script is split into individual lines of dialogue. The workflow then loops through each line, calling a Text-to-Speech (TTS) API to generate a separate audio file (an MP3 chunk) for each part of the conversation.
🎛️ Merge Audio with FFmpeg: After all the audio chunks are created and saved locally, a command-line script generates a list of all the files and uses FFmpeg to losslessly merge them into a single, seamless MP3 file (a sketch of this step follows below). All temporary files are then deleted.
📤 Send the Final Podcast: The final, merged MP3 is read from the server and delivered directly to your Telegram chat with a dynamic, dated filename.

You can modify:
📰 The RSS feeds to any news source you want.
🤖 The AI prompts to change the tone, language, or style of the digest and podcast.
🎙️ The TTS voices used for the two speakers.
📫 The final delivery method (e.g., send to Discord, save to Google Drive, etc.).

Perfect for creating a personalized, hands-free news briefing to listen to on your commute.
Inspired by: https://n8n.io/workflows/6523-convert-newsletters-into-ai-podcasts-with-gpt-4o-mini-and-elevenlabs/
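The FFmpeg merge works by writing a concat list file and running FFmpeg in stream-copy mode. Here is a minimal sketch of that idea, assuming Node.js and FFmpeg are available on the host; the template itself performs this with a command-line step rather than this exact code.

```typescript
import { writeFileSync, unlinkSync } from "fs";
import { execFileSync } from "child_process";

// Merge MP3 chunks losslessly with FFmpeg's concat demuxer.
function mergeChunks(chunkPaths: string[], outputPath: string): void {
  // The concat demuxer expects one "file '<path>'" line per input.
  const listFile = "chunks.txt";
  writeFileSync(listFile, chunkPaths.map((p) => `file '${p}'`).join("\n"));

  // -c copy avoids re-encoding, so the merge is lossless and fast.
  execFileSync("ffmpeg", [
    "-f", "concat", "-safe", "0",
    "-i", listFile,
    "-c", "copy",
    outputPath,
  ]);

  // Clean up the temporary list and chunk files.
  unlinkSync(listFile);
  chunkPaths.forEach(unlinkSync);
}

// Usage: mergeChunks(["chunk_001.mp3", "chunk_002.mp3"], "podcast_2024-05-01.mp3");
```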
by Elay Guez
Stock Analysis Agent (Hebrew, RTL, GPT-4o)

Overview
Get comprehensive stock analysis with this AI-powered workflow that provides actionable insights for your investment decisions. On a weekly basis, this workflow:
- Analyzes stock data from multiple sources (Chart-img, Twelve Data API, Alpha Vantage)
- Performs technical analysis using advanced indicators (RSI, MACD, Bollinger Bands, resistance and support levels)
- Scans financial news from Alpha Vantage to capture market sentiment
- Uses OpenAI's GPT-4o to identify patterns, trends, and trading opportunities
- Generates a fully styled, responsive HTML email (with proper RTL layout) in Hebrew
- Sends detailed recommendations directly to your inbox
Perfect for investors, traders, and financial analysts who want data-driven stock insights, combining technical indicators with news sentiment for more informed decisions.

Setup Instructions
Estimated setup time: 15 minutes
Required credentials:
- OpenAI API Key
- Chart-img API Key (free tier)
- Twelve Data API Key (free tier)
- Alpha Vantage API Key (free tier)
- SMTP credentials (for email delivery)
Steps:
- Import this template into your n8n instance.
- Add your API keys under credentials.
- Configure the SMTP Email node with: Host (e.g., smtp.gmail.com), Port (465 or 587), Username (your email), Password (app-specific password or login).
- Activate the workflow.
- Fill in the form.
- Enjoy! (Check your spam mailbox.)

Customization Tips
- Modify the analysis timeframe (daily, weekly, monthly)
- Add integrations with trading platforms or portfolio management tools
- Adjust the recommendation criteria based on your risk tolerance

Why Use This?
This is more than just stock data. It's an intelligent financial assistant that combines technical analysis with market sentiment to provide actionable recommendations, automatically.

Important Note: This report is generated automatically and does not constitute an investment recommendation. Please consult a licensed investment advisor before making any investment decisions.
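As a concrete example of one of the indicators listed above, here is a minimal sketch of an RSI calculation over closing prices. It uses a plain average of gains and losses for clarity (production implementations typically use Wilder's smoothing), and it is not the workflow's own code.

```typescript
// Relative Strength Index over the last `period` price changes.
function rsi(closes: number[], period = 14): number {
  if (closes.length < period + 1) throw new Error("not enough data");

  let gains = 0;
  let losses = 0;
  for (let i = closes.length - period; i < closes.length; i++) {
    const change = closes[i] - closes[i - 1];
    if (change > 0) gains += change;
    else losses -= change; // store losses as a positive magnitude
  }

  if (losses === 0) return 100; // no losing periods -> maximum RSI
  const rs = gains / period / (losses / period);
  return 100 - 100 / (1 + rs);
}

// Example: rsi(dailyCloses) returns a value between 0 and 100, where readings
// above ~70 are often read as overbought and below ~30 as oversold.
```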
by Davide
This workflow is a highly advanced multimodal AI assistant designed to operate through WhatsApp. It can understand and respond to text, images, voice messages, and PDF documents by combining OpenAI models with smart logic that adapts to the content received.

🎯 Core Features
📥 1. Automatic Message Type Detection
Using the Input type node, the bot detects whether the user has sent: text, voice messages, images, files (PDF), or other unsupported content.
💬 2. Smart Text Message Handling
Text messages are processed by an OpenAI GPT-4o-mini agent with a customized system prompt. Replies are concise, accurate, and formatted for mobile readability.
🖼️ 3. Image Analysis & Description
Images are downloaded, converted to base64, and analyzed by an image-aware AI model. The output is a rich, structured description, designed for visually impaired users or visual content interpretation.
🎙️ 4. Voice Message Transcription & Reply
Audio messages are downloaded and transcribed using OpenAI Whisper. The transcribed text is analyzed and answered by the AI. Optionally, the AI reply can be converted back to voice using OpenAI's text-to-speech and sent as an audio message.
📄 5. PDF Document Extraction & Summary
Only PDFs are allowed (filtered via MIME type). The document's content is extracted and combined with the user's message. The AI then provides a relevant summary or answer.
🧠 6. Contextual Memory
Each user has a personalized session ID with a memory window of 10 interactions. This ensures a more natural and contextual conversation flow.

How It Works
This workflow is designed to handle incoming WhatsApp messages and process different types of inputs (text, audio, images, and PDF documents) using AI-powered analysis. Here's how it functions:
- Trigger: The workflow starts with the WhatsApp Trigger node, which listens for incoming messages (text, audio, images, or documents).
- Input Routing: The Input type (Switch) node checks the message type and routes it to the appropriate processing branch (see the routing sketch below):
  - Text: Directly forwards the message to the AI agent for response generation.
  - Audio: Downloads the audio file, transcribes it using OpenAI, and sends the transcription to the AI agent.
  - Image: Downloads the image, analyzes it with OpenAI's GPT-4 model, and generates a detailed description.
  - PDF Document: Downloads the file, extracts text, and processes it with the AI agent.
  - Unsupported Formats: Sends an error message if the input is not supported.
- AI Processing: The AI Agent1 node, powered by OpenAI, processes the input (text, transcribed audio, image description, or PDF content) and generates a response.
- Response Handling: For audio inputs, the AI's response is converted back into speech (using OpenAI's TTS) and sent as a voice message. For other inputs, the response is sent as a text message via WhatsApp.
- Memory: The Simple Memory node maintains conversation context for follow-up interactions.

Setup Steps
To deploy this workflow in n8n, follow these steps:
- Configure WhatsApp API credentials: Set up WhatsApp Business API credentials (Meta Developer Account) and add them in the WhatsApp Trigger, Get Image/Audio/File URL, and Send Message nodes.
- Set up OpenAI integration: Provide an OpenAI API key in the Analyze Image, Transcribe Audio, Generate Audio Response, and AI Agent1 nodes.
- Adjust input handling (optional): Modify the "Input type" Switch node to handle additional message types if needed. Update the "Only PDF File" IF node to support other document formats.
- Test & Deploy: Activate the workflow and test with different message types (text, audio, image, PDF). Ensure responses are correctly generated and sent back via WhatsApp.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
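A minimal sketch of the kind of message-type routing the Switch node performs. The field names are assumptions based on typical WhatsApp webhook payloads, not the workflow's exact expressions.

```typescript
// Route an incoming WhatsApp message to the right processing branch.
type Branch = "text" | "audio" | "image" | "pdf" | "unsupported";

interface IncomingMessage {
  type: string;       // e.g. "text", "audio", "image", "document"
  mimeType?: string;  // e.g. "application/pdf" for documents
}

function routeMessage(msg: IncomingMessage): Branch {
  switch (msg.type) {
    case "text":
      return "text";
    case "audio":
      return "audio";   // downloaded and transcribed with Whisper
    case "image":
      return "image";   // downloaded, base64-encoded, described by a vision model
    case "document":
      // Only PDFs are allowed; other document types fall through to "unsupported".
      return msg.mimeType === "application/pdf" ? "pdf" : "unsupported";
    default:
      return "unsupported"; // triggers the error reply branch
  }
}
```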
by Jez
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Uncover new business leads with this AI-powered Prospect Discovery Agent! This n8n workflow acts as a specialized intelligent assistant that, given a business type and location, uses multiple search strategies to identify a list of potential prospect companies and their websites. Stop manually trawling through search results! This agent automates the initial phase of lead generation by:
- Understanding your target business profile (type, location, keywords).
- Strategically using web search tools (Brave Search, Google Gemini Search) to find relevant businesses.
- Performing quick validations to confirm relevance.
- Returning a clean, structured JSON list of prospect names and their website URLs.

How it Works:
The workflow is built around an AI agent powered by Google Gemini. This agent is equipped with tools like:
- **Brave Web Search:** For broad initial sourcing of potential business candidates.
- **Google Gemini Search:** For advanced, context-aware discovery and finding businesses mentioned in various online sources.
- **Brave Local Search (Selective):** For quick verification of local presence or finding website URLs for identified names.
- **Jina AI Web Page Scraper (Very Selective):** For extremely rapid relevance checks on uncertain websites by scanning page content for keywords.
The agent's system prompt guides it to use these tools efficiently to build a list of prospects without getting bogged down in deep research on any single one at this discovery stage.

Use Cases:
- **Lead Generation:** Automatically generate lists of potential clients based on industry and location.
- **Market Research:** Identify key players or types of businesses in a specific geographical area.
- **Sales Development:** Provide SDRs with initial lists of companies to research further.
- **Called as a Sub-Workflow:** Designed to be easily integrated as a "tool" into more complex orchestrating AI agents (e.g., a BNI Pitch Planner that first needs to identify who to target).

Setup:
- Import the workflow.
- Configure credentials. You'll need n8n credentials for:
  - Google Gemini (for the Chat model and the Gemini Search/Vertex AI Search tool).
  - Brave Search (e.g., via Smithery MCP, or adapt if you have direct API access).
  - Jina AI (for the web scraper).
  Assign these to the respective nodes.
- Review the system prompt: the prospect_discovery_agent node contains a detailed system prompt. You can fine-tune this to adjust its search strategies or the strictness of its matching.

Inputs:
This workflow is triggered by an "Execute Workflow Trigger" node (prospect_discovery_workflow). It expects the following inputs:
- business_type (string): e.g., "artisan bakery"
- location_query (string): e.g., "Portland, Oregon"
- desired_num_prospects (number): e.g., 5
- additional_keywords (string, optional): e.g., "organic, gluten-free"

To Use (as a Sub-Workflow/Tool):
This workflow is typically called by another n8n workflow (e.g., using a "Tool Workflow" node from the Langchain nodes). The calling workflow provides the inputs listed above. The "Prospect Discovery" workflow then executes, and its final node (the prospect_discovery_agent) outputs a JSON array of found prospects, like:

[
  { "business_name": "Rose Petal Bakery", "website_url": "https://rosepetalbakerypdx.com" },
  { "business_name": "The Daily Bread Artisans", "website_url": "https://dailybreadpdx.com" }
]

If no prospects are found, it returns an empty array [].

This template provides a powerful and focused tool for automating the initial stages of prospect identification.
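To illustrate the "quick relevance check" idea the Jina AI scraper tool is used for, here is a minimal sketch, assuming the page has already been fetched as plain text. The threshold and helper are illustrative, not the agent's actual logic.

```typescript
// Quick relevance check: does the scraped page content mention enough
// of the target keywords to keep the prospect in the list?
function isRelevant(pageText: string, keywords: string[], minMatches = 2): boolean {
  const text = pageText.toLowerCase();
  const matches = keywords.filter((kw) => text.includes(kw.toLowerCase()));
  return matches.length >= minMatches;
}

// Example: isRelevant(scrapedHomepage, ["bakery", "artisan", "organic", "gluten-free"])
// returns true if at least two of the keywords appear on the page.
```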
by ömerDrn
Automated Cryptocurrency Analysis & Reporting with Google Gemini and CoinGecko

This powerful template is an n8n workflow that automates cryptocurrency market data analysis and delivers reports directly to your inbox. It fetches real-time data from the CoinGecko API, generates AI-powered analysis, and sends the report via email.

Features
- **Scheduled Execution:** Runs automatically at a set time daily (default: 10:00 AM).
- **Customizable Analysis:** Personalize the analysis content and language via the "AI Prompt" nodes.
- **Easy Scalability:** Duplicate node groups to add more cryptocurrencies.
- **Flexible AI Integration:** Defaults to Google Gemini, but supports ChatGPT/Ollama.

Setup Instructions
- n8n Installation: Install n8n (self-hosted or Cloud version).
- Email Account Setup: Add email service credentials in n8n (e.g., Microsoft Outlook OAuth2).
- AI Model Credentials (Google Gemini): Obtain an API key from Google AI Studio and add it to n8n "Credentials".
- Import Template: Copy the JSON code into n8n as a new workflow.

Customization
- **Change Cryptocurrencies:** Update the ids= parameter in the HTTP Request nodes (e.g., ids=bitcoin), as illustrated below.
- **Edit AI Prompt:** Modify the text in the "AI Prompt" nodes.
- **Use Different AI Model:** Replace Google Gemini with supported alternatives.
- **Update Email Address:** Change the recipient in the "Send Mail" nodes.
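For reference, the ids= parameter mentioned above maps onto CoinGecko's public API roughly as in this sketch. The exact endpoint and fields used by the template's HTTP Request nodes may differ.

```typescript
// Fetch current market data for selected coins from CoinGecko's public API.
async function fetchMarketData(ids: string[]): Promise<unknown[]> {
  const url =
    "https://api.coingecko.com/api/v3/coins/markets" +
    `?vs_currency=usd&ids=${encodeURIComponent(ids.join(","))}`;

  const response = await fetch(url);
  if (!response.ok) throw new Error(`CoinGecko request failed: ${response.status}`);
  return response.json(); // array with price, market cap, 24h change, etc.
}

// Example: fetchMarketData(["bitcoin", "ethereum"]). Duplicating the HTTP Request
// node and changing the ids value is how the template adds more cryptocurrencies.
```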
by Intuz
This n8n template from Intuz provides a complete and automated solution for creating an autonomous social media manager. The workflow uses an AI agent to intelligently generate unique, high-quality content, check for duplicates, and post it on a consistent schedule to automate your entire Twitter presence.

Who's this workflow for?
- Social Media Managers
- Marketing Teams & Agencies
- Startup Founders & Solopreneurs
- Content Creators

How it works
1. Runs on a schedule: The workflow automatically starts at a set interval (e.g., every 6 hours), ensuring a consistent posting schedule.
2. AI generates a new tweet: An advanced AI Agent, powered by OpenAI, uses a detailed prompt to craft a new, engaging tweet. The prompt defines the tone, topics, character limits, and hashtags.
3. Checks for duplicates: Before finalizing the tweet, the AI Agent is equipped with a tool to read a Google Sheet containing a log of all previously published posts. This allows it to ensure the new content is always unique (see the sketch below).
4. Posts to Twitter (X): The final, unique tweet is automatically posted to your connected Twitter account.
5. Logs the new post: After posting, the workflow logs the new tweet back into the Google Sheet, updating the history for the next run. This completes the autonomous loop.

Setup Instructions
- Schedule your posts: In the Start Workflow (Schedule Trigger) node, set the frequency you want the workflow to run (e.g., every 6 hours).
- Connect OpenAI: Add your OpenAI API key in the OpenAI Chat Model node. Customize the prompt in the AI Agent node to match your brand's voice, target keywords, and specific URLs.
- Configure Google Sheets: Connect your Google Sheets account. Create a sheet with two columns: Tweet Content and Status. In both the Get Data from Google Sheet and Add new Tweet to Google sheet nodes, select your credentials and specify the Document ID and Sheet Name.
- Connect Twitter (X): In the Create Tweet node, connect the Twitter account where you want to post.
- Activate Workflow: Save the workflow and toggle the "Active" switch to ON. Your AI social media manager is now live!

Key Requirements to Use This Template
Before you start, please ensure you have the following accounts and assets ready:
- An n8n instance: An active n8n account (Cloud or self-hosted) where you can import and run this workflow.
- OpenAI account: An active OpenAI account with an API key. You will need billing enabled to use the language models for tweet generation.
- Google account & Sheet: A Google account and a pre-made Google Sheet. The sheet must have two specific columns: Tweet Content and Status.
- Twitter (X) Developer account: A Twitter (X) account with an approved Developer profile. You need an App created within the Developer Portal with the necessary permissions (v2 API access with Write scopes) to post tweets automatically.

Connect with us
Website: https://www.intuz.com/services
Email: getstarted@intuz.com
LinkedIn: https://www.linkedin.com/company/intuz
Get Started: https://n8n.partnerlinks.io/intuz
For Custom Workflow Automation, click here: Get Started
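A minimal sketch of the duplicate check described in step 3, assuming the history rows have already been read from the Tweet Content column. The normalization approach is an assumption for illustration, not the template's exact prompt or tool logic.

```typescript
// Check a candidate tweet against the history of previously published posts.
function isDuplicate(candidate: string, history: string[]): boolean {
  // Normalize whitespace and case so superficial differences don't hide repeats.
  const normalize = (s: string) => s.toLowerCase().replace(/\s+/g, " ").trim();
  const seen = new Set(history.map(normalize));
  return seen.has(normalize(candidate));
}

// Example: if isDuplicate(newTweet, sheetRows) is true, the agent is asked to
// generate a different tweet before anything is posted or logged.
```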
by Jez
Summary
This n8n workflow implements an AI-powered agent that intelligently uses the Brave Search API (via an external MCP service like Smithery) to perform both web and local searches. It understands natural language queries, selects the appropriate search tool, and exposes this enhanced capability as a single, callable MCP tool.

Key Features
- 🤖 Intelligent Tool Selection: The AI agent decides between Brave's web search and local search tools based on user query context.
- 🌐 MCP Microservice: Exposes complex search logic as a single, easy-to-integrate MCP tool (call_brave_search_agent).
- 🧠 Powered by Google Gemini: Utilizes the gemini-2.5-flash-preview-05-20 LLM for advanced reasoning.
- 🗣️ Conversational Memory: Remembers context within a single execution flow.
- 📝 Customizable System Prompt: Tailor the AI's behavior and responses.
- 🧩 Modular Design: Connects to external Brave Search MCP tools (e.g., from Smithery).

Benefits
- 🔌 Simplified Integration: Easily add advanced, AI-driven search capabilities to other applications or agent systems.
- 💸 Reduced Client-Side LLM Costs: Offloads complex prompting and tool orchestration to n8n, minimizing token usage for client-side LLMs.
- 🔧 Centralized Logic: Manage and update search strategies and AI behavior in one place.
- 🚀 Extensible: Can be adapted to use other search tools or incorporate more complex decision-making.

Nodes Used
- @n8n/n8n-nodes-langchain.mcpTrigger (MCP Server Trigger)
- @n8n/n8n-nodes-langchain.toolWorkflow
- @n8n/n8n-nodes-langchain.agent (AI Agent)
- @n8n/n8n-nodes-langchain.lmChatGoogleGemini (Google Gemini Chat Model)
- n8n-nodes-mcp.mcpClientTool (MCP Client Tool - for Brave Search)
- @n8n/n8n-nodes-langchain.memoryBufferWindow (Simple Memory)
- n8n-nodes-base.executeWorkflowTrigger (Workflow Start - for direct execution/testing)

Prerequisites
- An active n8n instance (v1.22.5+ recommended).
- A Google AI API key for using the Gemini LLM.
- Access to an external MCP service that provides Brave Search tools (e.g., a Smithery account configured with their Brave Search MCP). This includes the MCP endpoint URL and any necessary authentication (like an API key for Smithery).

Setup Instructions
- Import Workflow: Download the Brave_Search_Smithery_AI_Agent_MCP_Server.json file and import it into your n8n instance.
- Configure LLM Credential: Locate the 'Google Gemini Chat Model' node. Select or create an n8n credential for "Google Palm API" (used for Gemini), providing your Google AI API key.
- Configure Brave Search MCP Credential: Locate the 'brave_web_search' and 'brave_local_search' (MCP Client) nodes. Create a new n8n credential of type "MCP Client HTTP API". Name: e.g., Smithery Brave Search Access. Base URL: the URL of your Brave Search MCP endpoint from your provider (e.g., https://server.smithery.ai/@YOUR_PROFILE/brave-search/mcp). Authentication: if your MCP provider requires an API key, select "Header Auth" and add a header with the name (e.g., X-API-Key) and value provided by your MCP service. Assign this newly created credential to both the 'brave_web_search' and 'brave_local_search' nodes.
- Note MCP Trigger Path: Open the 'Brave Search MCP Server Trigger' node and copy its unique 'Path' (e.g., /cc8cc827-3e72-4029-8a9d-76519d1c136d). You will combine this with your n8n instance's base URL to get the full endpoint URL for clients.

How to Use
This workflow exposes an MCP tool named call_brave_search_agent. External clients can call this tool via the URL derived from the 'Brave Search MCP Server Trigger'.

Example Client MCP Configuration (e.g., for Roo Code):

"n8n-brave-search-agent": {
  "url": "https://YOUR_N8N_INSTANCE/mcp/cc8cc827-3e72-4029-8a9d-76519d1c136d/sse",
  "alwaysAllow": ["call_brave_search_agent"]
}

Replace YOUR_N8N_INSTANCE with your n8n's public URL and ensure the path matches your trigger node.

Example Request: Send a POST request to the trigger URL with a JSON body:

{ "input": { "query": "best coffee shops in London" } }

The agent will stream its response, including the summarized search results.

Customization
- **AI Behavior:** Modify the system prompt within the 'Brave Search AI Agent' node to fine-tune its decision-making, response style, or how it uses the search tools.
- **LLM Choice:** Replace the 'Google Gemini Chat Model' node with any other compatible LLM node supported by n8n.
- **Search Tools:** Adapt the workflow to use different or additional search tools by modifying the MCP Client nodes and updating the AI Agent's system prompt and tool definitions.

Further Information
GitHub Repository: https://github.com/jezweb/n8n
The workflow includes extensive sticky notes for in-canvas documentation.

Author
Jeremy Dawes (Jezweb)
by Luciano Gutierrez
Healthcare Clinic Assistant with WhatsApp and Telegram Integration
Version: 1.1.0
n8n Version: 1.88.0+
License: MIT

📋 Description
A comprehensive and modular automation workflow designed for healthcare clinics. It manages patient communication, appointment scheduling, confirmations, rescheduling, internal tasks, and media processing by integrating WhatsApp, Telegram, Google Calendar, and Google Tasks, combined with AI-powered agents for maximum efficiency. This system guarantees proactive communication with patients, streamlined internal clinic management, and consistent data synchronization across platforms.

🌟 Key Features
- 🤖 AI-Powered Specialized Agents: Distinct agents handle WhatsApp patient support, appointment confirmations, and internal rescheduling tasks.
- 📱 Omnichannel Communication: Handles patient interactions via WhatsApp and staff commands via Telegram.
- 📅 Google Calendar Appointment Management: Full synchronization for creating, updating, canceling, and confirming appointments.
- 📋 Task Management with Google Tasks: Manages shopping lists and administrative tasks efficiently through staff Telegram requests.
- 🔔 Automated Appointment Reminders: A daily-triggered system proactively sends WhatsApp confirmations to patients for next-day appointments.
- 🖼️ Intelligent Media Processing: Transcribes audio, extracts text from images, and processes documents using OpenAI and OpenRouter AI models.
- 🛡️ Escalation to Human Support: Automatically detects sensitive or urgent cases and escalates them to a human agent when needed.

🏥 Use Cases
- **Patient Communication:** Respond to inquiries, schedule, reschedule, and confirm appointments seamlessly via WhatsApp.
- **Internal Clinic Operations:** Allow staff to modify appointments or add shopping list reminders directly from Telegram.
- **Appointment Confirmation System:** Automatically contacts patients one day prior to appointments for confirmation or rescheduling.
- **Task and Reminder Management:** Keeps clinic operations organized through automatic task management with Google Tasks.

🛠️ Technical Implementation

WhatsApp Patient Interaction Flow
- **Webhook Reception:** Incoming WhatsApp messages are captured via the Evolution API webhook.
- **Message Classification:** Intelligent routing of messages based on content type (text, image, audio, document).
- **Media Content Processing:** Audio is downloaded, converted, and transcribed via OpenAI Whisper; images are analyzed and text/descriptions are extracted with the OpenAI Vision model.
- **Patient Request Handling:** A specialized WhatsApp assistant responds appropriately using AI prompts.
- **Outbound Message Formatting:** Ensures messages comply with WhatsApp format standards.
- **Message Delivery:** Sends responses back via the Evolution API.

Telegram Staff Management Flow
- **Telegram Webhook Reception:** Captures messages from authorized staff accounts.
- **Internal Assistant Processing:** Appointment rescheduling identifies and updates appointments through MCP Google Calendar; task creation adds new entries to the clinic's shopping list using Google Tasks.
- **Notifications and Confirmations:** Sends confirmations back to staff through Telegram.

Appointment Reminder System
- **Daily Trigger Activation:** Fires every weekday at 08:00 AM.
- **Calendar Scraping:** Lists the next day's appointments from Google Calendar (see the sketch below).
- **Patient Contact:** Sends WhatsApp confirmation messages for each appointment.
- **Response Management:** Redirects confirmation or rescheduling replies to the appropriate agents.

⚙️ Setup Instructions
Import the Workflow
n8n → Workflows → Import from File → Upload this JSON file.
Configure Credentials
- Evolution API (WhatsApp communication)
- Telegram Bot API (staff communication)
- Google Calendar OAuth2 (appointment management)
- Google Tasks OAuth2 (task management)
- OpenAI and OpenRouter APIs (AI agents)
- PostgreSQL database (chat memory)

Set Sensitive Variables
Replace the placeholder values:
- {sua instância aqui} → Evolution API instance name
- {número_whatsapp} → WhatsApp numbers
- {url_do_servidor} → server URLs
- {a sua apikey aqui} → API keys
- {seu_calendario} → Google Calendar ID

Customize AI Prompts
Adjust the system prompts to fit your clinic's tone, service style, and patient communication guidelines. Set clinic operating hours, escalation rules, and cancellation procedures in the AI prompts.

Activate and Test
Simulate patient messages via WhatsApp, test Telegram commands from staff members, and validate daily appointment reminders using the scheduled trigger.

🏷️ Tags
Healthcare, Clinic Management, WhatsApp Integration, Telegram Bot, Appointment Scheduling, Google Calendar, Google Tasks, AI Agents, n8n Automation

📚 Technical Notes
- PostgreSQL is used for persistent chat memory across sessions.
- Multiple AI models are used: OpenAI GPT-4.1-nano, OpenAI GPT-4.1-mini, and Google Gemini 2.0 and 2.5.
- Full media content processing is supported (audio, image, text).
- Compliant escalation workflows ensure patient safety and proper handoff to human staff when necessary.
- All sensitive patient data are securely stored inside calendar event descriptions for easy retrieval by agents.

📜 License
This workflow is provided under the MIT License. Feel free to adapt and customize it for your clinic's specific needs.
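A minimal sketch of the reminder system's "next-day appointments" step referenced above, assuming calendar events have already been fetched. The window calculation and fields are illustrative; the workflow itself relies on the Google Calendar node's own date filters and event data.

```typescript
// Select calendar events that start tomorrow, for the daily reminder run.
interface CalendarEvent {
  summary: string;
  start: string;          // ISO date-time, e.g. "2024-06-12T09:30:00-03:00"
  patientPhone?: string;  // assumed to be stored with the event description
}

function appointmentsForTomorrow(events: CalendarEvent[], now = new Date()): CalendarEvent[] {
  const start = new Date(now);
  start.setDate(start.getDate() + 1);
  start.setHours(0, 0, 0, 0);       // tomorrow, 00:00 local time

  const end = new Date(start);
  end.setDate(end.getDate() + 1);   // day after tomorrow, 00:00

  return events.filter((e) => {
    const t = new Date(e.start);
    return t >= start && t < end;
  });
}

// Each returned appointment would then get a WhatsApp confirmation message
// asking the patient to confirm or request rescheduling.
```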
by Oneclick AI Squad
In this guide, we'll walk you through setting up an AI-driven workflow that automatically fetches daily sales, food waste, and customer feedback data from Google Sheets, generates actionable insights using AI, merges them into a comprehensive report, and sends it as an email draft. Ready to automate your restaurant's daily insights? Let's dive in!

What's the Goal?
- Automatically retrieve daily sales data, food waste records, and customer feedback from Google Sheets.
- Use AI to analyze data and generate insights, including top performers, waste reduction recommendations, and feedback summaries.
- Merge the insights into a structured daily report.
- Send the report as an AI-generated email draft for review or sending.
- Enable scheduled automation for daily insights delivery.
By the end, you'll have a self-running system that delivers daily restaurant insights effortlessly.

Why Does It Matter?
Manual data analysis and reporting are time-consuming and error-prone. Here's why this workflow is a game-changer:
- **Zero Human Error:** AI ensures accurate and consistent insights.
- **Time-Saving Automation:** Instantly process data and draft reports, boosting efficiency.
- **Scheduled Delivery:** Receive insights daily without manual effort.
- **Actionable Insights:** Empower your team with data-driven decisions.
Think of it as your tireless data analyst that keeps your restaurant informed.

How It Works
Here's the step-by-step magic behind the automation:
- Step 1: Trigger the Workflow. Initiate the workflow daily using the Daily Report Scheduler node (e.g., every day at a set time).
- Step 2: Fetch Daily Sales Data. Retrieve sales data from the Google Sheet using the Fetch Daily Sales Data node.
- Step 3: Fetch Daily Food Waste Records. Retrieve food waste data from the Google Sheet using the Fetch Daily Food Waste Records node.
- Step 4: Fetch Customer Feedback. Retrieve customer feedback from the Google Sheet using the Fetch Customer Feedback node.
- Step 5: Normalize Sales Records. Process and standardize sales data for AI analysis (see the normalization sketch below).
- Step 6: Normalize Waste Data. Process and standardize food waste data for AI analysis.
- Step 7: Normalize Feedback Data. Process and standardize customer feedback data for AI analysis.
- Step 8: AI Sales Insights Generator. Use AI (e.g., the Google Chat Model) to analyze sales data, identify top performers, and provide recommendations.
- Step 9: AI Waste Reduction Insights Generator. Use AI to analyze waste data and suggest reduction strategies.
- Step 10: AI Feedback Summary. Use AI to summarize customer feedback and identify common themes.
- Step 11: Format Sales Output. Structure the sales insights into a readable format.
- Step 12: Format Waste Output. Structure the waste reduction insights into a readable format.
- Step 13: Format Feedback AI Output. Structure the feedback summary into a readable format.
- Step 14: Merge & Create Email. Combine all formatted insights into a single daily report email draft.
- Step 15: Prepare Email Content. Finalize the email content for sending.
- Step 16: Send Daily Report. Send the AI-generated daily summary email via Gmail.

How to Use the Workflow?
Importing a workflow in n8n is a straightforward process that lets you use pre-built workflows to save time. Below is a step-by-step guide to importing the Restaurant Daily Insights Automation workflow in n8n.

Steps to Import a Workflow in n8n
Obtain the workflow JSON:
- Source the workflow: workflows are shared as JSON files or code snippets, e.g., from the n8n community, a colleague, or exported from another n8n instance.
- Format: Ensure you have the workflow in JSON format, either as a file (e.g., workflow.json) or copied text.
Access the n8n Workflow Editor:
- Log in to n8n (via n8n Cloud or a self-hosted instance).
- Navigate to the Workflows tab in the n8n dashboard.
- Click Add Workflow to create a blank workflow.
Import the workflow:
- Option 1: Import via JSON code (clipboard): click the three dots (⋯) in the top-right corner to open the menu, select Import from Clipboard, paste the JSON code into the text box, and click Import to load the workflow.
- Option 2: Import via JSON file: click the three dots (⋯) in the top-right corner, select Import from File, choose the .json file from your computer, and click Open to import.

Setup Notes
- **Google Sheet Columns:**
  - Sales Data sheet: Date, Item Name, Quantity Sold, Revenue, Cost, Profit.
  - Food Waste Records sheet: Date, Item Name, Waste Quantity, Reason, Timestamp.
  - Customer Feedback sheet: Date, Customer Name, Feedback Text, Rating, Timestamp.
- **Google Sheets Credentials:** Configure OAuth2 settings in the fetch nodes with your Google Sheet ID and credentials.
- **AI Models:** Set up the AI nodes (e.g., the Google Chat Model) with appropriate API credentials.
- **Gmail Integration:** Authorize the Send Daily Report node with Gmail API credentials to send emails.
- **Scheduling:** Adjust the Daily Report Scheduler node to your preferred time (e.g., daily at 9 AM).
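To make the "normalize" steps more concrete, here is a minimal sketch of turning raw Google Sheets rows (with the Sales Data columns listed above) into clean records for the AI prompt. The field handling is an assumption for illustration, not the workflow's exact Code node.

```typescript
// Normalize raw sheet rows into typed sales records for AI analysis.
interface SalesRecord {
  date: string;
  item: string;
  quantitySold: number;
  revenue: number;
  cost: number;
  profit: number;
}

function normalizeSalesRows(rows: Record<string, string>[]): SalesRecord[] {
  const toNumber = (v: string) => Number(String(v).replace(/[^0-9.-]/g, "")) || 0;

  return rows
    .filter((r) => r["Item Name"] && r["Date"]) // drop empty or partial rows
    .map((r) => ({
      date: r["Date"],
      item: r["Item Name"].trim(),
      quantitySold: toNumber(r["Quantity Sold"]),
      revenue: toNumber(r["Revenue"]),
      cost: toNumber(r["Cost"]),
      profit: toNumber(r["Profit"]),
    }));
}

// The normalized array can then be summarized (e.g., top sellers by profit)
// and embedded into the AI Sales Insights Generator prompt.
```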