by Samir Saci
Tags: Supply Chain Management, Logistics, Transportation, Data Transmission

Context
Hey! I'm Samir, a Supply Chain Engineer and Data Scientist from Paris and the founder of LogiGreen Consulting. We help small and medium businesses improve their logistics processes using AI, data analytics, and automation.

> Sustainable and efficient supply chains with N8N!

📬 For business inquiries, you can add me on LinkedIn.

What is an EDI Message?
Electronic Data Interchange (EDI) is a standardized method for automatically transferring data between computer systems. EDI messages ensure the smooth flow of essential transactional data, such as purchase orders, invoices, and shipping notices. For instance, a manufacturing company can receive purchase orders from a retailer via EDI. However, transmitting and processing these messages usually requires a complex integration.

Who is this template for?
This workflow template is designed for small companies that do not have a direct EDI connection with their customers and need to process the EDI messages they receive manually.

How does it work?
This workflow uses a Gmail Trigger that analyzes all incoming emails.
📧 Gmail Trigger → Detects emails with "EDI" in the subject.
📜 Parses EDI Message → Uses a JavaScript Code node to extract structured data.
📊 Formats the Data → Converts it into a table-friendly format.
📑 Updates Google Sheets → Automatically logs the processed orders.

Prerequisites
This workflow does not require any paid subscription.
A Google Drive account with a folder including a Google Sheet
API credentials: Google Drive API, Google Sheets API, and Gmail API
A Google Sheet to store the shipment records. You do not need to prepare the columns.

Next Steps
Follow the sticky notes to set up the parameters inside each node and get ready to improve your logistics operations!
📺 Watch the Step-by-Step Guide
🎥 Check My Tutorial
🚀 Interested in applications of N8N for Logistics & Supply Chain Management? Let's connect on LinkedIn.

Notes
This template includes an example EDI message to test the workflow.
If you want to learn more about Electronic Data Interchange:
🚚 Blog Article about Electronic Data Interchange (EDI)
This workflow was created with N8N 1.82.1. Submitted: March 19th, 2025
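For reference, here is a minimal sketch of what the EDI-parsing Code node could look like. It assumes a simple X12-style message where segments end with `~` and elements are separated by `*`; the segment names (BEG, PO1) and element positions are illustrative only, not the template's actual mapping.

```javascript
// n8n Code node — minimal EDI parsing sketch (assumes X12-style delimiters).
// Segment names and field positions below are illustrative; adapt them to
// the EDI standard and mapping your customer actually uses.
const ediText = $input.first().json.text; // raw EDI payload from the email

const segments = ediText
  .split('~')                 // '~' assumed as segment terminator
  .map(s => s.trim())
  .filter(Boolean);

const order = { poNumber: null, lines: [] };

for (const segment of segments) {
  const elements = segment.split('*'); // '*' assumed as element separator
  if (elements[0] === 'BEG') {
    order.poNumber = elements[3];      // purchase order number (illustrative position)
  } else if (elements[0] === 'PO1') {
    order.lines.push({
      quantity: Number(elements[2]),
      unit: elements[3],
      unitPrice: Number(elements[4]),
      itemId: elements[7],
    });
  }
}

// One n8n item per order line, ready for the table-formatting step
return order.lines.map(line => ({ json: { poNumber: order.poNumber, ...line } }));
```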
by Davide
This workflow enables users to perform web searches directly from Telegram using the Brave search engine. By simply sending the command /brave followed by a query, the workflow retrieves search results from Brave and returns them as a Telegram message. It is ideal for users who want a quick and private way to search the web without switching between apps. 🚀

This workflow is a powerful tool for automating interactions with Brave tools through Telegram, providing users with quick and easy access to information directly in their chat. Below is a breakdown of the workflow:

1. How It Works
The workflow processes user queries from Telegram, executes a Brave tool via the MCP Client, and sends the results back to the user:
Telegram Trigger: The workflow starts with the Telegram Trigger node, which listens for new messages in a Telegram chat. When a message is received, the workflow checks if it starts with the command /brave.
Filter Messages: The If node filters messages that start with /brave. If the message doesn't start with /brave, the workflow stops.
Edit Fields: The Edit Fields node extracts the text of the message for further processing.
Clean Query: The Clean Query node removes the /brave command from the message, leaving only the user's query.
List Brave Tools: The List Brave Tools node retrieves the list of available tools from the MCP Client.
Execute Brave Tool: The Exec Brave Tool node executes the first tool in the list using the cleaned query as input.
Send Message: The Send Message node sends the result of the Brave tool execution back to the user in the Telegram chat.

2. Preliminary Steps
Get access to a self-hosted n8n instance and install the community node "n8n-nodes-mcp" (please see this easy guide).
Get your Brave Search API key: https://brave.com/search/api/
Get a Telegram Bot access token.
In "List Brave Tools", create a new credential as shown in this image. In the Environment field, set this value: BRAVE_API_KEY=your-api-key

3. Set Up Steps
To set up and use this workflow in n8n, follow these steps:
Telegram Configuration: Set up Telegram credentials in n8n for the Telegram Trigger and Send Message nodes. Ensure the Telegram bot is authorized to read messages and send responses in the chat.
MCP Client Configuration: Set up MCP Client credentials in n8n for the List Brave Tools and Exec Brave Tool nodes. Ensure the MCP Client is configured to interact with Brave tools.
Test the Workflow: Send a message starting with /brave followed by a query (e.g., /brave search for AI tools) to the Telegram chat. The workflow will:
Process the query.
Execute the Brave tool via the MCP Client.
Send the result back to the Telegram chat.
Optional Customization: Modify the workflow to include additional features, such as:
Adding more commands or tools.
Integrating with other APIs or services for advanced use cases.
Sending notifications via other channels (e.g., email, Slack).

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
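As an illustration, the "Clean Query" step could be implemented in a Code node along these lines (a minimal sketch; the actual template may use a Set node expression instead):

```javascript
// n8n Code node — strip the /brave command and keep only the user's query.
const text = $input.first().json.message?.text ?? '';

// Remove the leading "/brave" and any whitespace that follows it
const query = text.replace(/^\/brave\s*/i, '').trim();

return [{ json: { query } }];
```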
by Oneclick AI Squad
Overview
This solution ensures the secure backup and version control of your self-hosted n8n workflows by storing them in a GitLab repository. It compares current workflows with their GitLab counterparts, updates files when differences are detected, and organizes them in user-specific folders (e.g., repo -> username -> workflow.json). Backups are triggered manually or weekly, with a success notification sent via email.

Operational Process
**Manual Backup Trigger**: Initiates the backup process on demand.
**Scheduled Weekly Backup**: Automatically triggers the backup every week.
**Fetch N8N Workflows**: Retrieves all workflows from n8n using the API (getAll:workflow).
**Prepare Backup Metadata**: Generates metadata, including user details for folder organization.
**Process Each Workflow**: Handles each workflow individually for processing.
**Format Workflow for GitLab**: Structures workflows with proper versioning for GitLab compatibility.
**Rate Limit Control**: Manages API rate limits to ensure smooth operation.
**Create to GitLab Repository**: Saves workflows to GitLab; creates a new file if it doesn't exist.
**Check Backup Status**: Verifies whether the file exists; if it does, proceeds to update; if not, loops back.
**Update Backup Summary**: Updates the existing file in GitLab with the latest version.
**Log Backup Results**: Records the outcome of the backup process.
**Send Email**: Sends a confirmation email: "Hello, The scheduled backup of all n8n workflows has been completed successfully. All workflows have been committed to the GitLab repository without any errors. Regards, n8n Automation Bot"

Implementation Guide
Import this solution into your n8n instance.
Configure GitLab API credentials and specify the target repository.
Set up n8n API access to enable workflow retrieval.
Customize the Prepare Backup Metadata node to map users to folders as needed.
Test the process using the Manual Backup Trigger to confirm GitLab integration.
Schedule weekly backups via the Scheduled Weekly Backup node (recommended for Fridays).

Requirements
GitLab API credentials with write access
n8n API access for workflow retrieval
A configured GitLab repository

Customization Options
Adjust the Prepare Backup Metadata node to include additional user fields.
Modify the Rate Limit Control node to accommodate varying API limits.
Tailor the Send Email node to include custom notification details.
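To make the folder layout concrete, the per-workflow formatting step could look roughly like the sketch below. The `owner.username` metadata field is an assumption; the template's Prepare Backup Metadata node defines the actual fields you would map from.

```javascript
// n8n Code node — build the GitLab file path and commit payload for one workflow.
// `owner.username` is an assumed metadata field; map it from your own user data.
const workflow = $input.first().json;
const username = workflow.owner?.username ?? 'shared';

// repo -> username -> workflow.json layout described above
const filePath = `${username}/${workflow.name.replace(/[^\w-]+/g, '_')}.json`;

return [{
  json: {
    filePath,
    branch: 'main',
    commitMessage: `Backup: ${workflow.name} (${new Date().toISOString()})`,
    // Pretty-print so GitLab diffs stay readable between backup runs
    content: JSON.stringify(workflow, null, 2),
  },
}];
```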
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically monitors customer support forums and Q&A platforms to extract valuable customer insights and pain points. It saves you time by eliminating the need to manually browse through forum discussions and provides structured analysis of customer questions, answers, and recurring issues.

Overview
This workflow automatically scrapes customer support forums like Stack Exchange and SuperUser to find questions and discussions related to specific topics or brands. It uses AI to analyze forum content, extract customer pain points, and identify recurring issues, then sends structured insights directly to your product team via email.

Tools Used
**n8n**: The automation platform that orchestrates the workflow
**Bright Data**: For scraping forum pages and Q&A platforms without being blocked
**OpenAI**: AI agent for intelligent forum content analysis and insight extraction
**Gmail**: For sending automated insight reports to your team

How to Install
Import the Workflow: Download the .json file and import it into your n8n instance
Configure Bright Data: Add your Bright Data credentials to the MCP Client node
Set Up OpenAI: Configure your OpenAI API credentials
Configure Gmail: Connect your Gmail account for sending team notifications
Customize: Set target forum URLs and define the topics or brands to monitor

Use Cases
**Product Teams**: Identify customer pain points and feature requests from forum discussions
**Customer Support**: Monitor common issues and questions customers are asking
**Market Research**: Understand customer needs and challenges in your industry
**Competitive Analysis**: Track how customers discuss competitor products and services

Connect with Me
**Website**: https://www.nofluff.online
**YouTube**: https://www.youtube.com/@YaronBeen/videos
**LinkedIn**: https://www.linkedin.com/in/yaronbeen/
**Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)
by Julian Reich
This n8n workflow automates the transformation of press releases into polished articles. It converts the content of an email and its attachments (PDF or Word documents) into an AI-written article/blog post.

What does it do?
This workflow assists editors and journalists in managing incoming press releases from governments, companies, NGOs, or individuals. The result is a draft article that can easily be reviewed by the editor, who receives it in a reply email containing both the original input and the output, plus an AI-generated self-assessment. This self-assessment represents an additional feedback loop where the AI compares the input with the output to evaluate the quality and accuracy of its transformation.

How does it work?
Triggered by incoming emails in Gmail, it first filters attachments, retaining only Word and PDF files while removing other formats like JPGs. The workflow then follows one of three paths:
If no attachments remain, it processes the inline email message directly.
For PDF attachments, it uses an extractor to obtain the document content.
For Word attachments, it extracts the text content via an HTTP request.
In each case, the extracted content is then passed to an AI agent that converts the press release into a well-structured article according to predefined prompts. A separate AI evaluation step provides a self-assessment by comparing the output with the original input to ensure quality and accuracy. Finally, the workflow generates a reply email to the sender containing three components: the original input, the AI-generated article, and the self-assessment. This streamlined process helps editors and journalists efficiently manage incoming press releases, delivering draft articles that require minimal additional editing.

How to set it up
1. Configure Gmail Connection:
Create or use an existing Gmail address
Connect it through the n8n credentials manager
Configure polling frequency according to your needs
Set the trigger event to "Message Received"
Optional: Filter incoming emails by specifying authorized senders
Enable the "Download Attachments" option

2. Set Up AI Integration:
Create an OpenAI account if you don't have one
Create a new AI assistant or use an existing one
Customize the assistant with specific instructions, style guidelines, or response templates
Configure your API credentials in n8n to enable the connection

3. Configure Google Drive Integration:
Connect your Google Drive credentials in n8n
Set the operation mode to "Upload"
Configure the input data field name as "data"
Set the file naming format to dynamic: {{ $json.fileName }}

4. Configure HTTP Request Node:
Set the request method to "POST"
Enter the appropriate Google API endpoint URL
Include all required authorization headers
Structure the request body according to API specifications
Ensure proper error handling for API responses

5. Configure HTTP Request Node 2:
Set the request method to "GET"
Enter the appropriate Google API endpoint URL
Include all required authorization headers
Configure query parameters as needed
Implement response validation and error handling
6. Configure Self-Assessment Node:
Set the operation to "Message a Model"
Select an appropriate AI model (e.g., GPT-4, Claude)
Configure the following prompt in the Message field:

Please analyze and compare the following input and output content:
Original Input (for example):
{{ $('HTTP Request3').item.json.data }}
{{ $('Gmail Trigger').item.json.text }}
Generated Output:
{{ $json.output }}
Provide a detailed self-assessment that evaluates:
Content accuracy and completeness
Structure and readability improvements
Tone and style appropriateness
Any information that may have been omitted or misrepresented
Overall quality of the transformation

7. Configure Reply Email Node:
Set the operation to "Send" and select your Gmail account
Configure the "To" field to respond to the original sender: {{ $('Gmail Trigger').item.json.from }}
Set an appropriate subject line: RE: {{ $('Gmail Trigger').item.json.subject }}
Structure the email body with clear sections using the following template:

*EDITED ARTICLE*
{{ $('AI Article Writer 2').item.json.output }}

*SELF-ASSESSMENT*
Rating: 1 (poor) to 5 (excellent)
{{ $json.message.content }}

*ORIGINAL MESSAGE*
{{ $('Gmail Trigger').item.json.text }}

*ATTACHMENT CONTENT*
{{ $('HTTP Request3').item.json.data }}

Note: Adjust the template fields according to the input source (PDF, Word document, or inline message). For inline messages, you may not need the "ATTACHMENT CONTENT" section.
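For the attachment-filtering step described above (keeping only PDF and Word files), a Code node could look something like this minimal sketch; the binary handling follows n8n's defaults, and the MIME-type list is an assumption you may need to extend:

```javascript
// n8n Code node — keep only PDF and Word attachments, drop everything else.
const KEEP = new Set([
  'application/pdf',
  'application/msword',
  'application/vnd.openxmlformats-officedocument.wordprocessingml.document',
]);

return $input.all().map(item => {
  const binary = {};
  for (const [key, file] of Object.entries(item.binary ?? {})) {
    if (KEEP.has(file.mimeType)) binary[key] = file; // retain Word/PDF only
  }
  return { json: item.json, binary };
});
```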
by Eric
This is a specific use case. The ElevenLabs guide for Cal.com bookings is comprehensive, but I was having trouble with the booking API request. So I built a simple workflow to validate the request and handle the booking creation.

Who's this for?
You have an ElevenLabs voice agent (or other external service) booking meetings in your Cal.com account, and you want more control over the book_meeting tool called by the voice agent.

How's it work?
The request is received by the webhook trigger node
The request is sent from an ElevenLabs voice agent, or another source
The request body contains contact info for the user with whom a meeting will be booked in Cal.com
The workflow validates the input data for the fields required by Cal.com
If validation fails, a 400 Bad Request response is returned
If valid, the meeting is booked via the Cal.com API

How do I use this?
Create a custom tool in the ElevenLabs agent setup, and connect it to the webhook trigger in this workflow. Add authorization for security. Instruct your voice agent to call this tool after it has collected the required information from the user.

Expected input structure
Note: Modify this according to your needs, but be sure to reflect your changes in all following nodes. The required fields here depend on the required fields of your Cal.com event type. If you have multiple event types in Cal.com with varying required fields, you'll need to handle this in this workflow and provide appropriate instructions in your *voice agent prompt*.

"body": {
  "attendee_name": "Some Guy",
  "start": "2025-07-07T13:30:00Z",
  "attendee_phone": "+12125551234",
  "attendee_timezone": "America/New_York",
  "eventTypeId": 123456,
  "attendee_email": "someguy@example.com",
  "attendee_company": "Example Inc",
  "notes": "Discovery call to find synergies."
}

Modifications
Note: ElevenLabs doesn't handle webhook response headers or bodies, and only recognizes the response code. In other words, if the workflow responds with 400 Bad Request, that's the only info the voice agent gets back; it doesn't receive any details, e.g., "User email still needed".
You can modify the structure of the expected webhook request body, and then you should reflect that structure change in all following nodes in the workflow. I.e., if you change attendee_name to attendeeFirstName and attendeeLastName, then you need to make this change in the following nodes that use these properties.
You can also require, or make optional, other user data for the Cal.com event type, which would reduce or increase the data the voice agent must collect from the user.
You can modify the authorization of this webhook to meet your security needs. ElevenLabs has some limitations you should be mindful of, but it also offers a secrets feature, which proves useful.
An improvement to this workflow could include a GET request to a CRM or other database to get info on the user interacting with the voice agent. This could reduce some of the data collection needed from the voice agent, e.g., if you already have the user's email address. I believe you can also get the user's phone number if the voice agent is set up on a dial-in interface, so the agent wouldn't need to ask for it. This all depends on your use case. A savvy step might be prompting the voice agent to get an email, and using the email in this workflow to pull enrichment data from Apollo.io or similar ;-)
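As a rough illustration, the validation step could be implemented in a Code node like the sketch below, using the example body above. The exact required-field list, and how you return the 400 response (e.g., an If node routing to a Respond to Webhook node), depend on your Cal.com event type and workflow layout:

```javascript
// n8n Code node — validate the webhook body against the fields Cal.com requires.
// The required-field list mirrors the example input above; adjust it to your event type.
const REQUIRED = [
  'attendee_name', 'start', 'attendee_timezone', 'eventTypeId', 'attendee_email',
];

const body = $input.first().json.body ?? {};
const missing = REQUIRED.filter(field => body[field] == null || body[field] === '');

// Downstream, an If node can route on `valid`:
// false -> Respond to Webhook with status 400, true -> Cal.com booking request.
return [{ json: { valid: missing.length === 0, missing, body } }];
```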
by dataplusminus+-
🎯 Project Purpose
This project automates the process of collecting and managing new leads submitted through a web form. It eliminates the need for manual data entry and ensures that each lead is:
Properly recorded and time-stamped in a structured format
Automatically communicated to the sales or support team
Ready for follow-up, with a reminder system in place
It's a lightweight but effective solution suitable for freelancers, small teams, and growing businesses that want to streamline their lead intake process.

🛠️ Tools & Technologies Used
**Google Forms / Web Form** – Frontend for capturing leads
**Google Sheets** – Central database for storing lead information
**n8n** – Automation platform that connects and coordinates all services
**Gmail** – Handles email notifications for new leads
**Slack** (optional) – Provides instant team notifications
**Date & Time nodes** – Track and manage lead response timing
**Conditional (IF) nodes** – Filter out duplicate and incomplete entries (see the sketch at the end of this description)

🔄 Workflow Overview

✨ Key Features
✅ No-code integration using n8n
✅ Instant alerts via Gmail and/or Slack
✅ Google Sheets as an easily accessible backend
✅ Modular design — easy to expand with CRM tools (like HubSpot)
✅ Clean JSON structure and logic, beginner-friendly

📈 Possible Improvements
Add email validation via an external API (e.g., NeverBounce, Hunter)
Integrate with a CRM for deeper automation
Add lead scoring based on answers
Include automatic follow-up emails after X days
Schedule weekly summary reports via email

🧑🏻‍💻 Creator Information
Developed by: Adem Tasin
🌐 Website: Dataplusminus+-
📧 Email: dataplusminuss@gmail.com
💼 LinkedIn: Adem Tasin
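A minimal sketch of the duplicate/incomplete-entry filter mentioned above, written as a Code node for clarity (the template may use plain IF nodes instead). The field names `name` and `email`, and the node reference "Get Existing Leads", are hypothetical and stand in for your own form fields and sheet-lookup node:

```javascript
// n8n Code node — flag incomplete or duplicate leads before notifying the team.
// Assumes the form provides `name` and `email`, and that previously stored
// leads arrive from a (hypothetical) "Get Existing Leads" Google Sheets node.
const lead = $input.first().json;
const existingEmails = $('Get Existing Leads').all()
  .map(item => (item.json.email ?? '').toLowerCase());

const email = (lead.email ?? '').trim().toLowerCase();
const incomplete = !email || !(lead.name ?? '').trim();
const duplicate = existingEmails.includes(email);

// An IF node downstream can route on `accept` (true -> record + notify).
return [{ json: { ...lead, accept: !incomplete && !duplicate, incomplete, duplicate } }];
```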
by Samir Saci
Tags: Automation, AI, Marketing, Content Creation

Context
I'm a Supply Chain Data Scientist and content creator who writes regularly about data-driven optimization, logistics, and sustainability. Promoting blog articles on LinkedIn used to be a manual task — until I decided to automate it with N8N and GPT-4o. This workflow lets you automatically extract blog posts, clean the content, and generate a professional LinkedIn post using an AI Agent powered by GPT-4o — all in one seamless automation.

> Save hours of repetitive work and boost your reach with AI.

📬 For business inquiries, you can add me on LinkedIn.

Who is this template for?
This template is perfect for:
**Bloggers and writers** who want to promote their content on LinkedIn
**Marketing teams** looking to automate professional post generation
**Content creators** using the Ghost platform
It generates polished LinkedIn posts with:
A hook
A quick summary
A call-to-action
A signature to drive readers to your contact page

How does it work?
This workflow runs in N8N and performs the following steps:
🚀 Triggers manually (or you can add a scheduler)
📰 Pulls recent blog posts from your Ghost site (via API)
🧼 Cleans the HTML content for AI input
🤖 Sends content to GPT-4o with a tailored prompt to create a LinkedIn post
📄 Records all data (post content + LinkedIn output) in a Google Sheet

What do I need to start?
You don't need to write a single line of code. Prerequisites:
A Ghost CMS account with blog content
A Google Sheet to store generated posts
An OpenAI API key
**Google Sheets API** connected via OAuth2

Next Steps
Use the sticky notes in the workflow to understand how to:
Add your Ghost API credentials
Link your Google Sheet
Customize the AI prompt (e.g., change the author name or tone)
Optionally add auto-posting to LinkedIn using tools like Buffer or Make

🎥 Watch My Tutorial
🚀 Want to explore how automation can scale your brand or business?
📬 Let's connect on LinkedIn

Notes
You can adapt this template for Twitter, Facebook, or even email newsletters by adjusting the prompt and output channel.
This workflow was built using n8n 1.85.4.
Submitted: April 9th, 2025
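For context, the HTML-cleaning step could be as simple as the following Code node sketch (a naive strip-tags approach; the `html`, `title`, and `url` field names follow Ghost's usual Content API response, but verify them against your own API output):

```javascript
// n8n Code node — strip HTML from a Ghost post so GPT-4o receives plain text.
return $input.all().map(item => {
  const html = item.json.html ?? ''; // Ghost Content API usually returns post HTML here

  const text = html
    .replace(/<(script|style)[\s\S]*?<\/\1>/gi, '') // drop scripts/styles entirely
    .replace(/<[^>]+>/g, ' ')                        // remove remaining tags
    .replace(/&nbsp;/g, ' ')
    .replace(/\s+/g, ' ')                            // collapse whitespace
    .trim();

  return { json: { title: item.json.title, url: item.json.url, text } };
});
```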
by Don Jayamaha Jr
Get deep insights into NFT market trends, sales data, and collection statistics—all powered by AI and OpenSea! This workflow connects GPT-4o-mini, the OpenSea API, and n8n automation to provide real-time analytics on NFT collections, wallet transactions, and market trends. It is ideal for NFT traders, collectors, and investors looking to make informed decisions based on structured data.

How It Works
Receives user queries via Telegram, webhooks, or another connected interface.
Determines the correct API tool based on the request (e.g., collection stats, wallet transactions, event tracking).
Retrieves data from the OpenSea API (requires an API key).
Processes the information using an AI-powered analytics agent.
Returns structured insights in an easy-to-read format for quick decision-making.

What You Can Do with This Agent
🔹 Retrieve NFT Collection Stats → Get floor price, volume, sales data, and market cap.
🔹 Track Wallet Activity → Analyze transactions for a given wallet address.
🔹 Monitor NFT Market Trends → Track historical sales, listings, bids, and transfers.
🔹 Compare Collection Performance → View side-by-side market data for different NFT projects.
🔹 Analyze NFT Transaction History → Check real-time ownership changes for any NFT.
🔹 Identify Market Shifts → Detect sudden spikes in demand, price changes, and whale movements.

Example Queries You Can Use
✅ "Get stats for the Bored Ape Yacht Club collection."
✅ "Show me all NFT sales from the last 24 hours."
✅ "Fetch all NFT transfers for wallet 0x123...abc on Ethereum."
✅ "Compare the last 3 months of sales volume for Azuki and CloneX."
✅ "Track the top 10 wallets making the most NFT purchases this week."

Available API Tools & Endpoints
1️⃣ Get Collection Stats → /api/v2/collections/{collection_slug}/stats (Retrieve NFT collection-wide market data)
2️⃣ Get Events → /api/v2/events (Fetch global NFT sales, transfers, listings, bids, redemptions)
3️⃣ Get Events by Account → /api/v2/events/accounts/{address} (Track transactions by wallet)
4️⃣ Get Events by Collection → /api/v2/events/collection/{collection_slug} (Get sales activity for a collection)
5️⃣ Get Events by NFT → /api/v2/events/chain/{chain}/contract/{address}/nfts/{identifier} (Retrieve historical transactions for a specific NFT)

Set Up Steps
Get an OpenSea API key: sign up at OpenSea API and request an API key.
Configure API credentials in n8n: add your OpenSea API key under HTTP Header Authentication.
Connect the workflow to Telegram, Slack, or a database (optional): use n8n integrations to send alerts to Telegram or Slack, or save results to Google Sheets, Notion, etc.
Deploy and test: send a query (e.g., "Azuki latest sales") and receive instant NFT market insights!

Stay ahead in the NFT market—get real-time analytics with OpenSea's AI-powered analytics agent!
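To illustrate what the agent calls under the hood, here is a minimal sketch of the collection-stats request using the endpoint listed above. The `x-api-key` header name is an assumption based on common OpenSea usage; confirm it in your HTTP Header Auth credential:

```javascript
// Minimal sketch: fetch collection-wide stats from OpenSea's v2 API.
// Header name `x-api-key` is assumed — verify it against OpenSea's API docs.
async function getCollectionStats(slug, apiKey) {
  const url = `https://api.opensea.io/api/v2/collections/${slug}/stats`;
  const res = await fetch(url, { headers: { 'x-api-key': apiKey } });
  if (!res.ok) throw new Error(`OpenSea API error: ${res.status}`);
  return res.json(); // includes floor price, volume, and sales figures
}

// Example usage:
// getCollectionStats('boredapeyachtclub', process.env.OPENSEA_API_KEY)
//   .then(stats => console.log(stats));
```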
by Baptiste Fort
Still reminding people about their tasks manually every morning? Let's be honest — who wants to start the day chasing teammates about what they need to do? What if Slack could do it for you — automatically, at 9 a.m. every day — without missing anything, and without you lifting a finger?
In this tutorial, you'll build a simple automation with n8n that checks Airtable for active tasks and sends reminders in Slack, daily.
Here's the flow you'll build:
Schedule Trigger → Search Records (Airtable) → Send Message (Slack)

STEP 1: Set up your Airtable base
Create a new base called Tasks
Add a table (for example: Projects, To-Do, or anything relevant)
Add the following fields:

| Field    | Type              | Example                     |
| -------- | ----------------- | --------------------------- |
| Title    | Text              | Finalize quote for Client A |
| Assignee | Text              | Baptiste Fort               |
| Email    | Email             | claire@email.com            |
| Status   | Single select     | In Progress / Done          |
| Due Date | Date (dd/mm/yyyy) | 05/07/2025                  |

Add a few sample tasks with the status In Progress so you can test your workflow later.

STEP 2: Create the trigger in n8n
In n8n, add a Schedule Trigger node
Set it to run every day at 9:00 a.m.:
Trigger Interval: Days
Days Between Triggers: 1
Trigger at Hour: 9
Trigger at Minute: 0
This is the node that kicks off the workflow every morning.

STEP 3: Search for active tasks in Airtable
This step is all about connecting n8n to your Airtable base and pulling the tasks that are still marked as "In Progress".

1. Add the Airtable node
In your n8n workflow, add a node called: Airtable → Search Records
You can find it by typing "airtable" in the node search.

2. Create your Airtable Personal Access Token
If you haven't already created your Airtable token, here's how:
🔗 Go to: https://airtable.com/create/tokens
Then:
Name your token something like TASKS
Under Scopes, check: ✅ data.records:read
Under Access, select only the base you want to use (e.g., "Tasks")
Click "Save token"
Copy the personal token

3. Set up the Airtable credentials in n8n
In the Airtable node:
Click on the Credentials field
Select: Airtable Personal Access Token
Click Create New
Paste your token
Give it a name like: My Airtable Token
Click Save

4. Configure the node
Now fill in the parameters:
Base: Tasks
Table: your task table (e.g., Projects or To-Do)
Operation: Search
Filter By Formula: {Status} = "In Progress"
Return All: ✅ Yes (make sure it's enabled)
Output Format: Simple

5. Test the node
Click "Execute Node". You should now see all tasks with Status = "In Progress" show up in the output (on the right-hand side of your screen).

STEP 4: Send each task to Slack
Now that we've fetched all the active tasks from Airtable, let's send them to Slack — one by one — using a loop.

Add the Slack node
Drag a new node into your n8n workflow and select: Slack → Message
Name it something like Send Slack Message
You can find it quickly by typing "Slack" into the node search bar.

Connect your Slack account
If you haven't already connected your Slack credentials:
Go to n8n → Credentials
Select Slack API
Click Create New
Paste your Slack Bot Token (from your Slack App OAuth settings)
Give it a clear name like Slack Bot n8n
Choose the workspace and save
Then, in the Slack node, choose this credential from the dropdown.

Configure the message
Set these parameters:
Operation: Send
Send Message To: Channel
Channel: your Slack channel (e.g. #tous-n8n)
Message Type: Simple Text Message

Message template
Paste the following inside the Message Text field:
New task for {{ $json["Assignee"] }}: {{ $json["Title"] }}
👉 Deadline: {{ $json["Due Date"] }}
Example output:
New task for Jeremy: Follow up with supplier X
👉 Deadline: 2025-07-04

Test it
Click Execute Node to verify the message is correctly sent in Slack. If the formatting works, you're ready to run it on schedule 🚀
by David w/ SimpleGrow
This n8n workflow tracks user engagement in a specific WhatsApp group by capturing incoming messages via a Whapi webhook. It first filters messages to ensure they come from the correct group, then identifies the message type—text, emoji reaction, voice, or image. The workflow searches for the user in an Airtable database using their WhatsApp ID and increments their message count by one. It updates the Airtable record with the new count and the date of the last interaction. This automated process helps measure user activity and supports engagement initiatives like weekly raffles or rewards. The system is flexible and can be expanded to include more message types or additional actions. Overall, it provides a seamless way to encourage and track user participation in your WhatsApp community.
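As a rough sketch, the count-increment step could look like this in a Code node, assuming Airtable fields named `Message Count` and `Last Interaction` (the template's actual field names may differ):

```javascript
// n8n Code node — prepare the updated Airtable record for an engaged user.
// Assumes the matched Airtable record arrives with its `id` and current fields;
// the field names below are assumptions, not the template's confirmed schema.
const record = $input.first().json;

return [{
  json: {
    id: record.id, // Airtable record ID from the preceding search step
    fields: {
      'Message Count': (record.fields?.['Message Count'] ?? 0) + 1,
      'Last Interaction': new Date().toISOString().slice(0, 10), // YYYY-MM-DD
    },
  },
}];
```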
by Yusei Miyakoshi
Who's it for
Teams that start their day in Slack and want a concise, automated summary of yesterday's emails—ops leads, PMs, founders, and anyone handling busy inboxes without writing code.

What it does / How it works
Runs every morning at 08:00 (cron 0 0 8 * * *), fetches all emails received yesterday, and routes the result: if none were found, it posts a polite "no emails" notice; if emails exist, it aggregates them and asks an AI agent to produce a structured digest, then formats and posts to your chosen Slack channel. The flow uses Gmail → If → Aggregate (Item Lists) → AI Agent (OpenRouter model with structured output) → Code (Slack formatter) → Slack. A set of sticky notes on the canvas explains each step and required inputs.

How to set up
Connect Gmail (OAuth2) and keep the default date window (yesterday → today at 00:00).
Connect Slack (OAuth2) and select your target channel.
Add OpenRouter credentials and pick a compact model (e.g., gpt-4o-mini).
Keep the provided structured-output schema and formatter code.
Adjust the schedule/timezone if needed (the fallback message includes an Asia/Tokyo example).
Paste this description into the yellow sticky note at the top of the canvas.

Requirements
Gmail & Slack accounts with appropriate scopes
OpenRouter API key stored in Credentials (no hard-coded keys)
n8n Cloud or self-hosted n8n with LangChain agent nodes enabled

How to customize the workflow
Narrow Gmail results with label/search filters (e.g., from:, subject:).
Change the digest sections or tone in the AI Agent system prompt.
Swap the model for cost/quality needs and tweak temperature/max tokens.
Localize dates/timezones in the formatter code and Slack messages.
Branch the output to email, Google Docs, or Sheets for archival.

Security & publishing tips
Rename all nodes clearly, do not hardcode API keys, remove real channel IDs/emails before sharing, and group end-user variables in a Set (Fields) node. Keep the sticky notes—they're mandatory for reviewers.
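For reference, a minimal sketch of what the Slack-formatter Code node might look like; the digest field names (`summary`, `highlights`) are assumptions about the structured-output schema, which the template defines in the AI Agent node:

```javascript
// n8n Code node — turn the AI agent's structured digest into Slack mrkdwn.
// `summary` and `highlights` are assumed schema fields; match your own schema.
const digest = $input.first().json.output;

const lines = [
  `*📬 Email digest: ${new Date().toLocaleDateString('en-US')}*`,
  '',
  digest.summary,
  '',
  ...(digest.highlights ?? []).map(h => `• *${h.subject}*: ${h.note}`),
];

return [{ json: { text: lines.join('\n') } }];
```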