by Toshiya Minami
Sort invoice PDFs from Gmail to Google Drive and Google Sheets

Who's it for

Freelancers, finance teams, and small businesses that receive invoice PDFs by email and want them automatically saved to Google Drive and logged in Google Sheets—without manual downloading or copy-pasting.

How it works / What it does

This workflow watches your Gmail inbox for unread messages that match an invoice pattern (e.g., subject:invoice filename:pdf). For each email, it checks for attachments, uploads each PDF to a chosen Google Drive folder, and appends a new row to a Google Sheet with useful metadata: received time, sender, subject, filename, Drive link, and IDs. Finally, it marks the original email as read to avoid duplicates.

How to set up

1. Open the Config (Set) node and fill in:
   - drive_folder_id (or leave blank for root)
   - spreadsheet_id (from the Sheet URL)
   - sheet_name (e.g., Invoices)
2. Connect credentials for Gmail, Google Drive, and Google Sheets in each node.
3. Adjust the Gmail search query if needed (language/vendor terms); see the example query below.
4. Run once manually to verify data mapping, then activate.

Requirements

- n8n with valid credentials for Gmail, Google Drive, and Google Sheets.
- A Google Sheet with appropriate headers (or let the workflow write new columns).

How to customize the workflow

- Replace Gmail with IMAP or Microsoft Outlook if you don't use Gmail; remove the "mark as read" step accordingly.
- Add parsing (e.g., extract invoice totals or vendor names via PDF/AI nodes) before the Sheets step.
- Route based on vendor: create subfolders dynamically in Drive and write to different tabs.
- Notify your team by adding Slack/Email nodes after logging to Sheets.
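For setup step 3, a typical starting query, assuming invoices arrive as PDF attachments with "invoice" in the subject (adjust the keyword to your language or vendors):

```
is:unread subject:invoice has:attachment filename:pdf
```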
by Adil Khan
This workflow integrates Google Analytics 4 (GA4) with Slack, enabling users to query their website data using natural language inside a dedicated Slack channel. An AI Agent interprets user queries, fetches relevant reports from GA4, and responds in Slack as a reply.

How it works

When a user sends a message in a specified Slack channel, the workflow is triggered. The message is filtered to remove @bot mentions and then passed to an AI Agent. The AI Agent, powered by a Google Gemini Chat Model and using conversational memory (so it can have a back-and-forth with the user on follow-up questions, with a limit of 10), determines whether the user's query requires data from Google Analytics 4. If so, it uses a pre-configured GA4 tool to fetch the necessary report (e.g., page views, users, conversions for a specific date range). Finally, the AI Agent's response, containing the requested data, is sent back to the original Slack channel as a reply.

Setup Steps

1. Slack Trigger: Configure the Slack API credential and specify the channel n8n should monitor for new messages.
2. Credentials: Create and configure the following credentials in n8n:
   - Slack API: For sending and receiving messages.
   - Google Analytics 4: For accessing GA4 reports. Requires a Google Cloud Project with the Analytics Data API enabled and a Service Account Key (JSON).
   - Google Gemini Chat Model: For the AI Agent's intelligence. Requires an API key from Google AI Studio.
3. AI Agent System Prompt: Craft a robust system prompt for the AI agent. This prompt should define the agent's role, constraints (e.g., "do not estimate or make up data; if GA4 is unavailable, say so"), and guidance on mapping natural-language metrics/dimensions to GA4 equivalents (e.g., "when the user mentions 'leads', they mean 'conversions' in GA4").
4. Slack Reply: Ensure the final Slack "Send a message" node is configured to reply to the original channel, providing the data in a clear, concise format.
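For setup step 3, one possible starting point for the system prompt; the metric mappings are assumptions and should be adjusted to your own GA4 property and vocabulary:

```
You are a GA4 reporting assistant answering questions in a Slack channel.
- Only answer with data returned by the GA4 tool; never estimate or invent numbers.
- If GA4 is unavailable or returns no data, say so explicitly.
- Map everyday terms to GA4 equivalents, e.g. "leads" -> conversions,
  "visitors" -> activeUsers, "page views" -> screenPageViews.
- Default to the last 7 days when no date range is given.
- Keep replies short and formatted for Slack.
```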
by Rakin Jakaria
Use cases are many: Automate Gmail tasks such as sending, replying, labeling, deleting, and fetching emails — all with AI assistance. Perfect for YouTubers managing viewer emails, sales teams handling inquiries, freelancers responding to client requests, or professionals keeping their inbox organized.

Good to know

- At time of writing, each Gemini request is billed per token. See Gemini Pricing for updated details.
- The workflow uses Gmail labels (e.g., youtube-viewers, sales-inquiry, meeting-request, potential-clients, collaboration-requests) for classification — make sure these exist in your Gmail account.

How it works

- **Chat Trigger**: You interact with the agent via a chat interface (webhook).
- **AI Agent**: A Gemini-powered assistant interprets your instructions (send, reply, label, delete, fetch emails).
- **Email Actions**: Based on your request, the assistant uses Gmail tools to act on emails (Send, Reply, Label, Delete, Get Many).
- **Contact Lookup**: If only a name is provided, the agent checks Google Sheets for the matching email address. If not found, it prompts you to add it.
- **Memory**: A buffer memory stores chat context so the assistant can maintain continuity across multiple interactions.
- **Labeling**: Emails can be auto-labeled for better organization (e.g., client inquiries, meeting requests).

How to use

Send commands like:

- "Reply to John's email with a follow-up about the project."
- "Label Sarah's email as potential-client."
- "Delete the latest spam email."

The Gmail Agent will handle the request instantly and keep everything logged properly.

Requirements

- Gmail account connected with OAuth2 credentials
- Google Gemini API key for AI processing
- Google Sheets for contact management
- Pre-created Gmail labels for organization

Customising this workflow

- Add new Gmail labels for your workflow (e.g., Invoices, Support Tickets).
- Connect to a CRM (e.g., HubSpot, Notion, or Airtable) for syncing email data.
- Enhance AI replies with dynamic templates stored in Google Sheets.
- Extend chat commands to include batch actions (e.g., "Archive all emails older than 30 days").
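For the Contact Lookup step, a minimal contacts sheet might look like this; the column names are an assumption, so match them to whatever the Google Sheets lookup node in your copy of the workflow expects:

```
Name   | Email
John   | john@example.com
Sarah  | sarah@example.com
```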
by ToolMonsters
How it works

This workflow lets you search for leads using FullEnrich's People Search API directly from Monday.com, then auto-fills the results as new items on your board.

1. A Monday.com automation sends a webhook when a new item is created on your "search criteria" board
2. The workflow responds to Monday.com's challenge handshake, then calls FullEnrich's People Search API with the criteria from your Monday.com columns (job title, industry, location, company size, number of results)
3. The search results are split into individual people and each one is created as a new item on your Monday.com "results" board with their name, title, company, domain, LinkedIn URL, and location

Set up steps

Setup takes about 15 minutes:

1. Monday.com "Search Criteria" board — Create a board with these columns: Job Title (text), Industry (text), Location (text), Company Size (text), Number of Results (number). Note down the column IDs
2. Monday.com "Results" board — Create a board with columns for: First Name, Last Name, Job Title, Company Name, Company Domain, LinkedIn URL, Location. Note down the board ID, group ID, and column IDs
3. Monday.com automation — On your search criteria board, create an automation: When item created → send webhook to the production URL from the "Monday.com Webhook" node
4. FullEnrich — Connect your FullEnrich API credentials
5. Monday.com — Connect your Monday.com API credentials
6. Column mapping — Update the column IDs in the HTTP Request body and in the Monday.com node to match your boards
7. Activate the workflow
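When you point the Monday.com automation at the webhook, Monday.com first verifies the endpoint by POSTing a JSON body containing a challenge token and expecting the same token echoed back. A minimal sketch of how that handshake branch might look in an n8n Code node feeding a "Respond to Webhook" node; the assumption here is that the Webhook node places the request payload under `body`:

```javascript
// Echo Monday.com's challenge token back so the webhook gets verified.
// Assumes the Webhook node output places the request payload under `body`.
const body = $input.first().json.body || {};

if (body.challenge) {
  // Hand this straight to a "Respond to Webhook" node as the response body.
  return [{ json: { challenge: body.challenge } }];
}

// Not a handshake: pass the real event (new item created) through to the
// FullEnrich People Search request.
return [{ json: body }];
```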
by Daniel Rosehill
This workflow provides a way to capture detailed AI prompts using a voice note transcription service and then passes them on for completion to an AI agent. To preserve outputs in a knowledge management system, the AI response and the prompt are combined into one document that is created in a Nuclino collection (note: the Nuclino step is configured manually with a HTTP Request node).

How it works

1. A webhook receives voice note data from Voicenotes.com containing the title and transcript
2. The transcript is extracted and sent to an AI Agent powered by OpenRouter's Claude Sonnet model
3. The AI generates a structured response in markdown format with Summary, Prompt, and Response sections
4. The original prompt and AI response are merged and prepared for multiple outputs
5. A Nuclino document is created via HTTP Request with the structured content
6. A Slack notification is sent with the prompt, response, and Nuclino note URL
7. Both the original prompt and AI response are archived in NocoDB for future reference

How to use

- The webhook trigger can be configured to receive data from Voicenotes.com or any service that provides title and transcript data
- Replace the manual trigger with webhook, form, or other triggers as needed
- Customize the AI system message to change response format and behavior
- Configure Nuclino workspace and collection IDs for proper document organization

Requirements

- **OpenRouter account** for AI model access (Claude Sonnet)
- **Nuclino account** and API token for document creation
- **Slack workspace** with bot permissions for notifications
- **NocoDB instance** for archiving (optional)
- **Voicenotes.com account** for voice input (or alternative webhook source)

Customising this workflow

- **AI Models**: Switch between different OpenRouter models by changing the model parameter
- **Response Format**: Modify the AI Agent system message to change output structure
- **Documentation Platforms**: Replace Nuclino HTTP Request with other documentation APIs
- **Notification Channels**: Add multiple Slack channels or other notification services
- **Archive Storage**: Replace NocoDB with other database solutions
- **Input Sources**: Adapt webhook to accept data from different voice note or transcription services

Nuclino API

The Nuclino API is documented here.
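As a rough orientation for the manually configured HTTP Request step, the Nuclino "create item" call might look something like the sketch below. The endpoint, auth header format, and body fields are assumptions; verify them against the Nuclino API documentation referenced above before using.

```javascript
// Rough sketch of the Nuclino "create item" call behind the HTTP Request node.
// Endpoint, auth header, and body fields are assumptions; verify against the
// Nuclino API docs linked above.
const response = await fetch("https://api.nuclino.com/v0/items", {
  method: "POST",
  headers: {
    Authorization: "YOUR_NUCLINO_API_TOKEN",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    workspaceId: "YOUR_WORKSPACE_OR_COLLECTION_ID", // hypothetical placeholder
    title: "Voice note prompt",                     // e.g. the Voicenotes title
    content: "## Summary\n...\n## Prompt\n...\n## Response\n...", // merged markdown
  }),
});
const item = await response.json(); // the returned URL can feed the Slack notification
```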
by CentralStationCRM
How it works

1. A time trigger (cron format) runs every weekday at 5 pm
2. Gets today's people updates from CentralStationCRM
3. Checks for the tag "Outreach"
4. If true, sends a message via Gmail (predefined in the node)
5. Waits 7 days, then checks for answers
6. Alerts the user if an answer is there
7. If not, repeats the process with a second mail

How to set up

1. Get credentials for CentralStationCRM, Slack, and Gmail
2. Set up the respective nodes with the credentials
3. Define the text for your automated mails
4. Test without the wait nodes
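For the time trigger, a standard cron expression for "every weekday at 5 pm" is:

```
0 17 * * 1-5
```

That is minute 0, hour 17, Monday through Friday. Enter it in the Schedule Trigger node's custom cron field and make sure the workflow's timezone setting matches yours.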
by Sergei Byvshev
Automatically classify and route DevOps requests from your team chat using LLM + on-call calendar lookup.

What it does

This workflow turns your Mattermost channel into a smart DevOps intake system. When someone mentions @devops-duty, the workflow:

1. Receives the message via Mattermost outgoing webhook
2. Classifies the request into one of 8 categories using an LLM
3. Looks up the current on-call engineer from Google Calendar
4. Routes the request through a Switch node based on category
5. Acknowledges in a Mattermost thread with the classification result

Categories

- `create_resource` - Provision new databases, secrets, services, DNS records
- `incident` - Something is broken — production or staging issues
- `question` - Information requests, status checks, clarifications
- `ci_cd_error` - Build failures, deployment issues, GitHub Actions problems
- `limits` - Billing limits, quotas exceeded
- `change_request` - Modify existing infrastructure or configuration
- `code_approve` - Code review and merge request approvals
- `other` - Anything that doesn't fit above

Extending

Each Switch output is an independent branch — connect sub-workflows or additional nodes per category. For example:

- incident → trigger an AI investigation sub-workflow with MCP tools (Kubernetes, Grafana, etc.)
- create_resource → run a provisioning playbook
- ci_cd_error → fetch GitHub Actions logs and analyze failures
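To give a feel for what the Switch node routes on, here is a hypothetical shape of the classifier's output; the field names are illustrative and should be aligned with your own LLM prompt and parser:

```javascript
// Hypothetical shape of the LLM classifier output that the Switch node routes on.
// Field names are illustrative; align them with your own prompt/parser.
const classification = {
  category: "incident",          // one of the 8 categories listed above
  confidence: 0.92,              // optional: useful for a fallback branch
  summary: "Prod API returning 500s since the 14:00 deploy",
  on_call: "jane.doe",           // filled in from the Google Calendar lookup
};
```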
by Elvis Sarvia
Validate AI-generated outputs before your workflow acts on them. This template sends a support ticket through AI classification, parses the JSON response, and checks that categories, urgency levels, and confidence scores are all within valid ranges.

What you'll do

- Send a support ticket to the AI for classification.
- Watch the Code node parse and validate the AI's JSON response against a defined schema.
- See how valid outputs continue through the workflow while invalid ones get flagged.

What you'll learn

- How to structure AI prompts to return valid JSON
- How Code nodes parse and validate AI output against expected schemas
- How to check confidence scores, valid categories, and urgency levels programmatically
- How to build retry and fallback paths for malformed AI responses

Why it matters

AI models don't always return what you expect. A confidence score of "high" instead of 0.95, a missing category field, or a malformed JSON response can silently break downstream steps. This template catches those failures before they propagate.

This template is a learning companion to the Production AI Playbook, a series that explores strategies, shares best practices, and provides practical examples for building reliable AI systems in n8n. https://go.n8n.io/PAP-D&A-Blog
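A minimal sketch of the kind of validation the Code node performs; the allowed categories, urgency levels, and the field where the AI output lands are assumptions, so match them to your own classification prompt:

```javascript
// Minimal sketch of the validation logic inside the Code node.
// Schema values and the `output` field name are assumptions; match them to
// whatever your classification prompt actually returns.
const VALID_CATEGORIES = ["billing", "technical", "account", "general"];
const VALID_URGENCY = ["low", "medium", "high"];

const raw = $input.first().json.output ?? "";
const errors = [];
let parsed;

try {
  parsed = JSON.parse(raw);
} catch (e) {
  return [{ json: { valid: false, errors: ["Response is not valid JSON"], raw } }];
}

if (!VALID_CATEGORIES.includes(parsed.category)) {
  errors.push(`Unknown category: ${parsed.category}`);
}
if (!VALID_URGENCY.includes(parsed.urgency)) {
  errors.push(`Unknown urgency: ${parsed.urgency}`);
}
if (typeof parsed.confidence !== "number" || parsed.confidence < 0 || parsed.confidence > 1) {
  errors.push(`Confidence must be a number between 0 and 1, got: ${parsed.confidence}`);
}

// Valid items continue downstream; invalid ones can be routed to a retry/fallback branch.
return [{ json: { valid: errors.length === 0, errors, ...parsed } }];
```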
by Đỗ Thành Nguyên
Publish a Reel on a Facebook Page

🧠 Good to Know

> Set up an n8n self-hosted instance using https://tino.vn/vps-n8n?affid=388
> Use the code ==VPSN8N== for up to 39% off.

This workflow automatically uploads and publishes Facebook Reels using data from Google Sheets and video files stored in Google Drive. It runs on a schedule (every 30 minutes by default) and can be fully customized for your posting routine.

⚙️ How It Works

1. Google Sheets provides the content data — video file ID, caption, and optional links.
2. Google Drive hosts the actual video file (.mp4 format).
3. The workflow initializes an upload session with the Facebook Graph API, uploads the video, and publishes it as a Reel on your Page.
4. Finally, it updates your Google Sheet and adds a comment under the published Reel with your affiliate or product link.

How to Use

1. Open the template Google Sheet or make a copy: 👉 Template Sheet
2. Fill out each row with:
   - File ID → the ID of your video file from Google Drive
   - File name → optional
   - Caption → your post caption
   - Link Share → optional
   - Link post → leave empty (it will be filled after posting)
3. Ensure your video file is in .mp4 format and stored in a shared Google Drive folder that's accessible to your connected account.
4. Add your Facebook Page ID and Page Access Token to the "info" node. (Learn how to get these here: Facebook Reels Workflow Guide)

📋 Requirements

- **n8n instance (Self-hosted recommended):** Set up a self-hosted instance using https://tino.vn/vps-n8n?affid=388. Use the code VPSN8N for up to 39% off.
- **Facebook Page** with publishing permissions
- **Page Access Token** (with pages_manage_posts, pages_read_engagement)
- **Google Drive** and **Google Sheets** account connected to n8n
- Video files in .mp4 format, under the 1GB upload limit

🎨 Customizing This Workflow

- **Change schedule:** Adjust the **Schedule Trigger** node (e.g., every 2 hours or only during business hours).
- **Track post links:** Add a node to fetch the permalink_url from the Graph API and update it in your sheet.
- **Auto-comment control:** Modify or remove the "Create comment post" node to suit your campaign style.
- **Improve security:** Replace hardcoded tokens with **n8n credentials**, secrets, or a Data Table lookup.

This structure keeps your automation scalable, secure, and easy to adapt for multi-page or multi-brand use.
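For orientation, the "initialize, upload, publish" step in How It Works roughly corresponds to three Graph API calls. This is a sketch under stated assumptions: the API version, parameter names, and the use of a `file_url` header for remotely hosted video are assumptions, so check Meta's Reels publishing documentation (referenced in the guide above) before relying on it.

```javascript
// Rough sketch of the three Graph API calls behind "initialize, upload, publish".
// API version, field names, and header details are assumptions; verify against
// Meta's Reels publishing docs.
const GRAPH = "https://graph.facebook.com/v21.0";
const pageId = "YOUR_PAGE_ID";
const token = "YOUR_PAGE_ACCESS_TOKEN";

// 1. Start an upload session
const start = await (await fetch(
  `${GRAPH}/${pageId}/video_reels?upload_phase=start&access_token=${token}`,
  { method: "POST" }
)).json(); // expected to return { video_id, upload_url }

// 2. Upload the video, here by pointing Meta at a downloadable URL
await fetch(start.upload_url, {
  method: "POST",
  headers: {
    Authorization: `OAuth ${token}`,
    file_url: "https://example.com/path/to/video.mp4", // e.g. a Drive download link
  },
});

// 3. Publish the Reel with its caption
await fetch(
  `${GRAPH}/${pageId}/video_reels?upload_phase=finish&video_id=${start.video_id}` +
  `&video_state=PUBLISHED&description=${encodeURIComponent("Your caption")}` +
  `&access_token=${token}`,
  { method: "POST" }
);
```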
by Ryo Sayama
Who is this for

Anyone who wants a fun and practical AI chatbot on LINE. Great for people who enjoy getting advice from multiple angles — whether they face work stress, personal dilemmas, or everyday decisions.

What this workflow does

When a user sends a text message to the LINE bot, the workflow:

1. Parses the incoming LINE Webhook event
2. Passes the message to Google Gemini via Basic LLM Chain
3. Gemini replies as three distinct personas in a single structured response
4. The advice is logged to Google Sheets for history tracking
5. A Flex Message carousel is sent back to the user — one card per persona, each color-coded

The three personas:

- 🔮 Fortune Teller — mystical, fate-driven advice
- 💼 Business Coach — logical, action-oriented guidance
- 😊 Best Friend — casual, empathetic encouragement

How to set up

1. Create a LINE Messaging API channel and copy the Channel Access Token
2. Set your n8n webhook URL as the LINE Webhook URL
3. Create a Google Sheets spreadsheet with a sheet named advice_history and these headers in row 1: Timestamp, User ID, Message, Fortune Teller, Business Coach, Best Friend
4. Open the Set config node and paste your LINE token and Sheet ID
5. Connect your Google Gemini credential to the Google Gemini Chat Model node
6. Connect your Google Sheets credential to the Save advice to Sheets node
7. Activate the workflow and send a message to your LINE bot

Requirements

- LINE Messaging API channel (free)
- Google Gemini API key (free tier available at aistudio.google.com)
- Google Sheets (any Google account)

How to customize

- Change the three personas in the Generate advice with Gemini prompt to fit your use case (e.g. therapist, investor, comedian)
- Adjust the Flex Message card colors in Send Flex Message to LINE
- Add extra columns to Google Sheets to track additional metadata
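To illustrate the Flex Message carousel that goes back to the user, here is a minimal sketch of the reply payload. The colors, sample texts, and exact field layout are illustrative assumptions; the actual node builds the bubbles from Gemini's structured output.

```javascript
// Sketch of the Flex Message carousel sent back through the LINE reply API:
// one bubble per persona, color-coded. Values are illustrative placeholders.
const fortuneAdvice = "The stars favour a bold move this week...";
const coachAdvice = "Break the decision into three measurable steps...";
const friendAdvice = "Honestly? You've got this. Sleep on it first.";

const bubble = (title, color, text) => ({
  type: "bubble",
  header: {
    type: "box",
    layout: "vertical",
    backgroundColor: color,
    contents: [{ type: "text", text: title, weight: "bold", color: "#ffffff" }],
  },
  body: {
    type: "box",
    layout: "vertical",
    contents: [{ type: "text", text, wrap: true }],
  },
});

const replyPayload = {
  replyToken: "REPLY_TOKEN_FROM_WEBHOOK_EVENT",
  messages: [
    {
      type: "flex",
      altText: "Advice from your three advisors",
      contents: {
        type: "carousel",
        contents: [
          bubble("🔮 Fortune Teller", "#6A0DAD", fortuneAdvice),
          bubble("💼 Business Coach", "#1E3A8A", coachAdvice),
          bubble("😊 Best Friend", "#F59E0B", friendAdvice),
        ],
      },
    },
  ],
};
```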
by oka hironobu
Who is this for

Legal teams, operations managers, and freelancers who review contracts regularly and want to catch risky clauses before signing. Ideal for small teams without dedicated legal counsel.

What this workflow does

This workflow automates contract risk analysis using AI. A user uploads a PDF contract through a web form and selects the contract type. The Code node extracts text from the PDF, then Google Gemini analyzes the full contract for risky clauses, unfavorable terms, and missing legal protections. Each clause gets a severity rating (high, medium, low) with a suggested fix. The parsed results are logged to Google Sheets for tracking, and if the overall risk score exceeds your threshold, a Slack alert fires immediately so nothing slips through.

How to set up

1. Get a free Google Gemini API key from Google AI Studio
2. Connect your Google Sheets account and create a spreadsheet called "Contract Reviews"
3. Connect your Slack workspace and select the channel for risk alerts
4. Activate the workflow and share the form URL with your team

Requirements

- Google Gemini API key (free tier available)
- Google Sheets account
- Slack workspace with a channel for alerts
- n8n instance (self-hosted or cloud)

How to customize

- Edit the AI prompt in the "Analyze Contract" node to focus on specific clause types like indemnification or IP assignment
- Change the risk threshold in the "Check Risk Level" node (default triggers on scores above 7)
- Add columns to the Sheets node for additional tracking fields like reviewer name or department
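To make the customization options more concrete, here is a hypothetical shape of the structured result the "Analyze Contract" prompt could ask Gemini to return; the field names are illustrative, not prescribed by the template:

```javascript
// Hypothetical shape of the parsed Gemini analysis that feeds Sheets and the
// risk check. Field names are illustrative assumptions.
const analysis = {
  overall_risk_score: 8,            // 1-10; the "Check Risk Level" node alerts above 7 by default
  clauses: [
    {
      clause: "Indemnification",
      severity: "high",             // high | medium | low
      issue: "One-sided indemnity with no liability cap",
      suggested_fix: "Add a mutual indemnity and cap liability at fees paid",
    },
  ],
  missing_protections: ["Limitation of liability", "Termination for convenience"],
};
```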
by Nima Salimi
Overview

This n8n workflow automatically fetches the Forex Factory calendar for yesterday using Rapid API, then saves the data to a connected Google Sheet and sends Telegram alerts for high and medium impact events. It runs daily on schedule, collecting key fields such as currency, time, impact, and market indicators, and organizes them for easy tracking and analysis. Perfect for forex traders and analysts who need quick access to reliable market data from the previous day's events.

✅ Tasks

- ⏰ Runs automatically every day
- 🌐 Fetches yesterday's Forex Factory calendar via Rapid API
- 🧾 Collects key data fields: year, date, time, currency, impact, actual, forecast, previous
- 📊 Saves all records to Google Sheets for tracking and analysis
- 🚨 Sends Telegram alerts for high and medium impact events
- ⚙️ Keeps your market data updated and organized with no manual work required

🛠 How to Use

1. 📄 Create a Google Spreadsheet: Create a new spreadsheet in Google Sheets and add two sheets: High Impact and Low Impact. Connect it to your Google Sheets nodes in n8n.
2. 🌐 Find the API on Rapid API: Go to Rapid API and search for Forex Factory Scraper. Subscribe to the API to get your access key.
3. 🔑 Connect Rapid API to n8n: In your HTTP Request node, add the authentication headers shown in the example at the end of this section.
4. 💬 Add Your Telegram Chat ID: In the Telegram node, paste your Chat ID to receive daily alerts for high-impact news.
5. 🕒 Activate the Workflow: Enable the Schedule Trigger to run daily. The workflow will automatically fetch yesterday's Forex Factory calendar, save it to Google Sheets, and send Telegram notifications.
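For step 3, Rapid API requests are typically authenticated with two headers. The host value below is an assumption; copy the exact x-rapidapi-host shown on the Forex Factory Scraper page in your Rapid API dashboard:

```
x-rapidapi-key: YOUR_RAPIDAPI_KEY
x-rapidapi-host: forex-factory-scraper.p.rapidapi.com
```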