by Harshil Agrawal
This workflow demonstrates how the $runIndex expression can be used to avoid an infinite loop. It creates 5 Tweets with the content 'Hello from n8n!'. You can reuse this workflow by replacing the Twitter node with any other node(s) and updating the condition in the IF node.
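A minimal sketch of the kind of condition involved, assuming the IF node's true branch feeds the Twitter node and loops back. The exact comparison values depend on how many items you want to create.

```javascript
// Hedged sketch of the loop guard, written as an n8n expression in the IF node.
// $runIndex starts at 0 and increases by one each time the node runs within the
// same execution, so the true branch (the Twitter node) fires five times before
// the false branch ends the loop.
const ifCondition = '{{ $runIndex < 5 }}'; // condition: $runIndex, operation "smaller", value 5
```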
by Zacharia Kimotho
What problem is this workflow solving? This workflow is aimed at email marketing enthusiasts looking for an easy way to extract the domain from an email address and check whether its syntax is correct, without having to use the Code node.
How this works
Replace the debugger node with your actual data source.
Map your data to match the layout above.
Run the workflow and check which emails are valid and which are not.
Once done, you will have a list of all your emails, their domains, and whether they are valid.
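As a rough illustration of the two checks this kind of workflow performs, here is a sketch assuming the incoming field is called "email"; adjust the field name to match your data source.

```javascript
// Hedged sketch of the domain extraction and syntax check, expressed both as
// n8n expressions and as plain JavaScript.
// Domain extraction (e.g. in an Edit Fields / Set node):
//   {{ $json.email.split('@')[1] }}
// Syntax check (e.g. in an IF node condition):
//   {{ /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test($json.email) }}
const email = 'jane.doe@example.com';
const domain = email.split('@')[1];                        // "example.com"
const isValid = /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email);  // true
```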
by Harshil Agrawal
This workflow allows you to receive a Mattermost message when meeting notes get added to Notion. Prerequisites: Create a table in Notion similar to this: Meeting Notes. Follow the steps mentioned in the documentation to create credentials for the Notion Trigger node. Create credentials for Mattermost. Notion Trigger node: triggers the workflow when new data gets added to Notion. IF node: checks if the notes belong to the Marketing team. If the team is Marketing, the node returns true; otherwise it returns false. Mattermost node: sends a message about the new data to the 'Marketing' channel in Mattermost. If you use a different channel, use that instead. You can also replace the Mattermost node with the node of another messaging platform, such as Slack, Telegram, or Discord. NoOp node: adding this node is optional, as its absence makes no difference to the functioning of the workflow.
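A minimal sketch of the IF node's check, assuming the Notion table exposes a "Team" property as a plain string; the property name and shape depend on how your Notion database is set up.

```javascript
// Hedged sketch of the team check performed by the IF node.
// As an n8n expression this would be roughly:
//   {{ $json.Team }}  equals  "Marketing"
const item = { Team: 'Marketing', Notes: 'Q3 campaign sync' }; // hypothetical trigger output
const isMarketing = item.Team === 'Marketing'; // true -> Mattermost node, false -> NoOp node
```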
by David Olusola
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Google Drive MCP Workflow - AI-Powered File Management Automation

Overview
A secure and intelligent n8n workflow that connects with Google Drive via MCP (Model Context Protocol). Ideal for AI agent tasks, compliance-driven storage, and document automation.

Key Features
Built-in safety: backs up files before edits (timestamped), supports rollback using file history, and validates file size, type, and permissions.
Smart organization: automatically converts file types (PDF, DOCX, etc.), moves files to structured folders, and auto-archives old files based on age or rules.
MCP integration: accepts standardized JSON via webhook, executes in real time for AI agents, and offers fully customizable input (action, fileId, format, etc.).

AI-Callable MCP Actions
These are the commands AI agents can perform via MCP:
Download a file (with optional format conversion)
Upload a new file to Google Drive
Copy a file for backup
Move a file to a specific folder
Archive old or inactive files
Organize documents into folders
Convert files to a new format (PDF, DOCX, etc.)
Retrieve and review file history for rollback

Example Input
{ "action": "download", "fileId": "abc123", "folderPath": "/projects/clientA", "convertFormat": "pdf" }

Security & Performance
OAuth2-secured access to the Google Drive API
No sensitive data stored in transit
Real-time audit logs and alerts
Batch-friendly with built-in rate limiting

Ideal For
Businesses automating file management
AI agents retrieving, sorting, converting, or archiving files
Compliance teams needing file versioning and backups

Requirements
n8n + Google Drive API v3
MCP server + webhook integration
Google OAuth2 credentials
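For context, a sketch of how an agent or script might send the example input above to the workflow's webhook. The host and path ("/webhook/gdrive-mcp") are placeholders; use whatever is configured in the workflow's Webhook node, and note that the response shape depends on the workflow's Respond node.

```javascript
// Hedged sketch: posting an MCP-style action to the workflow's webhook (Node 18+, ES module).
const res = await fetch('https://your-n8n-host/webhook/gdrive-mcp', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    action: 'download',
    fileId: 'abc123',
    folderPath: '/projects/clientA',
    convertFormat: 'pdf',
  }),
});
console.log(await res.json()); // whatever the workflow is configured to return
```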
by Harshil Agrawal
This workflow allows you to send position updates of the ISS to a topic in MQTT every minute using the MQTT node. Cron node: triggers the workflow every minute. HTTP Request node: makes a GET request to https://api.wheretheiss.at/v1/satellites/25544/positions to fetch the current position of the ISS and passes the result on to the next node in the workflow. Set node: ensures that only the data set in this node gets passed on to the following nodes. MQTT node: publishes the data from the previous node to the iss-position topic. If you have created a topic with a different name, use that topic instead.
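A rough sketch of what the HTTP Request and Set steps amount to. The field names kept here (name, latitude, longitude, timestamp) and the timestamps query parameter are assumptions about the wheretheiss.at API; verify them against the live response before relying on them.

```javascript
// Hedged sketch: fetch the ISS position and keep only a few fields before publishing.
const now = Math.floor(Date.now() / 1000);
const res = await fetch(
  `https://api.wheretheiss.at/v1/satellites/25544/positions?timestamps=${now}`
);
const [position] = await res.json(); // assumption: the /positions endpoint returns an array
const payload = {
  name: position.name,
  latitude: position.latitude,
  longitude: position.longitude,
  timestamp: position.timestamp,
};
// `payload` is roughly what the MQTT node would publish to the "iss-position" topic.
```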
by Jitesh Dugar
Newsletter Sign-up with Email Verification & Welcome Email Automation

Description
A complete, production-ready newsletter automation workflow that validates email addresses, sends personalized welcome emails, and maintains comprehensive logs in Google Sheets. Perfect for marketing teams, content creators, and businesses looking to build high-quality email lists with minimal manual effort.

Key Features
Email verification: real-time validation using the Verifi Email API; checks email format (RFC compliance), verifies domain existence and MX records, detects disposable/temporary email addresses, and identifies potential spoofed emails.
Automated welcome emails: personalized HTML emails with the subscriber's first name, a mobile-responsive design with gradient headers, branded confirmation and unsubscribe links, sent via Gmail (or SMTP) automatically to valid subscribers.
Smart data handling: comprehensive logging to Google Sheets with three separate tabs, graceful handling of incomplete submissions, preservation of the original user data throughout the verification process, and source attribution for multi-channel campaigns.
Error management: automatic retry logic on API failures, separate logging for different error types, detailed technical reasons for invalid emails, and no data loss thanks to direct webhook referencing.

Use Cases
Newsletter sign-ups on websites and landing pages
Lead generation forms with quality control
Marketing campaigns requiring verified email lists
Community building with automated onboarding
SaaS product launches with email collection
Content creator audience building
E-commerce customer list management

What Gets Logged
Master log (all subscribers): timestamp, name, email, verification result, verification score, email-sent status, source tracking, disposable status, and domain info.
Invalid emails log: detailed rejection reasons, technical diagnostic information, MX record status, RFC compliance, and provider information for troubleshooting.
Invalid submissions log: incomplete form data, missing required fields, and a timestamp for follow-up.

Technical Stack
Trigger: Webhook (POST endpoint)
Email verification: Verifi Email API
Email sending: Gmail OAuth2 (or SMTP)
Data storage: Google Sheets (3 tabs)
Processing: JavaScript Code nodes for data formatting

Setup Requirements
Google account - for Sheets and Gmail integration
Verifi Email API key - https://verifi.email
Google Sheets - pre-configured with 3 tabs (template provided)
5-10 minutes - quick setup with step-by-step instructions included

Benefits
Improve email deliverability - remove invalid emails before sending campaigns
Reduce bounce rates - only send to verified, active email addresses
Save money - don't waste email credits on invalid addresses
Better analytics - track conversion rates by source
Professional onboarding - personalized welcome experience
Scalable solution - handles high-volume sign-ups automatically
Data quality - build a clean, high-quality subscriber list

Customization Options
Email template - fully customizable HTML design
Verification threshold - adjust score requirements
Brand colors - match your company branding
Confirmation flow - add double opt-in if desired
Multiple sources - track different signup forms
Language - easily translate email content

What's Included
Complete n8n workflow JSON (ready to import)
Google Sheets template structure
Responsive HTML email template
Setup documentation with screenshots
Troubleshooting guide
Customization examples

Privacy & Compliance
GDPR-compliant with unsubscribe links
Secure data handling via OAuth2
No data shared with third parties
Audit trail in Google Sheets
Easy data deletion/export

Quick Stats
12 nodes - fully automated workflow
3 data paths - valid, invalid, and incomplete submissions
100% uptime - when properly configured
Instant processing - real-time email verification
Unlimited scale - based on your API limits

Perfect For
Marketing agencies, SaaS companies, content creators, e-commerce stores, community platforms, educational institutions, membership sites, and newsletter publishers.

Why Use This Workflow?
Instead of manually verifying emails or dealing with bounce complaints, this workflow automates the entire process from sign-up to welcome email. Save hours of manual work, improve your email deliverability, and create a professional first impression with every new subscriber. Start building a high-quality email list today!
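A sketch of how a sign-up form might call the workflow's webhook trigger. The field names (firstName, email, source) and the path are placeholders; match them to your Webhook node and to the column mapping in your Google Sheets tabs.

```javascript
// Hedged sketch: a sign-up form posting to the newsletter webhook (Node 18+, ES module).
await fetch('https://your-n8n-host/webhook/newsletter-signup', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    firstName: 'Ada',               // used to personalize the welcome email
    email: 'ada@example.com',       // verified via the Verifi Email API
    source: 'landing-page',         // tracked for source attribution in the master log
  }),
});
```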
by Brian
This template automates posting to Instagram Business accounts and Facebook Pages using the Meta Graph API. It supports both short-lived and long-lived tokens, with a secure approach using System User tokens for reliable, ongoing automation. Includes detailed guidance on authentication, token refresh logic, and API use.
Features:
Publish to Instagram via /media + /media_publish
Post to Facebook Pages via /photos
Long-lived token support via a Meta Business System User
Token refresh support using staticData in n8n
In-line sticky note instructions
Use Cases:
Schedule and publish branded social media content
Automate marketing flows with CRM + social sync
Empower internal teams or clients to post without manual steps
Tags: Instagram, Facebook, Meta Graph API, Social Media, Token Refresh, Long-Lived Token, Marketing Automation, System User
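For orientation, a sketch of the two-step Instagram publish that the /media + /media_publish endpoints imply, roughly what the template's HTTP Request nodes do. IG_USER_ID, the image URL, and the token variable are placeholders, and the Graph API version may differ from the one used in the workflow.

```javascript
// Hedged sketch of publishing an image to an Instagram Business account (Node 18+, ES module).
const base = 'https://graph.facebook.com/v19.0';
const token = process.env.META_SYSTEM_USER_TOKEN; // long-lived System User token

// 1. Create a media container
const container = await fetch(`${base}/IG_USER_ID/media`, {
  method: 'POST',
  body: new URLSearchParams({
    image_url: 'https://example.com/post.jpg',
    caption: 'Hello from n8n!',
    access_token: token,
  }),
}).then((r) => r.json());

// 2. Publish the container
const published = await fetch(`${base}/IG_USER_ID/media_publish`, {
  method: 'POST',
  body: new URLSearchParams({ creation_id: container.id, access_token: token }),
}).then((r) => r.json());

console.log(published.id); // ID of the published Instagram media object
```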
by Shahrear
AI-Powered Contract Management Pipeline (Google Drive + VLM Run + Sheets + Calendar + Slack)

What This Workflow Does
This workflow automatically extracts, organizes, and tracks legal contract details from documents uploaded to Google Drive. Using VLM Run's Execute Agent, it parses key metadata such as contract ID, parties, dates, and terms, then stores the data, sends alerts, and schedules reminders through Google Sheets, Calendar, and Slack.

Requirements
Google Drive OAuth2 for monitoring and downloads
VLM Run API credentials with Execute Agent access
Google Sheets OAuth2 for structured record storage
Google Calendar OAuth2 for key-date reminders
Slack API credentials for team notifications
A reachable webhook URL (for receiving parsed contract data)

Quick Setup
Configure Google Drive OAuth2 and create an upload folder plus a folder for saving extracted images.
Install the verified VLM Run node by searching for VLM Run in the node list, then click Install. Once installed, you can start using it in your workflows.
Add VLM Run API credentials for document parsing.
Configure Google Sheets and Calendar. For Google Sheets, pick your spreadsheet from the document list (e.g., test), then select the sheet inside it (e.g., Sheet1). Set the operation to Append Row; this will add new contract details as new rows. Turn on Map Each Column Manually and match each contract field (like Contract ID, Title, Parties, Effective Date, Termination Date) to its corresponding column in your Google Sheet.
Configure Slack for notifications.

How It Works
Monitor Contract Uploads - watches a target Google Drive folder for new file uploads (PDFs, images, or scans).
Download Contract File - automatically downloads new contracts for AI analysis.
VLM Run ContractParser - sends the file to the VLM Run Execute Agent, which extracts structured contract data, including: contract ID, title, parties (with roles), property address, effective date, termination date, rent, deposit, payment terms, and governing law.
Receive Contract Data - the webhook endpoint receives the structured JSON response.
Format Contract Data - normalizes fields, formats dates, and prepares the data for storage.
Save to Expense Database (Google Sheets) - appends the extracted data to a master Google Sheet for centralized contract tracking.
Notify via Slack - posts a concise summary to a Slack channel, showing key contract details for visibility.
Create Calendar Events - automatically schedules Google Calendar events for the effective date, the termination date, and a renewal reminder (60 days before termination).

Why Use This Workflow
Manual contract management is error-prone and time-consuming: key details like renewal dates, payment terms, or termination clauses often get lost in email threads or folders. This workflow ensures:
Zero missed deadlines - automatic Google Calendar reminders keep your team on track.
Instant team visibility - Slack notifications keep legal, finance, and operations aligned.
End-to-end automation - no need for manual parsing, data entry, or follow-ups.

Perfect For
Legal teams automating contract intake and tracking
Real estate or lease management workflows
Finance or procurement teams needing expiration alerts
Organizations centralizing contract metadata in Sheets

How to Customize
Modify extraction fields: edit the VLM Run Execute Agent schema to add fields like contract value, payment schedule, department, or contact email.
Change storage: swap Google Sheets for Airtable, Notion, or BigQuery if you manage large datasets or need relational tracking.
Customize notifications: send Slack alerts only for high-value or expiring contracts, and tag relevant teams (e.g., @legal, @finance).
Add calendar events: auto-create events for reviews or payment milestones using extra date fields.
Add approvals or signatures: insert a Google Form or Slack approval step, or trigger DocuSign for e-signature automation.

Community Node Disclaimer
This workflow uses community nodes (VLM Run) that may need additional permissions and custom setup.
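To make the data flow concrete, here is a hypothetical example of the structured payload the webhook endpoint might receive after parsing. The field names and values are illustrative only; the actual keys depend on the schema you configure in the VLM Run Execute Agent.

```javascript
// Hedged example of a parsed-contract payload (all values are made up for illustration).
const contract = {
  contract_id: 'LSE-2024-0042',
  title: 'Commercial Lease Agreement',
  parties: [
    { name: 'Acme Properties LLC', role: 'Landlord' },
    { name: 'Globex Corp', role: 'Tenant' },
  ],
  property_address: '123 Main St, Springfield',
  effective_date: '2024-07-01',
  termination_date: '2026-06-30',
  rent: '4,500 USD per month',
  deposit: '9,000 USD',
  payment_terms: 'Due on the 1st of each month',
  governing_law: 'State of Illinois',
};
// The Format Contract Data step would map these fields to Sheet columns,
// the Slack summary, and the three Calendar events described above.
```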
by Priya Jain
This workflow provides an OAuth 2.0 token refresh process for better control. Developers can use it as an alternative to n8n's built-in OAuth flow to achieve improved control and visibility. In this template I've used the Pipedrive API, but you can apply it to any app that uses the authorization_code grant for token access. It solves the problem of manually refreshing the OAuth 2.0 token when it expires, or when n8n's native OAuth stops working.
What you need to replicate this
A database with a pre-existing table for storing authentication tokens and associated information. I'm using Supabase in this example, but you can also use a self-hosted MySQL. Here's a quick video on setting up the Supabase table.
A client app for the application that you want to access via the API.
After duplicating the template: add credentials to your database and connect the DB nodes in all 3 workflows. Enable/publish the first workflow, "1. Generate and Save Pipedrive tokens to Database." Open your client app and follow the Pipedrive instructions to authenticate, then click on Install and test. This will save your initial refresh token and access token to the database.
Please watch the YouTube video for a detailed demonstration of the workflow.
How it operates
Workflow 1: captures the authorization_code, generates the access_token and refresh token, and saves them to the database.
Workflow 2: your primary workflow that fetches or posts data to/from your application. Note the IF condition that catches an invalid-token error; it triggers the third workflow to refresh the token.
Workflow 3: handles the token refresh. Remember to send the unique ID to the webhook so it can fetch the necessary tokens from your table.
Detailed demonstration of the workflow: https://youtu.be/6nXi_yverss
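As a rough illustration of what the refresh step in workflow 3 boils down to, here is a sketch of a standard OAuth 2.0 refresh call against Pipedrive's token endpoint. The endpoint, Basic-auth scheme, and placeholder credentials are assumptions drawn from the general authorization_code flow; check your provider's documentation for the exact contract.

```javascript
// Hedged sketch of an OAuth 2.0 token refresh (Node 18+, ES module).
const CLIENT_ID = 'your-client-id';           // from your OAuth client app
const CLIENT_SECRET = 'your-client-secret';
const savedRefreshToken = 'refresh-token-loaded-from-your-database';

const res = await fetch('https://oauth.pipedrive.com/oauth/token', {
  method: 'POST',
  headers: {
    Authorization: 'Basic ' + Buffer.from(`${CLIENT_ID}:${CLIENT_SECRET}`).toString('base64'),
  },
  body: new URLSearchParams({
    grant_type: 'refresh_token',
    refresh_token: savedRefreshToken,
  }),
});
const { access_token, refresh_token } = await res.json();
// Persist both values back to your tokens table so the main workflow can reuse them.
```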
by Daniel Ng
Auto Backup n8n Workflows to Google Drive
Imagine the sinking feeling: hours, weeks, or even months of meticulous work building your n8n workflows, suddenly gone. A server crash, an accidental deletion, data corruption, or an unexpected platform issue, and all your automated processes vanish. Without a reliable backup system, you're facing a complete rebuild from scratch, a scenario that's not just frustrating but can be catastrophic for business operations. Furthermore, consider the daunting task of migrating your n8n instance to a new host or server. Manually exporting each workflow one by one, then painstakingly importing them into the new environment, is not only incredibly time-consuming, especially if you have tens or hundreds of workflows, but also highly prone to errors and omissions. You need a systematic, automated solution.
This workflow provides a robust solution for automatically backing up all your n8n workflows to Google Drive on a schedule (hourly by default). It creates a uniquely named folder for each backup run, incorporating the date and hour, and then uploads each workflow as an individual JSON file. To manage storage space, the workflow also includes a cleanup mechanism that deletes backup folders older than a user-defined retention period (7 days by default). Ideally, this backup workflow should be used in conjunction with a restore solution like our "Restore Workflows from Google Drive Backups" template. For more powerful n8n templates, visit our website or contact us at AI Automation Pro. We help your business build custom AI workflow automation and apps.
Feature highlights
Triggers on a schedule (hourly by default).
Creates an n8n_backup_YYYY-MM-DD_HH folder in Google Drive.
Fetches all n8n workflows.
Saves each workflow as a JSON file to the new folder.
Deletes backup folders older than the "Coverage Period" (7 days by default).
Who is this for?
This template is designed for:
n8n administrators and developers who need a reliable, automated system to safeguard their workflows against accidental loss, corruption, or system issues.
Proactive n8n users who want to maintain a version history of their workflows, enabling easy rollback to previous configurations if necessary.
Organizations seeking to implement disaster recovery and data integrity practices for their n8n automation infrastructure.
What problem is this workflow solving? / use case
This workflow directly addresses these critical risks and challenges by:
Automating backups: eliminates the manual effort and inconsistency of ad-hoc backups, ensuring your workflows are regularly and reliably saved.
Preventing data loss: safeguards your valuable automation assets against unforeseen disasters by creating secure, versioned copies in Google Drive.
Facilitating migration and recovery: provides the foundational backups needed for a smoother, more systematic migration or a full disaster recovery, allowing you to restore your operations efficiently.
Version control: by storing scheduled backups (hourly by default), it allows you to access and restore previous versions of your workflows, offering an undo capability for significant changes or corruptions.
Storage management: automatically removes old backups based on a configurable retention period, preventing excessive use of Google Drive storage while keeping a relevant history.
What this workflow does
Scheduled Trigger: runs automatically every hour.
Timestamping: fetches the current date and hour to create a unique name for the backup folder.
Folder Creation: creates a new folder in a specified Google Drive location, named in the format n8n_backup_YYYY-MM-DD_HH.
Workflow Retrieval: connects to your n8n instance via its API and fetches a list of all existing workflows.
Individual Backup: processes each workflow one by one: converts the workflow data to a binary JSON file, uploads the JSON file (named after the workflow) to the hourly backup folder in Google Drive, and waits briefly between uploads to respect potential API rate limits.
Old Backup Deletion: calculates a cut-off date based on the "Coverage Period" set in the "Settings" node (e.g., 7 days prior to the current date), searches Google Drive for backup folders (matching the naming convention) that are older than this cut-off date, and deletes them to free up storage space.
Step-by-step setup
Import template: upload the provided JSON file into your n8n instance.
Configure credentials: for the Google Drive nodes, create or select existing Google Drive OAuth2 API credentials; for the n8n node (the node that fetches workflows), configure n8n API credentials to allow the workflow to access your instance's workflow data.
Specify the Google Drive backup location: open the "Google Drive Backup Folder Every Hour" node. Under the "Drive ID" parameter, select the drive from the list or provide its ID. Under the "Folder ID" parameter, select or input the ID of the parent folder in Google Drive where you want the n8n_backup_YYYY-MM-DD_HH folders to be created (e.g., a general "n8n_Backups" folder).
Set the backup retention period: open the "Settings" node and modify the value of "Coverage Period" (default is 7). This number represents how many days backups should be kept before being deleted.
Activate the workflow: toggle the "Active" switch for the workflow in your n8n dashboard.
How to customize this workflow to your needs
Backup frequency: adjust the "Rule" in the "Schedule Trigger" node to change the backup interval (e.g., daily, specific times).
Folder/file naming: modify the expressions in the "Parameters" tab of the "Google Drive Backup Folder Every Hour" node (for the folder name) or the "Google Drive Upload Workflows" node (for the file name) if you require a different naming convention.
Targeted backups: to back up only specific workflows, insert a "Filter" node after the "n8n" node to filter workflows based on criteria like name, tags, or ID before they reach the "Move Binary Data" node.
Wait time: the "Wait" node is set to 3 seconds between uploads. If you have a very large number of workflows or encounter rate limiting, you might adjust this duration.
Error workflow: the workflow is pre-configured with an "Error Workflow" setting. Ensure this error workflow exists in your n8n instance, or update the setting to point to your preferred error handling workflow. This can be used to send notifications on failure.
Important considerations
Resource usage: while the workflow includes a wait step between individual workflow uploads to minimize load, backing up an extremely large number of workflows could still consume resources on your n8n instance and make many API calls to Google Drive. Monitor performance if you have thousands of workflows.
Testing the restore process: regularly test restoring a few workflows from your Google Drive backups using the companion "Restore All n8n Workflows from Google Drive" template or a manual import. This verifies the integrity of your backups and ensures you can recover when needed.
Workflow modifications: if you modify this backup workflow (e.g., change the folder naming convention), ensure your restore process or workflow is also updated to match these changes.
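For reference, a small sketch of the date logic this workflow relies on: building the hourly folder name and the retention cut-off used to delete old backup folders. In the template itself this is done with n8n expressions and the "Settings" node; the 7-day value below mirrors the default "Coverage Period".

```javascript
// Hedged sketch of the folder-name and cut-off calculations.
const now = new Date();
const pad = (n) => String(n).padStart(2, '0');
const folderName = `n8n_backup_${now.getFullYear()}-${pad(now.getMonth() + 1)}-${pad(now.getDate())}_${pad(now.getHours())}`;
// e.g. "n8n_backup_2024-07-01_14"

const coveragePeriodDays = 7; // default "Coverage Period"
const cutOff = new Date(now.getTime() - coveragePeriodDays * 24 * 60 * 60 * 1000);
// Any backup folder whose encoded date is older than `cutOff` is deleted.
```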
by Lorena
This workflow ensures gender-inclusive language in Mattermost channels. If someone addresses the group with "guys" or "gals", a bot promptly replies with: "May I suggest 'folks' or 'y'all'? We use gender-inclusive language here." Webhook node: triggers the workflow when a new message is posted in Mattermost. IF node: checks whether the message includes the word "guys" or "gals". If false, it takes no action; if true, it triggers the Mattermost node. Mattermost node: posts the language warning message in the Mattermost channel.
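A minimal sketch of the check the IF node performs, assuming the Mattermost outgoing webhook delivers the message text under a body.text field; the exact path depends on how the Webhook node receives the payload.

```javascript
// Hedged sketch of the word check, expressed as an n8n expression and as plain JS.
//   {{ /\b(guys|gals)\b/i.test($json.body.text) }}
const body = { text: 'Hey guys, standup in 5!' }; // hypothetical webhook payload
const needsNudge = /\b(guys|gals)\b/i.test(body.text); // true -> post the reminder message
```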
by Shrey
This workflow can be used to save all of your workflows in two places: in a raw state (as JSON files in Dropbox) and in an Airtable base, in a pre-designed format. It runs periodically (currently every 30 minutes) and, for each workflow, either updates the existing record in Airtable or creates a new one. Here's the Airtable base to give you an idea: View Airtable base. Note: this workflow uses the "http://localhost:5678/rest" API, which the UI editor uses but which is still not officially supported. It may therefore suffer breaking changes at some point in the future, and the workflow might stop working.
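For context, a sketch of the unofficial call this workflow depends on: listing all workflows from the local editor API. The "/rest/workflows" path and the response shape are assumptions about an unsupported interface and may change between n8n versions.

```javascript
// Hedged sketch: fetching workflows from the unofficial editor REST API (Node 18+, ES module).
const res = await fetch('http://localhost:5678/rest/workflows');
const { data: workflows } = await res.json(); // assumption: results are wrapped in a "data" array
for (const wf of workflows) {
  console.log(wf.id, wf.name); // each entry would then be saved to Dropbox and upserted into Airtable
}
```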