by Harshil Agrawal
This workflow handles incoming issues and issue comments for your open-source project. If a contributor is interested, the workflow assigns them the issue. Note: for organizations, you will have to use the Webhook node to trigger the workflow, and the HTTP Request node instead of the regular GitHub node. You can learn more about this workflow by reading the blog at https://n8n.io/blog.
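For the organization case, where the HTTP Request node replaces the GitHub node, the assignment call it makes would look roughly like the sketch below. This is an illustration against the public GitHub REST API; the owner, repo, issue number, and token are placeholders, not values from the template.

```typescript
// Hypothetical sketch: assign an interested contributor to an issue via the
// GitHub REST API (what the HTTP Request node would call). All identifiers
// below are placeholders.
const owner = "your-org";
const repo = "your-repo";
const issueNumber = 42;

async function assignContributor(username: string, token: string): Promise<void> {
  const res = await fetch(
    `https://api.github.com/repos/${owner}/${repo}/issues/${issueNumber}/assignees`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${token}`,
        Accept: "application/vnd.github+json",
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ assignees: [username] }),
    },
  );
  if (!res.ok) throw new Error(`GitHub API returned ${res.status}`);
}
```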
by Lorena
This workflow is triggered when a meeting is scheduled via Calendly. An activity is then automatically created in Pipedrive, and 15 minutes after the end of the meeting a message is sent to the interviewer in Slack, reminding them to write down their notes and insights from the meeting.
by Harshil Agrawal
This workflow demonstrates how the $runIndex expression can be used to avoid an infinite loop. The workflow creates 5 tweets with the content 'Hello from n8n!'. You can adapt this workflow by replacing the Twitter node with any other node(s) and updating the condition in the IF node.
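The exact IF-node condition is not reproduced here, but the idea it implements is a bounded loop: keep a counter of how many times the branch has run and stop once the target count is reached. A minimal TypeScript sketch of that idea, with the counter standing in for $runIndex:

```typescript
// Minimal sketch of the loop-guard idea: the counter plays the role of
// $runIndex, and the comparison plays the role of the IF node's condition.
const maxRuns = 5;

for (let runIndex = 0; runIndex < maxRuns; runIndex++) {
  // In the workflow, this is where the Twitter node posts the tweet.
  console.log(`Run ${runIndex}: Hello from n8n!`);
}
// Without the comparison against maxRuns the loop would never terminate,
// which is the infinite-loop risk the IF node guards against.
```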
by Zacharia Kimotho
What problem is this workflow solving?
This workflow is aimed at email marketing enthusiasts looking for an easy way to extract the domain from an email address and also check whether its syntax is correct, without having to use the Code node.
How this works
- Replace the debugger node with your actual data source.
- Map your data to match the layout above.
- Run your workflow and check which emails are valid and which are not.
- Once done, you will have a list of all your emails, their domains, and whether they are valid.
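For reference, the check the workflow performs (without a Code node) amounts to the logic sketched below. The regex and field names here are illustrative assumptions, not the exact rules used by the template's nodes.

```typescript
// Illustrative sketch of the validation and domain extraction the workflow
// performs. The regex is an assumption, not the template's exact rule.
interface EmailCheck {
  email: string;
  domain: string | null;
  valid: boolean;
}

function checkEmail(email: string): EmailCheck {
  const valid = /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email);
  const domain = valid ? email.split("@")[1] : null;
  return { email, domain, valid };
}

console.log(checkEmail("jane.doe@example.com"));
// -> { email: "jane.doe@example.com", domain: "example.com", valid: true }
```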
by Harshil Agrawal
This workflow allows you to receive a Mattermost message when meeting notes get added to Notion.
Prerequisites
- Create a table in Notion similar to this: Meeting Notes
- Follow the steps mentioned in the documentation to create credentials for the Notion Trigger node.
- Create credentials for Mattermost.
Notion Trigger node: triggers the workflow when new data gets added to Notion.
IF node: checks whether the notes belong to the Marketing team. If the team is Marketing, the node returns true; otherwise, false.
Mattermost node: sends a message about the new data to the 'Marketing' channel in Mattermost. If you have a different channel, use that instead. You can also replace the Mattermost node with nodes of other messaging platforms, like Slack, Telegram, or Discord.
NoOp node: adding this node is optional, as its absence won't make a difference to the functioning of the workflow.
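The IF node's branching boils down to a simple equality check on the team field. A small sketch of that routing, where the property name and message text are assumptions for illustration:

```typescript
// Sketch of the IF node's routing. The "team" property name and the message
// wording are illustrative assumptions.
interface MeetingNote {
  title: string;
  team: string;
}

function routeNote(note: MeetingNote): string | null {
  if (note.team === "Marketing") {
    // True branch: the Mattermost node posts to the 'Marketing' channel.
    return `New meeting notes added in Notion: ${note.title}`;
  }
  // False branch: the NoOp node, nothing to send.
  return null;
}
```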
by David Olusola
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.
📁 Google Drive MCP Workflow – AI-Powered File Management Automation 🚀
🧠 Overview
A secure and intelligent n8n workflow that connects with Google Drive via MCP (Model Context Protocol). Ideal for AI agent tasks, compliance-driven storage, and document automation.
🌟 Key Features
🔒 Built-In Safety
- Backs up files before edits (timestamped)
- Supports rollback using file history
- Validates file size, type, and permissions
📁 Smart Organization
- Automatically converts file types (PDF, DOCX, etc.)
- Moves files to structured folders
- Auto-archives old files based on age or rules
🔄 MCP Integration
- Accepts standardized JSON via webhook
- Real-time execution for AI agents
- Fully customizable input (action, fileId, format, etc.)
✅ AI Callable MCP Actions
These are the commands AI agents can perform via MCP:
- Download a file (with optional format conversion)
- Upload a new file to Google Drive
- Copy a file for backup
- Move a file to a specific folder
- Archive old or inactive files
- Organize documents into folders
- Convert files to a new format (PDF, DOCX, etc.)
- Retrieve and review file history for rollback
📝 Example Input
{ "action": "download", "fileId": "abc123", "folderPath": "/projects/clientA", "convertFormat": "pdf" }
🔐 Security & Performance
- OAuth2-secured access to the Google Drive API
- No sensitive data stored in transit
- Real-time audit logs and alerts
- Batch-friendly with built-in rate limiting
📌 Ideal For
- Businesses automating file management
- AI agents retrieving, sorting, converting, or archiving files
- Compliance teams needing file versioning and backups
⚙️ Requirements
- n8n + Google Drive API v3
- MCP server + webhook integration
- Google OAuth2 credentials
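To illustrate the MCP integration, this is roughly how an agent (or any client) could send the documented example input to the workflow's webhook. The webhook URL is a placeholder, not the template's actual path.

```typescript
// Hypothetical sketch: post the documented MCP-style payload to the
// workflow's webhook. The URL is a placeholder.
const webhookUrl = "https://your-n8n-host/webhook/google-drive-mcp";

async function requestDownload(): Promise<unknown> {
  const res = await fetch(webhookUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      action: "download",
      fileId: "abc123",
      folderPath: "/projects/clientA",
      convertFormat: "pdf",
    }),
  });
  return res.json();
}
```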
by Harshil Agrawal
This workflow allows you to send position updates of the ISS every minute to a topic in MQTT using the MQTT node.
Cron node: triggers the workflow every minute.
HTTP Request node: makes a GET request to the API https://api.wheretheiss.at/v1/satellites/25544/positions to fetch the position of the ISS. This information gets passed on to the next node in the workflow.
Set node: ensures that only the data that we set in this node gets passed on to the next nodes in the workflow.
MQTT node: sends the data from the previous node to the iss-position topic. If you have created a topic with a different name, use that topic instead.
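Outside of n8n, the HTTP Request and Set node steps amount to the sketch below: fetch the position and keep only a few fields before publishing. The timestamps query parameter and the selected field names are assumptions about the public API and the Set node, not taken from the template itself.

```typescript
// Sketch of the fetch-and-trim step (HTTP Request + Set nodes). The
// timestamps parameter and kept fields are assumptions for illustration.
interface IssPosition {
  latitude: number;
  longitude: number;
  timestamp: number;
}

async function fetchIssPosition(): Promise<IssPosition> {
  const now = Math.floor(Date.now() / 1000);
  const res = await fetch(
    `https://api.wheretheiss.at/v1/satellites/25544/positions?timestamps=${now}`,
  );
  const [position] = (await res.json()) as Array<IssPosition>;
  return {
    latitude: position.latitude,
    longitude: position.longitude,
    timestamp: position.timestamp,
  };
}
```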
by Brian
This template automates posting to Instagram Business and Facebook Pages using the Meta Graph API. It supports both short-lived and long-lived tokens, with a secure approach using System User tokens for reliable, ongoing automation. Includes detailed guidance for authentication, token refresh logic, and API use.
Features:
- 📸 Publish to Instagram via /media + /media_publish
- 📘 Post to Facebook Pages via /photos
- 🔐 Long-lived token support via Meta Business System User
- ♻️ Token refresh support using staticData in n8n
- 🧠 In-line sticky note instructions
Use Cases:
- Schedule and publish branded social media content
- Automate marketing flows with CRM + social sync
- Empower internal teams or clients to post without manual steps
Tags: Instagram, Facebook, Meta Graph API, Social Media, Token Refresh, Long-Lived Token, Marketing Automation, System User
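For context, the Instagram publish is a two-step Graph API sequence: create a media container via /media, then publish it via /media_publish. A hedged sketch of those calls, where the Graph API version, IDs, and token are placeholders:

```typescript
// Sketch of the two-step Instagram publish (/media then /media_publish).
// The API version, user ID, and token are placeholders.
const GRAPH = "https://graph.facebook.com/v19.0";

async function publishToInstagram(
  igUserId: string,
  accessToken: string,
  imageUrl: string,
  caption: string,
): Promise<string> {
  // Step 1: create a media container.
  const create = await fetch(`${GRAPH}/${igUserId}/media`, {
    method: "POST",
    body: new URLSearchParams({ image_url: imageUrl, caption, access_token: accessToken }),
  });
  const { id: creationId } = (await create.json()) as { id: string };

  // Step 2: publish the container.
  const publish = await fetch(`${GRAPH}/${igUserId}/media_publish`, {
    method: "POST",
    body: new URLSearchParams({ creation_id: creationId, access_token: accessToken }),
  });
  const { id: mediaId } = (await publish.json()) as { id: string };
  return mediaId;
}
```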
by Ranjan Dailata
This workflow automates AI-powered search insights by combining SE Ranking AI Search data with OpenAI summarization. It starts with a manual trigger and fetches the time-series AI visibility data via the SE Ranking API. The response is summarized using OpenAI to produce both detailed and concise insights. The workflow enriches the original metrics with these AI-generated summaries and exports the final structured JSON to disk, making it ready for reporting, analytics, or further automation.
Who this is for
This workflow is designed for:
- SEO professionals & growth marketers tracking AI search visibility
- Content strategists analyzing how brands appear in AI-powered search results
- Data & automation engineers building SEO intelligence pipelines
- Agencies producing automated search performance reports for clients
What problem is this workflow solving?
SE Ranking's AI Search API provides rich but highly technical time-series data. While powerful, this data:
- Is difficult to interpret quickly
- Requires manual analysis to extract insights
- Is not presentation-ready for reports or stakeholders
This workflow solves that by automatically transforming raw AI search metrics into clear, structured summaries, saving time and reducing analysis friction.
What this workflow does
At a high level, the workflow:
- Accepts input parameters such as target domain, AI engine, and region
- Fetches AI search visibility time-series data from SE Ranking
- Uses OpenAI GPT-4.1-mini to generate a comprehensive summary and a concise abstract summary
- Enriches the original dataset with the AI-generated insights
- Exports the final structured JSON to disk for reporting, dashboards, or further automation and analytics
Setup
Prerequisites
- n8n (self-hosted or cloud)
- SE Ranking API access
- OpenAI API key
Setup steps
- If you are new to SE Ranking, sign up on seranking.com
- Import the workflow JSON into n8n
- Configure credentials: SE Ranking uses HTTP Header Authentication; make sure the header value contains the word Token, followed by a space and the SE Ranking API key. OpenAI is used for GPT-4.1-mini.
- Open the "Set the Input Fields" node and update: target_site (e.g., your domain), engine (e.g., ai-overview), source (e.g., us, uk, in)
- Verify the file path in "Write File to Disk"
- Click Execute Workflow
How to customize this workflow to your needs
You can easily extend or tailor this workflow:
- Change analysis scope: update the domain, region, or AI engine
- Modify AI outputs: adjust prompts or the output schema for insights like trends, risks, or recommendations
- Replace storage: send output to Google Sheets, databases, S3 / cloud storage, or webhooks and BI tools
- Automate monitoring: add a Cron trigger to run daily, weekly, or monthly
Summary
This workflow turns raw SE Ranking AI Search data into clear, executive-ready insights using OpenAI GPT-4.1-mini. By combining automated data collection with AI summarization, it enables faster decision-making, better reporting, and scalable SEO intelligence without manual analysis.
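The header authentication mentioned in the setup steps translates to an Authorization header whose value is the word Token, a space, then the API key. The sketch below shows that header shape; the endpoint path and query parameters are placeholders, not the actual SE Ranking AI Search route.

```typescript
// Sketch of the header-authenticated request the HTTP credential represents.
// The endpoint path and parameters are placeholders.
async function fetchAiVisibility(apiKey: string, targetSite: string): Promise<unknown> {
  const url = new URL("https://api.seranking.com/REPLACE_WITH_AI_SEARCH_ENDPOINT");
  url.searchParams.set("target_site", targetSite);
  url.searchParams.set("engine", "ai-overview");
  url.searchParams.set("source", "us");

  const res = await fetch(url, {
    // Header value format described in the setup steps: "Token <API key>".
    headers: { Authorization: `Token ${apiKey}` },
  });
  return res.json();
}
```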
by Priya Jain
This workflow provides an OAuth 2.0 token refresh process for better control. Developers can use it as an alternative to n8n's built-in OAuth flow to achieve improved control and visibility. In this template I've used the Pipedrive API, but you can apply it to any app that requires the authorization_code grant for token access. This resolves the issue of manually refreshing the OAuth 2.0 token when it expires, or when n8n's native OAuth stops working.
What you need to replicate this
- Your database with a pre-existing table for storing authentication tokens and associated information. I'm using Supabase in this example, but you can also employ a self-hosted MySQL. Here's a quick video on setting up the Supabase table.
- A client app for the application that you want to access via the API.
After duplicating the template:
a. Add credentials to your database and connect the DB nodes in all 3 workflows.
b. Enable/publish the first workflow, "1. Generate and Save Pipedrive tokens to Database".
c. Open your client app and follow the Pipedrive instructions to authenticate.
d. Click on Install and test. This will save your initial refresh token and access token to the database.
Please watch the YouTube video for a detailed demonstration of the workflow.
How it operates
- Workflow 1: captures the authorization_code, generates the access_token and refresh_token, and then saves the tokens to the database.
- Workflow 2: your primary workflow, which fetches or posts data to/from your application. Note the logic that adds an IF condition when an error occurs with an invalid token; this triggers the third workflow to refresh the token.
- Workflow 3: handles the token refresh. Remember to send the unique ID to the webhook to fetch the necessary tokens from your table.
Detailed demonstration of the workflow: https://youtu.be/6nXi_yverss
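As an illustration of what workflow 3 does, the refresh itself is a standard OAuth 2.0 refresh_token grant. The sketch below shows that call for Pipedrive, assuming its standard token endpoint and Basic client authentication; adapt the endpoint and auth scheme for other apps.

```typescript
// Sketch of the refresh_token grant (workflow 3), shown for Pipedrive.
// Endpoint and auth scheme are assumptions based on the standard OAuth flow.
async function refreshAccessToken(
  clientId: string,
  clientSecret: string,
  refreshToken: string,
): Promise<{ access_token: string; refresh_token: string; expires_in: number }> {
  const res = await fetch("https://oauth.pipedrive.com/oauth/token", {
    method: "POST",
    headers: {
      Authorization: "Basic " + Buffer.from(`${clientId}:${clientSecret}`).toString("base64"),
    },
    body: new URLSearchParams({
      grant_type: "refresh_token",
      refresh_token: refreshToken,
    }),
  });
  if (!res.ok) throw new Error(`Token refresh failed with status ${res.status}`);
  // The returned access_token and refresh_token should then be written back
  // to the database row identified by the unique ID sent to the webhook.
  return res.json();
}
```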
by Daniel Ng
Auto Backup n8n Workflows to Google Drive
Imagine the sinking feeling: hours, weeks, or even months of meticulous work building your n8n workflows, suddenly gone. A server crash, an accidental deletion, data corruption, or an unexpected platform issue – and all your automated processes vanish. Without a reliable backup system, you're facing a complete rebuild from scratch, a scenario that's not just frustrating but can be catastrophic for business operations.
Furthermore, consider the daunting task of migrating your n8n instance to a new host or server. Manually exporting each workflow, one by one, then painstakingly importing them into the new environment is not only incredibly time-consuming, especially if you have tens or hundreds of workflows, but also highly prone to errors and omissions. You need a systematic, automated solution.
This workflow provides a robust solution for automatically backing up all your n8n workflows to Google Drive on a schedule (hourly by default). It creates a uniquely named folder for each backup instance, incorporating the date and hour, and then systematically uploads each workflow as an individual JSON file. To manage storage space, the workflow also includes a cleanup mechanism that deletes backup folders older than a user-defined retention period (defaulting to 7 days).
Ideally, this backup workflow should be used in conjunction with a restore solution like our "Restore Workflows from Google Drive Backups" template. For more powerful n8n templates, visit our website or contact us at AI Automation Pro. We help your business build custom AI workflow automation and apps.
Feature highlights
- Triggers on a schedule (hourly by default).
- Creates an n8n_backup_YYYY-MM-DD_HH folder in Google Drive.
- Fetches all n8n workflows.
- Saves each workflow as a JSON file to the new folder.
- Deletes backup folders older than the 'Coverage Period' (7 days by default).
Who is this for?
This template is designed for:
- n8n administrators and developers who need a reliable, automated system to safeguard their workflows against accidental loss, corruption, or system issues.
- Proactive n8n users who want to maintain a version history of their workflows, enabling easy rollback to previous configurations if necessary.
- Organizations seeking to implement disaster recovery and data integrity practices for their n8n automation infrastructure.
What problem is this workflow solving? / use case
This workflow directly addresses these critical risks and challenges by:
- Automating backups: eliminates the manual effort and inconsistency of ad-hoc backups, ensuring your workflows are regularly and reliably saved.
- Preventing data loss: safeguards your valuable automation assets against unforeseen disasters by creating secure, versioned copies in Google Drive.
- Facilitating migration & recovery: provides the foundational backups needed for a smoother, more systematic migration or a full disaster recovery, allowing you to restore your operations efficiently.
- Version control: by storing scheduled backups (hourly by default), it allows you to access and restore previous versions of your workflows, offering an undo capability for significant changes or corruptions.
- Storage management: automatically removes old backups based on a configurable retention period, preventing excessive use of Google Drive storage while keeping a relevant history.
What this workflow does
- Scheduled trigger: runs automatically every hour.
- Timestamping: fetches the current date and hour to create a unique name for the backup folder.
- Folder creation: creates a new folder in a specified Google Drive location, named in the format n8n_backup_YYYY-MM-DD_HH (see the sketch after this section).
- Workflow retrieval: connects to your n8n instance via its API and fetches a list of all existing workflows.
- Individual backup: processes each workflow one by one: converts the workflow data to a binary JSON file, uploads the JSON file (named after the workflow) to the hourly backup folder in Google Drive, and includes a short wait step between uploads to respect potential API rate limits.
- Old backup deletion: calculates a cut-off date based on the "Coverage Period" set in the "Settings" node (e.g., 7 days prior to the current date), searches Google Drive for backup folders (matching the naming convention) that are older than this cut-off date, and deletes these identified old backup folders to free up storage space.
Step-by-step setup
- Import the template: upload the provided JSON file into your n8n instance.
- Configure credentials: for the Google Drive nodes, create or select existing Google Drive OAuth2 API credentials; for the n8n node (the node that fetches workflows), configure n8n API credentials to allow the workflow to access your instance's workflow data.
- Specify the Google Drive backup location: open the "Google Drive Backup Folder Every Hour" node. Under the "Drive ID" parameter, select the drive from the list or provide its ID. Under the "Folder ID" parameter, select or input the ID of the parent folder in Google Drive where you want the n8n_backup_YYYY-MM-DD_HH folders to be created (e.g., a general "n8n_Backups" folder).
- Set the backup retention period: open the "Settings" node and modify the value for "Coverage Period" (default is 7). This number represents the number of days backups should be kept before being deleted.
- Activate the workflow: toggle the "Active" switch for the workflow in your n8n dashboard.
How to customize this workflow to your needs
- Backup frequency: adjust the "Rule" in the "Schedule Trigger" node to change the backup interval (e.g., daily, specific times).
- Folder/file naming: modify the expressions in the "Parameters" tab of the "Google Drive Backup Folder Every Hour" node (for the folder name) or the "Google Drive Upload Workflows" node (for the file name) if you require a different naming convention.
- Targeted backups: to back up only specific workflows, insert a "Filter" node after the "n8n" node to filter workflows based on criteria like name, tags, or ID before they reach the "Move Binary Data" node.
- Wait time: the "Wait" node is set to 3 seconds between uploads. If you have a very large number of workflows or encounter rate limiting, you might adjust this duration.
- Error workflow: the workflow is pre-configured with an "Error Workflow" setting. Ensure this error workflow exists in your n8n instance, or update the setting to point to your preferred error handling workflow. This can be used to send notifications on failure.
Important considerations
- Resource usage: while the workflow includes a wait step between individual workflow uploads to minimize load, backing up an extremely large number of workflows could still consume resources on your n8n instance and make many API calls to Google Drive. Monitor performance if you have thousands of workflows.
- Testing the restore process: regularly test restoring a few workflows from your Google Drive backups using the companion "Restore All n8n Workflows from Google Drive" template or a manual import. This verifies the integrity of your backups and ensures you can recover when needed.
- Workflow modifications: if you modify this backup workflow (e.g., change the folder naming convention), ensure your restore process or workflow is also updated to match these changes.
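As referenced above, the folder naming and retention cut-off reduce to a little date arithmetic. A sketch of both calculations, matching the defaults described (hourly folders, 7-day coverage period):

```typescript
// Sketch of the backup folder name and the retention cut-off check.
function backupFolderName(now: Date): string {
  const pad = (n: number) => String(n).padStart(2, "0");
  const date = `${now.getFullYear()}-${pad(now.getMonth() + 1)}-${pad(now.getDate())}`;
  return `n8n_backup_${date}_${pad(now.getHours())}`;
}

function isExpired(folderCreatedAt: Date, coveragePeriodDays = 7, now = new Date()): boolean {
  // Anything created before (now - coverage period) is eligible for deletion.
  const cutoff = new Date(now.getTime() - coveragePeriodDays * 24 * 60 * 60 * 1000);
  return folderCreatedAt < cutoff;
}

console.log(backupFolderName(new Date("2024-05-01T14:30:00"))); // n8n_backup_2024-05-01_14
```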
by Lorena
This workflow ensures gender-inclusive language in Mattermost channels. If someone addresses the group with “guys” or “gals”, a bot promptly replies with: "May I suggest “folks” or “y'all”? We use gender-inclusive language here. 😄"
Webhook node: triggers the workflow when a new message is posted in Mattermost.
IF node: checks whether the message includes the word "guys" or "gals". If false, it does not take any action. If true, it triggers the Mattermost node.
Mattermost node: posts the language warning message in the Mattermost channel.
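The IF node's check and the bot's reply amount to the small sketch below; the word list and reply text come from the description above.

```typescript
// Sketch of the IF node's check followed by the Mattermost reply.
const flaggedWords = ["guys", "gals"];

function inclusiveLanguageReply(message: string): string | null {
  const needsNudge = flaggedWords.some((word) =>
    new RegExp(`\\b${word}\\b`, "i").test(message),
  );
  return needsNudge
    ? 'May I suggest "folks" or "y\'all"? We use gender-inclusive language here. 😄'
    : null;
}
```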