by Niklas Hatje
## Use Case
In most companies, employees have a lot of great ideas. That was the same for us at n8n. We wanted to make it as easy as possible for everyone to add their ideas to a structured database - it should live somewhere everyone already is all the time, so a new idea can be added without much extra effort. Since we're using Slack, this seemed to be the perfect place to easily capture ideas and collect them in Notion.

## What this workflow does
This workflow waits for a webhook call from Slack, which fires when users run the /idea command on a bot that you will create as part of this template. It then checks the command, adds the idea to Notion, and notifies the user about the newly added idea, as you can see below. (A sketch of the command handling is shown at the end of this template.)

## Creating your Slack bot
1. Visit https://api.slack.com/apps, click on New App and choose a name and workspace.
2. Click on OAuth & Permissions and scroll down to Scopes -> Bot Token Scopes.
3. Add the chat:write scope.
4. Head over to Slash Commands and click on Create New Command.
5. Use /idea as the command.
6. Copy the test URL from the Webhook node into Request URL.
7. Add whatever feels best to the description and usage hint.
8. Go to Install App and click Install.

## Setup
1. Add a database in Notion with the columns Name and Creator.
2. Add your Notion credentials and add the integration to your Notion page.
3. Fill in the setup node below.
4. Create your Slack app (see the other sticky).
5. Click Test workflow and use the /idea command in Slack.
6. Activate the workflow and exchange the Request URL with the production URL from the webhook.

## How to adjust it to your needs
- You can adjust the table in Notion and, for example, add different types of ideas or areas that they impact.
- You might want to add templates in Notion to make it easier for users to fill in their ideas with details.
- Rename the Slack command to whatever works best for you.

## How to enhance this workflow
At n8n we use this workflow in combination with some others. For example, we have the following on top:
- We additionally have a /bug Slack command that adds a new bug to Linear. Here we're using AI to classify the bugs and move them to the right team (see this template and this template).
- We also added other types, like /pain, to be less solution-driven.
- To make it easier for everyone to give input, we added a Votes column that allows everyone to vote on ideas/pain points in the list.
- We're also running a workflow once a week that highlights the most popular new ideas and the most active voters (see here).
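For reference, the command check and the mapping of Slack fields onto the Notion columns can be expressed in a single Code node. This is a minimal sketch, not the exact node from the template; it assumes Slack's standard slash-command payload (form-encoded command, text and user_name fields, which the Webhook node exposes on $json.body):

```javascript
// Hypothetical Code node: validate the slash command and shape the idea
// for the Notion node. Field names follow Slack's slash-command payload.
const body = $input.first().json.body;

if (body.command !== '/idea') {
  throw new Error(`Unexpected command: ${body.command}`);
}

return [{
  json: {
    name: body.text,                 // the idea text -> Notion "Name" column
    creator: body.user_name,         // the submitter -> Notion "Creator" column
    responseUrl: body.response_url,  // used to confirm back to the user in Slack
  },
}];
```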
by Femi Ad
# Google Sheets to MailChimp Auto-Importer

## Overview
This n8n workflow automatically imports contacts from Google Sheets into your MailChimp mailing list. Perfect for businesses collecting leads through Google Forms, event registrations, or maintaining contact lists in spreadsheets.

## Key Features
- **Bulk Import**: Process entire Google Sheets at once
- **Smart Name Parsing**: Automatically splits full names into first and last names
- **Phone Number Support**: Includes phone numbers as merge fields
- **Error Resilience**: Continues processing even if individual contacts fail
- **Import Summary**: Generates a summary of processed contacts

## Prerequisites
Before using this workflow, ensure you have:
- An active n8n instance (self-hosted or cloud)
- A Google account with access to Google Sheets
- A MailChimp account with at least one audience/list created
- Basic understanding of n8n workflows

## Initial Setup

### Step 1: Import the Workflow
1. Copy the workflow JSON
2. In n8n, click "Import from File" or paste the JSON
3. Save the workflow with a meaningful name

### Step 2: Configure Google Sheets Connection
1. Click on the "Get Google Sheet Data" node
2. Click on "Credential to connect with"
3. Select "Create New" and choose "Google Sheets OAuth2"
4. Follow the OAuth flow to authenticate your Google account
5. Save the credentials

### Step 3: Configure MailChimp Connection
1. Click on the "Add to MailChimp" node
2. Click on "Credential to connect with"
3. Select "Create New" and choose "MailChimp OAuth2" or "MailChimp API"
4. For the API method:
   - Log into MailChimp
   - Go to Account → Extras → API keys
   - Generate a new API key
   - Copy and paste it into n8n
5. Save the credentials

### Step 4: Configure Your Specific Settings
**Google Sheets settings:**
1. Open the "Get Google Sheet Data" node
2. Replace YOUR_GOOGLE_SHEET_ID with your actual sheet ID
   - Find this in your Google Sheets URL: https://docs.google.com/spreadsheets/d/[SHEET_ID]/edit
3. Replace YOUR_SHEET_NAME with your worksheet name (e.g., "Sheet1" or "Form Responses 1")

**MailChimp settings:**
1. Open the "Add to MailChimp" node
2. Replace YOUR_MAILCHIMP_LIST_ID with your audience ID
   - Find this in MailChimp: Audience → Settings → Audience name and defaults
3. Verify the status is set to "subscribed"

## Google Sheets Format Requirements
Your Google Sheet must have the following columns (exact names):
- **Names**: Full name of the contact (e.g., "John Doe")
- **Email address**: Valid email address
- **Phone Number**: Contact phone number (optional)

Example:

| Names | Email address | Phone Number |
|-------|--------------|--------------|
| John Doe | john@example.com | +1234567890 |
| Jane Smith | jane@example.com | +0987654321 |

## How to Use
**Manual execution:**
1. Open the workflow in n8n
2. Click "Execute Workflow"
3. Monitor the execution progress
4. Check the output of "Create Import Summary" for results

**Scheduling (optional):** To run this automatically:
1. Replace the "Manual Trigger" node with a "Schedule Trigger" node
2. Set your desired schedule (e.g., daily at 9 AM)
3. Activate the workflow

## Customization Options
**Adding more fields:** To include additional fields like company name or address:
1. Add columns to your Google Sheet
2. Modify the "Edit Fields" node to include the new fields
3. Update the "Format Subscriber Data" code to map the new fields (see the sketch at the end of this template)
4. Add corresponding merge fields in the MailChimp node

**Handling duplicates:** The workflow uses "continueRegularOutput" error handling, which means:
- Existing subscribers will be skipped
- New subscribers will be added
- The workflow continues processing

**Adding email notifications:** To receive import summaries via email:
1. Add a Gmail or Email node after "Create Import Summary"
2. Configure it with your email settings
3. Use the import summary data in the email body

## Troubleshooting
Common issues:
- **"Invalid API Key" (MailChimp)**: Verify your API key is correct and check that your MailChimp account is active
- **"Sheet not found" (Google Sheets)**: Verify the sheet ID is correct and ensure the connected account has access to the sheet
- **"Email already exists" errors**: This is normal for existing subscribers; the workflow will continue processing other contacts
- **Missing data in MailChimp**: Check that column names match exactly (they are case-sensitive) and verify the data exists in the Google Sheet

## Best Practices
1. **Test first**: Always test with a small dataset first
2. **Backup data**: Export your MailChimp list before large imports
3. **Clean data**: Ensure email addresses are valid before importing
4. **Monitor regularly**: Check import summaries for any issues
5. **Respect privacy**: Only import contacts who have consented to receive emails

## Support
For issues specific to:
- n8n platform: Visit the n8n Community Forum
- Google Sheets API: Check the Google Developers Documentation
- MailChimp API: See the MailChimp API Documentation

Need help customizing? Contact me for consulting and support, or add me on LinkedIn: https://www.linkedin.com/in/femi-adedayo-h44/

## License
This workflow template is provided free for personal and commercial use. Feel free to modify and share!
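For reference, here's roughly what the name-splitting in the "Format Subscriber Data" node can look like. A minimal sketch, assuming the sheet columns described above; the actual node code in the template may differ:

```javascript
// Hypothetical "Format Subscriber Data" Code node: split full names and
// map sheet columns onto MailChimp merge fields. Column names are the
// required ones ("Names", "Email address", "Phone Number").
return $input.all().map(item => {
  const fullName = (item.json['Names'] || '').trim();
  const [firstName, ...rest] = fullName.split(/\s+/);

  return {
    json: {
      email: item.json['Email address'],
      status: 'subscribed',
      mergeFields: {
        FNAME: firstName || '',
        LNAME: rest.join(' '),                    // everything after the first word
        PHONE: item.json['Phone Number'] || '',
      },
    },
  };
});
```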
by Kevin
# Monitor Postgres Data Freshness and Email Alert If Stale

This template monitors a set of tables inside a Postgres database to ensure they're getting updated. If a table hasn't been updated in 3 days (configurable), an email alert is sent listing the tables that are stale.

## Requirements
- You must have a Postgres database containing one or more tables that you'd like to monitor.
- Each table to monitor must have a date or timestamp column that tracks when data was pushed. For example, this might be:
  - A timestamp column if your table holds event/timeseries data
  - A last_updated column if your rows are expected to be modified

## Usage
1. Use this template
2. Add your Postgres and email credentials
3. Adjust the Produce tables + date columns node to produce pairs of [table, date_column] that should be monitored for freshness. Note that a timestamp column also works.
4. (Optional) Adjust the Remove fresh tables node for your desired staleness window (default is 3 days, but you can adjust as you please)
5. (Optional) Customize the Send alerts node to call whichever alerting workflow you please (I recommend my alerting workflow for easiest plug-and-play)

## How it works
This template works by:
1. Pulling the most recent row for each table
2. Calculating how out of date each table is, in days
3. Dropping fresh tables that have been updated within the past 3 days
4. Sending an email alert with the stale tables that haven't been updated within the past 3 days

A sketch of the freshness check is shown below.
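This is a minimal sketch of the staleness filter, not the template's exact node. It assumes each incoming item already carries a last_updated value, fetched by a Postgres node running a query along the lines of `SELECT MAX("<date_column>") AS last_updated FROM "<table>";` for each [table, date_column] pair:

```javascript
// Hypothetical "Remove fresh tables" Code node: keep only tables whose
// most recent row is older than the configured staleness window.
const STALENESS_DAYS = 3; // the configurable window

const stale = $input.all().filter(item => {
  const lastUpdated = new Date(item.json.last_updated);
  const ageDays = (Date.now() - lastUpdated.getTime()) / (1000 * 60 * 60 * 24);
  item.json.age_days = Math.floor(ageDays); // keep the age for the alert email
  return ageDays > STALENESS_DAYS;          // drop fresh tables
});

return stale;
```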
by Lucas Peyrin
## How it works
This workflow demonstrates a fundamental pattern for securing a webhook by requiring an API key. It acts as a gatekeeper, checking for a valid key in the request header before allowing the request to proceed.

1. **Incoming Request**: The Secured Webhook node receives an incoming POST request. It expects an API key to be sent in the x-api-key header.
2. **API Key Verification**: The Check API Key node takes the key from the incoming request's header. It then makes an internal HTTP request to a second webhook (Get API Key) which acts as a mock database. This second webhook retrieves a list of registered API keys (from the Registered API Keys node) and filters it to find a match for the key that was provided.
3. **Conditional Response**: If a match is found, the API Key Identified node routes the execution to the "success" path, returning a 200 OK response with the identified user's ID. If no match is found, it routes to the "unauthorized" path, returning a 401 Unauthorized error.

This pattern separates the public-facing endpoint from the data source, which is a good security practice. A sketch of the lookup step is shown after the setup steps.

## Set up steps
Setup time: ~2 minutes

This workflow is designed to be a self-contained example.

1. **Set up Credentials**: This workflow uses "Header Auth" for its internal communication. Go to Credentials and create a new Header Auth credential. You can use any name and value (e.g., Name: X-N8N-Auth, Value: my-secret-password). Select this credential in all four webhook/HTTP Request nodes.
2. **Add Your API Keys**: Open the Registered API Keys node. This is your mock database. Edit the array to include the user_id and api_key pairs you want to authorize.
3. **Activate the workflow.**
4. **Test it**: Use the Test Secure Webhook node to send a request. Try it with a valid key from your list to see the success response. Change the x-api-key header to an invalid key to see the 401 Unauthorized error.
5. **For Production**: Replace the mock database part of this workflow (the Get API Key webhook and Registered API Keys node) with a real database node like Supabase, Postgres, or Baserow to look up keys.
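For reference, the lookup behind the mock database boils down to a filter over the registered keys. A minimal sketch, assuming the { user_id, api_key } registry shape described above (the entries shown are placeholders, not real keys):

```javascript
// Hypothetical sketch of the key lookup: match the caller's x-api-key
// header against the registered keys from the mock database.
const registeredKeys = [
  { user_id: 'user_1', api_key: 'key_abc123' }, // example entries -
  { user_id: 'user_2', api_key: 'key_def456' }, // replace with your own
];

const providedKey = $input.first().json.headers['x-api-key'];
const match = registeredKeys.find(entry => entry.api_key === providedKey);

if (!match) {
  // The workflow routes this case to the 401 Unauthorized response.
  return [{ json: { authorized: false } }];
}

return [{ json: { authorized: true, user_id: match.user_id } }];
```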
by Lucas Peyrin
## How it works
This workflow changes the file name, and therefore the extension and MIME type, of any binary file passed to it. This is perfect for converting file formats on the fly, like turning a Telegram voice message (.oga) into an MP3 for an AI transcription service.

1. **Set New File Name**: The SET OUTPUT FILE NAME node is where you define the desired output file name and extension (e.g., audio.mp3). It also dynamically captures the property name of the incoming binary (e.g., data).
2. **Extract Binary Data**: The workflow temporarily converts the binary file into a Base64 text string to make it accessible in the next step.
3. **Rebuild Binary with New Name**: A Code node takes the Base64 data and reconstructs it as a binary file, but this time, it assigns the new file name you specified. n8n automatically sets the MIME type based on the new file extension. (A sketch of this step is shown at the end of this template.)

## Set up steps
Setup time: < 1 minute

This workflow is designed to be used as a sub-workflow.

1. In your main workflow, add an Execute Sub-Workflow node where you need to change a file's type.
2. In the Workflow parameter, select this "Change Binary MimeType/Extension" workflow.
3. Open this workflow and go to the SET OUTPUT FILE NAME node.
4. Modify the output_file_name value to your desired file name (e.g., voice_message.mp3 or document.pdf).
5. Save this workflow.

Now, any binary file you send to it from your main workflow will be returned with the new fileName and mimeType.
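The rebuild step can be sketched like this. It is a minimal illustration, not the template's exact node: it assumes the Base64 string sits on item.json.data after the extract step, and that the Code node's this.helpers.prepareBinaryData helper is available (it infers the MIME type from the new extension):

```javascript
// Hypothetical rebuild Code node: turn the Base64 string back into a
// binary item under a new file name.
const item = $input.first();
const newFileName = 'audio.mp3'; // in the template this comes from SET OUTPUT FILE NAME

const buffer = Buffer.from(item.json.data, 'base64');
const binary = await this.helpers.prepareBinaryData(buffer, newFileName);

return [{ json: {}, binary: { data: binary } }];
```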
by Jimleuk
This template is for self-hosted n8n instances only.

This n8n template demonstrates how to build a simple FileSystem MCP server. Connecting to this server allows MCP clients and agents to list, read and create directories and files on the local machine or a remote server.

This MCP example is based on an official MCP reference implementation, which can be found here: https://github.com/modelcontextprotocol/servers/tree/main/src/filesystem

## How it works
An MCP Server trigger is used and connected to 5 tools: 3 Execute Command tools and 2 custom workflow tools. The 3 Execute Command tools allow for listing, searching and creating directories. The 2 custom workflow tools are for reading and writing files to disk.

Special care has been taken not to allow the MCP agent to execute arbitrary Linux commands on the target server. This is achieved by only allowing the agent to provide parameters such as filenames and paths rather than raw commands (see the sketch at the end of this template).

## How to use
This FileSystem MCP server will write to the server which hosts the n8n instance - this can be your local machine or a remote server. If your target filesystem is on neither, then modify the commands to connect to the desired server.

Connect your MCP client by following the n8n guidelines here: https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-langchain.mcptrigger/#integrating-with-claude-desktop

Try the following queries in your MCP client:
- "Please help me list all folders under the project directory."
- "Help me create a bash script to send a notification to Slack."
- "Search for the log file on the 22nd April and read its contents. What was the cause of the outage?"

## Requirements
- Linux file system for this example template. Feel free to modify if working on Windows.
- MCP Client or Agent for usage, such as Claude Desktop - https://claude.ai/download

## Customising this workflow
- Implement the moving and renaming of files by adding more custom workflow tools to the MCP server.
- Remember to set the MCP server to require credentials before going to production and sharing this MCP server with others!
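One way to enforce the "parameters, not raw commands" rule is to validate and shell-quote the agent-supplied path before an Execute Command tool interpolates it into a fixed command template (e.g. `ls -la {{ safePath }}`). This is a hypothetical sketch, not the template's actual nodes; BASE_DIR is an assumed sandbox root, and require('path') assumes built-in module access is enabled on your self-hosted instance (NODE_FUNCTION_ALLOW_BUILTIN=path):

```javascript
// Hypothetical path guard: resolve the agent's path, reject sandbox
// escapes, and single-quote it so it can never inject extra commands.
const path = require('path'); // requires NODE_FUNCTION_ALLOW_BUILTIN=path

const requestedPath = $input.first().json.path; // tool parameter from the agent
const BASE_DIR = '/home/node/files';            // assumed sandbox root

const resolved = path.resolve(BASE_DIR, requestedPath);
if (resolved !== BASE_DIR && !resolved.startsWith(BASE_DIR + path.sep)) {
  throw new Error(`Path escapes the allowed directory: ${requestedPath}`);
}

// Single-quote for the shell; embedded quotes are escaped as '\''.
const safePath = `'${resolved.replace(/'/g, `'\\''`)}'`;

return [{ json: { safePath } }];
```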
by Oneclick AI Squad
This n8n workflow automates subdomain creation and deletion on GoDaddy using their API, triggered via email requests. This empowers developers to manage subdomains directly without involving DevOps for minor tasks.

## Good to know
- Ensure GoDaddy API credentials are securely configured to avoid unauthorized access.
- Email parsing accuracy depends on the consistency of request formats.

## How it works
1. Detect new email requests using the Start Workflow (GET Request) node.
2. Use the Extract Data from Email node to parse relevant details (e.g., subdomain name, action type).
3. Validate the action type with the Validate Action Type node to proceed with create (true) or delete (false).
4. If true, the Create Subdomain node sends a POST request to GoDaddy's API to create the subdomain.
5. If false, the Delete Subdomain node sends a DELETE request to remove the subdomain.
6. The Send Email Response node notifies the requester of the action's success or failure.

A sketch of the API calls is shown at the end of this template.

## How to use
1. Import the workflow into n8n and configure the nodes with your GoDaddy API and email credentials.
2. Test with sample email requests to ensure proper parsing and API calls.

## Requirements
- GoDaddy API credentials
- Email service (e.g., SMTP or API) for notifications

## Customising this workflow
Adjust the Extract Data from Email node to match your email format or add additional validation steps for security.
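For reference, here's roughly what the two calls can look like, sketched as plain fetch calls rather than the template's HTTP Request nodes. This assumes GoDaddy's v1 DNS records API with sso-key authentication and an A record pointing at a fixed IP; note the sketch appends records with PATCH, which is the method that endpoint documents for adding records. DOMAIN, API_KEY, API_SECRET and the IP are placeholders:

```javascript
// Hypothetical sketch of the GoDaddy DNS calls behind the two nodes.
const DOMAIN = 'example.com';
const AUTH = {
  Authorization: 'sso-key API_KEY:API_SECRET',
  'Content-Type': 'application/json',
};

// Create: add an A record for the requested subdomain.
async function createSubdomain(name) {
  return fetch(`https://api.godaddy.com/v1/domains/${DOMAIN}/records`, {
    method: 'PATCH',
    headers: AUTH,
    body: JSON.stringify([{ type: 'A', name, data: '203.0.113.10', ttl: 600 }]),
  });
}

// Delete: remove the A records for the subdomain.
async function deleteSubdomain(name) {
  return fetch(`https://api.godaddy.com/v1/domains/${DOMAIN}/records/A/${name}`, {
    method: 'DELETE',
    headers: { Authorization: AUTH.Authorization },
  });
}
```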
by CreativeCreature
## Workflow Overview
This workflow automates the process of forwarding e-book files to a Kindle device using a Telegram bot and Outlook email.

## Setup Steps
1. **Telegram Bot Setup**: Create a Telegram bot via BotFather and configure its credentials in the workflow.
2. **Outlook Email Configuration**: Set up your Outlook email credentials. (Currently, only Outlook is supported, but you can modify the workflow to support other email providers.)
3. **Amazon Kindle Email Setup**: Find your Kindle device's email address from your Amazon account. This will be the recipient address for the e-books.
4. **Allow Email Sending to Kindle**: Ensure your Amazon account is configured to allow emails from your Outlook address to send files to your Kindle.

## Workflow Explanation
The workflow begins with a Telegram bot trigger node that listens for new chat messages. When a new message is received, the workflow checks whether it contains a file attachment. If no file is detected, the bot sends a warning reply to the user in the chat. If a file is found, it is renamed to ensure it appears correctly on the Kindle device when sent (see the sketch at the end of this template). The workflow then composes an email with the file attached and sends it to the Kindle's receiving address. If the email is sent successfully, the bot notifies the user with a success message in the chat.

Notes:
- Only Amazon-supported file types will be accepted by Kindle. If sending fails, you will receive a notification email from Amazon in your Outlook inbox.
- In case of delivery issues, retry sending the file, as network issues may occasionally interfere with the process.
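The attachment check can be sketched as follows. A minimal illustration, not the template's exact node; it relies on Telegram's standard message shape, where file attachments arrive as a document object with a file_id and the original file_name:

```javascript
// Hypothetical attachment check: branch on whether the Telegram message
// carries a document, and keep the original name for the Kindle email.
const message = $input.first().json.message;

if (!message.document) {
  // No attachment - the workflow branches here to send the warning reply.
  return [{ json: { hasFile: false } }];
}

return [{
  json: {
    hasFile: true,
    fileId: message.document.file_id,      // used to download the file
    fileName: message.document.file_name,  // e.g. "my-book.epub"
  },
}];
```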
by Agent Studio
## Overview
This workflow provides Retell agent builders with a simple way to populate dynamic variables using n8n. The workflow fetches user information from a Google Sheet based on the caller's phone number and sends it back to Retell. It is based on Retell's Inbound Webhook Call.

Retell is a service that lets you create Voice Agents that handle voice calls simply, based on a prompt or using a conversational flow builder.

## Who is it for
For builders of Retell's Voice Agents who want to make their agents more personalized.

## Prerequisites
- Have a Retell AI account
- Create a Retell agent
- Purchase a phone number and associate it with your agent
- Create a Google Sheet - for example, make a copy of this one.
- Your Google Sheet must have at least one column with the phone number. The remaining columns will be used to populate your Retell agent's dynamic variables.
- All fields are returned as strings to Retell (variables are replaced as text).

## How it works
1. The webhook call is received from Retell. We filter the call using their whitelisted IP address.
2. It extracts data from the webhook call and uses it to retrieve the user from Google Sheets.
3. It formats the data in the response to match Retell's expected format (see the sketch at the end of this template).
4. Retell uses this data to replace dynamic variables in the prompts.

## How to use it
See the description for screenshots!
1. Set the webhook name (keep the method as POST).
2. Copy the Webhook URL (e.g., https://your-instance.app.n8n.cloud/webhook/retell-dynamic-variables) and paste it into Retell's interface. Navigate to "Phone Numbers", click on the phone number, and enable "Add an inbound webhook".
3. In your prompt (e.g., "welcome message"), use the variable with this syntax: {{variable_name}} (see Retell's documentation). These variables will be dynamically replaced by the data in your Google Sheet.

## Notes
- In Google Sheets, the phone number must start with "+". Phone numbers must be formatted like the example: with the +, country code, and no spaces.
- You can use any database - just replace Google Sheets with your own, making sure to keep the phone number formatting consistent.

Reach out to us if you're interested in analysing your Retell Agent conversations.
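The response formatting can be sketched as below. This is a hedged illustration, assuming the response shape from Retell's inbound call webhook documentation (dynamic variables nested under call_inbound.dynamic_variables, every value a string); verify against Retell's current docs before relying on it:

```javascript
// Hypothetical formatting Code node: turn the matched sheet row into
// Retell's expected dynamic-variables response.
const row = $input.first().json; // the Google Sheets row matched by phone number

const dynamicVariables = {};
for (const [column, value] of Object.entries(row)) {
  dynamicVariables[column] = String(value ?? ''); // Retell replaces variables as text
}

return [{
  json: {
    call_inbound: {
      dynamic_variables: dynamicVariables,
    },
  },
}];
```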
by Gareth B. Davies
An automated backup solution designed for self-hosted n8n users to automatically back up their workflows to Bitbucket, leveraging Bitbucket's free private repository offering. Perfect for maintaining version control of your n8n workflows without additional costs.

## How it works
- Runs on a regular schedule to check all workflows in your n8n instance
- Compares each workflow with its version in Bitbucket
- Only uploads workflows that are new or have changed
- Uses basic rate limiting to stay within Bitbucket's API limits
- Formats filenames for easy tracking and includes timestamps in commit messages
- Handles errors gracefully with automatic retries

A sketch of the upload call is shown at the end of this template.

## Set up steps (10-15 minutes)
1. Create a free Bitbucket account and private repository
2. Create a Bitbucket App Password with repository write access
3. Add Bitbucket credentials to n8n (using your username and app password)
4. Set up n8n API access (generate an API key in your n8n instance)
5. Configure your Bitbucket workspace and repository names in the Set node
6. Optional: Adjust the backup schedule (default: 2 AM daily)

## Perfect for n8n self-hosters who want
- Version control for their workflows
- Automated daily backups
- Free private repository storage
- Easy workflow recovery
- Change tracking over time

The workflow includes basic error handling and rate limiting to ensure reliable backups even with larger numbers of workflows. Adjust your timing based on https://support.atlassian.com/bitbucket-cloud/docs/api-request-limits/.
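For reference, the upload boils down to a commit against Bitbucket's 2.0 "src" endpoint, which accepts a form POST where each field name is a file path. This is a minimal sketch using plain fetch (the template itself uses HTTP Request nodes); WORKSPACE, REPO, USERNAME and APP_PASSWORD are placeholders from the Set node and credentials:

```javascript
// Hypothetical sketch of committing one workflow file to Bitbucket.
const workflowName = 'my-workflow';
const workflowJson = JSON.stringify({ name: workflowName /* ...exported workflow... */ }, null, 2);

const form = new URLSearchParams();
form.set(`${workflowName}.json`, workflowJson); // field name = file path in the repo
form.set('message', `Backup ${workflowName} at ${new Date().toISOString()}`);
form.set('branch', 'main');

await fetch('https://api.bitbucket.org/2.0/repositories/WORKSPACE/REPO/src', {
  method: 'POST',
  headers: {
    Authorization: 'Basic ' + Buffer.from('USERNAME:APP_PASSWORD').toString('base64'),
    'Content-Type': 'application/x-www-form-urlencoded',
  },
  body: form.toString(),
});
```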
by Daniel Nolde
## What this does
Shows you how to use XML-RPC APIs via the generic HTTP Request node, using the example of posting to a WordPress blog. This is also a feasible workaround if a specific n8n integration does not work or stops working (which happens, e.g., with the WordPress node).

## How it works
1. First, the XML payload for the request is prepared in a Code node, which also properly escapes special characters in the values that you want to send to the XML-RPC endpoint (see the sketch at the end of this template).
2. Then, the HTTP Request node sends the request using the HTTP POST method.
3. Last, the returned XML response is converted to JSON, which a conditional node uses to determine whether the operation was successful or not.

## Setup steps
1. Import the workflow.
2. Ensure you have a WordPress blog with a user that has an app password.
3. Edit the "Settings" node and enter your individual values for URL/user/app password.
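A minimal sketch of the payload-building Code node is shown below. It escapes XML special characters and assembles a metaWeblog.newPost call (blog ID, username, password, content struct, publish flag) as documented in WordPress's XML-RPC API; the exact payload in the template may differ:

```javascript
// Hypothetical payload builder: escape values, then build the XML-RPC body
// that the HTTP Request node POSTs to https://your-blog.example/xmlrpc.php
const escapeXml = (value) =>
  String(value)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&apos;');

const { user, appPassword, title, content } = $input.first().json;

const xml = `<?xml version="1.0"?>
<methodCall>
  <methodName>metaWeblog.newPost</methodName>
  <params>
    <param><value><string>1</string></value></param>
    <param><value><string>${escapeXml(user)}</string></value></param>
    <param><value><string>${escapeXml(appPassword)}</string></value></param>
    <param><value><struct>
      <member><name>title</name><value><string>${escapeXml(title)}</string></value></member>
      <member><name>description</name><value><string>${escapeXml(content)}</string></value></member>
    </struct></value></param>
    <param><value><boolean>1</boolean></value></param>
  </params>
</methodCall>`;

return [{ json: { xml } }];
```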
by Mirajul Mohin
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Automatically transform your video uploads into AI-powered summaries with key topic extraction and instant team notifications.

## What this workflow does
- Monitors Google Drive for new video uploads
- Downloads and processes videos using VLM Run AI
- Generates intelligent summaries with key topics extracted
- Posts results to Slack for immediate team access

## Setup
**Prerequisites**: Google Drive account, VLM Run API credentials, Slack workspace, self-hosted n8n. You need to install the VLM Run community node.

**Quick setup:**
1. Configure Google Drive OAuth2 and create a video upload folder
2. Add VLM Run API credentials
3. Set up Slack integration for notifications
4. Update folder/channel IDs in the workflow nodes
5. Test and activate

## Perfect for
- Meeting recordings and training videos
- Webinar summaries and educational content
- Content analysis and team collaboration
- Any video content requiring quick insights

## Key Benefits
- **Asynchronous processing** handles large files without timeouts
- **Multi-format support** for MP4, AVI, MOV, WebM, MKV
- **Instant team updates** via Slack notifications
- **Saves hours** of manual video review time

## How to customize
Extend by adding:
- Video categorization and tagging
- Integration with project management tools
- Email notifications alongside Slack
- Searchable video databases with summaries

This workflow transforms lengthy videos into actionable insights, making your content instantly accessible and shareable with your team. A sketch of the Slack notification formatting is shown below.
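The Slack notification step can be sketched as follows. This is a hypothetical illustration: it assumes the summarization step produced { title, summary, key_topics } fields, whereas the real VLM Run node output may be structured differently:

```javascript
// Hypothetical formatting Code node: shape the AI summary into a single
// Slack message string for the Slack node's message text.
const { title, summary, key_topics } = $input.first().json;

const text = [
  `*New video summary: ${title}*`,
  '',
  summary,
  '',
  `*Key topics:* ${key_topics.join(', ')}`,
].join('\n');

return [{ json: { text } }];
```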