by Niklas Hatje
## Use Case

In most companies, employees have a lot of great ideas. That was the same for us at n8n. We wanted to make it as easy as possible for everyone to add their ideas to a structured database - it should live somewhere everyone already spends their time, so adding a new idea takes no extra effort. Since we're using Slack, it seemed like the perfect place to capture ideas and collect them in Notion.

## What this workflow does

This workflow waits for a webhook call from Slack, which fires when users use the /idea command on a bot that you will create as part of this template. It then checks the command, adds the idea to Notion, and notifies the user about the newly added idea (a sketch of the payload handling appears at the end of this entry).

## Creating your Slack bot

1. Visit https://api.slack.com/apps, click on New App and choose a name and workspace.
2. Click on OAuth & Permissions and scroll down to Scopes -> Bot Token Scopes.
3. Add the chat:write scope.
4. Head over to Slash Commands and click on Create New Command.
5. Use /idea as the command.
6. Copy the test URL from the Webhook node into Request URL.
7. Add whatever feels best to the description and usage hint.
8. Go to Install App and click install.

## Setup

1. Add a database in Notion with the columns Name and Creator.
2. Add your Notion credentials and add the integration to your Notion page.
3. Fill the setup node below.
4. Create your Slack app (see other sticky).
5. Click Test workflow and use the /idea command in Slack.
6. Activate the workflow and exchange the Request URL with the production URL from the webhook.

## How to adjust it to your needs

- You can adjust the table in Notion, for example to add different types of ideas or the areas they impact.
- You might want to add templates in Notion to make it easier for users to flesh out their ideas with details.
- Rename the Slack command to whatever works best for you.

## How to enhance this workflow

At n8n we use this workflow in combination with some others. For example, we have the following on top:

- We additionally have a /bug Slack command that adds a new bug to Linear. Here we're using AI to classify the bugs and move them to the right team (see this template and this template).
- We also added other types, like /pain, to be less solution-driven.
- To make it easier for everyone to give input, we added a Votes column that lets everyone vote on ideas/pain points in the list.
- We're also running a workflow once a week that highlights the most popular new ideas and the most active voters (see here).
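For reference, here is a minimal sketch of what a Code node placed after the Webhook node could do with Slack's slash-command payload. Slack sends form-encoded fields such as `command`, `text`, `user_name` and `response_url`; the output field names and their mapping to Notion columns are assumptions for illustration, not the template's actual node.

```javascript
// Hypothetical Code node: pull the idea and its author out of the
// Slack slash-command payload delivered to the Webhook node.
const body = $input.first().json.body;

// Only handle the /idea command; anything else is ignored.
if (body.command !== '/idea') {
  return [];
}

return [{
  json: {
    idea: body.text,                // maps to the Notion "Name" column
    creator: body.user_name,        // maps to the Notion "Creator" column
    responseUrl: body.response_url, // used to confirm back to the user
  },
}];
```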
by Daniel Shashko
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automates the process of scraping product data from e-commerce websites and using it to fine-tune a custom OpenAI GPT model for generating high-quality marketing copy and product descriptions.

## Main Use Cases

- Fine-tune OpenAI models with real product data from hundreds of supported e-commerce websites for marketing content generation.
- Create custom AI models specialized in writing compelling product descriptions across different industries and platforms.
- Automate the entire pipeline from data collection to model training using Bright Data's extensive scraper library.
- Generate marketing copy using your custom-trained model via an interactive chat interface.

## How it works

The workflow operates in two main phases - model training and model usage - organized into these stages:

### Data Collection & Processing
- Manually triggered to start the fine-tuning process.
- Uses Bright Data's web scraper to extract product information from any supported e-commerce platform (Amazon, eBay, Shopify stores, Walmart, Target, and hundreds of other websites).
- Collects product titles, brands, features, descriptions, ratings, and availability status from your chosen platform.
- Easily customizable to scrape different websites by simply changing the dataset configuration and product URLs.

### Training Data Preparation
- A Code node processes the scraped product data to create training examples in OpenAI's required JSONL format (see the sketch at the end of this entry).
- For each product, it generates a complete training example with:
  - A system message defining the AI's role as a marketing assistant.
  - A user prompt containing specific product details (title, brand, features, original description snippet).
  - An assistant response providing an ideal marketing description template.
- Compiles all training examples into a single JSONL file ready for OpenAI fine-tuning.

### Model Fine-Tuning
- Uploads the training file to OpenAI using the OpenAI File Upload node.
- Initiates a fine-tuning job via HTTP Request to OpenAI's fine-tuning API, using GPT-4o-mini as the base model.
- The fine-tuning process runs on OpenAI's servers to create your custom model.

### Interactive Chat Interface
- Provides a chat trigger that allows real-time interaction with your fine-tuned model.
- An AI Agent node connects to your custom-trained OpenAI model.
- Users can chat with the model to generate product descriptions, marketing copy, or other content based on the training.

### Custom Model Integration
- The OpenAI Chat Model node is configured to use your specific fine-tuned model ID.
- Delivers responses trained on your product data for consistent, high-quality marketing content.

## Summary Flow

Manual Trigger → Scrape E-commerce Products (Bright Data) → Process & Format Training Data (Code) → Upload Training File (OpenAI) → Start Fine-Tuning Job (HTTP Request)

In parallel: Chat Trigger → AI Agent → Custom Fine-Tuned Model Response

## Benefits

- Fully automated pipeline from raw product data to trained AI model.
- Works with hundreds of different e-commerce websites through Bright Data's extensive scraper library.
- Creates specialized models trained on real e-commerce data for authentic marketing copy across various industries.
- Scalable solution that can be adapted to different product categories, niches, or websites.
- Interactive chat interface for immediate access to your custom-trained model.
- Cost-effective fine-tuning using one of OpenAI's most efficient models (GPT-4o-mini).
- Easily customizable with different websites, product URLs, training prompts, and model configurations.

## Setup Requirements

- Bright Data API credentials for web scraping (supports hundreds of e-commerce websites).
- OpenAI API key with fine-tuning access.
- Replace placeholder credential IDs and model IDs with your actual values.
- Customize the product URLs list and Bright Data dataset for your specific website and use case.

The workflow can be adapted for any e-commerce platform supported by Bright Data's scraping infrastructure.
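As a rough illustration of the Training Data Preparation step, here is a minimal Code-node sketch that turns scraped products into JSONL lines in OpenAI's chat fine-tuning format. The input field names (`title`, `brand`, `features`, `description`) are assumptions; the template's actual node may differ.

```javascript
// Hypothetical Code node: one JSONL line per scraped product, in the
// {"messages": [system, user, assistant]} shape OpenAI fine-tuning expects.
const lines = $input.all().map(({ json: p }) => JSON.stringify({
  messages: [
    {
      role: 'system',
      content: 'You are a marketing assistant that writes compelling product descriptions.',
    },
    {
      role: 'user',
      content: `Write a marketing description for "${p.title}" by ${p.brand}. ` +
               `Key features: ${Array.isArray(p.features) ? p.features.join(', ') : p.features ?? ''}.`,
    },
    // The scraped description serves as the "ideal" assistant response.
    { role: 'assistant', content: p.description },
  ],
}));

// Join into a single JSONL document. This is what gets uploaded to OpenAI
// (purpose "fine-tune") before the HTTP Request node starts the job against
// the fine-tuning API with gpt-4o-mini as the base model.
return [{ json: { jsonl: lines.join('\n') } }];
```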
by Femi Ad
# Google Sheets to MailChimp Auto-Importer

## Overview

This n8n workflow automatically imports contacts from Google Sheets into your MailChimp mailing list. Perfect for businesses collecting leads through Google Forms, event registrations, or maintaining contact lists in spreadsheets.

## Key Features

- 📊 **Bulk Import**: Process entire Google Sheets at once
- 🔄 **Smart Name Parsing**: Automatically splits full names into first and last names (see the sketch at the end of this entry)
- 📱 **Phone Number Support**: Includes phone numbers as merge fields
- ⚡ **Error Resilience**: Continues processing even if individual contacts fail
- 📝 **Import Summary**: Generates a summary of processed contacts

## Prerequisites

Before using this workflow, ensure you have:

- An active n8n instance (self-hosted or cloud)
- A Google account with access to Google Sheets
- A MailChimp account with at least one audience/list created
- Basic understanding of n8n workflows

## Initial Setup

### Step 1: Import the Workflow
1. Copy the workflow JSON
2. In n8n, click "Import from File" or paste the JSON
3. Save the workflow with a meaningful name

### Step 2: Configure Google Sheets Connection
1. Click on the "Get Google Sheet Data" node
2. Click on "Credential to connect with"
3. Select "Create New" and choose "Google Sheets OAuth2"
4. Follow the OAuth flow to authenticate your Google account
5. Save the credentials

### Step 3: Configure MailChimp Connection
1. Click on the "Add to MailChimp" node
2. Click on "Credential to connect with"
3. Select "Create New" and choose "MailChimp OAuth2" or "MailChimp API"
4. For the API method:
   - Log into MailChimp
   - Go to Account → Extras → API keys
   - Generate a new API key
   - Copy and paste it into n8n
5. Save the credentials

### Step 4: Configure Your Specific Settings

**Google Sheets settings:**
1. Open the "Get Google Sheet Data" node
2. Replace YOUR_GOOGLE_SHEET_ID with your actual sheet ID, found in your Google Sheets URL: https://docs.google.com/spreadsheets/d/[SHEET_ID]/edit
3. Replace YOUR_SHEET_NAME with your worksheet name (e.g., "Sheet1" or "Form Responses 1")

**MailChimp settings:**
1. Open the "Add to MailChimp" node
2. Replace YOUR_MAILCHIMP_LIST_ID with your audience ID, found in MailChimp under Audience → Settings → Audience name and defaults
3. Verify the status is set to "subscribed"

## Google Sheets Format Requirements

Your Google Sheet must have the following columns (exact names):

- **Names**: Full name of the contact (e.g., "John Doe")
- **Email address**: Valid email address
- **Phone Number**: Contact phone number (optional)

Example:

| Names | Email address | Phone Number |
|-------|--------------|--------------|
| John Doe | john@example.com | +1234567890 |
| Jane Smith | jane@example.com | +0987654321 |

## How to Use

**Manual execution:**
1. Open the workflow in n8n
2. Click "Execute Workflow"
3. Monitor the execution progress
4. Check the output of "Create Import Summary" for results

**Scheduling (optional):** To run this automatically:
1. Replace the "Manual Trigger" node with a "Schedule Trigger" node
2. Set your desired schedule (e.g., daily at 9 AM)
3. Activate the workflow

## Customization Options

**Adding more fields:** To include additional fields like company name or address:
1. Add columns to your Google Sheet
2. Modify the "Edit Fields" node to include the new fields
3. Update the "Format Subscriber Data" code to map the new fields
4. Add corresponding merge fields in the MailChimp node

**Handling duplicates:** The workflow uses "continueRegularOutput" error handling, which means:
- Existing subscribers will be skipped
- New subscribers will be added
- The workflow continues processing

**Adding email notifications:** To receive import summaries via email:
1. Add a Gmail or Email node after "Create Import Summary"
2. Configure it with your email settings
3. Use the import summary data in the email body

## Troubleshooting

Common issues:

- **"Invalid API Key" (MailChimp)**: Verify your API key is correct and check that your MailChimp account is active.
- **"Sheet not found" (Google Sheets)**: Verify the sheet ID is correct and ensure the service account has access to the sheet.
- **"Email already exists" errors**: This is normal for existing subscribers; the workflow will continue processing other contacts.
- **Missing data in MailChimp**: Check that column names match exactly (case-sensitive) and verify the data exists in the Google Sheet.

## Best Practices

1. **Test First**: Always test with a small dataset first
2. **Backup Data**: Export your MailChimp list before large imports
3. **Clean Data**: Ensure email addresses are valid before importing
4. **Monitor Regularly**: Check import summaries for any issues
5. **Respect Privacy**: Only import contacts who have consented to receive emails

## Support

For issues specific to:
- n8n platform: Visit the n8n Community Forum
- Google Sheets API: Check the Google Developers Documentation
- MailChimp API: See the MailChimp API Documentation

Need help customizing? Contact me for consulting and support, or add me on LinkedIn - https://www.linkedin.com/in/femi-adedayo-h44/

## License

This workflow template is provided free for personal and commercial use. Feel free to modify and share!
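For illustration, here is a minimal sketch of the name-splitting logic the "Format Subscriber Data" step performs. The column names match the format requirements above; `FNAME`, `LNAME` and `PHONE` are MailChimp's usual merge tags, but verify them against your audience's settings.

```javascript
// Hypothetical Code node: split "Names" into first/last name and map the
// sheet columns onto MailChimp merge fields.
return $input.all().map(({ json: row }) => {
  const parts = (row.Names || '').trim().split(/\s+/);
  return {
    json: {
      email: row['Email address'],
      mergeFields: {
        FNAME: parts[0] || '',
        LNAME: parts.slice(1).join(' '),  // everything after the first word
        PHONE: row['Phone Number'] || '', // optional column
      },
    },
  };
});
```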
by John Alejandro Silva
# 🤖🥗 Telegram Nutrition AI Assistant (Alternative to Cal AI App)

> AI-powered nutrition assistant for Telegram — log meals, set goals, and get personalized daily reports with Google Sheets integration.

## 📋 Description

This n8n template creates a Telegram-based Nutrition AI Assistant 🥑🔥 designed as an open-source alternative to the Cal AI mobile app. It allows users to interact with an AI agent via text, voice, or images to track meals, calculate macros, and monitor nutrition goals directly from Telegram.

The system integrates Google Sheets as the database, handling both user profiles and meal logs, while leveraging Gemini AI for natural conversation, food recognition, and daily progress reports.

## ✨ Key Features

- 💬 **Multi-input support**: Text, voice messages (transcribed), and food images (AI analysis).
- 📊 **Macro calculation**: Automatic estimation of calories, proteins, carbs, and fats.
- 📝 **User-friendly registration**: Simple onboarding without storing personal health data (no weight/height required).
- 🎯 **Goal tracking**: Users can set and update calorie and protein targets.
- 📈 **Daily reports**: Personalized progress messages with visual progress bars (see the sketch at the end of this entry).
- 🗂 **Google Sheets integration**: a Profile table for user targets and a Meals table for food logs.
- 🔄 **Advanced n8n nodes**: Includes use of Merge, Subworkflow, and Code nodes for data processing and report generation.

## 💡 Acknowledgment

Inspired by the Cal AI concept 💡 — this template demonstrates how to reproduce its main functionality with n8n, Telegram, and AI agents as a flexible, open-source automation workflow.

## 🏷 Tags

telegram ai-assistant nutrition meal-tracking google-sheets food-logging voice-transcription image-analysis daily-reports n8n-template merge-node subworkflow-node code-node telegram-trigger google-gemini

## 💼 Use Case

Use this template if you want to:

- 🥗 Log meals using text, images, or voice messages.
- 📊 Track nutrition goals (calories, proteins) with daily progress updates.
- 🤖 Provide a chat-based nutrition assistant without building a full app.
- 🗂 Store structured nutrition data in Google Sheets for easy access and analysis.

## 💬 Example User Interactions

- 📸 User sends a photo of a meal → AI analyzes the food and logs calories/macros.
- 🎤 User sends a voice message → AI transcribes and logs the meal.
- ⌨️ User types "report" → AI returns a daily nutrition summary with progress bars.
- 🥅 User says "update my protein goal" → AI updates the profile in Google Sheets.

## 🔑 Required Credentials

- Telegram Bot API (bot token)
- Google Sheets API credentials
- AI provider API (Google Gemini or a compatible LLM)

## ⚙️ Setup Instructions

1. 🗂 Create two Google Sheets tables:
   - Profile: User_ID, Name, Calories_target, Protein_target
   - Meals: User_ID, Date, Meal_description, Calories, Proteins, Carbs, Fats
2. 🔌 Configure the Telegram Trigger with your bot token.
3. 🤖 Connect your AI provider credentials (Gemini recommended).
4. 📑 Connect Google Sheets with your credentials.
5. ▶️ Deploy the workflow in n8n.
6. 🎯 Start interacting with your nutrition assistant via Telegram.

## 📌 Extra Notes

- 🟩 Green section: Handles the Telegram trigger and user check.
- 🟥 Red section: Registers new users and sets goals.
- 🟦 Blue section: Processes text, voice, and images.
- 🟨 Yellow section: Generates nutrition reports.
- 🟪 Purple section: Main AI agent controlling tools and logic.

## 💡 Need Assistance?

If you'd like help customizing or extending this workflow, feel free to reach out:

- 📧 Email: johnsilva11031@gmail.com
- 🔗 LinkedIn: John Alejandro Silva Rodríguez
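As an example of the report generation the Code nodes handle, here is a minimal sketch of a text progress bar for the daily report. The input field names are assumptions for illustration.

```javascript
// Hypothetical Code node: render progress bars for the daily report.
function progressBar(current, target, width = 10) {
  const ratio = Math.min(current / target, 1);
  const filled = Math.round(ratio * width);
  return '▓'.repeat(filled) + '░'.repeat(width - filled) +
         ` ${current}/${target} (${Math.round(ratio * 100)}%)`;
}

// Assumed to be aggregated earlier from the Meals and Profile sheets.
const { calories, caloriesTarget, protein, proteinTarget } = $input.first().json;

return [{
  json: {
    report: `🔥 Calories ${progressBar(calories, caloriesTarget)}\n` +
            `💪 Protein  ${progressBar(protein, proteinTarget)}`,
  },
}];
```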
by Kevin
# Monitor Postgres Data Freshness and Email Alert If Stale

This template monitors a set of tables inside a Postgres database to ensure they're getting updated. If a table hasn't been updated in 3 days (configurable), an email alert is sent listing the tables that are stale.

## Requirements

- You must have a Postgres database containing one or more tables that you'd like to monitor.
- Each table to monitor must have a date or timestamp column that tracks when data was pushed. For example, this might be:
  - A timestamp column if your table holds event/timeseries data
  - A last_updated column if your rows are expected to be modified

## Usage

1. Use this template.
2. Add your Postgres and email credentials.
3. Adjust the Produce tables + date columns node to produce pairs of [table, date_column] that should be monitored for freshness. 💁‍♂️ Note that a timestamp column also works.
4. (Optional) Adjust the Remove fresh tables node for your desired staleness window (default is 3 days, but you can adjust as you please).
5. (Optional) Customize the Send alerts node to call whichever alerting workflow you please (I recommend my alerting workflow for easiest plug-and-play).

## How it works

This template works by:

1. Pulling the most recent row for each table.
2. Calculating how out-of-date each table is, in days.
3. Dropping fresh tables that have been updated within the past 3 days.
4. Sending an email alert with the stale tables that haven't been updated within the past 3 days.

A rough sketch of both halves of this check appears below.
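This sketch uses illustrative table and column names; the template's actual nodes may structure it differently.

```javascript
// Hypothetical sketch of the two halves of the freshness check.
const STALE_AFTER_DAYS = 3;

// Half 1: one query per [table, date_column] pair, each pulling the most
// recent value of the date column for that table.
const pairs = [
  ['events', 'timestamp'],       // illustrative
  ['customers', 'last_updated'], // illustrative
];
const queries = pairs.map(([table, col]) =>
  `SELECT '${table}' AS table_name, MAX("${col}") AS last_update FROM "${table}";`);

// Half 2: given rows of { table_name, last_update } back from Postgres,
// keep only tables whose newest row exceeds the staleness window.
function staleTables(rows) {
  const now = Date.now();
  return rows.filter(
    (r) => (now - new Date(r.last_update).getTime()) / 86_400_000 > STALE_AFTER_DAYS,
  );
}
```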
by Lucas Peyrin
## How it works

This workflow demonstrates a fundamental pattern for securing a webhook by requiring an API key. It acts as a gatekeeper, checking for a valid key in the request header before allowing the request to proceed.

1. **Incoming Request**: The Secured Webhook node receives an incoming POST request. It expects an API key to be sent in the x-api-key header.
2. **API Key Verification**: The Check API Key node takes the key from the incoming request's header. It then makes an internal HTTP request to a second webhook (Get API Key) which acts as a mock database. This second webhook retrieves a list of registered API keys (from the Registered API Keys node) and filters it to find a match for the key that was provided.
3. **Conditional Response**:
   - If a match is found, the API Key Identified node routes the execution to the "success" path, returning a 200 OK response with the identified user's ID.
   - If no match is found, it routes to the "unauthorized" path, returning a 401 Unauthorized error.

This pattern separates the public-facing endpoint from the data source, which is a good security practice.

## Set up steps

Setup time: ~2 minutes

This workflow is designed to be a self-contained example.

1. **Set up Credentials**: This workflow uses "Header Auth" for its internal communication. Go to Credentials and create a new Header Auth credential. You can use any name and value (e.g., Name: X-N8N-Auth, Value: my-secret-password). Select this credential in all four webhook/HTTP Request nodes.
2. **Add Your API Keys**: Open the Registered API Keys node. This is your mock database. Edit the array to include the user_id and api_key pairs you want to authorize.
3. **Activate the workflow.**
4. **Test it**: Use the Test Secure Webhook node to send a request. Try it with a valid key from your list to see the success response, then change the x-api-key header to an invalid key to see the 401 Unauthorized error. (A stand-alone test request is sketched below.)
5. **For Production**: Replace the mock database part of this workflow (the Get API Key webhook and Registered API Keys node) with a real database node like Supabase, Postgres, or Baserow to look up keys.
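To test from outside n8n, a request along these lines should work; the URL and key are placeholders for your own instance and registered keys.

```javascript
// Hypothetical test client (Node 18+ with built-in fetch, run as an ES module).
const res = await fetch('https://your-instance.app.n8n.cloud/webhook/secured', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'x-api-key': 'my-valid-key', // swap in an unknown key to see the 401
  },
  body: JSON.stringify({ hello: 'world' }),
});

console.log(res.status, await res.json()); // 200 + user id, or 401 Unauthorized
```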
by Lucas Peyrin
## How it works

This workflow changes the file name, and therefore the extension and MIME type, of any binary file passed to it. This is perfect for converting file formats on the fly, like turning a Telegram voice message (.oga) into an MP3 for an AI transcription service.

1. **Set New File Name**: The SET OUTPUT FILE NAME node is where you define the desired output file name and extension (e.g., audio.mp3). It also dynamically captures the property name of the incoming binary (e.g., data).
2. **Extract Binary Data**: The workflow temporarily converts the binary file into a Base64 text string to make it accessible in the next step.
3. **Rebuild Binary with New Name**: A Code node takes the Base64 data and reconstructs it as a binary file, but this time it assigns the new file name you specified. n8n automatically sets the MIME type based on the new file extension. (A sketch of this step appears below.)

## Set up steps

Setup time: < 1 minute

This workflow is designed to be used as a sub-workflow.

1. In your main workflow, add an Execute Sub-Workflow node where you need to change a file's type.
2. In the Workflow parameter, select this "Change Binary MimeType/Extension" workflow.
3. Open this workflow and go to the SET OUTPUT FILE NAME node.
4. Modify the output_file_name value to your desired file name (e.g., voice_message.mp3 or document.pdf).
5. Save this workflow.

Now, any binary file you send to it from your main workflow will be returned with the new fileName and mimeType.
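For reference, the rebuild step can be sketched like this; the exact property names depend on the SET OUTPUT FILE NAME node, so treat them as assumptions.

```javascript
// Hypothetical Code node: rebuild the binary under a new file name.
const base64 = $input.first().json.data; // Base64 string from the extract step
const fileName = 'audio.mp3';            // e.g. the output_file_name value

return [{
  json: {},
  binary: {
    // prepareBinaryData lets n8n infer the MIME type (audio/mpeg)
    // from the new file extension.
    data: await this.helpers.prepareBinaryData(
      Buffer.from(base64, 'base64'),
      fileName,
    ),
  },
}];
```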
by Jimleuk
This template is for self-hosted n8n instances only.

This n8n template demonstrates how to build a simple FileSystem MCP server. Connecting to this server allows MCP clients and agents to list, read and create directories and files on the local machine or a remote server.

This MCP example is based on an official MCP reference implementation, which can be found here: https://github.com/modelcontextprotocol/servers/tree/main/src/filesystem

## How it works

- An MCP Server trigger is used and connected to 5 tools: 3 Execute Command tools and 2 custom workflow tools.
- The 3 Execute Command tools allow for listing, searching and creating directories.
- The 2 custom workflow tools are for reading and writing files to disk.
- Special care has been taken to not allow the MCP agent to execute arbitrary Linux commands on the target server. This is achieved by only allowing the agent to provide parameters such as filenames and paths rather than raw commands (see the sketch at the end of this entry).

## How to use

This FileSystem MCP server will write to the server which hosts the n8n instance - this can be your local machine or a remote server. If your target filesystem is on neither, then modify the commands to connect to the desired server.

1. Connect your MCP client by following the n8n guidelines here: https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-langchain.mcptrigger/#integrating-with-claude-desktop
2. Try the following queries in your MCP client:
   - "Please help me list all folders under the project directory."
   - "Help me create a bash script to send a notification to Slack."
   - "Search for the log file on the 22nd April and read its contents. What was the cause of the outage?"

## Requirements

- A Linux filesystem for this example template. Feel free to modify if working on Windows.
- An MCP client or agent, such as Claude Desktop: https://claude.ai/download

## Customising this workflow

- Implement moving and renaming of files by adding more custom workflow tools to the MCP server.
- Remember to set the MCP server to require credentials before going to production and sharing this MCP server with others!
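As an illustration of the parameterized-command idea, an Execute Command tool's command field might look like the line below: the agent can only fill in the path and pattern placeholders (here via n8n's $fromAI expression), never the command itself. This is a sketch, not the template's exact configuration; check your n8n version's $fromAI docs.

```
find {{ $fromAI("path", "Directory to search in", "string") }} -type f -name "{{ $fromAI("pattern", "Filename pattern, e.g. *.log", "string") }}"
```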
by Oneclick AI Squad
This n8n workflow automates subdomain creation and deletion on GoDaddy using their API, triggered via email requests. This empowers developers to manage subdomains directly without involving DevOps for minor tasks.

## Good to know

- Ensure GoDaddy API credentials are securely configured to avoid unauthorized access.
- Email parsing accuracy depends on the consistency of request formats.

## How it works

1. Detect new email requests using the Start Workflow (GET Request) node.
2. Use the Extract Data from Email node to parse relevant details (e.g., subdomain name, action type).
3. Validate the action type with the Validate Action Type node to proceed with create (true) or delete (false).
4. If true, the Create Subdomain node sends a POST request to GoDaddy's API to create the subdomain.
5. If false, the Delete Subdomain node sends a DELETE request to remove the subdomain.
6. The Send Email Response node notifies the requester of the action's success or failure.

A sketch of the two API calls appears at the end of this entry.

## How to use

1. Import the workflow into n8n and configure the nodes with your GoDaddy API and email credentials.
2. Test with sample email requests to ensure proper parsing and API calls.

## Requirements

- GoDaddy API credentials
- Email service (e.g., SMTP or API) for notifications

## Customising this workflow

Adjust the Extract Data from Email node to match your email format, or add additional validation steps for security.
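For orientation, the two HTTP calls map roughly onto GoDaddy's v1 DNS records API as sketched below. This is an assumption-laden sketch: verify verbs and paths against GoDaddy's current API docs, and note that domain, subdomain, IP and credentials are placeholders.

```javascript
// Hypothetical sketch of the two DNS calls (Node 18+ with built-in fetch).
const BASE = 'https://api.godaddy.com/v1/domains/example.com/records';
const AUTH = { Authorization: 'sso-key API_KEY:API_SECRET' };

// Create: add an A record for the subdomain (e.g. dev.example.com).
await fetch(BASE, {
  method: 'PATCH', // appends records; the workflow's node may use POST/PUT
  headers: { ...AUTH, 'Content-Type': 'application/json' },
  body: JSON.stringify([{ type: 'A', name: 'dev', data: '203.0.113.10', ttl: 600 }]),
});

// Delete: remove the record by type and name.
await fetch(`${BASE}/A/dev`, { method: 'DELETE', headers: AUTH });
```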
by CreativeCreature
## Workflow Overview

This workflow automates the process of forwarding e-book files to a Kindle device using a Telegram bot and Outlook email.

## Setup Steps

1. **Telegram Bot Setup**: Create a Telegram bot via BotFather and configure its credentials in the workflow.
2. **Outlook Email Configuration**: Set up your Outlook email credentials. (Currently, only Outlook is supported, but you can modify the workflow to support other email providers.)
3. **Amazon Kindle Email Setup**: Find your Kindle device's email address in your Amazon account. This will be the recipient address for the e-books.
4. **Allow Email Sending to Kindle**: Ensure your Amazon account is configured to allow emails from your Outlook address to send files to your Kindle.

## Workflow Explanation

The workflow begins with a Telegram bot trigger node that listens for new chat messages. When a new message is received, the workflow checks whether the message contains a file attachment (sketched below). If no file is detected, the bot sends a warning reply to the user in the chat. If a file is found, it is renamed to ensure it appears correctly on the Kindle device when sent. The workflow then composes an email with the file attached and sends it to the Kindle's receiving address. If the email is sent successfully, the bot notifies the user with a success message in the chat.

Notes:
- Only Amazon-supported file types will be accepted by Kindle. If sending fails, you will receive a notification email from Amazon in your Outlook inbox.
- In case of delivery issues, retry sending the file, as network problems may occasionally interfere with the process.
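For reference, the attachment check can be sketched as below. Telegram delivers file uploads under `message.document`; the output field names are illustrative, not the template's actual node.

```javascript
// Hypothetical Code node: detect whether the Telegram message carries a file.
const msg = $input.first().json.message;

if (!msg.document) {
  // No attachment → downstream nodes send the warning reply instead.
  return [{ json: { hasFile: false, chatId: msg.chat.id } }];
}

return [{
  json: {
    hasFile: true,
    chatId: msg.chat.id,
    fileId: msg.document.file_id,     // used to download the file
    fileName: msg.document.file_name, // renamed before emailing to Kindle
  },
}];
```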
by Agent Studio
## Overview

This workflow provides Retell agent builders with a simple way to populate dynamic variables using n8n. It fetches user information from a Google Sheet based on the caller's phone number and sends it back to Retell. It is based on Retell's Inbound Webhook Call.

Retell is a service that lets you create Voice Agents that handle voice calls simply, based on a prompt or using a conversational flow builder.

## Who is it for

For builders of Retell's Voice Agents who want to make their agents more personalized.

## Prerequisites

- Have a Retell AI account
- Create a Retell agent
- Purchase a phone number and associate it with your agent
- Create a Google Sheet - for example, make a copy of this one. Your Google Sheet must have at least one column with the phone number. The remaining columns will be used to populate your Retell agent's dynamic variables.
- All fields are returned as strings to Retell (variables are replaced as text).

## How it works

1. The webhook call is received from Retell. We filter the call using their whitelisted IP address.
2. It extracts data from the webhook call and uses it to retrieve the user from Google Sheets.
3. It formats the data in the response to match Retell's expected format (sketched below).
4. Retell uses this data to replace dynamic variables in the prompts.

## How to use it

See the description for screenshots!

1. Set the webhook name (keep it as POST).
2. Copy the Webhook URL (e.g., https://your-instance.app.n8n.cloud/webhook/retell-dynamic-variables) and paste it into Retell's interface: navigate to "Phone Numbers", click on the phone number, and enable "Add an inbound webhook".
3. In your prompt (e.g., the "welcome message"), use the variable with this syntax: {{variable_name}} (see Retell's documentation). These variables will be dynamically replaced by the data in your Google Sheet.

## Notes

- In Google Sheets, the phone number must start with '+'.
- Phone numbers must be formatted like the example: with the +, extension, and no spaces.
- You can use any database—just replace Google Sheets with your own, making sure to keep the phone number formatting consistent.

👉 Reach out to us if you're interested in analysing your Retell Agent conversations.
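The response-formatting step can be sketched as follows. Retell's inbound webhook expects dynamic variables nested under a `call_inbound` key with string values; double-check the exact shape against Retell's documentation, and note that the field names here are illustrative.

```javascript
// Hypothetical Code node: turn the matched Google Sheets row into the
// response body Retell expects for inbound webhook calls.
const row = $input.first().json; // e.g. { name: "Jane", plan: "Pro", ... }

const dynamicVariables = {};
for (const [key, value] of Object.entries(row)) {
  dynamicVariables[key] = String(value); // all values must be strings
}

return [{ json: { call_inbound: { dynamic_variables: dynamicVariables } } }];
```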
by Gareth B. Davies
An automated backup solution designed for self-hosted n8n users to automatically back up their workflows to Bitbucket, leveraging Bitbucket's free private repository offering. Perfect for maintaining version control of your n8n workflows without additional costs.

## How it works

- Runs on a regular schedule to check all workflows in your n8n instance
- Compares each workflow with its version in Bitbucket
- Only uploads workflows that are new or have changed (a sketch of the upload call appears at the end of this entry)
- Uses basic rate limiting to stay within Bitbucket's API limits
- Formats filenames for easy tracking and includes timestamps in commit messages
- Handles errors gracefully with automatic retries

## Set up steps (10-15 minutes)

1. Create a free Bitbucket account and private repository
2. Create a Bitbucket App Password with repository write access
3. Add Bitbucket credentials to n8n (using your username and app password)
4. Set up n8n API access (generate an API key in your n8n instance)
5. Configure your Bitbucket workspace and repository names in the Set node
6. Optional: Adjust the backup schedule (default: 2 AM daily)

## Perfect for n8n self-hosters who want

- Version control for their workflows
- Automated daily backups
- Free private repository storage
- Easy workflow recovery
- Change tracking over time

The workflow includes basic error handling and rate limiting to ensure reliable backups even with larger numbers of workflows. Adjust your timing based on Bitbucket's API request limits: https://support.atlassian.com/bitbucket-cloud/docs/api-request-limits/.
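The upload step corresponds to Bitbucket Cloud's form-encoded source endpoint. A minimal sketch follows; workspace, repo, path and credentials are placeholders, so verify against Bitbucket's API docs before relying on it.

```javascript
// Hypothetical sketch: commit one workflow's JSON to the repo (Node 18+).
const workflowJson = { name: 'My workflow' /* ...exported workflow... */ };

const form = new URLSearchParams();
// The field name is the file path inside the repo; the value is its content.
form.append('backups/my_workflow.json', JSON.stringify(workflowJson, null, 2));
form.append('message', `n8n backup ${new Date().toISOString()}`);
form.append('branch', 'main');

await fetch('https://api.bitbucket.org/2.0/repositories/WORKSPACE/REPO/src', {
  method: 'POST',
  headers: {
    Authorization: 'Basic ' + Buffer.from('USERNAME:APP_PASSWORD').toString('base64'),
  },
  body: form, // sent as application/x-www-form-urlencoded
});
```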