by Daniel Shashko
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automates the process of scraping product data from e-commerce websites and using it to fine-tune a custom OpenAI GPT model for generating high-quality marketing copy and product descriptions.

**Main Use Cases**
- Fine-tune OpenAI models with real product data from hundreds of supported e-commerce websites for marketing content generation.
- Create custom AI models specialized in writing compelling product descriptions across different industries and platforms.
- Automate the entire pipeline from data collection to model training using Bright Data's extensive scraper library.
- Generate marketing copy using your custom-trained model via an interactive chat interface.

**How it works**
The workflow operates in two main phases (model training and model usage), organized into these stages:

*Data Collection & Processing*
- Manually triggered to start the fine-tuning process.
- Uses Bright Data's web scraper to extract product information from any supported e-commerce platform (Amazon, eBay, Shopify stores, Walmart, Target, and hundreds of other websites).
- Collects product titles, brands, features, descriptions, ratings, and availability status from your chosen platform.
- Easily customizable to scrape from different websites by simply changing the dataset configuration and product URLs.

*Training Data Preparation*
- A Code node processes the scraped product data to create training examples in OpenAI's required JSONL format (a sketch appears at the end of this description).
- For each product, it generates a complete training example with:
  - A system message defining the AI's role as a marketing assistant.
  - A user prompt containing specific product details (title, brand, features, original description snippet).
  - An assistant response providing an ideal marketing description template.
- Compiles all training examples into a single JSONL file ready for OpenAI fine-tuning.

*Model Fine-Tuning*
- Uploads the training file to OpenAI using the OpenAI File Upload node.
- Initiates a fine-tuning job via an HTTP Request to OpenAI's fine-tuning API, using GPT-4o-mini as the base model.
- The fine-tuning process runs on OpenAI's servers to create your custom model.

*Interactive Chat Interface*
- Provides a chat trigger that allows real-time interaction with your fine-tuned model.
- An AI Agent node connects to your custom-trained OpenAI model.
- Users can chat with the model to generate product descriptions, marketing copy, or other content based on the training.

*Custom Model Integration*
- The OpenAI Chat Model node is configured to use your specific fine-tuned model ID.
- Delivers responses trained on your product data for consistent, high-quality marketing content.

**Summary Flow**
Manual Trigger → Scrape E-commerce Products (Bright Data) → Process & Format Training Data (Code) → Upload Training File (OpenAI) → Start Fine-Tuning Job (HTTP Request)
In parallel: Chat Trigger → AI Agent → Custom Fine-Tuned Model Response

**Benefits**
- Fully automated pipeline from raw product data to trained AI model.
- Works with hundreds of different e-commerce websites through Bright Data's extensive scraper library.
- Creates specialized models trained on real e-commerce data for authentic marketing copy across various industries.
- Scalable solution that can be adapted to different product categories, niches, or websites.
- Interactive chat interface for immediate access to your custom-trained model.
- Cost-effective fine-tuning using OpenAI's most efficient model (GPT-4o-mini).
- Easily customizable with different websites, product URLs, training prompts, and model configurations.

**Setup Requirements**
- Bright Data API credentials for web scraping (supports hundreds of e-commerce websites).
- OpenAI API key with fine-tuning access.
- Replace placeholder credential IDs and model IDs with your actual values.
- Customize the product URLs list and Bright Data dataset for your specific website and use case.

The workflow can be adapted for any e-commerce platform supported by Bright Data's scraping infrastructure.
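For reference, here is a minimal sketch of what the training-data Code node can look like. It is an illustration, not the exact code shipped with the template: it assumes the Bright Data node outputs one item per product with `title`, `brand`, `features`, and `description` fields (adjust the names to your dataset) and emits one chat-format JSON object per line, which is the JSONL layout OpenAI's fine-tuning API expects.

```javascript
// Minimal sketch of the "Process & Format Training Data" Code node.
// Field names (title, brand, features, description, ideal_description)
// are assumptions; map them to your Bright Data dataset.
const lines = $input.all().map(item => {
  const p = item.json;
  return JSON.stringify({
    messages: [
      { role: "system", content: "You are a marketing assistant that writes compelling product descriptions." },
      { role: "user", content: `Write a product description for: ${p.title} by ${p.brand}. ` +
        `Features: ${(p.features || []).join(", ")}. ` +
        `Current description: ${(p.description || "").slice(0, 200)}` },
      // Use a curated "ideal" description if available, otherwise fall back to the scraped one.
      { role: "assistant", content: p.ideal_description || p.description }
    ]
  });
});

// One JSON object per line = JSONL, ready for the OpenAI file upload node.
return [{ json: { jsonl: lines.join("\n") } }];
```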
by Femi Ad
**Google Sheets to MailChimp Auto-Importer**

**Overview**
This n8n workflow automatically imports contacts from Google Sheets into your MailChimp mailing list. Perfect for businesses collecting leads through Google Forms, event registrations, or maintaining contact lists in spreadsheets.

**Key Features**
- 📊 Bulk Import: Process entire Google Sheets at once
- 🔄 Smart Name Parsing: Automatically splits full names into first and last names
- 📱 Phone Number Support: Includes phone numbers as merge fields
- ⚡ Error Resilience: Continues processing even if individual contacts fail
- 📝 Import Summary: Generates a summary of processed contacts

**Prerequisites**
Before using this workflow, ensure you have:
- An active n8n instance (self-hosted or cloud)
- A Google account with access to Google Sheets
- A MailChimp account with at least one audience/list created
- Basic understanding of n8n workflows

**Initial Setup**

Step 1: Import the Workflow
1. Copy the workflow JSON
2. In n8n, click "Import from File" or paste the JSON
3. Save the workflow with a meaningful name

Step 2: Configure Google Sheets Connection
1. Click on the "Get Google Sheet Data" node
2. Click on "Credential to connect with"
3. Select "Create New" and choose "Google Sheets OAuth2"
4. Follow the OAuth flow to authenticate your Google account
5. Save the credentials

Step 3: Configure MailChimp Connection
1. Click on the "Add to MailChimp" node
2. Click on "Credential to connect with"
3. Select "Create New" and choose "MailChimp OAuth2" or "MailChimp API"
4. For the API method: log into MailChimp, go to Account → Extras → API keys, generate a new API key, and copy and paste it into n8n
5. Save the credentials

Step 4: Configure Your Specific Settings

Google Sheets Settings:
- Open the "Get Google Sheet Data" node
- Replace YOUR_GOOGLE_SHEET_ID with your actual sheet ID (find it in your Google Sheets URL: https://docs.google.com/spreadsheets/d/[SHEET_ID]/edit)
- Replace YOUR_SHEET_NAME with your worksheet name (e.g., "Sheet1" or "Form Responses 1")

MailChimp Settings:
- Open the "Add to MailChimp" node
- Replace YOUR_MAILCHIMP_LIST_ID with your audience ID (find it in MailChimp under Audience → Settings → Audience name and defaults)
- Verify the status is set to "subscribed"

**Google Sheets Format Requirements**
Your Google Sheet must have the following columns (exact names):
- **Names**: Full name of the contact (e.g., "John Doe")
- **Email address**: Valid email address
- **Phone Number**: Contact phone number (optional)

Example:

| Names | Email address | Phone Number |
|-------|---------------|--------------|
| John Doe | john@example.com | +1234567890 |
| Jane Smith | jane@example.com | +0987654321 |

**How to Use**

Manual Execution:
1. Open the workflow in n8n
2. Click "Execute Workflow"
3. Monitor the execution progress
4. Check the output of "Create Import Summary" for results

Scheduling (Optional): To run this automatically:
1. Replace the "Manual Trigger" node with a "Schedule Trigger" node
2. Set your desired schedule (e.g., daily at 9 AM)
3. Activate the workflow

**Customization Options**

Adding More Fields: To include additional fields like company name or address:
1. Add columns to your Google Sheet
2. Modify the "Edit Fields" node to include new fields
3. Update the "Format Subscriber Data" code to map new fields (see the sketch after this section)
4. Add corresponding merge fields in the MailChimp node

Handling Duplicates: The workflow uses "continueRegularOutput" error handling, which means:
- Existing subscribers will be skipped
- New subscribers will be added
- The workflow continues processing

Adding Email Notifications: To receive import summaries via email:
1. Add a Gmail or Email node after "Create Import Summary"
2. Configure it with your email settings
3. Use the import summary data in the email body
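The name-splitting and merge-field mapping can live in a small Code node. The sketch below is illustrative rather than the exact "Format Subscriber Data" code in the template: it assumes the sheet columns are named exactly "Names", "Email address", and "Phone Number", and maps them to MailChimp's standard FNAME, LNAME, and PHONE merge fields.

```javascript
// Illustrative "Format Subscriber Data" Code node: splits the full name
// into first/last and prepares MailChimp merge fields.
// Column names must match your sheet headers exactly (case-sensitive).
return $input.all().map(item => {
  const row = item.json;
  const parts = (row["Names"] || "").trim().split(/\s+/);
  const firstName = parts.shift() || "";
  const lastName = parts.join(" ");
  return {
    json: {
      email: row["Email address"],
      status: "subscribed",
      mergeFields: {
        FNAME: firstName,
        LNAME: lastName,
        PHONE: row["Phone Number"] || "",
      },
    },
  };
});
```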
**Troubleshooting**

Common Issues:
- "Invalid API Key" (MailChimp): Verify your API key is correct and check that your MailChimp account is active.
- "Sheet not found" (Google Sheets): Verify the sheet ID is correct and ensure the authenticated Google account has access to the sheet.
- "Email already exists" errors: This is normal for existing subscribers; the workflow will continue processing other contacts.
- Missing data in MailChimp: Check that column names match exactly (they are case-sensitive) and verify the data exists in the Google Sheet.

**Best Practices**
- Test First: Always test with a small dataset first
- Backup Data: Export your MailChimp list before large imports
- Clean Data: Ensure email addresses are valid before importing
- Monitor Regularly: Check import summaries for any issues
- Respect Privacy: Only import contacts who have consented to receive emails

**Support**
For issues specific to:
- n8n platform: Visit the n8n Community Forum
- Google Sheets API: Check the Google Developers Documentation
- MailChimp API: See the MailChimp API Documentation

Need help customizing? Contact me for consulting and support or add me on LinkedIn: https://www.linkedin.com/in/femi-adedayo-h44/

**License**
This workflow template is provided free for personal and commercial use. Feel free to modify and share!
by John Alejandro Silva
**🤖🥗 Telegram Nutrition AI Assistant (Alternative to Cal AI App)**

> AI-powered nutrition assistant for Telegram — log meals, set goals, and get personalized daily reports with Google Sheets integration.

**📋 Description**
This n8n template creates a Telegram-based Nutrition AI Assistant 🥑🔥 designed as an open-source alternative to the Cal AI mobile app. It allows users to interact with an AI agent via text, voice, or images to track meals, calculate macros, and monitor nutrition goals directly from Telegram. The system integrates Google Sheets as the database, handling both user profiles and meal logs, while leveraging Gemini AI for natural conversation, food recognition, and daily progress reports.

**✨ Key Features**
- 💬 Multi-input support: text, voice messages (transcribed), and food images (AI analysis).
- 📊 Macro calculation: automatic estimation of calories, proteins, carbs, and fats.
- 📝 User-friendly registration: simple onboarding without storing personal health data (no weight/height required).
- 🎯 Goal tracking: users can set and update calorie and protein targets.
- 📈 Daily reports: personalized progress messages with visual progress bars.
- 🗂 Google Sheets integration: a Profile table for user targets and a Meals table for food logs.
- 🔄 Advanced n8n nodes: includes Merge, Subworkflow, and Code nodes for data processing and report generation (a report-generation sketch follows this description).

**💡 Acknowledgment**
Inspired by the Cal AI concept 💡 — this template demonstrates how to reproduce its main functionality with n8n, Telegram, and AI agents as a flexible, open-source automation workflow.

**🏷 Tags**
telegram, ai-assistant, nutrition, meal-tracking, google-sheets, food-logging, voice-transcription, image-analysis, daily-reports, n8n-template, merge-node, subworkflow-node, code-node, telegram-trigger, google-gemini

**💼 Use Case**
Use this template if you want to:
- 🥗 Log meals using text, images, or voice messages.
- 📊 Track nutrition goals (calories, proteins) with daily progress updates.
- 🤖 Provide a chat-based nutrition assistant without building a full app.
- 🗂 Store structured nutrition data in Google Sheets for easy access and analysis.

**💬 Example User Interactions**
- 📸 User sends a photo of a meal → AI analyzes the food and logs calories/macros.
- 🎤 User sends a voice message → AI transcribes and logs the meal.
- ⌨️ User types "report" → AI returns a daily nutrition summary with progress bars.
- 🥅 User says "update my protein goal" → AI updates the profile in Google Sheets.

**🔑 Required Credentials**
- Telegram Bot API (bot token)
- Google Sheets API credentials
- AI provider API (Google Gemini or a compatible LLM)

**⚙️ Setup Instructions**
1. 🗂 Create two Google Sheets tables:
   - Profile: User_ID, Name, Calories_target, Protein_target
   - Meals: User_ID, Date, Meal_description, Calories, Proteins, Carbs, Fats
2. 🔌 Configure the Telegram Trigger with your bot token.
3. 🤖 Connect your AI provider credentials (Gemini recommended).
4. 📑 Connect Google Sheets with your credentials.
5. ▶️ Deploy the workflow in n8n.
6. 🎯 Start interacting with your nutrition assistant via Telegram.

**📌 Extra Notes**
- 🟩 Green section: handles the Telegram trigger and user check.
- 🟥 Red section: registers new users and sets goals.
- 🟦 Blue section: processes text, voice, and images.
- 🟨 Yellow section: generates nutrition reports.
- 🟪 Purple section: main AI agent controlling tools and logic.

**💡 Need Assistance?**
If you'd like help customizing or extending this workflow, feel free to reach out:
- 📧 Email: johnsilva11031@gmail.com
- 🔗 LinkedIn: John Alejandro Silva Rodríguez
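As an illustration of the report-generation Code node mentioned above, the sketch below renders a text progress bar from daily totals. The field names (caloriesToday, caloriesTarget, proteinToday, proteinTarget) are assumptions; map them to whatever your Meals and Profile lookups actually return.

```javascript
// Illustrative report-generation Code node: builds a daily summary with
// text progress bars. Input field names are placeholders for your own schema.
function bar(current, target, width = 10) {
  const ratio = target > 0 ? Math.min(current / target, 1) : 0;
  const filled = Math.round(ratio * width);
  return "▓".repeat(filled) + "░".repeat(width - filled);
}

const d = $input.first().json;
const report =
  `📈 Daily report\n` +
  `🔥 Calories: ${d.caloriesToday}/${d.caloriesTarget} ${bar(d.caloriesToday, d.caloriesTarget)}\n` +
  `🥩 Protein: ${d.proteinToday}g/${d.proteinTarget}g ${bar(d.proteinToday, d.proteinTarget)}`;

// The Telegram node downstream sends this text back to the user.
return [{ json: { report } }];
```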
by Lucas Peyrin
**How it works**

This workflow demonstrates a fundamental pattern for securing a webhook by requiring an API key. It acts as a gatekeeper, checking for a valid key in the request header before allowing the request to proceed.

1. Incoming Request: The Secured Webhook node receives an incoming POST request. It expects an API key to be sent in the x-api-key header.
2. API Key Verification: The Check API Key node takes the key from the incoming request's header. It then makes an internal HTTP request to a second webhook (Get API Key) which acts as a mock database. This second webhook retrieves a list of registered API keys (from the Registered API Keys node) and filters it to find a match for the key that was provided.
3. Conditional Response: If a match is found, the API Key Identified node routes the execution to the "success" path, returning a 200 OK response with the identified user's ID. If no match is found, it routes to the "unauthorized" path, returning a 401 Unauthorized error.

This pattern separates the public-facing endpoint from the data source, which is a good security practice.

**Set up steps**

Setup time: ~2 minutes. This workflow is designed to be a self-contained example.

1. Set up credentials: This workflow uses "Header Auth" for its internal communication. Go to Credentials and create a new Header Auth credential. You can use any name and value (e.g., Name: X-N8N-Auth, Value: my-secret-password). Select this credential in all four webhook/HTTP Request nodes.
2. Add your API keys: Open the Registered API Keys node. This is your mock database. Edit the array to include the user_id and api_key pairs you want to authorize.
3. Activate the workflow.
4. Test it: Use the Test Secure Webhook node to send a request. Try it with a valid key from your list to see the success response. Change the x-api-key header to an invalid key to see the 401 Unauthorized error.
5. For production: Replace the mock database part of this workflow (the Get API Key webhook and Registered API Keys node) with a real database node like Supabase, Postgres, or Baserow to look up keys.
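Conceptually, the verification step reduces to the lookup sketched below. This is a simplification (a hardcoded array stands in for the Registered API Keys node, and a boolean flag stands in for the IF routing), but it shows the comparison the workflow performs against the x-api-key header.

```javascript
// Simplified sketch of the key-matching logic.
// In the real workflow the list comes from the Registered API Keys node
// behind the Get API Key webhook; the values here are placeholders.
const providedKey = $json.headers["x-api-key"];

const registeredKeys = [
  { user_id: "user_1", api_key: "key-abc" },
  { user_id: "user_2", api_key: "key-def" },
];

const match = registeredKeys.find(k => k.api_key === providedKey);

// A downstream IF node routes to the 200 OK or 401 Unauthorized response.
return [{ json: { authorized: Boolean(match), user_id: match ? match.user_id : null } }];
```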
by Oneclick AI Squad
This n8n workflow automates subdomain creation and deletion on GoDaddy using their API, triggered via email requests. This empowers developers to manage subdomains directly without involving DevOps for minor tasks.

**Good to know**
- Ensure GoDaddy API credentials are securely configured to avoid unauthorized access.
- Email parsing accuracy depends on the consistency of request formats.

**How it works**
1. Detect new email requests using the Start Workflow (GET Request) node.
2. Use the Extract Data from Email node to parse relevant details (e.g., subdomain name, action type).
3. Validate the action type with the Validate Action Type node to proceed with create (true) or delete (false).
4. If true, the Create Subdomain node sends a POST request to GoDaddy's API to create the subdomain.
5. If false, the Delete Subdomain node sends a DELETE request to remove the subdomain.
6. The Send Email Response node notifies the requester of the action's success or failure.

**How to use**
- Import the workflow into n8n and configure the nodes with your GoDaddy API and email credentials.
- Test with sample email requests to ensure proper parsing and API calls.

**Requirements**
- GoDaddy API credentials
- Email service (e.g., SMTP or API) for notifications

**Customising this workflow**
Adjust the Extract Data from Email node to match your email format or add additional validation steps for security.
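For orientation, the HTTP Request nodes roughly correspond to the two calls sketched below against GoDaddy's v1 DNS API. Treat the paths, HTTP verbs, record type, and body as assumptions to verify against GoDaddy's current API documentation; the domain, subdomain, and IP values are placeholders.

```javascript
// Rough sketch of the create/delete calls (verify against GoDaddy's DNS API docs).
const domain = "example.com";   // your root domain
const subdomain = "dev";        // parsed from the email request
const auth = "sso-key YOUR_KEY:YOUR_SECRET";

// Create: upsert an A record for the subdomain.
await fetch(`https://api.godaddy.com/v1/domains/${domain}/records/A/${subdomain}`, {
  method: "PUT",
  headers: { Authorization: auth, "Content-Type": "application/json" },
  body: JSON.stringify([{ data: "203.0.113.10", ttl: 600 }]),
});

// Delete: remove the record when the requested action is "delete".
await fetch(`https://api.godaddy.com/v1/domains/${domain}/records/A/${subdomain}`, {
  method: "DELETE",
  headers: { Authorization: auth },
});
```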
by Gareth B. Davies
An automated backup solution designed for self-hosted n8n users to automatically back up their workflows to Bitbucket, leveraging Bitbucket's free private repository offering. Perfect for maintaining version control of your n8n workflows without additional costs.

**How it works**
- Runs on a regular schedule to check all workflows in your n8n instance
- Compares each workflow with its version in Bitbucket
- Only uploads workflows that are new or have changed
- Uses basic rate limiting to stay within Bitbucket's API limits
- Formats filenames for easy tracking and includes timestamps in commit messages
- Handles errors gracefully with automatic retries

**Set up steps (10-15 minutes)**
1. Create a free Bitbucket account and private repository
2. Create a Bitbucket App Password with repository write access
3. Add Bitbucket credentials to n8n (using your username and app password)
4. Set up n8n API access (generate an API key in your n8n instance)
5. Configure your Bitbucket workspace and repository names in the Set node
6. Optional: Adjust the backup schedule (default: 2 AM daily)

**Perfect for n8n self-hosters who want**
- Version control for their workflows
- Automated daily backups
- Free private repository storage
- Easy workflow recovery
- Change tracking over time

The workflow includes basic error handling and rate limiting to ensure reliable backups even with larger numbers of workflows. Adjust your timing based on Bitbucket's API request limits: https://support.atlassian.com/bitbucket-cloud/docs/api-request-limits/
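The "only upload what changed" step can be as simple as a string comparison between the exported workflow JSON and the file previously fetched from Bitbucket. The sketch below is a hypothetical Code-node version of that check; the field names (workflow, remoteFileContent) are placeholders for whatever your fetch step outputs.

```javascript
// Hypothetical change-detection step: compare the current workflow export
// with the file content previously retrieved from the Bitbucket repo.
const current = JSON.stringify($json.workflow, null, 2);
const remote = $json.remoteFileContent || "";   // empty if the file does not exist yet

const changed = current !== remote;

// Sanitize the workflow name so it is safe to use as a filename.
const fileName = `${$json.workflow.name.replace(/[^a-zA-Z0-9-_]/g, "_")}.json`;

// Only items with changed === true continue to the Bitbucket upload request.
return [{ json: { changed, fileName, content: current } }];
```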
by Oneclick AI Squad
This n8n workflow automates personalized travel assistance via WhatsApp through a friendly virtual agent named Alex. It helps users plan trips, explore destinations, get visa/weather/hotel information, and book packages, all through a conversational interface. The system ensures quick, human-like support 24/7, improving customer experience and reducing manual handling by travel agents.

**Key Features**
- The Travel Assistant agent provides contextual responses based on conversation history stored in memory.
- Alex maintains a friendly, professional tone throughout all interactions to enhance user experience.
- The workflow includes intelligent waiting mechanisms to ensure proper response processing.
- Memory functionality allows for seamless continuation of conversations across multiple interactions.

**Workflow Process**
1. The Get WhatsApp Message node captures incoming messages from users on WhatsApp, initiating the travel assistance process.
2. The Travel Assistant node processes user queries using AI to understand travel needs and generate appropriate responses for trip planning, destination information, visa requirements, weather updates, and booking assistance.
3. The Travel Plan Creator agent works in conjunction with the main assistant to generate detailed itineraries and travel recommendations based on user preferences.
4. The Memory node stores conversation context and user preferences, enabling personalized responses and seamless conversation flow across multiple interactions.
5. The Wait For Response node introduces intelligent delays to ensure proper message processing and natural conversation pacing.
6. The Send Reply On WhatsApp node delivers the AI-generated travel assistance back to the user through WhatsApp messaging.

**Setup Instructions**
1. Import the workflow into n8n and configure WhatsApp Business API credentials for message handling.
2. Set up the AI service for the Travel Assistant and Travel Plan Creator agents with your preferred language model.
3. Configure the Memory node with appropriate storage settings for conversation persistence.
4. Test the workflow by sending various travel-related queries through WhatsApp to ensure proper responses.
5. Monitor conversation quality and adjust AI parameters as needed for optimal user experience.

**Prerequisites**
- WhatsApp Business API access or a WhatsApp integration service
- AI/LLM service for travel assistance (OpenAI, Anthropic, or similar)
- Database or storage service for conversation memory
- Access to travel data APIs for real-time information (weather, visa requirements, hotel availability)

**Modification Options**
- Modify the Travel Assistant node to include specific travel databases, local recommendations, or branded responses.
- Adjust the conversation memory settings to control how much context is retained across interactions.
- Customize the Travel Plan Creator to include preferred booking platforms, hotel chains, or travel partners.
- Add additional specialized agents for specific travel services like flight booking, car rentals, or activity reservations.
- Configure response timing in the Wait For Response node to match your desired conversation flow.
by Daniel Nolde
**What this does**
Shows you how to use XML-RPC APIs via the generic HTTP Request node, using the example of posting to a WordPress blog. This is also a feasible workaround if a specific n8n integration does not work or stops working (which happens, e.g., with the WordPress node).

**How it works**
1. First, the XML payload for the request is prepared (in a Code node, which also properly escapes special characters in the values that you want to send to the XML-RPC endpoint).
2. Then, the HTTP Request node sends the request using the HTTP POST method.
3. Last, the returned XML response is converted to JSON, which a conditional node uses to determine whether the operation was successful or not.

**Setup steps**
1. Import the workflow.
2. Ensure you have a WordPress blog with a user that has an app password.
3. Edit the "Settings" node and enter your individual values for URL/user/app password.
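As a concrete example of the payload-preparation step, the Code-node sketch below escapes the values and assembles an XML-RPC body for WordPress's wp.newPost method. It is a minimal sketch, not the template's exact code: the incoming field names (user, appPassword, title, content) are assumed to come from the Settings node, and the struct fields follow the WordPress XML-RPC documentation.

```javascript
// Minimal sketch: escape values and build the XML-RPC payload for wp.newPost.
const esc = s => String(s)
  .replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;")
  .replace(/"/g, "&quot;").replace(/'/g, "&apos;");

// Assumed to be provided by the Settings node; rename to match your setup.
const { user, appPassword, title, content } = $json;

const xml = `<?xml version="1.0"?>
<methodCall>
  <methodName>wp.newPost</methodName>
  <params>
    <param><value><int>0</int></value></param>
    <param><value><string>${esc(user)}</string></value></param>
    <param><value><string>${esc(appPassword)}</string></value></param>
    <param><value><struct>
      <member><name>post_title</name><value><string>${esc(title)}</string></value></member>
      <member><name>post_content</name><value><string>${esc(content)}</string></value></member>
      <member><name>post_status</name><value><string>publish</string></value></member>
    </struct></value></param>
  </params>
</methodCall>`;

// The HTTP Request node POSTs this body to https://your-blog.example/xmlrpc.php
return [{ json: { xml } }];
```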
by Mirajul Mohin
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Automatically transform your video uploads into AI-powered summaries with key topic extraction and instant team notifications.

**What this workflow does**
- Monitors Google Drive for new video uploads
- Downloads and processes videos using VLM Run AI
- Generates intelligent summaries with key topics extracted
- Posts results to Slack for immediate team access

**Setup**
Prerequisites: Google Drive account, VLM Run API credentials, Slack workspace, self-hosted n8n. You need to install the VLM Run community node.

Quick setup:
1. Configure Google Drive OAuth2 and create a video upload folder
2. Add VLM Run API credentials
3. Set up Slack integration for notifications
4. Update the folder/channel IDs in the workflow nodes
5. Test and activate

**Perfect for**
- Meeting recordings and training videos
- Webinar summaries and educational content
- Content analysis and team collaboration
- Any video content requiring quick insights

**Key Benefits**
- **Asynchronous processing** handles large files without timeouts
- **Multi-format support** for MP4, AVI, MOV, WebM, MKV
- **Instant team updates** via Slack notifications
- **Saves hours** of manual video review time

**How to customize**
Extend by adding:
- Video categorization and tagging
- Integration with project management tools
- Email notifications alongside Slack
- Searchable video databases with summaries

This workflow transforms lengthy videos into actionable insights, making your content instantly accessible and shareable with your team.
by Fan Luo
**Daily Company News Bot**

This n8n template demonstrates how to use the free FinnHub API to retrieve company news for a list of stock tickers and post messages to a Slack channel at a pre-scheduled time.

**How it works**
1. First, define the list of stock tickers you are interested in.
2. Loop over the items, calling the FinnHub API to get the latest company news for each ticker.
3. Format the company news as markdown text that can be sent to Slack.
4. Post a new message in the Slack channel.
5. Wait for 5 seconds, then move on to the next ticker.

**How to use**
Simply set up a schedule trigger to run the workflow automatically.

**Requirements**
- FinnHub API key
- Slack channel webhook

**Need Help?**
Contact me via My Blog or ask in the Forum! Happy Hacking!
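As a rough sketch of the formatting step, the Code node below turns articles returned by FinnHub's /company-news endpoint into a Slack-ready markdown message. The ticker field and the exact shape of the HTTP Request output are assumptions; adapt them to your workflow.

```javascript
// Illustrative formatting step: build a Slack mrkdwn message from FinnHub articles.
// Assumes each incoming item is one article with headline, source and url fields,
// and that the current ticker was carried through from the ticker list.
const ticker = $json.ticker || "TICKER";
const articles = $input.all().map(i => i.json).slice(0, 5);

const text = [
  `*📰 ${ticker} — latest company news*`,
  ...articles.map(a => `• <${a.url}|${a.headline}> (${a.source})`),
].join("\n");

// The Slack node posts this text to the channel.
return [{ json: { text } }];
```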
by Abdullah
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**Overview**
This workflow automates the process of transcribing audio files and summarizing them using OpenAI models, with the final output stored neatly in Notion. Whether you're a researcher, content creator, student, or professional, this automation saves time by converting voice recordings into actionable summaries with zero manual effort.

Created by: Abdullah Dilshad
Contact: iamabdullahdilshad@gmail.com

**Who It's For**
This template is ideal for:
- **Researchers**: Transcribe and summarize interviews, lectures, or research recordings.
- **Content Creators**: Convert podcasts or videos into transcripts and social captions/show notes.
- **Students**: Automatically turn lectures or study group audio into summarized notes.
- **Professionals**: Log meeting notes and summaries directly into your Notion workspace.

**How It Works**
This four-step workflow performs the following:
- Step 1: Trigger on New Audio in Google Drive. Automatically triggers when a new audio file (MP3/WAV) is uploaded to a specified Google Drive folder; the file is then downloaded for processing.
- Step 2: Transcribe Audio with Whisper. The audio file is sent to OpenAI's Whisper model for high-accuracy transcription.
- Step 3: Summarize Transcript with GPT-4. The transcript is passed to GPT-4, which generates a clean, concise summary.
- Step 4: Store Summary in Notion. A new Notion page is created with the generated summary and optional metadata (file name, upload time, etc.).

**Setup Instructions**
- Step 1: Google Drive Trigger. Connect your Google Drive account and select the folder you want to monitor. This node detects new file uploads and passes the file for download.
- Step 2: Download File. Downloads the new audio file for transcription.
- Step 3: Transcribe Recording (OpenAI Whisper). Connect your OpenAI API key and ensure this node receives the binary audio file. It will return the transcription as plain text.
- Step 4: Summarize Transcript (GPT-4 via AI Agent). Use your OpenAI API key and configure a summarization prompt like: "Summarize the following transcript in a clear and concise manner:". Connect the output from Whisper into this GPT-4 prompt.
- Step 5: Notion Integration. Connect your Notion account, choose or create a database to store summaries, and map the GPT output (summary) to a "Text" or "Rich Text" property. Optionally include metadata like the filename and file upload date.
by Airtop
**Automating LinkedIn Competitive Monitoring**

**Use Case**
Automatically track and summarize LinkedIn posts from key executives at competitor companies. This agent provides structured insights into hiring trends, product announcements, strategic shifts, and thought leadership, helping teams stay informed and responsive without manual monitoring.

**What This Automation Does**
This automation monitors and summarizes LinkedIn posts from competitor profiles and shares the results on Slack. It uses the following input parameters:
- **Airtop Profile**: A browser profile authenticated to LinkedIn. Create one.
- **Google Sheet**: A document listing the LinkedIn profile URLs of competitors; copy this one.
- **Slack Channel**: The destination for sharing summarized post insights.

**How It Works**
1. Trigger: The workflow is scheduled to run weekly at a specific time.
2. Data Collection: Retrieves the list of competitor LinkedIn URLs from a Google Sheet.
3. Browser Automation: Uses Airtop to navigate to each LinkedIn profile and analyze up to 5 recent posts.
4. Summarization: Summarizes the number of recent posts, main topics, and engagement levels using Airtop's AI.
5. Slack Notification: Posts a formatted summary to a predefined Slack channel.

**Setup Requirements**
- Airtop API key (free to generate).
- An Airtop Profile authenticated to LinkedIn.
- Google Sheet with competitor profile URLs; copy this one.
- Slack bot credentials with access to the target channel.

**Next Steps**
- **Expand Coverage**: Add more competitor profiles to the Google Sheet to scale monitoring.
- **Integrate with CRM**: Feed summarized insights into your CRM for competitor tracking.
- **Enhance Analysis**: Include post-level engagement metrics over time for trend analysis.

Read more about competitive analysis using LinkedIn.