by Hamed Nickmehr
This n8n workflow template uses community nodes and is only compatible with the self-hosted version of n8n.

Title: n8n Credentials and Workflows Backup on Change Detection

Purpose: Never lose track of your n8n changes again! This workflow backs up all your workflows and credentials, automatically detects any changes using hash comparison, and pushes updates to GitHub, but only when something has actually changed. Set your own interval and stop cluttering your repo with redundant commits.

Walkthrough Video on YouTube

Trigger:
- **Schedule Trigger**: Executes the entire process at a user-defined interval. No need to worry about traceability or managing countless backups, as the workflow only commits changes when a difference is detected.

Workflow Backup Process:
- **Set Workflow Path**: Defines the local backup file path for workflows.
- **Get Old Workflow Hash**: Executes a helper workflow to retrieve the previous hash.
- **Execute Workflow Backup**: Runs `n8n export:workflow` to export all workflows to the defined file path.
- **Get New Workflow Hash**: Executes a helper workflow to generate the new hash from the exported file.
- **Compare Hashes (If Workflow Updated)**: Checks whether the new hash differs from the old one.
- **If Updated**: Read Workflow Data → Extract Text → Push to GitHub: Reads, extracts, and commits the updated workflow JSON to GitHub under a timestamped filename.

Credential Backup Process:
- **Set Credential Path**: Defines the local backup file path for credentials.
- **Get Old Credential Hash**: Executes a helper workflow to retrieve the previous hash.
- **Execute Credential Backup**: Runs `n8n export:credentials` to export all credentials.
- **Get New Credential Hash**: Executes a helper workflow to generate the new hash from the exported file.
- **Compare Hashes (If Credential Updated)**: Checks for changes.
- **If Updated**: Read Credential Data → Extract Text → Push to GitHub: Commits the new credentials JSON to GitHub if changes are found.
Hash Generator (Helper Flow): Used in both workflow and credential backup paths:
- **Read File → Extract Text → Hash Data**: Outputs the SHA-256 hash used for comparison.

GitHub Integration:
- Commits are created with an ISO timestamp in the filename and message.
- Repository: https://github.com/your-github-name/n8n-onchange-bachup
- File paths: `backups/WorkFlow Backup -timestamp-.json` and `backups/Credential Backup -timestamp-.json`

Change Detection Logic:
- Only commits files when hash changes are detected (i.e., actual content change).
- Avoids unnecessary GitHub commits and storage use.

Error Handling:
- GitHub nodes are set to continue workflow execution on error, avoiding full process interruption.
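The change-detection logic above can be sketched outside n8n as well. A minimal Python sketch, where the file paths and the plain-text hash store are hypothetical stand-ins for the helper flow:

```python
import hashlib
from pathlib import Path

def file_sha256(path: Path) -> str:
    """Hash the exported backup file, mirroring the Hash Data helper step."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup_changed(export_path: Path, hash_store: Path) -> bool:
    """Compare the new hash with the stored one; update the store if it differs.

    Returns True only when the export's content actually changed, which is
    the condition under which the workflow pushes a commit to GitHub.
    """
    new_hash = file_sha256(export_path)
    old_hash = hash_store.read_text().strip() if hash_store.exists() else ""
    if new_hash == old_hash:
        return False  # no change: skip the GitHub commit
    hash_store.write_text(new_hash)
    return True  # change detected: commit the file
```

Running `backup_changed` twice in a row on an unchanged export returns `True` then `False`, which is exactly why the repo stays free of redundant commits.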
by Danielle Gomes
This n8n workflow collects and summarizes news from multiple RSS feeds, using OpenAI to generate a concise summary that can be sent to WhatsApp or other destinations. Perfect for automating your daily news digest.

🔁 Workflow Breakdown:
- **Schedule Trigger**: Starts the workflow on your desired schedule (daily, hourly, etc.). 🟨 Note: Set the trigger however you wish.
- **RSS Feeds (My RSS 01–04)**: Fetches articles from four different RSS sources. 🟨 Note: You can add as many RSS feeds as you want.
- **Edit Fields (Edit Fields1–3)**: Normalizes RSS fields (title, link, etc.) to ensure consistency across different sources.
- **Merge (append mode)**: Combines the RSS items into a single unified list.
- **Filter**: Optionally filters articles by keywords, date, or categories.
- **Limit**: Limits the analysis to the 10 most recent articles. 🟨 Note: This keeps the result concise and avoids overloading the summary.
- **Aggregate**: Prepares the selected news for summarization by combining the articles into a single content block.
- **OpenAI (Message Assistant)**: Summarizes the aggregated news items in a clean, readable format using AI.
- **Send Summary to WhatsApp**: Sends the AI-generated summary to a WhatsApp endpoint via webhook (yoururlapi.com). You can replace this with an email service, Google Drive, or any other destination. 🟨 Note: You can send it to your WhatsApp API, email, drive, etc.
- **No Operation (End)**: Final placeholder to safely close the workflow. You may expand from here if needed.
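The Merge → Limit → Aggregate steps above amount to a small merge-sort-cap routine. A sketch under the assumption that each normalized item is a dict with a sortable `date` plus `title` and `link` fields (field names are assumptions, not the nodes' actual schema):

```python
def digest_items(feeds, limit=10):
    """Merge items from several feeds and keep the `limit` most recent,
    mirroring the Merge (append) and Limit steps."""
    merged = [item for feed in feeds for item in feed]   # Merge, append mode
    merged.sort(key=lambda item: item["date"], reverse=True)  # newest first
    return merged[:limit]                                # Limit node

def to_content_block(items):
    """Aggregate the selected articles into one text block for the AI summary."""
    return "\n".join(f"- {item['title']} ({item['link']})" for item in items)
```

The single content block is what gets handed to the OpenAI node as the summarization prompt's input.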
by Alex Halfborg
BACKGROUND: Malaysia's Inland Revenue Board (LHDN) provides an API to look up the Tax Identification Number (TIN) for a business entity, based on a given Business Registration Number (BRN or SSM) or NRIC (MyKad).

PROBLEM: The API only allows one search at a time.

SOLUTION: This free workflow lets you run a batch search to get the TIN for multiple SSM or NRIC numbers. This is useful if you need to prepare your internal DB for e-invoicing.

PRE-REQUISITES:
1) Get your connection client ID and client secret from the myhasil.gov.my website.
2) Prepare a Google Sheet containing the list of SSM and NRIC numbers you want to look up.
3) Create an n8n credential to connect to the Google Sheet above.

SUPPORT: Questions? Ask alex at halfborg dot com
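The batching idea is simply a loop over the one-at-a-time API. A sketch where `search_one` stands in for the real LHDN lookup call (this is not the actual API client; endpoint and auth are omitted):

```python
def batch_search(ids, search_one):
    """Run a one-at-a-time TIN search over a whole list of SSM/NRIC values.

    `search_one` is any callable that performs a single lookup; a failed
    lookup is recorded instead of aborting the batch, so one bad row in the
    Google Sheet does not stop the rest.
    """
    results = {}
    for entity_id in ids:
        try:
            results[entity_id] = search_one(entity_id)
        except Exception as err:  # keep going on per-row failures
            results[entity_id] = f"ERROR: {err}"
    return results
```

In the workflow, the loop body would be the authenticated HTTP call and the results would be written back to the sheet.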
by Sarfaraz Muhammad Sajib
This n8n workflow sends SMS messages through the Textbelt API by accepting a phone number, message, and API key as inputs. It uses a manual trigger to start the process, sets the necessary data, and executes an HTTP POST request to deliver the SMS.

Step-by-Step Explanation:
- **Manual Trigger**: Starts the workflow manually by clicking 'Execute workflow'.
- **Set Data Node**: Defines the required input parameters (phone, message, and key) that will be sent to the SMS API. Populate these fields with your target phone number, the text message, and your Textbelt API key.
- **HTTP Request Node**: Sends a POST request to https://textbelt.com/text with the phone number, message, and API key in the request body. The response from the API confirms whether the message was successfully sent.
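The HTTP Request node's POST can be sketched with Python's standard library. The request is only built here, not sent, and the phone number and key are placeholder values:

```python
from urllib import parse, request

TEXTBELT_URL = "https://textbelt.com/text"

def build_sms_request(phone: str, message: str, key: str) -> request.Request:
    """Build the form-encoded POST the HTTP Request node sends to Textbelt.

    Calling request.urlopen(...) on the result would actually send the SMS
    and consume quota, so this sketch stops at constructing the request.
    """
    body = parse.urlencode({"phone": phone, "message": message, "key": key}).encode()
    return request.Request(TEXTBELT_URL, data=body, method="POST")
```

Textbelt replies with JSON indicating whether the send succeeded, which is what the workflow surfaces as confirmation.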
by Dataki
This workflow helps you generate an llms.txt file (if you're unfamiliar with it, check out this article) using a Screaming Frog export. Screaming Frog is a well-known website crawler. You can easily crawl a website, then export the "internal_html" section in CSV format.

How It Works:
A form allows you to enter:
- The name of the website
- A short description
- The internal_html.csv file from your Screaming Frog export

Once the form is submitted, the workflow is triggered automatically, and you can download the llms.txt file directly from n8n.

Downloading the File:
Since the last node in this workflow is "Convert to File", you will need to download the file directly from the n8n UI. However, you can easily add a node (e.g., Google Drive, OneDrive) to automatically upload the file wherever you want.

AI-Powered Filtering (Optional):
This workflow includes a text classifier node, which is deactivated by default. You can activate it to apply a more intelligent filter when selecting URLs for the llms.txt file. Consider modifying the description in the classifier node to specify the type of URLs you want to include.

How to Use This Workflow:
1. Crawl the website you want to generate an llms.txt file for using Screaming Frog.
2. Export the "internal_html" section in CSV format.
3. In n8n, click "Test Workflow", fill in the form, and upload the internal_html.csv file.
4. Once the workflow is complete, go to the "Convert to File" node and download the output.

That's it! You now have your llms.txt file!

Recommended Usage: Use this workflow directly in the n8n UI by clicking "Test Workflow" and uploading the file in the form.
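Under the hood, the conversion is roughly CSV rows → markdown link list. A sketch assuming the Screaming Frog export's "Address" and "Title 1" column names (an assumption about the export format) and a common llms.txt layout:

```python
import csv
import io

def build_llms_txt(site_name: str, description: str, internal_html_csv: str) -> str:
    """Turn a Screaming Frog internal_html CSV export into llms.txt-style markdown.

    The "Address" (URL) and "Title 1" (page title) column names are
    assumptions about the crawler's export, not guaranteed by this workflow.
    """
    reader = csv.DictReader(io.StringIO(internal_html_csv))
    lines = [f"# {site_name}", "", f"> {description}", ""]
    for row in reader:
        lines.append(f"- [{row['Title 1']}]({row['Address']})")
    return "\n".join(lines)
```

The optional classifier node would simply drop rows from this list before the markdown is assembled.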
by Franz
🚀 What the "Agent Builder" template does

Need to turn a one-line chat request into a fully wired n8n workflow template, complete with AI agents, RAG, and web-search superpowers, without lifting a finger? That's exactly what Agent Builder automates:

- Listens to any incoming chat message (via the Chat Trigger).
- Spins up an AI architect that analyses the request, searches the web, reads n8n docs from a Pinecone vector store, and designs the smallest possible set of nodes.
- Auto-generates a ready-to-import JSON template and hands it back as a downloadable file, plus all the supporting assets (embeddings, vector store, etc.) so the next prompt is even smarter.

Think of it as your personal "workflow chef": you shout the order, it shops for ingredients, cooks, plates, and serves the meal. All you do is eat. 🤗

Who will love this?
- **No-code builders / power users** who don't want to wrestle with AI node wiring.
- **Agencies & consultants** delivering lots of bespoke automations.
- **Internal platform teams** who need a "workflow self-service portal" for non-technical colleagues.

🧩 How it's wired

| Sub-process | What happens inside | Key nodes |
| --- | --- | --- |
| Web Crawler (optional) | Firecrawl scrapes docs.n8n.io (or any URL you drop in) and streams raw markdown back. | Set URL → HTTP Request (Extract) → Wait & Retry |
| RAG Trainer | Splits the scraped docs, embeds them with OpenAI, and upserts vectors into Pinecone. | Recursive Text Splitter → Embeddings OpenAI → Train Pinecone |
| Agent Builder | The star of the show – orchestrates GPT-4o (via OpenRouter), SerpAPI web search, your Pinecone index, and a Structured Output Parser to produce → validate → prettify the final n8n template. | Chat Trigger → AI Agent → OpenAI (validator) → Code (extract) → Convert to JSON file |

Every arrow in the drawn workflow is pre-connected, so the generated template always passes n8n's import check.

🛠️ Getting set up (5 quick creds)

| Service | Credential type |
| --- | --- |
| OpenAI / Azure OpenAI – embeddings & validation | OpenAI API |
| Pinecone – vector store | Pinecone API |
| OpenRouter – GPT-4o LLM | OpenRouter API Key |
| SerpAPI – web search | SerpAPI Key |
| Firecrawl (only if you plan to crawl) | Generic Header Auth → Authorization: Bearer YOUR_KEY |

Each node already expects those creds; just create them once, select them in the dropdown, and hit Activate.

🏃‍♀️ What a typical run looks like

User says: "Build me a workflow that monitors our support inbox, summarises new tickets with GPT and posts to Slack."

1. Chat Trigger captures the message.
2. AI Agent queries Pinecone for relevant n8n docs, fires a SerpAPI search for "n8n gmail trigger example", and sketches an architecture (Gmail Trigger → GPT Model → Slack).
3. The agent returns JSON ➜ the OpenAI node double-checks field names, connections, and type versions.
4. A tiny JS Code node slices the JSON out of the chat blob and saves it as template.json, ready for download.
5. You download, import, and… done.

✏️ Customising
- **Switch the LLM** – plug in Claude 3, Gemini 1.5, or a local model; just swap the **OpenRouter Chat Model** node.
- **Point the RAG at your own docs** – change the crawl URL or feed PDFs via the **Default Data Loader**.
- **Hard-code preferred nodes** – edit the "User node preferences" in the system message so the agent always chooses **Notion** for databases, etc.

🥡 Take-away notes

It's a prototype; feel free to experiment with it to improve its capabilities. Have fun building!
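The JSON-slicing step performed by the Code node can be approximated like this. The real node is JavaScript inside n8n; this Python sketch only shows the idea of pulling the first `{...}` object out of a chat reply that may contain prose and code fences:

```python
import json
import re

def extract_template_json(chat_blob: str) -> dict:
    """Slice the JSON object out of an LLM chat reply, as the Code node
    does before the result is saved as template.json.

    Grabs everything from the first '{' to the last '}', which tolerates
    surrounding prose and markdown fences in the reply.
    """
    match = re.search(r"\{.*\}", chat_blob, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in the reply")
    return json.loads(match.group(0))
```

A reply like `Here is your template: {"nodes": [], "connections": {}} Enjoy!` would yield the importable dict.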
by Alex Kim
Overview

The n8n Workflow Cloner is a powerful automation tool designed to copy, sync, and migrate workflows across different n8n instances or projects. Whether you're managing multiple environments (development, staging, production) or organizing workflows within a team, this workflow automates the transfer process, ensuring seamless deployment with minimal manual effort.

By automatically detecting and copying only the missing workflows, this tool helps maintain consistency, improve collaboration, and streamline workflow migration between projects or instances.

How to Use

1️⃣ Set Up API Credentials
- Configure API credentials for both source and destination n8n instances.
- Ensure the credentials have read and write access to manage workflows.

2️⃣ Select Source & Destination
- Update the "GET - Workflows" node to define the source instance.
- Set the "CREATE - Workflow" node to specify the destination instance.

3️⃣ Run the Workflow
- Click "Test Workflow" to start the transfer.
- The system will fetch all workflows from the source, compare them with the destination, and copy any missing workflows.

4️⃣ Change the Destination Project (Optional)
- By default, workflows are moved to the "KBB Workflows" project.
- Modify the "Filter" node to transfer workflows to a different project.

5️⃣ Monitor & Verify
- The Loop Over Items node ensures batch processing for multiple workflows.
- Log outputs provide details on transferred workflows and statuses.

Key Benefits
✅ Automate Workflow Transfers – No more manual exports/imports.
✅ Sync Workflows Across Environments – Keep workflows up to date in dev, staging, and production.
✅ Effortless Team Collaboration – Share workflows across projects seamlessly.
✅ Backup & Migration Ready – Easily move workflows between n8n instances.

Use Cases
🔹 CI/CD for Workflows – Deploy workflows between development and production environments.
🔹 Team Workflow Sharing – Share workflows across multiple n8n projects.
🔹 Workflow Backup Solution – Store copies of workflows in a dedicated backup project.

Tags
🚀 Workflow Migration 🚀 n8n Automation 🚀 Sync Workflows 🚀 Backup & Deployment
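The compare-and-copy step at the heart of the cloner boils down to a set difference over workflow identity. A sketch that matches workflows by name (the real workflow may well compare by ID instead; the dict shape is an assumption):

```python
def missing_workflows(source, destination):
    """Return the source workflows absent from the destination, so only
    those need to be created, mirroring the fetch → compare → copy flow."""
    existing = {wf["name"] for wf in destination}
    return [wf for wf in source if wf["name"] not in existing]
```

In the n8n version, the returned list is what the Loop Over Items node feeds, one by one, into the "CREATE - Workflow" node.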
by Cheney Zhang
Paul Graham Essay Search & Chat with Milvus Vector Database

How It Works
This workflow creates a RAG (Retrieval-Augmented Generation) system using the Milvus vector database to search Paul Graham essays:
- **Scrape & Load**: Fetches Paul Graham essays, extracts the text, and stores it as vector embeddings in Milvus.
- **Chat Interface**: Enables semantic search and AI-powered conversations about the essays.

Set Up Steps
1. Set up a Milvus server following the official installation guide, then create a collection.
2. Execute the workflow to scrape essays and load them into your Milvus collection.
3. Chat with the AI agent, which uses the Milvus tool to query and discuss essay content.
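What the Milvus tool does for the chat agent is nearest-neighbour search over embeddings. A dependency-free sketch of that ranking idea (real embeddings come from an embedding model and live in a Milvus collection, not in toy 2-D vectors):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, collection, k=2):
    """Rank stored (text, vector) pairs by similarity to the query embedding,
    which is the retrieval step Milvus performs for the agent."""
    ranked = sorted(collection, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]
```

The retrieved essay chunks are then passed to the LLM as context, which is what makes the chat answers grounded in the essays.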
by Nskha
Overview

The [n8n] YouTube Channel Advanced RSS Feeds Generator workflow facilitates the generation of various RSS feed formats for YouTube channels without requiring API access or administrative permissions. It utilizes third-party services to extract data, making it extremely user-friendly and accessible.

Key Use Cases and Benefits
- **Content Aggregation**: Easily gather and syndicate content from any public YouTube channel.
- **No API Key Required**: Avoid the complexities and limitations of Google's API.
- **Multiple Formats**: Supports ATOM, JSON, MRSS, Plaintext, Sfeed, and direct YouTube XML feeds.
- **Flexibility**: Input can be a YouTube channel or video URL, ID, or username.

Services/APIs Utilized
- **commentpicker.com**: For retrieving YouTube channel IDs.
- **rss-bridge.org**: To generate various RSS formats.

Configuration Instructions
1. Start the Workflow: Activate the workflow in your n8n instance.
2. Input Details: Enter the YouTube channel or video URL, ID, or username via the provided form trigger.
3. Run the Workflow: Execute the workflow to receive links to 13 different RSS feeds, including community and video content feeds.

Additional Notes
- **Customization**: You can modify the RSS feed formats or integrate additional services as needed.

Support and Contributions
For support, questions, or contributions, please visit the n8n community forum or the GitHub repository. We welcome contributions from the community!
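One of the generated links, the direct YouTube XML feed, follows a well-known public URL shape that needs only the channel ID (which is why no API key is required):

```python
from urllib.parse import urlencode

def youtube_xml_feed(channel_id: str) -> str:
    """Build the direct YouTube XML (Atom) feed URL for a channel ID,
    one of the feed links this workflow returns."""
    return "https://www.youtube.com/feeds/videos.xml?" + urlencode({"channel_id": channel_id})
```

The other 12 feed formats are produced by handing the same channel ID to rss-bridge.org.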
by Omar Akoudad
This n8n workflow helps eCommerce businesses (especially in the Cash on Delivery space) send real-time order events to the Meta (Facebook) Conversions API, ensuring accurate event tracking and better ad attribution.

Features
- **Webhook Listener**: Accepts incoming order data (name, phone, IP, user-agent, etc.) via HTTP POST/GET.
- **Data Normalization**: Cleans and formats first_name, last_name, phone, and event_time according to Facebook's strict specs.
- **Data Hashing**: Securely hashes sensitive user data (SHA256), as required by Meta.
- **Full Custom Data Support**: Pass order value, currency, and more.

Ideal For:
- Shopify, WooCommerce, and custom stores (Laravel, Node, etc.)
- Businesses using Meta Ads and needing high-quality server-side tracking
- Teams without access to full dev resources, but using n8n for automation

How It Works:
1. Receive Order from your store via Webhook or API.
2. Format & Normalize fields to match Facebook's expected structure.
3. Hash Sensitive Fields using SHA256 (name, phone, email).
4. Send to Facebook via the Conversions API endpoint.

Requirements:
- A Meta Business Manager account with Conversions API access
- Your Access Token and Pixel ID set up in n8n credentials
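Meta requires user-data fields (name, email, phone, etc.) to be normalized and then SHA-256 hashed before they are sent to the Conversions API. A minimal sketch of that normalize-then-hash step:

```python
import hashlib

def normalize_and_hash(value: str) -> str:
    """Trim and lowercase a user-data field, then SHA-256 it, matching the
    normalization-before-hashing rule Meta applies to customer information."""
    return hashlib.sha256(value.strip().lower().encode("utf-8")).hexdigest()
```

Because both sides hash the same normalized value, `" John@Example.COM "` and `"john@example.com"` produce the identical hash, which is what makes event matching on Meta's side work.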
by CustomJS
This n8n template shows how to extract selected pages from a PDF with the PDF Toolkit by www.customjs.space (@custom-js/n8n-nodes-pdf-toolkit).

Notice
Community nodes can only be installed on self-hosted instances of n8n.

What this workflow does
- **Downloads** each PDF using an HTTP Request.
- **Extracts** pages from the PDF file as needed.

Requirements
- **Self-hosted** n8n instance
- **CustomJS API key** for extracting PDF pages
- **PDF files** from which to extract pages

Workflow Steps:
1. Manual Trigger: Runs with user interaction.
2. Download PDF File: Pass URLs for the PDF files.
3. Extract Pages from PDF: Extracts the selected pages from the downloaded PDF.

Usage
Get an API key from CustomJS:
1. Sign up to the CustomJS platform.
2. Navigate to your profile page.
3. Press the "Show" button to get your API key.

Set credentials for the CustomJS API in n8n: copy and paste the API key generated from CustomJS.

Design workflow
- A Manual Trigger for starting the workflow.
- HTTP Request nodes for downloading PDF files.
- Extract Pages from PDF.

You can replace the logic for triggering and returning results. For example, you can trigger this workflow by calling a webhook and get the result as a response from the webhook. Simply replace the Manual Trigger and Write to Disk nodes.

Perfect for
- Taking note of specific pages from PDF files.
- Splitting a PDF file into multiple parts.
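If you pass the pages to extract as a string, turning it into page indexes is a small parsing job. A sketch where the "1-3,5" selection format is purely an assumption for illustration, not the CustomJS node's own syntax:

```python
def parse_page_selection(selection: str):
    """Turn a human-friendly page selection like "1-3,5" into zero-based
    page indexes, the form most PDF libraries expect."""
    pages = []
    for part in selection.split(","):
        if "-" in part:
            start, end = part.split("-")
            pages.extend(range(int(start) - 1, int(end)))  # inclusive range
        else:
            pages.append(int(part) - 1)
    return pages
```

The resulting index list is what a page-extraction step would iterate over when copying pages into the output PDF.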
by Harshil Agrawal
This workflow allows you to compress binary files to zip format.

HTTP Request node: The workflow uses the HTTP Request node to fetch files from the internet. If you want to fetch files from your local machine, replace it with the Read Binary File or Read Binary Files node.

Compression node: The Compression node compresses the files into a zip. If you want to compress the files to gzip, select the gzip format instead.

Based on your use case, you may want to write the files to your disk or upload them to Google Drive or Box. If you want to write the compressed file to your disk, replace the Dropbox node with the Write Binary File node; if you want to upload the file to a different service, use the respective node.
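What the Compression node produces for the zip format can be reproduced with Python's standard library. A sketch that zips in-memory binary data, the same shape of input the node receives from the HTTP Request node:

```python
import io
import zipfile

def compress_to_zip(files: dict) -> bytes:
    """Zip a mapping of {filename: bytes} entirely in memory, analogous to
    the Compression node's zip output on binary items."""
    buffer = io.BytesIO()
    with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, data in files.items():
            zf.writestr(name, data)
    return buffer.getvalue()
```

The returned bytes are what would then flow to Dropbox, Write Binary File, or any other destination node.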