by Luis Acosta
📰 Reddit to Newsletter (Automated Curation with OpenAI GPT-4o Mini)

Turn the best posts from a subreddit into a ready-to-send HTML newsletter — no copy-pasting, no wasted time. This workflow fetches new posts, filters by topic of interest, analyzes comments, summarizes insights, and composes a clean HTML email delivered straight to your inbox with Gmail.

💡 What this workflow does

✅ Fetches posts from your chosen subreddit (default: r/microsaas, sorted by “new”)
🏆 Selects the Top 10 by upvotes, comments, and recency
🧭 Defines a topic of interest and runs a lightweight AI filter (true/false) without altering the original JSON
💬 Pulls and flattens comments into a clean, structured list
🧠 Summarizes each post + comments into main_post_summary, comment_insights, and key_learnings
✍️ Generates a newsletter in HTML (not Markdown) with headline, outline, sections per post, quotes, and “by the numbers”
📤 Sends the HTML email via Gmail with subject “Reddit Digest” (editable)

🛠 What you’ll need

🔑 Reddit OAuth2 connected in n8n
🔑 OpenAI API key (e.g., gpt-4o-mini) for filtering and summarization
🔑 Gmail OAuth2 to deliver the newsletter
🧵 A target subreddit and a clearly defined topic of interest

🧩 How it works (high-level)

Manual Trigger → Get many posts (from subreddit)
Select Top 10 (Code node, ranking by ups + comments + date)
Set topic of interest → AI filter → String to JSON → If topic of interest
Loop Over Items for each valid post
Fetch post comments → Clean comments (Code) → Merge comments → Merge with post
Summarize post + comments (AI) → Merge summaries → Create newsletter HTML
Send Gmail message with the generated HTML

⚙️ Key fields to adjust

**Subreddit name** and “new” filter in **Get many posts**
**Ranking logic** inside the **Top 10** Code node
Text inside **Set topic of interest**
**Prompts** for **AI filter**, **Summarize**, and **Create newsletter** (tone & structure)
**Recipient & subject line** in **Send Gmail message**

✨ Use cases

**Weekly digest** of your niche community
**Podcast or newsletter prep** with community insights
**Monitoring specific themes** (e.g., “how to get first customers”) and delivering insights to a team or client

🧠 Tips & gotchas

⏱️ Reddit API limits: tune batch size and rate if the subreddit is very active
🧹 Robust JSON parsing: the String to JSON node handles clean, fenced, or escaped JSON; failures return error + raw for debugging
📨 Email client quirks: test long newsletters; some clients clip lengthy HTML
💸 AI cost: the two-step pipeline (summarization + HTML generation) improves quality but can be merged to reduce cost

🧭 Quick customization

Change microsaas to your target subreddit
Rewrite the topic of interest (e.g., “growth strategies”, “fundraising”)
Adapt the newsletter outline prompt for a different tone/format
Schedule with a Cron node for daily or weekly digests

📬 Contact & Feedback

Need help tailoring this workflow to your stack?
📩 Luis.acosta@news2podcast.com
🐦 @guanchehacker

If you’re building something more advanced with curation + AI (like turning the digest into a podcast or video), let’s connect — I may have the missing piece you need.
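The “robust JSON parsing” behavior described above — accepting clean JSON, fenced ```json blocks, or doubly-escaped JSON strings, and returning the error plus the raw text on failure — can be sketched as a small function. This is an illustration of the pattern, not the template’s exact node code; the function name and return shape are assumptions.

```javascript
// Sketch of a "String to JSON" step: handles clean, fenced, or escaped JSON;
// failures return { ok: false, error, raw } for debugging, as the template describes.
function stringToJson(raw) {
  let text = String(raw).trim();
  // Strip a Markdown code fence such as ```json ... ```
  const fenced = text.match(/^`{3}(?:json)?\s*([\s\S]*?)\s*`{3}$/);
  if (fenced) text = fenced[1];
  try {
    let parsed = JSON.parse(text);
    // Handle doubly-encoded JSON: parsing a JSON string literal yields a string
    if (typeof parsed === 'string') parsed = JSON.parse(parsed);
    return { ok: true, data: parsed };
  } catch (err) {
    return { ok: false, error: err.message, raw };
  }
}
```

In the workflow, the `ok: false` branch is what lets the If node route bad AI output to a debugging path instead of silently dropping the post.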
by Adnan Azhar
Template Overview

This n8n workflow provides an intelligent, timezone-aware AI voice calling system for e-commerce businesses to automatically confirm customer orders via phone calls. The system uses VAPI (Voice AI Platform) to make natural, conversational calls while respecting customer time zones and business hours.

🎯 Use Case

Perfect for e-commerce businesses that want to:
Automatically confirm high-value or important orders via phone
Reduce order cancellations and disputes
Provide personalized customer service at scale
Maintain human-like interactions while automating the process
Respect customer time zones and calling hours

✨ Key Features

Timezone Intelligence
Automatically detects customer timezone from shipping address or phone number
Only calls during appropriate business hours (10 AM – 3 PM local time, weekdays)
Schedules calls for appropriate times when outside calling hours
Uses timezone-aware greetings (Good morning/afternoon/evening)

AI-Powered Conversations
Natural, context-aware conversations using VAPI
Personalized greetings with customer names and local time awareness
Intelligent confirmation detection from call transcripts
Handles customer concerns and change requests gracefully

Smart Call Management
Automatic retry logic with attempt tracking
Call quality assessment and cost tracking
Detailed transcript analysis and sentiment detection
Follow-up alerts for calls requiring human intervention

Comprehensive Tracking
Complete call history and analytics in Airtable
Real-time status updates throughout the process
Detailed reporting on confirmation rates and call quality
Cost tracking and ROI analysis

🏗️ Workflow Architecture

Main Flow (Order Confirmation)
Order Webhook – Receives order data from e-commerce platform
Data Validation – Validates required fields (phone, status)
Timezone Detection – Determines customer timezone and calling eligibility
Call Routing – Either initiates immediate call or schedules for later
VAPI Integration – Makes the actual AI voice call
Status Tracking – Updates database with call results

Scheduled Flow (Retry System)
Runs every 15 minutes to check for scheduled calls
Respects retry limits and calling hours
Automatically processes queued confirmations

Webhook Handler (Results Processing)
Receives VAPI call completion webhooks
Analyzes call transcripts for confirmation status
Sends follow-up alerts or confirmation emails
Updates final order status

🔧 Prerequisites & Setup

Required Services
VAPI Account – For AI voice calling functionality
Airtable Base – For order tracking and analytics
SMTP Server – For email notifications
n8n Instance – Self-hosted or cloud
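The calling-window rule above (10 AM – 3 PM local time, weekdays only) can be sketched with the standard `Intl` API, which resolves a customer’s local hour and weekday from an IANA timezone name. This is an illustrative sketch, not the template’s actual node code; the function name is an assumption.

```javascript
// Only allow dialing between 10:00 and 14:59 in the customer's local
// timezone, Monday–Friday, mirroring the template's calling-hours rule.
function canCallNow(timeZone, now = new Date()) {
  const parts = new Intl.DateTimeFormat('en-US', {
    timeZone,
    hour: 'numeric',
    hour12: false,
    weekday: 'short',
  }).formatToParts(now);
  const hour = Number(parts.find((p) => p.type === 'hour').value);
  const weekday = parts.find((p) => p.type === 'weekday').value;
  const isWeekday = !['Sat', 'Sun'].includes(weekday);
  return isWeekday && hour >= 10 && hour < 15;
}
```

When this check fails, the workflow’s scheduled retry flow (which runs every 15 minutes) would pick the order up again once the window opens.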
by Rahul Joshi
📊 Description

Simplify your social media publishing process by automating post scheduling from Google Sheets directly to Meta (Facebook Pages). 📅💬 This workflow detects pending posts, uploads images with captions to your Facebook Page, updates the sheet status, and sends real-time notifications via Slack and email — keeping your marketing team always in sync.

🚀 What This Template Does

1️⃣ Trigger – Monitors a Google Sheet for new or pending posts every minute. ⏰
2️⃣ Filter – Identifies the latest “pending” entry for publishing. 🔍
3️⃣ Extract – Captures post details like caption, image URL, and ID. 🧾
4️⃣ Publish – Uploads the post to your Meta (Facebook) Page using the Graph API. 📤
5️⃣ Validate – Confirms success or failure of the post operation. ✅
6️⃣ Notify – Sends instant Slack and email updates on publishing status. 💌
7️⃣ Update – Marks the published post as “Completed” in Google Sheets. 📊

Key Benefits

✅ Hands-free publishing from Google Sheets to Meta
✅ Instant Slack and email alerts for post outcomes
✅ Prevents duplicate or failed post uploads
✅ Centralized content tracking and status updates
✅ Improves consistency and speed in social media operations

Features

Google Sheets trigger for post scheduling
Facebook Graph API integration for auto-posting
Slack and Outlook notifications for success/error alerts
Automatic sheet updates post-publication
Error handling and reporting for failed posts

Requirements

Google Sheets OAuth2 credentials
Facebook Page Access Token via Graph API
Slack Bot token for notifications
Outlook or SMTP credentials for email updates

Target Audience

Marketing teams managing Facebook content calendars 📆
Social media managers seeking automated posting 📣
Agencies coordinating client content delivery 📋
Teams tracking campaign publishing performance 📊
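The publish step above posts an image with a caption through the Graph API. A minimal sketch of building that request is shown below; the `/photos` edge and its `url`/`caption` parameters follow the Graph API’s documented shape, but the API version string and helper name are assumptions — verify both against your app’s configured Graph API version.

```javascript
// Build the Graph API request that posts a sheet row's image + caption
// to a Facebook Page. Illustrative only; check the edge and version
// against your Meta app settings before relying on it.
function buildPhotoPostRequest(pageId, pageAccessToken, post) {
  const params = new URLSearchParams({
    url: post.imageUrl,        // publicly reachable image URL from the sheet
    caption: post.caption,     // caption text from the sheet
    access_token: pageAccessToken,
  });
  return {
    method: 'POST',
    url: `https://graph.facebook.com/v19.0/${pageId}/photos`,
    body: params.toString(),
  };
}
```

In n8n this maps onto an HTTP Request node; the validate step then checks the response for the returned post ID before marking the row “Completed”.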
by Sridevi Edupuganti
Overview

This workflow automates weather forecast delivery by collecting city names, fetching 5-day forecasts from OpenWeatherMap, and generating professionally formatted HTML emails using GPT-4. The AI creates condition-based color-coded reports with safety precautions and sends them via Gmail.

How It Works

A form trigger collects up to three city names, which are geocoded via the OpenWeatherMap API to retrieve coordinates and 5-day forecasts. JavaScript nodes process the raw weather data into daily summaries, calculating temperature ranges, precipitation levels, wind speeds, and dominant weather conditions. GPT-4 then generates professionally formatted HTML emails with condition-based color coding. The AI intelligently adds contextual safety warnings for heavy rain, extreme heat, high winds, and thunderstorms. A validation node ensures proper JSON formatting before Gmail sends the final briefing.

Use Cases

• Field ops & construction crew briefings
• Travel planning and itinerary preparation
• Outdoor event planning & coordination
• Logistics and transportation route planning
• Real estate property viewing scheduling
• Sports and recreational activity planning

Setup Requirements

1) OpenWeatherMap API credentials
2) OpenAI API key
3) Gmail OAuth2 authentication

Need Help?

Join the Discord or ask in the Forum! README file available at https://tinyurl.com/MulticityWeatherForecast
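The “JavaScript nodes process the raw weather data into daily summaries” step can be sketched as follows. The field names (`dt_txt`, `main.temp`, `wind.speed`, `weather[0].main`, `rain['3h']`) follow the shape of OpenWeatherMap’s 5-day/3-hour forecast response, but treat them as assumptions and verify against the API version you call.

```javascript
// Collapse 3-hourly forecast entries into one summary per day:
// temperature range, accumulated rain, max wind, dominant condition.
function summarizeByDay(entries) {
  const days = {};
  for (const e of entries) {
    const day = e.dt_txt.slice(0, 10); // "YYYY-MM-DD HH:MM:SS" → date part
    const d = (days[day] ??= {
      day, tempMin: Infinity, tempMax: -Infinity,
      rainMm: 0, maxWind: 0, conditions: {},
    });
    d.tempMin = Math.min(d.tempMin, e.main.temp);
    d.tempMax = Math.max(d.tempMax, e.main.temp);
    d.rainMm += e.rain?.['3h'] ?? 0;
    d.maxWind = Math.max(d.maxWind, e.wind.speed);
    d.conditions[e.weather[0].main] = (d.conditions[e.weather[0].main] ?? 0) + 1;
  }
  // Pick the most frequent condition as the day's dominant one
  return Object.values(days).map((d) => ({
    ...d,
    dominant: Object.entries(d.conditions).sort((a, b) => b[1] - a[1])[0][0],
  }));
}
```

These per-day summaries are what GPT-4 receives, which keeps the prompt small compared to passing the raw 3-hourly forecast.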
by Daniel
AI Email Support Agent with RAG & Cohere Reranking

Transform your inbox into an intelligent support system: automatically detect new emails, retrieve relevant knowledge from Pinecone, rerank with Cohere for precision, generate contextual replies using Gemini AI, and respond — all while maintaining conversation history.

What It Does

This workflow triggers on incoming Gmail messages, leverages a LangChain agent with PostgreSQL memory for context, queries a Pinecone vector store (RAG) enhanced by Cohere reranking and OpenAI embeddings, crafts personalized responses via Gemini 2.5, and auto-replies to keep support flowing.

Key Features

**Gmail Integration** – Real-time polling for new emails every minute
**RAG with Pinecone** – Retrieves the top 10 relevant docs from the "agency-info" index as an agent tool
**Cohere Reranking** – Boosts retrieval accuracy by reordering results semantically
**Persistent Memory** – Postgres chat history keyed by email ID for ongoing threads
**Gemini-Powered Agent** – Handles queries with a custom system prompt for agency support
**Seamless Auto-Reply** – Sends formatted text responses directly in Gmail

Perfect For

**Agencies**: Automate client FAQs on services, pricing, and ownership
**Support Teams**: Scale responses without losing conversation context
**Small Businesses**: Handle inquiries 24/7 with AI-driven accuracy
**Developers**: Prototype RAG agents with vector stores and rerankers
**Marketers**: Personalize outreach replies based on a knowledge base
**Consultants**: Quick, informed answers from internal docs

Technical Highlights

Built on n8n's LangChain ecosystem, this workflow highlights:
Trigger-to-response pipeline with polling and webhooks
Hybrid retrieval: embeddings + vector search + semantic reranking
Stateful agents with database-backed memory for multi-turn chats
Multi-provider setup: OpenAI (embeddings), Cohere (rerank), Google (LLM)
Scalable for production with configurable topK and session keys

Setup Instructions

Prerequisites
n8n instance with LangChain nodes enabled
Accounts for: Gmail (OAuth2), OpenAI (API key), Cohere (API key), Google Gemini (API key), Pinecone (API key and index), Postgres (database connection, e.g., Neon or Supabase)

Required Credentials

Gmail OAuth2
Enable the Gmail API in Google Cloud Console
Create an OAuth2 credential in n8n with scopes: https://www.googleapis.com/auth/gmail.readonly, https://www.googleapis.com/auth/gmail.send

OpenAI API
Get an API key from platform.openai.com
Add it as an OpenAI credential in n8n

Cohere API
Sign up at cohere.com
Copy the API key to an n8n Cohere credential

Google Gemini API
Generate a key at https://aistudio.google.com/
Add it as a Google PaLM credential in n8n (compatible with Gemini)

Pinecone API
Create an index "agency-info" with dimension 1024
Add the API key to an n8n Pinecone credential

Postgres
Set up a database (e.g., Neon/Supabase) with a table for chat history
Add connection details (host, database, user, password) to an n8n Postgres credential

Configuration Steps

Import the workflow JSON into your n8n instance
Assign all required credentials to the respective nodes
Populate the Pinecone "agency-info" index with your knowledge base documents (use a separate upsert workflow or the Pinecone dashboard)
Customize the tableName in the Postgres Memory node if needed (default: "email_support_agent_")
Adjust the agent's system prompt or topK retrieval if required for your use case
Activate the workflow and test by sending a sample email to trigger it

Troubleshooting

**No trigger firing**: Verify Gmail scopes and polling interval
**Empty retrieval**: Check Pinecone index population, dimensions (must be 1024), and document embeddings
**Rerank errors**: Ensure the Cohere API key is valid and has sufficient quota
**Memory issues**: Confirm the Postgres connection and that sessionKey uses the email ID

Perfect for deploying hands-off email automation. Import, connect credentials, and activate!
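The hybrid retrieval described above — coarse vector search followed by semantic reranking — follows a simple shape. The sketch below shows the control flow with the two providers injected as plain functions; in the real workflow they are asynchronous Pinecone and Cohere API calls, and all names here are illustrative.

```javascript
// Retrieve-then-rerank pattern: pull topK candidates from a vector store,
// rescore them with a reranker, and keep the best topN for the agent.
function retrieveAndRerank(query, vectorSearch, rerank, { topK = 10, topN = 3 } = {}) {
  const candidates = vectorSearch(query, topK);   // coarse, embedding-based recall
  const scored = rerank(query, candidates);       // fine, semantic rescoring
  return scored
    .slice()
    .sort((a, b) => b.relevance - a.relevance)
    .slice(0, topN)
    .map((s) => s.document);
}
```

The point of the two stages is cost: the embedding index narrows millions of chunks to ten cheaply, and the (more expensive, more accurate) reranker only ever sees those ten.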
by Khairul Muhtadin
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Who is this for?

Automation enthusiasts, content creators, or social media managers who post article-based threads to Bluesky and want to automate the process end-to-end.

What problem is this solving?

Manual content repackaging and posting can be repetitive and time-consuming. This workflow automates the process from capturing article URLs (via Telegram or RSS) to scraping content, transforming it into a styled thread, and posting it on the Bluesky platform.

What this workflow does

Listens on Telegram or fetches from RSS feeds (AI Trends, Machine Learning Mastery, Technology Review).
Extracts content from URLs using JinaAI.
Converts the article into a neat, scroll-stopping thread via LangChain + Gemini / OpenAI ChatGPT.
Splits the thread into multiple posts. The first post is published with “Create a Post”, while subsequent posts are replies.
Adds short delays between posts to avoid rate limits.

Setup

Add credentials for Telegram Bot API, JinaAI, Google Gemini, and Bluesky App Password.
Add or customize RSS feeds if needed.
Test with a sample URL to validate the posting sequence.

How to customize

Swap out RSS feeds or trigger sources.
Modify prompt templates or thread formatting rules in the LangChain/Gemini node.
Adjust wait times or content parsing logic.
Replace Bluesky with another posting target if desired.

Made by: Khaisa Studio
Need custom workflows? Contact Me!
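The thread-splitting step above can be sketched as a word-boundary splitter. Bluesky posts are capped at 300 characters at the time of writing — treat that limit, and the `(i/n)` numbering convention, as assumptions of this sketch rather than the template’s exact rules.

```javascript
// Split AI-generated thread text into posts that fit a per-post length
// limit, breaking on word boundaries and appending a part counter.
function splitIntoThread(text, limit = 300) {
  const words = text.split(/\s+/);
  const posts = [];
  let current = '';
  for (const word of words) {
    const candidate = current ? `${current} ${word}` : word;
    // Reserve a few characters for the " (i/n)" counter added below
    if (candidate.length > limit - 8 && current) {
      posts.push(current);
      current = word;
    } else {
      current = candidate;
    }
  }
  if (current) posts.push(current);
  return posts.map((p, i) => `${p} (${i + 1}/${posts.length})`);
}
```

In the workflow, element 0 of this array feeds “Create a Post” and the rest are posted as replies, with a short wait between each to stay under rate limits.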
by Club de Inteligencia Artificial Politécnico CIAP
🤖 Interactive Academic Chatbot (Telegram + MongoDB)

Overview 📋

This project is a template for building a complete academic virtual assistant using n8n. It connects to Telegram, answers frequently asked questions by querying MongoDB, keeps the community informed about key dates (via web scraping), and collects user feedback for continuous improvement.

How It Works

Architecture and Workflow ⚙️
n8n: Orchestration of 3 workflows (chatbot, scraping worker, announcer).
Telegram: Frontend for user interaction and sending announcements.
MongoDB: Centralized database for FAQs, academic calendar, and feedback logs.
Web Scraping: HTTP Request and HTML Extract nodes to read the university's web calendar.
Cron: For automatic periodic executions (daily and weekly).

Core Processes 🧠
Real-time reception of user queries via Telegram.
Querying MongoDB collections for FAQ answers and calendar dates.
Daily scraping of the university website to keep the calendar updated.
Instant logging of user feedback (👍/👎) in MongoDB.
Proactive sending of weekly announcements to the Telegram channel.

Key Benefits ✅
Complete automation of student communication 24/7.
An always-accurate academic calendar database without manual intervention.
A built-in continuous improvement system through user feedback.
Proactive communication of important events to the entire community.

Use Cases 💼
Automation of student support in universities, colleges, and institutions.
A virtual assistant for any organization needing to manage FAQs and a dynamic calendar.
An automated announcements channel to keep a community informed.

Requirements 👨‍💻
n8n instance (self-hosted or cloud).
Credentials for a Telegram Bot (obtained from @BotFather).
Credentials for a MongoDB database (Connection URI).
URL of the academic calendar to be scraped.

Authors 👥
Doménica Amores
Nicole Guevara
Adrián Villamar
Mentor: Jaren Pazmiño
Applicants to the CIAP Polytechnic Artificial Intelligence Club
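The FAQ-answering process above matches a user’s Telegram question against documents stored in MongoDB. A minimal keyword-overlap matcher sketches the idea; the document shape (`{ question, answer, keywords }`) and function name are assumptions for illustration — in the workflow, a MongoDB find node does the actual lookup.

```javascript
// Match a user's question against FAQ documents by keyword overlap;
// return the best-matching FAQ, or null so the bot can send a fallback.
function matchFaq(userQuestion, faqs) {
  const tokens = userQuestion.toLowerCase().match(/\p{L}+/gu) ?? [];
  let best = null;
  let bestScore = 0;
  for (const faq of faqs) {
    const score = faq.keywords.filter((k) => tokens.includes(k.toLowerCase())).length;
    if (score > bestScore) {
      bestScore = score;
      best = faq;
    }
  }
  return best;
}
```

The `null` branch is where the feedback loop matters: unanswered questions logged alongside the 👍/👎 signals tell you which FAQs to add next.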
by Dr. Firas
💥 Automate Scrape Google Maps Business Leads (Email, Phone, Website) using Apify

🧠 AI-Powered Business Prospecting Workflow (Google Maps + Email Enrichment)

Who is this for?

This workflow is designed for entrepreneurs, sales teams, marketers, and agencies who want to automate lead discovery and build qualified business contact lists — without manual searching or copying data. It’s perfect for anyone seeking an AI-driven prospecting assistant that saves time, centralizes business data, and stays fully compliant with GDPR.

What problem is this workflow solving?

Manually searching for potential clients, copying their details, and qualifying them takes hours — and often leads to messy spreadsheets. This workflow automates the process by:
Gathering publicly available business information from Google Maps
Enriching that data with AI-powered summaries and contact insights
Compiling it into a clean, ready-to-use Google Sheet database
This means you can focus on closing deals, not collecting data.

What this workflow does

This automation identifies, analyzes, and organizes business opportunities in just a few steps:
Telegram Trigger → Send a message specifying your business type, number of leads, and Google Maps URL.
Apify Integration → Fetches business information from Google Maps (public data).
Duplicate Removal → Ensures clean, non-redundant results.
AI Summarization (GPT-4) → Generates concise business summaries for better understanding.
Email Extraction (GPT-4) → Finds and extracts professional contact emails from company websites.
Google Sheets Integration → Automatically stores results (name, category, location, phone, email, etc.) in a structured sheet.
Telegram Notification → Confirms when all businesses are processed.
All data is handled ethically and transparently — only from public sources and without any unsolicited contact.

Setup

Telegram Setup
Create a Telegram bot via BotFather.
Copy the API token and paste it into the Telegram Trigger node credentials.

Apify Setup
Create an account on Apify.
Get your API token and connect it to the “Run Google Maps Scraper” node.

Google Sheets Setup
Connect your Google account under the “Google Maps Database” node.
Specify the target spreadsheet and worksheet name.

OpenAI Setup
Add your OpenAI API key to the AI nodes (“Company Summary Info” and “Extract Business Email”).

Test
Send a Telegram message like: restaurants, 5, https://www.google.com/maps/search/restaurants+in+Paris

How to customize this workflow to your needs

**Change search region or business type** by modifying the Telegram input message format.
**Adjust the number of leads** via the maxCrawledPlacesPerSearch parameter in Apify.
**Add filters or enrichments** (e.g., websites with social links, review counts, or opening hours).
**Customize AI summaries** by tweaking the prompt inside the “Company Summary Info” node.
**Integrate CRM tools** like HubSpot or Pipedrive by adding a connector after the Google Sheets node.

⚙️ Expected Outcome

✅ A clean, enriched, and ready-to-use Google Sheet of businesses with:
Name, category, address, and city
Phone number and website
AI-generated business summary
Extracted professional email (if available)
✅ Telegram confirmation once all businesses are processed
✅ A fully automated, scalable, and GDPR-compliant prospecting workflow

💡 This workflow provides a transparent, ethical way to streamline your B2B lead research while staying compliant with privacy and anti-spam regulations.

🎥 Watch This Tutorial

👋 Need help or want to customize this?
📩 Contact: LinkedIn
📺 YouTube: @DRFIRASS
🚀 Workshops: Mes Ateliers n8n
📄 Documentation: Notion Guide

Need help customizing? Contact me for consulting and support: LinkedIn / YouTube / 🚀 Mes Ateliers n8n
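The Telegram trigger expects a message in the form “business type, number of leads, Google Maps URL” (e.g., `restaurants, 5, https://www.google.com/maps/search/restaurants+in+Paris`). A parsing sketch of that convention is shown below; the return shape and error messages are assumptions for illustration.

```javascript
// Parse "type, count, maps URL" from the incoming Telegram message,
// validating the count and rejecting non-Google-Maps URLs.
function parseLeadRequest(message) {
  const parts = message.split(',').map((p) => p.trim());
  if (parts.length < 3) return { error: 'Expected: type, count, maps URL' };
  const [businessType, countRaw, ...urlParts] = parts;
  const count = Number.parseInt(countRaw, 10);
  const url = urlParts.join(',').trim(); // URLs can themselves contain commas
  if (!Number.isInteger(count) || count < 1) return { error: 'Lead count must be a positive integer' };
  if (!url.startsWith('https://www.google.com/maps')) return { error: 'Not a Google Maps URL' };
  return { businessType, count, url };
}
```

Returning a structured error lets the workflow reply on Telegram with a usage hint instead of launching the Apify scraper on bad input.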
by Daniel Shashko
How it Works

This workflow automatically monitors competitor affiliate programs twice daily, using Bright Data's web scraping API to extract commission rates, cookie durations, average order values, and payout terms from competitor websites. The AI analysis engine scores each competitor (0–100 points) by comparing their commission rates, cookie windows, earnings per click (EPC), and affiliate-friendliness against your program, then categorizes them as Critical (70+), High (45–69), Medium (25–44), or Low (0–24) threat levels.

Critical and high-threat competitors trigger immediate Slack alerts with detailed head-to-head comparisons and strategic recommendations, while lower threats route to monitoring channels. All competitors are logged to Google Sheets for tracking and historical analysis. The system generates personalized email reports — urgent action plans with 24–48 hour deadlines for critical threats, or standard intelligence updates for routine monitoring.

The entire process takes minutes from scraping to strategic alert, eliminating manual competitive research and ensuring you never lose affiliates to better-positioned competitor programs.

Who is this for?

Affiliate program managers monitoring competitor programs who need automated intelligence
E-commerce brands in competitive verticals who can't afford to lose top affiliates
Affiliate networks managing multiple merchants needing competitive benchmarking
Performance marketing teams responding to commission rate wars in their industry

Setup Steps

Setup time: approx. 20–30 minutes (Bright Data setup, API configuration, spreadsheet creation)

Requirements:
Bright Data account with web scraping API access
Google account with a competitor tracking spreadsheet
Slack workspace
SMTP email provider (Gmail, SendGrid, etc.)

Sign up for Bright Data and get your API credentials and dataset ID.
Create a Google Sheet with two tabs: "Competitor Analysis" and "Historical Log", with appropriate column headers.
Set up these nodes:
Schedule Competitor Check: Pre-configured for twice daily (adjust timing if needed).
Scrape Competitor Sites: Add Bright Data credentials, dataset ID, and competitor URLs.
AI Offer Analysis: Review scoring thresholds (commission, cookies, AOV, EPC).
Route by Threat Level: Automatically splits by the 70-point critical and 45-point high thresholds.
Google Sheets Nodes: Connect the spreadsheet and map data fields.
Slack Alerts: Configure channels for critical alerts and routine monitoring.
Email Reports: Set up SMTP and recipient addresses.
Credentials must be entered into their respective nodes for successful execution.

Customization Guidance

**Scoring Weights:** Adjust point values for commission (35), cookies (25), cost efficiency (25), and volume (15) based on your priorities.
**Threat Thresholds:** Modify the 70-point critical and 45-point high thresholds for your risk tolerance.
**Benchmark Values:** Update commission gap thresholds (5%+ = critical, 2%+ = warning) and cookie duration benchmarks (30+ days = critical).
**Competitor URLs:** Add or remove competitor websites to monitor in the HTTP Request node.
**Alert Routing:** Create tier-based channels or route to Microsoft Teams, Discord, or SMS via Twilio.
**Scraping Frequency:** Change from twice daily to hourly for competitive markets, or weekly for stable industries.
**Additional Networks:** Duplicate the workflow for different affiliate networks (CJ, ShareASale, Impact, Rakuten).

Once configured, this workflow will continuously monitor competitive threats and alert you before top affiliates switch to better-paying programs, protecting your affiliate revenue from competitive pressure.

Built by Daniel Shashko
Connect on LinkedIn
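The 0–100 scoring and threat tiers described above can be sketched as follows. The weights (commission 35, cookies 25, cost efficiency 25, volume 15) and tier thresholds (70/45/25) come from the template; the per-dimension normalizations relative to your own program are assumptions of this sketch and are exactly the kind of thing the customization section suggests tuning.

```javascript
// Score a competitor 0-100 against our own program, then map the score
// onto the template's threat tiers: Critical 70+, High 45-69, Medium 25-44, Low 0-24.
function scoreCompetitor(c, ours) {
  const clamp = (x) => Math.max(0, Math.min(1, x));
  // Each dimension contributes 0..1, then is weighted per the template
  const commission = clamp((c.commissionRate - ours.commissionRate) / ours.commissionRate + 0.5);
  const cookies = clamp(c.cookieDays / 60);               // 60+ day window maxes out
  const efficiency = clamp(c.epc / (ours.epc * 2));       // 2x our EPC maxes out
  const volume = clamp(c.avgOrderValue / (ours.avgOrderValue * 2));
  const score = Math.round(commission * 35 + cookies * 25 + efficiency * 25 + volume * 15);
  const threat =
    score >= 70 ? 'Critical' : score >= 45 ? 'High' : score >= 25 ? 'Medium' : 'Low';
  return { score, threat };
}
```

The `threat` string is what the Route by Threat Level node would switch on to pick Slack channels and email templates.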
by PollupAI
Who it’s for

Built for Customer Success and Account Management teams focused on proactive retention. This workflow helps you automatically identify at-risk customers – before they churn – by combining CRM, usage, and sentiment data into one actionable alert.

What it does

This end-to-end workflow continuously monitors customer health by consolidating data from HubSpot and Google Sheets. Here’s how it works:
Fetch deals from HubSpot.
Collect context — linked support tickets and feature usage from a Google Sheet.
Run sentiment analysis on the tickets to generate a customer health score.
Evaluate risk — an AI agent reviews deal age, sentiment score, and usage trends against predefined thresholds.
Send alerts — if churn risk is detected, it automatically sends a clear, data-driven email to the responsible team member with next-step recommendations.

How to set it up

To get started, configure your credentials and parameters in the following nodes:

Credentials:
HubSpot: Connect your account (HubSpot: Get All Deals).
LLM Model: Add credentials for your preferred provider (Config: Set LLM for Agent & Chains).
Google Sheets: Connect your account (Tool: Get Feature Usage from Sheets).
Email: Set up your SMTP credentials (Email: Send Churn Alert).

Tool URLs:
In Tool: Calculate Sentiment Score, enter the Webhook URL from the Trigger: Receive Tickets for Scoring node within this same workflow.
In Tool: Get HubSpot Data, enter the Endpoint URL for your MCP HubSpot data workflow. (Note: this tool *does* call an external workflow.)

Google Sheet:
In Tool: Get Feature Usage from Sheets, enter the Document ID for your own Google Sheet.

Email Details:
In Email: Send Churn Alert, change the From and To email addresses.

Requirements

HubSpot account with Deals API access
LLM provider account (e.g., OpenAI)
Google Sheets tracking customer feature usage
n8n with LangChain community nodes enabled
A separate n8n workflow set up to act as an MCP endpoint for fetching HubSpot data (called by Tool: Get HubSpot Data)

How to customize it

Tailor this workflow to match your business logic:
**Scoring logic:** Adjust the JavaScript in the Code: Convert Sentiment to Score node to redefine how customer scores are calculated.
**Alert thresholds:** Update the prompt in the AI Chain: Analyze for Churn Risk node to fine-tune when alerts trigger (e.g., deal age, score cutoff, or usage drop).
**Data sources:** Swap HubSpot or Google Sheets for your CRM or database of choice — like Salesforce or Airtable.

✅ Outcome: A proactive customer health monitoring system that surfaces risks before it’s too late — keeping your team focused on prevention, not firefighting.
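The Code: Convert Sentiment to Score node mentioned above is the natural place to start customizing. A minimal sketch of one possible implementation is shown below — the -1..1 sentiment scale and the 0–100 health-score mapping are assumptions, not the template’s exact code.

```javascript
// Average per-ticket sentiment (assumed -1..1) into a 0-100 customer
// health score; with no tickets, default to a neutral 50.
function sentimentToHealthScore(tickets) {
  if (tickets.length === 0) return 50;
  const avg = tickets.reduce((sum, t) => sum + t.sentiment, 0) / tickets.length;
  return Math.round((avg + 1) * 50); // map -1..1 onto 0..100
}
```

Because the AI agent compares this score against thresholds in its prompt, changing the mapping here (e.g., weighting recent tickets more heavily) changes when alerts fire without touching the prompt.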
by Avkash Kakdiya
How it works

This workflow starts whenever a new lead comes in through Typeform (form submission) or Calendly (meeting booking). It captures the lead’s information, standardizes it into a clean format, and checks the email domain. If it’s a business domain, the workflow uses AI to enrich the lead with company details such as industry, headquarters, size, and website. Finally, it merges all the data and automatically saves the enriched contact in HubSpot CRM.

Step-by-step

Capture Leads
The workflow listens for new form responses in Typeform or new invitees in Calendly. Both sources are merged into a single stream of leads.

Standardize Data
All incoming data is cleaned and formatted into a consistent structure: Name, Email, Phone, Message, and Domain.

Filter Domains
Checks the email domain. If it’s a free/public domain (like Gmail or Yahoo), the lead is ignored. If it’s a business domain, the workflow continues.

AI Company Enrichment
Sends the domain to an AI Agent (OpenAI GPT-4o-mini). AI returns structured company details:
Company Name
Industry
Headquarters (city & country)
Employee Count
Website
LinkedIn Profile
Short Company Description

Merge Lead & AI Data
Combines the original lead details with the AI-enriched company information. Adds metadata like timestamp and workflow ID.

Save to HubSpot CRM
Creates or updates a contact record in HubSpot. Maps enriched fields like company name, LinkedIn, website, and description.

Why use this?

Automatically enriches every qualified lead with valuable company intelligence.
Filters out unqualified leads with personal email addresses.
Keeps your CRM updated without manual research.
Saves time by centralizing lead capture, enrichment, and CRM sync in one flow.
Helps sales teams focus on warm, high-value prospects instead of raw, unverified leads.
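The domain-filtering step above can be sketched as a small predicate. The free-provider list here is a deliberately short sample — an assumption of this sketch — and would need to be extended (or replaced with a maintained disposable/free-domain list) for production use.

```javascript
// Keep only leads whose email uses a business domain; drop free/public
// providers and anything that isn't shaped like an email address.
const FREE_DOMAINS = new Set([
  'gmail.com', 'yahoo.com', 'outlook.com', 'hotmail.com', 'icloud.com', 'aol.com',
]);

function isBusinessLead(email) {
  const domain = email.toLowerCase().split('@')[1];
  return Boolean(domain) && !FREE_DOMAINS.has(domain);
}
```

The surviving domain is also exactly what the enrichment step sends to the AI agent, so this filter doubles as input validation for the enrichment prompt.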
by Guillaume Duvernay
Never worry about losing your n8n workflows again. This template provides a powerful, automated backup system that gives you the peace of mind of version control without the complexity of Git. On a schedule you define, it intelligently scans your n8n instance for new workflow versions and saves them as downloadable snapshots in a clean and organized Airtable base.

But it’s more than just a backup. This workflow uses AI to automatically generate a concise summary of what each workflow does and even documents the changes between versions. The result is a fully searchable, self-documenting library of all your automations, making it the perfect "single source of truth" for your team or personal projects.

Who is this for?

**Self-hosted n8n users:** This is an essential insurance policy to protect your critical automations from server issues or data loss.
**n8n developers & freelancers:** Maintain a complete version history for client projects, allowing you to easily review changes and restore previous versions.
**Teams using n8n:** Create a central, browseable, and documented repository of all team workflows, making collaboration and handovers seamless.
**Any n8n user who values their work:** Protect your time and effort with an easy-to-use, "set it and forget it" backup solution.

What problem does this solve?

**Prevents catastrophic data loss:** Provides a simple, automated way to back up your most critical assets—your workflows.
**Creates "no-code" version control:** Offers the benefits of version history (like Git) but in a user-friendly Airtable interface, allowing you to browse and download any previous snapshot.
**Automates documentation:** Who has time to document every change? The AI summary and changelog features mean you always have up-to-date documentation, even if you forget to write it yourself.
**Improves workflow discovery:** Your Airtable base becomes a searchable and browseable library of all your workflows and their purposes, complete with AI-generated summaries.

How it works

Scheduled check: On a recurring schedule (e.g., daily), the workflow fetches a list of all workflows from your n8n instance.
Detect new versions: It compares the current version ID of each workflow with the snapshot IDs already saved in your Airtable base. It only proceeds with new, unsaved versions.
Generate AI documentation: For each new snapshot, the workflow performs two smart actions:
AI Changelog: It compares the new workflow JSON with the previously saved version and uses AI to generate a one-sentence summary of what’s changed.
AI Summary: It periodically re-analyzes the entire workflow to generate a fresh, high-level summary of its purpose, ensuring the main description stays up-to-date.
Store in Airtable: It saves everything neatly in the provided two-table Airtable base:
A Workflows table holds the main record and the AI summary.
A linked Snapshots table stores the version-specific details, the AI changelog, and the actual .json backup file as an attachment.

Setup

Duplicate the Airtable base: Before you start, click here to duplicate the Airtable base template into your own Airtable account.
Configure the workflow:
Connect your n8n API credentials to the n8n nodes.
Connect your Airtable credentials and map the nodes to the base you just duplicated.
Connect your AI provider credentials to the OpenAI Chat Model nodes.
Important: In the Store workflow file into Airtable (HTTP Request) node, you must replace <AIRTABLE-BASE-ID> in the URL with your own base ID (it starts with app...).
Set your schedule: Configure the Schedule Trigger to your desired frequency (daily is a good start).
Activate the workflow. Your automated, AI-powered backup system is now live!

Taking it further

**Add notifications:** Add a **Slack** or **Email** node at the end of the workflow to send a summary of which workflows were backed up during each run.
**Use different storage:** While designed for Airtable, you could adapt the logic to store the JSON files in **Google Drive** or **Dropbox** and the metadata in **Google Sheets** or **Notion**.
**Optimize AI costs:** The **Check workflow status** (Code) node is set to regenerate the main AI summary for the first few snapshots and then every 5th snapshot. You can edit the code in this node to change this frequency and manage your token consumption.
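The cost-optimization logic described above — regenerate the summary for the first few snapshots, then only every 5th — reduces to a small decision function. This is a sketch of the idea, not the template’s actual Code-node contents; the `always`/`every` constants are the knobs the description suggests editing.

```javascript
// Decide whether to spend tokens regenerating the main AI summary:
// always for the first few snapshots (workflows change fast early on),
// then only every Nth snapshot thereafter.
function shouldRegenerateSummary(snapshotCount, { always = 3, every = 5 } = {}) {
  if (snapshotCount <= always) return true;
  return snapshotCount % every === 0;
}
```

Raising `every` cuts OpenAI spend at the cost of a staler summary; the per-snapshot AI changelog still runs on every new version either way.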