by Daniel
Secure your n8n automations with this comprehensive template that automates periodic backups to Telegram for instant access while enabling flexible restores from Google Drive links or direct file uploads—ensuring quick recovery without data loss.

📋 What This Template Does

This dual-branch workflow handles full n8n instance backups and restores seamlessly. The backup arm runs every 3 days, fetching all workflows via the n8n API, aggregating them into a JSON array, converting them to a text file, and sending it to Telegram for offsite storage and sharing. The restore arm supports two entry points: manual execution to pull a backup from Google Drive, or form-based upload for local files. It then parses the JSON, cleans each workflow for compatibility, and loops to create missing workflows or update existing ones by name, handling batches efficiently to respect API limits (see the sketch at the end of this description).

- Scheduled backups with Telegram delivery for easy stakeholder access
- Dual restore paths: Drive download or direct file upload via form
- Intelligent create-or-update logic with data sanitization to avoid conflicts
- Looped processing with existence checks and error continuation

🔧 Prerequisites

- n8n instance with API enabled (self-hosted or cloud)
- Telegram account for bot setup
- Google Drive account (optional, for Drive-based restores)

🔑 Required Credentials

n8n API Setup
- In n8n, navigate to Settings → n8n API
- Enable the API and generate a new key
- Add to n8n as "n8n API" credential type, pasting the key in the API Key field

Telegram API Setup
- Message @BotFather on Telegram to create a new bot and get your token
- Find your chat ID by messaging @userinfobot
- Add to n8n as "Telegram API" credential type, entering the Bot Token

Google Drive OAuth2 API Setup
- In Google Cloud Console, go to APIs & Services → Credentials
- Create an OAuth 2.0 Client ID for a Web application and enable the Drive API
- Add the redirect URI: [your-n8n-instance-url]/rest/oauth2-credential/callback
- Add to n8n as "Google Drive OAuth2 API" credential type and authorize

⚙️ Configuration Steps

1. Import the workflow JSON into your n8n instance
2. Assign the n8n API, Telegram API, and Google Drive credentials to their nodes
3. Update the Telegram chat ID in the "Send Backup to Telegram" node
4. Set the Google Drive file ID in the "Download Backup from Drive" node (taken from the file URL)
5. Activate the workflow and test the backup by executing the Schedule node manually
6. Test the restore: run the manual trigger for Drive, or use the form for upload

🎯 Use Cases

- Dev teams backing up staging workflows to Telegram for rapid production restores during deployments
- Solo automators uploading local backups via form to sync across devices after n8n migrations
- Agencies sharing client workflow archives via Drive links for secure, collaborative restores
- Educational setups scheduling exports to Telegram for student template distribution and recovery

⚠️ Troubleshooting

- Backup file empty: Verify the n8n API permissions include read access to workflows
- Restore parse errors: Check the JSON validity of the backup file; adjust the Code node property reference if needed
- API rate limits hit: Increase the Wait node duration or reduce the batch size in the Loop
- Form upload fails: Ensure the file is valid JSON text; test with a small backup first
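For reference, here is a minimal sketch of the clean-and-upsert restore step described above, written as an n8n Code-node snippet. The field names kept or dropped here, and the idea of matching by workflow name, are assumptions for illustration; the template's actual Code node may differ.

```javascript
// Sketch (assumptions): strip read-only fields from a backed-up workflow
// so the n8n API accepts it for both "create" and "update" calls.
const wf = $json; // one workflow object from the parsed backup JSON
const cleaned = {
  name: wf.name,
  nodes: wf.nodes,
  connections: wf.connections,
  settings: wf.settings ?? {},
};
// Fields such as id, createdAt, updatedAt, and versionId are dropped;
// a later node looks up an existing workflow by name to decide
// whether to create a new one or update the match.
return [{ json: { cleaned, matchName: wf.name } }];
```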
by Jay Emp0
Ebook to Audiobook Converter

▶️ Watch Full Demo Video

What It Does

Turn any PDF ebook into a professional audiobook automatically. Upload a PDF, get an MP3 audiobook in your Google Drive. Perfect for listening to books, research papers, or documents on the go.

Example: Input PDF → Output Audiobook

Key Features

- Upload a PDF via web form → get an MP3 audiobook in Google Drive
- Natural-sounding AI voices (MiniMax Speech-02-HD)
- Automatic text extraction, chunking, and audio merging
- Customizable voice, speed, and emotion settings
- Processes long books in batches with smart rate limiting

Perfect For

- **Students**: Turn textbooks into study audiobooks
- **Professionals**: Listen to reports and documents while commuting
- **Content Creators**: Repurpose written content as audio
- **Accessibility**: Make content accessible to visually impaired users

Requirements

| Component | Details |
|-----------|---------|
| n8n | Self-hosted ONLY (cannot run on n8n Cloud) |
| FFmpeg | Must be installed in your n8n environment |
| Replicate API | For MiniMax TTS (Sign up here) |
| Google Drive | OAuth2 credentials + "Audiobook" folder |

⚠️ Important: This workflow does NOT work on n8n Cloud because FFmpeg installation is required.

Quick Setup

1. Install FFmpeg

Docker users:

```bash
docker exec -it <n8n-container-name> /bin/bash
apt-get update && apt-get install -y ffmpeg
```

Native installation:

```bash
sudo apt-get install ffmpeg   # Linux
brew install ffmpeg           # macOS
```

2. Get API Keys

- **Replicate**: Sign up at replicate.com and copy your API token
- **Google Drive**: Set up OAuth2 in n8n and create an "Audiobook" folder in Drive

3. Import & Configure

- Import n8n.json into your n8n instance
- Replace the Replicate API token in the "MINIMAX TTS" node
- Configure Google Drive credentials and select your "Audiobook" folder
- Activate the workflow

Cost Estimate

| Component | Cost |
|-----------|------|
| MiniMax TTS API | $0.15 per 1000 characters ($3-5 for an average book) |
| Google Drive Storage | Free (up to 15GB) |
| Processing Time | ~1-2 minutes per 10 pages |

How It Works

PDF Upload → Extract Text → Split into Chunks → Convert to Speech (batches of 5) → Merge Audio Files (FFmpeg) → Upload to Google Drive

The workflow uses four main modules:

1. Extraction: PDF text extraction and intelligent chunking (see the chunking sketch at the end of this description)
2. Conversion: MiniMax TTS processes text in batches
3. Merging: FFmpeg combines all audio files seamlessly
4. Upload: The final audiobook is saved to Google Drive

Voice Settings (Customizable)

```json
{
  "voice_id": "Friendly_Person",
  "emotion": "happy",
  "speed": 1,
  "pitch": 0
}
```

Available emotions: happy, neutral, sad, angry, excited

Limitations

- ⚠️ Self-hosted n8n ONLY (not compatible with n8n Cloud)
- PDF files only (not EPUB, MOBI, or scanned images)
- Large books (500+ pages) take longer to process
- Requires FFmpeg installation (see setup above)

Troubleshooting

FFmpeg not found?
- Docker: Run docker exec -it <container> /bin/bash, then apt-get install ffmpeg
- Native: Run sudo apt-get install ffmpeg (Linux) or brew install ffmpeg (macOS)

Rate limit errors?
- Increase the wait time in the "WAITS FOR 5 SECONDS" node to 10-15 seconds

Google Drive upload fails?
- Make sure you created the "Audiobook" folder in your Google Drive
- Reconfigure the OAuth2 credentials in n8n

Created by emp0 | More workflows: n8n Gallery
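A minimal sketch of the "intelligent chunking" step referenced above, as an n8n Code-node snippet. The chunk size (~1000 characters, matching the TTS pricing unit) and the sentence-boundary splitting are assumptions for illustration, not the template's exact code.

```javascript
// Sketch (assumptions): split extracted PDF text into ~1000-character
// chunks, breaking on sentence boundaries so the TTS output sounds natural.
const text = $json.text || '';
const maxLen = 1000; // assumed chunk size
const sentences = text.match(/[^.!?]+[.!?]*\s*/g) || [text];

const chunks = [];
let current = '';
for (const s of sentences) {
  if ((current + s).length > maxLen && current) {
    chunks.push(current.trim());
    current = '';
  }
  current += s;
}
if (current.trim()) chunks.push(current.trim());

// One n8n item per chunk, so the TTS node can process them in batches of 5.
return chunks.map((chunk, index) => ({ json: { index, chunk } }));
```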
by Automate With Marc
Gemini 3 Image & PDF Extractor (Google Drive → Gemini 3 → Summary)

Automatically summarize newly uploaded images or PDF reports using Google Gemini 3, triggered directly from a Google Drive folder. Perfect for anyone who needs fast AI-powered analysis of financial reports, charts, screenshots, or scanned documents.

🎥 Watch the full step-by-step video tutorial: https://www.youtube.com/watch?v=UuWYT_uXiw0

What this template does

This workflow watches a Google Drive folder for new files and automatically:

1. Detects new uploaded files
   - Uses the Google Drive Trigger
   - Watches a specific folder for fileCreated events
   - Filters by MIME type: image/png, image/webp, application/pdf
2. Downloads the file automatically, depending on the file type:
   - Images → Download via HTTP Request → Send to Gemini 3 Vision
   - PDFs → Download via HTTP Request → Extract content → Send to Gemini 3
3. Analyzes content using Gemini 3, in two separate processing lanes:

🖼️ Image Lane
- Image is sent to Gemini 3 (Vision / Image Analyze)
- Extracts textual + visual meaning from charts, diagrams, or screenshots
- Passes structured output to an AI Analyst Agent
- Agent summarizes and highlights the top 3 findings

📄 PDF Lane
- PDF is downloaded
- Text is extracted using Extract From File
- Processed using Gemini 3 via the OpenRouter Chat Model
- AI Analyst Agent summarizes charts/tables and extracts insights

Why this workflow is useful

- Save hours of manually reading PDFs, charts, and screenshots
- Convert dense financial or operational documents into digestible insights
- Great for: financial analysts, operations teams, market researchers, content & reporting teams, anyone receiving frequent reports via Drive

Requirements

Before using this template, you will need:
- Google Drive OAuth credential (for the Drive trigger + file download)
- Gemini 3 / PaLM or OpenRouter API key
- (Optional) Update the folder ID to your own Google Drive target folder

⚠️ No credentials are included in this template. Add them manually after importing it.

Node Overview

Google Drive Trigger
- Watches a specific Drive folder for newly added files
- Provides metadata like webContentLink and MIME type

Filter by Type (IF Node)
- Routes files to the Image lane or the PDF lane (see the routing sketch at the end of this description)
- png or webp → Image
- pdf → PDF

🖼️ Image Processing Lane
- Download Image (HTTP Request)
- Analyze Image (Gemini Vision)
- Analyzer Agent: summarizes findings and highlights actionable insights, powered by OpenRouter Gemini 3

📄 PDF Processing Lane
- Download PDF (HTTP Request)
- Extract From File → PDF
- Analyzer Agent (PDF): summarizes extracted chart/report information and highlights key takeaways

Setup Guide

1. Import the template into your n8n workspace
2. Open the Google Drive Trigger, select your Drive OAuth credential, and replace the folder ID with your target folder
3. Open the Gemini 3 / OpenRouter AI Model nodes and add your API credentials
4. Test by uploading a PNG/WebP chart screenshot and a multi-page PDF report
5. Check the execution to view the summary outputs

Customization Ideas

- Add email delivery (send the summary to yourself daily)
- Save summaries into Google Sheets, Notion, Slack channels, or n8n Data Tables
- Add a second agent to convert summaries into weekly reports, PowerPoint slides, or Slack-ready bullet points
- Add classification logic: revenue reports, marketing analytics, product dashboards, financial charts

Troubleshooting

- Trigger not firing? Confirm your Drive OAuth credential has read access to the folder.
- Gemini errors? Ensure your model ID matches your API provider: models/gemini-3-pro-preview or google/gemini-3-pro-preview
- PDF extraction empty? Check if the file contains selectable text or only images. (You can add OCR if needed.)
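A minimal sketch of the MIME-type routing performed by the Filter by Type step, written here as an n8n Code-node alternative. The input field name `mimeType` and the lane labels are assumptions for illustration; the actual template uses an IF node with equivalent conditions.

```javascript
// Sketch (assumptions): route each new Drive file to an "image" or "pdf"
// lane based on the MIME type reported by the Google Drive Trigger.
const mimeType = $json.mimeType || '';
let lane = null;
if (mimeType === 'image/png' || mimeType === 'image/webp') {
  lane = 'image';
} else if (mimeType === 'application/pdf') {
  lane = 'pdf';
}
// Files with any other MIME type are dropped (no lane assigned).
return lane ? [{ json: { ...$json, lane } }] : [];
```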
by MUHAMMAD SHAHEER
Overview

This workflow automates the process of turning your video transcripts into platform-specific social media posts using AI. It reads any uploaded transcript file, analyzes the text, and automatically generates full-length, engaging posts with image prompts for Facebook, LinkedIn, Instagram, Reddit, and WhatsApp. Perfect for creators, marketers, and automation builders who want to repurpose long-form content into viral posts, all in one click.

How it Works

1. The Manual Trigger starts the workflow.
2. The Read Binary File node imports your video transcript (TXT format).
3. The Move Binary Data and Set nodes convert it into a text string for processing.
4. The AI Agent (LangChain), powered by Groq AI, analyzes the transcript and generates human-like social media posts with realistic image prompts.
5. The Function node parses and structures the output by platform (see the parsing sketch at the end of this description).
6. The Google Sheets node automatically saves all content — ready for scheduling or publishing.
7. The SerpAPI integration enhances contextual awareness by referencing real-time search trends.

Set Up Steps

Setting up this workflow typically takes 5–10 minutes.

1. Connect your Google Sheets account (OAuth2).
2. Connect your Groq AI and SerpAPI credentials.
3. Upload your transcript file (e.g., from YouTube or a podcast).
4. Run the workflow to instantly generate platform-specific posts and prompts.
5. View all results automatically saved in Google Sheets.

Detailed instructions are included as sticky notes inside the workflow.

Use Cases

- Turn YouTube videos or podcasts into multi-platform social content
- Auto-generate daily social posts using transcripts
- Build AI-powered repurposing systems for agencies or creators
- Save creative teams hours of manual copywriting work

Requirements

- n8n account (self-hosted or cloud)
- Groq AI API Key
- SerpAPI Key (for optional trend enhancement)
- Google Sheets connection
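A minimal sketch of how the Function node's per-platform parsing might look, as an n8n Code-node snippet. It assumes the AI agent returns a JSON object keyed by platform name; the actual field names and output format in the template may differ.

```javascript
// Sketch (assumptions): the AI agent's reply is JSON keyed by platform,
// e.g. { "linkedin": { "post": "...", "image_prompt": "..." }, ... }.
const raw = $json.output || $json.text || '';
let parsed;
try {
  parsed = typeof raw === 'string' ? JSON.parse(raw) : raw;
} catch (e) {
  // Fall back to an empty object if the model returned non-JSON text.
  parsed = {};
}

const platforms = ['facebook', 'linkedin', 'instagram', 'reddit', 'whatsapp'];
return platforms
  .filter((p) => parsed[p])
  .map((p) => ({
    json: {
      platform: p,
      post: parsed[p].post || '',
      imagePrompt: parsed[p].image_prompt || '',
    },
  }));
```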
by Avkash Kakdiya
How it works

This workflow runs daily to collect the latest funding round data from Crunchbase. It retrieves up to 100 recent funding events, including company, investors, funding amount, and industry details. The data is cleaned and filtered to only include rounds announced in the last 30 days. Finally, the results are saved into both Google Sheets for reporting and Airtable for structured database management.

Step-by-step

Trigger & Data Fetching
- Schedule Trigger node – Runs the workflow once a day.
- HTTP Request node – Calls the Crunchbase API to fetch the latest 100 funding rounds with relevant details.

Data Processing
- Code node – Parses the raw API response into clean fields such as company name, funding type, funding amount, investors, industry, and Crunchbase URL.
- Filter node – Keeps only funding rounds from the last 30 days to ensure the dataset remains fresh and relevant (see the sketch at the end of this description).

Storage & Outputs
- Google Sheets node – Appends or updates the filtered funding records in a Google Sheet for easy sharing and reporting.
- Airtable node – Stores the same records in Airtable for more structured, database-style organization and management.

Why use this?

- Automates daily collection of startup funding data from Crunchbase.
- Keeps only the most recent and relevant records for faster insights.
- Ensures data is consistently stored in both Google Sheets and Airtable.
- Supports reporting, collaboration, and database management in one flow.
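A minimal sketch of the parsing and 30-day filtering described above, as an n8n Code-node snippet. The response shape (`entities`, `properties`) and the field names are assumptions for illustration; the actual Crunchbase payload and the template's Code node may differ.

```javascript
// Sketch (assumptions): flatten a Crunchbase-style response and keep only
// rounds announced within the last 30 days.
const entities = $json.entities || [];
const cutoff = Date.now() - 30 * 24 * 60 * 60 * 1000;

const rows = entities
  .map((e) => {
    const p = e.properties || {};
    return {
      company: p.funded_organization_identifier?.value || '',
      fundingType: p.investment_type || '',
      amountUsd: p.money_raised?.value_usd || 0,
      investors: (p.investor_identifiers || []).map((i) => i.value).join(', '),
      announcedOn: p.announced_on || '',
      permalink: e.identifier?.permalink || '',
    };
  })
  .filter((r) => r.announcedOn && new Date(r.announcedOn).getTime() >= cutoff);

// One n8n item per funding round, ready for Google Sheets and Airtable.
return rows.map((r) => ({ json: r }));
```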
by RealSimple Solutions
POML → Prompt/Messages (No-Deps)

What this does

Turns POML markup into either a single Markdown prompt or chat-style messages[] — using a zero-dependency n8n Code node. It supports variable substitution (via context), basic components (headings, lists, code, images, tables, line breaks), and optional schema-driven validation using componentSpec + attributeSpec.

Credits

Created by Real Simple Solutions as an n8n-template-friendly POML compiler (no dependencies) aiming for full POML feature parity. View more of our _templates here_.

Who’s it for

Teams who author prompts in POML and want a template-safe way to turn them into either a single Markdown prompt or chat-style messages—without installing external modules. Works on n8n Cloud and self-hosted.

What it does

This workflow converts POML into:
- **prompt** (Markdown) for single-shot models, or
- **messages[]** (system|user|assistant) for chat APIs when speakerMode is true.

It supports variable substitution via a context object ({{dot.path}}), lists, headings, code blocks, images (incl. base64 → data: URL), tables from JSON (records/columns), and basic message components (see the substitution sketch at the end of this description).

How it works

1. **Set (Specs & Context):** Provide componentSpec (allowed attrs per tag), attributeSpec (typing/coercion), and optional context.
2. **Code (POML → Prompt/Messages):** A zero-dependency compiler parses the POML and emits prompt or messages[].

> Add a yellow Sticky Note that includes this description and any setup links. Use additional neutral sticky notes to explain each step.

How to set up

1. Import the template.
2. Open the first Set node and paste your componentSpec, attributeSpec, and context (examples included).
3. In the Code node, choose:
   - speakerMode: true to get messages[], or false for a single prompt.
   - listStyle: dash | star | plus | decimal | latin.
4. Run → inspect prompt/messages in the output.

Requirements

No credentials or community nodes. Works without external libraries (template-compliant).

How to customize

- Add message tags (<system-msg>, <user-msg>, <ai-msg>) in your POML when using speakerMode: true.
- Extend componentSpec/attributeSpec to validate or coerce additional tags/attributes.
- Preformat arrays in context (e.g., bulleted, csv) for display, or add a small Set node to build them on the fly.
- Rename nodes and keep all user-editable fields grouped in the first Set node.

Security & best practices

- **Never** hardcode API keys in nodes.
- Remove any personal IDs before publishing.
- Keep your Sticky Note(s) up to date and instructional.
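A minimal sketch of the {{dot.path}} variable substitution this compiler performs, as a standalone JavaScript snippet. The function name and regex are assumptions for illustration; the template's zero-dependency Code node implements this (and much more) in its own way.

```javascript
// Sketch (assumptions): resolve {{dot.path}} placeholders against a
// context object, e.g. {{user.name}} -> context.user.name.
function substitute(template, context) {
  return template.replace(/\{\{\s*([\w.]+)\s*\}\}/g, (match, path) => {
    const value = path
      .split('.')
      .reduce((obj, key) => (obj == null ? undefined : obj[key]), context);
    // Leave the placeholder untouched if the path is missing.
    return value === undefined ? match : String(value);
  });
}

// Example usage:
const context = { user: { name: 'Ada' }, task: 'summarize' };
console.log(substitute('Hello {{user.name}}, please {{task}}.', context));
// -> "Hello Ada, please summarize."
```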
by Supira Inc.
How it works

This workflow automatically collects the latest news articles from both English and Japanese sources using NewsAPI, summarizes them with OpenAI, and appends the results into a Google Sheet. The summaries are concise (about 50 characters) in Japanese, making it easy to review news highlights at a glance.

Set up steps

1. Create a Google Sheet with two tabs:
   - 01_Input (columns: Keyword, SearchRequired)
   - 02_Output (columns: Date, Keyword, Summary, URL)
2. Enter your own Google Sheet ID and tab names in the workflow.
3. Add your NewsAPI key in the HTTP Request nodes.
4. Connect your OpenAI account (or deactivate the summarization node if not needed).
5. Run the workflow manually or use the daily schedule trigger at 13:00.

This template is ready to use with minimal changes. Sticky notes inside the workflow provide extra guidance.
by Jeremiah Wright
Who’s it for

Recruiters, freelancers, and ops teams who scan job briefs and want quick, relevant n8n template suggestions, saved in a Google Sheet for tracking.

What it does

Parses any job text, extracts exactly 5 search keywords, queries the n8n template library, and appends the matched templates (ID, name, description, author) to Google Sheets, including the canonical template URL.

How it works

1. A trigger receives a message or pasted-in job brief.
2. An LLM agent returns 5 concise search terms (JSON).
3. For each keyword, an HTTP request searches the n8n templates API.
4. Results are split and written to Google Sheets; the workflow builds the public URL from ID + slug (see the sketch at the end of this description).

Set up

1. Add credentials for OpenAI (or swap the LLM node to your provider).
2. Create a Google Sheet with columns: Template ID, Name, User, Description, URL.
3. In the ⚙️ Config node, set: GOOGLE_SHEETS_DOC_ID, GOOGLE_SHEET_NAME, N8N_TEMPLATES_API_URL.

Requirements

• n8n (cloud or self-hosted)
• OpenAI (or alternative LLM) credentials
• Google Sheets OAuth credentials

Customize

• Change the model/system prompt to tailor keyword extraction.
• Swap Google Sheets for Airtable/Notion.
• Extend filters (e.g., only AI/CRM templates) before writing rows.
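A minimal sketch of the ID + slug URL construction mentioned in step 4, as an n8n Code-node snippet. The input field names (`id`, `name`) and the slugging rule are assumptions for illustration; the URL pattern is assumed here to follow n8n's public template gallery.

```javascript
// Sketch (assumptions): build a public template URL from the template id
// and a slug derived from its name, e.g. 1234 + "My Flow" ->
// https://n8n.io/workflows/1234-my-flow
const id = $json.id;
const name = $json.name || '';
const slug = name
  .toLowerCase()
  .replace(/[^a-z0-9]+/g, '-') // non-alphanumeric runs become single dashes
  .replace(/^-+|-+$/g, '');    // trim leading/trailing dashes
const url = `https://n8n.io/workflows/${id}-${slug}`;
return [{ json: { ...$json, url } }];
```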
by Avkash Kakdiya
How it works

This workflow automatically syncs new Productboard features into Linear as issues and notifies the team via Telegram. It starts on a schedule, fetches Productboard features through API requests, and transforms the raw data into clean, structured fields. Newly created features are filtered, then inserted into Linear, and a success message is sent to Telegram for confirmation.

Step-by-step

1. Trigger and fetch data
   - **Schedule Trigger** – Starts the workflow at predefined intervals.
   - **HTTP Request to Productboard** – Pulls the latest features from the Productboard API.
2. Transform and clean data
   - **Code (Transform Features)** – Strips HTML, formats dates, and extracts clean fields like name, description, status, owner, and link.
3. Filter for new items
   - **If (Filter New Features)** – Compares createdAt with today's date, allowing only new features to proceed (see the sketch at the end of this description).
4. Create issues in Linear
   - **Create Linear Issue** – Opens a new Linear issue using the feature's name and description.
5. Notify via Telegram
   - **Success Notification (Telegram)** – Sends a confirmation message once the sync is successful.

Why use this?

- Automates the sync of Productboard features into Linear without manual copying.
- Ensures only new features are captured, preventing duplicates.
- Keeps your team updated instantly through Telegram notifications.
- Saves time by standardizing data and formatting before inserting into Linear.
- Creates a smooth handoff from product planning to engineering execution.
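A minimal sketch of the transform-and-filter steps described above, as an n8n Code-node snippet. The Productboard field paths and the "created today" comparison are assumptions for illustration; the template's actual Code and If nodes may differ.

```javascript
// Sketch (assumptions): clean one Productboard feature and flag whether it
// was created today, so a downstream If node can let only new items through.
const f = $json;
const stripHtml = (html) => (html || '').replace(/<[^>]*>/g, '').trim();

const createdAt = f.createdAt || '';
const today = new Date().toISOString().slice(0, 10); // YYYY-MM-DD

return [{
  json: {
    name: f.name || '',
    description: stripHtml(f.description),
    status: f.status?.name || '',
    owner: f.owner?.email || '',
    link: f.links?.html || '',
    createdAt,
    isNew: createdAt.slice(0, 10) === today,
  },
}];
```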
by Khairul Muhtadin
Why You Need This Right Now 💡

Stop the panic attacks. We've all been there - accidentally deleted a workflow that took hours to build, or worse, corrupted your entire automation setup. This workflow is your safety net.

Save your weekends. Instead of spending hours recreating lost work, get back to what matters. One setup protects everything, automatically.

Sleep better at night. Your workflows are safely stored in two places with full version history. If something breaks, you're back online in minutes, not days.

Perfect For These Situations ⚡

✅ Business owners running critical automations
✅ Agencies managing client workflows
✅ Teams who need audit trails
✅ Anyone who values their time and sanity

How It Actually Works 🔧

Think of it like having a personal assistant who:
- Checks your workflows twice daily (you can change this)
- Creates organized backups with timestamps
- Stores them safely in Google Drive AND GitHub
- Tells you it's done via Telegram or Discord
- Keeps everything tidy with smart folder organization

The result? A timestamped folder in your Google Drive and organized files in your GitHub repo. Everything is searchable, restorable, and audit-ready.

Quick 5-Minute Setup 🚀

1. Import this workflow to your n8n
2. Connect your accounts (Google Drive, GitHub, optional notifications)
3. Set your preferences (which folder, which repo, how often)
4. Test it once to make sure everything works
5. Relax knowing your workflows are protected

What You'll Need 📋

- Your n8n instance (obviously!)
- Google Drive account (free works fine)
- GitHub account (free works too)
- 5 minutes of setup time
- Optional: Telegram or Discord for notifications

Pro Tips for Power Users 🧠

Want to level up? Here are some ideas:
- **Add encryption** for sensitive workflows
- **Create restore workflows** for one-click recovery
- **Set up pull requests** for team review of changes
- **Customize schedules** based on your workflow update frequency

Created by: khaisa Studio - Automation experts who actually use this stuff daily

Tags: backup, automation, n8n, google-drive, github, workflow-protection, business-continuity

Questions? Get in touch - I'm always happy to help fellow automation enthusiasts!

Remember: The best backup is the one you set up before you need it. Your future self will thank you!
by AppUnits AI
Generate Invoices and Send Reminders for Customers with Jotform, QuickBooks and Gmail

This workflow automates the entire process of receiving a product/service order, checking or creating a customer in QuickBooks Online (QBO), generating an invoice, and emailing it — all triggered by a form submission (via Jotform). It then sends invoice reminders on a daily schedule.

How It Works

1. Receive Submission
   - Triggered when a user submits a form.
   - Collects data like customer details, selected product/service, etc.
2. Check If Customer Exists
   - Searches QBO to determine if the customer already exists.
   - If the customer exists: **update** the customer details (e.g., billing address).
   - If the customer doesn't exist: **create** a new customer in QBO.
3. Get The Item
   - Retrieves the selected product or service from QBO.
4. Create The Invoice
   - Generates a new invoice for the customer using the selected item.
5. Send The Invoice
   - Automatically sends the invoice via email to the customer.
6. Store The Invoice In DB
   - Stores the needed invoice details in the DB.
7. Send Reminders
   - Every day at 8 AM, the automation checks each invoice to decide whether to: send a reminder email, skip and send it later, or delete the invoice from the DB (if it's paid or all reminders have been sent). A sketch of this decision logic follows this description.

Who Can Benefit from This Workflow?

- **Freelancers**
- **Service Providers**
- **Consultants & Coaches**
- **Small Businesses**
- **E-commerce or Custom Product Sellers**

Requirements

- Jotform webhook setup, more info here
- QuickBooks Online credentials, more info here
- Email setup: update the email nodes (Send reminder email & Send reminders sent summary), more info about Gmail setup here
- Create a data table with the following columns:
  - invoiceId (string)
  - remainingAmount (number)
  - currency (string)
  - remindersSent (number)
  - lastSentAt (date time)
- Update the "Add reminders config" node to set the data table id and the intervals in days (default is after 2 days, then after 3 days, and finally after 5 days)
- LLM model credentials
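A minimal sketch of the daily reminder decision described in step 7, as an n8n Code-node snippet. The column names match the data table listed in the Requirements, and the intervals follow the stated defaults (2, 3, 5 days); the exact branching in the template may differ.

```javascript
// Sketch (assumptions): decide per invoice whether to send a reminder now,
// skip it for today, or delete the record from the data table.
const inv = $json;               // one row from the invoices data table
const intervalsDays = [2, 3, 5]; // default reminder intervals

let action;
if (inv.remainingAmount <= 0 || inv.remindersSent >= intervalsDays.length) {
  // Paid, or all reminders already sent: clean up the record.
  action = 'delete';
} else {
  const waitDays = intervalsDays[inv.remindersSent];
  const nextDue =
    new Date(inv.lastSentAt).getTime() + waitDays * 24 * 60 * 60 * 1000;
  action = Date.now() >= nextDue ? 'send' : 'skip';
}

return [{ json: { ...inv, action } }];
```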
by Rahul Joshi
Description

Boost your LinkedIn influence with AI-curated daily content ideas! This n8n automation fetches trending professional topics from LinkedIn, analyzes them with Azure OpenAI (GPT-4o-mini), and delivers a ready-to-use, Outlook-compatible email report with:

- Engagement scoring
- AI-generated hashtags
- Concise content suggestions

Perfect for influencers, marketers, and thought leaders, this template ensures you never run out of fresh, relevant post ideas—tailored to boost reach and engagement.

**Step-by-Step Workflow:**

📅 Manual or Scheduled Trigger
- Run on-demand or set it to execute daily for fresh content ideas.

🤖 AI Topic Extraction (Basic LLM Chain)
- Pulls 3–5 trending LinkedIn topics with short professional descriptions.
- Ensures relevance for a business/corporate audience.

🧠 AI Processing & Optimization (Code Node)
- Generates high-impact hashtags based on topic and description.
- Calculates an Engagement Potential Score (0–100%) for prioritization (see the sketch at the end of this description).
- Creates short, copy-ready content suggestions.

📊 HTML Report Generation (Outlook-Compatible)
- Professionally styled with: topic ranking, engagement percentage, hashtags, and ready-to-post snippets.

📧 Automated Email Delivery (Gmail Node)
- Sends the formatted daily report directly to your inbox.
- Optimized for Outlook, Gmail, and mobile viewing.

Perfect For:

- LinkedIn Influencers – Daily inspiration for posts that trend.
- Marketing Teams – Streamlined trend analysis and content ideation.
- Brand Managers – Stay ahead with data-driven post suggestions.
- Thought Leaders – Maintain a consistent posting cadence with minimal effort.

Built With:

- Azure OpenAI GPT-4o-mini – AI topic generation & optimization.
- n8n Code Node – Hashtag generation, scoring & formatting.
- Gmail API – Automated report delivery.
- HTML Email Template – Fully mobile and Outlook compatible.

Key Benefits:

✅ Saves hours of manual trend research.
📈 Maximizes reach with AI-optimized hashtags.
🧠 Prioritizes high-engagement topics for better ROI.
🛠 Fully no-code & customizable to match your niche.
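A minimal sketch of the hashtag generation and engagement scoring performed in the Code node, as an n8n snippet. The scoring heuristic, signal list, and field names are assumptions for illustration; the template's actual Code node logic may differ.

```javascript
// Sketch (assumptions): derive hashtags from a topic title and compute a
// simple 0-100 engagement score from keyword signals in the description.
const topic = $json.topic || '';
const description = $json.description || '';

// Hashtags: each word of the topic becomes a hashtag, stripped of punctuation.
const hashtags = topic
  .split(/\s+/)
  .filter(Boolean)
  .slice(0, 5)
  .map((w) => '#' + w.replace(/[^a-zA-Z0-9]/g, ''))
  .filter((h) => h.length > 1);

// Toy scoring heuristic: boost topics mentioning high-engagement signals.
const signals = ['AI', 'leadership', 'growth', 'remote', 'hiring'];
const hits = signals.filter((s) =>
  (topic + ' ' + description).toLowerCase().includes(s.toLowerCase()),
).length;
const engagementScore = Math.min(100, 40 + hits * 12);

return [{ json: { topic, description, hashtags, engagementScore } }];
```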