by furuidoreandoro
# Automated TikTok Repurposing & Video Generation Workflow

## Who's it for
This workflow is designed for content creators, social media managers, and marketers—specifically those in the career, recruitment, or "job change" (転職/就職) niches. It is ideal for anyone looking to automate the process of finding trending short-form content concepts and converting them into fresh AI-generated videos.

## How it works / What it does
This workflow automates the pipeline from content research to video creation:
1. **Scrape Data:** It triggers an Apify actor (clockworks/tiktok-scraper) to search and scrape TikTok videos related to "Job Change" (転職) and "Employment" (就職).
2. **Store Raw Data:** It saves the scraped TikTok metadata (text, stats, author info) into a Google Sheet.
3. **AI Analysis & Prompting:** An AI Agent (via OpenRouter) analyzes the scraped video content and creates a detailed prompt for a new video (concept, visual cues, aspect ratio).
4. **Log Prompts:** The generated prompt is saved to a separate tab in the Google Sheet.
5. **Video Generation:** The prompt is sent to Fal AI (Veo3 model) to generate a new 8-second, vertical (9:16) video with audio.
6. **Wait & Retrieve:** The workflow waits for the generation to complete, then retrieves the video file.
7. **Cloud Storage:** Finally, it uploads the generated video file to a specific Google Drive folder.

## How to set up
1. **Credentials:** Configure the following credentials in n8n:
   - **Apify API:** Currently passed via URL query params in the workflow; switching to Header Auth is recommended.
   - **Google Sheets OAuth2:** Connect your Google account.
   - **OpenRouter API:** For the AI Agent.
   - **Fal AI (Header Auth):** For the video generation API.
   - **Google Drive OAuth2:** For uploading the final video.
2. **Google Sheets:** Create a spreadsheet. Note the `documentId` and update the Google Sheets nodes. Ensure you have the necessary sheet names (e.g., "シート1" ("Sheet1") for raw data, "生成済み" ("Generated") for prompts) and columns mapped.
3. **Google Drive:** Create a destination folder. Update the **Upload file** node with the correct `folderId`.
4. **Apify:** Update the token in the **HTTP Request** and **HTTP Request1** URLs with your own Apify API token.

## Requirements
- **n8n Version:** 1.x or higher (workflow uses version 4.3 nodes).
- **Apify Account:** With access to clockworks/tiktok-scraper and sufficient credits.
- **Fal.ai Account:** With credits for the fal-ai/veo3 model.
- **OpenRouter Account:** With credits for the selected LLM.
- **Google Workspace:** Access to Drive and Sheets.

## How to customize the workflow
- **Change the Niche:** Update the `searchQueries` JSON body in the first **HTTP Request** node (e.g., change "転職" to "Cooking" or "Fitness"); a hedged sketch of both editable request bodies follows below.
- **Adjust AI Logic:** Modify the **AI Agent** system prompt to change the style, tone, or structure of the video prompts it generates.
- **Video Settings:** In the **Fal Submit** node, adjust `bodyParameters` to change the duration (e.g., 5s), aspect ratio (e.g., 16:9), or disable audio.
- **Scale:** Increase the amount in the **Limit** node to process more than one video per execution.
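For orientation, here is a rough sketch of the two request bodies you are most likely to edit. Field names follow the public clockworks/tiktok-scraper and fal-ai/veo3 documentation, but treat the exact shapes as assumptions to verify against the nodes in your imported workflow:

```javascript
// Hypothetical body of the first HTTP Request node (Apify actor run).
const apifyBody = {
  searchQueries: ["転職", "就職"], // swap these to retarget the niche
  resultsPerPage: 10,             // assumed actor option; check the actor docs
};

// Hypothetical bodyParameters of the Fal Submit node (fal-ai/veo3).
const falBody = {
  prompt: "{{ $json.output }}",   // prompt produced by the AI Agent (field name assumed)
  aspect_ratio: "9:16",           // vertical video
  duration: "8s",                 // 8-second clip
  generate_audio: true,           // set to false to disable audio
};
```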
by Dahiana
# AI Content Summarizer Suite

This n8n template collection demonstrates how to build a comprehensive AI-powered content summarization system that handles multiple input types: URLs, raw text, and PDF files. Built as 4 separate workflows for maximum flexibility.

**Use cases:** Research workflows, content curation, document processing, meeting prep, social media content creation, or integrating smart summarization into any app or platform.

## How it works
- **Multi-input handling:** Separate workflows for URLs (web scraping), direct text input, and PDF file processing
- **Smart PDF processing:** Attempts text extraction first, falls back to OCR.Space for image-based PDFs
- **AI summarization:** Uses OpenAI's GPT-4.1-mini with customizable length (brief/standard/detailed) and focus areas (key points/numbers/conclusions/action items)
- **Language support:** Multi-language summaries with automatic language detection
- **Flexible output:** Returns clean markdown-formatted summaries via webhook responses
- **Unified option:** The all-in-one workflow automatically detects input type and routes accordingly

## How to use
- Replace webhook triggers with your preferred method (manual, form, API endpoint)
- Each workflow accepts different parameters: URL, text content, or file upload (see the sketch at the end of this listing)
- Customize summary length and focus in the AI prompt nodes
- Authentication is optional - switch to "none" if running internally
- Perfect for integration with Bubble, Zapier, or any platform that can make HTTP requests

## Requirements
- OpenAI API key or OpenRouter key
- OCR.Space API key (for PDF fallback processing)
- n8n instance (cloud or self-hosted)
- Any platform that can make HTTP requests

## Setup Steps
1. Replace "Dummy OpenAI" with your OpenAI credentials
2. Add your OCR.Space API key in the OCR nodes (optional; only needed for the PDF fallback)
3. Update webhook authentication as needed
4. Test each workflow path individually
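A hedged example of what calls into the webhooks might look like. The parameter names (`url`, `text`, `length`, `focus`, `language`) are illustrative assumptions; align them with the webhook nodes in your copy of the workflows:

```javascript
// URL workflow: summarize a web page.
const urlRequest = {
  url: "https://example.com/article",
  length: "standard",   // brief | standard | detailed
  focus: "key points",  // key points | numbers | conclusions | action items
  language: "en",       // optional; auto-detected if omitted
};

// Text workflow: summarize raw text.
const textRequest = {
  text: "Paste the raw text to summarize here...",
  length: "brief",
};

// PDF workflow: send the file itself as a multipart/form-data upload
// rather than a JSON body.
```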
by Fahmi Fahreza
Sign up for Decodo HERE for a discount.

Automatically scrape, structure, and log forum or news content using Decodo and Google Gemini AI. This workflow extracts key details like titles, URLs, authors, and engagement stats, then appends them to a Google Sheet for tracking and analysis.

## Who's it for?
Ideal for data journalists, market researchers, or AI enthusiasts who want to monitor trending topics across specific domains.

## How it works
1. **Trigger:** Workflow runs on a schedule.
2. **Data Setup:** Defines forum URLs and geolocation.
3. **Scraping:** Extracts raw text data using the Decodo API.
4. **AI Extraction:** Gemini parses and structures the scraped text into clean JSON.
5. **Data Storage:** Each news item is appended or updated in Google Sheets.
6. **Logging:** Records scraping results for monitoring and debugging.

## How to set up
1. Add your Decodo, Google Gemini, and Google Sheets credentials in n8n.
2. Adjust the forum URLs, geolocation, and Google Sheet ID in the **Workflow Config** node (see the sketch below).
3. Set your preferred trigger interval in **Schedule Trigger**.
4. Activate and monitor from the n8n dashboard.
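A hedged sketch of what the **Workflow Config** node might define. The key names are assumptions for illustration; mirror whatever the imported workflow actually uses:

```javascript
// Workflow Config as an n8n Code node returning one configuration item.
return [{
  json: {
    forumUrls: [
      "https://example-forum.com/trending",   // pages to scrape
      "https://example-news.com/technology",
    ],
    geolocation: "US",                        // Decodo request geolocation
    sheetId: "your-google-sheet-id",          // target spreadsheet for results
  },
}];
```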
by Fahmi Fahreza
# TikTok Trend Analyzer with Apify + Gemini + Airtable

Automatically scrape trending TikTok videos, analyze their virality using Gemini AI, and store insights directly into Airtable for creative research or content planning.

## Who's it for?
Marketing analysts, creators, and creative agencies looking to understand why videos go viral and how to replicate successful hooks and formats.

## How it works
1. A scheduled trigger runs the Apify TikTok Trends Scraper weekly.
2. The scraper collects trending video metadata.
3. Data is stored in Airtable (views, likes, captions, sounds, etc.).
4. When a specific video is submitted via webhook, the workflow fetches it from Airtable.
5. Gemini AI analyzes the video and extracts structured insights: summary, visual hook, audio, and subtitle analysis.
6. The workflow updates the Airtable record with these AI insights.

## How to set up
1. Connect Apify and Airtable credentials, and link Gemini or OpenAI keys.
2. Adjust the schedule frequency.
3. Add your Airtable base and table IDs.
4. Trigger analysis manually via the webhook endpoint (see the sketch below).
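A hedged example of triggering the analysis branch by hand. The webhook path and body field are assumptions; use the webhook URL shown in your n8n instance and whatever record identifier the workflow expects:

```javascript
// Submit one Airtable record for Gemini analysis via the workflow's webhook.
await fetch("https://your-n8n-instance.com/webhook/analyze-tiktok", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    recordId: "recXXXXXXXXXXXXXX", // Airtable record ID of the video to analyze
  }),
});
```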
by Atik
Automate multi-document handling with AI-powered extraction that adapts to any format and organizes it instantly.

## What this workflow does
- Monitors Google Drive for new uploads (receipts, resumes, claims, physician orders, blueprints, or any doc type)
- Automatically downloads and prepares files for analysis
- Identifies the document type using Google Gemini
- Parses structured data via the trusted VLM Run node with OCR + layout parsing
- Stores records in Google Sheets — AI Agent maps values to the correct sheet dynamically

## Setup
**Prerequisites:** Google Drive & Google Sheets accounts, VLM Run API credentials, n8n instance.

Install the verified VLM Run node by searching for VLM Run in the node list, then click Install. Once installed, you can integrate it directly for high-accuracy data extraction.

**Quick Setup:**
1. Configure Google Drive OAuth2 and select a folder for uploads
2. Add VLM Run API credentials
3. Create a Master Reference Google Sheet with the following structure:

| Document_Name | Spreadsheet_ID |
| --- | --- |
| Receipt | your-receipt-sheet-id |
| Resume | your-resume-sheet-id |
| Physician Order | your-physician-order-sheet-id |
| Claims Processing | your-claims-sheet-id |
| Construction Blueprint | your-blueprint-sheet-id |

The first column holds the document type, and the second column holds the target sheet ID where extracted data should be appended.

4. In the AI Agent node, edit the agent prompt to (a plain-code sketch of this routing logic follows this listing):
   - Analyze the JSON payload from VLM Run
   - Look up the document type in the Master Reference Sheet
   - If a matching sheet exists → fetch headers, then append data accordingly
   - If headers don't exist → create them from JSON keys, then insert values
   - If no sheet exists → add the new type to the Master Reference with an empty Spreadsheet ID
5. Test with a sample upload and activate the workflow

## How to customize this workflow to your needs
Extend functionality by:
- Adjusting the AI Agent prompt to support any new document schema (just update field mappings)
- Adding support for multi-language OCR or complex layouts in VLM Run
- Linking Sheets data to BI dashboards or reporting tools
- Triggering notifications when new entries are stored

This workflow leverages the VLM Run node for flexible, precision extraction and the AI Agent for intelligent mapping, creating a powerful system that adapts to any document type with minimal setup changes.
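The routing the agent prompt describes can be sketched as a plain function for readers who prefer code. The agent actually performs these steps through tool calls, and the field names (`document_type`, `fields`) are hypothetical stand-ins for VLM Run's real payload:

```javascript
// Decision logic the AI Agent prompt encodes, written as a plain function.
function routeDocument(vlmPayload, masterReference) {
  const docType = vlmPayload.document_type; // e.g. "Receipt" (field name assumed)
  const entry = masterReference.find((r) => r.Document_Name === docType);

  if (!entry || !entry.Spreadsheet_ID) {
    // Unknown type: register it in the Master Reference for later mapping.
    return { action: "addToMasterReference", Document_Name: docType };
  }
  // Known type: append the extracted fields to its sheet. If the sheet has
  // no headers yet, the agent creates them from the JSON keys first.
  return {
    action: "appendRow",
    spreadsheetId: entry.Spreadsheet_ID,
    row: vlmPayload.fields, // extracted key/value pairs (field name assumed)
  };
}
```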
by Arlin Perez
## 🔍 Description
Effortlessly delete unused or inactive workflows from your n8n instance while automatically backing them up as .json files into your Google Drive. Keep your instance clean, fast, and organized — no more clutter slowing you down.

This workflow is ideal for users managing large self-hosted n8n setups, or anyone who wants to maintain optimal performance while preserving full workflow backups.

## ✅ What it does
- Accepts a full n8n workflow URL via a form
- Retrieves workflow info automatically
- Converts the workflow's full JSON definition into a file
- Uploads that file to Google Drive
- Deletes the workflow safely using the official n8n API
- Sends a Telegram notification confirming backup and deletion

## ⚙️ How it works
1. 📝 **Form** – Collects the full workflow URL from the user
2. 🔍 **n8n Node (Get Workflow)** – Uses the URL to fetch workflow details
3. 📦 **Code Node ("JSON to File")** – Converts the workflow JSON into a properly formatted .json file with UTF-8 encoding, ready to be uploaded to Google Drive (a hedged sketch follows this listing)
4. ☁️ **Google Drive Upload** – Uploads the .json backup file to your selected Drive folder
5. 🗑️ **n8n Node (Delete Workflow)** – Deletes the workflow from your instance using its ID
6. 📬 **Telegram Notification** – Notifies you that the workflow was backed up and deleted, showing title, ID, and date

## 📋 Requirements
- Google Drive connected to your n8n account
- Telegram bot connected to n8n
- An n8n instance with API access (self-hosted or Cloud)
- Your n8n API key (create one in the settings)

## 🛠️ How to Set Up
- ✅ Add your Google Drive credentials
- ✅ Add your Telegram bot credentials
- 🧾 The "JSON to File" Code node requires no additional setup; it automatically converts the workflow JSON into a downloadable .json file using the correct encoding and filename format
- ☁️ In the Google Drive node, set Binary Property to `data` and Folder ID to your target folder in Google Drive
- 🔑 Create a new credential for the n8n node using your personal n8n API key, with the Base URL set to your full n8n instance API path (e.g. https://your-n8n-instance.com/api/v1)
- ⚙️ Use this credential in both the Get Workflow and Delete Workflow n8n nodes
- 📬 In the Telegram node, use this message template: 🗑️ Workflow "{{ $json.name }}" (ID: {{ $json.id }}) was backed up to Google Drive and deleted from n8n. 📅 {{ $now }}

## 🔒 Important
This workflow backs up the entire workflow data to Google Drive. Be careful with the permissions of your Google Drive folder and avoid sharing it publicly, as the backups may contain sensitive information. Proper security and access control are essential to protect your data.

🚀 Activate the workflow and you're ready to safely back up and remove workflows from your n8n instance.
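A minimal sketch of what the "JSON to File" Code node does, assuming the previous node returns the full workflow object. The filename scheme is illustrative; the template's actual node may differ:

```javascript
// Convert the fetched workflow JSON into a UTF-8 .json binary for upload.
const workflow = $input.first().json;
const fileName = `${workflow.name}-${workflow.id}.json`; // assumed naming scheme

return [{
  json: { name: workflow.name, id: workflow.id },
  binary: {
    data: {
      // Base64-encode the pretty-printed workflow definition.
      data: Buffer.from(JSON.stringify(workflow, null, 2), "utf-8").toString("base64"),
      mimeType: "application/json",
      fileName,
    },
  },
}];
```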
by Rahul Joshi
## Description
Turn incoming Gmail messages into Zendesk tickets and keep a synchronized log in Google Sheets. Uses Gmail as the trigger, creates Zendesk tickets, and appends or updates a central sheet for tracking. Gain a clean, auditable pipeline from inbox to support queue.

## ✨ What This Template Does
- Fetches new emails via the Gmail Trigger. ✉️
- Normalizes the Gmail payload for consistent fields (see the sketch after this listing). 🧹
- Creates a Zendesk ticket from the email content. 🎫
- Formats data for Sheets and appends or updates a row. 📊
- Executes helper sub-workflows and writes logs for traceability. 🔁🧾

## Key Benefits
- Converts emails to actionable support tickets automatically. ⚡
- Maintains a single source of truth in Google Sheets. 📒
- Reduces manual triage and data entry. 🕒
- Improves accountability with structured logs. ✅

## Features
- Gmail Trigger for real-time intake. ⏱️
- Normalize Gmail Data for consistent fields. 🧩
- Create Zendesk Ticket (create: ticket). 🎟️
- Format Sheet Data for clean columns. 🧱
- Log to Google Sheets with appendOrUpdate. 🔄
- Execute workflow (sub-workflow) steps for modularity. 🧩

## Requirements
- n8n instance (cloud or self-hosted). 🛠️
- Gmail credentials configured in n8n (with read access to the monitored inbox). ✉️
- Zendesk credentials (API token or OAuth) with permission to create tickets. 🔐
- Google Sheets credentials with access to the target spreadsheet for append/update. 📊
- Access to any sub-workflows referenced by the Execute workflow nodes. 🔁

## Target Audience
- IT support and helpdesk teams managing email-based requests. 🖥️
- Ops teams needing auditable intake logs. 🧾
- Agencies and service providers converting client emails to tickets. 🤝
- Small teams standardizing email-to-ticket flows. 🧑‍💼

## Step-by-Step Setup Instructions
1. Connect Gmail, Zendesk, and Google Sheets in n8n Credentials. 🔑
2. Set the Gmail Trigger to watch the desired label/inbox. 📨
3. Map Zendesk fields (description) from the normalized Gmail data. 🧭
4. Point the Google Sheets node to your spreadsheet and confirm appendOrUpdate mode. 📄
5. Assign credentials to all nodes, including any Execute workflow steps. 🔁
6. Run once to test end-to-end; then activate the workflow. ✅
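A hypothetical sketch of the "Normalize Gmail Data" step, showing the kind of consistent fields the Zendesk and Sheets nodes can map from. The exact input and output field names depend on the template's own Set/Code node:

```javascript
// Flatten the raw Gmail Trigger item into a stable shape for downstream nodes.
const msg = $input.first().json;

return [{
  json: {
    messageId: msg.id,
    from: msg.from ?? msg.From ?? "",     // sender address (field name varies)
    subject: msg.subject ?? "(no subject)",
    body: msg.text ?? msg.snippet ?? "",  // plain-text body or preview
    receivedAt: new Date().toISOString(), // intake timestamp for the log
  },
}];
```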
by Meak
# Firecrawl Web Search Agent → Google Sheets Logger with OpenRouter + n8n

Most teams craft search operators by hand and copy results into spreadsheets. This workflow automates query generation, multi-operator searches, scraping, and logging — from a single webhook call.

## Benefits
- Auto-generate Firecrawl queries from natural language (OpenRouter Agent)
- Use pro operators: `site:`, `inurl:`, `intitle:`, exclusions, related
- Run parallel searches (site match, in-URL, exclusions, YouTube/intitle)
- Append titles/URLs/results to Google Sheets automatically
- Return results to the caller via webhook response
- Optional scraping of markdown + full-page screenshots

## How It Works
1. Webhook receives a natural-language search request
2. The OpenRouter-powered Agent converts it to a Firecrawl query (+ limit)
3. Firecrawl Search runs with scrapeOptions (markdown, screenshot)
4. Parallel queries: `site:`, `inurl:`, negative filters, YouTube `intitle:automation`
5. Results (title, url, data fields) are collected from each call
6. Rows are appended to Google Sheets (one per result)
7. The webhook responds with the aggregated payload
8. Ready to chain into alerts, enrichment, or CRM sync

## Who Is This For
- Researchers and content teams building source lists
- Growth/SEO teams needing precise operator queries
- Agencies automating discovery, monitoring, and logging

## Setup
1. Connect OpenRouter (select your LLM; e.g., GPT-4.1-mini)
2. Add your Firecrawl API key and endpoint (/v1/search); a hedged request sketch follows this listing
3. Connect Google Sheets (Document ID + Sheet/Tab)
4. Set the webhook path and allow POST from your app
5. Define the default limit (fallback = 5) and scrapeOptions

## ROI & Monetization
- Save 3–6 hours/week on manual searching & copy/paste
- Offer as a $500–$2k/month research automation for clients
- Upsell alerts (cron/webhook) and data enrichment for premium retainers

## Strategy Insights
In the full walkthrough, I show how to:
- Prompt the Agent to produce flawless `site:`/`inurl:`/`intitle:`/`-exclusion` queries
- Map Firecrawl data fields cleanly into Sheets
- Handle rate limits, empty results, and retries
- Extend with dedupe, domain filtering, and Slack/Telegram alerts

## Check Out My Channel
For more advanced AI automation systems that generate real business results, check out my YouTube channel where I share the exact strategies I use to build automation agencies, sell high-value services, and scale to $20k+ monthly revenue.
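An illustrative body for the Firecrawl `/v1/search` call, based on Firecrawl's public API. The operator query and the screenshot format string are assumptions; verify both against your Firecrawl plan and the node's configuration:

```javascript
// One of the parallel Firecrawl Search requests the workflow issues.
await fetch("https://api.firecrawl.dev/v1/search", {
  method: "POST",
  headers: {
    Authorization: "Bearer YOUR_FIRECRAWL_API_KEY",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    query: 'site:github.com intitle:automation -jobs', // built by the Agent
    limit: 5,                                          // fallback default
    scrapeOptions: {
      formats: ["markdown", "screenshot@fullPage"],    // markdown + full-page screenshot
    },
  }),
});
```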
by Rapiwa
# Automatically Send WhatsApp Discount Codes to Shopify Customers Using Rapiwa

## Who is this for?
This n8n workflow automatically sends WhatsApp promotional messages to top customers whenever a new discount code is created in Shopify. It's perfect for store owners, marketers, sales teams, or support agents who want to engage their best customers effortlessly. The workflow fetches customer data, filters high-spending customers, verifies their WhatsApp numbers using the Rapiwa API, sends discount messages to verified contacts, and logs all activity in Google Sheets. Designed for non-technical users who don't use the official WhatsApp Business API, this automation simplifies customer outreach and tracking without any manual work.

## What this Workflow Does
This n8n workflow connects to a Google Sheet that contains a list of contacts. It reads rows marked for processing, cleans the phone numbers, checks their validity using Rapiwa's WhatsApp validation API, sends WhatsApp messages to valid numbers, and updates the status of each row accordingly.

## Key Features
- **Runs Every 5 Minutes:** Automatically triggers the workflow
- **Google Sheets Integration:** Reads and writes data from a specific sheet
- **Phone Number Validation:** Confirms whether a WhatsApp number is active via the Rapiwa API
- **Message Sending:** Sends a message using Rapiwa's /send-message endpoint
- **Status Update:** Sheet is updated with success or failure status
- **Safe API Usage:** Delays added between requests to prevent rate limits
- **Batch Limit:** Processes at most 60 rows per cycle
- **Conditional Checks:** Skips rows without a "check" value

## Requirements
- A Google Sheet with the necessary columns
- **Rapiwa account** with an active subscription (includes 200 free messages)
- Your WhatsApp number connected to Rapiwa
- A valid Bearer token
- **n8n instance** (self-hosted or cloud)
- Google Sheets node configured
- HTTP Request node access

## How to Use: Step-by-Step Setup
1. **Webhook:** Receives a Shopify webhook (discount creation) via HTTP POST request. This is triggered when a discount is created in your Shopify store.
2. **Configure Google Sheets in n8n:** Use the Google Sheets node with OAuth2 access.
3. **Get a Rapiwa API Token:** Create an account on Rapiwa, connect your WhatsApp number, and copy your Bearer token from the Rapiwa dashboard.
4. **Set Up HTTP Request Nodes** (see the sketch at the end of this listing):
   - Validate numbers via: https://app.rapiwa.com/api/verify-whatsapp
   - Send messages via: https://app.rapiwa.com/api/send-message
   - Add your Bearer token to the headers

## Google Sheet Column Structure
A Google Sheet formatted like this ➤ Sample

| discount_code | created_at | shop_domain | name | number | verify | status |
| --- | --- | --- | --- | --- | --- | --- |
| V8ZGVRDFP5TB | 2025-09-25T05:26:40-04:00 | your_shop_domain | Abdul Mannan | 8801322827798 | unverified | not sent |
| V8ZGVRDFP5TB | 2025-09-25T05:26:40-04:00 | your_shop_domain | Abdul Mannan | 8801322827799 | verified | sent |

## Support & Help
- **Rapiwa Website:** https://rapiwa.com
- **WhatsApp:** Chat on WhatsApp
- **Discord:** SpaGreen Community
- **Facebook Group:** SpaGreen Support
- **Website:** https://spagreen.net
- **Developer Portfolio:** Codecanyon SpaGreen
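Hypothetical shapes of the two Rapiwa calls. The header and body field names are assumptions for illustration; confirm them against the Rapiwa dashboard docs and the template's HTTP Request nodes:

```javascript
const headers = {
  Authorization: "Bearer YOUR_RAPIWA_TOKEN",
  "Content-Type": "application/json",
};

// 1) Verify that the number has an active WhatsApp account.
await fetch("https://app.rapiwa.com/api/verify-whatsapp", {
  method: "POST",
  headers,
  body: JSON.stringify({ number: "8801322827798" }),
});

// 2) Send the discount message to a verified number.
await fetch("https://app.rapiwa.com/api/send-message", {
  method: "POST",
  headers,
  body: JSON.stringify({
    number: "8801322827798",
    message: "Use code V8ZGVRDFP5TB for a discount!",
  }),
});
```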
by Evoort Solutions
# 🚀 Website Traffic Monitoring with SEMrush API and Google Sheets Integration

Leverage the powerful SEMrush Website Traffic Checker API to automatically fetch detailed website traffic insights and log them into Google Sheets for real-time monitoring and reporting. This no-code n8n workflow simplifies traffic analysis for marketers, analysts, and website owners.

## ⚙️ Node-by-Node Workflow Breakdown

### 1. 🟢 On Form Submission
- **Trigger:** The workflow is initiated when a user submits a website URL via a form. This serves as the input for further processing.
- **Use case:** When you want to track multiple websites and monitor their performance over time.

### 2. 🌐 Website Traffic Checker
- **API Request:** The workflow makes a POST request to the **SEMrush Website Traffic Checker API** via RapidAPI using the submitted website URL (a hedged request sketch appears at the end of this guide).
- **API Data:** The API returns detailed traffic insights, including visits, bounce rate, page views, sessions, traffic sources, and more.

### 3. 🔄 Reformat
- **Parsing:** The raw API response is parsed to extract the relevant data under `trafficSummary`.
- **Data Structure:** The workflow creates a clean dataset of traffic data, making it easy to store in Google Sheets.

### 4. 📄 Google Sheets
- **Logging Data:** The traffic data is appended as a new row in your Google Sheet.
- **Google Sheet Setup:** The data is organized and updated in a structured format, allowing you to track website performance over time.

## 💡 Use Cases
- 📊 **SEO & Digital Marketing Agencies:** Automate client website audits by pulling live traffic data into reports.
- 🌐 **Website Owners & Bloggers:** Monitor traffic growth and analyze content performance automatically.
- 📈 **Data Analysts & Reporting Teams:** Feed traffic data into dashboards and integrate with other KPIs for deeper analysis.
- 🕵️ **Competitor Tracking:** Regularly log competitor site metrics for comparative benchmarking.

## 🎯 Key Benefits
- ✅ **Automated Traffic Monitoring:** Run reports automatically on demand or on a schedule.
- ✅ **Real-Time Google Sheets Logging:** Centralize and structure traffic data for easy sharing and visualization.
- ✅ **Zero Code Required:** Powered by n8n's visual builder; set up workflows quickly without writing a single line of code.
- ✅ **Scalable & Flexible:** Extend the workflow to include alerts, additional API integrations, or other automated tasks.

## 🔐 How to Get Your SEMrush API Key via RapidAPI
1. **Visit the API listing:** 👉 SEMrush Website Traffic Checker API
2. **Sign in or create an account:** Log in to RapidAPI or sign up for a free account.
3. **Subscribe to the API:** Choose the appropriate pricing plan and click Subscribe.
4. **Access your API key:** Go to the Endpoints tab. Your API key is located under the X-RapidAPI-Key header.
5. **Secure & use the key:** Add your API key to the request headers in your workflow. Never expose the key publicly.

## 🔧 Step-by-Step Setup Instructions

### 1. Create the Form to Capture the URL
- In n8n, create a new workflow and add a Webhook trigger node to capture website URLs.
- Configure the webhook to accept URL submissions from your form.
- Add a form to your website or app that triggers the webhook when a URL is submitted.

### 2. Configure the SEMrush API Request Node
- Add an HTTP Request node after the webhook.
- Set the method to POST and the URL to the SEMrush API endpoint.
- Add the necessary headers:
  - X-RapidAPI-Host: semrush-website-traffic-checker.p.rapidapi.com
  - X-RapidAPI-Key: [Your API Key]
- Pass the captured website URL from the webhook as a parameter in the request body.

### 3. Reformat the API Response
- Add a Set node to parse and structure the API response.
- Extract only the necessary data, such as:
  - trafficSummary.visits
  - trafficSummary.bounceRate
  - trafficSummary.pageViews
  - trafficSummary.sessions
- Format the response to be clean and suitable for Google Sheets.

### 4. Store Data in Google Sheets
- Add the Google Sheets node to your workflow.
- Authenticate with your Google account.
- Select the spreadsheet and worksheet where you want to store the traffic data.
- Configure the node to append new rows with the extracted traffic data.

Google Sheets column setup:
- **A:** Website URL
- **B:** Visits
- **C:** Bounce Rate
- **D:** Page Views
- **E:** Sessions
- **F:** Date/Time (optional timestamp)

### 5. Test and Deploy
- Run a test submission through your form to ensure the workflow works as expected.
- Check the Google Sheets document to verify that the data is being logged correctly.
- Set up scheduling or additional workflows as needed (e.g., periodic updates).

## 📈 Customizing the Template
You can modify the workflow to suit your specific needs:
- **Add more data points:** Customize the SEMrush API request to fetch additional metrics (e.g., traffic sources, keywords, etc.).
- **Create separate sheets:** If you're tracking multiple websites, create a different sheet for each website or group websites by category.
- **Add alerts:** Set up email or Slack notifications if specific traffic conditions (like sudden drops) are met.
- **Visualize data:** Integrate Google Sheets with Google Data Studio or other tools for more advanced visualizations.

## 🚀 Start Automating in Minutes
Build your automated website traffic dashboard with n8n today, no coding required.

👉 Start with n8n for Free

Save time, improve accuracy, and supercharge your traffic insights workflow!
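An illustrative configuration of the RapidAPI call described in step 2. The endpoint path and the body field name are assumptions; copy the exact values from the API's Endpoints tab on RapidAPI:

```javascript
// POST the submitted URL to the SEMrush Website Traffic Checker via RapidAPI.
await fetch("https://semrush-website-traffic-checker.p.rapidapi.com/traffic", {
  method: "POST",
  headers: {
    "X-RapidAPI-Host": "semrush-website-traffic-checker.p.rapidapi.com",
    "X-RapidAPI-Key": "YOUR_RAPIDAPI_KEY",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({ url: "https://example.com" }), // URL captured by the form
});
```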
by Daniel
Secure your n8n automations with this comprehensive template that automates periodic backups to Telegram for instant access while enabling flexible restores from Google Drive links or direct file uploads—ensuring quick recovery without data loss.

## 📋 What This Template Does
This dual-branch workflow handles full n8n instance backups and restores seamlessly. The backup arm runs every 3 days, fetching all workflows via the n8n API, aggregating them into a JSON array, converting the result to a text file, and sending it to Telegram for offsite storage and sharing. The restore arm supports two entry points: manual execution to pull a backup from Google Drive, or a form-based upload for local files. It then parses the JSON, cleans the workflows for compatibility (a hedged sketch follows this listing), and loops to create missing workflows or update existing ones by name, handling batches efficiently to respect API limits.

- Scheduled backups with Telegram delivery for easy stakeholder access
- Dual restore paths: Drive download or direct file upload via form
- Intelligent create-or-update logic with data sanitization to avoid conflicts
- Looped processing with existence checks and error continuation

## 🔧 Prerequisites
- n8n instance with API enabled (self-hosted or cloud)
- Telegram account for bot setup
- Google Drive account (optional, for Drive-based restores)

## 🔑 Required Credentials

### n8n API Setup
1. In n8n, navigate to Settings → n8n API
2. Enable the API and generate a new key
3. Add to n8n as the "n8n API" credential type, pasting the key in the API Key field

### Telegram API Setup
1. Message @BotFather on Telegram to create a new bot and get your token
2. Find your chat ID by messaging @userinfobot
3. Add to n8n as the "Telegram API" credential type, entering the Bot Token

### Google Drive OAuth2 API Setup
1. In Google Cloud Console, go to APIs & Services → Credentials
2. Create an OAuth 2.0 Client ID for a Web application and enable the Drive API
3. Add the redirect URI: [your-n8n-instance-url]/rest/oauth2-credential/callback
4. Add to n8n as the "Google Drive OAuth2 API" credential type and authorize

## ⚙️ Configuration Steps
1. Import the workflow JSON into your n8n instance
2. Assign the n8n API, Telegram API, and Google Drive credentials to their nodes
3. Update the Telegram chat ID in the "Send Backup to Telegram" node
4. Set the Google Drive file ID in the "Download Backup from Drive" node (from the file URL)
5. Activate the workflow and test the backup by executing the Schedule node manually
6. Test the restore: run the manual trigger for Drive, or use the form for upload

## 🎯 Use Cases
- Dev teams backing up staging workflows to Telegram for rapid production restores during deployments
- Solo automators uploading local backups via form to sync across devices after n8n migrations
- Agencies sharing client workflow archives via Drive links for secure, collaborative restores
- Educational setups scheduling exports to Telegram for student template distribution and recovery

## ⚠️ Troubleshooting
- **Backup file empty:** Verify the n8n API permissions include read access to workflows
- **Restore parse errors:** Check the JSON validity of the backup file; adjust the Code node property reference if needed
- **API rate limits hit:** Increase the Wait node duration or reduce the batch size in the Loop
- **Form upload fails:** Ensure the file is valid JSON text; test with a small backup first
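The "cleans workflows for compatibility" step typically strips read-only, instance-specific fields before re-creating workflows through the n8n API. A minimal sketch, assuming the API rejects extra properties such as `id`, `active`, and `tags` (adjust to whatever errors your n8n version reports):

```javascript
// Keep only the fields the n8n create/update endpoints accept.
const wf = $input.first().json;
const { name, nodes, connections, settings } = wf;

return [{
  json: {
    name,
    nodes,
    connections,
    settings: settings ?? {}, // the API expects a settings object, even if empty
  },
}];
```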
by Jay Emp0
# Ebook to Audiobook Converter

▶️ Watch Full Demo Video

## What It Does
Turn any PDF ebook into a professional audiobook automatically. Upload a PDF, get an MP3 audiobook in your Google Drive. Perfect for listening to books, research papers, or documents on the go.

Example: Input PDF → Output Audiobook

## Key Features
- Upload a PDF via web form → get an MP3 audiobook in Google Drive
- Natural-sounding AI voices (MiniMax Speech-02-HD)
- Automatic text extraction, chunking, and audio merging
- Customizable voice, speed, and emotion settings
- Processes long books in batches with smart rate limiting

## Perfect For
- **Students:** Turn textbooks into study audiobooks
- **Professionals:** Listen to reports and documents while commuting
- **Content Creators:** Repurpose written content as audio
- **Accessibility:** Make content accessible to visually impaired users

## Requirements
| Component | Details |
|-----------|---------|
| n8n | Self-hosted ONLY (cannot run on n8n Cloud) |
| FFmpeg | Must be installed in your n8n environment |
| Replicate API | For MiniMax TTS (Sign up here) |
| Google Drive | OAuth2 credentials + "Audiobook" folder |

⚠️ **Important:** This workflow does NOT work on n8n Cloud because FFmpeg installation is required.

## Quick Setup

### 1. Install FFmpeg
Docker users:

```
docker exec -it <n8n-container-name> /bin/bash
apt-get update && apt-get install -y ffmpeg
```

Native installation:

```
sudo apt-get install ffmpeg   # Linux
brew install ffmpeg           # macOS
```

### 2. Get API Keys
- **Replicate:** Sign up at replicate.com and copy your API token
- **Google Drive:** Set up OAuth2 in n8n and create an "Audiobook" folder in Drive

### 3. Import & Configure
1. Import n8n.json into your n8n instance
2. Replace the Replicate API token in the "MINIMAX TTS" node
3. Configure Google Drive credentials and select your "Audiobook" folder
4. Activate the workflow

## Cost Estimate
| Component | Cost |
|-----------|------|
| MiniMax TTS API | $0.15 per 1000 characters ($3-5 for an average book) |
| Google Drive Storage | Free (up to 15GB) |
| Processing Time | ~1-2 minutes per 10 pages |

## How It Works
PDF Upload → Extract Text → Split into Chunks → Convert to Speech (batches of 5) → Merge Audio Files (FFmpeg) → Upload to Google Drive

The workflow uses four main modules:
1. **Extraction:** PDF text extraction and intelligent chunking
2. **Conversion:** MiniMax TTS processes text in batches (see the sketch at the end of this listing)
3. **Merging:** FFmpeg combines all audio files seamlessly
4. **Upload:** Final audiobook saved to Google Drive

## Voice Settings (Customizable)

```json
{
  "voice_id": "Friendly_Person",
  "emotion": "happy",
  "speed": 1,
  "pitch": 0
}
```

Available emotions: happy, neutral, sad, angry, excited

## Limitations
- ⚠️ Self-hosted n8n ONLY (not compatible with n8n Cloud)
- PDF files only (not EPUB, MOBI, or scanned images)
- Large books (500+ pages) take longer to process
- Requires FFmpeg installation (see setup above)

## Troubleshooting
**FFmpeg not found?**
- Docker: Run `docker exec -it <container> /bin/bash`, then `apt-get install ffmpeg`
- Native: Run `sudo apt-get install ffmpeg` (Linux) or `brew install ffmpeg` (macOS)

**Rate limit errors?**
- Increase the wait time in the "WAITS FOR 5 SECONDS" node to 10-15 seconds

**Google Drive upload fails?**
- Make sure you created the "Audiobook" folder in your Google Drive
- Reconfigure OAuth2 credentials in n8n

Created by emp0 | More workflows: n8n Gallery
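A hedged sketch of the per-chunk TTS request the "MINIMAX TTS" node makes through Replicate's HTTP API. The model slug and input fields follow Replicate's public minimax/speech-02-hd listing; verify both against the imported workflow before relying on this:

```javascript
// Convert one text chunk to speech via Replicate's model predictions endpoint.
await fetch("https://api.replicate.com/v1/models/minimax/speech-02-hd/predictions", {
  method: "POST",
  headers: {
    Authorization: "Bearer YOUR_REPLICATE_API_TOKEN",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    input: {
      text: "One text chunk from the PDF...", // one chunk per request
      voice_id: "Friendly_Person",            // matches the Voice Settings above
      emotion: "happy",
      speed: 1,
      pitch: 0,
    },
  }),
});
```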