by Robert Breen
Create multi-sheet Excel workbooks in n8n to automate reporting using Google Drive + Google Sheets

Build an automated Excel file with multiple tabs directly in n8n. Two Code nodes generate datasets, each is converted into its own Excel worksheet, then combined into a single .xlsx and (optionally) appended to a Google Sheet for sharing, eliminating manual copy-paste and speeding up reporting.

Who's it for
- Teams that publish recurring reports as Excel workbooks with multiple tabs
- Ops/Marketing/Data folks who want a no-code/low-code way to package JSON into Excel
- n8n beginners learning the Code → Convert to File → Merge pattern

How it works
1. Manual Trigger starts the run.
2. Code nodes emit JSON rows for each table (e.g., People, Locations).
3. Convert to File nodes turn each JSON list into an Excel binary, assigning Sheet1/Sheet2 (or your own names).
4. Merge combines both binaries into a single Excel workbook with multiple tabs.
5. Google Sheets (optional) appends the JSON rows to a live spreadsheet for collaboration.

Setup (only 2 connections)

1️⃣ Connect Google Sheets (OAuth2)
- In n8n → Credentials → New → Google Sheets (OAuth2)
- Sign in with your Google account and grant access
- Copy the example sheet referenced in the Google Sheets node (open the node and duplicate the linked sheet), or select your own
- In the workflow's Google Sheets node, select your Spreadsheet and Worksheet

https://docs.google.com/spreadsheets/d/1G6FSm3VdMZt6VubM6g8j0mFw59iEw9npJE0upxj3Y6k/edit?gid=1978181834#gid=1978181834

2️⃣ Connect Google Drive (OAuth2)
- In n8n → Credentials → New → Google Drive (OAuth2)
- Sign in with the Google account that will store your Excel outputs and allow access
- In your Drive-related nodes (if used), point to the folder where you want the .xlsx saved or retrieved

Customize the workflow
- Replace the sample arrays in the Code nodes with your data (APIs, DBs, CSVs, etc.)
- Rename sheetName in each Convert to File node to match your desired tab names
- Keep the Merge node in Combine All mode to produce a single workbook
- In Google Sheets, switch to Manual mapping for strict column order (optional)

Best practices (per template guidelines)
- **Rename nodes** to clear, action-oriented names (e.g., "Build People Sheet", "Build Locations Sheet")
- Add a yellow Sticky Note at the top with this description so users see setup in-workflow
- **Do not hardcode credentials** inside HTTP nodes; always use n8n Credentials
- Remove personal IDs/links before publishing

Sticky Note (copy-paste)
> Multi-Tab Excel Builder (Google Drive + Google Sheets)
> This workflow generates two datasets (Code → JSON), converts each to an Excel sheet, merges them into a single workbook with multiple tabs, and optionally appends rows to Google Sheets.
>
> Setup (2 connections):
> 1) Google Sheets (OAuth2): Create credentials → duplicate/select your target spreadsheet → set Spreadsheet + Worksheet in the node.
> 2) Google Drive (OAuth2): Create credentials → choose the folder for storing/retrieving the .xlsx.
>
> Customize: Edit the Code nodes' arrays, rename tab names in Convert to File, and adjust the Sheets node mapping as needed.

Troubleshooting
- **Missing columns / wrong order:** Use **Manual mapping** in the Google Sheets node
- **Binary not found:** Ensure each **Convert to File** node's binaryPropertyName matches what **Merge** expects
- **Permissions errors:** Re-authorize Google credentials; confirm you have edit access to the target Sheet/Drive folder

📬 Contact
Need help customizing this (e.g., filtering by campaign, sending reports by email, or formatting your PDF)?
📧 rbreen@ynteractive.com
🔗 https://www.linkedin.com/in/robert-breen-29429625/
🌐 https://ynteractive.com
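The Code nodes in step 2 can be as simple as the sketch below. An n8n Code node must return an array of items, each wrapping its row data in a `json` property; the field names here are placeholders, so replace them with your own columns.

```javascript
// Sketch of the logic inside a "Build People Sheet" Code node.
// In n8n you would paste the body of buildPeopleItems directly into the node.
function buildPeopleItems() {
  const people = [
    { name: 'Ada Lovelace', email: 'ada@example.com', role: 'Analyst' },
    { name: 'Alan Turing', email: 'alan@example.com', role: 'Engineer' },
  ];
  // Each array element becomes one row in the generated worksheet;
  // n8n expects every item to wrap its data in a `json` property.
  return people.map((person) => ({ json: person }));
}
```

Each object key becomes a column header when the Convert to File node builds the worksheet, so keep the keys consistent across rows.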
by Mutasem
Use case
This workflow snoozes Todoist tasks by moving them into a Snoozed Todoist project, and unsnoozes them 3 days before their due date. It helps keep your inbox limited to the tasks you need to worry about soon.

How to set up
1. Add your Todoist credentials.
2. Create a Todoist project called Snoozed.
3. Set the project IDs in the relevant nodes.
4. Add due dates to your tasks in Inbox. Watch them disappear to Snoozed. Set their date to tomorrow, and watch them return to Inbox.

How to adjust this template
Adjust the timeline. Maybe 3 days is too close for you. Works mostly for me :)
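A rough sketch of the date check behind the unsnooze step. The 3-day threshold is the one mentioned above; the function name and the ISO-date input are illustrative, not the workflow's exact expression.

```javascript
// Returns true when a task is due soon enough to move back to Inbox.
// `dueDateIso` is an ISO date string, like the `due.date` field Todoist returns.
function shouldUnsnooze(dueDateIso, now = new Date(), thresholdDays = 3) {
  const msPerDay = 24 * 60 * 60 * 1000;
  const daysUntilDue = (new Date(dueDateIso) - now) / msPerDay;
  // Tasks already overdue also count as "due soon".
  return daysUntilDue <= thresholdDays;
}
```

Changing `thresholdDays` is the single knob the "adjust the timeline" note refers to.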
by Deborah
Want to learn the basics of n8n? Our comprehensive quickstart tutorial is here to guide you through n8n, step by step. Designed with beginners in mind, this tutorial provides a hands-on approach to learning n8n's basic functionalities.
by Rajeet Nair
Automatically converts CSV/XLSX files into a fully validated database schema using AI, generating SQL scripts, ERD diagrams, a data dictionary, and load plans to accelerate database design and data onboarding.

Explanation
This workflow automates the end-to-end process of transforming raw CSV or Excel data into a production-ready relational database schema. It begins by accepting file uploads through a webhook, detecting the file type, and extracting structured data. The workflow performs data cleaning and deep profiling to analyze column types, uniqueness, null values, and patterns. A column analysis engine identifies candidate primary keys and potential relationships. An AI agent then generates a normalized schema by organizing data into tables, assigning appropriate SQL data types, and defining primary and foreign keys. The schema is validated using rule-based checks to ensure data integrity, correct relationships, and proper normalization. If validation fails, the workflow automatically refines the schema through a revision loop. Once validated, it generates SQL DDL scripts, ERD diagrams, a data dictionary, and a load plan that determines the correct order for inserting data. Finally, all outputs are combined and returned via webhook as a structured response, making the workflow ideal for rapid database creation, data migration, and AI-assisted data modeling.

Overview
This workflow automatically converts CSV or Excel files into a production-ready relational database schema using AI and rule-based validation. It analyzes uploaded data to detect column types, relationships, and data quality, then generates a normalized schema with proper keys and constraints. The output includes SQL DDL scripts, ERD diagrams, a data dictionary, and a load plan. This eliminates manual schema design and accelerates database setup from raw data.

How It Works
1. File Upload (Webhook): Accepts CSV or XLSX files and initializes workflow configuration such as thresholds and retry limits.
2. File Extraction: Detects the file format and extracts rows into structured JSON.
3. Data Cleaning & Profiling: Cleans data, removes duplicates, normalizes values, and computes column statistics such as null percentage and uniqueness.
4. Column Analysis Engine: Identifies candidate primary keys, analyzes cardinality, and suggests potential foreign key relationships.
5. AI Schema Generation: Uses an AI agent to design normalized tables, assign SQL data types, and define primary keys, foreign keys, and constraints.
6. Validation Layer: Validates schema integrity by checking data types, primary key uniqueness, foreign key overlap, and constraint consistency.
7. Revision Loop: If validation fails, the workflow sends feedback to the AI agent and regenerates the schema until it meets requirements.
8. Schema Output Generation: Generates SQL DDL scripts, ERD diagrams, a data dictionary, and a load plan.
9. Load Plan Engine: Determines the correct order for inserting data and detects circular dependencies.
10. Combine & Explain: Merges all outputs and optionally provides AI-generated explanations of schema decisions.
11. Response Output: Returns all generated artifacts as a structured JSON response via webhook.

Setup Instructions
1. Activate the workflow and copy the webhook URL
2. Send a POST request with a CSV or XLSX file
3. Configure OpenAI credentials for the AI agent
4. Adjust thresholds if needed (FK overlap, retries, confidence)
5. Execute the workflow and review outputs

Use Cases
- Automatically generate database schemas from CSV/Excel files
- Accelerate data migration and onboarding pipelines
- Rapidly prototype relational database designs
- Reverse engineer structured schemas from raw datasets
- AI-assisted data modeling and normalization

Requirements
- n8n (latest version recommended)
- OpenAI API credentials
- LangChain nodes enabled
- CSV or XLSX input file
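As an illustration of the Data Cleaning & Profiling step described above, a minimal null-percentage and uniqueness pass might look like this. The output fields and the primary-key heuristic are assumptions for illustration, not the workflow's exact code.

```javascript
// Compute per-column null percentage and uniqueness, and flag PK candidates.
// Empty strings are treated as nulls, which is a common profiling convention.
function profileColumns(rows) {
  const columns = Object.keys(rows[0] || {});
  return columns.map((col) => {
    const values = rows.map((r) => r[col]);
    const nonNull = values.filter((v) => v !== null && v !== undefined && v !== '');
    const unique = new Set(nonNull).size;
    return {
      column: col,
      nullPct: 1 - nonNull.length / values.length,
      uniqueness: nonNull.length ? unique / nonNull.length : 0,
      // A column with no nulls and all-unique values is a PK candidate.
      pkCandidate: nonNull.length === values.length && unique === values.length,
    };
  });
}
```

Statistics like these are what the AI agent consumes when choosing keys and data types in the later steps.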
by Harshil Agrawal
This workflow demonstrates how to use the Netlify Trigger node to capture form submissions and add them to Airtable. You can reuse the workflow to add the data to another similar database by replacing the Airtable node with the corresponding node.

Netlify Trigger node: This node triggers the workflow when a new form is submitted. Select your site from the Site Name/ID dropdown list and the form from the Form ID dropdown list.

Set node: This node extracts the required data from the Netlify Trigger node. In this example, we only want to add the Name, Email, and Role of the user.

Airtable node: This node appends the data to Airtable. If you want to add the data to Google Sheets or a database instead, replace this node with the corresponding node.
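A rough code equivalent of what the Set node does here: pick only Name, Email, and Role out of the submission. The payload shape below is an assumption for illustration; check the actual output of your Netlify Trigger node.

```javascript
// Keep only the fields the Airtable node should receive.
// `submission.data` is assumed to hold the form's field values.
function extractFields(submission) {
  const { name, email, role } = submission.data || {};
  return { Name: name, Email: email, Role: role };
}
```

Renaming the keys (e.g., `name` to `Name`) at this stage keeps the Airtable column mapping straightforward.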
by Peter
Read a value by key from a local JSON file. Related workflow: WriteKey

1. Create a subfolder in your n8n homedir: /home/node/.n8n/local-files. In Docker, look at the data path and create a subfolder local-files. Set the correct ownership: chown 1000.1000 local-files.
2. Put the workflow code in a new workflow named GetKey.
3. Create another workflow with a Function Item node:

    return {
      file: '/4711.json', // 4711 should be your workflow id
      key: 'MyKey',
      default: 'Optional returned value if key is empty / not exists'
    };

4. Pipe the Function Item node to an Execute Workflow node that calls the GetKey workflow.

It would be nice if we could someday get a shiny built-in n8n node that does the job. :)
by Evoort Solutions
TikTok Transcript Generator

Overview
This automated workflow extracts transcripts from TikTok videos by reading video URLs from a Google Sheet, calling the TikTok Transcript Generator API, cleaning the subtitle data, and updating the sheet with transcripts. It efficiently handles batches, errors, and rate limits to provide a seamless transcription process.

Key Features
- **Batch processing:** Reads and processes multiple TikTok video URLs from Google Sheets.
- **Automatic transcript generation:** Uses the TikTok Transcript Generator API on RapidAPI.
- **Clean subtitle output:** Removes timestamps and headers for clear transcripts.
- **Error handling:** Marks videos with no available transcript.
- **Rate limiting:** Implements wait times to avoid API throttling on RapidAPI.
- **Seamless Google Sheets integration:** Updates the same sheet with transcript results and statuses.

API Used
- TikTok Transcript Generator API

Google Sheet Columns

| Column Name | Description |
|----------------|-----------------------------------------|
| Video Url | URL of the TikTok video to transcribe |
| Transcript | Generated transcript text (updated by workflow) |
| Generated Date | Date when the transcript was generated (YYYY-MM-DD) |

Workflow Nodes Explanation

| Node Name | Type | Purpose |
|--------------------------|-----------------------|-------------------------------------------------------------------|
| When clicking 'Execute workflow' | Manual Trigger | Manually starts the entire transcription workflow. |
| Google Sheets2 | Google Sheets (Read) | Reads TikTok video URLs and transcript data from Google Sheets. |
| Loop Over Items | Split In Batches | Processes rows in smaller batches to control execution speed. |
| If | Conditional Check | Filters videos needing transcription (URL present, transcript empty). |
| HTTP Request | HTTP Request | Calls the TikTok Transcript Generator API on RapidAPI to fetch transcripts. |
| If1 | Conditional Check | Checks for valid API responses (handles 404 errors). |
| Code | Code (JavaScript) | Cleans and formats raw subtitle text by removing timestamps. |
| Google Sheets | Google Sheets (Update) | Updates the sheet with cleaned transcripts and generation dates. |
| Google Sheets1 | Google Sheets (Update) | Updates the sheet with a "No transcription available" message on error. |
| Wait | Wait | Adds delay between batches to avoid API rate limits on RapidAPI. |

Challenges Resolved
- **Manual transcription effort:** Eliminates the need to manually transcribe TikTok videos, saving time and reducing errors.
- **API rate limits:** Introduces batching and wait periods to avoid exceeding API usage limits on RapidAPI, ensuring smooth execution.
- **Incomplete or missing data:** Filters out videos already transcribed and handles missing transcripts gracefully by logging appropriate messages.
- **Data formatting issues:** Cleans raw subtitle data to provide readable, timestamp-free transcripts.
- **Data synchronization:** Updates transcripts back into the same Google Sheet row, maintaining data consistency and ease of access.

Use Cases
- Content creators wanting to transcribe TikTok videos automatically.
- Social media analysts extracting text data for research.
- Automation enthusiasts integrating transcript generation into workflows.

How to Use
1. Prepare a Google Sheet with the columns: Video Url, Transcript, and Generated Date.
2. Connect your Google Sheets account in the workflow.
3. Enter your RapidAPI key for the TikTok Transcript Generator API.
4. Execute the workflow to generate transcripts.
5. View transcripts and generated dates directly in your Google Sheet.

Try this workflow to automate your TikTok video transcriptions efficiently! Create your free n8n account and set up the workflow in just a few minutes using the link below:
👉 Start Automating with n8n

Save time, stay consistent, and automate your TikTok transcription work effortlessly!
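The subtitle-cleaning Code node described above might look roughly like this. The WEBVTT-style input format is an assumption; adapt the filter to the API's actual output.

```javascript
// Strip the WEBVTT header and cue-timing lines from raw subtitle text,
// keeping only the spoken lines joined into one readable transcript.
function cleanTranscript(raw) {
  return raw
    .split('\n')
    .filter((line) =>
      line.trim() !== '' &&
      line.trim() !== 'WEBVTT' &&
      // Drop cue timing lines like "00:00:01.000 --> 00:00:03.500".
      !/\d{2}:\d{2}[:.]\d{2}[.,]\d{3}\s*-->/.test(line)
    )
    .join(' ')
    .trim();
}
```

Whatever subtitle format the API returns, the idea is the same: discard structural lines, keep the text.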
by Jitesh Dugar
Newsletter Sign-up with Email Verification & Welcome Email Automation

📋 Description
A complete, production-ready newsletter automation workflow that validates email addresses, sends personalized welcome emails, and maintains comprehensive logs in Google Sheets. Perfect for marketing teams, content creators, and businesses looking to build high-quality email lists with minimal manual effort.

✨ Key Features

Email Verification
- **Real-time validation** using the Verifi Email API
- Checks email format (RFC compliance)
- Verifies domain existence and MX records
- Detects disposable/temporary email addresses
- Identifies potential spoofed emails

Automated Welcome Emails
- **Personalized HTML emails** with the subscriber's first name
- Beautiful, mobile-responsive design with gradient headers
- Branded confirmation and unsubscribe links
- Sent via Gmail (or SMTP) automatically to valid subscribers

Smart Data Handling
- **Comprehensive logging** to Google Sheets with three separate tabs
- Handles incomplete submissions gracefully
- Preserves original user data throughout the verification process
- Tracks source attribution for multi-channel campaigns

Error Management
- Automatic retry logic on API failures
- Separate logging for different error types
- Detailed technical reasons for invalid emails
- No data loss with direct webhook referencing

🎯 Use Cases
- **Newsletter sign-ups** on websites and landing pages
- **Lead generation** forms with quality control
- **Marketing campaigns** requiring verified email lists
- **Community building** with automated onboarding
- **SaaS product launches** with email collection
- **Content creator** audience building
- **E-commerce** customer list management

📊 What Gets Logged

Master Log (All Subscribers)
- Timestamp, name, email, verification result
- Verification score and email sent status
- Source tracking, disposable status, domain info

Invalid Emails Log
- Detailed rejection reasons
- Technical diagnostic information
- MX record status, RFC compliance
- Provider information for troubleshooting

Invalid Submissions Log
- Incomplete form data
- Missing required fields
- Timestamp for follow-up

🔧 Technical Stack
- Trigger: Webhook (POST endpoint)
- Email Verification: Verifi Email API
- Email Sending: Gmail OAuth2 (or SMTP)
- Data Storage: Google Sheets (3 tabs)
- Processing: JavaScript code nodes for data formatting

🚀 Setup Requirements
1. Google Account - for Sheets and Gmail integration
2. Verifi Email API Key - (https://verifi.email)
3. Google Sheets - pre-configured with 3 tabs (template provided)
4. 5-10 minutes - quick setup with step-by-step instructions included

📈 Benefits
✅ Improve Email Deliverability - Remove invalid emails before sending campaigns
✅ Reduce Bounce Rates - Only send to verified, active email addresses
✅ Save Money - Don't waste email credits on invalid addresses
✅ Better Analytics - Track conversion rates by source
✅ Professional Onboarding - Personalized welcome experience
✅ Scalable Solution - Handles high-volume sign-ups automatically
✅ Data Quality - Build a clean, high-quality subscriber list

🎨 Customization Options
- **Email Template** - fully customizable HTML design
- **Verification Threshold** - adjust score requirements
- **Brand Colors** - match your company branding
- **Confirmation Flow** - add double opt-in if desired
- **Multiple Sources** - track different signup forms
- **Language** - easily translate email content

📦 What's Included
✅ Complete n8n workflow JSON (ready to import)
✅ Google Sheets template structure
✅ Responsive HTML email template
✅ Setup documentation with screenshots
✅ Troubleshooting guide
✅ Customization examples

🔒 Privacy & Compliance
- GDPR-compliant with unsubscribe links
- Secure data handling via OAuth2
- No data shared with third parties
- Audit trail in Google Sheets
- Easy data deletion/export

💡 Quick Stats
- **12 Nodes** - fully automated workflow
- **3 Data Paths** - valid, invalid, and incomplete submissions
- **100% Uptime** - when properly configured
- **Instant Processing** - real-time email verification
- **Unlimited Scale** - based on your API limits

🏆 Perfect For
- Marketing Agencies
- SaaS Companies
- Content Creators
- E-commerce Stores
- Community Platforms
- Educational Institutions
- Membership Sites
- Newsletter Publishers

🌟 Why Use This Workflow?
Instead of manually verifying emails or dealing with bounce complaints, this workflow automates the entire process from sign-up to welcome email. Save hours of manual work, improve your email deliverability, and create a professional first impression with every new subscriber.

Start building a high-quality email list today!
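The valid / invalid / incomplete routing described above can be sketched as the following check. The field names and the syntactic pre-check are assumptions; full RFC and MX validation stays with the Verifi Email API.

```javascript
// Classify a webhook submission before calling the verification API:
// "incomplete" -> missing required fields, "invalid" -> malformed email,
// "verify" -> ready for the real verification step.
function classifySubmission(body) {
  const name = (body.name || '').trim();
  const email = (body.email || '').trim();
  if (!name || !email) return 'incomplete';
  // Cheap syntactic pre-check; it just avoids wasting an API call.
  const looksLikeEmail = /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email);
  return looksLikeEmail ? 'verify' : 'invalid';
}
```

Each of the three return values maps onto one of the workflow's three data paths and its corresponding Google Sheets tab.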
by WeblineIndia
APK Security Scanner & PDF Report Generator

This workflow automatically analyzes any newly uploaded APK file and produces a clean, professional PDF security report. When an APK appears in Google Drive, the workflow downloads it, sends it to MobSF for security scanning, summarizes the results, generates an HTML report using AI, converts it into a PDF via PDF.co, and finally saves the PDF back to Google Drive.

Quick Start: Fastest Way to Use This Workflow
1. Set up a Google Drive folder for uploading APKs.
2. Install MobSF using Docker and copy your API key.
3. Add credentials for Google Drive, MobSF, OpenAI, and PDF.co in n8n.
4. Import the workflow JSON.
5. Update node credentials.
6. Upload an APK to the watched folder and let the automation run.

What It Does
This workflow provides a complete automated pipeline for analyzing Android APK files. It removes the manual process of scanning apps, extracting security insights, formatting reports, and distributing results. Each step is designed to streamline application security checks for development teams, QA engineers, and product managers.

Once the workflow detects a new APK in Google Drive, it passes the file to MobSF for a detailed static analysis. The workflow extracts the results, transforms them into a clear and well-structured HTML report using AI, and then converts the report into a PDF. This ensures the end-user receives a polished, audit-ready security document with zero manual involvement.

Who's It For
This workflow is ideal for:
- Mobile development teams performing security checks on apps.
- QA and testing teams validating APK builds before release.
- DevSecOps engineers needing automated, repeatable security audits.
- Software companies generating compliance and audit documentation.
- Agencies reviewing client apps for vulnerabilities.

Requirements to Use This Workflow
- An n8n instance (self-hosted or cloud)
- A Google Drive account with a folder for APK uploads
- Docker installed to run MobSF locally
- MobSF API key
- OpenAI API key
- PDF.co API key
- Basic understanding of n8n nodes and credentials setup

How It Works & Setup Instructions

Step 1: Prepare Google Drive
Create a folder specifically for APK uploads. Configure the Watch APK Uploads (Google Drive) node to monitor this folder for new files.

Step 2: Install and Run MobSF Using Docker
Install Docker and run:

    docker run -it --rm -p 8000:8000 \
      -v $(pwd)/mobsf:/home/mobsf/.MobSF \
      opensecurity/mobile-security-framework-mobsf

Open MobSF at http://localhost:8000 and copy your API key.

Step 3: Add Credentials in n8n
Add credentials for:
- Google Drive
- MobSF (API key in headers)
- OpenAI
- PDF.co

Step 4: Configure Malware Scanning
- **Upload APK to Analyzer (MobSF Upload API)** sends the file.
- **Start Security Scan (MobSF Scan API)** triggers the vulnerability scan.

Step 5: Summarize & Generate HTML Report
- **Summarize MobSF Report (JS Code)** extracts key vulnerabilities.
- **Generate HTML Report (GPT Model)** formats them in a structured report.
- **Clean HTML Output (JS Code)** removes escaped characters.

Step 6: Convert HTML to PDF
Use Generate PDF (PDF.co API) to convert the HTML to PDF.

Step 7: Save Final Report
Download using Download Generated PDF, then upload via Upload PDF to Google Drive.

How To Customize Nodes
- **Google Drive Trigger:** Change the folder ID to watch a different upload directory.
- **MobSF API Nodes:** Update URLs if MobSF runs on another port or server.
- **AI Report Generator:** Modify prompt instructions to change the writing style or report template.
- **PDF Generation:** Edit margins, page size, or output filename in the PDF.co node.
- **Save Location:** Change the Google Drive folder where the final PDF is stored.
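A minimal sketch of the summarizing step that feeds the AI report generator. The MobSF response shape varies by version, so treat the field names below (`app_name`, `findings`, `severity`, `title`) as assumptions for illustration.

```javascript
// Condense a MobSF scan result into the fields a report cares about:
// a severity histogram plus the titles of high-severity findings.
function summarizeReport(scan) {
  const findings = scan.findings || [];
  const bySeverity = {};
  for (const f of findings) {
    const sev = (f.severity || 'unknown').toLowerCase();
    bySeverity[sev] = (bySeverity[sev] || 0) + 1;
  }
  return {
    appName: scan.app_name || 'unknown',
    totalFindings: findings.length,
    bySeverity,
    // Surface high-severity titles so the generated report can lead with them.
    highSeverityTitles: findings
      .filter((f) => (f.severity || '').toLowerCase() === 'high')
      .map((f) => f.title),
  };
}
```

Passing a compact summary like this to the GPT node, rather than the full scan JSON, keeps the prompt small and the report focused.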
Add-Ons
You can extend this workflow with:
- **Slack or Email Notifications** when a report is ready
- **Automatic naming conventions** (e.g., report-{{date}}-{{app_name}}.pdf)
- **Saving reports into Airtable or Notion**
- **Multi-file batch scanning**
- **VirusTotal scan integration** before generating the PDF

Use Case Examples
- Automated security scanning for every new build generated by CI/CD.
- Pre-release vulnerability checks for client-delivered APKs.
- Compliance documentation generation for internal security audits.
- Bulk scanning of legacy APKs for modernization projects.
- Creating professional PDF security reports for customers.

(Many more use cases can be built using the same workflow foundation.)

Troubleshooting Guide

| Issue | Possible Cause | Solution |
| ----------------------- | -------------------------- | ---------------------------------------------------------- |
| MobSF API call fails | Wrong API key or URL | Check MobSF is running and API key is correct. |
| PDF not generated | Invalid HTML or PDF.co key | Validate HTML output and verify PDF.co credentials. |
| Workflow not triggering | Wrong Google Drive folder | Reconfigure Drive Trigger node with the correct folder ID. |
| APK upload fails | File not in binary mode | Ensure HTTP Upload node is using "Binary Data" correctly. |
| Scan returns empty data | MobSF not fully started | Wait for full MobSF startup logs before scanning. |

Need Help?
If you need assistance setting up this workflow, customizing it, or adding advanced features such as Slack alerts, CI/CD integration, or bulk scanning, our n8n workflow development team at WeblineIndia can help. We specialize in building secure, scalable, automation-driven workflows on n8n for businesses of all sizes. Contact us anytime for support or to build custom workflow automation solutions.
by Harshil Agrawal
This workflow allows you to trigger a build in Travis CI when code changes are pushed to a GitHub repo or a pull request gets opened.

GitHub Trigger node: This node triggers the workflow when changes are pushed or when a pull request is created, updated, or deleted.

IF node: This node checks the action type. We want to trigger a build when code changes are pushed or when a pull request is opened. We don't want to build the project when a PR is closed or updated.

TravisCI node: This node triggers the build in Travis CI. If you're using CircleCI in your pipeline, replace this node with the CircleCI node.

NoOp node: Adding this node is optional.
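The IF node's decision could be expressed as the following sketch. The field names mirror GitHub webhook payloads (push events carry a `ref`; pull_request events carry an `action`), but the exact node configuration may differ.

```javascript
// Decide whether an incoming GitHub event should trigger a Travis CI build:
// build on pushes and newly opened pull requests, skip closed/updated PRs.
function shouldBuild(event) {
  if (event.ref) return true; // push events include the pushed ref
  return event.action === 'opened'; // pull_request events include an action
}
```

Everything that returns false would flow to the NoOp branch instead of the TravisCI node.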
by Rajeet Nair
Overview
This workflow automatically converts CSV or Excel files into a production-ready database schema using AI and rule-based validation. It analyzes uploaded data, detects column types, relationships, and data quality, then generates a normalized schema. The output includes SQL DDL scripts, ERD diagrams, a data dictionary, and a load plan. This eliminates manual schema design and accelerates database setup from raw data.

How It Works

File Upload (Webhook)
- Accepts CSV or XLSX files via webhook endpoint
- Initializes workflow configuration (thresholds, retry limits)

File Extraction
- Detects file format (CSV or Excel)
- Extracts rows into structured JSON
- Merges extracted datasets

Data Cleaning & Profiling
- Removes duplicates and normalizes values
- Detects data types (integer, float, date, boolean, string)
- Computes column statistics (nulls, uniqueness, distributions)
- Generates file hash and sample dataset

Column Profiling Engine
- Identifies potential primary keys
- Detects cardinality and uniqueness levels
- Suggests foreign key relationships based on value overlap

AI Schema Generation
- Uses an AI agent to design normalized tables
- Assigns SQL data types based on real data
- Defines primary keys, foreign keys, constraints, and indexes

Validation Layer
- Ensures the schema matches the actual data
- Validates: data types, primary key uniqueness, foreign key overlap (>70%), constraint consistency
- Detects circular dependencies

Revision Loop
If validation fails:
- Sends feedback to the AI agent
- Regenerates the schema
- Retries up to the configured limit

Schema Output Generation
Generates:
- SQL DDL scripts
- ERD (Mermaid format)
- Data dictionary
- Load plan with dependency graph

Load Plan Engine
- Computes optimal table insertion order
- Detects circular dependencies
- Suggests a batching strategy

Combine & Explain
- Merges all outputs
- Optional AI explanation of schema decisions

Response Output
Returns structured JSON via webhook:
- SQL schema
- ERD summary
- Data dictionary
- Load plan
- Optional explanation

Setup Instructions
1. Activate the workflow and copy the webhook URL
2. Send a POST request with a CSV or XLSX file
3. Configure OpenAI credentials (used by the AI agent)
4. Adjust thresholds if needed (FK overlap, retries, confidence)
5. Execute the workflow and review the generated outputs

Use Cases
- Auto-generate database schemas from CSV/Excel files
- Data migration and onboarding pipelines
- Rapid database prototyping
- Reverse engineering datasets
- AI-assisted data modeling

Requirements
- n8n (latest version recommended)
- OpenAI API credentials
- LangChain nodes enabled
- CSV or XLSX input file
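The >70% foreign-key overlap check mentioned in the Validation Layer above can be sketched as follows; the function shape is illustrative, while the 70% threshold comes from the description.

```javascript
// A child column qualifies as a foreign-key candidate only when enough of
// its non-null values appear in the parent table's key column.
function fkOverlap(childValues, parentKeys, threshold = 0.7) {
  const parentSet = new Set(parentKeys);
  const nonNull = childValues.filter((v) => v !== null && v !== undefined);
  if (nonNull.length === 0) return { overlap: 0, isCandidate: false };
  const matched = nonNull.filter((v) => parentSet.has(v)).length;
  const overlap = matched / nonNull.length;
  return { overlap, isCandidate: overlap > threshold };
}
```

When a proposed FK fails this check, the revision loop feeds that fact back to the AI agent before regenerating the schema.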
by Miquel Colomer
Do you want to avoid bounces in your email marketing campaigns? This workflow verifies emails using the uProc.io email verifier.

You need to add your uProc credentials (the email and API key found in the Integration section of your uProc account) to n8n.

The "Create Email Item" node can be replaced by any other supported service that provides an email value, like Mailchimp, Calendly, MySQL, or Typeform.

The "uProc" node returns a status per checked email (deliverable, undeliverable, spamtrap, softbounce, ...). The "If" node checks whether the "deliverable" status exists. If it is not present, you can mark the email as invalid to discard bounces. If the "deliverable" status is present, you can use the email in your email marketing campaigns.

If you need detailed indicators for any email, you can use the tool "Communication" > "Check Email Exists (Extended)" to get advanced information.
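The If-node branch described above can be sketched as a simple partition. The status values are the ones listed in the description; the helper function itself is illustrative.

```javascript
// Split verifier results into usable and discarded addresses: only emails
// whose status is "deliverable" are kept; every other status (undeliverable,
// spamtrap, softbounce, ...) is treated as invalid to avoid bounces.
function partitionByDeliverability(results) {
  const valid = [];
  const invalid = [];
  for (const r of results) {
    (r.status === 'deliverable' ? valid : invalid).push(r.email);
  }
  return { valid, invalid };
}
```

The `valid` list is what would feed your email marketing campaign; the `invalid` list can be logged or dropped.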