by Stéphane Heckel
# Emailing Using Google Sheets, Google Docs, and SMTP

Automate personalized email campaigns using a Google Sheets contact list, a Google Docs template, and SMTP delivery.

## How It Works

- **Google Docs** is used as the email template with variables: `{{firstname}}`, `{{lastname}}`, `{{company}}`, `{{email}}`.
- **Google Sheets** contains your list of recipients (one per row).
- For each contact, the workflow merges personal data into the Google Docs template (sketched below).
- Email is sent to each recipient via SMTP (batch size: 1). Use the Wait node to respect provider quotas.
- After sending, the workflow updates the "process" column of the Google Sheet with the date/time.
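The merge step can be pictured as a small n8n Code node. This is a sketch, not the template's actual implementation: the upstream node name `Get Template` and its `content` field are assumptions, and the contact fields mirror the sheet columns listed below.

```js
// Sketch: substitute {{placeholders}} in the template text for each contact.
// 'Get Template' is a hypothetical node that fetched the Google Docs body.
const templateText = $('Get Template').first().json.content;

return $input.all().map(item => {
  const contact = item.json; // one sheet row: email, firstname, lastname, company
  let body = templateText;
  for (const key of ['firstname', 'lastname', 'company', 'email']) {
    body = body.replaceAll(`{{${key}}}`, contact[key] ?? '');
  }
  return { json: { to: contact.email, body } };
});
```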
## How to Use

1. **Copy templates**: Google Docs Template, Google Sheet Template. Find each document's ID (the text after `/d/` and before `/edit` in the URL).
2. **Configure workflow**: Enter your Google Docs and Google Sheets IDs in the settings node. Set your email subject in the appropriate parameter.
3. **Set up credentials**: Connect your Google account. Configure the SMTP node with your mail server details.
4. **Update data**: Edit the Google Docs template with your message and variables. Prepare your Google Sheet with these columns: `email`, `firstname`, `lastname`, `company`.
5. **Deploy and test**: Connect all nodes. Test with a small contact batch. Troubleshoot any node errors (indicated in red in n8n).

## Requirements

- **Google credentials & permissions**: for Sheets and Docs access.
- **SMTP server**: for email delivery (adjust the Wait node for rate limits).
- **n8n version**: tested on 1.105.2 (Ubuntu).

## Need Help?

Contact me on LinkedIn or ask in the n8n Community Forum!

by Ranjan Dailata
This workflow automates AI-powered search insights by combining SE Ranking AI Search data with OpenAI summarization. It starts with a manual trigger and fetches time-series AI visibility data via the SE Ranking API. The response is summarized using OpenAI to produce both detailed and concise insights. The workflow enriches the original metrics with these AI-generated summaries and exports the final structured JSON to disk, making it ready for reporting, analytics, or further automation.

## Who this is for

This workflow is designed for:

- **SEO professionals & growth marketers** tracking AI search visibility
- **Content strategists** analyzing how brands appear in AI-powered search results
- **Data & automation engineers** building SEO intelligence pipelines
- **Agencies** producing automated search performance reports for clients

## What problem is this workflow solving?

SE Ranking's AI Search API provides rich but highly technical time-series data. While powerful, this data:

- Is difficult to interpret quickly
- Requires manual analysis to extract insights
- Is not presentation-ready for reports or stakeholders

This workflow solves that by automatically transforming raw AI search metrics into clear, structured summaries, saving time and reducing analysis friction.

## What this workflow does

At a high level, the workflow:

1. Accepts input parameters such as target domain, AI engine, and region
2. Fetches AI search visibility time-series data from SE Ranking
3. Uses OpenAI GPT-4.1-mini to generate a comprehensive summary and a concise abstract summary
4. Enriches the original dataset with the AI-generated insights
5. Exports the final structured JSON to disk for reporting, dashboards, and further automation or analytics

## Setup

### Prerequisites

- **n8n (self-hosted or cloud)**
- **SE Ranking API access**
- **OpenAI API key**

### Setup Steps

1. If you are new to SE Ranking, sign up on seranking.com.
2. Import the workflow JSON into n8n.
3. Configure credentials:
   - SE Ranking, using HTTP Header Authentication. The header value must contain `Token`, followed by a space, followed by your SE Ranking API key (a sketch of the resulting request follows these steps).
   - OpenAI, for GPT-4.1-mini.
4. Open **Set the Input Fields** and update:
   - `target_site` (e.g., your domain)
   - `engine` (e.g., `ai-overview`)
   - `source` (e.g., `us`, `uk`, `in`)
5. Verify the file path in **Write File to Disk**.
6. Click **Execute Workflow**.
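To make the header format concrete, here is a hypothetical request sketch. The header name and endpoint path are assumptions (check SE Ranking's API documentation); only the `Token <API key>` value format comes from the setup notes above.

```js
// Hypothetical sketch of the authenticated call the HTTP Request node makes.
// Header name and URL are placeholders, not SE Ranking's documented values.
const res = await fetch('https://api.seranking.com/your-ai-search-endpoint', {
  headers: { Authorization: `Token ${process.env.SERANKING_API_KEY}` },
});
const data = await res.json();
```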
## How to customize this workflow to your needs

You can easily extend or tailor this workflow:

- **Change analysis scope**: update the domain, region, or AI engine
- **Modify AI outputs**: adjust prompts or the output schema for insights like trends, risks, or recommendations
- **Replace storage**: send output to Google Sheets, databases, S3 / cloud storage, or webhooks and BI tools
- **Automate monitoring**: add a Cron trigger to run daily, weekly, or monthly

## Summary

This workflow turns raw SE Ranking AI Search data into clear, executive-ready insights using OpenAI GPT-4.1-mini. By combining automated data collection with AI summarization, it enables faster decision-making, better reporting, and scalable SEO intelligence without manual analysis.

by Onur
# Yelp Business Scraper by URL via Scrape.do API with Google Sheets Storage

## Overview

This n8n workflow automates the process of scraping comprehensive business information from Yelp using individual business URLs. It integrates with Scrape.do for professional web scraping with anti-bot bypass capabilities and Google Sheets for centralized data storage, providing detailed business intelligence for market research, competitor analysis, and lead generation.

## Workflow Components

### 1. 📥 Form Trigger

| Property | Value |
|----------|-------|
| Type | Form Trigger |
| Purpose | Initiates the workflow with a user-submitted Yelp business URL |
| Input Fields | Yelp Business URL |
| Function | Captures the target business URL to start the scraping process |

### 2. 🔍 Create Scrape.do Job

| Property | Value |
|----------|-------|
| Type | HTTP Request (POST) |
| Purpose | Creates an async scraping job via the Scrape.do API |
| Endpoint | `https://q.scrape.do/api/v1/jobs` |
| Authentication | X-Token header |

Request parameters:

- **Targets**: Array containing the Yelp business URL
- **Super**: `true` (uses residential/mobile proxies for a better success rate)
- **GeoCode**: `us` (targets US-based content)
- **Device**: `desktop`
- **Render**: JavaScript rendering enabled with the `networkidle2` wait condition

Function: Initiates comprehensive business data extraction from Yelp with headless browser rendering to handle dynamic content.

### 3. 🔧 Parse Yelp HTML

| Property | Value |
|----------|-------|
| Type | Code Node (JavaScript) |
| Purpose | Extracts structured business data from raw HTML |
| Mode | Run once for each item |

Function: Parses the scraped HTML content using regex patterns and JSON-LD extraction to retrieve:

- Business name
- Overall rating
- Review count
- Phone number
- Full address
- Price range
- Categories
- Website URL
- Business hours
- Image URLs
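A minimal sketch of the kind of JSON-LD extraction this node performs. It is illustrative rather than the template's actual code: Yelp's markup and the schema.org field names used here are assumptions to verify against the live page.

```js
// Pull the first parseable JSON-LD block out of the scraped HTML and map
// a few schema.org fields. All field names are assumptions, not guaranteed.
const html = $json.html ?? ''; // raw HTML from the scrape result (assumed field)
const blocks = html.matchAll(/<script type="application\/ld\+json">([\s\S]*?)<\/script>/g);

let business = {};
for (const m of blocks) {
  try {
    const data = JSON.parse(m[1]);
    if (data.name) { business = data; break; }
  } catch (e) { /* skip malformed blocks */ }
}

return [{
  json: {
    name: business.name ?? null,
    overall_rating: business.aggregateRating?.ratingValue ?? null,
    reviews_count: business.aggregateRating?.reviewCount ?? null,
    phone: business.telephone ?? null,
    address: business.address?.streetAddress ?? null,
  },
}];
```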
### 4. 📊 Store to Google Sheet

| Property | Value |
|----------|-------|
| Type | Google Sheets Node |
| Purpose | Stores scraped business data for analysis and storage |
| Operation | Append rows |
| Target | "Yelp Scraper Data - Scrape.do" sheet |

Data mapping:

- Business Name, Overall Rating, Reviews Count
- Business URL, Phone, Address
- Price Range, Categories, Website
- Hours, Images/Videos URLs, Scraped Timestamp

## Workflow Flow

```
Form Input → Create Scrape.do Job → Parse Yelp HTML → Store to Google Sheet
    │                 │                    │                    │
    ▼                 ▼                    ▼                    ▼
User submits    API creates job      JavaScript code      Data appended
Yelp URL        with JS rendering    extracts fields      to spreadsheet
```

## Configuration Requirements

### API Keys & Credentials

| Credential | Purpose |
|------------|---------|
| Scrape.do API Token | Required for Yelp business scraping with anti-bot bypass |
| Google Sheets OAuth2 | For data storage and export access |
| n8n Form Webhook | For user input collection |

### Setup Parameters

| Parameter | Description |
|-----------|-------------|
| YOUR_SCRAPEDO_TOKEN | Your Scrape.do API token (appears in 3 places) |
| YOUR_GOOGLE_SHEET_ID | Target spreadsheet identifier |
| YOUR_GOOGLE_SHEETS_CREDENTIAL_ID | OAuth2 authentication reference |

## Key Features

### 🛡️ Anti-Bot Bypass Technology

- **Residential Proxy Rotation**: 110M+ proxies across 150 countries
- **WAF Bypass**: Handles Cloudflare, Akamai, DataDome, and PerimeterX
- **Dynamic TLS Fingerprinting**: Authentic browser signatures
- **CAPTCHA Handling**: Automatic bypass for uninterrupted scraping

### 🌐 JavaScript Rendering

- Full headless browser support for dynamic Yelp content
- `networkidle2` wait condition ensures complete page load
- Custom wait times for complex page elements
- Real device fingerprints for detection avoidance

### 📊 Comprehensive Data Extraction

| Field | Description | Example |
|-------|-------------|---------|
| name | Business name | "Joe's Pizza Restaurant" |
| overall_rating | Average customer rating | "4.5" |
| reviews_count | Total number of reviews | "247" |
| url | Original Yelp business URL | "https://www.yelp.com/biz/..." |
| phone | Business phone number | "(555) 123-4567" |
| address | Full street address | "123 Main St, New York, NY 10001" |
| price_range | Price indicator | "$$" |
| categories | Business categories | "Pizza, Italian, Delivery" |
| website | Business website URL | "https://joespizza.com" |
| hours | Operating hours | "Mon-Fri 11:00-22:00" |
| images_videos_urls | Media content links | "https://s3-media1.fl.yelpcdn.com/..." |
| scraped_at | Extraction timestamp | "2025-01-15T10:30:00Z" |

### 🗂️ Centralized Data Storage

- Automatic Google Sheets export
- Organized business data format with 12 data fields
- Historical scraping records with timestamps
- Easy sharing and collaboration

## Use Cases

### 📈 Market Research
- Competitor business analysis
- Local market intelligence gathering
- Industry benchmark establishment
- Service offering comparison

### 🎯 Lead Generation
- Business contact information extraction
- Potential client identification
- Market opportunity assessment
- Sales prospect development

### 📊 Business Intelligence
- Customer sentiment analysis through ratings
- Competitor performance monitoring
- Market positioning research
- Brand reputation tracking

### 📍 Location Analysis
- Geographic business distribution
- Local competition assessment
- Market saturation evaluation
- Expansion opportunity identification

## Technical Notes

| Specification | Value |
|--------------|-------|
| Processing Time | 15-45 seconds per business URL |
| Data Accuracy | 95%+ for publicly available business information |
| Success Rate | 99.98% (Scrape.do guarantee) |
| Proxy Pool | 110M+ residential, mobile, and datacenter IPs |
| JS Rendering | Full headless browser with networkidle2 wait |
| Data Format | JSON with structured field mapping |
| Storage Format | Structured Google Sheets with 12 predefined columns |

## Setup Instructions

### Step 1: Import Workflow

1. Copy the JSON workflow configuration
2. Import into n8n: Workflows → Import from JSON
3. Paste the configuration and save

### Step 2: Configure Scrape.do

1. Get your API token: sign up at Scrape.do, navigate to Dashboard → API Token, and copy your token
2. Update workflow references (3 places):
   - 🔍 Create Scrape.do Job node → Headers → X-Token
   - 📡 Check Job Status node → Headers → X-Token
   - 📥 Fetch Task Results node → Headers → X-Token
3. Replace YOUR_SCRAPEDO_TOKEN with your actual API token.
### Step 3: Configure Google Sheets

1. Create the target spreadsheet:
   - Create a new Google Sheet named "Yelp Business Data" or similar
   - Add a header row with columns: name | overall_rating | reviews_count | url | phone | address | price_range | categories | website | hours | images_videos_urls | scraped_at
   - Copy the Sheet ID from the URL (the long string between /d/ and /edit)
2. Set up OAuth2 credentials:
   - In n8n: Credentials → Add Credential → Google Sheets OAuth2
   - Complete the Google authentication process and grant access to Google Sheets
3. Update workflow references:
   - Replace YOUR_GOOGLE_SHEET_ID with your actual Sheet ID
   - Update YOUR_GOOGLE_SHEETS_CREDENTIAL_ID with the credential reference

### Step 4: Test and Activate

1. Test with a sample URL:
   - Use a known Yelp business URL (e.g., https://www.yelp.com/biz/example-business-city)
   - Submit it through the form trigger
   - Monitor execution progress in n8n and verify data appears in the Google Sheet
2. Activate the workflow:
   - Toggle the workflow to "Active"
   - Share the form URL with users

## Sample Business Data

The workflow captures comprehensive business information, including:

| Category | Data Points |
|----------|-------------|
| Basic Information | Name, category, location |
| Performance Metrics | Ratings, review counts, popularity |
| Contact Details | Phone, website, address |
| Visual Content | Photos, videos, gallery URLs |
| Operational Data | Hours, services, price range |

## Advanced Configuration

### Batch Processing

Modify the input to accept multiple URLs by updating the job creation body:

```json
{
  "Targets": [
    "https://www.yelp.com/biz/business-1",
    "https://www.yelp.com/biz/business-2",
    "https://www.yelp.com/biz/business-3"
  ],
  "Super": true,
  "GeoCode": "us",
  "Render": {
    "WaitUntil": "networkidle2",
    "CustomWait": 3000
  }
}
```

### Enhanced Rendering Options

For complex Yelp pages, add browser interactions:

```json
{
  "Render": {
    "BlockResources": false,
    "WaitUntil": "networkidle2",
    "CustomWait": 5000,
    "WaitSelector": ".biz-page-header",
    "PlayWithBrowser": [
      { "Action": "Scroll", "Direction": "down" },
      { "Action": "Wait", "Timeout": 2000 }
    ]
  }
}
```

### Notification Integration

Add alert mechanisms:

- Email notifications for completed scrapes
- Slack messages for team updates
- Webhook triggers for external systems

## Error Handling

### Common Issues

| Issue | Cause | Solution |
|-------|-------|----------|
| Invalid URL | URL is not a valid Yelp business page | Ensure URL format: https://www.yelp.com/biz/... |
| 401 Unauthorized | Invalid or missing API token | Verify X-Token header value |
| Job Timeout | Page too complex or slow | Increase CustomWait value |
| Empty Data | HTML parsing failed | Check page structure, update regex patterns |
| Rate Limiting | Too many concurrent requests | Reduce request frequency or upgrade plan |

### Troubleshooting Steps

1. Verify URLs: ensure Yelp business URLs are correctly formatted
2. Check credentials: validate the Scrape.do token and Google OAuth
3. Monitor logs: review n8n execution logs for detailed errors
4. Test connectivity: verify network access to all external services
5. Check job status: use the Scrape.do dashboard to monitor job progress

## Performance Specifications

| Metric | Value |
|--------|-------|
| Processing Time | 15-45 seconds per business URL |
| Data Accuracy | 95%+ for publicly available information |
| Success Rate | 99.98% (with Scrape.do anti-bot bypass) |
| Concurrent Processing | Depends on Scrape.do plan limits |
| Storage Capacity | Unlimited (Google Sheets based) |
| Proxy Pool | 110M+ IPs across 150 countries |

## Scrape.do API Reference

### Async API Endpoints

| Endpoint | Method | Purpose |
|----------|--------|---------|
| /api/v1/jobs | POST | Create new scraping job |
| /api/v1/jobs/{jobID} | GET | Check job status |
| /api/v1/jobs/{jobID}/{taskID} | GET | Retrieve task results |
| /api/v1/me | GET | Get account information |

### Job Status Values

| Status | Description |
|--------|-------------|
| queuing | Job is being prepared |
| queued | Job is in queue waiting to be processed |
| pending | Job is currently being processed |
| rotating | Job is retrying with different proxies |
| success | Job completed successfully |
| error | Job failed |
| canceled | Job was canceled by user |

For complete API documentation, visit: Scrape.do Documentation

## Support & Resources

- **Scrape.do Documentation**: https://scrape.do/documentation/
- **Scrape.do Dashboard**: https://dashboard.scrape.do/
- **n8n Documentation**: https://docs.n8n.io/
- **Google Sheets API**: https://developers.google.com/sheets/api

This workflow is powered by Scrape.do - Reliable, Scalable, Unstoppable Web Scraping
by Robert Breen
# Create multi-sheet Excel workbooks in n8n to automate reporting using Google Drive + Google Sheets

Build an automated Excel file with multiple tabs directly in n8n. Two Code nodes generate datasets, each is converted into its own Excel worksheet, then combined into a single .xlsx and (optionally) appended to a Google Sheet for sharing, eliminating manual copy-paste and speeding up reporting.

## Who's it for

- Teams that publish recurring reports as Excel with multiple tabs
- Ops/Marketing/Data folks who want a no-code/low-code way to package JSON into Excel
- n8n beginners learning the Code → Convert to File → Merge pattern

## How it works

1. **Manual Trigger** starts the run.
2. **Code** nodes emit JSON rows for each table (e.g., People, Locations).
3. **Convert to File** nodes turn each JSON list into an Excel binary, assigning Sheet1/Sheet2 (or your names).
4. **Merge** combines both binaries into a single Excel workbook with multiple tabs.
5. **Google Sheets** (optional) appends the JSON rows to a live spreadsheet for collaboration.

A sketch of the Code-node pattern from step 2 appears after the troubleshooting section below.

## Setup (only 2 connections)

### 1️⃣ Connect Google Sheets (OAuth2)

1. In n8n → Credentials → New → Google Sheets (OAuth2)
2. Sign in with your Google account and grant access
3. Copy the example sheet referenced in the Google Sheets node (open the node and duplicate the linked sheet), or select your own
4. In the workflow's Google Sheets node, select your Spreadsheet and Worksheet

https://docs.google.com/spreadsheets/d/1G6FSm3VdMZt6VubM6g8j0mFw59iEw9npJE0upxj3Y6k/edit?gid=1978181834#gid=1978181834

### 2️⃣ Connect Google Drive (OAuth2)

1. In n8n → Credentials → New → Google Drive (OAuth2)
2. Sign in with the Google account that will store your Excel outputs and allow access
3. In your Drive-related nodes (if used), point to the folder where you want the .xlsx saved or retrieved

## Customize the workflow

- Replace the sample arrays in the Code nodes with your data (APIs, DBs, CSVs, etc.)
- Rename sheetName in each Convert to File node to match your desired tab names
- Keep the Merge node in Combine All mode to produce a single workbook
- In Google Sheets, switch to Manual mapping for strict column order (optional)

## Best practices (per template guidelines)

- **Rename nodes** to clear, action-oriented names (e.g., "Build People Sheet", "Build Locations Sheet")
- Add a yellow Sticky Note at the top with this description so users see setup in-workflow
- **Do not hardcode credentials** inside HTTP nodes; always use n8n Credentials
- Remove personal IDs/links before publishing

## Sticky Note (copy-paste)

> Multi-Tab Excel Builder (Google Drive + Google Sheets)
> This workflow generates two datasets (Code → JSON), converts each to an Excel sheet, merges them into a single workbook with multiple tabs, and optionally appends rows to Google Sheets.
>
> Setup (2 connections):
> 1) Google Sheets (OAuth2): Create credentials → duplicate/select your target spreadsheet → set Spreadsheet + Worksheet in the node.
> 2) Google Drive (OAuth2): Create credentials → choose the folder for storing/retrieving the .xlsx.
>
> Customize: Edit the Code nodes' arrays, rename tab names in Convert to File, and adjust the Sheets node mapping as needed.

## Troubleshooting

- **Missing columns / wrong order:** use **Manual mapping** in the Google Sheets node
- **Binary not found:** ensure each **Convert to File** node's binaryPropertyName matches what **Merge** expects
- **Permissions errors:** re-authorize Google credentials; confirm you have edit access to the target Sheet/Drive folder
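As referenced in step 2 above, here is a sketch of a Code node that feeds one worksheet. The sample rows are placeholders; in practice you would return data pulled from your APIs, databases, or CSVs.

```js
// "Build People Sheet" Code node sketch: each returned item becomes one
// Excel row when the Convert to File node builds the worksheet.
const people = [
  { name: 'Ada Lovelace', role: 'Analyst', location: 'London' },
  { name: 'Alan Turing', role: 'Engineer', location: 'Manchester' },
];

return people.map(person => ({ json: person }));
```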
## 📬 Contact

Need help customizing this (e.g., filtering by campaign, sending reports by email, or formatting your PDF)?

📧 rbreen@ynteractive.com
🔗 https://www.linkedin.com/in/robert-breen-29429625/
🌐 https://ynteractive.com
by Mutasem
## Use case

This workflow snoozes Todoist tasks by moving them into a snoozed Todoist project, then unsnoozes them 3 days before their due date. It keeps your Inbox clear of everything except the tasks you need to worry about soon.

## How to set up

1. Add your Todoist credentials.
2. Create a Todoist project called snoozed.
3. Set the project IDs in the relevant nodes.
4. Add due dates to your tasks in Inbox, and watch them disappear into snoozed. Set a task's date to tomorrow, and watch it return to Inbox.

## How to adjust this template

Adjust the timeline. Maybe 3 days is too close for you. Works mostly for me :)
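The unsnooze rule boils down to a simple date check. A sketch of that logic, assuming Todoist's task shape with a `due.date` field:

```js
// A task leaves the snoozed project once its due date is within 3 days.
const DAYS_BEFORE_DUE = 3; // the threshold discussed above

function daysUntilDue(task) {
  return (new Date(task.due.date) - new Date()) / (1000 * 60 * 60 * 24);
}

function shouldUnsnooze(task) {
  return Boolean(task.due) && daysUntilDue(task) <= DAYS_BEFORE_DUE;
}
```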
by Deborah
Want to learn the basics of n8n? Our comprehensive quickstart tutorial guides you through the fundamentals, step by step. Designed with beginners in mind, it offers a hands-on introduction to n8n's core functionality.
by Rajeet Nair
Automatically converts CSV/XLSX files into a fully validated database schema using AI, generating SQL scripts, ERD diagrams, a data dictionary, and load plans to accelerate database design and data onboarding.

## Explanation

This workflow automates the end-to-end process of transforming raw CSV or Excel data into a production-ready relational database schema. It begins by accepting file uploads through a webhook, detecting the file type, and extracting structured data. The workflow performs data cleaning and deep profiling to analyze column types, uniqueness, null values, and patterns. A column analysis engine identifies candidate primary keys and potential relationships. An AI agent then generates a normalized schema by organizing data into tables, assigning appropriate SQL data types, and defining primary and foreign keys. The schema is validated using rule-based checks to ensure data integrity, correct relationships, and proper normalization. If validation fails, the workflow automatically refines the schema through a revision loop. Once validated, it generates SQL DDL scripts, ERD diagrams, a data dictionary, and a load plan that determines the correct order for inserting data. Finally, all outputs are combined and returned via webhook as a structured response, making the workflow ideal for rapid database creation, data migration, and AI-assisted data modeling.

## Overview

This workflow automatically converts CSV or Excel files into a production-ready relational database schema using AI and rule-based validation. It analyzes uploaded data to detect column types, relationships, and data quality, then generates a normalized schema with proper keys and constraints. The output includes SQL DDL scripts, ERD diagrams, a data dictionary, and a load plan. This eliminates manual schema design and accelerates database setup from raw data.

## How It Works

1. **File Upload (Webhook)**: Accepts CSV or XLSX files and initializes workflow configuration such as thresholds and retry limits.
2. **File Extraction**: Detects the file format and extracts rows into structured JSON.
3. **Data Cleaning & Profiling**: Cleans data, removes duplicates, normalizes values, and computes column statistics such as null percentage and uniqueness (see the sketch after this list).
4. **Column Analysis Engine**: Identifies candidate primary keys, analyzes cardinality, and suggests potential foreign key relationships.
5. **AI Schema Generation**: Uses an AI agent to design normalized tables, assign SQL data types, and define primary keys, foreign keys, and constraints.
6. **Validation Layer**: Validates schema integrity by checking data types, primary key uniqueness, foreign key overlap, and constraint consistency.
7. **Revision Loop**: If validation fails, sends feedback to the AI agent and regenerates the schema until it meets requirements.
8. **Schema Output Generation**: Generates SQL DDL scripts, ERD diagrams, a data dictionary, and a load plan.
9. **Load Plan Engine**: Determines the correct order for inserting data and detects circular dependencies.
10. **Combine & Explain**: Merges all outputs and optionally provides AI-generated explanations of schema decisions.
11. **Response Output**: Returns all generated artifacts as a structured JSON response via webhook.
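To make step 3 concrete, here is an illustrative sketch of per-column profiling. It is not the template's actual code; it just shows the kind of statistics (null percentage, uniqueness ratio, key candidacy) that the later schema decisions depend on.

```js
// Profile one column of row objects: null rate, uniqueness, and whether it
// could serve as a primary key candidate.
function profileColumn(rows, column) {
  const values = rows.map(row => row[column]);
  const nonNull = values.filter(v => v !== null && v !== undefined && v !== '');
  const unique = new Set(nonNull);
  return {
    column,
    nullPct: (100 * (values.length - nonNull.length)) / values.length,
    uniquenessRatio: nonNull.length ? unique.size / nonNull.length : 0,
    primaryKeyCandidate:
      nonNull.length === values.length && unique.size === values.length,
  };
}

// Example: profileColumn([{ id: 1 }, { id: 2 }], 'id')
// → { column: 'id', nullPct: 0, uniquenessRatio: 1, primaryKeyCandidate: true }
```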
## Setup Instructions

1. Activate the workflow and copy the webhook URL
2. Send a POST request with a CSV or XLSX file
3. Configure OpenAI credentials for the AI agent
4. Adjust thresholds if needed (FK overlap, retries, confidence)
5. Execute the workflow and review outputs

## Use Cases

- Automatically generate database schemas from CSV/Excel files
- Accelerate data migration and onboarding pipelines
- Rapidly prototype relational database designs
- Reverse engineer structured schemas from raw datasets
- AI-assisted data modeling and normalization

## Requirements

- n8n (latest version recommended)
- OpenAI API credentials
- LangChain nodes enabled
- CSV or XLSX input file
by Harshil Agrawal
This workflow demonstrates how to use the Netlify Trigger node to capture form submissions and add them to Airtable. You can reuse the workflow to add the data to another, similar database by replacing the Airtable node with the corresponding node.

- **Netlify Trigger node**: This node triggers the workflow when a new form is submitted. Select your site from the Site Name/ID dropdown list and the form from the Form ID dropdown list.
- **Set node**: This node extracts the required data from the Netlify Trigger node. In this example, we only want to add the Name, Email, and Role of the user.
- **Airtable node**: This node appends the data to Airtable. If you want to send the data to Google Sheets or a database instead, replace this node with the corresponding node.
by Peter
Read a value by key from a local JSON file. Related workflow: WriteKey

1. Create a subfolder in your n8n home directory: /home/node/.n8n/local-files. In Docker, look at the data path and create a subfolder local-files there.
2. Set the correct ownership: chown 1000:1000 local-files.
3. Put the workflow code in a new workflow named GetKey.
4. Create another workflow with a function item:

   ```js
   return {
     file: '/4711.json', // 4711 should be your workflow id
     key: 'MyKey',
     default: 'Optional returned value if key is empty / not exists'
   };
   ```

5. Pipe the function item into an Execute Workflow node that calls the GetKey workflow.

It would be nice if we could someday get a shiny built-in n8n node that does the job. :)
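For reference, a hypothetical standalone sketch of the lookup GetKey performs. Inside n8n you would typically read the file with a Read Binary File node; direct `fs` access in a Function node requires `NODE_FUNCTION_ALLOW_BUILTIN=fs`.

```js
// Standalone sketch: resolve the key from the JSON file, falling back to
// the caller-supplied default when the file or key is missing/empty.
const fs = require('fs');

function getKey({ file, key, default: fallback }) {
  const path = '/home/node/.n8n/local-files' + file;
  if (!fs.existsSync(path)) return fallback;
  const store = JSON.parse(fs.readFileSync(path, 'utf8'));
  const value = store[key];
  return value === undefined || value === '' ? fallback : value;
}

// Mirrors the function-item input above:
// getKey({ file: '/4711.json', key: 'MyKey', default: 'fallback' });
```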
by Evoort Solutions
# TikTok Transcript Generator

## Overview

This automated workflow extracts transcripts from TikTok videos by reading video URLs from a Google Sheet, calling the TikTok Transcript Generator API, cleaning the subtitle data, and updating the sheet with transcripts. It efficiently handles batches, errors, and rate limits to provide a seamless transcription process.

## Key Features

- **Batch processing:** Reads and processes multiple TikTok video URLs from Google Sheets.
- **Automatic transcript generation:** Uses the TikTok Transcript Generator API on RapidAPI.
- **Clean subtitle output:** Removes timestamps and headers for clear transcripts.
- **Error handling:** Marks videos with no available transcript.
- **Rate limiting:** Implements wait times to avoid API throttling on RapidAPI.
- **Seamless Google Sheets integration:** Updates the same sheet with transcript results and statuses.

## API Used

**TikTok Transcript Generator API**

## Google Sheet Columns

| Column Name | Description |
|----------------|-----------------------------------------|
| Video Url | URL of the TikTok video to transcribe |
| Transcript | Generated transcript text (updated by workflow) |
| Generated Date | Date when the transcript was generated (YYYY-MM-DD) |

## Workflow Nodes Explanation

| Node Name | Type | Purpose |
|--------------------------|-----------------------|-------------------------------------------------------------------|
| When clicking 'Execute workflow' | Manual Trigger | Manually starts the entire transcription workflow. |
| Google Sheets2 | Google Sheets (Read) | Reads TikTok video URLs and transcript data from Google Sheets. |
| Loop Over Items | Split In Batches | Processes rows in smaller batches to control execution speed. |
| If | Conditional Check | Filters videos needing transcription (URL present, transcript empty). |
| HTTP Request | HTTP Request | Calls the TikTok Transcript Generator API on RapidAPI to fetch transcripts. |
| If1 | Conditional Check | Checks for valid API responses (handles 404 errors). |
| Code | Code (JavaScript) | Cleans and formats raw subtitle text by removing timestamps (sketched below). |
| Google Sheets | Google Sheets (Update)| Updates the sheet with cleaned transcripts and generation dates. |
| Google Sheets1 | Google Sheets (Update)| Updates the sheet with a "No transcription available" message on error. |
| Wait | Wait | Adds delay between batches to avoid API rate limits on RapidAPI. |

## Challenges Resolved

- **Manual transcription effort:** Eliminates the need to manually transcribe TikTok videos, saving time and reducing errors.
- **API rate limits:** Introduces batching and wait periods to avoid exceeding API usage limits on RapidAPI, ensuring smooth execution.
- **Incomplete or missing data:** Filters out videos already transcribed and handles missing transcripts gracefully by logging appropriate messages.
- **Data formatting issues:** Cleans raw subtitle data to provide readable, timestamp-free transcripts.
- **Data synchronization:** Updates transcripts back into the same Google Sheet row, maintaining data consistency and ease of access.

## Use Cases

- Content creators wanting to transcribe TikTok videos automatically.
- Social media analysts extracting text data for research.
- Automation enthusiasts integrating transcript generation into workflows.
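The Code node's cleanup step can be sketched roughly as follows. This assumes WebVTT-style subtitle input; adjust the patterns to whatever format the API actually returns.

```js
// Strip headers, cue numbers, and timestamp lines; keep only spoken text.
function cleanSubtitles(raw) {
  return raw
    .split('\n')
    .filter(line =>
      line.trim() !== '' &&
      !line.startsWith('WEBVTT') &&                      // file header
      !/^\d+$/.test(line.trim()) &&                      // cue numbers
      !/\d{2}:\d{2}(:\d{2})?[.,]\d{3}\s*-->/.test(line)  // timestamp lines
    )
    .join(' ')
    .replace(/\s+/g, ' ')
    .trim();
}
```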
## How to Use

1. Prepare a Google Sheet with the columns: Video Url, Transcript, and Generated Date.
2. Connect your Google Sheets account in the workflow.
3. Enter your RapidAPI key for the TikTok Transcript Generator API.
4. Execute the workflow to generate transcripts.
5. View transcripts and generated dates directly in your Google Sheet.

Try this workflow to automate your TikTok video transcriptions efficiently! Create your free n8n account and set up the workflow in just a few minutes using the link below:

👉 Start Automating with n8n

Save time, stay consistent, and transcribe your TikTok content effortlessly!
by Jitesh Dugar
# Newsletter Sign-up with Email Verification & Welcome Email Automation

## 📋 Description

A complete, production-ready newsletter automation workflow that validates email addresses, sends personalized welcome emails, and maintains comprehensive logs in Google Sheets. Perfect for marketing teams, content creators, and businesses looking to build high-quality email lists with minimal manual effort.

## ✨ Key Features

### Email Verification
- **Real-time validation** using the Verifi Email API
- Checks email format (RFC compliance)
- Verifies domain existence and MX records
- Detects disposable/temporary email addresses
- Identifies potential spoofed emails

### Automated Welcome Emails
- **Personalized HTML emails** with the subscriber's first name
- Beautiful, mobile-responsive design with gradient headers
- Branded confirmation and unsubscribe links
- Sent via Gmail (or SMTP) automatically to valid subscribers

### Smart Data Handling
- **Comprehensive logging** to Google Sheets with three separate tabs
- Handles incomplete submissions gracefully
- Preserves original user data throughout the verification process
- Tracks source attribution for multi-channel campaigns

### Error Management
- Automatic retry logic on API failures
- Separate logging for different error types
- Detailed technical reasons for invalid emails
- No data loss with direct webhook referencing

## 🎯 Use Cases

- **Newsletter sign-ups** on websites and landing pages
- **Lead generation** forms with quality control
- **Marketing campaigns** requiring verified email lists
- **Community building** with automated onboarding
- **SaaS product launches** with email collection
- **Content creator** audience building
- **E-commerce** customer list management

## 📊 What Gets Logged

### Master Log (All Subscribers)
- Timestamp, name, email, verification result
- Verification score and email sent status
- Source tracking, disposable status, domain info

### Invalid Emails Log
- Detailed rejection reasons
- Technical diagnostic information
- MX record status, RFC compliance
- Provider information for troubleshooting

### Invalid Submissions Log
- Incomplete form data
- Missing required fields
- Timestamp for follow-up

## 🔧 Technical Stack

- Trigger: Webhook (POST endpoint)
- Email Verification: Verifi Email API
- Email Sending: Gmail OAuth2 (or SMTP)
- Data Storage: Google Sheets (3 tabs)
- Processing: JavaScript code nodes for data formatting

## 🚀 Setup Requirements

1. Google Account - for Sheets and Gmail integration
2. Verifi Email API Key - (https://verifi.email)
3. Google Sheets - pre-configured with 3 tabs (template provided)
4. 5-10 minutes - quick setup with step-by-step instructions included

## 📈 Benefits

- ✅ Improve Email Deliverability - remove invalid emails before sending campaigns
- ✅ Reduce Bounce Rates - only send to verified, active email addresses
- ✅ Save Money - don't waste email credits on invalid addresses
- ✅ Better Analytics - track conversion rates by source
- ✅ Professional Onboarding - personalized welcome experience
- ✅ Scalable Solution - handles high-volume sign-ups automatically
- ✅ Data Quality - build a clean, high-quality subscriber list

## 🎨 Customization Options

- **Email Template** - fully customizable HTML design
- **Verification Threshold** - adjust score requirements
- **Brand Colors** - match your company branding
- **Confirmation Flow** - add double opt-in if desired
- **Multiple Sources** - track different signup forms
- **Language** - easily translate email content

## 📦 What's Included

- ✅ Complete n8n workflow JSON (ready to import)
- ✅ Google Sheets template structure
- ✅ Responsive HTML email template
- ✅ Setup documentation with screenshots
- ✅ Troubleshooting guide
- ✅ Customization examples
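The Technical Stack above lists a webhook (POST endpoint) as the trigger. As a purely hypothetical illustration, here is what a sign-up POST from your form might look like; the URL path and field names are assumptions to align with your form and the workflow's data-formatting nodes.

```js
// Example client-side submission to the workflow's webhook endpoint.
// URL path and field names are placeholders, not the template's values.
await fetch('https://your-n8n-host/webhook/newsletter-signup', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    firstName: 'Jane',
    email: 'jane@example.com',
    source: 'landing-page', // used for source attribution in the logs
  }),
});
```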
## 🔒 Privacy & Compliance

- GDPR-compliant with unsubscribe links
- Secure data handling via OAuth2
- No data shared with third parties
- Audit trail in Google Sheets
- Easy data deletion/export

## 💡 Quick Stats

- **12 Nodes** - fully automated workflow
- **3 Data Paths** - valid, invalid, and incomplete submissions
- **100% Uptime** - when properly configured
- **Instant Processing** - real-time email verification
- **Unlimited Scale** - based on your API limits

## 🏆 Perfect For

- Marketing Agencies
- SaaS Companies
- Content Creators
- E-commerce Stores
- Community Platforms
- Educational Institutions
- Membership Sites
- Newsletter Publishers

## 🌟 Why Use This Workflow?

Instead of manually verifying emails or dealing with bounce complaints, this workflow automates the entire process from sign-up to welcome email. Save hours of manual work, improve your email deliverability, and create a professional first impression with every new subscriber.

Start building a high-quality email list today!
by WeblineIndia
# APK Security Scanner & PDF Report Generator

This workflow automatically analyzes any newly uploaded APK file and produces a clean, professional PDF security report. When an APK appears in Google Drive, the workflow downloads it, sends it to MobSF for security scanning, summarizes the results, generates an HTML report using AI, converts it into a PDF via PDF.co, and finally saves the PDF back to Google Drive.

## Quick Start: Fastest Way to Use This Workflow

1. Set up a Google Drive folder for uploading APKs.
2. Install MobSF using Docker and copy your API key.
3. Add credentials for Google Drive, MobSF, OpenAI, and PDF.co in n8n.
4. Import the workflow JSON.
5. Update node credentials.
6. Upload an APK to the watched folder and let the automation run.

## What It Does

This workflow provides a complete automated pipeline for analyzing Android APK files. It removes the manual process of scanning apps, extracting security insights, formatting reports, and distributing results. Each step is designed to streamline application security checks for development teams, QA engineers, and product managers.

Once the workflow detects a new APK in Google Drive, it passes the file to MobSF for a detailed static analysis. The workflow extracts the results, transforms them into a clear and well-structured HTML report using AI, and then converts the report into a PDF. This ensures the end user receives a polished, audit-ready security document with zero manual involvement.

## Who's It For

This workflow is ideal for:

- Mobile development teams performing security checks on apps.
- QA and testing teams validating APK builds before release.
- DevSecOps engineers needing automated, repeatable security audits.
- Software companies generating compliance and audit documentation.
- Agencies reviewing client apps for vulnerabilities.

## Requirements to Use This Workflow

- An n8n instance (self-hosted or cloud)
- A Google Drive account with a folder for APK uploads
- Docker installed to run MobSF locally
- MobSF API key
- OpenAI API key
- PDF.co API key
- Basic understanding of n8n nodes and credentials setup

## How It Works & Setup Instructions

### Step 1 — Prepare Google Drive

Create a folder specifically for APK uploads. Configure the **Watch APK Uploads (Google Drive)** node to monitor this folder for new files.

### Step 2 — Install and Run MobSF Using Docker

Install Docker and run:

```bash
docker run -it --rm -p 8000:8000 \
  -v $(pwd)/mobsf:/home/mobsf/.MobSF \
  opensecurity/mobile-security-framework-mobsf
```

Open MobSF at http://localhost:8000 and copy your API key.

### Step 3 — Add Credentials in n8n

Add credentials for:

- Google Drive
- MobSF (API key in headers)
- OpenAI
- PDF.co

### Step 4 — Configure Malware Scanning

- **Upload APK to Analyzer (MobSF Upload API)** sends the file.
- **Start Security Scan (MobSF Scan API)** triggers the vulnerability scan.

### Step 5 — Summarize & Generate HTML Report

- **Summarize MobSF Report (JS Code)** extracts key vulnerabilities (sketched after these steps).
- **Generate HTML Report (GPT Model)** formats them in a structured report.
- **Clean HTML Output (JS Code)** removes escaped characters.

### Step 6 — Convert HTML to PDF

Use **Generate PDF (PDF.co API)** to convert the HTML to PDF.

### Step 7 — Save Final Report

Download using **Download Generated PDF**, then upload via **Upload PDF to Google Drive**.
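To illustrate Step 5, here is a hypothetical sketch of the summarizing Code node. MobSF's JSON report structure varies by version, so every field name below is an assumption to adapt to your actual scan output.

```js
// Reduce the MobSF scan JSON to the handful of fields the report needs.
// All property names (security_score, code_analysis, ...) are assumptions.
const report = $json;

return [{
  json: {
    appName: report.app_name ?? 'unknown',
    securityScore: report.security_score ?? null,
    highSeverityFindings: Object.values(report.code_analysis?.findings ?? {})
      .filter(finding => finding.severity === 'high')
      .map(finding => finding.title ?? finding.description),
    permissions: Object.keys(report.permissions ?? {}),
  },
}];
```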
## How To Customize Nodes

- **Google Drive Trigger:** Change the folder ID to watch a different upload directory.
- **MobSF API Nodes:** Update URLs if MobSF runs on another port or server.
- **AI Report Generator:** Modify prompt instructions to change the writing style or report template.
- **PDF Generation:** Edit margins, page size, or output filename in the PDF.co node.
- **Save Location:** Change the Google Drive folder where the final PDF is stored.

## Add-Ons

You can extend this workflow with:

- **Slack or email notifications** when a report is ready
- **Automatic naming conventions** (e.g., report-{{date}}-{{app_name}}.pdf)
- **Saving reports into Airtable or Notion**
- **Multi-file batch scanning**
- **VirusTotal scan integration** before generating the PDF

## Use Case Examples

- Automated security scanning for every new build generated by CI/CD.
- Pre-release vulnerability checks for client-delivered APKs.
- Compliance documentation generation for internal security audits.
- Bulk scanning of legacy APKs for modernization projects.
- Creating professional PDF security reports for customers.

(Many more use cases can be built on the same workflow foundation.)

## Troubleshooting Guide

| Issue | Possible Cause | Solution |
| ----------------------- | -------------------------- | ---------------------------------------------------------- |
| MobSF API call fails | Wrong API key or URL | Check that MobSF is running and the API key is correct. |
| PDF not generated | Invalid HTML or PDF.co key | Validate the HTML output and verify PDF.co credentials. |
| Workflow not triggering | Wrong Google Drive folder | Reconfigure the Drive Trigger node with the correct folder ID. |
| APK upload fails | File not in binary mode | Ensure the HTTP upload node is using "Binary Data" correctly. |
| Scan returns empty data | MobSF not fully started | Wait for full MobSF startup logs before scanning. |

## Need Help?

If you need assistance setting up this workflow, customizing it, or adding advanced features such as Slack alerts, CI/CD integration, or bulk scanning, our n8n workflow development team at WeblineIndia can help. We specialize in building secure, scalable, automation-driven workflows on n8n for businesses of all sizes. Contact us anytime for support or to build custom workflow automation solutions.