by Thomas
"I used to spend hours every week just copy-pasting product descriptions to find the right tariff codes for our international shipments. It was tedious and prone to errors." - Accounting specialist. This workflow eliminates that manual work entirely. It automatically finds customs tariff numbers (also known as HS Codes or "Zolltarifnummern") for your products and enriches your data in Google Sheets. It offers two powerful modes: bulk processing for entire product lists and an on-demand chat interface for quick single lookups. new features added the API score in percentage (80 to 100% is perfect, 70-80% still good) added description of the found HS Code to better verify the accuracy -please keep in mind, that is still a beta https://www.zolltarifnummern.de/services/api What this workflow does Bulk Enrichment from Google Sheets:** Reads a list of product descriptions from a specified Google Sheet. External API Lookup:** For each product, it queries the zolltarifnummern.de API to find the most relevant customs tariff number. Automated Data Update:** Writes the found tariff numbers back into the correct row in your Google Sheet. On-Demand Single Lookup:** Use the integrated Chat Trigger to instantly look up a tariff number for a single product description without leaving n8n. Completion Notification:** Sends a confirmation email via Gmail once the bulk processing job is finished. Nodes Used Google Sheets HTTP Request Loop Over Items (Split in Batches) Set Gmail Chat Trigger Manual Trigger Preparation A Google Sheet prepared with at least two columns: one for your product descriptions (e.g., ProductDescription) and an empty one for the results (e.g., TariffCode). How to set up this workflow Configure Google Sheets (Read): Open the "Read Item Descriptions" node. Select your Google Sheets credentials. Enter your Spreadsheet ID and the name of the sheet containing your product data. Make sure the "Columns to Read" field includes the name of your product description column. Configure Google Sheets (Update): Open the "Write Customs Tariff to Sheet" node. Select the same Google Sheets credentials. Enter the same Spreadsheet ID and Sheet Name. Under Columns, set Matching Columns to your product description column name. This is crucial for updating the correct rows. Configure Email Notification: Open the "Send Completion Email" (Gmail) node. Select your Gmail credentials. In the Send To field, enter the email address where you want to receive the completion notification. Run the Workflow: For Bulk Processing: Activate the workflow and execute the "Abfrage starten" (Start Query) Manual Trigger. For a Single Lookup: Use the Chat Trigger. Open the chat pane, type a product description, and hit send. The workflow will return the suggested tariff number.
by Oneclick AI Squad
This workflow automates flight price comparison across multiple booking platforms (Kayak, Skyscanner, Expedia, Google Flights). It accepts natural language queries, extracts flight details using NLP, scrapes prices in parallel, identifies the best deals, and sends professional email reports with comprehensive price breakdowns and booking links. π¦ What You'll Get A fully functional, production-ready n8n workflow that: β Compares flight prices across 4 major platforms (Kayak, Skyscanner, Expedia, Google Flights) β Accepts natural language requests ("Flight from NYC to London on March 25") β Sends beautiful email reports with best deals β Returns real-time JSON responses for web apps β Handles errors gracefully with helpful messages β Includes detailed documentation with sticky notes π Quick Setup (3 Steps) Step 1: Import Workflow to n8n Copy the JSON from the first artifact (workflow file) Open n8n β Go to Workflows Click "Import from File" β Paste JSON β Click Import β Workflow imported successfully! Step 2: Setup Python Scraper On your server (where n8n SSH nodes will connect): Navigate to your scripts directory cd /home/oneclick-server2/ Create the scraper file nano flight_scraper.py Copy the entire Python script from the second artifact Save with Ctrl+X, then Y, then Enter Make it executable chmod +x flight_scraper.py Install required packages pip3 install selenium Install Chrome and ChromeDriver sudo apt update sudo apt install -y chromium-browser chromium-chromedriver Test the scraper python3 flight_scraper.py JFK LHR 2025-03-25 2025-03-30 round-trip 1 economy kayak Expected Output: Delta|$450|7h 30m|0|10:00 AM|6:30 PM|https://kayak.com/... British Airways|$485|7h 45m|0|11:30 AM|8:15 PM|https://kayak.com/... ... Step 3: Configure n8n Credentials A. Setup SMTP (for sending emails): In n8n: Credentials β Add Credential β SMTP Fill in details: Host: smtp.gmail.com Port: 587 User: your-email@gmail.com Password: [Your App Password] For Gmail Users: Enable 2FA: https://myaccount.google.com/security Create App Password: https://myaccount.google.com/apppasswords Use the 16-character password in n8n B. Setup SSH (already configured if you used existing credentials): In workflow, SSH nodes use: ilPh8oO4GfSlc0Qy Verify credential exists and points to correct server Update path if needed: /home/oneclick-server2/ C. Activate Workflow: Click the workflow toggle β Active β Webhook is now live! π― How to Use Method 1: Direct Webhook Call curl -X POST https://your-n8n-domain.com/webhook/flight-price-compare \ -H "Content-Type: application/json" \ -d '{ "message": "Flight from Mumbai to Dubai on 15th March, round-trip returning 20th March", "email": "user@example.com", "name": "John Doe" }' Response: { "success": true, "message": "Flight comparison sent to user@example.com", "route": "BOM β DXB", "bestPrice": 450, "airline": "Emirates", "totalResults": 18 } Method 2: Natural Language Queries The workflow understands various formats: β All these work: "Flight from New York to London on 25th March, one-way" "NYC to LHR March 25 round-trip return March 30" "I need a flight from Mumbai to Dubai departing 15th March" "JFK LHR 2025-03-25 2025-03-30 round-trip" Supported cities (auto-converts to airport codes): New York β JFK London β LHR Mumbai β BOM Dubai β DXB Singapore β SIN And 20+ more cities Method 3: Structured JSON { "from": "JFK", "to": "LHR", "departure_date": "2025-03-25", "return_date": "2025-03-30", "trip_type": "round-trip", "passengers": 1, "class": "economy", "email": "user@example.com", "name": "John" } π§ Email Report Example Users receive an email like this: FLIGHT PRICE COMPARISON Route: JFK β LHR Departure: 25 Mar 2025 Return: 30 Mar 2025 Trip Type: round-trip Passengers: 1 π BEST DEAL British Airways Price: $450 Duration: 7h 30m Stops: Non-stop Platform: Kayak π° Save $85 vs highest price! π ALL RESULTS (Top 10) British Airways - $450 (Non-stop) - Kayak Delta - $475 (Non-stop) - Google Flights American Airlines - $485 (Non-stop) - Expedia Virgin Atlantic - $495 (Non-stop) - Skyscanner United - $520 (1 stop) - Kayak ... Average Price: $495 Total Results: 23 Prices subject to availability. Happy travels! βοΈ π§ Customization Options Change Scraping Platforms Add more platforms: Duplicate an SSH scraping node Change platform parameter: kayak β new-platform Add scraping logic in flight_scraper.py Connect to "Aggregate & Analyze Prices" node Remove platforms: Delete unwanted SSH node Workflow continues with remaining platforms Modify Email Format Edit the "Format Email Report" node: // Change to HTML format const html = ` <!DOCTYPE html> <html> <body> Flight Deals Best price: ${bestDeal.currency}${bestDeal.price} </body> </html> `; return [{ json: { subject: "...", html: html, // Instead of text ...data } }]; Then update "Send Email Report" node: Change emailFormat to html Use {{$json.html}} instead of {{$json.text}} Add More Cities/Airports Edit "Parse & Validate Flight Request" node: const airportCodes = { ...existing codes..., 'berlin': 'BER', 'rome': 'FCO', 'barcelona': 'BCN', // Add your cities here }; Change Timeout Settings In each SSH node, add: "timeout": 30000 // 30 seconds π Troubleshooting Issue: "No flights found" Possible causes: Scraper script not working Website structure changed Dates in past Invalid airport codes Solutions: Test scraper manually cd /home/oneclick-server2/ python3 flight_scraper.py JFK LHR 2025-03-25 "" one-way 1 economy kayak Check if output shows flights If no output, check Chrome/ChromeDriver installation Issue: "Connection refused" (SSH) Solutions: Verify SSH credentials in n8n Check server is accessible: ssh user@your-server Verify path exists: /home/oneclick-server2/ Check Python installed: which python3 Issue: "Email not sending" Solutions: Verify SMTP credentials Check email in spam folder For Gmail: Confirm App Password is used (not regular password) Test SMTP connection: telnet smtp.gmail.com 587 Issue: "Webhook not responding" Solutions: Ensure workflow is Active (toggle on) Check webhook path: /webhook/flight-price-compare Test with curl command (see "How to Use" section) Check n8n logs: Settings β Log Streaming Issue: "Scraper timing out" Solutions: In flight_scraper.py, increase wait times time.sleep(10) # Instead of time.sleep(5) Or increase WebDriverWait timeout WebDriverWait(driver, 30) # Instead of 20 π Understanding the Workflow Node-by-Node Explanation 1. Webhook - Receive Flight Request Entry point for all requests Accepts POST requests Path: /webhook/flight-price-compare 2. Parse & Validate Flight Request Extracts flight details from natural language Converts city names to airport codes Validates required fields Returns helpful errors if data missing 3. Check If Request Valid Routes to scraping if valid Routes to error response if invalid 4-7. Scrape [Platform] (4 nodes) Run in parallel for speed Each calls Python script with platform parameter Continue on failure (don't break workflow) Return pipe-delimited flight data 8. Aggregate & Analyze Prices Collects all scraper results Parses flight data Finds best overall deal Finds best non-stop flight Calculates statistics Sorts by price 9. Format Email Report Creates readable text report Includes route details Highlights best deal Lists top 10 results Shows statistics 10. Send Email Report Sends formatted email to user Uses SMTP credentials 11. Webhook Response (Success) Returns JSON response immediately Includes best price summary Confirms email sent 12. Webhook Response (Error) Returns helpful error message Guides user on what's missing π¨ Workflow Features β Included Features Natural Language Processing**: Understands flexible input formats Multi-Platform Comparison**: 4 major booking sites Parallel Scraping**: All platforms scraped simultaneously Error Handling**: Graceful failures, helpful messages Email Reports**: Professional format with all details Real-Time Responses**: Instant webhook feedback Sticky Notes**: Detailed documentation in workflow Airport Code Mapping**: Auto-converts 20+ cities π§ Not Included (Easy to Add) Price Alerts**: Monitor price drops (add Google Sheets) Analytics Dashboard**: Track searches (add Google Sheets) SMS Notifications**: Send via Twilio Slack Integration**: Post to channels Database Logging**: Store searches in PostgreSQL Multi-Currency**: Show prices in the user's currency π‘ Pro Tips Tip 1: Speed Up Scraping Use faster scraping service (like ScraperAPI): // Replace SSH nodes with HTTP Request nodes { "url": "http://api.scraperapi.com", "qs": { "api_key": "YOUR_KEY", "url": "https://kayak.com/flights/..." } } Tip 2: Cache Results Add caching to avoid duplicate scraping: // In Parse node, check cache first const cacheKey = ${origin}-${dest}-${departureDate}; const cached = await $cache.get(cacheKey); if (cached && Date.now() - cached.time < 3600000) { return cached.data; // Use 1-hour cache } Tip 3: Add More Platforms Easy to add Momondo, CheapOair, etc.: Add function in flight_scraper.py Add SSH node in workflow Connect to aggregator Tip 4: Improve Date Parsing Handle more formats: // Add to Parse node const formats = [ 'DD/MM/YYYY', 'MM-DD-YYYY', 'YYYY.MM.DD', // Add your formats ];
by Oneclick AI Squad
This n8n workflow transforms uploaded health details or lab reports received via email into a customized diet plan using AI analysis, then sends the plan back to the user via email, optimizing nutrition based on individual health data. Why Use It This workflow automates the creation of personalized diet plans from health data, saving time for nutritionists, improving patient outcomes with AI-driven insights, and providing a convenient email delivery system for users. How to Import It Download the Workflow JSON: Obtain the workflow file from the n8n template or create it based on this document. Import into n8n: In your n8n instance, go to "Workflows," click the three dots, select "Import from File," and upload the JSON. Configure Credentials: Set up email (e.g., IMAP for receiving, SMTP for sending), AI model, and optional Google Sheets credentials in n8n. Run the Workflow: Test with a sample email containing health data and verify the diet plan delivery. System Architecture Data Input Pipeline**: Email Trigger: Initiates the workflow when a health report email is received. Extract Health Data: Parses uploaded health details or lab reports from the email. AI Analysis Flow**: Send to AI Model: Analyzes health data using an AI model. Generate Diet Plan: Creates a customized diet plan based on AI output. Delivery Flow**: Prepare Email Content: Formats the diet plan for email delivery. Send Diet Plan Email: Sends the plan to the user via SMTP. Update Log (Optional): Logs the process in a Google Sheet. Google Sheet Structure Columns**: timestamp: Date and time of the diet plan generation. user_email: Userβs email address for receiving the plan. health_data: Extracted health metrics or lab report summary. condition: AI-identified health condition. diet_plan: Generated diet plan summary. sent_status: Status of email delivery (e.g., Sent, Failed). Customization Add SMS Alerts**: Integrate Twilio or WhatsApp for additional notifications. Enhance AI**: Train the AI model with more nutritional data for better plans. Include Recipes**: Add a node to suggest recipes based on the diet plan. Multilingual Support**: Adapt email content for different languages. Integration with Apps**: Connect to fitness apps (e.g., MyFitnessPal) for tracking. Requirements Email Service**: IMAP (e.g., Gmail) for receiving health data emails and SMTP for sending diet plans. AI Model**: Ollama or similar for health analysis and diet plan generation (requires API access). n8n Instance**: With email (IMAP/SMTP) and AI connectors configured. Internet Connection**: To access email and AI APIs. Optional Google Sheets Account**: For logging health data and diet plans. User Consent**: Ensure compliance with data privacy laws (e.g., HIPAA) for health data. Want a tailored workflow for your business? Our experts can craft it quickly Contact our team
by Aitor | 1Node
This automated n8n workflow streamlines lead qualification by taking structured lead data from Tally forms, enriching it with Qwen-3βs AI analysis, and promptly notifying your sales or delivery teams. It provides concise summaries, actionable insights, and highlights missing information to focus outreach efforts efficiently. The workflow includes security best practices to prevent prompt injections and ensures data integrity and privacy throughout. Requirements Tally Forms A Tally account with an active lead qualification form Webhook integration enabled to send form responses to n8n Qwen-3 Large Language Model API key and access to your chosen AI model via OpenRouter Gmail Notification Gmail account credentials connected in n8n Workflow Breakdown Trigger: Receive Tally form submission via n8n Webhook The workflow starts from a Webhook node listening for POST requests from your Tally form. Extract and map Tally form data Parse JSON to obtain fields like Company Name, Full Name, Work Email, Employee Count, Industry, Main Challenges Encountered, Goals With the Project, Urgency or Date When Solution Is Needed, Estimated Budget, and Anything Else We Should Know. Construct the Lead Qualification prompt Combine a secure system prompt with user data from the form. This prompt instructs Qwen-3 to generate summaries, identify key challenges, recommend action points, suggest follow-up questions, and more. Send notification with AI analysis Deliver the formatted message through your chosen channel(s) such as email or Slack, enabling your team to quickly act on qualified leads. Potential Improvements Capture Lead Role and Authority:** Add fields to the form for role and decision-making authority to improve lead qualification accuracy. Expand Notification Channels:** Include SMS or Microsoft Teams notifications alongside email and Slack for better team reach. Automate Lead Scoring:** Incorporate a numeric or qualitative lead score based on key input factors to prioritize follow-ups. Integrate CRM Task Creation:** Automatically create follow-up tasks or reminders in CRM systemss. πββοΈ Need Help? Feel free to contact us at 1 Node Get instant access to a library of free resources we created.
by WeblineIndia
βοΈ Advanced Equipment Health Monitor with MS Teams Integration (n8n | API | Google Sheets | MSTeams) This n8n workflow automatically monitors equipment health by fetching real-time metrics like temperature, voltage and operational status. If any of these parameters cross critical thresholds, an alert is instantly sent to a Microsoft Teams channel and the event is logged in Google Sheets. The workflow runs every 15 minutes by default. β‘ Quick Implementation Steps Import the workflow JSON into your n8n instance. Open the "Set Config" node and update: API endpoint Teams webhook URL Threshold values Google Sheet ID Activate the workflow to start receiving alerts every 15 minutes. π― Whoβs It For Renewable energy site operators (solar, wind) Plant maintenance and operations teams Remote infrastructure monitoring services IoT-integrated energy platforms Enterprise environments using Microsoft Teams π Requirements | Tool | Purpose | |------|---------| | n8n Instance | To run and schedule automation | | HTTP API | Access to your equipment or IoT platform health API | | Microsoft Teams | Incoming Webhook URL configured | | Google Sheets | Logging and analytics | | SMTP (optional) | For email-based alternatives or expansions | π§ What It Does Runs every 15 minutes** to check the latest equipment metrics. Compares values** (temperature, voltage, status) against configured thresholds. Triggers a Microsoft Teams message** when a threshold is breached. Appends the alert data** to a Google Sheet for logging and review. π§© Workflow Components Set Node:** Configures thresholds, endpoints, webhook URL and Sheet ID. Cron Node:** Triggers the check every 15 minutes. HTTP Request Node:** Pulls data from your equipment health monitoring API. IF Node:** Evaluates if conditions are within or outside defined limits. MS Teams Alert Node:** Sends structured alerts using a Teams incoming webhook. Google Sheets Node:** Logs alert details for recordkeeping and analytics. π§ How To Set Up β Step-by-Step Import Workflow: In n8n, click Import and upload the provided .json file. Update Configurations: Open the Set Config node. Replace the placeholder values: apiEndpoint: URL to fetch equipment data. teamsWebhookUrl: Your MS Teams channel webhook. temperatureThreshold: Example = 80 voltageThreshold: Example = 400 googleSheetId: Google Sheet ID (must be shared with n8n service account). Check Webhook Integration: Ensure your MS Teams webhook is properly authorized and points to a live channel. Run & Monitor: Enable the workflow and view logs/alerts. Adjust thresholds as needed. π§ͺ How To Customize | Customization | How | |---------------|-----| | Add more parameters (humidity, pressure) | Extend the HTTP + IF node conditions | | Change alert frequency | Edit the Cron node | | Use Slack or Email instead of Teams | Replace MS Teams node with Slack or Email node | | Add PDF Report Generation | Use HTML β PDF node and email the report | | Export to Database | Add a PostgreSQL or MySQL node instead of Google Sheets | β Addβons (Advanced) | Add-on | Description | |--------|-------------| | π¦ Auto-Ticketing | Auto-create issues in Jira, Trello or ClickUp for serious faults | | π Dashboard Sync | Send real-time logs to BigQuery or InfluxDB | | π§ Predictive Alerts | Use machine learning APIs to flag anomalies | | π Daily Digest | Compile all incidents into a daily summary email or Teams post | | π± Mobile Alert | Integrate Twilio for SMS alerts or WhatsApp notifications | π Example Use Cases Monitor solar inverter health for overheating or voltage drops. Alert field engineers via Teams when a wind turbine sensor fails. Log and visualize hardware issues for weekly analytics. Automate SLA compliance tracking through timely notifications. Ensure distributed infrastructure (e.g., substations) are always in operational range. π§― Troubleshooting Guide | Issue | Possible Cause | Solution | |-------|----------------|----------| | No Teams alert | Incorrect webhook URL or formatting | Recheck the Teams webhook and payload | | Workflow not triggering | Cron node misconfigured | Ensure itβs set to run every 15 mins and workflow is active | | Google Sheet not updating | Sheet ID is wrong or not shared | Share Sheet with your n8n Google service account | | No data from API | Endpoint URL is down or wrong | Test the endpoint manually with Postman or browser | π Need Assistance? Need help tailoring this to your exact equipment type or expanding the workflow? π Contact WeblineIndia β Expert automation partners for renewable energy, infrastructure and enterprise workflows.
by StΓ©phane Heckel
Keep your Google Sheets contacts in sync with SeaTable Update or Insert records in SeaTable How it works Use a Google Sheet as your central contact list. For each contact in the sheet: Check if the record already exists in SeaTable (based on email). If it exists β update the record. If it doesnβt β insert the new contact. How to use Copy the Google Sheet Template Link. Get the Google Sheet ID (the string between d/ and /edit). In the workflow, set the Sheet ID in the settings node. In SeaTable, create or update a base with a Table1 containing these fields: email firstname lastname company Configure your Google Sheets and SeaTable credentials in n8n. Add your own contacts to the Google Sheet & run the workflow Requirements Google credentials (for Sheets access) SeaTable account (Cloud) n8n (tested on version 1.105.2, Ubuntu) Example use cases Maintain a central CRM-like database in SeaTable. Ensure consistent contact data when collecting leads in Google Sheets. Automate record deduplication (prevent duplicate entries). Need Help? Join the discussion here or contact me directly on LinkedIn. Ask the community in the n8n Forum.
by Nguyen Thieu Toan
π¬ TikTok Video Downloader (No Watermark) - Telegram Bot > Download TikTok videos instantly without watermarks via Telegram > Fast, reliable, and user-friendly automated workflow β¨ What This Workflow Does This powerful automation turns your Telegram bot into a TikTok video downloader. Simply send any TikTok link, and the bot will: β Validate the URL automatically β‘ Extract video without watermark π Display video statistics (views, likes, author) π Send the clean video file directly to you No ads. No watermarks. Pure automation magic. π― Key Features | Feature | Description | |---------|-------------| | π Smart Validation | Automatically checks if the link is a valid TikTok URL | | π¬ Real-time Feedback | Keeps users informed with status messages at every step | | β οΈ Error Handling | Catches and explains errors in user-friendly language | | π Video Analytics | Shows author name, view count, and likes | | π₯ High Quality | Downloads original video quality without TikTok watermark | | β‘ Fast Processing | Optimized HTTP requests with proper headers and timeouts | π§ How It Works Workflow Flow Diagram π± User sends TikTok link β β URL Validation ββ Valid β Continue ββ Invalid β Send error message β π¬ Send "Processing..." status β π Fetch TikTok page HTML β π Extract video URL from page data β β¬οΈ Download video file (no watermark) β π€ Send video to user with stats Technical Process Trigger Reception: Telegram webhook receives user message URL Validation: IF node checks for tiktok.com or vm.tiktok.com domains User Feedback: Bot sends "uploading video..." chat action + status message Variable Configuration: Stores chat ID and video URL for later use HTML Fetching: HTTP request to TikTok with browser-like headers Data Extraction: JavaScript code parses UNIVERSAL_DATA_FOR_REHYDRATION JSON Video Download: HTTP request with proper cookies and referrer headers Delivery: Telegram sends video file with formatted caption including stats Error Handling Strategy Each critical node (HTTP requests, code execution) has error output enabled: On Success**: Continues to next processing step On Error**: Routes to "Format Error" β "Send Error Message" path User Experience**: Clear, actionable error messages instead of silent failures π Set Up Steps Prerequisites β n8n instance (v1.116.0 or higher) β Telegram Bot Token (Create via @BotFather) β Basic understanding of n8n workflows Step 1: Import Workflow Copy the workflow JSON In n8n, click "+ Add workflow" β "Import from JSON" Paste the JSON and click "Import" Step 2: Configure Telegram Credentials Click on any Telegram node Select "Create New Credential" in the Credentials dropdown Enter your Bot Token from @BotFather Click "Save" All Telegram nodes will automatically use this credential Step 3: Enable Error Handling β οΈ CRITICAL You MUST manually configure error outputs on these 3 nodes: Node: "Get TikTok Page HTML" Click the node β Settings tab Find "On Error" section Select "Continue With Error Output" Click Save Node: "Extract Video URL" Click the node β Settings tab Set "On Error" to "Continue With Error Output" Click Save Node: "Download Video File" Click the node β Settings tab Set "On Error" to "Continue With Error Output" Click Save > π‘ Why? n8n cannot import error handling settings via JSON. This manual step ensures errors are caught instead of crashing the workflow. Step 4: Activate Workflow Click the "Active" toggle in the top-right corner The workflow is now listening for Telegram messages Step 5: Test Your Bot Open Telegram and find your bot Send a TikTok link like: https://www.tiktok.com/@user/video/123456789 Watch the magic happen! π π§ͺ Testing Scenarios | Test Case | Input | Expected Output | |-----------|-------|----------------| | Valid Video | Working TikTok link | β Video file + stats caption | | Invalid URL | hello world | β "Please send valid TikTok link" | | Deleted Video | Link to deleted video | β "Video data not found" error | | Private Video | Private account video | β "Video may be private" error | | Short Link | https://vm.tiktok.com/abc | β Resolves and downloads | π¨ Customization Ideas Change Language Edit text in Telegram nodes to translate messages: "β³ Downloading video..." β "β³ Δang tαΊ£i video..." Add Video Compression Insert a Compress node between "Download Video File" and "Send Video to User" for smaller files. Store Statistics Add a Google Sheets node after "Extract Video URL" to log: Video URL Author Views/Likes Download timestamp Multi-Platform Support Duplicate the workflow and modify URL validation + extraction logic for Instagram, YouTube Shorts, etc. Rate Limiting Add a Wait node (2 seconds) before "Get TikTok Page HTML" to avoid IP bans. π Troubleshooting Problem: Bot doesn't respond β Check if workflow is Active β Verify Telegram credentials are correct β Check Executions tab for errors Problem: "Video data not found" error β TikTok may have changed their HTML structure β Update the regex in "Extract Video URL" node β Check if video is actually deleted/private Problem: Download fails β Ensure "On Error" is set to "Continue With Error Output" β Check if your IP is blocked by TikTok (use VPN) β Verify headers in "Download Video File" node Problem: Error messages not appearing β Double-check error output connections (red dots) β Make sure "Format Error" node references correct variables β Test by intentionally breaking a node (invalid URL) π Performance Metrics | Metric | Value | |--------|-------| | Average Processing Time | 5-10 seconds | | Success Rate | ~95% (valid public videos) | | Max Video Size | Limited by Telegram (50MB) | | Concurrent Users | Unlimited (webhook-based) | π Privacy & Security β No Data Storage: Videos are streamed directly to users, not stored β No Logging: User IDs and links are processed in-memory only β Secure Headers: Mimics browser requests to avoid detection β Error Sanitization: Sensitive data is filtered from error messages π Technical Stack n8n Version**: 1.116.0+ Node Types Used**: telegramTrigger (v1.2) telegram (v1.2) if (v2.2) set (v3.4) httpRequest (v4.2) code (v2) stickyNote (v1) External APIs**: TikTok CDN, Telegram Bot API π Learning Resources Want to understand the workflow better? Check these concepts: n8n Error Handling Telegram Bot API HTTP Request Headers JavaScript Code Node π€ Contributing Found a bug? Have an improvement idea? Test your changes thoroughly Document any new nodes or logic Share your enhanced workflow with the community Credit the original author (see below) π¨βπ» About the Author Nguyen Thieu Toan n8n Automation Specialist & Workflow Creator π Website: nguyenthieutoan.com π§ Contact: Available on website π― Specialty: Building production-ready n8n workflows for real-world automation > "I create workflows that just work. No fluff, no complexityβjust reliable automation that saves time and solves problems." Other Workflows by Nguyen Thieu Toan π΅ Spotify to YouTube Playlist Converter πΈ Instagram Media Downloader Bot π Multi-Channel Social Media Scheduler π Automated Content Repurposing Pipeline Visit nguyenthieutoan.com for more automation workflows and tutorials. π License & Attribution This workflow is provided free of charge for personal and commercial use. Required Attribution: When sharing or modifying: Include author name and website link When showcasing: Tag @nguyenthieutoan or link to nguyenthieutoan.com Not Required But Appreciated: Star the workflow on n8n community Share your success story Suggest improvements π Version History | Version | Date | Changes | |---------|------|---------| | 2.0 | 2025-10-22 | β’ Added comprehensive error handlingβ’ Improved user feedbackβ’ Added video statisticsβ’ English language supportβ’ Enhanced documentation | | 1.0 | 2025-10-21 | β’ Initial releaseβ’ Basic download functionality | β Support This Work If this workflow saved you time: β Star it on n8n community π’ Share with fellow automation enthusiasts π¬ Leave feedback on nguyenthieutoan.com β Buy me a coffee (link on website) Happy Automating! π Last Updated: October 22, 2025 Workflow Name: TikTok Video Downloader (No Watermark) - Telegram Bot Author: Nguyen Thieu Toan
by Sabrina Ramonov π
This workflow contains community nodes that are only compatible with the self-hosted version of n8n. Description This fully automated AI Avatar Social Media system creates talking head AI clone videos, WITHOUT having to film or edit yourself. It combines n8n, AI agent, HeyGen, and Blotato to research, create, and distribute talking head AI clone videos to every social media platform every single day. This template is ideal for content creators, social media managers, social media agencies, small businesses, and marketers who want to to scale short-form video creation, without manually filming and editing every single video. Overview 1. Trigger: Schedule Configured to run once daily at 10am 2. AI News Research Research viral news from tech-focused forum, Hackernews Fetch the selected news item, plus discussion comments 3. AI Writer AI writes 30-second monologue script AI writes short video caption 4. Create Avatar Video Call Heygen API (requires paid API plan), specifying your avatar ID and voice ID Create avatar video, optionally passing in an image/video background if you have a green screen avatar (matte: true) 5. Get Video Wait awhile, then fetch completed avatar video Upload video to Blotato 6. Publish to Social Media via Blotato Connect your Blotato account Choose your social accounts Either post immediately or schedule for later" π Documentation Full Tutorial Troubleshooting Check your Blotato API Dashboard to see every request, response, and error. Click on a request to see the details. Need Help? In the Blotato web app, click the orange button on the bottom right corner. This opens the Support messenger where I help answer technical questions.
by Oneclick AI Squad
This n8n workflow receives files sent in a Telegram chat, uploads them to Google Drive, extracts text using OCR (for images and PDFs), and stores the extracted content in Airtable for quick search and retrieval. Users can later search through documents using a Telegram /search command. Key Features Accepts images and documents from Telegram Uploads files to Google Drive automatically Detects file type and runs OCR if eligible Extracts text from images & PDFs via Google Vision Stores file metadata + text in Airtable Search documents using /search command in Telegram Sends result previews and file links Error handling & user notifications included Use Cases Personal document vault with search Team knowledge filing system Receipt & invoice OCR archive Legal documents store & retrieval Research papers & notes indexing Company file inbox for AI knowledge base Workflow Steps | Step | Action | Description | | ---- | --------------------- | ---------------------------------------------------------- | | 1 | Telegram Trigger | Detects incoming docs/images or /search command | | 2 | Filter File or Search | Routes based on whether message has file or search command | | 3 | Extract Metadata | Reads file info such as name, MIME type, user | | 4 | Download File | Downloads file via Telegram API | | 5 | Upload to Drive | Saves file in Google Drive | | 6 | OCR Check | Determines if file supports OCR | | 7 | Google OCR | Runs OCR for images/PDFs | | 8 | Extract Text | Pulls text output from OCR | | 9 | Merge OCR Text | Combines file data + text | | 10 | Save to Airtable | Indexes with metadata + text | | 11 | Success Reply | Sends link + success message | | 12 | /search Flow | Parse search query | | 13 | Airtable Search | Full-text search for records | | 14 | Send Results | Sends matches to Telegram | | 15 | Error Handler | Notifies user on failure | Input Formats File Messages Supported Images PDFs Documents Search Command /search keyword Example: /search invoice Output After Upload: β File saved & indexed successfully! π Drive Link: <link> After Search: Returns structured result: File name Preview text snippet Google Drive link Data Stored in Airtable | Field | Description | | ------------- | ------------------------- | | File Name | Original name | | File Link | Google Drive link | | MIME Type | File type | | Telegram User | Sender info | | OCR Text | Extracted searchable text | | Uploaded Date | Timestamp | Technical Requirements Telegram Bot Token Google Drive API connection Google Vision API key Airtable API key & table Benefits Automatically organizes Telegram files Makes PDFs & images searchable Saves manual sorting and indexing time AI-ready data storage (future LLM integration) Fast search experience right in Telegram Enhancement Ideas Add Whisper for voice message transcription Add chat GPT summarization for large docs Build dashboard for uploaded files Auto-tag documents (invoice, ID, receipt etc.) Multi-language OCR support Status β Ready for production β Handles images, PDFs, and files β End-to-end automation π Optional: add more AI enrichment later
by Ahmed Sherif
AI-Powered Lead Scraping Automation using APIFY Scraper and Gemini Filtering to Google Sheets This is a fully automated, end-to-end pipeline designed to solve the challenge of inconsistent and low-quality lead data from large-scale scraping operations. The system programmatically fetches raw lead information from sources like Apollo or via Apify, processes it through an intelligent validation layer, and delivers a clean, deduplicated, and ready-to-use dataset directly into Google Sheets. By integrating Google Gemini for data cleansing, it moves beyond simple presence checks to enforce data hygiene and standardization, ensuring that sales teams only engage with properly formatted and complete leads. This automation eliminates hours of manual data cleaning, accelerates the speed from lead acquisition to outreach, and significantly improves the integrity of the sales pipeline. Features Batch Processing**: Systematically processes up to 1000 leads per batch and automatically loops through the entire dataset. This ensures stable, memory-efficient operation even with tens of thousands of scraped contacts. AI Validation**: Google Gemini acts as a data quality gatekeeper. It validates the presence and plausible format of critical fields (e.g., First Name, Company Name) and cleanses data by correcting common formatting issues. Smart Deduplication**: Before appending a new lead, the system cross-references its email address against the entire Google Sheet to prevent duplicate entries, ensuring a single source of truth. Auto Lead IDs**: Generates a unique, sequential ID for every new lead in the format AP-DDMMYY-xxxx. This provides a consistent reference key for tracking and CRM integration. Data Quality Reports**: Delivers real-time operational visibility by sending a concise summary to a Telegram channel after each batch, detailing success, warning, and error counts. Rate Limiting**: Incorporates a 30-second delay between batches to respect Google Sheets API limits, preventing throttling and ensuring reliable, uninterrupted execution. How It Works The workflow is initiated by an external trigger, such as a webhook, carrying the raw scraped data payload. It authenticates and fetches the complete list of leads from the Apify or Apollo API endpoint. The full list is automatically partitioned into manageable batches of 1000 leads for efficient processing. Each lead is individually passed to the Gemini AI Agent, which validates that required fields like Name, Email, and Company are present and correctly formatted. Validated leads are assigned a unique Lead ID, and all data fields are standardized for consistency. The system performs a lookup in the target Google Sheet to confirm the lead's email does not already exist. Clean, unique leads are appended as a new row to the designated spreadsheet. A completion notice is sent via the Telegram Bot, summarizing the batch results with clear statistics. Requirements Apify/Apollo API access credentials. Google Cloud project with OAuth2 credentials for Google Sheets API access. A configured Telegram Bot with its API Token and a target Chat ID. A Google Gemini API Key for data validation and cleansing. This system is ideal for sales and marketing operations teams managing high-volume lead generation campaigns, providing automated data quality assurance and accelerating pipeline development.
by Evoort Solutions
π₯ TikTok to MP4 Converter with Google Drive & Sheets Convert TikTok videos to MP4 , MP3 (without watermark), upload to Google Drive, and log conversion attempts into Google Sheets automatically β powered by TikTok Download Audio Video API. π Description This n8n automation accepts a TikTok video URL via a form, sends it to the TikTok Download Audio Video API, downloads the watermark-free MP4, uploads it to Google Drive, and logs the result (success/failure) into Google Sheets. π§© Node-by-Node Overview | # | Node | Functionality | |---|-------------------------------|-------------------------------------------------------------------------------| | 1 | π’ Form Trigger | Displays a form for user input of TikTok video URL. | | 2 | π TikTok RapidAPI Request | Calls the TikTok Downloader API to get the MP4 link. | | 3 | π If Condition | Checks if the API response status is "success". | | 4 | β¬οΈ MP4 Downloader | Downloads the video file using the returned "no watermark" MP4 URL. | | 5 | βοΈ Upload to Google Drive | Uploads the video file to Google Drive root folder. | | 6 | π Set Google Drive Permission | Makes the file publicly shareable via link. | | 7 | π Google Sheets (Success) | Logs TikTok URL + public Drive link into a Google Sheet. | | 8 | β±οΈ Wait Node | Delays to prevent rapid write operations on error. | | 9 | π Google Sheets (Failure) | Logs failed attempts with Drive_URL = N/A. | β Use Cases π² Social media managers downloading user-generated content π§ Educators saving TikTok content for offline lessons πΌ Agencies automating short-form video curation π€ Workflow automation demonstrations with n8n π― Key Benefits βοΈ MP4 without watermark via TikTok Download Audio Video API βοΈ Automated Google Drive upload & shareable links βοΈ Centralized logging in Google Sheets βοΈ Error handling and retry-safe structure βοΈ Fully customizable and extendable within n8n π‘ Ideal for anyone looking to automate TikTok video archiving with full control over file storage and access. π How to Get Your API Key for the TikTok Download Audio Video API Go to π TikTok Download Audio Video API - RapidAPI Click "Subscribe to Test" (you may need to sign up or log in). Choose a pricing plan (thereβs a free tier for testing). After subscribing, click on the "Endpoints" tab. Your API Key will be visible in the "x-rapidapi-key" header. π Copy and paste this key into the httpRequest node in your workflow. Create your free n8n account and set up the workflow in just a few minutes using the link below: π Start Automating with n8n
by Open Paws
π― Who's it for ESG analysts, investors, procurement teams, activists and sustainability professionals who need comprehensive, objective assessments of companies' environmental impact and animal welfare policies. Perfect for: Due diligence and investment screening Supplier evaluation and ethical sourcing Compliance reporting and ESG benchmarking Consumer guidance for ethical purchasing decisions β‘ How it works This workflow automates the entire research and analysis process for comprehensive sustainability and animal welfare assessment. Simply input a company name, and the system handles everything: π Multi-Source Research: Calls a specialized subworkflow that queries: Open Paws database for animal welfare data Web scraping for sustainability reports Search engines for recent developments Social media monitoring for real-time insights π€ Parallel AI Analysis: Two specialized chains process data simultaneously: Structured scoring** with percentages and letter grades (A+ to D) Detailed HTML reports** with narrative analysis and insights π Complete Assessment: Final output combines both formats for actionable intelligence on: Environmental policies and carbon footprint Animal welfare practices and ethical sourcing Vegan accommodation and plant-based initiatives π Requirements Prerequisites: Download the research subworkflow from **Multi-Tool Research Agent for Animal Advocacy with OpenRouter, Serper & Open Paws DB and save it in your n8n instance API key for OpenRouter or other AI service provider π How to set up Install Research Subworkflow: First download the Multi-Tool Research Agent for Animal Advocacy with OpenRouter, Serper & Open Paws DB and import it into your n8n instance Configure API Keys: Set up your AI service credentials in the LLM nodes Link Subworkflow: Connect the Research Agent node to reference your installed research subworkflow Test Connection: Verify the research tools and databases are accessible Run Test: Input a well-known company name to validate the complete pipeline π οΈ How to customize the workflow Scoring Weights**: Adjust percentage weightings for environmental impact, animal welfare, and vegan accommodation Research Sources**: Modify the subworkflow to include additional databases or exclude certain sources Output Format**: Customize the HTML report template or JSON schema structure Grading Scale**: Change letter grade thresholds (A+, A, B+, etc.) in the scoring logic Assessment Focus**: Adapt prompts to emphasize specific sustainability or animal welfare aspects for your industry