by Brian Money
**Gmail MCP Server**

Expose Gmail's full API as a single SSE "tool server" endpoint for your AI agents.

**What it does**
- Spins up an MCP Trigger that streams Server-Sent Events to LangChain/n8n AI Agent nodes.
- Maps 20+ common Gmail operations (search, send, reply, draft, label & thread management, mark read/unread, delete, etc.) to ai_tool connections, so agents can invoke them with a simple JSON payload.

**Why you'll love it**
- Agent-ready: plug the SSE URL into any n8n Agent or any other AI tool that speaks MCP and start reasoning over email immediately.
- Extensible: add more GmailTool operations or swap credentials without touching your agent logic.

**How to use**
1. Import the workflow (n8n ≥ v1.88).
2. Set up a gmailOAuth2 credential and select it on the GmailTool nodes.
3. Open the Gmail MCP Server node, copy the SSE URL, and paste it into your AI agent's "Tool Server" field.
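For context, an MCP client such as an n8n AI Agent invokes one of the mapped Gmail operations with a JSON-RPC `tools/call` request over the MCP connection. A minimal sketch is below; the tool name and argument fields are hypothetical and depend on how the GmailTool nodes in this template are named.

```javascript
// Illustrative MCP tool call an agent could emit once connected to the SSE URL.
// "gmail_send_message" and its argument keys are assumptions, not the template's exact names.
const toolCall = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "gmail_send_message",
    arguments: {
      to: "alice@example.com",
      subject: "Follow-up",
      message: "Thanks for the update, replying via my agent.",
    },
  },
};

console.log(JSON.stringify(toolCall, null, 2));
```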
by Automate With Marc
**AI Clone Instagram Influencer Reel Builder + Auto-Post (Heygen + Submagic + Blotato)**

**Description**
Turn an idea into a finished Instagram Reel, end to end, on autopilot. This template generates a compelling Reel script, sends it to Heygen to produce an AI avatar/clone video, applies dynamic on-video captions with Submagic, then uploads and auto-posts to Instagram via Blotato, complete with a tailored AI-written caption. Ideal for creators, agencies, and brands who want consistent short-form output without manual editing.

👉 Watch step-by-step automation builds on YouTube: https://youtu.be/MmZxLuAkqig?si=DRfS89yQlSlbMbfZ

**What This Template Does**
- ✍️ Generates a short-form Reel script from your topic/idea (optimized hook → body → close).
- 🧑‍🎤 Creates an AI avatar video using Heygen (character + voice) from that script.
- 🅰️ Adds stylized overlaid captions using Submagic (template selectable).
- ☁️ Uploads media to Blotato and auto-posts to Instagram Reels.
- 🧠 Writes an IG caption (with hashtags) using an AI Caption Agent tuned for engagement.

**How It Works (Node Flow)**
1. Chat Trigger – send a topic/idea to start the run.
2. Instagram Script Generator (Agent) – creates a 25–30s script (hook → insights → soft CTA).
3. POST to Heygen – generates an avatar video from the script (avatar_id, voice_id, size).
4. Wait & Poll – checks Heygen status until the video is ready.
5. POST to Submagic – creates a project and applies your caption style (e.g., "Hormozi 2").
6. Wait & Poll – retrieves the captioned video URL when completed.
7. Upload media (Blotato) – uploads the final video to your Blotato account.
8. Instagram Caption Agent – produces an on-brand IG caption + hashtag block from the original script.
9. Create Post (Blotato) – publishes to Instagram as a Reel with the AI caption and uploaded media.

**Required Credentials**
- OpenAI (or compatible) – for the script and caption agents.
- Heygen API – HTTP Header Auth (API key).
- Submagic API – HTTP Header Auth (API key).
- Blotato API – account + token.
- (Optional) KodeKey/Base URL if you route OpenAI-compatible models through your gateway.

Best practice: store all secrets in n8n Credentials, not hard-coded in nodes.

**Quick Start**
1. Import the template into n8n.
2. Create/assign credentials for OpenAI (or compatible), Heygen, Submagic, and Blotato.
3. In Heygen, set your avatar_id and voice_id (or swap in your own).
4. In Submagic, set templateName, language, and style preferences.
5. In Blotato, confirm the accountId and instagramMediaType: reel.
6. Run the workflow from the Chat Trigger with a topic (e.g., "3 money habits for 2025").
7. Confirm the Reel shows up in your connected Instagram account.

**Customization Tips**
- Script Persona: adjust the agent system prompt (niche, tone, audience).
- Caption Style: tweak the Caption Agent for hook length, CTAs, and hashtag strategy.
- Heygen Output: change the dimension to 1080×1920 for full-HD vertical.
- Submagic Template: swap templateName to match your brand.
- Posting Targets: extend Blotato to cross-post to TikTok/YouTube Shorts.

**Error Handling & Reliability**
- Uses Wait + status polling for both Heygen and Submagic before downstream steps.
- Includes IF checks to re-poll while processing is not complete.
- Recommendation: add Slack/Email alerts and Retry options for production use.

**Ideal For**
- Solo creators and founders posting daily
- Social media managers and agencies
- Edu/coach brands scaling short-form content
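As a rough illustration of the Wait & Poll step, an n8n Code node (or an IF node with the equivalent expression) could gate on the status response before moving on to Submagic. This is a sketch only; the exact field names returned by Heygen's status endpoint are assumptions here.

```javascript
// Sketch of the re-poll decision after checking the Heygen render status.
// $json.data.status and $json.data.video_url are assumed field names.
const status = $json?.data?.status; // e.g. "processing" | "completed" | "failed"

if (status === "failed") {
  throw new Error("Heygen reported a failed render, stop and alert.");
}

// ready === true → continue to Submagic; false → loop back through the Wait node
return [{
  json: {
    ready: status === "completed",
    videoUrl: $json?.data?.video_url ?? null,
  },
}];
```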
by Yaron Been
This workflow automatically analyzes Reddit comments to understand public sentiment and community reactions. It saves you hours of manual reading by using AI to classify comments as positive, negative, or neutral, providing instant insights into how people feel about any Reddit post.

**Overview**
This workflow scrapes Reddit post comments using Bright Data's web scraping capabilities, then uses Google Gemini AI to analyze the sentiment of each comment. The results are automatically saved to Google Sheets with the comment text, sentiment classification, and reasoning behind each classification.

**Tools Used**
- **n8n**: the automation platform that orchestrates the workflow
- **Bright Data**: for scraping Reddit comments without restrictions or rate limits
- **Google Gemini**: AI model for intelligent sentiment analysis
- **Google Sheets**: for storing and tracking sentiment analysis results

**How to Install**
1. Import the workflow: download the .json file and import it into your n8n instance.
2. Configure Bright Data: add your Bright Data credentials to the scraping nodes.
3. Set up Google Gemini: configure your Google Gemini API credentials.
4. Configure Google Sheets: connect your Google Sheets account and copy the template spreadsheet.
5. Customize: simply paste any Reddit post URL and run the workflow.

**Use Cases**
- **Brand Monitoring**: track sentiment around your brand or products on Reddit
- **Product Managers**: understand user feedback and pain points from Reddit discussions
- **Market Research**: analyze community reactions to news, launches, or announcements
- **Community Managers**: monitor sentiment trends and identify issues early
- **Content Creators**: gauge audience reactions to topics before creating content

**Connect with Me**
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Bright Data: https://get.brightdata.com/1tndi4600b25 (using this link supports my free workflows with a small commission)

#n8n #automation #sentimentanalysis #reddit #brightdata #webscraping #marketresearch #n8nworkflow #workflow #nocode #brandmonitoring #communityanalysis #redditanalytics #customersentiment #sociallistening #aianalysis #publicsentiment #marketintelligence #userresearch #communityinsights #redditmonitoring #sentimenttracking #customervoice #brandreputation #socialmediaanalysis #consumerinsights #feedbackan
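A minimal sketch of the mapping step between Gemini's classification and the Google Sheets append is shown below. The parsed field names (`commentText`, `analysis.sentiment`, `analysis.reasoning`) are assumptions; adjust them to whatever your Gemini node actually returns.

```javascript
// n8n Code node sketch: shape one analyzed comment into the sheet row described above.
// Input field names are assumed, not taken from the actual template nodes.
const comment = $json.commentText ?? "";
const ai = $json.analysis ?? {};

return [{
  json: {
    comment,                               // original Reddit comment text
    sentiment: ai.sentiment ?? "neutral",  // "positive" | "negative" | "neutral"
    reasoning: ai.reasoning ?? "No reasoning returned",
  },
}];
```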
by Khair Ahammed
Streamline your email list hygiene with this automated validation workflow that monitors Google Sheets for new email entries and instantly verifies their deliverability. Perfect for maintaining clean contact databases, reducing bounce rates, and ensuring successful email marketing campaigns.

**Key Features**

📊 Real-Time Processing
- Monitors Google Sheets for new email entries every minute
- Automatic validation triggers when emails are added
- Instant deliverability status updates in the same spreadsheet

🔍 Dual Validation Sources
- Hunter.io API for comprehensive email verification
- EmailValidation.io API as a backup validation service
- Cross-verification for higher accuracy

✅ Smart Filtering
- Skips empty email cells to prevent unnecessary API calls
- Processes only valid email formats
- Handles bulk email list uploads efficiently

📈 Seamless Integration
- Updates the original spreadsheet with validation results
- Preserves existing data while adding deliverability status
- No manual intervention required after setup

**Workflow Components**

Processing flow:
1. Google Sheets Trigger monitors for new rows
2. Filter removes empty email entries
3. Email extraction and formatting
4. Dual API validation (Hunter + EmailValidation.io)
5. Status processing and formatting
6. Automatic spreadsheet updates

**Use Cases**
- Email Marketing: clean lists before campaigns
- Lead Generation: validate new prospect emails
- Database Maintenance: regular email hygiene checks
- CRM Integration: ensure contact data quality

**Setup Requirements**
- Google Sheets with an email column
- Hunter.io API key
- EmailValidation.io API key
- Google Sheets OAuth2 credentials

**Benefits**
- Reduce email bounce rates
- Improve sender reputation
- Save costs on invalid email sends
- Maintain clean contact databases

This workflow transforms manual email validation into an automated process, ensuring your email lists stay clean and deliverable without any manual effort.

Tags: email-validation, google-sheets, hunter-io, data-cleaning, automation
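To illustrate the cross-verification idea, a Code node could reconcile the two API results into a single deliverability status roughly like this. The response field names (`hunterResult.data.status`, `emailValidationResult.state`) are assumptions; check them against the actual payloads returned by Hunter.io and EmailValidation.io before relying on them.

```javascript
// Sketch: combine two validator responses into one status for the sheet update.
// Input keys and response fields are assumptions, not verified API names.
const hunter = $json.hunterResult ?? {};
const ev = $json.emailValidationResult ?? {};

const hunterOk = hunter?.data?.status === "deliverable";
const evOk = ev?.state === "deliverable";

let status;
if (hunterOk && evOk) status = "deliverable";
else if (!hunterOk && !evOk) status = "undeliverable";
else status = "risky"; // the two services disagree: flag for manual review

return [{ json: { email: $json.email, deliverability: status } }];
```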
by Maneeshwar
Stop wasting time on video editing! This template is designed for content creators and marketers who need a fast, scalable way to convert simple text scripts into polished, shareable AI-generated videos and publish them automatically to Instagram. The workflow includes a critical human review step to maintain quality control.

**Prerequisites**
To use this template, you must have the following:
- An active n8n instance.
- An active Blotato account (required for the AI video generation and Instagram upload API access).
- The Blotato n8n node installed on your n8n instance.
- A configured Gmail node to handle the human approval step.

**How It Works (Workflow Breakdown)**
This automation moves your content from a text script to a live Instagram post:
1. Script Input (Form node): the workflow starts by accepting your script via a simple form input.
2. Data Structuring (Information Extractor node): the raw script is processed into a specific JSON schema, so the Blotato Create Content node receives the input format required for AI video generation.
3. AI Video Generation (Blotato node): triggers the Blotato AI video generation; this step runs in queue mode.
4. Status Check Loop (Loop, Wait, IF nodes): since video generation takes time, the workflow enters a loop. It uses a Wait node to pause for 80 seconds, then uses the Blotato Get Video node combined with an IF node to check the generation status. The loop continues until the video is fully generated.
5. Human Approval (Gmail node): once the video is generated, the hosted video link is sent to your Gmail. The email includes two interactive buttons: "Approve" and "Disapprove".
6. Instagram Upload (Blotato node): if you approve the video, the final step uses the Blotato node to automatically upload the finished video as an Instagram Reel.

**Setup Instructions**
- Template download: the full, free template is available here: Template JSON
- Configuration tutorial: for a visual, step-by-step guide on configuring the Blotato node and setting up the workflow, watch the full tutorial: YouTube Tutorial
- It takes about 10 minutes to configure the workflow.
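As an example of the data structuring step, the Information Extractor could be asked to emit an object like the one below before it reaches the Blotato Create Content node. None of these keys are confirmed Blotato fields; they are placeholders to show the kind of schema the extractor would target.

```javascript
// Hypothetical JSON schema target for the Information Extractor node.
// All field names are illustrative assumptions, not the template's actual schema.
const extractedScript = {
  title: "3 money habits for 2025",
  hook: "Most people lose money to habit #2 without noticing.",
  body: "Habit one: automate savings the day you get paid...",
  callToAction: "Follow for more practical money breakdowns.",
  durationSeconds: 30,
};

return [{ json: extractedScript }];
```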
by masaya kawabe
**Who's it for**
Social media managers, creators, and brand accounts that rely on retweets for reach but want an automated, hands-off cleanup after campaigns to keep profiles tidy and on-brand.

**What it does / How it works**
On a schedule, the workflow resolves your handle to a user ID, fetches recent tweets, filters retweets only, and safely unretweets them using batching and delays to respect rate limits. A dedicated CONFIG (Set Fields) node centralizes variables (e.g., target_username, max_results, batch_delay_minutes) so you can adjust behavior without touching logic.

**API endpoints used**
- **GET** /2/users/by/username/{username} – resolve handle → user ID
- **GET** /2/users/{id}/tweets?tweet.fields=created_at,referenced_tweets – fetch recent tweets (identify retweets via referenced_tweets.type === "retweeted")
- **DELETE** /2/users/{id}/retweets/{tweet_id} – unretweet

**Example response payloads**

GET /2/users/by/username/{username}

```json
{
  "data": {
    "id": "2244994945",
    "name": "Twitter Dev",
    "username": "TwitterDev"
  }
}
```

GET /2/users/{id}/tweets (truncated)

```json
{
  "data": [
    {
      "id": "1760000000000000000",
      "text": "RT @someone: …",
      "referenced_tweets": [{ "type": "retweeted", "id": "1759999999999999999" }],
      "created_at": "2025-01-15T09:10:11.000Z"
    }
  ],
  "meta": { "result_count": 20 }
}
```

DELETE /2/users/{id}/retweets/{tweet_id}

```json
{
  "data": { "retweeted": false }
}
```

**Use cases**
- **Brand hygiene**: auto-unretweet promos after 48–72h.
- **Campaign cadence**: remove event retweets once the event ends.
- **Feed freshness**: clear low-priority retweets on a rolling basis.

**How to set up**
1. Open CONFIG (Set Fields) and replace the placeholders:
   - target_username = "your_handle"
   - max_results = 100 (per fetch)
   - batch_delay_minutes = 2 (throttle between batches)
2. Connect X/Twitter credentials in n8n (no keys hard-coded in HTTP nodes).
3. Run once with small values, verify the logs, then enable the schedule.

> Optional enhancements: add a dead-letter path (Error Trigger → Set → Sheets/Email/Slack) and a notification node (e.g., Slack) for execution feedback.
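For reference, the retweet filter between the fetch and delete calls can be a small Code node that uses the `referenced_tweets` field from the example payload above. This is a sketch; adjust the input item shape to match your HTTP node's output.

```javascript
// Keep only items that are retweets, and expose the IDs needed by the DELETE call.
const tweets = $json.data ?? [];

const retweets = tweets.filter(t =>
  (t.referenced_tweets ?? []).some(ref => ref.type === "retweeted")
);

// One n8n item per retweet to remove via DELETE /2/users/{id}/retweets/{tweet_id}
return retweets.map(t => ({
  json: { tweet_id: t.id, created_at: t.created_at },
}));
```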
by Open Paws
**Who's it for 🎯**
This workflow is designed for animal advocacy organizations, activists, and campaigners who want to automatically receive a weekly email update summarizing the latest news and developments related to animal rights, welfare, vegetarianism, and veganism. It can also be easily altered to send daily updates.

**How it works / What it does ⚙️**
Runs on a weekly schedule and uses a multi-tool research agent subworkflow to gather verified news strictly from the past week. It compiles the information and URLs into a clean, well-structured HTML email, then sends it to the specified recipient. URLs are never altered or omitted.

**How to set up 🛠️**
1. Import this workflow into your n8n instance.
2. Add and install the required research subworkflow: Multi-tool Research Agent for Animal Advocacy.
3. Configure API keys in n8n credentials.
4. Set your topics, instructions, and recipient email in the "Set Preferences" node.
5. Adjust the Schedule node to control when emails are sent.
6. Test the full workflow to ensure proper operation.

**Requirements 📋**
- n8n instance with internet access
- Valid API keys
- The Multi-tool Research Agent subworkflow installed
- SMTP or email sending configured

**How to customize 🔧**
- **Update Topics**: change topics in the "Set Preferences" node to focus the research.
- **Update Instructions**: tailor summary style and focus in the preferences node.
- **Email Recipient**: set who receives the update email.
- **Scheduling**: change frequency or time in the Schedule node; duplicate for daily versions with adjusted research parameters.
- **HTML Styling**: modify the "Write HTML" node's template for custom branding or layout.
- **Error Handling**: add workflows to capture and alert on errors for robustness.

Adapt and extend as needed for your advocacy goals!
by AI/ML API | D1m7asis
🧠 **AI Image Generator Bot – Telegram + AI/ML API**

This n8n workflow lets users generate AI images by sending messages to a Telegram bot. Each request is logged in Google Sheets and limited by a daily quota per user. Image prompts are enhanced by an LLM before generation.

**🚀 Features**
- 📩 Telegram-based input
- 🧠 Prompt enhancement with GPT-4o
- 🎨 AI image generation via the flux-pro model (AIMLAPI)
- 🖋 Auto-caption generation
- 📊 Usage tracked per user daily in Google Sheets
- 🔒 Daily request limits
- ✅ Graceful UX for over-limit cases

**🛠 Setup Guide**

1. 📲 Create a Telegram bot
   - Talk to @BotFather
   - Use /newbot → choose a name and username
   - Save the bot token

2. 🔐 Set up credentials in n8n
   - Telegram API: use your bot token
   - Google Sheets: set up via OAuth2 or a Service Account
   - AI/ML API: set up with your API key from aimlapi.com

3. 📗 Prepare the Google Sheet
   - Name: any (e.g., Image bot usage statistic)
   - Sheet: Sheet1
   - Columns: user_id | date | query | result_url
   - Share the sheet with the email of your service/OAuth2 account

4. 🔧 Configure the workflow
   - Open the n8n editor and import the JSON
   - Update the Telegram credential, the Google Sheets credential and Sheet ID, and the AI/ML API credentials

**⚙️ Flow Summary**

| Node | Function |
| --- | --- |
| 📩 Receive Telegram Message | Triggered by a user message |
| 📊 Fetch Usage Logs | Reads today's entries from the sheet |
| 📈 Count Today's Requests | Counts how many generations happened today |
| 🔢 Set Daily Limit | Sets the default limit (5) |
| 🚦 Check Limit Exceeded? | If over the limit → notify the user |
| 🧠 Enhance Prompt | Uses GPT-4o to improve the user's prompt |
| 🎨 Generate Image | Sends the prompt to AIMLAPI for generation |
| 🖋 Describe Image | Generates a caption for the image |
| 📤 Send Image to User | Sends the result back to Telegram |
| 📝 Log Successful Generation | Writes to Google Sheets |

**📁 Data Logging**
Each successful generation is stored in Google Sheets:

| user_id | date | query | result_url |
| --- | --- | --- | --- |

**💡 Example Prompt Flow**
User sends: astronaut cat floating in space
Bot replies:
> Here's your image:
> A majestic feline astronaut drifts through a glittering cosmic void, its helmet reflecting starlight.

The image is sent with the caption.

**🔄 Daily Limit**
- Default: 5 generations/day per Telegram user
- You can change this in the 🔢 Set Daily Limit node

**🧪 Testing**
- Use /execute workflow in Telegram, not "Execute Node" in the editor
- Log test results to the sheet
- Add extra Set nodes for debugging as needed

**📎 Resources**
- 🔗 AI/ML API Docs
- 🖼️ flux-pro Model UI
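A minimal sketch of the "Count Today's Requests" / "Check Limit Exceeded?" logic, assuming the sheet columns listed above (user_id, date, query, result_url), a date stored as YYYY-MM-DD, and the default limit of 5. The `telegramUserId` key is a placeholder for however your trigger exposes the sender's ID.

```javascript
// n8n Code node sketch (run once for all items): count today's usage for this user.
// Assumes "Fetch Usage Logs" feeds one item per sheet row; field names are assumptions.
const DAILY_LIMIT = 5;
const rows = $input.all().map(i => i.json);

const userId = String($input.first().json.telegramUserId ?? "");
const today = new Date().toISOString().slice(0, 10); // "YYYY-MM-DD"

const usedToday = rows.filter(
  r => String(r.user_id) === userId && r.date === today
).length;

return [{
  json: { user_id: userId, usedToday, limitExceeded: usedToday >= DAILY_LIMIT },
}];
```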
by Open Paws
**Who's it for 🎯**
This workflow is designed for animal advocacy campaigners, strategists, and researchers who need detailed intelligence on corporate targets and their key stakeholders, such as executives, investors, and suppliers.

**How it works / What it does ⚙️**
It uses the Multi-tool Research Agent subworkflow to research a target company, extract relevant sub-targets, and then run focused research on each sub-target. It compiles all findings into a detailed HTML report outlining tailored campaign tactics.

**How to set up 🛠️**
1. Import this workflow and the Multi-tool Research Agent subworkflow.
2. Configure API credentials in n8n.
3. Set the target company and campaign details.
4. Test the workflow to verify multi-level research and report generation.

**Requirements 📋**
- n8n instance with internet access
- Valid API keys
- The Multi-tool Research Agent subworkflow installed and linked
- Optional email node for sending reports

**How to customize 🔧**
- Modify target inputs and sub-target extraction for different industries.
- Adjust research prompts in the subworkflow for style or focus.
- Customize the HTML report template for branding.
- Attach an email node to send reports automatically, or route the output as needed.
- Add error handling or branching for campaign specifics.

Use this template to generate strategic, research-driven campaigns with actionable intelligence on complex corporate targets.
by Hashir Bin Waseem
Managing your inbox can feel like a full-time job. Some emails deserve an instant response, others need thoughtful handling, and many don't need a reply at all. This workflow takes that weight off your shoulders by combining AI intelligence with human oversight, so you spend less time sorting and more time focusing on what matters.

**Why This Workflow Matters**
Think about how much energy gets drained just deciding: Should I reply to this now? Is this too sensitive for an automatic response? Or is this just noise I can safely ignore?

This workflow does that decision-making for you. With the help of Google Gemini, it reads each incoming email, categorizes it, and then either:
- Replies instantly with a warm, professional message,
- Prepares a draft reply for you to review,
- Or does nothing if the message is irrelevant.

It's like having a personal assistant who knows when to step in and when to leave things for you.

**Benefits You'll Notice**
- **Clarity in your inbox**: no more second-guessing which emails need your attention.
- **Faster replies**: routine messages get answered automatically in seconds.
- **Peace of mind**: sensitive or complex topics are flagged for your review; you're always in control.
- **Less mental clutter**: by ignoring noise (marketing blasts, spam, automated notifications), you can focus on meaningful conversations.
- **Consistency**: every reply feels polite, professional, and human, without you lifting a finger.

Over time, you'll notice your inbox feels lighter, your response times improve, and your focus shifts back to real work rather than inbox triage.

**How It Works**
1. Gmail Trigger catches every new incoming email.
2. AI Categorizer decides whether the message should be Reply (safe to answer immediately), Draft (needs your review), or Nothing (ignore).
3. AI Writer generates either a ready-to-send reply or a draft that feels natural and professional.
4. Gmail Integration then either sends, drafts, or ignores, based on the AI's decision.

**Use Cases**
- **Customer support**: quick replies for common questions, while important issues get drafted for your review.
- **Freelancers & solopreneurs**: keep clients happy with fast replies, but stay safe on sensitive topics.
- **Personal inbox management**: lighten the load of newsletters, promotions, and low-value emails.

**Requirements**
- An n8n instance (self-hosted or cloud).
- A Gmail account connected via OAuth2 in n8n.
- Google Gemini API access for AI categorization and drafting.
- Basic familiarity with n8n workflows and how to connect credentials.

**FAQ**

Q: Will this replace my judgment entirely?
No. It only automates what's safe. Complex or sensitive emails are always drafted for you to review.

Q: Can I customize how the AI writes replies?
Yes. You can adjust the prompt inside the workflow to match your own style and tone.

Q: What happens if the AI misclassifies an email?
At worst, you'll get a draft instead of an auto-reply. You'll never lose control over sensitive communication.

Q: Does this cost extra to run?
Using Gmail via n8n is free, but you'll need Google Gemini API access, which may come with its own usage limits or costs.

⚠️ Important note: this workflow comes with no support. You need to be comfortable working with n8n, credentials, and AI nodes on your own. It's provided as-is for the community to experiment with and adapt.

Start using it, and you'll quickly see how much lighter your inbox feels.
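Conceptually, the routing step is a three-way branch on the categorizer's label. In n8n this would normally be a Switch node; the sketch below shows the same decision as code for clarity. The label values follow the description above, while the exact field the AI node returns is an assumption.

```javascript
// Sketch: route on the AI Categorizer output ("Reply" | "Draft" | "Nothing").
// $json.category is an assumed output field name.
const category = ($json.category ?? "Nothing").toLowerCase();

let action;
if (category === "reply") action = "send";        // AI Writer → Gmail "Send"
else if (category === "draft") action = "draft";  // AI Writer → Gmail "Create Draft"
else action = "skip";                             // ignore newsletters, spam, notifications

return [{ json: { ...$json, action } }];
```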
by Thomas
"I used to spend hours every week just copy-pasting product descriptions to find the right tariff codes for our international shipments. It was tedious and prone to errors." - Accounting specialist. This workflow eliminates that manual work entirely. It automatically finds customs tariff numbers (also known as HS Codes or "Zolltarifnummern") for your products and enriches your data in Google Sheets. It offers two powerful modes: bulk processing for entire product lists and an on-demand chat interface for quick single lookups. new features added the API score in percentage (80 to 100% is perfect, 70-80% still good) added description of the found HS Code to better verify the accuracy -please keep in mind, that is still a beta https://www.zolltarifnummern.de/services/api What this workflow does Bulk Enrichment from Google Sheets:** Reads a list of product descriptions from a specified Google Sheet. External API Lookup:** For each product, it queries the zolltarifnummern.de API to find the most relevant customs tariff number. Automated Data Update:** Writes the found tariff numbers back into the correct row in your Google Sheet. On-Demand Single Lookup:** Use the integrated Chat Trigger to instantly look up a tariff number for a single product description without leaving n8n. Completion Notification:** Sends a confirmation email via Gmail once the bulk processing job is finished. Nodes Used Google Sheets HTTP Request Loop Over Items (Split in Batches) Set Gmail Chat Trigger Manual Trigger Preparation A Google Sheet prepared with at least two columns: one for your product descriptions (e.g., ProductDescription) and an empty one for the results (e.g., TariffCode). How to set up this workflow Configure Google Sheets (Read): Open the "Read Item Descriptions" node. Select your Google Sheets credentials. Enter your Spreadsheet ID and the name of the sheet containing your product data. Make sure the "Columns to Read" field includes the name of your product description column. Configure Google Sheets (Update): Open the "Write Customs Tariff to Sheet" node. Select the same Google Sheets credentials. Enter the same Spreadsheet ID and Sheet Name. Under Columns, set Matching Columns to your product description column name. This is crucial for updating the correct rows. Configure Email Notification: Open the "Send Completion Email" (Gmail) node. Select your Gmail credentials. In the Send To field, enter the email address where you want to receive the completion notification. Run the Workflow: For Bulk Processing: Activate the workflow and execute the "Abfrage starten" (Start Query) Manual Trigger. For a Single Lookup: Use the Chat Trigger. Open the chat pane, type a product description, and hit send. The workflow will return the suggested tariff number.
by Oneclick AI Squad
This workflow automates flight price comparison across multiple booking platforms (Kayak, Skyscanner, Expedia, Google Flights). It accepts natural language queries, extracts flight details using NLP, scrapes prices in parallel, identifies the best deals, and sends professional email reports with comprehensive price breakdowns and booking links.

**📦 What You'll Get**
A fully functional, production-ready n8n workflow that:
- ✅ Compares flight prices across 4 major platforms (Kayak, Skyscanner, Expedia, Google Flights)
- ✅ Accepts natural language requests ("Flight from NYC to London on March 25")
- ✅ Sends beautiful email reports with the best deals
- ✅ Returns real-time JSON responses for web apps
- ✅ Handles errors gracefully with helpful messages
- ✅ Includes detailed documentation via sticky notes

**🚀 Quick Setup (3 Steps)**

Step 1: Import the workflow into n8n
1. Copy the JSON from the first artifact (the workflow file).
2. Open n8n → go to Workflows.
3. Click "Import from File" → paste the JSON → click Import.
✅ Workflow imported successfully!

Step 2: Set up the Python scraper
On your server (where the n8n SSH nodes will connect):

```bash
# Navigate to your scripts directory
cd /home/oneclick-server2/

# Create the scraper file: copy the entire Python script from the second artifact,
# then save with Ctrl+X, then Y, then Enter
nano flight_scraper.py

# Make it executable
chmod +x flight_scraper.py

# Install required packages
pip3 install selenium

# Install Chrome and ChromeDriver
sudo apt update
sudo apt install -y chromium-browser chromium-chromedriver

# Test the scraper
python3 flight_scraper.py JFK LHR 2025-03-25 2025-03-30 round-trip 1 economy kayak
```

Expected output:

```
Delta|$450|7h 30m|0|10:00 AM|6:30 PM|https://kayak.com/...
British Airways|$485|7h 45m|0|11:30 AM|8:15 PM|https://kayak.com/...
...
```

Step 3: Configure n8n credentials

A. Set up SMTP (for sending emails):
1. In n8n: Credentials → Add Credential → SMTP
2. Fill in the details:
   - Host: smtp.gmail.com
   - Port: 587
   - User: your-email@gmail.com
   - Password: [your App Password]

For Gmail users:
- Enable 2FA: https://myaccount.google.com/security
- Create an App Password: https://myaccount.google.com/apppasswords
- Use the 16-character password in n8n

B. Set up SSH (already configured if you used existing credentials):
- In the workflow, the SSH nodes use credential ilPh8oO4GfSlc0Qy
- Verify the credential exists and points to the correct server
- Update the path if needed: /home/oneclick-server2/

C. Activate the workflow:
- Click the workflow toggle → Active
- ✅ The webhook is now live!
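For reference, the "Aggregate & Analyze Prices" step can split the pipe-delimited lines the scraper prints in Step 2 into structured flight records. A sketch following the field order of the example output above (airline | price | duration | stops | departure | arrival | link); it assumes the SSH node exposes the script output as `$json.stdout` and that a `platform` field was set earlier, both of which you should confirm against your own nodes.

```javascript
// Sketch: parse one platform's stdout (pipe-delimited lines) into flight objects.
const platform = $json.platform ?? "kayak";
const lines = ($json.stdout ?? "").split("\n").filter(l => l.includes("|"));

const flights = lines.map(line => {
  const [airline, price, duration, stops, departure, arrival, link] = line.split("|");
  return {
    platform,
    airline: airline.trim(),
    price: Number(price.replace(/[^0-9.]/g, "")), // "$450" -> 450
    duration: duration.trim(),
    stops: Number(stops),
    departure: departure.trim(),
    arrival: arrival.trim(),
    link: link.trim(),
  };
});

// Cheapest first, so the best deal is flights[0]
flights.sort((a, b) => a.price - b.price);

return flights.map(f => ({ json: f }));
```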
**🎯 How to Use**

Method 1: Direct webhook call

```bash
curl -X POST https://your-n8n-domain.com/webhook/flight-price-compare \
  -H "Content-Type: application/json" \
  -d '{
    "message": "Flight from Mumbai to Dubai on 15th March, round-trip returning 20th March",
    "email": "user@example.com",
    "name": "John Doe"
  }'
```

Response:

```json
{
  "success": true,
  "message": "Flight comparison sent to user@example.com",
  "route": "BOM → DXB",
  "bestPrice": 450,
  "airline": "Emirates",
  "totalResults": 18
}
```

Method 2: Natural language queries

The workflow understands various formats. ✅ All of these work:
- "Flight from New York to London on 25th March, one-way"
- "NYC to LHR March 25 round-trip return March 30"
- "I need a flight from Mumbai to Dubai departing 15th March"
- "JFK LHR 2025-03-25 2025-03-30 round-trip"

Supported cities (auto-converted to airport codes):
- New York → JFK
- London → LHR
- Mumbai → BOM
- Dubai → DXB
- Singapore → SIN
- And 20+ more cities

Method 3: Structured JSON

```json
{
  "from": "JFK",
  "to": "LHR",
  "departure_date": "2025-03-25",
  "return_date": "2025-03-30",
  "trip_type": "round-trip",
  "passengers": 1,
  "class": "economy",
  "email": "user@example.com",
  "name": "John"
}
```

**📧 Email Report Example**

Users receive an email like this:

```
FLIGHT PRICE COMPARISON

Route: JFK → LHR
Departure: 25 Mar 2025
Return: 30 Mar 2025
Trip Type: round-trip
Passengers: 1

🏆 BEST DEAL
British Airways
Price: $450
Duration: 7h 30m
Stops: Non-stop
Platform: Kayak
💰 Save $85 vs highest price!

📊 ALL RESULTS (Top 10)
British Airways - $450 (Non-stop) - Kayak
Delta - $475 (Non-stop) - Google Flights
American Airlines - $485 (Non-stop) - Expedia
Virgin Atlantic - $495 (Non-stop) - Skyscanner
United - $520 (1 stop) - Kayak
...

Average Price: $495
Total Results: 23

Prices subject to availability. Happy travels! ✈️
```

**🔧 Customization Options**

Change scraping platforms

Add more platforms:
1. Duplicate an SSH scraping node.
2. Change the platform parameter: kayak → new-platform.
3. Add the scraping logic in flight_scraper.py.
4. Connect the node to "Aggregate & Analyze Prices".

Remove platforms:
1. Delete the unwanted SSH node.
2. The workflow continues with the remaining platforms.

Modify the email format

Edit the "Format Email Report" node:

```javascript
// Change to HTML format
const html = `
<!DOCTYPE html>
<html>
<body>
  Flight Deals
  Best price: ${bestDeal.currency}${bestDeal.price}
</body>
</html>
`;

return [{
  json: {
    subject: "...",
    html: html, // instead of text
    ...data
  }
}];
```

Then update the "Send Email Report" node:
- Change emailFormat to html
- Use {{$json.html}} instead of {{$json.text}}

Add more cities/airports

Edit the "Parse & Validate Flight Request" node:

```javascript
const airportCodes = {
  ...existing codes...,
  'berlin': 'BER',
  'rome': 'FCO',
  'barcelona': 'BCN',
  // Add your cities here
};
```

Change timeout settings

In each SSH node, add:

```json
"timeout": 30000  // 30 seconds
```

**🐛 Troubleshooting**

Issue: "No flights found"

Possible causes:
- Scraper script not working
- Website structure changed
- Dates in the past
- Invalid airport codes

Solutions:

```bash
# Test the scraper manually
cd /home/oneclick-server2/
python3 flight_scraper.py JFK LHR 2025-03-25 "" one-way 1 economy kayak
```

- Check whether the output shows flights
- If there is no output, check the Chrome/ChromeDriver installation

Issue: "Connection refused" (SSH)

Solutions:
- Verify the SSH credentials in n8n
- Check the server is accessible: ssh user@your-server
- Verify the path exists: /home/oneclick-server2/
- Check Python is installed: which python3

Issue: "Email not sending"

Solutions:
- Verify the SMTP credentials
- Check the email in the spam folder
- For Gmail: confirm an App Password is used (not the regular password)
- Test the SMTP connection: telnet smtp.gmail.com 587

Issue: "Webhook not responding"
Solutions:
- Ensure the workflow is Active (toggle on)
- Check the webhook path: /webhook/flight-price-compare
- Test with the curl command (see the "How to Use" section)
- Check the n8n logs: Settings → Log Streaming

Issue: "Scraper timing out"

Solutions:

```python
# In flight_scraper.py, increase the wait times
time.sleep(10)  # instead of time.sleep(5)

# Or increase the WebDriverWait timeout
WebDriverWait(driver, 30)  # instead of 20
```

**📊 Understanding the Workflow**

Node-by-node explanation:

1. Webhook - Receive Flight Request
   - Entry point for all requests
   - Accepts POST requests
   - Path: /webhook/flight-price-compare

2. Parse & Validate Flight Request
   - Extracts flight details from natural language
   - Converts city names to airport codes
   - Validates required fields
   - Returns helpful errors if data is missing

3. Check If Request Valid
   - Routes to scraping if valid
   - Routes to the error response if invalid

4-7. Scrape [Platform] (4 nodes)
   - Run in parallel for speed
   - Each calls the Python script with a platform parameter
   - Continue on failure (don't break the workflow)
   - Return pipe-delimited flight data

8. Aggregate & Analyze Prices
   - Collects all scraper results
   - Parses the flight data
   - Finds the best overall deal
   - Finds the best non-stop flight
   - Calculates statistics
   - Sorts by price

9. Format Email Report
   - Creates a readable text report
   - Includes route details
   - Highlights the best deal
   - Lists the top 10 results
   - Shows statistics

10. Send Email Report
    - Sends the formatted email to the user
    - Uses the SMTP credentials

11. Webhook Response (Success)
    - Returns a JSON response immediately
    - Includes a best-price summary
    - Confirms the email was sent

12. Webhook Response (Error)
    - Returns a helpful error message
    - Guides the user on what's missing

**🎨 Workflow Features**

✅ Included features:
- **Natural Language Processing**: understands flexible input formats
- **Multi-Platform Comparison**: 4 major booking sites
- **Parallel Scraping**: all platforms scraped simultaneously
- **Error Handling**: graceful failures, helpful messages
- **Email Reports**: professional format with all details
- **Real-Time Responses**: instant webhook feedback
- **Sticky Notes**: detailed documentation in the workflow
- **Airport Code Mapping**: auto-converts 20+ cities

🚧 Not included (easy to add):
- **Price Alerts**: monitor price drops (add Google Sheets)
- **Analytics Dashboard**: track searches (add Google Sheets)
- **SMS Notifications**: send via Twilio
- **Slack Integration**: post to channels
- **Database Logging**: store searches in PostgreSQL
- **Multi-Currency**: show prices in the user's currency

**💡 Pro Tips**

Tip 1: Speed up scraping

Use a faster scraping service (like ScraperAPI):

```javascript
// Replace SSH nodes with HTTP Request nodes
{
  "url": "http://api.scraperapi.com",
  "qs": {
    "api_key": "YOUR_KEY",
    "url": "https://kayak.com/flights/..."
  }
}
```

Tip 2: Cache results

Add caching to avoid duplicate scraping:

```javascript
// In the Parse node, check the cache first
const cacheKey = `${origin}-${dest}-${departureDate}`;
const cached = await $cache.get(cacheKey);
if (cached && Date.now() - cached.time < 3600000) {
  return cached.data; // use a 1-hour cache
}
```

Tip 3: Add more platforms

Easy to add Momondo, CheapOair, etc.:
1. Add a function in flight_scraper.py
2. Add an SSH node in the workflow
3. Connect it to the aggregator

Tip 4: Improve date parsing

Handle more formats:

```javascript
// Add to the Parse node
const formats = [
  'DD/MM/YYYY',
  'MM-DD-YYYY',
  'YYYY.MM.DD',
  // Add your formats
];
```