by Zach @BrightWayAI
### Who's it for
Content creators, researchers, educators, and digital marketers who need to discover high-quality YouTube training videos on specific topics. Perfect for building curated learning resource lists, competitive research, or content inspiration.

### What it does
This workflow automatically searches YouTube using multiple search queries, filters for quality content, scores videos by relevance, and exports the top results to Google Sheets. It processes hundreds of videos and delivers only the most valuable educational content ranked by custom relevance criteria.

The workflow searches for videos using 10 different AI automation-related queries (easily customizable), filters out low-quality content like shorts and clickbait, then ranks results based on title keywords, view counts, and engagement metrics.

### How it works
- **Multi-query search**: Searches YouTube with an array of related queries to get comprehensive coverage
- **Content filtering**: Removes shorts, spam, and low-quality videos using regex patterns
- **Quality assessment**: Filters videos based on view count, likes, and publication date
- **Relevance scoring**: Assigns scores based on title keywords and engagement metrics
- **Result ranking**: Sorts videos by relevance score and limits to the top 50 results
- **Export to Sheets**: Delivers clean, organized data to Google Sheets with all metadata

### Requirements
- YouTube Data API v3 credentials from Google Cloud Console
- Google Sheets credentials for your n8n workspace
- A Google Sheets document to receive the results

### How to set up
1. Enable YouTube Data API v3 in your Google Cloud Console
2. Add YouTube OAuth2 credentials to your n8n workspace
3. Add Google Sheets credentials to your n8n workspace
4. Create a Google Sheet and update the Google Sheets node with your document ID
5. Customize search queries in the "Set Query" node for your topic
6. Adjust filtering criteria in the Filter nodes based on your quality requirements

### How to customize the workflow
- **Search topics**: Modify the query array in the "Set Query" node to research any topic:

```javascript
[
  "Python tutorial",
  "JavaScript course",
  "React beginner guide",
  // Add your queries here
]
```

- **Quality thresholds**: Adjust minimum views, likes, and date ranges in the "Filter for Quality" node
- **Relevance scoring**: Customize keyword weightings in the "Relevance Score" node to match your priorities
- **Result limits**: Change the number of final results in the "Limit" node (default: 50)
- **Output format**: Modify the "Set Fields" node to include additional YouTube metadata like duration, thumbnails, or category information

The workflow is designed to be easily adaptable for any research topic while maintaining high content quality standards.
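The scoring step described above can be sketched as an n8n Code-node-style snippet. This is a minimal illustration, not the template's actual node: the keyword list, weights, and field names (`title`, `viewCount`, `likeCount`) are assumptions you would adapt to your own criteria.

```javascript
// Hypothetical relevance scoring: weighted title keywords plus
// log-scaled views and a like-ratio bonus.
const KEYWORDS = { tutorial: 3, course: 3, guide: 2, beginner: 2, automation: 1 };

function relevanceScore(video) {
  const title = video.title.toLowerCase();
  let score = 0;
  // Title keyword weighting
  for (const [word, weight] of Object.entries(KEYWORDS)) {
    if (title.includes(word)) score += weight;
  }
  // Engagement: log-scaled views plus like ratio
  score += Math.log10((video.viewCount || 0) + 1);
  score += 5 * ((video.likeCount || 0) / ((video.viewCount || 0) + 1));
  return score;
}

const videos = [
  { title: "AI Automation Tutorial for Beginners", viewCount: 120000, likeCount: 4000 },
  { title: "Random vlog", viewCount: 500, likeCount: 10 },
];
const ranked = videos
  .map(v => ({ ...v, score: relevanceScore(v) }))
  .sort((a, b) => b.score - a.score)
  .slice(0, 50);
console.log(ranked[0].title);
```

Tuning the `KEYWORDS` map is the main lever: topic-specific words get higher weights, so engagement metrics only break ties between equally relevant titles.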
by InfyOm Technologies
### ✅ What problem does this workflow solve?
If you're using a self-hosted n8n instance, there's no built-in version history or undo for your workflows. If a workflow is accidentally modified or deleted, there's no way to roll back. This backup workflow solves that problem by automatically syncing your workflows to Google Drive, giving you version control and peace of mind.

### ⚙️ What does this workflow do?
- ⏱ Runs on a set schedule (e.g., daily or every 12 hours).
- 🔍 Fetches all workflows from your self-hosted n8n instance.
- 🧠 Detects changes to avoid duplicate backups.
- 📁 Creates a dedicated folder for each workflow in Google Drive.
- 💾 Uploads new or updated workflow files in JSON format.
- 🗃️ Keeps backup history organized by date.
- 🔄 Allows for easy restore by importing backed-up JSON into n8n.

### 🔧 Setup Instructions
1. **Google Drive Setup**: Connect your Google Drive account using the Google Drive node in n8n. Choose or create a root folder (e.g., `n8n-workflow-backups`) where backups will be stored.
2. **n8n API Credentials**: Generate a Personal Access Token from your self-hosted n8n instance: go to Settings → API in your n8n dashboard, copy the token, and use it in the HTTP Request node headers as `Authorization: Bearer <your_token>`.
3. **Schedule the Workflow**: Use the Cron node to schedule this workflow to run at your desired frequency (e.g., once a day or every 12 hours).

### 🧠 How it Works
Step-by-step flow:
1. **Scheduled Trigger**: The workflow begins on a timed schedule using the Cron node.
2. **Fetch All Workflows**: Uses the n8n API (`/workflows`) to retrieve a list of all existing workflows.
3. **Loop Through Workflows**: For each workflow, a folder is created in Google Drive using the workflow name, and the workflow's last-updated timestamp is checked against the Google Drive backups.
4. **Smart Change Detection**: If the workflow has changed since the last backup, a new `.json` file is uploaded to the corresponding folder. The file is named with the last-updated date of the workflow (`YYYY-MM-DD-HH-mm-ss.json`) to maintain a versioned history. If no change is detected, the workflow is skipped.

### 🗂 Google Drive Folder Organization
Backups are neatly organized by workflow and version:

```
/n8n-workflow-backups/
├── google-drive-backup-KqhdMBHIyAaE7p7v/
│   ├── 2025-07-15-13-03-32.json
│   ├── 2025-07-14-03-08-12.json
├── resume-video-avatar-KqhdMBHIyAaE8p8vr/
│   ├── 2025-07-15-23-05-52.json
```

Each folder is named after the workflow's name + ID and contains timestamped versions.

### 🔧 Customization Options
- 📅 **Change Backup Frequency**: Adjust the Cron node to run backups daily, weekly, or even hourly based on your needs.
- 📤 **Use a Different Storage Provider**: You can swap out Google Drive for Dropbox, S3, or another cloud provider with minimal changes.
- 🧪 **Add Workflow Filtering**: Only back up workflows that are active or match specific tags by filtering results from the n8n API.

### ♻️ How to Restore a Workflow from Backup
1. Go to the Google Drive backup folder for the workflow you want to restore.
2. Download the desired `.json` file (based on the date).
3. Open your self-hosted n8n instance.
4. Click **Import Workflow** from the sidebar menu.
5. Upload the JSON file to restore the workflow.

> You can choose to overwrite an existing workflow or import it as a new one.

### 👤 Who can use this?
This template is ideal for:
- 🧑‍💻 Developers running self-hosted n8n
- 🏢 Teams managing large workflow libraries
- 🔐 Anyone needing workflow versioning, rollback, or disaster recovery
- 💾 Productivity enthusiasts looking for automated backups

### 📣 Tip
Consider enabling version history in Google Drive so you get even more fine-grained backup recovery options on top of what this workflow provides!

🚀 **Ready to use?** Just plug in your n8n token, connect Google Drive, and schedule your backups. Your workflows are now protected!
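The change-detection step above can be sketched as follows. This is an assumption-laden illustration: it supposes each backup file is named after the workflow's `updatedAt` timestamp, so a workflow needs a new backup only when no file with that name exists yet.

```javascript
// Hypothetical helper: derive the versioned filename from the workflow's
// updatedAt timestamp, e.g. "2025-07-15T13:03:32.000Z" -> "2025-07-15-13-03-32.json".
function backupFileName(updatedAt) {
  return updatedAt.slice(0, 19).replace("T", "-").replace(/:/g, "-") + ".json";
}

// A workflow has changed iff its derived filename is absent from the
// Drive folder's existing file list.
function needsBackup(workflow, existingFileNames) {
  return !existingFileNames.includes(backupFileName(workflow.updatedAt));
}

const existing = ["2025-07-14-03-08-12.json"];
const wf = { name: "google-drive-backup", updatedAt: "2025-07-15T13:03:32.000Z" };
console.log(backupFileName(wf.updatedAt)); // "2025-07-15-13-03-32.json"
console.log(needsBackup(wf, existing));    // true
```

Encoding the timestamp in the filename keeps the comparison stateless: no extra metadata store is needed, because the Drive listing itself records which versions already exist.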
by Javier Hita
Follow me on LinkedIn for more!

**Category**: Lead Generation, Data Collection, Business Intelligence
**Tags**: lead-generation, google-maps, rapidapi, business-data, contact-extraction, google-sheets, duplicate-prevention, automation
**Difficulty Level**: Intermediate
**Estimated Setup Time**: 15-20 minutes

## Template Description

### Overview
This powerful n8n workflow automates the extraction of comprehensive business information from Google Maps using keyword-based searches via RapidAPI's Local Business Data service. Perfect for lead generation, market research, and competitive analysis, this template intelligently gathers business data including contact details, social media profiles, and location information while preventing duplicates and optimizing API usage.

### Key Features
- 🔍 **Keyword-Based Google Maps Scraping**: Search for any business type in any location using natural language queries
- 📧 **Contact Information Extraction**: Automatically extracts emails, phone numbers, and social media profiles (LinkedIn, Instagram, Facebook, etc.)
- 🚫 **Smart Duplicate Prevention**: Two-level duplicate detection saves 50-80% on API costs by skipping processed searches and preventing duplicate business entries
- 📊 **Google Sheets Integration**: Seamless data storage with automatic organization and structure
- 🌍 **Multi-Location Support**: Process multiple cities, regions, or countries in a single workflow execution
- ⚡ **Rate Limiting & Error Handling**: Built-in delays and error handling ensure reliable, uninterrupted execution
- 💰 **Cost Optimization**: Intelligent batching and duplicate prevention minimize API usage and costs
- 📱 **Comprehensive Data Collection**: Gather business names, addresses, ratings, reviews, websites, verification status, and more

### Prerequisites
Required services & accounts:
- RapidAPI account with a subscription to the "Local Business Data" API
- Google account for Google Sheets integration
- n8n instance (cloud or self-hosted)

Required credentials:
- **RapidAPI HTTP Header Authentication** for the Local Business Data API
- **Google Sheets OAuth2** for data storage and retrieval

### Setup Instructions

**Step 1: RapidAPI Configuration**
1. Create a RapidAPI account: sign up at RapidAPI.com, navigate to the "Local Business Data" API, and subscribe to a plan (the Basic plan supports 1000 requests/month).
2. Get API credentials: copy your `X-RapidAPI-Key` from the API dashboard and note the host: `local-business-data.p.rapidapi.com`.
3. Configure the n8n credential: in n8n, go to Settings → Credentials → Create New, choose type "HTTP Header Auth", name it "RapidAPI Local Business Data", and add the headers:
   - `X-RapidAPI-Key: YOUR_API_KEY`
   - `X-RapidAPI-Host: local-business-data.p.rapidapi.com`

**Step 2: Google Sheets Setup**
1. Enable the Google Sheets API: go to the Google Cloud Console, enable the Google Sheets API for your project, and create OAuth2 credentials.
2. Configure the n8n credential: in n8n, go to Settings → Credentials → Create New, choose type "Google Sheets OAuth2 API", and follow the OAuth2 setup process.
3. Create the Google Sheet structure: create a new Google Sheet with these tabs.

`keyword_searches` sheet:

| select | query | lat | lon | country_iso_code |
|--------|-------|-----|-----|------------------|
| X | Restaurants Madrid | 40.4168 | -3.7038 | ES |
| X | Hair Salons Brooklyn | 40.6782 | -73.9442 | US |
| X | Coffee Shops Paris | 48.8566 | 2.3522 | FR |

`stores_data` sheet: the workflow will automatically create columns for business data including `business_id`, `name`, `phone_number`, `email`, `website`, `full_address`, `rating`, `review_count`, `linkedin`, `instagram`, `query`, `lat`, `lon`, and 25+ more fields.

**Step 3: Workflow Configuration**
1. Import the workflow: copy the provided JSON and use Import from JSON in n8n.
2. Update placeholder values: replace `YOUR_GOOGLE_SHEET_ID` with your actual Google Sheet ID, and update credential references to match your setup.
3. Configure search parameters (optional):
   - `limit`: 1-100 results per query (default: 100)
   - `zoom`: 10-18 search radius (default: 13)
   - `language`: EN, ES, FR, etc. (default: EN)

### How It Works

**Workflow process:**
1. **Load Search Criteria**: Reads queries marked with "X" from the `keyword_searches` sheet
2. **Load Existing Data**: Retrieves previously processed data for duplicate detection
3. **Filter New Searches**: A smart merge identifies only new query+location combinations
4. **Process Each Location**: Sequential processing prevents API overload
5. **Configure Parameters**: Prepares search parameters from sheet data
6. **API Request**: Calls RapidAPI to extract business information
7. **Parse Data**: Structures and cleans all business information
8. **Save Results**: Stores new leads in the `stores_data` sheet
9. **Rate Limiting**: 10-second delay between requests
10. **Loop**: Continues until all new searches are processed

**Duplicate prevention logic:**
- **Search level**: Compares new queries against existing data using the query+latitude combination, skipping already-processed searches.
- **Business level**: Each business receives a unique `business_id` to prevent duplicate entries even across different searches.
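The two-level duplicate prevention can be sketched as below. This is a simplified stand-in for the template's merge nodes; the row shapes mirror the sheet columns described above, but the exact field handling in the real workflow may differ.

```javascript
// Search level: deduplicate on the query+latitude combination.
function filterNewSearches(selected, existingRows) {
  const seen = new Set(existingRows.map(r => `${r.query}|${r.lat}`));
  return selected.filter(s => !seen.has(`${s.query}|${s.lat}`));
}

// Business level: deduplicate on the unique business_id.
function filterNewBusinesses(results, existingRows) {
  const seen = new Set(existingRows.map(r => r.business_id));
  return results.filter(b => !seen.has(b.business_id));
}

const existing = [{ query: "Restaurants Madrid", lat: 40.4168, business_id: "b1" }];
const selected = [
  { query: "Restaurants Madrid", lat: 40.4168 }, // already processed -> skipped
  { query: "Coffee Shops Paris", lat: 48.8566 }, // new -> kept
];
console.log(filterNewSearches(selected, existing).length); // 1
```

Skipping a search before the HTTP Request node fires is what produces the API-cost savings: a duplicate caught here costs zero requests.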
### Data Extracted

**Business information:**
- Business name, full address, phone number
- Website URL, Google My Business rating and review count
- Business type, price level, verification status
- Geographic coordinates (latitude/longitude)
- Detailed location breakdown (street, city, state, country, zip)

**Contact details:**
- Email addresses (when publicly available)
- Social media profiles: LinkedIn, Instagram, Facebook, Twitter, YouTube, TikTok, Pinterest
- Additional phone numbers
- Direct Google Maps and reviews links

**Search metadata:**
- Original search query and parameters
- Extraction timestamp and geographic data
- API response details for tracking

### Use Cases

**Lead generation:**
- Generate targeted prospect lists for B2B sales
- Build location-specific customer databases
- Create industry-specific contact lists
- Develop territory-based sales strategies

**Market research:**
- Analyze competitor density in target markets
- Study business distribution
by Hybroht
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

# JSON Architect - Dynamically Generate JSON Output Formats for Any AI Agent

### Overview
**Version**: 1.0

The JSON Architect Workflow is designed to instruct AI agents on the required JSON structure for a given context and create the appropriate JSON output format. This workflow ensures that the generated JSON is validated and tested, providing a reliable JSON output format for use in various applications.

### ✨ Features
- **Dynamic JSON Generation**: Automatically generate the JSON format based on the input requirements.
- **Validation and Testing**: Validate the generated JSON format and test its functionality, ensuring reliability before output.
- **Iterative Improvement**: If the generated JSON is invalid or fails testing, the workflow will attempt to regenerate it until it succeeds or a defined maximum number of rounds is reached.
- **Structured Output**: The final output is the generated JSON output format, making it easy to integrate with other systems or workflows.

### 👤 Who is this for?
This workflow is ideal for developers, data scientists, and businesses that require dynamic JSON structures for the responses of AI agents. It is particularly useful for those involved in procedural generation, data interchange formats, configuration management, and machine learning model input/output.

### 💡 What problem does this solve?
The workflow addresses the challenge of generating optimal JSON structures by automating the process of creation, validation, and testing. This approach ensures that the JSON format is appropriate for its intended use, reducing errors and enhancing the overall quality of data interchange.
Use-case examples:
- 🔄 Data Interchange Formats
- 🛠️ Procedural Generation
- 📊 Machine Learning Model Input/Output
- ⚙️ Configuration Management

### 🔍 What this workflow does
The workflow orchestrates a process where AI agents generate, validate, and test JSON output formats based on the provided input. This approach leads to a more refined and functional JSON output parser.

### 🔄 Workflow Steps
1. **Input & Setup**: The initial input is provided, and the workflow is configured with the necessary parameters.
2. **Round Start**: Initiates a round of JSON construction, ensuring the input is as expected.
3. **JSON Generation & Validation**: Generates and validates the JSON output format according to the input.
4. **JSON Test**: Verifies whether the generated JSON output format works as intended.
5. **Validation or Test Fails**: If the JSON fails validation or testing, the process loops back to the Round Start for correction.
6. **Final Output**: The final output is generated based on successful JSON construction, providing a cohesive response.

### 📌 Expected Input
- **input**: The input that requires a proper JSON structure.
- **max_rounds**: The maximum number of rounds before stopping the loop if it fails to produce and test a valid JSON structure. Suggested: 10.
- **rounds**: The initial number of rounds. Default: 0.

### 📦 Expected Output
- **input**: The original input used to create the JSON structure.
- **json_format_name**: A snake_case identifier for the generated JSON format. Useful if you plan to reuse it for multiple AI agents or workflows.
- **json_format_usage**: A description of how to use the JSON output format in an input. Meant to be used by AI agents receiving the JSON output format in their output parser.
- **json_format_valid_reason**: The reason provided by the AI agents explaining why this JSON format works for the input.
- **json_format_structure**: The JSON format itself, intended for application through the **Advanced JSON Output Parser** custom node.
- **json_format_input**: The input after the JSON output format (`json_format_structure`) has been applied in an AI agent's output parser.

### 📌 Example
An example that includes both the input and the final output is provided in a note within the workflow.

### ⚙️ n8n Setup Used
- **n8n Version**: 1.100.1
- **n8n-nodes-advanced-output-parser**: 1.0.1
- **Running n8n via**: Podman 4.3.1
- **Operating System**: Linux

### ⚡ Requirements to Use/Setup

**🔐🔧 Credentials & Configuration**
Obtain the necessary LLM API key and permissions to utilize the workflow effectively. This workflow depends on a custom node for dynamically inputting JSON output formats called `n8n-nodes-advanced-output-parser`. You can find the repository here.

**Warning**: As of 2025-07-09, the custom node's creator has stated that this node is not production-ready. Exercise caution before using it in production environments.

### ⚠️ Notes, Assumptions & Warnings
- This workflow assumes that users have a basic understanding of n8n and JSON configuration.
- This workflow assumes that users have access to the necessary API keys and permissions to utilize the Mistral API or other LLM APIs.
- Ensure that the input provided to the AI agents is clear and concise to avoid confusion in the JSON generation process. Ambiguous inputs may lead to invalid or irrelevant JSON output formats.

### ℹ️ About Us
This workflow was developed by the Hybroht team of AI enthusiasts and developers dedicated to enhancing the capabilities of AI through collaborative processes. Our goal is to create tools that harness the possibilities of AI technology and more.
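The generate-validate-test round loop described in the Workflow Steps can be sketched as plain code. This is a minimal stand-in under stated assumptions: `generate` and `isValid` represent the AI-agent and test nodes, and structural validation is reduced to a `JSON.parse` check.

```javascript
// Hypothetical round loop with a max_rounds guard, mirroring the
// "loop back to Round Start on failure" behavior.
function buildJsonFormat(input, maxRounds, generate, isValid) {
  for (let rounds = 1; rounds <= maxRounds; rounds++) {
    const candidate = generate(input, rounds);
    try {
      JSON.parse(candidate); // structural validation
      if (isValid(candidate)) {
        return { ok: true, rounds, json_format_structure: candidate };
      }
    } catch (e) {
      // invalid JSON -> loop back to the round start
    }
  }
  return { ok: false, rounds: maxRounds };
}

// Stand-in generator that only produces valid JSON on the second round:
const generate = (input, round) =>
  round < 2 ? "{ not json" : JSON.stringify({ answer: { type: "string" } });
const result = buildJsonFormat("describe an answer field", 10, generate, () => true);
console.log(result.ok, result.rounds); // true 2
```

The `max_rounds` cap is what keeps the workflow from looping forever on an ambiguous input, which is why the template suggests a value like 10.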
by Jan Willem Altink
This workflow provides a secure API endpoint to remotely trigger other n8n workflows with custom data and to retrieve information about your existing workflows. It's perfect for users who want to integrate n8n into external systems or programmatically manage their automations.

**Example usage**: I use this workflow in a Raycast extension I have built to execute n8n workflows from within Raycast: see GitHub.

### How it works
- **Receives API calls**: A webhook listens for incoming HTTP requests (e.g., POST to trigger, GET to retrieve info).
- **Triggers workflows**: If the request is to trigger a workflow, it dynamically identifies the target workflow ID (from query parameters) and any input data (from the request body), then executes that workflow. This means you can control any of your workflows without modifying this manager template.
- **Retrieves workflow info**: Similarly, if the request is to get information, it dynamically uses query parameters (`workflowId`, `mode`, `includedWorkflows`) to fetch details about one or more n8n workflows (e.g., specific, all, active, inactive; full or summarized data).
- **Responds**: Sends back a JSON response indicating success/failure or the requested workflow data.

### Set it up
1. **Configure webhook security**: Set up "Header Auth" credentials for the main Webhook node. This is the API key your external services will use.
2. **Add n8n API credentials**: For the nodes that fetch workflow information (like "Get specific workflowid", "get all active workflows", etc.), connect your n8n API credentials. This allows the workflow to query your n8n instance.
3. **Note your webhook URL**: Once active, n8n provides a production URL for the webhook (path: `workflow-manager`). Use this URL to make API calls.
4. **Understand API parameters**:
   - To trigger: use `?workflowId=[ID_OF_WORKFLOW_TO_RUN]` and send JSON data in the request body.
   - To get info: use parameters like `?workflowId=[ID]`, `?includedWorkflows=[all/active/inactive]`, and `?mode=[full/summary]`.
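A client-side sketch of calling this endpoint might look like the following. The base URL and the `X-Api-Key` header name are assumptions; substitute your instance's URL and whatever header name you configured in the Header Auth credential.

```javascript
// Hypothetical helpers that build requests against the manager webhook.
function buildTriggerRequest(baseUrl, apiKey, workflowId, payload) {
  return {
    url: `${baseUrl}/webhook/workflow-manager?workflowId=${encodeURIComponent(workflowId)}`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json", "X-Api-Key": apiKey },
      body: JSON.stringify(payload),
    },
  };
}

function buildInfoRequest(baseUrl, apiKey, { includedWorkflows = "active", mode = "summary" } = {}) {
  const params = new URLSearchParams({ includedWorkflows, mode });
  return {
    url: `${baseUrl}/webhook/workflow-manager?${params}`,
    options: { method: "GET", headers: { "X-Api-Key": apiKey } },
  };
}

const req = buildTriggerRequest("https://n8n.example.com", "secret", "abc123", { name: "test" });
console.log(req.url);
// fetch(req.url, req.options) would then fire the target workflow
```

Keeping the workflow ID in the query string and the payload in the body is what lets one manager endpoint drive any workflow without per-workflow changes.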
by VipinW
# Apply to jobs automatically from Google Sheets with status tracking

### Who's it for
Job seekers who want to streamline their application process, save time on repetitive tasks, and never miss following up on applications. Perfect for anyone managing multiple job applications across different platforms.

### What it does
This workflow automatically applies to jobs from a Google Sheet, tracks application status, and keeps you updated with notifications. It handles the entire application lifecycle from submission to status monitoring.

Key features:
- Reads job listings from Google Sheets with filtering by priority and status
- Automatically applies to jobs on LinkedIn, Indeed, and other platforms
- Updates application status in real-time
- Checks application status every 2 days and notifies you of changes
- Sends email notifications for successful applications and status updates
- Prevents duplicate applications and manages rate limiting

### How it works
The workflow runs on two main schedules:

**Daily application process (9 AM, weekdays):**
1. Reads your job list from Google Sheets
2. Filters for jobs marked as "Not Applied" with Medium/High priority
3. Processes each job individually to prevent rate limiting
4. Applies to jobs using platform-specific APIs (LinkedIn, Indeed, etc.)
5. Updates the sheet with application status and reference ID
6. Sends a confirmation email for each application

**Status monitoring (every 2 days at 10 AM):**
1. Checks all jobs with "Applied" status
2. Queries job platforms for application status updates
3. Updates the sheet if the status has changed
4. Sends notification emails for status changes (interviews, rejections, etc.)

### Requirements
- Google account with Google Sheets access
- Gmail account for notifications
- Resume stored online (Google Drive, Dropbox, etc.)
- API access to job platforms (LinkedIn, Indeed) - optional for the basic version
- n8n instance (self-hosted or cloud)

### How to set up

**Step 1: Create your job tracking sheet**

Create a Google Sheet with these exact column headers:

| Job_ID | Company | Position | Status | Applied_Date | Last_Checked | Application_ID | Notes | Job_URL | Priority |
|--------|---------|----------|--------|--------------|--------------|----------------|-------|---------|----------|
| JOB001 | Google | Software Engineer | Not Applied | | | | | https://careers.google.com/jobs/123 | High |
| JOB002 | Microsoft | Product Manager | Not Applied | | | | | https://careers.microsoft.com/jobs/456 | Medium |

Column explanations:
- **Job_ID**: Unique identifier (JOB001, JOB002, etc.)
- **Company**: Company name
- **Position**: Job title
- **Status**: Not Applied, Applied, Under Review, Interview Scheduled, Rejected, Offer
- **Applied_Date**: Auto-filled when the application is submitted
- **Last_Checked**: Auto-updated during status checks
- **Application_ID**: Platform reference ID (auto-generated)
- **Notes**: Additional information or application notes
- **Job_URL**: Direct link to the job posting
- **Priority**: High, Medium, Low (Low-priority jobs are skipped)

**Step 2: Configure Google Sheets access**
1. In n8n, go to Credentials → Add Credential
2. Select Google Sheets OAuth2 API
3. Follow the OAuth setup process to authorize n8n
4. Test the connection with your job tracking sheet

**Step 3: Set up Gmail notifications**
1. Add another credential for Gmail OAuth2 API
2. Authorize n8n to send emails from your Gmail account
3. Test by sending a sample email

**Step 4: Update workflow configuration**
In the "Set Configuration" node, update these values:
- **spreadsheetId**: Your Google Sheet ID (found in the URL)
- **resumeUrl**: Direct link to your resume (make sure it's publicly accessible)
- **yourEmail**: Your email address for notifications
- **coverLetterTemplate**: Customize your cover letter template

**Step 5: Customize application logic**
- For the basic version (no API access): the workflow includes placeholder HTTP requests that you can replace with actual job platform integrations.
- For the advanced version (with API access): replace the LinkedIn/Indeed HTTP nodes with actual API calls, add your API credentials to n8n's credential store, and update the platform detection logic for additional job boards.

**Step 6: Test and activate**
1. Add 1-2 test jobs to your sheet with "Not Applied" status
2. Run the workflow manually to test
3. Check that the sheet gets updated and you receive notifications
4. Activate the workflow to run automatically

### How to customize the workflow

**Adding new job platforms:**
1. **Update platform detection**: Modify the "Check Platform Type" node to recognize new job board URLs
2. **Add a new application node**: Create HTTP request nodes for new platforms
3. **Update status checking**: Add status check logic for the new platform

**Customizing application strategy:**
- **Rate limiting**: Add "Wait" nodes between applications (recommended: 5-10 minutes)
- **Application timing**: Modify the cron schedule to apply during optimal hours
- **Priority filtering**: Adjust the filter conditions to match your criteria
- **Multiple resumes**: Use conditional logic to select different resumes based on job type

**Enhanced notifications:**
- **Slack integration**: Replace Gmail nodes with Slack for team notifications
- **Discord webhooks**: Send updates to Discord channels
- **SMS notifications**: Use Twilio for urgent status updates
- **Dashboard updates**: Connect to Notion, Airtable, or other productivity tools

**Advanced features:**
- **AI-powered personalization**: Use OpenAI to generate custom cover letters
- **Job scoring**: Implement scoring logic based on job requirements vs. your skills
- **Interview scheduling**: Auto-schedule interviews when status changes
- **Follow-up automation**: Send follow-up emails after specific time periods

### Important notes

**Platform compliance:**
- Always respect rate limits to avoid being blocked
- Follow each platform's Terms of Service
- Use official APIs when available instead of web scraping
- Don't spam job boards with excessive applications

**Data privacy:**
- Store credentials securely using n8n's credential store
- Don't hardcode API keys or personal information in nodes
- Regularly review and clean up old application data
- Ensure your resume link is secure but accessible

**Quality control:**
- Start with a small number of jobs to test the workflow
- Review application success rates and adjust strategy
- Monitor for errors and set up proper error handling
- Keep your job list updated and remove expired postings

This workflow transforms job searching from a manual, time-consuming process into an automated system that maximizes your application efficiency while maintaining quality and compliance.
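The daily filtering step described under "How it works" can be sketched with the sheet's own column names. This is an illustrative stand-in for the workflow's filter node, not its exact implementation.

```javascript
// Keep rows still marked "Not Applied" with Medium or High priority;
// Low-priority jobs are skipped, matching the Priority column's rules.
function selectJobsToApply(rows) {
  return rows.filter(
    r => r.Status === "Not Applied" && ["High", "Medium"].includes(r.Priority)
  );
}

const rows = [
  { Job_ID: "JOB001", Status: "Not Applied", Priority: "High" },
  { Job_ID: "JOB002", Status: "Applied", Priority: "High" },
  { Job_ID: "JOB003", Status: "Not Applied", Priority: "Low" },
];
console.log(selectJobsToApply(rows).map(r => r.Job_ID)); // [ 'JOB001' ]
```

Because the Status column is updated after each application, rerunning the filter the next day naturally excludes jobs already submitted, which is what prevents duplicate applications.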
by Jah coozi
# AI Medical Symptom Checker & Health Assistant

A responsible, privacy-focused health information assistant that provides general health guidance while maintaining strict safety protocols and medical disclaimers.

### ⚠️ IMPORTANT DISCLAIMER
This tool provides general health information only and is NOT a substitute for professional medical advice, diagnosis, or treatment. Always consult qualified healthcare providers for medical concerns.

### 🚀 Key Features

**Safety first:**
- **Emergency Detection**: Automatically identifies emergency situations
- **Immediate Escalation**: Provides emergency numbers for critical cases
- **Clear Disclaimers**: Every response includes medical disclaimers
- **No Diagnosis**: Never attempts to diagnose conditions
- **Professional Referral**: Always recommends consulting healthcare providers

**Core functionality:**
- **Symptom Information**: General information about common symptoms
- **Wellness Guidance**: Health tips and preventive care
- **Medication Reminders**: General medication information
- **Multi-Language Support**: Serve diverse communities
- **Privacy Protection**: No data storage, anonymous processing
- **Resource Links**: Connects to trusted health resources

### 🎯 Use Cases
- **General Health Information**: Learn about symptoms and conditions
- **Pre-Appointment Preparation**: Organize questions for doctors
- **Wellness Education**: General health and prevention tips
- **Emergency Detection**: Immediate guidance for critical situations
- **Health Resource Navigation**: Find appropriate care providers

### 🛡️ Safety Protocols

**Emergency keywords detected:**
- Chest pain, heart attack, stroke
- Breathing difficulties
- Severe bleeding, unconsciousness
- Allergic reactions, poisoning
- Mental health crises

**Response guidelines:**
- Never diagnoses conditions
- Never prescribes medications
- Always includes disclaimers
- Encourages professional consultation
- Provides emergency numbers when needed

### 🔧 Setup Instructions
1. **Configure OpenAI API**: Add your API key and set the temperature to 0.3 for consistency.
2. **Review legal requirements**: Check local health information regulations, customize disclaimers as needed, and implement required data policies.
3. **Emergency contacts**: Update emergency numbers for your region, add local health resources, and include mental health hotlines.
4. **Test thoroughly**: Verify emergency detection, check disclaimer display, and test various symptom queries.

### 💡 Example Interactions

General symptom query:
> User: "I have a headache for 3 days"
> Bot: Provides general headache information, self-care tips, and when to see a doctor

Emergency detection:
> User: "Chest pain, can't breathe"
> Bot: EMERGENCY response with immediate action steps and emergency numbers

Wellness query:
> User: "How can I improve my sleep?"
> Bot: General sleep hygiene tips and healthy habits information

### 🏥 Integration Options
- **Healthcare Websites**: Embed as a support widget
- **Telemedicine Platforms**: Pre-consultation tool
- **Health Apps**: General information module
- **Insurance Portals**: Member resource
- **Pharmacy Systems**: General drug information

### 📊 Compliance & Privacy
- **HIPAA Considerations**: No PHI storage
- **GDPR Compliant**: No personal data retention
- **Anonymous Processing**: Session-based only
- **Audit Trails**: Optional logging for compliance
- **Data Encryption**: Secure transmission

### 🚨 Limitations
- Cannot diagnose medical conditions
- Cannot prescribe treatments
- Cannot replace emergency services
- Cannot provide specific medical advice
- Should not delay seeking medical care

### 🔒 Best Practices
- Always maintain clear disclaimers
- Never minimize serious symptoms
- Encourage professional consultation
- Keep information general and educational
- Update emergency contacts regularly
- Review and update health information
- Monitor for misuse
- Maintain audit trails where required

### 🌍 Customization Options
- Add local emergency numbers
- Include regional health resources
- Translate to local languages
- Integrate with local health systems
- Add specific disclaimers
- Customize for specific populations

Start providing responsible health information today!
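The emergency-detection gate described in the safety protocols can be sketched as a keyword pre-filter that runs before any AI call. The pattern list below is drawn from the keywords named above, but the matching strategy (case-insensitive regex) is an assumption; a production system would need a far more carefully curated list.

```javascript
// Hypothetical pre-filter: flag messages matching emergency patterns
// so the workflow can escalate before generating a normal response.
const EMERGENCY_PATTERNS = [
  /chest pain/i, /heart attack/i, /stroke/i,
  /can'?t breathe/i, /breathing difficult/i,
  /severe bleeding/i, /unconscious/i,
  /allergic reaction/i, /poison/i,
  /suicid/i, // mental health crisis indicator
];

function isEmergency(message) {
  return EMERGENCY_PATTERNS.some(p => p.test(message));
}

console.log(isEmergency("Chest pain, can't breathe")); // true
console.log(isEmergency("How can I improve my sleep?")); // false
```

Running this check deterministically, outside the language model, is the safer design: escalation then never depends on the model noticing the emergency itself.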
by Oneclick AI Squad
This guide walks you through setting up an automated workflow that compares live flight fares across multiple booking platforms (e.g., Skyscanner, Akasa Air, Air India, IndiGo) using API calls, sorts the results by price, and sends the best deals via email. Ready to automate your flight fare comparison process? Let’s get started! What’s the Goal? Automatically fetch and compare live flight fares from multiple platforms using scheduled triggers. Aggregate and sort fare data to identify the best deals. Send the comparison results via email for review or action. Enable 24/7 fare monitoring with seamless integration. By the end, you’ll have a self-running system that delivers the cheapest flight options effortlessly. Why Does It Matter? Manual flight fare comparison is time-consuming and often misses the best deals. Here’s why this workflow is a game-changer: Zero Human Error**: Automated data fetching and sorting ensure accuracy. Time-Saving Automation**: Instantly compare fares across platforms, boosting efficiency. 24/7 Availability**: Monitor fares anytime without manual effort. Cost Optimization**: Focus on securing the best deals rather than searching manually. Think of it as your tireless flight fare assistant that always finds the best prices. How It Works Here’s the step-by-step magic behind the automation: Step 1: Trigger the Workflow Set Schedule Node**: Triggers the workflow at a predefined schedule to check flight fares automatically. Captures the timing for regular fare updates. Step 2: Process Input Data Set Input Data Node**: Sets the input parameters (e.g., origin, destination, departure date, return date) for flight searches. Prepares the data to be sent to various APIs. Step 3: Fetch Flight Data Skyscanner API Node**: Retrieves live flight fare data from Skyscanner using its API endpoint. Akasa Air API Node**: Fetches live flight fare data from Akasa Air using its API endpoint. 
Air India API Node**: Collects flight fare data directly from Air India’s API. IndiGo API Node**: Gathers flight fare data from IndiGo’s API. Step 4: Merge API Results Merge API Data Node**: Combines the flight data from Skyscanner and Akasa Air into a single dataset. Merge Both API Data Node**: Merges the data from Air India and IndiGo with the previous dataset. Merge All API Results Node**: Consolidates all API data into one unified result for further processing. Step 5: Analyze and Sort Compare Data and Sorting Price Node**: Compares all flight fares and sorts them by price to highlight the best deals. Step 6: Send Results Send Response via Email Node**: Sends the sorted flight fare comparison results to the user via email for review or action. How to Use the Workflow? Importing this workflow in n8n is a straightforward process that allows you to use this pre-built solution to save time. Below is a step-by-step guide to importing the Flight Fare Comparison Workflow in n8n. Steps to Import a Workflow in n8n Obtain the Workflow JSON Source the Workflow: The workflow is shared as a JSON file or code snippet (provided earlier or exported from another n8n instance). Format: Ensure you have the workflow in JSON format, either as a file (e.g., workflow.json) or copied text. Access the n8n Workflow Editor Log in to n8n: Open your n8n instance (via n8n Cloud or self-hosted). Navigate to Workflows: Go to the Workflows tab in the n8n dashboard. Open a New Workflow: Click Add Workflow to create a blank workflow. Import the Workflow Option 1: Import via JSON Code (Clipboard): In the n8n editor, click the three dots (⋯) in the top-right corner to open the menu. Select Import from Clipboard. Paste the JSON code (provided earlier) into the text box. Click Import to load the workflow. Option 2: Import via JSON File: In the n8n editor, click the three dots (⋯) in the top-right corner. Select Import from File. Choose the .json file from your computer. 
Click Open to import the workflow.

Setup Notes
- **API Credentials**: Configure each API node (Skyscanner, Akasa Air, Air India, IndiGo) with the respective API keys and endpoints. Check each API provider's documentation for details.
- **Email Integration**: Authorize the Send Response via Email node with your email service (e.g., Gmail SMTP settings or an email API like SendGrid).
- **Input Customization**: Adjust the Set Input Data node to include the specific origin/destination pairs and date ranges you need.
- **Schedule Configuration**: Set the desired frequency in the Set Schedule node (e.g., daily at 9 AM IST).

Example Input
Send a POST request to the workflow (if integrated with a webhook) with:

{
  "origin": "DEL",
  "destination": "BOM",
  "departureDate": "2025-08-01",
  "returnDate": "2025-08-07"
}

Optimization Tips
- **Error Handling**: Add IF nodes to manage API failures or rate limits.
- **Rate Limits**: Include a Wait node if the APIs enforce strict limits.
- **Data Logging**: Add a node (e.g., Google Sheets) to log all comparisons for future analysis.

This workflow transforms flight fare comparison into an automated, efficient process, delivering the best deals directly to your inbox!
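To make Step 5 concrete, here is a minimal sketch of what the "Compare Data and Sorting Price" logic could look like inside an n8n Code node. The field names (`platform`, `price`) are assumptions for illustration; adjust them to whatever shape your API nodes actually return.

```javascript
// Hypothetical sketch of the "Compare Data and Sorting Price" step.
// Items without a usable numeric price are dropped before sorting.
function sortFaresByPrice(items) {
  return items
    .filter((item) => typeof item.price === "number" && !Number.isNaN(item.price))
    .sort((a, b) => a.price - b.price); // cheapest fare first
}

// Example with mock merged API results:
const merged = [
  { platform: "Skyscanner", price: 5400 },
  { platform: "IndiGo", price: 4999 },
  { platform: "Air India", price: "N/A" }, // dropped: no numeric fare
  { platform: "Akasa Air", price: 5150 },
];
console.log(sortFaresByPrice(merged));
```

In an actual Code node you would map over `items`, apply this function to the `.json` payloads, and return the sorted array wrapped back into `{ json }` objects.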
by Sachin Shrestha
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This n8n workflow automates invoice management by integrating Gmail, PDF analysis, and Azure OpenAI GPT-4.1, with an optional human verification step for accuracy and control. It's ideal for businesses or individuals who regularly receive invoice emails and want to streamline their accounts payable process with minimal manual effort.

The system continuously monitors Gmail for new messages from specified senders. When it detects an email with a PDF attachment and a relevant subject line (e.g., "Invoice"), it automatically extracts text from the PDF, analyzes it using Azure OpenAI, and determines whether it is a valid invoice. If the AI is uncertain, the workflow sends a manual approval request to a human reviewer. Valid invoices are saved to local storage with a timestamped filename, and a confirmation email is sent upon successful processing.

🎯 Who This Is For
- Small to medium businesses
- Freelancers or consultants who receive invoices via email
- IT or automation teams looking to streamline document workflows
- Anyone using n8n with access to Gmail and Azure OpenAI

✅ Features
- **Gmail Monitoring** – Automatically checks for new emails from trusted senders
- **AI-Powered Invoice Detection** – Uses Azure GPT-4.1 to intelligently verify PDF contents
- **PDF Text Extraction** – Extracts readable text for analysis
- **Human-in-the-Loop Verification** – Requests approval when AI confidence is low
- **Secure File Storage** – Saves invoices locally with structured filenames
- **Email Notifications** – Sends confirmations or manual review alerts

⚙️ Setup Instructions
1. Prerequisites
- An active n8n instance (self-hosted or cloud)
- A Gmail account with OAuth2 credentials
- An Azure OpenAI account with access to the GPT-4.1 model
- A local directory for saving invoices (e.g., C:/Test/Invoices/)

2. Gmail OAuth2 Setup
- In n8n, create Gmail OAuth2 credentials.
- Configure them with Gmail API access (read emails and attachments).
- Update the Gmail Trigger node to filter by sender email (e.g., sender@gmail.com).

3. Azure OpenAI Setup
- Create Azure OpenAI API credentials in n8n.
- Ensure your endpoint is set correctly and GPT-4.1 access is enabled.
- Link the credentials in the AI Analysis node.

4. Customize Workflow Settings
- **Sender Email** – Update in the Gmail Trigger node
- **Notification Email** – Update in the Send Notification node
- **Save Directory** – Change in the Save Invoice node

5. Testing the Workflow
- Send a test email from the configured sender with a PDF invoice.
- Wait for the workflow to trigger and check for:
  - The file saved in the directory
  - A confirmation email received
  - A manual review request (if needed)

🔄 Workflow Steps
Gmail Trigger → Check for PDF Invoice → Extract PDF Text → Analyze with GPT-4.1 →
↳ If Invoice: Save & Notify
↳ If Uncertain: Request Human Review
↳ If Not Invoice: Send Invalid Alert
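The template only says that saved invoices get a "timestamped filename"; the sketch below shows one plausible way to build such a filename in an n8n Code node before the Save Invoice step. The exact naming scheme (prefix, timestamp format, sanitization) is an assumption, not taken from the workflow itself.

```javascript
// Hypothetical filename builder for the "Save Invoice" step.
// Produces e.g. invoice_sender_gmail_com_20250801-093000.pdf
function buildInvoiceFilename(sender, receivedAt) {
  // "2025-08-01T09:30:00.000Z" -> "20250801-093000"
  const stamp = receivedAt
    .toISOString()
    .replace(/[-:]/g, "")
    .replace("T", "-")
    .slice(0, 15);
  // Keep only filesystem-safe characters from the sender address.
  const safeSender = sender.replace(/[^a-zA-Z0-9]/g, "_");
  return `invoice_${safeSender}_${stamp}.pdf`;
}

console.log(buildInvoiceFilename("sender@gmail.com", new Date("2025-08-01T09:30:00Z")));
// → "invoice_sender_gmail_com_20250801-093000.pdf"
```

Timestamped names like this avoid collisions when the same sender submits multiple invoices on different days.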
by Mario
Purpose
This workflow creates a versioned backup of an entire Clockify workspace, split up into monthly reports.

How it works
- The backup routine runs daily by default.
- The Clockify reports API endpoint is used to get all data from the workspace, based on time entries.
- A report file is retrieved for every month, starting with the current one and going back 3 months in total by default.
- If any report changed during the day, it is updated in GitHub.

Prerequisites
- Create a private GitHub repository.
- Create credentials for both Clockify and GitHub (make sure to grant permissions for read and write operations).

Setup
- Clone the workflow and select the corresponding credentials.
- Follow the instructions given in the yellow sticky notes.
- Activate the workflow.
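The "current month, going back 3 months" logic above can be sketched as a small date helper, as it might appear in an n8n Code node feeding the Clockify reports request. The function name and output shape are illustrative assumptions.

```javascript
// Hypothetical helper: build the monthly date ranges to request from the
// Clockify reports API — the current month plus the previous ones,
// 3 months in total by default.
function monthlyRanges(now, monthsBack = 3) {
  const ranges = [];
  for (let i = 0; i < monthsBack; i++) {
    // First day of the month, i months before "now" (UTC).
    const start = new Date(Date.UTC(now.getUTCFullYear(), now.getUTCMonth() - i, 1));
    // Day 0 of the following month = last day of this month.
    const end = new Date(Date.UTC(now.getUTCFullYear(), now.getUTCMonth() - i + 1, 0));
    ranges.push({
      start: start.toISOString().slice(0, 10),
      end: end.toISOString().slice(0, 10),
    });
  }
  return ranges;
}

console.log(monthlyRanges(new Date("2025-08-15T00:00:00Z")));
// → ranges for 2025-08, 2025-07, and 2025-06
```

Each range can then drive one report request, so every monthly report file stays independently versioned in GitHub.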
by David Olusola
Learn Customer Onboarding Automation with n8n

✅ How It Works
This smart onboarding automation handles new customer signups by:
- Receiving signup data via webhook
- Validating required customer info
- Creating a contact in HubSpot CRM
- Sending a personalized welcome email
- Delivering onboarding documents after 2 hours
- Sending a personal check-in email after 1 day
- Sending a Week 1 success guide after 3 days
- Updating CRM status and notifying the team at each milestone

It's designed for professional onboarding, with built-in timing, CRM integration, and smart notifications to improve engagement and retention.

🛠️ Setup Steps
1. Create Webhook – Add a Webhook node in n8n with the POST method; this triggers when a new customer signs up.
2. Validate Customer Data – Add an IF node to check that email and customerName are present.
3. Create CRM Contact – Use a HubSpot node to create a new contact and map fields (e.g., split the name into first/last).
4. Send Notifications – Add a Telegram or Slack node to alert your team instantly.
5. Send Welcome Email – Use an Email Send node for a warm welcome, customized with customer details.
6. Wait 2 Hours – Add a Wait node to delay the next steps and avoid overwhelming the customer.
7. Send Onboarding Documents – Use another Email Send node to deliver helpful PDFs or guides.
8. Wait 1 Day & Send Check-in – Another Wait node, followed by a personal check-in email using the customer's name.
9. Wait 2 More Days & Send Success Guide – Deliver Week 1 content via email to reinforce engagement.
10. Update CRM & Notify Team – Use HubSpot to update the status and Telegram/Slack to notify your team of completion.
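Steps 2 and 3 above (validate, then split the name for HubSpot) can be sketched as a single function, roughly as it might appear in an n8n Code node. The payload shape `{ email, customerName }` follows the fields named in the setup steps; everything else is illustrative.

```javascript
// Sketch of webhook-payload validation plus the first/last name split
// that HubSpot's separate name fields require.
function prepareContact(payload) {
  if (!payload.email || !payload.customerName) {
    return { valid: false, reason: "Missing email or customerName" };
  }
  const parts = payload.customerName.trim().split(/\s+/);
  return {
    valid: true,
    email: payload.email,
    firstName: parts[0],
    lastName: parts.slice(1).join(" "), // empty string if only one name given
  };
}

console.log(prepareContact({ email: "jane@example.com", customerName: "Jane Q. Doe" }));
// → { valid: true, email: "jane@example.com", firstName: "Jane", lastName: "Q. Doe" }
```

In the workflow itself, the IF node would branch on the `valid` flag, and the HubSpot node would map `firstName`/`lastName` into its contact properties.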
by n8n Team
This workflow automatically sends Zendesk tickets to Pipedrive contacts and makes them task assignees. The automation is triggered every 5 minutes: Zendesk checks for and collects new tickets, which are then individually assigned to a Pipedrive contact.

Prerequisites
- Pipedrive account and Pipedrive credentials
- Zendesk account and Zendesk credentials

Note: The Pipedrive and Zendesk accounts need to be created by the same person / with the same email.

How it works
1. The Cron node triggers the workflow every 5 minutes.
2. The Zendesk node collects all the tickets received after the last execution timestamp.
3. A Set node passes only the requester's email and ID on to the Merge node.
4. A Merge by Key node merges both inputs together: the tickets and their contact emails.
5. The Pipedrive node then searches for the requester.
6. An HTTP Request node gets the owner information of the Pipedrive contact.
7. Set nodes keep only the requester owner's email and the agent's email and ID.
8. A Merge by Key node merges the information and adds the contact owner to the ticket data.
9. The Zendesk node changes the assignee to the Pipedrive contact owner, or adds a note if the requester is not found.
10. The Function Item node sets the new last execution timestamp.
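The last-execution-timestamp mechanism from steps 2 and 10 can be sketched as follows. In the real workflow this lives across a Zendesk node filter and a Function Item node with persisted workflow data; here it is condensed into one plain function, and the ISO `created_at` field on tickets is an assumption.

```javascript
// Illustrative sketch: keep only tickets newer than the last run, and
// advance the stored timestamp to the newest ticket seen.
function filterNewTickets(tickets, lastExecution) {
  const cutoff = new Date(lastExecution).getTime();
  const fresh = tickets.filter((t) => new Date(t.created_at).getTime() > cutoff);
  // ISO-8601 strings compare correctly as plain strings.
  const newLast = fresh.reduce(
    (max, t) => (t.created_at > max ? t.created_at : max),
    lastExecution
  );
  return { fresh, newLast };
}

const { fresh, newLast } = filterNewTickets(
  [
    { id: 1, created_at: "2025-01-01T10:00:00Z" }, // already processed
    { id: 2, created_at: "2025-01-01T12:00:00Z" }, // new since last run
  ],
  "2025-01-01T11:00:00Z"
);
console.log(fresh.length, newLast);
```

Persisting `newLast` between runs is what keeps the 5-minute Cron schedule from reprocessing tickets it has already assigned.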