by ConvertAPI
Who is this for?
Developers and organizations that need to convert XLSX files to PDF.

What problem is this workflow solving?
The file format conversion problem.

What this workflow does
- Downloads the XLSX file from the web.
- Converts the XLSX file to PDF.
- Stores the PDF file in the local file system.

How to customize this workflow to your needs
- Open the HTTP Request node.
- Adjust the URL parameter (all endpoints can be found here).
- Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret.
- Optionally, add extra Body Parameters for the converter.
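For reference, a minimal sketch of the API call the HTTP Request node makes, assuming ConvertAPI's standard v2 XLSX-to-PDF endpoint and a Secret query parameter; the secret and file URL are placeholders, and the exact endpoint should be checked against the list linked above:

```python
import base64
import requests

# Placeholders -- substitute your own ConvertAPI secret and source file URL.
SECRET = "your-convertapi-secret"
XLSX_URL = "https://example.com/report.xlsx"

# ConvertAPI v2-style XLSX -> PDF endpoint; confirm the exact URL against the
# endpoint list referenced in the workflow notes.
resp = requests.post(
    "https://v2.convertapi.com/convert/xlsx/to/pdf",
    params={"Secret": SECRET},
    json={"Parameters": [{"Name": "File", "FileValue": {"Url": XLSX_URL}}]},
    timeout=120,
)
resp.raise_for_status()

# The response lists converted files; FileData carries base64-encoded content.
for f in resp.json().get("Files", []):
    with open(f["FileName"], "wb") as out:
        out.write(base64.b64decode(f["FileData"]))
```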
by AppStoneLab Technologies LLP
Automated AI Research Assistant: From Query to Polished Report with Jina & Gemini

Turn a single research question into a comprehensive, multi-source report with proper citations. This workflow automates the entire research process by leveraging the web-crawling power of Jina AI and the advanced reasoning capabilities of Google's Gemini models. Simply input your query, and this AI-powered assembly line will search the web, scrape relevant sources, summarize the content, draft a structured research paper, and finally evaluate and polish the report for accuracy and formatting.

✨ Key Features
- 🔎 **Dynamic Web Search**: Kicks off by searching the web with Jina AI based on your initial query.
- 📚 **Multi-Source Content Scraping**: Automatically reads and extracts content from the top 10 search results.
- 🧠 **AI-Powered Summarization**: Uses a Gemini agent to intelligently summarize each webpage, retaining the core information.
- ✍️ **Automated Report Generation**: A specialized "Generator Agent" synthesizes the summarized data into a structured research paper, complete with an executive summary, introduction, discussion, and conclusion.
- ✅ **Citation & Quality Verification**: A final "Evaluator Agent" meticulously checks the generated report for citation accuracy, logical flow, and markdown formatting, delivering a polished final document.
- 📈 **Rate-Limit Ready**: Includes a configurable Wait node to ensure stable execution when dealing with multiple API calls.

📝 What This Workflow Does
This workflow is designed to be your personal research assistant. It addresses the time-consuming process of gathering, reading, and synthesizing information from multiple online sources. Instead of spending hours manually searching, reading, and citing, you can delegate the entire task to this workflow and receive a well-structured, cited report as the final output. It's perfect for students, researchers, content creators, and analysts who need to quickly compile information on any given topic.

⚙️ How It Works (Step-by-Step)
1. Initiate with a Query: The workflow starts when you send your research question or topic to the Chat Trigger node.
2. Search the Web: The user's query is passed to the Jina AI node, which performs a web search and returns the top 10 most relevant URLs.
3. Scrape, Summarize, Repeat: The workflow then loops through each URL:
   - Read Content: The Jina AI node scrapes the full text content from the URL.
   - Summarize: A Summarizer Agent powered by Google Gemini reads the scraped content and the original user query, then generates a concise summary.
   - Wait: A one-second pause helps to avoid hitting API rate limits before processing the next URL.
4. Aggregate the Knowledge: Once the loop is complete, a Code node gathers all 10 individual summaries into a single, neatly structured list (see the sketch after this list).
5. Draft the Research Report: The aggregated data is fed to the Generator Agent. Following a detailed prompt, this Gemini-powered agent writes a full research report, structuring it with headings and adding inline citations for every piece of information it uses.
6. Evaluate and Finalize: The generated draft is passed to the final Evaluator Chain. This agent acts as a quality-control supervisor: it verifies that all claims are correctly cited, refines the content for clarity and academic tone, and polishes the markdown formatting to produce the final, ready-to-use report.
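As an illustration of step 4, here is a minimal sketch of what the aggregation step might look like in n8n's Python Code node; the field names (`summary`, `url`) are assumptions, since the template's actual node code isn't shown:

```python
# Sketch for n8n's Python Code node ("Run Once for All Items") -- not the
# template's actual code. Assumes upstream items carry `summary` and `url`.
summaries = []
for i, item in enumerate(_input.all(), start=1):
    summaries.append({
        "source_number": i,                 # handy for inline citations later
        "url": item.json["url"],
        "summary": item.json["summary"],
    })

# Emit a single item holding the aggregated list for the Generator Agent.
return [{"json": {"summaries": summaries}}]
```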
🚀 How to Use This Workflow
1. Credentials: Click Use template, then configure your credentials for the following nodes:
   - Jina AI: You will need a Jina AI API key for the Search web and Read URL content nodes. Get your key from here: JinaAI API Key
   - Google Gemini: You will need a Google Gemini API key for the Summarizer Model, Generator Model, and Evaluator Model nodes. Get your key from here: Gemini API Key
2. Activate Workflow: Make sure the workflow is active in your n8n instance.
3. Start Research: Send a chat message with your research topic to the webhook URL provided in the When chat message received node.
4. Get Your Report: Check the output of the final node, Evaluator Chain, to find your completed and polished research report.

Nodes Used: Chat Trigger, Jina AI, Code (Python), Split in Batches (Looping), Wait, AI Agent, Basic LLM Chain, Google Gemini Chat Model
by Open Paws
This general-purpose sub-agent combines multiple research and automation tools to support high-impact decision-making for animal advocacy workflows. It's designed to act as a reusable, modular unit within larger multi-agent systems—handling search, scraping, scoring, and domain-specific semantic lookup. It powers many of the advanced workflows released by Open Paws and serves as a versatile backend utility agent.

🛠️ What It Does
- Performs real-time Google Search using Serper
- Scrapes and extracts page content using Jina AI and Scraping Dog
- Conducts semantic search over the Open Paws knowledge base
- Generates OpenAI embeddings for similarity search and analysis
- Routes search and content analysis through OpenRouter LLMs
- Connects with downstream tools like the Text Scoring Sub-Workflow to evaluate message performance

> 🧩 This agent is typically used as a sub-workflow in larger automations where agents need access to external tools or advocacy-specific knowledge.

🧠 Domain Focus: Animal Advocacy
The agent is pre-configured to interface with the Open Paws database—an open-source, animal advocacy-specific knowledge graph—and is optimized for content and research tasks relevant to farmed animal issues, corporate campaigns, and activist communication.

🔗 Integrated Tools and APIs

| Tool | Purpose |
|---------------|------------------------------------------|
| Serper API | Real-time Google Search queries |
| Jina AI | Web scraping and content extraction |
| Scraping Dog | Social media scraping where Jina is blocked |
| OpenAI API | Embedding generation for semantic search |
| OpenRouter | Proxy to multiple LLMs (e.g., GPT-4, Claude) |
| Open Paws DB | Advocacy-specific semantic knowledge base |

📦 Use Cases
- Create and evaluate online content (e.g., social media, emails, petitions) for predicted performance and advocacy alignment
- Act as a research and reasoning agent within multi-agent workflows
- Automate web and social media research for real-time campaign support
- Surface relevant facts or arguments from an advocacy-specific knowledge base
- Assist communications teams with message testing and content ideation
- Monitor search results and scrape pages to inform rapid response messaging
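For context, the embedding-generation step typically looks like the following sketch using the OpenAI Python SDK; the model name is an assumption, since the template doesn't specify which embedding model it uses:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Model choice is an assumption -- the workflow doesn't name the embedding model.
texts = ["Corporate cage-free campaign messaging", "Plant-based outreach email"]
resp = client.embeddings.create(model="text-embedding-3-small", input=texts)

# One vector per input text, usable for cosine-similarity lookups against
# a semantic knowledge base such as the Open Paws DB.
vectors = [d.embedding for d in resp.data]
print(len(vectors), len(vectors[0]))
```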
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically analyzes competitor backlink profiles to understand their link-building strategies and identify opportunities for your own SEO efforts. It saves you time by eliminating the need to manually research competitor links and provides detailed insights into their most valuable linking relationships.

Overview
This workflow automatically scrapes backlink analysis tools and competitor websites to extract comprehensive backlink data, including referring domains, anchor text, link quality metrics, and link acquisition patterns. It uses Bright Data to access backlink databases and AI to intelligently analyze competitor link strategies.

Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping backlink analysis platforms without being blocked
- **OpenAI**: AI agent for intelligent backlink strategy analysis
- **Google Sheets**: For storing competitor backlink data and insights

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set Up OpenAI: Configure your OpenAI API credentials
4. Configure Google Sheets: Connect your Google Sheets account and set up your backlink analysis spreadsheet
5. Customize: Define competitor domains and backlink analysis parameters

Use Cases
- **SEO Strategy**: Learn from competitor link-building success and replicate strategies
- **Link Prospecting**: Identify websites that link to competitors but not to you
- **Competitive Intelligence**: Understand competitor SEO strategies and authority sources
- **Link Building**: Find high-quality link opportunities in your industry

Connect with Me
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #competitorbacklinks #backlinkanalysis #seo #linkbuilding #brightdata #webscraping #competitoranalysis #n8nworkflow #workflow #nocode #linkanalysis #backlinkresearch #seoanalysis #competitiveintelligence #linkresearch #seostrategy #backlinkmonitoring #linkprospecting #domainanalysis #seotools #backlinkaudit #organicseo #searchmarketing #competitorresearch #linkstrategy #seocompetitor #backlinkinsights
by Davide
This workflow allows you to send SMS messages globally via API without needing a physical phone number.

1. How It Works
The workflow consists of three main nodes:
- **Manual Trigger**: The workflow starts when you click the "Test workflow" button in n8n.
- **Set SMS Data**: This node defines the SMS message content and the recipient's phone number (including the international prefix).
- **Send SMS**: This node sends the SMS using the ClickSend API. It uses HTTP Basic Authentication with your ClickSend credentials and sends a POST request to the ClickSend API endpoint with the message and recipient details.

The workflow is simple and efficient, making it easy to automate SMS sending for use cases such as notifications, alerts, or marketing campaigns.

2. Set Up Steps
To set up and use this workflow in n8n, follow these steps:
1. Register on ClickSend: Go to ClickSend and create an account. Obtain your API Key and take advantage of the €2 free credit provided.
2. Set Up Basic Authentication in n8n: In the "Send SMS" node, configure the HTTP Basic Auth credentials:
   - Username: the username you registered with on ClickSend.
   - Password: the API Key provided by ClickSend.
3. Configure the SMS Data: In the "Set SMS data" node, define:
   - The message content (e.g., "Hi, this is my first message").
   - The recipient's phone number, including the international prefix (e.g., +39xxxxxxxxxx).
4. Test the Workflow: Click the "Test workflow" button in n8n to trigger the workflow. The workflow will send the SMS via the ClickSend API, and you should receive the message on the specified phone number.
5. Optional Customization: You can modify the workflow to dynamically set the message content or recipient phone number using data from other nodes or external sources.

This workflow is a quick and efficient way to send SMS messages programmatically. Need help customizing? Contact me for consulting and support or add me on LinkedIn.
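For reference, the call the Send SMS node makes looks roughly like this sketch, assuming ClickSend's v3 REST endpoint; the credentials and recipient number are placeholders:

```python
import requests

# ClickSend uses HTTP Basic Auth: account username + API key as the password.
USERNAME = "your-clicksend-username"   # placeholder
API_KEY = "your-clicksend-api-key"     # placeholder

payload = {
    "messages": [
        {
            "body": "Hi, this is my first message",
            "to": "+39xxxxxxxxxx",     # recipient with international prefix
        }
    ]
}

# v3 SMS endpoint -- verify against the ClickSend API docs for your account.
resp = requests.post(
    "https://rest.clicksend.com/v3/sms/send",
    auth=(USERNAME, API_KEY),
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```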
by Yaron Been
This workflow automatically monitors competitor pricing across multiple products and services to track market positioning and pricing strategies. It saves you time by eliminating the need to manually check competitor prices and provides real-time insights into pricing changes and market trends.

Overview
This workflow automatically scrapes competitor websites and pricing pages to extract current pricing information, product details, and promotional offers. It uses Bright Data to access pricing data without restrictions and AI to intelligently parse pricing information and detect changes over time.

Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping competitor pricing pages without being blocked
- **OpenAI**: AI agent for intelligent pricing data extraction and analysis
- **Google Sheets**: For storing pricing data and tracking changes over time

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set Up OpenAI: Configure your OpenAI API credentials
4. Configure Google Sheets: Connect your Google Sheets account and set up your pricing tracking spreadsheet
5. Customize: Define competitor URLs and pricing monitoring parameters

Use Cases
- **Pricing Strategy**: Stay competitive by monitoring market pricing trends
- **Product Management**: Track competitor feature and pricing changes
- **Sales Teams**: Provide up-to-date competitive pricing information
- **Market Research**: Analyze pricing patterns and market positioning

Connect with Me
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #pricemonitoring #competitorpricing #brightdata #webscraping #pricinganalysis #n8nworkflow #workflow #nocode #competitoranalysis #pricingdata #marketresearch #pricingtrends #competitiveintelligence #pricingtracking #marketanalysis #pricecomparison #competitormonitoring #businessintelligence #pricingstrategy #marketpositioning #pricinginsights #competitorresearch #pricingautomation #markettrends #pricealerts #dynamicpricing #pricingoptimization #competitivepricing
by Itamar
🕵️ Company Research Agent (n8n + Explorium + LLM)

This n8n workflow automates company research by combining Explorium's MCP server, web scraping tools, and an AI agent. Results are written to a Google Sheet for easy use in GTM, product analysis, or competitive research.

🚀 What It Does
Given a list of company domains or names, this workflow will:
- Look up company information using:
  - 🧠 LLM Agent to guide the research
  - 🔎 Explorium MCP Server for firmographic & tech signals
  - 🌐 Website content and SerpAPI scraping (optional)
- Extract key commercial details (see below)
- Format the output in a consistent JSON structure
- Update a connected Google Sheet with the enriched results

🧩 Extracted Fields
Each company is enriched with:
- domain
- linkedinUrl
- has_free_trial
- cheapest_plan
- has_enterprise_plan
- last_case_study_link
- market (e.g., B2B or B2C)
- integrations (e.g., Slack, Hubspot, MySQL)
- enrichment_status

📥 Input Sheet Format

| input |
|-------------|
| Explorium |
| n8n |
| Apple |
| ... |

📤 Output Sheet Format

| domain | linkedinUrl | has_free_trial | cheapest_plan | has_enterprise_plan | last_case_study_link | market | integrations | enrichment_status |
|--------------|----------------------------------|----------------|----------------|----------------------|-----------------------------|--------|---------------------------------------------------|-------------------|
| Explorium.ai | https://linkedin.com/company/... | TRUE | 69 | TRUE | https://www.explorium.com | B2B | ["HubSpot", "Zapier", "Salesforce", ...] | done |
| n8n.io | https://linkedin.com/company/... | TRUE | 20 | TRUE | https://n8n.io/case-studies | B2B | ["Slack", "Gmail", "MySQL", "Google Sheets", ...] | done |

🛠️ Tools Used
- **n8n** (automation platform)
- **Explorium MCP Server** – rich company enrichment via API
- **Anthropic Claude or OpenAI** – used by the AI researcher
- **Google Sheets** – stores output data
- **Structured Output Parser** – ensures clean, predictable JSON formatting

📦 How to Set It Up
1. Add your company domains or names to the input sheet
2. Configure your MCP and SerpAPI credentials in n8n
3. Run the workflow using the Test Workflow trigger
4. Watch the sheet populate with results

You can adapt the system to output different formats or fields depending on your team's research goals.

📌 Use Cases
- Competitive landscape analysis
- Lead intelligence for outbound campaigns
- Feature benchmarking (e.g., who offers enterprise or free trial)
- VC/investment research

🧠 Notes
This agent is easily customizable. Adjust the LLM prompt or Output Parser to extract different properties. Explorium MCP is leveraged as the core enrichment engine, ensuring signal accuracy and freshness.
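To make the output contract concrete, the Structured Output Parser enforces a record shape along these lines; this is a sketch reconstructed from the field list above, not the template's exact schema:

```python
import json

# One enriched record, reconstructed from the "Extracted Fields" list above.
# The exact schema in the template's Output Parser may differ.
record = {
    "domain": "n8n.io",
    "linkedinUrl": "https://linkedin.com/company/n8n",
    "has_free_trial": True,
    "cheapest_plan": 20,
    "has_enterprise_plan": True,
    "last_case_study_link": "https://n8n.io/case-studies",
    "market": "B2B",
    "integrations": ["Slack", "Gmail", "MySQL", "Google Sheets"],
    "enrichment_status": "done",
}
print(json.dumps(record, indent=2))
```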
by Weilun
🔄 n8n Workflow: Check and Update n8n Version

This workflow automatically checks if the local n8n version is outdated and, if so, creates a file to signal that an update is needed.

🖥️ Working Environment
- **Operating System:** Ubuntu 24.04
- **n8n Installation:** Docker container

📁 Project Directory Structure

```
n8n/
├── check_update.txt
├── check-update.sh
├── compose.yml
└── update_n8n.cron
```

🧾 File Descriptions

check_update.txt — contains a single word:
- true: update is needed
- false: no update required

check-update.sh

```bash
#!/bin/bash
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin

if grep -q "true" /home/sysadmin/n8n/check_update.txt; then
    # Place your update logic here
    echo "Update needed - please insert update logic"
    # Reset the flag so the update logic doesn't run again tomorrow.
    echo false > /home/sysadmin/n8n/check_update.txt
fi
```

Purpose:
- Checks the contents of check_update.txt
- If it contains true, executes the update logic (currently a placeholder)
- Resets check_update.txt to false so the update doesn't repeat until the workflow flags a new version

update_n8n.cron

```
SHELL=/bin/sh
10 5 * * * /bin/sh /home/sysadmin/n8n/check-update.sh
```

Purpose:
- Runs the check-update.sh script daily at 5:10 AM
- Uses /bin/sh as the shell environment

🧩 n8n Workflow Breakdown
1. **Schedule Trigger** 🕓 — triggers the workflow every day at 5:00 AM. Node type: Schedule Trigger
2. **Get the latest n8n version** 🌐 — fetches the latest version of n8n from npm. Endpoint: https://registry.npmjs.org/n8n/latest. Node type: HTTP Request
3. **Get Local n8n version** 🖥️ — retrieves the currently running n8n version. Endpoint: http://192.168.100.18:5678/rest/settings. Node type: HTTP Request
4. **If** 🔍 — compares the local and latest versions. Condition: if not equal → update is needed
5. **SSH1** 🧾 — writes the result to a file on the host via SSH. Logic:

```
echo "{{ $('If').params.conditions ? 'false' : 'true' }}" > check_update.txt
```

Effect: updates check_update.txt with "true" if an update is needed, "false" otherwise.

🛠️ Setting up Crontab on Ubuntu
1. Register the cron job with: crontab update_n8n.cron
2. Verify that your cron job is registered: crontab -l

✅ Result
- **5:00 AM** – n8n workflow checks versions and writes the result to check_update.txt
- **5:10 AM** – Cron runs check-update.sh to respond to the update flag
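For reference, the version comparison the two HTTP Request nodes and the If node perform is equivalent to this sketch; the assumption that /rest/settings exposes the running version as `versionCli` may vary by n8n version:

```python
import requests

# Latest published n8n version from the npm registry.
latest = requests.get("https://registry.npmjs.org/n8n/latest", timeout=10).json()["version"]

# Locally running version -- /rest/settings is assumed to expose it as
# `versionCli` (possibly nested under `data`); adjust for your n8n version.
settings = requests.get("http://192.168.100.18:5678/rest/settings", timeout=10).json()
local = settings.get("data", settings).get("versionCli")

update_needed = local != latest
print(f"local={local} latest={latest} update_needed={update_needed}")
```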
by scrapeless official
Brief Overview
This workflow integrates Linear, Scrapeless, and Claude AI to create an AI research assistant that responds to natural language commands and automatically performs market research, trend analysis, data extraction, and intelligent analysis. Simply enter a command such as /search, /trends, or /crawl in a Linear issue, and the system will perform the corresponding search, crawling, or trend analysis operation and return Claude AI's analysis to Linear as a comment.

How It Works
1. Trigger: A user creates or updates an issue in Linear and enters a specific command (e.g., /search competitor analysis).
2. n8n Webhook: Listens to Linear events and triggers the automated process.
3. Command identification: A Switch node determines the type of command entered by the user (search/trends/unlock/scrape/crawl).
4. Data extraction: Calls the Scrapeless API to perform the corresponding data-crawling task.
5. Data cleaning and aggregation: A Code node unifies the structure of the data returned by Scrapeless.
6. Claude AI analysis: Claude receives the structured data and generates summaries, insights, and recommendations.
7. Result writing: The analysis results are written back to the original issue as comments via the Linear API.

Features

**Multiple commands supported**
- /search: Google SERP data query
- /trends: Google Trends trend analysis
- /unlock: Unlock protected web content (JS rendering)
- /scrape: Single-page crawling
- /crawl: Whole-site, multi-page crawling

**Claude AI intelligent analysis**
- Automatically structures Scrapeless data
- Generates actionable suggestions and trend insights
- Optimizes formatting to fit Linear's comment format

**Complete automation**
- Codeless process management based on n8n
- Multi-channel parallel logic distribution + data standardization
- Supports custom API keys, regional and language settings, and other parameters

Requirements
- **Scrapeless API Key**: Scrapeless service request credentials. Log in to the Scrapeless Dashboard, click "Setting" on the left, select "API Key Management", click "Create API Key", then click the key you created to copy it.
- **n8n Instance**: Self-hosted or n8n.cloud account.
- **Claude AI**: Anthropic API Key (Claude 3.7 Sonnet model recommended).

Installation
1. Log in to Linear and get a Personal API Token
2. Log in to n8n Cloud or a local instance
3. Import the n8n workflow JSON file provided by Scrapeless
4. Configure the following credentials: Linear API Token, Scrapeless API Token, Claude API Key
5. Configure the Webhook URL and bind it on the Linear Webhook settings page

Usage
This AI research assistant is ideal for:

| Industry / Role | Use Case |
| --------------- | -------- |
| **SaaS / B2B Software** | |
| Market Research Teams | Analyze competitor pricing pages using /unlock, and feature pages via /scrape. |
| Content & SEO | Discover trending keywords and SERP data via /search and /trends to guide content topics. |
| Product Managers | Use /crawl to explore product documentation across competitor sites for feature benchmarking. |
| **AI & Data-Driven Teams** | |
| AI Application Developers | Automate info extraction + LLM summarization for building intelligent research agents. |
| Data Analysts | Aggregate structured insights at scale using /crawl + Claude summarization. |
| Automation Engineers | Integrate command workflows (e.g., /scrape, /search) into tools like Linear to boost productivity. |
| **E-commerce / DTC Brands** | |
| Market & Competitive Analysts | Monitor competitor sites, pricing, and discounts with /unlock and /scrape. |
| SEO & Content Teams | Track keyword trends and popular queries via /search and /trends. |
| **Investment / Consulting / VC** | |
| Investment Analysts | Crawl startup product docs, guides, and support pages via /crawl for due diligence. |
| Consulting Teams | Combine SERP and trend data (/search, /trends) for fast market snapshots. |
| **Media / Intelligence Research** | |
| Journalists & Editors | Extract forum/news content from platforms like HN or Reddit using /scrape. |
| Public Opinion Analysts | Monitor multi-source keyword trends and sentiment signals to support real-time insights. |
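To illustrate the command-identification step, here is a minimal sketch of how a Switch-style router could map a Linear issue body to a Scrapeless action; the command names come from the feature list above, while the parsing details are illustrative:

```python
# A sketch of the command-routing logic the Switch node implements.
# Command names come from the template; the parsing details are illustrative.
COMMANDS = {"/search", "/trends", "/unlock", "/scrape", "/crawl"}

def route(issue_text: str):
    """Return (command, argument) for an issue body like '/search competitor analysis'."""
    parts = issue_text.strip().split(maxsplit=1)
    if not parts or parts[0] not in COMMANDS:
        return None, issue_text  # no recognized command -> ignore the event
    return parts[0], parts[1] if len(parts) > 1 else ""

print(route("/search competitor analysis"))  # ('/search', 'competitor analysis')
```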
by n8n Team
This workflow integrates web scraping and NLP functionality. It uses HTML parsing to extract links, HTTP requests to fetch essay content, and AI-based summarization using GPT-4o. It's an excellent example of an end-to-end automated task that is not only efficient but also delivers real value by condensing long-form content into summaries. Note that to use this template, you need to be on n8n version 1.50.0 or later.
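To give a flavor of the HTML-parsing step, extracting links from a fetched page might look like this sketch; the target URL is an assumption, since the template's actual source page and selectors aren't shown:

```python
import requests
from bs4 import BeautifulSoup

# Illustrative target -- the template's actual source page and selectors differ.
html = requests.get("https://example.com/essays", timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Collect hrefs from anchor tags for the later content-fetching step.
links = [a["href"] for a in soup.find_all("a", href=True)]
print(links[:10])
```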
by Abbas Ali
This n8n workflow automatically finds apartments for rent in Germany, filters them by your city, rent budget, and number of rooms, and applies to them via email. Each application includes:
- A personalized German cover letter
- A Schufa report (fetched dynamically from Google Drive)
- Recent salary slips (also fetched from Google Drive)

The workflow runs daily at a scheduled time, emails landlords or agencies automatically, and logs every application into a Google Sheet for tracking.

How It Works
1. Scheduled Trigger – Runs every day at 9 AM (adjustable).
2. Fetch Listings – Uses the immobilienscout24 API (or similar) to pull rental listings for your selected city.
3. Filter Listings – Keeps only listings matching your CITY, MAX_RENT, and ROOMS settings (a sketch of this logic follows below).
4. Fetch Documents – Retrieves your Schufa report and salary slips from Google Drive (no need for local hosting).
5. Generate Cover Letter – Creates a personalized German-language letter per apartment.
6. Send Email Application – Sends the email to the landlord or agent with cover letter + documents attached.
7. Log Applications – Saves each application (title, address, rent, date) in a Google Sheet.

How to Use
1. Import the workflow JSON into n8n.
2. Set environment variables in n8n (for security):
   - immobilienscout24_TOKEN: Your immobilienscout24 API token
   - immobilienscout24_LISTING_ACTOR: Actor ID for your preferred rental listing scraper (or custom)
   - MY_EMAIL: Your sender email address (SMTP configured in n8n)
   - SCHUFA_FILE_ID: Google Drive file ID for your Schufa PDF
   - SALARY_FILE_ID: Google Drive file ID for your salary slips PDF
   - APPLICATION_SHEET_ID: Google Sheet ID to log applications
3. Authenticate Google Drive and Google Sheets (OAuth2 in n8n).
4. Customize search filters in the Set Config node: CITY (e.g., Berlin), MAX_RENT (e.g., 1200), ROOMS (e.g., 2).
5. Activate the workflow – it will run daily at the configured time and send applications automatically.
6. Check your Google Sheet – every application will be logged for tracking.

Requirements
- An immobilienscout24 account (or another apartment listing API; can be substituted).
- A Google account (for Drive and Sheets integration).
- A Schufa report (PDF) uploaded to Google Drive.
- Recent salary slips (PDF) uploaded to Google Drive.
- An SMTP-configured email account for sending applications.
- An n8n instance (self-hosted or cloud) with:
  - Google Drive and Google Sheets credentials configured
  - Environment variables set for tokens and file IDs
  - A working email SMTP setup
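For illustration, the Filter Listings step boils down to logic like this sketch; the listing field names (`city`, `rent`, `rooms`) are assumptions about the scraper's output, while the config values mirror the Set Config node:

```python
# Config values mirror the Set Config node; listing field names are assumed.
CITY, MAX_RENT, ROOMS = "Berlin", 1200, 2

def matches(listing: dict) -> bool:
    """Keep a listing only if it is in the right city, within budget, and big enough."""
    return (
        listing.get("city", "").lower() == CITY.lower()
        and listing.get("rent", float("inf")) <= MAX_RENT
        and listing.get("rooms", 0) >= ROOMS
    )

listings = [
    {"city": "Berlin", "rent": 1100, "rooms": 2},
    {"city": "Munich", "rent": 950, "rooms": 2},
]
print([l for l in listings if matches(l)])  # keeps only the Berlin listing
```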
by Oneclick AI Squad
This guide walks you through setting up an AI-driven workflow to automate flight and hotel reservation processes using a conversational travel booking system. The workflow accepts booking requests, processes them via APIs, and sends confirmations, enabling a seamless travel booking experience.

What's the Goal?
- Automatically accept and process booking requests for flights and hotels via HTTP POST.
- Use AI to understand natural language requests and route them to the appropriate data processors.
- Search for flights and hotels using external APIs and process booking confirmations.
- Send confirmation emails and return structured booking data to users.
- Enable an automated system for efficient travel reservations.

By the end, you'll have a self-running system that handles travel bookings effortlessly.

Why Does It Matter?
Manual booking processes are time-consuming and prone to errors. This workflow offers:
- **Zero Human Error**: AI ensures accurate request parsing and booking processing.
- **Time-Saving Automation**: Automates the entire booking lifecycle, boosting efficiency.
- **Seamless Confirmation**: Sends automated emails and responses without manual intervention.
- **Enhanced User Experience**: Provides a conversational interface for bookings.

Think of it as your reliable travel booking assistant that keeps the process smooth and efficient.

How It Works
Here's the step-by-step flow of the automation:
1. **Webhook Trigger**: Accepts incoming booking requests via HTTP POST, initiating the workflow (see the sample request at the end of this section).
2. **AI Request Parser**: Uses AI to understand natural-language booking requests (e.g., flight or hotel) and extracts the relevant details.
3. **Booking Type Router**: Determines whether the request is for a flight or a hotel and routes it to the respective data processor.
4. **Flight Data Processor**: Handles flight-specific data and prepares it for the search API.
5. **Flight Search API**: Searches for available flights based on the parameters (e.g., https://api.aviationstack.com) and returns results.
6. **Hotel Data Processor**: Handles hotel-specific data and prepares it for the search API.
7. **Hotel Search API**: Searches for available hotels based on the parameters (e.g., https://api.booking.com) and returns results.
8. **Flight Booking Processor**: Processes flight bookings and generates confirmation details.
9. **Hotel Booking Processor**: Processes hotel bookings and generates confirmation details.
10. **Confirmation Message Generator**: Creates structured confirmation messages for the user.
11. **Send Confirmation Email**: Sends the booking confirmation to the user via email.
12. **Send Response**: Returns structured booking data to the user, completing the workflow.

How to Use the Workflow?
Importing the workflow into n8n is straightforward. Follow these steps to import the Conversational Travel Booker workflow:
1. Download the Workflow: Obtain the workflow file (e.g., JSON export from n8n).
2. Open n8n: Log in to your n8n instance.
3. Import Workflow: Navigate to the workflows section, click "Import," and upload the workflow file.
4. Configure Nodes: Adjust settings (e.g., API keys, webhook URLs) as needed.
5. Execute Workflow: Test and activate the workflow to start processing bookings.

Requirements
- n8n account and instance setup.
- Access to flight and hotel search APIs (e.g., Aviationstack, Booking.com).
- Email service integration for sending confirmations.
- Webhook URL for receiving booking requests.

Customizing this Workflow
- Modify the AI Request Parser to handle additional languages or booking types.
- Update API endpoints in the Flight Search API and Hotel Search API nodes to match your preferred providers.
- Adjust the Send Confirmation Email node to include custom email templates or additional recipients.
- Schedule the Webhook Trigger to align with your business hours or demand peaks.
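To give a feel for the first step, a booking request might be posted to the webhook like this sketch; the URL and payload fields are illustrative, since the template doesn't publish a request schema and the AI Request Parser is what turns free text into structured fields:

```python
import requests

# Illustrative webhook URL and payload -- replace with the URL from your
# Webhook Trigger node; the payload shape is an assumption.
WEBHOOK_URL = "https://your-n8n-host/webhook/travel-booking"

payload = {
    "message": "Book me a flight from Berlin to Rome on 2025-07-01 "
               "and a hotel near the city center for two nights.",
    "email": "traveler@example.com",
}

resp = requests.post(WEBHOOK_URL, json=payload, timeout=60)
print(resp.status_code, resp.json())
```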