by Max Tkacz
This n8n workflow template lets teams easily generate a custom AI chat assistant based on the schema of any Notion database. Simply provide the Notion database URL, and the workflow downloads the schema and creates a tailored AI assistant designed to interact with that specific database structure.

**Set Up**

Watch this quick set up video 👇

**Key Features**

- **Instant Assistant Generation**: Enter a Notion database URL, and the workflow produces an AI assistant configured to the database schema.
- **Advanced Querying**: The assistant performs flexible queries, filtering records by multiple fields (e.g., tags, names). It can also search inside Notion pages to pull relevant content from specific blocks.
- **Schema Awareness**: Understands and interacts with various Notion column types, such as text, dates, and tags, for accurate responses.
- **Reference Links**: Each query returns direct links to the exact Notion pages that inform the assistant's response, promoting transparency and easy access.
- **Self-Validation**: The workflow checks the generated assistant and, if any errors are detected, reruns the agent to fix them.

**Ideal for**

- **Product Managers**: Easily access and query product data across Notion databases.
- **Support Teams**: Quickly search through knowledge bases for precise information to enhance support accuracy.
- **Operations Teams**: Streamline access to HR, finance, or logistics data for fast, efficient retrieval.
- **Data Teams**: Automate large dataset queries across multiple properties and records.

**How It Works**

This AI assistant leverages two HTTP request tools: one for querying the Notion database and another for retrieving data within individual pages. It is powered by the Anthropic LLM (or can be swapped for GPT-4) and always provides reference links for added transparency.
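For context, here is a minimal sketch of the kind of filtered query the assistant's database tool sends to the Notion API. The property names ("Tags", "Name"), token, and database ID are placeholders, not the workflow's actual configuration:

```javascript
// All identifiers below are placeholders, not the workflow's configuration.
const notionToken = 'secret_xxx';        // Notion integration token (placeholder)
const databaseId = 'YOUR_DATABASE_ID';   // parsed from the database URL

const response = await fetch(
  `https://api.notion.com/v1/databases/${databaseId}/query`,
  {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${notionToken}`,
      'Notion-Version': '2022-06-28',
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      // Compound filter: records tagged "Engineering" whose title mentions "onboarding"
      filter: {
        and: [
          { property: 'Tags', multi_select: { contains: 'Engineering' } },
          { property: 'Name', title: { contains: 'onboarding' } },
        ],
      },
    }),
  },
);

const { results } = await response.json();
// results[i].url is the page link the assistant cites in its answer
```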
by n8n Team
This workflow integrates web scraping and NLP functionality. It uses HTML parsing to extract links, HTTP requests to fetch essay content, and GPT-4o for AI-based summarization. It's a solid example of an end-to-end automation that is both efficient and genuinely useful, condensing long-form content into concise summaries. Note that to use this template, you need to be on n8n version 1.50.0 or later.
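As a rough illustration, the link-extraction step could look like this as an n8n Code node (the template itself can use n8n's HTML node with CSS selectors). The `$json.data` field and the `.html` suffix filter are assumptions about the fetched page:

```javascript
// Rough equivalent of the link-extraction step as an n8n Code node.
const html = $input.first().json.data; // raw HTML from the HTTP Request node (assumed field)
const links = [...html.matchAll(/<a\s+[^>]*href="([^"]+)"/g)]
  .map(m => m[1])
  .filter(href => href.endsWith('.html')); // keep only essay pages (assumed URL pattern)

// Emit one n8n item per link for the downstream fetch/summarize steps
return links.map(url => ({ json: { url } }));
```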
by scrapeless official
**Brief Overview**

This workflow integrates Linear, Scrapeless, and Claude AI to create an AI research assistant that responds to natural language commands and automatically performs market research, trend analysis, data extraction, and intelligent analysis. Simply enter a command such as /search, /trends, or /crawl in a Linear task, and the system performs the corresponding search, crawling, or trend-analysis operation and returns Claude AI's analysis to Linear as a comment.

**How It Works**

1. **Trigger**: A user creates or updates an issue in Linear and enters a specific command (e.g. /search competitor analysis).
2. **n8n Webhook**: Listens to Linear events and triggers the automated process.
3. **Command identification**: A Switch node determines the type of command entered (search/trends/unlock/scrape/crawl); a sketch of this logic follows the table below.
4. **Data extraction**: Calls the Scrapeless API to perform the corresponding data-crawling task.
5. **Data cleaning and aggregation**: A Code node unifies the structure of the data returned by Scrapeless.
6. **Claude AI analysis**: Claude receives the structured data and generates summaries, insights, and recommendations.
7. **Result writing**: The analysis results are written back to the original issue as comments via the Linear API.

**Features**

Multiple commands supported:

- /search: Google SERP data query
- /trends: Google Trends trend analysis
- /unlock: Unlock protected web content (JS rendering)
- /scrape: Single-page crawling
- /crawl: Whole-site, multi-page crawling

Claude AI intelligent analysis:

- Automatically structures Scrapeless data
- Generates actionable suggestions and trend insights
- Optimizes formatting to fit Linear's comment format

Complete automation process:

- Codeless process management based on n8n
- Multi-channel parallel logic distribution plus data standardization
- Supports custom API keys, regional/language settings, and other parameters

**Requirements**

- **Scrapeless API Key**: Scrapeless service request credentials. Log in to the Scrapeless Dashboard, click "Setting" on the left, select "API Key Management", click "Create API Key", then click the key you created to copy it.
- **n8n Instance**: Self-hosted or n8n.cloud account.
- **Claude AI**: Anthropic API Key (Claude Sonnet 3.7 model recommended).

**Installation**

1. Log in to Linear and get a Personal API Token.
2. Log in to n8n Cloud or a local instance.
3. Import the n8n workflow JSON file provided by Scrapeless.
4. Configure the following environment variables and credentials: Linear API Token, Scrapeless API Token, Claude API Key.
5. Configure the Webhook URL and bind it on the Linear Webhook settings page.

**Usage**

This automated research assistant is ideal for:

| Industry / Role | Use Case |
| --- | --- |
| **SaaS / B2B Software** | |
| Market Research Teams | Analyze competitor pricing pages using /unlock, and feature pages via /scrape. |
| Content & SEO | Discover trending keywords and SERP data via /search and /trends to guide content topics. |
| Product Managers | Use /crawl to explore product documentation across competitor sites for feature benchmarking. |
| **AI & Data-Driven Teams** | |
| AI Application Developers | Automate info extraction + LLM summarization for building intelligent research agents. |
| Data Analysts | Aggregate structured insights at scale using /crawl + Claude summarization. |
| Automation Engineers | Integrate command workflows (e.g., /scrape, /search) into tools like Linear to boost productivity. |
| **E-commerce / DTC Brands** | |
| Market & Competitive Analysts | Monitor competitor sites, pricing, and discounts with /unlock and /scrape. |
| SEO & Content Teams | Track keyword trends and popular queries via /search and /trends. |
| **Investment / Consulting / VC** | |
| Investment Analysts | Crawl startup product docs, guides, and support pages via /crawl for due diligence. |
| Consulting Teams | Combine SERP and trend data (/search, /trends) for fast market snapshots. |
| **Media / Intelligence Research** | |
| Journalists & Editors | Extract forum/news content from platforms like HN or Reddit using /scrape. |
| Public Opinion Analysts | Monitor multi-source keyword trends and sentiment signals to support real-time insights. |
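As referenced in step 3 above, here is a minimal sketch of the command-identification logic, assuming Linear's standard webhook payload shape (verify the `body.data.description` path against your own events):

```javascript
// Minimal sketch of the command-identification step as an n8n Code node.
// The body.data.description path follows Linear's webhook payload shape.
const text = $input.first().json.body?.data?.description ?? '';
const match = text.match(/^\/(search|trends|unlock|scrape|crawl)\b\s*(.*)/m);

if (!match) {
  return []; // no recognized command: stop this branch
}

// e.g. "/search competitor analysis" -> { command: 'search', query: 'competitor analysis' }
return [{ json: { command: match[1], query: match[2].trim() } }];
```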
by Davide
💬🗂️🤖 This workflow automates the translation of Google Slides presentations from any language, while preserving the original formatting and slide structure. It leverages Google APIs, AI translation (Gemini/PaLM), and modular execution for high flexibility and accuracy.

DISCLAIMER: texts are split by the Google Slides API into small blocks, so the translation will not always be contextualized.

**Key Benefits**

- ⚡ **Time-Saving**: Automates the typically manual and error-prone task of translating slides.
- 🌍 **AI-Powered Accuracy**: Uses Google Gemini to provide context-aware translations while respecting defined rules.
- 🔒 **Safe & Non-Destructive**: The original presentation is never modified; a new copy is always created.
- 🎯 **Precision**: Skips irrelevant text (e.g., emails, URLs, names) to avoid mistranslation.
- 🔁 **Modular & Scalable**: Uses subworkflows and batching, ideal for presentations with many slides.
- 🎨 **Layout Preservation**: Keeps the original design and formatting intact.

**How it Works**

1. **Initialization**: The workflow starts with a manual trigger ("When clicking 'Execute workflow'"). Set the target language (IMPORTANT: use ISO-639 format). It then duplicates the specified Google Slides presentation ("Duplicate presentation") to create a new copy for translation, preserving the original.
2. **Slide Processing**: The workflow retrieves slides from the copied presentation ("Get slides from a presentation") and processes them in batches ("Loop Over Items"). For each slide, text content is extracted ("Extract Text") using a custom JavaScript snippet, which identifies and collects text elements while retaining the slide's objectId (see the sketch below).
3. **Translation**: The extracted texts are passed to a LangChain agent ("Translation expert"), which translates the content (by default from Italian to English). The agent follows strict guidelines (e.g., skipping URLs, brand names, etc.). The translated text is sent to the "Translate Google Slides" node, which replaces the original text in the presentation using the slide's objectId for targeting.
4. **Execution Flow**: The workflow includes delays ("Wait 10 sec" and "Wait 3 sec") to manage API rate limits and ensure smooth execution. The process repeats for each batch of slides until all content is translated.

**Set Up Steps**

- **Prerequisites**:
  - Ensure access to the source Google Slides presentation (specified by fileId in "Duplicate presentation").
  - Set up Google OAuth2 credentials for Google Drive and Slides (nodes reference credentials like "Google Slides account").
  - Configure the Google Gemini (PaLM) API credentials for the translation agent.
- **Configuration**:
  - Update the fileId in the "Duplicate presentation" node to point to your source presentation.
  - Adjust the translation guidelines in the "Translation expert" node if needed (e.g., language pairs or exclusion rules).
  - Modify batch sizes or wait times (e.g., "Wait 10 sec") based on API constraints.
- **Execution**: Run the workflow manually or trigger it via the "Execute Workflow" node from another workflow. Monitor progress in n8n's execution log as each slide is processed and translated sequentially.
- **Output**: The translated presentation is saved as a new file in Google Drive, with a filename that includes a timestamp (e.g., NAME_PRESENTATION_{lang}_{timestamp}).

Note: The workflow ships inactive ("active": false); enable it after configuration.

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
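As referenced in step 2, here is a simplified sketch of what an "Extract Text" Code node can do, based on the standard Google Slides API response shape (the template's actual snippet may differ):

```javascript
// Walk each slide's page elements and collect text runs together with the
// objectId needed later for the text-replacement call.
const out = [];
for (const item of $input.all()) {
  const slide = item.json; // one slide from "Get slides from a presentation"
  for (const element of slide.pageElements ?? []) {
    // Only shapes with text content carry shape.text.textElements
    for (const te of element.shape?.text?.textElements ?? []) {
      if (te.textRun?.content?.trim()) {
        out.push({ json: { objectId: element.objectId, text: te.textRun.content } });
      }
    }
  }
}
return out;
```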
by Ranjan Dailata
**Description**

This workflow automates the process of scraping the latest discussions from HackerNews, transforming raw threads into human-readable content using Google Gemini, and exporting the final content into a well-formatted Google Doc.

**Overview**

This n8n workflow extracts trending posts from the HackerNews API. It loops through each item, performs HTTP data extraction, uses Google Gemini to generate human-readable insights, and exports the enriched content into Google Docs for distribution, archiving, or content creation. (A sketch of the API calls involved appears below.)

**Who this workflow is for**

- **Tech Newsletter Writers**: Automate the collection and summarization of trending HackerNews posts for inclusion in weekly or daily newsletters.
- **Content Creators & Bloggers**: Quickly generate structured summaries and insights from HackerNews threads to use as inspiration or supporting content for blog posts, videos, or social media.
- **Startup Founders & Product Builders**: Monitor HackerNews for discussions relevant to your niche or competitors, and keep a pulse on the community's opinions.
- **Investors & Analysts**: Surface early signals from the tech ecosystem by identifying what's trending and how the community is reacting.
- **Researchers & Students**: Analyze popular discussions and emerging trends in technology, programming, and startups, enriched with AI-generated insights.
- **Digital Agencies & Consultants**: Offer HackerNews monitoring and insight reports as a value-added service to clients interested in the tech space.

**Tools Used**

- **n8n**: The core automation engine that manages the trigger, transformation, and export.
- **HackerNews API**: Provides access to trending or new HN posts.
- **Google Gemini**: Enriches HackerNews content with structured insights and human-like summaries.
- **Google Docs**: Automatically creates and updates a document with the enriched content, ready for sharing or publishing.

**How to Install**

1. **Import the Workflow**: Download the .json file and import it into your n8n instance.
2. **Set Up HackerNews Source**: Choose whether to use the HN API (via HTTP Request node) or the RSS Feed node.
3. **Configure Gemini API**: Add your Google Gemini API key and design the prompt to extract pros/cons, key themes, or insights.
4. **Set Up Google Docs Integration**: Connect your Google account and configure the Google Docs node to create/update a document.
5. **Test and Deploy**: Run a test job to ensure data flows correctly and outputs are formatted as expected.

**Use Cases**

- **Tech Newsletter Authors**: Generate ready-to-use summaries of trending HackerNews threads.
- **Startup Founders**: Stay informed on key discussions, product launches, and community feedback.
- **Investors & Analysts**: Spot early trends, technical insights, and startup momentum directly from HN.
- **Researchers**: Track community reactions to new technologies or frameworks.
- **Content Creators**: Use the enriched data to spark blog posts, YouTube scripts, or LinkedIn updates.

**Connect with Me**

- Email: ranjancse@gmail.com
- LinkedIn: https://www.linkedin.com/in/ranjan-dailata/
- Get Bright Data: Bright Data (supports free workflows with a small commission)

#n8n #automation #hackernews #contentcuration #aiwriting #geminiapi #googlegemini #techtrends #newsletterautomation #googleworkspace #rssautomation #nocode #structureddata #webscraping #contentautomation #hninsights #aiworkflow #googleintegration #webmonitoring #hnnews #aiassistant #gdocs #automationtools #gptlike #geminiwriter
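As mentioned in the Overview, the underlying HackerNews API calls look roughly like this (shown as plain Node.js 18+ for clarity; the workflow performs them through HTTP Request nodes):

```javascript
// Fetch the IDs of the current top stories from the official HN API
const ids = await (await fetch(
  'https://hacker-news.firebaseio.com/v0/topstories.json'
)).json();

// Fetch the first few items; the workflow loops over these one by one
const stories = await Promise.all(
  ids.slice(0, 5).map(async id =>
    (await fetch(`https://hacker-news.firebaseio.com/v0/item/${id}.json`)).json()
  )
);

// Each story has { title, url, score, by, descendants, ... }: the fields
// passed to Gemini for summarization and insight generation
console.log(stories.map(s => `${s.score} points: ${s.title}`));
```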
by Abbas Ali
This n8n workflow automatically finds apartments for rent in Germany, filters them by your city, rent budget, and number of rooms, and applies to them via email. Each application includes:

- A personalized German cover letter
- Your Schufa report (fetched dynamically from Google Drive)
- Recent salary slips (also fetched from Google Drive)

The workflow runs daily at a scheduled time, emails landlords or agencies automatically, and logs every application into a Google Sheet for tracking.

**How It Works**

1. **Scheduled Trigger**: Runs every day at 9 AM (adjustable).
2. **Fetch Listings**: Uses the immobilienscout24 API (or similar) to pull rental listings for your selected city.
3. **Filter Listings**: Keeps only listings matching your CITY, MAX_RENT, and ROOMS settings (see the sketch below).
4. **Fetch Documents**: Retrieves your Schufa report and salary slips from Google Drive (no local hosting needed).
5. **Generate Cover Letter**: Creates a personalized German-language letter per apartment.
6. **Send Email Application**: Sends the email to the landlord or agent with the cover letter and documents attached.
7. **Log Applications**: Saves each application (title, address, rent, date) in a Google Sheet.

**How to Use**

1. Import the workflow JSON into n8n.
2. Set environment variables in n8n (for security):
   - immobilienscout24_TOKEN: Your immobilienscout24 API token
   - immobilienscout24_LISTING_ACTOR: Actor ID for your preferred rental listing scraper (or custom)
   - MY_EMAIL: Your sender email address (SMTP configured in n8n)
   - SCHUFA_FILE_ID: Google Drive file ID for your Schufa PDF
   - SALARY_FILE_ID: Google Drive file ID for your salary slips PDF
   - APPLICATION_SHEET_ID: Google Sheet ID to log applications
3. Authenticate Google Drive and Google Sheets (OAuth2 in n8n).
4. Customize search filters in the Set Config node: CITY (e.g., Berlin), MAX_RENT (e.g., 1200), ROOMS (e.g., 2).
5. Activate the workflow. It will run daily at the configured time and send applications automatically.
6. Check your Google Sheet: every application is logged for tracking.

**Requirements**

- An immobilienscout24 account (or another apartment listing API; can be substituted).
- A Google account (for Drive and Sheets integration).
- A Schufa report (PDF) uploaded to Google Drive.
- Recent salary slips (PDF) uploaded to Google Drive.
- An SMTP-configured email account for sending applications.
- An n8n instance (self-hosted or cloud) with Google Drive and Google Sheets credentials configured, environment variables set for tokens and file IDs, and a working SMTP setup.
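The "Filter Listings" step referenced above could look like this in an n8n Code node. The `city`, `price`, and `rooms` field names are assumptions about the listing API's response:

```javascript
// Sketch of the Filter Listings step, assuming listing items carry
// city, price, and rooms fields (exact names depend on the listing API).
const CITY = 'Berlin';
const MAX_RENT = 1200;
const ROOMS = 2;

return items.filter(item => {
  const listing = item.json;
  return (
    listing.city === CITY &&
    Number(listing.price) <= MAX_RENT &&   // total rent within budget
    Number(listing.rooms) >= ROOMS         // at least the requested rooms
  );
});
```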
by dmr
This n8n workflow implements a version of the Adaptive Retrieval-Augmented Generation (RAG) framework. It recognizes that the best way to retrieve information often depends on the type of question asked. Instead of a one-size-fits-all approach, this workflow adapts its strategy based on the user's query intent.

🌟 **How it Works**

1. **Receive Query**: Takes a user query as input (along with context such as a chat session ID and Vector Store collection ID when used as a sub-workflow).
2. **Classify Query**: First, the workflow classifies the query into a predefined category. This template uses four examples:
   - **Factual**: For specific facts.
   - **Analytical**: For deeper explanations or comparisons.
   - **Opinion**: For subjective viewpoints.
   - **Contextual**: For questions relying on specific background.
3. **Select & Adapt Strategy**: Based on the classification, it selects a corresponding strategy to prepare for information retrieval (see the sketch below). The example strategies aim to:
   - **Factual**: Refine the query for precision.
   - **Analytical**: Break the query into sub-questions for broad coverage.
   - **Opinion**: Identify different viewpoints to look for.
   - **Contextual**: Incorporate implied or user-specific context.
4. **Retrieve Info**: Uses the output of the selected strategy to search the specified knowledge base (a Qdrant vector store; change as needed) for relevant documents.
5. **Generate Response**: Constructs a response using the retrieved documents, guided by a prompt tailored to the original query type.

By adapting the retrieval strategy, this workflow aims to provide more relevant results tailored to the user's intent.

⚙️ **Usage & Flexibility**

- **Sub-Workflow**: Designed to be called from other n8n workflows, passing user_query, chat_memory_key, and vector_store_id as inputs.
- **Chat Testing**: Can also be triggered directly via the n8n Chat interface for easy testing and interaction.
- **Customizable Framework**: The query categories (Factual, Analytical, etc.) and the associated retrieval strategies are examples. You can modify or replace them entirely to fit your specific domain or requirements.

🛠️ **Requirements**

- **Credentials**: You will need API credentials configured in your n8n instance for:
  - Google Gemini (AI models)
  - Qdrant (vector store)
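One way to implement the "Select & Adapt Strategy" step referenced above is a simple lookup in a Code node. The instruction strings here are illustrative, not the template's actual prompts:

```javascript
// Illustrative strategy lookup keyed by the classifier's output.
const strategies = {
  factual: 'Rewrite the query to be precise and entity-focused.',
  analytical: 'Decompose the query into 3-5 sub-questions covering its facets.',
  opinion: 'List the distinct viewpoints the query touches and search for each.',
  contextual: 'Expand the query with the implied user or session context.',
};

const input = $input.first().json; // e.g. { user_query, classification, ... }
const type = (input.classification ?? 'factual').toLowerCase();

// The selected strategy text steers the retrieval-preparation prompt downstream
return [{ json: { user_query: input.user_query, strategy: strategies[type] } }];
```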
by Catalina Kuo
**Overview**

Do you often forget to record expenses? Let Spending Tracker Bot help you!

This AI image/text Spending Tracker LINE Bot workflow lets you quickly create a customized expense-tracking bot without writing a line of code. At any time, you can speak or send a photo, and the AI will parse it and automatically log the expense to your cloud ledger.

**Preparation**

1. Enable the Google Sheets API in GCP and complete the OAuth setup.
2. Create the Google Sheet and populate the field names (feel free to modify based on your own needs).
3. Configure the Webhook URL in the LINE Developers Console.
4. Obtain an OpenAI API Key.

**Node Configurations**

- **Webhook**
  - Purpose: The URL used to receive incoming requests from LINE.
  - Configuration: Paste this URL into the Webhook URL field in your LINE Developers Console.
- **Switch based on Expense Type & Set/Https**
  - Purpose: Distinguish whether the incoming message is text or an image.
  - Configuration: Use a Switch node to route the flow accordingly.
- **AI Agent**
  - Purpose: Extract and organize the required fields.
  - Configuration: Chat Model & Structured Output Parser.
- **Create a deduplication field**
  - Purpose: Prevent duplicate entries by creating a unique "for_deduplication" field.
  - Configuration: Join multiple field names using hyphens (-) as separators.
- **Aggregate & Merge_all**
  - Purpose: Prevent duplicate entries in the data table.
  - Configuration: Read the Google Sheet, extract the existing "for_deduplication" column into a dedupeList, and compare it against the newly generated "for_deduplication" value from the previous step (see the sketch below).
- **Response Switch**
  - Purpose: Route data and send appropriate replies based on the content.
  - Configuration: Use the replyToken to respond after the branching logic. Depending on the result, either write to the data table or return a message:
    - ✅ Expense recorded successfully: <for_deduplication>
    - Irrelevant details or images will not be logged.
    - ⚠️ This entry has already been logged and will not be duplicated.

**Step-by-step teaching notes**

- 【Auto Expense Tracker from LINE Messages with GPT-4 and Google Sheets】
- 【AI 圖片文字記帳 Line Bot,自動記帳寫入 Google Sheet】 (the same guide in Chinese)
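The deduplication comparison referenced in the Aggregate & Merge_all step might look like the following sketch. The expense field names and the 'Aggregate' node reference are illustrative assumptions:

```javascript
// Sketch: build the hyphen-joined key from the AI output, then check it
// against the keys already present in the sheet.
const entry = $input.first().json;          // e.g. { date, item, amount, ... } from the AI Agent
const key = [entry.date, entry.item, entry.amount].join('-');

// dedupeList is assumed to be produced by the node that reads the sheet
const dedupeList = $('Aggregate').first().json.dedupeList ?? [];
const isDuplicate = dedupeList.includes(key);

// The Response Switch branches on isDuplicate: write the row, or reply with
// the "already logged" warning message
return [{ json: { ...entry, for_deduplication: key, isDuplicate } }];
```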
by Oneclick AI Squad
This automated n8n workflow detects and manages fraudulent booking transactions through comprehensive AI-powered analysis and multi-layered security checks. The system processes incoming travel booking data, performs IP geolocation verification, enriches transaction details with AI insights, calculates dynamic risk scores, and executes automated responses based on threat levels. All transactions are logged, and appropriate notifications are sent to relevant stakeholders.

**Good to Know**

- The workflow combines multiple detection methods, including IP geolocation, AI analysis, and risk-scoring algorithms.
- The Google Gemini Chat Model provides advanced natural language processing for transaction analysis.
- Risk levels are dynamically calculated and categorized as CRITICAL, HIGH, or standard risk.
- An automated blocking and flagging system protects against fraudulent transactions in real time.
- All transaction data is logged to Google Sheets for audit trails and pattern analysis.
- The system respects API rate limits and includes proper error-handling mechanisms.

**How It Works**

1. **Initial Data Ingestion & Extraction**
   - Monitors and captures incoming booking transaction data from various sources.
   - Extracts key booking details including user information, payment data, booking location, and transaction metadata.
   - Performs initial data validation and formatting for downstream processing.
2. **IP Geolocation and AI Analysis**
   - **IP Geolocation Check**: Validates booking IP addresses by checking geolocation details and comparing them against expected user locations.
   - **AI Agent Integration**: Uses the Google Gemini Chat Model to analyze booking patterns, user behavior, and transaction anomalies.
   - **Enhanced Data Processing**: Enriches transaction data with geographical context and AI-driven risk indicators.
3. **Risk Calculation and Decision Logic** (a sketch of this step follows the lists below)
   - **Enhanced Risk Calculator**: Combines AI-generated risk scores with geolocation-based factors, payment-method analysis, and historical patterns.
   - **Critical Risk Check**: Flags transactions with risk levels marked as CRITICAL for immediate action.
   - **High Risk Check**: Identifies HIGH-risk transactions requiring additional verification steps.
   - **Dynamic Scoring**: Adjusts risk calculations based on real-time threat intelligence and pattern recognition.
4. **Action & Notification**
   - **Block User Account**: Automatically blocks user accounts for CRITICAL-risk transactions to prevent immediate fraud.
   - **Flag for Review**: Marks HIGH-risk transactions for manual review by fraud-prevention teams.
   - **Send Notifications**: Dispatches real-time alerts via email and messaging systems to security teams.
   - **Automated Responses**: Sends appropriate messages to users based on transaction status and risk level.
5. **Logging & Response**
   - **Log to Google Sheets**: Records all transaction details, risk scores, and actions taken for comprehensive audit trails.
   - **Flag for Review**: Maintains detailed logs of flagged transactions for pattern analysis and machine-learning improvements.
   - **Response Tracking**: Monitors and logs all automated responses and manual interventions.

**How to Use**

1. Import the workflow into your n8n instance.
2. Configure Google Gemini Chat Model API credentials for AI analysis.
3. Set up IP geolocation service API access for location verification.
4. Configure the Google Sheets integration for transaction logging.
5. Establish Gmail/email credentials for notification delivery.
6. Define risk thresholds and scoring parameters based on your fraud-tolerance levels.
7. Test the workflow with sample booking data to verify all components function correctly.
8. Monitor initial deployments closely to fine-tune the risk-scoring algorithms.
9. Establish manual review processes for flagged transactions.
10. Set up regular monitoring and maintenance schedules for optimal performance.

**Requirements**

- Google Gemini Chat Model API access
- IP geolocation service API credentials
- Google Sheets API integration
- Gmail API or SMTP email service for notifications
- n8n instance with the appropriate node modules installed

**Customizing This Workflow**

- **Risk Scoring Parameters**: Adjust risk-calculation algorithms and thresholds based on your specific fraud patterns and business requirements.
- **AI Model Configuration**: Fine-tune Google Gemini prompts and analysis parameters for improved accuracy in your use case.
- **Notification Channels**: Add or modify notification methods, including Slack, SMS, or webhook integrations.
- **Data Sources**: Extend input methods to accommodate additional booking platforms or payment processors.
- **Logging Destinations**: Configure alternative or additional logging systems such as databases or external SIEM platforms.
- **Geographic Rules**: Customize geolocation validation rules based on your service areas and customer base.
- **Automated Actions**: Modify or expand automated response actions based on your fraud-prevention policies.
- **Review Workflows**: Integrate with existing fraud-review systems or ticketing platforms for seamless manual review.
by Oneclick AI Squad
This guide walks you through setting up an AI-driven workflow to automate flight and hotel reservations using a conversational travel booking system. The workflow accepts booking requests, processes them via APIs, and sends confirmations, enabling a seamless travel booking experience.

**What's the Goal?**

- Automatically accept and process booking requests for flights and hotels via HTTP POST.
- Use AI to understand natural language requests and route them to the appropriate data processors.
- Search for flights and hotels using external APIs and process booking confirmations.
- Send confirmation emails and return structured booking data to users.
- Enable an automated system for efficient travel reservations.

By the end, you'll have a self-running system that handles travel bookings effortlessly.

**Why Does It Matter?**

Manual booking processes are time-consuming and prone to errors. This workflow offers:

- **Zero Human Error**: AI ensures accurate request parsing and booking processing.
- **Time-Saving Automation**: Automates the entire booking lifecycle, boosting efficiency.
- **Seamless Confirmation**: Sends automated emails and responses without manual intervention.
- **Enhanced User Experience**: Provides a conversational interface for bookings.

Think of it as your reliable travel booking assistant that keeps the process smooth and efficient.

**How It Works**

Here's the step-by-step flow of the automation (a sketch of the parsing and routing steps follows the lists below):

1. **Webhook Trigger**: Accepts incoming booking requests via HTTP POST, initiating the workflow.
2. **AI Request Parser**: Uses AI to understand natural language booking requests (e.g., flight or hotel) and extracts the relevant details.
3. **Booking Type Router**: Determines whether the request is for a flight or a hotel and routes it to the respective data processor.
4. **Flight Data Processor**: Handles flight-specific data and prepares it for the search API.
5. **Flight Search API**: Searches for available flights based on the parameters (e.g., https://api.aviationstack.com) and returns results.
6. **Hotel Data Processor**: Handles hotel-specific data and prepares it for the search API.
7. **Hotel Search API**: Searches for available hotels based on the parameters (e.g., https://api.booking.com) and returns results.
8. **Flight Booking Processor**: Processes flight bookings and generates confirmation details.
9. **Hotel Booking Processor**: Processes hotel bookings and generates confirmation details.
10. **Confirmation Message Generator**: Creates structured confirmation messages for the user.
11. **Send Confirmation Email**: Sends the booking confirmation to the user via email.
12. **Send Response**: Returns structured booking data to the user, completing the workflow.

**How to Use the Workflow?**

Importing the workflow in n8n is straightforward. Follow these steps:

1. **Download the Workflow**: Obtain the workflow file (e.g., the JSON export from n8n).
2. **Open n8n**: Log in to your n8n instance.
3. **Import Workflow**: Navigate to the workflows section, click "Import," and upload the workflow file.
4. **Configure Nodes**: Adjust settings (e.g., API keys, webhook URLs) as needed.
5. **Execute Workflow**: Test and activate the workflow to start processing bookings.

**Requirements**

- n8n account and instance setup.
- Access to flight and hotel search APIs (e.g., Aviationstack, Booking.com).
- Email service integration for sending confirmations.
- Webhook URL for receiving booking requests.

**Customizing this Workflow**

- Modify the AI Request Parser to handle additional languages or booking types.
- Update API endpoints in the Flight Search API and Hotel Search API nodes to match your preferred providers.
- Adjust the Send Confirmation Email node to include custom email templates or additional recipients.
- Schedule the Webhook Trigger to align with your business hours or demand peaks.
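As noted in "How It Works", the AI Request Parser's structured output and the Booking Type Router's check might look like this sketch (every field name is an assumption, not the template's schema):

```javascript
// Sketch of the structured output the AI Request Parser is expected to emit
// for a natural-language request like "book me a hotel in Lisbon for two".
const parsed = {
  type: 'hotel',              // 'flight' | 'hotel'
  destination: 'Lisbon',
  checkIn: '2025-09-12',
  checkOut: '2025-09-16',
  guests: 2,
};

// The Booking Type Router simply branches on the type field
const nextNode = parsed.type === 'flight'
  ? 'Flight Data Processor'
  : 'Hotel Data Processor';
console.log(`Routing request to: ${nextNode}`);
```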
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically monitors competitor pricing changes and website updates to keep you informed of market movements. It saves you time by eliminating the need to manually check competitor websites and alerts you only when actual changes occur, preventing information overload.

**Overview**

This workflow automatically scrapes competitor pricing pages (like ClickUp's) and compares the current pricing with previously stored data. It uses Bright Data to access competitor websites without being blocked and AI to intelligently extract pricing information, updating your tracking spreadsheet only when changes are detected (see the sketch below).

**Tools Used**

- **n8n**: The automation platform that orchestrates the workflow.
- **Bright Data**: For scraping competitor websites without being blocked.
- **OpenAI**: AI agent for intelligent pricing data extraction and parsing.
- **Google Sheets**: For storing and comparing historical pricing data.

**How to Install**

1. **Import the Workflow**: Download the .json file and import it into your n8n instance.
2. **Configure Bright Data**: Add your Bright Data credentials to the MCP Client node.
3. **Set Up OpenAI**: Configure your OpenAI API credentials.
4. **Configure Google Sheets**: Connect your Google Sheets account and set up your pricing tracking spreadsheet.
5. **Customize**: Set your competitor URLs and pricing monitoring schedule.

**Use Cases**

- **Product Teams**: Monitor competitor feature and pricing changes for strategic planning.
- **Sales Teams**: Stay informed of competitor pricing to adjust sales strategies.
- **Marketing Teams**: Track competitor messaging and positioning changes.
- **Business Intelligence**: Build comprehensive competitor analysis databases.

**Connect with Me**

- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Bright Data: https://get.brightdata.com/1tndi4600b25 (using this link supports my free workflows with a small commission)

#n8n #automation #competitoranalysis #pricingmonitoring #brightdata #webscraping #competitortracking #marketintelligence #n8nworkflow #workflow #nocode #pricetracking #businessintelligence #competitiveanalysis #marketresearch #competitormonitoring #pricingdata #websitemonitoring #competitorpricing #marketanalysis #competitorwatch #pricingalerts #businessautomation #competitorinsights #markettrends #pricingchanges #competitorupdates #strategicanalysis #marketposition #competitiveintelligence
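The change-detection step mentioned in the Overview could be sketched like this in an n8n Code node. The node names ('Extract Pricing', 'Read Sheet') and sheet columns ('plan', 'price') are assumptions about your setup:

```javascript
// Compare freshly extracted prices against the last stored snapshot and
// emit only the rows that actually changed.
const current = $('Extract Pricing').all().map(i => i.json); // AI-extracted prices
const stored = $('Read Sheet').all().map(i => i.json);       // previous snapshot

const changes = current.filter(c => {
  const prev = stored.find(s => s.plan === c.plan);
  // New plan, or same plan with a different price, counts as a change
  return !prev || String(prev.price) !== String(c.price);
});

// Downstream nodes update the sheet and send alerts only for these items
return changes.map(c => ({ json: c }));
```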
by Cristian Tala Sánchez
✨ **SEO Blog Post Automation with Perplexity, GPT, Leonardo AI & WordPress**

This workflow automates the creation and publishing of weekly SEO-optimized blog posts using AI and publishes them directly to WordPress, complete with featured images and tracking in Google Sheets.

🧠 **Who is this for**

This automation is ideal for:

- Startup platforms and tech blogs
- Content creators and marketers
- Solopreneurs who want consistent blog output
- Spanish-speaking audiences focused on startup trends

⚙️ **What it does**

- ⏰ Runs every Monday at 6:00 AM via CRON
- 📡 Uses Perplexity AI to research trending startup topics
- 📝 Generates a 1000-1500 word article with GPT in structured HTML
- 🎨 Creates a cinematic blog image using Leonardo AI
- 🖼️ Uploads the image to WordPress with alt text and an SEO-friendly filename (see the sketch below)
- 📰 Publishes the post in a pre-defined category
- 📊 Logs the post in Google Sheets for tracking

🚀 **How to set it up**

1. Connect your credentials:
   - Perplexity API
   - OpenAI (GPT-4.1 Mini or similar)
   - Leonardo AI (Bearer token)
   - WordPress (Basic Auth)
   - Google Sheets (OAuth2)
2. Customize your content:
   - Adjust the prompt inside the HTTP node to fit your tone or focus
   - Change the WordPress category ID
   - Update the schedule if you want a different publishing day
3. Test the workflow manually to ensure all steps function correctly.

💡 **Pro tips**

- Add Slack or email nodes to get notified when a post goes live
- Use multiple categories or RSS feeds for content diversification
- Adjust the GPT prompt to support different languages or tones
- Add post-validation rules before publishing if needed

🎯 **Why this matters**

This workflow gives you a full editorial process on autopilot: research, writing, design, publishing, and tracking, all powered by AI. No more blank pages or manual posting. Use it to scale your content strategy, boost your SEO, and stay relevant, 100% hands-free.
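For reference, the WordPress REST calls behind "uploads the image with alt text" look roughly like this sketch. The site URL, credentials, filename, and alt text are placeholders; the workflow itself performs these calls via HTTP nodes with Basic Auth:

```javascript
// All credentials and content below are placeholders.
const site = 'https://example.com';
const auth = 'Basic ' + Buffer.from('wp-user:app-password').toString('base64');
const imageBuffer = Buffer.from([]); // binary image returned by Leonardo AI

// 1) Create the media item with an SEO-friendly filename
const media = await (await fetch(`${site}/wp-json/wp/v2/media`, {
  method: 'POST',
  headers: {
    Authorization: auth,
    'Content-Disposition': 'attachment; filename="startup-trends.jpg"',
    'Content-Type': 'image/jpeg',
  },
  body: imageBuffer,
})).json();

// 2) Set the alt text on the uploaded image
await fetch(`${site}/wp-json/wp/v2/media/${media.id}`, {
  method: 'POST',
  headers: { Authorization: auth, 'Content-Type': 'application/json' },
  body: JSON.stringify({ alt_text: 'Cinematic illustration of startup trends' }),
});
// media.id is then attached to the post as its featured image
```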