by JJ Tham
Generate AI Voiceovers from Scripts and Upload to Google Drive

This is the final piece of the AI content factory. This workflow takes your text-based video scripts and automatically generates high-quality audio voiceovers for each one, turning your text into ready-to-use audio assets for your video ads. Go from a spreadsheet of text to a folder of audio files, completely on autopilot.

⚠️ CRITICAL REQUIREMENTS (Read First!)

This is an advanced, self-hosted workflow that requires specific local setup:

- **Self-Hosted n8n Only:** This workflow uses the Execute Command and Read/Write Files nodes, which require you to run your own instance of n8n. It will not work on n8n Cloud.
- **FFmpeg Installation:** You must have FFmpeg installed on the same machine where your n8n instance is running. It is used to convert the audio files to a standard format.

What it does

This is Part 3 of the AI marketing series. It connects to the Google Sheet where you generated your video scripts (in Part 2). For each script that hasn't been processed, it:

1. Uses the Google Gemini Text-to-Speech (TTS) API to generate a voiceover.
2. Saves the audio file to your local machine.
3. Uses FFmpeg to convert the raw audio into a standard .wav file.
4. Uploads the final .wav file to your Google Drive.
5. Updates the original Google Sheet with a link to the audio file in Drive and marks the script as complete.

👥 Who’s it for?

- **Video Creators & Marketers:** Mass-produce voiceovers for video ads, tutorials, or social media content without hiring voice actors.
- **Automation Power Users:** A powerful example of how n8n can bridge cloud APIs with local machine commands.
- **Agencies:** Drastically speed up the production of audio assets for client campaigns.

How to set up

IMPORTANT: This workflow is Part 3 of a series and requires the output from Part 2 ("Generate AI Video Ad Scripts"). Parts 1 and 2 of this workflow series are available for free on my n8n Creator Profile.

1. **Connect to Your Scripts Sheet:** In the "Getting Video Scripts" node, connect your Google Sheets account and provide the URL of the sheet containing your generated video scripts from Part 2.
2. **Configure AI Voice Generation (HTTP Request):** In the "HTTP Request To Generate Voice" node, go to the Query Parameters and replace INSERT YOUR API KEY HERE with your Google Gemini API key. In the JSON Body, you can customize the voice prompt (e.g., change <INSERT YOUR DESIRED ACCENT HERE>).
3. **Set Your Local File Path:** In the first "Read/Write Files from Disk" node, update the File Name field to a valid directory on your local machine where n8n has permission to write files. Replace /Users/INSERT_YOUR_LOCAL_STORAGE_HERE/.
4. **Connect Google Drive:** In the "Uploading Wav File" node, connect your Google Drive account and choose the folder where your audio files will be saved.
5. **Update Your Tracking Sheet:** In the final "Uploading Google Drive Link..." node, ensure it's connected to the same Google Sheet from Step 1. This node will update your sheet with the results.
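The FFmpeg conversion step above could be driven from a Code node that builds the shell command for the Execute Command node. A rough sketch, in which the input path and the `-y` overwrite flag are illustrative assumptions rather than values taken from the template:

```javascript
// Build the FFmpeg command that converts the raw TTS audio file to .wav.
// The sample path is hypothetical; in the workflow the path would come
// from the preceding Read/Write Files node.
function buildFfmpegCommand(rawPath) {
  const wavPath = rawPath.replace(/\.[^.]+$/, ".wav");
  // -y overwrites an existing output file instead of prompting
  return { wavPath, command: `ffmpeg -y -i "${rawPath}" "${wavPath}"` };
}

const { wavPath, command } = buildFfmpegCommand("/Users/me/voiceovers/script_01.mp3");
console.log(wavPath);  // /Users/me/voiceovers/script_01.wav
console.log(command);
```

In n8n, the returned `command` string would be referenced from the Execute Command node via an expression.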
by zahir khan
This workflow automates your daily social media content creation by generating unique, on-brand posts based on specific themes stored in Notion. It creates images using Fal.ai, writes captions with OpenAI, and schedules them to multiple platforms via Postiz.

📺 How It Works

1. **Daily Trigger:** The workflow runs automatically every day at a set time.
2. **Context Fetching:** It pulls your "Brand Guidelines" and the specific "Post Theme" for the day (e.g., Expert Advice, System, or Activity) from Notion.
3. **Image Generation:** It uses OpenAI to craft a detailed image prompt based on the theme, then sends it to Fal.ai to generate a high-quality visual.
4. **Caption Writing:** It uses OpenAI again to write an engaging caption that adheres to your brand voice.
5. **Scheduling:** Finally, it uploads the media to Postiz and schedules it for publication on LinkedIn, X (Twitter), Facebook, and Instagram.

🔧 How to set up

1. **Notion:** Create a "Brand Guidelines" database and a "Post Themes" database.
2. **Configure Nodes:** Update the Notion nodes in the workflow to point to your specific Database IDs.
3. **Credentials:** Connect your accounts for OpenAI, Fal.ai, Google Drive, Notion, and Postiz.
4. **Postiz IDs:** In the final HTTP Request nodes, replace the integration_id fields with the specific IDs from your Postiz account for each social platform.

📋 Requirements

- n8n (Self-hosted or Cloud)
- Notion account
- OpenAI API Key
- Fal.ai API Key
- Postiz instance (or account)
- Google Drive account (for temporary image storage)
by Budi SJ
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Multi Platform Content Generator from YouTube using AI & RSS

This workflow automates content generation by monitoring YouTube channels, extracting transcripts via AI, and creating platform-optimized content for LinkedIn, X/Twitter, Threads, and Instagram. Ideal for creators, marketers, and social media managers aiming to scale content production with minimal effort.

✨ Key Features

- 🔔 **Automated YouTube Monitoring** via RSS feed
- 🧠 **AI-Powered Transcript Extraction** using Supadata API
- ✍️ **Multi-Platform Content Generation** with OpenRouter AI
- 🎯 **Platform Optimization** based on tone and character limits
- 📬 **Telegram Notification** for easy preview
- 📊 **Centralized Data Management via Google Sheets**

> 🗂️ All video data, summaries, and generated content are tracked and stored in a single, centralized Google Sheets template. This ensures full visibility, easy access, and smooth collaboration across your team.

⚙️ Workflow Components

1. 🧭 Channel Monitoring
- **Schedule Trigger**: Initiates the workflow periodically
- **Google Sheets (Read)**: Pulls YouTube channel URLs
- **HTTP Request + HTML Parser**: Extracts channel IDs from URLs
- **RSS Reader**: Fetches latest video metadata

2. 🧾 Content Processing
- **Supadata API**: Extracts the transcript from each YouTube video
- **OpenRouter AI**: Summarizes the transcript and generates content per platform
- **Conditional Check**: Prevents duplicate content by checking existing records

3. 📤 Multi-Platform Output
- **LinkedIn**: Story-driven format (≤ 1300 characters)
- **X/Twitter**: Short, punchy copy (≤ 280 characters)
- **Threads**: Friendly, conversational
- **Instagram**: Short captions for visual posts

4. 🗃️ Data Management
- **Google Sheets (Write)**: Stores video metadata and generated posts
- **Telegram Bot**: Sends a content preview
- **ID Tracking**: Avoids reprocessing using the video ID

🔐 Required Credentials

- **Google Sheets OAuth2**
- **Supadata API**
- **OpenRouter API**
- **Telegram Bot Token & Chat ID**

🎁 Benefits

- ⌛ **Save Time**: Automates transcript extraction and content generation
- 🔊 **Consistent Tone**: Adjust AI prompts for your brand voice
- 📡 **Multi-Platform Ready**: One video → multiple formats
- 📂 **Centralized Logs via Google Sheets**: Easily track, audit, and collaborate
- 🚀 **Scalable**: Handle many channels with ease
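The per-platform character limits quoted above (LinkedIn ≤ 1300, X/Twitter ≤ 280) can be enforced with a small helper before the generated copy is written to Google Sheets. This is a hypothetical sketch, not code from the template; node and field names in the actual workflow differ:

```javascript
// Trim generated copy to each platform's character budget,
// cutting at a word boundary and appending an ellipsis.
const LIMITS = { linkedin: 1300, twitter: 280 };

function fitToPlatform(text, platform) {
  const limit = LIMITS[platform];
  if (!limit || text.length <= limit) return text; // unknown platform: leave as-is
  const cut = text.slice(0, limit - 1);            // reserve 1 char for the ellipsis
  return cut.slice(0, cut.lastIndexOf(" ")) + "…";
}

console.log(fitToPlatform("Short post", "twitter")); // Short post
```

Threads and Instagram have no hard limit in the listing above, so they fall through the `!limit` branch unchanged.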
by Amit Kumar
Overview

This n8n template automates the entire process of generating short-form AI videos and publishing them across multiple social media platforms. It combines Google Gemini for structured prompt creation, KIE AI for video generation, and Blotato for centralized publishing. The result is a fully automated content pipeline ideal for creators, marketers, agencies, or anyone who wants consistent, hands-free content generation.

This workflow is especially useful for short-video creators, meme pages, educational creators, UGC teams, auto-posting accounts, and brands that want to maintain high-frequency posting without manual effort.

Good to Know

- **API costs:** KIE AI generates videos using paid tokens/credits. Prices vary based on model, duration, and resolution (check KIE AI pricing).
- **Google Gemini model restrictions:** Certain Gemini models are geo-limited. If you receive "model not found," the model may not be available in your region.
- **Blotato publishing:** Blotato supports many platforms: YouTube, Instagram, Facebook, LinkedIn, TikTok, X, Bluesky, and more. Platform availability depends on your Blotato setup.
- **Runtime considerations:** Video generation can take time (10–60+ seconds, depending on complexity).
- **Self-hosted requirement:** This workflow uses a community node (Blotato). Community nodes do not run on n8n Cloud, so a self-hosted instance is required.

How It Works

1. **Scheduler Trigger:** Defines how frequently new videos should be created (e.g., every 12 hours).
2. **Random Template Selector:** A JavaScript node generates a random number to choose from multiple creative prompt templates.
3. **AI Agent (Google Gemini):** Gemini generates a JSON object containing a short title, a human-readable video description, and a detailed text-to-video prompt. The Structured Output Parser ensures a strict JSON shape.
4. **Video Generation with KIE AI:** The prompt is sent to KIE AI's video generation API, which creates a synthetic AI video based on the description and your chosen parameters (aspect ratio, frames, watermark removal, etc.).
5. **Polling & Retrieval:** The workflow waits until the video is fully rendered, then fetches the final video URL.
6. **Media Upload to Blotato:** The generated video is uploaded into Blotato's media storage for publishing.
7. **Automatic Posting to Social Platforms:** Blotato distributes the video to all connected platforms, for example YouTube, Instagram, Facebook, LinkedIn, Bluesky, TikTok, X, or any platform supported by your Blotato account.

This results in a fully automated "idea → video → upload → publish" pipeline.

How to Use

1. Start by testing the workflow manually to verify video generation and posting.
2. Adjust the Scheduler Trigger to fit your posting frequency.
3. Add your API credentials for Google Gemini, KIE AI, and Blotato.
4. Ensure your Blotato account has social channels connected.
5. Edit or expand the prompt templates for your content niche: comedy clips, educational videos, product demos, storytelling, pet videos, motivational content. The more template prompts you add, the more diverse your automated videos will be.

Requirements

- **Google Gemini API key:** Used for generating structured titles, descriptions, and video prompts.
- **KIE AI API key:** Required for creating the actual AI-generated video.
- **Blotato account:** Required for uploading media and automatically posting to platforms.
- **Self-hosted n8n instance:** Needed because Blotato is a community node, which n8n Cloud does not support.

Limitations

- KIE AI models may output inconsistent results if prompts are vague.
- High-frequency scheduling may consume API credits quickly.
- Some platforms (e.g., TikTok or Facebook Pages) may require additional permissions or account-linking steps in Blotato.
- Video rendering time varies depending on prompt complexity.

Customization Ideas

- Add more prompt templates to increase variety.
- Swap Gemini for an LLM of your choice (OpenAI, Claude, etc.).
- Add a Telegram, Discord, or Slack notification once posting is complete.
- Store all generated titles, descriptions, and video URLs in Google Sheets, Notion, Airtable, or Supabase.
- Add multi-language support using a translation node.
- Add an approval step where videos go to your team before publishing.
- Add analytics logging (impressions, views, etc.) using Blotato or another service.

Troubleshooting

- **Video not generating?** Check whether your KIE AI model accepts your chosen parameters.
- **Model not found?** Switch to a Gemini model supported in your region.
- **Publishing fails?** Ensure your Blotato platform accounts are authenticated.
- **Workflow stops early?** Increase the wait timeout before polling KIE AI.

This template is designed for easy setup and high flexibility. All technical details, configuration steps, and workflow logic are included in sticky notes inside the workflow. Once configured, this pipeline becomes a hands-free, AI-powered content engine capable of generating and publishing content at scale.
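The Random Template Selector described above boils down to a Code node that picks one prompt template per run. A minimal sketch with placeholder template strings (substitute your own niche prompts; the actual template's prompts are not reproduced here):

```javascript
// Pick one creative prompt template at random.
// The rng parameter is injectable so the choice can be tested deterministically.
const templates = [
  "A 15-second comedic skit about everyday office life.",
  "An educational explainer that demystifies one science fact.",
  "A heartwarming pet moment shot in golden-hour light.",
];

function pickTemplate(list, rng = Math.random) {
  const index = Math.floor(rng() * list.length);
  return { index, template: list[index] };
}

console.log(pickTemplate(templates).template);
```

Inside an n8n Code node this would end with something like `return [{ json: pickTemplate(templates) }];` so the chosen template flows to the Gemini agent.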
by Simeon Penev
Who’s it for

Content/SEO teams who want a fast, consistent, research-driven brief for copywriters from a single keyword, without manual review and analysis of the SERP (Google results).

How it works / What it does

1. A Form Trigger collects the keyword/topic and redirects to the Google Drive folder after the final node.
2. FireCrawl Search & Scrape pulls the top 5 pages for the chosen keyword.
3. An AI Agent (with Think + OpenAI Chat Model) analyzes the sources and generates an original Markdown brief.
4. Markdown to JSON converts the Markdown into Google Docs batchUpdate requests (H1/H2/H3, lists, links, spacing), which are then used in Update a document to fill the empty doc.
5. Create a document + Update a document write a Google Doc titled "SEO Brief for " and update the Google Doc in your target Drive folder.

How to set up

1. Add credentials: Firecrawl (Authorization header), OpenAI (Chat), Google Docs OAuth2.
2. Replace the placeholders: {{APIKEY}}, {{googledrivefolderid}}, {{googledrivefolderurl}}.
3. Publish and open the Form URL to test.

Requirements

Firecrawl API key • OpenAI API key • Google account with access to the target Drive folder.

Resources

- Google OAuth2 Credentials Setup - https://docs.n8n.io/integrations/builtin/credentials/google/oauth-generic/
- Firecrawl API key - https://take.ms/lGcUp
- OpenAI API key - https://docs.n8n.io/integrations/builtin/credentials/openai/
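The Markdown-to-batchUpdate conversion mentioned above can be sketched for the heading case. This is a simplified illustration of the Google Docs `batchUpdate` request shapes (`insertText` plus `updateParagraphStyle` with a `HEADING_n` named style); the template's real node also handles lists, links, and spacing, and its internals are not shown here:

```javascript
// Convert one Markdown line into Google Docs batchUpdate requests.
// startIndex is the document offset at which the text is inserted.
function lineToRequests(line, startIndex) {
  const heading = line.match(/^(#{1,3})\s+(.*)$/);
  const text = (heading ? heading[2] : line) + "\n";
  const requests = [{ insertText: { location: { index: startIndex }, text } }];
  if (heading) {
    requests.push({
      updateParagraphStyle: {
        range: { startIndex, endIndex: startIndex + text.length },
        paragraphStyle: { namedStyleType: `HEADING_${heading[1].length}` },
        fields: "namedStyleType",
      },
    });
  }
  return { requests, nextIndex: startIndex + text.length };
}

console.log(lineToRequests("## Target Audience", 1).requests.length); // 2
```

Running each Markdown line through this and accumulating `nextIndex` yields the request array the Update a document step sends to the Docs API.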
by Vadim Mubi
This workflow acts as an automated market analyst for educational purposes. It scans Binance Futures (Testnet) for high-volume pairs, applies custom technical analysis (RSI, Bollinger Bands, EMA, ATR) using JavaScript, and uses AI to validate trends against recent news sentiment. It is designed for paper trading to demonstrate how to build advanced financial logic and adaptive risk management systems in n8n without risking real funds.

💡 Why use this?

- **Smart Scanning:** Automatically filters the top 150 pairs by volume and excludes stablecoins to find active markets.
- **Dynamic Risk Management:** Uses **ATR (Average True Range)** to calculate adaptive Stop Loss and Take Profit levels based on current market volatility.
- **Custom Technical Analysis:** Demonstrates how to calculate complex indicators via a Function node, eliminating the need for paid TA APIs.
- **AI Sentiment Filter:** Scrapes recent news and uses an LLM (OpenAI) to "vet" the technical signal against potential FUD or risks.
- **Secure Execution:** Shows how to sign HMAC SHA256 requests manually to interact with the Binance Futures API.

⚙️ How it works

1. **Filter:** Runs every 15 minutes to find liquid assets on Binance.
2. **Calculate:** Computes indicators (EMA 200, BB, RSI) and defines Entry/Exit points using ATR logic.
3. **Validate:** If a technical signal matches, it fetches news and asks the AI: "Is there any breaking news that contradicts this trade?"
4. **Execute:** If the AI returns "CONFIRM", it posts the detailed analysis to Telegram and places a paper-trade order on the Testnet.

🛠 Setup Steps

1. **Binance Testnet:** Create a free account on Binance Futures Testnet and generate API keys.
2. **Configuration:** Open the 📝 MAIN CONFIG node and enter your Testnet keys and Telegram Channel ID.
3. **Credentials:** Add your OpenAI (or OpenRouter) credentials to the AI node.

> Disclaimer: This workflow connects to the Binance Testnet by default. It is intended for educational purposes only. The author and n8n are not responsible for financial decisions.
by Pratyush Kumar Jha
Deep Multiline Icebreaker — AI-driven research + personalized cold outreach

Deep Multiline Icebreaker automates high-quality, research-led outreach. Feed it a list of leads (emails + websites) and a short product brief via the built-in form; the workflow scrapes each company's site, extracts meaningful page content, uses GPT to produce concise page abstracts, aggregates the insights, and then generates tailored, multi-line cold email bodies (JSON). Final outreach rows are appended automatically to a Google Sheet so you can review, sequence, or plug them into your outreach stack.

This template is built for SDRs, growth folks, and agencies who want dramatically better reply rates by replacing generic blasts with short, highly specific icebreakers that reference subtle site signals. It’s opinionated (it focuses on non-obvious details and a concise, credible tone) but easy to tweak: the prompts, output format, and Google Sheet mapping are all editable inside n8n.

How it works

1. **Form trigger** — you submit product details, target designation, location, etc.
2. **Leads fetch** — the workflow calls an external leads scraper (an Apify act) to retrieve potential contacts.
3. **Filter & normalize** — only rows with a website + email proceed; links are normalized (relative/absolute handling).
4. **Scrape & convert** — the homepage and linked pages are fetched and converted to Markdown for clean input.
5. **Summarize (GPT)** — each page is summarized into a two-paragraph abstract.
6. **Aggregate & generate** — abstracts are aggregated and GPT generates a tailored multi-line icebreaker JSON (subject + body).
7. **Append to Google Sheets** — the resulting outreach content + lead metadata is appended to your sheet.

Nodes of interest you can edit

- On form submission1
- Leads Scraper1
- Scrape Home1
- Summarize Website Page1
- Generate Multiline Icebreaker1
- Add Row1

Quick Setup Guide

👉 Demo & Setup Video 👉 Sheet Template 👉 Course

What you’ll need (credentials)

- OpenAI API key (used by Summarize Website Page1 and Generate Multiline Icebreaker1).
- Google Sheets OAuth (write access for Add Row1).
- Apify (or your leads-source) API token for Leads Scraper1 (the template calls an Apify act).
- Optional: outbound HTTP access from your n8n host to target websites.

Recommended settings & best practices

- **Limit batch sizes** (the template uses Limit1 set to 3 by default): ramp maxItems up slowly to respect rate limits and token costs.
- **Prompt tweaks:** open the Generate Multiline Icebreaker1 prompt to tune tone, cost framing, or add product-specific selling points.
- **Deduplication:** Remove Duplicate URLs1 is included; keep it ON to avoid repeated scraping.
- **Privacy:** don't store PII longer than necessary; if you store outreach drafts, ensure your Google Sheet access is restricted.
- **Cost control:** set the temperature lower (0–0.6) for more consistent outputs and monitor your OpenAI usage.

Customization ideas

- Swap the GPT model name or change the prompt to produce shorter cold SMS or LinkedIn messages.
- Replace Apify with your own lead source (CSV upload, CRM query, or Airtable).
- Add an approval step (Slack/Email) before rows are appended to Google Sheets.
- Add a follow-up sequence generator that writes 2–3 follow-up messages per lead.

Troubleshooting quick tips

- If pages return empty abstracts, check Request web page for URL1 and network access / user-agent restrictions.
- If outputs are malformed JSON, open the Generate Multiline Icebreaker1 node and validate the JSON output option.
- If Google Sheets fails, re-authorize the Google Sheets credential and ensure the sheet ID & sheet name are correct.

Tags / Suggested listing fields

outreach, lead-gen, sales-automation, openai, web-scraping, google-sheets
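The relative/absolute link normalization in the Filter & normalize step can be handled with Node's WHATWG URL API, which resolves a relative href against the page's own URL. A sketch (the sample URLs are hypothetical):

```javascript
// Resolve a scraped href against the page it was found on,
// returning an absolute URL or null for malformed links.
function normalizeLink(href, baseUrl) {
  try {
    return new URL(href, baseUrl).toString();
  } catch {
    return null; // skip links that can't be parsed at all
  }
}

console.log(normalizeLink("/pricing", "https://acme.com/about"));
// https://acme.com/pricing
console.log(normalizeLink("https://other.com/x", "https://acme.com"));
// https://other.com/x
```

A real implementation would additionally filter out `mailto:` and `javascript:` schemes before queuing pages for scraping.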
by oka hironobu
Who is this for

DevOps engineers, site reliability teams, and business owners who need to know the moment their website goes down — and who want AI-powered diagnostics instead of just a ping alert.

What this workflow does

1. A Schedule Trigger checks your website every 5 minutes by sending an HTTP Request to your target URL.
2. A Code node evaluates the response status code and response time against your thresholds.
3. When the site is down or degraded, Google Gemini analyzes the error pattern and provides a probable root-cause diagnosis with recommended actions.
4. Every health check is logged to Google Sheets for uptime history tracking.
5. Downtime events trigger an immediate Slack alert that includes the AI diagnosis.
6. A daily summary email is sent via Gmail so you have a record even when nothing goes wrong.

How to set up

1. Replace https://example.com in the "Check Website" node with your actual URL
2. Add your free Google Gemini API key from Google AI Studio
3. Create a Google Sheet called "Uptime Log" and connect it
4. Connect Slack and select your monitoring channel
5. Connect Gmail and set the recipient email address

Requirements

- Google Gemini API key (free tier)
- Google Sheets account
- Slack workspace
- Gmail account
- n8n instance (self-hosted or cloud)

How to customize

- Change the check interval in the "Check Every 5 Minutes" trigger node
- Adjust the response time threshold in the "Evaluate Response" Code node (default: 3000ms)
- Monitor multiple URLs by duplicating the HTTP Request and Code nodes
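The evaluation logic in the "Evaluate Response" Code node might look roughly like this. The status buckets and field names are assumptions for illustration; only the 3000 ms default threshold comes from the listing above:

```javascript
// Classify a health check from the HTTP status code and response time.
function evaluate(statusCode, responseTimeMs, thresholdMs = 3000) {
  if (statusCode === 0 || statusCode >= 500) return "down";     // no response or server error
  if (statusCode >= 400) return "error";                        // client-side error (4xx)
  if (responseTimeMs > thresholdMs) return "degraded";          // slow but alive
  return "up";
}

console.log(evaluate(200, 120));  // up
console.log(evaluate(200, 4500)); // degraded
console.log(evaluate(503, 90));   // down
```

In the workflow, a result other than `"up"` is what routes the item toward the Gemini diagnosis and the Slack alert.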
by Oneclick AI Squad
This AI-powered workflow monitors trending topics across multiple social platforms, generates creative post ideas with captions and visual suggestions, and recommends optimal posting times based on engagement data.

How it works

1. **Trigger** - Runs on a schedule or webhook to start trend monitoring
2. **Fetch Trends** - Pulls trending topics from Twitter, Reddit, and Google Trends
3. **Wait & Aggregate** - Allows trend data to settle for better analysis
4. **Filter & Parse** - JavaScript code filters relevant trends for your niche
5. **AI Content Generation** - Claude creates post ideas, captions, and hashtags
6. **Visual Suggestions** - Recommends image/video concepts
7. **Wait for Analysis** - Pauses before the engagement-time calculation
8. **Optimal Timing** - JavaScript calculates the best posting times
9. **Log & Track** - Records all ideas in Google Sheets
10. **Response** - Returns ready-to-use content ideas

Setup Steps

1. Import this workflow into your n8n instance
2. Configure credentials:
   - Twitter API v2 - for trending hashtags and topics
   - Reddit API - for subreddit trending posts
   - Google Trends (no auth) - for search trends
   - Anthropic API - for Claude AI content generation
   - Google Sheets - to track generated ideas
3. Update your brand profile and niche in the config node
4. Set your target social platforms and audience
5. Activate the workflow

Sample Trigger Payload

```json
{
  "platforms": ["twitter", "instagram", "linkedin"],
  "niche": "AI & Technology",
  "trendSources": ["twitter", "reddit", "google"],
  "contentTypes": ["educational", "entertaining", "news"],
  "targetAudience": "tech professionals, 25-45",
  "brandVoice": "professional yet approachable",
  "minTrendScore": 60,
  "maxIdeasPerTrend": 3,
  "includeVisuals": true
}
```

Features

- **Multi-platform trend monitoring** (Twitter, Reddit, Google Trends)
- **AI-powered content generation** with brand voice matching
- **Visual concept suggestions** for each post idea
- **Optimal timing recommendations** based on engagement patterns
- **Hashtag strategy** with trending and niche tags
- **Content calendar integration** via Google Sheets
- **Duplicate prevention** - tracks used trends
- **Performance tracking** - logs which ideas perform best
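The Filter & Parse step can be sketched as: keep only trends whose score clears `minTrendScore` and whose topic mentions a keyword from your niche. The `niche` and `minTrendScore` field names follow the sample payload; the trend-object shape itself is an assumption:

```javascript
// Filter raw trends down to those relevant to the configured niche.
function filterTrends(trends, { niche, minTrendScore }) {
  const keywords = niche.toLowerCase().split(/[^a-z]+/).filter(Boolean);
  return trends.filter(
    (t) =>
      t.score >= minTrendScore &&
      keywords.some((k) => t.topic.toLowerCase().includes(k))
  );
}

const sample = [
  { topic: "New AI model released", score: 82 },
  { topic: "Celebrity gossip", score: 95 },
  { topic: "Tech layoffs", score: 40 },
];
console.log(filterTrends(sample, { niche: "AI & Technology", minTrendScore: 60 }));
```

Naive substring matching like this is deliberately simple; a production version might use word-boundary matching or let the LLM do the relevance check.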
by Yasser Sami
AI Documentation Crawler & Knowledge Base Builder

This n8n template automatically crawls technical documentation websites, scrapes their content, and converts it into clean, structured, developer-friendly documentation. Each page is organized into folders and saved as a Google Doc, making it easy to build or maintain an internal knowledge base.

Who’s it for

- Developer teams maintaining internal or external documentation
- SaaS companies onboarding users or support teams
- AI builders creating documentation-based knowledge bases
- Anyone who wants to turn raw docs into structured, readable references

How it works / What it does

1. **Manual Trigger** - The workflow starts manually whenever you want to crawl or refresh documentation.
2. **Documentation Discovery (Crawler)** - The workflow crawls a root documentation URL and generates a sitemap of all discoverable documentation pages.
3. **URL Processing** - The sitemap is split into individual URLs, and the workflow dynamically analyzes URL depth to recreate the documentation hierarchy.
4. **Folder Structure Creation** - A parent folder is created in Google Drive for the service, and subfolders are automatically generated to mirror the documentation structure (based on URL paths).
5. **Content Scraping** - Each documentation page is scraped using the Olostep API, and clean Markdown content is extracted from the page.
6. **Information Extraction** - AI extracts structured technical details such as API summaries, cURL examples, authentication methods, and key notes and pitfalls.
7. **AI Documentation Generation** - An AI agent transforms the scraped content into a polished, human-readable API reference or guide.
8. **Document Creation** - A Google Doc is created for each documentation page; the generated content is inserted into the document and saved in the correct folder.
9. **Rate Control** - A wait step prevents API throttling during large documentation crawls.

The result is a fully structured documentation library generated automatically from live documentation websites.

How to set up

1. Import the template into your n8n workspace.
2. Set the root documentation URL you want to crawl.
3. Connect your Google Drive and Google Docs accounts.
4. Add your Olostep API key and AI model credentials.
5. Execute the workflow to generate your documentation library.

Requirements

- n8n account (cloud or self-hosted)
- Olostep API key
- Google Drive & Google Docs access
- AI model provider (OpenAI or Gemini)

How to customize the workflow

- Limit the number of pages crawled per run.
- Adjust AI prompts to match your documentation style.
- Store results in Notion, Confluence, or Markdown files instead of Google Docs.
- Add vector storage (Pinecone, Supabase) to turn the docs into an AI knowledge base.
- Schedule automatic re-crawls to keep documentation up to date.

👉 This template turns complex technical documentation into an organized, searchable knowledge base — automatically.
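The URL-depth analysis that drives the folder structure can be sketched as: split the URL path into segments, treat the last segment as the page, and the rest as the Drive subfolder chain. A simplified illustration with a hypothetical docs URL, assuming clean path-based documentation URLs:

```javascript
// Derive a Drive folder chain and page name from a documentation URL.
function urlToFolderPath(pageUrl) {
  const segments = new URL(pageUrl).pathname.split("/").filter(Boolean);
  // Last segment is the page itself; everything before it is the folder chain.
  return { folders: segments.slice(0, -1), page: segments.at(-1) ?? "index" };
}

console.log(urlToFolderPath("https://docs.acme.com/api/auth/tokens"));
// { folders: ["api", "auth"], page: "tokens" }
```

Each entry in `folders` maps to a (possibly newly created) Google Drive subfolder, so the Drive tree mirrors the documentation site's hierarchy.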
by AppUnits AI
Automated Invoice Creation & Team Notification with Jotform, Xero, Outlook, and Telegram

This workflow automates the entire process of receiving a product/service order, checking for or creating the customer in Xero, generating an invoice, emailing it, and notifying the sales team via Telegram if no action has been taken after a set time — all triggered by a Jotform submission.

How It Works

1. **Receive Submission:** Triggered when a user submits the form; collects data such as customer details and the selected product/service.
2. **Check If Customer Exists:** Searches Xero to determine whether the customer already exists.
   - ✅ If the customer exists: update the customer details.
   - ❌ If the customer doesn't exist: create a new customer in Xero.
3. **Create the Invoice:** Generates a new invoice for the customer using the selected item.
4. **Send the Invoice:** Automatically sends the invoice via email to the customer.
5. **Wait:** Waits 30 seconds by default (you can change this), then fetches the invoice details from Xero.
6. **Notify the Team:** If no action has been taken on the invoice, notifies the sales team via Telegram so they can act fast.

Who Can Benefit from This Workflow?

- **Freelancers**
- **Service Providers**
- **Consultants & Coaches**
- **Small Businesses**
- **E-commerce or Custom Product Sellers**

Requirements

- Jotform webhook setup, more info here
- Xero credentials, more info here
- Make sure the product/service values in Jotform exactly match the item Code in your Xero account
- Email setup: update the email node (Send email); more info about Outlook setup here
- LLM model credentials
- Telegram credentials, more info here
by Kevin Meneses
What this workflow does

This workflow automatically monitors eBay Deals and sends Telegram alerts when relevant, high-quality deals are detected. It combines:

- **Web scraping with Decodo**
- **JavaScript pre-processing (no raw HTML sent to the LLM)**
- **AI-based product classification and deal scoring**
- **Rule-based filtering using price and score**

Only valuable deals reach the final notification.

How it works (overview)

1. The workflow runs manually or on a schedule.
2. The eBay Deals page is scraped using Decodo, which handles proxies and anti-bot protections (Decodo – Web Scraper for n8n).
3. JavaScript extracts only key product data (ID, title, price, URL, image).
4. An AI Agent classifies each product and assigns a deal quality score (0–10).
5. Price and score rules are applied.
6. Matching deals are sent to Telegram.

How to configure it

1. **Decodo:** Add your Decodo API credentials to the Decodo node. Optionally change the target eBay URL.
2. **AI Agent:** Add your LLM credentials (e.g., Google Gemini). No HTML is sent to the model — only compact, structured data.
3. **Telegram:** Add your Telegram Bot Token, set your chat_id in the Telegram node, and customize the alert message if needed.
4. **Filtering rules:** Adjust the price limits and minimum deal score in the IF node.
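The JavaScript pre-processing step hands the AI Agent compact records instead of raw HTML. A sketch of that reduction; the raw-item field names (`itemId`, `link`, `imageUrl`) are assumptions about what the scraper returns, not the template's actual schema:

```javascript
// Reduce a scraped eBay item to the five fields the LLM needs.
function toCompactDeal(raw) {
  return {
    id: raw.itemId,
    title: raw.title.trim(),
    // Strip currency symbols/commas so the price is a comparable number.
    price: parseFloat(String(raw.price).replace(/[^0-9.]/g, "")),
    url: raw.link,
    image: raw.imageUrl,
  };
}

const deal = toCompactDeal({
  itemId: "256012345678",
  title: "  Wireless Earbuds  ",
  price: "$29.99",
  link: "https://www.ebay.com/itm/256012345678",
  imageUrl: "https://i.ebayimg.com/images/x.jpg",
});
console.log(deal.price); // 29.99
```

Keeping the payload this small is what makes the downstream price/score rules in the IF node straightforward: `price` is already numeric before the comparison runs.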