by Wan Dinie
AI Website Analyzer to Product Ideas with FireCrawl and GPT-4.1

This n8n template demonstrates how to use AI to analyze any website and generate product ideas or summaries based on the website's content and purpose. Use cases are many: try analyzing competitor websites, discovering product opportunities, understanding business models, or generating insights from landing pages!

Good to know

At the time of writing, Firecrawl offers up to 500 free API calls. See Firecrawl Pricing for updated info. OpenAI API costs vary by model: GPT-3.5 is cheaper, while GPT-4 and above offer deeper analysis but cost more per request.

How it works

We'll collect a website URL via a manual form trigger. The URL is sent to the Firecrawl API, which deeply crawls and analyzes the website content. Firecrawl returns the scraped data, including page structure, content, and metadata. The scraped data is then sent to OpenAI's API with a custom prompt. OpenAI generates an AI-powered summary analyzing what the website does, its purpose, and potential product ideas. The final output is displayed or can be stored for further use.

How to use

The manual trigger node is used as an example, but feel free to replace it with another trigger, such as a webhook or a form. You can analyze multiple URLs by looping through a list, though processing will take longer and cost more.

Requirements

Firecrawl API key (get 500 free calls at https://firecrawl.dev)
OpenAI API key for AI analysis
Valid website URLs to analyze

Customizing this workflow

Change the output format from HTML to JSON, Markdown, or plain text by editing the Firecrawl parameters.
Modify the AI prompt to focus on specific aspects like pricing strategy, target audience, or UX analysis.
Upgrade to GPT-4.1, GPT-5.1, or GPT-5.2 for more advanced and detailed analysis.
Add a webhook trigger to analyze websites automatically from other apps or services.
Store results in a database like Supabase or Google Sheets for tracking competitor analysis over time.
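The Firecrawl-to-OpenAI handoff described above can be sketched as a small function that turns a scrape response into the analysis prompt. This is a minimal illustration, not the template's actual code: the `markdown` and `metadata.title` fields follow Firecrawl's scrape response shape, but verify them against your API version.

```javascript
// Hypothetical sketch: build the OpenAI prompt from Firecrawl's
// scrape result. Field names are assumptions -- check your response.
function buildAnalysisPrompt(scrape) {
  const title = scrape.metadata?.title ?? "Unknown site";
  const content = (scrape.markdown ?? "").slice(0, 8000); // stay within context limits
  return [
    `Analyze the website "${title}".`,
    "Summarize what the business does, its apparent purpose,",
    "and suggest three product ideas based on its content.",
    "--- SCRAPED CONTENT ---",
    content,
  ].join("\n");
}
```

A custom prompt like this is what the OpenAI node would receive in place of the raw HTML.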
by Alex Berman
Who is this for

This template is for sales teams, recruiters, and growth professionals who need to enrich a list of email addresses with full contact details -- names, phone numbers, and physical addresses -- and push verified new leads directly into HubSpot CRM without manual data entry.

How it works

The workflow accepts a list of email addresses defined in a configuration node. It submits each email to the ScraperCity People Finder API, which performs a reverse lookup to surface associated names, phones, and addresses. Because the scrape runs asynchronously, the workflow polls the job status every 60 seconds until completion. Once the scrape succeeds, it downloads the CSV results, parses each row into structured contact records, removes duplicates, and upserts every new contact into HubSpot CRM.

How to set up

Create a ScraperCity API Key credential in n8n (HTTP Header Auth, header name Authorization, value Bearer YOUR_KEY).
Create a HubSpot App Token credential in n8n.
Open the Configure Lookup Parameters node and replace the example emails with your target list.
Execute the workflow manually and monitor the polling loop until results appear in HubSpot.

Requirements

ScraperCity account with People Finder access (app.scrapercity.com)
HubSpot account with a valid private app token
n8n instance (cloud or self-hosted)

How to customize the workflow

Add more emails to the Configure Lookup Parameters node, or replace it with a Google Sheets or webhook trigger to feed emails dynamically.
Adjust the Wait Before Retry node duration if your scrapes consistently finish faster or slower.
Extend the Map Contact Fields node to populate additional HubSpot properties such as lifecycle stage or lead source.
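The 60-second polling loop can be sketched as a pure decision function sitting between the status check and the Wait Before Retry node. The status strings used here ("succeeded", "failed", etc.) are assumptions; check ScraperCity's actual job-status responses before relying on them.

```javascript
// Sketch of the polling decision. Status values are assumed, not
// taken from ScraperCity's documentation -- verify against the API.
function nextPollAction(job, elapsedSeconds, maxWaitSeconds = 900) {
  if (job.status === "succeeded") return "download_csv";
  if (job.status === "failed") return "abort";
  if (elapsedSeconds >= maxWaitSeconds) return "timeout";
  return "wait_60s_and_retry"; // matches the workflow's 60-second interval
}
```

Keeping the decision separate from the HTTP call makes the retry loop easy to adjust if your scrapes finish faster or slower.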
by Ali HAIDER
Receive WhatsApp messages via Whapi, generate AI replies with a local Ollama model, log conversations in Google Sheets, and auto-capture leads — all without touching a cloud LLM. This n8n template builds a fully automated WhatsApp AI CRM using Whapi.cloud for messaging and Ollama for 100% local AI inference — no OpenAI costs, no data leaving your server.

How it works

A Webhook node receives inbound WhatsApp messages from Whapi.cloud.
A Code node extracts the sender's phone, name, and message text, and filters out outbound/non-text messages.
An IF node ensures only real inbound text messages from customers are processed.
Google Sheets is used to fetch that customer's full conversation history, enabling memory across sessions.
A Code node builds a full prompt — system instructions + conversation history + new message — passed to the AI model.
Ollama (via the LangChain LLM Chain node) generates a contextual reply using a local model (default: gemma3:1b).
The user message and AI reply are each appended to Google Sheets as conversation history logs.
A separate Google Sheets upsert captures or updates the lead record with phone and name.
The AI reply is sent back to the customer via Whapi's HTTP API.

How to use

Set up a Whapi.cloud account and connect a WhatsApp number. Point the webhook to your n8n webhook URL.
Create a Google Sheet with a History tab (columns: Phone, Name, Role, Message, Timestamp) and a Leads tab (columns: Phone, Name, CreatedAt).
Add your Google Sheets credentials and replace YOUR_GOOGLE_SHEET_ID in the relevant nodes.
Run Ollama locally or on your server. Pull the model: ollama pull gemma3:1b. Update the model name in the Ollama node if using a different model.
Customise the system prompt inside the Build AI Prompt node to match your business (real estate, support, bookings, etc.).
Activate the workflow and send a WhatsApp message to test.
Requirements

Whapi.cloud account (WhatsApp Business API)
Ollama running locally or on a self-hosted server
Google Sheets (with OAuth2 credentials connected in n8n)

Customising this workflow

Switch AI models: Swap gemma3:1b for any Ollama-supported model like llama3, mistral, or phi3, depending on your hardware.
Change the industry: Edit the system prompt in Build AI Prompt to serve any business — bookings, customer support, sales qualification, etc.
Upgrade the CRM: Replace Google Sheets with Airtable, Notion, or a real CRM (HubSpot, Pipedrive) by swapping out the Sheets nodes.
Add handoff logic: Insert a condition to escalate to a human agent if the message contains keywords like "speak to someone" or "human".
Multi-language: The system prompt already instructs the AI to reply in the customer's language — no extra setup needed.

Who is this for

It's designed for service businesses (real estate, consultants, agencies) that want to respond to inbound WhatsApp leads instantly, log conversations, and build a simple CRM — all from a single workflow.
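The "Build AI Prompt" step (system instructions + conversation history + new message) can be sketched as a small function over rows from the History tab. The `Role` and `Message` column names match the sheet schema described above; the exact prompt format in the template's Code node may differ.

```javascript
// Sketch of the Build AI Prompt Code node: stitch the Google Sheets
// history rows and the new inbound message into one prompt string.
function buildPrompt(systemInstructions, historyRows, newMessage) {
  const history = historyRows
    .map(r => `${r.Role}: ${r.Message}`)
    .join("\n");
  return `${systemInstructions}\n\nConversation so far:\n${history}\n\ncustomer: ${newMessage}\nassistant:`;
}
```

The trailing `assistant:` cue nudges the local model to continue the dialogue rather than restate it.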
by Avkash Kakdiya
How it works

This workflow automates SEO analysis by comparing your website with a competitor’s site. It reads input URLs from Google Sheets, scrapes structured SEO data from both sites, and expands into important internal pages for deeper insights. The collected data is processed and merged before being analyzed using Google Gemini AI. Finally, it generates a structured SEO gap report and saves it back into Google Sheets while updating the workflow status.

Step-by-step

**Trigger and filter input data**
Manual Trigger – Starts the workflow execution manually.
Google Sheets (Get row(s)) – Fetches website data from the input sheet.
If – Filters only rows where status is NEW.

**Prepare and fetch website data**
Set (Edit Fields) – Maps your website and competitor URLs.
HTTP Request (My Website HTTP) – Fetches your website HTML.
HTTP Request (Competitor) – Fetches competitor website HTML.
HTML (Extract Data) – Extracts SEO elements like title, H1, H2, links, and content.

**Extract and process internal pages**
Code (All Links) – Filters important internal URLs from both websites.
Google Sheets (Insert Links) – Stores extracted links.
Split In Batches – Iterates through each page URL.
HTTP Request + HTML + Code – Scrapes and formats SEO data for each page.

**Store and update structured SEO data**
Google Sheets (Append & Update) – Saves page-level SEO data.
Wait nodes – Control execution timing and prevent rate limits.

**Analyze SEO gaps using AI**
Merge – Combines your site and competitor data.
Code – Structures the merged dataset.
Google Gemini (Message a model) – Generates the SEO gap analysis.
Code (Parse JSON) – Cleans and validates AI output.

**Save report and finalize workflow**
Google Sheets (Append Report) – Stores the SEO gap report.
Google Sheets (Update Row) – Marks the input row as DONE.

Why use this?
Automates complete SEO competitor analysis without manual effort.
Identifies keyword, content, and technical SEO gaps instantly.
Scales across multiple websites and competitors efficiently.
Provides AI-driven insights and actionable SEO improvement plans.
Centralizes all SEO data and reports inside Google Sheets.
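The "Code (All Links)" step — keeping only important internal URLs — can be sketched like this. The filtering rules here (same hostname, http(s) only, fragments stripped, duplicates dropped) are an assumption about what "important internal URLs" means; the template's actual node may apply different rules.

```javascript
// Sketch of the All Links Code node: keep internal, de-duplicated
// page URLs from a list of extracted hrefs.
function internalLinks(links, siteUrl) {
  const host = new URL(siteUrl).hostname;
  const seen = new Set();
  const out = [];
  for (const href of links) {
    let url;
    try { url = new URL(href, siteUrl); } catch { continue; } // skip malformed hrefs
    if (url.hostname !== host) continue;                       // drop external links
    if (!["http:", "https:"].includes(url.protocol)) continue; // drop mailto:, tel:, etc.
    url.hash = "";                                             // ignore #fragments
    if (!seen.has(url.href)) { seen.add(url.href); out.push(url.href); }
  }
  return out;
}
```

Relative links resolve against the site URL, so `/about` and `https://example.com/about` collapse into one entry.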
by Daniel Shashko
How it Works

This workflow automates competitive price intelligence using Bright Data's enterprise web scraping API. On a scheduled basis (default: daily at 9 AM), the system loops through configured competitor product URLs, triggers Bright Data's web scraper to extract real-time pricing data from each site, and intelligently compares competitor prices against your current pricing.

The workflow handles the full scraping lifecycle: it sends scraping requests to Bright Data, waits for completion, fetches the scraped product data, and parses prices from various formats and website structures. All pricing data is automatically logged to Google Sheets for historical tracking and trend analysis. When a competitor's price drops below yours by more than the configured threshold (e.g., 10% cheaper), the system immediately sends detailed alerts via Slack and email to your pricing team with actionable intelligence.

At the end of each monitoring run, the workflow generates a comprehensive daily summary report that aggregates all competitor data, calculates average price differences, identifies the lowest and highest competitors, and provides a complete competitive landscape view. This eliminates hours of manual competitor research and enables data-driven pricing decisions in real time.

Who is this for?

E-commerce businesses and online retailers needing automated competitive price monitoring
Product managers and pricing strategists requiring real-time competitive intelligence
Revenue operations teams managing dynamic pricing strategies across multiple products
Marketplaces competing in price-sensitive categories where margins matter
Any business that needs to track competitor pricing without manual daily checks

Setup Steps

Setup time: approx. 30-40 minutes (Bright Data configuration, credential setup, competitor URL configuration)

Requirements:
Bright Data account with Web Scraper API access
Bright Data API token (from dashboard)
Google account with a spreadsheet for price tracking
Slack workspace with pricing channels
SMTP email provider for alerts

Sign up for Bright Data and create a web scraping dataset (use the e-commerce template for product data). Obtain your Bright Data API token and dataset ID from the dashboard. Then configure these nodes:
Schedule Daily Check: Set the monitoring frequency using a cron expression (default: 9 AM daily)
Load Competitor URLs: Add the competitor product URLs array, configure your current price, set the alert threshold percentage
Loop Through Competitors: Automatically handles multiple URLs (no configuration needed)
Scrape with Bright Data: Add Bright Data
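The parse-and-compare step — reading a scraped price string and deciding whether it crosses the alert threshold — can be sketched as two small functions. The currency handling here is deliberately simple and is an assumption; real product pages use many formats, and the template's parsing logic may be more involved.

```javascript
// Sketch of price parsing and the alert-threshold check.
function parsePrice(text) {
  const m = String(text).replace(/,/g, "").match(/\d+(\.\d+)?/);
  return m ? parseFloat(m[0]) : null; // null when no numeric price found
}

function shouldAlert(competitorPrice, myPrice, thresholdPct = 10) {
  if (competitorPrice == null) return false;
  // How much cheaper the competitor is, as a percentage of your price.
  const diffPct = ((myPrice - competitorPrice) / myPrice) * 100;
  return diffPct > thresholdPct;
}
```

With a 10% threshold, a competitor at $85 against your $100 triggers an alert, while $95 does not.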
by Servify
Who is this for

Sales teams and agencies using Retell AI for voice outreach who want to automatically analyze every call and push insights into their CRM. Ideal for businesses running AI voice agents that need structured post-call intelligence without manual review.

How it works

When a Retell AI voice call ends, the platform sends a call_analyzed webhook to this workflow. It parses the transcript, call duration, and metadata, then sends everything to OpenAI for analysis. The AI returns structured data: sentiment, lead score (1-10), key topics, buying signals, objections, action items, and a recommended next step. The enriched data is synced to HubSpot, updating the contact record. If the lead score meets your threshold, the workflow alerts your sales team on Slack and creates a priority follow-up task in HubSpot. Every call is logged to Google Sheets for tracking.

How to set up

Open the Set user config variables node and enter your Slack channel, lead score threshold (default: 7), Google Sheet ID, and sheet name.
Connect your OpenAI, HubSpot, Slack, and Google Sheets credentials in each respective node.
Copy the production webhook URL from the trigger node and add it to your Retell AI dashboard under Webhook Settings for the call_analyzed event.
Create a Google Sheet with columns: Date, Call ID, From Number, Duration, Sentiment, Lead Score, Summary, Action Items, Qualified.
Activate the workflow and make a test call to verify the full pipeline.

Requirements

Retell AI account with an active voice agent
OpenAI API key (GPT-4o-mini or GPT-4o)
HubSpot CRM account
Slack workspace with a bot token
Google Sheets

How to customize

Adjust the lead score threshold in the config node to control when hot-lead alerts fire.
Modify the AI analysis prompt to extract industry-specific fields (e.g., appointment booked, insurance type, budget range).
Add a Twilio SMS branch for instant text follow-ups to hot leads.
Connect additional CRM nodes if you use Pipedrive, Salesforce, or another platform instead of HubSpot.
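The qualification step after the AI analysis — comparing the returned lead score against the configurable threshold — can be sketched as follows. The field names (`lead_score`, `sentiment`, `recommended_next_step`) are assumptions based on the description above, not Retell's or the template's actual schema.

```javascript
// Sketch of the hot-lead check: field names are hypothetical.
function qualifyLead(analysis, threshold = 7) {
  const score = Number(analysis.lead_score);
  return {
    qualified: Number.isFinite(score) && score >= threshold, // fire Slack alert?
    sentiment: analysis.sentiment ?? "unknown",
    nextStep: analysis.recommended_next_step ?? "review manually",
  };
}
```

Guarding with `Number.isFinite` keeps a malformed AI response from silently qualifying as a hot lead.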
by Avkash Kakdiya
How it works

This workflow captures new subscribers from a Mailchimp list and extracts their key details. It then enriches the subscriber data using AI, predicting professional attributes and assigning a lead score. Based on the score, high-value leads are identified, and all leads are synced into HubSpot and Pipedrive. For top-priority leads, the workflow automatically creates new deals in Pipedrive for sales follow-up.

Step-by-step

1. Capture subscriber data
**Mailchimp Subscriber Trigger** – Detects new signups in a Mailchimp list.
**Extract Subscriber Data** – Normalizes the payload into clean fields like name, email, and timestamp.

2. Enrich with AI
**Lead Enrichment AI** – Uses AI to infer company, role, industry, intent, LinkedIn, and lead score.
**Parse & Merge Enrichment** – Merges AI output with subscriber info and sets defaults if parsing fails.

3. Qualify leads
**High-Value Lead Check** – Filters leads with a score ≥70 to flag them as priority.
**Create High-Value Deal** – Opens a new deal in Pipedrive for high-value leads.

4. Sync to CRMs
**HubSpot Contact Sync** – Updates enriched lead data in HubSpot CRM.
**Pipedrive Person Create** – Adds the lead as a person in Pipedrive.

Why use this?

Automatically enrich raw Mailchimp subscribers with valuable professional insights.
Score and qualify leads instantly for prioritization.
Keep HubSpot and Pipedrive updated with enriched lead records.
Automate deal creation for high-value opportunities, saving sales team effort.
Build a seamless pipeline from marketing signups to CRM-ready opportunities.
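The "Parse & Merge Enrichment" behavior — merging AI output with the subscriber record and falling back to defaults when parsing fails — can be sketched like this. The enrichment field names (`company`, `role`, `lead_score`) and the defaults are assumptions; the ≥70 cutoff matches the high-value check described above.

```javascript
// Sketch of Parse & Merge Enrichment: hypothetical field names.
function mergeEnrichment(subscriber, aiJson) {
  let enrich = {};
  try { enrich = JSON.parse(aiJson); } catch { /* keep defaults on bad AI output */ }
  const lead = {
    ...subscriber,
    company: enrich.company ?? "Unknown",
    role: enrich.role ?? "Unknown",
    industry: enrich.industry ?? "Unknown",
    leadScore: Number(enrich.lead_score) || 0,
  };
  lead.highValue = lead.leadScore >= 70; // matches the High-Value Lead Check
  return lead;
}
```

Defaulting instead of failing keeps the CRM sync running even when the model returns unparseable text.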
by MUHAMMAD SHAHEER
Who’s it for

This template is designed for creators, researchers, freelance writers, founders, and automation professionals who want a reliable way to generate structured, citation-backed research content without doing manual data collection. Anyone creating blog posts, reports, briefs, or research summaries will benefit from this system.

What it does

This workflow turns a simple form submission into a complete research pipeline. It accepts a topic, determines what needs to be researched, gathers information from the web, writes content, fact-checks it against the collected sources, edits the draft for clarity, and compiles a final report. It behaves like a small agentic research team inside n8n.

How it works

A form collects the research topic, depth, and desired output format.
A research agent generates focused search queries.
SERP API retrieves real-time results for each query.
The workflow aggregates and structures all findings.
A writing agent creates the first draft based on the data.
A fact-checking agent verifies statements against the sources.
An editor agent improves tone, flow, and structure.
A final review agent produces the completed research document with citations.

This workflow includes annotated sticky notes to explain each step and guide configuration.

Requirements

Groq API key for running the Llama 3.3 model.
SERP API key for performing web searches.
An n8n instance (cloud or self-hosted).
No additional dependencies are required.

How to set up

Add your Groq and SERP API credentials using n8n’s credential manager.
Update the form fields if you want custom depth or output formats.
Follow the sticky notes for detailed configuration.
Run the workflow and submit a topic through the form to generate your first research report.

How to customize

Replace the writer agent with a different model if you prefer a specific writing style.
Adjust the number of search queries or SERP results for deeper research.
Add additional steps such as PDF generation, sending outputs to Notion, or publishing to WordPress. Modify the form to suit industry-specific content needs.
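The SERP retrieval step can be sketched as a request-URL builder for each agent-generated query. The endpoint and parameter names below follow SerpAPI's Google search API; if the template uses a different SERP provider, treat them as placeholders.

```javascript
// Sketch: build a SerpAPI request URL for one generated query.
// Endpoint/parameters assume SerpAPI; adjust for other providers.
function serpRequestUrl(query, apiKey, numResults = 5) {
  const params = new URLSearchParams({
    engine: "google",
    q: query,
    num: String(numResults),
    api_key: apiKey,
  });
  return `https://serpapi.com/search.json?${params}`;
}
```

Raising `numResults` is the simplest lever for deeper research, at the cost of more tokens fed to the writing and fact-checking agents.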
by Zakwan
📖 Overview

This template automates the process of researching a keyword, scraping top-ranking articles, cleaning their content, and generating a high-quality SEO-optimized blog post. It uses Google Search via RapidAPI, Ollama with Mistral AI, and Google Drive to deliver an end-to-end automated content workflow. Ideal for content creators, SEO specialists, bloggers, and marketers who need to quickly gather and summarize insights from multiple sources to create superior content.

⚙️ Prerequisites

Before using this workflow, make sure you have:
n8n installed (Desktop, Docker, or Cloud).
Ollama installed with the mistral:7b model: ollama pull mistral:7b
RapidAPI account (for Google Search API).
Google Drive account (with a target folder where articles will be saved).

🔑 Credentials Required

RapidAPI (Google Search API): Header authentication with your API key. Example headers:
x-rapidapi-key: YOUR_API_KEY
x-rapidapi-host: google-search74.p.rapidapi.com

Google Drive OAuth2: Allow read/write permissions. Update the folderId with your Drive folder where articles should be stored.

Ollama API: Base URL http://localhost:11434 (local n8n) or http://host.docker.internal:11434 (inside Docker). Ensure the mistral:7b model is available.

🚀 Setup Instructions

Configure RapidAPI: Sign up at RapidAPI. Subscribe to the Google Search API. Create an HTTP Header Auth credential in n8n with your API key.
Configure Google Drive: In n8n, add a Google Drive OAuth2 credential. Select the Drive folder ID where output files should be saved.
Configure Ollama: Install Ollama locally. Pull the required model (mistral:7b). Create an Ollama API credential in n8n.
Run the Workflow: Trigger by sending a chat message with your target keyword. The workflow searches Google, extracts the top 3 results, scrapes the articles, cleans the content, and generates a structured blog post. The final output is stored in Google Drive as a .docx file.

🎨 Customization Options

Search Engine → Swap out RapidAPI with Bing or SerpAPI.
Number of Articles → Change limit: 3 in the Google Search node.
Content Cleaning → Modify the regex in the “Clean Body Text” node to capture additional HTML tags.
AI Model → Replace mistral:7b with llama3, mixtral, or any other Ollama-supported model.
Storage → Save output to a different Google Drive folder or export to Notion/Slack.

📌 Workflow Highlights

Google Search (RapidAPI) → Fetch top 3 results for your keyword.
HTTP Request + Code Nodes → Extract and clean article body text.
Mistral AI via Ollama → Summarize, optimize, and refine the content.
Google Drive → Save the final blog-ready article automatically.
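The "Clean Body Text" step can be sketched as a regex-based cleaner: strip scripts, styles, and remaining tags, then collapse whitespace. This is one common approach, not necessarily the template's exact regex; the node you import may target different elements.

```javascript
// Sketch of a Clean Body Text Code node: regex-based HTML stripping.
function cleanBodyText(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, " ") // drop scripts and their content
    .replace(/<style[\s\S]*?<\/style>/gi, " ")   // drop inline styles
    .replace(/<[^>]+>/g, " ")                    // drop all remaining tags
    .replace(/\s+/g, " ")                        // collapse whitespace
    .trim();
}
```

Regex stripping is crude but dependency-free; for messy real-world pages an HTML parser is more robust.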
by A Z
⚡ Quick Setup

Import this workflow into your n8n instance.
Add your Apify, Google Sheets, and Firecrawl credentials.
Activate the workflow to start your automated lead enrichment system.
Copy the webhook URL from the MCP trigger node.
Connect AI agents using the MCP URL.

🔧 How it Works

This solution combines two powerful workflows to deliver fully enriched, AI-ready business leads from Google Maps:
Apify Google Maps Scraper Node: Collects business data and, if enabled, enriches each lead with contact details and social profiles.
Leads Missing Enrichment: Any leads without contact or social info are automatically saved to a Google Sheet.
Firecrawl & Code Node Workflow: A second workflow monitors the Google Sheet, crawls each business’s website using Firecrawl, and extracts additional social media profiles or contact info using a Code node.
Personalization Logic: AI-powered nodes generate tailored outreach content for each enriched lead.
Native Integration: The entire process is exposed as an MCP-compatible interface, returning enriched and personalized lead data directly to the AI agent.

📋 Available Operations

Business Search: Find businesses on Google Maps by location, category, or keyword.
Lead Enrichment: Automatically append contact details, social profiles, and other business info using Apify and Firecrawl.
Personalized Outreach Generation: Create custom messages or emails for each lead.
Batch Processing: Handle multiple leads in a single request.
Status & Error Reporting: Get real-time feedback on processing, enrichment, and crawling.

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
Search queries (location, keywords, categories)
Enrichment options (contact, social, etc.)
Personalization variables (name, business type, etc.)
Response Format: Returns fully enriched lead data and personalized outreach content in a structured format.
by Vivekanand M
Upwork Proposal Automation with AI, Airtable and Slack

📘 Description

This workflow automates the complete Upwork job discovery and proposal generation process by continuously monitoring job listings, intelligently filtering opportunities based on your skill set, generating personalised AI-written proposals, and delivering instant notifications — all without any manual effort.

The workflow is triggered automatically every minute via Vollna's RSS feed, which monitors Upwork job postings matching your configured search filters. Each new job listing is parsed and analysed to extract key details, including title, description, budget, required skills, and job ID. A skills matching engine scores each job against your defined skill set and filters out weak matches. Duplicate jobs are automatically detected and skipped using Airtable as a reference store, ensuring AI credits are never wasted on already-processed listings.

For every qualified new job, GPT-4o-mini generates a tailored 150–250-word proposal that references specific details from the job post, aligns your experience to the client's exact requirements, and ends with a clear call to action. The proposal and all job metadata are saved to an Airtable base for review. A formatted Slack notification is sent instantly with the full job details and generated proposal, allowing you to review, edit, and apply directly on Upwork with a single click.

⚙️ What This Workflow Does (Step-by-Step)

📡 RSS Feed Monitoring — Polls Vollna's Upwork RSS feed every minute for new job listings matching your skill keywords. Vollna replaces Upwork's discontinued native RSS feed (removed August 2024) and supports 30+ filter parameters, including category, budget, and client history.
🔍 Parse & Extract — Extracts structured fields from each RSS item, including job title, full description, budget, required skills, posted date, job ID, and a clean Upwork job URL (decoded from Vollna's redirect format).
🎯 Filter: Skills Match — Scores each job against your defined skill list. Jobs scoring fewer than 2 matched skills are dropped immediately, ensuring only relevant opportunities proceed.
⭐ Filter: Client Quality — Filters out clients with ratings below 4.5. New clients with no rating history are allowed through by default.
🔁 Duplicate Detection — Queries Airtable to check if the job ID has already been processed in a previous run. Duplicate jobs are silently skipped without generating a proposal.
🤖 AI Proposal Generation — Calls GPT-4o-mini with a structured prompt containing the job details and your freelancer profile. Generates a concise, personalised proposal that opens with a specific reference to the job post, highlights relevant experience with real numbers, proposes a concrete first step, and ends with a soft call to action.
💾 Save to Airtable — Creates a new record in your Airtable base with all job fields, matched skills, match score, generated proposal, and status set to "New" for review tracking.
💬 Slack Notification — Sends a formatted message to your Slack channel with the job title, budget, match score, matched skills, required skills, direct Upwork job link, and the full AI-generated proposal — ready to copy and submit.

🧩 Prerequisites

**Vollna account** — Free tier available at vollna.com. Create a job filter matching your skills and copy the RSS feed URL from the Filters section
**OpenAI API key** — Used for GPT-4o-mini proposal generation (~$0.007 per proposal)
**Airtable account** — Free tier supports up to 1,000 records. Create a base with the schema below
**Slack workspace** — Bot token with chat:write permission, invited to your target channel

🗄️ Airtable Base Schema

Create a table called Upwork Proposals with these fields:

| Field Name | Type |
|---|---|
| Job Title | Single line text |
| Job URL | URL |
| Upwork URL | URL |
| Posted At | Date |
| Budget | Single line text |
| Skills Required | Long text |
| Matched Skills | Long text |
| Match Score | Number |
| AI Proposal | Long text |
| Status | Single select: New, Reviewed, Applied, Skipped |
| Job ID | Single line text |
| Notes | Long text |

💰 Cost Estimate

| Item | Estimated Cost |
|---|---|
| Vollna (free tier) | $0/mo |
| GPT-4o-mini (50 proposals/day) | $1–3/mo |
| Airtable (free tier) | $0/mo |
| n8n self-hosted (AWS t3.small) | ~$10–15/mo |
| Total | ~$11–18/mo |

⚙️ Setup Instructions

Vollna — Sign up at vollna.com, create a job filter with your target keywords and skill categories, then copy the RSS feed URL from the Filters section
Airtable — Create a new base and table using the schema above. Copy your Base ID from the Airtable URL and connect your Personal Access Token in n8n credentials
OpenAI — Add your OpenAI API key as an n8n credential (HTTP Header Auth with Authorisation: Bearer sk-...)
Slack — Create a Slack app, add the chat:write scope, install it to your workspace, and invite the bot to your channel with /invite @your-bot-name
Customise the AI prompt — Open the Build OpenAI Payload node and update the MY PROFILE section with your actual name, skills, and experience details
Update skill filters — In the Filter: Skills Match node, update the YOUR_SKILLS array to match your exact skill set
Publish the workflow — Click Publish.
The RSS trigger will begin polling Vollna every minute automatically.

💡 Key Benefits

✔ Fully automated job discovery — no manual searching required
✔ Skills-based filtering ensures AI only runs on relevant jobs
✔ Personalised proposals referencing specific job details — not generic templates
✔ Airtable CRM for tracking proposal status and conversion rates
✔ Instant Slack alerts with one-click access to apply on Upwork
✔ Deduplication prevents reprocessing the same job across runs
✔ Modular design — swap OpenAI for Claude or AWS Bedrock with minimal changes
✔ Cost-optimised — GPT-4o-mini keeps proposal generation under $3/month at scale

👥 Perfect For

Freelancers on Upwork wanting to automate proposal writing
Agencies managing multiple freelancer profiles
Developers and automation specialists looking to win more technical contracts
Anyone spending more than 30 minutes per day manually browsing and applying to Upwork jobs
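The Filter: Skills Match step — counting your skills in the job text and dropping anything below 2 matches — can be sketched like this. The `YOUR_SKILLS` list is the same placeholder named in the setup instructions; the scoring here is a plain case-insensitive substring match, which may differ from the template's exact logic.

```javascript
// Sketch of the Filter: Skills Match node. YOUR_SKILLS is a
// placeholder -- replace with your own skill list.
const YOUR_SKILLS = ["n8n", "python", "automation", "api integration"];

function matchScore(jobText, skills = YOUR_SKILLS) {
  const text = jobText.toLowerCase();
  const matched = skills.filter(s => text.includes(s.toLowerCase()));
  return { score: matched.length, matched, keep: matched.length >= 2 };
}
```

Substring matching keeps the node simple, but note it can over-match (e.g. "automate" vs "automation" boundaries) — tighten with word-boundary regexes if needed.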
by Khairul Muhtadin
This automated TLDW (Too Long; Didn't Watch) generator uses Decodo's scraping API to extract complete video transcripts and metadata, then uses Google Gemini 3 to create intelligent summaries with key points, a chapter breakdown, tools mentioned, and actionable takeaways — eliminating hours of manual note-taking and video watching.

Why Use This Workflow?

Time Savings: Convert a 2-hour video into a readable 5-minute summary, reducing research time by 95%
Comprehensive Coverage: Captures key points, chapters, tools, quotes, and actionable steps that manual notes often miss
Instant Accessibility: Receive structured summaries directly in Telegram within 30-60 seconds of sharing a link
Multi-Language Support: Process transcripts in multiple languages supported by YouTube's auto-caption system

Ideal For

**Content Creators & Researchers:** Quickly extract insights from competitor videos, educational content, or industry talks without watching hours of footage
**Students & Educators:** Generate study notes from lecture recordings, online courses, or tutorial videos with chapter-based breakdowns
**Marketing Teams:** Analyze competitor content strategies, extract tools and techniques mentioned, and identify trending topics across multiple videos
**Busy Professionals:** Stay updated with conference talks, webinars, or industry updates by reading summaries instead of watching full recordings

How It Works

Trigger: User sends any YouTube URL (youtube.com or youtu.be) to a configured Telegram bot
Data Collection: Workflow extracts the video ID and simultaneously fetches the full transcript and metadata (title, channel, views, duration, chapters, tags) via the Decodo API
Processing: Raw transcript data is extracted and cleaned, while metadata is parsed into structured fields including formatted statistics and chapter timestamps
AI Processing: Google Gemini Flash analyzes the transcript to generate a structured summary covering a one-line overview, key points, main topics by chapter, tools mentioned, target audience, practical takeaways, and notable quotes

Setup Guide

Prerequisites

| Requirement | Type | Purpose |
|-------------|------|---------|
| n8n instance | Essential | Workflow execution platform |
| Telegram Bot API | Essential | Receives video links and delivers summaries |
| Decodo Scraper API | Essential | Extracts YouTube transcripts and metadata |
| Google Gemini API | Essential | AI-powered summary generation |

Installation Steps

Import the JSON file to your n8n instance.
Configure credentials:
Telegram Bot API: Create a bot via @BotFather on Telegram, obtain the API token, and configure it in n8n Telegram credentials
Decodo API: Sign up at the Decodo Dashboard, get your API key, and create an HTTP Header Auth credential with header name "Authorization" and value "Basic [YOUR_API_KEY]"
Google Gemini API: Obtain an API key from Google AI Studio and configure it in n8n Google Palm API credentials
Update environment-specific values:
In the "Alert Admin" node, replace YOUR_CHAT_ID with your personal Telegram user ID for error notifications
Optionally adjust the languageCode in the "Set: Video ID & Config" node (default: "en")
Customize settings: Modify the AI prompt in the "Generate TLDR" node to adjust summary structure and depth.
Test execution: Send a YouTube link to your Telegram bot and verify you receive the "Processing..." notification, the video info card, and the formatted summary chunks.

Technical Details

Workflow Logic

The workflow employs parallel processing for efficiency. Transcript and metadata are fetched simultaneously after video ID extraction. Once both API calls complete, the transcript feeds directly into Gemini AI while metadata is parsed separately. The merge node combines AI output with structured metadata before splitting into Telegram-friendly chunks. Error handling is isolated on a separate branch triggered by any node failure, formatting error details and alerting admins without disrupting the main flow.
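The video-ID extraction step can be sketched as a single regex covering both URL forms the trigger accepts. This is an illustrative sketch, not the template's actual code; YouTube Shorts and embed URLs would need extra patterns.

```javascript
// Sketch: pull the 11-character video ID from youtube.com/watch?v=
// or youtu.be/ links. Other URL forms are not handled here.
function extractVideoId(url) {
  const m = String(url).match(
    /(?:youtube\.com\/watch\?v=|youtu\.be\/)([A-Za-z0-9_-]{11})/
  );
  return m ? m[1] : null; // null drives the "Not a YouTube URL" branch
}
```

Returning `null` for unrecognized links is what lets the workflow route bad input to its error message instead of calling the Decodo API.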
Customization Options

Basic Adjustments:
**Language Selection**: Change languageCode from "en" to "id", "es", "fr", etc. to fetch transcripts in different languages (YouTube must have captions available)
**Summary Style**: Edit the prompt in "Generate TLDR" to focus on specific aspects (e.g., "focus only on technical tools mentioned" or "create a summary for beginners")
**Message Length**: Adjust maxCharsPerChunk (currently 4000) to create longer or shorter message splits based on preference

Advanced Enhancements:
**Database Storage**: Add a Postgres/Airtable node after "Merge: Data + Summary" to archive all summaries with timestamps and user IDs for a searchable knowledge base (medium complexity)
**Multi-Model Comparison**: Duplicate the "Generate TLDR" chain and connect GPT-4 or Claude, then merge results to show different AI perspectives on the same video (high complexity)
**Auto-Translation**: Insert a translation node after summary generation to deliver summaries in the user's preferred language automatically (medium complexity)

Troubleshooting

Common Issues:

| Problem | Cause | Solution |
|---------|-------|----------|
| "Not a YouTube URL" error | URL format not recognized | Ensure the URL sent contains youtube.com or youtu.be |
| No transcript available | Video lacks captions or wrong language | Check that the video has auto-generated or manual captions; change languageCode to match available options |
| Decodo API 401/403 error | Invalid or expired API key | Verify the API key in the HTTP Header Auth credential; regenerate it if needed from the Decodo dashboard |
| Error notifications not received | Wrong chat ID in Alert Admin node | Get your Telegram user ID from @userinfobot and update the node |

Use Case Examples

Scenario 1: Marketing Agency Competitive Analysis
Challenge: An agency needs to analyze 50+ competitor YouTube videos monthly to identify content strategies, tools used, and messaging angles — watching all the videos would require 80+ hours
Solution: Drop YouTube links into a shared Telegram group with the bot. Summaries are generated instantly, highlighting tools mentioned, key talking points, and target audience insights
Result: Research time reduced from 80 hours to 6 hours monthly (93% time savings), with a searchable archive of all competitor content strategies

Created by: Khaisa Studio
Category: AI-Powered Automation
Tags: YouTube, AI, Telegram, Summarization, Content Analysis, Decodo, Gemini

Need custom workflows? Contact us
Connect with the creator: Portfolio • Workflows • LinkedIn • Medium • Threads