by Avkash Kakdiya
## How it works

This workflow starts whenever a new lead is submitted through Typeform. It cleans and stores the raw lead data, checks whether the email is business-related (not a free provider such as Gmail), and then uses AI to enrich the lead with company details. After enrichment, the workflow scores the lead with AI, updates your HubSpot CRM, and saves everything neatly into Google Sheets for tracking and reporting.

## Step-by-step

1. **Capture New Lead**
   - Triggered by a new Typeform submission.
   - Collects basic details: Name, Email, Phone, and Message.
   - Saves raw lead data into a Google Sheet for backup.
   - Stores the basic info in Airtable (avoids duplicates by email).
2. **Format & Filter Leads**
   - Formats the incoming data into a clean structure.
   - Filters out non-business emails (e.g., Gmail) so only qualified leads continue.
3. **Enrich Company Information**
   - Uses AI (GPT-4o-mini) to enrich the lead's company data based on the email domain.
   - Returns details such as Company Name, Industry, Headquarters, Employee Count, Website, LinkedIn, and Description.
   - Merges this information with the original lead profile and adds metadata (timestamp, workflow ID).
4. **Score the Lead**
   - AI analyzes the enriched profile and assigns a lead score (1–10).
   - Scoring considers industry fit, company size, contact source, and domain reputation.
5. **Update CRM & Sheets**
   - Sends the enriched lead, with its score, to HubSpot CRM.
   - Updates company details, contact info, and custom properties (lead_score, LinkedIn, description).
   - Logs the fully enriched lead in a Google Sheet for tracking.

## Why use this?

- Automatically enriches and scores every incoming lead.
- Filters out low-value (non-business) emails before they waste CRM space.
- Keeps HubSpot CRM up to date with the latest company and contact info.
- Maintains both raw and enriched lead data in Google Sheets for easy reporting.
- Saves your team hours of manual research and ensures consistent, AI-driven lead qualification.
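The "Format & Filter Leads" step can be sketched in a few lines. This is a minimal illustration, not the workflow's actual node code: the free-domain list and function names are assumptions.

```javascript
// Hypothetical business-email filter: drop leads whose email uses a
// free consumer domain so only qualified leads continue downstream.
const FREE_DOMAINS = new Set(["gmail.com", "yahoo.com", "outlook.com", "hotmail.com"]);

function isBusinessEmail(email) {
  const domain = (email.split("@")[1] || "").toLowerCase();
  return domain.length > 0 && !FREE_DOMAINS.has(domain);
}

// Keep only leads that pass the filter
const leads = [
  { name: "Ana", email: "ana@acme.io" },
  { name: "Bob", email: "bob@gmail.com" },
];
const qualified = leads.filter((l) => isBusinessEmail(l.email));
```

In the real workflow this logic would live in an IF or Code node; the domain list can be extended to cover other free providers.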
by oka hironobu
# Research competitors and generate market insights with Claude AI and Notion

## Who is this for

SaaS product managers, startup founders, and marketing teams who need to stay informed about competitor movements without manual monitoring. Perfect for teams who want to automate competitive intelligence gathering and respond quickly to market changes.

## How it works

The workflow runs weekly, automatically scraping competitor websites and pricing pages using HTTP requests. A code node extracts key content and creates content hashes to detect changes from previous scans. When changes are detected, Claude AI analyzes the updates and provides strategic insights about pricing shifts, feature launches, or messaging changes. All competitor data and AI analysis are automatically saved to a Notion database for historical tracking. Important changes trigger immediate Slack notifications with actionable insights. At the end of each week, a comprehensive report summarizing all competitor activity is generated and emailed to your team.

## How to set up

1. Configure competitor URLs in the Set node by adding the websites, pricing pages, and feature pages you want to monitor.
2. Set up API credentials for Claude AI, Notion, Slack, and Gmail.
3. Create a Notion database with properties for competitor name, URL, content hash, AI analysis, and scan date.
4. Define environment variables for your Notion database ID, Slack channel, and team email list.

## Requirements

- Anthropic Claude API key for competitive analysis
- Notion workspace with API access for data storage
- Slack workspace for urgent alerts
- Gmail account for weekly reporting
- Basic HTML/CSS knowledge, helpful for customizing content extraction

## How to customize

Adjust the schedule trigger frequency, modify the urgent-notification keywords in the priority evaluation code, or customize the Claude AI analysis prompt to focus on specific competitive aspects such as pricing, features, or market positioning.
by Bilel Aroua
## 👥 Who is this for?

Creators, marketers, and brands that want to turn a single product photo into premium motion clips, then optionally publish to Instagram/TikTok/YouTube via LATE. No editing skills required.

## ❓ What problem does it solve?

Producing short vertical ads from a static packshot takes time (retouching, motion design, soundtrack, publishing). This workflow automates the entire process: image enhancement → cinematic motion → optional upscale → soundtrack → share.

## 🛠️ What this workflow does

- Collects a product photo via Telegram.
- Generates two refined edit prompts + two motion prompts using multi-agent Gemini orchestration.
- Creates two edited images with Fal.ai Gemini-Flash (image edit).
- Renders two 5s vertical videos with Kling (via the fal.run queue).
- Auto-stitches them (FFmpeg API) and optionally upscales with Topaz.
- Generates a clean ambient soundtrack with MMAudio.
- Sends previews + final links back on Telegram.
- Optionally publishes to Instagram, TikTok, YouTube Shorts, and more via LATE.

## ⚡ Setup

- **Telegram**: Bot token (Telegram node).
- **Fal.ai**: HTTP Header Auth (`Authorization: Bearer <FAL_API_KEY>`) for Gemini-Flash edit, Kling queue, FFmpeg compose, Topaz upscale, and MMAudio.
- **Google Gemini** (PaLM credential) for AI agents.
- **ImgBB**: API key for uploading original/edited images.
- **LATE**: create an account at getlate.dev and use your API key for publishing (optional).

## ▶️ How to use

1. Start the workflow and DM your bot a clear product photo (jpg/jpeg/webp).
2. Approve the two still concepts when prompted in Telegram.
3. The orchestrator generates cinematic motion prompts and queues Kling renders.
4. Receive two motion previews, then a stitched final (upscaled + soundtrack).
5. Choose to auto-publish to Instagram/TikTok/YouTube via LATE (optional).

## 🎨 How to customize

- **Art Direction** → tweak the "Art Director" system message (lighting, backgrounds, grading).
- **Motion Flavor** → adjust the "Motion Designer" vocabulary for different camera moves/dynamics.
- **Durations/Aspect** → the default is 9:16, 5s; you can change the Kling duration.
- **Soundtrack** → edit the MMAudio prompt to reflect your brand's sonic identity.
- **Publishing** → enable/disable LATE targets; customize captions/hashtags.

## ✅ Prerequisites

- A Telegram bot created via @BotFather.
- A Fal.ai account + API key.
- An ImgBB account + API key.
- (Optional) A LATE account with connected social profiles — sign up at getlate.dev.

💡 Detailed technical notes, architecture, and a step-by-step flow explanation are included as sticky notes inside the workflow.

## 🆘 Support

If you need help setting up or customizing this workflow:
📧 Email: bilsimaging@gmail.com
🌐 Website: bilsimaging.com

I can provide guidance, troubleshooting, or custom workflow adaptations.
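The HTTP Header Auth setup for the Fal.ai nodes can be illustrated with a small request builder. This is a sketch under assumptions — the queue URL, model slug, and payload shape are illustrative placeholders, not the workflow's exact configuration:

```javascript
// Every Fal.ai call carries an Authorization header with your API key,
// matching the HTTP Header Auth credential described in Setup.
function buildFalRequest(apiKey, model, payload) {
  return {
    method: "POST",
    url: `https://queue.fal.run/${model}`, // illustrative queue endpoint
    headers: {
      Authorization: `Bearer ${apiKey}`, // <FAL_API_KEY> placeholder
      "Content-Type": "application/json",
    },
    body: JSON.stringify(payload),
  };
}

const req = buildFalRequest("FAL_API_KEY", "fal-ai/kling-video", { prompt: "slow dolly-in" });
```

In n8n you would set this header once in an HTTP Header Auth credential and reuse it across the Gemini-Flash, Kling, FFmpeg, Topaz, and MMAudio nodes.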
by Will Carlson
## What it does

Collects cybersecurity news from trusted RSS feeds and uses OpenAI's Retrieval-Augmented Generation (RAG) capabilities with Pinecone to filter for content that is directly relevant to your organization's tech stack. "Relevant" means the AI looks for news items that mention your specific tools, vendors, frameworks, cloud platforms, programming languages, operating systems, or security solutions — as described in your .txt scope documents. By training on these documents, the system understands the environment you operate in and can prioritize news that could affect your security posture, compliance, or operational stability. Once filtered, summaries of the most important items are sent to your work email every day.

## How it works

- **Pulls in news from multiple cybersecurity-focused RSS feeds:** The workflow automatically collects articles from trusted, high-signal security news sources. These feeds cover threat intelligence, vulnerability disclosures, vendor advisories, and industry updates.
- **Filters articles for recency and direct connection to your documented tech stack:** Using the publish date, it removes stale or outdated content. Then, leveraging your .txt scope documents stored in Pinecone, it checks each article for references to your technologies, vendors, platforms, or security tools.
- **Uses OpenAI to generate and review concise summaries:** For each relevant article, OpenAI creates a short, clear summary of the key points. The AI also evaluates whether the article provides actionable or critical information before passing it through.
- **Trains on your scope using the Pinecone Vector Store (free) for context-aware filtering:** Your scope documents are embedded into a vector store so the AI can "remember" your environment. This context ensures the filtering process understands indirect or non-obvious connections to your tech stack.
- **Aggregates and sends only the most critical items to your work email:** The system compiles the highest-priority news items into one daily digest, so you can review key developments without wading through irrelevant stories.

## What you need to do

1. Set up your OpenAI and Pinecone credentials in the workflow.
2. Create and configure a Pinecone index (dimension 1536 recommended). Pinecone is free to set up, and a single free index is enough. Use a namespace like `scope`, and make sure the embedding model is the same for all of your Pinecone references.
3. Submit .txt scope documents listing your technologies, vendors, platforms, frameworks, and security products. The .txt files do not need to be structured. Add as much detail as possible.
4. Update the AI prompts to accurately describe your company's environment and priorities.
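The recency filter in the second step can be sketched as follows. A minimal illustration with assumed field names (`pubDate`, `title`); the actual node code and threshold may differ:

```javascript
// Keep only articles published within the last N days before they are
// sent on for Pinecone-based relevance checks.
function isRecent(pubDate, maxAgeDays, now = new Date()) {
  const ageMs = now - new Date(pubDate);
  return ageMs >= 0 && ageMs <= maxAgeDays * 24 * 60 * 60 * 1000;
}

const articles = [
  { title: "New CVE in OpenSSL", pubDate: "2024-06-10T08:00:00Z" },
  { title: "Old vendor advisory", pubDate: "2024-05-01T08:00:00Z" },
];
const now = new Date("2024-06-12T00:00:00Z");
const fresh = articles.filter((a) => isRecent(a.pubDate, 7, now));
```

Filtering on recency first is cheap and reduces how many articles need embedding lookups and OpenAI calls.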
by Typhoon Team
This n8n template demonstrates how to use Typhoon OCR + LLM to digitize business cards, enrich the extracted details, and save them directly into Google Sheets or any CRM. It works with both Thai and English business cards and even includes an optional step to draft greeting emails automatically.

Use cases: automatically capture leads at events, enrich contact details before saving them into your CRM, or simply keep a structured database of your professional network.

## Good to know

- Two versions of the workflow are provided:
  - 🟢 Without Search API → cost-free option using only Typhoon OCR + LLM
  - 🔵 With Search API → adds Google Search enrichment for richer profiles (may incur API costs via SerpAPI)
- The Send Email step is optional — include it if you want to follow up instantly, or disable it if not needed.
- Typhoon provides a free API that anyone can sign up for and use → opentyphoon.ai

## How it works

1. A form submission triggers the workflow with a business card image (JPG/PNG).
2. Typhoon OCR extracts text from the card (supports Thai & English).
3. Typhoon LLM parses the extracted text into structured JSON fields (e.g., name, job title, organization, email).
4. Depending on your chosen path:
   - Version 1: Typhoon LLM enriches the record with job type, level, and sector.
   - Version 2: The workflow calls the Search API (via SerpAPI) to add a profile/company summary.
5. The cleaned and enriched contact is saved to Google Sheets (can be swapped with your preferred CRM or database).
6. (Optional) Typhoon LLM drafts a short, friendly greeting email, which can be sent automatically via Gmail.

## How to use

The included form trigger is just one example. You can replace it with:
- A webhook for uploads
- A file drop in cloud storage
- A manual trigger for testing

You can easily change the destination from Google Sheets to HubSpot, Notion, Airtable, or Salesforce. The enrichment prompt is customizable — adjust it to classify contacts based on your organization's needs.

## Requirements

- Typhoon API key
- Google Sheets API credentials + a prepared spreadsheet
- (Optional) Gmail API credentials for sending emails
- (Optional) SerpAPI key for the Search API enrichment path

## Customising this workflow

This AI-powered business card reader can be adapted to many scenarios:
- Event lead capture: collect cards at conferences and sync them to your CRM automatically.
- Sales enablement: draft instant greeting emails for new contacts.
- Networking: keep a clean and enriched database of your professional connections.
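The step that parses the LLM output into structured JSON fields benefits from defensive extraction, since models sometimes wrap JSON in code fences or prose. A hedged sketch — the field names follow the example above, but the function and sample values are illustrative:

```javascript
// Pull the first {...} block out of the model's reply and parse it.
function parseContactJson(llmOutput) {
  const match = llmOutput.match(/\{[\s\S]*\}/);
  if (!match) throw new Error("No JSON object found in model output");
  return JSON.parse(match[0]);
}

// Example reply wrapped in a markdown code fence (hypothetical contact):
const raw = '```json\n{"name":"Somchai P.","job_title":"CTO","organization":"Acme Co","email":"somchai@acme.co.th"}\n```';
const contact = parseContactJson(raw);
```

The parsed object then maps directly onto Google Sheets columns or CRM fields.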
by Harry Siggins
This n8n template automatically processes your industry newsletters and creates AI-powered intelligence briefs that filter signal from noise. Perfect for busy professionals who need to stay informed without information overload, it delivers structured insights directly to Slack while optionally saving content questions to Notion.

## Who's it for

Busy executives, product managers, and content creators at growing companies who subscribe to multiple industry newsletters but lack the time to read them all. Ideal for teams that need to spot trends, generate content ideas, and share curated insights without drowning in information.

## How it works

The workflow runs daily to fetch labeled emails from Gmail, combines all newsletter content, and sends it to an AI agent for intelligent analysis. The AI filters developments through your specific business lens, identifies only operationally relevant insights, and generates thought-provoking questions for content creation. Results are formatted as rich Slack messages using Block Kit, with optional Notion integration for tracking content ideas.

## Requirements

- Gmail account with a newsletter labeling system
- OpenRouter API key for AI analysis (costs approximately $0.01–0.05 per run), or an API key for a specific LLM
- Slack workspace with bot permissions for message posting
- Notion account with database setup (optional, for content question tracking)
- Perplexity API key (optional, for additional AI research capabilities)

## How to set up

1. Connect your Gmail, OpenRouter, and Slack credentials through n8n's secure credential system.
2. Create a Gmail label for the newsletters you want analyzed and set it in the "Get Labeled Newsletters" node.
3. Update the Slack channel ID in the "Send to Slack" node.

The template comes pre-configured with sample settings for tech companies, so you can run it immediately after credential setup.

## How to customize the workflow

Edit the "Configuration" node to match your industry and audience: change the 13 pre-defined fields, including target audience, business context, relevance filters, and content pillars. Adjust the cron expression in the trigger node for your timezone. Modify the Slack formatting code to change the output's appearance, or add additional destination nodes for email, Teams, or Discord. Remove the Notion nodes if you only need Slack output. The AI analysis framework is fully customizable through the Configuration node, allowing you to adapt it from the default tech-company focus to any industry, including healthcare, finance, marketing, or consulting.
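The Block Kit formatting step mentioned above can be sketched like this. The block structure is a minimal illustration of Slack's header/section block shapes; the field names (`topic`, `summary`) are assumptions, not the template's actual schema:

```javascript
// Build a Slack Block Kit payload: one header block, then one section
// block per insight, using mrkdwn for bold topics.
function buildDigestBlocks(insights) {
  return [
    { type: "header", text: { type: "plain_text", text: "Daily Newsletter Brief" } },
    ...insights.map((i) => ({
      type: "section",
      text: { type: "mrkdwn", text: `*${i.topic}*\n${i.summary}` },
    })),
  ];
}

const blocks = buildDigestBlocks([
  { topic: "AI pricing shift", summary: "Two vendors moved to usage-based plans." },
]);
```

The resulting array goes into the `blocks` field of the Slack message node, which is also the place to customize appearance.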
by Rahul Joshi
## Description

Automates daily EOD summaries from Jira issues into an Excel sheet, then compiles a weekly summary using Azure OpenAI (GPT-4o-mini) and delivers it to stakeholders via email. Gain consistent reporting, clear insights, and hands-free delivery. ✨📧

## What This Template Does

- Fetches Jira issues and extracts key fields. 🧩
- Generates End-of-Day summaries and stores them in Excel daily. 📄
- Aggregates the week's EOD data from Excel. 📚
- Creates a weekly summary using Azure OpenAI (GPT-4o-mini). 🤖
- Delivers the weekly report to stakeholders via email. 📬

## Key Benefits

- Saves time with fully automated daily and weekly reporting. ⏱️
- Ensures consistent, structured summaries every time. 📏
- Improves clarity for stakeholders with readable insights. 🪄
- Produces mobile-friendly email summaries for quick consumption. 📱
- No-code customization inside n8n. 🛠

## Features

- Jira issue ingestion and transformation.
- Daily EOD summary generation and Excel storage.
- Weekly AI summarization with Azure OpenAI (GPT-4o-mini).
- Styled HTML email output to stakeholders.
- Scheduling for hands-free execution.

## Requirements

- An n8n instance (cloud or self-hosted).
- Jira access to read issues.
- Azure OpenAI (GPT-4o-mini) for weekly AI summarization.
- Email service (Gmail/SMTP) configured in n8n credentials.
- Excel/Sheet storage set up to append and read daily EOD entries.

## Target Audience

- Engineering and product teams needing routine summaries.
- Project managers tracking daily progress.
- Operations teams consolidating weekly reporting.
- Stakeholders who prefer clean email digests.

## Step-by-Step Setup Instructions

1. Jira: Connect your Jira credentials and confirm issue read access.
2. Azure OpenAI: Deploy GPT-4o-mini and add Azure OpenAI credentials in n8n.
3. Gmail/SMTP: Connect your email account in n8n Credentials and authorize sending.
4. Excel/Sheet: Configure the sheet used to store daily EOD summaries.
5. Import the workflow, assign credentials to nodes, replace placeholders, then run and schedule.

## Security Best Practices

- Use scoped API tokens for Jira with read-only permissions. 🔐
- Store Azure OpenAI and email credentials in n8n's encrypted credentials manager. 🧯
- Limit email recipients to approved stakeholder lists. 🚦
- Review logs regularly and rotate credentials on a schedule. ♻
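The weekly aggregation step (collecting the week's EOD rows before AI summarization) can be sketched like this. The `date`/`summary` column names are assumptions about the sheet layout:

```javascript
// Select the daily EOD rows that fall within the trailing week ending
// at `weekEnd`, so they can be concatenated for the weekly AI summary.
function entriesForWeek(rows, weekEnd, days = 7) {
  const end = new Date(weekEnd);
  const start = new Date(end.getTime() - days * 24 * 60 * 60 * 1000);
  return rows.filter((r) => {
    const d = new Date(r.date);
    return d > start && d <= end;
  });
}

const rows = [
  { date: "2024-06-03", summary: "Closed JIRA-101" },
  { date: "2024-06-07", summary: "Started JIRA-110" },
  { date: "2024-05-20", summary: "Old entry" },
];
const thisWeek = entriesForWeek(rows, "2024-06-07");
```

The filtered entries are then joined into one prompt for GPT-4o-mini rather than summarizing day by day.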
by Rahul Joshi
## Description

Automatically generate polished, n8n-ready template descriptions from your saved JSON workflows in Google Drive. This AI-powered automation processes workflow files, drafts compliant descriptions, and delivers Markdown and HTML outputs directly to your inbox. 🚀💌📊💬

## What This Template Does

1. Manually triggers the workflow to start processing.
2. Searches a specified Google Drive folder for JSON workflow files.
3. Iterates through each JSON file found in that folder.
4. Downloads each file and prepares it for data extraction.
5. Parses workflow data from the downloaded JSON content.
6. Uses Azure OpenAI GPT-4 to generate concise titles and detailed descriptions.
7. Converts the AI output into structured Markdown for n8n template publishing.
8. Creates an HTML version of the description for email delivery.
9. Logs the generated details into a Google Sheet for record-keeping.
10. Sends an email containing the Markdown and HTML descriptions to the target recipient.

## Key Benefits

✅ Fully automates n8n template description creation.
✅ Ensures consistency with official n8n publishing guidelines.
✅ Saves time while eliminating human writing errors.
✅ Provides dual Markdown + HTML outputs for flexibility.
✅ Centralizes workflow metadata in Google Sheets.
✅ Simplifies collaboration and version tracking via email delivery.

## Features

- Manual workflow trigger for controlled execution.
- Integration with Google Drive for locating and downloading JSON files.
- Intelligent parsing of workflow data from the JSON structure.
- GPT-4-powered AI for title and description generation.
- Automatic Markdown + HTML formatting for n8n publishing.
- Google Sheets integration for persistent record-keeping.
- Automated Gmail delivery of the generated documentation.

## Requirements

- n8n instance (cloud or self-hosted).
- Google Drive OAuth2 credentials with file read permissions.
- Google Sheets OAuth2 credentials with edit permissions.
- Azure OpenAI GPT-4 API key for AI text generation.
- Gmail OAuth2 credentials for email sending.

## Target Audience

- n8n content creators documenting workflows. 👩‍💼
- Automation teams handling multiple template deployments. 🔄
- Agencies and freelancers managing workflow documentation. 🏢
- Developers leveraging AI for faster template creation. 🌐
- Technical writers ensuring polished, standardized outputs. 📊

## Step-by-Step Setup Instructions

1. Connect your Google Drive account and specify the folder containing the JSON workflows. 🔑
2. Authorize Google Sheets and confirm access to the tracking spreadsheet. ⚙️
3. Add Azure OpenAI GPT-4 API credentials for AI-powered text generation. 🧠
4. Connect Gmail credentials for automated email delivery. 📧
5. Run the workflow manually with a test JSON file to validate all nodes. ✅
6. Enable the workflow to automatically generate and send descriptions as needed. 🚀
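The Markdown-to-HTML conversion step for email delivery can be illustrated with a toy converter. This is deliberately minimal and not the workflow's actual code; a real implementation would use n8n's Markdown node or a library such as `marked`:

```javascript
// Convert a small subset of Markdown (h2 headings, bold, list items)
// into HTML suitable for an email body.
function mdToHtml(md) {
  return md
    .replace(/^## (.+)$/gm, "<h2>$1</h2>")
    .replace(/\*\*(.+?)\*\*/g, "<strong>$1</strong>")
    .replace(/^- (.+)$/gm, "<li>$1</li>");
}

const html = mdToHtml("## How it works\n- **Trigger:** manual");
```

The point of keeping both outputs is that the Markdown version is what you paste into the n8n template listing, while the HTML version renders cleanly in Gmail.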
by Ruth Olatunji
Eliminate 90% of the manual work in procurement by automating quote requests, response tracking, price extraction, and supplier follow-ups. This complete automation handles everything from sending personalized emails to extracting pricing data with AI and sending WhatsApp reminders, so you can focus on decision-making, not data entry.

This all-in-one workflow transforms a 5-hour manual process into a 10-minute review task, saving 15–20 hours per month while improving supplier response rates by 30%.

## How it works

This workflow contains 4 independent automation modules running on separate schedules:

1. **Quote Request Sender** (manual trigger)
   - Reads the supplier list from Google Sheets
   - Sends personalized emails via Gmail with category and deadline
   - Logs all requests with timestamps to the tracking sheet
2. **Response Monitor** (hourly schedule)
   - Automatically checks Gmail for supplier replies with attachments
   - Updates the tracking sheet status to "Quote Received"
   - Zero manual email monitoring required
3. **AI Price Extraction** (manual trigger)
   - Downloads PDF/Excel attachments from emails
   - Extracts text using n8n's built-in parser
   - Sends it to OpenAI GPT-4o-mini to identify products, prices, quantities, and currencies
   - Saves structured data to the Price Comparison sheet
4. **WhatsApp Follow-ups** (daily at 9 AM)
   - Checks for non-responsive suppliers
   - Sends smart reminders at Day 3, 5, and 7 with escalating urgency
   - Falls back to email if no phone number is on file
   - Logs all follow-up history

Each module shares data through Google Sheets while running independently.

## Set up steps

Time to set up: 20–30 minutes

1. Create two Google Sheets: "Quote Tracking" (with columns: supplier_name, supplier_email, category, request_date, status, quote_received, phone_number, last_follow_up, follow_up_count) and "Price Comparison" (with columns: supplier_name, supplier_email, product_name, price, currency, quantity, extracted_date, source_file).
2. Connect credentials: Gmail OAuth, Google Sheets OAuth (same account), OpenAI API key, Twilio Account SID + Auth Token.
3. Update all Google Sheet IDs in every Google Sheets node (8 nodes total across all modules).
4. Configure the Twilio WhatsApp sandbox: go to Twilio Console → Messaging → WhatsApp → send the join code from your phone → update the "From" number in the Send WhatsApp node.
5. Add 2–3 test suppliers to the Tracking Sheet, using your own email addresses with the + trick (yourname+supplier1@gmail.com) and phone numbers in international format.
6. Test each module: execute the Quote Sender → reply to the test email with a PDF → execute AI Extraction → set a supplier date to 3 days ago → test Follow-ups.
7. Activate the schedules for the Response Monitor (hourly) and Follow-ups (daily at 9 AM).

Detailed node configurations and troubleshooting tips are included in sticky notes throughout the workflow canvas.

## Requirements

- Gmail account with API access
- Google Sheets (2 sheets)
- OpenAI API account (~$5–15/month)
- Twilio account with WhatsApp (~$10–20/month)
- n8n (any version supporting the HTTP Request node)

## Who is this for

- Procurement teams managing multiple supplier quotes
- Small businesses comparing vendor prices
- Operations managers handling RFQs
- Purchasing departments drowning in email attachments
- Anyone collecting and tracking supplier pricing at scale

## Results

- Time savings: from 5 hours to 10 minutes per quote cycle (90% reduction)
- Response rate improvement: 50% → 80% with automated follow-ups
- Accuracy: 95%+ AI extraction accuracy, versus the 5–10% error rate typical of manual data entry
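The Day 3/5/7 escalation logic in the Follow-ups module can be sketched as a pure function. The thresholds mirror the description above; the urgency labels and function name are illustrative:

```javascript
// Decide which reminder (if any) a supplier should get today, based on
// how many days have passed since the quote request was sent.
function followUpAction(requestDate, today, hasResponded) {
  if (hasResponded) return null;
  const days = Math.floor((new Date(today) - new Date(requestDate)) / 86400000);
  if (days >= 7) return { day: 7, urgency: "final" };
  if (days >= 5) return { day: 5, urgency: "high" };
  if (days >= 3) return { day: 3, urgency: "gentle" };
  return null; // too early to follow up
}
```

Running this daily at 9 AM against the Tracking Sheet yields the list of suppliers to ping via WhatsApp (or email fallback).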
by Avkash Kakdiya
## How it works

This workflow automatically generates an AI-powered revenue forecast whenever a new deal is created in HubSpot. It collects all active deals, standardizes key sales data, and sends it to an AI model for forecasting and risk analysis. The AI produces best-, likely-, and worst-case revenue scenarios along with actionable insights. Results are shared with stakeholders via Slack and email and stored in Google Sheets for tracking.

## Step-by-step

### Step 1: Collect & prepare HubSpot deals

- HubSpot Trigger – Starts the workflow when a new deal is created in HubSpot.
- Get many deals – Fetches all active deals from the sales pipeline.
- Format HubSpot Data – Cleans and standardizes deal fields such as amount, stage, probability, and region.
- Loop Over Items – Iterates through the formatted deals to prepare them for AI analysis.

### Step 2: Generate & distribute the AI forecast

- AI Revenue Forecast & Risk Analysis – Sends pipeline data to the AI model to generate revenue forecasts and insights.
- Groq Chat Model – Powers the AI analysis and produces structured forecasting output.
- Format AI response – Extracts key metrics, risks, and recommendations from the AI response.
- Send a message (Gmail) – Emails the revenue forecast report to stakeholders.
- Send a message (Slack) – Posts the forecast summary to a selected Slack channel.
- Append row in sheet – Logs forecast data and insights into Google Sheets.
- Wait – Adds a controlled pause before looping or completing the workflow.

## Why use this?

- Get real-time revenue forecasts triggered directly by CRM activity.
- Reduce manual pipeline analysis and reporting effort.
- Identify high-risk deals early with AI-driven insights.
- Keep leadership aligned through automated Slack and email updates.
- Maintain a historical forecast log for audits and performance tracking.
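The "Format HubSpot Data" step can be illustrated with a small normalizer. A sketch under assumptions — the input field names (`amount`, `probability`, `dealstage`) and the weighted-value output are illustrative, not the node's exact code:

```javascript
// Coerce string fields from the CRM payload to numbers, clamp the
// probability to [0, 1], and compute a probability-weighted deal value.
function formatDeal(deal) {
  const amount = parseFloat(deal.amount) || 0;
  const probability = Math.min(Math.max(parseFloat(deal.probability) || 0, 0), 1);
  return {
    stage: deal.dealstage || "unknown",
    amount,
    probability,
    weightedValue: Math.round(amount * probability * 100) / 100,
  };
}

const formatted = formatDeal({ amount: "12000", probability: "0.6", dealstage: "negotiation" });
```

Standardizing types up front keeps the AI prompt consistent and makes the Google Sheets log directly comparable across runs.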
by Rahul Joshi
## 📊 Description

Automatically analyze the sentiment of Facebook posts and their audience comments using GPT-4 to identify trends and potential PR risks. 🧠💬 This workflow fetches recent posts via the Facebook Graph API, performs AI-powered sentiment analysis on both posts and comments, routes negative results to Slack for immediate attention, logs all data into Google Sheets, and sends a beautifully formatted sentiment summary report via Outlook. 📈📧

## What This Template Does

1️⃣ **Trigger** – Runs daily at 10 AM to fetch the latest Facebook posts. ⏰
2️⃣ **Data Extraction** – Pulls post text, reactions, and up to 100 comments per post using the Facebook Graph API. 📲
3️⃣ **Formatting** – Structures and cleans the Facebook data for AI analysis. 🧩
4️⃣ **AI Sentiment Analysis** – GPT-4 evaluates post tone and audience sentiment with confidence scores and explanations. 🤖
5️⃣ **Routing** – Sends negative-sentiment alerts directly to Slack for quick response. ⚠️
6️⃣ **Logging** – Records all sentiment metrics in Google Sheets, including positivity ratio and engagement data. 📊
7️⃣ **Reporting** – Generates a color-coded HTML email report and delivers it via Outlook. 💌
8️⃣ **Error Handling** – Sends Slack alerts if any node in the workflow fails. 🛡️

## Key Benefits

✅ Monitors brand reputation automatically across Facebook comments
✅ Provides structured AI sentiment reports for data-driven decisions
✅ Flags negative audience sentiment for timely intervention
✅ Maintains an always-updated sentiment log in Google Sheets
✅ Delivers professional HTML summary emails to teams

## Features

- Automated daily trigger for post sentiment scanning
- Facebook Graph API integration for posts and comments
- GPT-4-powered comment tone and sentiment scoring
- Slack notifications for negative sentiment alerts
- Google Sheets sentiment dashboard logging
- HTML report delivery through Microsoft Outlook
- Built-in error detection with Slack fallback alerts

## Requirements

- Facebook Graph API credentials with pages_read_engagement access
- OpenAI API key for GPT-4 or GPT-4o
- Slack Bot token with chat:write permission
- Google Sheets OAuth2 credentials with edit rights
- Microsoft Outlook OAuth2 credentials for email delivery

## Target Audience

- Social media and marketing teams monitoring brand perception 📣
- PR teams managing engagement or audience sentiment 🧾
- Analysts building data dashboards from engagement metrics 📊
- Agencies offering automated reporting for client pages 🧑‍💼

## Step-by-Step Setup Instructions

1️⃣ Connect Facebook Graph API credentials and set your page ID.
2️⃣ Add OpenAI credentials for GPT-4 sentiment analysis.
3️⃣ Configure Slack and provide your channel ID for alerts.
4️⃣ Set Google Sheets credentials and specify the sheet ID and name.
5️⃣ Connect Microsoft Outlook for email report delivery.
6️⃣ Adjust the daily schedule (default 10 AM) as needed.
7️⃣ Run once manually to verify the setup, then enable automation. ✅
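The positivity ratio logged to Google Sheets can be sketched as follows. A minimal illustration; the sentiment label values (`positive`, `negative`, `neutral`) are assumptions about the GPT-4 output schema:

```javascript
// Share of comments labeled positive, rounded to two decimals.
function positivityRatio(comments) {
  if (comments.length === 0) return 0;
  const positive = comments.filter((c) => c.sentiment === "positive").length;
  return Math.round((positive / comments.length) * 100) / 100;
}

const ratio = positivityRatio([
  { sentiment: "positive" },
  { sentiment: "negative" },
  { sentiment: "positive" },
  { sentiment: "neutral" },
]);
```

A ratio below a chosen threshold (say 0.3) is a natural trigger for the negative-sentiment Slack route.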
by gclbck
# Analyze YouTube videos for virality with an AI-powered report

This workflow automates the discovery and analysis of potentially viral YouTube videos. It searches for recent, popular videos based on a keyword, calculates a unique "Algorithmic Lift Score" to measure virality, and uses an AI agent to generate an insightful summary report that is sent directly to your email.

## What it does

This workflow identifies videos that are outperforming their channel's baseline, a key indicator of viral potential. It operates in several stages:

1. **Searches YouTube:** Finds recent, top-performing videos based on your specified keyword and timeframe.
2. **Gathers Data:** For each video found, fetches detailed statistics for both the video (views, likes, comments) and its channel (subscriber count, total views).
3. **Calculates Virality Score:** Calculates an "Algorithmic Lift Score" for each video. This custom metric prioritizes videos that achieve high view counts and engagement relative to their channel's subscriber base.
4. **Analyzes with AI:** The top 5 videos, sorted by their virality score, are sent to an AI agent (pre-configured for OpenAI). The AI generates a concise summary highlighting trends, top performers, and other noteworthy patterns.
5. **Sends Email Report:** The final AI-generated analysis is converted to HTML and emailed to you, providing a ready-to-read report on what's trending in your niche.

## Who it's for

This workflow is perfect for:

- **Content Creators** looking for trending topics and content ideas.
- **Digital Marketers** conducting competitor analysis or market research.
- **Social Media Managers** wanting to understand what content resonates on YouTube.
- **Data Analysts** who need to automate the collection and analysis of YouTube trends.

## Requirements

- A Google API key with the "YouTube Data API v3" enabled.
- An OpenAI API key (or another compatible AI model credential).
- A connected Gmail account in n8n to send the final report.

## How to set up

1. **Configure the Setup node:** Click the "Setup" node and fill in the values:
   - query: the keyword you want to search for (e.g., "AI tools").
   - GoogleAPIkey: your Google API key.
   - daysback: how many days in the past to search for new videos.
   - maxResult: the number of videos to analyze (e.g., 20).
   - email: the email address where the report will be sent.
2. **Set AI credentials:** Click the "OpenAI Chat Model" node and add your OpenAI API key to the credentials.
3. **Set Gmail credentials:** Click the "Send_Report" node and connect your Gmail account to the credentials.
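The exact "Algorithmic Lift Score" formula lives inside the workflow's code node; the sketch below is a hedged reconstruction of the idea described above (reward views and engagement relative to the channel's subscriber base), with illustrative weights:

```javascript
// Views-per-subscriber ("lift") boosted by the engagement rate
// (likes + comments per view). Guards avoid divide-by-zero on tiny channels.
function algorithmicLiftScore(video, channel) {
  const subs = Math.max(channel.subscriberCount, 1);
  const viewLift = video.viewCount / subs;
  const engagement = (video.likeCount + video.commentCount) / Math.max(video.viewCount, 1);
  return Math.round(viewLift * (1 + engagement) * 100) / 100;
}

// A video with 5x its channel's subscriber count in views and 10% engagement:
const score = algorithmicLiftScore(
  { viewCount: 50000, likeCount: 4000, commentCount: 1000 },
  { subscriberCount: 10000 }
);
```

Ranking by a ratio like this surfaces small channels that are outperforming their baseline, which raw view counts would hide.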