by Amina Doszhan
Hi, I'm Amina. I built this workflow to remove the daily pain of Meta Ads reporting. If you manage multiple ad accounts, you know how time-consuming it is to open Ads Manager, export campaign data, clean spreadsheets, and send updates to clients. This automation pulls campaign performance directly from the Meta Ads API, writes structured data into Google Sheets (per client), generates a performance summary, and sends a Telegram alert when the report is updated. It turns manual reporting into a fully automated monitoring system.

**What it does**
- Reads a list of clients/ad accounts from a Google Sheets "client register"
- Fetches campaign-level performance data from the Meta Ads API
- Splits the campaign results into clean, row-ready records
- Appends each campaign as a new row in the client's Google Sheets report
- Calculates aggregated performance metrics
- Applies campaign diagnostics logic
- Sends a Telegram notification with summary + alerts
- Runs automatically on a schedule (daily/weekly)

**How it works**
- **Schedule Trigger** starts the workflow on a defined schedule.
- **Google Sheets (Get rows)** loads your client register (one row per client). Use this template structure for the client register: 👉 Client Register Template. The register should include: `ad_account_id`, `access_token`, `report_sheet_url`.
- **Loop Over Items** processes each client individually.
- **Code (ctx)** prepares the current client context (account ID, token, report sheet URL).
- **HTTP Request** calls the Meta Ads Insights endpoint and retrieves campaign-level metrics.
- **IF** checks the response (skips if there is no data).
- **Merge** combines the client context with the API response.
- **Code (Split campaigns)** converts the campaigns array into individual items (one per campaign) and formats metrics for reporting.
- **Code (Extract spreadsheetId)** extracts the spreadsheet ID from the report URL.
- **Google Sheets (Append row)** writes each campaign row into the client's report sheet.
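For illustration, the spreadsheet-ID extraction step can be sketched as a plain helper. This is a minimal sketch, not the template's exact Code node; it assumes the standard Google Sheets edit-URL format.

```javascript
// Pull the spreadsheet ID out of a Google Sheets report URL, e.g.
// https://docs.google.com/spreadsheets/d/<spreadsheetId>/edit#gid=0
function extractSpreadsheetId(url) {
  const match = (url || '').match(/\/spreadsheets\/d\/([a-zA-Z0-9_-]+)/);
  return match ? match[1] : null; // null if the URL is malformed
}
```

In the workflow, each client's `report_sheet_url` would pass through a helper like this before the Append row node.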
Example report structure: 👉 Campaign Report Template

- **Code (Summary & Status Logic)** aggregates totals and applies campaign diagnostics.
- **Telegram** sends a structured performance summary including:
  - Total metrics
  - Campaign-level highlights
  - Status alerts
  - Direct link to the report

**Data captured (campaign level)**
- Ad account ID
- Report date (`date_start`)
- Campaign name
- Spend
- Impressions
- Clicks
- CTR
- CPM
- CPC
- Date start / date stop

**Summary Generation Logic**
After writing campaign rows to Google Sheets, the workflow generates a performance overview using a JavaScript node.

Aggregated calculations:
- **Total Spend** = sum of all campaign spend values
- **Total Impressions** = sum of impressions
- **Total Clicks** = sum of clicks
- **CTR** = (Total Clicks / Total Impressions) × 100
- **CPC** = Total Spend / Total Clicks

The workflow then:
- Formats the totals into a readable performance summary
- Lists campaign highlights
- Applies status diagnostics
- Appends the Google Sheets report link
- Sends everything via Telegram

This provides both structured spreadsheet reporting and a quick executive snapshot.

**Status Logic (Campaign Diagnostics)**
Each campaign is automatically evaluated against CTR and CPC thresholds.

Status rules:
- **Weak Creative (❌):** CTR < 1.5%. Indicates low engagement; the ad creative may need improvement.
- **Expensive Click (⚠):** CPC > 0.5. Indicates a high cost per click; audience targeting or bidding strategy may need optimization.
- **Good Candidate to Scale (🔥):** CTR ≥ 2% AND CPC ≤ 0.5. Strong performance; the campaign may be suitable for scaling.
- **OK (✅):** None of the above conditions are met; campaign performance is within an acceptable range.

The status appears directly in the Telegram notification, letting marketers quickly identify which campaigns need attention without logging into Ads Manager.

**How to set up**
Estimated setup time: 10–20 minutes.

- Create a Google Sheets client register with: `ad_account_id`, `access_token`, `report_sheet_url`
- Connect Google Sheets credentials in n8n.
- Add your Meta Ads API access token (do not hardcode API keys in nodes).
- Connect your Telegram bot and set the destination chat ID.
- Adjust the schedule (daily/weekly) and run a test execution.

**Requirements**
- Meta Ads API access token (Facebook Graph API)
- Google Sheets credentials
- Telegram bot token + chat ID

**How to customize**
- Add additional metrics (e.g., conversions, purchases, ROAS) by extending the Meta API fields.
- Modify the CTR/CPC thresholds in the Status Logic section.
- Change the report structure in the "Split campaigns" step.
- Switch notifications from Telegram to Slack or email.
- Add filters (e.g., only active campaigns, only spend > X).

**Benefits**
- Fully automated reporting
- Multi-client support
- Dynamic spreadsheet handling
- Built-in KPI calculations
- Automated campaign diagnostics
- Instant performance alerts
- Scalable, agency-ready structure
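The summary aggregation and status rules described above can be sketched in JavaScript. Field names (`spend`, `impressions`, `clicks`) are illustrative; the thresholds follow the Status Logic rules listed earlier, but the template's actual Code node may differ.

```javascript
// Aggregate campaign rows into the totals used for the Telegram summary.
function summarize(campaigns) {
  const totalSpend = campaigns.reduce((s, c) => s + c.spend, 0);
  const totalImpressions = campaigns.reduce((s, c) => s + c.impressions, 0);
  const totalClicks = campaigns.reduce((s, c) => s + c.clicks, 0);
  const ctr = totalImpressions ? (totalClicks / totalImpressions) * 100 : 0;
  const cpc = totalClicks ? totalSpend / totalClicks : 0;
  return { totalSpend, totalImpressions, totalClicks, ctr, cpc };
}

// Apply the status rules in the order they are documented.
function campaignStatus(ctr, cpc) {
  if (ctr < 1.5) return '❌ Weak Creative';
  if (cpc > 0.5) return '⚠ Expensive Click';
  if (ctr >= 2 && cpc <= 0.5) return '🔥 Good Candidate to Scale';
  return '✅ OK';
}
```

Checking the rules in the documented order means a campaign with both low CTR and high CPC is flagged as Weak Creative first.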
by oka hironobu
**Who is this for**
Template creators who want to validate their n8n workflows against the official Creator Hub approval criteria before submitting. Useful for both new and verified creators looking to reduce rejection rates.

**What this workflow does**
This workflow scrapes the latest approval guidelines from four n8n Creator Hub Notion pages, generates a structured pass/fail checklist using Gemini AI, then reviews your uploaded workflow JSON against every criterion. The results are delivered as a formatted HTML email report with a score and specific fixes.

**Setup**
- Add a Google Gemini (PaLM) API credential for criteria generation, file upload, and review.
- Add a Gmail OAuth2 credential for sending the results email.
- Activate the workflow and open the generated form URL.

**Requirements**
- Google Gemini API key (used for three separate calls: criteria generation, file uploads, and final review)
- Gmail account with OAuth2 access enabled

**How to customize**
- Change the Gemini model in the "Google Gemini for criteria generation" and "Review workflow against criteria" nodes.
- Edit the review prompt in the "Review workflow against criteria" node to adjust scoring weight or add custom checks.
- Replace the Gmail node with another email service or a Discord/Slack webhook for different delivery methods.

**Important disclaimer**
This workflow provides AI-generated feedback based on n8n Creator Hub guidelines available as of February 2026. The review results are not a guarantee of approval or rejection — actual decisions are made by the n8n review team. Guidelines and criteria may change over time. Always check the latest official information on the n8n Creator Hub before submitting your template.
by Madame AI
Extract employee emails from company domains with human-in-the-loop scraping via BrowserAct

This workflow automates the process of enriching company data by scraping employee emails and positions from company websites. It processes a list of URLs from a Google Sheet, handles anti-bot measures (CAPTCHAs) by alerting you via Telegram to solve them manually before resuming, and saves the verified data back to your sheet.

**Target Audience**
Sales development representatives (SDRs), lead generation agencies, and recruiters needing verified contact info.

**How it works**
- **Read List:** The workflow reads a list of company URLs from a Google Sheet.
- **Scrape Data:** It loops through each URL and triggers BrowserAct to scrape the site for team pages or contact info.
- **Check Status:**
  - **Finished:** If scraping is successful, it parses the data (name, email, position).
  - **Paused:** If a CAPTCHA is detected, the workflow sends a Telegram alert. It pauses execution until you verify the CAPTCHA in the browser, then resumes automatically.
  - **Failed:** If scraping fails, it logs an error.
- **Save Data:** Extracted emails and names are appended to a new tab in the Google Sheet specific to that company.
- **Notify:** Once a company is processed, a Slack notification is sent.

**How to set up**
- **Configure Credentials:** Connect your Google Sheets, BrowserAct, Telegram, and Slack accounts in n8n.
- **Prepare BrowserAct:** Ensure the Company Domain to Email Enrichment template is saved in your BrowserAct account.
- **Set up the Google Sheet:** Create a Google Sheet with a column named `Company url` and populate it with target domains.
- **Configure Notifications:** Open the Send a message node (Slack) and select your target channel. Open the Alert User and Remind user nodes (Telegram) and ensure your Chat ID is correct.
- **Activate:** Run the workflow manually to start processing the list.

**Requirements**
- **BrowserAct** account with the **Company Domain to Email Enrichment** template
- **Google Sheets** account
- **Telegram** account (bot token)
- **Slack** account
**How to customize the workflow**
- **Change Data Source:** Replace the Google Sheet input with an Airtable or HubSpot node to pull domains from your CRM.
- **Enrich Data:** Add a Clearbit or Hunter.io node after scraping to cross-reference found emails.
- **Modify Alerts:** Change the Telegram alerts to email or SMS (via Twilio) if preferred.

**Need Help?**
- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- Workflow Guidance and Showcase Video

Scale Your Lead Gen! 🚀 Automate Email Extraction with n8n & BrowserAct
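The three-way status check above can be sketched as routing logic. This is a hypothetical sketch: the status values and result fields shown here are assumptions for illustration, and the actual BrowserAct task response may use different names.

```javascript
// Route a (hypothetical) BrowserAct task result to the save / alert / error
// branches described in "Check Status".
function routeTaskResult(result) {
  switch (result.status) {
    case 'finished':
      // Shape each contact into a row ready for the Google Sheets append.
      return {
        branch: 'save',
        rows: (result.data || []).map((c) => ({
          Name: c.name,
          Email: c.email,
          Position: c.position,
        })),
      };
    case 'paused':
      // CAPTCHA detected: trigger the Telegram alert and wait for the user.
      return { branch: 'alert', message: 'CAPTCHA detected, please verify in the browser.' };
    default:
      return { branch: 'error', message: `Scrape failed: ${result.error || 'unknown'}` };
  }
}
```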
by David Olusola
GPT-4o Resume Screener with Error Handling - Google Sheets & Drive Pipeline

**How it works**
Enterprise-grade resume screening automation built for production environments. This workflow combines intelligent AI analysis with comprehensive error handling to ensure reliable processing of candidate applications. Every potential failure point is monitored with automatic recovery and notification systems.

Core workflow steps:
- **Intelligent Email Processing** - Monitors Gmail with attachment validation and file type detection
- **Robust File Handling** - Multi-format support with upload verification and extraction validation
- **Quality-Controlled AI Analysis** - GPT-4o evaluation with output validation and fallback mechanisms
- **Verified Data Extraction** - Contact and qualification extraction with data integrity checks
- **Dual Logging System** - Success tracking in the main dashboard, error logging in a separate audit trail

Error recovery features:
- Upload failure detection with retry mechanisms
- Text extraction validation with quality thresholds
- AI processing timeout protection and fallback responses
- Data validation before final logging
- Comprehensive error notification and tracking system

**Set up steps**
Total setup time: 25-35 minutes

Core Credentials Setup (8 minutes)
- Gmail OAuth2 with attachment permissions
- Google Drive API with folder creation rights
- Google Sheets API with read/write access
- OpenAI API key with GPT-4o model access

Primary Configuration (12 minutes)
- Configure monitoring systems - Set up the Gmail trigger with error detection
- Establish the file processing pipeline - Create Drive folders for resumes and configure upload validation
- Deploy the dual spreadsheet system - Set up the main tracking sheet and the error logging sheet
- Initialize AI processing - Configure GPT-4o with structured output parsing and timeout settings
- Customize job requirements - Update role specifications and scoring criteria

Error Handling Setup (10 minutes)
- Configure error notifications - Set an administrator email for failure alerts
- Set up the error logging spreadsheet - Create an audit trail for failed processing attempts
- Customize timeout settings - Adjust processing limits based on expected file sizes
- Test error pathways - Validate the notification system with sample failures

Advanced Customization (5 minutes)
- Modify validation thresholds for resume quality
- Adjust the AI prompt for industry-specific requirements
- Configure custom error messages and escalation rules
- Set up automated retry logic for transient failures

Production-ready features:
- Comprehensive logging for compliance and auditing
- Graceful degradation when services are temporarily unavailable
- Detailed error context for troubleshooting
- Scalable architecture for high-volume processing

**Template Features**

Enterprise Error Management
- Multi-layer validation at every processing stage
- Automatic error categorization and routing
- Administrative alerts with detailed context
- Separate error logging for audit compliance
- Timeout protection preventing workflow hangs

Advanced File Processing
- Upload success verification before processing
- Text extraction quality validation
- Resume content quality thresholds
- Corrupted file detection and handling
- Format conversion error recovery

Robust AI Integration
- GPT-4o processing with output validation
- Structured response parsing with error checking
- AI timeout protection and fallback responses
- Failed analysis logging with manual review triggers
- Retry logic for transient API failures

Production Monitoring
- Real-time error notifications via email
- Comprehensive error logging dashboard
- Processing success/failure metrics
- Failed resume tracking for manual review
- Audit trail for compliance requirements

Data Integrity Controls
- Pre-logging validation of all extracted data
- Missing information detection and flagging
- Contact information verification checks
- Score validation and boundary enforcement
- Duplicate detection and handling

Designed for HR departments and recruiting agencies that need reliable, scalable resume processing with enterprise-level monitoring and error recovery capabilities.
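The "pre-logging validation" control might look like the sketch below. The record shape and thresholds are illustrative assumptions, not the template's exact implementation.

```javascript
// Validate an extracted candidate record before it is written to the main
// tracking sheet; failures would route to the error-logging sheet instead.
function validateCandidate(record) {
  const issues = [];
  if (!record.name) issues.push('missing name');
  // Basic contact-information verification check.
  if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(record.email || '')) issues.push('invalid email');
  // Score boundary enforcement (assumes a 0-100 scoring scale).
  if (typeof record.score !== 'number' || record.score < 0 || record.score > 100) {
    issues.push('score out of bounds');
  }
  return { valid: issues.length === 0, issues };
}
```

A record that fails any check carries its `issues` list into the audit trail, giving the administrator the detailed error context mentioned above.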
by Cheng Siong Chin
**How It Works**
This workflow automates financial oversight for accounting teams, tax professionals, and financial controllers managing monthly transaction volumes. It solves the challenge of identifying and correcting revenue discrepancies, tax calculation errors, and unusual patterns that manual review often misses.

The system collects monthly financial transactions via a scheduled trigger, then fetches complete transaction data through API integration. An AI anomaly detection agent analyzes patterns using multiple specialized tools: an OpenAI model identifies statistical outliers and unusual behaviors, a calculator validates the mathematical accuracy of revenue entries, and a historical pattern analyzer compares against baseline trends. Detected anomalies undergo verification by a secondary AI agent to eliminate false positives. Confirmed issues route to automated revenue adjustments and tax agent notifications, while alert emails provide detailed anomaly reports with recommended actions, ensuring financial accuracy and compliance.

**Setup Steps**
- Configure OpenAI API credentials in "Anomaly Detection Agent"
- Set up the financial data source API connection in the "Fetch Financial Transactions" node with authentication
- Define anomaly detection thresholds and rules in the AI agent tool configurations
- Configure tax system integration credentials in "Update Revenue Entries"
- Set up the email notification service with recipient lists in the "Send Anomaly Alert" node

**Prerequisites**
OpenAI API access, financial system API credentials with read/write permissions

**Use Cases**
Monthly financial close automation, revenue recognition validation

**Customization**
Modify anomaly detection algorithms for industry-specific patterns

**Benefits**
Reduces financial close time by 60%, catches revenue errors before reporting
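For a concrete sense of the "statistical outliers" check, here is a minimal z-score sketch against a historical baseline. The 3-standard-deviation default is an illustrative assumption; the workflow's actual thresholds live in the AI agent tool configurations.

```javascript
// Flag transaction amounts that deviate from the baseline by more than
// `threshold` standard deviations.
function findOutliers(amounts, threshold = 3) {
  const mean = amounts.reduce((s, x) => s + x, 0) / amounts.length;
  const variance = amounts.reduce((s, x) => s + (x - mean) ** 2, 0) / amounts.length;
  const std = Math.sqrt(variance);
  return amounts.filter((x) => std > 0 && Math.abs(x - mean) / std > threshold);
}
```

Flagged amounts would then go to the secondary verification agent to weed out false positives before any revenue adjustment.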
by Cheng Siong Chin
**How It Works**
This workflow automates performance monitoring by aggregating data from PM tools, code repositories, meeting logs, and CRM systems. It processes team metrics using AI-powered analysis via OpenAI, identifies bottlenecks and workload issues, then creates manager follow-ups and tasks. The system runs weekly, collecting the four data sources, combining them, analyzing trends, evaluating team capacity, and routing alerts to managers via Gmail. Managers receive structured summaries highlighting performance gaps and required actions.

Target audience: engineering managers and team leads monitoring team velocity, code quality, and capacity planning.

**Setup Steps**
- Configure credentials: PM tool API key, code repo token, and CRM API key.
- Set the OpenAI API key.
- Connect your Gmail account via OAuth.
- In the Workflow Configuration node, adjust API endpoints and polling intervals.
- Map data field names to match your tools.
- Test the data fetch nodes with sample queries before deployment.

**Prerequisites**
PM tool API access, GitHub/GitLab token, CRM credentials, OpenAI API key, Gmail OAuth connection

**Use Cases**
Track engineering team productivity weekly; identify code review bottlenecks

**Customization**
Replace the PM tool with Jira/Linear; swap OpenAI for Claude/Gemini

**Benefits**
Reduces manual performance tracking by 6+ hours weekly; provides real-time visibility into team capacity
by Daniel Shashko
**How it Works**
This workflow automatically monitors competitor affiliate programs twice daily, using Bright Data's web scraping API to extract commission rates, cookie durations, average order values, and payout terms from competitor websites. The AI analysis engine scores each competitor (0-100 points) by comparing their commission rates, cookie windows, earnings per click (EPC), and affiliate-friendliness against your program, then categorizes them as Critical (70+), High (45-69), Medium (25-44), or Low (0-24) threat levels.

Critical and high-threat competitors trigger immediate Slack alerts with detailed head-to-head comparisons and strategic recommendations, while lower threats route to monitoring channels. All competitors are logged to Google Sheets for tracking and historical analysis. The system generates personalized email reports: urgent action plans with 24-48 hour deadlines for critical threats, or standard intelligence updates for routine monitoring. The entire process takes minutes from scraping to strategic alert, eliminating manual competitive research and ensuring you never lose affiliates to better-positioned competitor programs.

**Who is this for?**
- Affiliate program managers monitoring competitor programs who need automated intelligence
- E-commerce brands in competitive verticals who can't afford to lose top affiliates
- Affiliate networks managing multiple merchants needing competitive benchmarking
- Performance marketing teams responding to commission rate wars in their industry

**Setup Steps**
Setup time: approx. 20-30 minutes (Bright Data setup, API configuration, spreadsheet creation)

Requirements:
- Bright Data account with web scraping API access
- Google account with a competitor tracking spreadsheet
- Slack workspace
- SMTP email provider (Gmail, SendGrid, etc.)

- Sign up for Bright Data and get your API credentials and dataset ID.
- Create a Google Sheet with two tabs, "Competitor Analysis" and "Historical Log", with appropriate column headers.
Set up these nodes:
- **Schedule Competitor Check:** Pre-configured for twice daily (adjust timing if needed).
- **Scrape Competitor Sites:** Add Bright Data credentials, the dataset ID, and competitor URLs.
- **AI Offer Analysis:** Review the scoring thresholds (commission, cookies, AOV, EPC).
- **Route by Threat Level:** Automatically splits by the 70-point critical and 45-point high thresholds.
- **Google Sheets Nodes:** Connect the spreadsheet and map data fields.
- **Slack Alerts:** Configure channels for critical alerts and routine monitoring.
- **Email Reports:** Set up SMTP and recipient addresses.

Credentials must be entered into their respective nodes for successful execution.

**Customization Guidance**
- **Scoring Weights:** Adjust point values for commission (35), cookies (25), cost efficiency (25), volume (15) based on your priorities.
- **Threat Thresholds:** Modify the 70-point critical and 45-point high thresholds for your risk tolerance.
- **Benchmark Values:** Update the commission gap thresholds (5%+ = critical, 2%+ = warning) and cookie duration benchmarks (30+ days = critical).
- **Competitor URLs:** Add or remove competitor websites to monitor in the HTTP Request node.
- **Alert Routing:** Create tier-based channels or route to Microsoft Teams, Discord, or SMS via Twilio.
- **Scraping Frequency:** Change from twice daily to hourly for competitive markets, or weekly for stable industries.
- **Additional Networks:** Duplicate the workflow for different affiliate networks (CJ, ShareASale, Impact, Rakuten).

Once configured, this workflow will continuously monitor competitive threats and alert you before top affiliates switch to better-paying programs, protecting your affiliate revenue from competitive pressure.

Built by Daniel Shashko
Connect on LinkedIn
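The scoring weights and threat tiers described above can be sketched as follows. This is a simplified illustration: it assumes each sub-score is normalized to 0-1 before weighting, which the actual AI Offer Analysis node may handle differently.

```javascript
// Composite 0-100 competitor score using the documented weights:
// commission 35, cookies 25, cost efficiency 25, volume 15.
function threatScore(s) {
  return 35 * s.commission + 25 * s.cookies + 25 * s.costEfficiency + 15 * s.volume;
}

// Map the score onto the documented threat tiers.
function threatLevel(score) {
  if (score >= 70) return 'Critical';
  if (score >= 45) return 'High';
  if (score >= 25) return 'Medium';
  return 'Low';
}
```

Critical and High results route to the immediate Slack alert path; Medium and Low go to the monitoring channel.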
by Davide
This workflow automatically processes the daily AI newsletter from AlphaSignal: it extracts the individual articles, summarizes and categorizes them, and stores the results in a structured Google Sheet for daily tracking and insights.

**Key Features**
1. ✅ **Fully Automated Daily News Pipeline** - No manual work is required; the workflow runs autonomously every time a new email arrives, eliminating repetitive tasks such as opening, reading, and summarizing newsletters.
2. ✅ **Cross-AI Model Integration** - It combines multiple AI systems: **Google Gemini** and **OpenAI GPT-5 Mini** for natural language processing and categorization, and **Scrapegraph AI** for external web scraping and summarization. This multi-model approach enhances accuracy and flexibility.
3. ✅ **Accurate Content Structuring** - The workflow transforms unstructured email text into clean, structured JSON data, ensuring reliability and easy export or reuse.
4. ✅ **Multi-Language Support** - The summaries are generated in Italian, which is ideal for local or internal reporting, while the metadata and logic remain in English, enabling global adaptability.
5. ✅ **Scalable and Extensible** - New newsletters, categories, or destinations (like Notion, Slack, or a database) can be added easily without changing the core logic.
6. ✅ **Centralized Knowledge Repository** - By appending to Google Sheets, the team can track daily AI developments at a glance, filter or visualize trends across categories, and use the dataset for further analysis or content creation.
7. ✅ **Error-Resilient and Maintainable** - The JSON validation and loop-based design ensure that if a single article fails, the rest continue to process smoothly.

**How it Works**
- **Email Trigger & Processing:** The workflow is automatically triggered when a new email arrives from news@alphasignal.ai.
It retrieves the full email content and converts its HTML body into clean Markdown for easier parsing.
- **Article Extraction & Scraping:** A LangChain agent, powered by Google Gemini, analyzes the newsletter's Markdown text. Its task is to identify and split the content into individual articles. For each article it finds, it outputs a JSON object containing the title, URL, and an initial summary. Crucially, the agent uses the "Scrape" tool to visit each article's URL and generate a more accurate summary in Italian based on the full page content.
- **Data Preparation & Categorization:** The JSON output from the previous step is validated and split into individual data items (one per article). Each article is then processed in a loop:
  - **Categorization:** An OpenAI model analyzes the article's title and summary, assigning it to the most relevant pre-defined category (e.g., "LLM & Foundation Models", "AI Automation & WF").
  - **URL Shortening:** The article's link is sent to the CleanURI API to generate a shortened URL.
- **Data Storage:** Finally, for each article, a new row is appended to a specified Google Sheet. The row includes the current date, the article's title, the shortened link, the Italian summary, and its assigned category.

**Set up Steps**
To implement this workflow, configure the following credentials and nodes in n8n:
- **Email Credentials:** Set up a Gmail OAuth2 credential (named "Gmail account" in the workflow) to allow n8n to access and read emails from the specified inbox.
- **AI Model APIs:**
  - **Google Gemini:** Configure the "Google Gemini(PaLM)" credential with a valid API key to power the initial article extraction and scraping agent.
  - **OpenAI:** Configure the "OpenAi account (Eure)" credential with a valid API key to power the article categorization step.
- **Scraping Tool:** Set up the ScrapegraphAI account credential with its required API key to enable the agent to access and scrape content from the article URLs.
- **Google Sheets Destination:** Configure the "Google Sheets account" credential via OAuth2. You must also specify the exact Google Sheet ID and sheet name (tab) where the processed article data will be stored.
- **Activation:** Once all credentials are tested and correctly configured, the workflow can be activated. It will then run automatically upon receiving a new newsletter from the specified sender.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
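The validate-and-split step in the data preparation stage can be sketched as a plain function. This is a minimal sketch, not the workflow's exact code: the field names (`title`, `url`, `summary`) are the ones the description mentions, and skipping malformed entries mirrors the error-resilient, loop-based design.

```javascript
// Parse the agent's JSON output and emit one entry per article, dropping
// malformed entries so a single bad article doesn't stop the loop.
function splitArticles(rawOutput) {
  let parsed;
  try {
    parsed = JSON.parse(rawOutput);
  } catch {
    return []; // invalid JSON: let the rest of the run continue with nothing
  }
  const articles = Array.isArray(parsed) ? parsed : parsed.articles || [];
  return articles.filter((a) => a && a.title && a.url);
}
```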
by Cheng Siong Chin
**How It Works**
This workflow automates end-to-end research analysis by coordinating multiple AI models, including NVIDIA NIM (Llama), OpenAI GPT-4, and Claude, to analyze uploaded documents, extract insights, and generate polished reports delivered via email. Built for researchers, academics, and business analysts, it enables fast, accurate synthesis of information from multiple sources. The workflow eliminates the manual burden of document review, cross-referencing, and report compilation by running parallel AI analyses, aggregating and validating model outputs, and producing structured, publication-ready documents in minutes instead of hours.

Data flows from Google Sheets (user input) through document extraction, parallel AI processing, response aggregation, quality validation, structured storage in Google Sheets, automated report formatting, and final delivery via Gmail with attachments.

**Setup Steps**
- Configure API credentials
- Add an OpenAI API key with GPT-4 access enabled
- Connect Anthropic Claude API credentials
- Set up the Google Sheets integration with read/write permissions
- Configure Gmail credentials with OAuth2 authentication for automated email
- Customize email templates and report formatting preferences

**Prerequisites**
NVIDIA NIM API access, OpenAI API key (GPT-4 enabled), Anthropic Claude API key

**Use Cases**
Academic literature reviews, competitive intelligence reports

**Customization**
Adjust AI model parameters (temperature, tokens) per analysis depth needs

**Benefits**
Reduces research analysis time by 80%, eliminates single-source bias through multi-model consensus
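One simple way to build the multi-model consensus mentioned above is to keep only findings that more than one model reports. This is an illustrative sketch of a response-aggregation step, not the workflow's actual validation logic.

```javascript
// Collect parallel model outputs and keep findings with enough agreement.
function aggregateFindings(responses, minAgreement = 2) {
  const counts = new Map();
  for (const { findings } of responses) {
    // Deduplicate within a single model so one model can't double-vote.
    for (const f of new Set(findings)) {
      counts.set(f, (counts.get(f) || 0) + 1);
    }
  }
  return [...counts.entries()]
    .filter(([, n]) => n >= minAgreement)
    .map(([finding, votes]) => ({ finding, votes }));
}
```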
by kota
**Overview**
This workflow automatically replies to important incoming Gmail messages using AI, while preventing duplicate or unnecessary replies. It applies multiple safety checks (filters, Google Sheets history, and Gmail sent history) to ensure replies are sent only when appropriate. This template is designed for creators, freelancers, and teams who want a reliable and maintainable AI-powered email auto-reply system.

**How it works**
- New Gmail messages are received and normalized into a consistent structure.
- Unwanted emails (newsletters, promotions, no-reply senders) are filtered out.
- The sender's email is checked against a Google Sheets reply history.
- Gmail is searched to confirm no recent reply was already sent.
- If no duplicate is found, an AI-generated English reply is created and sent.

**Setup steps**
- Connect your Gmail account.
- Connect a Google Sheet for reply history tracking.
- Review the ignore rules and thresholds in the config node.
- Customize the AI prompt if needed.
- Activate the workflow.

Estimated setup time: 5–10 minutes.

**Notes**
- Sticky notes inside the workflow explain each processing step in detail.
- No hardcoded API keys are used.
- The workflow is intentionally linear for clarity and easy maintenance.
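The first two safety checks can be sketched as plain functions. These are hypothetical illustrations; the template keeps its real ignore rules and thresholds in the config node, and the history field names here are assumptions.

```javascript
// Front-line filter: skip senders that should never get an AI reply.
function shouldSkipSender(from) {
  const sender = from.toLowerCase();
  return (
    sender.includes('no-reply') ||
    sender.includes('noreply') ||
    sender.includes('newsletter')
  );
}

// Duplicate guard: has this sender been replied to recently, according to
// the Google Sheets history rows?
function alreadyReplied(sender, historyRows, withinDays = 7) {
  const cutoff = Date.now() - withinDays * 24 * 60 * 60 * 1000;
  return historyRows.some(
    (row) => row.email === sender && new Date(row.repliedAt).getTime() > cutoff
  );
}
```

Only messages that pass both checks (and the Gmail sent-history search) reach the AI reply step.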
by Avkash Kakdiya
**How it works**
This workflow captures new form submissions, cleans the data, and stores it securely before applying AI-based lead qualification. It analyzes each message to assign a category, score, and priority. Based on this classification, leads are routed to the appropriate response path. The system ensures every lead is saved first, then enriched and responded to automatically.

**Step-by-step**
- **Capture and clean lead data**
  - Typeform Trigger - Listens for new submissions in real time.
  - Sanitize Lead Data - Cleans names, formats emails, and extracts the domain.
- **Store lead in database**
  - Create a record - Saves lead details in Airtable with status "New".
- **Analyze and enrich with AI**
  - AI Lead Analyzer - Uses OpenAI to classify, score, and prioritize leads.
  - Merge - Combines original lead data with AI-generated insights.
- **Route and respond automatically**
  - Route by Category - Directs leads based on AI classification.
  - Send a message - Sends a tailored email for sales inquiries.
  - Send a message1 - Sends a confirmation email for support requests.
  - Send a message2 - Sends a response for partnership inquiries.
  - Send a message3 - Sends a fallback response for other categories.
  - Send a message4 - Sends a Discord alert for high-priority leads.

**Why use this?**
- Ensures no lead is lost by storing data before processing
- Automatically prioritizes high-value opportunities using AI
- Reduces manual lead qualification and response time
- Provides personalized responses based on intent
- Enables real-time team alerts for important leads
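A minimal sketch of the Sanitize Lead Data step might look like this; it assumes simple name and email fields, while the actual node may clean additional Typeform answers.

```javascript
// Normalize whitespace in the name, lowercase the email, and pull out the
// domain for downstream enrichment and routing.
function sanitizeLead(lead) {
  const name = (lead.name || '').trim().replace(/\s+/g, ' ');
  const email = (lead.email || '').trim().toLowerCase();
  const domain = email.includes('@') ? email.split('@')[1] : '';
  return { name, email, domain };
}
```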
by Rahul Joshi
📘 **Description**
This workflow enables on-demand social lead discovery using a chat-based interface. When a user submits a lead discovery query, the workflow searches Twitter and Instagram for posts where people are actively asking for tools, recommendations, or help solving real problems. An AI agent filters out spam and promotions, extracts only genuine buying-intent posts, and classifies each lead as Low, Medium, or High intent.

Qualified leads are converted into two outputs: a human-readable Slack summary for quick review and a structured, CRM-ready Notion record for tracking and follow-ups. Short-term conversation memory is maintained to improve relevance across follow-up queries. Built-in error handling ensures failures are reported immediately.

⚠️ **Deployment Disclaimer**
This template can only be used on self-hosted n8n installations. It relies on external MCP tools and custom AI orchestration not supported on n8n Cloud.

⚙️ **What This Workflow Does (Step-by-Step)**
- 💬 **Receive User Lead Discovery Query (Chat Trigger)** - Accepts a natural-language lead discovery request from a user.
- 🧠 **Maintain Short-Term Conversation Context** - Keeps recent query context to improve follow-up accuracy.
- 🔎 **Discover Buying-Intent Leads from Social Platforms (AI)** - Searches Twitter and Instagram for posts indicating real buying or problem-solving intent and extracts structured lead data.
- 🌐 **External Social Search & Enrichment (MCP Tool)** - Fetches relevant social posts from external platforms.
- 🧠 **AI Lead Qualification** - Classifies intent (Low / Medium / High), summarizes the problem, and filters noise.
- 🧩 **Generate Slack & Notion Lead Insight Summary (AI)** - Creates a concise Slack summary and a clean, structured Notion record.
- 📣 **Send Lead Discovery Summary to Slack** - Delivers a skimmable summary for immediate team visibility.
- 🗂 **Store Lead Discovery Insight in Notion CRM** - Logs the search query, themes, and overall intent for tracking.
- 🚨 **Error Handler → Email Alert** - Sends an alert if the workflow fails at any step.
🧩 **Prerequisites**
- Self-hosted n8n instance
- Azure OpenAI API credentials
- MCP bearer authentication for social search
- Slack API credentials
- Notion API credentials

🛠 **Setup Instructions**
1. Deploy the workflow on a self-hosted n8n instance
2. Connect Azure OpenAI, MCP, Slack, and Notion credentials
3. Enable the chat trigger
4. Test with a sample lead discovery query

🛠 **Customization Tips**
- Adjust intent classification rules in the AI prompt
- Modify output fields to match your CRM schema
- Extend discovery to additional platforms via MCP tools

💡 **Key Benefits**
✔ On-demand social lead discovery via chat
✔ Filters only real buying-intent signals
✔ Produces Slack-ready summaries and CRM-ready records
✔ Maintains context across follow-up queries
✔ Eliminates manual social media scanning

👥 **Perfect For**
- Sales teams
- Growth teams
- Founders
- Agencies sourcing leads from social platforms