by Kornel Dubieniecki
# AI LinkedIn Content Assistant using Bright Data and NocoDB

## Who’s it for

This template is designed for creators, founders, and automation builders who publish regularly on LinkedIn and want to analyze their content performance using real data. It’s especially useful for users who are already comfortable with n8n and want to build data-grounded AI assistants instead of relying on generic prompts or manual spreadsheets.

## What this workflow does

This workflow builds an AI-powered LinkedIn content assistant backed by real engagement data. It automatically:

- Scrapes LinkedIn posts and engagement metrics using Bright Data
- Stores structured post data in NocoDB (an illustrative record sketch appears at the end of this template)
- Enables an AI chat interface in n8n to query and analyze your content
- Returns insights based on historical performance (not hallucinated data)

You can ask questions like:

- “Which posts performed best last month?”
- “What content got the most engagement?”
- “What should I post next?”

## Requirements

- Self-hosted or cloud n8n instance
- Bright Data – LinkedIn scraping & data extraction
- NocoDB – open-source Airtable-style database
- OpenAI API – for AI reasoning & insights

## Setup

1. Import the workflow into your n8n instance.
2. Open the Config node and fill in the required variables.
3. Connect your credentials for Bright Data, NocoDB, and the OpenAI API.
4. Activate the workflow and run the scraper once to populate data.

## How to customize the workflow

You can extend this template by:

- Adding new metrics or post fields in NocoDB
- Scheduling regular data refreshes
- Changing the AI system prompt to match your content strategy
- Connecting additional channels (email, Slack, dashboards)

This template is fully modular and designed to be adapted to your workflow.

## Questions or Need Help?

For setup help, customization, or advanced AI workflows, join my 🌟 FREE 🌟 community: Tech Builders Club

Happy building! 🚀
– Kornel Dubieniecki
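As a concrete reference for the data layer described above, a stored post record in NocoDB might look roughly like this. The field names are illustrative assumptions, not the template's exact schema:

```json
{
  "post_url": "https://www.linkedin.com/posts/example-post",
  "posted_at": "2025-05-14",
  "text": "First lines of the post...",
  "likes": 132,
  "comments": 18,
  "reposts": 7,
  "impressions": 5400
}
```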
by 中崎功大
# Smart Irrigation Scheduler with Weather Forecast and Soil Analysis

## Summary

Automated garden and farm irrigation system that uses weather forecasts and evapotranspiration calculations to determine optimal watering schedules, preventing water waste while maintaining healthy plants.

## Detailed Description

A comprehensive irrigation management workflow that analyzes weather conditions, forecasts, soil types, and plant requirements to make intelligent watering decisions. The system considers multiple factors, including expected rainfall, temperature, humidity, wind speed, and days since last watering, to determine whether irrigation is needed and how much to apply.

## Key Features

- **Multi-Zone Management**: Support for multiple irrigation zones with different plant and soil types
- **Weather-Based Decisions**: Uses OpenWeatherMap current conditions and 5-day forecast
- **Evapotranspiration Calculation**: Simplified Penman method for accurate water loss estimation
- **Rain Forecast Skip**: Automatically skips watering when significant rain is expected
- **Plant-Type Specific**: Different requirements for flowers, vegetables, grass, and shrubs
- **Soil Type Consideration**: Adjusts for clay, loam, and sandy soil characteristics
- **Urgency Classification**: High/medium/low priority based on moisture levels
- **Optimal Timing**: Adjusts watering time based on temperature and wind conditions
- **IoT Integration**: Sends commands to smart irrigation controllers
- **Historical Logging**: Tracks all decisions in Google Sheets

## Use Cases

- Home garden automation
- Commercial greenhouse management
- Agricultural operations
- Landscaping company scheduling
- Property management with large grounds
- Water conservation projects

## Required Credentials

- OpenWeatherMap API key
- Slack Bot Token
- Google Sheets OAuth
- IoT Hub API (optional)

**Node Count**: 24 (19 functional + 5 sticky notes)

## Unique Aspects

- Uses the OpenWeatherMap node (rarely used in templates)
- Uses the Split Out node for loop-style processing of zones
- Uses the Filter node for conditional routing
- Uses the Aggregate node to collect results
- Implements the evapotranspiration calculation in a Code node
- Comprehensive multi-factor decision logic

## Workflow Architecture

```
[Daily Morning Check]   [Manual Override Trigger]
          |                        |
          +-----------+------------+
                      |
                      v
          [Define Irrigation Zones]
                      |
                      v
            [Split Zones] (Loop)
               /            \
              v              v
      [Get Current]   [Get 5-Day Forecast]
               \            /
                +----+-----+
                     |
                     v
           [Merge Weather Data]
                     |
                     v
         [Analyze Irrigation Need]
               /            \
              v              v
     [Filter Needing]   [Aggregate All]
               \            /
                +----+-----+
                     |
                     v
       [Generate Irrigation Schedule]
                     |
                     v
        [Has Irrigation Tasks?] (If)
            /                 \
       Has Tasks           No Tasks
        /     \                |
  [Sheets]  [Slack]     [Log No Action]
        \     |                /
         +----+---------------+
                     |
                     v
           [Respond to Webhook]
```

## Configuration Guide

1. **Irrigation Zones**: Edit "Define Irrigation Zones" with your zone data (coordinates, plant/soil types)
2. **Water Thresholds**: Adjust waterThreshold per zone based on plant needs
3. **OpenWeatherMap**: Add API credentials in the weather nodes
4. **Slack Channel**: Set to your garden/irrigation channel
5. **IoT Integration**: Configure the endpoint URL for your smart valve controller
6. **Google Sheets**: Connect to your logging spreadsheet

## Decision Logic

The system evaluates (see the sketch after this list):

- Expected rainfall in the next 24 hours (skip if >5 mm expected)
- Soil moisture estimate based on days since watering plus evapotranspiration
- Plant-specific minimum and ideal moisture levels
- Temperature adjustments for hot days
- Scheduled watering frequency by plant type
- Wind speed for optimal watering time
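To make the decision logic concrete, here is a minimal JavaScript sketch of what the "Analyze Irrigation Need" Code node could look like. The coefficients, field names, and thresholds are illustrative assumptions, not the template's exact values:

```javascript
// Simplified Penman-style estimate of daily evapotranspiration (mm/day).
// The coefficients below are illustrative, not calibrated values.
function estimateET(tempC, humidityPct, windMs) {
  const energyTerm = 0.0023 * (tempC + 17.8) * Math.sqrt(Math.max(tempC, 0));
  const aeroTerm = (1 - humidityPct / 100) * (1 + 0.54 * windMs);
  return Math.max(energyTerm * aeroTerm, 0);
}

function analyzeZone(zone, weather) {
  // Skip entirely when significant rain is expected in the next 24 hours.
  if (weather.expectedRainMm > 5) {
    return { irrigate: false, reason: 'rain expected' };
  }
  // Estimate moisture lost since the last watering.
  const et = estimateET(weather.tempC, weather.humidityPct, weather.windMs);
  const moistureDeficit = zone.daysSinceWatering * et;
  if (moistureDeficit < zone.waterThresholdMm) {
    return { irrigate: false, reason: 'soil moisture sufficient' };
  }
  const urgency = moistureDeficit > 2 * zone.waterThresholdMm ? 'high' : 'medium';
  return { irrigate: true, amountMm: moistureDeficit, urgency };
}

// Example: a vegetable bed, 5 days since watering, on a warm dry day.
console.log(analyzeZone(
  { daysSinceWatering: 5, waterThresholdMm: 2 },
  { expectedRainMm: 1, tempC: 28, humidityPct: 45, windMs: 2 }
)); // { irrigate: true, amountMm: ~3.2, urgency: 'medium' }
```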
by zahir khan
# Screen resumes & save candidate scores to Notion with OpenAI

This template helps you automate the initial screening of job candidates by analyzing resumes against your specific job descriptions using AI.

## 📺 How It Works

The workflow automatically monitors a Notion database for new job applications. When a new candidate is added:

1. It checks whether the candidate has already been processed to avoid duplicates.
2. It downloads the resume file (supporting both PDF and DOCX formats).
3. It extracts the raw text and sends it to OpenAI along with the specific job description and requirements.
4. The AI acts as a "Senior Technical Recruiter," scoring the candidate on skills, experience, and stability.
5. Finally, it updates the Notion entry with a fit score (0-100), a one-line summary, detected skills, and a detailed analysis (an illustrative output sketch appears at the end of this template).

## 📄 Notion Database Structure

You will need two databases in Notion: Jobs (containing descriptions/requirements) and Candidates (containing resume files).

- **Candidates DB Fields**: AI Comments (Text), Resume Score (Text), Top Skills Detected (Text), Feedback (Select), One Line Summary (Text), Resume File (Files & Media).
- **Jobs DB Fields**: Job Description (Text), Requirements (Text).

## 👤 Who’s it for

This workflow is for recruiters, HR managers, founders, and hiring teams who want to reduce the time spent on manual resume screening. Whether you are handling high-volume applications or looking for specific niche skills, this tool ensures every resume gets a consistent, unbiased first-pass review.

## 🔧 How to set up

1. Create the required databases in Notion (as described above).
2. Import the .json workflow into your n8n instance.
3. Set up credentials for Notion and OpenAI.
4. Link those credentials in the workflow nodes.
5. Update the database IDs: open the "Fetch Job Description" and "On New Candidate" nodes and select your specific Notion databases.
6. Run a test with a sample candidate and validate the output in Notion.

## 📋 Requirements

- An n8n instance (Cloud or Self-hosted)
- A Notion account
- OpenAI API Key (GPT-4o or GPT-4 Turbo recommended for best reasoning)

## 🧩 How to customize the workflow

The system is fully modular. You can:

- **Adjust the Persona**: In the Analyze Candidate agent nodes, edit the system prompt to change the "Recruiter" persona (e.g., make it stricter or focus on soft skills).
- **Change Scoring**: Modify the scoring matrix in the prompt to weight "Education" or "Experience" differently.
- **Filter Logic**: Add a node to automatically disqualify candidates below a certain score (e.g., < 50) and move them to a "Rejected" status in Notion.
- **Multi-language**: Update the prompt to translate summaries into your local language if the resume is in English.
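For reference, the analysis step produces a structured result before the Notion entry is updated. A hedged sketch of what that object might look like (the template's exact keys and value types may differ):

```json
{
  "resume_score": 82,
  "one_line_summary": "Senior backend engineer with 7 years of Python and AWS experience.",
  "top_skills_detected": ["Python", "AWS", "PostgreSQL", "Docker"],
  "feedback": "Strong Fit",
  "ai_comments": "Meets all core requirements; stable job tenure of 2-4 years per role."
}
```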
by Tomoki
# Video Processing Pipeline with Thumbnail Generation and CDN Distribution

## Summary

Automated video processing system that monitors S3 for new uploads, generates thumbnails and preview clips, extracts metadata, transcodes to multiple formats, and distributes to a CDN with webhook notifications.

## Detailed Description

A comprehensive video processing workflow that receives S3 events or manual triggers, validates video files, extracts metadata via FFprobe, generates thumbnails at key frames, creates animated GIF previews, transcodes to multiple resolutions, invalidates the CDN cache, and sends completion notifications.

## Key Features

- **S3 Event Monitoring**: Automatic detection of new video uploads
- **Thumbnail Generation**: Multiple sizes at key frame intervals
- **Video Metadata**: FFprobe extraction of duration, resolution, and codec info
- **Preview GIF**: Animated preview clips for video galleries
- **Multi-Format Transcoding**: Convert to 1080p, 720p, 480p
- **CDN Distribution**: Cloudflare cache invalidation and signed URLs
- **Webhook Callbacks**: Notify the origin system on completion

## Use Cases

- Video hosting platforms
- Media asset management systems
- Content delivery networks
- Video streaming services
- Social media platforms
- E-learning video processing
- User-generated content platforms

## Required Credentials

- AWS S3 Credentials (for video storage)
- FFmpeg API credentials (via HTTP)
- Cloudflare API Token (for CDN)
- Slack Bot Token (for notifications)
- Google Sheets OAuth (for logging)

**Node Count**: 24 (19 functional + 5 sticky notes)

## Unique Aspects

- Uses a Webhook for S3 event notifications
- Uses Code nodes for S3 info extraction and URL generation
- Uses an If node for video format validation
- Uses HTTP Request nodes for the FFprobe, FFmpeg, and CDN APIs
- Uses an Aggregate node for collecting parallel processing results
- Uses Merge nodes to consolidate multiple workflow paths
- Implements parallel processing for thumbnails, GIF, and transcoding

## Workflow Architecture

```
[S3 Event Webhook]   [Manual Webhook]
         |                  |
         +--------+---------+
                  |
                  v
          [Merge Triggers]
                  |
                  v
       [Extract S3 Info] (Code)
                  |
                  v
        [Check Is Video] (If)
            /            \
          Yes             No
           |               |
           v               v
 [Get Video Metadata]  [Invalid Response]
      (FFprobe)                |
           |                   |
           v                   |
 [Parse Video Metadata]        |
        (Code)                 |
       /   |   \               |
      v    v    v              |
 [Thumbs] [GIF] [Transcode]    |
       \   |   /               |
        v  v  v                |
   [Aggregate Results]         |
           |                   |
           v                   |
  [Invalidate CDN Cache]       |
           |                   |
           v                   |
  [Generate Signed URLs]       |
        /     \                |
       v       v               |
 [Log Sheet] [Slack]           |
        \     /                |
         v   v                 |
  [Merge Output Paths]         |
           |                   |
           +---------+---------+
                     |
                     v
           [Merge All Paths]
                     |
                     v
         [Respond to Webhook]
```

## Configuration Guide

1. **S3 Event**: Configure the S3 bucket notification to send events to the webhook
2. **FFmpeg API**: Use a hosted FFmpeg service (e.g., api.ffmpeg-service.com)
3. **Cloudflare**: Set the zone ID and API token for cache invalidation
4. **Slack Channel**: Set #video-processing for notifications
5. **Google Sheets**: Connect for processing metrics logging

## Supported Video Formats

| Extension | MIME Type |
|-----------|-----------|
| .mp4 | video/mp4 |
| .mov | video/quicktime |
| .avi | video/x-msvideo |
| .mkv | video/x-matroska |
| .webm | video/webm |
| .m4v | video/x-m4v |

## Thumbnail Generation

| Size | Dimensions | Suffix |
|--------|------------|---------|
| Large | 1280x720 | _large |
| Medium | 640x360 | _medium |
| Small | 320x180 | _small |

Thumbnails are generated at 10%, 30%, 50%, 70%, and 90% of the video duration.

## Transcoding Presets

| Preset | Resolution | Bitrate | Codec |
|--------|------------|---------|-------|
| 1080p | 1920x1080 | 5000k | H.264 |
| 720p | 1280x720 | 2500k | H.264 |
| 480p | 854x480 | 1000k | H.264 |

## Output Structure

```json
{
  "job_id": "job_1705312000_abc123",
  "status": "completed",
  "original": {
    "filename": "video.mp4",
    "resolution": "1920x1080",
    "duration": "00:05:30"
  },
  "thumbnails": {
    "large": "https://cdn/thumbnails/job_id/thumb_0_large.jpg",
    "medium": "https://cdn/thumbnails/job_id/thumb_0_medium.jpg",
    "small": "https://cdn/thumbnails/job_id/thumb_0_small.jpg"
  },
  "preview_gif": "https://cdn/previews/job_id/preview.gif",
  "transcoded": {
    "1080p": "https://cdn/transcoded/job_id/video_1080p.mp4",
    "720p": "https://cdn/transcoded/job_id/video_720p.mp4",
    "480p": "https://cdn/transcoded/job_id/video_480p.mp4"
  }
}
```
by Rahul Joshi
## Description

This workflow acts as a CI/CD-style test harness for validating other n8n workflows. It executes a target workflow (here: Archive Payment Receipts), evaluates pass/fail outcomes, and generates structured reports. Results are automatically archived to Google Drive, logged in Google Sheets, and synced with ClickUp for visibility. Both success and failure scenarios are handled with standardized formatting.

## What This Template Does (Step-by-Step)

- ⚡ **Manual Trigger** – Start the test run manually.
- ▶️ **Execute Target Workflow Under Test** – Calls the specified workflow (Archive Payment Receipts) and captures its output, even if it errors.
- ✅ **Test Result Evaluation (If Node)** – Checks whether the output contains errors. Pass path → success formatting + archival. Fail path → failure formatting + logging.
- 📄 **Format Success Test Result (Set Node)** – Creates a structured result object with: Status (✅ Passed), Workflow Name, Timestamp.
- 📄 **Format Failed Test Result (Set Node)** – Same as above, but with ❌ Failed status.
- 📝 **Generate Success/Failure Report Text (Code Node)** – Converts the structured data into a human-readable report string (see the sketch at the end of this template).
- 📦 **Convert Report to Text File** – Transforms the text into a .txt file for archiving.
- ☁️ **Archive Reports to Google Drive** – Saves the .txt files (success/failure) into the designated Drive folder with timestamped filenames.
- ✏️ **Update ClickUp Task (Success/Failure)** – Posts results directly into a ClickUp task for visibility.
- 📊 **Log Error Details to Error Tracking Sheet (Google Sheets)** – Appends raw error objects to an error log sheet for debugging and trend analysis.

## Prerequisites

- Target workflow to test (e.g., Archive Payment Receipts)
- Google Drive folder for report storage
- Google Sheets (Error Log tab)
- ClickUp API credentials
- n8n instance

## Key Benefits

- ✅ Automates workflow regression testing
- ✅ Captures pass/fail outcomes with a full audit trail
- ✅ Maintains error logs for debugging and reliability improvements
- ✅ Keeps stakeholders updated via ClickUp integration
- ✅ Supports compliance with archived test reports

## Perfect For

- Teams running workflow QA & testing
- Organizations needing audit-ready test reports
- DevOps pipelines with continuous validation of automations
- Stakeholders requiring real-time visibility into workflow health
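For orientation, a minimal JavaScript sketch of what the report-generating Code node could look like. The field names mirror the structured result described above but are assumptions, not the template's exact code:

```javascript
// Build a human-readable report string from the structured test result.
function buildReport(result) {
  const lines = [
    '=== Workflow Test Report ===',
    `Status:    ${result.status}`,        // "✅ Passed" or "❌ Failed"
    `Workflow:  ${result.workflowName}`,
    `Timestamp: ${result.timestamp}`,
  ];
  if (result.error) {
    lines.push(`Error:     ${result.error}`);
  }
  return lines.join('\n');
}

console.log(buildReport({
  status: '✅ Passed',
  workflowName: 'Archive Payment Receipts',
  timestamp: new Date().toISOString(),
}));
```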
by 寳田 武
Turn your n8n instance into a personal "Planetary Defense System." This workflow monitors NASA's data daily for hazardous asteroids, generates sci-fi style warnings using OpenAI, translates them via DeepL, and notifies you through LINE.

## Who is it for

This template is perfect for space enthusiasts, sci-fi fans, or anyone interested in learning how to combine data analysis with AI text generation and translation services in n8n.

## What it does

1. **Fetches Data**: Retrieves the daily "Near Earth Objects" list from the NASA NeoWs API.
2. **Analyzes Threats**: A Code node filters for "potentially hazardous" asteroids and calculates their distance relative to the Moon.
3. **Smart Branching**:
   - If a threat exists: OpenAI generates a dramatic, sci-fi style warning based on the asteroid's size and distance, and DeepL translates this alert into your preferred language (default: Japanese).
   - If no threat exists: a pre-set "Peace Report" is prepared.
4. **Notifies**: Sends the final message to your LINE account via LINE Notify.

## How to set up

1. **NASA API**: Sign up for a free API key at api.nasa.gov and configure the Get Asteroid Data node credential.
2. **OpenAI & DeepL**: Add your API keys to the respective nodes.
3. **LINE Notify**: Generate an access token from the LINE Notify website and add it to the Send Danger Alert and Send Peace Report nodes.
4. **Configure Language**: In the Translate Alert node, set the "Translate To" field to your desired language code (e.g., JA, EN, DE).

## Requirements

- n8n version 1.0 or later
- NASA API Key (free)
- OpenAI API Key (paid)
- DeepL API Key (free or Pro)
- LINE Account & Notify Token

## How to customize

- **Change the Vibe**: Edit the System Prompt in the **Generate SF Alert** node to change the persona (e.g., "Scientific Analyst" instead of "Sci-Fi System").
- **Switch Messenger**: Replace the LINE nodes with Slack, Discord, or Email nodes to receive alerts on your preferred platform.
- **Adjust Thresholds**: Modify the JavaScript in the **Filter & Calculate Distance** node to change the definition of a "threat" (e.g., closer than 10 lunar distances); see the sketch below.
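For reference, a minimal sketch of the kind of logic the **Filter & Calculate Distance** Code node might contain. The field names follow the public NeoWs feed response; the 20-lunar-distance threshold is an assumption you can tune:

```javascript
// Filter the NeoWs daily feed down to potentially hazardous asteroids
// and express each close approach in lunar distances.
const LUNAR_DISTANCE_KM = 384400;

function findThreats(neoFeed) {
  // near_earth_objects is keyed by date; flatten all days into one list.
  const asteroids = Object.values(neoFeed.near_earth_objects).flat();
  return asteroids
    .filter((a) => a.is_potentially_hazardous_asteroid)
    .map((a) => {
      const approach = a.close_approach_data[0];
      const missKm = parseFloat(approach.miss_distance.kilometers);
      return {
        name: a.name,
        diameterM: a.estimated_diameter.meters.estimated_diameter_max,
        lunarDistances: missKm / LUNAR_DISTANCE_KM,
      };
    })
    // Treat anything closer than 20 lunar distances as a "threat" (tunable).
    .filter((a) => a.lunarDistances < 20);
}
```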
by Madame AI
# Post Jobs to Multiple Boards from Google Sheets using BrowserAct

This powerful n8n template turns a Google Sheet into a control panel for automating job postings across multiple job boards. This workflow is perfect for HR teams, recruiters, and hiring managers who want to streamline their hiring process by posting jobs to multiple boards from a single source of truth.

## Self-Hosted Only

This workflow uses a community contribution and is designed and tested for self-hosted n8n instances only.

## How it works

1. The workflow is triggered in two ways: manually (to batch-post all "Ready to Post" jobs) or automatically via a Google Sheets Trigger when a single row is updated.
2. An If node filters for jobs marked with the status 'Ready to Post'.
3. A BrowserAct node takes the job details (title, description, logins, target URL) and runs an automation to post the job on the specified board.
4. An If node checks whether the posting was successful. If it fails, a Slack alert is sent.
5. A Code node parses the successful result from BrowserAct to get the status and live URL (see the sketch at the end of this template).
6. The workflow updates the row in Google Sheets with the Live_URL and changes the Status to 'Posted'.
7. A final Slack message is sent to a channel to confirm the successful posting.

## Requirements

- **BrowserAct** API account for automated posting
- **BrowserAct** "Automated Job Posting to Niche Job Site (Custom Site)" template (you will need to customize the workflow for the target site)
- **BrowserAct** n8n Community Node -> (n8n Nodes BrowserAct)
- **Google Sheets** credentials
- **Slack** credentials for sending notifications

## Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- How to Use the BrowserAct N8N Community Node
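As a hedged illustration of step 5 above, the result-parsing Code node might look something like this. The shape of the BrowserAct response (and the `success` / `post_url` field names) is an assumption; adapt the paths to your template's actual output:

```javascript
// Parse the BrowserAct automation result into a status and live URL
// for writing back to the Google Sheet.
function parsePostingResult(browserActOutput) {
  let data;
  try {
    data = typeof browserActOutput === 'string'
      ? JSON.parse(browserActOutput)
      : browserActOutput;
  } catch (e) {
    return { status: 'Failed', liveUrl: null };
  }
  return {
    status: data.success ? 'Posted' : 'Failed',
    liveUrl: data.post_url || null,  // hypothetical field name
  };
}
```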
by Madame AI
# Real-Time MAP Enforcement & Price Violation Alerts using BrowserAct & Slack

This n8n template automates MAP (Minimum Advertised Price) enforcement by monitoring reseller websites and alerting you instantly to price violations and stock issues. This workflow is essential for brand owners, manufacturers, and compliance teams who need to proactively monitor their distribution channels and enforce pricing policies.

## How it works

1. The workflow runs on a Schedule Trigger (e.g., hourly) to continuously monitor product prices.
2. A Google Sheets node fetches your list of resellers, product URLs, and the official MAP price (AP_Price).
3. The Loop Over Items node ensures that each reseller's product is checked individually.
4. A pair of BrowserAct nodes navigate to the reseller's product page and reliably scrape the current live price.
5. A series of If nodes check for violations (see the sketch at the end of this template):
   - The first check (If1) looks for "NoData," signaling that the product is Out of Stock, and sends a specific Slack alert.
   - The second check (If) compares the scraped price to your MAP price, triggering a detailed Slack alert if a MAP violation is found.
6. The workflow loops back to check the next reseller on the list.

## Requirements

- **BrowserAct** API account for web scraping
- **BrowserAct** "MAP (Minimum Advertised Price) Violation Alerts" template
- **BrowserAct** n8n Community Node -> (n8n Nodes BrowserAct)
- **Google Sheets** credentials for your price list
- **Slack** credentials for sending alerts

## Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- How to Use the BrowserAct N8N Community Node

## Workflow Guidance and Showcase

I Built a Bot to Catch MAP Violators (n8n + BrowserAct Workflow)
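For reference, the two If-node checks described above boil down to logic like the following JavaScript sketch. The "NoData" sentinel comes from the description above; the price-cleaning regex is an assumption:

```javascript
// Decide which alert (if any) a scraped listing should trigger.
function checkListing(scrapedPrice, mapPrice) {
  // First check: "NoData" signals the product is out of stock.
  if (scrapedPrice === 'NoData') {
    return { alert: 'out_of_stock' };
  }
  // Second check: compare the live price against the official MAP price.
  const price = parseFloat(String(scrapedPrice).replace(/[^0-9.]/g, ''));
  if (price < mapPrice) {
    return { alert: 'map_violation', price, mapPrice };
  }
  return { alert: null, price };
}

console.log(checkListing('$79.99', 99.0));
// MAP violation: live price 79.99 is below the MAP of 99
```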
by Madame AI
# Automated E-commerce Store Monitoring for New Products Using BrowserAct

This n8n template is an advanced competitive intelligence tool that automatically monitors competitor e-commerce/Shopify stores and alerts you the moment they launch a new product. This workflow is essential for e-commerce store owners, product strategists, and marketing teams who need real-time insight into what their competitors are selling.

## Self-Hosted Only

This workflow uses a community contribution and is designed and tested for self-hosted n8n instances only.

## How it works

1. The workflow runs on a Schedule Trigger to check for new products automatically (e.g., daily).
2. A Google Sheets node fetches your master list of competitor store links from a central sheet.
3. The workflow loops through each competitor one by one.
4. For each competitor, a Google Sheets node first creates a dedicated tracking sheet (if one doesn't exist) to store their product list history.
5. A BrowserAct node then scrapes the competitor's current product list from their live website.
6. The scraped data is saved to the competitor's dedicated tracking sheet.
7. The workflow then fetches the newly scraped list and the previously stored list of products.
8. A custom Code node (labeled "Compare Datas") performs a difference check to reliably detect whether any new products have been added (see the sketch at the end of this template).
9. If a new product is detected, an If node triggers an immediate Slack alert to your team, providing real-time competitive insight.

## Requirements

- **BrowserAct** API account for web scraping
- **BrowserAct** "Competitors Shopify Website New Product Monitor" template
- **BrowserAct** n8n Community Node -> (n8n Nodes BrowserAct)
- **Google Sheets** credentials for storing and managing data
- **Slack** credentials for sending alerts

## Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- How to Use the BrowserAct N8N Community Node

## Workflow Guidance and Showcase

Automatically Track Competitor Products | n8n & Google Sheets Template
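The difference check in the "Compare Datas" Code node is essentially a set difference between the fresh scrape and the stored history. A minimal sketch, assuming product URLs serve as stable identifiers (an assumption; your template may key on titles or IDs instead):

```javascript
// Return every product in the current scrape that was not in the stored list.
function findNewProducts(currentList, previousList) {
  const known = new Set(previousList.map((p) => p.url));
  return currentList.filter((p) => !known.has(p.url));
}

const newProducts = findNewProducts(
  [{ url: '/products/a', title: 'A' }, { url: '/products/b', title: 'B' }],
  [{ url: '/products/a', title: 'A' }]
);
console.log(newProducts);
// [{ url: '/products/b', title: 'B' }] -> triggers the Slack alert
```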
by HoangSP
## Who’s it for

Teams that want to turn a chat prompt into a researched, ready-to-post social update, optionally published to Facebook.

## What it does / How it works

1. Chat Trigger receives the user prompt.
2. Topic Agent optionally calls a research sub-workflow for fresh sources.
3. Outputs are validated into a structured JSON (see the sketch at the end of this template).
4. Post Writing Agent crafts a concise Vietnamese post.
5. (Optional) Facebook Graph API publishes to your Page.

## How to set up

1. Connect OpenAI & Facebook in Credentials (no API key inside nodes).
2. In Tool: Call Perplexity Researcher, set your research workflowId.
3. In Publish: Facebook Graph API, set your Page ID and edge.
4. Adjust prompts/tone and the LANGUAGE in CONFIG.
5. Test the flow with sample prompts in the chat.

## Requirements

- n8n (Cloud or self-hosted)
- OpenAI API key (stored in Credentials)
- Facebook Page publish permissions
- (Optional) a research workflow for Perplexity

## How to customize the workflow

- Add moderation/review gates before publishing.
- Duplicate the publish path for other platforms.
- Store outputs in Sheets/Notion/DB for auditing.
- Tune model choice & temperature for your brand voice.

## Security

- Avoid hardcoding secrets in HTTP or Code nodes.
- Keep identifiers (Page IDs, workflowIds) configurable in CONFIG.
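For orientation, the structured JSON handed from the Topic Agent to the Post Writing Agent might look roughly like this (a sketch only; the template's exact schema may differ):

```json
{
  "topic": "AI automation trends",
  "language": "vi",
  "sources": [
    { "title": "Example source", "url": "https://example.com/article" }
  ],
  "key_points": [
    "First point summarized from research",
    "Second point summarized from research"
  ]
}
```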
by Rohit Dabra
# 🧩 Zoho CRM MCP Server Integration (n8n Workflow)

## 🧠 Overview

This n8n flow integrates Zoho CRM with an MCP (Model Context Protocol) Server and the OpenAI Chat Model, enabling AI-driven automation for CRM lead management. It allows an AI Agent to create, update, delete, and fetch leads in Zoho CRM through natural-language instructions.

## ▶️ Demo Video

Watch the full demo here: 👉 YouTube Demo Video

## ⚙️ Core Components

| Component | Purpose |
| --- | --- |
| MCP Server Trigger | Acts as the entry point for requests sent to the MCP Server (external systems or chat interfaces). |
| Zoho CRM Nodes | Handle CRUD operations for leads (create, update, delete, get, getAll). |
| AI Agent | Uses the OpenAI Chat Model and Memory to interpret and respond to incoming chat messages. |
| OpenAI Chat Model | Provides the LLM (Large Language Model) intelligence for the AI Agent. |
| Simple Memory | Stores short-term memory context for chat continuity. |
| MCP Client | Bridges communication between the AI Agent and the MCP Server for bi-directional message handling. |

## 🧭 Flow Description

### 1. Left Section (MCP Server + Zoho CRM Integration)

- **Trigger**: MCP Server Trigger, which receives API requests or chat events.
- **Zoho CRM Actions**:
  - 🟢 Create a lead in Zoho CRM
  - 🔵 Update a lead in Zoho CRM
  - 🟣 Get a lead in Zoho CRM
  - 🟠 Get all leads in Zoho CRM
  - 🔴 Delete a lead in Zoho CRM

Each of these nodes connects to the Zoho CRM credentials and performs the respective operation on Zoho CRM’s “Leads” module.

### 2. Right Section (AI Agent + Chat Flow)

- **Trigger**: When chat message received, which initiates the flow when a message arrives.
- **AI Agent Node** uses:
  - OpenAI Chat Model → for natural language understanding and generation.
  - Simple Memory → to maintain context between interactions.
  - MCP Client → to call MCP actions (which include the Zoho CRM operations).

This creates a conversational interface allowing users to type things like:

> “Add a new lead named John Doe with email john@acme.com”

The AI Agent interprets this and routes the request to the proper Zoho CRM action node automatically.

## ⚙️ Step-by-Step Configuration Guide

### 🧩 1. Import the Flow

1. In n8n, go to Workflows → Import.
2. Upload the JSON file of this workflow (or paste the JSON code).
3. Once imported, you’ll see the structure as in the image.

### 🔐 2. Configure Zoho CRM Credentials

You must connect the Zoho CRM API to n8n.

1. Go to Credentials → New → Zoho OAuth2 API.
2. Follow Zoho’s official n8n documentation.
3. Provide the following:
   - Environment: Production
   - Data Center: e.g., zoho.in or zoho.com, depending on your region
   - Client ID and Client Secret, from the Zoho API Console (https://api-console.zoho.com/)
   - Scope: ZohoCRM.modules.leads.ALL
   - Redirect URL: use the callback URL shown in n8n (copy it before saving credentials)
4. Click Connect and complete the OAuth consent.

✅ Once authenticated, all Zoho CRM nodes (Create, Update, Delete, etc.) will be ready.

### 🔑 3. Configure OpenAI API Key

1. In n8n, go to Credentials → New → OpenAI API.
2. Enter your API Key from https://platform.openai.com/account/api-keys.
3. Save the credentials.
4. In the AI Agent node, select this OpenAI credential under Model.

### 🧠 4. Configure the AI Agent

1. Open the AI Agent node.
2. Choose:
   - Chat Model: select your configured OpenAI Chat Model.
   - Memory: select Simple Memory.
   - Tools: add MCP Client as the tool.
3. Configure the AI instructions (System Prompt), for example:

```
You are an AI assistant that helps manage leads in Zoho CRM.
When the user asks to create, update, or delete a lead, use the appropriate tool.
Provide confirmations in natural language.
```

### 🧩 5. Configure MCP Server

**A. MCP Server Trigger**

1. Open the MCP Server Trigger node.
2. Note down the endpoint URL; this acts as the API entry point for external requests.
3. It listens for incoming POST requests from your MCP client or chat interface.

**B. MCP Client Node**

1. In the AI Agent, link the MCP Client node.
2. Configure it to send requests back to your MCP Server endpoint (for two-way communication).

> 🔄 This enables a continuous conversation loop between external clients and the AI-powered CRM automation system.

### 🧪 6. Test the Flow

Once everything is connected:

1. Activate the workflow.
2. From your chat interface or Postman, send a message to the MCP Server endpoint:

```json
{
  "message": "Create a new lead named Alice Johnson with email alice@zoho.com"
}
```

3. Observe:
   - The AI Agent interprets the intent.
   - It calls the Zoho CRM Create Lead node.
   - It returns a success message with the lead ID.

## 🧰 Example Use Cases

| User Query | Action Triggered |
| --- | --- |
| “Add John as a lead with phone number 9876543210” | Create lead in Zoho CRM |
| “Update John’s company to Acme Inc.” | Update lead in Zoho CRM |
| “Show me all leads from last week” | Get all leads |
| “Delete lead John Doe” | Delete lead |

## 🧱 Tech Stack Summary

| Layer | Technology |
| --- | --- |
| Automation Engine | n8n |
| AI Layer | OpenAI GPT Chat Model |
| CRM | Zoho CRM |
| Communication Protocol | MCP (Model Context Protocol) |
| Memory | Simple Memory |
| Trigger | HTTP-based MCP Server |

## ✅ Best Practices

- 🔄 Refresh Tokens Regularly: Zoho tokens expire; ensure auto-refresh is set up.
- 🧹 Use Environment Variables for API keys instead of hardcoding them.
- 🧠 Fine-tune System Prompts for better AI understanding.
- 📊 Enable Logging for request/response tracking.
- 🔐 Restrict MCP Server Access with an API key or JWT token.
by MUHAMMAD SHAHEER
## Overview

This workflow helps you automatically collect verified business leads from Google Search using SerpAPI, with no coding required. It extracts company names, websites, emails, and phone numbers directly from search results and saves them into Google Sheets for easy follow-up or CRM import. It is perfect for marketers, freelancers, and agencies who want real, usable leads fast, without manual scraping or paid databases.

## How It Works

1. The SerpAPI node performs a Google search for your chosen keyword or niche.
2. The Split Out node separates each result for individual processing.
3. The HTTP Request node optionally visits each site for deeper data extraction.
4. The Code node filters, validates, and formats leads using smart parsing logic (see the sketch at the end of this template).
5. The Google Sheets node stores the final structured data automatically.

All steps include sticky notes with configuration help.

## Setup Steps

Setup takes about 5-10 minutes:

1. Add your SerpAPI key (replace the placeholder).
2. Connect your Google Sheets account.
3. Update the search term (e.g., “Plumbers in New York”).
4. Run the workflow and watch leads populate your sheet in real time.
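As a hedged illustration of the parsing step, the Code node's logic might resemble the following. The regexes and field names are assumptions to adapt to the pages you scrape; `title` and `link` follow the shape of SerpAPI organic results:

```javascript
// Extract a structured lead from a SerpAPI result plus the fetched page HTML.
function extractLead(result, pageHtml) {
  // Naive patterns for demonstration; real pages need more robust parsing.
  const emailMatch = pageHtml.match(/[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/);
  const phoneMatch = pageHtml.match(/\+?\d[\d\s().-]{7,}\d/);
  return {
    company: result.title,   // from the SerpAPI organic result
    website: result.link,
    email: emailMatch ? emailMatch[0] : null,
    phone: phoneMatch ? phoneMatch[0].trim() : null,
  };
}

console.log(extractLead(
  { title: 'Acme Plumbing', link: 'https://acmeplumbing.example' },
  'Contact us at info@acmeplumbing.example or +1 (212) 555-0134.'
));
```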