by Sean Spaniel
# Predict Housing Prices with a Neural Network

This n8n template demonstrates how a simple Multi-Layer Perceptron (MLP) neural network can predict housing prices. The prediction is based on four key features, processed through a three-layer model.

- **Input Layer**: Receives the initial data via a webhook that accepts four query parameters.
- **Hidden Layer**: Composed of two neurons. Each neuron calculates a weighted sum of the inputs, adds a bias, and applies the ReLU activation function.
- **Output Layer**: Contains one neuron that calculates the weighted sum of the hidden layer's outputs, adds its bias, and returns the final price prediction.

## Setup

This template works out of the box and requires no special configuration or prerequisites. Just import the workflow to get started.

## How to Use

Trigger this workflow by sending a GET request to the webhook endpoint, including the house features as query parameters in the URL.

Endpoint: `/webhook/regression/house/price`

Query parameters:

- `square_feet`: The total square footage of the house.
- `number_rooms`: The total number of rooms.
- `age_in_years`: The age of the house in years.
- `distance_to_city_in_km`: The distance to the nearest city center in kilometers.

## Example

Here's an example curl request for a 1,500 sq ft, 3-room house that is 10 years old and 5 km from the city.

Request:

    curl "https://your-n8n-instance.com/webhook/regression/house/price?square_feet=1500&number_rooms=3&age_in_years=10&distance_to_city_in_km=5"

Response:

    { "price": 53095.832123960805 }
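To make the three-layer model concrete, here is a minimal sketch of the forward pass in JavaScript, the same kind of code an n8n Code node could run. The weights and biases below are hypothetical placeholders (the real values live inside the workflow), so the output will not match the example response above.

```javascript
// ReLU activation used by the hidden layer.
function relu(x) {
  return Math.max(0, x);
}

// Hidden layer: two neurons, each a weighted sum + bias passed through ReLU.
// Output layer: one neuron, weighted sum of hidden outputs + bias, no activation.
function predictPrice(features, model) {
  const hidden = model.hidden.map((n) =>
    relu(n.weights.reduce((sum, w, i) => sum + w * features[i], n.bias))
  );
  return model.output.weights.reduce(
    (sum, w, i) => sum + w * hidden[i],
    model.output.bias
  );
}

// Hypothetical parameters, for illustration only.
const model = {
  hidden: [
    { weights: [30, 1000, -50, -200], bias: 500 },
    { weights: [5, 200, -10, -40], bias: 100 },
  ],
  output: { weights: [1.0, 0.5], bias: 1000 },
};

// 1,500 sq ft, 3 rooms, 10 years old, 5 km from the city.
const price = predictPrice([1500, 3, 10, 5], model);
console.log(price); // 51950 with these placeholder weights
```

Swapping in the workflow's actual weights reproduces its predictions exactly, since the math is just two ReLU neurons feeding one linear output neuron.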
by Stephan Koning
## What it Does

This workflow automatically classifies uploaded files (PDFs or images) as floorplans or non-floorplans. It filters out junk files, then analyzes valid floorplans to extract room sizes and measurements.

## Who it's For

Built for real estate platforms, property managers, and automation builders who need a trustworthy way to detect invalid uploads while quickly turning true floorplans into structured, reusable data.

## How it Works

1. A user uploads a file (PDF, JPG, PNG, etc.).
2. The workflow routes the file based on type for specialized processing.
3. A two-layer quality check is applied using heuristics and AI classification.
4. A confidence score determines whether the file is a valid floorplan.
5. Valid floorplans are passed to a powerful OCR/AI for deep analysis.
6. Results are returned as JSON and a user-friendly HTML table.

## The Technology Behind the Demo

This MVP is a glimpse into a more advanced commercial system. It runs on a custom n8n workflow that leverages Mistral AI's latest OCR technology. Here's what makes it powerful:

- **Structured Data Extraction**: The AI is forced to return data in a clean, predictable JSON Schema. This isn't just text scraping; it's a reliable data pipeline.
- **Intelligent Data Enrichment**: The workflow doesn't just extract data: it enriches it. A custom script automatically calculates crucial metrics like wall surface area from the floor dimensions, even using fallback estimates if needed.
- **Automated Aggregation**: It goes beyond individual rooms by automatically calculating totals per floor level and per room type, providing immediate, actionable insights.

While this demo shows the core classification and measurement (Step 1), the full commercial version includes Steps 2 and 3 (Automated Offer Generation), currently in use by a client in the construction industry.

Test the live MVP.

## Requirements

- Jigsaw Stack API key
- n8n instance
- Webhook endpoint

## Customization

Adjust thresholds, fine-tune heuristics, or swap OCR providers to better match your business needs and downstream integrations.
by David Olusola
# n8n Learning Hub: AI-Powered YouTube Educator Directory

## Overview

This workflow demonstrates how to use n8n Data Tables to create a searchable database of educational YouTube content. Users can search for videos by topic (e.g., "voice", "scraping", "lead gen") and receive formatted recommendations from top n8n educators.

What this workflow does:

- **Receives search queries** via webhook (e.g., topic: "voice agents")
- **Processes keywords** using JavaScript to normalize search terms
- **Queries a Data Table** to find matching educational videos
- **Returns formatted results** with video titles, educators, difficulty levels, and links
- **Populates the database** with a one-time setup workflow

## Key Features

- **Data Tables introduction**: Learn how to store and query structured data
- **Webhook integration**: Accept external requests and return JSON responses
- **Keyword processing**: Simple text normalization and keyword matching
- **Batch operations**: Use Split in Batches to populate tables efficiently
- **Frontend ready**: Easy to connect with Lovable, Replit, or custom UIs

## Setup Guide

### Step 1: Import the Workflow

1. Copy the workflow JSON.
2. In n8n, go to Workflows → Import from File or Import from URL.
3. Paste the JSON and click Import.

### Step 2: Create the Data Table

The workflow uses a Data Table called `n8n_Educator_Videos` with these columns:

- **Educator** (text): Creator name
- **video_title** (text): Video title
- **Difficulty** (text): Beginner/Intermediate/Advanced
- **YouTubeLink** (text): Full YouTube URL
- **Description** (text): Video summary for search matching

To create it:

1. Go to Data Tables in your n8n instance.
2. Click + Create Data Table.
3. Name it `n8n_Educator_Videos`.
4. Add the 5 columns listed above.

### Step 3: Populate the Database

1. Click on the "When clicking 'Execute workflow'" node (bottom branch).
2. Click Execute Node to run the setup.
3. This inserts all 9 educational videos into your Data Table.

### Step 4: Activate the Webhook

1. Click on the Webhook node (top branch).
2. Copy the Production URL (looks like `https://your-n8n.app.n8n.cloud/webhook/1799531d-...`).
3. Click Activate on the workflow.
4. Test it with a POST request:

    curl -X POST https://your-n8n.app.n8n.cloud/webhook/YOUR-WEBHOOK-ID \
      -H "Content-Type: application/json" \
      -d '{"topic": "voice"}'

## How the Search Works

### Keyword Processing Logic

The JavaScript node normalizes search queries:

- **"voice", "audio", "talk"** → matches voice agent tutorials
- **"lead", "lead gen"** → matches lead generation content
- **"scrape", "data", "scraping"** → matches web scraping tutorials

The Data Table query uses LIKE matching on the Description field, so partial matches work great.

### Example Queries

    {"topic": "voice"}    // Returns Eleven Labs Voice Agent
    {"topic": "scraping"} // Returns 2 scraping tutorials
    {"topic": "avatar"}   // Returns social media AI avatar videos
    {"topic": "advanced"} // Returns all advanced-level content

## Building a Frontend with Lovable or Replit

### Option 1: Lovable (lovable.dev)

Lovable is an AI-powered frontend builder perfect for quick prototypes.

Prompt for Lovable:

    Create a modern search interface for an n8n YouTube learning hub:
    - Title: "n8n Learning Hub"
    - Search bar with placeholder "Search for topics: voice, scraping, RAG..."
    - Submit button that POSTs to webhook: [YOUR_WEBHOOK_URL]
    - Display results as cards showing:
      - Video title (bold)
      - Educator name
      - Difficulty badge (color-coded)
      - YouTube link button
      - Description
    - Design: dark mode, modern glassmorphism style, responsive grid layout

Implementation steps:

1. Go to lovable.dev and start a new project.
2. Paste the prompt above.
3. Replace [YOUR_WEBHOOK_URL] with your actual webhook.
4. Export the code or deploy directly.

### Option 2: Replit (replit.com)

Use Replit's HTML/CSS/JS template for more control.

HTML structure (note the `<div id="results">` element, which the script writes into):

    <!DOCTYPE html>
    <html>
    <head>
      <title>n8n Learning Hub</title>
      <style>
        body { font-family: Arial; max-width: 900px; margin: 50px auto; }
        #search { padding: 10px; width: 70%; font-size: 16px; }
        button { padding: 10px 20px; font-size: 16px; }
        .video-card { border: 1px solid #ddd; padding: 20px; margin: 20px 0; }
      </style>
    </head>
    <body>
      <h1>n8n Learning Hub</h1>
      <input id="search" placeholder="Search: voice, scraping, RAG..." />
      <button onclick="searchVideos()">Search</button>
      <div id="results"></div>
      <script>
        async function searchVideos() {
          const topic = document.getElementById('search').value;
          const response = await fetch('YOUR_WEBHOOK_URL', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({ topic })
          });
          const data = await response.json();
          document.getElementById('results').innerHTML = data.Message || 'No results';
        }
      </script>
    </body>
    </html>

### Option 3: Base44 (No-Code Tool)

If using Base44 or similar no-code tools:

1. Create a form with a text input (name: topic).
2. Add a Submit Action → HTTP Request.
3. Set Method: POST, URL: your webhook.
4. Map form data: {"topic": "{{topic}}"}.
5. Display the response in a Text Block using {{response.Message}}.

## Understanding Data Tables

### Why Data Tables?

- **Persistent storage**: Data survives workflow restarts
- **Queryable**: Use conditions (equals, like, greater than) to filter
- **Scalable**: Handles thousands of records efficiently
- **No external DB**: Everything stays within n8n

Common operations:

- **Insert Row**: Add new records (used in the setup branch)
- **Get Row(s)**: Query with filters (used in the search branch)
- **Update Row**: Modify existing records by ID
- **Delete Row**: Remove records

Best practices:

- Use descriptive column names.
- Include a searchable text field (like Description).
- Keep data normalized (avoid duplicate entries).
- Use the "Split in Batches" node for bulk operations.

## Extending This Workflow

Ideas to try:

- **Add more educators**: Expand the video database
- **Category filtering**: Add a Category column (Automation, AI, Scraping)
- **Difficulty sorting**: Let users filter by skill level
- **Vote system**: Add upvote/downvote columns
- **Analytics**: Track which topics are searched most
- **Admin panel**: Build a form to add new videos via webhook

Advanced features:

- **AI-powered search**: Use OpenAI embeddings for semantic search
- **Thumbnail scraping**: Fetch YouTube thumbnails via API
- **Auto-updates**: Periodically check for new videos from educators
- **Personalization**: Track user preferences in a separate table

## Troubleshooting

- **Webhook returns empty results**: Check that the Description field contains searchable keywords.
- **Database is empty**: Run the "When clicking 'Execute workflow'" branch to populate data.
- **Frontend not connecting**: Verify the webhook is activated and the URL is correct (use Test mode first).
- **Search too broad/narrow**: Adjust the keyword logic in the "Load Video DB" node.

## Learning Resources

Want to learn more about the concepts in this workflow?

- **Data Tables**: n8n Data Tables documentation
- **Webhooks**: Webhook node guide
- **JavaScript in n8n**: "Every N8N JavaScript Function Explained" (see database)

## What You Learned

By completing this workflow, you now understand:

- How to create and populate Data Tables
- How to query tables with conditional filters
- How to build webhook-based APIs in n8n
- How to process and normalize user input
- How to format data for frontend consumption
- How to connect n8n with external UIs

Happy learning! Built with ❤️ using n8n Data Tables.
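The keyword-processing step described under "How the Search Works" can be sketched as a small JavaScript function of the kind the workflow's Code node runs. The synonym map below is illustrative; the actual mappings live in the "Load Video DB" node and may differ.

```javascript
// Map incoming search terms onto canonical keywords for the LIKE filter.
// These synonym groups are examples based on the template description.
const SYNONYMS = {
  voice: ['voice', 'audio', 'talk'],
  lead: ['lead', 'lead gen'],
  scraping: ['scrape', 'scraping', 'data'],
};

function normalizeTopic(raw) {
  const topic = String(raw || '').trim().toLowerCase();
  for (const [canonical, words] of Object.entries(SYNONYMS)) {
    if (words.some((w) => topic.includes(w))) return canonical;
  }
  return topic; // fall through: use the raw term for partial matching
}

console.log(normalizeTopic('Audio agents')); // "voice"
console.log(normalizeTopic('RAG'));          // "rag" (passed through as-is)
```

Because the Data Table query does partial (LIKE) matching on Description, even unmapped terms like "rag" still find videos whose summaries mention them.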
by Nalin
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

# Qualify and enrich inbound leads with contextual insights and Slack alerts

## Who is this for?

Sales teams, account executives, and RevOps professionals who need more than just basic lead scoring. Built for teams that want deep contextual insights about qualified prospects to enable truly relevant conversations from the first touchpoint.

## What problem does this solve?

Most qualification stops at "good fit" or "bad fit", but that leaves sales teams flying blind when it comes to actually engaging the prospect. You know they're qualified, but what are their specific pain points? What value propositions resonate? Which reference customers should you mention? This workflow uses Octave's context engine to not only qualify leads but enrich them with actionable insights that turn cold outreach into warm, contextualized conversations.

## What this workflow does

### Inbound Lead Processing

- Receives lead information via webhook (firstName, companyName, companyDomain, profileURL, jobTitle)
- Processes leads from website forms, demo requests, content downloads, or trial signups
- Validates and structures lead data for intelligent qualification and enrichment

### Contextualized Lead Qualification

- Leverages Octave's context engine to score leads against your specific ICP
- Analyzes company fit, role relevance, and timing indicators
- Generates qualification scores (1-10) with detailed rationale
- Filters out low-scoring leads (configurable threshold, default > 5)

### Deep Lead Enrichment

- Uses Octave's enrichment engine to generate contextual insights about qualified leads
- Identifies primary responsibilities, pain points, and relevant value propositions
- Suggests appropriate reference customers and use cases to mention
- Provides sales teams with conversation starters grounded in your business context

### Enhanced Sales Alerts

- Sends enriched Slack alerts with the qualification score plus actionable insights
- Includes suggested talking points, pain points, and reference customers
- Enables sales teams to have contextualized conversations from first contact

## Setup

Required credentials:

- Octave API key and workspace access
- Slack OAuth credentials with channel access
- Access to your lead source system (website forms, CRM, etc.)

Step-by-step configuration:

1. **Set up the Octave qualification agent**:
   - Add your Octave API credentials in n8n.
   - Replace `your-octave-qualification-agent-id` with your actual qualification agent ID.
   - Configure your qualification agent with your ICP criteria and business context.
2. **Set up the Octave enrichment agent**:
   - Replace `your-octave-enrichment-agent-id` with your actual enrichment agent ID.
   - Configure enrichment outputs based on the insights most valuable to your sales process.
   - Test enrichment quality with sample leads from your target market.
3. **Configure the Slack integration**:
   - Add your Slack OAuth credentials to n8n.
   - Replace `your-slack-channel-id` with the channel for enriched lead alerts.
   - Customize the Slack message template with the enrichment fields most useful for your sales team.
4. **Set up the lead source**:
   - Replace `your-webhook-path-here` with a unique, secure path.
   - Configure your website forms, CRM, or lead source to send data to the webhook.
   - Ensure consistent data formatting across lead sources.
5. **Customize the qualification filter**:
   - Adjust the Filter node threshold (default: score > 5).
   - Modify based on your lead volume and qualification standards.
   - Test with sample leads to calibrate scoring.

Required webhook payload format:

    {
      "body": {
        "firstName": "Sarah",
        "lastName": "Johnson",
        "companyName": "ScaleUp Technologies",
        "companyDomain": "scaleuptech.com",
        "profileURL": "https://linkedin.com/in/sarahjohnson",
        "jobTitle": "VP of Engineering"
      }
    }

## How to customize

**Qualification criteria**: Customize scoring in your Octave qualification agent:

- **Product level**: Define "good fit" and "bad fit" questions that determine if someone needs your core offering
- **Persona level**: Set criteria for specific buyer personas and their unique qualification factors
- **Segment level**: Configure qualification logic for different market segments or use cases
- **Multi-level qualification**: Qualify against Product + Persona, Product + Segment, or all three levels combined

**Enrichment insights**: Configure your Octave enrichment agent to surface the most valuable insights:

- **Primary responsibilities**: What this person actually does day-to-day
- **Pain points**: Specific challenges they face that your solution addresses
- **Value propositions**: Which benefits resonate most with their role and situation
- **Reference customers**: Similar companies/roles that have succeeded with your solution
- **Conversation starters**: Contextual talking points for outreach

**Slack alert format**: Customize the enrichment data included in alerts:

- Add or remove enrichment fields based on sales team preferences.
- Modify message formatting for better readability.
- Include additional webhook data if needed.

**Scoring threshold**: Adjust the Filter node to match your qualification standards.

**Integration channels**: Replace Slack with email, CRM updates, or other notification systems.

## Use cases

- High-value enterprise lead qualification and research automation
- Demo request enrichment for contextual sales conversations
- Event lead processing with immediate actionable insights
- Website visitor qualification and conversation preparation
- Trial signup enrichment for targeted sales outreach
- Content download lead scoring with context-aware follow-up preparation
by Elodie Tasia
Automatically create branded social media graphics, certificates, thumbnails, or marketing visuals using Bannerbear's template-based image generation API. Bannerbear's API is primarily asynchronous by default: this workflow shows you how to use both asynchronous (webhook-based) and synchronous modes, depending on your needs.

## What it does

This workflow connects to Bannerbear's API to generate custom images based on your pre-designed templates. You can modify text, colors, and other elements programmatically. By default, Bannerbear works asynchronously: you submit a request, receive an immediate 202 Accepted response, and get the final image via webhook or polling. This workflow demonstrates both the standard asynchronous approach and an alternative synchronous method where you wait for the image to be generated before proceeding.

## How it works

1. **Set parameters**: Configure your Bannerbear API key, template ID, and content (title, subtitle).
2. **Choose mode**: Select synchronous (wait for an immediate response) or asynchronous (standard webhook delivery).
3. **Generate image**: The workflow calls Bannerbear's API with your modifications.
4. **Receive result**: Get the image URL, dimensions, and metadata in PNG or JPG format.

Async mode (recommended): the workflow receives a pending status immediately, then a webhook delivers the completed image when ready.

Sync mode: the workflow waits for the image generation to complete before proceeding.

## Setup requirements

- A Bannerbear account (free tier available)
- A Bannerbear template created in your dashboard
- Your API key and template ID from Bannerbear
- For async mode: the ability to receive webhooks (production n8n instance)

## How to set up

1. **Get Bannerbear credentials**:
   - Sign up at bannerbear.com.
   - Create a project and design a template.
   - Copy your API key from Settings > API Key.
   - Copy your template ID from the API Console.
2. **Configure the workflow**:
   - Open the "SetParameters" node.
   - Replace the API key and template ID with yours.
   - Customize the title and subtitle text.
   - Set call_mode to "sync" or "async".
3. **For async mode (recommended)**:
   - Activate the "Webhook_OnImageCreated" node.
   - Copy the production webhook URL.
   - Add it to Bannerbear via Settings > Webhooks > Create a Webhook.
   - Set the event type to "image_created".

## Customize the workflow

- Modify the template parameters to match your Bannerbear template fields.
- Add additional modification objects for more dynamic elements (colors, backgrounds, images).
- Connect to databases, CRMs, or other tools to pull content automatically.
- Chain multiple image generations for batch processing.
- Store generated images in Google Drive, S3, or your preferred storage.
- Use async mode for high-volume generation without blocking your workflow.
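Under the hood, the sync/async switch boils down to which base URL the image request targets. The sketch below shows the request shape the workflow builds; the two base URLs follow Bannerbear's documented pattern, but verify them (and your template's layer names, here assumed to be "title" and "subtitle") against your own Bannerbear project.

```javascript
// Build the HTTP request for a Bannerbear image generation call.
// The API key and template ID below are placeholders.
function buildBannerbearRequest({ apiKey, templateId, title, subtitle, callMode }) {
  const baseUrl =
    callMode === 'sync'
      ? 'https://sync.api.bannerbear.com/v2/images' // blocks until the render is done
      : 'https://api.bannerbear.com/v2/images';     // returns 202, image arrives via webhook
  return {
    url: baseUrl,
    method: 'POST',
    headers: {
      Authorization: `Bearer ${apiKey}`,
      'Content-Type': 'application/json',
    },
    body: {
      template: templateId,
      // One modification object per template layer you want to override.
      modifications: [
        { name: 'title', text: title },
        { name: 'subtitle', text: subtitle },
      ],
    },
  };
}

const req = buildBannerbearRequest({
  apiKey: 'bb_pr_xxx',        // placeholder key
  templateId: 'template_123', // placeholder template ID
  title: 'Hello',
  subtitle: 'World',
  callMode: 'sync',
});
console.log(req.url); // https://sync.api.bannerbear.com/v2/images
```

In the workflow, the n8n HTTP Request node plays the role of this function, with call_mode read from the "SetParameters" node.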
by Le Nguyen
## Description (How it works)

This workflow keeps your Zalo Official Account access token valid and easy to reuse across other flows, with no external server required.

High-level steps:

1. **Scheduled refresh** runs on an interval to renew the access token before it expires.
2. **Static Data cache (global)** stores access/refresh tokens plus expiries for reuse by any downstream node.
3. **OAuth exchange** calls Zalo OAuth v4 with your app_id and secret_key to get a fresh access token.
4. **Immediate output** returns the current access token to the next nodes after each refresh.
5. **Operational webhooks** include:
   - A reset webhook to clear the cache when rotating credentials or testing.
   - A token peek webhook to read the currently cached token for other services.

## Setup steps (estimated time: 8-15 minutes)

1. **Collect Zalo credentials (2-3 min)**: Obtain app_id, secret_key, and a valid refresh_token.
2. **Import & activate the workflow (1-2 min)**: Import the JSON into n8n and activate it.
3. **Wire inputs (2-3 min)**: Point the "Set Refresh Token and App ID" node to your env vars (or paste values for a quick test).
4. **Adjust the schedule & secure the webhooks (2-3 min)**: Tune the run interval to your token TTL; protect the reset/peek endpoints (e.g., with a secret param or IP allowlist).
5. **Test (1-2 min)**: Execute once to populate Static Data; optionally try the token peek and reset webhooks to confirm behavior.
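The cache-and-refresh logic above can be sketched in JavaScript. The expiry check mirrors what the workflow keeps in n8n Static Data; the endpoint and field names for the OAuth exchange follow Zalo's OAuth v4 pattern (secret_key sent as a header, form-encoded body), but treat them as an assumption and verify against Zalo's documentation for your app.

```javascript
// Decide whether a refresh is needed, with a 5-minute safety skew so
// downstream nodes never receive a token about to expire.
function needsRefresh(cache, now = Date.now(), skewMs = 5 * 60 * 1000) {
  return !cache.accessToken || now >= cache.expiresAt - skewMs;
}

// Shape of the Zalo OAuth v4 refresh call (assumed endpoint/fields; verify).
function buildRefreshRequest({ appId, secretKey, refreshToken }) {
  return {
    url: 'https://oauth.zaloapp.com/v4/oa/access_token',
    method: 'POST',
    headers: {
      secret_key: secretKey, // Zalo expects the app secret as a header
      'Content-Type': 'application/x-www-form-urlencoded',
    },
    form: {
      app_id: appId,
      refresh_token: refreshToken,
      grant_type: 'refresh_token',
    },
  };
}

const cache = { accessToken: 'abc', expiresAt: Date.now() + 60 * 60 * 1000 };
console.log(needsRefresh(cache)); // false: still valid for ~1 hour
```

In the workflow, the scheduled trigger calls the equivalent of `needsRefresh` against Static Data, and the HTTP Request node issues the refresh call and writes the new token and expiry back to the cache.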
by KlickTipp
Community Node Disclaimer: This workflow uses KlickTipp community nodes.

## How It Works

Automate transactional emails from KlickTipp via SMTP. This workflow receives contact data from a KlickTipp Outbound rule, generates a personalized HTML email, and sends it via any SMTP-compatible service (e.g., Gmail SMTP, Outlook, SendGrid, or your own mail server). Key fields (e.g., first name, company, website, phone, or other custom fields) can be dynamically mapped into the body. After sending, the workflow saves the email's HTML content and writes back an email delivery status ("Sent" or "Failed") to the contact in KlickTipp for clear visibility.

## Key Features

- **KlickTipp Outbound trigger**:
  - Starts when a KlickTipp Outbound rule calls the webhook (e.g., after a tag is applied).
  - Accepts a payload with the recipient email and optional custom fields (first name, company, website, phone, etc.).
  - Easy to adapt for confirmations, updates, welcomes, and announcements.
- **HTML email composer**:
  - Builds a clean, brandable HTML template with safe fallbacks.
  - Supports per-contact personalization via mapped fields.
- **SMTP delivery**:
  - Sends emails using the n8n Send Email node (SMTP).
  - Works with Gmail, Outlook, or any SMTP-compatible service.
  - Supports From/Reply-To, Subject, HTML body, CC/BCC, and attachments.
- **Delivery status write-back**:
  - On success: updates a KlickTipp custom field (e.g., Email delivery status = Sent).
  - On error: updates the same field to Failed (error details are available in the execution logs).

## Setup Instructions

1. Add/enable the KlickTipp community nodes and authenticate with valid API credentials.
2. Configure an SMTP credential in n8n:
   - Host (e.g., smtp.gmail.com)
   - Port (465 or 587)
   - Authentication (username, password, or app password)
   - SSL/TLS settings as required by your provider
3. Select this credential in the Send Email node.
4. Paste/import your HTML into the "Generate HTML template" node.
5. Activate the workflow.

## Workflow Logic

1. **Trigger from KlickTipp**: Outbound sends contact data to the workflow.
2. **Generate HTML**: Build personalized HTML (and an optional plain-text part).
3. **Send via SMTP**: Deliver the message with the Send Email node.
4. **On success**: Update the KlickTipp contact → Email delivery status: Sent.
5. **On error**: Update the KlickTipp contact → Email delivery status: Failed (see logs for details).

## Benefits

- **Immediate, personalized communication** without manual steps.
- **Consistent branding** with reusable HTML templates.
- **Clear observability** by writing back delivery status to KlickTipp.
- **Flexible & extensible** for many message types beyond payments.

## Testing and Deployment

1. Tag a test contact in KlickTipp to trigger the Outbound rule.
2. Verify the email arrives with correct personalization.
3. Confirm the Email delivery status field updates to Sent (or Failed for negative tests).
4. Review execution logs and adjust field mappings if necessary.

## Notes

- **Customization**: Swap templates, add CC/BCC, attachments, or a plain-text part for deliverability.
- **SMTP provider settings**: Refer to your email provider's SMTP configuration (e.g., Gmail, Outlook, or a custom server).
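The "personalized HTML with safe fallbacks" step can be sketched as a small merge function of the kind the template-generation Code node runs. The placeholder syntax and field names below are illustrative; map them to your own KlickTipp custom fields.

```javascript
// Merge contact fields into an HTML template, substituting {{field}}
// placeholders and falling back to defaults for anything KlickTipp
// didn't send, so the email never shows an empty "Hi ,".
function renderEmail(template, contact, fallbacks = {}) {
  return template.replace(/\{\{(\w+)\}\}/g, (_, key) => {
    const value = contact[key] ?? fallbacks[key] ?? '';
    return String(value);
  });
}

const template = '<p>Hi {{firstName}},</p><p>Thanks from {{company}}!</p>';
const html = renderEmail(
  template,
  { firstName: 'Ada' },          // fields from the Outbound payload
  { company: 'our team' }        // safe fallback for a missing field
);
console.log(html); // <p>Hi Ada,</p><p>Thanks from our team!</p>
```

The rendered string is what the Send Email node delivers as the HTML body, and what the write-back step stores on the contact alongside the Sent/Failed status.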
by Nabin Bhandari
# WhatsApp Voice Agent with Twilio, VAPI, Google Calendar, Gmail & Supabase

This workflow turns WhatsApp voice messages into an AI assistant using Twilio, VAPI, and modular MCP servers. It handles scheduling, email, and knowledge queries, all by voice.

## How it works

1. **WhatsApp → Twilio → VAPI**: A WhatsApp Business number (via a TwiML app) receives a voice message. Twilio streams the audio into VAPI for processing.
2. **VAPI → n8n webhook**: VAPI interprets the intent and routes the request to the correct MCP server.
3. **MCP servers in n8n**:
   - Calendar MCP: create, fetch, update, and delete Google Calendar events
   - Gmail MCP: send confirmation or reminder emails
   - Knowledge Base MCP: query a Supabase Vector Store with OpenAI embeddings
4. **n8n → VAPI → WhatsApp**: n8n executes the task and returns the result via VAPI back to the user.

## How to use

1. Import this workflow into your n8n instance.
2. Configure a Twilio WhatsApp-enabled number and connect it to a TwiML app.
3. Point the TwiML app to your VAPI project.
4. Add credentials for Google Calendar, Gmail, Supabase, and OpenAI in n8n.
5. Test by sending a WhatsApp voice command like:
   - "Book a meeting tomorrow at 3pm"
   - "Send a confirmation email to the client"
   - "What's included in the AI receptionist package?"

## Customisation ideas

- Add more MCP servers (e.g. CRM, Notion, Slack).
- Swap Supabase for another vector database.
- Extend Gmail flows with templates or multiple senders.
- Adjust the VAPI assistant's tone and role to fit your brand.

## Requirements

- Twilio WhatsApp-enabled number + TwiML app (verified in WhatsApp Manager)
- VAPI project (assistant configured)
- n8n instance (Cloud or self-hosted)
- Google Calendar & Gmail credentials
- Supabase project
- OpenAI API key

## Good to know

- Twilio must have a verified WhatsApp Business number.
- VAPI handles the voice infrastructure and intent routing; n8n only executes actions.
- The design is modular, so it is easy to expand with new MCP servers.
- It works best when tested with short, clear commands.

## Use cases

- Hands-free scheduling with Google Calendar.
- Voice-triggered email confirmations and reminders.
- Conversational knowledge base access.
- Extendable to CRMs, team chat, or business workflows.

With this setup, you get a scalable, voice-first AI agent on WhatsApp that connects seamlessly to your business systems.
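The intent-routing step in "How it works" can be sketched as a simple dispatch table. The intent and server names below are purely illustrative (the real routing is configured inside VAPI and the n8n MCP server nodes), but the pattern is the same: each recognized intent maps to one MCP server, with the knowledge base as the fallback.

```javascript
// Hypothetical intent names -> hypothetical MCP server identifiers.
function routeIntent(intent) {
  const routes = {
    schedule_event: 'calendar-mcp',   // Google Calendar MCP
    send_email: 'gmail-mcp',          // Gmail MCP
    knowledge_query: 'kb-mcp',        // Supabase vector store MCP
  };
  // Unknown intents fall back to the knowledge base, so every voice
  // command still gets some kind of answer.
  return routes[intent] ?? 'kb-mcp';
}

console.log(routeIntent('schedule_event')); // "calendar-mcp"
```

Adding a new MCP server (CRM, Notion, Slack) amounts to adding one more entry to this table plus the corresponding n8n branch.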
by Krishna Sharma
# Stripe → Pipedrive (Automatic Person + Deal Creation)

## Prerequisites / Requirements

- n8n instance (cloud or self-hosted) with an HTTPS-reachable webhook endpoint
- Stripe account and secret API key
- Pipedrive account with API access and custom fields defined
- Create the custom fields in Pipedrive and copy their field IDs (e.g. amount, payment method, status, Stripe Event ID)

## How it works (step by step)

1. **Stripe Webhook**: n8n triggers on payment events from Stripe via an HTTP POST.
2. **Edit / Set node**: Extracts eventId, eventType, and the payload.
3. **HTTP Request**: Confirms the Stripe event via the Stripe API (adds security).
4. **Search Person**: Looks up the Pipedrive contact by email from the Stripe payload.
5. **IF (person exists?)**: If the person exists, skip straight to the event check; if not, create a new Person first.
6. **IF (event ID already seen?)**: If the event ID matches one already processed, skip deal creation; if it is new, create a deal linked to that person with the payment amount.
7. **Create Person**: Adds a new Pipedrive contact and populates the custom fields (amount, payment method, status, source).
8. **Create Deal**: Creates a deal linked to that person with the payment amount.

## Description

This workflow connects Stripe to Pipedrive. Whenever a new payment (or checkout success) occurs in Stripe, this automation will:

- Fetch the full Stripe event data
- Find or create a customer ("Person") in Pipedrive
- Create a "Deal" record corresponding to that payment
- Log custom fields like amount, source, payment method, and status

It's ideal for SaaS subscription businesses, agencies, or teams who want payments mirrored inside their CRM without manual work.

## Modification Notes / Extensions

You can extend this workflow by:

- Adding Slack or Gmail notifications after deal creation.
- Logging every event in Google Sheets or Notion for audit.
- Enriching the customer using OpenAI / CRM data.
- Handling refunds, subscription cancellations, etc.
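The two security-relevant steps, confirming the event with Stripe and skipping duplicates, can be sketched as follows. Re-fetching the event by ID uses Stripe's documented GET /v1/events/{id} endpoint, so a forged webhook body cannot inject fake payments; the dedupe check mirrors the event-ID IF node.

```javascript
// Build the request that re-fetches the event from Stripe by its ID.
// The secret key below is a placeholder.
function buildVerifyRequest(eventId, stripeSecretKey) {
  return {
    url: `https://api.stripe.com/v1/events/${eventId}`,
    method: 'GET',
    headers: { Authorization: `Bearer ${stripeSecretKey}` },
  };
}

// Dedupe: only create a deal when this Stripe event ID has not been
// processed before (e.g. compared against the Stripe Event ID custom field).
function shouldCreateDeal(eventId, seenEventIds) {
  return !seenEventIds.includes(eventId);
}

console.log(shouldCreateDeal('evt_123', ['evt_100', 'evt_101'])); // true
```

Stripe retries webhook deliveries, so without the `shouldCreateDeal` guard a single payment could produce multiple deals in Pipedrive.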
by Max Mitcham
An intelligent automation workflow that monitors thought leader activity via social listening, tracks high-value prospects who engage with industry content, and systematically builds a qualified lead database through social intelligence gathering. Overview This workflow transforms passive social listening into proactive lead generation by identifying prospects who demonstrate genuine interest in industry topics through their engagement with thought leader content. It creates a continuous pipeline of warm prospects with enriched data for personalized outreach. π Workflow Process 1. Social Intelligence Webhook Real-time engagement monitoring Integrated with Trigify.io social listening platform Monitors thought leader posts and their engagers Captures detailed prospect and company enrichment data Processes LinkedIn engagement activities in real-time Includes enriched contact information (email, phone, LinkedIn URLs) 2. Data Processing & Extraction Structured data organization Post Data Extraction**: Isolates LinkedIn post URLs, content, and posting dates Prospect Data Extraction**: Captures first/last names, job titles, LinkedIn profiles, and locations Company Data Extraction**: Gathers company names, domains, sizes, industries, and LinkedIn pages Prepares data for duplicate detection and storage systems 3. Duplicate Detection System Data quality maintenance Queries existing Google Sheets database by post URL Identifies previously tracked thought leader content Filters out duplicate posts to maintain data quality Only processes genuinely new thought leader activities Maintains clean, unique post tracking records 4. New Content Validation Gate Quality control checkpoint Validates that post URLs are not empty (indicating new content) Prevents processing of duplicate or invalid data Ensures only fresh thought leader content triggers downstream actions Maintains database integrity and notification relevance 5. 
Thought Leader Post Tracking Systematic content monitoring Appends new thought leader posts to "Social Warming" Google Sheets Records post URLs, content text, and publication dates Creates searchable database of industry thought leadership content Enables trend analysis and content performance tracking 6. Real-Time Slack Notifications Immediate team alerts Sends formatted alerts to #comment-strategy channel Includes post content, publication date, and direct links Provides action buttons (View Post, Engage Now, Save for Later) Enables rapid response to thought leader activity Facilitates team coordination on engagement opportunities 7. ICP Qualification Filter Smart prospect identification Filters engagers by job title keywords (currently: "marketing") Customizable ICP criteria for targeted lead generation Focuses on high-value prospects matching ideal customer profiles Prevents database pollution with irrelevant contacts 8. Qualified Lead Database Systematic prospect capture Appends qualified engagers to "Engagers" Google Sheets Records comprehensive prospect and company data Includes contact enrichment (emails, phone numbers) Creates actionable lead database for sales outreach Maintains detailed company intelligence for personalization π οΈ Technology Stack n8n**: Workflow orchestration and webhook management Trigify.io**: Social listening and engagement monitoring platform Google Sheets**: Lead database and content tracking system Slack API**: Real-time team notifications and collaboration Data Enrichment**: Automated contact and company information gathering β¨ Key Features Real-time thought leader content monitoring Automated prospect discovery through social engagement ICP-based lead qualification and filtering Duplicate content detection and prevention Comprehensive prospect and company data enrichment Integrated CRM-ready lead database creation Team collaboration through Slack notifications Customizable qualification criteria for targeted lead generation π― 
Ideal Use Cases Perfect for sales and marketing teams seeking warm prospects: B2B Sales Teams** seeking warm prospects through social engagement Marketing Professionals** building targeted lead databases Business Development Teams** identifying engaged prospects Account-Based Marketing Campaigns** requiring social intelligence Sales Professionals** needing conversation starters with warm leads Companies** wanting to identify prospects already engaged with industry content Teams** requiring systematic lead qualification through social activity Organizations** seeking to leverage thought leadership for lead generation π Business Impact Transform social listening into strategic lead generation: Warm Lead Generation**: Identifies prospects already engaged with industry content Social Selling Intelligence**: Provides conversation starters through engagement history ICP Qualification**: Focuses efforts on prospects matching ideal customer profiles Relationship Building**: Enables outreach based on genuine interest demonstration Market Intelligence**: Tracks industry engagement patterns and trending content Sales Efficiency**: Prioritizes prospects who show active industry engagement Personalization Data**: Provides context for highly personalized outreach campaigns π‘ Strategic Advantage This workflow creates a fundamental shift from cold outreach to warm, contextual conversations. By identifying prospects who have already demonstrated interest in industry topics through their engagement behavior, sales teams can approach leads with genuine relevance and shared context. 
The system delivers:

- **Continuous Pipeline**: Automated flow of warm prospects showing industry engagement
- **Social Context**: Rich background data for meaningful, personalized conversations
- **Quality Focus**: ICP-filtered prospects matching ideal customer profiles
- **Engagement History**: Conversation starters based on actual prospect interests
- **Competitive Advantage**: Proactive lead identification before competitors

Rather than interrupting prospects with cold messages, this workflow enables sales teams to join conversations prospects are already having, dramatically increasing response rates and relationship-building success.
by Harsh Maniya
💬 Build Your Own WhatsApp Fact-Checking Bot with AI

Tired of misinformation spreading on WhatsApp? 🤨 This workflow transforms your n8n instance into a powerful, automated fact-checking bot! Send any news, claim, or question to a designated WhatsApp number, and this bot will use AI to research it, provide a verdict, and send back a summary with direct source links. Fight fake news with the power of automation and AI! 🚀

How it works ⚙️

This workflow uses a simple but powerful three-step process:

1. 💬 **WhatsApp Gateway (Webhook node)**: This is the front door. The workflow starts when the Webhook node receives an incoming message from a user via a Twilio WhatsApp number.
2. 🕵️ **The Digital Detective (Perplexity node)**: The user's message is sent to the Perplexity node. Here, a powerful AI model, instructed by a custom system prompt, analyzes the claim, scours the web for reliable information, and generates a verdict (e.g., ✅ Likely True, ❌ Likely False).
3. 📲 **WhatsApp Reply (Twilio node)**: The final, formatted response, complete with the verdict, a simple summary, and source citations, is sent back to the original user via the Twilio node.

Setup Guide 🛠️

Follow these steps carefully to get your fact-checking bot up and running.

Prerequisites

- A Twilio account with an active phone number or access to the WhatsApp Sandbox.
- A Perplexity AI account to get an API key.

1. Configure Credentials

You'll need to add API keys for both Perplexity and Twilio to your n8n instance.

Perplexity AI:
- Go to your Perplexity AI API Settings.
- Generate and copy your API key.
- In n8n, go to Credentials > New, search for "Perplexity," and add your key.

Twilio:
- Go to your Twilio Console Dashboard.
- Find and copy your Account SID and Auth Token.
- In n8n, go to Credentials > New, search for "Twilio," and add your credentials.

2. Set Up the Webhook and Tunnel

To allow Twilio's cloud service to communicate with your n8n instance, you need a public URL. The n8n tunnel is perfect for this.
Start the n8n Tunnel: If you are running n8n locally, you'll need to expose it to the web. Open your terminal and run:

n8n start --tunnel

Copy Your Webhook URL: Once the tunnel is active, open your n8n workflow. In the Receive Whatsapp Messages (Webhook) node, you will see two URLs: Test and Production. Copy the Production URL (the Test URL only works while you execute the workflow manually in the editor). This is the public URL that Twilio will use.

3. Configure Your Twilio WhatsApp Sandbox

- Go to the Twilio Console and navigate to Messaging > Try it out > Send a WhatsApp message.
- Select the Sandbox Settings tab.
- In the section "WHEN A MESSAGE COMES IN," paste your n8n Production Webhook URL.
- Make sure the method is set to HTTP POST.
- Click Save.

How to Use Your Bot 🚀

1. Activate the Sandbox: To start, you (and any other users) must send a WhatsApp message with the join code (e.g., join given-word) to your Twilio Sandbox number. Twilio provides this phrase on the same Sandbox page.
2. Fact-Check Away! Once joined, simply send any claim or question to the Twilio number. For example: Did Elon Musk discover a new planet? Within moments, the workflow will trigger, and you'll receive a formatted reply with the verdict and sources right in your chat!

Further Reading & Resources 📚

- n8n Tunnel Documentation
- Twilio for WhatsApp Quickstart
- Perplexity AI API Documentation
by Marker.io
Automatically create Zendesk tickets with full technical context when your team receives new Marker.io issues

🎯 What this template does

This workflow creates a seamless bridge between Marker.io and Zendesk, your customer support platform. Every issue submitted through Marker.io's widget automatically becomes a trackable ticket in Zendesk, complete with technical details and visual context. Centralizing customer issues in Zendesk helps your support agents continue the conversation right where they work every day.

When an issue is reported, the workflow:

- Creates or updates the reporter as a Zendesk user
- Opens a new ticket with all issue details
- Adds a comprehensive internal comment with technical metadata
- Preserves all screenshots, browser info, and custom data
- Automatically tags tickets for easy filtering

✨ Benefits

- **Zero manual entry**: All bug details transfer automatically
- **Instant visibility**: Support agents see issues immediately
- **Rich context**: Technical details preserved for developers
- **Better collaboration**: Single source of truth for bugs
- **Faster resolution**: No time wasted gathering information
- **Smart organization**: Auto-tagging for efficient triage

💡 Use Cases

- **Product Teams**: Streamline bug triage without switching tools
- **Support Teams**: Get technical context for customer-reported issues
- **Development Teams**: Access browser info, console logs, and network logs directly from support tickets

🔧 How it works

1. n8n Webhook receives Marker.io issue data
2. Format and extract relevant information from the payload
3. Create/update user in Zendesk with reporter details
4. Create ticket with the title and issue description
5. Add internal comment with screenshot, full technical context, and Marker.io links for the support agent

The result is a perfectly organized support ticket that your team can act on immediately, with all the context they need to reproduce and resolve the issue.
📋 Prerequisites

- **Marker.io account** with webhook capabilities
- **Zendesk account** with API access
- **Zendesk API token** with appropriate permissions

🚀 Setup Instructions

1. Import this workflow into your n8n instance
2. Configure the Webhook:
   - Copy the test/production webhook URL after saving
   - Add to Marker.io: Workspace Settings → Webhooks → Create webhook
   - Select "Issue Created" as the trigger event
3. Set up Zendesk credentials:
   - Generate an API token from Zendesk Admin Center → Apps and integrations → APIs → Zendesk API
   - Add credentials to all three HTTP Request nodes
   - Update your subdomain in the URLs (replace [REPLACE_SUBDOMAIN] with your subdomain)
4. Customize fields (optional):
   - Update the custom field ID in the "Create Ticket" node if you want to store the Marker ID
   - Modify tags to match your workflow
   - Adjust priority mapping if needed
5. Test the integration:
   - Create a test issue in Marker.io
   - Verify the ticket appears in Zendesk
   - Check that all data transfers correctly

📊 Data Captured

Customer-facing ticket includes:

- Issue title (as subject)
- Description (as ticket body)

Internal comment includes:

- 🆔 Marker ID
- 📌 Priority level and issue type
- 📅 Due date (if set)
- 🖥️ Browser and OS details
- 📤 Developer console & network logs
- 🌐 Website URL where the issue occurred
- 🔗 Direct link to the Marker.io issue
- 📦 Any custom data fields

→ Read more about Marker.io webhook events