by Jan Oberhauser
Simple API that queries the received country code via GraphQL and returns it. Example URL: https://n8n.example.com/webhook/1/webhook/webhook?code=DE The workflow: Receives the country code from an incoming HTTP request Reads data via GraphQL Converts the data to JSON Constructs the return string
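A minimal sketch of the two core steps. The helper names, the queried fields (`name`, `capital`), and the reply format are assumptions for illustration, not the template's exact code:

```javascript
// Sketch: build a GraphQL query for the received country code,
// then construct the return string from the query result.
// Field names and reply wording are assumed examples.
function buildCountryQuery(code) {
  return { query: `query { country(code: "${code}") { name capital } }` };
}

function buildReply(code, data) {
  return `The country with the code "${code}" is ${data.name}.`;
}
```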
by Jimleuk
Are you a popular tech startup accelerator (named after a particular higher-order function) overwhelmed with 1000s of pitch decks on a daily basis? Wish you could filter through them quickly using AI but the decks are unparseable through conventional means? Then you're in luck! This n8n template uses multimodal LLMs to parse and extract valuable data from even the most overly designed pitch decks quickly. Not only that, it'll also create the foundations of a RAG chatbot at the end so you or your colleagues can drill down into the details if needed. With this template, you'll scale your capacity to find interesting companies you'd otherwise miss! Requires n8n v1.62.1+ How It Works Airtable is used as the pitch deck database and PDF decks are downloaded from it. An AI Vision model is used to transcribe each page of the pitch deck into markdown. An Information Extractor is used to generate a report from the transcribed markdown and update the required information back into the pitch deck database. The transcribed markdown is also uploaded to a vector store to build an AI chatbot which can be used to ask questions on the pitch deck. Check out the sample Airtable here: https://airtable.com/appCkqc2jc3MoVqDO/shrS21vGqlnqzzNUc How To Use This template depends on the availability of the Airtable - make a duplicate of the Airtable (link) and its columns before running the workflow. When a new pitch deck is received, enter the company name into the Name column and upload the PDF into the File column. Leave all other columns blank. If you have the Airtable trigger active, the execution should start immediately once the file is uploaded. Otherwise, click the manual test trigger to start the workflow. When manually triggered, all "new" pitch decks will be handled by the workflow as separate executions.
Requirements OpenAI for LLM Airtable for Database and Interface Qdrant for Vector Store Customising This Workflow Extend this starter template by adding more AI agents to validate claims made in the pitch deck, e.g. LinkedIn profiles, page visits, reviews, etc.
by Colleen Brady
Who is this for? This workflow is built for anyone who works with YouTube content, whether you're: A learner looking to understand a video’s key points A content creator repurposing video material A YouTube manager looking to update titles, descriptions A social media strategist searching for the most shareable clips Don't just ask questions about what's said. Find out what's going on in a video too. Video Overview: https://www.youtube.com/watch?v=Ovg_KfKxnC8 What problem does this solve? YouTube videos hold valuable insights, but watching and processing them manually takes time. This workflow automates: Quick content extraction**: Summarize key ideas without watching full videos Visual analysis**: Understand what’s happening beyond spoken words Clip discovery**: Identify the best moments for social sharing How the workflow works This n8n-powered automation: Uses Google’s Gemini 1.5 Flash AI for intelligent video analysis Provides multiple content analysis templates tailored to different needs What makes this workflow powerful? The easiest place to start is by requesting a summary or transcript. From there, you can refine the prompts to match your specific use case and the type of video content you’re working with. But what's even more amazing? You can ask questions about what’s happening in the video — and get detailed insights about the people, objects, and scenes. It's jaw-dropping. This workflow is versatile — the actions adapt based on the values set. That means you can use a single workflow to: Extract transcripts Generate an extended YouTube description Write a summary blog post You can also modify the trigger based on how you want to run the workflow — use a webhook, connect it to an event in Airtable, or leave it as-is for on-demand use. The output can then be sent anywhere: Notion, Airtable, CMS platforms, or even just stored for reference. 
How to set it up Connect your Google API key Paste a YouTube video URL Select an analysis method Run the workflow and get structured results Analysis Templates Basic & Timestamped Transcripts**: Extract spoken content Summaries**: Get concise takeaways Visual Scene Analysis**: Detect objects, settings, and people Clip Finder**: Locate shareable moments Actionable Insights**: Extract practical information Customization Options Modify templates to fit your needs Connect with external platforms Adjust formatting preferences Advanced Configuration This workflow is designed for use with gemini-1.5-flash. In the future, you can update the flow to work with different models or even modify the HTTP request node to define which API endpoint should be used. It's also been designed so you can use this flow on its own or add it to a new or existing workflow. This workflow helps you get the most out of YouTube content — quickly and efficiently.
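For orientation, here is a hedged sketch of the kind of request body such an HTTP Request node might send to the Gemini API. The endpoint path, the `fileData`/`fileUri` field names, and direct YouTube URL support are assumptions to verify against Google's current Gemini API documentation; the helper name is made up for this example:

```javascript
// Hypothetical sketch of a generateContent request for gemini-1.5-flash.
// Verify the endpoint and schema against the current Gemini API docs.
function buildGeminiRequest(videoUrl, prompt) {
  return {
    method: 'POST',
    url: 'https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent',
    body: {
      contents: [{
        parts: [
          { fileData: { fileUri: videoUrl } }, // the YouTube video to analyze
          { text: prompt },                    // e.g. "Summarize this video"
        ],
      }],
    },
  };
}
```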
by SpaGreen Creative
Shopify Order Fulfillment & Send Tracking Link via WhatsApp Using Rapiwa API Who is this for? This n8n workflow automatically sends WhatsApp notifications to customers when their Shopify orders are fulfilled. It extracts order details, validates customer phone numbers for WhatsApp compatibility using the Rapiwa API, sends tracking information via WhatsApp, and logs all interactions in Google Sheets with appropriate verification status. What this Workflow Does This n8n workflow listens for new order fulfillments on Shopify and automatically sends a WhatsApp message with tracking details to customers. It uses the Rapiwa API to verify if the customer's number is on WhatsApp, formats all the data, sends a message, and logs everything to Google Sheets for tracking and auditing purposes. Key Features Webhook-Triggered**: Activates on new Shopify fulfillment events Phone Number Validation**: Uses Rapiwa to check WhatsApp compatibility Tracking Message Automation**: Sends real-time tracking messages via WhatsApp Data Cleaning**: Formats phone numbers and customer data Smart Branching**: Separates verified and unverified WhatsApp users Google Sheets Logging**: Stores data with status labels for all messages Rate-Limit Protection**: Wait node helps space API calls Dual Sheet Logging**: Maintains separate records for verified and unverified numbers Requirements Tools & Services An n8n instance (self-hosted or cloud) A Shopify store with REST API access enabled A Rapiwa.com account with: Valid Bearer Token Connected and verified WhatsApp number A Google Sheet with the following columns: Like this Sample Sheet Google Sheets OAuth2 credentials** set up in n8n Shopify API credentials** added to n8n Rapiwa Bearer Token** added as httpBearerAuth credentials How to Use Step-by-step Setup Connect Shopify to n8n Use the Shopify Trigger node Set event to fulfillments/create to capture new fulfillment events Extract Webhook Data Use a Code Node to format the webhook response Capture 
order, customer, and tracking details Fetch Complete Order Information Add an HTTP Request Node using the Shopify Admin API Include the order ID to retrieve customer phone, email, and product details Clean the Phone Number Use a Code Node to: Remove non-numeric characters Format number to international standard Combine customer first and last name Batch Process Orders Use the Split In Batches node to handle customers one-by-one Validate WhatsApp Number Use Rapiwa’s /verify-whatsapp endpoint with a Bearer Token Check if number exists on WhatsApp Conditional Branching Use an If Node: If data.exists === "true" → Verified path Else → Unverified path Send WhatsApp Message Send tracking info with a personalized message: Hi [Customer Name], Good news! Your order has just been fulfilled. Tracking Number: [Tracking Number] Track your package here: [Tracking URL] Thank you for shopping with us. -Team SpaGreen Creative Log Data to Google Sheets Log verified and unverified entries in separate sheets Include all relevant customer and tracking data Add Delay Between Messages Use the Wait Node to avoid rate limits on the Rapiwa API Requirements A Shopify store with API access enabled A Google Sheet with the required columns like this ➤ Sample Rapiwa API account**: Connected WhatsApp number Valid Bearer Token n8n** with: Shopify API credentials Rapiwa Bearer Token Google Sheets OAuth2 credentials Google Sheet Column Reference A Google Sheet formatted like this ➤ Sample

| customer_id | name | email | number | tracking_company | tracking_number | tracking_url | product_title | status |
|-------------|------|-------|--------|------------------|-----------------|--------------|---------------|--------|
| 8986XXXX06 | Abdul Mannan | contact@spagreen.net | 8801322827799 | Amazon Logistics | SG-OT-02 | https://traxxxG-OT-02 | S25 Ultra 5G Smartphone | verified |
| 883XXX7982 | Abdul Mannan | contact@spagreen.net | 8801322827799 | Amazon Logistics | SG-OT-N03 | https://traxxxGOT-N03 | Samsung Galaxy S24 Ultra | verified |

Workflow Logic Summary Shopify Webhook Trigger: On order fulfillment Extract Webhook Payload Fetch Order + Customer Details Clean and Format Phone Number Split into Single-Item Batches Check WhatsApp Validity via Rapiwa If Verified: Send WhatsApp Message Log to verified sheet If Not Verified: Skip message Log to unverified sheet Add Delay with Wait Node Repeat for Next Fulfillment Customization Ideas Modify the WhatsApp message to include delivery date or store contact Send different messages for different product categories Use product_type or shipping_zone to trigger separate workflows Add admin alerts for unverified numbers Store message delivery status (e.g., success, failed) Notes & Warnings Rapiwa is an unofficial WhatsApp API — delivery reliability is not guaranteed The Google Sheet column name must include the space at the end Wait node** may need a longer delay for high-volume stores Always format phone numbers in international format (e.g., 8801XXXXXXXXX) The Shopify API version used is 2025-07 — update as newer versions release You must comply with WhatsApp terms and data privacy laws when messaging users Useful Links Dashboard:** https://app.rapiwa.com Official Website:** https://rapiwa.com Documentation:** https://docs.rapiwa.com Support WhatsApp Support: Chat Now Discord: Join SpaGreen Community Facebook Group: SpaGreen Support Website: https://spagreen.net Developer Portfolio: Codecanyon SpaGreen
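The phone-cleaning Code node described in the setup steps could look roughly like this. The default '880' (Bangladesh) country prefix and the field names are assumed examples; adjust them to your store's market and webhook payload:

```javascript
// Sketch of the phone-cleaning Code node: strip non-numeric
// characters, normalize to international format, combine names.
// The '880' default prefix is an assumption, not the template's code.
function cleanCustomer(customer, countryPrefix = '880') {
  // Remove every non-numeric character (spaces, dashes, '+', parentheses)
  let number = String(customer.phone || '').replace(/\D/g, '');
  // Prepend the country code when the number is in local format
  if (!number.startsWith(countryPrefix)) {
    number = countryPrefix + number.replace(/^0+/, '');
  }
  return {
    name: `${customer.first_name} ${customer.last_name}`.trim(),
    number,
  };
}
```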
by Oriol Seguí
Web Consultation & Crawling Chatbot with Google Sheets Memory Who is this workflow for? This workflow is designed for SEO analysts, content creators, marketing agencies, and developers who need to index a website and then interact with its content as if it were a chatbot. ⚠ Note: if the site contains many pages, AI token consumption can generate high costs, especially during the initial crawling and analysis phase. 1. Initial Mode (first use with a URL) When the user enters a URL for the first time: URL validation using AI (gpt-5-nano). Automatic sitemap discovery via robots.txt. Relevant sitemap selection (pages, posts, categories, or tags) using GPT-4o according to configured options. (Includes “OPTIONS” node to precisely choose which types of URLs to process) Crawling of all selected pages: Downloads HTML of each page. Converts HTML to Markdown. AI analysis to extract: Detected language. Heading hierarchy (H1, H2, etc.). Internal and external links. Content summary. Structured storage in Google Sheets: Lang H1 and hierarchy External URLs Internal URLs Summary Content Data schema (flag to enable agent mode) When finished, the sheet is marked with Data schema = true, signaling that the site is indexed. 2. Agent Mode (subsequent queries) If the URL has already been indexed (Data schema = true): The chat becomes a LangChain Agent that: Reads the database in Google Sheets. Can perform real-time HTTP requests if it needs updated information. Responds as if it were the website, using stored and live data. This allows the user to ask questions such as: "What’s on the contact page?" "How many external links are there on the homepage?" "Give me all the H1 headings from the services pages" "What CTA would you suggest for my page?" "How would you expand X content?" Use cases Build a chatbot that answers questions about a website’s content. Index and analyze full websites for future queries. SEO tool to list headings, links, and content summaries. 
Assistant for quick exploration of a site’s structure. Generate improvement recommendations and content strategies from site data.
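The workflow delegates heading extraction to the AI, but as an illustration of the kind of data stored in the "H1 and hierarchy" column, pulling the heading hierarchy out of the converted Markdown can also be sketched deterministically:

```javascript
// Sketch: extract the heading hierarchy (H1, H2, ...) from the
// Markdown produced by the HTML-to-Markdown step. Illustrative only;
// the workflow itself does this via an AI analysis step.
function extractHeadings(markdown) {
  const headings = [];
  for (const line of markdown.split('\n')) {
    const m = line.match(/^(#{1,6})\s+(.*)$/);
    if (m) headings.push({ level: m[1].length, text: m[2].trim() });
  }
  return headings;
}
```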
by Shelly-Ann Davy
AI Contact Enrichment 📋 Template Description Overview Automatically enhance and enrich contact data using AI to fill in missing information, generate insights, and create detailed buyer personas. Supports multiple AI providers (OpenAI, Anthropic, etc.) with automatic logging to Supabase. Description This workflow transforms incomplete contact records into rich, actionable profiles. By leveraging AI, it can infer job roles, company information, likely pain points, communication preferences, and buying motivations from minimal input data. Perfect for sales and marketing teams looking to improve data quality and personalize outreach. Key Benefits: Smart Data Completion**: Fill in missing contact fields using AI inference Buyer Persona Generation**: Create detailed profiles from basic information Universal AI Support**: Works with OpenAI, Anthropic Claude, or custom providers CRM Enhancement**: Automatically enrich contacts as they enter your system Lead Qualification**: Assess lead quality and fit based on enriched data Personalization Engine**: Generate insights for tailored outreach Data Quality**: Maintain clean, complete contact records Use Cases: Sales prospecting and lead enrichment Marketing persona development CRM data cleansing and completion Account-based marketing (ABM) research Lead scoring and qualification Personalized email campaign preparation Contact segmentation and targeting ⚙️ Setup Instructions Prerequisites n8n instance (cloud or self-hosted) AI Provider account (OpenAI, Anthropic, or custom) Supabase account with database access Step 1: Configure Environment Variables Add these to your n8n environment settings: AI_PROVIDER=openai # or 'anthropic', 'custom' AI_API_KEY=your_api_key_here AI_MODEL=gpt-3.5-turbo # or 'gpt-4', 'claude-3-sonnet-20240229' AI_ENDPOINT= # Only for custom providers Recommended Models: Cost-effective**: gpt-3.5-turbo (fast, affordable, good for basic enrichment) High-quality**: gpt-4 or claude-3-sonnet-20240229 (better 
inference, deeper insights) Premium**: claude-3-opus-20240229 (best for complex persona generation) How to set environment variables: n8n Cloud**: Go to Settings → Environment Variables Self-hosted**: Add to your .env file or docker-compose configuration Step 2: Set Up Supabase Database Create the logging table in your Supabase database: CREATE TABLE workflow_logs ( id BIGSERIAL PRIMARY KEY, workflow_name TEXT NOT NULL, data JSONB NOT NULL, ai_response JSONB NOT NULL, created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW() ); CREATE INDEX idx_workflow_logs_created_at ON workflow_logs(created_at); CREATE INDEX idx_workflow_logs_workflow_name ON workflow_logs(workflow_name); -- Optional: Create a view for enriched contacts CREATE VIEW enriched_contacts AS SELECT id, data->>'email' as email, data->>'name' as name, data->>'company' as company, ai_response as enrichment_data, created_at FROM workflow_logs WHERE workflow_name = 'AI Contact Enrichment' ORDER BY created_at DESC; To run this SQL: Open your Supabase project dashboard Go to the SQL Editor Paste the SQL above and click "Run" Step 3: Configure Supabase Credentials in n8n Go to Settings → Credentials Click Add Credential → Supabase API Enter your Supabase URL and API key (found in Project Settings → API) Name it Supabase API Click Save Step 4: Activate the Webhook Import this workflow into n8n Click the Activate toggle in the top-right corner Click on the "Webhook Trigger" node Copy the Production URL (this is your webhook endpoint) Save this URL for integration with your applications Step 5: Test the Workflow Send a test POST request to the webhook: curl -X POST https://your-n8n-instance.com/webhook/contact-enrichment \ -H "Content-Type: application/json" \ -d '{ "email": "john.doe@acmecorp.com", "name": "John Doe", "company": "Acme Corporation", "linkedin_url": "https://linkedin.com/in/johndoe" }' Successful Response: { "success": true, "workflow": "AI Contact Enrichment", "timestamp": "2025-01-14T12:00:00.000Z" } 📥 
Expected Payload Format The webhook accepts JSON with basic contact information: Minimal Input { "email": "string (required or name required)", "name": "string (required or email required)" } Recommended Input { "email": "string", "name": "string", "company": "string", "job_title": "string", "linkedin_url": "string", "phone": "string", "location": "string", "website": "string" } Complete Input Example { "email": "sarah.chen@techstartup.io", "name": "Sarah Chen", "company": "TechStartup Inc.", "job_title": "VP of Marketing", "linkedin_url": "https://linkedin.com/in/sarahchen", "phone": "+1-555-0123", "location": "San Francisco, CA", "website": "https://techstartup.io", "industry": "B2B SaaS", "company_size": "50-200 employees", "notes": "Met at SaaS conference 2024" } Field Guidelines: At minimum, provide either email or name More input fields = better AI enrichment quality Include linkedin_url for best results company helps with firmographic enrichment Any additional context improves accuracy 🔄 Workflow Flow Webhook Trigger: Receives basic contact information from your application, form, or CRM Process Data: Adds unique ID and timestamp to the incoming data Prepare AI Request: Configures AI provider settings from environment variables Call AI API: Sends contact data to AI with enrichment prompt Save to Supabase: Archives original data and enrichment results Format Response: Returns success confirmation 🎯 Customization Tips Enhance AI Prompts for Better Enrichment Modify the "Prepare AI Request" node to customize enrichment: // Enhanced prompt for contact enrichment const systemPrompt = `You are an expert sales intelligence analyst. 
Analyze the provided contact information and generate a comprehensive enrichment including: INFERRED DETAILS: Fill in missing information based on available data Full job title and seniority level Department and reporting structure Years of experience (estimated) Professional background COMPANY INSIGHTS: If company name provided Industry and sub-industry Company size and revenue (estimated) Key products/services Recent news or developments BUYER PERSONA: Create a detailed profile Primary responsibilities Likely pain points and challenges Key priorities and goals Decision-making authority Budget influence level ENGAGEMENT STRATEGY: Provide outreach recommendations Best communication channels Optimal outreach timing Key talking points Personalization suggestions Content interests LEAD SCORE: Rate 1-10 based on: Fit for product/service (specify your ICP) Seniority and decision power Company size and maturity Engagement potential Return as structured JSON with clear sections.`; const userMessage = `Contact Information:\n${JSON.stringify($json.data, null, 2)}`; const aiConfig = { provider: $env.AI_PROVIDER || 'openai', apiKey: $env.AI_API_KEY, model: $env.AI_MODEL || 'gpt-3.5-turbo', endpoint: $env.AI_ENDPOINT, messages: [ { role: 'system', content: systemPrompt }, { role: 'user', content: userMessage } ] }; return { json: { aiConfig, data: $json } }; Add External Data Sources Enhance enrichment with third-party APIs: After "Process Data" node, add: Clearbit/Hunter.io Node: Get verified company data LinkedIn API: Pull professional information Company Database: Query internal customer data Web Scraping: Extract data from company websites Then merge all data before AI enrichment for best results Connect to Your CRM Auto-update contacts after enrichment: Salesforce Integration: // Add after "Call AI API" node // Update Salesforce contact with enriched data const enrichedData = JSON.parse($json.ai_response); return { json: { contactId: $json.data.salesforce_id, updates: {
Description: enrichedData.buyer_persona, Custom_Score__c: enrichedData.lead_score, Pain_Points__c: enrichedData.pain_points } } }; HubSpot Integration: Add HubSpot node to update contact properties Map enriched fields to custom HubSpot properties Pipedrive Integration: Use Pipedrive node to update person records Add custom fields for AI insights Implement Lead Scoring Add scoring logic after enrichment: // Calculate lead score based on enrichment const enrichment = JSON.parse($json.ai_response); let score = 0; // Job title scoring if (enrichment.seniority === 'C-Level') score += 30; else if (enrichment.seniority === 'VP/Director') score += 20; else if (enrichment.seniority === 'Manager') score += 10; // Company size scoring if (enrichment.company_size === 'Enterprise') score += 25; else if (enrichment.company_size === 'Mid-Market') score += 15; // Decision authority scoring if (enrichment.decision_authority === 'High') score += 25; else if (enrichment.decision_authority === 'Medium') score += 15; // Budget influence if (enrichment.budget_influence === 'Direct') score += 20; return { json: { ...enrichment, lead_score: score } }; Add Compliance Checks Insert before AI processing: // Check for opt-out or compliance flags const email = $json.email.toLowerCase(); // Check against suppression list const suppressedDomains = ['competitor.com', 'spam.com']; const domain = email.split('@')[1]; if (suppressedDomains.includes(domain)) { throw new Error('Contact on suppression list'); } // Verify email format const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/; if (!emailRegex.test(email)) { throw new Error('Invalid email format'); } return { json: $json }; Batch Enrichment Process multiple contacts: Add Spreadsheet File trigger instead of webhook Add Split In Batches node (process 10-20 at a time) Run enrichment for each contact Combine results and export to CSV 🛠️ Troubleshooting Common Issues Issue: "Enrichment is too generic" Solution**: Provide more input data (company, job title, LinkedIn) Use
GPT-4 or Claude models for better inference Enhance the system prompt with specific instructions Issue: "AI_API_KEY is undefined" Solution**: Ensure environment variables are set correctly Verify variable names match exactly (case-sensitive) Issue: "Enrichment contradicts actual data" Solution**: AI makes inferences - always validate critical information Add validation step to check enriched data against known facts Use external APIs for verification Issue: "Too slow for real-time use" Solution**: Implement queue system for async processing Use faster models (gpt-3.5-turbo) for speed Process in batches during off-peak hours Issue: "Supabase credentials not found" Solution**: Check credential name matches exactly: "Supabase API" Verify Supabase URL and API key are correct Debugging Tips Test with known contacts first to validate accuracy Compare AI enrichment against actual data Check execution logs for API errors Start with minimal prompt, then enhance gradually Use "Execute Node" to test individual steps 📊 Analyzing Enriched Data Query and analyze your enriched contacts: -- Get all enriched contacts SELECT * FROM enriched_contacts ORDER BY created_at DESC; -- Find high-value leads (assuming scoring implemented) SELECT email, name, company, ai_response->>'lead_score' as score FROM enriched_contacts WHERE (ai_response->>'lead_score')::int > 70 ORDER BY (ai_response->>'lead_score')::int DESC; -- Analyze enrichment by company SELECT data->>'company' as company, COUNT(*) as contact_count, AVG((ai_response->>'lead_score')::int) as avg_score FROM workflow_logs WHERE workflow_name = 'AI Contact Enrichment' AND ai_response->>'lead_score' IS NOT NULL GROUP BY data->>'company' ORDER BY contact_count DESC; -- Find contacts needing follow-up SELECT email, name, ai_response->>'engagement_strategy' as strategy, created_at FROM enriched_contacts WHERE created_at > NOW() - INTERVAL '7 days' ORDER BY created_at DESC; Export Enriched Data -- Export to CSV COPY ( SELECT 
data->>'email' as email, data->>'name' as name, data->>'company' as company, ai_response->>'job_title' as enriched_title, ai_response->>'seniority' as seniority, ai_response->>'lead_score' as score FROM workflow_logs WHERE workflow_name = 'AI Contact Enrichment' ) TO '/tmp/enriched_contacts.csv' WITH CSV HEADER; 📈 Integration Ideas Form Integration Automatically enrich new leads from forms: Typeform**: Trigger on form submission Google Forms**: Use Google Sheets trigger Calendly**: Enrich after meeting booking Webflow Forms**: Webhook trigger from form CRM Integration Real-time enrichment as contacts enter CRM: Salesforce**: Trigger on new lead/contact creation HubSpot**: Enrich on form submission or import Pipedrive**: Auto-enrich new persons Close**: Webhook on lead creation Email Tools Enhance cold outreach campaigns: Instantly.ai**: Enrich before campaign launch Lemlist**: Generate personalization variables Apollo.io**: Supplement with AI insights Mailshake**: Enrich prospect lists Marketing Automation Power ABM and segmentation: Marketo**: Enrich leads for scoring Pardot**: Enhance prospect profiles ActiveCampaign**: Personalization data Klaviyo**: E-commerce customer insights Slack Integration Team notifications and collaboration: Send enrichment summaries to sales channel Notify reps of high-value leads Share persona insights with marketing Alert on key account contacts 🔒 Security & Compliance Best Practices Data Protection Encrypt Sensitive Data: Use environment variables for all credentials Access Control: Limit webhook access with authentication Data Retention: Set automatic deletion policies in Supabase Audit Logging: Track all enrichment activities Privacy Compliance GDPR Compliance: Get consent before enriching personal data Allow contacts to request data deletion Document legal basis for processing CCPA Compliance: Honor do-not-sell requests Data Minimization: Only enrich necessary fields Right to Access: Allow contacts to view enriched data AI Ethics 
Bias Awareness: Review AI inferences for bias Accuracy Validation: Verify critical information Transparency: Disclose use of AI enrichment Human Oversight: Review before critical decisions 💡 Best Practices Input Data Quality Always include email or full name** as anchor point Add LinkedIn URLs** for 50% better accuracy Provide company name** for firmographic insights Include any known details** - more data = better results Prompt Engineering Be specific** about your ideal customer profile (ICP) Request structured output** (JSON format) Define scoring criteria** that match your business Ask for actionable insights** not just descriptions Post-Enrichment Workflow Always validate** critical information before use Review AI inferences** for accuracy and bias Update CRM promptly** to maintain data freshness Track enrichment ROI** (conversion rates, time saved) Performance Optimization Batch process** during off-peak hours Use appropriate models** (gpt-3.5 for speed, gpt-4 for quality) Cache common enrichments** to reduce API costs Set rate limits** to avoid API throttling 🏷️ Tags sales-automation, lead-enrichment, ai-automation, crm-integration, data-enrichment, contact-intelligence, buyer-personas, lead-scoring, webhook, supabase, openai, anthropic, b2b-sales 📝 License This workflow template is provided as-is for use with n8n. 
🤝 Support For questions or issues: n8n Community Forum: https://community.n8n.io n8n Documentation: https://docs.n8n.io 🌟 Example Output Input: { "email": "mike.johnson@cloudtech.com", "name": "Mike Johnson", "company": "CloudTech Solutions", "job_title": "Director of IT" } AI-Generated Enrichment: { "full_title": "Director of Information Technology", "seniority": "Director", "department": "Technology/IT", "experience_years": "10-15", "company_insights": { "industry": "Cloud Computing", "size": "Mid-Market (100-500)", "revenue_estimate": "$10M-$50M" }, "buyer_persona": { "responsibilities": ["Infrastructure management", "Vendor selection", "Security oversight"], "pain_points": ["Legacy system migration", "Cost optimization", "Security compliance"], "priorities": ["Scalability", "Cost reduction", "Team efficiency"] }, "engagement_strategy": { "best_channels": ["Email", "LinkedIn"], "timing": "Tuesday-Thursday, 9-11 AM", "talking_points": ["ROI and cost savings", "Security features", "Ease of implementation"], "personalization": "Reference cloud migration challenges" }, "lead_score": 75 } 🔄 Version History v1.0.0** (2025-01-14): Initial release with universal AI provider support
by Will Stenzel
Creates a new team for a project from webhook form data. When the project is created, the current semester is added to its relation attribute. More info can be found on using this workflow as part of a larger system here.
by Max Tkacz
This workflow shows how to sum multiple items of data, like you would in Excel or Airtable when summing up the total of a column. It uses a Function node with some JavaScript to perform the aggregation of numeric data. The first node is simply mock data, to avoid needing a credential to run the workflow. The second node performs the summation; the JavaScript is commented in case you need to edit it. Below is an example of the type of data this workflow can sum: anything in tabular form (Airtable, GSheets, Postgres, etc.).
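A minimal sketch of what the summation Function node's JavaScript might look like. The field name `amount` is an assumed example; adjust it to your column:

```javascript
// Sketch of the summation Function node: add up a numeric field
// across all incoming items. 'amount' is an assumed field name.
function sumItems(items, field = 'amount') {
  const total = items.reduce(
    (sum, item) => sum + Number(item.json[field] || 0),
    0
  );
  // n8n Function nodes return an array of items
  return [{ json: { total } }];
}
```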
by Incrementors
**Description**

A natural conversational AI chatbot that collects lead information (Name, Phone, Email, Message) one question at a time without feeling like a form. It uses session-based memory to track conversations, intelligently asks only for missing details, and saves complete leads to Google Sheets automatically.

**What this workflow does**

This workflow creates a human-like booking assistant that gathers lead information through natural conversation instead of traditional forms. The AI chatbot asks ONE question at a time, remembers previous answers using session memory, never repeats questions, and only saves data to Google Sheets when all four required fields (Name, Phone Number, Email Address, User Message) are confidently collected. The conversation feels natural and friendly: users engage with the bot as if chatting with a real person, dramatically improving completion rates compared to static forms. Perfect for booking systems, consultation requests, event registrations, customer support intake, or any scenario where you need to collect contact information without friction.

**Key features**

- **One question at a time:** The AI never overwhelms users with multiple questions. It asks for Name, then Phone, then Email, then Message, sequentially and naturally, based on what is still missing from the conversation.
- **Session-based memory:** Uses timestamp-based session tracking so the AI remembers the entire conversation context. If a user says "My name is John" in message 1, the AI won't ask for the name again in message 5.
- **Smart field detection:** The AI automatically detects which details have been collected and which are still missing, adapting the conversation flow dynamically instead of following a rigid script.
- **Natural language processing:** Handles variations in user input ("John Doe", "I'm John", "Call me John") and validates data intelligently before saving.
- **Complete data guarantee:** Only writes to Google Sheets when all 4 required fields are present. No partial or incomplete leads clutter your tracking sheet.
- **Webhook-based integration:** Works with any website, app, or platform that can send HTTP requests. Integrate with chatbots, contact forms, booking widgets, or custom applications.
- **Instant responses:** Real-time conversation with sub-second response times. Users get immediate replies, maintaining engagement throughout the lead collection process.

**How it works**

1. **User initiates conversation via webhook.** A user sends a message through your website chat widget, contact form, or booking interface. This triggers a webhook that passes the message along with query parameters (name, email, phone, message, timestamp, source) to n8n.
2. **AI Agent analyzes conversation state.** The Conversational Lead Collection Agent receives the user's message and checks the current state: which fields are already collected (from previous messages in this session), which fields are still missing, and what should be asked next. The AI uses the system prompt to understand its role as a booking assistant for "Spark Writers' Retreat" and follows strict conversation rules.
3. **Session memory tracks context.** The Buffer Window Memory node uses the timestamp from the webhook as a unique session ID. This allows the AI to remember all previous messages in the conversation, access previously collected information (name, phone, email), never ask the same question twice, and maintain conversation continuity even if the user takes breaks.
4. **One question at a time.** Based on what is missing, the AI asks exactly ONE question in natural, friendly language:
   - If Name is missing → "Hi! What's your name?"
   - If Phone is missing → "Great! And what's your phone number?"
   - If Email is missing → "Perfect! Could you share your email address?"
   - If Message is missing → "Thanks! How can I help you today?"

   The AI adapts its language to the previous conversation flow, so it doesn't sound robotic or repetitive.
5. **Data validation and collection.** As the user responds, the AI validates input (checks whether the phone number looks valid, the email has an @ symbol, etc.), extracts the information from natural language responses, stores it temporarily in session memory, and keeps asking until all 4 fields are complete. If the user provides unclear input, the AI politely asks again: "I didn't quite catch that. Could you share your phone number?"
6. **Save to Google Sheets (when complete).** Critical rule: the AI only uses the Google Sheets tool AFTER all four details are confidently collected. This prevents partial or incomplete leads from cluttering your database. When all fields are present, the AI writes exactly ONE row to Google Sheets, maps the data (Name → Name, Phone → Phone No., Email → Email, Message → Message), uses Timestamp as the unique identifier (matching column), and updates existing rows if the same timestamp appears again (preventing duplicates).
7. **Confirmation message.** After successfully saving, the AI sends a polite thank you: "Thank you! 🙏 We've received your details and our team will get back to you shortly." The AI never mentions Google Sheets, tools, backend systems, or automation; it maintains the illusion of human conversation.
8. **Response delivery.** The final AI response is sent back to the user via the webhook response. Your website or app displays this message in the chat interface, completing the conversation loop.

**Setup requirements**

Tools you'll need:

- Active n8n instance (self-hosted or n8n Cloud)
- Google Sheets with OAuth access for lead storage
- OpenAI API key (GPT-4.1-mini access)
- Website or app with a chat interface (or any platform that can send webhooks)

Estimated setup time: 15–20 minutes

**Configuration steps**

1. **Connect Google Sheets**
   - In n8n: Credentials → Add credential → Google Sheets OAuth2 API
   - Complete OAuth authentication
   - Create a Google Sheet for lead tracking with these columns: Timestamp (unique session identifier), Name, Phone No., Email, Message
   - Open the "Save Lead to Google Sheets" node
   - Select your Google Sheet and the correct sheet tab
   - Verify that the column mapping matches your sheet structure
2. **Add OpenAI API credentials**
   - Get an API key: https://platform.openai.com/api-keys
   - In n8n: Credentials → Add credential → OpenAI API
   - Paste your API key
   - Open the "OpenAI GPT-4.1 Mini Language Model" node
   - Select your OpenAI credential
   - Ensure the model is set to gpt-4.1-mini
3. **Copy the webhook URL**
   - Open the "Receive User Message via Webhook" node
   - Copy the Webhook URL (format: https://your-n8n.cloud/webhook/[webhook-id])
   - This is the endpoint your website or app will send messages to
4. **Integrate with your chat interface**

   You need to send HTTP POST/GET requests to the webhook URL with these query parameters:

   GET https://your-n8n.cloud/webhook/[id]?name=[name]&email=[email]&phone=[phone]&message=[user_message]&timestamp=[unique_timestamp]&source=[source]

   Query parameter details:
   - name: User's name (empty string if not yet collected)
   - email: User's email (empty string if not yet collected)
   - phone: User's phone number (empty string if not yet collected)
   - message: Current user message (required)
   - timestamp: Unique session ID (use an ISO timestamp or UUID)
   - source: Source identifier (e.g., "website_chat", "booking_form")

   Example integration (JavaScript):

   ```javascript
   const sessionId = new Date().toISOString();
   const userMessage = "Hi, I want to book a retreat";

   fetch(`https://your-n8n.cloud/webhook/[id]?message=${encodeURIComponent(userMessage)}&timestamp=${sessionId}&name=&email=&phone=&source=website_chat`)
     .then(res => res.json())
     .then(data => {
       // Display the AI response in your chat UI
       console.log(data.output);
     });
   ```
5. **Customize the AI assistant**

   Open the "Conversational Lead Collection Agent" node and edit the system message to:
   - Change the business name (currently "Spark Writers' Retreat")
   - Modify the conversation tone (formal vs. casual)
   - Adjust the fields being collected
   - Change the final thank-you message
6. **Test the workflow**
   - Activate the workflow (toggle to Active at the top)
   - Send a test message to the webhook URL
   - Verify the AI responds appropriately
   - Continue the conversation by sending follow-up messages with the same timestamp
   - Check that the AI asks for missing fields only, session memory persists across messages, the lead saves to Google Sheets when all 4 fields are collected, and the thank-you message appears after saving

**Use cases**

- **Booking and reservations:** Hotels, retreat centers, event venues, or appointment-based businesses collect guest details conversationally instead of through long booking forms. Higher completion rates mean more confirmed bookings.
- **Lead generation for services:** Agencies, consultants, coaches, or freelancers capture qualified leads through natural conversation. Users are more likely to complete the process when it feels like chatting instead of form-filling.
- **Customer support intake:** Support teams collect issue details, contact information, and problem descriptions through chat before routing to the right agent. All data is automatically logged in Google Sheets for ticketing.
- **Event registration:** Conference organizers, workshop hosts, or webinar providers gather attendee information without friction. The conversational approach encourages sign-ups even from mobile users who hate forms.
- **Sales qualification:** Sales teams use the chatbot to qualify leads by collecting basic information and understanding requirements before human handoff. Complete context is stored in Google Sheets for CRM integration.
- **Consultation requests:** Professional services (legal, medical, financial) collect client details and initial consultation requests through friendly conversation, reducing no-show rates by building rapport early.
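The testing step relies on sending the SAME timestamp with every follow-up message so that session memory persists. One way to satisfy that from a browser is to generate the session ID once and reuse it; a minimal sketch (the storage object is injected here so it can be backed by `sessionStorage` in a real page; the `lead_session_id` key name is illustrative, not part of the template):

```javascript
// Generate the session timestamp once, then reuse it for every message
// in the same conversation. Pass window.sessionStorage in a browser.
function getSessionId(storage) {
  let id = storage.getItem("lead_session_id");
  if (!id) {
    id = new Date().toISOString(); // ISO timestamp, as the template suggests
    storage.setItem("lead_session_id", id);
  }
  return id;
}
```

Calling `getSessionId(window.sessionStorage)` before each `fetch` guarantees all messages in one tab share one session ID, which is exactly what the memory node keys on.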
**Customization options**

**Change collected fields.** Open the "Conversational Lead Collection Agent" node and modify the system message:
- Add new fields (e.g., Company Name, Budget, Preferred Date)
- Remove optional fields (e.g., make Message optional)
- Update the field names and data mapping

Then update the Google Sheets node to include the new columns.

**Adjust conversation tone.** In the system message, change the conversation style:
- **Formal:** "May I please have your full name?"
- **Casual:** "What's your name?"
- **Friendly:** "Hey! What should I call you?"

**Add validation rules.** Enhance the system prompt with specific validation:
- Phone format (e.g., 10 digits, US format)
- Email domain restrictions (e.g., only business emails)
- Name length requirements
- Message minimum word count

**Connect to CRM or email.** After the "Save Lead to Google Sheets" node, add:
- **HTTP Request node** to send data to your CRM API
- **Email node** to notify the sales team of new leads
- **Slack/Discord node** for real-time team alerts
- **Webhook node** to trigger other workflows

**Multi-language support.** Modify the system prompt to respond in the user's language:
- Add language-detection logic
- Translate questions and responses
- Update the thank-you message for each language

**Add conversation analytics.** Insert a Set node before saving to track:
- Number of messages per lead
- Time to completion
- Drop-off points
- Source performance

**Troubleshooting**

**AI repeats questions already answered**
- **Memory not persisting:** Verify the "Session Memory with Timestamp" node is using the correct timestamp from the webhook query params.
- **Timestamp changing:** Ensure your chat interface sends the SAME timestamp for all messages in one conversation. Generate it once and reuse it.
- **Memory window size:** Increase the buffer window size in the memory node if conversations are very long.

**Leads not saving to Google Sheets**
- **Partial data:** The AI only saves when all 4 fields are collected. Check that your test conversation actually provided all required information.
- **OAuth expired:** Re-authenticate the Google Sheets credentials.
- **Sheet permissions:** Verify the connected Google account has edit access to the sheet.
- **Column names mismatch:** Ensure the sheet column names exactly match the mapping in the Google Sheets node (case-sensitive).

**AI saves incomplete data**
- **System prompt not followed:** Review the "Tool usage (VERY IMPORTANT)" section in the system message. Ensure it clearly states to only use Google Sheets after all fields are collected.
- **Validation too lenient:** The AI might be guessing missing fields. Strengthen the validation rules in the system prompt.

**Webhook not receiving messages**
- **URL incorrect:** Double-check that the webhook URL in your integration code matches the n8n webhook URL exactly.
- **CORS issues:** If calling from a browser, ensure n8n allows cross-origin requests, or use a server-side integration.
- **Query params missing:** Verify all required parameters (message, timestamp) are included in the request.

**AI responses too slow**
- **OpenAI API latency:** GPT-4.1-mini typically responds in 1–3 seconds. If slower, check the OpenAI API status.
- **Network delays:** Verify the n8n instance has good connectivity.
- **Memory lookup slow:** Reduce the buffer window size if storing hundreds of messages.

**Session memory not working**
- **Timestamp format inconsistent:** Use ISO format (e.g., 2026-01-28T14:38:23.720Z) and ensure it is identical across messages.
- **Memory node misconfigured:** Check that the session key expression in the "Session Memory with Timestamp" node references the correct webhook query param.

**Resources**

- n8n documentation
- OpenAI GPT-4 API
- Google Sheets API
- n8n Webhook node
- n8n AI Agent
- Buffer Window Memory

**Support**

Need help or custom development?
📧 Email: info@incrementors.com
🌐 Website: https://www.incrementors.com/
by Shahrukh
**AI-Powered Local Lead Generation Workflow with n8n**

This workflow solves one of the biggest pain points for freelancers, agencies, and SaaS founders: finding accurate local business leads at scale without manual copy-pasting or unreliable scraping tools. Traditional lead generation is time-consuming and prone to errors. This template automates the entire process so you can focus on outreach, not data gathering.

✅ What the Workflow Does
- Accepts a business type (e.g., plumbers) and city (e.g., Los Angeles) as input
- Uses AI to generate hyperlocal search terms for full neighborhood coverage
- Scrapes Google Maps results to extract business details and websites
- Filters out junk, Google-owned links, and duplicates
- Scrapes the homepage HTML for each business and extracts valid emails using regex
- Outputs a clean, deduplicated lead list with business names, websites, and emails

🛠 Everything Runs Inside n8n With:
- **OpenAI** for AI-driven query generation
- **Zyte API** for reliable scraping
- **JavaScript functions** for email extraction
- **Built-in filtering and batching** for clean results

👥 Who Is This For?
- **Marketing agencies** doing local outreach
- **Freelancers** offering SEO, design, or lead-gen services
- **SaaS founders** targeting SMBs
- **Sales teams** scaling outbound campaigns

✅ Requirements
- **n8n account** (Cloud or self-hosted)
- **OpenAI API key** (stored in n8n credentials)
- **Zyte API key** (stored securely)
- Basic familiarity with Google Sheets if you want to export results

⚙️ How to Set Up
1. Import the workflow JSON into n8n
2. Go to Credentials in n8n and add your OpenAI and Zyte API keys
3. Replace the placeholder credential references in the HTTP Request nodes
4. Set your search parameters (business type and city) in the designated Set node
5. Test the workflow with a single search to confirm the scraping and email-extraction steps
6. Configure batching if you plan to scrape multiple neighborhoods
7. Add an output step (Google Sheets, Airtable, or CRM) to store your leads

🔧 How to Customize
- Update the OpenAI prompt for different search formats (e.g., service + zip code)
- Adjust the regex pattern in the JavaScript node for additional email-validation rules
- Add extra filtering logic for niche-specific keywords
- Integrate with Instantly, HubSpot, or any email-sending tool for full automation
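The email-extraction step described above (regex over scraped homepage HTML) can be sketched as a small function of the kind you would put in an n8n Code node. This is an illustrative sketch, not the template's exact code: the regex, the image-extension filter, and the function name are assumptions.

```javascript
// Extract deduplicated email addresses from raw homepage HTML.
const EMAIL_RE = /[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}/g;

function extractEmails(html) {
  const matches = html.match(EMAIL_RE) || [];
  const seen = new Set();
  return matches.filter(email => {
    const key = email.toLowerCase();
    // Drop duplicates (case-insensitive) and "emails" that are really
    // image filenames like logo@2x.png, a common false positive.
    if (seen.has(key) || /\.(png|jpe?g|gif|svg|webp)$/i.test(key)) return false;
    seen.add(key);
    return true;
  });
}
```

Tightening `EMAIL_RE` (or adding a domain allow/deny list inside the filter) is where the "additional email-validation rules" customization mentioned above would go.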
by Jan Oberhauser
Receives data from an incoming HTTP Request (set up to use the Respond to Webhook node) Creates dummy data Converts the JSON to XML to be returned Responds to the webhook, returning the data along with its content type
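The JSON-to-XML conversion at the heart of this workflow can be sketched as a few lines of the kind an n8n Code node might run. This is a minimal illustration for flat objects only (the function name, root element, and sample fields are assumptions, not the workflow's actual data):

```javascript
// Convert a flat JSON object into a simple XML string.
function jsonToXml(obj, root = "item") {
  const body = Object.entries(obj)
    .map(([key, value]) => `<${key}>${String(value)}</${key}>`)
    .join("");
  return `<${root}>${body}</${root}>`;
}

const xml = jsonToXml({ name: "n8n", version: 1 });
// → "<item><name>n8n</name><version>1</version></item>"
```

The Respond to Webhook node would then return this string with a Content-Type header of application/xml, which is the "content type of the data" mentioned above.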
by Holger
**How it Works:**

This RSS reader retrieves feed links from a Google Sheets file and goes through each link to fetch the items that are younger than 3 days. It saves them in a second Google Sheets file and then deletes all older entries from that second file. Depending on the number of news feeds retrieved, the run can take a while because of the delays added to avoid Google API blocking.

*A detailed description is in the sticky notes of the workflow.*
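The "younger than 3 days" cutoff this workflow applies can be sketched as a small predicate of the kind an n8n Code node would run over RSS items. A minimal sketch, assuming each item carries a parseable publication-date string (the field name and function name are illustrative):

```javascript
// True if the item was published within the last 3 days.
const THREE_DAYS_MS = 3 * 24 * 60 * 60 * 1000;

function isRecent(pubDate, now = Date.now()) {
  const published = new Date(pubDate).getTime();
  return now - published <= THREE_DAYS_MS;
}
```

Filtering incoming items with this predicate before the write, and its negation before the delete, covers both halves of the behavior described above.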