by Don Jayamaha Jr
This workflow acts as a central API gateway for all technical indicator agents in the Binance Spot Market Quant AI system. It listens for incoming webhook requests and dynamically routes them to the correct timeframe-based indicator tool (15m, 1h, 4h, 1d). Designed to power multi-timeframe analysis at scale.

🎥 Watch Tutorial:

🎯 What It Does
- Accepts requests via webhook with a token symbol and timeframe
- Forwards requests to the correct internal technical indicator tool
- Returns a clean JSON payload with RSI, MACD, BBANDS, EMA, SMA, and ADX
- Can be used directly or as a microservice by other agents

🛠️ Input Format
Webhook endpoint: POST /webhook/indicators

Body format:

{ "symbol": "DOGEUSDT", "timeframe": "15m" }

🔄 Routing Logic

| Timeframe | Routed To |
| --------- | -------------------------------- |
| 15m | Binance SM 15min Indicators Tool |
| 1h | Binance SM 1hour Indicators Tool |
| 4h | Binance SM 4hour Indicators Tool |
| 1d | Binance SM 1day Indicators Tool |

🔎 Use Cases

| Use Case | Description |
| -------- | ----------- |
| 🔗 Used by Binance Financial Analyst Tool | Automatically triggers all indicator tools in parallel |
| 🤖 Integrated in Binance Quant AI System | Supports reasoning, signal generation, and summaries |
| ⚙️ Can be called independently for raw data access | Useful for dashboards or advanced analytics |

📤 Output Example

{ "symbol": "DOGEUSDT", "timeframe": "15m", "rsi": 56.7, "macd": "Bearish Crossover", "bbands": "Stable", "ema": "Price above EMA", "adx": 19.4 }

✅ Prerequisites

Make sure all of the following workflows are installed and operational:
- Binance SM 15min Indicators Tool
- Binance SM 1hour Indicators Tool
- Binance SM 4hour Indicators Tool
- Binance SM 1day Indicators Tool
- OpenAI credentials (for any agent using LLM formatting)

🧾 Licensing & Attribution

© 2025 Treasurium Capital Limited Company. All architectural routing logic and endpoint structuring are IP-protected. No unauthorized rebranding or resale permitted.

🔗 Need help? Connect on LinkedIn – Don Jayamaha
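The routing step described in the table above can be sketched as a small lookup, e.g. inside an n8n Code node. This is an illustrative sketch, not the template's actual code; the tool names simply mirror the routing table and may differ from the sub-workflow names in your own instance:

```javascript
// Hypothetical sketch of the gateway's routing step (not the template's code).
// Tool names mirror the routing table above.
const ROUTES = {
  "15m": "Binance SM 15min Indicators Tool",
  "1h": "Binance SM 1hour Indicators Tool",
  "4h": "Binance SM 4hour Indicators Tool",
  "1d": "Binance SM 1day Indicators Tool",
};

function route(body) {
  const { symbol, timeframe } = body || {};
  if (!symbol || !ROUTES[timeframe]) {
    // Reject anything outside the four supported timeframes
    throw new Error(`Unsupported request: ${JSON.stringify(body)}`);
  }
  return { tool: ROUTES[timeframe], payload: { symbol, timeframe } };
}
```

In practice, a Switch node keyed on `timeframe` achieves the same branching without code.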
by jason
This workflow will gather data every minute from the GitHub (https://github.com), Docker (https://www.docker.com/), npm (https://www.npmjs.com/), and Product Hunt (https://www.producthunt.com/) website APIs and display select information on a Smashing (https://smashing.github.io/) dashboard. For convenience's sake, the dashboard piece can be easily downloaded as a Docker container (https://hub.docker.com/r/tephlon/n8n_dashboard) and installed into your Docker environment.
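As a sketch of the "select information" step, the snippet below shapes a GitHub "get repository" API response into a few dashboard fields. The output keys are assumptions for illustration, not the actual Smashing widget contract:

```javascript
// Shape a GitHub repository API response into dashboard fields.
// Input keys are real GitHub API fields; output keys (title, stars, ...)
// are illustrative widget fields, not the Smashing widget contract.
function toWidgetPayload(repo) {
  return {
    title: repo.full_name,
    stars: repo.stargazers_count,
    forks: repo.forks_count,
    openIssues: repo.open_issues_count,
  };
}
```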
by Jan Oberhauser
Simple API which queries the received country code via GraphQL and returns it. Example URL: https://n8n.example.com/webhook/1/webhook/webhook?code=DE

- Receives the country code from an incoming HTTP request
- Reads data via GraphQL
- Converts the data to JSON
- Constructs the return string
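The GraphQL step might build a request like the following. This assumes a schema with a `country(code:)` field (as in the well-known public countries GraphQL API); the workflow's actual endpoint and fields may differ:

```javascript
// Build a GraphQL request body for a country lookup (assumed schema).
function buildCountryQuery(code) {
  return {
    query: "query ($code: ID!) { country(code: $code) { name } }",
    variables: { code },
  };
}
```

POSTing this body as JSON to the GraphQL endpoint returns the country data, which the workflow then converts and formats into the response string.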
by Jimleuk
Are you a popular tech startup accelerator (named after a particular higher order function) overwhelmed with 1000s of pitch decks on a daily basis? Wish you could filter through them quickly using AI but the decks are unparseable through conventional means? Then you're in luck!

This n8n template uses multimodal LLMs to parse and extract valuable data from even the most overly designed pitch decks in quick fashion. Not only that, it'll also create the foundations of a RAG chatbot at the end so you or your colleagues can drill down into the details if needed. With this template, you'll scale your capacity to find interesting companies you'd otherwise miss!

Requires n8n v1.62.1+

How It Works
- Airtable is used as the pitch deck database, and PDF decks are downloaded from it.
- An AI vision model is used to transcribe each page of the pitch deck into markdown.
- An Information Extractor is used to generate a report from the transcribed markdown and update the required information back into the pitch deck database.
- The transcribed markdown is also uploaded to a vector store to build an AI chatbot which can be used to ask questions about the pitch deck.

Check out the sample Airtable here: https://airtable.com/appCkqc2jc3MoVqDO/shrS21vGqlnqzzNUc

How To Use
- This template depends on the availability of the Airtable: make a duplicate of the Airtable (link) and its columns before running the workflow.
- When a new pitch deck is received, enter the company name into the Name column and upload the PDF into the File column. Leave all other columns blank.
- If you have the Airtable trigger active, the execution should start immediately once the file is uploaded. Otherwise, click the manual test trigger to start the workflow.
- When manually triggered, all "new" pitch decks will be handled by the workflow as separate executions.

Requirements
- OpenAI for LLM
- Airtable for database and interface
- Qdrant for vector store

Customising This Workflow

Extend this starter template by adding more AI agents to validate claims made in the pitch deck, e.g. LinkedIn profiles, page visits, reviews, etc.
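As a sketch of the final step, the transcribed pages can be prepared for the vector store one chunk per page, so chat answers can cite a specific deck page. The field names below are illustrative, not the template's actual schema:

```javascript
// Turn per-page markdown transcriptions into vector-store documents.
// One document per deck page keeps page-level citations simple.
// Field names (pageContent, metadata) are illustrative assumptions.
function toVectorDocs(company, pages) {
  return pages.map((markdown, i) => ({
    pageContent: markdown,
    metadata: { company, page: i + 1 },
  }));
}
```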
by Colleen Brady
Who is this for?

This workflow is built for anyone who works with YouTube content, whether you're:
- A learner looking to understand a video's key points
- A content creator repurposing video material
- A YouTube manager looking to update titles and descriptions
- A social media strategist searching for the most shareable clips

Don't just ask questions about what's said. Find out what's going on in a video too.

Video Overview: https://www.youtube.com/watch?v=Ovg_KfKxnC8

What problem does this solve?

YouTube videos hold valuable insights, but watching and processing them manually takes time. This workflow automates:
- **Quick content extraction**: Summarize key ideas without watching full videos
- **Visual analysis**: Understand what's happening beyond spoken words
- **Clip discovery**: Identify the best moments for social sharing

How the workflow works

This n8n-powered automation:
- Uses Google's Gemini 1.5 Flash AI for intelligent video analysis
- Provides multiple content analysis templates tailored to different needs

What makes this workflow powerful?

The easiest place to start is by requesting a summary or transcript. From there, you can refine the prompts to match your specific use case and the type of video content you're working with. But what's even more amazing? You can ask questions about what's happening in the video and get detailed insights about the people, objects, and scenes. It's jaw-dropping.

This workflow is versatile: the actions adapt based on the values set. That means you can use a single workflow to:
- Extract transcripts
- Generate an extended YouTube description
- Write a summary blog post

You can also modify the trigger based on how you want to run the workflow: use a webhook, connect it to an event in Airtable, or leave it as-is for on-demand use. The output can then be sent anywhere: Notion, Airtable, CMS platforms, or even just stored for reference.

How to set it up
- Connect your Google API key
- Paste a YouTube video URL
- Select an analysis method
- Run the workflow and get structured results

Analysis Templates
- **Basic & Timestamped Transcripts**: Extract spoken content
- **Summaries**: Get concise takeaways
- **Visual Scene Analysis**: Detect objects, settings, and people
- **Clip Finder**: Locate shareable moments
- **Actionable Insights**: Extract practical information

Customization Options
- Modify templates to fit your needs
- Connect with external platforms
- Adjust formatting preferences

Advanced Configuration

This workflow is designed for use with gemini-1.5-flash. In the future, you can update the flow to work with different models, or even modify the HTTP Request node to define which API endpoint should be used. It has also been designed so you can use this flow on its own or add it to a new or existing workflow.

This workflow helps you get the most out of YouTube content, quickly and efficiently.
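A request body for the Gemini generateContent endpoint might look like the sketch below. It assumes gemini-1.5-flash accepts the video as a `fileData` part alongside the analysis prompt; verify the part names against the current Gemini API documentation before relying on this:

```javascript
// Hypothetical sketch of a generateContent request body for video analysis.
// Part names (fileData/fileUri) are assumptions to verify against Gemini docs.
function buildGeminiRequest(videoUrl, prompt) {
  return {
    contents: [
      {
        parts: [
          { fileData: { fileUri: videoUrl } }, // the video to analyze
          { text: prompt },                    // the selected analysis template
        ],
      },
    ],
  };
}
```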
by SpaGreen Creative
Shopify Order Fulfillment & Send Tracking Link via WhatsApp Using Rapiwa API Who is this for? This n8n workflow automatically sends WhatsApp notifications to customers when their Shopify orders are fulfilled. It extracts order details, validates customer phone numbers for WhatsApp compatibility using the Rapiwa API, sends tracking information via WhatsApp, and logs all interactions in Google Sheets with appropriate verification status. What this Workflow Does This n8n workflow listens for new order fulfillments on Shopify and automatically sends a WhatsApp message with tracking details to customers. It uses the Rapiwa API to verify if the customer's number is on WhatsApp, formats all the data, sends a message, and logs everything to Google Sheets for tracking and auditing purposes. Key Features Webhook-Triggered**: Activates on new Shopify fulfillment events Phone Number Validation**: Uses Rapiwa to check WhatsApp compatibility Tracking Message Automation**: Sends real-time tracking messages via WhatsApp Data Cleaning**: Formats phone numbers and customer data Smart Branching**: Separates verified and unverified WhatsApp users Google Sheets Logging**: Stores data with status labels for all messages Rate-Limit Protection**: Wait node helps space API calls Dual Sheet Logging**: Maintains separate records for verified and unverified numbers Requirements Tools & Services An n8n instance (self-hosted or cloud) A Shopify store with REST API access enabled A Rapiwa.com account with: Valid Bearer Token Connected and verified WhatsApp number A Google Sheet with the following columns: Like this Sample Sheet Google Sheets OAuth2 credentials** set up in n8n Shopify API credentials** added to n8n Rapiwa Bearer Token** added as httpBearerAuth credentials How to Use Step-by-step Setup Connect Shopify to n8n Use the Shopify Trigger node Set event to fulfillments/create to capture new fulfillment events Extract Webhook Data Use a Code Node to format the webhook response Capture 
order, customer, and tracking details

Fetch Complete Order Information
- Add an HTTP Request node using the Shopify Admin API
- Include the order ID to retrieve customer phone, email, and product details

Clean the Phone Number
Use a Code node to:
- Remove non-numeric characters
- Format the number to international standard
- Combine customer first and last name

Batch Process Orders
- Use the Split In Batches node to handle customers one by one

Validate WhatsApp Number
- Use Rapiwa's /verify-whatsapp endpoint with a Bearer Token
- Check if the number exists on WhatsApp

Conditional Branching
Use an If node:
- If data.exists === "true" → Verified path
- Else → Unverified path

Send WhatsApp Message
Send tracking info with a personalized message:

Hi [Customer Name], Good news! Your order has just been fulfilled. Tracking Number: [Tracking Number] Track your package here: [Tracking URL] Thank you for shopping with us. -Team SpaGreen Creative

Log Data to Google Sheets
- Log verified and unverified entries in separate sheets
- Include all relevant customer and tracking data

Add Delay Between Messages
- Use the Wait node to avoid rate limits on the Rapiwa API

Requirements
- A Shopify store with API access enabled
- A Google Sheet with required columns like this ➤ Sample
- **Rapiwa API account**: connected WhatsApp number and valid Bearer Token
- **n8n** with: Shopify API credentials, Rapiwa Bearer Token, Google Sheets OAuth2 credentials

Google Sheet Column Reference

A Google Sheet formatted like this ➤ Sample

| customer_id | name | email | number | tracking_company | tracking_number | tracking_url | product_title | status |
| ----------- | ---- | ----- | ------ | ---------------- | --------------- | ------------ | ------------- | ------ |
| 8986XXXX06 | Abdul Mannan | contact@spagreen.net | 8801322827799 | Amazon Logistics | SG-OT-02 | https://traxxxG-OT-02 | S25 Ultra 5G Smartphone | verified |
| 883XXX7982 | Abdul Mannan | contact@spagreen.net | 8801322827799 | Amazon Logistics | SG-OT-N03 | https://traxxxGOT-N03 | Samsung Galaxy S24 Ultra | verified |

Workflow Logic Summary
1. Shopify Webhook Trigger: on order fulfillment
2. Extract webhook payload
3. Fetch order + customer details
4. Clean and format phone number
5. Split into single-item batches
6. Check WhatsApp validity via Rapiwa
7. If verified: send WhatsApp message, log to verified sheet
8. If not verified: skip message, log to unverified sheet
9. Add delay with Wait node
10. Repeat for next fulfillment

Customization Ideas
- Modify the WhatsApp message to include delivery date or store contact
- Send different messages for different product categories
- Use product_type or shipping_zone to trigger separate workflows
- Add admin alerts for unverified numbers
- Store message delivery status (e.g., success, failed)

Notes & Warnings
- Rapiwa is an unofficial WhatsApp API: delivery reliability is not guaranteed
- The Google Sheet column name must include the space at the end
- The **Wait node** may need a longer delay for high-volume stores
- Always format phone numbers in international format (e.g., 8801XXXXXXXXX)
- Shopify API version used is 2025-07; update as newer versions release
- You must comply with WhatsApp terms and data privacy laws when messaging users

Useful Links
- **Dashboard:** https://app.rapiwa.com
- **Official Website:** https://rapiwa.com
- **Documentation:** https://docs.rapiwa.com

Support
- WhatsApp Support: Chat Now
- Discord: Join SpaGreen Community
- Facebook Group: SpaGreen Support
- Website: https://spagreen.net
- Developer Portfolio: Codecanyon SpaGreen
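The "Clean the Phone Number" step could look like this in a Code node. This is a sketch under the assumption (matching the sample sheet) that local-format numbers start with 0 and default to Bangladesh's country code 880; adjust for your market:

```javascript
// Normalize a raw phone value to international, digits-only format.
// defaultCountryCode = "880" is an assumption based on the sample data.
function cleanPhone(raw, defaultCountryCode = "880") {
  const digits = String(raw).replace(/\D/g, ""); // drop non-numeric characters
  return digits.startsWith("0")
    ? defaultCountryCode + digits.slice(1) // local format -> international
    : digits;
}
```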
by Oriol Seguí
Web Consultation & Crawling Chatbot with Google Sheets Memory Who is this workflow for? This workflow is designed for SEO analysts, content creators, marketing agencies, and developers who need to index a website and then interact with its content as if it were a chatbot. ⚠ Note: if the site contains many pages, AI token consumption can generate high costs, especially during the initial crawling and analysis phase. 1. Initial Mode (first use with a URL) When the user enters a URL for the first time: URL validation using AI (gpt-5-nano). Automatic sitemap discovery via robots.txt. Relevant sitemap selection (pages, posts, categories, or tags) using GPT-4o according to configured options. (Includes “OPTIONS” node to precisely choose which types of URLs to process) Crawling of all selected pages: Downloads HTML of each page. Converts HTML to Markdown. AI analysis to extract: Detected language. Heading hierarchy (H1, H2, etc.). Internal and external links. Content summary. Structured storage in Google Sheets: Lang H1 and hierarchy External URLs Internal URLs Summary Content Data schema (flag to enable agent mode) When finished, the sheet is marked with Data schema = true, signaling that the site is indexed. 2. Agent Mode (subsequent queries) If the URL has already been indexed (Data schema = true): The chat becomes a LangChain Agent that: Reads the database in Google Sheets. Can perform real-time HTTP requests if it needs updated information. Responds as if it were the website, using stored and live data. This allows the user to ask questions such as: "What’s on the contact page?" "How many external links are there on the homepage?" "Give me all the H1 headings from the services pages" "What CTA would you suggest for my page?" "How would you expand X content?" Use cases Build a chatbot that answers questions about a website’s content. Index and analyze full websites for future queries. SEO tool to list headings, links, and content summaries. 
Assistant for quick exploration of a site’s structure. Generate improvement recommendations and content strategies from site data.
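The sitemap-discovery step boils down to reading `Sitemap:` lines from robots.txt. A minimal sketch of that parsing (in the actual workflow, choosing which sitemap to crawl is then delegated to the AI selection step described above):

```javascript
// Extract sitemap URLs from a robots.txt body.
function extractSitemaps(robotsTxt) {
  return robotsTxt
    .split(/\r?\n/)
    .map((line) => line.trim())
    .filter((line) => /^sitemap:/i.test(line))       // keep "Sitemap:" directives
    .map((line) => line.slice(line.indexOf(":") + 1).trim()); // keep the URL
}
```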
by Shelly-Ann Davy
AI Contact Enrichment 📋 Template Description Overview Automatically enhance and enrich contact data using AI to fill in missing information, generate insights, and create detailed buyer personas. Supports multiple AI providers (OpenAI, Anthropic, etc.) with automatic logging to Supabase. Description This workflow transforms incomplete contact records into rich, actionable profiles. By leveraging AI, it can infer job roles, company information, likely pain points, communication preferences, and buying motivations from minimal input data. Perfect for sales and marketing teams looking to improve data quality and personalize outreach. Key Benefits: Smart Data Completion**: Fill in missing contact fields using AI inference Buyer Persona Generation**: Create detailed profiles from basic information Universal AI Support**: Works with OpenAI, Anthropic Claude, or custom providers CRM Enhancement**: Automatically enrich contacts as they enter your system Lead Qualification**: Assess lead quality and fit based on enriched data Personalization Engine**: Generate insights for tailored outreach Data Quality**: Maintain clean, complete contact records Use Cases: Sales prospecting and lead enrichment Marketing persona development CRM data cleansing and completion Account-based marketing (ABM) research Lead scoring and qualification Personalized email campaign preparation Contact segmentation and targeting ⚙️ Setup Instructions Prerequisites n8n instance (cloud or self-hosted) AI Provider account (OpenAI, Anthropic, or custom) Supabase account with database access Step 1: Configure Environment Variables Add these to your n8n environment settings: AI_PROVIDER=openai # or 'anthropic', 'custom' AI_API_KEY=your_api_key_here AI_MODEL=gpt-3.5-turbo # or 'gpt-4', 'claude-3-sonnet-20240229' AI_ENDPOINT= # Only for custom providers Recommended Models: Cost-effective**: gpt-3.5-turbo (fast, affordable, good for basic enrichment) High-quality**: gpt-4 or claude-3-sonnet-20240229 (better 
inference, deeper insights) Premium**: claude-3-opus-20240229 (best for complex persona generation) How to set environment variables: n8n Cloud**: Go to Settings → Environment Variables Self-hosted**: Add to your .env file or docker-compose configuration Step 2: Set Up Supabase Database Create the logging table in your Supabase database: CREATE TABLE workflow_logs ( id BIGSERIAL PRIMARY KEY, workflow_name TEXT NOT NULL, data JSONB NOT NULL, ai_response JSONB NOT NULL, created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW() ); CREATE INDEX idx_workflow_logs_created_at ON workflow_logs(created_at); CREATE INDEX idx_workflow_logs_workflow_name ON workflow_logs(workflow_name); -- Optional: Create a view for enriched contacts CREATE VIEW enriched_contacts AS SELECT id, data->>'email' as email, data->>'name' as name, data->>'company' as company, ai_response as enrichment_data, created_at FROM workflow_logs WHERE workflow_name = 'AI Contact Enrichment' ORDER BY created_at DESC; To run this SQL: Open your Supabase project dashboard Go to the SQL Editor Paste the SQL above and click "Run" Step 3: Configure Supabase Credentials in n8n Go to Settings → Credentials Click Add Credential → Supabase API Enter your Supabase URL and API key (found in Project Settings → API) Name it Supabase API Click Save Step 4: Activate the Webhook Import this workflow into n8n Click the Activate toggle in the top-right corner Click on the "Webhook Trigger" node Copy the Production URL (this is your webhook endpoint) Save this URL for integration with your applications Step 5: Test the Workflow Send a test POST request to the webhook: curl -X POST https://your-n8n-instance.com/webhook/contact-enrichment \ -H "Content-Type: application/json" \ -d '{ "email": "john.doe@acmecorp.com", "name": "John Doe", "company": "Acme Corporation", "linkedin_url": "https://linkedin.com/in/johndoe" }' Successful Response: { "success": true, "workflow": "AI Contact Enrichment", "timestamp": "2025-01-14T12:00:00.000Z" } 📥 
Expected Payload Format The webhook accepts JSON with basic contact information: Minimal Input { "email": "string (required or name required)", "name": "string (required or email required)" } Recommended Input { "email": "string", "name": "string", "company": "string", "job_title": "string", "linkedin_url": "string", "phone": "string", "location": "string", "website": "string" } Complete Input Example { "email": "sarah.chen@techstartup.io", "name": "Sarah Chen", "company": "TechStartup Inc.", "job_title": "VP of Marketing", "linkedin_url": "https://linkedin.com/in/sarahchen", "phone": "+1-555-0123", "location": "San Francisco, CA", "website": "https://techstartup.io", "industry": "B2B SaaS", "company_size": "50-200 employees", "notes": "Met at SaaS conference 2024" } Field Guidelines: At minimum, provide either email or name More input fields = better AI enrichment quality Include linkedin_url for best results company helps with firmographic enrichment Any additional context improves accuracy 🔄 Workflow Flow Webhook Trigger: Receives basic contact information from your application, form, or CRM Process Data: Adds unique ID and timestamp to the incoming data Prepare AI Request: Configures AI provider settings from environment variables Call AI API: Sends contact data to AI with enrichment prompt Save to Supabase: Archives original data and enrichment results Format Response: Returns success confirmation 🎯 Customization Tips Enhance AI Prompts for Better Enrichment Modify the "Prepare AI Request" node to customize enrichment: // Enhanced prompt for contact enrichment const systemPrompt = `You are an expert sales intelligence analyst. 
Analyze the provided contact information and generate a comprehensive enrichment including: INFERRED DETAILS: Fill in missing information based on available data Full job title and seniority level Department and reporting structure Years of experience (estimated) Professional background COMPANY INSIGHTS: If company name provided Industry and sub-industry Company size and revenue (estimated) Key products/services Recent news or developments BUYER PERSONA: Create a detailed profile Primary responsibilities Likely pain points and challenges Key priorities and goals Decision-making authority Budget influence level ENGAGEMENT STRATEGY: Provide outreach recommendations Best communication channels Optimal outreach timing Key talking points Personalization suggestions Content interests LEAD SCORE: Rate 1-10 based on: Fit for product/service (specify your ICP) Seniority and decision power Company size and maturity Engagement potential Return as structured JSON with clear sections.`; const userMessage = `Contact Information:\n${JSON.stringify($json.data, null, 2)}`; const aiConfig = { provider: $env.AI_PROVIDER || 'openai', apiKey: $env.AI_API_KEY, model: $env.AI_MODEL || 'gpt-3.5-turbo', endpoint: $env.AI_ENDPOINT, messages: [ { role: 'system', content: systemPrompt }, { role: 'user', content: userMessage } ] }; return { json: { aiConfig, data: $json } }; Add External Data Sources Enhance enrichment with third-party APIs: After "Process Data" node, add: Clearbit/Hunter.io Node: Get verified company data LinkedIn API: Pull professional information Company Database: Query internal customer data Web Scraping: Extract data from company websites Then merge all data before AI enrichment for best results Connect to Your CRM Auto-update contacts after enrichment: Salesforce Integration: // Add after "Call AI API" node // Update Salesforce contact with enriched data const enrichedData = JSON.parse($json.ai_response); return { json: { contactId: $json.data.salesforce_id, updates: {
Description: enrichedData.buyer_persona, Custom_Score__c: enrichedData.lead_score, Pain_Points__c: enrichedData.pain_points } } }; HubSpot Integration: Add HubSpot node to update contact properties Map enriched fields to custom HubSpot properties Pipedrive Integration: Use Pipedrive node to update person records Add custom fields for AI insights Implement Lead Scoring Add scoring logic after enrichment: // Calculate lead score based on enrichment const enrichment = JSON.parse($json.ai_response); let score = 0; // Job title scoring if (enrichment.seniority === 'C-Level') score += 30; else if (enrichment.seniority === 'VP/Director') score += 20; else if (enrichment.seniority === 'Manager') score += 10; // Company size scoring if (enrichment.company_size === 'Enterprise') score += 25; else if (enrichment.company_size === 'Mid-Market') score += 15; // Decision authority scoring if (enrichment.decision_authority === 'High') score += 25; else if (enrichment.decision_authority === 'Medium') score += 15; // Budget influence if (enrichment.budget_influence === 'Direct') score += 20; return { json: { ...enrichment, lead_score: score } }; Add Compliance Checks Insert before AI processing: // Check for opt-out or compliance flags const email = $json.email.toLowerCase(); // Check against suppression list const suppressedDomains = ['competitor.com', 'spam.com']; const domain = email.split('@')[1]; if (suppressedDomains.includes(domain)) { throw new Error('Contact on suppression list'); } // Verify email format const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/; if (!emailRegex.test(email)) { throw new Error('Invalid email format'); } return { json: $json }; Batch Enrichment Process multiple contacts: Add Spreadsheet File trigger instead of webhook Add Split In Batches node (process 10-20 at a time) Run enrichment for each contact Combine results and export to CSV 🛠️ Troubleshooting Common Issues Issue: "Enrichment is too generic" Solution**: Provide more input data (company, job title, LinkedIn) Use
GPT-4 or Claude models for better inference Enhance the system prompt with specific instructions Issue: "AI_API_KEY is undefined" Solution**: Ensure environment variables are set correctly Verify variable names match exactly (case-sensitive) Issue: "Enrichment contradicts actual data" Solution**: AI makes inferences - always validate critical information Add validation step to check enriched data against known facts Use external APIs for verification Issue: "Too slow for real-time use" Solution**: Implement queue system for async processing Use faster models (gpt-3.5-turbo) for speed Process in batches during off-peak hours Issue: "Supabase credentials not found" Solution**: Check credential name matches exactly: "Supabase API" Verify Supabase URL and API key are correct Debugging Tips Test with known contacts first to validate accuracy Compare AI enrichment against actual data Check execution logs for API errors Start with minimal prompt, then enhance gradually Use "Execute Node" to test individual steps 📊 Analyzing Enriched Data Query and analyze your enriched contacts: -- Get all enriched contacts SELECT * FROM enriched_contacts ORDER BY created_at DESC; -- Find high-value leads (assuming scoring implemented) SELECT email, name, company, ai_response->>'lead_score' as score FROM enriched_contacts WHERE (ai_response->>'lead_score')::int > 70 ORDER BY (ai_response->>'lead_score')::int DESC; -- Analyze enrichment by company SELECT data->>'company' as company, COUNT(*) as contact_count, AVG((ai_response->>'lead_score')::int) as avg_score FROM workflow_logs WHERE workflow_name = 'AI Contact Enrichment' AND ai_response->>'lead_score' IS NOT NULL GROUP BY data->>'company' ORDER BY contact_count DESC; -- Find contacts needing follow-up SELECT email, name, ai_response->>'engagement_strategy' as strategy, created_at FROM enriched_contacts WHERE created_at > NOW() - INTERVAL '7 days' ORDER BY created_at DESC; Export Enriched Data -- Export to CSV COPY ( SELECT 
data->>'email' as email, data->>'name' as name, data->>'company' as company, ai_response->>'job_title' as enriched_title, ai_response->>'seniority' as seniority, ai_response->>'lead_score' as score FROM workflow_logs WHERE workflow_name = 'AI Contact Enrichment' ) TO '/tmp/enriched_contacts.csv' WITH CSV HEADER; 📈 Integration Ideas Form Integration Automatically enrich new leads from forms: Typeform**: Trigger on form submission Google Forms**: Use Google Sheets trigger Calendly**: Enrich after meeting booking Webflow Forms**: Webhook trigger from form CRM Integration Real-time enrichment as contacts enter CRM: Salesforce**: Trigger on new lead/contact creation HubSpot**: Enrich on form submission or import Pipedrive**: Auto-enrich new persons Close**: Webhook on lead creation Email Tools Enhance cold outreach campaigns: Instantly.ai**: Enrich before campaign launch Lemlist**: Generate personalization variables Apollo.io**: Supplement with AI insights Mailshake**: Enrich prospect lists Marketing Automation Power ABM and segmentation: Marketo**: Enrich leads for scoring Pardot**: Enhance prospect profiles ActiveCampaign**: Personalization data Klaviyo**: E-commerce customer insights Slack Integration Team notifications and collaboration: Send enrichment summaries to sales channel Notify reps of high-value leads Share persona insights with marketing Alert on key account contacts 🔒 Security & Compliance Best Practices Data Protection Encrypt Sensitive Data: Use environment variables for all credentials Access Control: Limit webhook access with authentication Data Retention: Set automatic deletion policies in Supabase Audit Logging: Track all enrichment activities Privacy Compliance GDPR Compliance: Get consent before enriching personal data Allow contacts to request data deletion Document legal basis for processing CCPA Compliance: Honor do-not-sell requests Data Minimization: Only enrich necessary fields Right to Access: Allow contacts to view enriched data AI Ethics 
Bias Awareness: Review AI inferences for bias Accuracy Validation: Verify critical information Transparency: Disclose use of AI enrichment Human Oversight: Review before critical decisions 💡 Best Practices Input Data Quality Always include email or full name** as anchor point Add LinkedIn URLs** for 50% better accuracy Provide company name** for firmographic insights Include any known details** - more data = better results Prompt Engineering Be specific** about your ideal customer profile (ICP) Request structured output** (JSON format) Define scoring criteria** that match your business Ask for actionable insights** not just descriptions Post-Enrichment Workflow Always validate** critical information before use Review AI inferences** for accuracy and bias Update CRM promptly** to maintain data freshness Track enrichment ROI** (conversion rates, time saved) Performance Optimization Batch process** during off-peak hours Use appropriate models** (gpt-3.5 for speed, gpt-4 for quality) Cache common enrichments** to reduce API costs Set rate limits** to avoid API throttling 🏷️ Tags sales-automation, lead-enrichment, ai-automation, crm-integration, data-enrichment, contact-intelligence, buyer-personas, lead-scoring, webhook, supabase, openai, anthropic, b2b-sales 📝 License This workflow template is provided as-is for use with n8n. 
🤝 Support For questions or issues: n8n Community Forum: https://community.n8n.io n8n Documentation: https://docs.n8n.io 🌟 Example Output Input: { "email": "mike.johnson@cloudtech.com", "name": "Mike Johnson", "company": "CloudTech Solutions", "job_title": "Director of IT" } AI-Generated Enrichment: { "full_title": "Director of Information Technology", "seniority": "Director", "department": "Technology/IT", "experience_years": "10-15", "company_insights": { "industry": "Cloud Computing", "size": "Mid-Market (100-500)", "revenue_estimate": "$10M-$50M" }, "buyer_persona": { "responsibilities": ["Infrastructure management", "Vendor selection", "Security oversight"], "pain_points": ["Legacy system migration", "Cost optimization", "Security compliance"], "priorities": ["Scalability", "Cost reduction", "Team efficiency"] }, "engagement_strategy": { "best_channels": ["Email", "LinkedIn"], "timing": "Tuesday-Thursday, 9-11 AM", "talking_points": ["ROI and cost savings", "Security features", "Ease of implementation"], "personalization": "Reference cloud migration challenges" }, "lead_score": 75 } 🔄 Version History v1.0.0** (2025-01-14): Initial release with universal AI provider support
by Will Stenzel
Creates a new team for a project from webhook form data. When the project is created, the current semester is added to its relation attribute. More info on using this workflow as part of a larger system can be found here.
by Max Tkacz
This workflow shows how to sum multiple items of data, as you would in Excel or Airtable when summing up the total of a column. It uses a Function node with some JavaScript to perform the aggregation of numeric data. The first node simply provides mock data, to avoid needing a credential to run the workflow. The second node performs the summation; the JavaScript has various comments in case you need to edit it, for example to sum a different set of items. Below is an example of the type of data this workflow can sum: anything in tabular form (Airtable, GSheets, Postgres, etc.).
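The aggregation amounts to a reduce over the incoming items. A minimal sketch in the n8n Function-node item shape, with "price" as an assumed field name:

```javascript
// Sum one numeric field across all incoming n8n items.
// Items use n8n's shape: [{ json: {...} }, ...]; "price" is an illustrative field.
function sumColumn(items, field) {
  const total = items.reduce(
    (sum, item) => sum + Number(item.json[field] || 0),
    0
  );
  return [{ json: { total } }]; // n8n nodes return an array of items
}
```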
by Shahrukh
AI-Powered Local Lead Generation Workflow with n8n This workflow solves one of the biggest pain points for freelancers, agencies, and SaaS founders—finding accurate local business leads at scale without manual copy-pasting or unreliable scraping tools. Traditional lead generation is time-consuming and prone to errors. This template automates the entire process so you can focus on outreach, not data gathering. ✅ What the Workflow Does Accepts a business type (e.g., plumbers) and city (e.g., Los Angeles) as input Uses AI to generate hyperlocal search terms for full neighborhood coverage Scrapes Google Maps results to extract business details and websites Filters out junk, Google-owned links, and duplicates Scrapes homepage HTML for each business and extracts valid emails using Regex Outputs a clean, deduplicated lead list with business names, websites, and emails 🛠 Everything Runs Inside n8n With: OpenAI** for AI-driven query generation Zyte API** for reliable scraping JavaScript functions** for email extraction Built-in filtering and batching** for clean results 👥 Who is This For? 
Marketing agencies** doing local outreach Freelancers** offering SEO, design, or lead gen services SaaS founders** targeting SMBs Sales teams** scaling outbound campaigns ✅ Requirements n8n account** (Cloud or self-hosted) OpenAI API key** (stored in n8n credentials) Zyte API key** (stored securely) Basic familiarity with Google Sheets if you want to export results ⚙️ How to Set Up Import the workflow JSON into n8n Go to Credentials in n8n and add OpenAI and Zyte API keys Replace placeholder credential references in the HTTP Request nodes Set your search parameters (business type and city) in the designated Set node Test the workflow with a single search to confirm scraping and email extraction steps Configure batching if you plan to scrape multiple neighborhoods Add an output step (Google Sheets, Airtable, or CRM) to store your leads 🔧 How to Customize Update the OpenAI prompt for different search formats (e.g., service + zip code) Adjust the Regex pattern in the JavaScript node for additional email validation rules Add extra filtering logic for niche-specific keywords Integrate with Instantly, HubSpot, or any email-sending tool for full automation
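The email-extraction step run over each homepage's HTML can be sketched as below. The regex and the junk filter are illustrative; the template's actual pattern may differ:

```javascript
// Pull candidate emails out of homepage HTML, deduplicated and lowercased.
// The trailing filter drops common false positives like image filenames
// ("logo@2x.png") that match an email-shaped regex.
function extractEmails(html) {
  const matches =
    html.match(/[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}/g) || [];
  return [...new Set(matches.map((e) => e.toLowerCase()))].filter(
    (e) => !/\.(png|jpg|jpeg|gif|webp|svg)$/.test(e)
  );
}
```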
by Jan Oberhauser
- Receives data from an incoming HTTP request (set up to use the Respond to Webhook node)
- Creates dummy data
- Converts the JSON to XML, which gets returned
- Respond to Webhook returns the data and the content type of the data
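The JSON-to-XML conversion can be sketched with a tiny recursive helper; this is a simplified stand-in for whatever conversion node the workflow uses (it ignores attributes and arrays):

```javascript
// Minimal JSON -> XML conversion: nested objects become nested elements.
function toXml(value, tag) {
  if (value !== null && typeof value === "object") {
    const inner = Object.entries(value)
      .map(([key, child]) => toXml(child, key))
      .join("");
    return `<${tag}>${inner}</${tag}>`;
  }
  return `<${tag}>${String(value)}</${tag}>`;
}
```

The Respond to Webhook node would then return this string with a `Content-Type: text/xml` response header.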