by Daniel Rosehill
# Voice Note Context Extraction Pipeline with AI Agent & Vector Storage

This n8n template demonstrates how to automatically extract and store contextual information from voice notes using AI agents and vector databases for future retrieval.

## How it works

- **Webhook trigger** receives voice note data including title, transcript, and timestamp from external services (example here: voicenotes.com)
- **Field extraction** isolates the key data fields (title, transcript, timestamp) for processing
- **AI Context Agent** processes the transcript to extract meaningful context while:
  - Correcting speech-to-text errors
  - Converting first-person references to third-person facts
  - Filtering out casual conversation and focusing on significant information
- **Output formatting** structures the extracted context with timestamps for embedding
- **File conversion** prepares the context data for vector storage
- **Vector embedding** uses OpenAI embeddings to create searchable representations
- **Milvus storage** stores the embedded context for future retrieval in RAG applications

## How to use

1. Configure the webhook endpoint to receive data from your voice note service
2. Set up credentials for OpenRouter (LLM), OpenAI (embeddings), and Milvus (vector storage)
3. Customize the AI agent's system prompt to match your context extraction needs
4. The workflow automatically processes incoming voice notes and stores extracted context

## Requirements

- OpenRouter account for LLM access
- OpenAI API key for embeddings
- Milvus vector database (cloud or self-hosted)
- Voice note service with webhook capabilities (e.g., Voicenotes.com)

## Customizing this workflow

- **Modify the context extraction prompt** to focus on specific types of information (preferences, facts, relationships)
- **Add filtering logic** to process only voice notes with specific tags or keywords
- **Integrate with other storage systems** like Pinecone, Weaviate, or local vector databases
- **Connect to RAG systems** to use the stored context for enhanced AI conversations
- **Add notification nodes** to confirm successful context extraction and storage

## Use cases

- **Personal AI assistant** that remembers your preferences and context from voice notes
- **Knowledge management** system for capturing insights from recorded thoughts
- **Content creation** pipeline that extracts key themes from voice recordings
- **Research assistant** that builds context from interview transcripts or meeting notes
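The "Field extraction" step can be sketched as an n8n Code-node function. The payload field names (title, transcript, created_at) are assumptions based on the description above; adjust them to your voice note service's actual webhook schema.

```javascript
// Hypothetical field-extraction step for the incoming webhook payload.
// Field names are assumed, not taken from voicenotes.com's real schema.
function extractFields(payload) {
  const { title = "", transcript = "", created_at = null } = payload;
  if (!transcript.trim()) {
    throw new Error("Voice note has no transcript to process");
  }
  return {
    title: title.trim(),
    transcript: transcript.trim(),
    // Fall back to "now" if the service sends no timestamp
    timestamp: created_at ?? new Date().toISOString(),
  };
}

// In an n8n Code node you would return items, e.g.:
// return [{ json: extractFields($input.first().json.body) }];
```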
by Rahul Joshi
## 📊 Description

Most career advice is generic. This workflow builds a fully personalized AI coaching system that remembers every user, adapts to their career stage and goals, detects what kind of help they need, and gets more contextual with every conversation. It is not a simple chatbot — it is a structured coaching engine with user profiling, conversation memory, intent routing, and proactive weekly outreach. Built for women's communities, coaching platforms, HR teams, and edtech creators who want to deliver real personalized career support at scale without hiring a team of coaches.

## What This Workflow Does

- 💬 Opens a chat interface where users can start talking immediately — no signup required
- 🧮 Detects whether the user is new or returning on every message, using Google Sheets as a user database
- 📋 Walks new users through a 4-step onboarding — name, career stage, biggest challenge, and monthly goal
- 🗂️ Stores and updates every user profile in Google Sheets with full onboarding state tracking
- 🔍 Detects intent from every coaching message across 6 categories — salary negotiation, interview prep, career change, leadership, confidence building, and work-life balance
- 🤖 Routes each message to a topic-specific GPT-4o system prompt tailored to that coaching category
- 💬 Loads the last 5 conversations from history to give GPT-4o full context before generating a response
- ✅ Responds with personalized advice, one actionable step for today, and a follow-up question to keep momentum
- 📝 Logs every conversation to Google Sheets with timestamp, message, intent, and AI response
- 📧 Sends every fully onboarded user a personalized weekly check-in email each Sunday, with a weekly challenge, progress acknowledgment, and a motivational quote from a woman leader

## Key Benefits

- ✅ Full conversation memory — every session builds on the last
- ✅ Intent detection across 6 coaching categories — not one generic prompt
- ✅ User profiling — advice is always tailored to their stage, challenge, and goal
- ✅ Proactive weekly outreach — not just reactive coaching
- ✅ Complete audit trail — every conversation logged to Google Sheets
- ✅ Works for any number of users simultaneously via session-based identification
- ✅ No login or signup needed — just open the chat URL and start

## How It Works

**SW1 — Onboarding.** Every message hits the Chat Trigger and gets routed through the onboarding engine. The workflow reads all users from Google Sheets and matches the current session ID. If the user is new, they go through 4 onboarding steps one message at a time: name, career stage with a numbered menu, biggest challenge with a numbered menu, and their monthly goal in their own words. Numbered responses are automatically mapped to their full label, so 2 becomes "Early Career (1-3 years)" and 5 becomes "Work-life balance". Every step is saved and the onboarding state is tracked, so users can return to the same conversation and pick up exactly where they left off.

**SW2 — Coaching Engine.** Once onboarding is complete, every subsequent message goes straight to the coaching engine. The workflow reads the user's last 5 conversations from the Conversation Log sheet for context. It then scans the message for intent keywords across 6 categories and selects the matching system prompt. GPT-4o receives the full user profile, conversation history, and topic-specific coaching instructions before generating a response. Every response includes a main coaching answer, one specific action the user can take today, and a follow-up question to continue the conversation. The full exchange is logged to the Conversation Log sheet.

**SW3 — Weekly Check-in.** Every Sunday at 10 AM the workflow reads all fully onboarded users, pulls their recent conversation topics from the log, and generates a personalized weekly check-in email for each one via GPT-4o. The email includes a warm personalized greeting, acknowledgment of their progress, a concrete weekly challenge tied to their goal, and a motivational quote from a real woman leader relevant to their situation. Emails are sent via Gmail and every send is logged to the Weekly Checkins sheet.

## Features

- n8n Chat Trigger with public URL — shareable with any user
- Session-based user identification — no login required
- 4-step guided onboarding with numbered menu options
- Numbered-response-to-label mapping for clean data storage
- New vs. returning user detection on every message
- Google Sheets as full user database and conversation memory
- 6-category intent detection engine with keyword matching
- Last-5-message context window passed to every GPT call
- Topic-specific GPT-4o system prompts per coaching category
- Structured JSON responses — advice, action step, follow-up question
- Weekly Sunday proactive check-in via Gmail
- Personalized HTML email with challenge box and quote section
- Full logging across 3 sheets — User Profiles, Conversation Log, Weekly Checkins

## Requirements

- OpenAI API key (GPT-4o access)
- Google Sheets OAuth2 connection
- Gmail OAuth2 connection
- A configured Google Sheet with the 3 sheets listed above

## Setup Steps

1. Create a Google Sheet called "AI Confidence Coach" with 3 sheets: User Profiles, Conversation Log, Weekly Checkins
2. Paste your Sheet ID into all Google Sheets nodes
3. Add your Google Sheets OAuth2 credentials
4. Add your OpenAI API key
5. Add your Gmail OAuth2 credentials
6. Add your email as fallback in the Send Weekly Checkin Email node
7. Activate the workflow and copy the Chat Trigger URL
8. Open the chat URL and test the full onboarding flow with 5 messages
9. Send a coaching question and confirm GPT-4o responds with personalized advice
10. Add your email to the User Profiles sheet and run SW3 manually to test the weekly check-in email
11. Share the chat URL with your users — the workflow runs itself from here

## Target Audience

- 🤖 Women's communities and career platforms that want to offer AI coaching at scale
- 💼 HR teams building internal confidence and career development tools
- 🎓 Edtech creators running women's upskilling and mentorship programs
- 🔧 Automation agencies building AI coaching products for clients
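The 6-category keyword intent detection described above can be sketched as follows. The category names come from the template; the keyword lists themselves are illustrative assumptions you would extend to match your users' vocabulary.

```javascript
// Illustrative keyword-based intent detector across the template's
// 6 coaching categories. Keyword lists are assumptions, not the
// template's exact configuration.
const INTENT_KEYWORDS = {
  salary_negotiation: ["salary", "raise", "negotiate", "compensation"],
  interview_prep: ["interview", "recruiter", "hiring manager", "offer"],
  career_change: ["career change", "switch careers", "pivot", "new field"],
  leadership: ["leadership", "manager", "promotion", "team lead"],
  confidence_building: ["confidence", "imposter", "self-doubt", "nervous"],
  work_life_balance: ["balance", "burnout", "overwhelmed", "boundaries"],
};

function detectIntent(message) {
  const text = message.toLowerCase();
  let best = { intent: "general", hits: 0 };
  for (const [intent, keywords] of Object.entries(INTENT_KEYWORDS)) {
    const hits = keywords.filter((k) => text.includes(k)).length;
    if (hits > best.hits) best = { intent, hits };
  }
  return best.intent; // "general" falls back to a default system prompt
}
```

The returned category key would then select the matching topic-specific system prompt before the GPT-4o call.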
by Wassim Abid
Build a fully local RAG chatbot using Ollama that works without tool calling — ideal for smaller open-source models like Qwen that don't support native function calls. This template lets you run a private, self-hosted AI assistant with retrieval-augmented generation using only your own hardware.

## How it works

- A Webhook receives the user's chat message
- A small classifier LLM (Qwen 7B) analyzes the input and decides: is this small talk, or a real question that needs the knowledge base?
- For small talk, a dedicated AI agent responds conversationally with chat memory
- For real questions, the classifier generates focused sub-queries, which are sent through a loop-based RAG pipeline:
  - Each sub-query is embedded using BGE-M3 and matched against a Postgres PGVector store
  - Results are filtered by a relevance score threshold (>0.4)
  - Chunks are aggregated and deduplicated across all sub-queries
  - An Answer Generator agent (Qwen 14B) produces a sourced answer using a strict 3-step format: short answer → sources → follow-up question
- Both paths use Postgres-backed chat memory for multi-turn conversations
- A post-processing step removes <think> tags that some reasoning models produce

## Set up steps

1. Install Ollama and pull the required models:
   - ollama pull qwen2.5:7b (classifier + small talk)
   - ollama pull qwen3:14b (answer generation)
   - ollama pull bge-m3 (embeddings)
2. Set up PostgreSQL with the pgvector extension enabled
3. Create your vector store: ingest your documents into the PGVector store using BGE-M3 embeddings (you can use n8n's built-in document loaders for this)
4. Configure credentials in n8n:
   - Ollama connection (default: http://localhost:11434)
   - PostgreSQL connection for both chat memory and vector store
5. Customize the webhook path and connect it to your frontend or API client
6. Optional: adjust the relevance score threshold, swap models for larger/smaller ones, or modify the system prompts to match your use case
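The filter-and-aggregate step of the RAG pipeline above can be sketched in a few lines: keep only chunks above the 0.4 relevance threshold, then deduplicate across sub-queries. The chunk shape ({ id, text, score }) is an assumed simplification of what the PGVector node returns.

```javascript
// Sketch of the relevance filter + cross-sub-query deduplication.
// Chunk shape and sort order are illustrative assumptions.
const SCORE_THRESHOLD = 0.4;

function aggregateChunks(resultsPerSubQuery) {
  const seen = new Set();
  const kept = [];
  for (const results of resultsPerSubQuery) {
    for (const chunk of results) {
      if (chunk.score <= SCORE_THRESHOLD) continue; // relevance filter
      if (seen.has(chunk.id)) continue;             // dedupe across sub-queries
      seen.add(chunk.id);
      kept.push(chunk);
    }
  }
  // Highest-scoring chunks first for the answer generator's context window
  return kept.sort((a, b) => b.score - a.score);
}
```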
by Milo Bravo
# AI YouTube Trend Intelligence Report: YouTube API + GPT-4o + PDF Dashboard

## Who is this for?

AI creators, marketers, agencies, and researchers tracking YouTube trends who need weekly high-signal insights without 4+ hours of manual research.

## What problem is this workflow solving?

Trend hunting is exhausting:

- Scanning 500+ videos across keywords
- Manual engagement calculations
- No automated filtering or analysis
- Scattered spreadsheets instead of polished reports

This workflow auto-discovers top videos, ranks them by engagement, and delivers a branded PDF + Sheets dashboard.

## What this workflow does

1. Trigger: form input (keywords, days back) or weekly cron
2. YouTube API: searches 10 keywords → ~500 videos (past 7 days)
3. Ranking: views + engagement rates → top performers
4. Google Sheets: exports channels/videos/keywords/stats
5. GPT-4o: analyzes trends → content recommendations
6. PDF.co: HTML charts → branded PDF report
7. Gmail: delivers to inbox

## Setup (5 minutes)

- YouTube Data API v3 key (HTTP Query Auth)
- Google Sheets OAuth2 for exports
- OpenAI API (GPT-4o-mini)
- PDF.co for HTML-to-PDF
- Gmail OAuth2 + recipient email

Fully configurable via env vars; no hardcoded IDs.

## How to customize

- Edit the 10-term keyword list for your niche
- Filters: adjust min views (1k) and engagement (2%)
- Schedule: daily/weekly cron
- Output: swap Gmail for Slack/Notion
- Scale: 1000s of videos/month

## ROI

- 4+ hours saved weekly
- 20% higher content performance
- Automated competitive intel
- Zero manual spreadsheet work

Need help customizing? Contact me for consulting and support: LinkedIn / [Message](https://tally.so/r/E

Keywords: YouTube trend analysis, AI YouTube research, YouTube analytics automation, content trend tracker, video engagement ranking, YouTube API n8n, weekly YouTube report, YouTube keyword monitoring
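The ranking step above can be sketched as follows. The statistics field names (viewCount, likeCount, commentCount) follow the YouTube Data API v3 videos.list response; the engagement formula and the 1k-views / 2% thresholds mirror the template's stated defaults but are otherwise an illustrative assumption.

```javascript
// Sketch of the views + engagement ranking with the template's default
// filters (min 1k views, min 2% engagement). Engagement formula is an
// assumption: (likes + comments) / views.
function rankVideos(videos, minViews = 1000, minEngagement = 0.02) {
  return videos
    .map((v) => {
      const views = Number(v.statistics.viewCount || 0);
      const likes = Number(v.statistics.likeCount || 0);
      const comments = Number(v.statistics.commentCount || 0);
      const engagementRate = views > 0 ? (likes + comments) / views : 0;
      return { ...v, views, engagementRate };
    })
    .filter((v) => v.views >= minViews && v.engagementRate >= minEngagement)
    .sort((a, b) => b.engagementRate - a.engagementRate);
}
```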
by n8n Automation Expert | Template Creator | 2+ Years Experience
# 🔗 Automated Blockchain Transaction Audit System

Transform your blockchain compliance workflow with this enterprise-grade automation that monitors transactions across Ethereum and Solana networks, automatically generates professional audit reports, and maintains complete documentation trails.

## 🚀 What This Workflow Does

- 📊 **Multi-Chain Monitoring**: Real-time transaction tracking for Ethereum (via Alchemy API) and Solana networks
- 🤖 **AI-Powered Risk Analysis**: Intelligent scoring algorithm that evaluates transaction risk (0-100 scale)
- 📄 **Automated PDF Generation**: Professional audit reports created instantly using APITemplate.io
- ☁️ **Cloud Storage Integration**: Seamless uploads to Google Drive with an organized folder structure
- 📋 **Database Management**: Automatic Notion database entries for complete audit trail tracking
- 📧 **Smart Notifications**: Multi-channel alerts to finance teams with detailed transaction summaries
- 🔒 **Compliance Verification**: Built-in KYC/AML checks and regulatory compliance monitoring

## 💼 Perfect For

- **FinTech Companies** managing blockchain transactions
- **DeFi Protocols** requiring audit documentation
- **Enterprise Finance Teams** handling crypto compliance
- **Blockchain Auditors** automating report generation
- **Compliance Officers** tracking regulatory requirements

## 🛠 Key Integrations

- **Alchemy API** - Ethereum transaction monitoring
- **Solana RPC** - Native Solana network access
- **APITemplate.io** - Professional PDF report generation
- **Google Drive** - Secure cloud document storage
- **Notion** - Comprehensive audit database
- **Email/SMTP** - Multi-recipient notification system
- **Etherscan/Solscan** - Smart contract verification

## ⚡ Technical Highlights

- **10 Optimized Nodes** with parallel processing capabilities
- **Sub-30 Second Processing** for complete audit cycles
- **Enterprise Security** with credential management
- **Error Handling** with automatic retry mechanisms
- **Scalable Architecture** supporting 1000+ transactions/hour
- **Risk Scoring Algorithm** with customizable parameters

## 📊 Business Impact

- **80% Cost Reduction** in manual audit processes
- **95% Error Elimination** through automation
- **100% Compliance Coverage** with immutable audit trails
- **70% Time Savings** for finance teams

## 🔧 Setup Requirements

Before using this workflow, ensure you have:

- Alchemy API key for Ethereum monitoring
- APITemplate.io account with an audit report template
- Google Drive service account with folder permissions
- Notion workspace with a configured audit database
- SMTP credentials for email notifications
- Etherscan API key for contract verification

## 📈 Use Cases

- **Transaction Compliance Monitoring**: Automatic flagging of high-risk transactions
- **Regulatory Reporting**: Scheduled audit report generation for authorities
- **Internal Auditing**: Complete documentation for financial reviews
- **Risk Management**: Real-time scoring and alert systems
- **Multi-Chain Portfolio Tracking**: Unified reporting across blockchain networks

## 🎯 Why Choose This Workflow

This isn't just another blockchain monitor - it's a complete document management ecosystem that transforms raw blockchain data into professional, compliant documentation while maintaining enterprise-grade security and scalability. Perfect for organizations serious about blockchain compliance and audit trail management! 🚀

## 🔄 Workflow Process

1. **Webhook Trigger** receives the blockchain event
2. **Parallel Monitoring** queries the Ethereum & Solana networks
3. **AI Processing** analyzes transaction data and calculates risk
4. **Document Generation** creates professional PDF audit reports
5. **Multi-Channel Distribution** uploads to Drive, logs in Notion, sends notifications
6. **Verification & Response** confirms all processes completed successfully

Ready to automate your blockchain compliance? Import this workflow and transform your audit processes today! ✨
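A 0-100 risk scorer like the one described above could look like the sketch below. The signals, weights, and field names are illustrative assumptions for demonstration, not the template's actual algorithm.

```javascript
// Hypothetical 0-100 transaction risk scorer. All signals and weights
// are assumptions; the template's "customizable parameters" would live
// in the thresholds below.
function scoreTransaction(tx) {
  let score = 0;
  const valueEth = Number(tx.valueEth || 0);
  if (valueEth > 100) score += 40;        // large transfer
  else if (valueEth > 10) score += 20;
  if (tx.toIsNewAddress) score += 20;     // first-seen counterparty
  if (tx.toIsContract && !tx.contractVerified) score += 25; // unverified contract
  if (tx.crossChain) score += 15;         // bridged / cross-chain movement
  return Math.min(score, 100);            // clamp to the 0-100 scale
}
```

A workflow like this would typically flag anything above a configurable threshold (say 70) for the high-risk alert path.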
by Sabrina Ramonov 🍄
## Description

Fully automated pipeline: you send an email to yourself with a rough idea (subject contains "thread"), n8n's Gmail trigger picks it up, OpenAI ChatGPT rewrites it using a viral-thread template, and Blotato posts the long-form thread to X/Twitter, Bluesky, and Meta Threads (optionally scheduled or with images/videos). The template is easily extensible to other social platforms.

## Who Is This For?

Digital creators, content marketers, social media managers, agencies, entrepreneurs, and influencers who want fast, automated long-form thread posting.

## 📄 Documentation

Full Step-by-Step Tutorial

## How It Works

1. **Trigger: Gmail.** Connect your Gmail account. n8n monitors emails sent from you and filters for subjects containing the word "thread".
2. **AI Thread Writer: OpenAI ChatGPT.** Connect your OpenAI account. Prompt ChatGPT to clean up your draft and format a long-form viral thread.
3. **Publish to Social Media via Blotato.** Connect your Blotato account and choose social accounts (X/Twitter, Threads, Bluesky). Schedule or post immediately. Supports optional image/video URLs via a mediaUrls array (publicly accessible URLs).

Example email to trigger the workflow:

> Email Subject: thread
> Email Body: I'm obsessed with voice AI apps. Super Whisper is my current favorite because it runs locally and keeps my voice data private. I talk to it instead of typing. Way faster.

## Setup & Required Accounts

- Gmail account (used as trigger); n8n Gmail OAuth doc: https://docs.n8n.io/integrations/builtin/credentials/google/oauth-single-service
- OpenAI Platform account (access to ChatGPT)
- Blotato account: https://blotato.com
  - Generate a Blotato API key (required for posting): Settings > API > Generate API Key (paid feature only)
- n8n: ensure "Verified Community Nodes" is enabled in your n8n Admin Panel, then install the "Blotato" community node and create Blotato credentials

## Optional: Media & Style Tweaks

- Attach images/videos: insert publicly accessible URLs into the mediaUrls array (advanced).
- To emulate a specific tone/structure, give ChatGPT examples of your favorite viral threads, or replace the example viral-thread prompt with your preferred example.
- Voice-to-text tip: record ideas (e.g., Superwhispr) and send the transcript by email; ChatGPT will clean it up.

## Tips & Tricks

- During testing, use "Scheduled Time" in Blotato instead of immediate posting so you can preview before going live.
- Start with a single social platform while testing.
- If your script is long or includes media, processing may take longer.
- Many users prefer speaking their ideas (voice notes) and letting AI edit; it's faster than typing.

## Troubleshooting

- Check your Blotato API Dashboard to inspect each request, response, and error.
- Confirm API key validity, n8n node credentials, and that emails have a subject containing "thread".

## Need Help?

In the Blotato web app, click the orange support button in the bottom right to access Blotato support.
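Assembling the post payload with the mediaUrls array mentioned above can be sketched as follows. Only mediaUrls is named by the template; the other field names here are assumptions for illustration, so check the Blotato node's own parameter schema before relying on them.

```javascript
// Illustrative payload builder for the Blotato posting step.
// Only "mediaUrls" is confirmed by the template; "text" is assumed.
function buildPost(threadText, mediaUrls = []) {
  // Blotato requires publicly accessible URLs, so drop anything local
  const publicUrls = mediaUrls.filter((u) => /^https?:\/\//.test(u));
  return {
    text: threadText.trim(),
    mediaUrls: publicUrls,
  };
}
```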
by Fahmi Fahreza
# Match Resumes to Jobs Automatically with Gemini AI and Decodo Scraping

Sign up for Decodo HERE for a discount.

This automation intelligently connects candidate profiles to job opportunities. It takes an intake form with a short summary, resume link, and optional LinkedIn profile, then enriches the data using Decodo and Gemini. The workflow analyzes skills, experience, and role relevance, ranks top matches, and emails a polished HTML report directly to your inbox, saving hours of manual review and matching effort.

## Who's it for?

This template is designed for recruiters, hiring managers, and talent operations teams who handle large candidate volumes and want faster, more accurate shortlisting. It's also helpful for job seekers or career coaches who want to identify high-fit openings automatically using structured AI analysis.

## How it works

1. Receive an intake form containing a candidate's resume, summary, and LinkedIn URL.
2. Parse and summarize the resume with Gemini for core skills and experience.
3. Enrich the data using Decodo scraping to gather extra profile details.
4. Merge insights and rank job matches from Decodo's job data.
5. Generate an HTML shortlist and email it automatically through Gmail.

## How to set up

1. Connect credentials for Gmail, Google Gemini, and Decodo.
2. Update the Webhook path and test your form connection.
3. Customize variables such as location or role preferences.
4. Enable "Send as HTML" in the Gmail node for clean reports.
5. Publish as self-hosted if community nodes are included.
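The "rank job matches" step can be approximated with a simple skill-overlap score, sketched below. In the template the ranking itself is AI-assisted; this scoring rule, and the candidate/job shapes, are assumptions for illustration.

```javascript
// Illustrative skill-overlap ranker: score each job by the fraction of
// its listed requirements covered by the candidate's extracted skills.
function rankJobs(candidateSkills, jobs, topN = 5) {
  const skills = new Set(candidateSkills.map((s) => s.toLowerCase()));
  return jobs
    .map((job) => {
      const required = job.requirements.map((r) => r.toLowerCase());
      const matched = required.filter((r) => skills.has(r));
      return {
        ...job,
        matchScore: required.length ? matched.length / required.length : 0,
      };
    })
    .sort((a, b) => b.matchScore - a.matchScore)
    .slice(0, topN); // top matches feed the HTML shortlist
}
```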
by InfyOm Technologies
## ✅ What problem does this workflow solve?

Manually entering bank statements into QuickBooks is one of the most time-consuming and error-prone accounting tasks. Accountants often spend hours converting PDF bank statements into individual income and expense entries, risking missed transactions, incorrect categorization, and inconsistencies.

This workflow fully automates the end-to-end process: from uploading a bank statement PDF (even a password-protected one) to creating accurate Sales Receipts and Expenses directly inside QuickBooks, using AI and n8n.

## ⚙️ What does this workflow do?

- Accepts bank statement PDFs via a secure web form
- Decrypts and extracts text from password-protected PDFs
- Uses AI to extract structured transactions from raw statement text
- Validates AI output against a strict JSON schema
- Processes each transaction individually for reliability
- Automatically routes transactions based on type:
  - Credits → Income (Sales Receipts)
  - Debits → Expenses
- Intelligently creates missing QuickBooks entities: customers, vendors, items, expense categories
- Posts transactions directly into QuickBooks
- Eliminates manual accounting entry completely

## 🧠 How It Works – End-to-End Flow

1️⃣ **Secure Bank Statement Upload.** A user uploads a bank statement PDF (normal or password-protected) using an n8n Form Trigger.

2️⃣ **PDF Decryption & Text Extraction.** The workflow unlocks the PDF (if password-protected) and extracts the full statement text using the Extract PDF Text node.

3️⃣ **AI-Powered Transaction Extraction.** An AI Agent reads the raw bank statement text and extracts every transaction with high precision: transaction type (credit/debit), date (YYYY-MM-DD), amount, description, reference number, and payee/counterparty.

4️⃣ **Strict JSON Validation.** AI output is validated using a Structured Output Parser to ensure no malformed data, schema-safe transactions, and reliable downstream processing.

5️⃣ **Transaction Processing Loop.** Each transaction is processed individually using batching and loop control to guarantee accuracy.

6️⃣ **Smart Routing: Credit vs Debit.** A switch node routes transactions automatically: **Credits** → income flow, **Debits** → expense flow.

💰 **Credit Path – Income Automation.** For every credit transaction:

- Checks if a matching QuickBooks item exists
- Creates missing service items automatically
- Finds or creates the customer
- Builds a Sales Receipt payload
- Posts the transaction into QuickBooks as income

💸 **Debit Path – Expense Automation.** For every debit transaction:

- Searches for the vendor by payee name
- Creates the vendor if missing
- Loads expense categories from the Chart of Accounts
- Auto-maps transactions to the correct category using keyword logic
- Builds a Purchase (Expense) payload
- Posts the expense into QuickBooks

## 🧠 Built-In QuickBooks Intelligence

This workflow intelligently handles:

- Duplicate prevention
- Missing customer/vendor creation
- Automatic item mapping
- Category resolution using transaction descriptions
- Consistent accounting structure across all entries

## 📊 Results & Benefits

- ✅ Zero manual bank statement entry
- ✅ Works with password-protected PDFs
- ✅ Handles both income and expenses automatically
- ✅ Creates clean, structured QuickBooks records
- ✅ Saves dozens of accounting hours every month
- ✅ Reduces human error and reconciliation issues

## 🔧 Setup Requirements

1. Connect your QuickBooks Online account (Sandbox or Production)
2. Add OpenRouter / AI model credentials for transaction extraction
3. Update the PDF password (if required) in the extraction node
4. Replace company_id in the QuickBooks API endpoints
5. Verify QuickBooks account IDs (bank, income, expense)
6. Test with a sample bank statement PDF

## 👤 Who is this for?

This workflow is ideal for:

- 📒 Accountants & bookkeeping firms
- 🏢 Businesses managing frequent bank statements
- 💼 Finance teams using QuickBooks Online
- 🤖 Automation-first accounting systems
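The schema validation (step 4️⃣) and credit/debit routing (step 6️⃣) described above can be sketched as follows. The field names mirror the extraction list in step 3️⃣; the validation logic itself is an illustrative stand-in for the Structured Output Parser.

```javascript
// Sketch of the per-transaction schema check and the switch-node routing.
// Validation rules are illustrative; the template uses a Structured
// Output Parser for this.
function validateTransaction(tx) {
  const errors = [];
  if (!["credit", "debit"].includes(tx.type)) errors.push("type must be credit or debit");
  if (!/^\d{4}-\d{2}-\d{2}$/.test(tx.date || "")) errors.push("date must be YYYY-MM-DD");
  if (typeof tx.amount !== "number" || tx.amount <= 0) errors.push("amount must be a positive number");
  if (!tx.description) errors.push("description is required");
  return errors; // empty array means the transaction is schema-safe
}

function routeTransaction(tx) {
  // Mirrors the switch node: credits → income flow, debits → expense flow
  return tx.type === "credit" ? "income" : "expense";
}
```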
by Weiser22
# Shopify Multilingual Product Copy with n8n & Gemini 2.5 Flash-Lite

Created by Weiser22 · Last update 2025-09-02
Categories: E-commerce, Product Content, Translation, Computer Vision

## Description

Generate language-specific Shopify product copy (ES, DE, EN, FR, IT, PT) from each product's main image and metadata. The workflow performs a vision analysis to extract objective, verifiable details, then produces product names, descriptions, and handles per language, and stores the results in Google Sheets for review or publishing.

## Good to know

- **Model:** models/gemini-2.5-flash-lite (supports image input). Confirm pricing/limits in your account before scaling.
- **Image requirement:** products should have images[0].src; add a fallback if some products lack a primary image.
- **Sheets mapping:** the sheet node uses Auto-map; ensure your matching column aligns with the field you emit (id vs product_id).
- **Strict output:** the Agent enforces a multilingual JSON contract (es, de, en, fr, it, pt), each with shopify_product_name, shopify_description, handle.

## How it works

1. **Manual Trigger:** start a test run on demand.
2. **Get many products (Shopify):** fetch products and their images.
3. **Analyze image (Gemini Vision):** send images[0].src with an objective, 3–5 sentence prompt.
4. **AI Agent (Gemini Chat):** merge Shopify fields + vision text under anti-hallucination rules and a strict JSON schema.
5. **Structured Output Parser:** validates the exact JSON shape.
6. **Expand Languages & Sanitize (Code):** split into 6 items and normalize handles/HTML content as needed.
7. **Append row in sheet (Google Sheets):** add one row per language to your spreadsheet.

## Requirements

- Shopify Access Token with product read permissions.
- Google AI Studio (Gemini) API key for the Vision + Chat Model nodes.
- Google Sheets credentials (OAuth or Service Account) with access to the target spreadsheet.

## How to use

1. Connect credentials: Shopify, Gemini (same key for Vision and Chat), and Google Sheets.
2. Configure nodes:
   - Get many products: adjust limit/filters.
   - Analyze image: verify ={{ $json.images[0].src }} resolves to a public image URL.
   - AI Agent & Parser: keep the strict JSON contract as provided.
   - Code (Expand & Sanitize): emits product_id, lang, handle, shopify_product_name, shopify_description, base_handle_es.
   - Google Sheets (Append): set documentId and tab name; confirm the matching column.
3. Run a test: execute the workflow and confirm six rows per product (one per language) appear in the sheet.

## Data contract (Agent output)

```json
{
  "es": {"shopify_product_name": "", "shopify_description": "", "handle": ""},
  "de": {"shopify_product_name": "", "shopify_description": "", "handle": ""},
  "en": {"shopify_product_name": "", "shopify_description": "", "handle": ""},
  "fr": {"shopify_product_name": "", "shopify_description": "", "handle": ""},
  "it": {"shopify_product_name": "", "shopify_description": "", "handle": ""},
  "pt": {"shopify_product_name": "", "shopify_description": "", "handle": ""}
}
```

## Customising this workflow

- **Publish to Shopify:** after review in Sheets, add a product.update step to write finalized copy/handles.
- **Handle policy:** tweak slug rules (diacritics, separators, max length) in the Code node to match store conventions.
- **No-image fallback:** add an IF/Switch to skip vision when images[0].src is missing and generate copy from title + body only.
- **Tone/length:** adjust temperature and token limits on the Chat Model for brand fit.

## Troubleshooting

- **No rows in Sheets:** confirm spreadsheet ID, tab name, Auto-map status, and that the matching column matches your emitted field.
- **Vision errors:** ensure images[0].src is reachable.
- **Parser failures:** the Agent must return **bare JSON** with the six root keys and three fields per language, with no extra text.
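The handle-normalization logic in the Expand Languages & Sanitize Code node can be sketched like this: strip diacritics, lower-case, and slugify. The exact slug rules in the template may differ, and the 70-character cap is an assumed store convention.

```javascript
// Illustrative handle slugifier: diacritics stripped, lower-cased,
// non-alphanumerics collapsed into single separators. Max length is
// an assumption; tune to your store's handle policy.
function toHandle(name, maxLength = 70) {
  return name
    .normalize("NFD")
    .replace(/[\u0300-\u036f]/g, "")   // strip combining diacritics (é → e)
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")       // runs of other chars become one dash
    .replace(/^-+|-+$/g, "")           // trim leading/trailing dashes
    .slice(0, maxLength);
}
```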
by Emilio Loewenstein
This workflow automates customer email support by combining Gmail, AI classification, and a knowledge base to provide instant, accurate, and friendly responses. It's designed for businesses that want to improve customer satisfaction while reducing manual workload.

## 🚀 How it Works

1. **Gmail Trigger:** The workflow listens for new incoming Gmail messages.
2. **Text Classification:** Each email is classified using AI as either Customer Support or Other. If it's Other, the workflow stops. If it's Customer Support, the email continues to the AI agent.
3. **AI Agent with Knowledge Base:** The AI agent:
   - Reads the customer's message.
   - Searches the Pinecone knowledge base for FAQs and policies.
   - Generates a helpful, polite, and detailed reply using an OpenRouter model.
   - Signs off as Mrs. Helpful from Tech Haven Solutions.
4. **Reply to Gmail:** The drafted email is automatically sent back to the customer.

## 💡 Value

- ✅ Save Time: no more manual triaging and drafting of replies.
- ✅ Consistency: every answer is based on your own FAQs/policies.
- ✅ Customer Satisfaction: fast, friendly, and accurate responses 24/7.
- ✅ Scalable: handle higher email volume without scaling headcount.

## 🔑 Credentials Needed

To use this workflow, connect the following accounts:

- **Gmail OAuth2** → for receiving and sending emails.
- **OpenRouter API** → for text classification and AI-generated replies.
- **OpenAI API** → for embeddings (to connect FAQs with AI).
- **Pinecone API** → for storing and retrieving knowledge base content.

## 🛠 Example Use Case

A customer writes:

> "Hi, I placed an order last week but haven't received a shipping confirmation yet. Can you check the status?"

The workflow will:

1. Detect that it's a support-related email.
2. Retrieve order policy information from the knowledge base.
3. Generate a friendly response explaining order tracking steps.
4. Automatically reply to the customer in Gmail.

## ⚡️ Setup Instructions

1. Import this workflow into your n8n instance.
2. Connect your Gmail, OpenRouter, OpenAI, and Pinecone credentials.
3. Populate your Pinecone knowledge base with FAQs and policies.
4. Activate the workflow.

From now on, all support-related emails will be automatically answered by your AI-powered support agent.
by Cheng Siong Chin
## How It Works

This workflow automates AI decision governance by tracing, assessing, and auditing automated decisions for risk and compliance. Designed for AI governance officers, compliance teams, and regulated industries, it addresses the critical need for explainability and accountability in AI-driven decisions.

A schedule trigger initiates a simulated decision request, which is processed by a Decision Trace Agent to extract metadata. A Governance Agent then delegates to Risk Assessment and Compliance Checker sub-agents. Decisions are routed by risk level: high-risk cases trigger Slack alerts and are stored separately, while all outcomes are merged into a governance report sent via email, with a full audit trail and explainability report stored for regulatory review.

## Setup Steps

1. Set the schedule trigger interval to match your governance audit frequency.
2. Add OpenAI API credentials to all OpenAI Model nodes.
3. Configure Slack credentials and set the high-risk alert channel.
4. Add Gmail/SMTP credentials to the Send Governance Report node.
5. Replace the simulated decision request with your live AI system webhook.

## Prerequisites

- Slack workspace with a bot token
- Gmail or SMTP credentials
- Google Sheets or a database for audit storage

## Use Cases

- Regulatory compliance auditing for AI-driven loan or insurance decisions
- Automated fairness and bias detection in HR or admissions systems

## Customization

- Swap the simulated input with a live AI system API or decision log feed
- Add sub-agents for fairness, bias, or sector-specific compliance checks

## Benefits

- Automates end-to-end AI decision auditing on a schedule
- Ensures high-risk decisions are flagged and stored instantly
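The risk-level routing described above can be sketched as a small decision function: high-risk cases get the Slack alert and separate storage, and every outcome flows into the merged governance report. The threshold value and field names are assumptions for illustration.

```javascript
// Illustrative routing for assessed decisions. The 0.7 threshold and
// the riskScore field name are assumptions, not template internals.
function routeDecision(decision, highRiskThreshold = 0.7) {
  const actions = ["append_to_report"]; // every outcome reaches the merged report
  if (decision.riskScore >= highRiskThreshold) {
    actions.push("slack_alert", "store_high_risk");
  }
  return actions;
}
```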
by Krishna Sharma
# 📄 Smart Lead Capture, Scoring & Slack Alerts

This workflow captures new leads from Typeform, checks for duplicates in HubSpot CRM, enriches and scores them, assigns priority tiers (Cold, Warm, Hot), and instantly notifies your sales team in Slack.

## 🔧 How It Works

1. **Typeform Trigger** → Monitors form submissions and passes lead details into the workflow.
2. **HubSpot Deduplication** → Searches HubSpot by email before creating a new record.
3. **Conditional Routing** →
   - If no match → Creates a new contact in HubSpot.
   - If a match is found → Updates the existing contact with fresh data.
4. **Lead Scoring (Function Node)** → Custom JavaScript assigns a score based on your rules (e.g., company email, job title, engagement signals, enrichment data).
5. **Tier Assignment** → Categorizes the lead as ❄️ Cold, 🌡 Warm, or 🔥 Hot based on score thresholds.
6. **Slack Notification** → Sends formatted lead alerts to a dedicated sales channel with priority indicators.

## 👤 Who Is This For?

- Sales teams who need to prioritize hot leads in real time.
- Marketing teams running inbound lead capture campaigns with Typeform.
- RevOps teams that want custom scoring beyond HubSpot defaults.
- Founders/SMBs looking to tighten the lead-to-revenue pipeline with automation.

## 💡 Use Case / Problem Solved

- ❌ Duplicate contacts clogging HubSpot CRM.
- ❌ Manual lead triage slows down response time.
- ❌ HubSpot's default scoring is rigid.
- ✅ Automates lead creation + scoring + notification in one flow.
- ✅ Sales teams get immediate Slack alerts with context to act fast.

## ⚙️ What This Workflow Does

- Captures lead data directly from Typeform.
- Cleans & deduplicates contacts before pushing to HubSpot CRM.
- Scores and categorizes leads via custom logic.
- Sends structured lead alerts to Slack, tagged by priority.
- Provides a scalable foundation you can extend with data enrichment (e.g., Clearbit, Apollo).
## 🛠️ Setup Instructions

### 🔑 Prerequisites

- Typeform account with API access → Typeform Developer Docs
- HubSpot CRM account with API key or OAuth → HubSpot API Docs
- Slack workspace & API access → Slack API Docs
- (Optional) n8n automation platform to build & run → n8n Hub

### 📝 Steps to Configure

1. **Typeform Node (Trigger)**
   - Connect your Typeform account in n8n.
   - Select the form to track submissions.
   - Fields typically include: first name, last name, email, company, phone.
2. **HubSpot Node (Search Contact)**
   - Configure a search by email.
   - Route outcomes: Not Found → Create Contact; Found → Update Contact.
3. **HubSpot Node (Create/Update Contact)**
   - Map Typeform fields into HubSpot (email, name, phone, company).
   - Ensure you capture both standard and custom properties.
4. **Function Node (Lead Scoring)** with example JavaScript:

   ```javascript
   // Simple lead scoring example
   const email = $json.email || "";
   let score = 0;
   if (email.endsWith("@company.com")) score += 30;
   if ($json.company && $json.company.length > 2) score += 20;
   if ($json.phone) score += 10;

   let tier = "❄️ Cold";
   if (score >= 60) tier = "🔥 Hot";
   else if (score >= 30) tier = "🌡 Warm";

   return { ...$json, leadScore: score, leadTier: tier };
   ```

   Customize the rules based on your GTM strategy. Reference → n8n Function Node Docs

5. **Slack Node (Send Message)** with an example message template:

   ```
   🚀 New Lead Alert!
   👤 {{ $json.firstname }} {{ $json.lastname }}
   📧 {{ $json.email }} | 🏢 {{ $json.company }}
   📊 Score: {{ $json.leadScore }} — {{ $json.leadTier }}
   ```

   Send to a dedicated #sales-leads channel. Reference → Slack Node in n8n

## 📌 Notes & Extensions

- 🔄 Add enrichment with Clearbit or Apollo.io before scoring.
- 📊 Use HubSpot workflows to trigger nurturing campaigns for ❄️ Cold leads.
- ⏱ For 🔥 Hot leads, auto-assign to an SDR using HubSpot deal automation.
- 🧩 Export data to Google Sheets or Airtable for analytics.