by Deborah
Use n8n to bring data from any API to your AI. This workflow uses the Chat Trigger to provide the chat interface and the Custom n8n Workflow Tool to call a second workflow that calls the API. The second workflow uses AI functionality to refine the API request based on the user's query, then makes the API call and returns the response to the main workflow. This workflow is used in Advanced AI examples | Call an API to fetch data in the documentation.

To use this workflow:
1. Load it into your n8n instance.
2. Add your credentials as prompted by the notes.

Requires n8n 1.28.0 or above.
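To make the hand-off concrete, here is a minimal sketch of what the second workflow's request step could look like in an n8n Code node. It assumes a preceding AI node has produced an `endpoint` and `params` object from the user's query; both field names are illustrative, not part of the template itself.

```javascript
// Hypothetical request step in the second workflow's Code node.
// Assumes a prior AI node produced { endpoint, params } from the user's query.
const { endpoint, params } = $input.first().json;

// Build the query string from the AI-refined parameters
const query = new URLSearchParams(params).toString();

// Call the target API and hand the response back to the main workflow
const response = await this.helpers.httpRequest({
  method: 'GET',
  url: `${endpoint}?${query}`,
  json: true,
});

return [{ json: response }];
```

In the actual template an HTTP Request node plays this role; the sketch just shows the shape of the data passing through it.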
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically tracks regional sentiment across social media and news outlets, giving you a real-time pulse on how people in a specific area feel about your brand or topic.

Overview
The automation queries Twitter, Reddit, and major news APIs filtered by geolocation. Bright Data handles location-specific scraping where APIs are limited. OpenAI performs sentiment and keyword extraction, aggregating scores into a daily report stored in Google Sheets and visualized in Data Studio (see the aggregation sketch below).

Tools Used
- **n8n** – Coordinates all steps
- **Bright Data** – Collects geo-targeted data beyond API limits
- **OpenAI** – Runs sentiment analysis and topic modeling
- **Google Sheets** – Houses cleaned sentiment metrics
- **Data Studio / Looker** – Optional dashboard for visualization

How to Install
1. Import the Workflow into n8n with the provided .json.
2. Configure Bright Data credentials.
3. Set up your OpenAI API key.
4. Connect Google Sheets and create a destination spreadsheet.
5. Customize Regions & Keywords in the Start node.

Use Cases
- **Brand Monitoring**: Measure public opinion in target markets.
- **Political Campaigns**: Gauge voter sentiment by district.
- **Market Entry**: Understand regional attitudes before launching.
- **Crisis Management**: Detect negative spikes early.

Connect with Me
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #sentimentanalysis #geolocation #brightdata #openai #sociallistening #n8nworkflow #nocode #brandmonitoring
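As an illustration of the aggregation step mentioned in the Overview, a Code node along these lines could roll per-post sentiment scores into one daily row per region. The input shape (`{ region, sentiment }`) is an assumption for the sketch, not the template's actual schema.

```javascript
// Hypothetical aggregation: collapse per-post sentiment scores
// (assumed shape: { region, sentiment: -1..1 }) into daily rows per region.
const totals = {};
for (const item of $input.all()) {
  const { region, sentiment } = item.json;
  totals[region] = totals[region] || { region, sum: 0, count: 0 };
  totals[region].sum += sentiment;
  totals[region].count += 1;
}

// One Google Sheets row per region per day
return Object.values(totals).map((t) => ({
  json: {
    date: new Date().toISOString().slice(0, 10),
    region: t.region,
    avgSentiment: +(t.sum / t.count).toFixed(3),
    mentions: t.count,
  },
}));
```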
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically scrapes local business directories (Yelp, Yellow Pages, Google Maps, etc.) to build a structured database of prospects. Stop copying listings by hand—get fresh leads delivered straight to Google Sheets.

Overview
Using Bright Data, the automation fetches business names, contact details, ratings, and categories for a given city or ZIP code. OpenAI cleans and normalizes the data, while duplicate detection ensures each business appears only once (see the sketch below). The result is emailed as a CSV and stored in Sheets for easy filtering.

Tools Used
- **n8n** – Workflow orchestration
- **Bright Data** – Handles large-scale directory scraping
- **OpenAI** – Performs entity cleanup and deduplication
- **Google Sheets** – Houses the resulting lead list
- **Gmail** – Sends the CSV file to your inbox

How to Install
1. Import the Workflow: Load the .json into n8n.
2. Configure Bright Data: Add your credentials.
3. Set Up OpenAI: Enter your API key.
4. Connect Google Sheets & Gmail: Authorize both integrations.
5. Customize Locations & Categories: Adjust parameters in the Start node.

Use Cases
- **Local Lead Generation**: Build outreach lists for agencies or SaaS.
- **Market Research**: Analyze the density of businesses in a region.
- **Franchise Expansion**: Identify potential partners within a territory.
- **Startup Sales**: Discover SMBs that match your ICP.

Connect with Me
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #webscraping #localbusiness #brightdata #leadgeneration #n8nworkflow #nocode #businessdirectories #openai
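The duplicate-detection step referenced in the Overview could be as simple as the following Code node sketch, which keys each business on a normalized name-plus-phone pair. The `name` and `phone` field names are assumptions; adjust them to whatever your Bright Data scrape returns.

```javascript
// Hypothetical dedup step: keep one item per business, keyed on a
// normalized name + phone pair (field names are assumptions).
const seen = new Set();
const unique = [];

for (const item of $input.all()) {
  const { name = '', phone = '' } = item.json;
  // Lowercase the name and strip non-digits from the phone before keying
  const key = `${name.trim().toLowerCase()}|${phone.replace(/\D/g, '')}`;
  if (!seen.has(key)) {
    seen.add(key);
    unique.push(item);
  }
}

return unique;
```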
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically discovers and collects information about events and attendee data from event platforms. It saves you time by eliminating the need to manually browse through event listings and provides a centralized database of event information including categories, venues, and attendee feedback.

Overview
This workflow automatically scrapes event data from 10times.com and other event platforms to extract categories, featured events, attendee feedback, and venue information. It uses Bright Data to access event websites without being blocked and AI to intelligently parse event data into a structured format for storage in Google Sheets (a validation sketch follows below).

Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping event websites without being blocked
- **OpenAI**: AI agent for intelligent event data extraction and parsing
- **Google Sheets**: For storing and organizing event information

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance.
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node.
3. Set Up OpenAI: Configure your OpenAI API credentials.
4. Configure Google Sheets: Connect your Google Sheets account and specify the target spreadsheet.
5. Customize: Adjust the event platform URLs and event criteria you want to monitor.

Use Cases
- **Event Planners**: Monitor competing events and industry trends
- **Marketing Teams**: Identify events for sponsorship and networking opportunities
- **Business Development**: Find relevant events for lead generation and partnerships
- **Market Research**: Track event attendance patterns and industry insights

Connect with Me
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #events #eventdiscovery #brightdata #webscraping #eventplanning #eventscraping #attendeedata #eventmarketing #n8nworkflow #workflow #nocode #eventautomation #eventmonitoring #eventresearch #10times #eventintelligence #venuedata #eventfeedback #eventtracking #eventcalendar #eventanalytics #businessevents #eventorganizer #eventtech #eventindustry #eventcollection #networkingevents #conferencedata
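Since the AI agent is asked to return structured event data, a small validation guard between it and Google Sheets avoids writing malformed rows. This is a hedged sketch: the `output` field name and the event schema are assumptions about the AI node's response.

```javascript
// Hypothetical guard between the AI parser and the Google Sheets node:
// the model is asked for JSON, so validate it before appending rows.
const raw = $input.first().json.output ?? ''; // field name is an assumption

let events = [];
try {
  const parsed = JSON.parse(raw);
  events = Array.isArray(parsed) ? parsed : [parsed];
} catch (e) {
  // Surface malformed AI output instead of writing junk to the sheet
  throw new Error(`AI returned non-JSON event data: ${e.message}`);
}

// Map to the sheet's column layout, defaulting missing fields to ''
return events.map((ev) => ({
  json: {
    name: ev.name ?? '',
    category: ev.category ?? '',
    venue: ev.venue ?? '',
    date: ev.date ?? '',
  },
}));
```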
by Yaron Been
This workflow automatically analyzes website conversion funnels to identify optimization opportunities and track user journey performance. It saves you time by eliminating the need to manually analyze funnel metrics and provides detailed insights into conversion bottlenecks and improvement areas.

Overview
This workflow automatically scrapes website pages to analyze funnel elements including CTAs, tracking scripts, page structure, and conversion paths (see the scan sketch below). It uses Bright Data to access websites without restrictions and AI to intelligently extract funnel data, identify conversion elements, and provide optimization recommendations.

Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping website pages without being blocked
- **OpenAI**: AI agent for intelligent funnel analysis and optimization insights
- **Google Sheets**: For storing funnel analysis data and recommendations

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance.
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node.
3. Set Up OpenAI: Configure your OpenAI API credentials.
4. Configure Google Sheets: Connect your Google Sheets account and set up your funnel analysis spreadsheet.
5. Customize: Define target website URLs and funnel analysis parameters.

Use Cases
- **Conversion Optimization**: Identify and fix conversion funnel bottlenecks
- **UX Analysis**: Analyze user experience and journey optimization opportunities
- **Competitor Research**: Study competitor funnel strategies and implementations
- **A/B Testing**: Monitor funnel performance changes over time

Connect with Me
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #funnelanalysis #conversionoptimization #brightdata #webscraping #uxanalysis #n8nworkflow #workflow #nocode #websiteanalysis #funneloptimization #conversiontracking #userjourney #websiteoptimization #cro #digitalmarketing #funnelalyzer #websiteperformance #conversionanalytics #uxresearch #websitemetrics #funnelmonitoring #performanceanalysis #websiteinsights #conversionfunnel #userexperience #websiteaudit #funneltracking #optimizationanalysis
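For a sense of what the funnel-element extraction can look like before the AI pass, here is a naive first-pass scan in a Code node. It assumes the scraped page arrives as an `html` field from the Bright Data step and uses simple pattern matches rather than a full DOM parse; treat it as a sketch, not the template's actual logic.

```javascript
// Hypothetical funnel-element scan over scraped HTML (assumed to arrive
// as $json.html). Regex gives a first-pass signal, not a full DOM parse.
const html = $input.first().json.html ?? '';

// Count common call-to-action phrases and form tags
const ctas = (html.match(/sign up|buy now|get started|subscribe|add to cart/gi) || []).length;
const forms = (html.match(/<form[\s>]/gi) || []).length;

// Detect common tracking scripts
const hasGtm = /googletagmanager\.com/i.test(html);
const hasPixel = /connect\.facebook\.net/i.test(html);

return [{ json: { ctas, forms, hasGtm, hasPixel } }];
```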
by Akhil Varma Gadiraju
Workflow: HubSpot Contact Email Validation with Hunter.io

Overall Goal
This workflow retrieves contacts from HubSpot that have an email address but haven't yet had their email validated by Hunter. It then iterates through each of these contacts, uses Hunter.io to verify their email, updates the contact record in HubSpot with the validation status and date, and finally sends a summary email notification upon completion.

How it Works (Step-by-Step Breakdown)

Node: "When clicking ‘Test workflow’" (Manual Trigger)
- **Type**: n8n-nodes-base.manualTrigger
- **Purpose**: Start the workflow manually via the n8n interface.
- **Output**: Triggers workflow execution.

Node: "HubSpot" (HubSpot)
- **Type**: n8n-nodes-base.hubspot
- **Purpose**: Fetch contacts from HubSpot.
- **Configuration**: Authentication: App Token; Operation: Search for contacts; Return All: true; Filter Groups: Contact HAS_PROPERTY email, Contact NOT_HAS_PROPERTY hunter_email_validation_status
- **Output**: List of contact objects.

Node: "Loop Over Items" (SplitInBatches)
- **Type**: n8n-nodes-base.splitInBatches
- **Purpose**: Process each contact one by one.
- **Configuration**: Options > Reset: false
- **Output**: Output 1 to "Hunter"; Output 2 to "Send Email"

Node: "Hunter" (inside the loop)
- **Type**: n8n-nodes-base.hunter
- **Purpose**: Verify the email with Hunter.io.
- **Configuration**: Operation: Email Verifier; Email: {{ $json.properties.email }}

Node: "Add Hunter Details (Contact)" (HTTP Request, inside the loop)
- **Type**: n8n-nodes-base.httpRequest
- **Purpose**: Update the HubSpot contact.
- **Configuration**:
  - Method: PATCH
  - URL: https://api.hubapi.com/crm/v3/objects/contacts/{{ $('Loop Over Items').item.json.id }}
  - Headers: Content-Type: application/json
  - Body (JSON):
    { "properties": { "hunter_email_validation_status": "{{ $json.status }}", "hunter_verification_date": "{{ $now.format('yyyy-MM-dd') }}" } }

Node: "Wait" (inside the loop)
- **Type**: n8n-nodes-base.wait
- **Purpose**: Avoid API rate limits.
- **Configuration**: Wait for 1 second.

Node: "Replace Me" (NoOp, inside the loop)
- **Type**: n8n-nodes-base.noOp
- **Purpose**: Junction node to complete the loop.

Node: "Send Email" (after the loop completes)
- **Type**: n8n-nodes-base.emailSend
- **Purpose**: Send the summary notification.
- **Configuration**: From Email: test@gmail.com; To Email: akhilgadiraju@gmail.com; Subject: "Email Verification Completed for Your HubSpot Contacts"; HTML: formatted confirmation message

Sticky Notes
- "HubSpot": Create custom properties (hunter_email_validation_status, hunter_verification_date).
- "Add Hunter Details": Ensure field names match HubSpot properties.
- "Wait": Prevent API rate limits.

How to Customize It
- **Trigger**: Replace the Manual Trigger with a Schedule Trigger (Cron) for automation. Optionally use the HubSpot Trigger for new-contact events.
- **HubSpot Node**: Create matching custom properties. Adjust filters and returned properties as needed.
- **Hunter Node**: Minimal customization needed.
- **HTTP Request Node**: Update JSON property names if you rename them in HubSpot. Customize the date format as needed.
- **Wait Node**: Adjust the wait time to balance speed and API safety.
- **Email Node**: Customize email addresses, subject, and body. Add a dynamic contact count with a Set or Function node (see the sketch at the end of this description).
- **Error Handling**: Add Error Trigger nodes. Use If nodes inside the loop to act on certain statuses.

Use Cases
- Clean your email list.
- Enrich CRM data.
- Prepare verified lists for campaigns.
- Automate contact hygiene on a schedule.

Required Credentials
- **HubSpot App Token**: Used by the HubSpot node and the HTTP Request node. Create a Private App in HubSpot with the required scopes.
- **Hunter API**: Used by the Hunter node.
- **SMTP**: Used by the Email Send node. Configure host, port, username, and password.

Made with ❤️ using n8n by Akhil.
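For the dynamic contact count mentioned under the Email Node tip, a small Code node placed just before "Send Email" could look like the following. This is a hedged sketch rather than part of the exported workflow; it simply counts the items the loop emitted.

```javascript
// Hypothetical count step before "Send Email": report how many
// contacts the loop processed so the summary email can include it.
const processed = $('Loop Over Items').all().length;

return [{
  json: {
    processed,
    finishedAt: $now.toISO(), // $now is a Luxon DateTime in n8n
  },
}];
```

The email body can then reference `{{ $json.processed }}` in its template.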
by Akhil Varma Gadiraju
AI-Powered GitHub Commit Reviewer

Overview
Workflow Name: AI-Powered GitHub Commit Reviewer
Author: Akhil
Purpose: This n8n workflow triggers on a GitHub push event, fetches commit diffs, formats them into HTML, runs an AI-powered code review using a Groq LLM, and sends a detailed review via email.

How It Works (Step-by-Step)

1. GitHub Trigger Node
- **Type**: n8n-nodes-base.githubTrigger
- **Purpose**: Initiates the workflow on GitHub push events.
- **Repo**: akhilv77/relevance
- **Output**: JSON with commit and repo details.

2. Parser Node
- **Type**: n8n-nodes-base.set
- **Purpose**: Extracts key info (repo ID, name, commit SHA, file changes).

3. HTTP Request Node
- **Type**: n8n-nodes-base.httpRequest
- **Purpose**: Fetches commit diff details using the GitHub API.
- **Auth**: GitHub OAuth2 API.

4. Code (HTML Formatter) Node
- **Type**: n8n-nodes-base.code
- **Purpose**: Formats commit info and diffs into styled HTML (see the sketch at the end of this description).
- **Output**: HTML report of commit details.

5. Groq Chat Model Node
- **Type**: @n8n/n8n-nodes-langchain.lmChatGroq
- **Purpose**: Provides the AI model (llama-3.1-8b-instant).

6. Simple Memory Node
- **Type**: @n8n/n8n-nodes-langchain.memoryBufferWindow
- **Purpose**: Maintains memory context for the AI agent.

7. AI Agent Node
- **Type**: @n8n/n8n-nodes-langchain.agent
- **Purpose**: Executes the AI-based code review.
- **Prompt**: Reviews for bugs, style, grammar, and security. Outputs styled HTML.

8. Output Parser Node
- **Type**: n8n-nodes-base.code
- **Purpose**: Combines the commit HTML with the AI review into one HTML block.

9. Gmail Node
- **Type**: n8n-nodes-base.gmail
- **Purpose**: Sends the review report via email.
- **Recipient**: akhilgadiraju@gmail.com

10. End Workflow Node
- **Type**: n8n-nodes-base.noOp
- **Purpose**: Marks the end.

Customization Tips
- **GitHub Trigger**: Change the repo/owner or trigger events.
- **HTTP Request**: Modify the endpoint to get specific data.
- **AI Agent**: Update the prompt to focus on different review aspects.
- **Groq Model**: Swap for other supported LLMs if needed.
- **Memory**: Use a dynamic session key for per-commit reviews.
- **Email**: Change the recipient or email styling.

Error Handling
Use Error Trigger nodes to handle failures in:
- GitHub API requests
- LLM generation
- Email delivery

Use Cases
- Instant AI-powered feedback on code pushes.
- Pre-human review suggestions.
- Security and standards enforcement.
- Developer onboarding assistance.

Required Credentials

| Credential | Used By | Notes |
|-----------|---------|-------|
| GitHub API (ID PSygiwMjdjFDImYb) | GitHub Trigger | PAT with repo and admin:repo_hook |
| GitHub OAuth2 API | HTTP Request | OAuth2 token with repo scope |
| Groq - Akhil (ID HJl5cdJzjhf727zW) | Groq Chat Model | API Key from GroqCloud |
| Gmail OAuth2 - Akhil (ID wqFUFuFpF5eRAp4E) | Gmail | Gmail OAuth2 for sending email |

Final Note
Made with ❤️ using n8n by Akhil.
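As a rough illustration of step 4, the HTML formatter could look like the Code node below. It assumes the standard GitHub commit API response shape (`sha`, `commit.message`, and a `files` array with per-file `additions`/`deletions`); the styling in the real workflow is richer than this sketch.

```javascript
// Minimal sketch of the HTML formatter step. Assumes the HTTP Request
// node returned a GitHub commit object (sha, commit.message, files[]).
const commit = $input.first().json;

// One table row per changed file
const fileRows = (commit.files || [])
  .map((f) => `<tr><td>${f.filename}</td><td>+${f.additions}</td><td>-${f.deletions}</td></tr>`)
  .join('');

const html = `
  <h2>Commit ${commit.sha?.slice(0, 7) ?? ''}</h2>
  <p>${commit.commit?.message ?? ''}</p>
  <table border="1" cellpadding="4">
    <tr><th>File</th><th>Added</th><th>Removed</th></tr>
    ${fileRows}
  </table>`;

return [{ json: { html } }];
```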
by Rajeet Nair
Overview
This workflow automates customer support ticket processing using AI-powered analysis, classification, and intelligent routing. It processes incoming tickets from email or webhook, translates messages when needed, analyzes sentiment and urgency, and routes tickets to auto-reply or escalation flows. The system also updates CRM platforms and logs observability metrics for monitoring. This enables faster response times, improved customer experience, and scalable support operations.

How It Works
- Input Sources: Receives tickets via an IMAP Email Trigger or a webhook endpoint.
- Workflow Configuration: Defines the CRM/helpdesk API endpoint, escalation webhook URL, and observability logging endpoint.
- Data Cleaning & Normalization: Extracts and cleans HTML content, then normalizes ticket data: ticket ID, user email, message content, timestamp, and source channel (see the sketch at the end of this description).
- Language Detection & Translation: Detects the original language, translates the message into English if needed, and returns a confidence score.
- AI Support Intelligence: Classifies the ticket by sentiment (positive/neutral/negative), urgency (low → critical), and category (billing, bug, technical, etc.). Generates a short summary, a churn-risk score (0–1), and a recommended action path.
- Decision Routing: Routes tickets based on AI output: Auto Reply → generate a response; Escalate / Critical → send to the team.

Auto Reply Flow
- AI Reply Generation: Drafts a professional response using ticket context, keeping the tone empathetic and actionable.
- CRM/Helpdesk Update: Sends structured ticket data to the CRM: priority, category, sentiment, churn risk, and draft reply.

Escalation Flow
- Escalation Handling: Sends high-priority tickets to the support team with full ticket context and analysis.

Observability & Monitoring
- Metrics Logging: Tracks response time, escalation status, category & urgency, and sentiment & churn risk; sends data to an observability endpoint (optional).

Setup Instructions
1. Email / Webhook Setup: Configure IMAP credentials or the webhook endpoint (support-ticket).
2. AI Model Setup: Add Anthropic or OpenAI credentials and connect models to the translation agent, intelligence agent, and reply generator.
3. CRM / Helpdesk Integration: Set the API endpoint URL and configure headers and authentication.
4. Escalation Setup: Add a webhook URL for team notifications (Slack, internal API, etc.).
5. Observability (Optional): Configure a logging endpoint for metrics tracking.
6. Customize Prompts: Adjust system messages for translation, classification, and reply generation.

Use Cases
- AI-powered customer support automation
- SaaS support ticket triaging
- Multi-language support systems
- Helpdesk automation with CRM integration
- Customer churn risk detection workflows

Requirements
- Anthropic or OpenAI API key
- Email (IMAP) or webhook source
- CRM/helpdesk system API
- Optional observability/logging service
- n8n instance

Key Features
- Multi-channel ticket ingestion (email + webhook)
- Automatic language detection and translation
- AI-based sentiment, urgency, and category classification
- Intelligent routing (auto-reply vs escalation)
- AI-generated support replies
- CRM integration for structured ticket updates
- Observability and performance tracking

Summary
A powerful AI-driven support automation workflow that processes, analyzes, and routes customer tickets intelligently. It reduces manual workload, improves response speed, and enables scalable, data-driven support operations.
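The Data Cleaning & Normalization step might look like this Code node sketch, which maps either an IMAP or webhook payload into the unified ticket shape listed above. Field names such as `textPlain` and `body.message` are assumptions about the incoming payloads, not the template's guaranteed schema.

```javascript
// Hypothetical normalization step: map an email- or webhook-sourced
// payload to the unified ticket shape (field names are assumptions).
const raw = $input.first().json;
const fromEmail = Boolean(raw.from); // IMAP items typically carry "from"

return [{
  json: {
    ticketId: raw.ticketId ?? `tkt-${Date.now()}`,
    userEmail: fromEmail ? raw.from : raw.body?.email ?? '',
    message: (raw.textPlain ?? raw.body?.message ?? '').trim(),
    timestamp: new Date().toISOString(),
    source: fromEmail ? 'email' : 'webhook',
  },
}];
```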
by Rahul Joshi
Description
Automate your weekly social media analytics with this end-to-end AI reporting workflow. 📊🤖 This system collects real-time Twitter (X) and Facebook metrics, merges and validates data, formats it with JavaScript, generates an AI-powered HTML report via GPT-4o, saves structured insights in Notion, and shares visual summaries via Slack and Gmail. Perfect for marketing teams tracking engagement trends and performance growth. 🚀💬

What This Template Does
1️⃣ Starts manually or on-demand to fetch the latest analytics data. 🕹️
2️⃣ Retrieves follower, engagement, and post metrics from both X (Twitter) and Facebook APIs. 🐦📘
3️⃣ Merges and validates responses to ensure clean, complete datasets. 🔍
4️⃣ Runs custom JavaScript to normalize and format metrics into a unified JSON structure (see the sketch at the end of this description). 🧩
5️⃣ Uses Azure OpenAI GPT-4o to generate a visually rich HTML performance report with tables, emojis, and insights. 🧠📈
6️⃣ Saves the processed analytics into a Notion “Growth Chart” database for centralized trend tracking. 🗂️
7️⃣ Sends an email summary report to the marketing team, complete with formatted HTML insights. 📧
8️⃣ Posts a concise Slack update comparing platform performance and engagement deltas. 💬
9️⃣ Logs any validation or API errors automatically into Google Sheets for debugging and traceability. 🧾

Key Benefits
✅ Centralizes all social metrics into a single automated flow.
✅ Delivers AI-generated HTML reports ready for email and dashboard embedding.
✅ Reduces manual tracking with Notion and Slack syncs.
✅ Ensures data reliability with built-in validation and error logging.
✅ Gives instant, visual insights for weekly marketing reviews.

Features
- Multi-platform analytics integration (Twitter X + Facebook Graph API).
- JavaScript node for dynamic data normalization.
- Azure OpenAI GPT-4o for HTML report generation.
- Notion database update for long-term trend storage.
- Slack and Gmail nodes for instant sharing and communication.
- Automated error capture to Google Sheets for workflow reliability.
- Visual, emoji-enhanced reporting with HTML formatting and insights.

Requirements
- Twitter OAuth2 API credentials for access to public metrics.
- Facebook Graph API access token for page insights.
- Azure OpenAI API key for GPT-4o report generation.
- Notion API credentials with write access to the “Growth Chart” database.
- Gmail OAuth2 credentials for report dispatch.
- Slack Bot Token with chat:write permission for posting analytics summaries.
- Google Sheets OAuth2 credentials for maintaining the error log.

Environment Variables
- TWITTER_API_KEY
- FACEBOOK_ACCESS_TOKEN
- AZURE_OPENAI_API_KEY
- NOTION_GROWTH_DB_ID
- GMAIL_REPORT_RECIPIENTS
- SLACK_REPORT_CHANNEL_ID
- GOOGLE_SHEET_ERROR_LOG_ID

Target Audience
📈 Marketing and growth teams tracking cross-platform performance
💡 Social media managers needing automated reporting
🧠 Data analysts compiling weekly engagement metrics
💬 Digital agencies managing multiple brand accounts
🧾 Operations and analytics teams monitoring performance KPIs

Step-by-Step Setup Instructions
1️⃣ Connect all API credentials (Twitter, Facebook, Notion, Gmail, Slack, and Sheets).
2️⃣ Paste your Facebook Page ID and Twitter handle in the respective API nodes.
3️⃣ Verify your Azure OpenAI GPT-4o connection and prompt text for HTML report generation.
4️⃣ Update your Notion database structure to match the “Growth Chart” property names.
5️⃣ Add your marketing email in the Gmail node and test delivery.
6️⃣ Specify the Slack channel ID where summaries will be posted.
7️⃣ Optionally, connect a Google Sheet tab for error tracking (error_id, message).
8️⃣ Execute the workflow once manually to validate the data flow.
9️⃣ Activate or schedule it for weekly or daily analytics automation. ✅
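For step 4️⃣, the JavaScript normalization could follow this pattern. The `platform` tag and metric field names are assumptions for the sketch; substitute whatever your Twitter and Facebook API nodes actually return.

```javascript
// Illustrative version of the normalization step: fold the merged
// Twitter + Facebook items into one unified JSON structure.
const items = $input.all().map((i) => i.json);

// "platform" tags are assumed to be set upstream on each item
const twitter = items.find((i) => i.platform === 'twitter') ?? {};
const facebook = items.find((i) => i.platform === 'facebook') ?? {};

return [{
  json: {
    week: new Date().toISOString().slice(0, 10),
    twitter: {
      followers: twitter.followers_count ?? 0,
      engagement: twitter.engagement_rate ?? 0,
    },
    facebook: {
      followers: facebook.fan_count ?? 0,
      engagement: facebook.engagement_rate ?? 0,
    },
  },
}];
```

The unified object then feeds both the GPT-4o report prompt and the Notion update.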
by Yulia
Free template for voice & text messages with short-term memory

This n8n workflow template is a blueprint for an AI Telegram bot that processes both voice and text messages. Ready to use with minimal setup. The bot remembers the last several messages (10 by default), understands commands, and provides responses in HTML. You can easily swap GPT-4 and Whisper for other language and speech-to-text models to suit your needs.

Core Features
- Text: send or forward messages
- Voice: transcription via Whisper
- Extend this template by adding LangChain tools.

Requirements
- Telegram Bot API
- OpenAI API (for GPT-4 and Whisper)

💡 New to Telegram bots? Check our step-by-step guide on creating your first bot and setting up OpenAI access.

Use Cases
- Personal AI assistant
- Customer support automation
- Knowledge base interface
- Integration hub for services that you use: connect to any API via the HTTP Request Tool, or trigger other n8n workflows with the Workflow Tool.
by Samir Saci
Context
Hey! I'm Samir, a Supply Chain Data Scientist from Paris who spent six years studying and working in China while struggling to learn Mandarin. I know the challenges of mastering a complex language like Chinese, and my greatest support was flashcards. Therefore, I designed this workflow to support fellow Mandarin learners by automating flashcard creation using n8n, so they can focus more on learning and less on manual data entry.

📬 For business inquiries, you can add me here.

Who is this template for?
This workflow template is designed for language learners and educators who want to automate the creation of flashcards for Mandarin (or any other language) using the Google Translate API, an AI agent for phonetic transcription and generating an illustrative sentence, and a free image-retrieval API.

Why?
If you use the open-source application Anki, this workflow will help you automatically generate personalized study materials.

How?
Let us imagine you want to learn how to say the word "contract" in Mandarin. The workflow will automatically:
1. Translate the word into Simplified Mandarin (Mandarin: 合同).
2. Provide the phonetic transcription (Pinyin: Hétóng).
3. Generate an example sentence (Example: 我们签订了一份合同.)
4. Download an illustrative picture, for example a picture of a contract signature (see the sketch at the end of this description).

All these fields are automatically recorded in a Google Sheet, making it easy to import into Anki and generate flashcards instantly.

What do I need to start?
This workflow can be used with the free-tier plans of the services involved. It does not require any advanced programming skills.

Prerequisites
- A Google Drive account with a folder including a Google Sheet
- API credentials: Google Drive API, Google Sheets API, and Google Translate API activated with OAuth2 credentials
- A free API key from pexels.com
- A Google Sheet with the required columns

Next
Follow the sticky notes to set up the parameters inside each node and get ready to boost your learning. I have detailed the steps in a short tutorial 👇
🎥 Check My Tutorial

Notes
- This workflow can be used for any language. In the AI Agent prompt, you just need to replace the word "pinyin" with "phonetic transcription".
- You can adapt the trigger to operate the workflow the way you want. These operations can be performed in batches or triggered by Telegram, email, or webhook.
- If you want to learn more about how I used Anki flashcards to learn Mandarin: 🈷️ Blog Article about Anki Flash Cards

This workflow was created with n8n 1.82.1.
Submitted: March 17th, 2025
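As an example of the image-retrieval step, a Code node could query the Pexels search API with the translated word, roughly as below. The `word` field name is an assumption for the sketch, and `PEXELS_API_KEY` refers to your own free key stored as an environment variable.

```javascript
// Hedged sketch of the image-retrieval step against the Pexels search API,
// using the translated word as the query.
const word = $input.first().json.word ?? 'contract'; // field name assumed

const res = await this.helpers.httpRequest({
  method: 'GET',
  url: 'https://api.pexels.com/v1/search',
  qs: { query: word, per_page: 1 },
  headers: { Authorization: $env.PEXELS_API_KEY },
  json: true,
});

// Keep just the first photo's URL for the Google Sheet row
return [{ json: { word, imageUrl: res.photos?.[0]?.src?.medium ?? '' } }];
```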
by Charles
Modern AI systems are powerful but pose privacy risks when handling sensitive data. Organizations need AI capabilities while ensuring:
✅ Sensitive data never leaves secure environments
✅ Compliance with regulations (GDPR, HIPAA, PCI, SOX)
✅ Real-time decision making about data sensitivity
✅ Comprehensive audit trails for regulatory review

The Concept: Intelligent Data Classification + Smart Routing
The goal of this concept is to build the foundations of safe and compliant use of LLMs in agentic workflows by automatically detecting sensitive data, applying sanitization rules, and intelligently routing requests through secure processing channels. This workflow analyzes the user's chat or webhook input and attempts to detect PII using the Enhanced PII Pattern Detector (see the sketch at the end of this description). If PII is detected, the workflow processes that input via a series of compliance, auditing, and security steps which log and sanitize the request before any LLM is pinged.

Why Multi-Tier Routing?
Traditional systems use binary decisions (sensitive/not sensitive). Our 3-tier approach provides:
✅ Granular Security: Critical PII gets maximum protection
✅ Performance Optimization: Clean data gets full cloud capabilities
✅ Cost Efficiency: Expensive local processing only when needed
✅ User Experience: Maintains conversational flow across security levels

Why Context-Aware Detection?
Regex patterns alone miss contextual sensitivity. Our approach:
✅ Catches Intent: "Bank account" discussion is sensitive even without account numbers
✅ Reduces False Negatives: Medical discussions stay secure even without explicit medical IDs
✅ Proactive Protection: Identifies sensitive contexts before PII is shared
✅ Compliance Alignment: Matches how regulations actually define sensitive data

Why Risk Scoring vs Binary Classification?
Binary PII detection creates artificial boundaries. Risk scoring provides:
✅ Nuanced Decisions: Multiple low-risk patterns might aggregate to high risk
✅ Adaptive Thresholds: Organizations can adjust sensitivity based on their needs
✅ Better UX: Users aren't unnecessarily restricted for low-risk scenarios
✅ Audit Transparency: Clear reasoning for every routing decision

Why Comprehensive Monitoring?
Privacy systems require trust and verification:
✅ Compliance Proof: Audit trails demonstrate regulatory compliance
✅ Performance Optimization: Identify bottlenecks and improve efficiency
✅ Security Validation: Ensure no sensitive data leakage occurs
✅ Operational Insights: Understand usage patterns and system health

How to Install
All you need for this workflow are credentials for your LLM providers, such as Ollama, OpenRouter, OpenAI, or Anthropic. This workflow is customizable and allows you to define the best LLM and storage/memory solutions for your specific use case.
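To make the risk-scoring idea concrete, here is an illustrative Code node in the spirit of the Enhanced PII Pattern Detector: weighted regex and context hits aggregate into a score that selects one of the three routing tiers. The patterns, weights, field name, and thresholds are all placeholder assumptions to tune per organization, not the workflow's exact rules.

```javascript
// Illustrative risk-scoring pass: weighted pattern hits aggregate into a
// score that drives the 3-tier routing decision described above.
const text = $input.first().json.chatInput ?? ''; // field name assumed

const patterns = [
  { name: 'ssn', re: /\b\d{3}-\d{2}-\d{4}\b/g, weight: 0.9 },
  { name: 'card', re: /\b(?:\d[ -]?){13,16}\b/g, weight: 0.8 },
  { name: 'email', re: /\b[\w.+-]+@[\w-]+\.[\w.]+\b/g, weight: 0.3 },
  // Context-aware cue: sensitive topics score even without explicit IDs
  { name: 'context', re: /\b(bank account|diagnosis|password)\b/gi, weight: 0.4 },
];

let score = 0;
const hits = [];
for (const p of patterns) {
  const n = (text.match(p.re) || []).length;
  if (n > 0) {
    score += p.weight * n;
    hits.push(p.name);
  }
}

// Three-tier thresholds (placeholders; adjust to your risk appetite)
const tier = score >= 0.8 ? 'local' : score >= 0.3 ? 'sanitized' : 'cloud';

// "hits" doubles as the audit-trail reasoning for the routing decision
return [{ json: { tier, score: +score.toFixed(2), hits } }];
```

Note how multiple low-weight hits can aggregate past a threshold, which is exactly the advantage risk scoring has over a binary sensitive/not-sensitive flag.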