by Ruthwik
📧 AI-Powered Email Categorization & Labeling in Zoho Mail

This n8n template demonstrates how to use AI text classification to automatically categorize incoming emails in Zoho Mail and apply the correct label (e.g., Support, Billing, HR). It saves time by keeping your inbox structured and ensures emails are routed to the right category.

Use cases include:
- Routing customer support requests to the correct team.
- Organizing billing and finance communications separately.
- Streamlining HR and recruitment email handling.
- Reducing inbox clutter and ensuring no important message is missed.

ℹ️ Good to know
- You’ll need to configure Zoho OAuth credentials — see Self Client Overview, Authorization Code Flow, and the Zoho Mail OAuth Guide.
- The labels must already exist in Zoho Mail (e.g., Support, Billing, HR). The workflow fetches these labels and applies them automatically.
- The Zoho Mail API domain changes depending on your account region:
  - .com → Global accounts (https://mail.zoho.com/api/...)
  - .eu → EU accounts (https://mail.zoho.eu/api/...)
  - .in → India accounts (https://mail.zoho.in/api/...)
  Example: for an EU account, the endpoint would be https://mail.zoho.eu/api/accounts/<accountID>/updatemessage
- The AI model used for text classification may incur costs depending on your provider (e.g., OpenRouter). Start by testing with a small set of emails before enabling it for your full inbox.

🔄 How it works
1. A new email in Zoho Mail triggers the workflow.
2. OAuth authentication retrieves access to Zoho Mail’s API.
3. All available labels are fetched, and a label map (display name → ID) is created.
4. The AI model analyzes the subject and body to predict the correct category.
5. The workflow routes the email to the right category branch.
6. The matching Zoho Mail label is applied (the final node is deactivated by default).

🛠️ How to use
1. Create the required labels (e.g., Support, Billing, HR) in your Zoho Mail account before running the workflow.
2. Replace the Zoho Mail Account ID in the Set Account ID node.
3. Configure your Zoho OAuth credentials in the Get Access Token node.
4. Update the API base URL to match your Zoho account’s region (.com, .eu, .in, etc.).
5. Activate the Apply Label to Email node once ready for production.
6. Optionally, adjust the categories in the AI classifier prompt to fit your organization’s needs.

📋 Requirements
- Zoho Mail account with API access enabled.
- Labels created in Zoho Mail for each category you want to classify.
- OAuth credentials set up in n8n.
- The correct Zoho Mail API domain (.com, .eu, .in) for your account region.
- An AI model (via OpenRouter or another provider) for text classification.

🎨 Customising this workflow
This workflow can be adapted to many inbox management scenarios. Examples include:
- Auto-routing customer inquiries to specific departments.
- Prioritizing VIP client emails with special labels.
- Filtering job applications directly into an HR-managed folder.
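Two mechanics worth understanding here are the region-specific API base URL and the label map built in step 3 of "How it works". A minimal Code-node-style sketch follows; the field names in the label-list payload (`displayName`, `labelId`) are assumptions for illustration, so check them against the actual Zoho response:

```javascript
// Hypothetical sketch of the label-map and region-URL logic.
const region = 'eu'; // 'com' | 'eu' | 'in', depending on your account region
const accountId = '<accountID>';

function apiBase(region) {
  // Zoho Mail's API host follows the account region.
  return `https://mail.zoho.${region}/api`;
}

function buildLabelMap(labels) {
  // Map display name -> label ID so the classifier's output
  // (e.g. "Support") can be translated into the ID the API expects.
  const map = {};
  for (const label of labels) {
    map[label.displayName] = label.labelId;
  }
  return map;
}

const labelMap = buildLabelMap([
  { displayName: 'Support', labelId: '1001' },
  { displayName: 'Billing', labelId: '1002' },
]);

const updateUrl = `${apiBase(region)}/accounts/${accountId}/updatemessage`;
// -> https://mail.zoho.eu/api/accounts/<accountID>/updatemessage
```

This is why the labels must already exist in Zoho Mail: if the classifier predicts a category with no matching display name, the lookup returns nothing and no label can be applied.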
by Yassin Zehar
Description

This workflow continuously validates data quality using rules stored in Notion, runs anomaly checks against your SQL database, generates AI-powered diagnostics, and alerts your team only when real issues occur. Notion holds all data quality rules (source, field, condition, severity). n8n reads them on schedule, converts them into live SQL queries, and aggregates anomalies into a global run summary. The workflow then scores data health, creates a Notion run record, optionally opens a Jira issue, and sends a Slack/email alert including AI-generated root cause & recommended fixes.

Target users

Perfect for: DataOps, Analytics, Product Data, BI, Compliance, ETL/ELT pipelines, and platform reliability teams.

How it works

1) Notion → Rules Database: each entry defines a check (table, field, condition, severity).
2) n8n → Dynamic Query Execution: rules are converted into SQL and checked automatically.
3) Summary Engine: aggregates anomalies and computes the data quality score.
4) AI Diagnostic Layer: root cause analysis + recommended fix plan.
5) Incident Handling: Notion Run Page + optional Slack/Email/Jira escalation. Silent exit when no anomaly = zero noise.

Setup Instructions

- Create two Notion databases:
  - Data Quality Rules → source / field / rule / severity / owner
  - Data Quality Runs → run_id / timestamp / score / anomalies / trend / AI summary / recommendation
- Connect your SQL database (Postgres / Supabase / Redshift, etc.)
- Add OpenAI credentials for AI analysis
- Connect Slack + Gmail + Jira for incident alerts
- Set your execution schedule (daily/weekly)

Expected outcomes

Fully automated, rule-based data quality monitoring with minimal maintenance and zero manual checking. When everything is healthy, runs remain silent. When data breaks, the team is notified instantly: with context, root cause insight, and a structured remediation output.
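Steps 2 and 3 above can be sketched in a few lines. This is a hypothetical illustration of how a Notion rule row might be turned into a counting query and how a run could be scored; the template's actual SQL generation and scoring formula may differ:

```javascript
// Hypothetical sketch: rule fields (source, condition) mirror the Notion
// columns described above; your schema and scoring may differ.
function ruleToSql(rule) {
  // A row that matches the condition is an anomaly, so count violations.
  return `SELECT COUNT(*) AS anomalies FROM ${rule.source} WHERE ${rule.condition}`;
}

function qualityScore(results) {
  // Naive health score: share of rules that passed (no anomalies), 0-100.
  if (results.length === 0) return 100;
  const passed = results.filter(r => r.anomalies === 0).length;
  return Math.round((passed / results.length) * 100);
}

const sql = ruleToSql({ source: 'orders', condition: 'email IS NULL' });
// -> SELECT COUNT(*) AS anomalies FROM orders WHERE email IS NULL
const score = qualityScore([{ anomalies: 0 }, { anomalies: 3 }]); // -> 50
```

Because each rule compiles to an independent query, adding a new check is just adding a row in Notion, with no change to the workflow itself.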
Tutorial video

Watch the YouTube tutorial video.

About me

I’m Yassin, a Project & Product Manager scaling tech products with data-driven project management. 📬 Feel free to connect with me on LinkedIn.
by PollupAI
Who's it for

This template is for Customer Success and Sales teams who use HubSpot. It automates the critical handoff from sales to success, ensuring every new customer gets a fast, personalized welcome. It's perfect for anyone looking to standardize their onboarding process, save time on manual tasks, and improve the new customer experience using AI.

What it does

This workflow triggers when a deal's "Is closed won" property is set to True in HubSpot. It assigns a Customer Success Manager (CSM) by querying an n8n Data Table to find the 'least busy' CSM (based on a deal count) and fetches the deal's details to find all associated contacts. It then loops to identify the "Champion" contact by checking their "Buying Role" (hs_buying_role). An AI agent (in the AI: Write Welcome Email node) generates a personalized welcome email, which is converted to HTML and sent via Gmail. Finally, the workflow updates the Champion's contact record in HubSpot and updates the CSM's deal count in the Data Table to keep the logic in sync.

How to set up

1. Create and populate the Data Table: this template requires an n8n Data Table to manage CSM assignments. Create a Data Table named csm_assignments. Add two columns: csm_id (String) and deal_count (Number). Add one row for each CSM with their HubSpot Owner ID and a starting deal_count of 0.
2. Link the Data Table nodes: open the Get CSM List and Increment CSM Deal Count nodes and select the csm_assignments table you just created from the Table dropdown.
3. Configure variables: in the Configure Template Variables node, you must set your sender info (company_name, sender_name, and sender_email).
4. Customize the AI prompt: in the AI: Write Welcome Email node, update the placeholder [Link to Your Video] and [Link to Your Help Doc] links with your own URLs.
5. Check the HubSpot property: this workflow assumes you use the "Buying Role" (hs_buying_role) contact property to identify your "Champion". If you use a different property, you must update the HubSpot: Get Contact Details and If Role is 'Champion' nodes.

Requirements

- Access to n8n Data Tables.
- **HubSpot (Developer API):** A credential for the Trigger: Deal Is 'Closed Won' node.
- **HubSpot (OAuth2):** A credential for all other HubSpot nodes (Get Deal Details, Get Contact Details, Assign Contact Owner).
- **AI Credentials:** (e.g., OpenAI) Credentials for the AI Model node (the node connected to AI: Write Welcome Email).
- **Email Credentials:** (e.g., Gmail) Credentials for the Gmail: Send Welcome Email node.

How to customize the workflow

You can easily customize this workflow to send different emails based on deal properties. Add an If node after the HubSpot: Get Deal Details node to check the deal's value, product line, or region. Based on these properties, you can route the flow to different AI: Write Welcome Email nodes with unique prompts. For example, you could check the contact's 'industry' or 'company size' to send them links to different, more relevant 'Getting Started' videos and documentation.
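The 'least busy' assignment logic described above can be sketched as a small Code-node-style function over the Data Table rows (field names match the csm_assignments columns from the setup steps):

```javascript
// Sketch of the 'least busy' CSM assignment; in the real workflow the
// rows come from the csm_assignments Data Table.
function pickLeastBusyCsm(rows) {
  // rows: [{ csm_id, deal_count }] — choose the CSM with the fewest deals.
  return rows.reduce((best, row) =>
    row.deal_count < best.deal_count ? row : best
  );
}

const rows = [
  { csm_id: 'owner-101', deal_count: 4 },
  { csm_id: 'owner-102', deal_count: 2 },
  { csm_id: 'owner-103', deal_count: 7 },
];

const assigned = pickLeastBusyCsm(rows); // -> owner-102
assigned.deal_count += 1; // keep the Data Table in sync after assignment
```

Incrementing the count immediately after assignment is what keeps the round-robin fair: the next closed-won deal sees the updated totals.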
by Cheng Siong Chin
How It Works

This workflow automates insurance claims processing by deploying specialized AI agents to analyze actuarial data, draft claim memos, and perform risk assessments. Designed for insurance adjusters, underwriters, and claims managers handling high claim volumes, it solves the bottleneck of manual claim review that delays settlements and increases operational costs.

The system ingests new claims data via scheduled triggers, then routes information to an actuarial analysis agent that calculates loss ratios and risk scores. A memo writer agent generates detailed claim summaries with recommendations, while a risk assessment agent evaluates fraud indicators and coverage implications. An orchestrator agent coordinates these specialists, ensuring consistent analysis standards. Final reports are automatically distributed via email to product teams and Slack notifications to risk management, creating transparent workflows while reducing claim processing time from days to hours with standardized, comprehensive evaluations.

Setup Steps

1. Configure claims database API credentials in the "Fetch New Claims Data" node.
2. Input your NVIDIA API key for all OpenAI Model nodes.
3. Add your OpenAI API key in the Orchestrator Agent configuration.
4. Set up Calculator Tool parameters for premium adjustment calculations.
5. Configure Gmail credentials and recipient addresses for the product team.
6. Connect your Slack workspace and specify the risk team channel for alerts.

Prerequisites: NVIDIA API access, OpenAI API key, claims management system API.

Use Cases: Auto insurance claim triage, property damage assessment automation.

Customization: Adjust risk scoring thresholds, add industry-specific analysis criteria.

Benefits: Reduces claim processing time by 85%, ensures consistent evaluation standards.
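For intuition, the actuarial quantities mentioned above look roughly like this. The loss-ratio formula is the standard definition; the risk-score weights are invented for illustration and are not the template's actual Calculator Tool logic:

```javascript
// Illustrative only: standard loss-ratio definition plus a toy risk score.
function lossRatio(incurredLosses, earnedPremium) {
  // Loss ratio = incurred losses / earned premium.
  return incurredLosses / earnedPremium;
}

function riskScore(claim) {
  // Toy composite score (0-100) from a few hypothetical fraud/risk signals.
  let score = 0;
  if (claim.priorClaims > 2) score += 30;        // frequent claimant
  if (claim.amount > 50000) score += 40;          // unusually large claim
  if (claim.daysSincePolicyStart < 30) score += 30; // claim soon after inception
  return score;
}

const lr = lossRatio(65000, 100000); // -> 0.65
const score = riskScore({ priorClaims: 3, amount: 60000, daysSincePolicyStart: 90 }); // -> 70
```

In the template, thresholds like these are exactly what the "Adjust risk scoring thresholds" customization refers to.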
by Atta
Stop watching long videos, start listening to concise summaries. This workflow transforms any YouTube video URL sent via Telegram into a high-quality, spoken audio summary (MP3) and a structured text overview. It acts as your personal AI research assistant, turning lengthy content into bite-sized audio files that you can consume on the go. It leverages Decodo for robust transcript extraction, OpenAI for intelligent summarization, and OpenAI text-to-speech for realistic audio generation.

✨ Features
- **Telegram-First Interface:** Send links and receive audio directly in your chat app.
- **Smart Validation:** Automatically checks if the link is a valid YouTube URL before processing, to save API credits.
- **Multi-Language Support:** Easily configure the output language (English, Spanish, German, etc.) via a simple Config node. The AI will translate and speak in this language.
- **Robust Error Handling:** Gracefully handles videos with no captions/transcripts by notifying the user instead of breaking the workflow.
- **Structured Data Extraction:** Uses AI to extract the Genre, Title, and Summary alongside the audio file.

⚙️ How it Works
1. Trigger: You send a YouTube URL to your Telegram Bot.
2. Validate: The workflow checks the URL pattern using Regex.
3. Extract: Decodo scrapes the video page to retrieve the full transcript JSON.
4. Process: A Code node flattens the complex JSON into a readable text format.
5. Summarize: OpenAI (gpt-4o-mini) analyzes the text and writes a script optimized for listening.
6. Speak: OpenAI converts the script into a high-definition MP3 file.
7. Deliver: The bot replies with the audio file and a formatted text summary including the genre tags and original link.

📥 Decodo Node Installation
The Decodo node is used in this workflow for fetching the YouTube transcript.
1. Find the node: Click the + button in your n8n canvas.
2. Search: Search for the Decodo node and select it.
3. Credentials: When configuring the first Decodo node, use your API key (obtained with the 80% discount coupon).
4. Setup: Open the Decodo (Fetch YouTube Transcript) node to ensure it is correctly targeting the YouTube service.

🎁 Exclusive Deal for n8n Users
To run this workflow, you need a robust scraping provider. We have secured a massive discount for Decodo users: get 80% OFF the 23k Advanced Scraping API plan.
Coupon Code: ATTAN8N
Sign Up Here: Claim 80% Discount on Decodo

➕ How to Adapt the Template
This workflow is highly flexible and can be modified for various content tasks:
- **Change AI Model:** Easily swap the **OpenAI Chat Model** node with an **OpenAI** or **Anthropic (Claude)** node without altering the core logic.
- **Create Long-Form Drafts:** Modify the AI System Prompt to generate a full 1,000-word blog post draft or a set of social media updates instead of a short audio script.
- **Change Destination:** Replace the **Telegram** nodes with **Slack**, **Microsoft Teams**, **Email (Gmail/SMTP)**, or **Discord** to deliver the audio and summary to your preferred channel.
- **Create an Archive:** Connect the successful output to a **Google Sheets** or **Airtable** node to keep a searchable archive of every video summary created.
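The Validate step's URL check can be sketched with a regex like the following (the template's exact pattern may differ; this one accepts standard watch and youtu.be links):

```javascript
// Sketch of the Smart Validation step: reject non-YouTube links before
// any API credits are spent. Matches watch?v= and youtu.be short links.
const YOUTUBE_URL = /^https?:\/\/(www\.)?(youtube\.com\/watch\?v=|youtu\.be\/)[\w-]{11}/;

function isYouTubeUrl(text) {
  return YOUTUBE_URL.test(text.trim());
}
```

Anything that fails this check can be answered immediately in Telegram ("please send a YouTube link") without touching Decodo or OpenAI.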
by Yasser Sami
Customer Support AI Agent for Gmail

This n8n template demonstrates how to build an AI-powered customer support workflow that automatically handles incoming Gmail messages, classifies them, finds answers from your knowledge base, and sends a personalized reply.

Who’s it for
- SaaS founders or teams who want to automate customer support.
- Freelancers and solopreneurs who receive repetitive customer queries.
- Companies that want to reduce manual email triage and improve response times.

How it works / What it does
1. Trigger: A new email arrives in Gmail.
2. Classification: The workflow uses a text classifier to decide whether the email is customer support-related or not. If not, it’s ignored. If yes, it proceeds.
3. AI Agent: Queries a knowledge base (a vector database with OpenAI embeddings), retrieves the most relevant answer, and drafts a reply using AI (an OpenAI or Google Gemini model).
4. Post-processing: Labels the email in Gmail for organization and sends a reply automatically.

This ensures that your customers get timely, relevant responses without manual intervention.

How to set up
1. Import this template into your n8n account.
2. Connect your Gmail account in the Gmail Trigger, Label, and Reply nodes.
3. Connect your AI model provider (OpenAI or Google Gemini).
4. Configure the knowledge base embeddings (upload your docs/FAQ into the vector database).
5. Activate the workflow — and your AI customer support agent is live!

Requirements
- n8n account.
- Gmail account (with API access enabled).
- OpenAI or Google Gemini account for the LLM and embeddings.
- Knowledge base data (FAQ, documentation, or past tickets).
- Google Drive account to auto-update your vector database (with API access enabled).

How to customize the workflow
- **Knowledge Base:** Replace or expand with your own company docs, FAQs, or past conversations.
- **Classification Rules:** Train or adjust the classifier to handle more categories (e.g., Sales, Partnership, Technical Support).
- **Reply Style:** Customize AI prompts for tone — professional, casual, or friendly.
- **Labels:** Change Gmail labels to match your workflow (e.g., “Support,” “Sales,” “Priority”).
- **Multi-language:** Add translation steps if your customers speak different languages.

This template saves you hours of manual email triage and ensures your customers always get quick, accurate responses.
by Cheng Siong Chin
How It Works

This workflow automates enterprise budget monitoring and cost optimization using Anthropic Claude as the core AI engine across multiple specialist agents. It targets finance teams, operations managers, and CFOs managing complex multi-department budgets where manual tracking leads to delayed decisions and cost overruns.

The workflow triggers on schedule, generates metrics data, and routes it through a Cost Intelligence Agent that classifies budget status (Critical, Warning, Review, Feedback). Each path activates specialist agents (Budget Alert, Routing Recommendation, and Cost Projection), coordinated by an Optimization Coordinator. Results are routed by action type: urgent alerts fire via Slack, executive summaries deliver via email, and all optimization actions are stored. This gives finance teams real-time cost intelligence with automated escalation and audit-ready records.

Setup Steps

1. Import the workflow JSON into your n8n instance.
2. Add Anthropic API credentials.
3. Set the Schedule Trigger frequency.
4. Update the Workflow Configuration node with budget thresholds per department or cost centre.
5. Add Slack credentials and configure the target channel in the Send Slack Alert node.
6. Set Gmail/SMTP credentials for the Send Executive Report Email node.

Prerequisites: n8n (cloud or self-hosted), Anthropic API key (Claude), Slack workspace with bot token.

Use Cases: Finance teams automating multi-department budget variance detection and escalation.

Customization: Replace Anthropic Claude with OpenAI GPT-4 or NVIDIA NIM in any agent node.

Benefits: Eliminates manual budget reviews through automated AI-driven cost classification.
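The Cost Intelligence Agent's four-way classification (Critical, Warning, Review, Feedback) can be illustrated with threshold logic like this. The utilization cut-offs are placeholders; in the template they come from the Workflow Configuration node:

```javascript
// Hypothetical thresholds: real values live in the Workflow Configuration
// node and vary per department or cost centre.
function classifyBudget(spent, budget) {
  const utilization = spent / budget;
  if (utilization >= 1.0) return 'Critical';  // over budget
  if (utilization >= 0.85) return 'Warning';  // approaching the limit
  if (utilization >= 0.70) return 'Review';   // worth a look
  return 'Feedback';                          // healthy, informational only
}
```

Each returned status corresponds to one branch in the workflow, so tuning the thresholds directly changes how often Slack alerts versus email summaries fire.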
by Cheng Siong Chin
How It Works

This workflow automates industrial asset health monitoring and predictive maintenance using Anthropic Claude across coordinated specialist agents. It targets facility managers, maintenance engineers, and operations teams in manufacturing, energy, and infrastructure sectors where reactive maintenance leads to costly unplanned downtime and asset failures.

On schedule, the system ingests asset health data and routes it through a Performance Evaluation Agent that coordinates three specialist agents: Maintenance Scheduling, Parts Readiness, and Lifecycle Reporting. An MCP External Data Tool enriches the analysis with real-time contextual data. Results are risk-routed: Critical assets trigger immediate Slack alerts, High-risk assets escalate via email reports, and Routine cases are logged for scheduled maintenance. All paths merge into a unified maintenance log, giving operations teams proactive, audit-ready asset intelligence before failures occur.

Setup Steps

1. Import the workflow JSON into your n8n instance.
2. Add Anthropic API credentials.
3. Set the Schedule Trigger frequency aligned to your asset monitoring cycle.
4. Update the Workflow Configuration node with asset thresholds.
5. Configure the MCP External Data Tool with your external data source endpoint and authentication.
6. Add Slack credentials and set the target channel in the Notify Critical Alert node.
7. Set Gmail/SMTP credentials for the Email Escalation Report node.

Prerequisites: n8n (cloud or self-hosted), Anthropic API key (Claude), Slack workspace with bot token.

Use Cases: Facility managers automating condition-based maintenance scheduling across multiple assets.

Customization: Replace Anthropic Claude with OpenAI GPT-4 or NVIDIA NIM in any agent node.

Benefits: Shifts maintenance from reactive to predictive, reducing unplanned downtime significantly.
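The risk routing described above reduces to a simple branch. A sketch, with the risk levels taken from the description and the channel names assumed for illustration:

```javascript
// Sketch of the risk-routing step; field and channel names are
// assumptions based on the description, not the template's exact schema.
function routeAsset(asset) {
  switch (asset.risk) {
    case 'Critical':
      return { channel: 'slack', action: 'immediate alert' };
    case 'High':
      return { channel: 'email', action: 'escalation report' };
    default:
      // Routine (and anything unclassified) just gets logged.
      return { channel: 'log', action: 'scheduled maintenance' };
  }
}
```

Whatever the branch, every result also lands in the unified maintenance log, which is what makes the output audit-ready.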
by Cheng Siong Chin
How It Works

This workflow automates end-to-end medical claims processing using a multi-agent AI orchestration system built on OpenAI GPT-4. It targets healthcare revenue cycle teams, billing departments, and hospital administrators burdened by manual claims adjudication, coding errors, and payer denials.

The workflow triggers on a schedule, loads billing data, and routes it through an Orchestrator Agent that coordinates four specialist sub-agents: Coding Validation, Claims Submission, Denial Detection, and Payer Follow-up. Each agent independently validates, submits, or flags claims. Results are parsed, merged, and routed by risk level. Final metrics and a formatted report close the cycle, giving teams real-time visibility into claim status, denial patterns, and revenue recovery.

Setup Steps

1. Import the workflow JSON into your n8n instance.
2. Add OpenAI API credentials.
3. Configure the Schedule Trigger with your desired processing frequency.
4. Update the Workflow Configuration node with your billing system endpoint or sample data path.
5. Set Gmail/SMTP credentials for the Escalate to Revenue Specialist email node.
6. Connect the Google Sheets or database nodes with appropriate credentials and sheet IDs.
7. Test with simulated billing data before enabling live data sources.

Prerequisites: n8n, an OpenAI API key (GPT-4), and a Gmail or SMTP account.

Use Cases: Hospital billing departments automating claims submission and denial follow-up.

Customization: Swap OpenAI for NVIDIA NIM or Anthropic models in any agent node, and add Slack alerts alongside email escalation.

Benefits: Reduces manual claims review by 80%+ through parallel AI agent processing.
by Jitesh Dugar
⚖️ HR Sovereign: AI-Powered Onboarding Hub

A high-fidelity employee onboarding engine: Intake → Role-Based Enrichment → AI Personalization → IT Provisioning.

⚙️ Core Sovereign Logic
- **Enrichment:** Auto-classifies Tech, Sales, and Leadership roles to drive specific logic tracks.
- **Intelligence:** Uses an **AI Agent (GPT-4)** to generate personalized welcome messaging based on job DNA.
- **Atomization:** A **Merge PDF** node assembles role-specific policies and benefits into a single high-res package.
- **Provisioning:** Dynamically generates **Jira** hardware/access tickets and **Notion** tracking dashboards.
- **Delivery:** Sends branded HTML emails via **Gmail** and announces hires on **Slack**.

📋 Setup & Prerequisites
1. Intake: Connect your HRIS (BambooHR/Workday) to the Webhook URL.
2. Assets: Organize Drive folders into "Technical", "Leadership", and "Standard" templates.
3. Tracking: Connect your Notion Onboarding Database and Jira IT Project.
4. Metrics: Time_to_Provision, Engagement_Score, Document_Integrity_Hash.
by Yaron Been
Analyze Reddit sentiment around competitor brands using Bright Data URL-based scraping and GPT-5.4.

This workflow reads competitor post URLs from Google Sheets, scrapes Reddit posts via URL using Bright Data's Reddit Posts API, and uses GPT-5.4 to map sentiment distribution. The AI calculates positive, neutral, and negative sentiment percentages, extracts top complaints and praises, identifies vulnerability areas, and scores the competitive opportunity (0-100). High-opportunity brands trigger alerts to the strategy team.

How it works:
1. A weekly schedule trigger runs on Monday at 8 AM.
2. Reads the 'competitor_brands' sheet (columns: brand_name, url, industry).
3. Sends each URL to Bright Data for Reddit post discovery.
4. Validates the Bright Data API response.
5. GPT-5.4 analyzes sentiment distribution and identifies competitive opportunities.
6. Parses the AI output and merges it with the original brand data.
7. Filters by AI confidence (>= 0.7).
8. Brands with competitive_opportunity_score >= 70 trigger email alerts and go to 'high_opportunity_brands'. Lower-scoring brands go to 'competitor_sentiment'. Low-confidence results go to 'low_confidence_sentiment'.

Setup:
- Create a Google Sheet with a 'competitor_brands' tab containing columns: brand_name, url, industry.
- Create output tabs: high_opportunity_brands, competitor_sentiment, low_confidence_sentiment.
- Configure Bright Data API credentials as HTTP Header Auth (Bearer token).
- Connect OpenAI, Google Sheets, and Gmail OAuth2 credentials.

Requirements:
- Bright Data API account (~$0.003-0.005 per URL scrape).
- OpenAI API account (GPT-5.4 costs ~$0.003-0.008 per call).
- Google Sheets OAuth2 credentials.
- Gmail OAuth2 credentials.

Notes:
- Track competitive_opportunity_score over time to spot widening vulnerabilities.
- Use the vulnerability_areas field to inform your product positioning and messaging.
- Combine with your own product monitoring (Template 21) for a complete competitive picture.
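The routing rules in the last two steps of "How it works" can be expressed compactly. A sketch using the field names from the description (confidence, competitive_opportunity_score); the tab names match the output tabs created during setup:

```javascript
// Sketch of the confidence filter and opportunity-score routing.
function routeBrand(result) {
  // Low-confidence analyses are quarantined before any scoring is trusted.
  if (result.confidence < 0.7) return 'low_confidence_sentiment';
  // High-opportunity brands also trigger the email alert branch.
  if (result.competitive_opportunity_score >= 70) return 'high_opportunity_brands';
  return 'competitor_sentiment';
}
```

Note the ordering: the confidence gate runs first, so a brand with a high score but low confidence never reaches the alert branch.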
by Leo Lara
AI Meeting Task Manager - Google Meet to GoHighLevel CRM

📋 TEMPLATE DESCRIPTION

Transform your meeting follow-ups from chaos to clarity! This workflow automates the entire post-meeting workflow by scanning Google Meet recordings folders, extracting action items from AI-generated meeting notes, and creating tasks directly in your GoHighLevel CRM.

🎯 Who is this for?
- Sales teams using GoHighLevel CRM
- Agency owners managing multiple client meetings
- Anyone who uses Google Meet with Gemini note-taking
- Professionals drowning in meeting follow-ups

✨ What it does:
- Daily File Organization: Scans your Google Meet recordings folder, automatically sorts recordings, notes, and chat logs into organized subfolders, and keeps your Drive clean and searchable.
- AI-Powered Task Extraction: Reads Google Docs meeting notes (generated by Gemini), identifies action items assigned to you, and intelligently determines due dates from context (defaults to 3 business days).
- CRM Integration: Searches for meeting participants in GoHighLevel, creates properly formatted tasks with full context, and links tasks to the correct contact record.
- Beautiful Email Summaries: Sends a professionally designed HTML email showing tasks created per contact, including due dates and status updates.

🔧 Technologies Used:
- Google Drive API (file management)
- Google Docs API (content extraction)
- GoHighLevel API (contact search + task creation)
- OpenAI GPT-4 (task extraction intelligence)
- Gmail API (email delivery)

⚙️ Setup Requirements:
- Google Cloud OAuth credentials (Drive, Docs, Gmail)
- GoHighLevel OAuth credentials
- OpenAI API key
- 4 Google Drive folders (source + 3 destination folders)

📖 Setup Instructions:
1. Create Google Drive folders:
   - Source folder: where Google Meet saves recordings
   - Recordings folder: for video files
   - Notes folder: for Gemini notes
   - Chat folder: for meeting chat logs
2. Configure credentials: connect Google Drive, Google Docs, GoHighLevel, and Gmail OAuth, and add your OpenAI API key.
3. Update folder URLs: replace the placeholder URLs in the Google Drive nodes with your folder URLs.
4. Customize: set your email address in the Gmail tool, set your GoHighLevel user ID for task assignment, and adjust the schedule trigger timing as needed.

💡 Pro Tips:
- Works best with Google Meet's Gemini note-taking feature
- Customize the AI prompts to match your task naming conventions
- The HTML email template is fully customizable
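The "3 business days" default due date mentioned under AI-Powered Task Extraction is handled by the AI prompt in the template, but the rule itself is easy to pin down in code (a sketch that skips weekends only, not holidays):

```javascript
// Sketch of the default due-date rule: add N business days,
// skipping Saturdays and Sundays (public holidays are not handled).
function addBusinessDays(start, days) {
  const d = new Date(start);
  let added = 0;
  while (added < days) {
    d.setDate(d.getDate() + 1);
    const day = d.getDay();
    if (day !== 0 && day !== 6) added += 1; // 0 = Sunday, 6 = Saturday
  }
  return d;
}

// A Friday plus 3 business days lands on the following Wednesday.
const due = addBusinessDays(new Date(2024, 5, 7), 3); // Fri Jun 7 -> Wed Jun 12
```

Making the rule explicit like this is also useful when auditing the AI's output: if a generated due date disagrees with the deterministic fallback, the prompt likely inferred a date from the meeting context instead.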