by Jonathan
This workflow uses a WooCommerce trigger that runs when a new customer is added, then adds that customer to Mautic. To use this workflow, you will need to set the credentials for the WooCommerce and Mautic nodes.
by Harshil Agrawal
This workflow allows you to send daily weather updates via an SMS message using the Plivo node.
- Cron node: triggers the workflow daily at 9 AM.
- OpenWeatherMap node: returns data about the current weather in Berlin. To get weather updates for your city, enter its name instead.
- Plivo node: sends an SMS with the weather update returned by the previous node.
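As a sketch, the SMS body could be assembled in an intermediate Code node between OpenWeatherMap and Plivo. The field paths follow OpenWeatherMap's current-weather response shape; the greeting text and function name are illustrative assumptions, not taken from the template.

```javascript
// Build the SMS text from an OpenWeatherMap "current weather" response.
// Field paths (name, main.temp, weather[0].description) match that API.
function buildWeatherSms(weather) {
  const city = weather.name;                       // e.g. "Berlin"
  const temp = Math.round(weather.main.temp);      // metric units assumed
  const description = weather.weather[0].description;
  return `Good morning! Weather in ${city}: ${description}, ${temp}°C.`;
}

// Example payload shaped like the API response:
const sample = {
  name: 'Berlin',
  main: { temp: 17.3 },
  weather: [{ description: 'scattered clouds' }],
};
console.log(buildWeatherSms(sample));
```

The Plivo node would then reference this string as the message field of its input item.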
by Muh Resky Adiansyah
Lead-Routing-Engine-with-SLA-Auto-Reassignment

This repository contains an SLA-based lead routing workflow built in n8n, designed to ensure fast lead response, fair sales distribution, and controlled escalation without relying on a full CRM system. The workflow focuses on routing discipline and operational safety, not feature completeness.

What This Workflow Does

At a high level, the system:
1) Accepts new leads from a generic intake form
2) Assigns leads to sales reps using round-robin
3) Enforces a response SLA
4) Automatically re-routes uncontacted leads
5) Allows sales to mark leads as CONTACTED via Slack
6) Escalates to a manager once if the SLA is repeatedly violated

Architecture Overview

Core Components
1) n8n
2) Google Sheets
3) Slack

Primary Data Stores
1) sales_sheet (list of active sales reps)
2) lead_sheet (lead state and routing history)
3) routing_state_sheet (global routing and escalation flags)

End-to-End Flow

A) Lead Intake & Normalization
1) New leads enter via a Form Trigger
2) Phone numbers are normalized to 62xxxxxxxx (Indonesia's international direct dialing code)
3) A unique lead_id is generated
4) The lead is initialized with stage = NEW and route_count = 0

B) Initial Assignment (Round Robin)
1) Active sales reps are loaded from sales_sheet
2) The global last_index is read from routing_state_sheet
3) The lead is assigned to the next sales rep in sequence
4) Assignment metadata is stored: assigned sales rep, timestamps, route count
5) last_index is updated centrally

C) Slack Notification (New Lead)
1) The assigned sales rep receives a Slack message
2) The message includes a "Mark as CONTACTED" button
3) The SLA expectation is clearly communicated (1 hour by default)

D) SLA Monitoring (Scheduled)
1) A scheduled trigger runs every hour
2) The workflow scans leads where stage = NEW and the SLA window has elapsed since the last assignment

E) SLA Re-Routing
For each qualifying lead:
1) The lead is reassigned to the next sales rep
2) The route count is incremented
3) Timestamps are updated
4) A Slack notification is sent to the new assignee
This process repeats until the lead is contacted or escalated.

F) Controlled Escalation
If route_count >= threshold (default: 10):
1) The workflow checks the escalation state (escalated_<lead_id>)
2) If not yet escalated: the manager is notified via Slack and the escalation flag is written
3) If already escalated: no further action is taken
Escalation is one-time per lead.

G) Stage Update via Slack (CONTACTED)
1) Sales marks a lead as CONTACTED via the Slack button
2) The incoming Slack action is validated: only the assigned sales rep is allowed, and only if the current stage is NEW
3) The update is idempotent
4) Unauthorized or stale actions receive Slack feedback
Once contacted:
1) SLA routing stops
2) The lead remains stable

Safeguards Built In
1) Ownership enforcement (only the assigned sales rep can update a lead)
2) Idempotent stage transitions (prevents duplicate or stale Slack actions)
3) One-time escalation (no notification spam)
4) Fail-fast behavior (missing sales data or malformed payloads halt execution early)

Google Sheets Schema

sales_sheet
| Column | Description |
|----------|-----------------|
| name | Sales name |
| email | Optional |
| slack_id | Slack user ID |
| active | ON / TRUE |

lead_sheet
| Column | Description |
| -------------- | --------------------- |
| lead_id | Unique identifier |
| name | Lead name |
| phone | Normalized phone |
| stage | NEW / CONTACTED / QUALIFIED / CLOSED LOST |
| assigned_sales | Current owner |
| sales_slack_id | Slack ID |
| route_count | Number of re-routes |
| created_at | Creation timestamp |
| assigned_at | Last assignment |
| last_routed_at | Last SLA routing |
| contacted_at | When marked contacted |

routing_state_sheet
| key | value |
| ------------------- | ---------------------- |
| last_index | Last round-robin index |
| escalated_<lead_id> | Escalation timestamp |

Limitations (By Design)
1) Google Sheets is not transactional
2) SLA enforcement is time-bucketed, not real-time
3) No concurrency locking across parallel runs
4) Slack is required for interaction
5) This is not a CRM, only a routing engine
These constraints are explicit and intentional.

When This Design Works Well
1) Small to mid-size teams
2) Human-response SLAs (minutes/hours)
3) Teams needing discipline, not heavy tooling
4) CRM-lite or pre-CRM environments

When to Migrate
Consider migrating if you need:
1) High-volume ingestion
2) Sub-minute SLA guarantees
3) Strong transactional consistency
4) Advanced analytics or forecasting
The routing logic itself is portable to SQL or CRM systems.
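The two core routines of the intake and assignment stages, phone normalization to the 62-prefixed format and round-robin selection driven by a stored last_index, can be sketched in n8n Code-node JavaScript. The helper names and the sample rep list are illustrative assumptions, not taken from the actual workflow:

```javascript
// Normalize an Indonesian phone number to the 62xxxxxxxx format.
function normalizePhone(raw) {
  const digits = raw.replace(/\D/g, '');        // strip spaces, dashes, "+"
  if (digits.startsWith('62')) return digits;   // already international
  if (digits.startsWith('0')) return '62' + digits.slice(1); // 08xx -> 628xx
  return '62' + digits;
}

// Pick the next active rep after the globally stored last_index.
// The caller persists nextIndex back to routing_state_sheet.
function assignRoundRobin(activeReps, lastIndex) {
  const nextIndex = (lastIndex + 1) % activeReps.length;
  return { rep: activeReps[nextIndex], nextIndex };
}

const reps = ['alice', 'budi', 'citra'];
console.log(normalizePhone('0812-3456-7890')); // 6281234567890
console.log(assignRoundRobin(reps, 2).rep);    // wraps back to alice
```

Because last_index lives in a single routing_state_sheet row rather than per-lead, assignment stays fair across all intake and re-routing paths, which is also why the "no concurrency locking" limitation matters for parallel runs.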
by Abdullah Alshiekh
🧩 What Problem Does It Solve?

Meta's ad forms often generate unqualified leads from casual scrollers. This workflow uses WhatsApp and AI to automatically verify, qualify, and prioritize real leads — saving time and boosting sales efficiency.

📝 Description

This workflow automates lead qualification for businesses using Meta Ads (Facebook/Instagram Lead Ads) to filter out irrelevant leads. It ensures only confirmed prospects enter your CRM by:
- Collecting new Facebook leads
- Verifying via WhatsApp confirmation
- Classifying responses with AI
- Updating CRM status based on intent

When a new Facebook lead arrives:
1. Lead details are extracted (name/phone/email)
2. Zoho CRM is checked for existing contacts
3. A WhatsApp confirmation request is sent
4. AI classifies the response (confirmed/declined/human/invalid)
5. CRM status is updated automatically
6. The sales team receives only verified leads

🎯 Key Advantages for Meta Ads
✅ Blocks 60%+ irrelevant leads based on WhatsApp non-response
✅ Reduces fake submissions by requiring active confirmation
✅ Prevents CRM bloat through duplicate checking
✅ Identifies hot leads via instant "human_requested" escalation
✅ Saves sales team hours by auto-declining "no" responses

🛠️ Features
- Facebook Lead Ads integration via Graph API
- WhatsApp messaging via Twilio
- AI response classification (Gemini)
- Zoho CRM synchronization
- Duplicate lead prevention
- Customizable confirmation flow
- Error-resistant JSON parsing
- CRM owner assignment
- Status-based routing

🔧 Requirements
- Facebook Access Token with ads_management & leads_retrieval permissions
- Twilio account with a WhatsApp-enabled number
- Zoho CRM with a custom "Status" field
- Gemini API key (or alternative LLM)
- n8n credentials configured for: Twilio (API SID/token), Zoho CRM (OAuth2), Google Gemini (or alternative LLM)

⚙️ Customization Tips
1. Adjust classification criteria: modify the AI prompt in the Classify Response (AI) node
2. Customize CRM status values: update the field IDs in the Zoho nodes
3. Modify messaging: edit the WhatsApp templates in Send WhatsApp Confirmation
4. Set owner assignment: replace the owner ID in the Prepare Owner ID node

🧠 Use Case Examples
- Real estate agencies: filter speculative inquiries from serious buyers
- Medical clinics: verify appointment requests before scheduling
- SaaS companies: qualify free trial sign-ups
- Education providers: confirm course interest before counselor assignment
- Auto dealerships: screen test drive requests from tire-kickers

If you need help, get in touch on LinkedIn.
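The "error-resistant JSON parsing" feature can be sketched as follows: LLMs sometimes wrap their JSON in extra prose, so a defensive parser extracts the first JSON object and falls back to "invalid" when parsing fails. The label set (confirmed/declined/human_requested/invalid) comes from the workflow description; the function name and shape are illustrative.

```javascript
// Defensive parsing of the AI classifier's reply.
function parseClassification(aiText) {
  const allowed = ['confirmed', 'declined', 'human_requested', 'invalid'];
  try {
    const match = aiText.match(/\{[\s\S]*\}/);   // grab the JSON object, if any
    const parsed = JSON.parse(match ? match[0] : aiText);
    // Unknown labels are treated as "invalid" rather than crashing the run.
    return allowed.includes(parsed.intent) ? parsed.intent : 'invalid';
  } catch (err) {
    return 'invalid';                            // non-JSON reply
  }
}

console.log(parseClassification('Result: {"intent": "confirmed"}')); // confirmed
console.log(parseClassification('Sorry, I cannot answer.'));         // invalid
```

Routing then switches on the returned label, so a malformed AI reply degrades to the "invalid" branch instead of halting the workflow.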
by Matthieu
🔧 AI-Powered Cold Call Machine

🎯 Purpose
The AI-Powered Cold Call Machine is a fully automated workflow designed to generate qualified leads from LinkedIn, evaluate them using AI-based scoring, identify key decision-makers, and generate personalized cold call scripts. All results are saved to a Google Sheets-based CRM.

⚙️ How It Works

1. Initialization
Triggered either manually or via schedule. Pulls configuration from a Google Sheet's Settings tab (e.g., target product, keywords, company size, API key).

2. Company Search on LinkedIn
Uses the Ghost Genius API to search for companies based on cleaned, relevant keywords extracted by OpenAI. Handles pagination, up to 1,000 companies per batch.

3. Company Filtering
Each company goes through:
- Data enrichment via Ghost Genius (website, size, followers, etc.)
- Filtering: must have a LinkedIn page with a website and 200+ followers
- Deduplication: checks whether the company already exists in the CRM

4. AI-Based Company Scoring
A specialized AI model scores each company from 0 to 10 based on:
- Industry fit
- Size/location alignment
- Potential pain points that match your offering
If the company is new and relevant (score ≥ 7), it is saved in the Companies sheet.

5. Decision Maker Identification
Uses the Sales Navigator API (via Ghost Genius) to find employees with targeted job titles. For each matching profile, the workflow:
- Enriches contact data (title, bio, etc.)
- Retrieves the phone number (if available)
- Generates a 20-second personalized cold call script using OpenAI, based on company and profile data
- Saves all information in the Leads tab of the CRM
If no decision maker is found, the company status is marked accordingly.

📈 Outcome
- A fully enriched, qualified lead database
- Custom cold call scripts ready to be used by SDRs or founders
- Zero manual work: from search to lead generation, everything is automated
💡 Use Case Perfect for SDRs, founders, or growth marketers looking to scale cold outreach without sacrificing personalization or running into LinkedIn scraping limits.
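The filtering and scoring gate described in steps 3 and 4 can be sketched as a single predicate. Field names, the follower threshold, and the score cutoff are taken from the description; the function shape itself is an illustrative assumption:

```javascript
// Decide whether a searched company should be saved to the Companies sheet:
// it must have a website, 200+ followers, not already exist in the CRM,
// and receive an AI relevance score of at least 7 out of 10.
function shouldSaveCompany(company, existingIds, score) {
  if (!company.website) return false;            // LinkedIn page with website
  if (company.followers < 200) return false;     // 200+ follower threshold
  if (existingIds.has(company.id)) return false; // deduplicate against CRM
  return score >= 7;                             // AI relevance cutoff
}

const existing = new Set(['c-001']);
const candidate = { id: 'c-002', website: 'https://example.com', followers: 540 };
console.log(shouldSaveCompany(candidate, existing, 8)); // true
console.log(shouldSaveCompany(candidate, existing, 5)); // false: score below 7
```

Putting deduplication before the AI call also keeps OpenAI costs down, since companies already in the CRM never reach the scoring step.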
by Rahul Joshi
Description: Turn raw customer feedback into actionable insights with this intelligent n8n workflow template! Automatically capture reviews from Google Sheets, run AI-driven sentiment and intent analysis, and enrich your dataset with structured insights—no manual review required.

This automation connects to your feedback form responses, processes reviews with an AI model, classifies intent, evaluates sentiment, assigns a score, and generates concise summaries. The results are then parsed, merged with the original customer details, and stored in a structured Google Sheet for easy tracking. Perfect for sales, product, and customer success teams looking to streamline lead qualification and feedback analysis.

What This Template Does:
📊 Captures new customer feedback from Google Sheets in real time
🧠 Uses AI to classify intent (praise, complaint, suggestion, etc.)
😊 Detects sentiment (positive, neutral, negative, or mixed)
🔢 Assigns a review score (1–10) for quick lead qualification
📝 Generates short, meaningful summaries of customer reviews
📂 Saves enriched data into a structured destination sheet
🌟 100% hands-free: just let AI process and organize your feedback

Built-in Logic Ensures:
✔️ Clean JSON-based AI output (intent, sentiment, score, summary)
✔️ Customer details remain tied to their feedback and insights
✔️ Final dataset is ready for reporting, CRM import, or dashboards

Requirements:
- Google Sheets with customer feedback form responses
- Google Sheets account for storing enriched data
- Azure OpenAI (or compatible) account for AI analysis
- n8n instance (self-hosted or cloud)

Perfect For:
- Sales teams qualifying leads based on review sentiment
- Product managers analyzing user feedback at scale
- Customer success teams identifying risks and opportunities
- Analysts turning unstructured reviews into actionable insights
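The parse-and-merge step can be sketched as follows: the AI returns a JSON object with intent, sentiment, score, and summary, which is merged back onto the original sheet row so customer details stay tied to their insights. The row field names here are illustrative, not the template's exact column keys:

```javascript
// Merge the AI's structured output back onto the original feedback row.
function enrichFeedback(row, aiOutput) {
  const insights = JSON.parse(aiOutput);  // expects clean JSON from the model
  return {
    ...row,                          // original customer details preserved
    intent: insights.intent,         // praise / complaint / suggestion / ...
    sentiment: insights.sentiment,   // positive / neutral / negative / mixed
    score: insights.score,           // 1-10 qualification score
    summary: insights.summary,       // short AI-written summary
  };
}

const row = { name: 'Ada', email: 'ada@example.com', feedback: 'Great tool!' };
const ai = '{"intent":"praise","sentiment":"positive","score":9,"summary":"Happy user"}';
console.log(enrichFeedback(row, ai));
```

Because the merge spreads the original row first, the AI output can never overwrite the customer's identifying fields, which is what keeps the final dataset safe for CRM import.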
by Łukasz
What Is This?
This workflow is an automated employee time tracking and reporting system that monitors weekly work hours via TMetric, then delivers personalized summaries directly to each team member on Slack. It compares actual logged hours against individual work schedules, accounts for approved time-off requests, and flags anyone falling behind their hourly quota. It works with any team using TMetric for time tracking and Slack for communication, making it ideal for remote-first companies, IT agencies, and consulting firms managing distributed teams.

Who Is It For?
Designed for team leads, operations managers, and HR departments in small to medium-sized companies who want effortless visibility into employee time tracking compliance — without micromanaging or manually pulling reports. Software development agencies, MSPs, and consulting firms billing clients by the hour will find this especially valuable, as it keeps employees accountable and ensures time entries are complete and up to date before payroll or invoicing cycles.

How Does It Work?
The workflow operates in two modes: a one-time Setup and a recurring Weekly Report.

Setup builds a mapping table between TMetric and Slack user accounts. It checks whether the required n8n Data Table already exists, creates it if not, then fetches all TMetric and Slack users. Each Slack user receives a message asking them to identify their corresponding TMetric account via an interactive dropdown form. Responses are saved to the Data Table for use in subsequent runs.

Weekly Report runs automatically every Friday at 4:00 PM. It fetches users, their work schedules, and approved time-off requests from TMetric, then processes each employee individually — retrieving their time entries, filtering out any currently running tasks, and calculating total hours worked. The workflow accounts for time-off when determining expected hours, then computes how far ahead or behind schedule each person is. Finally, it looks up each user's Slack ID from the mapping table and sends them a personalized Block Kit message summarizing their week: hours worked, schedule adherence status, and a project-by-project breakdown.

How To Set It Up?

Prerequisites:
- An active n8n account or self-hosted instance
- A TMetric account with API access
- A Slack workspace with bot permissions

Step 1: Configure the Globals node
- tmAccountId — found in your TMetric URL: https://app.tmetric.com/#/tracker/YOUR_ID/
- Allow hours missing percentage — tolerance threshold before flagging a user as behind (default: 10%)
- tmetricToSlackUserDataTableName — name of the n8n Data Table used for user mapping (default: Tmetric to Slack user map)

Step 2: Set up TMetric authentication
Open any HTTP Request node, select Authentication → Generic Credential Type → Header Auth, and create a new credential:
- Name: Authorization
- Value: <Your TMetric API Key>
You can generate an API key in your TMetric account settings. Full API documentation is available here.

Step 3: Configure Slack credentials
Connect your Slack workspace using OAuth2 in the Slack nodes. Ensure your bot has permissions to list users and send direct messages.

Step 4: Run Setup
Trigger the Setup manual trigger once. It will create the mapping table and send a form to each Slack user. Once all responses are collected, the workflow is ready for scheduled operation.

Step 5: Customize the Slack message (optional)
Edit the Send a message node to adjust the message layout, change delivery from DM to a channel, or add additional fields to the report.

What's More?
- Flexible time-off handling: approved time-off requests are subtracted from expected working hours, ensuring employees aren't penalized for legitimate absences.
- Running task filtering: time entries without an end time are excluded from calculations, preventing inflated hours from active timers.
- Visual status indicators: Slack messages use color-coded action buttons — green for on-track employees, red for those behind schedule — giving recipients immediate visual feedback at a glance.
- Project breakdown: each report includes a per-project summary showing time spent across the reporting period, giving employees and managers full transparency into where hours went.
- Safe re-runs: the Setup flow can be triggered again at any time to onboard new team members or correct mapping mistakes without affecting existing records.

Need help? Reach out at developers@sailingbyte.com or visit sailingbyte.com. Happy hacking!
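The schedule-adherence calculation at the heart of the Weekly Report can be sketched like this: expected hours are reduced by approved time-off, and a user is only flagged when the shortfall exceeds the configured tolerance (the 10% default from the Globals node). The function name and return shape are illustrative assumptions:

```javascript
// Compare logged hours against the schedule, net of approved time-off,
// flagging the user only when the shortfall exceeds the tolerance.
function scheduleStatus(loggedHours, scheduledHours, timeOffHours, tolerancePct) {
  const expected = Math.max(scheduledHours - timeOffHours, 0); // never negative
  const shortfall = expected - loggedHours;     // negative means ahead of schedule
  const behind = shortfall > expected * (tolerancePct / 100);
  return { expected, shortfall, behind };
}

// 40h schedule, 8h approved time-off, 30h logged, 10% tolerance:
// expected 32h, shortfall 2h, which is under the 3.2h tolerance.
console.log(scheduleStatus(30, 40, 8, 10));
```

Subtracting time-off before applying the tolerance is what ensures employees are not penalized for legitimate absences, per the "flexible time-off handling" feature above.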
by David Olusola
📊 Log BTC/ETH Prices and USD Exchange Rates to Notion (Hourly)

📌 Overview
This workflow automatically logs live crypto prices (Bitcoin & Ethereum) and fiat exchange rates (USD→EUR / USD→NGN) into a Notion database every hour. Each entry becomes a new row in your Notion dashboard, letting you visualize currency and crypto trends side by side. It's perfect for traders, analysts, and anyone who wants a single source of truth in Notion without needing multiple apps open. With hourly updates, you'll have a clean data history for building rollups, trend graphs, or financial dashboards.

⚙️ How it works
1. Schedule Trigger — runs every hour (adjustable via cron).
2. HTTP Request (ExchangeRate-API) — fetches USD-base FX rates (no API key required).
3. HTTP Request (CoinGecko) — fetches BTC & ETH prices + 24h % change (no API key required).
4. Merge — combines both payloads.
5. Code (v2) — formats a Notion-ready JSON payload with the correct fields.
6. Notion node — creates a new page in your database with mapped properties.

Example row in Notion:
Title: Crypto+FX — 2025-09-08 09:00
BTC: 112,417 | BTC_24h: +1.22%
ETH: 4,334.57 | ETH_24h: +1.33%
USD→EUR: 0.854 | USD→NGN: ₦1,524.54

🛠 Setup Guide

1) Create the Notion database
In Notion, create a new database (Table view). Add these columns with matching property types:

| Column | Property Type |
|------------|---------------|
| Title | Title |
| BTC | Number |
| BTC_24h | Number |
| ETH | Number |
| ETH_24h | Number |
| USD_EUR | Number |
| USD_NGN | Number |

2) Connect Notion in n8n
In the Notion "Create Page" node, connect with your Notion OAuth2 credentials. On first use, you'll be redirected to authorize n8n with your Notion workspace. Copy your Database ID (from the Notion URL) and paste it into the node.

3) Map the Code output
The Code node outputs the JSON fields BTC, BTC_24h, ETH, ETH_24h, USD_EUR, and USD_NGN. In the Notion node, map each property:
- BTC → {{$json.BTC}}
- BTC_24h → {{$json.BTC_24h}}
- ETH → {{$json.ETH}}
- ETH_24h → {{$json.ETH_24h}}
- USD_EUR → {{$json.USD_EUR}}
- USD_NGN → {{$json.USD_NGN}}

4) Test
Run the workflow once. Confirm that a new page is added to your Notion database with all values filled.

🎛 Customization
- Cadence: change the schedule to 10 minutes, 4 hours, or daily depending on your needs.
- Extra coins: add more IDs (e.g., solana, bnb) in the CoinGecko call and update the Code node.
- Extra FX pairs: expand from ExchangeRate-API (e.g., USD→GBP, USD→ZAR).
- Notion dashboards: use rollups, charts, and linked databases for trend visualization.
- Formatting: add emojis, colors, or sections in your Notion view for clarity.

🧩 Troubleshooting
- Page not created: verify the Database ID and ensure the Notion API integration has access.
- Empty fields: check that property names in Notion exactly match those used in the Code node.
- Wrong data type: make sure properties are set as Number, not Text.
- Rate limits: CoinGecko and ExchangeRate-API are free but may rate-limit if called too often; keep cadence reasonable (hourly recommended).
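A sketch of what the Code (v2) node might produce: a flat object whose keys match the Notion columns above. The response paths for CoinGecko (simple/price with 24h change) and ExchangeRate-API (rates map) follow those APIs' real shapes; the function name and exact wiring are assumptions:

```javascript
// Flatten the merged CoinGecko + ExchangeRate-API payloads into one object
// whose keys match the Notion database columns (BTC, BTC_24h, USD_EUR, ...).
function buildNotionRow(coins, fx, timestamp) {
  return {
    Title: `Crypto+FX — ${timestamp}`,
    BTC: coins.bitcoin.usd,
    BTC_24h: coins.bitcoin.usd_24h_change,
    ETH: coins.ethereum.usd,
    ETH_24h: coins.ethereum.usd_24h_change,
    USD_EUR: fx.rates.EUR,
    USD_NGN: fx.rates.NGN,
  };
}

// Payloads shaped like the two API responses:
const coins = {
  bitcoin: { usd: 112417, usd_24h_change: 1.22 },
  ethereum: { usd: 4334.57, usd_24h_change: 1.33 },
};
const fx = { rates: { EUR: 0.854, NGN: 1524.54 } };
console.log(buildNotionRow(coins, fx, '2025-09-08 09:00'));
```

Each key here maps one-to-one to a {{$json.…}} expression in the Notion node, which is why the "Empty fields" troubleshooting tip stresses exact name matching.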
by Fahmi Fahreza
Sync QuickBooks Chart of Accounts to Google BigQuery

Keep a historical, structured copy of your QuickBooks Chart of Accounts in BigQuery. This n8n workflow runs weekly, syncing new or updated accounts for better reporting and long-term tracking.

Who Is This For?
- Data Analysts & BI Developers: build a robust financial model and analyze changes over time.
- Financial Analysts & Accountants: track structural changes in your Chart of Accounts historically.
- Business Owners: maintain a permanent archive of your financial structure for future reference.

What the Workflow Does
- Extract: every Monday, fetch accounts created or updated in the past 7 days from QuickBooks.
- Transform: clean the API response, manage currencies, create stable IDs, and format the data.
- Format: convert the cleaned data into an SQL insert-ready structure.
- Load: insert or update account records in BigQuery.

Setup Steps
1. Prepare BigQuery: create a table (e.g., quickbooks.accounts) with columns matching the final SQL insert step.
2. Add credentials: connect QuickBooks Online and BigQuery credentials in n8n.
3. Configure the HTTP node: open 1. Get Updated Accounts from QuickBooks and replace {COMPANY_ID} with your real Company ID. Press Ctrl + Alt + ? in QuickBooks to find it.
4. Configure the BigQuery node: open 4. Load Accounts to BigQuery, select the correct project, and make sure your dataset and table name are correctly referenced in the SQL.
5. Activate: save and activate the workflow. It will now run every week.

Requirements
- QuickBooks Online account
- QuickBooks Company ID
- Google Cloud project with BigQuery and a matching table

Customization Options
- Change sync frequency: adjust the schedule node to run daily, weekly, etc.
- Initial backfill: temporarily change the API query to select * from Account for a full pull.
- Add fields: modify 2. Structure Account Data to include or transform fields as needed.
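The "create stable IDs" part of the Transform step could look like the sketch below: deriving a row ID from the QuickBooks account Id plus its last-updated timestamp yields one history row per account revision, so re-running the sync does not duplicate unchanged accounts. The QuickBooks field paths (Id, Name, AccountType, CurrencyRef.value, MetaData.LastUpdatedTime) follow the real Account object; the output column names are assumptions, not the template's exact schema:

```javascript
// Map one QuickBooks Account object to a BigQuery-ready row with a
// stable, revision-scoped identifier.
function toBigQueryRow(account) {
  const updatedAt = account.MetaData.LastUpdatedTime;  // QuickBooks field
  return {
    stable_id: `${account.Id}_${updatedAt}`,  // one row per account revision
    name: account.Name,
    account_type: account.AccountType,
    currency: account.CurrencyRef ? account.CurrencyRef.value : 'USD',
    synced_at: updatedAt,
  };
}

const account = {
  Id: '35',
  Name: 'Checking',
  AccountType: 'Bank',
  CurrencyRef: { value: 'USD' },
  MetaData: { LastUpdatedTime: '2024-06-03T10:00:00Z' },
};
console.log(toBigQueryRow(account).stable_id); // 35_2024-06-03T10:00:00Z
```

Loading on stable_id (e.g., via a MERGE keyed on that column) is one way the Load step could achieve the insert-or-update behavior described above.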
by Cheng Siong Chin
How It Works
This workflow automates real estate lead qualification and routing by enriching leads from multiple sources with AI-powered analysis and directing them to appropriate sales agents based on priority. Designed for real estate brokers, sales managers, and lead generation teams, it solves the critical challenge of quickly identifying high-value prospects from high-volume lead streams while ensuring timely agent follow-up.

The system triggers on a schedule, fetches leads simultaneously from MLS portals and CRM/email sources, aggregates them into a unified dataset, and splits them for parallel processing. It then deploys AI agents using Anthropic's Claude to analyze lead quality, buying intent, budget capacity, and urgency. Based on enrichment scores, leads are routed to best-fit agents by priority tier, with engagement tracking logged to Google Sheets for performance monitoring and optimization.

Setup Steps
1. Configure the Schedule Trigger with your desired lead processing frequency
2. Set up MLS/portal API credentials in the Fetch Leads from MLS/Portals node
3. Configure CRM/email system API access in the Fetch Leads from CRM/Email node
4. Connect Anthropic API credentials for the AI Lead Enrichment Agent
5. Customize the Structured Output Parser with your lead scoring criteria
6. Update the Check Lead Priority node with your priority threshold rules
7. Configure agent routing logic in the Route to Best-Fit Agent nodes by priority tier

Prerequisites
Active Anthropic API account, MLS/real estate portal API access, CRM system with API integration

Use Cases
Inbound lead qualification from property portals, open house inquiry processing, email campaign lead scoring

Customization
Modify AI enrichment prompts for market-specific criteria; adjust priority scoring algorithms for business goals

Benefits
Reduces lead response time by 75% and ensures high-value prospects receive immediate attention
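The priority-tier routing decision (the Check Lead Priority node) can be sketched as a simple threshold ladder over the enrichment score. The tier names and cutoffs here are illustrative assumptions; the template's node would hold the real rules:

```javascript
// Route a lead to a tier based on its AI enrichment score (0-10 assumed).
function routeLead(enrichmentScore) {
  if (enrichmentScore >= 8) return 'senior-agent';   // high value: immediate attention
  if (enrichmentScore >= 5) return 'standard-agent'; // mid tier
  return 'nurture-queue';                            // low priority follow-up
}

console.log(routeLead(9)); // senior-agent
console.log(routeLead(6)); // standard-agent
console.log(routeLead(3)); // nurture-queue
```

In the workflow, each returned tier corresponds to one of the Route to Best-Fit Agent branches, and the chosen tier is what gets logged to Google Sheets alongside the engagement tracking data.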
by Daniel Turgeman
How it works
- Runs every 6 hours to pull contacts needing a data refresh from HubSpot
- All contacts are bulk-enriched with Lusha in a single API call
- Each contact is scored against your ICP criteria (seniority, company size, industry)
- CRM records are updated with fresh data; high-scoring fast-track leads trigger a Slack alert

Set up steps
1. Install the Lusha community node
2. Add your Lusha API, HubSpot, and Slack credentials
3. Customize the ICP scoring weights in the Code node
4. Set the Slack channel for fast-track alerts and activate the workflow
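A hedged sketch of what the ICP scoring Code node might contain: weighted criteria for seniority, company size, and industry, plus a fast-track cutoff that would trigger the Slack alert. All weights, thresholds, and field names below are assumptions to illustrate the idea, not the template's actual values:

```javascript
// Weighted ICP score: each criterion contributes a normalized 0-1 value,
// scaled by its weight. Weights sum to 1, so the total score is 0-1.
const WEIGHTS = { seniority: 0.4, companySize: 0.35, industry: 0.25 };

function scoreContact(contact) {
  const seniority = ['c-level', 'vp', 'director'].includes(contact.seniority) ? 1 : 0.3;
  const size = contact.employees >= 200 ? 1 : contact.employees >= 50 ? 0.6 : 0.2;
  const industry = contact.industry === 'software' ? 1 : 0.4;
  const score =
    seniority * WEIGHTS.seniority +
    size * WEIGHTS.companySize +
    industry * WEIGHTS.industry;
  return { score: Number(score.toFixed(2)), fastTrack: score >= 0.8 };
}

const lead = { seniority: 'vp', employees: 500, industry: 'software' };
console.log(scoreContact(lead)); // a perfect-fit lead is fast-tracked
```

Tuning the WEIGHTS object (step 3 of the setup) is how you'd adapt the scoring to your own ICP without touching the rest of the workflow.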
by Trung Tran
Automatic Clean Up of Expired AWS ACM Certificates with Human Approval

> Automate the cleanup of expired AWS ACM certificates with Slack-based approval. This workflow helps maintain a secure and tidy AWS environment by detecting expired SSL certs, sending detailed Slack notifications to admins, and deleting them upon approval, ensuring full visibility and control over certificate lifecycle management.

🧑‍💼 Who's it for
This workflow is designed for:
- AWS administrators who want to keep their environment clean and secure
- DevOps teams managing the SSL lifecycle in AWS ACM
- IT admins needing visibility and control over expired cert removal
- Teams that use Slack for collaboration and approvals

⚙️ How it works / What it does
This automated workflow performs the following tasks on a daily schedule:
1. Fetches all ACM certificates in your AWS account.
2. Filters out the expired ones by comparing expiration date and status.
3. Sends a Slack approval message with certificate details to the admin team.
4. Waits for an approval response directly in Slack (✅ to approve deletion).
5. If approved, deletes the expired certificate using AWS ACM.
6. Finally, notifies the IT admin about the action taken.

🔧 How to set up
1. Create the workflow. Add the nodes as shown:
- Schedule Trigger
- AWS - ACM: listCertificates
- AWS - ACM: describeCertificate (loop per cert)
- IF node to filter expired certs
- Slack - Send & Wait for Reaction
- AWS - ACM: deleteCertificate
- Slack - Post Message to notify
2. Configure Slack. Create a Slack bot token with chat:write, reactions:read, and channels:read, then connect it in your Slack nodes.
3. Configure AWS credentials. Use an IAM user or role with acm:ListCertificates, acm:DescribeCertificate, and acm:DeleteCertificate.
4. Set the schedule: daily, weekly, or a custom cron expression.

📋 Requirements
| Component | Description |
|------------------|--------------------------------------|
| AWS ACM Access | IAM permissions for ACM actions |
| Slack Bot Token | With chat:write & reactions:read |
| n8n Environment | Self-hosted or n8n Cloud |
| Slack Channel | Where approval messages will be sent |

🛠️ How to customize the workflow
- 🕒 Change waiting time: adjust the wait time before checking Slack reactions in the sendAndWait node (default 1 hour).
- 👥 Change Slack target: change the Slack channel or tag specific people (<@U123456>).
- 📓 Add logging: add Google Sheets, Notion, or DynamoDB to log certificate details and approval decisions.
- 🧪 Add dry-run/test mode: use an IF node before deletion to simulate removal when ENV === dry-run.