by Cheng Siong Chin
**How It Works**
Scheduled triggers initiate automated contract reviews. The system fetches documents from cloud storage and email, then uses AI to extract key terms, obligations, and compliance requirements. Multi-model parsing identifies gaps, inconsistencies, and potential risks. A scoring engine evaluates severity and routes alerts to the appropriate channels. The workflow then updates the CLM system and produces audit-ready documentation for tracking and governance.

**Setup Instructions**
- Storage: Configure access to your Google Drive or webhook-based document repository.
- Email: Connect Gmail to automatically ingest contract-related emails.
- AI Extraction: Add the OpenAI API key and define extraction prompts for obligations and terms.
- CLM System: Enter credentials for your contract lifecycle management platform.
- Alerts: Set up Google Sheets logging and connect dashboard endpoints for risk and compliance alerts.

**Prerequisites**
Cloud storage access; Gmail credentials; OpenAI API key; CLM system credentials; document processing license

**Use Cases**
Contract renewal tracking; compliance audits; risk management; vendor agreement reviews; regulatory adherence monitoring

**Customization**
Adjust risk thresholds; modify extraction rules; add Slack notifications; extend compliance frameworks

**Benefits**
Reduces review time by 80%; catches compliance gaps; automates audit trails
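The scoring engine described above can be pictured as a small function that weighs each finding and picks an alert channel. This is a minimal sketch; the weights, finding types, and threshold are illustrative assumptions, not the template's actual values.

```javascript
// Illustrative severity scoring for contract findings. The weights,
// finding-type names, and channel threshold are assumptions — adjust
// them to your own risk policy.
const WEIGHTS = { missing_clause: 3, inconsistency: 2, compliance_gap: 5 };

function scoreFindings(findings) {
  // Sum a weight per finding type; unknown types count as 1.
  const score = findings.reduce(
    (sum, f) => sum + (WEIGHTS[f.type] || 1), 0
  );
  // Route by severity band: high scores go to an alert channel,
  // low scores are only logged for the audit trail.
  const channel = score >= 5 ? 'alerts' : 'audit-log';
  return { score, channel };
}

console.log(scoreFindings([
  { type: 'compliance_gap' },
  { type: 'inconsistency' },
])); // { score: 7, channel: 'alerts' }
```

In n8n this logic would typically live in a Code node between the AI extraction step and the alert branches.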
by Cheng Siong Chin
**How It Works**
This workflow automates policy compliance validation and approval orchestration through AI-driven assessment. Designed for compliance departments, legal teams, and governance officers, it solves the critical challenge of ensuring policy adherence while managing approval workflows that require human oversight for critical decisions.

The system operates on scheduled intervals, fetching data from policy databases and audit program performance metrics, then merging these sources for comprehensive compliance analysis. It employs a dual-agent AI framework for policy validation and execution orchestration, detecting violations, assessing severity, and determining required approval actions. The workflow intelligently routes findings based on compliance status, escalating violations through human approval checkpoints while maintaining detailed audit trails. By coordinating multi-channel notifications through email and Slack alongside synchronized logging, it ensures stakeholders receive timely alerts while creating complete traceability for regulatory examinations and internal audits.
**Setup Steps**
1. Configure Schedule Trigger with policy review frequency
2. Connect Workflow Configuration node with compliance parameters
3. Set up Fetch Policy Data and Fetch Audit Program Performance Data nodes
4. Configure Merge Data Sources node for consolidation logic
5. Connect Policy Validation Agent with OpenAI/Claude API credentials
6. Set up validation processing
7. Configure Route by Compliance Status node with severity classification
8. Connect Execution Orchestration Agent with AI API credentials
9. Set up orchestration processing

**Prerequisites**
OpenAI/Claude API credentials for AI validation agents, policy management system API access

**Use Cases**
Financial institutions validating AML policy compliance, healthcare organizations ensuring HIPAA adherence

**Customization**
Adjust validation criteria for industry-specific regulations

**Benefits**
Reduces compliance review cycles by 70%, eliminates manual policy monitoring
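The Route by Compliance Status step classifies each finding's severity and decides whether a human approval checkpoint is needed. A minimal sketch, where the field name, severity labels, and thresholds are assumptions rather than the template's real configuration:

```javascript
// Hypothetical severity classifier for the compliance-routing step.
// The `violations` field and the approval rule are illustrative.
function routeFinding(finding) {
  const severity =
    finding.violations >= 3 ? 'critical' :
    finding.violations >= 1 ? 'major' : 'compliant';
  // Critical and major findings are escalated to a human approval
  // checkpoint; compliant items flow straight to the audit log.
  return { severity, requiresApproval: severity !== 'compliant' };
}

console.log(routeFinding({ violations: 2 }));
// { severity: 'major', requiresApproval: true }
```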
by moosa
**n8n Wizard 🪄 – Your personal AI assistant inside WhatsApp**
This workflow transforms WhatsApp into a powerful personal AI using n8n + Green-API. Send text or voice messages — the assistant understands intent and handles daily tasks automatically.

**Key features**
- 💰 Expense & income tracking — record spending, view summaries & category breakdowns (Google Sheets, append-only)
- ✓ Google Tasks management — create, list, update, delete tasks & reminders
- 🐦 Post to X/Twitter — write and publish single tweets or short threads
- 📧 Gmail search & summaries — find recent/unread emails by sender, label, keyword (read-only)
- 🌐 Real-time answers — current weather, news, exchange rates, facts via web search
- 🧮 Quick calculations — math, percentages, currency conversions
- 🎤 Full voice support — incoming voice messages transcribed (Whisper), replies can be spoken (TTS)

**How it works**
1. Green-API webhook receives the message (text or audio)
2. Voice → transcribed automatically
3. Main intelligent router agent selects one sub-agent/tool
4. Action executed → result sent back as text or voice (if the input was voice)

**Setup requirements**
- Green-API instance (webhook + send endpoints)
- OpenAI API key (chat, Whisper, TTS)
- Google Sheets, Google Tasks, Twitter/X, Gmail (read scope), SerpAPI credentials

Strict routing rules prevent misuse — no deletions, no guessing values, one tool per clear intent. Start commanding: “spent 3200 on groceries”, “remind dentist tomorrow”, “tweet: loving n8n!”, “weather in Lahore now”
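The "one tool per clear intent" rule can be approximated with a keyword pre-router in front of the LLM agent. This is a sketch under assumed intent names and patterns; the actual template delegates routing to an AI agent rather than regexes:

```javascript
// Hypothetical first-pass intent matcher. Patterns and intent names
// are illustrative; ambiguous messages fall through to the chat agent
// instead of guessing (matching the template's "no guessing" rule).
const ROUTES = [
  { intent: 'expense',  pattern: /^spent\s+\d+/i },
  { intent: 'reminder', pattern: /^remind\b/i },
  { intent: 'tweet',    pattern: /^tweet:/i },
  { intent: 'weather',  pattern: /^weather\b/i },
];

function routeMessage(text) {
  const hit = ROUTES.find(r => r.pattern.test(text.trim()));
  // Exactly one tool per message; no match means general chat.
  return hit ? hit.intent : 'chat';
}

console.log(routeMessage('spent 3200 on groceries')); // 'expense'
console.log(routeMessage('weather in Lahore now'));   // 'weather'
```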
by Cheng Siong Chin
**How It Works**
This workflow automates end-to-end carbon emissions monitoring, strategy optimisation, and ESG reporting using a multi-agent AI supervisor architecture in n8n. Designed for sustainability managers, ESG teams, and operations leads, it eliminates the manual effort of tracking emissions, evaluating reduction strategies, and producing compliance reports. Data enters via scheduled pulls and real-time webhooks, then merges into a unified feed processed by a Carbon Supervisor Agent. Sub-agents handle monitoring, optimisation, policy enforcement, and ESG reporting. Approved strategies are auto-executed or routed for human sign-off. Outputs are consolidated and pushed to Slack, Google Sheets, and email, keeping all stakeholders informed. The workflow closes the loop from raw sensor data to actionable ESG dashboards with minimal human intervention.

**Setup Steps**
1. Connect scheduled trigger and webhook nodes to your emissions data sources.
2. Add credentials for Slack (bot token), Gmail (OAuth2), and Google Sheets (service account).
3. Configure the Carbon Supervisor Agent with your preferred LLM (OpenAI or compatible).
4. Set approval thresholds in the Check Approval Required node.
5. Map the Google Sheets document ID for the ESG report and KPI dashboard nodes.

**Prerequisites**
- OpenAI or compatible LLM API key
- Slack bot token
- Gmail OAuth2 credentials
- Google Sheets service account

**Use Cases**
Corporate sustainability teams automating monthly ESG reporting

**Customisation**
Swap LLM models per agent for cost or accuracy trade-offs

**Benefits**
Eliminates manual emissions data aggregation and report generation
by WeblineIndia
**Shopify New Orders Auto-Sync to Google Sheets & AI Analysis**
This workflow automatically fetches new Shopify orders on a schedule, detects only newly created orders using a stored sync timestamp, enriches them with AI, and stores clean, structured data in Google Sheets for reporting and tracking.

You receive:
- **Automatic order sync without webhooks**
- **Duplicate-free processing using last-sync tracking**
- **AI-generated order insights (category, priority, notes)**
- **Clean Google Sheets storage for reporting**

Ideal for teams that want reliable order tracking, analysis, and reporting without relying on Shopify webhooks.

**Quick Start – Implementation Steps**
1. Connect your Shopify API credentials.
2. Set the Schedule Trigger (e.g., every 15 minutes).
3. Connect your Google Sheets account and select a sheet.
4. Configure the AI node for order analysis.
5. Activate the workflow — automated order sync begins.

**What It Does**
This workflow automates Shopify order processing:
- Runs on a defined schedule.
- Loads the last successful sync time from workflow static data.
- Fetches Shopify orders created after that time.
- Identifies only new orders to prevent duplicates.
- Extracts key order details: order ID & number, product name & quantity, prices & currency, shipping country.
- Uses AI to analyze each order and generate a product category, priority level, and internal review notes.
- Formats the final data into clean fields.
- Saves each new order as a row in Google Sheets.
- Updates the last sync timestamp after successful processing.
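The new-order detection above boils down to comparing each order's creation time against the stored sync timestamp and then advancing that marker. A self-contained sketch, assuming Shopify's `created_at` field; in the real workflow the marker lives in n8n workflow static data rather than a function parameter:

```javascript
// Simplified last-sync based new-order detection. `lastSyncIso` stands
// in for the value the workflow keeps in static data.
function filterNewOrders(orders, lastSyncIso) {
  const lastSync = new Date(lastSyncIso).getTime();
  const fresh = orders.filter(
    o => new Date(o.created_at).getTime() > lastSync
  );
  // Advance the sync marker to the newest order processed, so the
  // next run skips everything up to and including this batch.
  const newLastSync = fresh.length
    ? fresh.map(o => o.created_at).sort().at(-1)
    : lastSyncIso;
  return { fresh, newLastSync };
}

const { fresh, newLastSync } = filterNewOrders(
  [{ id: 1, created_at: '2024-05-01T10:00:00Z' },
   { id: 2, created_at: '2024-05-01T12:00:00Z' }],
  '2024-05-01T11:00:00Z'
);
console.log(fresh.map(o => o.id)); // [2]
```

If no order passes the filter, the marker is left unchanged, which matches the "exits safely without errors" behavior described below.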
**Who’s It For**
This workflow is ideal for:
- E-commerce operations teams
- Finance & accounting teams
- Store owners using Shopify
- Teams tracking orders in spreadsheets
- Businesses without Shopify webhook access
- Anyone needing automated order backups

**Requirements**
To run this workflow, you will need:
- n8n instance (self-hosted or cloud)
- Shopify Admin API access
- Google Sheets account
- Optional: AI credentials (OpenAI or compatible)
- Basic understanding of Shopify order data

**How It Works**
1. Scheduled Trigger – Workflow runs at fixed intervals.
2. Load Last Sync – Reads the last processed timestamp.
3. Fetch Shopify Orders – Retrieves recent orders.
4. New Order Detection – Filters only unprocessed orders.
5. AI Order Analysis – Adds category, priority, and notes.
6. Prepare Final Data – Cleans and structures fields.
7. Save to Google Sheets – Appends a new row per order.
8. Update Sync Time – Stores the latest successful run time.

If no new orders exist, the workflow exits safely without errors.

**Setup Steps**
1. Import the workflow JSON into n8n.
2. Open the Schedule Trigger and set the desired frequency.
3. Configure the Shopify node with your store credentials.
4. Connect the Google Sheets node and map columns correctly.
5. Review the AI prompt for customization.
6. Run the workflow once manually to initialize sync data.
7. Activate the workflow.

**How To Customize Nodes**
- Change Sync Frequency: Adjust the Schedule Trigger to run every few minutes, hourly, or daily.
- Customize AI Analysis: Modify the AI prompt to change priority rules, add risk or fraud flags, or generate internal comments.
- Extend Google Sheets Fields: You can add customer email, payment status, fulfillment status, or an AI confidence score.

**Add-Ons (Optional Enhancements)**
You can extend this workflow to:
- Send Slack or email notifications
- Generate daily summary reports
- Detect high-value orders
- Add fraud or risk scoring
- Store data in a database instead of Sheets
- Support multiple Shopify stores
- Create dashboards from Google Sheets

**Use Case Examples**
1. Operations Reporting – Track daily orders in a shared spreadsheet.
2. Finance & Accounting – Maintain an independent record of all sales.
3. Order Review – Use AI notes to quickly understand important orders.
4. Backup & Auditing – Keep an external copy of Shopify order data.

**Troubleshooting Guide**

| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| No orders fetched | lastSync too recent | Reset static data |
| Duplicate orders | Sync update missing | Ensure last sync node runs |
| AI output not parsed | Invalid JSON | Adjust AI prompt |
| Sheet not updating | Column mismatch | Match headers exactly |
| Workflow stops early | No new data | Enable “Always Output Data” |

**Need Help?**
If you need help extending this workflow by adding alerts, dashboards, advanced AI logic, or scaling it for production, our n8n automation developers at WeblineIndia are happy to assist with advanced n8n automation and Shopify integrations.
by Avkash Kakdiya
**How it works**
This workflow captures new form submissions, cleans the data, and stores it securely before applying AI-based lead qualification. It analyzes each message to assign a category, score, and priority. Based on this classification, leads are routed to the appropriate response path. The system ensures every lead is saved first, then enriched and responded to automatically.

**Step-by-step**

**Capture and clean lead data**
- Typeform Trigger – Listens for new submissions in real time.
- Sanitize Lead Data – Cleans names, formats emails, and extracts the domain.

**Store lead in database**
- Create a record – Saves lead details in Airtable with status “New”.

**Analyze and enrich with AI**
- AI Lead Analyzer – Uses OpenAI to classify, score, and prioritize leads.
- Merge – Combines original lead data with AI-generated insights.

**Route and respond automatically**
- Route by Category – Directs leads based on AI classification.
- Send a message – Sends a tailored email for sales inquiries.
- Send a message1 – Sends a confirmation email for support requests.
- Send a message2 – Sends a response for partnership inquiries.
- Send a message3 – Sends a fallback response for other categories.
- Send a message4 – Sends a Discord alert for high-priority leads.

**Why use this?**
- Ensures no lead is lost by storing data before processing
- Automatically prioritizes high-value opportunities using AI
- Reduces manual lead qualification and response time
- Provides personalized responses based on intent
- Enables real-time team alerts for important leads
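The Sanitize Lead Data step can be pictured as a small transform. The field names below are assumptions about the Typeform payload, not the template's exact mapping:

```javascript
// Illustrative lead sanitizer: trims and title-cases the name,
// lower-cases the email, and extracts the company domain.
// Field names are hypothetical, not the template's actual schema.
function sanitizeLead({ name, email }) {
  const cleanName = name.trim()
    .toLowerCase()
    .replace(/\b\w/g, c => c.toUpperCase());
  const cleanEmail = email.trim().toLowerCase();
  const domain = cleanEmail.split('@')[1] || '';
  return { name: cleanName, email: cleanEmail, domain };
}

console.log(sanitizeLead({
  name: '  jane DOE ',
  email: ' Jane.Doe@Example.COM ',
}));
// { name: 'Jane Doe', email: 'jane.doe@example.com', domain: 'example.com' }
```

Extracting the domain up front lets later steps (like the AI analyzer) treat company identity as a first-class field.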
by Rahul Joshi
**📊 Description**
Most learning newsletters send the same email to everyone. This workflow does the opposite — every subscriber gets a completely different email every single day, personalized to their topic interest, learning phase, and how far they've come in their journey. Built specifically for women's skill development programs, coaching platforms, and edtech creators who want to deliver real personalized learning at scale without manually writing hundreds of emails. You manage the content library, the AI handles the personalization, and the workflow handles the delivery — every morning at 9 AM without you touching anything.

**What This Workflow Does**
- ⏰ Triggers every morning at 9 AM automatically
- 📋 Fetches all subscribers from Google Sheets with their name, email, topic interest, and subscription date
- 🧮 Calculates each subscriber's day number, week number, and learning phase — Beginner, Intermediate, or Advanced — based on how long they've been subscribed
- 📚 Fetches your full content library and scores every lesson against each subscriber's topic interest and phase
- 🎯 Picks the best matching lesson per subscriber — falls back to day-index rotation if no strong match is found
- 🔁 Loops through each subscriber one at a time to ensure every email is individually generated
- 🤖 Sends each subscriber's profile and matched lesson to GPT-4o, which generates a fully personalized 3-paragraph lesson explanation, an actionable task, key takeaways, and a motivational quote from a woman leader in their field
- 📧 Builds a beautiful branded HTML email and sends it via Gmail
- 📝 Logs every delivery to a SendLog sheet with date, name, lesson title, phase, category, and AI snippet

**Key Benefits**
- ✅ Every subscriber gets a unique email — no generic blasts
- ✅ Learning phase auto-advances as subscribers stay longer
- ✅ GPT-4o adapts lesson tone and depth to Beginner, Intermediate, or Advanced
- ✅ Motivational quotes always come from women leaders in the relevant field
- ✅ Full delivery log in Google Sheets for tracking and analytics
- ✅ Works for any skill category — coding, finance, leadership, marketing, and more

**Features**
- Cron-based daily trigger at 9 AM
- Automatic learning phase calculation per subscriber
- Content scoring and matching engine
- Day-index fallback rotation for unmatched subscribers
- GPT-4o lesson personalization with phase-aware prompting
- Woman leader motivational quotes per field
- Branded HTML email template with inline CSS
- Dynamic subject line per subscriber
- Gmail delivery with individual personalization
- Full SendLog tracking in Google Sheets
- Loop-based processing — one subscriber at a time for accuracy

**Requirements**
- OpenAI API key (GPT-4o access)
- Google Sheets OAuth2 connection
- Gmail OAuth2 connection
- A configured Google Sheet with 3 sheets: Subscribers, ContentLibrary, SendLog

**Setup Steps**
1. Copy the Google Sheet template and grab your Sheet ID
2. Paste the Sheet ID into all Google Sheets nodes
3. Add your Google Sheets OAuth2 credentials
4. Add your OpenAI API key to the GPT-4o node
5. Add your Gmail OAuth2 credentials
6. Populate the Subscribers sheet with your learners
7. Populate the ContentLibrary sheet with your lessons — at least 5–10 per category
8. Run the workflow manually once to test with your first subscriber
9. Confirm the HTML email looks correct in your inbox

**Target Audience**
- 🎓 Women's skill development platforms and bootcamps
- 📧 Edtech creators running personalized learning newsletters
- 💼 Career coaches who want to deliver daily value to their community
- 🤖 Automation agencies building AI-powered email learning systems for clients
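The day-number and phase calculation this template describes is simple date arithmetic. A sketch, where the 30-day and 90-day phase boundaries are assumptions (the template's actual thresholds may differ):

```javascript
// Hypothetical learning-phase calculator. The 30/90-day phase
// boundaries are illustrative assumptions, not the template's values.
function learningProgress(subscribedAtIso, todayIso) {
  const MS_PER_DAY = 24 * 60 * 60 * 1000;
  const elapsed = new Date(todayIso) - new Date(subscribedAtIso);
  const dayNumber = Math.floor(elapsed / MS_PER_DAY) + 1; // day 1 = signup day
  const weekNumber = Math.ceil(dayNumber / 7);
  const phase =
    dayNumber <= 30 ? 'Beginner' :
    dayNumber <= 90 ? 'Intermediate' : 'Advanced';
  return { dayNumber, weekNumber, phase };
}

console.log(learningProgress('2024-01-01', '2024-02-14'));
// { dayNumber: 45, weekNumber: 7, phase: 'Intermediate' }
```

Because the phase derives purely from the subscription date, it auto-advances with no extra state to maintain, which is what makes the "phase auto-advances" benefit above essentially free.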
by oka hironobu
**Who is this for**
Customer support teams and operations managers who receive support requests via email and need automated triage. Works well for small to mid-size teams using Notion as their ticket tracker.

**What this workflow does**
This workflow watches a Gmail inbox for incoming support emails. Each email is analyzed by Gemini AI to determine its category (billing, technical, feature request, or general), priority level, and a suggested response draft. A new page is created in a Notion database with all classified fields. Critical tickets trigger an immediate Slack alert to the on-call team, while all tickets get a summary notification.

**Setup**
1. Add a Gmail OAuth2 credential and configure label or address filters.
2. Add a Google Gemini API credential for email classification.
3. Add a Notion API credential and create a database with columns: Title, Category, Priority, Status, Summary.
4. Add a Slack OAuth2 credential and set your alerts channel.

**Requirements**
- Gmail account with OAuth2 access
- Google Gemini API key
- Notion workspace with API integration enabled
- Slack workspace with OAuth2 app

**How to customize**
- Edit the AI prompt in "Classify ticket with AI" to add more categories or adjust priority rules.
- Change the critical priority condition in "Is critical priority" to include high-priority tickets.
- Replace Notion with Airtable or Google Sheets for a different ticket backend.
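The "Is critical priority" check, and the suggested customization to also alert on high priority, can be sketched against an assumed classification payload. The field names depend entirely on your prompt in "Classify ticket with AI", so treat this shape as hypothetical:

```javascript
// Assumed shape of the Gemini classification output; field names are
// illustrative and driven by your own classification prompt.
const ticket = {
  category: 'technical',
  priority: 'high',
  summary: 'Checkout page returns a 500 error for EU customers',
};

// Original condition: Slack-alert the on-call team for critical only.
const isCritical = ticket.priority === 'critical';

// Customized condition from the notes above: also alert on high priority.
const shouldAlert = ['critical', 'high'].includes(ticket.priority);

console.log(isCritical, shouldAlert); // false true
```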
by WeblineIndia
**(Wealth Management) Client Question → Instant Answer Assistant (n8n + Google Sheets + AI + API)**
This workflow allows users to ask portfolio-related questions in a simple format (C001: your question). It validates the input, fetches client data and holdings from Google Sheets, retrieves live market prices via API, calculates portfolio performance, and generates a short AI-powered response.

**Quick Implementation Steps**
1. Connect Google Sheets (clients, holdings, interaction_logs)
2. Configure the Get Live Prices API endpoint
3. Add credentials for Generate AI Answer (Google Gemini)
4. Ensure the input format: C001: your question
5. Run test cases (valid + invalid inputs)

**What It Does**
This workflow acts as an intelligent financial assistant that responds to client portfolio queries in real time. It starts by receiving a chat message through the When chat message received node and processes it using Parse Client Message to extract the client ID and question. Once validated, it retrieves client details using Get Client Profile and portfolio holdings via Get Client Holdings. It then fetches live stock prices through the Get Live Prices API and merges all data using Merge Portfolio Data to compute metrics like invested value, current value, profit/loss, and returns. Finally, the workflow builds a structured prompt in Build AI Prompt and generates a concise response using Generate AI Answer, ensuring the reply is accurate, controlled, and based only on available data.

**Who Can Use This Workflow**
- Financial advisors managing multiple client portfolios
- Wealth management platforms
- Fintech developers building AI-driven assistants
- Anyone looking to automate portfolio Q&A workflows

**Requirements**
- n8n (self-hosted or cloud)
- Google Sheets account with: clients sheet, holdings sheet, interaction_logs sheet
- Live price API endpoint (used in Get Live Prices)
- Google Gemini API credentials (used in Generate AI Answer)

**How It Works & Setup Guide**
**1. Trigger & Input Parsing**
- When chat message received – receives input
- Parse Client Message – extracts client_id and question
- IF Valid Input? – validates the format
- If invalid: Build Invalid Input Response → Log Invalid Input → Return Invalid Response

**2. Client & Holdings Lookup**
- Get Client Profile – fetches client details
- IF Client Found? – ensures existence
- Get Client Holdings – retrieves holdings
- Prepare Symbols – extracts stock symbols
- IF Holdings Found? – validates data
- Failure handling: Build Client Not Found Response, Build No Holdings Response

**3. Market Data Fetching**
- Get Live Prices – calls the external API
- Normalize Price Response – standardizes output
- IF Price API Worked? – validates API success
- Failure handling: Build API Failed Response

**4. Portfolio Calculation**
Merge Portfolio Data computes: invested amount, current value, P&L, return %, best performer, weakest performer, missing prices.

**5. Market Context (Optional Enhancement)**
Get Market Context and Attach Market Context add optional insights like Nifty/Sensex movement and market tone.

**6. AI Response Generation**
- Build AI Prompt – creates a structured prompt
- Generate AI Answer – generates the response
- Extract AI Answer – extracts a clean reply

**7. Final Output**
The final response includes: client ID, question, AI reply, status, timestamp.

**How To Customize Nodes**
- Parse Client Message – modify input format rules
- Get Live Prices – replace with another API (Alpha Vantage, Twelve Data, etc.)
- Merge Portfolio Data – add more financial metrics (CAGR, allocation %, etc.)
- Build AI Prompt – change the tone (formal, advisory, aggressive)
- Generate AI Answer – replace Gemini with Hugging Face / OpenAI models

**Add-Ons (Enhancements)**
- Add Slack notifications for responses
- Save AI replies to CRM
- Add email delivery for clients
- Implement caching for the price API
- Add retry logic for API failures
- Support multi-client batch processing

**Use Case Examples**
- Client asks: “C001: How is my portfolio performing?”
- Advisor checks: “C002: Which stock is my top performer?”
- User queries: “C003: Why is my portfolio down?”
- Daily automated portfolio summary generation
- Integration with a chatbot for real-time advisory

There can be many more use cases depending on how this workflow is extended.

**Troubleshooting Guide**

| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| Invalid input error | Wrong format | Use C001: your question |
| Client not found | Missing in sheet | Check clients sheet |
| No holdings found | Empty data | Verify holdings sheet |
| API failed | Endpoint issue | Check API URL or timeout |
| AI reply empty | Model issue | Verify API credentials |
| Incorrect calculations | Missing price data | Check API response |

**🤝 Need Help?**
If you need help setting up this workflow, customizing nodes, or building advanced automation solutions, feel free to reach out. For professional support, custom workflow development, or enterprise-grade automation, contact our n8n workflow developers at WeblineIndia. We help businesses build scalable and intelligent automation systems tailored to their needs.
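The Merge Portfolio Data step in this template reduces to arithmetic over holdings and live prices. A sketch under assumed field names (`qty`, `avgPrice`, and a symbol-to-price map); the sample data is invented for illustration:

```javascript
// Illustrative portfolio metrics calculation. Field names and sample
// data are assumptions, not the template's actual sheet schema.
function portfolioMetrics(holdings, prices) {
  let invested = 0, current = 0;
  const perStock = holdings.map(h => {
    const price = prices[h.symbol];
    invested += h.qty * h.avgPrice;
    current += h.qty * (price ?? h.avgPrice); // fall back if price missing
    const returnPct = price != null
      ? ((price - h.avgPrice) / h.avgPrice) * 100
      : null;
    return { symbol: h.symbol, returnPct };
  });
  const ranked = perStock
    .filter(s => s.returnPct != null)
    .sort((a, b) => b.returnPct - a.returnPct);
  return {
    invested,
    current,
    pnl: current - invested,
    returnPct: ((current - invested) / invested) * 100,
    best: ranked[0]?.symbol,
    weakest: ranked.at(-1)?.symbol,
    missingPrices: perStock.filter(s => s.returnPct == null).map(s => s.symbol),
  };
}

const m = portfolioMetrics(
  [{ symbol: 'INFY', qty: 10, avgPrice: 100 },
   { symbol: 'TCS',  qty: 5,  avgPrice: 200 }],
  { INFY: 150, TCS: 180 }
);
console.log(m.invested, m.current, m.pnl); // 2000 2400 400
```

Tracking `missingPrices` separately is what lets the AI prompt stay honest when the price API returns partial data.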
by vinci-king-01
**Medical Research Tracker with Email and Pipedrive**

⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow automatically scans authoritative healthcare policy websites for new research, bills, or regulatory changes, stores relevant findings in Pipedrive, and immediately notifies key stakeholders via email. It is ideal for healthcare administrators and policy analysts who need to stay ahead of emerging legislation or guidance that could impact clinical operations, compliance, and strategy.

**Pre-conditions/Requirements**

Prerequisites:
- n8n instance (self-hosted or n8n cloud)
- ScrapeGraphAI community node installed
- Pipedrive account and API token
- SMTP credentials (or native n8n Email credentials) for sending alerts
- List of target URLs or RSS feeds from government or healthcare policy organizations
- Basic familiarity with n8n credential setup

Required Credentials:

| Service | Credential Name | Purpose |
|--------------------|-----------------|-----------------------------------|
| ScrapeGraphAI | API Key | Perform web scraping |
| Pipedrive | API Token | Create / update deals & notes |
| Email (SMTP/Nodemailer) | SMTP creds | Send alert emails |

Environment Variables (optional):

| Variable | Example Value | Description |
|-------------------------|------------------------------|-----------------------------------------------|
| N8N_DEFAULT_EMAIL_FROM | policy-bot@yourorg.com | Default sender for Email Send node |
| POLICY_KEYWORDS | telehealth, Medicare, HIPAA | Comma-separated keywords for filtering |

**How it works**
Key Steps:
1. Manual Trigger – kick-starts the workflow (or schedule it via cron).
2. Set → URL List – defines the list of healthcare policy pages or RSS feeds to scrape.
3. Split In Batches – iterates through each URL so scraping happens sequentially.
4. ScrapeGraphAI – extracts headlines, publication dates, and links.
5. Code (Filter & Normalize) – removes duplicates, standardizes the JSON structure, and applies keyword filters.
6. HTTP Request – optionally enriches data with summary content using external APIs (e.g., OpenAI, SummarizeBot).
7. If Node – checks whether the policy item is new (not already logged in Pipedrive).
8. Pipedrive – creates a new deal or note for tracking and collaboration.
9. Email Send – sends an alert to compliance or leadership teams with the policy summary.
10. Sticky Note – provides inline documentation inside the workflow.

**Set up steps**
Setup Time: 15–20 minutes

1. Install ScrapeGraphAI: in n8n, go to “Settings → Community Nodes” and install n8n-nodes-scrapegraphai.
2. Create Credentials:
   a. Pipedrive → “API Token” from your Pipedrive settings → add in n8n.
   b. ScrapeGraphAI → obtain API key → add as credential.
   c. Email SMTP → configure sender details in n8n.
3. Import Workflow: copy the JSON template into n8n (“Import from clipboard”).
4. Update URL List: open the initial Set node and replace placeholder URLs with the sites you monitor (e.g., cms.gov, nih.gov, who.int, state health departments).
5. Define Keywords (optional):
   a. Open the Code node “Filter & Normalize”.
   b. Adjust the const keywords = [...] array to match the topics you care about.
6. Test Run: trigger manually and verify that scraped items appear in the execution logs, new deals/notes show up in Pipedrive, and the alert email lands in your inbox.
7. Schedule: add a Cron node (e.g., every 6 hours) in place of the Manual Trigger for automated execution.
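The Filter & Normalize Code node referenced in step 5 might look roughly like this. It is a sketch of the dedup-and-keyword logic, not the template's exact code; in n8n the items would come from the node's input rather than a plain array:

```javascript
// Illustrative Filter & Normalize logic: dedupe by URL and keep only
// items whose title matches the keyword list. In n8n this would read
// from $input.all(); here a plain array stands in for the node data.
const keywords = ['telehealth', 'medicare', 'hipaa'];

function filterAndNormalize(items) {
  const seen = new Set();
  return items
    .filter(i => {
      if (seen.has(i.url)) return false; // drop duplicate URLs
      seen.add(i.url);
      return true;
    })
    .filter(i =>
      keywords.some(k => i.title.toLowerCase().includes(k))
    )
    .map(i => ({ title: i.title.trim(), url: i.url, status: 'new' }));
}

console.log(filterAndNormalize([
  { title: 'Telehealth Expansion Act of 2024', url: 'https://example.gov/a' },
  { title: 'Telehealth Expansion Act of 2024', url: 'https://example.gov/a' },
  { title: 'Unrelated press release', url: 'https://example.gov/b' },
]).length); // 1
```

Deduplicating by URL before the keyword filter keeps the Pipedrive duplicate check cheap, since fewer items reach it.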
**Node Descriptions**

Core Workflow Nodes:
- Manual Trigger – launches the workflow on demand.
- Set – URL List – holds an array of target policy URLs/RSS feeds.
- Split In Batches – processes each URL one at a time to avoid rate limiting.
- ScrapeGraphAI – scrapes page content and parses structured data.
- Code – Filter & Normalize – cleans results, removes duplicates, applies the keyword filter.
- HTTP Request – Summarize – calls a summarization API (optional).
- If – Duplicate Check – queries Pipedrive to see if the policy item already exists.
- Pipedrive (Deal/Note) – logs a new deal or adds a note with policy details.
- Email Send – Alert – notifies subscribed stakeholders.
- Sticky Note – embedded instructions inside the canvas.

Data Flow:
Manual Trigger → Set (URLs) → Split In Batches → ScrapeGraphAI → Code (Filter) → If (Duplicate?) → Pipedrive → Email Send

**Customization Examples**

1. Add Slack notifications

```javascript
// Insert after Email Send
{
  "node": "Slack",
  "parameters": {
    "channel": "#policy-alerts",
    "text": `New policy update: ${$json["title"]} - ${$json["url"]}`
  }
}
```

2. Use a different CRM (HubSpot)

```javascript
// Replace Pipedrive node config
{
  "resource": "deal",
  "operation": "create",
  "title": $json["title"],
  "properties": {
    "dealstage": "appointmentscheduled",
    "description": $json["summary"]
  }
}
```

**Data Output Format**
The workflow outputs structured JSON data:

```json
{
  "title": "Telehealth Expansion Act of 2024",
  "date": "2024-05-30",
  "url": "https://www.congress.gov/bill/118th-congress-house-bill/1234",
  "summary": "This bill proposes expanding Medicare reimbursement for telehealth services...",
  "source": "congress.gov",
  "status": "new"
}
```

**Troubleshooting**

Common Issues:
- Empty Scrape Results – check whether the target site uses JavaScript rendering; ScrapeGraphAI may need a headless browser option enabled.
- Duplicate Deals in Pipedrive – ensure the “If Duplicate?” node compares a unique field (e.g., URL or title) before creating a new deal.

Performance Tips:
- Limit batch size to avoid API rate limits.
- Cache or store the last scraped timestamp to skip unchanged pages.

Pro Tips:
- Combine this workflow with an n8n “Cron” or “Webhook” trigger for fully automated monitoring.
- Use environment variables for keywords and email recipients to avoid editing nodes each time.
- Leverage Pipedrive’s automations to notify additional teams (e.g., legal) when high-priority items are logged.
by Chad M. Crowell
**How it works**
This workflow automatically scans AWS accounts for orphaned resources (unattached EBS volumes, snapshots older than 90 days, unassociated Elastic IPs) that waste money. It calculates cost impact, validates compliance tags, and sends multi-channel alerts via Slack, Email, and Google Sheets audit logs.

Key Features:
- 🔍 Multi-region scanning with parallel execution
- 💰 Monthly/annual cost calculation with risk scoring
- 📊 Professional HTML reports with charts and tables
- 🏷️ Tag compliance validation (SOC2/ISO27001/HIPAA)
- ✅ Conditional alerting (only alerts when resources are found)
- 📈 Google Sheets audit trail for trend analysis

What gets detected:
- Unattached EBS volumes ($0.10/GB/month waste)
- Snapshots older than 90 days ($0.05/GB/month)
- Unassociated Elastic IPs ($3.60/month each)

Typical savings: $50–10K/month depending on account size

**Set up steps**

Prerequisites – AWS Configuration:
- Create an IAM user n8n-resource-scanner with these permissions: ec2:DescribeVolumes, ec2:DescribeSnapshots, ec2:DescribeAddresses, ec2:DescribeInstances, lambda:InvokeFunction
- Deploy the Lambda function aws-orphaned-resource-scanner (Node.js 18+)
- Add EC2 read-only permissions to the Lambda execution role
- Generate an AWS Access Key + Secret Key

Lambda Function Code: see the sticky notes in the workflow for the complete implementation using @aws-sdk/client-ec2.

Credentials Required:
- AWS IAM (Access Key + Secret)
- Slack (OAuth2 or Webhook)
- Gmail (OAuth2)
- Google Sheets (OAuth2)

Configuration:
- Initialize Config node – update these settings: awsRegions (your AWS regions, default: us-east-1), emailRecipients (FinOps team emails), slackChannel (alert channel, e.g., #cloud-ops), requiredTags (compliance tags to validate), snapshotAgeDays (age threshold, default: 90)
- Set Region Variables – choose regions to scan
- Lambda Function – deploy the function with the provided code (see workflow sticky notes)
- Google Sheet – create a spreadsheet with headers: Scan Date | Region | Resource Type | Resource ID | Monthly Cost | Compliance | etc.
- Credentials – connect all four credential types in n8n
- Schedule – enable the "Weekly Scan Trigger" (default: Mondays 8 AM UTC)

Testing:
1. Click "Execute Workflow" to run a manual test
2. Verify the Lambda invokes successfully
3. Check that the Slack alert appears
4. Confirm the email with the HTML report is received
5. Validate that Google Sheets logging works

**Customization Options**
- Multi-region: add regions in "Initialize Config"
- Alert thresholds: modify cost/age thresholds
- Additional resource types: extend the Lambda function
- Custom tags: update the required tags list
- Schedule frequency: adjust the cron trigger

**Use Cases**
- FinOps Teams: automated cloud waste detection and cost reporting
- Cloud Operations: weekly compliance and governance audits
- DevOps: resource cleanup automation and alerting
- Security/Compliance: tag validation for SOC2/ISO27001/HIPAA
- Executive Reporting: monthly cost optimization metrics

**Resources**
- AWS IAM Best Practices
- Lambda Function Code
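The per-resource cost figures quoted in this template translate directly into a monthly-cost calculator. A sketch using those same rates; the input shape is an assumption about the scanner Lambda's output, not its actual schema:

```javascript
// Monthly cost estimate for orphaned AWS resources, using the rates
// quoted in the template ($/GB-month for volumes and snapshots, flat
// monthly fee per unassociated Elastic IP).
const RATES = {
  ebs_volume: 0.10,   // $ per GB-month, unattached volume
  snapshot: 0.05,     // $ per GB-month, snapshot older than 90 days
  elastic_ip: 3.60,   // $ per month, unassociated EIP
};

function monthlyCost(resources) {
  return resources.reduce((total, r) => {
    if (r.type === 'elastic_ip') return total + RATES.elastic_ip;
    return total + (r.sizeGb || 0) * RATES[r.type];
  }, 0);
}

const cost = monthlyCost([
  { type: 'ebs_volume', sizeGb: 100 }, // $10.00
  { type: 'snapshot', sizeGb: 40 },    // $2.00
  { type: 'elastic_ip' },              // $3.60
]);
console.log(cost.toFixed(2)); // "15.60"
```

Multiplying the result by 12 gives the annual figure used in the HTML report's savings estimate.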
by Rakin Jakaria
**Who this is for**
This workflow is for digital marketing agencies or sales teams who want to automatically find business leads based on industry & location, gather their contact details, and send personalized cold emails — all from one form submission.

**What this workflow does**
This workflow starts every time someone submits the Lead Machine Form. It then:
1. Scrapes business data (company name, website, phone, address, category) using Apify based on business type & location.
2. Extracts the best email address from each business website using Google Gemini AI.
3. Stores valid leads in Google Sheets.
4. Generates cold email content (subject + body) with AI based on your preferred tone (Friendly, Professional, Simple).
5. Sends the cold email via Gmail.
6. Updates the sheet with send status & timestamp.

**Setup**
To set this workflow up:
1. Form Trigger – customize the “Lead Machine” form fields if needed (Business Type, Location, Lead Number, Email Style).
2. Apify API – add your Apify Actor Endpoint URL in the HTTP Request node.
3. Google Gemini – add credentials for extracting email addresses.
4. Google Sheets – connect your sheet for storing leads & email status.
5. OpenAI – add your credentials for cold email generation.
6. Gmail – connect your Gmail account for sending cold emails.

**How to customize this workflow to your needs**
- Change the AI email prompt to reflect your brand’s voice and offer.
- Add filters to only target leads that meet specific criteria (e.g., website must exist, email must be verified).
- Modify the Google Sheets structure to track extra info like “Follow-up Date” or “Lead Source”.
- Switch Gmail to another email provider if preferred.
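The lead-filter customization suggested above could be a small Code node placed before the email step. The validation rules and field names here are assumptions for illustration:

```javascript
// Hypothetical lead filter: keep only leads that have a website and a
// plausibly valid email, per the customization suggestion above.
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

function keepLead(lead) {
  return Boolean(lead.website) && EMAIL_RE.test(lead.email || '');
}

const leads = [
  { company: 'Acme Co', website: 'https://acme.example', email: 'hi@acme.example' },
  { company: 'No-Site LLC', website: '', email: 'x@y.z' },
];
console.log(leads.filter(keepLead).length); // 1
```

Filtering before the AI step also saves tokens, since no cold email is generated for leads that would never be sent.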