by Robert Breen
This n8n workflow checks daily ad spend totals from a Google Sheet and sends a Slack alert if spend exceeds $100. It can be scheduled to run automatically or triggered manually for testing. It is ideal for marketing teams who want to monitor budget spikes in near real time.

**Key Features**

- **Google Sheets Integration**: Pulls raw spend data from a shared spreadsheet.
- **Scheduled or Manual Execution**: Can run daily on a schedule or manually for testing.
- **Aggregates Spend**: Summarizes daily totals from raw data.
- **Logic Check**: Alerts only when spend exceeds a certain threshold.
- **Slack Alerts**: Sends an instant notification to a specified channel.

**What You'll Need**

- **Google Cloud Project** with the Google Sheets API enabled
- **OAuth2 Credential** for Google Sheets
- **Slack Bot Token** with permission to post to your workspace
- **Your Google Sheet link and tab name**

Copy this Sample Google Sheet to Use: Marketing Data Sheet - Copy Me

**Step-by-Step Setup**

1. **Schedule or Manual Trigger**
   - **Node**: Schedule Workflow or Test Workflow
   - **Purpose**: Either run daily via a cron-like rule or trigger the flow manually.

2. **Get Google Sheet Data**
   - **Node**: Get Data
   - **What it does**: Fetches all rows from your connected sheet.
   - **Setup**: Go to the Google Cloud Console, create a new project, enable the Google Sheets API, and create OAuth2 credentials for a desktop or web application. Connect your Google account in n8n via OAuth2, grant access to the sheet you want to read (ensure it is shared with your OAuth email), and use the copied sheet's link when connecting in n8n.

3. **Summarize Spend by Day**
   - **Node**: Sum spend by Day
   - **What it does**: Groups the dataset by Date and sums the Spend ($) column (see the sketch at the end of this section).
   - **Requirements**: Your sheet must have a header row with Date and Spend ($) as columns.

4. **Sort by Most Recent Date**
   - **Node**: Sort Dates Descending
   - **What it does**: Sorts all entries by the Date field so that the most recent day is first.
   - **Custom JavaScript**:

   ```javascript
   const items = $input.all();
   items.sort((a, b) => new Date(b.json.Date) - new Date(a.json.Date));
   return items;
   ```

5. **Select Top Result**
   - **Node**: Keep only Last Day
   - **What it does**: Captures the top row (most recent day) for evaluation.
   - **Fields**: Sets only Date and sum_Spend_($) to keep things clean.

6. **Check Spend Threshold**
   - **Node**: Check if Spend over $100
   - **What it does**: Uses an IF node to compare sum_Spend_($) against a threshold of 100.
   - **Logic**: sum_Spend_($) > 100

7. **Send Slack Notification**
   - **Node**: Send Slack Message
   - **What it does**: Sends a message to a Slack channel if the threshold is exceeded.
   - **Setup**: Go to the Slack API site, create a new app, and enable the chat:write and channels:read scopes under OAuth & Permissions. Install the app to your workspace, copy the OAuth token into your Slack credentials in n8n, and select your target channel from the dropdown (the channel must be public, or the bot must be invited).
   - **Message**: The spend for the most recent day is over $100

8. **No Action if Under Budget**
   - **Node**: Do Nothing. Under 100
   - **Purpose**: This path simply ends the flow with no action if spend is below the threshold.

**Created By**

Robert Breen - Automation Consultant | AI Workflow Designer | n8n Expert
Email: rbreen@ynteractive.com | Website: ynteractive.com | LinkedIn

**Tags**

slack, marketing automation, budget alert, daily schedule, google sheets, threshold logic, n8n, spend tracking, data summarization
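For reference on step 3, here is a minimal Code-node sketch of the daily aggregation, assuming input items with Date and Spend ($) fields; the template itself can use n8n's built-in Summarize node for this.

```javascript
// Minimal sketch of the "Sum spend by Day" step (n8n Code node).
// Assumes each incoming item has json.Date and json['Spend ($)'] columns.
const totals = {};
for (const item of $input.all()) {
  const date = item.json.Date;
  const spend = parseFloat(item.json['Spend ($)']) || 0;
  totals[date] = (totals[date] || 0) + spend;
}
// Emit one item per day, using the field name the later nodes expect.
return Object.entries(totals).map(([date, sum]) => ({
  json: { Date: date, 'sum_Spend_($)': sum },
}));
```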
by Oneclick AI Squad
This n8n workflow monitors blood stock levels daily and sends alerts when availability is low. It fetches data from a Google Sheet, checks stock status, and notifies via WhatsApp.

**Key Features**

- **Daily Monitoring**: Checks blood stock every day.
- **Automated Alerts**: Sends notifications when stock is low.
- **Real-Time Updates**: Uses live data from Google Sheets.
- **Efficient Delivery**: Alerts are sent instantly via WhatsApp.
- **Continuous Check**: Loops to ensure ongoing monitoring.

**Workflow Process**

1. **Daily Check Blood Stock**: Triggers the workflow daily.
2. **Fetch Blood Stock**: Reads data from a Google Sheet.
3. **Get All Stock**: Collects all available blood stock details.
4. **Check Stock Availability**: Analyzes stock levels for low thresholds (see the sketch at the end of this section).
5. **Send Alert Message**: Sends WhatsApp alerts if stock is low.

**Sheet Columns**

- **Blood Type**: Type of blood (e.g., A+, O-).
- **Quantity**: Current stock amount.
- **Threshold**: Minimum acceptable stock level.
- **Last Updated**: Date and time of last update.
- **Status**: Current status (e.g., Low, Sufficient).

**Setup Instructions**

1. **Import Workflow**: Add the workflow to n8n via the import option.
2. **Configure Sheet**: Set up a Google Sheet with blood stock data.
3. **Set Up WhatsApp**: Configure WhatsApp API credentials in n8n.
4. **Activate**: Save and enable the workflow.
5. **Test**: Simulate low stock to verify alerts.

**Requirements**

- **n8n Instance**: Hosted or cloud-based n8n setup.
- **Google Sheets**: Access with stock data.
- **WhatsApp API**: Integration for sending alerts.
- **Admin Access**: For monitoring and updates.

**Customization Options**

- **Adjust Threshold**: Change low stock limits.
- **Add Channels**: Include email or SMS alerts.
- **Update Frequency**: Modify the daily trigger time.
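As a reference for the Check Stock Availability step, here is a minimal Code-node sketch that compares each row's Quantity to its Threshold; field names come from the Sheet Columns list above, and the template's actual node logic may differ.

```javascript
// Minimal sketch of the "Check Stock Availability" step (n8n Code node).
// Assumes sheet rows with "Blood Type", "Quantity", and "Threshold" columns.
const lowStock = $input.all().filter((item) => {
  const qty = Number(item.json.Quantity);
  const threshold = Number(item.json.Threshold);
  return qty < threshold;
});
// Pass only low-stock rows downstream so the alert node fires per shortage.
return lowStock;
```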
by Calvin Cunningham
**AI-Assisted Lead Follow-Up With Human Approval**

This workflow automates your lead response process from end to end. When someone submits your n8n Form, the workflow generates an AI-written follow-up email, sends that draft to your sales team for approval, and then sends the approved email to the lead or marks it as needing revision. All lead details, drafts, approval decisions, and timestamps are stored in Airtable.

**Ideal For**

- Teams that want AI to draft emails while keeping a human approval step
- Businesses receiving inbound inquiries that need fast, consistent responses
- Users building a simple form → email → CRM workflow
- Teams that want a record of all drafts and approval outcomes

**What This Template Provides**

- AI-generated follow-up email drafts
- Human approval flow, using Approve and Reject links (see the sketch at the end of this section)
- Automatic Airtable logging of leads, drafts, and statuses
- Fully automated pipeline triggered by a simple form submission

**Setup Steps (5-10 minutes)**

1. **Connect Your Credentials**: Gmail, Airtable, OpenAI.
2. **Create Your Airtable Table** with the following fields: Name, Email, Phone, Company Name, Message, Status, Email Draft, Created On.
3. **Add Your Airtable Base ID and Table ID**: Insert them into the Workflow Configuration node.
4. **Enter Your Company Details**: Add your name, title, company name, email, phone number, website, etc. These values will appear in the final approved email sent to the lead.
5. **Set the Sales Approval Email**: Specify the email address where draft approval requests should be sent.
6. **Deploy the Approval Webhook**: Switch the Webhook node to the Production URL, and confirm that the Approve and Reject links point to it.
7. **Publish Your n8n Form**: Submit a test lead to verify AI draft generation, approval email delivery, Airtable logging, and final email sending.

**Why Use This Template?**

This workflow creates a reliable, semi-automated follow-up process that blends AI speed with human judgment. It ensures consistent communication, maintains accurate CRM records, and reduces manual work without requiring a full CRM platform.
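The Approve and Reject links are ordinary webhook URLs carrying the decision. Here is a hypothetical Code-node sketch of building them; the URL, query parameter names, and the airtableRecordId field are illustrative assumptions, not the template's actual values.

```javascript
// Hypothetical sketch: build Approve/Reject links for the approval email.
// WEBHOOK_URL and the "decision"/"record" parameter names are assumptions.
const WEBHOOK_URL = 'https://your-n8n.example.com/webhook/lead-approval';
const recordId = $json.airtableRecordId; // assumed field set by the Airtable node
const approveLink = `${WEBHOOK_URL}?record=${encodeURIComponent(recordId)}&decision=approve`;
const rejectLink = `${WEBHOOK_URL}?record=${encodeURIComponent(recordId)}&decision=reject`;
return [{ json: { approveLink, rejectLink } }];
```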
by Eddy Medina
**What does this workflow do?**

This workflow exports the names of all Dialogflow intents from your agent, together with their priority levels, directly into a Google Sheets spreadsheet. It is triggered via Telegram and includes visual indicators (emojis) for priority levels.

**Overview**

- **Activation**: Triggered when a validated user sends the keyword (e.g. "backup") via Telegram.
- **Data Retrieval**: Fetches all intents of the specified Dialogflow agent using the Dialogflow API.
- **Processing**: Transforms each intent into an n8n-compatible item, extracts the displayName and priority of each intent, and assigns an emoji and descriptive label based on priority tier: 🔴 Highest, 🟠 High, 🔵 Normal, 🟢 Low, 🚫 Ignore (see the sketch at the end of this section).
- **Storage**: Appends each intent (name, priority number, emoji, and description), along with the current date and time, to a Google Sheets document.
- **Notification**: Sends a single confirmation message to the Telegram user once insertion is complete (using Execute Once).

**How to install and configure**

1. **Import the workflow**: Upload the .json into your n8n instance.
2. **Connect Telegram**: Add your Telegram bot credentials and configure the node "Validación de usuario por ID" with your Telegram ID.
3. **Configure Dialogflow**: Authenticate using a Google Service Account API credential. Then, in the "Obtiene datos de los intents" node, replace the example project ID (TU_PROJECT_ID) with your actual Dialogflow agent's project ID.
4. **Connect Google Sheets**: Authorize Google Sheets via OAuth2 and select your destination document/sheet in the node "Añadir fila en la hoja".
5. **Customize trigger keyword**: Adjust the command text (default "backup") if needed.
6. **Activate workflow**: Ensure the webhook is correctly set up in Telegram before enabling the workflow.

**Who is this for?**

- Bot administrators who need quick backups of Dialogflow intent names.
- Teams managing multilingual or multi-intent agents wanting priority oversight.
- Development teams needing an automated way to audit or version intent configurations regularly.

**Use Cases**

- Back up intents periodically to monitor changes over time.
- Visualize priority assignment in a spreadsheet for analysis or team discussion.
- Document conversational structure for onboarding or knowledge transfer.
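A sketch of the priority mapping described above, assuming Dialogflow ES's commonly documented numeric priority values (1000000, 750000, 250000, -1); verify these against your agent, as the template's actual tiers may differ.

```javascript
// Sketch: map a Dialogflow intent priority to an emoji + label.
// Numeric tiers below are assumptions based on Dialogflow ES defaults.
function priorityLabel(priority) {
  switch (priority) {
    case 1000000: return { emoji: '🔴', label: 'Highest' };
    case 750000:  return { emoji: '🟠', label: 'High' };
    case 250000:  return { emoji: '🟢', label: 'Low' };
    case -1:      return { emoji: '🚫', label: 'Ignore' };
    default:      return { emoji: '🔵', label: 'Normal' }; // 500000 or unset
  }
}

return $input.all().map((item) => ({
  json: {
    displayName: item.json.displayName,
    priority: item.json.priority,
    ...priorityLabel(item.json.priority),
    exportedAt: new Date().toISOString(), // date/time appended to the sheet
  },
}));
```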
by Intuz
This n8n template from Intuz provides a complete solution to automate your accounting by instantly creating QuickBooks sales receipts for every new Stripe payment. The workflow records successful Stripe payments in QuickBooks by creating corresponding Sales Receipts. It ensures payment data is captured accurately, checks whether the customer exists in QuickBooks, and creates a new customer if necessary before generating the receipt. This integration streamlines bookkeeping by eliminating manual data entry and ensuring all payment records are synchronized between systems.

**Who's this workflow for?**

- Accountants & Bookkeepers
- Small Business Owners
- E-commerce Managers
- Finance Teams

**How it works**

1. **Trigger on Successful Payment**: The workflow starts instantly when a payment_intent.succeeded event is received from Stripe via a webhook, so it only runs after a payment is confirmed (see the payload sketch at the end of this section).
2. **Get Customer Details**: It uses the customer ID from the payment to fetch the customer's full details (name and email) from Stripe.
3. **Check for Customer in QuickBooks**: The workflow then searches your QuickBooks account to see if a customer with that name already exists.
4. **Create Customer if New**: If the customer is not found in QuickBooks, a new customer record is automatically created using the information from Stripe.
5. **Generate Sales Receipt**: Finally, using the correct customer record (either existing or newly created) and the payment amount, the workflow creates and saves a new sales receipt in QuickBooks, matching the Stripe transaction exactly.

**Key Requirements to Use This Template**

1. **n8n Instance**: An active n8n account (Cloud or self-hosted).
2. **Stripe Account**: An active Stripe account with API access. You must be able to create and manage webhooks.
3. **QuickBooks Online Account**: An active QuickBooks Online account with API access to manage customers and sales receipts.

**Setup Instructions**

1. **Configure the Webhook Trigger**: Copy the webhook URL from the Capture Payment (Webhook) node in n8n. In your Stripe dashboard, go to Developers > Webhooks and add a new endpoint. Paste the n8n webhook URL and have it listen for the payment_intent.succeeded event.
2. **Connect Stripe**: In the Get a customer node, connect your Stripe account credentials.
3. **Connect QuickBooks**: In all three QuickBooks nodes (Find Customer, Create a customer, and Create a payment), connect your QuickBooks Online account using OAuth2 credentials.
4. **Activate Workflow**: Save the workflow and toggle the "Active" switch to ON. Your accounting automation is now live!

**Connect with us**

- Website: https://www.intuz.com/services
- Email: getstarted@intuz.com
- LinkedIn: https://www.linkedin.com/company/intuz
- Get Started: https://n8n.partnerlinks.io/intuz
- For custom workflow automation, click here: Get Started
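For reference on step 1, here is a sketch of the fields the workflow needs from the payment_intent.succeeded payload, following Stripe's documented event shape; the exact expressions used in the template's nodes may differ.

```javascript
// Sketch: pull customer id and amount from a Stripe
// payment_intent.succeeded webhook event (n8n Code node after the webhook).
const event = $json.body;          // raw webhook body as received by n8n
const intent = event.data.object;  // the PaymentIntent per Stripe's event docs
return [{
  json: {
    customerId: intent.customer,      // Stripe customer id (cus_...)
    amount: intent.amount / 100,      // Stripe amounts are in the smallest unit
    currency: intent.currency,
    paymentIntentId: intent.id,
  },
}];
```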
by Davide
This workflow automates the process of estimating a person's fashion size from an uploaded image using an AI model. It is an automated pipeline that estimates a person's body measurements and clothing size from an image URL.

**Key Features**

- **Full Automation**: From image submission to result display, the process requires no manual steps.
- **Easy Integration**: Uses n8n's native nodes and simple HTTP requests to connect with Fal.ai's API.
- **Real-Time Processing**: Automatically waits and checks for the AI result, ensuring the user receives the output as soon as it's ready.
- **Modular Design**: Each step (submit → process → check → result) is clearly separated, making it easy to modify or extend (e.g., adding notifications or storing results in a database).
- **User-Friendly Interface**: The initial form and final result form make it accessible even for non-technical users.
- **Secure**: Authentication to the Fal.ai API is handled through HTTP header authorization, keeping API keys protected.

**How it works**

1. **Form Trigger**: The workflow starts with a public form where a user submits a URL of an image.
2. **AI Processing Request**: The submitted image URL is sent to the fal.run AI service (specifically, the "fashion-size-estimator" model) via a POST request. This initial request places the job in a queue and returns a unique request_id.
3. **Polling for Completion**: The AI processing is asynchronous and takes some time. The workflow enters a loop where it:
   - **Waits**: Pauses for 10 seconds to give the AI model time to process the request.
   - **Checks Status**: Uses the request_id to check the status of the job.
   - **Conditional Check**: An IF node checks if the status is "COMPLETED". If not, the loop repeats (wait, then check again); if yes, the workflow exits the loop (see the sketch at the end of this section).
4. **Fetching and Displaying Results**: Once processing is complete, the workflow retrieves the final result (containing the size, height, bust, waist, and hip measurements) and automatically displays it to the user on a "thank you" page.

**Set up steps**

To make this workflow operational, you need to configure the API authentication.

1. **Obtain an API Key**: Create an account at fal.ai and navigate to your account settings to generate an API key.
2. **Configure Credentials in n8n**: In your n8n instance, create a new HTTP Header Auth credential (you can name it "Fal.run API"). Set the Name field to Authorization and the Value field to Key YOURAPIKEY, replacing "YOURAPIKEY" with the actual key you obtained from fal.ai. Ensure this credential is correctly selected in the three HTTP Request nodes: "Send image to estimator", "Get status", and "Get result".

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
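A minimal sketch of the loop's exit condition, assuming the fal.ai queue status response exposes a status field that becomes "COMPLETED"; in the template this decision lives in Wait and IF nodes rather than code.

```javascript
// Sketch: the polling decision after the "Get status" HTTP request.
// Assumes the queue status payload carries "status" and "request_id".
const status = $json.status;
const done = status === 'COMPLETED';
// An IF node branches on `done`; when false, the flow loops back to the
// 10-second Wait node and checks again with the same request_id.
return [{ json: { request_id: $json.request_id, done } }];
```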
by SpaGreen Creative
**Automated WhatsApp Welcome Messages for Sales Leads with Google Sheets & Rapiwa**

**Who is this for?**

This automation is ideal for sales teams, digital marketers, support agents, or small business owners who collect leads in Google Sheets and want to automatically send WhatsApp welcome messages. It's a cost-effective and easy-to-use solution built for those not using the official WhatsApp Business API but still looking to scale communication.

**What this Workflow Does**

This n8n automation reads leads from a connected Google Sheet, verifies whether the provided WhatsApp numbers are valid using the Rapiwa API, and sends a personalized welcome message. It updates the sheet based on delivery success or failure, and repeats this process every 5 minutes, ensuring new leads are automatically engaged.

**Key Features**

- **Automatic Scheduling**: Runs every 5 minutes (adjustable)
- **Google Sheets Integration**: Reads and updates lead data
- **WhatsApp Number Validation**: Confirms number validity via Rapiwa
- **Personalized Messaging**: Uses the lead's name for custom messages
- **Batch Processing**: Sends up to 60 messages per cycle
- **Safe API Usage**: Adds a 5-second delay between each message
- **Error Handling**: Marks failed messages as not sent and unverified
- **Live Status Updates**: Sheet columns are updated after each attempt
- **Loop Logic**: Repeats continuously to catch new rows

**How to Use: Step-by-step Setup**

1. **Prepare Your Google Sheet**
   - Copy this Sample Sheet
   - Ensure it includes the following columns: WhatsApp No, name (the trailing space in "name " is required), row_number, status, check, validity
2. **Connect Google Sheets in n8n**
   - Use OAuth2 credentials to allow n8n access
   - Set the workflow to fetch rows where check is not empty
3. **Get a Rapiwa Account**
   - Sign up at https://rapiwa.com
   - Add your WhatsApp number
   - Retrieve your Bearer Token from your Rapiwa dashboard
4. **Configure HTTP Request Nodes**
   - Use Rapiwa's API endpoints:
     - Verify Number: https://app.rapiwa.com/api/verify-whatsapp
     - Send Message: https://app.rapiwa.com/api/send-message
   - Add your Bearer Token to the header
5. **Start Your Workflow**
   - Run the n8n automation
   - It will read leads, clean phone numbers, verify WhatsApp validity, send messages, and update the sheet accordingly

**Requirements**

- A Google Sheet with correctly formatted columns
- Active Rapiwa subscription (~$5/month)
- A valid Bearer Token from Rapiwa
- Your WhatsApp number connected to Rapiwa
- n8n instance with Google Sheets integration (OAuth2 setup) and HTTP Request capability

**Google Sheet Column Reference**

| name | number | email | time | check | validity | status |
|------|--------|-------|------|-------|----------|--------|
| Abdul Mannan | 8801322827799 | contact@spagreen.net | September 14th 2025, 10:34 | checked | verified | sent |
| Abdul Mannan | 8801322827798 | contact@spagreen.net | September 14th 2025, 10:34 | checked | unverified | not sent |

**Workflow Logic Summary**

1. Trigger every 5 minutes
2. Fetch all rows with pending status
3. Limit to 60 rows per execution
4. Clean and format phone numbers (see the sketch at the end of this section)
5. Check number validity via Rapiwa
6. Condition check: if valid, send the message; if invalid, update status as not sent, unverified
7. Send the WhatsApp message via Rapiwa
8. Update the sheet row: on success, sent / verified / checked; on failure, not sent / unverified
9. Delay 5 seconds before the next message
10. Repeat for the next lead

**Customization Ideas**

- Add image or document sending support via Rapiwa
- Customize messages based on additional fields (e.g., product, service)
- Log failures to a separate sheet
- Send an admin email for failed batches
- Add support for multilingual messages

**Notes & Warnings**

- The column name "name " includes a trailing space; do not remove or rename it.
- International number format is required for Rapiwa to work correctly.
- If you're sending many messages, increase the Wait node delay to prevent API throttling.

**Support**

- WhatsApp Support: Chat Now
- Discord: Join SpaGreen Community
- Facebook Group: SpaGreen Support
- Website: https://spagreen.net
- Developer Portfolio: Codecanyon SpaGreen
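A minimal sketch of the number-cleaning step (step 4 in the logic summary), assuming the "WhatsApp No" column holds numbers in mixed formats and Rapiwa expects digits-only international format (e.g. 8801...).

```javascript
// Sketch: normalize phone numbers before the Rapiwa verify call.
return $input.all().map((item) => {
  const raw = String(item.json['WhatsApp No'] || '');
  const cleaned = raw.replace(/[^\d]/g, ''); // strip +, spaces, dashes, etc.
  return { json: { ...item.json, cleanedNumber: cleaned } };
});
```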
by Rahul Joshi
**Description**

This workflow automatically classifies new Stack Overflow questions by topic, generates structured FAQ content using GPT-4o-mini, logs each entry in Google Sheets, saves formatted FAQs in Notion, and notifies your team on Slack, ensuring your product and support teams stay aligned with real-world developer discussions.

**What This Template Does**

1. Monitors Stack Overflow RSS feeds for new questions related to your selected tags.
2. Filters out irrelevant or incomplete questions before processing (see the sketch at the end of this section).
3. Uses OpenAI GPT-4o-mini to classify each question into a topic category (Frontend, Backend, DevOps, etc.).
4. Generates structured FAQ content including summaries, technical insights, and internal guidance.
5. Saves formatted entries into your Notion knowledge-base database.
6. Logs all FAQ data into a connected Google Sheet for analytics and tracking.
7. Sends real-time Slack notifications with quick links to the new FAQ and the original Stack Overflow post.
8. Provides automatic error detection: any failed AI or Notion step triggers an instant Slack alert.

**Key Benefits**

- Builds a continuously updated, AI-driven knowledge base
- Reduces repetitive support and documentation work
- Keeps product and dev teams aware of trending community issues
- Enhances internal docs with verified Stack Overflow insights
- Maintains an audit trail via Google Sheets
- Alerts your team instantly on errors or new FAQs

**Features**

- Automatic Stack Overflow RSS monitoring
- Dual-layer OpenAI integration (topic classification + FAQ generation)
- Structured Notion database integration
- Google Sheets logging for analytics
- Slack notifications for new FAQs and error alerts
- Custom tag-based question filtering
- Near real-time updates (every minute)
- Built-in error handling for reliability

**Requirements**

- OpenAI API key (GPT-4o-mini access)
- Notion API credentials with database access
- Google Sheets OAuth2 credentials
- Slack bot token with chat:write permissions
- Stack Overflow RSS feed URL for your preferred tags

**Target Audience**

- SaaS or product teams building internal FAQ and knowledge systems
- Developer relations and documentation teams
- Customer-support teams automating knowledge reuse
- Technical communities curating content from Stack Overflow

**Setup Instructions**

1. Add your OpenAI API credentials in n8n.
2. Connect your Notion database and update the page or database ID.
3. Connect Google Sheets credentials and select your tracking sheet.
4. Connect your Slack account and specify your notification channel.
5. Update the RSS feed URL with your chosen Stack Overflow tags.
6. Run the workflow manually once to test connectivity, then enable automation.
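For step 2, here is a minimal Code-node sketch of the filtering, assuming the RSS items expose title, contentSnippet, and a categories array of tags; field names depend on the feed node's actual output, and the tag list is illustrative.

```javascript
// Sketch: drop incomplete questions and keep only watched tags.
const WATCHED_TAGS = ['n8n', 'node.js']; // illustrative tag list
return $input.all().filter((item) => {
  const { title, contentSnippet, categories = [] } = item.json;
  if (!title || !contentSnippet) return false; // skip incomplete posts
  return categories.some((tag) => WATCHED_TAGS.includes(tag));
});
```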
by PDF Vector
**Overview**

Healthcare organizations face significant challenges in digitizing and processing medical records while maintaining strict HIPAA compliance. This workflow provides a secure, automated solution for extracting clinical data from various medical documents, including discharge summaries, lab reports, clinical notes, prescription records, and scanned medical images (JPG, PNG).

**What You Can Do**

- Extract clinical data from medical documents while maintaining HIPAA compliance
- Process handwritten notes and scanned medical images with OCR
- Automatically identify and protect PHI (Protected Health Information)
- Generate structured data from various medical document formats
- Maintain audit trails for regulatory compliance

**Who It's For**

Healthcare providers, medical billing companies, clinical research organizations, health information exchanges, and medical practice administrators who need to digitize and extract data from medical records while maintaining HIPAA compliance.

**The Problem It Solves**

Manual medical record processing is time-consuming, error-prone, and creates compliance risks. Healthcare organizations struggle to extract structured data from handwritten notes, scanned documents, and various medical forms while protecting PHI. This template automates the extraction process while maintaining the highest security standards for Protected Health Information.

**Setup Instructions**

1. Configure Google Drive credentials with proper medical record access controls
2. Install the PDF Vector community node from the n8n marketplace
3. Configure PDF Vector API credentials with HIPAA-compliant settings
4. Set up secure database storage with encryption at rest
5. Define PHI handling rules and extraction parameters
6. Configure audit logging for regulatory compliance
7. Set up integration with your Electronic Health Record (EHR) system

**Key Features**

- Secure retrieval of medical documents from Google Drive
- HIPAA-compliant processing with automatic PHI masking
- OCR support for handwritten notes and scanned medical images
- Automatic extraction of diagnoses with ICD-10 code validation (see the sketch at the end of this section)
- Medication list processing with dosage and frequency information
- Lab results extraction with reference ranges and flagging
- Vital signs capture and normalization
- Complete audit trail for regulatory compliance
- Integration-ready format for EHR systems

**Customization Options**

- Define institution-specific medical terminology and abbreviations
- Configure automated alerts for critical lab values or abnormal results
- Set up custom extraction fields for specialized medical forms
- Implement medication interaction warnings and contraindication checks
- Add support for multiple languages and international medical coding systems
- Configure integration with specific EHR platforms (Epic, Cerner, etc.)
- Set up automated quality assurance checks and validation rules

**Implementation Details**

The workflow uses advanced AI with medical domain knowledge to understand clinical terminology and extract relevant information while automatically identifying and protecting PHI. It processes various document formats including handwritten prescriptions, lab reports, discharge summaries, and clinical notes. The system maintains strict security protocols with encryption at rest and in transit, ensuring full HIPAA compliance throughout the processing pipeline.

Note: This workflow uses the PDF Vector community node. Make sure to install it from the n8n community nodes collection before using this template.
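As an illustration of the ICD-10 validation feature, here is a hedged sketch that checks code format only; it does not look codes up against the official code set, and the diagnoses field name is an assumption about the extraction output.

```javascript
// Sketch: structural ICD-10-CM format check (one letter, two characters,
// optional dot plus up to four alphanumerics, e.g. "E11.9"). This is a
// common approximation of the format, not a code-set lookup.
const ICD10_PATTERN = /^[A-TV-Z][0-9][0-9A-Z](?:\.[0-9A-Z]{1,4})?$/;
const diagnoses = $json.diagnoses || []; // assumed field from the extraction
return [{
  json: {
    ...$json,
    diagnoses: diagnoses.map((d) => ({
      ...d,
      codeFormatValid: ICD10_PATTERN.test(String(d.icd10 || '').toUpperCase()),
    })),
  },
}];
```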
by Rahul Joshi
**Description**

Eliminate duplicate entries and streamline your lead management process with this n8n workflow template. It automatically captures new form submissions, checks them against existing records, updates duplicates, and syncs leads seamlessly into GoHighLevel (GHL) CRM and your Google Sheets database.

This automation monitors Google Form submissions in real time, verifies whether a lead already exists, and routes it accordingly:

- New leads are created in GoHighLevel and logged in your master database.
- Duplicates are updated with the latest details and tracked in a dedicated duplicate log.

Perfect for sales, marketing, and operations teams that need clean, accurate, and up-to-date lead records without manual effort.

**What This Template Does**

- Captures new lead form submissions from Google Sheets
- Checks existing records to detect duplicate entries
- Creates new contacts in GoHighLevel CRM for unique leads
- Updates existing GHL contacts with fresh submission details
- Logs new leads in a master database spreadsheet
- Tracks duplicate leads in a dedicated log for analytics
- Fully automated: ensures a clean, organized, and deduplicated lead pipeline

**How It Works**

1. **Google Sheets Trigger**: Monitors your form response sheet for new submissions.
2. **Process Contact Data**: Evaluates lead details and checks for duplicates (see the sketch at the end of this section).
3. **Duplicate Check**: Compares against your master database.
4. **New Lead Handling**: Creates a new contact in GoHighLevel and adds it to the master sheet.
5. **Duplicate Handling**: Updates the existing contact in GoHighLevel and logs the activity in the duplicate log sheet.

**Setup Instructions**

*Google Sheets Setup*

Prepare three sheets:

- **Form Responses Sheet**: where new leads from your form are captured
- **Master Lead Database**: stores all unique leads
- **Duplicate Log Sheet**: tracks duplicate entries for reporting

Required columns: Name, Email Address, Phone Number, Company (optional), Submission Time (timestamp).

*GoHighLevel Setup*

1. Log into your GoHighLevel account.
2. Generate an API key under settings.
3. Store the key securely in n8n credentials.

*n8n Setup*

1. Import the workflow into your n8n instance.
2. Update all node credentials (Google Sheets + GoHighLevel).
3. Rename the Code node to Process Contact Data.
4. Test the workflow with a sample form submission.

**Customization**

- **Business Logic**: Adjust duplicate detection rules (e.g., match on email only, or email + phone).
- **Data Fields**: Add more fields (e.g., industry, source, notes) and map them to GHL + Sheets.
- **Reporting**: Use the Duplicate Log Sheet for analytics, dashboards, or reporting pipelines.
- **Notifications**: Add a Telegram or Slack node to notify your team when duplicates occur.

**Security Best Practices**

- Do not hardcode your GoHighLevel API key; use n8n credentials.
- Remove private sheet IDs and tokens before sharing workflows.
- Restrict credential access to authorized team members only.

**Requirements**

- Google Sheets (form responses, master database, duplicate log)
- GoHighLevel (GHL) account with API access
- n8n instance (self-hosted or cloud)

**This workflow is perfect for:**

- Sales teams managing growing lead databases
- Marketing teams syncing form submissions with CRM
- Operations teams preventing duplicate records
- Businesses wanting a reliable, automated lead pipeline
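A sketch of the duplicate check inside Process Contact Data, matching on email only; the upstream node name and column header below are illustrative assumptions, and the template's actual code may differ.

```javascript
// Sketch: flag a lead as duplicate by email, using rows fetched by an
// earlier node. Node name and "Email Address" header are assumptions.
const existing = $('Master Lead Database').all(); // assumed node name
const known = new Set(
  existing.map((row) => String(row.json['Email Address']).toLowerCase().trim())
);
const lead = $json;
const isDuplicate = known.has(String(lead['Email Address']).toLowerCase().trim());
// Downstream IF node routes on isDuplicate: create vs. update + log.
return [{ json: { ...lead, isDuplicate } }];
```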
by Vigh Sandor
**Setup Instructions**

*Overview*

This n8n workflow monitors your Proxmox VE server and sends automated reports to Telegram every 15 minutes. It tracks VM status, host resource usage, temperature sensors, and detects recently stopped VMs.

*Prerequisites*

Required software:

- n8n instance (self-hosted or cloud)
- Proxmox VE server with API access
- Telegram account with a bot created via BotFather
- lm-sensors package installed on the Proxmox host

Required access:

- Proxmox admin credentials (username and password)
- SSH access to the Proxmox server
- Telegram Bot API token
- Telegram Chat ID

*Installation Steps*

**Step 1: Install Temperature Sensors on Proxmox**

SSH into your Proxmox server and run:

```
apt-get update
apt-get install -y lm-sensors
sensors-detect
```

Press ENTER to accept the default answers during sensors-detect setup. Test that the sensors work:

```
sensors | grep -E 'Package|Core'
```

**Step 2: Create Telegram Bot**

1. Open Telegram and search for BotFather
2. Send the /newbot command
3. Follow the prompts to create your bot
4. Save the API token provided
5. Get your Chat ID by sending a message to your bot, then visiting: `https://api.telegram.org/bot<YOUR_TOKEN>/getUpdates`
6. Look for `"chat":{"id": YOUR_CHAT_ID` in the response

**Step 3: Configure n8n Credentials**

SSH Password credential:

1. In n8n, go to the Credentials menu
2. Create a new credential: SSH Password
3. Enter: Host (your Proxmox IP address), Port (22), Username (root, or your admin user), Password (your Proxmox password)

Telegram API credential:

1. Create a new credential: Telegram API
2. Enter the Bot Token from BotFather

**Step 4: Import and Configure Workflow**

1. Import the JSON workflow into n8n
2. Open the "Set Variables" node and update the following values:
   - PROXMOX_IP: your Proxmox server IP address
   - PROXMOX_PORT: API port (default: 8006)
   - PROXMOX_NODE: node name (default: pve)
   - TELEGRAM_CHAT_ID: your Telegram chat ID
   - PROXMOX_USER: Proxmox username with realm (e.g., root@pam)
   - PROXMOX_PASSWORD: Proxmox password
3. Connect credentials: select your SSH credential in the "SSH - Get Sensors" node, and your Telegram credential in the "Send Telegram Report" node
4. Save the workflow
5. Activate the workflow

*Configuration Options*

**Adjust Monitoring Interval**

Edit the "Schedule Every 15min" node: change the minutesInterval value to the desired interval (in minutes). Recommended: 5-30 minutes.

**Adjust Recently Stopped VM Detection Window**

Edit the "Process Data" node: find the line `const fifteenMinutesAgo = now - 900;` and change 900 to the desired number of seconds (900 = 15 minutes).

**Modify Temperature Warning Threshold**

The workflow uses the "high" threshold defined by sensors.
To set the threshold manually, edit the "Process Data" node: modify the temperature parsing logic and change the comparison `if (current >= high)` to use a custom value.

*Testing*

Test individual components:

1. Execute the "Set Variables" node manually and verify its output
2. Execute the "Proxmox Login" node and check for a valid ticket
3. Execute "API - VM List" and confirm VM data is received
4. Execute the complete workflow and check Telegram for the message

*Troubleshooting*

Login fails:

- Verify the PROXMOX_USER format includes the realm (e.g., root@pam)
- Check that the password is correct
- Ensure allowUnauthorizedCerts is enabled for self-signed certificates

No temperature data:

- Verify lm-sensors is installed on Proxmox
- Run the sensors command manually via SSH
- Check that the SSH credentials are correct

Recently stopped VMs not detected:

- Check that the task log API endpoint returns data
- Verify the VM was stopped within the detection window
- Ensure task types qmstop or qmshutdown are logged

Telegram not receiving messages:

- Verify the bot token is correct
- Confirm the chat ID is accurate
- Check the bot was started (send /start to the bot)
- Verify parse_mode is set to HTML in the Telegram node

**How It Works**

*Workflow Architecture*

The workflow executes as a sequential chain of nodes that gather data from multiple sources, process it, and deliver a formatted report.

*Execution Flow*

1. Schedule Trigger (15min)
2. Set Variables
3. Proxmox Login (get authentication ticket)
4. Prepare Auth (prepare credentials for API calls)
5. API - VM List (get all VMs and their status)
6. API - Node Tasks (get recent task log)
7. API - Node Status (get host CPU, memory, uptime)
8. SSH - Get Sensors (get temperature data)
9. Process Data (analyze and structure all data)
10. Generate Formatted Message (create Telegram message)
11. Send Telegram Report (deliver via Telegram)

*Data Collection*

- VM information (Proxmox API), endpoint `/api2/json/nodes/{node}/qemu`: total VM count, running VM count, stopped VM count, VM names and IDs.
- Task log (Proxmox API), endpoint `/api2/json/nodes/{node}/tasks?limit=100`: recent tasks used to detect qmstop operations (VM stop commands), qmshutdown operations (VM shutdown commands), task timestamps, and task status.
- Host status (Proxmox API), endpoint `/api2/json/nodes/{node}/status`: CPU usage percentage, memory total and used (in GB), system uptime (in seconds).
- Temperature data (SSH), command `sensors | grep -E 'Package|Core'`: CPU package temperature, individual core temperatures, high and critical thresholds.

*Data Processing*

VM status analysis:

- Counts total, running, and stopped VMs
- Queries the task log for stop/shutdown operations
- Filters tasks within the 15-minute window
- Extracts the VM ID from the task UPID string
- Matches the VM ID to the VM name from the VM list
- Calculates the time elapsed since the stop operation

Temperature intelligence. The workflow implements smart temperature reporting:

- Normal operation (all temps below the high threshold): calculates the average temperature across all cores and displays min, max, and average values, e.g. "Average: 47.5 C (Min: 44.0 C, Max: 52.0 C)".
- Warning state (any temp at or above the high threshold): displays all temperature readings in detail, shows the full sensor output with thresholds, changes the section title to "Temperature Warning", and adds a fire emoji indicator.

Resource calculation (see the sketch at the end of this section):

- CPU usage: the API returns a decimal (0.0 to 1.0), converted to a percentage: `cpu * 100`
- Memory: the API returns bytes, converted to GB: `bytes / (1024^3)`; the percentage is `(used / total) * 100`
- Uptime: the API returns seconds, converted to days and hours: `days = seconds / 86400`, `hours = (seconds % 86400) / 3600`

*Report Generation*

Message structure. The Telegram message uses HTML formatting for structure:
- Header section: report title, generation timestamp
- Virtual Machines section: total VM count, running VMs with a checkmark, stopped VMs with a stop sign, recently stopped count with a warning, and a detailed list if VMs were stopped in the last 15 minutes
- Host Resources section: CPU usage percentage, memory used/total with percentage, host uptime in days and hours
- Temperature section: smart display (summary or detailed), warning indicator if thresholds are exceeded, monospace formatting for sensor output

HTML formatting features: bold tags for headers and labels, italic for timestamps, code blocks for temperature data, Unicode separators for visual structure, and emoji indicators for status (checkmark, stop, warning, fire).

**Security Considerations**

- Credential storage: passwords are stored in the n8n Set node (encrypted in the database). Alternative: use n8n environment variables. Recommendation: use Proxmox API tokens instead of passwords.
- API communication: HTTPS with self-signed certificate acceptance, authentication via session tickets (15-minute validity), CSRF token validation for API requests.
- SSH access: password-based authentication (key-based is also possible), commands limited to read-only operations, no privilege escalation required.

**Performance Impact**

- API load: 3 API calls per execution (VM list, tasks, status); lightweight endpoints with minimal data; the 15-minute interval reduces server load.
- Execution time: a typical workflow execution takes 5-10 seconds (login: 1-2 seconds, API calls: 2-3 seconds, SSH command: 1-2 seconds, processing: less than 1 second).
- Resource usage: minimal CPU impact on Proxmox, small memory footprint, negligible network bandwidth.

**Extensibility**

- Adding additional metrics: add a new API call node after "Prepare Auth", update the "Process Data" node to include the new data, and modify "Generate Formatted Message" for display.
- Integration with other services: the workflow can be extended to send to Discord, Slack, or email; write to a database or log file; trigger alerts based on thresholds; or generate charts or graphs.
- Multi-node monitoring: to monitor multiple Proxmox nodes, duplicate the API call nodes, update the node names in the URLs, merge the data in the processing step, and generate a combined report.
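The resource calculation described above maps directly to code. A sketch, assuming the `/nodes/{node}/status` payload shape the text describes (cpu as a 0-1 fraction, memory in bytes, uptime in seconds):

```javascript
// Sketch of the host-resource math in "Process Data".
const s = $json.data; // body of GET /api2/json/nodes/{node}/status
const cpuPct = (s.cpu * 100).toFixed(1);
const memUsedGb = (s.memory.used / 1024 ** 3).toFixed(2);
const memTotalGb = (s.memory.total / 1024 ** 3).toFixed(2);
const memPct = ((s.memory.used / s.memory.total) * 100).toFixed(1);
const days = Math.floor(s.uptime / 86400);
const hours = Math.floor((s.uptime % 86400) / 3600);
return [{
  json: { cpuPct, memUsedGb, memTotalGb, memPct, uptime: `${days}d ${hours}h` },
}];
```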
by Alexandra Spalato
**Scrape Google Maps leads and find emails with Apify and Anymailfinder**

**Short Description**

This workflow automates lead generation by scraping business data from Google Maps using Apify, enriching it with verified email addresses via Anymailfinder, and storing the results in a NocoDB database. It's designed to prevent duplicates by checking against existing records before saving new leads.

**Key Features**

- **Automated Scraping**: Kicks off a Google Maps search based on your query, city, and country.
- **Email Enrichment**: For businesses with a website, it automatically finds professional email addresses.
- **Data Cleaning**: Cleans website URLs to extract the root domain, ignoring social media links (see the sketch at the end of this section).
- **Duplicate Prevention**: Checks against existing entries in NocoDB using the Google placeId to avoid adding the same lead twice.
- **Structured Storage**: Saves enriched lead data into a structured NocoDB database.
- **Batch Processing**: Efficiently handles and loops through all scraped results.

**Who This Workflow Is For**

- **Sales Teams** looking for a source of local business leads.
- **Marketing Agencies** building outreach campaigns for local clients.
- **Business Developers** prospecting for new partnerships.
- **Freelancers** seeking clients in specific geographical areas.

**How It Works**

1. **Trigger**: The workflow starts when you submit the initial form with a business type (e.g., "plumber"), a city, a country code, and the number of results you want.
2. **Scrape Google Maps**: It sends the query to Apify to scrape Google Maps for matching businesses.
3. **Process Leads**: The workflow loops through each result one by one.
4. **Clean Data**: It extracts the main website domain from the URL provided by Google Maps.
5. **Check for Duplicates**: It queries your NocoDB database to see if the business (placeId) has already been saved. If so, it skips to the next lead.
6. **Find Emails**: If a valid website domain exists, it uses Anymailfinder to find associated email addresses.
7. **Store Lead**: The final data, including the business name, address, phone, website, and any found emails, is saved as a new row in your NocoDB table.

**Setup Requirements**

Required credentials:

- **Apify API Key**: To use the Google Maps scraping actor.
- **Anymailfinder API Key**: For email lookup.
- **NocoDB API Token**: To connect to your database for storing and checking leads.

Database structure: you need to create a table in your NocoDB instance with the following columns. The names should match exactly.

Table: leads (or your preferred name)

- title (SingleLineText)
- website (Url)
- phone (PhoneNumber)
- email (Email)
- email_validation (SingleLineText)
- address (LongText)
- neighborhood (SingleLineText)
- rating (Number)
- categories (LongText)
- city (SingleLineText)
- country (SingleLineText)
- postal code (SingleLineText)
- domain (Url)
- placeId (SingleLineText) - important for duplicate checking
- date (Date)

**Customization Options**

- **Change Trigger**: Replace the manual Form Trigger with a Schedule Trigger to run searches automatically, or an HTTP Request node to start it from another application.
- **Modify Scraper Parameters**: In the "Scrape Google Maps" node, you can adjust the Apify actor's JSON input to change language, include reviews, or customize other advanced settings.
- **Use a Different Database**: Replace the NocoDB nodes with nodes for Google Sheets, Baserow, Airtable, or any SQL database to store your leads.

**Installation Instructions**

1. Import the workflow into your n8n instance.
2. Create the required table structure in your NocoDB instance as detailed above.
3. Configure the credentials for Apify, Anymailfinder, and NocoDB in the respective nodes.
4. In the two NocoDB nodes ("Get all the recorded placeIds" and "Create a row"), select your project and table from the dropdown menus.
5. Activate the workflow. You can now run it by filling out the form in the n8n UI.
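For the Clean Data step, a minimal Code-node sketch of extracting the root domain and discarding social media links, assuming the scraped item carries a website field; the social-domain list is illustrative.

```javascript
// Sketch: derive the "domain" column from the scraped website URL,
// leaving it empty for social links or malformed URLs.
const SOCIAL = ['facebook.com', 'instagram.com', 'linkedin.com', 'twitter.com', 'x.com'];
return $input.all().map((item) => {
  let domain = '';
  try {
    const host = new URL(item.json.website).hostname.replace(/^www\./, '');
    if (!SOCIAL.some((s) => host === s || host.endsWith('.' + s))) domain = host;
  } catch (e) {
    // leave domain empty when the URL is missing or malformed
  }
  return { json: { ...item.json, domain } };
});
```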