by Rahul Joshi
📘 Description: This workflow automates sales performance tracking and motivational updates by integrating HighLevel CRM, Notion, GPT-4o, and Slack. It pulls all deals from HighLevel, cleans and summarizes sales data per representative, creates performance dashboards in Notion, and uses GPT-powered AI to generate personalized motivational Slack messages. It eliminates manual leaderboard tracking and boosts sales engagement with real-time insights and daily motivation — ensuring every sales rep stays informed, recognized, and inspired.

**What This Workflow Does (Step-by-Step)**
- 🟢 Manual Trigger – Starts the automation manually for data refresh or testing.
- 📦 Fetch All Deals from HighLevel CRM – Retrieves all opportunities from HighLevel CRM, including deal names, reps, values, and stages for full visibility.
- 🔍 Validate Deal Fetch Success (IF Node) – Verifies that fetched data contains valid deal IDs. ✅ True path: continues to data cleaning. ❌ False path: logs failed records to Google Sheets for debugging.
- 🧹 Clean & Structure Deal Data – Normalizes raw deal data into a consistent schema (deal ID, rep ID, client name, value, status), ensuring clean inputs for analytics.
- 📊 Summarize Sales by Representative – Aggregates deals per sales rep and computes total deals handled, total deal value, total deals won, and average deal value (see the sketch below).
- 🧾 Generate Notion Performance Dashboard – Creates personalized Notion dashboards for each rep with daily updated performance summaries and motivation metrics.
- ⚙️ Transform Data for AI Input – Converts summarized data into an AI-readable format for GPT-4o processing.
- 🧠 GPT-4o Model Configuration – Sets up the Azure OpenAI GPT-4o model to generate motivational, contextual Slack messages.
- 🤖 AI-Generated Motivational Slack Messages – Uses LangChain + GPT-4o to create energetic, emoji-filled messages that celebrate rep achievements and encourage improvement.
- 💬 Notify Sales Team in Slack – Sends the AI-generated performance summaries and motivational blurbs directly to each rep or the team Slack channel for transparency and engagement.
- 🚨 Log Fetch or Validation Errors (Error Handling) – Records any fetch or validation failures in the Google Sheets "error log" sheet for easy review and error management.

**Prerequisites**
- HighLevel CRM API credentials
- Google Sheets for "Error Log" tracking
- Notion API integration for dashboards
- Azure OpenAI (GPT-4o) credentials
- Slack API connection for notifications

**Key Benefits**
✅ Fully automated daily performance tracking
✅ Personalized AI-powered motivation in Slack
✅ Transparent visibility for managers and reps
✅ Improved accountability and sales performance
✅ Seamless integration across CRM, Notion, and Slack

**👥 Perfect For**
- Sales teams seeking real-time motivation and transparency
- Managers who want automated performance dashboards
- Teams using HighLevel CRM and Slack for daily operations
- Companies aiming to gamify sales productivity
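For reference, here is a minimal sketch of what the "Summarize Sales by Representative" Code node could look like. It assumes the cleaning step emits one item per deal with `repId`, `repName`, `value`, and `status` fields — the field names are illustrative, not taken from the actual template.

```javascript
// n8n Code node sketch — aggregate cleaned deals per sales rep.
// Assumes each incoming item has { repId, repName, value, status };
// adjust the field names to match your "Clean & Structure Deal Data" step.
const byRep = {};

for (const item of $input.all()) {
  const { repId, repName, value, status } = item.json;
  if (!byRep[repId]) {
    byRep[repId] = { repId, repName, totalDeals: 0, totalValue: 0, dealsWon: 0 };
  }
  const rep = byRep[repId];
  rep.totalDeals += 1;
  rep.totalValue += Number(value) || 0;
  if (status === 'won') rep.dealsWon += 1;
}

// Emit one item per rep, adding the average deal value.
return Object.values(byRep).map((rep) => ({
  json: {
    ...rep,
    avgDealValue: rep.totalDeals ? +(rep.totalValue / rep.totalDeals).toFixed(2) : 0,
  },
}));
```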
by noda
Price Anomaly Detection & News Alert (Marketstack + HN + DeepL + Slack)

**Overview**
This workflow monitors a stock's closing price via Marketstack. It computes a 20-day moving average and standard deviation (±2σ). If the latest close is outside ±2σ, it flags an anomaly, fetches related headlines from Hacker News, translates them to Japanese with DeepL, and posts both original and translated text to Slack. When no anomaly is detected, it sends a concise "normal" report.

**How it works**
1) Daily trigger at 09:00 JST
2) Marketstack: fetch EOD data
3) Code: compute mean/σ and classify (normal/high/low) — see the sketch below
4) IF: anomaly? → yes = news path / no = normal report
5) Hacker News: search related items
6) DeepL: translate EN → JA
7) Slack: send bilingual notification

**Requirements**
- Marketstack API key
- DeepL API key
- Slack OAuth2 (bot token / channel permission)

**Notes**
- Edit the ticker in Get Stock Data.
- Adjust N (days) and k (sigma multiplier) in Calculate Deviation.
- Keep credentials out of HTTP nodes (use n8n Credentials).
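A minimal sketch of the Calculate Deviation logic described above, assuming the previous node delivers the recent closing prices as a single item with a `closes` array (oldest → newest); the field name and window handling are illustrative.

```javascript
// n8n Code node sketch — classify the latest close against a ±kσ band over the last N closes.
const closes = $json.closes;   // e.g. mapped from the Marketstack EOD response (illustrative field name)
const N = 20;                  // lookback window in trading days
const k = 2;                   // sigma multiplier

const window = closes.slice(-N);
const mean = window.reduce((sum, v) => sum + v, 0) / window.length;
const variance = window.reduce((sum, v) => sum + (v - mean) ** 2, 0) / window.length;
const sigma = Math.sqrt(variance);

const latest = window[window.length - 1];
let verdict = 'normal';
if (latest > mean + k * sigma) verdict = 'high'; // above the upper band
if (latest < mean - k * sigma) verdict = 'low';  // below the lower band

return [{
  json: { latest, mean, sigma, upper: mean + k * sigma, lower: mean - k * sigma, verdict },
}];
```

The IF node downstream only needs to check whether `verdict` is not `"normal"` to route into the news path.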
by Rahul Joshi
Description
Automatically detect customer churn risks from Zendesk tickets, log them into Google Sheets for tracking, and send instant Slack alerts to your customer success team. This workflow helps you spot unhappy customers early and take proactive action to reduce churn. 🚨📊💬

**What This Template Does**
- Fetches Zendesk tickets daily on schedule (8:00 PM). ⏰
- Processes and formats ticket data into clean JSON (priority, age, urgency). 🧠
- Identifies churn risks based on negative satisfaction ratings (see the sketch below). ⚠️
- Logs churn-risk tickets into Google Sheets for analysis and reporting. 📈
- Sends formatted Slack alerts with ticket details to the CS team channel. 📢

**Key Benefits**
- Detects unhappy customers before they churn. 🚨
- Centralized churn tracking for reporting and team reviews. 🧾
- Proactive alerts to reduce response delays. ⏱️
- Clean, structured ticket data for analytics and filtering. 🔄
- Strengthens customer success strategy with real-time visibility. 🌐

**Features**
- Schedule Trigger – Runs every weekday at 8:00 PM. 🗓️
- Zendesk Integration – Fetches all tickets automatically. 🎫
- Smart Data Processing – Adds ticket age, urgency, and priority mapping. 🧮
- Churn Risk Filter – Flags tickets with negative satisfaction scores. 🚩
- Google Sheets Logging – Saves churn-risk details with metadata. 📊
- Slack Alerts – Sends formatted messages with ID, subject, rating, and action steps. 💬

**Requirements**
- n8n instance (cloud or self-hosted).
- Zendesk API credentials with ticket read access.
- Google Sheets OAuth2 credentials with write permissions.
- Slack Bot API credentials with channel posting permissions.
- Pre-configured Google Sheet for churn-risk logging.

**Target Audience**
- Customer Success teams monitoring churn risk. 👩‍💻
- SaaS companies tracking customer health. 🚀
- Support managers who want proactive churn alerts. 🛠️
- SMBs improving retention through automation. 🏢
- Remote CS teams needing instant notifications. 🌐

**Step-by-Step Setup Instructions**
1. Connect your Zendesk, Google Sheets, and Slack credentials in n8n. 🔑
2. Update the Schedule Trigger (default: daily at 8:00 PM) if needed. ⏰
3. Replace the Google Sheet ID with your churn-risk tracking sheet. 📊
4. Confirm the Slack channel ID for alerts (default: zendesk-churn-alerts). 💬
5. Adjust the churn filter logic (default: satisfaction_score = "bad"). 🎯
6. Run a test to fetch Zendesk tickets and validate the Sheets + Slack outputs. ✅
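A minimal sketch of the processing and churn-filter steps described above, assuming Zendesk ticket items shaped roughly like the Zendesk API output (`id`, `subject`, `priority`, `created_at`, `satisfaction_rating`); the urgency rule and field names are illustrative.

```javascript
// n8n Code node sketch — add age/urgency fields, then keep only churn-risk tickets.
const now = Date.now();

return $input.all()
  .map((item) => {
    const t = item.json;
    const ageDays = Math.floor((now - new Date(t.created_at).getTime()) / 86400000);
    return {
      json: {
        ticket_id: t.id,
        subject: t.subject,
        priority: t.priority || 'normal',
        age_days: ageDays,
        urgency: ageDays > 7 || t.priority === 'urgent' ? 'high' : 'normal', // illustrative rule
        satisfaction_score: t.satisfaction_rating?.score ?? 'unoffered',
      },
    };
  })
  // Churn-risk filter: keep only tickets with a negative satisfaction rating.
  .filter((item) => item.json.satisfaction_score === 'bad');
```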
by Rahul Joshi
📘 Description: This workflow automates the incident response lifecycle — from creation to communication and archival. It instantly creates Jira tickets for new incidents, alerts the on-call Slack team, generates timeline reports, logs the status in Google Sheets, and archives documentation to Google Drive — all automatically. It helps engineering and DevOps teams respond faster, maintain audit trails, and ensure no incident details are lost, even after Slack or Jira history expires.

⚙️ What This Workflow Does (Step-by-Step)
- 🟢 Manual Trigger – Start the incident creation and alerting process manually on demand.
- 🏷️ Define Incident Metadata – Sets up standardized incident data (Service, Severity, Description) used across Jira, Slack, and Sheets for consistent processing.
- 🎫 Create Jira Incident Ticket – Automatically creates a Jira task with service, severity, and description fields. Returns a unique Jira key and link for tracking.
- ✅ Validate Jira Ticket Creation Success – Confirms the Jira ticket was successfully created before continuing. True path: proceeds to Slack alerts and the documentation flow. False path: logs the failure details to Google Sheets for debugging.
- 🚨 Log Jira Creation Failures to Error Sheet – Records any Jira API errors, permission issues, or timeouts to an error log sheet for reliability monitoring.
- 🔗 Combine Incident & Jira Data – Merges incident context with Jira ticket data so all details are unified for downstream notifications.
- 💬 Format Incident Alert for Slack – Generates a rich Slack message containing the Jira key, service, severity, and description with clickable Jira links (see the sketch below).
- 📢 Alert On-Call Team in Slack – Posts the formatted message directly to the #oncall Slack channel to instantly notify engineers.
- 📋 Generate Incident Timeline Report – Parses Slack message content to create a detailed incident timeline including timestamps, service, severity, and placeholders for postmortem tracking.
- 📄 Convert Timeline to Text File – Converts the generated timeline into a structured .txt file for archival and compliance.
- ☁️ Archive Incident Timeline to Drive – Uploads the finalized incident report to Google Drive ("Incident Reports" folder) with timestamped filenames for traceability.
- 📊 Log Incident to Status Tracking Sheet – Appends the Jira key, service, severity, and timestamp to the "status update" Google Sheet to build a live incident dashboard and enable SLA tracking.

🧩 Prerequisites
- Jira account with API access
- Google Sheets for "status update" and "error log" tracking
- Slack workspace connected via API credentials
- Google Drive access for archival

💡 Key Benefits
✅ Instant Slack alerts for new incidents
✅ Centralized Jira ticketing and tracking
✅ Automated timeline documentation for audits
✅ Seamless Google Drive archival and status logging
✅ Reduced MTTR through faster communication

👥 Perfect For
- DevOps and SRE teams managing production incidents
- Engineering managers overseeing uptime and reliability
- Organizations needing automated post-incident documentation
- Teams focused on SLA adherence and compliance reporting
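A minimal sketch of the "Format Incident Alert for Slack" step described above, assuming the merge step exposes `jiraKey`, `jiraUrl`, `service`, `severity`, and `description` fields (names are illustrative) and that Slack mrkdwn formatting is used.

```javascript
// n8n Code node sketch — build the Slack alert text from the merged incident + Jira data.
const { jiraKey, jiraUrl, service, severity, description } = $json;

// Severity-to-emoji mapping is an illustrative choice, not part of the original template.
const emoji = severity === 'Critical' ? '🔴' : severity === 'High' ? '🟠' : '🟡';

const slackText = [
  `${emoji} *New Incident* — <${jiraUrl}|${jiraKey}>`,
  `*Service:* ${service}`,
  `*Severity:* ${severity}`,
  `*Description:* ${description}`,
  `_Logged at ${new Date().toISOString()}_`,
].join('\n');

return [{ json: { slackText } }];
```

The Slack node that follows only needs to send `{{ $json.slackText }}` to the #oncall channel.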
by Evoort Solutions
Job Search Automation with Job Search Global API & Google Sheets Logging

Description: Automate your job search process by querying the Job Search Global API via RapidAPI every 6 hours for a specified keyword like "Web Developer." This workflow extracts job listings and saves them directly to Google Sheets, with alerts sent for any API failures.

**Workflow Overview**
- Schedule Trigger – Runs the workflow automatically every 6 hours to ensure up-to-date job listings.
- Set Search Term – Defines the dynamic job keyword, e.g., "Web Developer," used in API requests.
- Fetch Job Listings – Sends a POST request to the Job Search Global API (via RapidAPI) to retrieve job listings with pagination (see the request sketch below).
- Check API Response – Validates the API response status, branching the workflow on success or failure.
- Extract Job Data – Parses the job listings array from the API response for processing.
- Save to Google Sheet – Appends or updates job listings in Google Sheets, avoiding duplicates by matching job titles.
- Send Failure Notification Email – Sends an alert email if the API response fails or returns an error.

**How to Obtain Your RapidAPI Key (Quick Steps)**
1. Go to the RapidAPI Job Search Global API page.
2. Sign up or log in to your RapidAPI account.
3. Subscribe to the API plan that suits your needs.
4. Copy your unique X-RapidAPI-Key from the dashboard.
5. Insert this key into your workflow's HTTP Request node headers.

**How to Configure Google Sheets**
1. Create a new Google Sheet for job listings.
2. Share the sheet with your Google Service Account email to enable API access.
3. Use the sheet URL in the Google Sheets node within your workflow.
4. Map columns correctly based on the job data fields.

**Google Sheet Columns Used**

| Column Name | Description |
| ----------- | ----------------------------------- |
| title | Job title |
| url | Job posting URL |
| company | Company name |
| postDate | Date job was posted |
| jobSource | Source of the job listing |
| slug | Unique job identifier or slug |
| sentiment | Sentiment analysis score (if any) |
| dateAdded | Date the job was added to the sheet |
| tags | Associated tags or keywords |
| viewCount | Number of views for the job post |

**Use Cases & Benefits**
- **Automated Job Tracking**: Get fresh job listings without manual searching by automatically querying the Job Search Global API multiple times per day.
- **Centralized Job Data**: Save and update listings in Google Sheets for easy filtering, sharing, and tracking.
- **Failure Alerts**: Get notified immediately if API calls fail, helping maintain workflow reliability.
- **Customizable Search**: Change keywords anytime to tailor job searches for different roles or industries.

**Who Is This Workflow For?**
- **Recruiters** looking to monitor job market trends in real time.
- **Job Seekers** who want to automate job discovery for specific roles like "Web Developer."
- **HR Teams** managing talent pipelines and job postings.
- **Data Analysts** needing structured job market data for research or reporting.

Create your free n8n account and set up the workflow in just a few minutes using the link below:
👉 Start Automating with n8n — save time, stay consistent, and keep your job-listings sheet up to date effortlessly!
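For orientation, a plain Node.js sketch of the POST request that the workflow's HTTP Request node performs. The endpoint path, host, and body fields are placeholders — copy the exact values from the API's RapidAPI playground into the node configuration.

```javascript
// Node.js 18+ sketch of the RapidAPI call (placeholder host/endpoint/body fields).
async function fetchJobs(keyword, page = 1) {
  const response = await fetch('https://<job-search-global-host>.p.rapidapi.com/<search-endpoint>', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-RapidAPI-Key': process.env.RAPIDAPI_KEY, // your key from the RapidAPI dashboard
      'X-RapidAPI-Host': '<job-search-global-host>.p.rapidapi.com',
    },
    body: JSON.stringify({ query: keyword, page }), // keyword + pagination
  });

  if (!response.ok) {
    // The workflow branches here to the failure-notification email.
    throw new Error(`Job Search API returned ${response.status}`);
  }
  const data = await response.json();
  return data.jobs ?? []; // the listings array; field name is illustrative
}

fetchJobs('Web Developer').then((jobs) => console.log(`Fetched ${jobs.length} listings`));
```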
by Rahul Joshi
📊 Description
This workflow automatically classifies new Stack Overflow questions by topic, generates structured FAQ content using GPT-4o-mini, logs each entry in Google Sheets, saves formatted FAQs in Notion, and notifies your team on Slack — ensuring your product and support teams stay aligned with real-world developer discussions. 🤖💬📚

⚙️ What This Template Does
- Step 1: Monitors Stack Overflow RSS feeds for new questions related to your selected tags. ⏱️
- Step 2: Filters out irrelevant or incomplete questions before processing (see the sketch below). 🧹
- Step 3: Uses OpenAI GPT-4o-mini to classify each question into a topic category (Frontend, Backend, DevOps, etc.). 🧠
- Step 4: Generates structured FAQ content including summaries, technical insights, and internal guidance. 📄
- Step 5: Saves formatted entries into your Notion knowledge-base database. 📚
- Step 6: Logs all FAQ data into a connected Google Sheet for analytics and tracking. 📊
- Step 7: Sends real-time Slack notifications with quick links to the new FAQ and the original Stack Overflow post. 🔔
- Step 8: Provides automatic error detection — any failed AI or Notion step triggers an instant Slack alert. 🚨

💡 Key Benefits
✅ Builds a continuously updated, AI-driven knowledge base
✅ Reduces repetitive support and documentation work
✅ Keeps product and dev teams aware of trending community issues
✅ Enhances internal docs with verified Stack Overflow insights
✅ Maintains an audit trail via Google Sheets
✅ Alerts your team instantly on errors or new FAQs

🧩 Features
- Automatic Stack Overflow RSS monitoring
- Dual-layer OpenAI integration (topic classification + FAQ generation)
- Structured Notion database integration
- Google Sheets logging for analytics
- Slack notifications for new FAQs and error alerts
- Custom tag-based question filtering
- Near real-time updates (every minute)
- Built-in error handling for reliability

🔐 Requirements
- OpenAI API key (GPT-4o-mini access)
- Notion API credentials with database access
- Google Sheets OAuth2 credentials
- Slack bot token with chat:write permissions
- Stack Overflow RSS feed URL for your preferred tags

👥 Target Audience
- SaaS or product teams building internal FAQ and knowledge systems
- Developer relations and documentation teams
- Customer-support teams automating knowledge reuse
- Technical communities curating content from Stack Overflow

🧭 Setup Instructions
1. Add your OpenAI API credentials in n8n.
2. Connect your Notion database and update the page or database ID.
3. Connect Google Sheets credentials and select your tracking sheet.
4. Connect your Slack account and specify your notification channel.
5. Update the RSS feed URL with your chosen Stack Overflow tags.
6. Run the workflow manually once to test connectivity, then enable automation.
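A minimal sketch of the Step 2 filter described above, assuming RSS items carry `title`, `contentSnippet`, and `categories` fields (typical of n8n's RSS Read output, but verify against your feed); the tag list and length threshold are illustrative.

```javascript
// n8n Code node sketch — drop irrelevant or incomplete questions before the AI steps.
const requiredTags = ['n8n', 'node.js', 'docker']; // example tags — mirror the tags in your feed URL
const minBodyLength = 80;                           // skip near-empty questions

return $input.all().filter((item) => {
  const { title, contentSnippet = '', categories = [] } = item.json;
  if (!title || contentSnippet.length < minBodyLength) return false;
  return categories.some((tag) => requiredTags.includes(String(tag).toLowerCase()));
});
```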
by Vigh Sandor
Setup Instructions

Overview
This n8n workflow monitors your Proxmox VE server and sends automated reports to Telegram every 15 minutes. It tracks VM status, host resource usage, temperature sensors, and detects recently stopped VMs.

Prerequisites

Required Software
- n8n instance (self-hosted or cloud)
- Proxmox VE server with API access
- Telegram account with a bot created via BotFather
- lm-sensors package installed on the Proxmox host

Required Access
- Proxmox admin credentials (username and password)
- SSH access to the Proxmox server
- Telegram Bot API token
- Telegram Chat ID

Installation Steps

Step 1: Install Temperature Sensors on Proxmox
SSH into your Proxmox server and run: `apt-get update`, `apt-get install -y lm-sensors`, then `sensors-detect`. Press ENTER to accept the default answers during sensors-detect setup. Test that sensors work: `sensors | grep -E 'Package|Core'`

Step 2: Create Telegram Bot
1. Open Telegram and search for BotFather
2. Send the /newbot command
3. Follow the prompts to create your bot
4. Save the API token provided
5. Get your Chat ID by sending a message to your bot, then visiting: https://api.telegram.org/bot<YOUR_TOKEN>/getUpdates
6. Look for "chat":{"id": YOUR_CHAT_ID in the response

Step 3: Configure n8n Credentials
SSH Password Credential
- In n8n, go to the Credentials menu
- Create a new credential: SSH Password
- Enter: Host (your Proxmox IP address), Port (22), Username (root or your admin user), Password (your Proxmox password)
Telegram API Credential
- Create a new credential: Telegram API
- Enter the Bot Token from BotFather

Step 4: Import and Configure Workflow
1. Import the JSON workflow into n8n
2. Open the "Set Variables" node and update the following values:
   - PROXMOX_IP: Your Proxmox server IP address
   - PROXMOX_PORT: API port (default: 8006)
   - PROXMOX_NODE: Node name (default: pve)
   - TELEGRAM_CHAT_ID: Your Telegram chat ID
   - PROXMOX_USER: Proxmox username with realm (e.g., root@pam)
   - PROXMOX_PASSWORD: Proxmox password
3. Connect credentials: SSH - Get Sensors node → select your SSH credential; Send Telegram Report node → select your Telegram credential
4. Save the workflow
5. Activate the workflow

Configuration Options

Adjust Monitoring Interval
Edit the "Schedule Every 15min" node: change the minutesInterval value to the desired interval (in minutes). Recommended: 5–30 minutes.

Adjust Recently Stopped VM Detection Window
Edit the "Process Data" node: find the line `const fifteenMinutesAgo = now - 900;` and change 900 to the desired number of seconds (900 = 15 minutes).

Modify Temperature Warning Threshold
The workflow uses the "high" threshold defined by sensors. To set a threshold manually, edit the "Process Data" node: modify the temperature parsing logic and change the comparison `if (current >= high)` to use a custom value.

Testing

Test Individual Components
1. Execute the "Set Variables" node manually — verify the output
2. Execute the "Proxmox Login" node — check for a valid ticket
3. Execute "API - VM List" — confirm VM data is received
4. Execute the complete workflow — check Telegram for the message

Troubleshooting

Login fails:
- Verify the PROXMOX_USER format includes the realm (e.g., root@pam)
- Check the password is correct
- Ensure allowUnauthorizedCerts is enabled for self-signed certificates

No temperature data:
- Verify lm-sensors is installed on Proxmox
- Run the sensors command manually via SSH
- Check the SSH credentials are correct

Recently stopped VMs not detected:
- Check the task log API endpoint returns data
- Verify the VM was stopped within the detection window
- Ensure task types qmstop or qmshutdown are logged

Telegram not receiving messages:
- Verify the bot token is correct
- Confirm the chat ID is accurate
- Check the bot was started (send /start to the bot)
- Verify parse_mode is set to HTML in the Telegram node

How It Works

Workflow Architecture
The workflow executes a sequential chain of nodes that gather data from multiple sources, process it, and deliver a formatted report.

Execution Flow
1. Schedule Trigger (15min)
2. Set Variables
3. Proxmox Login (get authentication ticket)
4. Prepare Auth (prepare credentials for API calls)
5. API - VM List (get all VMs and their status)
6. API - Node Tasks (get recent task log)
7. API - Node Status (get host CPU, memory, uptime)
8. SSH - Get Sensors (get temperature data)
9. Process Data (analyze and structure all data)
10. Generate Formatted Message (create Telegram message)
11. Send Telegram Report (deliver via Telegram)

Data Collection

VM Information (Proxmox API)
- Endpoint: /api2/json/nodes/{node}/qemu
- Retrieves: total VM count, running VM count, stopped VM count, VM names and IDs

Task Log (Proxmox API)
- Endpoint: /api2/json/nodes/{node}/tasks?limit=100
- Retrieves recent tasks to detect: qmstop operations (VM stop commands), qmshutdown operations (VM shutdown commands), task timestamps, task status

Host Status (Proxmox API)
- Endpoint: /api2/json/nodes/{node}/status
- Retrieves: CPU usage percentage, memory total and used (in GB), system uptime (in seconds)

Temperature Data (SSH)
- Command: sensors | grep -E 'Package|Core'
- Retrieves: CPU package temperature, individual core temperatures, high and critical thresholds

Data Processing

VM Status Analysis
- Counts total, running, and stopped VMs
- Queries the task log for stop/shutdown operations
- Filters tasks within the 15-minute window
- Extracts the VM ID from the task UPID string
- Matches the VM ID to the VM name from the VM list
- Calculates the time elapsed since the stop operation

Temperature Intelligence
The workflow implements smart temperature reporting:
- Normal operation (all temps below the high threshold): calculates the average temperature across all cores and displays min, max, and average values. Example: "Average: 47.5 C (Min: 44.0 C, Max: 52.0 C)"
- Warning state (any temp at or above the high threshold): displays all temperature readings in detail, shows the full sensor output with thresholds, changes the section title to "Temperature Warning", and adds a fire emoji indicator

Resource Calculation (see the sketch below)
- CPU usage: the API returns a decimal (0.0 to 1.0), converted to a percentage: cpu * 100
- Memory: the API returns bytes, converted to GB: bytes / (1024^3); percentage: (used / total) * 100
- Uptime: the API returns seconds, converted to days and hours: days = seconds / 86400, hours = (seconds % 86400) / 3600

Report Generation

Message Structure
The Telegram message uses HTML formatting for structure:
- Header section: report title, generation timestamp
- Virtual Machines section: total VM count, running VMs with checkmark, stopped VMs with stop sign, recently stopped count with warning, detailed list if VMs were stopped in the last 15 minutes
- Host Resources section: CPU usage percentage, memory used/total with percentage, host uptime in days and hours
- Temperature section: smart display (summary or detailed), warning indicator if thresholds are exceeded, monospace formatting for sensor output

HTML Formatting Features
- Bold tags for headers and labels
- Italic for timestamps
- Code blocks for temperature data
- Unicode separators for visual structure
- Emoji indicators for status (checkmark, stop, warning, fire)

Security Considerations

Credential Storage
- Passwords stored in the n8n Set node (encrypted in the database)
- Alternative: use n8n environment variables
- Recommendation: use Proxmox API tokens instead of passwords

API Communication
- HTTPS with self-signed certificate acceptance
- Authentication via session tickets (15-minute validity)
- CSRF token validation for API requests

SSH Access
- Password-based authentication (can use key-based)
- Commands limited to read-only operations
- No privilege escalation required

Performance Impact

API Load
- 3 API calls per execution (VM list, tasks, status)
- Lightweight endpoints with minimal data
- The 15-minute interval reduces server load

Execution Time
- Typical workflow execution: 5–10 seconds
- Login: 1–2 seconds
- API calls: 2–3 seconds
- SSH command: 1–2 seconds
- Processing: less than 1 second

Resource Usage
- Minimal CPU impact on Proxmox
- Small memory footprint
- Negligible network bandwidth

Extensibility

Adding Additional Metrics
To monitor additional data points:
1. Add a new API call node after "Prepare Auth"
2. Update the "Process Data" node to include the new data
3. Modify "Generate Formatted Message" for display

Integration with Other Services
The workflow can be extended to:
- Send to Discord, Slack, or email
- Write to a database or log file
- Trigger alerts based on thresholds
- Generate charts or graphs

Multi-Node Monitoring
To monitor multiple Proxmox nodes:
1. Duplicate the API call nodes
2. Update the node names in the URLs
3. Merge data in the processing step
4. Generate a combined report
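A minimal sketch of the resource-calculation part of the "Process Data" node, following the conversions listed above; it assumes the `API - Node Status` response body is available on `$json.data` with the standard Proxmox `cpu`, `memory`, and `uptime` fields.

```javascript
// n8n Code node sketch — convert the /nodes/{node}/status payload into report-ready values.
const status = $json.data; // response body of "API - Node Status"

const cpuPercent = +(status.cpu * 100).toFixed(1);                 // 0.0–1.0 → %
const memUsedGb  = +(status.memory.used  / 1024 ** 3).toFixed(2);  // bytes → GB
const memTotalGb = +(status.memory.total / 1024 ** 3).toFixed(2);
const memPercent = +((status.memory.used / status.memory.total) * 100).toFixed(1);

const uptimeDays  = Math.floor(status.uptime / 86400);             // seconds → whole days
const uptimeHours = Math.floor((status.uptime % 86400) / 3600);    // remainder → hours

return [{ json: { cpuPercent, memUsedGb, memTotalGb, memPercent, uptimeDays, uptimeHours } }];
```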
by Rahul Joshi
Description
This workflow automates the evaluation of interviewer feedback using AI. It retrieves raw notes from Google Sheets, processes them through GPT-4o-mini for structured scoring, validates outputs, and calculates weighted quality scores. The system provides real-time Slack feedback to interviewers, logs AI errors for transparency, and recommends training if the feedback quality is low.

What This Template Does (Step-by-Step)
- ⚡ Manual Trigger – Runs the workflow manually to start evaluation.
- 📋 Fetch Raw Feedback Data (Google Sheets) – Reads all feedback entries (Role, Stage, Interviewer Email, Feedback Text, row_number).
- 🧠 AI Quality Evaluator (Azure GPT-4o-mini) – Processes feedback into structured JSON across 5 dimensions.
- 🔍 Analyze Feedback Quality (LLM Chain) – Applies scoring rules (Specificity, STAR, Bias-Free, Actionability, Depth) and outputs structured JSON.
- ✅ Validate AI Response – Ensures the AI output isn't undefined or malformed.
- 🚨 Log AI Errors (Google Sheets) – Records invalid AI responses for debugging and auditing.
- 🔄 Parse AI JSON Output (Code Node) – Converts AI JSON text into structured n8n objects with error handling.
- 🧮 Calculate Weighted Quality Score (Code Node) – Computes the final weighted score (0–100), generates flags, formats vague phrases, and preserves context (see the sketch below).
- 💾 Save Scores to Spreadsheet (Google Sheets) – Updates the original feedback row with Score, Flags, and AI JSON.
- 💬 Send Feedback Summary to Interviewer (Slack) – Sends interviewers a structured Slack report (score, flags, vague phrases, STAR improvement tips).
- 🎯 Check if Training Needed – Applies threshold logic: if score < 50, route to training recommendations.
- 📚 Send Training Recommendations (Slack) – Delivers STAR method guides and bias-free interviewing resources to low scorers.

Prerequisites
- Google Sheets (Raw_Feedback + Error Log sheet)
- Azure OpenAI API credentials (for GPT-4o-mini)
- Slack API credentials (for sending feedback & training notifications)
- n8n instance (cloud or self-hosted)

Key Benefits
✅ Automated interview feedback quality scoring
✅ Bias detection and vague feedback flagging
✅ Real-time Slack feedback to interviewers
✅ Error logging for AI reliability tracking
✅ Training recommendations for low scorers
✅ Audit trail maintained in Google Sheets

Perfect For
- HR & Recruitment teams ensuring structured interviewer feedback
- Organizations enforcing the STAR method & bias-free hiring
- Teams seeking continuous interviewer coaching
- Companies needing audit-ready records of interview quality
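A minimal sketch of the "Calculate Weighted Quality Score" Code node, assuming the parsed AI output provides one 0–10 score per dimension; the weights, flag rules, and field names are illustrative and should be tuned to your rubric.

```javascript
// n8n Code node sketch — turn the parsed AI scores into a single weighted 0–100 value.
const scores = $json.scores; // e.g. { specificity: 7, star: 5, biasFree: 9, actionability: 6, depth: 4 }

// Illustrative weights over the five dimensions named above (must sum to 1).
const weights = { specificity: 0.25, star: 0.25, biasFree: 0.2, actionability: 0.2, depth: 0.1 };

let weighted = 0;
for (const [dimension, weight] of Object.entries(weights)) {
  weighted += (Number(scores[dimension]) || 0) * 10 * weight; // scale 0–10 → 0–100, then weight
}
weighted = Math.round(weighted);

// Illustrative flagging rules feeding the Slack summary.
const flags = [];
if ((scores.biasFree ?? 10) < 5) flags.push('possible-bias');
if ((scores.specificity ?? 10) < 5) flags.push('vague-feedback');

return [{
  json: {
    weighted_score: weighted,
    flags,
    needs_training: weighted < 50, // matches the "Check if Training Needed" threshold
  },
}];
```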
by n8n Automation Expert | Template Creator | 2+ Years Experience
🎯 What This Workflow Does
Transform your digital payment business with a fully-featured Telegram bot that handles everything from product listings to transaction processing. Perfect for entrepreneurs looking to automate their PPOB (mobile credit, data packages, bill payments) business operations without coding expertise.

✨ Key Features

📱 Complete Transaction Management
- **Prepaid Services**: Mobile credit, data packages, PLN tokens
- **Gaming**: Game vouchers for popular platforms
- **E-Wallet**: OVO, DANA, GoPay, ShopeePay top-ups
- **Bill Payments**: PLN postpaid, Telkom, cable TV, internet, credit cards

💰 Smart Business Operations
- Real-time balance checking with low-balance alerts
- Automated transaction processing with MD5 security (see the signature sketch below)
- Interactive product catalog with categorized browsing
- Transaction history and status tracking
- Deposit request management

🤖 User-Friendly Interface
- Intuitive inline keyboard navigation
- Multi-step transaction flows with validation
- Comprehensive error handling and user feedback
- Professional messaging with emojis and formatting

🛠️ Technical Highlights

Robust Architecture
- **Switch-based routing** for efficient command handling
- **MD5 signature authentication** for secure API communications
- **Session management** for multi-step user interactions
- **Comprehensive error handling** with user-friendly messages

API Integrations
- **Digiflazz API**: Balance checking, product listings, transactions, bill inquiries
- **Telegram Bot API**: Message handling, inline keyboards, callback queries
- **Secure credential management** with environment variables

📋 Setup Requirements

Prerequisites
- Active Digiflazz account with API credentials
- Telegram Bot Token from @BotFather
- n8n instance (cloud or self-hosted)

Environment Variables
- DIGIFLAZZ_USERNAME=your_digiflazz_username
- DIGIFLAZZ_API_KEY=your_digiflazz_api_key

🎮 How to Use

Customer Commands
- /start – Welcome message and main menu
- /menu – Access main navigation
- /balance – Check account balance
- /products – Browse product catalog
- /topup – Process prepaid transactions
- /checkbill – Inquiry postpaid bills
- /paybill – Pay postpaid services
- /deposit – Request balance deposit
- /history – View transaction history

Business Features
- **Automated balance monitoring** with threshold alerts
- **Product categorization** for easy browsing
- **Transaction confirmation** with detailed receipts
- **Multi-payment type support** across various service providers

🔒 Security & Compliance
- **MD5 signature verification** for all API calls
- **Input validation** and sanitization
- **Session timeout management**
- **Error logging** and monitoring
- **HTTPS-only communications**

💡 Business Benefits

For PPOB Entrepreneurs
- **Reduce manual work** by 90% through automation
- **24/7 customer service** without human intervention
- **Professional presentation** builds customer trust
- **Scalable operations** handle unlimited transactions

For Customers
- **Instant transactions** with real-time confirmations
- **Easy navigation** through intuitive menus
- **Multiple service options** in one convenient bot
- **Reliable service** with comprehensive error handling

📊 Performance Features
- **Sub-second response times** for balance checks
- **Concurrent transaction processing**
- **Automatic retry logic** for failed operations
- **Detailed logging** for business analytics

🎯 Perfect For
- **Digital payment entrepreneurs** starting PPOB businesses
- **Existing businesses** looking to automate customer service
- **Resellers** wanting professional transaction interfaces
- **Developers** seeking proven automation templates

📱 Supported Services

Prepaid Products
- Mobile credit (all Indonesian operators)
- Data packages and internet vouchers
- PLN electricity tokens
- Game vouchers (Mobile Legends, Free Fire, PUBG, etc.)

Postpaid Services
- PLN electricity bills
- Telkom phone bills
- Cable TV subscriptions (First Media, MNC, etc.)
- Internet service providers
- Credit card payments
- Multifinance installments

🚀 Getting Started
1. Import the workflow JSON into your n8n instance
2. Configure Telegram and Digiflazz credentials
3. Set up environment variables
4. Activate the workflow
5. Test with your Telegram bot
6. Start serving customers immediately!

💎 Premium Features
- **Comprehensive documentation** with setup guides
- **Error handling** for all edge cases
- **Professional UI/UX** design
- **Scalable architecture** for business growth
- **Community support** and updates

Transform your digital payment business today with this production-ready Telegram bot automation. No coding required – just configure and launch! Perfect for the Indonesian PPOB market with full Digiflazz integration and professional customer experience.
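A minimal sketch of building the MD5 request signature mentioned above. Digiflazz documents its signatures as an MD5 hash of username + API key + a reference value (the transaction `ref_id`, or a fixed word for balance checks) — verify the exact concatenation against the current Digiflazz docs before relying on this.

```javascript
// n8n Code node sketch — build the MD5 signature for a Digiflazz transaction request.
// On self-hosted n8n you may need NODE_FUNCTION_ALLOW_BUILTIN=crypto to use require('crypto').
const crypto = require('crypto');

const username = $env.DIGIFLAZZ_USERNAME; // from the environment variables set up above
const apiKey   = $env.DIGIFLAZZ_API_KEY;
const refId    = $json.ref_id;            // your unique transaction reference (illustrative field)

const sign = crypto
  .createHash('md5')
  .update(`${username}${apiKey}${refId}`) // assumed order: username + apiKey + ref_id
  .digest('hex');

return [{
  json: {
    username,
    buyer_sku_code: $json.sku,         // illustrative field names for the request body
    customer_no: $json.customer_no,
    ref_id: refId,
    sign,
  },
}];
```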
by Alexandra Spalato
Short Description
This LinkedIn automation workflow monitors post comments for specific trigger words and automatically sends direct messages with lead magnets to engaged users. The system checks connection status, handles non-connected users with connection requests, and prevents duplicate outreach by tracking all interactions in a database.

Key Features
- **Comment Monitoring**: Scans LinkedIn post comments for customizable trigger words
- **Connection Status Check**: Determines if users are 1st-degree connections
- **Automated DMs**: Sends personalized messages with lead magnet links to connected users
- **Connection Requests**: Asks non-connected users to connect via comment replies
- **Duplicate Prevention**: Tracks interactions in NocoDB to avoid repeat messages
- **Message Rotation**: Uses different comment reply variations for authenticity
- **Batch Processing**: Handles multiple comments with built-in delays

Who This Workflow Is For
- Content creators looking to convert post engagement into leads
- Coaches and consultants sharing valuable LinkedIn content
- Anyone wanting to automate lead capture from LinkedIn posts

How It Works
1. Setup: Configure post ID, trigger word, and lead magnet link via form
2. Comment Extraction: Retrieves all comments from the specified post using Unipile
3. Trigger Detection: Filters comments containing the specified trigger word (see the sketch below)
4. Connection Check: Determines if commenters are 1st-degree connections
5. Smart Routing: Connected users receive DMs, others get connection requests
6. Database Logging: Records all interactions to prevent duplicates

Setup Requirements

Required Credentials
- **Unipile API Key**: For LinkedIn API access
- **NocoDB API Token**: For database tracking

Database Structure — Table: leads
- linkedin_id: LinkedIn user ID
- name: User's full name
- headline: LinkedIn headline
- url: Profile URL
- date: Interaction date
- posts_id: Post reference
- connection_status: Network distance
- dm_status: Interaction type (sent/connection request)

Customization Options
- **Message Templates**: Modify DM and connection request messages
- **Trigger Words**: Change the words that activate the workflow
- **Timing**: Adjust delays between messages (8–12 seconds default)
- **Reply Variations**: Add more comment reply options for authenticity

Installation Instructions
1. Import the workflow into your n8n instance
2. Set up the NocoDB database with the required table structure
3. Configure Unipile and NocoDB credentials
4. Set environment variables for the Unipile root URL and LinkedIn account ID
5. Test with a sample post before full use
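A minimal sketch of the trigger-detection and duplicate-prevention steps described above. It assumes a prior node (named here hypothetically "Get Existing Leads") returns the NocoDB `leads` rows, and that comment items carry `text` and `author_linkedin_id` fields — those names are illustrative and depend on your Unipile response mapping.

```javascript
// n8n Code node sketch — keep only comments that contain the trigger word
// and whose authors have not already been contacted.
const triggerWord = 'guide'; // illustrative — in the real flow this comes from the setup form

// Build a lookup of LinkedIn IDs we already messaged (from the leads table).
const alreadyContacted = new Set(
  $('Get Existing Leads').all().map((row) => row.json.linkedin_id) // hypothetical node name
);

return $input.all().filter((item) => {
  const comment = item.json;
  const text = (comment.text || '').toLowerCase();
  return text.includes(triggerWord) && !alreadyContacted.has(comment.author_linkedin_id);
});
```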
by Growth AI
Google Ads automated reporting to spreadsheets with Airtable

Who's it for
Digital marketing agencies, PPC managers, and marketing teams who manage multiple Google Ads accounts and need automated monthly performance reporting organized by campaign types and conversion metrics.

What it does
This workflow automatically retrieves Google Ads performance data from multiple client accounts and populates organized spreadsheets with campaign metrics. It differentiates between e-commerce (conversion value) and lead generation (conversion count) campaigns, then organizes data by advertising channel (Performance Max, Search, Display, etc.) with monthly tracking for budget and performance analysis.

How it works
The workflow follows an automated data collection and reporting process:
1. Account Retrieval: Fetches client information from Airtable (project names, Google Ads IDs, campaign types)
2. Active Filter: Processes only accounts marked as "Actif" for budget reporting
3. Campaign Classification: Routes accounts through e-commerce or lead generation workflows based on "Typologie ADS"
4. Google Ads Queries: Executes different API calls depending on campaign type (conversion value vs. conversion count)
5. Data Processing: Organizes metrics by advertising channel (Performance Max, Search, Display, Video, Shopping, Demand Gen)
6. Dynamic Spreadsheet Updates: Automatically fills the correct monthly column in client spreadsheets
7. Sequential Processing: Handles multiple accounts with wait periods to avoid API rate limits

Requirements
- Airtable account with client database
- Google Ads API access with developer token
- Google Sheets API access
- Client-specific spreadsheet templates (provided)

How to set up

Step 1: Prepare your reporting template
- Copy the Google Sheets reporting template
- Create individual copies for each client
- Ensure the proper column structure (months B–M for January–December)
- Link template URLs in your Airtable database

Step 2: Configure your Airtable database
Set up the following fields in your Airtable:
- Project names: Client project identifiers
- ID GADS: Google Ads customer IDs
- Typologie ADS: Campaign classification ("Ecommerce" or "Lead")
- Status - Prévisionnel budgétaire: Account status ("Actif" for active accounts)
- Automation budget: URLs to client-specific reporting spreadsheets

Step 3: Set up API credentials
Configure the following authentication:
- Airtable Personal Access Token: For client database access
- Google Ads OAuth2: For advertising data retrieval
- Google Sheets OAuth2: For spreadsheet updates
- Developer Token: Required for Google Ads API access
- Login Customer ID: Manager account identifier

Step 4: Configure Google Ads API settings
Update the HTTP request nodes with your credentials:
- Developer Token: Replace "[Your token]" with your actual developer token
- Login Customer ID: Replace "[Your customer id]" with your manager account ID
- API Version: Currently using v18 (update as needed)

Step 5: Set up scheduling
- Default schedule: Runs on the 3rd of each month at 5 AM
- Cron expression: 0 5 3 * *
- Recommended timing: Early-month execution for complete previous-month data
- Processing delay: 1-minute waits between accounts to respect API limits

How to customize the workflow

Campaign type customization
E-commerce campaigns:
- Tracks: Cost and conversion value metrics
- Query: metrics.conversions_value for revenue tracking
- Use case: Online stores, retail businesses
Lead generation campaigns:
- Tracks: Cost and conversion count metrics
- Query: metrics.conversions for lead quantity
- Use case: Service businesses, B2B companies

Advertising channel expansion
Current channels tracked:
- Performance Max: Automated campaign type
- Search: Text ads on search results
- Display: Visual ads on partner sites
- Video: YouTube and video partner ads
- Shopping: Product listing ads
- Demand Gen: Audience-focused campaigns
Add new channels by modifying the data processing code nodes.

Reporting period adjustment
- Current setting: Last month's data (DURING LAST_MONTH)
- Alternative periods: Last 30 days, specific date ranges, quarterly reports
- Custom timeframes: Modify the Google Ads query date parameters

Multi-account management
- Sequential processing: Handles multiple accounts automatically
- Error handling: Continues processing if individual accounts fail
- Rate limiting: Built-in waits prevent API quota issues
- Batch size: No limit on the number of accounts processed

Data organization features

Dynamic monthly columns
- Automatic detection: Determines the previous month's column (B–M)
- Column mapping: January=B, February=C, ..., December=M
- Data placement: Updates the correct month automatically
- Multi-year support: Handles year transitions seamlessly

Campaign performance breakdown
Each account populates 10 rows of data:
- Performance Max Cost (Row 2)
- Performance Max Conversions/Value (Row 3)
- Demand Gen Cost (Row 4)
- Demand Gen Conversions/Value (Row 5)
- Search Cost (Row 6)
- Search Conversions/Value (Row 7)
- Video Cost (Row 8)
- Video Conversions/Value (Row 9)
- Shopping Cost (Row 10)
- Shopping Conversions/Value (Row 11)

Data processing logic (see the sketch below)
- Cost conversion: Automatically converts micros to euros (÷1,000,000)
- Precision rounding: Rounds to 2 decimal places for clean presentation
- Zero handling: Shows 0 for campaign types with no activity
- Data validation: Handles missing or null values gracefully

Results interpretation

Monthly performance tracking
- Historical data: Year-over-year comparison across all channels
- Channel performance: Identify the best-performing advertising types
- Budget allocation: Data-driven decisions for campaign investments
- Trend analysis: Month-over-month growth or decline patterns

Account-level insights
- Multi-client view: Consolidated reporting across all managed accounts
- Campaign diversity: Understanding which channels clients use most
- Performance benchmarks: Compare similar account types and industries
- Resource allocation: Focus on high-performing accounts and channels

Use cases

Agency reporting automation
- Client dashboards: Automated population of monthly performance reports
- Budget planning: Historical data for next month's budget recommendations
- Performance reviews: Ready-to-present data for client meetings
- Trend identification: Spot patterns across multiple client accounts

Internal performance tracking
- Team productivity: Track account management efficiency
- Campaign optimization: Identify underperforming channels for improvement
- Growth analysis: Monitor client account growth and expansion
- Forecasting: Use historical data for future performance predictions

Strategic planning
- Budget allocation: Data-driven distribution across advertising channels
- Channel strategy: Determine which campaign types to emphasize
- Client retention: Proactive identification of declining accounts
- New business: Performance data to support proposals and pitches

Workflow limitations
- Monthly execution: Designed for monthly reporting (not real-time)
- API dependencies: Requires stable Google Ads and Sheets API access
- Rate limiting: Sequential processing prevents parallel account handling
- Template dependency: Requires a specific spreadsheet structure for proper data placement
- Previous month focus: Optimized for completed-month data (run early in the new month)
- Manual credential setup: Requires individual configuration of API tokens and customer IDs
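A minimal sketch of the two data-processing conversions described above — micros to euros and "previous month" to a spreadsheet column letter (January = B … December = M). The metric field name is taken from the GAQL query; the REST response may camelCase it (costMicros), so adjust to however your HTTP Request node returns it.

```javascript
// n8n Code node sketch — cost conversion and dynamic monthly column mapping.
function microsToEuros(micros) {
  return Math.round((Number(micros) / 1_000_000) * 100) / 100; // ÷1,000,000, rounded to 2 decimals
}

function previousMonthColumn(now = new Date()) {
  const prev = new Date(now.getFullYear(), now.getMonth() - 1, 1);
  return String.fromCharCode('B'.charCodeAt(0) + prev.getMonth()); // month 0 (January) → 'B'
}

const costMicros = $json.metrics?.cost_micros ?? 0; // 0 when a channel has no activity

return [{
  json: {
    column: previousMonthColumn(),      // e.g. 'J' when the workflow runs in October
    costEuros: microsToEuros(costMicros),
  },
}];
```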
by Grigory Frolov
📊 YouTube Personal Channel Videos → Google Sheets
Automatically sync your YouTube videos (title, description, tags, publish date, captions, etc.) into Google Sheets — perfect for creators and marketers who want a clean content database for analysis or reporting.

🚀 What this workflow does
- ✅ Connects to your personal YouTube channel via Google OAuth
- 🔁 Fetches all uploaded videos automatically (with pagination — see the sketch below)
- 🏷 Extracts metadata: title, description, tags, privacy status, upload status, thumbnail, etc.
- 🧾 Retrieves captions (SRT format) if available
- 📈 Writes or updates data in your Google Sheets document
- ⚙️ Can be run manually or scheduled via Cron

🧩 Nodes used
- **Manual Trigger** — to start manually or connect with Cron
- **HTTP Request (YouTube API v3)** — fetches channel, uploads, and captions
- **Code nodes** — manage pagination and collect IDs
- **SplitOut** — iterates through video lists
- **Google Sheets (appendOrUpdate)** — stores data neatly
- **If conditions** — control data flow and prevent empty responses

⚙️ Setup guide
1. Connect your Google account — used for both the YouTube API and Google Sheets. Make sure the credentials are set up in the Google OAuth2 API and Google Sheets OAuth2 API nodes.
2. Create a Google Sheet — add a tab named Videos with these columns: youtube_id | title | description | tags | privacyStatus | uploadStatus | thumbnail | captions. You can also include categoryId, maxres, or published if you'd like.
3. Replace the sample Sheet ID — in each Google Sheets node, open the "Spreadsheet" field and choose your own document. Make sure the sheet name matches the tab name (Videos).
4. Run the workflow — execute it manually first to pull your latest uploads. Optionally add a Cron Trigger node for daily sync (e.g., once per day).
5. Check your Sheet — your data should appear instantly, with each video's metadata and captions (if available).

🧠 Notes & tips
- ⚙️ The flow loops through all pages of your upload playlist automatically — no manual pagination needed.
- 🕒 The workflow uses YouTube's contentDetails.relatedPlaylists.uploads to ensure you only fetch your own uploads.
- 💡 Captions fetch may fail for private videos — use "Continue on Fail" if you want the rest to continue.
- 🧮 Ideal for dashboards, reporting sheets, SEO analysis, or automation triggers.
- 💾 To improve speed, you can disable the "Captions" branch if you only need metadata.

👥 Ideal for
- 🎬 YouTube creators maintaining a video database
- 📊 Marketing teams tracking SEO performance
- 🧠 Digital professionals building analytics dashboards
- ⚙️ Automation experts using YouTube data in other workflows

💛 Credits
Created by Grigory Frolov
YouTube: @gregfrolovpersonal
More workflows and guides → ozwebexpert.com/n8n
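For orientation, a plain Node.js sketch of the pagination loop that the workflow's HTTP Request + Code nodes implement: walking the uploads playlist page by page and collecting video IDs. The API key and playlist ID are placeholders here (the workflow itself authenticates via Google OAuth).

```javascript
// Node.js 18+ sketch — paginate through the uploads playlist with the YouTube Data API v3.
async function collectVideoIds() {
  const API_KEY = process.env.YT_API_KEY;                       // placeholder auth for this sketch
  const UPLOADS_PLAYLIST_ID = process.env.UPLOADS_PLAYLIST_ID;  // from contentDetails.relatedPlaylists.uploads

  const videoIds = [];
  let pageToken = '';

  do {
    const params = new URLSearchParams({
      part: 'contentDetails',
      playlistId: UPLOADS_PLAYLIST_ID,
      maxResults: '50',
      key: API_KEY,
    });
    if (pageToken) params.set('pageToken', pageToken);

    const page = await (
      await fetch(`https://www.googleapis.com/youtube/v3/playlistItems?${params}`)
    ).json();

    videoIds.push(...page.items.map((item) => item.contentDetails.videoId));
    pageToken = page.nextPageToken || ''; // empty token means the last page was reached
  } while (pageToken);

  return videoIds;
}

collectVideoIds().then((ids) => console.log(`Collected ${ids.length} video IDs`));
```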