by Rakin Jakaria
## Who this is for

This workflow is for content creators, digital marketers, and YouTube strategists who want to automatically discover trending videos in their niche, analyze engagement metrics, and get data-driven insights for their content strategy — all from one simple form submission.

## What this workflow does

This workflow starts every time someone submits the YouTube Trends Finder form. It then:

1. **Searches YouTube videos** based on your topic and specified time range using the **YouTube Data API**.
2. **Fetches detailed analytics** (views, likes, comments, engagement rates) for each video found.
3. **Calculates engagement rates** and filters out low-performing content (below 2% engagement).
4. **Applies smart filters** to exclude videos with fewer than 1,000 views, content outside your timeframe, and hashtag-heavy titles.
5. **Removes duplicate videos** to ensure clean data.
6. **Creates a Google Spreadsheet** with all trending video data organized by performance metrics.
7. **Delivers the results** via a completion form with a direct link to your analytics report.

## Setup

To set this workflow up:

1. **Form Trigger** – Customize the "YouTube Trends Finder" form fields if needed (Topic Name, Last How Many Days).
2. **YouTube Data API** – Add your YouTube OAuth2 credentials and API key in the respective nodes.
3. **Google Sheets** – Connect your Google Sheets account for automatic report generation.
4. **Engagement Filters** – Adjust the 2% engagement rate threshold based on your quality standards.
5. **View Filters** – Modify the minimum view count (currently 1,000+) in the filter conditions.
6. **Regional Settings** – Update the region code (currently "US") to target specific geographic markets.

## How to customize this workflow to your needs

- Change the engagement rate threshold to be more or less strict based on your niche requirements.
- Add additional filters like video duration, subscriber count, or specific keywords to refine results.
- Modify the Google Sheets structure to include extra metrics like "Channel Name", "Video Duration", or "Trending Score".
- Switch to different output formats like CSV export or direct email reports instead of Google Sheets.
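The engagement-rate calculation and filtering described above could be sketched in an n8n Code node roughly like this. The field names (`views`, `likes`, `comments`, `title`) and the hashtag limit are assumptions, not the template's actual implementation:

```javascript
// Sketch of the engagement filter (hypothetical field names, assuming the
// YouTube API statistics have already been parsed into numbers).
function engagementRate(video) {
  if (!video.views) return 0;
  // Engagement rate as (likes + comments) / views, expressed as a percentage.
  return ((video.likes + video.comments) / video.views) * 100;
}

function passesFilters(video, { minViews = 1000, minEngagement = 2 } = {}) {
  // Exclude low-view videos, low engagement, and hashtag-heavy titles.
  const hashtagCount = (video.title.match(/#/g) || []).length;
  return (
    video.views >= minViews &&
    engagementRate(video) >= minEngagement &&
    hashtagCount <= 3
  );
}
```

Adjusting `minViews` and `minEngagement` here mirrors the "Engagement Filters" and "View Filters" setup steps.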
by Rahul Joshi
## 📊 Description

Streamline Facebook Messenger inbox management with an AI-powered categorization and response system. 💬⚙️ This workflow automatically classifies new messages as Lead, Query, or Spam using GPT-4, routes them for approval via Slack, responds on Facebook once approved, and logs all interactions into Google Sheets for tracking. Perfect for support and marketing teams managing high volumes of inbound DMs. 🚀📈

## What This Template Does

1️⃣ **Trigger** – Runs hourly to fetch new Facebook Page messages. ⏰
2️⃣ **Extract & Format** – Collects sender info, timestamps, and message content for analysis. 📋
3️⃣ **AI Categorization** – Uses GPT-4 to identify the message type (Lead, Query, Spam) and suggest replies. 🧠
4️⃣ **Slack Approval Flow** – Sends categorized leads and queries to Slack for quick team approval. 💬
5️⃣ **Facebook Response** – Posts AI-suggested replies back to the original sender once approved. 💌
6️⃣ **Data Logging** – Records every message, reply, and approval status into Google Sheets for analytics. 📊
7️⃣ **Error Handling** – Automatically alerts via Slack if the workflow encounters an error. 🚨

## Key Benefits

✅ Reduces manual message triage on Facebook Messenger
✅ Ensures consistent and professional customer replies
✅ Provides full visibility via Google Sheets logs
✅ Centralizes team approvals in Slack for faster response times
✅ Leverages GPT-4 for accurate categorization and natural replies

## Features

- Hourly Facebook message fetch via the Graph API
- GPT-4-powered text classification and reply suggestion
- Slack-based dual approval flow
- Automated Facebook replies post-approval
- Google Sheets logging for all categorized messages
- Built-in error detection and Slack alerting

## Requirements

- Facebook Graph API credentials with page message permissions
- OpenAI API key for GPT-4 processing
- Slack API credentials with `chat:write` permission
- Google Sheets OAuth2 credentials
- Environment variables: `FACEBOOK_PAGE_ID`, `GOOGLE_SHEET_ID`, `GOOGLE_SHEET_NAME`, `SLACK_CHANNEL_ID`

## Target Audience

- Marketing and lead-generation teams using Facebook Pages 📣
- Customer support teams managing Messenger queries 💬
- Businesses seeking automated lead routing and CRM sync 🧾
- Teams leveraging AI for customer engagement optimization 🤖

## Step-by-Step Setup Instructions

1️⃣ Connect Facebook Graph API credentials and set your page ID.
2️⃣ Add OpenAI API credentials for GPT-4.
3️⃣ Configure the Slack channel ID and credentials.
4️⃣ Link your Google Sheet for message logging.
5️⃣ Replace environment variable placeholders with your actual IDs.
6️⃣ Test the workflow manually before enabling automation.
7️⃣ Activate the schedule trigger for ongoing hourly execution. ✅
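Before routing to Slack, the model's category label needs normalizing so downstream branches only ever see the three expected values. This is a sketch, not the template's actual code; the fallback behavior is an assumption:

```javascript
// Hedged sketch: normalize GPT-4's category output before the Slack routing
// branch. The labels (Lead / Query / Spam) come from the template description;
// the parsing and fallback details are assumptions.
const VALID_CATEGORIES = ["Lead", "Query", "Spam"];

function normalizeCategory(raw) {
  const cleaned = String(raw).trim().toLowerCase();
  const match = VALID_CATEGORIES.find((v) => cleaned.startsWith(v.toLowerCase()));
  // Unrecognized model output falls back to "Query" so a human still reviews it.
  return match || "Query";
}
```

A defensive fallback like this keeps the approval flow working even when the model wraps its answer in extra words.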
by Adam Gałęcki
## How it works

This workflow automates comprehensive SEO reporting by:

- Extracting keyword rankings and page performance from Google Search Console.
- Gathering organic reach metrics from Google Analytics.
- Analyzing internal and external article links.
- Tracking keyword position changes (gains and losses).
- Formatting and importing all data into Google Sheets reports.

## Set up steps

1. **Connect Google services**: Authenticate Google Search Console, Google Analytics, and Google Sheets OAuth2 credentials.
2. **Configure the source sheet**: Set up a data-source Google Sheet with the article URLs to analyze.
3. **Set the report sheet**: Create or specify destination Google Sheets for the reports.
4. **Update date ranges**: Modify the date parameters in the GSC and GA nodes for your reporting period.
5. **Customize filters**: Adjust keyword filters and row limits based on your needs.
6. **Test individual sections**: Each reporting section (keywords, pages, articles, position changes) can be tested independently.

The workflow includes separate flows for:

- Keyword ranking (top 1000).
- Page ranking analysis.
- Organic reach reporting.
- Internal article link analysis.
- External article link analysis.
- Position gain/loss tracking.
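The position gain/loss tracking could be sketched as a diff between two ranking snapshots. The input shape (keyword mapped to average position) is an assumption for illustration:

```javascript
// Sketch of position gain/loss tracking, assuming each snapshot maps
// keyword -> average position (lower is better). Field names are illustrative.
function positionChanges(previous, current) {
  return Object.keys(current)
    .filter((kw) => kw in previous) // only keywords present in both periods
    .map((kw) => ({
      keyword: kw,
      before: previous[kw],
      after: current[kw],
      change: previous[kw] - current[kw], // positive = gained positions
    }))
    .sort((a, b) => b.change - a.change); // biggest gains first
}
```

Sorting by `change` makes it easy to write the top gainers and losers into separate report tabs.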
by Rahul Joshi
## Description

Keep your CRM pipeline clean and actionable by automatically archiving inactive deals, logging results to Google Sheets, and sending Slack summary reports. This workflow ensures your sales team focuses on active opportunities while maintaining full audit visibility. 🚀📈

## What This Template Does

- Triggers daily at 9 AM to check all GoHighLevel CRM opportunities. ⏰
- Filters deals that have been inactive for 10+ days using the last activity or update date. 🔍
- Automatically archives inactive deals to keep pipelines clutter-free. 📦
- Formats and logs deal details into Google Sheets for record-keeping. 📊
- Sends a Slack summary report with the total archived count, value, and deal names. 💬

## Key Benefits

✅ Keeps pipelines organized by removing stale opportunities.
✅ Saves time through fully automated archiving and reporting.
✅ Maintains a transparent audit trail in Google Sheets.
✅ Improves sales visibility with automated Slack summaries.
✅ Easily adjustable inactivity threshold and scheduling.

## Features

- Daily scheduled trigger (9 AM) with an adjustable cron expression.
- GoHighLevel CRM integration for fetching and updating opportunities.
- Conditional logic to detect inactivity periods.
- Google Sheets logging with automatic updates.
- Slack integration for real-time reporting and team visibility.

## Requirements

- GoHighLevel API credentials (OAuth2) with opportunity access.
- Google Sheets OAuth2 credentials with edit permissions.
- Slack bot token with `chat:write` permission.
- A connected n8n instance (cloud or self-hosted).

## Target Audience

- Sales and operations teams managing CRM hygiene.
- Business owners wanting automated inactive-deal cleanup.
- Agencies monitoring client pipelines across teams.
- CRM administrators ensuring data accuracy and accountability.

## Step-by-Step Setup Instructions

1. Connect your GoHighLevel OAuth2 credentials in n8n. 🔑
2. Link your Google Sheets document and replace the Sheet ID. 📋
3. Configure Slack credentials and specify your target channel. 💬
4. Adjust the inactivity threshold (default: 10 days) as needed. ⚙️
5. Update the cron schedule (default: 9 AM daily). ⏰
6. Test the workflow manually to verify end-to-end automation. ✅
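The 10-day inactivity check might look like the following in a Code node. The field names (`lastActivityDate`, `updatedAt`) are assumptions about the GoHighLevel payload, not confirmed API fields:

```javascript
// Sketch of the inactivity filter: a deal is stale when its last activity
// (or, failing that, its last update) is older than the threshold.
const DAY_MS = 24 * 60 * 60 * 1000;

function isInactive(deal, thresholdDays = 10, now = new Date()) {
  // Prefer the explicit last-activity date; fall back to the update date.
  const last = new Date(deal.lastActivityDate || deal.updatedAt);
  return (now.getTime() - last.getTime()) / DAY_MS >= thresholdDays;
}
```

Passing `now` as a parameter keeps the function testable and makes the threshold easy to change alongside the cron schedule.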
by Hugo
## 🤖 n8n AI Workflow Dashboard Template

### Overview

This template collects execution data from your AI workflows and generates an interactive dashboard for easy monitoring. It's compatible with any AI Agent or RAG workflow in n8n.

### Main Objectives

💾 **Collect execution data**

- Track messages, tokens used (prompt/completion), session IDs, model names, and compute costs
- Designed to plug into any AI agent or RAG workflow in n8n

📊 **Generate an interactive dashboard**

- Visualize KPIs like total messages, unique sessions, tokens used, and costs
- Display daily charts, including stacked bars for prompt vs. completion tokens
- Monitor AI activity, analyze usage, and track costs at a glance

### ✨ Key Features

💬 **Conversation data collection**

Messages sent to the AI agent are recorded with:

- `sessionId`
- `chatInput`
- `output`
- `promptTokens`, `completionTokens`, `totalTokens`
- `globalCost` and `modelName`

This allows detailed tracking of AI interactions across sessions.

💰 **Model pricing management**

- A sub-workflow with a Set node provides token prices for LLMs
- Data is stored in the Model price table for cost calculations

🗄️ **Data storage via n8n Data Tables**

Two tables need to be created.

Model price:

```json
{
  "id": 20,
  "createdAt": "2025-10-11T12:16:47.338Z",
  "updatedAt": "2025-10-11T12:16:47.338Z",
  "name": "claude-4.5-sonnet",
  "promptTokensPrice": 0.000003,
  "completionTokensPrice": 0.000015
}
```

Messages:

```json
[
  {
    "id": 20,
    "createdAt": "2025-10-11T15:28:00.358Z",
    "updatedAt": "2025-10-11T15:31:28.112Z",
    "sessionId": "c297cdd4-7026-43f8-b409-11eb943a2518",
    "action": "sendMessage",
    "output": "Hey! \nHow's it going?",
    "chatInput": "yo",
    "completionTokens": 6,
    "promptTokens": 139,
    "totalTokens": 139,
    "globalCost": null,
    "modelName": "gpt-4.1-mini",
    "executionId": 245
  }
]
```

These tables store conversation data and pricing info to feed the dashboard and calculations.
### 📈 Interactive Dashboard

- **KPIs generated**: total messages, unique sessions, total/average tokens, total/average cost 💸
- **Charts included**: daily messages, tokens used per day (prompt vs. completion, stacked bar)
- Provides a visual summary of AI workflow performance

### ⚙️ Installation & Setup

Follow these steps to set up and run the workflow in n8n:

1. **Import the workflow** – Download or copy the JSON workflow and import it into n8n.
2. **Create the data tables** – The **Model price** table stores token prices per model; the **Messages** table stores messages generated by the AI agent.
3. **Configure the webhook** – The workflow is triggered via a webhook; use the webhook URL to send conversation data.
4. **Set up the pricing sub-workflow** – It automatically generates price data for the models used; connect it to your main workflow to enrich cost calculations.
5. **Dashboard visualization** – The workflow returns HTML code rendering the dashboard; view it in a browser or embed it in your interface. 🌐

Once configured, your workflow tracks AI usage and costs in real time, providing a live dashboard for quick insights.

### 🔧 Adaptability

- The template is modular and can be adapted to any AI agent or RAG workflow
- KPIs, charts, colors, and metrics can be customized in the HTML rendering
- Ideal for monitoring, cost tracking, and reporting AI workflow performance
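The cost enrichment that fills `globalCost` can be sketched as a join between a Messages row and the Model price table. Field names follow the sample JSON above; the lookup logic itself is an assumption:

```javascript
// Sketch: compute a message's cost from the Model price table rows.
// Field names match the sample rows shown in the description.
function computeCost(message, prices) {
  const model = prices.find((p) => p.name === message.modelName);
  if (!model) return null; // unknown model, like the null globalCost in the sample
  return (
    message.promptTokens * model.promptTokensPrice +
    message.completionTokens * model.completionTokensPrice
  );
}
```

Running this in the pricing sub-workflow keeps the dashboard's total/average cost KPIs consistent with the per-model token prices.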
by Recrutei Automações
## Overview: Automated LinkedIn Job Posting with AI

This workflow automates the publication of new job vacancies on LinkedIn immediately after they are created in the Recrutei ATS (Applicant Tracking System). It leverages a Code node to pre-process the job data and a powerful AI model (GPT-4o-mini, configured via the OpenAI node) to generate compelling, marketing-ready content.

This template is designed for recruitment and marketing teams aiming to ensure consistent, timely, and high-quality job postings while saving significant operational time.

## Workflow Logic & Steps

1. **Recrutei Webhook Trigger**: The workflow is instantly triggered when a new job vacancy is published in the Recrutei ATS, sending all relevant job data via a webhook.
2. **Data Cleaning (Code Node 1)**: The first Code node standardizes boolean fields (like `remote` and `fixed_remuneration`) from 0/1 to descriptive text ('yes'/'no').
3. **Prompt Transformation (Code Node 2)**: The second, crucial Code node receives the clean job data and:
   - Maps the original data keys (e.g., `title`, `description`) to user-friendly labels (e.g., Job Title, Detailed Description).
   - Cleans and sanitizes the HTML description into readable Markdown format.
   - Generates a single, highly structured prompt containing all job details, ready for the AI model.
4. **AI Content Generation (OpenAI)**: The AI model receives the structured prompt and acts as a 'Marketing Copywriter' to create a compelling, engaging post specifically optimized for LinkedIn.
5. **LinkedIn Post**: The generated text is automatically posted to the configured LinkedIn profile or company page.
6. **Internal Logging (Google Sheets)**: The workflow concludes by logging the event (Job Title, Confirmation Status) into a Google Sheet for internal tracking and auditing.

## Setup Instructions

To implement this workflow successfully, you must configure the following:

**Credentials:**

- Configure OpenAI (for the content generator).
- Configure LinkedIn (for the post action).
- Configure Google Sheets (for the logging).

**Node configuration:**

- Set up the webhook URL in your Recrutei ATS settings.
- Replace `YOUR_SHEET_ID_HERE` in the Google Sheets logging node with your sheet's ID.
- Select the correct LinkedIn profile/company page in the Create a post node.
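Code Node 1's boolean standardization might look like the following. The field list comes from the description; treating it as a configurable parameter is an assumption:

```javascript
// Sketch of Code Node 1: standardize 0/1 flags to 'yes'/'no' before the
// data reaches the prompt-building step. Field names are from the description.
function normalizeFlags(job, flagFields = ["remote", "fixed_remuneration"]) {
  const out = { ...job };
  for (const field of flagFields) {
    if (field in out) out[field] = out[field] ? "yes" : "no";
  }
  return out;
}
```

Normalizing flags this early keeps the AI prompt readable ("Remote: yes") instead of exposing raw database values.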
by AFK Crypto
## Try It Out!

The AI Investment Research Assistant (Discord Summary Bot) transforms your Discord server into a professional-grade, AI-driven crypto intelligence center. Running automatically every morning, it gathers real-time news, sentiment, and market data from multiple trusted sources — including NewsAPI, CryptoCompare, and CoinGecko — covering the most influential digital assets like BTC, ETH, SOL, BNB, and ADA.

An AI Research Analyst Agent then processes this data using advanced reasoning and summarization to deliver a structured Market Intelligence Briefing. Each report distills key market events, sentiment shifts, price movements, and analyst-grade insights, all formatted into a visually clean and actionable message that posts directly to your Discord channel.

Whether you're a fund manager, community owner, or analyst, this workflow helps you stay informed about market drivers — without manually browsing dozens of news sites or data dashboards.

## Detailed Use Cases

- **Crypto research teams**: Automate daily market briefings across key assets.
- **Investment communities**: Provide daily insights and sentiment overviews directly on Discord.
- **Trading desks**: Quickly review summarized market shifts and performance leaders.
- **DAOs or fund analysts**: Centralize institutional-style crypto intelligence into your server.

## How It Works

1. **Daily Trigger (Schedule node)** – Activates each morning to begin data collection.
2. **News aggregation layer** – Uses NewsAPI (and optionally CryptoPanic or GDELT) to fetch the latest crypto headlines and event coverage.
3. **Market & sentiment fetch** – Collects market metrics via CoinGecko or CryptoCompare, including:
   - 24-hour price change
   - Market cap trend
   - Social sentiment or Fear & Greed index
4. **AI Research Analyst (LLM agent)** – Processes and synthesizes all data into a cohesive insight report containing:
   - 🧠 Executive Summary
   - 📊 Top Gainers & Losers
   - 💬 Sentiment Overview
   - 🔍 Analyst Take / Actionable Insight
5. **Formatting layer (Code node)** – Converts the analysis into a Discord-ready structure.
6. **Discord posting node** – Publishes the final Market Intelligence Briefing to a specified Discord channel.

## Setup and Customization

1. Import this workflow into your n8n workspace.
2. Configure credentials:
   - **NewsAPI key** – for crypto and blockchain news.
   - **CoinGecko / CryptoCompare API key** – for real-time asset data.
   - **LLM credential** – OpenAI, Gemini, or Anthropic.
   - **Discord webhook URL or bot token** – to post updates.
3. Customize the tracked assets in the News and Market nodes (BTC, ETH, SOL, BNB, ADA, etc.).
4. Set your local timezone for report delivery.
5. Deploy and activate — your server will receive automated morning briefings.

## Output Format

Each daily report includes:

📰 AI Market Intelligence Briefing
📅 Date: October 16, 2025
💰 Top Movers: BTC +2.3%, SOL +1.9%, ETH -0.8%
💬 Sentiment: Moderately Bullish
🔍 Analyst Take: Accumulation signals forming in mid-cap layer-1s.
📈 Outlook: Positive bias, with ETH showing strong support near $2,400.

Compact yet rich in insight, this format ensures quick readability and fast decision-making for traders and investors.

## (Optional) Extend This Workflow

- **Portfolio-specific insights**: Fetch your wallet holdings from AFK Crypto or Zapper APIs for personalized reports.
- **Interactive commands**: Add /compare or /analyze commands for Discord users.
- **Multi-language summaries**: Auto-translate for international communities.
- **Historical data logging**: Store briefings in Notion or Google Sheets.
- **Weekly recaps**: Summarize all daily reports into a long-form analysis.
## Requirements

- **n8n instance** (with HTTP Request, AI Agent, and Discord nodes enabled)
- **NewsAPI key**
- **CoinGecko / CryptoCompare API key**
- **LLM credential** (OpenAI / Gemini / Anthropic)
- **Discord bot token or webhook URL**

## APIs Used

- `GET https://newsapi.org/v2/everything?q=crypto OR bitcoin OR ethereum OR defi OR nft&language=en&sortBy=publishedAt&pageSize=10`
- `GET https://api.coingecko.com/api/v3/simple/price?ids=bitcoin,ethereum,solana&vs_currencies=usd&include_market_cap=true&include_24hr_change=true`
- (Optional) `GET https://cryptopanic.com/api/v1/posts/?auth_token=YOUR_TOKEN&kind=news`
- (Optional) `GET https://api.gdeltproject.org/api/v2/doc/doc?query=crypto&format=json`

## Summary

The AI Investment Research Assistant (Discord Summary Bot) is your personal AI research analyst — delivering concise, data-backed crypto briefings directly to Discord. It intelligently combines news aggregation, sentiment analysis, and AI reasoning to create actionable market intelligence each morning.

Ideal for crypto traders, funds, or educational communities seeking a reliable daily edge — this workflow replaces hours of manual research with one automated, professional-grade summary.

Our website: https://afkcrypto.com/
Check our blogs: https://www.afkcrypto.com/blog
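The formatting-layer Code node could build the "Top Movers" line of the sample output from 24h changes like this. The input shape (symbol mapped to percent change) is an assumption:

```javascript
// Sketch of the formatting layer: turn 24h percentage changes into the
// "Top Movers" line shown in the sample briefing. Input shape is assumed.
function topMoversLine(changes) {
  return (
    "💰 Top Movers: " +
    Object.entries(changes)
      .sort((a, b) => Math.abs(b[1]) - Math.abs(a[1])) // biggest moves first
      .map(([sym, pct]) => `${sym} ${pct >= 0 ? "+" : ""}${pct.toFixed(1)}%`)
      .join(", ")
  );
}
```

The same pattern extends to the sentiment and outlook lines, each assembled from a field of the AI agent's structured response.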
by Oussama
This n8n template creates an intelligent expense tracking system 🤖 that processes text, voice, and receipt images through Telegram. The assistant automatically categorizes expenses, handles currency conversions 🌍, and maintains financial records in Google Sheets while providing smart spending insights 💡.

## Use Cases

- 🗣️ Personal expense tracking via Telegram chat
- 🧾 Receipt scanning and data extraction
- 💱 Multi-currency expense management
- 📂 Automated financial categorization
- 🎙️ Voice-to-expense logging
- 📊 Daily/weekly/monthly spending analysis

## How it works

1. **Multi-input processing**: A Telegram trigger captures text messages, voice notes, and receipt images.
2. **Content analysis**: A Switch node routes different input types (text, audio, images) to the appropriate processors.
3. **Voice processing**: ElevenLabs converts voice messages to text for expense extraction.
4. **Receipt OCR**: Google Gemini analyzes receipt images to extract amounts and descriptions.
5. **Expense classification**: An LLM determines whether the input is an expense or a general query.
6. **Expense parsing**: For multiple expenses, the AI splits and normalizes each item.
7. **Currency conversion**: An exchange rate API converts foreign currencies to USD.
8. **Smart categorization**: The AI agent assigns expenses to predefined categories with emojis.
9. **Data storage**: Google Sheets stores all expense records with automatic totals.
10. **Intelligent responses**: The agent provides spending summaries, alerts, and financial insights.

## Requirements

- 🌐 Telegram Bot API access
- 🤖 OpenAI, Gemini, or any other AI model
- 🗣️ ElevenLabs API for voice processing
- 📝 Google Sheets API access
- 💹 Exchange rate API access

## Good to know

- ⚠️ Daily spending alerts trigger when expenses exceed 100 USD.
- 🏷️ Supports 12 predefined expense categories with emoji indicators.
- 🔄 Automatic currency detection and conversion to USD.
- 🎤 Voice messages are processed through speech-to-text.
- 📸 Receipt images are analyzed using computer vision.
## Customizing this workflow

- ✏️ Modify expense categories in the system prompt.
- 📈 Adjust spending alert thresholds.
- 💵 Change the base currency from USD to your preferred currency.
- ✅ Add additional expense validation rules.
- 🔗 Integrate with other financial platforms.
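The currency conversion and the 100 USD daily alert could be sketched like this. The rate format (1 unit of the foreign currency expressed in USD) and the function names are assumptions:

```javascript
// Sketch of the conversion + alert steps. Rates are assumed to be quoted as
// 1 unit of the foreign currency in USD.
function toUsd(amount, currency, rates) {
  return currency === "USD" ? amount : amount * rates[currency];
}

function dailyAlert(expensesUsd, threshold = 100) {
  // Sum the day's converted expenses and flag when the threshold is exceeded.
  const total = expensesUsd.reduce((sum, e) => sum + e, 0);
  return total > threshold
    ? `⚠️ Daily spend ${total.toFixed(2)} USD exceeds ${threshold} USD`
    : null;
}
```

Changing `threshold` (or the `"USD"` base currency check) corresponds to the customization points listed above.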
by n8n Automation Expert | Template Creator | 2+ Years Experience
## Description

### 🎯 Overview

An advanced automated trading bot that implements ICT (Inner Circle Trader) methodology and Smart Money Concepts for cryptocurrency trading. This workflow combines AI-powered market analysis with automated trade execution through the Coinbase Advanced Trade API.

### ⚡ Key Features

**📊 ICT Trading Strategy Implementation**

- **Kill zone detection**: Automatically identifies optimal trading sessions (Asian, London, New York kill zones)
- **Smart Money Concepts**: Analyzes market structure breaks, liquidity grabs, fair value gaps, and order blocks
- **Session validation**: Real-time GMT time tracking with session strength calculations
- **Structure analysis**: Detects BOS (Break of Structure) and CHOCH (Change of Character) patterns

**🤖 AI-Powered Analysis**

- **GPT-4 integration**: Advanced market analysis using OpenAI's latest model
- **Confidence scoring**: The AI generates confidence scores (0–100) for each trading signal
- **Risk assessment**: Automated risk-level evaluation (LOW/MEDIUM/HIGH)
- **ICT-specific prompts**: Custom prompts designed for Inner Circle Trader methodology

**🔄 Automated Trading Flow**

1. **Signal reception**: Receives trading signals via a Telegram webhook
2. **Data extraction**: Parses symbol, action, price, and technical indicators
3. **Session validation**: Verifies the current kill zone and trading session strength
4. **Market data**: Fetches real-time data from the Coinbase Advanced Trade API
5. **AI analysis**: Processes signals through GPT-4 with ICT-specific analysis
6. **Quality filter**: Multi-condition filtering based on confidence, session, and structure
7. **Trade execution**: Automated order placement through the Coinbase API
8. **Documentation**: Records all trades and rejections in Notion databases

**📱 Multi-Platform Integration**

- **Telegram bot**: Receives signals and sends formatted notifications
- **Coinbase Advanced**: Real-time market data and trade execution
- **Notion database**: Comprehensive trade logging and analysis tracking
- **Webhook support**: External system integration capabilities

### 🛠️ Setup Requirements

**API credentials needed:**

- **Coinbase Advanced Trade API** (API key, secret, passphrase)
- **OpenAI API key** (GPT-4 access)
- **Telegram bot token** and chat ID
- **Notion integration** (database IDs for trade records)

**Environment variables:**

```
TELEGRAM_CHAT_ID=your_chat_id
NOTION_TRADING_DB_ID=your_trading_database_id
NOTION_REJECTED_DB_ID=your_rejected_signals_database_id
WEBHOOK_URL=your_external_webhook_url
```

### 📈 Trading Logic

**Kill zone priority system:**

- **London & New York sessions**: HIGH priority (0.9 strength)
- **Asian & London Close**: MEDIUM priority (0.6 strength)
- **Off hours**: LOW priority (0.1 strength)

**Signal validation criteria:**

- Signal quality must not be "LOW"
- Confidence score ≥ 60%
- Active kill zone session required
- ICT structure alignment confirmed

### 🎛️ Workflow Components

1. **Extract ICT Signal Data**: Parses incoming Telegram messages for trading signals
2. **ICT Session Validator**: Determines the current kill zone and session strength
3. **Get Coinbase Market Data**: Fetches real-time cryptocurrency data
4. **ICT AI Analysis**: GPT-4-powered analysis with ICT methodology
5. **Parse ICT AI Analysis**: Processes the AI response with fallback mechanisms
6. **ICT Quality & Session Filter**: Multi-condition signal validation
7. **Execute ICT Trade**: Automated trade execution via the Coinbase API
8. **Create ICT Trading Record**: Logs successful trades to Notion
9. **Generate ICT Notification**: Creates formatted Telegram alerts
10. **Log ICT Rejected Signal**: Records filtered signals for analysis

### 🚀 Use Cases

- Automated ICT-based cryptocurrency trading
- Smart Money Concepts implementation
- Kill zone session trading
- AI-enhanced market structure analysis
- Professional trading documentation and tracking

### ⚠️ Risk Management

- Built-in session validation prevents off-hours trading
- AI confidence scoring filters low-quality signals
- Comprehensive logging for performance analysis
- Automated stop-loss and take-profit calculations

This workflow is perfect for traders familiar with ICT methodology who want to automate their Smart Money Concepts trading strategy with AI-enhanced decision making.
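The kill-zone priority system could be sketched from GMT hours as below. The exact session windows are assumptions (ICT practitioners define them differently); only the strength values come from the description:

```javascript
// Sketch of the kill-zone priority lookup. The hour windows are illustrative
// assumptions; the 0.9 / 0.6 / 0.1 strengths match the description.
function killZone(gmtHour) {
  if (gmtHour >= 7 && gmtHour < 10) return { session: "London", strength: 0.9 };
  if (gmtHour >= 12 && gmtHour < 15) return { session: "New York", strength: 0.9 };
  if (gmtHour >= 0 && gmtHour < 3) return { session: "Asian", strength: 0.6 };
  if (gmtHour >= 15 && gmtHour < 17) return { session: "London Close", strength: 0.6 };
  return { session: "Off Hours", strength: 0.1 };
}
```

The quality filter would then combine this strength with the AI confidence score (≥ 60%) before any order is placed.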
by Yaron Been
## Comprehensive SEO Strategy with O3 Director & GPT-4 Specialist Team

**Trigger**

- When a chat message is received, the user submits an SEO request (e.g., "Help me rank for project management software").
- The message goes straight to the SEO Director Agent.

**SEO Director Agent (O3)**

- Acts as the head of SEO strategy.
- Uses the Think node to plan and decide which specialists to call.
- Delegates tasks to the relevant agents.

**Specialist Agents (GPT-4.1-mini)**

Each agent has its own OpenAI model connection for lightweight, cost-efficient execution. Tasks include:

- **Keyword Research Specialist** → Keyword discovery, clustering, competitor analysis.
- **SEO Content Writer** → Generates optimized blog posts, landing pages, etc.
- **Technical SEO Specialist** → Site audits, schema markup, crawling fixes.
- **Link Building Strategist** → Backlink strategies, outreach campaign ideas.
- **Local SEO Specialist** → Local citations, GMB optimization, geo-content.
- **Analytics Specialist** → Reports, performance insights, ranking metrics.

**Feedback Loop**

- Each agent sends results back to the SEO Director.
- The Director compiles the insights into a comprehensive SEO campaign plan.

## ✅ Why This Setup Works Well

- **O3 model for the Director** → Handles reasoning-heavy orchestration (strategy, delegation).
- **GPT-4.1-mini for specialists** → Cheap, fast, task-specific execution.
- **Parallel execution** → All specialists can run at the same time.
- **Scalable & modular** → You can add/remove agents depending on campaign needs.
- **Sticky notes** → Already document the workflow (great for onboarding & sharing).
by WeblineIndia
## Webhook from IoT Devices → Jira Maintenance Ticket → Slack Factory Alert

This workflow automates predictive maintenance by receiving IoT machine-failure webhooks, creating Jira maintenance tickets, checking technician availability in Slack, and sending the alert to the correct Slack channel. If an active technician is available, the system notifies the designated technician channel; if not, it escalates automatically to your chosen emergency/escalation channel.

## ⚡ Quick Implementation: Start Using in 10 Seconds

1. Import the workflow JSON into n8n.
2. Add Slack API credentials (with all required scopes).
3. Add Jira Cloud credentials.
4. Select Slack channels for technician alerts and emergency/escalation alerts.
5. Deploy the webhook URL to your IoT device.
6. Run a test event.

## What It Does

This workflow implements a real-time predictive maintenance automation loop. An IoT device sends machine data — such as temperature, vibration and timestamps — to an n8n webhook whenever a potential failure is detected. The workflow immediately evaluates whether the values exceed a defined safety threshold. If a failure condition is detected, a Jira maintenance ticket is automatically created with all relevant machine information.

The workflow then gathers all technicians from your selected Slack channel and checks each technician's presence status in real time. A built-in decision engine chooses the first available technician. If someone is active, the workflow sends a maintenance alert to your technician channel. If no technicians are available, the workflow escalates the alert to your chosen emergency channel to avoid operational downtime.

This eliminates manual monitoring, accelerates response times and ensures no incident goes unnoticed — even if the team is unavailable.
## Who's It For

This workflow is ideal for:

- Manufacturing factories
- Industrial automation setups
- IoT monitoring systems
- Warehouse operations
- Maintenance & facility management teams
- Companies using Jira + Slack
- Organizations implementing predictive maintenance or automated escalation workflows

## Requirements to Use This Workflow

You will need:

- An n8n instance (Cloud or self-hosted)
- A Slack app with the scopes: `users:read`, `users:read.presence`, `channels:read`, `chat:write`
- Jira Cloud credentials (email + API token)
- Slack channels of your choice for technician alerts and emergency/escalation alerts
- An IoT device capable of POST webhook calls; the machine payload must include `machineId`, `temperature`, `vibration`, and `timestamp`

## How It Works & How To Set Up

### 🔧 High-Level Workflow Logic

1. The **IoT webhook** receives machine data.
2. An **IF condition** checks whether values exceed safety thresholds.
3. A **Jira ticket** is created with machine details if a failure is detected.
4. **Slack channel members** are fetched from your selected technician channel.
5. The workflow **loops through technicians** to check real-time presence.
6. A **Code node** determines the first available (active) technician, or falls back if none is available.
7. An **IF condition** checks technician availability.
8. A **Slack notification** is sent to your chosen technician channel if someone is available, or to your chosen emergency/escalation channel if no one is online.

### 🛠 Step-by-Step Setup Instructions

1. **Import the workflow**: n8n → Workflows → Import from File → select the JSON.
2. **Configure Slack**: Add the required scopes (`users:read`, `users:read.presence`, `channels:read`, `chat:write`) and reconnect credentials.
3. **Select Slack channels**: Choose any Slack channels you want for technician notifications and emergency alerts—no fixed naming is required.
4. **Configure Jira**: Add credentials, select the project and issue type, and set priority mapping if needed.
5. **Deploy the webhook**: Copy the n8n webhook URL and configure your IoT device to POST machine data.
6. **Test the system**: Send a test payload to ensure Jira tickets are created and Slack notifications route correctly based on technician availability.

This setup allows real-time monitoring, automated ticket creation and flexible escalation — reducing manual intervention and ensuring fast maintenance response.

## How To Customize Nodes

**Webhook node**

- Add security tokens
- Change the webhook path
- Add a response message

**IF node (threshold logic)**

- Lower/raise the temperature threshold
- Change OR to AND
- Add more conditions (humidity, RPM, pressure)

**Jira node**

- Customize fields like summary and labels, or assign issues based on technician availability

**Slack presence node**

- Add DND checks
- Treat "away" as "available" during night shifts
- Combine multiple channels

**Code node**

- Randomly rotate technicians
- Pick the technician with the lowest alert count
- Keep a history log

**Add-ons**

- SMS fallback notifications (Twilio)
- WhatsApp alerts
- Telegram alerts
- Notify supervisors via email
- Store machine failures in Google Sheets
- Push metrics into Power BI
- Auto-close Jira tickets after machine values normalize
- Create a daily maintenance report

## Use Case Examples

- **Overheating machine alert** – Detect spikes and notify a technician instantly.
- **Vibration pattern anomaly detection** – Trigger early maintenance before a full breakdown.
- **Multi-shift technician coverage** – Automatically switch to emergency mode when no technician is online.
- **Factory night-shift automation** – Night alerts escalate automatically without manual verification.
- **Warehouse robotics malfunction** – Send instant Slack + Jira alerts when robots overheat or jam.
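The decision-engine Code node described above could be sketched as follows. The member shape (`id`, `presence`) loosely mirrors Slack's `users.getPresence` response, but the exact structure here is an assumption:

```javascript
// Sketch of the decision engine: pick the first active technician,
// otherwise signal fallback/escalation mode. Input shape is assumed.
function pickTechnician(members) {
  const active = members.find((m) => m.presence === "active");
  return active
    ? { mode: "notify", technician: active.id }
    : { mode: "escalate", technician: null };
}
```

Variations listed under "Code node" customization (rotation, lowest alert count) would replace the simple `find` with their own selection logic.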
## Troubleshooting Guide

| Issue | Possible Cause | Solution |
| --- | --- | --- |
| Webhook returns no data | Wrong endpoint or method | Use POST + the correct URL |
| Slack presence returns an error | Missing Slack scopes | Add `users:read.presence` |
| Jira ticket not created | Invalid project key or credentials | Reconfigure the Jira API credentials |
| All technicians show offline | Wrong channel or IDs | Ensure the correct channel members |
| Emergency alert not triggered | Code node returning incorrect logic | Test the code with all technicians set to "away" |
| Slack message fails | Wrong channel ID | Replace it with the correct Slack channel |

## Need Help?

If you need help customizing this workflow, adding new automation features, connecting additional systems or building enterprise IoT maintenance solutions, our n8n automation development team at WeblineIndia can help. We can assist with:

- Workflow setup
- Advanced alert logic
- Integrating SMS / WhatsApp / Voice alerts
- Custom escalation rules
- Industrial IoT integration

Reach out anytime for support or enhancements.
by IranServer.com
## Automate IP geolocation and HTTP port scanning with a Google Sheets trigger

This n8n template automatically enriches IP addresses with geolocation data and performs HTTP port scanning when new IPs are added to a Google Sheets document. Perfect for network monitoring, security research, or maintaining an IP intelligence database.

## Who's it for

Network administrators, security researchers, and IT professionals who need to:

- Track IP geolocation information automatically
- Monitor HTTP service availability across multiple ports
- Maintain centralized IP intelligence in spreadsheets
- Automate repetitive network reconnaissance tasks

## How it works

The workflow triggers whenever a new row containing an IP address is added to your Google Sheet. It then:

1. **Fetches geolocation data** using the ip-api.com service to get country, city, coordinates, ISP, and organization information
2. **Updates the spreadsheet** with the geolocation details
3. **Scans common HTTP ports** (80, 443, 8080, 8000, 3000) to check service availability
4. **Records port status** back to the same spreadsheet row, showing which services are accessible

The workflow handles both successful connections and various error conditions, providing a comprehensive view of each IP's network profile.
## Requirements

- **Google Sheets API access** – for reading triggers and updating data
- **A Google Sheets document** with at least an "IP" column header

## How to set up

1. Create a Google Sheet with the columns: IP, Country, City, Lat, Lon, ISP, Org, Port_80, Port_443, Port_8000, Port_8080, Port_3000
2. Configure Google Sheets credentials in both the trigger and update nodes
3. Update the document ID in the Google Sheets Trigger and both Update nodes to point to your spreadsheet
4. Test the workflow by adding an IP address to your sheet and verifying the automation runs

## How to customize the workflow

- **Modify the port list**: Edit the "Edit Fields" node to scan different ports by changing the ports array
- **Add more geolocation fields**: The ip-api.com response includes additional fields like timezone, ZIP code, and AS number
- **Change the trigger frequency**: Adjust the polling interval in the Google Sheets Trigger for faster or slower monitoring
- **Add notifications**: Insert Slack, email, or webhook nodes to alert when specific conditions are detected
- **Filter results**: Add IF nodes to process only certain IP ranges or geolocation criteria
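Writing the scan results back into the `Port_80`, `Port_443`, etc. columns could be sketched like this. The result shape and the "open"/"closed" labels are assumptions; the real workflow performs the checks with HTTP Request nodes:

```javascript
// Sketch: map per-port scan results onto the spreadsheet column names
// (Port_80, Port_443, ...) for the Google Sheets update node.
function portColumns(results) {
  const row = {};
  for (const { port, open } of results) {
    row[`Port_${port}`] = open ? "open" : "closed";
  }
  return row;
}
```

The returned object merges directly with the geolocation fields for a single row update.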