by Avkash Kakdiya
## How it works
This workflow starts whenever a new domain is added to a Google Sheet. It cleans the domain, fetches traffic insights from SimilarWeb, extracts the most relevant metrics, and updates the sheet with enriched data. Optionally, it can also send this information to Airtable for further tracking or analysis.

## Step-by-step
1. **Trigger on New Domain** - Starts when a new row is added to the Google Sheet and captures the raw URL/domain entered by the user.
2. **Clean Domain URL** - Strips unnecessary parts like `http://`, `https://`, `www.`, and trailing slashes, then stores a clean domain format (e.g., `example.com`) along with the row number.
3. **Fetch Website Analysis** - Uses the SimilarWeb API to pull traffic and engagement insights for the domain, including global rank, country rank, category rank, total visits, bounce rate, and more.
4. **Extract Key Metrics** - Processes raw SimilarWeb data into a simplified structure:
   - **Ranks**: Global, Country, and Category
   - **Traffic Overview**: Total Visits, Bounce Rate, Pages per Visit, Avg Visit Duration
   - **Top Traffic Sources**: Direct, Search, Social
   - **Top Countries (Top 3)**: With traffic share percentages
   - **Device Split**: Mobile vs Desktop
5. **Update Google Sheet** - Writes the cleaned and enriched domain data back into the same (or another) Google Sheet, ensuring each row is updated with the new traffic insights.
6. **Export to Airtable (Optional)** - Creates a new record in Airtable with the enriched traffic metrics. Useful if you want to manage or visualize company/domain data outside of Google Sheets.

## Why use this?
- Automatically enriches domain lists with live traffic data from SimilarWeb.
- Cleans messy URLs into a standard format.
- Saves hours of manual research on company traffic insights.
- Provides structured, comparable metrics for better decision-making.
- Flexible: update sheets, export to Airtable, or both.
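The "Clean Domain URL" step described above amounts to a few string operations in an n8n Code node. A minimal sketch (the function name and input field are illustrative assumptions, not taken from the workflow):

```javascript
// Sketch of the "Clean Domain URL" step, assuming the raw value arrives
// as a plain string (field names in the actual workflow may differ).
function cleanDomain(raw) {
  return raw
    .trim()
    .toLowerCase()
    .replace(/^https?:\/\//, '') // drop the protocol
    .replace(/^www\./, '')       // drop a leading www.
    .split('/')[0];              // drop any path and trailing slash
}

console.log(cleanDomain('https://www.Example.com/path/')); // → example.com
```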
by Artem Boiko
## How it works
This template helps project managers collect task updates and photo reports from field workers via Telegram and stores everything in a Google Sheet. It enables daily project reporting without paper or back-office overhead.

High-level flow:
- Workers receive daily tasks via Telegram
- They respond with photo reports
- The bot auto-saves replies (photos + status) to a Google Sheet
- The system tracks task completion, adds timestamps, and maintains report history

## Set up steps
🕒 Estimated setup time: 15–30 min

You'll need:
- A Telegram bot (via BotFather)
- A connected Google Sheet (with specific column headers)
- A set of preconfigured tasks

👉 Detailed setup instructions and the required table structure are documented in sticky notes inside the workflow.

## Consulting and Training
We work with leading construction, engineering, and consulting agencies and technology firms around the world, helping them implement open-data principles, automate CAD/BIM processing, and build robust ETL pipelines. If you would like to test this solution with your own data, or are interested in adapting the workflow to real project tasks, feel free to contact us.

Docs & Issues: Full Readme on GitHub
by AppStoneLab Technologies LLP
# Reddit Thread → AI-Powered X & LinkedIn Posts with Human Approval Gate
Turn any Reddit thread into polished, platform-optimized social media posts for X (Twitter) and LinkedIn - in minutes, not hours. This workflow reads a Reddit thread, extracts the full discussion (including nested comment threads sorted by score), feeds everything to Google Gemini for summarization and post generation, then pauses for your review before publishing anything live. No accidental posts. No context lost. Just high-quality content, on your terms.

## ✨ Key Features
- 🔗 **Any Reddit URL supported** → Standard, mobile (m.reddit.com), and short redd.it links all work
- 💬 **Full thread extraction** → Recursively pulls all comments and replies, sorted by score at every depth level
- 🧠 **Two-stage AI pipeline** → Gemini first summarizes the thread, then generates platform-specific posts from that summary
- 🐦 **X-optimized post** → Max 280 characters, punchy, curiosity-driven with relevant hashtags; auto-truncated if over the limit
- 💼 **LinkedIn-optimized post** → 150–300 words, professional tone, structured paragraphs, an engagement question, and hashtags
- 👤 **Human-in-the-loop approval** → A review form shows both posts before anything is published; supports manual overrides per platform
- 🚫 **Graceful rejection path** → If rejected, the workflow terminates cleanly with no content published

## 📝 What This Workflow Does
This workflow solves a real content-creation bottleneck: Reddit threads are goldmines of community insight, niche expertise, and trending discussions - but turning that raw discussion into polished, platform-appropriate social posts takes significant manual effort. This automation handles the entire pipeline: from a raw URL to live posts on two platforms, with you staying in full control via an approval gate.
It's ideal for content marketers, community managers, indie hackers, developers, and newsletter writers who want to repurpose Reddit content without losing quality or spending hours manually summarizing threads.

## ⚙️ How It Works (Step-by-Step)
1. **Submit a Reddit URL** - A web form (Form Trigger) accepts any Reddit thread URL as input.
2. **Parse the URL** - A Code node validates and deconstructs the URL using regex, extracting the subreddit and post ID to build the Reddit JSON API endpoint.
3. **Fetch the thread** - An HTTP Request node calls Reddit's public JSON API (reddit.com/r/.../comments/id.json) with limit=100 and depth=3 to retrieve the full thread.
4. **Extract & structure content** - A Code node recursively traverses the entire comment tree, sorts comments by score at every depth level, and builds a clean flat-text representation of the full thread - including post metadata (title, score, upvote ratio, flair, awards) - ready for AI injection.
5. **Summarize with Gemini** - The assembled thread content is passed to Google Gemini (3.1 Flash Lite), which returns a comprehensive markdown summary covering: Thread Overview, Key Topics, Notable Insights, Community Sentiment, and Actionable Takeaways.
6. **Generate social posts** - A second Gemini call uses the summary to craft a platform-optimized X post (≤280 chars) and a LinkedIn post (150–300 words), returning strict JSON output.
7. **Parse & validate** - A Code node safely extracts the JSON, strips any markdown fences, falls back to regex parsing if needed, and enforces the 280-character hard limit on the X post.
8. **Human approval form** - The workflow pauses and presents both posts in a review form. You can approve as-is, paste a manual override for either platform, or reject the entire run.
9. **Resolve final content** - A Code node merges your overrides (if any) with the AI versions; overrides always win, and the AI version is the fallback.
10. **Route by decision** - An IF node checks your approval decision:
    - ✅ **Approve & Publish** → Posts simultaneously to X and LinkedIn
    - ❌ **Reject** → The workflow ends cleanly; nothing is published

## 🚀 How to Use This Workflow
### Step 1 - Set up credentials
Click **Use template**, then configure the following credentials in n8n:

| Service | Credential Type | How to Get It |
|---|---|---|
| 🤖 Google Gemini | Google PaLM API | Get API Key → Google AI Studio |
| 🐦 X (Twitter) | Twitter OAuth2 | X Developer Portal → Create App → OAuth2 |
| 💼 LinkedIn | LinkedIn OAuth2 | LinkedIn Developer Portal → Create App |

> Note on Reddit: No API key required. This workflow uses Reddit's public JSON API (append .json to any thread URL), which is freely accessible without authentication.

### Step 2 - Configure the LinkedIn node
Open the **Post to LinkedIn** node and replace the person field value (=ID) with your LinkedIn Person URN. You can retrieve it by calling the LinkedIn API (GET https://api.linkedin.com/v2/userinfo) after authenticating.

### Step 3 - Activate the workflow
Toggle the workflow to **Active** in your n8n instance. This enables the Form Trigger and the Wait node's webhook to function correctly.

### Step 4 - Run it
1. Open the Form Trigger URL (found in the Reddit URL Input node)
2. Paste any Reddit thread URL
3. Wait for the approval form to arrive (check the execution log for the form URL)
4. Review, optionally edit, and approve or reject
5. Done! Your posts are live 🚀

## 🛠️ How to Customize
- 🤖 **Swap the AI model** - Both Gemini nodes use gemini-3.1-flash-lite-preview. You can switch to gemini-3.1-pro-preview or claude-sonnet-4-6 for higher-quality output by updating the modelId in both Gemini nodes or by adding Anthropic nodes.
- 📝 **Change the post format** - Edit the prompt in the **Generate Social Posts** node to adjust tone, length, hashtag count, or add support for other platforms (Instagram, Threads, Facebook).
- 📊 **Add more platforms** - After the **Approved?** node's true branch, connect additional posting nodes (e.g., Facebook Graph API, Buffer, Telegram) in parallel.
- 📋 **Log to Google Sheets** - Add a Google Sheets node after the publish nodes to track published posts, Reddit thread URLs, dates, and engagement metrics.
- ⏱️ **Make it scheduled** - Replace the Form Trigger with a Schedule Trigger plus a list of pre-configured Reddit URLs in Google Sheets for fully automated daily publishing.

## ⚠️ Important Notes
- The Reddit public JSON API does not require authentication but is rate-limited. For high-volume use, consider adding a Reddit OAuth2 credential.
- The Wait node requires your n8n instance to be publicly accessible (or use n8n Cloud) so the approval form's webhook URL can be reached by your browser.
- LinkedIn's API requires your app to have the w_member_social permission scope to post on behalf of a user.
- X (Twitter) API v2 requires an approved developer account. The free tier allows a limited number of monthly tweets.
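The URL-parsing step described in the walkthrough above can be sketched as follows. The exact regex used by the workflow is not shown in this description, so treat this as an assumption that covers the three URL forms mentioned (standard, m.reddit.com, and redd.it):

```javascript
// Sketch: extract the subreddit and post ID from a Reddit thread URL and
// build the public JSON API endpoint. Regex and return shape are assumptions.
function parseRedditUrl(url) {
  // Standard and mobile links: .../r/<subreddit>/comments/<postId>/...
  const m = url.match(/(?:www\.|m\.)?reddit\.com\/r\/([^/]+)\/comments\/([a-z0-9]+)/i);
  if (m) {
    return {
      subreddit: m[1],
      postId: m[2],
      api: `https://www.reddit.com/r/${m[1]}/comments/${m[2]}.json?limit=100&depth=3`,
    };
  }
  // Short links: redd.it/<postId> (the JSON API resolves these without a subreddit)
  const s = url.match(/redd\.it\/([a-z0-9]+)/i);
  if (s) {
    return { postId: s[1], api: `https://www.reddit.com/comments/${s[1]}.json?limit=100&depth=3` };
  }
  throw new Error('Not a recognized Reddit thread URL');
}
```

The HTTP Request node then calls the returned `api` URL with limit=100 and depth=3, as described in step 3.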
by Joseph
## Overview
This n8n workflow creates an intelligent AI agent that automates browser interactions through Airtop's browser-automation platform. The agent can control real browser sessions, navigate websites, interact with web elements, and maintain detailed session records - all while providing live viewing capabilities for real-time monitoring.

YouTube tutorial: https://www.youtube.com/watch?v=XoZqFY7QFps

## What This Workflow Does
The AI agent acts as your virtual assistant in the browser, capable of:
- **Session Management**: Creates, monitors, and terminates browser sessions with proper tracking
- **Web Navigation**: Visits websites, clicks elements, fills forms, and performs complex interactions
- **Multi-Window Support**: Manages multiple browser windows within sessions
- **Live Monitoring**: Provides real-time viewing URLs so you can watch the automation
- **Data Tracking**: Maintains comprehensive records of all browser activities
- **Profile Integration**: Uses Airtop profiles for authenticated sessions
- **Email Notifications**: Sends live URLs and status updates via Gmail

## Demo Use Case: Automated Reddit Posting
The tutorial demonstrates the agent's capabilities by:
1. Logging into Reddit using pre-configured Airtop profile credentials
2. Navigating to a specific subreddit based on user input
3. Creating and publishing a new post with title and content
4. Tracking the entire process with detailed session records
5. Providing live viewing access throughout the automation

## Core Workflow Components
### 1. Chat Interface Trigger
- **Node Type**: Chat Trigger
- **Purpose**: Accepts user commands for browser-automation tasks
- **Input**: Natural-language instructions (e.g., "Create a Reddit post in r/automation")

### 2. AI Agent Processing
- **Node Type**: OpenAI GPT-4
- **Purpose**: Interprets user requests and determines appropriate browser actions
- **System Message**: Contains the comprehensive agent instructions from your documentation
- **Capabilities**: Understands complex web-interaction requests, plans multi-step browser workflows, manages session states intelligently, and handles error scenarios gracefully

### 3. Google Sheets Data Management
Multiple Google Sheets nodes manage different aspects of session tracking:

**Browser Sessions Sheet**
- **Fields**: session_id, description, status, created_date
- **Purpose**: Tracks active browser sessions
- **Operations**: Create, read, update session records

**Window Sessions Sheet**
- **Fields**: session_id, window_id, description, airtop_live_view_url, status
- **Purpose**: Tracks individual browser windows within sessions
- **Operations**: Create, read, update window records

**Airtop Profiles Sheet**
- **Fields**: platform_name, platform_url, profile_name
- **Purpose**: Stores available authenticated profiles
- **Operations**: Read available profiles for session creation

### 4. Airtop Browser Automation Nodes
Multiple specialized nodes for browser control:

**Session Management**
- **create_session**: Creates new browser sessions with optional profile authentication
- **terminate_session**: Closes browser sessions and updates records
- **read_airtop_profiles**: Retrieves available authentication profiles

**Window Management**
- **create_window**: Opens new browser windows with specified URLs
- **query_page**: Analyzes page content and identifies interactive elements

**Web Interaction**
- **click_element**: Clicks specific page elements based on AI descriptions
- **type_text**: Inputs text into form fields and input elements

### 5. Gmail Integration
- **Node Type**: Gmail Send
- **Purpose**: Sends live viewing URLs and status updates
- **Recipients**: User email for real-time monitoring
- **Content**: Complete Airtop live-view URLs for browser-session observation

### 6. Error Handling & Validation
- **Input Validation**: Ensures required parameters are present
- **Session State Checks**: Verifies browser-session status before operations
- **Error Recovery**: Handles failed operations gracefully
- **Data Consistency**: Maintains accurate session records even during failures

## Technical Requirements
### API Credentials Needed
1. **Airtop.ai API Key** - Sign up at airtop.ai and generate an API key from the dashboard. Required for all browser-automation functions.
2. **OpenAI API Key** - An OpenAI account with GPT-4 access. Required for the AI agent's intelligence and decision-making.
3. **Google Sheets Access** - A Google account with Google Sheets API access. Copy the provided template and get your sheet URL. Required for session and profile data management.
4. **Gmail OAuth** - A Google account with Gmail API access. Required for sending live viewing URLs and notifications.

### Airtable Base Structure
Create three tables in your Airtable base:

**1. Browser Details (Sessions)**
- session_id (Single line text)
- description (Single line text)
- status (Single select: Open, Closed)
- created_date (Date)

**2. Window Details (Windows)**
- session_id (Single line text)
- window_id (Single line text)
- description (Single line text)
- airtop_live_view_url (URL)
- status (Single select: Open, Closed)

**3. Airtop Profiles**
- platform_name (Single line text)
- platform_url (URL)
- profile_name (Single line text)

## Workflow Logic Flow
### User Request Processing
1. **User Input**: Natural-language command via the chat interface
2. **AI Analysis**: OpenAI processes the request and determines required actions
3. **Session Check**: The agent reads the current browser-session status
4. **Action Planning**: The AI creates a step-by-step execution plan

### Browser Session Lifecycle
1. **Session Creation**: Check for existing open sessions, ask the user about profile usage if needed, create a new Airtop session, and record session details in Airtable.
2. **Window Management**: Create a browser window with the target URL, capture the live viewing URL, record window details in Airtable, and send the live URL via Gmail.
3. **Web Interactions**: Query page content for element identification, execute clicks, form fills, and navigation, monitor page-state changes, and handle dynamic content loading.
4. **Session Cleanup**: Terminate the browser session when complete, update all related records to "Closed" status, and send a completion notification.

### Data Flow Architecture
User Input → AI Processing → Session Management → Browser Actions → Data Recording → User Notifications

## Key Features & Benefits
### Intelligent Automation
- **Natural Language Control**: Users can describe tasks in plain English
- **Context Awareness**: The AI understands complex multi-step workflows
- **Adaptive Responses**: Handles unexpected page changes and errors
- **Profile Integration**: Seamlessly uses stored authentication credentials

### Real-Time Monitoring
- **Live View URLs**: Watch browser automation as it happens
- **Status Updates**: Real-time notifications of task progress
- **Session Tracking**: Complete audit trail of all browser activities
- **Multi-Window Support**: Handle complex workflows across multiple tabs

### Enterprise-Ready Features
- **Error Recovery**: Robust handling of network issues and page failures
- **Session Persistence**: Maintains state across workflow interruptions
- **Data Integrity**: Consistent record-keeping even during failures
- **Scalable Architecture**: Can handle multiple concurrent automation tasks

## Use Cases Beyond Reddit
This workflow architecture supports automation for any website:

### Social Media Management
- **Multi-platform posting**: Facebook, Twitter, LinkedIn, Instagram
- **Community engagement**: Responding to comments and messages
- **Content scheduling**: Publishing posts at optimal times
- **Analytics gathering**: Collecting engagement metrics

### Business Process Automation
- **CRM data entry**: Updating customer records across platforms
- **Support ticket management**: Creating, updating, and routing tickets
- **E-commerce operations**: Product listings, inventory updates
- **Report generation**: Gathering data from multiple web sources

### Personal Productivity
- **Travel booking**: Comparing prices, making reservations
- **Bill management**: Paying utilities, checking statements
- **Job applications**: Submitting applications, tracking status
- **Research tasks**: Gathering information from multiple sources

## Advanced Configuration Options
### Custom Profiles
- Create Airtop profiles for different websites
- Store authentication credentials securely
- Switch between different user accounts
- Handle multi-factor authentication flows

### Workflow Customization
- Modify AI system prompts for specific use cases
- Add custom validation rules
- Implement retry logic for failed operations
- Create domain-specific interaction patterns

### Integration Extensions
- Connect to additional data sources
- Add webhook notifications
- Implement approval workflows
- Create audit logs and reporting

## Getting Started
1. 📊 Copy the Google Sheets template - just click and make a copy!
2. Set up credentials for Airtop, OpenAI, and Gmail.
3. Import the workflow into your n8n instance.
4. Configure node credentials with your API keys and Google Sheets URL.
5. Test with simple commands like "Visit google.com".
6. Expand to complex workflows as you become comfortable.

## Best Practices
### Session Management
- Always check for existing sessions before creating new ones
- Properly terminate sessions to avoid resource waste
- Use descriptive names for sessions and windows
- Regularly clean up old session records

### Error Handling
- Implement timeout handling for slow-loading pages
- Add retry logic for network failures
- Validate element existence before interactions
- Log detailed error information for debugging

### Security Considerations
- Store sensitive credentials in Airtop profiles, not in the workflow
- Use webhook authentication for production deployments
- Implement rate limiting to avoid being blocked by websites
- Regularly audit browser-session activities

This workflow transforms n8n into a powerful browser-automation platform, enabling you to automate virtually any web-based task while maintaining full visibility and control over the automation process.
by JKingma
# 🛍️ Automated Product Description Generation for Adobe Commerce (Magento 2)

## Description
This n8n template demonstrates how to automatically generate product descriptions for items in Adobe Commerce (Magento 2) that are missing one. The workflow retrieves product data, converts raw attribute values (like numeric IDs) into human-readable labels, and passes the enriched product data to an LLM (Azure OpenAI by default). The LLM generates a compelling description, which is then saved back to Magento via the API. This ensures all products have professional descriptions without manual writing effort.

Use cases include:
- Auto-generating missing descriptions for catalog completeness.
- Creating consistent descriptions across large product datasets.
- Reducing manual workload for content managers.
- Tailoring descriptions for SEO and customer readability.

## Good to know
- All attribute options are resolved to human-readable labels before being sent to the LLM.
- The flow uses Azure OpenAI, but you can replace it with OpenAI, Anthropic, Gemini, or other LLM providers.
- The LLM prompt can be customised to adjust tone, length, SEO focus, or a specific brand style.
- Works out of the box with Adobe Commerce (Magento 2) APIs, but can be adapted for other ecommerce systems.

## How it works
1. **Get Product from Magento**
   - Retrieves a product that has no description.
   - Collects all product attributes.
2. **Generate Description with LLM**
   - Resolves attribute option IDs into human-readable values (e.g. color_id = 23 → "Red").
   - Passes the readable product attributes to an Azure OpenAI model.
   - The LLM creates a clear, engaging product description.
   - The prompt can be customised (e.g. SEO-optimized, short catalog text, or marketing style).
3. **Save Description in Magento**
   - Updates the product via the Magento API with the generated description.
   - Ensures product data is enriched and visible in the webshop immediately.

## How to use
- Configure your Magento 2 API credentials in n8n.
- Replace the Azure OpenAI node with another provider if needed.
- Adjust the prompt to match your brand's tone of voice.
- Run the workflow to automatically process products missing descriptions.

## Requirements
- ✅ n8n instance (self-hosted or cloud)
- ✅ Adobe Commerce (Magento 2) instance with API access
- ✅ Azure OpenAI (or other LLM provider) credentials
- (Optional) Prompt customisations for SEO or brand voice

## Customising this workflow
This workflow can be adapted for:
- **Other attributes**: Include or exclude attributes (e.g. only color & size for apparel).
- **Different LLMs**: Swap Azure OpenAI for OpenAI, Anthropic, Gemini, or any supported n8n AI node.
- **Prompt tuning**: Adjust instructions to generate shorter, longer, or SEO-rich descriptions.
- **Selective updates**: Target only specific categories (e.g. electronics, fashion).
- **Multi-language support**: Generate product descriptions in multiple languages for international shops.
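The attribute-resolution step described under "How it works" could look roughly like this in a Code node. The `optionMaps` lookup structure is an assumption; in Magento, attribute options arrive from the API as label/value pairs, which you would first fold into such a map:

```javascript
// Sketch: resolve raw attribute option IDs to human-readable labels before
// prompting the LLM. The optionMaps shape is an assumption built from
// Magento's attribute-options API (label/value pairs).
function resolveAttributes(product, optionMaps) {
  const readable = {};
  for (const [attr, value] of Object.entries(product)) {
    const options = optionMaps[attr];
    // look up the label for this option ID, fall back to the raw value
    readable[attr] = options?.[value] ?? value;
  }
  return readable;
}

const optionMaps = { color: { '23': 'Red', '24': 'Blue' } };
resolveAttributes({ sku: 'SHIRT-1', color: '23' }, optionMaps);
// → { sku: 'SHIRT-1', color: 'Red' }
```

The resulting readable object is what gets serialized into the LLM prompt, so the model sees "Red" rather than an opaque option ID.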
by Julian Kaiser
# Automatically Classify Support Tickets in Zoho Desk with AI (Gemini)

Transform your customer support workflow with intelligent ticket classification. This automation leverages AI to automatically categorize incoming support tickets in Zoho Desk, reducing manual work and ensuring faster ticket routing to the right teams.

## How It Works
1. Fetches all tickets from Zoho Desk with pagination support
2. Filters unclassified tickets (where the classification field is null)
3. Retrieves complete ticket threads for full conversation context
4. Uses OpenRouter AI (GPT-4, Claude, or other models) to classify tickets into predefined categories
5. Updates tickets in Zoho Desk with accurate classifications automatically

## Use Cases
- **Customer Support Teams**: Automatically route tickets to specialized departments (billing, technical, sales)
- **Help Desks**: Prioritize urgent issues and categorize feature requests

## Prerequisites
- Active Zoho Desk account with API access
- OpenRouter API account (supports multiple AI models)
- Basic understanding of OAuth2 authentication
- Predefined ticket categories in your Zoho Desk setup

## Setup Steps
Time: ~15 minutes
1. **Configure Zoho Desk OAuth2** - Follow our step-by-step GitHub guide for OAuth2 credential setup
2. **Set up OpenRouter API** - Create an account and generate API keys at openrouter.ai
3. **Customize classifications** - Define your ticket categories (e.g., Technical, Billing, Feature Request, Bug Report)
4. **Adapt the workflow** - Modify it for any field: status, priority, tags, assignment, or custom fields
5. **Review API documentation** - Check the Zoho Desk Search API docs for advanced filtering options
6. **Test thoroughly** - Run manual triggers before automation

Note: This workflow demonstrates proper Zoho Desk API integration, including OAuth2 authentication and pagination handling - two common integration challenges.
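The pagination-plus-filtering logic from steps 1 and 2 can be sketched as a fetch-all loop. Zoho Desk's tickets endpoint paginates with `from`/`limit` query parameters; here `fetchPage` is a stand-in for the workflow's HTTP Request node, and the `classification` field name follows the description above:

```javascript
// Sketch: page through Zoho Desk tickets and keep only unclassified ones.
// fetchPage(from, limit) stands in for GET /api/v1/tickets?from=...&limit=...
async function fetchUnclassifiedTickets(fetchPage, limit = 100) {
  const tickets = [];
  let from = 0;
  while (true) {
    const page = await fetchPage(from, limit);
    tickets.push(...page);
    if (page.length < limit) break; // a short page means we've reached the end
    from += limit;
  }
  // keep only tickets whose classification field is still null/unset
  return tickets.filter(t => t.classification == null);
}
```

Each surviving ticket then has its full thread fetched for conversation context before the OpenRouter classification call.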
by Kevin Meneses
## What this workflow does
This workflow automates the discovery, evaluation, and notification of job opportunities based on a candidate's professional profile. It fetches remote job listings, compares each role against a structured candidate profile stored in Google Sheets, and uses AI to evaluate real alignment in terms of skills, seniority, salary, industry, and role complexity. Only the most relevant opportunities are kept, stored in Google Sheets, and delivered via email as a Top 5 shortlist.

Decodo - Web Scraper for n8n

## How to configure (quick setup)
1. Define the candidate profile in Google Sheets (skills, salary expectations, preferences).
2. Configure credentials (Google Sheets, Gmail, Decodo, AI provider).
3. Set the matching threshold (e.g. skill match ≥ 90%).
4. Run the workflow manually or on a schedule.

## Output
- Structured job-match results in Google Sheets
- An automated email with the Top 5 best-matched job opportunities
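The threshold-and-shortlist logic can be sketched as a small filter-and-rank step. The `skillMatch` field (0–100) is an assumed name for the AI evaluator's output, not taken from the workflow itself:

```javascript
// Sketch: keep jobs whose AI-evaluated skill match clears the configured
// threshold, then rank them to produce the Top 5 shortlist for the email.
function topMatches(jobs, threshold = 90, top = 5) {
  return jobs
    .filter(job => job.skillMatch >= threshold) // drop weak matches
    .sort((a, b) => b.skillMatch - a.skillMatch) // best first
    .slice(0, top);
}

topMatches([
  { title: 'Backend Dev', skillMatch: 95 },
  { title: 'Data Analyst', skillMatch: 72 },
  { title: 'Platform Eng', skillMatch: 91 },
]);
// → Backend Dev (95) then Platform Eng (91); Data Analyst is filtered out
```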
by Madame AI
# Scrape physician profiles with BrowserAct to Google Sheets
This workflow automates the process of building a targeted database of healthcare providers by scraping physician details for a specific location and syncing them to your records. It leverages BrowserAct to extract data from healthcare directories and keeps your database clean by preventing duplicate entries.

## Target Audience
Medical recruiters, pharmaceutical sales representatives, lead-generation specialists, and healthcare data analysts.

## How it works
1. **Define Location**: The workflow starts by setting the target Location and State in a Set node.
2. **Scrape Data**: A BrowserAct node executes a task (using the "Physician Profile Enricher" template) to search a healthcare directory (e.g., Healow) for doctors matching the criteria.
3. **Parse JSON**: A Code node takes the raw string output from the scraper and parses it into individual JSON objects.
4. **Update Database**: The workflow uses a Google Sheets node to append new records or update existing ones based on the physician's name, preventing duplicates.
5. **Notify Team**: A Slack node sends a message to a specific channel to confirm the batch job has finished successfully.

## How to set up
1. **Configure Credentials**: Connect your BrowserAct, Google Sheets, and Slack accounts in n8n.
2. **Prepare BrowserAct**: Ensure the Physician Profile Enricher template is saved in your BrowserAct account.
3. **Set up the Google Sheet**: Create a new Google Sheet with the required headers (listed below).
4. **Select Spreadsheet**: Open the Google Sheets node and select your newly created file and sheet.
5. **Set Variables**: Open the Define Location node and input your target Location (City) and State.
6. **Configure Notification**: Open the Slack node and select the channel where you want to receive alerts.

## Google Sheet Headers
To use this workflow, create a Google Sheet with the following headers:
- Name
- Specialty
- Address

## Requirements
- **BrowserAct** account with the **Physician Profile Enricher** template.
- **Google Sheets** account.
- **Slack** account.

## How to customize the workflow
- **Change the Data Source**: Modify the BrowserAct template to scrape a different directory (e.g., Zocdoc or WebMD) and update the Google Sheet columns accordingly.
- **Switch Notifications**: Replace the Slack node with a Microsoft Teams, Discord, or Email node to suit your team's communication preferences.
- **Enrich Data**: Add an AI Agent node after the Code node to format addresses or research the specific clinics listed.

## Need Help?
- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- Workflow Guidance and Showcase Video: Automate Medical Lead Gen: Scrape Healow to Google Sheets & Slack
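The "Parse JSON" step can be sketched as below. The exact shape of BrowserAct's raw output isn't documented here, so this assumes it's a JSON-encoded array of physician records whose keys map onto the sheet headers:

```javascript
// Sketch of the "Parse JSON" Code node: turn the scraper's raw string
// output into individual n8n items. The input shape (a JSON-encoded array
// with name/specialty/address keys) is an assumption.
function parseScraperOutput(raw) {
  const records = JSON.parse(raw);
  if (!Array.isArray(records)) throw new Error('Expected an array of records');
  // one n8n item per physician, keyed to match the Google Sheet headers
  return records.map(r => ({
    json: { Name: r.name, Specialty: r.specialty, Address: r.address },
  }));
}

const raw = '[{"name":"Dr. Lee","specialty":"Cardiology","address":"12 Main St"}]';
parseScraperOutput(raw);
// → [{ json: { Name: 'Dr. Lee', Specialty: 'Cardiology', Address: '12 Main St' } }]
```

Keying items by Name here is what lets the downstream Google Sheets node match existing rows and avoid duplicates.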
by Aitor | 1Node
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automates the distribution and scheduling of video content across multiple social platforms (TikTok, YouTube, Facebook, Instagram, Threads) through Postiz. Videos are collected from Google Drive, approved manually, and scheduled via the Postiz community node.

## 🧾 Requirements
- **Google Drive** account with access to the folder to watch for new uploads.
- Videos in MP4 format ready to be shared; alternatively, you can connect the CloudConvert community node to convert the format before uploading to Postiz.
- Postiz account with integrations for TikTok, YouTube, Facebook, Instagram, and Threads.

## 🔗 Useful Links
- Postiz Docs
- Postiz Community Node

## 🔄 Workflow Steps
1. **Trigger: Google Drive File Added** - Watches your selected Google Drive folder for new file uploads.
2. **Download File** - Downloads the detected video from Drive.
3. **Upload to Postiz** - The video is uploaded to Postiz to prepare for social scheduling.
4. **Set Fields** - Manual setting of social options.
5. **Extract Datetime (AI)** - Uses OpenAI to find/predict the intended publish date and time, since Postiz requires a datetime format for scheduling.
6. **Get Social Integrations** - Fetches the list of the user's connected platforms from Postiz.
7. **Split and Filter Integrations** - Splits the process per platform (TikTok, YouTube, Facebook, Instagram, Threads).
8. **Schedule Post** - For each enabled platform, schedules the video with the chosen options.

## 🙋‍♂️ Need Help?
Connect with 1 Node
by Shelly-Ann Davy
## Description
Wake up gently. This elegant workflow runs every morning at 7 AM, picks one uplifting affirmation from a curated list, and delivers it to your inbox (with optional Telegram). Zero code, zero secrets - just drop in your SMTP and Telegram credentials, edit the affirmations, and activate. Perfect for creators, homemakers, and entrepreneurs who crave intention and beauty before the day begins.

## How it works (high-level steps)
1. **Cron** wakes the flow daily at 7 AM.
2. **Set: Configuration** stores your email, Telegram chat ID, and affirmations.
3. A **Code** node randomly selects one affirmation.
4. The **Email** node sends the message via SMTP.
5. An **IF** node decides whether to forward it to Telegram as well.

## Set-up time
2–3 minutes:
- 30 s: add SMTP credential
- 30 s: add Telegram Bot credential (optional)
- 1 min: edit affirmations & email addresses
- 30 s: activate

## Detailed instructions
All deep-dive steps live inside the yellow and white sticky notes on the canvas - no extra docs needed.

## Requirements
- SMTP account (SendGrid, Gmail, etc.)
- Telegram Bot account (optional)

## Customisation tips
- Change the Cron time or frequency
- Swap the affirmation list for quotes, verses, or mantras
- Add a Notion logger branch for journaling
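The random-selection Code node amounts to a one-liner. A minimal sketch, assuming the affirmations arrive as an array from the configuration Set node (the sample strings are placeholders, not the template's curated list):

```javascript
// Sketch of the affirmation picker: choose one entry uniformly at random.
function pickAffirmation(affirmations) {
  if (affirmations.length === 0) throw new Error('No affirmations configured');
  const index = Math.floor(Math.random() * affirmations.length);
  return affirmations[index];
}

const affirmations = [
  'Today I choose calm over chaos.',
  'I am exactly where I need to be.',
  'Small steps still move me forward.',
];
pickAffirmation(affirmations); // returns one of the three strings
```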
by WeblineIndia
# Webhook from IoT Devices → Jira Maintenance Ticket → Slack Factory Alert

This workflow automates predictive maintenance by receiving IoT machine-failure webhooks, creating Jira maintenance tickets, checking technician availability in Slack, and sending the alert to the correct Slack channel. If an active technician is available, the system notifies the designated technician channel; if not, it escalates automatically to your chosen emergency/escalation channel.

## ⚡ Quick Implementation: Start Using in 10 Seconds
1. Import the workflow JSON into n8n.
2. Add Slack API credentials (with all required scopes).
3. Add Jira Cloud credentials.
4. Select Slack channels for technician alerts and emergency/escalation alerts.
5. Deploy the webhook URL to your IoT device.
6. Run a test event.

## What It Does
This workflow implements a real-time predictive-maintenance automation loop. An IoT device sends machine data - such as temperature, vibration, and timestamps - to an n8n webhook whenever a potential failure is detected. The workflow immediately evaluates whether the values exceed a defined safety threshold. If a failure condition is detected, a Jira maintenance ticket is automatically created with all relevant machine information.

The workflow then gathers all technicians from your selected Slack channel and checks each technician's presence status in real time. A built-in decision engine chooses the first available technician. If someone is active, the workflow sends a maintenance alert to your technician channel. If no technicians are available, the workflow escalates the alert to your chosen emergency channel to avoid operational downtime.

This eliminates manual monitoring, accelerates response times, and ensures no incident goes unnoticed - even if the team is unavailable.
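The threshold evaluation on the incoming payload can be sketched as a simple predicate. The specific limits below (80 °C, 7.0 vibration units) are illustrative values, not the workflow's shipped defaults:

```javascript
// Sketch: evaluate an incoming machine payload against safety thresholds.
// The limits here are illustrative, not the workflow's actual defaults.
const LIMITS = { temperature: 80, vibration: 7.0 };

function isFailure(payload) {
  // the IF node uses OR logic: either reading over its limit trips an alert
  return payload.temperature > LIMITS.temperature
      || payload.vibration > LIMITS.vibration;
}

isFailure({ machineId: 'M-042', temperature: 91, vibration: 3.2,
            timestamp: '2024-05-01T08:30:00Z' }); // → true
```

Changing OR to AND (as mentioned under customization) would instead require both readings to exceed their limits before a ticket is created.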
Who’s It For
This workflow is ideal for manufacturing factories, industrial automation setups, IoT monitoring systems, warehouse operations, maintenance and facility management teams, companies using Jira + Slack, and organizations implementing predictive maintenance or automated escalation workflows.

Requirements to Use This Workflow
You will need: an n8n instance (Cloud or Self-hosted); a Slack App with the scopes users:read, users:read.presence, channels:read, and chat:write; Jira Cloud credentials (email + API token); Slack channels of your choice for technician alerts and emergency/escalation alerts; and an IoT device capable of making POST webhook calls. The machine payload must include machineId, temperature, vibration, and timestamp.

How It Works & How To Set Up
🔧 High-Level Workflow Logic
The IoT webhook receives machine data. An IF condition checks whether values exceed safety thresholds. If a failure is detected, a Jira ticket is created with the machine details. Slack channel members are fetched from your selected technician channel, and the workflow loops through them to check real-time presence. A Code node determines the first available (active) technician, or falls back to escalation mode if none is available. An IF condition checks technician availability, and the Slack notification is sent to your chosen technician channel if someone is available, or to your chosen emergency/escalation channel if no one is online.

🛠 Step-by-Step Setup Instructions
Import Workflow: n8n → Workflows → Import from File → Select JSON. Configure Slack: add the required scopes (users:read, users:read.presence, channels:read, chat:write) and reconnect credentials. Select Slack Channels: choose any Slack channels you want for technician notifications and emergency alerts; no fixed naming is required. Configure Jira: add credentials, select the project and issue type, and set priority mapping if needed. Deploy Webhook: copy the n8n webhook URL and configure your IoT device to POST machine data to it.
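On the device side, the webhook call is a plain JSON POST. A minimal sketch (the webhook URL is a placeholder you copy from your own n8n instance):

```javascript
// Builds the HTTP request an IoT device would send to the n8n webhook.
function buildWebhookRequest(reading) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(reading),
  };
}

const request = buildWebhookRequest({
  machineId: "M-102",
  temperature: 92,
  vibration: 3.1,
  timestamp: "2024-05-01T07:00:00Z",
});

// On a real device, something like:
// fetch("https://your-n8n.example.com/webhook/machine-failure", request)
console.log(request.body);
```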
Test System: send a test payload to confirm that Jira tickets are created and Slack notifications route correctly based on technician availability. This setup allows real-time monitoring, automated ticket creation, and flexible escalation, reducing manual intervention and ensuring a fast maintenance response.

How To Customize Nodes
Webhook Node: add security tokens, change the webhook path, or add a response message. IF Node (Threshold Logic): lower or raise the temperature threshold, change OR to AND, or add more conditions (humidity, RPM, pressure). Jira Node: customize fields like summary and labels, or assign issues based on technician availability. Slack Presence Node: add DND checks, treat "away" as "available" during night shifts, or combine multiple channels. Code Node: randomly rotate technicians, pick the technician with the lowest alert count, or keep a history log.

Add-Ons
SMS fallback notifications (Twilio), WhatsApp alerts, Telegram alerts, supervisor notifications via email, storing machine failures in Google Sheets, pushing metrics into Power BI, auto-closing Jira tickets once machine values return to normal, and a daily maintenance report.

Use Case Examples
Overheating Machine Alert: detect spikes and notify a technician instantly. Vibration Pattern Anomaly Detection: trigger early maintenance before a full breakdown. Multi-Shift Technician Coverage: automatically switch to emergency mode when no technician is online. Factory Night-Shift Automation: night alerts escalate automatically without manual verification. Warehouse Robotics Malfunction: send instant Slack + Jira alerts when robots overheat or jam.
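The decision engine in the Code node can be sketched like this. The `presence` values mirror Slack's "active"/"away" presence states, but the exact field names are assumptions, not the template's literal output:

```javascript
// Sketch of the decision engine: notify the first "active" technician,
// otherwise fall back to escalation mode. Field names are assumptions.
function chooseRecipient(technicians) {
  const available = technicians.find((t) => t.presence === "active");
  return available
    ? { mode: "notify", technicianId: available.id }
    : { mode: "escalate", technicianId: null };
}

const shift = [
  { id: "U01", presence: "away" },
  { id: "U02", presence: "active" },
];
console.log(chooseRecipient(shift)); // { mode: "notify", technicianId: "U02" }
```

Swapping `find` for a round-robin or lowest-alert-count lookup would implement the rotation customizations mentioned in the Code Node section.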
Troubleshooting Guide

| Issue | Possible Cause | Solution |
| ----------------------------- | ----------------------------------- | -------------------------------------------- |
| Webhook returns no data | Wrong endpoint or method | Use POST and the correct URL |
| Slack presence returns error | Missing Slack scopes | Add users:read.presence |
| Jira ticket not created | Invalid project key or credentials | Reconfigure Jira API credentials |
| All technicians show offline | Wrong channel or IDs | Verify the channel members are correct |
| Emergency alert not triggered | Code node returning incorrect logic | Test the code with all technicians set to "away" |
| Slack message fails | Wrong channel ID | Replace with the correct Slack channel |

Need Help?
If you need help customizing this workflow, adding new automation features, connecting additional systems, or building enterprise IoT maintenance solutions, our n8n automation development team at WeblineIndia can help. We can assist with workflow setup, advanced alert logic, SMS/WhatsApp/voice alert integration, custom escalation rules, and industrial IoT integration. Reach out anytime for support or enhancements.
by Automate With Marc
Viral Marketing Reel & Autopost with Sora2 + Blotato

Create funny, ultra-realistic marketing reels on autopilot using n8n, Sora2, Blotato, and OpenAI. This beginner-friendly template generates a comedic video prompt, creates a 12-second Sora2 video, writes a caption, and auto-posts to Instagram/TikTok, all on a schedule.

🎥 Watch the full step-by-step tutorial: https://www.youtube.com/watch?v=lKZknEzhivo

What this template does
This workflow automates an entire short-form content production pipeline. Scheduled Trigger: runs automatically at your chosen time (e.g., every evening at 7 PM). AI "Video Prompt Agent": creates a cinematic, funny, 12-second Sora2 text-to-video prompt designed to promote a product (default: Sally's Coffee). Insert Row (Data Table): logs each generated video prompt for tracking, reuse, or inspiration. Sora2 (via Wavespeed): sends a POST request to generate a video, waits 30 seconds, then polls the prediction endpoint until the video is completed. Blotato Integration: uploads the finished video to your connected social account(s) and automatically publishes or schedules the post. Caption Generator: uses an AI agent to create an Instagram/TikTok-ready caption with relevant hashtags. This turns n8n into a hands-free comedic marketing engine that writes, creates, and posts content for you.

Why it's useful
Create daily or weekly marketing reels without filming, editing, or writing scripts. Experiment with new comedic formats, hooks, and product placements in seconds. Perfect for small businesses, agencies, creators, and social media managers. Demonstrates how to combine AI agents, Sora2, polling, and external posting services inside one workflow.

Requirements
Before running this template, configure: an OpenAI API key (for the prompt agent and caption model), Wavespeed/Sora2 API credentials, a Blotato account connected to Instagram/TikTok (for posting), and an n8n Data Table (optional, or replace with your own). ⚠️ All credentials must be added manually after import.
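The wait-and-poll step can be sketched generically. Here `getStatus` stands in for the HTTP GET node, and the `{ data: { status } }` response shape is an assumption based on this template's description; Wavespeed's actual API may differ:

```javascript
// Generic sketch of the wait-then-poll loop used for the Sora2 prediction.
// Polls until status is "completed", waiting intervalMs between attempts,
// and gives up after maxTries so the loop cannot run forever.
async function pollUntilComplete(getStatus, { intervalMs = 30000, maxTries = 20 } = {}) {
  for (let attempt = 0; attempt < maxTries; attempt++) {
    const result = await getStatus();
    if (result.data.status === "completed") return result;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("Video still processing after maximum retries");
}
```

In n8n this is modeled with a Wait node looping back into the HTTP Request node rather than a single script, but the control flow is the same.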
No real credentials are included in the template.

How it works
Schedule Trigger: runs at a fixed time or interval. Video Prompt Agent (LangChain Agent): generates a cinematic, realistic comedic video idea; built with a detailed system prompt that ensures brand integration (e.g., Sally's Coffee) happens naturally. Insert Row (Data Table): logs each generated prompt so future videos can be referenced or reused. Sora2 POST Request: sends the generated prompt to Sora2 via Wavespeed's /text-to-video endpoint. Wait 30s + GET Sora2 Result: polls the result until data.status === "completed", and continues looping while it is still "processing". Upload Media (Blotato): uploads the finished video file. Caption Generator: creates a funny, platform-ready Instagram/TikTok caption with hashtags. Create Post (Blotato): publishes (or schedules) the video + caption.

Setup Instructions (Step-by-Step)
Import the template into n8n. Open the Video Prompt Agent and review or customize the brand name, style, and humor tone. Add your OpenAI API credentials for both prompt generation and caption generation. Add your Wavespeed/Sora2 credentials to the POST and GET nodes. Connect your Blotato credential for uploading and posting. (Optional) Replace the Data Table ID with your own table. Adjust the Schedule Trigger time to your desired posting schedule. Run once manually to confirm that the prompt is generated, the video is created, the caption is written, and the video uploads successfully. Enable the workflow, and your daily/weekly comedic autoposter is live.

Customization Ideas
Change the brand from Sally's Coffee to any business, product, or influencer brand. Modify the prompt agent to enforce specific camera styles, settings, or comedic tones. Swap posting destinations: Blotato supports multiple networks, so configure IG/TikTok/Facebook/YouTube Shorts as needed. Add approval steps: insert a Slack/Telegram "approve before posting" step. Add analytics logging: store video URLs, captions, and AI cost estimates.
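The prompt-logging step writes one row per generated prompt. A minimal sketch of the row shape (the column names here are assumptions; match them to your own Data Table):

```javascript
// Illustrative shape of the row logged to the n8n Data Table.
// Column names are assumptions; align them with your own table schema.
function buildPromptRow(prompt) {
  return {
    createdAt: new Date().toISOString(), // when the prompt was generated
    prompt,                              // the Sora2 video prompt text
    status: "generated",                 // later updated once posted
  };
}

const row = buildPromptRow("A barista narrates a latte like a nature documentary.");
console.log(row.status); // "generated"
```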
Troubleshooting
Sora video stuck in processing: increase the wait time or add another polling loop. Upload fails: ensure the media URL exists and the Blotato account has posting permissions. Caption empty: reconnect the OpenAI credential or check model availability. Posting fails: confirm your Blotato API key is valid and linked to a connected account.

Category: Marketing, AI Video, Social Media Automation
Difficulty: Beginner–Intermediate
Core Nodes: LangChain Agent, HTTP Request, Wait, Data Table, Blotato, OpenAI
Includes: System prompts, polling logic, caption generator, posting workflow