by Rahul Joshi
## Description
This workflow automates dependency update risk analysis and reporting using Jira, GPT-4o, Slack, and Google Sheets. It continuously monitors Jira for new package or dependency update tickets, uses AI to assess their risk levels (Low, Medium, High), posts structured comments back into Jira, and alerts the DevOps team in Slack, while logging historical data into Google Sheets for visibility and trend analysis. This ensures fast, data-driven decisions for dependency upgrades, improved code stability, and reduced security risks, with zero manual triage.

## What This Workflow Does (Step-by-Step)

**When Clicking "Execute Workflow"**
Manually triggers the dependency risk analysis sequence for immediate review or scheduled monitoring.

**Fetch All Active Jira Issues**
Retrieves all active Jira issues to identify tickets related to dependency or package updates. Provides the complete dataset, including summary, status, and assignee information, for AI-based risk evaluation.

**Validate Jira Query Response**
Verifies that Jira returned valid issue data before proceeding. If data exists, the workflow continues filtering dependency updates; if there is no data or an API error, it logs the failure to Google Sheets. This prevents the workflow from continuing with empty or broken datasets.

**Identify Dependency Update Issues**
Filters Jira issues to find only dependency-related tickets (keywords like "update," "bump," "package," or "library"). This ensures only relevant version update tasks are analyzed, filtering out unrelated feature or bug tickets.

**Extract Relevant Issue Metadata**
Extracts essential fields such as key, summary, priority, assignee, status, and created date for downstream AI processing. Simplifies the data payload and ensures accurate, structured analysis.

**Alert DevOps Team in Slack**
Immediately notifies the assigned DevOps engineer via Slack DM about any new dependency update issue.
Includes formatted details like summary, key, status, priority, and a direct Jira link for quick access. Ensures rapid visibility and faster response to potential risk tickets.

**AI-Powered Risk Assessment Analyzer**
Uses GPT-4o (Azure OpenAI) to intelligently evaluate each dependency update's risk level and impact summary. Considers factors such as:
- Dependency criticality
- Version change type (major/minor/patch)
- Security or EOL indicators
- Potential breaking changes

Outputs clean JSON with the fields:
`{"risk_level": "Low | Medium | High", "impact_summary": "Short human-readable explanation"}`
Helps DevOps teams prioritize updates with context.

**GPT-4o Language Model Configuration**
Configures the AI reasoning engine for precise, context-aware DevOps assessments. Optimized for a consistent technical tone and cost-efficient batch evaluation.

**Parse AI Response to Structured Data**
Safely parses the AI's JSON output, removing markdown artifacts and ensuring structure. Adds the parsed fields, risk_level and impact_summary, back to the Jira context. Includes fail-safes to prevent crashes on malformed AI output (falls back to "Unknown" and "Failed to parse").

**Post AI Risk Assessment to Jira Ticket**
Automatically posts the AI's analysis as a comment on the Jira issue:
- Displays an "AI Risk Assessment Report" header
- Shows Risk Level and Impact Summary
- Includes a checklist of next steps for developers

Creates a permanent audit trail for each dependency decision inside Jira.
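The fail-safe parsing step described above could be sketched as a small n8n Code-node function. This is a hypothetical illustration, not the template's actual code; the function name is invented, but the fallback strings mirror the description:

```javascript
// Hypothetical sketch of the "Parse AI Response" step. It strips the
// markdown code fences the model sometimes wraps around its JSON, then
// falls back to "Unknown" / "Failed to parse" on malformed output so the
// workflow never crashes mid-run.
function parseAiResponse(raw) {
  // Remove ```json ... ``` fences before parsing
  const cleaned = raw.replace(/```(?:json)?/g, "").trim();
  try {
    const data = JSON.parse(cleaned);
    return {
      risk_level: data.risk_level || "Unknown",
      impact_summary: data.impact_summary || "Failed to parse",
    };
  } catch (err) {
    // Malformed JSON must not break the Jira/Slack/Sheets steps downstream
    return { risk_level: "Unknown", impact_summary: "Failed to parse" };
  }
}
```

The same pattern (strip fences, try/catch, default values) works for any LLM node whose output is expected to be structured JSON.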
**Log Dependency Updates to Tracking Dashboard**
Appends all analyzed updates into Google Sheets, recording:
- Date
- Jira key and summary
- Risk level and impact summary
- Assignee and status

This builds a historical dependency risk database that supports:
- Trend monitoring
- Security compliance reviews
- Dependency upgrade metrics
- DevOps productivity tracking

**Log Jira Query Failures to Error Sheet**
If the Jira query fails, the workflow automatically logs the error (API/auth/network) into a centralized error sheet for troubleshooting and visibility.

## Prerequisites
- Jira Software Cloud API credentials
- Azure OpenAI (GPT-4o) access
- Slack API connection
- Google Sheets OAuth2 credentials

## Key Benefits
- Automated dependency risk assessment
- Instant Slack alerts for update visibility
- Historical tracking in Google Sheets
- Reduced manual triage and faster decision-making
- Continuous improvement in release reliability and security

## Perfect For
- DevOps and SRE teams managing large dependency graphs
- Engineering managers monitoring package updates and risks
- Security/compliance teams tracking vulnerability fix adoption
- Product teams aiming for stable CI/CD pipelines
by Yves Tkaczyk
## Use cases
Monitor a Google Drive folder, parsing PDF, DOCX, and image files into a destination folder ready for further processing (e.g., RAG ingestion, translation). Keep a processing log in Google Sheets and send Slack notifications.

## How it works
1. Trigger: watch a Google Drive folder for new and updated files.
2. Create a uniquely named destination folder, copying the input file into it.
3. Parse the file using Mistral Document, extracting content and handling non-OCRable images separately.
4. Save the data returned by Mistral Document into the destination Google Drive folder (raw JSON file, Markdown files, and images) for further processing.

## How to use
- Google Drive and Google Sheets nodes: create Google credentials with access to Google Drive and Google Sheets (read more about Google credentials). Update all Google Drive and Google Sheets nodes (14 nodes total) to use the credentials.
- Mistral node: create Mistral Cloud API credentials (read more about Mistral Cloud credentials). Update the OCR Document node to use the Mistral Cloud credentials.
- Slack nodes: create Slack OAuth2 credentials (read more about Slack OAuth2 credentials). Update the two Slack nodes, Send Success Message and Send Error Message: set the credentials and select the channel where you want to send the notifications (channels can be different for success and errors).
- Create a Google Sheets spreadsheet following the steps in Google Sheets Configuration. Ensure the spreadsheet can be accessed as Editor by the account used by the Google credentials above.
- Create a directory for input files and a directory for output folders/files. Ensure the directories can be accessed by the account used by the Google credentials.
- Update the File Created, File Updated, and Workflow Configuration nodes following the steps in the green Notes.

## Requirements
- Google account with Google API access
- Mistral Cloud account with access to a Mistral API key
- Slack account with access to a Slack client ID and client secret
- Basic n8n knowledge: understanding of triggers, expressions, and credential management

## Who's it for
Anyone building a data pipeline that ingests files to be OCRed for further processing.

## Security
All credentials are stored as n8n credentials. The only information stored in this workflow that could be considered sensitive is the Google Drive directory and Sheet IDs. These directories and the spreadsheet should be secured according to your needs.

## Need Help?
Reach out on LinkedIn or ask in the Forum!
by Omer Fayyaz
## Efficient Loop-less n8n Workflow Backup Automation to Google Drive
This workflow eliminates traditional loop-based processing entirely, delivering high performance and reliability even when the number of workflows to be processed is large.

**What Makes This Different:**
- **No SplitInBatches node** - traditional workflows use loops to process workflows one by one
- **No individual file uploads** - no more multiple Google Drive API calls
- **No batch error handling** - eliminates complex loop-iteration error management
- **All workflows processed simultaneously** - a single-operation approach
- **Single compressed archive** - one ZIP file containing all workflows
- **One Google Drive upload** - maximum efficiency, minimum API usage

**Key Benefits of the Non-Loop Architecture:**
- **3-5x faster execution** - eliminates loop overhead and multiple API calls
- **Reduced API costs** - a single upload instead of dozens of individual operations
- **Higher reliability** - fewer failure points, centralized error handling
- **Better scalability** - performance doesn't degrade with workflow count
- **Large workflow support** - efficiently handles hundreds of workflows without performance degradation
- **Easier maintenance** - simpler workflow structure, easier debugging
- **Cleaner monitoring** - a single success/failure point instead of loop iterations

## Who's it for
This template is designed for n8n users, DevOps engineers, system administrators, and IT professionals who need reliable, automated backup solutions for their n8n workflows. It's perfect for businesses and individuals who want to ensure their workflow automation investments are safely backed up with intelligent scheduling, compression, and cloud storage integration.

## How it works / What it does
This workflow creates an intelligent, automated backup system that transforms n8n workflow backups from inefficient multi-file operations into streamlined single-archive automation.
The system:
1. Triggers automatically every 4 hours, or manually on demand
2. Creates timestamped folders in Google Drive for organized backup storage
3. Retrieves all n8n workflows via the n8n API in a single operation
4. Converts workflows to JSON and aggregates binary data efficiently
5. Compresses all workflows into a single ZIP archive (eliminating the need for loops)
6. Uploads the compressed backup to Google Drive in one operation
7. Provides real-time Slack notifications for monitoring and alerting

**Key Innovation: No Loops Required** - unlike traditional backup workflows that use SplitInBatches or loops to process workflows individually, this system processes all workflows simultaneously and creates a single compressed archive, dramatically improving performance and reliability.

## How to set up
1. **Configure Google Drive API Credentials**: set up Google Drive OAuth2 API credentials, ensure the service account has access to create folders and upload files, and update the parent folder ID where backup folders will be created.
2. **Configure n8n API Access**: set up internal n8n API credentials for workflow retrieval, ensure the API has permission to read all workflows, and configure retry settings for reliability.
3. **Set up Slack Notifications**: configure Slack API credentials for the notification channel, update the channel ID where backup notifications will be sent, and customize notification messages as needed.
4. **Schedule Configuration**: the workflow automatically runs every 4 hours; manual execution is available for immediate backups. Adjust the schedule in the Schedule Trigger node as needed.
5.
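The aggregation step (convert workflows to JSON and attach all of them as binaries on one item) can be sketched as an n8n-style Code-node helper. This is a hypothetical illustration under assumed field names, not the template's actual code:

```javascript
// Hypothetical sketch of the aggregation step: each workflow returned by
// the n8n API is serialized to JSON and attached as a named binary file on
// a single item, so one Compression (ZIP) node and one Google Drive upload
// can handle everything -- no SplitInBatches loop needed.
function aggregateWorkflows(workflows) {
  const binary = {};
  workflows.forEach((wf, i) => {
    const json = JSON.stringify(wf, null, 2);
    binary[`file_${i}`] = {
      // n8n binary data is base64-encoded
      data: Buffer.from(json).toString("base64"),
      fileName: `${(wf.name || "workflow").replace(/[^\w-]+/g, "_")}.json`,
      mimeType: "application/json",
    };
  });
  // One item carrying all files is what makes the single-archive upload work
  return [{ json: { count: workflows.length }, binary }];
}
```

Returning a single item with many binary properties is the core of the "no loops" design: downstream nodes run once, regardless of how many workflows exist.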
**Test the Integration**: run a manual backup to verify all components work correctly, check Google Drive for the created backup folder and ZIP file, and verify Slack notifications are received.

## Requirements
- **n8n instance** (self-hosted or cloud) with API access
- **Google Drive account** with API access and sufficient storage
- **Slack workspace** for notifications (optional but recommended)
- **n8n workflows** that need regular backup protection

## How to customize the workflow
**Modify Backup Frequency**
- Adjust the Schedule Trigger node for different intervals (hourly, daily, weekly)
- Add multiple schedule triggers for different backup types
- Implement conditional scheduling based on workflow changes

**Enhance Storage Strategy**
- Add multiple Google Drive accounts for redundancy
- Implement backup rotation and retention policies
- Add compression options (ZIP, TAR, 7Z) for different use cases

**Expand Notification System**
- Add email notifications for critical backup events
- Integrate with monitoring systems (PagerDuty, OpsGenie)
- Add backup success/failure metrics and reporting

**Security Enhancements**
- Implement backup encryption before upload
- Add backup verification and integrity checks
- Set up access logging and audit trails

**Performance Optimizations**
- Add parallel processing for large workflow collections
- Implement incremental backup strategies
- Add backup size monitoring and alerts

## Key Features
- **Zero-loop architecture** - processes all workflows simultaneously without batch processing
- **Intelligent compression** - a single ZIP archive instead of multiple individual files
- **Automated scheduling** - runs every 4 hours with manual override capability
- **Organized storage** - timestamped folders with clear naming conventions
- **Real-time monitoring** - Slack notifications for all backup events
- **Error handling** - centralized error management with graceful failure handling
- **Scalable design** - handles any number of workflows efficiently

## Technical Architecture Highlights
**Eliminated Inefficiencies**
- **No SplitInBatches node** - replaced
with direct workflow processing
- **No individual file uploads** - a single compressed-archive upload
- **No loop iterations** - all workflows processed in one operation
- **No batch error handling** - centralized error management

**Performance Improvements**
- **Faster execution** - eliminates loop overhead and multiple API calls
- **Reduced API quota usage** - a single Google Drive upload per backup
- **Better resource utilization** - efficient memory and processing usage
- **Improved reliability** - fewer points of failure in the workflow

**Data Flow Optimization**
- **Parallel processing** - folder creation and workflow retrieval happen simultaneously
- **Efficient aggregation** - a Code node processes all binaries at once
- **Smart compression** - a single ZIP with all workflows included
- **Streamlined upload** - one file transfer instead of multiple operations

## Use Cases
- **Production n8n instances** requiring reliable backup protection
- **Development teams** needing workflow version control and recovery
- **DevOps automation** requiring disaster recovery capabilities
- **Business continuity** planning for critical automation workflows
- **Compliance requirements** for workflow documentation and backup
- **Team collaboration** with shared workflow backup access

## Business Value
- **Risk mitigation** - protects valuable automation investments
- **Operational efficiency** - faster, more reliable backup processes
- **Cost reduction** - lower storage costs and API usage
- **Compliance support** - organized backup records for audits
- **Team productivity** - reduced backup management overhead
- **Scalability** - handles growth without performance degradation

This template revolutionizes n8n workflow backup by eliminating the complexity and inefficiency of traditional loop-based approaches, providing a robust, scalable solution that grows with your automation needs while maintaining a high level of reliability and performance.
by WeblineIndia
## Smart Contract Event Monitor (Web3)
This workflow automatically monitors the Ethereum blockchain, extracts USDT transfer events, filters large-value transactions, stores them in Airtable, and sends a clean daily summary alert to Slack.

It checks the latest Ethereum block every day and identifies high-value USDT transfers. It fetches on-chain logs using Alchemy, extracts sender/receiver/value details, filters transactions above a threshold, stores them in Airtable, and finally sends a single clear summary alert to Slack.

You receive:
- A daily blockchain check (automated)
- Airtable tracking of all high-value USDT transfers
- A Slack alert summarizing the count plus the largest transfer

Ideal for teams wanting simple, automated visibility into suspicious or large crypto movements without manually scanning the blockchain.

## Quick Start - Implementation Steps
1. Add your Alchemy Ethereum Mainnet API URL in both HTTP nodes.
2. Connect and configure your Airtable base and table.
3. Connect your Slack credentials and set the channel for alerts.
4. Adjust the value threshold in the IF node (default: 1,000,000,000).
5. Activate the workflow; daily monitoring begins instantly.

## What It Does
This workflow automates detection of high-value USDT transfers on Ethereum:
1. Fetches the latest block number using Alchemy.
2. Retrieves all USDT Transfer logs from that block.
3. Extracts structured data: sender, receiver, amount, contract, block number, transaction hash.
4. Filters only transactions above a configurable threshold.
5. Saves each high-value transaction into Airtable for record keeping.
6. Generates a summary including the total number of high-value transfers and the single largest transfer.
7. Sends one clean alert message to Slack.

This ensures visibility of suspicious or large fund movements with no repeated alerts.
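The hex-to-readable extraction in step 3 can be sketched in a few lines. This is an illustrative decoder for a raw ERC-20 Transfer log as returned by `eth_getLogs`; the output field names are assumptions, not taken from the actual workflow:

```javascript
// Hypothetical sketch of the "Extract Transaction Details" step: decode a
// raw ERC-20 Transfer log into readable fields. In a Transfer log,
// topics[1]/topics[2] are the 32-byte-padded from/to addresses and `data`
// holds the value as a 32-byte hex number. USDT uses 6 decimals.
function decodeTransferLog(log) {
  // Keep the last 40 hex chars of each padded topic to recover the address
  const from = "0x" + log.topics[1].slice(-40);
  const to = "0x" + log.topics[2].slice(-40);
  const raw = BigInt(log.data);
  return {
    from,
    to,
    value: Number(raw) / 1e6, // human-readable USDT amount
    contract: log.address,
    blockNumber: parseInt(log.blockNumber, 16),
    txHash: log.transactionHash,
  };
}
```

If the decoded values come out as zero, the hex parsing is usually the culprit, which is exactly the failure mode listed in the troubleshooting table below.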
## Who's It For
This workflow is ideal for:
- Crypto analytics teams
- Blockchain monitoring platforms
- Compliance teams tracking high-value activity
- Web3 product teams
- Developers needing automated USDT transfer tracking
- Anyone monitoring whale movements or suspicious transactions

## Requirements to Use This Workflow
- **n8n instance** (cloud or self-hosted)
- **Alchemy API URL** (Ethereum Mainnet)
- **Airtable base** plus a Personal Access Token
- **Slack workspace** with API permissions
- Basic understanding of Ethereum logs, hex values, and JSON data

## How It Works
1. **Daily Check** - the workflow runs automatically at your set time.
2. **Get Latest Block Number** - fetches the newest Ethereum block from Alchemy.
3. **Fetch USDT Logs** - queries all Transfer events (ERC-20 standard).
4. **Extract Transaction Details** - converts hex into readable data.
5. **Filter High-Value Transactions** - keeps only large-value transfers.
6. **Save to Airtable** - adds each transfer record to your database.
7. **Generate Summary** - finds the largest transfer and the total count.
8. **Send Slack Alert** - notifies your team with one clean summary.

## Setup Steps
1. Import the provided n8n JSON file.
2. Open the Get Latest Block and Fetch Logs HTTP nodes and add your Alchemy API URL.
3. Confirm the USDT contract address (already included): 0xdAC17F958D2ee523a2206206994597C13D831ec7
4. Connect your Airtable account and map: Contract, From Address, To Address, Value, Block Number, txHash.
5. Connect Slack API credentials and choose your channel.
6. Change the threshold limit in the IF node if needed (default: 1B).
7. Activate the workflow - done!
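The filter-and-summarize logic (steps 5 and 7) can be sketched as a small helper. This is a hypothetical illustration; the field names and the assumption that the IF-node threshold compares raw token units (1,000,000,000 raw units = 1,000 USDT at 6 decimals) are mine, not the workflow's:

```javascript
// Hypothetical sketch of "Filter High-Value Transactions" + "Generate
// Summary": keep transfers above the threshold and find the largest one
// for the single Slack summary message.
const THRESHOLD = 1_000_000_000; // assumed to be raw USDT units (6 decimals)

function summarizeTransfers(transfers) {
  const highValue = transfers.filter((t) => t.rawValue > THRESHOLD);
  const largest = highValue.reduce(
    (max, t) => (t.rawValue > max.rawValue ? t : max),
    { rawValue: 0 }
  );
  return { count: highValue.length, largest };
}
```

Producing one summary object here, rather than one item per transfer, is what keeps the Slack step down to a single "no repeated alerts" message.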
## How To Customize Nodes
**Customize Value Threshold** - modify the IF node:
- Increase or decrease the minimum transfer value
- Change the logic for smaller- or larger-scale whale tracking

**Customize Airtable Storage** - you can add fields like:
- Timestamp
- Token symbol
- USD price (using a price API)
- Transaction status
- Risk classification

**Customize Slack Alerts** - you may add:
- Emojis
- Mentions (@channel, @team)
- Links to Etherscan
- Highlighted blocks for critical transfers

**Customize Web3 Provider** - replace Alchemy with:
- Infura
- QuickNode
- A public RPC (not recommended for reliability)

## Add-Ons (Optional Enhancements)
You can extend this workflow to:
- Track multiple ERC-20 tokens
- Process several blocks instead of just the latest
- Add price conversion (USDT to USD value)
- Detect transfers to suspicious wallets
- Generate daily or weekly summary reports in Slack
- Create a dashboard using Airtable Interfaces
- Add OpenAI-based insights (large spikes, suspicious patterns, etc.)

## Use Case Examples
1. **Whale Tracking** - detect large USDT movements (>1M or >5M).
2. **Compliance Monitoring** - log high-value transfers in Airtable for audits.
3. **Real-Time Alerts** - Slack alerts tell your team instantly about big movements.
4. **On-Chain Analytics** - automate structured extraction of Ethereum logs.
5. **Exchange Monitoring** - detect large inflows/outflows to known addresses.

## Troubleshooting Guide
| Issue | Possible Cause | Solution |
|---|---|---|
| No data in Airtable | Logs returned empty | Ensure USDT transfer events exist in that block |
| Values are "zero" | Hex parsing failed | Check the extract-code logic |
| Slack alert not sent | Invalid credentials | Update the Slack API key |
| Airtable error | Wrong field names | Match Airtable column names exactly |
| HTTP request fails | Wrong RPC URL | Re-check the Alchemy API key |
| Workflow not running | Schedule disabled | Enable the "Daily Check" node |

## Need Help?
If you need help customizing or extending this workflow (adding multi-token monitoring, setting up dashboards, improving alerts, or scaling it for production), our n8n workflow developers at WeblineIndia can assist you with advanced automation.
by Elay Guez
## AI-Powered Web Research in Google Sheets with Bright Data

### Overview
Turn any Google Sheets cell into an intelligent web scraper. Type =BRIGHTDATA("cell", "search prompt") and get an AI-filtered result from virtually any website in about 20 seconds.

What happens automatically:
1. AI optimizes your search query
2. Bright Data scrapes the web (bypassing bot detection)
3. AI analyzes and filters the result
4. Clean data is returned directly to your cell
5. The whole cycle completes in under 25 seconds

Cost: ~$0.02-0.05 per search | Time saved: 3-5 minutes per search

### Who's it for
- Market researchers needing competitive intelligence
- E-commerce teams tracking prices
- Sales teams doing lead prospecting
- SEO specialists gathering content research
- Real estate agents monitoring listings
- Anyone tired of manual copy-paste

### How it works
1. Webhook Call - the Google Sheets function sends a POST request
2. Data Preparation - organizes the input structure
3. AI Query Optimization - GPT-4.1 Mini refines the search query
4. Web Scraping - Bright Data fetches data while bypassing blocks
5. AI Analysis - GPT-4o Mini filters and summarizes the result
6. Response - plain text is returned to your cell
7. Logging - logs are updated for monitoring

### Setup Instructions
Time to deploy: 20 minutes

**Requirements**
- n8n instance with a public URL
- Bright Data account + API key
- OpenAI API key
- Google account for Apps Script

**Part 1: n8n Workflow Setup**
1. Import this template into your n8n instance.
2. Configure the Webhook node: copy your webhook URL (https://n8n.yourdomain.com/webhook/brightdata-search), set authentication to Header Auth, and set the API key: 12312346 (or create your own).
3. Add OpenAI credentials to the AI nodes.
4. Configure Bright Data: add API credentials.
5. Configure the output language: manually edit the "Set Variables" node.
6. Test the workflow with a manual execution.
7. Activate the workflow.

**Part 2: Google Sheets Function**
1. Open your Google Sheet → Extensions → Apps Script.
2. Paste this code:

```javascript
function BRIGHTDATA(prompt, source) {
  if (!prompt || prompt === "") {
    return "❌ Must enter prompt";
  }
  source = source || "google";

  // Update with YOUR webhook URL
  const N8N_WEBHOOK_URL = "https://your-n8n-domain.com/webhook/brightdata-search";
  // Update with YOUR password
  const API_KEY = "12312346";

  let spreadsheetId, sheetName, cellAddress;
  try {
    const sheet = SpreadsheetApp.getActiveSheet();
    const activeCell = sheet.getActiveCell();
    spreadsheetId = SpreadsheetApp.getActiveSpreadsheet().getId();
    sheetName = sheet.getName();
    cellAddress = activeCell.getA1Notation();
  } catch (e) {
    return "❌ Cannot identify cell";
  }

  const payload = {
    prompt: prompt,
    source: source.toLowerCase(),
    context: {
      spreadsheetId: spreadsheetId,
      sheetName: sheetName,
      cellAddress: cellAddress,
      timestamp: new Date().toISOString()
    }
  };

  const options = {
    method: "post",
    contentType: "application/json",
    payload: JSON.stringify(payload),
    muteHttpExceptions: true,
    headers: {
      "Accept": "text/plain",
      "key": API_KEY
    }
  };

  try {
    const response = UrlFetchApp.fetch(N8N_WEBHOOK_URL, options);
    const responseCode = response.getResponseCode();
    if (responseCode !== 200) {
      Logger.log("Error response: " + response.getContentText());
      return "❌ Error " + responseCode;
    }
    return response.getContentText();
  } catch (error) {
    Logger.log("Exception: " + error.toString());
    return "❌ Connection error: " + error.toString();
  }
}

function doGet(e) {
  return ContentService.createTextOutput(JSON.stringify({
    status: "alive",
    message: "Apps Script is running",
    timestamp: new Date().toISOString()
  })).setMimeType(ContentService.MimeType.JSON);
}
```

3. Update N8N_WEBHOOK_URL with your webhook.
4. Update API_KEY with your password.
5. Save (Ctrl+S / Cmd+S) - important!
6. Close the Apps Script editor.

### Usage Examples
=BRIGHTDATA("C3", "What is the current price of the product?")
=BRIGHTDATA("D30", "What is the size of this company?")
=BRIGHTDATA("A4", "Is this company hiring developers?")

### Customization
**Easy Tweaks**
- **AI Models** - switch to GPT-4o for better optimization
- **Response Format** - modify the prompt for specific outputs
- **Speed** - optimize AI prompts to reduce time
- **Language** - change the prompts for any language

**Advanced Options**
- Implement rate limiting
- Add data validation
- Create an async mode for long queries
- Add Slack notifications

### Pro Tips
- **Be Specific** - "What is the iPhone 15 Pro 256GB US price?" beats "What is the iPhone price?"
- **Speed Matters** - keep prompts concise (30-second timeout limit)
- **Monitor Costs** - track Bright Data usage
- **Debug** - check workflow logs for errors

### Important Notes
- **Timeout:** 30-second Google Sheets limit (aim for under 20s)
- **Plain Text Only:** no JSON responses
- **Costs:** monitor Bright Data usage at console.brightdata.com
- **Security:** keep API keys secret
- **No Browser Storage:** don't use localStorage/sessionStorage

### Troubleshooting
| Error | Solution |
|-------|----------|
| "Exceeded maximum execution time" | Optimize AI prompts or use async mode |
| "Could not fetch data" | Verify Bright Data credentials |
| Empty cell | Check n8n logs for AI parsing issues |
| Broken characters | Verify UTF-8 encoding in the webhook node |

### Resources
- Bright Data API Docs
- n8n Webhook Documentation
- Google Apps Script Reference

Built with ❤️ by Elay Guez
by takuma
This workflow automates reputation management for physical stores (restaurants, retail, clinics) by monitoring Google Maps reviews, analyzing them with AI, and drafting professional replies. It acts as a 24/7 customer support assistant, ensuring you never miss a negative review and saving hours of manual writing time.

## Who is this for?
- **Store Managers & Owners:** keep track of customer sentiment without manually checking Google Maps every day.
- **Marketing Agencies:** automate local SEO reporting and response drafting for multiple clients.
- **Customer Support Teams:** get instant alerts for negative feedback to resolve issues quickly.

## How it works
1. **Schedule:** runs every 24 hours (customizable) to fetch the latest data.
2. **Scrape:** uses Apify to retrieve the latest reviews from a specific Google Maps URL.
3. **Filter:** checks the Google Sheets database to identify only new reviews and avoid duplicates.
4. **AI Analysis:** an AI Agent (via OpenRouter/OpenAI) analyzes the review text to generate a short summary and draft a polite, context-aware reply based on the star rating (e.g., an apology for low stars, gratitude for high stars).
5. **Alert:** sends a Slack notification. A low rating (<4 stars) alerts a specific channel (e.g., #customer-support) with a warning; a high rating alerts a general channel (e.g., #wins) to celebrate.
6. **Save:** appends the review details, AI summary, and draft reply to the Google Sheet.

## Requirements
- **n8n:** cloud or self-hosted (v1.0+).
- **Apify account:** to run the **Google Maps Reviews Scraper**.
- **Google Cloud Platform:** with the Google Sheets API enabled.
- **Slack workspace:** a webhook URL or OAuth connection.
- **OpenRouter (or OpenAI) API key:** for the LLM generation.

## How to set up
1. **Google Sheets:** create a new sheet with the following headers in the first row: reviewId, publishedAt, reviewerName, stars, text, ai_summary, ai_reply, reviewUrl, output, publishedAt date.
2. **Configure Credentials:** set up your accounts for Google Sheets, Apify, Slack, and OpenRouter within n8n.
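The duplicate-avoidance filter in step 3 amounts to a set-membership check against the reviewId column already stored in the sheet. A minimal sketch, with illustrative names rather than the workflow's actual code:

```javascript
// Hypothetical sketch of the "Filter" step: given the reviews scraped from
// Apify and the reviewId values already logged in the Google Sheet, keep
// only reviews that have not been processed before.
function filterNewReviews(scrapedReviews, loggedIds) {
  const seen = new Set(loggedIds); // Set lookup is O(1) per review
  return scrapedReviews.filter((review) => !seen.has(review.reviewId));
}
```

Deduplicating on a stable ID like reviewId (rather than on review text or date) is what prevents re-alerting and re-replying when the scraper returns overlapping batches.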
Edit the "CONFIG" Node: MAPS_URL: Paste the full Google Maps link to your store. SHEET_ID: Paste the ID found in your Google Sheet URL. SHOP_NAME: Your store's name. Slack Nodes: Select the appropriate channels for positive and negative alerts. How to customize Change the AI Persona:* Open the *AI Agent** node and modify the "System Message" to match your brand's tone of voice (e.g., casual, formal, or witty). Adjust Alert Thresholds:* Edit the *If Rating < 4** node to change the criteria for what constitutes a "negative" review (e.g., strictly < 3 stars). Multi-Store Support:** You can loop this workflow over a list of URLs to manage multiple locations in a single execution.
by Adam Gałęcki
## How it works
This workflow automates comprehensive SEO reporting by:
1. Extracting keyword rankings and page performance from Google Search Console.
2. Gathering organic reach metrics from Google Analytics.
3. Analyzing internal and external article links.
4. Tracking keyword position changes (gains and losses).
5. Formatting and importing all data into Google Sheets reports.

## Set up steps
1. **Connect Google Services:** authenticate Google Search Console, Google Analytics, and Google Sheets OAuth2 credentials.
2. **Configure Source Sheet:** set up a data-source Google Sheet with the article URLs to analyze.
3. **Set Report Sheet:** create or specify destination Google Sheets for reports.
4. **Update Date Ranges:** modify the date parameters in the GSC and GA nodes for your reporting period.
5. **Customize Filters:** adjust keyword filters and row limits based on your needs.
6. **Test Individual Sections:** each reporting section (keywords, pages, articles, position changes) can be tested independently.

The workflow includes separate flows for:
- Keyword ranking (top 1000)
- Page ranking analysis
- Organic reach reporting
- Internal article link analysis
- External article link analysis
- Position gain/loss tracking
by isaWOW
Submit a screen recording URL, customer name, and bug type through a simple web form. The workflow automatically scans the recording using WayinVideo AI to pinpoint the exact moment the bug occurs, then uses GPT-4o-mini to write a structured support ticket with priority, steps to reproduce, and a fix suggestion. The completed ticket is saved instantly as a new row in your Google Sheet, ready for your dev team. Built for support agents, QA teams, and SaaS companies who want to triage bugs faster without manual video scrubbing.

## What This Workflow Does
- **AI bug detection** - sends the screen recording to WayinVideo, which scans the video and finds the exact timestamp where the bug occurs
- **Smart polling loop** - automatically waits 45 seconds, then checks for results, retrying until WayinVideo returns the bug moments
- **Structured ticket generation** - GPT-4o-mini reads the bug-moment data and writes a full support ticket with priority level, steps to reproduce, expected vs. actual behaviour, and a suggested fix
- **Auto-triage and assignment** - the AI decides severity (Critical to Low) and which team should own it (Backend, Frontend, QA, DevOps, or Product)
- **Google Sheets logging** - every ticket is saved with customer name, bug type, recording URL, timestamp, ticket content, status, and submission date
- **Web form trigger** - a built-in form lets any support agent submit a bug report without touching n8n directly

## Setup Requirements
Tools you'll need:
- Active n8n instance (self-hosted or n8n Cloud)
- WayinVideo account + API key (for AI video bug detection)
- OpenAI API key (GPT-4o-mini for ticket generation)
- Google account with Google Sheets OAuth access

Estimated setup time: 10-15 minutes

## Step-by-Step Setup
1. **Get your WayinVideo API key** - sign in to your WayinVideo account, go to your API settings, and copy your key.
2. **Add the WayinVideo key to the Find Bug Moments step** - open the node called 2.
WayinVideo → Find Bug Moments and replace YOUR_WAYINVIDEO_API_KEY in the Authorization header with your actual key.
3. **Add the WayinVideo key to the Get Bug Moments step** - open the node called 4. WayinVideo → Get Bug Moments and replace YOUR_WAYINVIDEO_API_KEY in the Authorization header with your actual key.

> ⚠️ Your WayinVideo API key appears in two nodes. Replace it in both 2. WayinVideo → Find Bug Moments and 4. WayinVideo → Get Bug Moments, or the workflow will fail.

4. **Add your OpenAI API key** - in n8n, go to Credentials and create a new OpenAI credential. Paste your API key there, then connect that credential to the node called 6a. OpenAI → Chat Model (GPT-4o-mini).
5. **Create your Google Sheet** - make a new Google Sheet with a tab named exactly Bug Tickets. Add these column headers in row 1:

| Customer Name | Bug Type | Recording URL | Bug Moments Found | Top Bug Timestamp | Bug Ticket | Status | Date Submitted |

6. **Connect your Google account** - in n8n, create a Google Sheets OAuth2 credential and connect your Google account. Apply that credential to the node called 7. Google Sheets → Save Bug Ticket.
7. **Add your Google Sheet ID** - open the node 7. Google Sheets → Save Bug Ticket and replace YOUR_GOOGLE_SHEET_ID with your actual Sheet ID. You can find it in the Google Sheet URL: it is the long string of characters between /d/ and /edit.
8. **Activate the workflow** - toggle the workflow to Active. Open the form URL from the 1. Form → Bug Recording + Details node and submit a test bug report to confirm everything is working.

## How It Works (Step by Step)

**Step 1 - Web Form (Bug Recording + Details)**
A support agent opens a web form and fills in three fields: the screen recording URL, the customer's name, and the type of bug (for example: Login Error, Payment Failure, or App Crash). Submitting the form kicks off the entire workflow automatically.
**Step 2 - WayinVideo - Find Bug Moments**

The recording URL is sent to the WayinVideo API with an instruction to search for moments matching the reported bug type. WayinVideo scans the video using AI and starts building a list of the most relevant bug moments, including timestamps and relevance scores. The API returns a job ID that is used in the next steps.

**Step 3 - Wait 45 Seconds**

The workflow pauses for 45 seconds to give WayinVideo time to process the video before checking for results.

**Step 4 - WayinVideo - Get Bug Moments**

Using the job ID from Step 2, the workflow requests the results from WayinVideo. This returns a list of clips, each with a title, description, start and end timestamp, and a relevance score.

**Step 5 - Check: Bug Moments Found?**

The workflow checks whether WayinVideo has returned any bug moment clips yet.

- **If yes**: it moves forward to generate the ticket.
- **If no**: it loops back to Step 3, waits another 45 seconds, and checks again.

> ⚠️ If WayinVideo never returns results (for example, if the URL is broken or the video format is unsupported), this loop will repeat indefinitely. Consider adding a retry limit to avoid runaway executions.

**Step 6 - AI Agent - Generate Bug Ticket**

GPT-4o-mini receives the customer details from the form and the bug moment data from WayinVideo. It writes a complete structured support ticket covering: a one-sentence bug summary, priority level, bug category, exact video timestamp, steps to reproduce, expected vs actual behaviour, which team to assign it to, and a suggested fix.

**Step 7 - Google Sheets - Save Bug Ticket**

The completed ticket is appended as a new row in your Google Sheet. The row includes the customer name, bug type, recording URL, number of bug moments found, the top bug timestamp, the full ticket text, a status of "Open", and the submission date and time.
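To make the Step 2 handoff concrete, here is an illustrative sketch of the request the Find Bug Moments node assembles and the job-ID extraction that Step 4 depends on. Only the Authorization header and the limit/target_lang options come from this template; the endpoint URL, the query wording, and the response field names are assumptions, not WayinVideo's documented API.

```python
API_KEY = "YOUR_WAYINVIDEO_API_KEY"  # the same key goes in both WayinVideo nodes

def build_find_bug_moments_request(recording_url: str, bug_type: str) -> dict:
    """Assemble the Step 2 'Find Bug Moments' call (shape is illustrative)."""
    return {
        "url": "https://api.wayinvideo.example/find-moments",  # hypothetical endpoint
        "headers": {
            "Authorization": API_KEY,
            "Content-Type": "application/json",
        },
        "body": {
            "video_url": recording_url,
            "query": f"moments where a {bug_type} bug occurs",  # hypothetical key
            "limit": 3,           # template default; raise to 5 for more moments
            "target_lang": "en",  # change for non-English recordings
        },
    }

def extract_job_id(find_response: dict) -> str:
    """Step 2 returns a job ID that Step 4 uses to poll for results."""
    return find_response["job_id"]  # assumed field name
```

The job ID is the only state carried between Step 2 and Step 4, which is why the Wait/Check loop can retry indefinitely without re-submitting the video.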
## Key Features

- ✅ **Exact bug timestamp detection**: WayinVideo pinpoints the precise moment in the recording where the bug occurs, so your dev team doesn't have to watch the whole video
- ✅ **Auto-priority assignment**: GPT-4o-mini assigns Critical, High, Medium, or Low priority based on the type of bug reported
- ✅ **Auto-team routing**: The AI decides whether the ticket belongs to Backend Dev, Frontend Dev, QA, DevOps, or Product, with no manual triage needed
- ✅ **Smart retry loop**: The workflow polls WayinVideo automatically every 45 seconds until results are ready, with no manual intervention required
- ✅ **Structured ticket format**: Every ticket follows the same format with all required fields, making it consistent and ready to import into any project management tool
- ✅ **Zero-code form**: Support agents submit bug reports through a hosted web form, so no n8n access is needed
- ✅ **Full audit trail in Sheets**: Every ticket is logged with submission date, recording URL, and status so nothing falls through the cracks

## Customisation Options

- **Capture more bug moments**: In the node 2. WayinVideo - Find Bug Moments, change "limit": 3 to "limit": 5 to detect up to five bug moments per recording instead of three.
- **Use a more powerful AI model**: In the node 6a. OpenAI - Chat Model (GPT-4o-mini), switch from gpt-4o-mini to gpt-4o for higher-quality ticket writing on complex or ambiguous bugs.
- **Send instant Slack alerts**: Add a Slack node after 7. Google Sheets - Save Bug Ticket to notify your dev channel as soon as a new ticket is saved, including the priority level and timestamp.
- **Add a max-retry counter**: To prevent the polling loop from running forever, add a counter variable that increments each loop and an extra condition in 5. If - Bug Moments Found? to stop after a set number of attempts (for example, 10 retries).
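The bounded polling pattern suggested above can be sketched as a single loop. In n8n this is a Wait node plus an If node with a counter; the condensed sketch below uses a stand-in `fetch_bug_moments` function for the 4. WayinVideo - Get Bug Moments call, so the retry-limit logic is easy to see.

```python
import time

def poll_for_bug_moments(fetch_bug_moments, job_id, max_retries=10, wait_seconds=45):
    """Poll until clips arrive or the retry budget is spent."""
    for attempt in range(1, max_retries + 1):
        clips = fetch_bug_moments(job_id)   # "4. WayinVideo - Get Bug Moments"
        if clips:                           # "5. If - Bug Moments Found?" passes
            return clips
        time.sleep(wait_seconds)            # "3. Wait 45 Seconds"
    raise TimeoutError(f"No bug moments after {max_retries} attempts")
```

Raising an error after the budget is spent is what stops the runaway executions mentioned in the warning: a failed run is visible in the execution log instead of looping forever.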
- **Push tickets to Jira or Linear**: Add an HTTP Request node after the Google Sheets node (or in place of it) that calls the Jira or Linear API to create an issue automatically from the generated ticket content.
- **Support multiple languages**: Change "target_lang": "en" in 2. WayinVideo - Find Bug Moments to another language code (for example "fr" or "de") to handle recordings in other languages.

## Support

Need help setting this up, or want a custom version built for your team or agency?

- Email: info@isawow.com
- Website: https://isawow.com/
by Ranjan Dailata
This n8n workflow automates domain-level keyword ranking analysis and enriches raw SEO metrics with AI-generated summaries. It combines structured keyword data from SE Ranking with natural-language insights produced by OpenAI, turning complex SERP datasets into actionable SEO intelligence.

## Who this is for

This workflow is designed for:

- SEO engineers and technical marketers
- Growth teams running programmatic SEO
- Agencies managing multi-domain keyword analysis
- Product teams building SEO analytics pipelines
- Developers using n8n for data enrichment and reporting

If you work with keyword data and need machine-readable output plus human-readable insights, this workflow is for you.

## What this workflow does

- Accepts a target domain or URL, region, keyword type (organic/paid), and filters
- Fetches keyword ranking data from the SE Ranking Domain Keywords API
- Extracts metrics such as:
  - Keyword positions
  - Search volume & CPC
  - Competition & difficulty
  - SERP features & search intent
  - Traffic estimates
- Uses OpenAI GPT-4.1-mini to generate:
  - A comprehensive narrative summary
  - A concise abstract overview
- Merges raw data and AI insights into a single enriched dataset
- Exports the final output as structured JSON for downstream use

## Setup

Prerequisites:

- Active SE Ranking API access
- OpenAI API key with GPT-4.1-mini enabled
- Running n8n instance (self-hosted or cloud)
- Basic understanding of keyword ranking metrics

Configuration steps:

1. If you are new to SE Ranking, sign up on seranking.com.
2. Import the workflow JSON into n8n.
3. Configure credentials:
   - SE Ranking, using HTTP Header Authentication. The header value must be the word Token, followed by a space, followed by your SE Ranking API key.
   - OpenAI API (GPT-4.1-mini model)
4. Open the Set the Input Fields node and define:
   - target_site (domain or URL)
   - source (region, e.g. us)
   - type (organic or paid)
   - limit, filters, and requested columns
5. Verify the output as per the export data handling.
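The header-authentication shape described in the configuration steps can be sketched as follows. The `Token <key>` format comes from the setup note above; the base URL and parameter names are illustrative assumptions, so check SE Ranking's API reference for the exact Domain Keywords route and query fields.

```python
SE_RANKING_API_KEY = "YOUR_SE_RANKING_API_KEY"

def build_se_ranking_request(target_site: str, source: str = "us",
                             kw_type: str = "organic", limit: int = 100) -> dict:
    """Assemble headers and parameters for the Domain Keywords call."""
    return {
        "url": "https://api.seranking.com/",  # hypothetical base URL
        # Header value: the word "Token", a space, then the API key.
        "headers": {"Authorization": f"Token {SE_RANKING_API_KEY}"},
        "params": {
            "target_site": target_site,  # domain or URL to analyze
            "source": source,            # region, e.g. "us"
            "type": kw_type,             # "organic" or "paid"
            "limit": limit,
        },
    }
```

Getting the `Token ` prefix wrong (for example, sending the bare key) is the most common cause of 401 responses with header-auth APIs of this style, which is why the setup step calls it out explicitly.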
Export data handling:

- Converts enriched SEO results into structured JSON output
- Creates binary data to support file-based exports
- Converts processed data into CSV format for easy analysis
- Inserts or updates records in Google Sheets for reporting
- Ensures data consistency across all export destinations
- Enables downstream automation, dashboards, and audits

Click Execute Workflow to run it.

## How to customize this workflow to your needs

You can easily adapt this workflow by:

- Switching between organic and paid keyword analysis
- Changing regions for international SEO tracking
- Modifying requested keyword columns and SERP filters
- Customizing the OpenAI prompt to generate:
  - SEO action items
  - Competitive insights
  - Executive summaries
- Replacing file export with:
  - Databases
  - Dashboards
  - Slack/Email alerts
  - Data warehouses

## Summary

This n8n template delivers a production-ready SEO analytics pipeline that bridges structured SERP data with AI-powered interpretation. By combining SE Ranking's keyword intelligence with OpenAI-driven summarization, it enables faster insights, better reporting, and scalable SEO decision-making without manual analysis.
by RenderIO
## Who is this for

Content creators, YouTubers, and social media managers who want to repurpose long-form videos into short clips without doing it manually. Works on self-hosted n8n instances.

## What it does

Monitors a Google Drive folder for new videos. When a video appears, the workflow downloads it, extracts the audio, transcribes it using Whisper, and sends the transcript to OpenAI to identify the best highlight moments. Each selected clip is then rendered in three aspect ratios (9:16 for TikTok, 9:16 for Reels, 1:1 for Square) using cloud-based FFmpeg through RenderIO. The finished clips are uploaded back to Google Drive, and every run is logged to a Google Sheet.

## How it works

1. Watch Drive Folder polls your source folder every minute and triggers when a new video file is detected.
2. Set Config holds all tunable settings in one place: clip count, folder IDs, sheet IDs, and LLM model.
3. The video is downloaded from Google Drive and uploaded to RenderIO for processing.
4. Extract Audio runs an FFmpeg command to pull the audio track from the video.
5. The audio is sent to Whisper for transcription. Both TXT and SRT transcript files are saved to Google Drive.
6. Pick Clips sends the transcript to OpenAI, which returns timestamped highlight suggestions.
7. Validate Clips checks that all timestamps and durations are valid before rendering.
8. Each clip is rendered in three formats through RenderIO, with separate FFmpeg commands for each aspect ratio.
9. All rendered clips are downloaded and uploaded to a dedicated output folder in Google Drive.
10. Append Clip Row logs each clip to a Google Sheet, and Append Run Summary records the overall processing stats.
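To illustrate what the per-aspect-ratio FFmpeg commands in step 8 look like, here is a minimal sketch of a command builder. The crop/scale filters and 1080-pixel targets are illustrative assumptions; the actual Build Commands for Clip node may use different resolutions, bitrates, or filters (FFmpeg's `crop=w:h` centers the crop by default).

```python
def build_clip_command(src: str, start: float, duration: float,
                       aspect: str, out: str) -> list:
    """Return an ffmpeg argument list that trims a clip and reframes it."""
    filters = {
        # Centered 9:16 window scaled to 1080x1920 (TikTok / Reels).
        "9:16": "crop=ih*9/16:ih,scale=1080:1920",
        # Centered square scaled to 1080x1080.
        "1:1": "crop=ih:ih,scale=1080:1080",
    }
    return [
        "ffmpeg",
        "-ss", str(start), "-t", str(duration),  # trim to the highlight window
        "-i", src,
        "-vf", filters[aspect],                  # reframe for the target aspect
        "-c:a", "copy",                          # keep the original audio track
        out,
    ]
```

This is where customizations like watermarks or bitrate changes would go: append extra filter expressions to the `-vf` string, or add `-b:v` before the output path.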
## Requirements

- A self-hosted or cloud n8n instance (uses a community node)
- The n8n-nodes-renderio community node, installed via Settings > Community Nodes
- A free RenderIO account and API key from renderio.dev
- Google Drive and Google Sheets OAuth credentials
- An OpenAI API key

## How to set up

1. Install the n8n-nodes-renderio community node from Settings > Community Nodes.
2. Create credentials for Google Drive (OAuth2), Google Sheets (OAuth2), OpenAI, and RenderIO API.
3. Import the workflow and open the Set Config node.
4. Update the outputParentFolderId with the Google Drive folder ID where output folders should be created.
5. Update the sheetId with your Google Sheet document ID.
6. Set sheetTab and sheetRunsTab to the correct sheet tab IDs for clip logging and run summaries.
7. Configure the Watch Drive Folder trigger node to point at your source video folder.
8. Activate the workflow and drop a test video into the folder.

## How to customize

- Change clipCount in Set Config to generate more or fewer clips per video.
- Swap llmModel from gpt-4o-mini to gpt-4o or another model for different clip selection quality.
- Modify the FFmpeg commands in Build Commands for Clip to adjust resolution or bitrate, add watermarks, or change output formats.
- Replace Google Drive with S3 or another storage provider if that fits your stack.
- Add a Slack or Telegram notification node after the summary step to get alerted when processing finishes.
by Avkash Kakdiya
## How it works

This workflow automatically collects a list of companies from Google Sheets, searches for their competitors using SerpAPI, extracts up to 10 relevant competitor names with source links, and logs the results into both Google Sheets and Airtable. It runs on a set schedule, cleans and formats the company list, processes each entry individually, checks if competitors exist, and separates results into "successful" and "no competitors found" lists for organized tracking.

## Step-by-step

### 1. Trigger & Input

- **Auto Run (Scheduled)**: Executes every day at the set time (e.g., 9 AM).
- **Read Companies Sheet**: Pulls the list of companies from a Google Sheet (List column).
- **Clean & Format Company List**: Removes empty rows, trims names, and attaches row numbers for tracking.
- **Loop Over Companies**: Processes each company one at a time in batches.

### 2. Competitor Search

- **Search Company Competitors (SerpAPI)**: Sends a query like "{Company} competitors" to SerpAPI, retrieving structured search results in JSON format.

### 3. Data Extraction & Validation

- **Extract Competitor Data from Search**: Parses SerpAPI results to identify the company name, extract up to 10 competitor names, capture the top source URL, and count total search results.
- **Has Competitors?**: Checks if any competitors were found. If yes, proceeds to logging; if no, logs the company in the "no results" list.

### 4. Logging Results

- **Log to Result Sheet**: Appends or updates competitor data in the results Google Sheet.
- **Log Companies Without Results**: Records companies with zero competitors found in a separate section of the results sheet.
- **Sync to Airtable**: Pushes all results (successful or not) into Airtable for unified storage and analysis.

## Benefits

- **Automated Competitor Research**: Eliminates the need for manual Google searching.
- **Daily Insights**: Runs automatically at your chosen schedule.
- **Clean Data Output**: Stores structured competitor lists with sources for easy review.
- **Multi-Destination Sync**: Saves to both Google Sheets and Airtable for flexibility.
- **Scalable & Hands-Free**: Handles hundreds of companies without extra effort.
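The Extract Competitor Data from Search step can be sketched as a small parser over SerpAPI's response. SerpAPI does return organic listings under `organic_results` with `title` and `link` fields; the heuristic of treating result titles as candidate competitor names is an assumption about what the workflow's code node does, not its exact rules.

```python
def extract_competitors(company: str, serp_response: dict, max_names: int = 10) -> dict:
    """Parse a SerpAPI response into the fields the workflow logs."""
    organic = serp_response.get("organic_results", [])
    competitors = []
    for result in organic:
        title = result.get("title", "").strip()
        # Skip results about the company itself; cap at max_names entries.
        if title and company.lower() not in title.lower():
            competitors.append(title)
        if len(competitors) >= max_names:
            break
    return {
        "company": company,
        "competitors": competitors,
        "top_source": organic[0].get("link") if organic else None,
        "total_results": len(organic),
        "has_competitors": bool(competitors),  # drives the "Has Competitors?" branch
    }
```

The `has_competitors` flag is what routes each company to either the results sheet or the "no results" section.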
by Daniel Shashko
## How it works

This workflow automates intelligent Reddit marketing by monitoring brand mentions, analyzing sentiment with AI, and engaging authentically with communities. Every 24 hours, the system searches Reddit for posts containing your configured brand keywords across all subreddits, finding up to 50 of the newest mentions to analyze.

Each discovered post is sent to OpenAI's GPT-4o-mini model for comprehensive analysis. The AI evaluates sentiment (positive/neutral/negative), assigns an engagement score (0-100), determines relevance to your brand, and generates contextual, helpful responses that add genuine value to the conversation. It also classifies the response type (educational/supportive/promotional) and provides reasoning for whether engagement is appropriate.

The workflow intelligently filters posts using a multi-criteria system: only posts that are relevant to your brand, score above 60 in engagement quality, and warrant a response type other than "pass" proceed to engagement. This prevents spam and ensures every interaction is meaningful.

Selected posts are processed one at a time through a loop to respect Reddit's rate limits. For each worthy post, the AI-generated comment is posted, and complete interaction data is logged to Google Sheets, including timestamp, post details, sentiment, engagement scores, and success status. This creates a permanent audit trail and analytics database.

At the end of each run, the workflow aggregates all data into a comprehensive daily summary report with total posts analyzed, comments posted, engagement rate, sentiment breakdown, and the top 5 engagement opportunities ranked by score. This report is automatically sent to Slack with formatted metrics, giving your team instant visibility into your Reddit marketing performance.

## Who is this for?
- **Brand managers and marketing teams** needing automated social listening and engagement on Reddit
- **Community managers** responsible for authentic brand presence across multiple subreddits
- **Startup founders and growth marketers** who want to scale Reddit marketing without hiring a team
- **PR and reputation teams** monitoring brand sentiment and responding to discussions in real time
- **Product marketers** seeking organic engagement opportunities in product-related communities
- **Any business** that wants to build an authentic Reddit presence while avoiding spammy marketing tactics

## Setup Steps

**Setup time:** approx. 30-40 minutes (credential configuration, keyword setup, Google Sheets creation, Slack integration)

**Requirements:**

- Reddit account with OAuth2 application credentials (create at reddit.com/prefs/apps)
- OpenAI API key with GPT-4o-mini access
- Google account with a new Google Sheet for tracking interactions
- Slack workspace with posting permissions to a marketing/monitoring channel
- Brand keywords and subreddit strategy prepared

1. **Create a Reddit OAuth application**: Visit reddit.com/prefs/apps, create a "script" type app, and obtain your client ID and secret.
2. **Configure Reddit credentials in n8n**: Add Reddit OAuth2 credentials with your app credentials and authorize access.
3. **Set up the OpenAI API**: Obtain an API key from platform.openai.com and configure it in n8n OpenAI credentials.
4. **Create the Google Sheet**: Set up a new sheet with these columns: timestamp, postId, postTitle, subreddit, postUrl, sentiment, engagementScore, responseType, commentPosted, reasoning.
5. **Configure these nodes**:
   - Brand Keywords Config: Edit the JavaScript code to include your brand name, product names, and relevant industry keywords.
   - Search Brand Mentions: Adjust the limit (default 50) and sort preference based on your needs.
   - AI Post Analysis: Customize the prompt to match your brand voice and engagement guidelines.
   - Filter Engagement-Worthy: Adjust the engagementScore threshold (default 60) based on your quality standards.
   - Loop Through Posts: Configure max iterations and batch size for rate-limit compliance.
   - Log to Google Sheets: Replace YOUR_SHEET_ID with your actual Google Sheets document ID.
   - Send Slack Report: Replace YOUR_CHANNEL_ID with your Slack channel ID.
6. **Test the workflow**: Run it manually first to verify all connections work, and adjust the AI prompts.
7. **Activate for daily runs**: Once tested, activate the Schedule Trigger to run automatically every 24 hours.

## Node Descriptions

- **Daily Marketing Check**: Schedule trigger runs the workflow automatically every 24 hours
- **Brand Keywords Config**: JavaScript code node defining the brand keywords to monitor
- **Search Brand Mentions**: Reddit node searches all subreddits for brand keyword mentions
- **AI Post Analysis**: OpenAI analyzes sentiment and relevance, and generates contextual, helpful comment responses
- **Filter Engagement-Worthy**: Conditional node filters only high-quality, relevant posts worth engaging
- **Loop Through Posts**: Split-in-batches node processes each post individually, respecting rate limits
- **Post Helpful Comment**: Reddit node posts the AI-generated comment to worthy Reddit discussions
- **Log to Google Sheets**: Appends all interaction data to the spreadsheet for permanent tracking
- **Generate Daily Summary**: JavaScript node aggregates metrics and sentiment breakdown into a comprehensive daily report
- **Send Slack Report**: Posts the formatted daily summary with metrics to the team Slack channel
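Two of the steps described above (the engagement-worthiness gate and the daily summary aggregation) can be sketched in a few lines. In the workflow these are an If node and a JavaScript code node; the Python sketch below mirrors their logic, and the field names follow the Google Sheets columns listed in the setup steps, which may not match the workflow's internal JSON exactly.

```python
from collections import Counter

ENGAGEMENT_THRESHOLD = 60  # raise for stricter quality standards

def is_engagement_worthy(analysis: dict) -> bool:
    """Only relevant, high-scoring posts not marked 'pass' proceed to engagement."""
    return (
        analysis.get("relevant", False)
        and analysis.get("engagementScore", 0) > ENGAGEMENT_THRESHOLD
        and analysis.get("responseType") != "pass"
    )

def generate_daily_summary(interactions: list) -> dict:
    """Aggregate one run into the metrics the Slack report lists."""
    posted = [i for i in interactions if i.get("commentPosted")]
    ranked = sorted(interactions, key=lambda i: i.get("engagementScore", 0),
                    reverse=True)
    return {
        "total_posts_analyzed": len(interactions),
        "comments_posted": len(posted),
        "engagement_rate": round(100 * len(posted) / len(interactions), 1)
            if interactions else 0.0,
        "sentiment_breakdown": dict(Counter(i.get("sentiment", "unknown")
                                            for i in interactions)),
        "top_opportunities": [i.get("postTitle") for i in ranked[:5]],
    }
```

Requiring all three gate conditions at once is what keeps the bot from commenting on marginal posts: a high score alone is not enough if the AI judged the post irrelevant or chose to pass.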