by WeblineIndia
# Smart Contract Event Monitor (Web3)

This workflow automatically monitors the Ethereum blockchain, extracts USDT transfer events, filters large-value transactions, stores them in Airtable, and sends a clean daily summary alert to Slack.

Every day it checks the latest Ethereum block and identifies high-value USDT transfers. It fetches on-chain logs using Alchemy, extracts sender/receiver/value details, filters transactions above a threshold, stores them in Airtable, and finally sends a single clear summary alert to Slack.

You receive:
- Daily blockchain check (automated)
- Airtable tracking of all high-value USDT transfers
- A Slack alert summarizing the count + the largest transfer

Ideal for teams wanting simple, automated visibility of suspicious or large crypto movements without manually scanning the blockchain.

## Quick Start – Implementation Steps
1. Add your Alchemy Ethereum Mainnet API URL in both HTTP nodes.
2. Connect and configure your Airtable base & table.
3. Connect your Slack credentials and set the channel for alerts.
4. Adjust the value threshold in the IF node (default: 1,000,000,000).
5. Activate the workflow — daily monitoring begins instantly.

## What It Does
This workflow automates detection of high-value USDT transfers on Ethereum:
1. Fetches the latest block number using Alchemy.
2. Retrieves all USDT Transfer logs from that block.
3. Extracts structured data: sender, receiver, amount, contract, block number, transaction hash.
4. Filters only transactions above a configurable threshold.
5. Saves each high-value transaction into Airtable for record-keeping.
6. Generates a summary including the total number of high-value transfers and the single largest transfer.
7. Sends one clean alert message to Slack.

This ensures visibility of suspicious or large fund movements with no repeated alerts.

## Who's It For
This workflow is ideal for:
- Crypto analytics teams
- Blockchain monitoring platforms
- Compliance teams tracking high-value activity
- Web3 product teams
- Developers needing automated USDT transfer tracking
- Anyone monitoring whale movements / suspicious transactions

## Requirements to Use This Workflow
- **n8n instance** (cloud or self-hosted)
- **Alchemy API URL** (Ethereum Mainnet)
- **Airtable base** + Personal Access Token
- **Slack workspace** with API permissions
- Basic understanding of Ethereum logs, hex values & JSON data

## How It Works
1. **Daily Check** – Workflow runs automatically at your set time.
2. **Get Latest Block Number** – Fetches the newest Ethereum block from Alchemy.
3. **Fetch USDT Logs** – Queries all Transfer events (ERC-20 standard).
4. **Extract Transaction Details** – Converts hex → readable data.
5. **Filter High-Value Transactions** – Keeps only large-value transfers.
6. **Save to Airtable** – Adds each transfer record to your database.
7. **Generate Summary** – Finds the largest transfer & total count.
8. **Send Slack Alert** – Notifies your team with one clean summary.

## Setup Steps
1. Import the provided n8n JSON file.
2. Open the Get Latest Block and Fetch Logs HTTP nodes → add your Alchemy API URL.
3. Ensure the USDT contract address (already included): `0xdAC17F958D2ee523a2206206994597C13D831ec7`
4. Connect your Airtable account and map: Contract, From Address, To Address, Value, Block Number, txHash.
5. Connect Slack API credentials and choose your channel.
6. Change the threshold limit in the IF node if needed (default: 1B).
7. Activate the workflow — done!
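For reference, here is a minimal sketch of what the Extract Transaction Details step could look like in an n8n Code node. The item shape (an `eth_getLogs` response under `json.result`) is an assumption about this template's internals; the ERC-20 Transfer layout itself (indexed sender/receiver in `topics`, amount in `data`, USDT's 6 decimals) is standard.

```javascript
// Sketch: decode ERC-20 Transfer logs from an assumed eth_getLogs response.
const out = [];
for (const item of $input.all()) {
  for (const log of item.json.result ?? []) {
    out.push({
      json: {
        contract: log.address,
        from: '0x' + log.topics[1].slice(-40), // topic[1] = indexed sender
        to: '0x' + log.topics[2].slice(-40),   // topic[2] = indexed receiver
        value: Number(BigInt(log.data)) / 1e6, // data = amount in hex; USDT has 6 decimals
        blockNumber: parseInt(log.blockNumber, 16),
        txHash: log.transactionHash,
      },
    });
  }
}
return out;
```

Note that `Number(BigInt(...))` is adequate for alerting but loses precision on very large amounts; keep the raw hex alongside it if you need exact values.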
## How To Customize Nodes

**Customize Value Threshold** – modify the IF node:
- Increase or decrease the minimum transfer value
- Change the logic for smaller or larger whale-tracking

**Customize Airtable Storage** – you can add fields like:
- Timestamp
- Token symbol
- USD price (using a price API)
- Transaction status
- Risk classification

**Customize Slack Alerts** – you may add:
- Emojis
- Mentions (@channel, @team)
- Links to Etherscan
- Highlighted blocks for critical transfers

**Customize Web3 Provider** – replace Alchemy with:
- Infura
- QuickNode
- Public RPC (not recommended for reliability)

## Add-Ons (Optional Enhancements)
You can extend this workflow to:
- Track multiple ERC-20 tokens
- Process several blocks instead of just the latest
- Add price conversion (USDT → USD value)
- Detect transfers to suspicious wallets
- Generate daily or weekly summary reports in Slack
- Create a dashboard using Airtable Interfaces
- Add OpenAI-based insights (large spike, suspicious pattern, etc.)

## Use Case Examples
1. **Whale Tracking** – Detect large USDT movements (>1M or >5M).
2. **Compliance Monitoring** – Log high-value transfers in Airtable for audits.
3. **Real-Time Alerts** – Slack alerts tell your team instantly about big movements.
4. **On-Chain Analytics** – Automate structured extraction of Ethereum logs.
5. **Exchange Monitoring** – Detect large inflows/outflows to known addresses.

## Troubleshooting Guide

| Issue | Possible Cause | Solution |
|------------------------|-----------------------------------|---------------------------------------------------------|
| No data in Airtable | Logs returned empty | Ensure USDT transfer events exist in that block |
| Values are "zero" | Hex parsing failed | Check the extract-code logic |
| Slack alert not sent | Invalid credentials | Update the Slack API key |
| Airtable error | Wrong field names | Match Airtable column names exactly |
| HTTP request fails | Wrong RPC URL | Re-check the Alchemy API key |
| Workflow not running | Schedule disabled | Enable the "Daily Check" node |

## Need Help?
If you need help customizing or extending this workflow — adding multi-token monitoring, setting up dashboards, improving alerts, or scaling this for production — our n8n workflow developers at WeblineIndia can assist you with advanced automation.
by Rahi
# n8n Workflow: AI-Personalized Email Outreach (Smartlead)

## 🔄 Purpose
This workflow automates cold email campaigns by:
- Fetching leads
- Generating hyper-personalized email content using AI
- Sending emails via the Smartlead API
- Logging campaign activity into Google Sheets

## 🧩 Workflow Structure
1. **Schedule Trigger** – Starts the workflow automatically at scheduled intervals, ensuring continuous campaign execution.
2. **Get Leads** – Fetches lead data (name, email, company, role, industry) as the input for personalization.
3. **Loop Over Leads** – Processes each lead one by one to maintain individualized email generation.
4. **Aggregate Lead Data** – Collects and formats lead attributes into structured input for the AI model.
5. **Basic LLM Chain #1** – Generates personalized snippets/openers using AI, tailored to company, role, and industry.
6. **Update Row (Google Sheets)** – Saves AI outputs (snippets) for tracking and QA.
7. **Basic LLM Chain #2** – Expands the snippet into a full personalized email draft, including subject line and body.
8. **Information Extractor** – Extracts structured fields from the AI output: subject, greeting, call-to-action (CTA), closing.
9. **Update Row (Google Sheets)** – Stores the finalized draft in Google Sheets for visibility and an audit trail.
10. **Code** – Formats the email into a Smartlead-compatible payload, mapping fields like subject, body, and recipient details (see the sketch below).
11. **Smartlead API Request** – Sends the personalized email through Smartlead and returns the message ID and delivery status.
12. **Basic LLM Chain #3 (Optional)** – Generates follow-up versions for multi-step campaigns, ensuring varied engagement over time.
13. **Information Extractor (Follow-ups)** – Structures follow-up emails into a ready-to-send format.
14. **Update Row (Google Sheets)** – Updates campaign logs with the Smartlead send status, message IDs, and AI personalization notes.

## ⚙️ Data Flow Summary
- **Trigger** → Runs workflow
- **Get Leads** → Fetch lead records
- **LLM Personalization** → Create openers + full emails
- **Google Sheets** → Save drafts & logs
- **Smartlead API** → Send personalized email
- **Follow-ups** → Generate and log structured follow-up messages

## 📊 Use Case
- Automates hyper-personalized cold email outreach at scale.
- Uses AI to improve response rates with contextual personalization.
- Provides full visibility by saving drafts and send logs in Google Sheets.
- Integrates seamlessly with Smartlead for sending and tracking.
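As a rough illustration of step 10, a Code node along these lines could map the extracted fields into a request body. The field names here are hypothetical; consult Smartlead's API documentation for the actual payload schema.

```javascript
// Sketch of the payload-formatting Code node (step 10 above).
// All field names are illustrative assumptions, not Smartlead's documented schema.
const out = [];
for (const item of $input.all()) {
  const d = item.json;
  out.push({
    json: {
      email: d.email,
      first_name: d.name,
      company_name: d.company,
      custom_fields: {
        subject: d.subject,    // extracted by the Information Extractor
        email_body: d.body,
        cta: d.cta,
      },
    },
  });
}
return out;
```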
by Rahul Joshi
## 📘 Description
This workflow automates dependency update risk analysis and reporting using Jira, GPT-4o, Slack, and Google Sheets. It continuously monitors Jira for new package or dependency update tickets, uses AI to assess their risk levels (Low, Medium, High), posts structured comments back into Jira, and alerts the DevOps team in Slack — all while logging historical data into Google Sheets for visibility and trend analysis.

This ensures fast, data-driven decisions for dependency upgrades, improved code stability, and reduced security risks — with zero manual triage.

## ⚙️ What This Workflow Does (Step-by-Step)

🟢 **When Clicking "Execute Workflow"** – Manually triggers the dependency risk analysis sequence for immediate review or scheduled monitoring.

📋 **Fetch All Active Jira Issues** – Retrieves all active Jira issues to identify tickets related to dependency or package updates. Provides the complete dataset — including summary, status, and assignee information — for AI-based risk evaluation.

✅ **Validate Jira Query Response** – Verifies that Jira returned valid issue data before proceeding. If data exists → continues filtering dependency updates. If no data or an API error → logs the failure to Google Sheets. Prevents the workflow from continuing with empty or broken datasets.

🔍 **Identify Dependency Update Issues** – Filters Jira issues to find only dependency-related tickets (keywords like "update," "bump," "package," or "library"). This ensures only relevant version-update tasks are analyzed, filtering out unrelated feature or bug tickets.

🏷️ **Extract Relevant Issue Metadata** – Extracts essential fields such as key, summary, priority, assignee, status, and created date for downstream AI processing. Simplifies the data payload and ensures accurate, structured analysis.

📢 **Alert DevOps Team in Slack** – Immediately notifies the assigned DevOps engineer via Slack DM about any new dependency update issue. Includes formatted details like summary, key, status, priority, and a direct Jira link for quick access. Ensures rapid visibility and faster response to potential risk tickets.

🤖 **AI-Powered Risk Assessment Analyzer** – Uses GPT-4o (Azure OpenAI) to intelligently evaluate each dependency update's risk level and impact summary. Considers factors such as:
- Dependency criticality
- Version change type (major/minor/patch)
- Security or EOL indicators
- Potential breaking changes

Outputs clean JSON with the fields:

```json
{ "risk_level": "Low | Medium | High", "impact_summary": "Short human-readable explanation" }
```

Helps DevOps teams prioritize updates with context.

🧠 **GPT-4o Language Model Configuration** – Configures the AI reasoning engine for precise, context-aware DevOps assessments. Optimized for a consistent technical tone and cost-efficient batch evaluation.

📊 **Parse AI Response to Structured Data** – Safely parses the AI's JSON output, removing markdown artifacts and ensuring structure. Adds the parsed fields — risk_level and impact_summary — back to the Jira context. Includes fail-safes to prevent crashes on malformed AI output (falls back to "Unknown" and "Failed to parse").

💬 **Post AI Risk Assessment to Jira Ticket** – Automatically posts the AI's analysis as a comment on the Jira issue:
- Displays a 🤖 AI Risk Assessment Report header
- Shows the risk level and impact summary
- Includes a checklist of next steps for developers

Creates a permanent audit trail for each dependency decision inside Jira.
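The parsing step described above (strip markdown artifacts, parse, fall back on failure) could look roughly like this in a Code node. The `text` property holding the raw model output is an assumption; the fallback values are the ones the template documents.

```javascript
// Sketch of the fail-safe parser: strip code fences, JSON.parse, fall back on error.
const out = [];
for (const item of $input.all()) {
  const raw = String(item.json.text ?? '').replace(/```json|```/g, '').trim();
  let parsed;
  try {
    parsed = JSON.parse(raw);
  } catch (e) {
    // Documented fallbacks for malformed AI output.
    parsed = { risk_level: 'Unknown', impact_summary: 'Failed to parse' };
  }
  out.push({ json: { ...item.json, ...parsed } });
}
return out;
```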
📈 **Log Dependency Updates to Tracking Dashboard** – Appends all analyzed updates into Google Sheets, recording:
- Date
- Jira key & summary
- Risk level & impact summary
- Assignee & status

This builds a historical dependency risk database that supports:
- Trend monitoring
- Security compliance reviews
- Dependency upgrade metrics
- DevOps productivity tracking

📊 **Log Jira Query Failures to Error Sheet** – If the Jira query fails, the workflow automatically logs the error (API/auth/network) into a centralized error sheet for troubleshooting and visibility.

## 🧩 Prerequisites
- Jira Software Cloud API credentials
- Azure OpenAI (GPT-4o) access
- Slack API connection
- Google Sheets OAuth2 credentials

## 💡 Key Benefits
✅ Automated dependency risk assessment
✅ Instant Slack alerts for update visibility
✅ Historical tracking in Google Sheets
✅ Reduced manual triage and faster decision-making
✅ Continuous improvement in release reliability and security

## 👥 Perfect For
- DevOps and SRE teams managing large dependency graphs
- Engineering managers monitoring package updates and risks
- Security/compliance teams tracking vulnerability fix adoption
- Product teams aiming for stable CI/CD pipelines
by Rahul Joshi
## 📊 Description
Automate post-purchase workflows by instantly fetching successful Stripe payments, matching them to corresponding automation templates in Google Sheets, and sending customers personalized access emails using AI-generated content. This system ensures each buyer receives their digital template, password, and onboarding details automatically after payment. 💳📩🤖

## What This Template Does
**Step 1:** Triggers daily at 7:00 AM IST to fetch all successful payment charges from Stripe. ⏰
**Step 2:** Retrieves payment intent and product details for each successful charge to enrich context. 💰
**Step 3:** Validates required fields (order reference, product name, customer name, email). ✅
**Step 4:** Matches the purchased product with the automation record in Google Sheets via AI lookup. 🔍
**Step 5:** Combines Stripe and Sheet data into one record, ensuring accuracy and completeness. 🔄
**Step 6:** Filters out already-processed customers to avoid duplicate sends (see the sketch below). 🧮
**Step 7:** Generates a personalized thank-you email using Azure OpenAI (GPT-4o-mini) including access links, password, and onboarding tips. 💌
**Step 8:** Sends the email through Gmail to the customer automatically. 📧
**Step 9:** Logs each transaction and email delivery into Google Sheets for tracking and auditing. 📊

## Key Benefits
✅ Fully automated Stripe-to-email delivery flow
✅ Zero manual intervention — instant template delivery
✅ AI-personalized HTML emails with customer details
✅ Centralized purchase logging and analytics
✅ Eliminates duplicates and ensures a smooth customer experience

## Features
- Scheduled daily trigger (7:00 AM IST)
- Stripe API integration for payment and product details
- Google Sheets lookup for automation files and passwords
- GPT-powered email content generation
- Gmail API integration for delivery
- Google Sheets logging for an audit trail

## Requirements
- Stripe API credentials
- Google Sheets OAuth2 credentials
- Gmail OAuth2 credentials
- Azure OpenAI API credentials

## Target Audience
- SaaS or digital product sellers using Stripe
- Automation template marketplaces
- Small teams delivering digital assets via email
- Businesses seeking instant customer fulfillment
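A minimal sketch of the Step 6 duplicate filter, assuming a hypothetical "Read Log Sheet" node and a `charge_id` column; names will differ in the actual template.

```javascript
// Sketch: drop Stripe charges that already appear in the log sheet.
// 'Read Log Sheet' and 'charge_id' are illustrative names, not the template's own.
const logged = new Set(
  $('Read Log Sheet').all().map((row) => row.json.charge_id)
);
return $input.all().filter((item) => !logged.has(item.json.charge_id));
```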
by Jean-Marie Rizkallah
# 🛡️ Jamf Policy Integrity Monitor

## 🎯 Overview
A security-focused n8n workflow that monitors Jamf Pro policies for any unauthorized or accidental modification. It delivers configuration integrity and real-time visibility across managed Apple environments.

## ⚙️ Setup Instructions
1. Add your Jamf Pro and Slack credentials in n8n.
2. Import the workflow template.
3. Configure your preferred schedule and alert channel.

No coding required. The setup takes minutes.

## 🔍 How It Works
The workflow connects to the Jamf Pro API, detects configuration changes, and sends instant alerts to Slack. It maintains awareness of policy integrity while minimizing manual checks. The logic runs automatically in the background for continuous monitoring.

## 📢 Slack Notification Example
:warning: Policy: Uninstall EDR modified
:calendar: DateTime: Oct 5, 2025, 10:23:27 AM

## ✅ Why It Matters
Jamf provides no built-in alerts when policies are modified. This workflow closes that visibility gap and gives your team instant awareness of policy changes without manual auditing. Ideal for security engineers, Jamf administrators, and compliance teams focused on operational assurance.
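The template is plug-and-play, but for the curious: one common way to implement change detection like this is to hash each policy payload and compare it against the previous run. A minimal sketch in an n8n Code node, assuming policies arrive one per item and that your instance allows the built-in crypto module; this is a generic technique, not the template's exact logic.

```javascript
// Sketch: flag policies whose serialized payload hash changed since the last run.
const crypto = require('crypto');

// Prior hashes persisted in workflow static data.
// Note: static data only persists on production (active) executions, not manual runs.
const store = $getWorkflowStaticData('global');
store.policyHashes = store.policyHashes || {};

const changed = [];
for (const item of $input.all()) {
  const policy = item.json; // assumed: one Jamf policy object per item
  const hash = crypto.createHash('sha256')
    .update(JSON.stringify(policy))
    .digest('hex');
  if (store.policyHashes[policy.id] && store.policyHashes[policy.id] !== hash) {
    changed.push({ json: { id: policy.id, name: policy.name } });
  }
  store.policyHashes[policy.id] = hash;
}
return changed; // downstream Slack node formats the alert
```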
by Yves Tkaczyk
## Use cases
Monitor a Google Drive folder, parsing PDF, DOCX, and image files into a destination folder, ready for further processing (e.g. RAG ingestion, translation, etc.). Keep a processing log in Google Sheets and send Slack notifications.

## How it works
1. Trigger: watch a Google Drive folder for new and updated files.
2. Create a uniquely named destination folder, copying the input file.
3. Parse the file using Mistral Document, extracting content and handling non-OCRable images separately.
4. Save the data returned by Mistral Document into the destination Google Drive folder (raw JSON file, Markdown files, and images) for further processing.

## How to use
1. Google Drive and Google Sheets nodes:
   - Create Google credentials with access to Google Drive and Google Sheets. Read more about Google Credentials.
   - Update all Google Drive and Google Sheets nodes (14 nodes total) to use the credentials.
2. Mistral node:
   - Create Mistral Cloud API credentials. Read more about Mistral Cloud Credentials.
   - Update the OCR Document node to use the Mistral Cloud credentials.
3. Slack nodes:
   - Create Slack OAuth2 credentials. Read more about Slack OAuth2 credentials.
   - Update the two Slack nodes, Send Success Message and Send Error Message: set the credentials and select the channel where you want to send the notifications (channels can be different for success and errors).
4. Create a Google Sheets spreadsheet following the steps in Google Sheets Configuration. Ensure the spreadsheet can be accessed as Editor by the account used by the Google credentials above.
5. Create a directory for input files and a directory for output folders/files. Ensure the directories can be accessed by the account used by the Google credentials.
6. Update the File Created, File Updated, and Workflow Configuration nodes following the steps in the green notes.

## Requirements
- Google account with Google API access
- Mistral Cloud account with access to a Mistral API key
- Slack account with access to the Slack client ID and secret ID
- Basic n8n knowledge: understanding of triggers, expressions, and credential management

## Who's it for
Anyone building a data pipeline ingesting files to be OCRed for further processing.

## 🔒 Security
All credentials are stored as n8n credentials. The only information stored in this workflow that could be considered sensitive are the Google Drive directory and Sheet IDs. These directories and the spreadsheet should be secured according to your needs.

## Need Help?
Reach out on LinkedIn or ask in the Forum!
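As a side note, the "uniquely named destination folder" in step 2 can be as simple as the source file name plus a timestamp. A hypothetical sketch (the `name` field assumes the Google Drive trigger's output shape):

```javascript
// Sketch: derive a unique destination folder name from the source file + timestamp.
const out = [];
for (const item of $input.all()) {
  const base = String(item.json.name ?? 'file').replace(/\.[^.]+$/, ''); // strip extension
  const stamp = new Date().toISOString().replace(/[:.]/g, '-');
  out.push({ json: { ...item.json, destFolderName: `${base}-${stamp}` } });
}
return out;
```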
by Omer Fayyaz
# Efficient Loop-less n8n Workflow Backup Automation to Google Drive

This workflow eliminates traditional loop-based processing entirely, delivering fast, reliable backups even when the number of workflows to be processed is large.

## What Makes This Different
- **NO SplitInBatches node** – Traditional workflows use loops to process workflows one by one
- **NO individual file uploads** – No more multiple Google Drive API calls
- **NO batch error handling** – Eliminates complex loop-iteration error management
- **ALL workflows processed simultaneously** – Single-operation approach
- **Single compressed archive** – One ZIP file containing all workflows
- **One Google Drive upload** – Maximum efficiency, minimum API usage

## Key Benefits of the Non-Loop Architecture
- **3-5x faster execution** – Eliminated loop overhead and multiple API calls
- **Reduced API costs** – Single upload instead of dozens of individual operations
- **Higher reliability** – Fewer failure points, centralized error handling
- **Better scalability** – Performance doesn't degrade with workflow count
- **Large workflow support** – Efficiently handles hundreds of workflows without performance degradation
- **Easier maintenance** – Simpler workflow structure, easier debugging
- **Cleaner monitoring** – Single success/failure point instead of loop iterations

## Who's it for
This template is designed for n8n users, DevOps engineers, system administrators, and IT professionals who need reliable, automated backup solutions for their n8n workflows. It's perfect for businesses and individuals who want to ensure their workflow automation investments are safely backed up with intelligent scheduling, compression, and cloud storage integration.

## How it works / What it does
This workflow creates an automated backup system that turns n8n workflow backups from inefficient multi-file operations into streamlined single-archive automation. The system:
1. Triggers automatically every 4 hours, or manually on demand
2. Creates timestamped folders in Google Drive for organized backup storage
3. Retrieves all n8n workflows via the n8n API in a single operation
4. Converts workflows to JSON and aggregates binary data efficiently
5. Compresses all workflows into a single ZIP archive (eliminating the need for loops)
6. Uploads the compressed backup to Google Drive in one operation
7. Provides real-time Slack notifications for monitoring and alerting

**Key innovation: no loops required.** Unlike traditional backup workflows that use SplitInBatches or loops to process workflows individually, this system processes all workflows simultaneously and creates a single compressed archive, dramatically improving performance and reliability.

## How to set up
1. **Configure Google Drive API credentials** – Set up Google Drive OAuth2 API credentials, ensure the service account has access to create folders and upload files, and update the parent folder ID where backup folders will be created.
2. **Configure n8n API access** – Set up internal n8n API credentials for workflow retrieval, ensure the API has permission to read all workflows, and configure retry settings for reliability.
3. **Set up Slack notifications** – Configure Slack API credentials for the notification channel, update the channel ID where backup notifications will be sent, and customize notification messages as needed.
4. **Schedule configuration** – The workflow automatically runs every 4 hours; manual execution is available for immediate backups. Adjust the schedule in the Schedule Trigger node as needed.
5. **Test the integration** – Run a manual backup to verify all components work correctly, check Google Drive for the created backup folder and ZIP file, and verify Slack notifications are received.

## Requirements
- **n8n instance** (self-hosted or cloud) with API access
- **Google Drive account** with API access and sufficient storage
- **Slack workspace** for notifications (optional but recommended)
- **n8n workflows** that need regular backup protection

## How to customize the workflow
**Modify backup frequency**
- Adjust the Schedule Trigger node for different intervals (hourly, daily, weekly)
- Add multiple schedule triggers for different backup types
- Implement conditional scheduling based on workflow changes

**Enhance the storage strategy**
- Add multiple Google Drive accounts for redundancy
- Implement backup rotation and retention policies
- Add compression options (ZIP, TAR, 7Z) for different use cases

**Expand the notification system**
- Add email notifications for critical backup events
- Integrate with monitoring systems (PagerDuty, OpsGenie)
- Add backup success/failure metrics and reporting

**Security enhancements**
- Implement backup encryption before upload
- Add backup verification and integrity checks
- Set up access logging and audit trails

**Performance optimizations**
- Add parallel processing for large workflow collections
- Implement incremental backup strategies
- Add backup size monitoring and alerts

## Key Features
- **Zero-loop architecture** – Processes all workflows simultaneously without batch processing
- **Intelligent compression** – Single ZIP archive instead of multiple individual files
- **Automated scheduling** – Runs every 4 hours with manual override capability
- **Organized storage** – Timestamped folders with clear naming conventions
- **Real-time monitoring** – Slack notifications for all backup events
- **Error handling** – Centralized error management with graceful failure handling
- **Scalable design** – Handles any number of workflows efficiently

## Technical Architecture Highlights
**Eliminated inefficiencies**
- **No SplitInBatches node** – Replaced with direct workflow processing
- **No individual file uploads** – Single compressed archive upload
- **No loop iterations** – All workflows processed in one operation
- **No batch error handling** – Centralized error management

**Performance improvements**
- **Faster execution** – Eliminated loop overhead and multiple API calls
- **Reduced API quota usage** – Single Google Drive upload per backup
- **Better resource utilization** – Efficient memory and processing usage
- **Improved reliability** – Fewer points of failure in the workflow

**Data flow optimization**
- **Parallel processing** – Folder creation and workflow retrieval happen simultaneously
- **Efficient aggregation** – A Code node processes all binaries at once (see the sketch below)
- **Smart compression** – Single ZIP with all workflows included
- **Streamlined upload** – One file transfer instead of multiple operations

## Use Cases
- **Production n8n instances** requiring reliable backup protection
- **Development teams** needing workflow version control and recovery
- **DevOps automation** requiring disaster recovery capabilities
- **Business continuity** planning for critical automation workflows
- **Compliance requirements** for workflow documentation and backup
- **Team collaboration** with shared workflow backup access

## Business Value
- **Risk mitigation** – Protects valuable automation investments
- **Operational efficiency** – Faster, more reliable backup processes
- **Cost reduction** – Lower storage costs and API usage
- **Compliance support** – Organized backup records for audits
- **Team productivity** – Reduced backup management overhead
- **Scalability** – Handles growth without performance degradation

This template streamlines n8n workflow backup by eliminating the complexity and inefficiency of traditional loop-based approaches, providing a robust, scalable solution that grows with your automation needs while maintaining the highest levels of reliability and performance.
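To make the "efficient aggregation" step concrete: a Code node can fold every workflow returned by the n8n API into one item carrying a binary property per workflow, which a single Compression node then zips. A minimal sketch (property names and file naming are assumptions, not the template's exact code):

```javascript
// Sketch: aggregate all workflows into one item with one binary file each,
// ready for a single Compression (ZIP) node downstream.
const binary = {};
$input.all().forEach((item, i) => {
  const wf = item.json;
  binary[`file_${i}`] = {
    data: Buffer.from(JSON.stringify(wf, null, 2)).toString('base64'),
    mimeType: 'application/json',
    fileName: `${wf.name ?? 'workflow'}-${wf.id ?? i}.json`,
  };
});
return [{ json: { count: Object.keys(binary).length }, binary }];
```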
by Elay Guez
# 🔍 AI-Powered Web Research in Google Sheets with Bright Data

## 📋 Overview
Transform any Google Sheets cell into an intelligent web scraper! Type =BRIGHTDATA("cell", "search prompt") and get an AI-filtered result from any website in ~20 seconds.

What happens automatically:
1. AI optimizes your search query
2. Bright Data scrapes the web (bypasses bot detection)
3. AI analyzes and filters the result
4. Returns clean data directly to your cell
5. Completes in <25 seconds

Cost: ~$0.02-0.05 per search | Time saved: 3-5 minutes per search

## 👥 Who's it for
- Market researchers needing competitive intelligence
- E-commerce teams tracking prices
- Sales teams doing lead prospecting
- SEO specialists gathering content research
- Real estate agents monitoring listings
- Anyone tired of manual copy-paste

## ⚙️ How it works
1. **Webhook Call** – Google Sheets function sends a POST request
2. **Data Preparation** – Organizes the input structure
3. **AI Query Optimization** – GPT-4.1 Mini refines the search query
4. **Web Scraping** – Bright Data fetches data while bypassing blocks
5. **AI Analysis** – GPT-4o Mini filters and summarizes the result
6. **Response** – Returns plain text to your cell
7. **Logging** – Updates logs for monitoring

## 🛠️ Setup Instructions
Time to deploy: 20 minutes

### Requirements
- n8n instance with a public URL
- Bright Data account + API key
- OpenAI API key
- Google account for Apps Script

### Part 1: n8n Workflow Setup
1. Import this template into your n8n instance
2. Configure the Webhook node:
   - Copy your webhook URL: https://n8n.yourdomain.com/webhook/brightdata-search
   - Set authentication: Header Auth
   - Set API key: 12312346 (or create your own)
3. Add OpenAI credentials to the AI nodes
4. Configure Bright Data: add API credentials
5. Configure the output language: manually edit the "Set Variables" node
6. Test the workflow with a manual execution
7. Activate the workflow

### Part 2: Google Sheets Function
1. Open your Google Sheet → Extensions → Apps Script
2. Paste this code:

```javascript
function BRIGHTDATA(prompt, source) {
  if (!prompt || prompt === "") {
    return "❌ Must enter prompt";
  }
  source = source || "google";

  // Update with YOUR webhook URL
  const N8N_WEBHOOK_URL = "https://your-n8n-domain.com/webhook/brightdata-search";
  // Update with YOUR password
  const API_KEY = "12312346";

  let spreadsheetId, sheetName, cellAddress;
  try {
    const sheet = SpreadsheetApp.getActiveSheet();
    const activeCell = sheet.getActiveCell();
    spreadsheetId = SpreadsheetApp.getActiveSpreadsheet().getId();
    sheetName = sheet.getName();
    cellAddress = activeCell.getA1Notation();
  } catch (e) {
    return "❌ Cannot identify cell";
  }

  const payload = {
    prompt: prompt,
    source: source.toLowerCase(),
    context: {
      spreadsheetId: spreadsheetId,
      sheetName: sheetName,
      cellAddress: cellAddress,
      timestamp: new Date().toISOString()
    }
  };

  const options = {
    method: "post",
    contentType: "application/json",
    payload: JSON.stringify(payload),
    muteHttpExceptions: true,
    headers: {
      "Accept": "text/plain",
      "key": API_KEY
    }
  };

  try {
    const response = UrlFetchApp.fetch(N8N_WEBHOOK_URL, options);
    const responseCode = response.getResponseCode();
    if (responseCode !== 200) {
      Logger.log("Error response: " + response.getContentText());
      return "❌ Error " + responseCode;
    }
    return response.getContentText();
  } catch (error) {
    Logger.log("Exception: " + error.toString());
    return "❌ Connection error: " + error.toString();
  }
}

function doGet(e) {
  return ContentService.createTextOutput(JSON.stringify({
    status: "alive",
    message: "Apps Script is running",
    timestamp: new Date().toISOString()
  })).setMimeType(ContentService.MimeType.JSON);
}
```

3. Update N8N_WEBHOOK_URL with your webhook
4. Update API_KEY with your password
5. Save (Ctrl+S / Cmd+S) – important!
6. Close the Apps Script editor

## 💡 Usage Examples
=BRIGHTDATA("C3", "What is the current price of the product?")
=BRIGHTDATA("D30", "What is the size of this company?")
=BRIGHTDATA("A4", "Is this company hiring developers?")

## 🎨 Customization
### Easy Tweaks
- **AI Models** – Switch to GPT-4o for better optimization
- **Response Format** – Modify the prompt for specific outputs
- **Speed** – Optimize AI prompts to reduce time
- **Language** – Change prompts to any language

### Advanced Options
- Implement rate limiting
- Add data validation
- Create an async mode for long queries
- Add Slack notifications

## 🚀 Pro Tips
- **Be specific** – "What is the iPhone 15 Pro 256GB US price?" beats "What is the iPhone price?"
- **Speed matters** – Keep prompts concise (30s timeout limit)
- **Monitor costs** – Track Bright Data usage
- **Debug** – Check workflow logs for errors

## ⚠️ Important Notes
- **Timeout:** 30-second Google Sheets limit (aim for <20s)
- **Plain text only:** No JSON responses
- **Costs:** Monitor Bright Data at console.brightdata.com
- **Security:** Keep API keys secret
- **No browser storage:** Don't use localStorage/sessionStorage

## 🔧 Troubleshooting

| Error | Solution |
|-------|----------|
| "Exceeded maximum execution time" | Optimize AI prompts or use async mode |
| "Could not fetch data" | Verify Bright Data credentials |
| Empty cell | Check n8n logs for AI parsing issues |
| Broken characters | Verify UTF-8 encoding in the webhook node |

## 📚 Resources
- Bright Data API Docs
- n8n Webhook Documentation
- Google Apps Script Reference

Built with ❤️ by Elay Guez
by takuma
This workflow automates reputation management for physical stores (restaurants, retail, clinics) by monitoring Google Maps reviews, analyzing them with AI, and drafting professional replies. It acts as a 24/7 customer support assistant, ensuring you never miss a negative review and saving hours of manual writing time.

## Who is this for?
- **Store managers & owners:** Keep track of customer sentiment without manually checking Google Maps every day.
- **Marketing agencies:** Automate local SEO reporting and response drafting for multiple clients.
- **Customer support teams:** Get instant alerts for negative feedback to resolve issues quickly.

## How it works
1. **Schedule:** Runs every 24 hours (customizable) to fetch the latest data.
2. **Scrape:** Uses Apify to retrieve the latest reviews from a specific Google Maps URL.
3. **Filter:** Checks the Google Sheet database to identify only new reviews and avoid duplicates (see the sketch below).
4. **AI Analysis:** An AI Agent (via OpenRouter/OpenAI) analyzes the review text to generate a short summary and draft a polite, context-aware reply based on the star rating (e.g., apologies for low stars, gratitude for high stars).
5. **Alert:** Sends a Slack notification. A low rating (<4 stars) alerts a specific channel (e.g., #customer-support) with a warning; a high rating alerts a general channel (e.g., #wins) to celebrate.
6. **Save:** Appends the review details, AI summary, and draft reply to the Google Sheet.

## Requirements
- **n8n:** Cloud or self-hosted (v1.0+).
- **Apify account:** To run the **Google Maps Reviews Scraper**.
- **Google Cloud Platform:** Enabled Google Sheets API.
- **Slack workspace:** A webhook URL or OAuth connection.
- **OpenRouter (or OpenAI) API key:** For the LLM generation.

## How to set up
1. **Google Sheets:** Create a new sheet with the following headers in the first row: reviewId, publishedAt, reviewerName, stars, text, ai_summary, ai_reply, reviewUrl, output, publishedAt date.
2. **Configure credentials:** Set up your accounts for Google Sheets, Apify, Slack, and OpenRouter within n8n.
3. **Edit the "CONFIG" node:**
   - MAPS_URL: Paste the full Google Maps link to your store.
   - SHEET_ID: Paste the ID found in your Google Sheet URL.
   - SHOP_NAME: Your store's name.
4. **Slack nodes:** Select the appropriate channels for positive and negative alerts.

## How to customize
- **Change the AI persona:** Open the **AI Agent** node and modify the "System Message" to match your brand's tone of voice (e.g., casual, formal, or witty).
- **Adjust alert thresholds:** Edit the **If Rating < 4** node to change the criteria for what constitutes a "negative" review (e.g., strictly < 3 stars).
- **Multi-store support:** You can loop this workflow over a list of URLs to manage multiple locations in a single execution.
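The dedupe sketch referenced in step 3, assuming a hypothetical "Read Sheet" node that loads the existing rows and that the Apify scraper returns a `reviewId` per item (that column name matches the sheet headers above):

```javascript
// Sketch: keep only reviews whose reviewId is not yet logged in the sheet.
const known = new Set($('Read Sheet').all().map((r) => r.json.reviewId));
return $input.all().filter((item) => !known.has(item.json.reviewId));
```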
by Rahul Joshi
## 📊 Description
Streamline Facebook Messenger inbox management with an AI-powered categorization and response system. 💬⚙️ This workflow automatically classifies new messages as Lead, Query, or Spam using GPT-4, routes them for approval via Slack, responds on Facebook once approved, and logs all interactions into Google Sheets for tracking. Perfect for support and marketing teams managing high volumes of inbound DMs. 🚀📈

## What This Template Does
1️⃣ **Trigger** – Runs hourly to fetch new Facebook Page messages. ⏰
2️⃣ **Extract & Format** – Collects sender info, timestamps, and message content for analysis. 📋
3️⃣ **AI Categorization** – Uses GPT-4 to identify the message type (Lead, Query, Spam) and suggest replies. 🧠
4️⃣ **Slack Approval Flow** – Sends categorized leads and queries to Slack for quick team approval. 💬
5️⃣ **Facebook Response** – Posts AI-suggested replies back to the original sender once approved. 💌
6️⃣ **Data Logging** – Records every message, reply, and approval status into Google Sheets for analytics. 📊
7️⃣ **Error Handling** – Automatically alerts via Slack if the workflow encounters an error. 🚨

## Key Benefits
✅ Reduces manual message triage on Facebook Messenger
✅ Ensures consistent and professional customer replies
✅ Provides full visibility via Google Sheets logs
✅ Centralizes team approvals in Slack for faster response times
✅ Leverages GPT-4 for accurate categorization and natural replies

## Features
- Hourly Facebook message fetch with the Graph API
- GPT-4 powered text classification and reply suggestion
- Slack-based dual approval flow
- Automated Facebook replies post-approval
- Google Sheets logging for all categorized messages
- Built-in error detection and Slack alerting

## Requirements
- Facebook Graph API credentials with page message permissions
- OpenAI API key for GPT-4 processing
- Slack API credentials with chat:write permission
- Google Sheets OAuth2 credentials
- Environment variables: FACEBOOK_PAGE_ID, GOOGLE_SHEET_ID, GOOGLE_SHEET_NAME, SLACK_CHANNEL_ID

## Target Audience
- Marketing and lead-generation teams using Facebook Pages 📣
- Customer support teams managing Messenger queries 💬
- Businesses seeking automated lead routing and CRM sync 🧾
- Teams leveraging AI for customer engagement optimization 🤖

## Step-by-Step Setup Instructions
1️⃣ Connect Facebook Graph API credentials and set your page ID.
2️⃣ Add OpenAI API credentials for GPT-4.
3️⃣ Configure the Slack channel ID and credentials.
4️⃣ Link your Google Sheet for message logging.
5️⃣ Replace the environment variable placeholders with your actual IDs.
6️⃣ Test the workflow manually before enabling automation.
7️⃣ Activate the schedule trigger for ongoing hourly execution. ✅
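For a sense of how the categorization result can drive the approval routing (steps 3-4), here is a hedged sketch of a Code node that normalizes the classifier output for a downstream Switch node. The `category` field name is an assumption about the template's internals.

```javascript
// Sketch: normalize the AI classification so a Switch node can route it.
return $input.all().map((item) => {
  const category = String(item.json.category ?? 'spam').toLowerCase();
  return {
    json: {
      ...item.json,
      category,                           // 'lead' | 'query' | 'spam'
      needsApproval: category !== 'spam', // spam is logged but never sent to Slack
    },
  };
});
```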
by Daniel Shashko
## How it Works
This workflow automates intelligent Reddit marketing by monitoring brand mentions, analyzing sentiment with AI, and engaging authentically with communities.

Every 24 hours, the system searches Reddit for posts containing your configured brand keywords across all subreddits, finding up to 50 of the newest mentions to analyze.

Each discovered post is sent to OpenAI's GPT-4o-mini model for comprehensive analysis. The AI evaluates sentiment (positive/neutral/negative), assigns an engagement score (0-100), determines relevance to your brand, and generates contextual, helpful responses that add genuine value to the conversation. It also classifies the response type (educational/supportive/promotional) and provides reasoning for whether engagement is appropriate.

The workflow filters posts using a multi-criteria system: only posts that are relevant to your brand, score above 60 in engagement quality, and warrant a response type other than "pass" proceed to engagement. This prevents spam and ensures every interaction is meaningful.

Selected posts are processed one at a time through a loop to respect Reddit's rate limits. For each worthy post, the AI-generated comment is posted, and complete interaction data is logged to Google Sheets, including timestamp, post details, sentiment, engagement scores, and success status. This creates a permanent audit trail and analytics database.

At the end of each run, the workflow aggregates all data into a comprehensive daily summary report with total posts analyzed, comments posted, engagement rate, sentiment breakdown, and the top 5 engagement opportunities ranked by score (see the aggregation sketch at the end of this section). This report is automatically sent to Slack with formatted metrics, giving your team instant visibility into your Reddit marketing performance.

## Who is this for?
- **Brand managers and marketing teams** needing automated social listening and engagement on Reddit
- **Community managers** responsible for an authentic brand presence across multiple subreddits
- **Startup founders and growth marketers** who want to scale Reddit marketing without hiring a team
- **PR and reputation teams** monitoring brand sentiment and responding to discussions in real time
- **Product marketers** seeking organic engagement opportunities in product-related communities
- **Any business** that wants to build an authentic Reddit presence while avoiding spammy marketing tactics

## Setup Steps
**Setup time:** Approx. 30-40 minutes (credential configuration, keyword setup, Google Sheets creation, Slack integration)

**Requirements:**
- Reddit account with OAuth2 application credentials (create at reddit.com/prefs/apps)
- OpenAI API key with GPT-4o-mini access
- Google account with a new Google Sheet for tracking interactions
- Slack workspace with posting permissions to a marketing/monitoring channel
- Brand keywords and subreddit strategy prepared

1. **Create a Reddit OAuth application:** Visit reddit.com/prefs/apps, create a "script" type app, and obtain your client ID and secret.
2. **Configure Reddit credentials in n8n:** Add Reddit OAuth2 credentials with your app credentials and authorize access.
3. **Set up the OpenAI API:** Obtain an API key from platform.openai.com and configure it in n8n OpenAI credentials.
4. **Create a Google Sheet:** Set up a new sheet with the columns: timestamp, postId, postTitle, subreddit, postUrl, sentiment, engagementScore, responseType, commentPosted, reasoning.
5. **Configure these nodes:**
   - **Brand Keywords Config:** Edit the JavaScript code to include your brand name, product names, and relevant industry keywords.
   - **Search Brand Mentions:** Adjust the limit (default 50) and sort preference based on your needs.
   - **AI Post Analysis:** Customize the prompt to match your brand voice and engagement guidelines.
   - **Filter Engagement-Worthy:** Adjust the engagementScore threshold (default 60) based on your quality standards.
   - **Loop Through Posts:** Configure max iterations and batch size for rate-limit compliance.
   - **Log to Google Sheets:** Replace YOUR_SHEET_ID with your actual Google Sheets document ID.
   - **Send Slack Report:** Replace YOUR_CHANNEL_ID with your Slack channel ID.
6. **Test the workflow:** Run it manually first to verify all connections work, and adjust the AI prompts.
7. **Activate for daily runs:** Once tested, activate the Schedule Trigger to run automatically every 24 hours.

## Node Descriptions
- **Daily Marketing Check** – Schedule trigger runs the workflow every 24 hours automatically
- **Brand Keywords Config** – JavaScript Code node defining the brand keywords to monitor
- **Reddit Search Brand Mentions** – Reddit node searches all subreddits for brand keyword mentions
- **AI Post Analysis** – OpenAI analyzes sentiment and relevance, and generates contextual, helpful comment responses
- **Filter Engagement-Worthy** – Conditional node filters only high-quality, relevant posts worth engaging
- **Loop Through Posts** – Split-in-batches node processes each post individually, respecting rate limits
- **Post Helpful Comment** – Reddit node posts the AI-generated comment to worthy Reddit discussions
- **Log to Google Sheets** – Appends all interaction data to the spreadsheet for permanent tracking
- **Generate Daily Summary** – JavaScript aggregates metrics and the sentiment breakdown into a comprehensive daily report
- **Send Slack Report** – Posts the formatted daily summary with metrics to the team Slack channel
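A sketch of what the Generate Daily Summary node's aggregation might look like, using the same field names as the Google Sheet columns above (treat the exact item shape as an assumption):

```javascript
// Sketch: aggregate the run's analyzed posts into the daily summary metrics.
const items = $input.all().map((i) => i.json);
const posted = items.filter((i) => i.commentPosted);

// Sentiment breakdown across everything analyzed this run.
const bySentiment = { positive: 0, neutral: 0, negative: 0 };
for (const i of items) {
  const s = i.sentiment ?? 'neutral';
  bySentiment[s] = (bySentiment[s] ?? 0) + 1;
}

// Top 5 engagement opportunities, ranked by score.
const top5 = [...items]
  .sort((a, b) => (b.engagementScore ?? 0) - (a.engagementScore ?? 0))
  .slice(0, 5)
  .map((i) => `${i.engagementScore} - ${i.postTitle} (r/${i.subreddit})`);

return [{
  json: {
    totalAnalyzed: items.length,
    commentsPosted: posted.length,
    engagementRate: items.length
      ? ((posted.length / items.length) * 100).toFixed(1) + '%'
      : '0%',
    sentimentBreakdown: bySentiment,
    topOpportunities: top5,
  },
}];
```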
by Avkash Kakdiya
## How it works
This workflow automatically collects a list of companies from Google Sheets, searches for their competitors using SerpAPI, extracts up to 10 relevant competitor names with source links, and logs the results into both Google Sheets and Airtable. It runs on a set schedule, cleans and formats the company list, processes each entry individually, checks whether competitors exist, and separates results into "successful" and "no competitors found" lists for organized tracking.

## Step-by-step

### 1. Trigger & Input
- **Auto Run (Scheduled)** – Executes every day at the set time (e.g., 9 AM).
- **Read Companies Sheet** – Pulls the list of companies from a Google Sheet (List column).
- **Clean & Format Company List** – Removes empty rows, trims names, and attaches row numbers for tracking.
- **Loop Over Companies** – Processes each company one at a time in batches.

### 2. Competitor Search
- **Search Company Competitors (SerpAPI)** – Sends a query like "{Company} competitors" to SerpAPI, retrieving structured search results in JSON format.

### 3. Data Extraction & Validation
- **Extract Competitor Data from Search** – Parses SerpAPI results (see the sketch below) to:
  - Identify the company name
  - Extract up to 10 competitor names
  - Capture the top source URL
  - Count total search results
- **Has Competitors?** – Checks whether any competitors were found: yes → proceeds to logging; no → logs in the "no results" list.

### 4. Logging Results
- **Log to Result Sheet** – Appends or updates competitor data in the results Google Sheet.
- **Log Companies Without Results** – Records companies with zero competitors found in a separate section of the results sheet.
- **Sync to Airtable** – Pushes all results (successful or not) into Airtable for unified storage and analysis.

## Benefits
- **Automated competitor research** – Eliminates the need for manual Google searching.
- **Daily insights** – Runs automatically at your chosen schedule.
- **Clean data output** – Stores structured competitor lists with sources for easy review.
- **Multi-destination sync** – Saves to both Google Sheets and Airtable for flexibility.
- **Scalable & hands-free** – Handles hundreds of companies without extra effort.
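For illustration, the extraction step could be implemented along these lines. Reading competitor names from organic result titles is a simplifying heuristic (the actual template may parse richer SerpAPI sections), but `organic_results`, `search_parameters`, and `search_information` are standard fields of a SerpAPI Google response.

```javascript
// Sketch: pull up to 10 competitor names out of a SerpAPI response.
const out = [];
for (const item of $input.all()) {
  const res = item.json;
  const organic = res.organic_results ?? [];
  const competitors = organic
    .map((r) => r.title)    // heuristic: use result titles as competitor names
    .filter(Boolean)
    .slice(0, 10);
  out.push({
    json: {
      company: res.search_parameters?.q?.replace(/ competitors$/i, '') ?? '',
      competitors,
      sourceUrl: organic[0]?.link ?? '', // top source URL
      totalResults: res.search_information?.total_results ?? 0,
    },
  });
}
return out;
```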