by IranServer.com
# Automate IP geolocation and HTTP port scanning with Google Sheets trigger

This n8n template automatically enriches IP addresses with geolocation data and performs HTTP port scanning when new IPs are added to a Google Sheets document. Perfect for network monitoring, security research, or maintaining an IP intelligence database.

## Who's it for

Network administrators, security researchers, and IT professionals who need to:

- Track IP geolocation information automatically
- Monitor HTTP service availability across multiple ports
- Maintain centralized IP intelligence in spreadsheets
- Automate repetitive network reconnaissance tasks

## How it works

The workflow triggers whenever a new row containing an IP address is added to your Google Sheet. It then:

1. Fetches geolocation data from the ip-api.com service to get country, city, coordinates, ISP, and organization information
2. Updates the spreadsheet with the geolocation details
3. Scans common HTTP ports (80, 443, 8080, 8000, 3000) to check service availability
4. Records port status back to the same spreadsheet row, showing which services are accessible

The workflow handles both successful connections and various error conditions, providing a comprehensive view of each IP's network profile.
## Requirements

- **Google Sheets API access** - for reading triggers and updating data
- **Google Sheets document** with at least an "IP" column header

## How to set up

1. Create a Google Sheet with columns: IP, Country, City, Lat, Lon, ISP, Org, Port_80, Port_443, Port_8000, Port_8080, Port_3000
2. Configure Google Sheets credentials in both the trigger and update nodes
3. Update the document ID in the Google Sheets Trigger and both Update nodes to point to your spreadsheet
4. Test the workflow by adding an IP address to your sheet and verifying the automation runs

## How to customize the workflow

- **Modify port list**: Edit the "Edit Fields" node to scan different ports by changing the ports array
- **Add more geolocation fields**: The ip-api.com response includes additional fields such as timezone, zip code, and AS number
- **Change trigger frequency**: Adjust the polling interval in the Google Sheets Trigger for faster or slower monitoring
- **Add notifications**: Insert Slack, email, or webhook nodes to alert when specific conditions are detected
- **Filter results**: Add IF nodes to process only certain IP ranges or geolocation criteria
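The mapping between the ip-api.com response and the sheet columns above can be sketched in an n8n-style Code node. The geolocation field names (`country`, `city`, `lat`, `lon`, `isp`, `org`) follow ip-api.com's JSON API; the shape of `portResults` is an assumption for illustration.

```javascript
// Ports scanned by the workflow, matching the Port_* sheet columns.
const PORTS = [80, 443, 8080, 8000, 3000];

// Build one spreadsheet row from an IP, its ip-api.com response,
// and a map of port-scan results such as { 80: "open" }.
function buildRow(ip, geo, portResults) {
  const row = {
    IP: ip,
    Country: geo.country || "",
    City: geo.city || "",
    Lat: geo.lat,
    Lon: geo.lon,
    ISP: geo.isp || "",
    Org: geo.org || "",
  };
  // Any port with no recorded result is treated as closed/unreachable.
  for (const p of PORTS) {
    row[`Port_${p}`] = portResults[p] || "closed";
  }
  return row;
}
```

In the actual workflow this logic is spread across the HTTP, Edit Fields, and Update nodes; the function simply shows which field feeds which column.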
by vinci-king-01
# Daily Stock Regulatory News Aggregator with Compliance Alerts and Google Sheets Tracking

## Target Audience

- Compliance officers and regulatory teams
- Financial services firms monitoring regulatory updates
- Investment advisors tracking regulatory changes
- Risk management professionals
- Corporate legal departments
- Stock traders and analysts monitoring regulatory news

## Problem Statement

Manually monitoring regulatory updates from multiple agencies (SEC, FINRA, ESMA) is time-consuming and error-prone. This template automates daily regulatory news monitoring: it aggregates updates from major regulatory bodies, filters for recent announcements, and instantly alerts compliance teams to critical regulatory changes, enabling timely responses and helping maintain regulatory compliance.

## How it Works

This workflow monitors regulatory news daily, scrapes the latest updates from major regulatory agencies using AI-powered web scraping, filters for updates from the last 24 hours, and sends Slack alerts while logging all updates to Google Sheets for historical tracking.
## Key Components

1. **Daily Schedule Trigger** - Automatically runs the workflow every 24 hours to check for regulatory updates
2. **Regulatory Sources Configuration** - Defines the list of regulatory agencies and their URLs to monitor (SEC, FINRA, ESMA)
3. **Batch Processing** - Iterates through regulatory sources one at a time for reliable processing
4. **AI-Powered Scraping** - Uses ScrapeGraphAI to extract regulatory updates including title, summary, date, agency, and source URL
5. **Data Flattening** - Transforms the scraped data structure into individual update records
6. **Time Filtering** - Keeps only updates from the last 24 hours
7. **Historical Tracking** - Logs all filtered updates to Google Sheets for compliance records
8. **Compliance Alerts** - Sends Slack notifications to compliance teams when new regulatory updates are detected

## Key Features

### Automated Regulatory Monitoring

- **Daily Execution**: Runs automatically every 24 hours without manual intervention
- **Multi-Agency Support**: Monitors SEC, FINRA, and ESMA simultaneously
- **Error Handling**: Gracefully handles scraping errors and continues processing other sources

### Smart Filtering

- **Time-Based Filtering**: Automatically filters updates to show only those from the last 24 hours
- **Date Validation**: Discards updates with unreadable or invalid dates
- **Recent Updates Focus**: Ensures compliance teams only receive actionable, timely information

### Alert System

- **Compliance Alerts**: Instant Slack notifications for new regulatory updates
- **Structured Data**: Alerts include title, summary, date, agency, and source URL
- **Dedicated Channel**: Posts to a designated compliance-alerts channel for team visibility

## Output Specifications

The workflow generates and stores structured data including:

| Output Type | Format | Description | Example |
|-------------|--------|-------------|---------|
| Regulatory Updates | JSON Object | Extracted regulatory update information | `{"title": "SEC Announces New Rule", "date": "2024-01-15", "agency": "SEC"}` |
| Update History | Google Sheets | Historical regulatory update records with timestamps | Columns: Title, Summary, Date, Agency, Source URL, Scraped At |
| Slack Alerts | Messages | Compliance notifications for new updates | "New SEC update: [Title] - [Summary]" |
| Error Logs | System Logs | Scraping error notifications | "Error scraping FINRA updates" |

## Setup Instructions

Estimated setup time: 15-20 minutes

### Prerequisites

- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Google Sheets API access (OAuth2)
- Slack workspace with API access
- Google Sheets spreadsheet for regulatory update tracking

### Step-by-Step Configuration

1. **Install Community Nodes**
   - Install the ScrapeGraphAI community node: `npm install n8n-nodes-scrapegraphai`

2. **Configure ScrapeGraphAI Credentials**
   - Navigate to Credentials in your n8n instance
   - Add new ScrapeGraphAI API credentials
   - Enter your API key from the ScrapeGraphAI dashboard
   - Test the connection to ensure it's working

3. **Set up Google Sheets Connection**
   - Add Google Sheets OAuth2 credentials
   - Authorize access to your Google account
   - Create or identify the spreadsheet for regulatory update tracking
   - Note the spreadsheet ID and sheet name (default: "RegUpdates")

4. **Configure Slack Integration**
   - Add Slack API credentials to your n8n instance
   - Create or identify the Slack channel: #compliance-alerts
   - Test the Slack connection with a sample message
   - Ensure the bot has permission to post messages

5. **Customize Regulatory Sources**
   - Open the "Regulatory Sources" Code node and update the urls array with additional regulatory sources if needed:

         const urls = [
           'https://www.sec.gov/news/pressreleases',
           'https://www.finra.org/rules-guidance/notices',
           'https://www.esma.europa.eu/press-news',
           // Add more URLs as needed
         ];
6. **Configure Google Sheets**
   - Update documentId in the "Log to Google Sheets" node with your spreadsheet ID
   - Update sheetName to match your sheet name (default: "RegUpdates")
   - Ensure the sheet has the columns: Title, Summary, Date, Agency, Source URL, Scraped At
   - Create the sheet with proper column headers if starting fresh

7. **Customize Slack Channel**
   - Open the "Send Compliance Alert" Slack node
   - Update the channel name (default: "#compliance-alerts")
   - Customize the message format if needed
   - Test with a sample message

8. **Adjust Schedule**
   - Open the "Daily Regulatory Poll" Schedule Trigger
   - Modify hoursInterval to change the frequency (default: 24 hours)
   - Set specific times if needed for daily execution

9. **Customize Scraping Prompt**
   - Open the "Scrape Regulatory Updates" ScrapeGraphAI node
   - Adjust the userPrompt to extract different or additional fields
   - Modify the JSON schema in the prompt if needed
   - Change the number of updates extracted (default: 5 most recent)

10. **Test and Validate**
    - Run the workflow manually to verify all connections
    - Check Google Sheets for data structure and format
    - Verify Slack alerts are working correctly
    - Test error handling with invalid URLs
    - Validate that date filtering works properly

## Workflow Customization Options

### Modify Monitoring Frequency

- Change hoursInterval in the Schedule Trigger for different frequencies
- Switch to multiple runs per day for critical monitoring
- Add multiple schedule triggers for different agency checks

### Extend Data Collection

- Modify the ScrapeGraphAI prompt to extract additional fields (documents, categories, impact level)
- Add data enrichment nodes for risk assessment
- Integrate with regulatory databases for more comprehensive tracking
- Add sentiment analysis for regulatory updates

### Enhance Alert System

- Add email notifications alongside Slack alerts
- Create different alert channels for different agencies
- Add priority-based alerting based on update keywords
- Integrate with SMS or push notification services
- Add webhook integrations for other compliance tools

### Advanced Analytics

- Add data visualization nodes for regulatory trend analysis
- Create automated compliance reports with summaries
- Integrate with business intelligence tools
- Add machine learning for update categorization
- Track regulatory themes and topics over time

### Multi-Source Support

- Add support for additional regulatory agencies
- Implement agency-specific scraping strategies
- Add regional regulatory sources (FCA, BaFin, etc.)
- Include state-level regulatory updates

## Use Cases

- **Compliance Monitoring**: Automatically track regulatory updates to ensure timely compliance responses
- **Risk Management**: Monitor regulatory changes that may impact business operations or investments
- **Regulatory Intelligence**: Build historical databases of regulatory announcements for trend analysis
- **Client Communication**: Stay informed to provide timely updates to clients about regulatory changes
- **Legal Research**: Track regulatory developments for legal research and case preparation
- **Investment Strategy**: Monitor regulatory changes that may affect investment decisions

## Important Notes

- Respect website terms of service and rate limits when scraping regulatory sites
- Monitor ScrapeGraphAI API usage to manage costs
- Ensure Google Sheets has the proper column structure before the first run
- Set up the Slack channel before running the workflow
- Consider implementing rate limiting for multiple regulatory sources
- Keep credentials secure and rotate them regularly
- Test with one regulatory source before adding multiple sources
- Verify that date formats are consistent across different regulatory agencies
- Be aware that some regulatory sites may have anti-scraping measures

## Troubleshooting

Common Issues:

- ScrapeGraphAI connection errors: Verify the API key and account status
- Google Sheets logging failures: Check the spreadsheet ID, sheet name, and column structure
- Slack notification failures: Verify the channel exists and the bot has permissions
- Date filtering issues: Ensure dates from scraped content are in a parseable format
- Validation errors: Check that scraped data matches the expected schema
- Empty results: Verify regulatory sites are accessible and haven't changed structure

Optimization Tips:

- Start with one regulatory source to test the workflow
- Monitor API usage and costs regularly
- Use batch processing to avoid overwhelming scraping services
- Implement retry logic for failed scraping attempts
- Consider caching for frequently checked sources
- Adjust the number of updates extracted based on typical volume

Support Resources:

- ScrapeGraphAI documentation and API reference
- Google Sheets API documentation
- Slack API documentation for webhooks
- n8n community forums for workflow assistance
- n8n documentation for node configuration
- SEC, FINRA, and ESMA official websites for source verification
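The "Time Filtering" step (keep only updates from the last 24 hours, discard unparseable dates) can be sketched as a small helper such as an n8n Code node might use. The `date` field name matches the output example above; everything else is illustrative.

```javascript
// Keep only updates whose date parses and falls within the last 24 hours.
// Updates with invalid dates, or dated in the future, are discarded.
function filterRecent(updates, now = Date.now()) {
  const DAY_MS = 24 * 60 * 60 * 1000;
  return updates.filter((u) => {
    const t = Date.parse(u.date);
    return !Number.isNaN(t) && t <= now && now - t <= DAY_MS;
  });
}
```

Passing `now` explicitly keeps the function deterministic, which makes the filtering behavior easy to verify before wiring it into the workflow.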
by Madame AI
# Scrape physician profiles from BrowserAct to Google Sheets

This workflow automates building a targeted database of healthcare providers by scraping physician details for a specific location and syncing them to your records. It leverages BrowserAct to extract data from healthcare directories and keeps your database clean by preventing duplicate entries.

## Target Audience

Medical recruiters, pharmaceutical sales representatives, lead generation specialists, and healthcare data analysts.

## How it works

1. **Define Location**: The workflow starts by setting the target Location and State in a Set node.
2. **Scrape Data**: A BrowserAct node executes a task (using the "Physician Profile Enricher" template) to search a healthcare directory (e.g., Healow) for doctors matching the criteria.
3. **Parse JSON**: A Code node takes the raw string output from the scraper and parses it into individual JSON objects.
4. **Update Database**: A Google Sheets node appends new records or updates existing ones based on the physician's name, preventing duplicates.
5. **Notify Team**: A Slack node sends a message to a specific channel confirming the batch job finished successfully.

## How to set up

1. **Configure Credentials**: Connect your BrowserAct, Google Sheets, and Slack accounts in n8n.
2. **Prepare BrowserAct**: Ensure the Physician Profile Enricher template is saved in your BrowserAct account.
3. **Set up the Google Sheet**: Create a new Google Sheet with the required headers (listed below).
4. **Select Spreadsheet**: Open the Google Sheets node and select your newly created file and sheet.
5. **Set Variables**: Open the Define Location node and input your target Location (City) and State.
6. **Configure Notification**: Open the Slack node and select the channel where you want to receive alerts.

## Google Sheet Headers

To use this workflow, create a Google Sheet with the following headers: Name, Specialty, Address

## Requirements

- **BrowserAct** account with the **Physician Profile Enricher** template
- **Google Sheets** account
- **Slack** account

## How to customize the workflow

- **Change the Data Source**: Modify the BrowserAct template to scrape a different directory (e.g., Zocdoc or WebMD) and update the Google Sheet columns accordingly.
- **Switch Notifications**: Replace the Slack node with a Microsoft Teams, Discord, or Email node to suit your team's communication preferences.
- **Enrich Data**: Add an AI Agent node after the Code node to format addresses or research the specific clinics listed.

## Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

Workflow Guidance and Showcase Video: Automate Medical Lead Gen: Scrape Healow to Google Sheets & Slack
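The "Parse JSON" Code node described above can be sketched as follows. This is a hypothetical sketch: the exact raw output shape of the BrowserAct task may differ, and the field names (`name`, `specialty`, `address`) are assumptions mapped onto the sheet headers.

```javascript
// Parse the scraper's raw string output into one n8n item per physician,
// keyed to the sheet headers Name / Specialty / Address.
function parsePhysicians(raw) {
  const parsed = JSON.parse(raw);
  // The scraper may return a single object or an array; normalize to an array.
  const list = Array.isArray(parsed) ? parsed : [parsed];
  return list.map((doc) => ({
    json: {
      Name: doc.name || "",
      Specialty: doc.specialty || "",
      Address: doc.address || "",
    },
  }));
}
```

Returning `{ json: ... }` objects matches the item shape n8n Code nodes emit, so the downstream Google Sheets node can map columns directly.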
by Madame AI
# Generate audio documentaries from web articles to Telegram with ElevenLabs & BrowserAct

This workflow transforms any web article or blog post into a high-production-value audio documentary. It automates the entire production chain, from scraping content and writing an engaging narrative script to generating realistic voiceovers, and delivers a listenable MP3 file directly to your Telegram chat.

## Target Audience

Commuters, podcast enthusiasts, content creators, and researchers who prefer listening to content over reading.

## How it works

1. **Analyze Intent**: The workflow receives a message via Telegram. An AI Agent (using Google Gemini) classifies the input to determine whether it is casual chat or a request to process a URL.
2. **Scrape Content**: If a valid link is detected, BrowserAct executes a background task to visit the webpage and extract the raw text.
3. **Write Script**: A Scriptwriter Agent (using Claude via OpenRouter) converts the dry article text into a dramatic, narrative-driven script optimized for audio, including cues for pacing and tone.
4. **Generate Audio**: ElevenLabs synthesizes the script into high-fidelity speech using a specific voice model (e.g., "Liam").
5. **Deliver Output**: The workflow sends the generated MP3 file and a formatted HTML summary caption back to the user on Telegram.

## How to set up

1. **Configure Credentials**: Connect your Telegram, ElevenLabs, OpenRouter, Google Gemini, and BrowserAct accounts in n8n.
2. **Prepare BrowserAct**: Ensure the AI Summarization & Eleven Labs Podcast Generation template is saved in your BrowserAct account.
3. **Select Voice**: Open the Convert text to speech node and select your preferred ElevenLabs voice model.
4. **Configure Model**: Open the OpenRouter node to confirm the model selection (e.g., Claude Haiku) or switch to a different LLM for scriptwriting.
5. **Activate**: Turn on the workflow and send a link to your Telegram bot to test it.

## Requirements

- **BrowserAct** account with the **AI Summarization & Eleven Labs Podcast Generation** template
- **ElevenLabs** account
- **OpenRouter** account (or access to an LLM like Claude)
- **Google Gemini** account
- **Telegram** account (Bot Token)

## How to customize the workflow

- **Change the Persona**: Modify the system prompt in the Scriptwriter node to change the narrative style (e.g., from "Documentary Host" to "Comedian" or "News Anchor").
- **Switch Output Channel**: Replace the Telegram output node with a Google Drive or Dropbox node to archive the generated audio files for a podcast feed.
- **Multi-Voice Support**: Add logic to split the script into multiple parts and use different ElevenLabs voices to simulate a conversation between two hosts.

## Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

Workflow Guidance and Showcase Video: How to Build an AI Podcast Generator: n8n, BrowserAct & Eleven Labs
by Meak
# Auto-Edit Google Drive Images with Nano Banana + Social Auto-Post

Most businesses spend hours cleaning up photos and manually posting them to social media. This workflow does it all automatically: image enhancement, caption creation, and posting, straight from a simple Google Drive upload.

## Benefits

- Clean and enhance images instantly with Nano Banana
- Auto-generate catchy captions with GPT-5
- Post directly to Instagram (or other social channels)
- Track everything in Google Sheets
- Save hours per week on repetitive content tasks

## How It Works

1. Upload an image to Google Drive
2. The workflow sends the image to Nano Banana (via the Wavespeed API)
3. Waits for the enhanced version and logs its URL in Google Sheets
4. Uploads the result to the Postiz media library
5. GPT-5 writes an engaging caption
6. Publishes the post instantly or schedules it for later

## Who Is This For

- Real estate agents posting property photos
- E-commerce sellers updating product images
- Social media managers handling multiple accounts

## Setup

1. Connect Google Drive (select the upload folder)
2. Add your Wavespeed API key for Nano Banana
3. Connect Google Sheets for logging
4. Add Postiz API credentials and the integration ID
5. Enter your OpenAI API key for GPT-5 captioning

## ROI & Monetization

- Save 5-10 hours per week of manual editing and posting
- Offer this as a $1k-$3k/month content automation service for clients
- Scale to multi-platform posting (TikTok, LinkedIn) for premium retainers

## Strategy Insights

In the full walkthrough, I show how to:

- Build this workflow step by step
- Pitch it as a "Done-For-You Social Posting System"
- Automate outreach to agencies and creators who need it
- Turn this into recurring revenue with retainers

## Check Out My Channel

For more advanced AI automation systems that generate real business results, check out my YouTube channel, where I share the exact strategies I use to build automation agencies, sell high-value services, and scale to $20k+ monthly revenue.
by Harvex AI
# AI Lead Enrichment & Notification System

This n8n template automates lead enrichment for your business. Once a lead fills out a form, the workflow scrapes their website, produces a summary of their business, and logs everything into a CRM before notifying your team on Slack.

Some use cases: "speed-to-lead" optimization, lead enrichment, automated prospect research.

## How it works

1. **Ingestion**: A lead submits their details (Name, Email, Website) via a form.
2. **Intelligent scraping**: The workflow scrapes the provided URL.
3. **AI Analysis**: OpenAI's model (GPT-4o) analyzes the extracted data and determines whether there is enough info or whether the workflow also needs to scrape the "About Us" page.
4. **CRM Sync**: The CRM (Airtable) is updated with the enriched data.
5. **Notification**: An instant Slack notification is sent to the team channel.

## How to use the workflow

1. **Configure the form**: Open the trigger form and input the required fields.
2. **Set up OpenAI**: Ensure your credentials are connected.
3. **Database mapping**: Ensure your Airtable base has the following columns: Name, Website, AI Insight, Email, and Date.
4. **Slack setup**: Specify the desired Slack channel for notifications.
5. **Test it out**: Open the form, enter sample data (with a real website), and watch the system enrich the lead for you.

## Requirements

- **OpenAI API Key** (for analyzing website content and generating summaries)
- **Airtable** (for CRM and logging)
- **Slack** (for team notifications)
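The "enough info?" branch in the AI Analysis step could be approximated with a deterministic pre-check before calling the model. This is an illustrative heuristic, not the template's actual logic: the word-count threshold is an assumption you would tune.

```javascript
// Decide whether the scraped homepage text is substantial enough to
// summarize, or whether the workflow should also fetch the "About Us" page.
// minWords is an assumed threshold; adjust to taste.
function needsAboutPage(pageText, minWords = 150) {
  const words = pageText.trim().split(/\s+/).filter(Boolean);
  return words.length < minWords;
}
```

A cheap check like this can sit in an IF node and avoid a second scrape (and a second model call) for content-rich homepages.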
by vinci-king-01
# Job Posting Aggregator with Email and GitHub

⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow automatically aggregates certification-related job-posting requirements from multiple industry sources, compares them against last year's data stored in GitHub, and emails a concise change log to subscribed professionals. It streamlines annual requirement checks and renewal reminders, ensuring users never miss an update.

## Pre-conditions / Requirements

### Prerequisites

- n8n instance (self-hosted or n8n cloud)
- ScrapeGraphAI community node installed
- Git installed (for optional local testing of the repo)
- Working SMTP server or another Email credential supported by n8n

### Required Credentials

- **ScrapeGraphAI API Key** - enables web scraping of certification pages
- **GitHub Personal Access Token** - allows the workflow to read/write files in the repo
- **Email / SMTP Credentials** - sends the summary email to end users

### Specific Setup Requirements

| Resource | Purpose | Example |
|----------|---------|---------|
| GitHub Repository | Stores certification_requirements.json, versioned annually | https://github.com/<you>/cert-requirements.git |
| Watch List File | List of page URLs & selectors to scrape | Saved in the repo under /config/watchList.json |
| Email List | Semicolon-separated list of recipients | me@company.com;team@company.com |

## How it works

Key Steps:

1. **Manual Trigger**: Starts the workflow on demand or via a scheduled cron.
2. **Load Watch List (Code node)**: Reads the list of certification URLs and CSS selectors.
3. **Split In Batches**: Iterates through each URL to avoid rate limits.
4. **ScrapeGraphAI**: Scrapes requirement details from each page.
5. **Merge (Wait)**: Reassembles individual scrape results into a single JSON array.
6. **GitHub (Read File)**: Retrieves last year's certification_requirements.json.
7. **IF (Change Detector)**: Compares current vs. previous JSON and decides whether changes exist.
8. **Email Send**: Composes and sends a formatted summary of changes.
9. **GitHub (Upsert File)**: Commits the new JSON file back to the repo for future comparisons.

## Set up steps

Setup Time: 15-25 minutes

1. **Install Community Node**: From the n8n UI → Settings → Community Nodes → search for and install "ScrapeGraphAI".
2. **Create/Clone GitHub Repo**: Add an empty certification_requirements.json ({}) and a config/watchList.json with an array of objects like:

       [ { "url": "https://cert-body.org/requirements", "selector": "#requirements" } ]

3. **Generate GitHub PAT**: Scope it to repo; store it in n8n Credentials as "GitHub API".
4. **Add ScrapeGraphAI Credential**: Paste your API key into n8n Credentials.
5. **Configure Email Credentials**: E.g., SMTP with username/password or OAuth2.
6. **Open Workflow**: Import the template JSON into n8n.
7. **Update Environment Variables** (in the Code node or via n8n variables): GITHUB_REPO (e.g., user/cert-requirements) and EMAIL_RECIPIENTS.
8. **Test Run**: Trigger manually; verify the email content and the GitHub commit.
9. **Schedule**: Add a Cron node (optional) for yearly or quarterly automatic runs.

## Node Descriptions

Core Workflow Nodes:

- **Manual Trigger** - initiates the workflow manually or via an external schedule
- **Code (Load Watch List)** - reads and parses watchList.json from GitHub or static input
- **SplitInBatches** - controls request concurrency to avoid scraping bans
- **ScrapeGraphAI** - extracts requirement text using the provided CSS selectors or XPath
- **Merge (Combine)** - waits for all batches and merges them into one dataset
- **GitHub (Read/Write File)** - handles version-controlled storage of the JSON data
- **IF (Change Detector)** - compares hashes/JSON diffs to detect updates
- **EmailSend** - sends the change log, including renewal reminders and a diff summary
- **Sticky Note** - provides in-workflow documentation for future editors

Data Flow:

- Manual Trigger → Code (Load Watch List) → SplitInBatches
- SplitInBatches → ScrapeGraphAI → Merge
- Merge → GitHub (Read File) → IF (Change Detector)
- IF (True) → Email Send → GitHub (Upsert File)

## Customization Examples

Adjusting Scraper Configuration (inside a Watch List JSON object):

    { "url": "https://new-association.com/cert-update", "selector": ".content article:nth-of-type(1) ul" }

Custom Email Template (in the Email Send node's HTML content):

    Certification Updates - {{ $json.date }}
    The following certifications have new requirements:
    {{ $json.diffHtml }}
    For full details visit our GitHub repo.

## Data Output Format

The workflow outputs structured JSON data:

    {
      "timestamp": "2024-09-01T12:00:00Z",
      "source": "watchList.json",
      "current": {
        "AWS-SAA": "Version 3.0, requires renewed proctored exam",
        "PMP": "60 PDUs every 3 years"
      },
      "previous": {
        "AWS-SAA": "Version 2.0",
        "PMP": "60 PDUs every 3 years"
      },
      "changes": {
        "AWS-SAA": "Updated to Version 3.0; exam format changed."
      }
    }

## Troubleshooting

Common Issues:

- ScrapeGraphAI returns empty data - check the CSS/XPath selectors and ensure the page is publicly accessible
- GitHub authentication fails - verify the PAT scope includes repo and that the credential is linked in both GitHub nodes

Performance Tips:

- Limit the SplitInBatches size to 3-5 URLs when sources are heavy, to avoid timeouts
- Enable n8n's "Queue" execution mode for long-running scrapes

Pro Tips:

- Store selector samples in comments next to each watch list entry for future maintenance
- Use a Cron node set to "0 0 1 1 *" for an annual run exactly on Jan 1st
- Add a Telegram node after Email Send for instant mobile notifications
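The "IF (Change Detector)" comparison can be sketched as a small function producing a per-certification change map, consistent with the `current`/`previous`/`changes` keys in the output format above. The exact diff wording is illustrative.

```javascript
// Compare this year's requirement map against last year's and return a map
// of human-readable change descriptions, one entry per changed key.
function detectChanges(current, previous) {
  const changes = {};
  for (const key of Object.keys(current)) {
    if (current[key] !== previous[key]) {
      changes[key] = `Changed from "${previous[key] ?? "none"}" to "${current[key]}"`;
    }
  }
  return changes;
}
```

The IF node then routes to Email Send only when `Object.keys(changes).length > 0`.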
by Abdul Mir
## Overview

Stop spending hours formatting proposals. This workflow turns a short post-call form into a high-converting, fully personalized PandaDoc proposal, plus it updates your CRM and drafts the follow-up email for you.

After a sales call, just fill out a 3-minute form summarizing key pain points, solutions pitched, and the price. The workflow uses AI to generate polished proposal copy, then builds a PandaDoc draft using dynamic data mapped into the JSON body (which you can fully customize per business). It also updates the lead record in ClickUp with the proposal link, company name, and quote, then creates an email draft in Gmail, ready to send.

## Who's it for

- Freelancers and consultants sending service proposals
- Agencies closing deals over sales calls
- Sales reps who want to automate proposal follow-up
- Teams using ClickUp as their lightweight CRM

## How it works

1. After a call, fill out a short form with client details, pitch notes, and price
2. AI generates professional proposal copy based on the form input
3. The proposal is formatted and sent to PandaDoc via HTTP request
4. The ClickUp lead is updated with the Company Name, Proposal URL, and Quote/price
5. A Gmail draft is created using the proposal link and a thank-you message

## Example use case

> You hop off a call and fill out:
> - Prospect: Shopify agency
> - Pain: No lead gen system
> - Solution: Automated cold outreach
> - Price: $2,500/month
>
> 3 minutes later: the PandaDoc proposal is ready, the CRM is updated, and your email draft is waiting to be sent.

## How to set up

1. Replace the form with your preferred tool (e.g. Tally, Typeform)
2. Connect the PandaDoc API and structure your proposal template
3. Customize the JSON body inside the HTTP request to match your business
4. Link your ClickUp space and custom fields
5. Connect Gmail (or another email tool) for the final follow-up draft

## Requirements

- Form tool for capturing sales call notes
- OpenAI or another LLM key for generating proposal copy
- PandaDoc API access
- ClickUp custom fields set up for lead tracking
- Gmail integration

## How to customize

- Customize your PandaDoc proposal fields in the JSON body of the HTTP node
- Replace ClickUp with another CRM like HubSpot or Notion
- Adjust the AI tone (casual, premium, corporate) for proposal writing
- Add Slack or Telegram alerts when the draft is ready
- Add PDF generation or an auto-send email step
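The JSON body the HTTP node sends to PandaDoc could look roughly like the sketch below. This is a hedged example, not the template's exact payload: the token names (`Client.Company`, `Proposal.Price`, `Proposal.Body`) and the placeholder UUID are assumptions that must match the tokens defined in your own PandaDoc template.

```javascript
// Build a PandaDoc "create document from template" request body from the
// post-call form data and the AI-generated proposal copy.
function buildPandaDocBody(form, proposalCopy) {
  return {
    name: `Proposal - ${form.company}`,
    template_uuid: "YOUR_TEMPLATE_UUID", // replace with your template's UUID
    recipients: [{ email: form.email, role: "Client" }],
    tokens: [
      { name: "Client.Company", value: form.company },
      { name: "Proposal.Price", value: form.price },
      { name: "Proposal.Body", value: proposalCopy },
    ],
  };
}
```

Keeping this mapping in one place makes the per-business customization mentioned above a matter of editing token names rather than rewriting the HTTP node.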
by Rahul Joshi
## Description

Never miss a lead again with this SLA Breach Alert automation powered by n8n. This workflow continuously monitors your Google Sheets for un-replied leads and automatically triggers instant Telegram alerts, ensuring your team takes immediate action. By running frequent SLA checks, enriching alerts with direct Google Sheet links, and sending real-time notifications, this automation helps prevent unattended leads, reduce response delays, and boost customer engagement.

## What This Template Does

- Runs every 5 minutes to monitor SLA breaches
- Fetches lead data (status, contact, timestamps) from Google Sheets
- Identifies leads marked "Un-replied" beyond the 15-minute SLA
- Enriches alerts with direct Google Sheet row links for quick action
- Sends Telegram alerts with lead details for immediate response

## Step-by-Step Setup

1. **Prepare Your Google Sheet**: Create a sheet with at least the following columns:
   - Lead Name
   - Email
   - Phone
   - Status (values: Replied, Un-replied)
   - Timestamp (time of last update/reply)
2. **Set Up Google Sheets in n8n**: Connect your Google account in n8n and point the workflow to your sheet (remove any hardcoded document IDs before sharing).
3. **Configure the SLA Check**: Use the IF node to filter leads where Status = Un-replied and the time since the timestamp exceeds 15 minutes.
4. **Enrich Alerts with Links**: Add a Code node to generate direct row links to the sheet.
5. **Set Up the Telegram Bot**: Create a Telegram bot via @BotFather, add the bot to your team chat, and store the botToken securely (remove the chatId before sharing templates).
6. **Send Alerts**: Configure the Telegram node in n8n to send lead details plus the direct Google Sheet link.

## Customization Guidance

- Adjust the SLA window (e.g., 30 minutes or 1 hour) by modifying the IF node condition.
- Add more fields from Google Sheets (e.g., Company, Owner) to enrich the alert.
- Replace Telegram with Slack or Email if your team prefers a different channel.
- Extend the workflow to auto-assign leads in your CRM once alerted.
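The SLA check performed by the IF node can be sketched as a single predicate. The column names (`Status`, `Timestamp`) follow the sheet layout described above; the 15-minute window is the default and can be widened as noted.

```javascript
// A lead breaches the SLA when it is still "Un-replied" and its last
// timestamp is older than the SLA window (default: 15 minutes).
function isSlaBreach(lead, now = Date.now(), slaMinutes = 15) {
  if (lead.Status !== "Un-replied") return false;
  const t = Date.parse(lead.Timestamp);
  // Rows with an unparseable timestamp are skipped rather than alerted on.
  return !Number.isNaN(t) && now - t > slaMinutes * 60 * 1000;
}
```

Changing the SLA window (e.g., to 30 or 60 minutes) is then a one-argument change rather than an edit to the comparison logic.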
## Perfect For

- Sales teams that need to respond to leads within strict SLAs
- Support teams ensuring no customer request is ignored
- Businesses aiming to keep lead response times sharp and consistent
by Bakir Ali
Automated BBB Lead Generation with BrowserAct ๐ Overview This workflow automates business data extraction, duplicate checking, and email outreach using BrowserAct, Google Sheets, Gmail, and Google Gemini AI โ all inside n8n. Itโs designed for marketers, lead generation specialists, or automation developers who want to build a fully autonomous AI agent that finds businesses online, filters duplicates, and automatically sends personalized outreach emails. ๐งฉ Key Features ๐ BrowserAct Integration โ Scrapes business data (name, phone, email, website, rating) from any target site. ๐ค AI Data Extraction Agent โ Uses Google Gemini AI to clean, structure, and validate scraped data into standardized JSON. ๐ Google Sheets Sync โ Reads all existing records Checks for duplicates Appends new rows automatically โ๏ธ Automated Gmail Outreach โ Validates email addresses Sends outreach emails to valid leads Logs each status (e.g., Successful, Duplicate, Pending - Invalid Email) โณ Smart Delay Control โ Uses Wait node to pause execution and respect email sending limits (max 2 emails per run). ๐ ๏ธ Included Nodes | Node | Function | | -------------------------- | ------------------------------------------------- | | ๐ Schedule Trigger | Runs the workflow automatically on schedule | | ๐ BrowserAct | Scrapes or extracts business data | | โ๏ธ If Node | Checks scraping results before processing | | ๐ง AI Agent (Gemini) | Extracts structured business info | | ๐ป Code (JavaScript) | Cleans and parses AI output into usable JSON | | ๐ฉ AI Agent 2 (Gemini) | Handles decision-making for email + sheet updates | | ๐ Google Sheets Tools | Reads, appends, and manages lead data | | ๐จ Gmail Node | Sends automated outreach emails | | โฑ๏ธ Wait Node | Adds delay to control workflow speed | ๐งพ How It Works Schedule Trigger starts the automation. BrowserAct fetches business listings based on defined keywords and location. 
- AI Agent (Gemini) extracts business details (business_name, website_url, phone_number, email_address, rating).
- The JavaScript Code node parses the AI's JSON response.
- AI Agent 2 (Gemini) decides:
  - If duplicate: sends a "Duplicate data found" notice to your email address
  - If invalid email: marks the lead as "Pending - Invalid Email"
  - If valid email: sends the outreach email via Gmail and updates the Google Sheet
- The final output returns structured statuses for each processed business.

Workflow Diagram
> * Schedule Trigger
> * BrowserAct
> * AI Agent (Gemini)
> * JavaScript Code
> * Gmail & Google Sheets tools

Setup Instructions
1. Connect your BrowserAct, Google Sheets, Gmail, and Google Gemini API credentials.
2. Define search keywords and locations inside the BrowserAct node.
3. Set your Google Sheet ID in the relevant nodes.
4. Customize the Gmail message if needed.
5. Activate the workflow and schedule it.

Output Example

[
  { "business_name": "ABC Restaurant", "email_sent": "Successful" },
  { "business_name": "XYZ Foods", "email_sent": "Duplicate - Already Exist" },
  { "business_name": "Fresh Eats", "email_sent": "Pending - Invalid Email" }
]

Created by Bakir Ali
Automation & AI Workflow Creator, specialized in BrowserAct, Google AI (Gemini), and n8n-based automation systems.
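The JSON-parsing step above could look something like this minimal sketch. This is not the template's actual code: the helper name `parseAgentOutput` and the payload-slicing approach are assumptions, while the field names follow the schema described in the template (business_name, website_url, phone_number, email_address, rating).

```javascript
// Hypothetical sketch of the "Code (JavaScript)" node. Gemini often wraps its
// JSON answer in markdown fences or extra prose, so we slice out the JSON
// payload before parsing, then normalize each record's fields.
function parseAgentOutput(raw) {
  // Locate the outermost JSON array (or single object) inside the model's reply
  const start = raw.indexOf('[') !== -1 ? raw.indexOf('[') : raw.indexOf('{');
  const end = Math.max(raw.lastIndexOf(']'), raw.lastIndexOf('}'));
  const data = JSON.parse(raw.slice(start, end + 1));

  // Normalize to an array of business records with predictable fields
  const records = Array.isArray(data) ? data : [data];
  return records.map((r) => ({
    business_name: r.business_name ?? '',
    website_url: r.website_url ?? '',
    phone_number: r.phone_number ?? '',
    email_address: String(r.email_address ?? '').trim().toLowerCase(),
    rating: r.rating ?? null,
  }));
}

// In an n8n Code node, the parsed records would typically be returned as items:
// return parseAgentOutput($json.output).map((j) => ({ json: j }));
```

Lower-casing and trimming the email here makes the duplicate check in the next step less sensitive to formatting differences in the scraped data.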
by n8n Automation Expert | Template Creator | 2+ Years Experience
Automated Indonesian Weather Monitoring with Smart Notifications

Stay ahead of weather changes with this comprehensive monitoring system that fetches real-time data from Indonesia's official meteorological agency (BMKG) and delivers beautiful, actionable weather reports directly to your Telegram.

What This Workflow Does
This intelligent weather monitoring system automatically:
- **Fetches Official Data**: connects to BMKG's public weather API for accurate Indonesian forecasts
- **Smart Processing**: analyzes temperature, humidity, precipitation, and wind conditions
- **Risk Assessment**: generates contextual warnings for extreme weather conditions
- **Automated Alerts**: sends formatted weather reports to Telegram every 6 hours
- **Error Handling**: includes robust error detection and notification

Perfect For
- **Local Communities**: keep neighborhoods informed about weather changes
- **Business Operations**: plan outdoor activities and logistics based on weather
- **Emergency Preparedness**: receive early warnings for extreme weather conditions
- **Personal Planning**: never get caught unprepared by sudden weather changes
- **Agricultural Monitoring**: track conditions affecting farming and outdoor work

Key Features
- **Automated Scheduling**: runs every 6 hours, with a manual trigger option
- **Comprehensive Reports**: current conditions plus 6-hour detailed forecasts
- **Smart Warnings**: contextual alerts for temperature extremes and rain probability
- **Beautiful Formatting**: rich Telegram messages with emojis and structured data
- **Error Recovery**: automatic error handling with a notification system
- **Location-Aware**: supports any Indonesian location via BMKG regional codes

What You'll Get
Each weather report includes:
- Current temperature, humidity, and weather conditions
- 6-hour detailed forecast with timestamps
- Wind speed and direction information
- Rain probability and visibility data
- Personalized warnings and recommendations
- Average daily statistics and trends

Setup Requirements
- **Telegram Bot Token**: create a bot via @BotFather
- **Chat ID**: your personal or group chat identifier
- **BMKG Location Code**: the regional administrative code for your area

Pro Tips
- Customize the location by changing the adm4 parameter in the HTTP request
- Adjust the scheduling interval based on your monitoring needs
- Modify the warning thresholds in the processing code
- Add multiple chat IDs for broader distribution
- Integrate with other n8n workflows for advanced automation

Why Choose This Template
- **Production Ready**: includes comprehensive error handling and logging
- **Highly Customizable**: easy to modify for different locations and preferences
- **Official Data Source**: uses Indonesia's trusted meteorological service
- **User-Friendly Output**: clean, readable reports perfect for daily use
- **Scalable Design**: easily extend to multiple locations or notification channels

Transform your weather awareness with this professional-grade monitoring system that brings Indonesia's official weather data right to your fingertips!

Keywords: weather monitoring, BMKG API, Telegram notifications, Indonesian weather, automated alerts, meteorological data, weather forecasting, n8n automation, weather API integration
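The warning thresholds mentioned in the tips above refer to logic along these lines. This is an illustrative sketch rather than the template's actual code: the function name, the threshold values, and the field names (`t` for temperature in °C, `hu` for humidity %, `tp` for rain probability %) are assumptions to be adapted to the BMKG response you actually receive.

```javascript
// Illustrative sketch of the warning logic a processing Code node might apply
// to one forecast entry. All thresholds below are example values -- tune them
// in the Code node to match your region and risk tolerance.
function buildWarnings(entry) {
  const warnings = [];
  if (entry.t >= 35) warnings.push('Extreme heat: limit outdoor activity');
  if (entry.t <= 18) warnings.push('Unusually cool conditions for the region');
  if (entry.hu >= 90) warnings.push('Very high humidity');
  if (entry.tp >= 60) warnings.push('High rain probability: carry rain gear');
  return warnings;
}
```

The returned array can then be joined into the Telegram message, or left empty so the report simply omits the warnings section.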
by Oneclick AI Squad
This automated n8n workflow checks daily travel itineraries, syncs upcoming trips to Google Calendar, and sends reminder notifications to travelers via email or SMS. Perfect for travel agencies, tour operators, and organizations managing group trips that need to keep travelers informed about their schedules and bookings.

What This Workflow Does
- Automatically checks travel itineraries every day
- Identifies today's trips and upcoming departures
- Syncs trip information to Google Calendar
- Sends personalized reminders to assigned travelers
- Tracks reminder delivery status and logs activities
- Handles both email and SMS notification preferences
- Provides pre-travel checklists and booking confirmations
- Manages multi-day trip schedules and activities

Main Components
- **Daily Travel Check**: triggers daily to check travel itineraries
- **Read Travel Itinerary**: retrieves today's trips and bookings from a database/Excel file
- **Filter Today's Trips**: identifies trips departing today and upcoming activities
- **Has Trips Today?**: checks whether any trips are scheduled
- **Read Traveler Contacts**: gets traveler contact information for assigned trips
- **Sync to Google Calendar**: creates/updates trip events in Google Calendar
- **Create Traveler Reminders**: generates personalized reminder messages with travel details
- **Split Into Batches**: processes reminders in manageable batches
- **Email or SMS?**: routes based on traveler communication preferences
- **Prepare Email Reminders**: creates detailed email reminder content with checklists
- **Prepare SMS Reminders**: creates SMS reminder content optimized for text
- **Read Reminder Log**: checks previous reminder history
- **Update Reminder Log**: records sent reminders with timestamps
- **Save Reminder Log**: saves updated log data for an audit trail

Essential Prerequisites
- Travel itinerary database/Excel file with trip assignments
- Traveler contact database with email addresses and phone numbers
- Google Calendar API access and credentials
- SMTP server for email notifications
- SMS service provider (Twilio, Nexmo, etc.) for text reminders
- Reminder log file for tracking sent notifications
- Booking confirmation system (flight, hotel, transport)

Required Data Files

trip_itinerary.xlsx:
Trip ID | Trip Name | Date | Departure Time | Duration | Departure Location | Destination | Hotel | Flight Number | Assigned Travelers | Status | Booking Reference | Cost

traveler_contacts.xlsx:
Traveler ID | First Name | Last Name | Email | Phone | Preferred Contact | Assigned Trips | Passport Number | Emergency Contact

reminder_log.xlsx:
Log ID | Date | Traveler ID | Trip ID | Contact Method | Status | Sent Time | Message Preview | Confirmation

Key Features
- **Daily Automation**: runs automatically every day at scheduled times
- **Calendar Sync**: syncs trips to Google Calendar for easy viewing
- **Smart Reminders**: sends email or SMS based on traveler preference
- **Batch Processing**: handles multiple travelers efficiently
- **Activity Logging**: tracks all reminder activities and delivery status
- **Duplicate Prevention**: avoids sending multiple reminders for the same trip
- **Multi-Channel**: supports both email and SMS notifications
- **Travel-Specific**: includes flight numbers, locations, and accommodation details
- **Pre-Travel Checklist**: provides comprehensive packing and document reminders
- **Multi-Destination**: manages complex multi-stop itineraries

Quick Setup
1. Import the workflow JSON into n8n
2. Configure the daily trigger schedule (recommended: 6 AM and 6 PM)
3. Set up the trip itinerary and traveler contact files
4. Connect Google Calendar API credentials
5. Configure the SMTP server for emails
6. Set up an SMS service provider (Twilio, Nexmo, or similar)
7. Map Excel sheet columns to workflow variables
8. Test with sample trip data
9. Activate the workflow

Parameters to Configure
- schedule_file_path: path to the trip itinerary file
- contacts_file_path: path to the traveler contacts file
- reminder_hours: hours before departure to send the reminder (default: 24)
- google_calendar_id: Google Calendar ID for syncing trips
- google_api_credentials: Google Calendar API credentials
- smtp_host: email server settings
- smtp_user: email username
- smtp_password: email password
- sms_api_key: SMS service API key
- sms_phone_number: SMS sender phone number
- reminder_log_path: path to the reminder log file

Sample Reminder Messages

Email Subject: "✈️ Travel Reminder: [Trip Name] Today at [Time]"

Email Body:
Hello [Traveler Name],

Your trip is happening today! Here are your travel details:

Trip: [Trip Name]
Departure: [Departure Time]
From: [Departure Location]
To: [Destination]
Flight/Transport: [Flight Number]
Hotel: [Hotel Name]
Duration: [X] days

Pre-Travel Checklist:
- Passport and travel documents
- Travel insurance documents
- Hotel confirmations
- Medications and toiletries
- Weather-appropriate clothing
- Phone charger and adapters

Please arrive at the departure point 2 hours early!

Have a wonderful trip!

SMS: "✈️ Travel Reminder: '[Trip Name]' departs at [Time] today from [Location]. Arrive 2 hours early! Flight: [Number]"

Tomorrow Evening Preview (SMS): "Tomorrow: '[Trip Name]' departs at [Time] from [Location]. Pack tonight! ([X] days)"

Use Cases
- Daily trip departure reminders for travelers
- Last-minute itinerary change notifications
- Flight cancellation and delay alerts
- Hotel check-in and checkout reminders
- Travel document expiration warnings
- Group tour activity scheduling
- Adventure/hiking trip departure alerts
- Business travel itinerary updates
- Family vacation coordination
- Study abroad program notifications
- Multi-city tour route confirmations
- Transport connection reminders

Advanced Features

Reminder Escalation
- 24-hour reminder: full details with checklist
- 6-hour reminder: quick confirmation with transport details
- 2-hour reminder: urgent departure notification

Conditional Logic
- Different messages for single-day vs. multi-day trips
- Domestic vs. international travel variations
- Group-size-based messaging
- Weather-based travel advisories

Integration Capabilities
- Connect to airline APIs for real-time flight status
- Link to hotel management systems for check-in info
- Integrate weather services for destination forecasts
- Sync with payment systems for booking confirmations

Troubleshooting

| Issue | Solution |
|-------|----------|
| Reminders not sending | Check email/SMS credentials and service quotas |
| Calendar sync failing | Verify Google Calendar API permissions |
| Duplicate reminders | Check for overlapping reminder time windows |
| Missing traveler data | Verify contact file formatting and column mapping |
| Batch processing slow | Reduce the batch size in the Split Into Batches node |

Security Considerations
- Store API credentials in n8n environment variables
- Use OAuth2 for Google Calendar authentication
- Encrypt sensitive data in reminder logs
- Implement role-based access to trip data
- Log and audit all reminder activities
- Comply with GDPR/privacy regulations for traveler data

Performance Metrics
- **Processing Time**: ~2-5 seconds per 50 travelers
- **Success Rate**: >99% for delivery logging
- **Calendar Sync**: real-time updates
- **Batch Limit**: 10 travelers per batch (configurable)

Support & Maintenance
- Review reminder logs weekly for delivery issues
- Update traveler contacts as needed
- Monitor email/SMS service quotas
- Test the workflow after system updates
- Archive old reminder logs monthly
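As a rough illustration of the "Prepare SMS Reminders" step, a Code node could fill the SMS template shown above from one trip row. This is a hypothetical sketch, not the template's shipped code: the function name and the camel-cased field names are assumptions, mapped from the trip_itinerary.xlsx columns described earlier.

```javascript
// Hypothetical sketch: build the SMS reminder text for one trip row.
// Field names (tripName, departureTime, ...) are assumed mappings from the
// trip_itinerary.xlsx columns (Trip Name, Departure Time, ...).
function buildSmsReminder(trip) {
  return (
    `Travel Reminder: '${trip.tripName}' departs at ${trip.departureTime} ` +
    `today from ${trip.departureLocation}. Arrive 2 hours early! ` +
    `Flight: ${trip.flightNumber}`
  );
}
```

Keeping the SMS builder as a pure function like this makes it easy to unit-test the message format before wiring it to the SMS provider node.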