by Rahul Joshi
Description
Never miss a lead again with this SLA Breach Alert automation powered by n8n! This workflow continuously monitors your Google Sheets for un-replied leads and automatically triggers instant Telegram alerts, ensuring your team takes immediate action. By running frequent SLA checks, enriching alerts with direct Google Sheet links, and sending real-time notifications, this automation helps prevent unattended leads, reduce response delays, and boost customer engagement.

What This Template Does
📅 Runs every 5 minutes to monitor SLA breaches
📋 Fetches lead data (status, contact, timestamps) from Google Sheets
🕒 Identifies leads marked “Un-replied” beyond the 15-minute SLA
🔗 Enriches alerts with direct Google Sheet row links for quick action
📲 Sends Telegram alerts with lead details for immediate response

Step-by-Step Setup
1. Prepare Your Google Sheet: Create a sheet with the following columns (minimum required): Lead Name, Email, Phone, Status (values: Replied, Un-replied), Timestamp (time of last update/reply).
2. Set Up Google Sheets in n8n: Connect your Google account in n8n. Point the workflow to your sheet (remove any hardcoded document IDs before sharing).
3. Configure SLA Check: Use the IF node to filter leads where Status = Un-replied and time since timestamp > 15 minutes.
4. Enrich Alerts with Links: Add a Code node to generate direct row links to the sheet (a minimal sketch follows this listing).
5. Set Up Telegram Bot: Create a Telegram bot via @BotFather, add the bot to your team chat, and store the botToken securely (remove chatId before sharing templates).
6. Send Alerts: Configure the Telegram node in n8n to send lead details + direct Google Sheet link.

Customization Guidance
- Adjust the SLA window (e.g., 30 minutes or 1 hour) by modifying the IF node condition.
- Add more fields from Google Sheets (e.g., Company, Owner) to enrich the alert.
- Replace Telegram with Slack or Email if your team prefers a different channel.
- Extend the workflow to auto-assign leads in your CRM once alerted.

Perfect For
- Sales teams that need to respond to leads within strict SLAs
- Support teams ensuring no customer request is ignored
- Businesses aiming to keep lead response times sharp and consistent
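As a rough illustration of steps 3 and 4, the SLA check and the row-link enrichment can be combined in a single Code node. This is a minimal sketch, assuming the Google Sheets node supplies the Status and Timestamp columns plus a row_number field, and that SHEET_ID and GID are placeholders you replace with your own values:

```javascript
// Minimal Code node sketch: flag leads that breached the 15-minute SLA and
// attach a direct link to their row in the sheet.
// SHEET_ID and GID are placeholders; column names follow the setup above.
const SHEET_ID = 'YOUR_SHEET_ID';
const GID = '0';
const SLA_MINUTES = 15;

const breached = [];
for (const item of $input.all()) {
  const { Status, Timestamp } = item.json;
  const ageMinutes = (Date.now() - new Date(Timestamp).getTime()) / 60000;
  if (Status === 'Un-replied' && ageMinutes > SLA_MINUTES) {
    const row = item.json.row_number; // exposed by the Google Sheets node in recent n8n versions
    breached.push({
      json: {
        ...item.json,
        minutesOverdue: Math.round(ageMinutes - SLA_MINUTES),
        rowLink: `https://docs.google.com/spreadsheets/d/${SHEET_ID}/edit#gid=${GID}&range=A${row}`,
      },
    });
  }
}
return breached;
```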
by n8n Automation Expert | Template Creator | 2+ Years Experience
🌤️ Automated Indonesian Weather Monitoring with Smart Notifications

Stay ahead of weather changes with this comprehensive monitoring system that fetches real-time data from Indonesia's official meteorological agency (BMKG) and delivers beautiful, actionable weather reports directly to your Telegram.

⚡ What This Workflow Does
This intelligent weather monitoring system automatically:
- **Fetches Official Data**: Connects to BMKG's public weather API for accurate Indonesian forecasts
- **Smart Processing**: Analyzes temperature, humidity, precipitation, and wind conditions
- **Risk Assessment**: Generates contextual warnings for extreme weather conditions
- **Automated Alerts**: Sends formatted weather reports to Telegram every 6 hours
- **Error Handling**: Includes robust error detection and notification system

🎯 Perfect For
- **Local Communities**: Keep neighborhoods informed about weather changes
- **Business Operations**: Plan outdoor activities and logistics based on weather
- **Emergency Preparedness**: Receive early warnings for extreme weather conditions
- **Personal Planning**: Never get caught unprepared by sudden weather changes
- **Agricultural Monitoring**: Track conditions affecting farming and outdoor work

🛠️ Key Features
- 🔄 **Automated Scheduling**: Runs every 6 hours with manual trigger option
- 📊 **Comprehensive Reports**: Current conditions + 6-hour detailed forecasts
- ⚠️ **Smart Warnings**: Contextual alerts for temperature extremes and rain probability
- 🎨 **Beautiful Formatting**: Rich Telegram messages with emojis and structured data
- 🔧 **Error Recovery**: Automatic error handling with notification system
- 📍 **Location-Aware**: Supports any Indonesian location via BMKG regional codes

📋 What You'll Get
Each weather report includes:
- Current temperature, humidity, and weather conditions
- 6-hour detailed forecast with timestamps
- Wind speed and direction information
- Rain probability and visibility data
- Personalized warnings and recommendations
- Average daily statistics and trends

🚀 Setup Requirements
- **Telegram Bot Token**: Create a bot via @BotFather
- **Chat ID**: Your personal or group chat identifier
- **BMKG Location Code**: Regional administrative code for your area

💡 Pro Tips
- Customize the location by changing the adm4 parameter in the HTTP request
- Adjust scheduling interval based on your monitoring needs
- Modify warning thresholds in the processing code (a minimal sketch follows this listing)
- Add multiple chat IDs for broader distribution
- Integrate with other n8n workflows for advanced automation

🌟 Why Choose This Template
- **Production Ready**: Includes comprehensive error handling and logging
- **Highly Customizable**: Easy to modify for different locations and preferences
- **Official Data Source**: Uses Indonesia's trusted meteorological service
- **User-Friendly Output**: Clean, readable reports perfect for daily use
- **Scalable Design**: Easily extend for multiple locations or notification channels

Transform your weather awareness with this professional-grade monitoring system that brings Indonesia's official weather data right to your fingertips!

Keywords: weather monitoring, BMKG API, Telegram notifications, Indonesian weather, automated alerts, meteorological data, weather forecasting, n8n automation, weather API integration
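To make the "modify warning thresholds" tip concrete, here is a minimal Code node sketch. The field names (temp, humidity, rainProbability) are placeholders to map onto whatever your BMKG HTTP Request node actually returns, and the threshold values are assumptions you should tune for your area:

```javascript
// Code node sketch: derive simple warnings for each forecast entry.
// Field names (temp, humidity, rainProbability) are placeholders; rename them
// to match the parsed BMKG response, and adjust the thresholds as needed.
return $input.all().map((item) => {
  const f = item.json;
  const warnings = [];

  if (f.temp >= 35) warnings.push('🔥 Extreme heat expected, limit outdoor activity.');
  if (f.temp <= 18) warnings.push('🧊 Unusually cool, plan accordingly.');
  if (f.rainProbability >= 70) warnings.push('🌧️ High chance of rain, carry protection.');
  if (f.humidity >= 90) warnings.push('💧 Very humid conditions.');

  return {
    json: {
      ...f,
      warnings,
      summary: warnings.length ? warnings.join('\n') : '✅ No weather warnings for this period.',
    },
  };
});
```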
by Oneclick AI Squad
This automated n8n workflow checks daily travel itineraries, syncs upcoming trips to Google Calendar, and sends reminder notifications to travelers via email or SMS. Perfect for travel agencies, tour operators, and organizations managing group trips to keep travelers informed about their schedules and bookings.

What This Workflow Does
- Automatically checks travel itineraries every day
- Identifies today's trips and upcoming departures
- Syncs trip information to Google Calendar
- Sends personalized reminders to assigned travelers
- Tracks reminder delivery status and logs activities
- Handles both email and SMS notification preferences
- Provides pre-travel checklists and booking confirmations
- Manages multi-day trip schedules and activities

Main Components
- **Daily Travel Check** - Triggers daily to check travel itineraries
- **Read Travel Itinerary** - Retrieves today's trips and bookings from database/Excel
- **Filter Today's Trips** - Identifies trips departing today and upcoming activities (a minimal sketch follows this listing)
- **Has Trips Today?** - Checks if there are any trips scheduled
- **Read Traveler Contacts** - Gets traveler contact information for assigned trips
- **Sync to Google Calendar** - Creates/updates trip events in Google Calendar
- **Create Traveler Reminders** - Generates personalized reminder messages with travel details
- **Split Into Batches** - Processes reminders in manageable batches
- **Email or SMS?** - Routes based on traveler communication preferences
- **Prepare Email Reminders** - Creates detailed email reminder content with checklists
- **Prepare SMS Reminders** - Creates SMS reminder content optimized for text
- **Read Reminder Log** - Checks previous reminder history
- **Update Reminder Log** - Records sent reminders with timestamps
- **Save Reminder Log** - Saves updated log data for audit trail

Essential Prerequisites
- Travel itinerary database/Excel file with trip assignments
- Traveler contact database with email and phone numbers
- Google Calendar API access and credentials
- SMTP server for email notifications
- SMS service provider (Twilio, Nexmo, etc.) for text reminders
- Reminder log file for tracking sent notifications
- Booking confirmation system (flight, hotel, transport)

Required Data Files

trip_itinerary.xlsx:
Trip ID | Trip Name | Date | Departure Time | Duration
Departure Location | Destination | Hotel | Flight Number
Assigned Travelers | Status | Booking Reference | Cost

traveler_contacts.xlsx:
Traveler ID | First Name | Last Name | Email | Phone
Preferred Contact | Assigned Trips | Passport Number | Emergency Contact

reminder_log.xlsx:
Log ID | Date | Traveler ID | Trip ID | Contact Method
Status | Sent Time | Message Preview | Confirmation

Key Features
⏰ Daily Automation: Runs automatically every day at scheduled times
📅 Calendar Sync: Syncs trips to Google Calendar for easy viewing
📧 Smart Reminders: Sends email or SMS based on traveler preference
👥 Batch Processing: Handles multiple travelers efficiently
📊 Activity Logging: Tracks all reminder activities and delivery status
🔄 Duplicate Prevention: Avoids sending multiple reminders
📱 Multi-Channel: Supports both email and SMS notifications
✈️ Travel-Specific: Includes flight numbers, locations, accommodation details
📋 Pre-Travel Checklist: Provides comprehensive packing and document reminders
🌍 Multi-Destination: Manages complex multi-stop itineraries

Quick Setup
1. Import workflow JSON into n8n
2. Configure daily trigger schedule (recommended: 6 AM and 6 PM)
3. Set up trip itinerary and traveler contact files
4. Connect Google Calendar API credentials
5. Configure SMTP server for emails
6. Set up SMS service provider (Twilio, Nexmo, or similar)
7. Map Excel sheet columns to workflow variables
8. Test with sample trip data
9. Activate workflow

Parameters to Configure
- schedule_file_path: Path to trip itinerary file
- contacts_file_path: Path to traveler contacts file
- reminder_hours: Hours before departure to send reminder (default: 24)
- google_calendar_id: Google Calendar ID for syncing trips
- google_api_credentials: Google Calendar API credentials
- smtp_host: Email server settings
- smtp_user: Email username
- smtp_password: Email password
- sms_api_key: SMS service API key
- sms_phone_number: SMS sender phone number
- reminder_log_path: Path to reminder log file

Sample Reminder Messages

Email Subject: "✈️ Travel Reminder: [Trip Name] Today at [Time]"

Email Body:
Hello [Traveler Name],
Your trip is happening today! Here are your travel details:
Trip: [Trip Name]
Departure: [Departure Time]
From: [Departure Location]
To: [Destination]
Flight/Transport: [Flight Number]
Hotel: [Hotel Name]
Duration: [X] days

Pre-Travel Checklist:
☑ Passport and travel documents
☑ Travel insurance documents
☑ Hotel confirmations
☑ Medications and toiletries
☑ Weather-appropriate clothing
☑ Phone charger and adapters

⚠️ Please arrive at the departure point 2 hours early!
Have a wonderful trip!

SMS: "✈️ Travel Reminder: '[Trip Name]' departs at [Time] today from [Location]. Arrive 2 hours early! Flight: [Number]"

Tomorrow Evening Preview (SMS): "📅 Tomorrow: '[Trip Name]' departs at [Time] from [Location]. Pack tonight! ([X] days)"

Use Cases
- Daily trip departure reminders for travelers
- Last-minute itinerary change notifications
- Flight cancellation and delay alerts
- Hotel check-in and checkout reminders
- Travel document expiration warnings
- Group tour activity scheduling
- Adventure/hiking trip departure alerts
- Business travel itinerary updates
- Family vacation coordination
- Study abroad program notifications
- Multi-city tour route confirmations
- Transport connection reminders

Advanced Features

Reminder Escalation
- 24-hour reminder: Full details with checklist
- 6-hour reminder: Quick confirmation with transport details
- 2-hour reminder: Urgent departure notification

Conditional Logic
- Different messages for single-day vs. multi-day trips
- Domestic vs. international travel variations
- Group size-based messaging
- Weather-based travel advisories

Integration Capabilities
- Connect to airline APIs for real-time flight status
- Link to hotel management systems for check-in info
- Integrate weather services for destination forecasts
- Sync with payment systems for booking confirmations

Troubleshooting

| Issue | Solution |
|-------|----------|
| Reminders not sending | Check email/SMS credentials and service quotas |
| Calendar sync failing | Verify Google Calendar API permissions |
| Duplicate reminders | Check for overlapping reminder time windows |
| Missing traveler data | Verify contact file formatting and column mapping |
| Batch processing slow | Reduce batch size in Split Into Batches node |

Security Considerations
- Store API credentials in n8n environment variables
- Use OAuth2 for Google Calendar authentication
- Encrypt sensitive data in reminder logs
- Implement role-based access to trip data
- Audit log all reminder activities
- Comply with GDPR/privacy regulations for traveler data

Performance Metrics
- **Processing Time**: ~2-5 seconds per 50 travelers
- **Success Rate**: >99% for delivery logging
- **Calendar Sync**: Real-time updates
- **Batch Limit**: 10 travelers per batch (configurable)

Support & Maintenance
- Review reminder logs weekly for delivery issues
- Update traveler contacts as needed
- Monitor email/SMS service quotas
- Test workflow after system updates
- Archive old reminder logs monthly
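The "Filter Today's Trips" component referenced above can be a small Code node. This sketch assumes the itinerary rows use the Date and Status columns from the sample trip_itinerary.xlsx; adjust the names to match your own file:

```javascript
// Code node sketch: keep only trips departing today that are not cancelled.
// Column names (Date, Status) mirror the sample trip_itinerary.xlsx above.
// Note: dates are compared in UTC; adjust if your sheet uses local dates.
const today = new Date().toISOString().slice(0, 10); // YYYY-MM-DD

return $input.all().filter((item) => {
  const trip = item.json;
  const departs = new Date(trip.Date);
  if (isNaN(departs.getTime())) return false; // skip rows with unparseable dates
  return departs.toISOString().slice(0, 10) === today && trip.Status !== 'Cancelled';
});
```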
by Automate With Marc
Auto-Edit Google Drive Images with Nano Banana + Social Auto-Post

Drop an image into Google Drive and let this workflow handle the rest: it auto-cleans and enhances the image with Google's Nano Banana (via the Wavespeed API), generates a catchy caption with GPT-5, and publishes directly to your connected social accounts using Postiz.

👉 Watch step-by-step video tutorials of workflows like these at https://www.youtube.com/watch?v=4wk6PYgBtBM&list=PL05w1TE8X3bb1H9lXBqUy98zmTrrPP-s1

What it does
- Triggers from Google Drive when a new image is uploaded
- Sends the image to Nano Banana to declutter, brighten, and make it real-estate/photo-listing ready
- Polls for the edited result until it's complete (a minimal sketch follows this listing)
- Logs the edited image URL into Google Sheets for tracking
- Downloads and uploads the edited image into the Postiz media library
- Generates an engaging caption with the GPT-5 Caption Agent
- Publishes instantly to Instagram (can be extended to TikTok, LinkedIn, etc.)

Perfect for
- Real-estate agents posting property shots
- Ecommerce sellers updating product catalogs
- Social media marketers needing fast, polished posts

Apps & Services Used
- Google Drive (trigger)
- Wavespeed API – Google Nano Banana (image editing)
- Google Sheets (logging)
- Postiz (social scheduling/posting)
- OpenAI GPT-5 (caption agent)

Setup
1. Connect your Google Drive and select the upload folder.
2. Add your Wavespeed API key for Nano Banana.
3. Connect Google Sheets for logging.
4. Add Postiz API credentials and set the integration ID for your channel(s).
5. Enter your OpenAI API key for GPT-5 captioning.

Customization
- Adjust the edit prompt for different use cases (e.g., product cleanup, lighting tweaks).
- Change the Postiz post type to scheduled instead of "now".
- Add more Postiz posts for multi-platform publishing.
- Insert an approval loop (Slack/Email) before posting.

Logs
- Edited Image Log (Sheets): stores the final image URL + timestamp.
- Publishing Log (Sheets): tracks workflow status per asset.

Notes
- Sticky notes in the template explain each major block.
- Replace sample IDs with your own (folder IDs, sheet IDs, Postiz integration).
- Keep all API keys in n8n Credentials, not in node parameters.
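For the polling step, the check that decides whether the Nano Banana edit has finished could look like the Code node sketch below. The response shape ({ data: { status, outputs } }) is an assumption about the Wavespeed result endpoint; verify it against the actual payload before relying on it:

```javascript
// Code node sketch for the polling branch: has the edit finished?
// The { data: { status, outputs } } shape is an assumption; adjust field
// names to match the real Wavespeed API response.
const res = $input.first().json;
const status = res?.data?.status ?? 'unknown';

return [{
  json: {
    status,
    isReady: status === 'completed',
    editedImageUrl: status === 'completed' ? (res.data.outputs?.[0] ?? null) : null,
  },
}];
```

An IF node downstream can then route on isReady: loop back through a Wait node while false, continue to the Google Sheets log once true.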
by Aitor | 1Node
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automates the distribution and scheduling of video content across multiple social platforms (TikTok, YouTube, Facebook, Instagram, Threads) through Postiz. Videos are collected from Google Drive, approved manually, and scheduled via the Postiz community node.

🧾 Requirements
- **Google Drive** account with access to the folder the workflow will watch for new uploads.
- Videos in MP4 format, ready to be shared; alternatively, you can connect a CloudConvert community node to convert the format before uploading to Postiz.
- Postiz account with integrations for TikTok, YouTube, Facebook, Instagram, and Threads.

🔗 Useful Links
- Postiz Docs
- Postiz Community Node

🔄 Workflow Steps
1. Trigger: Google Drive File Added - Watches your selected Google Drive folder for new file uploads.
2. Download File - Downloads the detected video from Drive.
3. Upload to Postiz - The video is uploaded to Postiz to prepare for social scheduling.
4. Set Fields - Manual setting of social options.
5. Extract Datetime (AI) - Uses OpenAI to find/predict the intended publish date and time, since Postiz requires a specific datetime format for scheduling (a minimal sketch follows this listing).
6. Get Social Integrations - Fetches the list of the user's connected platforms from Postiz.
7. Split and Filter Integrations - Splits the process per platform (TikTok, YouTube, Facebook, Instagram, Threads).
8. Schedule Post - For each enabled platform, schedules the video with the chosen options.

🙋‍♂️ Need Help?
Connect with 1 Node
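Because Postiz expects a precise datetime for scheduling, it can help to normalise whatever the AI extracts into ISO 8601 before the Schedule Post step. A minimal sketch, assuming the extracted value arrives in a field called publishAt (rename to match your Set Fields node):

```javascript
// Code node sketch: normalise the AI-extracted publish date/time to ISO 8601.
// The publishAt field name is an assumption; adjust it to your workflow.
const data = $input.first().json;
const raw = data.publishAt;
const date = new Date(raw);

if (isNaN(date.getTime())) {
  throw new Error(`Could not parse publish datetime: ${raw}`);
}

return [{ json: { ...data, publishAt: date.toISOString() } }];
```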
by Onur
🏠 Extract Zillow Property Data to Google Sheets with Scrape.do

This template requires a self-hosted n8n instance to run.

A complete n8n automation that extracts property listing data from Zillow URLs using the Scrape.do web scraping API, parses key property information, and saves structured results into Google Sheets for real estate analysis, market research, and property tracking.

📋 Overview
This workflow provides a lightweight real estate data extraction solution that pulls property details from Zillow listings and organizes them into a structured spreadsheet. Ideal for real estate professionals, investors, market analysts, and property managers who need automated property data collection without manual effort.

Who is this for?
- Real estate investors tracking properties
- Market analysts conducting property research
- Real estate agents monitoring listings
- Property managers organizing data
- Data analysts building real estate databases

What problem does this workflow solve?
- Eliminates manual copy-paste from Zillow
- Processes multiple property URLs in bulk
- Extracts structured data (price, address, zestimate, etc.)
- Automates saving results into Google Sheets
- Ensures repeatable & consistent data collection

⚙️ What this workflow does
1. Manual Trigger → Starts the workflow manually
2. Read Zillow URLs from Google Sheets → Reads property URLs from a Google Sheet
3. Scrape Zillow URL via Scrape.do → Fetches full HTML from Zillow (bypasses PerimeterX protection)
4. Parse Zillow Data → Extracts structured property information from HTML (a minimal sketch follows this listing)
5. Write Results to Google Sheets → Saves parsed data into a results sheet

📊 Output Data Points

| Field | Description | Example |
|-------|-------------|---------|
| URL | Original Zillow listing URL | https://www.zillow.com/homedetails/... |
| Price | Property listing price | $300,000 |
| Address | Street address | 8926 Silver City |
| City | City name | San Antonio |
| State | State abbreviation | TX |
| Days on Zillow | How long listed | 5 |
| Zestimate | Zillow's estimated value | $297,800 |
| Scraped At | Timestamp of extraction | 2025-01-29T12:00:00.000Z |

⚙️ Setup

Prerequisites
- n8n instance (self-hosted)
- Google account with Sheets access
- Scrape.do account with API token (get 1000 free credits/month)

Google Sheet Structure
This workflow uses one Google Sheet with two tabs:

Input Tab: "Sheet1"

| Column | Type | Description | Example |
|--------|------|-------------|---------|
| URLs | URL | Zillow listing URL | https://www.zillow.com/homedetails/123... |

Output Tab: "Results"

| Column | Type | Description | Example |
|--------|------|-------------|---------|
| URL | URL | Original listing URL | https://www.zillow.com/homedetails/... |
| Price | Text | Property price | $300,000 |
| Address | Text | Street address | 8926 Silver City |
| City | Text | City name | San Antonio |
| State | Text | State code | TX |
| Days on Zillow | Number | Days listed | 5 |
| Zestimate | Text | Estimated value | $297,800 |
| Scraped At | Timestamp | When scraped | 2025-01-29T12:00:00.000Z |

🛠 Step-by-Step Setup
1. Import Workflow: Copy the JSON → n8n → Workflows → + Add → Import from JSON.
2. Configure Scrape.do API: Sign up at the Scrape.do Dashboard and get your API token. In the HTTP Request node, replace YOUR_SCRAPE_DO_TOKEN with your actual token. The workflow uses super=true for premium residential proxies (10 credits per request).
3. Configure Google Sheets: Create a new Google Sheet and add two tabs: "Sheet1" (input) and "Results" (output). In Sheet1, add the header "URLs" in cell A1 and add Zillow URLs starting from A2. Set up Google Sheets OAuth2 credentials in n8n, replace YOUR_SPREADSHEET_ID with your actual Google Sheet ID, and replace YOUR_GOOGLE_SHEETS_CREDENTIAL_ID with your credential ID.
4. Run & Test: Add 1-2 test Zillow URLs in Sheet1, click "Execute workflow", and check the results in the Results tab.

🧰 How to Customize
- **Add more fields**: Extend parsing logic in the "Parse Zillow Data" node to capture additional data (bedrooms, bathrooms, square footage)
- **Filtering**: Add conditions to skip certain properties or price ranges
- **Rate Limiting**: Insert a Wait node between requests if processing many URLs
- **Error Handling**: Add error branches to handle failed scrapes gracefully
- **Scheduling**: Replace the Manual Trigger with a Schedule Trigger for automated daily/weekly runs

📊 Use Cases
- **Investment Analysis**: Track property prices and zestimates over time
- **Market Research**: Analyze listing trends in specific neighborhoods
- **Portfolio Management**: Monitor properties for sale in target areas
- **Competitive Analysis**: Compare similar properties across locations
- **Lead Generation**: Build databases of properties matching specific criteria

📈 Performance & Limits
- **Single Property**: ~5-10 seconds per URL
- **Batch of 10**: 1-2 minutes typical
- **Large Sets (50+)**: 5-10 minutes depending on Scrape.do credits
- **API Calls**: 1 Scrape.do request per URL (10 credits with super=true)
- **Reliability**: 95%+ success rate with premium proxies

🧩 Troubleshooting

| Problem | Solution |
|---------|----------|
| API error 400 | Check your Scrape.do token and credits |
| URL showing "undefined" | Verify Google Sheet column name is "URLs" (capital U) |
| No data parsed | Check if Zillow changed their HTML structure |
| Permission denied | Re-authenticate Google Sheets OAuth2 in n8n |
| 50000 character error | Verify Parse Zillow Data code is extracting fields, not returning raw HTML |
| Price shows HTML/CSS | Update price extraction regex in Parse Zillow Data node |

🤝 Support & Community
- Scrape.do Documentation
- Scrape.do Dashboard
- Scrape.do Zillow Scraping Guide
- n8n Forum
- n8n Docs

🎯 Final Notes
This workflow provides a repeatable foundation for extracting Zillow property data with Scrape.do and saving it to Google Sheets. You can extend it with:
- Historical tracking (append timestamps)
- Price change alerts (compare with previous scrapes)
- Multi-platform scraping (Redfin, Realtor.com)
- Integration with CRM or reporting dashboards

Important: Scrape.do handles all anti-bot bypassing (PerimeterX, CAPTCHAs) automatically with rotating residential proxies, so you only pay for successful requests. Always use the super=true parameter for Zillow to ensure high success rates.
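The "Parse Zillow Data" step referenced above is where most breakage happens when Zillow changes its markup. The following Code node sketch shows the general shape of the extraction; the regexes target the schema.org JSON-LD that listing pages usually embed and are illustrative starting points, not guarantees:

```javascript
// Code node sketch for "Parse Zillow Data". The html/url property names depend
// on your HTTP Request node settings; the regexes assume schema.org-style
// JSON-LD in the page and will need updating whenever Zillow's markup changes.
return $input.all().map((item) => {
  const html = String(item.json.data ?? item.json.body ?? '');
  const url = item.json.url ?? '';

  const pick = (re) => {
    const m = html.match(re);
    return m ? m[1].trim() : null;
  };

  const price = pick(/"price"\s*:\s*"?\$?([\d,]+)/);

  return {
    json: {
      URL: url,
      Price: price ? `$${price}` : null,
      Address: pick(/"streetAddress"\s*:\s*"([^"]+)"/),
      City: pick(/"addressLocality"\s*:\s*"([^"]+)"/),
      State: pick(/"addressRegion"\s*:\s*"([^"]+)"/),
      'Scraped At': new Date().toISOString(),
    },
  };
});
```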
by WeblineIndia
Send daily applicant digest by role from Gmail to hiring managers with Google Gemini

This workflow automatically collects all new job application emails from your Gmail labeled as applicants in the last 24 hours. Every day at 6:00 PM (Asia/Kolkata), it extracts structured details (name, email, phone, role, experience, skills, location, notice, summary) from each applicant (using Gemini AI or OpenAI). It then groups applicants by role and manager, compiles a neat HTML table digest for each manager and emails them a single summary — so hiring managers get everything they need, at a glance, in one place.

Who’s It For
- Recruiters and hiring managers tired of digging through multiple application threads.
- Small HR teams / agencies not yet on a full applicant tracking system.
- Anyone wanting a consolidated, role-targeted applicant update each day.
- Teams that want to automate candidate triage using Google Workspace and AI.

How It Works
1. Schedule Trigger (6PM IST): Runs automatically at 18:00 India time.
2. Fetch Applicant Emails: Reads Gmail for emails labeled 'applicants' from the past 24 hours.
3. Prepare Email Text: Converts email content to plain text for reliable AI extraction.
4. Extract Applicant Details: Gemini/OpenAI extracts the applicant's info in structured JSON.
5. Assign Manager Emails: Routes each applicant to the correct manager via role→email mapping or fallback.
6. Group & Build HTML Tables: Organizes applicants by manager and role, builds summary tables.
7. Send Digest to Managers: Sends each manager one HTML summary email for their new applicants.

How to Set Up
1. Create/verify the Gmail label applicants and set up filters to route job emails there.
2. Import the workflow: Use your Google/Gmail and Gemini/OpenAI accounts as credentials.
3. Configure connections:
   - Gmail with OAuth2 (IMAP not required, uses Gmail API)
   - Gemini or OpenAI API key for extraction
4. Set the role→manager mapping in the “Assign Manager Emails” node (just edit the map; a minimal sketch follows this listing).
5. Adjust time / defaults: Edit schedule and fallback email if you wish.
6. Test it: Send yourself a test application, label it, check workflow logs.

Requirements
- Gmail account (with OAuth2 enabled and 'applicants' label set up)
- Gemini or OpenAI API key for structured AI extraction
- n8n instance (self-hosted or cloud)
- SMTP credentials (if using direct email instead of the Gmail node)
- At least one valid hiring manager email mapped to a role

How to Customize the Workflow
- Centralize config with a Set node (label name, fallback/manager email, model name, schedule).
- Add attachment-to-text conversion for applications with resume attachments.
- Normalize role names in the mapping code for more robust routing.
- Enable additional delivery: Slack, Teams, Google Sheets log, extra Cron for mid-day urgents.
- Refine the AI extraction prompt for specific fields (add portfolio URL, etc.).
- Change the schedule for daily, weekly or per-role timing.

Add-Ons / Extensions
- **Resume Text Extraction:** Add PDF/DOCX to text parsing for attachment-only applications.
- **ChatOps:** Send the summary to Slack or Teams channels along with/instead of email.
- **Applicant Logging:** Auto-log every applicant/action into Google Sheets, Notion or Airtable.
- **Multi-timezone:** Duplicate/modify the Cron trigger for different manager regions or urgency levels.

Use Case Examples
- **Tech Hiring:** Java, Python, Frontend candidates are automatically routed to their respective leads.
- **Small Agency:** All applications summarized for reviewers, with per-role breakdowns.
- **HR Operations:** Daily rollups sent before hiring sync, facilitating fast decision-making.

Common Troubleshooting

| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| No emails processed | No 'applicants' label or wrong time window | Check Gmail filters and adjust search query in fetch node |
| All digests go to fallback manager | Incorrect or missing role → manager mapping | Normalize role text in assignment node, expand map |
| AI extraction returns bad/missing JSON | Wrong prompt, high temperature or missing field names | Tighten prompt, lower temperature, check example response |
| Duplicate/old emails appear | Date filter not correct | Use 'newer_than:1d' and keep 'mark as read' in email node |
| SMTP/Gmail send errors | Auth problem, quota or app password missing | Use OAuth2, check daily send caps and app password settings |
| Blank or partially filled summary table | AI unable to parse poorly formatted/empty email | Improve sender email consistency, add fallback handling |
| Attachments not processed | No attachment extraction node | Add attachment-to-text parsing before AI node |

Need Help?
If you get stuck, need help customizing a mapping or adding nodes, or want to integrate extra steps (e.g., resume text, Slack), just ask! We're happy to guide you step by step, review your workflow, or help you troubleshoot any errors.
Contact WeblineIndia — Your n8n Automation partner!
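The role-to-manager map mentioned in the setup steps is just a plain object inside the "Assign Manager Emails" Code node. A minimal sketch with hypothetical roles and addresses; normalising the role text before the lookup makes the routing more forgiving:

```javascript
// Code node sketch for "Assign Manager Emails": map an extracted role to a
// hiring manager, falling back to a default address. The roles and addresses
// below are placeholders; edit the map to match your team.
const ROLE_TO_MANAGER = {
  'java developer': 'java.lead@example.com',
  'python developer': 'python.lead@example.com',
  'frontend developer': 'frontend.lead@example.com',
};
const FALLBACK_MANAGER = 'hr@example.com';

return $input.all().map((item) => {
  const role = String(item.json.role || '').trim().toLowerCase(); // normalize for lookup
  const managerEmail = ROLE_TO_MANAGER[role] || FALLBACK_MANAGER;
  return { json: { ...item.json, managerEmail } };
});
```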
by Ian Kerins
Overview
This n8n template automates the process of scraping job listings from Indeed, parsing the data into a structured format, and saving it to Google Sheets for easy tracking. It also includes a Slack notification system to alert you when new jobs are found. Built with ScrapeOps, it handles the complexities of web scraping - such as proxy rotation, anti-bot bypassing, and HTML parsing - so you can focus on the data.

Who is this for?
- **Job Seekers**: Automate your daily job search and get instant alerts for new postings.
- **Recruiters & HR Agencies**: Track hiring trends and find new leads for candidate placement.
- **Sales & Marketing Teams**: Monitor companies that are hiring to identify growth signals and lead opportunities.
- **Data Analysts**: Gather labor market data for research and competitive analysis.

What problems it solves
- **Manual Searching**: Eliminates the need to manually refresh Indeed and copy-paste job details.
- **Data Structure**: Converts messy HTML into clean, organized rows in a spreadsheet.
- **Blocking & Captchas**: Uses ScrapeOps residential proxies to bypass Indeed's anti-bot protections reliably.
- **Missed Opportunities**: Automated scheduling ensures you are the first to know about new listings.

How it works
1. Trigger: The workflow runs on a schedule (default: every 6 hours).
2. Configuration: You define your search query (e.g., "Software Engineer") in the Set Search URL node.
3. Scraping: The ScrapeOps Proxy API fetches the Indeed search results page using a residential proxy to avoid detection.
4. Parsing: The ScrapeOps Parser API takes the raw HTML and extracts key details like Job Title, Company, Location, Salary, and URL.
5. Filtering: A code node filters out invalid results and structures the data (a minimal sketch follows this listing).
6. Storage: Valid jobs are appended to a Google Sheet.
7. Notification: A message is sent to Slack confirming the update.

Setup steps (~10-15 minutes)
1. ScrapeOps Account: Register for a free ScrapeOps API Key. In n8n, open the ScrapeOps nodes and create a new credential with your API key.
2. Google Sheets: Duplicate this Google Sheet Template. Open the Save to Google Sheets node, connect your Google account, and select your duplicated sheet.
3. Slack Setup: Open the Send a message node. Connect your Slack account and select the channel where you want to receive alerts.
4. Customize Search: Open the Set Search URL node. Update the search_query value to the job title or keyword you want to track.

Pre-conditions
- An active ScrapeOps account (free tier available).
- A Google Cloud account with Google Sheets API enabled (for n8n connection).
- A Slack workspace for notifications.

Disclaimer
This template uses ScrapeOps as a community node. You are responsible for complying with Indeed's Terms of Use, robots directives, and applicable laws in your jurisdiction. Scraping targets may change at any time; adjust render/scroll/wait settings and parsers as needed. Use responsibly for legitimate business purposes.

Resources
- ScrapeOps n8n Overview
- ScrapeOps Proxy API Documentation
- ScrapeOps Parser API Documentation
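The filtering step can be a short Code node that drops incomplete results and shapes the rest into spreadsheet rows. This is a sketch under the assumption that the ScrapeOps Parser returns a jobs array with title, company, location, salary, and url fields; check the actual response and rename accordingly:

```javascript
// Code node sketch: filter out invalid parser results and structure the rest
// as rows for Google Sheets. The jobs array and its field names are
// assumptions about the ScrapeOps Parser output.
const response = $input.first().json;
const jobs = response.jobs ?? [];

return jobs
  .filter((job) => job.title && job.company && job.url) // drop incomplete entries
  .map((job) => ({
    json: {
      Title: job.title,
      Company: job.company,
      Location: job.location ?? '',
      Salary: job.salary ?? '',
      URL: job.url,
      'Found At': new Date().toISOString(),
    },
  }));
```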
by Gilbert Onyebuchi
A complete email campaign automation system featuring dual-mode access control (Demo/Pro), usage tracking, and professional email delivery. Perfect for SaaS products, marketing agencies, or anyone building newsletter tools with freemium models.

WHAT IT DOES
This workflow manages email newsletter campaigns with built-in rate limiting for free users and unlimited access for premium users. It automatically tracks daily usage, manages user data in Google Sheets, delivers emails via SendGrid, and sends real-time notifications through Telegram.

KEY FEATURES
- Dual-Mode System: Demo mode (5 emails/day) and Pro mode (unlimited)
- Smart Rate Limiting: Automatic daily counter reset
- User Management: Automatic new user registration and tracking
- Google Sheets Integration: Stores user data, send counts, and usage history
- Professional Email Delivery: SendGrid integration for reliable sending
- Real-Time Monitoring: Telegram notifications for every send
- Ready-to-Use Templates: 4 professional email designs included (Modern, Professional, Promotional, Newsletter)
- Live Preview: See exactly how emails look before sending
- HTML Export: Copy email HTML for use in any platform

HOW IT WORKS
1. User accesses the Email Newsletter Builder web form
2. Designs the email using one of 4 professional templates
3. Chooses Demo or Pro mode
4. Webhook receives the email data and configuration
5. Workflow checks the mode (Demo/Pro)
6. For Demo mode (see the rate-limit sketch after this listing):
   - Queries Google Sheets for the user email
   - Checks if the user exists and validates the daily limit (<5 sends)
   - If new user: creates a database entry
   - If existing user and under the limit: increments the counter
   - If the limit is reached: returns an error message
7. For Pro mode: sends immediately without limits
8. SendGrid delivers the email
9. Google Sheets updates with the new send count and timestamp
10. Telegram notification sent to admin
11. Success/error response returned to user

SETUP REQUIREMENTS
- Google Sheets account (free)
- SendGrid account (free tier: 100 emails/day)
- Telegram account + bot (free)
- n8n instance (self-hosted or cloud)

SUPPORT & FEEDBACK
Questions or issues? Connect with me on LinkedIn
Want to see it in action? Try the live demo: Click here
⭐ If you find this workflow helpful, please give it a rating and share your feedback!
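The Demo-mode limit check boils down to a daily counter with a reset. A minimal sketch, assuming the Google Sheets lookup returns a row with email, sendCount, and lastSentDate fields (placeholder names; map them to your sheet's columns):

```javascript
// Code node sketch of the Demo-mode rate limit: reset the counter on a new
// day and block the send once 5 emails have gone out.
// Field names (email, sendCount, lastSentDate) are placeholders.
const DAILY_LIMIT = 5;
const today = new Date().toISOString().slice(0, 10);

const user = $input.first().json; // row looked up from Google Sheets
const lastDate = String(user.lastSentDate || '').slice(0, 10);
const count = lastDate === today ? Number(user.sendCount || 0) : 0; // daily reset

if (count >= DAILY_LIMIT) {
  return [{
    json: {
      allowed: false,
      message: 'Daily demo limit of 5 emails reached. Upgrade to Pro for unlimited sends.',
    },
  }];
}

return [{
  json: { allowed: true, email: user.email, sendCount: count + 1, lastSentDate: today },
}];
```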
by Dr. Firas
Build Your First AI Agent with ChatGPT-5

Who is this for?
This workflow is designed for beginners and professionals who want to build their first AI-powered assistant with n8n. It's perfect for anyone managing online trainings, consultations, or services that require both a knowledge base and appointment scheduling.

What problem is this workflow solving?
Manually handling client questions, checking your availability, and confirming bookings can be time-consuming and error-prone. This workflow automates the process, ensuring quick, accurate answers and seamless scheduling directly through chat.

What this workflow does
- Answers user questions using your knowledge base stored in Google Sheets.
- Checks availability in Google Calendar and proposes alternative time slots if needed.
- Books 1-hour appointments in Paris time only after client confirmation.
- Sends a professional confirmation email with all appointment details.

Setup
1. Import this workflow into your n8n instance.
2. Connect your Google Sheets, Gmail, and Google Calendar credentials.
3. Add your knowledge base into Google Sheets (questions, answers, policies, packs, etc.).
4. Test the workflow using the connected Chat Trigger node to start conversations with the AI Agent.

How to customize this workflow to your needs
- Update the Google Sheets database with your own training packs, services, or company FAQs.
- Adjust the email template to reflect your branding and communication style.
- Modify the appointment duration if you need sessions longer or shorter than 1 hour.
- Add extra nodes (e.g., CRM integration) to capture leads or sync appointments with external systems.

📄 Documentation: Notion Guide
Need help customizing? Contact me for consulting and support: LinkedIn / YouTube
by Fabian Herhold
Who's it for
Sales teams, BDRs, account managers, and customer success professionals who want to show up prepared for every meeting. Perfect for anyone using Calendly who wants to automate prospect research and never walk into a call blind again.

Watch the full tutorial here:

What it does
This workflow automatically researches your meeting attendees the moment they book through Calendly. It combines multiple AI agents to gather comprehensive intelligence:
- **Company Research**: Uses Perplexity AI to validate company details, recent news, funding, leadership changes, and business signals
- **LinkedIn Analysis**: Leverages RapidAPI to analyze the person's profile, recent posts, comments, and engagement patterns from the last 60-90 days
- **Signal Detection**: Identifies hiring signals, growth indicators, and potential risks with confidence scoring
- **Meeting Prep**: Synthesizes everything into personalized talking points, conversation starters, and strategic recommendations
The final research brief gets delivered directly to your Slack, saving 30-45 minutes of manual research per meeting.

How it works
1. Someone books a meeting via your Calendly (must include LinkedIn URL in booking form)
2. Main AI Agent extracts the company domain from the email (a minimal sketch follows this listing) and coordinates three specialist research agents
3. Company Agent researches business intel via Perplexity
4. Person Agent analyzes LinkedIn activity using 4 different RapidAPI endpoints
5. Signal Agent identifies business opportunities and risks
6. Comprehensive meeting brief gets sent to your Slack channel

Requirements
API credentials needed:
- Calendly API (for webhook trigger)
- OpenAI API key (GPT-4 recommended for orchestration)
- Perplexity API key (for web research)
- RapidAPI subscription (for LinkedIn data endpoints)
- Slack bot token (for output delivery)
Important: Your Calendly booking form must include a LinkedIn URL field to get optimal results.

How to set up
1. Configure Calendly: Add the Calendly trigger node with your API credentials
2. Update Slack destination: Modify the final Slack node with your user ID or channel
3. Add API keys: Configure all the API credentials in their respective nodes
4. Test the workflow: Book a test meeting through Calendly to verify the complete flow
5. Customize prompts: Adjust the AI agent prompts based on your specific industry or use case
The workflow uses structured JSON output with confidence scoring and source citation for reliable, actionable intelligence.

How to customize the workflow
- **Change output destination**: Replace Slack with email, Teams, or CRM integration
- **Modify research depth**: Adjust the AI prompts to focus on specific industries or company types
- **Add more signals**: Extend the Signal Research Agent to detect additional business indicators
- **Integrate with CRM**: Add nodes to automatically update contact records in your sales system
- **Schedule follow-ups**: Connect to calendar tools to automatically schedule research updates
The modular design makes it easy to adapt for different sales processes and research requirements.
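The domain-extraction step the Main AI Agent relies on can be as simple as the Code node sketch below; the email field name and the free-mail list are assumptions to adapt to your Calendly payload:

```javascript
// Code node sketch: derive the company domain from the attendee's email so the
// research agents have something to work with. Free-mail domains are skipped;
// extend the list as needed. The email field name is an assumption.
const FREE_MAIL = new Set(['gmail.com', 'outlook.com', 'hotmail.com', 'yahoo.com', 'icloud.com']);

const booking = $input.first().json;
const email = String(booking.email || '').toLowerCase();
const domain = email.split('@')[1] || null;

return [{
  json: {
    ...booking,
    companyDomain: domain && !FREE_MAIL.has(domain) ? domain : null,
  },
}];
```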
by Olivier
This template syncs prospects from ProspectPro into HubSpot. It checks if a company already exists in HubSpot (by ProspectPro ID or domain), then updates the record or creates a new one. Sync results are logged back in ProspectPro with tags to prevent duplicates and mark errors, ensuring reliable and repeatable integrations.

✨ Features
- Automatically sync ProspectPro prospects to HubSpot companies
- Smart search logic: match by ProspectPro ID first, then by domain (a minimal sketch follows this listing)
- Creates new HubSpot companies when no match is found
- Updates existing HubSpot companies with the latest ProspectPro data
- Logs sync results back into ProspectPro with tags (HubspotSynced, HubspotSyncFailed)
- Extendable and modular: use as a trigger workflow or callable sub-flow

⚙ Requirements
- n8n instance or cloud workspace
- Install the ProspectPro Verified Community Node
- ProspectPro account & API credentials (14-day free trial)
- HubSpot account with OAuth2 app and API credentials

🔧 Setup Instructions
1. Import the template and set your credentials (ProspectPro, HubSpot).
2. Connect to a trigger (e.g., ProspectPro "New website visitor") or call it as a sub-workflow.
3. Add a property to HubSpot for the ProspectPro ID if you don't already have one.
4. Adjust the sync logic in the "Continue?" node and the HubSpot fields to match your setup.
5. Optional: extend error handling, add Slack/CRM notifications, or sync HubSpot data back into ProspectPro.

🔐 Security Notes
- Prevents re-processing of failed syncs using the HubspotSyncFailed tag
- Error branches included for failed updates/creates
- Manual resolution required if sync errors persist

🧪 Testing
- Run with a ProspectPro ID of a company with a known domain
- Check HubSpot for creation or update of the company record
- Verify the updated tags (HubspotSynced / HubspotSyncFailed) in ProspectPro

📌 About ProspectPro
ProspectPro is a B2B prospecting platform for Dutch SMEs. It helps sales teams identify prospects, track website visitors, and streamline sales without a full CRM.
- Website: https://www.prospectpro.nl
- Platform: https://mijn.prospectpro.nl
- API docs: https://www.docs.bedrijfsdata.nl
- Support: https://www.prospectpro.nl/klantenservice
- Support hours: Monday–Friday, 09:00–17:00 CET

📌 About HubSpot
HubSpot is a leading CRM platform offering marketing, sales, and customer service tools. It helps companies manage contacts, automate workflows, and grow their customer base.
- Website: https://www.hubspot.com
- Developer Docs: https://developers.hubspot.com
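The "match by ProspectPro ID first, then by domain" decision can be expressed as a small routing step between the HubSpot search and the update/create branches. A sketch under the assumption that the search results are available on the incoming item and that the custom property is called prospectpro_id (use whatever property you created in step 3):

```javascript
// Code node sketch of the matching decision: prefer a match on the stored
// ProspectPro ID, fall back to the company domain, then flag update vs create.
// hubspotSearchResults, prospectpro_id and domain are placeholder names.
const prospect = $input.first().json;
const candidates = prospect.hubspotSearchResults || []; // results of the HubSpot company search

const byId = candidates.find(
  (c) => c.properties?.prospectpro_id === String(prospect.id)
);
const byDomain = candidates.find(
  (c) => c.properties?.domain && c.properties.domain === prospect.domain
);
const match = byId || byDomain || null;

return [{
  json: {
    ...prospect,
    action: match ? 'update' : 'create',
    hubspotCompanyId: match ? match.id : null,
  },
}];
```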