by n8n Automation Expert | Template Creator | 2+ Years Experience
## 🌤️ Automated Indonesian Weather Monitoring with Smart Notifications

Stay ahead of weather changes with this comprehensive monitoring system that fetches real-time data from Indonesia's official meteorological agency (BMKG) and delivers beautiful, actionable weather reports directly to your Telegram.

## ⚡ What This Workflow Does

This intelligent weather monitoring system automatically:

- **Fetches Official Data**: Connects to BMKG's public weather API for accurate Indonesian forecasts
- **Smart Processing**: Analyzes temperature, humidity, precipitation, and wind conditions
- **Risk Assessment**: Generates contextual warnings for extreme weather conditions
- **Automated Alerts**: Sends formatted weather reports to Telegram every 6 hours
- **Error Handling**: Includes a robust error detection and notification system

## 🎯 Perfect For

- **Local Communities**: Keep neighborhoods informed about weather changes
- **Business Operations**: Plan outdoor activities and logistics based on weather
- **Emergency Preparedness**: Receive early warnings for extreme weather conditions
- **Personal Planning**: Never get caught unprepared by sudden weather changes
- **Agricultural Monitoring**: Track conditions affecting farming and outdoor work

## 🛠️ Key Features

- 🔄 **Automated Scheduling**: Runs every 6 hours, with a manual trigger option
- 📊 **Comprehensive Reports**: Current conditions plus 6-hour detailed forecasts
- ⚠️ **Smart Warnings**: Contextual alerts for temperature extremes and rain probability
- 🎨 **Beautiful Formatting**: Rich Telegram messages with emojis and structured data
- 🔧 **Error Recovery**: Automatic error handling with a notification system
- 📍 **Location-Aware**: Supports any Indonesian location via BMKG regional codes

## 📋 What You'll Get

Each weather report includes:

- Current temperature, humidity, and weather conditions
- 6-hour detailed forecast with timestamps
- Wind speed and direction information
- Rain probability and visibility data
- Personalized warnings and recommendations
- Average daily statistics and trends

## 🚀 Setup Requirements

- **Telegram Bot Token**: Create a bot via @BotFather
- **Chat ID**: Your personal or group chat identifier
- **BMKG Location Code**: Regional administrative code for your area

## 💡 Pro Tips

- Customize the location by changing the adm4 parameter in the HTTP request (see the sketch below)
- Adjust the scheduling interval to match your monitoring needs
- Modify the warning thresholds in the processing code
- Add multiple chat IDs for broader distribution
- Integrate with other n8n workflows for advanced automation

## 🌟 Why Choose This Template

- **Production Ready**: Includes comprehensive error handling and logging
- **Highly Customizable**: Easy to modify for different locations and preferences
- **Official Data Source**: Uses Indonesia's trusted meteorological service
- **User-Friendly Output**: Clean, readable reports perfect for daily use
- **Scalable Design**: Easily extend to multiple locations or notification channels

Transform your weather awareness with this professional-grade monitoring system that brings Indonesia's official weather data right to your fingertips!

Keywords: weather monitoring, BMKG API, Telegram notifications, Indonesian weather, automated alerts, meteorological data, weather forecasting, n8n automation, weather API integration
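To make the adm4 customization and warning thresholds concrete, here is a minimal n8n Code-node sketch. The BMKG endpoint path, the sample adm4 code, and the threshold values are all assumptions to verify against the actual HTTP Request and processing nodes in the template:

```javascript
// Minimal sketch (n8n Code node) — endpoint path, adm4 code, and thresholds are assumptions.
const adm4 = '31.71.01.1001'; // hypothetical regional code; replace with your area

// Build the request URL the HTTP Request node would call
const url = `https://api.bmkg.go.id/publik/prakiraan-cuaca?adm4=${encodeURIComponent(adm4)}`;

// Example warning logic, mirroring the "smart warnings" described above
function buildWarnings(forecast) {
  const warnings = [];
  if (forecast.temperature >= 35) warnings.push('🥵 Extreme heat: stay hydrated');
  if (forecast.temperature <= 18) warnings.push('🥶 Unusually cold for the region');
  if (forecast.rainProbability >= 70) warnings.push('🌧️ High chance of rain: plan indoor alternatives');
  return warnings;
}

return [{ json: { url, warnings: buildWarnings({ temperature: 36, rainProbability: 80 }) } }];
```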
by Abdul Mir
## Overview

Stop spending hours formatting proposals. This workflow turns a short post-call form into a high-converting, fully personalized PandaDoc proposal, updates your CRM, and drafts the follow-up email for you.

After a sales call, just fill out a 3-minute form summarizing the key pain points, the solutions pitched, and the price. The workflow uses AI to generate polished proposal copy, then builds a PandaDoc draft using dynamic data mapped into the JSON body (which you can fully customize per business). It also updates the lead record in ClickUp with the proposal link, company name, and quote, then creates an email draft in Gmail, ready to send.

## Who's it for

- Freelancers and consultants sending service proposals
- Agencies closing deals over sales calls
- Sales reps who want to automate proposal follow-up
- Teams using ClickUp as their lightweight CRM

## How it works

1. After a call, fill out a short form with client details, pitch notes, and price
2. AI generates professional proposal copy based on the form input
3. The proposal is formatted and sent to PandaDoc via an HTTP request (see the sketch below)
4. The ClickUp lead is updated with: company name, proposal URL, and quote/price
5. A Gmail draft is created using the proposal link and a thank-you message

## Example use case

> You hop off a call and fill out:
> - Prospect: Shopify agency
> - Pain: No lead gen system
> - Solution: Automated cold outreach
> - Price: $2,500/month
>
> 3 minutes later: the PandaDoc proposal is ready, the CRM is updated, and your email draft is waiting to be sent.

## How to set up

1. Replace the form with your preferred tool (e.g. Tally, Typeform)
2. Connect the PandaDoc API and structure your proposal template
3. Customize the JSON body inside the HTTP request to match your business
4. Link your ClickUp space and custom fields
5. Connect Gmail (or another email tool) for the final follow-up draft

## Requirements

- Form tool for capturing sales call notes
- OpenAI or other LLM key for generating proposal copy
- PandaDoc API access
- ClickUp custom fields set up for lead tracking
- Gmail integration

## How to customize

- Customize your PandaDoc proposal fields in the JSON body of the HTTP node
- Replace ClickUp with another CRM like HubSpot or Notion
- Adjust the AI tone (casual, premium, corporate) for proposal writing
- Add Slack or Telegram alerts when the draft is ready
- Add a PDF generation or auto-send email step
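The JSON body is the main customization point. As a rough sketch of what a "create document from template" payload for PandaDoc's public API (POST https://api.pandadoc.com/public/v1/documents) could look like, written as an n8n Code node for readability; the template UUID, token names, and form field names are placeholders, not values from this template:

```javascript
// Minimal sketch of a PandaDoc create-document body. All identifiers
// (template UUID, token names, $json fields) are hypothetical placeholders.
const body = {
  name: `Proposal - ${$json.companyName}`,          // from the post-call form
  template_uuid: 'YOUR_TEMPLATE_UUID',              // placeholder
  recipients: [
    { email: $json.clientEmail, first_name: $json.firstName, role: 'Client' },
  ],
  tokens: [
    { name: 'pain_points', value: $json.aiPainCopy },   // AI-generated copy
    { name: 'solution', value: $json.aiSolutionCopy },
    { name: 'price', value: $json.price },              // e.g. "$2,500/month"
  ],
};
return [{ json: body }];
```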
by Rahul Joshi
## Description

Never miss a lead again with this SLA Breach Alert automation powered by n8n! This workflow continuously monitors your Google Sheets for un-replied leads and automatically triggers instant Telegram alerts, ensuring your team takes immediate action.

By running frequent SLA checks, enriching alerts with direct Google Sheet links, and sending real-time notifications, this automation helps prevent unattended leads, reduce response delays, and boost customer engagement.

## What This Template Does

- 📅 Runs every 5 minutes to monitor SLA breaches
- 📋 Fetches lead data (status, contact, timestamps) from Google Sheets
- 🕒 Identifies leads marked "Un-replied" beyond the 15-minute SLA
- 🔗 Enriches alerts with direct Google Sheet row links for quick action
- 📲 Sends Telegram alerts with lead details for immediate response

## Step-by-Step Setup

1. **Prepare Your Google Sheet**: Create a sheet with at least the following columns:
   - Lead Name
   - Email
   - Phone
   - Status (values: Replied, Un-replied)
   - Timestamp (time of last update/reply)
2. **Set Up Google Sheets in n8n**: Connect your Google account in n8n and point the workflow to your sheet (remove any hardcoded document IDs before sharing).
3. **Configure the SLA Check**: Use the IF node to filter leads where Status = Un-replied and the time since the timestamp exceeds 15 minutes (see the sketch at the end of this listing).
4. **Enrich Alerts with Links**: Add a Code node to generate direct row links to the sheet.
5. **Set Up the Telegram Bot**: Create a Telegram bot via @BotFather, add the bot to your team chat, and store the botToken securely (remove the chatId before sharing templates).
6. **Send Alerts**: Configure the Telegram node in n8n to send the lead details plus the direct Google Sheet link.

## Customization Guidance

- Adjust the SLA window (e.g., 30 minutes or 1 hour) by modifying the IF node condition.
- Add more fields from Google Sheets (e.g., Company, Owner) to enrich the alert.
- Replace Telegram with Slack or Email if your team prefers a different channel.
- Extend the workflow to auto-assign leads in your CRM once alerted.

## Perfect For

- Sales teams that need to respond to leads within strict SLAs
- Support teams ensuring no customer request is ignored
- Businesses aiming to keep lead response times sharp and consistent
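As a minimal sketch of the SLA check plus row-link enrichment combined in one n8n Code node (the sheet ID, gid, and column names are placeholders to adapt to your sheet):

```javascript
// Minimal sketch (n8n Code node) of the SLA check and row-link enrichment.
const SHEET_ID = 'YOUR_SPREADSHEET_ID'; // placeholder
const SLA_MINUTES = 15;

const breached = [];
for (const [i, item] of $input.all().entries()) {
  const { Status, Timestamp } = item.json;
  const ageMinutes = (Date.now() - new Date(Timestamp).getTime()) / 60000;
  if (Status === 'Un-replied' && ageMinutes > SLA_MINUTES) {
    breached.push({
      json: {
        ...item.json,
        // Direct link to the row (row 1 is the header, so data starts at row 2)
        rowLink: `https://docs.google.com/spreadsheets/d/${SHEET_ID}/edit#gid=0&range=A${i + 2}`,
        minutesOverdue: Math.round(ageMinutes - SLA_MINUTES),
      },
    });
  }
}
return breached;
```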
by Onur
## 🔍 Extract Competitor SERP Rankings from Google Search to Sheets with Scrape.do

This template requires a self-hosted n8n instance to run.

A complete n8n automation that extracts competitor data from Google search results for specific keywords and target countries using the Scrape.do SERP API, and saves structured results into Google Sheets for SEO, competitive analysis, and market research.

## 📋 Overview

This workflow provides a lightweight competitor analysis solution that identifies ranking websites for chosen keywords across different countries. Ideal for SEO specialists, content strategists, and digital marketers who need structured SERP insights without manual effort.

### Who is this for?

- SEO professionals tracking keyword competitors
- Digital marketers conducting market analysis
- Content strategists planning based on SERP insights
- Business analysts researching competitor positioning
- Agencies automating SEO reporting

### What problem does this workflow solve?

- Eliminates manual SERP scraping
- Processes multiple keywords across countries
- Extracts structured data (position, title, URL, description)
- Automates saving results into Google Sheets
- Ensures a repeatable and consistent methodology

## ⚙️ What this workflow does

1. **Manual Trigger** → Starts the workflow manually
2. **Get Keywords from Sheet** → Reads keywords and target countries from a Google Sheet
3. **URL Encode Keywords** → Converts keywords into URL-safe format (see the sketch below)
4. **Process Keywords in Batches** → Handles multiple keywords sequentially to avoid rate limits
5. **Fetch Google Search Results** → Calls the Scrape.do SERP API to retrieve the raw HTML of Google SERPs
6. **Extract Competitor Data from HTML** → Parses the HTML into structured competitor data (top 10 results)
7. **Append Results to Sheet** → Writes structured SERP results into a Google Sheet

## 📊 Output Data Points

| Field | Description | Example |
|--------------------|------------------------------------------|-------------------------------------------|
| Keyword | Original search term | digital marketing services |
| Target Country | 2-letter ISO code of target region | US |
| position | Ranking position in search results | 1 |
| websiteTitle | Page title from SERP result | Digital Marketing Software & Tools |
| websiteUrl | Extracted website URL | https://www.hubspot.com/marketing |
| websiteDescription | Snippet/description from search results | Grow your business with HubSpot's tools… |

## ⚙️ Setup

### Prerequisites

- n8n instance (self-hosted)
- Google account with Sheets access
- **Scrape.do** account with **SERP API token**

### Google Sheet Structure

This workflow uses one Google Sheet with two tabs:

**Input Tab: "Keywords"**

| Column | Type | Description | Example |
|----------------|------|-------------------|-------------------|
| Keyword | Text | Search query | digital marketing |
| Target Country | Text | 2-letter ISO code | US |

**Output Tab: "Results"**

| Column | Type | Description | Example |
|--------------------|--------|----------------------|------------------------------------------|
| Keyword | Text | Original search term | digital marketing |
| position | Number | SERP ranking | 1 |
| websiteTitle | Text | Title of the page | Digital Marketing Software & Tools |
| websiteUrl | URL | Website/page URL | https://www.hubspot.com/marketing |
| websiteDescription | Text | Snippet text | Grow your business with HubSpot's tools |

## 🛠 Step-by-Step Setup

1. **Import Workflow**: Copy the JSON → n8n → Workflows → + Add → Import from JSON
2. **Configure Scrape.do API**:
   - Endpoint: https://api.scrape.do/
   - Parameter: token=YOUR_SCRAPEDO_TOKEN
   - Add render=true for full HTML rendering
3. **Configure Google Sheets**:
   - Create a sheet with two tabs: Keywords (input) and Results (output)
   - Set up Google Sheets OAuth2 credentials in n8n
   - Replace the placeholders YOUR_GOOGLE_SHEET_ID and YOUR_GOOGLE_SHEETS_CREDENTIAL_ID
4. **Run & Test**:
   - Add test data in the Keywords tab
   - Execute the workflow → Check the results in the Results tab

## 🧰 How to Customize

- **Add more fields**: Extend the HTML parsing logic in the "Extract Competitor Data" node to capture extra data (e.g., domain, sitelinks).
- **Filtering**: Exclude domains or results with custom rules.
- **Batch Size**: Adjust "Process Keywords in Batches" to balance speed against rate limits.
- **Rate Limiting**: Insert a **Wait** node (e.g., 10–30 seconds) if API rate limits apply.
- **Multi-Sheet Output**: Save per-country or per-keyword results into separate tabs.

## 📊 Use Cases

- **SEO Competitor Analysis**: Identify top-ranking sites for target keywords
- **Market Research**: See how SERPs differ by region
- **Content Strategy**: Analyze titles and descriptions of competitor pages
- **Agency Reporting**: Automate competitor SERP snapshots for clients

## 📈 Performance & Limits

- **Single Keyword**: ~10–20 seconds (depends on Scrape.do response time)
- **Batch of 10**: 3–5 minutes typical
- **Large Sets (50+)**: 20–40 minutes depending on API credits and batching
- **API Calls**: 1 Scrape.do request per keyword
- **Reliability**: 95%+ extraction success, 98%+ data accuracy

## 🧩 Troubleshooting

- **API error** → Check YOUR_SCRAPEDO_TOKEN and your API credits
- **No keywords loaded** → Verify the Google Sheet ID and that the tab name = Keywords
- **Permission denied** → Re-authenticate Google Sheets OAuth2 in n8n
- **Empty results** → Check the parsing logic and verify search term validity
- **Workflow stops early** → Ensure the batching loop (SplitInBatches) is properly connected

## 🤝 Support & Community

- n8n Forum: https://community.n8n.io
- n8n Docs: https://docs.n8n.io
- Scrape.do Dashboard: https://dashboard.scrape.do

## 🎯 Final Notes

This workflow provides a repeatable foundation for extracting competitor SERP rankings with Scrape.do and saving them to Google Sheets. You can extend it with filtering, richer parsing, or integration with reporting dashboards to create a fully automated SEO intelligence pipeline.
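As a rough sketch of how the encoded keyword and the Scrape.do request URL could be assembled per keyword in an n8n Code node (the Google query parameters gl/hl and num=10 are assumptions; check your actual HTTP Request node configuration):

```javascript
// Minimal sketch (n8n Code node) — query parameters are illustrative assumptions.
const token = 'YOUR_SCRAPEDO_TOKEN'; // placeholder

return $input.all().map((item) => {
  const { Keyword, 'Target Country': country } = item.json;

  // Google search URL for the keyword, localized to the target country
  const googleUrl =
    `https://www.google.com/search?q=${encodeURIComponent(Keyword)}` +
    `&gl=${(country || 'US').toLowerCase()}&num=10`;

  // Scrape.do wraps the target URL; render=true requests full HTML rendering
  const requestUrl =
    `https://api.scrape.do/?token=${token}` +
    `&url=${encodeURIComponent(googleUrl)}&render=true`;

  return { json: { ...item.json, requestUrl } };
});
```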
by Jitesh Dugar
Revolutionize university admissions with intelligent AI-driven application evaluation that analyzes student profiles, calculates eligibility scores, and automatically routes decisions, saving 2.5 hours per application and cutting decision time from weeks to hours.

## 🎯 What This Workflow Does

Transforms your admissions process from manual application review to intelligent automation:

1. 📝 **Captures Applications**: Jotform intake with student info, GPA, test scores, essay, and extracurriculars
2. 🤖 **AI Holistic Evaluation**: OpenAI analyzes academic strength, essay quality, extracurriculars, and fit
3. 🎯 **Intelligent Scoring**: Evaluates students using 40% academics, 25% extracurriculars, 20% essay, 15% fit (0-100 scale)
4. 🚦 **Smart Routing**: Automatically routes based on the AI evaluation:
   - **Auto-Accept (95-100)**: Acceptance letter with scholarship details → Admin alert → Database
   - **Interview Required (70-94)**: Interview invitation with scheduling link → Admin alert → Database
   - **Reject (<70)**: Respectful rejection with improvement suggestions → Database
5. 💰 **Scholarship Automation**: Calculates merit scholarships ($5k-$20k+) based on the eligibility score
6. 📊 **Analytics Tracking**: All applications logged to Google Sheets for admissions insights

## ✨ Key Features

- **AI Holistic Evaluation**: Comprehensive analysis weighing academics, extracurriculars, essays, and institutional fit
- **Intelligent Scoring System**: 0-100 eligibility score with automated categorization and scholarship determination
- **Structured Output**: Consistent JSON schema with academic strength, admission likelihood, and decision reasoning
- **Automated Communication**: Personalized acceptance, interview, and rejection letters for every applicant
- **Fallback Scoring**: Manual GPA/SAT scoring if the AI fails, ensuring zero downtime (see the sketch at the end of this listing)
- **Admin Alerts**: Instant email notifications for exceptional high-scoring applicants (95+)
- **Comprehensive Analytics**: Track acceptance rates, average scores, scholarship distribution, and applicant demographics
- **Customizable Criteria**: Easy prompt editing to match your institution's values and requirements

## 💼 Perfect For

- **Universities & Colleges**: Processing 500+ undergraduate applications per semester
- **Graduate Programs**: Screening master's and PhD applications with consistent evaluation
- **Private Institutions**: Scaling admissions without expanding admissions staff
- **Community Colleges**: Handling high-volume transfer and new student applications
- **International Offices**: Evaluating global applicants 24/7 across all timezones
- **Scholarship Committees**: Identifying merit scholarship candidates automatically

## 🔧 What You'll Need

### Required Integrations

- **Jotform**: Application form with student data collection (the free tier works)
  - Create your form for free on Jotform using this link
  - Include the fields: Name, Email, Phone, GPA, SAT Score, Major, Essay, Extracurriculars
- **OpenAI API**: GPT-4o-mini for cost-effective AI evaluation (~$0.01-0.05 per application)
- **Gmail**: Automated applicant communication (acceptance, interview, and rejection letters)
- **Google Sheets**: Application database and admissions analytics

### Optional Integrations

- **Slack**: Real-time alerts for exceptional applicants
- **Calendar APIs**: Automated interview scheduling
- **Student Information System (SIS)**: Push accepted students to your enrollment system
- **Document Analysis Tools**: OCR for transcript verification

## 🚀 Quick Start

1. **Import Template**: Copy the JSON and import it into n8n (requires LangChain support)
2. **Create Jotform**: Use the provided field structure (Name, Email, GPA, SAT, Major, Essay, etc.)
3. **Add API Keys**: OpenAI, Jotform, Gmail OAuth2, Google Sheets
4. **Customize the AI Prompt**: Edit the admissions criteria with your university's specific requirements and values
5. **Set Score Thresholds**: Adjust the auto-accept (95+), interview (70-94), and reject (<70) cutoffs if needed
6. **Personalize Emails**: Update the templates with your university branding, dates, and contact info
7. **Create the Google Sheet**: Set up columns: id, Name, Email, GPA, SAT Score, Major, Essay, Extracurriculars
8. **Test & Deploy**: Submit a test application with pinned data and verify all nodes execute correctly

## 🎨 Customization Options

- **Adjust Evaluation Weights**: Change the academics (40%), extracurriculars (25%), essay (20%), and fit (15%) percentages
- **Multiple Programs**: Clone the workflow for different majors with unique evaluation criteria
- **Add Document Analysis**: Integrate OCR for transcript and recommendation letter verification
- **Interview Scheduling**: Connect Google Calendar or Calendly for automated booking
- **SIS Integration**: Push accepted students directly to Banner, Ellucian, or PeopleSoft
- **Waitlist Management**: Add conditional routing for borderline scores (65-69)
- **Diversity Tracking**: Include demographic fields and bias detection in the AI evaluation
- **Financial Aid Integration**: Automatically calculate need-based aid eligibility alongside merit scholarships

## 📈 Expected Results

- **90% reduction** in manual application review time (from 2.5 hours to 15 minutes per application)
- **24-48 hour** decision turnaround vs. the 4-6 week traditional process
- **40% higher yield rate**: faster responses increase enrollment commitment
- **100% consistency**: every applicant evaluated against identical criteria
- **Zero missed applications**: automated tracking ensures no application falls through the cracks
- **Data-driven admissions**: comprehensive analytics on applicant pools and acceptance patterns
- **Better applicant experience**: professional, timely communication regardless of decision
- **Defensible decisions**: documented scoring criteria for accreditation and compliance

## 🏆 Use Cases

- **Large Public Universities**: Screen 5,000+ applications per semester, identify the top 20% for auto-admit, and route borderline cases to committee review.
- **Selective Private Colleges**: Evaluate 500+ highly competitive applications, calculate merit scholarships automatically, and schedule interviews with top candidates.
- **Graduate Programs**: Process master's and PhD applications with research-experience weighting, flag candidates for faculty review, and automate fellowship awards.
- **Community Colleges**: Handle high-volume open enrollment while identifying honors program candidates and scholarship recipients instantly.
- **International Admissions**: Evaluate global applicants 24/7, account for different GPA scales and testing systems, and respond same-day regardless of timezone.
- **Rolling Admissions**: Provide instant decisions for early applicants, fill classes strategically, and optimize scholarship budget allocation.

## 💡 Pro Tips

- **Calibrate Your AI**: After 100+ applications, refine the evaluation criteria based on enrolled student success
- **A/B Test Thresholds**: Experiment with score cutoffs (e.g., 93 vs 95 for auto-admit) to optimize yield
- **Build a Waitlist Pipeline**: Keep 70-84 score candidates engaged for spring enrollment or the next year
- **Track Source Effectiveness**: Add UTM parameters to measure which recruiting channels deliver the best students
- **Committee Review**: Route 85-94 scores to a human admissions committee for final review
- **Bias Audits**: Quarterly review of AI decisions by demographic group to ensure fairness
- **Parent Communication**: Add parent/guardian emails for admitted students under 18
- **Financial Aid Coordination**: Sync scholarship awards with the financial aid office for packaging

## 🎓 Learning Resources

This workflow demonstrates:

- **AI Agents with structured output**: LangChain integration for consistent JSON responses
- **Multi-stage conditional routing**: IF nodes for three-tier decision logic
- **Holistic evaluation**: Weighted scoring across multiple dimensions
- **Automated communication**: HTML email templates with dynamic content
- **Real-time notifications**: Admin alerts for high-value applicants
- **Analytics and data logging**: Google Sheets integration for reporting
- **Fallback mechanisms**: Manual scoring when the AI is unavailable

Perfect for learning advanced n8n automation patterns in educational technology!

## 🔐 Compliance & Ethics

- **FERPA Compliance**: Protects student data with secure credential handling
- **Fair Admissions**: Documented criteria reduce unconscious bias
- **Human Oversight**: Committee review option for borderline cases
- **Transparency**: Applicants can request the evaluation criteria
- **Appeals Process**: Structured workflow for decision reconsideration
- **Data Retention**: Configurable Google Sheets retention policies

## 📊 What Gets Tracked

- Application submission date and time
- Complete student profile (GPA, test scores, major, essay, activities)
- AI eligibility score (0-100) and decision category
- Academic strength rating (excellent/strong/average)
- Scholarship eligibility and amount ($0-$20,000+)
- Admission likelihood (high/medium/low)
- Decision outcome (accepted/interview/rejected)
- Email delivery status and open rates
- Time from application to decision

Ready to transform your admissions process? Import this template and start evaluating applications intelligently in under 1 hour.

Questions or customization needs? The workflow includes detailed sticky notes explaining each section and comprehensive fallback logic for reliability.
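As a minimal sketch of the fallback-scoring idea (the 40/25/20/15 weights come from the listing; the normalization ranges, proxies, and field names are illustrative assumptions, and the template's actual formula may differ):

```javascript
// Minimal sketch (n8n Code node) of fallback scoring when the AI evaluation fails.
const { gpa, sat, extracurricularCount, essayWordCount } = $json; // hypothetical field names

// Normalize each dimension to 0-100 (assumed scales: GPA 0-4.0, SAT 400-1600)
const academics = ((gpa / 4.0) * 0.5 + ((sat - 400) / 1200) * 0.5) * 100;
const extracurriculars = Math.min(extracurricularCount * 20, 100); // rough proxy
const essay = Math.min((essayWordCount / 500) * 100, 100);          // rough proxy
const fit = 50; // neutral default when no AI judgment is available

const eligibilityScore = Math.round(
  academics * 0.40 + extracurriculars * 0.25 + essay * 0.20 + fit * 0.15
);

const decision =
  eligibilityScore >= 95 ? 'auto-accept' :
  eligibilityScore >= 70 ? 'interview' : 'reject';

return [{ json: { ...$json, eligibilityScore, decision, scoredBy: 'fallback' } }];
```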
by Automate With Marc
## Auto-Edit Google Drive Images with Nano Banana + Social Auto-Post

Drop an image into Google Drive and let this workflow handle the rest: it auto-cleans and enhances the image with Google's Nano Banana (via the Wavespeed API), generates a catchy caption with GPT-5, and publishes directly to your connected social accounts using Postiz.

👉 Watch step-by-step video tutorials of workflows like these at https://www.youtube.com/watch?v=4wk6PYgBtBM&list=PL05w1TE8X3bb1H9lXBqUy98zmTrrPP-s1

## What it does

1. Triggers from Google Drive when a new image is uploaded
2. Sends the image to Nano Banana to declutter, brighten, and make it real-estate/photo-listing ready
3. Polls for the edited result until it's complete (see the sketch below)
4. Logs the edited image URL into Google Sheets for tracking
5. Downloads and uploads the edited image into the Postiz media library
6. Generates an engaging caption with the GPT-5 Caption Agent
7. Publishes instantly to Instagram (can be extended to TikTok, LinkedIn, etc.)

## Perfect for

- Real-estate agents posting property shots
- Ecommerce sellers updating product catalogs
- Social media marketers needing fast, polished posts

## Apps & Services Used

- Google Drive (trigger)
- Wavespeed API – Google Nano Banana (image editing)
- Google Sheets (logging)
- Postiz (social scheduling/posting)
- OpenAI GPT-5 (caption agent)

## Setup

1. Connect your Google Drive and select the upload folder.
2. Add your Wavespeed API key for Nano Banana.
3. Connect Google Sheets for logging.
4. Add your Postiz API credentials and set the integration ID for your channel(s).
5. Enter your OpenAI API key for GPT-5 captioning.

## Customization

- Adjust the edit prompt for different use cases (e.g., product cleanup, lighting tweaks).
- Change the Postiz post type to scheduled instead of "now."
- Add more Postiz posts for multi-platform publishing.
- Insert an approval loop (Slack/Email) before posting.

## Logs

- **Edited Image Log (Sheets)**: stores the final image URL and a timestamp.
- **Publishing Log (Sheets)**: tracks workflow status per asset.

## Notes

- Sticky notes in the template explain each major block.
- Replace the sample IDs with your own (folder IDs, sheet IDs, Postiz integration).
- Keep all API keys in n8n Credentials, not in node parameters.
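The polling step can be pictured as follows. In the template this is done with HTTP Request, IF, and Wait nodes; the sketch below condenses the decision into one Code node, and the response field names ("status", "outputs") are assumptions about the Wavespeed result payload:

```javascript
// Minimal sketch (n8n Code node) of the poll-until-complete decision.
const result = $json; // response from the "get prediction result" HTTP Request node

if (result.status === 'completed' && result.outputs?.length) {
  // Done: pass the edited image URL downstream for logging and upload
  return [{ json: { done: true, editedImageUrl: result.outputs[0] } }];
}

if (result.status === 'failed') {
  throw new Error(`Nano Banana edit failed: ${result.error ?? 'unknown error'}`);
}

// Not ready yet: loop back through a Wait node and re-check
return [{ json: { done: false, requestId: result.id } }];
```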
by Aitor | 1Node
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automates the distribution and scheduling of video content across multiple social platforms (TikTok, YouTube, Facebook, Instagram, Threads) through Postiz. Videos are collected from Google Drive, approved manually, and scheduled via the Postiz community node.

## 🧾 Requirements

- **Google Drive** account with access to the folder to watch for new uploads
- Videos in mp4 format ready to be shared; alternatively, connect a CloudConvert community node to convert the format before uploading to Postiz
- **Postiz** account with integrations for TikTok, YouTube, Facebook, Instagram, and Threads

## 🔗 Useful Links

- Postiz Docs
- Postiz Community Node

## 🔄 Workflow Steps

1. **Trigger: Google Drive File Added**: Watches your selected Google Drive folder for new file uploads.
2. **Download File**: Downloads the detected video from Drive.
3. **Upload to Postiz**: The video is uploaded to Postiz to prepare for social scheduling.
4. **Set Fields**: Manual setting of social options.
5. **Extract Datetime (AI)**: Uses OpenAI to find/predict the intended publish date and time, since a specific datetime format is required to schedule on Postiz (see the sketch below).
6. **Get Social Integrations**: Fetches the list of the user's connected platforms from Postiz.
7. **Split and Filter Integrations**: Splits the process per platform (TikTok, YouTube, Facebook, Instagram, Threads).
8. **Schedule Post**: For each enabled platform, schedules the video with the chosen options.

## 🙋‍♂️ Need Help?

Connect with 1 Node
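To illustrate the datetime-normalization step, here is a minimal Code-node sketch for inputs that are already machine-parseable, deferring anything else to the AI extraction step. The input field name and the ISO-8601 output format are assumptions about what the Postiz scheduling endpoint expects:

```javascript
// Minimal sketch (n8n Code node) — field names and output format are assumptions.
const raw = $json.publishTime; // e.g. "tomorrow 9am" or "2025-03-01 14:30"

const parsed = new Date(raw);
if (isNaN(parsed.getTime())) {
  // Not machine-parseable: hand off to the AI extraction step instead
  return [{ json: { needsAi: true, raw } }];
}

return [{ json: { needsAi: false, scheduledAt: parsed.toISOString() } }];
```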
by Onur
## 🏠 Extract Zillow Property Data to Google Sheets with Scrape.do

This template requires a self-hosted n8n instance to run.

A complete n8n automation that extracts property listing data from Zillow URLs using the Scrape.do web scraping API, parses key property information, and saves structured results into Google Sheets for real estate analysis, market research, and property tracking.

## 📋 Overview

This workflow provides a lightweight real estate data extraction solution that pulls property details from Zillow listings and organizes them into a structured spreadsheet. Ideal for real estate professionals, investors, market analysts, and property managers who need automated property data collection without manual effort.

### Who is this for?

- Real estate investors tracking properties
- Market analysts conducting property research
- Real estate agents monitoring listings
- Property managers organizing data
- Data analysts building real estate databases

### What problem does this workflow solve?

- Eliminates manual copy-paste from Zillow
- Processes multiple property URLs in bulk
- Extracts structured data (price, address, Zestimate, etc.)
- Automates saving results into Google Sheets
- Ensures repeatable and consistent data collection

## ⚙️ What this workflow does

1. **Manual Trigger** → Starts the workflow manually
2. **Read Zillow URLs from Google Sheets** → Reads property URLs from a Google Sheet
3. **Scrape Zillow URL via Scrape.do** → Fetches the full HTML from Zillow (bypasses PerimeterX protection)
4. **Parse Zillow Data** → Extracts structured property information from the HTML (see the sketch below)
5. **Write Results to Google Sheets** → Saves the parsed data into a results sheet

## 📊 Output Data Points

| Field | Description | Example |
|----------------|------------------------------|------------------------------------------|
| URL | Original Zillow listing URL | https://www.zillow.com/homedetails/... |
| Price | Property listing price | $300,000 |
| Address | Street address | 8926 Silver City |
| City | City name | San Antonio |
| State | State abbreviation | TX |
| Days on Zillow | How long listed | 5 |
| Zestimate | Zillow's estimated value | $297,800 |
| Scraped At | Timestamp of extraction | 2025-01-29T12:00:00.000Z |

## ⚙️ Setup

### Prerequisites

- n8n instance (self-hosted)
- Google account with Sheets access
- Scrape.do account with an API token (1000 free credits/month)

### Google Sheet Structure

This workflow uses one Google Sheet with two tabs:

**Input Tab: "Sheet1"**

| Column | Type | Description | Example |
|--------|------|--------------------|--------------------------------------------|
| URLs | URL | Zillow listing URL | https://www.zillow.com/homedetails/123... |

**Output Tab: "Results"**

| Column | Type | Description | Example |
|----------------|-----------|----------------------|------------------------------------------|
| URL | URL | Original listing URL | https://www.zillow.com/homedetails/... |
| Price | Text | Property price | $300,000 |
| Address | Text | Street address | 8926 Silver City |
| City | Text | City name | San Antonio |
| State | Text | State code | TX |
| Days on Zillow | Number | Days listed | 5 |
| Zestimate | Text | Estimated value | $297,800 |
| Scraped At | Timestamp | When scraped | 2025-01-29T12:00:00.000Z |

## 🛠 Step-by-Step Setup

1. **Import Workflow**: Copy the JSON → n8n → Workflows → + Add → Import from JSON
2. **Configure Scrape.do API**:
   - Sign up at the Scrape.do Dashboard
   - Get your API token
   - In the HTTP Request node, replace YOUR_SCRAPE_DO_TOKEN with your actual token
   - The workflow uses super=true for premium residential proxies (10 credits per request)
3. **Configure Google Sheets**:
   - Create a new Google Sheet
   - Add two tabs: "Sheet1" (input) and "Results" (output)
   - In Sheet1, add the header "URLs" in cell A1
   - Add Zillow URLs starting from A2
   - Set up Google Sheets OAuth2 credentials in n8n
   - Replace YOUR_SPREADSHEET_ID with your actual Google Sheet ID
   - Replace YOUR_GOOGLE_SHEETS_CREDENTIAL_ID with your credential ID
4. **Run & Test**:
   - Add 1-2 test Zillow URLs in Sheet1
   - Click "Execute workflow"
   - Check the results in the Results tab

## 🧰 How to Customize

- **Add more fields**: Extend the parsing logic in the "Parse Zillow Data" node to capture additional data (bedrooms, bathrooms, square footage)
- **Filtering**: Add conditions to skip certain properties or price ranges
- **Rate Limiting**: Insert a Wait node between requests if processing many URLs
- **Error Handling**: Add error branches to handle failed scrapes gracefully
- **Scheduling**: Replace the Manual Trigger with a Schedule Trigger for automated daily/weekly runs

## 📊 Use Cases

- **Investment Analysis**: Track property prices and Zestimates over time
- **Market Research**: Analyze listing trends in specific neighborhoods
- **Portfolio Management**: Monitor properties for sale in target areas
- **Competitive Analysis**: Compare similar properties across locations
- **Lead Generation**: Build databases of properties matching specific criteria

## 📈 Performance & Limits

- **Single Property**: ~5-10 seconds per URL
- **Batch of 10**: 1-2 minutes typical
- **Large Sets (50+)**: 5-10 minutes depending on Scrape.do credits
- **API Calls**: 1 Scrape.do request per URL (10 credits with super=true)
- **Reliability**: 95%+ success rate with premium proxies

## 🧩 Troubleshooting

| Problem | Solution |
|---------|----------|
| API error 400 | Check your Scrape.do token and credits |
| URL showing "undefined" | Verify the Google Sheet column name is "URLs" (capital U) |
| No data parsed | Check if Zillow changed their HTML structure |
| Permission denied | Re-authenticate Google Sheets OAuth2 in n8n |
| 50000 character error | Verify the Parse Zillow Data code is extracting fields, not returning raw HTML |
| Price shows HTML/CSS | Update the price extraction regex in the Parse Zillow Data node |

## 🤝 Support & Community

- Scrape.do Documentation
- Scrape.do Dashboard
- Scrape.do Zillow Scraping Guide
- n8n Forum
- n8n Docs

## 🎯 Final Notes

This workflow provides a repeatable foundation for extracting Zillow property data with Scrape.do and saving it to Google Sheets. You can extend it with:

- Historical tracking (append timestamps)
- Price change alerts (compare with previous scrapes)
- Multi-platform scraping (Redfin, Realtor.com)
- Integration with CRM or reporting dashboards

Important: Scrape.do handles all anti-bot bypassing (PerimeterX, CAPTCHAs) automatically with rotating residential proxies, so you only pay for successful requests. Always use the super=true parameter for Zillow to ensure high success rates.
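As a rough illustration of the "Parse Zillow Data" idea, here is a Code-node sketch. The regexes are illustrative assumptions against JSON embedded in the page; Zillow's markup changes often, so the template's actual extraction logic may differ:

```javascript
// Minimal sketch (n8n Code node) — regexes and field names are assumptions.
const html = $json.data ?? $json.body ?? ''; // raw HTML from the Scrape.do request

// Grab the first capture group of a regex, or an empty string
const pick = (re) => (html.match(re)?.[1] ?? '').trim();

const price = pick(/"price"\s*:\s*"?\$?([\d,]+)/);
const address = pick(/"streetAddress"\s*:\s*"([^"]+)"/);
const city = pick(/"addressLocality"\s*:\s*"([^"]+)"/);
const state = pick(/"addressRegion"\s*:\s*"([A-Z]{2})"/);
const zestimate = pick(/"zestimate"\s*:\s*"?\$?([\d,]+)/);

return [{
  json: {
    URL: $json.url,
    Price: price ? `$${price}` : '',
    Address: address,
    City: city,
    State: state,
    Zestimate: zestimate ? `$${zestimate}` : '',
    'Scraped At': new Date().toISOString(),
  },
}];
```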
by WeblineIndia
## Send daily applicant digest by role from Gmail to hiring managers with Google Gemini

This workflow automatically collects all new job application emails labeled "applicants" in your Gmail from the last 24 hours. Every day at 6:00 PM (Asia/Kolkata), it extracts structured details (name, email, phone, role, experience, skills, location, notice period, summary) from each applicant using Gemini AI or OpenAI. It then groups applicants by role and manager, compiles a neat HTML table digest for each manager, and emails them a single summary, so hiring managers get everything they need, at a glance, in one place.

## Who's It For

- Recruiters and hiring managers tired of digging through multiple application threads
- Small HR teams and agencies not yet on a full applicant tracking system
- Anyone wanting a consolidated, role-targeted applicant update each day
- Teams that want to automate candidate triage using Google Workspace and AI

## How It Works

1. **Schedule Trigger (6PM IST)**: Runs automatically at 18:00 India time.
2. **Fetch Applicant Emails**: Reads Gmail for emails labeled 'applicants' from the past 24 hours.
3. **Prepare Email Text**: Converts email content to plain text for reliable AI extraction.
4. **Extract Applicant Details**: Gemini/OpenAI extracts the applicant's info as structured JSON.
5. **Assign Manager Emails**: Routes each applicant to the correct manager via a role→email mapping, with a fallback (see the sketch at the end of this listing).
6. **Group & Build HTML Tables**: Organizes applicants by manager and role and builds summary tables.
7. **Send Digest to Managers**: Sends each manager one HTML summary email covering their new applicants.

## How to Set Up

1. Create/verify the Gmail label applicants and set up filters to route job emails there.
2. Import the workflow and use your Google/Gmail and Gemini/OpenAI accounts as credentials.
3. Configure connections:
   - Gmail with OAuth2 (IMAP is not required; the workflow uses the Gmail API)
   - Gemini or OpenAI API key for extraction
4. Set the role→manager mapping in the "Assign Manager Emails" node (just edit the map!).
5. Adjust time/defaults: edit the schedule and fallback email if you wish.
6. Test it: send yourself a test application, label it, and check the workflow logs.

## Requirements

- Gmail account (with OAuth2 enabled and the 'applicants' label set up)
- Gemini or OpenAI API key for structured AI extraction
- n8n instance (self-hosted or cloud)
- SMTP credentials (if using direct email instead of the Gmail node)
- At least one valid hiring manager email mapped to a role

## How to Customize the Workflow

- Centralize config with a Set node (label name, fallback/manager email, model name, schedule).
- Add attachment-to-text conversion for applications with resume attachments.
- Normalize role names in the mapping code for more robust routing.
- Enable additional delivery: Slack, Teams, a Google Sheets log, or an extra Cron trigger for mid-day urgents.
- Refine the AI extraction prompt for specific fields (add portfolio URL, etc.).
- Change the schedule for daily, weekly, or per-role timing.

## Add-Ons / Extensions

- **Resume Text Extraction**: Add PDF/DOCX-to-text parsing for attachment-only applications.
- **ChatOps**: Send the summary to Slack or Teams channels alongside or instead of email.
- **Applicant Logging**: Auto-log every applicant/action into Google Sheets, Notion, or Airtable.
- **Multi-timezone**: Duplicate/modify the Cron trigger for different manager regions or urgency levels.

## Use Case Examples

- **Tech Hiring**: Java, Python, and Frontend candidates are automatically routed to their respective leads.
- **Small Agency**: All applications summarized for reviewers, with per-role breakdowns.
- **HR Operations**: Daily rollups sent before the hiring sync, facilitating fast decision-making.

## Common Troubleshooting

| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| No emails processed | No 'applicants' label or wrong time window | Check Gmail filters and adjust the search query in the fetch node |
| All digests go to fallback manager | Incorrect or missing role→manager mapping | Normalize role text in the assignment node, expand the map |
| AI extraction returns bad/missing JSON | Wrong prompt, high temperature, or missing field names | Tighten the prompt, lower the temperature, check the example response |
| Duplicate/old emails appear | Date filter not correct | Use 'newer_than:1d' and keep 'mark as read' in the email node |
| SMTP/Gmail send errors | Auth problem, quota, or missing app password | Use OAuth2, check daily send caps and app password settings |
| Blank or partially filled summary table | AI unable to parse a poorly formatted/empty email | Improve sender email consistency, add fallback handling |
| Attachments not processed | No attachment extraction node | Add attachment-to-text parsing before the AI node |

## Need Help?

If you get stuck, need help customizing a mapping or adding nodes, or want to integrate extra steps (e.g., resume text, Slack), just ask! We're happy to guide you step by step, review your workflow, or help you troubleshoot any errors.

Contact WeblineIndia — Your n8n Automation partner!
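To picture the role→manager mapping in the "Assign Manager Emails" node, here is a minimal sketch. The role names and addresses are hypothetical; edit the map to match your team:

```javascript
// Minimal sketch (n8n Code node) — role names and emails are placeholders.
const ROLE_MAP = {
  'java developer': 'java.lead@example.com',
  'python developer': 'python.lead@example.com',
  'frontend developer': 'frontend.lead@example.com',
};
const FALLBACK = 'hr@example.com'; // catches unmapped or malformed roles

return $input.all().map((item) => {
  // Normalize the extracted role so "Sr. Java Developer " still matches
  const role = (item.json.role || '').toLowerCase().trim();
  const managerEmail =
    Object.entries(ROLE_MAP).find(([key]) => role.includes(key))?.[1] ?? FALLBACK;
  return { json: { ...item.json, managerEmail } };
});
```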
by Dr. Firas
## Build Your First AI Agent with ChatGPT-5

### Who is this for?

This workflow is designed for beginners and professionals who want to build their first AI-powered assistant with n8n. It's perfect for anyone managing online trainings, consultations, or services that require both a knowledge base and appointment scheduling.

### What problem is this workflow solving?

Manually handling client questions, checking your availability, and confirming bookings can be time-consuming and error-prone. This workflow automates the process, ensuring quick, accurate answers and seamless scheduling directly through chat.

### What this workflow does

- Answers user questions using your knowledge base stored in Google Sheets.
- Checks availability in Google Calendar and proposes alternative time slots if needed.
- Books 1-hour appointments in Paris time, only after client confirmation (see the sketch below).
- Sends a professional confirmation email with all appointment details.

### Setup

1. Import this workflow into your n8n instance.
2. Connect your Google Sheets, Gmail, and Google Calendar credentials.
3. Add your knowledge base into Google Sheets (questions, answers, policies, packs, etc.).
4. Test the workflow using the connected Chat Trigger node to start conversations with the AI Agent.

### How to customize this workflow to your needs

- Update the Google Sheets database with your own training packs, services, or company FAQs.
- Adjust the email template to reflect your branding and communication style.
- Modify the appointment duration if you need sessions longer or shorter than 1 hour.
- Add extra nodes (e.g., CRM integration) to capture leads or sync appointments with external systems.

📄 Documentation: Notion Guide

Need help customizing? Contact me for consulting and support: Linkedin / Youtube
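As a minimal sketch of building the fixed 1-hour Europe/Paris event times that a Google Calendar node could receive (the input field names are hypothetical placeholders for whatever the agent extracts from the conversation):

```javascript
// Minimal sketch (n8n Code node) — "requestedStart" and "clientName" are placeholders.
const start = new Date($json.requestedStart); // e.g. "2025-06-12T14:00:00+02:00"
const end = new Date(start.getTime() + 60 * 60 * 1000); // fixed 1-hour session

return [{
  json: {
    start: { dateTime: start.toISOString(), timeZone: 'Europe/Paris' },
    end: { dateTime: end.toISOString(), timeZone: 'Europe/Paris' },
    summary: `Consultation with ${$json.clientName ?? 'client'}`,
  },
}];
```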
by Fabian Herhold
## Who's it for

Sales teams, BDRs, account managers, and customer success professionals who want to show up prepared for every meeting. Perfect for anyone using Calendly who wants to automate prospect research and never walk into a call blind again.

Watch the full tutorial here:

## What it does

This workflow automatically researches your meeting attendees the moment they book through Calendly. It combines multiple AI agents to gather comprehensive intelligence:

- **Company Research**: Uses Perplexity AI to validate company details, recent news, funding, leadership changes, and business signals
- **LinkedIn Analysis**: Leverages RapidAPI to analyze the person's profile, recent posts, comments, and engagement patterns from the last 60-90 days
- **Signal Detection**: Identifies hiring signals, growth indicators, and potential risks with confidence scoring
- **Meeting Prep**: Synthesizes everything into personalized talking points, conversation starters, and strategic recommendations

The final research brief is delivered directly to your Slack, saving 30-45 minutes of manual research per meeting.

## How it works

1. Someone books a meeting via your Calendly (the booking form must include a LinkedIn URL)
2. The main AI Agent extracts the company domain from the attendee's email (see the sketch below) and coordinates three specialist research agents
3. The Company Agent researches business intel via Perplexity
4. The Person Agent analyzes LinkedIn activity using 4 different RapidAPI endpoints
5. The Signal Agent identifies business opportunities and risks
6. A comprehensive meeting brief is sent to your Slack channel

## Requirements

API credentials needed:

- Calendly API (for the webhook trigger)
- OpenAI API key (GPT-4 recommended for orchestration)
- Perplexity API key (for web research)
- RapidAPI subscription (for LinkedIn data endpoints)
- Slack bot token (for output delivery)

Important: Your Calendly booking form must include a LinkedIn URL field to get optimal results.

## How to set up

1. **Configure Calendly**: Add the Calendly trigger node with your API credentials
2. **Update the Slack destination**: Modify the final Slack node with your user ID or channel
3. **Add API keys**: Configure all the API credentials in their respective nodes
4. **Test the workflow**: Book a test meeting through Calendly to verify the complete flow
5. **Customize prompts**: Adjust the AI agent prompts for your specific industry or use case

The workflow uses structured JSON output with confidence scoring and source citation for reliable, actionable intelligence.

## How to customize the workflow

- **Change the output destination**: Replace Slack with email, Teams, or a CRM integration
- **Modify research depth**: Adjust the AI prompts to focus on specific industries or company types
- **Add more signals**: Extend the Signal Research Agent to detect additional business indicators
- **Integrate with your CRM**: Add nodes to automatically update contact records in your sales system
- **Schedule follow-ups**: Connect to calendar tools to automatically schedule research updates

The modular design makes it easy to adapt for different sales processes and research requirements.
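As a minimal sketch of the domain-extraction step (the free-mail exclusion list is an illustrative assumption; the template's agent may handle this inside its prompt instead):

```javascript
// Minimal sketch (n8n Code node) of extracting a researchable company domain.
const FREE_MAIL = new Set(['gmail.com', 'outlook.com', 'yahoo.com', 'hotmail.com', 'icloud.com']);

const email = ($json.email || '').toLowerCase().trim(); // from the Calendly booking
const domain = email.split('@')[1] ?? '';

return [{
  json: {
    email,
    // Only treat corporate domains as researchable companies
    companyDomain: domain && !FREE_MAIL.has(domain) ? domain : null,
  },
}];
```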
by Olivier
This template syncs prospects from ProspectPro into HubSpot. It checks whether a company already exists in HubSpot (by ProspectPro ID or domain), then updates the record or creates a new one. Sync results are logged back in ProspectPro with tags to prevent duplicates and mark errors, ensuring reliable and repeatable integrations.

## ✨ Features

- Automatically sync ProspectPro prospects to HubSpot companies
- Smart search logic: match by ProspectPro ID first, then by domain (see the sketch below)
- Creates new HubSpot companies when no match is found
- Updates existing HubSpot companies with the latest ProspectPro data
- Logs sync results back into ProspectPro with tags (HubspotSynced, HubspotSyncFailed)
- Extendable and modular: use as a trigger workflow or callable sub-flow

## ⚙ Requirements

- n8n instance or cloud workspace
- Install the ProspectPro Verified Community Node
- ProspectPro account & API credentials (14-day free trial)
- HubSpot account with an OAuth2 app and API credentials

## 🔧 Setup Instructions

1. Import the template and set your credentials (ProspectPro, HubSpot).
2. Connect it to a trigger (e.g., the ProspectPro "New website visitor" trigger) or call it as a sub-workflow.
3. Add a property to HubSpot for the ProspectPro ID if you don't already have one.
4. Adjust the sync logic in the "Continue?" node and the HubSpot fields to match your setup.
5. Optional: extend the error handling, add Slack/CRM notifications, or sync HubSpot data back into ProspectPro.

## 🔐 Security Notes

- Prevents re-processing of failed syncs using the HubspotSyncFailed tag
- Error branches included for failed updates/creates
- Manual resolution required if sync errors persist

## 🧪 Testing

1. Run with a ProspectPro ID of a company with a known domain
2. Check HubSpot for the creation or update of the company record
3. Verify the updated tags (HubspotSynced / HubspotSyncFailed) in ProspectPro

## 📌 About ProspectPro

ProspectPro is a B2B prospecting platform for Dutch SMEs. It helps sales teams identify prospects, track website visitors, and streamline sales without a full CRM.

- Website: https://www.prospectpro.nl
- Platform: https://mijn.prospectpro.nl
- API docs: https://www.docs.bedrijfsdata.nl
- Support: https://www.prospectpro.nl/klantenservice
- Support hours: Monday–Friday, 09:00–17:00 CET

## 📌 About HubSpot

HubSpot is a leading CRM platform offering marketing, sales, and customer service tools. It helps companies manage contacts, automate workflows, and grow their customer base.

- Website: https://www.hubspot.com
- Developer Docs: https://developers.hubspot.com
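As a rough sketch of the two-step match logic against HubSpot's CRM search endpoint (POST /crm/v3/objects/companies/search); "prospectpro_id" stands in for the custom company property from setup step 3, and its internal name is an assumption:

```javascript
// Minimal sketch (n8n Code node) of the ID-first, domain-second search bodies.
function searchBody(propertyName, value) {
  return {
    filterGroups: [{ filters: [{ propertyName, operator: 'EQ', value }] }],
    properties: ['name', 'domain', 'prospectpro_id'],
    limit: 1,
  };
}

// First try the exact ProspectPro ID match...
const byId = searchBody('prospectpro_id', $json.prospectproId);
// ...and if that search returns no results, fall back to matching the domain
const byDomain = searchBody('domain', $json.domain);

return [{ json: { byId, byDomain } }];
```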