by Shelly-Ann Davy
Description
Wake up gently. This elegant workflow runs every morning at 7 AM, picks one uplifting affirmation from a curated list, and delivers it to your inbox (with optional Telegram). Zero code, zero secrets: just drop in your SMTP and Telegram credentials, edit the affirmations, and activate. Perfect for creators, homemakers, and entrepreneurs who crave intention and beauty before the day begins.

How it works (high-level steps)
- Cron wakes the flow daily at 7 AM.
- Set: Configuration stores your email, Telegram chat ID, and affirmations.
- Code node randomly selects one affirmation (see the sketch below).
- Email node sends the message via SMTP.
- IF node decides whether to forward it to Telegram as well.

Set-up time: 2–3 minutes
- 30 s: add SMTP credential
- 30 s: add Telegram Bot credential (optional)
- 1 min: edit affirmations & email addresses
- 30 s: activate

Detailed instructions
All deep-dive steps live inside the yellow and white sticky notes on the canvas; no extra docs needed.

Requirements
- SMTP account (SendGrid, Gmail, etc.)
- Telegram Bot account (optional)

Customisation tips
- Change Cron time or frequency
- Swap affirmation list for quotes, verses, or mantras
- Add Notion logger branch for journaling
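For reference, a minimal sketch of what the affirmation-picker Code node might contain. The field names and sample affirmations are assumptions; adapt them to your Set node's output.

```typescript
// Minimal sketch of the "pick one affirmation" step (field names are assumptions).
// In n8n this would live in a Code node; here it is shown as a standalone function.

interface Config {
  email: string;
  telegramChatId?: string;
  affirmations: string[];
}

function pickAffirmation(config: Config): { affirmation: string; sentAt: string } {
  // Pick a random index into the curated list.
  const index = Math.floor(Math.random() * config.affirmations.length);
  return {
    affirmation: config.affirmations[index],
    sentAt: new Date().toISOString(),
  };
}

// Example usage with placeholder data:
const demo = pickAffirmation({
  email: "you@example.com",
  affirmations: [
    "Today I move with intention.",
    "I make space for beauty in small moments.",
    "I am allowed to start slowly.",
  ],
});
console.log(demo.affirmation);
```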
by Oneclick AI Squad
This automated n8n workflow checks daily travel itineraries, syncs upcoming trips to Google Calendar, and sends reminder notifications to travelers via email or SMS. Perfect for travel agencies, tour operators, and organizations managing group trips to keep travelers informed about their schedules and bookings.

What This Workflow Does
- Automatically checks travel itineraries every day
- Identifies today's trips and upcoming departures
- Syncs trip information to Google Calendar
- Sends personalized reminders to assigned travelers
- Tracks reminder delivery status and logs activities
- Handles both email and SMS notification preferences
- Provides pre-travel checklists and booking confirmations
- Manages multi-day trip schedules and activities

Main Components
- **Daily Travel Check** - Triggers daily to check travel itineraries
- **Read Travel Itinerary** - Retrieves today's trips and bookings from database/Excel
- **Filter Today's Trips** - Identifies trips departing today and upcoming activities
- **Has Trips Today?** - Checks if there are any trips scheduled
- **Read Traveler Contacts** - Gets traveler contact information for assigned trips
- **Sync to Google Calendar** - Creates/updates trip events in Google Calendar
- **Create Traveler Reminders** - Generates personalized reminder messages with travel details (illustrated in the sketch at the end of this description)
- **Split Into Batches** - Processes reminders in manageable batches
- **Email or SMS?** - Routes based on traveler communication preferences
- **Prepare Email Reminders** - Creates detailed email reminder content with checklists
- **Prepare SMS Reminders** - Creates SMS reminder content optimized for text
- **Read Reminder Log** - Checks previous reminder history
- **Update Reminder Log** - Records sent reminders with timestamps
- **Save Reminder Log** - Saves updated log data for audit trail

Essential Prerequisites
- Travel itinerary database/Excel file with trip assignments
- Traveler contact database with email and phone numbers
- Google Calendar API access and credentials
- SMTP server for email notifications
- SMS service provider (Twilio, Nexmo, etc.) for text reminders
- Reminder log file for tracking sent notifications
- Booking confirmation system (flight, hotel, transport)

Required Data Files

trip_itinerary.xlsx:
Trip ID | Trip Name | Date | Departure Time | Duration | Departure Location | Destination | Hotel | Flight Number | Assigned Travelers | Status | Booking Reference | Cost

traveler_contacts.xlsx:
Traveler ID | First Name | Last Name | Email | Phone | Preferred Contact | Assigned Trips | Passport Number | Emergency Contact

reminder_log.xlsx:
Log ID | Date | Traveler ID | Trip ID | Contact Method | Status | Sent Time | Message Preview | Confirmation

Key Features
- ⏰ Daily Automation: Runs automatically every day at scheduled times
- 📅 Calendar Sync: Syncs trips to Google Calendar for easy viewing
- 📧 Smart Reminders: Sends email or SMS based on traveler preference
- 👥 Batch Processing: Handles multiple travelers efficiently
- 📊 Activity Logging: Tracks all reminder activities and delivery status
- 🔄 Duplicate Prevention: Avoids sending multiple reminders
- 📱 Multi-Channel: Supports both email and SMS notifications
- ✈️ Travel-Specific: Includes flight numbers, locations, accommodation details
- 📋 Pre-Travel Checklist: Provides comprehensive packing and document reminders
- 🌍 Multi-Destination: Manages complex multi-stop itineraries

Quick Setup
1. Import workflow JSON into n8n
2. Configure daily trigger schedule (recommended: 6 AM and 6 PM)
3. Set up trip itinerary and traveler contact files
4. Connect Google Calendar API credentials
5. Configure SMTP server for emails
6. Set up SMS service provider (Twilio, Nexmo, or similar)
7. Map Excel sheet columns to workflow variables
8. Test with sample trip data
9. Activate workflow

Parameters to Configure
- schedule_file_path: Path to trip itinerary file
- contacts_file_path: Path to traveler contacts file
- reminder_hours: Hours before departure to send reminder (default: 24)
- google_calendar_id: Google Calendar ID for syncing trips
- google_api_credentials: Google Calendar API credentials
- smtp_host: Email server settings
- smtp_user: Email username
- smtp_password: Email password
- sms_api_key: SMS service API key
- sms_phone_number: SMS sender phone number
- reminder_log_path: Path to reminder log file

Sample Reminder Messages

Email Subject: "✈️ Travel Reminder: [Trip Name] Today at [Time]"

Email Body:
Hello [Traveler Name],
Your trip is happening today! Here are your travel details:
Trip: [Trip Name]
Departure: [Departure Time]
From: [Departure Location]
To: [Destination]
Flight/Transport: [Flight Number]
Hotel: [Hotel Name]
Duration: [X] days

Pre-Travel Checklist:
☑ Passport and travel documents
☑ Travel insurance documents
☑ Hotel confirmations
☑ Medications and toiletries
☑ Weather-appropriate clothing
☑ Phone charger and adapters

⚠️ Please arrive at the departure point 2 hours early!
Have a wonderful trip!

SMS: "✈️ Travel Reminder: '[Trip Name]' departs at [Time] today from [Location]. Arrive 2 hours early! Flight: [Number]"

Tomorrow Evening Preview (SMS): "📅 Tomorrow: '[Trip Name]' departs at [Time] from [Location]. Pack tonight! ([X] days)"

Use Cases
- Daily trip departure reminders for travelers
- Last-minute itinerary change notifications
- Flight cancellation and delay alerts
- Hotel check-in and checkout reminders
- Travel document expiration warnings
- Group tour activity scheduling
- Adventure/hiking trip departure alerts
- Business travel itinerary updates
- Family vacation coordination
- Study abroad program notifications
- Multi-city tour route confirmations
- Transport connection reminders

Advanced Features

Reminder Escalation
- 24-hour reminder: Full details with checklist
- 6-hour reminder: Quick confirmation with transport details
- 2-hour reminder: Urgent departure notification

Conditional Logic
- Different messages for single-day vs. multi-day trips
- Domestic vs. international travel variations
- Group size-based messaging
- Weather-based travel advisories

Integration Capabilities
- Connect to airline APIs for real-time flight status
- Link to hotel management systems for check-in info
- Integrate weather services for destination forecasts
- Sync with payment systems for booking confirmations

Troubleshooting
| Issue | Solution |
|-------|----------|
| Reminders not sending | Check email/SMS credentials and service quotas |
| Calendar sync failing | Verify Google Calendar API permissions |
| Duplicate reminders | Check for overlapping reminder time windows |
| Missing traveler data | Verify contact file formatting and column mapping |
| Batch processing slow | Reduce batch size in Split Into Batches node |

Security Considerations
- Store API credentials in n8n environment variables
- Use OAuth2 for Google Calendar authentication
- Encrypt sensitive data in reminder logs
- Implement role-based access to trip data
- Audit log all reminder activities
- Comply with GDPR/privacy regulations for traveler data

Performance Metrics
- **Processing Time**: ~2-5 seconds per 50 travelers
- **Success Rate**: >99% for delivery logging
- **Calendar Sync**: Real-time updates
- **Batch Limit**: 10 travelers per batch (configurable)

Support & Maintenance
- Review reminder logs weekly for delivery issues
- Update traveler contacts as needed
- Monitor email/SMS service quotas
- Test workflow after system updates
- Archive old reminder logs monthly
by Meak
Auto-Edit Google Drive Images with Nano Banana + Social Auto-Post

Most businesses spend hours cleaning up photos and manually posting them to social media. This workflow does it all automatically: image enhancement, caption creation, and posting, directly from a simple Google Drive upload.

Benefits
- Clean & enhance images instantly with Nano Banana
- Auto-generate catchy captions with GPT-5
- Post directly to Instagram (or other social channels)
- Track everything in Google Sheets
- Save hours per week on repetitive content tasks

How It Works
1. Upload image to Google Drive
2. Workflow sends image to Nano Banana (via Wavespeed API)
3. Waits for enhanced version and logs URL in Google Sheets
4. Uploads result to Postiz media library
5. GPT-5 writes an engaging caption
6. Publishes post instantly or schedules for later

Who Is This For
- Real estate agents posting property photos
- E-commerce sellers updating product images
- Social media managers handling multiple accounts

Setup
1. Connect Google Drive (select upload folder)
2. Add Wavespeed API key for Nano Banana
3. Connect Google Sheets for logging
4. Add Postiz API credentials & integration ID
5. Enter OpenAI API key for GPT-5 captioning

ROI & Monetization
- Save 5–10 hours per week of manual editing and posting
- Offer as a $1k–$3k/month content automation service for clients
- Scale to multi-platform posting (TikTok, LinkedIn) for premium retainers

Strategy Insights
In the full walkthrough, I show how to:
- Build this workflow step by step
- Pitch it as a "Done-For-You Social Posting System"
- Automate outreach to agencies and creators who need it
- Turn this into recurring revenue with retainers

Check Out My Channel
For more advanced AI automation systems that generate real business results, check out my YouTube channel, where I share the exact strategies I use to build automation agencies, sell high-value services, and scale to $20k+ monthly revenue.
by IranServer.com
Automate IP geolocation and HTTP port scanning with Google Sheets trigger

This n8n template automatically enriches IP addresses with geolocation data and performs HTTP port scanning when new IPs are added to a Google Sheets document. Perfect for network monitoring, security research, or maintaining an IP intelligence database.

Who's it for
Network administrators, security researchers, and IT professionals who need to:
- Track IP geolocation information automatically
- Monitor HTTP service availability across multiple ports
- Maintain centralized IP intelligence in spreadsheets
- Automate repetitive network reconnaissance tasks

How it works
The workflow triggers whenever a new row containing an IP address is added to your Google Sheet. It then:
1. Fetches geolocation data using the ip-api.com service to get country, city, coordinates, ISP, and organization information
2. Updates the spreadsheet with the geolocation details
3. Scans common HTTP ports (80, 443, 8080, 8000, 3000) to check service availability
4. Records port status back to the same spreadsheet row, showing which services are accessible

The workflow handles both successful connections and various error conditions, providing a comprehensive view of each IP's network profile. A rough sketch of the enrichment and scanning logic follows this description.

Requirements
- **Google Sheets API access** - for reading triggers and updating data
- **Google Sheets document** with at least an "IP" column header

How to set up
1. Create a Google Sheet with columns: IP, Country, City, Lat, Lon, ISP, Org, Port_80, Port_443, Port_8000, Port_8080, Port_3000
2. Configure Google Sheets credentials in both the trigger and update nodes
3. Update the document ID in the Google Sheets Trigger and both Update nodes to point to your spreadsheet
4. Test the workflow by adding an IP address to your sheet and verifying the automation runs

How to customize the workflow
- **Modify port list**: Edit the "Edit Fields" node to scan different ports by changing the ports array
- **Add more geolocation fields**: The ip-api.com response includes additional fields like timezone, zip code, and AS number
- **Change trigger frequency**: Adjust the polling interval in the Google Sheets Trigger for faster or slower monitoring
- **Add notifications**: Insert Slack, email, or webhook nodes to alert when specific conditions are detected
- **Filter results**: Add IF nodes to process only certain IP ranges or geolocation criteria
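For reference, a minimal standalone sketch of the enrichment and port-check logic, assuming the ip-api.com JSON endpoint and a simple HTTP reachability test per port. The exact node configuration in the template may differ.

```typescript
// Enrich one IP with geolocation data and probe common HTTP ports.
// ip-api.com's free JSON endpoint is used here; the template's HTTP nodes may differ.

const PORTS = [80, 443, 8080, 8000, 3000];

async function enrichIp(ip: string) {
  // 1. Geolocation lookup.
  const geoRes = await fetch(`http://ip-api.com/json/${ip}`);
  const geo = await geoRes.json() as {
    country?: string; city?: string; lat?: number; lon?: number; isp?: string; org?: string;
  };

  // 2. Port reachability: try an HTTP(S) request with a short timeout per port.
  const portStatus: Record<string, string> = {};
  for (const port of PORTS) {
    const scheme = port === 443 ? "https" : "http";
    const controller = new AbortController();
    const timer = setTimeout(() => controller.abort(), 3000);
    try {
      const res = await fetch(`${scheme}://${ip}:${port}/`, { signal: controller.signal });
      portStatus[`Port_${port}`] = `open (${res.status})`;
    } catch {
      portStatus[`Port_${port}`] = "closed/unreachable";
    } finally {
      clearTimeout(timer);
    }
  }

  // 3. Row-shaped result, ready to write back to the sheet.
  return {
    IP: ip, Country: geo.country, City: geo.city, Lat: geo.lat, Lon: geo.lon,
    ISP: geo.isp, Org: geo.org, ...portStatus,
  };
}
```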
by Rahul Joshi
Description
Never miss a lead again with this SLA Breach Alert automation powered by n8n! This workflow continuously monitors your Google Sheets for un-replied leads and automatically triggers instant Telegram alerts, ensuring your team takes immediate action. By running frequent SLA checks, enriching alerts with direct Google Sheet links, and sending real-time notifications, this automation helps prevent unattended leads, reduce response delays, and boost customer engagement.

What This Template Does
- 📅 Runs every 5 minutes to monitor SLA breaches
- 📋 Fetches lead data (status, contact, timestamps) from Google Sheets
- 🕒 Identifies leads marked "Un-replied" beyond the 15-minute SLA
- 🔗 Enriches alerts with direct Google Sheet row links for quick action
- 📲 Sends Telegram alerts with lead details for immediate response

Step-by-Step Setup
1. Prepare Your Google Sheet
   Create a sheet with the following columns (minimum required):
   - Lead Name
   - Email
   - Phone
   - Status (values: Replied, Un-replied)
   - Timestamp (time of last update/reply)
2. Set Up Google Sheets in n8n
   - Connect your Google account in n8n.
   - Point the workflow to your sheet (remove any hardcoded document IDs before sharing).
3. Configure SLA Check
   Use the IF node to filter leads where:
   - Status = Un-replied
   - Time since timestamp > 15 minutes
4. Enrich Alerts with Links
   Add a Code node to generate direct row links to the sheet (a sketch follows this description).
5. Set Up Telegram Bot
   - Create a Telegram bot via @BotFather.
   - Add the bot to your team chat.
   - Store the botToken securely (remove chatId before sharing templates).
6. Send Alerts
   Configure the Telegram node in n8n to send lead details + direct Google Sheet link.

Customization Guidance
- Adjust the SLA window (e.g., 30 minutes or 1 hour) by modifying the IF node condition.
- Add more fields from Google Sheets (e.g., Company, Owner) to enrich the alert.
- Replace Telegram with Slack or Email if your team prefers a different channel.
- Extend the workflow to auto-assign leads in your CRM once alerted.

Perfect For
- Sales teams that need to respond to leads within strict SLAs
- Support teams ensuring no customer request is ignored
- Businesses aiming to keep lead response times sharp and consistent
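A minimal sketch of the SLA check and row-link enrichment, assuming the sheet exposes a Timestamp column and that data rows start at row 2 (header in row 1). The spreadsheet ID, gid, and field names are placeholders.

```typescript
// Hypothetical lead row as read from Google Sheets; adjust names to your columns.
interface LeadRow {
  leadName: string;
  email: string;
  status: "Replied" | "Un-replied";
  timestamp: string;   // ISO datetime of the last update/reply
  rowNumber: number;   // 1-based row index in the sheet
}

const SLA_MINUTES = 15;
const SPREADSHEET_ID = "YOUR_SPREADSHEET_ID"; // placeholder
const SHEET_GID = "0";                        // placeholder

// Keep only leads that are still un-replied past the SLA window,
// and attach a direct link that jumps to the breached row.
function findSlaBreaches(rows: LeadRow[], now: Date = new Date()) {
  return rows
    .filter((row) => {
      const ageMinutes = (now.getTime() - new Date(row.timestamp).getTime()) / 60000;
      return row.status === "Un-replied" && ageMinutes > SLA_MINUTES;
    })
    .map((row) => ({
      ...row,
      sheetLink:
        `https://docs.google.com/spreadsheets/d/${SPREADSHEET_ID}/edit` +
        `#gid=${SHEET_GID}&range=A${row.rowNumber}`,
    }));
}
```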
by Abdullah Alshiekh
🧩 What Problem Does It Solve?
In real estate, inquiries come from many sources and often require immediate, personalized attention. Brokers waste significant time manually:
- **Qualifying leads:** Determining if a prospect's budget, neighborhood, and needs match available inventory.
- **Searching listings:** Cross-referencing customer criteria against a large, static database.
- **Data entry:** Moving contact details and search summaries into a CRM like Zoho.
- **Initial follow-up:** Sending an email to confirm the submission and schedule the next step.

🛠️ How to Configure It

Jotform & CRM Setup
- **Jotform Trigger:** Replace the placeholder with your specific Jotform ID.
- **Zoho CRM:** Replace the placeholder TEMPLATED_COMPANY_NAME with your actual company name.
- **Gmail:** Replace the placeholder Calendly link YOUR_CALENDLY_LINK in the Send a message node with your real estate consultant's booking link.

Database & AI Setup
- **Google Sheets:** Replace YOUR_GOOGLE_SHEET_DOCUMENT_ID and YOUR_SHEET_GID_OR_NAME in both Google Sheets nodes. Your listings must be structured with columns matching the AI prompt (e.g., bedrooms, rent, neighborhoods).
- **AI Models:** Ensure your Google Gemini API key is linked to the Google Gemini Chat Model node.
- **AI Agent Prompt:** The included prompt contains the exact matching and scoring rules for the AI. You can edit this prompt to refine how the AI prioritizes factors like supplier_rating or neighborhood proximity. A simplified illustration of this kind of criteria matching appears after this description.

🧠 Use Case Examples
- **Small Startups:** Collect High-Quality Leads: New inquiries must be quickly logged for sales follow-up, but manual entry is slow.
- **B2B Sales:** High-Value Lead Enrichment: Need to prioritize leads that match specific product requirements and budget tiers.
- **Travel/Hospitality:** Personalized Itinerary Matching: Quickly match customer preferences (e.g., dates, group size, activity level) to available packages.
- **E-commerce:** Manual Product Recommendation: Sales teams manually recommend expensive, configurable items (e.g., furniture, specialized equipment).

If you need any help
Get in Touch
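To make the matching idea concrete, here is a deliberately simplified TypeScript sketch of criteria-vs-listing scoring. It is not the template's actual AI prompt logic (which lives in the AI Agent node); the fields and weights are illustrative assumptions only.

```typescript
// Illustrative only: a plain scoring function approximating what the AI agent
// is prompted to do. Field names and weights are assumptions.
interface Listing {
  neighborhood: string;
  bedrooms: number;
  rent: number;
}

interface Inquiry {
  preferredNeighborhoods: string[];
  bedrooms: number;
  maxBudget: number;
}

function scoreListing(inquiry: Inquiry, listing: Listing): number {
  let score = 0;
  if (inquiry.preferredNeighborhoods.includes(listing.neighborhood)) score += 40;
  if (listing.bedrooms >= inquiry.bedrooms) score += 30;
  if (listing.rent <= inquiry.maxBudget) score += 30;
  return score; // 0-100: higher means a closer match
}

// Rank listings for one inquiry, best matches first.
function rankListings(inquiry: Inquiry, listings: Listing[]): Listing[] {
  return [...listings].sort(
    (a, b) => scoreListing(inquiry, b) - scoreListing(inquiry, a)
  );
}
```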
by Onur
🔍 Extract Competitor SERP Rankings from Google Search to Sheets with Scrape.do

This template requires a self-hosted n8n instance to run.

A complete n8n automation that extracts competitor data from Google search results for specific keywords and target countries using Scrape.do SERP API, and saves structured results into Google Sheets for SEO, competitive analysis, and market research.

📋 Overview
This workflow provides a lightweight competitor analysis solution that identifies ranking websites for chosen keywords across different countries. Ideal for SEO specialists, content strategists, and digital marketers who need structured SERP insights without manual effort.

Who is this for?
- SEO professionals tracking keyword competitors
- Digital marketers conducting market analysis
- Content strategists planning based on SERP insights
- Business analysts researching competitor positioning
- Agencies automating SEO reporting

What problem does this workflow solve?
- Eliminates manual SERP scraping
- Processes multiple keywords across countries
- Extracts structured data (position, title, URL, description)
- Automates saving results into Google Sheets
- Ensures repeatable & consistent methodology

⚙️ What this workflow does
1. Manual Trigger → Starts the workflow manually
2. Get Keywords from Sheet → Reads keywords + target countries from a Google Sheet
3. URL Encode Keywords → Converts keywords into URL-safe format
4. Process Keywords in Batches → Handles multiple keywords sequentially to avoid rate limits
5. Fetch Google Search Results → Calls Scrape.do SERP API to retrieve raw HTML of Google SERPs
6. Extract Competitor Data from HTML → Parses HTML into structured competitor data (top 10 results); see the parsing sketch after this description
7. Append Results to Sheet → Writes structured SERP results into a Google Sheet

📊 Output Data Points
| Field | Description | Example |
|--------------------|------------------------------------------|-------------------------------------------|
| Keyword | Original search term | digital marketing services |
| Target Country | 2-letter ISO code of target region | US |
| position | Ranking position in search results | 1 |
| websiteTitle | Page title from SERP result | Digital Marketing Software & Tools |
| websiteUrl | Extracted website URL | https://www.hubspot.com/marketing |
| websiteDescription | Snippet/description from search results | Grow your business with HubSpot's tools… |

⚙️ Setup

Prerequisites
- n8n instance (self-hosted)
- Google account with Sheets access
- **Scrape.do** account with **SERP API token**

Google Sheet Structure
This workflow uses one Google Sheet with two tabs:

Input Tab: "Keywords"
| Column | Type | Description | Example |
|----------|------|-------------|---------|
| Keyword | Text | Search query | digital marketing |
| Target Country | Text | 2-letter ISO code | US |

Output Tab: "Results"
| Column | Type | Description | Example |
|--------------------|-------|-------------|---------|
| Keyword | Text | Original search term | digital marketing |
| position | Number| SERP ranking | 1 |
| websiteTitle | Text | Title of the page | Digital Marketing Software & Tools |
| websiteUrl | URL | Website/page URL | https://www.hubspot.com/marketing |
| websiteDescription | Text | Snippet text | Grow your business with HubSpot's tools |

🛠 Step-by-Step Setup
1. Import Workflow: Copy the JSON → n8n → Workflows → + Add → Import from JSON
2. Configure **Scrape.do API**:
   - Endpoint: https://api.scrape.do/
   - Parameter: token=YOUR_SCRAPEDO_TOKEN
   - Add render=true for full HTML rendering
3. Configure Google Sheets:
   - Create a sheet with two tabs: Keywords (input), Results (output)
   - Set up Google Sheets OAuth2 credentials in n8n
   - Replace placeholders: YOUR_GOOGLE_SHEET_ID and YOUR_GOOGLE_SHEETS_CREDENTIAL_ID
4. Run & Test:
   - Add test data in the Keywords tab
   - Execute workflow → Check results in the Results tab

🧰 How to Customize
- **Add more fields**: Extend HTML parsing logic in the "Extract Competitor Data" node to capture extra data (e.g., domain, sitelinks).
- **Filtering**: Exclude domains or results with custom rules.
- **Batch Size**: Adjust "Process Keywords in Batches" for speed vs. rate limits.
- **Rate Limiting**: Insert a **Wait** node (e.g., 10–30 seconds) if API rate limits apply.
- **Multi-Sheet Output**: Save per-country or per-keyword results into separate tabs.

📊 Use Cases
- **SEO Competitor Analysis**: Identify top-ranking sites for target keywords
- **Market Research**: See how SERPs differ by region
- **Content Strategy**: Analyze titles & descriptions of competitor pages
- **Agency Reporting**: Automate competitor SERP snapshots for clients

📈 Performance & Limits
- **Single Keyword**: ~10–20 seconds (depends on **Scrape.do** response)
- **Batch of 10**: 3–5 minutes typical
- **Large Sets (50+)**: 20–40 minutes depending on API credits & batching
- **API Calls**: 1 **Scrape.do** request per keyword
- **Reliability**: 95%+ extraction success, 98%+ data accuracy

🧩 Troubleshooting
- **API error** → Check YOUR_SCRAPEDO_TOKEN and API credits
- **No keywords loaded** → Verify Google Sheet ID & tab name = Keywords
- **Permission denied** → Re-authenticate Google Sheets OAuth2 in n8n
- **Empty results** → Check parsing logic and verify search term validity
- **Workflow stops early** → Ensure batching loop (SplitInBatches) is properly connected

🤝 Support & Community
- n8n Forum: https://community.n8n.io
- n8n Docs: https://docs.n8n.io
- Scrape.do Dashboard: https://dashboard.scrape.do

🎯 Final Notes
This workflow provides a repeatable foundation for extracting competitor SERP rankings with Scrape.do and saving them to Google Sheets. You can extend it with filtering, richer parsing, or integration with reporting dashboards to create a fully automated SEO intelligence pipeline.
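As a rough idea of what the "Extract Competitor Data from HTML" Code node does, here is a hedged sketch using simple regex extraction over the raw SERP HTML. Google's markup changes frequently, so the patterns below are illustrative assumptions, not the template's exact parsing rules.

```typescript
// Illustrative SERP parsing: pull (position, title, url, description) from raw HTML.
// Google's HTML structure changes often; treat these patterns as placeholders.
interface SerpResult {
  position: number;
  websiteTitle: string;
  websiteUrl: string;
  websiteDescription: string;
}

function extractSerpResults(html: string, maxResults = 10): SerpResult[] {
  const results: SerpResult[] = [];
  // Very loose pattern: an anchor followed by an <h3> title (common in Google SERPs).
  const blockRegex = /<a href="(https?:\/\/[^"]+)"[^>]*>.*?<h3[^>]*>(.*?)<\/h3>/gs;

  let match: RegExpExecArray | null;
  while ((match = blockRegex.exec(html)) !== null && results.length < maxResults) {
    const url = match[1];
    const title = match[2].replace(/<[^>]+>/g, "").trim();  // strip inner tags
    if (url.includes("google.")) continue;                   // skip Google's own links
    results.push({
      position: results.length + 1,
      websiteTitle: title,
      websiteUrl: url,
      websiteDescription: "", // description extraction needs its own pattern
    });
  }
  return results;
}
```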
by Jitesh Dugar
Revolutionize university admissions with intelligent AI-driven application evaluation that analyzes student profiles, calculates eligibility scores, and automatically routes decisions, saving 2.5 hours per application and reducing decision time from weeks to hours.

🎯 What This Workflow Does
Transforms your admissions process from manual application review to intelligent automation:
- 📝 Captures Applications - Jotform intake with student info, GPA, test scores, essay, extracurriculars
- 🤖 AI Holistic Evaluation - OpenAI analyzes academic strength, essay quality, extracurriculars, and fit
- 🎯 Intelligent Scoring - Evaluates students using 40% academics, 25% extracurriculars, 20% essay, 15% fit (0-100 scale); a weighted-scoring sketch appears after the Use Cases section below
- 🚦 Smart Routing - Automatically routes based on AI evaluation:
  - **Auto-Accept (95-100)**: Acceptance letter with scholarship details → Admin alert → Database
  - **Interview Required (70-94)**: Interview invitation with scheduling link → Admin alert → Database
  - **Reject (<70)**: Respectful rejection with improvement suggestions → Database
- 💰 Scholarship Automation - Calculates merit scholarships ($5k-$20k+) based on eligibility score
- 📊 Analytics Tracking - All applications logged to Google Sheets for admissions insights

✨ Key Features
- AI Holistic Evaluation: Comprehensive analysis weighing academics, extracurriculars, essays, and institutional fit
- Intelligent Scoring System: 0-100 eligibility score with automated categorization and scholarship determination
- Structured Output: Consistent JSON schema with academic strength, admission likelihood, and decision reasoning
- Automated Communication: Personalized acceptance, interview, and rejection letters for every applicant
- Fallback Scoring: Manual GPA/SAT scoring if AI fails - ensures zero downtime
- Admin Alerts: Instant email notifications for exceptional high-scoring applicants (95+)
- Comprehensive Analytics: Track acceptance rates, average scores, scholarship distribution, and applicant demographics
- Customizable Criteria: Easy prompt editing to match your institution's values and requirements

💼 Perfect For
- Universities & Colleges: Processing 500+ undergraduate applications per semester
- Graduate Programs: Screening master's and PhD applications with consistent evaluation
- Private Institutions: Scaling admissions without expanding admissions staff
- Community Colleges: Handling high-volume transfer and new student applications
- International Offices: Evaluating global applicants 24/7 across all timezones
- Scholarship Committees: Identifying merit scholarship candidates automatically

🔧 What You'll Need

Required Integrations
- Jotform - Application form with student data collection (free tier works)
  - Create your form for free on Jotform using this link
  - Create your application form with fields: Name, Email, Phone, GPA, SAT Score, Major, Essay, Extracurriculars
- OpenAI API - GPT-4o-mini for cost-effective AI evaluation (~$0.01-0.05 per application)
- Gmail - Automated applicant communication (acceptance, interview, rejection letters)
- Google Sheets - Application database and admissions analytics

Optional Integrations
- Slack - Real-time alerts for exceptional applicants
- Calendar APIs - Automated interview scheduling
- Student Information System (SIS) - Push accepted students to enrollment system
- Document Analysis Tools - OCR for transcript verification

🚀 Quick Start
1. Import Template - Copy JSON and import into n8n (requires LangChain support)
2. Create Jotform - Use provided field structure (Name, Email, GPA, SAT, Major, Essay, etc.)
3. Add API Keys - OpenAI, Jotform, Gmail OAuth2, Google Sheets
4. Customize AI Prompt - Edit admissions criteria with your university's specific requirements and values
5. Set Score Thresholds - Adjust auto-accept (95+), interview (70-94), reject (<70) cutoffs if needed
6. Personalize Emails - Update templates with your university branding, dates, and contact info
7. Create Google Sheet - Set up columns: id, Name, Email, GPA, SAT Score, Major, Essay, Extracurriculars
8. Test & Deploy - Submit a test application with pinned data and verify all nodes execute correctly

🎨 Customization Options
- Adjust Evaluation Weights: Change academics (40%), extracurriculars (25%), essay (20%), fit (15%) percentages
- Multiple Programs: Clone workflow for different majors with unique evaluation criteria
- Add Document Analysis: Integrate OCR for transcript and recommendation letter verification
- Interview Scheduling: Connect Google Calendar or Calendly for automated booking
- SIS Integration: Push accepted students directly to Banner, Ellucian, or PeopleSoft
- Waitlist Management: Add conditional routing for borderline scores (65-69)
- Diversity Tracking: Include demographic fields and bias detection in AI evaluation
- Financial Aid Integration: Automatically calculate need-based aid eligibility alongside merit scholarships

📈 Expected Results
- 90% reduction in manual application review time (from 2.5 hours to 15 minutes per application)
- 24-48 hour decision turnaround time vs 4-6 weeks traditional process
- 40% higher yield rate - faster responses increase enrollment commitment
- 100% consistency - every applicant evaluated with identical criteria
- Zero missed applications - automated tracking ensures no application falls through the cracks
- Data-driven admissions - comprehensive analytics on applicant pools and acceptance patterns
- Better applicant experience - professional, timely communication regardless of decision
- Defensible decisions - documented scoring criteria for accreditation and compliance

🏆 Use Cases

Large Public Universities
Screen 5,000+ applications per semester, identify the top 20% for auto-admit, route borderline cases to committee review.

Selective Private Colleges
Evaluate 500+ highly competitive applications, calculate merit scholarships automatically, schedule interviews with top candidates.

Graduate Programs
Process master's and PhD applications with research experience weighting, flag candidates for faculty review, automate fellowship awards.

Community Colleges
Handle high-volume open enrollment while identifying honors program candidates and scholarship recipients instantly.

International Admissions
Evaluate global applicants 24/7, account for different GPA scales and testing systems, respond same-day regardless of timezone.

Rolling Admissions
Provide instant decisions for early applicants, fill classes strategically, optimize scholarship budget allocation.
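To illustrate the weighting and the fallback path, here is a small sketch of a weighted eligibility score plus a manual GPA/SAT fallback. The normalization choices (GPA out of 4.0, SAT out of 1600) and the exact fallback formula are assumptions; the template's real evaluation is produced by the AI prompt and routed by its IF-node thresholds.

```typescript
// Component scores are assumed to be pre-rated on a 0-100 scale.
interface ComponentScores {
  academics: number;
  extracurriculars: number;
  essay: number;
  fit: number;
}

// Weighted eligibility score: 40% academics, 25% extracurriculars, 20% essay, 15% fit.
function eligibilityScore(s: ComponentScores): number {
  return 0.4 * s.academics + 0.25 * s.extracurriculars + 0.2 * s.essay + 0.15 * s.fit;
}

// Fallback scoring if the AI call fails: a simple GPA/SAT blend (illustrative only).
function fallbackScore(gpa: number, sat: number): number {
  const gpaPart = (gpa / 4.0) * 100;    // assumes a 4.0 GPA scale
  const satPart = (sat / 1600) * 100;   // assumes a 1600-point SAT
  return 0.6 * gpaPart + 0.4 * satPart;
}

// Three-tier routing used downstream of the score.
function route(score: number): "auto-accept" | "interview" | "reject" {
  if (score >= 95) return "auto-accept";
  if (score >= 70) return "interview";
  return "reject";
}
```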
💡 Pro Tips
- Calibrate Your AI: After 100+ applications, refine evaluation criteria based on enrolled student success
- A/B Test Thresholds: Experiment with score cutoffs (e.g., 93 vs 95 for auto-admit) to optimize yield
- Build Waitlist Pipeline: Keep 70-84 score candidates engaged for spring enrollment or next year
- Track Source Effectiveness: Add UTM parameters to measure which recruiting channels deliver best students
- Committee Review: Route 85-94 scores to human admissions committee for final review
- Bias Audits: Quarterly review of AI decisions by demographic groups to ensure fairness
- Parent Communication: Add parent/guardian emails for admitted students under 18
- Financial Aid Coordination: Sync scholarship awards with financial aid office for packaging

🎓 Learning Resources
This workflow demonstrates:
- **AI Agents with structured output** - LangChain integration for consistent JSON responses
- **Multi-stage conditional routing** - IF nodes for three-tier decision logic
- **Holistic evaluation** - Weighted scoring across multiple dimensions
- **Automated communication** - HTML email templates with dynamic content
- **Real-time notifications** - Admin alerts for high-value applicants
- **Analytics and data logging** - Google Sheets integration for reporting
- **Fallback mechanisms** - Manual scoring when AI unavailable

Perfect for learning advanced n8n automation patterns in educational technology!

🔐 Compliance & Ethics
- FERPA Compliance: Protects student data with secure credential handling
- Fair Admissions: Documented criteria eliminate unconscious bias
- Human Oversight: Committee review option for borderline cases
- Transparency: Applicants can request evaluation criteria
- Appeals Process: Structured workflow for decision reconsideration
- Data Retention: Configurable Google Sheets retention policies

📊 What Gets Tracked
- Application submission date and time
- Complete student profile (GPA, test scores, major, essay, activities)
- AI eligibility score (0-100) and decision category
- Academic strength rating (excellent/strong/average)
- Scholarship eligibility and amount ($0-$20,000+)
- Admission likelihood (high/medium/low)
- Decision outcome (accepted/interview/rejected)
- Email delivery status and open rates
- Time from application to decision

Ready to transform your admissions process? Import this template and start evaluating applications intelligently in under 1 hour.

Questions or customization needs? The workflow includes detailed sticky notes explaining each section and comprehensive fallback logic for reliability.
by n8n Automation Expert | Template Creator | 2+ Years Experience
🌤️ Automated Indonesian Weather Monitoring with Smart Notifications

Stay ahead of weather changes with this comprehensive monitoring system that fetches real-time data from Indonesia's official meteorological agency (BMKG) and delivers beautiful, actionable weather reports directly to your Telegram.

⚡ What This Workflow Does
This intelligent weather monitoring system automatically:
- **Fetches Official Data**: Connects to BMKG's public weather API for accurate Indonesian forecasts
- **Smart Processing**: Analyzes temperature, humidity, precipitation, and wind conditions
- **Risk Assessment**: Generates contextual warnings for extreme weather conditions (see the sketch after this description)
- **Automated Alerts**: Sends formatted weather reports to Telegram every 6 hours
- **Error Handling**: Includes robust error detection and notification system

🎯 Perfect For
- **Local Communities**: Keep neighborhoods informed about weather changes
- **Business Operations**: Plan outdoor activities and logistics based on weather
- **Emergency Preparedness**: Receive early warnings for extreme weather conditions
- **Personal Planning**: Never get caught unprepared by sudden weather changes
- **Agricultural Monitoring**: Track conditions affecting farming and outdoor work

🛠️ Key Features
- 🔄 **Automated Scheduling**: Runs every 6 hours with manual trigger option
- 📊 **Comprehensive Reports**: Current conditions + 6-hour detailed forecasts
- ⚠️ **Smart Warnings**: Contextual alerts for temperature extremes and rain probability
- 🎨 **Beautiful Formatting**: Rich Telegram messages with emojis and structured data
- 🔧 **Error Recovery**: Automatic error handling with notification system
- 📍 **Location-Aware**: Supports any Indonesian location via BMKG regional codes

📋 What You'll Get
Each weather report includes:
- Current temperature, humidity, and weather conditions
- 6-hour detailed forecast with timestamps
- Wind speed and direction information
- Rain probability and visibility data
- Personalized warnings and recommendations
- Average daily statistics and trends

🚀 Setup Requirements
- **Telegram Bot Token**: Create a bot via @BotFather
- **Chat ID**: Your personal or group chat identifier
- **BMKG Location Code**: Regional administrative code for your area

💡 Pro Tips
- Customize the location by changing the adm4 parameter in the HTTP request
- Adjust scheduling interval based on your monitoring needs
- Modify warning thresholds in the processing code
- Add multiple chat IDs for broader distribution
- Integrate with other n8n workflows for advanced automation

🌟 Why Choose This Template
- **Production Ready**: Includes comprehensive error handling and logging
- **Highly Customizable**: Easy to modify for different locations and preferences
- **Official Data Source**: Uses Indonesia's trusted meteorological service
- **User-Friendly Output**: Clean, readable reports perfect for daily use
- **Scalable Design**: Easily extend for multiple locations or notification channels

Transform your weather awareness with this professional-grade monitoring system that brings Indonesia's official weather data right to your fingertips!

Keywords: weather monitoring, BMKG API, Telegram notifications, Indonesian weather, automated alerts, meteorological data, weather forecasting, n8n automation, weather API integration
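A small sketch of the kind of threshold logic the processing code applies when generating warnings. The thresholds and field names here are illustrative assumptions, not BMKG's official categories; adjust them in the workflow's Code node.

```typescript
// Hypothetical forecast entry shape; BMKG's API fields may be named differently.
interface ForecastEntry {
  localDatetime: string;
  temperatureC: number;
  humidityPct: number;
  rainProbabilityPct: number;
  windSpeedKmh: number;
}

// Build contextual warnings from simple thresholds (values are assumptions).
function buildWarnings(entry: ForecastEntry): string[] {
  const warnings: string[] = [];
  if (entry.temperatureC >= 35) warnings.push("🥵 Extreme heat expected; stay hydrated.");
  if (entry.temperatureC <= 18) warnings.push("🧥 Unusually cool; dress warmly.");
  if (entry.rainProbabilityPct >= 70) warnings.push("🌧️ High chance of rain; carry an umbrella.");
  if (entry.windSpeedKmh >= 40) warnings.push("💨 Strong winds; secure loose items outdoors.");
  if (entry.humidityPct >= 90) warnings.push("💦 Very humid conditions.");
  return warnings;
}
```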
by Automate With Marc
Auto-Edit Google Drive Images with Nano Banana + Social Auto-Post

Drop an image into Google Drive and let this workflow handle the rest: it auto-cleans and enhances the image with Google's Nano Banana (via Wavespeed API), generates a catchy caption with GPT-5, and publishes directly to your connected social accounts using Postiz.

👉 Watch step-by-step video tutorials of workflows like these at https://www.youtube.com/watch?v=4wk6PYgBtBM&list=PL05w1TE8X3bb1H9lXBqUy98zmTrrPP-s1

What it does
1. Triggers from Google Drive when a new image is uploaded
2. Sends the image to Nano Banana to declutter, brighten, and make it real-estate/photo-listing ready
3. Polls for the edited result until it's complete (see the polling sketch after this description)
4. Logs the edited image URL into Google Sheets for tracking
5. Downloads and uploads the edited image into the Postiz media library
6. Generates an engaging caption with the GPT-5 Caption Agent
7. Publishes instantly to Instagram (can be extended to TikTok, LinkedIn, etc.)

Perfect for
- Real-estate agents posting property shots
- Ecommerce sellers updating product catalogs
- Social media marketers needing fast, polished posts

Apps & Services / Tools Used
- Google Drive (Trigger)
- Wavespeed API – Google Nano Banana (Image editing)
- Google Sheets (Logging)
- Postiz (Social scheduling/posting)
- OpenAI GPT-5 (Caption agent)

Setup
1. Connect your Google Drive and select the upload folder.
2. Add your Wavespeed API key for Nano Banana.
3. Connect Google Sheets for logging.
4. Add Postiz API credentials and set the integration ID for your channel(s).
5. Enter your OpenAI API key for GPT-5 captioning.

Customization
- Adjust the edit prompt for different use cases (e.g., product cleanup, lighting tweaks).
- Change the Postiz post type to scheduled instead of "now."
- Add more Postiz posts for multi-platform publishing.
- Insert an approval loop (Slack/Email) before posting.

Logs
- Edited Image Log (Sheets): stores final image URL + timestamp.
- Publishing Log (Sheets): tracks workflow status per asset.

Notes
- Sticky notes in the template explain each major block.
- Replace sample IDs with your own (folder IDs, sheet IDs, Postiz integration).
- Keep all API keys in n8n Credentials, not in node parameters.
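A minimal sketch of the "poll until complete" step. The endpoint path, response fields (status, outputUrl), and interval are assumptions for illustration; consult the Wavespeed API docs and the template's HTTP Request/Wait nodes for the real values.

```typescript
// Illustrative polling loop: repeatedly check a job status URL until the
// enhanced image is ready or we give up. Endpoint and field names are assumptions.
async function pollForEditedImage(
  statusUrl: string,          // e.g. a Wavespeed job/status URL returned on submission
  apiKey: string,
  intervalMs = 5000,
  maxAttempts = 24            // ~2 minutes at 5 s intervals
): Promise<string> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const res = await fetch(statusUrl, {
      headers: { Authorization: `Bearer ${apiKey}` },
    });
    const body = await res.json() as { status?: string; outputUrl?: string };

    if (body.status === "completed" && body.outputUrl) return body.outputUrl;
    if (body.status === "failed") throw new Error("Image enhancement failed");

    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("Timed out waiting for the edited image");
}
```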
by Onur
🏠 Extract Zillow Property Data to Google Sheets with Scrape.do

This template requires a self-hosted n8n instance to run.

A complete n8n automation that extracts property listing data from Zillow URLs using the Scrape.do web scraping API, parses key property information, and saves structured results into Google Sheets for real estate analysis, market research, and property tracking.

📋 Overview
This workflow provides a lightweight real estate data extraction solution that pulls property details from Zillow listings and organizes them into a structured spreadsheet. Ideal for real estate professionals, investors, market analysts, and property managers who need automated property data collection without manual effort.

Who is this for?
- Real estate investors tracking properties
- Market analysts conducting property research
- Real estate agents monitoring listings
- Property managers organizing data
- Data analysts building real estate databases

What problem does this workflow solve?
- Eliminates manual copy-paste from Zillow
- Processes multiple property URLs in bulk
- Extracts structured data (price, address, zestimate, etc.)
- Automates saving results into Google Sheets
- Ensures repeatable & consistent data collection

⚙️ What this workflow does
1. Manual Trigger → Starts the workflow manually
2. Read Zillow URLs from Google Sheets → Reads property URLs from a Google Sheet
3. Scrape Zillow URL via Scrape.do → Fetches full HTML from Zillow (bypasses PerimeterX protection)
4. Parse Zillow Data → Extracts structured property information from HTML (see the parsing sketch after this description)
5. Write Results to Google Sheets → Saves parsed data into a results sheet

📊 Output Data Points
| Field | Description | Example |
|-------|-------------|---------|
| URL | Original Zillow listing URL | https://www.zillow.com/homedetails/... |
| Price | Property listing price | $300,000 |
| Address | Street address | 8926 Silver City |
| City | City name | San Antonio |
| State | State abbreviation | TX |
| Days on Zillow | How long listed | 5 |
| Zestimate | Zillow's estimated value | $297,800 |
| Scraped At | Timestamp of extraction | 2025-01-29T12:00:00.000Z |

⚙️ Setup

Prerequisites
- n8n instance (self-hosted)
- Google account with Sheets access
- Scrape.do account with API token (Get 1000 free credits/month)

Google Sheet Structure
This workflow uses one Google Sheet with two tabs:

Input Tab: "Sheet1"
| Column | Type | Description | Example |
|--------|------|-------------|---------|
| URLs | URL | Zillow listing URL | https://www.zillow.com/homedetails/123... |

Output Tab: "Results"
| Column | Type | Description | Example |
|--------|------|-------------|---------|
| URL | URL | Original listing URL | https://www.zillow.com/homedetails/... |
| Price | Text | Property price | $300,000 |
| Address | Text | Street address | 8926 Silver City |
| City | Text | City name | San Antonio |
| State | Text | State code | TX |
| Days on Zillow | Number | Days listed | 5 |
| Zestimate | Text | Estimated value | $297,800 |
| Scraped At | Timestamp | When scraped | 2025-01-29T12:00:00.000Z |

🛠 Step-by-Step Setup
1. Import Workflow: Copy the JSON → n8n → Workflows → + Add → Import from JSON
2. Configure Scrape.do API:
   - Sign up at the Scrape.do Dashboard
   - Get your API token
   - In the HTTP Request node, replace YOUR_SCRAPE_DO_TOKEN with your actual token
   - The workflow uses super=true for premium residential proxies (10 credits per request)
3. Configure Google Sheets:
   - Create a new Google Sheet
   - Add two tabs: "Sheet1" (input) and "Results" (output)
   - In Sheet1, add the header "URLs" in cell A1
   - Add Zillow URLs starting from A2
   - Set up Google Sheets OAuth2 credentials in n8n
   - Replace YOUR_SPREADSHEET_ID with your actual Google Sheet ID
   - Replace YOUR_GOOGLE_SHEETS_CREDENTIAL_ID with your credential ID
4. Run & Test:
   - Add 1-2 test Zillow URLs in Sheet1
   - Click "Execute workflow"
   - Check results in the Results tab

🧰 How to Customize
- **Add more fields**: Extend parsing logic in the "Parse Zillow Data" node to capture additional data (bedrooms, bathrooms, square footage)
- **Filtering**: Add conditions to skip certain properties or price ranges
- **Rate Limiting**: Insert a Wait node between requests if processing many URLs
- **Error Handling**: Add error branches to handle failed scrapes gracefully
- **Scheduling**: Replace the Manual Trigger with a Schedule Trigger for automated daily/weekly runs

📊 Use Cases
- **Investment Analysis**: Track property prices and zestimates over time
- **Market Research**: Analyze listing trends in specific neighborhoods
- **Portfolio Management**: Monitor properties for sale in target areas
- **Competitive Analysis**: Compare similar properties across locations
- **Lead Generation**: Build databases of properties matching specific criteria

📈 Performance & Limits
- **Single Property**: ~5-10 seconds per URL
- **Batch of 10**: 1-2 minutes typical
- **Large Sets (50+)**: 5-10 minutes depending on Scrape.do credits
- **API Calls**: 1 Scrape.do request per URL (10 credits with super=true)
- **Reliability**: 95%+ success rate with premium proxies

🧩 Troubleshooting
| Problem | Solution |
|---------|----------|
| API error 400 | Check your Scrape.do token and credits |
| URL showing "undefined" | Verify Google Sheet column name is "URLs" (capital U) |
| No data parsed | Check if Zillow changed their HTML structure |
| Permission denied | Re-authenticate Google Sheets OAuth2 in n8n |
| 50000 character error | Verify Parse Zillow Data code is extracting fields, not returning raw HTML |
| Price shows HTML/CSS | Update price extraction regex in Parse Zillow Data node |

🤝 Support & Community
- Scrape.do Documentation
- Scrape.do Dashboard
- Scrape.do Zillow Scraping Guide
- n8n Forum
- n8n Docs

🎯 Final Notes
This workflow provides a repeatable foundation for extracting Zillow property data with Scrape.do and saving it to Google Sheets. You can extend it with:
- Historical tracking (append timestamps)
- Price change alerts (compare with previous scrapes)
- Multi-platform scraping (Redfin, Realtor.com)
- Integration with CRM or reporting dashboards

Important: Scrape.do handles all anti-bot bypassing (PerimeterX, CAPTCHAs) automatically with rotating residential proxies, so you only pay for successful requests. Always use the super=true parameter for Zillow to ensure high success rates.
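To give a feel for the "Parse Zillow Data" step, here is a hedged sketch that pulls a few fields out of the raw page HTML with regexes. Zillow's markup changes often (which is exactly why the troubleshooting table mentions updating the price regex), so treat these patterns as illustrative placeholders rather than the node's actual code.

```typescript
// Illustrative extraction from raw Zillow HTML; patterns are placeholders.
interface ZillowResult {
  url: string;
  price: string | null;
  address: string | null;
  zestimate: string | null;
  scrapedAt: string;
}

function parseZillowHtml(url: string, html: string): ZillowResult {
  // Helper: return the first capture group of a regex, or null.
  const first = (re: RegExp): string | null => {
    const m = html.match(re);
    return m ? m[1].trim() : null;
  };

  const price = first(/\$([\d,]{4,})/);
  const zestimate = first(/Zestimate[^$]{0,80}\$([\d,]+)/i);

  return {
    url,
    price: price ? `$${price}` : null,              // e.g. "$300,000"
    address: first(/<title>([^|<]+)/),              // <title> usually starts with the address
    zestimate: zestimate ? `$${zestimate}` : null,  // e.g. "$297,800"
    scrapedAt: new Date().toISOString(),
  };
}
```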
by WeblineIndia
Send daily applicant digest by role from Gmail to hiring managers with Google Gemini

This workflow automatically collects all new job application emails from your Gmail labeled as applicants in the last 24 hours. Every day at 6:00 PM (Asia/Kolkata), it extracts structured details (name, email, phone, role, experience, skills, location, notice, summary) from each applicant (using Gemini AI or OpenAI). It then groups applicants by role and manager, compiles a neat HTML table digest for each manager, and emails them a single summary, so hiring managers get everything they need, at a glance, in one place.

Who's It For
- Recruiters and hiring managers tired of digging through multiple application threads.
- Small HR teams / agencies not yet on a full applicant tracking system.
- Anyone wanting a consolidated, role-targeted applicant update each day.
- Teams that want to automate candidate triage using Google Workspace and AI.

How It Works
1. Schedule Trigger (6PM IST): Runs automatically at 18:00 India time.
2. Fetch Applicant Emails: Reads Gmail for emails labeled 'applicants' from the past 24 hours.
3. Prepare Email Text: Converts email content to plain text for reliable AI extraction.
4. Extract Applicant Details: Gemini/OpenAI extracts the applicant's info as structured JSON.
5. Assign Manager Emails: Routes each applicant to the correct manager via a role→email mapping or fallback (a sketch follows the Use Case Examples below).
6. Group & Build HTML Tables: Organizes applicants by manager and role, builds summary tables.
7. Send Digest to Managers: Sends each manager one HTML summary email for their new applicants.

How to Set Up
1. Create/verify the Gmail label applicants and set up filters to route job emails there.
2. Import the workflow: Use your Google/Gmail and Gemini/OpenAI accounts as credentials.
3. Configure connections:
   - Gmail with OAuth2 (IMAP not required, uses Gmail API)
   - Gemini or OpenAI API key for extraction
4. Set the role→manager mapping in the "Assign Manager Emails" node (just edit the map!).
5. Adjust time / defaults: Edit the schedule and fallback email if you wish.
6. Test it: Send yourself a test application, label it, check the workflow logs.

Requirements
- Gmail account (with OAuth2 enabled and 'applicants' label set up)
- Gemini or OpenAI API key for structured AI extraction
- n8n instance (self-hosted or cloud)
- SMTP credentials (if using direct email instead of the Gmail node)
- At least one valid hiring manager email mapped to a role

How to Customize the Workflow
- Centralize config with a Set node (label name, fallback/manager email, model name, schedule).
- Add attachment-to-text conversion for applications with resume attachments.
- Normalize role names in the mapping code for more robust routing.
- Enable additional delivery: Slack, Teams, Google Sheets log, extra Cron for mid-day urgents.
- Refine the AI extraction prompt for specific fields (add portfolio URL, etc.).
- Change the schedule for daily, weekly or per-role timing.

Add‑Ons / Extensions
- **Resume Text Extraction:** Add PDF/DOCX to text parsing for attachment-only applications.
- **ChatOps:** Send the summary to Slack or Teams channels along with/instead of email.
- **Applicant Logging:** Auto-log every applicant/action into Google Sheets, Notion or Airtable.
- **Multi-timezone:** Duplicate/modify the Cron trigger for different manager regions or urgency levels.

Use Case Examples
- **Tech Hiring:** Java, Python, Frontend candidates are automatically routed to their respective leads.
- **Small Agency:** All applications summarized for reviewers, with per-role breakdowns.
- **HR Operations:** Daily rollups sent before hiring sync, facilitating fast decision-making.
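A small sketch of the role→manager assignment and grouping step (the "Assign Manager Emails" and "Group & Build HTML Tables" nodes). The role names, manager addresses, and applicant fields are placeholders; edit the map to match your team.

```typescript
// Placeholder role→manager map and fallback; edit these to match your team.
const ROLE_TO_MANAGER: Record<string, string> = {
  "java developer": "java.lead@example.com",
  "python developer": "python.lead@example.com",
  "frontend developer": "frontend.lead@example.com",
};
const FALLBACK_MANAGER = "hr@example.com";

interface Applicant {
  name: string;
  email: string;
  role: string;      // as extracted by Gemini/OpenAI
  summary: string;
}

// Normalize the role text, pick a manager, then group applicants per manager.
function groupByManager(applicants: Applicant[]): Map<string, Applicant[]> {
  const groups = new Map<string, Applicant[]>();
  for (const applicant of applicants) {
    const normalizedRole = applicant.role.trim().toLowerCase();
    const manager = ROLE_TO_MANAGER[normalizedRole] ?? FALLBACK_MANAGER;
    const bucket = groups.get(manager) ?? [];
    bucket.push(applicant);
    groups.set(manager, bucket);
  }
  return groups;
}

// Render one manager's applicants as simple HTML table rows.
function toHtmlRows(applicants: Applicant[]): string {
  return applicants
    .map((a) => `<tr><td>${a.name}</td><td>${a.email}</td><td>${a.role}</td><td>${a.summary}</td></tr>`)
    .join("\n");
}
```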
Common Troubleshooting
| Issue | Possible Cause | Solution |
|-----------------------------------------|----------------------------------------------------------|-------------------------------------------------------------|
| No emails processed | No 'applicants' label or wrong time window | Check Gmail filters and adjust search query in fetch node |
| All digests go to fallback manager | Incorrect or missing role → manager mapping | Normalize role text in assignment node, expand map |
| AI Extraction returns bad/missing JSON | Wrong prompt, high temperature or missing field names | Tighten prompt, lower temperature, check example response |
| Duplicate/Old Emails appear | Date filter not correct | Use 'newer_than:1d' and keep 'mark as read' in email node |
| SMTP/Gmail Send errors | Auth problem, quota or app password missing | Use OAuth2, check daily send caps and app password settings |
| Blank or partially filled summary table | AI unable to parse poorly formatted/empty email | Improve sender email consistency, add fallback handling |
| Attachments not processed | No attachment extraction node | Add attachment-to-text parsing before AI node |

Need Help?
If you get stuck, need help customizing a mapping or adding nodes, or want to integrate extra steps (e.g., resume text, Slack), just ask! We're happy to guide you step by step, review your workflow, or help you troubleshoot any errors.

Contact WeblineIndia - Your n8n Automation partner!