by Abdul Mir
Overview
Stop spending hours formatting proposals. This workflow turns a short post-call form into a high-converting, fully personalized PandaDoc proposal—plus it updates your CRM and drafts the follow-up email for you.

After a sales call, just fill out a 3-minute form summarizing key pain points, solutions pitched, and the price. The workflow uses AI to generate polished proposal copy, then builds a PandaDoc draft using dynamic data mapped into the JSON body (which you can fully customize per business). It also updates the lead record in ClickUp with the proposal link, company name, and quote—then creates an email draft in Gmail, ready to send.

Who's it for
- Freelancers and consultants sending service proposals
- Agencies closing deals over sales calls
- Sales reps who want to automate proposal follow-up
- Teams using ClickUp as their lightweight CRM

How it works
1. After a call, fill out a short form with client details, pitch notes, and price
2. AI generates professional proposal copy based on the form input
3. The proposal is formatted and sent to PandaDoc via HTTP request
4. The ClickUp lead is updated with: Company Name, Proposal URL, Quote/price
5. A Gmail draft is created using the proposal link and a thank-you message

Example use case
> You hop off a call, fill out:
> - Prospect: Shopify agency
> - Pain: No lead gen system
> - Solution: Automated cold outreach
> - Price: $2,500/month
>
> 3 minutes later: the PandaDoc proposal is ready, the CRM is updated, and your email draft is waiting to be sent.

How to set up
- Replace the form with your preferred tool (e.g. Tally, Typeform)
- Connect the PandaDoc API and structure your proposal template
- Customize the JSON body inside the HTTP request to match your business (a sketch of that body follows at the end of this description)
- Link your ClickUp space and custom fields
- Connect Gmail (or another email tool) for the final follow-up draft

Requirements
- Form tool for capturing sales call notes
- OpenAI or another LLM key for generating proposal copy
- PandaDoc API access
- ClickUp custom fields set up for lead tracking
- Gmail integration

How to customize
- Customize your PandaDoc proposal fields in the JSON body of the HTTP node
- Replace ClickUp with another CRM like HubSpot or Notion
- Adjust the AI tone (casual, premium, corporate) for proposal writing
- Add Slack or Telegram alerts when the draft is ready
- Add a PDF generation or auto-send email step
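For the "customize the JSON body" step, here is a minimal sketch of a Code node preparing a template-based PandaDoc document body (the PandaDoc "create document from template" endpoint). The template UUID, token names, and form field names are illustrative placeholders, not values from the original workflow — swap in the tokens defined in your own PandaDoc template.

```javascript
// Sketch only: shape of a PandaDoc "create document from template" request body
// (POST https://api.pandadoc.com/public/v1/documents). All values below come
// from a hypothetical post-call form; adjust them to your own fields.
const form = $input.first().json; // output of the post-call form node

return [{
  json: {
    name: `Proposal - ${form.companyName}`,
    template_uuid: 'YOUR_TEMPLATE_UUID',
    recipients: [
      { email: form.clientEmail, first_name: form.clientFirstName, role: 'Client' },
    ],
    // Tokens map form answers onto placeholders inside the PandaDoc template.
    tokens: [
      { name: 'Client.PainPoints', value: form.painPoints },
      { name: 'Client.Solution', value: form.solutionPitched },
      { name: 'Client.Price', value: form.price },
    ],
  },
}];
```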
by Oneclick AI Squad
This automated n8n workflow streamlines real estate marketing by combining voice campaigns and email outreach with AI-powered lead generation. The system monitors real estate offers, generates personalized promotional content using AI, creates targeted email campaigns, and manages lead follow-up through automated voice calls and CRM integration.

Good to Know
- Integrates voice campaign automation with email marketing for multi-channel outreach
- Uses the Llama 3.2 AI model for generating personalized promotional content
- Automatically syncs lead data with CRM systems for comprehensive tracking
- Includes delay mechanisms to ensure proper data synchronization
- Supports both email and voice-based lead nurturing strategies

How It Works
1. **Watch Real Estate Offer** - Monitors incoming real estate listings and opportunities to trigger marketing campaigns
2. **Get Client Contact List** - Fetches targeted client information and contact details from CRM or database systems
3. **Generate Promo Content with Llama** - Uses AI to create personalized marketing content based on property details and client preferences (see the prompt sketch at the end of this description)
4. **Trigger Voice Campaign via VAPI** - Initiates automated voice calls to prospects using personalized messaging
5. **Create Personalized Email Template** - Generates custom HTML email templates with property information and promotional content
6. **Email Promo to Clients (Gmail)** - Sends targeted email campaigns to segmented client lists through Gmail integration
7. **Delay to Sync Data** - Ensures proper data synchronization between systems before processing leads
8. **Receive Lead Data from VAPI** - Captures lead information and responses from voice campaign interactions
9. **Save Lead to CRM Sheet** - Logs all lead data and campaign results to a spreadsheet for tracking and analysis
10. **Send Acknowledgment to VAPI** - Confirms successful lead processing and maintains system synchronization

How to Use
1. Import the workflow into n8n
2. Configure VAPI credentials for voice campaign automation
3. Set up the Gmail API for email marketing integration
4. Connect a CRM or Google Sheets for lead management
5. Configure Llama 3.2 AI model access
6. Test with sample real estate data
7. Monitor campaign performance and lead conversion rates

Requirements
- VAPI account for voice campaigns
- Gmail API credentials
- Llama 3.2 AI model access
- Google Sheets or CRM integration
- Real estate data source

Customizing This Workflow
- Adjust AI prompts for different property types or market segments
- Modify email templates for various campaign styles
- Configure voice campaign scripts based on the target audience
- Set up custom lead scoring and qualification criteria
- Integrate additional CRM systems or marketing platforms
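As a hypothetical illustration of the "Generate Promo Content with Llama" step, the prompt can be assembled from the offer data in a Code node before it is passed to the model. Every field name below is an assumption — map them to whatever your offer source and CRM actually provide.

```javascript
// Illustrative sketch: build the Llama prompt from property and client fields.
// offer.address, offer.price, offer.highlights, offer.clientSegment are
// hypothetical field names, not part of the original workflow.
const offer = $input.first().json;

const prompt = `Write a short, friendly promotional message for this property:
- Address: ${offer.address}
- Price: ${offer.price}
- Highlights: ${offer.highlights}
Tailor the tone for ${offer.clientSegment || 'first-time buyers'} and end with a clear call to action.`;

return [{ json: { prompt } }];
```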
by Rahul Joshi
Description
Never miss a lead again with this SLA Breach Alert automation powered by n8n! This workflow continuously monitors your Google Sheets for un-replied leads and automatically triggers instant Telegram alerts, ensuring your team takes immediate action. By running frequent SLA checks, enriching alerts with direct Google Sheet links, and sending real-time notifications, this automation helps prevent unattended leads, reduce response delays, and boost customer engagement.

What This Template Does
📅 Runs every 5 minutes to monitor SLA breaches
📋 Fetches lead data (status, contact, timestamps) from Google Sheets
🕒 Identifies leads marked "Un-replied" beyond the 15-minute SLA
🔗 Enriches alerts with direct Google Sheet row links for quick action
📲 Sends Telegram alerts with lead details for immediate response

Step-by-Step Setup
1. Prepare Your Google Sheet
   Create a sheet with the following columns (minimum required): Lead Name, Email, Phone, Status (values: Replied, Un-replied), Timestamp (time of last update/reply).
2. Set Up Google Sheets in n8n
   Connect your Google account in n8n and point the workflow to your sheet (remove any hardcoded document IDs before sharing).
3. Configure the SLA Check
   Use the IF node to filter leads where Status = Un-replied and the time since the timestamp is greater than 15 minutes.
4. Enrich Alerts with Links
   Add a Code node to generate direct row links to the sheet (see the sketch at the end of this description).
5. Set Up the Telegram Bot
   Create a Telegram bot via @BotFather, add the bot to your team chat, and store the botToken securely (remove the chatId before sharing templates).
6. Send Alerts
   Configure the Telegram node in n8n to send lead details plus the direct Google Sheet link.

Customization Guidance
- Adjust the SLA window (e.g. 30 minutes or 1 hour) by modifying the IF node condition.
- Add more fields from Google Sheets (e.g. Company, Owner) to enrich the alert.
- Replace Telegram with Slack or Email if your team prefers a different channel.
- Extend the workflow to auto-assign leads in your CRM once alerted.

Perfect For
- Sales teams that need to respond to leads within strict SLAs
- Support teams ensuring no customer request is ignored
- Businesses aiming to keep lead response times sharp and consistent
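For step 4, a minimal sketch of the row-link Code node, using Google Sheets' standard `#gid=...&range=...` deep-link format. It assumes items arrive in sheet order with a single header row; a production version would read the real row number alongside the lead data. SHEET_ID and SHEET_GID are placeholders.

```javascript
// Code node sketch: build a direct link to each lead's row in the sheet.
const SHEET_ID = 'YOUR_GOOGLE_SHEET_ID';
const SHEET_GID = '0'; // gid of the tab that holds the leads

return items.map((item, index) => {
  // Rows are 1-indexed and row 1 is the header, so data starts at row 2.
  const rowNumber = index + 2;
  item.json.rowLink =
    `https://docs.google.com/spreadsheets/d/${SHEET_ID}/edit#gid=${SHEET_GID}&range=A${rowNumber}`;
  return item;
});
```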
by Onur
🔍 Extract Competitor SERP Rankings from Google Search to Sheets with Scrape.do

This template requires a self-hosted n8n instance to run.

A complete n8n automation that extracts competitor data from Google search results for specific keywords and target countries using the Scrape.do SERP API, and saves structured results into Google Sheets for SEO, competitive analysis, and market research.

📋 Overview
This workflow provides a lightweight competitor analysis solution that identifies ranking websites for chosen keywords across different countries. Ideal for SEO specialists, content strategists, and digital marketers who need structured SERP insights without manual effort.

Who is this for?
- SEO professionals tracking keyword competitors
- Digital marketers conducting market analysis
- Content strategists planning based on SERP insights
- Business analysts researching competitor positioning
- Agencies automating SEO reporting

What problem does this workflow solve?
- Eliminates manual SERP scraping
- Processes multiple keywords across countries
- Extracts structured data (position, title, URL, description)
- Automates saving results into Google Sheets
- Ensures a repeatable and consistent methodology

⚙️ What this workflow does
1. Manual Trigger → Starts the workflow manually
2. Get Keywords from Sheet → Reads keywords and target countries from a Google Sheet
3. URL Encode Keywords → Converts keywords into a URL-safe format
4. Process Keywords in Batches → Handles multiple keywords sequentially to avoid rate limits
5. Fetch Google Search Results → Calls the Scrape.do SERP API to retrieve the raw HTML of Google SERPs
6. Extract Competitor Data from HTML → Parses the HTML into structured competitor data (top 10 results)
7. Append Results to Sheet → Writes structured SERP results into a Google Sheet

📊 Output Data Points

| Field | Description | Example |
|---|---|---|
| Keyword | Original search term | digital marketing services |
| Target Country | 2-letter ISO code of target region | US |
| position | Ranking position in search results | 1 |
| websiteTitle | Page title from SERP result | Digital Marketing Software & Tools |
| websiteUrl | Extracted website URL | https://www.hubspot.com/marketing |
| websiteDescription | Snippet/description from search results | Grow your business with HubSpot's tools… |

⚙️ Setup

Prerequisites
- n8n instance (self-hosted)
- Google account with Sheets access
- Scrape.do account with SERP API token

Google Sheet Structure
This workflow uses one Google Sheet with two tabs:

Input Tab: "Keywords"

| Column | Type | Description | Example |
|---|---|---|---|
| Keyword | Text | Search query | digital marketing |
| Target Country | Text | 2-letter ISO code | US |

Output Tab: "Results"

| Column | Type | Description | Example |
|---|---|---|---|
| Keyword | Text | Original search term | digital marketing |
| position | Number | SERP ranking | 1 |
| websiteTitle | Text | Title of the page | Digital Marketing Software & Tools |
| websiteUrl | URL | Website/page URL | https://www.hubspot.com/marketing |
| websiteDescription | Text | Snippet text | Grow your business with HubSpot's tools |

🛠 Step-by-Step Setup
1. Import Workflow: Copy the JSON → n8n → Workflows → + Add → Import from JSON
2. Configure Scrape.do API (see the request sketch at the end of this description):
   - Endpoint: https://api.scrape.do/
   - Parameter: token=YOUR_SCRAPEDO_TOKEN
   - Add render=true for full HTML rendering
3. Configure Google Sheets:
   - Create a sheet with two tabs: Keywords (input) and Results (output)
   - Set up Google Sheets OAuth2 credentials in n8n
   - Replace the placeholders YOUR_GOOGLE_SHEET_ID and YOUR_GOOGLE_SHEETS_CREDENTIAL_ID
4. Run & Test:
   - Add test data in the Keywords tab
   - Execute the workflow → check the results in the Results tab

🧰 How to Customize
- Add more fields: Extend the HTML parsing logic in the "Extract Competitor Data" node to capture extra data (e.g. domain, sitelinks).
- Filtering: Exclude domains or results with custom rules.
- Batch Size: Adjust "Process Keywords in Batches" for speed vs. rate limits.
- Rate Limiting: Insert a Wait node (e.g. 10–30 seconds) if API rate limits apply.
- Multi-Sheet Output: Save per-country or per-keyword results into separate tabs.

📊 Use Cases
- SEO Competitor Analysis: Identify top-ranking sites for target keywords
- Market Research: See how SERPs differ by region
- Content Strategy: Analyze titles and descriptions of competitor pages
- Agency Reporting: Automate competitor SERP snapshots for clients

📈 Performance & Limits
- Single Keyword: ~10–20 seconds (depends on Scrape.do response)
- Batch of 10: 3–5 minutes typical
- Large Sets (50+): 20–40 minutes depending on API credits and batching
- API Calls: 1 Scrape.do request per keyword
- Reliability: 95%+ extraction success, 98%+ data accuracy

🧩 Troubleshooting
- API error → Check YOUR_SCRAPEDO_TOKEN and your API credits
- No keywords loaded → Verify the Google Sheet ID and that the tab name is Keywords
- Permission denied → Re-authenticate Google Sheets OAuth2 in n8n
- Empty results → Check the parsing logic and verify search term validity
- Workflow stops early → Ensure the batching loop (SplitInBatches) is properly connected

🤝 Support & Community
- n8n Forum: https://community.n8n.io
- n8n Docs: https://docs.n8n.io
- Scrape.do Dashboard: https://dashboard.scrape.do

🎯 Final Notes
This workflow provides a repeatable foundation for extracting competitor SERP rankings with Scrape.do and saving them to Google Sheets. You can extend it with filtering, richer parsing, or integration with reporting dashboards to create a fully automated SEO intelligence pipeline.
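As a reference for step 2 above, a minimal sketch of how the SERP request URL can be assembled from a keyword row. The token and url parameters follow the pattern shown in the setup notes; the Google country parameter (gl) and the exact option names should be confirmed against your Scrape.do dashboard.

```javascript
// Sketch: build the Scrape.do request URL for one keyword row.
const row = $input.first().json;
const keyword = encodeURIComponent(row.Keyword);                  // e.g. "digital marketing services"
const country = (row['Target Country'] || 'US').toLowerCase();    // 2-letter ISO code

// The Google SERP URL is itself URL-encoded and passed in the `url` parameter.
const googleUrl = encodeURIComponent(
  `https://www.google.com/search?q=${keyword}&gl=${country}`
);
const requestUrl =
  `https://api.scrape.do/?token=YOUR_SCRAPEDO_TOKEN&url=${googleUrl}&render=true`;

return [{ json: { requestUrl, keyword: row.Keyword, targetCountry: row['Target Country'] } }];
```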
by Rahul Joshi
📘 Description
This workflow automates the employee onboarding process by creating Jira accounts, generating Notion onboarding checklists, crafting AI-generated welcome messages, and sending personalized welcome emails — all automatically. It provides a complete hands-free onboarding experience for HR and IT teams by connecting Jira, Notion, Google Sheets, Gmail, and Azure OpenAI. Failures (like Jira account creation errors) are logged into Google Sheets to ensure full transparency and no missed onboardings.

⚙️ What This Workflow Does (Step-by-Step)

🟢 When Clicking "Execute Workflow"
Manually triggers the entire onboarding automation. Useful for testing or initiating onboarding on demand for a new hire.

👤 Define New Hire Profile Data
Structures all essential employee information into a clean dataset including name, email, start date, buddy, and access links (Slack, GitHub, Jira, Notion). Acts as the single source of truth for all downstream systems, ensuring consistent, error-free onboarding data.

🎫 Create Jira User Account
Automatically creates a Jira account for the new employee using REST API calls. Includes email, display name, username, and product access (Jira Software). Removes the need for manual admin setup and ensures immediate access to project boards. (A hedged request-body sketch follows at the end of this step list.)

✅ Validate Jira Account Creation Success
Checks whether the Jira API response contains a valid accountId. If successful → continues onboarding. If failed → logs the error to Google Sheets. Ensures downstream steps don't continue if Jira setup fails.

📊 Log Jira Provisioning Failures to Error Sheet
Appends any account creation errors (duplicate emails, invalid permissions, or API issues) into an "error log sheet" in Google Sheets. Helps HR/IT monitor issues and manually resolve them. Guarantees no silent onboarding failures.

📋 Generate Notion Onboarding Checklist
Creates a personalized Notion page titled "{Name} - Onboarding Checklist" that includes a welcome message, access links (Slack, GitHub, Jira), assigned buddy details, start date and status, and optionally embedded videos or docs. Gives each new hire a structured hub to manage onboarding tasks independently.

🤖 AI-Generated Welcome Message Creator
Uses GPT-4o (Azure OpenAI) to craft a friendly, motivational welcome message for the new employee. Incorporates name, buddy, and access details with emojis and a warm tone. Ensures every message feels human and engaging — not robotic.

🧠 GPT-4o Language Model Configuration
Configures the AI assistant persona for personalized onboarding messages. Ensures tone consistency, friendliness, and empathy across all communications.

🔗 Consolidate Onboarding Data Streams
Merges data from Jira, Notion, and AI message generation into a single payload. This ensures the final email contains every onboarding element — access links, checklist URL, and the AI-generated message.

📧 Format Comprehensive Welcome Email
Generates a complete HTML-formatted email with a personalized greeting, the AI-generated welcome message, clickable links (Jira, Notion, Slack, GitHub), buddy info, and start date. Designed for mobile responsiveness and branded presentation.

📬 Send Welcome Email to New Hire
Sends the final welcome email to the employee's inbox with the subject: "Welcome to Techdome, {Name}! 🎉" Includes all essential access information, links, and team introductions — ensuring the new hire starts strong on Day 1.
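For the "Create Jira User Account" step, here is a hedged sketch of a Code node preparing the request body for the Jira Cloud REST API v3 user-creation endpoint (POST /rest/api/3/user). The field names are illustrative and should be verified against Atlassian's documentation for your deployment; the input field names mirror the profile step described above but are assumptions.

```javascript
// Sketch: assemble the JSON body the "Create Jira User Account" HTTP Request node sends.
const newHire = $input.first().json; // output of "Define New Hire Profile Data"

return [{
  json: {
    emailAddress: newHire.email,     // e.g. "jane.doe@example.com"
    displayName: newHire.name,       // e.g. "Jane Doe"
    products: ['jira-software'],     // grants Jira Software access
  },
}];
```

The validation step then treats a response containing a non-empty accountId as success and routes failures to the Google Sheets error log.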
🧩 Prerequisites
- Jira Admin API credentials
- Notion API integration
- Gmail OAuth2 credentials
- Azure OpenAI (GPT-4o) access
- Google Sheets document for logging errors

💡 Key Benefits
✅ Fully automated new hire onboarding
✅ AI-generated personalized communications
✅ Real-time error logging for IT transparency
✅ Seamless integration across Jira, Notion, and Gmail
✅ Professional first-day experience with zero manual work

👥 Perfect For
- HR teams managing multiple onboardings
- IT admins automating access provisioning
- Startups scaling employee onboarding
- Organizations using the Jira + Notion + Gmail stack
by n8n Automation Expert | Template Creator | 2+ Years Experience
🌤️ Automated Indonesian Weather Monitoring with Smart Notifications

Stay ahead of weather changes with this comprehensive monitoring system that fetches real-time data from Indonesia's official meteorological agency (BMKG) and delivers beautiful, actionable weather reports directly to your Telegram.

⚡ What This Workflow Does
This intelligent weather monitoring system automatically:
- **Fetches Official Data**: Connects to BMKG's public weather API for accurate Indonesian forecasts
- **Smart Processing**: Analyzes temperature, humidity, precipitation, and wind conditions
- **Risk Assessment**: Generates contextual warnings for extreme weather conditions
- **Automated Alerts**: Sends formatted weather reports to Telegram every 6 hours
- **Error Handling**: Includes a robust error detection and notification system

🎯 Perfect For
- **Local Communities**: Keep neighborhoods informed about weather changes
- **Business Operations**: Plan outdoor activities and logistics based on weather
- **Emergency Preparedness**: Receive early warnings for extreme weather conditions
- **Personal Planning**: Never get caught unprepared by sudden weather changes
- **Agricultural Monitoring**: Track conditions affecting farming and outdoor work

🛠️ Key Features
- 🔄 **Automated Scheduling**: Runs every 6 hours with a manual trigger option
- 📊 **Comprehensive Reports**: Current conditions + 6-hour detailed forecasts
- ⚠️ **Smart Warnings**: Contextual alerts for temperature extremes and rain probability
- 🎨 **Beautiful Formatting**: Rich Telegram messages with emojis and structured data
- 🔧 **Error Recovery**: Automatic error handling with a notification system
- 📍 **Location-Aware**: Supports any Indonesian location via BMKG regional codes

📋 What You'll Get
Each weather report includes:
- Current temperature, humidity, and weather conditions
- 6-hour detailed forecast with timestamps
- Wind speed and direction information
- Rain probability and visibility data
- Personalized warnings and recommendations
- Average daily statistics and trends

🚀 Setup Requirements
- **Telegram Bot Token**: Create a bot via @BotFather
- **Chat ID**: Your personal or group chat identifier
- **BMKG Location Code**: Regional administrative code for your area

💡 Pro Tips
- Customize the location by changing the adm4 parameter in the HTTP request
- Adjust the scheduling interval based on your monitoring needs
- Modify warning thresholds in the processing code (see the sketch after this description)
- Add multiple chat IDs for broader distribution
- Integrate with other n8n workflows for advanced automation

🌟 Why Choose This Template
- **Production Ready**: Includes comprehensive error handling and logging
- **Highly Customizable**: Easy to modify for different locations and preferences
- **Official Data Source**: Uses Indonesia's trusted meteorological service
- **User-Friendly Output**: Clean, readable reports perfect for daily use
- **Scalable Design**: Easily extend for multiple locations or notification channels

Transform your weather awareness with this professional-grade monitoring system that brings Indonesia's official weather data right to your fingertips!

Keywords: weather monitoring, BMKG API, Telegram notifications, Indonesian weather, automated alerts, meteorological data, weather forecasting, n8n automation, weather API integration
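To illustrate what "modify warning thresholds in the processing code" can look like, here is a hedged sketch of threshold logic in an n8n Code node. The thresholds and the BMKG field names (t, hu, weather_desc) are assumptions based on typical BMKG forecast payloads — align them with the actual response you receive.

```javascript
// Sketch: derive contextual warnings from one forecast entry.
const forecast = $input.first().json;
const warnings = [];

if (forecast.t >= 35) warnings.push('🌡️ Extreme heat expected — stay hydrated');
if (forecast.t <= 20) warnings.push('🧊 Unusually cool temperatures');
if (forecast.hu >= 90) warnings.push('💧 Very high humidity');
// "hujan" means rain in Indonesian weather descriptions.
if (/hujan/i.test(forecast.weather_desc || '')) warnings.push('🌧️ Rain likely — plan indoor alternatives');

return [{ json: { ...forecast, warnings } }];
```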
by Automate With Marc
Auto-Edit Google Drive Images with Nano Banana + Social Auto-Post

Drop an image into Google Drive and let this workflow handle the rest: it auto-cleans and enhances the image with Google's Nano Banana (via the Wavespeed API), generates a catchy caption with GPT-5, and publishes directly to your connected social accounts using Postiz.

👉 Watch step-by-step video tutorials of workflows like these at https://www.youtube.com/watch?v=4wk6PYgBtBM&list=PL05w1TE8X3bb1H9lXBqUy98zmTrrPP-s1

What it does
- Triggers from Google Drive when a new image is uploaded
- Sends the image to Nano Banana to declutter, brighten, and make it real-estate/photo-listing ready
- Polls for the edited result until it's complete
- Logs the edited image URL into Google Sheets for tracking
- Downloads and uploads the edited image into the Postiz media library
- Generates an engaging caption with the GPT-5 Caption Agent
- Publishes instantly to Instagram (can be extended to TikTok, LinkedIn, etc.)

Perfect for
- Real-estate agents posting property shots
- Ecommerce sellers updating product catalogs
- Social media marketers needing fast, polished posts

Apps & Services Used
- Google Drive (Trigger)
- Wavespeed API – Google Nano Banana (image editing)
- Google Sheets (logging)
- Postiz (social scheduling/posting)
- OpenAI GPT-5 (caption agent)

Setup
1. Connect your Google Drive and select the upload folder.
2. Add your Wavespeed API key for Nano Banana.
3. Connect Google Sheets for logging.
4. Add Postiz API credentials and set the integration ID for your channel(s).
5. Enter your OpenAI API key for GPT-5 captioning.

Customization
- Adjust the edit prompt for different use cases (e.g. product cleanup, lighting tweaks).
- Change the Postiz post type to scheduled instead of "now."
- Add more Postiz posts for multi-platform publishing.
- Insert an approval loop (Slack/Email) before posting.

Logs
- Edited Image Log (Sheets): stores the final image URL + timestamp.
- Publishing Log (Sheets): tracks workflow status per asset.

Notes
- Sticky notes in the template explain each major block.
- Replace sample IDs with your own (folder IDs, sheet IDs, Postiz integration).
- Keep all API keys in n8n Credentials, not in node parameters.
by Aitor | 1Node
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automates the distribution and scheduling of video content across multiple social platforms (TikTok, YouTube, Facebook, Instagram, Threads) through Postiz. Videos are collected from Google Drive, approved manually, and scheduled via the Postiz community node.

🧾 Requirements
- Google Drive account with access to the folder that will be watched for new uploads
- Videos in mp4 format ready to be shared; alternatively, you can connect a CloudConvert community node to convert the format before uploading to Postiz
- Postiz account with integrations for TikTok, YouTube, Facebook, Instagram, and Threads

🔗 Useful Links
- Postiz Docs
- Postiz Community Node

🔄 Workflow Steps
1. Trigger: Google Drive File Added - Watches your selected Google Drive folder for new file uploads.
2. Download File - Downloads the detected video from Drive.
3. Upload to Postiz - The video is uploaded to Postiz to prepare for social scheduling.
4. Set Fields - Manual setting of social options.
5. Extract Datetime (AI) - Uses OpenAI to find/predict the intended publish date and time, since Postiz requires a datetime value to schedule a post.
6. Get Social Integrations - Fetches the list of the user's connected platforms from Postiz.
7. Split and Filter Integrations - Splits the process per platform (TikTok, YouTube, Facebook, Instagram, Threads).
8. Schedule Post - For each enabled platform, schedules the video with the chosen options.

🙋‍♂️ Need Help?
Connect with 1 Node
by vinci-king-01
Daily Stock Regulatory News Aggregator with Compliance Alerts and Google Sheets Tracking

🎯 Target Audience
- Compliance officers and regulatory teams
- Financial services firms monitoring regulatory updates
- Investment advisors tracking regulatory changes
- Risk management professionals
- Corporate legal departments
- Stock traders and analysts monitoring regulatory news

🚀 Problem Statement
Manually monitoring regulatory updates from multiple agencies (SEC, FINRA, ESMA) is time-consuming and error-prone. This template automates daily regulatory news monitoring, aggregates updates from major regulatory bodies, filters for recent announcements, and instantly alerts compliance teams to critical regulatory changes, enabling timely responses and maintaining regulatory compliance.

🔧 How it Works
This workflow automatically monitors regulatory news daily, scrapes the latest updates from major regulatory agencies using AI-powered web scraping, filters for updates from the last 24 hours, and sends Slack alerts while logging all updates to Google Sheets for historical tracking.

Key Components
- **Daily Schedule Trigger** - Automatically runs the workflow every 24 hours to check for regulatory updates
- **Regulatory Sources Configuration** - Defines the list of regulatory agencies and their URLs to monitor (SEC, FINRA, ESMA)
- **Batch Processing** - Iterates through regulatory sources one at a time for reliable processing
- **AI-Powered Scraping** - Uses ScrapeGraphAI to intelligently extract regulatory updates including title, summary, date, agency, and source URL
- **Data Flattening** - Transforms the scraped data structure into individual update records
- **Time Filtering** - Filters updates to keep only those from the last 24 hours (see the filter sketch at the end of this description)
- **Historical Tracking** - Logs all filtered updates to Google Sheets for compliance records
- **Compliance Alerts** - Sends Slack notifications to compliance teams when new regulatory updates are detected

💰 Key Features

Automated Regulatory Monitoring
- **Daily Execution**: Runs automatically every 24 hours without manual intervention
- **Multi-Agency Support**: Monitors SEC, FINRA, and ESMA simultaneously
- **Error Handling**: Gracefully handles scraping errors and continues processing other sources

Smart Filtering
- **Time-Based Filtering**: Automatically filters updates to show only those from the last 24 hours
- **Date Validation**: Discards updates with unreadable or invalid dates
- **Recent Updates Focus**: Ensures compliance teams only receive actionable, timely information

Alert System
- **Compliance Alerts**: Instant Slack notifications for new regulatory updates
- **Structured Data**: Alerts include title, summary, date, agency, and source URL
- **Dedicated Channel**: Posts to a designated compliance alerts channel for team visibility

📊 Output Specifications
The workflow generates and stores structured data including:

| Output Type | Format | Description | Example |
|---|---|---|---|
| Regulatory Updates | JSON Object | Extracted regulatory update information | {"title": "SEC Announces New Rule", "date": "2024-01-15", "agency": "SEC"} |
| Update History | Google Sheets | Historical regulatory update records with timestamps | Columns: Title, Summary, Date, Agency, Source URL, Scraped At |
| Slack Alerts | Messages | Compliance notifications for new updates | "📢 New SEC update: [Title] - [Summary]" |
| Error Logs | System Logs | Scraping error notifications | "❌ Error scraping FINRA updates" |

🛠️ Setup Instructions
Estimated setup time: 15-20 minutes

Prerequisites
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Google Sheets API access (OAuth2)
- Slack workspace with API access
- Google Sheets spreadsheet for regulatory update tracking

Step-by-Step Configuration

1. Install Community Nodes
   Install the ScrapeGraphAI community node: npm install n8n-nodes-scrapegraphai

2. Configure ScrapeGraphAI Credentials
   - Navigate to Credentials in your n8n instance
   - Add new ScrapeGraphAI API credentials
   - Enter your API key from the ScrapeGraphAI dashboard
   - Test the connection to ensure it's working

3. Set up Google Sheets Connection
   - Add Google Sheets OAuth2 credentials and authorize access to your Google account
   - Create or identify the spreadsheet for regulatory update tracking
   - Note the spreadsheet ID and sheet name (default: "RegUpdates")

4. Configure Slack Integration
   - Add Slack API credentials to your n8n instance
   - Create or identify the Slack channel: #compliance-alerts
   - Test the Slack connection with a sample message
   - Ensure the bot has permission to post messages

5. Customize Regulatory Sources
   Open the "Regulatory Sources" Code node and update the urls array with additional regulatory sources if needed:

```javascript
const urls = [
  'https://www.sec.gov/news/pressreleases',
  'https://www.finra.org/rules-guidance/notices',
  'https://www.esma.europa.eu/press-news',
  // Add more URLs as needed
];
```

6. Configure Google Sheets
   - Update documentId in the "Log to Google Sheets" node with your spreadsheet ID
   - Update sheetName to match your sheet name (default: "RegUpdates")
   - Ensure the sheet has the columns: Title, Summary, Date, Agency, Source URL, Scraped At
   - Create the sheet with proper column headers if starting fresh

7. Customize Slack Channel
   - Open the "Send Compliance Alert" Slack node
   - Update the channel name (default: "#compliance-alerts")
   - Customize the message format if needed and test with a sample message

8. Adjust Schedule
   - Open the "Daily Regulatory Poll" Schedule Trigger
   - Modify hoursInterval to change the frequency (default: 24 hours)
   - Set specific times if needed for daily execution

9. Customize Scraping Prompt
   - Open the "Scrape Regulatory Updates" ScrapeGraphAI node
   - Adjust the userPrompt to extract different or additional fields
   - Modify the JSON schema in the prompt if needed
   - Change the number of updates extracted (default: 5 most recent)

10. Test and Validate
    - Run the workflow manually to verify all connections
    - Check Google Sheets for the data structure and format
    - Verify Slack alerts are working correctly
    - Test error handling with invalid URLs
    - Validate that date filtering is working properly

🔄 Workflow Customization Options

Modify Monitoring Frequency
- Change hoursInterval in the Schedule Trigger for different frequencies
- Switch to multiple times per day for critical monitoring
- Add multiple schedule triggers for different agency checks

Extend Data Collection
- Modify the ScrapeGraphAI prompt to extract additional fields (documents, categories, impact level)
- Add data enrichment nodes for risk assessment
- Integrate with regulatory databases for more comprehensive tracking
- Add sentiment analysis for regulatory updates

Enhance Alert System
- Add email notifications alongside Slack alerts
- Create different alert channels for different agencies
- Add priority-based alerting based on update keywords
- Integrate with SMS or push notification services
- Add webhook integrations for other compliance tools

Advanced Analytics
- Add data visualization nodes for regulatory trend analysis
- Create automated compliance reports with summaries
- Integrate with business intelligence tools
- Add machine learning for update categorization
- Track regulatory themes and topics over time

Multi-Source Support
- Add support for additional regulatory agencies
- Implement agency-specific scraping strategies
- Add regional regulatory sources (FCA, BaFin, etc.)
- Include state-level regulatory updates

📈 Use Cases
- **Compliance Monitoring**: Automatically track regulatory updates to ensure timely compliance responses
- **Risk Management**: Monitor regulatory changes that may impact business operations or investments
- **Regulatory Intelligence**: Build historical databases of regulatory announcements for trend analysis
- **Client Communication**: Stay informed to provide timely updates to clients about regulatory changes
- **Legal Research**: Track regulatory developments for legal research and case preparation
- **Investment Strategy**: Monitor regulatory changes that may affect investment decisions

🚨 Important Notes
- Respect website terms of service and rate limits when scraping regulatory sites
- Monitor ScrapeGraphAI API usage to manage costs
- Ensure Google Sheets has the proper column structure before the first run
- Set up the Slack channel before running the workflow
- Consider implementing rate limiting for multiple regulatory sources
- Keep credentials secure and rotate them regularly
- Test with one regulatory source first before adding multiple sources
- Verify date formats are consistent across different regulatory agencies
- Be aware that some regulatory sites may have anti-scraping measures

🔧 Troubleshooting

Common Issues:
- ScrapeGraphAI connection errors: Verify the API key and account status
- Google Sheets logging failures: Check the spreadsheet ID, sheet name, and column structure
- Slack notification failures: Verify the channel name exists and the bot has permissions
- Date filtering issues: Ensure dates from scraped content are in a parseable format
- Validation errors: Check that the scraped data matches the expected schema
- Empty results: Verify regulatory sites are accessible and haven't changed structure

Optimization Tips:
- Start with one regulatory source to test the workflow
- Monitor API usage and costs regularly
- Use batch processing to avoid overwhelming scraping services
- Implement retry logic for failed scraping attempts
- Consider caching mechanisms for frequently checked sources
- Adjust the number of updates extracted based on typical volume

Support Resources:
- ScrapeGraphAI documentation and API reference
- Google Sheets API documentation
- Slack API documentation for webhooks
- n8n community forums for workflow assistance
- n8n documentation for node configuration
- SEC, FINRA, and ESMA official websites for source verification
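For reference, a minimal sketch of the 24-hour time-filter step described in Key Components. The field name `date` and the formats it accepts are assumptions — align them with what your ScrapeGraphAI prompt actually returns; unparseable dates are discarded, matching the date-validation behavior above.

```javascript
// Sketch: keep only updates from the last 24 hours and drop unreadable dates.
const cutoff = Date.now() - 24 * 60 * 60 * 1000;

return items.filter((item) => {
  const parsed = Date.parse(item.json.date); // e.g. "2024-01-15" or an ISO timestamp
  return !Number.isNaN(parsed) && parsed >= cutoff;
});
```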
by Omer Fayyaz
This n8n template implements a Calendly Booking Link Generator that creates single-use, personalized booking links, logs them to Google Sheets, and optionally notifies a Slack channel.

Who's it for
This template is designed for teams and businesses that send Calendly links proactively and want to generate trackable, single-use booking links on demand. It's perfect for:
- **Sales and SDR teams** sending 1:1 outreach and needing unique booking links per prospect
- **Customer success and support teams** who want prefilled, one-click rescheduling or follow-up links
- **Marketing and growth teams** that want UTM-tagged booking links for campaigns
- **Ops/RevOps** who need a central log of every generated link for tracking and reporting

How it works / What it does
This workflow turns a simple HTTP request into a fully configured single-use Calendly booking link:

1. Webhook Trigger (POST)
   Receives a JSON payload with recipient details: name, email, optional event_type_uri, optional utm_source.

2. Configuration & Input Normalization
   Set Configuration extracts and normalizes:
   - recipient_name, recipient_email
   - requested_event_type (can be empty)
   - utm_source (defaults to "n8n" if not provided)

3. Calendly API – User & Event Types
   - Get Current User calls GET /users/me using Calendly OAuth2 to get the current user URI
   - Extract User stores user_uri and user_name
   - Get Event Types calls GET /event_types?user={user_uri}&active=true to fetch active event types
   - Select Event Type uses requested_event_type if provided, otherwise selects the first active event type, and stores the event type URI, name, and duration (minutes)

4. Create Calendly Single-Use Scheduling Link
   Create Single-Use Link calls POST /scheduling_links with:
   - owner: selected event type URI
   - owner_type: "EventType"
   - max_event_count: 1 (single use)

5. Build Personalized Booking URL
   Build Personalized Link:
   - Reads the base booking_url from Calendly
   - Appends query parameters to prefill: name (encoded), email (encoded), utm_source
   - Stores: base_booking_url, personalized_booking_url, recipient_name, recipient_email, event_type_name, event_duration, link_created_at (ISO timestamp)
   (A code sketch of this step follows at the end of this description.)

6. Optional Logging and Notifications
   - Log to Google Sheets (optional but preconfigured): appends each generated link to a "Generated Links" sheet with the columns Recipient Name, Recipient Email, Event Type, Duration (min), Booking URL, Created At, Status
   - Notify via Slack (optional): posts a nicely formatted Slack message with the recipient name & email, event name & duration, and a clickable booking link

7. API Response to Caller
   Respond to Webhook returns a structured JSON response: success, booking_url (personalized), base_url, recipient object, event object (name + duration), created_at, and an expires explanation ("Single-use or 90 days").

The result is an API-style service you can call from any system to generate trackable, single-use Calendly links.

How to set up

1. Calendly OAuth2 setup
   - Go to calendly.com/integrations or developer.calendly.com
   - Create an OAuth2 application (or use an existing one)
   - In n8n, create Calendly OAuth2 credentials: add the client ID, client secret, and redirect URL as required by Calendly, then connect your Calendly user account
   - In the workflow, make sure all Calendly HTTP Request nodes use your Calendly OAuth2 credential

2. Webhook Trigger configuration
   - Open the Webhook Trigger node and confirm: HTTP Method: POST, Path: generate-calendly-link, Response Mode: Response Node (points to Respond to Webhook)
   - Copy the Production URL from the node once the workflow is active
   - Use this URL as the endpoint for your CRM, outbound tool, or any system that needs to request links

   Expected request body:

```json
{
  "name": "John Doe",
  "email": "john@example.com",
  "event_type_uri": "optional",
  "utm_source": "optional"
}
```

   If event_type_uri is not provided, the workflow automatically uses the first active event type for the current Calendly user.

3. Google Sheets setup (optional but recommended)
   - Create a Google Sheet for tracking links and add a sheet/tab named e.g. "Generated Links"
   - Set the header row to: Recipient Name, Recipient Email, Event Type, Duration (min), Booking URL, Created At, Status
   - In n8n: create Google Sheets OAuth2 credentials, open the Log to Google Sheets node, and update documentId → your spreadsheet ID and sheetName → your tab name (e.g. "Generated Links")

4. Slack notification setup (optional)
   - Create a Slack app at api.slack.com
   - Add Bot Token scopes (for basic posting): chat:write and channels:read (or groups:read if posting to private channels)
   - Install the app to your workspace and get the Bot User OAuth Token
   - In n8n: create a Slack API credential using the bot token, open the Notify via Slack node, select your credential, and set select: channel and channelId to your desired channel (e.g. #sales or #booking-links)

5. Test the workflow end-to-end
   - Activate the workflow
   - Use Postman, curl, or another system to POST to the webhook URL, e.g.:

```json
{
  "name": "Test User",
  "email": "test@example.com"
}
```

   - Verify: the HTTP response contains a valid booking_url, a new row is added to your Google Sheet (if configured), and a Slack notification is posted (if configured)

Requirements
- **Calendly account** with at least one **active event type**
- **n8n instance** (cloud or self-hosted) with public access for the webhook
- **Calendly OAuth2 credentials** configured in n8n
- (Optional) Google Sheets account and OAuth2 credentials
- (Optional) Slack workspace with permissions to install a bot and post to channels

How to customize the workflow

Input & validation
- Update the Set Configuration node to enforce required fields (e.g. fail if email is missing) or add more optional parameters (e.g. utm_campaign, utm_medium, language)
- Add an IF node after the Webhook Trigger for stricter validation and custom error responses

Event type selection logic
- In Select Event Type, change the fallback selection rule (e.g. pick the longest or shortest duration event) or add logic to map a custom field (like event_key) to specific event type URIs

Link parameters & tracking
- In Build Personalized Link, add additional query parameters (e.g. utm_campaign, source, segment), or remove or rename existing parameters if needed
- If you don't want prefilled name/email, remove those query parameters and just keep the tracking fields

Google Sheets logging
- Extend the Log to Google Sheets mapping to include utm_source or other marketing attributes, the sales owner, campaign name, or pipeline stage, or any additional fields you compute in previous nodes

Slack notification formatting
- In Notify via Slack, adjust the message text to your team's tone, add emojis or @mentions for certain event types, or include utm_source and other metadata for debugging and tracking

Key features
- **Single-use Calendly links** – each generated link is limited to one booking (or expires after ~90 days)
- **Prefilled recipient details** – name and email are embedded in the URL, making it frictionless to book
- **Webhook-first design** – easily call this from CRMs, outreach tools, or any external system
- **Central link logging** – every link is stored in Google Sheets for auditing and reporting
- **Optional Slack alerts** – keep sales/support teams notified when new links are generated
- **Safe error handling** – HTTP nodes are configured with continueRegularOutput to avoid hard workflow failures

Example scenarios

Scenario 1: Sales outreach
A CRM workflow triggers when a lead moves to "Meeting Requested". It calls this n8n webhook with the lead's name and email. The workflow generates a single-use Calendly link, logs it to Sheets, and posts to Slack. The CRM sends an email to the lead with the personalized booking link.

Scenario 2: Automated follow-up link
A support ticket is resolved and the system wants to offer a follow-up call. It calls the webhook with the name, email, and a dedicated event_type_uri for "Follow-up Call". The generated link is logged and returned via the API, then included in an automated email.

Scenario 3: Campaign tracking
A marketing automation tool triggers this webhook for each contact in a campaign, passing utm_source (e.g. q1-outbound). The workflow adds utm_source to the link and logs it in Google Sheets. Later, you can analyze which campaigns generated the most completed bookings from single-use links.

This template gives you a reliable, reusable Calendly link generation service that plugs into any part of your stack, while keeping tracking, logging, and team visibility fully automated.
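For reference, a minimal sketch of the "Build Personalized Booking URL" step: it appends the prefill and tracking parameters to the booking_url returned by Calendly. The incoming field names mirror the nodes described above but are illustrative; adjust them to however your Set and Extract nodes actually name things.

```javascript
// Sketch: append prefill (name, email) and tracking (utm_source) parameters.
const data = $input.first().json;
const base = data.booking_url;                      // from "Create Single-Use Link"
const name = encodeURIComponent(data.recipient_name);
const email = encodeURIComponent(data.recipient_email);
const utmSource = data.utm_source || 'n8n';

const personalized = `${base}?name=${name}&email=${email}&utm_source=${utmSource}`;

return [{
  json: {
    base_booking_url: base,
    personalized_booking_url: personalized,
    recipient_name: data.recipient_name,
    recipient_email: data.recipient_email,
    link_created_at: new Date().toISOString(),
  },
}];
```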
by Onur
🏠 Extract Zillow Property Data to Google Sheets with Scrape.do

This template requires a self-hosted n8n instance to run.

A complete n8n automation that extracts property listing data from Zillow URLs using the Scrape.do web scraping API, parses key property information, and saves structured results into Google Sheets for real estate analysis, market research, and property tracking.

📋 Overview
This workflow provides a lightweight real estate data extraction solution that pulls property details from Zillow listings and organizes them into a structured spreadsheet. Ideal for real estate professionals, investors, market analysts, and property managers who need automated property data collection without manual effort.

Who is this for?
- Real estate investors tracking properties
- Market analysts conducting property research
- Real estate agents monitoring listings
- Property managers organizing data
- Data analysts building real estate databases

What problem does this workflow solve?
- Eliminates manual copy-paste from Zillow
- Processes multiple property URLs in bulk
- Extracts structured data (price, address, Zestimate, etc.)
- Automates saving results into Google Sheets
- Ensures repeatable and consistent data collection

⚙️ What this workflow does
1. Manual Trigger → Starts the workflow manually
2. Read Zillow URLs from Google Sheets → Reads property URLs from a Google Sheet
3. Scrape Zillow URL via Scrape.do → Fetches the full HTML from Zillow (bypasses PerimeterX protection)
4. Parse Zillow Data → Extracts structured property information from the HTML (see the parsing sketch at the end of this description)
5. Write Results to Google Sheets → Saves parsed data into a results sheet

📊 Output Data Points

| Field | Description | Example |
|---|---|---|
| URL | Original Zillow listing URL | https://www.zillow.com/homedetails/... |
| Price | Property listing price | $300,000 |
| Address | Street address | 8926 Silver City |
| City | City name | San Antonio |
| State | State abbreviation | TX |
| Days on Zillow | How long listed | 5 |
| Zestimate | Zillow's estimated value | $297,800 |
| Scraped At | Timestamp of extraction | 2025-01-29T12:00:00.000Z |

⚙️ Setup

Prerequisites
- n8n instance (self-hosted)
- Google account with Sheets access
- Scrape.do account with API token (1000 free credits/month)

Google Sheet Structure
This workflow uses one Google Sheet with two tabs:

Input Tab: "Sheet1"

| Column | Type | Description | Example |
|---|---|---|---|
| URLs | URL | Zillow listing URL | https://www.zillow.com/homedetails/123... |

Output Tab: "Results"

| Column | Type | Description | Example |
|---|---|---|---|
| URL | URL | Original listing URL | https://www.zillow.com/homedetails/... |
| Price | Text | Property price | $300,000 |
| Address | Text | Street address | 8926 Silver City |
| City | Text | City name | San Antonio |
| State | Text | State code | TX |
| Days on Zillow | Number | Days listed | 5 |
| Zestimate | Text | Estimated value | $297,800 |
| Scraped At | Timestamp | When scraped | 2025-01-29T12:00:00.000Z |

🛠 Step-by-Step Setup
1. Import Workflow: Copy the JSON → n8n → Workflows → + Add → Import from JSON
2. Configure Scrape.do API:
   - Sign up at the Scrape.do Dashboard and get your API token
   - In the HTTP Request node, replace YOUR_SCRAPE_DO_TOKEN with your actual token
   - The workflow uses super=true for premium residential proxies (10 credits per request)
3. Configure Google Sheets:
   - Create a new Google Sheet with two tabs: "Sheet1" (input) and "Results" (output)
   - In Sheet1, add the header "URLs" in cell A1 and add Zillow URLs starting from A2
   - Set up Google Sheets OAuth2 credentials in n8n
   - Replace YOUR_SPREADSHEET_ID with your actual Google Sheet ID and YOUR_GOOGLE_SHEETS_CREDENTIAL_ID with your credential ID
4. Run & Test:
   - Add 1-2 test Zillow URLs in Sheet1
   - Click "Execute workflow" and check the results in the Results tab

🧰 How to Customize
- **Add more fields**: Extend the parsing logic in the "Parse Zillow Data" node to capture additional data (bedrooms, bathrooms, square footage)
- **Filtering**: Add conditions to skip certain properties or price ranges
- **Rate Limiting**: Insert a Wait node between requests if processing many URLs
- **Error Handling**: Add error branches to handle failed scrapes gracefully
- **Scheduling**: Replace the Manual Trigger with a Schedule Trigger for automated daily/weekly runs

📊 Use Cases
- **Investment Analysis**: Track property prices and Zestimates over time
- **Market Research**: Analyze listing trends in specific neighborhoods
- **Portfolio Management**: Monitor properties for sale in target areas
- **Competitive Analysis**: Compare similar properties across locations
- **Lead Generation**: Build databases of properties matching specific criteria

📈 Performance & Limits
- **Single Property**: ~5-10 seconds per URL
- **Batch of 10**: 1-2 minutes typical
- **Large Sets (50+)**: 5-10 minutes depending on Scrape.do credits
- **API Calls**: 1 Scrape.do request per URL (10 credits with super=true)
- **Reliability**: 95%+ success rate with premium proxies

🧩 Troubleshooting

| Problem | Solution |
|---|---|
| API error 400 | Check your Scrape.do token and credits |
| URL showing "undefined" | Verify the Google Sheet column name is "URLs" (capital U) |
| No data parsed | Check whether Zillow changed their HTML structure |
| Permission denied | Re-authenticate Google Sheets OAuth2 in n8n |
| 50000 character error | Verify the Parse Zillow Data code is extracting fields, not returning raw HTML |
| Price shows HTML/CSS | Update the price extraction regex in the Parse Zillow Data node |

🤝 Support & Community
- Scrape.do Documentation
- Scrape.do Dashboard
- Scrape.do Zillow Scraping Guide
- n8n Forum
- n8n Docs

🎯 Final Notes
This workflow provides a repeatable foundation for extracting Zillow property data with Scrape.do and saving it to Google Sheets. You can extend it with:
- Historical tracking (append timestamps)
- Price change alerts (compare with previous scrapes)
- Multi-platform scraping (Redfin, Realtor.com)
- Integration with a CRM or reporting dashboards

Important: Scrape.do handles all anti-bot bypassing (PerimeterX, CAPTCHAs) automatically with rotating residential proxies, so you only pay for successful requests. Always use the super=true parameter for Zillow to ensure high success rates.
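To illustrate the kind of logic the "Parse Zillow Data" Code node applies, here is a hedged sketch that pulls a few fields out of the raw HTML with regular expressions aimed at the JSON-LD/embedded-JSON fragments Zillow pages typically contain. Zillow's markup changes frequently, and the property that holds the HTML depends on your HTTP Request node settings, so both the input field and the patterns are assumptions you will likely need to adjust.

```javascript
// Sketch: extract a few listing fields from the scraped HTML.
const input = $input.first().json;
const html = input.data || input.body || '';   // adjust to where your HTTP node puts the HTML

const addressMatch = html.match(/"streetAddress"\s*:\s*"([^"]+)"/);
const cityMatch = html.match(/"addressLocality"\s*:\s*"([^"]+)"/);
const stateMatch = html.match(/"addressRegion"\s*:\s*"([^"]+)"/);
const priceMatch = html.match(/"price"\s*:\s*"?\$?([\d,]+)/);
const zestimateMatch = html.match(/"zestimate"\s*:\s*(\d+)/);

return [{
  json: {
    URL: input.url || '',
    Price: priceMatch ? `$${priceMatch[1]}` : '',
    Address: addressMatch ? addressMatch[1] : '',
    City: cityMatch ? cityMatch[1] : '',
    State: stateMatch ? stateMatch[1] : '',
    Zestimate: zestimateMatch ? `$${Number(zestimateMatch[1]).toLocaleString()}` : '',
    'Scraped At': new Date().toISOString(),
  },
}];
```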
by WeblineIndia
Send daily applicant digest by role from Gmail to hiring managers with Google Gemini

This workflow automatically collects all new job application emails labeled "applicants" in your Gmail from the last 24 hours. Every day at 6:00 PM (Asia/Kolkata), it extracts structured details (name, email, phone, role, experience, skills, location, notice, summary) from each applicant using Gemini AI or OpenAI. It then groups applicants by role and manager, compiles a neat HTML table digest for each manager, and emails them a single summary — so hiring managers get everything they need, at a glance, in one place.

Who's It For
- Recruiters and hiring managers tired of digging through multiple application threads
- Small HR teams / agencies not yet on a full applicant tracking system
- Anyone wanting a consolidated, role-targeted applicant update each day
- Teams that want to automate candidate triage using Google Workspace and AI

How It Works
1. Schedule Trigger (6PM IST): Runs automatically at 18:00 India time.
2. Fetch Applicant Emails: Reads Gmail for emails labeled 'applicants' from the past 24 hours.
3. Prepare Email Text: Converts email content to plain text for reliable AI extraction.
4. Extract Applicant Details: Gemini/OpenAI extracts the applicant's info as structured JSON.
5. Assign Manager Emails: Routes each applicant to the correct manager via a role→email mapping or a fallback address (see the mapping sketch after the use case examples).
6. Group & Build HTML Tables: Organizes applicants by manager and role and builds summary tables.
7. Send Digest to Managers: Sends each manager one HTML summary email for their new applicants.

How to Set Up
1. Create/verify the Gmail label applicants and set up filters to route job emails there.
2. Import the workflow and use your Google/Gmail and Gemini/OpenAI accounts as credentials.
3. Configure connections:
   - Gmail with OAuth2 (IMAP not required; uses the Gmail API)
   - Gemini or OpenAI API key for extraction
4. Set the role→manager mapping in the "Assign Manager Emails" node (just edit the map!).
5. Adjust the time / defaults: edit the schedule and fallback email if you wish.
6. Test it: send yourself a test application, label it, and check the workflow logs.

Requirements
- Gmail account (with OAuth2 enabled and the 'applicants' label set up)
- Gemini or OpenAI API key for structured AI extraction
- n8n instance (self-hosted or cloud)
- SMTP credentials (if using direct email instead of the Gmail node)
- At least one valid hiring manager email mapped to a role

How to Customize the Workflow
- Centralize config with a Set node (label name, fallback/manager email, model name, schedule).
- Add attachment-to-text conversion for applications with resume attachments.
- Normalize role names in the mapping code for more robust routing.
- Enable additional delivery: Slack, Teams, a Google Sheets log, or an extra Cron trigger for mid-day urgents.
- Refine the AI extraction prompt for specific fields (add portfolio URL, etc.).
- Change the schedule for daily, weekly, or per-role timing.

Add‑Ons / Extensions
- **Resume Text Extraction:** Add PDF/DOCX-to-text parsing for attachment-only applications.
- **ChatOps:** Send the summary to Slack or Teams channels along with/instead of email.
- **Applicant Logging:** Auto-log every applicant/action into Google Sheets, Notion, or Airtable.
- **Multi-timezone:** Duplicate/modify the Cron trigger for different manager regions or urgency levels.

Use Case Examples
- **Tech Hiring:** Java, Python, and Frontend candidates are automatically routed to their respective leads.
- **Small Agency:** All applications summarized for reviewers, with per-role breakdowns.
- **HR Operations:** Daily rollups sent before the hiring sync, facilitating fast decision-making.
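For the "Assign Manager Emails" step, a minimal sketch of the role→manager map with role normalization and a fallback address, as described above. The role keys and email addresses are placeholders — edit the map for your own team.

```javascript
// Sketch: route each applicant to a manager based on the extracted role.
const roleToManager = {
  'java developer': 'java-lead@example.com',
  'python developer': 'python-lead@example.com',
  'frontend developer': 'frontend-lead@example.com',
};
const FALLBACK_MANAGER = 'hr@example.com';

return items.map((item) => {
  // Normalize so "Java Developer" and "java developer" route the same way.
  const role = (item.json.role || '').trim().toLowerCase();
  item.json.managerEmail = roleToManager[role] || FALLBACK_MANAGER;
  return item;
});
```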
Common Troubleshooting

| Issue | Possible Cause | Solution |
|---|---|---|
| No emails processed | No 'applicants' label or wrong time window | Check Gmail filters and adjust the search query in the fetch node |
| All digests go to the fallback manager | Incorrect or missing role → manager mapping | Normalize role text in the assignment node, expand the map |
| AI extraction returns bad/missing JSON | Wrong prompt, high temperature, or missing field names | Tighten the prompt, lower the temperature, check the example response |
| Duplicate/old emails appear | Date filter not correct | Use 'newer_than:1d' and keep 'mark as read' in the email node |
| SMTP/Gmail send errors | Auth problem, quota, or missing app password | Use OAuth2, check daily send caps and app password settings |
| Blank or partially filled summary table | AI unable to parse a poorly formatted/empty email | Improve sender email consistency, add fallback handling |
| Attachments not processed | No attachment extraction node | Add attachment-to-text parsing before the AI node |

Need Help?
If you get stuck, need help customizing a mapping or adding nodes, or want to integrate extra steps (e.g. resume text, Slack), just ask! We're happy to guide you step by step, review your workflow, or help you troubleshoot any errors.

Contact WeblineIndia — Your n8n Automation partner!