by vinci-king-01
## Medical Research Tracker with Matrix and Pipedrive

> ⚠️ **COMMUNITY TEMPLATE DISCLAIMER:** This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow automatically monitors selected government and healthcare-policy websites, extracts newly published or updated policy documents, logs them as deals in a Pipedrive pipeline, and announces critical changes in a Matrix room. It gives healthcare administrators and policy analysts a near real-time view of policy developments without manual web checks.

### Prerequisites

- n8n instance (self-hosted or n8n Cloud)
- ScrapeGraphAI community node installed
- Active Pipedrive account with at least one pipeline
- Matrix account and an accessible room for notifications
- Basic knowledge of n8n credential setup

### Required Credentials

- **ScrapeGraphAI API Key** – enables the scraping engine
- **Pipedrive OAuth2 / API Token** – creates and updates deals
- **Matrix Credentials** – homeserver URL, user, access token (or password)

### Specific Setup Requirements

| Variable | Description | Example |
|----------|-------------|---------|
| POLICY_SITES | Comma-separated list of URLs to scrape | https://health.gov/policies,https://who.int/proposals |
| PD_PIPELINE_ID | Pipedrive pipeline where deals are created | 5 |
| PD_STAGE_ID_ALERT | Stage ID for "Review Needed" | 17 |
| MATRIX_ROOM_ID | Room to send alerts (incl. leading !) | !policy:matrix.org |

Edit the initial Set node to provide these values before running.

### How it works

Key steps:

1. **Scheduled Trigger** – runs every 6 hours (configurable) to start the monitoring cycle.
2. **Code (URL List Builder)** – generates an array from POLICY_SITES for downstream batching.
3. **SplitInBatches** – iterates through each policy URL individually.
4. **ScrapeGraphAI** – scrapes page titles, publication dates, and summary paragraphs.
5. **If (New vs Existing)** – compares the scraped hash with the last run; continues only for fresh content.
6. **Merge (Aggregate Results)** – collects all "new" policies into a single payload.
7. **Set (Deal Formatter)** – maps scraped data to Pipedrive deal fields.
8. **Pipedrive Node** – creates or updates a deal per policy item.
9. **Matrix Node** – posts a formatted alert message in the specified Matrix room.

### Set up steps

Setup time: 15–20 minutes

1. **Install Community Node** – in n8n, go to Settings → Community Nodes → Install and search for ScrapeGraphAI.
2. **Add Credentials** – create new credentials for ScrapeGraphAI, Pipedrive, and Matrix under Credentials.
3. **Configure Environment Variables** – open the Set (Initial Config) node and replace the placeholders (POLICY_SITES, PD_PIPELINE_ID, etc.) with your values.
4. **Review Schedule** – double-click the Schedule Trigger node to adjust the interval if needed.
5. **Activate Workflow** – click Activate. The workflow will run at the next scheduled interval.
6. **Verify Outputs** – after the first run, check Pipedrive for new deals and the Matrix room for alert messages.

### Node Descriptions

Core workflow nodes:

- **stickyNote** – provides an at-a-glance description of the workflow logic directly on the canvas.
- **scheduleTrigger** – fires the workflow periodically (default: every 6 hours).
- **code (URL List Builder)** – splits the POLICY_SITES variable into an array.
- **splitInBatches** – ensures each URL is processed individually to avoid timeouts.
- **scrapegraphAi** – parses HTML and extracts policy metadata using XPath/CSS selectors.
- **if (New vs Existing)** – uses hashing to ignore unchanged pages (see the sketch at the end of this description).
- **merge** – combines all new items so they can be processed in bulk.
- **set (Deal Formatter)** – maps scraped fields to Pipedrive deal properties.
- **pipedrive** – creates or updates deals representing each policy update.
- **matrix** – sends formatted messages to a Matrix room for team visibility.

Data flow:

scheduleTrigger → code → splitInBatches → scrapegraphAi → if → merge → set → pipedrive → matrix

### Customization Examples

**1. Add another data field (e.g., policy author)**

```
// Inside ScrapeGraphAI node → Selectors
{
  "title": "//h1/text()",
  "date": "//time/@datetime",
  "summary": "//p[1]/text()",
  "author": "//span[@class='author']/text()"  // new line
}
```

**2. Switch notifications from Matrix to Email**

```
// Replace the Matrix node with "Send Email"
{
  "to": "policy-team@example.com",
  "subject": "New Healthcare Policy Detected: {{$json.title}}",
  "text": "Summary:\n{{$json.summary}}\n\nRead more at {{$json.url}}"
}
```

### Data Output Format

The workflow outputs structured JSON data for each new policy article:

```json
{
  "title": "Affordable Care Expansion Act – 2024",
  "url": "https://health.gov/policies/acea-2024",
  "date": "2024-06-14T09:00:00Z",
  "summary": "Proposes expansion of coverage to rural areas...",
  "source": "health.gov",
  "hash": "2d6f1c8e3b..."
}
```

### Troubleshooting

Common issues:

- **ScrapeGraphAI returns empty objects** – verify the selectors match the current HTML structure; inspect the site with developer tools and update the node configuration.
- **Duplicate deals appear in Pipedrive** – ensure the "Find or Create" option is enabled in the Pipedrive node, using the page hash or URL as a unique key.

Performance tips:

- Limit POLICY_SITES to under 50 URLs per run to avoid hitting rate limits.
- Increase the Schedule Trigger interval if you notice ScrapeGraphAI rate-limiting.

Pro tips:

- Store historical scraped data in a database node for long-term audit trails.
- Use the n8n Workflow Executions page to replay failed runs without waiting for the next schedule.
- Add an Error Trigger node to emit alerts if scraping or API calls fail.
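Returning to the **If (New vs Existing)** step: the change detection can be sketched as a Code node that hashes the scraped fields and keeps a seen-hash map in workflow static data. This is a minimal illustration under assumptions (the field names, the use of `$getWorkflowStaticData` for persistence, and access to Node's built-in `crypto` module), not the template's exact implementation:

```javascript
// Minimal sketch: hash-based change detection in an n8n Code node.
// Assumes each item carries the scraped url/title/date/summary fields.
const crypto = require('crypto'); // requires builtins to be allowed in your instance

const staticData = $getWorkflowStaticData('global');
staticData.seenHashes = staticData.seenHashes || {};

const fresh = [];
for (const item of $input.all()) {
  const { url, title, date, summary } = item.json;
  const hash = crypto
    .createHash('sha256')
    .update(`${title}|${date}|${summary}`)
    .digest('hex');

  // Pass the item along only if its content changed since the last run
  if (staticData.seenHashes[url] !== hash) {
    staticData.seenHashes[url] = hash;
    fresh.push({ json: { ...item.json, hash } });
  }
}
return fresh;
```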
by Onur
## Lead Sourcing by Job Posts For Outreach With Scrape.do API, OpenAI & Google Sheets

### Overview

This n8n workflow automates the complete lead generation process by scraping job postings from Indeed, enriching company data via Apollo.io, identifying decision-makers, and generating personalized LinkedIn outreach messages using OpenAI. It integrates with Scrape.do for reliable web scraping, Apollo.io for B2B data enrichment, OpenAI for AI-powered personalization, and Google Sheets for centralized data storage.

**Perfect for:** Sales teams, recruiters, business development professionals, and marketing agencies looking to automate their outbound prospecting pipeline.

### Workflow Components

#### 1. ⏰ Schedule Trigger

| Property | Value |
|----------|-------|
| Type | Schedule Trigger |
| Purpose | Automatically initiates the workflow on a recurring schedule |
| Frequency | Weekly (every Monday) |
| Time | 00:00 UTC |

**Function:** Ensures consistent, hands-off lead generation by running the pipeline automatically without manual intervention.

#### 2. 🔍 Scrape.do Indeed API

| Property | Value |
|----------|-------|
| Type | HTTP Request (GET) |
| Purpose | Scrapes job listings from Indeed via the Scrape.do proxy API |
| Endpoint | https://api.scrape.do |
| Output Format | Markdown |

Request parameters:

| Parameter | Value | Description |
|-----------|-------|-------------|
| token | API Token | Scrape.do authentication |
| url | Indeed Search URL | Target job search page |
| super | true | Uses residential proxies |
| geoCode | us | US-based content |
| render | true | JavaScript rendering enabled |
| device | mobile | Mobile viewport for cleaner HTML |
| output | markdown | Lightweight text output |

**Function:** Fetches Indeed job listings with anti-bot bypass, returning clean markdown for easy parsing.

#### 3. 📋 Parse Indeed Jobs

| Property | Value |
|----------|-------|
| Type | Code Node (JavaScript) |
| Purpose | Extracts structured job data from markdown |
| Mode | Run once for all items |

Extracted fields:

| Field | Description | Example |
|-------|-------------|---------|
| jobTitle | Position title | "Senior Data Engineer" |
| jobUrl | Indeed job link | "https://indeed.com/viewjob?jk=abc123" |
| jobId | Indeed job identifier | "abc123" |
| companyName | Hiring company | "Acme Corporation" |
| location | City, State | "San Francisco, CA" |
| salary | Pay range | "$120,000 - $150,000" |
| jobType | Employment type | "Full-time" |
| source | Data source | "Indeed" |
| dateFound | Scrape date | "2025-01-15" |

**Function:** Parses the markdown using regex patterns, filters invalid entries, and deduplicates by company name. A minimal sketch of this kind of parsing appears below.

#### 4. 📊 Add New Company (Google Sheets)

| Property | Value |
|----------|-------|
| Type | Google Sheets Node |
| Purpose | Stores parsed job postings for tracking |
| Operation | Append rows |
| Target Sheet | "Add New Company" |

**Function:** Creates a historical record of all discovered job postings and companies for pipeline tracking.
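For reference, the Parse Indeed Jobs step (component 3 above) can be sketched as a Code node like the following. The regex patterns and surrounding-text heuristics are illustrative assumptions; the actual patterns depend on the markdown Scrape.do returns for your search page:

```javascript
// Illustrative sketch of parsing Indeed markdown in an n8n Code node.
// The link/company patterns are assumptions about Scrape.do's markdown output.
const markdown = $input.first().json.data || '';
const seen = new Set();
const jobs = [];

// Match markdown links to job pages, e.g. [Senior Data Engineer](https://indeed.com/viewjob?jk=abc123)
const jobLink = /\[([^\]]+)\]\((https?:\/\/[^)]*viewjob\?jk=([a-z0-9]+)[^)]*)\)/gi;
let m;
while ((m = jobLink.exec(markdown)) !== null) {
  const [, jobTitle, jobUrl, jobId] = m;

  // Company name extraction is site-dependent; here we assume it appears
  // on its own line shortly after the job link.
  const tail = markdown.slice(m.index, m.index + 300);
  const company = (tail.match(/\n([A-Z][^\n|]{2,60})\n/) || [])[1]?.trim();

  if (!company || seen.has(company)) continue; // filter invalid + dedupe by company
  seen.add(company);

  jobs.push({
    json: {
      jobTitle,
      jobUrl,
      jobId,
      companyName: company,
      source: 'Indeed',
      dateFound: new Date().toISOString().slice(0, 10),
    },
  });
}
return jobs;
```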
#### 5. 🏢 Apollo Organization Search

| Property | Value |
|----------|-------|
| Type | HTTP Request (POST) |
| Purpose | Enriches company data via the Apollo.io API |
| Endpoint | https://api.apollo.io/v1/organizations/search |
| Authentication | HTTP Header Auth (x-api-key) |

Request body:

```json
{
  "q_organization_name": "Company Name",
  "page": 1,
  "per_page": 1
}
```

Response fields:

| Field | Description |
|-------|-------------|
| id | Apollo organization ID |
| name | Official company name |
| website_url | Company website |
| linkedin_url | LinkedIn company page |
| industry | Business sector |
| estimated_num_employees | Company size |
| founded_year | Year established |
| city, state, country | Location details |
| short_description | Company overview |

**Function:** Retrieves comprehensive company intelligence including LinkedIn profiles, industry classification, and employee count.

#### 6. 📤 Extract Apollo Org Data

| Property | Value |
|----------|-------|
| Type | Code Node (JavaScript) |
| Purpose | Parses the Apollo response and merges it with the original data |
| Mode | Run once for each item |

**Function:** Extracts the relevant fields from the Apollo API response and combines them with the job posting data for downstream processing.

#### 7. 👥 Apollo People Search

| Property | Value |
|----------|-------|
| Type | HTTP Request (POST) |
| Purpose | Finds decision-makers at target companies |
| Endpoint | https://api.apollo.io/v1/mixed_people/search |
| Authentication | HTTP Header Auth (x-api-key) |

Request body:

```json
{
  "organization_ids": ["apollo_org_id"],
  "person_titles": [
    "CTO",
    "Chief Technology Officer",
    "VP Engineering",
    "Head of Engineering",
    "Engineering Manager",
    "Technical Director",
    "CEO",
    "Founder"
  ],
  "page": 1,
  "per_page": 3
}
```

Response fields:

| Field | Description |
|-------|-------------|
| first_name | Contact first name |
| last_name | Contact last name |
| title | Job title |
| email | Email address |
| linkedin_url | LinkedIn profile URL |
| phone_number | Direct phone |

**Function:** Identifies key stakeholders and decision-makers based on configurable title filters.

#### 8. 📝 Format Leads

| Property | Value |
|----------|-------|
| Type | Code Node (JavaScript) |
| Purpose | Structures lead data for outreach |
| Mode | Run once for all items |

**Function:** Combines person data with company context, creating comprehensive lead profiles ready for personalization.

#### 9. 🤖 Generate Personalized Message (OpenAI)

| Property | Value |
|----------|-------|
| Type | OpenAI Node |
| Purpose | Creates custom LinkedIn connection messages |
| Model | gpt-4o-mini |
| Max Tokens | 150 |
| Temperature | 0.7 |

System prompt:

```
You are a professional outreach specialist. Write personalized LinkedIn connection request messages. Keep messages under 300 characters. Be friendly, professional, and mention a specific reason for connecting based on their role and company.
```

User prompt variables:

| Variable | Source |
|----------|--------|
| Name | $json.fullName |
| Title | $json.title |
| Company | $json.companyName |
| Industry | $json.industry |
| Job Context | $json.jobTitle |

**Function:** Generates unique, contextual outreach messages that reference specific hiring activity and company details.

#### 10. 🔗 Merge Lead + Message

| Property | Value |
|----------|-------|
| Type | Code Node (JavaScript) |
| Purpose | Combines lead data with the generated message |
| Mode | Run once for each item |

**Function:** Merges the OpenAI response with the lead profile, creating the final enriched record. A sketch of this merge step follows.
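A minimal sketch of the Merge Lead + Message Code node (component 10). The exact field names, and in particular the shape of the upstream OpenAI node's output (`message.content`), are assumptions for illustration:

```javascript
// Illustrative merge of the AI-generated message back into the lead record.
// Assumes the OpenAI node outputs { message: { content } } and the lead
// profile fields were passed through on the same item.
const item = $input.item.json;

return {
  json: {
    firstName: item.firstName,
    lastName: item.lastName,
    title: item.title,
    companyName: item.companyName,
    linkedinUrl: item.linkedinUrl,
    industry: item.industry,
    // Trim to LinkedIn's connection-request limit as a safety net
    personalizedMessage: (item.message?.content || '').trim().slice(0, 300),
    source: 'Indeed + Apollo',
    dateAdded: new Date().toISOString(),
  },
};
```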
#### 11. 💾 Save Leads to Sheet

| Property | Value |
|----------|-------|
| Type | Google Sheets Node |
| Purpose | Stores final lead data with personalized messages |
| Operation | Append rows |
| Target Sheet | "Leads" |

Data mapping:

| Column | Data |
|--------|------|
| First Name | Lead's first name |
| Last Name | Lead's last name |
| Title | Job title |
| Company | Company name |
| LinkedIn URL | Profile link |
| Country | Location |
| Industry | Business sector |
| Date Added | Timestamp |
| Source | "Indeed + Apollo" |
| Personalized Message | AI-generated outreach text |

**Function:** Creates an actionable lead database ready for outreach campaigns.

### Workflow Flow

```
⏰ Schedule Trigger
   │
   ▼
🔍 Scrape.do Indeed API ──► Fetches job listings with JS rendering
   │
   ▼
📋 Parse Indeed Jobs ──► Extracts company names, job details
   │
   ▼
📊 Add New Company ──► Saves to Google Sheets (Companies)
   │
   ▼
🏢 Apollo Org Search ──► Enriches company data
   │
   ▼
📤 Extract Apollo Org Data ──► Parses API response
   │
   ▼
👥 Apollo People Search ──► Finds decision-makers
   │
   ▼
📝 Format Leads ──► Structures lead profiles
   │
   ▼
🤖 Generate Personalized Message ──► AI creates custom outreach
   │
   ▼
🔗 Merge Lead + Message ──► Combines all data
   │
   ▼
💾 Save Leads to Sheet ──► Final storage (Leads)
```

### Configuration Requirements

API keys & credentials:

| Credential | Purpose | Where to Get |
|------------|---------|--------------|
| Scrape.do API Token | Web scraping with anti-bot bypass | scrape.do/dashboard |
| Apollo.io API Key | B2B data enrichment | apollo.io/settings/integrations |
| OpenAI API Key | AI message generation | platform.openai.com |
| Google Sheets OAuth2 | Data storage | n8n credentials setup |

n8n credential setup:

| Credential Type | Configuration |
|-----------------|---------------|
| HTTP Header Auth (Apollo) | Header: x-api-key, Value: your Apollo API key |
| OpenAI API | API Key: your OpenAI API key |
| Google Sheets OAuth2 | Complete the OAuth flow with Google |

### Key Features

**🔍 Intelligent Job Scraping**

- **Anti-bot bypass:** Residential proxy rotation via Scrape.do
- **JavaScript rendering:** Full headless browser for dynamic content
- **Mobile optimization:** Cleaner HTML with a mobile viewport
- **Markdown output:** Lightweight, easy-to-parse format

**🏢 B2B Data Enrichment**

- **Company intelligence:** Industry, size, location, LinkedIn
- **Decision-maker discovery:** Title-based filtering
- **Contact information:** Email, phone, LinkedIn profiles
- **Real-time data:** Fresh information from Apollo.io

**🤖 AI-Powered Personalization**

- **Contextual messages:** Reference specific hiring activity
- **Character limit:** Optimized for LinkedIn (300 chars)
- **Tuned temperature:** Balanced creativity and consistency
- **Role-specific:** Tailored to the recipient's title and company

**📊 Automated Data Management**

- **Dual-sheet storage:** Companies and Leads kept separate
- **Timestamp tracking:** Historical records
- **Deduplication:** Prevents duplicate entries
- **Ready for export:** CSV-compatible format

### Use Cases

**🎯 Sales Prospecting**

- Identify companies actively hiring in your target market
- Find decision-makers at companies investing in growth
- Generate personalized cold outreach at scale
- Track the pipeline from discovery to contact

**👥 Recruiting & Talent Acquisition**

- Monitor competitor hiring patterns
- Identify companies building specific teams
- Connect with hiring managers directly
- Build talent pipeline relationships

**📈 Market Intelligence**

- Track industry hiring trends
- Monitor competitor expansion signals
- Identify emerging market opportunities
- Benchmark salary ranges by role

**🤝 Partnership Development**

- Find companies investing in complementary areas
- Identify potential integration partners
- Connect with technical leadership
- Build a strategic relationship pipeline
### Technical Notes

| Specification | Value |
|---------------|-------|
| Processing Time | 2-5 minutes per run (depending on job count) |
| Jobs per Run | ~25 unique companies |
| API Calls per Run | 1 Scrape.do + 25 Apollo Org + 25 Apollo People + ~75 OpenAI |
| Data Accuracy | 90%+ for company matching |
| Success Rate | 99%+ with proper error handling |

### Rate Limits to Consider

| Service | Free Tier Limit | Recommendation |
|---------|-----------------|----------------|
| Scrape.do | 1,000 credits/month | ~40 runs/month |
| Apollo.io | 100 requests/day | Add Wait nodes if needed |
| OpenAI | Based on usage | Monitor costs (~$0.01-0.05/run) |
| Google Sheets | 300 requests/minute | No issues expected |

### Setup Instructions

**Step 1: Import the Workflow**

1. Copy the JSON workflow configuration.
2. In n8n: Workflows → Import from JSON.
3. Paste the configuration and save.

**Step 2: Configure Scrape.do**

1. Sign up at scrape.do.
2. Navigate to Dashboard → API Token and copy your token.
3. The token is embedded as a URL query parameter (already configured).

To customize the search, change the url parameter in the "Scrape.do Indeed API" node:

- q=data+engineer (search term)
- l=Remote (location)
- fromage=7 (last 7 days)

**Step 3: Configure Apollo.io**

1. Sign up at apollo.io.
2. Go to Settings → Integrations → API Keys and create a new API key.
3. In n8n: Credentials → Add Credential → Header Auth, with Name x-api-key and Value set to your Apollo API key.
4. Select this credential in both Apollo HTTP nodes.

**Step 4: Configure OpenAI**

1. Go to platform.openai.com and create a new API key.
2. In n8n: Credentials → Add Credential → OpenAI, and paste the API key.
3. Select the credential in the "Generate Personalized Message" node.

**Step 5: Configure Google Sheets**

1. Create a new Google Spreadsheet with two sheets:
   - Sheet 1: "Add New Company" with columns: companyName | jobTitle | jobUrl | location | salary | source | postedDate
   - Sheet 2: "Leads" with columns: First Name | Last Name | Title | Company | LinkedIn URL | Country | Industry | Date Added | Source | Personalized Message
2. Copy the Sheet ID from the URL.
3. In n8n: Credentials → Add Credential → Google Sheets OAuth2.
4. Update both Google Sheets nodes with your Sheet ID.

**Step 6: Test and Activate**

1. **Manual test:** Click the "Execute Workflow" button.
2. **Verify each node:** Check the outputs step by step.
3. **Review data:** Confirm the data appears in Google Sheets.
4. **Activate:** Toggle the workflow to "Active".

### Error Handling

Common issues:

| Issue | Cause | Solution |
|-------|-------|----------|
| "Invalid character: " | Empty/malformed company name | Check the Parse Indeed Jobs output |
| "Node does not have credentials" | Credential not linked | Open the node → select the credential |
| Empty parse results | Indeed HTML structure changed | Check the Scrape.do raw output |
| Apollo rate limit (429) | Too many requests | Add a 5-10s Wait node between calls |
| OpenAI timeout | Too many tokens | Reduce the batch size or max_tokens |
| "Your request is invalid" | Malformed JSON body | Verify the expression syntax in the HTTP nodes |

Troubleshooting steps:

1. **Verify credentials:** Test each credential individually.
2. **Check node outputs:** Use "Execute Node" for debugging.
3. **Monitor API usage:** Check the Apollo and OpenAI dashboards.
4. **Review logs:** Check the n8n execution history for details.
5. **Test with a sample:** Use a known company name to verify Apollo.

Recommended error-handling additions for production use:

- An IF node after Apollo Org Search to handle empty results
- An Error Workflow trigger for notifications
- Wait nodes between API calls for rate limiting
- Retry logic for transient failures (sketched below)
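To illustrate the retry suggestion, a Code node could wrap the Apollo call with exponential backoff on 429 responses. This is a hedged sketch: it assumes a Node 18+ runtime (global `fetch`) and an Apollo key exposed as an environment variable, and it mirrors the organization-search request body shown earlier:

```javascript
// Sketch: exponential backoff for Apollo 429s inside an n8n Code node.
// Assumptions: Node 18+ (global fetch) and APOLLO_API_KEY set as an env var.
async function apolloSearch(body, attempts = 4) {
  for (let i = 0; i < attempts; i++) {
    const res = await fetch('https://api.apollo.io/v1/organizations/search', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'x-api-key': $env.APOLLO_API_KEY, // assumption: key stored as an env var
      },
      body: JSON.stringify(body),
    });
    if (res.status !== 429) return res.json();
    // Back off: 5s, 10s, 20s... before retrying
    await new Promise((resolve) => setTimeout(resolve, 5000 * 2 ** i));
  }
  throw new Error('Apollo rate limit persisted after retries');
}

const result = await apolloSearch({
  q_organization_name: $json.companyName,
  page: 1,
  per_page: 1,
});
return [{ json: result }];
```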
### Performance Specifications

| Metric | Value |
|--------|-------|
| Execution Time | 2-5 minutes per scheduled run |
| Jobs Discovered | ~25 per Indeed page |
| Leads Generated | 1-3 per company (based on title matches) |
| Message Quality | Professional, contextual, <300 chars |
| Data Freshness | Real-time from Indeed + Apollo |
| Storage Format | Google Sheets (unlimited rows) |

### API Reference

**Scrape.do API**

| Endpoint | Method | Purpose |
|----------|--------|---------|
| https://api.scrape.do | GET | Direct URL scraping |

Documentation: scrape.do/documentation

**Apollo.io API**

| Endpoint | Method | Purpose |
|----------|--------|---------|
| /v1/organizations/search | POST | Company lookup |
| /v1/mixed_people/search | POST | People search |

Documentation: apolloio.github.io/apollo-api-docs

**OpenAI API**

| Endpoint | Method | Purpose |
|----------|--------|---------|
| /v1/chat/completions | POST | Message generation |

Documentation: platform.openai.com
by Amirul Hakimi
## 🚀 Enrich CRM Leads with LinkedIn Company Data Using AI

### Who's it for

Sales teams, marketers, and business development professionals who need to automatically enrich their CRM records with detailed company information from LinkedIn profiles. Perfect for anyone doing B2B outreach who wants to personalize their messaging at scale.

### What it does

This workflow transforms bare-bones lead records into rich, personalized prospect profiles by:

- Automatically scraping LinkedIn company profiles
- Using AI (GPT-4) to extract key business intelligence
- Generating 15+ email-ready personalization variables
- Updating your CRM with structured, actionable data

The workflow pulls company overviews, products/services, funding information, and recent posts, and converts everything into natural-language variables that can be dropped directly into your outreach templates.

### How it works

1. **Trigger:** The workflow starts when a new lead is added to Airtable (or on a schedule).
2. **Fetch:** Retrieves the lead record containing the LinkedIn company URL.
3. **Scrape:** Pulls the raw HTML from the company's LinkedIn profile.
4. **Clean:** Strips HTML tags and formats the content for AI processing (see the sketch below).
5. **Analyze:** GPT-4 extracts structured company intelligence (overview, products, market presence, recent posts).
6. **Transform:** Converts the analysis into 15+ email-ready variables with natural phrasing.
7. **Update:** Writes the enriched data back to your CRM.

### Setup Requirements

- **Airtable account** (free tier works fine)
- **OpenAI API key** (GPT-4o-mini recommended for cost-effectiveness)
- **LinkedIn company URLs** stored in your CRM
- **5 minutes** for initial configuration

### How to set up

1. **Configure the Airtable connection**
   - Replace YOUR_AIRTABLE_BASE_ID with your base ID
   - Replace YOUR_TABLE_ID with your leads table ID
   - Ensure your table has a "LinkedIn Organization URL" field
   - Add your Airtable API credentials
2. **Add OpenAI credentials**
   - Click on both OpenAI nodes and add your OpenAI API key (GPT-4o-mini is recommended: cost-effective and fast)
3. **Set up the trigger**
   - Add a trigger node (Schedule, Webhook, or Airtable trigger)
   - Configure it to run when new leads are added, or on a daily schedule
4. **Test the workflow**
   - Add a test lead with a LinkedIn company URL, execute the workflow, and verify the enriched fields land in your CRM
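A minimal sketch of the Clean step (step 4) as an n8n Code node. This is illustrative: the field names (`data`, `cleanText`) and the length cap are assumptions, and a production cleaner would handle entities and edge cases more carefully:

```javascript
// Illustrative HTML-stripping Code node for the Clean step.
// Assumes the HTTP Request node put the raw page HTML in `data`.
const html = $json.data || '';

const cleanText = html
  .replace(/<script[\s\S]*?<\/script>/gi, ' ') // drop scripts
  .replace(/<style[\s\S]*?<\/style>/gi, ' ')   // drop styles
  .replace(/<[^>]+>/g, ' ')                    // strip remaining tags
  .replace(/&nbsp;|&amp;|&quot;|&#\d+;/g, ' ') // crude entity cleanup
  .replace(/\s+/g, ' ')                        // collapse whitespace
  .trim()
  .slice(0, 12000); // keep within the model's context budget (assumed limit)

return [{ json: { cleanText } }];
```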
by Cheng Siong Chin
### Introduction

Automates flight deal discovery and intelligent analysis for travel bloggers and deal hunters. It scrapes live pricing, enriches it with weather data, applies AI evaluation, and auto-publishes to WordPress, eliminating manual research and accelerating content delivery.

### How It Works

A user submits a route via a form; the workflow scrapes real-time flight prices and weather data, the AI analyzes deal quality in light of weather conditions, the results are formatted and published to WordPress, and a Slack notification is sent. The process is fully automated from input to publication.

### Workflow Template

Form Input → Extract Data → Scrape Flight Prices → Extract Pricing → Fetch Weather → Parse Weather → Prepare AI Input → AI Analysis → Parse Output → Format Results → Publish WordPress → Slack Alert → User Response

### Setup Instructions

- **Form setup:** Configure the user input fields for flight routes and preferences.
- **APIs:** Connect the Google Flights scraping endpoint, weather API credentials, and an OpenAI/chat-model API key.
- **Publishing:** Set the WordPress credentials, target blog category, and Slack webhook URL.
- **AI configuration:** Define the analysis prompts, output structure, and parser rules.

### Workflow Steps

1. **Data collection:** The form captures the route; the workflow scrapes Google Flights pricing and fetches destination weather via API.
2. **AI processing:** Enriches the flight data with weather context and analyzes deal quality using an OpenAI/chat model with structured output parsing (an example schema is sketched below).
3. **Publishing:** Formats the analysis results, creates the WordPress post, sends the Slack notification, and delivers the response to the user.

### Prerequisites

n8n instance, Google Flights access, weather API key, OpenAI or a compatible AI service, WordPress site with API access, Slack workspace.

### Use Cases

Travel blog automation, flight deal newsletters, price comparison services, seasonal travel planning, destination weather analysis, automated social media content.

### Customization

Modify the AI analysis criteria, adjust the weather impact weighting, customize the WordPress post templates, add email distribution, integrate additional data sources, or expand to hotel/rental deals.

### Benefits

Eliminates manual price checking, combines multiple data sources automatically, delivers AI-enhanced insights, accelerates the publishing workflow, scales across unlimited routes, and provides weather-aware recommendations.
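To illustrate the structured output parsing step, the output parser could be configured with a schema along these lines. The field names and scale here are assumptions for demonstration, not the template's actual schema:

```json
{
  "type": "object",
  "properties": {
    "dealScore": { "type": "number", "description": "Deal quality from 0 to 100" },
    "verdict": { "type": "string", "enum": ["great", "good", "average", "skip"] },
    "priceAssessment": { "type": "string", "description": "Why the fare is or isn't a deal" },
    "weatherImpact": { "type": "string", "description": "How destination weather affects the trip" },
    "recommendation": { "type": "string", "description": "One-sentence call to action for readers" }
  },
  "required": ["dealScore", "verdict", "recommendation"]
}
```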
by Intuz
This n8n template from Intuz provides a complete and automated solution for identifying high-intent leads from LinkedIn job postings and automatically generating personalized outreach emails.

**Disclaimer:** Community nodes are used in this workflow.

### Who's this workflow for?

- B2B sales teams & SDRs
- Recruitment agencies & tech recruiters
- Startup founders
- Growth marketing teams

### How it works

1. **Scrape hiring signals:** The workflow starts by using an Apify scraper to find companies actively hiring for specific roles on LinkedIn (e.g., "ML Engineer").
2. **Filter & qualify companies:** It automatically filters the results against your criteria (e.g., company size, industry) to create a high-quality list of target accounts; a sketch of such a filter appears at the end of this description.
3. **Find decision-makers:** For each qualified company, it uses Apollo.io to find key decision-makers (VPs, Directors, etc.) and enriches their profiles with verified email addresses via your Apollo API key.
4. **Build a lead list:** All the enriched lead data (contact name, title, email, company info) is systematically added to a Google Sheet.
5. **Generate AI-powered emails:** The workflow then feeds each lead's data to a Google Gemini AI model, which drafts a unique, personalized cold email referencing the specific job the company is hiring for.
6. **Complete the outreach list:** Finally, the AI-generated subject line and email body are saved back to the Google Sheet, leaving you with a fully prepared, hyper-targeted outreach campaign.

### Setup Instructions

1. **Apify configuration:** Connect your Apify account in the Run the LinkedIn Job Scraper node. You'll need an Apify scraper; we have used this scraper. In the Custom Body field, paste the URL of your target LinkedIn Jobs search query.
2. **Data enrichment:** Connect the API of a data provider such as Clay, Hunter, or Apollo using HTTP Header Auth in the Get Targeted Personnel and Email Finder nodes.
3. **Google Gemini AI:** Connect your Google Gemini (or PaLM) API account in the Google Gemini Chat Model node.
4. **Google Sheets setup:** Connect your Google Sheets account, create a spreadsheet, and update the Document ID and Sheet Name in the three Google Sheets nodes to match your own.
5. **Activate workflow:** Click "Execute workflow" to run the entire lead generation and email-writing process on demand.

### Connect with us

- Website: https://www.intuz.com/services
- Email: getstarted@intuz.com
- LinkedIn: https://www.linkedin.com/company/intuz
- Get started: https://n8n.partnerlinks.io/intuz

For custom workflow automation, click here: Get Started
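Returning to step 2 (Filter & Qualify Companies), a minimal sketch of the qualification logic written as an n8n Code node. The thresholds, industries, and field names (`employeeCount`, `industry`) are assumptions to illustrate the idea; adjust them to your scraper's output:

```javascript
// Illustrative qualification filter: keep companies matching target criteria.
const TARGET_INDUSTRIES = ['Software Development', 'IT Services', 'AI'];

return $input.all().filter((item) => {
  const { employeeCount, industry } = item.json;

  // Size window is a placeholder for "SMB/mid-market" targeting
  const sizeOk = employeeCount >= 11 && employeeCount <= 500;
  const industryOk = TARGET_INDUSTRIES.some((target) =>
    (industry || '').toLowerCase().includes(target.toLowerCase())
  );
  return sizeOk && industryOk;
});
```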
by Club de Inteligencia Artificial Politécnico CIAP
## Telegram Appointment Scheduling Bot with n8n

### 📃 Description

Tired of managing appointments manually? This template transforms your Telegram account into a smart virtual assistant that handles the entire scheduling process for you, 24/7.

This workflow allows you to deploy a fully functional Telegram bot that not only schedules appointments but also checks real-time availability in your Google Calendar, logs a history in Google Sheets, and allows your clients to cancel or view their upcoming appointments. It's the perfect solution for professionals, small businesses, or anyone looking to automate their booking system professionally and effortlessly.

### ✨ Key Features

- **Complete appointment management:** Allows users to schedule, cancel, and list their future appointments.
- **Conflict prevention:** Integrates with Google Calendar to check availability before confirming a booking, eliminating the risk of double-booking.
- **Automatic logging:** Every confirmed appointment is saved to a row in Google Sheets, creating a perfect database for tracking and analysis.
- **Smart interaction:** The bot handles unrecognized commands and guides the user, ensuring a smooth experience.
- **Easy to adapt:** Connect your own accounts, customize the messages, and tailor it to your business needs in minutes.

### 🚀 Setup

Follow these steps to deploy your own instance of this bot:

1. **Prerequisites**
   - An n8n instance (Cloud or self-hosted)
   - A Telegram account
   - A Google account
2. **Telegram Bot**
   - Talk to @BotFather on Telegram, create a new bot using /newbot, and give it a name and a username.
   - Copy and save the API token it provides.
3. **Google Cloud & APIs**
   - Go to the Google Cloud Console and create a new project.
   - Enable the Google Calendar API and Google Sheets API.
   - Create OAuth 2.0 Client ID credentials, making sure to add your n8n instance's OAuth redirect URL.
   - Save the Client ID and Client Secret.
4. **Google Sheets**
   - Create a new spreadsheet and define the column headers in the first row, for example: id, Client Name, Date and Time, ISO Date.
5. **n8n**
   - Import the workflow JSON file into your n8n instance.
   - Set up the credentials: for Telegram, create a new credential and paste your bot's token; for Google Calendar and Google Sheets (OAuth2), create a credential with the Client ID and Client Secret from the Google Cloud Console.
   - Review the Google Calendar and Google Sheets nodes to select your correct calendar and spreadsheet.
   - Activate the workflow!

### 💬 Usage

Once the bot is running, you can interact with it using the following commands in Telegram:

- **To start the bot:** /start
- **To schedule a new appointment:** agendar YYYY-MM-DD HH:MM Your Full Name
- **To cancel an existing appointment:** cancelar YYYY-MM-DD HH:MM Your Full Name
- **To view your future appointments:** mis citas Your Full Name

### 👥 Authors

Jaren Pazmiño, President of the Polytechnic Artificial Intelligence Club (CIAP)
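For reference, the parsing behind the usage commands above could be sketched as follows in a Code node. The template's actual parsing may differ; this is a hedged illustration built from the documented command formats:

```javascript
// Illustrative parser for "agendar/cancelar YYYY-MM-DD HH:MM Full Name" commands.
const text = ($json.message?.text || '').trim();

const match = text.match(/^(agendar|cancelar)\s+(\d{4}-\d{2}-\d{2})\s+(\d{2}:\d{2})\s+(.+)$/i);
if (!match) {
  // "mis citas Full Name" lists appointments; anything else routes to the
  // "guide the user" branch for unrecognized commands.
  const isList = /^mis citas\s+/i.test(text);
  return [{ json: { command: isList ? 'list' : 'unknown', raw: text } }];
}

const [, command, date, time, clientName] = match;
return [{
  json: {
    command: command.toLowerCase(),   // "agendar" or "cancelar"
    isoDate: `${date}T${time}:00`,    // for the Google Calendar availability check
    clientName: clientName.trim(),
  },
}];
```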
by Fahmi Fahreza
## Weekly SEO Watchlist Audit to Google Sheets (Gemini + Decodo)

Sign up for Decodo HERE for a discount.

Automatically fetches page content, generates a compact SEO audit (score, issues, fixes), and writes both a per-URL summary and a normalized "All Issues" table to Google Sheets. Great for weekly monitoring and prioritization.

### Who's it for?

Content/SEO teams that want lightweight, scheduled audits of key pages with actionable next steps and spreadsheet reporting.

### How it works

1. A weekly trigger loads the Google Sheet of URLs.
2. Split in Batches processes each URL.
3. Decodo fetches the page content (markdown + status).
4. Gemini produces a strict JSON audit via the AI Chain + Output Parser.
5. Code nodes flatten the data for the two tabs (a sketch follows below).
6. Google Sheets nodes append the Summary and All Issues rows.
7. Split in Batches continues to the next URL.

### How to set up

1. Add credentials for Google Sheets, Decodo, and Gemini.
2. Set sheet_id and the sheet GIDs in the Set node.
3. Ensure the input sheet has a URL column.
4. Configure your Google Sheets tabs with headers matching each field being appended (e.g., URL, Decodo Score, Priority, etc.).
5. Adjust the schedule as needed.
6. Activate the workflow.
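A minimal sketch of the flattening step (step 5): one Code node turns the audit JSON into a single Summary row, and fans the issues out into one row each for the All Issues tab. The audit field names assumed here (`score`, `issues[].title`, etc.) are illustrative, not the template's exact schema:

```javascript
// Illustrative flattening of a per-URL audit object into sheet rows.
// Assumed audit shape: { url, score, issues: [{ title, priority, fix }] }
const audit = $json;

// Row for the "Summary" tab: one line per audited URL
const summaryRow = {
  json: {
    URL: audit.url,
    Score: audit.score,
    'Issue Count': (audit.issues || []).length,
    'Audited At': new Date().toISOString(),
  },
};

// Rows for the "All Issues" tab: one normalized line per issue
const issueRows = (audit.issues || []).map((issue) => ({
  json: {
    URL: audit.url,
    Issue: issue.title,
    Priority: issue.priority,
    'Suggested Fix': issue.fix,
  },
}));

// In the template these feed two separate Sheets nodes; here we return both
return [summaryRow, ...issueRows];
```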
by Growth AI
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## Website sitemap generator and visual tree creator

### Who's it for

Web developers, SEO specialists, UX designers, and digital marketers who need to analyze website structure, create visual sitemaps, or audit site architecture for optimization purposes.

### What it does

This workflow automatically generates a comprehensive sitemap from any website URL and creates an organized hierarchical structure in Google Sheets. It follows the website's sitemap to discover all pages, then organizes them by navigation levels (Level 1, Level 2, etc.) with proper parent-child relationships. The output can be further processed to create visual tree diagrams and mind maps.

### How it works

The workflow follows a five-step automation process:

1. **URL input:** Accepts a website URL via the chat interface.
2. **Site crawling:** Uses Firecrawl to discover all pages, following the website's sitemap only.
3. **Success validation:** Checks whether crawling was successful (some sites block external crawlers).
4. **Hierarchical organization:** Processes the URLs into a structured tree with proper level relationships (a minimal sketch appears at the end of this description).
5. **Google Sheets export:** Creates a formatted spreadsheet with the complete site architecture.

The system respects robots.txt and follows only sitemap-declared pages to ensure ethical crawling.

### Requirements

- Firecrawl API key (for website crawling and sitemap discovery)
- Google Sheets access
- Google Drive access (for template duplication)

### How to set up

**Step 1: Prepare your template (recommended)**

It's recommended to create your own copy of the base template:

- Access the base Google Sheets template and make a copy for your personal use.
- Update the workflow's "Copy template" node with your template's file ID (replacing the default ID: 12lV4HwgudgzPPGXKNesIEExbFg09Tuu9gyC_jSS1HjI).
- This ensures you control the template formatting and can customize it as needed.

**Step 2: Configure API credentials**

Set up the following credentials in n8n:

- Firecrawl API: for crawling websites and discovering sitemaps
- Google Sheets OAuth2: for creating and updating spreadsheets
- Google Drive OAuth2: for duplicating the template file

**Step 3: Configure Firecrawl settings (optional)**

The workflow uses optimized Firecrawl settings:

- ignoreSitemap: false – respects the website's sitemap
- sitemapOnly: true – only crawls URLs listed in sitemap files

These settings ensure ethical crawling and faster processing.

**Step 4: Access the workflow**

The workflow uses a chat trigger interface, so no manual configuration is needed. Simply provide the website URL you want to analyze when prompted.

### How to use the workflow

**Basic usage**

1. **Start the chat:** Access the workflow via the chat interface.
2. **Provide the URL:** Enter the website URL you want to analyze (e.g., "https://example.com").
3. **Wait for processing:** The system will crawl, organize, and export the data.
4. **Receive your results:** You get a direct, clickable link to the generated Google Sheets file, so there's no need to search for it.

**Error handling**

- **Invalid URLs:** If the provided URL is invalid or the website blocks crawling, you'll receive an immediate error message.
- **Graceful failure:** The workflow stops without creating unnecessary files when errors occur.
- **Common causes:** Incorrect URL format, robots.txt restrictions, or site security settings.

**File organization**

- **Automatic naming:** Generated files follow the pattern "[Website URL] - n8n - Arborescence".
- **Google Drive storage:** Files are automatically organized in your Google Drive.
- **Instant access:** A direct link is provided immediately upon completion.

### Advanced processing for visual diagrams

**Step 1: Copy sitemap data**

Once your Google Sheets file is ready, copy all the hierarchical data from the generated spreadsheet and prepare it for AI processing.

**Step 2: Generate an ASCII tree structure**

Use any AI model with this prompt:

```
Create a hierarchical tree structure from the following website sitemap data. Return ONLY the tree structure using ASCII tree formatting with ├── and └── characters. Do not include any explanations, comments, or additional text - just the pure tree structure.

The tree should start with the root domain and show all pages organized by their hierarchical levels. Use proper indentation to show parent-child relationships.

Here is the sitemap data:
[PASTE THE SITEMAP DATA HERE]

Requirements:
- Use ASCII tree characters (├── └── │)
- Show clear hierarchical relationships
- Include all pages from the sitemap
- Return ONLY the tree structure, no other text
- Start with the root domain as the top level
```

**Step 3: Create a visual mind map**

Visit the Whimsical Diagrams GPT, request a mind map based on your ASCII tree structure, and get a professional visual representation of your website architecture.

### Results interpretation

**Google Sheets output structure**

The generated spreadsheet contains:

- **Niv 0 to Niv 5:** Hierarchical levels (0 = homepage, 1–5 = navigation depth)
- **URL column:** Complete URLs for reference
- **Hyperlinked structure:** Clickable links organized by hierarchy
- **Multi-domain support:** Handles subdomains and different domain structures

**Data organization features**

- **Automatic sorting:** Pages organized by navigation depth and alphabetical order
- **Parent-child relationships:** Clear hierarchical structure maintained
- **Domain separation:** Main domains and subdomains processed separately
- **Clean formatting:** URLs decoded and formatted for readability

### Workflow limitations

- **Sitemap dependency:** Only discovers pages listed in the website's sitemap
- **Crawling restrictions:** Some websites may block external crawlers
- **Level depth:** Limited to 5 hierarchical levels for clarity
- **Rate limits:** Respects Firecrawl API limitations
- **Template dependency:** Requires access to the base template for duplication

### Use cases

- **SEO audits:** Analyze site structure for optimization opportunities
- **UX research:** Understand navigation patterns and user paths
- **Content strategy:** Identify content gaps and organizational issues
- **Site migrations:** Document existing structure before redesigns
- **Competitive analysis:** Study competitor site architectures
- **Client presentations:** Create visual site maps for stakeholder reviews
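For reference, the hierarchical-organization step (step 4 of "How it works") can be sketched as a Code node that buckets URLs into the Niv 0–5 columns by path depth. The column layout mirrors the output structure described above; the bucketing logic itself is an illustrative assumption:

```javascript
// Illustrative sketch: organize discovered URLs into Niv 0–5 levels by path depth.
const urls = $input.all().map((item) => item.json.url);

const rows = urls.map((raw) => {
  const u = new URL(raw);
  const segments = u.pathname.split('/').filter(Boolean);
  const depth = Math.min(segments.length, 5); // cap at 5 levels, as in the sheet

  // One column per level: the page's label goes in the column for its depth
  const row = { URL: decodeURIComponent(raw) };
  for (let niv = 0; niv <= 5; niv++) row[`Niv ${niv}`] = '';
  row[`Niv ${depth}`] =
    depth === 0 ? u.hostname : decodeURIComponent(segments[depth - 1]);

  return { json: row };
});

// Sort by depth, then alphabetically, mirroring the described output
rows.sort(
  (a, b) =>
    a.json.URL.split('/').length - b.json.URL.split('/').length ||
    a.json.URL.localeCompare(b.json.URL)
);
return rows;
```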
by George Zargaryan
## AI Real Estate Agent with OpenRouter and SerpAPI to discuss property listings from propertyfinder.ae

This n8n template demonstrates a simple AI Agent that can:

- Scrape information from a provided propertyfinder.ae listing link.
- Answer questions about a specific property using the scraped information.
- Use SerpAPI to find details that are missing from the scraped data.
- Answer general real-estate questions using SerpAPI.

### Use Case

This workflow serves as a starting point for building complex AI assistants for real estate or other domains. See the demo video.

### Potential Enhancements

- **Expand knowledge:** Augment the workflow with your own knowledge base using a vector database (RAG approach).
- **Add more sources:** Adapt the scraper to support other real estate websites.
- **Optimize speed:** Add a cache for scraped data to reduce response latency.
- **Improve context handling:** Implement reliable persistence to track the current listing instead of iterating through conversation history.
- **Customize prompts:** Write prompts tailored to your specific needs (the current one is for demonstration only).
- **Integrate channels:** Connect the workflow to communication channels like Instagram, Telegram, or WhatsApp.

### How It Works

1. The workflow is triggered by a "When chat message received" node for simple demonstration.
2. The Chat Memory Manager node extracts the last 30 messages for the current session.
3. A Code node finds the property link, first by checking the most recent user message and then by searching the conversation history (a minimal sketch follows below).
4. If a link is found, an HTTP Request node scrapes the HTML content from the listing page.
5. The Summarize code node parses the HTML, retrieves key information, and passes it to the AI Agent as a temporary knowledge base.
6. The final AI Agent node answers user queries using the scraped knowledge base and falls back to the SerpAPI tool when information is missing.

### How to Use

- You can test this workflow directly in n8n or integrate it into any social media channel or your website.
- The AI Agent node is configured to use OpenRouter. Add your OpenRouter credentials, or replace the node with your preferred LLM provider.
- Add your SerpAPI key to the SerpAPI tool within the AI Agent node.

### Requirements

- An API key for OpenRouter (or credentials for your preferred LLM provider).
- A SerpAPI key. You can get one from their website; a free plan is available for testing.

### Need Help Building Something More?

Contact me on:

- **Telegram:** @ninesfork
- **LinkedIn:** George Zargaryan

Happy Hacking! 🚀
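A minimal sketch of the link-finding Code node (step 3 of "How It Works"). The message structure assumed here (a `messages` array with `role`/`content` fields) is illustrative and depends on how the memory node outputs its data:

```javascript
// Illustrative link finder: check the latest user message first,
// then fall back to scanning the conversation history backwards.
const LISTING = /https?:\/\/(?:www\.)?propertyfinder\.ae\/[^\s"')]+/i;

const messages = $json.messages || []; // assumption: memory node outputs this array
const userMessages = messages.filter((m) => m.role === 'user');

let link = null;

// 1) Most recent user message
const last = userMessages[userMessages.length - 1];
if (last) link = (last.content.match(LISTING) || [])[0] || null;

// 2) Walk the history backwards until a listing link is found
for (let i = userMessages.length - 2; !link && i >= 0; i--) {
  link = (userMessages[i].content.match(LISTING) || [])[0] || null;
}

return [{ json: { listingUrl: link, hasLink: Boolean(link) } }];
```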
by WeblineIndia
## WhatsApp AI Sales Agent using PDF Vector Store

This workflow turns your WhatsApp number into an intelligent, AI-powered sales agent that answers product queries using real data extracted from a PDF brochure. It loads a product brochure via HTTP Request, converts it into embeddings using OpenAI, stores them in an in-memory vector store, and allows the AI Agent to provide factual answers to users via WhatsApp. Non-text messages are filtered out, and only text queries are processed. This makes the workflow ideal for building a lightweight chatbot that understands your product documentation deeply.

### Quick Start: 5-Step Fast Implementation

1. Insert your WhatsApp credentials in the WhatsApp Trigger and WhatsApp Send nodes.
2. Add your OpenAI API key to all OpenAI-powered nodes.
3. Replace the PDF URL in the HTTP Request node with your own brochure.
4. Run the Manual Trigger once to build the vector store.
5. Activate the workflow and start chatting from WhatsApp.

### What It Does

This workflow converts a product brochure (PDF) into a searchable knowledge base using LangChain vector embeddings. Incoming WhatsApp messages are processed, and if the message is text, the AI Sales Agent uses OpenAI plus the vector store to produce accurate, brochure-based answers. The AI responds naturally to customer queries, maintains conversation memory across the session, and retrieves information directly from the brochure when needed. Non-text messages are filtered out to keep the conversational flow clean.

The workflow is fully modular: you can replace the PDF, modify the AI prompts, plug into CRM systems, or extend it into a broader sales automation pipeline.

### Who's It For

This workflow is ideal for:

- Businesses wanting a WhatsApp-based AI customer assistant.
- Sales teams needing automated product query handling.
- Companies with large product catalog PDFs.
- Marketers wanting a zero-code product brochure chatbot.
- Technical teams experimenting with LangChain + OpenAI inside n8n.

### Requirements to Use This Workflow

To run this workflow successfully, you need:

- An n8n instance (cloud or self-hosted).
- A WhatsApp Business API connection.
- An OpenAI API key.
- A publicly accessible PDF brochure URL.
- Basic familiarity with n8n node configuration.
- Optional: a custom vector store backend (Qdrant, Pinecone); the template uses in-memory storage.

### How It Works & How To Set Up

1. **Import the workflow JSON** – upload the workflow JSON provided.
2. **Configure the WhatsApp Trigger** – open WhatsApp Trigger, add your WhatsApp credentials, and set the webhook to match your n8n endpoint.
3. **Configure the WhatsApp response nodes** – the workflow uses two WhatsApp send nodes: **Reply To User** (sends the AI response) and **Reply To User1** (sends the "unsupported message" reply). Add your WhatsApp credentials to both.
4. **Replace the PDF brochure** – in get Product Brochure (HTTP Request), update the url parameter with your own PDF.
5. **Run the PDF → vector store setup (one time only)** – use the Manual Trigger ("When clicking 'Test workflow'") to download the PDF, extract the text, split it into chunks, generate embeddings, and store them in the Product Catalogue vector store. You must run this once after importing the workflow; a conceptual sketch of this stage follows below.
6. **Set OpenAI credentials** – add your OpenAI API key to the following nodes: OpenAI Chat Model, OpenAI Chat Model1, Embeddings OpenAI, and Embeddings OpenAI1.
7. **Review the AI Agent prompt** – inside AI Sales Agent, edit the system message to match your brand, your product types, and your tone of voice.
8. **Activate the workflow** – once activated, WhatsApp users can chat with your AI Sales Agent.
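As a conceptual illustration of the one-time setup stage (step 5), the text splitting behaves roughly like the chunker below. The chunk size and overlap values are assumptions, not the template's settings; in practice, n8n's LangChain text-splitter node handles this for you:

```javascript
// Conceptual sketch of the chunking performed before embedding.
// chunkSize/chunkOverlap are illustrative defaults.
function splitText(text, chunkSize = 1000, chunkOverlap = 200) {
  const chunks = [];
  let start = 0;
  while (start < text.length) {
    const end = Math.min(start + chunkSize, text.length);
    chunks.push(text.slice(start, end).trim());
    if (end === text.length) break; // reached the end of the document
    start = end - chunkOverlap;     // overlap keeps context across chunk boundaries
  }
  return chunks;
}

// Each chunk is then embedded (Embeddings OpenAI) and stored in the
// in-memory vector store, where the AI Agent retrieves it by similarity.
const chunks = splitText($json.text || '');
return chunks.map((chunk) => ({ json: { chunk } }));
```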
### How to Customize Nodes

Here are common customization options:

- **Customize the PDF / knowledge base:** Change the URL in get Product Brochure, or upload your own file via other nodes.
- **Customize AI behavior:** Edit the systemMessage inside AI Sales Agent to change the personality, set product rules, or restrict/expand the scope.
- **Change supported message types:** Modify the Handle Message Types switch logic to allow image → OCR, audio → Whisper, or documents → additional processing.
- **Modify WhatsApp message templates:** Edit the textBody of the response nodes.
- **Extend or replace the vector store:** Swap vectorStoreInMemory for Qdrant, Pinecone, or a Redis vector store by updating the vector store node.

### Add-Ons (Optional Enhancements)

You can extend this workflow with:

1. **Multi-language support** – add OpenAI translation nodes before the agent input.
2. **CRM integration** – send user queries and chat logs into HubSpot, Salesforce, or Zoho CRM.
3. **Product recommendation engine** – use embedding similarity to suggest products.
4. **Order placement workflow** – connect to the Stripe or Shopify APIs.
5. **Analytics dashboard** – log chats into Airtable or Postgres for analysis.

### Use Case Examples

- **Product inquiry chatbot:** Customers ask about specs, pricing, or compatibility.
- **Digital catalog assistant:** Converts PDF brochures into interactive WhatsApp search.
- **Sales support bot:** Reduces the load on human sales reps by handling common questions.
- **Internal knowledge bot:** Teams access manuals, training documents, or service guides.
- **Event/product launch assistant:** Provides instant details about newly launched items.

...and many similar use cases where an AI-powered WhatsApp assistant is valuable.

### Troubleshooting Guide

| Issue | Possible Cause | Solution |
| --- | --- | --- |
| WhatsApp messages not triggering the workflow | Wrong webhook URL or inactive workflow | Ensure the webhook is correct and activate the workflow |
| AI replies are empty | Missing OpenAI credentials | Add the OpenAI API key to all AI nodes |
| Vector store not populated | Manual trigger not executed | Run the Test Workflow trigger once |
| PDF extraction returns blank text | PDF is image-based | Use OCR before text splitting |
| "Unsupported message type" always triggers | Message type filter misconfigured | Check the conditions in Handle Message Types |
| AI not using brochure data | VectorStore tool not linked properly | Check the connections between Embeddings → VectorStore → AI Agent |

### Need Help with Support & Extensions?

If you need help setting up, customizing, or extending this workflow, feel free to reach out to our n8n automation developers at WeblineIndia. We can help with:

- Custom WhatsApp automation workflows
- AI-powered product catalog systems
- Integrating CRM, ERP, or eCommerce platforms
- Building advanced LangChain-powered n8n automations
- Deploying scalable vector stores (Qdrant/Pinecone)

...and much more.
by Max Tkacz
Easily generate images with Black Forest Labs' Flux text-to-image AI models using Hugging Face's Inference API. This template serves a web form where you can enter prompts and select predefined visual styles that are customizable with no code. The workflow integrates seamlessly with Hugging Face's free tier, and it's easy to modify for any text-to-image model that supports API access.

### Try it

Curious what this template does? Try a public version here: https://devrel.app.n8n.cloud/form/flux

### Set Up

Watch this quick set-up video 👇

Accounts required:

- Huggingface.co account (free)
- Cloudflare.com account (free – used for storage, but can be swapped easily, e.g., for Google Drive)

### Key Features

- **Text-to-image creation:** Generates unique visuals based on your prompt and style.
- **Hugging Face integration:** Utilizes Hugging Face's Inference API for reliable image generation.
- **Customizable visual styles:** Select from preset styles or easily add your own.
- **Adaptable:** Swap in any Hugging Face text-to-image model that supports API calls.

### Ideal for

- **Creators:** Rapidly create visuals for projects.
- **Marketers:** Prototype campaign visuals.
- **Developers:** Test different AI image models effortlessly.

### How It Works

You submit an image prompt via the web form and select a visual style, which appends style instructions to your prompt. The Hugging Face Inference API then generates and returns the image, which gets hosted on Cloudflare S3. The workflow can be easily adjusted to use other models and styles for complete flexibility.
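For reference, the underlying inference call the workflow makes is roughly the following, shown here as a standalone script. The model ID, style prefix, and response handling are assumptions based on Hugging Face's standard text-to-image Inference API:

```javascript
// Minimal sketch of a Hugging Face text-to-image inference call (Node 18+).
// Model ID and style prefix are illustrative; swap in any compatible model.
const HF_TOKEN = process.env.HF_TOKEN; // your Hugging Face access token
const model = 'black-forest-labs/FLUX.1-schnell';
const style = 'isometric 3D render, soft lighting, '; // style instructions prepended by the form

async function generate(prompt) {
  const res = await fetch(`https://api-inference.huggingface.co/models/${model}`, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${HF_TOKEN}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ inputs: style + prompt }),
  });
  if (!res.ok) throw new Error(`HF inference failed: ${res.status}`);

  // The endpoint returns raw image bytes; persist them (here: a local file,
  // where the workflow instead uploads to Cloudflare storage)
  const bytes = Buffer.from(await res.arrayBuffer());
  require('fs').writeFileSync('flux-output.png', bytes);
}

generate('a lighthouse on a cliff at dawn');
```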
by Mutasem
### Use case

This workflow snoozes any Todoist task by moving it into a "Snoozed" Todoist project, and unsnoozes it three days before its due date. It helps keep your inbox clear of everything except the tasks you need to worry about soon.

### How to set up

1. Add your Todoist credentials.
2. Create a Todoist project called "Snoozed".
3. Set the project IDs in the relevant nodes.
4. Add due dates to your tasks in Inbox, then watch them disappear into Snoozed. Set a task's date to tomorrow and watch it return to Inbox.

### How to adjust this template

Adjust the timeline: maybe three days is too close for you. It mostly works for me :)
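A minimal sketch of the unsnooze check: a Code node that keeps only the tasks due within the next three days, ready to be moved back to Inbox. The field access follows the Todoist API's task shape (`due.date`), but treat this as an illustration rather than the template's exact node:

```javascript
// Illustrative filter: which snoozed tasks are due within the next 3 days?
const WINDOW_DAYS = 3;
const cutoff = new Date(Date.now() + WINDOW_DAYS * 24 * 60 * 60 * 1000);

return $input.all().filter((item) => {
  const due = item.json.due?.date; // Todoist tasks expose due.date as an ISO date
  if (!due) return false;
  return new Date(due) <= cutoff; // due soon: move back to Inbox
});
```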