by Abi Odedeyi
**How It Works**
- **Trigger:** Watches for new emails in Gmail with PDF/image attachments.
- **OCR:** Sends the attachment to the OCR.space API (https://ocr.space/OCRAPI) to extract invoice text.
- **Parsing:** Extracts key fields: vendor, invoice number, amount, currency, invoice date, due date, description.
- **Validation logic:** Checks that the amount is valid, ensures vendor and invoice number are present, and flags high-value invoices (e.g., over $10,000).
- **Routing:**
  - If invalid: sends a Slack message highlighting the issues and labels the email as Rejected.
  - If valid: logs the invoice into Google Sheets, sends a Slack message to the finance team for approval, creates a draft invoice in Xero after approval, and labels the email as Processed in Gmail.

**Set up steps**
- Estimated setup time: 45–60 mins
- You’ll need connected credentials for Gmail, Slack, Google Sheets, and Xero
- Replace the default API key for OCR.space with your own (in the HTTP Request node)
- Update Slack channel IDs and label IDs to match your workspace
- Adjust invoice validation rules as needed (e.g. currency, red-flag conditions)

All detailed explanations and field mappings are provided in sticky notes within the workflow.
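The validation step can be sketched as a small function for an n8n Code node. This is a minimal illustration, not the template's exact code, and the field names (`vendor`, `invoiceNumber`, `amount`) are assumptions about how the parsing step names its output:

```javascript
// Hypothetical validation for a parsed invoice (field names are illustrative).
function validateInvoice(inv) {
  const issues = [];
  const amount = Number(inv.amount);
  if (!Number.isFinite(amount) || amount <= 0) issues.push('Amount is missing or invalid');
  if (!inv.vendor) issues.push('Vendor is missing');
  if (!inv.invoiceNumber) issues.push('Invoice number is missing');
  return {
    valid: issues.length === 0,
    highValue: Number.isFinite(amount) && amount > 10000, // red-flag threshold from the description
    issues,
  };
}
```

The `issues` array maps directly onto the Slack rejection message, and `highValue` can drive an extra approval branch.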
by Incrementors
**AI-Powered Keyword Cannibalization Detection**

**Workflow Overview**
This is an advanced n8n automation workflow designed to detect and analyze keyword cannibalization issues across multiple client websites using Google Search Console data and artificial intelligence. The system provides real-time monitoring and comprehensive reporting to help SEO professionals identify and resolve internal competition between pages ranking for the same keywords.

**Core Components**

1. **Automated Monitoring System**
- **Real-time trigger:** Monitors Google Sheets for keyword changes every minute
- **Multi-client support:** Handles up to 4 different client websites simultaneously
- **Intelligent routing:** Automatically directs each client's data through dedicated processing paths

2. **Data Collection & Processing**
- **GSC integration:** Fetches 30 days of search performance data from the Google Search Console API
- **Comprehensive metrics:** Collects keyword rankings, page URLs, positions, clicks, impressions, and CTR
- **Data transformation:** Groups raw API responses by keyword for structured analysis
- **Cross-referencing:** Matches target keywords from Google Sheets with actual GSC performance data

3. **AI Analysis Engine**
- **GPT-4o powered:** Uses advanced AI to analyze keyword competition patterns
- **Risk categorization:** Automatically classifies cannibalization risk as:
  - High Risk: 5+ pages competing for the same keyword
  - Moderate Risk: 3+ pages ranking in top 10 positions
  - Low Risk: 2 pages with one clearly dominating
  - No Risk: a single page ranking for the keyword
- **Intelligent reasoning:** Provides detailed explanations for each risk assessment

4. **Comprehensive Reporting**
- **Automated output:** Saves analysis results back to Google Sheets
- **Detailed insights:** Includes risk levels, reasoning, observations, and actionable remediation steps
- **Performance tracking:** Complete keyword performance metrics for client reporting
- **Status tracking:** Identifies which keywords are ranking vs. missing from search results
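In this template the classification is done by the AI, but the risk tiers above can also be expressed deterministically. A minimal sketch (it omits the "one clearly dominating" sub-check for the two-page case, so treat it as an approximation of the stated rules):

```javascript
// Illustrative rule-based version of the risk tiers described above.
// `pages` is an array of pages ranking for one keyword, each with a `position`.
function classifyCannibalizationRisk(pages) {
  if (pages.length <= 1) return 'No Risk';          // single page ranking
  if (pages.length >= 5) return 'High Risk';        // 5+ competing pages
  const topTen = pages.filter(p => p.position <= 10).length;
  if (topTen >= 3) return 'Moderate Risk';          // 3+ pages in top 10
  return 'Low Risk';                                // 2 pages, one dominating (assumed)
}
```

A rule pass like this can run before the LLM call, so the model only has to explain the assessment rather than compute it.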
by Rully Saputra
**Automate Lighthouse report alerts to messenger and Google Sheets**

**Who’s it for**
This workflow is ideal for developers, SEO specialists, performance engineers, and digital agencies who want to continuously monitor website performance using Core Web Vitals. It’s also perfect for product or infrastructure teams that need real-time alerts when a site underperforms and want a historical log of reports in Google Sheets.

**What it does**
This automation periodically fetches a Lighthouse report from the PageSpeed Insights API, checks whether any of the Core Web Vitals (CWV) scores fall below a defined threshold, and sends a notification to Telegram (or any other preferred messenger). It also logs the summarized report in a specific row within a Google Spreadsheet for long-term tracking and reporting. The CWV audit results are processed using JavaScript and passed through a summarization step using Gemini Chat, making the audit descriptions concise and actionable.

**How to set up**
1. Configure the Schedule Trigger node to run at your preferred frequency.
2. Set your target URLs and API key, then connect the HTTP Request node to Google PageSpeed Insights.
3. Update the JavaScript Code node to filter and parse only CWV metrics.
4. Define thresholds in the IF node to trigger Telegram messages only when needed.
5. Connect your Telegram (or other messenger) credentials.
6. Set up the Google Sheets node by linking your account and choosing the sheet and range to log data.

**Requirements**
- Google account with access to Google Sheets
- Telegram bot token or any preferred messenger
- API key for PageSpeed Insights
- Gemini Chat integration (optional for summarization; can be replaced or removed)

**How to customize the workflow**
- Swap Telegram for Slack, Discord, or email by replacing the Send Notification node.
- Adjust the CWV thresholds or include other Lighthouse metrics by modifying the IF node and JavaScript logic.
- Add multiple URLs to monitor by introducing a loop or extending the schedule with different endpoints.
- Replace the Gemini Chat model with OpenAI, Claude, or your custom summarizer if needed.
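The threshold check feeding the IF node could look like the sketch below. The thresholds shown are the commonly cited "good" CWV limits, used here only as illustrative defaults; the metric key names are an assumption about how your Code node flattens the PageSpeed response:

```javascript
// Hypothetical CWV threshold check for the IF node's input.
const THRESHOLDS = {
  lcp: 2.5, // Largest Contentful Paint, seconds
  cls: 0.1, // Cumulative Layout Shift, unitless
  inp: 200, // Interaction to Next Paint, milliseconds
};

// Returns the names of metrics exceeding their threshold; a non-empty
// array means the IF node should route to the notification branch.
function failingMetrics(cwv) {
  return Object.entries(THRESHOLDS)
    .filter(([name, limit]) => cwv[name] !== undefined && cwv[name] > limit)
    .map(([name]) => name);
}
```

Returning the failing metric names (rather than a bare boolean) lets the Telegram message say exactly which vitals regressed.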
by Rahul Joshi
📘 **Description**
This workflow automates the employee onboarding process by creating Jira accounts, generating Notion onboarding checklists, crafting AI-generated welcome messages, and sending personalized welcome emails — all automatically. It provides a complete hands-free onboarding experience for HR and IT teams by connecting Jira, Notion, Google Sheets, Gmail, and Azure OpenAI. Failures (like Jira account creation errors) are logged into Google Sheets to ensure full transparency and no missed onboardings.

⚙️ **What This Workflow Does (Step-by-Step)**

🟢 **When Clicking “Execute Workflow”**
Manually triggers the entire onboarding automation. Useful for testing or initiating onboarding on demand for a new hire.

👤 **Define New Hire Profile Data**
Structures all essential employee information into a clean dataset, including name, email, start date, buddy, and access links (Slack, GitHub, Jira, Notion). Acts as the single source of truth for all downstream systems, ensuring consistent, error-free onboarding data.

🎫 **Create Jira User Account**
Automatically creates a Jira account for the new employee using REST API calls. Includes email, display name, username, and product access (Jira Software). Removes the need for manual admin setup and ensures immediate access to project boards.

✅ **Validate Jira Account Creation Success**
Checks if the Jira API response contains a valid accountId. If successful → continues onboarding. If failed → logs the error to Google Sheets. Ensures downstream steps don’t continue if Jira setup fails.

📊 **Log Jira Provisioning Failures to Error Sheet**
Appends any account creation errors (duplicate emails, invalid permissions, or API issues) to an error log sheet in Google Sheets. Helps HR/IT monitor issues and resolve them manually. Guarantees no silent onboarding failures.
📋 **Generate Notion Onboarding Checklist**
Creates a personalized Notion page titled “{Name} - Onboarding Checklist” that includes:
- Welcome message
- Access links (Slack, GitHub, Jira)
- Assigned buddy details
- Start date and status
- Optionally, embedded videos or docs

Gives each new hire a structured hub to manage onboarding tasks independently.

🤖 **AI-Generated Welcome Message Creator**
Uses GPT-4o (Azure OpenAI) to craft a friendly, motivational welcome message for the new employee. Incorporates name, buddy, and access details with emojis and a warm tone. Ensures every message feels human and engaging — not robotic.

🧠 **GPT-4o Language Model Configuration**
Configures the AI assistant persona for personalized onboarding messages. Ensures tone consistency, friendliness, and empathy across all communications.

🔗 **Consolidate Onboarding Data Streams**
Merges data from Jira, Notion, and AI message generation into a single payload. This ensures the final email contains every onboarding element — access links, checklist URL, and the AI-generated message.

📧 **Format Comprehensive Welcome Email**
Generates a complete HTML-formatted email with:
- Personalized greeting
- AI-generated welcome message
- Clickable links (Jira, Notion, Slack, GitHub)
- Buddy info and start date

Designed for mobile responsiveness and branded presentation.

📬 **Send Welcome Email to New Hire**
Sends the final welcome email to the employee’s inbox with the subject: “Welcome to Techdome, {Name}! 🎉” Includes all essential access information, links, and team introductions — ensuring the new hire starts strong on Day 1.
🧩 **Prerequisites**
- Jira Admin API credentials
- Notion API integration
- Gmail OAuth2 credentials
- Azure OpenAI (GPT-4o) access
- Google Sheets document for logging errors

💡 **Key Benefits**
✅ Fully automated new hire onboarding
✅ AI-generated personalized communications
✅ Real-time error logging for IT transparency
✅ Seamless integration across Jira, Notion, and Gmail
✅ Professional first-day experience with zero manual work

👥 **Perfect For**
- HR teams managing multiple onboardings
- IT admins automating access provisioning
- Startups scaling employee onboarding
- Organizations using the Jira + Notion + Gmail stack
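The validation step (checking the Jira response for an `accountId`, and building an error-sheet row on failure) can be sketched as below. Jira's create-user response does return an `accountId` on success; the error-row column names here are illustrative, not the template's exact schema:

```javascript
// Success check: Jira's "create user" response includes an accountId on success.
function jiraUserCreated(response) {
  return Boolean(response && typeof response.accountId === 'string' && response.accountId.length > 0);
}

// Builds a row for the error log sheet (column names are illustrative).
function buildErrorLogRow(profile, response) {
  return {
    employee: profile.name,
    email: profile.email,
    error: response && Array.isArray(response.errorMessages)
      ? response.errorMessages.join('; ')
      : 'Unknown Jira provisioning error',
    loggedAt: new Date().toISOString(),
  };
}
```

Gating all downstream nodes on `jiraUserCreated` is what prevents half-finished onboardings when provisioning fails.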
by Abdul Mir
**Overview**
Stop spending hours formatting proposals. This workflow turns a short post-call form into a high-converting, fully personalized PandaDoc proposal—plus it updates your CRM and drafts the follow-up email for you.

After a sales call, just fill out a 3-minute form summarizing key pain points, solutions pitched, and the price. The workflow uses AI to generate polished proposal copy, then builds a PandaDoc draft using dynamic data mapped into the JSON body (which you can fully customize per business). It also updates the lead record in ClickUp with the proposal link, company name, and quote—then creates an email draft in Gmail, ready to send.

**Who’s it for**
- Freelancers and consultants sending service proposals
- Agencies closing deals over sales calls
- Sales reps who want to automate proposal follow-up
- Teams using ClickUp as their lightweight CRM

**How it works**
1. After a call, fill out a short form with client details, pitch notes, and price
2. AI generates professional proposal copy based on the form input
3. The proposal is formatted and sent to PandaDoc via HTTP request
4. The ClickUp lead is updated with: company name, proposal URL, quote/price
5. A Gmail draft is created using the proposal link and a thank-you message

**Example use case**
> You hop off a call, fill out:
> - Prospect: Shopify agency
> - Pain: No lead gen system
> - Solution: Automated cold outreach
> - Price: $2,500/month
>
> 3 minutes later: the PandaDoc proposal is ready, the CRM is updated, and your email draft is waiting to be sent.

**How to set up**
1. Replace the form with your preferred tool (e.g. Tally, Typeform)
2. Connect the PandaDoc API and structure your proposal template
3. Customize the JSON body inside the HTTP request to match your business
4. Link your ClickUp space and custom fields
5. Connect Gmail (or another email tool) for the final follow-up draft

**Requirements**
- Form tool for capturing sales call notes
- OpenAI or LLM key for generating proposal copy
- PandaDoc API access
- ClickUp custom fields set up for lead tracking
- Gmail integration

**How to customize**
- Customize your PandaDoc proposal fields in the JSON body of the HTTP node
- Replace ClickUp with another CRM like HubSpot or Notion
- Adjust the AI tone (casual, premium, corporate) for proposal writing
- Add Slack or Telegram alerts when the draft is ready
- Add PDF generation or an auto-send email step
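To give a feel for what "dynamic data mapped into the JSON body" means, here is a sketch of a builder for the HTTP Request node's body. The shape loosely follows PandaDoc's create-document-from-template endpoint, but the `template_uuid`, recipient role, and token names are placeholders you must map to your own PandaDoc template:

```javascript
// Illustrative JSON-body builder for the PandaDoc HTTP Request node.
// All identifiers below are placeholders for your own template setup.
function buildPandaDocBody(form, proposalCopy) {
  return {
    name: `Proposal - ${form.companyName}`,
    template_uuid: 'YOUR_TEMPLATE_UUID',
    recipients: [
      { email: form.clientEmail, first_name: form.firstName, role: 'Client' },
    ],
    tokens: [
      { name: 'Client.Company', value: form.companyName },
      { name: 'Proposal.PainPoints', value: proposalCopy.painPoints },
      { name: 'Proposal.Solution', value: proposalCopy.solution },
      { name: 'Proposal.Price', value: form.price },
    ],
  };
}
```

Everything here comes either from the post-call form (`form`) or the AI-generated copy (`proposalCopy`), which is why the draft can be fully personalized without manual formatting.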
by Rakin Jakaria
**Who this is for**
This workflow is for freelancers, job seekers, or service providers who want to automatically apply to businesses by scraping their website information, extracting contact details, and sending personalized job application emails with AI-powered content — all from one form submission.

**What this workflow does**
This workflow starts every time someone submits the Job Applier Form. It then:
- **Scrapes the target business website** to gather company information and contact details.
- **Converts HTML content** to readable markdown format for better AI processing.
- **Extracts email addresses** and creates a company summary using **GPT-5 AI**.
- **Validates email addresses** to ensure they contain proper formatting (@ symbol check).
- **Accesses your experience data** from a connected **Google Sheet** with your skills and portfolio.
- **Generates personalized application emails** (subject + body) using **GPT-5** based on the job position and company info.
- **Sends the application email** automatically via **Gmail** with your name as sender.
- **Provides confirmation** through a completion form showing the AI's response.

**Setup**
To set this workflow up:
1. **Form Trigger** – Customize the job application form fields (Target Business Website, Applying As dropdown with positions like Video Editor, SEO Expert, etc.).
2. **OpenAI GPT-5** – Add your OpenAI API credentials for both AI models used in the workflow.
3. **Google Sheets** – Connect your sheet containing your working experience data, skills, and portfolio information.
4. **Gmail Account** – Link your Gmail account for sending application emails automatically.
5. **Experience Data** – Update the Google Sheet with your relevant skills, experience, and achievements for each job type.
6. **Sender Name** – Modify the sender name in the Gmail settings (currently set to "Jamal Mia").

**How to customize this workflow to your needs**
- Add more job positions to the dropdown menu (currently includes Video Editor, SEO Expert, Full-Stack Developer, Social Media Manager).
- Modify the AI prompt to reflect your unique value proposition and application style.
- Enhance email validation with additional checks like domain verification or email format patterns.
- Add follow-up scheduling to automatically send reminder emails after a certain period.
- Include attachment functionality to automatically attach your resume or portfolio to applications.
- Switch to different email providers or add multiple sender accounts for variety.
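If you take the suggestion to strengthen the bare "@ symbol check", a slightly stricter format pattern might look like this. It is still far from full RFC validation, just a practical filter before sending:

```javascript
// A stricter-than-"@" format check: one "@", a non-empty local part,
// and a domain that contains at least one dot with a 2+ character TLD.
function looksLikeEmail(value) {
  if (typeof value !== 'string') return false;
  return /^[^\s@]+@[^\s@]+\.[^\s@]{2,}$/.test(value.trim());
}
```

Dropping this into the validation Code node filters out most scraping artifacts (strings with no domain, whitespace fragments) without rejecting legitimate addresses.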
by Peyton Leveillee
Created by: Peyton Leveillee
Last updated: October 2025

🧠 **TL;DR**
Marketing companies charge hundreds or even thousands per month for automated “Google Business Pulse” reports that show visibility, reviews, and engagement trends. Now you can do it for free — right inside n8n. This workflow pulls Google Business data, compares weekly and 12-week trends, runs it through an LLM for insight summaries, and sends out Slack updates and weekly email reports — automatically.

🔥 **Name: Google Pulse Pro**
Your AI marketing analyst that runs 24/7 — no subscription, no fluff.

🧩 **Categories**
- Business Intelligence
- Marketing Automation
- AI Summarization
- Multichannel Reporting

💬 **Overview**
Google Pulse Pro automates weekly Google Business reporting for multiple companies or locations. It combines data collection, trend analysis, and AI commentary into a fully automated system — delivering updates through Slack and email. Perfect for agencies or businesses that want client-ready, insight-driven updates without paying for SaaS dashboards.

⚙️ **Good to Know**
- Uses OpenAI Chat Models for summarization and insights
- Integrates Google Business API, Google Sheets, Slack, and Email (Gmail or SMTP)
- Compares this week vs last week and 12-week averages
- Outputs LLM-generated summaries formatted for Slack Blocks and email templates
- 100% no-code friendly — ready to deploy instantly

🧠 **How It Works**
1. **Read Companies (Google Sheets)** – Loads company info, Google Business IDs, Slack channels, and recipient emails.
2. **Pull Google Business Data** – Queries the Google Business API for impressions, actions, CTR, and reviews across this week, last week, and the 12-week average.
3. **Summarize & Compare** – Code nodes calculate trends and append results back to Google Sheets.
4. **AI Summaries (OpenAI)** – Three separate LLMs generate insight lines: an impressions one-liner (visibility & engagement shifts), a reviews one-liner (sentiment & reputation trends), and an overall one-liner (combined marketing summary).
5. **Format & Distribute** – All one-liners merge per company. Recipients are attached, and messages are formatted for output.
6. **Send Reports** – Slack: beautifully formatted updates via Slack Blocks. Email: branded Weekly Google Business Pulse summaries.

📈 **Example Output**
Slack Message:
> Weekly Google Reviews & Impressions
> Brad’s Service Center
> Sept 22–28, 2025
> • Impressions down 41% vs last week
> • CTR 3pp lower than 12-week avg
> • 2 new reviews, 100% positive sentiment

Email Pulse:

🧾 **Requirements**
- **Google Business API credential** (OAuth2)
- **Google Sheets credential** (OAuth2)
- **Slack credential** (OAuth2) — chat:write, users:read, channels:read
- **Gmail or SMTP credential** (for email pulse delivery)
- **OpenAI credential** (for summaries)

✏️ **Customizing**
- Add other KPIs (Google Ads, GA4, POS data)
- Adjust scheduling cadence (daily, bi-weekly, monthly)
- Send reports to Notion, Airtable, or HubSpot
- Update Slack + email branding for your agency

💡 **Use Cases**
- Agencies automating client reporting
- Multi-location businesses monitoring reputation
- Service centers tracking performance trends
- Anyone tired of paying for “Google Business Pulse” dashboards

🎯 **Why It Matters**
Most marketing firms hide behind “AI dashboards” to sell visibility reports. Google Pulse Pro gives you that same power — automated, AI-enhanced, and free. Unchain your reporting. Impress clients. And keep your marketing dollars where they belong — in your business.
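The "Summarize & Compare" math can be sketched as below: percent change week over week for count metrics, and a percentage-point delta against the 12-week average for CTR (since CTR is already a percentage). The metric names are assumptions about the Code node's input shape:

```javascript
// Sketch of the trend math used in the Code nodes (field names are illustrative).
function weeklyTrend(thisWeek, lastWeek, twelveWeekAvg) {
  // Percent change, guarded against division by zero.
  const pct = (now, then) => (then === 0 ? null : Math.round(((now - then) / then) * 100));
  return {
    impressionsChangePct: pct(thisWeek.impressions, lastWeek.impressions),
    ctrDeltaPp: +(thisWeek.ctr - twelveWeekAvg.ctr).toFixed(1), // percentage points
  };
}
```

Feeding these numbers to the LLM (rather than raw weekly totals) is what lets it write lines like "Impressions down 41% vs last week" without doing arithmetic itself.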
by Abdullah Alshiekh
📝 **Description**
This template is designed for healthcare providers, sales reps, and medical tourism companies who need to process diagnosis emails efficiently. It automates the full flow from email to report delivery.

When a new diagnosis email arrives:
1. The email content is captured and parsed by an AI agent (Gemini or any customizable LLM).
2. Patient and medical data is extracted into structured fields (e.g., name, phone, diagnosis).
3. Data is logged into a Google Sheet for records.
4. A Google Docs medical report is generated using a predefined template.
5. The report is exported as a PDF and emailed to stakeholders (e.g., managers or the sales team).

This template supports custom AI models, customizable Google Docs templates, and flexible filtering based on sender email.

🛠️ **Features**
- Gmail email trigger (customizable sender filter)
- AI-powered diagnosis parsing using Gemini (easily switchable to OpenAI or others)
- Google Sheets log
- Google Docs templated report (auto-filled)
- PDF export and email sending
- Full flexibility & customization

🔧 **Requirements**
Before using this template, you'll need:
- A connected Gmail account (to receive diagnosis emails)
- A valid Google Sheets integration (create your own sheet with the desired columns)
- A Google Docs template document that includes placeholder tags like {{patient_name}}, {{date}}, etc.
- A Gemini or OpenAI API connection for the AI agent (fully customizable)

Note: You must replace all Google Drive, Docs, and Sheets references with your own documents. This template does not grant access to the original creator's files.

⚙️ **Customization Tips**
- In the Gmail Trigger node, change the sender filter to match the doctor’s email you want to process.
- Modify the AI prompt if your use case needs different extracted fields.
- Replace the Google Docs template link with your own file and customize its structure and variables.
- Change recipient email addresses in the final Gmail node to notify the correct team members.
- Optional: Add fallback flows or error branches for when the AI fails or input is malformed.

🧠 **Use Case Examples**
- Medical tourism agencies auto-generating patient reports from incoming diagnosis summaries
- Clinics storing structured data from messy email inputs
- Sales teams instantly notified of new leads with completed medical summaries
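Filling the {{placeholder}} tags is typically done through the Google Docs API's `documents.batchUpdate` with `replaceAllText` requests. A minimal sketch of the request builder (the tag names come from the template above; the `fields` keys are whatever your AI agent extracts):

```javascript
// Builds the `requests` array for a Google Docs documents.batchUpdate call
// that replaces {{tag}} placeholders with extracted values.
function buildReplaceRequests(fields) {
  return Object.entries(fields).map(([tag, value]) => ({
    replaceAllText: {
      containsText: { text: `{{${tag}}}`, matchCase: true },
      replaceText: String(value ?? ''),
    },
  }));
}
```

Coercing missing values to an empty string (rather than the literal "undefined") keeps a half-parsed email from producing a report with junk text in it.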
by vinci-king-01
**Daily Stock Regulatory News Aggregator with Compliance Alerts and Google Sheets Tracking**

🎯 **Target Audience**
- Compliance officers and regulatory teams
- Financial services firms monitoring regulatory updates
- Investment advisors tracking regulatory changes
- Risk management professionals
- Corporate legal departments
- Stock traders and analysts monitoring regulatory news

🚀 **Problem Statement**
Manually monitoring regulatory updates from multiple agencies (SEC, FINRA, ESMA) is time-consuming and error-prone. This template automates daily regulatory news monitoring, aggregates updates from major regulatory bodies, filters for recent announcements, and instantly alerts compliance teams to critical regulatory changes, enabling timely responses and maintaining regulatory compliance.

🔧 **How it Works**
This workflow automatically monitors regulatory news daily, scrapes the latest updates from major regulatory agencies using AI-powered web scraping, filters for updates from the last 24 hours, and sends Slack alerts while logging all updates to Google Sheets for historical tracking.
**Key Components**
1. **Daily Schedule Trigger** – Automatically runs the workflow every 24 hours to check for regulatory updates
2. **Regulatory Sources Configuration** – Defines the list of regulatory agencies and their URLs to monitor (SEC, FINRA, ESMA)
3. **Batch Processing** – Iterates through regulatory sources one at a time for reliable processing
4. **AI-Powered Scraping** – Uses ScrapeGraphAI to intelligently extract regulatory updates including title, summary, date, agency, and source URL
5. **Data Flattening** – Transforms the scraped data structure into individual update records
6. **Time Filtering** – Filters updates to keep only those from the last 24 hours
7. **Historical Tracking** – Logs all filtered updates to Google Sheets for compliance records
8. **Compliance Alerts** – Sends Slack notifications to compliance teams when new regulatory updates are detected

💰 **Key Features**

**Automated Regulatory Monitoring**
- **Daily Execution**: Runs automatically every 24 hours without manual intervention
- **Multi-Agency Support**: Monitors SEC, FINRA, and ESMA simultaneously
- **Error Handling**: Gracefully handles scraping errors and continues processing other sources

**Smart Filtering**
- **Time-Based Filtering**: Automatically filters updates to show only those from the last 24 hours
- **Date Validation**: Discards updates with unreadable or invalid dates
- **Recent Updates Focus**: Ensures compliance teams only receive actionable, timely information

**Alert System**
- **Compliance Alerts**: Instant Slack notifications for new regulatory updates
- **Structured Data**: Alerts include title, summary, date, agency, and source URL
- **Dedicated Channel**: Posts to a designated compliance alerts channel for team visibility

📊 **Output Specifications**
The workflow generates and stores structured data including:

| Output Type | Format | Description | Example |
|-------------|--------|-------------|---------|
| Regulatory Updates | JSON Object | Extracted regulatory update information | {"title": "SEC Announces New Rule", "date": "2024-01-15", "agency": "SEC"} |
| Update History | Google Sheets | Historical regulatory update records with timestamps | Columns: Title, Summary, Date, Agency, Source URL, Scraped At |
| Slack Alerts | Messages | Compliance notifications for new updates | "📢 New SEC update: [Title] - [Summary]" |
| Error Logs | System Logs | Scraping error notifications | "❌ Error scraping FINRA updates" |

🛠️ **Setup Instructions**
Estimated setup time: 15-20 minutes

**Prerequisites**
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Google Sheets API access (OAuth2)
- Slack workspace with API access
- Google Sheets spreadsheet for regulatory update tracking

**Step-by-Step Configuration**

1. **Install Community Nodes**
Install the ScrapeGraphAI community node:
```
npm install n8n-nodes-scrapegraphai
```

2. **Configure ScrapeGraphAI Credentials**
- Navigate to Credentials in your n8n instance
- Add new ScrapeGraphAI API credentials
- Enter your API key from the ScrapeGraphAI dashboard
- Test the connection to ensure it's working

3. **Set up Google Sheets Connection**
- Add Google Sheets OAuth2 credentials
- Authorize access to your Google account
- Create or identify the spreadsheet for regulatory update tracking
- Note the spreadsheet ID and sheet name (default: "RegUpdates")

4. **Configure Slack Integration**
- Add Slack API credentials to your n8n instance
- Create or identify the Slack channel: #compliance-alerts
- Test the Slack connection with a sample message
- Ensure the bot has permission to post messages

5. **Customize Regulatory Sources**
- Open the "Regulatory Sources" Code node
- Update the urls array with additional regulatory sources if needed:

```javascript
const urls = [
  'https://www.sec.gov/news/pressreleases',
  'https://www.finra.org/rules-guidance/notices',
  'https://www.esma.europa.eu/press-news',
  // Add more URLs as needed
];
```

6. **Configure Google Sheets**
- Update documentId in the "Log to Google Sheets" node with your spreadsheet ID
- Update sheetName to match your sheet name (default: "RegUpdates")
- Ensure the sheet has columns: Title, Summary, Date, Agency, Source URL, Scraped At
- Create the sheet with proper column headers if starting fresh

7. **Customize Slack Channel**
- Open the "Send Compliance Alert" Slack node
- Update the channel name (default: "#compliance-alerts")
- Customize the message format if needed
- Test with a sample message

8. **Adjust Schedule**
- Open the "Daily Regulatory Poll" Schedule Trigger
- Modify hoursInterval to change the frequency (default: 24 hours)
- Set specific times if needed for daily execution

9. **Customize Scraping Prompt**
- Open the "Scrape Regulatory Updates" ScrapeGraphAI node
- Adjust the userPrompt to extract different or additional fields
- Modify the JSON schema in the prompt if needed
- Change the number of updates extracted (default: 5 most recent)

10. **Test and Validate**
- Run the workflow manually to verify all connections
- Check Google Sheets for data structure and format
- Verify Slack alerts are working correctly
- Test error handling with invalid URLs
- Validate that date filtering is working properly

🔄 **Workflow Customization Options**

**Modify Monitoring Frequency**
- Change hoursInterval in the Schedule Trigger for different frequencies
- Switch to multiple times per day for critical monitoring
- Add multiple schedule triggers for different agency checks

**Extend Data Collection**
- Modify the ScrapeGraphAI prompt to extract additional fields (documents, categories, impact level)
- Add data enrichment nodes for risk assessment
- Integrate with regulatory databases for more comprehensive tracking
- Add sentiment analysis for regulatory updates

**Enhance Alert System**
- Add email notifications alongside Slack alerts
- Create different alert channels for different agencies
- Add priority-based alerting based on update keywords
- Integrate with SMS or push notification services
- Add webhook integrations for other compliance tools

**Advanced Analytics**
- Add data visualization nodes for regulatory trend analysis
- Create automated compliance reports with summaries
- Integrate with business intelligence tools
- Add machine learning for update categorization
- Track regulatory themes and topics over time

**Multi-Source Support**
- Add support for additional regulatory agencies
- Implement agency-specific scraping strategies
- Add regional regulatory sources (FCA, BaFin, etc.)
- Include state-level regulatory updates

📈 **Use Cases**
- **Compliance Monitoring**: Automatically track regulatory updates to ensure timely compliance responses
- **Risk Management**: Monitor regulatory changes that may impact business operations or investments
- **Regulatory Intelligence**: Build historical databases of regulatory announcements for trend analysis
- **Client Communication**: Stay informed to provide timely updates to clients about regulatory changes
- **Legal Research**: Track regulatory developments for legal research and case preparation
- **Investment Strategy**: Monitor regulatory changes that may affect investment decisions

🚨 **Important Notes**
- Respect website terms of service and rate limits when scraping regulatory sites
- Monitor ScrapeGraphAI API usage to manage costs
- Ensure Google Sheets has the proper column structure before the first run
- Set up the Slack channel before running the workflow
- Consider implementing rate limiting for multiple regulatory sources
- Keep credentials secure and rotate them regularly
- Test with one regulatory source first before adding multiple sources
- Verify date formats are consistent across different regulatory agencies
- Be aware that some regulatory sites may have anti-scraping measures

🔧 **Troubleshooting**

**Common Issues:**
- ScrapeGraphAI connection errors: verify the API key and account status
- Google Sheets logging failures: check the spreadsheet ID, sheet name, and column structure
- Slack notification failures: verify the channel name exists and the bot has permissions
- Date filtering issues: ensure dates from scraped content are in a parseable format
- Validation errors: check that scraped data matches the expected schema
- Empty results: verify regulatory sites are accessible and haven't changed structure

**Optimization Tips:**
- Start with one regulatory source to test the workflow
- Monitor API usage and costs regularly
- Use batch processing to avoid overwhelming scraping services
- Implement retry logic for failed scraping attempts
- Consider caching mechanisms for frequently checked sources
- Adjust the number of updates extracted based on typical volume

**Support Resources:**
- ScrapeGraphAI documentation and API reference
- Google Sheets API documentation
- Slack API documentation for webhooks
- n8n community forums for workflow assistance
- n8n documentation for node configuration
- SEC, FINRA, and ESMA official websites for source verification
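The time-filtering step (keep only updates from the last 24 hours, discard unparseable dates) can be sketched as a Code-node helper. The `date` field name follows the extraction schema described above:

```javascript
// Keeps only updates dated within the last 24 hours; anything with an
// unparseable or future date is dropped, per the date-validation rule above.
function filterRecentUpdates(updates, now = Date.now()) {
  const DAY_MS = 24 * 60 * 60 * 1000;
  return updates.filter(u => {
    const ts = Date.parse(u.date);
    return Number.isFinite(ts) && ts <= now && now - ts <= DAY_MS;
  });
}
```

Passing `now` as a parameter instead of calling `Date.now()` inside makes the filter easy to test with fixed timestamps.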
by Olivier
This template syncs prospects from ProspectPro into HubSpot. It checks whether a company already exists in HubSpot (by ProspectPro ID or domain), then updates the record or creates a new one. Sync results are logged back in ProspectPro with tags to prevent duplicates and mark errors, ensuring reliable and repeatable integrations.

✨ **Features**
- Automatically sync ProspectPro prospects to HubSpot companies
- Smart search logic: match by ProspectPro ID first, then by domain
- Creates new HubSpot companies when no match is found
- Updates existing HubSpot companies with the latest ProspectPro data
- Logs sync results back into ProspectPro with tags (HubspotSynced, HubspotSyncFailed)
- Extendable and modular: use as a trigger workflow or a callable sub-flow

⚙ **Requirements**
- n8n instance or cloud workspace
- Install the ProspectPro Verified Community Node
- ProspectPro account & API credentials (14-day free trial)
- HubSpot account with an OAuth2 app and API credentials

🔧 **Setup Instructions**
1. Import the template and set your credentials (ProspectPro, HubSpot).
2. Connect it to a trigger (e.g., ProspectPro "New website visitor") or call it as a sub-workflow.
3. Add a property to HubSpot for the ProspectPro ID if you don't already have one.
4. Adjust the sync logic in the "Continue?" node and the HubSpot fields to match your setup.
5. Optional: extend error handling, add Slack/CRM notifications, or sync HubSpot data back into ProspectPro.

🔐 **Security Notes**
- Prevents re-processing of failed syncs using the HubspotSyncFailed tag
- Error branches included for failed updates/creates
- Manual resolution required if sync errors persist

🧪 **Testing**
1. Run with the ProspectPro ID of a company with a known domain
2. Check HubSpot for creation or update of the company record
3. Verify the updated tags (HubspotSynced / HubspotSyncFailed) in ProspectPro

📌 **About ProspectPro**
ProspectPro is a B2B prospecting platform for Dutch SMEs. It helps sales teams identify prospects, track website visitors, and streamline sales without a full CRM.
- Website: https://www.prospectpro.nl
- Platform: https://mijn.prospectpro.nl
- API docs: https://www.docs.bedrijfsdata.nl
- Support: https://www.prospectpro.nl/klantenservice
- Support hours: Monday–Friday, 09:00–17:00 CET

📌 **About HubSpot**
HubSpot is a leading CRM platform offering marketing, sales, and customer service tools. It helps companies manage contacts, automate workflows, and grow their customer base.
- Website: https://www.hubspot.com
- Developer Docs: https://developers.hubspot.com
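The "ID first, then domain" matching order can be sketched as below. The `prospectproId` property name is a placeholder for the custom HubSpot company property you create in setup step 3:

```javascript
// Illustrative matching order: ProspectPro ID first, then domain.
// `existing` stands in for HubSpot company search results; property
// names are placeholders for your own HubSpot custom properties.
function findCompanyMatch(prospect, existing) {
  const byId = existing.find(c => c.prospectproId && c.prospectproId === prospect.id);
  if (byId) return { match: byId, matchedOn: 'prospectproId' };
  const byDomain = existing.find(c => c.domain && c.domain === prospect.domain);
  if (byDomain) return { match: byDomain, matchedOn: 'domain' };
  return { match: null, matchedOn: null }; // no match → create a new company
}
```

Matching on the stored ProspectPro ID first is what makes the sync repeatable: once a company has been synced, later runs update the same record even if its domain changes.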
by n8n Automation Expert | Template Creator | 2+ Years Experience
🌤️ Automated Indonesian Weather Monitoring with Smart Notifications

Stay ahead of weather changes with this comprehensive monitoring system that fetches real-time data from Indonesia's official meteorological agency (BMKG) and delivers beautiful, actionable weather reports directly to your Telegram.

⚡ What This Workflow Does

This intelligent weather monitoring system automatically:
- **Fetches Official Data**: Connects to BMKG's public weather API for accurate Indonesian forecasts
- **Smart Processing**: Analyzes temperature, humidity, precipitation, and wind conditions
- **Risk Assessment**: Generates contextual warnings for extreme weather conditions
- **Automated Alerts**: Sends formatted weather reports to Telegram every 6 hours
- **Error Handling**: Includes a robust error detection and notification system

🎯 Perfect For
- **Local Communities**: Keep neighborhoods informed about weather changes
- **Business Operations**: Plan outdoor activities and logistics based on the weather
- **Emergency Preparedness**: Receive early warnings for extreme weather conditions
- **Personal Planning**: Never get caught unprepared by sudden weather changes
- **Agricultural Monitoring**: Track conditions affecting farming and outdoor work

🛠️ Key Features
- 🔄 **Automated Scheduling**: Runs every 6 hours, with a manual trigger option
- 📊 **Comprehensive Reports**: Current conditions plus 6-hour detailed forecasts
- ⚠️ **Smart Warnings**: Contextual alerts for temperature extremes and rain probability
- 🎨 **Beautiful Formatting**: Rich Telegram messages with emojis and structured data
- 🔧 **Error Recovery**: Automatic error handling with a notification system
- 📍 **Location-Aware**: Supports any Indonesian location via BMKG regional codes

📋 What You'll Get

Each weather report includes:
- Current temperature, humidity, and weather conditions
- 6-hour detailed forecast with timestamps
- Wind speed and direction information
- Rain probability and visibility data
- Personalized warnings and recommendations
- Average daily statistics and trends

🚀 Setup Requirements
- **Telegram Bot Token**: Create a bot via @BotFather
- **Chat ID**: Your personal or group chat identifier
- **BMKG Location Code**: Regional administrative code for your area

💡 Pro Tips
- Customize the location by changing the adm4 parameter in the HTTP Request node
- Adjust the scheduling interval to match your monitoring needs
- Modify the warning thresholds in the processing code
- Add multiple chat IDs for broader distribution
- Integrate with other n8n workflows for advanced automation

🌟 Why Choose This Template
- **Production Ready**: Includes comprehensive error handling and logging
- **Highly Customizable**: Easy to modify for different locations and preferences
- **Official Data Source**: Uses Indonesia's trusted meteorological service
- **User-Friendly Output**: Clean, readable reports perfect for daily use
- **Scalable Design**: Easily extend to multiple locations or notification channels

Transform your weather awareness with this professional-grade monitoring system that brings Indonesia's official weather data right to your fingertips!

Keywords: weather monitoring, BMKG API, Telegram notifications, Indonesian weather, automated alerts, meteorological data, weather forecasting, n8n automation, weather API integration
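The "modify warning thresholds in the processing code" tip above refers to the workflow's Code node. A minimal sketch of that threshold and formatting logic is shown below; the field names (`temperature`, `humidity`, `rainProbability`, `windSpeedKmh`) and the threshold values are illustrative assumptions, so map them to the keys actually returned by the BMKG endpoint you call.

```javascript
// Build contextual warnings from one forecast entry.
// Field names and thresholds are illustrative, not the template's exact values.
function buildWarnings(entry) {
  const warnings = [];
  if (entry.temperature >= 35) warnings.push("🔥 Extreme heat expected");
  if (entry.temperature <= 18) warnings.push("🥶 Unusually cold conditions");
  if (entry.rainProbability >= 70) warnings.push("🌧️ High chance of rain, bring an umbrella");
  if (entry.windSpeedKmh >= 40) warnings.push("💨 Strong winds forecast");
  return warnings;
}

// Assemble the Telegram message text from a location name and a forecast entry.
function formatTelegramReport(location, entry) {
  return [
    `🌤️ Weather report for ${location}`,
    `Temperature: ${entry.temperature}°C | Humidity: ${entry.humidity}%`,
    `Rain probability: ${entry.rainProbability}%`,
    ...buildWarnings(entry),
  ].join("\n");
}
```

Raising or lowering the numeric comparisons is all it takes to tune the alerts for your region.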
by Alejandro Scuncia
An extendable RAG template to build powerful, explainable AI assistants — with query understanding, semantic metadata, and support for free-tier tools like Gemini, Gemma, and Supabase.

Description

This workflow helps you build smart, production-ready RAG agents that go far beyond basic document Q&A. It includes:
✅ File ingestion and chunking
✅ Asynchronous LLM-powered enrichment
✅ Filterable metadata-based search
✅ Gemma-based query understanding and generation
✅ Cohere re-ranking
✅ Memory persistence via Postgres

Everything is modular, low-cost, and designed to run even with free-tier LLMs and vector databases. Whether you want to build a chatbot, internal knowledge assistant, documentation search engine, or a filtered content explorer — this is your foundation.

⚙️ How It Works

This workflow is divided into three pipelines:

📥 Ingestion
- Upload a PDF via form
- Extract the text and chunk it for embedding
- Store the chunks in a Supabase vector store using Google Gemini embeddings

🧠 Enrichment (Async)
- A scheduled task fetches new chunks
- Each chunk is enriched with LLM-generated metadata (topics, use_case, risks, audience level, summary, etc.)
- The metadata is added to the vector DB for improved retrieval and filtering

🤖 Agent Chat
- A user question triggers the RAG agent
- The Query Builder transforms it into keywords and filters
- The vector DB is queried and the results are reranked
- The final answer is generated using only the retrieved evidence, with references
- Chat memory is managed via Postgres

🌟 Key Features
- **Asynchronous enrichment** → Save tokens; batch process with free-tier LLMs like Gemma
- **Metadata-aware** → Improved filtering and reranking
- **Explainable answers** → The agent cites sources and sections
- **Chat memory** → Persistent context with Postgres
- **Modular design** → Swap LLMs, rerankers, vector DBs, and even the enrichment schema
- **Free to run** → Built with Gemini, Gemma, Cohere, and Supabase (free-tier compatible)

🔐 Required Credentials

|Tool|Use|
|-|-|
|Supabase w/ PostgreSQL|Vector DB + storage|
|Google Gemini/Gemma|Embeddings & LLM|
|Cohere API|Re-ranking|
|PostgreSQL|Chat memory|

🧰 Customization Tips
- Swap extractFromFile with Notion or Google Drive integrations
- Extend the Metadata Obtention prompt to fit your domain (e.g., financial, legal)
- Replace the LLMs with OpenAI, Mistral, or Ollama
- Replace Postgres Chat Memory with Simple Memory or any other memory node
- Use a webhook instead of a form to automate ingestion
- Connect a Telegram/Slack UI with a few extra nodes

💡 Use Cases
- Company knowledge base bot (internal docs, SOPs)
- Educational assistant with smart filtering (by topic or level)
- Legal or policy assistant that cites source sections
- Product documentation Q&A with multi-language support
- Training material assistant that highlights risks/examples
- Content generation

🧠 Who It's For
- Indie developers building smart chatbots
- AI consultants prototyping Q&A assistants
- Teams looking for an internal knowledge agent
- Anyone building affordable, explainable AI tools

🚀 Try It Out!

Deploy a modular RAG assistant using n8n, Supabase, and Gemini — fully customizable and almost free to run.

1. 📁 Prepare Your PDFs
- Use any internal documents, manuals, or reports in *PDF* format.
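Before moving on, it is worth sketching the Query Builder step from the Agent Chat pipeline described above: it turns a free-form question into search keywords plus metadata filters that match the enrichment fields. The JSON shape returned by the LLM and the metadata keys (`topics`, `audience_level`) are illustrative assumptions, not the template's exact schema.

```javascript
// Convert the Query Builder LLM's JSON output into a vector-store query object.
// Accepts either a raw JSON string or an already-parsed object.
function buildVectorQuery(llmOutput) {
  const parsed = typeof llmOutput === "string" ? JSON.parse(llmOutput) : llmOutput;
  const filter = {};
  // Only include filters the LLM actually produced, so unfiltered
  // questions still search the whole collection.
  if (parsed.topic) filter.topics = parsed.topic;
  if (parsed.audience) filter.audience_level = parsed.audience;
  return {
    query: (parsed.keywords || []).join(" "),
    filter,
  };
}
```

In the workflow this object would feed the Supabase vector-store node, with `filter` applied against the enrichment metadata before reranking.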
- Optional: Add a Google Drive integration to automate ingestion.

2. 🧩 Set Up Supabase
- Create a free Supabase project
- Use the table creation queries included in the workflow to set up your schema
- Add your *supabaseUrl* and *supabaseKey* to your n8n credentials

> 💡 Pro Tip: Make sure you match the embedding dimensions to your model. This workflow uses Gemini text-embedding-004 (768-dim) — if you switch to OpenAI, change your table's vector size to 1536.

3. 🧠 Connect Gemini & Gemma
- Use Gemini/Gemma for embeddings and optional metadata enrichment
- Or deploy locally for lightweight async LLM processing (via Ollama/HuggingFace)

4. ⚙️ Import the Workflow in n8n
- Open n8n (self-hosted or cloud)
- Import the workflow file and paste in your credentials
- You're ready to ingest, enrich, and query your document base

💬 Have Feedback or Ideas? I'd Love to Hear

This project is open, modular, and evolving — just like great workflows should be :). If you've tried it, built on top of it, or have suggestions for improvement, I'd genuinely love to hear from you. Let's share ideas, collaborate, or just connect as part of the n8n builder community.

📧 ascuncia.es@gmail.com
🔗 LinkedIn
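As a footnote to step 2 above: a dimension mismatch between your embedding model and the Supabase vector column usually surfaces only as an opaque database error at insert time. A small guard in the ingestion path, shown here as an illustrative addition rather than part of the template, catches it early.

```javascript
// Fail fast if an embedding's length doesn't match the vector column.
// 768 matches Gemini text-embedding-004; OpenAI's ada-002-class models use 1536.
function assertEmbeddingDim(embedding, expected = 768) {
  if (!Array.isArray(embedding) || embedding.length !== expected) {
    const got = Array.isArray(embedding) ? embedding.length : typeof embedding;
    throw new Error(`Embedding dimension mismatch: got ${got}, table expects ${expected}`);
  }
  return embedding;
}
```

Call it on each embedding before the Supabase insert; if you swap models, change the `expected` default in one place along with the table's vector size.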