by Growth AI
# Ultimate n8n Agentic RAG Template

Author: Cole Medin

## What is this?

This template provides a complete implementation of an Agentic RAG (Retrieval Augmented Generation) system in n8n that can be easily extended for your specific use case and knowledge base. Unlike standard RAG, which only performs simple lookups, this agent can reason about your knowledge base, self-improve retrieval, and dynamically switch between different tools based on the specific question.

## Why Agentic RAG?

Standard RAG has significant limitations:

- Poor analysis of numerical/tabular data
- Missing context due to document chunking
- Inability to connect information across documents
- No dynamic tool selection based on question type

## What makes this template powerful

- **Intelligent tool selection**: Switches between RAG lookups, SQL queries, or full document retrieval based on the question
- **Complete document context**: Accesses entire documents when needed instead of just chunks
- **Accurate numerical analysis**: Uses SQL for precise calculations on spreadsheet/tabular data
- **Cross-document insights**: Connects information across your entire knowledge base
- **Multi-file processing**: Handles multiple documents in a single workflow loop
- **Efficient storage**: Uses JSONB in Supabase to store tabular data without creating a new table for each CSV

## Getting Started

1. Run the table creation nodes first to set up your database tables in Supabase.
2. Upload your documents through Google Drive (or swap in a different file storage solution).
3. The agent will process them automatically (chunking text, storing tabular data in Supabase).
4. Start asking questions that leverage the agent's multiple reasoning approaches.

## Customization

This template provides a solid foundation that you can extend by:

- Tuning the system prompt for your specific use case
- Adding document metadata like summaries
- Implementing more advanced RAG techniques
- Optimizing for larger knowledge bases

I intend to make a local version of this agent very soon!
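The "Efficient storage" idea above can be sketched in a few lines: instead of creating a new Postgres table per CSV, every row of every file is packed into one shared table with a JSONB column. The table and column names (`dataset_id`, `row_data`) are illustrative assumptions, not taken from the template itself.

```javascript
// Sketch: flatten a CSV into records for a single shared Supabase table,
// where row_data maps to a JSONB column. This is a naive comma-split parser
// for illustration only (it ignores quoted fields).
function csvToJsonbRows(csvText, datasetId) {
  const [headerLine, ...lines] = csvText.trim().split('\n');
  const headers = headerLine.split(',').map((h) => h.trim());
  return lines.map((line) => {
    const cells = line.split(',').map((c) => c.trim());
    const rowData = Object.fromEntries(headers.map((h, i) => [h, cells[i]]));
    return { dataset_id: datasetId, row_data: rowData };
  });
}

const rows = csvToJsonbRows('region,revenue\nEMEA,1200\nAPAC,900', 'sales.csv');
```

On the SQL side, the agent can then run precise aggregations over `row_data` with Postgres JSON operators (e.g. `row_data->>'revenue'`) scoped to one `dataset_id`, which is what makes numerical questions answerable without per-file schemas.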
by Abdul Mir
## Overview

Impress your leads with ultra-personalized "thank you" emails that look hand-written — sent automatically seconds after they submit your intake form. This workflow instantly scrapes the prospect's website, extracts meaningful copy, and uses AI to write a custom thank-you message referencing something specific from their site. It gives the impression you immediately reviewed their business and crafted a thoughtful reply — without lifting a finger.

## Who's it for

- Agencies and consultants using intake forms
- Freelancers booking discovery calls
- B2B businesses that want high-touch first impressions
- Sales teams automating initial follow-ups

## How it works

1. Triggered when a form (e.g. Tally, Typeform) is submitted
2. Scrapes the website URL provided in the form
3. Converts HTML to Markdown and extracts plain copy
4. Uses AI to write a personalized thank-you message referencing the site
5. Waits briefly to simulate a real typing delay
6. Sends the message via Gmail (or any email provider)

## Example use case

> Prospect submits a form with their website: coolstartup.ai
>
> 30 seconds later, they receive:
>
> "Thanks for reaching out! I just checked out Cool Startup's homepage — love the clean UX and mission around AI for teams. Looking forward to diving into how we might collaborate!"

## How to set up

1. Connect your form tool (e.g. Tally or Typeform)
2. Connect Gmail or another email provider
3. Customize the AI prompt to match your tone
4. Set the wait time (e.g. 30 seconds) for a realistic delay
5. Update your website scraping logic if needed

## Requirements

- Form tool with webhook support
- OpenAI (or other LLM) credentials
- Email sending integration (Gmail, Mailgun, Postmark, etc.)

## How to customize

- Edit the email tone (casual, formal, funny, etc.)
- Add CRM integration to log form submissions and responses
- Trigger additional workflows like lead scoring or Slack alerts
- Add fallback logic if the website doesn't scrape cleanly
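The "extracts plain copy" step above can be sketched as a small Code-node function: strip scripts, styles, and remaining tags from the scraped HTML before handing the text to the LLM prompt. This is a minimal regex-based sketch, not the exact logic the template ships with; for production, a proper HTML-to-Markdown node or library is more robust.

```javascript
// Minimal sketch: reduce scraped HTML to plain copy for the AI prompt.
function htmlToPlainCopy(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, ' ')  // drop embedded scripts
    .replace(/<style[\s\S]*?<\/style>/gi, ' ')    // drop stylesheets
    .replace(/<[^>]+>/g, ' ')                     // drop remaining tags
    .replace(/\s+/g, ' ')                         // collapse whitespace
    .trim();
}

const copy = htmlToPlainCopy(
  '<html><body><h1>Cool Startup</h1><p>AI for teams.</p></body></html>'
);
```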
by Lakindu Siriwardana
This visual workflow implements an AI-powered automated CV filtering system built with n8n, Google Drive, Google Sheets, and Ollama (LLM).

## ⚙️ Key Features

- 📂 **Google Drive Integration** – Automatically searches and downloads CVs (PDF/DOCX/PPTX) from a shared folder
- 📋 **Criteria Matching** – Reads and applies filtering rules from a Google Sheet
- 🧠 **LLM-Based Analysis** – Uses a Large Language Model (Ollama) to assess and interpret CV content
- 🧪 **Smart Parsing** – Includes structured and auto-fixing output parsers to ensure data accuracy
- 📊 **Automated Results Output** – Writes matching candidates and analysis to a Google Sheet
- 🔁 **Loop and Aggregate Logic** – Handles multiple CVs with iterative processing and aggregation
- 🚀 **No-Code Automation with n8n** – Fully visual and modifiable without programming

## 🛠️ How It Works

1. **Trigger**: The workflow is initiated via a webhook (from a UI "Start Workflow" button).
2. **CV Search**: Searches for CV files in a designated Google Drive folder.
3. **Loop Over Files**: Each file is downloaded and its text extracted (from PDFs or other formats).
4. **Criteria Input**: Matching rules are fetched from a predefined Google Sheet.
5. **Merge & Aggregate**: Combines file text and criteria for unified processing.
6. **LLM Processing**: Text and criteria are sent to the Basic LLM Chain, which uses the Ollama model for language understanding; structured and auto-fixing output parsers improve reliability.
7. **Custom Code Execution**: Optionally enriches or reformats the data.
8. **Output**: Results are appended to a shared Google Sheet (the output sheet).
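To make the criteria-matching step concrete, here is an illustrative sketch of how structured LLM output might be checked against rules fetched from the Google Sheet. The field names (`skills`, `yearsExperience`, `requiredSkills`, `minYears`) are assumptions for this example; the actual workflow's parsers and sheet columns may differ.

```javascript
// Hypothetical matcher: compare LLM-extracted candidate data to sheet criteria.
function matchesCriteria(candidate, criteria) {
  const skills = candidate.skills.map((s) => s.toLowerCase());
  const hasSkills = criteria.requiredSkills.every(
    (s) => skills.includes(s.toLowerCase())
  );
  return hasSkills && candidate.yearsExperience >= criteria.minYears;
}

const ok = matchesCriteria(
  { skills: ['Python', 'SQL'], yearsExperience: 4 },
  { requiredSkills: ['python'], minYears: 3 }
);
```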
by Mabura Ze Guru
## Try It Out!

This n8n template helps you keep track of mobile payments within a fundraising WhatsApp group.

**Use cases:** Fundraising via WhatsApp groups is very common in East Africa, especially in Kenya! Keeping track of each payment and tallying the totals requires a lot of manual effort and creates unnecessary tension when errors of commission or omission occur. Works with MPESA and Airtel Money, for now.

## How it works

- Connect your Twilio account / mobile number to the webhook.
- Send a WhatsApp message or use web chat.
- Simple regex classifies the text of the message.
- A Switch node routes payment messages based on the payment service provider. The message may also be a request for the current total, or an instruction to end the campaign and clear the payment logs.
- A Gemini node handles deviations from the payments topic.
- Clearing may be necessary in case of mistakes, since no edit function is provided. It also avoids mixing payments from a previous fundraising campaign with the current one.
- Payment information is extracted from the message according to the SMS format of the service provider, then saved to a data table.
- After each payment, or on a request for a summary, all payments related to the sender/group ID are fetched and passed to the next node for summarization.
- A Merge node brings in the message metadata (from, to) to support the WhatsApp reply via Twilio.

## How to use

1. As the treasurer/payee, keep SMS receipts of all incoming mobile payments. Each SMS receipt contains the amount and the sender's details, among other info.
2. Forward each receipt one by one via WhatsApp to the phone number above, or use web chat.

## Requirements

- Twilio account – for WhatsApp
- Accessible webhook URL
- This example uses a data table `payment_table`, but an SQL node is recommended for production use

Need help? Join the Discord or ask in the Forum! Happy Hacking!
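The regex extraction step above can be sketched as follows. The MPESA confirmation format assumed here ("REF Confirmed. You have received KshX from NAME PHONE ...") is a typical example, not the template's exact pattern; real receipts vary by provider and over time, so treat this as a starting point.

```javascript
// Hedged sketch of parsing an MPESA-style payment SMS forwarded via WhatsApp.
// Pattern and field layout are assumptions based on a common receipt format.
const MPESA_RE =
  /([A-Z0-9]{10})\s+Confirmed\.?\s+You have received\s+Ksh([\d,]+(?:\.\d{2})?)\s+from\s+(.+?)\s+(\d{10,12})/i;

function parseMpesaReceipt(text) {
  const m = text.match(MPESA_RE);
  if (!m) return null; // not a recognized MPESA payment message
  return {
    provider: 'MPESA',
    ref: m[1],
    amount: Number(m[2].replace(/,/g, '')), // "1,500.00" -> 1500
    sender: m[3],
    phone: m[4],
  };
}

const payment = parseMpesaReceipt(
  'QFT12ABC34 Confirmed. You have received Ksh1,500.00 from JANE DOE 254712345678 on 1/6/25'
);
```

A Switch node downstream can then route on `provider`, with a sibling regex doing the same for Airtel Money's format.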
by oka hironobu
## Who is this for

Team leads, project managers, and operations staff who want to automate meeting documentation. Useful for any team that records meetings and needs structured notes with clear action items.

## What this workflow does

This workflow accepts a meeting recording upload via a web form. The recording is uploaded to the Gemini Files API for audio analysis. Gemini AI generates a structured summary including key decisions, action items with assignees, and follow-up topics. A Notion page is created with the complete notes, and the team is notified on Slack with a summary and the action item list.

## Setup

1. Add a Google Gemini API credential for file upload and audio analysis.
2. Add a Notion API credential and create a database with columns: Title, Date, Summary, Action Items, Status.
3. Add a Slack OAuth2 credential and set your meetings channel.

## Requirements

- Google Gemini API key (supports audio file analysis)
- Notion workspace with API integration enabled
- Slack workspace with an OAuth2 app

## How to customize

- Edit the analysis prompt in "Analyze recording with Gemini" to focus on specific meeting types (standup, retrospective, planning).
- Change the Gemini model to a larger variant for longer recordings.
- Add a Google Calendar integration to automatically match recordings to calendar events.

## Important disclaimer

AI-generated summaries may not capture every detail or nuance from the recording. Always review the notes before sharing externally or making decisions based on them.
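As a sketch of the Slack notification step, the structured summary Gemini returns can be turned into a single message with the action-item list. The shape of the summary object (`title`, `summary`, `actionItems` with `task`/`assignee`) is an assumption for this example; adapt it to whatever schema your analysis prompt produces.

```javascript
// Hypothetical formatter: structured meeting notes -> Slack message body.
function buildSlackMessage(notes) {
  const items = notes.actionItems
    .map((a) => `• ${a.task} (${a.assignee})`)
    .join('\n');
  return `*${notes.title}*\n${notes.summary}\n\n*Action items:*\n${items}`;
}

const msg = buildSlackMessage({
  title: 'Sprint planning',
  summary: 'Agreed on scope for the next sprint.',
  actionItems: [{ task: 'Draft spec', assignee: 'Aya' }],
});
```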
by RamK
# Phishing Lookout (Typosquatting) and Brand Domain Monitor

This workflow monitors SSL certificate logs to find and scan new domains that might be impersonating your brand.

## Background

In modern cybersecurity, brand impersonation (or "typosquatting") is common in phishing attacks. Attackers register domains that look nearly identical to a trusted brand — such as input-n8n.io or n8n.i0 instead of the legitimate n8n.io — to deceive users into revealing sensitive credentials or downloading malware.

## How it works

1. **Monitor**: Checks crt.sh every hour for new SSL certificates matching your brand keywords.
2. **Process**: Uses a Split Out node to handle multi-domain certificates and a Filter node that ignores your own legitimate domains and keeps only the most recent certificates.
3. **Scan**: Automatically sends suspicious domains to Urlscan.io for a headless browser scan and screenshot.
4. **Loop & Triage**: Implements a 30-second Wait inside the loop to allow each scan to finish before fetching results.
5. **Alert**: Sends a Slack message with the domain name, the report link, and a screenshot of the suspicious site (e.g. one mimicking your login page), flagging a potential phishing case.

## Setup Steps

1. **Credentials**: Connect your Urlscan.io API key and Slack bot token.
2. **Configuration**: Update the "Poll crt.sh" node. In the URL `https://crt.sh/?q=%.testdomain.com&output=json`, use your specific brand name (e.g. `%.yourbrand.com` or `%.yourdomain.com` instead of `%.testdomain.com`).
3. **Whitelist**: Add your real domains to the `myDomains` list in the Filter & Deduplicate code node to prevent false alerts. Alternatively, for testing, you can choose NOT to whitelist your own domain to see how the workflow behaves; in that case your domain and subdomains will, of course, also be flagged as suspicious in the Slack alerts.
4. **Looping**: Ensure the Alert Slack node output is connected back to the Split In Batches input to process all found domains.
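The Filter & Deduplicate code node described above can be sketched like this: normalize each crt.sh hostname, drop anything matching your own whitelisted domains, and de-duplicate the rest. The `myDomains` values below are placeholders; replace them with your real domains.

```javascript
// Sketch of the whitelist filter + dedupe over crt.sh certificate hostnames.
const myDomains = ['n8n.io', 'www.n8n.io']; // placeholder whitelist

function filterSuspicious(domains) {
  const seen = new Set();
  return domains.filter((d) => {
    const domain = d.toLowerCase().replace(/^\*\./, ''); // normalize wildcard certs
    const isMine = myDomains.some(
      (m) => domain === m || domain.endsWith('.' + m)
    );
    if (isMine || seen.has(domain)) return false;
    seen.add(domain);
    return true;
  });
}

// '*.n8n.io' and 'n8n.io' are whitelisted; the lookalike 'n8n.i0' is kept once.
const suspicious = filterSuspicious([
  '*.n8n.io',
  'login.n8n.i0',
  'n8n.io',
  'login.n8n.i0',
]);
```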
by Parth Pansuriya
# Fetch Property Listings from 99Acres & MagicBricks with Apify and Google Sheets

## Who's it for

Users who want to automatically fetch and organize property listings from 99Acres and MagicBricks into Google Sheets without manual copying.

## How it works / What it does

1. Users submit search URLs via a form.
2. The workflow uses Apify scrapers to fetch listings from 99Acres & MagicBricks.
3. Data is cleaned, standardized (ID, Title, Price, Price per Sqft, URL), and deduplicated.
4. Listings are automatically appended to their respective Google Sheets tabs.

## How to set up

1. Connect your Google Sheets account in all Google Sheets nodes.
2. Open the form trigger and submit valid search URLs.
3. Run the workflow or submit the form live.
4. A new spreadsheet is created and populated automatically.

## Requirements

- Google Sheets account
- Apify API key for the 99Acres & MagicBricks scrapers
- Valid property search URLs

## How to customize the workflow

- Change sheet names or the spreadsheet title in the "Create Master Spreadsheet" node.
- Adjust API parameters in the HTTP Request nodes (like max retries or proxy settings).
- Modify the Code nodes to include additional fields or filters.
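The clean/standardize/deduplicate step can be sketched as a Code-node function that maps raw scraper output onto the standard columns (ID, Title, Price, Price per Sqft, URL) and drops repeated listing IDs. The raw field names (`id`, `title`, `price`, `area`, `url`) are assumptions for this example; check the actual Apify actor's output schema.

```javascript
// Illustrative clean + dedupe pass over raw Apify listing items.
function standardizeListings(rawListings) {
  const seen = new Set();
  return rawListings
    .map((l) => ({
      ID: String(l.id),
      Title: (l.title || '').trim(),
      Price: l.price,
      'Price per Sqft': l.area ? Math.round(l.price / l.area) : null,
      URL: l.url,
    }))
    .filter((l) => !seen.has(l.ID) && seen.add(l.ID)); // Set.add returns the Set (truthy)
}

const listings = standardizeListings([
  { id: 1, title: ' 2BHK Flat ', price: 5000000, area: 1000, url: 'https://example.com/1' },
  { id: 1, title: ' 2BHK Flat ', price: 5000000, area: 1000, url: 'https://example.com/1' },
]);
```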
by Robert Breen
This workflow pulls deals from Pipedrive, categorizes them by stage, and logs them into a Google Sheet for reporting and tracking.

## ⚙️ Setup Instructions

### 1️⃣ Connect Pipedrive

1. In Pipedrive → Personal preferences → API → copy your API token (URL shortcut: `https://{your-company}.pipedrive.com/settings/personal/api`).
2. In n8n → Credentials → New → Pipedrive API:
   - Company domain: `{your-company}` (the subdomain in your Pipedrive URL)
   - API Token: paste the token from step 1 → Save
3. In the Pipedrive Tool node, select your Pipedrive credential and (optionally) set filters (e.g. owner, label, created time).

### 2️⃣ Prepare Your Google Sheet

1. Connect your data in Google Sheets using this format: Sample Sheet (row 1 = column names).
2. In n8n, create Google Sheets (OAuth2) credentials.
3. Log in with your Google account and select your spreadsheet + worksheet.

## 🧠 How it works

- **Get many deals (Pipedrive)**: Fetches all deals with stage IDs.
- **Categorize Stages**: Maps stage IDs → friendly stage names (Prospecting, Qualified, Proposal, Negotiation, Closed Won).
- **Today's Date**: Adds a date stamp to each run.
- **Set Fields**: Combines stage, deal name, and date into clean columns.
- **Google Sheets (Append)**: Writes all rows to your reporting sheet.

## 📬 Contact

Need help customizing this (e.g. pulling only active deals, calculating win rates, or sending dashboards)?

📧 robert@ynteractive.com
🔗 Robert Breen
🌐 ynteractive.com
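The "Categorize Stages" and "Today's Date" steps above can be sketched together: a lookup from stage ID to friendly name, plus a run-date stamp on each row. The numeric stage IDs below are examples; check your own pipeline's stage IDs in Pipedrive, as they differ per account.

```javascript
// Sketch: map Pipedrive stage IDs to friendly names and stamp the run date.
const STAGE_NAMES = {
  1: 'Prospecting',
  2: 'Qualified',
  3: 'Proposal',
  4: 'Negotiation',
  5: 'Closed Won',
};

function toSheetRow(deal) {
  return {
    Deal: deal.title,
    Stage: STAGE_NAMES[deal.stage_id] || 'Unknown', // fall back for unmapped IDs
    Date: new Date().toISOString().slice(0, 10),    // YYYY-MM-DD run stamp
  };
}

const row = toSheetRow({ title: 'Acme renewal', stage_id: 3 });
```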
by Mabura Ze Guru
## Try It

This n8n template provides a self-hosted RAG implementation.

## How it works

- One workflow maintains the knowledge base and another queries it.
- Uploaded documents are saved into the Qdrant vector store.
- When a query is made, the most relevant documents are retrieved from the vector store and sent to the LLM as context for generating a response.

## How to use

1. Start the workflow by clicking Execute workflow.
2. Use the file upload form to upload a document into the knowledge base (Qdrant DB).
3. Click Open chat to start asking questions related to the uploaded documents.

## Setup steps

The steps below show how to set up on Amazon Linux; consult your OS documentation for the equivalent steps.

1. Install Ollama on-prem: `mkdir ollama && cd ollama`, then `curl -fsSL https://ollama.com/install.sh | sh`, and verify with `ollama --version`.
2. Install the required models (on Amazon Linux): `ollama pull llama3:8b`, `ollama pull mistral:7b`, `ollama pull nomic-embed-text:latest`. Ollama is then reachable at `http://localhost:11434`.
3. Fire up Qdrant (e.g. via Docker): `docker run -p 6333:6333 qdrant/qdrant`. Access the Qdrant dashboard at `http://localhost:6333/dashboard`.
4. Create a Qdrant collection named `knowledge-base` configured with a vector length of 768. NB: Do not forget a persistent Docker volume for Qdrant if you want to keep the data when using Docker.
5. Point the nodes to the respective on-premise Qdrant and Ollama runtimes.

Need help? Join the Discord or ask in the Forum! Happy RAGing!
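Step 4 above can be made concrete: Qdrant collections are created with a `PUT /collections/{name}` request, and the vector size must match the embedding model's output dimension (768 for `nomic-embed-text`). Below is a sketch of the request body; send it with any HTTP client, e.g. `curl -X PUT http://localhost:6333/collections/knowledge-base -H 'Content-Type: application/json' -d '<body>'`.

```javascript
// Sketch: build the body for PUT /collections/knowledge-base on Qdrant.
function qdrantCollectionBody(vectorSize) {
  return {
    vectors: {
      size: vectorSize,    // must equal the embedding dimension (768 for nomic-embed-text)
      distance: 'Cosine',  // a typical choice for text embeddings
    },
  };
}

const body = qdrantCollectionBody(768);
```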
by furuidoreandoro
# Automated TikTok Real Estate Research for Couples

This workflow automates finding real estate (rental) videos on TikTok, filtering them for a specific target audience (couples in their 20s), generating an explanation of why each is recommended, and saving the results to Google Sheets and Slack.

## Who's it for

- **Real estate agents & marketers**: Research trending rental properties and video styles popular on social media.
- **Content curators**: Automatically gather and summarize niche content from TikTok.
- **House hunters**: Automate the search for rental videos tailored to couples.

## How it works / What it does

1. **Trigger**: The workflow starts manually (on click).
2. **Scrape TikTok**: Connects to Apify to run a TikTok Scraper. It searches for videos with the hashtag 賃貸 ("rental") and retrieves metadata.
3. **Filter & Extract (AI Agent 1)**: An AI Agent (using OpenRouter) analyzes the retrieved video data to select properties suitable for couples in their 20s and outputs the video URL.
4. **Generate Insights (AI Agent 2)**: A second AI Agent reviews the URL/content and generates a specific reason why the property is recommended for the target audience, formatting the output with the URL and explanation.
5. **Save to Database**: The final text (URL + reason) is appended to a Google Sheet.
6. **Notify Team**: The same recommendation text is sent to a specific Slack channel.

## Requirements

- **n8n**: Version 1.0 or later.
- **Apify account**: You need an API token and access to the `clockworks/tiktok-scraper` actor.
- **OpenRouter account**: An API key to use Large Language Models (LLMs) for the AI Agents.
- **Google Cloud Platform**: A project with the Google Sheets API enabled and OAuth credentials.
- **Slack workspace**: Permission to add apps/bots to a channel.

## How to set up

1. **Import the workflow**: Copy the JSON code and paste it into your n8n editor.
2. **Configure credentials**:
   - Apify: Create a new credential in n8n using your Apify API token.
   - OpenRouter: Create a new credential using your OpenRouter API key.
   - Google Sheets: Connect your Google account via OAuth2.
   - Slack: Connect your Slack account via OAuth2.
3. **Configure nodes**:
   - Google Sheets node: Select your specific spreadsheet and sheet from the dropdown lists (replace placeholders like `YOUR_SPREADSHEET_ID` if they don't update automatically).
   - Slack node: Select the channel where you want to receive notifications (replace `YOUR_CHANNEL_ID`).
4. **Test**: Click "Execute Workflow" to run a test.

## How to customize the workflow

- **Change the search topic**: Open the Apify node and change the `hashtags` value in the "Custom Body" JSON (e.g. change "賃貸" to "DIY" or "Travel").
- **Adjust the persona**: Open the AI Agent nodes and modify the text prompt. You can change the target audience from "20s couples" to "students" or "families."
- **Increase volume**: In the Apify node, increase `resultsPerPage` or `maxProfilesPerQuery` to process more videos at once (note: this will consume more API credits).
- **Change the output format**: Modify the Google Sheets node to map specific fields (like video title, author, likes) into separate columns instead of one raw output string.
by Angel Menendez
## Who it's for

This workflow is for content creators and marketers who write short scripts in Google Sheets and want to automatically turn each line into an AI-generated avatar video stored in Google Drive, with links written back to the sheet.

## How it works

1. A Manual Trigger starts the workflow.
2. **Get Avatar Description** (Google Sheets) reads avatar details from a dedicated "Gaia" sheet.
3. The **Global Variables** node sets the working script page (for example, "Draft 5") and exposes the avatar description.
4. **Get Script** reads all rows from the selected sheet.
5. **Loop Over Items** iterates through each row, while **Set Loop Inputs** prepares the variables: avatar description, speech, and framing.
6. For every row, **Generate a video with Veo** (Google Gemini video model) creates an 8-second 16:9 clip.
7. **Upload video file** saves it to a chosen Google Drive folder, and **Update row in sheet with link to video** writes the Drive link back into the same row, then the loop moves to the next snippet.

Yellow sticky notes explain each phase, with the large one summarizing the end-to-end snippet generation loop.

## How to set up

1. Connect your Google Sheets and Google Drive credentials.
2. Update the spreadsheet IDs, sheet names, and Drive folder to match your own.
3. Configure the Gemini/Veo model credentials.
4. Adjust the default script page name in Global Variables.

## Requirements

- n8n instance
- Google Sheets and Google Drive accounts
- Google Gemini / Veo API access

No API keys or personal identifiers are hardcoded; always store credentials securely in n8n and avoid real PII in test data.

## How to customize

- Change the page value in Global Variables to target different script tabs.
- Edit the Veo prompt to alter background, camera framing, or speaking style.
- Modify video duration, aspect ratio, or output folder in the Gemini and Drive nodes.
- Extend the loop to add more post-processing steps (e.g. thumbnail generation, analytics tracking).
by Poghos Adamyan
## Overview

Automatically scrapes Google My Business listings using Apify's Google Maps Scraper, filters results to businesses with 1-star reviews, and exports structured lead data into a dedicated Google Sheet tab per run. Ideal for agencies and freelancers identifying local businesses with poor online reputations — potential leads for reputation management or review response services.

## How it works

1. Fill in the built-in form with a business type (e.g. "plumber") and location (e.g. "Miami, FL").
2. The workflow launches the Apify Google Maps Scraper actor and polls every 10 seconds until the run completes.
3. Results are filtered — only businesses with at least one 1-star review continue.
4. A new tab is created in your Google Sheet named `{query}-{timestamp}` and all matching leads are appended with full contact and review data.

## What you get per lead

- Business name and GMB profile URL
- City / address
- Primary and alternative phone number
- Business email (if available)
- Negative review URL (lowest-ranked review link)
- Negative review URL with image attached
- Total 1-star review count

## Set up steps

1. **Apify credential**: Create a free account at apify.com, generate an API token, and add it as an HTTP Header Auth credential in n8n named `Apify Token` (header name: `Authorization`, value: `Bearer YOUR_TOKEN`).
2. **Google Sheets credential**: Connect a Google Sheets OAuth2 account in n8n.
3. **Sheet ID**: Open the Build Search Query node and replace `YOUR_GOOGLE_SHEET_ID_HERE` with your Google Sheet ID (found in the sheet URL: `docs.google.com/spreadsheets/d/{SHEET_ID}/edit`).
4. **Activate the workflow**: Your unique form URL will appear in the Form Trigger node.

## Requirements

- Apify account (the free tier includes ~$5 in monthly usage credits — enough for dozens of searches)
- Google account with Google Sheets access
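The filtering and per-run tab naming described above can be sketched in a few lines. The scraper output field name (`oneStarCount`) is an assumption for this example; map it from whatever field the Google Maps Scraper actually returns for the review histogram.

```javascript
// Sketch: keep only businesses with at least one 1-star review,
// and derive the {query}-{timestamp} tab name for this run.
function filterOneStarLeads(places) {
  return places.filter((p) => (p.oneStarCount || 0) >= 1);
}

function runTabName(query, when) {
  // Sheet tab names cannot contain ':' so ISO separators are replaced.
  return `${query}-${when.toISOString().replace(/[:.]/g, '-')}`;
}

const leads = filterOneStarLeads([
  { title: 'Ace Plumbing', oneStarCount: 3 },
  { title: 'Best Plumbing', oneStarCount: 0 },
]);
```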