by Oriol Seguí
Web Consultation & Crawling Chatbot with Google Sheets Memory

Who is this workflow for?
This workflow is designed for SEO analysts, content creators, marketing agencies, and developers who need to index a website and then interact with its content as if it were a chatbot.

⚠ Note: if the site contains many pages, AI token consumption can generate high costs, especially during the initial crawling and analysis phase.

1. Initial Mode (first use with a URL)
When the user enters a URL for the first time:
- URL validation using AI (gpt-5-nano).
- Automatic sitemap discovery via robots.txt (sketched in the code example below).
- Relevant sitemap selection (pages, posts, categories, or tags) using GPT-4o according to the configured options. (Includes an "OPTIONS" node to precisely choose which types of URLs to process.)
- Crawling of all selected pages:
  - Downloads the HTML of each page.
  - Converts HTML to Markdown.
  - AI analysis to extract:
    - Detected language.
    - Heading hierarchy (H1, H2, etc.).
    - Internal and external links.
    - Content summary.
- Structured storage in Google Sheets:
  - Lang
  - H1 and hierarchy
  - External URLs
  - Internal URLs
  - Summary
  - Content
  - Data schema (flag to enable agent mode)

When finished, the sheet is marked with Data schema = true, signaling that the site is indexed.

2. Agent Mode (subsequent queries)
If the URL has already been indexed (Data schema = true), the chat becomes a LangChain Agent that:
- Reads the database in Google Sheets.
- Can perform real-time HTTP requests if it needs updated information.
- Responds as if it were the website, using stored and live data.

This allows the user to ask questions such as:
- "What's on the contact page?"
- "How many external links are there on the homepage?"
- "Give me all the H1 headings from the services pages"
- "What CTA would you suggest for my page?"
- "How would you expand X content?"

Use cases
- Build a chatbot that answers questions about a website's content.
- Index and analyze full websites for future queries.
- SEO tool to list headings, links, and content summaries.
- Assistant for quick exploration of a site's structure.
- Generate improvement recommendations and content strategies from site data.
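For reference, the sitemap-discovery step boils down to reading robots.txt and pulling out its Sitemap: lines. Below is a minimal sketch of that step as an n8n Code node; it assumes a previous HTTP Request node returned the robots.txt body in a `data` field, and the `sitemapUrl` output field is an illustrative name, not necessarily the one used in the template.

```javascript
// Minimal sketch: extract sitemap URLs from a robots.txt body.
// Assumes a prior HTTP Request node put the raw text in `data`.
const robotsTxt = $input.first().json.data || '';

const sitemaps = robotsTxt
  .split('\n')
  .filter(line => line.toLowerCase().startsWith('sitemap:'))
  .map(line => line.slice('sitemap:'.length).trim())
  .filter(url => url.length > 0);

// Emit one item per discovered sitemap for the selection step.
return sitemaps.map(url => ({ json: { sitemapUrl: url } }));
```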
by Abdullah Alshiekh
🧩 Problem Solved
Eliminates the slow, inconsistent, and error-prone process of manually replying to Facebook comments by automating it with AI, ensuring fast, accurate, and on-brand customer engagement 24/7.

📝 Description
This workflow automatically monitors your latest Facebook post for new comments. For each new comment, an AI agent instantly generates a friendly, personalized reply in Egyptian Arabic, using your Notion knowledge base to ensure all product info is accurate. It prevents spam by never replying to the same comment twice (see the duplicate-check sketch below).

Simple Flow: New Comment → Duplicate Check → AI Analysis → Post Reply → Log Action.

🎯 Key Benefits
- 24/7 Instant Replies: Engage customers immediately, anytime.
- Perfect Brand Voice: Consistent, empathetic, natural Egyptian Arabic tone in every reply.
- Always Accurate: Pulls facts directly from your knowledge base; never invents details.
- No Duplicate Replies: Robust checks ensure each comment gets only one response.
- Huge Time Saver: Frees your team from constantly monitoring comments.

🛠️ Core Features
- Facebook API Integration (Read & Reply)
- Notion Database Integration (Knowledge Base & Logging)
- AI Response Generation (Google Gemini)
- Duplicate Comment Prevention
- Automated Workflow Logic

🔧 Requirements
- Facebook Access Token with pages_read_engagement and pages_manage_posts permissions.
- Two Notion Databases: one for your product knowledge, one to log processed comments.
- Google Gemini API Key for AI.
- n8n Credentials for Facebook, Notion, and Gemini.

⚙️ Quick Customization
- Tone & Style: Edit the prompt in the Generate Customer Reply node.
- Product Info: Add fields to your Notion Knowledge Base database.
- Escalation: Add a step to flag angry comments for a human agent.

🧠 Perfect For
- E-commerce: Answering product questions on promo posts.
- Healthcare: Providing accurate drug info with compassion.
- Local Businesses: Replying to queries about hours/menu items.
- Any business that wants fast, professional customer engagement on social media.

Need help? Connect on LinkedIn.
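The duplicate check amounts to comparing incoming comment IDs against the IDs already logged in the Notion "processed comments" database. A rough illustration as an n8n Code node follows; the node names and field names (`comment_id`, `id`) are assumptions for the sketch, not the template's actual ones.

```javascript
// Illustrative duplicate filter: keep only comments not yet logged in Notion.
const comments = $('Get Latest Comments').all().map(i => i.json);    // from Facebook
const processed = $('Get Processed Comments').all().map(i => i.json); // from the Notion log

const processedIds = new Set(processed.map(p => p.comment_id));

// Only unanswered comments continue to the AI reply step.
return comments
  .filter(c => !processedIds.has(c.id))
  .map(c => ({ json: c }));
```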
by Alex Gurinovich
This n8n workflow automates support ticket handling with AI-driven classification, response generation, and safety checks. Responses are based solely on your Mintlify documentation, ensuring accuracy, consistency, and reduced manual effort in customer support.

✅ Trigger: New Ticket Received
The workflow is triggered whenever a new support ticket is created.

🔍 Check for Assignee
- If the ticket is already assigned to a human agent, the bot does nothing and exits.
- If the ticket is unassigned, the bot continues processing.

🔢 Bot Response Count Check
The workflow checks how many times the bot has already responded to this ticket. If the bot has replied more than 3 times, it stops and waits for a human to take over. This prevents endless loops and flags potentially complex cases for review.

🧠 AI-Based Ticket Categorization
An AI model analyzes the ticket content and classifies it into one of the following categories:
- 🧾 Billing → Sends a predefined billing-related message.
- 📢 Advertising → Automatically deletes the ticket.
- 🚨 Fraud → Sends a predefined fraud-related message.
- ❓ Other → Proceeds to generate a dynamic response.

🤖 Mintlify Integration
For tickets categorized as "Other", the customer's question is sent to the Mintlify API, which returns a documentation-based answer.

✍️ AI Response Formatter
The raw response from Mintlify is passed to an AI model that:
- Summarizes and rewrites the answer in a clear, friendly tone
- Limits the response to 120 words
- Adds conversational elements like "Hi," "Thanks," and a proper closing

🛡️ AI Confidence Filter
A second AI model reviews the formatted response to ensure it sounds confident and accurate. It looks for uncertainty phrases like:
- "I'm not sure"
- "I don't have enough information"
- "It depends…"
If the response is flagged as uncertain, the workflow stops and waits for a human agent to respond.

📤 Send Response & Update Ticket
If the response passes the confidence check:
- The reply is sent to the customer
- The ticket status is updated to "Pending"
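The confidence filter in the template is itself an AI model, but the same idea can be approximated with a plain keyword check. The n8n Code node below is only an illustration of that fallback; the `formattedReply` field name and the phrase list are assumptions.

```javascript
// Keyword-based approximation of the confidence filter (the template uses an AI model for this).
const reply = $input.first().json.formattedReply || '';

const uncertaintyPhrases = [
  "i'm not sure",
  "i don't have enough information",
  "it depends",
];

const isUncertain = uncertaintyPhrases.some(p => reply.toLowerCase().includes(p));

// A downstream IF node can route uncertain replies to a human agent.
return [{ json: { reply, isUncertain } }];
```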
by Hybroht
Using the Mistral API, this n8n workflow automates the process of collecting, filtering, analyzing, and summarizing news articles from multiple sources. The sources come from pre-built RSS feeds and a custom DuckDuckGo node, which you can change if you need. It delivers the most relevant news of the day in a concise manner.

++How It Works++
- The workflow begins each weekday at noon.
- The news is gathered from RSS feeds and a custom DuckDuckGo node, using HTTP GET requests when needed.
- News not published today or containing unwanted keywords is filtered out (see the filter sketch below).
- The first AI Agent selects the top news from the titles alone and generates a general title & summary.
- The next AI Agent summarizes the full content of the selected top news articles.
- The general summary and title are combined with the top 10 news summaries into a final output.

++Requirements++
- An active n8n instance (self-hosted or cloud).
- The custom DuckDuckGo node installed: n8n-nodes-duckduckgo-search
- A Mistral API key
- Configure the sub-workflow for the content that requires HTTP GET requests (it is provided in the template itself).

++Fair Notice++
This is an older version of the template. There is a superior, updated version that isn't restricted to tech news, with enhanced capabilities such as communication through different channels (email, social media) and advanced keyword filtering. It was recently published on n8n; you can find it here. If you are interested or would like to discuss specific needs, feel free to contact us.
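As a rough picture of the filtering step, the logic below keeps only items published today whose titles avoid a block list. It is a sketch for an n8n Code node; the `title`/`pubDate` fields and the keyword list are assumptions, not the template's exact configuration.

```javascript
// Keep only today's items whose titles contain no unwanted keywords.
const unwantedKeywords = ['sponsored', 'advertisement'];
const today = new Date().toISOString().slice(0, 10); // YYYY-MM-DD

return $input.all().filter(item => {
  const { title = '', pubDate = '' } = item.json;
  const published = new Date(pubDate);
  const isToday =
    !Number.isNaN(published.getTime()) &&
    published.toISOString().slice(0, 10) === today;
  const isClean = !unwantedKeywords.some(k => title.toLowerCase().includes(k));
  return isToday && isClean;
});
```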
by Jan Oberhauser
- Receives data from an incoming HTTP request (the Webhook node is set up to use the Respond to Webhook node)
- Creates dummy data
- Converts the JSON to XML, which gets returned
- The Respond to Webhook node returns the data along with its content type
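The JSON-to-XML conversion is handled by dedicated nodes in the workflow, but the idea is simply serializing the dummy object into XML elements and returning them with an XML content type. A hand-rolled sketch in an n8n Code node (element names are made up, and real values would need escaping):

```javascript
// Hand-rolled JSON-to-XML sketch; the workflow itself uses nodes for this.
const dummy = { id: 1, name: 'Example', active: true };

const xml =
  '<?xml version="1.0" encoding="UTF-8"?>\n<item>\n' +
  Object.entries(dummy)
    .map(([key, value]) => `  <${key}>${value}</${key}>`)
    .join('\n') +
  '\n</item>';

// The Respond to Webhook node can return this with Content-Type: application/xml.
return [{ json: { xml } }];
```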
by tanaypant
This workflow is the second of three. You can find the other workflows here:
- Incident Response Workflow - Part 1
- Incident Response Workflow - Part 2
- Incident Response Workflow - Part 3

We have the following nodes in the workflow:
- Webhook node: this trigger node listens for the event fired when the Acknowledge button is clicked.
- PagerDuty node: this node changes the status of the incident report from 'Triggered' to 'Acknowledged' in PagerDuty.
- Mattermost node: this node publishes a message in the auxiliary channel saying that the status of the incident report has been changed to Acknowledged.
by Harshil Agrawal
This workflow generates sensor data, which is used in another workflow for managing factory incident reports. Read more about this use case and how to build both workflows with step-by-step instructions in the blog post "How to automate your factory's incident reporting".

Prerequisites
- AMQP, an ActiveMQ connection, and credentials

Nodes
- Interval node: triggers the workflow every second.
- Set node: sets the necessary values for the items that are added to the queue.
- AMQP Sender node: sends a raw message to add to the queue.
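If you prefer randomized readings over the fixed values in the Set node, a Code node could generate them instead. This is only an illustrative alternative; the field names below are assumptions and should match whatever the incident-reporting workflow expects.

```javascript
// Illustrative replacement for the Set node: one random sensor reading per run.
return [{
  json: {
    sensorId: `sensor-${Math.floor(Math.random() * 10) + 1}`,
    temperature: +(20 + Math.random() * 15).toFixed(2), // °C
    humidity: +(30 + Math.random() * 40).toFixed(2),    // %
    timestamp: new Date().toISOString(),
  },
}];
```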
by Miquel Colomer
Do you want to keep track of the DNS entries for your customers' domains or your servers? This workflow gets the DNS information of any domain using the uProc "Get Domain DNS records" tool. You can use this workflow to check existing DNS records in real time and ensure that any domain setup is correct.

You need to add your uProc credentials (the email and API key found in the Integration section) to n8n. You can replace the "Create Domain Item" node with any integration containing a domain, like Google Sheets, MySQL, or a Zabbix server.

Every uProc node returns multiple items with the following fields per item:
- type: contains the DNS record type (A, ALIAS, AAAA, CERT, CNAME, MX, NAPTR, NS, PTR, SOA, SRV, TXT, URL).
- values: contains the DNS record values.
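Because the results arrive as one item per record with `type` and `values` fields, a short Code node can run simple checks on them downstream, for example flagging domains with no MX record. This is an illustrative follow-up step, not part of the template.

```javascript
// Flag domains whose DNS results contain no MX record.
const records = $input.all().map(i => i.json);
const mxRecords = records.filter(r => r.type === 'MX');

return [{
  json: {
    hasMx: mxRecords.length > 0,
    mxValues: mxRecords.flatMap(r => r.values || []),
  },
}];
```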
by Ludwig
How it Works
As n8n instances scale, teams often lose track of sub-workflows: who uses them, where they are referenced, and whether they can be safely updated. This leads to inefficiencies like unnecessary copies of workflows or reluctance to modify existing ones.

This workflow solves that problem by:
- Fetching all workflows and identifying which ones execute others (see the detection sketch below).
- Verifying that referenced sub-workflows exist.
- Building a caller/sub-workflow dependency graph for visibility.
- Automatically tagging sub-workflows based on their parent workflows.
- Providing a chart visualization to highlight the most-used sub-workflows.

Set Up Steps
Estimated time: ~10–15 minutes
1. Set up n8n API credentials to allow access to workflows and tags.
2. Replace instance_url with your n8n instance URL.
3. Run the workflow to analyze dependencies and generate the graph.
4. Review and validate the assigned tags for sub-workflows.
5. (Optional) Enable the pie chart visualization to see the most-used sub-workflows.

This workflow is essential for enterprise teams managing large n8n instances, preventing workflow duplication, reducing uncertainty around dependencies, and allowing safe, informed updates to sub-workflows.
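The detection step essentially scans each workflow definition returned by the n8n API for Execute Workflow nodes and records caller/callee pairs. The Code node sketch below illustrates that idea; the exact shape of `parameters.workflowId` varies between n8n versions, so treat it as an assumption to verify on your instance.

```javascript
// Build caller -> sub-workflow edges from workflow definitions fetched via the n8n API.
const workflows = $input.all().map(i => i.json);

const edges = [];
for (const wf of workflows) {
  for (const node of wf.nodes || []) {
    if (node.type === 'n8n-nodes-base.executeWorkflow') {
      edges.push({
        json: {
          caller: wf.name,
          callerId: wf.id,
          subWorkflowId: node.parameters?.workflowId ?? null, // shape may differ by n8n version
        },
      });
    }
  }
}

return edges;
```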
by Holger
++How it Works:++
This RSS reader retrieves feed links from a Google Sheets file and goes through each link to fetch the items that are younger than 3 days. It then saves them in a second Google Sheets file and deletes all older entries from that second file. Depending on the number of news feeds retrieved, this can take a while because the workflow throttles requests to avoid being blocked by the Google API.

A detailed description is in the sticky notes of the workflow.
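The cleanup of the second sheet comes down to identifying rows older than 3 days and handing them to a delete step. A rough Code node sketch follows; the `pubDate` and `row_number` fields are assumptions for illustration and depend on how the sheet is read.

```javascript
// Select rows older than 3 days so a Google Sheets node can delete them.
const cutoff = Date.now() - 3 * 24 * 60 * 60 * 1000;

const staleRows = $input.all().filter(item => {
  const published = new Date(item.json.pubDate).getTime();
  return Number.isNaN(published) || published < cutoff;
});

// Pass only the row numbers to the downstream delete step.
return staleRows.map(item => ({ json: { row_number: item.json.row_number } }));
```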
by Giulio
This n8n workflow template allows you to create a CRUD endpoint that performs the following actions:
- Create a new record
- Get a record
- Get many records
- Update a record
- Delete a record

This template is connected with Airtable, but you can replace the Airtable nodes with anything you need to interact with (e.g. Postgres, MySQL, Notion, Coda...). The template uses the n8n Webhook node setting 'Allow Multiple HTTP Methods' to enable multiple HTTP methods on the same node.

Features
- Just two nodes to create 5 endpoints
- Use it with Airtable or replace the Airtable nodes for your own customization
- Add your custom logic, exploiting all of n8n's possibilities

Workflow Steps
- **Webhook node**: exposes the endpoints to get many records and create a new record.
- **Webhook (with ID) node**: exposes the endpoints to get, update, and delete a record. Due to an n8n limitation, this endpoint will have an additional code in the path (e.g. https://my.app.n8n.cloud/webhook/580ccc56-f308-4b64-961d-38323501a170/customers/:id). Keep this in mind when using these endpoints in your application.
- **Various Airtable nodes**: execute the specific operations needed to interact with Airtable records.

Getting Started
To deploy and use this template:
1. Import the workflow into your n8n workspace.
2. Customize the endpoint paths by tweaking the 'Path' parameters in the 'Webhook' and 'Webhook (with ID)' nodes (currently customers).
3. Set up your Airtable credentials by following this guide and customize the Airtable nodes by selecting your base, table, and the correct fields to update. ...or... replace the Airtable nodes and connect the endpoint to any other service (e.g. Postgres, MySQL, Notion, Coda).

How to use the workflow
1. Activate the workflow.
2. Connect your app to the endpoints (production URLs) to perform the various operations allowed by the workflow (see the example calls below).

Note that the Webhook nodes have two URLs, one for testing and one for production. The testing URL is activated when you click the 'Test workflow' button and can't be used in production. The production URL is available after you activate the workflow. More info here.

Feel free to get in touch with me if you have questions about this workflow.
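For orientation, client calls against the resulting endpoints might look like the sketch below. The base URL, the `customers` path, and the request bodies are placeholders; the single-record routes must include the extra code that the Webhook (with ID) node adds to its path.

```javascript
// Example client usage of the CRUD endpoints (placeholders; adjust to your instance).
async function demo() {
  const base = 'https://my.app.n8n.cloud/webhook';

  // Get many records
  const customers = await fetch(`${base}/customers`).then(r => r.json());

  // Create a record (fields depend on your Airtable table)
  await fetch(`${base}/customers`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ name: 'Jane Doe' }),
  });

  // Get / update / delete of a single record goes through the Webhook (with ID) node,
  // whose path contains an extra node-specific code.
  await fetch(`${base}/<webhook-id>/customers/42`, { method: 'DELETE' });

  return customers;
}
```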
by Shahrukh
AI-Powered Local Lead Generation Workflow with n8n

This workflow solves one of the biggest pain points for freelancers, agencies, and SaaS founders: finding accurate local business leads at scale without manual copy-pasting or unreliable scraping tools. Traditional lead generation is time-consuming and prone to errors. This template automates the entire process so you can focus on outreach, not data gathering.

✅ What the Workflow Does
- Accepts a business type (e.g., plumbers) and city (e.g., Los Angeles) as input
- Uses AI to generate hyperlocal search terms for full neighborhood coverage
- Scrapes Google Maps results to extract business details and websites
- Filters out junk, Google-owned links, and duplicates
- Scrapes the homepage HTML of each business and extracts valid emails using regex (sketched below)
- Outputs a clean, deduplicated lead list with business names, websites, and emails

🛠 Everything Runs Inside n8n With:
- **OpenAI** for AI-driven query generation
- **Zyte API** for reliable scraping
- **JavaScript functions** for email extraction
- **Built-in filtering and batching** for clean results

👥 Who is This For?
- **Marketing agencies** doing local outreach
- **Freelancers** offering SEO, design, or lead gen services
- **SaaS founders** targeting SMBs
- **Sales teams** scaling outbound campaigns

✅ Requirements
- **n8n account** (Cloud or self-hosted)
- **OpenAI API key** (stored in n8n credentials)
- **Zyte API key** (stored securely)
- Basic familiarity with Google Sheets if you want to export results

⚙️ How to Set Up
1. Import the workflow JSON into n8n
2. Go to Credentials in n8n and add your OpenAI and Zyte API keys
3. Replace the placeholder credential references in the HTTP Request nodes
4. Set your search parameters (business type and city) in the designated Set node
5. Test the workflow with a single search to confirm the scraping and email extraction steps
6. Configure batching if you plan to scrape multiple neighborhoods
7. Add an output step (Google Sheets, Airtable, or CRM) to store your leads

🔧 How to Customize
- Update the OpenAI prompt for different search formats (e.g., service + zip code)
- Adjust the regex pattern in the JavaScript node for additional email validation rules
- Add extra filtering logic for niche-specific keywords
- Integrate with Instantly, HubSpot, or any email-sending tool for full automation
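The email-extraction step is conceptually a regex scan over each scraped homepage followed by deduplication. The Code node below is only a minimal sketch of that idea; the `html` field name and the filtering rules are assumptions, and the template's own JavaScript node may differ.

```javascript
// Minimal email-extraction sketch: regex scan of scraped HTML plus deduplication.
const html = $input.first().json.html || '';

const emailRegex = /[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}/g;
const matches = html.match(emailRegex) || [];

// Deduplicate and drop obvious junk such as image filenames that look like emails.
const emails = [...new Set(matches)].filter(
  e => !/\.(png|jpe?g|gif|svg|webp)$/i.test(e)
);

return [{ json: { emails } }];
```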