by David Olusola
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

# 📁 Google Drive MCP Workflow – AI-Powered File Management Automation 🚀

## 🧠 Overview
A secure and intelligent n8n workflow that connects to Google Drive via MCP (Model Context Protocol). Ideal for AI agent tasks, compliance-driven storage, and document automation.

## 🌟 Key Features

### 🔒 Built-In Safety
- Backs up files before edits (timestamped)
- Supports rollback using file history
- Validates file size, type, and permissions

### 📁 Smart Organization
- Automatically converts file types (PDF, DOCX, etc.)
- Moves files to structured folders
- Auto-archives old files based on age or rules

### 🔄 MCP Integration
- Accepts standardized JSON via webhook
- Real-time execution for AI agents
- Fully customizable input (action, fileId, format, etc.)

## ✅ AI-Callable MCP Actions
These are the commands AI agents can perform via MCP:
- Download a file (with optional format conversion)
- Upload a new file to Google Drive
- Copy a file for backup
- Move a file to a specific folder
- Archive old or inactive files
- Organize documents into folders
- Convert files to a new format (PDF, DOCX, etc.)
- Retrieve and review file history for rollback

## 📝 Example Input

```json
{
  "action": "download",
  "fileId": "abc123",
  "folderPath": "/projects/clientA",
  "convertFormat": "pdf"
}
```

## 🔐 Security & Performance
- OAuth2-secured access to the Google Drive API
- No sensitive data stored in transit
- Real-time audit logs and alerts
- Batch-friendly with built-in rate limiting

## 📌 Ideal For
- Businesses automating file management
- AI agents retrieving, sorting, converting, or archiving files
- Compliance teams needing file versioning and backups

## ⚙️ Requirements
- n8n + Google Drive API v3
- MCP server + webhook integration
- Google OAuth2 credentials
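Payloads like the example input above should be validated before any Drive action is dispatched. A minimal sketch of such a check, assuming the action names and field contract implied by the action list above (this is a hypothetical helper, not part of the template):

```javascript
// Hypothetical validator for incoming MCP webhook payloads.
// The allowed-action and format lists are assumptions based on this template's docs.
const ALLOWED_ACTIONS = new Set([
  'download', 'upload', 'copy', 'move', 'archive', 'organize', 'convert', 'history',
]);

function validateMcpPayload(payload) {
  const errors = [];
  if (!ALLOWED_ACTIONS.has(payload.action)) {
    errors.push(`unknown action: ${payload.action}`);
  }
  // Every action except upload operates on an existing file.
  if (payload.action !== 'upload' && !payload.fileId) {
    errors.push('fileId is required');
  }
  // convertFormat is optional, but must be a supported type when present.
  if (payload.convertFormat && !['pdf', 'docx', 'txt'].includes(payload.convertFormat)) {
    errors.push(`unsupported format: ${payload.convertFormat}`);
  }
  return { ok: errors.length === 0, errors };
}
```

Rejecting malformed requests at the webhook keeps downstream nodes (backup, conversion, move) from ever seeing bad input.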
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically analyzes competitor backlink profiles to understand their link building strategies and identify opportunities for your own SEO efforts. It saves you time by eliminating the need to manually research competitor links and provides detailed insights into their most valuable linking relationships.

## Overview
This workflow automatically scrapes backlink analysis tools and competitor websites to extract comprehensive backlink data, including referring domains, anchor text, link quality metrics, and link acquisition patterns. It uses Bright Data to access backlink databases and AI to intelligently analyze competitor link strategies.

## Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping backlink analysis platforms without being blocked
- **OpenAI**: AI agent for intelligent backlink strategy analysis
- **Google Sheets**: For storing competitor backlink data and insights

## How to Install
1. **Import the Workflow**: Download the .json file and import it into your n8n instance
2. **Configure Bright Data**: Add your Bright Data credentials to the MCP Client node
3. **Set Up OpenAI**: Configure your OpenAI API credentials
4. **Configure Google Sheets**: Connect your Google Sheets account and set up your backlink analysis spreadsheet
5. **Customize**: Define competitor domains and backlink analysis parameters

## Use Cases
- **SEO Strategy**: Learn from competitor link building success and replicate strategies
- **Link Prospecting**: Identify websites that link to competitors but not to you
- **Competitive Intelligence**: Understand competitor SEO strategies and authority sources
- **Link Building**: Find high-quality link opportunities in your industry

## Connect with Me
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #competitorbacklinks #backlinkanalysis #seo #linkbuilding #brightdata #webscraping #competitoranalysis #n8nworkflow #workflow #nocode #linkanalysis #backlinkresearch #seoanalysis #competitiveintelligence #linkresearch #seostrategy #backlinkmonitoring #linkprospecting #domainanalysis #seotools #backlinkaudit #linkbuilding #organicseo #searchmarketing #competitorresearch #linkstrategy #seocompetitor #backlinkinsights
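The "Link Prospecting" use case above (sites that link to competitors but not to you) reduces to a set difference over referring domains. A sketch of how that comparison might look in an n8n Code node — the input shape is an assumption for illustration, not the template's actual schema:

```javascript
// Given referring-domain lists scraped for a competitor and for your own site,
// return the domains that link to the competitor but not to you.
function findLinkProspects(competitorDomains, ownDomains) {
  const ours = new Set(ownDomains.map((d) => d.toLowerCase()));
  return competitorDomains
    .map((d) => d.toLowerCase())
    .filter((d) => !ours.has(d));
}
```

Normalizing case before comparing avoids false positives when scrapers return mixed-case hostnames.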
by Yaron Been
This workflow automatically monitors competitor pricing across multiple products and services to track market positioning and pricing strategies. It saves you time by eliminating the need to manually check competitor prices and provides real-time insights into pricing changes and market trends.

## Overview
This workflow automatically scrapes competitor websites and pricing pages to extract current pricing information, product details, and promotional offers. It uses Bright Data to access pricing data without restrictions and AI to intelligently parse pricing information and detect changes over time.

## Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping competitor pricing pages without being blocked
- **OpenAI**: AI agent for intelligent pricing data extraction and analysis
- **Google Sheets**: For storing pricing data and tracking changes over time

## How to Install
1. **Import the Workflow**: Download the .json file and import it into your n8n instance
2. **Configure Bright Data**: Add your Bright Data credentials to the MCP Client node
3. **Set Up OpenAI**: Configure your OpenAI API credentials
4. **Configure Google Sheets**: Connect your Google Sheets account and set up your pricing tracking spreadsheet
5. **Customize**: Define competitor URLs and pricing monitoring parameters

## Use Cases
- **Pricing Strategy**: Stay competitive by monitoring market pricing trends
- **Product Management**: Track competitor feature and pricing changes
- **Sales Teams**: Provide up-to-date competitive pricing information
- **Market Research**: Analyze pricing patterns and market positioning

## Connect with Me
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #pricemonitoring #competitorpricing #brightdata #webscraping #pricinganalysis #n8nworkflow #workflow #nocode #competitoranalysis #pricingdata #marketresearch #pricingtrends #competitiveintelligence #pricingtracking #marketanalysis #pricecomparison #competitormonitoring #businessintelligence #pricingstrategy #marketpositioning #pricinginsights #competitorresearch #pricingautomation #markettrends #pricealerts #dynamicpricing #pricingoptimization #competitivepricing
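Detecting pricing changes over time, as described above, can be as simple as diffing the latest scrape against the previous Google Sheets snapshot. A sketch of that comparison — field names (`product`, `price`) are assumptions for illustration:

```javascript
// Compare previous and current price snapshots for each product and
// emit a change record only where the price actually moved.
function detectPriceChanges(previous, current) {
  const prevByProduct = new Map(previous.map((p) => [p.product, p.price]));
  return current
    .filter((p) => prevByProduct.has(p.product) && prevByProduct.get(p.product) !== p.price)
    .map((p) => {
      const oldPrice = prevByProduct.get(p.product);
      return {
        product: p.product,
        oldPrice,
        newPrice: p.price,
        // Percentage change, rounded to one decimal place.
        changePct: Number((((p.price - oldPrice) / oldPrice) * 100).toFixed(1)),
      };
    });
}
```

Emitting only the changed rows keeps the tracking sheet readable and makes it easy to wire an alert node onto the output.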
by Itamar
# 🕵️ Company Research Agent (n8n + Explorium + LLM)

This n8n workflow automates company research by combining Explorium's MCP server, web scraping tools, and an AI agent. Results are written to a Google Sheet for easy use in GTM, product analysis, or competitive research.

## 🚀 What It Does
Given a list of company domains or names, this workflow will:
1. Look up company information using:
   - 🧠 LLM agent to guide the research
   - 🔎 Explorium MCP server for firmographic & tech signals
   - 🌐 Website content and SerpAPI scraping (optional)
2. Extract key commercial details (see below)
3. Format the output in a consistent JSON structure
4. Update a connected Google Sheet with the enriched results

## 🧩 Extracted Fields
Each company is enriched with:
- domain
- linkedinUrl
- has_free_trial
- cheapest_plan
- has_enterprise_plan
- last_case_study_link
- market (e.g., B2B or B2C)
- integrations (e.g., Slack, HubSpot, MySQL)
- enrichment_status

## 📥 Input Sheet Format

| input |
|-----------|
| Explorium |
| n8n |
| Apple |
| ... |

## 📤 Output Sheet Format

| domain | linkedinUrl | has_free_trial | cheapest_plan | has_enterprise_plan | last_case_study_link | market | integrations | enrichment_status |
|--------------|----------------------------------|------|----|------|-----------------------------|-----|----------------------------------------------------|------|
| Explorium.ai | https://linkedin.com/company/... | TRUE | 69 | TRUE | https://www.explorium.com | B2B | ["HubSpot", "Zapier", "Salesforce", ...] | done |
| n8n.io | https://linkedin.com/company/... | TRUE | 20 | TRUE | https://n8n.io/case-studies | B2B | ["Slack", "Gmail", "MySQL", "Google Sheets", ...] | done |

## 🛠️ Tools Used
- **n8n** (automation platform)
- **Explorium MCP Server** – rich company enrichment via API
- **Anthropic Claude or OpenAI** – used by the AI researcher
- **Google Sheets** – stores output data
- **Structured Output Parser** – ensures clean, predictable JSON formatting

## 📦 How to Set It Up
1. Add your company domains or names to the input sheet
2. Configure your MCP and SerpAPI credentials in n8n
3. Run the workflow using the Test Workflow trigger
4. Watch the sheet populate with results

You can adapt the system to output different formats or fields depending on your team's research goals.

## 📌 Use Cases
- Competitive landscape analysis
- Lead intelligence for outbound campaigns
- Feature benchmarking (e.g., who offers enterprise or free trial)
- VC/investment research

## 🧠 Notes
This agent is easily customizable. Adjust the LLM prompt or Output Parser to extract different properties. Explorium MCP is leveraged as the core enrichment engine, ensuring signal accuracy and freshness.
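The guarantee the Structured Output Parser provides — every enriched row carrying the fields listed above — can be approximated with a small completeness check before the sheet write. A hedged sketch, not the actual parser node; the field list is taken from the "Extracted Fields" section:

```javascript
// Verify an enrichment result contains every expected field before
// writing it to the output sheet; mark incomplete rows accordingly.
const EXPECTED_FIELDS = [
  'domain', 'linkedinUrl', 'has_free_trial', 'cheapest_plan',
  'has_enterprise_plan', 'last_case_study_link', 'market', 'integrations',
];

function finalizeRow(row) {
  const missing = EXPECTED_FIELDS.filter((f) => row[f] === undefined || row[f] === null);
  return {
    ...row,
    enrichment_status: missing.length === 0 ? 'done' : `missing: ${missing.join(', ')}`,
  };
}
```

Recording *which* fields are missing in `enrichment_status` makes failed enrichments easy to re-run selectively.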
by n8n Team
This workflow integrates both web scraping and NLP functionalities. It uses HTML parsing to extract links, HTTP requests to fetch essay content, and AI-based summarization using GPT-4o. It's an excellent example of an end-to-end automated task that is not only efficient but also provides real value by summarizing valuable content. Note that to use this template, you need to be on n8n version 1.50.0 or later.
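The link-extraction step can be sketched as follows. This is only an illustration — a real n8n workflow would use the HTML node (or a proper parser) rather than a regex, which is used here just to keep the example dependency-free:

```javascript
// Extract href values from anchor tags in an HTML string.
// Illustrative only: n8n's HTML Extract node is the robust way to do this.
function extractLinks(html) {
  const links = [];
  const re = /<a\s+[^>]*href="([^"]+)"/gi;
  let m;
  while ((m = re.exec(html)) !== null) {
    links.push(m[1]);
  }
  return links;
}
```

Each extracted link would then feed an HTTP Request node to fetch the essay body for summarization.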
by Sarfaraz Muhammad Sajib
This n8n workflow sends SMS messages through the Textbelt API by accepting a phone number, message, and API key as inputs. It uses a manual trigger to start the process, sets the necessary data, and executes an HTTP POST request to deliver the SMS.

Step-by-step explanation:
1. **Manual Trigger**: Starts the workflow manually by clicking 'Execute workflow'.
2. **Set Data Node**: Defines the required input parameters (phone, message, and key) that will be sent to the SMS API. You can populate these fields with your target phone number, the text message, and your Textbelt API key.
3. **HTTP Request Node**: Sends a POST request to https://textbelt.com/text with the phone number, message, and API key in the request body. The response from the API confirms whether the message was successfully sent.
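The body the HTTP Request node sends can be reproduced outside n8n for testing. A minimal sketch of building the form-encoded payload (field names match the Set Data node above; actually sending it is left to your HTTP client):

```javascript
// Build the form-encoded request body for Textbelt's SMS endpoint.
function buildTextbeltBody({ phone, message, key }) {
  return new URLSearchParams({ phone, message, key }).toString();
}
```

Textbelt accepts the free `textbelt` key for one test message per day, which makes this easy to verify before wiring in a paid key.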
by vinci-king-01
# Daily Report Generator with Mattermost and HubSpot

This workflow automatically compiles key metrics from HubSpot (and optional internal data sources) into a concise daily summary and posts it to a designated Mattermost channel. It helps sales and marketing teams stay informed without manually pulling reports or navigating multiple dashboards.

## Pre-conditions / Requirements

### Prerequisites
- An n8n instance (self-hosted or n8n Cloud)
- HubSpot account with a Private App token
- Mattermost workspace with an incoming webhook enabled
- (Optional) Internal REST API or database endpoint for additional data

### Required Credentials
- **HubSpot Private App Token** – Grants API access to Deals, Contacts, Activities, etc.
- **Mattermost Personal Access Token** (or Incoming Webhook URL) – Permits message posting to channels
- **n8n User Management** – Ensure the workflow has network access to HubSpot and Mattermost

### Specific Setup Requirements

| Component | Requirement | Example/Notes |
|------------|---------------------------------------------------------|--------------------------------------------------------|
| Mattermost | Create an Incoming Webhook or generate a PAT | System Console → Integrations |
| HubSpot | Create a Private App → Scopes: `crm.objects.deals.read` | Settings → Integrations → Private Apps |
| Scheduler | External cron job or n8n's internal trigger* | `curl https://n8n.yourdomain.com/webhook/daily_report` |

*The provided template uses a Webhook node so you can trigger it via any scheduler (e.g., crontab, Zapier, GitHub Actions). Replace with the Cron node if preferred.

## How it works

Key steps:
1. **Webhook Trigger**: Waits for a daily call from an external scheduler.
2. **HubSpot Node**: Retrieves deal statistics, new contacts, and other CRM metrics.
3. **HTTP Request Node**: (Optional) Pulls supplementary data from an internal API.
4. **Merge Node**: Consolidates HubSpot and optional data sources.
5. **Code / Set Nodes**: Formats numbers, calculates KPIs, and builds a markdown message.
6. **If Node**: Guards against empty datasets or API failures.
7. **Mattermost Node**: Posts the formatted report to the chosen channel.
8. **Respond to Webhook**: Returns a JSON confirmation to the scheduler.

## Set up steps
Setup time: 15–20 minutes

1. **Import Template**: In n8n, go to "Workflows → Import from File" and select the JSON template.
2. **Add Credentials**:
   - HubSpot → New credential → Paste Private App token
   - Mattermost → New credential → Paste PAT or Webhook URL
3. **Configure Webhook URL**: Copy the production URL of the Webhook node and add it to your external scheduler (e.g., crontab, Zapier).
4. **Adjust Query Parameters** (HubSpot node): Modify filters (e.g., deal stage, create date) as needed.
5. **Edit Message Template** (Code node): Update markdown formatting; include/exclude sections.
6. **Test Run**: Manually execute the workflow. Verify the JSON response and Mattermost post.
7. **Activate**: Toggle the workflow to "Active". Confirm your scheduler triggers it at the desired time.

## Node Descriptions

Core workflow nodes:
- **stickyNote** – Contains inline documentation and instructions.
- **Webhook** – Primary trigger; receives the daily HTTP call.
- **HubSpot** – Pulls deals, contacts, and engagement data.
- **HTTP Request** – Fetches optional internal statistics (e.g., support tickets).
- **Merge** – Combines HubSpot and HTTP results into one object.
- **Set** – Selects and renames fields for clarity.
- **Code** – Calculates KPIs (e.g., conversion rate) and assembles a markdown summary.
- **If** – Checks for empty data arrays or API errors.
- **Mattermost** – Sends the final message to a channel.
- **Respond to Webhook** – Returns a success/failure payload to the caller.

### Data Flow
Webhook → HubSpot
Webhook → HTTP Request
HubSpot + HTTP Request → Merge → Set → Code → If → Mattermost → Respond to Webhook

## Customization Examples

### Change Report Time Range

```javascript
// HubSpot Node → Additional Fields
{
  "filterGroups": [{
    "filters": [{
      "propertyName": "createdate",
      "operator": "BETWEEN",
      "highValue": Date.now(),
      "value": Date.now() - 24 * 60 * 60 * 1000 // last 24h
    }]
  }]
}
```

### Format Mattermost Message with Emojis

```javascript
// Code Node (return statement)
return [{
  json: {
    text: `:bar_chart: Daily CRM Report\n\n• New Deals: ${newDeals}\n• New Contacts: ${newContacts}\n• Win Rate: ${winRate}%`
  }
}];
```

## Data Output Format
The workflow outputs structured JSON data:

```json
{
  "status": "success",
  "date": "2024-05-23",
  "hubspot": {
    "new_deals": 12,
    "new_contacts": 34,
    "win_rate": 27.1
  },
  "internal": {
    "tickets_opened": 8,
    "tickets_closed": 6
  },
  "mattermostPostId": "abc123xyz"
}
```

## Troubleshooting

Common issues:
- **401 Unauthorized (HubSpot)** – Verify the Private App token and scopes. Regenerate if necessary.
- **Message Not Posting** – Ensure the Mattermost token has `post:write` or the webhook URL is valid.

Performance tips:
- Cache HubSpot responses during testing to avoid hitting API limits.
- Reduce payload size by selecting only the fields you need in the Set node.

Pro tips:
- Replace the Webhook node with the Cron node for an all-in-n8n schedule.
- Use environment variables for tokens (`{{$env.HUBSPOT_TOKEN}}`) to avoid hard-coding secrets.
- Add a second Mattermost node to DM managers for critical alerts (e.g., low win rate).

This is a community workflow template provided "as-is." It is not officially supported by n8n GmbH. Always review and test in a development environment before deploying to production.
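The KPI calculation in the Code node (e.g., the win rate shown in the output JSON) might look like the sketch below. Field names (`stage`, `closedwon`, `closedlost`) are assumptions for illustration, not the template's exact code:

```javascript
// Compute the win rate (percentage) from a list of HubSpot deals.
// A deal counts toward the rate only once it has closed either way.
function computeWinRate(deals) {
  const closed = deals.filter((d) => d.stage === 'closedwon' || d.stage === 'closedlost');
  if (closed.length === 0) return 0; // avoid division by zero on empty days
  const won = closed.filter((d) => d.stage === 'closedwon').length;
  return Number(((won / closed.length) * 100).toFixed(1));
}
```

Guarding the zero-closed-deals case here is what lets the downstream If node treat an empty dataset gracefully instead of posting `NaN` to Mattermost.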
by scrapeless official
## Brief Overview
This workflow integrates Linear, Scrapeless, and Claude AI to create an AI research assistant that responds to natural-language commands and automatically performs market research, trend analysis, data extraction, and intelligent analysis. Simply enter a command such as /search, /trends, or /crawl in a Linear task, and the system will automatically perform the corresponding search, crawling, or trend-analysis operation and return Claude AI's analysis to Linear as a comment.

## How It Works
1. **Trigger**: A user creates or updates an issue in Linear and enters a specific command (e.g., /search competitor analysis).
2. **n8n Webhook**: Listens to Linear events and triggers the automated process.
3. **Command identification**: A Switch node determines the type of command entered by the user (search/trends/unlock/scrape/crawl).
4. **Data extraction**: Calls the Scrapeless API to perform the corresponding data-crawling task.
5. **Data cleaning and aggregation**: A Code node unifies the structure of the data returned by Scrapeless.
6. **Claude AI analysis**: Claude receives the structured data and generates summaries, insights, and recommendations.
7. **Result writing**: Writes the analysis results to the original issue as comments through the Linear API.

## Features

Multiple commands supported:
- /search: Google SERP data query
- /trends: Google Trends trend analysis
- /unlock: Unlock protected web content (JS rendering)
- /scrape: Single-page crawling
- /crawl: Whole-site, multi-page crawling

Claude AI intelligent analysis:
- Automatically structures Scrapeless data
- Generates actionable suggestions and trend insights
- Formats output to fit Linear's comment format

Complete automation:
- Codeless process management based on n8n
- Multi-channel parallel logic distribution + data standardization
- Supports custom API keys, regional language settings, and other parameters

## Requirements
- **Scrapeless API Key**: Scrapeless service request credentials. Log in to the Scrapeless Dashboard, click "Setting" on the left → select "API Key Management" → click "Create API Key". Finally, click the API key you created to copy it.
- **n8n Instance**: Self-hosted or n8n.cloud account.
- **Claude AI**: Anthropic API key (Claude Sonnet 3.7 model recommended)

## Installation
1. Log in to Linear and get a Personal API Token
2. Log in to n8n Cloud or a local instance
3. Import the n8n workflow JSON file provided by Scrapeless
4. Configure the following environment variables and credentials:
   - Linear API Token
   - Scrapeless API Token
   - Claude API Key
5. Configure the Webhook URL and bind it on the Linear webhook settings page

## Usage
This AI research assistant is ideal for:

| Industry / Role | Use Case |
|-------------------------------|----------------------------------------------------------------------------------------------------|
| **SaaS / B2B Software** | |
| Market Research Teams | Analyze competitor pricing pages using /unlock, and feature pages via /scrape. |
| Content & SEO | Discover trending keywords and SERP data via /search and /trends to guide content topics. |
| Product Managers | Use /crawl to explore product documentation across competitor sites for feature benchmarking. |
| **AI & Data-Driven Teams** | |
| AI Application Developers | Automate info extraction + LLM summarization for building intelligent research agents. |
| Data Analysts | Aggregate structured insights at scale using /crawl + Claude summarization. |
| Automation Engineers | Integrate command workflows (e.g., /scrape, /search) into tools like Linear to boost productivity. |
| **E-commerce / DTC Brands** | |
| Market & Competitive Analysts | Monitor competitor sites, pricing, and discounts with /unlock and /scrape. |
| SEO & Content Teams | Track keyword trends and popular queries via /search and /trends. |
| **Investment / Consulting / VC** | |
| Investment Analysts | Crawl startup product docs, guides, and support pages via /crawl for due diligence. |
| Consulting Teams | Combine SERP and trend data (/search, /trends) for fast market snapshots. |
| **Media / Intelligence Research** | |
| Journalists & Editors | Extract forum/news content from platforms like HN or Reddit using /scrape. |
| Public Opinion Analysts | Monitor multi-source keyword trends and sentiment signals to support real-time insights. |

## Output
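The command-identification step described in "How It Works" can be sketched as a small parser that splits a Linear comment into command and query before the Switch node branches on it. A hypothetical helper for illustration — the template does this with n8n Switch-node expressions rather than code:

```javascript
// Parse a Linear issue command like "/search competitor analysis"
// into { command, query }; return null for non-command text.
const KNOWN_COMMANDS = ['search', 'trends', 'unlock', 'scrape', 'crawl'];

function parseCommand(text) {
  const m = text.trim().match(/^\/(\w+)\s*(.*)$/);
  if (!m || !KNOWN_COMMANDS.includes(m[1])) return null;
  return { command: m[1], query: m[2].trim() };
}
```

Returning null for unrecognized text lets the workflow ignore ordinary issue updates instead of triggering a scrape on every comment.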
by Dataki
This workflow helps you generate an llms.txt file (if you're unfamiliar with it, check out this article) using a Screaming Frog export. Screaming Frog is a well-known website crawler: crawl a website, then export the "internal_html" section in CSV format.

## How It Works
A form allows you to enter:
- The name of the website
- A short description
- The internal_html.csv file from your Screaming Frog export

Once the form is submitted, the workflow is triggered automatically, and you can download the llms.txt file directly from n8n.

## Downloading the File
Since the last node in this workflow is "Convert to File", you will need to download the file directly from the n8n UI. However, you can easily add a node (e.g., Google Drive, OneDrive) to automatically upload the file wherever you want.

## AI-Powered Filtering (Optional)
This workflow includes a text classifier node, which is deactivated by default. You can activate it to apply a more intelligent filter that selects URLs for the llms.txt file. Consider modifying the description in the classifier node to specify the type of URLs you want to include.

## How to Use This Workflow
1. Crawl the website you want to generate an llms.txt file for using Screaming Frog.
2. Export the "internal_html" section in CSV format.
3. In n8n, click "Test Workflow", fill in the form, and upload the internal_html.csv file.
4. Once the workflow is complete, go to the "Convert to File" node and download the output.

That's it! You now have your llms.txt file.

## Recommended Usage
Use this workflow directly in the n8n UI by clicking "Test Workflow" and uploading the file in the form.
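The transformation from Screaming Frog rows to llms.txt content can be sketched as below. The column names (`Address`, `Title 1`) follow Screaming Frog's internal_html CSV export, but treat both them and the exact llms.txt layout as assumptions to verify against your own export:

```javascript
// Build llms.txt content from Screaming Frog internal_html rows.
// Each row is an object keyed by the CSV's column headers.
function buildLlmsTxt(siteName, description, rows) {
  const lines = [`# ${siteName}`, '', `> ${description}`, ''];
  for (const row of rows) {
    lines.push(`- [${row['Title 1']}](${row['Address']})`);
  }
  return lines.join('\n');
}
```

In the workflow itself, the optional classifier node would filter `rows` before this step so only the URLs worth surfacing to LLMs end up in the file.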
by Nabin Bhandari
This template uses VAPI and Cal.com to book appointments through a voice conversation. It detects whether the user wants to check availability or book an appointment, then responds naturally with real-time scheduling options.

## Who is this for?
This workflow is perfect for:
- Voice assistant developers
- AI receptionists and smart concierge tools
- Service providers (salons, clinics, coaches) needing hands-free scheduling
- Anyone building voice-based customer experiences

## What does it do?
This workflow turns a natural voice conversation into a working appointment system:
1. It starts with a Webhook connected to your VAPI voice agent.
2. The Set node extracts user intent (like "check availability" or "book now").
3. A Switch node branches logic based on the intent.
4. If the user wants to check availability, the workflow fetches available times from Cal.com.
5. If the user wants to book, it creates a new event using Cal.com's API.
6. The final result is sent back to VAPI as a conversational voice response.

## How to use it
1. Import this workflow into your n8n instance.
2. Set up a Webhook node and connect it to your VAPI voice agent.
3. Add your Cal.com API token as a credential (use HTTP Header Auth).
4. Deploy and test using VAPI's simulator or real phone input.
5. (Optional) Customize the OpenAI prompt if you're using it to process or moderate inputs.

## Requirements
- A working VAPI agent
- A Cal.com account with API access
- n8n (cloud or self-hosted)
- An understanding of how to configure webhook and API credentials in n8n

## Customization Ideas
- Swap out Cal.com for another booking API (like Calendly)
- Add a Google Sheets or Supabase node to log appointments
- Use OpenAI to summarize or sanitize voice inputs before proceeding
- Build multi-turn conversations in VAPI for more complex bookings
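The Set + Switch logic above amounts to routing on intent. A sketch of that branch — the intent values and the endpoint paths are assumptions for illustration; check Cal.com's API reference for the actual routes your account uses:

```javascript
// Route a parsed VAPI intent to the corresponding Cal.com operation.
function routeIntent(intent) {
  switch (intent) {
    case 'check_availability':
      return { method: 'GET', endpoint: '/slots' };     // fetch open times
    case 'book':
      return { method: 'POST', endpoint: '/bookings' }; // create the event
    default:
      // Unknown intents fall through so the voice agent can re-prompt.
      return { method: null, endpoint: null, error: `unknown intent: ${intent}` };
  }
}
```

Keeping a default branch matters in voice flows: transcription noise produces unexpected intents far more often than typed input does.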
by Abbas Ali
This n8n workflow automatically finds apartments for rent in Germany, filters them by your city, rent budget, and number of rooms, and applies to them via email. Each application includes:
- A personalized German cover letter
- A Schufa report (fetched dynamically from Google Drive)
- Recent salary slips (also fetched from Google Drive)

The workflow runs daily at a scheduled time, emails landlords or agencies automatically, and logs every application into a Google Sheet for tracking.

## How It Works
1. **Scheduled Trigger** – Runs every day at 9 AM (adjustable).
2. **Fetch Listings** – Uses the ImmobilienScout24 API (or similar) to pull rental listings for your selected city.
3. **Filter Listings** – Keeps only listings matching your CITY, MAX_RENT, and ROOMS settings.
4. **Fetch Documents** – Retrieves your Schufa report and salary slips from Google Drive (no need for local hosting).
5. **Generate Cover Letter** – Creates a personalized German-language letter per apartment.
6. **Send Email Application** – Sends the email to the landlord or agent with the cover letter and documents attached.
7. **Log Applications** – Saves each application (title, address, rent, date) in a Google Sheet.

## How to Use
1. Import the workflow JSON into n8n.
2. Set environment variables in n8n (for security):
   - immobilienscout24_TOKEN: Your ImmobilienScout24 API token
   - immobilienscout24_LISTING_ACTOR: Actor ID for your preferred rental listing scraper (or custom)
   - MY_EMAIL: Your sender email address (SMTP configured in n8n)
   - SCHUFA_FILE_ID: Google Drive file ID for your Schufa PDF
   - SALARY_FILE_ID: Google Drive file ID for your salary slips PDF
   - APPLICATION_SHEET_ID: Google Sheet ID to log applications
3. Authenticate Google Drive and Google Sheets (OAuth2 in n8n).
4. Customize search filters in the Set Config node:
   - CITY (e.g., Berlin)
   - MAX_RENT (e.g., 1200)
   - ROOMS (e.g., 2)
5. Activate the workflow – it will run daily at the configured time and send applications automatically.
6. Check your Google Sheet – every application will be logged for tracking.

## Requirements
- An ImmobilienScout24 account (or another apartment listing API; can be substituted).
- A Google account (for Drive and Sheets integration).
- A Schufa report (PDF) uploaded to Google Drive.
- Recent salary slips (PDF) uploaded to Google Drive.
- An SMTP-configured email account for sending applications.
- An n8n instance (self-hosted or cloud) with:
  - Google Drive and Google Sheets credentials configured
  - Environment variables set for tokens and file IDs
  - A working email SMTP setup
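The Filter Listings step can be sketched as a predicate over the configured CITY, MAX_RENT, and ROOMS values. The listing field names here (`city`, `rent`, `rooms`) are assumptions for illustration — the actual API response schema differs and must be mapped first:

```javascript
// Keep only listings that match the configured city, budget, and minimum room count.
function filterListings(listings, { CITY, MAX_RENT, ROOMS }) {
  return listings.filter((l) =>
    l.city.toLowerCase() === CITY.toLowerCase() &&
    l.rent <= MAX_RENT &&
    l.rooms >= ROOMS
  );
}
```

Treating ROOMS as a minimum (rather than an exact match) is a design choice worth making explicit: a 3-room flat usually still fits a 2-room search.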
by ConvertAPI
## Who is this for?
Developers and organizations that need to convert DOCX files to PDF.

## What problem is this workflow solving?
The file format conversion problem.

## What this workflow does
1. Downloads the DOCX file from the web.
2. Converts the DOCX file to PDF.
3. Stores the PDF file in the local file system.

## How to customize this workflow to your needs
1. Open the HTTP Request node.
2. Adjust the URL parameter (all endpoints can be found here).
3. Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret.
4. Optionally, additional Body Parameters can be added for the converter.
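The HTTP Request node's target URL can be sketched as below. The `/convert/{from}/to/{to}` path follows ConvertAPI's converter-path pattern, but treat the exact endpoint as an assumption and confirm it against the endpoint list linked above:

```javascript
// Build a ConvertAPI request URL for a docx→pdf conversion,
// passing the secret as a query parameter (Query Auth).
function buildConvertUrl(secret, from = 'docx', to = 'pdf') {
  const params = new URLSearchParams({ Secret: secret });
  return `https://v2.convertapi.com/convert/${from}/to/${to}?${params.toString()}`;
}
```

Swapping the `from`/`to` arguments is how you'd adapt the same workflow to other conversions (e.g., xlsx → pdf), assuming ConvertAPI offers that converter.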