by David Ashby
Complete MCP server exposing 15 BulkSMS JSON REST API operations to AI agents.

⚡ Quick Setup

Need help? Want access to more workflows and live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Add BulkSMS JSON REST API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works

This workflow converts the BulkSMS JSON REST API into an MCP-compatible interface for AI agents.

• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.bulksms.com/v1
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (15 total)

🔧 Blocked-Numbers (2 endpoints)
• GET /blocked-numbers: List blocked numbers
• POST /blocked-numbers: Create a blocked number

🔧 Credit (1 endpoint)
• POST /credit/transfer: Transfer Account Credits

🔧 Messages (5 endpoints)
• GET /messages: List messages
• POST /messages: Send Messages
• GET /messages/send: Send message by simple GET or POST
• GET /messages/{id}: Show Message
• GET /messages/{id}/relatedReceivedMessages: List Related Messages

🔧 Profile (1 endpoint)
• GET /profile: Retrieve User Profile

🔧 Rmm (1 endpoint)
• POST /rmm/pre-sign-attachment: Generate Attachment Upload URL

🔧 Webhooks (5 endpoints)
• GET /webhooks: List webhooks
• POST /webhooks: Create a webhook
• DELETE /webhooks/{id}: Delete a webhook
• GET /webhooks/{id}: Read a webhook
• POST /webhooks/{id}: Update a webhook

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

Response Format: Native BulkSMS JSON REST API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to your configuration (see the sketch below)
• Cursor: Add the MCP server SSE URL to your configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
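To make the Claude Desktop step concrete: remote MCP servers are commonly bridged into Claude Desktop with the mcp-remote package. A minimal sketch, assuming a self-hosted n8n instance; the server name, host, and webhook path below are placeholders you would replace with your own values:

```json
{
  "mcpServers": {
    "bulksms-n8n": {
      "command": "npx",
      "args": ["mcp-remote", "https://your-n8n-instance.com/mcp/your-webhook-path"]
    }
  }
}
```

After restarting Claude Desktop, the 15 BulkSMS operations should appear as callable tools.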
by David Ashby
Complete MCP server exposing 9 NPR Listening Service API operations to AI agents.

⚡ Quick Setup

Need help? Want access to more workflows and live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Add NPR Listening Service credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works

This workflow converts the NPR Listening Service API into an MCP-compatible interface for AI agents.

• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://listening.api.npr.org
• AI Expressions: Automatically populate parameters via $fromAI() placeholders (see the sketch below)
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (9 total)

🔧 V2 (9 endpoints)
• GET /v2/aggregation/{aggId}/recommendations: Get a set of recommendations for an aggregation independent of the user's lis...
• GET /v2/channels: List Available Channels
• GET /v2/history: Get User Ratings History
• GET /v2/organizations/{orgId}/categories/{category}/recommendations: Get a list of recommendations from a category of content from an organization
• GET /v2/organizations/{orgId}/recommendations: Get a variety of details about an organization including various lists of rec...
• GET /v2/promo/recommendations: Get Recent Promo Audio
• POST /v2/ratings: Submit Media Ratings
• GET /v2/recommendations: Get User Recommendations
• GET /v2/search/recommendations: Get Search Recommendations

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

Response Format: Native NPR Listening Service API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to your configuration
• Cursor: Add the MCP server SSE URL to your configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
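As an illustration of the AI Expressions bullet, here is a sketch of how a path parameter can be populated in an HTTP Request node's URL field. The parameter name and description are illustrative, not copied from the workflow:

```
https://listening.api.npr.org/v2/organizations/{{ $fromAI('orgId', 'The NPR organization ID to fetch recommendations for', 'string') }}/recommendations
```

At runtime the connected AI agent supplies the orgId value, which is why no manual parameter mapping is needed.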
by Bao Duy Nguyen
Who is this for?
Anyone who hates SPAM emails or doesn't want to clean them up manually.

What problem is this workflow solving?
It automatically deletes SPAM emails for you, so you don't have to, saving you a bit of time.

What this workflow does
You can specify when to execute the workflow: daily, weekly, or monthly.

Setup
1. Select the preferred execution time.
2. Configure credentials for Gmail OAuth2.

How to customize this workflow to your needs
There's no need. It just works!
by Lucas Peyrin
How it works
This workflow is an interactive, hands-on tutorial designed to teach you the absolute basics of JSON (JavaScript Object Notation) and, more importantly, how to use it within n8n. It's perfect for beginners who are new to automation and data structures.

The tutorial is structured as a series of simple steps. Each node introduces a new, fundamental concept of JSON:

1. Key/Value Pairs: The basic building block of all JSON.
2. Data Types: It then walks you through the most common data types one by one:
   - String (text)
   - Number (integers and decimals)
   - Boolean (true or false)
   - Null (representing "nothing")
   - Array (an ordered list of items)
   - Object (a collection of key/value pairs)
3. Using JSON with Expressions: The most important step! It shows you how to dynamically pull data from a previous node into a new one using n8n's expressions ({{ }}).
4. Final Exam: A final node puts everything together, building a complete JSON object by referencing data from all the previous steps (see the example object below).

Each node has a detailed sticky note explaining the concept in simple terms.

Set up steps
Setup time: 0 minutes! This is a tutorial workflow, so there is no setup required.
1. Simply click the "Execute Workflow" button to run it.
2. Follow the instructions in the main sticky note: click on each node in order, from top to bottom.
3. For each node, observe the output in the right-hand panel and read the sticky note next to it to understand what you're seeing.

By the end, you'll have a solid understanding of what JSON is and how to work with it in your own n8n workflows.
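For quick reference, here is a small JSON object that touches every concept the tutorial covers; the field names are purely illustrative:

```json
{
  "course": "JSON basics",
  "version": 1.5,
  "isFree": true,
  "deadline": null,
  "topics": ["key/value pairs", "data types", "expressions"],
  "author": {
    "name": "Lucas",
    "platform": "n8n"
  }
}
```

In a later node, an expression like {{ $json.author.name }} would pull out "Lucas", which is exactly the mechanism the tutorial's expressions step demonstrates.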
by Daniel Shashko
How it Works
This workflow automates competitive price intelligence using Bright Data's enterprise web scraping API. On a scheduled basis (default: daily at 9 AM), the system loops through configured competitor product URLs, triggers Bright Data's web scraper to extract real-time pricing data from each site, and compares competitor prices against your current pricing.

The workflow handles the full scraping lifecycle: it sends scraping requests to Bright Data, waits for completion, fetches the scraped product data, and parses prices from various formats and website structures. All pricing data is automatically logged to Google Sheets for historical tracking and trend analysis. When a competitor's price drops below yours by more than the configured threshold (e.g., 10% cheaper), the system immediately sends detailed alerts via Slack and email to your pricing team (see the comparison sketch below).

At the end of each monitoring run, the workflow generates a daily summary report that aggregates all competitor data, calculates average price differences, identifies the lowest and highest competitors, and provides a complete view of the competitive landscape. This eliminates hours of manual competitor research and enables data-driven pricing decisions in real time.

Who is this for?
- E-commerce businesses and online retailers needing automated competitive price monitoring
- Product managers and pricing strategists requiring real-time competitive intelligence
- Revenue operations teams managing dynamic pricing strategies across multiple products
- Marketplaces competing in price-sensitive categories where margins matter
- Any business that needs to track competitor pricing without manual daily checks

Setup Steps
Setup time: approx. 30-40 minutes (Bright Data configuration, credential setup, competitor URL configuration)

Requirements:
- Bright Data account with Web Scraper API access
- Bright Data API token (from the dashboard)
- Google account with a spreadsheet for price tracking
- Slack workspace with pricing channels
- SMTP email provider for alerts

1. Sign up for Bright Data and create a web scraping dataset (use the e-commerce template for product data).
2. Obtain your Bright Data API token and dataset ID from the dashboard.
3. Configure these nodes:
   - Schedule Daily Check: Set the monitoring frequency using a cron expression (default: 9 AM daily)
   - Load Competitor URLs: Add the competitor product URLs array, configure your current price, and set the alert threshold percentage
   - Loop Through Competitors: Automatically handles multiple URLs (no configuration needed)
   - Scrape with Bright Data: Add your Bright Data API token and dataset ID
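The comparison at the heart of the workflow is simple arithmetic: how far below your price is the competitor's, and does that gap exceed the threshold? A minimal sketch of that logic as an n8n Code node, assuming each item carries competitorPrice, myPrice, and thresholdPercent fields (the names are assumptions, not the workflow's actual field names):

```javascript
// Runs in an n8n Code node, once over all scraped competitor items.
// Field names below are illustrative assumptions.
for (const item of $input.all()) {
  const { competitorPrice, myPrice, thresholdPercent } = item.json;

  // Positive when the competitor is cheaper than you.
  const diffPercent = ((myPrice - competitorPrice) / myPrice) * 100;

  item.json.priceDiffPercent = Math.round(diffPercent * 100) / 100;
  // e.g. thresholdPercent = 10 alerts when a competitor is >10% cheaper.
  item.json.shouldAlert = diffPercent > thresholdPercent;
}

return $input.all();
```

Items with shouldAlert set to true would then fan out to the Slack and email alert branches.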
by Juan Carlos Cavero Gracia
This automation template turns any RSS feed into ready-to-publish social content using AI. It continuously ingests articles, scores their quality and relevance, crafts platform-native posts (Twitter/X threads and LinkedIn posts), routes items for review or archiving, logs everything to Google Sheets, and can publish automatically to X, Threads, and LinkedIn.

*Note: This workflow uses OpenAI models for analysis and content generation and integrates with Upload-Post for multi-platform publishing and Google Sheets for tracking. Costs depend on token usage and posting volume.*

Who Is This For?
- **Content Teams & Solo Creators:** Ship consistent, high-signal posts without manual rewriting.
- **Newsletters & Media Brands:** Turn breaking stories into shareable, platform-native content.
- **Agencies:** Scale curation across clients with review and auto-publish paths.
- **Founders & PMMs:** Maintain a steady public presence with minimal effort.

What Problem Does This Workflow Solve?
Manual curation and rewriting of news is slow and inconsistent. This workflow:
- **Scores Articles:** Filters noise with AI quality/relevance scoring.
- **Auto-Writes Posts:** Generates concise X threads and business-ready LinkedIn copy.
- **Routes Intelligently:** Sends good items to publish/review and archives the rest (see the gating sketch below).
- **Logs Everything:** Keeps a structured history in Google Sheets for analytics.

How It Works
1. RSS Polling: Monitors your chosen feed(s) on a schedule.
2. Scoring AI: Rates quality and relevance; extracts summary, key topics, and angle.
3. Parse & Enrich: Normalizes AI output and merges it with article metadata.
4. Quality Gates: Directs items to "publish/review" or "archive."
5. Content Generation: Produces an X thread and a LinkedIn post with clear hooks and insights.
6. Publishing: Uploads to X, Threads, and LinkedIn via Upload-Post (optional).
7. Sheets Logging: Writes summaries, scores, and outputs to Google Sheets.

Setup
1. OpenAI API: Add your OpenAI credentials (models like gpt-4.1/gpt-4o).
2. Upload-Post Credentials: Connect the Upload-Post integration and target pages (e.g., LinkedIn org ID).
3. Google Sheets: Add OAuth credentials and point "Store Content"/"New for Review"/"Archive" to your sheets.
4. RSS Feed URL: Replace the sample feed with your preferred sources.
5. Thresholds & Routing: Adjust quality/relevance filters to your standards.
6. Publishing Mode: Toggle platforms (X, Threads, LinkedIn) and decide auto vs. review-first.

Requirements
- **Accounts:** n8n, OpenAI, Upload-Post, Google account (Sheets).
- **API Keys:** OpenAI token, Upload-Post credentials, Google Sheets OAuth.
- **Feeds:** One or more RSS URLs for your niche.

Features
- **AI Triage:** Quality/relevance scoring to prioritize high-value stories.
- **Platform-Native Output:** Hooked X threads and professional LinkedIn posts.
- **Review or Auto-Publish:** Safe gating before posting live.
- **Analytics-Ready Logs:** Structured entries in Google Sheets.
- **Modular & Extensible:** Swap feeds, add Slack/Discord alerts, or plug into CMS/Notion.

Stay top-of-mind: convert fresh news into compelling, on-brand social content, automatically.
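The quality gates boil down to a threshold check on the AI's scores. A minimal sketch, assuming the scoring node returns numeric qualityScore and relevanceScore fields on a 1-10 scale and using a cutoff of 7 (field names and cutoff are illustrative; tune them in the workflow's filter settings):

```javascript
// n8n Code node: route each scored article to publish/review or archive.
const THRESHOLD = 7; // illustrative cutoff

return $input.all().map((item) => {
  const { qualityScore = 0, relevanceScore = 0 } = item.json;
  const passes = qualityScore >= THRESHOLD && relevanceScore >= THRESHOLD;

  item.json.route = passes ? "publish_or_review" : "archive";
  return item;
});
```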
by Mohamed Salama
Let AI agents fetch data from and communicate with your Bubble app automatically. It connects directly with your Bubble Data API. This workflow is designed for teams building AI tools or copilots that need seamless access to Bubble backend data via natural language queries.

How it works
1. Triggered via a webhook from an AI agent using MCP (Model Context Protocol).
2. The agent selects the appropriate data tool (e.g., projects, users, bookings) based on user intent.
3. The workflow queries your Bubble database and returns the result (see the request sketch below).

Ideal for integrating with ChatGPT, n8n AI Agents, assistants, or autonomous workflows that need real-time access to app data.

Set up steps
1. Enable access to your Bubble data or backend APIs (as needed).
2. Create a Bubble admin token.
3. Add your Bubble node(s) to your n8n workflow.
4. Add your Bubble admin token.
5. Configure your Bubble node(s).
6. Copy the generated webhook URL from the MCP Server Trigger node and register it with your AI tool (e.g., LangChain tool loader).
7. (Optional) Adjust filters in the "Get an Object Details" node to match your dataset needs.

Once connected, your AI agents can automatically retrieve context-aware data from your Bubble app, with no manual lookups required.
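Each tool call ultimately becomes a request against the Bubble Data API. A hypothetical sketch of such a request, assuming an app at your-app.bubbleapps.io with a "projects" data type (both placeholders):

```javascript
// Hypothetical example: list "projects" records via the Bubble Data API.
// Replace the domain, data type, and admin token with your own values.
const res = await fetch(
  "https://your-app.bubbleapps.io/api/1.1/obj/projects?limit=10",
  { headers: { Authorization: "Bearer YOUR_BUBBLE_ADMIN_TOKEN" } }
);

const body = await res.json();
console.log(body.response.results); // array of project records
```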
by Cheng Siong Chin
Introduction
Generates complete scientific papers from a title and abstract using AI. Designed for researchers, automating literature search, content generation, and citation formatting.

How It Works
Extracts the input, searches academic databases (CrossRef, Semantic Scholar, OpenAlex), merges sources, processes citations, generates AI-written sections (Introduction, Literature Review, Methodology, Results, Discussion, Conclusion), and compiles the document.

Workflow Template
Webhook → Extract Data → Search (CrossRef + Semantic Scholar + OpenAlex) → Merge Sources → Process References → Prepare Context → AI Generate (Introduction + Literature Review + Methodology + Results + Discussion + Conclusion via OpenAI) → Merge Sections → Compile Document

Workflow Steps
1. Input & Search: Webhook receives the title/abstract; searches CrossRef, Semantic Scholar, and OpenAlex; merges and processes references (see the search sketch below).
2. AI Generation: OpenAI generates six sections with in-text citations using the retrieved references.
3. Assembly: Merges the sections and compiles a formatted document with a reference list.

Setup Instructions
- Trigger & APIs: Configure the webhook URL; add your OpenAI API key; customize the prompts.
- Databases: Set up CrossRef, Semantic Scholar, and OpenAlex API access; configure search parameters.

Prerequisites
OpenAI API, CrossRef API, Semantic Scholar API, OpenAlex API, webhook platform, n8n instance.

Customization
Adjust reference limits, modify prompts for specific research fields, add citation styles (APA/IEEE), integrate additional databases (PubMed, arXiv), and customize outputs (DOCX/LaTeX/PDF).

Benefits
Automated paper drafting, comprehensive literature integration, and proper citations.
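The search legs are plain REST calls against each database's public API. A minimal sketch of the CrossRef leg (Semantic Scholar and OpenAlex follow the same pattern with their own base URLs); the title is a placeholder:

```javascript
// Hypothetical example: search CrossRef for works related to the paper title.
const title = "your paper title here"; // placeholder
const url = `https://api.crossref.org/works?query=${encodeURIComponent(title)}&rows=5`;

const res = await fetch(url);
const body = await res.json();

// Each item carries a DOI, title, authors, year, etc. for citation formatting.
for (const work of body.message.items) {
  console.log(work.DOI, work.title?.[0]);
}
```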
by David Ashby
Complete MCP server exposing 14 doqs.dev | PDF filling API operations to AI agents.

⚡ Quick Setup

Need help? Want access to more workflows and live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Add doqs.dev | PDF filling API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works

This workflow converts the doqs.dev | PDF filling API into an MCP-compatible interface for AI agents.

• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.doqs.dev/v1
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (14 total)

🔧 Designer (7 endpoints)
• GET /designer/templates/: List Templates
• POST /designer/templates/: Create Template
• POST /designer/templates/preview: Preview
• DELETE /designer/templates/{id}: Delete
• GET /designer/templates/{id}: Get Template
• PUT /designer/templates/{id}: Update Template
• POST /designer/templates/{id}/generate: Generate PDF

🔧 Templates (7 endpoints)
• GET /templates: List
• POST /templates: Create
• DELETE /templates/{id}: Delete
• GET /templates/{id}: Get Template
• PUT /templates/{id}: Update
• GET /templates/{id}/file: Get File
• POST /templates/{id}/fill: Fill

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

Response Format: Native doqs.dev | PDF filling API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to your configuration
• Cursor: Add the MCP server SSE URL to your configuration (see the sketch below)
• Custom AI Apps: Use the MCP URL as a tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
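For the Cursor step, the connection is a single entry in mcp.json pointing at the workflow's SSE URL. A minimal sketch; the server name and URL are placeholders for your own instance:

```json
{
  "mcpServers": {
    "doqs-pdf": {
      "url": "https://your-n8n-instance.com/mcp/your-webhook-path/sse"
    }
  }
}
```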
by n8n Team
This workflow creates or updates a Mautic contact when a new event is scheduled in Calendly. The first name and email address are the only two fields that get updated.

Prerequisites
- Calendly account and Calendly credentials.
- Mautic account and Mautic credentials.

How it works
1. Triggers on a new event in Calendly.
2. Creates a new contact in Mautic if the email address does not exist in Mautic.
3. Updates the contact's first name in Mautic if the email address exists in Mautic.
by Avkash Kakdiya
How it works
This workflow captures new subscribers from a Mailchimp list and extracts their key details. It then enriches the subscriber data using AI, predicting professional attributes and assigning a lead score. Based on the score, high-value leads are identified, and all leads are synced into HubSpot and Pipedrive. For top-priority leads, the workflow automatically creates new deals in Pipedrive for sales follow-up.

Step-by-step

1. Capture subscriber data
- **Mailchimp Subscriber Trigger** – Detects new signups in a Mailchimp list.
- **Extract Subscriber Data** – Normalizes the payload into clean fields like name, email, and timestamp.

2. Enrich with AI
- **Lead Enrichment AI** – Uses AI to infer company, role, industry, intent, LinkedIn, and a lead score (see the example output below).
- **Parse & Merge Enrichment** – Merges AI output with subscriber info and sets defaults if parsing fails.

3. Qualify leads
- **High-Value Lead Check** – Filters leads with a score ≥ 70 to flag them as priority.
- **Create High-Value Deal** – Opens a new deal in Pipedrive for high-value leads.

4. Sync to CRMs
- **HubSpot Contact Sync** – Updates enriched lead data in HubSpot CRM.
- **Pipedrive Person Create** – Adds the lead as a person in Pipedrive.

Why use this?
- Automatically enrich raw Mailchimp subscribers with valuable professional insights.
- Score and qualify leads instantly for prioritization.
- Keep HubSpot and Pipedrive updated with enriched lead records.
- Automate deal creation for high-value opportunities, saving the sales team effort.
- Build a seamless pipeline from marketing signups to CRM-ready opportunities.
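For context, a hypothetical example of what the parsed enrichment output might look like; the exact field names in your workflow may differ:

```json
{
  "company": "Acme Logistics",
  "role": "Head of Operations",
  "industry": "Supply Chain",
  "intent": "evaluating automation tools",
  "linkedin": "https://www.linkedin.com/in/example",
  "leadScore": 82
}
```

With a leadScore of 82 (≥ 70), this lead would pass the High-Value Lead Check and trigger deal creation in Pipedrive.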
by Yves Tkaczyk
Use cases
Monitor a Google Drive folder, parsing PDF, DOCX, and image files into a destination folder, ready for further processing (e.g. RAG ingestion, translation, etc.). Keep a processing log in Google Sheets and send Slack notifications.

How it works
1. Trigger: Watch a Google Drive folder for new and updated files.
2. Create a uniquely named destination folder, copying the input file into it.
3. Parse the file using Mistral Document, extracting content and handling non-OCRable images separately (see the sketch below).
4. Save the data returned by Mistral Document into the destination Google Drive folder (raw JSON file, Markdown files, and images) for further processing.

How to use
1. Google Drive and Google Sheets nodes: Create Google credentials with access to Google Drive and Google Sheets. Read more about Google Credentials. Update all Google Drive and Google Sheets nodes (14 nodes total) to use the credentials.
2. Mistral node: Create Mistral Cloud API credentials. Read more about Mistral Cloud Credentials. Update the OCR Document node to use the Mistral Cloud credentials.
3. Slack nodes: Create Slack OAuth2 credentials. Read more about Slack OAuth2 credentials. Update the two Slack nodes, Send Success Message and Send Error Message:
   - Set the credentials.
   - Select the channel where you want to send the notifications (channels can be different for success and errors).
4. Create a Google Sheets spreadsheet following the steps in Google Sheets Configuration. Ensure the spreadsheet can be accessed as Editor by the account used by the Google credentials above.
5. Create a directory for input files and a directory for output folders/files. Ensure the directories can be accessed by the account used by the Google credentials.
6. Update the File Created, File Updated, and Workflow Configuration nodes following the steps in the green notes.

Requirements
- Google account with Google API access.
- Mistral Cloud account with access to a Mistral API key.
- Slack account with access to the Slack client ID and secret ID.
- Basic n8n knowledge: understanding of triggers, expressions, and credential management.

Who's it for
Anyone building a data pipeline ingesting files to be OCRed for further processing.

🔒 Security
All credentials are stored as n8n credentials. The only information stored in this workflow that could be considered sensitive are the Google Drive directory and Sheet IDs. These directories and the spreadsheet should be secured according to your needs.

Need Help?
Reach out on LinkedIn or ask in the Forum!
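For reference, the OCR Document node's work corresponds to a call against Mistral's document OCR endpoint. A minimal sketch of that request, assuming the mistral-ocr-latest model and a publicly reachable document URL (both illustrative):

```javascript
// Sketch of the underlying Mistral OCR call; values are placeholders.
const res = await fetch("https://api.mistral.ai/v1/ocr", {
  method: "POST",
  headers: {
    Authorization: "Bearer YOUR_MISTRAL_API_KEY",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "mistral-ocr-latest",
    document: { type: "document_url", document_url: "https://example.com/file.pdf" },
    include_image_base64: true, // also return embedded images
  }),
});

const ocr = await res.json();
// ocr.pages[i].markdown holds the extracted Markdown for each page,
// which is what gets saved to the destination Drive folder.
```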