by Abrar Sami
Turn Reddit Questions into SEO Articles Automatically

This workflow takes real user questions from Reddit and transforms them into fully structured blog posts — title, intro, steps, and conclusion — using AI.

## How it works
1. Manually triggered when you want to run it
2. Scrapes the latest posts from a specific subreddit (e.g. r/n8n)
3. Filters only posts that are real questions, based on keywords like “how,” “what,” and “why” (see the sketch below)
4. Logs relevant questions into a Google Sheet as raw input
5. Enhances each question using AI (rephrases it, creates a clean title and slug)
6. Generates full-length blog content:
   - ✏️ Intro paragraph
   - ✅ Step-by-step guide
   - 🧠 Clear conclusion
7. Saves the final blog content to a second Google Sheet for publishing

## Set up steps
You’ll need access to:
- Reddit API (OAuth)
- OpenAI API
- Google Sheets

It takes around 15–20 minutes to connect all the credentials and tweak the prompts. Customize the subreddit or topic focus by changing the Reddit node configuration.

Perfect for content teams who want to scale content output using real community pain points — without ever starting from a blank page.
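The question filter in step 3 can be a simple keyword and punctuation check. A minimal TypeScript sketch of that logic, assuming the Reddit post title is the field being tested (the keyword list and sample data are illustrative, not the template's exact configuration):

```typescript
// Hypothetical filter logic, similar to what an n8n Code or IF node could apply.
const QUESTION_KEYWORDS = ["how", "what", "why", "when", "which", "should", "can i"];

function isRealQuestion(title: string): boolean {
  const t = title.toLowerCase().trim();
  // Treat a post as a question if it ends with "?" or starts with a question keyword.
  return t.endsWith("?") || QUESTION_KEYWORDS.some((kw) => t.startsWith(kw + " "));
}

// Example: keep only question-like Reddit posts before logging them to Google Sheets.
const posts = [
  { title: "How do I schedule a workflow in n8n?" },
  { title: "Just shipped my first automation!" },
];
console.log(posts.filter((p) => isRealQuestion(p.title)));
```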
by David Ashby
Complete MCP server exposing 1 Listing API operation to AI agents.

## ⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Add Listing API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

## 🔧 How it Works
This workflow converts the Listing API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.ebay.com{basePath}
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

## 📋 Available Operations (1 total)
🔧 Item_Draft (1 endpoint)
• POST /item_draft/: Create eBay Listing Draft

## 🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

Response Format: Native Listing API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

## 💡 Usage Examples
Connect this MCP server to any AI agent or workflow (see the client sketch below):
• Claude Desktop: Add the MCP server URL to its configuration
• Cursor: Add the MCP server SSE URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• API Integration: Make direct HTTP calls to the MCP endpoints

## ✨ Benefits
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
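For the "direct HTTP calls" option above, MCP clients speak JSON-RPC 2.0 and invoke an exposed operation with a `tools/call` request. A rough TypeScript sketch, assuming a streamable-HTTP style endpoint; the URL, tool name, and arguments are placeholders (check the MCP trigger node for the real values), and most desktop clients connect via SSE rather than a raw POST:

```typescript
// Hypothetical client call against the n8n MCP webhook.
// MCP uses JSON-RPC 2.0; "tools/call" invokes one exposed operation.
const MCP_URL = "https://your-n8n-instance/webhook/your-mcp-path"; // placeholder

async function callMcpTool() {
  const response = await fetch(MCP_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Accept: "application/json, text/event-stream",
    },
    body: JSON.stringify({
      jsonrpc: "2.0",
      id: 1,
      method: "tools/call",
      params: {
        name: "create_ebay_listing_draft",      // placeholder tool name
        arguments: { title: "Vintage camera" }, // placeholder arguments
      },
    }),
  });
  console.log(await response.text());
}

callMcpTool();
```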
by David Ashby
Complete MCP server exposing 1 IP2Proxy Proxy Detection API operation to AI agents.

## ⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Add IP2Proxy Proxy Detection credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

## 🔧 How it Works
This workflow converts the IP2Proxy Proxy Detection API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.ip2proxy.com
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

## 📋 Available Operations (1 total)
🔧 General (1 endpoint)
• GET /: Check Proxy IP

## 🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

Response Format: Native IP2Proxy Proxy Detection API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

## 💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Cursor: Add the MCP server SSE URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• API Integration: Make direct HTTP calls to the MCP endpoints

## ✨ Benefits
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by David Ashby
Complete MCP server exposing all Mailcheck Tool operations to AI agents. Zero configuration needed - 1 operation pre-built.

## ⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

## 🔧 How it Works
• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Mailcheck Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Mailcheck Tool node with full error handling

## 📋 Available Operations (1 total)
Every possible Mailcheck Tool operation is included:
🔧 Email (1 operation)
• Check an email

## 🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native Mailcheck Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

## 💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Make direct HTTP calls to the MCP endpoints

## ✨ Benefits
• Complete Coverage: Every Mailcheck Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by Dr. Firas
Generate AI videos with Seedance & Blotato, upload to TikTok, YouTube & Instagram

## Who is this for?
This template is ideal for creators, content marketers, social media managers, and AI enthusiasts who want to automate the production of short-form, visually captivating videos for platforms like TikTok, YouTube Shorts, and Instagram Reels — all without manual editing or publishing.

## What problem is this workflow solving?
Creating engaging videos requires:
- Generating creative ideas
- Writing detailed scene prompts
- Producing realistic video clips and sound effects
- Editing and stitching the final video
- Publishing across multiple platforms

This workflow automates the entire process, saving hours of manual work and ensuring consistent, AI-driven content output ready for social distribution.

## What this workflow does
This end-to-end AI video automation workflow:
1. Generates a creative idea using OpenAI and LangChain
2. Creates detailed video prompts with Seedance AI (see the structured-prompt sketch below)
3. Generates video clips via Wavespeed AI
4. Generates sound effects with Fal AI
5. Stitches the final video using Fal AI’s ffmpeg API
6. Logs metadata and video links to Google Sheets
7. Uploads the video to Blotato
8. Auto-publishes to TikTok, YouTube, Instagram, and other platforms

## Setup
- Add your OpenAI API key in the LLM nodes
- Set up Seedance and Wavespeed AI credentials for video prompt and clip generation
- Add your Fal AI API key for the sound and stitching steps
- Connect your Google Sheets account for tracking ideas and outputs
- Set your Blotato API key and fill in the platform account IDs in the Assign Social Media IDs node
- Adjust the Schedule Trigger to control when the automation runs

## How to customize this workflow to your needs
- **Change the AI prompts** to target your niche (e.g. ASMR, product videos, humor)
- **Add a Telegram or Slack step** for video preview before publishing
- **Tweak the scene structure** or video duration to match your style
- **Disable platforms** you don’t want by turning off the corresponding HTTP Request nodes
- **Edit the sound generation prompts** for different moods or effects

📄 Documentation: Notion Guide

Need help customizing? Contact me for consulting and support: LinkedIn / YouTube
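One way to keep the prompt-to-clip handoff reliable is to have the LLM return structured scene prompts that later nodes can iterate over. A small illustrative sketch of that idea; the interface fields, scene count, and helper are assumptions, not this template's actual schema or node code:

```typescript
// Hypothetical structured output for the idea/prompt generation step.
interface ScenePrompt {
  scene: number;          // order of the clip in the final video
  visualPrompt: string;   // prompt sent to the video generation model
  soundPrompt: string;    // prompt sent to the sound effect model
  durationSeconds: number;
}

// Example: collect generated clip URLs in scene order before the stitching call.
function buildStitchList(scenes: ScenePrompt[], clipUrls: Record<number, string>): string[] {
  return [...scenes]
    .sort((a, b) => a.scene - b.scene)
    .map((s) => clipUrls[s.scene])
    .filter((url): url is string => Boolean(url));
}

const scenes: ScenePrompt[] = [
  { scene: 1, visualPrompt: "Macro shot of rain on glass", soundPrompt: "Soft rain", durationSeconds: 5 },
  { scene: 2, visualPrompt: "City street at dusk, neon reflections", soundPrompt: "Distant traffic", durationSeconds: 5 },
];
console.log(buildStitchList(scenes, { 1: "https://example.com/clip1.mp4", 2: "https://example.com/clip2.mp4" }));
```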
by Angel Menendez
## Who’s it for
This template is perfect for OMI pendant users or anyone with AI-generated memory transcripts who want to:
- Automatically create daily journals in Markdown
- Extract actionable tasks from conversations
- Store memories in Google Drive
- Sync action items to Google Tasks

Great for creators, ADHD professionals, techies, or productivity hackers who want to build a second-brain workflow with no manual data entry.

## What it does / How it works
This workflow:
1. Accepts POST data from the OMI AI pendant (via webhook)
2. Extracts structured summaries, tasks, events, and raw transcript data
3. Converts the transcript into Markdown using metadata like emoji, category, and overview (see the sketch below)
4. Uses Google Gemini or an AI Agent to generate a high-quality journal entry
5. Splits out action items and creates tasks in Google Tasks
6. Uploads both the transcription and the final journal file into separate Google Drive folders for archival
7. Deletes processed files if needed (a cleanup path is included)

## How to set up
- Connect your OMI device to send daily summaries to the webhook endpoint
- Authenticate your Google Drive and Google Tasks accounts
- Replace any hardcoded values (like folder IDs or task list IDs) with your own
- Review the system prompt in the AI Agent node if you'd like to personalize your journal style

## Requirements
- OMI pendant or device that generates .md summaries via API or webhook
- Google Drive & Google Tasks credentials set up in n8n
- Optional: Google Gemini or OpenAI for natural language journal generation

## How to customize
- Change the output folder IDs for Google Drive in the Upload Transcription and Upload Journal nodes. One folder is for long-term storage; the other is short-term, and its contents are deleted every night to generate the journal entries.
- Ensure your workflow timezone is set correctly in the settings.
- Replace Google Tasks with another to-do app (e.g. Notion, Todoist) using HTTP or native nodes.
- Customize the AI prompt in the AI Agent or Gemini Chat node to reflect your tone (e.g. poetic, minimalist, spiritual).
- Modify the Markdown format in the Build Markdown Transcription node for your preferred structure.
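The Build Markdown Transcription step is essentially string templating over the webhook payload. A minimal sketch of that idea using hypothetical payload fields (emoji, category, overview, transcript segments); the real OMI payload and the node's actual code may differ:

```typescript
// Hypothetical OMI-style payload; field names are assumptions for illustration.
interface OmiMemory {
  structured: { emoji: string; category: string; title: string; overview: string };
  transcript_segments: { speaker: string; text: string }[];
}

function buildMarkdown(memory: OmiMemory): string {
  const { emoji, category, title, overview } = memory.structured;
  const lines = [
    `# ${emoji} ${title}`,
    ``,
    `**Category:** ${category}`,
    ``,
    `## Overview`,
    overview,
    ``,
    `## Transcript`,
    ...memory.transcript_segments.map((s) => `- **${s.speaker}:** ${s.text}`),
  ];
  return lines.join("\n");
}

console.log(
  buildMarkdown({
    structured: { emoji: "📝", category: "work", title: "Standup notes", overview: "Discussed sprint goals." },
    transcript_segments: [{ speaker: "Me", text: "Let's ship the journal workflow this week." }],
  })
);
```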
by David Ashby
Complete MCP server exposing 1 Article Search API operation to AI agents.

## ⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Add Article Search API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

## 🔧 How it Works
This workflow converts the Article Search API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to http://api.nytimes.com/svc/search/v2
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

## 📋 Available Operations (1 total)
🔧 Articlesearch.Json (1 endpoint)
• GET /articlesearch.json: Search Articles (see the example call below)

## 🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

Response Format: Native Article Search API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

## 💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Cursor: Add the MCP server SSE URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• API Integration: Make direct HTTP calls to the MCP endpoints

## ✨ Benefits
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
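For reference, the single operation wraps the public NYT Article Search endpoint, which can roughly be called directly like this (the query is a placeholder and you need your own NYT API key; inside the workflow the HTTP Request node and $fromAI() handle this for you):

```typescript
// Direct call to the upstream Article Search API (outside of n8n), for reference.
const BASE_URL = "http://api.nytimes.com/svc/search/v2"; // base path used by this template
const API_KEY = "YOUR_NYT_API_KEY";                       // placeholder

async function searchArticles(query: string) {
  const url = `${BASE_URL}/articlesearch.json?q=${encodeURIComponent(query)}&api-key=${API_KEY}`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  const data = await res.json();
  // Matching documents are returned under response.docs.
  console.log(data.response?.docs?.length, "articles found");
}

searchArticles("automation");
```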
by David Ashby
Complete MCP server exposing all DeepL Tool operations to AI agents. Zero configuration needed - 1 operation pre-built.

## ⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

## 🔧 How it Works
• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every DeepL Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders (see the example below)
• Native Integration: Uses the official n8n DeepL Tool node with full error handling

## 📋 Available Operations (1 total)
Every possible DeepL Tool operation is included:
🔧 Language (1 operation)
• Translate a language

## 🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native DeepL Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

## 💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Make direct HTTP calls to the MCP endpoints

## ✨ Benefits
• Complete Coverage: Every DeepL Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
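The "AI Expressions" bullet refers to n8n's $fromAI() helper, which lives inside the tool node's parameter fields rather than in code. A rough illustration of what such an expression can look like for a translation parameter (the key names and descriptions here are illustrative, not copied from this workflow):

```typescript
// Inside a tool node, parameter fields use n8n expressions rather than code.
// A translation parameter might look something like this:
//
//   Text:            {{ $fromAI('text', 'The text to translate', 'string') }}
//   Target Language: {{ $fromAI('target_language', 'Target language code such as DE or FR', 'string') }}
//
// $fromAI(key, description?, type?, defaultValue?) marks a value the connected AI agent
// should supply at run time; this template ships with such expressions pre-filled.
```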
by Paul
AI-Powered Lead Generation with Apollo, GPT-4, and Telegram to Database

## Overview
This intelligent lead generation workflow transforms voice commands or text input into verified prospect lists through automated Apollo.io scraping. The system processes natural language requests, extracts search parameters using AI, and delivers clean, verified contact data directly to your database.

## Key Features

🎤 Voice & Text Input Processing
- **Voice Recognition**: Converts audio messages to text using OpenAI's transcription API
- **Natural Language Processing**: AI agent interprets requests and extracts search criteria
- **Flexible Input**: Supports both voice commands and text messages

🔍 Smart Lead Scraping
- **Apollo.io Integration**: Automated scraping using the official Apollo.io API
- **Dynamic URL Generation**: Builds search URLs based on extracted parameters
- **Intelligent Parsing**: Processes location, industry, and job title criteria

✅ Email Verification & Filtering
- **Verified Emails Only**: Filters results to include only verified email addresses
- **Duplicate Prevention**: Compares against the existing database to avoid duplicates (see the sketch below)
- **Data Quality Control**: Ensures high-quality prospect data

📊 Automated Data Management
- **Database Integration**: Automatic storage in PostgreSQL/Supabase
- **Structured Data**: Organizes contacts with complete profile information
- **Real-time Updates**: Instant database updates with new prospects

## How It Works
1. Input Processing: Receive a voice message or text command
2. AI Analysis: Extract search parameters (location, industry, job titles)
3. URL Construction: Build the Apollo.io search URL with the extracted criteria
4. Data Scraping: Retrieve prospect data via the Apollo.io API
5. Email Verification: Filter for verified email addresses only
6. Duplicate Check: Compare against existing database records
7. Data Storage: Save new prospects to the database
8. Confirmation: Send a success notification with the count of new leads

## Supported Search Parameters
- **Location**: City, state, country combinations
- **Industry**: Business sectors and verticals
- **Job Titles**: Executive roles, departments, seniority levels
- **Company Size**: Organization scale and employee count

## Data Fields Extracted
Contact Information
- First Name & Last Name
- Email Address (verified only)
- LinkedIn Profile URL
- Phone Number (when available)

Professional Details
- Current Job Title
- Company Name
- Industry
- Seniority Level
- Employment History

Location Data
- City & State
- Country
- Full Location String

Company Information
- Website URL
- Business Industry
- Organization Details

## Technical Architecture
Core Components
- **n8n Workflow Engine**: Orchestrates the entire process
- **OpenAI Integration**: Powers voice transcription and AI analysis
- **Apollo.io API**: Source for prospect data
- **PostgreSQL/Supabase**: Database storage and management

API Integrations
- OpenAI Whisper API for voice transcription
- OpenAI GPT for natural language processing
- Apollo.io API for lead data retrieval
- Supabase API for database operations

## Use Cases
Sales Teams
- Quickly build prospect lists for outreach campaigns
- Target specific industries or job roles
- Maintain clean, verified contact databases

Marketing Professionals
- Generate targeted lead lists for campaigns
- Research prospects in specific markets
- Build comprehensive contact databases

Business Development
- Identify potential partners or clients
- Research competitive landscapes
- Generate contact lists for networking

Recruitment
- Find candidates in specific locations
- Target particular job roles or industries
- Build talent pipeline databases

## Benefits
⚡ Speed & Efficiency
- Voice commands for instant lead generation
- Automated processing eliminates manual work
- Batch processing for large prospect lists

🎯 Precision Targeting
- AI-powered parameter extraction
- Flexible search criteria combinations
- Industry and role-specific filtering

📈 Data Quality
- Verified email addresses only
- Duplicate prevention
- Structured, consistent data format

🔄 Automation
- End-to-end automated workflow
- Real-time database updates
- Instant confirmation notifications

## Setup Requirements
Prerequisites
- n8n workflow platform
- OpenAI API access
- Apollo.io API credentials
- PostgreSQL or Supabase database
- Messaging platform integration

Configuration Steps
1. Import the workflow into n8n
2. Configure API credentials
3. Set up database connections
4. Customize search parameters
5. Test with sample voice/text input

## Customization Options
Search Parameters
- Modify location formats
- Add custom industry categories
- Adjust job title variations
- Set result limits

Data Processing
- Customize field mappings
- Add data validation rules
- Implement additional filters
- Configure output formats

Integration Options
- Connect to CRM systems
- Add email marketing tools
- Integrate with sales platforms
- Export to various formats

## Success Metrics
- **Processing Speed**: Voice-to-database in under 30 seconds
- **Data Accuracy**: 95%+ verified email addresses
- **Automation Level**: 100% hands-free operation
- **Scalability**: Processes 500+ leads per request

Transform your lead generation process with intelligent automation that understands natural language and delivers verified prospects directly to your database.
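The Email Verification and Duplicate Check steps amount to a set comparison that a Code node can do in a few lines. A minimal sketch under the assumption that each scraped lead exposes an email and an email-status field (the field names and sample data are illustrative, not Apollo's exact schema):

```typescript
// Hypothetical lead shape; Apollo's real field names may differ.
interface Lead {
  firstName: string;
  lastName: string;
  email: string | null;
  emailStatus: "verified" | "unverified" | "guessed";
}

// Keep only verified emails that are not already in the database.
function filterNewVerifiedLeads(scraped: Lead[], existingEmails: string[]): Lead[] {
  const known = new Set(existingEmails.map((e) => e.toLowerCase()));
  return scraped.filter(
    (lead) =>
      lead.email !== null &&
      lead.emailStatus === "verified" &&
      !known.has(lead.email.toLowerCase())
  );
}

const fresh = filterNewVerifiedLeads(
  [
    { firstName: "Ada", lastName: "Lovelace", email: "ada@example.com", emailStatus: "verified" },
    { firstName: "Bob", lastName: "Smith", email: "bob@example.com", emailStatus: "guessed" },
  ],
  ["ada@example.com"]
);
console.log(fresh.length, "new verified leads");
```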
by David Ashby
Complete MCP server exposing 4 BikeWise API v2 operations to AI agents.

## ⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Add BikeWise API v2 credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

## 🔧 How it Works
This workflow converts the BikeWise API v2 into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://bikewise.org/api
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

## 📋 Available Operations (4 total)
🔧 V2 (4 endpoints)
• GET /v2/incidents: Paginated incidents matching parameters (see the example call below)
• GET /v2/incidents/{id}: Fetch a single incident by ID
• GET /v2/locations: Unpaginated GeoJSON response
• GET /v2/locations/markers: Unpaginated GeoJSON response with simplestyled markers

## 🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

Response Format: Native BikeWise API v2 responses with full data structure
Error Handling: Built-in n8n HTTP request error management

## 💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Cursor: Add the MCP server SSE URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• API Integration: Make direct HTTP calls to the MCP endpoints

## ✨ Benefits
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
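As a reference for the incidents operation, a direct call to the upstream API might look roughly like this; the query parameter names (proximity, proximity_square, per_page) are assumptions based on BikeWise's published v2 documentation, so verify them before relying on this:

```typescript
// Direct call to the upstream BikeWise API (outside of n8n), for reference only.
// Parameter names are assumptions; check the live API docs before use.
const BASE_URL = "https://bikewise.org/api/v2";

async function nearbyIncidents() {
  const params = new URLSearchParams({
    proximity: "Berlin",    // address, postal code, or "lat,lng"
    proximity_square: "50", // size of the search box
    per_page: "10",
  });
  const res = await fetch(`${BASE_URL}/incidents?${params}`);
  const data = await res.json();
  console.log(data.incidents?.length, "incidents returned");
}

nearbyIncidents();
```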
by Femi Ad
AI-Powered Business Idea Generation & Social Media Content Strategy Workflow

This intelligent content discovery and strategy system features 15 nodes that automatically monitor Reddit communities, analyze business opportunities, and generate targeted social media content for AI automation agencies and entrepreneurs. It leverages AI classification, structured analysis, and automated content creation to transform community discussions into actionable business insights and marketing materials.

## Core Components
- **Reddit Intelligence**: Multi-subreddit monitoring across AI automation, n8n, and entrepreneur communities with keyword-based filtering
- **AI Classification Engine**: Intelligent categorization of posts into "Questions" vs "Requests" using LangChain text classification
- **Dual Analysis System**: Specialized AI agents for educational content (questions) and sales-focused content (service requests)
- **Content Strategy Generator**: Automated creation of LinkedIn and Twitter content tailored to different audience engagement strategies
- **Telegram Integration**: Real-time delivery of formatted content strategies and business insights
- **Structured Output Processing**: JSON-formatted analysis with relevancy scores, feasibility assessments, and actionable content recommendations (see the sketch below)

## Target Users
- AI automation agency owners seeking consistent lead generation and thought-leadership content
- Entrepreneurs wanting to identify market opportunities and position themselves as industry experts
- Content creators in the automation/AI space needing data-driven content strategies
- Business development professionals looking for systematic opportunity identification
- Digital marketing agencies serving tech and automation clients

## Setup Requirements
To get started, you'll need:
- **Reddit API Access**: OAuth2 credentials for accessing Reddit's API and monitoring multiple subreddits
- **Required APIs**:
  - OpenRouter (for AI model access - supports GPT-4, Claude, and other models)
  - Reddit OAuth2 API (for community monitoring and data extraction)
- **n8n Prerequisites**:
  - Version 1.7+ with LangChain nodes enabled
  - Webhook configuration for Telegram integration
  - Proper credential storage and management setup
- **Telegram Bot**: Create one via @BotFather for receiving formatted content strategies and business insights

Disclaimer: This template uses LangChain nodes and Reddit API integration. Ensure your n8n instance supports these features and verify API rate limits for production use.

## Step-by-Step Setup Guide
1. Install n8n: Ensure you're running n8n version 1.7 or higher with LangChain node support enabled.
2. Set up API credentials:
   - Create a Reddit OAuth2 application at reddit.com/prefs/apps
   - Set up an OpenRouter account and obtain an API key
   - Store credentials securely in the n8n credential manager
3. Create a Telegram bot:
   - In Telegram, search for @BotFather
   - Create a new bot and note the token
   - Configure the webhook pointing to your n8n instance
4. Import the workflow:
   - Copy the workflow JSON from the template submission
   - Import it into your n8n dashboard
   - Verify all nodes are properly connected
5. Configure monitoring settings:
   - Adjust the subreddit targets (currently: ArtificialIntelligence, n8n, entrepreneur)
   - Set keyword filters for relevant topics
   - Configure post limits and sorting preferences
6. Customize AI analysis:
   - Update system prompts to match your business expertise
   - Adjust relevancy and feasibility scoring criteria
   - Modify content generation templates for your brand voice
7. Test the workflow:
   - Run a manual execution to verify Reddit data collection
   - Check AI classification and analysis outputs
   - Confirm Telegram delivery of formatted content
8. Schedule automation:
   - Set up the daily trigger (currently configured for 12 PM)
   - Monitor execution logs for any API rate limit issues
   - Adjust frequency based on content volume needs

## Usage Instructions
- **Automated Discovery**: The workflow runs daily at 12 PM, scanning three key subreddits for relevant posts about AI automation, business opportunities, and n8n workflows.
- **Intelligent Classification**: Posts are automatically categorized as either "Questions" (educational opportunities) or "Requests" (potential service leads) using AI text classification.
- **Dual Analysis Approach**:
  - Questions → educational content strategy with relevancy and detail scoring
  - Requests → sales-focused content with relevancy and feasibility scoring
- **Content Strategy Generation**: Each analyzed post generates:
  - 3 LinkedIn posts (thought leadership, case studies, educational frameworks)
  - 3 Twitter posts (quick insights, engagement questions, thread starters)
- **Telegram Delivery**: Receive formatted content strategies with:
  - Post summaries and business context
  - Relevancy/feasibility scores
  - Ready-to-use social media content
  - Strategic recommendations
- **Content Customization**: Adapt generated content for different tones (business, educational, technical) and posting schedules.

## Workflow Features
- **Multi-Platform Monitoring**: Simultaneous tracking of 3 key Reddit communities with customizable keyword filters
- **AI-Powered Classification**: Automatic categorization of posts into actionable content types
- **Dual Scoring System**:
  - Relevancy scores (0.05-0.95) for business alignment
  - Detail/feasibility scores (0.05-0.95) for content quality assessment
- **Content Variety**: Generates both educational and sales-focused social media strategies
- **Structured Output**: JSON-formatted analysis for easy integration with other systems
- **Real-time Delivery**: Instant Telegram notifications with formatted content strategies
- **Scalable Monitoring**: Easy addition of new subreddits and keyword filters
- **Error Handling**: Comprehensive validation with graceful failure management

## Performance Specifications
- **Monitoring Frequency**: Daily automated execution with manual trigger capability
- **Post Analysis**: 5 posts per subreddit (15 total daily)
- **Content Generation**: 6 social media posts per analyzed opportunity
- **Classification Accuracy**: AI-powered with structured output validation
- **Delivery Method**: Real-time Telegram integration
- **Scoring Range**: 0.05-0.95 scale for relevancy and feasibility assessment

## Why This Workflow?
- **Systematic Opportunity Identification**: Never miss potential business opportunities or content ideas from key communities.
- **AI-Enhanced Analysis**: Leverage advanced language models for intelligent content categorization and strategy generation.
- **Time-Efficient Content Creation**: Transform community discussions into ready-to-use social media content.
- **Data-Driven Insights**: Quantified scoring helps prioritize opportunities and content strategies.
- **Automated Lead Intelligence**: Identify potential service requests and educational content opportunities automatically.

Need help customizing this workflow for your specific use case? As a fellow entrepreneur passionate about automation and business development, I'd be happy to consult. Connect with me on LinkedIn: https://www.linkedin.com/in/femi-adedayo-h44/ or email me for support. Let's make your AI automation agency even more efficient!
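The Structured Output Processing component returns JSON carrying the relevancy and feasibility scores described above. A small sketch of what that shape and a downstream threshold filter could look like (the interface and the 0.7 cut-off are illustrative assumptions, not the workflow's exact schema):

```typescript
// Hypothetical shape of the AI analysis output; adjust to match your structured output parser.
interface OpportunityAnalysis {
  postTitle: string;
  category: "Question" | "Request";
  relevancy: number;    // 0.05-0.95, business alignment
  feasibility: number;  // 0.05-0.95, detail/feasibility of the opportunity
  linkedinPosts: string[];
  twitterPosts: string[];
}

// Only forward high-value opportunities to Telegram.
function worthPublishing(a: OpportunityAnalysis, threshold = 0.7): boolean {
  return a.relevancy >= threshold && a.feasibility >= threshold;
}

const sample: OpportunityAnalysis = {
  postTitle: "Looking for someone to automate our CRM follow-ups",
  category: "Request",
  relevancy: 0.85,
  feasibility: 0.75,
  linkedinPosts: [],
  twitterPosts: [],
};
console.log(worthPublishing(sample)); // true
```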
by David Ashby
🛠️ DHL Tool MCP Server

Complete MCP server exposing all DHL Tool operations to AI agents. Zero configuration needed - 1 operation pre-built.

## ⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

## 🔧 How it Works
• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every DHL Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n DHL Tool node with full error handling

## 📋 Available Operations (1 total)
Every possible DHL Tool operation is included:
🔧 Shipment (1 operation)
• Get tracking details for a shipment

## 🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native DHL Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

## 💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Make direct HTTP calls to the MCP endpoints

## ✨ Benefits
• Complete Coverage: Every DHL Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.