by David Ashby
Complete MCP server exposing 4 AWS Cost and Usage Report Service API operations to AI agents.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Add AWS Cost and Usage Report Service credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works
This workflow converts the AWS Cost and Usage Report Service API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to http://cur.{region}.amazonaws.com
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (4 total)
🔧 #X-Amz-Target=AWSOrigamiServiceGatewayService.DeleteReportDefinition (1 endpoint)
• POST /#X-Amz-Target=AWSOrigamiServiceGatewayService.DeleteReportDefinition: Deletes the specified report.
🔧 #X-Amz-Target=AWSOrigamiServiceGatewayService.DescribeReportDefinitions (1 endpoint)
• POST /#X-Amz-Target=AWSOrigamiServiceGatewayService.DescribeReportDefinitions: Lists the AWS Cost and Usage reports available to this account.
🔧 #X-Amz-Target=AWSOrigamiServiceGatewayService.ModifyReportDefinition (1 endpoint)
• POST /#X-Amz-Target=AWSOrigamiServiceGatewayService.ModifyReportDefinition: Allows you to programmatically update your report preferences.
🔧 #X-Amz-Target=AWSOrigamiServiceGatewayService.PutReportDefinition (1 endpoint)
• POST /#X-Amz-Target=AWSOrigamiServiceGatewayService.PutReportDefinition: Creates a new report using the description that you provide.

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication
Response Format: Native AWS Cost and Usage Report Service API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add MCP server URL to configuration
• Cursor: Add MCP server SSE URL to configuration
• Custom AI Apps: Use MCP URL as tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
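For reference, the same four operations can be exercised outside n8n with boto3; a minimal Python sketch, assuming AWS credentials are configured in your environment (the region and report name below are placeholders, not values from the workflow):

```python
import boto3

# Placeholder region; credentials come from the standard AWS config/env,
# not from n8n's credential store.
cur = boto3.client("cur", region_name="us-east-1")

# DescribeReportDefinitions: list the Cost and Usage reports on the account
reports = cur.describe_report_definitions().get("ReportDefinitions", [])
for report in reports:
    print(report["ReportName"], report["S3Bucket"])

# DeleteReportDefinition: remove a report by name (hypothetical report name)
# cur.delete_report_definition(ReportName="example-cur-report")
```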
by David Ashby
Complete MCP server exposing 1 Image Moderation API operation to AI agents.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Add Image Moderation credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works
This workflow converts the Image Moderation API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.moderatecontent.com/moderate/
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (1 total)
🔧 General (1 endpoint)
• Detect Nudity in Images

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication
Response Format: Native Image Moderation API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add MCP server URL to configuration
• Cursor: Add MCP server SSE URL to configuration
• Custom AI Apps: Use MCP URL as tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
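The call the HTTP Request node makes can be reproduced directly; a minimal Python sketch, assuming the service accepts an API key and an image URL as query parameters (the parameter names "key" and "url" are assumptions to verify against the Image Moderation docs):

```python
import requests

API_KEY = "your_api_key"                       # placeholder credential
IMAGE_URL = "https://example.com/photo.jpg"    # placeholder image to check

# Parameter names are assumptions; confirm them in the provider's documentation.
resp = requests.get(
    "https://api.moderatecontent.com/moderate/",
    params={"key": API_KEY, "url": IMAGE_URL},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # moderation rating/labels returned by the API
```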
by David Ashby
Complete MCP server exposing 8 Bulk WHOIS API operations to AI agents.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Add Bulk WHOIS API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works
This workflow converts the Bulk WHOIS API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to http://localhost:5000
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (8 total)
🔧 Batch (4 endpoints)
• GET /batch: Get your batches
• POST /batch: Create a batch. The batch is then processed until all provided items have been completed. At any time it can be retrieved to check its current status, optionally with results.
• DELETE /batch/{id}: Delete batch
• GET /batch/{id}: Get batch
🔧 Db (1 endpoint)
• GET /db: Query domain database
🔧 Domains (3 endpoints)
• GET /domains/{domain}/check: Check domain availability
• GET /domains/{domain}/rank: Check domain rank (authority).
• GET /domains/{domain}/whois: WHOIS query for a domain

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication
Response Format: Native Bulk WHOIS API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add MCP server URL to configuration
• Cursor: Add MCP server SSE URL to configuration
• Custom AI Apps: Use MCP URL as tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
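For a quick feel of what the HTTP Request nodes call, the single-domain endpoints listed above can be hit directly; a minimal Python sketch against the workflow's base URL (the domain is a placeholder, and any auth headers depend on the credentials you configure):

```python
import requests

BASE = "http://localhost:5000"   # API base URL from the workflow
DOMAIN = "example.com"           # placeholder domain

# WHOIS query for a single domain (endpoint listed in the workflow)
whois = requests.get(f"{BASE}/domains/{DOMAIN}/whois", timeout=30)
whois.raise_for_status()

# Availability check for the same domain
check = requests.get(f"{BASE}/domains/{DOMAIN}/check", timeout=30)
check.raise_for_status()

print(whois.json())
print(check.json())
```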
by n8n Team
This workflow sends a new Clockify invoice to a Notion database of your choosing when a new invoice is created in Clockify.

Prerequisites
- Notion account and Notion credentials.
- Clockify account.

How it works
The "On new invoice in Clockify" webhook node triggers when a new invoice is created in Clockify (this requires the setup described below). The "Create database page" Notion node then creates a database page with the information from the Clockify trigger. You can add additional fields if required by following the setup.

Setup
This workflow requires that you set up a webhook in Clockify:
1. Create a Clockify webhook by going to the webhooks section in Clockify.
2. Create the webhook, specifying the "Invoice created" event, and paste in the URL provided by the "On new invoice in Clockify" webhook step.

You will also have to set up a Notion database:
1. In Notion, create a new database.
2. Add the following columns to the database:
   - Invoice number (renamed from "Name")
   - Issue date (with type "Date")
   - Due date (with type "Date")
   - Amount (with type "Number")
   - Any other fields you require.
3. Share the database with n8n.

By default, the workflow fills all the fields listed above, but not any additional fields you add.
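Under the hood, the "Create database page" step maps onto a Notion API call like the one below; a minimal Python sketch using the columns described above (the token, database ID, and invoice values are placeholders):

```python
import requests

NOTION_TOKEN = "secret_xxx"          # placeholder integration token
DATABASE_ID = "your-database-id"     # placeholder Notion database ID

# Property names match the columns described in the setup section.
page = {
    "parent": {"database_id": DATABASE_ID},
    "properties": {
        "Invoice number": {"title": [{"text": {"content": "INV-0042"}}]},
        "Issue date": {"date": {"start": "2024-05-01"}},
        "Due date": {"date": {"start": "2024-05-31"}},
        "Amount": {"number": 1250.00},
    },
}

resp = requests.post(
    "https://api.notion.com/v1/pages",
    headers={
        "Authorization": f"Bearer {NOTION_TOKEN}",
        "Notion-Version": "2022-06-28",
        "Content-Type": "application/json",
    },
    json=page,
    timeout=30,
)
resp.raise_for_status()
```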
by David Ashby
Complete MCP server exposing 11 hashlookup CIRCL API operations to AI agents.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Add hashlookup CIRCL API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works
This workflow converts the hashlookup CIRCL API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://hashlookup.circl.lu
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (11 total)
🔧 Bulk (2 endpoints)
• POST /bulk/md5: Bulk Search MD5 Hashes
• POST /bulk/sha1: Bulk Search SHA1 Hashes
🔧 Children (1 endpoint)
• GET /children/{sha1}/{count}/{cursor}: Return children of a given SHA1. The number of elements to return and an offset must be given; if not set, the first 100 elements are returned. A cursor must be given to paginate; the starting cursor is 0.
🔧 Info (1 endpoint)
• GET /info: Get Database Info
🔧 Lookup (3 endpoints)
• GET /lookup/md5/{md5}: Lookup MD5.
• GET /lookup/sha1/{sha1}: Lookup SHA-1.
• GET /lookup/sha256/{sha256}: Lookup SHA-256.
🔧 Parents (1 endpoint)
• GET /parents/{sha1}/{count}/{cursor}: Return parents of a given SHA1. The number of elements to return and an offset must be given; if not set, the first 100 elements are returned. A cursor must be given to paginate; the starting cursor is 0.
🔧 Session (2 endpoints)
• GET /session/create/{name}: Create a session key to keep search context. The session is attached to a name. After the session is created, the header hashlookup_session can be set to the session name.
• GET /session/get/{name}: Return the set of matching and non-matching hashes from a session.
🔧 Stats (1 endpoint)
• GET /stats/top: Get Top Queries

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication
Response Format: Native hashlookup CIRCL API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add MCP server URL to configuration
• Cursor: Add MCP server SSE URL to configuration
• Custom AI Apps: Use MCP URL as tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
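The lookup endpoints above are simple GETs and easy to test directly; a minimal Python sketch (the SHA-256 used is the well-known hash of an empty file, chosen purely as a test value):

```python
import requests

BASE = "https://hashlookup.circl.lu"

# Single lookup: SHA-256 of an empty file, used here only as a test value
empty_sha256 = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
lookup = requests.get(f"{BASE}/lookup/sha256/{empty_sha256}", timeout=30)
print(lookup.status_code, lookup.json())

# Database info endpoint listed in the workflow
info = requests.get(f"{BASE}/info", timeout=30)
print(info.json())
```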
by Cyril Nicko Gaspar
📌 HubSpot Lead Enrichment with Bright Data MCP

This template enables natural-language-driven automation using Bright Data's MCP tools, triggered directly by new leads in HubSpot. It dynamically extracts and executes the right tool based on lead context, powered by AI and configurable in n8n.

❓ What Problem Does This Solve?
Manual lead enrichment is slow, inconsistent, and drains valuable time. This solution automates the process using a no-code workflow that connects HubSpot, Bright Data MCP, and an AI agent, without requiring scripts or technical skills. Perfect for marketing, sales, and RevOps teams.

🧰 Prerequisites
To use this template, you'll need:
- A self-hosted or cloud instance of n8n
- A Bright Data MCP API token
- A valid OpenAI API key (or compatible AI model)
- A HubSpot account
- Either a Private App token or OAuth credentials for HubSpot
- Basic familiarity with n8n workflows

⚙️ Setup Instructions

1. Set Up Authentication in HubSpot

🔐 Option 1: Use a Private App Token (Simple Setup)
1. Log in to your HubSpot account.
2. Navigate to Settings → Integrations → Private Apps.
3. Create a new Private App with the following scopes: crm.objects.contacts.read, crm.objects.contacts.write, crm.schemas.contacts.read, crm.objects.companies.read (optional).
4. Copy the Access Token.
5. In n8n, create a credential for HubSpot App Token and paste the app token into the field.
6. Go back to the HubSpot Private App settings to set up a webhook: copy the URL from your workflow's Webhook node and paste it there.

🔁 Option 2: Use OAuth (Advanced + Secure)
1. In HubSpot, go to Settings → Integrations → Apps → Create App.
2. Set your Redirect URL to match your n8n OAuth2 redirect path.
3. Choose scopes like: crm.objects.companies.read, crm.objects.contacts.read, crm.objects.deals.read, crm.schemas.companies.read, crm.schemas.contacts.read, crm.schemas.deals.read, crm.objects.contacts.write (conditionally required).
4. Note the Client ID and Client Secret, and copy the App ID and the developer API key.
5. In n8n, create a credential for HubSpot Developer API and paste in the information from the previous step.
6. Attach these credentials to the HubSpot node in n8n.

2. Set Up Bright Data and Obtain Your API Token
In your Bright Data account, obtain the following information:
- API token
- Web Unlocker zone name (optional)
- Browser API username and password string, separated by a colon (optional)

3. Host an SSE Server from the STDIO Command
The methods below let you receive SSE (Server-Sent Events) from Bright Data MCP via a local Supergateway or Smithery.

Method 1: Run Supergateway in a separate web service (Recommended)
This method works for both the cloud version and self-hosted n8n. Sign up for any cloud service of your choice (DigitalOcean, Heroku, Hetzner, Render, etc.).

For an NPM-based installation:
1. Create a new web service. Choose Node.js as the runtime environment and set up a custom server without a repository.
2. In your server's settings (environment variables or .env file), add: `API_TOKEN=your_brightdata_api_token WEB_UNLOCKER_ZONE=optional_zone_name BROWSER_AUTH=optional_browser_auth`
3. Paste the following text as the start command: npx -y supergateway --stdio "npx -y @brightdata/mcp" --port 8000 --baseUrl http://localhost:8000 --ssePath /sse --messagePath /message
4. Deploy it and copy the web server URL, then append /sse to it. Your SSE server should now be accessible at: https://your_server_url/sse

For a Docker-based installation:
1. Create a new web service. Choose Docker as the runtime environment.
2. Set up your Docker environment by pulling the necessary images or creating a custom Dockerfile.
3. In your server's settings (environment variables or .env file), add: `API_TOKEN=your_brightdata_api_token WEB_UNLOCKER_ZONE=optional_zone_name BROWSER_ZONE=optional_browser_zone_name`
4. Use the following Docker command to run Supergateway: `docker run -it --rm -p 8000:8000 supercorp/supergateway --stdio "npx -y @brightdata/mcp" --port 8000`
5. Deploy it and copy the web server URL, then append /sse to it. Your SSE server should now be accessible at: https://your_server_url/sse

For more installation guides, please refer to https://github.com/supercorp-ai/supergateway.git.

Method 2: Run Supergateway in the same web service as the n8n instance
This method only works for self-hosted n8n.
a. Set required environment variables. In your server's settings (environment variables or .env file), add: API_TOKEN=your_brightdata_api_token WEB_UNLOCKER_ZONE=optional_zone_name BROWSER_ZONE=optional_browser_zone_name
b. Run Supergateway in the background: npx -y supergateway --stdio "npx -y @brightdata/mcp" --port 8000 --baseUrl http://localhost:8000 --ssePath /sse --messagePath /message
Use the command above to execute it through the cloud shell or set it as a pre-deploy command. Your SSE server should now be accessible at: http://localhost:8000/sse
For more installation guides, please refer to https://github.com/supercorp-ai/supergateway.git.

Method 3: Configure via Smithery.ai (Easiest)
If you don't want additional setup and want to test it right away, visit https://smithery.ai/server/@luminati-io/brightdata-mcp/tools to:
- Sign up (if you are new to Smithery)
- Create an API key
- Define environment variables via a profile
- Retrieve your SSE server HTTP URL

4. Connect Google Sheets to n8n
Ensure your Google Sheet:
- Contains columns like row_id, first_name, last_name, email, and status.
- Is shared with your n8n service account (or connected via OAuth).
In n8n:
- Add a Google Sheets Trigger node.
- Set it to watch for new rows in your lead sheet.

5. Import and Configure the n8n Workflow
Import the provided JSON workflow into n8n and update the nodes with your credentials:
- HubSpot: Add your API key or connect via OAuth.
- Google Sheets Trigger: Link to your actual sheet.
- OpenAI Node: Add your API key.
- Bright Data Tool Execution: Add your Bright Data token and SSE URL.

🔄 How It Works
1. A new contact is created in HubSpot or a new row is added to the Google Sheet.
2. n8n triggers the workflow.
3. The AI agent classifies the task (e.g., “Find LinkedIn”, “Get company info”).
4. The relevant MCP tool is called.
5. Results are appended back to the sheet or routed to another destination.
6. Rerun a specific record by setting its status to "needs more enrichment" or leaving it blank.

🧩 Use Cases
- B2B Lead Enrichment – Add missing fields (title, domain, social profiles)
- Email Intelligence – Validate and enrich based on email
- Market Research – Pull company or contact data on demand
- CRM Auto-fill – Push enriched leads to tools like HubSpot or Salesforce

🛠️ Customization
- Prompt Tuning – Adjust how the AI interprets input data
- Column Mapping – Customize which fields to pull from the sheet
- Tool Logic – Add retries, fallback tools, or confidence-based routing
- Destination Output – Integrate with CRMs, Slack, or webhook endpoints

✅ Summary
This template turns a Google Sheet into an AI-powered lead enrichment engine.
By combining Bright Data’s tools with a natural language AI agent, your team can automate repetitive tasks and scale lead ops—without writing code. Just add a row, and let the workflow do the rest.
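Before wiring the SSE URL from step 3 into the workflow, it can help to confirm the endpoint actually streams events; a minimal Python check, with the placeholder URL from the setup steps above:

```python
import requests

SSE_URL = "https://your_server_url/sse"  # placeholder from the setup steps above

with requests.get(SSE_URL, stream=True,
                  headers={"Accept": "text/event-stream"}, timeout=30) as resp:
    resp.raise_for_status()
    # Print the first few event lines, then stop
    for i, line in enumerate(resp.iter_lines(decode_unicode=True)):
        if line:
            print(line)
        if i > 10:
            break
```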
by David Ashby
Complete MCP server exposing 1 Buy Marketing API operation to AI agents.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Add Buy Marketing API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works
This workflow converts the Buy Marketing API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.ebay.com/buy/marketing/v1_beta
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (1 total)
🔧 Merchandised_Product (1 endpoint)
• GET /merchandised_product: Fetch Merchandised Products

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication
Response Format: Native Buy Marketing API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add MCP server URL to configuration
• Cursor: Add MCP server SSE URL to configuration
• Custom AI Apps: Use MCP URL as tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
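For orientation, the underlying eBay call looks roughly like this; a minimal Python sketch in which the OAuth token is a placeholder and the category_id / metric_name / limit query parameters are assumptions to verify against the Buy Marketing API docs:

```python
import requests

TOKEN = "your_ebay_oauth_token"  # placeholder application access token

resp = requests.get(
    "https://api.ebay.com/buy/marketing/v1_beta/merchandised_product",
    headers={"Authorization": f"Bearer {TOKEN}"},
    # Assumed parameters; confirm names and allowed values in the API docs
    params={"category_id": "9355", "metric_name": "BEST_SELLING", "limit": 5},
    timeout=30,
)
print(resp.status_code, resp.json())
```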
by David Ashby
Complete MCP server exposing 1 Recommendation API operation to AI agents.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Add Recommendation API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works
This workflow converts the Recommendation API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.ebay.com{basePath}
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (1 total)
🔧 Find (1 endpoint)
• POST /find: Get Promoted Listings Recommendations

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication
Response Format: Native Recommendation API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add MCP server URL to configuration
• Cursor: Add MCP server SSE URL to configuration
• Custom AI Apps: Use MCP URL as tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
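The {basePath} in the base URL above is left unresolved and must be taken from the Recommendation API spec; a minimal Python sketch of the POST /find call, where the base path, marketplace header value, and request body fields are all assumptions for illustration only:

```python
import requests

BASE_PATH = ""   # fill in from the Recommendation API spec ({basePath} above)
TOKEN = "your_ebay_oauth_token"   # placeholder OAuth token

resp = requests.post(
    f"https://api.ebay.com{BASE_PATH}/find",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "X-EBAY-C-MARKETPLACE-ID": "EBAY_US",   # assumed header value
        "Content-Type": "application/json",
    },
    json={"listingIds": ["1234567890"]},        # hypothetical request body
    timeout=30,
)
print(resp.status_code, resp.json())
```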
by David Ashby
Complete MCP server exposing 1 IP2WHOIS Domain Lookup API operation to AI agents.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Add IP2WHOIS Domain Lookup credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works
This workflow converts the IP2WHOIS Domain Lookup API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.ip2whois.com/v2
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (1 total)
🔧 General (1 endpoint)
• GET /: Lookup WHOIS Data

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication
Response Format: Native IP2WHOIS Domain Lookup API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add MCP server URL to configuration
• Cursor: Add MCP server SSE URL to configuration
• Custom AI Apps: Use MCP URL as tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
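The single lookup operation is a plain GET with query parameters; a minimal Python sketch, assuming the key and domain parameter names used by IP2WHOIS (the API key and domain are placeholders):

```python
import requests

API_KEY = "your_ip2whois_api_key"   # placeholder credential

resp = requests.get(
    "https://api.ip2whois.com/v2",
    params={"key": API_KEY, "domain": "example.com"},  # assumed parameter names
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # registrar, creation/expiration dates, nameservers, etc.
```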
by Trung Tran
Multi-Agent Book Creation Workflow with AI Tool Node, GPT-4, and DALL-E

Who's it for
This workflow is designed for:
- **Content creators** who want to generate books or structured documents automatically.
- **Educators and trainers** who need quick course materials, eBooks, or study guides.
- **Automation enthusiasts** exploring multi-agent systems using the newly released AI Tool Node in n8n.
- **Developers** looking for a reference template to understand orchestration of multiple AI agents with structured output.

How it works / What it does
This template demonstrates a multi-agent orchestration system powered by AI Tool Nodes:
1. Trigger: The workflow starts when a chat message is received.
2. Book Brief Agent: Generates the initial book concept (title, subtitle, and outline).
3. Book Writer Agent: Expands the outline into full content by collaborating with two sub-agents:
   - Designer Agent → Provides layout/design suggestions.
   - Content Writer Agent → Drafts and refines chapters.
4. Generate Cover Image: AI generates a custom book cover image.
5. Upload to AWS S3: Stores the cover image securely.
6. Configure Metadata: Adds metadata for title, author, and description.
7. Build Book HTML: Converts markdown-based content into HTML format.
8. Upload to Google Drive: Saves the HTML content for processing.
9. Convert to PDF: Transforms the book into a professional PDF.
10. Archive to Google Drive: The final version is archived for safe storage.
This workflow showcases multi-agent coordination, structured parsing, and seamless integration with cloud storage services.

How to set up
1. Import the workflow into n8n.
2. Configure the following connections:
   - OpenAI (for the Book Brief, Book Writer, Designer, and Content Writer Agents).
   - AWS S3 (for image storage).
   - Google Drive (for document storage & archiving).
3. Add your API keys and credentials in the n8n credentials manager.
4. Test the workflow by sending a sample chat message (e.g., “Write a book about AI in education”).
5. Verify outputs in Google Drive (HTML + PDF) and AWS S3 (cover image).

Requirements
- **n8n** (latest version with AI Tool Node support).
- **OpenAI API key** (to power the multi-agent models).
- **AWS account** (with an S3 bucket for storing images).
- **Google Drive integration** (for document storage and archiving).
- Basic familiarity with workflow setup in n8n.

How to customize the workflow
- **Switch Models**: Replace gpt-4.1-mini with other models (faster, cheaper, or more powerful).
- **Add More Agents**: Introduce agents for editing, fact-checking, or translation.
- **Change Output Format**: Export to EPUB, DOCX, or Markdown instead of PDF.
- **Branding Options**: Modify the cover generation prompt to include company logos or a specific style.
- **Extend Storage**: Add Dropbox, OneDrive, or Notion integration for additional archiving.
- **Trigger Alternatives**: Replace the chat trigger with form submission, webhook, or schedule-based runs.

✅ This workflow acts as a free, plug-and-play template to showcase how multi-agents + AI Tool Node can work together to automate complex content creation pipelines.
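As a point of reference for the "Upload to AWS S3" step, pushing a generated cover image to a bucket looks roughly like this outside n8n; a minimal boto3 sketch in Python where the bucket name, object key, and local file name are placeholders:

```python
import boto3

# Placeholder bucket, key, and file name; credentials come from the standard
# AWS config/env rather than n8n's credential store.
s3 = boto3.client("s3")

with open("cover.png", "rb") as cover:
    s3.upload_fileobj(
        cover,
        "my-book-assets",                 # placeholder bucket name
        "covers/ai-in-education.png",     # placeholder object key
        ExtraArgs={"ContentType": "image/png"},
    )
```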
by David Ashby
Complete MCP server exposing 4 Transportation Laws and Incentives API operations to AI agents.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Add Transportation Laws and Incentives credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works
This workflow converts the Transportation Laws and Incentives API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to http://developer.nrel.gov/api/transportation-incentives-laws
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (4 total)
🔧 V1.{Output_Format} (1 endpoint)
• GET /v1.{output_format}: Return a full list of laws and incentives that match your query.
🔧 V1 (3 endpoints)
• GET /v1/category-list.{output_format}: Return the law categories for a given category type.
• GET /v1/pocs.{output_format}: Get the points of contact for a given jurisdiction.
• GET /v1/{id}.{output_format}: Fetch the details of a specific law given the law's ID.

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication
Response Format: Native Transportation Laws and Incentives API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add MCP server URL to configuration
• Cursor: Add MCP server SSE URL to configuration
• Custom AI Apps: Use MCP URL as tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
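For a quick manual test of the main query endpoint, a minimal Python sketch; the api_key parameter follows the usual NREL convention (DEMO_KEY works for light testing), while the jurisdiction and technology filters are assumptions to verify against the API documentation:

```python
import requests

API_KEY = "DEMO_KEY"   # NREL developer key; replace with your own for real use

resp = requests.get(
    # https form of the base URL listed above, JSON output format
    "https://developer.nrel.gov/api/transportation-incentives-laws/v1.json",
    # api_key is standard for NREL APIs; the filter parameters are assumptions
    params={"api_key": API_KEY, "jurisdiction": "CA", "technology": "ELEC"},
    timeout=30,
)
print(resp.status_code, list(resp.json().keys()))
```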
by Angel Menendez
Who is this for?
This workflow is designed for IT teams, service desk personnel, and incident management professionals who need a streamlined way to monitor and report on recent ServiceNow incidents directly within Slack.

What problem is this workflow solving? / Use Case
Manually monitoring incidents in ServiceNow can be time-consuming, and keeping teams updated about new or specific incidents often involves additional manual effort. This workflow automates the process of querying recent incidents from ServiceNow based on user-defined parameters and delivering formatted results directly to Slack. It ensures faster response times and improved incident visibility.

What this workflow does
This workflow integrates Slack and ServiceNow to provide an automated system for retrieving and presenting incident details.
1. Slack User Interaction: Users initiate the workflow via a Slack modal form, selecting incident parameters like priority and state.
2. ServiceNow Query: The workflow queries ServiceNow for incidents matching the selected criteria.
3. Results Delivery: Incident results are sent back to Slack as a message formatted using Block Kit. If no results are found, the workflow notifies the user with a detailed message, either in a Slack channel or via direct message.
4. Error Handling: If no channel is selected or any issues arise, the workflow ensures a graceful fallback with appropriate notifications.

Setup Instructions
1. Slack Setup: Integrate Slack with n8n using a Slack app. Configure the modal form to accept parameters like priority and state. Check out this video for setting up a modal Slack app on YouTube.
2. ServiceNow Integration: Use ServiceNow credentials to connect with n8n. Ensure appropriate permissions for querying incidents.
3. n8n Workflow Configuration: Import this workflow into n8n. Verify all node configurations, particularly those for ServiceNow API queries and Slack outputs. Set up webhook URLs for Slack event handling.
4. Testing: Trigger the workflow from Slack to test modal inputs and incident queries. Confirm the output is correctly formatted and delivered to the intended Slack channel or user.

How to Customize this Workflow to Your Needs
- Modify the ServiceNow query logic to include additional filters or fields.
- Adjust the Slack Block Kit formatting to match your organization's preferred notification style.
- Use conditional logic to add more advanced handling for specific priorities or states.
- Expand the workflow to include escalation steps, such as notifying a specific team or creating follow-up tasks.

Workflow Highlights
- **Slack Modal Form**: Allows users to specify search criteria for incidents interactively.
- **Dynamic Results Delivery**: Automatically sends results to a Slack channel or direct message based on user input.
- **Error Handling**: Provides fallback notifications when no incidents are found or user inputs are incomplete.
- **Customizable Integration**: Easily adaptable to fit different organizational needs, including advanced filtering and formatting options.
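For reference, the two calls at the heart of this workflow (the ServiceNow incident query and the Slack Block Kit message) look roughly like the sketch below when made outside n8n; the instance name, credentials, channel, and encoded query are placeholder values, not settings taken from the workflow:

```python
import requests

SN_INSTANCE = "your-instance"              # placeholder ServiceNow instance
SN_AUTH = ("api.user", "api.password")     # placeholder basic-auth credentials
SLACK_TOKEN = "xoxb-your-bot-token"        # placeholder Slack bot token

# 1) Query recent incidents via the ServiceNow Table API
incidents = requests.get(
    f"https://{SN_INSTANCE}.service-now.com/api/now/table/incident",
    params={"sysparm_query": "priority=1^state=1", "sysparm_limit": 5},
    auth=SN_AUTH,
    headers={"Accept": "application/json"},
    timeout=30,
).json()["result"]

# 2) Post the results to Slack as a Block Kit message
blocks = [
    {"type": "section",
     "text": {"type": "mrkdwn",
              "text": f"*{i['number']}*: {i.get('short_description', '')}"}}
    for i in incidents
] or [{"type": "section",
       "text": {"type": "mrkdwn", "text": "No matching incidents found."}}]

requests.post(
    "https://slack.com/api/chat.postMessage",
    headers={"Authorization": f"Bearer {SLACK_TOKEN}"},
    json={"channel": "#servicenow-incidents",   # placeholder channel
          "text": "Incident results",
          "blocks": blocks},
    timeout=30,
)
```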