by Guido X Jansen
**Introduction**

**Manual LinkedIn data collection is time-consuming, error-prone, and results in inconsistent data quality across CRM/database records.** This workflow is great for organizations that struggle with:

- Incomplete contact records with only LinkedIn URLs but missing profile details
- Hours spent manually copying LinkedIn information into databases
- Inconsistent data formats due to copy-paste from LinkedIn (emojis, styled text, special characters)
- Outdated profile information that doesn't reflect current roles/companies
- No systematic way to enrich contacts at scale

**Primary Users**

- Sales & Marketing Teams
- Event Organizers & Conference Managers (for event materials)
- Recruitment & HR Professionals
- CRM Administrators

**Specific Problems Addressed**

- **Data Completeness:** Automatically fills missing profile fields (headline, bio, skills, experience)
- **Data Quality:** Sanitizes problematic characters that break databases/exports (see the sketch at the end of this listing)
- **Time Efficiency:** Reduces hours of manual data entry to automated monthly updates
- **Error Handling:** Gracefully manages invalid/deleted LinkedIn profiles
- **Scalability:** Processes multiple profiles in batch without manual intervention
- **Standardization:** Ensures consistent data format across all records

**Cost**

Each URL scraped by Apify costs $0.01 to get all the data above. Apify charges per scrape, regardless of how much data or how many fields you extract/use.

**Setup Instructions**

Prerequisites:

- **n8n Instance:** Access to a running n8n instance (self-hosted or cloud)
- **NocoDB Account:** Database with a table containing LinkedIn URLs
- **Apify Account:** Free or paid account for LinkedIn scraping

Required fields in the NocoDB table:

- Input: a single LinkedIn URL, in a field named `LinkedIn`
- Output: first/last/full name, e-mail, bio, headline, profile pic URL, current role, country, skills, current employer, employer URL, experiences (all previous jobs), personal website, publications (articles)

NocoDB output field names: `linkedin_full_name`, `linkedin_first_name`, `linkedin_headline`, `linkedin_email`, `linkedin_bio`, `linkedin_profile_pic`, `linkedin_current_role`, `linkedin_current_company`, `linkedin_country`, `linkedin_skills`, `linkedin_company_website`, `linkedin_experiences`, `linkedin_personal_website`, `linkedin_publications`, `linkedin_scrape_error_reason`, `linkedin_scrape_last_attempt`, `linkedin_scrape_status`, `linkedin_last_modified`

Technically you also need an `Id` field, but that is always there so no need to add it :)

**n8n Setup**

1. Import the Workflow
   - Copy the workflow JSON from the template
   - In n8n, click "Add workflow" → "Import from JSON"
   - Paste the workflow and click "Import"
2. Configure NocoDB Connection
   - Click on any NocoDB node in the workflow
   - Add new credentials → "NocoDB Token account"
   - Enter your NocoDB API token (found in NocoDB → User Settings → API Tokens)
   - Update the projectId and table parameters in all NocoDB nodes
3. Set Up Apify Integration
   - Create an Apify account at apify.com
   - Generate an API token (Settings → Integrations → API)
   - In the workflow, update the Apify token in the "Get Scraper Results" node
   - Configure HTTP Query Auth credentials with your token
4. Map Your Database Fields
   - Review the "Transform & Sanitize Data" node
   - Update field mappings to match your NocoDB table structure
   - Ensure these fields exist in your table: `LinkedIn` (URL field), `linkedin_headline`, `linkedin_full_name`, `linkedin_bio`, etc., plus `linkedin_scrape_status` and `linkedin_last_modified`
5. Configure the Filter
   - In the "Get Guests with LinkedIn" node, adjust the filter to match your requirements
   - Default: `(LinkedIn,isnot,null)~and(linkedin_headline,is,null)`
6. Test the Workflow
   - Click "Execute Workflow" with the Manual Trigger
   - Monitor the execution for any errors
   - Verify data is properly updated in NocoDB
7. Activate the Automated Schedule
   - Configure the Schedule Trigger node (default: monthly)
   - Toggle the workflow to "Active"
   - Monitor executions in the n8n dashboard

**Customization Options**

1. Data Source Modifications
   - Different Database: Replace NocoDB nodes with Airtable, Google Sheets, or PostgreSQL
   - Multiple Tables: Add parallel branches to process different contact tables
   - Custom Filters: Modify the WHERE clause to target specific record subsets
2. Enrichment Fields
   - Add Fields: Include additional LinkedIn data like education, certifications, or recommendations
   - Remove Fields: Simplify by removing unnecessary fields (publications, skills)
   - Custom Transformations: Add business logic for field calculations or formatting
3. Scheduling Options
   - Frequency: Change from monthly to daily, weekly, or hourly
   - Time-based: Set specific times for different timezones
   - Event-triggered: Replace with a webhook trigger for on-demand processing
4. Error Handling Enhancement
   - Notifications: Add email/Slack nodes to alert on failures
   - Retry Logic: Implement wait-and-retry for temporary failures
   - Logging: Add database logging for audit trails
5. Data Quality Rules
   - Validation: Add IF nodes to validate data before updates
   - Duplicate Detection: Check for existing records before creating new ones
   - Data Standardization: Add custom sanitization rules for industry-specific needs
6. Integration Extensions
   - CRM Sync: Add nodes to push data to Salesforce, HubSpot, or Pipedrive
   - AI Enhancement: Use OpenAI to summarize bios or extract key skills
   - Image Processing: Download and store profile pictures locally
7. Performance Optimization
   - Batch Size: Adjust the number of profiles processed per run
   - Rate Limiting: Add delays between API calls to avoid limits
   - Parallel Processing: Split large datasets across multiple workflow executions
8. Compliance Additions
   - GDPR Compliance: Add consent checking before processing
   - Data Retention: Implement automatic cleanup of old records
   - Audit Logging: Track who accessed what data and when

These customizations allow the workflow to adapt from simple contact enrichment to complex data pipeline scenarios across various industries and use cases.
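For reference, here is a minimal sketch of the kind of character sanitization the "Transform & Sanitize Data" node performs before writing to NocoDB. The function and the field loop are illustrative assumptions, not the template's actual code:

```javascript
// Hypothetical sketch of the sanitization step (n8n Code node).
// Strips emojis/styled Unicode and control characters that break exports.
function sanitize(value) {
  if (typeof value !== 'string') return value;
  return value
    .replace(/[\u{1F000}-\u{1FAFF}\u{2600}-\u{27BF}\u{FE0F}]/gu, '') // emojis & symbols
    .replace(/[\u0000-\u001F\u007F]/g, ' ')                          // control characters
    .replace(/\s+/g, ' ')                                            // collapse whitespace
    .trim();
}

// Apply it to every scraped field of each incoming item:
for (const item of $input.all()) {
  for (const key of Object.keys(item.json)) {
    item.json[key] = sanitize(item.json[key]);
  }
}
return $input.all();
```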
by David Ashby
Complete MCP server exposing all Cloudflare Tool operations to AI agents. Zero configuration needed: all 4 operations pre-built.

⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Cloudflare Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Cloudflare Tool node with full error handling

📋 Available Operations (4 total)

Every possible Cloudflare Tool operation is included:

🔧 Zone Certificate (4 operations)
• Delete a certificate
• Get a certificate
• Get many certificates
• Upload a certificate

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native Cloudflare Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to your configuration (see the sketch below)
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Complete Coverage: Every Cloudflare Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
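As an illustration of the Claude Desktop option above, a configuration entry could look roughly like the following. The server name, webhook URL, and the use of the mcp-remote bridge package are assumptions for this sketch, not values shipped with the template:

```json
{
  "mcpServers": {
    "n8n-cloudflare": {
      "command": "npx",
      "args": ["mcp-remote", "https://your-n8n-instance.com/mcp/your-webhook-path"]
    }
  }
}
```

After restarting Claude Desktop, the four certificate operations should appear as callable tools.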
by David Ashby
Complete MCP server exposing all Beeminder Tool operations to AI agents. Zero configuration needed: all 4 operations pre-built.

⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Beeminder Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Beeminder Tool node with full error handling

📋 Available Operations (4 total)

Every possible Beeminder Tool operation is included:

🔧 Datapoint (4 operations)
• Create datapoint for goal
• Delete a datapoint
• Get many datapoints for a goal
• Update a datapoint

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native Beeminder Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to your configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Complete Coverage: Every Beeminder Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by Nick Saraev
Google Maps Email Scraper System

Categories: Lead Generation, Web Scraping, Business Automation

This workflow creates a completely free Google Maps email scraping system that extracts unlimited business emails without requiring expensive third-party APIs. Built entirely in n8n using simple HTTP requests and JavaScript, this system can generate thousands of targeted leads for any industry or location while operating at a 99% free cost structure.

Benefits

- **Zero API Costs** - Operates entirely through free Google Maps scraping without expensive third-party services
- **Unlimited Lead Generation** - Extract emails from thousands of Google Maps listings across any industry
- **Geographic Targeting** - Search by specific cities, regions, or business types for precise lead targeting
- **Complete Automation** - From search query to organized email list with minimal manual intervention
- **Built-in Data Cleaning** - Automatic duplicate removal, filtering, and data validation
- **Scalable Processing** - Handle hundreds of businesses per search with intelligent rate limiting

How It Works

1. Google Maps Search Integration: Uses strategic HTTP requests to Google Maps search URLs, processes search queries like "Calgary + dentist" to extract business listings, and bypasses API restrictions through direct HTML scraping techniques.
2. Intelligent URL Extraction: Custom JavaScript regex patterns extract website URLs from Google Maps data, filter out irrelevant domains (Google, schema, static files), and return a clean list of actual business websites for processing.
3. Smart Website Processing: A loop-based architecture prevents IP blocking through intelligent batching, with built-in delays and redirect handling for reliable scraping. Each website is processed individually with error handling.
4. Email Pattern Recognition: Regex patterns identify email addresses within website HTML, extracting contact, info, and administrative addresses while handling multiple email formats and validation patterns (see the sketch at the end of this listing).
5. Data Aggregation & Cleaning: Automatically removes duplicate emails across all processed websites, filters null entries and invalid email formats, and exports clean, organized email lists to Google Sheets.

Required Google Sheets Setup

Create a Google Sheet with these exact column headers:

- Search tracking sheet: `searches` - contains your search queries (e.g., "Calgary dentist", "Miami lawyers")
- Email results sheet: `emails` - contains extracted email addresses from all processed websites

Setup instructions:

1. Create a Google Sheet with two tabs: "searches" and "emails"
2. Add your target search queries to the searches tab (one per row)
3. Connect Google Sheets OAuth credentials in n8n
4. Update the Google Sheets document ID in all sheet nodes

The workflow reads search queries from the first sheet and exports results to the second sheet automatically.

Business Use Cases

- **Local Service Providers** - Find competitors and potential partners in specific geographic areas
- **B2B Sales Teams** - Generate targeted prospect lists for cold outreach campaigns
- **Marketing Agencies** - Build industry-specific lead databases for client campaigns
- **Real Estate Professionals** - Identify businesses in target neighborhoods for commercial opportunities
- **Franchise Development** - Research potential markets and existing competition
- **Market Research** - Analyze business density and contact information across regions

Revenue Potential

This system transforms lead generation economics:

- $0 per lead vs. $2-5 per lead from paid databases
- Process 1,000+ leads daily without hitting API limits
- Sell as a service for $500-2,000 per industry/location
- Perfect for agencies offering lead generation to local businesses

Difficulty Level: Intermediate
Estimated Build Time: 1-2 hours
Monthly Operating Cost: $0 (completely free)

Watch My Complete Build Process

Want to watch me build this entire system live from scratch? I walk through every single step, including the JavaScript code, regex patterns, error handling, and all the debugging that goes into creating a bulletproof scraping system.

🎥 Watch My Live Build: "Scrape Unlimited Leads WITHOUT Paying for APIs (99% FREE)"

This comprehensive tutorial shows the real development process, including writing custom JavaScript, handling rate limits, and building systems that actually work at scale without getting blocked.

Set Up Steps

1. Basic Workflow Architecture: Set up a manual trigger for testing and the Google Sheets integration, configure the initial HTTP request node for Google Maps searches, and enable SSL ignore and response headers for reliable scraping.
2. URL Extraction Code Setup: Configure the JavaScript code node with custom regex patterns, set up input data processing from the Google Maps HTML responses, and implement URL filtering logic to remove irrelevant domains.
3. Website Processing Pipeline: Add a "Split in Batches" node for intelligent loop processing, configure HTTP request nodes with proper delays and redirect handling, and set up error handling for websites that can't be scraped.
4. Email Extraction System: Implement a JavaScript code node with email-specific regex patterns, configure email validation and format checking, and set up data aggregation for multiple emails per website.
5. Data Cleaning & Export: Configure filtering nodes to remove null entries and duplicates, set up a "Split Out" node to aggregate emails into a single list, and connect the Google Sheets integration for organized data export.
6. Testing & Optimization: Use limit nodes during testing to prevent IP blocking, test with small batches before scaling to full searches, and implement proxy integration for high-volume usage.

Advanced Optimizations

Scale the system with:

- **Multi-Page Scraping:** Extract URLs from homepages, then scrape contact pages for more emails
- **Proxy Integration:** Add residential proxies for unlimited scraping without rate limits
- **Industry Templates:** Create pre-configured searches for different business types
- **Contact Information Expansion:** Extract phone numbers, addresses, and social media profiles
- **CRM Integration:** Automatically add leads to sales pipelines and marketing sequences

Important Considerations

- **Rate Limiting:** Built-in delays prevent IP blocking during normal usage
- **Scalability:** For high-volume usage, consider proxy services for unlimited requests
- **Compliance:** Ensure proper usage rights for extracted contact information
- **Data Quality:** The system includes filtering, but manual verification is recommended for critical campaigns

Check Out My Channel

For more advanced automation systems and business-building strategies that generate real revenue, explore my YouTube channel, where I share proven automation techniques used by successful agencies and entrepreneurs.
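To make the email-extraction step concrete, here is a minimal sketch of what the email regex Code node could look like. The regex, the `data` field name, and the false-positive filter are assumptions for illustration, not the author's exact code:

```javascript
// Hypothetical sketch of the email-extraction Code node.
const emailRegex = /[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/g;

const results = [];
for (const item of $input.all()) {
  const html = item.json.data || ''; // raw HTML from the HTTP Request node (field depends on node settings)
  const matches = html.match(emailRegex) || [];
  // Deduplicate and drop obvious false positives (image filenames the pattern can catch)
  const emails = [...new Set(matches)].filter(e => !/\.(png|jpg|jpeg|gif|webp)$/i.test(e));
  for (const email of emails) results.push({ json: { email } });
}
return results;
```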
by Mario
Purpose

This workflow snippet allows for advanced error catching during retry attempts. There are cases where you want to check whether an item exists first, so you can determine the following actions. Some APIs (e.g. Todoist for completed tasks) do not offer an endpoint for this, which is why you would work with the error branch instead; only that approach does not combine well with the built-in retry functionality.

How it works

- Instead of the built-in retry function of a node, a custom loop is used to get more granular control between iterations
- If the main executed node fails, the error can be filtered for an expected error, which can trigger a separate action
- Retries only happen if an unexpected error occurred
- The workflow only stops once the defined number of retries is exceeded (this control flow is sketched in JavaScript after the setup steps)

Setup

- Copy the nodes into your existing workflow
- Replace the "Replace me" placeholder with the node you want to apply the retry logic to
- Follow the sticky notes for more instructions and optional settings
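For intuition, the logic the node loop implements corresponds roughly to this plain-JavaScript sketch; the function and parameter names are illustrative assumptions, not code from the workflow itself:

```javascript
// Sketch of the custom retry loop's control flow.
const MAX_RETRIES = 3;

async function runWithGranularRetry(task, isExpectedError, onExpectedError) {
  for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
    try {
      return await task(); // the "main executed node"
    } catch (err) {
      if (isExpectedError(err)) return onExpectedError(err); // e.g. "item not found" branch
      if (attempt === MAX_RETRIES) throw err;                // stop only once retries are exhausted
      await new Promise(r => setTimeout(r, 1000 * attempt)); // wait between iterations
    }
  }
}
```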
by Tiartyos
Voice Cloning Workflow - Zyphra Zonos API

Who is this for?

This workflow is designed for developers, content creators, and businesses looking to automate high-quality voice synthesis using AI voice cloning technology.

What problem does this solve?

It automates the process of generating natural-sounding speech from text using a sample voice file, eliminating the need for manual voice recording and providing consistent voice output for applications like audiobooks, virtual assistants, or content localization.

What this workflow does

The workflow receives text and voice cloning parameters via webhook, reads a sample voice file from your storage, sends the data to Zyphra's Zonos API for voice synthesis, and saves the generated audio file to your specified output location.

Prerequisites

You'll need:

- API key from Zyphra (obtain from https://playground.zyphra.com/settings/api-keys)
- Account registration at https://playground.zyphra.com
- Sample voice file stored on accessible disk/cloud storage
- n8n instance running with webhook capabilities

Setup

1. Configure your Zyphra API key in the "Call Zyphra Clone API" node under Header Parameters (Name: X-API-Key, Value: your-api-key)
2. Ensure your sample voice files are accessible at the paths you'll specify
3. Test that the webhook endpoint is accessible

Supported Audio Formats

The API supports multiple output formats through the mime_type parameter:

- **WebM** (default) - audio/webm
- **Ogg** - audio/ogg
- **WAV** - audio/wav
- **MP3** - audio/mp3 or audio/mpeg
- **MP4/AAC** - audio/mp4 or audio/aac

Usage Example

Endpoint: POST http://localhost:5678/webhook-test/voice-clone
Headers: Content-Type: application/json

Request Body:

```json
{
  "text": "Hello there! This voice sounds just like the sample!",
  "speaking_rate": 18,
  "sample_voice_path": "/data/output/sampleVoice.wav",
  "output_path": "/data/output/",
  "language_iso_code": "en-us",
  "mime_type": "audio/wav",
  "model": "zonos-v0.1-transformer",
  "emotion": {
    "happiness": 0.8,
    "neutral": 0.3,
    "sadness": 0.05,
    "disgust": 0.05,
    "fear": 0.05,
    "surprise": 0.05,
    "anger": 0.05,
    "other": 0.5
  }
}
```

Parameters

Required:

- **text**: Text to synthesize into speech
- **sample_voice_path**: Path to your voice sample file
- **output_path**: Directory where generated audio will be saved

Optional (with defaults):

- **speaking_rate**: 15 - speech speed
- **language_iso_code**: "en-us" - language code
- **mime_type**: "audio/wav" - output audio format
- **model**: "zonos-v0.1-transformer" - AI model to use
- **emotion**: Object with emotion levels (0-1 scale)
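A quick way to exercise the webhook from code, assuming the default test URL shown above (run in an async context, Node 18+ with built-in fetch):

```javascript
// Minimal client sketch for the voice-clone webhook; only required fields sent.
const res = await fetch('http://localhost:5678/webhook-test/voice-clone', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    text: 'Hello there! This voice sounds just like the sample!',
    sample_voice_path: '/data/output/sampleVoice.wav',
    output_path: '/data/output/',
    mime_type: 'audio/wav',
  }),
});
console.log(res.status, await res.text());
```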
by Jakkrapat Ampring
Description

This workflow automatically generates personalized certificates in Google Slides and emails them to respondents only if they meet a minimum score threshold, using data submitted via Google Forms (stored in Google Sheets).

Ideal for:

- Online courses
- Quizzes and workshops
- Event participation certificates

Sheet Requirements

Your connected Google Sheet (from the Google Form) must contain:

- Full Name - the name to appear on the certificate
- Email - the recipient's email address
- Score - the test/quiz score used for the threshold logic

Setup Instructions

1. Connect Google Sheets - make sure your Form responses are linked to a Sheet with the columns mentioned above.
2. Set Score Threshold - modify the If node to your desired minimum score (e.g., >= 80); see the expression sketch below.
3. Customize Certificate Template - use a Google Slides file with text placeholders like {{Full Name}}.
4. Connect Gmail & Google Drive - for sending emails and saving generated certificates.
5. Update File IDs - replace any placeholder Slide and Drive file IDs with your own.

Services Used

- Google Sheets (Form responses)
- Google Slides (Certificate template)
- Google Drive (Storage)
- Gmail (Email delivery)

Troubleshooting

- Issue: "Cannot read property 'Score'" → Ensure your column names match exactly (Score, Full Name, etc.).
- Slides not replacing placeholders → Double-check the placeholder format ({{Full Name}}) and capitalization.
- Emails not sending → Verify Gmail authentication and make sure the If node is correctly filtering results.
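For step 2, the If node's condition can be written as a plain n8n expression. This is a minimal sketch that assumes your sheet column is literally named Score:

```javascript
// n8n If-node condition (boolean expression):
{{ $json.Score >= 80 }}

// If the Sheets value arrives as text, coerce it first:
{{ Number($json.Score) >= 80 }}
```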
by Adam Janes
How it works:

Whenever a new event is scheduled on your Google Calendar, this workflow generates a meeting-briefing email giving an overview of each person on the call and the company they work for. It uses the web search tool of the OpenAI Responses API for the lookups (a sample request body is sketched below).

The workflow triggers when a new event is added to the calendar, loops over each attendee, generates reports on each person and their company, collates the results, and sends the briefing as an email.

Set up steps:

1. Add your credentials for Google Calendar (for viewing events) and Gmail (to send the email).
2. Add your OpenAI credentials as Header Auth on the Company Search and Person Search nodes:
   - Name: Authorization
   - Value: Bearer {{ YOUR_API_KEY }}
3. Edit the "Edit Fields" node with the email address that should receive the briefing, and a short bit of context about yourself.
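For reference, the search nodes call the Responses API with web search enabled; a minimal request body could look like this. The model name, tool type string, and prompt are assumptions that may need adjusting to the current API version:

```json
{
  "model": "gpt-4o",
  "tools": [{ "type": "web_search_preview" }],
  "input": "Write a short professional briefing on Jane Doe, Head of Data at ExampleCorp."
}
```

The response's output text is what the workflow collates into the briefing email.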
by David Olusola
AI-Powered Airtable Contact Manager

Overview

The AI-Powered Airtable Contact Manager is an intelligent n8n workflow that enables AI assistants to seamlessly manage contact data in Airtable through natural language interactions. Using the Model Context Protocol (MCP), this workflow bridges the gap between conversational AI and structured data management.

How It Works

This workflow creates a powerful AI-to-database interface that allows users to manage their Airtable contacts through natural language commands. Here's the complete flow:

1. AI Interaction Layer
   - Users interact with an AI assistant using natural language
   - Examples: "Add John Doe to contacts", "Find all contacts assigned to Sarah", "Show me contact details for ID xyz"
2. MCP Server Trigger
   - The AI assistant processes the user's request and identifies the needed operation
   - Sends structured commands to the n8n workflow via the Model Context Protocol (MCP)
   - Acts as the intelligent routing system for all contact operations
3. Airtable Operations: the workflow provides four core contact management functions
   - 🔍 Get Record: retrieves specific contact details. Input: Record ID from the AI. Output: complete contact information (Name, Email, Assignee, Status).
   - ➕ Create Record: adds new contacts to the database. Input: contact details (Name, Email, Assignee). Output: new record with auto-generated ID and default status.
   - 🗑️ Delete Record: removes contacts permanently. Input: Record ID to delete. Output: confirmation of successful deletion.
   - 🔎 Search Records: finds contacts using flexible criteria. Input: Airtable formula for filtering. Output: all matching contact records.
4. Smart Data Handling
   - The workflow uses AI-powered field mapping with $fromAI() functions (see the sketch at the end of this listing)
   - Automatically extracts relevant information from natural language requests
   - Maintains data integrity with proper field validation

Setup Steps

Prerequisites:

- n8n instance (cloud or self-hosted)
- Airtable account with API access
- MCP-compatible AI system

Step 1: Airtable Preparation

1. Create an Airtable base: set up a new base or use an existing one, and note your Base ID (starts with `app`).
2. Set up the contact table with these fields, and note your Table ID (starts with `tbl`):
   - Name (Single line text)
   - email (Email)
   - Assignee (Single line text)
   - Status (Single select: Todo, In progress, Done)
3. Generate an API token: go to https://airtable.com/developers/web/api/introduction, create a personal access token with full permissions, and save it securely.

Step 2: n8n Configuration

1. Import the workflow: copy the workflow JSON and import it into your n8n instance.
2. Configure Airtable credentials: go to Credentials in n8n, create a new "Airtable Personal Access Token" credential, enter your Airtable API token, and name it "full access" (or update the credential references in the workflow).
3. Update Base and Table IDs: replace YOUR_AIRTABLE_BASE_ID with your actual Base ID (starts with `app`) and YOUR_AIRTABLE_TABLE_ID with your actual Table ID (starts with `tbl`) in all four Airtable nodes.
4. Update credential references: replace your-airtable-credential-id with your actual credential ID, or rename your credential to match "Airtable API Token".

Step 3: MCP Integration

1. Configure the MCP server: set up your MCP server to communicate with n8n, replace your-webhook-path-here and your-webhook-id-here with your actual webhook details, and configure your AI system to use this workflow.
2. Update node IDs (optional): the workflow uses placeholder node IDs for privacy; n8n will auto-generate new IDs when you import, so no action is needed unless you're referencing specific nodes.
3. Test the integration: activate the workflow in n8n, test each operation through your AI interface, and verify that data flows correctly between the AI and Airtable.

Step 4: Customization (Optional)

- Add more fields: extend the Airtable schema as needed, update the Create Record node field mappings, and modify the Search Records filters.
- Enhanced error handling: add error handling nodes, set up notifications for failed operations, and implement retry logic for reliability.

Usage Examples

Once set up, users can interact with the system naturally.

Creating contacts:
- "Add Sarah Johnson with email sarah@company.com, assign to Mike"
- "Create a new contact for David Wilson, email david@startup.io"

Finding contacts:
- "Show me all contacts assigned to Jennifer"
- "Find contacts with status 'In progress'"
- "Search for contacts with gmail addresses"

Managing records:
- "Get details for contact rec123ABC"
- "Delete the contact with ID rec456DEF"
- "Update John's status to Done"

Benefits

- **Natural Language Interface**: No technical knowledge required
- **Automated Data Entry**: Reduces manual work and errors
- **Flexible Searching**: Find contacts using any criteria
- **AI-Powered**: Leverages advanced language understanding
- **Scalable**: Easily extend with more operations
- **Integrated**: Works seamlessly with existing Airtable workflows

Technical Notes

- Uses n8n's $fromAI() function for intelligent data extraction
- Implements MCP for standardized AI-to-automation communication
- Supports Airtable's formula syntax for complex searches
- Maintains security through proper credential management
- Designed for high reliability with error handling capabilities

This workflow transforms contact management from a manual, time-consuming task into an effortless, conversational experience powered by AI.
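To illustrate the $fromAI() field mapping mentioned above, node parameters can be populated by the agent at runtime with expressions like these; the keys and descriptions here are illustrative, matched to the example schema rather than taken from the workflow:

```javascript
// n8n expressions in the Create Record node's field mappings:
{{ $fromAI('name', 'Full name of the contact to create', 'string') }}
{{ $fromAI('email', 'Email address of the contact', 'string') }}
{{ $fromAI('assignee', 'Team member responsible for this contact', 'string') }}
```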
by Aitor | 1Node
Stay ahead of the curve and keep your followers informed, automatically. This n8n workflow uses Perplexity AI to generate insightful answers to scheduled queries, then auto-posts the responses directly to X (Twitter).

⚙️ What this workflow does

1. Scheduled Trigger - runs at set times (daily, hourly, etc.)
2. searchQuery - define what kind of trending or relevant insight you want (e.g. "latest AI trends")
3. set API Key - securely insert your Perplexity API key
4. Perplexity API Call - fetches a short, insightful response to your query (sketched below)
5. Post to X - automatically publishes the result as a tweet

🧩 Requirements

- An n8n account (self-hosted or cloud)
- A Perplexity API key
- A connected X (Twitter) account via n8n's credentials

✅ Setup Steps

1. Add this workflow into your n8n account.
2. Edit the searchQuery node with a topic (e.g. "What's new in ecommerce automation?").
3. Paste your Perplexity API key into the set API key node.
4. Connect your X (Twitter) account in the final node.
5. Adjust the schedule timing to suit your content frequency.

💡 Ideas to Improve

- 💬 Add a formatting step to shorten or hashtag the response.
- 📊 Pull multiple trending questions and auto-schedule posts.
- 🔁 Loop responses to queue a full week of content.
- 🌐 Translate content before posting to reach a global audience.

🆘 Need help? Feel free to contact us at 1 Node. Get instant access to a library of free resources we created.
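For reference, the Perplexity call made by the HTTP Request node corresponds roughly to this sketch; the model name and prompts are assumptions, so check Perplexity's docs for current models:

```javascript
// Hypothetical sketch of the Perplexity chat-completions call.
const res = await fetch('https://api.perplexity.ai/chat/completions', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.PERPLEXITY_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'sonar',
    messages: [
      { role: 'system', content: 'Answer in under 280 characters, suitable for a tweet.' },
      { role: 'user', content: "What's new in ecommerce automation?" },
    ],
  }),
});
const data = await res.json();
console.log(data.choices[0].message.content); // the text that gets posted to X
```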
by David Ashby
Complete MCP server exposing 2 Wayback API operations to AI agents.

⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Add Wayback API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works

This workflow converts the Wayback API into an MCP-compatible interface for AI agents.

• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.archive.org
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (2 total)

🔧 Wayback (2 endpoints)
• GET /wayback/v1/available
• POST /wayback/v1/available

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

Response Format: Native Wayback API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to your configuration
• Cursor: Add the MCP server SSE URL to your configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints (see the sketch below)

✨ Benefits

• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
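As a sketch of what the wrapped endpoint does, a direct availability lookup could look like this; the query parameters and response shape are assumptions based on the public Wayback availability API and may differ for this endpoint:

```javascript
// Hypothetical direct call to the availability endpoint this server wraps.
const res = await fetch('https://api.archive.org/wayback/v1/available?url=example.com&timestamp=20200101');
console.log(await res.json()); // closest archived snapshot metadata, if any
```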
by David Ashby
Complete MCP server exposing 2 CarbonDoomsDay API operations to AI agents.

⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Add CarbonDoomsDay credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works

This workflow converts the CarbonDoomsDay API into an MCP-compatible interface for AI agents.

• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.carbondoomsday.com/api
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (2 total)

🔧 CO2 (2 endpoints)
• GET /co2/: Get CO2 measurements by date
• GET /co2/{date}/: Get the CO2 measurement from the Mauna Loa observatory for a specific date

This data is made available through the good work of the people at the Mauna Loa observatory. Their release notes say: "These data are made freely available to the public and the scientific community in the belief that their wide dissemination will lead to greater understanding and new scientific insights." We currently scrape the following sources:

• [co2_mlo_weekly.csv]
• [co2_mlo_surface-insitu_1_ccgg_DailyData.txt]
• [weekly_mlo.csv]

We have daily CO2 measurements as far back as 1958. Learn about using pagination via [the 3rd party documentation].

[co2_mlo_weekly.csv]: https://www.esrl.noaa.gov/gmd/webdata/ccgg/trends/co2_mlo_weekly.csv
[co2_mlo_surface-insitu_1_ccgg_DailyData.txt]: ftp://aftp.cmdl.noaa.gov/data/trace_gases/co2/in-situ/surface/mlo/co2_mlo_surface-insitu_1_ccgg_DailyData.txt
[weekly_mlo.csv]: http://scrippsco2.ucsd.edu/sites/default/files/data/in_situ_co2/weekly_mlo.csv
[the 3rd party documentation]: http://www.django-rest-framework.org/api-guide/pagination/#pagenumberpagination

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

Response Format: Native CarbonDoomsDay API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to your configuration
• Cursor: Add the MCP server SSE URL to your configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints (see the sketch below)

✨ Benefits

• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
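As a sketch of what the wrapped endpoint does, a direct call to the by-date endpoint could look like this; the base URL and path come from the listing above, while the response fields are an assumption:

```javascript
// Hypothetical direct call to the CO2-by-date endpoint this server wraps.
const res = await fetch('https://api.carbondoomsday.com/api/co2/2020-01-01/');
console.log(await res.json()); // e.g. a measurement record with date and ppm value
```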