by Wikus Bergh
**Who is this for?**

This template is ideal for n8n administrators, automation engineers, and DevOps teams who want to maintain bidirectional synchronization between their n8n workflows and GitHub repositories. It helps teams keep their workflow backups up to date and ensures consistency between their n8n instance and version control system.

**What problem is this workflow solving?**

Managing workflow versions across n8n and GitHub can become complex when changes happen in both places. This workflow solves that by automatically synchronizing workflows bidirectionally, ensuring that the most recent version is always available in both systems without manual intervention or version conflicts.

**What this workflow does:**

- Runs on a weekly schedule (every Monday) to check for synchronization needs.
- Fetches all workflows from your n8n instance and compares them with GitHub repository files.
- Identifies workflows that exist only in n8n and uploads them to GitHub as JSON backups.
- Identifies workflows that exist only in GitHub and creates them in your n8n instance.
- For workflows that exist in both places, compares timestamps and syncs the most recent version:
  - If the n8n version is newer → updates GitHub with the latest workflow
  - If the GitHub version is newer → updates n8n with the latest workflow
- Automatically handles file naming, encoding/decoding, and commit messages with timestamps.

**Setup:**

1. **Connect GitHub:** Configure GitHub API credentials in the GitHub nodes. Note: use a GitHub Personal Access Token (classic) with repo permissions to read and write workflow files.
2. **Connect n8n API:** Provide your n8n API credentials in the n8n nodes (see the n8n API docs).
3. **Configure GitHub details** in the Set GitHub Details node:
   - `github_account_name`: your GitHub username or organization
   - `github_repo_name`: the repository name where workflows should be stored
   - `repo_workflows_path`: the folder path in your repo (e.g., `workflows` or `n8n-workflows`)
4. **Adjust schedule:** Modify the Schedule Trigger if you want a different sync frequency (currently set to weekly on Mondays).
5. **Test the workflow:** Run it manually first to ensure all connections and permissions are working correctly.

**How to customize this workflow to your needs:**

- **Change sync frequency**: Modify the Schedule Trigger to run daily, hourly, or on demand.
- **Add filtering**: Extend the Filter node to exclude certain workflows (e.g., test workflows, templates).
- **Add notifications**: Insert Slack, email, or webhook notifications to report sync results.
- **Implement conflict resolution**: Add custom logic for handling workflows with the same timestamp.
- **Add workflow validation**: Include checks to validate workflow JSON before syncing.
- **Branch management**: Modify to sync to different branches or create pull requests instead of direct commits.
- **Backup retention**: Add logic to maintain multiple versions or archive old workflows.

**Key features:**

- **Bidirectional sync**: Handles changes from both n8n and GitHub
- **Timestamp-based conflict resolution**: Always keeps the most recent version
- **Automatic file naming**: Converts workflow names to valid filenames
- **Base64 encoding/decoding**: Properly handles JSON workflow data
- **Comprehensive comparison**: Uses dataset comparison to identify differences
- **Automated commits**: Includes timestamps in commit messages for traceability

This automated synchronization workflow provides a robust backup and version control solution for n8n workflows, ensuring your automation assets are always safely stored and consistently available across environments.
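For readers who want to customize the conflict-resolution step, here is a minimal sketch of the per-workflow decision logic as an n8n Code node. The field names `updatedAt` and `fileLastModified` are illustrative assumptions; adapt them to the actual output of your n8n API and GitHub nodes.

```js
// Minimal sketch of the per-workflow sync decision (n8n Code node).
// Field names (updatedAt, fileLastModified) are illustrative assumptions;
// adapt them to the actual output of your n8n API and GitHub nodes.
const wf = $json; // one workflow item from the comparison step

// Convert the workflow name into a safe, stable filename.
const fileName = wf.name
  .toLowerCase()
  .replace(/[^a-z0-9-_ ]/g, '') // drop characters invalid in file paths
  .trim()
  .replace(/\s+/g, '_') + '.json';

// Timestamp-based conflict resolution: the newest copy wins.
const n8nUpdated = new Date(wf.updatedAt).getTime();
const gitUpdated = new Date(wf.fileLastModified || 0).getTime();

let action;
if (!wf.fileLastModified) action = 'create_in_github';
else if (n8nUpdated > gitUpdated) action = 'update_github';
else if (gitUpdated > n8nUpdated) action = 'update_n8n';
else action = 'skip';

// GitHub's contents API expects file content as base64.
const content = Buffer.from(JSON.stringify(wf, null, 2)).toString('base64');

return [{ json: { fileName, action, content } }];
```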
by Gareth B. Davies
Create engaging, structured threads on Bluesky with precise control over post timing and visibility. This workflow helps content creators and social media managers schedule and publish threaded posts that maintain proper connections and formatting, ensuring your content appears exactly as intended.

**How it works**

- Creates an initial visible post that starts your thread
- Adds a series of hidden reply posts that form the body of your thread
- Maintains proper parent-child relationships between posts to ensure correct threading
- Enforces timing delays between posts to prevent rate limiting
- Concludes with two visible posts at the end of your thread

The result is a clean, professional-looking thread where only the first and last two posts are immediately visible to your followers, encouraging engagement while maintaining a clean profile view.

**Set up steps (10-30 minutes)**

1. Create a Bluesky account
2. Enter your Bluesky handle and app password in the "Set Bluesky Credentials" node
3. Customize the post text in the Code nodes to match your content: the initial visible post, the hidden reply posts, and the final visible posts
4. Adjust the scheduling in the "Run Daily at 9 AM" node to match your preferred posting time

**Suggested enhancements**

- Add error handling with retry logic for API failures
- Add input validation for post length and credential format
- Include error notifications via email or Slack
- Add data persistence to track successful posts and resume failed threads
- Make timing delays configurable with exponential backoff
- Add monitoring for rate limits and API quotas

**For Social Media Managers who want:**

- Control over post visibility and timing
- Automated posting of long-form content
- Professional-looking content presentation
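For reference when customizing the Code nodes, this is roughly how the parent-child threading works at the AT Protocol level. It's a sketch, not the template's exact code: every reply record carries both the thread root and its immediate parent.

```js
// Sketch of Bluesky threading via the AT Protocol HTTP API.
const BASE = 'https://bsky.social/xrpc';

async function createSession(identifier, password) {
  const res = await fetch(`${BASE}/com.atproto.server.createSession`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ identifier, password }),
  });
  return res.json(); // { accessJwt, did, ... }
}

async function createPost(session, text, root, parent) {
  const record = {
    $type: 'app.bsky.feed.post',
    text,
    createdAt: new Date().toISOString(),
  };
  if (root && parent) record.reply = { root, parent }; // reply posts only
  const res = await fetch(`${BASE}/com.atproto.repo.createRecord`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${session.accessJwt}`,
    },
    body: JSON.stringify({
      repo: session.did,
      collection: 'app.bsky.feed.post',
      record,
    }),
  });
  return res.json(); // { uri, cid } becomes the next post's parent
}
```

The first post's returned `{ uri, cid }` serves as `root` for every reply, and each call's result becomes the `parent` of the next post; inserting a delay between calls is what keeps the workflow under Bluesky's rate limits.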
by Sira Ekabut
Facebook access tokens expire quickly, requiring regular updates for continued API access. This workflow simplifies the process of exchanging short-lived tokens for long-lived ones, saving time and reducing manual effort.

**What this workflow does**

- Exchanges a short-lived Facebook User Access Token for a long-lived token using the Facebook Graph API.
- Optionally retrieves a long-lived Page Access Token associated with the user.
- Outputs both the user and page tokens for further use in automation or integrations.

**Setup**

Prerequisites:
- A valid Facebook App ID and App Secret.
- A short-lived User Access Token from the Facebook platform.
- (Optional) The App-Scoped User ID for fetching associated page tokens.

Workflow configuration:
- Replace placeholder values in the "Set Parameter" node with your Facebook credentials and token.
- Run the workflow manually to generate long-lived tokens.

Documentation reference: follow the official Facebook guide for more details: https://developers.facebook.com/docs/facebook-login/guides/access-tokens/get-long-lived/
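For context, these are the two Graph API calls the workflow wraps, per the linked guide. A sketch only; the Graph API version below is an assumption, so pin whichever version your app targets.

```js
// Sketch of the token-exchange calls from Facebook's long-lived-token guide.
const GRAPH = 'https://graph.facebook.com/v19.0'; // assumed version

async function getLongLivedUserToken(appId, appSecret, shortLivedToken) {
  const params = new URLSearchParams({
    grant_type: 'fb_exchange_token',
    client_id: appId,
    client_secret: appSecret,
    fb_exchange_token: shortLivedToken,
  });
  const res = await fetch(`${GRAPH}/oauth/access_token?${params}`);
  const data = await res.json();
  return data.access_token; // long-lived user token (roughly 60 days)
}

async function getLongLivedPageTokens(appScopedUserId, longLivedUserToken) {
  // Page tokens fetched with a long-lived user token do not expire.
  const res = await fetch(
    `${GRAPH}/${appScopedUserId}/accounts?access_token=${longLivedUserToken}`
  );
  const data = await res.json();
  return data.data; // [{ id, name, access_token, ... }]
}
```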
by David Ashby
Complete MCP server exposing all Oura Tool operations to AI agents. Zero configuration needed - all 4 operations pre-built.

⚡ **Quick Setup**

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 **How it Works**

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Oura Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Oura Tool node with full error handling

📋 **Available Operations (4 total)**

Every possible Oura Tool operation is included:

🔧 Profile (1 operation)
• Get a profile

🔧 Summary (3 operations)
• Get activity summary
• Get readiness summary
• Get sleep summary

🤖 **AI Integration**

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native Oura API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 **Usage Examples**

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ **Benefits**

• Complete Coverage: Every Oura Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
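As an illustration of the AI-expression mechanism, a pre-built tool parameter typically looks like the snippet below; the key, description, and type shown are hypothetical, not taken from this template, and the exact $fromAI() signature may vary by n8n version.

```js
// n8n expression on a tool-node parameter. When an AI agent calls the tool,
// it supplies a value matching this key, description, and type.
// Signature sketch (check your n8n version): $fromAI(key, description, type)
{{ $fromAI('startDate', 'Start of the summary period, YYYY-MM-DD', 'string') }}
```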
by Shiv Gupta
**Trustpilot Insights Scraper: Auto Reviews via Bright Data + Google Sheets Sync**

**Overview**

A comprehensive n8n automation that scrapes Trustpilot business reviews using Bright Data and automatically stores structured data in Google Sheets.

**Workflow Architecture**

**1. 📝 Form Trigger Node**
- Purpose: Manual input interface for users
- Type: n8n-nodes-base.formTrigger
- Configuration: Form Title: "Website URL"; Field: "Trustpilot Website URL"
- Function: Accepts a Trustpilot URL from users to initiate the scraping process

**2. 🌐 HTTP Request (Trigger Scraping)**
- Purpose: Initiates scraping on the Bright Data platform
- Type: n8n-nodes-base.httpRequest
- Method: POST
- Endpoint: https://api.brightdata.com/datasets/v3/trigger
- Query Parameters: dataset_id: gd_lm5zmhwd2sni130p; include_errors: true; limit_multiple_results: 2
- Headers: Authorization: Bearer BRIGHT_DATA_API_KEY
- Body: JSON with the input URL and 35+ custom output fields

**Custom Output Fields**

The workflow extracts the following data points:
- **Company information**: company_name, company_logo, company_overall_rating, company_total_reviews, company_about, company_email, company_phone, company_location, company_country, company_category, company_id, company_website
- **Review data**: review_id, review_date, review_rating, review_title, review_content, review_date_of_experience, review_url, date_posted
- **Reviewer information**: reviewer_name, reviewer_location, reviews_posted_overall
- **Review metadata**: is_verified_review, review_replies, review_useful_count
- **Rating distribution**: 5_star, 4_star, 3_star, 2_star, 1_star
- **Additional fields**: url, company_rating_name, is_verified_company, breadcrumbs, company_other_categories

**3. ⌛ Snapshot Progress Check**
- Purpose: Monitors scraping job status
- Type: n8n-nodes-base.httpRequest
- Method: GET
- Endpoint: https://api.brightdata.com/datasets/v3/progress/{{ $json.snapshot_id }}
- Query Parameters: format=json
- Headers: Authorization: Bearer BRIGHT_DATA_API_KEY
- Function: Receives the snapshot_id from the previous step and checks whether the data is ready

**4. ✅ IF Node (Status Check)**
- Purpose: Determines the next action based on scraping status
- Type: n8n-nodes-base.if
- Condition: $json.status === "ready"
- Logic: if true, proceeds to data download; if false, triggers the wait cycle

**5. 🕒 Wait Node**
- Purpose: Implements a polling delay for incomplete jobs
- Type: n8n-nodes-base.wait
- Duration: 1 minute
- Function: Pauses execution before re-checking snapshot status

**6. 🔄 Loop Logic**
- Purpose: Continuous monitoring until completion
- Flow: Wait → Check Status → Evaluate → (Loop or Proceed)
- Prevents: API rate limiting and unnecessary requests

**7. 📥 Snapshot Download**
- Purpose: Retrieves the completed scraped data
- Type: n8n-nodes-base.httpRequest
- Method: GET
- Endpoint: https://api.brightdata.com/datasets/v3/snapshot/{{ $json.snapshot_id }}
- Query Parameters: format=json
- Headers: Authorization: Bearer BRIGHT_DATA_API_KEY
**8. 📊 Google Sheets Integration**
- Purpose: Stores the extracted data in a spreadsheet
- Type: n8n-nodes-base.googleSheets
- Operation: Append
- Configuration: Document ID: 1yQ10Q2qSjm-hhafHF2sXu-hohurW5_KD8fIv4IXEA3I; Sheet Name: "Trustpilot"; Mapping: auto-map all 35+ fields; Credentials: Google OAuth2 integration

**Data Flow**

User Input (URL) → Bright Data API Call → Snapshot ID Generated → Status Check Loop → Data Ready Check → Download Complete Dataset → Append to Google Sheets

**Technical Specifications**

Authentication:
- Bright Data: Bearer token authentication
- Google Sheets: OAuth2 integration

Error handling:
- Includes error tracking in Bright Data requests
- Conditional logic prevents infinite loops
- Wait periods prevent API rate limiting

Data processing:
- Mapping mode: auto-map input data
- Schema: 35+ predefined fields with string types
- Conversion: no type conversion (preserves raw data)

**Setup Requirements**

Prerequisites:
1. Bright Data account: active account with API access
2. Google account: with the Sheets API enabled
3. n8n instance: self-hosted or cloud version

Configuration steps:
1. API keys: configure the Bright Data bearer token
2. OAuth setup: connect Google Sheets credentials
3. Dataset ID: verify the correct Bright Data dataset ID
4. Sheet access: ensure proper permissions for the target spreadsheet

Environment variables:
- BRIGHT_DATA_API_KEY: your Bright Data API authentication token

**Use Cases**

Business intelligence:
- Competitor analysis and market research
- Customer sentiment monitoring
- Brand reputation tracking

Data analytics:
- Review trend analysis
- Rating distribution studies
- Customer feedback aggregation

Automation benefits:
- **Scalability**: Handle multiple URLs sequentially
- **Reliability**: Built-in error handling and retry logic
- **Efficiency**: Automated data collection and storage
- **Consistency**: Standardized data format across all scrapes

**Limitations and Considerations**

Rate limits:
- The Bright Data API has usage limitations
- 1-minute wait periods help manage request frequency

Data volume:
- Limited to 2 results per request (configurable)
- Large datasets may require multiple workflow runs

Compliance:
- Ensure compliance with Trustpilot's terms of service
- Respect robots.txt and rate-limiting guidelines

**Monitoring and Maintenance**

Status tracking:
- Monitor workflow execution logs
- Check Google Sheets for data accuracy
- Review Bright Data usage statistics

Regular updates:
- Update API keys as needed
- Verify the dataset ID remains valid
- Test workflow functionality periodically

**Workflow Metadata**

- Version ID: dd3afc3c-91fc-474e-99e0-1b25e62ab392
- Instance ID: bc8ca75c203589705ae2e446cad7181d6f2a7cc1766f958ef9f34810e53b8cb2
- Execution Order: v1
- Active Status: currently inactive (requires manual activation)
- Template Status: credentials setup completed

For any questions or support, please contact: Email or fill out this form
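To make the trigger/poll/download cycle concrete, here is a sketch in plain JavaScript using the endpoints listed above. The request-body shape (`[{ url }]`) is an assumption; check your Bright Data dataset's input specification.

```js
// Sketch of the trigger -> poll -> download cycle this workflow implements.
const API = 'https://api.brightdata.com/datasets/v3';
const HEADERS = {
  Authorization: `Bearer ${process.env.BRIGHT_DATA_API_KEY}`,
  'Content-Type': 'application/json',
};

async function scrapeTrustpilot(url) {
  // 1. Trigger the scrape for the submitted Trustpilot URL.
  //    The input-body shape is an assumption; see your dataset's spec.
  const trigger = await fetch(
    `${API}/trigger?dataset_id=gd_lm5zmhwd2sni130p&include_errors=true&limit_multiple_results=2`,
    { method: 'POST', headers: HEADERS, body: JSON.stringify([{ url }]) }
  ).then((r) => r.json());

  // 2. Poll every minute until the snapshot is ready (the Wait/IF loop).
  let status = '';
  while (status !== 'ready') {
    await new Promise((resolve) => setTimeout(resolve, 60_000));
    const progress = await fetch(
      `${API}/progress/${trigger.snapshot_id}?format=json`,
      { headers: HEADERS }
    ).then((r) => r.json());
    status = progress.status;
  }

  // 3. Download the completed dataset (the Snapshot Download step).
  return fetch(`${API}/snapshot/${trigger.snapshot_id}?format=json`, {
    headers: HEADERS,
  }).then((r) => r.json());
}
```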
by Octoleo
**Overview**

This workflow automates the backup of all workflows from your system to a Git repository hosted on Gitea. It runs on a scheduled trigger, fetching, encoding, and committing workflow data, ensuring seamless version control and disaster recovery.

📌 Quick setup: just update three global variables and configure authentication. No manual exports needed!

**How It Works (Quick Glance)**

1️⃣ Scheduled Execution → Runs automatically at defined intervals.
2️⃣ Fetch Workflows → Uses the API to retrieve all workflows.
3️⃣ Process Workflows → Converts workflow data into a Git-friendly format.
4️⃣ Commit & Push to Git → Saves workflows in a Gitea repository.

**Setup Steps (⚡ Takes ~5 min)**

1️⃣ **Set global variables.** Go to the Globals section in the workflow and update:
- `repo.url` → https://your-gitea-instance.com (replace with your actual Gitea URL)
- `repo.name` → workflows (repository name where backups will be stored)
- `repo.owner` → octoleo (Gitea account that owns the repository)

📌 These three variables define where the workflows are stored.

2️⃣ **Configure Gitea authentication.**
- Go to your Gitea account → generate a Personal Access Token.
- In the credential manager, create a new Gitea token with:
  - Name: Authorization
  - Value: Bearer YOUR_PERSONAL_ACCESS_TOKEN

📌 Ensure there is a space after "Bearer" before the token!

3️⃣ **Link credentials to Git nodes.** Attach the Gitea credentials to these three Git nodes:
- GetGitea → retrieves existing repository data
- PutGitea → updates workflows
- PostGitea → adds new workflows

4️⃣ **Link credentials for API requests.** Add API authentication in the node that fetches all workflows.

5️⃣ **Test & activate.**
- Run the workflow manually to confirm backups work.
- Enable the schedule trigger for automation.

📌 The workflow automatically checks for changes before committing updates.

**Why Use This Workflow?**

✅ Automated backups → no manual exports needed.
✅ Version control → easily track workflow changes.
✅ Simple setup → just configure globals & credentials.
✅ Secure → uses token-based authentication.

**Next Steps**

💬 Have questions? Reach out on the forum! 🚀
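Under the hood, the three Git nodes map onto Gitea's contents API roughly as sketched below. This is a simplified illustration assuming the globals described above; the filename scheme and commit messages are placeholders.

```js
// Sketch of the Gitea contents-API calls behind GetGitea / PutGitea / PostGitea.
const repoUrl = 'https://your-gitea-instance.com';
const repoOwner = 'octoleo';
const repoName = 'workflows';
const headers = {
  Authorization: 'Bearer YOUR_PERSONAL_ACCESS_TOKEN',
  'Content-Type': 'application/json',
};

async function backupWorkflow(workflow) {
  const path = `${workflow.name.replace(/[^\w-]/g, '_')}.json`; // placeholder scheme
  const url = `${repoUrl}/api/v1/repos/${repoOwner}/${repoName}/contents/${path}`;
  const content = Buffer.from(JSON.stringify(workflow, null, 2)).toString('base64');

  // GetGitea: does the file already exist? (404 means it doesn't)
  const existing = await fetch(url, { headers });

  if (existing.ok) {
    // PutGitea: update the existing file; the current sha is required.
    const { sha } = await existing.json();
    return fetch(url, {
      method: 'PUT',
      headers,
      body: JSON.stringify({ content, sha, message: `Update ${path}` }),
    });
  }
  // PostGitea: create the file on first backup.
  return fetch(url, {
    method: 'POST',
    headers,
    body: JSON.stringify({ content, message: `Add ${path}` }),
  });
}
```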
by Jimleuk
This n8n template showcases a cool feature of n8n Forms where the form itself can be defined dynamically using the form fields schema. It may be debatable how useful this template actually is, since both Airtable and Baserow provide form interfaces already, but it's still a great exercise and demonstration if ever the use case comes around.

**How it works**

- A form trigger is used to dynamically select a database/table from which to build the n8n form.
- The table's schema is imported into the workflow and, using the Code node, converted into the n8n form fields schema. This lets us dynamically build the fields in our n8n form when we choose to define the form using the JSON option.
- Once the n8n form submits, we convert the values back into our table's API schema so that we can create a new row. Note that any files/attachments fields are removed, as they need to be handled separately.
- Files are processed separately as they may first need to be stored. Once complete, the reference is saved into the newly created row.

Check out the example Airtable here: https://airtable.com/appfP15Xd0aVZR9xV/shrGFgXLyQ4Jg58SU

**How to use**

- The n8n form is autogenerated, which means you only need to provide access to the table.
- Using this approach, this template can be reused for any number of Airtable and/or Baserow tables.

**Requirements**

- You'll need either an Airtable account or a Baserow account to use this template.
- An n8n instance accessible to your users.

**Customising this workflow**

- Not using either Airtable or Baserow? Theoretically any datastore which provides a fields schema can be used with this template.
- If you're feeling creative, split the table into multiple forms for a better user experience.
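As an illustration of the schema-conversion step, the Code node does something along these lines for an Airtable table. The n8n form-schema keys (`fieldLabel`, `fieldType`, `fieldOptions`) may vary between n8n versions, so treat this as a sketch and check the Form Trigger's "Define using JSON" documentation.

```js
// Sketch of the schema-conversion Code node: Airtable field metadata in,
// n8n form-fields JSON out. Exact n8n form-schema keys may vary by version.
const typeMap = {
  singleLineText: 'text',
  multilineText: 'textarea',
  email: 'email',
  number: 'number',
  date: 'date',
  singleSelect: 'dropdown',
};

const formFields = $json.fields
  // attachments need separate handling, so drop them from the form
  .filter((f) => f.type !== 'multipleAttachments')
  .map((f) => {
    const field = {
      fieldLabel: f.name,
      fieldType: typeMap[f.type] ?? 'text', // fall back to plain text
    };
    if (f.type === 'singleSelect') {
      field.fieldOptions = {
        values: f.options.choices.map((c) => ({ option: c.name })),
      };
    }
    return field;
  });

return [{ json: { formFields } }];
```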
by Davide
This low-code automation enables all eCommerce store visitors to upload a photo of themselves and virtually "try on" a garment in just a few clicks. With this workflow, WooCommerce, PrestaShop, Shopify, and other merchants can offer a cutting-edge "virtual try-on" feature with minimal development effort, enhancing customer engagement and reducing product returns.

**Key Advantages**

- **Zero-coding, visual setup**: Build end-to-end e-commerce features with drag-and-drop nodes instead of custom backend code.
- **Asynchronous, scalable processing**: A non-blocking "Wait" + "If" loop handles multi-second AI jobs gracefully, freeing up the workflow for other tasks.
- **Dynamic inputs & URLs**: Query strings (e.g. ?Product=IMAGE_URL) allow you to embed the form on any product page and pass the garment image on the fly.
- **Seamless user experience**: An instant pop-up within your storefront and automatic redirect to the generated mock-up keeps shoppers engaged without page reloads.
- **Easy credential management**: API keys, FTP credentials, and webhook IDs are all stored securely in n8n's credential manager.

**How It Works**

1. **Form submission**: A user submits a form with their name, an image of themselves ("Me"), and a hidden product image URL ("Product"). The form is triggered via the On form submission node, which collects the input data.
2. **Image upload**: The uploaded image ("Me") is sent to an FTP server for temporary storage using the FTP node. The filename includes a timestamp to ensure uniqueness.
3. **Virtual try-on request**: The Create Image node sends a POST request to the Fal.run API, providing the uploaded human image URL (from FTP) and the product image URL (from the hidden form field). This generates a virtual try-on result.
4. **Result processing**: The workflow checks the status of the image generation (Get status node) in a loop, with a 10-second wait between checks, until it is marked as "COMPLETED". Once ready, the final image URL is fetched (Get Url image node) and displayed to the user via a redirect (Form node).
5. **User experience**: The user is redirected to the generated try-on image, completing the process.

**Set Up Steps**

1. **API key setup**: Create an account and obtain an API key. Configure the Create Image node with HTTP Header Authentication: Name: Authorization; Value: Key YOURAPIKEY
2. **FTP/S3 configuration**: Set up an FTP server or S3 bucket to temporarily store uploaded user images. Configure the FTP node with your FTP credentials and storage path.
3. **Ecommerce integration**: On your WooCommerce site, add a "Try On" button that opens the form in a pop-up, dynamically passing the product image URL as a query parameter. Example: https://URL_N8N/form/ca1c314d-46c6-4eeb-b6a5-359XXXXXX?Product=IMAGE_URL
4. **Testing**: Verify the workflow by submitting a test form and ensuring the virtual try-on image is generated and displayed correctly.

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
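For orientation, the Create Image / Get status / Get Url image trio maps onto fal's queue API roughly as below. The model id and the request-body field names are assumptions; substitute the try-on model and input schema you actually use.

```js
// Sketch of the submit/poll/fetch cycle against fal's queue API.
const FAL_KEY = 'YOURAPIKEY';
const MODEL = 'fal-ai/fashn/tryon'; // assumption: use your try-on model id
const headers = { Authorization: `Key ${FAL_KEY}`, 'Content-Type': 'application/json' };

async function tryOn(humanImageUrl, garmentImageUrl) {
  // Create Image: submit the job with both image URLs.
  // Body field names are assumptions; check your model's input schema.
  const submit = await fetch(`https://queue.fal.run/${MODEL}`, {
    method: 'POST',
    headers,
    body: JSON.stringify({
      model_image: humanImageUrl,     // the shopper's uploaded photo
      garment_image: garmentImageUrl, // the hidden "Product" field
    }),
  }).then((r) => r.json());

  const base = `https://queue.fal.run/${MODEL}/requests/${submit.request_id}`;

  // Get status: poll every 10 seconds until COMPLETED.
  let status = '';
  while (status !== 'COMPLETED') {
    await new Promise((resolve) => setTimeout(resolve, 10_000));
    status = (await fetch(`${base}/status`, { headers }).then((r) => r.json())).status;
  }

  // Get Url image: fetch the final result and return the generated image URL.
  const result = await fetch(base, { headers }).then((r) => r.json());
  return result.images?.[0]?.url; // output shape depends on the model
}
```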
by David Ashby
🛠️ **MailerLite Tool MCP Server**

Complete MCP server exposing all MailerLite Tool operations to AI agents. Zero configuration needed - all 4 operations pre-built.

⚡ **Quick Setup**

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 **How it Works**

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every MailerLite Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n MailerLite Tool node with full error handling

📋 **Available Operations (4 total)**

Every possible MailerLite Tool operation is included:

🔧 Subscriber (4 operations)
• Create a subscriber
• Get a subscriber
• Get many subscribers
• Update a subscriber

🤖 **AI Integration**

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native MailerLite API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 **Usage Examples**

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ **Benefits**

• Complete Coverage: Every MailerLite Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by Solomon
Based on Jonathan's work. Check out his templates.

**How it works**

This workflow backs up your workflows to GitHub. It uses the n8n API node to export all workflows, then loops over the data and checks GitHub to see whether a file named after the workflow's ID already exists. Once checked, it will:

- update the file on GitHub if it exists;
- create a new file if it doesn't exist;
- ignore it if the content is the same.

**Who is this for?**

People wanting to back up their workflows outside the server, for safety purposes or to migrate to another server.

Check out my other templates 👉 https://n8n.io/creators/solomon/
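The exists/update/create/ignore decision corresponds to GitHub's contents API roughly as sketched here; owner, repo, and path are placeholders, and the content comparison is simplified.

```js
// Sketch of the per-workflow backup decision via GitHub's contents API.
const headers = {
  Authorization: 'Bearer YOUR_GITHUB_TOKEN',
  Accept: 'application/vnd.github+json',
};

async function backup(workflow) {
  const path = `workflows/${workflow.id}.json`; // placeholder path scheme
  const url = `https://api.github.com/repos/OWNER/REPO/contents/${path}`;
  const newContent = Buffer.from(JSON.stringify(workflow, null, 2)).toString('base64');

  const res = await fetch(url, { headers });

  if (res.ok) {
    const file = await res.json();
    // Ignore if unchanged (GitHub returns base64 with embedded newlines).
    if (file.content.replace(/\n/g, '') === newContent) return 'same';
    // Update: GitHub requires the existing file's sha.
    await fetch(url, {
      method: 'PUT',
      headers,
      body: JSON.stringify({ message: `Update ${path}`, content: newContent, sha: file.sha }),
    });
    return 'updated';
  }
  // Create: a PUT without a sha creates the file.
  await fetch(url, {
    method: 'PUT',
    headers,
    body: JSON.stringify({ message: `Add ${path}`, content: newContent }),
  });
  return 'created';
}
```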
by Guido X Jansen
**Introduction**

**Manual LinkedIn data collection is time-consuming, error-prone, and results in inconsistent data quality across CRM/database records.** This workflow is great for organizations that struggle with:

- Incomplete contact records with only LinkedIn URLs but missing profile details
- Hours spent manually copying LinkedIn information into databases
- Inconsistent data formats due to copy-paste from LinkedIn (emojis, styled text, special characters)
- Outdated profile information that doesn't reflect current roles/companies
- No systematic way to enrich contacts at scale

**Primary Users**

- Sales & marketing teams
- Event organizers & conference managers (for event materials)
- Recruitment & HR professionals
- CRM administrators

**Specific Problems Addressed**

- **Data completeness**: Automatically fills missing profile fields (headline, bio, skills, experience)
- **Data quality**: Sanitizes problematic characters that break databases/exports
- **Time efficiency**: Reduces hours of manual data entry to automated monthly updates
- **Error handling**: Gracefully manages invalid/deleted LinkedIn profiles
- **Scalability**: Processes multiple profiles in batch without manual intervention
- **Standardization**: Ensures consistent data format across all records

**Cost**

Each URL scraped by Apify costs $0.01 to get all the data above. Apify charges per scrape, regardless of how much data or how many fields you extract/use.

**Setup Instructions**

Prerequisites:
- n8n instance: access to a running n8n instance (self-hosted or cloud)
- NocoDB account: database with a table containing LinkedIn URLs
- Apify account: free or paid account for LinkedIn scraping

Required fields in the NocoDB table:

Input (a single LinkedIn URL):
- LinkedIn

Output (first/last/full name, e-mail, bio, headline, profile pic URL, current role, country, skills, current employer, employer URL, experiences including all previous jobs, personal website, and publications/articles):
- linkedin_full_name
- linkedin_first_name
- linkedin_headline
- linkedin_email
- linkedin_bio
- linkedin_profile_pic
- linkedin_current_role
- linkedin_current_company
- linkedin_country
- linkedin_skills
- linkedin_company_website
- linkedin_experiences
- linkedin_personal_website
- linkedin_publications
- linkedin_scrape_error_reason
- linkedin_scrape_last_attempt
- linkedin_scrape_status
- linkedin_last_modified

Technically you also need an Id field, but that is always there, so no need to add it :)

**n8n Setup**

1. **Import the workflow**
   - Copy the workflow JSON from the template
   - In n8n, click "Add workflow" → "Import from JSON"
   - Paste the workflow and click "Import"
2. **Configure the NocoDB connection**
   - Click on any NocoDB node in the workflow
   - Add new credentials → "NocoDB Token account"
   - Enter your NocoDB API token (found in NocoDB → User Settings → API Tokens)
   - Update the projectId and table parameters in all NocoDB nodes
3. **Set up the Apify integration**
   - Create an Apify account at apify.com
   - Generate an API token (Settings → Integrations → API)
   - In the workflow, update the Apify token in the "Get Scraper Results" node
   - Configure HTTP Query Auth credentials with your token
4. **Map your database fields**
   - Review the "Transform & Sanitize Data" node
   - Update field mappings to match your NocoDB table structure
   - Ensure these fields exist in your table: LinkedIn (URL field); linkedin_headline, linkedin_full_name, linkedin_bio, etc.; linkedin_scrape_status and linkedin_last_modified
5. **Configure the filter**
   - In the "Get Guests with LinkedIn" node, adjust the filter to match your requirements
   - Default: (LinkedIn,isnot,null)~and(linkedin_headline,is,null)
6. **Test the workflow**
   - Click "Execute Workflow" with the Manual Trigger
   - Monitor execution for any errors
   - Verify data is properly updated in NocoDB
7. **Activate the automated schedule**
   - Configure the Schedule Trigger node (default: monthly)
   - Toggle the workflow to "Active"
   - Monitor executions in the n8n dashboard

**Customization Options**

1. **Data source modifications**
   - Different database: replace NocoDB nodes with Airtable, Google Sheets, or PostgreSQL
   - Multiple tables: add parallel branches to process different contact tables
   - Custom filters: modify the WHERE clause to target specific record subsets
2. **Enrichment fields**
   - Add fields: include additional LinkedIn data like education, certifications, or recommendations
   - Remove fields: simplify by removing unnecessary fields (publications, skills)
   - Custom transformations: add business logic for field calculations or formatting
3. **Scheduling options**
   - Frequency: change from monthly to daily, weekly, or hourly
   - Time-based: set specific times for different timezones
   - Event-triggered: replace with a webhook trigger for on-demand processing
4. **Error handling enhancement**
   - Notifications: add email/Slack nodes to alert on failures
   - Retry logic: implement wait and retry for temporary failures
   - Logging: add database logging for audit trails
5. **Data quality rules**
   - Validation: add IF nodes to validate data before updates
   - Duplicate detection: check for existing records before creating new ones
   - Data standardization: add custom sanitization rules for industry-specific needs
6. **Integration extensions**
   - CRM sync: add nodes to push data to Salesforce, HubSpot, or Pipedrive
   - AI enhancement: use OpenAI to summarize bios or extract key skills
   - Image processing: download and store profile pictures locally
7. **Performance optimization**
   - Batch size: adjust the number of profiles processed per run
   - Rate limiting: add delays between API calls to avoid limits
   - Parallel processing: split large datasets across multiple workflow executions
8. **Compliance additions**
   - GDPR compliance: add consent checking before processing
   - Data retention: implement automatic cleanup of old records
   - Audit logging: track who accessed what data and when

These customizations allow the workflow to adapt from simple contact enrichment to complex data pipeline scenarios across various industries and use cases.
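As an illustration of the sanitization step, the "Transform & Sanitize Data" node does something along these lines. The regex ranges and the Apify output field names (`fullName`, `headline`, `about`) are assumptions, not the template's exact code.

```js
// Sketch of the sanitization this workflow applies: stripping emojis and
// "styled" Unicode that break database exports. Regex ranges are illustrative.
function sanitize(value) {
  if (typeof value !== 'string') return value;
  return value
    // remove emoji and pictographic symbols
    .replace(/[\u{1F000}-\u{1FAFF}\u{2600}-\u{27BF}\u{FE0F}]/gu, '')
    // remove styled mathematical alphanumerics LinkedIn bios often contain
    .replace(/[\u{1D400}-\u{1D7FF}]/gu, '')
    .replace(/\s+/g, ' ')
    .trim();
}

// Map the scraper output onto the NocoDB columns listed above.
// Input field names (fullName, headline, about) are assumed, not confirmed.
const profile = $json;
return [{
  json: {
    linkedin_full_name: sanitize(profile.fullName),
    linkedin_headline: sanitize(profile.headline),
    linkedin_bio: sanitize(profile.about),
    linkedin_scrape_status: 'success',
    linkedin_last_modified: new Date().toISOString(),
  },
}];
```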
by David Ashby
Complete MCP server exposing all Cloudflare Tool operations to AI agents. Zero configuration needed - all 4 operations pre-built.

⚡ **Quick Setup**

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 **How it Works**

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Cloudflare Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Cloudflare Tool node with full error handling

📋 **Available Operations (4 total)**

Every possible Cloudflare Tool operation is included:

🔧 Zone Certificate (4 operations)
• Delete a certificate
• Get a certificate
• Get many certificates
• Upload a certificate

🤖 **AI Integration**

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native Cloudflare API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 **Usage Examples**

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ **Benefits**

• Complete Coverage: Every Cloudflare Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.