by Grace Gbadamosi
## How it works
This workflow creates a complete MCP server that provides comprehensive API integration monitoring and testing capabilities. The server exposes five specialized tools through a single MCP endpoint: API health analysis, webhook reliability testing, rate limit monitoring, authentication verification, and client report generation. External applications can connect to this MCP server to access all monitoring tools.

## Who is this for
This template is designed for DevOps engineers, API developers, integration specialists, and technical teams responsible for maintaining API reliability and performance. It's particularly valuable for organizations managing multiple API integrations, SaaS providers monitoring client integrations, and development teams implementing API monitoring strategies.

## Requirements
- **MCP Client**: Any MCP-compatible application (Claude Desktop, a custom MCP client, or other AI tools)
- **Network Access**: Outbound HTTP/HTTPS access to test API endpoints and webhooks
- **Authentication**: Bearer token authentication for securing the MCP server endpoint
- **Target APIs**: The APIs and webhooks you want to monitor (no special configuration required on target systems)

## How to set up
1. **Configure MCP Server Authentication** – Update the MCP Server - API Monitor Entry node with your desired authentication method and generate a secure bearer token for accessing your MCP server.
2. **Deploy the Workflow** – Save and activate the workflow in your n8n instance, noting the MCP server endpoint URL that will be generated for external client connections.
3. **Connect MCP Client** – Configure your MCP client (such as Claude Desktop) to connect to the MCP server endpoint using the authentication token you configured.
4. **Test Monitoring Tools** – Use your MCP client to call the available tools (Analyze Api Health, Validate Webhook Reliability, Monitor API Limits, Verify Authentication, and Generate Client Report) with your API endpoints and credentials.
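For step 3, a client configuration might look like the following sketch. This assumes you bridge the remote HTTP endpoint through the `mcp-remote` package; the server name, endpoint URL, and token are placeholders for the values your n8n instance generates:

```json
{
  "mcpServers": {
    "api-monitor": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://your-n8n-instance/mcp/your-endpoint-path",
        "--header",
        "Authorization: Bearer YOUR_TOKEN"
      ]
    }
  }
}
```

Check your MCP client's documentation for its exact configuration file location and remote-server support.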
by Rahul Joshi
## Description
Streamline your cloud storage with this powerful Google Drive File Renamer automation built with n8n. The workflow watches a specific Google Drive folder and automatically renames new files using a standardized format based on their creation date and time—ideal for organizing images, backups, and uploads with consistent timestamp-based names. Whether you're managing daily uploads, sorting Instagram-ready content, or organizing client submissions, this timestamp-based file naming system ensures consistent and searchable file structures—without manual intervention.

## What This Template Does (Step-by-Step)

🔔 **Google Drive Trigger – "Watch Folder" Setup**
- Monitors a specific folder (e.g., "Instagram")
- Detects new file creations every minute
- Captures file metadata like ID, createdTime, and extension

🧠 **Set Formatted Name**
- Extracts the file creation time (e.g., 2025-07-22T14:45:10Z)
- Converts it into a structured name like IMG_20250722_1445.jpg
- Keeps the original file extension (JPG, PNG, PDF, etc.)

✏️ **Rename File (Google Drive)**
- Renames the original file using the Google Drive API
- Applies the new timestamped name
- Keeps file content, permissions, and location unchanged

## Required Integrations
- Google Drive API (OAuth2 credentials)

## Best For
- 📸 Content creators organizing uploads from mobile
- 🏷️ Branding teams enforcing uniform naming
- 🗄️ Admins managing scanned documents or backups
- 📂 Automated archives for media, reports, or daily snapshots

## Key Benefits
- ✅ Timestamped naming ensures chronological file tracking
- ✅ Reduces human error and messy file names
- ✅ Works in near real-time (polls every minute)
- ✅ No-code: deploy with drag-and-drop setup in n8n
- ✅ Fully customizable name patterns (e.g., change the IMG_ prefix)
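The rename logic described above can be sketched in plain JavaScript. This is an illustrative stand-in for the "Set Formatted Name" step; the function name and argument handling are assumptions, while the timestamp format matches the example in the description:

```javascript
// Build IMG_YYYYMMDD_HHMM.<ext> from a file's createdTime and original name.
function formatName(createdTime, originalName) {
  const d = new Date(createdTime);
  const pad = (n) => String(n).padStart(2, '0');
  const stamp =
    d.getUTCFullYear() +
    pad(d.getUTCMonth() + 1) +
    pad(d.getUTCDate()) +
    '_' +
    pad(d.getUTCHours()) +
    pad(d.getUTCMinutes());
  // Keep the original extension (.jpg, .png, .pdf, ...)
  const dot = originalName.lastIndexOf('.');
  const ext = dot === -1 ? '' : originalName.slice(dot);
  return `IMG_${stamp}${ext}`;
}
```

To change the IMG_ prefix or the pattern itself, adjust the template string on the last line.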
by Joseph
This n8n workflow converts various file formats (.pdf, .doc, .png, .jpg, .webp) to clean markdown text using the datalab.to API. Perfect for AI agents, LLM processing, and RAG (Retrieval-Augmented Generation) data preparation for vector databases.

## Workflow Description

**Input**
- **Trigger Node**: Form trigger or webhook to accept file uploads
- **Supported Formats**: PDF documents, Word documents (.doc/.docx), and images (PNG, JPG, WEBP)

**Processing Steps**
1. **File Validation**: Check file type and size constraints
2. **HTTP Request Node**:
   - Method: POST to https://api.datalab.to/v1/marker
   - Headers: X-API-Key with your datalab.to API key
   - Body: Multipart form data with the file
3. **Response Processing**: Extract the converted markdown text
4. **Output Formatting**: Clean and structure the markdown for downstream use

**Output**
Clean, structured markdown text ready for:
- LLM prompt injection
- Vector database ingestion
- AI agent knowledge base processing
- Document analysis workflows

## Setup Instructions
1. **Get API Access**: Sign up at datalab.to to obtain your API key
2. **Configure Credentials**: Create a new credential in n8n and add a generic header X-API-Key with your API key as the value
3. **Import Workflow**: Ready to process files immediately

## Use Cases
- **AI Workflows**: Convert documents for LLM analysis and processing
- **RAG Systems**: Prepare clean text for vector database ingestion
- **Content Management**: Batch convert files to searchable markdown format
- **Document Processing**: Extract text from mixed file types in automated pipelines

The workflow handles the complexity of different file formats while delivering consistent, AI-ready markdown output for your automation needs.
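The validation and request-building steps can be sketched as follows. The endpoint and X-API-Key header come from the description above; the size limit, helper names, and return shape are illustrative assumptions:

```javascript
// Accept only the formats the template supports, and cap upload size.
const ALLOWED = ['.pdf', '.doc', '.docx', '.png', '.jpg', '.webp'];

function validateUpload(filename, sizeBytes, maxBytes = 25 * 1024 * 1024) {
  const dot = filename.lastIndexOf('.');
  const ext = dot === -1 ? '' : filename.slice(dot).toLowerCase();
  if (!ALLOWED.includes(ext)) return { ok: false, reason: `unsupported type: ${ext || 'none'}` };
  if (sizeBytes > maxBytes) return { ok: false, reason: 'file too large' };
  return { ok: true };
}

// Options mirroring the HTTP Request node's configuration.
function buildMarkerRequest(apiKey) {
  return {
    url: 'https://api.datalab.to/v1/marker',
    method: 'POST',
    headers: { 'X-API-Key': apiKey },
    // body: multipart/form-data containing the validated file
  };
}
```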
by Sarfaraz Muhammad Sajib
## Overview
This n8n workflow automates the generation of short news videos using the HeyGen video API and RSS feeds from a Bangla news source, Prothom Alo. It is ideal for content creators, media publishers, or developers who want to create daily video summaries from text-based news feeds using AI avatars. The workflow reads the latest news summaries from an RSS feed and sends each item to the HeyGen API to create a video with a realistic avatar and voice narration. The resulting video is suitable for publishing on platforms like YouTube, Instagram, or TikTok.

## Requirements
- A HeyGen account with access to the API
- HeyGen API key (kept securely in your environment)
- n8n (self-hosted or cloud instance)
- Basic understanding of using HTTP Request nodes in n8n

## Setup Instructions
1. Clone this workflow into your n8n instance.
2. Replace the placeholder value in the X-Api-Key header with your HeyGen API key.
3. Confirm the RSS feed URL is correct and live: https://prod-qt-images.s3.amazonaws.com/production/prothomalo-bangla/feed.xml
4. The HTTP Request body references {{$json.summary}} from each RSS item. Make sure this field exists.
5. Run the workflow manually, or configure a cron trigger if you want to automate it.

## Customization
- **Avatar ID** and **Voice ID** can be changed in the HTTP Request body. Use your HeyGen dashboard to get available IDs.
- Change the video dimensions (1280x720) to suit your platform's requirements.
- You can replace the RSS feed with any other news source that supports XML format.
- Add nodes to upload the video to YouTube, Dropbox, etc., after generation.

## What It Does
1. Triggers manually.
2. Reads an RSS feed.
3. Extracts the summary from each news item.
4. Sends a request to HeyGen to generate a video.
5. Returns the video generation response.
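The HTTP Request body can be sketched as a payload builder. Note that the exact field names HeyGen expects are defined by its API documentation, so treat the keys below as illustrative placeholders to map onto the real request body; only the summary source ({{$json.summary}}) and the 1280x720 dimensions come from the description above:

```javascript
// Illustrative payload for the video-generation request.
function buildVideoPayload(summary, avatarId, voiceId) {
  return {
    script: summary,                          // from {{$json.summary}} in the RSS item
    avatarId,                                 // from your HeyGen dashboard
    voiceId,                                  // from your HeyGen dashboard
    dimensions: { width: 1280, height: 720 }, // change to suit your platform
  };
}
```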
by Avkash Kakdiya
## How it works
This workflow synchronizes support tickets in Freshdesk with issues in Linear, enabling smooth collaboration between support and development teams. It triggers on new or updated Freshdesk tickets, maps fields to Linear's format, and creates linked issues through Linear's API. Reverse synchronization is also supported, so changes in Linear update the corresponding Freshdesk tickets. Comprehensive logging ensures success and error events are always tracked.

## Step-by-step

**1. Trigger the workflow**
- **New Ticket Webhook** – Captures new Freshdesk tickets for issue creation.
- **Update Ticket Webhook** – Detects changes in existing tickets.
- **Linear Issue Updated Webhook** – Listens for updates from Linear.

**2. Transform and map data**
- **Map Freshdesk Fields to Linear** – Converts priority, status, title, and description for Linear.
- **Map Linear to Freshdesk Fields** – Converts Linear state and priority, and extracts the ticket ID for Freshdesk updates.

**3. Perform API operations**
- **Create Linear Issue** – Sends a GraphQL mutation to the Linear API.
- **Check Linear Creation Success** – Validates issue creation before linking.
- **Link Freshdesk with Linear ID** – Updates Freshdesk with the Linear reference.
- **Update Freshdesk Ticket** – Pushes Linear updates back to Freshdesk.

**4. Manage logging and errors**
- **Log Linear Creation Success** – Records successful ticket-to-issue sync.
- **Log Linear Creation Error** – Captures and logs issue creation failures.
- **Log Freshdesk Update Success** – Confirms successful reverse sync.
- **Log Missing Ticket ID Error** – Handles missing ticket reference errors.

## Why use this?
- Keep support and development teams aligned with real-time updates.
- Eliminate manual ticket-to-issue handoffs, saving time and reducing errors.
- Maintain full visibility with detailed success and error logs.
- Enable bidirectional sync between Freshdesk and Linear for true collaboration.
- Improve response times by ensuring both teams always work on the latest data.
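The field-mapping step can be sketched like this. Freshdesk's numeric scales (priority 1=Low to 4=Urgent, status 2=Open to 5=Closed) are standard; the Linear-side targets are assumptions you should adapt to your team's priority scale and workflow state names:

```javascript
// Sketch of "Map Freshdesk Fields to Linear".
// Linear priority convention: 0=No priority, 1=Urgent, 2=High, 3=Medium, 4=Low.
function mapPriority(freshdeskPriority) {
  return { 1: 4, 2: 3, 3: 2, 4: 1 }[freshdeskPriority] ?? 0;
}

// Target state names here are placeholders for your Linear workflow states.
function mapStatus(freshdeskStatus) {
  return { 2: 'Todo', 3: 'In Progress', 4: 'Done', 5: 'Done' }[freshdeskStatus] ?? 'Todo';
}
```

The reverse mapping ("Map Linear to Freshdesk Fields") inverts these tables and additionally parses the Freshdesk ticket ID out of the linked issue.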
by Tony Ciencia
## Overview
This template provides an automatic backup solution for all your n8n workflows, saving them directly to Google Drive. It's designed for freelancers, agencies, and businesses that want to keep their automations safe, versioned, and always recoverable.

## Why Backups Matter
- **Disaster recovery** – Restore workflows quickly if your instance fails.
- **Version control** – Track workflow changes over time.
- **Collaboration** – Share workflow JSON files easily with teammates.

## How it Works
1. Fetches the complete list of workflows from your n8n instance via the API.
2. Downloads each workflow in JSON format.
3. Converts the data into a file with a unique name (workflow name + ID).
4. Uploads all files to a chosen Google Drive folder.
5. Can be run manually or on an automatic schedule (daily, weekly, etc.).

## Requirements
- An active n8n instance with API access enabled
- API credentials for n8n (API key or basic auth)
- A Google account with access to Google Drive
- Google Drive credentials connected in n8n

## Setup Instructions
1. Connect your n8n API (authenticate your instance).
2. Connect your Google Drive account.
3. Select or create the Drive folder where backups will be stored.
4. Customize the Schedule Trigger to define backup frequency.
5. Run once to confirm files are stored correctly.

## Customization Options
- **Frequency** → Set daily, weekly, or monthly backups.
- **File Naming** → Adjust the filename expression (e.g., {{workflowName}}-{{workflowId}}-{{date}}.json).
- **Folder Location** → Store backups in separate Google Drive folders per project or client.

## Target Audience
This template is ideal for:
- Freelancers managing multiple client automations.
- Agencies delivering automation services.
- Teams that rely on n8n for mission-critical workflows.

It reduces risk, saves time, and ensures you never lose your work.

⏱ Estimated setup time: 5–10 minutes.
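The filename expression from the customization options can be sketched as plain JavaScript. The function name and the exact sanitization rules are illustrative; the name-plus-ID-plus-date pattern comes from the description above:

```javascript
// Build a unique, Drive-safe backup filename: workflow name + ID + date.
function backupFilename(name, id, date = new Date()) {
  const safe = name.replace(/[\\/:*?"<>|]/g, '_'); // strip characters invalid in filenames
  const day = date.toISOString().slice(0, 10);      // YYYY-MM-DD
  return `${safe}-${id}-${day}.json`;
}
```

Including the date in the name means each scheduled run produces a new versioned copy rather than overwriting the previous backup.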
by Gegenfeld
## AI Image Generator Workflow
This workflow lets you automatically generate AI images with the APImage API 🡥, download the generated image, and upload it to any service you want (e.g., Google Drive, Notion, social media, etc.).

## 🧩 Nodes Overview

### 1. Generate Image (Trigger)
This node contains the following fields:
- **Image Prompt**: (text input)
- **Dimensions**: Square, Landscape, Portrait
- **AI Model**: Basic, Premium

It acts as the entry point to your workflow. It collects input and sends it to the APImage API node.
**Note:** You can swap this node with any other node that lets you define the parameters shown above.

### 2. APImage API (HTTP Request)
This node sends a POST request to: https://apimage.org/api/ai-image-generate

The request body is dynamically filled with values from the first node:
```json
{
  "prompt": "{{ $json['Describe the image you want'] }}",
  "dimensions": "{{ $json['Dimensions'] }}",
  "model": "{{ $json['AI Model'] }}"
}
```
✅ Make sure to set your API Key in the Authorization header like this: Bearer YOUR_API_KEY
🔐 You can find your API Key in your APImage Dashboard 🡥

### 3. Download Image (HTTP Request)
Once the image is generated, this node downloads the image file using the URL returned by the API: {{ $json.images[0] }}
The image is stored in the output field: generated_image

### 4. Upload to Google Drive
This node takes the image from the generated_image field and uploads it to your connected Google Drive.
📁 You can configure a different target folder, or replace this node with Dropbox, WordPress, Notion, Shopify, or any other destination.
Make sure to pass the correct filename and file field, as defined in the "Download Image" node. Set up Google Drive credentials 🡥

## ✨ How To Get Started
1. Double-click the APImage API node.
2. Replace YOUR_API_KEY with your actual key (keep the Bearer prefix).
3. Open the Generate Image node and test the form.
🔗 Open the Dashboard 🡥

## 🔧 How to Customize
- Replace the Form Trigger with another node if you're collecting data elsewhere (e.g., via Airtable, Notion, Webhook, Database, etc.).
- Modify the Upload node if you'd like to send the image to other tools like Slack, Notion, Email, or an S3 bucket.

## 📚 API Docs & Resources
- APImage API Docs 🡥
- n8n Documentation 🡥

## 🖇️ Node Connections
Generate Image → APImage API → Download Image → Upload to Google Drive

✅ This template is ideal for:
- Content creators automating media generation
- SaaS integrations for AI tools
- Text-to-image pipelines
by MUHAMMAD SHAHEER
## Who's It For
AI developers, automation engineers, and teams building chatbots, AI agents, or workflows that process user input. Perfect for those concerned about security, compliance, and content safety.

## What It Does
This workflow demonstrates all 9 guardrail types available in n8n's Guardrails node through real-world test cases. It provides a comprehensive safety testing suite that validates:
- **Keyword blocking** for profanity and banned terms
- **Jailbreak detection** to prevent prompt injection attacks
- **NSFW content filtering** for inappropriate material
- **PII detection and sanitization** for emails, phone numbers, and credit cards
- **Secret key detection** to catch leaked API keys and tokens
- **Topical alignment** to keep conversations on-topic
- **URL whitelisting** to block malicious domains
- **Credential URL blocking** to prevent URLs with embedded passwords
- **Custom regex patterns** for organization-specific rules (employee IDs, order numbers)

Each test case flows through its corresponding guardrail node, with results formatted into clear pass/fail reports showing violations and sanitized text.

## How to Set Up
1. Add your Groq API credentials (the free tier works fine)
2. Import the workflow
3. Click "Test workflow" to run all 9 cases
4. Review the formatted results to understand each guardrail's behavior

## Requirements
- n8n version 1.119.1 or later (for the Guardrails node)
- Groq API account (free tier sufficient)
- Self-hosted instance (some guardrails use LLM-based detection)

## How to Customize
- Modify test cases in the "Test Cases Data" node to match your specific scenarios
- Adjust threshold values (0.0–1.0) for AI-based guardrails to fine-tune sensitivity
- Add or remove guardrails based on your security requirements
- Integrate individual guardrail nodes into your production workflows
- Use the sticky notes as reference documentation for implementation

This is a plug-and-play educational template that serves as both a testing suite and implementation reference for building production-ready AI safety layers.
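The custom-regex guardrail described above can be sketched in plain JavaScript. The employee-ID and order-number patterns below are illustrative assumptions matching the examples mentioned in the description, not the node's actual defaults:

```javascript
// Organization-specific rules: detect a match, record the violation,
// and replace the sensitive value with a sanitized placeholder.
const RULES = [
  { name: 'EMPLOYEE_ID', pattern: 'EMP-\\d{5}' },
  { name: 'ORDER_NUMBER', pattern: 'ORD-\\d{8}' },
];

function sanitize(text) {
  let out = text;
  const violations = [];
  for (const { name, pattern } of RULES) {
    const re = new RegExp('\\b' + pattern + '\\b', 'g');
    if (re.test(text)) {
      violations.push(name);
      out = out.replace(new RegExp('\\b' + pattern + '\\b', 'g'), '[' + name + ']');
    }
  }
  return { text: out, violations };
}
```

A pass/fail report then reduces to checking whether `violations` is empty.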
by Simone
## Overview
This workflow automates the process of merging multiple .xlsx files from a designated folder into a single, well-organized Excel workbook. Each input file is converted into its own sheet within the output file. Additionally, a summary sheet is generated at the beginning, providing a convenient overview of all merged files, including their original names and the number of records in each. This is particularly useful for consolidating reports, combining data from different sources, or archiving multiple spreadsheets into one manageable file.

## How It Works
The workflow follows these key steps:
1. **Trigger**: The process begins when you manually execute the workflow.
2. **Read Files**: It reads all files ending with the .xlsx extension from the /n8n_files/ directory (ensure your volume is mapped correctly).
3. **Process Each File**: The workflow iterates through each file one by one. For each file, it extracts the data from the first sheet.
4. **Collect and Clean Data**: A custom code node gathers the data from all files. It cleans the data by removing any completely empty rows and prepares it for the final Excel generation. The original filename is used to name the new sheet.
5. **Generate Multi-Sheet Excel**: The core logic resides in a code node that uses the xlsx library. It creates a new Excel workbook in memory, adds a sheet for each processed file, and populates it with the corresponding data. It also creates a "Summary" sheet that lists all the source files and their record counts.
6. **Save the Result**: The final workbook is saved as a new .xlsx file in the /n8n_files/output/ directory with a timestamped filename (e.g., 合并文件_20250908T123000.xlsx).

## Setup & Prerequisites
To use this workflow, you need to configure your n8n instance to allow and use the external xlsx npm module.

1. **Place Your Files**: Put all the Excel files you want to merge into the folder that is mapped to /n8n_files/ in your n8n container.
2. **Enable External Module**: Set the following environment variable for your n8n service in your docker-compose.yml file:
   ```yaml
   environment:
     - NODE_FUNCTION_ALLOW_EXTERNAL=xlsx
   ```
3. **Install the Module**: You must build a custom Docker image for n8n that includes the xlsx library.
   - In the same directory as your docker-compose.yml, create a file named Dockerfile with the following content:
     ```dockerfile
     FROM n8nio/n8n:latest
     USER root
     RUN npm install xlsx
     USER node
     ```
   - In your docker-compose.yml, replace the `image: n8nio/n8n...` line with `build: .` for the n8n service.
   - Rebuild and restart your n8n container using `docker-compose up --build -d`.

## Nodes Used
- **Manual Trigger**: Starts the workflow.
- **Read/Write File**: Reads source files and saves the final output file.
- **Split In Batches**: Processes files one by one.
- **Extract From File**: Reads the data from each .xlsx file.
- **Code**: Custom JavaScript logic to process data and generate the final multi-sheet Excel file using the xlsx library.
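The cleaning and summary logic from steps 4 and 5 can be sketched as follows. The xlsx-specific calls are omitted here so the sketch stays self-contained; in the real code node these rows would feed `XLSX.utils.aoa_to_sheet`. The input shape (`{ name, rows }` per file) is an illustrative assumption:

```javascript
// Drop completely empty rows, then build the Summary sheet's rows:
// one header row, plus one [filename, recordCount] row per source file.
function buildSummaryRows(files) {
  const rows = [['File Name', 'Record Count']];
  for (const f of files) {
    const clean = f.rows.filter((r) =>
      r.some((c) => c !== null && c !== '' && c !== undefined)
    );
    rows.push([f.name, clean.length]);
  }
  return rows;
}
```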
by Robert Breen
Capture new Jotform submissions and instantly create items on a Monday.com board with mapped columns (email, date, dropdowns, instructions, etc.).

## 🛠️ Setup — Jotform (simple)
1. Add your Jotform API key (Jotform Account → Settings → API → Create Key).
2. Create your form template in Jotform (use fields like Name, Email, Start Date, Engagement Type, Campaign Type, Instructions).
3. In n8n, open the Jotform Trigger node and choose your Jotform template/form from the dropdown.

That's it.

## 🛠️ Setup — Monday.com
1. In Monday.com, generate an API token (Admin/Developers → API).
2. In n8n → Credentials → New → Monday.com, paste your API token.
3. Identify and set:
   - **Board ID** (from your board URL or via node "List" operations)
   - **Group ID** (e.g., topics)
   - **Column IDs** that match your board (examples used by this workflow):
     - text_mkvdj8v3 → Email (Text)
     - date_mkvdg4aa → Start Date (Date)
     - dropdown_mkvdjwra → Engagement Type (Dropdown)
     - dropdown_mkvdd9v3 → Campaign Type (Dropdown)
     - text_mkvd2md9 → Campaign Type (as Text label)
     - text_mkvd1bj2 → Instructions (Text)
     - text_mkvd5w3y → Domain (Text)
4. Update the label → ID mappings inside the Monday.com node if your dropdown IDs differ (e.g., Engagement A → 1, Engagement B → 2).

## ✅ Notes (best practices)
- No secrets in nodes: store tokens in n8n Credentials.
- Use the included sticky notes for quick reference inside the workflow.
- Test once in Jotform to see the payload flow into Monday.

## 📬 Contact
Need help customizing this (e.g., extra fields, file uploads, or routing by campaign)?
📧 rbreen@ynteractive.com
🔗 Robert Breen — https://www.linkedin.com/in/robert-breen-29429625/
🌐 ynteractive.com — https://ynteractive.com
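The column mapping above can be sketched as a helper that assembles the item's column values. The column IDs are the ones listed in the setup steps; the input field names and the exact per-column value shapes (Monday.com expects specific JSON formats per column type, serialized as a string) are assumptions to verify against your board:

```javascript
// Map a Jotform submission onto the workflow's Monday.com column IDs.
function buildColumnValues(sub) {
  return JSON.stringify({
    text_mkvdj8v3: sub.email,
    date_mkvdg4aa: { date: sub.startDate },           // e.g. "2025-01-15"
    dropdown_mkvdjwra: { ids: [sub.engagementTypeId] }, // label → ID mapping
    dropdown_mkvdd9v3: { ids: [sub.campaignTypeId] },
    text_mkvd2md9: sub.campaignTypeLabel,
    text_mkvd1bj2: sub.instructions,
  });
}
```

If your board uses different column IDs or types, adjust the keys and value shapes accordingly.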
by AI/ML API | D1m7asis
This n8n workflow turns Telegram into a personal language tutor. Users can choose between different learning modes — vocabulary, grammar, quiz, or mixed lessons — simply by adding a hashtag to their message. The bot processes requests with the AI/ML API and sends back structured, interactive lessons in Telegram.

## 🚀 Features
- 📩 Telegram-based input with hashtag commands
- 🧠 Adaptive AI responses (vocabulary, grammar, quiz)
- 🔤 Pronunciation support in Latin transcription
- 📒 Grammar explanations with examples
- ❓ Custom quizzes with auto-feedback
- 💬 Supportive learning experience with motivational messages
- ⏳ Typing indicator for smoother UX

## 🛠 Setup Guide

**📲 Create Telegram Bot**
1. Go to @BotFather
2. Use /newbot → choose a name and username
3. Save the bot token

**🔐 Set Up Credentials in n8n**
- Telegram API: use your bot token
- AI/ML API: add your API key under your AI/ML account

**⚙️ Configure the Workflow**
1. Import the JSON into n8n
2. Update credentials (Telegram + AI/ML API)
3. Activate the workflow

**💬 Start Learning**
In Telegram, send a message with one of the supported hashtags:
- #vocabulary — learn new words
- #grammar — study rules with examples
- #quiz — get exercises

Or just send plain text for a free-form AI response.

## 🔍 Node Overview
- **Telegram Trigger** → Listens for incoming messages
- **Show Typing Indicator** → Displays "typing…" while processing
- **Switch Node** → Routes the message by hashtag (#vocabulary, #grammar, #quiz)
- **Prompt Builder Nodes** → Create the JSON payload for the AI/ML API
- **AI/ML API Node** → Generates the structured lesson
- **Telegram Send** → Sends the formatted response back to the user

## 💡 Example Flow
User Input: `#vocabulary кукуруза`

Bot Output:
> Кукуруза (Kukurúza) — Corn
> Pronunciation: koo-koo-ROO-zah
> Sentence: Я люблю есть кукурузу на гриле. (I love eating grilled corn.)
> 👉 Try to write your own sentence with "кукуруза"!
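The Switch node's routing can be sketched as a small function: pick the lesson mode from the first supported hashtag in the message, and fall back to free-form chat otherwise (the function name and the 'freeform' label are illustrative):

```javascript
// Route an incoming Telegram message to a lesson mode by hashtag.
function routeMessage(text) {
  const m = text.match(/#(vocabulary|grammar|quiz)\b/i);
  return m ? m[1].toLowerCase() : 'freeform';
}
```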