by Audun
Who is this for?
- Security professionals
- Developers
- Individuals interested in data breach awareness

Use Case
- Automated monitoring for new breaches
- Proactive identity protection
- Demonstration of a simple cache mechanism

What this workflow does
- Checks the Have I Been Pwned API every 15 minutes for the latest breaches.
- Compares new breach data against previously notified breaches.
- Demonstrates a simple cache mechanism to track previously seen breaches.

How the Cache Functionality Works (a minimal sketch follows at the end of this section)
- **Read from Cache**: Retrieves the last known breach from cache.json to avoid redundant alerts for the same breach.
- **Compare Against Current Breach**: The workflow checks whether the latest fetched breach differs from the cached one.
- **Update the Cache**: If a new breach is detected, it updates cache.json with the latest breach data.

Setup instructions
- The endpoint used in this workflow does not require an API key.
- Add your desired alert mechanism in the red box attached to the New breach node.

How to customize this workflow to your needs

**Modify Notification Settings**: Tailor where alerts are sent (email, Slack, etc.). Add the desired node after the New breach node; that node contains all the data from the breach, so it is easily available. You can choose from a variety of n8n nodes to send alerts when a new breach is detected. Below are a few common options you might consider adding after the New breach node:

**Email Node**
- What it does: Sends an email notification to one or more recipients.
- Use case: Great for simple alerts to your inbox or a team distribution list.
- Customization: Include breach details in the subject or body of the email, using data from the New breach node.

**Slack Node**
- What it does: Sends a message to a Slack channel or user.
- Use case: Perfect for real-time alerts to your team in Slack.
- Customization: Post breach details directly in a channel or DM; you can also format the message (bold, code blocks, etc.).

**Microsoft Teams Node**
- What it does: Sends a message to a Teams channel.
- Use case: For organizations that use Microsoft Teams for communication.
- Customization: Similar to Slack, you can customize the message content and include all relevant breach information.

**Discord Node**
- What it does: Sends an alert message to a Discord channel.
- Use case: Useful for teams or communities that coordinate via Discord.
- Customization: Add formatted messages with breach details for easy viewing.

**Telegram Node**
- What it does: Sends messages to a Telegram chat or group.
- Use case: Good for mobile notifications and fast alerts.
- Customization: Include breach summaries or detailed information, and even use bots to automate this.

**Webhook Node (as a sender)**
- What it does: Sends breach data to another service via a webhook.
- Use case: If an external system or app handles your alerts, you can push the data directly to it.
- Customization: Send JSON payloads with detailed breach information to trigger actions in other systems.

**SMS Nodes (like Twilio)**
- What it does: Sends an SMS notification to one or more phone numbers.
- Use case: For urgent alerts that need to be seen immediately.
- Customization: Keep messages concise, including key breach details like the time, type of breach, and affected system.

**Adjust Check Frequency**: Change the interval in the Schedule Trigger node (e.g., hourly or daily).
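For illustration, here is a minimal sketch of the cache-compare logic described above, written in the style of an n8n Code node. The HIBP latestbreach endpoint, the cache.json path, and the availability of fetch and fs (which requires NODE_FUNCTION_ALLOW_BUILTIN=fs on self-hosted instances) are assumptions; the actual template may implement the cache differently.

```javascript
// Minimal sketch of the cache mechanism, assuming fs access is allowed
// (NODE_FUNCTION_ALLOW_BUILTIN=fs) and a local cache.json path.
const fs = require('fs');
const CACHE_FILE = '/tmp/cache.json'; // hypothetical path

// 1. Fetch the most recent breach (unauthenticated HIBP endpoint).
const res = await fetch('https://haveibeenpwned.com/api/v3/latestbreach');
const latest = await res.json();

// 2. Read the last breach we alerted on; an empty object on first run.
let cached = {};
try {
  cached = JSON.parse(fs.readFileSync(CACHE_FILE, 'utf8'));
} catch (e) { /* no cache yet */ }

// 3. Only alert (and update the cache) when the breach name changed.
if (cached.Name !== latest.Name) {
  fs.writeFileSync(CACHE_FILE, JSON.stringify(latest));
  return [{ json: { isNew: true, ...latest } }];
}
return [{ json: { isNew: false } }];
```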
by Sarfaraz Muhammad Sajib
📧 Email Validation Workflow Using APILayer API

This n8n workflow enables users to validate email addresses in real time using the APILayer Email Verification API. It's particularly useful for preventing invalid email submissions during lead generation, user registration, or newsletter sign-ups, ultimately improving data quality and reducing bounce rates.

⚙️ Step-by-Step Setup Instructions
- **Trigger the Workflow Manually**: The workflow starts with the Manual Trigger node, allowing you to test it on demand from the n8n editor.
- **Set Required Fields**: The Set Email & Access Key node allows you to enter:
  - email: The target email address to validate.
  - access_key: Your personal API key from apilayer.net.
- **Make the API Call**: The HTTP Request node dynamically constructs the URL: https://apilayer.net/api/check?access_key={{ $json.access_key }}&email={{ $json.email }} It sends a GET request to the APILayer endpoint and returns a detailed response about the email's validity.
- **(Optional)**: You can add additional nodes to filter, store, or react to the results depending on your needs; a hedged example of such a check appears at the end of this section.

🔧 How to Customize
- Replace the manual trigger with a webhook or schedule trigger to automate validations.
- Dynamically map the email and access_key values from previous nodes or external data sources.
- Add conditional logic to filter out invalid emails, log them into a database, or send alerts via Slack or Email.

💡 Use Case & Benefits

Email validation is crucial in maintaining a clean and functional mailing list. This workflow is especially valuable in:
- Sign-up forms where real-time email checks prevent fake or disposable emails.
- CRM systems to ensure user-entered emails are valid before saving them.
- Marketing pipelines to minimize email bounce rates and increase campaign deliverability.

Using APILayer's trusted validation service, you can verify whether an email exists, check if it's a role-based address (like info@ or support@), and identify disposable email services, all with a simple workflow.

Keywords: email validation, n8n workflow, APILayer API, verify email, real-time email check, clean email list, reduce bounce rate, data accuracy, API integration, no-code automation
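As a sketch of the optional post-processing step, the snippet below calls the check endpoint and gates on the result. The response field names (format_valid, smtp_check, disposable, score) follow APILayer's documented response shape but should be verified against your plan's actual output; the access key and email are placeholders.

```javascript
// Hedged sketch: call the APILayer check endpoint and derive a pass/fail flag.
const accessKey = 'YOUR_ACCESS_KEY'; // hypothetical placeholder
const email = 'someone@example.com';

const url = `https://apilayer.net/api/check?access_key=${accessKey}&email=${encodeURIComponent(email)}`;
const result = await (await fetch(url)).json();

// Treat an email as deliverable only if it is well-formed, passes the SMTP
// probe, and is not a disposable address (field names assumed from API docs).
const isValid = result.format_valid && result.smtp_check && !result.disposable;
return [{ json: { email, isValid, score: result.score } }];
```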
by Ranjan Dailata
Notice

Community nodes can only be installed on self-hosted instances of n8n.

Who this is for

The Brave Search Structured Data Extractor workflow is designed for professionals and teams that need high-quality, structured insights from Brave search results in real time. Whether you're performing market research, tracking competitors, training AI models, or powering content engines, this workflow offers a robust and automated solution.

This workflow is tailored for:
- Market Researchers - who analyze trends across multimedia channels
- AI Developers - who require clean, structured datasets for model fine-tuning
- SEO & Content Analysts - looking to monitor visibility across news, images, and videos
- Media Researchers - curating timely and relevant information across formats
- Automation Engineers - integrating search insights into downstream workflows

What problem is this workflow solving?

Traditional web scraping and search result parsing is fragmented, inconsistent, and prone to errors, especially when dealing with multimedia (images, videos, news) data from search engines. This workflow provides:
- Centralized Brave search data extraction across all content types
- Search execution switched by the configured search type (e.g., news, images, videos, all); a hedged sketch of this URL switching appears at the end of this section
- Automated structured data transformation using Google Gemini
- Unified output persistence and notification across disk, webhook, and Google Sheets

What this workflow does

Input Configuration
- Define your Brave search query
- Set the search type: videos, images, news, or all
- Configure your Bright Data MCP zone

Bright Data MCP Search Execution
- Initiates a Brave search via Bright Data MCP using the correct URL pattern for each search type
- Returns the raw HTML of the search results

Google Gemini LLM Structured Data Extraction
- Transforms raw results into structured data (e.g., title, URL, source, snippet)

Output Handling
- Save to disk (e.g., JSON or CSV file)
- Send a webhook notification with structured data (e.g., Slack, internal dashboards)
- Store in Google Sheets for team-wide access or dashboarding

Pre-conditions
- Working knowledge of the Model Context Protocol (MCP) is essential. Please read this blog post - model-context-protocol
- You need a Bright Data account and the setup described in the Setup section below.
- You need a Google Gemini API key. Visit Google AI Studio.
- You need to install the Bright Data MCP Server @brightdata/mcp
- You need to install n8n-nodes-mcp

Setup
- Set up n8n locally with MCP Servers by navigating to n8n-nodes-mcp.
- Install the Bright Data MCP Server @brightdata/mcp on your local machine.
- Sign up at Bright Data.
- Create a Web Unlocker proxy zone called mcp_unlocker on the Bright Data control panel: navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
- In n8n, configure the Google Gemini(PaLM) Api account with the Google Gemini API key (or access through Vertex AI or a proxy).
- In n8n, configure the credentials to connect the MCP Client (STDIO) account with the Bright Data MCP Server as shown below. Make sure to copy the Bright Data API_TOKEN into the Environments textbox as API_TOKEN=<your-token>.

How to customize this workflow to your needs

Enhance Output Analysis
- Add additional LLM prompts for topic classification, sentiment scoring, or trend forecasting.

Output Format Options
- Choose to output CSV, Markdown, or HTML reports based on your integration target.
Schedule Automation
- Trigger the workflow on a schedule (daily/weekly) to keep monitoring topical content.
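As a rough illustration of the search-type switch mentioned above, the snippet below builds a Brave URL per type. The URL patterns (search, news, images, videos) are assumptions based on Brave's public site structure, and the input field names are hypothetical; verify both against the actual template.

```javascript
// Hedged sketch of per-type URL construction for the Brave search request.
const query = encodeURIComponent($json.search_query ?? 'n8n automation'); // hypothetical field
const searchType = $json.search_type ?? 'all';

const patterns = {
  all:    `https://search.brave.com/search?q=${query}`,
  news:   `https://search.brave.com/news?q=${query}`,
  images: `https://search.brave.com/images?q=${query}`,
  videos: `https://search.brave.com/videos?q=${query}`,
};

// Fall back to the general results page for unknown types.
return [{ json: { url: patterns[searchType] ?? patterns.all } }];
```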
by Angel Menendez
CallForge - AI Gong Transcript PreProcessor

Transform your Gong.io call transcripts into structured, enriched, and AI-ready data for better sales insights and analytics.

Who is This For?

This workflow is designed for:
✅ Sales teams looking to automate call transcript formatting.
✅ Revenue operations (RevOps) professionals optimizing AI-driven insights.
✅ Businesses using Gong.io that need structured, enriched call transcripts for better decision-making.

What Problem Does This Workflow Solve?

Manually processing raw Gong call transcripts is inefficient and often lacks essential context for AI-driven insights. With CallForge, you can:
✔ Extract and format Gong call transcripts for structured AI processing.
✔ Enhance metadata using sales data from Salesforce.
✔ Classify speakers as internal (sales team) or external (customers).
✔ Identify external companies by filtering out free email domains (e.g., Gmail, Yahoo).
✔ Enrich customer profiles using PeopleDataLabs to identify company details and locations.
✔ Prepare transcripts for AI models by structuring conversations and removing unnecessary noise.

What This Workflow Does

1. Retrieves Gong Call Data
- Calls the Gong API to extract call metadata, speaker interactions, and collaboration details.
- Fetches call transcripts for AI processing.

2. Processes and Cleans Transcripts
- Converts call transcripts into structured, speaker-based dialogues.
- Assigns each speaker as either Internal (Sales Team) or External (Customer).

3. Extracts Company Information
- Retrieves Salesforce data to match customers with existing sales opportunities.
- Filters out free email domains to determine the customer's actual company domain (a hedged sketch of this filter follows at the end of this section).
- Calls the PeopleDataLabs API to retrieve additional company data and location details.

4. Merges and Enriches Data
- Combines Gong metadata with Salesforce customer details and insights.
- Ensures all necessary data is available for AI-driven sales insights.

5. Final Formatting for AI Processing
- Merges all call transcript data into a single structured format for AI analysis.
- Extracts the final cleaned, enriched dataset for further AI-powered insights.

How to Set Up This Workflow

1. Connect Your APIs
🔹 Gong API Access – Set up your Gong API credentials in n8n.
🔹 Salesforce Setup – Ensure API access if you want customer enrichment.
🔹 PeopleDataLabs API – Required to retrieve company and location details based on email domains.
🔹 Webhook Integration – Modify the webhook call to push enriched call data to an internal system.

CallForge series:
- CallForge - 01 - Filter Gong Calls Synced to Salesforce by Opportunity Stage
- CallForge - 02 - Prep Gong Calls with Sheets & Notion for AI Summarization
- CallForge - 03 - Gong Transcript Processor and Salesforce Enricher
- CallForge - 04 - AI Workflow for Gong.io Sales Calls
- CallForge - 05 - Gong.io Call Analysis with Azure AI & CRM Sync
- CallForge - 06 - Automate Sales Insights with Gong.io, Notion & AI
- CallForge - 07 - AI Marketing Data Processing with Gong & Notion
- CallForge - 08 - AI Product Insights from Sales Calls with Notion

How to Customize This Workflow
💡 Modify Data Sources – Connect different CRMs (e.g., HubSpot, Zoho) instead of Salesforce.
💡 Expand AI Analysis – Add another AI model (e.g., OpenAI GPT, Claude) for advanced conversation insights.
💡 Change Speaker Classification Rules – Adjust internal vs. external speaker logic to match your team's structure.
💡 Filter Specific Customers – Modify the free email filtering logic to better fit your company's needs.

Why Use CallForge?
🚀 Automate Gong call transcript processing to save time.
📊 Improve AI accuracy with enriched, structured data.
🛠 Enhance sales strategy by extracting actionable insights from calls.

Start optimizing your Gong transcript analysis today!
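The free-email-domain filter described in step 3 could look roughly like this. The domain list and the input field names (speakers[].email) are illustrative assumptions, not the template's exact implementation.

```javascript
// Hedged sketch: drop free email providers to isolate the customer's
// actual company domain for PeopleDataLabs enrichment.
const FREE_DOMAINS = new Set([
  'gmail.com', 'yahoo.com', 'outlook.com', 'hotmail.com', 'icloud.com', 'aol.com',
]);

const speakers = $json.speakers ?? []; // hypothetical input shape
const companyDomains = [...new Set(
  speakers
    .map(s => (s.email || '').split('@')[1])
    .filter(d => d && !FREE_DOMAINS.has(d.toLowerCase()))
)];

return [{ json: { companyDomains } }];
```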
by Lucas Peyrin
How it works

This template provides a complete, ready-to-use web application for generating high-quality AI prompts. It features a user-friendly web form where you can describe your goal, and it leverages an AI model (Google Gemini) to create a structured, reusable prompt for you.

The workflow is a full-stack application built entirely within n8n:
- Frontend (The Form): A Form Trigger node creates a beautiful, public-facing web form. Here, a user describes the prompt they need and selects which structural components to include (like system instructions, examples, or input variables).
- Backend (The AI Logic): A LangChain Chain node takes the user's request and constructs a "meta-prompt", a set of instructions for the AI on how to generate the final prompt. The Google Gemini node executes this meta-prompt, creating a well-structured output with clear sections and tags.
- The Result (The Webpage): After generation, the user is automatically redirected to a new URL. This URL is handled by another Webhook node, which serves a custom-coded HTML page. This beautiful, dark-themed webpage displays the generated prompt and includes a one-click "Copy" button, making it easy to use the result immediately. A hedged sketch of such a response page follows at the end of this section.

This template is a perfect example of how to build interactive web tools with n8n, combining a user interface, backend logic, and a dynamic web response in a single workflow.

Set up steps

Setup time: ~1-3 minutes. This workflow requires a Google AI credential to function.

1. Configure Google AI Credentials: This workflow uses a Google Gemini model, so you will need a Google AI API key.
   - In n8n, go to Credentials and click Add credential.
   - Search for Google Gemini and enter your API key.
   - Go back to the workflow, open the Gemini 2.5 Flash node, and select your newly created credential from the dropdown.
2. Activate the Workflow: Click the Active toggle in the top-right corner to turn the workflow on.
3. Access Your Prompt Maker:
   - Open the Prompt Request (Form Trigger) node.
   - Copy the Public URL provided. This is the link to your new web application!
   - Open the link in your browser, fill out the form, and see the magic happen.

Note: This workflow uses environment variables like {{ $env.WEBHOOK_URL }} to build the redirect URL. These are typically set automatically by n8n and should work out-of-the-box on most standard n8n setups.
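For a sense of how the result page could be produced, here is a hedged sketch of a Code node that builds the HTML handed to the responding Webhook node. The styling, field name (generatedPrompt), and page layout are illustrative only, not the template's actual page.

```javascript
// Hedged sketch: build a minimal dark-themed result page with a copy button.
const prompt = $json.generatedPrompt ?? ''; // hypothetical field name
const safe = prompt.replace(/&/g, '&amp;').replace(/</g, '&lt;'); // basic HTML escaping

const html = `<!DOCTYPE html>
<html>
<body style="background:#111;color:#eee;font-family:sans-serif;padding:2rem">
  <h1>Your generated prompt</h1>
  <pre id="prompt" style="white-space:pre-wrap">${safe}</pre>
  <button onclick="navigator.clipboard.writeText(document.getElementById('prompt').innerText)">
    Copy
  </button>
</body>
</html>`;

return [{ json: { html } }];
```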
by David Ashby
Complete MCP server exposing all Humantic AI Tool operations to AI agents. Zero configuration needed - all 3 operations pre-built.

⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works
• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Humantic AI Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders (see the hedged example at the end of this section)
• Native Integration: Uses the official n8n Humantic AI Tool node with full error handling

📋 Available Operations (3 total)

Every possible Humantic AI Tool operation is included:

🔧 Profile (3 operations)
• Create a profile
• Get a profile
• Update a profile

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native Humantic AI Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Complete Coverage: Every Humantic AI Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
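For context, this is roughly what a $fromAI() placeholder looks like inside a tool node's parameter field. The parameter name and description here are hypothetical examples, not values taken from this template:

```
{{ $fromAI('profileId', 'The Humantic AI profile to retrieve', 'string') }}
```

At call time, the AI agent fills in the value based on the name, description, and type hints, which is why no manual parameter mapping is needed.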
by Jimleuk
This n8n template introduces the Dynamic Prompts AI workflow pattern, which is incredibly useful for certain types of data extraction tasks where attributes are unknown or need to remain flexible.

The general idea behind this pattern is that the prompts for the attributes to be extracted live outside the template, so they can be changed at any time without editing the template. This seriously cuts down on maintenance requirements and is reusable for any number of tables at little cost.

Check out the video demo I did for n8n Studio here: https://www.youtube.com/watch?v=_fNAD1u8BZw
Check out the example Airtable here: https://airtable.com/appAyH3GCBJ56cfXl/shrXzR1Tj99kuQbyL
Looking for the Baserow version? https://n8n.io/workflows/2780-ai-data-extraction-with-dynamic-prompts-and-baserow/

How it works

Given we have an "input" field for context and a number of fields for the data we want to extract, this template runs in the background, reacts to any changes to either the "input" or the fields, and automatically updates the rows accordingly.

The key is that Airtable fields have a special property called the "field description". In this pattern, we use this property to let the user store a simple prompt describing the data that should exist in the column. Our n8n template reads these column descriptions, aka "prompts", and uses them as instructions to perform tasks on the "input". A hedged sketch of this prompt-building step follows at the end of this section.

In this template, the "input" is a PDF of a resume/CV and the columns are attributes an HR person would want to extract from it, such as full name, address, last position, years of experience, etc.

How to use

First publish this template and ensure it's accessible via its webhook URL. You then have to run the "create airtable webhooks" mini-flow to configure your Airtable to send change events to the n8n template. This mini-flow exists in the template but you'll have to update the IDs. Check the template for more instructions.

Requirements
- Airtable for tables/database
- OpenAI for LLM and extraction. Feel free to choose another LLM if preferred.

Customising this workflow

If you're not using files, you can replace the "input" field with anything you like. For example, the "input" could be single-line text.
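The core of the pattern, turning field descriptions into extraction instructions, could look roughly like this. The base/table IDs and the AIRTABLE_TOKEN environment variable are placeholders; the Airtable Meta API endpoint shape follows Airtable's documented meta tables endpoint.

```javascript
// Hedged sketch: fetch field descriptions from the Airtable Meta API and
// assemble them into a single extraction prompt.
const baseId = 'appXXXXXXXXXXXXXX';   // hypothetical
const tableId = 'tblXXXXXXXXXXXXXX';  // hypothetical

const res = await fetch(`https://api.airtable.com/v0/meta/bases/${baseId}/tables`, {
  headers: { Authorization: `Bearer ${$env.AIRTABLE_TOKEN}` }, // hypothetical env var
});
const { tables } = await res.json();
const table = tables.find(t => t.id === tableId);

// One instruction line per field that carries a description ("prompt").
const instructions = table.fields
  .filter(f => f.description)
  .map(f => `- "${f.name}": ${f.description}`)
  .join('\n');

return [{ json: { prompt: `Extract the following attributes from the input:\n${instructions}` } }];
```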
by Ranjan Dailata
Who this is for?

The Structured Data Extract & Data Mining workflow is crafted for researchers, content analysts, SEO strategists, and AI developers who need to transform semi-structured web data (like markdown content or scraped HTML) into actionable structured datasets. It is ideal for:
- Content Analysts - organizing and mining large volumes of markdown or HTML content
- SEO & Trend Researchers - exploring topics by location and category
- AI Engineers & NLP Developers - looking to automate insight extraction from unstructured inputs
- Growth Marketers - tracking topic-level trends for strategic campaigns
- Automation Specialists - streamlining workflows from scrape to storage

What problem is this workflow solving?

Extracting insights from markdown or HTML documents typically requires manual review, formatting, and parsing. This becomes unscalable when dealing with large datasets or when a real-time response is needed. Additionally, trend and topic extraction usually involves external tools, custom scripts, and inconsistent formatting. This workflow solves:
- Automatic text extraction from markdown or structured content
- Location- and category-based trend mining with semantic grouping
- AI-driven topic extraction and summarization
- Real-time notification via webhook with rich structured payloads
- Persistent storage of mined data to disk for audits or further processing

What this workflow does
- Receives input: sets the URL for the data extraction and analysis.
- Uses Bright Data's Web Unlocker to extract content from relevant sites.
- A Markdown/Text Extractor node parses the content into clean plaintext.
- The cleaned data is passed to Google Gemini to:
  - Identify trends by location and category
  - Extract key topics and themes
  - Format the response into structured JSON
- The structured insights are sent via Webhook Notification to external systems (e.g., Slack, web apps, Zapier).
- The final output is saved to disk.

Setup
- Sign up at Bright Data.
- Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
- In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). The Value field should be set to Bearer XXXXXXXXXXXXXX, where XXXXXXXXXXXXXX is replaced by the Web Unlocker token.
- Obtain a Google Gemini API key (or access through Vertex AI or a proxy).
- Update the Set URL and Bright Data Zone node with the brand content URL and the Bright Data zone name.
- Update the Webhook HTTP Request node with the webhook endpoint of your choice.

How to customize this workflow to your needs

Update Source
- Update the workflow input to read from Google Sheets or Airtable for dynamically tracking multiple brands or topics.

Gemini Prompt Customization
- Extract trends within a custom category (e.g., e-commerce design patterns in the US)
- Output topics with popularity metrics
- Structure the output as per your database schema (e.g., [{ topic, trend_score, location }]); a hedged example payload follows at the end of this section

Webhook Output
- Send notifications to:
  - Slack - with AI summaries in rich blocks
  - Internal APIs - for use in dashboards
  - Zapier/Make - for multi-step automation

Persistence
- Save output to:
  - Remote FTP or SFTP storage
  - Amazon S3, Google Cloud Storage, etc.
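To make the suggested schema concrete, here is a hedged example of what the structured payload could look like. All field values are invented for illustration, and trend_score has no defined scale in the original description.

```javascript
// Hedged example payload matching the [{ topic, trend_score, location }] schema.
const payload = {
  source_url: 'https://example.com/article', // hypothetical
  trends: [
    { topic: 'AI-assisted checkout flows', trend_score: 0.87, location: 'US' },
    { topic: 'One-page product configurators', trend_score: 0.74, location: 'UK' },
  ],
};

// This object would be posted downstream by the Webhook HTTP Request node.
return [{ json: payload }];
```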
by Ferenc Erb
Overview

An automation workflow that creates a complete REST API for digitally signing PDF documents using n8n webhooks. This service demonstrates how to implement secure document-signing functionality through standardized API endpoints with file upload and download capabilities.

Use Case

This workflow is designed for developers and automation specialists who need to implement digital document signing. It's particularly useful for:
- Integrating PDF signing capabilities into existing document workflows
- API-based automation of signature processes
- Creating proof-of-concept implementations for document verification systems
- Learning n8n's webhook capabilities and file handling techniques
- Testing PDF signing in development environments before production implementation

What This Workflow Does

API-Based Document Management
- Exposes RESTful webhook endpoints for all document operations
- Handles multipart/form-data uploads for PDF documents
- Processes JSON payloads for signing configuration
- Provides download functionality for completed documents

Digital Certificate Handling
- Uploads existing PFX/PKCS#12 digital certificates
- Generates new certificates with customizable attributes
- Securely manages certificate storage and access
- Associates certificates with signing operations

Cryptographic PDF Signing
- Applies digital signatures using industry-standard cryptographic methods
- Embeds signature information within the PDF document structure
- Validates document integrity through cryptographic verification
- Preserves the original document while adding signature elements

Webhook Integration System
- Routes different API methods to appropriate handlers
- Validates request payloads and file content
- Manages authentication through webhook paths
- Returns structured responses for integration with other systems

Technical Architecture

Components
- API Gateway: n8n webhook nodes that receive external requests
- Request Router: Switch nodes that direct operations based on method parameters
- Document Processor: Function nodes for PDF manipulation and verification
- Certificate Manager: Specialized nodes for cryptographic key operations
- Storage Interface: File operation nodes for document persistence
- Response Formatter: Nodes that structure API responses

Integration Flow

Client Request → Webhook Endpoint → Method Router → Processing Engine → Digital Signing → Storage → Response Generation → Client Response

Setup Instructions

Prerequisites
- n8n installation (minimum version 0.214.0)
- Node.js 14 or higher
- Required environment variable: NODE_FUNCTION_ALLOW_EXTERNAL: "node-forge,@signpdf/signpdf,@signpdf/signer-p12,@signpdf/placeholder-plain"

Configuration Steps
1. Import Workflow: Import the workflow JSON into your n8n instance and activate the workflow to enable the webhooks.
2. Configure Storage: Set the storage path variables in the workflow and ensure proper permissions on the storage directories.
3. Test API Endpoints: Use the included test scripts to verify functionality; test PDF upload, certificate generation, and signing.
4. Integration: Document the webhook URLs for integration with other systems and configure error handling according to your requirements.

Testing Methods

Test the workflow functionality using various HTTP requests and JSON data:
- Upload PDF documents to the document processing endpoint
- Upload or generate digital certificates
- Execute PDF signing operations
- Download signed documents from the download endpoint

Webhook Endpoints

The workflow exposes two primary webhook endpoints that form a complete API for PDF digital signing operations. A hedged request example follows at the end of this section.

1. Document Processing Endpoint (/webhook/docu-digi-sign)

This endpoint handles all document and certificate operations:

Method: Upload PDF
- HTTP: POST
- Content-Type: multipart/form-data
- Parameters: method, uploadType, fileName, fileData

Method: Upload Certificate
- HTTP: POST
- Content-Type: multipart/form-data
- Parameters: method, uploadType, fileName, fileData

Method: Generate Certificate
- HTTP: POST
- Content-Type: application/json
- Parameters: method, subjectCN, issuerCN, serialNumber, validFrom, validTo, password

Method: Sign PDF
- HTTP: POST
- Content-Type: application/json
- Parameters: method, inputPdf, pfxFile, pfxPassword

2. Document Download Endpoint (/webhook/docu-download)

This endpoint handles the retrieval of processed documents:

Method: Download Signed PDF
- HTTP: GET
- Content-Type: application/json
- Parameters: method, fileType, fileName

Key Workflow Sections

The workflow is organized into logical sections with clear responsibilities:
- Request Processing: Parses incoming webhook data
- Method Routing: Directs requests to appropriate handlers
- Document Management: Handles file operations and storage
- Cryptographic Operations: Manages signing and certificate functions
- Response Formatting: Structures and returns results
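As a hedged example of calling the Sign PDF method, the snippet below uses fetch (Node 18+). The base URL, the method identifier value, and the file names are placeholders; only the parameter names come from the endpoint table above.

```javascript
// Hedged sketch of a Sign PDF request against the document processing endpoint.
const N8N_BASE = 'http://localhost:5678'; // hypothetical instance URL

const res = await fetch(`${N8N_BASE}/webhook/docu-digi-sign`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    method: 'signPdf',          // assumed method identifier
    inputPdf: 'contract.pdf',   // previously uploaded document
    pfxFile: 'signer.pfx',      // previously uploaded certificate
    pfxPassword: 'changeit',
  }),
});
console.log(await res.json());
```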
by Oneclick AI Squad
Transform your meetings into actionable insights automatically! This workflow captures meeting audio, transcribes conversations, generates AI summaries, and emails the results to participants, all without manual intervention.

What's the Goal?
- Auto-record meetings when they start and stop when they end
- Transcribe audio to text using Vexa Bot integration
- Generate intelligent summaries with AI-powered analysis
- Email summaries to meeting participants automatically
- Eliminate manual note-taking and post-meeting admin work
- Never miss important discussions or action items again

Why Does It Matter?
- Save 90% of Post-Meeting Time: No more manual transcription or summary writing
- Never Lose Key Information: Automatic capture ensures nothing falls through the cracks
- Improve Team Productivity: Focus on discussions, not note-taking
- Perfect Meeting Records: Searchable transcripts and summaries for future reference
- Instant Distribution: Summaries reach all participants immediately after meetings

How It Works

Step 1: Meeting Detection & Recording
- Start Meeting Trigger: Detects when a meeting begins via a Google Meet webhook
- Launch Vexa Bot: Automatically joins the meeting and starts recording
- End Meeting Trigger: Detects the meeting end and stops recording

Step 2: Audio Processing & Transcription
- Stop Vexa Bot: Ends recording and retrieves the audio file
- Fetch Meeting Audio: Downloads recorded audio from Vexa Bot
- Transcribe Audio: Converts speech to text using AI transcription

Step 3: AI Summary Generation
- Prepare Transcript: Formats transcribed text for AI processing (a hedged sketch of this step follows at the end of this section)
- Generate Summary: The AI model creates a concise meeting summary with:
  - Key discussion points
  - Decisions made
  - Action items assigned
  - Next steps identified

Step 4: Distribution
- Send Email: Automatically emails the summary to all meeting participants

Setup Requirements

Google Meet Integration:
- Configure the Google Meet webhook and API credentials
- Set up meeting detection triggers
- Test with a sample meeting

Vexa Bot Configuration:
- Add Vexa Bot API credentials for recording
- Configure audio file retrieval settings
- Set recording quality and format preferences

AI Model Setup:
- Configure an AI transcription service (e.g., OpenAI Whisper, Google Speech-to-Text)
- Set up AI summary generation with custom prompts
- Define summary format and length preferences

Email Configuration:
- Set up SMTP credentials for email distribution
- Create email templates for meeting summaries
- Configure participant list extraction from meeting metadata

Import Instructions
1. Get Workflow JSON: Copy the workflow JSON code
2. Open n8n Editor: Navigate to your n8n dashboard
3. Import Workflow: Click the menu (⋯) → "Import from Clipboard" → paste the JSON → Import
4. Configure Credentials: Add API keys for Google Meet, Vexa Bot, AI services, and SMTP
5. Test Workflow: Run a test meeting to verify end-to-end functionality

Your meetings will now automatically transform into actionable summaries delivered to your inbox!
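The "Prepare Transcript" step could look roughly like this: a Code node that turns the raw transcript into a summarization prompt. The field name (transcript) and the prompt wording are illustrative assumptions, not the template's exact values.

```javascript
// Hedged sketch: build the summarization prompt handed to the AI node.
const transcript = $json.transcript ?? ''; // hypothetical field name

const prompt = [
  'Summarize the following meeting transcript.',
  'Return four sections: Key discussion points, Decisions made,',
  'Action items assigned (with owners), and Next steps.',
  '',
  transcript,
].join('\n');

// Downstream, an AI node (e.g., OpenAI or Gemini) receives this prompt.
return [{ json: { prompt } }];
```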
by Jay Emp0
🤖 MCP Personal Assistant Workflow

Description

This workflow integrates multiple productivity tools into a single AI-powered assistant using n8n, acting as a centralized control hub to receive and execute tasks across Google Calendar, Gmail, Google Drive, LinkedIn, Twitter, and more.

✅ Key Capabilities
- AI Agent + Tool Use: Built using n8n's AI Agent and MCP system, enabling intelligent multi-step reasoning.
- Tool Integration:
  - Google Calendar: schedule, update, delete events
  - Gmail: search, draft, send emails
  - Google Drive: manage files and folders
  - LinkedIn & Twitter: post updates, send DMs
  - Utility tools: fetch date/time, search URLs
- Discord Input: Accepts prompts via n8n_discord_trigger_bot (repo link)

🛠 Setup Instructions

Timezone Configuration:
- Go to Settings > Default Timezone in n8n.
- Set it to your local timezone (e.g., Asia/Jakarta).
- Ensure all Date & Time nodes explicitly use the same zone to avoid UTC-related bugs.

Tool Authentication:
- Replace all OAuth credentials for Gmail, Google Drive, Google Calendar, Twitter, and LinkedIn.
- Use your own accounts when copying this workflow.

Platform Adaptability:
- While designed for Discord, you can replace the Discord trigger with any other chat or webhook service, e.g., Telegram, Slack, a WhatsApp webhook, or the n8n Form Trigger.

📦 Strengths
- Great for document retrieval, email summarization, calendar scheduling, and social posting.
- Reduces the need for tab-switching across multiple platforms.
- Tested with a comprehensive checklist across categories like Calendar, Gmail, Google Drive, Twitter, LinkedIn, utility tools, and cross-tool actions. (Refer to the discordGPT prompt checklist below for prompt coverage.)

⚠️ Limitations
- ❌ Binary Uploads: AI agents and the MCP server currently struggle with binary payloads. Uploading files to Gmail, Google Drive, or LinkedIn may fail due to format serialization issues. Binary operations (upload/post) are under development and will be fixed in future iterations.
- ❌ Date Bugs: If timezone settings are incorrect, event times may default to UTC, leading to misaligned calendar events.

🔬 Testing

Use the provided prompt checklist for full coverage of:
✅ Core feature flows
✅ Edge cases (e.g., invalid dates, nonexistent users)
✅ Cross-tool chains (e.g., Google Drive → Gmail → LinkedIn)

✅ MCP Assistant Test Prompt Checklist

📅 Google Calendar
[X] "Schedule a meeting with Alice tomorrow at 10am, and send an invite to alice@wonderland.com"
[X] "Create an event called 'Project Sync' on Friday at 3pm with Bob and Charlie."
[X] "Update the time of my call with James to next Monday at 2pm."
[X] "Delete my meeting with Marketing next Wednesday."
[X] "What is my schedule tomorrow?"

📧 Gmail
[X] "Show me unread emails from this week."
[X] "Search for emails with subject: invoice"
[X] "Reply to the latest email from john@company.com saying 'Thanks, noted!'"
[X] "Draft an email to info@a16z.com with subject 'Emp0 Fundraising' and draft the body of the email with an investment opportunity in Emp0; scrape this site https://Emp0.com to get to know more about emp0.com"
[X] "Send an email to hi@cursor.com with subject 'Feature request' and cc sales@cursor.com"
[ ] "Send an email to recruiting@openai.com, write about how you like their product and want to apply for a job there, and attach my latest CV from Google Drive"

🗂 Google Drive
[ ] "Upload the PDF you just sent me to my Google Drive."
[X] "Create a folder called 'July Reports' inside the Emp0 shared drive."
[X] "Move the file named 'Q2_Review.pdf' to 'Reports/2024/Q2'."
[X] "Share the folder 'Investor Decks' with info@a16z.com as viewer." [ ] "Download the file 'Wayne_Li_CV.pdf' and attach it in Discord." [X] "Search for a file named 'Invoice May' in my Google Drive." 🖼 LinkedIn [X] "Think of a random and inspiring quote. Post a text update on LinkedIn with the quote and end with a question so people will answer and increase engagement" [ ] "Post this Google Drive image to LinkedIn with the caption: 'Team offsite snapshots!'" [X] "Summarize the contents of this workflow and post it on linkedin with the original url https://n8n.io/workflows/5230-content-farming-ai-powered-blog-automation-for-wordpress/" 🐦 Twitter [X] "Tweet: 'AI is eating operations. Fast.'" [X] "Send a DM to @founderguy: 'Would love to connect on what you’re building.'" [X] "Search Twitter for keyword: 'founder advice'" 🌐 Utilities [X] "What time is it now?" [ ] "Download this PDF: https://ontheline.trincoll.edu/images/bookdown/sample-local-pdf.pdf" [X] "Search this URL and summarize important tech updates today: https://techcrunch.com/feed/" 📎 Discord Attachments [ ] "Take the image I just uploaded and post it to LinkedIn." [ ] "Get the file from my last message and upload it to Google Drive." 🧪 Edge Cases [X] "Schedule a meeting on Feb 30." [X] "Send a DM to @user_that_does_not_exist" [ ] "Download a 50MB PDF and post it to LinkedIn" [X] "Get the latest tweet from my timeline and email it to myself." 🔗 Cross-tool Flows [ ] "Get the latest image from my Google Drive and post it on LinkedIn with the caption 'Another milestone hit!'" [ ] "Find the latest PDF report in Google Drive and email it to investor@vc.com." [ ] "Download an image from this link and upload it to my Google Drive: https://example.com/image.png" [ ] "Get the most recent attachment from my inbox and upload it to Google Drive." Run each of these in isolated test cases. For cross-tool flows, verify binary serialization integrity. 🧠 Why Use This Workflow? This is an always-on personal assistant that can: Process natural language input Handle multi-step logic Execute commands across 6+ platforms Be extended with more tools and memory If you want to interact with all your work tools from a single prompt—this is your base to start. 📎 Repo & Credits Discord bot trigger: n8n_discord_trigger_bot Creator: Jay (Emp₀)
by David Ashby
🛠️ E-goi Tool MCP Server

Complete MCP server exposing all E-goi Tool operations to AI agents. Zero configuration needed - all 4 operations pre-built.

⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works
• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every E-goi Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n E-goi Tool node with full error handling

📋 Available Operations (4 total)

Every possible E-goi Tool operation is included:

📇 Contact (4 operations)
• Create a member
• Get a member
• Get many members
• Update a member

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native E-goi Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Complete Coverage: Every E-goi Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.