by Rahi
# 🛠️ Workflow: Jotform → HubSpot Company + Task Automation

Automatically create or update HubSpot companies and generate follow-up tasks whenever a Jotform is submitted. All logs are stored in Google Sheets for traceability, transparency, and debugging.

## ✅ Use Cases
- Capture marketing queries from your website's Jotform form and immediately create tasks for your sales or SDR team.
- Enrich HubSpot companies with submitted domains, company names, and contact data.
- Automatically assign tasks to owners and keep all form submissions logged and auditable.
- Avoid manual handoffs — full automation from form submission → CRM.

## 🔍 How It Works (Step-by-Step)

**1. Jotform Trigger**
The workflow starts when a new submission is received via the Jotform webhook. Captured fields include: name, email, LinkedIn profile, company name, marketing budget, domain, and any specific query.

**2. Create or Update Company in HubSpot + Format Data**
The "Create Company" node ensures the submitted company is either created in HubSpot or updated if it already exists. A Formatter (Function) node standardizes the data — names, email, LinkedIn URL, domain, marketing budget, and query text. It composes a task title, generates a follow-up timestamp, and dynamically assigns an owner.

**3. Loop & HTTP Request – Create HubSpot Task**
The workflow loops through each formatted item. A Wait node prevents rate-limit issues. It then sends an HTTP POST request to HubSpot's Tasks API, creating a task with:
- Subject and body including the submission details
- Task status, priority, and type
- Assigned owner and associated company

**4. Loop & HTTP Request – Set Company Domain**
After tasks are created, another loop updates each HubSpot company record with the submitted domain. This ensures all HubSpot companies have proper website data for future enrichment.

**5. Storing Logs (Google Sheets)**
All processed submissions, responses, errors, and metadata are appended or updated in a Google Sheets document.
This provides a complete audit trail — ideal for debugging, reporting, and performance monitoring.

## 🧩 Node Structure Overview

| Step | Node | Description |
|------|------|-------------|
| 1️⃣ | Jotform Trigger | Receives form submission data |
| 2️⃣ | HubSpot Create Company | Ensures company record exists |
| 3️⃣ | Formatter / Function Node | Cleans & structures data, assigns owner, generates task fields |
| 4️⃣ | Wait / Delay Node | Controls API call frequency |
| 5️⃣ | HTTP Request (Create Task) | Pushes task to HubSpot |
| 6️⃣ | HTTP Request (Update Domain) | Updates company domain in HubSpot |
| 7️⃣ | Google Sheets Node | Logs inputs, outputs, and status |

## 📋 Requirements & Setup
- 🔑 **HubSpot Private App Token** with permissions to create companies, tasks, and update records
- 🌐 **Jotform Webhook URL** pointing to this workflow
- 📗 **Google Sheets Credentials** (OAuth or service account) with write access
- ✅ The HubSpot app must have the `crm.objects.companies.write` and `crm.objects.tasks.write` scopes
- ⚠️ Add retry or error-handling branches for failed API calls

## ⚙️ Customization Tips & Variations
- **Add contact association:** Modify the payload to also link the task with a HubSpot Contact (via email) so it appears in both company and contact timelines.
- **Use fallback values:** In the Formatter node, provide defaults like "Unknown Company" or "No query provided."
- **Dynamic owner assignment:** Replace hash-based assignment with round-robin or territory logic.
- **Conditional task creation:** Add logic to only create tasks when certain conditions are met (e.g., budget > 0).
- **Error branches:** Capture failed HTTP responses and send Slack/Email alerts.
- **Extended logs:** Add response codes, errors, and retry counts to your Google Sheet for more transparency.
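As a sketch of what the task-creation HTTP Request step sends, here is an illustrative builder for a HubSpot CRM v3 task payload. The property names follow HubSpot's standard task properties; the `associationTypeId` value and all field names on `item` are assumptions for illustration — verify the task→company association type ID against HubSpot's association documentation.

```javascript
// Illustrative sketch of the JSON body POSTed to HubSpot's Tasks API
// (POST https://api.hubapi.com/crm/v3/objects/tasks).
function buildTaskPayload(item) {
  return {
    properties: {
      hs_task_subject: item.taskTitle,                  // composed by the Formatter node
      hs_task_body: `Query: ${item.query}\nBudget: ${item.budget}`,
      hs_task_status: "NOT_STARTED",
      hs_task_priority: "HIGH",
      hs_task_type: "TODO",
      hs_timestamp: item.followUpAt,                    // follow-up timestamp
      hubspot_owner_id: item.ownerId                    // dynamically assigned owner
    },
    associations: [
      {
        to: { id: item.companyId },                     // company created/updated in step 2
        // assumed type ID — look up "task to company" in HubSpot's docs
        types: [{ associationCategory: "HUBSPOT_DEFINED", associationTypeId: 192 }]
      }
    ]
  };
}
```

In n8n you would express the same structure in the HTTP Request node's JSON body using `{{ $json.… }}` expressions.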
## 🎯 Benefits & Why You'd Use This
- ⚡ **Speed & Automation** — eliminate manual data entry into HubSpot
- 📊 **Data Consistency** — submissions are clean, enriched, and traceable
- 👀 **Transparency** — every action logged for full visibility
- 🌍 **Scalability** — handle hundreds of submissions effortlessly
- 🔄 **Flexibility** — adaptable to other use cases (support tickets, surveys, partnerships, etc.)

## ✨ Example Use Case
A marketing form on your website captures partnership or franchise inquiries. This workflow instantly creates a HubSpot company, logs the inquiry as a task, assigns it to a regional manager, and saves a record in Google Sheets — all within seconds.

**Tags:** HubSpot, Jotform, CRM, GoogleSheets, Automation, LeadManagement
by Jaruphat J.
This workflow automates the entire process of creating and publishing social media ads — directly from Telegram. By simply sending a product photo to your Telegram bot, the system analyzes the image, generates an AI-based advertising prompt, creates a marketing image via Fal.AI, writes an engaging Facebook/Instagram caption, and posts it automatically.

This template saves hours of manual work for marketers and small business owners who constantly need to design, write, and publish product campaigns. It eliminates repetitive steps like prompt writing, AI model switching, and post scheduling — letting you focus on strategy, not execution.

The workflow integrates seamlessly with Fal.AI for image generation, OpenAI Vision for image analysis, and the Facebook Graph API for automated publishing. Whether you're launching a 10.10 campaign or promoting a new product line, this template transforms your product photo into a ready-to-publish ad in just minutes.

## Who's it for
- **Marketers and e-commerce owners** who need to create social content quickly.
- **Agencies** managing multiple clients' campaigns.
- **Small business owners** who want to automate Facebook/Instagram posts.
- **n8n creators** who want to learn AI-assisted content automation.

## What problem does this solve
Manually creating ad images and captions is time-consuming and repetitive. You need to:
1. Edit the product photo.
2. Write a creative brief or prompt.
3. Generate an image in Fal.AI or Midjourney.
4. Write a caption.
5. Log into Facebook and post.

This workflow combines all five steps into one automation — triggered directly by sending a Telegram message. It handles AI analysis, image creation, caption writing, and posting, removing human friction while maintaining quality and creative control.

## What this workflow does
The workflow is divided into four main zones, color-coded inside the canvas:

### 🟩 Zone 1 – Product Image Analysis
Trigger: the user sends a product image to a Telegram bot.
n8n retrieves the file path using the Telegram API. OpenAI Vision analyzes the product photo and describes color, material, and shape. An AI agent converts this into structured data for generating ad prompts.

### 🟥 Zone 2 – Generate Ad Image Prompt
The AI agent creates a professional advertising prompt based on the product description and campaign (e.g., "10.10 Sale"). The prompt is sent to the user for confirmation via Telegram before proceeding.

### 🟨 Zone 3 – Create Ad Image via Fal.AI
The confirmed prompt and image are sent to Fal.AI's image generation API. The system polls the generation status until completion. The generated image is sent back to Telegram for user review and approval.

### 🟦 Zone 4 – Write Caption & Publish
The approved image is re-analyzed by AI to write a Facebook/Instagram caption. The user confirms the text on Telegram. Once approved, the workflow uploads the final post (image + caption) to Facebook automatically using the Graph API.

## Setup

### Prerequisites
- **n8n self-hosted or Cloud account**
- **Telegram Bot Token** (via @BotFather)
- **Fal.AI API key**
- **Facebook Page Access Token** with publishing permissions
- **OpenAI API Key** for image analysis and text generation

### Steps
1. Create a Telegram bot and paste its token into n8n Credentials.
2. Set up Fal.AI credentials under HTTP Request → Authentication.
3. Connect your Facebook Page through Facebook Graph API credentials.
4. In the HTTP Request node, set:
   - URL: `https://fal.run/fal-ai/nano-banana`
   - Auth: `Bearer {{ $credentials.FalAI.apiKey }}`
5. Configure all LLM and Vision nodes using your OpenAI credentials.

## Node settings

### 🟩 Analyze Image (OpenAI Vision)

```json
{
  "model": "gpt-4o-mini",
  "input": [
    {
      "role": "user",
      "content": [
        { "type": "image_url", "image_url": "{{$json.image_url}}" },
        { "type": "text", "text": "Describe this product in detail for advertising context." }
      ]
    }
  ]
}
```

### 🟥 Set Node – Prepare Fal.AI Body

```json
{
  "prompt": {{ JSON.stringify(($json.ad_prompt || '').replace(/\r?\n/g, ' ')) }},
  "image_urls": [{{ JSON.stringify($json.image_url || '') }}],
  "num_images": 1,
  "output_format": "jpeg"
}
```

### 🟦 HTTP Request (Facebook Graph API)

```json
{
  "method": "POST",
  "url": "https://graph.facebook.com/v19.0/me/photos",
  "body": {
    "caption": "{{ $json.caption_text }}",
    "url": "{{ $json.final_image_url }}",
    "access_token": "{{ $credentials.facebook.accessToken }}"
  }
}
```

## How to customize the workflow
- **Change AI Models:** Swap Fal.AI for Flux, Veo3, or SDXL by adjusting API endpoints.
- **Add Channels:** Extend the workflow to post on LINE OA or Instagram.
- **Add Approval Logic:** Keep Telegram confirmation steps before every publish.
- **Brand Rules:** Adjust AI prompt templates to enforce tone, logo, or color-palette consistency.
- **Multi-language Posts:** Add translation nodes for global campaigns.

## Troubleshooting

| Problem | Cause | Solution |
|---------|-------|----------|
| Telegram message not triggering | Webhook misconfigured | Reconnect the Telegram Trigger |
| Fal.AI API error | Invalid JSON or token | Use `JSON.stringify()` in the Set node and check credentials |
| Facebook upload fails | Missing permissions | Ensure the Page Access Token has `pages_manage_posts` |
| LLM parser error | Output not valid JSON | Add a Structured Output Parser and enforce a schema |

## ⚠️ Security Notes
- **Do NOT hardcode API keys** in Set or HTTP Request nodes.
- Always store credentials securely in the n8n Credentials Manager.
- For self-hosted setups, use `.env` variables for sensitive keys (OpenAI, Fal.AI, Facebook).

## 🏷️ Hashtags
#n8n #Automation #AIworkflow #FalAI #FacebookAPI #TelegramBot #nanobanana #NoCode #MarketingAutomation #SocialMediaAI #JaruphatJ #WorkflowTemplate #OpenAI #LLM #ProductAds #CreativeAutomation

*(Figure: Product Image Process Step)*
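Zone 3's "poll until completion" step can be sketched as a generic polling helper. This is an illustrative stand-in, not the workflow's actual node logic: `checkStatus` represents the HTTP Request that queries Fal.AI's generation status (the exact status endpoint and status strings are assumptions — check Fal.AI's queue documentation), and the interval mirrors what a Wait node would do.

```javascript
// Generic "poll until done" sketch for the Zone 3 image-generation step.
// Assumes the status check resolves to { status: "IN_PROGRESS" | "COMPLETED" | "FAILED", ... }.
async function pollUntilDone(checkStatus, { intervalMs = 2000, maxTries = 30 } = {}) {
  for (let i = 0; i < maxTries; i++) {
    const res = await checkStatus();
    if (res.status === "COMPLETED") return res;          // generated image ready
    if (res.status === "FAILED") throw new Error("generation failed");
    await new Promise((r) => setTimeout(r, intervalMs)); // back off before retrying
  }
  throw new Error("timed out waiting for image generation");
}
```

In n8n this loop is typically built with a Wait node plus an IF node that routes back until the status response says the job is finished.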
by Elay Guez
# AI-Powered Lead Enrichment & Email Writer for Monday.com

Enrich Monday.com leads with AI-powered company research and personalized email drafts using Explorium MCP and GPT-4.1.

## 🚀 Overview
Stop losing deals to slow response times! Transform your inbound leads into qualified opportunities with this intelligent workflow that automates lead enrichment and personalized outreach.

When a new lead drops into your Monday.com board, magic happens:
- 🔍 Deep-dives into company data using Explorium MCP's advanced intelligence engine
- 🧠 Analyzes business priorities, pain points, and growth opportunities
- 💡 Identifies specific AI automation use cases tailored to each company
- ✉️ Crafts hyper-personalized email drafts with GPT-4.1 (under 120 words!)
- 📊 Enriches your CRM with actionable insights and AI solution recommendations
- 📧 Saves draft emails directly to Gmail for your review
- 🔄 Updates Monday.com automatically with all the juicy insights

Perfect for sales teams, growth marketers, and BizDev pros who want to turn every lead into a conversation starter that actually converts!

## 👥 Who's it for?
- B2B sales teams drowning in inbound leads
- Growth teams needing lightning-fast lead qualification
- BizDev professionals seeking that personal touch at scale
- Companies rocking Monday.com as their CRM

## ⚡ How it works
1. A webhook triggers when a fresh lead hits Monday.com.
2. The Company Researcher agent unleashes Explorium MCP for company intel.
3. The Email Writer agent crafts personalized outreach that doesn't sound like a robot.
4. The CRM Enrichment agent adds golden nuggets of AI recommendations.
5. The Gmail integration parks the draft in your inbox.
6. Monday.com updates with all the enriched goodness.

## 🛠️ Setup Instructions
Time to magic: 20 minutes

You'll need:
- OpenAI API Key (for GPT-4.1)
- Explorium MCP API Key
- Monday.com API Token
- Gmail OAuth credentials
- Monday.com webhook setup

Step-by-step:
1. Import this template into your n8n instance.
2. Hook up the Monday.com webhook via the "Respond to Webhook" node.
3. Deactivate that "Respond to Webhook" node (important!).
4. Plug in your API credentials.
5. Customize agent prompts with YOUR company's secret sauce.
6. Match your Monday.com columns to the workflow.
7. Test-drive with a dummy lead.
8. Hit activate and watch the magic! ✨

## 📋 Requirements
- Monday.com board with these columns: Company Name, Contact Name, Email, Comments
- Explorium MCP access (for that company intelligence gold)
- OpenAI API (GPT-4.1 model ready)
- Gmail account (where drafts go to shine)

## 🎨 Make it yours
- Tweak the email tone - formal, casual, or somewhere in between
- Adjust research depth in the Company Researcher
- Add your unique value props to the agent prompts
- Connect more data sources for extra enrichment
- Hook up other CRMs (HubSpot, Salesforce, Pipedrive)
- Add Slack alerts for hot leads 🔥

## 💪 Why this rocks
Real talk: manual lead research is SO 2023. While your competitors are still googling companies, you're already in their inbox with an email that mentions their latest funding round, understands their tech stack, and offers solutions to problems they didn't even know you could solve.

This isn't just another "Hi {{first_name}}" template.
It's AI that actually gets context, writes like a human, and makes your prospects think "How did they know exactly what we need?"

Results you can expect:
- Faster lead response time
- Higher email open rates
- Actually useful CRM data (not just "interested in our product")
- Your sales team thanking you (seriously)

Built with ❤️ by: Elay Guez

Pro tip: The more context you feed the AI agents about your business, the scarier-good the personalization gets. Don't hold back on those System Message customizations!
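The "Respond to Webhook" step in the setup exists because Monday.com verifies a new webhook with a one-time challenge: it POSTs a JSON body containing a `challenge` token and expects the same token echoed back. A minimal sketch of that handshake logic (the function name is illustrative, not an n8n node):

```javascript
// Monday.com webhook handshake: echo back { challenge } on registration,
// otherwise signal that this is a normal lead event to process.
function handleMondayHandshake(body) {
  if (body && body.challenge) {
    return { statusCode: 200, body: { challenge: body.challenge } };
  }
  return null; // regular event payload — continue into the enrichment flow
}
```

Once the webhook is registered and verified, the node can be deactivated, which is why the setup steps tell you to turn it off.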
by Jorge Martínez
# Generate social posts from GitHub pushes to Twitter and LinkedIn

On each GitHub push, this workflow checks whether the commit set includes README.md and CHANGELOG.md, fetches both files, lets an LLM generate a Twitter and a LinkedIn post, then publishes to Twitter and LinkedIn (Person).

## Apps & Nodes
- **Trigger:** Webhook
- **Logic:** IF, Merge, Aggregate
- **GitHub:** Get Repository File (×2)
- **Files:** Extract from File (text) (×2)
- **AI:** OpenAI Chat Model → LLM Chain (+ Structured Output Parser)
- **Publish:** Twitter, LinkedIn (Person)

## Prerequisites
- **GitHub:** OAuth2 or PAT with repo read access.
- **OpenAI:** API key.
- **Twitter:** OAuth2 app with **Read and Write**; scopes `tweet.read tweet.write users.read offline.access`.
- **LinkedIn (Person):** OAuth2 credentials; **required scopes:** `w_member_social`, `openid`.

## Setup
1. GitHub webhook: Repo → Settings → Webhooks
   - Payload URL: `https://<your-n8n-domain>/webhook/github/push`
   - Content type: `application/json` • Event: Push • Secret (optional) • Branches as needed.
2. Credentials: connect GitHub, OpenAI, Twitter, and LinkedIn (Person).

## How it Works
1. Webhook receives the GitHub push payload.
2. IF checks that README and CHANGELOG appear in added/modified.
3. GitHub (Get Repository File) pulls README.md and CHANGELOG.md.
4. Extract from File (text) converts both binaries to text.
5. Merge & Aggregate combine both contents into one item.
6. LLM (OpenAI + Parser) returns a JSON with `twitter` and `linkedin`.
7. Twitter posts the tweet.
8. LinkedIn (Person) posts the LinkedIn text.
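The IF node's condition in step 2 can be sketched as follows. GitHub's push payload lists each commit's files under `added`, `modified`, and `removed`; this illustrative helper (not the workflow's exact expression) flattens the commit set and checks for both documentation files:

```javascript
// Does this push touch both README.md and CHANGELOG.md?
// `payload` is a GitHub push-event body with a `commits` array.
function touchesDocs(payload) {
  const files = (payload.commits || []).flatMap(
    (c) => [...(c.added || []), ...(c.modified || [])]
  );
  return files.includes("README.md") && files.includes("CHANGELOG.md");
}
```

In the n8n IF node, the equivalent check is written as an expression over `{{ $json.body.commits }}`.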
by Elay Guez
# 🔍 AI-Powered Web Research in Google Sheets with Bright Data

## 📋 Overview
Transform any Google Sheets cell into an intelligent web scraper! Type `=BRIGHTDATA("cell", "search prompt")` and get an AI-filtered result from any website in ~20 seconds.

What happens automatically:
1. AI optimizes your search query
2. Bright Data scrapes the web (bypasses bot detection)
3. AI analyzes and filters the result
4. Clean data is returned directly to your cell
5. Completes in <25 seconds

Cost: ~$0.02-0.05 per search | Time saved: 3-5 minutes per search

## 👥 Who's it for
- Market researchers needing competitive intelligence
- E-commerce teams tracking prices
- Sales teams doing lead prospecting
- SEO specialists gathering content research
- Real estate agents monitoring listings
- Anyone tired of manual copy-paste

## ⚙️ How it works
1. **Webhook Call** - the Google Sheets function sends a POST request
2. **Data Preparation** - organizes the input structure
3. **AI Query Optimization** - GPT-4.1 Mini refines the search query
4. **Web Scraping** - Bright Data fetches data while bypassing blocks
5. **AI Analysis** - GPT-4o Mini filters and summarizes the result
6. **Response** - plain text is returned to your cell
7. **Logging** - logs are updated for monitoring

## 🛠️ Setup Instructions
Time to deploy: 20 minutes

### Requirements
- n8n instance with a public URL
- Bright Data account + API key
- OpenAI API key
- Google account for Apps Script

### Part 1: n8n Workflow Setup
1. Import this template into your n8n instance
2. Configure the Webhook node:
   - Copy your webhook URL: `https://n8n.yourdomain.com/webhook/brightdata-search`
   - Set authentication: Header Auth
   - Set the API key: `12312346` (or create your own)
3. Add OpenAI credentials to the AI nodes
4. Configure Bright Data: add API credentials
5. Configure the output language: manually edit the "Set Variables" node
6. Test the workflow with a manual execution
7. Activate the workflow

### Part 2: Google Sheets Function
1. Open your Google Sheet → Extensions → Apps Script
2. Paste this code:

```javascript
function BRIGHTDATA(prompt, source) {
  if (!prompt || prompt === "") {
    return "❌ Must enter prompt";
  }
  source = source || "google";

  // Update with YOUR webhook URL
  const N8N_WEBHOOK_URL = "https://your-n8n-domain.com/webhook/brightdata-search";
  // Update with YOUR password
  const API_KEY = "12312346";

  let spreadsheetId, sheetName, cellAddress;
  try {
    const sheet = SpreadsheetApp.getActiveSheet();
    const activeCell = sheet.getActiveCell();
    spreadsheetId = SpreadsheetApp.getActiveSpreadsheet().getId();
    sheetName = sheet.getName();
    cellAddress = activeCell.getA1Notation();
  } catch (e) {
    return "❌ Cannot identify cell";
  }

  const payload = {
    prompt: prompt,
    source: source.toLowerCase(),
    context: {
      spreadsheetId: spreadsheetId,
      sheetName: sheetName,
      cellAddress: cellAddress,
      timestamp: new Date().toISOString()
    }
  };

  const options = {
    method: "post",
    contentType: "application/json",
    payload: JSON.stringify(payload),
    muteHttpExceptions: true,
    headers: {
      "Accept": "text/plain",
      "key": API_KEY
    }
  };

  try {
    const response = UrlFetchApp.fetch(N8N_WEBHOOK_URL, options);
    const responseCode = response.getResponseCode();
    if (responseCode !== 200) {
      Logger.log("Error response: " + response.getContentText());
      return "❌ Error " + responseCode;
    }
    return response.getContentText();
  } catch (error) {
    Logger.log("Exception: " + error.toString());
    return "❌ Connection error: " + error.toString();
  }
}

function doGet(e) {
  return ContentService.createTextOutput(JSON.stringify({
    status: "alive",
    message: "Apps Script is running",
    timestamp: new Date().toISOString()
  })).setMimeType(ContentService.MimeType.JSON);
}
```

3. Update `N8N_WEBHOOK_URL` with your webhook URL
4. Update `API_KEY` with your password
5. Save (Ctrl+S / Cmd+S) - important!
6. Close the Apps Script editor

## 💡 Usage Examples
```
=BRIGHTDATA("C3", "What is the current price of the product?")
=BRIGHTDATA("D30", "What is the size of this company?")
=BRIGHTDATA("A4", "Is this company hiring developers?")
```

## 🎨 Customization
Easy tweaks:
- **AI Models** - switch to GPT-4o for better optimization
- **Response Format** - modify the prompt for specific outputs
- **Speed** - optimize AI prompts to reduce time
- **Language** - change the prompts to any language

Advanced options:
- Implement rate limiting
- Add data validation
- Create an async mode for long queries
- Add Slack notifications

## 🚀 Pro Tips
- **Be Specific** - "What is the iPhone 15 Pro 256GB US price?" beats "What is the iPhone price?"
- **Speed Matters** - keep prompts concise (30s timeout limit)
- **Monitor Costs** - track Bright Data usage
- **Debug** - check workflow logs for errors

## ⚠️ Important Notes
- **Timeout:** 30-second Google Sheets limit (aim for <20s)
- **Plain Text Only:** no JSON responses
- **Costs:** monitor Bright Data usage at console.brightdata.com
- **Security:** keep API keys secret
- **No Browser Storage:** don't use localStorage/sessionStorage

## 🔧 Troubleshooting

| Error | Solution |
|-------|----------|
| "Exceeded maximum execution time" | Optimize AI prompts or use an async mode |
| "Could not fetch data" | Verify Bright Data credentials |
| Empty cell | Check n8n logs for AI parsing issues |
| Broken characters | Verify UTF-8 encoding in the Webhook node |

## 📚 Resources
- Bright Data API Docs
- n8n Webhook Documentation
- Google Apps Script Reference

Built with ❤️ by Elay Guez
by David Ashby
# 🛠️ NASA Tool MCP Server

Complete MCP server exposing all NASA Tool operations to AI agents. Zero configuration needed - all 15 operations pre-built.

## ⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

## 🔧 How it Works
- **MCP Trigger:** serves as your server endpoint for AI agent requests
- **Tool Nodes:** pre-configured for every NASA Tool operation
- **AI Expressions:** automatically populate parameters via `$fromAI()` placeholders
- **Native Integration:** uses the official n8n NASA Tool node with full error handling

## 📋 Available Operations (15 total)
Every possible NASA Tool operation is included:
- 🔧 **Asteroid NEO Browse** (1 operation) - Get many asteroid NEOs
- 🔧 **Asteroid NEO Feed** (1 operation) - Get an asteroid NEO feed
- 🔧 **Asteroid NEO Lookup** (1 operation) - Get an asteroid NEO lookup
- 🔧 **Astronomy Picture of the Day** (1 operation) - Get the astronomy picture of the day
- 🔧 **DONKI Coronal Mass Ejection** (1 operation) - Get a DONKI coronal mass ejection
- 🔧 **DONKI High Speed Stream** (1 operation) - Get a DONKI high speed stream
- 🔧 **DONKI Interplanetary Shock** (1 operation) - Get a DONKI interplanetary shock
- 🔧 **DONKI Magnetopause Crossing** (1 operation) - Get a DONKI magnetopause crossing
- 🔧 **DONKI Notifications** (1 operation) - Get DONKI notifications
- 🔧 **DONKI Radiation Belt Enhancement** (1 operation) - Get a DONKI radiation belt enhancement
- 🔧 **DONKI Solar Energetic Particle** (1 operation) - Get a DONKI solar energetic particle
- 🔧 **DONKI Solar Flare** (1 operation) - Get a DONKI solar flare
- 🔧 **DONKI WSA-Enlil Simulation** (1 operation) - Get a DONKI WSA-Enlil simulation
- 🔧 **Earth Assets** (1 operation) - Get Earth assets
- 🔧 **Earth Imagery** (1 operation) - Get Earth imagery

## 🤖 AI Integration
**Parameter Handling:** AI agents automatically provide values for:
- Resource IDs and identifiers
- Search queries and filters
- Content and data payloads
- Configuration options

**Response Format:** native NASA Tool API responses with the full data structure
**Error Handling:** built-in n8n error management and retry logic

## 💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
- **Claude Desktop:** add the MCP server URL to its configuration
- **Custom AI Apps:** use the MCP URL as a tool endpoint
- **Other n8n Workflows:** call MCP tools from any workflow
- **API Integration:** make direct HTTP calls to MCP endpoints

## ✨ Benefits
- **Complete Coverage:** every NASA Tool operation available
- **Zero Setup:** no parameter mapping or configuration needed
- **AI-Ready:** built-in `$fromAI()` expressions for all parameters
- **Production Ready:** native n8n error handling and logging
- **Extensible:** easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
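To make the `$fromAI()` mechanism concrete, a tool-node parameter might look like the following n8n expression (the parameter name and description here are illustrative, not taken from this workflow):

```
{{ $fromAI('asteroidId', 'The NEO reference ID the user wants to look up', 'string') }}
```

When an AI agent calls the tool, n8n fills this placeholder with a value the model supplies, so no manual parameter mapping is needed.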
by Grace Gbadamosi
> This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## How it works
This workflow automatically synchronizes contact data from multiple CRM systems (HubSpot, Pipedrive, and Salesforce) into a unified Google Sheets database. The system runs on a daily schedule or can be triggered manually via webhook. It uses AI-powered data processing to intelligently deduplicate records, calculate data quality scores, and merge similar contacts. The workflow generates comprehensive quality reports and sends notifications via Slack, ensuring your master database stays clean and up to date.

## Who is this for
This template is designed for revenue operations teams, data managers, and business analysts who need to consolidate customer data from multiple CRM platforms. It's particularly valuable for organizations using multiple sales tools simultaneously or during CRM migration projects. The workflow helps maintain data integrity while providing insights into data quality across different systems.
## Requirements
- **Google Sheets Account**: for storing the master database and quality reports
- **CRM Platform Access**: API credentials for HubSpot, Pipedrive, and/or Salesforce
- **OpenAI API Key**: for AI-powered data processing and deduplication
- **MCP Server**: GitHub: marlonluo2018/pandas-mcp-server (optional - can be replaced with a Code node)
- **Slack Webhook**: for receiving sync completion and error notifications

## How to set up
1. **Configure Environment Variables** - set up secure credential storage for all API keys: `HUBSPOT_API_KEY`, `PIPEDRIVE_API_KEY`, `SALESFORCE_ACCESS_TOKEN`, `OPENAI_API_KEY`, and `SLACK_WEBHOOK_URL`
2. **Create the Google Sheets Structure** - create a master Google Sheet with two tabs: "Master_CRM_Data" for the unified contact database and "Quality_Reports" for tracking sync statistics and data quality metrics
3. **Set Up the MCP Server** - install and configure marlonluo2018/pandas-mcp-server
4. **Update the Configuration Center** - replace all placeholder values in the Configuration Center node with your actual Google Sheet IDs, quality thresholds, deduplication keys, and batch processing settings
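The two core data operations the AI step performs, deduplication keying and data-quality scoring, can be sketched as below. This is an illustrative stand-in for the workflow's AI/Code logic, and every field name is an assumption; the actual deduplication keys and quality thresholds live in the Configuration Center node.

```javascript
// Normalize a contact into a deduplication key: prefer email, fall back to name.
function dedupeKey(contact) {
  const key = contact.email || `${contact.firstName || ""} ${contact.lastName || ""}`;
  return key.trim().toLowerCase();
}

// Simple completeness score (0-100): percentage of key fields that are filled.
function qualityScore(contact) {
  const fields = ["email", "firstName", "lastName", "company", "phone"];
  const filled = fields.filter((f) => contact[f] && String(contact[f]).trim() !== "");
  return Math.round((filled.length / fields.length) * 100);
}
```

Records sharing a `dedupeKey` are candidates for merging; the score flags sparse records in the Quality_Reports tab.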
by KlickTipp
> Community Node Disclaimer: This workflow uses KlickTipp community nodes.

## How It Works
This workflow connects an MCP Server with the KlickTipp contact management platform and integrates it with an LLM (e.g., Claude) to enable intelligent querying and segmentation of contact data. It covers all major KlickTipp API endpoints, providing a comprehensive toolkit for automated contact handling and campaign targeting.

### Key Features
- **MCP Server Trigger:** initiates the workflow via the MCP server, listening for incoming requests related to contact queries or segmentation actions.
- **LLM Interaction Setup:** interacts with an OpenAI or Claude model to handle natural-language queries such as contact lookups, tagging, and segmentation tasks.
- **KlickTipp Integration:** the complete set of KlickTipp API endpoints is included:
  - Contact Management: add, update, get, list, delete, and unsubscribe contacts.
  - Contact Tagging: tag, untag, and list tagged contacts.
  - Tag Operations: create, get, update, delete, and list tags.
  - Opt-In Processes: list and retrieve opt-in process details.
  - Data Fields: list and get custom data fields.
  - Redirects: retrieve redirect URLs.

### Use Cases Supported
- Query contact information via email or name.
- Identify and segment contacts by city, region, or behavior.
- Create or update contacts from the provided data.
- Dynamically apply or remove tags to initiate campaigns.
- Automate targeted outreach based on contact attributes.

### Setup Instructions
1. **Install and configure nodes:**
   - Set up the MCP Server.
   - Configure the LLM connection (e.g., Claude Desktop configuration).
   - Add and authenticate all KlickTipp nodes using valid API credentials.
2. **Define tagging and field mapping:**
   - Identify which fields and tags are relevant to your use cases.
   - Ensure the necessary tags and custom fields are already created in KlickTipp.
3. **Workflow logic:**
   - Trigger via MCP Server: a prompt or webhook call activates the server listener.
   - Query handling via LLM agent: the AI interprets the natural-language input and determines the action.
   - Contact search & segmentation: searches contacts using identifiers (email, address) or criteria.
   - Data operations: retrieves, updates, or manages contact and tag data based on the interpreted command.
   - Campaign preparation: applies tags or sends campaign triggers depending on query results.

### Benefits
- **AI-Powered Automation:** reduces manual contact search and tagging effort through intelligent processing.
- **Scalable Integration:** built-in support for the full range of KlickTipp operations allows diverse use-case handling.
- **Data Consistency:** ensures structured data flows between MCP, AI, and KlickTipp, minimizing errors.

### Testing and Deployment
Use defined prompts such as:
- "Tell me something about the contact with email address X"
- "Tag all contacts from region Y"
- "Send campaign Z to customers in area A"

Validate the expected actions in KlickTipp after prompt execution.

### Notes
- **Customization:** adjust tag logic, AI prompts, and contact field mappings based on project needs.
- **Extensibility:** the template can be expanded with further logic for Google Sheets input or campaign feedback loops.

### Resources
- Use KlickTipp Community Node in n8n
- Automate Workflows: KlickTipp Integration in n8n
by Pawan
## Who's it for?
This template is perfect for educational institutions, coaching centers (like UPSC, GMAT, or specialized technical training), internal corporate knowledge bases, and SaaS companies that need to provide instant, accurate, and source-grounded answers based on proprietary documents. It's designed for users who want to leverage Google Gemini's powerful reasoning but ensure its answers are strictly factual and based only on their verified knowledge repository.

## How it works / What it does
This workflow establishes a Retrieval-Augmented Generation (RAG) pipeline to build a secure, fact-based AI agent. It operates in two main phases:

1. **Knowledge Ingestion:** When a new document (e.g., a PDF, lecture notes, or a policy manual) is uploaded via a form or Google Drive, the Embeddings Google Gemini node converts the content into numerical vectors. These vectors are then stored in a secure MongoDB Atlas Vector Store, creating a private knowledge base.
2. **AI Query & Response:** A user asks a question via Telegram. The AI Agent uses the question to perform a semantic search on the MongoDB Vector Store, retrieving the most relevant, source-specific passages. It then feeds this retrieved context to the Google Gemini Chat Model to generate a precise, factual answer, which is sent back to the user on Telegram.

This process ensures the agent never "hallucinates" or falls back on general internet knowledge, making the responses accurate and trustworthy.

## Requirements
To use this template, you will need the following accounts and credentials:
- n8n account
- **Google Gemini API Key:** for generating vector embeddings and powering the AI Agent.
- **MongoDB Atlas Cluster:** a free-tier cluster is sufficient, configured with a Vector Search index.
- **Telegram Bot:** a bot created via BotFather and a Chat ID where the bot will listen for and send messages.
- **Google Drive credentials** (if using the Google Drive ingestion path).

## How to set up
1. **Set up MongoDB Atlas:** create a free cluster and a database.
   Create a Vector Search index on your collection to enable efficient searching.
2. **Configure the ingestion path:**
   - Set up the Webhook trigger for your "On form submission" form, or connect your Google Drive credentials.
   - Configure the Embeddings Google Gemini node with your API key.
   - Connect the MongoDB Atlas Vector Store node with your database credentials, collection name, and index name.
3. **Configure the chat path:**
   - Set up the Telegram Trigger with your bot token to listen for incoming messages.
   - Configure the Google Gemini Chat Model with your API key.
   - Connect the MongoDB Atlas Vector Store 1 node as a Tool within the AI Agent. Ensure it points to the same vector store as the ingestion path.
4. **Final step:** configure the Send a text message node with your **Telegram Bot Token** and **Chat ID**.

## How to customize the workflow
- **Change the knowledge source:** replace the Google Drive nodes with nodes for Notion, SharePoint, Zendesk, or another document source.
- **Change the chat platform:** replace the Telegram nodes with a Slack, Discord, or WhatsApp Cloud trigger and response node.
- **Refine the agent's persona:** open the AI Agent node and edit the System Instruction to give the bot a specific role (e.g., "You are a senior UPSC coach. Answer questions politely and cite sources.").

## 💡 Example Use Case
A UPSC/JEE/NEET coaching center uploads NCERT summaries and previous years' notes to Google Drive. Students ask questions in the Telegram group — the bot instantly replies with contextually accurate answers from the uploaded materials. The same agent can generate daily quizzes or concise notes from this curated content automatically.
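Under the hood, the semantic search the vector-store tool runs against Atlas is a `$vectorSearch` aggregation stage. A minimal sketch, where the index name, embedding field path, and result limit are assumptions you would match to your own Atlas setup:

```javascript
// Build the aggregation pipeline for a semantic search over the knowledge base.
// `queryVector` is the Gemini embedding of the user's Telegram question.
function buildVectorSearchPipeline(queryVector) {
  return [
    {
      $vectorSearch: {
        index: "vector_index",   // name of your Atlas Vector Search index
        path: "embedding",       // document field holding the stored embedding
        queryVector,             // query embedding to compare against
        numCandidates: 100,      // ANN candidates considered before ranking
        limit: 4                 // top passages passed to the chat model as context
      }
    },
    { $project: { text: 1, source: 1, score: { $meta: "vectorSearchScore" } } }
  ];
}
```

The n8n MongoDB Atlas Vector Store node constructs an equivalent query for you; this sketch just shows what "perform a semantic search" means at the database level.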
by Aryan Shinde
Instagram Reel Downloader & Logger

Automate Instagram Reel downloads, storage, and activity logging.

What does this workflow do?

- Handles incoming webhook requests (ideal for Instagram/Facebook API triggers).
- Validates the webhook via challenge-response and a custom verify token.
- Checks for messages from yourself (filtering out automated or self-triggered runs).
- Downloads Instagram Reels from URLs posted to the webhook.
- Uploads the reel to Google Drive and retrieves the download URL.
- Logs reel details (status, URL, and timestamp) to a Google Sheet for record-keeping.
- Notifies you on Telegram with the download details and the Google Drive link.

How does it work?

- **Webhook:** Listens for new messages/events (custom webhook endpoint for Meta).
- **Validation:** Confirms the webhook subscribe/challenge and verify token from the Meta API.
- **Sender Check:** Ignores messages unless they match your configured sender/recipient.
- **Download Reel:** Fetches the reel/attachment from Instagram using the received URLs.
- **Timestamp Gen:** Adds a precise timestamp and an ISO-based unique ID to the activity log.
- **Upload to Drive:** Saves the downloaded reel in a preset Google Drive folder.
- **Log to Sheet:** Updates a Google Sheet with the reel's status, URL, and timestamp.
- **Telegram Alert:** Instantly notifies you when a new reel is downloaded and logged.

What do I need to make this work?

- A registered webhook endpoint (from your Meta/Instagram app configuration).
- Google Drive and Google Sheets accounts (OAuth2-connected to n8n).
- A Telegram bot and Chat ID set up to receive download-completion messages.
- The verify_token in your webhook event source must match your template ("youtube-automation-n8n-token" by default).
- Your Drive/Sheet/bot credentials updated to match your n8n instance's environment.

Why use this?

- Fully automates the collection and archival of Instagram Reels.
- Centralizes content download, backup, and activity records for your automation flows.
- Provides instant monitoring and archival of each event.
- Saves time, automates digital content management, and notifies you in real time.

Setup Tips

- Make sure your webhook path and Meta app configuration match (/n8n-template-insta-webhook).
- Double-check the Google credentials and the sheet's tab IDs/names.
- Replace the Telegram and Google connection credentials with your own, stored securely.
- Use this as a foundation for any Instagram/Facebook-based automations in n8n, and customize it as your automation stack evolves.
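The challenge-response validation described above follows Meta's standard webhook verification handshake: Meta sends a GET request carrying `hub.mode`, `hub.verify_token`, and `hub.challenge` query parameters, and expects the challenge echoed back only when the token matches. A minimal, framework-free sketch of that check (the token value shown is the template's default; the function name is illustrative):

```python
VERIFY_TOKEN = "youtube-automation-n8n-token"  # template default; replace with your own

def verify_webhook(params: dict):
    """Return (status_code, body) for Meta's GET verification request.

    Meta only marks the webhook as verified if the challenge string is
    echoed back with a 200 status; any other response fails verification.
    """
    if (params.get("hub.mode") == "subscribe"
            and params.get("hub.verify_token") == VERIFY_TOKEN):
        return 200, params.get("hub.challenge", "")
    return 403, "Verification token mismatch"
```

In the workflow itself, the Webhook node receives these query parameters and an IF/Respond node performs the equivalent comparison before any reel processing runs.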
by David Ashby
Complete MCP server exposing 14 Domains-Index API operations to AI agents.

⚡ Quick Setup

Need help, or want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. **Credentials:** Add Domains-Index API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works

This workflow converts the Domains-Index API into an MCP-compatible interface for AI agents.

• **MCP Trigger:** Serves as your server endpoint for AI agent requests
• **HTTP Request Nodes:** Handle API calls to /v1
• **AI Expressions:** Automatically populate parameters via $fromAI() placeholders
• **Native Integration:** Returns responses directly to the AI agent

📋 Available Operations (14 total)

🔧 Domains (9 endpoints)

• GET /domains/search: Domains database search
• GET /domains/tld/{zone_id}: Get TLD records
• GET /domains/tld/{zone_id}/download: Download the whole dataset for a TLD
• GET /domains/tld/{zone_id}/search: Domains search for a TLD
• GET /domains/updates/added: Get added domains (latest if no date is specified)
• GET /domains/updates/added/download: Download added domains (latest if no date is specified)
• GET /domains/updates/deleted: Get deleted domains (latest if no date is specified)
• GET /domains/updates/deleted/download: Download deleted domains (latest if no date is specified)
• GET /domains/updates/list: List of updates

🔧 Info (5 endpoints)

• GET /info/api: API info
• GET /info/stat/: Returns overall statistics
• GET /info/stat/{zone}: Returns statistics for a specific zone
• GET /info/tld/: Returns overall TLD info
• GET /info/tld/{zone}: Returns statistics for a specific zone

🤖 AI Integration

**Parameter Handling:** AI agents automatically provide values for:

• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

**Response Format:** Native Domains-Index API responses with full data structure

**Error Handling:** Built-in n8n HTTP request error management

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:

• **Claude Desktop:** Add the MCP server URL to its configuration
• **Cursor:** Add the MCP server SSE URL to its configuration
• **Custom AI Apps:** Use the MCP URL as a tool endpoint
• **API Integration:** Direct HTTP calls to the MCP endpoints

✨ Benefits

• **Zero Setup:** No parameter mapping or configuration needed
• **AI-Ready:** Built-in $fromAI() expressions for all parameters
• **Production Ready:** Native n8n HTTP request handling and logging
• **Extensible:** Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
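The parameter handling above can be pictured as simple template substitution: each HTTP Request node holds a path template whose `$fromAI()` placeholders are filled with values the AI agent supplies at call time. A rough, framework-free sketch of the idea — the path template is taken from the operation list above, but the function and parameter names are purely illustrative:

```python
def fill_path(template: str, ai_params: dict) -> str:
    """Substitute AI-supplied values into a path template such as
    '/domains/tld/{zone_id}/search' — roughly what the $fromAI()
    placeholders do inside the workflow's HTTP Request nodes."""
    return template.format(**ai_params)

# An agent asked to "search domains in the .com zone" would, in effect,
# provide {"zone_id": "com"} and trigger a request to this path:
path = fill_path("/domains/tld/{zone_id}/search", {"zone_id": "com"})
# → "/domains/tld/com/search"
```

In the real workflow, n8n evaluates the `$fromAI()` expressions itself; this sketch only shows what data the agent is expected to contribute.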
by David Ashby
Complete MCP server exposing 9 Api2Pdf (PDF Generation, Powered by AWS Lambda) API operations to AI agents.

⚡ Quick Setup

Need help, or want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. **Credentials:** Add Api2Pdf credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works

This workflow converts the Api2Pdf API into an MCP-compatible interface for AI agents.

• **MCP Trigger:** Serves as your server endpoint for AI agent requests
• **HTTP Request Nodes:** Handle API calls to https://v2018.api2pdf.com
• **AI Expressions:** Automatically populate parameters via $fromAI() placeholders
• **Native Integration:** Returns responses directly to the AI agent

📋 Available Operations (9 total)

🔧 Chrome (3 endpoints)

• POST /chrome/html: Convert raw HTML to PDF
• GET /chrome/url: Convert URL to PDF
• POST /chrome/url: Convert URL to PDF

🔧 LibreOffice (1 endpoint)

• POST /libreoffice/convert: Convert office document or image to PDF

🔧 Merge (1 endpoint)

• POST /merge: Merge multiple PDFs together

🔧 Wkhtmltopdf (3 endpoints)

• POST /wkhtmltopdf/html: Convert raw HTML to PDF
• GET /wkhtmltopdf/url: Convert URL to PDF
• POST /wkhtmltopdf/url: Convert URL to PDF

🔧 Zebra (1 endpoint)

• GET /zebra: Generate barcodes and QR codes with ZXing

🤖 AI Integration

**Parameter Handling:** AI agents automatically provide values for:

• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

**Response Format:** Native Api2Pdf API responses with full data structure

**Error Handling:** Built-in n8n HTTP request error management

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:

• **Claude Desktop:** Add the MCP server URL to its configuration
• **Cursor:** Add the MCP server SSE URL to its configuration
• **Custom AI Apps:** Use the MCP URL as a tool endpoint
• **API Integration:** Direct HTTP calls to the MCP endpoints

✨ Benefits

• **Zero Setup:** No parameter mapping or configuration needed
• **AI-Ready:** Built-in $fromAI() expressions for all parameters
• **Production Ready:** Native n8n HTTP request handling and logging
• **Extensible:** Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
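As an illustration of what one of these operations carries on the wire, a POST /chrome/html call sends the raw HTML in a JSON body and authenticates with an API key in the Authorization header. The sketch below only builds the request description without sending it; the body field names (`html`, `inlinePdf`, `fileName`) follow Api2Pdf's v2018 conventions as an assumption — verify them against the current Api2Pdf API reference before relying on them:

```python
API_BASE = "https://v2018.api2pdf.com"

def chrome_html_request(api_key: str, html: str, file_name: str = "output.pdf"):
    """Build the request the MCP server's 'Convert raw HTML to PDF'
    operation issues. Field names assume Api2Pdf v2018 conventions;
    confirm against the current API reference."""
    return {
        "url": f"{API_BASE}/chrome/html",
        "headers": {"Authorization": api_key},  # plain API key, no 'Bearer' prefix
        "json": {
            "html": html,         # the raw HTML to render
            "inlinePdf": True,    # return a viewable link rather than an attachment
            "fileName": file_name,
        },
    }

req = chrome_html_request("YOUR_API_KEY", "<h1>Hello</h1>")
```

In the workflow, the AI agent supplies the `html` value through a `$fromAI()` placeholder in the corresponding HTTP Request node.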