by Daniel Rosehill
Voice Note Context Extraction Pipeline with AI Agent & Vector Storage

This n8n template demonstrates how to automatically extract and store contextual information from voice notes using AI agents and vector databases for future retrieval.

How it works

- **Webhook trigger** receives voice note data, including title, transcript, and timestamp, from external services (example here: voicenotes.com; a sample payload sketch appears at the end of this template)
- **Field extraction** isolates the key data fields (title, transcript, timestamp) for processing
- **AI Context Agent** processes the transcript to extract meaningful context while:
  - Correcting speech-to-text errors
  - Converting first-person references to third-person facts
  - Filtering out casual conversation and focusing on significant information
- **Output formatting** structures the extracted context with timestamps for embedding
- **File conversion** prepares the context data for vector storage
- **Vector embedding** uses OpenAI embeddings to create searchable representations
- **Milvus storage** stores the embedded context for future retrieval in RAG applications

How to use

1. Configure the webhook endpoint to receive data from your voice note service
2. Set up credentials for OpenRouter (LLM), OpenAI (embeddings), and Milvus (vector storage)
3. Customize the AI agent's system prompt to match your context extraction needs
4. The workflow automatically processes incoming voice notes and stores extracted context

Requirements

- OpenRouter account for LLM access
- OpenAI API key for embeddings
- Milvus vector database (cloud or self-hosted)
- Voice note service with webhook capabilities (e.g., Voicenotes.com)

Customizing this workflow

- **Modify the context extraction prompt** to focus on specific types of information (preferences, facts, relationships)
- **Add filtering logic** to process only voice notes with specific tags or keywords
- **Integrate with other storage systems** like Pinecone, Weaviate, or local vector databases
- **Connect to RAG systems** to use the stored context for enhanced AI conversations
- **Add notification nodes** to confirm successful context extraction and storage

Use cases

- **Personal AI assistant** that remembers your preferences and context from voice notes
- **Knowledge management** system for capturing insights from recorded thoughts
- **Content creation** pipeline that extracts key themes from voice recordings
- **Research assistant** that builds context from interview transcripts or meeting notes
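As a rough illustration, the webhook payload from a voice note service might look like this (all field names are assumptions; adjust the field extraction step to match what your provider actually sends):

```json
{
  "title": "Ideas for the garden office",
  "transcript": "I was thinking we should move the desk closer to the window...",
  "created_at": "2024-05-14T09:32:00Z"
}
```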
by Oneclick AI Squad
This automated n8n workflow monitors real-time cryptocurrency prices using the CoinGecko API and sends smart alerts when price conditions are met. It supports multi-coin tracking, dynamic conditions, and instant notifications via Email, Telegram, and Discord.

Good to Know

- Reads crypto watchlist data from Google Sheets.
- Monitors prices at defined intervals (24/7 monitoring).
- Handles upper and lower price limits with direction-based alerts (above, below, both).
- Implements cooldown logic to avoid duplicate alerts.
- Updates the last alert price and timestamp in Google Sheets.
- Supports multiple alert channels: Email, Telegram, Discord.
- Uses the CoinGecko API for price data (free tier supported).

How It Works

1. 24/7 Crypto Trigger – Runs every minute (or a custom interval) to check the latest prices.
2. Read Crypto Watchlist – Fetches symbols and conditions from Google Sheets.
3. Parse Crypto Data – Converts Google Sheets data into structured JSON.
4. Fetch Live Crypto Price – Uses the CoinGecko API to get the latest market price for each coin.
5. Smart Crypto Alert Logic – Compares the live price with the upper/lower limits and evaluates conditions (a sketch appears at the end of this template):
   - Above – Trigger an alert if price > upper_limit.
   - Below – Trigger an alert if price < lower_limit.
   - Both – Trigger an alert if either condition is met.
   - Implements cooldown_minutes to prevent repeated alerts.
6. Check Crypto Alert Conditions – Validates alerts before sending notifications.
7. Send Crypto Email Alert – Sends an email alert if the condition is true.
8. Send Telegram Crypto Alert – Sends a Telegram alert.
9. Send Discord Crypto Alert – Sends a Discord alert.
10. Update Crypto Alert History – Updates last_alert_price and last_alert_time in the Google Sheet.
11. Crypto Alert Status Check – Ensures the alert process completed successfully.
12. Success Notification – Sends a confirmation message on success.
13. Error Notification – Sends an error alert if something fails.

Google Sheet Columns (A-G)

| Column | Description |
| ------ | ---------------------------------- |
| A | symbol (BTC, ETH, SOL, etc.) |
| B | upper_limit (e.g., 45000) |
| C | lower_limit (e.g., 40000) |
| D | direction (both / above / below) |
| E | cooldown_minutes (e.g., 10) |
| F | last_alert_price (auto-updated) |
| G | last_alert_time (auto-updated) |

How to Use

1. Import the workflow into n8n.
2. Configure Google Sheets credentials and link your watchlist sheet.
3. Add your CoinGecko API endpoint in the Fetch Price node (free tier).
4. Set up Email, Telegram, and Discord credentials for notifications.
5. Test with sample data. Example: BTC, upper_limit=45000, lower_limit=40000, direction=both.
6. Execute the workflow and monitor alerts.

Requirements

- n8n environment with execution permissions.
- Google Sheets integration (with API credentials).
- CoinGecko API (free tier supported).
- Notification channels: Email (SMTP settings in n8n), Telegram Bot Token, Discord Webhook URL.

Customizing This Workflow

- Add more coins in the Google Sheet.
- Modify alert conditions (e.g., percentage change, moving averages).
- Add SMS or WhatsApp notifications.
- Integrate with Slack or Microsoft Teams.
- Use AI-based price predictions for smarter alerts.
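As a rough sketch of the Smart Crypto Alert Logic step described above, a Code node could implement the direction and cooldown checks like this (field names follow the sheet columns A-G; live_price is an assumed field merged in from the CoinGecko response):

```javascript
// Hedged sketch of the alert-condition Code node; the actual template's
// implementation may differ. Assumes each item is one watchlist row
// already merged with its live CoinGecko price.
const now = Date.now();
const out = [];

for (const item of $input.all()) {
  const row = item.json;
  const above = Number(row.live_price) > Number(row.upper_limit);
  const below = Number(row.live_price) < Number(row.lower_limit);

  // Direction-based condition: above, below, or both
  const triggered =
    (row.direction === 'above' && above) ||
    (row.direction === 'below' && below) ||
    (row.direction === 'both' && (above || below));

  // Cooldown: suppress repeat alerts inside the cooldown window
  const cooldownMs = Number(row.cooldown_minutes || 0) * 60 * 1000;
  const lastAlert = row.last_alert_time ? new Date(row.last_alert_time).getTime() : 0;

  if (triggered && now - lastAlert >= cooldownMs) out.push(item);
}

return out;
```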
by vinci-king-01
How it works

This workflow automatically processes bank statements from various formats and extracts structured transaction data with intelligent categorization using AI.

Key Steps

1. File Upload - Accepts bank statements via webhook upload (PDF, Excel, CSV formats).
2. Smart Format Detection - Automatically routes files to appropriate processors (PDF text extraction or spreadsheet parsing).
3. AI-Powered Extraction - Uses GPT-4 to extract account details, transactions, and balances from statement data.
4. Data Processing & Categorization - Cleans, validates, and automatically categorizes transactions into expense categories.
5. Database Storage - Saves processed data to a PostgreSQL database for analysis and reporting.
6. API Response - Returns a structured summary with transaction counts, expense totals, and category breakdowns (a sample response sketch appears at the end of this template).

Set up steps

Setup time: 8-12 minutes

1. Configure OpenAI credentials - Add your OpenAI API key for AI-powered data extraction.
2. Set up PostgreSQL database - Connect your PostgreSQL database and create the required table structure.
3. Configure webhook endpoint - The workflow provides an /upload-statement endpoint for file uploads.
4. Customize transaction categories - Modify the AI prompt to include your preferred expense categories.
5. Test the workflow - Upload a sample bank statement to verify the extraction and categorization process.
6. Set up database table - Ensure your PostgreSQL database has a bank_statements table with appropriate columns.

Features

- **Multi-format support**: PDF, Excel, CSV bank statements
- **AI-powered extraction**: GPT-4 extracts account details and transactions
- **Automatic categorization**: Expenses categorized as groceries, dining, gas, shopping, utilities, healthcare, entertainment, income, fees, or other
- **Data validation**: Cleans and validates transaction data with error handling
- **Database storage**: PostgreSQL integration for data persistence
- **API responses**: Clean JSON responses with transaction summaries and category breakdowns
- **Smart routing**: Automatic format detection and appropriate processing paths
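For illustration, a response from the /upload-statement endpoint might look roughly like this; the exact shape depends on how the final response node is configured, and all field names and values here are hypothetical:

```json
{
  "status": "success",
  "account": { "bank": "Example Bank", "last4": "1234" },
  "transaction_count": 42,
  "total_expenses": 1873.55,
  "category_breakdown": {
    "groceries": 412.10,
    "dining": 199.45,
    "utilities": 230.00,
    "other": 1032.00
  }
}
```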
by Intuz
This n8n template from Intuz delivers a complete, automated solution that streamlines your development workflow for a single repository. By embedding specific keywords and a JIRA issue ID in your git commit messages, the workflow automatically creates a Pull Request in GitHub and simultaneously updates the corresponding JIRA ticket. This seamless integration eliminates manual steps and keeps your project management in sync with your codebase.

How it works

This workflow acts as a bridge between your Git repository and your project management tools, driven entirely by the structure of your commit messages.

1. GitHub Webhook Trigger: The workflow starts when a developer pushes a new commit to a specified repository in GitHub.
2. Parse Commit Message: A Code node extracts key information from the commit message (a sketch of this parsing logic appears at the end of this template):
   - The JIRA Issue Key (e.g., FF-1196).
   - The base branch for the PR (e.g., development).
   - Action commands like [auto-pr] and [taskcompleted].
3. Conditional PR Creation: An IF node checks whether the [auto-pr] command is present.
   - If yes, it uses the GitHub node to automatically create a pull request from the developer's branch to the specified base branch.
   - If no, this step is skipped, allowing for multiple commits before a PR is made.
4. Conditional JIRA Update: Another IF node checks for the [taskcompleted] command.
   - If yes, it uses the JIRA node to transition the corresponding issue to your "Done" status (e.g., "Task Completed" or "In Review").
   - If no, the JIRA issue remains in its current state, perfect for work-in-progress commits.

How to Use: Quick Start Guide

1. Click the "Use Template" button to import this workflow into your n8n instance.
2. Configure the GitHub Trigger:
   - Open the "GitHub Push Trigger" node. It will display a unique Webhook URL. Copy this URL.
   - In your GitHub repository, go to Settings > Webhooks > Add webhook.
   - Paste the URL into the Payload URL field.
   - Set the Content type to application/json.
   - Under "Which events would you like to trigger this webhook?", select Just the push event.
   - Click Add webhook.
3. Connect Your Accounts:
   - GitHub: Select your GitHub API credential in the "Create Pull Request" node.
   - JIRA: Select your JIRA API credential in the "Update JIRA Issue Status" node.
4. Customize the JIRA Transition (Important):
   - Open the "Update JIRA Issue Status" node.
   - In the Transition parameter, set the specific status you want to move the issue to (e.g., 'Done', 'Completed', 'In Review'). You can use the ID or the exact name of the transition from your JIRA project's workflow.
5. Activate the Workflow: Save your changes and activate the workflow. You're ready to automate!

Example Commit Message:

```
git commit -m "FF-1196 Implement OAuth login [auto-pr,development,taskcompleted]"
```

Key Requirements to Use Template

- An active n8n instance.
- A GitHub account with repository admin permissions to create webhooks.
- A JIRA Cloud account with permissions to update issues.
- Developers who can follow the specified git commit message format.

Connect with us

- Website: https://www.intuz.com/services
- Email: getstarted@intuz.com
- LinkedIn: https://www.linkedin.com/company/intuz
- Get Started: https://n8n.partnerlinks.io/intuz
- For custom workflow automation, click here: Get Started
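As a rough sketch of the Parse Commit Message step, the Code node might extract the issue key, base branch, and commands like this (the webhook field path and the default branch are assumptions; the actual template's implementation may differ):

```javascript
// Hedged sketch: parse e.g.
//   "FF-1196 Implement OAuth login [auto-pr,development,taskcompleted]"
// GitHub push payloads carry the message at head_commit.message.
const message = $json.body?.head_commit?.message ?? '';

// JIRA issue keys look like PROJECT-123
const issueKey = (message.match(/\b[A-Z][A-Z0-9]+-\d+\b/) || [null])[0];

// Commands live in the trailing [...] block, comma-separated
const cmdBlock = (message.match(/\[([^\]]+)\]\s*$/) || [null, ''])[1];
const commands = cmdBlock.split(',').map(c => c.trim()).filter(Boolean);

// Anything that is not a known command is treated as the base branch
const known = new Set(['auto-pr', 'taskcompleted']);
const baseBranch = commands.find(c => !known.has(c)) || 'development';

return [{
  json: {
    issueKey,
    baseBranch,
    autoPr: commands.includes('auto-pr'),
    taskCompleted: commands.includes('taskcompleted'),
  },
}];
```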
by Pawan
Who's it for?

This template is perfect for educational institutions, coaching centers (like UPSC, GMAT, or specialized technical training), internal corporate knowledge bases, and SaaS companies that need to provide instant, accurate, source-grounded answers based on proprietary documents. It's designed for users who want to leverage Google Gemini's powerful reasoning while ensuring its answers are strictly factual and based only on their verified knowledge repository.

How it works / What it does

This workflow establishes a Retrieval-Augmented Generation (RAG) pipeline to build a secure, fact-based AI agent. It operates in two main phases:

1. Knowledge Ingestion: When a new document (e.g., a PDF, lecture notes, or a policy manual) is uploaded via a form or Google Drive, the Embeddings Google Gemini node converts the content into numerical vectors. These vectors are then stored in a secure MongoDB Atlas Vector Store, creating a private knowledge base.
2. AI Query & Response: A user asks a question via Telegram. The AI Agent uses the question to perform a semantic search on the MongoDB Vector Store, retrieving the most relevant, source-specific passages. It then feeds this retrieved context to the Google Gemini Chat Model to generate a precise, factual answer, which is sent back to the user on Telegram.

This process ensures the agent never "hallucinates" or falls back on general internet knowledge, making the responses accurate and trustworthy.

Requirements

To use this template, you will need the following accounts and credentials:

- n8n account
- Google Gemini API Key: for generating vector embeddings and powering the AI Agent.
- MongoDB Atlas cluster: a free-tier cluster is sufficient, configured with a Vector Search index.
- Telegram bot: a bot created via BotFather and a Chat ID where the bot will listen for and send messages.
- Google Drive credentials (if using the Google Drive ingestion path).

How to set up

1. **Set up MongoDB Atlas:** Create a free cluster and a database. Create a Vector Search index on your collection to enable efficient searching (an example index definition appears at the end of this template).
2. **Configure the ingestion path:** Set up the Webhook trigger for your "On form submission" path or connect your Google Drive credentials. Configure the Embeddings Google Gemini node with your API key. Connect the MongoDB Atlas Vector Store node with your database credentials, collection name, and index name.
3. **Configure the chat path:** Set up the Telegram Trigger with your bot token to listen for incoming messages. Configure the Google Gemini Chat Model with your API key. Connect the MongoDB Atlas Vector Store 1 node as a tool within the AI Agent, making sure it points to the same vector store as the ingestion path.
4. **Final step:** Configure the Send a text message node with your **Telegram Bot Token** and **Chat ID**.

How to customize the workflow

- **Change the knowledge source:** Replace the Google Drive nodes with nodes for Notion, SharePoint, Zendesk, or another document source.
- **Change the chat platform:** Replace the Telegram nodes with a Slack, Discord, or WhatsApp Cloud trigger and response node.
- **Refine the agent's persona:** Open the AI Agent node and edit the System Instruction to give the bot a specific role (e.g., "You are a senior UPSC coach. Answer questions politely and cite sources.").

💡 Example Use Case

A UPSC/JEE/NEET coaching center uploads NCERT summaries and previous-year notes to Google Drive. Students ask questions in the Telegram group, and the bot instantly replies with contextually accurate answers from the uploaded materials. The same agent can generate daily quizzes or concise notes from this curated content automatically.
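For the Vector Search index in setup step 1, a MongoDB Atlas index definition might look roughly like this. The embedding field path is an assumption (match it to the field your vector store node writes), and numDimensions must match your Gemini embedding model (for example, 768 for text-embedding-004):

```json
{
  "fields": [
    {
      "type": "vector",
      "path": "embedding",
      "numDimensions": 768,
      "similarity": "cosine"
    }
  ]
}
```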
by Incrementors
Financial Insight Automation: Market Cap to Telegram via Bright Data

📊 Description

An automated n8n workflow that scrapes financial data from Yahoo Finance using Bright Data, processes market cap information, generates visual charts, and sends comprehensive financial insights directly to Telegram for instant notifications.

🚀 How It Works

This workflow operates through a simple three-zone process:

1. Data Input & Trigger: The user submits a keyword (e.g., "AI", "Crypto", "MSFT") through a form trigger that initiates the financial data collection process.
2. Data Scraping & Processing: The Bright Data API discovers and scrapes comprehensive financial data from Yahoo Finance, including market cap, stock prices, company profiles, and financial metrics.
3. Visualization & Delivery: The system generates interactive market cap charts, saves data to Google Sheets for record-keeping, and sends visual insights to Telegram as PNG images.

⚡ Setup Steps

> ⏱️ Estimated Setup Time: 15-20 minutes

Prerequisites

- Active n8n instance (self-hosted or cloud)
- Bright Data account with Yahoo Finance dataset access
- Google account for Sheets integration
- Telegram bot token and chat ID

Step 1: Import the Workflow

1. Copy the provided JSON workflow code.
2. In n8n: Go to Workflows → + Add workflow → Import from JSON.
3. Paste the JSON content and click Import.

Step 2: Configure Bright Data Integration

1. In n8n: Navigate to Credentials → + Add credential → HTTP Header Auth.
2. Add an Authorization header with the value: Bearer BRIGHT_DATA_API_KEY.
3. Replace BRIGHT_DATA_API_KEY with your actual API key.
4. Test the connection to ensure it works properly.

> Note: The workflow uses dataset ID gd_lmrpz3vxmz972ghd7 for Yahoo Finance data. Ensure you have access to this dataset in your Bright Data dashboard. (A hedged request sketch appears at the end of this template.)

Step 3: Set up Google Sheets Integration

1. Create a Google Sheet: Go to Google Sheets and create a new spreadsheet. Name it "Financial Data Tracker" or similar, then copy the Sheet ID from the URL.
2. Configure Google Sheets credentials: In n8n: Credentials → + Add credential → Google Sheets OAuth2 API. Complete the OAuth setup and test the connection.
3. Update the workflow: Open the "📊 Filtered Output & Save to Sheet" node, replace YOUR_SHEET_ID with your actual Sheet ID, and select your Google Sheets credential.

Step 4: Configure Telegram Bot

1. Create a Telegram bot using @BotFather and get your bot token and chat ID.
2. In n8n: Credentials → + Add credential → Telegram API, then enter your bot token.
3. Update the "📤 Send Chart on Telegram" node with your chat ID: replace YOUR_TELEGRAM_CHAT_ID with your actual chat ID.

Step 5: Test and Activate

1. Test the workflow: Use the form trigger with a test keyword (e.g., "AAPL"). Monitor the execution in n8n, verify that data appears in Google Sheets, and check for chart delivery on Telegram.
2. Activate the workflow: Turn on the workflow using the toggle switch. The form trigger will be accessible via the provided webhook URL.

📋 Key Features

- 🔍 Keyword-Based Discovery: Search companies by keyword, ticker, or industry
- 💰 Comprehensive Financial Data: Market cap, stock prices, earnings, and company profiles
- 📊 Visual Charts: Automatic generation of market cap comparison charts
- 📱 Telegram Integration: Instant delivery of insights to your mobile device
- 💾 Data Storage: Automatic backup to Google Sheets for historical tracking
- ⚡ Real-time Processing: Fast data retrieval and processing with Bright Data

📊 Output Data Points

| Field | Description | Example |
|-------|-------------|---------|
| Company Name | Full company name | "Apple Inc." |
| Stock Ticker | Trading symbol | "AAPL" |
| Market Cap | Total market capitalization | "$2.89T" |
| Current Price | Latest stock price | "$189.25" |
| Exchange | Stock exchange | "NASDAQ" |
| Sector | Business sector | "Technology" |
| PE Ratio | Price-to-earnings ratio | "28.45" |
| 52 Week Range | Annual high and low prices | "$164.08 - $199.62" |

🔧 Troubleshooting

Common Issues

- Bright Data connection failed: Verify your API key is correct and active, check dataset permissions in the Bright Data dashboard, and ensure you have sufficient credits.
- Google Sheets permission denied: Re-authenticate Google Sheets OAuth, verify sheet sharing settings, and check that the Sheet ID is correct.
- Telegram not receiving messages: Verify the bot token and chat ID, check that the bot has been added to the chat, and test the Telegram credentials manually.

Performance Tips

- Use specific keywords for better data accuracy.
- Monitor Bright Data usage to control costs.
- Set up error handling for failed requests.
- Consider rate limiting for high-volume usage.

🎯 Use Cases

- **Investment Research:** Quick financial analysis of companies and sectors
- **Market Monitoring:** Track market cap changes and stock performance
- **Competitive Analysis:** Compare financial metrics across companies
- **Portfolio Management:** Monitor holdings and potential investments
- **Financial Reporting:** Generate automated financial insights for teams

🔗 Additional Resources

- n8n Documentation
- Bright Data Datasets
- Google Sheets API
- Telegram Bot API

For any questions or support, please contact: info@incrementors.com or fill out this form: https://www.incrementors.com/contact-us/
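For reference, the underlying Bright Data trigger request has roughly this shape. The endpoint path and the discover-by-keyword parameters are assumptions based on Bright Data's dataset API; verify them against your Bright Data dashboard and documentation before relying on this:

```bash
# Hedged sketch: triggering the Yahoo Finance dataset collection by keyword.
# Endpoint and query parameters are assumptions - confirm in Bright Data's docs.
curl -X POST \
  "https://api.brightdata.com/datasets/v3/trigger?dataset_id=gd_lmrpz3vxmz972ghd7&type=discover_new&discover_by=keyword" \
  -H "Authorization: Bearer BRIGHT_DATA_API_KEY" \
  -H "Content-Type: application/json" \
  -d '[{"keyword": "AAPL"}]'
```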
by Rahi
🛠️ Workflow: Jotform → HubSpot Company + Task Automation

Automatically create or update HubSpot companies and generate follow-up tasks whenever a Jotform is submitted. All logs are stored in Google Sheets for traceability, transparency, and debugging.

✅ Use Cases

- Capture marketing queries from your website's Jotform form and immediately create tasks for your sales or SDR team.
- Enrich HubSpot companies with submitted domains, company names, and contact data.
- Automatically assign tasks to owners and keep all form submissions logged and auditable.
- Avoid manual handoffs: full automation from form submission to CRM.

🔍 How It Works (Step-by-Step)

1. Jotform Trigger: The workflow starts when a new submission is received via the Jotform webhook. Captured fields include name, email, LinkedIn profile, company name, marketing budget, domain, and any specific query.
2. Create or Update Company in HubSpot + Format Data: The "Create Company" node ensures the submitted company is either created in HubSpot or updated if it already exists. A Formatter (Function) node standardizes the data (names, email, LinkedIn URL, domain, marketing budget, and query text), composes a task title, generates a follow-up timestamp, and dynamically assigns an owner.
3. Loop & HTTP Request – Create HubSpot Task: The workflow loops through each formatted item. A Wait node prevents rate-limit issues. It then sends an HTTP POST request to HubSpot's Tasks API, creating a task with a subject and body that include the submission details; a task status, priority, and type; and the assigned owner and associated company (a sample request body appears at the end of this template).
4. Loop & HTTP Request – Set Company Domain: After the tasks are created, another loop updates each HubSpot company record with the submitted domain. This ensures all HubSpot companies have proper website data for future enrichment.
5. Storing Logs (Google Sheets): All processed submissions, responses, errors, and metadata are appended or updated in a Google Sheets document. This provides a complete audit trail, ideal for debugging, reporting, and performance monitoring.

🧩 Node Structure Overview

| Step | Node | Description |
|------|------|--------------|
| 1️⃣ | Jotform Trigger | Receives form submission data |
| 2️⃣ | HubSpot Create Company | Ensures company record exists |
| 3️⃣ | Formatter / Function Node | Cleans & structures data, assigns owner, generates task fields |
| 4️⃣ | Wait / Delay Node | Controls API call frequency |
| 5️⃣ | HTTP Request (Create Task) | Pushes task to HubSpot |
| 6️⃣ | HTTP Request (Update Domain) | Updates company domain in HubSpot |
| 7️⃣ | Google Sheets Node | Logs inputs, outputs, and status |

📋 Requirements & Setup

- 🔑 HubSpot Private App Token with permissions to create companies, create tasks, and update records
- 🌐 Jotform webhook URL pointing to this workflow
- 📗 Google Sheets credentials (OAuth or service account) with write access
- ✅ The HubSpot app must have the crm.objects.companies.write and crm.objects.tasks.write scopes
- ⚠️ Add retry or error-handling branches for failed API calls

⚙️ Customization Tips & Variations

- **Add contact association:** Modify the payload to also link the task with a HubSpot contact (via email) so it appears in both company and contact timelines.
- **Use fallback values:** In the Formatter node, provide defaults like "Unknown Company" or "No query provided."
- **Dynamic owner assignment:** Replace hash-based assignment with round-robin or territory logic.
- **Conditional task creation:** Add logic to create tasks only when certain conditions are met (e.g., budget > 0).
- **Error branches:** Capture failed HTTP responses and send Slack/email alerts.
- **Extended logs:** Add response codes, errors, and retry counts to your Google Sheet for more transparency.

🎯 Benefits & Why You'd Use This

- ⚡ Speed & Automation: eliminate manual data entry into HubSpot
- 📊 Data Consistency: submissions are clean, enriched, and traceable
- 👀 Transparency: every action is logged for full visibility
- 🌍 Scalability: handle hundreds of submissions effortlessly
- 🔄 Flexibility: adaptable to other use cases (support tickets, surveys, partnerships, etc.)

✨ Example Use Case

A marketing form on your website captures partnership or franchise inquiries. This workflow instantly creates a HubSpot company, logs the inquiry as a task, assigns it to a regional manager, and saves a record in Google Sheets, all within seconds.

Tags: HubSpot Jotform CRM GoogleSheets Automation LeadManagement
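As a reference for the Create Task step, the HTTP Request body sent to HubSpot's Tasks API (POST /crm/v3/objects/tasks) might look roughly like this. The property names are HubSpot's standard task properties, but the example values and the company association type ID are assumptions to verify against HubSpot's CRM API documentation:

```json
{
  "properties": {
    "hs_task_subject": "Follow up: Acme Inc. marketing inquiry",
    "hs_task_body": "Budget: $5,000. Query: Interested in partnership options.",
    "hs_timestamp": "2024-05-14T09:00:00Z",
    "hs_task_status": "NOT_STARTED",
    "hs_task_priority": "HIGH",
    "hs_task_type": "TODO",
    "hubspot_owner_id": "123456"
  },
  "associations": [
    {
      "to": { "id": "9876543210" },
      "types": [
        { "associationCategory": "HUBSPOT_DEFINED", "associationTypeId": 192 }
      ]
    }
  ]
}
```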
by A Z
⚡ Quick Setup

1. Import this workflow into your n8n instance.
2. Add your Apify, Google Sheets, and Firecrawl credentials.
3. Activate the workflow to start your automated lead enrichment system.
4. Copy the webhook URL from the MCP trigger node.
5. Connect AI agents using the MCP URL.

🔧 How it Works

This solution combines two powerful workflows to deliver fully enriched, AI-ready business leads from Google Maps:

- Apify Google Maps Scraper node: Collects business data and, if enabled, enriches each lead with contact details and social profiles.
- Leads missing enrichment: Any leads without contact or social info are automatically saved to a Google Sheet.
- Firecrawl & Code node workflow: A second workflow monitors the Google Sheet, crawls each business's website using Firecrawl, and extracts additional social media profiles or contact info using a Code node (a sketch of this extraction logic appears at the end of this template).
- Personalization logic: AI-powered nodes generate tailored outreach content for each enriched lead.
- Native integration: The entire process is exposed as an MCP-compatible interface, returning enriched and personalized lead data directly to the AI agent.

📋 Available Operations

- Business Search: Find businesses on Google Maps by location, category, or keyword.
- Lead Enrichment: Automatically append contact details, social profiles, and other business info using Apify and Firecrawl.
- Personalized Outreach Generation: Create custom messages or emails for each lead.
- Batch Processing: Handle multiple leads in a single request.
- Status & Error Reporting: Get real-time feedback on processing, enrichment, and crawling.

🤖 AI Integration

Parameter handling: AI agents automatically provide values for:
- Search queries (location, keywords, categories)
- Enrichment options (contact, social, etc.)
- Personalization variables (name, business type, etc.)

Response format: Returns fully enriched lead data and personalized outreach content in a structured format.
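As a rough sketch of the Code-node extraction mentioned above, the social profile and contact parsing might look like this (the input field names coming from Firecrawl are assumptions; the actual template's logic may differ):

```javascript
// Hedged sketch: pull social links and emails out of crawled page content.
// Assumes Firecrawl's output lands in a markdown/content field on the item.
const content = $json.markdown ?? $json.content ?? '';

// Links to the major social platforms
const socialPattern =
  /https?:\/\/(?:www\.)?(?:facebook|instagram|linkedin|twitter|x|tiktok|youtube)\.com\/[\w\-./?=%]+/gi;
const socialLinks = [...new Set(content.match(socialPattern) || [])];

// Email addresses for contact enrichment
const emailPattern = /[\w.+-]+@[\w-]+\.[\w.-]+/g;
const emails = [...new Set(content.match(emailPattern) || [])];

return [{ json: { ...$json, socialLinks, emails } }];
```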
by Mohamed Salama
Let AI agents communicate with your Bubble app automatically. This workflow connects directly to your Bubble Data API and is designed for teams building AI tools or copilots that need seamless access to Bubble backend data via natural language queries.

How it works

1. Triggered via a webhook from an AI agent using MCP (Model Context Protocol).
2. The agent selects the appropriate data tool (e.g., projects, users, bookings) based on user intent.
3. The workflow queries your Bubble database and returns the result (a sample Data API request appears at the end of this template).

Ideal for integrating with ChatGPT, n8n AI Agents, assistants, or autonomous workflows that need real-time access to app data.

Set up steps

1. Enable access to your Bubble Data or backend APIs (as needed).
2. Create a Bubble admin token.
3. Add your Bubble node(s) to your n8n workflow.
4. Add your Bubble admin token.
5. Configure your Bubble node(s).
6. Copy the generated webhook URL from the MCP Server Trigger node and register it with your AI tool (e.g., a LangChain tool loader).
7. (Optional) Adjust the filters in the "Get an Object Details" node to match your dataset needs.

Once connected, your AI agents can automatically retrieve context-aware data from your Bubble app, with no manual lookups required.
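Under the hood, the Bubble nodes perform Data API calls of roughly this shape. The app URL and the project data type are placeholders; the /api/1.1/obj/<type> path is Bubble's standard Data API format:

```bash
# Hedged sketch: listing "project" things via Bubble's Data API.
# Replace yourapp and the data type with your own; constraints are optional.
curl "https://yourapp.bubbleapps.io/api/1.1/obj/project?limit=10" \
  -H "Authorization: Bearer YOUR_BUBBLE_ADMIN_TOKEN"
```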
by Elay Guez
Enrich Monday.com leads with AI-powered company research and personalized email drafts using Explorium MCP and GPT-4.1

AI-Powered Lead Enrichment & Email Writer for Monday.com

🚀 Overview

Stop losing deals to slow response times! Transform your inbound leads into qualified opportunities with this intelligent workflow that automates lead enrichment and personalized outreach. When a new lead drops into your Monday.com board, magic happens:

- 🔍 Deep-dives into company data using Explorium MCP's advanced intelligence engine
- 🧠 Analyzes business priorities, pain points, and growth opportunities
- 💡 Identifies specific AI automation use cases tailored to each company
- ✉️ Crafts hyper-personalized email drafts with GPT-4.1 (under 120 words!)
- 📊 Enriches your CRM with actionable insights and AI solution recommendations
- 📧 Saves draft emails directly to Gmail for your review
- 🔄 Updates Monday.com automatically with all the juicy insights

Perfect for sales teams, growth marketers, and BizDev pros who want to turn every lead into a conversation starter that actually converts!

👥 Who's it for?

- B2B sales teams drowning in inbound leads
- Growth teams needing lightning-fast lead qualification
- BizDev professionals seeking that personal touch at scale
- Companies rocking Monday.com as their CRM

⚡ How it works

1. Webhook triggers when a fresh lead hits Monday.com (see the challenge-handling sketch at the end of this template)
2. Company Researcher agent unleashes Explorium MCP for company intel
3. Email Writer agent crafts personalized outreach that doesn't sound like a robot
4. CRM Enrichment agent adds golden nuggets of AI recommendations
5. Gmail integration parks the draft in your inbox
6. Monday.com updates with all the enriched goodness

🛠️ Setup Instructions

Time to magic: 20 minutes

You'll need:
- OpenAI API Key (for GPT-4.1)
- Explorium MCP API Key
- Monday.com API Token
- Gmail OAuth credentials
- Monday.com webhook setup

Step-by-step:
1. Import this template into your n8n instance
2. Hook up the Monday.com webhook via the "Respond to Webhook" node
3. Deactivate that "Respond to Webhook" node (important!)
4. Plug in your API credentials
5. Customize the agent prompts with YOUR company's secret sauce
6. Match your Monday.com columns to the workflow
7. Test drive with a dummy lead
8. Hit activate and watch the magic! ✨

📋 Requirements

- Monday.com board with these columns: Company Name, Contact Name, Email, Comments
- Explorium MCP access (for that company intelligence gold)
- OpenAI API (GPT-4.1 model ready)
- Gmail account (where drafts go to shine)

🎨 Make it yours

- Tweak the email tone: formal, casual, or somewhere in between
- Adjust research depth in the Company Researcher
- Add your unique value props to the agent prompts
- Connect more data sources for extra enrichment
- Hook up other CRMs (HubSpot, Salesforce, Pipedrive)
- Add Slack alerts for hot leads 🔥

💪 Why this rocks

Real talk: manual lead research is SO 2023. While your competitors are still googling companies, you're already in their inbox with an email that mentions their latest funding round, understands their tech stack, and offers solutions to problems they didn't even know you could solve.

This isn't just another "Hi {{first_name}}" template. It's AI that actually gets context, writes like a human, and makes your prospects think "How did they know exactly what we need?"

Results you can expect:
- Faster lead response time
- Higher email open rates
- Actually useful CRM data (not just "interested in our product")
- Your sales team thanking you (seriously)

Built with ❤️ by: Elay Guez

Pro tip: The more context you feed the AI agents about your business, the scarier-good the personalization gets. Don't hold back on those System Message customizations!
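A note on setup steps 2-3: when you register a webhook, Monday.com first sends a verification request containing a challenge token and expects the same token echoed back, which is presumably what the "Respond to Webhook" step handles before you deactivate it. A minimal sketch of that logic, assuming a Code node sits behind the webhook:

```javascript
// Hedged sketch: echoing Monday.com's webhook verification challenge.
// Monday sends { "challenge": "<token>" } on registration and expects the
// identical JSON back; real lead events arrive without a challenge field.
const body = $json.body ?? $json;

if (body.challenge) {
  // Verification request: echo the challenge so Monday accepts the webhook
  return [{ json: { challenge: body.challenge } }];
}

// Normal event: pass the lead data through to the enrichment agents
return [{ json: body.event ?? body }];
```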
by DuyTran
Description

Overview

This workflow generates automated revenue and expense comparison reports from a structured Google Sheet. It lets users compare financial data across the current period, last month, and last year, then uses an AI agent to analyze and summarize the results for business reporting.

Prerequisites

- A connected Google Sheets OAuth2 credential.
- A valid DeepSeek AI API key (or replace it with another chat model).
- A sub-workflow (child workflow) that handles the processing logic.
- Properly structured Google Sheets data (see below).

Required Google Sheet Structure

Column headers must include at least: Date, Amount, Type.

Setup Steps

1. Import the workflow into your n8n instance.
2. Connect your Google Sheets and DeepSeek API credentials.
3. Update:
   - The Sheet ID and tab name (embedded in the "Get revenual from google sheet" node).
   - The custom sub-workflow ID (in the Call n8n Workflow Tool node).
4. Optionally configure the chatbot webhook in the "When chat message received" node.

What the Workflow Does

1. Accepts date inputs via the AI chat interface (Chat Trigger + AI Agent).
2. Fetches raw transaction data from Google Sheets.
3. Segments and pivots revenue by classification for the current period, last month, and last year (a sketch of this pivot logic appears at the end of this template).
4. Aggregates totals and applies custom titles for comparison.
5. Merges all summaries into a final unified JSON report.

Customization Options

- Replace DeepSeek with OpenAI or other LLMs.
- Change the date fields or cycle comparisons (e.g., quarterly, weekly).
- Add more AI analysis steps such as sentiment scoring or forecasting.
- Modify the pivot logic to suit specific KPI tags or labels.

Troubleshooting Tips

- If the Google Sheets fetch fails: ensure the document is shared with your n8n Google credential.
- If parsing errors occur: verify that all dates follow the expected format.
- The sub-workflow must be active and configured to accept the correct inputs (6 dates).

SEO Keywords (hidden or implied): Google Sheets report, AI financial report, compare revenue by month, expense analysis automation, chatbot n8n report generator, n8n Google Sheet integration
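As a rough sketch of the pivot step described above, a Code node might group transactions by Type and total the Amounts like this (column names follow the required sheet structure; the merge across the three comparison periods is omitted, and this is an assumption about how the sub-workflow works):

```javascript
// Hedged sketch: pivot transactions by Type and sum Amounts for one period.
const rows = $input.all().map(i => i.json);

function pivot(rows, title) {
  const totals = {};
  for (const r of rows) {
    const type = r.Type || 'Uncategorized';
    totals[type] = (totals[type] || 0) + Number(r.Amount || 0);
  }
  const grandTotal = Object.values(totals).reduce((a, b) => a + b, 0);
  return { title, totals, grandTotal };
}

// In the real workflow, three date ranges come from the 6 date inputs;
// a single pivot here illustrates the shape of each summary.
return [{ json: pivot(rows, 'Current period') }];
```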
by Le Nguyen
This template implements a recursive web crawler inside n8n. Starting from a given URL, it crawls linked pages up to a maximum depth (default: 3), extracts text and links, and returns the collected content via webhook.

🚀 How It Works

1) Webhook Trigger

Accepts a JSON body with a url field. Example payload:

```json
{ "url": "https://example.com" }
```

2) Initialization

Sets crawl parameters: url, domain, maxDepth = 3, and depth = 0. Initializes global static data (pending, visited, queued, pages).

3) Recursive Crawling

- Fetches each page (HTTP Request).
- Extracts body text and links (HTML node).
- Cleans and deduplicates links.
- Filters out:
  - External domains (only same-site is followed)
  - Anchors (#), mailto/tel/javascript links
  - Non-HTML files (.pdf, .docx, .xlsx, .pptx)

4) Depth Control & Queue

- Tracks visited URLs
- Stops at maxDepth to prevent infinite loops
- Uses SplitInBatches to loop the queue

5) Data Collection

- Saves each crawled page (url, depth, content) into pages[]
- When pending = 0, combines results

6) Output

Responds via the Webhook node with:
- combinedContent (all pages concatenated)
- pages[] (array of individual results)

Large results are chunked when exceeding ~12,000 characters.

🛠️ Setup Instructions

1) Import Template

Load it from n8n Community Templates.

2) Configure Webhook

- Open the Webhook node
- Copy the Test URL (development) or Production URL (after deploy)
- You'll POST crawl requests to this endpoint

3) Run a Test

Send a POST with JSON:

```bash
curl -X POST https://<your-n8n>/webhook/<id> \
  -H "Content-Type: application/json" \
  -d '{"url": "https://example.com"}'
```

4) View Response

The crawler returns a JSON object containing combinedContent and pages[].

⚙️ Configuration

- **maxDepth**: Default: 3. Adjust in the Init Crawl Params (Set) node.
- **Timeouts**: The HTTP Request node timeout is 5 seconds per request; increase if needed.
- **Filtering Rules**:
  - Only same-domain links are followed (apex and www treated as same-site)
  - Skips anchors, mailto:, tel:, javascript:
  - Skips document links (.pdf, .docx, .xlsx, .pptx)
  - You can tweak the regex and logic in the Queue & Dedup Links (Code) node (a sketch of this filtering logic appears at the end of this template)

📌 Limitations

- No JavaScript rendering (static HTML only)
- No authentication/cookies/session handling
- Large sites can be slow or hit timeouts; chunking mitigates response size

✅ Example Use Cases

- Extract text across your site for AI ingestion / embeddings
- SEO/content audits and internal link checks
- Build a lightweight page corpus for downstream processing in n8n

⏱️ Estimated Setup Time

~10 minutes (import → set webhook → test request)
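For orientation, the filtering logic in the Queue & Dedup Links (Code) node might look roughly like this. This is a sketch under the assumption that the node receives the current page URL and its raw hrefs; the actual template's field names and regex may differ:

```javascript
// Hedged sketch of the same-site link filter and dedup described above.
const base = new URL($json.url);                 // page the links came from
const sameSite = h =>
  h.replace(/^www\./, '') === base.hostname.replace(/^www\./, '');
const skippedExt = /\.(pdf|docx|xlsx|pptx)(\?|#|$)/i;

const links = $json.links || [];                 // raw hrefs from the HTML node
const keep = new Set();

for (const href of links) {
  if (/^(#|mailto:|tel:|javascript:)/i.test(href)) continue; // non-page links
  let u;
  try { u = new URL(href, base); } catch { continue; }       // skip malformed URLs
  if (u.protocol !== 'http:' && u.protocol !== 'https:') continue;
  if (!sameSite(u.hostname)) continue;           // drop external domains
  if (skippedExt.test(u.pathname)) continue;     // skip document files
  u.hash = '';                                   // normalize away anchors
  keep.add(u.toString());                        // Set deduplicates for free
}

return [{ json: { ...$json, queued: [...keep] } }];
```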