by Muhammad Zeeshan Ahmad
Platform: n8n (Telegram Bot Integration)
Purpose: Let users fetch top meme coin prices in real time using a simple /memecoin Telegram command

How It Works (Logic Breakdown)
This flow listens for a Telegram command and fetches data from the CoinGecko API to respond with live meme coin prices.

1. Telegram Trigger Node
Listens for incoming Telegram messages from users. Activated when a message is sent in a Telegram chat connected to the bot. Passes the raw message (e.g., /memecoin) to the next node.

2. IF Node: Check if Message is /memecoin
Condition: {{$json.message.text}} === "/memecoin"
If true, continue to fetch data from CoinGecko. If false, nothing happens.

3. HTTP Request: Fetch Meme Coins from CoinGecko
API: https://api.coingecko.com/api/v3/coins/markets?...category=meme-token
Fetches the top 5 meme tokens by market cap. Data includes: name, symbol, current price (USD), and coin ID (for URL linking).

4. Function Node: Format the Message
Parses the JSON response from CoinGecko and builds a clean message like (see the sketch below):

Dogecoin (DOGE)
Price: $0.123
More: https://www.coingecko.com/en/coins/dogecoin

Loops through the top 5 meme coins and adds line breaks.

5. Telegram Send Node: Reply to User
Sends the formatted message to the original chat. Uses chat_id from the trigger to ensure the correct user receives it.

Sample User Flow
User types /memecoin in the Telegram bot, the bot fetches meme coin prices, and the bot replies with live prices + links.
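Below is a minimal standalone sketch of what the Function node in step 4 might look like. The field names (`id`, `name`, `symbol`, `current_price`) come from CoinGecko's `/coins/markets` response; the sample data itself is illustrative.

```javascript
// Sketch of the message-formatting logic, assuming the standard
// CoinGecko /coins/markets response shape.
const coins = [
  { id: "dogecoin", name: "Dogecoin", symbol: "doge", current_price: 0.123 },
  { id: "shiba-inu", name: "Shiba Inu", symbol: "shib", current_price: 0.000013 },
];

function formatMemecoinMessage(coins) {
  return coins
    .slice(0, 5) // top 5 by market cap (the API returns them sorted)
    .map(
      (c) =>
        `${c.name} (${c.symbol.toUpperCase()})\n` +
        `Price: $${c.current_price}\n` +
        `More: https://www.coingecko.com/en/coins/${c.id}`
    )
    .join("\n\n"); // blank line between coins
}

console.log(formatMemecoinMessage(coins));
```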
by Charles
Modern AI systems are powerful but pose privacy risks when handling sensitive data. Organizations need AI capabilities while ensuring:
- Sensitive data never leaves secure environments
- Compliance with regulations (GDPR, HIPAA, PCI, SOX)
- Real-time decision making about data sensitivity
- Comprehensive audit trails for regulatory review

The Concept: Intelligent Data Classification + Smart Routing
The goal of this concept is to lay the foundations for safe, compliant use of LLMs in agentic workflows by automatically detecting sensitive data, applying sanitization rules, and intelligently routing requests through secure processing channels. The workflow analyzes the user's chat or webhook input and attempts to detect PII using the Enhanced PII Pattern Detector. If PII is detected, the workflow processes that input through a series of compliance, auditing, and security steps that log and sanitize the request before any LLM is called.

Why Multi-Tier Routing?
Traditional systems use binary decisions (sensitive/not sensitive). The 3-tier approach provides:
- Granular security: critical PII gets maximum protection
- Performance optimization: clean data gets full cloud capabilities
- Cost efficiency: expensive local processing only when needed
- User experience: maintains conversational flow across security levels

Why Context-Aware Detection?
Regex patterns alone miss contextual sensitivity. This approach:
- Catches intent: a "bank account" discussion is sensitive even without account numbers
- Reduces false negatives: medical discussions stay secure even without explicit medical IDs
- Offers proactive protection: identifies sensitive contexts before PII is shared
- Aligns with compliance: matches how regulations actually define sensitive data

Why Risk Scoring vs. Binary Classification?
Binary PII detection creates artificial boundaries. Risk scoring (see the sketch after this description) provides:
- Nuanced decisions: multiple low-risk patterns might aggregate to high risk
- Adaptive thresholds: organizations can adjust sensitivity to their needs
- Better UX: users aren't unnecessarily restricted in low-risk scenarios
- Audit transparency: clear reasoning for every routing decision

Why Comprehensive Monitoring?
Privacy systems require trust and verification:
- Compliance proof: audit trails demonstrate regulatory compliance
- Performance optimization: identify bottlenecks and improve efficiency
- Security validation: ensure no sensitive data leakage occurs
- Operational insights: understand usage patterns and system health

How to Install
All you need for this workflow are credentials for your LLM providers, such as Ollama, OpenRouter, OpenAI, or Anthropic. The workflow is customizable and lets you choose the best LLM and storage/memory solutions for your specific use case.
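To make the risk-scoring idea concrete, here is a minimal standalone sketch of how a Code node could aggregate pattern and context matches into a score and a routing tier. All pattern names, weights, and tier thresholds are illustrative assumptions, not the template's actual configuration.

```javascript
// Illustrative only: patterns, weights, and thresholds are assumptions.
const PII_PATTERNS = [
  { name: "ssn", regex: /\b\d{3}-\d{2}-\d{4}\b/, weight: 0.9 },
  { name: "credit_card", regex: /\b(?:\d[ -]?){13,16}\b/, weight: 0.8 },
  { name: "email", regex: /\b[\w.+-]+@[\w-]+\.[\w.]+\b/, weight: 0.3 },
];

// Context keywords catch sensitive intent even when no explicit PII appears.
const CONTEXT_KEYWORDS = [
  { name: "banking_context", term: /bank account|routing number/i, weight: 0.4 },
  { name: "medical_context", term: /diagnosis|prescription|medical record/i, weight: 0.5 },
];

function scoreAndRoute(text) {
  let score = 0;
  const hits = [];
  for (const p of PII_PATTERNS) {
    if (p.regex.test(text)) { score += p.weight; hits.push(p.name); }
  }
  for (const c of CONTEXT_KEYWORDS) {
    if (c.term.test(text)) { score += c.weight; hits.push(c.name); }
  }
  // Three tiers instead of a binary flag: several low-risk matches can
  // aggregate into a stricter tier.
  const tier = score >= 0.8 ? "local-only" : score >= 0.3 ? "sanitized-cloud" : "cloud";
  return { score: Math.min(score, 1), tier, hits }; // hits feed the audit trail
}

console.log(scoreAndRoute("My SSN is 123-45-6789 and I need my bank account"));
// -> { score: 1, tier: "local-only", hits: ["ssn", "banking_context"] }
```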
by Yulia
Free template for voice & text messages with short-term memory

This n8n workflow template is a blueprint for an AI Telegram bot that processes both voice and text messages. Ready to use with minimal setup. The bot remembers the last several messages (10 by default; see the memory sketch at the end of this description), understands commands, and provides responses in HTML. You can easily swap GPT-4 and Whisper for other language and speech-to-text models to suit your needs.

Core Features
- Text: send or forward messages
- Voice: transcription via Whisper
- Extend this template by adding LangChain tools

Requirements
- Telegram Bot API
- OpenAI API (for GPT-4 and Whisper)

New to Telegram bots? Check our step-by-step guide on creating your first bot and setting up OpenAI access.

Use Cases
- Personal AI assistant
- Customer support automation
- Knowledge base interface
- Integration hub for services that you use: connect to any API via the HTTP Request Tool, or trigger other n8n workflows with the Workflow Tool
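As a rough illustration of the short-term memory behavior, here is a standalone sketch of a "last N messages" window kept per chat. The data shape is an assumption; the template itself uses n8n's memory nodes rather than hand-rolled code.

```javascript
// Minimal sketch of the per-chat message window the bot maintains.
const WINDOW_SIZE = 10;
const memory = new Map(); // chatId -> array of { role, content }

function remember(chatId, role, content) {
  const history = memory.get(chatId) ?? [];
  history.push({ role, content });
  // Keep only the most recent WINDOW_SIZE turns.
  memory.set(chatId, history.slice(-WINDOW_SIZE));
  return memory.get(chatId);
}

remember(42, "user", "Hello");
remember(42, "assistant", "Hi! How can I help?");
console.log(remember(42, "user", "What did I just say?"));
```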
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically analyzes purchase trends and consumer behavior patterns to identify market opportunities and optimize business strategies. It saves you time by eliminating the need to manually analyze sales data and provides insights into buying patterns, seasonal trends, and customer preferences.

Overview
This workflow automatically scrapes e-commerce platforms, marketplace data, and sales analytics to extract purchase trends, product popularity, and consumer behavior insights. It uses Bright Data to access sales data and AI to intelligently analyze purchasing patterns, seasonal trends, and market opportunities.

Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping e-commerce and marketplace platforms without being blocked
- **OpenAI**: AI agent for intelligent purchase trend analysis and forecasting
- **Google Sheets**: For storing purchase trend data and analysis results

How to Install
1. Import the workflow: download the .json file and import it into your n8n instance
2. Configure Bright Data: add your Bright Data credentials to the MCP Client node
3. Set up OpenAI: configure your OpenAI API credentials
4. Configure Google Sheets: connect your Google Sheets account and set up your trend analysis spreadsheet
5. Customize: define target marketplaces and trend analysis parameters

Use Cases
- **E-commerce Strategy**: Identify trending products and market opportunities
- **Product Development**: Understand consumer preferences and demand patterns
- **Marketing Planning**: Optimize campaigns based on seasonal purchase trends
- **Business Intelligence**: Make data-driven decisions using market trend insights

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Bright Data: https://get.brightdata.com/1tndi4600b25 (using this link supports my free workflows with a small commission)

#n8n #automation #purchasetrends #marketanalysis #brightdata #webscraping #ecommerce #n8nworkflow #workflow #nocode #trendanalysis #consumerinsights #marketresearch #salesanalytics #businessintelligence #markettrends #customerinsights #ecommerceanalysis #salesdata #marketforecasting #consumerdata #purchaseanalysis #retailanalytics #marketinsights #demandforecasting #salestrends #consumertrends #marketintelligence #buyingpatterns #marketdemand
by n8n Team
This workflow connects Telegram bots with LangChain nodes in n8n. The main AI Agent node is configured as a Conversation Agent. It has a custom System Prompt which explains the reply formatting and provides some additional instructions.

The AI Agent has several connections:
- An OpenAI GPT-4 model is called to generate the replies
- Window Buffer Memory stores the conversation history for each user separately
- An additional custom n8n Workflow tool (Dall-E 3 Tool)

The AI Agent uses the Dall-E 3 Tool when the user requests image generation. In the lower part of the workflow, a series of nodes calls the Dall-E 3 model with the user's Telegram ID and a prompt for a new image. Once the image is ready, it is sent back to the user.

Finally, an extra Telegram node masks HTML syntax for improved stability in case the AI Agent replies in an unsupported format (a sketch of this masking step follows below).
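Here is a rough standalone sketch of what the HTML-masking step could do: escape everything, then restore only a whitelist of simple tags that Telegram's HTML parse mode accepts. The whitelist and approach are illustrative assumptions, not the template's exact logic.

```javascript
// Escape all HTML, then un-escape a small whitelist of simple tags.
// Telegram's HTML parse mode rejects messages containing unknown tags,
// so anything outside the whitelist must arrive escaped.
function maskForTelegram(html) {
  let safe = html
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
  for (const tag of ["b", "i", "u", "s", "code", "pre"]) {
    // Restore only bare opening/closing tags like <b> and </b>.
    safe = safe.replace(new RegExp(`&lt;(/?)${tag}&gt;`, "gi"), `<$1${tag}>`);
  }
  return safe;
}

console.log(maskForTelegram("<b>bold</b> and <script>alert(1)</script>"));
// -> <b>bold</b> and &lt;script&gt;alert(1)&lt;/script&gt;
```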
by Oneclick AI Squad
This automated n8n workflow qualifies B2B leads via voice calls using the VAPI API and integrates the collected data into Google Sheets. It triggers when a new lead's phone number is added, streamlining lead qualification and data capture.

What is VAPI?
VAPI is an API service that enables voice call automation, used here to qualify leads by capturing structured data through interactive calls.

Good to Know
- VAPI API calls may incur costs based on usage; check VAPI pricing for details.
- Ensure Google Sheets access is properly authorized to avoid data issues.
- Use credential fields for the HTTP Request node's Bearer token instead of hardcoding it.
- Use a placeholder Google Sheet document ID (e.g., "your-sheet-id-placeholder") to avoid leaking private data.

How It Works
1. Detect when a new phone number is added for a lead using the New Lead Captured node.
2. Use the Receive Lead Details from VAPI node to capture structured data (name, company, challenges) via a POST request.
3. Trigger an outbound VAPI call to qualify the lead with the Initiate Voice Call (VAPI) node (a sketch of this call follows below).
4. Store the collected data in a Google Sheet using the Save Qualified Lead to CRM Sheet node.
5. Send a success response back to VAPI with the Send Call Data Acknowledgement node.

How to Use
1. Import the workflow into n8n.
2. Configure VAPI API credentials in the HTTP Request node using credential fields.
3. Set up Google Sheets API access and authorize the app.
4. Create a Google Sheet with the following columns: Name (text), Company (text), Challenges (text), Date (date).
5. Test with a sample lead phone number to verify call initiation and data storage.
6. Adjust the workflow as needed and retest.

Requirements
- VAPI API credentials
- Google Sheets API access

Customizing This Workflow
Modify the Receive Lead Details from VAPI node to capture additional lead fields or adjust call scripts for specific industries.
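For orientation, here is a hedged sketch of the outbound-call request the Initiate Voice Call (VAPI) node might send. The endpoint and body fields follow VAPI's outbound-call API as best understood here; verify against current VAPI docs. `your-assistant-id` and `your-phone-number-id` are placeholders, and the token should live in an n8n credential, not in code.

```javascript
// Hedged sketch of starting a VAPI qualification call (Node 18+ fetch).
async function startQualificationCall(leadPhone) {
  const res = await fetch("https://api.vapi.ai/call", {
    method: "POST",
    headers: {
      // Keep the token in an n8n credential field, never hardcoded.
      Authorization: `Bearer ${process.env.VAPI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      assistantId: "your-assistant-id",      // placeholder
      phoneNumberId: "your-phone-number-id", // placeholder
      customer: { number: leadPhone },       // the new lead's phone number
    }),
  });
  return res.json();
}
```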
by Yaron Been
This workflow provides automated access to the Black Forest Labs Flux Krea Dev AI model through the Replicate API. It saves you time by eliminating the need to manually interact with AI models and provides a seamless integration for image generation tasks within your n8n automation workflows.

Overview
This workflow automatically handles the complete image generation process using the Black Forest Labs Flux Krea Dev model. It manages API authentication, parameter configuration, request processing, and result retrieval with built-in error handling and retry logic for reliable automation. A sketch of the underlying Replicate call appears after this description.

Model Description: An opinionated text-to-image model from Black Forest Labs in collaboration with Krea that excels in photorealism. Creates images that avoid the oversaturated "AI look".

Key Capabilities
- High-quality image generation from text prompts
- Advanced AI-powered visual content creation
- Customizable image parameters and styles
- Text-to-image transformation capabilities

Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Replicate API**: Access to the black-forest-labs/flux-krea-dev AI model
- **Black Forest Labs Flux Krea Dev**: The core AI model for image generation
- **Built-in Error Handling**: Automatic retry logic and comprehensive error management

How to Install
1. Import the workflow: download the .json file and import it into your n8n instance
2. Configure Replicate API: add your Replicate API token to the 'Set API Token' node
3. Customize parameters: adjust the model parameters in the 'Set Image Parameters' node
4. Test the workflow: run the workflow with your desired inputs
5. Integrate: connect this workflow to your existing automation pipelines

Use Cases
- **Content Creation**: Generate unique images for blogs, social media, and marketing materials
- **Design Prototyping**: Create visual concepts and mockups for design projects
- **Art & Creativity**: Produce artistic images for personal or commercial use
- **Marketing Materials**: Generate eye-catching visuals for campaigns and advertisements

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Replicate API: https://replicate.com (sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #imagegeneration #aiart #texttoimage #visualcontent #aiimages #generativeart #flux #machinelearning #artificialintelligence #aitools #automation #digitalart #contentcreation #productivity #innovation
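As a rough illustration of what the workflow wraps, here is a sketch of a direct Replicate call to this model. The model-scoped predictions endpoint and the `Prefer: wait` header follow Replicate's HTTP API; any input fields beyond `prompt` would be model-specific and are omitted here.

```javascript
// Sketch of generating an image with black-forest-labs/flux-krea-dev
// via Replicate's synchronous-style API (Node 18+ fetch).
async function generateImage(prompt) {
  const res = await fetch(
    "https://api.replicate.com/v1/models/black-forest-labs/flux-krea-dev/predictions",
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${process.env.REPLICATE_API_TOKEN}`,
        "Content-Type": "application/json",
        Prefer: "wait", // block until the prediction resolves (or times out)
      },
      body: JSON.stringify({ input: { prompt } }),
    }
  );
  const prediction = await res.json();
  // On success, prediction.output holds the generated image URL(s).
  return prediction.output;
}
```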
by Michael Gullo
Automate Drafts From Google Drive

This workflow automates the end-to-end process of extracting and summarizing information from PDFs stored in a specific Google Drive folder. When a new PDF or any binary data is added, the workflow is triggered and begins by downloading and processing the PDF to extract all available text. If multiple PDFs are detected, their content is aggregated into a single, combined dataset (a sketch of this aggregation step appears at the end of this description).

This automation eliminates the time-consuming task of manually reading, taking notes, and drafting documents. By removing this burden, users can focus on more meaningful tasks while the workflow handles the repetitive, tedious work.

The extracted content is then passed through an AI-powered information extractor that identifies key details such as names, dates, addresses, and any other structured data points the user wants to extract from the PDF. This step is highly customizable, allowing the user to define exactly what type of information should be extracted. While the workflow is designed to extract all available content from the PDF, specifying additional structured data points ensures that critical details are accurately captured.

A second OpenAI node uses the extracted information to draft a professional, formal summary suitable for documentation. This is the most important part of the workflow and can be fully customized to meet the user's specific needs. By editing the prompts, users can tailor the workflow to generate a wide variety of draft formats based on the extracted content.

The workflow then generates a new Google Document containing the full draft and composes an email summarizing the key points in 3 to 5 bullet points. This email is automatically sent to the designated recipient along with a direct link to the Google Doc. This solution is ideal for insurance, legal, or administrative use cases where timely, accurate extraction and reporting from incoming PDFs is essential.

How To Use The Workflow
Step 1 - Place any binary data (e.g., PDF files) into the designated Google Drive folder.
Step 2 - The workflow will automatically download each PDF, extract the text, and, if multiple PDFs are present, combine them into a single dataset for analysis.
Step 3 - The OpenAI Draft Agent will analyze the extracted information, generate a formal draft, and create a Google Document. This document will be updated with the draft content and saved back into the same Google Drive folder.
Step 4 - An email will be sent to the designated recipient(s), including a summary of the draft and key extracted information, along with a link to view the Google Document.

Need Help? Have Questions?
For consulting and support, or if you have questions, please feel free to connect with me on LinkedIn or email michael.gullo@outlook.com.
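As a small illustration of Step 2's aggregation, here is a standalone sketch that merges text extracted from several PDFs into one combined dataset before the extractor runs. The item shape (`fileName`, `text`) is an assumption for the example.

```javascript
// Merge the extracted text of several PDFs into a single dataset,
// with a labeled separator so the AI extractor can tell documents apart.
function combineExtractedPdfs(items) {
  return items
    .map((item, i) => `--- Document ${i + 1}: ${item.fileName} ---\n${item.text}`)
    .join("\n\n");
}

const combined = combineExtractedPdfs([
  { fileName: "claim-form.pdf", text: "Claimant: Jane Doe ..." },
  { fileName: "police-report.pdf", text: "Incident date: 2024-01-15 ..." },
]);
console.log(combined);
```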
by Oneclick AI Squad
This AI-powered workflow reads emails, understands the request using an LLM, and creates structured Jira issues.

Key Insights
- Poll for new emails every 5 minutes; ensure Gmail/IMAP is properly configured.
- AI analysis requires a reliable LLM model (e.g., Chat Model or AI Tool).

Workflow Process
1. Trigger the workflow with the Check for New Emails Gmail Trigger node.
2. Fetch the full email content using the Fetch Full Email Content (get message) node.
3. Analyze the email content with the Analyze Email & Extract Tasks node using AI.
4. Parse the AI-generated JSON output into tasks with the Parse JSON Output from AI node (a sketch of this step follows below).
5. Create the main Jira issue with the Jira - Create Main Issue (create: issue) node.
6. Split subtasks from the JSON and create them with the Split Subtasks JSON Items and Create Subtasks (create: issue) nodes.

Usage Guide
1. Import the workflow into n8n and configure Gmail and Jira credentials.
2. Test with a sample email to ensure ticket creation and subtask assignment.

Prerequisites
- Gmail/IMAP credentials for email polling
- Jira API credentials with issue creation permissions

Customization Options
Adjust the Analyze Email & Extract Tasks node to refine AI task extraction, or modify the polling frequency in the trigger node.
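To illustrate step 4, here is a standalone sketch of parsing the AI's JSON output into a main issue plus subtasks for the two Jira create operations. The JSON shape (`summary`, `description`, `subtasks`) is an assumed contract with the LLM prompt, not necessarily the template's exact schema.

```javascript
// Split the LLM's JSON answer into one main issue and N subtasks.
function parseAiTasks(aiOutput) {
  // LLMs often wrap JSON in markdown fences; strip them defensively.
  const raw = aiOutput.replace(/^```(?:json)?\s*|\s*```$/g, "");
  const parsed = JSON.parse(raw);
  return {
    mainIssue: { summary: parsed.summary, description: parsed.description },
    subtasks: (parsed.subtasks ?? []).map((s) => ({ summary: s })),
  };
}

const result = parseAiTasks(
  '{"summary":"Fix login bug","description":"Users report 500s","subtasks":["Reproduce","Patch","Verify"]}'
);
console.log(result.mainIssue, result.subtasks);
```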
by Rahul Joshi
Description
This powerful n8n automation template enables seamless synchronization between Zoho Inventory and Supabase, keeping your product database up to date with zero manual effort. Whether you're running an eCommerce store, inventory dashboard, or product catalog app, this workflow ensures your data pipeline stays clean, consistent, and fully automated.

What This Template Does:
- Runs on a schedule to fetch inventory data from Zoho
- Authenticates via OAuth using a refresh token for secure API access
- Fetches products & variants with complete metadata
- Splits each item and maps it into Supabase row by row (a sketch of this mapping follows below)
- Pushes rich product data, including name, SKU, unit, tags, stock levels, dimensions, and up to 3 custom attributes

Fields Included in Sync:
- Product ID, Variant ID, Variant Name, Brand, SKU
- Returnability, Item Type, Unit, Attributes (1-3)
- Tags, Stock on Hand, UPC/EAN/ISBN, Status
- Reorder Level, Dimensions, Created Time, and more

Requirements:
- Zoho Inventory API access (with refresh token)
- Supabase account & API key
- Target table (e.g., Fairy Frills) set up in Supabase
- Optional: custom field mapping for additional use cases

Perfect For:
- Inventory managers syncing Zoho to custom dashboards
- D2C brands and eCommerce platforms powered by Supabase
- Internal tooling teams needing a real-time product database sync
- Startups replacing spreadsheets with a production-grade backend
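As an illustration of the row-by-row mapping, here is a sketch of converting one Zoho inventory item into a Supabase row. The Zoho field names (`item_id`, `sku`, `stock_on_hand`, ...) follow Zoho Inventory's items API as best understood here; adjust the mapping to your actual payload and target table columns.

```javascript
// Map a single Zoho item onto the Supabase table's column names.
function toSupabaseRow(item) {
  return {
    product_id: item.item_id,
    variant_name: item.name,
    sku: item.sku,
    unit: item.unit,
    status: item.status,
    stock_on_hand: item.stock_on_hand,
    reorder_level: item.reorder_level,
    upc: item.upc,
    created_time: item.created_time,
    // Up to three custom attributes, padded with nulls.
    attribute_1: item.custom_fields?.[0]?.value ?? null,
    attribute_2: item.custom_fields?.[1]?.value ?? null,
    attribute_3: item.custom_fields?.[2]?.value ?? null,
  };
}

console.log(
  toSupabaseRow({
    item_id: "460000000038081",
    name: "Blue T-Shirt / M",
    sku: "BTS-M",
    unit: "pcs",
    status: "active",
    stock_on_hand: 24,
    reorder_level: 5,
    upc: "012345678905",
    created_time: "2024-01-15T10:00:00Z",
    custom_fields: [{ value: "Cotton" }],
  })
);
```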
by Angel Menendez
Who's it for
This workflow is ideal for AI developers running multi-agent systems in n8n who need to quantitatively evaluate tool usage behavior. If you're building autonomous agents and want to verify their decisions against ground-truth expectations, this workflow gives you plug-and-play observability.

What it does
This template uses n8n's built-in Evaluation Trigger and Evaluation nodes to assess whether an AI agent correctly used all the expected tools. It supports:
- Dataset-driven testing of agent behavior
- Logging actual tools to compare them with the expected tools
- Assigning performance metrics (tool_called = true/false)
- Persisting output back to Google Sheets for further debugging

The workflow can be triggered by either the chat input or the dataset row evaluation. It routes through a multi-tool agent node powered by the best LLMs. The agent has access to tools such as web search, calculator, vector search, and summarizer tools. The workflow then validates tool-use decisions by extracting the intermediate steps from the agent (i.e., action + observation) and comparing the tools that were called with the expected tools (a sketch of this comparison follows below). If the tools called during the workflow execution match, the row passes; otherwise, it is documented as a fail. The evaluation nodes take care of that process.

How to set it up
1. Connect your Google Sheets OAuth2 credential.
2. Replace the document with your own test dataset.
3. Set your desired models and configure the different agent tools, such as the summarizer and vector store. The default vector store is Qdrant, so you must create this vector store with a few samples of queries + web search results.
4. Run from either the chat trigger or the evaluation trigger to test.

Requirements
- Google Sheets OAuth2 credential
- OpenRouter / OpenAI credentials for AI agents and embeddings
- Firecrawl and Qdrant credentials for web + vector search

How to customize
- Edit the Search Agent system message to define tool selection behavior
- Add more metric columns in the Evaluation node for complex scoring
- Add new tool nodes and link them to the agent block
- Swap in your own summarizer
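To make the pass/fail logic concrete, here is a standalone sketch of comparing the agent's actually-called tools against the expected list from the dataset row. The intermediate-step shape (`action.tool`) mirrors LangChain-style agent output but is an assumption for this example.

```javascript
// Compare the tools the agent invoked with the expected tools and
// produce the metric written back to the evaluation sheet.
function evaluateToolUse(intermediateSteps, expectedTools) {
  const called = new Set(intermediateSteps.map((s) => s.action.tool));
  const missing = expectedTools.filter((t) => !called.has(t));
  return {
    tool_called: missing.length === 0, // pass only if every expected tool ran
    actual_tools: [...called],
    missing_tools: missing,
  };
}

console.log(
  evaluateToolUse(
    [{ action: { tool: "web_search" }, observation: "..." }],
    ["web_search", "calculator"]
  )
);
// -> { tool_called: false, actual_tools: ["web_search"], missing_tools: ["calculator"] }
```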
by Automate With Marc
Automated Daily Firecrawl Scraper with Telegram Alerts

Get structured insights scraped daily from the web using Firecrawl's AI extraction engine, then send them directly to your Telegram chat.

What this workflow does:
This workflow automatically scrapes specific structured data from any webpage every day at a scheduled time using the Firecrawl API, checks whether results are returned, and then sends the formatted results to Telegram. A sketch of the two Firecrawl calls appears at the end of this description.

For step-by-step video tutorials of n8n builds, check out my channel: https://www.youtube.com/@Automatewithmarc

How It Works:
1. Schedule Trigger (daily at 6 PM): starts the workflow every day at a set time.
2. Firecrawl POST request: sends a custom extraction prompt and schema to Firecrawl, targeting any list of URLs you provide.
3. 30-second wait: gives Firecrawl enough time to complete processing.
4. GET Firecrawl result: fetches the extraction results using the request ID.
5. Loop with IF node: checks whether data is returned. If not, waits another 15 seconds and retries.
6. Format & clean (Set node): prepares and formats the extracted result into a readable message.
7. Telegram message node: delivers the structured data directly to your Telegram channel or group.

Requirements:
- Firecrawl API Key (header auth)
- Telegram Bot Token & Chat ID

Use Cases:
- Extract structured data (like product info or events) from niche websites
- Automate compliance monitoring or intelligence gathering
- Create market alert bots with real-time info delivery

Customization Ideas:
- Swap Telegram with Gmail, Discord, or Slack
- Expand the schema to include more complex nested fields
- Add a Google Sheets node to log daily scraped data
- Integrate with a summarizer or language model for intelligent summaries

Ready to automate your web intelligence gathering? Let Firecrawl do the scraping, and let this workflow do the rest.
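For orientation, here is a hedged standalone sketch of the two Firecrawl calls the workflow makes: a POST to start the extraction job, then a polling GET using the returned request ID. The `/v1/extract` endpoint and async job ID follow Firecrawl's extract API as best understood here; the prompt and schema are placeholders you would replace with your own.

```javascript
// Hedged sketch of Firecrawl's async extract flow (Node 18+ fetch).
const BASE = "https://api.firecrawl.dev/v1/extract";
const headers = {
  Authorization: `Bearer ${process.env.FIRECRAWL_API_KEY}`,
  "Content-Type": "application/json",
};

async function runDailyExtraction(urls) {
  // 1. Kick off the extraction job with a prompt and target schema.
  const start = await fetch(BASE, {
    method: "POST",
    headers,
    body: JSON.stringify({
      urls,
      prompt: "Extract product name, price, and availability.", // placeholder
      schema: {
        type: "object",
        properties: {
          products: { type: "array", items: { type: "object" } },
        },
      },
    }),
  }).then((r) => r.json());

  // 2. Poll for the result (the workflow waits 30s, then retries every 15s).
  let result;
  do {
    await new Promise((r) => setTimeout(r, 15000));
    result = await fetch(`${BASE}/${start.id}`, { headers }).then((r) => r.json());
  } while (result.status === "processing");

  return result.data; // the structured payload sent on to Telegram
}
```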