by Manuel
Effortlessly optimize your workflow by automatically saving every file you receive on Telegram to a Google Drive folder.

## How it works
1. Retrieve a message sent to your Telegram bot containing a file
2. Upload the file to your Google Drive folder

## Set up steps
1. Create a Telegram account and a Telegram bot, then connect the bot to n8n by following the official n8n instructions
2. Create a Google Drive folder
3. Connect your Google Drive account to n8n following the official n8n instructions
4. Select the right folder in the Google Drive node

## Use case examples
- Backup and recovery
- Cross-platform access
- File organization and management
- File collaboration and sharing
- Storage space management
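Under the hood, the Telegram node resolves the incoming file via the Bot API's `getFile` method and then downloads it from Telegram's file endpoint. A minimal sketch of that flow, in case you want to verify your bot token outside n8n (the `file_id` value is a placeholder taken from the trigger payload):

```javascript
// Placeholders: your bot token and the file_id from the incoming message
const TOKEN = process.env.TELEGRAM_BOT_TOKEN;
const fileId = '<file_id from the Telegram trigger>';

// Resolve the file path, then download the file content
const meta = await fetch(
  `https://api.telegram.org/bot${TOKEN}/getFile?file_id=${fileId}`
).then(r => r.json());

const fileUrl = `https://api.telegram.org/file/bot${TOKEN}/${meta.result.file_path}`;
const fileBuffer = Buffer.from(await (await fetch(fileUrl)).arrayBuffer());
// fileBuffer is the binary content that the Google Drive node uploads
```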
by DataMinex
# 📊 Real-Time Flight Data Analytics Bot with Dynamic Chart Generation via Telegram

## 🚀 Template Overview
This advanced n8n workflow creates an intelligent Telegram bot that transforms raw CSV flight data into interactive visualizations. Users generate professional charts on demand through a conversational interface, making data analytics accessible to anyone via messaging.

**Key innovation**: combines real-time data processing, the Chart.js visualization engine, and Telegram's messaging platform to deliver instant business intelligence insights.

## 🎯 What This Template Does
Transform your flight booking data into actionable insights with four visualization types:
- 📈 **Bar charts**: top 10 busiest airlines by flight volume
- 🥧 **Pie charts**: flight duration distribution (short/medium/long-haul)
- 🍩 **Doughnut charts**: price range segmentation with average pricing
- 📊 **Line charts**: price trend analysis across flight durations

Each chart includes auto-generated insights, percentages, and key business metrics delivered instantly to the user's phone.

## 🏗️ Technical Architecture
### Core Components
- **Telegram Webhook Trigger**: captures user interactions and button clicks
- **Smart Routing Engine**: conditional logic for command detection and chart selection
- **CSV Data Pipeline**: file reading → parsing → JSON transformation
- **Chart Generation Engine**: JavaScript-powered data processing with Chart.js
- **Image Rendering Service**: QuickChart API for high-quality PNG generation
- **Response Delivery**: binary image transmission back to Telegram

### Data Flow Architecture
User Input → Command Detection → CSV Processing → Data Aggregation → Chart Configuration → Image Generation → Telegram Delivery

## 🛠️ Setup Requirements
### Prerequisites
- **n8n instance** (self-hosted or cloud)
- **Telegram bot token** from @BotFather
- **CSV dataset** with flight information
- **Internet connectivity** for the QuickChart API

### Dataset Source
This template uses the Airlines Flights Data dataset from GitHub:
🔗 Dataset: Airlines Flights Data by Rohit Grewal

### Required Data Schema
Your CSV file should contain these columns:
```
airline,flight,source_city,departure_time,arrival_time,duration,price,class,destination_city,stops
```

### File Structure
```
/data/
└── flights.csv (download from the GitHub dataset above)
```

## ⚙️ Configuration Steps
### 1. Telegram Bot Setup
1. Create a new bot via @BotFather on Telegram
2. Copy your bot token
3. Configure the Telegram Trigger node with your token
4. Set the webhook URL in your n8n instance

### 2. Data Preparation
1. Download the dataset from Airlines Flights Data
2. Upload the CSV file to /data/flights.csv in your n8n instance
3. Ensure UTF-8 encoding
4. Verify the column headers match the dataset schema
5. Test file accessibility from n8n
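The CSV Data Pipeline described above reads /data/flights.csv and converts its rows into JSON items for the chart nodes. A minimal sketch of what that parsing step might look like in an n8n Code node, assuming the file content has already been read into a plain-text string (the sample row is illustrative; a real parser should also handle quoted fields):

```javascript
// Sample input; in the workflow this comes from reading /data/flights.csv
const csvText = `airline,flight,source_city,departure_time,arrival_time,duration,price,class,destination_city,stops
Vistara,UK-810,Delhi,Morning,Evening,8.5,11000,Economy,Mumbai,one`;

const [headerLine, ...rows] = csvText.trim().split('\n');
const headers = headerLine.split(',').map(h => h.trim());

const flights = rows.map(row => {
  const values = row.split(',');
  const record = {};
  headers.forEach((h, i) => (record[h] = (values[i] || '').trim()));
  record.duration = parseFloat(record.duration); // numeric fields used by the charts
  record.price = parseFloat(record.price);
  return record;
});

console.log(flights[0].airline, flights[0].price); // "Vistara" 11000
```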
### 3. Workflow Activation
1. Import the workflow JSON
2. Configure all Telegram nodes with your bot token
3. Test the /start command
4. Activate the workflow

## 🔧 Technical Implementation Details
### Chart Generation Process
Bar chart logic:

```javascript
// Aggregate airline counts
const airlineCounts = {};
flights.forEach(flight => {
  const airline = flight.airline || 'Unknown';
  airlineCounts[airline] = (airlineCounts[airline] || 0) + 1;
});

// Keep the top 10 airlines by flight volume
const top = Object.entries(airlineCounts)
  .sort((a, b) => b[1] - a[1])
  .slice(0, 10);
const labels = top.map(([name]) => name);
const datasets = [{ label: 'Flights', data: top.map(([, n]) => n) }];

// Generate the Chart.js configuration
const chartConfig = {
  type: 'bar',
  data: { labels, datasets },
  options: { responsive: true, plugins: { legend: { display: false } } },
};
```

**Dynamic color schemes**:
- Bar charts: professional blue gradient palette
- Pie charts: duration-based color coding (light → dark blue)
- Doughnut charts: price-tier specific colors (green → purple)
- Line charts: trend-focused red gradient with smooth curves

### Performance Optimizations
- **Efficient data processing**: single-pass aggregations with O(n) complexity
- **Smart caching**: QuickChart handles image caching automatically
- **Minimal memory usage**: stream processing for large datasets
- **Error handling**: graceful fallbacks for missing data fields

### Advanced Features
Auto-generated insights:
- Statistical calculations (percentages, averages, totals)
- Trend analysis and pattern detection
- Business intelligence summaries
- Contextual recommendations

User experience enhancements:
- Reply keyboards for easy navigation
- Visual progress indicators
- Error recovery mechanisms
- Mobile-optimized chart dimensions (800x600 px)

## 📈 Use Cases & Business Applications
### Airlines & Travel Companies
- **Fleet analysis**: monitor airline performance and market share
- **Pricing strategy**: analyze competitor pricing across routes
- **Operational insights**: track duration patterns and efficiency

### Data Analytics Teams
- **Self-service BI**: enable non-technical users to generate reports
- **Mobile dashboards**: access insights anywhere via Telegram
- **Rapid prototyping**: quick data exploration without complex tools

### Business Intelligence
- **Executive reporting**: instant charts for presentations
- **Market research**: compare industry trends and benchmarks
- **Performance monitoring**: track KPIs in real time

## 🎨 Customization Options
### Adding New Chart Types
1. Create a new Switch condition
2. Add a corresponding data processing node
3. Configure the Chart.js options
4. Update the user interface menu

### Data Source Extensions
- Replace the CSV with database connections
- Add real-time API integrations
- Implement data refresh mechanisms
- Support multiple file formats

### Visual Customizations
```javascript
// Custom color palette
backgroundColor: ['#your-colors'],

// Advanced styling
borderRadius: 8,
borderSkipped: false,

// Animation effects
animation: { duration: 2000, easing: 'easeInOutQuart' }
```

## 🔒 Security & Best Practices
### Data Protection
- Validate the CSV input format
- Sanitize user inputs
- Implement rate limiting
- Secure file access permissions

### Error Handling
- Graceful degradation for API failures
- User-friendly error messages
- Automatic retry mechanisms
- Comprehensive logging

## 📊 Expected Outputs
### Sample Generated Insights
- "✈️ Vistara leads with 350+ flights, capturing 23.4% market share"
- "📈 Long-haul flights dominate at 61.1% of total bookings"
- "💰 Budget category (₹0-10K) represents 47.5% of all bookings"
- "📊 Average prices peak at ₹14K for 6-8 hour duration flights"

### Performance Metrics
- **Response time**: <3 seconds for chart generation
- **Image quality**: 800x600 px high-resolution PNG
- **Data capacity**: handles 10K+ records efficiently
- **Concurrent users**: scales with n8n instance capacity
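The Image Rendering Service step hands the Chart.js configuration to QuickChart for PNG rendering. A minimal sketch of that request (the tiny `chartConfig` here is a stand-in for the config built by the chart-generation node above):

```javascript
// chartConfig would normally come from the chart-generation node
const chartConfig = {
  type: 'bar',
  data: { labels: ['A', 'B'], datasets: [{ data: [3, 5] }] },
};

// Render the config to a PNG via QuickChart's POST /chart endpoint
const response = await fetch('https://quickchart.io/chart', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    chart: chartConfig,
    width: 800,   // matches the mobile-optimized dimensions noted above
    height: 600,
    format: 'png',
  }),
});
const pngBuffer = Buffer.from(await response.arrayBuffer());
// pngBuffer is then transmitted back to Telegram as a photo attachment
```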
## 🚀 Getting Started
1. Download the workflow JSON
2. Import it into your n8n instance
3. Configure your Telegram bot credentials
4. Upload your flight data CSV
5. Test with the /start command
6. Deploy and share with your team

## 💡 Pro Tips
- **Data quality**: clean data produces better insights
- **Mobile first**: charts are optimized for mobile viewing
- **Batch processing**: handles large datasets efficiently
- **Extensible design**: easy to add new visualization types

Ready to transform your data into actionable insights? Import this template and start generating professional charts in minutes! 🚀
by Mihai Farcas
# Chat with local LLMs using n8n and Ollama

This n8n workflow allows you to seamlessly interact with your self-hosted Large Language Models (LLMs) through a user-friendly chat interface. By connecting to Ollama, a powerful tool for managing local LLMs, you can send prompts and receive AI-generated responses directly within n8n.

## Use cases
- **Private AI interactions**: ideal for scenarios where data privacy and confidentiality are important.
- **Cost-effective LLM usage**: avoid ongoing cloud API costs by running models on your own hardware.
- **Experimentation & learning**: a great way to explore and experiment with different LLMs in a local, controlled environment.
- **Prototyping & development**: build and test AI-powered applications without relying on external services.

## How it works
1. **When chat message received**: captures the user's input from the chat interface.
2. **Chat LLM Chain**: sends the input to the Ollama server, receives the AI-generated response, and delivers it back to the chat interface.

## Set up steps
- Make sure Ollama is installed and running on your machine before executing this workflow.
- Edit the Ollama address if it differs from the default.
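The Ollama node talks to Ollama's local HTTP API, which listens on port 11434 by default. If you want to sanity-check your Ollama setup before running the workflow, a minimal sketch like this works (the model name `llama3` is just an example; use whichever model you have pulled):

```javascript
// Quick check that Ollama is reachable and a model responds
const res = await fetch('http://localhost:11434/api/generate', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: 'llama3',                       // any model pulled with `ollama pull`
    prompt: 'Say hello in one sentence.',
    stream: false,                         // single JSON object instead of a stream
  }),
});
const { response } = await res.json();
console.log(response);
```

If this call fails, fix the Ollama address or service first; the n8n workflow points at the same endpoint.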
by Ricardo Espinozaas
## Use Case
When you have a call with a new potential customer, one of the keys to getting the most out of it is finding out as much as you can about them beforehand. Normally this involves a lot of manual research before every call. This workflow automates that tedious work for you.

## What this workflow does
The workflow runs every time a new call is booked via Calendly. It filters out personal emails, then enriches the remaining email addresses. If an email belongs to a company, it enriches the company and upserts it into your HubSpot CRM.

## Setup
1. Add Clearbit, HubSpot, and Calendly credentials.
2. Click on Test workflow.
3. Book a meeting on Calendly so the event starts the workflow.

Be aware that you can adapt this workflow to work with the enrichment tool, CRM, and booking tool of your choice.
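The "filters out personal emails" step typically checks the booker's email domain against a list of free-mail providers before spending an enrichment credit. A minimal sketch of that check (the domain list is illustrative, not exhaustive):

```javascript
// Skip leads who booked with a personal email address
const FREE_MAIL_DOMAINS = new Set([
  'gmail.com', 'yahoo.com', 'outlook.com', 'hotmail.com', 'icloud.com',
]);

function isBusinessEmail(email) {
  const domain = email.split('@')[1]?.toLowerCase();
  return Boolean(domain) && !FREE_MAIL_DOMAINS.has(domain);
}

console.log(isBusinessEmail('jane@acme.com'));  // true → enrich
console.log(isBusinessEmail('jane@gmail.com')); // false → filter out
```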
by Dr. Firas
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

# Automate Content Publishing to TikTok, YouTube, Instagram, Facebook via Blotato

## 🎯 Who is this for?
This workflow is perfect for:
- Content creators who post daily to multiple platforms
- Marketing teams managing brand presence across channels
- Solo entrepreneurs and social media managers looking to scale their output
- Anyone tired of uploading content manually across apps

## 💡 What problem is this solving?
Managing content across platforms is time-consuming. You need to:
- Track posts per platform
- Upload videos manually
- Adapt captions and posting times
- Avoid repetitive mistakes

This workflow solves all of that by centralizing everything in one place (Google Sheets) and automating it via Blotato.

## ⚙️ What this workflow does
Every hour, this workflow will:
1. Check your Google Sheet for any post marked as "TO GO"
2. Select one item at a time (avoids spam and overposting; see the sketch below)
3. Extract media from a shared Google Drive link
4. Upload the media to Blotato
5. Publish it automatically to TikTok, YouTube Shorts, Instagram, and Facebook
6. Update the post status in your Sheet to "Posted"

## 🧰 Setup
Before running this template, make sure you have:
- ✅ A Blotato account (Pro plan required for the API key)
- 🔑 Generated your Blotato API key (Settings > API > Generate)
- 📦 Enabled Verified Community Nodes in the n8n Admin Panel
- 🧩 Installed the Blotato node via the community nodes list
- 🛠 Created a Blotato credential in n8n using your API key
- ☁️ Set your media folder in Google Drive to "Anyone with the link can view"
- 📌 Followed the 3 setup steps in the brown sticky notes inside the workflow

## 🛠 How to customize this workflow
- Add new platform nodes (LinkedIn, Threads, Pinterest, etc.) using Blotato
- Adjust the scheduling frequency from hourly to daily or weekly
- Add an approval layer (Slack/Telegram) before publishing
- Customize your captions dynamically using GPT or formulas in Sheets
- Use tags, categories, or campaign tracking for analytics

📄 Documentation: Notion Guide
Need help customizing? Contact me for consulting and support: LinkedIn / YouTube
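Step 2 above deliberately publishes a single queued item per run to avoid overposting. In an n8n Code node, that selection might look like this (the `Status` column name is illustrative; match it to your sheet):

```javascript
// Keep only rows queued for publishing, then take the first one
const queued = items.filter(item => item.json.Status === 'TO GO');

// Returning a single item means downstream nodes publish at most one post per run
return queued.slice(0, 1);
```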
by Ranjan Dailata
**Notice**: Community nodes can only be installed on self-hosted instances of n8n.

## Who this is for
The Legal Case Research Extractor is an automated workflow designed for legal tech teams, researchers, law firms, and data scientists who need to transform unstructured legal case data into actionable, structured insights.

This workflow is tailored for:
- Legal researchers automating case law data mining
- Litigation support teams handling large volumes of case records
- LawTech startups building AI-powered legal research assistants
- Compliance analysts extracting case-specific insights
- AI developers working on legal NLP, summarization, and search engines

## What problem is this workflow solving?
Legal case data is often locked in semi-structured or raw HTML formats, scattered across jurisdiction-specific websites. Manually extracting and processing this data is tedious and inefficient. This workflow automates:
- Extraction of legal case data via Bright Data's MCP infrastructure
- Parsing of HTML into clean, readable text using the Google Gemini LLM
- Structuring and delivering the output through a webhook and file storage

## What this workflow does
1. **Input**: The "Set the Legal Case Research URL" node sets the legal case URL for data extraction.
2. **Bright Data MCP data extraction**: The "Bright Data MCP Client For Legal Case Research" node extracts the legal case via the Bright Data MCP tool `scrape_as_html`.
3. **Case extraction**: The Google Gemini-based case extractor produces a paginated list of cases.
4. **Loop through legal case URLs**: Receives a collection of legal case links to process; each URL represents a different case from a target legal website.
5. **Bright Data MCP scraping**: Uses Bright Data's `scrape_as_html` MCP mode to retrieve the raw HTML content of each legal case.
6. **Google Gemini LLM extraction**: Transforms raw HTML into clean, structured text and performs additional information extraction if required (e.g., case summary, court, jurisdiction).
7. **Webhook notification**: Sends extracted legal case content to a configurable webhook URL, enabling downstream processing or storage in legal databases.
8. **Binary conversion & file persistence**: Converts the structured text to binary format and saves the final response to disk for archival or further processing.

## Pre-conditions
- Knowledge of the Model Context Protocol (MCP) is essential. Please read this blog post: model-context-protocol
- You need a Bright Data account and the setup described in the Setup section below.
- You need a Google Gemini API key. Visit Google AI Studio.
- You need to install the Bright Data MCP server @brightdata/mcp
- You need to install n8n-nodes-mcp

## Setup
1. Set up n8n locally with MCP servers by following n8n-nodes-mcp.
2. Install the Bright Data MCP server @brightdata/mcp on your local machine.
3. Sign up at Bright Data.
4. Create a Web Unlocker proxy zone called `mcp_unlocker` in the Bright Data control panel: navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
5. In n8n, configure the Google Gemini (PaLM) API account with your Google Gemini API key (or access through Vertex AI or a proxy).
6. In n8n, configure the MCP Client (STDIO) credentials to connect with the Bright Data MCP server as shown below.
Make sure to add your Bright Data API token in the Environments textbox above as `API_TOKEN=<your-token>`.

## How to customize this workflow to your needs
**Target new legal portals**
- Modify the legal case input URLs to scrape different state or federal case databases

**Customize LLM extraction**
- Modify the prompt to extract specific fields: case number, plaintiff, case summary, outcome, legal precedents, etc.
- Add a summarization step if needed

**Enhance loop handling**
- Integrate with a Google Sheet or an API to dynamically fetch case URLs
- Add error handling logic to skip failed cases and log them

**Improve security & compliance**
- Redact sensitive information before sending via webhook
- Store processed case data in encrypted cloud storage

**Output formats**
- Save as PDF, JSON, or Markdown
- Enable output to cloud storage (S3, Google Drive) or legal document management systems
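The webhook notification in step 7 is a plain HTTP POST of the structured case text. If you want to test your receiving endpoint independently of the workflow, a minimal sketch like this will do (the URL and payload fields are illustrative placeholders, not the workflow's fixed schema):

```javascript
// Post an extracted case to a downstream webhook for storage/processing
await fetch('https://example.com/legal-cases/webhook', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    url: 'https://example-court.gov/case/12345',            // source case URL (placeholder)
    text: '...structured text from the Gemini extraction...', // placeholder body
    extractedAt: new Date().toISOString(),
  }),
});
```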
by Gaurav
Automate your entire guest communication journey, from booking to post-stay, with personalized welcome emails, review requests, and daily operational reports. Perfect for hotels, B&Bs, and short-term rental properties looking to enhance the guest experience while reducing manual work and improving operational efficiency.

## How it works
- **Pre-arrival welcome emails**: automatically sends personalized welcome emails 1-2 days before guest check-in with reservation details, hotel amenities, and contact information
- **Post-stay review requests**: sends automated review request emails 24 hours after checkout with Google Reviews links and return-guest discount codes
- **Daily staff reports**: generates comprehensive arrival/departure reports every morning at 6 AM for front desk, housekeeping, and management teams
- **Smart tracking**: prevents duplicate emails by automatically updating the tracking status in your Google Sheets database
- **Professional templates**: uses responsive HTML email templates that work across all devices and email clients

## Set up steps
1. **Connect Google Sheets**: link your hotel reservation spreadsheet (must include columns for guest details, check-in/out dates, and email tracking)
2. **Configure Gmail account**: set up Gmail credentials for sending automated emails
3. **Customize hotel information**: update the hotel name, contact details, and branding in the "Edit Fields" nodes
4. **Set staff email addresses**: configure recipient addresses for daily operational reports
5. **Adjust timing**: modify the schedule triggers if you want different timing for emails and reports (currently set to every 6 hours for guest emails and 6 AM daily for staff reports)

Time investment: ~30 minutes for initial setup, then fully automated operation.
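The pre-arrival and smart-tracking logic above hinges on comparing each reservation's check-in date to today and on a "welcome email sent" flag. A minimal sketch of that selection in an n8n Code node (the column names `checkInDate` and `welcomeEmailSent` are illustrative; match them to your sheet):

```javascript
// Select guests checking in 1-2 days from now who haven't had a welcome email
const MS_PER_DAY = 24 * 60 * 60 * 1000;
const today = new Date();

return items.filter(item => {
  const { checkInDate, welcomeEmailSent } = item.json;
  const daysUntilCheckIn = Math.ceil(
    (new Date(checkInDate) - today) / MS_PER_DAY
  );
  return daysUntilCheckIn >= 1 && daysUntilCheckIn <= 2 && !welcomeEmailSent;
});
```

After sending, the workflow writes the flag back to the sheet, which is what prevents duplicate emails on the next 6-hourly run.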
by Jay Emp0
# 🤖 MCP Personal Assistant Workflow

## Description
This workflow integrates multiple productivity tools into a single AI-powered assistant using n8n, acting as a centralized control hub that receives and executes tasks across Google Calendar, Gmail, Google Drive, LinkedIn, Twitter, and more.

## ✅ Key Capabilities
- **AI agent + tool use**: built using n8n's AI Agent and MCP system, enabling intelligent multi-step reasoning.
- **Tool integration**:
  - Google Calendar: schedule, update, delete events
  - Gmail: search, draft, send emails
  - Google Drive: manage files and folders
  - LinkedIn & Twitter: post updates, send DMs
  - Utility tools: fetch date/time, search URLs
- **Discord input**: accepts prompts via n8n_discord_trigger_bot (repo link)

## 🛠 Setup Instructions
1. **Timezone configuration**: go to Settings > Default Timezone in n8n and set your local timezone (e.g., Asia/Jakarta). Ensure all Date & Time nodes explicitly use the same zone to avoid UTC-related bugs (see the sketch after the checklist below).
2. **Tool authentication**: replace all OAuth credentials for Gmail, Google Drive, Google Calendar, Twitter, and LinkedIn with your own accounts when copying this workflow.
3. **Platform adaptability**: while designed for Discord, you can replace the Discord trigger with any other chat or webhook service, e.g., Telegram, Slack, a WhatsApp webhook, or the n8n Form Trigger.

## 📦 Strengths
- Great for document retrieval, email summarization, calendar scheduling, and social posting.
- Reduces the need for tab-switching across multiple platforms.
- Tested with a comprehensive checklist across categories: Calendar, Gmail, Google Drive, Twitter, LinkedIn, utility tools, and cross-tool actions (refer to the discordGPT prompt checklist for prompt coverage).

## ⚠️ Limitations
- ❌ **Binary uploads**: AI agents and the MCP server currently struggle with binary payloads. Uploading files to Gmail, Google Drive, or LinkedIn may fail due to format serialization issues. Binary operations (upload/post) are under development and will be fixed in future iterations.
- ❌ **Date bugs**: if timezone settings are incorrect, event times may default to UTC, leading to misaligned calendar events.

## 🔬 Testing
Use the provided prompt checklist for full coverage of:
- ✅ Core feature flows
- ✅ Edge cases (e.g., invalid dates, nonexistent users)
- ✅ Cross-tool chains (e.g., Google Drive → Gmail → LinkedIn)

## ✅ MCP Assistant Test Prompt Checklist

### 📅 Google Calendar
- [x] "Schedule a meeting with Alice tomorrow at 10am and send an invite to alice@wonderland.com"
- [x] "Create an event called 'Project Sync' on Friday at 3pm with Bob and Charlie."
- [x] "Update the time of my call with James to next Monday at 2pm."
- [x] "Delete my meeting with Marketing next Wednesday."
- [x] "What is my schedule tomorrow?"

### 📧 Gmail
- [x] "Show me unread emails from this week."
- [x] "Search for emails with subject: invoice"
- [x] "Reply to the latest email from john@company.com saying 'Thanks, noted!'"
- [x] "Draft an email to info@a16z.com with subject 'Emp0 Fundraising' and draft the body of the email with an investment opportunity in Emp0; scrape this site https://Emp0.com to get to know more about emp0.com"
- [x] "Send an email to hi@cursor.com with subject 'Feature request' and cc sales@cursor.com"
- [ ] "Send an email to recruiting@openai.com, write about how you like their product and want to apply for a job there, and attach my latest CV from Google Drive"

### 🗂 Google Drive
- [ ] "Upload the PDF you just sent me to my Google Drive."
- [x] "Create a folder called 'July Reports' inside the Emp0 shared drive."
- [x] "Move the file named 'Q2_Review.pdf' to 'Reports/2024/Q2'."
[X] "Share the folder 'Investor Decks' with info@a16z.com as viewer." [ ] "Download the file 'Wayne_Li_CV.pdf' and attach it in Discord." [X] "Search for a file named 'Invoice May' in my Google Drive." 🖼 LinkedIn [X] "Think of a random and inspiring quote. Post a text update on LinkedIn with the quote and end with a question so people will answer and increase engagement" [ ] "Post this Google Drive image to LinkedIn with the caption: 'Team offsite snapshots!'" [X] "Summarize the contents of this workflow and post it on linkedin with the original url https://n8n.io/workflows/5230-content-farming-ai-powered-blog-automation-for-wordpress/" 🐦 Twitter [X] "Tweet: 'AI is eating operations. Fast.'" [X] "Send a DM to @founderguy: 'Would love to connect on what you’re building.'" [X] "Search Twitter for keyword: 'founder advice'" 🌐 Utilities [X] "What time is it now?" [ ] "Download this PDF: https://ontheline.trincoll.edu/images/bookdown/sample-local-pdf.pdf" [X] "Search this URL and summarize important tech updates today: https://techcrunch.com/feed/" 📎 Discord Attachments [ ] "Take the image I just uploaded and post it to LinkedIn." [ ] "Get the file from my last message and upload it to Google Drive." 🧪 Edge Cases [X] "Schedule a meeting on Feb 30." [X] "Send a DM to @user_that_does_not_exist" [ ] "Download a 50MB PDF and post it to LinkedIn" [X] "Get the latest tweet from my timeline and email it to myself." 🔗 Cross-tool Flows [ ] "Get the latest image from my Google Drive and post it on LinkedIn with the caption 'Another milestone hit!'" [ ] "Find the latest PDF report in Google Drive and email it to investor@vc.com." [ ] "Download an image from this link and upload it to my Google Drive: https://example.com/image.png" [ ] "Get the most recent attachment from my inbox and upload it to Google Drive." Run each of these in isolated test cases. For cross-tool flows, verify binary serialization integrity. 🧠 Why Use This Workflow? This is an always-on personal assistant that can: Process natural language input Handle multi-step logic Execute commands across 6+ platforms Be extended with more tools and memory If you want to interact with all your work tools from a single prompt—this is your base to start. 📎 Repo & Credits Discord bot trigger: n8n_discord_trigger_bot Creator: Jay (Emp₀)
by Lucas Perret
## Who this is for
This workflow is for salespeople who want to follow up with their leads quickly and efficiently.

## What this workflow does
This workflow starts every time a new reply is received in lemlist. It then classifies the response using OpenAI and creates the correct follow-up task. The follow-up tasks currently include:
- Slack alerts for each new reply
- Tagging interested leads in lemlist
- Unsubscribing leads when they request it

The Slack alerts include:
- Lead email address
- Sender email address
- Reply type (positive, not interested, etc.)
- A preview of the reply

## Setup
To set this template up, simply follow the steps in the sticky notes.

## How to customize this workflow to your needs
- Adjust the follow-up tasks to your needs
- Change the Slack notification to your needs
- ...
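The classification step asks OpenAI to map each reply to a category the routing logic can branch on. A minimal sketch of that call (the category labels mirror the reply types listed above; the model choice and prompt wording are illustrative, not the template's exact configuration):

```javascript
// Classify a lemlist reply into a routing category via the OpenAI API
const replyText = '...the reply body from the lemlist trigger...'; // placeholder

const res = await fetch('https://api.openai.com/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify({
    model: 'gpt-4o-mini', // any chat model works here
    messages: [
      {
        role: 'system',
        content:
          'Classify the email reply as exactly one of: positive, not_interested, unsubscribe, other.',
      },
      { role: 'user', content: replyText },
    ],
  }),
});
const { choices } = await res.json();
const replyType = choices[0].message.content.trim(); // routed by the Switch logic
```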
by Autonomous Work
This workflow exports every table in a base as its own CSV, saves the files in a time-stamped folder in Amazon S3, pings you on Slack, and optionally prunes older copies. You get an automated weekly backup that is easy to inspect or re-import as needed. You can easily swap the S3 node for the storage provider of your choice.

## How it works
**Weekly backup**
1. Schedule trigger fires weekly
2. Sets and formats the week label, e.g. [2025-W12] (see the sketch below)
3. Creates a folder in the S3 bucket named after the week
4. Loops through all tables in the Airtable base, creating CSVs and uploading them to the new path
5. Sends a Slack message on completion

**Monthly prune**
1. Schedule trigger fires weekly
2. Sets a cut-off date 4 weeks in the past
3. Lists folders in S3
4. Deletes all folders more than 4 weeks old

## Setup steps
1. Clone the workflow
2. Swap in your credentials for Airtable, AWS, and Slack
3. Ensure the AWS credential has an appropriate IAM policy to manage the bucket & objects
4. Set the workflow to "Active"
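The week label in step 2 follows the ISO-8601 week convention. A sketch of computing it in a Code node, using the standard shift-to-Thursday algorithm:

```javascript
// Build an ISO week label like "2025-W12" for the backup folder name
function isoWeekLabel(date = new Date()) {
  const d = new Date(Date.UTC(date.getFullYear(), date.getMonth(), date.getDate()));
  // Shift to the Thursday of the current ISO week (ISO weeks run Mon-Sun)
  const day = d.getUTCDay() || 7;
  d.setUTCDate(d.getUTCDate() + 4 - day);
  const yearStart = new Date(Date.UTC(d.getUTCFullYear(), 0, 1));
  const week = Math.ceil(((d - yearStart) / 86400000 + 1) / 7);
  return `${d.getUTCFullYear()}-W${String(week).padStart(2, '0')}`;
}

console.log(isoWeekLabel()); // e.g. "2025-W12"
```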
by n8n Team
This workflow syncs Zendesk tickets to Pipedrive contact owners. It is triggered every day at 09:00: Zendesk collects all tickets updated after the last execution timestamp and updates them according to Pipedrive contacts. It also adds Zendesk comments to the tickets as notes in Pipedrive.

## Prerequisites
- Pipedrive account and Pipedrive credentials
- Zendesk account and Zendesk credentials

Note: The Pipedrive and Zendesk accounts need to be created by the same person / with the same email.

## How it works
1. The Cron node triggers the workflow every day at 09:00.
2. The Zendesk node collects all tickets updated after the last execution timestamp.
3. The If node checks whether the channel in the ticket is an email and, if so, continues the workflow.
4. The Item Lists node removes duplicates to make the search efficient.
5. The Pipedrive node searches persons by email.
6. The Set node renames and keeps only the needed fields (email & person id).
7. The Merge by key node adds the Pipedrive contact id to the Zendesk tickets.
8. The HTTP Request node gets the Zendesk comments for the tickets and the Merge node adds them to the tickets.
9. The Split node processes items in batches with each iteration.
10. The Item Lists node splits comments into separate items.
11. The Pipedrive node adds each comment as a note.
12. The If node checks whether the data processing is done and, if not, goes back to the Split node.
13. The Function Item node sets the new last execution timestamp.
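The last-execution timestamp in the final step is typically kept in the workflow's static data so it survives between runs. A minimal sketch of that pattern in an n8n Function/Code node (the `lastExecution` key name is illustrative; note that static data only persists when the workflow runs in production, not in manual test runs):

```javascript
// Read the previous run's timestamp and store the current one for next time
const staticData = $getWorkflowStaticData('global');

const lastExecution = staticData.lastExecution || '1970-01-01T00:00:00Z';
staticData.lastExecution = new Date().toISOString();

// Downstream, the Zendesk node filters on tickets updated after `lastExecution`
return [{ json: { lastExecution } }];
```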
by Airtop
# Monitor Competitor Facebook Ads with Airtop

## Use Case
Monitor a competitor's active Facebook ads and get a weekly HTML intelligence brief by email, saving time on manual research and helping you spot messaging, offers, and creative trends quickly.

## What This Automation Does
- Runs weekly on a set schedule.
- Uses Airtop to visit the competitor's Facebook Ad Library page and extract up to 30 active ads.
- Summarizes each ad with key points: message, topic, CTA, duration active, language, target audience.
- Sends the compiled HTML report via Gmail.

## How It Works
1. **Schedule Trigger**: fires once a week at the configured time.
2. **Airtop Extraction**: loads the Ad Library URL and runs a prompt to extract and format the ads into HTML.
3. **Email Delivery**: sends the HTML report to your specified recipient using Gmail.

## Setup Requirements
- **Airtop API key**: generate here.
- **Airtop credential in n8n**: add your API key under "Airtop" in n8n.
- **Gmail OAuth2 credential**: connect the Gmail account that sends the reports.
- **Competitor's Ad Library URL**: replace the default `view_all_page_id` in the workflow with your target (see the note below).

## Next Steps
- Duplicate the Airtop step for multiple competitors.
- Enrich reports by visiting ad landing pages for deeper analysis.
- Send outputs to Slack or archive them in a shared workspace.

Read about ways to monitor your competitors' ads here.
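For reference, `view_all_page_id` is a query parameter on the Facebook Ad Library URL that scopes the listing to a single advertiser. A typical target URL looks like this (the page ID is a placeholder for your competitor's Facebook page):

```javascript
// Facebook Ad Library URL for a specific competitor page (ID is a placeholder)
const adLibraryUrl =
  'https://www.facebook.com/ads/library/?active_status=active' +
  '&ad_type=all&country=ALL&view_all_page_id=123456789012345';
```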