by DataMinex
# 📊 Real-Time Flight Data Analytics Bot with Dynamic Chart Generation via Telegram 🚀

## Template Overview

This advanced n8n workflow creates an intelligent Telegram bot that transforms raw CSV flight data into stunning, interactive visualizations. Users can generate professional charts on demand through a conversational interface, making data analytics accessible to anyone via messaging.

**Key Innovation**: Combines real-time data processing, the Chart.js visualization engine, and Telegram's messaging platform to deliver instant business intelligence insights.

## 🎯 What This Template Does

Transform your flight booking data into actionable insights with four powerful visualization types:

- 📈 **Bar Charts**: Top 10 busiest airlines by flight volume
- 🥧 **Pie Charts**: Flight duration distribution (Short/Medium/Long-haul)
- 🍩 **Doughnut Charts**: Price range segmentation with average pricing
- 📊 **Line Charts**: Price trend analysis across flight durations

Each chart includes auto-generated insights, percentages, and key business metrics delivered instantly to users' phones.
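The Short/Medium/Long-haul split behind the pie chart can be sketched as a single-pass aggregation in an n8n Code node. The hour thresholds below are illustrative assumptions, not necessarily the template's actual cutoffs:

```javascript
// Bucket each flight by duration, then count per bucket.
// `flights` is the parsed CSV: [{ duration: '2.17', ... }, ...]
// Thresholds (3h / 6h) are assumptions — adjust to your dataset.
function bucketDuration(hours) {
  if (hours < 3) return 'Short-haul';
  if (hours < 6) return 'Medium-haul';
  return 'Long-haul';
}

function durationDistribution(flights) {
  const counts = { 'Short-haul': 0, 'Medium-haul': 0, 'Long-haul': 0 };
  for (const f of flights) {
    counts[bucketDuration(Number(f.duration) || 0)] += 1;
  }
  return counts;
}
```

The resulting counts map directly onto a Chart.js pie config's `labels` and `data` arrays.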
## 🏗️ Technical Architecture

### Core Components

- **Telegram Webhook Trigger**: Captures user interactions and button clicks
- **Smart Routing Engine**: Conditional logic for command detection and chart selection
- **CSV Data Pipeline**: File reading → parsing → JSON transformation
- **Chart Generation Engine**: JavaScript-powered data processing with Chart.js
- **Image Rendering Service**: QuickChart API for high-quality PNG generation
- **Response Delivery**: Binary image transmission back to Telegram

### Data Flow Architecture

User Input → Command Detection → CSV Processing → Data Aggregation → Chart Configuration → Image Generation → Telegram Delivery

## 🛠️ Setup Requirements

### Prerequisites

- **n8n instance** (self-hosted or cloud)
- **Telegram Bot Token** from @BotFather
- **CSV dataset** with flight information
- **Internet connectivity** for the QuickChart API

### Dataset Source

This template uses the Airlines Flights Data dataset from GitHub:
🔗 Dataset: Airlines Flights Data by Rohit Grewal

### Required Data Schema

Your CSV file should contain these columns:

airline,flight,source_city,departure_time,arrival_time,duration,price,class,destination_city,stops

### File Structure

/data/
└── flights.csv (download from the GitHub dataset above)

## ⚙️ Configuration Steps

### 1. Telegram Bot Setup

1. Create a new bot via @BotFather on Telegram
2. Copy your bot token
3. Configure the Telegram Trigger node with your token
4. Set the webhook URL in your n8n instance

### 2. Data Preparation

1. Download the dataset from Airlines Flights Data
2. Upload the CSV file to /data/flights.csv in your n8n instance
3. Ensure UTF-8 encoding
4. Verify that the column headers match the dataset schema
5. Test file accessibility from n8n

### 3. Workflow Activation

1. Import the workflow JSON
2. Configure all Telegram nodes with your bot token
3. Test the /start command
4. Activate the workflow

## 🔧 Technical Implementation Details

### Chart Generation Process

**Bar Chart Logic**:

```javascript
// Aggregate airline counts
const airlineCounts = {};
flights.forEach(flight => {
  const airline = flight.airline || 'Unknown';
  airlineCounts[airline] = (airlineCounts[airline] || 0) + 1;
});

// Generate the Chart.js configuration
const chartConfig = {
  type: 'bar',
  data: { labels, datasets },
  options: { responsive: true, plugins: {...} }
};
```

**Dynamic Color Schemes**:

- Bar Charts: Professional blue gradient palette
- Pie Charts: Duration-based color coding (light→dark blue)
- Doughnut Charts: Price-tier specific colors (green→purple)
- Line Charts: Trend-focused red gradient with smooth curves

### Performance Optimizations

- **Efficient Data Processing**: Single-pass aggregations with O(n) complexity
- **Smart Caching**: QuickChart handles image caching automatically
- **Minimal Memory Usage**: Stream processing for large datasets
- **Error Handling**: Graceful fallbacks for missing data fields

### Advanced Features

**Auto-Generated Insights**:

- Statistical calculations (percentages, averages, totals)
- Trend analysis and pattern detection
- Business intelligence summaries
- Contextual recommendations

**User Experience Enhancements**:

- Reply keyboards for easy navigation
- Visual progress indicators
- Error recovery mechanisms
- Mobile-optimized chart dimensions (800x600px)

## 📈 Use Cases & Business Applications

### Airlines & Travel Companies

- **Fleet Analysis**: Monitor airline performance and market share
- **Pricing Strategy**: Analyze competitor pricing across routes
- **Operational Insights**: Track duration patterns and efficiency

### Data Analytics Teams

- **Self-Service BI**: Enable non-technical users to generate reports
- **Mobile Dashboards**: Access insights anywhere via Telegram
- **Rapid Prototyping**: Quick data exploration without complex tools

### Business Intelligence

- **Executive Reporting**: Instant charts for presentations
- **Market Research**: Compare industry trends and benchmarks
- **Performance Monitoring**: Track KPIs in real-time

## 🎨 Customization Options

### Adding New Chart Types

1. Create a new Switch condition
2. Add a corresponding data processing node
3. Configure Chart.js options
4. Update the user interface menu

### Data Source Extensions

- Replace CSV with database connections
- Add real-time API integrations
- Implement data refresh mechanisms
- Support multiple file formats

### Visual Customizations

```javascript
// Custom color palette
backgroundColor: ['#your-colors'],

// Advanced styling
borderRadius: 8,
borderSkipped: false,

// Animation effects
animation: { duration: 2000, easing: 'easeInOutQuart' }
```

## 🔒 Security & Best Practices

### Data Protection

- Validate CSV input format
- Sanitize user inputs
- Implement rate limiting
- Secure file access permissions

### Error Handling

- Graceful degradation for API failures
- User-friendly error messages
- Automatic retry mechanisms
- Comprehensive logging

## 📊 Expected Outputs

### Sample Generated Insights

- "✈️ Vistara leads with 350+ flights, capturing 23.4% market share"
- "📈 Long-haul flights dominate at 61.1% of total bookings"
- "💰 Budget category (₹0-10K) represents 47.5% of all bookings"
- "📊 Average prices peak at ₹14K for 6-8 hour duration flights"

### Performance Metrics

- **Response Time**: <3 seconds for chart generation
- **Image Quality**: 800x600px high-resolution PNG
- **Data Capacity**: Handles 10K+ records efficiently
- **Concurrent Users**: Scales with n8n instance capacity

## 🚀 Getting Started

1. Download the workflow JSON
2. Import it into your n8n instance
3. Configure Telegram bot credentials
4. Upload your flight data CSV
5. Test with the /start command
6. Deploy and share with your team

## 💡 Pro Tips

- **Data Quality**: Clean data produces better insights
- **Mobile First**: Charts are optimized for mobile viewing
- **Batch Processing**: Handles large datasets efficiently
- **Extensible Design**: Easy to add new visualization types

Ready to transform your data into actionable insights? Import this template and start generating professional charts in minutes! 🚀
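As a closing technical note, the Image Rendering Service step can be sketched as a URL builder: QuickChart accepts a URL-encoded Chart.js config in its `c` query parameter, which an n8n HTTP Request node can then download as a binary PNG. The dimensions mirror the 800x600 size mentioned above; double-check parameter names against QuickChart's documentation for your use case:

```javascript
// Build a QuickChart render URL from a Chart.js config (sketch).
function quickChartUrl(chartConfig, width = 800, height = 600) {
  const c = encodeURIComponent(JSON.stringify(chartConfig));
  return `https://quickchart.io/chart?width=${width}&height=${height}&c=${c}`;
}

// Example: a minimal bar chart config.
const url = quickChartUrl({
  type: 'bar',
  data: { labels: ['Vistara'], datasets: [{ data: [350] }] },
});
// Feed `url` to an HTTP Request node (response format: file) and pass the
// resulting binary PNG straight to the Telegram "send photo" node.
```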
by Dr. Firas
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

# Automate Content Publishing to TikTok, YouTube, Instagram, Facebook via Blotato

## 🎯 Who is this for?

This workflow is perfect for:

- Content creators who post daily to multiple platforms
- Marketing teams managing brand presence across channels
- Solo entrepreneurs and social media managers looking to scale their output
- Anyone tired of uploading content manually across apps

## 💡 What problem is this solving?

Managing content across platforms is time-consuming. You need to:

- Track posts per platform
- Upload videos manually
- Adapt captions and posting times
- Avoid repetitive mistakes

This workflow solves all of that by centralizing everything in one place (Google Sheets) and automating it via Blotato.

## ⚙️ What this workflow does

Every hour, this workflow will:

1. Check your Google Sheet for any post marked as "TO GO"
2. Select one item at a time (avoids spam and overposting)
3. Extract media from a shared Google Drive link
4. Upload the media to Blotato
5. Publish it automatically to: TikTok, YouTube Shorts, Instagram, Facebook
6. Update the post status in your Sheet to "Posted"

## 🧰 Setup

Before running this template, make sure you have:

- ✅ A Blotato account (Pro plan required for an API key)
- 🔑 Generated your Blotato API key (Settings > API > Generate)
- 📦 Enabled Verified Community Nodes in the n8n Admin Panel
- 🧩 Installed the Blotato node via the community nodes list
- 🛠 Created a Blotato credential in n8n using your API key
- ☁️ Made sure your media folder in Google Drive is set to "Anyone with the link can view"
- 📌 Followed the 3 setup steps in the brown sticky notes inside the workflow

## 🛠 How to customize this workflow

- Add new platform nodes (LinkedIn, Threads, Pinterest, etc.) using Blotato
- Adjust the scheduling frequency from hourly to daily or weekly
- Add an approval layer (Slack/Telegram) before publishing
- Customize your captions dynamically using GPT or formulas in Sheets
- Use tags, categories, or campaign tracking for analytics

📄 Documentation: Notion Guide

Need help customizing? Contact me for consulting and support: LinkedIn / YouTube
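The "select one item at a time" step described above can be sketched as a small Code-node helper. The column name (`Status`) and the "TO GO"/"Posted" values follow the description; adapt them to your sheet's actual headers:

```javascript
// Pick at most one pending post per hourly run to avoid overposting.
// `rows` is the output of the Google Sheets "read rows" node.
function nextPostToPublish(rows) {
  return rows.find(row => row.Status === 'TO GO') || null;
}
```

After publishing succeeds, the same row's `Status` is written back as "Posted" so the next run picks the following item.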
by Ranjan Dailata
## Notice

Community nodes can only be installed on self-hosted instances of n8n.

## Who this is for

The Legal Case Research Extractor is a powerful automated workflow designed for legal tech teams, researchers, law firms, and data scientists focused on transforming unstructured legal case data into actionable, structured insights.

This workflow is tailored for:

- **Legal Researchers** automating case law data mining
- **Litigation Support Teams** handling large volumes of case records
- **LawTech Startups** building AI-powered legal research assistants
- **Compliance Analysts** extracting case-specific insights
- **AI Developers** working on legal NLP, summarization, and search engines

## What problem is this workflow solving?

Legal case data is often locked in semi-structured or raw HTML formats, scattered across jurisdiction-specific websites. Manually extracting and processing this data is tedious and inefficient. This workflow automates:

- Extraction of legal case data via Bright Data's powerful MCP infrastructure
- Parsing of HTML into clean, readable text using the Google Gemini LLM
- Structuring and delivering the output through a webhook and file storage

## What this workflow does

- **Input**: The "Set the Legal Case Research URL" node sets the legal case URL for data extraction.
- **Bright Data MCP Data Extractor**: The "Bright Data MCP Client For Legal Case Research" node performs the legal case extraction via the Bright Data MCP tool `scrape_as_html`.
- **Case Extractor**: A Google Gemini-based extractor produces a paginated list of cases.
- **Loop through Legal Case URLs**: Receives a collection of legal case links to process; each URL represents a different case from the target legal website.
- **Bright Data MCP Scraping**: Uses Bright Data's `scrape_as_html` MCP mode to retrieve the raw HTML content of each legal case.
- **Google Gemini LLM Extraction**: Transforms raw HTML into clean, structured text and performs additional information extraction if required (e.g., case summary, court, jurisdiction, etc.).
- **Webhook Notification**: Sends the extracted legal case content to a configurable webhook URL, enabling downstream processing or storage in legal databases.
- **Binary Conversion & File Persistence**: Converts the structured text to binary format and saves the final response to disk for archival or further processing.

## Pre-conditions

- Knowledge of the Model Context Protocol (MCP) is essential. Please read this blog post: model-context-protocol
- You need a Bright Data account and the setup described in the Setup section below.
- You need a Google Gemini API key. Visit Google AI Studio.
- You need to install the Bright Data MCP Server @brightdata/mcp
- You need to install n8n-nodes-mcp

## Setup

1. Set up n8n locally with MCP servers by following n8n-nodes-mcp.
2. Install the Bright Data MCP Server @brightdata/mcp on your local machine.
3. Sign up at Bright Data.
4. Create a Web Unlocker proxy zone called `mcp_unlocker` in the Bright Data control panel: navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
5. In n8n, configure the Google Gemini (PaLM) API account with your Google Gemini API key (or access through Vertex AI or a proxy).
6. In n8n, configure the MCP Client (STDIO) credentials to connect to the Bright Data MCP Server. Make sure to set your Bright Data API token in the Environments textbox as API_TOKEN=<your-token>.

## How to customize this workflow to your needs

- **Target New Legal Portals**: Modify the legal case input URLs to scrape from different state or federal case databases.
- **Customize LLM Extraction**: Modify the prompt to extract specific fields (case number, plaintiff, case summary, outcome, legal precedents, etc.); add a summarization step if needed.
- **Enhance Loop Handling**: Integrate with a Google Sheet or API to dynamically fetch case URLs; add error-handling logic to skip failed cases and log them.
- **Improve Security & Compliance**: Redact sensitive information before sending via webhook; store processed case data in encrypted cloud storage.
- **Output Formats**: Save as PDF, JSON, or Markdown; enable output to cloud storage (S3, Google Drive) or legal document management systems.
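The Binary Conversion & File Persistence step can be sketched as an n8n Code-node item: n8n stores binary payloads base64-encoded, which the "Read/Write Files from Disk" node can then write out. The field and file names below are illustrative:

```javascript
// Wrap extracted case text as an n8n item with a binary property,
// ready for a file-write node. (Sketch; adapt names to your workflow.)
function toBinaryItem(text, fileName = 'case.txt') {
  return {
    json: { fileName },
    binary: {
      data: {
        data: Buffer.from(text, 'utf8').toString('base64'),
        mimeType: 'text/plain',
        fileName,
      },
    },
  };
}
```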
by Jay Emp0
# 🤖 MCP Personal Assistant Workflow

## Description

This workflow integrates multiple productivity tools into a single AI-powered assistant using n8n, acting as a centralized control hub to receive and execute tasks across Google Calendar, Gmail, Google Drive, LinkedIn, Twitter, and more.

## ✅ Key Capabilities

- **AI Agent + Tool Use**: Built using n8n's AI Agent and MCP system, enabling intelligent multi-step reasoning.
- **Tool Integration**:
  - Google Calendar: schedule, update, delete events
  - Gmail: search, draft, send emails
  - Google Drive: manage files and folders
  - LinkedIn & Twitter: post updates, send DMs
  - Utility tools: fetch date/time, search URLs
- **Discord Input**: Accepts prompts via n8n_discord_trigger_bot (repo link in the credits)

## 🛠 Setup Instructions

1. **Timezone Configuration**: Go to Settings > Default Timezone in n8n and set it to your local timezone (e.g., Asia/Jakarta). Ensure all Date & Time nodes explicitly use the same zone to avoid UTC-related bugs.
2. **Tool Authentication**: Replace all OAuth credentials for Gmail, Google Drive, Google Calendar, Twitter, and LinkedIn. Use your own accounts when copying this workflow.
3. **Platform Adaptability**: While designed for Discord, you can replace the Discord trigger with any other chat or webhook service, e.g., Telegram, Slack, a WhatsApp webhook, or the n8n Form Trigger.

## 📦 Strengths

- Great for document retrieval, email summarization, calendar scheduling, and social posting.
- Reduces the need for tab-switching across multiple platforms.
- Tested with a comprehensive checklist across categories like Calendar, Gmail, Google Drive, Twitter, LinkedIn, utility tools, and cross-tool actions. (Refer to the discordGPT prompt checklist for prompt coverage.)

## ⚠️ Limitations

- ❌ **Binary Uploads**: AI agents and the MCP server currently struggle with binary payloads. Uploading files to Gmail, Google Drive, or LinkedIn may fail due to format serialization issues. Binary operations (upload/post) are under development and will be fixed in future iterations.
- ❌ **Date Bugs**: If timezone settings are incorrect, event times may default to UTC, leading to misaligned calendar events.

## 🔬 Testing

Use the provided prompt checklist for full coverage of:

- ✅ Core feature flows
- ✅ Edge cases (e.g., invalid dates, nonexistent users)
- ✅ Cross-tool chains (e.g., Google Drive → Gmail → LinkedIn)

## ✅ MCP Assistant Test Prompt Checklist

### 📅 Google Calendar

- [X] "Schedule a meeting with Alice tomorrow at 10am, and send an invite to alice@wonderland.com"
- [X] "Create an event called 'Project Sync' on Friday at 3pm with Bob and Charlie."
- [X] "Update the time of my call with James to next Monday at 2pm."
- [X] "Delete my meeting with Marketing next Wednesday."
- [X] "What is my schedule tomorrow?"

### 📧 Gmail

- [X] "Show me unread emails from this week."
- [X] "Search for emails with subject: invoice"
- [X] "Reply to the latest email from john@company.com saying 'Thanks, noted!'"
- [X] "Draft an email to info@a16z.com with subject 'Emp0 Fundraising' and draft the body of the email with an investment opportunity in Emp0; scrape this site https://Emp0.com to get to know more about emp0.com"
- [X] "Send an email to hi@cursor.com with subject 'Feature request' and cc sales@cursor.com"
- [ ] "Send an email to recruiting@openai.com, write about how you like their product and want to apply for a job there, and attach my latest CV from Google Drive"

### 🗂 Google Drive

- [ ] "Upload the PDF you just sent me to my Google Drive."
- [X] "Create a folder called 'July Reports' inside the Emp0 shared drive."
- [X] "Move the file named 'Q2_Review.pdf' to 'Reports/2024/Q2'."
- [X] "Share the folder 'Investor Decks' with info@a16z.com as viewer."
- [ ] "Download the file 'Wayne_Li_CV.pdf' and attach it in Discord."
- [X] "Search for a file named 'Invoice May' in my Google Drive."

### 🖼 LinkedIn

- [X] "Think of a random and inspiring quote. Post a text update on LinkedIn with the quote and end with a question so people will answer and increase engagement"
- [ ] "Post this Google Drive image to LinkedIn with the caption: 'Team offsite snapshots!'"
- [X] "Summarize the contents of this workflow and post it on LinkedIn with the original url https://n8n.io/workflows/5230-content-farming-ai-powered-blog-automation-for-wordpress/"

### 🐦 Twitter

- [X] "Tweet: 'AI is eating operations. Fast.'"
- [X] "Send a DM to @founderguy: 'Would love to connect on what you’re building.'"
- [X] "Search Twitter for keyword: 'founder advice'"

### 🌐 Utilities

- [X] "What time is it now?"
- [ ] "Download this PDF: https://ontheline.trincoll.edu/images/bookdown/sample-local-pdf.pdf"
- [X] "Search this URL and summarize important tech updates today: https://techcrunch.com/feed/"

### 📎 Discord Attachments

- [ ] "Take the image I just uploaded and post it to LinkedIn."
- [ ] "Get the file from my last message and upload it to Google Drive."

### 🧪 Edge Cases

- [X] "Schedule a meeting on Feb 30."
- [X] "Send a DM to @user_that_does_not_exist"
- [ ] "Download a 50MB PDF and post it to LinkedIn"
- [X] "Get the latest tweet from my timeline and email it to myself."

### 🔗 Cross-tool Flows

- [ ] "Get the latest image from my Google Drive and post it on LinkedIn with the caption 'Another milestone hit!'"
- [ ] "Find the latest PDF report in Google Drive and email it to investor@vc.com."
- [ ] "Download an image from this link and upload it to my Google Drive: https://example.com/image.png"
- [ ] "Get the most recent attachment from my inbox and upload it to Google Drive."

Run each of these in isolated test cases. For cross-tool flows, verify binary serialization integrity.

## 🧠 Why Use This Workflow?

This is an always-on personal assistant that can:

- Process natural language input
- Handle multi-step logic
- Execute commands across 6+ platforms
- Be extended with more tools and memory

If you want to interact with all your work tools from a single prompt, this is your base to start.
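The timezone pitfall flagged under Limitations is easy to reproduce: the same instant can fall on different calendar days depending on the zone used to render it. n8n's expressions expose Luxon for this, but the dependency-free sketch below uses Node's built-in `Intl` to make the point:

```javascript
// Render one instant in two zones. In a mis-configured workflow the
// UTC rendering would be used for a calendar event, shifting its date.
function formatInZone(date, timeZone) {
  return new Intl.DateTimeFormat('en-CA', {
    timeZone,
    year: 'numeric', month: '2-digit', day: '2-digit',
    hour: '2-digit', minute: '2-digit', hour12: false,
  }).format(date);
}

const instant = new Date(Date.UTC(2025, 0, 1, 17, 0)); // 17:00 UTC
formatInZone(instant, 'UTC');          // still Jan 1 in UTC
formatInZone(instant, 'Asia/Jakarta'); // already Jan 2 in UTC+7
```

Setting the default timezone (step 1 of the setup) makes n8n's date expressions resolve in the intended zone instead of UTC.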
## 📎 Repo & Credits

- Discord bot trigger: n8n_discord_trigger_bot
- Creator: Jay (Emp₀)
by Lucas Perret
## Who this is for

This workflow is for salespeople who want to quickly and efficiently follow up with their leads.

## What this workflow does

This workflow starts every time a new reply is received in lemlist. It then classifies the response using OpenAI and creates the correct follow-up task. The follow-up tasks currently include:

- A Slack alert for each new reply
- Tagging interested leads in lemlist
- Unsubscribing leads when they request it

The Slack alerts include:

- Lead email address
- Sender email address
- Reply type (positive, not interested, etc.)
- A preview of the reply

## Setup

To set this template up, simply follow the sticky-note steps inside it.

## How to customize this workflow to your needs

- Adjust the follow-up tasks to your needs
- Change the Slack notification to your needs
- ...
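The routing after classification can be sketched as a small mapping from reply type to follow-up tasks. The category labels below are illustrative assumptions; match them to whatever labels your OpenAI classification prompt actually emits:

```javascript
// Map a classified reply type to the follow-up tasks described above.
function followUpActions(replyType) {
  const actions = ['notify_slack']; // every reply triggers a Slack alert
  if (replyType === 'positive') actions.push('tag_interested_in_lemlist');
  if (replyType === 'unsubscribe') actions.push('unsubscribe_lead');
  return actions;
}
```

In the workflow itself this branching is done with Switch/IF nodes, but the logic is the same.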
by Joey D’Anna
This template is an error handler that will log n8n workflow errors to a Monday.com board for troubleshooting and tracking.

## Prerequisites

- A Monday.com account and Monday credential
- A board on Monday for error logging, with the following columns and types:
  - Timestamp (text)
  - Error Message (text)
  - Stack Trace (long text)
- Determine the column IDs using Monday's instructions

## Setup

1. Edit the Monday nodes to use your credential
2. Edit the node labeled CREATE ERROR ITEM to point to your error log board and group name
3. Edit the column IDs in the "Column Values" field of the UPDATE node to match the IDs of the fields on your error log board
4. To trigger error logging, select this workflow as the error workflow on any other workflow

For more detailed logging, add Stop and Error nodes in your workflow to send specific error messages to your board.
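The "Column Values" field can be sketched as a JSON string keyed by the column IDs you looked up. The IDs below (`text_1`, `text_2`, `long_text_1`) are placeholders, and the long-text shape follows Monday's column-value conventions; verify both against your own board:

```javascript
// Build the Monday column_values payload for an error item (sketch).
function buildColumnValues(error, columnIds) {
  return JSON.stringify({
    [columnIds.timestamp]: new Date().toISOString(),
    [columnIds.message]: error.message,
    [columnIds.stackTrace]: { text: error.stack || '' }, // long-text column
  });
}
```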
by Airtop
# Monitor Competitor Facebook Ads with Airtop

## Use Case

Monitor a competitor’s active Facebook ads and get a weekly HTML intelligence brief by email — saving time on manual research and helping you spot messaging, offers, and creative trends quickly.

## What This Automation Does

- Runs weekly on a set schedule.
- Uses Airtop to visit the competitor’s Facebook Ad Library page and extract up to 30 active ads.
- Summarizes each ad with key points: message, topic, CTA, duration active, language, target audience.
- Sends the compiled HTML report via Gmail.

## How It Works

1. **Schedule Trigger** – Fires once a week at the configured time.
2. **Airtop Extraction** – Loads the Ad Library URL and runs a prompt to extract and format the ads into HTML.
3. **Email Delivery** – Sends the HTML report to your specified recipient using Gmail.

## Setup Requirements

- **Airtop API Key** — Generate here.
- **Airtop Credential in n8n** — Add your API key under "Airtop" in n8n.
- **Gmail OAuth2 Credential** — Connect the Gmail account that will send the reports.
- **Competitor's Ad Library URL** — Replace the default view_all_page_id in the workflow with your target.

## Next Steps

- Duplicate the Airtop step for multiple competitors.
- Enrich reports by visiting ad landing pages for deeper analysis.
- Send outputs to Slack or archive them in a shared workspace.

Read about ways to monitor your competitors' ads here.
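Swapping in a new competitor boils down to changing the `view_all_page_id` query parameter in the Ad Library URL. The sketch below mirrors the shape of Ad Library links; the exact parameter set can vary, so compare against a URL copied from your own browser:

```javascript
// Build a Facebook Ad Library URL for a given page ID (sketch).
function adLibraryUrl(pageId, country = 'ALL') {
  const params = new URLSearchParams({
    active_status: 'active',
    country,
    view_all_page_id: String(pageId),
  });
  return `https://www.facebook.com/ads/library/?${params}`;
}
```

Duplicating the Airtop node with a different `pageId` is how you extend the report to multiple competitors.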
by Kumar Shivam
# Complete AI Product Description Generator

Transforms product images into high-converting copy with GPT-4o Vision + Claude 3.5.

The Shopify AI Product Description Factory is a production-grade n8n workflow that converts product images and metadata into refined, SEO-aware descriptions—fully automated and region-agnostic. It blends GPT-4o vision for visible attribute extraction, Claude 3.5 Sonnet for premium copy, Perplexity research for verified brand context, and Google Sheets for orchestration and audit trails, plus automated daily sales analytics enrichment. Link-header pagination and structured output enforcement ensure reliable scale.

To refine it for your use case, connect via my profile @connect.

## Key Advantages

- **Vision-first copywriting**: Uses gpt-4o to identify only visible physical attributes (closure, heel, materials, sole) from product images—no guesses.
- **Premium copy generation**: anthropic/claude-3.5-sonnet crafts concise, benefit-led descriptions with consistent tone, length control, and clean formatting.
- **Research-assisted accuracy**: perplexityTool verifies vendor/brand context from official sources to avoid speculation or fabricated claims.
- **Pagination you can trust**: Automates Shopify REST pagination via Link headers and persists page_info for resumable runs.
- **Google Sheets orchestration**: Centralized staging, status tracking, and QA in Products, with ProcessingState for batch/page markers and Error_log for diagnostics.
- **Bulletproof error feedback**: errorTrigger + AI diagnosis logs clear non-technical and technical explanations to Error_log for fast recovery.
- **Automated sales analytics**: Daily sales tracking automatically captures and enriches total sales data for comprehensive business intelligence and performance monitoring.
## How It Works

### Intake and filtering

- httpRequest fetches /admin/api/2024-04/products.json?limit=200&{page_info}
- code filters only items with:
  - An image present
  - Empty body_html
  - The currSeas:SS2025 tag
- Extracts tag metadata such as x-styleCode, country_of_origin, and gender when available

### Pagination controller

- code parses Link headers for rel="next" and extracts page_info
- googleSheets updates ProcessingState with page_info_next and increments the batch number for resumable polling

### Generation pipeline

- googleSheets pulls rows with Status = Ready for AI Description; limit throttles the batch size
- openAi Analyze Image (model gpt-4o) returns strictly visible features
- lmChatOpenRouter (Claude 3.5) composes the SEO description, optionally blending verified vendor context from perplexityTool
- outputParserStructured guarantees strict JSON: product_id, product_title (normalized), generated_description, status
- googleSheets writes results back to Products for review/publish

### Sales analytics enrichment

- **Schedule Trigger** runs daily at 2:01 PM to capture the previous day's sales
- httpRequest fetches paid orders from the Shopify REST API with date-range filtering
- splitOut and summarize nodes calculate total daily sales
- Automatic Google Sheets logging with date stamps and totals
- Zero-sale days are properly recorded for complete analytics continuity

### Reliability and insight

errorTrigger routes failures to an AI agent that explains the root cause and appends a concise note to Error_log.
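The pagination controller's Link-header handling can be sketched as follows. Shopify returns a header like `<https://shop.myshopify.com/admin/api/2024-04/products.json?page_info=abc&limit=200>; rel="next"`, and only the `page_info` token needs to be persisted to ProcessingState:

```javascript
// Extract the rel="next" page_info token from a Shopify Link header.
// Returns null on the last page (or when the header is absent).
function nextPageInfo(linkHeader) {
  if (!linkHeader) return null;
  const match = linkHeader.match(/<([^>]+)>;\s*rel="next"/);
  if (!match) return null; // last page reached
  return new URL(match[1]).searchParams.get('page_info');
}
```

Storing this token (rather than the full URL) is what makes interrupted runs resumable: the next poll simply appends `&page_info=<token>` to the products endpoint.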
## What's Inside (Node Map)

### Data + API

- httpRequest (Shopify REST 2024-04 for products and orders)
- googleSheets (multiple sheet operations)
- googleSheetsTool (error logging)

### AI models

- openAi (gpt-4o vision analysis)
- lmChatOpenRouter (anthropic/claude-3.5-sonnet for content generation)
- **AI Agent** (intelligent error diagnosis)

### Analytics & Processing

- splitOut (order data processing)
- summarize (sales totals calculation)
- set nodes (data field mapping)

### Tools and guards

- perplexityTool (brand research)
- outputParserStructured (JSON validation)
- memoryBufferWindow (conversation context)

### Control & Scheduling

- scheduleTrigger (multiple time-based triggers)
- cron (periodic execution)
- limit (batch size control)
- if (conditional logic)
- code (custom filtering and pagination logic)

### Observability

- errorTrigger + AI diagnosis to Error_log
- Processing state tracking
- Sales analytics logging

## Content & Compliance Rules

- **Locale-agnostic copy**; brand voice is configurable per store
- **Only image-verifiable attributes** (no guesses); clean HTML suitable for Shopify themes
- Optional normalization rules (e.g., color/branding cleanup, title sanitization)
- Style code inclusion supported when x-styleCode is present
- Gender-aware content generation when the gender tag is present
- **Strict JSON output** and schema consistency for safe downstream publishing

## Setup Steps

### Core integrations

- **Shopify Access Token** — Products read + Orders read (REST 2024-04)
- **OpenAI API** — gpt-4o vision
- **OpenRouter API** — Claude 3.5 Sonnet
- **Perplexity API** — vendor/market verification via perplexityTool
- **Google Sheets OAuth** — Products, ProcessingState, Error_log, Sales analytics

### Configure sheets

- **ProcessingState** with fields: batch number, page_info_next
- **Products** with: Product ID, Product Title, Product Type, Vendor, Image url, Status, country of origin, x_style_code, gender, Generated Description
- **Error_log** with: timestamp, Reason of Error
- **Sales Analytics Sheet** with: Date, Total Sales

## Workflow Capabilities

- **Discovery and staging**: Auto-paginate Shopify; stage eligible products in Sheets with reasons and timestamps.
- **Vision-grounded copywriting**: Descriptions reflect only visible attributes plus verified brand context; concise, mobile-friendly structure with gender-aware tone.
- **Metadata awareness**: Auto-injects x-styleCode, country_of_origin, and gender when present; natural SEO for brand and product type.
- **Sales intelligence**: Automated daily sales tracking with Melbourne timezone support; handles zero-sale days and maintains complete historical records.
- **Error analytics**: Layman + technical diagnosis logged to Error_log to shorten MTTR.
- **Safe output**: Structured JSON via outputParserStructured for predictable row updates.

## Credentials Required

- **Shopify Access Token** (Products + Orders read permissions)
- **OpenAI API Key** (GPT-4o vision)
- **OpenRouter API Key** (Claude 3.5 Sonnet)
- **Perplexity API Key**
- **Google Sheets OAuth**

## Ideal For

- **E-commerce teams** scaling compliant, on-brand product copy with comprehensive sales insights
- **Agencies and SEO specialists** standardizing image-grounded descriptions with performance tracking and analytics
- **Stores** needing resumable pagination, auditable content operations, and automated daily sales reporting in Sheets

## Advanced Features

- **Dual-workflow architecture**: Content generation + sales analytics in one system
- Link-header pagination with page_info persistence in ProcessingState
- Title/content normalization (e.g., color removal) configurable per brand
- **Gender-aware copywriting** based on product tags
- Memory windows (memoryBufferWindow) to keep multi-step prompts consistent
- **Melbourne timezone support** for accurate daily sales cutoffs
- **Zero-sales handling** ensures complete analytics continuity
- Structured output enforcement for downstream safety
- **AI-powered error diagnosis** with technical and layman explanations

## Time & Scheduling (Universal)

The workflow includes two independent schedules:

- **Content Generation**: Every 5 minutes (configurable) for product processing
- **Sales Analytics**: Daily at 2:01 PM Melbourne time for the previous day's sales

For globally distributed teams, schedule triggers and timestamps can be standardized on UTC to avoid regional drift.

## Pro Tip

Start with small batches (limit set to 10 or fewer) to validate both the copy generation and the sales tracking flows. The workflow handles the two operations independently: content generation failures won't affect sales analytics and vice versa. Monitor the Error_log sheet for any issues and use the ProcessingState sheet to track pagination progress.
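The splitOut + summarize stage of the sales analytics schedule reduces to summing order totals. Shopify's REST orders carry `total_price` as a string, and an empty day should still yield 0 so the analytics sheet has no gaps:

```javascript
// Total one day's paid orders (sketch of the summarize step).
// `orders` mirrors the Shopify REST shape: [{ total_price: '129.95', ... }]
function dailySalesTotal(orders) {
  return orders.reduce((sum, o) => sum + Number(o.total_price || 0), 0);
}
```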
by Khairul Muhtadin
# Tesseract - Money Mate

## Workflow Description

**Disclaimer**: This template requires the n8n-nodes-tesseractjs community node, which is only available on self-hosted n8n instances. You’ll need a self-hosted n8n setup to use this workflow.

## Who is this for?

This workflow is designed for individuals, freelancers, or small business owners who want an easy way to track expenses using Telegram. It’s ideal for anyone looking to digitize receipts—whether from photos or text messages—using free tools, without needing advanced technical skills.

## What problem does this workflow solve?

Manually entering receipt details into a spreadsheet or app is time-consuming and prone to mistakes. This workflow automates the process by extracting information from receipt images or text messages sent via Telegram, categorizing expenses, and sending back a clear, formatted summary. It saves time, reduces errors, and makes expense tracking effortless.

## What this workflow does

The workflow listens for messages sent to a Telegram bot, which can be either text descriptions of expenses or photos of receipts. If a photo is sent, Tesseract (an open-source text recognition tool) extracts the text. If text is sent, it’s processed directly. An AI model (LLaMA via OpenRouter) analyzes the input, categorizes it into expense types (e.g., Food & Beverages, Household, Transport), and creates a structured summary including store name, date, items, total, and category. The summary is then sent back to the user’s Telegram chat.

## Setup Instructions

Follow these step-by-step instructions to set up the workflow. No advanced technical knowledge is required, but you’ll need a self-hosted n8n instance.

### 1. Set Up a Self-Hosted n8n Instance

- If you don’t have n8n installed, follow the n8n self-hosting guide to set it up. You can use platforms like Docker or a cloud provider (e.g., DigitalOcean, AWS).
- Ensure your n8n instance is running and accessible via a web browser.
### 2. Install the Tesseract Community Node

- In your n8n instance, go to Settings > Community Nodes in the sidebar.
- Click Install a Community Node, then enter n8n-nodes-tesseractjs in the search bar.
- Click Install and wait for confirmation. This node enables receipt image processing.
- If you encounter issues, check the n8n community nodes documentation for troubleshooting.

### 3. Create a Telegram Bot

- Open Telegram and search for @BotFather to start a new bot.
- Send /start to BotFather, then /newbot to create your bot.
- Follow the prompts to name your bot (e.g., "MoneyMateBot"). BotFather will provide a Bot Token (e.g., 23872837287:ExampleExampleExample). Copy this token.
- In n8n, go to Credentials > Add Credential, select Telegram API, and paste the token. Name the credential (e.g., "MoneyMateBot") and save.

### 4. Set Up OpenRouter for AI Processing

- Sign up for a free account at OpenRouter.
- In your OpenRouter dashboard, generate an API Key under the API section.
- In n8n, go to Credentials > Add Credential, select OpenRouter API, and paste the API key. Name it (e.g., "OpenRouter Account") and save.
- The free tier of OpenRouter’s LLaMA model is sufficient for this workflow.

### 5. Import and Configure the Workflow

- Download the workflow JSON file (provided separately or copied from the source).
- In n8n, go to Workflows > Import Workflow and upload the JSON file.
- Open the imported workflow ("Tesseract - Money Mate").
- Ensure the Telegram Trigger and Send Expense Summary nodes use the Telegram credential you created.
- Ensure the AI Analyzer node uses the OpenRouter credential.
- Save the workflow.

### 6. Test the Workflow

- Activate the workflow by toggling the Active switch in n8n.
- In Telegram, find your bot (e.g., @MoneyMateBot) and send /start.
- Test with a sample input (see "Example Inputs" below).
- Check the n8n workflow execution panel to ensure data flows correctly. If errors occur, double-check your credentials and node connections.

### 7. Activate for Continuous Use

Once tested, keep the workflow active in n8n.
Your bot will now process any text or image sent to it via Telegram.

Example Inputs/Formats

To help the workflow process your data accurately, use clear and structured inputs. Below are examples of valid inputs:

Text Input Example: Send a message to your Telegram bot like this:

Bought coffee at Starbucks, Jalan Sudirman, yesterday. Total Rp 50,000. 2 lattes, each Rp 25,000.

Expected Output:

hello [Your Name] Ini Rekap Belanjamu
📋 Store: Starbucks
📍 Location: Jalan Sudirman
📅 Date: 2025-05-26
🛒 Items:
- Latte: Rp 25,000
- Latte: Rp 25,000
💸 Total: Rp 50,000
📌 Category: Food & Beverages

Image Input Example: Upload a photo of a receipt to your Telegram bot. The receipt should contain:
- Store name (e.g., “Alfamart”)
- Address (e.g., “Jl. Gatot Subroto, Jakarta”)
- Date and time (e.g., “27/05/2025 14:00”)
- Items with prices (e.g., “Bread Rp 15,000”, “Milk Rp 20,000”)
- Total amount (e.g., “Total: Rp 35,000”)

Expected Output:

hello [Your Name] Ini Rekap Belanjamu
📋 Store: Alfamart
📍 Location: Jl. Gatot Subroto, Jakarta
📅 Date: 2025-05-27 14:00
🛒 Items:
- Bread: Rp 15,000
- Milk: Rp 20,000
💸 Total: Rp 35,000
📌 Category: Household

Tips for Images: Ensure the receipt is well-lit and the text is readable. Avoid blurry or angled photos for better Tesseract accuracy.

How to Customize This Workflow

- **Change Expense Categories**: In the **AI Categorizer** node, edit the prompt to include custom categories (e.g., add “Entertainment” or “Utilities” to the list: Food & Beverages, Household, Transport).
- **Modify Response Format**: In the **Format Summary Message** node, adjust the JavaScript code to change how the summary looks (e.g., add emojis, reorder fields).
- **Save to a Database**: Add a node (e.g., Google Sheets or PostgreSQL) after the **Format Summary Message** node to store summaries.
- **Support Other Languages**: In the **AI Categorizer** node, update the prompt to handle additional languages (e.g., Spanish, Mandarin) by specifying them in the instructions.
- **Add Error Handling**: Enhance the **Check Invalid Input** node to catch more edge cases, like invalid dates.

All Free, End-to-End

This workflow is 100% free! It leverages:
- **Telegram Bot API**: Free via BotFather.
- **Tesseract**: Open-source text recognition.
- **LLaMA via OpenRouter**: Free tier available for AI processing.

Enjoy automating your expense tracking without any cost!

Made by: khmuhtadin
Need a custom? Contact me on LinkedIn or Web.
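To illustrate the “Modify Response Format” customization above, here is a minimal, self-contained sketch of what the Format Summary Message logic could look like. It assumes the AI step emits a structured expense object with `store`, `location`, `date`, `items`, `total`, and `category` fields — these names are illustrative, not the template’s actual node code.

```javascript
// Hypothetical sketch of the "Format Summary Message" Code node logic:
// builds the Telegram reply shown in the examples from a structured
// expense object. Field names are assumptions for illustration.
function formatSummary(userName, expense) {
  const items = expense.items
    .map((it) => `- ${it.name}: Rp ${it.price.toLocaleString('en-US')}`)
    .join('\n');
  return [
    `hello ${userName} Ini Rekap Belanjamu`,
    `📋 Store: ${expense.store}`,
    `📍 Location: ${expense.location}`,
    `📅 Date: ${expense.date}`,
    `🛒 Items:\n${items}`,
    `💸 Total: Rp ${expense.total.toLocaleString('en-US')}`,
    `📌 Category: ${expense.category}`,
  ].join('\n');
}

const message = formatSummary('Alex', {
  store: 'Starbucks',
  location: 'Jalan Sudirman',
  date: '2025-05-26',
  items: [
    { name: 'Latte', price: 25000 },
    { name: 'Latte', price: 25000 },
  ],
  total: 50000,
  category: 'Food & Beverages',
});
console.log(message);
```

Reordering the array entries or changing the emoji labels is all it takes to reshape the reply.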
by Jimleuk
This n8n template demonstrates a simple approach to using AI to automate the generation of blog content that aligns with your organisation's brand voice and style, using examples of previously published articles. In a way, it's quick and dirty "training" which can get your automated content generation strategy up and running for very little effort and cost whilst you evaluate your AI content pipeline.

How it works

In this demonstration, the n8n.io blog is used as the source of existing published content, and 5 of the latest articles are imported via the HTTP node. The HTML node extracts the article bodies, which are then converted to markdown for our LLMs. We use LLM nodes to (1) understand the article structure and writing style and (2) identify the brand voice characteristics used in the posts. These are then used as guidelines in our final LLM node when generating new articles. Finally, a draft is saved to Wordpress for human editors to review or use as a starting point for their own articles.

How to use

- Update Step 1 to fetch data from your desired blog, or change it to fetch existing content in a different way.
- Update Step 5 to provide your new article instruction. For optimal output, theme topics relevant to your brand.

Requirements

- A source of text-heavy content is required to accurately break down the brand voice and article style. Don't have your own? Maybe try your competitors'?
- **OpenAI** for LLM - though I recommend exploring other models which may give subjectively better results.
- **Wordpress** for blog, but feel free to use other preferred publishing platforms.

Customising this workflow

Ideally, you'd want to "train" your agent on material which is similar to your output, i.e. your social media posts may not get the best results from your blog content due to differing formats. Typically, this brand voice extraction exercise should run once and then be cached somewhere for reuse later. This would save on generation time and overall cost of the workflow.
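The two extraction steps feed the final generation step as plain-text guidelines. A minimal sketch of how those pieces could be assembled into one generation prompt — the function name, argument names, and guideline strings are all assumptions for illustration, not the template's actual node code:

```javascript
// Hypothetical assembly of the final LLM prompt from the two extraction
// outputs (article style + brand voice) plus the user's article instruction.
function buildArticlePrompt(styleGuide, brandVoice, instruction) {
  return [
    'You are a content writer for our blog.',
    `Follow this article structure and writing style:\n${styleGuide}`,
    `Write in this brand voice:\n${brandVoice}`,
    `Task: ${instruction}`,
  ].join('\n\n');
}

const prompt = buildArticlePrompt(
  '- Short intro\n- H2 sections with concrete examples\n- Closing call to action',
  'Friendly, practical, avoids jargon',
  'Write an article about automating invoice processing.'
);
console.log(prompt);
```

Because the style and voice strings are plain text, caching them (as suggested under "Customising this workflow") is just a matter of storing two strings and skipping the extraction LLM calls on subsequent runs.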
by Alexey from Mingles.ai
AI Image Generator & Editor with GPT-4 Vision - Complete Workflow Template

Description

Transform text prompts into stunning images or edit existing visuals using OpenAI's latest GPT-4 Vision model through an intuitive web form interface. This comprehensive n8n automation provides three powerful image generation modes:

🎨 Text-to-Image Generation
Simply enter a descriptive prompt and generate high-quality images from scratch using OpenAI's gpt-image-1 model. Perfect for creating original artwork, concepts, or visual content.

🖼️ Image-to-Image Editing
Upload an existing image file and transform it based on your text prompt. The AI analyzes your input image and applies modifications while maintaining the original structure and context.

🔗 URL-Based Image Editing
Provide a direct URL to any online image and edit it with AI. Great for quick modifications of web images or collaborative workflows.

Key Features

Smart Input Processing
- **Flexible Form Interface**: User-friendly web form with authentication
- **Multiple Input Methods**: File upload, URL input, or text-only generation
- **Quality Control**: Selectable quality levels (low, medium, high)
- **Format Support**: Accepts PNG, JPG, and JPEG formats

Advanced AI Integration
- **Latest GPT-4 Vision Model**: Uses gpt-image-1 for superior results
- **Intelligent Switching**: Automatically detects input type and routes accordingly
- **Context-Aware Editing**: Maintains image coherence during modifications
- **Customizable Parameters**: Control size (1024x1024), quality, and generation settings

Dual Storage Options
- **Google Drive Integration**: Automatic upload with public sharing permissions
- **ImgBB Hosting**: Alternative cloud storage for instant public URLs
- **File Management**: Organized storage with timestamp-based naming

Instant Telegram Delivery
- **Real-time Notifications**: Results sent directly to your Telegram chat
- **Rich Media Messages**: Includes generated image with prompt details
- **Quick Access Links**: Direct links to view and download results
- **Markdown Formatting**: Clean, professional message presentation

Technical Workflow

1. Form Submission → User submits prompt and optional image
2. Smart Routing → System detects input type (text/file/URL)
3. AI Processing → OpenAI generates or edits image based on mode
4. Binary Conversion → Converts base64 response to downloadable file
5. Cloud Upload → Stores in Google Drive or ImgBB with public access
6. Telegram Delivery → Sends result with viewing links and metadata

Perfect For

- **Content Creators**: Generate unique visuals for social media and marketing
- **Designers**: Quick concept development and image variations
- **Developers**: Automated image processing for applications
- **Teams**: Collaborative image editing and sharing workflows
- **Personal Use**: Transform ideas into visual content effortlessly

Setup Requirements

- **OpenAI API Key**: Access to GPT-4 Vision model
- **Google Drive API** (optional): For Google Drive storage
- **ImgBB API Key** (optional): For alternative image hosting
- **Telegram Bot**: For result delivery
- **Basic Auth Credentials**: For form security

What You Get

✅ Complete image generation and editing pipeline
✅ Secure web form with authentication
✅ Dual cloud storage options
✅ Instant Telegram notifications
✅ Professional result formatting
✅ Flexible input methods
✅ Quality control settings
✅ Automated file management

Start creating AI-powered images in minutes with this production-ready template!

Tags: #AI #ImageGeneration #OpenAI #GPT4 #ImageEditing #Telegram #GoogleDrive #Automation #ComputerVision #ContentCreation
by Diego Alejandro Parrás
AI job application tracker and interview prep assistant

Categories: AI, Productivity, Career

Transform your job search from chaos to clarity. This workflow automatically tracks every application, researches companies, and prepares you for interviews with AI-generated materials — all saved to a visual Notion pipeline.

Benefits

- **Never lose track of applications** — Every job gets logged automatically with status tracking
- **Walk into interviews prepared** — AI generates likely questions, talking points, and company insights
- **Save 2-3 hours per application** — Research and prep that took hours now happens in seconds
- **Automated follow-up reminders** — Get notified when it's time to send that follow-up email

How It Works

1. Submit application via form (paste job URL) or forward your confirmation email
2. AI extracts job details — company, role, requirements, salary, location
3. Interview prep generates — likely questions, suggested talking points, questions to ask
4. Everything saves to Notion — visual pipeline with follow-up dates
5. Daily reminders — Slack notification for applications needing follow-up

Required Setup

Notion Database Structure

Create a Notion database with these properties:

| Property Name | Type | Purpose |
|---------------|------|---------|
| Company | Title | Company name |
| Role | Text | Job title |
| Status | Select | Applied, Interviewing, Offer, Rejected, Ghosted |
| Applied Date | Date | When you applied |
| Salary Range | Text | Compensation info |
| Job URL | URL | Link to posting |
| Location | Text | City/Remote |
| Interview Prep | Text | AI-generated prep materials |
| Follow Up Date | Date | When to follow up |
| Requirements | Text | Key job requirements |
| Notes | Text | Your personal notes |

Credentials Needed

- **OpenAI API** — For job extraction and interview prep generation
- **Notion** — Connected to your job tracker database
- **Gmail** (optional) — For email forwarding and confirmations
- **Slack** (optional) — For follow-up reminders

Use Cases

- **Active job seekers** — Track 20+ applications without spreadsheet chaos
- **Career changers** — Get AI help understanding new industry requirements
- **Recent graduates** — Build interview confidence with generated prep materials
- **Passive searchers** — Keep a running list with minimal effort

Set Up Steps

1. Import the workflow into your n8n instance
2. Create the Notion database with the structure above (or duplicate the template)
3. Connect OpenAI credentials — API key with GPT-5 access recommended
4. Connect Notion credentials — Select your job tracker database
5. Configure Gmail trigger (optional) — Set a filter for forwarded job emails
6. Set up Slack webhook (optional) — Choose a channel for reminders
7. Test with a sample job posting — Paste a LinkedIn or company careers page URL

Customization Tips

- Edit the interview prep prompt to mention your background/experience
- Adjust the follow-up reminder interval (default: 7 days)
- Add additional research sources (LinkedIn, Crunchbase) for richer company data
- Connect to a calendar to block interview prep time automatically

Technical Notes

- Uses Jina AI Reader for web scraping (free tier available)
- GPT-5-mini recommended for cost efficiency
- Notion text fields limited to 2000 characters (full prep saved)
- Daily check runs at 9 AM (configurable)

Difficulty Level: Intermediate
Estimated Setup Time: 30-45 minutes
Monthly Operating Cost: ~$2-5 (based on 50 applications/month with GPT-5-mini)
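The technical note about Notion's 2000-character limit on text fields implies a truncation step between the AI output and the Notion node. A minimal sketch of that preparation step, assuming a flat job object with illustrative field names (`company`, `role`, `interviewPrep`, `followUpDate` — not the template's actual node code):

```javascript
// Hypothetical sketch: map extracted job data onto the Notion properties
// above, clamping long text (e.g. interview prep) to Notion's 2000-char
// rich-text limit so the write does not fail.
const NOTION_TEXT_LIMIT = 2000;

function prepareForNotion(job) {
  const clamp = (s) =>
    s.length > NOTION_TEXT_LIMIT ? s.slice(0, NOTION_TEXT_LIMIT - 1) + '…' : s;
  return {
    Company: job.company,
    Role: job.role,
    Status: 'Applied',
    'Interview Prep': clamp(job.interviewPrep),
    'Follow Up Date': job.followUpDate,
  };
}

const row = prepareForNotion({
  company: 'Acme Corp',
  role: 'Data Engineer',
  // ~3900 chars of prep text, to exercise the clamp
  interviewPrep: 'Q: Tell me about a pipeline you built. '.repeat(100),
  followUpDate: '2025-06-10',
});
console.log(row['Interview Prep'].length);
```

The full, untruncated prep text can still be kept elsewhere (the listing notes "full prep saved"), with only the clamped version written to the Notion property.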