by RamK
Description
This workflow acts as an autonomous Tier 2 Customer Support Agent. It doesn't just answer questions; it manages the entire lifecycle of a support ticket, from triage to resolution, with guardrails that block prompt injection, PII exposure, and similar threats and log them in Airtable. Unlike standard auto-responders, this system uses a "Master Orchestrator" architecture to coordinate specialized sub-agents. It creates a safe, human-like support experience by combining RAG (Knowledge Base retrieval) with a safety-first state machine.

How it works
The workflow operates on a strict "Hub and Spoke" model managed by a Master Orchestrator:

- Security Guardrails (The Gatekeeper): Before the AI even sees the message, a hard-coded security layer scans for prompt-injection attacks, profanity, and PII. If a threat is detected, the workflow locks down, logs the incident to Airtable, and stops execution immediately.
- Orchestration & Triage: Once the message passes safety checks, the Master Orchestrator takes over. Its first action is to call the Ticket Analyser Agent.
- Analysis & Scoring: The Ticket Analyser classifies the issue (e.g., "Technical," "Billing") and scores the customer's sentiment. It returns a priority_score to the Master Orchestrator.
- Decision Logic (Circuit Breaker): The Master Orchestrator evaluates the score. Escalation: if the customer is "Furious" or the score is high, it bypasses AI drafting and immediately alerts a human manager via Slack. Resolution path: if the request is standard, it proceeds to the next steps.
- Knowledge Retrieval (RAG): The Orchestrator calls the Knowledge Worker Agent, which searches your Supabase vector store for specific, verified company policies or troubleshooting steps relevant to the user's issue.
- Resolution Drafting: Armed with the analysis and the retrieved facts, the Orchestrator calls the Resolution Agent, which synthesizes a polite, professional email draft.
- Final Execution: The Master Orchestrator reviews the final draft and sends the email via Gmail.

Set up
This is a multi-agent system. Please follow these steps to configure the environment.

⚠️ IMPORTANT: This template contains the Main Orchestrator AND the sub-agents in a single view. You must separate them for the system to function:

1. Separate the agents: Copy the nodes for each sub-agent (Ticket Analyser, Knowledge Worker, Resolution Agent) into their own new workflows.
2. Link the tools: In the Main Orchestrator workflow, open the "Call [Agent Name]" tool nodes and update the Workflow ID to point to the new workflows you just created.
3. Configure credentials: You will need credentials for Gmail (or your preferred email provider), Slack, Airtable, Supabase (for the vector store), and Google Gemini (or OpenAI).
4. Initialize the knowledge base: Open the "One time Document Loader" section in the workflow, upload your policy document (PDF/text) to the "Upload your file here" node, and run this branch once to vectorize your documents into Supabase.
5. Set up Airtable: Create a simple table with columns for Sender Email, Incident Type, and Flagged Content to log security threats caught by the guardrails.
6. Customize the trigger: Update the Gmail Trigger node to watch your specific support alias (e.g., support@yourdomain.com) and ensure it only picks up "Unread" emails.
7. Adjust the escalation sensitivity: In the Orchestrator Agent node, you can tweak the "Phase 2" logic to change what triggers a human hand-off (currently set to priority_score >= 0.9).

Good to go!
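As a rough illustration, the circuit-breaker decision in the "Phase 2" logic could be expressed like this. The function name, the sentiment label, and the input shape are assumptions for the sketch; only the priority_score >= 0.9 threshold comes from the template:

```javascript
// Hypothetical sketch of the Orchestrator's circuit-breaker check.
// Input shape is assumed; only the 0.9 threshold comes from the template.
function decideRoute(analysis, threshold = 0.9) {
  const { priority_score, sentiment } = analysis;
  // Furious customers or high scores skip AI drafting and go to a human via Slack.
  if (sentiment === 'Furious' || priority_score >= threshold) {
    return { route: 'escalate_to_human', notify: 'slack' };
  }
  // Standard requests continue to Knowledge Retrieval and Resolution drafting.
  return { route: 'resolution_path', notify: null };
}

// Example: a calm, low-priority billing question stays automated.
const route = decideRoute({ priority_score: 0.4, sentiment: 'Neutral' });
// route.route === 'resolution_path'
```

Lowering the threshold makes the agent hand off to humans more eagerly; raising it keeps more tickets in the automated path.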
by Tsubasa Shukuwa
How it works
This workflow automatically fetches the latest public grant information from the Ministry of Health, Labour and Welfare (MHLW) RSS feed. It uses AI to summarize and structure each grant post into a clear format, stores the results in Google Sheets, and sends a formatted HTML summary via Gmail.

Workflow summary
- Schedule Trigger – Runs the flow daily or weekly.
- RSS Feed Reader – Fetches the latest MHLW news and updates.
- Text Classifier (AI) – Categorizes each item as "Grant/Subsidy", "Labor-related", or "Other".
- AI Agent – Extracts structured data such as title, summary, deadline, amount, target, and URL.
- Google Sheets – Appends or updates the database using the grant title as the key.
- Code Node – Builds an HTML report summarizing new entries.
- Gmail – Sends a daily digest email to your inbox.

Setup steps
- Add your OpenRouter API key as a credential (used in the AI Agent).
- Replace the Google Sheets ID and sheet name with your own.
- Update the recipient email address in the Gmail node.
- Adjust the schedule trigger to match your preferred frequency.
- (Optional) Add more RSS feeds if you want to monitor other sources.

Ideal for
- Consultants or administrators tracking subsidy and grant programs
- Small business owners who want automatic updates
- Anyone who wants a daily AI-summarized government grant digest

⚙️ Note: Detailed explanations and setup hints are included as Sticky Notes above each node inside the workflow.
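A minimal sketch of what the Code node's HTML report step might look like; the item field names (title, summary, deadline, url) are assumptions based on the extraction step described above, not the template's exact schema:

```javascript
// Hypothetical Code-node sketch: turn structured grant items into an HTML digest.
function buildDigest(items) {
  const rows = items
    .map(g => `<li><a href="${g.url}"><strong>${g.title}</strong></a><br>` +
              `${g.summary} (deadline: ${g.deadline})</li>`)
    .join('');
  return `<h2>New grant updates</h2><ul>${rows}</ul>`;
}

const html = buildDigest([
  { title: 'SME Support Grant', summary: 'Hiring subsidy for small firms',
    deadline: '2025-12-01', url: 'https://example.com/grant' },
]);
// html starts with "<h2>New grant updates</h2>"
```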
by Rahul Joshi
Description
Automatically consolidate Zendesk and Freshdesk ticket data into a unified performance dashboard with KPI calculations, Google Sheets logging, real-time Slack alerts, and weekly Gmail email reports. Provides complete visibility into support operations, SLA compliance, and customer satisfaction across multiple platforms. 📊💬📧

What This Template Does
- Runs weekly on schedule to fetch tickets from both Zendesk and Freshdesk. ⏰
- Merges ticket data into a standardized JSON structure with normalized priorities, statuses, and channels. 🔄
- Logs all tickets and metadata into Google Sheets for audit-ready performance tracking. 📑
- Calculates advanced KPIs including resolution rates, SLA breaches, CSAT score estimation, urgent ticket rates, and performance grading. 📊
- Evaluates alert conditions (e.g., high SLA breach, low CSAT, backlog risk). 🚨
- Sends formatted Slack alerts with performance grades, key metrics, and recommendations. 💬
- Generates corporate-style HTML weekly reports and delivers them via Gmail. 📧

Key Benefits
- Unifies Zendesk and Freshdesk data into one consistent reporting flow. 🌐
- Provides actionable KPIs for SLA monitoring, customer satisfaction, and backlog health. ⏱️
- Ensures leadership visibility with Google Sheets logs and professional email reports. 🧾
- Alerts the support team instantly on Slack when performance drops. 🚨
- Reduces manual data analysis with automated grading and recommendations. 🤖

Features
- Multi-Platform Ticket Integration – Fetches tickets from Zendesk and Freshdesk. 🎫
- Data Normalization – Cleans descriptions, maps priorities/statuses, and detects escalations. 🧼
- Google Sheets Logging – Tracks tickets with IDs, URLs, tags, timestamps, and metadata. 📈
- KPI Calculation Engine – Computes SLA breach rate, resolution rate, CSAT, escalation %, and more. 🧮
- Performance Grading – Grades support performance (A–D) with detailed descriptions. 🏅
- Slack Alerts – Notifies with active alerts, recommendations, and emoji-based health signals.
- 📢 Weekly Gmail Reports – Delivers branded HTML reports for management and audits. ✨

Requirements
- n8n instance (cloud or self-hosted).
- Zendesk API credentials with ticket read access.
- Freshdesk API credentials with ticket read access.
- Google Sheets OAuth2 credentials with spreadsheet write permissions.
- Slack Bot API credentials with posting permissions.
- Gmail OAuth2 credentials with send email permissions.
- Pre-configured Google Sheet for KPI logging.

Target Audience
- Support managers overseeing multi-platform ticketing systems. 👩💻
- Customer success teams monitoring SLA compliance and CSAT health. 🚀
- SMBs running Zendesk + Freshdesk who need unified dashboards. 🏢
- Remote/global support teams needing automated KPI visibility. 🌐
- Executives requiring weekly performance reports and recommendations. 📈

Step-by-Step Setup Instructions
1. Connect Zendesk, Freshdesk, Google Sheets, Slack, and Gmail credentials in n8n. 🔑
2. Update the Google Sheet ID in the "Log KPIs in Google Sheets" node. 📊
3. Configure the Slack channel ID for alerts (default: zendesk-churn-alerts). 💬
4. Replace {Enter Your Email} in the Gmail node with your recipient email. 📧
5. Adjust thresholds in the KPI calculation node (default: 4h response, 24h resolution). ⏱️
6. Test with sample tickets to validate Sheets logging, Slack alerts, and Gmail reports. ✅
7. Deploy on schedule (default: weekly at 8 PM) for continuous tracking. 🗓️
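To make the KPI step concrete, here is a hedged sketch of a resolution-rate / SLA-breach / grading calculation. The ticket field names and the grade cut-offs are illustrative, not the template's exact logic; only the 24h resolution default is taken from the setup notes:

```javascript
// Illustrative KPI engine over merged Zendesk + Freshdesk tickets.
// A ticket here is { status, resolutionHours }; real tickets carry more fields.
function computeKpis(tickets, slaHours = 24) {
  const total = tickets.length;
  const resolved = tickets.filter(t => t.status === 'resolved').length;
  const breaches = tickets.filter(t => t.resolutionHours > slaHours).length;
  const resolutionRate = total ? resolved / total : 0;
  const slaBreachRate = total ? breaches / total : 0;
  // Simple A–D grading from resolution rate and SLA health.
  let grade = 'D';
  if (resolutionRate >= 0.9 && slaBreachRate <= 0.05) grade = 'A';
  else if (resolutionRate >= 0.75 && slaBreachRate <= 0.15) grade = 'B';
  else if (resolutionRate >= 0.5) grade = 'C';
  return { total, resolutionRate, slaBreachRate, grade };
}

// Example: half the tickets resolved, none past SLA.
const weekly = computeKpis([
  { status: 'resolved', resolutionHours: 3 },
  { status: 'open', resolutionHours: 0 },
]);
// weekly.grade === 'C'
```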
by vinci-king-01
Lead Scoring Pipeline with Telegram and Box

This workflow ingests incoming lead data from a form submission webhook, enriches each lead with external data sources, applies a custom scoring algorithm, and automatically stores the enriched record in Box while notifying your sales team in Telegram. It is designed to give you a real-time, end-to-end lead-qualification pipeline without writing any glue code.

Pre-conditions / Requirements

Prerequisites
- n8n instance (self-hosted or n8n.cloud)
- ScrapeGraphAI community node installed (not directly used in this template but required by marketplace listing rules)
- Telegram bot created via BotFather
- Box account (Developer App or User OAuth2)
- Publicly accessible URL (for the Webhook trigger)
- Optional: Enrichment API account (e.g., Clearbit, PDL) for richer scoring data

Required Credentials

| Credential | Scope | Purpose |
|------------|-------|---------|
| Telegram Bot Token | Bot | Send scored-lead alerts |
| Box OAuth2 Credentials | App Level | Upload enriched lead JSON/CSV |
| (Optional) Enrichment API Key | REST | Append firmographic & technographic data |

Environment Variables (Recommended)

| Variable | Example | Description |
|---------|----------|-------------|
| LEAD_SCORE_THRESHOLD | 75 | Minimum score that triggers a Telegram alert |
| BOX_FOLDER_ID | 123456789 | Destination folder for lead files |

How it works
This workflow listens for new form submissions, enriches each contact with external data, calculates a lead score based on configurable criteria, and routes the lead through one of two branches: high-value leads trigger an instant Telegram alert and are archived to Box, while low-value leads are archived only. Errors are captured by an Error Trigger for post-mortem analysis.

Key Steps:
- **Webhook Trigger**: Receives raw form data (name, email, company, etc.).
- **Set node – Normalization**: Renames fields and initializes default values.
- **HTTP Request – Enrichment**: Calls an external enrichment API to augment data.
- **Merge node**: Combines original and enriched data into a single object.
- **Code node – Scoring**: Runs JavaScript to calculate a numeric lead score.
- **If node – Qualification Gate**: Checks if score ≥ LEAD_SCORE_THRESHOLD.
- **Telegram node**: Sends an alert message to your sales channel for high-scoring leads.
- **Box node**: Uploads the enriched JSON (or CSV) file into a specified folder.
- **Error Trigger**: Captures any unhandled errors and notifies ops (optional).
- **Sticky Notes**: Explain scoring logic and credential placement (documentation aids).

Set up steps
Setup time: 15–25 minutes

1. Create a Telegram bot & get the token: talk to BotFather → /newbot → copy the provided token.
2. Create a Box Developer App: enable OAuth2 → add https://api.n8n.cloud/oauth2-credential/callback (or your own) as the redirect URI.
3. Install required community nodes: from the n8n editor → "Install" → search "ScrapeGraphAI" → install.
4. Import the workflow JSON: click "Import" → paste the workflow file → save.
5. Configure the webhook URL in your form tool: copy the production URL generated by the Webhook node → add it as the form action.
6. Set environment variables: in n8n (Settings → Environment) add LEAD_SCORE_THRESHOLD and BOX_FOLDER_ID.
7. Fill in all credentials: Telegram (paste the bot token), Box (complete the OAuth2 flow), enrichment API (paste the key in the HTTP Request node headers).
8. Activate the workflow: toggle "Activate" and submit a test form to verify Telegram/Box outputs.

Node Descriptions

Core Workflow Nodes:
- **Webhook** – Entry point; captures the incoming JSON payload from the form.
- **Set (Normalize Fields)** – Maps raw keys to standardized ones (firstName, email, etc.).
- **HTTP Request (Enrichment)** – Queries an external service for firmographic data.
- **Merge (Combine Data)** – Merges the two JSON objects (form + enrichment).
- **Code (Scoring)** – Calculates the lead score using weighted attributes.
- **If (Score Check)** – Branches the flow based on the score threshold.
- **Telegram** – Sends high-score alerts to a specified chat ID.
- **Box** – Saves a JSON/CSV file of the enriched lead to cloud storage.
- **Error Trigger** – Executes if any preceding node fails.
- **Sticky Notes** – Inline documentation for quick reference.

Data Flow:
Webhook → Set → HTTP Request → Merge → Code → If
If (true) → Telegram
If (always) → Box
Error (from any node) → Error Trigger

Customization Examples

Change Scoring Logic

```javascript
// Inside the Code node
const { jobTitle, companySize, technologies } = items[0].json;
let score = 0;
if (jobTitle.match(/(CTO|CEO|Founder)/i)) score += 50;
if (companySize > 500) score += 20;
if (technologies.includes('AWS')) score += 10;
// Bonus: subtract points if free email domain
if (items[0].json.email.endsWith('@gmail.com')) score -= 30;
return [{ json: { ...items[0].json, score } }];
```

Use a Different Storage Provider (e.g., Google Drive)

Replace the Box node with a Google Drive node configured like:

```json
{
  "node": "Google Drive",
  "operation": "upload",
  "fileName": "lead_{{$json.email}}.json",
  "folderId": "1A2B3C..."
}
```

Data Output Format

The workflow outputs structured JSON data:

```json
{
  "firstName": "Ada",
  "lastName": "Lovelace",
  "email": "ada@example.com",
  "company": "Analytical Engines Inc.",
  "companySize": 250,
  "jobTitle": "CTO",
  "technologies": ["AWS", "Docker", "Node.js"],
  "score": 82,
  "qualified": true,
  "timestamp": "2024-04-07T12:34:56.000Z"
}
```

Troubleshooting

Common Issues
- Telegram messages not received – Ensure the bot is added to the group and the chat_id/token are correct.
- Box upload fails with 403 – Check folder permissions; verify the OAuth2 tokens have not expired.
- Webhook shows 404 – The workflow is not activated, or the URL was copied in "Test" mode instead of "Production".

Performance Tips
- Batch multiple form submissions using the "SplitInBatches" node to reduce API-call overhead.
- Cache enrichment responses (Redis, n8n Memory) to avoid repeated lookups for the same domain.

Pro Tips:
- Add an n8n "Wait" node between enrichment calls to respect rate limits.
- Use Static Data to store domain-level enrichment results for even faster runs.
- Tag Telegram alerts with emojis based on score (🔥 Hot Lead for >90).

This is a community-contributed n8n workflow template provided "as-is." Always test thoroughly in a non-production environment before deploying to live systems.
by Anatoly
Automated Solana News Tracker with AI-Powered Weekly Summaries

Never miss important Solana ecosystem updates again. This production-ready workflow automatically scrapes crypto news daily, intelligently filters duplicates, stores everything in Google Sheets, and generates AI-powered weekly summaries every Monday, completely hands-free.

🎯 What It Does:
This intelligent automation runs on autopilot to keep you informed about Solana developments without manual monitoring. Every day at 8 AM PT, it fetches the latest Solana news from CryptoPanic, checks for duplicates against your existing database, and stores only new articles in Google Sheets. On Mondays, it takes an extra step: reading all accumulated articles from the past week and using GPT-4.1-mini to generate a concise, factual summary of key developments and investor takeaways.

- **Daily News Collection**: Automatically fetches the latest Solana articles from the CryptoPanic API
- **Smart Duplicate Detection**: Compares incoming articles against the existing database to prevent redundancy
- **Data Validation**: Filters out incomplete articles to ensure data quality
- **Organized Storage**: Maintains a clean Google Sheets database with timestamps and descriptions
- **Weekly AI Summaries**: Analyzes accumulated news every Monday and generates 2–3 sentence insights
- **Historical Archive**: Builds a searchable database of both raw articles and weekly summaries

💼 Perfect For:
Crypto traders tracking market-moving news • SOL investors monitoring ecosystem growth • Blockchain researchers building historical datasets • Content creators sourcing newsletter material • Portfolio managers needing daily briefings • Anyone wanting Solana updates without information overload

🔧 How It Works:
The workflow operates in two distinct modes based on the day of the week.
During the daily collection phase (Tuesday–Sunday), it runs at 8 AM PT, fetches the latest Solana news from CryptoPanic, formats the data to extract titles, descriptions, and timestamps, checks each article against your Google Sheets database to identify duplicates, filters out any articles that already exist or have missing data, and appends only valid new articles to your "Raw Data" sheet.

On Mondays, the workflow performs all daily tasks plus an additional summarization step. After storing new articles, it retrieves all accumulated news from the "Raw Data" sheet, aggregates all article descriptions into a single text block, sends this consolidated information to GPT-4.1-mini with instructions to create a factual, spartan-toned summary highlighting key investor takeaways, and saves the AI-generated summary with a timestamp to the "Weekly Summary" sheet for historical reference.

✨ Key Features:
- **Schedule-based execution**: Runs automatically at 8 AM PT every day without manual intervention
- **Intelligent deduplication**: Title-based matching prevents storing the same article multiple times
- **Data quality control**: Validates required fields before storage to maintain a clean dataset
- **Dual-sheet architecture**: Separate sheets for raw articles and weekly summaries for easy access
- **Cost-effective AI**: Uses GPT-4.1-mini (~$0.001 per summary) for extremely low operating costs
- **Scalable storage**: Google Sheets handles thousands of articles on the free tier
- **Customizable cryptocurrency**: Easily adapt to track Bitcoin, Ethereum, or any supported coin
- **Flexible scheduling**: Modify the trigger time and summary frequency to match your needs

📋 Requirements:
- CryptoPanic account with free API key (register at cryptopanic.com)
- Google Sheets with two sheets: "Raw Data" (columns: date, title, descripton, summary) and "Weekly Summary" (columns: Date, Summary)
- OpenAI API key for GPT-4.1-mini access (~$0.05/month cost)
- n8n Cloud or self-hosted instance with the schedule trigger enabled

⚡ Quick Setup:
1. Register for a free CryptoPanic API key and replace [your token] in the "Get Solana News" HTTP Request node URL.
2. Create a new Google Spreadsheet with two sheets: one named "Raw Data" with columns for date, title, descripton (note the typo in the template), and summary; another named "Weekly Summary" with columns for Date and Summary.
3. Connect your Google Sheets OAuth2 credential to all Google Sheets nodes in the workflow.
4. Add your OpenAI API credential to the "Summarize News" node.
5. Test the workflow manually to ensure it fetches news and stores it correctly.
6. Activate the workflow to enable daily automatic execution.

🚨 Please note that you cannot get news in real time with the FREE CryptoPanic API; you'll get news that is up to date as of yesterday. Consider their pro plan or another platform for real-time news scraping.

🎁 What You Get:
Complete end-to-end automation with concise sticky-note documentation at each workflow stage, pre-configured duplicate detection logic, AI summarization with investor-focused prompts optimized for factual analysis without hype, a dual-sheet Google Sheets structure for raw data and summaries, a flexible schedule trigger you can adjust to any timezone, example data in pinned format showing expected API responses, customization guides for different cryptocurrencies and summary frequencies, and a troubleshooting checklist for common setup issues.

💰 Expected Costs & Performance:
The CryptoPanic API is free with reasonable rate limits for personal use. OpenAI GPT-4.1-mini costs approximately $0.001 per summary, totaling about $0.05 per month for weekly summaries. The workflow typically processes 20–50 articles daily and generates one summary weekly from 140–350 accumulated articles. Daily executions complete in 5–10 seconds, while Monday runs with AI summarization take 15–20 seconds. Google Sheets provides free storage for up to 5 million cells, easily handling years of news data.
🔄 Customization Ideas:
- Track different cryptocurrencies by changing the currencies parameter (btc, eth, ada, doge, etc.).
- Adjust the schedule trigger to run at different times matching your timezone.
- Modify the Monday check condition to generate summaries on different days or multiple times per week.
- Connect Slack, Discord, or Email nodes to receive instant notifications when summaries are generated.
- Edit the AI prompt to change tone, detail level, or focus on specific aspects like price action, development updates, or partnerships.
- Add conditional logic to send alerts only when certain keywords appear in news (like "hack," "partnership," or "upgrade").
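The title-based duplicate detection and data-quality filtering described above can be sketched as a single Code-node function; the field names follow the sheet columns, and `existingTitles` would come from a Google Sheets read of the "Raw Data" sheet:

```javascript
// Sketch of the dedup/validation step: keep only articles whose titles are
// not already in the sheet and whose required fields are present.
function filterNewArticles(incoming, existingTitles) {
  const seen = new Set(existingTitles.map(t => t.trim().toLowerCase()));
  return incoming.filter(a =>
    a.title && a.description &&             // data-quality check
    !seen.has(a.title.trim().toLowerCase()) // title-based duplicate check
  );
}
```

Normalizing titles to lowercase before comparing makes the match tolerant of minor capitalization differences between feed runs.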
by Mira Melhem
👔 Recruitment Office WhatsApp Automation

Automate WhatsApp communication for recruitment agencies with an interactive, structured customer experience. This workflow handles pricing inquiries, request submissions, tracking, complaints, and human escalation while maintaining full session tracking and media support.

Good to know
- Uses WhatsApp Interactive List Messages for user selection and navigation.
- Includes session-state logic and memory across messages.
- Includes a 5-minute cooldown to avoid spam and repeated triggers.
- Supports logging for all interaction types, including media files.
- Includes both a global bot shutdown switch and a per-user override.

How it works
A customer sends a message to the official WhatsApp number. The workflow replies with an interactive menu containing 8 service options:
- 💰 Pricing by nationality (8 supported countries)
- 📝 New recruitment request submission
- 🔍 Tracking existing applications via Google Sheets lookup
- 🔁 Worker transfer link distribution
- 🌍 Translation service information
- 📄 Required documents and instructions
- ⚠️ Complaint submission and routing
- 👤 Request a human agent

The workflow retrieves or stores data based on the selection using Google Sheets and Data Tables. If the customer requests human help or the logic detects uncertainty, the workflow:
- Pauses automation for that user
- Notifies a designated staff member

All interactions are logged, including files, text, timestamps, and selections.
Features
- 📋 Structured WhatsApp service menu
- 📄 CRM-style recruitment request logging
- ✨ Pricing logic with nationality mapping
- 🔍 Lookup-based status tracking
- 📎 Support for media uploads (PDF, images, audio, documents)
- 🧠 Session tracking with persistent user state
- 🤝 Human escalation workflow with internal notifications
- 🛑 Anti-spam and cooldown control
- 🎚 Bot master switch (global + per-user)

Technology stack

| Component | Usage |
|----------|-------|
| n8n | Automation engine |
| WhatsApp Business API | Messaging and interactive UX |
| Google Sheets | CRM and logs |
| Data Tables | State management |
| JavaScript | Custom logic and routing |

Requirements
- WhatsApp Business API account with active credentials
- n8n Cloud or self-hosted instance
- Google Sheets for CRM storage
- Data Tables enabled for persistent session tracking

How to use
- The workflow uses a Webhook trigger compatible with common WhatsApp API providers.
- Modify menu content, pricing, optional steps, and escalation flows as needed.
- Link your Google Sheets and replace test sheet IDs with production values.
- Configure human escalation to notify team members or departments.

Customising this workflow
- Replace Google Sheets with Airtable, HubSpot, or SQL storage.
- Add expiration and reminder messages for missing documents.
- Add AI-powered response logic for common questions.
- Enable multi-country support (Saudi/UAE/Jordan/Qatar/Kuwait/etc.).
- Connect to dashboards for reporting and staff performance analytics.
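As an illustration of the anti-spam control mentioned above, the 5-minute cooldown could be checked against the stored session state roughly like this; the timestamp field name is hypothetical, not the workflow's exact Data Tables schema:

```javascript
// Illustrative anti-spam check: suppress repeated triggers inside the window.
const COOLDOWN_MS = 5 * 60 * 1000; // 5 minutes

// lastMessageAt: millisecond timestamp of the user's previous handled message,
// e.g. read from the Data Tables session record; null/undefined = first contact.
function shouldRespond(lastMessageAt, now = Date.now()) {
  if (lastMessageAt == null) return true;
  return now - lastMessageAt >= COOLDOWN_MS;
}
```

When `shouldRespond` returns false, the workflow would simply end without replying, updating only the log.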
by Atharva
🧾 An intelligent automation system that turns WhatsApp into your personal receipt manager, integrating the Meta WhatsApp Cloud API, Google Drive, Google Sheets, and OpenAI GPT-4o-mini via n8n.

🎥 Demo: Watch the Loom walkthrough

⚙️ What It Does
The AI-Powered WhatsApp Receipt Bot automates the complete invoice handling process through a conversational interface.

Workflow Summary:
1. User sends a receipt image via WhatsApp.
2. The bot automatically downloads the media using the WhatsApp Cloud API.
3. The image is uploaded to a Google Drive "Invoices" folder.
4. The file is shared publicly, generating a shareable URL.
5. The receipt is analyzed using OpenAI GPT-4o-mini to extract structured data: store name, items purchased, payment method, and total amount.
6. The extracted details are appended to a Google Sheet for record-keeping.
7. The bot sends a human-readable summary back to WhatsApp with emojis and the invoice link.

Output Example:
🏬 Store: Big Bazaar
📝 Items: Rice, Detergent, Snacks
💳 Payment: Card
💰 Total: ₹1520.75
🔗 Link: https://drive.google.com/file/d/1abcXYZ/view

This system eliminates manual expense tracking, improves accuracy through OCR, and provides a seamless way to manage receipts in real time.

💡 Use Cases

| Scenario | Description |
| --- | --- |
| Personal Expense Management | Automatically store and categorize receipts from daily purchases. |
| Business Accounting | Collect employee expense receipts through WhatsApp and centralize them in Google Sheets. |
| Freelancer or Consultant Tracking | Keep a digital record of client reimbursements or software purchase receipts. |
| Family Budgeting | Family members send receipts to one shared WhatsApp number; all data gets logged centrally. |
| E-commerce / Delivery Teams | Drivers or delivery agents send invoices from the field to WhatsApp; data automatically goes to the accounting sheet. |
🔧 Setup

1. Accounts and Tools Needed

| Tool | Purpose | Link |
| --- | --- | --- |
| Meta Developer Account | To access the WhatsApp Business Cloud API | https://developers.facebook.com/apps |
| Google Cloud Account | For enabling the Drive and Sheets APIs | https://console.cloud.google.com |
| n8n Instance | Workflow automation engine (local or cloud) | https://app.n8n.cloud |
| OpenAI API Key | For GPT-4o-mini model OCR + reasoning | https://platform.openai.com/account/api-keys |

2. Meta Developer Setup (WhatsApp Cloud API)
- Go to Meta Developer Dashboard → My Apps → Create App → Business type.
- Add the WhatsApp product under your app.
- Retrieve the following from WhatsApp > Configuration: Permanent Access Token, Phone Number ID, WhatsApp Business Account ID.
- Add these credentials in n8n → Credentials → WhatsApp API.
- Use the same credentials for the WhatsApp Trigger and Send Message nodes.
- Verify the webhook in Meta with your n8n webhook URL.

Important: In your HTTP node, set the header as:
Authorization: Bearer <access_token>
Replace <access_token> with your WhatsApp Cloud API permanent token. Without this, the workflow will fail to send or receive WhatsApp messages properly.

3. Google Drive Setup
- Create a folder named Invoices on your Google Drive.
- Copy the Folder ID (found in the Drive URL).
- In Google Cloud Console → APIs & Services → Enable APIs: enable the Google Drive API and the Google Sheets API.
- Go to Credentials → Create Credentials → OAuth 2.0 Client ID.
- Download the credentials.json file.
- Upload this to n8n → Credentials → Google Drive OAuth2 API.
- Authorize the connection on the first workflow run.

4. Google Sheets Setup
- Create a new Google Sheet titled Invoices.
- Add the following headers in Row 1: store name | discription | image_url | payment | total
- Copy the Sheet ID (from the URL).
- Add the ID under the Google Sheets Append node in n8n.
- Map each field to its corresponding value extracted from the OCR result.

5. OpenAI Setup
- Generate an API key from https://platform.openai.com/account/api-keys.
- Add it to n8n → Credentials → OpenAI API.
- Use the model gpt-4o-mini in the "Analyze Image" node.
- You can upgrade to gpt-4o for better OCR accuracy if your account supports it.

6. n8n Workflow Setup
- Import the provided n8n workflow JSON.
- Configure credentials for: WhatsApp API, Google Drive OAuth2, Google Sheets OAuth2, OpenAI API.
- Activate the workflow and set the webhook in the Meta Developer console.
- Send a test receipt image to your WhatsApp Business number.
- The bot will automatically: Download → Upload → Extract → Log → Summarize → Reply.

📊 Example Google Sheet Record

| store name | discription | image_url | payment | total |
| --- | --- | --- | --- | --- |
| Big Bazaar | Rice, Detergent, Snacks | https://drive.google.com/file/d/1abcXYZ/view | Card | 1520.75 |

🧠 Result
A fully automated AI pipeline that transforms WhatsApp into a smart expense-tracking interface, integrating vision, automation, and natural language processing for zero-manual financial documentation.

Support & Contact:
If you face any issues during setup or execution, contact:
📧 Email: atharvapj5@gmail.com
🔗 LinkedIn: Atharva Jaiswal
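For reference, the emoji summary shown in the Output Example could be assembled in a Code node along these lines; the input field names (store, items, payment, total, link) are illustrative, not the workflow's exact mapping:

```javascript
// Sketch: format extracted receipt data into the WhatsApp reply message.
function buildReceiptSummary(r) {
  return [
    `🏬 Store: ${r.store}`,
    `📝 Items: ${r.items.join(', ')}`,
    `💳 Payment: ${r.payment}`,
    `💰 Total: ₹${r.total.toFixed(2)}`,
    `🔗 Link: ${r.link}`,
  ].join('\n');
}

const msg = buildReceiptSummary({
  store: 'Big Bazaar',
  items: ['Rice', 'Detergent', 'Snacks'],
  payment: 'Card',
  total: 1520.75,
  link: 'https://drive.google.com/file/d/1abcXYZ/view',
});
```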
by WeblineIndia
ETL Monitoring & Alert Automation: Jira & Slack Integration

This workflow automatically processes ETL errors, extracts important details, generates a preview, creates a log URL, classifies the issue using AI, and saves the processed data into Google Sheets. If the issue is important or needs attention, it also creates a Jira ticket automatically. The workflow reduces manual debugging effort, improves visibility, and ensures high-severity issues are escalated instantly without human intervention.

Quick Start – Implementation Steps
1. Connect your webhook or ETL platform to trigger the workflow.
2. Add your OpenAI, Google Sheets, and Jira credentials.
3. Enable the workflow.
4. Send a sample error to verify Sheets logging and Jira ticket creation.
5. Deploy and let the workflow monitor ETL pipelines automatically.

What It Does
This workflow handles ETL errors end-to-end by:
- Extracting key information from ETL error logs.
- Creating a short preview for quick understanding.
- Generating a URL to open the full context log.
- Asking AI to identify the root cause and severity.
- Parsing the AI output into clean fields.
- Saving the processed error to Google Sheets.
- Creating a Jira ticket for medium/high-severity issues.

This creates a complete automated system for error tracking, analysis, and escalation.

Who's It For
- DevOps & engineering teams monitoring data pipelines.
- ETL developers who want automated error reporting.
- QA teams verifying daily pipeline jobs.
- Companies using Jira for issue tracking.
- Teams needing visibility into ETL failures without manual log inspection.

Requirements to Use This Workflow
- n8n account or self-hosted instance.
- ETL platform capable of sending error payloads (via webhook).
- OpenAI API key.
- Google Sheets credentials.
- Jira Cloud API credentials.
- Optional: log storage URL (S3, Supabase, server logs).

How It Works & Setup Steps
1. Get ETL Error (Webhook Trigger): Receives the ETL error payload and starts the workflow.
2. Prepare ETL Logs (Code Node): Extracts important fields, makes a clean version of the error, and generates a direct link to open the full ETL log.
3. AI Severity Classification (OpenAI / AI Agent): AI analyzes the issue, identifies the cause, and assigns a severity.
4. Parse AI Output (Code Node): Formats the AI results into clean fields: severity, cause, summary, recommended action.
5. Prepare Data for Logging (Set / Edit Fields): Combines all extracted info into one final structured record.
6. Save ETL Logs (Google Sheets Node): Logs each processed ETL error in a spreadsheet for tracking.
7. Create Jira Ticket (Jira Node): Automatically creates a Jira issue when severity is Medium, High, or Critical.
8. ETL Failure Alert (Slack Node): Sends a Slack message to notify the team about the issue.
9. ETL Failure Notify (Gmail Node): Sends an email with full error details to the team.

How to Customize Nodes
- ETL Log Extractor: Add/remove fields based on your ETL log structure.
- AI Classification: Modify the OpenAI prompt for custom severity levels or deep-dive analysis.
- Google Sheets Logging: Adjust columns for environment, job name, or log ID.
- Jira Fields: Customize issue type, labels, priority, and assignees.
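The "Parse AI Output" step might look roughly like this, assuming the classifier is prompted to return JSON; the severity whitelist and the fallback defaults are illustrative, not the template's exact code:

```javascript
// Hypothetical sketch of the "Parse AI Output" Code node.
// Assumes the classifier is prompted to return JSON like:
//   {"severity": "High", "cause": "...", "summary": "...", "recommendedAction": "..."}
function parseAiOutput(raw) {
  let parsed;
  try {
    // Models sometimes wrap JSON in markdown fences; strip backticks
    // and a leading "json" language tag before parsing.
    const cleaned = raw.replace(/`/g, '').replace(/^\s*json/i, '').trim();
    parsed = JSON.parse(cleaned);
  } catch (e) {
    parsed = {}; // fall back to safe defaults on unparseable output
  }
  const allowed = ['Low', 'Medium', 'High', 'Critical'];
  return {
    severity: allowed.includes(parsed.severity) ? parsed.severity : 'Low',
    cause: parsed.cause || 'Unknown',
    summary: parsed.summary || '',
    recommendedAction: parsed.recommendedAction || 'Investigate manually',
  };
}
```

Defaulting unrecognized severities to "Low" keeps a malformed AI response from accidentally opening Jira tickets.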
Add-Ons (Extend the Workflow)
- Send Slack or Teams alerts for high-severity issues
- Store full logs in cloud storage (S3, Supabase, GCS)
- Add daily/weekly error summary reports
- Connect monitoring tools like Datadog or Grafana
- Trigger automated remediation workflows

Use Case Examples
- Logging all ETL failures to Google Sheets
- Auto-creating Jira tickets with AI-driven severity
- Summarizing large logs with AI for quick analysis
- Centralized monitoring of multiple ETL pipelines
- Reducing manual debugging effort across teams

Troubleshooting Guide

| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| Sheets not updating | Wrong Sheet ID or missing permission | Reconnect and reselect the sheet |
| Jira ticket fails | Missing required fields or invalid project key | Update the Jira field mapping |
| AI output empty | Invalid OpenAI key or exceeded usage | Check the API key or usage limits |
| Severity always “low” | Prompt too broad | Tighten the AI prompt with stronger rules |
| Log preview empty | Incorrect error field mapping | Verify the structure of the ETL error JSON |

Need Help?
For assistance setting up this workflow, customizing nodes, or adding additional features, feel free to contact our n8n developers at WeblineIndia. We can help configure, scale, or build similar automation workflows tailored to your ETL and business requirements.
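The "Parse AI Output" step and the Jira gate described above can be sketched together. This is an illustrative sketch under the assumption that the model is prompted to return JSON; the field names are hypothetical, not the template's exact schema.

```javascript
// Hypothetical sketch of "Parse AI Output" plus the Jira-creation gate.
// Assumes the model returns JSON; field names are illustrative.
function parseAiOutput(raw) {
  let parsed;
  try {
    parsed = JSON.parse(raw);
  } catch (err) {
    // Fall back to safe defaults when the model returns malformed JSON.
    parsed = {};
  }
  return {
    severity: parsed.severity || 'Low',
    cause: parsed.cause || 'Unknown',
    summary: parsed.summary || '',
    recommendedAction: parsed.recommended_action || '',
  };
}

// Per the description, Jira tickets are only created for
// Medium, High, or Critical severities.
function needsJiraTicket(record) {
  return ['Medium', 'High', 'Critical'].includes(record.severity);
}
```

Falling back to `'Low'` on parse failure mirrors the troubleshooting row above: if severity is always "low", check that the model is actually emitting valid JSON.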
by Abdul Mir
Overview
Use your voice or text to command a Telegram-based AI agent that scrapes leads or generates detailed research reports instantly. This workflow turns your Telegram bot into a full-blown outbound machine. Just tell it what type of leads you need, and it will use Apollo to find them and save them to a spreadsheet. Or drop in a LinkedIn profile, and it will generate a personalized research dossier with details like job title, company summary, and industry insights. It handles voice messages too: just speak your request and get the results sent back like magic.

Who’s it for
- Cold emailers and growth marketers
- Solo founders running outbound
- SDRs doing daily prospecting
- Agencies building high-quality lead lists or custom research for clients

How it works
1. Triggered by a message (text or voice) in Telegram.
2. If it’s a voice message, it is transcribed using OpenAI Whisper.
3. An AI agent interprets the intent: scrape leads or research a person.
4. For lead scraping:
   - Gathers criteria (e.g., location, job title) via Telegram
   - Calls the Apollo API to return fresh leads
   - Saves the leads to Google Sheets
5. For research reports:
   - Takes a LinkedIn profile link
   - Uses AI and lead-data tools to create a one-page professional research report
   - Sends it back to the user via email

Example outputs
- **Lead scraping:** Populates a spreadsheet with names, roles, LinkedIn links, company info, emails, and more
- **Research report:** A formatted PDF-style brief with a summary of the person, the company, and key facts

How to set up
1. Connect your Telegram bot to n8n.
2. Add your OpenAI credentials (for Whisper + the chat agent).
3. Plug in your Apollo API key or scraping tool.
4. Replace the example spreadsheet with your own.
5. Customize the prompts for tone or data depth.
6. (Optional) Add PDF generation or CRM sync.

Requirements
- Telegram Bot Token
- OpenAI API Key
- Apollo (or other scraping API) credentials
- LinkedIn URLs for the research functionality

How to customize
- Replace Apollo with Clay, People Data Labs, or another scraping tool
- Add a CRM push step (e.g. Airtable, HubSpot, Notion)
- Add scheduling to auto-scrape daily
- Reformat the research report as a downloadable PDF
- Change the agent’s tone or role (e.g. “Outreach Assistant,” “Investor Scout,” etc.)
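The intent-interpretation step above is handled by an LLM agent in the template, but a deterministic fallback illustrates the routing decision. This is a hypothetical sketch: the LinkedIn-URL check and keyword list are assumptions, not the template's actual agent logic.

```javascript
// Hypothetical fallback router: decide whether a Telegram message is a
// lead-scraping request or a research-report request. The real template
// delegates this decision to an LLM agent; rules here are illustrative.
function routeIntent(messageText) {
  const text = messageText.toLowerCase();
  // A LinkedIn profile link strongly signals a research-report request.
  if (/linkedin\.com\/in\//.test(text)) {
    const match = messageText.match(/https?:\/\/\S+/);
    return { intent: 'research', profileUrl: match ? match[0] : null };
  }
  // Otherwise, look for lead-scraping vocabulary.
  if (/\b(leads?|prospects?|scrape|find)\b/.test(text)) {
    return { intent: 'scrape_leads' };
  }
  return { intent: 'unknown' };
}
```

A rule-based router like this can also serve as a cheap guard in front of the agent, short-circuiting obvious cases before spending tokens.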
by Pratyush Kumar Jha
Trend2Content

This n8n workflow (named Trend2Content) takes a short topic input from a small web form, scrapes recent/top social content for that topic (via an Apify act), aggregates the raw text, passes the aggregated content to a LangChain AI agent (Google Gemini in this flow) that returns a structured content output (topic summary, blog post title ideas, tweet hooks), formats that output, and appends the results to a Google Sheet. It’s a lightweight Topic → Trending Content → AI Ideas → Sheet pipeline for fast content ideation.

How It Works (Step-by-Step)
1. **On Form Submission:** The user fills a single field, Topic (webhook/form trigger).
2. **X Scraper (HTTP Request):** Calls an Apify act `run-sync-get-dataset-items` with `searchTerms: [{{ $json.Topic }}]` and `maxItems: 20` to fetch social posts for that topic.
3. **Edit Fields (Set):** Extracts `fullText` from each scraped item and stores it in a `Content` field.
4. **Aggregate:** Aggregates the `Content` field so the AI agent receives one combined input rather than many separate items.
5. **Google Gemini Chat Model (LM) + AI Agent (LangChain Agent Node):** The agent uses a templated system prompt plus the aggregated content to generate a structured response with a topic summary, blog title ideas, and tweet hooks. The agent is connected to a Structured Output Parser node to force a predictable JSON schema.
6. **Code in JavaScript:** Transforms the structured JSON into sheet-friendly strings (joins arrays with bullets).
7. **Append Row in Sheet (Google Sheets):** Appends the generated blog_post_titles and tweet_hooks to the target Google Sheet.
8. (Optional) Sticky notes and internal meta nodes exist for documentation and board organization.

Quick Setup Guide
👉 Demo & Setup Video
👉 Sheet Template
👉 Course

Nodes of Interest You Can Edit
1. **On Form Submission (formTrigger)**
   - Edit form fields (add author, language, region, or filters).
   - Change webhook behaviour or require authentication.
2. **X Scraper (HTTP Request)**
   - **URL:** Change to another Apify act or another scraping API.
   - **jsonBody:** Change `maxItems`, sort (Top/Recent), or modify `searchTerms` (e.g., topic + hashtag).
   - **Headers:** Set the `Authorization: Bearer` token (Apify).
   - Add pagination or query parameters if switching scraper APIs.
3. **Edit Fields (Set)**
   - Map additional fields (author, date, source URL).
   - Add filtering logic (remove short posts, retweets, duplicates).
4. **Aggregate**
   - Customize the aggregation strategy: concatenate, sample the top N, or deduplicate before combining.
5. **Google Gemini Chat Model / AI Agent / Structured Output Parser**
   - Edit the systemMessage and prompt template (tone, format, extra outputs).
   - Tune LM parameters (temperature, max tokens).
   - Update the schema to request sentiment, key quotes, or additional formats.
6. **Code in JavaScript**
   - Modify the formatting (CSV-ready, add a timestamp).
   - Add metadata columns.
   - Add deduplication or length checks before the write.
7. **Append Row in Sheet (Google Sheets)**
   - Change the spreadsheet ID or sheet name.
   - Add more columns.
   - Switch from Append to Upsert.
   - Configure batch appends.

What You’ll Need (Credentials)
1. **Apify API Token:** Used in the HTTP Request node. Set in the header: `Authorization: Bearer YOUR_APIFY_TOKEN`
2. **Google Sheets OAuth2 Credentials:** Must include the spreadsheets scope. Required for appending rows.
3. **Google / PaLM / Google Gemini API Credentials:** Used by the LangChain / Google Gemini node.

Optional: an n8n webhook URL (for mounting the form) and monitoring credentials (Slack webhook, Sentry, etc.) for alerts.

Recommended Settings & Best Practices
- Enable the workflow only after testing (`active: true`).
- Limit `maxItems` (20–50 recommended).
- Sanitize and dedupe content before sending it to the AI.
- Always use a Structured Output Parser for reliable JSON.
- Set a low temperature (0.0–0.6) for consistent results.
- Add retries and exponential backoff for external APIs.
- Add logging or Slack alerts for failures.
- Keep execution-log columns in the sheet (status, error_message, run_time).
- Store the workflow JSON in version control.
- Monitor API rate limits (Apify + Google).
- Avoid writing scraped PII into public sheets.

Customization Ideas
- Add output types: Instagram captions, LinkedIn posts, video scripts, email subject lines.
- Add sentiment / trend scoring.
- Add language detection and translation.
- Store aggregated content in a vector database (Pinecone / Chroma).
- Schedule runs using a Cron trigger.
- Add multiple data sources (Reddit, RSS, Hacker News).
- Add an approval workflow (Slack / Notion).
- Add metadata columns: source_urls, top_authors, most_shared.

Tags
#content-ideation #social #ai #google-gemini #apify #google-sheets #n8n
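The "Code in JavaScript" node described above (joining arrays with bullets before the sheet write) can be sketched as follows. The field names match the walkthrough's description, but treat the exact output shape as an assumption and verify it against your Structured Output Parser schema.

```javascript
// Sketch of the sheet-formatting step: flatten the structured AI output
// into bullet-joined strings for the Append Row node. Field names
// (topic_summary, blog_post_titles, tweet_hooks) follow the description
// above; confirm them against your parser's schema.
function toSheetRow(output) {
  const bullets = (items) => (items || []).map((t) => `• ${t}`).join('\n');
  return {
    topic_summary: output.topic_summary || '',
    blog_post_titles: bullets(output.blog_post_titles),
    tweet_hooks: bullets(output.tweet_hooks),
  };
}
```

Joining with `\n` keeps each idea on its own line inside a single sheet cell, which reads better than a comma-separated blob.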
by Dinakar Selvakumar
How it works
This workflow automatically publishes Instagram and Facebook posts using Google Sheets as a content calendar. Users add post details to a sheet, and the workflow handles scheduling, image processing, posting, and status updates without manual intervention.

Step-by-step
1. **Scheduled Trigger:** The workflow runs automatically at a fixed interval (for example, every 15 minutes) to check for posts that are ready to be published.
2. **Configuration & Credentials:** A configuration step stores reusable values such as the spreadsheet ID, sheet name, and platform settings, keeping the workflow easy to customize and secure.
3. **Data Retrieval & Filtering:** Posts are read from Google Sheets and filtered to include only rows marked as “Pending” and scheduled for the current time or earlier.
4. **Image Handling:** If an image link is provided, the workflow downloads the image from Google Drive. If no image is present, the post continues as text-only.
5. **Platform Routing:** Based on the selected platform (Instagram, Facebook, or both), the workflow routes the post to the appropriate publishing path.
6. **Social Media Publishing:** The post is published to Instagram and/or Facebook using the connected business-account credentials.
7. **Status Update:** After publishing, the workflow updates the original Google Sheet with the post status (Success or Failed), the published timestamp, and an error message if applicable.
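The filtering step above can be sketched as a small helper. This is a minimal sketch: the column names (`status`, `scheduled_at`) are assumptions about your sheet layout, not the template's fixed schema.

```javascript
// Sketch of the due-post filter: keep only rows marked "Pending" whose
// scheduled time is now or earlier. Column names are assumptions.
function duePosts(rows, now = new Date()) {
  return rows.filter(
    (row) => row.status === 'Pending' && new Date(row.scheduled_at) <= now
  );
}
```

Store `scheduled_at` in an unambiguous format such as ISO 8601 so `new Date(...)` parses it consistently across timezones.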
by Dinakar Selvakumar
Description
This n8n workflow automatically publishes posts to Instagram Business accounts and Facebook Pages using Google Sheets as your content calendar. You schedule posts in the sheet, and n8n processes them, uploads media (if any), posts via Meta’s Graph API, and updates the sheet with success/failure.

How it Works
Google Sheets rows marked “Pending” and due for publishing are picked up by a scheduled trigger. Posts are then routed to the proper social platforms and published via Meta’s Graph API. Finally, n8n writes the publish status and timestamp back to the sheet.

🧠 Step-by-Step

1️⃣ **Scheduled Trigger**
The workflow automatically checks Google Sheets at fixed intervals (e.g., every 15 min) for posts that are ready (status = Pending, publish time reached).

2️⃣ **Config & Credentials**
Store reusable settings, including:
- Google Sheets ID & sheet name
- Meta App credentials (App ID, App Secret)
- Access token for the Meta Graph API
- Instagram Business Account ID
- Facebook Page ID

These configs make the workflow modular and secure.

3️⃣ **Setup: Create a Meta (Facebook) App**
To post via the Graph API, you must first set up a Meta developer app.
🔗 Create App (Meta for Developers): https://developers.facebook.com/docs/development/create-an-app/
Steps:
1. Log in at Meta for Developers.
2. Click Create App and choose Business as the app type.
3. Add a name and contact email.
4. In the app dashboard, go to Add Product and choose Instagram Graph API and Facebook Login.

📌 After creation, your app will have an App ID and App Secret in Settings → Basic. Save both for n8n.

4️⃣ **Link Accounts & Get IDs**
Before publishing you need:
- An Instagram Business account (not a personal account)
- A Facebook Page linked to that Instagram account

Link them in Facebook Page settings → Linked Accounts. Then generate an access token with the required permissions (instagram_basic, pages_show_list, etc.) using the Graph API Explorer and your new Meta app. From the token response, or via Graph API calls, extract:
- Facebook Page ID
- Instagram Business Account ID

These go into your n8n nodes for publishing.

5️⃣ **Fetch & Filter Posts**
Read rows from Google Sheets and filter those ready to publish (status = Pending, scheduled time reached).

6️⃣ **Image Handling**
If an image link is present, download or retrieve it (from Google Drive or an external URL). If not, continue with a text-only post.

7️⃣ **Platform Routing**
Route the post to the Instagram publishing branch, the Facebook publishing branch, or both, depending on the sheet’s platform column.

8️⃣ **Posting via the Meta Graph API**
- Instagram: Use Graph API endpoints to create and publish media containers, then make the publish call.
- Facebook: Use the Graph API to post to the Facebook Page feed (via the /feed or /photos endpoint).

9️⃣ **Update Sheet Status**
After each attempt, update Google Sheets with:
- Status (Success/Fail)
- Published timestamp
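Instagram publishing in step 8 is a two-call sequence against the Graph API: create a media container, then publish it with the returned creation ID. The sketch below only builds the request payloads for those two calls so they can be wired into n8n HTTP Request nodes; the API version and helper names are illustrative, and you must supply your real IDs and access token.

```javascript
// Sketch of the two-step Instagram publish flow described above:
// 1) POST /{ig-user-id}/media creates a media container,
// 2) POST /{ig-user-id}/media_publish publishes it via creation_id.
// The Graph API version below is illustrative; use your app's version.
const GRAPH = 'https://graph.facebook.com/v19.0';

function buildContainerRequest(igUserId, imageUrl, caption, accessToken) {
  return {
    url: `${GRAPH}/${igUserId}/media`,
    params: { image_url: imageUrl, caption: caption, access_token: accessToken },
  };
}

function buildPublishRequest(igUserId, creationId, accessToken) {
  return {
    url: `${GRAPH}/${igUserId}/media_publish`,
    params: { creation_id: creationId, access_token: accessToken },
  };
}
```

In the workflow, the first HTTP Request node would POST the container request, and a second node would POST the publish request using the `id` returned by the first call.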