by Intuz
This n8n template from Intuz provides a complete, automated solution for transforming a static product image and a creative idea into a dynamic, AI-generated video ad. Using Google's state-of-the-art Veo 3 model, this workflow manages the entire creative process from concept to a final, downloadable video file.

**Who's this workflow for?**
- E-commerce brands & marketers
- Advertising agencies
- Social media content creators
- Product managers

**How it works**
1. Submit a Creative Brief: The workflow starts when a user submits a creative idea via a simple web form (e.g., "A Pepsi can exploding into a vibrant disco party").
2. Upload a Product Image: The user is then prompted to upload a corresponding image (e.g., a high-quality photo of the Pepsi can).
3. Log the Project in Airtable: The idea and the uploaded image are saved to an Airtable base, which acts as the central tracking system for all video generation projects.
4. AI Creative Analysis: Google Gemini analyzes both the user's text prompt and the uploaded image. Acting as an "AI Creative Director," it generates a detailed video brief that reinterprets the static image according to the user's creative vision.
5. Generate Video with Veo 3: The detailed creative brief is sent to Google's Veo 3 video generation model, and the workflow initiates a long-running task to create the video.
6. Retrieve the Final Video: After a brief waiting period, the workflow polls the Veo 3 API to retrieve the finished video, converts it into a binary file, and makes it available for download directly from the n8n execution log.

**Key Requirements to Use This Template**
- n8n Instance & Required Nodes: An active n8n account (Cloud or self-hosted). This workflow uses the official n8n LangChain integration (@n8n/n8n-nodes-langchain). If you are using a self-hosted version of n8n, ensure this package is installed.
- Google Cloud Account: A Google Cloud project with the Vertex AI API enabled. You must have access to both the Gemini and Veo 3 models within your project, and you will need a Gemini API key and a Google OAuth2 credential configured for the Vertex AI scope.
- Airtable Account: An Airtable base with a table set up to track the video projects. It should have columns for Image Prompt, Image (Attachment), Video (Attachment/URL), and Status.

**Setup Instructions**
1. Airtable Configuration (Crucial): In the Create a record, Get a record, and Update record nodes, connect your Airtable credentials and update the Base ID and Table ID to match your setup. In the Uploading Image in Airtable (HTTP Request) node, edit the URL and the "Authorization" header to include your Base ID, Table ID, and Personal Access Token.
2. Google AI Configuration (Gemini & Veo): In the Analyze image (Google Gemini) node, select your Gemini API credentials. In both the Generate Video Veo 3 and Get the the Video (HTTP Request) nodes, replace [Project ID] and [Location] in the URLs with your own Google Cloud project ID and region (e.g., us-central1), and select your Google OAuth2 credentials for authentication.
3. Customize Video Parameters (Optional): In the Parse Request (Code) node, you can modify the JavaScript code to change video generation settings such as aspectRatio, durationSeconds, and resolution; a hedged sketch follows below.
4. Execute the Workflow: Activate the workflow and open the Form URL from the Prompt your Idea node to start the process.
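For illustration, here is a minimal sketch of what the Parse Request Code node might look like. Only the aspectRatio, durationSeconds, and resolution setting names come from the template itself; the request-body shape and the creativeBrief field are assumptions.

```javascript
// n8n Code node (JavaScript) - hypothetical sketch of "Parse Request".
// Adjust these values to change how Veo 3 renders the video.
const settings = {
  aspectRatio: "16:9",   // e.g. "9:16" for vertical social ads
  durationSeconds: 8,    // clip length in seconds
  resolution: "1080p",   // output resolution
};

// Assumed shape: merge the creative brief from the previous node with
// the generation settings into one request payload per item.
return items.map((item) => ({
  json: {
    prompt: item.json.creativeBrief, // field name is an assumption
    parameters: settings,
  },
}));
```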
Sample Videos

**Connect with us**
Website: https://www.intuz.com/services
Email: getstarted@intuz.com
LinkedIn: https://www.linkedin.com/company/intuz
Get Started: https://n8n.partnerlinks.io/intuz
For custom workflow automation, click here: Get Started
by n8n Team
This workflow template shows how to load JSON data into a workflow and either push that data into an app or convert it into a spreadsheet file. Specifically, it makes a generic API request that returns JSON, then loads the result into a Google Sheets spreadsheet or converts it to .CSV format. You can use the same general pattern to load data into any app or convert to any spreadsheet file format (such as .xlsx).
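A minimal sketch of the pattern, assuming a generic API whose response nests records under a results field (that field name and the record keys are placeholders): a Code node flattens the JSON into one item per record so a downstream Google Sheets or file-conversion node can write rows.

```javascript
// n8n Code node (JavaScript) - illustrative sketch only.
// Flatten a generic JSON API response into one n8n item per record.
const response = items[0].json;
const records = response.results ?? []; // "results" is an assumption

return records.map((record) => ({
  json: {
    id: record.id,
    name: record.name,
    createdAt: record.created_at,
  },
}));
```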
by Baptiste Fort
Telegram Voice Message → Automatic Email

Imagine: what if you could turn a simple Telegram voice message into a professional email, without typing, copying, pasting, or even opening Gmail? This workflow does it all for you: just record a voice note, and it will transcribe, format, and write a clean HTML email, then send it to the right person, all by itself.

**Prerequisites**
- **Create a Telegram bot** (via BotFather) and get the token.
- **Have an OpenAI account** (API key for Whisper and GPT-4).
- **Set up a Gmail account with OAuth2.**
- **Import the JSON template** into your automation platform.

🧩 Detailed Flow Architecture

1. Telegram Trigger
Node: Telegram Trigger
This node listens to all Message events received by the specified bot (e.g., "BOT OFFICIEL BAPTISTE"). Whenever a user sends a voice message, the trigger fires automatically.
> ⚠️ Only one Telegram trigger per bot is possible (API limitation).
Key parameter: Trigger On: Message

2. Wait
Node: Wait
Used to buffer or smooth out calls to avoid collisions if you receive several voice messages in a row.

3. Retrieve the Audio File
Node: Get a file
- **Type:** Telegram (resource: file)
- **Parameter:** fileId = {{ $json["message"]["file_id"] }}
This node fetches the voice file received in step 1 from Telegram.

4. Automatic Transcription (Whisper)
Node: Transcribe a recording
- **Resource:** audio
- **Operation:** transcribe
- **API Key:** your OpenAI account
The audio file is sent to OpenAI Whisper: the output is clean, accurate text ready to be processed.

5. Optional Wait (Wait1)
Node: Wait1
Same purpose as step 2: useful if you want to buffer or add a delay to absorb processing time.

6. Structured Email Generation (GPT-4 + Output Parser)
Node: AI Agent
This is the core of the flow: the transcribed text is sent to GPT-4 (or GPT-4.1-mini here, via the OpenAI Chat Model).

**Prompt used:**
You are an assistant specialized in writing professional emails. Based on the text below, you must: {{ $json.text }}
- Detect if there is a recipient's email address in the text (or something similar like "send to fort.baptiste.pro"). If it's not a complete address, complete it by assuming it ends with @gmail.com.
- Understand the user's intent (resignation, refusal, application, excuse, request, etc.)
- Generate a relevant and concise email subject, faithful to the content
- Write a professional message, structured in HTML: with a polite tone adapted to the situation, formatted with HTML tags, and with no spelling mistakes in French
- My first name is Jeremy, and if the text says he is not happy, specify the wish to resign

⚠️ You must always return your answer in the following strict JSON format, with no extra text:
{ "email": "adresse@gmail.com", "subject": "Objet de l'email", "body": "Contenu HTML de l'email" }

Everything is strictly validated and formatted with the Structured Output Parser node.

7. Automatic Email Sending (Gmail)
Node: Send a message
- **To:** {{ $json.output.email }}
- **Subject:** {{ $json.output.subject }}
- **HTML Body:** {{ $json.output.body }}
This node takes the JSON structure returned by the AI and sends the email via Gmail, to the right recipient, with the correct subject and full HTML formatting.
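For reference, a minimal sketch of the schema the Structured Output Parser in step 6 could enforce. The three field names come straight from the strict JSON format in the prompt; the schema syntax itself is a standard JSON-Schema-style definition, not copied from the template.

```javascript
// Hypothetical schema for the Structured Output Parser node, matching
// the strict JSON format the prompt demands.
const emailSchema = {
  type: "object",
  properties: {
    email:   { type: "string", description: "Recipient address, e.g. adresse@gmail.com" },
    subject: { type: "string", description: "Concise email subject" },
    body:    { type: "string", description: "HTML body of the email" },
  },
  required: ["email", "subject", "body"],
};
```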
by jason
Allow your friends to get updates when you do not meet your workout goals so that they can help you get to where you need to be! Monitors your Strava account and sends an email to three accountability partners each day if you have not had enough activity so that they can reach out to you with some encouragement. Make sure that you configure the Accountability Settings to meet your needs along with the Strava and Send Email credentials.
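To make the "not enough activity" check concrete, here is a hedged sketch of the decision logic, assuming a daily goal measured in minutes and Strava's moving_time field (reported in seconds); the goal value and output field names are placeholders for your own Accountability Settings.

```javascript
// n8n Code node (JavaScript) - hypothetical sketch of the daily check.
const goalMinutes = 30; // placeholder for your Accountability Settings value

// Sum today's activity time across all Strava activities returned upstream.
const totalMinutes = items.reduce(
  (sum, item) => sum + (item.json.moving_time ?? 0) / 60, // moving_time is in seconds
  0
);

// Emit a single flag a downstream IF node can branch on before emailing
// the three accountability partners.
return [{ json: { goalMet: totalMinutes >= goalMinutes, totalMinutes } }];
```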
by jason
Using Typeform to push task requests to an n8n webhook that then categorizes the request and assigns it in ClickUp accordingly. To get this workflow working for yourself, you will need:
- A ClickUp account
- A Typeform account
- Credentials for these services
- ClickUp configured with appropriate lists
- Typeform set up with options that correspond with the ClickUp lists

You may modify this workflow to meet your specific needs and configuration. This is a very simple version of the workflow, and you can make it as complicated as you wish to meet your requirements.
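The categorize-and-assign step could look like the sketch below: a lookup from the Typeform answer to a ClickUp list ID. All category labels, list IDs, and the category field name are hypothetical; substitute your own.

```javascript
// Hypothetical n8n Code node: map a Typeform answer to a ClickUp list.
const listByCategory = {
  "Bug report": "901234567",      // placeholder list IDs
  "Feature request": "901234568",
  "General task": "901234569",
};

const category = items[0].json.category; // field name is an assumption
return [{
  json: { ...items[0].json, clickupListId: listByCategory[category] },
}];
```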
by n8n Team
This workflow compares two datasets from a single database. Two SQL nodes create slightly different summary reports based on the payments table. Both reports have the same structure but cover different time periods. In addition, the output from the "Orders from 2004 and 2005" node applies an extra manipulation to the ordercount variable. This makes the Compare Datasets node produce four outputs: records in the A Only branch, the B Only branch, the Same branch, and the Different branch. Please refer to the MySQL Tutorial website and download the example database: https://www.mysqltutorial.org/mysql-sample-database.aspx
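To make the four outputs concrete, here is a hypothetical sketch of the kind of per-item tweak that sends otherwise-matching records into the Different branch; the template does not specify the actual manipulation beyond saying it touches ordercount, so the +1 below is purely illustrative.

```javascript
// Hypothetical Code node applied to the "Orders from 2004 and 2005" output.
// Any change to a compared field makes the Compare Datasets node route
// the matching record pair to "Different" instead of "Same".
return items.map((item) => ({
  json: {
    ...item.json,
    ordercount: item.json.ordercount + 1, // illustrative manipulation only
  },
}));
```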
by John Pranay Kumar Reddy
🧾 Summary
This workflow monitors Kubernetes pod CPU usage using Prometheus and sends real-time Slack alerts when CPU consumption crosses a threshold (e.g., 0.8 cores). It groups pods by application name to reduce noise and improve clarity, making it ideal for observability across multi-pod deployments such as Argo CD, Loki, Promtail, and your own applications.

👥 Who's it for
Designed for DevOps, SRE, and platform teams, this workflow is 100% no-code and plug-and-play, and it can easily be extended to cover memory, disk, or network spikes. It eliminates the need for Alertmanager by routing critical alerts directly into Slack using native n8n nodes.

⚙️ What it does
This n8n workflow polls Prometheus every 5 minutes ⏱️, checks whether any pod's CPU usage crosses a defined threshold (e.g., 0.8 cores) 🚨, groups the offending pods by app 🧩, and sends structured alerts to a Slack channel 💬.

🛠️ How to set up
🔗 Set your Prometheus URL with the required metrics (container_cpu_usage_seconds_total, kube_pod_container_resource_limits)
🔐 Add your Slack bot token with the chat:write scope
🧩 Import the workflow and customize:
- Threshold (e.g., 0.8 cores)
- Slack channel
- Cron schedule

📋 Requirements
- A working Prometheus stack with kube-state-metrics
- Slack bot credentials
- An n8n instance (self-hosted or cloud)

🧑‍💻 How to customize
🧠 Adjust threshold values or the query interval
📈 Add memory/disk/network usage metrics (a hedged query sketch follows below)

💡 This is a plug-and-play Kubernetes alerting template for real-time observability.

🏷️ Tags: Prometheus, Slack, Kubernetes, Alert, n8n, DevOps, Observability, CPU Spike, Monitoring

Prometheus Spike Alert to Slack
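As an illustration, the threshold check could be driven by a PromQL query like the one below, sent to Prometheus's standard /api/v1/query endpoint. The template doesn't show its exact query, so treat this as a sketch built from the metric it names; the Prometheus URL is a placeholder, and this.helpers.httpRequest is the HTTP helper available in n8n Code nodes.

```javascript
// Hypothetical sketch: per-pod CPU usage (cores) over 5 minutes,
// keeping only pods above the 0.8-core threshold.
const promQL =
  'sum by (namespace, pod) (rate(container_cpu_usage_seconds_total[5m])) > 0.8';

const url =
  'http://prometheus.monitoring.svc:9090/api/v1/query?query=' + // placeholder URL
  encodeURIComponent(promQL);

const res = await this.helpers.httpRequest({ url, json: true });

// Prometheus returns matches under data.result; each has metric labels
// and a [timestamp, value] pair.
return res.data.result.map((r) => ({
  json: {
    namespace: r.metric.namespace,
    pod: r.metric.pod,
    cpuCores: Number(r.value[1]),
  },
}));
```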
by Mohamed Abubakkar
How it Works
The workflow runs automatically every day and collects analytics data for both today and yesterday. It cleans and standardizes both datasets in the same way so they are easy to compare. It then measures how performance has changed from one day to the next and interprets those changes to understand trends and context. Once all calculations are finished, the AI creates a clear, easy-to-read summary of what happened. This summary is formatted and sent through the required communication channels, while the final data is saved for tracking over time and for creating follow-up tasks if needed.

Key Features:
- Trigger runs once per day (11:57 PM recommended).
- Fetches separate data for today and yesterday in the Google Analytics node.
- Compares the two days' data and highlights when traffic is down.
- Adds a lowTraffic trend flag for low-traffic identification.
- Uses GPT-4 Mini to produce a human-readable summary suited to different communication channels.
- Sends the report via WhatsApp / Email.
- Stores the final structured data in Google Sheets for future analytics and historical records.

Setup Steps
1. Connect Required Credentials
You must connect the following credentials:
- Google Analytics API
- Google Sheets
- OpenAI API
- Email SMTP
- WhatsApp API
- ClickUp API

2. Replace Default Values
Update the workflow with:
- Your Google Analytics IDs
- Your Google Sheet tabs
- Your SMTP credentials, sender, and recipients
- Your OpenAI API key
- Your WhatsApp API credentials
- Your ClickUp API keys

3. Customize Email Template
Modify the subject, message body, or formatting style based on your reporting standards.

4. Adjust Trigger
You may choose:
- Manual Trigger
- Cron Trigger for daily/weekly reports
- Webhook Trigger integrated with your system

Detailed Process Flow

Schedule Trigger
Node Type: Trigger Node
Purpose: Automates the start of the workflow.
Details: Runs every day (or every hour if real-time monitoring is needed). Eliminates manual data collection and ensures consistent reporting.

Analytics Reports
Node Type: Google Analytics Node
Purpose: Fetches website performance data.
Metrics include: users, sessions, page views.

Combine the Data
Node Type: Merge Node (Append)
Purpose: Combines today's and yesterday's datasets into a single item for comparison.
Details: Prepares the data for calculating percentage changes and maintains the proper structure for further nodes.

Calculate Percent Changes
Node Type: Function Node
Purpose: Computes day-over-day percentage changes for users, sessions, and page views (see the sketch at the end of this template).
Details: Formula: ((today - yesterday) / yesterday) * 100. Handles increases and decreases correctly. Outputs the values used for trend indicators and alerts.

Generate AI Summary
Node Type: OpenAI Node
Purpose: Produces human-readable, professional insights about the daily analytics.
Details: Summarizes key changes, trends, and recommendations, and provides context such as low-traffic warnings. The text output is used in emails, WhatsApp, and ClickUp tasks.

Send Email or WhatsApp to a Dedicated Person / Marketing Team
Node Type: Email Node / WhatsApp Node
Purpose: Sends daily alert or report messages via email or WhatsApp.
Details: Includes formatted metrics and the AI summary. The subject and body clearly indicate trends and recommendations.

Workflow Benefits
- Fully automated daily GA reporting
- AI-generated summaries for clear insights
- Alerts only triggered when necessary
- Historical logging for trends and dashboards
- Actionable tasks automatically created in ClickUp
- Multi-channel delivery via Email and WhatsApp
- Handles low-traffic scenarios gracefully
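As referenced above, here is a sketch of the Calculate Percent Changes Function node, using the formula stated in the process flow: ((today - yesterday) / yesterday) * 100. The input shape (a merged item holding today and yesterday objects) and the metric field names are assumptions.

```javascript
// Hypothetical sketch of the "Calculate Percent Changes" node.
const { today, yesterday } = items[0].json;

// Guard against division by zero when yesterday had no traffic.
const pct = (t, y) => (y === 0 ? null : ((t - y) / y) * 100);

const changes = {
  users: pct(today.users, yesterday.users),
  sessions: pct(today.sessions, yesterday.sessions),
  pageViews: pct(today.pageViews, yesterday.pageViews),
};

// Flag the lowTraffic trend mentioned in the feature list when any
// metric declined day-over-day.
const lowTraffic = Object.values(changes).some((v) => v !== null && v < 0);

return [{ json: { today, yesterday, changes, lowTraffic } }];
```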
by Oneclick AI Squad
A secure, scalable enterprise AI orchestration layer built on the Model Context Protocol (MCP). This workflow standardizes tool access across all business systems, enforces permission-based data handling, applies contextual reasoning via Claude AI, and provides a single governance plane for multi-agent AI deployments.

How it works
1. Receive AI Agent Request - A unified MCP webhook accepts tool or context requests from any agent
2. Enterprise Auth & RBAC - Validates the JWT, resolves role-based access controls, and enforces tenant isolation
3. Context Assembly - Builds full enterprise context: user profile, org policies, active sessions, prior tool calls
4. Claude AI Orchestration - Reasons over the context to select the optimal tool chain, validate intent, and plan execution
5. Policy Enforcement Engine - Applies data classification, DLP rules, and geo/time-based restrictions
6. Multi-System Tool Dispatch - Routes to CRM, ERP, HRMS, Data Warehouse, or custom APIs in parallel
7. Response Aggregation - Merges multi-tool results, applies post-processing and redaction rules
8. Compliance Logging - SOC2/ISO27001-ready audit trail with data lineage tracking
9. Return Enriched Context - Delivers an MCP-compliant response with a reasoning trace back to the agent

Setup Steps
1. Import the workflow into n8n
2. Configure credentials:
- Anthropic API - Claude AI for orchestration and contextual reasoning
- Google Sheets - RBAC policy store, session registry, audit log
- SMTP / Slack - Security and compliance notifications
- JWT Secret - For enterprise token validation
3. Populate the RBAC policy sheet with roles, permissions, and data classifications
4. Configure your enterprise system endpoints in the tool dispatch nodes
5. Set your tenant IDs and org-level data policies
6. Activate the workflow and register the webhook URL with your AI agent platform

Sample Enterprise MCP Request
{
  "mcpVersion": "1.1",
  "agentId": "sales-agent-prod-007",
  "jwtToken": "eyJhbGciOiJIUzI1NiJ9...",
  "tenantId": "ORG-ACME-001",
  "userId": "john.doe@acme.com",
  "userRole": "sales_manager",
  "toolRequests": [
    { "toolName": "crm.get_pipeline", "parameters": { "region": "APAC" } },
    { "toolName": "erp.get_inventory", "parameters": { "sku": "PROD-001" } }
  ],
  "agentGoal": "Prepare a quarterly sales brief for the APAC team meeting",
  "dataClassification": "INTERNAL",
  "sessionId": "sess-xyz-001"
}

Enterprise Features
- **Multi-tenant isolation** — strict org boundary enforcement
- **RBAC + ABAC** — role and attribute-based access control per tool per data class
- **Data Loss Prevention (DLP)** — redacts PII/secrets before returning to the agent
- **Contextual AI reasoning** — Claude plans the optimal tool chain for agent goals
- **SOC2 / ISO 27001** — audit trail with data lineage
- **Geo & time-based policy** — restrict tool access by region or business hours

Explore More
LinkedIn & Social Automation: contact us to design AI-powered lead nurturing, content engagement, and multi-platform reply workflows tailored to your growth strategy.
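A hedged sketch of the Enterprise Auth & RBAC step, assuming an n8n Code node. Field names follow the sample request above; the policy lookup (normally read from the Google Sheets RBAC store) is simplified to an in-memory map, and the jsonwebtoken package must be allowlisted on self-hosted n8n via NODE_FUNCTION_ALLOW_EXTERNAL.

```javascript
// Hypothetical "Enterprise Auth & RBAC" sketch, not the template's code.
const jwt = require('jsonwebtoken'); // requires NODE_FUNCTION_ALLOW_EXTERNAL

const req = items[0].json;
// Throws on a bad signature or expired token.
const claims = jwt.verify(req.jwtToken, process.env.MCP_JWT_SECRET);

// Tenant isolation: the token's tenant must match the request tenant.
if (claims.tenantId !== req.tenantId) {
  throw new Error('Tenant mismatch: request rejected');
}

// Simplified RBAC: which tools each role may call.
const rolePolicies = {
  sales_manager: ['crm.get_pipeline', 'erp.get_inventory'],
};
const allowed = rolePolicies[req.userRole] ?? [];
const denied = req.toolRequests.filter((t) => !allowed.includes(t.toolName));

if (denied.length > 0) {
  throw new Error(`RBAC denied tools: ${denied.map((t) => t.toolName).join(', ')}`);
}

return [{ json: { ...req, claims, authorized: true } }];
```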
by Agent Circle
This N8N template makes it easy to extract key YouTube video data - including title, view count, like count, comment count, and more - and save it directly into a connected Google Sheet.

Use cases are many: whether you're a YouTuber, content strategist, growth marketer, or automation engineer, this tool gives you fast, structured access to video-level insights in seconds.

How It Works
1. The workflow begins when you click Execute Workflow or Test Workflow manually in N8N.
2. It reads the list of video URLs in the connected Google Sheet. Only the URLs marked with the Ready status will be processed.
3. The tool loops through each video and prepares the necessary data for the YouTube API call.
4. For each available URL, the tool extracts the video ID and sends a request to the YouTube API to fetch key metrics (see the sketch at the end of this template).
5. The response is checked:
- If successful: the video's statistics are written back to the corresponding row in the Google Sheet, and the row's status is marked as Finished.
- If unsuccessful: the row's status is updated to Error for later review.

How To Use
1. Download the workflow package.
2. Import the workflow package into your N8N interface.
3. Duplicate the YouTube - Get Video Statistics Google Sheet template into your Google Sheets account.
4. Set up Google Cloud Console credentials in the following nodes in N8N, ensuring access and suitable rights to the Google Sheets and YouTube services:
- For Google Sheets access, ensure each node is properly connected to the correct tab in your connected Google Sheet template:
  - Node Google Sheets - Get Video URLs → connected to Tab Video URLs
  - Node Google Sheets - Update Data → connected to Tab Video URLs
  - Node Google Sheets - Update Data - Error → connected to Tab Video URLs
- For YouTube access, set up a GET method to connect to the YouTube API in the node HTTP - Find Video Data.
5. In your connected Google Sheet, enter the video URLs that you want to crawl and set the rows' status to Ready.
6. Run the workflow by clicking Execute Workflow or Test Workflow in N8N.
7. View the results in your connected Google Sheet:
- Successful fetches will update the rows' status in Column A of the Video URLs tab to Finished, and the video metrics will populate.
- If the call fails, the rows' status in Column A will be marked as Error.

Requirements
Basic setup in Google Cloud Console (OAuth or API Key method enabled) with access to the YouTube and Google Sheets APIs.

How To Customize
By default, the workflow is manually triggered in N8N. However, you can automate the process by adding a Google Sheets trigger that monitors new entries automatically.
If you want to fetch additional video fields or analytics (like tags, category ID, etc.), you can expand the HTTP - Find Video Data node to include those.

Need Help?
Join our community on different platforms for support, inspiration, and tips from others.
Website: https://www.agentcircle.ai/
Etsy: https://www.etsy.com/shop/AgentCircle
Gumroad: http://agentcircle.gumroad.com/
Discord Global: https://discord.gg/d8SkCzKwnP
FB Page Global: https://www.facebook.com/agentcircle/
FB Group Global: https://www.facebook.com/groups/aiagentcircle/
X: https://x.com/agent_circle
YouTube: https://www.youtube.com/@agentcircle
LinkedIn: https://www.linkedin.com/company/agentcircle
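As referenced in "How It Works," here is a sketch of the video-ID extraction plus the kind of call the HTTP - Find Video Data node makes. The endpoint and the statistics/snippet parts are the public YouTube Data API v3; the input field name, API-key placeholder, and output shape are assumptions for illustration.

```javascript
// Hypothetical Code-node version of the fetch behind "HTTP - Find Video Data".
const url = items[0].json.videoUrl; // e.g. https://www.youtube.com/watch?v=dQw4w9WgXcQ
const match = url.match(/(?:v=|youtu\.be\/|shorts\/)([\w-]{11})/);
if (!match) {
  return [{ json: { status: 'Error', reason: 'Could not extract video ID' } }];
}

const videoId = match[1];
const res = await this.helpers.httpRequest({
  url: `https://www.googleapis.com/youtube/v3/videos?part=snippet,statistics&id=${videoId}&key=YOUR_API_KEY`,
  json: true,
});

const video = res.items?.[0];
return [{
  json: video
    ? {
        status: 'Finished',
        title: video.snippet.title,
        viewCount: video.statistics.viewCount,
        likeCount: video.statistics.likeCount,
        commentCount: video.statistics.commentCount,
      }
    : { status: 'Error', reason: 'Video not found' },
}];
```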
by Rosh Ragel
Automatically Send Square Summary Report for Yesterday's Sales via Microsoft Outlook

What It Does
This workflow automatically connects to the Square API and generates a daily sales summary report for all your Square locations. The report matches the figures displayed in Square Dashboard > Reports > Sales Summary. It's designed to run daily, pull the previous day's sales into a CSV file, and send that file to a manager or finance team for analysis.
This workflow builds on my previous template, which pulls data from the Square API into n8n for processing (see: https://n8n.io/workflows/6358).

Prerequisites
To use this workflow, you'll need:
- A Square API credential (configured as a Header Auth credential)
- A Microsoft Outlook credential

How to Set Up Square Credentials:
1. Go to Credentials > Create New
2. Choose Header Auth
3. Set the Name to Authorization
4. Set the Value to your Square Access Token (e.g., Bearer <your-api-key>)

How It Works
1. Trigger: The workflow runs every day at 4:00 AM
2. Fetch Locations: An HTTP request retrieves all Square locations linked to your account
3. Fetch Orders: For each location, an HTTP request pulls completed orders for the specified report_date
4. Filter Empty Locations: Locations with no sales are ignored
5. Aggregate Sales Data: A Code node processes the order data and produces a summary identical to Square's built-in Sales Summary report (a hedged sketch follows at the end of this template)
6. Create CSV File: A CSV file is created containing the relevant data
7. Send Email: An email is sent to the chosen third party

Example Use Cases
- Automatically send Square sales data to management to improve the quality of planning and scheduling decisions
- Automatically send data to an external third party, such as a landlord or agent, who is paid via commission
- Automatically send data to a bookkeeper for entry into QuickBooks

How to Use
1. Configure both HTTP Request nodes to use your Square API credential
2. Set the workflow to Active so it runs automatically
3. Enter the email address of the person you want to send the report to and update the message body
4. If you want to remove the n8n attribution, you can do so in the last node

Customization Options
- Add pagination to handle locations with more than 1,000 orders per day
- Instead of a daily summary, modify the workflow to produce a weekly summary once a week

Why It's Useful
This workflow saves time, reduces manual report pulling from Square, and enables smarter automation around sales data, whether for operations, finance, or performance monitoring.
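As referenced above, here is a hedged sketch of the Aggregate Sales Data step. Square's Orders API reports money as { amount, currency } objects with amounts in the smallest currency unit (cents); the total_money and total_discount_money field names follow that API, but the summary columns are illustrative rather than the template's exact output.

```javascript
// Hypothetical n8n Code node: roll completed orders up into a summary.
let grossSales = 0;
let totalDiscounts = 0;
let orderCount = 0;

for (const item of items) {
  const order = item.json;
  orderCount += 1;
  grossSales += order.total_money?.amount ?? 0;
  totalDiscounts += order.total_discount_money?.amount ?? 0;
}

return [{
  json: {
    orderCount,
    grossSales: (grossSales / 100).toFixed(2),        // cents → dollars
    totalDiscounts: (totalDiscounts / 100).toFixed(2),
  },
}];
```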
by Jitesh Dugar
Customer Support Ticket Documentation Automation
Automatically transform resolved support tickets into professional, AI-powered PDF documentation with complete tracking and team notifications.

Overview
This workflow automates the entire process of documenting resolved support tickets — from receiving ticket data to generating professional PDF case studies, storing them in Google Drive, tracking them in spreadsheets, and notifying your team. Powered by AI, it creates consistent, high-quality documentation that can be reused for knowledge base articles, training materials, and compliance records.

What This Workflow Does
1. Receives resolved support tickets via webhook from your support platform
2. Extracts and normalizes ticket data (works with Zendesk, Freshdesk, and custom formats; see the sketch at the end of this template)
3. Generates AI-powered summaries using OpenAI GPT-4, creating structured case studies with:
- Problem description
- Troubleshooting steps taken
- Final resolution
- Key takeaways and prevention tips
4. Creates professional PDF documents with branded HTML templates
5. Uploads PDFs to organized Google Drive folders
6. Tracks all tickets in a Google Sheets database for reporting and analytics
7. Sends Slack notifications to your team with links to completed documentation
8. Handles errors gracefully with automatic alerts when issues occur

Key Features
- **Fully Automated:** Zero manual intervention after setup
- **AI-Powered Documentation:** Intelligent summaries that extract insights from raw ticket data
- **Professional Output:** Branded, print-ready PDFs with modern styling
- **Multi-Platform Support:** Works with any support tool that can send webhooks
- **Centralized Tracking:** All tickets logged in Google Sheets for easy reporting
- **Error Handling:** Built-in failure detection with Slack alerts
- **Customizable:** Easy to brand with your company colors, logo, and styling
- **Scalable:** Handles hundreds of tickets per day

Use Cases
- **Knowledge Base Building:** Automatically create searchable documentation from real support cases
- **Team Training:** Build a library of resolved issues for onboarding new support agents
- **Compliance & Audit:** Maintain complete records of all customer interactions
- **Performance Analytics:** Track resolution times, common issues, and agent performance
- **Customer Success:** Share professional case studies with stakeholders
- **Process Improvement:** Identify recurring issues and optimize workflows

Prerequisites

Required Services & Accounts
- n8n (self-hosted or cloud)
- An OpenAI account with API access
- A PDFMunk account (for HTML → PDF conversion)
- Google Workspace (for Drive & Sheets)
- A Slack workspace (optional but recommended)
- A support platform that can send webhooks (Zendesk, Freshdesk, Intercom, etc.)

Required Credentials
- OpenAI API key
- PDFMunk API key
- Google Drive OAuth2 credentials
- Google Sheets OAuth2 credentials
- Slack bot token (OAuth2)

Setup Instructions

1. Import the Workflow
- Copy the workflow JSON.
- In n8n, click "Import from File" or "Import from Clipboard."
- Paste and import.
2. Configure Credentials

OpenAI API
- Get an API key from OpenAI
- Add in n8n: Credentials → OpenAI API → paste the key

PDFMunk API
- Sign up at pdfmunk.com
- Copy the API key → add in Credentials → HtmlcsstopdfApi

Google Drive OAuth2
- Create a project in the Google Cloud Console
- Enable the Drive API
- Create OAuth 2.0 credentials
- Add in n8n: Credentials → Google Drive OAuth2 → Connect

Google Sheets OAuth2
- Enable the Google Sheets API in the same project
- Add in n8n: Credentials → Google Sheets OAuth2 → Connect

Slack OAuth2
- Create an app at Slack API
- Add scopes: chat:write, chat:write.public
- Install it to your workspace
- Add the bot token in Credentials → Slack OAuth2 API

3. Configure Node Settings

Google Drive Folder ID
- Create a folder in Drive for the PDFs
- Copy the folder ID from the URL → https://drive.google.com/drive/folders/FOLDER_ID_HERE
- Paste it in the "Upload to Google Drive" node

Google Sheets Configuration
- Create a new sheet titled "Support Ticket Documentation Log."
- Add these headers in Row 1:

| Ticket ID | Subject | Customer Name | Customer Email | Agent Name | Priority | Category | Resolved Date | Resolution Time | PDF Link | Document Generated | Status |
| --------- | ------- | ------------- | -------------- | ---------- | -------- | -------- | ------------- | --------------- | -------- | ------------------ | ------ |

- Copy the Sheet ID from the URL → https://docs.google.com/spreadsheets/d/SHEET_ID_HERE/edit
- Paste it in the "Update Google Sheets" node configuration.

Slack Channel ID
- Right-click your Slack channel → View Channel Details
- Copy the Channel ID
- Update it in:
  - the "Send Slack Notification" node
  - the "Error – PDF Failed" node
  - the "Error – Upload Failed" node

4. Configure the Webhook in Your Support Tool
- Activate the workflow in n8n
- Copy the Webhook URL from the "Webhook – Receive Ticket" node
- In Zendesk/Freshdesk:
  - Trigger: "Ticket Status = Resolved"
  - Method: POST
  - Paste the n8n webhook URL
  - Send ticket data as JSON

5. Test the Workflow
- Click "Execute Workflow" manually
- Send a test webhook
- Verify each step completes successfully
- Check:
  - the generated PDF in Google Drive
  - the row entry in Google Sheets
  - Slack notification delivery

How It Works
1. Webhook Trigger → receives the resolved ticket
2. Data Extraction → normalizes ticket fields
3. AI Summarization (OpenAI) → generates a structured summary
4. HTML Formatting → applies styling and branding
5. PDF Conversion (PDFMunk) → converts HTML → PDF
6. Google Drive Upload → saves the file and returns a shareable link
7. Sheets Logging → appends the ticket info + PDF link
8. Slack Notification → notifies the team with a summary
9. Error Handling → detects and reports failures
Result → a clean, documented ticket case study in minutes

Customization

Branding
Update the company name, logo URL, and color scheme in the "Format HTML" node. Default color: #4CAF50 → replace with your brand color.

AI Prompt Customization
Modify the "AI Summarization (OpenAI)" node to add:
- Industry-specific terms
- Additional sections or insights
- A different summary tone or length

Notification Customization
Add @mentions or emojis in Slack messages for better visibility.

Data Flow
Webhook → Extract Data → AI Summary → Format HTML → Convert to PDF
↓
Download PDF → Upload to Drive → Log in Sheets → Notify Team
↓
Error Handling (if any)

Expected Output

PDF Document Includes:
- Branded header with company name/logo
- Resolution time badge
- Ticket metadata (ID, priority, customer, agent, etc.)
- Full AI-generated case study
- Professional footer with timestamp

Google Sheets Entry:
- All ticket info
- Resolution metrics
- Direct PDF link
- Status = "Generated"

Slack Notification:
- Summary of the ticket
- Clickable PDF link
- Timestamp

Performance
- **Processing Time:** 10–20 seconds/ticket
- **Capacity:** 100+ tickets/day
- **PDF Size:** 50–300 KB

Troubleshooting
- Webhook not triggering → check the webhook URL, trigger conditions, and public access.
- PDF generation fails → verify the HTML syntax and PDFMunk API key.
- Google Drive upload fails → re-authenticate credentials or check folder permissions.
- Slack notification missing → ensure the bot token, scopes, and channel ID are valid.
- Data extraction errors → adjust field mappings or inspect the payload format.

Best Practices
- Test before production rollout
- Monitor first-week error logs
- Organize Drive by date/priority
- Validate Sheets columns
- Use a dedicated Slack channel
- Archive old PDFs regularly
- Review AI summaries weekly
- Document configuration changes

Security Notes
- All credentials are stored securely in n8n
- PDF links are restricted by Drive sharing settings
- Webhooks use HTTPS for secure data transfer
- No sensitive info is logged in error messages

Future Enhancements
- Multi-language summaries
- Integration with Confluence or Notion
- Customer satisfaction feedback link
- ML-based issue categorization
- Analytics dashboard
- Weekly email digests
- Public knowledge base generation

Support Resources
- n8n Documentation
- n8n Community
- OpenAI API Docs
- PDFMunk Support
- Google Drive API
- Slack API Docs

License
This workflow template is provided as-is for use with n8n under the MIT License.
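As referenced in "What This Workflow Does," here is a hedged sketch of the Data Extraction step that normalizes Zendesk, Freshdesk, and custom webhook payloads into one shape. Every field name below is an assumption for illustration; adjust the mappings to match what your support platform actually sends.

```javascript
// Hypothetical n8n Code node: normalize incoming ticket payloads.
const body = items[0].json.body ?? items[0].json;

// Each platform nests ticket data differently; pick whichever is present.
const raw = body.ticket ?? body;

return [{
  json: {
    ticketId: raw.id ?? raw.ticket_id ?? 'unknown',
    subject: raw.subject ?? raw.title ?? '',
    customerName: raw.requester?.name ?? raw.customer_name ?? '',
    customerEmail: raw.requester?.email ?? raw.customer_email ?? '',
    agentName: raw.assignee?.name ?? raw.agent_name ?? '',
    priority: raw.priority ?? 'normal',
    resolvedAt: raw.resolved_at ?? new Date().toISOString(),
  },
}];
```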