by Evoort Solutions
🎁 Automate YouTube Giveaway Winner Selection with YouTube Comments Scraper API

Description: Easily automate your YouTube video giveaways using n8n and the YouTube Comments Scraper API. This workflow fetches comments, selects a random winner, logs results to Google Sheets, and notifies the admin—all hands-free!

🧩 Node-by-Node Breakdown

| Node | Name | Purpose |
| ---- | ---- | ------- |
| 1️⃣ | Form Trigger | Captures a YouTube video URL from a user via form submission. |
| 2️⃣ | Fetch YouTube Comments | Makes a POST request to the YouTube Comments Scraper API to retrieve comments. |
| 3️⃣ | Check API Response Status | Ensures that the response status is 200 before proceeding. |
| 4️⃣ | Select Random Commenter | Parses the comments and selects a random commenter as the giveaway winner. |
| 5️⃣ | Log Winner to Google Sheet | Appends the winner's name, video URL, and date to a Google Sheet for record-keeping. |
| 6️⃣ | Notify Winner Email | Sends a congratulatory email to the admin with the selected winner's name. |
| 7️⃣ | Notify: Invalid API Response | If the API fails, sends an alert to the admin about the issue. |

🔑 How to Get Your RapidAPI Key

To use the YouTube Comments Scraper API, follow these steps:

1. Go to the YouTube Comments Scraper API page.
2. Sign in or create a free RapidAPI account.
3. Click the "Subscribe to Test" button.
4. Copy your x-rapidapi-key from the "Code Snippets" or "Header Parameters" section.
5. Paste it into your HTTP Request node in n8n.

🎯 Use Case & Benefits

✅ Use Case: Automatically pick a random commenter from a YouTube video as a giveaway winner.

🚀 Benefits:

- **Fully automated** – no manual comment scanning or random selection.
- **Accurate & fair** – random selection from valid commenters only.
- **Time-saving** – especially for creators running multiple giveaways.
- **Integrated logging** – keep a historical record of all winners in Google Sheets.
- **Email alerts** – get notified whether the flow succeeds or fails.

👥 Who Is This For?

- **YouTube Content Creators** running giveaways.
- **Marketing Teams** promoting products via YouTube contests.
- **Agencies** managing influencer campaigns.
- **Developers & Automation Enthusiasts** looking to simplify giveaway processes.

💡 Why Use YouTube Comments Scraper API?

The YouTube Comments Scraper API offers a simple and effective way to extract public YouTube comments programmatically. It's fast, reliable, and integrates smoothly with platforms like n8n. You'll use this API:

- To retrieve all comments from a YouTube video.
- To power fair and transparent giveaways.
- To trigger downstream automations like winner logging and notification.

Create your free n8n account and set up the workflow in just a few minutes using the link below:

👉 Start Automating with n8n

Save time, stay consistent, and grow your YouTube presence effortlessly!
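The "Select Random Commenter" step can be sketched as a small n8n Code-node function. This is a minimal illustration, assuming the scraper API returns an array of comment objects with an `author` field — the actual field names depend on the API response, so check them before wiring this in.

```javascript
// Sketch of the random-winner selection, with deduplication so each
// commenter gets exactly one entry regardless of how many comments
// they left. The `author` field name is an assumption.
function selectRandomWinner(comments) {
  // Keep only comments that actually carry an author, then deduplicate.
  const authors = [...new Set(
    comments.filter((c) => c && c.author).map((c) => c.author)
  )];
  if (authors.length === 0) return null; // no valid commenters
  return authors[Math.floor(Math.random() * authors.length)];
}

const winner = selectRandomWinner([
  { author: "alice", text: "Great video!" },
  { author: "bob", text: "Count me in" },
  { author: "alice", text: "Pick me!" },
]);
console.log(winner);
```

Deduplicating first keeps the draw fair: a user who comments ten times has the same odds as one who comments once.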
by Rahul Joshi
Description: This workflow automates team capacity monitoring using Jira data to identify over-allocated team members and alert managers instantly. It ensures proactive workload management by fetching active issues, calculating utilization rates, logging capacity metrics, and sending detailed email alerts when members exceed 100% capacity. It helps project managers prevent burnout, balance workloads, and maintain operational efficiency — all with zero manual tracking.

What This Workflow Does (Step-by-Step)

- 🟢 Manual Trigger – Start the capacity analysis manually on demand.
- 📋 Fetch Active Jira Issues – Retrieves all "In Progress" tasks from Jira to analyze workloads.
- ✅ Data Validation – Checks whether Jira returned valid data before continuing. True path: moves to capacity calculation. False path: logs the query failure to the error tracking sheet.
- 📊 Capacity Calculator – Aggregates logged hours per user and calculates utilization percentage based on an 8-hour daily capacity.
- 📈 Log Capacity Data to Tracking Sheet – Appends capacity metrics (Assignee, Total Hours, Utilization %, Status) to a Google Sheet for historical tracking and trend analysis.
- ⚠️ Over-Allocation Check – Identifies team members exceeding 100% utilization (status = "Overallocated").
- 📢 Alert Report Generator – Builds a dynamic report summarizing all over-allocated members, their logged hours, utilization %, and corrective suggestions. Generates both alert and "All Clear" reports based on findings.
- 📧 Send Over-Allocation Alert to Manager – Sends an automated Gmail alert to the project manager, including severity-based subject lines and a detailed breakdown of each over-allocated member.
- 🚨 Log Query Failures to Error Sheet – Records any Jira API or data retrieval issues in the error log sheet for monitoring and debugging.
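The Capacity Calculator step above can be sketched roughly as follows. This is an illustrative stand-in, not the exact node code: the issue shape (`assignee`, `loggedHours`) is an assumption, and the real workflow reads these values from the Jira API payload.

```javascript
// Aggregate logged hours per assignee and compute utilization against
// an 8-hour daily capacity, mirroring the Capacity Calculator step.
const DAILY_CAPACITY_HOURS = 8;

function calculateUtilization(issues) {
  const totals = {};
  for (const issue of issues) {
    const who = issue.assignee || "Unassigned";
    totals[who] = (totals[who] || 0) + (issue.loggedHours || 0);
  }
  return Object.entries(totals).map(([assignee, hours]) => {
    const utilization = Math.round((hours / DAILY_CAPACITY_HOURS) * 100);
    return {
      assignee,
      totalHours: hours,
      utilization, // percent of one 8-hour day
      status: utilization > 100 ? "Overallocated" : "OK",
    };
  });
}

const report = calculateUtilization([
  { assignee: "maya", loggedHours: 6 },
  { assignee: "maya", loggedHours: 4 },
  { assignee: "li", loggedHours: 5 },
]);
```

Here "maya" totals 10 hours against an 8-hour day (125%, flagged as Overallocated), while "li" stays under capacity.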
Prerequisites

- Jira account with API access
- Google Sheets for "Team Capacity Tracking" and "Error Log"
- Gmail credentials for automated email delivery

Key Benefits

✅ Early detection of team over-allocation
✅ Automated data logging and historical tracking
✅ Real-time email alerts to prevent burnout
✅ Data-driven sprint planning and workload balancing
✅ Zero manual monitoring required

Perfect For

- Project Managers and Scrum Masters tracking team load
- Engineering teams managing multiple active sprints
- Organizations looking to automate workload visibility
- HR and PMOs monitoring resource utilization trends
by Robert Breen
This workflow pulls all tasks from your Monday.com board each day and logs them into a Google Sheet. It creates a daily snapshot of your project's progress and statuses for reporting, tracking, or analysis.

⚙️ Setup Instructions

1️⃣ Connect Monday.com API

- In Monday.com, go to Admin → API
- Copy your Personal API Token (docs: Generate Monday API Token)
- In n8n, go to Credentials → New → Monday.com API, paste your token, and save

2️⃣ Prepare Your Google Sheet

- Copy this template to your own Google Drive: Google Sheet Template
- Add your data in rows 2–100. Make sure each new task row starts with Added = No.
- Connect Google Sheets in n8n: go to Credentials → New → Google Sheets (OAuth2), log in with your Google account, and grant access
- In the workflow, select your Spreadsheet ID and the correct Sheet Name

🧠 How it works

- **Trigger**: Runs on click or via schedule (e.g., daily at 9 AM).
- **Get many items (Monday.com)**: Fetches all tasks and their current status.
- **Today's Date Node**: Adds the current date for snapshot logging.
- **Map Fields**: Normalizes task name and status.
- **Google Sheets (Append)**: Saves all tasks with status + date into your sheet for historical tracking.

📬 Contact

Need help customizing this (e.g., filtering by status, emailing daily reports, or adding charts)?
📧 robert@ynteractive.com
🔗 Robert Breen
🌐 ynteractive.com
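The Map Fields step can be sketched as a tiny normalization function. This is a hedged illustration only: the input shape (`name`, `status`) and the output column names (`Task`, `Status`, `Date`) are assumptions — the real node output depends on your board's columns and your sheet's headers.

```javascript
// Normalize Monday.com items into the rows appended to the sheet.
// Every row in one run shares the same snapshot date, which is what
// makes the daily history queryable later.
function mapItems(items, snapshotDate) {
  return items.map((item) => ({
    Task: (item.name || "").trim(),       // tidy the task name
    Status: item.status || "No status",   // fall back for empty statuses
    Date: snapshotDate,                   // from the "Today's Date" node
  }));
}

const rows = mapItems(
  [
    { name: "  Write report ", status: "Working on it" },
    { name: "Ship v2", status: null },
  ],
  "2024-05-25"
);
```

A stable fallback like "No status" keeps the sheet's status column consistent even when a board item has no status set.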
by automedia
Transcribe New YouTube Videos and Save to Supabase

Who's It For?

This workflow is for content creators, marketers, researchers, and anyone who needs to quickly get text transcripts from YouTube videos. If you analyze video content, repurpose it for blogs or social media, or want to make videos searchable, this template will save you hours of manual work.

What It Does

This template automatically monitors multiple YouTube channels for new videos. When a new video is published, it extracts the video ID, retrieves the full transcript using the youtube-transcript.io API, and saves the structured data—including the title, author, URL, and full transcript—into a Supabase table. It intelligently filters out YouTube Shorts by default and includes error handling to ensure that only successful transcriptions are processed.

Requirements

- A Supabase account with a table ready to receive the video data.
- An API key from youtube-transcript.io (offers a free tier).
- The Channel ID for each YouTube channel you want to track. You can find this using a free online tool like TunePocket's Channel ID Finder.

How to Set Up

- Add Channel IDs: In the "Channels To Track" node, replace the example YouTube Channel IDs with your own. The workflow uses these IDs to create RSS links and find new videos.
- Configure API Credentials: Find the "youtube-captions" HTTP Request node. In the credentials tab, create a new "Header Auth" credential. Name it youtube-transcript-io and paste your API key into the "Value" field. The "Name" field should be x-api-key.
- Connect Your Supabase Account: Navigate to the "Add to Content Queue Table" node. Create new credentials for your Supabase account using your Project URL and API key. Once connected, select your target table and map the incoming fields (title, source_url, content_snippet, etc.) to the correct columns in your table.
- Set Your Schedule (Optional): The workflow starts with a manual trigger.
To run it automatically, replace the "When clicking 'Execute workflow'" node with a Schedule node and set your desired interval (e.g., once a day).
- Activate the Workflow: Save your changes and toggle the workflow to Active in the top right corner.

How to Customize

- **Transcribe YouTube Shorts:** To include Shorts in your workflow, select the **"Does url exist?"** If node and delete the second condition that checks for youtube.com/shorts.
- **Change Your Database:** Don't use Supabase? Simply replace the **"Add to Content Queue Table"** node with another database or spreadsheet node, such as **Google Sheets**, **Airtable**, or n8n's own Table.
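The ID-extraction and Shorts-filtering logic can be sketched like this. The URL patterns are the common public YouTube formats, not taken from the template itself, so adjust the regex if your RSS feed links look different.

```javascript
// Extract the 11-character video ID from common YouTube URL shapes
// (watch?v=..., youtu.be/..., /shorts/...).
function extractVideoId(url) {
  const match = url.match(/(?:v=|\/shorts\/|youtu\.be\/)([\w-]{11})/);
  return match ? match[1] : null;
}

// Shorts are recognizable by their URL path, which is what the
// "Does url exist?" If node's second condition checks.
function isShort(url) {
  return url.includes("youtube.com/shorts/");
}

const urls = [
  "https://www.youtube.com/watch?v=dQw4w9WgXcQ",
  "https://www.youtube.com/shorts/abc123DEF45",
];
const regularVideos = urls.filter((u) => !isShort(u)).map(extractVideoId);
```

Dropping the `isShort` filter is the code-level equivalent of the "Transcribe YouTube Shorts" customization above.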
by BizThrive.ai
Turn your Telegram bot into a real-time research assistant with this intelligent n8n workflow. Designed for founders, analysts, and knowledge workers, this automation uses Perplexity Sonar and Sonar Pro to deliver concise, citation-rich answers to complex queries — directly inside Telegram.

🔍 What It Does

- ✅ **Smart Query Routing** – Automatically selects the right tool based on query complexity — Sonar for fast lookups, Sonar Pro for multi-source synthesis.
- 📚 **Cited Research Summaries** – Includes clickable URLs from Perplexity's source data for transparency and auditability.
- 🧠 **Session Memory** – Maintains chat context using the Telegram chat ID for follow-up questions and threaded insight.
- 🔐 **Secure Access Filter** – Restricts bot usage to authorized Telegram users.
- ⚙️ **Customizable Agent Behavior** – Easily adjust tone, tool preferences, and citation style via the system message.

🚀 Use Cases

- Market research & competitor analysis
- Academic and scientific deep-dives
- Legal and transcript summarization
- Podcast, video, and trend monitoring
- Personal AI assistant for founders and consultants

🛠 Setup Instructions

1. Create a Telegram bot via @BotFather and add your token.
2. Add your OpenAI and Perplexity API keys.
3. Update the filter node with your Telegram user ID.
4. Deploy and start chatting — responses appear in Telegram.
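The access filter and session memory described above can be illustrated with a short sketch. In n8n these behaviors live in the filter node and the memory node respectively; the code below is only a conceptual stand-in, and the user ID is a placeholder.

```javascript
// Restrict the bot to known Telegram user IDs (the filter node's job).
const AUTHORIZED_USERS = [123456789]; // replace with your own user ID(s)

function isAuthorized(userId) {
  return AUTHORIZED_USERS.includes(userId);
}

// Key conversation history by Telegram chat ID so follow-up questions
// share context (the memory node's job).
const sessions = new Map();

function remember(chatId, message) {
  if (!sessions.has(chatId)) sessions.set(chatId, []);
  sessions.get(chatId).push(message);
  return sessions.get(chatId);
}

remember(42, "What is Perplexity Sonar?");
const history = remember(42, "How does it compare to Sonar Pro?");
```

Keying memory by chat ID rather than user ID means group chats and private chats each get their own thread of context.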
by Yehor EGMS
🔐 n8n Workflow: Access Control for Internal Chats or Chatbots

This n8n workflow helps you restrict access to your internal chats or chatbots so that only authorized team members can interact with them. It's perfect for setups using Telegram, Slack, or other corporate messengers, where you need to prevent unauthorized users from triggering internal automations.

📌 Section 1: Trigger & Input

⚡ Receive Message (Telegram Trigger)
- Purpose: Captures every incoming message from a user interacting with your Telegram bot (or another messenger).
- How it works: When a user sends a message, it instantly triggers the workflow and passes their username or ID as input data.
- Benefit: Acts as the entry point for verifying whether a user is allowed to proceed.

📌 Section 2: Access Table Lookup

📋 User Access Table (Data Node / Spreadsheet / DB Query)
- Purpose: Stores all your team members and their current access status.
- Structure example:

| Username | Access Status |
|----------|---------------|
| user1    | granted       |
| user2    | denied        |
| user3    | granted       |

- Benefit: Centralized access control — you can easily update user permissions without editing the workflow.

📌 Section 3: Permission Check

🧩 Check Access (IF Node)
- Purpose: Compares the incoming user's name or ID against the access table.
- Logic: If status = granted → allow the message to continue. If status = denied → stop workflow execution.
- Benefit: Ensures only approved users can interact with your automations or receive responses.

📌 Section 4: Response Handling

💬 Send Reply (Telegram Node)
- Purpose: Sends a message back to the user depending on their access level.
- Paths: ✅ Granted: sends the normal bot response or triggers the main process. ❌ Denied: sends no reply (or an optional "Access denied" message).
- Benefit: Prevents unauthorized access while maintaining a seamless experience for approved users.

📊 Workflow Overview Table

| Section | Node Name | Purpose |
|---------|-----------|---------|
| 1. Trigger | Receive Message | Captures incoming messages |
| 2. Access Table | User Access Table | Stores usernames + permissions |
| 3. Check | Check Access | Verifies if user has permission |
| 4. Response | Send Reply | Sends or blocks response based on status |

🎯 Key Benefits

- 🔐 Secure access control: Only trusted users can trigger your internal automations.
- ⚙️ Dynamic management: Easily update user permissions from a table or database.
- 🧠 Lightweight setup: Just three nodes create a fully functional access gate.
- 🚀 Scalable foundation: Extend it with role-based access or activity logging later.
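The permission check at the heart of this gate can be sketched as follows. The table shape mirrors the example above; in n8n, this comparison is what the Check Access IF node expresses as a condition.

```javascript
// Access table matching the structure example in Section 2.
const accessTable = [
  { username: "user1", accessStatus: "granted" },
  { username: "user2", accessStatus: "denied" },
  { username: "user3", accessStatus: "granted" },
];

// Look the sender up and decide whether the workflow may continue.
function hasAccess(username, table) {
  const entry = table.find((row) => row.username === username);
  // Users not in the table are denied by default, which is the safer
  // failure mode for an internal-automation gate.
  return Boolean(entry && entry.accessStatus === "granted");
}
```

Denying unknown users by default matters: a lookup table that only lists denied users would silently admit anyone it has never seen.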
by Fahmi Fahreza
Analyze Trustpilot & Sitejabber sentiment with Decodo + Gemini to Sheets

Sign up for Decodo HERE for a discount.

This template scrapes public reviews from Trustpilot and Sitejabber with a Decodo tool, converts findings into a flat, spreadsheet-ready JSON, generates a concise sentiment summary with Gemini, and appends everything to Google Sheets. It's ideal for reputation snapshots, competitive analysis, or lightweight BI pipelines that need structured data and a quick narrative.

Who's it for?

Marketing teams, growth analysts, founders, and agencies who need repeatable review collection and sentiment summaries without writing custom scrapers or manual copy/paste.

How it works

1. A Form Trigger collects the business name or URL.
2. Set (Config Variables) stores business_name, spreadsheet_id, and sheet_id.
3. The Agent orchestrates the Decodo tool and enforces a strict JSON schema with at most 10 reviews per source.
4. Gemini writes a succinct summary and recommendations, noting missing sources with: "There's no data in this website."
5. A Merge node combines JSON fields with the narrative.
6. Google Sheets appends a row.

How to set up

1. Add Google Sheets, Gemini, and Decodo credentials in the Credential Manager.
2. Replace (YOUR_SPREADSHEET_ID) and (YOUR_SHEET_ID) in Set: Config Variables.
3. In Google Sheets, select "Define below" and map each column explicitly.
4. Keep the parser and agent connections intact to guarantee flat JSON.
5. Activate, open the form URL, submit a business, and verify the appended row.
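To make "flat, spreadsheet-ready JSON" concrete, here is an illustrative example of the kind of single-level object the agent is asked to produce. Every field name here is an assumption, not the template's actual schema — align them with the columns you map in Google Sheets.

```javascript
// One flat object per business: no nesting, so each key maps directly
// to one spreadsheet column when the row is appended.
const row = {
  business_name: "Acme Co",
  trustpilot_review_count: 10,     // capped at 10 reviews per source
  trustpilot_avg_rating: 4.2,
  sitejabber_review_count: 0,      // a source with no results
  sitejabber_avg_rating: null,
  summary: "There's no data in this website.", // Gemini's note for the missing source
};

// Flatness check: a nested value here would break the column mapping.
const isFlat = Object.values(row).every((v) => typeof v !== "object" || v === null);
```

Keeping the schema flat is what lets the Google Sheets node's "Define below" mapping stay a simple one-key-per-column exercise.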
by Matthew
Instagram Hashtag Lead Generation

Automate the process of finding and qualifying Instagram leads based on hashtags. This workflow reads hashtags from Google Sheets, scrapes Instagram for posts using Apify, analyzes caption content and language, compiles unique usernames, gathers detailed user info, and filters leads based on follower count.

How It Works

1. Fetch Hashtags – The workflow starts and pulls a list of hashtags from a Google Sheet.
2. Scrape Instagram Posts – For each hashtag, it builds Instagram explore URLs and scrapes posts using Apify.
3. Analyze Captions – Each caption is cleaned, hashtags and links are removed, and language/content is analyzed (English/French/Spanish).
4. Extract & Filter Usernames – Usernames are combined and deduplicated, and their Instagram profiles are scraped for follower counts and other details.
5. Qualified Leads – Only users with followers in your target range are kept as qualified leads for outreach or analysis.

Requirements

- An n8n instance.
- Apify API key.
- Google account with a Sheet containing hashtags (the Sheet should have a column with hashtags).
- Apify Instagram Scraper actor access.

Setup Instructions

1. Add Credentials – In n8n, add your Apify API key and Google Sheets credentials.
2. Configure Google Sheets Nodes – Choose your credentials, spreadsheet, and sheet name in the "Get list of Hashtags" node.
3. Configure Apify Request Nodes – Enter your Apify API key and select the Instagram scraper actors.
4. Adjust Filtering – Edit the min/max follower count in the relevant filter node to match your needs.
5. Test & Run – Manually execute the workflow or add a trigger to run on a schedule.

Customization Options

- 💡 **Trigger:** Add a schedule or webhook to automate execution.
- **Language Filtering:** Modify keyword lists or add language detection logic.
- **Lead Output:** Extend the workflow to save leads to a CRM or send notifications.

Details

Nodes used in this workflow: Manual Trigger, Google Sheets, Code, HTTP Request (Apify), IF (Conditional), Aggregate, Remove Duplicates, Sticky Note.
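The caption-analysis step (clean, then detect language) can be sketched like this. The keyword lists are illustrative stand-ins — the template's actual lists may differ, and the "Language Filtering" customization above is exactly where you would extend them.

```javascript
// Strip links and hashtags so only the caption's prose remains.
function cleanCaption(caption) {
  return caption
    .replace(/https?:\/\/\S+/g, "")    // remove links
    .replace(/#[\p{L}\p{N}_]+/gu, "")  // remove hashtags (Unicode-aware)
    .replace(/\s+/g, " ")
    .trim();
}

// Tiny keyword lists for a rough English/French/Spanish guess.
const KEYWORDS = {
  english: ["the", "and", "with", "this"],
  french: ["le", "la", "avec", "pour"],
  spanish: ["el", "con", "para", "que"],
};

function detectLanguage(text) {
  const words = text.toLowerCase().split(/\W+/);
  let best = "unknown";
  let bestHits = 0;
  for (const [lang, list] of Object.entries(KEYWORDS)) {
    const hits = words.filter((w) => list.includes(w)).length;
    if (hits > bestHits) { best = lang; bestHits = hits; }
  }
  return best;
}

const cleaned = cleanCaption("Love this gym! #fitness https://example.com");
```

Keyword counting is deliberately crude but dependency-free; swapping in a proper language-detection library is a natural upgrade if accuracy matters.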
by Julian Kaiser
Turn Your Reading Habit into a Content Creation Engine

This workflow is built for one core purpose: to maximize the return on your reading time. It turns your passive consumption of articles and highlights into an active system for generating original content and rediscovering valuable ideas you may have forgotten.

Why This Workflow is Valuable

- **End Writer's Block Before It Starts:** This workflow is your personal content strategist. Instead of staring at a blank page, you'll start your week with a list of AI-generated content ideas—from LinkedIn posts and blog articles to strategic insights—all based on the topics you're already deeply engaged with. It finds the hidden connections between articles and suggests novel angles for your next piece.
- **Rescue Your Insights from the Digital Abyss:** Readwise is fantastic for capturing highlights, but the best ones can get lost over time. This workflow acts as your personal curator, automatically excavating the most impactful quotes and notes from your recent reading. It doesn't just show them to you; it contextualizes them within the week's key themes, giving them new life and relevance.
- **Create an Intellectual Flywheel:** By systematically analyzing your reading, generating content ideas, and saving those insights back into your "second brain," you create a powerful feedback loop. Your reading informs your content, and the process of creating content deepens your understanding, making every reading session more valuable than the last.

How it works

This workflow automates the process of generating a "Weekly Reading Insights" summary based on your activity in Readwise.

- **Trigger:** It can be run manually or on a weekly schedule.
- **Fetch Data:** It fetches all articles and highlights you've updated in the last 7 days from your Readwise account.
- **Filter & Match:** It filters for articles that you've read more than 10% of and then finds all the corresponding highlights for those articles.
- **Generate Insights:** It constructs a detailed prompt with your reading data and sends it to an AI model (via OpenRouter) to create a structured analysis of your reading patterns, key themes, and content ideas.
- **Save to Readwise:** Finally, it takes the AI-generated markdown, converts it to HTML, and saves it back to your Readwise account as a new article titled "Weekly Reading Insights".

Set up steps

**Estimated setup time:** 5–10 minutes.

1. Readwise credentials: Authenticate the two HTTP Request nodes and the two Fetch nodes with your Readwise API token (available from the Reader API page; see n8n's docs on setting up Header Auth credentials).
2. AI model credentials: Add your OpenRouter API key to the OpenRouter Chat Model node. You can swap this for any other AI model if you prefer.
3. Customize the prompt: Open the Prepare Prompt Code node to adjust the persona, questions, and desired output format. This is where you can tailor the AI's analysis to your specific needs.
4. Adjust the schedule: Modify the "Monday - 09:00" Schedule Trigger to run on your preferred day and time.
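The Filter & Match step can be sketched as a small function. The field names (`reading_progress` as a 0–1 fraction, `parent_id` linking a highlight to its article) loosely follow the Readwise Reader API but are assumptions here — verify them against the actual response before relying on this shape.

```javascript
// Keep articles with more than 10% reading progress and attach the
// highlights whose parent_id points at them.
function matchHighlights(articles, highlights) {
  return articles
    .filter((a) => (a.reading_progress || 0) > 0.1)
    .map((a) => ({
      ...a,
      highlights: highlights.filter((h) => h.parent_id === a.id),
    }));
}

const matched = matchHighlights(
  [
    { id: "a1", title: "Deep Work", reading_progress: 0.4 },
    { id: "a2", title: "Skimmed Piece", reading_progress: 0.05 },
  ],
  [
    { id: "h1", parent_id: "a1", text: "Focus is a skill." },
    { id: "h2", parent_id: "a2", text: "Barely read." },
  ]
);
```

The 10% threshold is the workflow's way of excluding articles you only glanced at, so the AI prompt is built from reading you actually engaged with.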
by DIGITAL BIZ TECH
SharePoint → Supabase → Google Drive Sync Workflow

Overview

This workflow is a multi-system document synchronization pipeline built in n8n, designed to automatically sync and back up files between Microsoft SharePoint, Supabase/Postgres, and Google Drive. It runs on a scheduled trigger, compares SharePoint file metadata against your Supabase table, downloads new or updated files, uploads them to Google Drive, and marks records as completed — keeping your databases and storage systems perfectly in sync.

Workflow Structure

- **Data Source:** SharePoint REST API for recursive folder and file discovery.
- **Processing Layer:** n8n logic for filtering, comparison, and metadata normalization.
- **Destination Systems:** Supabase/Postgres for metadata, Google Drive for file backup.

SharePoint Sync Flow (Frontend Flow)

- **Trigger:** Schedule Trigger – runs at fixed intervals (customizable) to start synchronization.
- **Fetch Files:** Microsoft SharePoint HTTP Request – recursively retrieves folders and files using SharePoint's REST API: /GetFolderByServerRelativeUrl(...)?$expand=Files,Folders,Folders/Files,Folders/Folders/Folders/Files
- **Filter Files:** filter files – a Code node that flattens nested folders and filters unwanted file types: excludes system or temporary files (~$) and the extensions .db, .msg, .xlsx, .xlsm, .pptx.
- **Normalize Metadata:** normalize last modified date – ensures a consistent Last_modified_date format for accurate comparison.
- **Fetch Existing Records:** Supabase (Get) – retrieves current entries from n8n_metadata to compare against SharePoint files.
- **Compare Datasets:** Compare Datasets – detects new or modified files based on UniqueId, Last_modified_date, and Exists; routes only changed entries forward for processing.

File Processing Engine (Backend Flow)

- **Loop:** Loop Over Items2 – iterates through each new or updated file detected.
- **Build Metadata:** get metadata and Set metadata – constructs the final metadata fields: file_id, file_title, file_url, file_type, foldername, last_modified_date. Generates fileUrl from UniqueId and ServerRelativeUrl if missing.
- **Upsert Metadata:** Insert Document Metadata – inserts or updates file records in Supabase/Postgres (n8n_metadata table). Operation: upsert with id as the primary matching key.
- **Download File:** Microsoft SharePoint HTTP Request1 – fetches the binary file directly from SharePoint using its ServerRelativeUrl.
- **Rename File:** rename files – renames each downloaded binary file to its original file_title before upload.
- **Upload File:** Upload file – uploads the renamed file to Google Drive (My Drive → root folder).
- **Mark Complete:** Postgres – updates the Supabase/Postgres record, setting Loading Done = true.
- **Optional Cleanup:** Supabase1 – deletes obsolete or invalid metadata entries when required.

Integrations Used

| Service | Purpose | Credential |
|---------|---------|------------|
| Microsoft SharePoint | File retrieval and download | microsoftSharePointOAuth2Api |
| Supabase / Postgres | Metadata storage and synchronization | Supabase account 6 ayan |
| Google Drive | File backup and redundancy | Google Drive account 6 rn dbt |
| n8n Core | Flow control, dataset comparison, batch looping | Native |

System Prompt Summary

> "You are a SharePoint document synchronization workflow. Fetch all files, compare them to database entries, and only process new or modified files. Download files, rename correctly, upload to Google Drive, and mark as completed in Supabase."

Workflow rule summary:

> "Maintain data integrity, prevent duplicates, handle retries gracefully, and continue on errors. Skip excluded file types and ensure reliable backups between all connected systems."

Key Features

- Scheduled automatic sync across SharePoint, Supabase, and Google Drive
- Intelligent comparison to detect only new or modified files
- Idempotent upsert for consistent metadata updates
- Configurable file exclusion filters
- Safe rename + upload pipeline for clean backups
- Error-tolerant and fully automated operation

Summary

> A reliable SharePoint-to-Google Drive synchronization workflow built with n8n, integrating Supabase/Postgres for metadata management. It automates file fetching, filtering, downloading, uploading, and marking as completed — ensuring your data stays mirrored across platforms. Perfect for enterprises managing document automation, backup systems, or cross-cloud data synchronization.

Need Help or More Workflows?

Want to customize this workflow for your organization? Our team at Digital Biz Tech can extend it for enterprise-scale document automation, RAG, and social media automation. We can help you set it up for free — from connecting credentials to deploying it live.

Contact: shilpa.raju@digitalbiz.tech
Website: https://www.digitalbiz.tech
LinkedIn: https://www.linkedin.com/company/digital-biz-tech/
You can also DM us on LinkedIn for any help.
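The "filter files" Code node's exclusion rules can be sketched as follows. This is a simplified stand-in that covers only the filtering behavior (the `~$` temp-file check and the extension blocklist); the flattening of nested SharePoint folders is omitted for brevity.

```javascript
// Extensions the workflow excludes from sync.
const EXCLUDED_EXTENSIONS = [".db", ".msg", ".xlsx", ".xlsm", ".pptx"];

function keepFile(name) {
  // Office creates "~$..." lock/temp files alongside open documents.
  if (name.startsWith("~$")) return false;
  const dot = name.lastIndexOf(".");
  if (dot === -1) return true; // no extension: nothing to exclude on
  const ext = name.slice(dot).toLowerCase();
  return !EXCLUDED_EXTENSIONS.includes(ext);
}

const files = ["report.pdf", "~$draft.docx", "stats.xlsx", "notes.docx"];
const kept = files.filter(keepFile);
```

Filtering before the dataset comparison keeps excluded file types out of the Supabase metadata table entirely, rather than syncing and then ignoring them.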
by Abhiman G S
Quick Summary

This workflow connects Telegram, Google Gemini AI, and Notion to make task creation effortless. Whenever you send a message to your Telegram bot, Gemini AI reads your message, understands what task you meant, and automatically creates it in your Notion database - complete with the task name and due date. If you approve, the task is added to Notion and you'll get a confirmation in Telegram. If you decline, it simply replies that ❌ no task was created. Perfect for anyone who wants to capture ideas or to-dos directly from Telegram without opening Notion every time.

Detailed Use Cases

- Quick task capture from Telegram: Send a message like "Buy milk 25 May" to your Telegram bot. The AI extracts the task name and date, asks you to approve, and creates a Notion task once approved.
- Turn reminders into Notion tasks: Message "Pay rent next Monday" → Approve → Task added to Notion with the correct due date. Perfect for quickly saving reminders while on the go.

Tip: For best results, include clear dates in your messages like 25 May or May 25, 2025 so the AI can extract them accurately.

Prerequisites / Requirements

Before using this workflow, make sure you have the following ready:

1. Telegram Bot Setup
   - Go to @BotFather on Telegram.
   - Create a new bot using the /newbot command.
   - Copy the Bot Token — you'll need it to connect the Telegram Trigger and Telegram nodes in n8n.
2. Google Gemini API Key
   - Sign up for a free Google AI Studio account.
   - Create an API key and connect it in your n8n credentials under Google Gemini.
   - This workflow uses the models/gemini-2.5-flash-lite model, which works perfectly on the free tier.
3. Notion Database Setup
   - Create a Notion database for storing tasks. It must have a Title property, set as the Task Name (type: Title / text), and a Date property, set as the Due Date (type: Date).
   - Copy your Notion database ID and connect your Notion API credentials in n8n.
Optional Security Note

In the Telegram Trigger node, restrict the chatId to your own Telegram user ID if this workflow is for personal use.

Customization Guide

You can easily extend this workflow to capture more details from your Telegram messages using the AI Extract node. Here are a few simple ways to customize it:

1. Add more extracted fields
   - In the AI Extract: TaskName & TaskDue node, open the Attributes section.
   - Add new fields like: Priority → text (e.g., High, Medium, Low), Duration → number or text (e.g., 30 mins, 1 hour), Notes → text (optional extra info).
   - The Gemini model will automatically try to extract these from your message.
2. Example messages you can send
   - "Finish report by Friday — high priority, 2 hours."
   - "Workout tomorrow evening — medium priority."
3. Map extra fields to Notion
   - In the Notion: Create Task (Page) node, scroll to the Properties section.
   - Add matching Notion properties for each new field (e.g., Priority, Duration).
   - Map the AI outputs (like output.Priority) to their respective fields.
4. Send confirmation with extra data
   - In the Send and Wait for Response (Approve/Decline) node, include the new extracted details in the Telegram message before approval. Example confirmation message:
     Task: {{ $json.output.TaskName }}
     Due: {{ $json.output.TaskDue }}
     Priority: {{ $json.output.Priority }}
     Duration: {{ $json.output.Duration }}

By doing this, you can build a smarter task capture system that includes priority, time estimation, and other useful details, all directly from your Telegram chat.
by Rahul Joshi
Description: Keep your customer knowledge base up to date with this n8n automation template. The workflow connects Zendesk with Google Sheets, automatically fetching tickets tagged as "howto," enriching them with requester details, and saving them into a structured spreadsheet. This ensures your internal or public knowledge base reflects the latest customer how-to queries—without manual copy-pasting. Perfect for customer support teams, SaaS companies, and service providers who want to streamline documentation workflows.

What This Template Does (Step-by-Step)

- ⚡ Manual Trigger or Scheduling – Run the workflow manually for testing/troubleshooting, or configure a schedule trigger for daily/weekly updates.
- 📥 Fetch All Zendesk Tickets – Connects to your Zendesk account and retrieves all available tickets.
- 🔍 Filter for "howto" Tickets Only – Processes only tickets that contain the "howto" tag, ensuring relevance.
- 👤 Enrich User Data – Fetches requester details (name, email, profile info) to provide context.
- 📊 Update Google Sheets Knowledge Base – Saves ticket data, including Ticket No., Description, Status, Tag, Owner Name, and Email. ✔️ A smart update prevents duplicates by matching on description.
- 🔁 Continuous Sync – Each new or updated "howto" ticket is synced automatically into your knowledge base sheet.

Key Features

- 🔍 Tag-based filtering for precise categorization
- 📊 Smart append-or-update logic in Google Sheets
- ⚡ Zendesk + Google Sheets integration with OAuth2
- ♻️ Keeps the knowledge base fresh without manual effort
- 🔐 Secure API credential handling

Use Cases

- 📖 Maintain a live "how-to" guide from real customer queries
- 🎓 Build self-service documentation for support teams
- 📩 Monitor and track recurring help topics
- 💼 Equip knowledge managers with a ready-to-export dataset

Required Integrations

- Zendesk API (for ticket fetch + user info)
- Google Sheets (for storing/updating records)

Why Use This Template?
- ✅ Automates repetitive data entry
- ✅ Ensures knowledge base accuracy & freshness
- ✅ Reduces support team workload
- ✅ Easy to extend with more tags, filters, or sheet logic
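The tag filter and the append-or-update ("match on description") behavior can be sketched together. This is an illustrative simplification: the ticket shape stands in for the real Zendesk API response, and in the workflow the upsert is handled by the Google Sheets node's matching-column setting rather than custom code.

```javascript
// Keep only tickets carrying the "howto" tag.
function filterHowTo(tickets) {
  return tickets.filter((t) => (t.tags || []).includes("howto"));
}

// Mirror the sheet's append-or-update: match existing rows on
// description, replacing the row if found and appending otherwise.
function upsertByDescription(rows, ticket) {
  const i = rows.findIndex((r) => r.description === ticket.description);
  if (i === -1) rows.push(ticket);
  else rows[i] = ticket;
  return rows;
}

const howto = filterHowTo([
  { id: 1, tags: ["howto"], description: "How do I reset my password?" },
  { id: 2, tags: ["billing"], description: "Refund request" },
]);
```

Matching on description is what keeps a re-run of the workflow from duplicating rows for tickets it has already exported.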