by Jenny
Vector Database as a Big Data Analysis Tool for AI Agent Workflows, from the webinar "Build production-ready AI Agents with Qdrant and n8n". This series of workflows shows how to build big data analysis tools for production-ready AI agents with the help of vector databases. These pipelines are adaptable to any image dataset, and hence to many production use cases:

- Uploading (image) datasets to Qdrant
- Setting up meta-variables for anomaly detection in Qdrant
- Anomaly detection tool
- KNN classifier tool

For anomaly detection:

1. The first pipeline uploads an image dataset to Qdrant.
2. The second pipeline sets up the cluster (class) centres and cluster (class) threshold scores needed for anomaly detection.
3. The third pipeline is the anomaly detection tool itself: it takes any image as input and uses all the preparatory work done in Qdrant to decide whether the image is an anomaly relative to the uploaded dataset.

For KNN (k-nearest-neighbours) classification:

1. The first pipeline uploads an image dataset to Qdrant.
2. The second is the KNN classifier tool, which takes any image as input and classifies it against the dataset uploaded to Qdrant.

To recreate both, you'll have to upload the crops and lands datasets from Kaggle to your own Google Storage bucket, and re-create the API connections to Qdrant Cloud (a Free Tier cluster is enough), the Voyage AI API, and Google Cloud Storage.

[This workflow] Anomaly Detection Tool

This is the tool that can be used directly to detect anomalous images (crops). It takes (any) image URL as input and returns a text message saying whether whatever the image depicts is anomalous relative to the crop dataset stored in Qdrant. An image URL is received via the Execute Workflow Trigger and used to generate embedding vectors with the Voyage AI Embeddings API. The returned vectors are used to query the Qdrant collection and determine whether the given crop is known, by comparing the match against the threshold score of each image class (crop type). If the image scores lower than all thresholds, it is considered an anomaly for the dataset.
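Below is a minimal sketch of the threshold comparison at the heart of the tool, written as an n8n Code-node snippet. The cluster URL, the `crop_name` payload field, and the `class_thresholds` map are assumptions for illustration, not the published workflow's exact schema; the search endpoint follows Qdrant's REST API.

```javascript
// Sketch only: anomaly check against per-class thresholds stored earlier.
const QDRANT_URL = 'https://YOUR-CLUSTER.cloud.qdrant.io'; // placeholder
const COLLECTION = 'crops';

const embedding = $json.embedding;          // vector from the Voyage AI node
const thresholds = $json.class_thresholds;  // e.g. { wheat: 0.82, maize: 0.79 }

const res = await fetch(`${QDRANT_URL}/collections/${COLLECTION}/points/search`, {
  method: 'POST',
  headers: { 'api-key': 'YOUR_QDRANT_API_KEY', 'Content-Type': 'application/json' },
  body: JSON.stringify({ vector: embedding, limit: 1, with_payload: true }),
});
const { result } = await res.json();
const best = result[0]; // closest stored image

// The image is an anomaly when its best similarity score stays below
// the threshold of every class (crop type).
const isAnomaly = Object.values(thresholds).every(t => best.score < t);
return [{ json: { isAnomaly, bestScore: best.score, bestClass: best.payload?.crop_name } }];
```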
by Max Tkacz
Easily generate images with Black Forest's Flux Text-to-Image AI models using Hugging Face's Inference API. This template serves a webform where you can enter prompts and select predefined visual styles that are customizable with no-code. The workflow integrates seamlessly with Hugging Face's free tier, and it's easy to modify for any Text-to-Image model that supports API access.

Try it

Curious what this template does? Try a public version here: https://devrel.app.n8n.cloud/form/flux

Set Up

Watch this quick set up video 👇

Accounts required:

- Huggingface.co account (free)
- Cloudflare.com account (free - used for storage; but can be swapped easily, e.g. GDrive)

Key Features:

- **Text-to-Image Creation**: Generates unique visuals based on your prompt and style.
- **Hugging Face Integration**: Utilizes Hugging Face's Inference API for reliable image generation.
- **Customizable Visual Styles**: Select from preset styles or easily add your own.
- **Adaptable**: Swap in any Hugging Face Text-to-Image model that supports API calls.

Ideal for:

- **Creators**: Rapidly create visuals for projects.
- **Marketers**: Prototype campaign visuals.
- **Developers**: Test different AI image models effortlessly.

How It Works:

You submit an image prompt via the webform and select a visual style, which appends style instructions to your prompt. The Hugging Face Inference API then generates and returns the image, which gets hosted on Cloudflare S3. The workflow can be easily adjusted to use other models and styles for complete flexibility.
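For reference, the underlying API call is easy to reproduce outside n8n. Here is a hedged sketch in plain Node.js, assuming the FLUX.1-schnell model and an illustrative style suffix (the workflow appends your selected style to the prompt in the same way):

```javascript
import { writeFileSync } from 'node:fs';

// Hugging Face Inference API returns raw image bytes for text-to-image models.
const MODEL = 'black-forest-labs/FLUX.1-schnell'; // swap for any compatible model
const prompt = 'a lighthouse at dusk' + ', watercolor style'; // style appended to prompt

const res = await fetch(`https://api-inference.huggingface.co/models/${MODEL}`, {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.HF_TOKEN}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({ inputs: prompt }),
});

// The workflow uploads these bytes to Cloudflare S3; here we just save to disk.
writeFileSync('flux-output.png', Buffer.from(await res.arrayBuffer()));
```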
by Yaron Been
This workflow automatically monitors marketing job boards to identify growing companies and potential business opportunities. It saves you time by eliminating the need to manually check job listings and provides insights into which companies are actively hiring and expanding their marketing teams.

Overview

This workflow automatically scrapes marketing job listings from Indeed and other job boards to extract company information, job details, and growth indicators. It uses Bright Data to access job sites without being blocked and AI to intelligently parse job postings into structured data, then sends formatted email alerts to your marketing team.

Tools Used

- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping job boards without being blocked
- **OpenAI**: AI agent for intelligent job data extraction and parsing
- **Gmail**: For sending automated job alert emails to your team

How to Install

1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set Up OpenAI: Configure your OpenAI API credentials
4. Configure Gmail: Connect your Gmail account for sending notifications
5. Customize: Set your target job search parameters and email recipients

Use Cases

- **Business Development**: Identify rapidly growing companies for potential partnerships
- **Sales Teams**: Target companies actively hiring for sales outreach opportunities
- **Market Research**: Track hiring trends and identify emerging market players
- **Recruitment**: Monitor competitor hiring patterns and market opportunities

Connect with Me

- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #jobboards #marketingjobs #brightdata #webscraping #businessdevelopment #leadgeneration #companyresearch #jobmonitoring #n8nworkflow #workflow #nocode #jobautomation #marketresearch #growingcompanies #hiringtrends #salesleads #prospecting #jobscraping #indeed #recruitmentintel #businessintelligence #marketanalysis #companytracking #automatedalerts #emailnotifications #jobdata #hiringinsights #marketopportunities
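As an illustration of the parsing step, here is a hedged sketch of turning one scraped posting into structured JSON with the OpenAI Chat Completions API — the extracted field list is an assumption, not the workflow's exact schema:

```javascript
// Sketch only: parse one scraped job posting into structured JSON.
const posting = 'Senior Growth Marketer at Acme Corp — remote, $120k...'; // from Bright Data

const res = await fetch('https://api.openai.com/v1/chat/completions', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'gpt-4o-mini',
    response_format: { type: 'json_object' }, // force machine-readable output
    messages: [
      { role: 'system', content: 'Extract company, title, location, salary, and seniority as JSON.' },
      { role: 'user', content: posting },
    ],
  }),
});
const data = await res.json();
console.log(JSON.parse(data.choices[0].message.content));
```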
by Angel Menendez
CallForge - AI-Powered Product Insights Processor from Sales Calls

Automate product feedback extraction from AI-analyzed sales calls and store structured insights in Notion for data-driven product decisions.

🎯 Who is This For?

This workflow is designed for:

✅ Product managers tracking customer feedback and feature requests.
✅ Engineering teams identifying usability issues and AI/ML-related mentions.
✅ Customer success teams monitoring product pain points from real sales conversations.

It streamlines product intelligence gathering, ensuring customer insights are structured, categorized, and easily accessible in Notion for better decision-making.

🔍 What Problem Does This Workflow Solve?

Product teams often struggle to capture, categorize, and act on valuable feedback from sales calls. With CallForge, you can:

✔ Automatically extract and categorize product feedback from AI-analyzed sales calls.
✔ Track AI/ML-related mentions to gauge customer demand for AI-driven features.
✔ Identify feature requests and pain points for product development prioritization.
✔ Store structured feedback in Notion, reducing manual tracking and increasing visibility across teams.

This workflow eliminates manual feedback tracking, allowing product teams to focus on innovation and customer needs.

📌 Key Features & Workflow Steps

🎙️ AI-Powered Product Feedback Processing

This workflow processes AI-generated sales call insights and organizes them in Notion databases:

1. Triggers when AI sales call data is received.
2. Detects product-related feedback (feature requests, bug reports, usability issues).
3. Extracts key product insights, categorizing feedback based on customer needs.
4. Identifies AI/ML-related mentions, tracking customer interest in AI-driven solutions.
5. Aggregates feedback and categorizes it by sentiment (positive, neutral, negative).
6. Logs insights in Notion, making them accessible for product planning discussions.

📊 Notion Database Integration

- **Product Feedback** → Logs feature requests, usability issues, and bug reports.
- **AI Use Cases** → Tracks AI-related discussions and customer interest in machine learning solutions.

🛠 How to Set Up This Workflow

1. Prepare Your AI Call Analysis Data

- Ensure AI-generated sales call insights are available.
- Compatible with Gong, Fireflies.ai, Otter.ai, and other AI transcription tools.

2. Connect Your Notion Database

Set up Notion databases for:

🔹 Product Feedback (logs feature requests and bug reports).
🔹 AI Use Cases (tracks AI/ML mentions and customer demand).

3. Configure n8n API Integrations

- Connect your **Notion API key** in n8n under "Notion API Credentials."
- Set up **webhook triggers** to receive AI-generated sales insights.
- **Test the workflow** using a sample AI sales call analysis.

🔧 How to Customize This Workflow

💡 Modify Notion Data Structure – Adjust fields to align with your product team's workflow.
💡 Refine AI Data Processing Rules – Customize how feature requests and pain points are categorized.
💡 Integrate with Slack or Email – Notify teams when recurring product issues emerge.
💡 Expand with Project Management Tools – Sync insights with Jira, Trello, or Asana to create product tickets automatically.

⚙️ Key Nodes Used in This Workflow

🔹 If Nodes – Detect if product feedback, AI mentions, or feature requests exist in AI data.
🔹 Notion Nodes – Create and update structured feedback entries in Notion.
🔹 Split Out & Aggregate Nodes – Process multiple insights and consolidate AI-generated data.
🔹 Wait Nodes – Ensure smooth sequencing of API calls and database updates.
🚀 Why Use This Workflow?

✔ Eliminates manual sales call review for product teams.
✔ Provides structured, AI-driven insights for feature planning and prioritization.
✔ Tracks AI/ML mentions to assess demand for AI-powered solutions.
✔ Improves product development strategies by leveraging real customer insights.
✔ Scalable for teams using n8n Cloud or self-hosted deployments.

This workflow empowers product teams by transforming sales call data into actionable intelligence, optimizing feature planning, bug tracking, and AI/ML strategy. 🚀
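For readers wiring up something similar, here is a rough Code-node sketch of the If-node logic that decides whether a call contains product feedback or AI/ML mentions — the `insights` field names are assumptions, not CallForge's actual schema:

```javascript
// Illustrative filter for the "does this call contain product feedback?" branch.
const insights = $json.insights ?? [];

const feedback = insights.filter(i =>
  ['feature_request', 'bug_report', 'usability_issue'].includes(i.type));
const aiMentions = insights.filter(i => /\b(AI|ML|machine learning)\b/i.test(i.text));

// Downstream If nodes branch on these booleans; Notion nodes log the arrays.
return [{ json: {
  hasFeedback: feedback.length > 0,
  hasAiMentions: aiMentions.length > 0,
  feedback,
  aiMentions,
}}];
```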
by Ludwig
Using PostBin to Test Webhooks Without Changing WEBHOOK_URL

How it Works

Many new n8n users struggle with testing webhooks when running n8n on localhost, as external services cannot reach localhost. This workflow introduces a technique using PostBin, which provides a temporary, publicly accessible URL to receive webhook requests. The workflow:

- Generates a temporary webhook endpoint via PostBin.
- Uses this endpoint in place of localhost to test webhooks.
- Captures and displays the incoming webhook request data.
- Enables debugging and iterating without modifying the WEBHOOK_URL environment variable.

Set Up Steps

**Estimated time:** ~5–10 minutes

1. Create a PostBin instance to generate a publicly accessible webhook URL.
2. Copy the PostBin URL and use it as the webhook destination in n8n.
3. Trigger the webhook from an external service or manually.
4. Inspect the request payload in PostBin to verify data reception.

(EXAMPLE) Using PostBin for Webhook Testing in a BambooHR Integration

How it Works

In this example, we apply the PostBin technique to a BambooHR integration. Instead of manually configuring a webhook in BambooHR, this workflow automates webhook registration using the BambooHR API. The workflow:

- Uses the BambooHR API to programmatically register the PostBin URL as a webhook.
- Retrieves the most recent webhook calls made by BambooHR to the PostBin URL.
- (Optional) Sends a personalized Slack message for new employees using OpenAI.

Set Up Steps

**Estimated time:** ~15–20 minutes

1. Set up PostBin using the steps from the first section.
2. Log into BambooHR to generate an API key for authentication.
3. Run the workflow to register the PostBin URL as a webhook in BambooHR via the API.
4. Retrieve recent webhook calls from PostBin to validate data reception.
5. (Optional) Send a Slack notification using the processed data.
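The two PostBin calls at the heart of the technique, sketched as plain fetch requests — endpoint paths follow PostBin's documented API, and in the workflow they live in HTTP Request nodes:

```javascript
// Sketch: create a bin, then inspect what external services sent to it.
const base = 'https://www.postb.in/api';

// 1. Create a bin — its public URL stands in for your localhost webhook.
const { binId } = await (await fetch(`${base}/bin`, { method: 'POST' })).json();
const publicUrl = `https://www.postb.in/${binId}`;
console.log('Register this URL as the webhook target:', publicUrl);

// 2. Later, pop the oldest captured request to inspect its payload.
const req = await (await fetch(`${base}/bin/${binId}/req/shift`)).json();
console.log('Captured webhook body:', req.body);
```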
by Didac Fernandez
🤖 Autonomous Email Assistant - AI-Powered Inbox Management

> Transform Your Email Workflow with Intelligent Automation

This advanced n8n workflow creates a fully autonomous email assistant that processes incoming emails through AI-powered classification, generates contextually-aware responses in your personal brand voice, and automatically organizes your inbox.

Perfect for: Professionals managing high email volumes who want to maintain response quality while saving hours each week.

🎯 What This Workflow Does

The Autonomous Email Assistant monitors your Outlook inbox and intelligently processes every incoming email through a sophisticated multi-stage pipeline:

- 🏷️ Smart Classification - Automatically categorizes emails into 7 distinct types (Commercial/Spam, Internal, Meeting, Newsletter, Notifications, Urgent, Other)
- ✍️ AI Response Generation - Creates draft responses tailored to the email type, maintaining your unique communication style
- 📅 Meeting Automation - Checks your calendar availability and handles meeting requests automatically
- ⚡ Priority Handling - Sends Slack notifications for urgent emails requiring immediate attention
- 📂 Inbox Organization - Files processed emails into categorized folders with AI tagging
- 📊 Comprehensive Logging - Records all processed emails and responses in Excel for audit trails

✨ Key Features

🔍 Dual Classification System

- Primary LLM classifier for fast categorization
- Secondary text classifier for validation
- 7 predefined categories with smart routing logic

🎨 Brand Voice Integration

- Maintains consistent communication style across all responses
- Customizable writing patterns and key phrases
- Professional tone with configurable formality levels

📆 Intelligent Meeting Handler

- Calendar integration with availability checking
- Automatic event creation for confirmed meetings
- Suggests alternative times when unavailable
- Maintains 15-minute buffers between meetings
- Respects working hours (8:30 AM - 5:00 PM)

👤 Human-in-the-Loop for Critical Emails

- Slack notifications for urgent messages
- Approval workflow with feedback incorporation
- Draft responses for review before sending

📥 Complete Inbox Management

- Auto-marking as read
- AI category tagging for tracking
- Organized folder archiving by email type
- Excel logging for analytics and compliance

🛠️ Workflow Requirements

🔐 Required Credentials

- **Microsoft Outlook OAuth2** - Email access, calendar permissions
- **Microsoft Excel 365** - For logging workbook
- **OpenRouter API** - GPT-5-mini model recommended
- **Slack OAuth2** - Optional, for urgent notifications

💻 Technical Stack

| Component | Technology |
|-----------|-----------|
| AI Model | OpenRouter GPT-5-mini |
| Email Provider | Microsoft Outlook |
| Data Storage | Microsoft Excel 365 |
| Notifications | Slack |
| Polling Interval | Every minute (configurable) |

⚙️ How It Works

Stage 1️⃣: Email Ingestion
Microsoft Outlook Trigger monitors inbox → Information Extractor pulls sender details

Stage 2️⃣: Classification
Dual AI classifiers determine email category → Routes to appropriate handler

Stage 3️⃣: Response Generation

- **General emails** → emailReplier
- **Meeting requests** → AI Agent with calendar tools
- **Urgent emails** → urgentReplier + Slack notification
- **Others** → Context-aware handler

Stage 4️⃣: Brand Voice Application
All responses pass through brand voice nodes for style consistency

Stage 5️⃣: Organization

- ✅ Mark as read
- 🏷️ Apply AI category tag
- 📁 Archive to appropriate folder
- 📝 Log to Excel

🎛️ Customization Options

📋 Adjust Classification Categories
Modify the Virtual Postman categories to match your specific needs. Add industry-specific classifications or merge existing ones.

✏️ Personalize Brand Voice
The embedded brand voice prompts can be completely customized:

- Update key phrases and sign-offs
- Adjust sentence length preferences
- Modify formality and tone
- Add company-specific terminology

⚙️ Configure Response Behaviors

- Change meeting scheduling preferences
- Update working hours
- Modify urgent email criteria
- Adjust buffer times between meetings

🔔 Notification Preferences

- Switch Slack to email notifications
- Add multiple notification channels
- Customize urgency thresholds

💼 Use Cases

| Role | Benefits |
|------|----------|
| 🎯 Busy Executives | Handle routine correspondence while maintaining personal touch |
| 🎧 Customer Support | First-line response generation with consistent brand voice |
| 💰 Sales Teams | Automated meeting scheduling and follow-up management |
| 📊 Project Managers | Internal communication routing and priority handling |
| 💡 Consultants | Client communication management across multiple projects |

🚀 Setup Guide

1. Import Workflow - Import the JSON into your n8n instance
2. Configure Credentials - Add all four required OAuth2 connections
3. Create Excel Workbook - Set up "Email Automator" workbook with specified columns
4. Create Outlook Folders - Add the 7 category folders to your Outlook
5. Customize Brand Voice - Update the brand voice prompts with your writing style
6. Test Classification - Send test emails to verify category routing
7. Activate Workflow - Enable the workflow to start processing

⚠️ Important Notes

- ⚡ All urgent emails require human approval before sending
- 📝 Most responses are saved as drafts for review
- 📊 Comprehensive Excel logging enables quality assurance
- 🏷️ AI tagging allows easy identification of automated processing
- 📅 Calendar integration respects existing commitments

🔒 Data Privacy & Security

This workflow processes emails locally within your n8n instance. Email content is sent to OpenRouter for AI processing. Review OpenRouter's data policies and ensure compliance with your organization's data handling requirements.

📜 Version History

v1.0 - Initial Release

- 7-category classification system
- Brand voice integration
- Meeting automation
- Excel logging
- Slack notifications

💬 Support & Community

For questions, customization help, or to share improvements, visit the n8n community forum. This workflow is designed to be highly customizable - adapt it to your specific needs!

Created by: Didac Fernandez Girona | AutoSolutions.ai - AI Consulting Services

Tags: email automation, AI assistant, outlook, calendar management, brand voice, inbox organization, meeting scheduler
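As a rough sketch of how Stage 2 feeds Stage 3, here is the routing logic in plain JavaScript — the handler functions are placeholders standing in for the workflow's branches, not its actual nodes:

```javascript
// Illustrative only — the real workflow routes via n8n classifier and
// switch-style nodes; this shows the same decision logic in plain JS.
const CATEGORIES = ['Commercial/Spam', 'Internal', 'Meeting', 'Newsletter',
                    'Notifications', 'Urgent', 'Other'];

// Placeholder handlers standing in for the workflow's branches.
const handleMeeting = (e) => ({ action: 'calendar check + draft reply', email: e });
const handleUrgent  = (e) => ({ action: 'draft reply + Slack alert', email: e });
const archive       = (e, c) => ({ action: `file to "${c}" folder`, email: e });
const draftReply    = (e, c) => ({ action: `brand-voice draft (${c})`, email: e });

function route(email, category) {
  if (!CATEGORIES.includes(category)) category = 'Other'; // classifier fallback
  switch (category) {
    case 'Meeting': return handleMeeting(email);
    case 'Urgent':  return handleUrgent(email);
    case 'Commercial/Spam':
    case 'Newsletter':
    case 'Notifications':
      return archive(email, category); // no reply, just file + tag
    default:
      return draftReply(email, category);
  }
}

console.log(route({ subject: 'Sync next week?' }, 'Meeting'));
```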
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically monitors local event platforms (Eventbrite, Meetup, Facebook Events) and aggregates upcoming events that match your criteria. Never miss a networking or sponsorship opportunity again.

Overview

A scheduled trigger scrapes multiple event sites via Bright Data, filtering by location, date range, and keywords. OpenAI classifies each event (conference, meetup, workshop) and extracts key details such as venue, organizers, and ticket price. Updates are posted to Slack and archived in Airtable for quick lookup.

Tools Used

- **n8n** – Core automation engine
- **Bright Data** – Reliable multi-site scraping
- **OpenAI** – NLP-based event categorization
- **Slack** – Delivers daily event digests
- **Airtable** – Stores enriched event records

How to Install

1. Import the Workflow: Add the .json file to n8n.
2. Configure Bright Data: Provide your account credentials.
3. Set Up OpenAI: Insert your API key.
4. Connect Slack & Airtable: Authorize both services.
5. Customize Filters: Edit the initial Set node to adjust city, radius, and keywords.

Use Cases

- **Community Managers**: Curate a calendar of relevant events.
- **Sales Teams**: Identify trade shows and meetups for prospecting.
- **Event Planners**: Track competing events when choosing dates.
- **Marketers**: Spot speaking or sponsorship opportunities.

Connect with Me

- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #eventmonitoring #brightdata #openscraping #openai #slackalerts #n8nworkflow #nocode #meetup #eventbrite
by dirogar
Telegram Tasker Bot is an n8n scenario that receives voice messages in Telegram, automatically turns them into text, extracts the key task fields from that text, and creates a card on the desired Trello board. The user simply speaks the task — the bot formats it and replies with a link to the finished card.

To use it you will need a Telegram bot, which can be created via the BotFather bot.

You will also need access to the ChatGPT API — it is used only for transcribing the audio into text. You can substitute any other transcription service of your choice.

Finally, you need a Trello account with API access.

Note! The Trello board ID can be taken from its URL. The ID of a column (list) on the Trello board can be obtained via the browser developer tools (at least, that is how I got it).
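For reference, here is the card-creation call the bot makes against the Trello REST API, sketched in plain JavaScript — the key, token, list ID, and task fields are placeholders you supply:

```javascript
// Sketch: create a Trello card from the extracted task fields.
const params = new URLSearchParams({
  key: process.env.TRELLO_KEY,
  token: process.env.TRELLO_TOKEN,
  idList: 'YOUR_LIST_ID',           // column ID, e.g. from browser dev tools
  name: 'Call the supplier',         // task title extracted from the voice note
  desc: 'Transcribed details of the task go here',
});

const res = await fetch(`https://api.trello.com/1/cards?${params}`, { method: 'POST' });
const card = await res.json();
console.log('Card URL to send back to the Telegram user:', card.shortUrl);
```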
by Harshil Agrawal
This workflow allows you to send position updates of the ISS every minute to a table in Google BigQuery.

- Cron node: The Cron node will trigger the workflow every minute.
- HTTP Request node: This node will make a GET request to the API https://api.wheretheiss.at/v1/satellites/25544/positions to fetch the position of the ISS. This information gets passed on to the next node in the workflow.
- Set node: We will use the Set node to ensure that only the data that we set in this node gets passed on to the next nodes in the workflow.
- Google BigQuery: This node will send the data from the previous node to the position table in Google BigQuery. If you have created a table with a different name, use that table instead.
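To make the Set node's job concrete, here is an equivalent Code-node sketch that keeps only the fields destined for the position table — the field names follow the wheretheiss.at response; adjust them to match your table's columns:

```javascript
// Keep only the columns the BigQuery node should receive.
// The API returns an array of position objects; handle both shapes.
const p = Array.isArray($json) ? $json[0] : $json;
return [{ json: {
  name: p.name,           // "iss"
  latitude: p.latitude,
  longitude: p.longitude,
  timestamp: p.timestamp, // Unix seconds
}}];
```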
by n8n Team
This n8n workflow automates the handling of security detections from CrowdStrike, streamlining incident response and notification processes.

The workflow is triggered daily at midnight by the Schedule Trigger node. It begins by fetching recent security detections from CrowdStrike using an HTTP Request node. The response is then split into individual detections for further processing. Each detection is enriched by querying the CrowdStrike API for detailed information using another HTTP Request node. The workflow then processes these detections sequentially using the Split In Batches node.

Next, it looks up behavioral information associated with each detection in VirusTotal using two HTTP Request nodes. One node queries VirusTotal based on SHA256 values, and the other based on IOC (Indicator of Compromise) values. The workflow includes a 1-second pause using the Wait node to prevent rate limiting when making requests to the VirusTotal API.

Subsequently, the workflow sets fields with relevant details from both CrowdStrike and VirusTotal, including detection links, confidence scores, filenames, usernames, and more. These details are concatenated using an Item Lists node for each detection. The final step involves creating Jira issues for each detection, with summaries that include CrowdStrike alert severity and hostnames, and descriptions that incorporate information from CrowdStrike and VirusTotal. Information about each issue is then sent via a Slack message to a Slack user.

Potential issues during setup might include configuring the Schedule Trigger node to trigger in the correct time zone and handling potential rate limiting from the VirusTotal API, which could lead to throttled requests. Additionally, the note about a possible typo in the URL of the VirusTotal nodes should be addressed to ensure correct API calls. The Jira node may need to be replaced with the latest version for compatibility. Properly configuring API credentials and handling errors that may occur during API requests are essential for smooth workflow operation. Careful testing with sample data is recommended to validate the workflow's functionality and ensure it aligns with your organization's security incident response processes.
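Here is a hedged sketch of the VirusTotal enrichment loop with its 1-second pause, shown as plain JavaScript rather than the workflow's HTTP Request and Wait nodes — the `sha256` field name on the detection objects is an assumption:

```javascript
// Sketch: enrich each detection via VirusTotal's v3 file-lookup endpoint,
// pausing between calls to stay under the API's rate limits.
const sleep = ms => new Promise(r => setTimeout(r, ms));

async function enrich(detections) {
  const out = [];
  for (const d of detections) {
    const res = await fetch(`https://www.virustotal.com/api/v3/files/${d.sha256}`, {
      headers: { 'x-apikey': process.env.VT_API_KEY },
    });
    out.push({ ...d, virustotal: await res.json() });
    await sleep(1000); // mirrors the Wait node: avoid VirusTotal throttling
  }
  return out;
}
```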
by Robert Breen
Create multi-sheet Excel workbooks in n8n to automate reporting using Google Drive + Google Sheets

Build an automated Excel file with multiple tabs directly in n8n. Two Code nodes generate datasets, each is converted into its own Excel worksheet, then combined into a single .xlsx and (optionally) appended to a Google Sheet for sharing—eliminating manual copy-paste and speeding up reporting.

Who's it for

- Teams that publish recurring reports as Excel with multiple tabs
- Ops/Marketing/Data folks who want a no-code/low-code way to package JSON into Excel
- n8n beginners learning the Code → Convert to File → Merge pattern

How it works

1. Manual Trigger starts the run.
2. Code nodes emit JSON rows for each table (e.g., People, Locations) — see the sketch after this description.
3. Convert to File nodes turn each JSON list into an Excel binary, assigning Sheet1/Sheet2 (or your names).
4. Merge combines both binaries into a single Excel workbook with multiple tabs.
5. Google Sheets (optional) appends the JSON rows to a live spreadsheet for collaboration.

Setup (only 2 connections)

1️⃣ Connect Google Sheets (OAuth2)

- In n8n → Credentials → New → Google Sheets (OAuth2)
- Sign in with your Google account and grant access
- Copy the example sheet referenced in the Google Sheets node (open the node and duplicate the linked sheet), or select your own
- In the workflow's Google Sheets node, select your Spreadsheet and Worksheet
- https://docs.google.com/spreadsheets/d/1G6FSm3VdMZt6VubM6g8j0mFw59iEw9npJE0upxj3Y6k/edit?gid=1978181834#gid=1978181834

2️⃣ Connect Google Drive (OAuth2)

- In n8n → Credentials → New → Google Drive (OAuth2)
- Sign in with the Google account that will store your Excel outputs and allow access
- In your Drive-related nodes (if used), point to the folder where you want the .xlsx saved or retrieved

Customize the workflow

- Replace the sample arrays in the Code nodes with your data (APIs, DBs, CSVs, etc.)
- Rename sheetName in each Convert to File node to match your desired tab names
- Keep the Merge node in Combine All mode to produce a single workbook
- In Google Sheets, switch to Manual mapping for strict column order (optional)

Best practices (per template guidelines)

- **Rename nodes** to clear, action-oriented names (e.g., "Build People Sheet", "Build Locations Sheet")
- Add a yellow Sticky Note at the top with this description so users see setup in-workflow
- **Do not hardcode credentials** inside HTTP nodes; always use n8n Credentials
- Remove personal IDs/links before publishing

Sticky Note (copy-paste)

> Multi-Tab Excel Builder (Google Drive + Google Sheets)
> This workflow generates two datasets (Code → JSON), converts each to an Excel sheet, merges them into a single workbook with multiple tabs, and optionally appends rows to Google Sheets.
>
> Setup (2 connections):
> 1) Google Sheets (OAuth2): Create credentials → duplicate/select your target spreadsheet → set Spreadsheet + Worksheet in the node.
> 2) Google Drive (OAuth2): Create credentials → choose the folder for storing/retrieving the .xlsx.
>
> Customize: Edit the Code nodes' arrays, rename tab names in Convert to File, and adjust the Sheets node mapping as needed.

Troubleshooting

- **Missing columns / wrong order:** Use **Manual mapping** in the Google Sheets node
- **Binary not found:** Ensure each **Convert to File** node's binaryPropertyName matches what **Merge** expects
- **Permissions errors:** Re-authorize Google credentials; confirm you have edit access to the target Sheet/Drive folder

📬 Contact

Need help customizing this (e.g., filtering by campaign, sending reports by email, or formatting your PDF)?

📧 rbreen@ynteractive.com
🔗 https://www.linkedin.com/in/robert-breen-29429625/
🌐 https://ynteractive.com
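A minimal sketch of what each dataset Code node returns, as referenced in "How it works" — the columns are samples only; swap in your real data source:

```javascript
// n8n Code node: emit one item per spreadsheet row.
// These sample rows stand in for your real data source.
const people = [
  { name: 'Ada',   role: 'Engineer', location: 'London' },
  { name: 'Grace', role: 'Analyst',  location: 'Washington' },
];
// Each { json: ... } item becomes one row on this workbook tab.
return people.map(p => ({ json: p }));
```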
by Rahul Joshi
📊 Description

Automatically track SDK releases from GitHub, compare documentation freshness in Notion, and send Slack alerts when docs lag behind. This workflow ensures documentation stays in sync with releases, improves visibility, and reduces version drift across teams. 🚀📚💬

What This Template Does

- Step 1: Listens to GitHub repository events to detect new SDK releases. 🧩
- Step 2: Fetches release metadata including version, tag, and publish date. 📦
- Step 3: Logs release data into Google Sheets for record-keeping and analysis. 📊
- Step 4: Retrieves FAQ or documentation data from Notion. 📚
- Step 5: Merges GitHub and Notion data to calculate documentation drift. 🔍
- Step 6: Flags SDKs whose documentation is over 30 days out of date. ⚠️
- Step 7: Sends detailed Slack alerts to notify responsible teams. 🔔

Key Benefits

✅ Keeps SDK documentation aligned with product releases
✅ Prevents outdated information from reaching users
✅ Provides centralized release tracking in Google Sheets
✅ Sends real-time Slack alerts for overdue updates
✅ Strengthens DevRel and developer experience operations

Features

- GitHub release trigger for real-time monitoring
- Google Sheets logging for tracking and auditing
- Notion database integration for documentation comparison
- Automated drift calculation (days since last update)
- Slack notifications for overdue documentation

Requirements

- GitHub OAuth2 credentials
- Notion API credentials
- Google Sheets OAuth2 credentials
- Slack Bot token with chat:write permissions

Target Audience

- Developer Relations (DevRel) and SDK engineering teams
- Product documentation and technical writing teams
- Project managers tracking SDK and doc release parity

Step-by-Step Setup Instructions

1. Connect your GitHub account and select your SDK repository.
2. Replace YOUR_GOOGLE_SHEET_ID and YOUR_SHEET_GID with your tracking spreadsheet.
3. Add your Notion FAQ database ID.
4. Configure your Slack channel ID for alerts.
5. Run once manually to validate setup, then enable automation.
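To illustrate the drift calculation in Step 5, here is a hedged Code-node sketch: GitHub release payloads do expose `published_at` and Notion pages expose `last_edited_time`, but the merged item shape shown here is an assumption, not this template's exact schema:

```javascript
// Sketch only: days the docs lag behind the latest release.
const releaseDate = new Date($json.release.published_at);     // from GitHub
const docUpdated  = new Date($json.notion.last_edited_time);  // from Notion

const driftDays = Math.floor((releaseDate - docUpdated) / 86_400_000); // ms per day
const overdue = driftDays > 30; // Step 6: flag docs >30 days out of date

return [{ json: { sdk: $json.release.tag_name, driftDays, overdue } }];
```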