by Ranjan Dailata
**Description**

This workflow automates scraping the latest discussions from HackerNews, transforming raw threads into human-readable content with Google Gemini, and exporting the final content to a well-formatted Google Doc.

**Overview**

This n8n workflow extracts trending posts from the HackerNews API. It loops through each item, performs HTTP data extraction, uses Google Gemini to generate human-readable insights, and then exports the enriched content to Google Docs for distribution, archiving, or content creation.

**Who this workflow is for**

- **Tech Newsletter Writers**: Automate the collection and summarization of trending HackerNews posts for inclusion in weekly or daily newsletters.
- **Content Creators & Bloggers**: Quickly generate structured summaries and insights from HackerNews threads to use as inspiration or supporting content for blog posts, videos, or social media.
- **Startup Founders & Product Builders**: Monitor HackerNews for discussions relevant to your niche or competitors, and keep a pulse on the community's opinions.
- **Investors & Analysts**: Surface early signals from the tech ecosystem by identifying what's trending and how the community is reacting.
- **Researchers & Students**: Analyze popular discussions and emerging trends in technology, programming, and startups, enriched with AI-generated insights.
- **Digital Agencies & Consultants**: Offer HackerNews monitoring and insight reports as a value-added service to clients interested in the tech space.

**Tools Used**

- **n8n**: The core automation engine that manages the trigger, transformation, and export.
- **HackerNews API**: Provides access to trending or new HN posts.
- **Google Gemini**: Enriches HackerNews content with structured insights and human-like summaries.
- **Google Docs**: Automatically creates and updates a document with the enriched content, ready for sharing or publishing.

**How to Install**

1. **Import the Workflow**: Download the .json file and import it into your n8n instance.
2. **Set Up HackerNews Source**: Choose whether to use the HN API (via an HTTP Request node) or an RSS Feed node.
3. **Configure Gemini API**: Add your Google Gemini API key and design the prompt to extract pros/cons, key themes, or insights.
4. **Set Up Google Docs Integration**: Connect your Google account and configure the Google Docs node to create/update a document.
5. **Test and Deploy**: Run a test job to ensure data flows correctly and outputs are formatted as expected.

**Use Cases**

- **Tech Newsletter Authors**: Generate ready-to-use summaries of trending HackerNews threads.
- **Startup Founders**: Stay informed on key discussions, product launches, and community feedback.
- **Investors & Analysts**: Spot early trends, technical insights, and startup momentum directly from HN.
- **Researchers**: Track community reactions to new technologies or frameworks.
- **Content Creators**: Use the enriched data to spark blog posts, YouTube scripts, or LinkedIn updates.

**Connect with Me**

- Email: ranjancse@gmail.com
- LinkedIn: https://www.linkedin.com/in/ranjan-dailata/
- Get Bright Data: Bright Data (supports free workflows with a small commission)

#n8n #automation #hackernews #contentcuration #aiwriting #geminiapi #googlegemini #techtrends #newsletterautomation #googleworkspace #rssautomation #nocode #structureddata #webscraping #contentautomation #hninsights #aiworkflow #googleintegration #webmonitoring #hnnews #aiassistant #gdocs #automationtools #gptlike #geminiwriter
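As an illustration of the extraction step, HackerNews' public Firebase API (`https://hacker-news.firebaseio.com/v0`) exposes `topstories.json` and `item/{id}.json` endpoints that can feed the per-item loop described in the Overview. This is a minimal sketch (requires Node 18+ for the global `fetch`), not the template's actual HTTP Request node configuration; the Gemini and Google Docs steps live in later nodes:

```javascript
// Fetch the current top stories and reduce each item to the fields a
// summarization prompt needs. Endpoints are HackerNews' documented Firebase API.
const HN = 'https://hacker-news.firebaseio.com/v0';

function toPromptItem(item) {
  // "Ask/Show HN" posts have no external URL, so fall back to the thread link.
  return {
    title: item.title,
    url: item.url ?? `https://news.ycombinator.com/item?id=${item.id}`,
    score: item.score,
    comments: item.descendants ?? 0,
  };
}

async function fetchTopStories(limit = 5) {
  const ids = await fetch(`${HN}/topstories.json`).then(r => r.json());
  const items = await Promise.all(
    ids.slice(0, limit).map(id => fetch(`${HN}/item/${id}.json`).then(r => r.json()))
  );
  return items.map(toPromptItem);
}
```

Calling `fetchTopStories(10)` would return an array ready to interpolate into the Gemini prompt.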
by Automate With Marc
📱 Veo3 Instagram Agent – Create & Auto-Post Reels with Blotato

**Description:**

This no-code workflow automates the full pipeline of generating and publishing Instagram Reels using Veo3 (via the Wavespeed API). From prompt to post, it handles content ideation, short-form video generation, caption writing, logging, and even automatic publishing to Instagram via Blotato. Perfect for creators, brands, and marketers who want to scale content creation without needing to shoot or edit videos manually.

🔗 Watch the full step-by-step tutorial on how to build this workflow: https://youtu.be/s-KzxeKWmIA?si=6x8WKMeiyWodZWVq

Google Sheet Template: https://docs.google.com/spreadsheets/d/1bA-PQTrvekC1Rti-XumGANgjIwLjvcFCqoIxVCYsq2E/edit?usp=sharing

🚀 What This Workflow Does:

1. **Trigger via Chat or Telegram**: Start with a simple message like: "Make a reel for a luxury minimalist candle brand using calm aesthetics."
2. **AI Video Prompt Generation**: Uses OpenAI to craft a visually rich, platform-optimized video description prompt.
3. 🎞️ **Video Creation with Veo3 API**: Submits your prompt to Veo3 to create a short video (9:16 ratio, 8 seconds) with motion, tone, and trend styles.
4. ✍️ **Caption Writing**: An AI agent writes an engaging, playful caption based on the video content.
5. 📄 **Google Sheets Logging**: Stores the prompt, video URL, caption, and status in a Google Sheet to keep track of all generated assets.
6. 📤 **Auto-Publish to Instagram**: Posts the video and caption directly to Instagram using Blotato's social media publishing API.

🔌 Tools & Integrations Used:

- OpenAI for prompt & caption generation
- Wavespeed API (Veo3) for video generation
- Google Sheets for tracking
- Blotato for scheduling & publishing content
- n8n for orchestration and automation logic

💡 Use Cases:

- Content calendar automation for small teams
- Trend-based ad creation and testing
- UGC-style reel generation for e-commerce
- Rapid ideation & creative experimentation
by Michael Yang
Who is this template for?

This workflow is perfect for competitive-intel analysts, product managers, content marketers, and anyone who tracks multiple company blogs or news sources. If you need a weekly snapshot of fresh, on-topic articles—without wading through dozens of tabs—this template is for you.

What does it do?

The workflow reads a curated list of candidate URLs from Google Sheets, filters out duplicates and off-topic pages with an AI agent, scrapes the surviving links, generates three-sentence summaries, logs the results back to Sheets, and delivers a polished HTML digest to your inbox every week.

Why is it useful?

Instead of manually opening competitor links, checking for relevance, copying highlights, and pasting them into reports, this automation does the grunt work for you. It turns scattered URLs into a searchable knowledge base and a ready-to-share email, freeing you to focus on insights and strategy—not housekeeping.

How does it work?

A Sunday-morning cron trigger kicks things off. The workflow pulls links from the Input Links tab, compares them to the existing Summary tab, and passes fresh candidates to an AI "bouncer" that keeps only blog posts, tutorials, news, and product updates. Firecrawl then scrapes each page; Gemini 2.5-Flash and OpenAI condense the content into title, author, date, and summary. The structured data is appended to your Summary sheet and formatted into a company-grouped HTML digest, which lands in your email before the workweek starts.

Set up steps

1. **Clone the workflow**: Import the JSON into your n8n Cloud workspace.
2. **Create the Google Sheet**: Make a new spreadsheet with two tabs: Input Links and Summary (names must match). In Input Links, add columns Company, Page Type, and Link (or rename to match the node mapping). Leave Summary blank—the workflow will populate it. Copy the Sheet URL; you'll paste it into two Google Sheets nodes.
3. **Add credentials (n8n ▸ Credentials)**: Google Sheets OAuth2 (authorise with the Google account that owns the spreadsheet), Gmail OAuth2 (authorise the Gmail account that should send the digest), and Firecrawl HTTP Header Auth (set Authorization: Bearer <YOUR_FIRECRAWL_API_KEY>).
4. **Point nodes to your Sheet**: Open each Google Sheets node (Input Links, Read_Url_Summary_Tool, Append row in sheet, Get row(s) in sheet). Paste the Document ID (found in the Sheet URL) and select the correct tab (Input Links or Summary).
5. **Update email recipients**: In the Send a message (Gmail) node, replace the sample addresses with your own distribution list.
6. **Adjust scheduling (optional)**: Double-click the Schedule Trigger node and change the cron expression if you prefer a different day/time.
7. **Tune AI models (optional)**: The OpenAI o4-mini and Gemini 2.5-Flash nodes default to cost-efficient settings. Feel free to switch models or tweak temperature to suit your tone.
8. **Test with a single URL**: Add one row in Input Links, then execute the workflow manually (▶ Run). Verify that a new row appears in Summary and an email lands in your inbox.
9. **Go live**: Activate the workflow (toggle in the top bar). Confirm the green status badge and wait for the next scheduled run.

Tip: The Firecrawl Free tier limits you to ~10 requests/min. If you scale beyond that, raise the batching interval in both Firecrawl nodes or upgrade your Firecrawl plan.
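For illustration, the company-grouped HTML digest step can be sketched in an n8n Code node roughly like this. The row field names (`company`, `link`, `title`, `summary`) are assumptions; match them to the actual columns of your Summary tab:

```javascript
// Group summary rows by company and render a simple HTML digest body.
// This is a sketch of the formatting step, not the template's exact node code.
function buildDigest(rows) {
  const byCompany = {};
  for (const row of rows) {
    (byCompany[row.company] ??= []).push(row);
  }
  let html = '<h1>Weekly Digest</h1>';
  for (const [company, posts] of Object.entries(byCompany)) {
    html += `<h2>${company}</h2><ul>`;
    for (const p of posts) {
      html += `<li><a href="${p.link}">${p.title}</a>: ${p.summary}</li>`;
    }
    html += '</ul>';
  }
  return html;
}
```

The resulting string can be passed straight into the Gmail node's HTML message field.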
by Evozard
This workflow syncs Shopify customers into Odoo as contacts.

1. **Trigger: Shopify – New Customer Created.** The workflow starts when a new customer is added in Shopify.
2. **Action: Odoo – Search Contact by Email.** Odoo is checked for an existing contact with the same email address as the Shopify customer.
3. **Condition: Email Match Check.** If a contact with the same email is found, the workflow ends (no duplicate contact is created). If no match is found, the workflow proceeds to the next step.
4. **Action: Odoo – Create New Contact.** A new contact is created in Odoo using whichever of the customer's details are available: full name, email address, phone number, and full address.
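The search-then-create logic above can be sketched as follows. The `odoo.search` / `odoo.create` helpers are hypothetical stand-ins for the n8n Odoo node operations, not a real client API; field names mirror Odoo's `res.partner` model and Shopify's customer payload:

```javascript
// Look up an Odoo contact by email; create one only when nothing matches,
// so repeated Shopify webhooks never produce duplicate contacts.
async function upsertContact(odoo, shopifyCustomer) {
  const email = shopifyCustomer.email?.toLowerCase();
  const matches = await odoo.search('res.partner', [['email', '=', email]]);
  if (matches.length > 0) {
    return { created: false, id: matches[0] }; // duplicate: stop here
  }
  const id = await odoo.create('res.partner', {
    name: `${shopifyCustomer.first_name} ${shopifyCustomer.last_name}`.trim(),
    email,
    phone: shopifyCustomer.phone,
    street: shopifyCustomer.default_address?.address1,
  });
  return { created: true, id };
}
```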
by Davide
📩🤖 This workflow automatically processes emails received in Gmail, extracts their attachments, and organizes them into sender-specific folders in Google Drive.

Note: The workflow avoids duplicate folders by checking for an existing folder before creating one.

Benefits:

- ✅ **Automated Organization**: No need to manually sort or download email attachments.
- 📁 **Sender-based Categorization**: Files are stored in clearly labeled folders per sender, improving traceability and reducing clutter.
- ⏱ **Time-saving**: Reduces repetitive administrative tasks by automating the workflow end-to-end.
- 🔁 **Modular and Scalable**: Can be easily extended or reused with other services (e.g., Dropbox, S3) or integrated into larger document workflows.
- 🔐 **Secure Cloud Storage**: Attachments are safely backed up in Google Drive, minimizing the risk of data loss from email.

How It Works

1. **Trigger**: The workflow can be triggered manually ("When clicking 'Execute workflow'") or automatically (via a Gmail Trigger polling for emails every minute).
2. **Email Processing**: Fetches emails with attachments from Gmail within a date range (default: July 6–9, 2025). An IF node checks whether each email contains attachments.
3. **Folder Management**: Searches Google Drive for a folder named after the sender's email address (under the parent folder "Email Attachments") and creates the folder if it doesn't exist.
4. **Attachment Handling**: Splits out binary attachments, extracts filenames, and uploads each file to the sender's dedicated folder in Google Drive.
5. **Sub-Workflow Execution**: Uses Execute Workflow to modularize the upload process (reusable in other workflows).

Set Up Steps

1. **Google Services**: Connect the Gmail and Google Drive nodes to your accounts via OAuth2. Ensure the parent folder "Email Attachments" (ID: 1EitwWVd5rKZTlvOreB4R-6xxxxxx) exists in Google Drive.
2. **Adjust Date Range**: Modify receivedAfter/receivedBefore in the Get emails node to target specific emails.
3. **Test**: Run manually to verify folder creation and attachment uploads.
4. **Activate Automation**: Enable the Gmail Trigger for real-time processing (it ships with active: false).

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
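The find-or-create folder logic keyed by sender address can be sketched like this. The `drive.list` / `drive.createFolder` helpers are hypothetical stand-ins for the Google Drive node's search and create operations; `parentId` would be the "Email Attachments" folder ID:

```javascript
// Return the Drive folder ID for a sender, creating the folder only if
// no folder with that name already exists under the parent. This is the
// duplicate-avoidance check described in the Folder Management step.
async function folderForSender(drive, parentId, senderEmail) {
  const existing = await drive.list({ parent: parentId, name: senderEmail });
  if (existing.length > 0) {
    return existing[0].id; // reuse the sender's folder
  }
  return drive.createFolder({ parent: parentId, name: senderEmail });
}
```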
by Sk developer
YouTube Transcript Summarization in Any Language for Social Media

This n8n workflow automates:

1. **Retrieving YouTube Video Transcripts**: Fetches the transcript for any YouTube video URL using the YouTube Transcript API from RapidAPI.
2. **Generating a Concise Summary in Any Language**: Uses Google Gemini (PaLM) to create a concise summary of the transcript in the language specified by the user (e.g., English, Spanish).
3. **Storing the Summary in Google Docs**: Inserts the generated summary into a predefined Google Document, making it easy to share or edit.

Features:

- **Language Flexibility**: Summaries are created in the desired language.
- **Fully Automated**: From fetching the transcript to updating Google Docs, the process is fully automated.
- **Social Media Ready**: The summary is formatted and stored in a Google Doc, ready for use in social media posts.

This workflow integrates with the YouTube Transcript API via RapidAPI, allowing you to easily fetch video transcripts and summarize them with AI. The entire process is automated and seamless.

Powered by RapidAPI:

- **API Used**: YouTube Transcript API via RapidAPI to get the transcript data.

Benefits:

- **Saves Time**: Automates the transcript summarization process, eliminating the need for manual content extraction and summarization.
- **Customizable Language Support**: Provides summaries in any language, enabling accessibility and engagement for a global audience.
- **Streamlined Content Creation**: Automatically generates concise, engaging summaries that are ready for social media use.
- **Google Docs Integration**: Saves summaries directly into a Google Doc for easy sharing, editing, and content management.

Challenges Addressed:

- **Manual Transcript Extraction**: Manually transcribing and summarizing YouTube videos for social media is time-consuming and error-prone. This workflow fully automates the process using the YouTube Transcript API, saving hours of manual work.
- **Lack of Language Support in Summaries**: Many automated tools only summarize content in a single language, limiting their accessibility. With language flexibility, this workflow creates summaries in the language of your choice, helping you cater to diverse audiences.
- **Inconsistent Video Quality & Transcript Accuracy**: Not all YouTube videos have well-structured or accurate transcripts, which can lead to incomplete or inaccurate summaries. The workflow processes and formats even imperfect transcripts so the generated summaries remain accurate and useful.
- **Managing Content Across Platforms**: Transcripts and summaries often need to be stored in multiple locations for social media posts, which can be cumbersome. The workflow integrates with Google Docs to automatically store and manage summaries in one place, making it easier to share and reuse content.
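The language flexibility described above comes down to injecting the user's target language into the instruction sent to Gemini. A minimal sketch of that prompt-assembly step (the template's exact prompt wording will differ):

```javascript
// Build a summarization instruction for the LLM, parameterized by the
// user-supplied target language. Illustrative only.
function buildSummaryPrompt(transcript, language) {
  return (
    `Summarize the following YouTube transcript in ${language}. ` +
    `Keep it concise and ready to post on social media.\n\n` +
    transcript
  );
}
```

For example, `buildSummaryPrompt(transcriptText, 'Spanish')` yields an instruction that makes Gemini respond in Spanish regardless of the transcript's original language.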
by galelem
This n8n workflow automates the entire pipeline of generating, formatting, and publishing SEO-rich blog posts to a Blogger site—ideal for auto service businesses.

What it does:

- ⏱ Runs on a schedule via the Schedule Trigger
- 📰 Fetches trending news from Mediastack (technology category)
- 🖼 Generates relevant images using the Pexels API
- 🧠 Creates SEO-optimized content using AI agents (LangChain & OpenRouter)
- 📝 Formats content into Blogger-compatible HTML, including title, metadata, images, FAQs, and internal linking
- 🔄 Posts directly to Blogger via the authenticated Google Blogger API
- 📢 Sends Telegram notifications with previews and publishing confirmations
- 🔐 Uses secure credentials (no hardcoded API keys)

Ideal For:

- Bloggers and marketers looking to automate content creation
- Auto repair, dealership, or detailing businesses maintaining a content strategy
- Agencies managing multiple Blogger-based SEO campaigns
by Łukasz
Who is it for?

This automation is for support project managers. It helps not only keep developers informed but also automatically keeps clients in the loop, which is especially useful if you are managing an SLA-like agreement. It is a simple incident management board built on a free Kanban board and extended in functionality via n8n.

How It Works

The workflow has two entry points.

The first is the incident form. When incident details are submitted, the automation loads the incident definitions from the database and passes both to the AI. The AI compares the definitions with the client request, refines the incident priority, and pushes the record into the NocoDB database.

The second is a schedule trigger responsible for regular notifications on task status. If a task is not picked up or delivered on time, emails or Slack messages are sent to both the client and the responsible developer.

How to set up

1. Clone the automation.
2. Create two NocoDB tables (samples below): one with definitions and a second that serves as the Kanban board (mind the column naming!).
3. Set up the email and Slack connections.
4. You should be ready to go.

Different incident naming

If your incident level naming is different, you need to update a few nodes and a few columns in NocoDB. This is because incident naming must be unified across the automation flow, the incident definitions, and the NocoDB select-field columns. Be sure the following match:

- NocoDB: Incident definitions, column "Title"
- NocoDB: Tasks table, single select fields "expected category" and "assigned category"
- n8n: Incident Form "Incident Desired Category"

NocoDB Tables

Incident definitions table

|Title|Definition|Response time|Resolution time|Default assignee|
|---|---|---|---|---|
|single line text|text|number|number|email|

Tasks table

|email|message|expected category|internal notes|assigned category|status|expected response|expected resolution|assignee|assignee slack|
|---|---|---|---|---|---|---|---|---|---|
|email|text|single select|text|single select|single select|date and time|date and time|email|slack username|

Use the Kanban board

Simply set up a Kanban view and stack by the "status" field.

What's More?

That's actually it. I hope this automation makes your support line much more streamlined! There is more you could do with it, depending on your needs. For example, you could add an Email trigger to handle incoming support requests (remember to adjust the nodes accordingly). You could also use a different notification schema; for instance, you may want a delay of a day or two before notifying the client that a task is overdue.

Visit my profile for other business automations. And if you are looking for dedicated software development, do not hesitate to reach out!
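The "expected response" and "expected resolution" fields in the Tasks table can be derived from the matching definition row. A sketch of that calculation, assuming the Response time / Resolution time columns hold hours (the template may use a different unit):

```javascript
// Compute SLA deadlines from an incident-definition row.
// `responseTime` / `resolutionTime` are assumed to be hours.
function dueTimes(definition, createdAt = new Date()) {
  const hoursToMs = h => h * 60 * 60 * 1000;
  return {
    expectedResponse: new Date(createdAt.getTime() + hoursToMs(definition.responseTime)),
    expectedResolution: new Date(createdAt.getTime() + hoursToMs(definition.resolutionTime)),
  };
}
```

The schedule trigger then only has to compare these stored deadlines against the current time and the task's status to decide whether to notify the client or the assignee.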
by Rosh Ragel
Automatically Send Weekly Sales Reports from Square via Outlook

What It Does

This workflow connects to the Square API and generates a weekly sales summary report for all your Square locations. The report matches the figures displayed in Square Dashboard > Reports > Sales Summary. It is designed to run weekly, pulling the previous week's sales into a CSV file that is then emailed to a manager or finance team for analysis.

This workflow builds on my previous template, which pulls data from the Square API into n8n for processing (see https://n8n.io/workflows/6358).

Prerequisites

To use this workflow, you'll need:

- A Square API credential (configured as a Header Auth credential)
- A Microsoft Outlook credential

How to Set Up Square Credentials:

1. Go to Credentials > Create New
2. Choose Header Auth
3. Set the Name to Authorization
4. Set the Value to your Square Access Token (e.g., Bearer <your-api-key>)

How It Works

1. **Trigger**: The workflow runs every Monday at 4:00 AM
2. **Fetch Locations**: An HTTP request retrieves all Square locations linked to your account
3. **Fetch Orders**: For each location, an HTTP request pulls completed orders for the previous week (Monday to Sunday)
4. **Filter Empty Locations**: Locations with no sales are ignored
5. **Aggregate Sales Data**: A Code node processes the order data and produces a summary identical to Square's built-in Sales Summary report
6. **Create CSV File**: A CSV file is created containing the relevant data
7. **Send Email**: The report is sent via Microsoft Outlook to the chosen recipient

Example Use Cases

- Automatically send weekly Square sales data to management to improve planning and scheduling decisions
- Automatically send data to an external third party, such as a landlord or agent who is paid on commission
- Automatically send data to a bookkeeper for entry into QuickBooks

How to Use

1. Configure both HTTP Request nodes to use your Square API credential
2. Set the workflow to Active so it runs automatically
3. Enter the email address of the person you want to send the report to and update the message body
4. If you want to remove the n8n attribution, you can do so in the last node

Customization Options

- Add pagination to handle locations with more than 1,000 orders per week

Why It's Useful

This workflow saves time, reduces manual report pulling from Square, and enables smarter automation around sales data, whether for operations, finance, or performance monitoring.
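The Code-node aggregation step can be sketched as a per-location rollup over the fetched orders. Field names follow Square's Orders API convention of integer minor units (cents) in `total_money.amount`, but treat this as an illustrative sketch rather than the template's exact node code:

```javascript
// Sum completed orders into a per-location summary resembling Square's
// Sales Summary report. Amounts stay in cents to avoid float rounding.
function summarizeOrders(orders) {
  const summary = {};
  for (const order of orders) {
    const s = (summary[order.location_id] ??= { orders: 0, gross_sales: 0 });
    s.orders += 1;
    s.gross_sales += order.total_money?.amount ?? 0;
  }
  return summary;
}
```

Each summary entry then maps directly onto one CSV row before the Outlook send.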
by Evoort Solutions
🖼️ Image-to-Image AI Generator from Google Sheets with Google Drive Upload

✅ Use Case

Automatically generate AI images from prompts listed in a Google Sheet, upload the images to Google Drive, and log the result back into the sheet. Uses the image-to-image-gpt API for fast, customizable generation.

💡 Problem It Solves

Manual image generation workflows are inefficient and error-prone. Creative and content teams often have to:

- Manually paste prompts into image generation tools
- Save images locally
- Upload them to Google Drive
- Paste the link back into tracking spreadsheets

This automation removes all that friction, turning one spreadsheet into a complete image creation pipeline.

🌟 Benefits

- 🔁 Fully automated image generation
- 📤 Direct uploads to Google Drive
- 🧾 Image links and timestamps logged in Google Sheets
- ⚠️ Built-in error logging for API failures
- 🧩 Modular and easily extensible
- 📊 Keeps a historical log of successes and errors

🧩 Workflow Overview

| Node | Description |
|------|-------------|
| 1. Manual Trigger | Starts the workflow when executed manually |
| 2. Google Sheets2 | Reads all rows from the input Google Sheet |
| 3. Loop Over Items | Processes one row (prompt) at a time |
| 4. If2 | Filters rows where Prompt is not empty and drive path is empty |
| 5. HTTP Request1 | Calls the image-to-image-gpt API with the prompt |
| 6. If1 (Error Handling) | If the API response contains an error, routes to logging |
| 7. Google Sheets4 (Error Log) | Appends error details to a log sheet for review |
| 8. Code1 | Decodes the base64 image returned by the API |
| 9. Google Drive1 | Uploads the image to a selected Google Drive folder |
| 10. Google Sheets1 (Write Back) | Updates the original row with the image drive path and timestamp |
| 11. Wait | Delays the next prompt to avoid hitting API rate limits |

🛠 Tech Stack

- **n8n** (no-code automation)
- **Google Sheets** (data input/output)
- **Google Drive** (image storage)
- **image-to-image-gpt API** via RapidAPI
- **JavaScript (in a Code node)** for base64 processing

📝 Sheet Format

Your Google Sheet should include the following columns:

| Column | Purpose |
|----------------|----------------------------------|
| Prompt | The AI prompt to send to the API |
| Image url | (Optional) Initial image URL |
| drive path | Updated with the Drive link by the flow |
| Generated Date | Auto-filled by the workflow |
| Base64 | Stores raw or error data |

🚀 How to Use

1. Import this workflow into your n8n instance
2. Set up Google Sheets and Google Drive service credentials
3. Add your RapidAPI key in the HTTP Request node headers
4. Use the image-to-image-gpt endpoint in the HTTP request
5. Configure the Google Sheet and Drive folder in the respective nodes
6. Execute manually, or add a Cron node for scheduling

📌 Example Applications

- 🛍 eCommerce: Auto-generate product mockups
- 🧵 Fashion/Design: Visualize styles or fabrics from prompts
- ✍️ Blogging/Content: Auto-generate header images from titles
- 📣 Marketing: Generate ad banners from text

🧪 Tips

- Add a Cron node if you want this to run on a schedule
- Use a separate tab/sheet for logging failed prompts
- Extend the flow by adding email notifications, Slack alerts, or file-name templating

Create your free n8n account and set up the workflow in just a few minutes using the link below:

👉 Start Automating with n8n

Save time, stay consistent, and scale your image production effortlessly!
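The base64-decoding Code node (step 8) can be sketched like this. The response field name (`image_base64`) is an assumption; adjust it to the API's actual payload. The `{ json, binary }` item shape is n8n's standard format for passing binary data to the Google Drive node:

```javascript
// Turn the API's base64 image string into an n8n item carrying binary data
// that the Google Drive upload node can consume.
function toBinaryItem(apiResponse, fileName) {
  const buffer = Buffer.from(apiResponse.image_base64, 'base64'); // decode to raw bytes
  return {
    json: { fileName, size: buffer.length },
    binary: {
      data: {
        data: buffer.toString('base64'),
        fileName,
        mimeType: 'image/png', // assumption: API returns PNG
      },
    },
  };
}
```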
by Oneclick AI Squad
This workflow provides real-time detection of ransomware encryption patterns using Claude AI, with automated system isolation and incident response.

How it works

1. **File System Monitoring**: Continuously monitors file operations (create, modify, rename, delete) across critical directories
2. **Behavior Pattern Collection**: Aggregates file operation metrics in 30-second windows (entropy changes, extension changes, I/O velocity)
3. **AI Threat Analysis**: Claude AI analyzes patterns against known ransomware behaviors (mass encryption, shadow copy deletion, etc.)
4. **Threat Scoring & Classification**: Assigns threat scores (0-100) and classifies attack types (crypto-locker, wiper, etc.)
5. **Auto-Isolation Decision**: Determines whether immediate network isolation is required based on confidence thresholds
6. **System Quarantine**: Executes automated isolation: disables network adapters, blocks shares, kills suspicious processes
7. **Forensic Snapshot**: Captures system state, process tree, network connections, and file operation logs
8. **Incident Response Alert**: Notifies the SOC team with detailed threat intelligence and recommended actions
9. **Evidence Preservation**: Stores forensic data and AI analysis in the SIEM for investigation

Detection Capabilities

- **Entropy Analysis**: Detects high-entropy file creation (the signature of encrypted data)
- **Extension Scanning**: Identifies suspicious extension changes (.docx → .locked, .encrypted, .crypted)
- **I/O Velocity**: Flags abnormal file modification rates (>100 files/min)
- **Shadow Copy Deletion**: Detects vssadmin.exe / wmic.exe shadow copy deletion attempts
- **Ransom Note Detection**: Identifies README.txt and HOW_TO_DECRYPT.html creation patterns
- **Lateral Movement**: Monitors SMB/RDP connection spikes from infected hosts
- **Process Behavior**: Analyzes suspicious parent-child process relationships

Setup Steps

1. Import the workflow into n8n
2. Configure credentials:
   - Anthropic API: Claude AI for threat analysis
   - Windows Event Collector / Sysmon: file system event source
   - EDR API (CrowdStrike/Defender/SentinelOne): for isolation commands
   - SIEM API (Splunk/Elastic): for log forwarding
   - Slack/PagerDuty: for SOC alerts
3. Install a file system watcher on monitored endpoints (Sysmon, osquery, or auditd)
4. Configure isolation thresholds (default: threat_score >= 75)
5. Test the isolation procedure in a sandbox environment
6. Activate the workflow

Sample Detection Event

```json
{
  "hostname": "DESKTOP-WKS-042",
  "username": "jdoe",
  "timestamp": "2025-02-25T14:23:17Z",
  "detection_window_seconds": 30,
  "file_operations": {
    "files_modified": 247,
    "files_renamed": 189,
    "files_created": 58,
    "files_deleted": 31,
    "avg_entropy_increase": 7.89,
    "suspicious_extensions": [".locked", ".crypted", ".encrypted"],
    "ransom_notes_created": ["README_DECRYPT.txt", "HOW_TO_RECOVER.html"]
  },
  "process_activity": {
    "high_io_processes": [
      {"name": "explorer.exe", "pid": 4782, "io_rate": "523 ops/sec"},
      {"name": "svchost.exe", "pid": 2194, "io_rate": "412 ops/sec"}
    ],
    "suspicious_commands": [
      "vssadmin.exe delete shadows /all /quiet",
      "wmic shadowcopy delete",
      "bcdedit /set {default} recoveryenabled no"
    ]
  },
  "network_activity": {
    "c2_connections": [
      {"ip": "185.220.101.32", "port": 443, "country": "RU"},
      {"ip": "194.165.16.85", "port": 8443, "country": "NL"}
    ],
    "lateral_movement": [
      {"target": "FILE-SERVER-01", "protocol": "SMB", "status": "success"},
      {"target": "DB-SERVER-03", "protocol": "RDP", "status": "failed"}
    ]
  }
}
```

Threat Intelligence Sources

- MITRE ATT&CK Framework (T1486 - Data Encrypted for Impact, T1490 - Inhibit System Recovery)
- Known ransomware families: LockBit, BlackCat/ALPHV, Royal, Play, Cl0p
- File extension IOCs from ransomware tracking feeds
- Behavioral signatures from recent campaigns

Compliance & Forensics

- **Chain of Custody**: All isolation actions logged with timestamps and justifications
- **NIST CSF Alignment**: DE.CM-7 (Monitoring for unauthorized activity), RS.MI-3 (Incident containment)
- **Evidence Integrity**: Forensic snapshots include cryptographic hashes for court admissibility
- **Post-Incident Review**: AI analysis archived for threat hunting and pattern improvement
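The entropy analysis capability rests on Shannon entropy: encrypted or compressed data approaches the 8 bits/byte maximum, while plain documents score much lower. A minimal sketch of that check (thresholds are illustrative, not the workflow's actual values):

```javascript
// Shannon entropy of a byte buffer in bits per byte (0..8).
// High values (> ~7.5) on newly written files are a classic encryption signature.
function shannonEntropy(bytes) {
  const counts = new Array(256).fill(0);
  for (const b of bytes) counts[b]++;
  let entropy = 0;
  for (const c of counts) {
    if (c === 0) continue;
    const p = c / bytes.length;
    entropy -= p * Math.log2(p);
  }
  return entropy;
}

// Plain repetitive text scores low; uniform random bytes score near 8.
const text = Buffer.from('hello hello hello hello');
const random = Buffer.from(Array.from({ length: 4096 }, () => Math.floor(Math.random() * 256)));
console.log(shannonEntropy(text).toFixed(2), shannonEntropy(random).toFixed(2));
```

In the workflow, a per-window average of this value over newly modified files feeds the `avg_entropy_increase` field shown in the sample event.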
by Cheng Siong Chin
How It Works

This workflow automates enterprise risk management by intelligently routing risks across three severity tiers. Built for compliance teams and risk managers, it eliminates manual evaluation bottlenecks and inconsistent escalation. The system retrieves risk data from spreadsheets and calculates severity indicators, then routes items through specialized AI agents: critical risks trigger coordinated multi-agent assessment with Gmail and Slack alerts, medium risks undergo standard AI evaluation, and low risks receive automated acknowledgment. Each severity level follows a distinct processing path, ensuring appropriate review depth, stakeholder notification, and audit documentation.

Setup Steps

1. Connect Google Sheets with your risk data
2. Configure Anthropic API credentials for the Claude Model nodes
3. Set up Gmail authentication for notification delivery
4. Connect your Slack workspace and specify channel IDs for critical/low-risk alerts
5. Customize risk thresholds
6. Update the parser regex patterns in the Code nodes to match the assessment output format

Prerequisites

Active accounts: Google Sheets, Anthropic Claude API, Gmail, Slack.

Use Cases

Enterprise compliance monitoring, operational risk management.

Customization

Modify scoring formulas, adjust severity thresholds, add custom AI criteria.

Benefits

Eliminates manual triage, ensures consistent standards, accelerates critical response.
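The three-tier routing described above reduces to a threshold check on the computed severity score. A sketch with illustrative thresholds (the template lets you customize these, so treat 75 and 40 as placeholders):

```javascript
// Route a risk item to one of three processing tiers by severity score.
// Thresholds are illustrative defaults, not the template's exact values.
function routeRisk(score) {
  if (score >= 75) return 'critical'; // multi-agent assessment + Gmail/Slack alerts
  if (score >= 40) return 'medium';   // standard AI evaluation
  return 'low';                       // automated acknowledgment
}
```

In n8n this maps naturally onto a Switch node with three outputs, one per tier.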