by Khairul Muhtadin
Effortlessly track your expenses with MoneyMate, an n8n workflow that transforms receipts into organized financial insights. Upload a photo or text via Telegram, and let MoneyMate extract key details (store info, transaction dates, items, and totals) using Google Vision OCR and AI-powered parsing via OpenRouter. It categorizes expenses (e.g., Food & Beverages, Transport, Household) and delivers a clean, emoji-rich summary back to your Telegram chat. It also handles zero-total errors with a friendly nudge to double-check inputs. Perfect for freelancers, small business owners, or anyone seeking hassle-free expense management. No database is required, ensuring privacy and simplicity. Deploy MoneyMate and take control of your finances today!

Key Features

- 📱 **Telegram Integration:** Input via photo or text, receive summaries instantly.
- 📸 **Receipt Scanning:** Converts receipt images to text using the Google Vision API.
- 🤖 **AI Parsing:** Categorizes transactions with OpenRouter's AI analysis.
- 🛡️ **Privacy-First:** Processes data on the fly without storage.
- ⚠️ **Smart Error Handling:** Catches zero totals with user-friendly prompts (see the sketch after the customization options below).
- 📊 **Flexible Categories:** Supports Income/Expense and custom expense types.

Ideal For

- **Budget-conscious individuals** managing personal finances.
- **Entrepreneurs** tracking business expenses.
- **Teams** needing quick, automated expense reporting.

Pre-Requirements

- **n8n Instance:** A running n8n instance (cloud or self-hosted).
- **Credentials:**
  - Telegram: A bot token and webhook setup (obtained via BotFather). For more information, refer to Telegram bots creation.
  - Google Cloud: A service account with the Google Vision API enabled and an API key. For more information, refer to Google Cloud Vision.
  - OpenRouter: An account with API access for AI language model usage.
- **Telegram Bot:** A configured **Telegram** bot to receive inputs and send summaries.

Setup Instructions

1. **Import Workflow:** Copy the **MoneyMate** workflow JSON and import it into your n8n instance using the "Import Workflow" option.
2. **Set Up Telegram Bot:** Create a bot via BotFather on **Telegram** to get a token and set up a webhook. For detailed steps, refer to n8n's Telegram setup guide.
3. **Configure Credentials:**
   - In the Telegram Trigger, Send Error Message, and Send Expense Summary nodes, add Telegram API credentials with your bot token.
   - In the Get Telegram File and Download Image nodes, ensure Telegram API credentials are linked.
   - In the Google Vision OCR node, add Google Cloud credentials with Google Vision API access.
   - In the OpenRouter AI Model node, set up OpenRouter API credentials.
4. **Test the Workflow:** Send a test receipt photo or text (e.g., "Lunch 50,000 IDR") via **Telegram** and verify the summary in your chat.
5. **Activate:** Enable the workflow in n8n to run automatically for each input.

Customization Options

- **Add Categories:** Modify the **AI Categorizer** node to include new expense types (e.g., **Entertainment**).
- **Change Output Format:** Adjust the **Format Summary Message** node to include more details like taxes or payment methods.
- **Switch AI Model:** In the **OpenRouter AI Model** node, select a different **OpenRouter** model for better parsing.
- **Store Data:** Add a **Google Sheets** node after **Parse Receipt Data** to save expense records.
- **Enhance Errors:** Include an email notification node after **Check Invalid Input** for failed inputs.
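To make the Smart Error Handling step concrete, here is a minimal sketch of what the zero-total guard and summary formatting could look like in an n8n Code node. The `ParsedReceipt` shape and all field names are assumptions for illustration, not the template's actual schema.

```typescript
// Hypothetical shape of the AI parser's output; field names are illustrative.
interface ParsedReceipt {
  store: string;
  date: string;
  category: string; // e.g. "Food & Beverages", "Transport", "Household"
  items: { name: string; price: number }[];
  total: number;
}

// Builds the Telegram reply: a friendly nudge on zero totals,
// otherwise an emoji-rich summary like the one the workflow sends.
function buildReply(receipt: ParsedReceipt): string {
  if (!receipt.total || receipt.total <= 0) {
    return "⚠️ I couldn't find a valid total on that receipt. Please double-check the photo or text and try again.";
  }
  const lines = receipt.items.map((i) => `• ${i.name}: ${i.price.toLocaleString("id-ID")}`);
  return [
    `🧾 ${receipt.store} (${receipt.date})`,
    `📂 ${receipt.category}`,
    ...lines,
    `💰 Total: ${receipt.total.toLocaleString("id-ID")} IDR`,
  ].join("\n");
}
```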
Why Choose MoneyMate?

Save time, reduce manual entry, and gain clarity on your spending with MoneyMate's AI-driven workflow. Ready to streamline your finances? Get MoneyMate now!

Made by: khmuhtadin. Need a custom build? Contact me on LinkedIn or Web.
by Usman Liaqat
Description: This n8n workflow helps you capture Slack messages via a webhook and download attached media files (such as images, documents, or videos) directly from those messages.

How it works:

1. Slack Trigger (Webhook) – Listens for new messages in a Slack channel where the app is added.
2. HTTP Request – Uses the file's private download URL to retrieve the media securely.

Use cases:

- Download files shared by team members in a Slack channel.
- Capture and process media from specific project or support channels.
- Prepare media for later processing, archiving, or review.

Requirements:

- A Slack app with appropriate permissions (files:read, channels:history, etc.).
- A Slack webhook set up to listen to channel messages.
- An authenticated HTTP request to handle private Slack file URLs.

This template is ideal for users who want full control over file handling triggered by real-time Slack messages.
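One implementation detail worth knowing: Slack's `url_private` links only resolve when the request carries a bot token with the `files:read` scope. A minimal sketch of what the HTTP Request node does under the hood (the environment variable name and error handling are illustrative):

```typescript
// Downloads a file from a Slack url_private link; requires a bot token
// with the files:read scope. Runs on Node 18+ (global fetch).
async function downloadSlackFile(urlPrivate: string): Promise<Buffer> {
  const res = await fetch(urlPrivate, {
    headers: { Authorization: `Bearer ${process.env.SLACK_BOT_TOKEN}` },
  });
  if (!res.ok) throw new Error(`Slack file download failed: HTTP ${res.status}`);
  return Buffer.from(await res.arrayBuffer());
}
```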
by Mind-Front
Description: The closest description of this workflow is a cheaper, modular version of the Perplexity online API, powered by LLM models that outperform Perplexity's Llama model. This flow provides a seamless way to conduct detailed web searches, extract data, and generate insightful reports based on real-time information. It exposes a webhook that accepts any search question and reports back the results via multi-level web search analysis and domain-specific emulation of an expert agent, delivering an unbiased expert report. This flow is ideal for market research, competitive analysis, or any scenario where actionable, structured insights are needed. A more complete, step-by-step guide is provided within the workflow, ensuring you have all the details to set up and customize each component.

This tool is designed to function similarly to Perplexity by performing semantic search, reranking, and follow-up queries. However, it offers a unique advantage: complete customization at every stage. Modify any part of the process, from query refinement to data extraction, to tailor the workflow to your specific needs.

Key Features:

- **AI-Powered Query Generation and Expert Emulation:** Uses Google Gemini to transform user queries into expert-level searches, providing accurate and context-aware results.
- **Dual-Stage Semantic Search with Intelligent Reranking:** Performs an initial search, reranks results, and refines the query based on findings to conduct a second, more targeted search.
- **Top-Result Data Extraction:** Extracts content from the top three results of each search, capturing relevant insights from six total sources.
- **Customizable API Options:** Pre-configured with free APIs (Google Gemini, DuckDuckGo, and Article Extraction APIs) but easily adaptable to other APIs if preferred.
- **Automated, Insightful Reporting:** Synthesizes data into a cohesive report, providing expert-level insights tailored to the user's query.

Instructions for API Setup:

This workflow is designed to work with free-tier APIs, offering a cost-effective way to retrieve high-quality data. Here's how to set up each API (detailed instructions are also included in the workflow):

1. **Google Gemini API (for Query Generation and Analysis):** Visit Google AI Studio and log in. Create a free API key under "Get API Key" → "Create API Key in New Project." The free tier includes up to 15 requests per minute, 1 million tokens per minute, and up to 1,500 requests per day.
2. **Brave Search API (for Web Search):** Visit api.search.brave.com, create an account, and subscribe to the free plan (no charge). Navigate to the API Keys section and generate an API key, choosing "Free" as the subscription type.
3. **Article Extraction API (for Content Extraction):** Register on RapidAPI.com and subscribe to the Article Extraction API. The free plan allows up to 300 extractions per month. Enter your API key in each of the six extraction nodes used for content retrieval.

Alternative: The workflow includes full instructions on how to replace the current APIs with alternative API keys, with suggestions such as the Scraper Tech API.

Additional Tip: To use other APIs, you can generate a cURL request in RapidAPI's playground and paste it into the HTTP Request node in n8n. This approach streamlines integration by automatically filling in headers and request details.
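To make the web-search step concrete, here is a rough TypeScript sketch of the Brave Search call the HTTP Request node performs. The `X-Subscription-Token` header follows Brave's API documentation; the result mapping mirrors the workflow's top-three extraction but is otherwise illustrative:

```typescript
// Sketch of the Brave Search call behind the web-search step.
async function braveSearch(query: string) {
  const url = new URL("https://api.search.brave.com/res/v1/web/search");
  url.searchParams.set("q", query);
  const res = await fetch(url, {
    headers: {
      Accept: "application/json",
      "X-Subscription-Token": process.env.BRAVE_API_KEY ?? "",
    },
  });
  if (!res.ok) throw new Error(`Brave Search failed: HTTP ${res.status}`);
  const data: any = await res.json();
  // Keep only the top three results, mirroring the workflow's extraction step.
  return (data.web?.results ?? []).slice(0, 3).map((r: any) => ({
    title: r.title,
    url: r.url,
    snippet: r.description,
  }));
}
```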
Why Choose This Workflow?

The Intelligent Online Web Researcher offers an all-in-one solution for complex, customizable online research. Unlike other tools that provide automated semantic search, this workflow is fully modifiable, allowing you to tailor each step, from the initial query and reranking to data extraction and reporting. With built-in instructions and a structure that's easy to adapt, it's ideal for commercial applications that require real-time, high-quality insights.

Tags: Online Research, Web Search, Market Analysis, Web Search Automation, Data Extraction, Semantic Search, API Integration, Competitive Intelligence, Business Intelligence, Real-Time Reporting, Web Scrape, Data Crawler, Perplexity
by Femi Ad
Description

AI-Powered Business Idea Generation & Social Media Content Strategy Workflow

This intelligent content discovery and strategy system features 15 nodes that automatically monitor Reddit communities, analyze business opportunities, and generate targeted social media content for AI automation agencies and entrepreneurs. It leverages AI classification, structured analysis, and automated content creation to transform community discussions into actionable business insights and marketing materials.

Core Components

• Reddit Intelligence: Multi-subreddit monitoring across AI automation, n8n, and entrepreneur communities with keyword-based filtering.
• AI Classification Engine: Intelligent categorization of posts into "Questions" vs. "Requests" using LangChain text classification.
• Dual Analysis System: Specialized AI agents for educational content (questions) and sales-focused content (service requests).
• Content Strategy Generator: Automated creation of LinkedIn and Twitter content tailored to different audience engagement strategies.
• Telegram Integration: Real-time delivery of formatted content strategies and business insights.
• Structured Output Processing: JSON-formatted analysis with relevancy scores, feasibility assessments, and actionable content recommendations (see the sketch after the performance specifications below).

Target Users

• AI Automation Agency Owners seeking consistent lead generation and thought leadership content
• Entrepreneurs wanting to identify market opportunities and position themselves as industry experts
• Content Creators in the automation/AI space needing data-driven content strategies
• Business Development Professionals looking for systematic opportunity identification
• Digital Marketing Agencies serving tech and automation clients

Setup Requirements

To get started, you'll need:

• Reddit API Access: OAuth2 credentials for accessing Reddit's API and monitoring multiple subreddits.
• Required APIs: OpenRouter (for AI model access; supports GPT-4, Claude, and other models) and the Reddit OAuth2 API (for community monitoring and data extraction).
• n8n Prerequisites: version 1.7+ with LangChain nodes enabled, webhook configuration for Telegram integration, and proper credential storage and management.
• Telegram Bot: Create one via @BotFather for receiving formatted content strategies and business insights.

Disclaimer: This template uses LangChain nodes and Reddit API integration. Ensure your n8n instance supports these features and verify API rate limits for production use.

Step-by-Step Setup Guide

1. Install n8n: Ensure you're running n8n version 1.7 or higher with LangChain node support enabled.
2. Set Up API Credentials: Create a Reddit OAuth2 application at reddit.com/prefs/apps, set up an OpenRouter account and obtain an API key, and store the credentials securely in the n8n credential manager.
3. Create a Telegram Bot: In Telegram, search for @BotFather, create a new bot and note the token, then configure a webhook pointing to your n8n instance.
4. Import the Workflow: Copy the workflow JSON from the template submission, import it into your n8n dashboard, and verify all nodes are properly connected.
5. Configure Monitoring Settings: Adjust the subreddit targets (currently: ArtificialIntelligence, n8n, entrepreneur), set keyword filters for relevant topics, and configure post limits and sorting preferences.
6. Customize AI Analysis: Update the system prompts to match your business expertise, adjust the relevancy and feasibility scoring criteria, and modify the content generation templates for your brand voice.
7. Test the Workflow: Run a manual execution to verify Reddit data collection, check the AI classification and analysis outputs, and confirm Telegram delivery of formatted content.
8. Schedule Automation: Set up the daily trigger (currently configured for 12 PM), monitor execution logs for any API rate limit issues, and adjust the frequency based on content volume needs.

Usage Instructions

• Automated Discovery: The workflow runs daily at 12 PM, scanning three key subreddits for relevant posts about AI automation, business opportunities, and n8n workflows.
• Intelligent Classification: Posts are automatically categorized as either "Questions" (educational opportunities) or "Requests" (potential service leads) using AI text classification.
• Dual Analysis Approach: Questions → educational content strategy with relevancy and detail scoring; Requests → sales-focused content with relevancy and feasibility scoring.
• Content Strategy Generation: Each analyzed post generates 3 LinkedIn posts (thought leadership, case studies, educational frameworks) and 3 Twitter posts (quick insights, engagement questions, thread starters).
• Telegram Delivery: Receive formatted content strategies with post summaries and business context, relevancy/feasibility scores, ready-to-use social media content, and strategic recommendations.
• Content Customization: Adapt generated content for different tones (business, educational, technical) and posting schedules.

Workflow Features

• Multi-Platform Monitoring: Simultaneous tracking of 3 key Reddit communities with customizable keyword filters.
• AI-Powered Classification: Automatic categorization of posts into actionable content types.
• Dual Scoring System: Relevancy scores (0.05-0.95) for business alignment; detail/feasibility scores (0.05-0.95) for content quality assessment.
• Content Variety: Generates both educational and sales-focused social media strategies.
• Structured Output: JSON-formatted analysis for easy integration with other systems.
• Real-Time Delivery: Instant Telegram notifications with formatted content strategies.
• Scalable Monitoring: Easy addition of new subreddits and keyword filters.
• Error Handling: Comprehensive validation with graceful failure management.

Performance Specifications

• Monitoring Frequency: Daily automated execution with manual trigger capability
• Post Analysis: 5 posts per subreddit (15 total daily)
• Content Generation: 6 social media posts per analyzed opportunity
• Classification Accuracy: AI-powered with structured output validation
• Delivery Method: Real-time Telegram integration
• Scoring Range: 0.05-0.95 scale for relevancy and feasibility assessment
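For teams wiring the Telegram output into other systems, here is a hedged sketch of the structured analysis each post produces and the dual-analysis routing. The field names are assumptions based on the description above, not the template's actual schema:

```typescript
// Hypothetical shape of the structured analysis produced per Reddit post.
interface PostAnalysis {
  classification: "Question" | "Request";
  relevancy: number;   // 0.05-0.95, business alignment
  feasibility: number; // 0.05-0.95, detail (questions) or feasibility (requests)
  summary: string;
  linkedinPosts: string[]; // 3 drafts
  twitterPosts: string[];  // 3 drafts
}

// Simple triage mirroring the dual-analysis branching; the 0.3 cutoff
// is an illustrative threshold, not taken from the workflow.
function route(post: PostAnalysis): string {
  if (post.relevancy < 0.3) return "skip";
  return post.classification === "Question" ? "educational-agent" : "sales-agent";
}
```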
Why This Workflow?

• Systematic Opportunity Identification: Never miss potential business opportunities or content ideas from key communities.
• AI-Enhanced Analysis: Leverage advanced language models for intelligent content categorization and strategy generation.
• Time-Efficient Content Creation: Transform community discussions into ready-to-use social media content.
• Data-Driven Insights: Quantified scoring helps prioritize opportunities and content strategies.
• Automated Lead Intelligence: Identify potential service requests and educational content opportunities automatically.

Need help customizing this workflow for your specific use case? As a fellow entrepreneur passionate about automation and business development, I'd be happy to consult. Connect with me on LinkedIn (https://www.linkedin.com/in/femi-adedayo-h44/) or email me for support. Let's make your AI automation agency even more efficient!
by PUQcloud
Setting up the n8n workflow

Overview

The Docker Grafana WHMCS module uses a specially designed n8n workflow to automate deployment processes. The workflow provides an API interface for the module, receives specific commands, and connects via SSH to a server with Docker installed to perform predefined actions.

Prerequisites

You must have your own n8n server. Alternatively, you can use the official n8n cloud installations available at the n8n official site.

Installation Steps

Install the required workflow on n8n. You have two options:

- Option 1: Use the latest version from the n8n marketplace. The latest workflow templates for our modules are available on the official n8n marketplace; visit our profile (PUQcloud on n8n) to access all available templates.
- Option 2: Manual installation. Each module version comes with a workflow template file; manually import this template into your n8n server.

n8n Workflow API Backend Setup for WHMCS/WISECP

Configure the API webhook and SSH access:

1. Create a Basic Auth credential for the Webhook API block in n8n.
2. Create an SSH credential for accessing a server with Docker installed.

Modify Template Parameters

In the Parameters block of the template, update the following settings:

- server_domain – must match the domain of the WHMCS/WISECP Docker server.
- clients_dir – directory where user data related to Docker and disks will be stored.
- mount_dir – default mount point for the container disk (recommended not to change).

Do not modify the following technical parameters:

- screen_left
- screen_right

Deploy-docker-compose

In the Deploy-docker-compose element, you can modify the Docker Compose configuration, which is generated in the following scenarios:

- When the service is created
- When the service is unlocked
- When the service is updated

nginx

In the nginx element, you can modify the configuration parameters of the web interface proxy server. The main section allows you to add custom parameters to the server block in the proxy server configuration file. The main_location section contains settings that will be added to the location / block of the proxy server configuration; here, you can define custom headers and other parameters specific to the root location.

Bash Scripts

Management of Docker containers and all related procedures on the server is carried out by executing Bash scripts generated in n8n. These scripts return either a JSON response or a string. All scripts are located in elements directly connected to the SSH element. You have full control over every script and can modify or execute it as needed.
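Because every SSH-executed script returns either JSON or a plain string, any logic consuming the SSH element's output needs to handle both cases. A small illustrative sketch (not part of the template itself):

```typescript
// Interprets stdout from an SSH-executed script, which by the module's
// convention is either a JSON document or a plain string.
type ScriptResult = { data?: unknown; raw?: string };

function parseScriptOutput(stdout: string): ScriptResult {
  try {
    return { data: JSON.parse(stdout) }; // structured JSON response
  } catch {
    return { raw: stdout.trim() };       // fall back to the plain string
  }
}
```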
by Mark Shcherbakov
Video Guide

I prepared a detailed guide that illustrates the entire process of building an AI agent using Supabase and Google Drive within n8n workflows: YouTube link.

Who is this for?

This workflow is designed for developers, data scientists, and business users who wish to automate document management and enable AI-powered interactions over their stored files. It's especially beneficial for scenarios where users need to process, analyze, and retrieve information from uploaded documents rapidly.

What problem does this workflow solve?

Managing files across multiple platforms often involves tedious manual processes. This workflow facilitates automated file handling, making it easier for users to upload, parse, and interact with documents through an AI agent. It reduces redundancy and enhances the efficiency of data retrieval and management tasks.

What this workflow does

This workflow integrates Supabase storage with Google Drive and employs an AI agent to manage files effectively. The agent can:

- Upload files to Supabase storage and activate processes based on file changes in Google Drive.
- Retrieve and parse documents, converting them into a structured format for easy querying.
- Answer user queries based on saved document data.

Key stages:

- Data Collection: The workflow initially gathers files from Supabase storage, ensuring no duplicates are processed in the 'files' table.
- File Handling: It processes files to be parsed based on their type, leveraging LlamaParse for effective data transformation.
- Google Drive Integration: The workflow monitors a designated Google Drive folder to upload files automatically and refresh document records in the database with new data.
- AI Interaction: A webhook is established to enable the AI agent to converse with users, facilitating queries and leveraging stored document knowledge.

Setup

1. Supabase Storage Setup: Create a private bucket in Supabase storage, modifying the default name in the URL. Upload your files using the provided upload options.
2. Database Configuration: Establish the 'file' and 'document' tables in Supabase with the necessary fields. Execute any required SQL queries to enable vector matching features (see the sketch after this section).
3. n8n Workflow Logic: Start with a manual trigger for the initial workflow segment, or consider alternative triggers like webhooks. Replace all relevant credentials across nodes with your own to ensure seamless operation.
4. File Processing and Google Drive Monitoring: Set up file processing to handle downloading and parsing files based on their types. Create triggers to monitor the designated Google Drive folder for file uploads and updates.
5. Integrate the AI Agent: Configure the webhook for the AI agent to accept chat inputs while maintaining session context for enhanced user interactions. Use PostgreSQL to store user interactions and manage conversation states effectively.
6. Testing and Adjustments: Once everything is set up, run tests with the AI agent to validate its responses based on the documents in your database. Fine-tune the workflow and AI model as needed to achieve the desired performance.
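Once the tables and vector features are in place, the agent's retrieval step typically calls a vector-match function. A minimal sketch using the Supabase JS client, assuming the conventional `match_documents` RPC from Supabase's vector-search guides; the function name and parameters are assumptions, not taken from this template:

```typescript
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_KEY!);

// Returns the top-k documents whose embeddings are closest to the query.
async function matchDocuments(queryEmbedding: number[]) {
  const { data, error } = await supabase.rpc("match_documents", {
    query_embedding: queryEmbedding, // embedding of the user's question
    match_count: 5,                  // top-k documents to return
  });
  if (error) throw error;
  return data;
}
```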
by David Ashby
Complete MCP server exposing all LinkedIn Tool operations to AI agents. Zero configuration needed - the single available operation is pre-built.

⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every LinkedIn Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders (see the sketch at the end of this listing)
• Native Integration: Uses the official n8n LinkedIn Tool node with full error handling

📋 Available Operations (1 total)

Every possible LinkedIn Tool operation is included:

🔧 Post (1 operation)
• Create a post

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native LinkedIn Tool API responses with full data structure

Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Complete Coverage: Every LinkedIn Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
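As a quick illustration of the $fromAI() mechanism: tool-node parameters carry expressions that the connected agent fills in at call time. A hedged sketch of what the Create-a-post parameter might look like in the workflow JSON; the key and description strings are illustrative, not copied from this template:

```typescript
// In n8n workflow JSON, expression-valued parameters are prefixed with "=".
// $fromAI(key, description, type) tells the agent layer to supply the value.
const createPostParameters = {
  text: "={{ $fromAI('text', 'Content of the LinkedIn post to create', 'string') }}",
};
```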
by Markhah
Overview

This workflow generates automated revenue and expense comparison reports from a structured Google Sheet. It enables users to compare financial data across the current period, last month, and last year, then uses an AI agent to analyze and summarize the results for business reporting.

1. Prerequisites

- A connected Google Sheets OAuth2 credential.
- A valid DeepSeek API key (or replace it with another chat model).
- A sub-workflow (child workflow) that handles the processing logic.
- Properly structured Google Sheets data (see below).

2. Required Google Sheet Structure

- Column headers must include at least: Date, Amount, Type.
- Dates must be in dd/MM/yyyy or dd-MM-yyyy format (see the sketch at the end of this listing).
- Entries should span multiple time periods (e.g., current month, last month, last year).

3. Setup Steps

1. Import the workflow into your n8n instance.
2. Connect your Google Sheets and DeepSeek API credentials.
3. Update the Sheet ID and tab name (embedded in the node: Get revenual from google sheet) and the custom sub-workflow ID (in the Call n8n Workflow Tool node).
4. Optionally configure the chatbot webhook in the When chat message received node.

4. What the Workflow Does

- Accepts date inputs via an AI chat interface (Chat Trigger + AI Agent).
- Fetches raw transaction data from Google Sheets.
- Segments and pivots revenue by classification for the current period, last month, and last year.
- Aggregates totals and applies custom titles for comparison.
- Merges all summaries into a final unified JSON report.

5. Customization Options

- Replace DeepSeek with OpenAI or other LLMs.
- Change the date fields or cycle comparisons (e.g., quarterly, weekly).
- Add more AI analysis steps such as sentiment scoring or forecasting.
- Modify the pivot logic to suit specific KPI tags or labels.

6. Troubleshooting Tips

- If the Google Sheets fetch fails: ensure the document is shared with your n8n Google credential.
- If parsing errors occur: verify that all dates follow the expected format.
- The sub-workflow must be active and configured to accept the correct inputs (6 dates).

7. SEO Keywords: google sheets report, AI financial report, compare revenue by month, expense analysis automation, chatbot n8n report generator, n8n Google Sheet integration
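Since the period segmentation hinges on the dd/MM/yyyy and dd-MM-yyyy formats noted in section 2, here is a small sketch of the parsing and bucketing logic involved; the function names are illustrative:

```typescript
// Parses a sheet date in dd/MM/yyyy or dd-MM-yyyy format.
function parseSheetDate(value: string): Date {
  const m = value.match(/^(\d{2})[\/-](\d{2})[\/-](\d{4})$/);
  if (!m) throw new Error(`Unexpected date format: ${value}`);
  const [, dd, mm, yyyy] = m;
  return new Date(Number(yyyy), Number(mm) - 1, Number(dd));
}

// Example: bucket a transaction against the current period.
const d = parseSheetDate("05/03/2025");
const now = new Date();
const isCurrentMonth =
  d.getFullYear() === now.getFullYear() && d.getMonth() === now.getMonth();
```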
by David Ashby
🛠️ Google Cloud Natural Language Tool MCP Server

Complete MCP server exposing all Google Cloud Natural Language Tool operations to AI agents. Zero configuration needed - the single available operation is pre-built.

⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Google Cloud Natural Language Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Google Cloud Natural Language Tool node with full error handling

📋 Available Operations (1 total)

Every possible Google Cloud Natural Language Tool operation is included:

🔧 Document (1 operation)
• Analyze sentiment

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native Google Cloud Natural Language Tool API responses with full data structure

Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Complete Coverage: Every Google Cloud Natural Language Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by David Ashby
Complete MCP server exposing all Mailcheck Tool operations to AI agents. Zero configuration needed - the single available operation is pre-built.

⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Mailcheck Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Mailcheck Tool node with full error handling

📋 Available Operations (1 total)

Every possible Mailcheck Tool operation is included:

🔧 Email (1 operation)
• Check an email

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native Mailcheck Tool API responses with full data structure

Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Complete Coverage: Every Mailcheck Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by Gregor
Awork currently does not check for open subtasks or open dependencies when a task status is set to done. This workflow offers a simple workaround that adds this functionality to Awork and notifies users when triggered. Multiple configuration options are available.

How it works

- Triggered via an Awork webhook call on task status changes
- If a task is marked as done, its subtasks and/or dependent tasks are checked for their status
- If unfinished tasks are found, the task is rolled back to its previous status and the user is notified

Set up steps

- Add the webhook call to Awork
- Configure Awork API credentials
- Set up the workflow configuration via the setup node, e.g., user notification text, restriction to subtask/dependency checks, etc.
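For orientation, here is a condensed sketch of the rollback decision, assuming the webhook handling gathers the task plus its subtasks and dependencies; all field names are illustrative, not taken from the Awork API or this workflow:

```typescript
// Hypothetical payload assembled after the status-change webhook fires.
interface Task { id: string; status: string; previousStatus: string }
interface Payload { task: Task; subtasks: Task[]; dependencies: Task[] }

// Mirrors the configurable subtask/dependency checks described above.
function shouldRollback(p: Payload, checkSubtasks = true, checkDependencies = true): boolean {
  const open = (t: Task) => t.status !== "done";
  return (
    (checkSubtasks && p.subtasks.some(open)) ||
    (checkDependencies && p.dependencies.some(open))
  );
}
// If true, the workflow sets the task back to p.task.previousStatus via the
// Awork API and sends the configured user notification.
```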
by Adnan Tariq
🛡 CyberScan – AI-Powered Vulnerability Scanner with Nessus, OpenAI, and Google Sheets

👤 Who's it for

Security teams, DevOps engineers, vulnerability analysts, and automation builders who want to eliminate repetitive Nessus scan parsing, AI-based risk triage, and manual reporting. Designed for organizations following NIST CSF or CISA KEV compliance guidelines.

⚙️ How it works / What it does

- Runs scheduled or manual scans via the Nessus API.
- Processes scan results and extracts asset and vulnerability data.
- Uses a custom AI-based risk metric (LEV) to triage findings into: 🚨 expert review, ✅ self-healing, or 🕵️ monitoring (see the sketch at the end of this listing).
- Automatically sends email alerts for critical CVEs.
- Exports daily summaries to Google Sheets (or your own BI system).
- Maps to the NIST CSF functions (Identify, Protect, Detect, Respond, Recover).

🧰 How to set up

1. Nessus: Add your Nessus API credentials and instance URL.
2. Google Sheets: Authenticate your Google account.
3. OpenAI / LLM: Use your API key if adding LLM triage or rewrite prompts.
4. Email: Update SMTP credentials and the alert recipient address.
5. Set your targets: Adjust asset ranges or scan UUIDs as needed.

⚠️ All setup steps are explained in sticky notes inside the workflow.

📋 Requirements

- Nessus Essentials (free) or Nessus Pro with API access.
- An SMTP service (e.g., Gmail, Mailgun, SendGrid).
- Google Sheets OAuth2 credentials.
- Optional: OpenAI or another LLM provider for LEV scoring and CVE insights.

🛠 How to customize the workflow

- Swap Google Sheets for Airtable, Supabase, or PostgreSQL.
- Change the scan logic or asset list to fit your internal network scope.
- Adjust the AI scoring logic to match internal CVSS thresholds or KEV tags.
- Expand the alerting logic to include Slack, Discord, or webhook triggers.

🔒 No sensitive data included. All credentials and sheet links are placeholders.
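To illustrate the three-way triage, here is a hedged sketch of the branching a finding might pass through. The actual LEV formula lives inside the workflow, so the threshold values and field names here are assumptions for illustration only:

```typescript
// Hypothetical per-finding record; lev is the AI-based risk score.
interface Finding { cve: string; cvss: number; lev: number; onKevList: boolean }

type TriageBucket = "expert-review" | "self-healing" | "monitoring";

// Illustrative thresholds; tune against your internal CVSS/KEV policy.
function triage(f: Finding): TriageBucket {
  if (f.onKevList || f.lev >= 0.7) return "expert-review"; // 🚨 critical path
  if (f.lev >= 0.3) return "self-healing";                 // ✅ automated fix
  return "monitoring";                                     // 🕵️ watch list
}
```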