by Rizky Febriyan
**How It Works**

This workflow automates the analysis of security alerts from Sophos Central, turning raw events into actionable intelligence. It uses the official Sophos SIEM integration tool to fetch data, enriches it with VirusTotal, and leverages Google Gemini to provide a real-time threat summary and mitigation plan via Telegram.

**Prerequisite (Important):** This workflow is triggered by a webhook that receives data from an external Python script. You must first set up the Sophos-Central-SIEM-Integration script from the official Sophos GitHub. This script fetches data and forwards it to your n8n webhook URL. Tool source code: Sophos/Sophos-Central-SIEM-Integration

**The n8n Workflow Steps**

- **Webhook:** Receives enriched event and alert data from the external Python script.
- **IF (Filter):** Immediately filters the incoming data so that only events with a high or critical severity are processed, reducing noise from low-priority alerts.
- **Code (Prepare Indicator):** Inspects the Sophos event data to extract the primary threat indicator, prioritizing indicators in the following order: file hash (SHA256), URL/domain, then source IP. A minimal sketch of this step follows the setup instructions below.
- **HTTP Request (VirusTotal):** The extracted indicator is sent to the VirusTotal API to get a detailed reputation report, including how many security vendors flagged it as malicious.
- **Code (Prompt for Gemini):** The raw JSON output from VirusTotal is processed into a clean, human-readable summary and a detailed list of flagging vendors.
- **AI Agent (Google Gemini):** All collected data (the original Sophos log, the full alert details, and the formatted VirusTotal reputation) is compiled into a detailed prompt for Gemini. The AI acts as a virtual SOC analyst to create a concise incident summary, determine the risk level, and provide a list of concrete, actionable mitigation steps.
- **Telegram:** The complete analysis and mitigation plan from Gemini is formatted into a clean, easy-to-read message and sent to your specified Telegram chat.

**Setup Instructions**

1. Configure the external Python script to forward events to this workflow's Production URL.
2. In n8n, create credentials for Google Gemini, VirusTotal, and Telegram.
3. Assign the newly created credentials to the corresponding nodes in the workflow.
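Below is a minimal sketch of what the Code (Prepare Indicator) node could look like as an n8n JavaScript Code node. The Sophos field names (`sha256`, `url`, `source_ip`) are assumptions for illustration; adjust them to match the payload your SIEM script actually forwards.

```javascript
// n8n Code node (Run Once for Each Item) - illustrative sketch only.
// Picks the highest-priority indicator available in the incoming event.
const event = $json;

let indicator = null;
let type = null;

if (event.sha256) {            // assumed field name for the file hash
  indicator = event.sha256;
  type = 'file_hash';
} else if (event.url) {        // assumed field name for a URL/domain
  indicator = event.url;
  type = 'url';
} else if (event.source_ip) {  // assumed field name for the source IP
  indicator = event.source_ip;
  type = 'ip';
}

return {
  json: {
    ...event,
    indicator,
    indicator_type: type,
  },
};
```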
by Pedro Santos
🎥 Summarize YouTube Videos using SearchApi & LLM

**Who is this for?**
This workflow is ideal for content creators, students, digital marketers, educators, and researchers who want to quickly summarize YouTube videos.

**What problem does this workflow solve?**
Manually extracting important information from lengthy YouTube videos is tedious and prone to errors. This workflow streamlines the process by automatically fetching video transcripts using SearchApi.io and producing concise, informative summaries through a summarization chain powered by any LLM provider. This allows users to quickly access crucial information without the need for manual transcription or detailed viewing.

**What this workflow does**
- Fetches the complete transcript of a YouTube video using SearchApi.
- Combines the retrieved transcript into a single, continuous text (a sketch of this step appears after the example usage below).
- Uses a Summarization Chain with an LLM (e.g., OpenRouter models) to create a concise summary of the video content.

**Setup**
1. Install the SearchApi community node: open Settings → Community Nodes inside your self-hosted n8n instance, fill npm Package Name with @searchapi/n8n-nodes-searchapi, accept the risk prompt, and hit Install. It should now appear as a node when you search for it.
2. API configuration: set up your SearchApi.io credentials in n8n and add your preferred LLM provider credentials (e.g., OpenRouter API).
3. Input requirements: provide the YouTube video ID (e.g., wBuULAoJxok).
4. Connect LLM integration: configure the summarization chain with your chosen model and parameters for text splitting.

**How to customize this workflow to meet your needs**
- Adjust the summarization model or modify text-splitter parameters to accommodate different lengths and complexities of video transcripts.
- Integrate additional nodes to export summaries directly into your preferred tools, such as Google Drive, Slack, or email.
- Customize prompt templates in the summarization chain to obtain various summary styles (bullet points, paragraphs, etc.).
- Modify the trigger to suit your workflow.

**Example Usage**
Input: YouTube video ID (wBuULAoJxok). Output: a concise, actionable summary that highlights key ideas, recommendations, and insights from the video.
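A minimal sketch of the "combine transcript" step as an n8n Code node. It assumes the SearchApi node returns an array of transcript segments with a `text` field on each segment; adjust the field names to whatever the node actually outputs in your instance.

```javascript
// n8n Code node (Run Once for All Items) - illustrative sketch only.
// Joins all transcript segments into one continuous string for the summarization chain.
const items = $input.all();

const fullTranscript = items
  .flatMap((item) => item.json.transcripts ?? [])   // assumed field holding segments
  .map((segment) => (segment.text ?? '').trim())    // assumed per-segment text field
  .filter((text) => text.length > 0)
  .join(' ');

return [{ json: { transcript: fullTranscript } }];
```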
by Kunsh
A streamlined AI-powered tool that extracts actionable technical insights from HackerOne security reports for advanced bug bounty hunters.

**How It Works**
Send any HackerOne report URL (e.g., https://hackerone.com/reports/123456) to the chat interface. The AI agent will:
- Fetch the report JSON automatically (see the sketch below)
- Analyze it for unique techniques, payloads, and root causes
- Extract reusable insights in a structured format
- Summarize with practical pentesting value

**Setup Requirements**
- Google Gemini API credentials configured
- Chat interface deployed and accessible
- HackerOne report URLs

**Output Format**
- Summary: one-liner impact statement
- Techniques: payloads, code snippets, exploitation steps
- Pro Tips: reusable insights for future hunts

Perfect for rapid triage and building your personal exploit knowledge base.
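A minimal sketch of the fetch step the agent's HTTP tool could perform. It assumes that appending `.json` to a publicly disclosed HackerOne report URL returns the report as JSON; verify this against the reports you care about before relying on it.

```javascript
// Illustrative sketch only: fetch a disclosed HackerOne report as JSON.
async function fetchReport(reportUrl) {
  // e.g. https://hackerone.com/reports/123456 -> https://hackerone.com/reports/123456.json
  const jsonUrl = reportUrl.replace(/\/?$/, '.json');
  const response = await fetch(jsonUrl, { headers: { Accept: 'application/json' } });
  if (!response.ok) {
    throw new Error(`Failed to fetch report: ${response.status}`);
  }
  return response.json();
}

// Usage:
// const report = await fetchReport('https://hackerone.com/reports/123456');
// console.log(report.title); // field names in the JSON are assumptions
```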
by Shahrear
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Transform your expense tracking with automated AI receipt processing that extracts data and organizes it instantly.

**What this workflow does**
- Monitors Google Drive for new receipt uploads (images/PDFs)
- Downloads and processes files automatically
- Extracts key data using the VLM Run community node (merchant, amount, currency, date)
- Saves structured data to Google Sheets for easy tracking

**Setup**
Prerequisites: Google Drive/Sheets accounts, VLM Run API credentials, an n8n instance. You need to install the VLM Run community node: go to Settings → Community Nodes → Install and search for @vlm-run/n8n-nodes-vlmrun.

Quick setup:
1. Configure Google Drive OAuth2 and create a receipt upload folder
2. Add VLM Run API credentials
3. Create a Google Sheet with columns: Customer, Merchant, Amount, Currency, Date
4. Update folder/sheet IDs in the workflow nodes (see the sketch below for the field mapping)
5. Test and activate

**How to customize this workflow to your needs**
Extend functionality by:
- Adding expense categories and approval workflows
- Connecting to accounting software (QuickBooks, Xero)
- Including Slack notifications for processed receipts
- Adding data validation and duplicate detection

This workflow transforms manual receipt processing into an automated system that saves hours while improving accuracy.
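A minimal sketch of a Code node that maps the structured receipt data onto the Google Sheets columns listed above. The field names on the VLM Run output (`customer`, `merchant_name`, `total`, `currency`, `date`) are assumptions; map them to whatever the community node actually returns.

```javascript
// n8n Code node - illustrative sketch only.
// Shapes the extracted receipt fields into one row per receipt for the Google Sheets node.
const receipt = $json;  // output of the VLM Run node

return {
  json: {
    Customer: receipt.customer ?? '',        // assumed field name
    Merchant: receipt.merchant_name ?? '',   // assumed field name
    Amount: receipt.total ?? null,           // assumed field name
    Currency: receipt.currency ?? '',        // assumed field name
    Date: receipt.date ?? '',                // assumed field name
  },
};
```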
by Lucas Peyrin
**How it works**

This template launches your very first AI Agent: an AI-powered chatbot that can do more than just talk, because it can take action using tools. Think of an AI Agent as a smart assistant, and the tools as the apps on its phone. By connecting it to other nodes, you give your agent the ability to interact with real-world data and services, like checking the weather, fetching news, or even sending emails on your behalf.

This workflow is designed to be the perfect starting point:
- **The Chat Interface:** A Chat Trigger node provides a simple, clean interface for you to talk to your agent.
- **The Brains:** The AI Agent node receives your messages, intelligently decides which tool to use (if any), and formulates a helpful response. Its personality and instructions are fully customizable in the "System Message".
- **The Language Model:** It uses **Google Gemini** to power its reasoning and conversation skills.
- **The Tools:** It comes pre-equipped with two tools to demonstrate its capabilities: Get Weather (fetches real-time weather forecasts) and Get News (reads any RSS feed to get the latest headlines). A sketch of the kind of request a weather tool makes follows the setup steps below.
- **The Memory:** A Conversation Memory node allows the agent to remember the last few messages, enabling natural, follow-up conversations.

**Set up steps**

Setup time: ~2 minutes. You only need one thing to get started: a free Google AI API key.

1. Get your Google AI API key: visit Google AI Studio at aistudio.google.com/app/apikey, click "Create API key in new project", and copy the key that appears.
2. Add your credential in n8n: on the workflow canvas, go to the Connect your model (Google Gemini) node, click the Credential dropdown, select + Create New Credential, paste your API key into the API Key field, and click Save.
3. Start chatting: go to the Example Chat node, click the "Open Chat" button in its parameter panel, and try asking one of the example questions, like "What's the weather in Paris?" or "Get me the latest tech news."

That's it! You now have a fully functional AI Agent. Try adding more tools (like Gmail or Google Calendar) to make it even more powerful.
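For illustration, here is a minimal sketch of the kind of HTTP request a "Get Weather" tool could make. It uses the free Open-Meteo forecast endpoint with hard-coded Paris coordinates as an assumed example; the template's own tool may use a different provider or parameters.

```javascript
// Illustrative sketch only: fetch current weather for Paris from Open-Meteo.
async function getWeather(latitude = 48.85, longitude = 2.35) {
  const url =
    `https://api.open-meteo.com/v1/forecast` +
    `?latitude=${latitude}&longitude=${longitude}&current_weather=true`;
  const response = await fetch(url);
  if (!response.ok) throw new Error(`Weather request failed: ${response.status}`);
  const data = await response.json();
  return data.current_weather; // e.g. { temperature, windspeed, weathercode, ... }
}

// Usage: const weather = await getWeather(); console.log(weather.temperature);
```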
by InfraNodus
**Using knowledge graphs instead of RAG vector stores**

This workflow creates an AI chatbot agent that has access to several knowledge bases at the same time (used as "experts"). These knowledge bases are provided through InfraNodus GraphRAG, which uses knowledge graphs to deliver high-quality responses without the need to set up complex RAG vector store workflows.

The advantages of using GraphRAG instead of standard vector stores are:
- Easy and quick to set up (no complex data import workflows needed)
- A knowledge graph has a holistic view of your knowledge base
- Better retrieval of relations between the document chunks = higher quality responses

**How it works**

This template uses the n8n AI Agent node as an orchestrating agent that decides which tool (knowledge graph) to use based on the user's prompt. Step by step:
1. The user submits a question using the AI chatbot (the n8n chat interface in this case, which can be accessed via a URL or embedded into any website).
2. The AI Agent node checks the list of tools it has access to. Each tool has a description of its knowledge, auto-generated by InfraNodus.
3. The AI agent decides which tool should be used to generate a response. It may reformulate the user's query to be more suitable for the expert.
4. The query is then sent to the InfraNodus HTTP node endpoint, which queries the graph that corresponds to that expert (see the sketch at the end of this description).
5. Each InfraNodus GraphRAG expert provides a rich response that takes the whole context into account, along with a list of relevant statements retrieved using a combination of RAG and GraphRAG.
6. The n8n AI Agent node integrates the responses received from the experts to produce the final answer.
7. The final answer is sent back to the user's chat (or a webhook endpoint).

**How to use**

You need an InfraNodus GraphRAG API account and key to use this workflow.
1. Create an InfraNodus account.
2. Get the API key at https://infranodus.com/api-access and create a Bearer authorization key for the InfraNodus HTTP nodes.
3. Create a separate knowledge graph for each expert (using the PDF / content import options) in InfraNodus.
4. For each graph, go to the workflow and paste the name of the graph into the body name field. Keep the other settings intact or learn more about them on the InfraNodus access points page.
5. Once you add one or more graphs as experts to your flow, add the LLM key to the OpenAI node and launch the workflow.

**Requirements**
- An InfraNodus account and API key
- An OpenAI (or any other LLM) API key

**Customizing this workflow**

You can use this same workflow with a Telegram bot, so you can interact with it using Telegram. Many more customizations are available; check out the complete guide at https://support.noduslabs.com/hc/en-us/articles/20174217658396-Using-InfraNodus-Knowledge-Graphs-as-Experts-for-AI-Chatbot-Agents-in-n8n. A video tutorial with a demo is also available.
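A minimal sketch of the request shape behind the InfraNodus HTTP node for one expert. The endpoint path and the query field name are placeholders (copy the real ones from your InfraNodus access points page); only the Bearer key and the `name` body field holding the graph name are taken from the setup steps above.

```javascript
// Illustrative sketch only: query one InfraNodus "expert" graph over HTTP.
async function askExpert(graphName, userQuery) {
  const response = await fetch('https://infranodus.com/api/PLACEHOLDER_ENDPOINT', { // placeholder path
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.INFRANODUS_API_KEY}`, // key from infranodus.com/api-access
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      name: graphName,   // the graph name, as described in the setup steps
      query: userQuery,  // assumed field name for the reformulated question
    }),
  });
  if (!response.ok) throw new Error(`InfraNodus request failed: ${response.status}`);
  return response.json();
}
```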
by Jordan Lee
This n8n template demonstrates how to use AI as a comprehensive personal assistant with multiple specialized agents. Use cases include email management, scheduling, web search, calculations, and more, all automated through AI coordination.

**Good to know**
- This template integrates multiple AI services through OpenRouter
- Each agent specializes in different tasks (Gmail, Calendar, Search, etc.)
- Memory persistence maintains context across interactions

**How it works**
- The workflow is triggered by Telegram messages (can be replaced with other triggers)
- A router node directs requests to the appropriate specialized agent (a simplified sketch of such routing follows the customization notes below)
- Agents include: Gmail for email management, Calculator for math operations, Google Search for information retrieval, Calendar for scheduling, and Contacts for CRM functions
- The OpenRouter Chat Model coordinates responses
- Final responses are sent back through Telegram

**How to use**
1. Connect your Telegram bot credentials
2. Configure each service with appropriate API keys
3. The system will automatically route requests to the right agent

**Requirements**
- OpenRouter account for AI services
- Telegram bot token
- Google API credentials for relevant services

**Customising this workflow**
- Add more specialized agents as needed
- Replace Telegram with other communication channels
- Adjust routing logic for different use cases
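The template's router is driven by the AI model, but a simplified keyword-based version in an n8n Code node illustrates the idea of routing a Telegram message to one specialized agent. The agent names here are assumptions for illustration.

```javascript
// n8n Code node - illustrative sketch only (the real template lets the LLM pick the agent).
const text = ($json.message?.text ?? '').toLowerCase();

let agent = 'general';                                         // assumed default agent name
if (/\b(email|inbox|mail)\b/.test(text)) agent = 'gmail';
else if (/\b(meeting|schedule|calendar)\b/.test(text)) agent = 'calendar';
else if (/\b(search|look up|find)\b/.test(text)) agent = 'google_search';
else if (/[\d+\-*\/]/.test(text) && /\d/.test(text)) agent = 'calculator';
else if (/\b(contact|phone|crm)\b/.test(text)) agent = 'contacts';

return { json: { agent, text } };
```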
by Emad
This workflow automatically sends you a list of your daily meetings every morning via a Telegram bot.

**Use Cases**
This workflow is useful for anyone who wants to be automatically informed of their daily meetings, especially busy professionals, students, and anyone with a hectic schedule.

**Setup**
- Google Calendar connected to n8n
- A Telegram bot created and connected to n8n
- Your Telegram user ID specified

**Notes**
- You need to replace the placeholder in the Telegram node with your actual Telegram user ID.
- You can customize the formatting of the Telegram message in the JavaScript Code node; a minimal example follows below.
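A minimal sketch of the formatting step in the JavaScript Code node, building one Telegram message from the events returned by the Google Calendar node. It assumes the standard Google Calendar event fields (`summary`, plus `start.dateTime` or `start.date` for all-day events); adjust if your node output differs.

```javascript
// n8n Code node (Run Once for All Items) - illustrative sketch only.
// Builds a single "today's meetings" message for the Telegram node.
const events = $input.all().map((item) => item.json);

const lines = events.map((event) => {
  const start = event.start?.dateTime ?? event.start?.date ?? '';
  const time = start.includes('T')
    ? new Date(start).toLocaleTimeString('en-US', { hour: '2-digit', minute: '2-digit' })
    : 'All day';
  return `• ${time} - ${event.summary ?? '(no title)'}`;
});

const message = lines.length
  ? `Good morning! Your meetings today:\n${lines.join('\n')}`
  : 'Good morning! You have no meetings today.';

return [{ json: { message } }];
```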
by Rodrigue Gbadou
**What this workflow does**

This n8n workflow collects client feedback through a form (Tally, Typeform, or Google Forms) and uses AI to analyze it. It automatically generates a summary of the positive points, highlights areas for improvement, and drafts a short social media post based on the feedback.

Ideal for:
- Freelancers
- Customer support teams
- Online service providers
- Coaches and educators

**Setup steps**
1. Connect your form tool to the Webhook node (POST method) and make sure it sends a feedback field.
2. Add your DeepSeek (or other GPT-compatible) API key to the AI request node.
3. Configure the email node with your SMTP credentials and desired recipient address.
4. Replace the Telegram node with Slack, Buffer, or another integration if you prefer.
5. (Optional) Customize the prompt in the Function node for a different tone/language; a minimal prompt-building sketch follows below.

🕐 Estimated setup time: ~15 minutes
💬 Sticky notes are included and clearly positioned to guide you.

**Technologies used**
- n8n Webhook node
- n8n Function node
- DeepSeek Chat or compatible AI API
- Email node (SMTP)
- Telegram node (or other integration)
- Sticky Notes for setup guidance

**Use cases**
- Analyze feedback from onboarding or satisfaction surveys
- Create ready-to-publish social media content from real customer praise
- Help support or marketing teams act on feedback immediately
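A minimal sketch of the Function node that turns the incoming `feedback` field into a prompt for the AI request node. The exact prompt wording is an assumption; the `feedback` field name comes from the setup step above.

```javascript
// n8n Function/Code node - illustrative sketch only.
// Builds the analysis prompt from the webhook payload.
const feedback = $json.body?.feedback ?? $json.feedback ?? '';

const prompt = [
  'You are analyzing client feedback. Based on the text below:',
  '1. Summarize the positive points.',
  '2. Highlight areas for improvement.',
  '3. Draft a short social media post based on the feedback.',
  '',
  `Feedback: """${feedback}"""`,
].join('\n');

return { json: { prompt, feedback } };
```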
by Abdullah
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**Overview**

This workflow automates the process of transcribing audio files and summarizing them using OpenAI models, with the final output stored neatly in Notion. Whether you're a researcher, content creator, student, or professional, this automation saves time by converting voice recordings into actionable summaries with zero manual effort.

Created by: Abdullah Dilshad
Contact: iamabdullahdilshad@gmail.com

**Who It's For**
This template is ideal for:
- **Researchers**: Transcribe and summarize interviews, lectures, or research recordings.
- **Content Creators**: Convert podcasts or videos into transcripts and social captions/show notes.
- **Students**: Automatically turn lectures or study group audio into summarized notes.
- **Professionals**: Log meeting notes and summaries directly into your Notion workspace.

**How It Works**
This four-step workflow performs the following:
1. **Trigger: New Audio in Google Drive.** Automatically triggers when a new audio file (MP3/WAV) is uploaded to a specified Google Drive folder. The file is then downloaded for processing.
2. **Transcribe Audio with Whisper.** The audio file is sent to OpenAI's Whisper model for high-accuracy transcription.
3. **Summarize Transcript with GPT-4.** The transcript is passed to GPT-4, which generates a clean, concise summary.
4. **Store Summary in Notion.** A new Notion page is created with the generated summary and optional metadata (file name, upload time, etc.).

**Setup Instructions**
1. **Google Drive Trigger**: Connect your Google Drive account and select the folder you want to monitor. This node detects new file uploads and passes the file for download.
2. **Download File**: Downloads the new audio file for transcription.
3. **Transcribe Recording (OpenAI Whisper)**: Connect your OpenAI API key and ensure this node receives the binary audio file. It returns the transcription as plain text.
4. **Summarize Transcript (GPT-4 via AI Agent)**: Use your OpenAI API key and configure a summarization prompt like "Summarize the following transcript in a clear and concise manner:", connecting the output from Whisper into this prompt (see the sketch after these instructions).
5. **Notion Integration**: Connect your Notion account, choose or create a database to store summaries, and map the GPT output (summary) to a "Text" or "Rich Text" property. Optionally include metadata like filename and file upload date.
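A minimal sketch of how the Whisper transcript can be combined with the summarization instruction before it reaches the GPT-4 node. The transcript field name (`text`) is the usual output of the OpenAI transcription step but may differ in your setup.

```javascript
// n8n Code node - illustrative sketch only.
// Wraps the Whisper transcript in the summarization prompt for the GPT-4 / AI Agent node.
const transcript = $json.text ?? '';  // assumed field name for the Whisper output

const prompt =
  'Summarize the following transcript in a clear and concise manner:\n\n' + transcript;

return { json: { prompt } };
```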
by Mirajul Mohin
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Automatically transform your video uploads into AI-powered summaries with key topic extraction and instant team notifications.

**What this workflow does**
- Monitors Google Drive for new video uploads
- Downloads and processes videos using VLM Run AI
- Generates intelligent summaries with key topics extracted
- Posts results to Slack for immediate team access

**Setup**
Prerequisites: Google Drive account, VLM Run API credentials, Slack workspace, self-hosted n8n. You need to install the VLM Run community node.

Quick setup:
1. Configure Google Drive OAuth2 and create a video upload folder
2. Add VLM Run API credentials
3. Set up Slack integration for notifications
4. Update folder/channel IDs in the workflow nodes
5. Test and activate

**Perfect for**
- Meeting recordings and training videos
- Webinar summaries and educational content
- Content analysis and team collaboration
- Any video content requiring quick insights

**Key Benefits**
- **Asynchronous processing** handles large files without timeouts
- **Multi-format support** for MP4, AVI, MOV, WebM, MKV
- **Instant team updates** via Slack notifications
- **Saves hours** of manual video review time

**How to customize**
Extend by adding:
- Video categorization and tagging
- Integration with project management tools
- Email notifications alongside Slack
- Searchable video databases with summaries

This workflow transforms lengthy videos into actionable insights, making your content instantly accessible and shareable with your team. A minimal sketch of the Slack message formatting step follows below.
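A minimal sketch of a Code node that formats the VLM Run summary and key topics into a Slack message. The output field names (`fileName`, `summary`, `topics`) are assumptions; adjust them to the actual VLM Run node output.

```javascript
// n8n Code node - illustrative sketch only.
// Builds the text for the Slack node from the video analysis result.
const result = $json;

const topics = Array.isArray(result.topics) ? result.topics : [];  // assumed field name
const topicLines = topics.map((topic) => `• ${topic}`).join('\n');

const slackText = [
  `*New video processed:* ${result.fileName ?? 'unknown file'}`,   // assumed field name
  '',
  `*Summary:* ${result.summary ?? 'n/a'}`,                         // assumed field name
  '',
  topicLines ? `*Key topics:*\n${topicLines}` : '',
].join('\n').trim();

return { json: { slackText } };
```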
by Rui Borges
This workflow automates time tracking using location-based triggers.

**How it works**
1. Trigger: it starts when you enter or exit a specified location, triggering a shortcut on your iPhone.
2. Webhook: the shortcut sends a request to a webhook in n8n.
3. Check-In/Check-Out: the webhook receives the request and records the time and whether it was a "Check-In" or "Check-Out" event (see the sketch below).
4. Google Sheets: this data is then logged into a Google Sheet, creating a record of your work hours.

**Set up steps**
- Google Drive: connect your Google Drive account.
- Google Sheets: connect your Google Sheets account.
- Webhook: set up a Webhook node in n8n.
- iPhone Shortcuts: create two shortcuts on your iPhone, one for "Check-In" and one for "Check-Out."
- Configure Shortcuts: configure each shortcut to send a request to the webhook with the appropriate "Direction" header.

It's easy to set up, taking around 5 minutes.
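A minimal sketch of a Code node placed after the Webhook node that reads the "Direction" header and shapes the row for the Google Sheets node. Header names typically arrive lowercased in the webhook output; verify against your own execution data.

```javascript
// n8n Code node - illustrative sketch only.
// Turns the incoming shortcut request into a timesheet row.
const direction = $json.headers?.direction ?? 'Unknown';  // "Check-In" or "Check-Out"
const now = new Date();

return {
  json: {
    Date: now.toISOString().slice(0, 10),
    Time: now.toTimeString().slice(0, 8),
    Direction: direction,
  },
};
```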