by Tushar Mishra
This n8n workflow automatically monitors RSS feeds for the latest AI vulnerability news, extracts key threat details, and creates a corresponding Security Incident in ServiceNow for each item.

- **Schedule Trigger** – Runs at scheduled intervals to check for updates.
- **RSS Read** – Fetches the latest AI vulnerability entries from the RSS feed.
- **Read URL Content** – Retrieves the full article for detailed analysis.
- **Information Extractor (OpenAI Chat Model)** – Parses and summarizes critical security information.
- **Split Out** – Processes each vulnerability alert separately.
- **Create Incident** – Generates a ServiceNow Security Incident with the extracted details.

Ideal for security teams that want to track and respond quickly to emerging AI-related threats without manually monitoring feeds.
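As a sketch of the mapping between the extracted details and the incident, a Code node sitting before Create Incident might shape the payload like this. The input field names (`title`, `summary`, `link`, `severity`) are assumptions about the Information Extractor's output, not taken from the template itself:

```javascript
// Hypothetical mapping from one extracted vulnerability item to a
// ServiceNow incident payload. Input field names are assumptions
// about what the Information Extractor returns.
function toIncident(item) {
  return {
    short_description: `AI vulnerability: ${item.title}`,
    description: `${item.summary}\n\nSource: ${item.link}`,
    // ServiceNow urgency scale: 1 = High, 2 = Medium
    urgency: item.severity === 'critical' ? '1' : '2',
    category: 'security',
  };
}

// In an n8n Code node, this would typically be applied per item:
// return $input.all().map(i => ({ json: toIncident(i.json) }));
```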
by Yaron Been
Replace manual task prioritization with intelligent AI reasoning that thinks like a Chief Operating Officer. This workflow automatically fetches your Asana tasks every morning, analyzes them using advanced AI models, and delivers the single most critical task with detailed reasoning, ensuring your team always focuses on what matters most.

✨ What This Workflow Does:

- **📋 Automated Task Collection**: Fetches all assigned Asana tasks daily at 9 AM
- **🤖 AI-Powered Analysis**: Uses OpenAI GPT-4 to evaluate urgency, impact, and strategic importance
- **🎯 Smart Prioritization**: Identifies the #1 most critical task with detailed reasoning
- **🧠 Contextual Memory**: Leverages a vector database for historical context and pattern recognition
- **💾 Structured Storage**: Saves prioritized tasks to PostgreSQL with a full audit trail
- **🔄 Continuous Learning**: Builds organizational knowledge over time for better decisions

🔧 Key Features:

- **Daily automation** with zero manual intervention
- **Context-aware AI** that learns from past prioritization decisions
- **Strategic reasoning** explaining why each task is prioritized
- **Vector-powered memory** using Pinecone for intelligent context retrieval
- **Clean structured output** with task names, priority levels, and detailed justifications
- **Database integration** for reporting and historical analysis

📋 Prerequisites:

- Asana account with API access
- OpenAI API key (GPT-4 recommended)
- PostgreSQL database
- Pinecone account (for vector storage and context)

🎯 Perfect For:

- Operations teams managing multiple competing priorities
- Startups needing systematic task management
- Project managers juggling complex workflows
- Leadership teams requiring strategic focus
- Any organization wanting AI-driven operational intelligence

💡 How It Works:

1. **Morning Automation**: Triggers every day at 9 AM
2. **Data Collection**: Pulls all relevant tasks from Asana
3. **AI Analysis**: Evaluates each task using COO-level strategic thinking
4. **Context Retrieval**: Searches the vector database for similar past tasks
5. **Smart Prioritization**: Identifies the single most important task
6. **Structured Output**: Delivers the priority level with detailed reasoning
7. **Data Storage**: Saves results for reporting and continuous improvement

📦 What You Get:

- Complete n8n workflow with all AI components configured
- PostgreSQL database schema for task storage
- Vector database setup for contextual intelligence
- Comprehensive documentation and setup guide
- Sample task data and output examples

💡 Need Help or Want to Learn More?

Created by Yaron Been - Automation & AI Specialist
📧 Support: Yaron@nofluff.online
🎥 YouTube Tutorials: https://www.youtube.com/@YaronBeen/videos
💼 LinkedIn: https://www.linkedin.com/in/yaronbeen/

Discover more advanced automation workflows and AI integration tutorials on my channels!

🏷️ Tags: AI, OpenAI, Asana, Task Management, COO, Prioritization, Automation, Vector Database, Operations, GPT-4
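As a rough illustration of the prioritization step's output shape: the real ranking is done by GPT-4, but a stand-in scorer with assumed `urgency` and `impact` fields might produce the same structured result:

```javascript
// Illustrative stand-in for the AI prioritization step. The actual
// workflow delegates this judgment to GPT-4; field names and the
// scoring formula here are assumptions for demonstration only.
function topTask(tasks) {
  const scored = tasks.map(t => ({ ...t, score: t.urgency * 2 + t.impact }));
  scored.sort((a, b) => b.score - a.score); // highest score first
  const best = scored[0];
  return {
    task_name: best.name,
    priority: 'P1',
    reasoning: `Highest combined urgency/impact score (${best.score}).`,
  };
}
```

The structured `{ task_name, priority, reasoning }` shape is what gets written to PostgreSQL for the audit trail.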
by Friedemann Schuetz
Welcome to my Automated Image Metadata Tagging Workflow!

This workflow automatically analyzes the image content with the help of AI and writes the result directly back into the image file as keywords.

The workflow runs in the following sequence:

1. Google Drive trigger (scans for new files added to a specific folder)
2. Download the added image file
3. Analyze the content of the image and extract the file as Base64 code
4. Merge the metadata and the Base64 code
5. Code node to write the keywords into the metadata (dc:subject)
6. Convert to file and update the original file in the Google Drive folder

The following access is required for the workflow:

- Google Drive: Documentation
- AI API access (e.g. via OpenAI, Anthropic, Google, or Ollama)

You can contact me via LinkedIn if you have any questions: https://www.linkedin.com/in/friedemann-schuetz
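For context on step 5: XMP stores keywords in `dc:subject` as an `rdf:Bag` of `rdf:li` entries, so the Code node effectively builds a fragment like the one below and splices it into the image's XMP packet. This is a minimal sketch of the element structure only, not the template's actual Code node:

```javascript
// Builds the dc:subject fragment XMP uses for keywords.
// Locating/creating the XMP packet inside the image file is omitted;
// this only demonstrates the element structure being written.
function buildDcSubject(keywords) {
  const items = keywords.map(k => `<rdf:li>${k}</rdf:li>`).join('');
  return `<dc:subject><rdf:Bag>${items}</rdf:Bag></dc:subject>`;
}
```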
by Friedemann Schuetz
Welcome to my Automated Image Metadata Tagging Workflow!

DISCLAIMER: This workflow only works with self-hosted n8n instances! You have to install the n8n-nodes-exif-data Community Node!

This workflow automatically analyzes the image content with the help of AI and writes the result directly back into the image file as keywords (see also: https://n8n.io/workflows/2995).

This workflow has the following steps:

1. Google Drive trigger (scans for new files added to a specific folder)
2. Download the added image file
3. Analyze the content of the image
4. Merge the metadata and the image file
5. Write the keywords into the metadata (dc:subject/keywords) and create a new image file
6. Update the original file in the Google Drive folder

The following access is required for the workflow:

- The n8n-nodes-exif-data Community Node must be installed
- Google Drive: Documentation
- AI API access (e.g. via OpenAI, Anthropic, Google, or Ollama)

You can contact me via LinkedIn if you have any questions: https://www.linkedin.com/in/friedemann-schuetz
by Samir Saci
Tags: Supply Chain, Logistics, AI Agents

Context

Hey! I'm Samir, a Supply Chain Data Scientist from Paris and the founder of LogiGreen Consulting. We design tools to help companies improve their logistics processes using data analytics, AI, and automation, to reduce costs and minimize environmental impacts.

> Let's use n8n to improve logistics operations!

📬 For business inquiries, you can add me on LinkedIn

Who is this template for?

This workflow template is designed for logistics or manufacturing operations that receive orders by email. The example above illustrates the challenge we want to tackle: using an AI Agent to parse the order information and load it into a Google Sheet.

If you want to understand how I built this workflow, check my detailed tutorial: 🎥 Step-by-Step Tutorial

How does it work?

The workflow is connected to a Gmail Trigger that opens all emails that include "Inbound Order" in their subject. Each email is parsed by an AI Agent equipped with OpenAI's GPT to collect all the information. The results are recorded in a Google Sheet. These order lines can then be transferred to warehouse teams to prepare order receiving.

What do I need to get started?

You'll need:

- Gmail and Google Drive accounts with API credentials to access them via n8n
- An OpenAI API key (GPT-4o) for the chat model
- A Google Sheet with these columns: PO_NUMBER, EXPECTED_DELIVERY_DATE, SKU_ID, QUANTITY

Next Steps

Follow the sticky notes in the workflow to configure each node and start using AI to support your logistics operations. 🚀

Curious how n8n can transform your logistics operations? 📬 Let's connect on LinkedIn

Notes

- An example email is included in the template so you can try it with your mailbox.
- This workflow was built using n8n version 1.82.1

Submitted: March 28, 2025
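To make the Google Sheet step concrete, here is a sketch of flattening one parsed order into rows matching the columns above. The input structure (`po_number`, `expected_delivery_date`, a `lines` array) is an assumption about the AI Agent's output format, not taken from the template:

```javascript
// Flattens one parsed purchase order into one sheet row per order line.
// The order object's shape is an assumed AI Agent output format.
function toRows(order) {
  return order.lines.map(line => ({
    PO_NUMBER: order.po_number,
    EXPECTED_DELIVERY_DATE: order.expected_delivery_date,
    SKU_ID: line.sku_id,
    QUANTITY: line.quantity,
  }));
}
```

Each returned object maps one-to-one onto a row appended by the Google Sheets node.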
by Paul Taylor
📩 Gmail → GPT → Supabase | Task Extractor

This n8n workflow automates the extraction of actionable tasks from unread Gmail messages using OpenAI's GPT API, stores the resulting task metadata in Supabase, and avoids re-processing previously handled emails.

✅ What It Does

1. Triggers on a schedule to check for unread emails in your Gmail inbox.
2. Loops through each email individually using SplitInBatches.
3. Checks Supabase to see if the email has already been processed.
4. If it's a new email:
   - Formats the email content into a structured GPT prompt
   - Calls GPT-4o to extract structured task data
   - Inserts the result into your emails table in Supabase

🧰 Prerequisites

Before using this workflow, you must have:

- An active n8n Cloud or self-hosted instance
- A connected Gmail account with OAuth credentials in n8n
- A Supabase project with an emails table and:

      ALTER TABLE emails ADD CONSTRAINT unique_email_id UNIQUE (email_id);

- An OpenAI API key with access to GPT-4o or GPT-3.5-turbo

🔐 Required Credentials

| Name | Type | Description |
|-----------------|------------|-----------------------------------|
| Gmail OAuth | Gmail | To pull unread messages |
| OpenAI API Key | OpenAI | To generate task summaries |
| Supabase API | HTTP | For inserting rows via REST API |

🔁 Environment Variables or Replacements

- Supabase_TaskManagement_URI → e.g., https://your-project.supabase.co
- Supabase_TaskManagement_ANON_KEY → Your Supabase anon key

These are used in the HTTP request to Supabase.

⏰ Scheduling / Trigger

- Triggered using a Schedule node
- Default: every X minutes (adjust to your preference)
- Uses a Gmail API filter: unread emails with label = INBOX

🧠 Intended Use Case

> Designed for productivity-minded professionals who want to extract, summarize, and store actionable tasks from incoming email, without processing the same email twice or wasting GPT API credits.

This is part of a larger system integrating GPT, calendar scheduling, and optional task platforms (like ClickUp).
📦 Output (Stored in Supabase)

Each processed email includes:

- email_id
- subject
- sender
- received_at
- body (email snippet)
- gpt_summary (structured task)
- requires_deep_work (from GPT logic)
- deleted (initially false)
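One way the duplicate check and the insert can be combined into a single HTTP request: Supabase's REST layer (PostgREST) accepts an `on_conflict` query parameter plus a `Prefer: resolution=ignore-duplicates` header, which makes the insert a no-op when the `unique_email_id` constraint already holds the row. A sketch of the request the HTTP node would send, with `baseUrl` and `anonKey` standing in for the variables listed above:

```javascript
// Builds an idempotent Supabase insert request for the emails table.
// on_conflict + the Prefer header mean "ignore if email_id already
// exists", which is what prevents reprocessing and wasted GPT calls.
function buildSupabaseRequest(baseUrl, anonKey, row) {
  return {
    url: `${baseUrl}/rest/v1/emails?on_conflict=email_id`,
    method: 'POST',
    headers: {
      apikey: anonKey,
      Authorization: `Bearer ${anonKey}`,
      'Content-Type': 'application/json',
      Prefer: 'resolution=ignore-duplicates,return=minimal',
    },
    body: JSON.stringify(row),
  };
}
```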
by MRJ
Modular Hazard Analysis Workflow: Free Version

Business Value Proposition

Accelerates ISO 26262 compliance for automotive/industrial systems by automating safety analysis while maintaining rigorous audit standards.

📈 Key Benefits

- Time: Instant report generation vs. weeks of documentation for HAZOP
- Risk Mitigation: Pre-validated templates reduce human error

Quick Guide

1. Input a systems_description file to the workflow
2. Provide an OPENAI_API_KEY to the chat model. You can also replace the chat model with a model of your choice.

⏯️ Running the Workflow

Refer to the GitHub repo for a detailed explanation of how the workflow can be used.

📧 Contact

For collaboration proposals or security issues, contact me by email.

⚠️ Validation & Limitations

AI-Assisted Analysis Considerations

| Advantage | Mitigation Strategy | Implementation Example |
|-----------|---------------------|------------------------|
| Rapid hazard identification | Human validation layer | Manual review nodes in workflow |
| Consistent S/E/C scoring | Rule-based validation | ASIL-D → Redundancy check |
| Edge case coverage | Cross-reference with historical data | Integration with incident databases |
by jason
If you have made some investments in cryptocurrency, this workflow will allow you to create an Airtable base that updates the value of your portfolio every hour, so you can track how well your investments are doing. You can check out my Airtable base to see how it works, or even copy my base so that you can customize this workflow for yourself. To implement this workflow, you will need to update the Airtable nodes with your own credentials and make sure that they are pointing to your Airtable base.
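The hourly update boils down to quantity × latest price per holding. A minimal sketch of that valuation step (the field names mirror a typical Airtable base of this kind and are assumptions, not the template's actual schema):

```javascript
// Values a set of holdings against a symbol → price map.
// Unknown symbols are valued at 0 rather than throwing, so a missing
// price from the exchange API does not break the hourly run.
function portfolioValue(holdings, prices) {
  return holdings.reduce(
    (sum, h) => sum + h.quantity * (prices[h.symbol] || 0),
    0
  );
}
```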
by Harshil Agrawal
This workflow demonstrates how to use noItemsLeft to check if there are items left to be processed by the SplitInBatches node.

- Function node: Generates mock data for the workflow. Replace it with the node whose data you want to split into batches.
- SplitInBatches node: Splits the data with a batch size of 1. Set the Batch Size value based on your use case.
- IF node: Checks whether all the data from the SplitInBatches node has been processed. It uses the expression {{$node["SplitInBatches"].context["noItemsLeft"]}}, which returns a boolean value. If there is data yet to be processed, the expression returns false; otherwise it returns true.
- Set node: Prints the message "No Items Left".

Based on your use case, connect the false output of the IF node to the input of the node you want to execute after the data is processed by the SplitInBatches node.
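For reference, the Function node's mock data can be as small as a few items; each array entry becomes one n8n item (in the `{ json: {...} }` shape) for SplitInBatches to emit one at a time. A minimal stand-in, with the `id` field chosen purely for illustration:

```javascript
// Generates n mock items in n8n's item shape ({ json: {...} }).
// In the workflow, SplitInBatches then processes these one per batch.
function mockItems(n) {
  return Array.from({ length: n }, (_, i) => ({ json: { id: i + 1 } }));
}

// In a Function node this would simply be:
// return mockItems(3);
```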
by Harshil Agrawal
This workflow allows you to add positive feedback messages to a table in Notion.

Prerequisites

- Create a Typeform that contains a Long Text field question type to accept feedback from users.
- Get your Typeform credentials by following the steps mentioned in the documentation.
- Follow the steps mentioned in the documentation to create credentials for Google Cloud Natural Language.
- Create a page on Notion similar to this page.
- Create credentials for the Notion node by following the steps in the documentation.
- Follow the steps mentioned in the documentation to create credentials for Slack.
- Follow the steps mentioned in the documentation to create credentials for Trello.

Nodes

- Typeform Trigger node: Whenever a user submits a response to the Typeform, this node triggers the workflow. It returns the response that the user submitted in the form.
- Google Cloud Natural Language node: Analyzes the sentiment of the response the user provided and returns a score.
- IF node: Uses the score provided by the Google Cloud Natural Language node and checks if the score is positive (larger than 0). If the score is positive, the result is true; otherwise it is false.
- Notion node: Connected to the true branch of the IF node. Adds the positive feedback shared by the user to a table in Notion.
- Slack node: Shares the positive feedback, along with the score and username, to a channel in Slack.
- Trello node: If the score is negative, the Trello node is executed. It creates a card on Trello with the feedback from the user.
by David Olusola
How it works

1. When you fill out the form with business challenges and requirements, GPT-4 analyzes the input and generates a customized proposal using your template.
2. The system automatically creates a Google Slides presentation with personalized content.
3. A professional proposal email is sent directly to the prospect with the presentation link.

Set up steps

Estimated time: 15-20 minutes

1. Connect your OpenAI API key for GPT-4 access
2. Link your Google account for Slides and Gmail integration
3. Create your proposal template in Google Slides with placeholder variables
4. Customize the AI prompt and email template with your branding
5. Test with sample data and activate the workflow
by n8n Team
This workflow is designed to compare two datasets (Dataset 1 and Dataset 2) based on a common field, "fruit," and provide insights into the differences. Here are the steps:

1. Manual Trigger: The workflow begins when a user clicks "Execute Workflow."
2. Dataset 1: This node generates the first dataset containing information about fruits, such as apple, orange, grape, strawberry, and banana, along with their colors.
3. Dataset 2: This node generates the second dataset, also containing information about fruits, but with some variations in color. For example, it includes a "kiwi" with the color "mostly green."
4. Compare Datasets: This node takes both datasets and compares them based on the "fruit" field, identifying any differences or matches between the two.

In summary, this workflow compares two datasets of fruits and their colors, identifies differences, and provides guidance on how to explore the comparison results.
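In plain JavaScript, the comparison the Compare Datasets node performs on the "fruit" field is roughly the following. This is a simplified sketch of the node's observable behavior (same / different / only-in-one buckets), not its actual implementation:

```javascript
// Compares two datasets keyed on "fruit", sorting records into:
// same (identical color), different (color mismatch),
// onlyIn1 / onlyIn2 (fruit present in just one dataset).
function compareByFruit(ds1, ds2) {
  const map2 = new Map(ds2.map(r => [r.fruit, r]));
  const result = { same: [], different: [], onlyIn1: [], onlyIn2: [] };
  for (const r of ds1) {
    const other = map2.get(r.fruit);
    if (!other) {
      result.onlyIn1.push(r);
    } else {
      if (other.color === r.color) result.same.push(r);
      else result.different.push({ fruit: r.fruit, color1: r.color, color2: other.color });
      map2.delete(r.fruit); // consumed; whatever remains is only in ds2
    }
  }
  result.onlyIn2 = [...map2.values()];
  return result;
}
```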