by Billy Christi
## What this workflow does

This workflow automatically organizes your Gmail inbox by fetching recent emails, analyzing their content with AI, and applying the appropriate Gmail labels based on the results.

Step by step:
1. **Schedule Trigger** runs the workflow automatically at your chosen interval
2. **Gmail Fetch** retrieves the latest emails from your inbox
3. **Loop Over Items** processes each email individually
4. **AI Text Classifier** analyzes email subject and body content to determine the right category
5. **Add Labels** applies the matching Gmail label according to the AI classification
6. **Loop Back** continues until all emails are processed and organized

## How to set up

1. Connect your Gmail account to the Gmail nodes for fetching emails and adding labels
2. Add your OpenAI API key to the OpenAI Chat Model node for AI-powered classification
3. Configure the Schedule Trigger to run at your preferred interval (default: every 5 minutes)
4. Customize email categories in the Label Classifier node based on your organizational needs
5. Set up Gmail labels that match your classification categories in your Gmail account
6. Adjust the time range for fetching emails (default: last 5 minutes) and the email limit (default: 10)
7. Test the workflow with a few sample emails to ensure proper classification and labeling
8. Monitor workflow executions to verify emails are being processed and labeled correctly

## How to customize this workflow to your needs

- **Adjust classification categories**: modify the Label Classifier node to include categories like "Work", "Bills", "Social", "Newsletters", or any custom categories you need
- **Change time intervals**: customize the Schedule Trigger to run hourly, daily, or at specific times based on your email volume
- **Add more label actions**: create additional Gmail label nodes for more granular categorization (urgent, follow-up, archive, etc.)

Need help customizing? Contact me for consulting and support: 📧 billychartanto@gmail.com
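The classification step can be sketched as a prompt builder plus a guard that maps the model's raw answer back onto a known Gmail label. This is an illustrative sketch, not the node's internals; the category names below are example assumptions, so substitute the labels you actually configure in the Label Classifier node.

```python
# Sketch of the AI Text Classifier step. ALLOWED_LABELS is an assumed
# example set -- use the Gmail labels you actually created.
ALLOWED_LABELS = ["Work", "Bills", "Social", "Newsletters", "Other"]

def build_classifier_prompt(subject: str, body: str) -> str:
    """Build the kind of prompt the OpenAI Chat Model node would receive."""
    return (
        "Classify this email into exactly one of: " + ", ".join(ALLOWED_LABELS)
        + f"\nSubject: {subject}\nBody: {body[:500]}"
        + "\nAnswer with the label only."
    )

def normalize_label(model_output: str) -> str:
    """Guard against free-form model answers; fall back to 'Other'."""
    answer = model_output.strip().strip('".')
    return answer if answer in ALLOWED_LABELS else "Other"
```

Guarding the output this way keeps the Add Labels node from ever receiving a label that doesn't exist in your Gmail account.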
by LeeWei
## ⚙️ Trending YouTube Videos Research Workflow

🧑‍💻 Author: LeeWei

Automates scraping trending videos based on a keyword, filters high-potential ones, analyzes thumbnails and transcripts with AI, generates optimized titles and outlines, and updates a Google Sheet for content ideas.

### 🚀 Steps to Connect

1. **Apify API Token**
   - Sign up for a free account at Apify and generate your API token.
   - Paste the token into the two HTTP Request nodes (replace `<token>` in the Authorization header).
   - 💡 This enables scraping YouTube video data and transcripts; setup takes about 5 minutes.
2. **OpenAI API Key**
   - Go to OpenAI and generate your API key.
   - Add it to the credentials for the YouTube Title Generator, Analyze Thumbnail, and Outline Generator nodes.
   - 💡 Use models like GPT-4o-mini for thumbnail analysis and title/outline generation.
3. **Google Sheets Credentials**
   - Set up OAuth2 credentials in n8n for Google Sheets with access to your Drive.
   - Update the `documentId` in the Step 1 Results, Find Duplicate Entries, and Update Rows nodes to your own Google Sheet ID (clone the provided sheet if needed).
   - 💡 This stores filtered video data, AI-generated titles, and outlines; expect 10-15 minutes for auth setup.
4. **(Optional) Customize Form Trigger**
   - If deploying publicly, no changes are needed; the form prompts for "Keyword or Topic" to start the search.
   - Test with a sample keyword like "AI automation" to see results in your sheet.
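For reference, the call those HTTP Request nodes make to Apify can be sketched as below. The request is only built, not sent; the actor ID is a hypothetical placeholder, and the Bearer-header form of authentication should be double-checked against Apify's current API documentation.

```python
import urllib.request

APIFY_TOKEN = "<token>"  # same placeholder the workflow uses

def build_apify_run_request(actor_id: str, token: str) -> urllib.request.Request:
    """Build (without sending) a request to start an Apify actor run."""
    url = f"https://api.apify.com/v2/acts/{actor_id}/runs"
    return urllib.request.Request(
        url,
        headers={"Authorization": f"Bearer {token}"},
        method="POST",
    )

# Hypothetical actor ID -- replace with the scraper actor you actually use.
req = build_apify_run_request("example~youtube-scraper", APIFY_TOKEN)
```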
by Oriol Seguí
This is an n8n workflow designed to implement an advanced AI chatbot with real-time conversation and search capabilities. Configured with a minimalist European design, this chatbot is ready to be integrated into any website.

## What Does This Workflow Do?

The workflow uses a combination of nodes to create a complete chatbot:
- **Chat Trigger**: starts the process when a user sends a message. The configuration includes a customized visual design (minimalist European CSS), welcome messages, and titles.
- **AI Agent**: acts as the chatbot's brain. It coordinates interaction with the language model, memory, and tools to generate intelligent responses.
- **Conversational Memory**: allows the chatbot to remember the context of the conversation, providing a smoother and more coherent experience.
- **Language Model (GPT)**: generates the chat responses.
- **Search Tool**: enables the AI agent to search for information on the web and answer questions it doesn't already know.
- **Respond to Chat**: sends the final response back to the user.

## Use Cases

- **Customer Support**: answers frequently asked questions and transfers complex conversations to a human agent.
- **Virtual Assistant**: provides information about products or services, helps users navigate your website, or completes simple tasks.
- **Content Generator**: serves as an assistant for generating ideas, writing drafts, or summarizing texts.

## Who Is This For?

This workflow is ideal for:
- **Businesses and developers** looking for a versatile and customizable chatbot solution without having to build it from scratch.
- **Business owners** who want to improve customer service and user interaction in an automated way.
- **Curious individuals** and AI enthusiasts who want to explore how chatbots are built and experiment with their own configurations.

This workflow includes detailed documentation that explains how each node works and how to customize it for your needs.
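As a rough illustration of what the Conversational Memory node provides, here is a minimal windowed memory buffer. The window size and message shape are assumptions for the sketch, not n8n's internal representation.

```python
from collections import deque

class WindowMemory:
    """Minimal sketch of conversational memory: keep only the last N turns."""

    def __init__(self, max_turns: int = 5):
        # deque with maxlen silently drops the oldest turn when full
        self.turns = deque(maxlen=max_turns)

    def add(self, user_msg: str, bot_msg: str) -> None:
        self.turns.append({"user": user_msg, "bot": bot_msg})

    def as_context(self) -> str:
        """Render the window as text to prepend to the next model prompt."""
        return "\n".join(f"User: {t['user']}\nBot: {t['bot']}" for t in self.turns)
```

A bounded window like this is what keeps follow-up questions coherent without letting the prompt grow without limit.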
by mariskarthick
🚨 Are alert storms overwhelming your Security Operations workflows?

This n8n workflow supercharges your SOC by fully automating triage, analysis, and notification for Wazuh alerts, blending event-driven automation, OpenAI-powered contextual analysis, and real-time collaboration for incident response.

## 🔑 Key Features

- ✅ **Automated Triage**: instantly filters Wazuh alerts by severity to focus analyst effort on the signals that matter.
- 🤖 **AI-Driven Investigation Reports**: uses OpenAI's GPT-4o-mini to auto-generate context-rich incident reports, including:
  - MITRE tactic & technique mapping
  - Impacted scope (IP addresses, hostnames)
  - External artifact reputation checks
  - Actionable security recommendations
  - Fully customizable prompt format aligned with your SOC playbooks
- 📡 **Multi-Channel Notification**: delivers clean, actionable reports directly to your SOC team via Telegram. Easily extendable to Slack, Outlook, Gmail, Discord, or any other preferred channel.
- 🔇 **Noise Reduction**: eliminates alert fatigue using smart filters and custom AI prompts that suppress false positives and highlight real threats.
- 🔧 **Fully Customizable**: tweak severity thresholds, update prompt logic, or integrate additional data sources and channels, all with minimal effort.

## ⚙️ How It Works

1. **Webhook**: listens for incoming Wazuh alerts in real time.
2. **If Condition**: filters based on severity (1 low, 2 medium, etc.) or other logic you define.
3. **AI Investigation (LangChain + OpenAI)**: summarizes full alert logs and context using custom prompts to generate:
   - Incident Overview
   - Key Indicators
   - Log Analysis
   - Threat Classification
   - Risk Assessment
   - Security Recommendations
4. **Notification Delivery**: the report is parsed, cleaned, and sent to your SOC team in real time, enabling rapid response even during high alert volumes.
5. **No-Op Path**: efficiently discards irrelevant alerts without breaking the flow.

## 🧠 Why n8n + AI?

Traditional alert triage is manual, slow, and error-prone, leading to analyst burnout and missed critical threats. This workflow shows how combining workflow automation with a tailored AI model enables your SOC to shift from reactive to proactive. Analysts can now:
- Focus on critical investigations
- Respond to alerts faster
- Eliminate copy-paste fatigue
- Get instant contextual summaries

> ⚠️ Note: We learned that generic AI isn't enough. Context-rich prompts and alignment with your actual SOC processes are key to meaningful, scalable automation.

🚀 Ready to build a smarter, less stressful SOC? Clone this workflow, adapt it to your processes, and never miss a critical alert again.

📬 Contributions welcome! Feel free to raise PRs, suggest new enhancements, or fork for your own use cases.

Created by Mariskarthick M
Senior Security Analyst | Detection Engineer | Threat Hunter | Open-Source Enthusiast
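The If-node severity gate in this workflow reduces to a one-line predicate. Wazuh rule levels run from 0 to 15; the threshold of 7 below is only an example default, so tune it to your own triage policy.

```python
def should_investigate(alert: dict, min_level: int = 7) -> bool:
    """Pass only alerts at or above the severity threshold.

    Everything below the threshold takes the No-Op path; malformed
    alerts without a rule level are treated as level 0 and dropped.
    """
    return alert.get("rule", {}).get("level", 0) >= min_level
```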
by Maximiliano Rojas-Delgado
# Scale Your Creative Strategy: 100+ Creative Ads from 1 Image using Fal.AI, the Nano Banana Model, and GPT-5.1

This workflow turns a single reference image into up to 100 high-performing ad variations using Fal.AI's Nano Banana model and GPT-5.1. Simply upload your inspiration to a form, and watch as unique, campaign-ready images appear in your Google Drive.

## Why use this?

- **Scale Instantly:** turn one idea into a full campaign without manual design work.
- **Smart Variations:** uses GPT-5.1 to analyze your reference and rewrite prompts based on proven creative recipes (Metaphors, Pointers, Comparisons).
- **Cost Effective:** generates 100 variations for approximately **$4.60** (vs. hundreds of dollars with manual design).

## How does it work?

1. You submit the form with your reference image, company name, and selected "Creative Recipe."
2. GPT-5.1 analyzes your image deeply ("Describe the image extremely comprehensively...").
3. It rewrites the prompt into multiple unique variations based on your chosen strategy (e.g., "The Metaphor Bake").
4. Fal.AI generates the images using the fast and efficient Nano Banana model.
5. The results are saved directly to your Google Drive output folder.

## What do you need?

- **Google Drive credentials:** to upload your reference and save the results.
- **OpenAI API key:** for GPT-5.1 analysis and prompt engineering.
- **Fal.AI API key:** for generating the images (ensure you have credits).
- **Google Drive folders:** one for inputs and one for outputs.

## 💰 Estimated Costs

- **Fal.AI:** ~$4.00 for 100 images ($0.04/image).
- **GPT-5.1:** ~$0.60 for 100 descriptions (input: $1.25/1M tokens, output: $10.00/1M tokens).
- **Total:** ~$4.60 per 100-ad campaign.

## Important: The Prompt Strategy

This automation doesn't just copy your image. It uses a specific instruction to "Describe the image extremely comprehensively. don't leave anything behind". This ensures GPT-5.1 captures every visual detail (lighting, composition, and mood) before rewriting it into new creative angles.

That's it! Just fill in the form and let the automation build your ad campaign. If you have any questions, contact me on X: @maxrojasdelgado.
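The cost estimate above reduces to a simple per-image model, sketched here. The prices are the estimates quoted in this description and will drift as provider pricing changes.

```python
def estimated_cost(n_images: int,
                   fal_per_image: float = 0.04,
                   gpt_per_description: float = 0.006) -> float:
    """~$0.04/image on Fal.AI plus ~$0.60 per 100 GPT-5.1 descriptions."""
    return round(n_images * (fal_per_image + gpt_per_description), 2)
```

This makes it easy to budget smaller test runs before committing to a full 100-ad batch.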
by Yashraj singh sisodiya
# Bakery Data Analytics Workflow Explanation

## Aim

The aim of the Bakery Data Analytics Workflow is to automate the analysis of bakery sales and stock data stored in Google Sheets. It allows bakery owners or managers to interact with an AI assistant via chat and receive clear, concise, and actionable insights about their business performance without manually reviewing spreadsheets.

## Goal

The goal is to:
- Enable users to query bakery sales and stock data through a chat interface.
- Use an AI Agent to interpret user queries and fetch the required data.
- Retrieve relevant sales/stock figures from a Google Sheets dataset.
- Generate insights in plain English, with short summaries, highlights, or breakdowns.
- Maintain conversation context so users can ask follow-up questions naturally.

This ensures that bakery owners can make quick, informed decisions about sales trends, inventory shortages, or product performance with minimal manual effort.

## Requirements

The workflow relies on the following components and configurations:

**n8n Platform**: the automation platform hosting the workflow.

**Nodes**:
- **When chat message received (Trigger)**: captures user input via chat and initiates the workflow execution.
- **AI Agent**: the central reasoning engine. Interprets queries, decides when to fetch data, and ensures professional responses using short, structured insights (bullets, tables, or compact summaries).
- **Simple Memory**: stores short-term conversation history and maintains context across multiple user queries.
- **Retrieve bakery data (Google Sheets)**: connects to a linked Google Sheets file and fetches sales/stock data (e.g., daily totals, item performance). Data source: the bakery Google Sheet.
- **Azure OpenAI Chat Model**: the backend language model powering the AI Agent. Provides natural language understanding and generates concise responses.

**Credentials**:
- **Google Sheets OAuth2 account** (for accessing bakery data).
- **Azure OpenAI API account** (for AI-driven reasoning and conversation).

**Input**: a user question/query via chat (e.g., "What was the best-selling pastry last week?").

**Output**: compact, conversational insights (totals, highlights, trends) delivered via chat.

## API Usage

The workflow integrates two main APIs:
- **Google Sheets API**: used by the Retrieve bakery data node. Fetches structured data (sales, stock, dates) from the bakery dataset and provides the AI Agent with real-time data access.
- **Azure OpenAI API**: used by the Azure OpenAI Chat Model node. Powers natural conversation and ensures responses are plain English, concise, and business-focused, following the AI Agent's rules to avoid assumptions and provide actionable insights only when asked.

## Workflow Summary

The Bakery Data Analytics Workflow automates bakery performance analysis by:
1. Triggering on chat message input.
2. Passing the query to the AI Agent for interpretation.
3. Using Simple Memory to track context across the conversation.
4. Fetching relevant data from Google Sheets when needed.
5. Leveraging Azure OpenAI to generate structured, professional responses.

This creates an interactive AI-powered assistant for bakery data, enabling quick insights into sales and inventory trends without manually combing through spreadsheets.
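To make the data step concrete, here is the kind of aggregation the AI Agent effectively performs after the Retrieve bakery data node returns sheet rows. The column names (`item`, `units_sold`) are assumptions about your sheet layout, so adapt them to your actual headers.

```python
def best_seller(rows: list[dict]) -> str:
    """Total units per item across all rows and return the top seller."""
    totals: dict[str, int] = {}
    for row in rows:
        # Sheet values often arrive as strings, so coerce to int here.
        totals[row["item"]] = totals.get(row["item"], 0) + int(row["units_sold"])
    return max(totals, key=totals.get)

# Example rows shaped like a simple sales sheet.
sales = [
    {"item": "croissant", "units_sold": "12"},
    {"item": "baguette", "units_sold": "7"},
    {"item": "croissant", "units_sold": "9"},
]
```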
by Rahul Joshi
## Description

Deliver instant answers and automate customer support on WhatsApp with this intelligent n8n workflow template! The system routes incoming messages using keyword-based logic and provides dynamic, AI-powered responses for greetings, FAQs, and complex queries, ensuring your customers always get the right reply without manual effort.

This automation is designed for businesses, service providers, and support teams who want to streamline WhatsApp engagement, reduce manual workload, and provide consistent, conversational answers that scale with demand.

## What This Template Does (Step-by-Step)

1. 📲 **Capture Incoming WhatsApp Messages**: triggers on every new message received via the WhatsApp API.
2. 🔄 **Keyword-Based Routing**: sequential IF conditions check for predefined keywords (e.g., "hi", "pricing", "support").
3. 💬 **Send Tailored Keyword Responses**: returns fast, pre-written responses for greetings, FAQs, or common scenarios.
4. 🤖 **AI-Powered Fallback with OpenAI Chat Model**: for advanced or unrecognized queries, the workflow generates context-aware, conversational answers using AI.
5. 🚀 **Deliver Automated Replies in Real Time**: replies are instantly sent back to WhatsApp for seamless customer communication.
6. 📊 **Optional: Conversation Logging**: extend the template to log chats in Notion, Airtable, or your CRM for tracking and insights.

## Perfect For

- Customer support teams handling repetitive queries
- Businesses wanting instant replies for FAQs & greetings
- Service providers delivering personalized, scalable engagement
- Anyone looking to combine rule-based automation with AI intelligence

## Built With

- WhatsApp API (message triggers & replies)
- n8n IF node (keyword routing)
- OpenAI Chat Model (AI fallback for complex queries)
- Extendable storage (Notion, Google Sheets, Airtable, etc.)

## Key Benefits

- ✅ Faster, automated customer support on WhatsApp
- 🔍 Accurate, human-like replies for complex questions
- 🧠 Hybrid system: keyword rules + AI intelligence
- 📒 Centralized chat logging for insights (optional)
- 🛠 100% no-code and customizable in n8n
by vinci-king-01
# Deep Research Agent with AI Analysis and Multi-Source Data Collection

## 🎯 Target Audience

- Market researchers and analysts
- Business intelligence teams
- Academic researchers and students
- Content creators and journalists
- Product managers conducting market research
- Consultants performing competitive analysis
- Data scientists gathering research data
- Marketing teams analyzing industry trends

## 🚀 Problem Statement

Manual research processes are time-consuming, inconsistent, and often miss critical information from multiple sources. This template solves the challenge of automating comprehensive research across web, news, and academic sources while providing AI-powered analysis and actionable insights.

## 🔧 How it Works

This workflow automatically conducts deep research on any topic using AI-powered web scraping, collects data from multiple source types, and provides comprehensive analysis with actionable insights.

### Key Components

1. **Webhook Trigger** - receives research requests and initiates the automated research process
2. **Research Configuration Processor** - validates and processes research parameters and generates search queries
3. **Multi-Source AI Scraping** - uses ScrapeGraphAI to collect data from web, news, and academic sources
4. **Data Processing Engine** - combines and structures data from all sources for analysis
5. **AI Research Analyst** - uses GPT-4 to provide comprehensive analysis and insights
6. **Data Storage** - stores all research findings in Google Sheets for historical tracking
7. **Response System** - returns structured research results via webhook response

## 📊 Google Sheets Column Specifications

The template creates the following columns in your Google Sheet:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| sessionId | String | Unique research session identifier | "research_1703123456789" |
| query | String | Research query that was executed | "artificial intelligence trends" |
| timestamp | DateTime | When the research was conducted | "2024-01-15T10:30:00Z" |
| analysis | Text | AI-generated comprehensive analysis | "Executive Summary: AI trends show..." |
| totalSources | Number | Total number of sources analyzed | 15 |

## 🛠️ Setup Instructions

Estimated setup time: 20-25 minutes

### Prerequisites

- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- OpenAI API account and credentials
- Google Sheets account with API access

### Step-by-Step Configuration

1. **Install Community Nodes**: install the required community node with `npm install n8n-nodes-scrapegraphai`
2. **Configure ScrapeGraphAI Credentials**
   - Navigate to Credentials in your n8n instance
   - Add new ScrapeGraphAI API credentials
   - Enter your API key from the ScrapeGraphAI dashboard
   - Test the connection to ensure it's working
3. **Set up OpenAI Credentials**
   - Add OpenAI API credentials
   - Enter your API key from the OpenAI dashboard
   - Ensure you have access to the GPT-4 model
   - Test the connection to verify API access
4. **Set up Google Sheets Connection**
   - Add Google Sheets OAuth2 credentials
   - Grant the necessary permissions for spreadsheet access
   - Create a new spreadsheet for research data
   - Configure the sheet name (default: "Research_Data")
5. **Configure Research Parameters**
   - Update the webhook endpoint URL
   - Customize default research parameters in the configuration processor
   - Set appropriate search query generation logic
   - Configure research depth levels (basic, detailed, comprehensive)
6. **Test the Workflow**
   - Send a test webhook request with research parameters
   - Verify data collection from all source types
   - Check Google Sheets for proper data storage
   - Validate AI analysis output quality

## 🔄 Workflow Customization Options

**Modify Research Sources**
- Add or remove source types (web, news, academic)
- Customize search queries for specific industries
- Adjust source credibility scoring algorithms
- Implement custom data extraction patterns

**Extend Analysis Capabilities**
- Add industry-specific analysis frameworks
- Implement comparative analysis between sources
- Create custom insight generation rules
- Add sentiment analysis for news sources

**Customize Data Storage**
- Add more detailed metadata tracking
- Implement research versioning and history
- Create multiple sheet tabs for different research types
- Add data export capabilities

**Output Customization**
- Create custom response formats
- Add research summary generation
- Implement citation and source tracking
- Create executive dashboard integration

## 📈 Use Cases

- **Market Research**: comprehensive industry and competitor analysis
- **Academic Research**: literature reviews and citation gathering
- **Content Creation**: research for articles, reports, and presentations
- **Business Intelligence**: strategic decision-making support
- **Product Development**: market validation and trend analysis
- **Investment Research**: due diligence and market analysis

## 🚨 Important Notes

- Respect website terms of service and robots.txt files
- Implement appropriate delays between requests to avoid rate limiting
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Consider data privacy and compliance requirements
- Validate research findings from multiple sources

## 🔧 Troubleshooting

Common issues:
- **ScrapeGraphAI connection errors**: verify API key and account status
- **OpenAI API errors**: check API key and model access permissions
- **Google Sheets permission errors**: check OAuth2 scope and permissions
- **Research data quality issues**: review the search query generation logic
- **Rate limiting**: adjust request frequency and implement delays
- **Webhook response errors**: check response format and content

Support resources:
- ScrapeGraphAI documentation and API reference
- OpenAI API documentation and model specifications
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
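A row destined for the Research_Data sheet, following the column specification in this template, can be assembled as below. This is a sketch of the shape only; the real workflow builds the row inside the Data Storage node.

```python
import time

def build_research_row(session_id: str, query: str,
                       analysis: str, total_sources: int) -> dict:
    """Assemble one Research_Data row matching the documented columns."""
    return {
        "sessionId": session_id,
        "query": query,
        # ISO-8601 UTC timestamp, matching the example column values
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "analysis": analysis,
        "totalSources": total_sources,
    }
```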
by Adrian Bent
Part two of the Indeed Job Scraper, Filter, and Enrichment workflow. This workflow takes information about the scraped and filtered Indeed job listings (collected via Apify and stored in Google Sheets) and generates a customized, five-line email icebreaker that implies the rest of the email is personalized. Personalized IJSFE (Indeed Job Scraper For Enrichment). ++ I am an engineering student, so I love my acronyms.

## Benefits

- **Instant Icebreaker Generation** - converts hours of research, copywriting, and personalization into seconds, automatically
- **Live Integration** - generate and send personalized icebreakers whenever, wherever
- **Virtually Complete Automation** - from research into the company and job description to a personalized response, this workflow does it in a click
- **Professional Presentation** - because the chatbot has context about the company and the job listing, it generates an icebreaker that makes the reader think real research was done

## How It Works

1. **Google Sheets Search**: the Google Sheets node fetches all rows where the icebreaker column is empty. Each row is returned as an item that contains information about the company and the job listing.
2. **AI Personalization**: uses sophisticated GPT-4 prompting to convert the information about a job posting and company into a customized, five-line personalized email icebreaker, applying a consistent, casual tone automatically to seem more human-written.
3. **Database Update**: updates all rows fetched in the search, writing only the icebreaker column with the new personalized icebreaker. Each item is returned as a row containing information about the company, the job listing, and the icebreaker.

## Required Template Setup

**Google Sheets Template**: create a Google Sheet with these columns:
- jobUrl - unique identifier for job listings
- title - position title
- descriptionText - description of the job listing
- hiringDemand/isHighVolumeHiring - are they hiring at high volume?
- hiringDemand/isUrgentHire - are they hiring with high urgency?
- isRemote - is this job remote?
- jobType/0 - job type: in person, remote, part-time, etc.
- companyCeo/name - CEO name collected from Tavily's search
- icebreaker - AI-generated icebreaker for each job listing
- scrapedCeo - CEO name collected from the Apify scraper
- email - email listed for the job listing
- companyName - name of the company that posted the job
- companyDescription - description of the company that posted the job
- companyLinks/corporateWebsite - website of the company that posted the job
- companyNumEmployees - number of employees the company lists
- location/country - location where the job takes place
- salary/salaryText - salary on the job listing

## Setup Instructions

**Google Sheets Search & Update Setup**:
1. Create a new Google Sheet with these column headers in the first row (name the sheet whatever you please).
2. Connect your Google Sheets OAuth credentials in n8n.
3. Update the document ID in the workflow nodes.

The search logic in the first Google Sheets node relies on the ID column for icebreaker generation, so this structure is essential for the workflow to function correctly.

**AI Icebreaker Generation Setup**:
1. Configure the OpenAI API for sophisticated proposal writing.
2. Implement example-based training with input/output pairs for more specific output.
3. Set up JSON formatting for structure (personally, I think JSON is easier to handle as an output).

**Setup Steps**:
1. **Search & Fetch Rows Setup**
   - Create a Google Sheets database with the provided column structure
   - Connect Google Sheets OAuth credentials
   - Configure the filter on the get-rows node to include only rows with an empty icebreaker column
2. **Set up AI Personalization**
   - Add OpenAI API credentials for personalized icebreaker generation
   - Customize the AI prompts for your specific niche, requirements, or interests
3. **Update Google Sheets Setup**
   - Remember to map all items to their respective columns based on the row number
   - All fields in the update-sheets node should have the same values as the first Sheets node, with the icebreaker field taking the ChatGPT output as its value

Feel free to reach out for additional help or clarification at my Gmail: terflix45@gmail.com, and I'll get back to you as soon as I can.
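The first Google Sheets node's filter ("icebreaker column is empty") is equivalent to this small predicate, shown so you can verify the node behaves the way you expect. Field names follow the column list in this template.

```python
def rows_needing_icebreaker(rows: list[dict]) -> list[dict]:
    """Keep only rows whose icebreaker cell is blank, whitespace, or missing."""
    return [r for r in rows if not (r.get("icebreaker") or "").strip()]
```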
by Harshil Agrawal
This workflow allows you to create a collection and create, update, and get a bookmark in Raindrop.

- **Raindrop node**: creates a new collection in Raindrop. If you already have a collection, you can skip this node.
- **Raindrop1 node**: creates a new bookmark and adds it to the collection.
- **Raindrop2 node**: updates the bookmark created in the previous node.
- **Raindrop3 node**: returns information about the bookmark created earlier.
by tanaypant
Get daily SMS updates to tell you if you should wear a sweater
by Harshil Agrawal
This workflow allows you to create a customer and an invoice, and send the invoice to the customer.

- **QuickBooks node**: creates a new customer in QuickBooks.
- **QuickBooks1 node**: creates an invoice for the customer created in the previous node.
- **QuickBooks2 node**: sends the invoice created in the previous node.