by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically discovers and collects information about Stack Overflow user profiles for lead generation. It saves you time by eliminating the need to manually browse through developer profiles and provides a centralized database of potential leads with their technical expertise.

## Overview

This workflow automatically scrapes Stack Overflow user profiles and extracts key information like developer names, locations, reputation scores, and technical tags. It uses Bright Data to access Stack Overflow without being blocked and AI to intelligently parse user data into a structured format.

## Tools Used

- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping Stack Overflow user profiles without being blocked
- **OpenAI**: AI agent for intelligent data extraction and parsing
- **Google Sheets**: For storing and organizing lead information

## How to Install

1. **Import the Workflow**: Download the .json file and import it into your n8n instance
2. **Configure Bright Data**: Add your Bright Data credentials to the MCP Client node
3. **Set Up OpenAI**: Configure your OpenAI API credentials
4. **Configure Google Sheets**: Connect your Google Sheets account and specify the target spreadsheet
5. **Customize**: Adjust the Stack Overflow URL and user criteria you want to target

## Use Cases

- **Recruitment Teams**: Find developers with specific technical skills for hiring
- **Business Development**: Identify potential clients or partners in the tech industry
- **Sales Teams**: Build targeted outreach lists for developer-focused products
- **Research**: Gather data on developer communities and skill distributions

## Connect with Me

- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #stackoverflow #leadgeneration #brightdata #webscraping #developers #recruitment #businessdevelopment #salesleads #n8nworkflow #workflow #nocode #leadautomation #developerscraping #techtalent #userprofiles #aiautomation #datamining #prospecting #outreach #techrecruiting #developerleads #stackoverflowscraping #profilescraping #leadcollection #techcommunity #developerdatabase #automatedleads #intelligentscraping
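To make the parsing step concrete, here is a minimal n8n Code node sketch of the normalization that can sit between the AI extraction and the Google Sheets node. The field names (`displayName`, `topTags`, and so on) are illustrative assumptions, not the template's actual schema.

```javascript
// n8n Code node (Run Once for All Items): flatten AI-extracted profile data
// into one row per lead for Google Sheets. Field names are assumptions.
return $input.all().map((item) => {
  const profile = item.json;
  return {
    json: {
      name: profile.displayName ?? '',
      location: profile.location ?? 'Unknown',
      reputation: Number(profile.reputation) || 0,
      // Google Sheets wants one value per column, so join the tag array
      tags: Array.isArray(profile.topTags) ? profile.topTags.join(', ') : '',
      scrapedAt: new Date().toISOString(),
    },
  };
});
```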
by Simon
This n8n workflow simplifies the process of removing backgrounds from images stored in Google Drive. By leveraging the PhotoRoom API, this template enables automatic background removal, padding adjustments, and output formatting, all while storing the updated images back in a designated Google Drive folder. It is very useful for companies or individuals who spend a lot of time removing backgrounds from product images.

## How it Works

1. The workflow begins with a Google Drive Trigger node that monitors a specific folder for new image uploads.
2. Upon detecting a new image, the workflow downloads the file and extracts essential metadata, such as the file size.
3. Configurations are set for background color, padding, output size, and more, all of which are customizable to match specific requirements.
4. The PhotoRoom API is called to process the image by removing its background and adding padding based on the settings.
5. The processed image is saved back to Google Drive in the specified output folder with an updated name indicating the background has been removed.

## Requirements

- PhotoRoom API Key
- Google Drive API Access

## Customizing the Workflow

- Easily adjust the background color, padding, and output size using the configuration node.
- Modify the output folder path in Google Drive, or replace Google Drive with another storage service if needed.
- For advanced use cases, integrate further image processing steps, such as adding captions or analyzing content using AI.
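For orientation, this is roughly what the PhotoRoom call in step 4 looks like as a standalone Node.js (18+) ESM script. The `sdk.photoroom.com/v1/segment` endpoint, `x-api-key` header, and `image_file` form field follow PhotoRoom's publicly documented remove-background API as I understand it; verify them against the current API reference before relying on this sketch.

```javascript
// Standalone sketch (save as .mjs, run with Node 18+) of the background
// removal request the workflow's HTTP call performs. Endpoint and field
// names are assumptions based on PhotoRoom's public docs.
import { readFile, writeFile } from 'node:fs/promises';

const apiKey = process.env.PHOTOROOM_API_KEY;

const form = new FormData();
form.append('image_file', new Blob([await readFile('product.jpg')]), 'product.jpg');

const res = await fetch('https://sdk.photoroom.com/v1/segment', {
  method: 'POST',
  headers: { 'x-api-key': apiKey },
  body: form,
});
if (!res.ok) throw new Error(`PhotoRoom request failed: ${res.status}`);

// The response body is the processed image with the background removed
await writeFile('product-bg-removed.png', Buffer.from(await res.arrayBuffer()));
```

In the workflow itself, the same request is made by an HTTP Request node, with the padding and background settings taken from the configuration node.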
by Hans Blaauw
This flow is supported by a Chrome plugin created with Cursor AI. The idea was to create a Chrome plugin and a backend service in n8n to do chart analytics with OpenAI. It's a good sample of how to submit a screenshot from the browser to n8n.

## Who is it for?

n8n developers who want to learn about using a Chrome plugin, an n8n webhook, and OpenAI.

## What opportunity does it present?

This sample opens up a whole range of n8n-connected Chrome extensions that can analyze screenshots using OpenAI.

## What this workflow does

The workflow contains:

- a webhook trigger
- an OpenAI node with GPT-4O-MINI and Analyze Image selected
- a response node to send back the text created after analyzing the screenshot

All of this is needed to talk to the Chrome extension created with Cursor AI. The idea is to visit the tradingview.com crypto charts, click the Chrome plugin, and get back analytics about the shown chart in understandable language, all driven by the n8n flow. With the new image analytics capabilities of OpenAI, this opens up a world of opportunities.

## Requirements/setup

- OpenAI API key
- Cursor AI installed
- The Chrome extension
- The n8n JSON code

## How to customize it to your needs?

Both the Chrome extension and the n8n flow can be adapted for use on other websites. You can consider:

- analyzing a financial screen and asking questions about the data shown
- analyzing other charts
- extending the n8n workflow with other AI nodes

With AI and image analytics the sky is the limit, and in some cases it saves you from creating complex API integrations.
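To show the browser-to-webhook handoff concretely, here is a minimal Manifest V3 sketch of what such an extension's click handler can do. `chrome.tabs.captureVisibleTab` is the standard extension API for grabbing the visible tab; the webhook URL and JSON shape below are placeholders to align with your own Webhook and Respond to Webhook nodes.

```javascript
// Chrome extension (Manifest V3) sketch: capture the visible tab and post it
// to an n8n webhook. Requires the "activeTab" permission. The webhook URL
// and payload shape are assumptions, not the template's exact values.
const WEBHOOK_URL = 'https://your-n8n-host/webhook/chart-analysis';

async function analyzeVisibleChart() {
  // Returns the current tab as a base64 PNG data URL
  const dataUrl = await chrome.tabs.captureVisibleTab({ format: 'png' });

  const res = await fetch(WEBHOOK_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ image: dataUrl }),
  });

  // The Respond to Webhook node sends back the OpenAI analysis as text
  const analysis = await res.text();
  console.log(analysis);
}
```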
by Angel Menendez
# Automate Report Generation with n8n & Qualys

Introducing the Save Qualys Reports to TheHive workflow: a robust solution designed to automate the retrieval and storage of Qualys reports in TheHive. This workflow fetches reports from Qualys, filters out already processed reports, and creates cases in TheHive for the new reports. It runs every hour to ensure continuous monitoring and up-to-date vulnerability management, making it ideal for Security Operations Centers (SOCs).

## How It Works

1. **Set Global Variables:** Initializes necessary global variables like `base_url` and `newtimestamp`. This step ensures that the workflow operates with the correct configuration and up-to-date timestamps. Be sure to change the global variables to match your environment.
2. **Fetch Reports from Qualys:** Sends a GET request to the Qualys API to retrieve finished reports. Automating this step ensures timely updates and consistent data retrieval.
3. **Convert XML to JSON:** Converts the XML response to JSON format for easier data manipulation. This transformation simplifies further processing and integration into TheHive.
4. **Filter Reports:** Checks whether the reports have already been processed using their creation timestamps. This filtering ensures that only new reports are handled, avoiding duplicates.
5. **Process Each Report:** Loops through the list of new reports, ensuring each is processed individually. This step-by-step handling prevents issues related to bulk processing and improves reliability.
6. **Create Case in TheHive:** Generates a new case in TheHive for each report, serving as a container for the report data. Automating case creation improves efficiency and ensures that all relevant data is captured.
7. **Download and Attach Report:** Downloads the report from Qualys and attaches it to the respective case in TheHive. This automation ensures that all data is properly archived and easily accessible for review.

## Get Started

- Ensure your Qualys and TheHive integrations are properly set up.
- Customize the workflow to fit your specific vulnerability management needs.

## Need Help?

Join the discussion on our Forum or check out resources on Discord!

Deploy this workflow to streamline your vulnerability management process, improve response times, and enhance the efficiency of your security operations.
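A minimal sketch of the "Filter Reports" step as an n8n Code node, assuming the XML-to-JSON conversion exposes Qualys' `LAUNCH_DATETIME` field on each report item; adjust the field path to whatever your converted payload actually contains.

```javascript
// n8n Code node sketch: drop reports created before the stored timestamp so
// only new ones flow on to TheHive. LAUNCH_DATETIME is an assumption about
// the converted Qualys report-list JSON.
const staticData = $getWorkflowStaticData('global');
const lastRun = new Date(staticData.newtimestamp ?? 0);

return $input.all().filter((item) => {
  const created = new Date(item.json.LAUNCH_DATETIME);
  return created > lastRun; // invalid dates compare false and are skipped
});
```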
by Rahul Joshi
## 📊 Description

Automatically track SDK releases from GitHub, compare documentation freshness in Notion, and send Slack alerts when docs lag behind. This workflow ensures documentation stays in sync with releases, improves visibility, and reduces version drift across teams. 🚀📚💬

## What This Template Does

- Step 1: Listens to GitHub repository events to detect new SDK releases. 🧩
- Step 2: Fetches release metadata including version, tag, and publish date. 📦
- Step 3: Logs release data into Google Sheets for record-keeping and analysis. 📊
- Step 4: Retrieves FAQ or documentation data from Notion. 📚
- Step 5: Merges GitHub and Notion data to calculate documentation drift. 🔍
- Step 6: Flags SDKs whose documentation is over 30 days out of date. ⚠️
- Step 7: Sends detailed Slack alerts to notify responsible teams. 🔔

## Key Benefits

- ✅ Keeps SDK documentation aligned with product releases
- ✅ Prevents outdated information from reaching users
- ✅ Provides centralized release tracking in Google Sheets
- ✅ Sends real-time Slack alerts for overdue updates
- ✅ Strengthens DevRel and developer experience operations

## Features

- GitHub release trigger for real-time monitoring
- Google Sheets logging for tracking and auditing
- Notion database integration for documentation comparison
- Automated drift calculation (days since last update)
- Slack notifications for overdue documentation

## Requirements

- GitHub OAuth2 credentials
- Notion API credentials
- Google Sheets OAuth2 credentials
- Slack Bot token with chat:write permissions

## Target Audience

- Developer Relations (DevRel) and SDK engineering teams
- Product documentation and technical writing teams
- Project managers tracking SDK and doc release parity

## Step-by-Step Setup Instructions

1. Connect your GitHub account and select your SDK repository.
2. Replace YOUR_GOOGLE_SHEET_ID and YOUR_SHEET_GID with your tracking spreadsheet.
3. Add your Notion FAQ database ID.
4. Configure your Slack channel ID for alerts.
5. Run once manually to validate setup, then enable automation.
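A minimal sketch of the drift calculation in Steps 5 and 6 as an n8n Code node. `published_at` and `last_edited_time` mirror the GitHub and Notion API field names; how the two records land on one merged item is an assumption about your Merge node setup.

```javascript
// n8n Code node sketch: compute days between the SDK release and the last
// Notion doc edit, then flag anything more than 30 days behind (Step 6).
const DAY_MS = 24 * 60 * 60 * 1000;

return $input.all().map((item) => {
  const releasedAt = new Date(item.json.published_at);        // from GitHub
  const docsEditedAt = new Date(item.json.last_edited_time);  // from Notion
  const driftDays = Math.floor((releasedAt - docsEditedAt) / DAY_MS);
  return {
    json: {
      ...item.json,
      driftDays,
      needsUpdate: driftDays > 30, // overdue-documentation threshold
    },
  };
});
```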
by ConvertAPI
## Who is this for?

For developers and organizations that need to convert PPTX files to PDF.

## What problem is this workflow solving?

The file format conversion problem.

## What this workflow does

- Downloads the PPTX file from the web.
- Converts the PPTX file to PDF.
- Stores the PDF file in the local file system.

## How to customize this workflow to your needs

1. Open the HTTP Request node.
2. Adjust the URL parameter (all endpoints can be found here).
3. Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret.
4. Optionally, additional Body Parameters can be added for the converter.
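For reference, this is roughly what the HTTP Request node's call looks like as a standalone Node.js (18+) ESM script. The `/convert/pptx/to/pdf` endpoint, `Secret` query parameter, and `Files[].FileData` response shape follow ConvertAPI's public REST documentation as I understand it; confirm them in the endpoint reference before relying on this sketch.

```javascript
// Standalone sketch (save as .mjs, run with Node 18+): convert a local PPTX
// to PDF via ConvertAPI. Endpoint and response shape are assumptions based
// on ConvertAPI's public docs.
import { readFile, writeFile } from 'node:fs/promises';

const secret = process.env.CONVERTAPI_SECRET;

const form = new FormData();
form.append('File', new Blob([await readFile('slides.pptx')]), 'slides.pptx');

const res = await fetch(
  `https://v2.convertapi.com/convert/pptx/to/pdf?Secret=${secret}`,
  { method: 'POST', body: form },
);
const result = await res.json();

// Converted files come back base64-encoded in Files[].FileData
await writeFile('slides.pdf', Buffer.from(result.Files[0].FileData, 'base64'));
```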
by Yaron Been
This workflow automatically tracks inventory stock levels across multiple products and suppliers to prevent stockouts and optimize inventory management. It saves you time by eliminating the need to manually check stock levels and provides automated alerts when inventory reaches critical thresholds.

## Overview

This workflow automatically scrapes supplier websites, e-commerce platforms, and inventory systems to monitor real-time stock levels and availability. It uses Bright Data to access inventory data and AI to intelligently parse stock information, detect low inventory alerts, and track supply chain trends.

## Tools Used

- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping inventory and supplier websites without being blocked
- **OpenAI**: AI agent for intelligent stock level analysis and trend detection
- **Google Sheets**: For storing inventory data and tracking stock movements

## How to Install

1. **Import the Workflow**: Download the .json file and import it into your n8n instance
2. **Configure Bright Data**: Add your Bright Data credentials to the MCP Client node
3. **Set Up OpenAI**: Configure your OpenAI API credentials
4. **Configure Google Sheets**: Connect your Google Sheets account and set up your inventory tracking spreadsheet
5. **Customize**: Define product URLs and inventory monitoring parameters

## Use Cases

- **E-commerce**: Monitor product availability across multiple suppliers
- **Retail Management**: Track inventory levels to prevent stockouts
- **Supply Chain**: Monitor supplier stock levels and lead times
- **Procurement**: Identify restocking needs and optimize purchasing decisions

## Connect with Me

- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #inventorytracking #stockmonitoring #brightdata #webscraping #inventorymanagement #n8nworkflow #workflow #nocode #stocklevels #supplychain #inventoryautomation #stockalerts #ecommerce #procurement #inventorycontrol #stockanalysis #suppliermonitoring #inventoryoptimization #stocktracking #warehousemanagement #retailautomation #inventorydata #stockmanagement #supplymanagement #inventorymonitoring #productavailability #stockforecasting #inventoryinsights
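As an illustration of the alerting logic, here is a minimal n8n Code node sketch that flags low-stock items after the AI parsing step. The field names (`productName`, `stockLevel`) and the threshold are assumptions to adapt to your own sheet columns.

```javascript
// n8n Code node sketch: keep only items below a reorder threshold so a
// downstream node can raise an alert. Field names are assumptions.
const REORDER_THRESHOLD = 10;

return $input.all()
  .filter((item) => Number(item.json.stockLevel) < REORDER_THRESHOLD)
  .map((item) => ({
    json: {
      product: item.json.productName,
      stockLevel: Number(item.json.stockLevel),
      alert: `Low stock: only ${item.json.stockLevel} units left`,
    },
  }));
```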
by Oneclick AI Squad
This automated n8n workflow performs weekly forecasting of restaurant sales and raw material requirements using historical data from Google Sheets and AI predictions powered by Google Gemini. The forecast is then emailed to stakeholders for efficient planning and waste reduction.

## What is Google Gemini AI?

Google Gemini is an advanced AI model that analyzes historical sales data, seasonal patterns, and market trends to generate accurate forecasts for restaurant sales and inventory requirements, helping optimize purchasing decisions and reduce waste.

## Good to Know

- Google Gemini AI forecasting accuracy improves over time with more historical data
- Weekly forecasting provides better strategic planning compared to daily predictions
- Google Sheets access must be properly authorized to avoid data sync issues
- Email notifications ensure timely review of weekly forecasts by stakeholders
- The system analyzes trends and predicts upcoming needs for efficient planning and waste reduction

## How It Works

1. **Trigger Weekly Forecast** - Automatically starts the workflow every week at a scheduled time
2. **Load Historical Sales Data** - Pulls weekly sales and material usage data from Google Sheets
3. **Format Input for AI Agent** - Transforms raw data into a structured format suitable for the AI Agent
4. **Generate Forecast with AI** - Uses Gemini AI to analyze trends and predict upcoming needs
5. **Interpret AI Forecast Output** - Parses the AI's response into readable, usable JSON format
6. **Log Forecast to Google Sheets** - Stores the new forecast data back into a Google Sheet
7. **Email Forecast Summary** - Sends a summary of the forecast via Gmail for stakeholder review

## Data Sources

The workflow utilizes Google Sheets as the primary data source:

**Historical Sales Data Sheet** - Contains weekly sales and inventory data with columns:

- Week/Date (date)
- Menu Item (text)
- Sales Quantity (number)
- Revenue (currency)
- Raw Material Used (number)
- Inventory Level (number)
- Category (text)

**Forecast Output Sheet** - Contains AI-generated predictions with columns:

- Forecast Week (date)
- Menu Item (text)
- Predicted Sales (number)
- Recommended Inventory (number)
- Material Requirements (number)
- Confidence Level (percentage)
- Notes (text)

## How to Use

1. Import the workflow into n8n
2. Configure Google Sheets API access and authorize the application
3. Set up Gmail credentials for forecast report delivery
4. Create the required Google Sheets with the specified column structures
5. Configure Google Gemini AI API credentials
6. Test with sample historical sales data to verify predictions and email delivery
7. Adjust forecasting parameters based on your restaurant's specific needs
8. Monitor and refine the system based on actual vs. predicted results

## Requirements

- Google Sheets API access
- Gmail API credentials
- Google Gemini AI API credentials
- Historical sales and inventory data for initial training

## Customizing This Workflow

Modify the Generate Forecast with AI node to focus on specific menu categories, seasonal adjustments, or local market conditions. Adjust the email summary format to match your restaurant's reporting preferences, and add additional data sources such as supplier information, weather data, or a special events calendar for more accurate predictions.
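A minimal sketch of the "Interpret AI Forecast Output" step as an n8n Code node, assuming you prompt Gemini to answer with a JSON object containing a `week` value and an `items` array. The input field (`text`) and the output keys are assumptions to match against your own prompt and the Forecast Output Sheet columns above.

```javascript
// n8n Code node sketch: parse Gemini's text response into one item per menu
// item for the Google Sheets and Gmail steps. The JSON shape is an
// assumption about how the model was prompted.
const raw = $input.first().json.text ?? '';

// Models often wrap JSON in fenced code blocks; strip the fences first
const cleaned = raw.replace(/```json|```/g, '').trim();
const forecast = JSON.parse(cleaned);

return forecast.items.map((row) => ({
  json: {
    forecastWeek: forecast.week,
    menuItem: row.menuItem,
    predictedSales: row.predictedSales,
    recommendedInventory: row.recommendedInventory,
    confidenceLevel: row.confidenceLevel,
  },
}));
```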
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically tracks email campaign performance metrics and triggers smart follow-up actions based on engagement data. It saves you time by eliminating the need to manually monitor campaign reports and provides intelligent re-engagement strategies for improving email marketing ROI.

## Overview

This workflow automatically scrapes email service provider (ESP) reports to extract campaign performance metrics like open rates, click-through rates, and bounce rates. It uses AI to analyze the data and automatically sends targeted follow-up emails to re-engage subscribers who opened but didn't click, maximizing campaign effectiveness.

## Tools Used

- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping ESP campaign reports without being blocked
- **OpenAI**: AI agent for intelligent campaign data analysis and decision making
- **Gmail**: For sending automated follow-up engagement emails

## How to Install

1. **Import the Workflow**: Download the .json file and import it into your n8n instance
2. **Configure Bright Data**: Add your Bright Data credentials to the MCP Client node
3. **Set Up OpenAI**: Configure your OpenAI API credentials
4. **Configure Gmail**: Connect your Gmail account for sending follow-up emails
5. **Customize**: Set ESP report URLs and define engagement thresholds for triggering follow-ups

## Use Cases

- **Email Marketing**: Automatically optimize campaign performance with smart follow-ups
- **Marketing Automation**: Trigger re-engagement campaigns based on behavior data
- **Performance Tracking**: Monitor email metrics without manual ESP login
- **Customer Retention**: Re-engage subscribers who showed interest but didn't convert

## Connect with Me

- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #emailmarketing #campaigntracking #brightdata #webscraping #emailautomation #n8nworkflow #workflow #nocode #emailcampaigns #marketingautomation #emailperformance #campaignanalysis #emailmetrics #reengagement #marketingdata #emailoptimization #campaignmonitoring #emailanalytics #digitalmarketing #performancetracking #emailstrategy #conversionoptimization #marketinganalytics #emailroi #campaigninsights #emailengagement #marketingefficiency #automatedemail
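To make the re-engagement trigger concrete, here is a minimal n8n Code node sketch that selects the segment this template targets: subscribers who opened but did not click. The field names (`opened`, `clicked`, `email`, `campaignName`) are assumptions about your parsed ESP report.

```javascript
// n8n Code node sketch: pass through only opened-but-not-clicked subscribers
// so the Gmail node sends them a follow-up. Field names are assumptions.
return $input.all()
  .filter((item) => item.json.opened && !item.json.clicked)
  .map((item) => ({
    json: {
      email: item.json.email,
      campaign: item.json.campaignName,
      segment: 'opened_no_click', // used to pick the follow-up template
    },
  }));
```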
by Matheus Weckwerth
This workflow automates daily LinkedIn posts using Notion. It starts by fetching the day's post from a Notion database, processes and formats the content (including images), and then publishes it on LinkedIn. Finally, it updates the post's status in the Notion database. Set up Notion and LinkedIn credentials as required.
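A minimal sketch of the date filter behind "fetching the day's post", assuming the Notion database has a date property named `Publish Date`; rename it to match your schema. This is the filter object a Notion node or API query would send.

```javascript
// Sketch of a Notion database query filter selecting today's post.
// "Publish Date" is an assumed property name.
const today = new Date().toISOString().slice(0, 10); // YYYY-MM-DD

const filter = {
  property: 'Publish Date',
  date: { equals: today },
};

console.log(JSON.stringify(filter, null, 2));
```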
by n8n Team
This n8n workflow automates the analysis of email messages received in a Microsoft Outlook inbox to identify indicators of compromise (IOCs), specifically suspicious URLs. It can be triggered manually or scheduled to run daily at midnight.

The workflow begins by retrieving up to 100 read email messages from the Outlook inbox. However, there seems to be a configuration issue, as it should retrieve unread messages, not read ones. It then marks these messages as read to avoid processing them again in the future. The messages are then split into individual items using the Split In Batches node for sequential processing.

For each email, the workflow analyzes its content to find URLs, which are considered potential IOCs. If URLs are found, the workflow proceeds to check these URLs for potential threats using two services, URLScan.io and VirusTotal, in parallel.

In the first path, URLScan.io scans each URL, and if there are no errors, the results from URLScan.io and VirusTotal are merged. If there are errors, the workflow waits 1 minute before attempting to retrieve the URLScan results again. The loop then continues for the next email. In the second path, VirusTotal is used to scan the URLs, and the results are retrieved.

Finally, the workflow checks if the data field is not empty, filtering out items where no data was found. It then sends a summarized Slack message to report details about the analyzed email, including the subject, sender, date, URLScan report URL, and VirusTotal verdict for URLs that were reported as malicious.

Potential issues during setup include configuring the Outlook node to retrieve unread messages, resolving a configuration issue in the VirusTotal node, and handling authentication and API keys for both the URLScan.io and VirusTotal nodes. Additionally, proper error handling and testing with various email content types and URLs are essential to ensure the workflow accurately identifies IOCs and reports them to the Slack channel.
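A minimal sketch of the URL-extraction step as an n8n Code node. The body field name is an assumption; depending on configuration, the Outlook node may expose the content as `body.content` or `bodyPreview` instead.

```javascript
// n8n Code node sketch: pull http(s) URLs out of each email body as
// candidate IOCs, emitting one output item per unique URL.
const URL_REGEX = /https?:\/\/[^\s"'<>)]+/g;

return $input.all().flatMap((item) => {
  const text = String(item.json.body ?? '');
  const urls = [...new Set(text.match(URL_REGEX) ?? [])]; // dedupe per email
  return urls.map((url) => ({
    json: {
      url,
      subject: item.json.subject,
      sender: item.json.from,
    },
  }));
});
```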
by Jonathan
**Task:** Control your data flow with rate limits and external cues.

**Main use cases:**

- Control the rate at which items flow into one or more services in your workflow
- Wait for external events to occur before continuing with the rest of the workflow
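For the wait-for-external-events case, n8n's Wait node (set to "On Webhook Call") pauses the execution and exposes a resume URL. A sketch of handing that URL to an external system from a node placed before the Wait node:

```javascript
// n8n Code node sketch: expose the resume URL so a prior HTTP/email step can
// deliver it to the external system, which calls it to resume the workflow.
// $execution.resumeUrl is only meaningful when the workflow contains a Wait
// node configured to resume on webhook call.
return [{ json: { resumeUrl: $execution.resumeUrl } }];
```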