by Peter
Read a value by key from a local JSON file. Related workflow: WriteKey

Setup:
1. Create a subfolder in your n8n home directory: /home/node/.n8n/local-files. In Docker, locate the data path and create the local-files subfolder there.
2. Set the correct ownership: chown 1000.1000 local-files.
3. Put the workflow code in a new workflow named GetKey.
4. Create another workflow with a Function Item node:

```javascript
return {
  file: '/4711.json', // 4711 should be your workflow id
  key: 'MyKey',
  default: 'Optional returned value if key is empty / not exists'
};
```

5. Pipe the Function Item node into an Execute Workflow node that calls the GetKey workflow.

It would be nice if we could someday get a shiny built-in n8n node that does the job. :)
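For reference, the GetKey side can boil down to a single Function node along these lines. This is a minimal sketch, not the published workflow itself: it assumes the file lives under local-files, that the Function node may use Node's fs module (e.g. via NODE_FUNCTION_ALLOW_BUILTIN=fs), and the output field name `value` is illustrative.

```javascript
// Minimal sketch of the GetKey lookup -- assumptions noted above.
const fs = require('fs');

const base = '/home/node/.n8n/local-files';
const { file, key, default: fallback } = items[0].json;

let data = {};
try {
  data = JSON.parse(fs.readFileSync(base + file, 'utf8'));
} catch (e) {
  // Missing or unreadable file: fall back to the default value below.
}

return [{ json: { value: data[key] !== undefined ? data[key] : fallback } }];
```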
by Angel Menendez
CallForge - AI-Powered Product Insights Processor from Sales Calls

Automate product feedback extraction from AI-analyzed sales calls and store structured insights in Notion for data-driven product decisions.

🎯 Who is This For?

This workflow is designed for:
✅ Product managers tracking customer feedback and feature requests.
✅ Engineering teams identifying usability issues and AI/ML-related mentions.
✅ Customer success teams monitoring product pain points from real sales conversations.

It streamlines product intelligence gathering, ensuring customer insights are structured, categorized, and easily accessible in Notion for better decision-making.

🔍 What Problem Does This Workflow Solve?

Product teams often struggle to capture, categorize, and act on valuable feedback from sales calls. With CallForge, you can:
✔ Automatically extract and categorize product feedback from AI-analyzed sales calls.
✔ Track AI/ML-related mentions to gauge customer demand for AI-driven features.
✔ Identify feature requests and pain points for product development prioritization.
✔ Store structured feedback in Notion, reducing manual tracking and increasing visibility across teams.

This workflow eliminates manual feedback tracking, allowing product teams to focus on innovation and customer needs.

📌 Key Features & Workflow Steps

🎙️ AI-Powered Product Feedback Processing

This workflow processes AI-generated sales call insights and organizes them in Notion databases:
- Triggers when AI sales call data is received (see the payload sketch at the end of this description).
- Detects product-related feedback (feature requests, bug reports, usability issues).
- Extracts key product insights, categorizing feedback based on customer needs.
- Identifies AI/ML-related mentions, tracking customer interest in AI-driven solutions.
- Aggregates feedback and categorizes it by sentiment (positive, neutral, negative).
- Logs insights in Notion, making them accessible for product planning discussions.

📊 Notion Database Integration

- **Product Feedback** → Logs feature requests, usability issues, and bug reports.
- **AI Use Cases** → Tracks AI-related discussions and customer interest in machine learning solutions.

🛠 How to Set Up This Workflow

1. Prepare Your AI Call Analysis Data
   - Ensure AI-generated sales call insights are available.
   - Compatible with Gong, Fireflies.ai, Otter.ai, and other AI transcription tools.
2. Connect Your Notion Database
   - Set up Notion databases for:
     🔹 Product Feedback (logs feature requests and bug reports).
     🔹 AI Use Cases (tracks AI/ML mentions and customer demand).
3. Configure n8n API Integrations
   - Connect your **Notion API key** in n8n under "Notion API Credentials."
   - **Set up webhook triggers** to receive AI-generated sales insights.
   - **Test the workflow** using a sample AI sales call analysis.

🔧 How to Customize This Workflow

💡 Modify Notion Data Structure – Adjust fields to align with your product team's workflow.
💡 Refine AI Data Processing Rules – Customize how feature requests and pain points are categorized.
💡 Integrate with Slack or Email – Notify teams when recurring product issues emerge.
💡 Expand with Project Management Tools – Sync insights with Jira, Trello, or Asana to create product tickets automatically.

⚙️ Key Nodes Used in This Workflow

🔹 If Nodes – Detect if product feedback, AI mentions, or feature requests exist in AI data.
🔹 Notion Nodes – Create and update structured feedback entries in Notion.
🔹 Split Out & Aggregate Nodes – Process multiple insights and consolidate AI-generated data.
🔹 Wait Nodes – Ensure smooth sequencing of API calls and database updates.

🚀 Why Use This Workflow?

✔ Eliminates manual sales call review for product teams.
✔ Provides structured, AI-driven insights for feature planning and prioritization.
✔ Tracks AI/ML mentions to assess demand for AI-powered solutions.
✔ Improves product development strategies by leveraging real customer insights.
✔ Scalable for teams using n8n Cloud or self-hosted deployments.

This workflow empowers product teams by transforming sales call data into actionable intelligence, optimizing feature planning, bug tracking, and AI/ML strategy. 🚀
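As a rough illustration of what the If nodes inspect, the incoming webhook payload might carry flags and arrays like the sketch below. All field names here are hypothetical; map them to whatever your AI transcription tool actually emits.

```javascript
// Hypothetical shape of the AI call-analysis payload -- illustrative only.
const insight = {
  callId: 'demo-123',
  productFeedback: [
    { type: 'feature_request', text: 'Bulk export to CSV', sentiment: 'positive' },
    { type: 'bug_report', text: 'Dashboard chart fails to load', sentiment: 'negative' },
  ],
  aiMentions: ['Asked whether search uses machine learning'],
};

// The If nodes can then branch on whether any feedback or AI mentions exist:
const hasFeedback = insight.productFeedback.length > 0;
const hasAiMentions = insight.aiMentions.length > 0;

return [{ json: { ...insight, hasFeedback, hasAiMentions } }];
```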
by Shahrear
📜 AI-Powered Contract Management Pipeline (Google Drive + VLM Run + Sheets + Calendar + Slack)

⚙️ What This Workflow Does

This workflow automatically extracts, organizes, and tracks legal contract details from documents uploaded to Google Drive. Using VLM Run's Execute Agent, it parses key metadata such as contract ID, parties, dates, and terms, and then stores, alerts, and schedules reminders through Google Sheets, Calendar, and Slack.

🧩 Requirements

- **Google Drive OAuth2** for monitoring and downloads
- **VLM Run API credentials** with Execute Agent access
- **Google Sheets OAuth2** for structured record storage
- **Google Calendar OAuth2** for key date reminders
- **Slack API credentials** for team notifications
- A reachable webhook URL (for receiving parsed contract data)

⚡ Quick Setup

1. Configure Google Drive OAuth2 and create an upload folder plus a folder for saving extracted images.
2. Install the verified VLM Run node by searching for VLM Run in the node list, then click Install. Once installed, you can start using it in your workflows.
3. Add VLM Run API credentials for document parsing.
4. Configure Google Sheets and Calendar. For Google Sheets, pick your spreadsheet from the document list (e.g., test), then select the sheet inside it (e.g., Sheet1). Set the operation to Append Row; this will add new contract details as new rows. Turn on Map Each Column Manually and match each contract field (like Contract ID, Title, Parties, Effective Date, Termination Date) to its corresponding column in your Google Sheet.
5. Configure Slack for notifications.

⚙️ How It Works

1. Monitor Contract Uploads – Watches a target Google Drive folder for new file uploads (PDFs, images, or scans).
2. Download Contract File – Automatically downloads new contracts for AI analysis.
3. VLM Run ContractParser – Sends the file to the VLM Run Execute Agent, which extracts structured contract data, including:
   - Contract ID
   - Title
   - Parties (with roles)
   - Property address
   - Effective date
   - Termination date
   - Rent, deposit, payment terms, and governing law
4. Receive Contract Data – The webhook endpoint receives the structured JSON response.
5. Format Contract Data – Normalizes fields, formats dates, and prepares the data for storage (a sketch follows at the end of this description).
6. Save to Expense Database (Google Sheets) – Appends extracted data to a master Google Sheet for centralized contract tracking.
7. Notify via Slack – Posts a concise summary to a Slack channel, showing key contract details for visibility.
8. Create Calendar Events – Automatically schedules Google Calendar events for:
   - Effective Date
   - Termination Date
   - Renewal Reminder (60 days before termination)

💡 Why Use This Workflow

Manual contract management is error-prone and time-consuming: key details like renewal dates, payment terms, or termination clauses often get lost in email threads or folders. This workflow ensures:
- **Zero missed deadlines** – automatic Google Calendar reminders keep your team on track.
- **Instant team visibility** – Slack notifications keep legal, finance, and operations aligned.
- **End-to-end automation** – no need for manual parsing, data entry, or follow-ups.

🧠 Perfect For

- Legal teams automating contract intake and tracking
- Real estate or lease management workflows
- Finance or procurement teams needing expiration alerts
- Organizations centralizing contract metadata in Sheets

🛠️ How to Customize

- Modify Extraction Fields – Edit the VLM Run Execute Agent schema to add fields like contract value, payment schedule, department, or contact email.
- Change Storage – Swap Google Sheets for Airtable, Notion, or BigQuery if you manage large datasets or need relational tracking.
- Customize Notifications – Send Slack alerts only for high-value or expiring contracts, and tag relevant teams (e.g., @legal, @finance).
- Add Calendar Events – Auto-create events for reviews or payment milestones using extra date fields.
- Add Approvals or Signatures – Insert a Google Form or Slack approval step, or trigger DocuSign for e-signature automation.

⚠️ Community Node Disclaimer

This workflow uses community nodes (VLM Run) that may need additional permissions and custom setup.
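As a rough illustration of the Format Contract Data step, a Code node along these lines would normalize the dates and derive the 60-day renewal reminder. The field names are assumptions; match them to the actual schema returned by the VLM Run agent.

```javascript
// Hypothetical "Format Contract Data" step -- field names are illustrative.
const c = items[0].json;

const toIso = (d) => new Date(d).toISOString().slice(0, 10);

// Renewal reminder: 60 days before the termination date.
const reminder = new Date(c.termination_date);
reminder.setDate(reminder.getDate() - 60);

return [{
  json: {
    contractId: c.contract_id,
    title: c.title,
    parties: (c.parties || []).map((p) => `${p.name} (${p.role})`).join(', '),
    effectiveDate: toIso(c.effective_date),
    terminationDate: toIso(c.termination_date),
    renewalReminder: toIso(reminder),
  },
}];
```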
by Simon
This n8n workflow simplifies the process of removing backgrounds from images stored in Google Drive. By leveraging the PhotoRoom API, this template enables automatic background removal, padding adjustments, and output formatting, all while storing the updated images back in a designated Google Drive folder. It is especially useful for companies or individuals that spend a lot of time removing the background from product images.

How it Works

1. The workflow begins with a Google Drive Trigger node that monitors a specific folder for new image uploads.
2. Upon detecting a new image, the workflow downloads the file and extracts essential metadata, such as the file size.
3. Configurations are set for background color, padding, output size, and more, all of which are customizable to match specific requirements (see the sketch below).
4. The PhotoRoom API is called to process the image by removing its background and adding padding based on the settings.
5. The processed image is saved back to Google Drive in the specified output folder, with an updated name indicating the background has been removed.

Requirements

- PhotoRoom API Key
- Google Drive API Access

Customizing the Workflow

- Easily adjust the background color, padding, and output size using the configuration node.
- Modify the output folder path in Google Drive, or replace Google Drive with another storage service if needed.
- For advanced use cases, integrate further image processing steps, such as adding captions or analyzing content using AI.
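For orientation, the configuration node's settings amount to an object like the one below. This is a minimal sketch with illustrative key names and values; rename them to match the actual configuration node in the template.

```javascript
// Illustrative configuration -- mirror these values in the workflow's
// configuration node rather than hard-coding them here.
const config = {
  backgroundColor: '#FFFFFF', // hex color placed behind the product
  padding: '10%',             // whitespace added around the subject
  outputSize: '1600x1600',    // dimensions of the processed image
  outputFolder: 'processed',  // Google Drive folder for the results
  filenameSuffix: '-no-bg',   // appended to mark background removal
};
return [{ json: config }];
```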
by Hans Blaauw
This flow is supported by a Chrome plugin created with Cursor AI. The idea was to create a Chrome plugin and a backend service in n8n to do chart analytics with OpenAI. It's a good sample of how to submit a screenshot from the browser to n8n.

Who is it for?

n8n developers who want to learn about using a Chrome plugin, an n8n webhook, and OpenAI.

What opportunity does it present?

This sample opens up a whole range of n8n-connected Chrome extensions that can analyze screenshots by using OpenAI.

What this workflow does

The workflow contains:
- a webhook trigger
- an OpenAI node with GPT-4o-mini and Analyze Image selected
- a response node to send back the text that was created after analyzing the screenshot

All this is needed to talk to the Chrome extension, which is created with Cursor AI. The idea is to visit the tradingview.com crypto charts, click the Chrome plugin, and get back analytics about the shown chart in understandable language, all driven by the n8n flow. With the new image analytics capabilities of OpenAI, this opens up a world of opportunities.

Requirements/setup

- OpenAI API key
- Cursor AI installed
- The Chrome extension (Download)
- The n8n JSON code (Download)

How to customize it to your needs?

Both the Chrome extension and the n8n flow can be adapted for use on other websites. You can consider:
- analyzing a financial screen and asking questions about the data shown
- analyzing other charts
- extending the n8n workflow with other AI nodes

With AI and image analytics the sky is the limit, and in some cases it saves you from creating complex API integrations.
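To make the browser-to-webhook handoff concrete, the extension side can be as small as the sketch below. This is an assumption about how such a plugin might be wired, not the actual extension code: the webhook path and the `image` field name are placeholders, and the snippet presumes a Manifest V3 extension with the activeTab permission.

```javascript
// Hypothetical extension snippet: capture the visible tab and POST the
// screenshot to the n8n webhook as a data URL.
const dataUrl = await chrome.tabs.captureVisibleTab({ format: 'png' });

const res = await fetch('https://your-n8n-host/webhook/chart-analysis', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ image: dataUrl }),
});

// The workflow's response node returns the analysis text for display.
const analysis = await res.text();
console.log(analysis);
```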
by Angel Menendez
Automate Report Generation with n8n & Qualys

Introducing the Save Qualys Reports to TheHive workflow: a robust solution designed to automate the retrieval and storage of Qualys reports in TheHive. This workflow fetches reports from Qualys, filters out already processed reports, and creates cases in TheHive for the new reports. It runs every hour to ensure continuous monitoring and up-to-date vulnerability management, making it ideal for Security Operations Centers (SOCs).

How It Works:

- **Set Global Variables:** Initializes necessary global variables like base_url and newtimestamp. This step ensures that the workflow operates with the correct configuration and up-to-date timestamps. Be sure to change the global variables to match your environment.
- **Fetch Reports from Qualys:** Sends a GET request to the Qualys API to retrieve finished reports (see the sketch at the end of this description). Automating this step ensures timely updates and consistent data retrieval.
- **Convert XML to JSON:** Converts the XML response to JSON format for easier data manipulation. This transformation simplifies further processing and integration into TheHive.
- **Filter Reports:** Checks if the reports have already been processed using their creation timestamps. This filtering ensures that only new reports are handled, avoiding duplicates.
- **Process Each Report:** Loops through the list of new reports, ensuring each is processed individually. This step-by-step handling prevents issues related to bulk processing and improves reliability.
- **Create Case in TheHive:** Generates a new case in TheHive for each report, serving as a container for the report data. Automating case creation improves efficiency and ensures that all relevant data is captured.
- **Download and Attach Report:** Downloads the report from Qualys and attaches it to the respective case in TheHive. This automation ensures that all data is properly archived and easily accessible for review.

Get Started:

1. Ensure your Qualys and TheHive integrations are properly set up.
2. Customize the workflow to fit your specific vulnerability management needs.

Need Help? Join the discussion on our Forum or check out resources on Discord!

Deploy this workflow to streamline your vulnerability management process, improve response times, and enhance the efficiency of your security operations.
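For a sense of what the Fetch step sends, the report-list request reduces to something like the sketch below. The endpoint and parameters follow the Qualys VM API v2 as commonly documented, but verify them against your Qualys platform; the credentials and base_url values are placeholders.

```javascript
// Rough sketch of the "Fetch Reports from Qualys" request from a Code node.
// Qualys expects the X-Requested-With header alongside basic auth.
const xml = await this.helpers.httpRequest({
  method: 'GET',
  url: `${$json.base_url}/api/2.0/fo/report/`,
  qs: { action: 'list', state: 'Finished' },
  headers: { 'X-Requested-With': 'n8n' },
  auth: { username: 'QUALYS_USER', password: 'QUALYS_PASS' },
});

// Hand the raw XML to the next node for XML-to-JSON conversion.
return [{ json: { xml } }];
```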
by ConvertAPI
Who is this for?

For developers and organizations that need to convert PPTX files to PDF.

What problem is this workflow solving?

The file format conversion problem.

What this workflow does

- Downloads the PPTX file from the web.
- Converts the PPTX file to PDF.
- Stores the PDF file in the local file system.

How to customize this workflow to your needs

1. Open the HTTP Request node.
2. Adjust the URL parameter (all endpoints can be found here).
3. Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret.
4. Optionally, additional Body Parameters can be added for the converter (a sketch of the request follows below).
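For orientation, the HTTP Request node's call corresponds roughly to the sketch below. The body schema is an assumption based on ConvertAPI's v2 REST conventions, so double-check it against ConvertAPI's documentation before relying on it; the secret and file URL are placeholders.

```javascript
// Hypothetical ConvertAPI call: convert a publicly reachable PPTX to PDF.
const result = await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://v2.convertapi.com/convert/pptx/to/pdf',
  qs: { Secret: 'YOUR_CONVERTAPI_SECRET' },
  body: {
    Parameters: [
      { Name: 'File', FileValue: { Url: 'https://example.com/slides.pptx' } },
    ],
  },
  json: true,
});
return [{ json: result }];
```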
by Jonathan
Task: Control your data flow with rate limits and external cues

Main use cases:
- Control the rate at which items flow into one or more services in your workflow
- Wait for external events to occur before continuing with the rest of the workflow
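For the external-cue case, n8n's Wait node (in its webhook-resume mode) exposes a resume URL via $execution.resumeUrl; an outside system continues the paused run by calling it. A minimal sketch, with a placeholder URL and payload:

```javascript
// Hypothetical external service resuming a waiting n8n execution. The URL
// would be the $execution.resumeUrl value sent out earlier in the flow.
await fetch('https://your-n8n-host/webhook-waiting/12345', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ approved: true }), // data the workflow resumes with
});
```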
by Jonathan
This workflow uses a HubSpot Trigger to check for new contacts, then validates each contact's email using OneSimpleAPI. If any issues are found, a message is sent to Slack. To configure this workflow, set the credentials for the HubSpot, OneSimpleAPI, and Slack nodes. You will also need to select the Slack channel to use for sending the message.
by Sirhexalot
This n8n workflow enables you to export data from Zammad, including Users, Roles, Groups, and Organizations, into individual Excel files. It simplifies data handling and reporting by creating structured outputs for further processing or sharing.

Features

- Export Users with associated details such as email, firstname, lastname, role_ids, and group_ids.
- Export Roles and Organizations with their respective identifiers and names.
- Convert all data into separate Excel files for easy access and use.

Usage

1. Import this workflow into your n8n instance.
2. Configure the required Zammad API credentials (zammad_base_url and zammad_api_key) in the Basic Variables node.
3. Run the workflow to generate Excel files containing Zammad data.

Issues and Suggestions

If you encounter any issues or have suggestions for improvement, please report them on the GitHub repository. We appreciate your feedback to help enhance this workflow!
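Under the hood, each export is a Zammad REST call authenticated with the token from the Basic Variables node. A minimal sketch of the user export follows; the field selection mirrors the Features list above, but verify paths and pagination against the Zammad API docs.

```javascript
// Rough sketch: fetch Zammad users and keep the exported columns.
// Pagination is omitted for brevity.
const users = await this.helpers.httpRequest({
  method: 'GET',
  url: `${$json.zammad_base_url}/api/v1/users`,
  headers: { Authorization: `Token token=${$json.zammad_api_key}` },
});

return users.map((u) => ({
  json: {
    email: u.email,
    firstname: u.firstname,
    lastname: u.lastname,
    role_ids: u.role_ids,
    group_ids: u.group_ids,
  },
}));
```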
by Manu
This workflow will take all emails you put into a certain folder, upload any attachments to Nextcloud, and mark the emails as read (configurable). Attachments will be saved with automatically generated filenames: 2021-01-01_From-Sender-Name_Filename-of-attachment.pdf

Instructions:
1. Allow lodash to be used in n8n (or rewrite the code...): NODE_FUNCTION_ALLOW_EXTERNAL=lodash (environment variable)
2. Import the workflow
3. Set credentials for the Email & Nextcloud nodes
4. Configure it to use the correct folder / custom filters
5. Activate

Custom filter examples:
- Only unread emails: Custom Email Config = ["UNSEEN"]
- Filter emails by 'to' address: Custom Email Config = [["TO", "example+invoices@posteo.de"]]
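The naming convention amounts to date, hyphenated sender name, and hyphenated original filename. A self-contained sketch of that pattern follows; the real workflow builds it inside its lodash-based Function node, and the sample inputs here are of course illustrative.

```javascript
// Illustrative inputs -- in the live flow these come from the email node.
const date = '2021-01-01';
const senderName = 'Sender Name';
const attachmentName = 'Filename of attachment.pdf';

// Replace whitespace runs with hyphens, mirroring the pattern above.
const hyphenate = (s) => s.trim().split(/\s+/).join('-');

const filename = `${date}_From-${hyphenate(senderName)}_${hyphenate(attachmentName)}`;
// -> 2021-01-01_From-Sender-Name_Filename-of-attachment.pdf
console.log(filename);
```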
by Lucas Perret
This node is designed to cleanse URLs and extract their domain names efficiently. It handles a wide range of URL formats, including those with unconventional or complex top-level domains (TLDs), such as 'co.uk'. You can also use it to extract the domain from an email address. The node additionally checks whether the domain belongs to a free email provider (gmail.com, outlook.com, etc.).

How It Works

The node analyzes the provided URL, removing any unnecessary elements. It then identifies and extracts the domain name, ensuring compatibility with a diverse array of TLDs. The node utilizes an extensive list of TLDs to guarantee accurate domain extraction for virtually any URL. To view the complete list of supported top-level domains, please visit: TLD List on GitHub

How to use it

1. Call this workflow using the "Execute Workflow" node.
2. Pass either an email variable or a url variable. For email, the node also detects free mail providers such as Yahoo or Google.
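For example, the calling workflow might hand over either variable like this. The outputs noted in the comments are illustrative; the exact output field names depend on the sub-workflow.

```javascript
// Two illustrative inputs for the Execute Workflow call.
return [
  { json: { url: 'https://shop.example.co.uk/products?id=42' } }, // -> domain: example.co.uk
  { json: { email: 'jane.doe@gmail.com' } },                      // -> domain: gmail.com (free provider)
];
```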