by Jaures NYA
This workflow automates the generation of personalized UGC (User-Generated Content) images from form submissions. It accepts a form with a character type (e.g., male/female) and an uploaded image, merges them, sends them to an AI model (Google Gemini via OpenRouter) for creative generation, and posts the resulting content as a Telegram photo message.

**Who's it for**

Marketers, AI creators, content teams, and interactive community platforms that want to let users submit content (image + character type), enrich it with AI-generated descriptions, and instantly publish the results to Telegram, without writing a single line of code.

**How it works**

1. Trigger: the workflow starts when a user submits the form.
2. Extract file: the uploaded image is converted to a Base64 string.
3. Merge data: the character type and image data are combined into one payload.
4. Format to Data URL: the image is wrapped as a proper `data:image/...` URL for API use.
5. Prepare payload: the text and image are mapped into a structure compatible with the Gemini API.
6. Generate AI content: the input is sent to Google Gemini (via OpenRouter) to generate a UGC description.
7. Transform response: the result is cleaned and extracted from Gemini's response.
8. Convert back to file: the Base64 image is transformed back into a real image file.
9. Send to Telegram: the image and its AI-generated description are sent as a photo message to your Telegram channel.

**How to use**

1. Set up a form with a dropdown for character type (e.g., Male/Female) and an image upload field.
2. Configure Gemini API access through OpenRouter.
3. Connect your Telegram bot and channel to receive the final result.
4. Start the workflow: users submit the form, and their data is processed and shared as AI-enhanced UGC.

**Requirements**

- OpenRouter API key to access Google Gemini.
- A Telegram bot connected to your Telegram channel.

**❓ Need help**

Contact me for consulting and support: LinkedIn / YouTube / Skool
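Steps 2-4 (Base64 extraction and data-URL formatting) can be sketched as an n8n Code-node-style snippet. The field names (`characterType`, `imageBase64`, `mimeType`) are illustrative assumptions, not the template's actual keys:

```javascript
// Sketch of steps 2-4: wrap an uploaded image (already Base64-encoded)
// into a data URL the Gemini API can consume.
// Field names below are assumptions for illustration.
function toDataUrlPayload(characterType, imageBase64, mimeType) {
  const dataUrl = `data:${mimeType};base64,${imageBase64}`;
  return {
    characterType,      // e.g. "male" or "female"
    image_url: dataUrl, // data:image/png;base64,....
  };
}

// Example with a tiny placeholder string in place of real image data
const payload = toDataUrlPayload('male', 'iVBORw0KGgo=', 'image/png');
console.log(payload.image_url.startsWith('data:image/png;base64,')); // true
```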
by Theodoros Mastromanolis
**Who is this for**

Travel agencies, freelance travel planners, or anyone who wants to automate personalized trip planning by combining real-time hotel and flight data with AI-generated recommendations.

**What this workflow does**

- Collects travel details (airports, dates, travelers) through an n8n form
- Scrapes the top 5 hotels from Booking.com, sorted by review score, via Apify
- Scrapes the best available flights from Google Flights via Apify
- Generates restaurant, attraction, and day-by-day itinerary recommendations using OpenAI
- Merges all results into a formatted Google Doc and returns the link to the user

**How to set up**

1. Create an Apify account and add your API token as both an "Apify API" credential and an "HTTP Query Auth" credential (parameter name: token)
2. Add your OpenAI API key as an "OpenAI" credential
3. Connect your Google account via OAuth2 and update the folderId in the "Create Document" node to point to your Google Drive folder
4. Activate the workflow and share the form URL

**Requirements**

- Apify account with API token (for the Booking.com and Google Flights scrapers)
- OpenAI API key
- Google account with Docs and Drive access

**How to customize**

- Swap the form trigger for a webhook or chatbot input
- Change the output from Google Docs to email, Slack, or Notion
- Adjust the OpenAI prompt to focus on budget travel, luxury, or specific interests
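The "top 5 hotels sorted by review score" step can be sketched in a few lines of Code-node JavaScript. The item shape (`name`, `reviewScore`) is an assumption for illustration; Apify's actual output fields may differ:

```javascript
// Sketch: keep the top N hotels by review score, as the workflow does
// after the Apify scrape. The item shape is an assumption.
function topHotelsByScore(hotels, limit = 5) {
  return [...hotels]
    .sort((a, b) => b.reviewScore - a.reviewScore) // highest score first
    .slice(0, limit);
}

const sample = [
  { name: 'A', reviewScore: 8.1 },
  { name: 'B', reviewScore: 9.4 },
  { name: 'C', reviewScore: 7.2 },
];
console.log(topHotelsByScore(sample, 2).map(h => h.name)); // ['B', 'A']
```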
by Dhaval Variya
**How it works**

This workflow automatically creates a new CV and cover letter based on a job description and your current CV.

**How to set it up**

1. Create API credentials for Google Sheets, Google Docs, and OpenRouter. If you don't know how to create and add an API key, a quick Google search will turn up a guide.
2. Create a new Google Sheet in your Google Drive named "Job IDs" and add 3 columns named "JobLink", "CoverLetter", and "CV".
3. Create 2 Google Docs in your Google Drive named "Cover Letter" and "CV", then link them to the respective nodes in the workflow.
4. Add this code to the Google Sheet's Apps Script editor:

```javascript
function callN8n(e) {
  var range = e.range;
  if (range.getColumn() == 1) {
    // This is the combined URL from your tunnel + n8n webhook path
    var url = "https://pronounce-surround-curtsy.ngrok-free.dev/webhook/3d23f43d-9d38-45ba-9ad1-b696a2b97905";
    var options = {
      "method": "post",
      "headers": { "ngrok-skip-browser-warning": "true" },
      "payload": JSON.stringify({ "JobLink": e.value, "row": range.getRow() }),
      "contentType": "application/json"
    };
    UrlFetchApp.fetch(url, options);
  }
}
```

**Contact**

dhaval.variya07@gmail.com
LinkedIn
YouTube
by Dele Odufuye
n8n OpenAI-Compatible API Endpoints

Transform your n8n workflows into OpenAI-compatible API endpoints, allowing you to access multiple workflows as selectable AI models through a single integration.

**What This Does**

This workflow creates two API endpoints that mimic the OpenAI API structure:

- `/models` - lists all n8n workflows tagged with `aimodel` (or any other tag of your choice)
- `/chat/completions` - executes chat completions with your selected workflows, supporting both text and streaming responses

**Benefits**

- Access Multiple Workflows: connect to all your n8n agents through one API endpoint instead of creating separate pipelines for each workflow.
- Universal Platform Support: works with any application that supports OpenAI-compatible APIs, including OpenWebUI, Microsoft Teams, Zoho Cliq, and Slack.
- Simple Workflow Management: add new workflows by tagging them with `aimodel`. No code changes needed.
- Streaming Support: handles both standard responses and streaming for real-time agent interactions.

**How to Use**

1. Download the workflow JSON file from this repository
2. Import it into your n8n instance
3. Tag the workflows you want to expose with `aimodel` to make them accessible through the API
4. Create a new OpenAI credential in n8n and change the Base URL to point to your n8n webhook endpoints. Learn more about OpenAI Credentials
5. Point your chat applications to your n8n webhook URL as if it were an OpenAI API endpoint

**Requirements**

- n8n instance (self-hosted or cloud)
- Workflows you want to expose as AI models
- Any OpenAI-compatible chat application

**Documentation**

For detailed setup instructions and an implementation guide, visit https://medium.com/@deleodufuye/how-to-create-openai-compatible-api-endpoints-for-multiple-n8n-workflows-803987f15e24.

**Inspiration**

This approach was inspired by Jimleuk's workflow on n8n Templates.
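The `/models` endpoint effectively maps tagged n8n workflows into the OpenAI models-list shape. A minimal sketch of that mapping (the workflow objects here are illustrative; the real endpoint pulls them from the n8n API):

```javascript
// Sketch: map n8n workflows tagged "aimodel" into the OpenAI /models
// list shape. The workflow objects are illustrative inputs.
function toModelsResponse(workflows) {
  return {
    object: 'list',
    data: workflows
      .filter(wf => (wf.tags || []).some(t => t.name === 'aimodel'))
      .map(wf => ({
        id: wf.name, // the model id clients will select
        object: 'model',
        created: Math.floor(Date.now() / 1000),
        owned_by: 'n8n',
      })),
  };
}

const res = toModelsResponse([
  { name: 'support-agent', tags: [{ name: 'aimodel' }] },
  { name: 'internal-tool', tags: [] },
]);
console.log(res.data.map(m => m.id)); // ['support-agent']
```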
by Jitesh Dugar
📦 Automated Instagram Product Drop via uploadtourl

Streamline your e-commerce marketing with this end-to-end Instagram publishing pipeline. This workflow automates the transition from a new product entry to a live social media announcement, ensuring your brand stays active and consistent across platforms.

🎯 What This Workflow Does

This template handles the complex multi-step process of social media publishing through a single trigger:

🔔 Smart Data Intake
The Webhook Trigger accepts data from Shopify, Airtable, or manual inputs. A normalization node then sanitizes the raw data, mapping nested fields to a consistent schema and applying fallback defaults for missing captions or hashtags.

☁️ Image Processing & Hosting
The system fetches your product image from any public source and passes the binary to the uploadtourl node. This converts your raw image into a public CDN URL, which is required for the Instagram Graph API to access the file.

📸 Instagram Publishing Flow
A dedicated Code node assembles a brand-ready caption with emojis, prices, and CTAs, automatically truncating text to 2,200 characters to meet API limits. The workflow then executes Instagram's two-step flow: creating a media container and publishing it after a safe 5-second buffer.

📊 Audit & Notification
Once the post is live, the workflow records the live post ID, product name, and timestamp in Airtable for future analytics. Finally, it sends a formatted alert to your team via Slack with a direct link to the new post.

✨ Key Features

- Multi-Source Support: works natively with Shopify products/create webhooks and Airtable automations.
- Automated CDN Hosting: bypasses manual file uploads by using uploadtourl to generate API-compatible links.
- Safe Buffer Logic: built-in wait timers ensure Instagram finishes asynchronous image processing before the final publish call.
- Brand Guardrails: a centralized caption builder ensures every post follows your brand's emoji and hashtag style.
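The caption builder's truncation step can be sketched as follows. The product fields are illustrative assumptions; only the 2,200-character cap comes from the description above:

```javascript
// Sketch of the caption builder's truncation: Instagram captions are
// capped at 2,200 characters. The product fields are illustrative.
const IG_CAPTION_LIMIT = 2200;

function buildCaption(product) {
  const caption = `🆕 ${product.name}\n💰 ${product.price}\n\n${product.description}\n\n${product.hashtags}`;
  // Truncate with an ellipsis so the post never exceeds the API limit.
  return caption.length <= IG_CAPTION_LIMIT
    ? caption
    : caption.slice(0, IG_CAPTION_LIMIT - 1) + '…';
}

const caption = buildCaption({
  name: 'Canvas Tote',
  price: '$29',
  description: 'x'.repeat(3000), // deliberately oversized
  hashtags: '#newdrop',
});
console.log(caption.length <= IG_CAPTION_LIMIT); // true
```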
💼 Perfect For

- SMM Managers: automating repetitive product announcement tasks.
- E-commerce Owners: linking Shopify new arrivals directly to Instagram.
- Creative Agencies: managing high-volume product drops for multiple clients.
- Content Operations: creating an automated audit log of all social media activity.

🔧 What You'll Need

Required Integrations

- Instagram Graph API: a Business or Creator account access token.
- uploadtourl Account: credentials configured in n8n for media hosting.
- Airtable & Slack: (optional) for logging and team notifications.

Configuration Steps

1. API Keys: connect your Instagram and uploadtourl credentials.
2. Environment Variables: set your IG_ACCOUNT_ID, AIRTABLE_BASE_ID, and SLACK_CHANNEL_ID in n8n.
by 中崎功大
Smart Irrigation Scheduler with Weather Forecast and Soil Analysis

Summary
Automated garden and farm irrigation system that uses weather forecasts and evapotranspiration calculations to determine optimal watering schedules, preventing water waste while maintaining healthy plants.

Detailed Description
A comprehensive irrigation management workflow that analyzes weather conditions, forecasts, soil types, and plant requirements to make intelligent watering decisions. The system considers multiple factors, including expected rainfall, temperature, humidity, wind speed, and days since the last watering, to determine whether irrigation is needed and how much.

Key Features

- **Multi-Zone Management**: support for multiple irrigation zones with different plant and soil types
- **Weather-Based Decisions**: uses OpenWeatherMap current conditions and the 5-day forecast
- **Evapotranspiration Calculation**: simplified Penman method for accurate water-loss estimation
- **Rain Forecast Skip**: automatically skips watering when significant rain is expected
- **Plant-Type Specific**: different requirements for flowers, vegetables, grass, and shrubs
- **Soil Type Consideration**: adjusts for clay, loam, and sandy soil characteristics
- **Urgency Classification**: high/medium/low priority based on moisture levels
- **Optimal Timing**: adjusts watering time based on temperature and wind conditions
- **IoT Integration**: sends commands to smart irrigation controllers
- **Historical Logging**: tracks all decisions in Google Sheets

Use Cases

- Home garden automation
- Commercial greenhouse management
- Agricultural operations
- Landscaping company scheduling
- Property management with large grounds
- Water conservation projects

Required Credentials

- OpenWeatherMap API key
- Slack Bot Token
- Google Sheets OAuth
- IoT Hub API (optional)

Node Count: 24 (19 functional + 5 sticky notes)

Unique Aspects

- Uses the OpenWeatherMap node (rarely used in templates)
- Uses the Split Out node for loop-style processing of zones
- Uses the Filter node for conditional routing
- Uses the Aggregate node to collect results
- Implements the evapotranspiration calculation in a Code node
- Comprehensive multi-factor decision logic

Workflow Architecture

```
[Daily Morning Check]   [Manual Override Trigger]
          \                   /
       [Define Irrigation Zones]
                  |
          [Split Zones] (loop)
            /           \
  [Get Current]   [Get 5-Day Forecast]
            \           /
       [Merge Weather Data]
                  |
      [Analyze Irrigation Need]
            /           \
 [Filter Needing]   [Aggregate All]
            \           /
    [Generate Irrigation Schedule]
                  |
     [Has Irrigation Tasks?] (If)
        /         |         \
  [Sheets]     [Slack]   [Log No Action]
        \         |         /
        [Respond to Webhook]
```

Configuration Guide

1. Irrigation Zones: edit "Define Irrigation Zones" with your zone data (coordinates, plant/soil types)
2. Water Thresholds: adjust waterThreshold per zone based on plant needs
3. OpenWeatherMap: add API credentials in the weather nodes
4. Slack Channel: set to your garden/irrigation channel
5. IoT Integration: configure the endpoint URL for your smart valve controller
6. Google Sheets: connect to your logging spreadsheet

Decision Logic

The system evaluates:

- Expected rainfall in the next 24 hours (skip if >5mm is expected)
- A soil moisture estimate based on days since watering plus evapotranspiration
- Plant-specific minimum and ideal moisture levels
- Temperature adjustments for hot days
- Scheduled watering frequency by plant type
- Wind speed, to pick the optimal watering time
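The moisture-estimate idea in the decision logic can be sketched as a Code-node-style function. This is not the template's actual simplified-Penman implementation; the coefficients and field names below are assumptions for demonstration only:

```javascript
// Illustrative sketch of the moisture-estimate logic described above.
// NOT the template's exact Penman implementation; coefficients and
// field names are assumptions for demonstration.
function estimateSoilMoisture(zone, weather) {
  // Rough daily evapotranspiration (mm/day): hotter, drier, windier
  // days lose more water. Coefficients are illustrative.
  const et = 0.05 * weather.tempC * (1 - weather.humidity / 100)
           + 0.1 * weather.windSpeed;
  // Deplete moisture from a full (100%) state by ET per day since
  // the last watering. The 2 %-per-mm factor is assumed.
  const depletion = et * zone.daysSinceWatering * 2;
  return Math.max(0, 100 - depletion);
}

const moisture = estimateSoilMoisture(
  { daysSinceWatering: 3 },
  { tempC: 30, humidity: 40, windSpeed: 2 },
);
console.log(moisture > 0 && moisture < 100); // true: partially depleted
```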
by Madame AI
Post Jobs to Multiple Boards from Google Sheets using BrowserAct

This powerful n8n template turns a Google Sheet into a control panel for automating job postings across multiple job boards. It is perfect for HR teams, recruiters, and hiring managers who want to streamline their hiring process by posting jobs to multiple boards from a single source of truth.

Self-Hosted Only
This workflow uses a community contribution and is designed and tested for self-hosted n8n instances only.

How it works

1. The workflow is triggered in two ways: manually (to batch-post all "Ready to Post" jobs) or automatically via a Google Sheets Trigger when a single row is updated.
2. An If node filters for jobs marked with the status "Ready to Post".
3. A BrowserAct node takes the job details (title, description, logins, target URL) and runs an automation to post the job on the specified board.
4. An If node checks whether the posting was successful. If it fails, a Slack alert is sent.
5. A Code node parses the successful result from BrowserAct to get the status and live URL.
6. The workflow updates the row in Google Sheets with the Live_URL and changes the Status to "Posted".
7. A final Slack message is sent to a channel to confirm the successful posting.

Requirements

- **BrowserAct** API account for automated posting
- **BrowserAct** "Automated Job Posting to Niche Job Site (Custom Site)" template (you will need to customize the workflow for your target site)
- **BrowserAct** n8n community node (n8n Nodes BrowserAct)
- **Google Sheets** credentials
- **Slack** credentials for sending notifications

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- How to Use the BrowserAct n8n Community Node
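Step 5's parsing can be sketched like this. The result shape is an assumption; adapt the paths to whatever your BrowserAct template actually returns:

```javascript
// Sketch of step 5: pull the status and live URL out of the automation
// result. The result shape here is an assumption for illustration.
function parsePostingResult(result) {
  const output = typeof result.output === 'string'
    ? JSON.parse(result.output) // some automations return a JSON string
    : result.output;
  return {
    status: output.status === 'success' ? 'Posted' : 'Failed',
    liveUrl: output.live_url || '',
  };
}

const parsed = parsePostingResult({
  output: '{"status":"success","live_url":"https://jobs.example.com/123"}',
});
console.log(parsed); // { status: 'Posted', liveUrl: 'https://jobs.example.com/123' }
```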
by Rohit Dabra
🧩 Zoho CRM MCP Server Integration (n8n Workflow)

🧠 Overview
This n8n flow integrates Zoho CRM with an MCP (Model Context Protocol) Server and the OpenAI Chat Model, enabling AI-driven automation for CRM lead management. It allows an AI agent to create, update, delete, and fetch leads in Zoho CRM through natural-language instructions.

▶️ Demo Video
Watch the full demo here: 👉 YouTube Demo Video

⚙️ Core Components

| Component | Purpose |
| --- | --- |
| MCP Server Trigger | Acts as the entry point for requests sent to the MCP Server (external systems or chat interfaces). |
| Zoho CRM Nodes | Handle CRUD operations for leads (create, update, delete, get, getAll). |
| AI Agent | Uses the OpenAI Chat Model and Memory to interpret and respond to incoming chat messages. |
| OpenAI Chat Model | Provides the LLM (Large Language Model) intelligence for the AI Agent. |
| Simple Memory | Stores short-term memory context for chat continuity. |
| MCP Client | Bridges communication between the AI Agent and the MCP Server for bi-directional message handling. |

🧭 Flow Description

1. Left Section (MCP Server + Zoho CRM Integration)

- Trigger: the MCP Server Trigger receives API requests or chat events.
- Zoho CRM Actions:
  - 🟢 Create a lead in Zoho CRM
  - 🔵 Update a lead in Zoho CRM
  - 🟣 Get a lead in Zoho CRM
  - 🟠 Get all leads in Zoho CRM
  - 🔴 Delete a lead in Zoho CRM

Each of these nodes connects to the Zoho CRM credentials and performs the respective operation on Zoho CRM's "Leads" module.

2. Right Section (AI Agent + Chat Flow)

- Trigger: "When chat message received" initiates the flow when a message arrives.
- AI Agent node uses:
  - OpenAI Chat Model, for natural-language understanding and generation.
  - Simple Memory, to maintain context between interactions.
  - MCP Client, to call MCP actions (which include the Zoho CRM operations).
This creates a conversational interface allowing users to type things like:

> "Add a new lead named John Doe with email john@acme.com"

The AI agent interprets this and routes the request to the proper Zoho CRM action node automatically.

⚙️ Step-by-Step Configuration Guide

🧩 1. Import the Flow

1. In n8n, go to Workflows → Import.
2. Upload the JSON file of this workflow (or paste the JSON code).
3. Once imported, you'll see the structure as in the image.

🔐 2. Configure Zoho CRM Credentials

You must connect the Zoho CRM API to n8n.

1. Go to Credentials → New → Zoho OAuth2 API.
2. Follow Zoho's official n8n documentation.
3. Provide the following:
   - Environment: Production
   - Data Center: e.g., zoho.in or zoho.com, depending on your region
   - Client ID and Client Secret, from the Zoho API Console (https://api-console.zoho.com/)
   - Scope: ZohoCRM.modules.leads.ALL
   - Redirect URL: use the callback URL shown in n8n (copy it before saving credentials)
4. Click Connect and complete the OAuth consent.

✅ Once authenticated, all Zoho CRM nodes (Create, Update, Delete, etc.) will be ready.

🔑 3. Configure the OpenAI API Key

1. In n8n, go to Credentials → New → OpenAI API.
2. Enter your API Key from https://platform.openai.com/account/api-keys and save the credentials.
3. In the AI Agent node, select this OpenAI credential under Model.

🧠 4. Configure the AI Agent

1. Open the AI Agent node.
2. Choose:
   - Chat Model: select your configured OpenAI Chat Model.
   - Memory: select Simple Memory.
   - Tools: add the MCP Client as a tool.
3. Configure the AI instructions (System Prompt), for example:

> You are an AI assistant that helps manage leads in Zoho CRM. When the user asks to create, update, or delete a lead, use the appropriate tool. Provide confirmations in natural language.

🧩 5. Configure the MCP Server

A. MCP Server Trigger

1. Open the MCP Server Trigger node.
2. Note down the endpoint URL; this acts as the API entry point for external requests.
3. It listens for incoming POST requests from your MCP client or chat interface.

B. MCP Client Node

1. In the AI Agent, link the MCP Client node.
2. Configure it to send requests back to your MCP Server endpoint (for two-way communication).

> 🔄 This enables a continuous conversation loop between external clients and the AI-powered CRM automation system.

🧪 6. Test the Flow

Once everything is connected:

1. Activate the workflow.
2. From your chat interface or Postman, send a message to the MCP Server endpoint:

```json
{ "message": "Create a new lead named Alice Johnson with email alice@zoho.com" }
```

3. Observe:
   - The AI Agent interprets the intent.
   - It calls the Zoho CRM Create Lead node.
   - It returns a success message with the lead ID.

🧰 Example Use Cases

| User Query | Action Triggered |
| --- | --- |
| "Add John as a lead with phone number 9876543210" | Create lead in Zoho CRM |
| "Update John's company to Acme Inc." | Update lead in Zoho CRM |
| "Show me all leads from last week" | Get all leads |
| "Delete lead John Doe" | Delete lead |

🧱 Tech Stack Summary

| Layer | Technology |
| --- | --- |
| Automation Engine | n8n |
| AI Layer | OpenAI GPT Chat Model |
| CRM | Zoho CRM |
| Communication Protocol | MCP (Model Context Protocol) |
| Memory | Simple Memory |
| Trigger | HTTP-based MCP Server |

✅ Best Practices

- 🔄 Refresh tokens regularly: Zoho tokens expire, so set up auto-refresh.
- 🧹 Use environment variables for API keys instead of hardcoding them.
- 🧠 Fine-tune system prompts for better AI understanding.
- 📊 Enable logging for request/response tracking.
- 🔐 Restrict MCP Server access with an API key or JWT token.
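The Postman test described in step 6 can also be scripted. The endpoint URL is a placeholder; copy yours from the MCP Server Trigger node:

```javascript
// Sketch of step 6: build the test request for the MCP Server endpoint.
// The endpoint URL in the usage example is a placeholder.
function buildChatRequest(message) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message }),
  };
}

// Usage (placeholder URL; replace with your MCP Server Trigger URL):
// fetch('https://your-n8n-host/webhook/mcp',
//   buildChatRequest('Create a new lead named Alice Johnson with email alice@zoho.com'))
//   .then(r => r.json()).then(console.log);
```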
by MUHAMMAD SHAHEER
Overview

This workflow helps you automatically collect verified business leads from Google Search using SerpAPI, with no coding required. It extracts company names, websites, emails, and phone numbers directly from search results and saves them to Google Sheets for easy follow-up or CRM import. Perfect for marketers, freelancers, and agencies who want real, usable leads fast, without manual scraping or paid databases.

How It Works

1. The SerpAPI node performs a Google search for your chosen keyword or niche.
2. The Split Out node separates each result for individual processing.
3. The HTTP Request node optionally visits each site for deeper data extraction.
4. The Code node filters, validates, and formats leads using smart parsing logic.
5. The Google Sheets node stores the final structured data automatically.

All steps include sticky notes with configuration help.

Setup Steps

Setup takes about 5-10 minutes:

1. Add your SerpAPI key (replace the placeholder).
2. Connect your Google Sheets account.
3. Update the search term (e.g., "Plumbers in New York").
4. Run the workflow and watch leads populate your sheet in real time.
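The lead-validation step (step 4) boils down to extracting and de-duplicating contact data from scraped page text. A minimal sketch, with a deliberately simple regex that is not the template's exact parsing logic:

```javascript
// Sketch of the lead-validation step: extract and de-duplicate email
// addresses from scraped page text. The regex is a simple illustration.
function extractEmails(pageText) {
  const matches = pageText.match(/[\w.+-]+@[\w-]+\.[a-zA-Z]{2,}/g) || [];
  // De-duplicate and drop common false positives like image filenames.
  return [...new Set(matches)].filter(e => !/\.(png|jpg|gif)$/i.test(e));
}

const emails = extractEmails(
  'Contact us at info@acme.com or sales@acme.com. Logo: logo@2x.png',
);
console.log(emails); // ['info@acme.com', 'sales@acme.com']
```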
by Madame AI
Real-Time MAP Enforcement & Price Violation Alerts using BrowserAct & Slack

This n8n template automates MAP (Minimum Advertised Price) enforcement by monitoring reseller websites and alerting you instantly to price violations and stock issues. It is essential for brand owners, manufacturers, and compliance teams who need to proactively monitor their distribution channels and enforce pricing policies.

How it works

1. The workflow runs on a Schedule Trigger (e.g., hourly) to continuously monitor product prices.
2. A Google Sheets node fetches your list of resellers, product URLs, and the official MAP price (AP_Price).
3. The Loop Over Items node ensures that each reseller's product is checked individually.
4. A pair of BrowserAct nodes navigate to the reseller's product page and reliably scrape the current live price.
5. A series of If nodes check for violations:
   - The first check (If1) looks for "NoData", signaling that the product is out of stock, and sends a specific Slack alert.
   - The second check (If) compares the scraped price to your MAP price, triggering a detailed Slack alert if a MAP violation is found.
6. The workflow loops back to check the next reseller on the list.

Requirements

- **BrowserAct** API account for web scraping
- **BrowserAct** "MAP (Minimum Advertised Price) Violation Alerts" template
- **BrowserAct** n8n community node (n8n Nodes BrowserAct)
- **Google Sheets** credentials for your price list
- **Slack** credentials for sending alerts

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- How to Use the BrowserAct n8n Community Node

Workflow Guidance and Showcase

I Built a Bot to Catch MAP Violators (n8n + BrowserAct Workflow)
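The two checks in step 5 can be sketched as a single function. The AP_Price field name comes from the description above; the scraped-price format and the `reseller` field are assumptions:

```javascript
// Sketch of the violation checks in step 5. AP_Price mirrors the sheet
// column described above; other field names are assumptions.
function checkViolation(row, scrapedPrice) {
  if (scrapedPrice === 'NoData') {
    return { type: 'out_of_stock', message: `${row.reseller}: product unavailable` };
  }
  // Strip currency symbols before comparing numerically.
  const price = parseFloat(String(scrapedPrice).replace(/[^0-9.]/g, ''));
  if (price < row.AP_Price) {
    return {
      type: 'map_violation',
      message: `${row.reseller} lists at $${price}, below MAP $${row.AP_Price}`,
    };
  }
  return { type: 'compliant' };
}

console.log(checkViolation({ reseller: 'ShopX', AP_Price: 99.99 }, '$89.99').type);
// 'map_violation'
```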
by Madame AI
Automated E-commerce Store Monitoring for New Products Using BrowserAct

This n8n template is an advanced competitive-intelligence tool that automatically monitors competitor e-commerce/Shopify stores and alerts you the moment they launch a new product. It is essential for e-commerce store owners, product strategists, and marketing teams who need real-time insight into what their competitors are selling.

Self-Hosted Only
This workflow uses a community contribution and is designed and tested for self-hosted n8n instances only.

How it works

1. The workflow runs on a Schedule Trigger to check for new products automatically (e.g., daily).
2. A Google Sheets node fetches your master list of competitor store links from a central sheet.
3. The workflow loops through each competitor one by one.
4. For each competitor, a Google Sheets node first creates a dedicated tracking sheet (if one doesn't exist) to store their product-list history.
5. A BrowserAct node then scrapes the competitor's current product list from their live website.
6. The scraped data is saved to the competitor's dedicated tracking sheet.
7. The workflow then fetches the newly scraped list and the previously stored list of products.
8. A custom Code node (labeled "Compare Datas") performs a difference check to reliably detect whether any new products have been added.
9. If a new product is detected, an If node triggers an immediate Slack alert to your team, providing real-time competitive insight.

Requirements

- **BrowserAct** API account for web scraping
- **BrowserAct** "Competitors Shopify Website New Product Monitor" template
- **BrowserAct** n8n community node (n8n Nodes BrowserAct)
- **Google Sheets** credentials for storing and managing data
- **Slack** credentials for sending alerts

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- How to Use the BrowserAct n8n Community Node

Workflow Guidance and Showcase

Automatically Track Competitor Products | n8n & Google Sheets Template
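The "Compare Datas" difference check (step 8) amounts to finding products in the new scrape that are absent from the stored list. A minimal sketch; keying on the product URL is an assumption, since the template may compare other fields:

```javascript
// Sketch of the "Compare Datas" difference check: find products in the
// new scrape that are absent from the stored list. Keying on the URL
// is an assumption for illustration.
function findNewProducts(previous, current) {
  const known = new Set(previous.map(p => p.url));
  return current.filter(p => !known.has(p.url));
}

const added = findNewProducts(
  [{ url: '/products/tote' }],
  [{ url: '/products/tote' }, { url: '/products/cap' }],
);
console.log(added); // [ { url: '/products/cap' } ]
```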
by Ziana Mitchell
Workflow Introduction

This workflow is specifically for self-hosted n8n Docker users. Its purpose is to:

- Take an inventory of all workflows with credentials (Workflow -> Credential 1, Credential 2, etc.), connecting credential names to workflow names
- Create a credentials impact map (Credential -> Workflow 1, Workflow 2, etc.), connecting workflow names to credential IDs and names

This workflow drastically reduces the time spent figuring out which workflows will break if a credential is updated or deleted, by bridging the gap between the public API (workflow metadata) and the internal SQLite database (hidden version labels and credential mappings). In essence, this tool audits your n8n environment without touching your sensitive encrypted data, keeping your API keys 100% secure. It also features dynamic time-saved trackers, set to my estimate of how long it would take to manually complete everything this workflow finishes in under 5 minutes.

Limitations

This workflow is optimized for the standard n8n execution mode. Due to the high-frequency data lookups this tool requires, it is not compatible with Task Runners (N8N_RUNNERS_ENABLED=true). For maximum stability, and to prevent "Database Locked" errors, please ensure your instance is running in the default configuration.

If you use Method 1 (Direct Insertion) in the API Key Instructions below, your n8n API key will not appear as connected to this workflow in the resulting credential impact map. The same goes for any other credential inserted directly into an HTTP Request node across your n8n environment.

Setting It Up - Requirements

Environment Variables Required:
These environment variables are required for the version_labels and flow_names Code nodes to work.
NODE_FUNCTION_ALLOW_BUILTIN=*
NODE_FUNCTION_ALLOW_EXTERNAL=sqlite3

Note: If you need access to multiple external modules across your n8n instance, list them comma-separated, like this: NODE_FUNCTION_ALLOW_EXTERNAL=sqlite3,better-sqlite3

Credentials Needed:

- n8n account > n8n self-hosted API key
- Google Sheets account
- Google Drive account

The nodes that need manually inputted credentials or node-parameter information are: Create folder, flowsList, credsList, and finalFlowEdits.

Note: If you create your credentials before opening any process nodes, your credentials will be auto-connected to the associated nodes. However, this will not happen for the n8n self-hosted API key. If you have multiple Google Sheets or Google Drive accounts, you will need to point the associated nodes to the specific one you want to use.

API Key Instructions

How to get it: Settings -> API -> Create API Key -> name it -> copy it. Save the API key somewhere secure; you won't be able to see it again!

How to use it (two methods):
Note: The nodes are set up for Method 1 by default.

Method 1 (Direct Insertion). In the flowsList and credsList nodes:
- Authentication: None
- Send Headers: on
- Name: X-N8N-API-KEY
- Value: <YOUR N8N API KEY>

Method 2 (n8n Credential). In the flowsList and credsList nodes:
- Authentication: Generic Credential Type
- Generic Auth Type: Header Auth
- Header Auth: select "Create new credential" (if you already have a Header Auth credential for your n8n API key, select it here)

In the credential creation area:
- Name: X-N8N-API-KEY
- Value: <YOUR N8N API KEY>
Queries and Body aren't needed.

Endpoints:
- flowsList: http://localhost:5678/api/v1/workflows or https://<your webhook domain>/api/v1/workflows
- credsList: http://localhost:5678/api/v1/credentials or https://<your webhook domain>/api/v1/credentials

The Workflow Processes

Global Actions

1. Executes on a set schedule (weekly on Fridays at 2pm)
   1a. A deactivated execute-on-command node is included for testing, or for when updated files are needed before the scheduled date/time
2. Creates a Drive folder to house the resulting spreadsheets, using the current date and time as the name
   2a. This workflow creates the folders inside a specified parent folder.
   2b. This node also sets the folder color; you are welcome to change this to your preferred color.

Branch 1: Workflow Credentials Inventory Process

1. Creates a Google spreadsheet with the current date in the name
2. Gets the list of all workflows from the public API
3. Splits the list into individual items for easier processing
4. Gets the version labels for active workflows from the internal SQLite database
5. Sets fields for the workflow's name, ID, published status, latest published version label, and current version's ID (whether the workflow is published or not)
6. Filters out the workflows that don't have any nodes with credentials
7. Splits out the list of nodes for easier processing
8. Filters out the nodes that don't have associated credentials
9. Sets fields for the node names and credential names
10. Summarizes the output from the nodes editor, based on workflows
    10a. Appends nodeNames
    10b. Appends credentials
    10c. Counts the nodes
11. Cleans up the summarized information, converting it back to a recognizable, clean JSON format
12. Sets fields for spreadsheet aesthetics (this step can be skipped; I just didn't like the visual look of going directly from CleanJson to credInventory)
13. Appends the header row and data rows to the spreadsheet created in Step 1
Moves the file into the folder created at the start of the workflow, from wherever Google put it by default. Finally, tracks the estimated time it would have taken to do every step manually, in a time-saved node.

Branch 2: Credential Impact Map Process

1. Creates a spreadsheet with the current date in the name
2. Gets a list of all credentials from the public API
3. Splits the list into individual items for easier processing
4. Sorts the credentials by name in ascending order (A->Z) in case they come out of the Split Out node out of order
5. Gets a list of all workflows with nodes associated with each credential ID from the internal SQLite database
6. Sets fields for the credential name, ID, type, created and updated timestamps, and list of workflow names
7. Appends the header row and data rows to the spreadsheet created in Step 1 of this branch
8. Moves the file into the folder created at the start of the workflow, from wherever Google put it by default
9. Tracks the estimated time it would have taken to do every step manually, in a time-saved node
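The inventory and the impact map are inverses of each other, and the grouping idea behind Branch 2 can be sketched in plain JavaScript. The input shape below is an assumption for illustration, not the workflow's actual item schema:

```javascript
// Sketch: invert a workflow->credentials inventory into a
// credential->workflows impact map, as Branch 2 produces.
// The input shape is an assumption for illustration.
function buildImpactMap(workflows) {
  const map = {};
  for (const wf of workflows) {
    for (const cred of wf.credentials) {
      (map[cred] ??= []).push(wf.name);
    }
  }
  return map;
}

const map = buildImpactMap([
  { name: 'Sync Leads', credentials: ['Google Sheets', 'Slack'] },
  { name: 'Daily Report', credentials: ['Google Sheets'] },
]);
console.log(map['Google Sheets']); // ['Sync Leads', 'Daily Report']
```

Updating or deleting a credential then becomes a single lookup in the map to see every workflow it would break.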