by Airtop
## About the Airtop Automation

Are you tired of being shocked by unexpectedly high energy bills? With this automation using Airtop and n8n, you can take control of your daily energy costs and ensure you're always informed.

### How to monitor your daily energy consumption

This automation retrieves your PG&E (Pacific Gas and Electric) energy usage data, calculates costs, and emails you the details, all without manual effort (a sketch of the cost arithmetic appears at the end of this description).

### What You'll Need

To get started, make sure you have the following:

- A free Airtop API Key
- PG&E account credentials (with minor adaptations, this also works with other providers)
- An email address to receive the energy cost updates

Estimated setup time: 5 minutes

### Understanding the Process

This automation works by:

1. Logging into your PG&E account using your credentials
2. Navigating to your energy usage data
3. Extracting relevant details about energy consumption and costs
4. Emailing the daily summary directly to your inbox

The automation is straightforward and gives you real-time insight into your energy usage, empowering you to adjust your habits and save money.

### Setting Up Your Automation

We've created a step-by-step guide to help you set up this workflow:

1. Insert your credentials:
   - In the tools section, add your PG&E login details as variables
   - In Airtop, add your Airtop API Key
   - Configure your email address to receive the updates
2. Run the automation: start the scenario and watch as it retrieves your energy data and sends you a detailed email summary.

### Customization Options

While the default setup works out of the box, you can tweak it to suit your needs:

- **Data Storage:** Store energy usage data in a database for long-term tracking and analysis
- **Visualization:** Plot graphs of your energy usage trends over time for better insights
- **Notifications:** Send alerts only on high usage instead of a daily email

### Real-World Applications

This automation isn't just about monitoring energy usage; it's about taking control. Here are some practical applications:

- **Daily Energy Management:** Receive updates every morning and adjust your energy consumption based on costs
- **Smart Home Integration:** Use the data to automate appliances during off-peak hours
- **Budgeting:** Track energy expenses over weeks or months to plan your budget more effectively

Happy automating!
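As a reference, here is a minimal sketch of the daily cost arithmetic this automation performs, written in the style of an n8n Code node. The field names `usageKwh` and `ratePerKwh` are illustrative assumptions; map them to whatever Airtop extracts from your PG&E usage page.

```javascript
// Minimal sketch of the daily cost calculation (n8n Code node).
// usageKwh and ratePerKwh are assumed field names from the extraction step.
const usageKwh = $json.usageKwh;     // e.g. 18.4 kWh used today
const ratePerKwh = $json.ratePerKwh; // e.g. 0.42 USD per kWh

const dailyCost = usageKwh * ratePerKwh;

return [{
  json: {
    date: new Date().toISOString().slice(0, 10),
    usageKwh,
    dailyCost: Number(dailyCost.toFixed(2)), // rounded for the email summary
  },
}];
```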
by dataplusminus+-
## 🎯 Project Purpose

This project automates the process of collecting and managing new leads submitted through a web form. It eliminates the need for manual data entry and ensures that each lead is:

- Properly recorded and time-stamped in a structured format
- Automatically communicated to the sales or support team
- Ready for follow-up, with a reminder system in place

It's a lightweight but effective solution suitable for freelancers, small teams, and growing businesses that want to streamline their lead intake process.

## 🛠️ Tools & Technologies Used

- **Google Forms / Web Form** – Frontend for capturing leads
- **Google Sheets** – Central database for storing lead information
- **n8n** – Automation platform that connects and coordinates all services
- **Gmail** – Handles email notifications for new leads
- **Slack** *(optional)* – Provides instant team notifications
- **Date & Time nodes** – Track and manage lead response timing
- **Conditional (IF) nodes** – Filter out duplicate and incomplete entries (see the sketch after this section)

## 🔄 Workflow Overview

## ✨ Key Features

✅ No-code integration using n8n
✅ Instant alerts via Gmail and/or Slack
✅ Google Sheets as an easily accessible backend
✅ Modular design – easy to expand with CRM tools (like HubSpot)
✅ Clean JSON structure and logic, beginner-friendly

## 📈 Possible Improvements

- Add email validation via external API (e.g., NeverBounce, Hunter)
- Integrate with a CRM for deeper automation
- Add lead scoring based on answers
- Include automatic follow-up emails after X days
- Schedule weekly summary reports via email

## 🧑🏻‍💻 Creator Information

Developed by: Adem Tasin
🌐 Website: Dataplusminus+-
📧 Email: dataplusminuss@gmail.com
💼 LinkedIn: Adem Tasin
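For illustration, the duplicate/incomplete-entry filtering could be expressed as an n8n Code node like the sketch below. The node name "Google Sheets" and the field names (`name`, `email`) are assumptions; adapt them to your own form fields and sheet columns.

```javascript
// Sketch of the duplicate/incomplete-lead filter (n8n Code node).
// Collect the emails already stored in the sheet for the duplicate check.
const existingEmails = new Set(
  $("Google Sheets").all().map((item) => item.json.email)
);

return $input.all().filter((item) => {
  const { name, email } = item.json;
  const isComplete = Boolean(name) && Boolean(email); // drop incomplete entries
  return isComplete && !existingEmails.has(email);    // drop duplicates
});
```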
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically performs weekly keyword research and competitor analysis to discover trending keywords in your industry. It saves you time by eliminating the need to manually research keywords and provides a constantly updated database of trending search terms and opportunities.

## Overview

This workflow automatically researches trending keywords for any specified topic or industry using AI-powered search capabilities. It runs weekly to gather fresh keyword data, analyzes search trends, and saves the results to Google Sheets for easy access and analysis.

## Tools Used

- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For accessing search engines and keyword data sources
- **OpenAI**: AI agent for intelligent keyword research and analysis
- **Google Sheets**: For storing and organizing keyword research data

## How to Install

1. **Import the Workflow**: Download the .json file and import it into your n8n instance
2. **Configure Bright Data**: Add your Bright Data credentials to the MCP Client node
3. **Set Up OpenAI**: Configure your OpenAI API credentials
4. **Configure Google Sheets**: Connect your Google Sheets account and set up your keyword tracking spreadsheet
5. **Customize**: Define your target topics or competitors for keyword research

## Use Cases

- **SEO Teams**: Discover new keyword opportunities and track trending search terms
- **Content Marketing**: Find trending topics for content creation and strategy
- **PPC Teams**: Identify new keywords for paid advertising campaigns
- **Competitive Analysis**: Monitor competitor keyword strategies and market trends

## Connect with Me

- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #keywordresearch #seo #brightdata #webscraping #competitoranalysis #contentmarketing #n8nworkflow #workflow #nocode #seoresearch #keywordmonitoring #searchtrends #digitalmarketing #keywordtracking #contentautomation #marketresearch #trendingkeywords #keywordanalysis #seoautomation #keyworddiscovery #searchmarketing #keyworddata #contentplanning #seotools #keywordscraping #searchinsights #markettrends #keywordstrategy
by Kalyxi Ai
# 🚀 Automate News Discovery & Publishing with GPT-4, Google Search API & Slack

## 🎯 Overview

Automated content publishing system that discovers industry news, transforms it into original articles using GPT-4, and publishes across multiple channels with SEO optimization and intelligent duplicate prevention.

## ✨ Key Features

- 🤖 **Smart Query Generation** – AI agent generates unique search queries while checking Google Sheets to avoid duplicates
- 🔍 **News Discovery** – Uses Google Custom Search API to find recent articles from the last 7 days (a request sketch appears at the end of this description)
- 🧠 **Content Intelligence** – Processes search results and skips anti-bot protected sites automatically
- 📝 **GPT-4 Article Generation** – Creates professional, SEO-optimized news articles in Reuters/Bloomberg style
- 📢 **Multi-Channel Publishing** – Publishes to CMS with automatic Slack notifications
- 📊 **Comprehensive Tracking** – Logs all activity to Google Sheets for analytics and duplicate prevention

## 🔄 How It Works

1. ⏰ **Scheduled Trigger** runs every 8 hours to maintain consistent content flow
2. 🤖 **AI Agent** generates targeted search queries for your niche while checking historical data
3. 🔍 **Google Search** finds recent articles and extracts metadata (title, snippet, source)
4. 🛡️ **Smart Content Handler** bypasses sites with anti-bot protection, using search snippets instead
5. ⚡ **GPT-4 Processing** transforms snippets into comprehensive 2000+ word articles with proper formatting
6. 🚀 **Publishing Pipeline** formats content for CMS with SEO metadata and publishes automatically
7. 📱 **Notification System** sends detailed Slack updates with article metrics
8. 📈 **Activity Logging** tracks all published content to prevent future duplicates

## 🔧 Setup Requirements

### 📋 Prerequisites

- Google Custom Search API key and Search Engine ID
- OpenAI GPT-4 API access
- Google account for tracking spreadsheet
- Slack workspace for notifications
- CMS or website with API endpoint for publishing

### 🛠️ Step-by-Step Setup

**Step 1: 🔎 Google Custom Search Configuration**

1. Go to Google Custom Search Engine
2. Create a new search engine
3. Configure it to search the entire web
4. Copy your Search Engine ID (cx parameter)
5. Get your API key from Google Cloud Console

**Step 2: 📊 Google Sheets Template Setup**

Create a Google Sheet with these required columns:

- **Column A:** `timestamp` – ISO date format (YYYY-MM-DD HH:MM:SS)
- **Column B:** `query` – The search query used
- **Column C:** `title` – Published article title
- **Column D:** `url` – Published article URL
- **Column E:** `status` – Publication status (success/failed)
- **Column F:** `word_count` – Final article word count

Template URL: Copy this Google Sheets template

**Step 3: 🔑 Credential Configuration**

Set up the following credentials in n8n:

- 📊 **Google Sheets API** – OAuth2 connection to your Google account
- 🤖 **OpenAI API** – Your GPT-4 API key
- 📱 **Slack Webhook** – Webhook URL for your notification channel
- 🔍 **Custom Search API** – Your Google Custom Search API key

**Step 4: ⚙️ Workflow Customization**

Modify these key parameters to fit your needs:

- 🎯 **Search Topic:** Edit the AI agent prompt to focus on your industry
- ⏰ **Publishing Schedule:** Adjust the cron trigger (default: every 8 hours)
- 📝 **Article Length:** Modify the GPT-4 prompt for different word counts
- 🌐 **CMS Endpoint:** Update the publishing node with your website's API

## 🎨 Customization Options

### 🎯 Content Targeting

- Modify the AI agent's search query generation to focus on specific industries
- Adjust date restrictions (currently set to last 7 days)
- Change the number of search results processed per run

### ✍️ Article Style

- Customize GPT-4 prompts for different writing styles (formal, casual, technical)
- Adjust article length requirements
- Modify SEO optimization parameters

### 📡 Publishing Channels

- Add additional CMS endpoints for multi-site publishing
- Configure different notification channels (Discord, Teams, etc.)
- Set up social media auto-posting integration

## 💡 Use Cases

- 📰 Automated news websites
- 📝 Industry blog content generation
- 🔍 SEO content pipeline automation
- 📊 News aggregation and republishing
- 📈 Content marketing automation

## 🛠️ Technical Notes

- Workflow includes error handling for anti-bot protection
- Duplicate prevention through Google Sheets tracking
- Rate limiting considerations for API usage
- Automatic retry logic for failed requests

## 🆘 Support

For setup assistance or customization help, refer to the workflow's internal documentation nodes or contact the template creator.
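For reference, here is a sketch of the Custom Search request behind the News Discovery step. It follows Google's Custom Search JSON API; the key, engine ID, and query are placeholders you supply, and the fields your workflow maps downstream may differ.

```javascript
// Sketch of the Custom Search call used for news discovery.
const API_KEY = "YOUR_GOOGLE_API_KEY";
const SEARCH_ENGINE_ID = "YOUR_CX_ID";
const query = "ai industry news";

const url =
  `https://www.googleapis.com/customsearch/v1` +
  `?key=${API_KEY}&cx=${SEARCH_ENGINE_ID}` +
  `&q=${encodeURIComponent(query)}` +
  `&dateRestrict=d7`; // restrict results to the last 7 days

const res = await fetch(url);
const data = await res.json();

// Each result carries the metadata the workflow extracts (title, snippet, source).
const articles = (data.items ?? []).map((item) => ({
  title: item.title,
  snippet: item.snippet,
  source: item.displayLink,
  url: item.link,
}));
console.log(articles);
```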
by Samir Saci
**Tags:** Automation, AI, Marketing, Content Creation

## Context

I'm a Supply Chain Data Scientist and content creator who writes regularly about data-driven optimization, logistics, and sustainability. Promoting blog articles on LinkedIn used to be a manual task, until I decided to automate it with n8n and GPT-4o.

This workflow lets you automatically extract blog posts, clean the content, and generate a professional LinkedIn post using an AI Agent powered by GPT-4o, all in one seamless automation.

> Save hours of repetitive work and boost your reach with AI.

📬 For business inquiries, you can add me on LinkedIn

## Who is this template for?

This template is perfect for:

- **Bloggers and writers** who want to promote their content on LinkedIn
- **Marketing teams** looking to automate professional post-generation
- **Content creators** using Ghost platforms

It generates polished LinkedIn posts with:

- A hook
- A quick summary
- A call-to-action
- A signature to drive readers to your contact page

## How does it work?

This workflow runs in n8n and performs the following steps:

1. 🚀 Triggers manually (or you can add a scheduler)
2. 📰 Pulls recent blog posts from your Ghost site (via API)
3. 🧼 Cleans the HTML content for AI input (see the sketch after this section)
4. 🤖 Sends content to GPT-4o with a tailored prompt to create a LinkedIn post
5. 📄 Records all data (post content + LinkedIn output) in a Google Sheet

## What do I need to start?

You don't need to write a single line of code.

Prerequisites:

- A Ghost CMS account with blog content
- A Google Sheet to store generated posts
- An OpenAI API Key
- **Google Sheets API** connected via OAuth2

## Next Steps

Use the sticky notes in the workflow to understand how to:

- Add your Ghost API credentials
- Link your Google Sheet
- Customize the AI prompt (e.g., change the author name or tone)
- Optionally add auto-posting to LinkedIn using tools like Buffer or Make

🎥 Watch My Tutorial

🚀 Want to explore how automation can scale your brand or business?
📬 Let's connect on LinkedIn

## Notes

You can adapt this template for Twitter, Facebook, or even email newsletters by adjusting the prompt and output channel.

This workflow was built using n8n 1.85.4
Submitted: April 9th, 2025
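For the curious, the HTML-cleaning step can be approximated with a short n8n Code node like the sketch below. The input field name `html` is an assumption; map it to the post content returned by the Ghost API.

```javascript
// Minimal sketch of the HTML-cleaning step (n8n Code node).
const html = $json.html;

const text = html
  .replace(/<script[\s\S]*?<\/script>/gi, "") // drop scripts
  .replace(/<style[\s\S]*?<\/style>/gi, "")   // drop styles
  .replace(/<[^>]+>/g, " ")                    // strip remaining tags
  .replace(/&nbsp;/g, " ")
  .replace(/\s+/g, " ")                        // collapse whitespace
  .trim();

return [{ json: { ...$json, cleanText: text } }];
```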
by Emmanuel Bernard
# Automatically Add Captions to Your Video

## Who Is This For?

This workflow is ideal for content creators, marketers, educators, and businesses that regularly produce video content and want to enhance accessibility and viewer engagement by effortlessly adding subtitles.

## What Problem Does This Workflow Solve?

Manually adding subtitles or captions to videos can be tedious and time-consuming. Accurate captions significantly boost viewer retention, accessibility, and SEO rankings.

## What Does This Workflow Do?

This automated workflow quickly adds accurate subtitles to your video content by leveraging the Json2Video API:

- It accepts a publicly accessible video URL as input.
- It makes an HTTP request to Json2Video, where AI analyzes the video, generates captions, and applies them seamlessly.
- The workflow returns a URL to the final subtitled video.
- The second part of the workflow periodically checks the Json2Video API to monitor the processing status at intervals of 10 seconds (see the sketch after this section).

👉🏻 Try Json2Video for Free 👈🏻

## Key Features

- **Automatic & Synced Captions:** Captions are generated automatically and synchronized perfectly with your video.
- **Fully Customizable Design:** Easily adjust fonts, colors, sizes, and more to match your unique style.
- **Word-by-Word Display:** Supports precise, word-by-word captioning for improved clarity and viewer engagement.
- **Super Fast Processing:** Rapid caption generation saves time, allowing you to focus more on creating great content.

## Preconditions

To use this workflow, you must have:

- A Json2Video API account.
- A video hosted at a publicly accessible URL.

## Why You Need This Workflow

Adding subtitles to your videos significantly enhances their reach and effectiveness by:

- Improving SEO visibility, enabling search engines to effectively index your video content.
- Enhancing viewer engagement and accessibility, accommodating viewers who watch without sound or who have hearing impairments.
- Streamlining your content production process, allowing more focus on creativity.

## Specific Use Cases

- **Social Media Content:** Boost viewer retention by adding subtitles.
- **Educational Videos:** Enhance understanding and improve learning outcomes.
- **Marketing Videos:** Reach broader and more diverse audiences.
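To illustrate the 10-second status-checking loop, here is a hedged sketch in plain JavaScript. The endpoint path, header name, and response fields are assumptions based on typical Json2Video usage; confirm them against the Json2Video documentation.

```javascript
// Sketch of the status-polling loop (assumed Json2Video v2 endpoint and fields).
const API_KEY = "YOUR_JSON2VIDEO_API_KEY";
const projectId = "YOUR_PROJECT_ID";

async function waitForVideo() {
  for (;;) {
    const res = await fetch(
      `https://api.json2video.com/v2/movies?project=${projectId}`,
      { headers: { "x-api-key": API_KEY } }
    );
    const body = await res.json();

    if (body.movie?.status === "done") return body.movie.url; // final video URL
    if (body.movie?.status === "error") throw new Error(body.movie.message);

    await new Promise((r) => setTimeout(r, 10_000)); // poll every 10 seconds
  }
}

waitForVideo().then((url) => console.log("Subtitled video:", url));
```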
by Nikan Noorafkan
# 🧾 Template: Extract Ad Creatives from Google's Ads Transparency Center

This n8n workflow pulls ad creatives from Google's Ads Transparency Center using SerpApi, filtered by a specific domain and region. It extracts, filters, categorizes, and exports ads into neatly formatted CSV files for easy analysis.

## 👤 Who's it for?

- **Marketing Analysts** researching competitive PPC strategies
- **Ad Intelligence Teams** monitoring creatives from specific brands
- **Digital Marketers** gathering visual and copy trends
- **Journalists & Watchdogs** reviewing ad activity transparency

## ✅ Features

- **Fetch creatives** using SerpApi's `google_ads_transparency_center` engine (a request sketch appears at the end of this description)
- **Filter results** to include only ads with an exact match to your target domain
- **Categorize** by ad format: text, image, or video
- **Export CSVs**: Generates a downloadable file for each format under the `/files/` directory

## 🛠 How to Use

1. **Edit the "Set Domain & Region" node**
   - `domain`: e.g. example.com
   - `region`: SerpApi numeric region code (see codes)
2. **Add your SerpApi API key** in the "Get Ads Page 1" node's credentials section.
3. **Run the workflow**: Click "Test workflow" to initiate the process.
4. **Download your results**: Navigate to `/files/` to find:
   - `text_{domain}_ads.csv`
   - `image_{domain}_ads.csv`
   - `video_{domain}_ads.csv`

## 📌 Notes

- Only the first page (up to 50 creatives) is fetched; pagination is not included.
- Sticky Notes inside the workflow nodes offer helpful internal annotations.
- CSV files include creative-level details: ad copy, images, video links, etc.
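For orientation, here is a hedged sketch of the underlying SerpApi call and the exact-domain filter. The engine name comes from this template, but the parameter names (`text`, `region`) and the `ad_creatives` response field are assumptions to verify against SerpApi's docs.

```javascript
// Hedged sketch of the SerpApi request and exact-domain filter.
const SERPAPI_KEY = "YOUR_SERPAPI_KEY";
const domain = "example.com";
const region = "2840"; // SerpApi numeric region code (assumed: 2840 = United States)

const params = new URLSearchParams({
  engine: "google_ads_transparency_center",
  text: domain,     // assumed parameter name for the advertiser/domain search
  region,
  api_key: SERPAPI_KEY,
});

const res = await fetch(`https://serpapi.com/search.json?${params}`);
const data = await res.json();

// Keep only creatives whose advertiser matches the target domain exactly
// (field names here are illustrative).
const ads = (data.ad_creatives ?? []).filter((ad) => ad.advertiser === domain);
console.log(`${ads.length} creatives for ${domain}`);
```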
by Harshil Agrawal
This workflow gets the top 5 products from Product Hunt and shares them on the Discord server.

- **Cron node:** Triggers the workflow every hour. Based on your use case, you can update the node to trigger the workflow at a different time.
- **GraphQL node:** Makes the API call to the Product Hunt GraphQL API. You will need an API token from Product Hunt to make the call (a sketch of the query is shown below).
- **Item Lists node:** Transforms the single item returned by the previous node into multiple items.
- **Set node:** Returns only the name, description, and votes of each product.
- **Discord node:** Sends the top 5 products to the Discord server.
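As a reference, a GraphQL query along these lines, sent to Product Hunt's v2 API endpoint, returns the fields the Set node keeps. Treat the exact field selection and ordering as assumptions to verify in Product Hunt's GraphQL explorer.

```javascript
// Sketch of the GraphQL call for the top 5 Product Hunt posts.
const query = `
  query {
    posts(first: 5, order: VOTES) {
      edges {
        node {
          name
          description
          votesCount
        }
      }
    }
  }
`;

const res = await fetch("https://api.producthunt.com/v2/api/graphql", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: "Bearer YOUR_PRODUCT_HUNT_TOKEN", // API token from Product Hunt
  },
  body: JSON.stringify({ query }),
});
console.log(await res.json());
```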
by Swot.AI
This workflow automates document summarization directly from Google Drive, processes the content using Mistral AI, and delivers a clean, styled summary via Gmail. It's ideal for professionals who need quick insights from lengthy documents without manually reading through them.

## ✅ Key Features

- **Google Drive Integration:** Fetches a file (PDF/DOCX) from your Drive.
- **AI Summarization:** Uses Mistral AI to extract key points efficiently.
- **Styled Email Output:** Delivers a formatted, easy-to-read summary to your inbox with a timestamp.
- **Error Handling:** Built to skip corrupted files or missing credentials.

## 🔧 Nodes Breakdown

1️⃣ **Manual Trigger** – Starts the workflow manually for easy testing.
2️⃣ **Google Drive Node** – Downloads a specified file from Google Drive (supports PDF/DOCX).
3️⃣ **Mistral Cloud Chat Model Node** – Connects to Mistral AI for summarization.
4️⃣ **Summarization Chain Node** – Breaks the file into chunks, processes content, and generates a concise summary.
5️⃣ **Gmail Node** – Sends the styled summary directly to the user's inbox, with custom formatting and the current time in the Lagos timezone.

## Extra Features

- **Dynamic Time Formatting:** Supports the Lagos timezone, easily adjustable (see the sketch after this section).
- **HTML Styling:** Beautiful email formatting with headers, icons, and line breaks for clarity.
- **Custom Email Sender Name:** Branded output (e.g., "Swot.AI").
- **Future Expansion:** Can extend to WhatsApp or Slack with minor tweaks.

## Use Cases

- Legal teams summarizing contracts.
- Content creators extracting highlights from research papers.
- Business analysts getting insights from reports on the go.

## Customization Tips

- Change the timezone (`Africa/Lagos`) to match your preferred location.
- Add error-handling nodes for missing files or API failures.
- Swap Mistral AI with OpenAI for different summarization behavior.
- Change the "Send To" address (the email that receives the summarized text) to your preferred address.
- Change the "Sender Name" from Swot.AI to your preferred sender name.

## Why Use This Workflow?

This automation saves hours of manual reading. It's perfect for personal productivity, legal analysis, content creation, or business reporting. With clean formatting and a professional email summary, your team will get instant insights in seconds!

I can improve this workflow further or build others. If interested: *Swot.ai25@gmail.com*
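The timezone handling can be reproduced with the standard `Intl` API, as in this minimal sketch; swap `Africa/Lagos` for any IANA timezone name.

```javascript
// Minimal sketch of the Lagos-timezone timestamp used in the email output.
const formatted = new Intl.DateTimeFormat("en-NG", {
  timeZone: "Africa/Lagos", // change to your preferred IANA timezone
  dateStyle: "full",
  timeStyle: "short",
}).format(new Date());

console.log(formatted); // e.g. "Friday, 16 May 2025 at 14:30"
```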
by Harshil Agrawal
This workflow demonstrates the use of the `$item(index)` method. This method is useful when you want to reference an item at a particular index. This example workflow makes POST HTTP requests to a dummy URL.

- **Set node:** Sets the API key that will be used later in the workflow. This node returns a single item. It can be replaced with other nodes, based on the use case.
- **Customer Datastore node:** Returns the data of customers that will be sent in the body of the HTTP request. This node returns 5 items. It can be replaced with other nodes, based on the use case.
- **HTTP Request node:** Uses the information from both the Set node and the Customer Datastore node. Since the node runs 5 times, once for each item of the Customer Datastore node, you need to reference the API key 5 times. However, the Set node returns the API key only once. Using the expression `{{ $item(0).$node["Set"].json["apiKey"] }}`, you tell n8n to use the same API key for all 5 requests, as shown in the sketch below.
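For illustration, this is roughly how the expression sits in the HTTP Request node's parameter JSON; the header name `X-API-KEY` is a hypothetical example, not part of the original workflow.

```javascript
// Sketch of the HTTP Request node's header parameters (illustrative).
// The leading "=" marks an n8n expression in the node's parameter JSON.
const headerParameters = {
  parameters: [
    {
      name: "X-API-KEY", // hypothetical header name
      // $item(0) pins the lookup to the Set node's first (and only) item,
      // so all 5 executions reuse the same API key.
      value: '={{ $item(0).$node["Set"].json["apiKey"] }}',
    },
  ],
};
```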
by Mauricio Perera
## Overview

This workflow exposes an HTTP endpoint (webhook) that accepts a JSON definition of an n8n workflow, validates it, and, if everything is correct, dynamically creates that workflow in the n8n instance via its internal API. If any validation fails or the API call encounters an error, an explanatory message with details is returned.

## Workflow Diagram

```
Webhook
  │
  ▼
Validate JSON ── fails validation ──► Validation Error
  │
  └─ passes ─► Validation Successful?
                  │
                  ├─ true ─► Create Workflow ──► API Successful? ──► Success Response
                  │                                  │
                  │                                  └─ false ─► API Error
                  └─ false ─► Validation Error
```

## Step-by-Step Details

### 1. Webhook

- **Type**: Webhook (POST)
- **Path**: `/webhook/create-workflow`
- **Purpose**: Expose a URL to receive a JSON definition of a workflow.
- **Expected Input**: JSON containing the main workflow fields (name, nodes, connections, settings).

### 2. Validate JSON

- **Type**: Code Node (JavaScript)
- **Validations Performed**:
  - Ensure that the payload exists and contains both `name` and `nodes`.
  - Verify that `nodes` is an array with at least one item.
  - Check that each node includes the required fields: `id`, `name`, `type`, `position`.
  - If missing, initialize `connections`, `settings`, `parameters`, and `typeVersion`.
- **Output if Error**:

```json
{ "success": false, "message": "<error description>" }
```

- **Output if Valid**:

```
{
  "success": true,
  "apiWorkflow": {
    "name": payload.name,
    "nodes": payload.nodes,
    "connections": payload.connections,
    "settings": payload.settings
  }
}
```

A condensed sketch of this Code node appears at the end of this section.

### 3. Validation Successful?

- **Type**: IF Node
- **Condition**: `$json.success === true`
- **Branches**:
  - `true`: proceed to Create Workflow
  - `false`: route to Validation Error

### 4. Create Workflow

- **Type**: HTTP Request (POST)
- **URL**: `http://127.0.0.1:5678/api/v1/workflows`
- **Authentication**: Header Auth with internal credentials
- **Body**: The `apiWorkflow` object generated earlier
- **Options**: `continueOnFail: true` (to handle failures in the next IF)

### 5. API Successful?

- **Type**: IF Node
- **Condition**: `$response.statusCode <= 299`
- **Branches**:
  - `true`: proceed to Success Response
  - `false`: route to API Error

### 6. Success Response

- **Type**: SET Node
- **Output**:

```json
{
  "success": "true",
  "message": "Workflow created successfully",
  "workflowId": "{{ $json.data[0].id }}",
  "workflowName": "{{ $json.data[0].name }}",
  "createdAt": "{{ $json.data[0].createdAt }}",
  "url": "http://localhost:5678/workflow/{{ $json.data[0].id }}"
}
```

### 7. API Error

- **Type**: SET Node
- **Output**:

```json
{
  "success": "false",
  "message": "Error creating workflow",
  "error": "{{ JSON.stringify($json) }}",
  "statusCode": "{{ $response.statusCode }}"
}
```

### 8. Validation Error

- **Type**: SET Node
- **Output**:

```json
{ "success": false, "message": "{{ $json.message }}" }
```

## Example Webhook Request

```bash
curl --location --request POST 'http://localhost:5678/webhook/create-workflow' \
--header 'Content-Type: application/json' \
--data-raw '{
  "name": "My Dynamic Workflow",
  "nodes": [
    {
      "id": "start-node",
      "name": "Start",
      "type": "n8n-nodes-base.manualTrigger",
      "typeVersion": 1,
      "position": [100, 100],
      "parameters": {}
    },
    {
      "id": "set-node",
      "name": "Set",
      "type": "n8n-nodes-base.set",
      "typeVersion": 1,
      "position": [300, 100],
      "parameters": {
        "values": {
          "string": [
            {
              "name": "message",
              "value": "Hello from a webhook-created workflow!"
            }
          ]
        }
      }
    }
  ],
  "connections": {
    "Start": {
      "main": [
        [
          { "node": "Set", "type": "main", "index": 0 }
        ]
      ]
    }
  },
  "settings": {}
}'
```

## Expected Success Response

```json
{
  "success": "true",
  "message": "Workflow created successfully",
  "workflowId": "abcdef1234567890",
  "workflowName": "My Dynamic Workflow",
  "createdAt": "2025-05-31T12:34:56.789Z",
  "url": "http://localhost:5678/workflow/abcdef1234567890"
}
```

## Validation Error Response

```json
{
  "success": false,
  "message": "The 'name' field is required in the workflow"
}
```

## API Error Response

```json
{
  "success": "false",
  "message": "Error creating workflow",
  "error": "{ ...full API response details... }",
  "statusCode": 401
}
```
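As noted above, here is a condensed sketch of the "Validate JSON" Code node's logic, matching the checks listed in step 2. Variable names and the secondary error messages are illustrative.

```javascript
// Condensed sketch of the "Validate JSON" Code node.
const payload = $json.body ?? $json; // webhook payloads usually arrive under body

if (!payload || !payload.name) {
  return [{ json: { success: false, message: "The 'name' field is required in the workflow" } }];
}
if (!Array.isArray(payload.nodes) || payload.nodes.length === 0) {
  return [{ json: { success: false, message: "'nodes' must be a non-empty array" } }];
}
for (const node of payload.nodes) {
  for (const field of ["id", "name", "type", "position"]) {
    if (node[field] === undefined) {
      return [{ json: { success: false, message: `Node is missing required field '${field}'` } }];
    }
  }
  node.parameters ??= {};  // initialize optional fields when missing
  node.typeVersion ??= 1;
}
payload.connections ??= {};
payload.settings ??= {};

return [{
  json: {
    success: true,
    apiWorkflow: {
      name: payload.name,
      nodes: payload.nodes,
      connections: payload.connections,
      settings: payload.settings,
    },
  },
}];
```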
by Daniel Ng
This n8n workflow template uses community nodes and is only compatible with the self-hosted version of n8n.

# Restore n8n Credentials from Google Drive Backup

This template enables you to restore your n8n credentials from a backup file in Google Drive. It's an essential companion to a credential backup workflow, ensuring you can recover your setup in case of data loss, instance migration, or disaster recovery. The workflow intelligently checks for existing credentials to prevent accidental overwrites of credentials with the same name that are already present.

This workflow is manually triggered. We recommend you use this restore workflow in conjunction with a backup solution like our "Auto Backup Credentials to Google Drive" template.

For more powerful n8n templates, visit our website or contact us at AI Automation Pro. We help your business build custom AI workflow automation and apps.

## Who is this for?

This workflow is for n8n administrators and users who have backed up their n8n credentials to Google Drive (e.g., using a companion backup template) and need to restore them to the same or a different n8n instance. It's crucial for those managing self-hosted instances.

## What problem is this workflow solving? / Use case

If an n8n instance becomes corrupted, needs to be migrated, or if credentials are accidentally deleted, manually re-creating all credentials can be extremely time-consuming and error-prone. This workflow automates the restoration process from a known backup, saving significant time and ensuring accuracy. It's particularly useful for:

- Disaster recovery.
- Migrating n8n instances.
- Quickly setting up a new n8n instance with existing credentials.

## What this workflow does

The workflow is manually triggered and performs the following operations:

1. **Fetch Current Credentials**: An "On Click Trigger" starts the process. It executes the command `npx n8n export:credentials --all --decrypted` via the "Execute Command Get All Credentials" node to get a list of all credentials currently in your n8n instance. This list is then processed by the "JSON Formatting Data" and "Aggregate Credentials" nodes to extract just the names of existing credentials for comparison.
2. **Download Backup File from Google Drive**: The "Google Drive Get Credentials File" node searches your Google Drive for the `n8n_backup_credentials.json` file. The "Google Drive Download File" node then downloads the found file.
3. **Process Backup Data**: The "Convert Files To JSON" node (an Extract From File node) converts the downloaded file content, expected to be JSON, into a usable JSON object. "Split Out" nodes then process this data to handle individual credential entries from the backup file.
4. **Loop and Restore Credentials**: The "Loop Over Items" node (a SplitInBatches node) iterates through each credential from the backup file.
   - **Duplicate Check**: For each credential, an IF node ("Check For Skipped Credentials") checks two conditions using an OR combinator:
     - the credential name from the backup (`$('Loop Over Items').item.json.name`) is empty, or
     - a credential with the same name already exists in the current n8n instance (checked against the list from the "Aggregate Credentials" node).
   - **Conditional Restore**: If the credential name is NOT empty AND does NOT already exist (i.e., both IF conditions are false), the workflow proceeds to the "Restore N8n Credentials" node (an n8n API node). This node uses the name, type, and data of each new credential from the backup file to create it in the n8n instance (see the sketch after this section). Credentials with empty names, or those already present, take the true path of the IF node, which loops back, so they are skipped.
   - A "Wait" node introduces a 1-second delay after each restoration attempt, to prevent API rate limiting before looping to the next item.

## Step-by-step setup

1. **n8n Instance Environment (for the current-credentials check)**: The n8n instance must have access to `npx` and `n8n-cli` for the "Execute Command Get All Credentials" node to function.
2. **Google Drive Credentials**: Configure the "Google Drive Get Credentials File" and "Google Drive Download File" nodes with your Google OAuth2 credentials.
3. **n8n API Credentials**: Configure the "Restore N8n Credentials" node with your n8n API credentials. This API key needs permissions to manage credentials.
4. **Backup File Name**: The workflow is configured to search for a file named `n8n_backup_credentials.json` in the "Google Drive Get Credentials File" node. If your backup file has a different name, or you want to specify a path, update the "Query String" parameter in this node.

## How to customize this workflow to your needs

- **Backup File Location/Query:** Modify the "Google Drive Get Credentials File" node parameters if your backup file is in a specific folder, has a different naming convention, or if you want more specific query logic.
- **Overwrite Logic:** The current workflow skips existing credentials by name. If you need to update/overwrite existing credentials, modify the logic in the "Check For Skipped Credentials" (IF) node and potentially use an "update" operation in the n8n API node, if available for credentials (note: updates often require the credential ID, which might not be in the backup file).
- **Notifications:** Add notification steps (e.g., Email, Slack) to report on the success or failure of the restoration process, and to list which credentials were restored or skipped.
- **Selective Restore:** To restore only specific credentials, add a filter step after "Split Out1" or modify the IF condition in "Check For Skipped Credentials" to check for particular credential names or types from the backup file.
- **Error Handling:** Implement more robust error handling for API errors (e.g., from the n8n API node or Google Drive nodes), file-not-found issues, or problems during command execution.

## Important Note on Credential Security

- **Decrypted Backup File:** This workflow assumes the `n8n_backup_credentials.json` file contains decrypted credential data, typically created by a companion backup workflow.
- **Execution Environment:** The "Execute Command Get All Credentials" node requires `npx n8n-cli` access on the server running n8n.
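For orientation, the restore loop expects backup entries shaped roughly like the sketch below: a `name`, a `type`, and a decrypted `data` object. The credential type and data keys shown are examples only; real files contain your actual decrypted secrets, so guard access to them carefully.

```javascript
// Illustrative shape of entries in n8n_backup_credentials.json.
// Field values are placeholders; the real file holds decrypted secrets.
const backup = [
  {
    name: "My Google Drive OAuth2",  // used for the duplicate-name check
    type: "googleDriveOAuth2Api",    // example credential type
    data: {
      clientId: "xxxxxxxx.apps.googleusercontent.com",
      clientSecret: "********",
    },
  },
];
```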