by Zacharia Kimotho
**How it works**

This workflow pulls Search Console results data and exports it to Google Sheets, making it easier to visualize the data and handle other SEO-related tasks without having to log into Search Console.

**Setup and use**
- Set your desired schedule.
- Enter your desired domain.
- Connect your Google Sheets or make a copy of this sheet.

**Detailed setup**

*Inputs and outputs:*
- Input: API response from Google Search Console covering keyword, page, and date data.
- Output: entries written to Google Sheets containing keyword data, clicks, impressions, CTR, and positions.

*Prerequisites:*
- An n8n instance set up and running.
- An active Google account with access to Google Search Console and Google Sheets.
- Google OAuth 2.0 credentials for API access.

*Step-by-step setup:*
1. Open n8n and create a new workflow.
2. Add the nodes as described in the JSON.
3. Configure the Google OAuth2 credentials in n8n to enable API access.
4. Set your domain in the "Set your domain" node.
5. Point the Google Sheets document URLs to your own sheets.
6. Adjust the schedule in the Schedule Trigger node to your requirements.
7. Save the workflow.

*Configuration options:*
- Customize the date ranges in the body of the HTTP Request nodes (a request sketch follows this entry).
- Adjust fields in the Edit Fields nodes to match different data requirements.

**Use case examples**
- Tracking website performance over time using Search Console metrics.
- Ideal for digital marketers, SEO specialists, and web analytics professionals.
- Compiling performance reports for stakeholders or team reviews.

**Running and troubleshooting**
- Running the workflow: trigger it manually or let the schedule run it automatically.
- Monitoring execution: check the execution logs in n8n's dashboard to confirm all nodes complete successfully.
- Common issues: invalid OAuth credentials (check the credential setup), incorrect Google Sheets URLs (double-check document links and permissions), and scheduling conflicts (make sure the schedule does not overlap with other workflows).
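For reference, here is a minimal sketch of the kind of Search Analytics request the HTTP Request node sends; the property URL, date range, and row limit are illustrative placeholders, not values taken from this template, and n8n's OAuth2 credential normally supplies the token.

```typescript
// Hedged sketch: query the Search Console Search Analytics API for keyword/page/date rows.
// siteUrl and accessToken are placeholders; in the workflow, the HTTP Request node
// uses the configured OAuth2 credentials and the "Set your domain" node supplies the property.
const siteUrl = "sc-domain:example.com";     // assumption: domain-property format
const accessToken = "<oauth2-access-token>"; // handled by n8n credentials in practice

async function fetchSearchAnalytics() {
  const body = {
    startDate: "2024-01-01",               // adjust the date range here
    endDate: "2024-01-31",
    dimensions: ["query", "page", "date"], // keyword, page, and date data
    rowLimit: 1000,
  };
  const res = await fetch(
    `https://searchconsole.googleapis.com/webmasters/v3/sites/${encodeURIComponent(siteUrl)}/searchAnalytics/query`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify(body),
    }
  );
  const data = await res.json();
  // Each row carries keys (dimension values), clicks, impressions, ctr, and position,
  // the same columns the workflow writes to Google Sheets.
  return data.rows ?? [];
}
```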
by KlickTipp
**Community node disclaimer:** This workflow uses KlickTipp community nodes.

**How it works**
- Enhanced Calendly integration: the workflow processes bookings and cancellations in Calendly, dynamically managing invitee and guest data in KlickTipp.
- Data transformation: dates and times are converted into formats (UNIX timestamps) compatible with KlickTipp's API, ensuring seamless data integration.

**Key features**
- *Calendly trigger:* captures new bookings or cancellations of events, including participant details.
- *Invitee and guest subscription in KlickTipp:*
  - Adds or updates invitees and guests in KlickTipp based on booking details (event name, time, join link, reschedule link, cancel link, etc.).
  - Tracks and processes cancellations for both invitees and guests.
  - Handles rescheduling intelligently to avoid redundant operations.
- *Guest-specific operations:*
  - Processes guests individually for bookings and cancellations using dynamic arrays of email addresses.
  - Recovers guest data from invitee records for cancellations, since Calendly does not provide guest data on cancellation.
- *Data processing:*
  - Standardizes and validates input fields.
  - Converts phone numbers to numeric-only format with international prefixes.
  - Transforms dates into UNIX timestamps (a sketch follows the Benefits section below).
  - Reads the invitee's name from either supported input layout (a single name field vs. separate first-name and last-name fields).
- *Error handling:* validates critical fields such as phone numbers, URLs, and dates to prevent incorrect data submissions.

**Setup instructions**

*Authentication:*
- Set up the Calendly and KlickTipp nodes in your n8n instance.
- Configure authentication for both the Calendly and KlickTipp nodes.

*Custom field preparation in KlickTipp:* create the following custom fields in KlickTipp to match the workflow's requirements:

| Field name | Field type |
|----------------------------------|--------------|
| Calendly_event_name | Line |
| Calendly_join_url | URL |
| Calendly_reschedule_url | URL |
| Calendly_cancel_url | URL |
| Calendly_event_start_datetime | Date & Time |
| Calendly_event_end_datetime | Date & Time |
| Calendly_invitee_start_date | Date |
| Calendly_invitee_end_date | Date |
| Calendly_invitee_start_time | Time |
| Calendly_invitee_end_time | Time |
| Calendly_invitee_timezone | Line |
| Calendly_invitee_guests_adresses | Line |

After creating the fields, allow 10-15 minutes for them to sync. If they don't appear, reconnect your KlickTipp credentials.

*Field mapping and adjustments:* open each KlickTipp node and map the fields to match your setup. The workflow includes placeholders for:
- Invitee details (first name, last name, email, and phone).
- Event details (start/end times, timezone, etc.).

**Workflow logic**
1. Trigger via Calendly event booking: a new event booking or cancellation from Calendly initiates the workflow.
2. Data transformation: processes the raw Calendly event data to ensure compatibility with KlickTipp's API.
3. Add to KlickTipp subscriber list: adds invitees and guests to the designated KlickTipp list, including event-specific details.

**Benefits**
- Efficient lead generation: contacts from event bookings are automatically imported into KlickTipp and can be used immediately, saving time and increasing the conversion rate.
- Automated processes: experts can start follow-up workflows directly, such as reminder emails or course admissions, reducing administrative effort.
- Error-free data management: the template ensures precise data mapping, avoids manual corrections, and reinforces a professional appearance.
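To illustrate the data transformation step described above, here is a minimal sketch of converting a Calendly ISO 8601 timestamp into the UNIX timestamp KlickTipp's date/time fields expect, and normalizing a phone number to digits with an international prefix. The helper names and the default country prefix are assumptions for illustration, not part of the template.

```typescript
// Hedged sketch of the transformations described above (assumed helper names).
function toUnixTimestamp(isoDate: string): number {
  // Calendly sends ISO 8601 strings, e.g. "2024-05-01T09:30:00Z";
  // KlickTipp date/time fields expect UNIX timestamps in seconds.
  return Math.floor(new Date(isoDate).getTime() / 1000);
}

function normalizePhone(raw: string, defaultPrefix = "49"): string {
  // Keep digits only, then make sure an international prefix is present.
  // The "49" default is an assumption for illustration; adjust it to your market.
  let digits = raw.replace(/\D/g, "");
  if (digits.startsWith("00")) digits = digits.slice(2);
  if (digits.startsWith("0")) digits = defaultPrefix + digits.slice(1);
  return digits;
}

// Example usage:
toUnixTimestamp("2024-05-01T09:30:00Z"); // => 1714555800
normalizePhone("0151 234-5678");         // => "491512345678"
```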
**Testing and deployment:** test the workflow by triggering a Calendly event and verifying that the data updates in KlickTipp.

**Notes**
- Customization: update the field mappings within the KlickTipp nodes to align with your account setup. This ensures accurate data syncing.
- Resources:
  - Calendly
  - KlickTipp Knowledge Base help article
  - Use KlickTipp Community Node in n8n
  - Automate Workflows: KlickTipp Integration in n8n
by Johan Denoyer
**How it works**
1. Extracts all company entries from Agile CRM.
2. Searches the French INSEE open-data database for each company name to extract its address and government ID (SIREN); a request sketch follows this entry.
3. Updates the Agile CRM entries with the data extracted from the INSEE open-data database.

The workflow also has a read-only flag to make sure an entry is not overwritten.

**Setup steps**
- Add your Agile CRM credentials.
- Add your INSEE OpenData credentials.
- Add two company custom fields in your Agile CRM (one for the SIREN data and one for the read-only flag).
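As an illustration only, here is a sketch of looking up a company by name in the INSEE Sirene directory and reading its SIREN. The endpoint path, query syntax, and response shape are assumptions based on the public Sirene V3 API and may differ from what the template's HTTP Request node actually calls; verify them against your INSEE credentials and API version.

```typescript
// Hedged sketch: look up a company by name in the INSEE Sirene directory and read its SIREN.
// Endpoint, query syntax, and response shape are assumptions; check your API version.
async function findSiren(companyName: string, token: string): Promise<string | undefined> {
  const url =
    "https://api.insee.fr/entreprises/sirene/V3/siren" +
    `?q=denominationUniteLegale:"${encodeURIComponent(companyName)}"&nombre=1`;
  const res = await fetch(url, { headers: { Authorization: `Bearer ${token}` } });
  if (!res.ok) return undefined;
  const data = await res.json();
  // Assumed response shape: { unitesLegales: [{ siren: "123456789", ... }] }
  return data.unitesLegales?.[0]?.siren;
}
```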
by Angel Menendez
**CallForge - AI Gong Transcript PreProcessor**

Transform your Gong.io call transcripts into structured, enriched, AI-ready data for better sales insights and analytics.

**Who is this for?**
This workflow is designed for:
- ✅ Sales teams looking to automate call transcript formatting.
- ✅ Revenue operations (RevOps) professionals optimizing AI-driven insights.
- ✅ Businesses using Gong.io that need structured, enriched call transcripts for better decision-making.

**What problem does this workflow solve?**
Manually processing raw Gong call transcripts is inefficient and often lacks the context AI-driven insights need. With CallForge, you can:
- ✔ Extract and format Gong call transcripts for structured AI processing.
- ✔ Enhance metadata using sales data from Salesforce.
- ✔ Classify speakers as internal (sales team) or external (customers).
- ✔ Identify external companies by filtering out free email domains (e.g., Gmail, Yahoo).
- ✔ Enrich customer profiles using PeopleDataLabs to identify company details and locations.
- ✔ Prepare transcripts for AI models by structuring conversations and removing unnecessary noise.

**What this workflow does**

1. Retrieves Gong call data
- Calls the Gong API to extract call metadata, speaker interactions, and collaboration details.
- Fetches call transcripts for AI processing.

2. Processes and cleans transcripts
- Converts call transcripts into structured, speaker-based dialogues.
- Labels each speaker as either internal (sales team) or external (customer).

3. Extracts company information
- Retrieves Salesforce data to match customers with existing sales opportunities.
- Filters out free email domains to determine the customer's actual company domain (a minimal sketch appears at the end of this entry).
- Calls the PeopleDataLabs API to retrieve additional company data and location details.

4. Merges and enriches data
- Combines Gong metadata with Salesforce customer details and insights.
- Ensures all the data needed for AI-driven sales insights is available.

5. Final formatting for AI processing
- Merges all call transcript data into a single structured format for AI analysis.
- Outputs the final cleaned, enriched dataset for further AI-powered insights.

**How to set up this workflow**

1. Connect your APIs
- 🔹 Gong API access: set up your Gong API credentials in n8n.
- 🔹 Salesforce setup: ensure API access if you want customer enrichment.
- 🔹 PeopleDataLabs API: required to retrieve company and location details based on email domains.
- 🔹 Webhook integration: modify the webhook call to push enriched call data to an internal system.

**CallForge workflow series**
- CallForge - 01 - Filter Gong Calls Synced to Salesforce by Opportunity Stage
- CallForge - 02 - Prep Gong Calls with Sheets & Notion for AI Summarization
- CallForge - 03 - Gong Transcript Processor and Salesforce Enricher
- CallForge - 04 - AI Workflow for Gong.io Sales Calls
- CallForge - 05 - Gong.io Call Analysis with Azure AI & CRM Sync
- CallForge - 06 - Automate Sales Insights with Gong.io, Notion & AI
- CallForge - 07 - AI Marketing Data Processing with Gong & Notion
- CallForge - 08 - AI Product Insights from Sales Calls with Notion

**How to customize this workflow**
- 💡 Modify data sources: connect a different CRM (e.g., HubSpot, Zoho) instead of Salesforce.
- 💡 Expand AI analysis: add another AI model (e.g., OpenAI GPT, Claude) for advanced conversation insights.
- 💡 Change speaker classification rules: adjust the internal vs. external speaker logic to match your team's structure.
- 💡 Filter specific customers: modify the free email filtering logic to better fit your company's needs.
**Why use CallForge?**
- 🚀 Automate Gong call transcript processing to save time.
- 📊 Improve AI accuracy with enriched, structured data.
- 🛠 Enhance sales strategy by extracting actionable insights from calls.

Start optimizing your Gong transcript analysis today!
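To make the free-email filtering step concrete, here is a minimal sketch of how a Code node might pick the customer's company domain from the external participants' email addresses. The function name and the domain list are illustrative, not taken from the template.

```typescript
// Hedged sketch of the free-email filtering step (illustrative domain list).
const FREE_EMAIL_DOMAINS = new Set([
  "gmail.com", "yahoo.com", "outlook.com", "hotmail.com", "icloud.com",
]);

function customerDomain(externalEmails: string[]): string | undefined {
  // Take the first external participant whose email is not on a free provider;
  // that domain is treated as the customer's company domain for enrichment.
  for (const email of externalEmails) {
    const domain = email.split("@")[1]?.toLowerCase();
    if (domain && !FREE_EMAIL_DOMAINS.has(domain)) return domain;
  }
  return undefined; // all participants used free email providers
}

// Example usage:
customerDomain(["jane@gmail.com", "tom@acme-corp.com"]); // => "acme-corp.com"
```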
by Joseph LePage
**Compare Local Ollama Vision Models for Image Analysis using Google Docs**

Process images with locally hosted Ollama vision models to extract detailed descriptions, contextual insights, and structured data, and save the results directly to Google Docs for efficient collaboration.

**Who is this for?**
This workflow is ideal for developers, data analysts, marketers, and AI enthusiasts who need to process and analyze images using locally hosted Ollama vision language models. It is particularly useful for tasks requiring detailed image descriptions, contextual analysis, and structured data extraction.

**What problem is this workflow solving? / Use case**
The workflow extracts meaningful insights from images in exhaustive detail: identifying objects, analyzing spatial relationships, extracting textual elements, and providing contextual information. This is especially helpful for applications in real estate, marketing, engineering, and research.

**What this workflow does**
1. Downloads an image file from Google Drive.
2. Processes the image using multiple Ollama vision models (e.g., Granite3.2-Vision, Gemma3, Llama3.2-Vision); a request sketch follows this entry.
3. Generates detailed markdown descriptions of the image.
4. Saves the output to a Google Docs file for easy sharing and further analysis.

**Setup**
- Ensure you have access to a local instance of Ollama (https://ollama.com/).
- Pull the Ollama vision models.
- Configure your Google Drive and Google Docs credentials in n8n.
- Provide the image file ID from Google Drive in the designated node.
- Update the list of Ollama vision models.
- Test the workflow by clicking "Test Workflow" to trigger the process.

**How to customize this workflow to your needs**
- Replace the image source with another provider if needed (e.g., AWS S3 or Dropbox).
- Modify the prompts in the "General Image Prompt" node to suit specific analysis requirements.
- Add nodes for post-processing or for integrating results into other platforms such as Slack or HubSpot.

**Key features**
- Detailed image analysis: extracts comprehensive details about objects, spatial relationships, text elements, and contextual settings.
- Multi-model support: runs multiple vision models dynamically for comparison and optimal performance.
- Markdown output: formats results in markdown for easy readability and documentation.
- Google Drive integration: seamlessly downloads images and saves results to Google Docs.
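For reference, the sketch below shows a call to a local Ollama instance's /api/generate endpoint with a base64-encoded image, roughly what the workflow does for each vision model in the list. The model name, prompt, and image are placeholders, not the template's exact values.

```typescript
// Hedged sketch: ask a locally hosted Ollama vision model to describe an image.
// The model name and prompt are placeholders; the workflow loops over several
// vision models and feeds in the image downloaded from Google Drive.
async function describeImage(base64Image: string, model = "llama3.2-vision") {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      prompt: "Describe this image in exhaustive detail using markdown.",
      images: [base64Image], // base64 string without the data: prefix
      stream: false,
    }),
  });
  const data = await res.json();
  return data.response as string; // markdown description saved to Google Docs
}
```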
by lin@davoy.tech
This workflow template, "Chinese Translator via Line x OpenRouter (Text & Image)" is designed to provide seamless Chinese translation services directly within the LINE messaging platform. By integrating with OpenRouter.ai and advanced language models like Qwen, this workflow translates text or images containing Chinese characters into pinyin and English translations, making it an invaluable tool for language learners, travelers, and businesses operating in multilingual environments. This template is ideal for: Language Learners: Who want to practice Chinese by receiving instant translations of text or images. Travelers: Looking for quick translations of Chinese signs, menus, or documents while abroad. Educators: Teaching Chinese language courses and needing tools to assist students with translations. Businesses: Operating in multilingual markets and requiring efficient communication tools. Automation Enthusiasts: Seeking to build intelligent chatbots that can handle language translation tasks. What Problem Does This Workflow Solve? Translating Chinese text or images into English and pinyin can be challenging, especially for beginners or those without access to reliable translation tools. This workflow solves that problem by: Automatically detecting and translating text or images containing Chinese characters. Providing accurate translations in both pinyin and English for better comprehension. Supporting multiple input formats (text, images) to cater to diverse user needs. Sending replies directly to users via the LINE messaging platform , ensuring accessibility and ease of use. What This Workflow Does 1) Receive Messages via LINE Webhook The workflow is triggered when a user sends a message (text, image, or other types) to the LINE bot. 2) Display Loading Animation A loading animation is displayed to reassure the user that their request is being processed. 3) Route Input Types The workflow uses a Switch node to determine the type of input (text, image, or unsupported formats). If the input is text , it is sent to the OpenRouter.ai API for translation. If the input is an image , the workflow extracts the image content, converts it to base64, and sends it to the API for translation. Unsupported formats trigger a polite response indicating the limitation. 4) Translate Content Using OpenRouter.ai The workflow leverages Qwen models from OpenRouter.ai to generate translations: For text inputs, it provides Chinese characters , pinyin , and English translations . For images, it extracts and translates using the qwen-VL model which can take images 5) Reply with Translations The translated content is formatted and sent back to the user via the LINE Reply API. Setup Guide Pre-Requisites Access to the LINE Developers Console to configure your webhook and channel access token. An OpenRouter.ai account with credentials to access Qwen models. Basic knowledge of APIs, webhooks, and JSON formatting. Step-by-Step Setup 1) Configure the LINE Webhook: Go to the LINE Developers Console and set up a webhook to receive incoming messages. Copy the Webhook URL from the Line Webhook node and paste it into the LINE Console. Remove any "test" configurations when moving to production. 2) Set Up OpenRouter.ai: Create an account on OpenRouter.ai and obtain your API credentials. Connect your credentials to the OpenRouter nodes in the workflow. 3) Test the Workflow: Simulate sending text or images to the LINE bot to verify that translations are processed and replied correctly. 
**How to customize this workflow to your needs**
- Add more languages: extend the workflow to support additional languages by modifying the API calls.
- Enhance image processing: integrate more advanced OCR tools to improve text extraction from complex images.
- Customize responses: modify the reply format to include extra details, such as grammar explanations or cultural context.
- Expand use cases: adapt the workflow for specific industries, such as tourism or e-commerce, by tailoring the translations to relevant vocabulary.

**Why use this template?**
- Real-time translation: provides instant translations of text and images, improving user experience and accessibility.
- Multimodal support: handles both text and image inputs, catering to diverse user needs.
- Scalable: easy to integrate into existing systems or to scale for multiple users and workflows.
- Customizable: tailor the workflow to suit your specific audience or industry requirements.
by Carlos Contreras
**Introduction**
This workflow creates and attaches notes or comments to any record in your Odoo instance. It acts as a sub-workflow that a main workflow triggers to log messages or comments in a centralized manner. By leveraging the Odoo API, it ensures updates to records are handled efficiently and provides an organized way to document important information related to your business processes.

**Setup instructions**
1. Import the workflow: import the provided JSON file into your n8n instance.
2. Odoo credentials: make sure you have valid Odoo API credentials (e.g., "Roodsys Odoo Automation Account") configured in n8n.
3. Node configuration: verify that the "Odoo" node (consider renaming it to "Odoo Record Manager" for clarity) is set up with your server details and authentication parameters, and that the "When Executed by Another Workflow" trigger is configured to receive input parameters from the parent workflow.
4. Execution trigger: this workflow is designed to be initiated by another workflow, so make sure the main workflow supplies the required inputs.

**Workflow details**
- Trigger node: the workflow begins with the "When Executed by Another Workflow" node, which accepts three inputs:
  - rec_id: a numeric identifier for the Odoo record.
  - message: the text of the comment or note.
  - model: the Odoo model (e.g., rs.deployment.action.log) the note should be attached to.
- Odoo node: the second node calls the Odoo API to create a new log message, mapping the inputs as follows (a sketch of this mapping appears at the end of this entry): message_type is set to "comment", model is assigned the provided model name, res_id is assigned the record ID (rec_id), and body is assigned the message content.
- Additional information: a sticky note node provides a brief overview of the workflow's purpose directly within the editor.

**Input parameters**
- Record ID (rec_id): the unique identifier of the record in Odoo where the note will be added.
- Message (message): the content of the comment or note to be logged.
- Model (model): the Odoo model name indicating the context in which the note should be created (e.g., rs.deployment.action.log).

**Usage examples**
- Internal logging: attach internal comments or logs to specific records, such as customer profiles, orders, or deployment logs.
- Audit trails: build a comprehensive audit trail by documenting changes or important events on Odoo records.
- Integration with other workflows: link this workflow with other n8n automations (email notifications, data synchronization, reporting) for seamless integration across your systems.

**Pre-conditions**
- The Odoo instance must be accessible and correctly configured.
- API permissions and user roles should be validated to ensure the workflow has the necessary access rights.
- The workflow expects inputs from an external trigger or parent workflow.

**Customization & integration**
- Field customization: modify or add fields to match your logging or commenting requirements.
- Node renaming: rename nodes for better clarity and consistency within your workflow ecosystem.
- Integration possibilities: integrate this workflow with other n8n processes, such as triggering notifications or synchronizing data across systems.

This sub-workflow receives data from a main workflow (for example, a record ID, a message, and the Odoo model) and creates a new note (or comment) in the corresponding Odoo record.
Essentially, it acts as a centralized point for logging comments or notes against a specific Odoo model, keeping the information organized and easy to track. The target model must include `_inherit = ['portal.mixin', 'mail.thread.main.attachment']` in its definition.
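For clarity, here is a minimal sketch of the field mapping the Odoo node performs, using the trigger inputs described above. The function name and interface are illustrative; the actual API call and authentication are handled by the n8n Odoo node and its credentials.

```typescript
// Hedged sketch of the mapping from the trigger inputs to the note created in Odoo.
interface NoteInput {
  rec_id: number;  // Odoo record ID
  message: string; // note/comment body
  model: string;   // e.g. "rs.deployment.action.log"
}

function buildOdooMessage(input: NoteInput) {
  // Values written by the Odoo node, exactly as described in the workflow details.
  return {
    message_type: "comment",
    model: input.model,
    res_id: input.rec_id,
    body: input.message,
  };
}

// Example usage:
buildOdooMessage({ rec_id: 42, model: "rs.deployment.action.log", message: "Deployment finished." });
```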
by Miquel Colomer
This n8n workflow template automates finding a person's LinkedIn profile from their name and company. It scrapes Google search results via Bright Data, parses the results with GPT-4o-mini, and delivers a personalized follow-up email with insights and suggested outreach steps.

**🚀 What it does**
- Accepts a user-submitted form with a person's full name and company.
- Performs a Google search via Bright Data to find LinkedIn profiles and company data.
- Uses GPT-4o-mini to parse the HTML results and identify matching profiles.
- Filters and selects the most relevant LinkedIn entry.
- Analyzes the data to generate a buyer persona and follow-up strategy.
- Sends a styled email with insights and outreach steps.

**🛠️ Step-by-step setup**
1. Deploy the form trigger to accept person data (name, position, company).
2. Build a Google search query from the user input (see the sketch at the end of this entry).
3. Scrape the search results using Bright Data.
4. Extract the HTML content using the HTML node.
5. Use GPT-4o-mini to parse LinkedIn entries and company insights.
6. Filter for matches based on the user input.
7. Merge the relevant data and generate personalized outreach content.
8. Send the email to a predefined address.
9. Show a final confirmation message to the user.

**🧠 How it works: workflow overview**
- Trigger: When User Completes Form
- Search: Edit Url LinkedIn, Get LinkedIn Entry on Google, Extract Body and Title, Parse Google Results
- Matching: Extract Parsed Results, Filter, Limit, IF LinkedIn Profile is Found?
- Fallback: Form Not Found if no match
- Company lookup: Edit Company Search, Get Company on Google, Parse Results, Split Out
- Content generation: Merge, Create a Followup for Company and Person
- Email delivery: Send Email, Form Email Sent

**📨 Final output**
An HTML-styled email (using Tailwind CSS) with:
- The matched LinkedIn profile
- Company insights
- A persona-based outreach strategy

**🔐 Credentials used**
- Bright Data account for scraping Google search results
- OpenAI account for GPT-4o-mini-powered parsing and content generation
- SMTP account for sending follow-up emails

**❓ Questions?**
Template and node created by Miquel Colomer and n8nhackers. Need help customizing or deploying? Contact us for consulting and support.
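To show what "build a Google search query from user input" can look like, here is a minimal sketch. The exact query format used by the template's "Edit Url LinkedIn" node may differ; this is one common pattern, and the function name is illustrative.

```typescript
// Hedged sketch: build the Google search URL from the submitted form fields.
function buildSearchUrl(fullName: string, company: string): string {
  const query = `site:linkedin.com/in "${fullName}" "${company}"`;
  return `https://www.google.com/search?q=${encodeURIComponent(query)}`;
}

// Example usage; the resulting URL is what gets fetched through Bright Data:
buildSearchUrl("Jane Doe", "Acme Corp");
// => "https://www.google.com/search?q=site%3Alinkedin.com%2Fin%20%22Jane%20Doe%22%20%22Acme%20Corp%22"
```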
by Jason Guest
Automatically deploy n8n workflows by simply dropping JSON files into a Google Drive folder. This template watches for new exports, cleans and imports them into your n8n instance, applies a tag, and archives the processed files.

**Who is this template for?**
This template is designed for n8n power users and automation specialists who need a simple, reliable way to bulk-deploy or version-control n8n workflows via Google Drive. It's a good fit if you:
- Manage multiple n8n instances (staging, production, etc.)
- Want an easy "drop-in" approach to publish new or updated workflows
- Prefer storing/exporting JSON in Drive rather than editing in the UI

**Use case**
Manually importing .json exports into n8n is slow and error-prone. With this template you can:
- Keep your workflows in a shared Drive folder (version-control friendly)
- Automatically sanitize each file so only supported settings go through
- Tag deployed workflows consistently for easy filtering
- Move processed files to a "Deployed" folder for clear change tracking

**How it works**
1. Watch the "ToDeploy" folder in Google Drive for new .json files.
2. Download and parse each file into a JSON object.
3. Clean the payload: strip out everything except the allowed executionOrder setting (and timezone if you choose); a sketch of this step follows this entry.
4. POST the cleaned workflow to your n8n instance via /api/v1/workflows.
5. PUT a predefined tag onto the newly created workflow.
6. Move the file to your "Deployed" folder when the import succeeds, or capture the workflow name and error if it fails.

**Setup instructions**
1. In Google Drive, create a ToDeploy folder and a Deployed folder:
   - Point "Google Drive Trigger - ToDeploy folder" at your ToDeploy folder.
   - Point "Move JSON file to Deployed folder" at your Deployed folder.
2. Create an n8n API key:
   - Go to Settings > n8n API.
   - Select Create an API key.
   - Copy the API key.
3. In the "Get Existing Workflow Tags" node, create the n8n API credential:
   - Authentication: Predefined Credential Type
   - Credential Type: n8n API
   - Create a new credential: paste in the API key and set the base URL to https://SUBDOMAIN.YOURDOMAINNAME.com/api/v1/
4. Add the n8n API credential to the "Create n8n Workflow" and "Set Workflow Tag" nodes.
5. Add your n8n instance URL to the N8N_Instance_URL variable in the "Set n8n URL variable" node.
6. Run the "1. Get Workflow Tags" flow and copy the ID of your chosen tag.
7. In the "Set n8n API URL & Tag ID variables" node, add the workflow tag ID to the N8N_Instance_Tag variable and your n8n instance URL to the N8N_Instance_URL variable.
8. Set the workflow to Active.

**How to adjust it to your needs**
- Use different tags: run Get Existing Workflow Tags on start-up to refresh available tags, or hard-code multiple tags in the Set Workflow Tag node.
- Add notifications: connect the error branch to Slack or Email nodes so you get alerted if an import fails.
- Swap Drive for another storage: replace the Google Drive nodes with Dropbox, S3, or GitHub triggers if you prefer a different source for your JSON files.
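Here is a minimal sketch of the clean-and-create steps (steps 3 and 4 above), assuming only executionOrder (and optionally timezone) survive the cleaning and that the n8n public API accepts name, nodes, connections, and settings. The function name and parameters are illustrative; the template performs these steps with Code and HTTP Request nodes using the credentials you configured.

```typescript
// Hedged sketch of the "clean payload" + create steps against the n8n public API.
async function deployWorkflow(raw: any, instanceUrl: string, apiKey: string) {
  // Keep only the settings named in the description above.
  const allowedSettings: Record<string, unknown> = {};
  if (raw.settings?.executionOrder) allowedSettings.executionOrder = raw.settings.executionOrder;
  // if (raw.settings?.timezone) allowedSettings.timezone = raw.settings.timezone; // optional

  const payload = {
    name: raw.name,
    nodes: raw.nodes,
    connections: raw.connections,
    settings: allowedSettings,
  };

  const res = await fetch(`${instanceUrl}/api/v1/workflows`, {
    method: "POST",
    headers: { "X-N8N-API-KEY": apiKey, "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  return res.json(); // the created workflow, including the new ID used for tagging
}
```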
by n8n Team
This workflow sends a message to a Discord channel when a new row is added or an existing row is updated in a Google Sheet. The message includes all data rows in the Google Sheet.

**Prerequisites**
- Discord account and Discord credentials.
- Google account and Google credentials.

**How it works**
A Code node turns the retrieved Google Sheet data into a custom message (see the sketch after this entry), which is then sent to the Discord channel specified in the Discord node.

**Setup**
This workflow requires that you set up a Discord webhook and have an existing Google Sheet with data. See how to set up a Discord webhook here.
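As an illustration of the Code node step, the sketch below turns sheet rows into a single Discord-ready message string. The column names and function name are placeholders, not values from this template.

```typescript
// Hedged sketch of the Code node step: format Google Sheet rows as one Discord message.
type SheetRow = Record<string, string | number>;

function buildDiscordMessage(rows: SheetRow[]): string {
  const lines = rows.map((row, i) =>
    `${i + 1}. ` + Object.entries(row).map(([col, val]) => `${col}: ${val}`).join(" | ")
  );
  // Discord messages are capped at 2000 characters, so trim if the sheet is large.
  return ["**Sheet update**", ...lines].join("\n").slice(0, 2000);
}

// Example usage:
buildDiscordMessage([{ Name: "Widget", Price: 19.99 }, { Name: "Gadget", Price: 4.5 }]);
```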
by ScrapeOps
**Amazon Product Price Tracker**

This workflow automatically monitors Amazon product prices, tracks price changes, and sends alerts when significant price fluctuations occur. Built on ScrapeOps' structured data API, it provides a reliable, maintenance-free price-tracking solution without anti-bot headaches or brittle selectors.

**What this workflow does**
- Monitors multiple Amazon products simultaneously using their ASINs
- Calculates both absolute and percentage price changes
- Sends customizable email alerts when prices cross defined thresholds
- Maintains a historical record of all price data for trend analysis
- Updates a Google Sheet with the latest price information

**Prerequisites**
- A ScrapeOps API key (register at https://scrapeops.io)
- A Google account for the Google Sheets integration
- SMTP email configuration for alerts

**Setup instructions**

*Spreadsheet setup*
1. Make a copy of the template spreadsheet: https://docs.google.com/spreadsheets/d/1hRv-TBXrpN6rkIU65WorttNHt-IPWas_An0sF4Of39U
2. Add your Amazon product ASINs in the "Products to Monitor" sheet.
3. Set your desired alert thresholds for price increases/decreases.

*Workflow configuration*
1. Add your ScrapeOps API key to the "Setup" node.
2. Update the spreadsheet URL in the "Setup" node with your copy.
3. Configure your email settings for notifications.
4. Adjust the schedule frequency as needed (default: hourly).

**How it works**
The workflow reads product ASINs from your Google Sheet, fetches current pricing data via ScrapeOps' Amazon Product API, calculates the price changes (see the sketch after this entry), updates your spreadsheet, and sends alerts when price movements exceed your defined thresholds. Unlike traditional web scrapers that break when websites change, this solution uses ScrapeOps' API, which handles the complexity of Amazon data extraction and delivers consistent results without maintenance.

**Additional notes**
- This workflow is ideal for deal hunters, price comparison services, and e-commerce analytics.
- The alerting system can be extended to additional channels such as Slack or Telegram.
- ScrapeOps handles anti-bot measures, proxy management, and parsing complexities.
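For reference, here is a minimal sketch of the price-change math and threshold check described above. The function and parameter names are illustrative; the threshold values come from your "Products to Monitor" sheet.

```typescript
// Hedged sketch of the absolute/percentage change calculation and alert decision.
function evaluatePriceChange(previous: number, current: number, alertThresholdPct: number) {
  const absoluteChange = current - previous;
  const percentChange = previous === 0 ? 0 : (absoluteChange / previous) * 100;
  return {
    absoluteChange,
    percentChange,
    shouldAlert: Math.abs(percentChange) >= alertThresholdPct, // triggers the email alert branch
  };
}

// Example usage: a drop from $49.99 to $44.49 is roughly -11%, which trips a 10% threshold.
evaluatePriceChange(49.99, 44.49, 10); // => { absoluteChange: -5.5, percentChange: ~-11, shouldAlert: true }
```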
by Akhil Varma Gadiraju
**n8n Workflow: Sync Workflows with GitLab**

**How it works**
This workflow keeps your self-hosted n8n workflows version-controlled in a GitLab repository. It compares each current workflow from n8n with its stored counterpart in GitLab and, if any differences are detected, updates the GitLab file with the latest version.

Core logic:
1. Retrieve workflows: fetch all workflows from the n8n REST API.
2. Compare with GitLab: for each workflow, fetch the corresponding file from GitLab and compare the JSON.
3. Update if changed: if differences exist, commit the updated workflow to GitLab via its API (see the sketch at the end of this entry).

**Setup**
Before using the workflow, ensure the following:

*Prerequisites*
- n8n: a self-hosted instance with access to the /rest/workflows API.
- GitLab: a repository where workflows will be stored, and a personal access token (PAT) with api and write_repository permissions.
- n8n nodes required: HTTP Request (to call the n8n and GitLab APIs), Code or Function nodes (for diffing and formatting), and looping (SplitInBatches or similar).

*Configuration*
Set environment variables or workflow credentials for:
- GITLAB_TOKEN
- GITLAB_REPO
- GITLAB_BRANCH (e.g., main)
- GITLAB_FILE_PATH_PREFIX (e.g., n8n-workflows/)

**How to use**
1. Import the workflow into your n8n instance.
2. Configure the GitLab API credentials: set the GitLab PAT as a header in the HTTP Request node: Private-Token: {{ $env.GITLAB_TOKEN }}
3. Map workflows to GitLab paths: use the workflow name or ID to build the file path, for example n8n-workflows/workflow-name.json
4. Trigger the workflow: run it manually or schedule it to run at intervals (e.g., daily).
5. Review the commits in GitLab: each updated workflow is committed with a message like "Update workflow: Sample Workflow".

**Disclaimer**
- This workflow does not handle merge conflicts or manual edits made directly in GitLab. Coordinate carefully if multiple sources modify workflows.
- Only structural changes are tracked. Non-functional metadata (such as timestamps or IDs) may trigger false positives unless filtered.
- Use at your own risk and test in a safe environment before applying it to production workflows.
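Here is a minimal sketch of the compare-and-commit step using GitLab's repository files API, under the assumptions that files are stored under the configured prefix and that the project is addressed by its URL-encoded path or numeric ID. The function name and control flow are illustrative; the template performs this with HTTP Request and Code nodes.

```typescript
// Hedged sketch of the "update if changed" step against GitLab's repository files API.
// Environment variable names match the configuration section above.
async function commitIfChanged(workflowName: string, workflowJson: object, storedJson: string | null) {
  const newContent = JSON.stringify(workflowJson, null, 2);
  if (storedJson !== null && storedJson === newContent) return "unchanged";

  const filePath = encodeURIComponent(
    `${process.env.GITLAB_FILE_PATH_PREFIX}${workflowName}.json`
  );
  const url = `https://gitlab.com/api/v4/projects/${encodeURIComponent(process.env.GITLAB_REPO!)}/repository/files/${filePath}`;

  const res = await fetch(url, {
    method: "PUT", // PUT updates an existing file; use POST to create it the first time
    headers: { "PRIVATE-TOKEN": process.env.GITLAB_TOKEN!, "Content-Type": "application/json" },
    body: JSON.stringify({
      branch: process.env.GITLAB_BRANCH ?? "main",
      content: newContent,
      commit_message: `Update workflow: ${workflowName}`,
    }),
  });
  return res.ok ? "updated" : `error ${res.status}`;
}
```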