by Hugo
This workflow provides a robust solution for automatically backing up all your n8n workflows to a designated GitHub repository on a daily basis. By leveraging the n8n API and GitHub API, it ensures your workflows are version-controlled and securely stored, safeguarding against data loss and facilitating disaster recovery.

**How it works**

The automation follows these key steps:

1. **Scheduled trigger**: The workflow is initiated automatically every day at a pre-configured time.
2. **List existing backups**: It first connects to your GitHub repository to retrieve a list of already backed-up workflow files. This helps in determining whether a workflow's backup file needs to be created or updated.
3. **Retrieve n8n workflows**: The workflow then fetches all current workflows directly from your n8n instance using the n8n REST API.
4. **Process and prepare**: Each retrieved workflow is individually processed. Its data is converted into JSON format, and the JSON content is then encoded to base64, a format suitable for GitHub API file operations.
5. **Commit to GitHub**: For each n8n workflow:
   - A standardized filename is generated (e.g., workflow-name-tag.json).
   - The workflow checks if a file with this name already exists in the GitHub repository (based on the list fetched in step 2).
   - If the file exists, it updates the existing file with the latest version of the workflow. If it's a new workflow (the file doesn't exist), a new file is created in the repository.
   - Each commit is timestamped for clarity.

This process ensures that you always have an up-to-date version of all your n8n workflows stored securely in your GitHub version control system, providing peace of mind and a reliable backup history.

**Prerequisites**

Before you can use this template, please ensure you have the following:

- An active n8n instance (self-hosted or cloud).
- A GitHub account.
- A GitHub repository created where you want to store the workflow backups.
- A GitHub Personal Access Token with repo scope (or a fine-grained token with read/write access to the specific backup repository). This token will be used for GitHub API authentication.
- n8n API credentials (API key) for your n8n instance.

**Set up steps**

Setting up this workflow should take approximately 10-15 minutes if you have your credentials ready.

1. **Import the template**: Import this workflow into your n8n instance.
2. **Configure n8n API credentials**: Locate the "Retrieve workflows" node. In the "Credentials" section for "n8n API", create new credentials (or select existing ones). Enter your n8n instance URL and your n8n API key (you can create an API key in the settings of your n8n instance).
3. **Configure GitHub credentials**: Locate the "List files from repo" node (and the "Update file" / "Upload file" nodes, which use the same credential). In the "Credentials" section for "GitHub API", create new credentials. Select the OAuth2/Personal Access Token authentication method and enter the GitHub Personal Access Token you generated as per the prerequisites.
4. **Specify repository details**: In the "List files from repo", "Update file", and "Upload file" GitHub nodes:
   - Set the Owner: your GitHub username or organization name.
   - Set the Repository: the name of your GitHub repository dedicated to backups.
   - Set the Branch (e.g., main or master) where backups should be stored.
   - (Optional) Specify a Path within the repository if you want backups in a specific folder (e.g., n8n_backups/). Leave blank to store in the root.
5. **Adjust schedule (optional)**: Select the "Schedule Trigger" node and modify the trigger interval (e.g., change the time of day or frequency) as needed. By default, it's set for a daily run.
6. **Activate the workflow**: Save and activate the workflow.

**Explanation of nodes**

Here's a detailed breakdown of each node used in this workflow:

- **Schedule trigger** (Type: n8n-nodes-base.scheduleTrigger): Automatically starts the workflow based on a defined schedule (e.g., daily at midnight).
- **List files from repo** (Type: n8n-nodes-base.github): Connects to your specified GitHub repository and lists all files, primarily to check for existing workflow backups.
- **Aggregate** (Type: n8n-nodes-base.aggregate): Consolidates the list of file names obtained from the "List files from repo" node into a single item for easier lookup later in the "Check if file exists" node.
- **Retrieve workflows** (Type: n8n-nodes-base.n8n): Uses the n8n API to fetch a list of all workflows currently present in your n8n instance.
- **Json file** (Type: n8n-nodes-base.convertToFile): Takes the data of each workflow (retrieved by the "Retrieve workflows" node) and converts it into a structured JSON file.
- **To base64** (Type: n8n-nodes-base.extractFromFile): Converts the binary content of the JSON file (from the "Json file" node) into a base64-encoded string, as required by the GitHub API for file content.
- **Commit date & file name** (Type: n8n-nodes-base.set): Prepares metadata for the GitHub commit. It generates commitDate (the current date and time for the commit message) and fileName (a standardized file name for the workflow backup, e.g., my-workflow-vps-backups.json, typically using the workflow's name and its first tag).
- **Check if file exists** (Type: n8n-nodes-base.if): A conditional node. It checks whether the fileName (generated by "Commit date & file name") is present in the list of files aggregated by the "Aggregate" node. This determines if the workflow backup already exists in GitHub.
- **Update file** (Type: n8n-nodes-base.github): If the "Check if file exists" node determines the file does exist, this node updates that existing file in your GitHub repository with the latest workflow content (base64 encoded) and a commit message.
- **Upload file** (Type: n8n-nodes-base.github): If the "Check if file exists" node determines the file does not exist, this node creates and uploads a new file to your GitHub repository with the workflow content and a commit message.

**Customization**

Here are a few ways you can customize this template to better fit your needs:

- **Backup path**: In the GitHub nodes ("List files from repo", "Update file", "Upload file"), you can specify a Path parameter to store backups in a specific folder within your repository (e.g., workflows/ or daily_backups/).
- **Filename convention**: Modify the "Commit date & file name" node (specifically the expression for fileName) to change how backup files are named (see the sketch below). For example, you might want to include the workflow ID or a different date format.
- **Commit messages**: Customize the commit messages in the "Update file" and "Upload file" GitHub nodes to include more specific information if needed.
- **Error handling**: Consider adding error handling branches (e.g., using the "Error Trigger" node or checking for node execution failures) to notify you if a backup fails for any reason.
- **Filtering workflows**: If you only want to back up specific workflows (e.g., those with a particular tag or name pattern), you can add a "Filter" node after "Retrieve workflows" to include only the desired workflows in the backup process.
- **Backup frequency**: Adjust the "Schedule Trigger" node to change how often the backup runs (e.g., hourly, weekly, or on specific days).

Template was created in n8n v1.92.2
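For readers who want to see the "Process and prepare" and filename steps concretely, here is a minimal sketch of the same logic as a single n8n Code node. The `slugify` helper, the exact filename pattern, and the output field names are illustrative assumptions; the template itself builds these values with Set-node expressions rather than this exact code.

```javascript
// A sketch of the "Process and prepare" + "Commit date & file name" steps,
// written for an n8n Code node (Run Once for All Items).
// Assumptions: workflow objects come from the n8n API with `name` and `tags`
// fields; the slugify helper and output field names are illustrative.
function slugify(text) {
  return String(text)
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9]+/g, '-') // collapse anything non-alphanumeric into a dash
    .replace(/^-+|-+$/g, '');    // trim leading/trailing dashes
}

return $input.all().map((item) => {
  const workflow = item.json;
  const firstTag = workflow.tags?.[0]?.name ?? 'untagged';

  return {
    json: {
      fileName: `${slugify(workflow.name)}-${slugify(firstTag)}.json`,
      commitDate: new Date().toISOString(),
      // The GitHub contents API expects file content encoded as base64
      contentBase64: Buffer.from(JSON.stringify(workflow, null, 2)).toString('base64'),
    },
  };
});
```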
by Krishna Kumar Eswaran
**🧠 Problem This Solves**

Managing credit card expenses can be tricky, especially when you want to stay transparent and keep your spouse in the loop. Most banks don't offer real-time notification sharing with family members, and manually updating expenses takes time and effort. This n8n workflow automates the entire process: tracking your HDFC credit card usage, logging it in Google Sheets, and sending an instant Telegram notification to your spouse.

**👥 Who This Template Is For**

- Couples who want shared visibility of credit card spending
- Individuals looking for automated personal finance tracking
- Anyone using an HDFC Credit Card with email alerts enabled
- n8n users who want to integrate Gmail, Google Sheets, and Telegram

**⚙️ Workflow Breakdown**

Here's how the automation works:

1. Gmail Trigger: Monitors your Gmail inbox for credit card transaction alerts from HDFC Bank.
2. Email Parser: Extracts transaction details like amount, merchant name, date, and card type.
3. Google Sheets Node: Logs the parsed transaction data into a structured Google Sheet for record-keeping.
4. Telegram Node: Sends a message to your spouse's Telegram account with transaction details for instant notification.

**Step-by-Step Setup Instructions**

Prerequisites:

- An HDFC Credit Card with email alerts enabled
- A Gmail account connected to n8n
- A Google Sheet created with columns like Date, Amount, Merchant, Card, etc.
- A Telegram Bot and your spouse's Telegram Chat ID

1. Set up the Gmail Trigger: Use the Gmail Trigger node to monitor incoming emails from alerts@hdfcbank.net or similar. Filter emails with a subject line containing keywords like "Credit Card Transaction Alert".
2. Extract the email content: Use the HTML Extract or Regex node to parse out the transaction amount, merchant name, date, and card number from the email body (see the sketch below).
3. Log to Google Sheets: Connect your Google Sheets account in n8n and use the Append Row operation to add each transaction as a new row in your finance sheet.
4. Send the Telegram message: Set up a Telegram Bot, get the Chat ID of your spouse's Telegram account, format a message like "💳 HDFC Transaction Alert: ₹5,000 at Amazon on 17 May via XXXX1234", and send it via the Telegram node.

**🛠️ Customization Tips**

- 💡 Add Spending Limits: Add a condition node to alert only if the transaction exceeds a certain amount.
- 🧾 Category Mapping: Use additional logic to classify expenses (e.g., Shopping, Dining) based on keywords.
- 📊 Weekly Summary: Create another workflow that sends a weekly Telegram summary using data from Google Sheets.
- 🔐 Security Tip: Mask part of the card number before sending the Telegram message for added security.
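As an illustration of the extraction step, here is a hedged Code-node sketch. The alert wording, regex, and field names are assumptions; HDFC's actual email format may differ, so adapt the regular expression to the alerts you actually receive.

```javascript
// Illustrative parsing of an HDFC transaction alert (n8n Code node,
// Run Once for All Items). The sample sentence below is an assumption;
// adjust the regex to the real alert text.
const body = $input.first().json.text ?? '';

// Assumed format: "Rs.5,000.00 spent on HDFC Bank Card XX1234 at AMAZON on 17-05-2025"
const match = body.match(
  /Rs\.?\s*([\d,]+(?:\.\d+)?)\s+spent on .*?Card\s+(\w*\d{4})\s+at\s+(.+?)\s+on\s+([\d-]+)/i
);

if (!match) {
  return []; // not a transaction alert, nothing to log
}

return [{
  json: {
    amount: parseFloat(match[1].replace(/,/g, '')), // "5,000.00" -> 5000
    card: match[2],
    merchant: match[3],
    date: match[4],
  },
}];
```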
by Airtop
**Scoring LinkedIn Profiles Against Your ICP**

**Use Case**

This automation scores individual LinkedIn profiles against your Ideal Customer Profile (ICP) based on interest in AI, technical depth, and seniority level. It's ideal for prioritizing leads and understanding how well a person fits your ICP criteria.

**What This Automation Does**

Given a LinkedIn profile and an Airtop profile, it:

- Extracts relevant data from the person's profile
- Determines levels of AI interest, seniority, and technical depth
- Calculates an ICP score based on weighted criteria
- Returns the full enriched profile with the score

Input parameters:

- **LinkedIn Profile URL** (e.g., https://linkedin.com/in/janedoe)
- **Airtop Profile** connected to LinkedIn
- **ICP scoring method** in the Airtop node prompt

Output fields in JSON format: full name, job title, employer, company LinkedIn URL, location, number of connections and followers, about section content, and more, plus the calculated ICP score (out of 100).

**How It Works**

1. Form Trigger or Workflow Trigger: Accepts input from either a form or another workflow.
2. Parameter Assignment: Ensures proper variable names for downstream nodes.
3. Airtop Enrichment Tool: Extracts and scores the person based on a detailed prompt.
4. Scoring: Uses this point system (see the sketch below):
   - AI Interest: beginner (5), intermediate (10), advanced (25), expert (35)
   - Technical Depth: basic (5), intermediate (15), advanced (25), expert (35)
   - Seniority Level: junior (5), mid-level (15), senior (25), executive (30)
5. Output Formatting: Cleans and returns the result as JSON.

**Setup Requirements**

- IMPORTANT: Enter your ICP scoring method in the prompt field of the Airtop node.
- An Airtop Profile connected to LinkedIn.
- Airtop API credentials configured in n8n.
- Optional: a front-end form to collect profile URLs and trigger the automation.

**Next Steps**

- **Embed in CRM**: Trigger this automation on new leads to auto-score them.
- **Batch Process Leads**: Run it over a list of profile URLs for segmentation.
- **Customize Scoring**: Adjust point weights based on your sales priorities.

Read more about Scoring LinkedIn Profiles Against Your ICP
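The point system above maps directly to a small lookup table. Here is a plain-JavaScript sketch; the input field names (`aiInterest`, `technicalDepth`, `seniority`) are assumptions about the enriched profile, not the Airtop node's exact output keys.

```javascript
// Point weights taken verbatim from the scoring description above.
const POINTS = {
  aiInterest:     { beginner: 5, intermediate: 10, advanced: 25, expert: 35 },
  technicalDepth: { basic: 5, intermediate: 15, advanced: 25, expert: 35 },
  seniority:      { junior: 5, 'mid-level': 15, senior: 25, executive: 30 },
};

function icpScore(profile) {
  return (
    (POINTS.aiInterest[profile.aiInterest] ?? 0) +
    (POINTS.technicalDepth[profile.technicalDepth] ?? 0) +
    (POINTS.seniority[profile.seniority] ?? 0)
  );
}

// Example: an advanced-AI, expert-depth executive scores 25 + 35 + 30 = 90 out of 100
console.log(icpScore({ aiInterest: 'advanced', technicalDepth: 'expert', seniority: 'executive' }));
```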
by Airtop
**Monitor X for Relevant Posts**

**Use Case**

This automation monitors X (formerly Twitter) search pages in real time and extracts high-signal posts that match your categories of interest. It's ideal for community engagement, lead discovery, thought leadership tracking, or competitive analysis.

**What This Automation Does**

Given a search URL and a list of categories, it:

- Logs into X using Airtop
- Opens the specified search URL
- Scrolls through the results
- Extracts up to 10 valid, English-language posts
- Filters and classifies each post by category (or marks it as [NA] if unrelated)
- Returns the structured results as JSON

Input parameters:

- **airtop_profile**: An Airtop browser profile authenticated on X
- **x_url**: The X search URL (e.g., https://x.com/search?q=ai agents&f=live)
- **relevant_categories**: A text-based list of categories to classify posts (e.g., "Web automation use cases", "Thought leadership")

Output: a JSON array of posts, each with writer, time, text, url, and category.

**How It Works**

1. Trigger: This workflow is triggered by another workflow (e.g., a community engagement pipeline).
2. Input Setup: Accepts the Airtop profile, search URL, and categories to use for classification.
3. Session: Starts a browser session using the Airtop profile.
4. Window Navigation: Opens the provided X search URL.
5. Extraction: Scrapes up to 10 posts with /status/ in the URL and text in English.
6. Classification: Each post is labeled with a category if relevant, or [NA] otherwise.
7. Filtering: Discards [NA] posts (see the sketch below).
8. Output: Returns the list of classified posts.

**Setup Requirements**

- An Airtop profile with an active X login.
- An Airtop API key connected in n8n.
- A list of category definitions to guide post classification (used in the prompt).

**Next Steps**

- **Feed into Engagement Workflows**: Pass the results to workflows that reply, retweet, or track posts.
- **Use in Slack Alerts**: Push classified posts into Slack channels for review and reaction.
- **Customize Classifier**: Refine the categorization logic to include sentiment or company mentions.

Read more about Monitoring X for Relevant Posts
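The filtering step is simple enough to show in full. A minimal sketch, assuming each post arrives as one n8n item carrying the output fields listed above:

```javascript
// n8n Code node (Run Once for All Items): keep only posts the classifier
// matched to a real category; drop anything marked [NA].
return $input
  .all()
  .filter((item) => item.json.category && item.json.category !== '[NA]')
  .map((item) => ({
    json: {
      writer: item.json.writer,
      time: item.json.time,
      text: item.json.text,
      url: item.json.url,
      category: item.json.category,
    },
  }));
```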
by Darien Kindlund
Do you consistently forget to set a Default Error Workflow when creating new workflows? Then this helper workflow is for you!

When activated, this helper workflow will:

- Scan ALL other workflows every 4 hours
- Make sure ALL workflows have a default error workflow set (based on the Workflow ID you provide)
- SKIP OVER any workflows that have the default_error:false tag set (make sure your default error workflow has the default_error:false tag set, so that you don't end up with recursive loops during errors)

**Setup Nodes**

- Once imported, edit the Set Vars node with your default_error_workflow_id value. If you want to change the default_error:false tag to some other tag name, you can do so here as well.
- Update the Set Default Error Workflow node with your PostgreSQL credentials to access the n8n database.
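To make the skip/update rule concrete, here is a sketch of the selection logic in plain JavaScript. The `settings.errorWorkflow` key mirrors how n8n stores the default error workflow in workflow settings, but treat the exact field names as assumptions and verify them against your n8n version (the template itself performs this via SQL against the n8n database).

```javascript
// Sketch: given workflows loaded from the n8n database/API, pick the ones
// that still need a default error workflow set.
const DEFAULT_ERROR_WORKFLOW_ID = 'your-workflow-id'; // from the Set Vars node
const SKIP_TAG = 'default_error:false';               // opt-out tag name

function needsErrorWorkflow(workflow) {
  const tags = (workflow.tags ?? []).map((t) => t.name);
  if (tags.includes(SKIP_TAG)) return false; // explicitly opted out (avoids recursion)
  return workflow.settings?.errorWorkflow !== DEFAULT_ERROR_WORKFLOW_ID;
}

return $input
  .all()
  .filter((item) => needsErrorWorkflow(item.json))
  .map((item) => ({ json: { id: item.json.id, name: item.json.name } }));
```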
by Akhil Varma Gadiraju
**📥 Gmail Attachment Backup to Google Drive — n8n Workflow**

This n8n workflow automatically backs up email attachments from a specific sender in Gmail to a designated folder in Google Drive. It polls Gmail every minute and uploads any new attachments from matching emails to the specified Google Drive folder with a timestamped filename.

**📌 Use Case**

Primary purpose: automatically archive and back up attachments from a specific sender (e.g., test@gmail.com) to Google Drive for safekeeping, audit, or processing.

Ideal for:

- Automating invoice/receipt collection from a vendor
- Archiving reports from a monitored email address
- Creating a searchable historical log of attachments for compliance

**🧭 Workflow Overview**

Here's how the workflow operates:

1. 🔔 Gmail Trigger: Polls Gmail every minute for new messages from a specific sender (test@gmail.com).
2. 📩 Gmail Get Message: Retrieves the full contents (including attachments) of the matched email.
3. 🧠 Code (JS): Iterates over all binary attachments in the email and restructures them as individual binary items so they can be uploaded separately (see the sketch below).
4. 📤 Google Drive: Uploads each attachment to a target Google Drive folder (Docs) with a timestamp and unique name.
5. 📍 Replace Me (NoOp): Placeholder node to indicate workflow completion. You can replace this with Slack notifications, logs, or alerts.

**🔧 How to Use**

Prerequisites:

- An n8n instance (self-hosted or cloud)
- A connected Gmail account with OAuth2 credentials
- A connected Google Drive account with OAuth2 credentials
- Permissions for n8n to access your Gmail and Google Drive

Setup instructions:

1. Import the workflow: Copy and paste the workflow JSON into your n8n editor.
2. Set up credentials: Ensure the following credentials exist and are authorized: Gmail (for the Gmail nodes) and Google Drive (for the Google Drive node).
3. Configure the folder: Update the folderId in the Google Drive node if you want to use a different target folder.
4. Activate the workflow: Enable the workflow in n8n. It will start polling Gmail every minute.

**✏️ How to Customize**

| Task | How to Customize |
|------|------------------|
| Change sender filter | Modify the sender field in the Gmail Trigger node |
| Adjust polling frequency | Change the pollTimes configuration in the trigger node |
| Change destination folder | Update folderId in the Google Drive node |
| Modify filename format | Edit the name expression in the Google Drive node |
| Add post-upload logic | Replace or extend the Replace Me node with notifications, logs, etc. |
| Process only specific attachments | Add logic in the Code node to filter by filename or MIME type |

**📂 Filename Format Example**

[MessageID]_[Timestamp]_backup_attachment

This naming convention ensures uniqueness and traceability back to the original message.

**✅ Future Improvements**

- **Email subject filtering** to narrow down the match
- **Slack/Email notifications** after upload
- **Deduplication check** to avoid reuploading the same files
- **Virus scan or file validation** before upload

**💬 Support**

For any issues using this workflow:

- Double-check your credential permissions
- Review n8n logs for Gmail or Google Drive errors
- Visit the n8n community forums
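A minimal sketch of the Code (JS) node from step 3, which fans each attachment out into its own item so the Google Drive node can upload them one at a time. The `attachment_0`-style binary keys match how the Gmail node typically names attachments, but verify against your trigger's actual output.

```javascript
// n8n Code node (Run Once for All Items): fan out every binary attachment
// into a separate item with a single "data" binary key, which the
// Google Drive upload node can consume directly.
const results = [];

for (const item of $input.all()) {
  for (const key of Object.keys(item.binary ?? {})) { // e.g. attachment_0, attachment_1
    results.push({
      json: {
        fileName: item.binary[key].fileName,
        messageId: item.json.id, // kept for the timestamped backup filename
      },
      binary: { data: item.binary[key] },
    });
  }
}

return results;
```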
by Lucía Maio Brioso
**🧑💼 Who is this for?**

This workflow is for anyone who receives too many emails and wants to stay informed without drowning in their inbox. If you're constantly checking your Gmail and wish you had someone summarizing messages and sending just the important parts to your phone, this is for you. Especially useful for solopreneurs, customer support, busy professionals, or newsletter addicts.

**🧠 What problem is this workflow solving?**

Email is powerful, but also overwhelming. Important info gets buried in threads, and staying on top of things can mean hours wasted scanning messages. This workflow turns that chaos into clarity: as soon as a new email arrives, you get a concise AI-generated summary in Telegram, straight to your pocket. No more checking Gmail constantly. No more missing key updates. Just a clean, human-style summary, written in the language you choose.

**⚙️ What this workflow does**

- Watches your Gmail inbox for new messages
- Prepares the content, including sender, subject, and message body (see the sketch below)
- Sends it to OpenAI to generate a friendly, casual summary
- Delivers that summary to your Telegram chat

All in seconds, completely automated.

**🛠️ Setup**

1. Connect your accounts: Gmail, Telegram, and OpenAI credentials must be added to the respective nodes.
2. Set your Telegram chat ID: use a bot like @userinfobot to get it.
3. Customize the language in the Set summary language node (default is English).
4. Activate the workflow and watch it go.

**🧩 How to customize this workflow to your needs**

You can make this workflow your own in a few easy ways:

- **Summarize only some emails**: Add a Filter node after the Gmail trigger (e.g., only messages from certain senders).
- **Change the tone or detail of summaries**: Tweak the system prompt in the Summary generation agent.
- **Use a different model**: Swap OpenAI's GPT-4o for another provider like Claude or DeepSeek.
- **Translate to your preferred language**: Just change "english" to "español", "français", etc.
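As a sketch of the "prepares the content" step, here is what a Code node building the summarization prompt might look like. The Gmail field names and the `summaryLanguage` value are assumptions about the workflow's internal data, not its exact expressions.

```javascript
// Flatten sender, subject, and body into one prompt string for the agent.
// Field names below are assumptions about the Gmail trigger's output.
const email = $input.first().json;

const prompt = [
  `Summarize this email in ${email.summaryLanguage ?? 'english'}, in a short, casual tone.`,
  `From: ${email.from ?? 'unknown sender'}`,
  `Subject: ${email.subject ?? '(no subject)'}`,
  '',
  email.text ?? email.snippet ?? '',
].join('\n');

return [{ json: { prompt } }];
```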
by Airtop
**Automating LinkedIn Company Data Extraction**

**Use Case**

This automation extracts detailed company insights from a LinkedIn company page, including identity, scale, classification, and funding data. It is ideal for investors, sales teams, and market researchers.

**What This Automation Does**

This automation accepts the following inputs:

- **Company's LinkedIn URL**: The public LinkedIn page URL of the company.
- **Airtop Profile (connected to LinkedIn)**: Your Airtop Profile authenticated on LinkedIn.

It then extracts and returns structured data with:

1. Company Identity: full name, tagline, headquarters location (city, state, country), about section, website.
2. Company Scale: current employee count; employee size bracket: [0-9], [10-150], [150+].
3. Business Classification: whether the company is an automation agency (true/false); AI implementation level: Low / Medium / High; technical sophistication: Basic / Intermediate / Advanced / Expert.
4. Funding Profile: most recent funding round, total amount raised, key investors, last funding update date.

**How It Works**

1. Creates an Airtop session using the provided profile.
2. Navigates to the company's LinkedIn page.
3. Executes an Airtop query to extract the data.
4. Outputs the result in a standardized JSON schema (an illustrative shape is sketched below).

**Setup Requirements**

- An Airtop API key
- A LinkedIn-authenticated Airtop Profile

**Next Steps**

- **Feed into CRM**: Enrich your accounts with detailed LinkedIn data.
- **Prioritize Leads**: Use classification and funding data to prioritize outreach.
- **Combine with People Data**: Integrate with individual-level enrichment for full context.

Read more about how to extract company data from Linkedin with Airtop and n8n
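For orientation, here is an illustrative example of the kind of JSON the automation returns. The field names and sample values are assumptions derived from the four categories above, not the exact Airtop schema.

```javascript
// Hypothetical example of the standardized output shape (field names assumed).
const exampleOutput = {
  identity: {
    name: 'Acme Robotics',
    tagline: 'Automation for everyone',
    headquarters: { city: 'Austin', state: 'Texas', country: 'United States' },
    about: '...',
    website: 'https://example.com',
  },
  scale: { employeeCount: 120, employeeBracket: '[10-150]' },
  classification: {
    isAutomationAgency: false,
    aiImplementationLevel: 'Medium',
    technicalSophistication: 'Advanced',
  },
  funding: {
    mostRecentRound: 'Series A',
    totalRaised: '$12M',
    keyInvestors: ['Example Ventures'],
    lastFundingUpdate: '2024-11-01',
  },
};

console.log(JSON.stringify(exampleOutput, null, 2));
```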
by Promovaweb
When you collect leads from a form, you need to format the incoming data, such as the lead's name, and also apply basic validation to the email entered. Luckily for us, n8n offers all of these features with simple expressions that can easily be applied to data (see the sketch below). This workflow aims to show how you can process your lead data before saving it in Mautic.

**How it Works**

This workflow:

1. Receives data from a WordPress form;
2. Applies name formatting and basic validation to the email;
3. Creates the contact in Mautic;
4. If the email is invalid, adds the lead to the Do Not Contact list.

**Setup Steps**

1. Set up credentials when you first open the workflow. You'll need a Mautic account.
2. Configure a form in WordPress (Elementor, WPForms, etc.) and point it to the n8n Webhook address.
3. Map the fields you need to apply formatting and validation.
4. After testing your workflow, swap the Test URL for the Production URL in your WordPress form settings and activate your workflow.
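A minimal sketch of the formatting and validation logic, in plain JavaScript (the same logic can live in n8n expressions or a Code node). The incoming field names (`name`, `email`) depend on how you mapped your form, so treat them as assumptions.

```javascript
// n8n Code node (Run Once for All Items): format the lead name and
// sanity-check the email before handing the data to Mautic.
const lead = $input.first().json;

// Title-case the name: "jOHN doe" -> "John Doe"
const formattedName = String(lead.name ?? '')
  .toLowerCase()
  .replace(/\b\w/g, (c) => c.toUpperCase());

// Basic email shape check; not full RFC validation, just a sanity filter
// that routes obviously bad addresses to the Do Not Contact branch.
const emailIsValid = /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(lead.email ?? '');

return [{ json: { ...lead, formattedName, emailIsValid } }];
```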
by Haqi Ramadhani
Automatically detect new n8n releases (stable or beta) from GitHub, update Coolify environment variables, and trigger deployments.

**Functionality**

This workflow automates deployment of n8n releases to a Coolify instance. It supports two tracks:

- Beta releases: Checks GitHub every minute for prereleases, filters duplicates, updates the N8N_VERSION environment variable, and deploys.
- Stable releases (disabled by default): Checks the latest stable release hourly and deploys.

Key features:

- **Deduplication**: Ensures no repeated deployments for the same release.
- **Version Parsing**: Extracts the semantic version (e.g., 1.34.0) from GitHub release names (see the sketch below).
- **Coolify Integration**: Updates environment variables and triggers deployments via API.

**Expected Outcomes**

- New n8n beta/stable releases detected via the GitHub API.
- The Coolify environment variable N8N_VERSION updated to the latest version.
- Automatic deployment triggered in Coolify.

**Setup Guide**

1. Replace placeholders: Update m8ccg8k44coogsk84swk8kgs in the Update ENV and Deploy nodes with your Coolify Application UUID.
2. Configure credentials: Add Coolify API credentials (httpHeaderAuth) with a valid API token in the headers.
3. Enable triggers: Toggle the Auto Update Latest Release node if stable releases are desired. Adjust schedule intervals as needed.
4. Test: Run the workflow manually to validate API connections and version parsing.

**SEO Keywords**

Automated Deployment, n8n CI/CD, Coolify Integration, GitHub Release Monitoring, Environment Variable Management, Beta Release Automation.
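Here is a sketch of the version-parsing step. The `name`, `tag_name`, and `prerelease` fields are real fields on the GitHub releases API response; the surrounding Code-node structure is illustrative rather than the template's exact implementation.

```javascript
// n8n Code node: pull a semantic version out of a GitHub release name
// such as "n8n@1.34.0" or "Release 1.34.0-beta.1".
const release = $input.first().json;

const match = String(release.name ?? release.tag_name ?? '')
  .match(/\d+\.\d+\.\d+(?:-[\w.]+)?/); // 1.34.0 or 1.34.0-beta.1

if (!match) {
  throw new Error(`Could not parse a version from release "${release.name}"`);
}

return [{
  json: {
    version: match[0],                          // becomes N8N_VERSION in Coolify
    prerelease: release.prerelease === true,    // beta vs stable track
  },
}];
```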
by InfraNodus
**Using knowledge graphs instead of RAG vector stores**

This workflow creates an AI chatbot agent that has access to several knowledge bases at the same time (used as "experts"). These knowledge bases are provided by InfraNodus GraphRAG, which uses knowledge graphs to deliver high-quality responses without the need to set up complex RAG vector store workflows.

The advantages of using GraphRAG instead of standard vector stores for knowledge are:

- Easy and quick to set up (no complex data import workflows needed)
- A knowledge graph has a holistic view of your knowledge base
- Better retrieval of relations between the document chunks = higher-quality responses

**How it works**

This template uses the n8n AI agent node as an orchestrating agent that decides which tool (knowledge graph) to use based on the user's prompt. Here's a step-by-step description:

1. The user submits a question using the AI chatbot (the n8n chat interface in this case, which can be accessed via a URL or embedded into any website).
2. The AI agent node checks the list of tools it has access to. Each tool has a description of its knowledge, auto-generated by InfraNodus.
3. The AI agent decides which tool should be used to generate a response. It may reformulate the user's query to be more suitable for the expert.
4. The query is then sent to the InfraNodus HTTP node endpoint, which queries the graph that corresponds to that expert.
5. Each InfraNodus GraphRAG expert provides a rich response that takes the whole context into account, along with a list of relevant statements retrieved using a combination of RAG and GraphRAG.
6. The n8n AI Agent node integrates the responses received from the experts to produce the final answer.
7. The final answer is sent back to the user's chat (or a webhook endpoint).

**How to use**

You need an InfraNodus GraphRAG API account and key to use this workflow:

1. Create an InfraNodus account.
2. Get the API key at https://infranodus.com/api-access and create a Bearer authorization key for the InfraNodus HTTP nodes.
3. Create a separate knowledge graph for each expert (using the PDF / content import options) in InfraNodus.
4. For each graph, go to the workflow and paste the name of the graph into the body name field (see the sketch below). Keep other settings intact or learn more about them at the InfraNodus access points page.
5. Once you add one or more graphs as experts to your flow, add the LLM key to the OpenAI node and launch the workflow.

**Requirements**

- An InfraNodus account and API key
- An OpenAI (or any other LLM) API key

**Customizing this workflow**

You can use this same workflow with a Telegram bot, so you can interact with it using Telegram. There are many more customizations available. Check out the complete guide at https://support.noduslabs.com/hc/en-us/articles/20174217658396-Using-InfraNodus-Knowledge-Graphs-as-Experts-for-AI-Chatbot-Agents-in-n8n

Also check out the video tutorial with a demo.
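For step 4 of the setup, a heavily hedged sketch of the per-expert HTTP call is shown below. Only the Bearer authorization and the graph name in the request body come from the description above; the endpoint path, remaining fields, and response shape are placeholders, so check the InfraNodus access points page for the real values.

```javascript
// Hedged sketch of one expert call, as the HTTP Request node would make it.
// The path and body fields other than "name" are hypothetical placeholders.
const response = await fetch('https://infranodus.com/api/v1/<endpoint>', { // hypothetical path
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.INFRANODUS_API_KEY}`, // Bearer key from the docs
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    name: 'my_expert_graph', // the graph name pasted into the body "name" field
    prompt: 'User question, possibly reformulated by the agent', // assumed field
  }),
});

const answer = await response.json();
console.log(answer);
```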
by Aitor | 1Node
This n8n workflow automates the generation and delivery of a daily order summary via email. It leverages an AI Agent to fetch and summarize e-commerce order data from the last 24 hours stored in Supabase, providing a concise overview of daily business operations.

**How it works**

- **Scheduled Trigger**: The workflow is triggered every day at 8 AM.
- **Sender Email Configuration**: A manual step allows you to set the sender's email address.
- **AI Agent**: An AI Agent node acts as the central intelligence, interacting with various tools to gather and process data.
- **Supabase Data Fetching**: The AI Agent calls the "Get Orders", "Get Order Items", "Get Clients", and "Get Products" tools to retrieve relevant e-commerce data from your Supabase database (see the sketch below).
- **OpenAI Chat Model**: An OpenAI Chat Model (GPT-4.1) is integrated to help the AI Agent understand and summarize the fetched data into a human-readable format.
- **Gmail Summary**: Finally, the workflow sends a summarized report to your specified email address using the "Send Gmail Summary" node.

**Set up steps**

This setup will take approximately 15-20 minutes.

1. Download the workflow: Download this workflow and import it into your n8n instance.
2. Configure the Daily 8am trigger: Ensure the "Daily 8am" trigger is active and set to your desired timezone.
3. Set the sender email: In the "Set Sender Email" node, manually enter the email address you wish to use as the sender for the daily reports.
4. Configure the AI Agent:
   - Chat Model: Connect your OpenAI Chat Model credential.
   - Memory & Tools: Ensure all the necessary nodes ("Get Orders", "Get Order Items", "Get Clients", "Get Products", "Send Gmail Summary") are correctly linked to the AI Agent. In our workflow we pull data from 4 tables in Supabase.
5. Configure the Supabase database connections: For each of the "Get Orders", "Get Order Items", "Get Clients", and "Get Products" nodes, configure your Supabase credentials to access your e-commerce database, and select the tables (e.g., orders, order_items, clients, products) that you want the AI agent to pull data from in your Supabase schema.
6. Configure Gmail credentials: In the "Send Gmail Summary" node, connect your Gmail account credentials to allow n8n to send emails on your behalf.
7. Test the workflow: Run the workflow manually to ensure all connections are working correctly and the email summary is generated as expected.

**Requirements**

- **n8n instance**: An active n8n instance (self-hosted or cloud).
- **Supabase account**: A Supabase account with your e-commerce order data accessible.
- **OpenAI API key**: An OpenAI API key for the Chat Model.
- **Gmail account**: Gmail account credentials to send the daily summaries.

**Need help?**

Feel free to contact us at 1 Node. Get instant access to a library of free resources we created.
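To visualize what the "Get Orders" tool fetches, here is a supabase-js sketch of a last-24-hours query. The table and column names (`orders`, `created_at`) are assumptions about your schema; in the workflow itself this query is performed by the Supabase node under the agent's control.

```javascript
// Sketch of a "orders from the last 24 hours" fetch using supabase-js.
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_ANON_KEY);

// ISO timestamp for "24 hours ago"
const since = new Date(Date.now() - 24 * 60 * 60 * 1000).toISOString();

const { data: orders, error } = await supabase
  .from('orders')            // assumed table name
  .select('*')
  .gte('created_at', since); // assumed timestamp column

if (error) throw error;
console.log(`Fetched ${orders.length} orders since ${since}`);
```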