by Sidetool
This workflow is a supporting automation for a common Airtable situation that, as of this writing, has no native solution despite strong demand. Interfaces are your secret weapon for managing a variety of tasks – from sales funnels and task tracking to dynamic dashboards. But here's a common challenge: how do you efficiently bulk upload records (like contacts, leads, or clients) from an interface with just a click? Once set up, you'll be able to upload CSV files to your tables directly from Interfaces with ease.

Workflow Key Points:

1. Bulk Upload Functionality: Say goodbye to the limitations of standard Airtable interfaces. Upload multiple leads or contacts simultaneously, making your work swift and efficient.
2. Customizable Fields: Tailor the base to meet your specific data needs. This ensures seamless integration with your existing systems and simplifies data management.

Perfect for teams in e-commerce, CRM, or any sector where managing a high volume of leads or contacts is key, this Airtable base is designed to eliminate the tedium of importing contacts. It makes large-scale data management straightforward, saving you precious time and hassle. Get ready to streamline your operations and boost your productivity! 🚀💡
by Sk developer
🚀 LinkedIn Video to MP4 Automation with Google Drive & Sheets | RapidAPI Integration

This n8n workflow automatically converts LinkedIn video URLs into downloadable MP4 files using the LinkedIn Video Downloader API, uploads them to Google Drive with public access, and logs both the original URL and the Google Drive link to Google Sheets. It leverages the LinkedIn Video Downloader service for fast and secure video extraction.

📝 Node Explanations (Single-Line)

1️⃣ On form submission → Captures the LinkedIn video URL from the user via a web form.
2️⃣ HTTP Request → Calls LinkedIn Video Downloader to fetch downloadable MP4 links (sketched at the end of this description).
3️⃣ If → Checks for API errors and routes the workflow accordingly.
4️⃣ Download mp4 → Downloads the MP4 video file from the API response URL.
5️⃣ Upload To Google Drive → Uploads the downloaded MP4 file to Google Drive.
6️⃣ Google Drive Set Permission → Makes the uploaded file publicly accessible.
7️⃣ Google Sheets → Logs successful conversions with the LinkedIn URL and sharable Drive link.
8️⃣ Wait → Delays execution before logging failed attempts.
9️⃣ Google Sheets Append Row → Logs failed video downloads with an N/A Drive link.

📄 Google Sheets Columns

**URL** → Original LinkedIn video URL entered in the form.
**Drive_URL** → Publicly sharable Google Drive link to the converted MP4 file. For failed downloads, Drive_URL will display N/A.

💡 Use Case

Automate LinkedIn video downloading and sharing using LinkedIn Video Downloader for social media managers, marketers, and content creators, with no manual file handling.

✅ Benefits

*Time-saving* (auto-download & upload), *centralized tracking* in Sheets, *easy sharing* via Drive links, and *error logging* for failed downloads, all powered by the **RapidAPI LinkedIn Video Downloader**.
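For reference, here is a minimal sketch of nodes 2️⃣–3️⃣ collapsed into a single n8n Code node. The RapidAPI host, endpoint path, and the `mp4Url` response field are assumptions for illustration, not the template's actual values; check your LinkedIn Video Downloader subscription page for the real ones.

```javascript
// Minimal sketch of the HTTP Request + If steps as one n8n Code node.
// Host, path, and response field names are assumptions, not the
// template's actual values.
const videoUrl = $input.first().json.linkedin_url; // form field name is assumed

const response = await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://linkedin-video-downloader.p.rapidapi.com/download', // hypothetical endpoint
  headers: {
    'X-RapidAPI-Key': 'YOUR_RAPIDAPI_KEY',
    'X-RapidAPI-Host': 'linkedin-video-downloader.p.rapidapi.com',
    'Content-Type': 'application/json',
  },
  body: { url: videoUrl },
  json: true,
});

// Route like the If node: emit the mp4 link on success, otherwise mark
// the item as failed so it lands in the N/A branch of the sheet.
if (!response || response.error || !response.mp4Url) {
  return [{ json: { URL: videoUrl, Drive_URL: 'N/A', status: 'failed' } }];
}
return [{ json: { URL: videoUrl, mp4Url: response.mp4Url, status: 'ok' } }];
```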
by Miquel Colomer
This workflow extracts data from a multi-page website. The workflow:

1) Starts from the country list at https://www.theswiftcodes.com/browse-by-country/.
2) Loads every country page (e.g., https://www.theswiftcodes.com/albania/).
3) Paginates through every page within each country page.
4) Extracts data from the country page.
5) Saves data to MongoDB.
6) Paginates through all pages in all countries.

It uses the getWorkflowStaticData('global') method (sketched below) to recover the next page (saved by the previous page) and then proceeds through all remaining pages. A first section recovers and extracts the country list. The workflow then checks whether a locally cached copy of the page is available and, if so, recovers the cached page from disk. Finally, data is saved to MongoDB and pagination continues through all pages of every country.

A cache system saves each visited page to the n8n local disk. If the workflow is relaunched, it checks whether a cache file exists and skips requests to the webpage that are no longer required. If the data on the website changes over time, you can add a Cron node to check the website once per week.

Before inserting data into MongoDB, the best way to avoid duplicates is to check that the swift_code (the primary value of the collection) doesn't already exist. I recommend using a proxy for all requests to avoid IP blocks. A good solution for proxy plus IP rotation is scrapoxy.io.

This workflow is perfect for small data requirements. If you need to scrape dynamic data, you can use a headless browser or a similar service. If you want to scrape huge lists of URIs, I recommend using Scrapy + Scrapoxy.
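A minimal sketch of the pagination cursor kept in workflow static data, assuming illustrative key names (nextPage, currentCountry) and an illustrative URL pattern. Note that static data only persists between production executions of an active workflow, not manual test runs.

```javascript
// Minimal sketch of the pagination bookkeeping with workflow static data.
// Key names and the URL pattern are illustrative.
const staticData = $getWorkflowStaticData('global');

// Recover the page saved by the previous execution (default: first page).
const page = staticData.nextPage || 1;
const country = staticData.currentCountry || 'albania';
const url = `https://www.theswiftcodes.com/${country}/page/${page}/`; // pattern is illustrative

// ... the HTTP Request + HTML Extract nodes fetch and parse `url` here ...

// Save the cursor so the next run continues where this one stopped.
staticData.nextPage = page + 1;

return [{ json: { country, page, url } }];
```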
by Julien DEL RIO
This template is inspired by "Save your workflows into a GitHub repository" by hikerspath and "Back Up Your n8n Workflows To Github" by jon-n8n.

Basic

Retrieves all workflows from an n8n instance and saves them to a GitLab project. If a workflow already exists, only the changes are saved.

Flow

What the workflow does:

- Sets custom parameters
- Gets workflows
- Iterates through each workflow one by one
- Gets the file from GitLab if it exists
- Compares the files as objects (not as strings)
- Returns a status for the workflow
- Creates, edits, or ignores the file depending on the status
- Returns a list of statuses for each workflow

Configuration

Select a credential in each GitLab node. Edit the data in the "Globals" node:

- repo.owner: slug of the user or team owning the repo
- repo.name: slug of the repository
- repo.branch: branch to commit on
- repo.path: from the root of the repository; should end with /

Comments

Errors on GitLab nodes will not stop the run, but the current workflow will be listed as an error in the results. Some fields are ignored when determining whether there are changes (see the comparison sketch below):

- updatedAt: should be ignored, since it changes even when only ignored fields change
- globals: runtime information; no need to track its changes
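A minimal sketch of the "compare as objects" step; the property names on the incoming item (current, fromGitlab) are illustrative, not the template's actual wiring.

```javascript
// Minimal sketch of the object comparison with ignored fields stripped.
function stripIgnoredFields(workflow) {
  // updatedAt flips on every save, and staticData ("globals") is runtime
  // state; neither should count as a real change.
  const { updatedAt, staticData, ...rest } = workflow;
  return rest;
}

function hasChanged(a, b) {
  // Plain stringify works when key order is stable; swap in a deep-equal
  // helper if your exports reorder keys.
  return JSON.stringify(stripIgnoredFields(a)) !== JSON.stringify(stripIgnoredFields(b));
}

const { current, fromGitlab } = $input.first().json; // illustrative item shape
const status = !fromGitlab ? 'create' : hasChanged(current, fromGitlab) ? 'edit' : 'same';
return [{ json: { name: current.name, status } }];
```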
by Joachim Hummel
This workflow connects a USB scanner to Nextcloud via ScanservJS and its integrated API. It checks for new scans at a scheduled time (e.g., every 5 minutes). If there are any, they are automatically retrieved via HTTP request and then saved to a desired Nextcloud folder. Ideal for home offices, offices, or maker projects with Raspberry Pi and network scanners.

Nodes used:

- Schedule Trigger – starts the flow cyclically
- HTTP Request – retrieves document data from ScanservJS (sketched below)
- Nextcloud Node – uploads the file directly to your Nextcloud account

Requirements:

- Local installation of ScanservJS (e.g., on a Raspberry Pi)
- Configured USB scanner
- Nextcloud access with write permissions in the target folder
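A minimal sketch of the retrieval step as an n8n Code node. The ScanservJS base URL, endpoint path, and response fields below are assumptions; consult your installation's API for the exact paths.

```javascript
// Minimal sketch of polling ScanservJS for new scans. Endpoint and
// field names are assumptions, not verified ScanservJS API details.
const base = 'http://raspberrypi.local:8080'; // hypothetical ScanservJS host

// List the scans currently available on the scanner host.
const files = await this.helpers.httpRequest({
  method: 'GET',
  url: `${base}/api/v1/files`, // hypothetical endpoint
  json: true,
});

// Only pass along files newer than the previous run, so every scan is
// uploaded to Nextcloud exactly once.
const staticData = $getWorkflowStaticData('global');
const lastRun = staticData.lastRun || 0;
const fresh = files.filter((f) => f.lastModified > lastRun); // field name assumed
staticData.lastRun = Date.now();

return fresh.map((f) => ({ json: f }));
```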
by Nazmy
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

OAuth Token Generator and Validator

This n8n template helps you generate, validate, and store tokens for your customers securely using:

- **n8n** as your backend automation engine
- **Airtable** as your lightweight client and token store

🚀 What It Does

- Accepts client_id and client_secret via POST webhook.
- Validates client credentials against Airtable.
- Generates a long token on success.
- Stores the generated token in Airtable with metadata.
- Responds with a JSON containing the token, expiry, and type.
- Returns clear error messages if validation fails.

How It Works

1. Webhook node receives client_id and client_secret.
2. Validator (Code node) checks that the body contains only client_id and client_secret; rejects missing or extra fields.
3. Airtable search looks up the client_id; rejects if not found.
4. Secret validation (If node) compares the provided client_secret with the stored value; rejects if incorrect.
5. Token generation (Code node) generates a 128-character secure token (see the sketch below).
6. Airtable create stores the token, client ID, creation date, and type.
7. Webhook response returns JSON { access_token, expires_in, token_type } on success, or an appropriate JSON error message on failure.

Related Workflow

You can also use it with the published Bearer Token Validation workflow: 👉 Validate API Requests with Bearer Token Authentication and Airtable, to securely validate tokens you generate with this workflow across your protected endpoints.

Why Use This

- Provides OAuth-like flows without a complex backend.
- Uses n8n + Airtable for client management and token storage.
- Clean, modular, and ready for your SaaS or internal API automations.
- Extendable for token expiry, refresh, and rotation handling.

Enjoy building secure token-based APIs using n8n + Airtable! 🚀

Built by: Nazmy
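A minimal sketch of the validator and token-generation Code nodes described above. The expiry value is illustrative, and on self-hosted n8n you may need NODE_FUNCTION_ALLOW_BUILTIN=crypto for the require() call.

```javascript
// Minimal sketch of the validator + token generator in one Code node.
const crypto = require('crypto'); // may need NODE_FUNCTION_ALLOW_BUILTIN=crypto

// Validator: the body must contain client_id and client_secret, nothing more.
const body = $input.first().json.body || $input.first().json;
const keys = Object.keys(body);
const valid = keys.length === 2 && keys.includes('client_id') && keys.includes('client_secret');
if (!valid) {
  throw new Error('Body must contain exactly client_id and client_secret');
}

// Token generation: 64 random bytes hex-encoded = a 128-character token.
const access_token = crypto.randomBytes(64).toString('hex');

return [{
  json: {
    access_token,
    token_type: 'Bearer',
    expires_in: 3600, // illustrative; the template stores the real expiry in Airtable
  },
}];
```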
by Edoardo Guzzi
This template integrates OpenAI's image generation and editing endpoints via the GPT-Image-1 model to visually create and manipulate images based on prompts. It features base64 conversion, binary handling, and prompt chaining. Perfect for marketing, design, product visuals, and creative workflows.

🛠️ Requirements

- OpenAI account with access to gpt-image-1 (you may need to complete organization verification to access this model)
- OpenAI API credentials configured in n8n
- A self-hosted or cloud n8n instance
- Basic familiarity with the n8n UI (no programming required)

🔧 Step-by-step Instructions

Step 1: Manual Trigger
Starts the workflow on click. Ideal for testing the generation and edit logic.

Step 2: Generate Image
The "Create image call" node sends a prompt to OpenAI and returns a base64 image. Example prompt: A cyberpunk city at night with flying cars and neon lights

Step 3: Convert to Binary
The base64 image is converted into a usable binary PNG file with the "Convert json binary to File" node (see the sketch below).

Step 4: Edit the Image
The binary file is passed to OpenAI's /images/edits endpoint. A new prompt applies changes to the image. Example: Add a glowing robot in the foreground with a neon sword
✅ Supports model: gpt-image-1
⚠️ Requires a binary file (not base64)

Step 5: Final Conversion
Converts the final edited image from base64 to a file so it can be downloaded or used in other nodes.

🎯 Real-World Use Cases

- 🎨 Artists & Creators: concept art and illustration variations
- 🛍️ E-commerce: auto-generate product mockups
- 📰 Marketing: create eye-catching blog or social visuals

💡 Bonus Ideas

- Add a Telegram or Slack node to generate or edit images via chat
- Use a Webhook to feed prompts from a form or frontend
- Add a mask to restrict edits to specific areas (e.g., background only)
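A minimal sketch of Step 3 as an n8n Code node, assuming the OpenAI response carries the image under data[0].b64_json (the images API's base64 response field).

```javascript
// Minimal sketch: turn the base64 payload from the images API into an
// n8n binary property, the job of the "Convert json binary to File" node.
const b64 = $input.first().json.data?.[0]?.b64_json;
if (!b64) {
  throw new Error('No base64 image found in the OpenAI response');
}

const buffer = Buffer.from(b64, 'base64');

// prepareBinaryData registers the buffer as proper n8n binary data, so
// the /images/edits request can send it as a file upload.
const binaryData = await this.helpers.prepareBinaryData(buffer, 'image.png', 'image/png');

return [{ json: {}, binary: { data: binaryData } }];
```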
by Hamed Nickmehr
This n8n workflow template uses community nodes and is only compatible with the self-hosted version of n8n.

Title: n8n Credentials and Workflows Backup on Change Detection

Purpose: Never lose track of your n8n changes again! This workflow smartly backs up all your workflows and credentials, automatically detects any changes using hash comparison, and pushes updates to GitHub, but only when something has actually changed. Set your own interval and stop cluttering your repo with redundant commits. Walkthrough video on YouTube.

Trigger:

- **Schedule Trigger**: Executes the entire process at a user-defined interval. No need to worry about traceability or managing countless backups, as the workflow only commits changes when a difference is detected.

Workflow Backup Process:

1. Set Workflow Path: Defines the local backup file path for workflows.
2. Get Old Workflow Hash: Executes a helper workflow to retrieve the previous hash.
3. Execute Workflow Backup: Runs n8n export:workflow to export all workflows to the defined file path.
4. Get New Workflow Hash: Executes a helper workflow to generate the new hash from the exported file.
5. Compare Hashes (If Workflow Updated): Checks if the new hash differs from the old one.
6. If Updated: Read Workflow Data → Extract Text → Push to GitHub: Reads, extracts, and commits the updated workflow JSON to GitHub under a timestamped filename.

Credential Backup Process:

1. Set Credential Path: Defines the local backup file path for credentials.
2. Get Old Credential Hash: Executes a helper workflow to retrieve the previous hash.
3. Execute Credential Backup: Runs n8n export:credentials to export all credentials.
4. Get New Credential Hash: Executes a helper workflow to generate the new hash from the exported file.
5. Compare Hashes (If Credential Updated): Checks for changes.
6. If Updated: Read Credential Data → Extract Text → Push to GitHub: Commits the new credentials JSON to GitHub if changes are found.

Hash Generator (Helper Flow):

Used in both workflow and credential backup paths (a sketch follows below):

- **Read File → Extract Text → Hash Data**
- Outputs the SHA-256 hash used for comparison

GitHub Integration:

- Commits are created with an ISO timestamp in the filename and message.
- Repository: https://github.com/your-github-name/n8n-onchange-bachup
- File paths: backups/WorkFlow Backup -timestamp-.json and backups/Credential Backup -timestamp-.json

Change Detection Logic:

- Only commits files when hash changes are detected (i.e., actual content change).
- Avoids unnecessary GitHub commits and storage use.

Error Handling:

- GitHub nodes are set to continue workflow execution on error, avoiding full process interruption.
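A minimal sketch of the hash-generator helper flow collapsed into a single Code node. The backupPath field name is illustrative, and self-hosted n8n may require NODE_FUNCTION_ALLOW_BUILTIN=crypto,fs for the require() calls.

```javascript
// Minimal sketch of Read File → Extract Text → Hash Data in one node.
const crypto = require('crypto'); // may need NODE_FUNCTION_ALLOW_BUILTIN=crypto,fs
const fs = require('fs');

const path = $input.first().json.backupPath; // e.g. the file written by n8n export:workflow

// Hash the raw exported file: any content change flips the digest.
const hash = crypto.createHash('sha256').update(fs.readFileSync(path)).digest('hex');

// The If node downstream compares this against the previous run's hash
// and only pushes to GitHub when they differ.
return [{ json: { path, hash } }];
```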
by Danielle Gomes
This n8n workflow collects and summarizes news from multiple RSS feeds, using OpenAI to generate a concise summary that can be sent to WhatsApp or other destinations. Perfect for automating your daily news digest.

🔁 Workflow Breakdown:

1. Schedule Trigger: Starts the workflow on your desired schedule (daily, hourly, etc.). 🟨 Note: Set the trigger however you wish.
2. RSS Feeds (My RSS 01–04): Fetches articles from four different RSS sources. 🟨 Note: You can add as many RSS feeds as you want.
3. Edit Fields (Edit Fields1–3): Normalizes RSS fields (title, link, etc.) to ensure consistency across different sources.
4. Merge (append mode): Combines the RSS items into a single unified list.
5. Filter: Optionally filters articles by keywords, date, or categories.
6. Limit: Limits the analysis to the 10 most recent articles. 🟨 Note: This keeps the result concise and avoids overloading the summary.
7. Aggregate: Prepares the selected news for summarization by combining the items into a single content block (see the sketch below).
8. OpenAI (Message Assistant): Summarizes the aggregated news items in a clean, readable format using AI.
9. Send Summary to WhatsApp: Sends the AI-generated summary to a WhatsApp endpoint via webhook (yoururlapi.com). You can replace this with an email service, Google Drive, or any other destination. 🟨 Note: You can send it to your WhatsApp API, email, drive, etc.
10. No Operation (End): Final placeholder to safely close the workflow. You may expand from here if needed.
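A minimal sketch of steps 3–7 (normalize, merge, limit, aggregate) collapsed into one Code node. RSS date field names (isoDate, pubDate) vary by feed, so treat them as assumptions.

```javascript
// Minimal sketch: normalize items from different feeds, keep the 10
// newest, and build one content block for the OpenAI summarizer.
const items = $input.all().map(({ json }) => ({
  title: json.title,
  link: json.link,
  date: new Date(json.isoDate || json.pubDate || 0), // field names vary per feed
}));

// Keep the 10 most recent articles (the Limit node's job).
const latest = items.sort((a, b) => b.date - a.date).slice(0, 10);

// Combine them into a single block for the prompt (the Aggregate node's job).
const block = latest
  .map((item, i) => `${i + 1}. ${item.title}\n${item.link}`)
  .join('\n\n');

return [{ json: { news: block } }];
```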
by Alex Halfborg
BACKGROUND
Malaysia's Inland Revenue (LHDN) provides an API to get the tax identification number (TIN) for a business entity, based on a given Business Registration Number (BRN or SSM) or NRIC (MyKad).

PROBLEM
However, the API only allows one search at a time.

SOLUTION
This free workflow lets you run a batch search to get the TIN for multiple SSMs or NRICs. This is useful if you need to prepare your internal DB for e-invoicing. A sketch of a single lookup is shown below.

PRE-REQUISITES
1) Get your connection client id and client secret from the myhasil.gov.my website
2) Prepare a Google Sheet containing the list of SSMs and NRICs you want to get TINs for
3) Create an n8n credential to connect to the Google Sheet above

SUPPORT
Questions? Ask alex at halfborg dot com
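A minimal sketch of one TIN lookup as an n8n Code node. The MyInvois token endpoint, taxpayer search endpoint, and scope below are assumptions recalled from public LHDN documentation; verify them against the current API reference before relying on this.

```javascript
// Minimal sketch of one TIN lookup; endpoints and scope are assumptions.
const base = 'https://api.myinvois.hasil.gov.my'; // assumed production host

// 1) Exchange the client id/secret for an access token.
const token = await this.helpers.httpRequest({
  method: 'POST',
  url: `${base}/connect/token`,
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
  body: new URLSearchParams({
    grant_type: 'client_credentials',
    client_id: 'YOUR_CLIENT_ID',
    client_secret: 'YOUR_CLIENT_SECRET',
    scope: 'InvoicingAPI', // assumed scope name
  }).toString(),
  json: true,
});

// 2) Look up the TIN for one row from the Google Sheet (idType: BRN or NRIC).
const row = $input.first().json;
const result = await this.helpers.httpRequest({
  method: 'GET',
  url: `${base}/api/v1.0/taxpayer/search/tin?idType=${row.idType}&idValue=${row.idValue}`,
  headers: { Authorization: `Bearer ${token.access_token}` },
  json: true,
});

return [{ json: { ...row, tin: result.tin || 'NOT_FOUND' } }];
```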
by Sarfaraz Muhammad Sajib
This n8n workflow sends SMS messages through the Textbelt API by accepting a phone number, message, and API key as inputs. It uses a manual trigger to start the process, sets the necessary data, and executes an HTTP POST request to deliver the SMS.

Step-by-Step Explanation:

1. Manual Trigger: Starts the workflow manually by clicking 'Execute workflow'.
2. Set Data Node: Defines the required input parameters (phone, message, and key) that will be sent to the SMS API. Populate these fields with your target phone number, the text message, and your Textbelt API key.
3. HTTP Request Node: Sends a POST request to https://textbelt.com/text with the phone number, message, and API key in the request body (see the sketch below). The response from the API confirms whether the message was successfully sent.
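For reference, a minimal sketch of the same POST written as an n8n Code node. Textbelt's free tier accepts the key "textbelt" for one free message per day; paid keys come from textbelt.com.

```javascript
// Minimal sketch of the HTTP Request node's call to Textbelt.
const { phone, message, key } = $input.first().json; // set by the Set Data node

const response = await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://textbelt.com/text',
  body: { phone, message, key },
  json: true,
});

// Textbelt replies with { success, textId, quotaRemaining } on success,
// or { success: false, error } on failure.
return [{ json: response }];
```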
by Dataki
This workflow helps you generate an llms.txt file (if you're unfamiliar with it, check out this article) using a Screaming Frog export. Screaming Frog is a well-known website crawler: you can easily crawl a website, then export the "internal_html" section in CSV format.

How It Works:

A form allows you to enter:

- The name of the website
- A short description
- The internal_html.csv file from your Screaming Frog export

Once the form is submitted, the workflow is triggered automatically, and you can download the llms.txt file directly from n8n.

Downloading the File

Since the last node in this workflow is "Convert to File", you will need to download the file directly from the n8n UI. However, you can easily add a node (e.g., Google Drive, OneDrive) to automatically upload the file wherever you want.

AI-Powered Filtering (Optional):

This workflow includes a text classifier node, which is deactivated by default. You can activate it to apply a more intelligent filter when selecting URLs for the llms.txt file. Consider modifying the description in the classifier node to specify the type of URLs you want to include.

How to Use This Workflow

1. Crawl the website you want to generate an llms.txt file for using Screaming Frog.
2. Export the "internal_html" section in CSV format.
3. In n8n, click "Test Workflow", fill in the form, and upload the internal_html.csv file.
4. Once the workflow is complete, go to the "Export to File" node and download the output.

That's it! You now have your llms.txt file! A sketch of the assembly step follows below.

Recommended Usage:

Use this workflow directly in the n8n UI by clicking "Test Workflow" and uploading the file in the form.
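A minimal sketch of the llms.txt assembly step. "Address", "Title 1", and "Meta Description 1" are standard Screaming Frog export columns, but the wiring here (form fields merged onto the same items as the parsed CSV rows) is illustrative, not the template's exact node layout.

```javascript
// Minimal sketch: build llms.txt lines from parsed internal_html.csv rows.
const { siteName, siteDescription } = $input.first().json; // from the form

const lines = [`# ${siteName}`, '', `> ${siteDescription}`, ''];

for (const { json: row } of $input.all()) {
  if (!row['Address']) continue; // skip items without a crawled URL
  const title = row['Title 1'] || row['Address'];
  const desc = row['Meta Description 1'];
  lines.push(`- [${title}](${row['Address']})${desc ? ': ' + desc : ''}`);
}

// The Convert to File node turns this single text field into llms.txt.
return [{ json: { llmsTxt: lines.join('\n') } }];
```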