by Sk developer
🚀 Facebook to MP4 Video Downloader – Fully Customizable Automated Workflow

Easily convert Facebook videos into downloadable MP4 files using the Facebook Video Downloader API. This n8n workflow automates fetching videos, downloading them, uploading them to Google Drive, and logging results in Google Sheets. Users can modify and extend this flow to suit their own needs (e.g., add email notifications, change the storage location, or use another API).

📝 Node-by-Node Explanation

- **On form submission** → Triggers when a user submits a Facebook video URL via the form. (You can customize this form to include email or multiple URLs.)
- **Facebook RapidAPI Request** → Sends a POST request to the Facebook Video Downloader API to fetch downloadable MP4 links. (Easily replace or update API parameters as needed.)
- **If Node** → Checks the API response for errors before proceeding. (You can add more conditions to handle custom error scenarios.)
- **MP4 Downloader** → Downloads the Facebook video file from the received media URL. (You can change download settings, add quality filters, or store multiple resolutions.)
- **Upload to Google Drive** → Uploads the downloaded MP4 file to a Google Drive folder. (Easily switch to Dropbox, S3, or any other storage service.)
- **Google Drive Set Permission** → Sets the uploaded file to be publicly shareable. (You can make it private or share only with specific users.)
- **Google Sheets** → Logs successful conversions with the original URL and shareable MP4 link. (Customizable for additional fields like video title, size, or download time.)
- **Wait Node** → Delays before logging failed conversions to avoid rapid writes. (You can adjust the wait duration or add retry attempts.)
- **Google Sheets Append Row** → Records failed conversion attempts with N/A as the Drive URL. (You can add notification alerts for failed downloads.)
✅ Use Cases

- Automate Facebook video downloads for social media teams
- Instantly generate shareable MP4 links for clients or marketing campaigns
- Maintain a centralized log of downloaded videos for reporting
- Customizable flow for different video quality, formats, or storage needs

🚀 Benefits

- Fast and reliable Facebook video downloading with the Facebook Video Downloader API
- Flexible and fully customizable – adapt nodes, storage, and notifications as required
- Automatic error handling and logging in Google Sheets
- Cloud-based storage with secure and shareable Google Drive links
- Seamless integration with n8n and the Facebook Video Downloader API for scalable automation

🔑 Resolved: Manual Facebook video downloads are now fully automated, customizable, and scalable using the Facebook Video Downloader API, Google Drive uploads, and detailed logging via Google Sheets.
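As a concrete sketch of the branch logic above, the snippet below shows one way the If Node's decision could be expressed in a Code node. The response shape (`{ error, links: [{ quality, url }] }`) is an assumption for illustration, not the documented API response; inspect the actual Facebook Video Downloader API payload before relying on it.

```javascript
// Hypothetical helper mirroring the If Node + MP4 Downloader pair.
// Returns the best MP4 URL, or null to route to the failure branch
// (Wait Node → Google Sheets Append Row with N/A as the Drive URL).
function pickMp4Link(response) {
  if (response.error || !Array.isArray(response.links) || response.links.length === 0) {
    return null; // failure branch
  }
  // Prefer an HD link if one is present, otherwise fall back to the first link.
  const hd = response.links.find((link) => link.quality === 'hd');
  return (hd || response.links[0]).url;
}
```

In the real workflow you would wire the null case to the Wait Node and the non-null case to the MP4 Downloader's URL field.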
by Yashraj singh sisodiya
AI Latest 24 Update Workflow Explanation

Aim

The aim of the AI Latest 24 Update Workflow is to automate the daily collection and distribution of the most recent Artificial Intelligence and Technology news from the past 24 hours. It ensures users receive a clean, well-structured HTML email containing headlines, summaries, and links to trusted sources, styled professionally for easy reading.

Goal

The goal is to:

- Automate news retrieval by fetching the latest AI developments from trusted sources daily.
- Generate structured HTML output with bold headlines, concise summaries, and clickable links.
- Format the content professionally with inline CSS, ensuring the email is visually appealing.
- Distribute updates automatically to selected recipients via Gmail.
- Provide reusability so the HTML can be processed for other platforms if needed.

This ensures recipients receive accurate, up-to-date, and well-formatted AI news without manual effort.

Requirements

The workflow relies on the following components and configurations:

n8n Platform

Acts as the automation environment for scheduling, fetching, formatting, and delivering AI news updates.

Node Requirements

- **Schedule Trigger**: Runs the workflow every day at 10:00 AM, automating the process without manual initiation.
- **Message a model (Perplexity API)**: Uses the sonar-pro model from Perplexity AI to fetch the most recent AI developments from the past 24 hours. Outputs results as a self-contained HTML email with inline CSS (card-style layout).
- **Send a message (Gmail)**: Sends the generated HTML email with the subject "Latest Tech and AI news update 🚀". Recipients: xyz@gmail.com.
- **HTML Node**: Processes the AI model's HTML response and ensures the email formatting is clean, valid, and ready for delivery.

Credentials

- **Perplexity API account**: For fetching AI news.
- **Gmail OAuth2 account**: For secure email delivery.

Input Requirements

No manual input is required; the workflow runs automatically on schedule.
Output

A daily AI news digest email containing:

- Headlines in bold.
- One-sentence summaries in normal text.
- Full URLs as clickable links.
- A clean card-based layout with hover effects.

API Usage

The workflow integrates two APIs to achieve automation:

- **Perplexity API**: Used in the Message a model node. The API fetches the latest AI news, ensures data accuracy, and outputs HTML-formatted content. It provides styling via inline CSS (Segoe UI font, light background, card design) and ensures the news is fresh (past 24 hours only).
- **Gmail API**: Used in the Send a message node. Handles the secure delivery of emails with OAuth2 authentication, sending AI news updates directly to inboxes.

HTML Processing

The HTML node ensures proper formatting before email delivery:

- **Process**: Cleans and validates the HTML ($json.message) generated by Perplexity.
- **Output**: A self-contained HTML email with proper structure (<html>, <head>, <body>).
- **Relevance**: Ensures Gmail sends a **styled** digest instead of raw text.

Workflow Summary

The AI Latest 24 Update Workflow automates daily AI news collection and delivery by:

1. Triggering at 10:00 AM using the Schedule Trigger node.
2. Fetching AI news via the Perplexity API (Message a model node).
3. Formatting results into clean HTML (HTML node).
4. Sending the email via the Gmail API to the recipients.

This workflow ensures a seamless, hands-off system where recipients get accurate, fresh, and well-designed AI news updates every day.
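The HTML node's "clean and validate" step can be sketched as a small helper: if the model already returned a full document, pass it through; otherwise wrap the fragment in a minimal self-contained shell. This is a hypothetical sketch, not the template's actual node code, and the inline styles are assumptions based on the Segoe UI / light-background design described above.

```javascript
// Hypothetical sketch of the HTML node's job on $json.message.
function ensureSelfContainedHtml(message) {
  const trimmed = message.trim();
  // Already a full document? Pass it through untouched.
  if (/^<(!doctype|html)/i.test(trimmed)) return trimmed;
  // Otherwise wrap the fragment so Gmail renders a styled digest, not raw text.
  return [
    '<html><head><meta charset="utf-8"></head>',
    '<body style="font-family: \'Segoe UI\', sans-serif; background: #f5f6fa;">',
    trimmed,
    '</body></html>',
  ].join('\n');
}
```

In n8n you would feed `$json.message` into a function like this before handing the result to the Gmail node.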
by Marth
How It Works: The 5-Node Anomaly Detection Flow

This workflow efficiently processes logs to detect anomalies.

1. **Scheduled Check (Cron Node)**: The primary trigger. It schedules the workflow to run at a defined interval (e.g., every 15 minutes), ensuring logs are routinely scanned for suspicious activity.
2. **Fetch Logs (HTTP Request Node)**: Retrieves logs from an external source by sending a request to your log API endpoint for a batch of the most recent logs.
3. **Count Failed Logins (Code Node)**: The core of the detection logic. The JavaScript code filters the logs for a specific event ("login_failure"), counts the total, and identifies the unique IPs involved. This information is then passed to the next node.
4. **Failed Logins > Threshold? (If Node)**: The final filter. It checks whether the number of failed logins exceeds a threshold you set (e.g., more than 5 attempts). If it does, the workflow is routed to the notification node; if not, the workflow ends safely.
5. **Send Anomaly Alert (Slack Node)**: Sends an alert to your team if an anomaly is detected. The Slack message includes a summary of the anomaly, such as the number of failed attempts and the IPs involved, enabling a swift response.

How to Set Up

Implementing this essential log anomaly detector in your n8n instance is quick and straightforward.

1. **Prepare Your Credentials & API**:
   - Log API: Make sure you have an API endpoint or another way to get logs from your system (e.g., a server, CMS, or application). The logs should be in JSON format, and you'll need any necessary API keys or tokens.
   - Slack Credential: Set up a Slack credential in n8n and get the Channel ID of your security alert channel (e.g., #security-alerts).
2. **Import the Workflow JSON**: Create a new workflow in n8n, choose "Import from JSON," and paste the workflow JSON.
3. **Configure the Nodes**:
   - Scheduled Check (Cron): Set the schedule according to your preference (e.g., every 15 minutes).
   - Fetch Logs (HTTP Request): Update the URL and header/authentication settings to match your specific log API endpoint.
   - Count Failed Logins (Code): Verify that the JavaScript code matches your log's JSON format. You may need to adjust log.event === 'login_failure' if your log events use a different name.
   - Failed Logins > Threshold? (If): Adjust the threshold value (e.g., 5) based on your risk tolerance.
   - Send Anomaly Alert (Slack): Select your Slack credential and enter the correct Channel ID.
4. **Test and Activate**:
   - Manual Test: Run the workflow manually to confirm it fetches and processes logs correctly. You can temporarily lower the threshold to 0 to ensure the alert is triggered.
   - Verify Output: Check your Slack channel to confirm that alerts are formatted and sent correctly.
   - Activate: Once you're confident in its function, activate the workflow. n8n will now automatically monitor your logs on the schedule you set.
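The Count Failed Logins logic described above can be sketched as a small function. The `event` and `ip` field names are assumptions about your log schema (matching the `log.event === 'login_failure'` check mentioned in the setup steps); adjust them to your own format.

```javascript
// Sketch of the "Count Failed Logins" Code node logic.
// Assumes each log entry looks like { event: string, ip: string, ... }.
function countFailedLogins(logs) {
  const failures = logs.filter((log) => log.event === 'login_failure');
  const uniqueIps = [...new Set(failures.map((log) => log.ip))];
  return {
    failedCount: failures.length, // compared against the threshold in the If Node
    uniqueIps,                    // included in the Slack alert summary
  };
}

// Example: three failures originating from two distinct IPs.
const result = countFailedLogins([
  { event: 'login_failure', ip: '10.0.0.1' },
  { event: 'login_success', ip: '10.0.0.2' },
  { event: 'login_failure', ip: '10.0.0.1' },
  { event: 'login_failure', ip: '10.0.0.3' },
]);
```

In the workflow, the returned object feeds the If Node (`failedCount > 5`) and the Slack message template.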
by Sk developer
Bilibili Video Downloader with Google Drive Upload & Email Notification

Automate downloading of Bilibili videos via the Bilibili Video Downloader API (RapidAPI), upload them to Google Drive, and notify users by email — all using n8n workflow automation.

🧠 Workflow Overview

This n8n automation allows users to:

1. Submit a Bilibili video URL.
2. Fetch download info from the Bilibili Video Downloader API (RapidAPI).
3. Automatically download and upload the video to Google Drive.
4. Share the file and send an email notification to the user.

⚙️ Node-by-Node Explanation

| Node | Function |
| --- | --- |
| On form submission | Triggers when a user submits the Bilibili video URL through the form. |
| Fetch Bilibili Video Info from API | Sends the video URL to the Bilibili Video Downloader API (RapidAPI) to retrieve download info. |
| Check API Response Status | Validates that the API returned a 200 success status before proceeding. |
| Download Video File | Downloads the actual video from the provided resource URL. |
| Upload Video to Google Drive | Uploads the downloaded video file to the user's connected Google Drive. |
| Google Drive Set Permission | Sets sharing permissions to make the uploaded video publicly accessible. |
| Success Notification Email with Drive Link | Sends the Google Drive link to the user via email upon successful upload. |
| Processing Delay | Adds a delay before executing error handling if something fails. |
| Failure Notification Email | Sends an error notification to the user if the download/upload fails. |

🧩 How to Configure Google Drive in n8n

1. In n8n, open Credentials → New → Google Drive.
2. Choose OAuth2 authentication.
3. Follow the on-screen instructions to connect your Google account.
4. Use the newly created credential in both the Upload Video and Set Permission nodes.
5. Test the connection to ensure access to your Drive.

🔑 How to Obtain Your RapidAPI Key

To use the Bilibili Video Downloader API (RapidAPI):

1. Visit the Bilibili Video Downloader page on RapidAPI.
2. Click Subscribe to Test (you can choose a free or paid plan).
3. Copy your x-rapidapi-key from the "Endpoints" section.
4. Paste the key into the header of your n8n Fetch Bilibili Video Info from API node.

Example header:

```json
{
  "x-rapidapi-host": "bilibili-video-downloader.p.rapidapi.com",
  "x-rapidapi-key": "your-rapidapi-key-here"
}
```

💡 Use Case

This automation is ideal for:

- Content creators archiving Bilibili videos.
- Researchers collecting media resources.
- Teams that need centralized video storage in Google Drive.
- Automated content management workflows.

🚀 Benefits

- ✅ No manual downloads – fully automated.
- ✅ Secure cloud storage via Google Drive.
- ✅ Instant user notification on success or failure.
- ✅ Scalable for multiple users or URLs.
- ✅ Powered by the reliable Bilibili Video Downloader API (RapidAPI).

👥 Who Is This For

- **n8n developers** wanting to explore advanced workflow automations.
- **Content managers** handling large volumes of Bilibili content.
- **Digital archivists** storing video data in Google Drive.
- **Educators** sharing Bilibili educational videos securely.

🏁 Summary

With this n8n workflow, you can seamlessly integrate the Bilibili Video Downloader API (RapidAPI) into your automation stack — enabling effortless video downloading, Google Drive uploading, and user notifications in one unified system.
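To make the header example concrete, here is a sketch of how the full request the Fetch Bilibili Video Info from API node sends might be assembled. The endpoint path (`/download`) and the request body field name (`url`) are assumptions for illustration; confirm the real values on the API's "Endpoints" tab in RapidAPI.

```javascript
// Hypothetical request builder for the Bilibili Video Downloader API (RapidAPI).
function buildBilibiliRequest(videoUrl, apiKey) {
  return {
    url: 'https://bilibili-video-downloader.p.rapidapi.com/download', // assumed path
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'x-rapidapi-host': 'bilibili-video-downloader.p.rapidapi.com',
      'x-rapidapi-key': apiKey, // your key from the "Endpoints" section
    },
    body: JSON.stringify({ url: videoUrl }), // assumed body field
  };
}
```

In n8n, the same values go into the HTTP Request node's URL, header, and body fields rather than into code.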
by Sk developer
Automate Text-to-Video Generation with the Google Veo3 API and Google Drive Integration

Create CGI ads effortlessly by integrating the Google Veo3 API for video generation, uploading the result to Google Drive, and sending seamless email notifications.

Node-by-Node Explanation:

- **On form submission**: Triggers the workflow when a form is submitted with a prompt for the video.
- **Wait for API Response**: Waits 35 seconds for the API to respond.
- **API Request: Check Task Status**: Sends an HTTP request to check the task status for success or failure.
- **Condition: Task Output Status**: Checks the task's output status and triggers the appropriate action (success, processing, or failure).
- **Wait for Task to Complete**: Waits another 30 seconds before rechecking the task's completion status.
- **Send Email: API Error - Task ID Missing**: Sends an email if the task ID is missing from the API response.
- **Download Video**: Downloads the video from the generated URL.
- **Upload File to Google Drive**: Uploads the generated video to Google Drive.
- **Set Google Drive Permissions**: Configures the permissions for the uploaded video on Google Drive.
- **Send an email: Video Link**: Sends a final email with the link to the completed video on Google Drive.

How to Obtain a RapidAPI Key:

1. Visit the Google Veo3 API on RapidAPI.
2. Sign up or log in to your account.
3. Subscribe to a Google Veo3 API plan.
4. Copy the API key provided in your RapidAPI dashboard.

How to Configure Google Drive:

1. Go to the Google Cloud Console.
2. Enable the Google Drive API.
3. Create OAuth 2.0 credentials and download the credentials file.
4. In your workflow, authenticate using these credentials to upload and manage files on Google Drive.

Use Case:

This workflow is ideal for businesses looking to automate CGI video creation for advertisements using the Google Veo3 API, with seamless file management and sharing via Google Drive.

Benefits:

- **Automation**: Completely automates the CGI video creation and sharing process.
- **Error Handling**: Sends error notifications for task failures or missing task IDs.
- **File Management**: Automatically uploads and manages videos on Google Drive.
- **Easy Sharing**: Generates shareable links to videos via email.

Who Is This For?

- Digital marketers looking to create ads at scale.
- Creative agencies producing CGI content.
- Developers integrating API workflows for video generation.

Link to Google Veo3 API: Google Veo3 API on RapidAPI
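The Wait → Check Task Status → Condition loop described above amounts to polling until the task resolves. As an illustration, here is one way that loop could be written in plain code; the `state` values (`success` / `processing` / `failed`) and the response shape are assumptions, not the documented Veo3 API response, and in n8n the same behavior is built from the Wait and If nodes rather than a loop.

```javascript
// Hypothetical polling helper mirroring the Wait/Check/Condition cycle.
// `checkStatus` is any async function returning { state, ... }.
async function waitForTask(checkStatus, { maxAttempts = 10, delayMs = 30000 } = {}) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const status = await checkStatus();
    if (status.state === 'success') return status;          // video is ready
    if (status.state === 'failed') throw new Error('Task failed');
    // still processing: wait, then re-check (the "Wait for Task to Complete" node)
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  throw new Error('Timed out waiting for task');
}
```

Capping the attempts (here at 10) is the design choice that keeps a stuck task from polling forever.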
by Oneclick AI Squad
Live Airport Delay Dashboard with FlightStats & Team Alerts

Description

Automates live monitoring of airport delays using the FlightStats API. Stores and displays delay data, with Slack alerts for severe delays sent to operations/sales teams.

Essential Information

- Runs on a scheduled trigger (e.g., hourly or daily).
- Fetches real-time delay data from the FlightStats API.
- Stores data in Google Sheets and alerts teams via Slack for severe delays.

System Architecture

- **Delay Monitoring Pipeline**:
  - Set Schedule: Triggers the workflow hourly or daily via Cron.
  - FlightStats API: Retrieves live airport delay data.
- **Data Management Flow**:
  - Set Output Data: Prepares data for storage or display.
  - Merge API Data: Combines and processes delay data.
- **Alert and Display**:
  - Send Response via Slack: Alerts ops/sales for severe delays.
  - No Action for Minor Delays: Skips minor delays with no action.

Implementation Guide

1. Import the workflow JSON into n8n.
2. Configure the Cron node for the desired schedule (e.g., every 1 hr).
3. Set up FlightStats API credentials and endpoint (e.g., https://api.flightstats.com).
4. Configure Google Sheets or Notion for data storage/display.
5. Test with a sample API call and verify Slack alerts.
6. Adjust delay severity thresholds as needed.

Technical Dependencies

- Cron service for scheduling.
- FlightStats API for real-time delay data.
- Google Sheets API or Notion API for data storage/display.
- Slack API for team notifications.
- n8n for workflow automation.

Database & Sheet Structure

**Delay Tracking Sheet** (e.g., AirportDelays):

- Columns: airport_code, delay_status, delay_minutes, timestamp, alert_sent
- Example: JFK, Severe, 120, 2025-07-29T20:28:00Z, Yes

Customization Possibilities

- Adjust the Cron schedule for different frequencies (e.g., every 30 min).
- Modify FlightStats API parameters to track specific airports.
- Customize Slack alert messages in the Send Response via Slack node.
- Integrate with a dashboard tool (e.g., Google Data Studio) for a live display.
- Add email alerts for additional notification channels.
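The severity decision that routes a delay to "Send Response via Slack" or "No Action for Minor Delays" can be sketched as a tiny classifier. The 15- and 60-minute thresholds below are assumptions used for illustration; the workflow expects you to tune them to your operation, as noted under the implementation guide.

```javascript
// Hypothetical severity rule behind the Slack-alert branch.
function classifyDelay(delayMinutes) {
  if (delayMinutes >= 60) return 'Severe'; // alert ops/sales via Slack
  if (delayMinutes >= 15) return 'Minor';  // logged to the sheet, no alert
  return 'None';                           // no action
}
```

The returned label maps directly onto the `delay_status` column of the AirportDelays sheet (e.g., JFK at 120 minutes logs as Severe).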
by Sk developer
Automated SEO Website Traffic Checker with Google Sheets Logging

Description: This workflow uses the Website Traffic Checker Semrush API to analyze website traffic and performance. It collects data through a user-submitted website URL and stores the results in Google Sheets for easy access and reporting. Ideal for SEO analysis and data tracking.

Node-by-Node Explanation:

1. **On form submission** – Captures the website URL submitted by the user through a form and triggers the workflow.
2. **Check webTraffic** – Sends a request to the Website Traffic Checker Semrush API, using the provided URL to fetch real-time traffic statistics for the submitted website.
3. **Re format output** – Extracts and reformats the raw traffic data from the API response, cleaning and structuring it for easy readability and reporting.
4. **Google Sheets** – Appends the formatted traffic data into a Google Sheet for long-term storage, tracking, and analysis.

Benefits of This Flow:

- **Real-Time Data Collection**: Collects real-time website traffic data directly from the Website Traffic Checker Semrush API, ensuring up-to-date information is always available.
- **Automation**: Automatically processes and formats the website traffic data into an easily accessible Google Sheet, saving time and effort.
- **Customizable**: The workflow can be customized to track multiple websites, and the data can be filtered and expanded as needed.
- **SEO Insights**: Get in-depth insights like bounce rate, pages per visit, and visits per user, essential for SEO optimization.

Use Case:

- **SEO Monitoring**: Track and analyze the traffic of competitor websites or your own website for SEO improvements. Ideal for digital marketers, SEO professionals, and website owners.
Automated Reporting:** Automatically generate traffic reports for various websites and save them in a Google Sheet for easy reference. No need to manually update data or perform complex calculations. Data-Driven Decisions:** By utilizing data from the Website Traffic Checker Semrush API, users can make informed decisions to improve website performance and user experience.
by SpaGreen Creative
Automated Web Form Data Collection and Storage to Google Sheets

Overview

This n8n workflow allows you to collect data from a web form and automatically store it in a Google Sheet. It includes data cleanup, date stamping, optional batching, and throttling for smooth handling of single or bulk submissions.

What It Does

- Accepts data submitted from a frontend form via HTTP POST
- Cleans and structures the incoming JSON data
- Adds the current date automatically
- Appends structured data into a predefined Google Sheet
- Supports optional batch processing and a wait/delay mechanism to control data flow

Features

- Webhook trigger for external form submissions
- JavaScript-based data cleaning and formatting
- Looping and delay nodes to manage bulk submissions
- Direct integration with Google Sheets via OAuth2
- Fully mapped columns to match the sheet structure
- Custom date field (submitted_date) auto-generated per entry

Who's It For

This workflow is perfect for:

- Developers or marketers collecting lead data via online forms
- Small businesses tracking submissions from landing pages or contact forms
- Event organizers managing RSVP or booking forms
- Anyone needing to collect and store structured data in Google Sheets automatically

Prerequisites

Make sure the following are ready before use:

- An n8n instance (self-hosted or cloud)
- A Google account with edit access to the target Google Sheet
- **Google Sheets OAuth2 API credentials** configured in n8n
- A web form or app capable of sending POST requests with the following fields: business_name, location, whatsapp, email, name

Google Sheet Format

Ensure your Google Sheet contains the following exact column names (case-sensitive):

| Business Name | Location | WhatsApp Number | Email | Name | Date |
|---------------|------------|-----------------|--------------------|----------------|------------|
| SpaGreen | Bangladesh | 8801322827753 | spagreen@gmail.com | Abdul Mannan | 2025-09-14 |
| Dev Code Journey | Bangladesh | 8801322827753 | admin@gmail.com | Shakil Ahammed | 2025-09-14 |

> Note: The "Email" column includes a trailing space — this must match exactly in both the sheet and the column mapping settings.

Setup Instructions

1. **Configure Webhook**
   - Use the Webhook node with path: /93a81ced-e52c-4d31-96d2-c91a20bd7453
   - Accepts POST requests from a frontend form or application
2. **Clean Incoming Data**
   - The JavaScript (Code) node extracts the submitted fields
   - Adds a submitted_date in YYYY-MM-DD format
3. **Loop Over Items (Optional for Batches)**
   - The Split In Batches node allows handling bulk form submissions
   - For single entries, the workflow still works without adjustment
4. **Append to Google Sheet**
   - The Google Sheets node appends each submission as a new row
   - Mapped fields include: Business Name, Location, WhatsApp Number, Email, Name, Date (auto-filled)
5. **Add Delay (Optional)**
   - The Wait node adds a 5-second delay per loop
   - Helps throttle requests when handling large batches

How to Use It

1. Clone or import the workflow into your n8n instance
2. Update the Webhook URL in your frontend form's POST action
3. Connect your Google Sheets account in the Google Sheets node
4. Confirm that your target sheet matches the required column structure
5. Start sending data from your form — new entries will appear in your sheet automatically

> This setup ensures form submissions are received, cleaned, stored efficiently, and processed in a controlled manner.

Notes

- Use the Sticky Notes in the workflow to understand each node's purpose
- You can modify the delay duration or disable looping for single submissions
- For added security, consider securing your webhook with headers or tokens

Ideal Use Cases

- Contact forms
- Lead capture pages
- Event signups or bookings
- Newsletter or email list opt-ins
- Surveys or feedback forms

Support

- WhatsApp Support: Chat Now
- Discord: Join SpaGreen Community
- Facebook Group: SpaGreen Support
- Website: spagreen
- Developer Portfolio: Codecanyon SpaGreen
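The "Clean Incoming Data" step can be sketched as a mapping from the POST fields to the sheet columns, including the `submitted_date` stamp. This is a hypothetical sketch, not the template's actual Code node; the `'Email '` key with a trailing space mirrors the sheet quirk called out in the note under the sheet format.

```javascript
// Hypothetical sketch of the JavaScript (Code) node: trim fields, add the date.
function cleanSubmission(body) {
  const trim = (value) => String(value ?? '').trim();
  return {
    'Business Name': trim(body.business_name),
    'Location': trim(body.location),
    'WhatsApp Number': trim(body.whatsapp),
    'Email ': trim(body.email), // note the trailing space, matching the sheet header
    'Name': trim(body.name),
    'Date': new Date().toISOString().slice(0, 10), // submitted_date in YYYY-MM-DD
  };
}
```

The returned keys line up one-to-one with the column mapping in the Google Sheets node.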
by Sk developer
Automated Video Generation, Google Drive Upload, and Email Notification with the Veo 3 Fast API

This workflow automates the process of generating videos with the Veo 3 Fast API, uploading each video to Google Drive, and notifying the user via email. All tasks are executed seamlessly, with automatic error handling ensuring a smooth user experience.

Node-by-Node Explanation

- **On Form Submission**: Triggers the workflow when a user submits a form with a prompt.
- **Veo 3 Fast API Processor**: Sends the user's prompt to the Veo 3 Fast API to generate a video.
- **Wait for API Response**: Pauses the workflow for 35 seconds to allow the API to respond.
- **API Request: Check Task Status**: Sends a request to check the status of the video generation task.
- **Condition: Task Output Status**: Evaluates whether the task succeeded, is still processing, or failed.
- **Wait for Task to Complete**: Pauses the workflow for 30 seconds before rechecking the task status if processing is ongoing.
- **Send Email: API Error - Task Failed**: Sends an email if the task fails to generate the video.
- **Send Email: API Error - Task ID Missing**: Sends an email if the task ID is missing from the response.
- **Download Video**: Downloads the processed video from the provided output URL.
- **Upload File to Google Drive**: Uploads the processed video to the user's Google Drive.
- **Set Google Drive Permissions**: Sets the necessary sharing permissions for the uploaded video.
- **Send an Email: Video Link**: Sends an email with the link to the uploaded video.

How to Obtain a RapidAPI Key

1. Go to Veo 3 Fast on RapidAPI.
2. Create an account or log in.
3. Subscribe to the API plan that suits your needs.
4. After subscribing, find your API key in the "Keys & Access" section.

How to Configure the Google Drive API

1. Go to the Google Cloud Console.
2. Create a new project or select an existing one.
3. Enable the Google Drive API for the project.
4. Go to Credentials and create OAuth 2.0 credentials.
5. Add the credentials to your n8n Google Drive node for seamless access to your Google Drive.
Use Case

A content creation team can automate the video production process, upload videos to Google Drive, and share them with stakeholders instantly after each task completes.

Benefits

- **Efficiency**: Reduces manual tasks, saving time and effort by automating video creation and file management.
- **Error Handling**: Sends notifications for task failures or missing data, ensuring quick resolutions.
- **Seamless Integration**: Automatically uploads files to Google Drive and shares the link with users, streamlining the workflow.

Who Is This For

- **Content Creators**: Automates video creation and file management.
- **Marketing Teams**: Quick and easy video generation for campaigns.
- **Developers**: Can integrate with APIs and automate tasks.
- **Business Teams**: Save time by automating repetitive tasks like file uploads and email notifications.
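For the Set Google Drive Permissions step, the common way to make the uploaded video shareable via a plain link is a Drive v3 `permissions.create` call with an anyone-with-the-link reader permission; in n8n this maps to the Google Drive node's share operation with role "reader" and type "anyone". The request body looks like:

```json
{
  "role": "reader",
  "type": "anyone"
}
```

Switching `type` to `"user"` (with an `emailAddress` field) restricts sharing to specific people instead of anyone with the link.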
by Meelioo
How it Works

This workflow creates automated daily backups of your n8n workflows to a GitLab repository:

1. **Scheduled Trigger** - Runs automatically at noon each day to initiate the backup process
2. **Fetch Workflows** - Retrieves all active workflows from your n8n instance, filtering out archived ones
3. **Compare & Process** - Checks existing files in GitLab and compares them with current workflows
4. **Smart Upload** - For each workflow, either updates the existing file in GitLab (if it exists) or creates a new one
5. **Notification System** - Sends success/failure notifications to a designated Slack channel with execution details

> The workflow intelligently handles each file individually, cleaning up unnecessary metadata before converting workflows to formatted JSON files ready for version control.

Set up Steps

Estimated setup time: 15-20 minutes

You'll need to configure three credential connections and customize the Configuration node:

- **GitLab API**: Create a project access token with write permissions to your backup repository
- **n8n Internal API**: Generate an API key from your n8n user settings
- **Slack Bot**: Set up a Slack app with bot token permissions for posting messages to your notification channel

> Once credentials are configured, update the Configuration node with your GitLab project owner, repository name, and target branch. The workflow includes detailed setup instructions in the sticky notes for each credential type. After setup, activate the workflow to begin daily automated backups.
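The "Smart Upload" decision maps neatly onto GitLab's repository files API, which creates a file with POST and updates an existing one with PUT. The helper below is a sketch of that choice, not the template's actual node code; the `:id` placeholder and the `.json` naming convention are assumptions.

```javascript
// Hypothetical sketch of the create-vs-update decision in "Smart Upload".
function planUpload(existingPaths, workflowName) {
  const filePath = `${workflowName}.json`;
  return {
    // GitLab repository files API: POST creates, PUT updates.
    method: existingPaths.includes(filePath) ? 'PUT' : 'POST',
    endpoint: `/projects/:id/repository/files/${encodeURIComponent(filePath)}`,
    filePath,
  };
}
```

Note that the file path must be URL-encoded in the endpoint, which is why `encodeURIComponent` is applied.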
by Oneclick AI Squad
This automated n8n workflow delivers daily multi-currency exchange rate updates via API to email and WhatsApp. The system fetches the latest exchange rates, formats the data, and sends alerts to designated recipients to keep users informed of currency fluctuations.

What is a Multi-Currency Exchange Update?

Multi-currency exchange updates involve retrieving the latest exchange rates for multiple currencies against a base currency (INR) via an API, formatting the data, and distributing it through email and WhatsApp for real-time financial awareness.

Good to Know

- Exchange rate accuracy depends on the reliability of the external API source
- API rate limits should be respected to ensure system stability
- Manual configuration of API keys and recipient lists is required
- Real-time alerts help users stay updated on currency movements

How It Works

1. **Daily Trigger** - Triggers the workflow daily at 7:30 AM IST to fetch and send exchange rates
2. **Set Config: API Key & Currencies** - Defines the API key and target currencies (INR, CAD, AUD, CNY, EUR, USD) for use in the API call
3. **Fetch Exchange Rates (CurrencyFreaks)** - Calls the exchange rate API and fetches the latest rates with INR as the base
4. **Wait for API Response** - Adds a short delay (5s) to ensure API rate limits are respected and system stability is maintained
5. **Set Email & WhatsApp Recipients** - Sets the list of email addresses and WhatsApp numbers that should receive the currency update
6. **Create Message Subject & Body** - Dynamically generates a subject line (e.g., "Today's Currency Exchange Rates [(Date)]") and a body containing all rates
7. **Send Email Alert** - Sends the formatted currency rate update via email
8. **Send WhatsApp Alert** - Sends the formatted currency rate update via WhatsApp

How to Use

1. Import the workflow into n8n
2. Configure the API key for the CurrencyFreaks API
3. Set the list of target currencies and ensure INR is the base currency
4. Configure email credentials for sending alerts
5. Configure WhatsApp credentials or API integration for sending messages
6. Test the workflow with sample data to verify rate fetching and alert delivery
7. Adjust the trigger time or recipient list as needed

Requirements

- CurrencyFreaks API credentials
- Email service credentials (Gmail, SMTP, etc.)
- WhatsApp API or integration credentials

Customizing This Workflow

- Modify the target currencies in the Set Config node to include additional currencies
- Adjust the delay time in the Wait node based on API rate limits
- Customize the email and WhatsApp message formats in the Create Message node to suit user preferences
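The "Create Message Subject & Body" step described above can be sketched as a small formatting function. The assumption that `rates` maps currency codes to their value per 1 INR follows from INR being the base currency; the exact CurrencyFreaks response shape may differ, so treat this as an illustrative sketch rather than the node's actual code.

```javascript
// Hypothetical sketch of the "Create Message Subject & Body" node.
function buildCurrencyMessage(rates, dateStr) {
  // One line per currency, e.g. "1 INR = 0.012 USD"
  const lines = Object.entries(rates).map(
    ([code, rate]) => `1 INR = ${rate} ${code}`
  );
  return {
    subject: `Today's Currency Exchange Rates [${dateStr}]`,
    body: lines.join('\n'),
  };
}
```

The same `subject`/`body` pair then feeds both the email and WhatsApp send nodes, keeping the two channels consistent.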
by Florent
n8n Restore Workflows & Credentials from Disk - Self-Hosted Solution

This n8n template provides a safe and intelligent restore solution for self-hosted n8n instances, allowing you to restore workflows and credentials from disk backups. Perfect for disaster recovery or migrating between environments, this workflow automatically identifies your most recent backup and provides a manual restore capability that intelligently excludes the current workflow to prevent conflicts. Works seamlessly with date-organized backup folders.

Good to know

- This workflow uses n8n's native import commands (n8n import:workflow and n8n import:credentials)
- Works with date-formatted backup folders (YYYY-MM-DD) for easy version identification
- The restore process intelligently excludes the current workflow to prevent overwriting itself
- Requires proper Docker volume configuration and file system permissions
- All operations are performed server-side with no external dependencies
- Compatible with backups created by n8n's export commands

How it works

Restore Process (Manual)

1. Manual trigger with configurable pinned data options (credentials: true/false, workflows: true/false)
2. The Init node sets up all necessary paths, timestamps, and configuration variables using your environment settings
3. The workflow scans your backup folder and automatically identifies the most recent backup
4. If restoring credentials: direct import from the latest backup folder using n8n's import command; credentials are imported with their encrypted format intact
5. If restoring workflows: scans the backup folder for all workflow JSON files, creates a temporary folder with all workflows from the backup, intelligently excludes the current restore workflow to prevent conflicts, imports all other workflows using n8n's import command, and cleans up temporary files automatically
6. Optional email notifications provide detailed restore summaries with command outputs

How to use

Prerequisites

- Existing n8n backups in a date-organized folder structure (format: /backup-folder/YYYY-MM-DD/)
- Workflow backups as JSON files in the date folder
- Credentials backups in the subfolder: /backup-folder/YYYY-MM-DD/n8n-credentials/
- For new environments: N8N_ENCRYPTION_KEY from the source environment (see dedicated section below)

Initial Setup

1. Configure your environment variables:
   - N8N_ADMIN_EMAIL: Your email for notifications (optional)
   - N8N_BACKUP_FOLDER: Location where your backups are stored (e.g., /files/n8n-backups)
   - N8N_PROJECTS_DIR: Projects root directory
   - GENERIC_TIMEZONE: Your local timezone
   - N8N_ENCRYPTION_KEY: Required if restoring credentials to a new environment (see dedicated section below)
2. Update the Init node:
   - (Optional) Configure your email here: const N8N_ADMIN_EMAIL = $env.N8N_ADMIN_EMAIL || 'youremail@world.com';
   - Set PROJECT_FOLDER_NAME to "Workflow-backups" (or your preferred name)
   - Set credentials to "n8n-credentials" (or your backup credentials folder name)
   - Verify that the BACKUP_FOLDER path matches where your backups are stored
3. Ensure your Docker setup has:
   - A mounted volume containing backups (e.g., /local-files:/files)
   - Access to n8n's CLI import commands
   - Proper file system permissions (read access to backup directories)

Performing a Restore

1. Open the workflow and locate the "Start Restore" manual trigger node
2. Edit the pinned data to choose what to restore:
   - credentials: true - Restore credentials
   - workflows: true - Restore workflows
   - Set both to true to restore everything
3. Click "Execute workflow" on the "Start Restore" node to execute the restore
4. The workflow will automatically find the most recent backup (latest date)
5. Check the console logs or optional email for a detailed restore summary

Important Notes

- The workflow automatically excludes itself during restore to prevent conflicts
- Credentials are restored with their encryption intact.
  If restoring to a new environment, you must first configure the `N8N_ENCRYPTION_KEY` from the source environment (see dedicated section below)
- Existing workflows/credentials with the same names will be overwritten
- Test in a non-production environment first if unsure

## ⚠ Critical: N8N_ENCRYPTION_KEY Configuration

**Why this is critical:** n8n generates an encryption key automatically on first launch and saves it in the `~/.n8n/config` file. However, if this file is lost (for example, due to missing Docker volume persistence), n8n will generate a NEW key, making all previously encrypted credentials inaccessible.

**When you need to configure `N8N_ENCRYPTION_KEY`:**

- Restoring to a new n8n instance
- When your data directory is not persisted between container recreations
- Migrating from one server to another
- As a best practice to ensure key persistence across updates

**How credentials encryption works:**

- Credentials are encrypted with a key unique to each n8n instance
- This key is auto-generated on first launch and stored in `/home/node/.n8n/config`
- When you back up credentials, they remain encrypted, but the key is NOT included
- If the key file is lost or a new key is generated, restored credentials cannot be decrypted
- Setting `N8N_ENCRYPTION_KEY` explicitly ensures the key remains consistent

**Solution: retrieve and configure the encryption key**

Step 1: Get the key from your source environment

```bash
# Check if the key is defined in environment variables
docker-compose exec n8n printenv N8N_ENCRYPTION_KEY
```

If this command returns nothing, the key is auto-generated and stored in n8n's data volume:

```bash
# Enter the container
docker-compose exec n8n sh
# Check the configuration file
cat /home/node/.n8n/config
# Exit the container
exit
```

Step 2: Configure the key in your target environment

Option A: Using a `.env` file (recommended for security)

```bash
# Add to your .env file
N8N_ENCRYPTION_KEY=your_retrieved_key_here
```

Then reference it in `docker-compose.yml`:

```yaml
services:
  n8n:
    environment:
      - N8N_ENCRYPTION_KEY=${N8N_ENCRYPTION_KEY}
```

Option B: Directly in `docker-compose.yml` (less secure)

```yaml
services:
  n8n:
    environment:
      - N8N_ENCRYPTION_KEY=your_retrieved_key_here
```

Step 3: Recreate the n8n container

```bash
# Use "up -d" rather than "restart": a plain restart does not
# re-read docker-compose.yml, so the new variable would be ignored
docker-compose up -d n8n
```

Step 4: Restore your credentials. Only after configuring the encryption key, run the restore workflow with `credentials: true`.

**Best practice for future backups:**

- Always save your `N8N_ENCRYPTION_KEY` in a secure location alongside your backups
- Consider storing it in a password manager or secure vault
- Document it in your disaster recovery procedures

## Requirements

### Existing Backups

- Date-organized backup folders (`YYYY-MM-DD` format)
- Backup files created by n8n's export commands, or a compatible format

### Environment

- Self-hosted n8n instance (Docker recommended)
- Docker volumes mounted with access to the backup location
- Optional: SMTP server configured for email notifications

### Credentials (Optional)

- SMTP credentials for email notifications (if using email nodes)

## Technical Notes

### Smart Workflow Exclusion

- During workflow restore, the current workflow's name is cleaned and matched against backup files
- This prevents the restore workflow from overwriting itself
- The exclusion logic handles special characters and spaces in workflow names
- A temporary folder is created with all workflows except the current one

### Timezone Handling

- All timestamps use UTC for technical operations
- Display times use the local timezone for user-friendly readability
- Backup folder scanning works with the `YYYY-MM-DD` format regardless of timezone

### Security

- Credentials are imported in n8n's encrypted format (encryption preserved)
- Ensure backup directories have appropriate read permissions
- Consider access controls for who can trigger restore operations
- No sensitive data is logged in console output

## Troubleshooting

### Common Issues

- **No backups found:** Verify that the `N8N_BACKUP_FOLDER` path is correct and contains date-formatted folders
- **Permission errors:** Ensure the Docker user has read access to the backup directories
- **Path not found:** Verify that all volume mounts in `docker-compose.yml` match your backup location
- **Import fails:** Check that the backup files are in valid n8n export format
- **Workflow conflicts:** The workflow automatically excludes itself, but ensure backup files are properly named
- **Credentials not restored:** Verify that the backup contains an `n8n-credentials` folder with credential files
- **Credentials decrypt error:** Ensure `N8N_ENCRYPTION_KEY` matches the source environment

## Version Compatibility

- Tested with n8n version 1.113.3
- Compatible with Docker-based n8n installations
- Requires n8n CLI access (available in official Docker images)

This workflow is designed for self-hosted server backup restoration. For FTP/SFTP remote backups, see the companion workflow "n8n Restore from FTP". It works best with backups from: "Automated n8n Workflows & Credentials Backup to Local/Server Disk & FTP".
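The restore sequence described under "How it works" can be sketched as a small shell script. This is a hedged sketch rather than the template's actual node code: the folder layout follows the defaults above, lexical sorting of `YYYY-MM-DD` names yields the most recent backup, and the `*Restore*` filename filter standing in for the workflow's smarter self-exclusion logic is an assumption.

```shell
#!/bin/sh
# Sketch of the restore logic described above. Assumed backup layout:
#   $BACKUP_FOLDER/YYYY-MM-DD/*.json            -> workflow exports
#   $BACKUP_FOLDER/YYYY-MM-DD/n8n-credentials/  -> credential exports
BACKUP_FOLDER="${N8N_BACKUP_FOLDER:-/files/n8n-backups}"

# The most recent backup is simply the lexically greatest date folder,
# because YYYY-MM-DD sorts chronologically.
latest_backup() {
  ls -d "$1"/????-??-??/ 2>/dev/null | sort | tail -n 1
}

LATEST=$(latest_backup "$BACKUP_FOLDER")
if [ -n "$LATEST" ]; then
  # Credentials: import directly; the encrypted form is preserved.
  n8n import:credentials --separate --input="${LATEST}n8n-credentials"

  # Workflows: stage everything except the restore workflow itself
  # ("Restore" in the file name is an assumed naming convention; the
  # real template cleans and matches the current workflow's name).
  TMP=$(mktemp -d)
  find "$LATEST" -maxdepth 1 -name '*.json' ! -name '*Restore*' \
    -exec cp {} "$TMP/" \;
  n8n import:workflow --separate --input="$TMP"
  rm -rf "$TMP"
else
  echo "No backups found in $BACKUP_FOLDER" >&2
fi
```

The `--separate` flag tells the n8n CLI to read a directory of individual JSON files rather than a single combined export, which matches the one-file-per-workflow backup layout assumed here.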
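As a companion to the encryption-key section, the key can be pulled out of the config file once you have it on disk (e.g., after `docker-compose exec n8n cat /home/node/.n8n/config`). This helper is a sketch that assumes the config is JSON containing an `"encryptionKey"` field; verify the field name against your own instance's config file before relying on it.

```shell
# Extract the auto-generated encryption key from a copy of n8n's
# config file. Assumes JSON of the form {"encryptionKey": "..."}.
extract_key() {
  sed -n 's/.*"encryptionKey"[[:space:]]*:[[:space:]]*"\([^"]*\)".*/\1/p' "$1"
}

# Usage sketch: append the key to the target environment's .env file.
# extract_key config.json | sed 's/^/N8N_ENCRYPTION_KEY=/' >> .env
```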