by Ria
This is a simple template that shows how to extract multiple email attachments and return them as an iterable output.

How it works:
- The Gmail Trigger node detects any new email that has attachments.
- The Code node extracts the attachments as binary files and attaches them to the item.
- The files can then be uploaded via the Google Drive node.

Setup steps:
- Add your Gmail credentials.
- Add your Google Drive credentials.
- Follow the official n8n documentation for help.

Feedback & Questions
If you have any questions or feedback about this workflow, feel free to get in touch at ria@n8n.io.
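For reference, here is a minimal sketch of what the extraction step in a Code node can look like. It is an illustration rather than the template's exact code, and it assumes the Gmail Trigger exposes attachments as binary properties named attachment_0, attachment_1, and so on:

```javascript
// Code node, "Run Once for All Items" mode.
// Assumption: attachments arrive as binary properties attachment_0, attachment_1, ...
const output = [];

for (const item of items) {
  const binaries = item.binary || {};
  for (const key of Object.keys(binaries)) {
    // Emit one item per attachment so a downstream node
    // (e.g. Google Drive) can upload each file individually.
    output.push({
      json: { fileName: binaries[key].fileName },
      binary: { data: binaries[key] },
    });
  }
}

return output;
```

Each attachment becomes its own item, which is what makes the output iterable for the Google Drive upload node.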
by DUBCOM
Workflow: Snapshot Contabo

How it Works
This workflow automates daily backups (snapshots) of VPS instances hosted on Contabo. Each day at midnight, it checks for existing snapshots and ensures that only the latest backups are retained by removing older ones. It provides a seamless, hands-off backup process to keep your data secure.

Setup Steps
Setting up this workflow is quick, typically taking about 10-15 minutes. The essential part of the setup is providing the necessary credentials, which you can easily retrieve from your Contabo control panel.
1. Import the workflow: Download and upload the workflow JSON into n8n.
2. Configure credentials: Add CLIENT_ID, CLIENT_SECRET, API_USER, and API_PASSWORD in the credential node.
3. Activate the workflow: Enable it to run automatically at midnight every day.

Flow Overview
- **Schedule Trigger (00:00 daily):** Automatically initiates the workflow.
- **Formatted Date:** Prepares a timestamp for naming the snapshot.
- **List Snapshots:** Verifies if an existing snapshot is available for each VPS.
- **Conditional Logic:**
  - No snapshot? Proceeds to create a new one.
  - Snapshot found? Deletes the old snapshot before creating a new one.

Key Points
- **Snapshot retention:** Old snapshots are deleted to ensure only the latest backups are stored.
- **Unique identifiers:** UUIDs are used to track operations and guarantee they are unique.
by Wayne Simpson
Automate your email management with this workflow, designed for freelancers and business professionals who receive high volumes of emails. By leveraging AI-powered categorisation and dynamic email processing, this template helps you organise your inbox and streamline communication for better efficiency and productivity. Check out the YouTube video for step-by-step setup instructions!

How it works:
- Fetch & Filter Emails: The workflow retrieves emails from your Microsoft Outlook account, filtering out flagged emails and those already categorised.
- Content Preparation: Each email is cleaned up and converted to a structured format using Markdown, making it easier for AI processing.
- AI Categorisation: The content is analysed using an AI model, which categorises the emails into predefined categories (e.g., Action, Junk, Business, SaaS) based on the context and content.
- Email Categorisation & Folder Management: The categorised emails are updated in Microsoft Outlook and moved to respective folders such as "Junk Email" or "Receipts" based on the AI's classification.
- Conditional Processing & Final Checks: Additional checks and conditions ensure that only unread emails are processed, and errors are gracefully managed to maintain workflow stability.

Set up steps:
- Connect Microsoft Outlook: Link your Microsoft Outlook account using the built-in credentials node to enable email fetching, updating, and folder management.
- Configure AI Model (Ollama API): Set up the AI model by connecting to the Ollama API and choosing your desired language model for categorisation.
- Modify Email Categories (optional): Customise the categories and subcategories within the workflow to suit your unique email management needs.
- Set Up Error Handling: Review the error handling node settings to ensure smooth workflow execution.

This template offers a robust solution for managing and organising your inbox, helping you save time and keep your focus on important emails.
by Darryn Balanco
This workflow automates the management of DigitalOcean Droplet snapshots by listing all droplets, filtering based on the number of snapshots, and deleting excess snapshots before creating new ones. It keeps your droplet snapshots organized and within a manageable limit, preventing unnecessary storage costs from an excess of snapshots.

Who is this for?
This workflow is for users managing DigitalOcean Droplets who want to automate snapshot creation and cleanup to save on storage costs and maintain efficient resource management. It's useful for DevOps teams, cloud administrators, or any developer leveraging DigitalOcean for their infrastructure.

What problem is this workflow solving?
When managing multiple DigitalOcean Droplets, snapshots can quickly accumulate, taking up space and increasing storage costs. Manually deleting and creating snapshots is time-consuming and inefficient. This workflow solves the problem by automating snapshot management, ensuring that no more than a defined number of snapshots are kept per droplet.

What this workflow does
1. Runs every 48 hours: The workflow is triggered by a cron node that runs every 48 hours, ensuring timely snapshot management.
2. List all droplets: The workflow retrieves all droplets in the DigitalOcean account.
3. Retrieve snapshots: For each droplet, the workflow retrieves a list of existing snapshots.
4. Filter snapshots: If the number of snapshots exceeds 4, the workflow filters for snapshots that need to be deleted.
5. Delete snapshots: Excess snapshots are automatically deleted based on the filter criteria.
6. Create new snapshot: After cleaning up, the workflow creates a new snapshot for each droplet, ensuring that backups are always up to date.

Setup
- DigitalOcean API key: Configure the HTTP Request nodes with your DigitalOcean API key. This key is required for authenticating requests to list droplets, retrieve snapshots, delete snapshots, and create new ones.
- Snapshot threshold: By default, the workflow keeps no more than 4 snapshots per droplet. This can be adjusted by modifying the filter node conditions.
- Execution frequency: The cron node is set to run every 48 hours, but you can adjust the timing to suit your needs.

How to customize this workflow
- **Adjust snapshot limit**: Change the value in the filter node if you want to keep more or fewer snapshots.
- **Modify run frequency**: The workflow runs every 48 hours by default. You can change the frequency in the cron node to run more or less often.
- **Enhance with notifications**: You can add a notification node (e.g., Slack or email) to alert you when snapshots are deleted or created.

Workflow Summary
This workflow automates the management of DigitalOcean Droplet snapshots by keeping the number of snapshots under a defined limit, deleting the oldest ones, and ensuring new snapshots are created at regular intervals.
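The filter step boils down to sorting each droplet's snapshots by creation date and flagging everything beyond the retention limit. Below is a hedged JavaScript sketch of that retention logic, written as a Code node purely for illustration; the template itself expresses it through a filter node, and the created_at field name is an assumption based on DigitalOcean snapshot objects:

```javascript
// Illustrative only: the same retention rule as the filter node, written in a Code node.
// Assumes each incoming item is one snapshot object for the current droplet.
const KEEP = 4; // snapshots to retain per droplet

const snapshots = items
  .map((item) => item.json)
  .sort((a, b) => new Date(b.created_at) - new Date(a.created_at)); // newest first

// Everything after the newest KEEP snapshots is passed on for deletion.
const toDelete = snapshots.slice(KEEP);

return toDelete.map((snapshot) => ({ json: snapshot }));
```

Changing the KEEP value here corresponds to adjusting the filter node condition described in the setup section.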
by Lucía Maio Brioso
🧑‍💼 Who is this for?
This workflow is for any YouTube user who wants to bulk delete all playlists from their own channel — whether to start fresh, clean up old content, or prepare the account for a new purpose. It's useful for:
- Creators reorganizing their channel
- People transferring content to another account
- Anyone who wants to avoid deleting playlists manually one by one

🧠 What problem is this workflow solving?
YouTube does not offer a built-in way to delete multiple playlists at once. If you have dozens or hundreds of playlists, removing them manually is extremely time-consuming. This workflow automates the entire deletion process in seconds, saving you hours of repetitive effort.

⚙️ What this workflow does
- Connects to your YouTube account
- Fetches all playlists you've created (excluding system playlists)
- **Deletes them one by one** automatically

> ⚠️ This action is irreversible. Once a playlist is deleted, it cannot be recovered. Use with caution.

🛠️ Setup
1. 🔐 Create a YouTube OAuth2 credential in n8n for your channel.
2. 🧭 Assign the credential to both YouTube nodes.
3. ✅ Click "Test workflow" to execute.

> 🟨 By default, this workflow deletes everything. If you want to be more selective, see the customization tips below.

🧩 How to customize this workflow to your needs
- ✅ Add a confirmation flag: Insert a Set node with a custom field like confirm_delete = true, and follow it with an IF node to prevent accidental execution.
- ✂️ Delete only some playlists: Add a Filter node after fetching playlists — you can match by title, ID, or keyword (e.g. only delete playlists containing "old").
- 🛑 Add a pause before deletion: Insert a Wait or NoOp node to give you a moment to cancel before it runs.
- 🔁 Adapt to scheduled cleanups: Use a Cron trigger if you want to periodically clear temporary playlists.
by Sam Nesler
Syncs assignments and completion states back and forth between Canvas LMS and a Notion database. By default it triggers automatically every 2 hours during the school day (meaning 7 times a day), but it also supports manual refreshing via webhooks.

Setup
You'll need a few things to get started:
- A Canvas API key. You can generate one by going to your Canvas account settings and clicking on the "New Access Token" button. The URL looks like https://canvas.wisc.edu/profile/settings. You'll also need to replace the URLs in the Canvas nodes with your institution's domain, unless you're a student at UW-Madison. The Canvas nodes are all the HTTP Request nodes except the one labelled "OpenAI Categorization", which is an OpenAI node and will require a key in a later step.
- A Notion integration token. You can find this by going to your Notion integrations page and clicking "Create new integration". You can make it an "Internal Integration".
- A Notion database to sync to. I made a template for use with the workflow, but you can use any database that has the following fields:
  - Status (status): Status with at least the options "Not Started" and "Completed" - assignments start out "Not Started", and are marked "Completed" when they are submitted on Canvas.
  - Estimate (select): Select with at least the options "XS", "S", "M", "L", "XL" - this is where the estimated time to complete the assignment will be stored. Even if you don't use AI, assignments start out as "M".
  - Priority (select): Select with at least the options "Could Do", "Should Do", "Must Do" - assignments start out as "Should Do".
  - ID (text): this is where the ID of the assignment will be stored. It is used to sync without keeping a database on the server.
  - Due Date (date): this is where the due date of the assignment will be stored.
  - Class (text): this is where the name of the class will be stored.
  - Link (URL): this is where the link to the assignment will be stored.
- The ID of the Notion database you want to sync to. You can find this by clicking "Share" in the top right of your database and copying the link. The ID is the part of the link that comes after https://www.notion.so/ and before ?v=. So for https://www.notion.so/tsuniiverse/1976e99d91128076b034e7379464560f?v=1976e99d911281e7bd4b000c2cbec692&pvs=4, the ID would be 1976e99d91128076b034e7379464560f.
- An OpenAI key for assignment length estimation, or disable that node.

Manual Refreshing
Embed the production URL from the Webhook Trigger inside a "toggle list" or "toggle heading" in Notion, then expand the heading to refresh.
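If you prefer to pull the database ID out of the share link programmatically rather than by eye, the following small snippet (illustrative only, not part of the workflow) grabs the 32-character hex string that precedes ?v=:

```javascript
// Illustrative helper: extract the Notion database ID from a copied share link.
const url = 'https://www.notion.so/tsuniiverse/1976e99d91128076b034e7379464560f?v=1976e99d911281e7bd4b000c2cbec692&pvs=4';

// The ID is the 32-character hex string between the workspace slug and "?v=".
const match = url.match(/notion\.so\/(?:[^/?]+\/)?([0-9a-f]{32})/i);
console.log(match ? match[1] : 'no ID found'); // -> 1976e99d91128076b034e7379464560f
```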
by David Olusola
How It Works
The workflow is an automated appointment reminder system built on n8n. Here is a step-by-step breakdown of its process:

1. Reminder Webhook
This node acts as the entry point for the workflow. It's a unique URL that waits for data to be sent to it from an external application, such as a booking or scheduling platform. When a new appointment is created in that system, it sends a JSON payload to this webhook.

2. Extract Appointment Data
This is a Code node that processes the incoming data. It's a critical step that:
- Extracts the customer's name, phone number, appointment time, and service from the webhook's JSON payload.
- Includes validation to ensure a phone number is present, throwing an error if it's missing.
- Formats the raw appointment time into a human-readable string for the SMS message.

3. Send SMS Reminder
This node uses your Twilio credentials to send an SMS message. It dynamically constructs the message using the data extracted in the previous step. The message is personalized with the customer's name and includes the formatted appointment details.

Setup Instructions
1. Import the workflow: Copy the JSON code from the Canvas and import it into your n8n instance.
2. Connect your Twilio account: Click on the "Send SMS Reminder" node. In the "Credentials" section, either select your existing Twilio account or add new credentials by providing your Account SID and Auth Token from your Twilio console.
3. Find the webhook URL: Click on the "Reminder Webhook" node. The unique URL for this workflow will be displayed. Copy this URL.
4. Configure your booking system: Go to your booking or scheduling platform (e.g., Calendly, Acuity). In the settings or integrations section, find where you can add a new webhook and paste the URL you copied from n8n. You'll need to map the data fields from your booking system (customer name, phone, etc.) to match the expected format shown in the comments of the "Extract Appointment Data" node.

Once these steps are complete, your workflow will be ready to automatically send SMS reminders whenever a new appointment is created.
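For orientation, here is a simplified sketch of what the "Extract Appointment Data" Code node does. The payload field names (customer_name, phone, appointment_time, service) are assumptions for illustration; map them to whatever your booking system actually sends:

```javascript
// Simplified sketch of the "Extract Appointment Data" Code node.
// Field names below are assumed; adjust them to your booking system's payload.
const body = $input.first().json.body || $input.first().json;

const name = body.customer_name || 'there';
const phone = body.phone;
const service = body.service || 'your appointment';

// A reminder cannot be sent without a phone number, so fail loudly.
if (!phone) {
  throw new Error('Missing phone number in webhook payload');
}

// Format the raw appointment time into a human-readable string for the SMS.
const appointmentTime = new Date(body.appointment_time).toLocaleString('en-US', {
  dateStyle: 'full',
  timeStyle: 'short',
});

return [{ json: { name, phone, service, appointmentTime } }];
```

The Twilio node can then reference these fields (for example, name and appointmentTime) when building the personalized message text.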
by Angel Menendez
Streamline Case Management in TheHive via Slack!
Our TheHive Slack integration empowers SOC analysts by allowing them to efficiently manage and update case attributes directly within Slack, reducing the need to switch contexts and enhancing response time.

Key Features:
- **Direct Case Management**: Modify case details such as assignee, severity, status, and more through intuitive form inputs embedded within Slack messages.
- **Seamless Integration**: Assumes matching email addresses between TheHive and Slack users for straightforward assignee updates. Note: ensure email consistency to avoid assignment errors.
- **Instant Case Actions**: Quickly close cases as false positives or adjust threat levels with minimal clicks, directly impacting case status in TheHive and reflecting updates immediately in Slack.
- **Task Management**: Add tasks to cases through a user-friendly modal popup, fostering better task tracking and delegation within your team.

Operational Benefits:
- **Efficiency**: Enables analysts to perform multiple case actions without leaving Slack, streamlining workflows and saving valuable time.
- **Accuracy**: Reduces the chances of human error by providing a controlled interface for case updates.
- **Agility**: Enhances the SOC team's agility by providing tools for rapid response and case management, crucial for effective security operations.

Setup Tips:
- Verify that all SOC team members have matching email IDs in TheHive and Slack.
- Familiarize your team with the Slack form inputs and ensure they understand the importance of accurate data entry.
- Regularly review and update the integration settings to accommodate any changes in your security operations protocols.

Need Help?
For detailed setup instructions or troubleshooting, refer to our Integration Guide or reach out on our Support Forum.

Leverage this integration to maximize your SOC team's efficiency and responsiveness, ensuring that case management is as streamlined and effective as possible.
by Angel Menendez
Upload Public-Facing Images to an S3 Cloudflare Bucket via Slack Modal

🛠 Who is this for?
This workflow is for teams that use Slack for internal communication and need a streamlined way to upload public-facing images to an S3 Cloudflare bucket. It's especially beneficial for DevOps, marketing, or content management teams who frequently share assets and require efficient cloud storage integration.

💡 What problem does this workflow solve?
Manually uploading images to cloud storage can be time-consuming and disruptive, especially if you're already working in Slack. This workflow automates the process, allowing you to upload images directly from Slack via a modal popup. It reduces friction and keeps your workflow within a single platform.

🔍 What does this workflow do?
This workflow connects Slack with an S3 Cloudflare bucket to simplify the image-uploading process:
- **Slack Modal Interaction**: Users trigger a Slack modal to select images for upload.
- **Dynamic Folder Management**: Choose to create a new folder or use an existing one for uploads.
- **S3 Integration**: Automatically uploads the images to a specified S3 Cloudflare bucket.
- **Slack Confirmation**: After upload, Slack sends a confirmation with the uploaded file URLs.

🚀 Setup Instructions
Prerequisites:
- Slack bot with the following permissions: commands, files:write, files:read, chat:write
- Cloudflare S3 credentials: create an API token with write access to your S3 bucket.
- n8n instance: ensure n8n is properly set up with webhook capabilities.

Steps:
1. Configure the Slack bot: Set up a Slack app and enable the Events API. Add your n8n webhook URL to the Event Subscriptions section.
2. Add credentials: Add your Slack API and Cloudflare S3 credentials to n8n.
3. Customize the workflow: Open the Idea Selector Modal node and update the folder options to suit your needs. Update the Post Image to Channel node with your Slack channel ID.
4. Deploy the workflow: Activate the workflow and test it by triggering the Slack modal.

🛠 How to Customize This Workflow
- Adjust the Slack modal: Modify the modal layout in the Idea Selector Modal node to add additional fields or adjust the styling.
- Change the bucket structure: Update the Upload to S3 Bucket node to customize the folder paths or change naming conventions.

🔗 References and Helpful Links
- Slack API Documentation
- Cloudflare S3 Setup
- n8n Documentation

📓 Workflow Notes
Key features:
- **Slack Integration**: Uses Slack modal interactions to streamline the upload process.
- **Cloud Storage**: Automatically uploads to a Cloudflare S3 bucket.
- **User Feedback**: Sends a Slack message with file URLs upon successful upload.

Setup dependencies:
- Slack API token
- Cloudflare S3 credentials
- n8n webhook configuration

Sticky notes are embedded within the workflow to guide you through configuration and explain node functionality.

🌟 Why Use This Workflow?
This workflow keeps your image-uploading process intuitive, efficient, and fully integrated with tools you already use. By leveraging n8n's flexibility, you can ensure smooth collaboration and quick sharing of public-facing assets without switching contexts.
by Akram Kadri
Who is this template for?
This workflow template is designed for sales, marketing, and business development professionals who want a cost-effective and efficient way to generate leads. By leveraging n8n core nodes, it scrapes business emails from Google Maps without relying on third-party APIs or paid services, ensuring there are no additional costs involved. Ideal for small business owners, freelancers, and agencies, this template automates the process of collecting contact information for targeted outreach, making it a powerful tool for anyone looking to scale their lead generation efforts without incurring extra expenses.

You can watch the video tutorial here: https://youtu.be/HaiO-UeiKBA

How it works
This template streamlines email scraping from Google Maps using only n8n core nodes, ensuring a completely free and self-contained solution. Here's how it operates:
1. Input queries: You provide a list of queries, each consisting of keywords related to the type of business you want to target and the specific region or subregion you're interested in.
2. Iterates through queries: The workflow processes each query one at a time. For each query, it triggers a sub-workflow dedicated to handling the scraping tasks.
3. Scrapes Google Maps for URLs: Using these queries, the workflow scrapes Google Maps to collect URLs of business listings matching the provided criteria.
4. Fetches HTML content: The workflow then fetches the HTML pages of the collected URLs for further processing.
5. Extracts emails: Using a Code node with custom JavaScript, the workflow runs regular expressions on the HTML content to extract business email addresses.

Setup
1. Add queries: Open the first node, "Run Workflow", and input a list of queries, each containing the business keywords and the target region.
2. Configure the Google Sheets node: Open the Google Sheets node and select the document and specific sheet where the scraped results will be saved.
3. Run the workflow: Click "Test workflow" and watch your Google Sheets document gradually receive business email addresses.
4. Customize as needed: You can adjust the regular expressions in the Code node to refine the email extraction logic, or add logic to extract other kinds of information.
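As a rough idea of what the extraction step looks like, here is a hedged sketch of such a Code node. The regular expression and the json.data field name are assumptions for illustration, not necessarily what the template itself uses:

```javascript
// Illustrative email-extraction sketch for a Code node ("Run Once for All Items").
// Assumes each incoming item carries one page's raw HTML in item.json.data.
const emailRegex = /[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}/g;

const results = [];
for (const item of items) {
  const html = item.json.data || '';
  const matches = html.match(emailRegex) || [];

  // De-duplicate and drop common false positives such as image filenames.
  const emails = [...new Set(matches)].filter(
    (email) => !/\.(png|jpe?g|gif|svg|webp)$/i.test(email),
  );

  for (const email of emails) {
    results.push({ json: { email } });
  }
}

return results;
```

Tightening the regex, for example by requiring a known top-level-domain list, is one way to cut down on junk matches before writing results to Google Sheets.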
by AI/ML API | D1m7asis
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Generate Veo3 Videos via AI/ML API, Save to Google Drive and Upload to YouTube

Transform your content creation process by automating video generation with AI, publishing to YouTube, and logging results in Google Sheets. This workflow is ideal for content creators, marketers, and social media managers looking to streamline video production and distribution.

How it works
1. Trigger: Start the workflow manually or on a scheduled interval (e.g., every hour).
2. Generate video: Send a request to the AI/ML API to create video content based on predefined prompts and settings.
3. Monitor status: Poll the AI/ML API until the video generation is completed, with retry logic for reliability.
4. Download & upload: Retrieve the generated video file and upload it to your connected YouTube channel.
5. Title generation: Generate a YouTube title using AI to optimize for engagement.
6. Log results: Update a Google Sheet with video metadata and the YouTube URL for tracking and analytics.

Set up steps
1. Connect credentials: Add OAuth2 credentials for the AI/ML API, YouTube, and Google Sheets in n8n Credentials.
2. Configure nodes: Rename nodes for clarity (e.g., Generate Video, Upload to YouTube) and set up the HTTP Request node to use your AI/ML API credential.
3. Prepare the sheet: Create a Google Sheet with columns for Date, Prompt, Video ID, and YouTube URL.
4. Schedule: If using scheduling, configure the Schedule Trigger interval (e.g., every 60 minutes).
5. Test & deploy: Run a manual trigger; verify video generation, the upload, and the sheet entry; then activate the workflow for automation.
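The status-monitoring step is essentially a poll-and-wait loop against the generation job. The standalone Node.js sketch below illustrates that retry pattern only; the endpoint URL, auth header, and response fields are placeholders rather than the AI/ML API's actual schema, and the template itself may implement the loop with Wait and IF nodes instead:

```javascript
// Standalone Node.js (18+) sketch of the poll-until-complete idea.
// URL, API key, and response fields are placeholders, not the real AI/ML API schema.
const GENERATION_URL = 'https://api.example.com/v1/generations/GENERATION_ID'; // placeholder
const API_KEY = 'YOUR_AIML_API_KEY';                                           // placeholder

async function waitForVideo(maxAttempts = 20, delayMs = 30000) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const res = await fetch(GENERATION_URL, {
      headers: { Authorization: `Bearer ${API_KEY}` },
    });
    const job = await res.json();

    if (job.status === 'completed') return job;            // assumed status field
    if (job.status === 'failed') throw new Error('Generation failed');

    // Give the job time to finish before polling again.
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  throw new Error('Video generation did not complete within the polling window');
}

waitForVideo().then((job) => console.log('Done:', job));
```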
by Vadym Nahornyi
> ⚠️ Multi-language WhatsApp Error Notifier

Get instant WhatsApp alerts when any workflow fails — perfect for mobile-first monitoring and fast incident response.

✅ No coding required
✅ Works with any workflow via Error Workflow
✅ Step-by-step setup instructions included in: 🇬🇧 English, 🇪🇸 Español, 🇩🇪 Deutsch, 🇫🇷 Français, 🇷🇺 Русский

📦 What This Template Does
This template sends real-time WhatsApp notifications when a workflow fails. It uses the WhatsApp Business Cloud API to deliver a preformatted error message directly to your phone. The message includes:
- Workflow name
- Error message
- Last executed node

Example message:
Error on WorkFlow: {{ $json.workflow.name }}
Message: {{ $json.execution.error.message }}
lastNodeExecuted: {{ $json.execution.lastNodeExecuted }}

⚙️ Prerequisites
Before using this template, make sure you have:
- A verified Facebook Business account
- Access to the WhatsApp Business Cloud API
- A sender phone number (registered in Meta)
- An access token (used as credentials in n8n)
- A pre-approved message template (or be within the 24h session window)

More info from Meta Docs →

🚀 How to Use
1. Open the template and insert your WhatsApp credentials.
2. Enter your target phone number (e.g. your own) in international format.
3. Customize the message body if needed.
4. Save the workflow but do not activate it.
5. In any other workflow → open Settings → set this as your Error Workflow.

🌐 Multi-language Setup Guide Included
This template includes full setup instructions with screenshots and message formatting help in: 🇬🇧 English, 🇪🇸 Español, 🇩🇪 Deutsch, 🇫🇷 Français, 🇷🇺 Русский. Choose your language inside the embedded sticky note in the workflow.