by Belmont Digital
This n8n workflow verifies the deliverability of mailing addresses stored in Groundhogg CRM by integrating with Lob's address verification service.

**Who is this for?**
This template is designed for Groundhogg CRM users who need to ensure the accuracy of mailing addresses stored in their CRM.

**What problem is this workflow solving? / Use Case**
It addresses the challenge of maintaining accurate mailing addresses in CRM databases by verifying that each address is deliverable.

**What this workflow does**
1. A new contact is created in Groundhogg CRM.
2. A webhook is sent to n8n.
3. Lob verifies whether the address is deliverable.
4. The result is reported back to Groundhogg CRM.

**Set Up Steps**
- Watch the setup video: https://www.youtube.com/watch?v=nrV0P0Yz8FI
- Takes 10-30 minutes to set up.
- Accounts needed: Groundhogg CRM, a Lob account (https://www.lob.com — the free $0.00/mo tier includes 300 US address verifications), and n8n.
- Before using this template, ensure you have API keys for your Groundhogg CRM app and Lob, and set up authentication for both services within n8n.

**How to customize this workflow to your needs**
Adjust the trigger settings to match your Groundhogg CRM workflow configuration. You can also modify the actions taken based on the deliverability outcome, such as updating custom fields or sending notifications. A sketch of the underlying Lob call is shown below.
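The template performs the verification through an HTTP Request node; for reference, here is a minimal standalone sketch of that call against Lob's US verification endpoint (Node.js 18+, hypothetical LOB_API_KEY environment variable — confirm field names against Lob's documentation):

```javascript
// Minimal sketch of the Lob US verification call the workflow's HTTP Request
// node performs. Assumes Node.js 18+ (built-in fetch) and a LOB_API_KEY env var.
const key = process.env.LOB_API_KEY;

async function verifyAddress(address) {
  const res = await fetch("https://api.lob.com/v1/us_verifications", {
    method: "POST",
    headers: {
      // Lob uses HTTP Basic auth with the API key as the username, empty password
      Authorization: "Basic " + Buffer.from(`${key}:`).toString("base64"),
      "Content-Type": "application/json",
    },
    body: JSON.stringify(address),
  });
  const data = await res.json();
  // "deliverability" is the value the workflow reports back to the CRM,
  // e.g. "deliverable" or "undeliverable"
  return data.deliverability;
}

verifyAddress({
  primary_line: "185 Berry St",
  city: "San Francisco",
  state: "CA",
  zip_code: "94107",
}).then(console.log);
```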
by Belmont Digital
**Description**
This n8n workflow verifies the deliverability of mailing addresses stored in Keap/Infusionsoft by integrating with Lob's address verification service.

**Who is this for?**
This template is designed for Keap/Infusionsoft users who need to ensure the accuracy of mailing addresses stored in their CRM.

**What problem is this workflow solving? / Use Case**
It addresses the challenge of maintaining accurate mailing addresses in CRM databases by verifying that each address is deliverable.

**What this workflow does**
1. A new contact is created in Keap/Infusionsoft.
2. A webhook is sent to n8n.
3. Lob verifies whether the address is deliverable.
4. The result is reported back to Keap/Infusionsoft.

**Set Up Steps**
- Watch the setup video: https://www.youtube.com/watch?v=T7Baopubc-0
- Takes 10-30 minutes to set up.
- Accounts needed: Keap/Infusionsoft, a Lob account (https://www.lob.com — the free $0.00/mo tier includes 300 US address verifications), and n8n.
- Before using this template, ensure you have API keys for your Keap/Infusionsoft app and Lob, and set up authentication for both services within n8n.

**How to customize this workflow to your needs**
Adjust the trigger settings to match your Keap/Infusionsoft workflow configuration. You can also modify the actions taken based on the deliverability outcome, such as updating custom fields or sending notifications; a hypothetical sketch of branching on the deliverability result follows below.
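As a companion to the customization note above, here is a hypothetical sketch of how Lob's deliverability value could be mapped to a CRM update before the "report back" step. The field and tag names are illustrative only, not part of the template:

```javascript
// Hypothetical mapping from Lob's deliverability values to a CRM update,
// similar to what an IF/Switch or Code node could produce. Field and tag
// names here are illustrative assumptions, not the template's actual values.
function toCrmUpdate(deliverability) {
  const ok = [
    "deliverable",
    "deliverable_unnecessary_unit",
    "deliverable_incorrect_unit",
  ].includes(deliverability);

  return {
    address_status: ok ? "verified" : "needs_review", // hypothetical custom field
    apply_tag: ok ? "Address Verified" : "Address Invalid", // hypothetical tag
  };
}

console.log(toCrmUpdate("deliverable"));   // { address_status: "verified", ... }
console.log(toCrmUpdate("undeliverable")); // { address_status: "needs_review", ... }
```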
by Yaron Been
**Google Veo 3 Video Generator**

**Description**
Sound on: Google's flagship Veo 3 text-to-video model, with audio.

**Overview**
This n8n workflow integrates with the Replicate API to use the google/veo-3 model. This powerful AI model can generate high-quality video content based on your inputs.

**Features**
- Easy integration with the Replicate API
- Automated status checking and result retrieval
- Support for all model parameters
- Error handling and retry logic
- Clean output formatting

**Parameters**
Required:
- **prompt** (string): Text prompt for video generation

Optional:
- **seed** (integer, default: none): Random seed. Omit for random generations.
- **resolution** (string, default: 720p): Resolution of the generated video.
- **negative_prompt** (string, default: none): Description of what to discourage in the generated video.

**How to Use**
1. Set up your Replicate API key in the workflow.
2. Configure the required parameters for your use case.
3. Run the workflow to generate video content.
4. Access the generated output from the final node.

**API Reference**
- Model: google/veo-3
- API Endpoint: https://api.replicate.com/v1/predictions (see the request sketch below)

**Requirements**
- Replicate API key
- n8n instance
- Basic understanding of video generation parameters
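For orientation, here is a minimal standalone sketch of the prediction request the workflow issues, assuming Node.js 18+ and a REPLICATE_API_TOKEN environment variable. It uses Replicate's model-scoped variant of the prediction endpoint listed above; confirm the exact parameters against the model page:

```javascript
// Minimal sketch of the Replicate call behind this workflow, run outside n8n.
// Assumes Node.js 18+ and a REPLICATE_API_TOKEN env var; parameter names mirror
// the list above, but the model page holds the authoritative schema.
const token = process.env.REPLICATE_API_TOKEN;

async function createPrediction() {
  const res = await fetch(
    "https://api.replicate.com/v1/models/google/veo-3/predictions",
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${token}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        input: {
          prompt: "A drone shot over a foggy coastline at sunrise, with sound",
          resolution: "720p",
        },
      }),
    }
  );
  return res.json(); // contains id, status ("starting"), and URLs to poll
}

createPrediction().then((p) => console.log(p.id, p.status));
```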
by Akhil Varma Gadiraju
🚀 Form-Based X/Twitter Poster (v2)

A user-friendly n8n workflow that lets users submit tweets through a simple web form — with optional image, video, or GIF uploads — and posts them to a connected X/Twitter account. Designed for ease of use, it handles both media and text-only posts and provides clear feedback upon submission.

🧭 Overview
**Workflow Name:** Form-Based X/Twitter Poster (v2)
**Goal:** Provide a web form for users to create tweets, upload optional media, and post directly to X/Twitter.

🛠 How It Works

1. Form Submission Trigger
- **Node:** On form submission
- **Type:** formTrigger
- **Purpose:** Renders a web form for tweet creation.
- **Fields:** Post Content (required textarea for tweet text); Media (optional file upload: .jpg, .png, .gif, .mp4, etc.)
- **Button:** "Submit"
- **Output:** JSON with text and binary media (if any).

2. Extract Media Details
- **Node:** Extract Media Details
- **Type:** code
- **Purpose:** Extracts the tweet text, checks for media, and determines the media type (a sketch of this node's logic follows this description).
- **Output Example:** { "content": "My tweet!", "mime_type": "image/jpeg", "media_type": "IMAGE" }

3. If Media Exists
- **Node:** If Media Exists
- **Type:** if
- **Purpose:** Checks whether media was uploaded.
- **True Path:** Media was uploaded.
- **False Path:** No media uploaded.

4. Upload Media to X/Twitter (true path only)
- **Node:** Upload Media (X)
- **Type:** httpRequest
- **Purpose:** Uploads media to Twitter via API v1.1.
- **Media Category:** TWEET_IMAGE (can be customized)
- **Auth:** Twitter OAuth1 API
- **Output:** Includes media_id_string

5. Post Tweet with Media (true path)
- **Node:** X
- **Type:** twitter
- **Purpose:** Posts the tweet with the uploaded media.
- **Auth:** Twitter OAuth2 API

6. Post Text-Only Tweet (false path)
- **Node:** X1
- **Type:** twitter
- **Purpose:** Posts the tweet without media.
- **Auth:** Twitter OAuth2 API

7. Show Confirmation Message
- **Node:** End Form
- **Type:** form
- **Purpose:** Displays a thank-you message after submission.
- **Title:** Thank you so much for sharing your experience on X! 🖤
- **Message:** We truly appreciate your support and are so glad we could make a positive impact. Your words mean the world to us!

🛠 How to Customize
- **Form Fields:** Change the form title, labels, help texts, or accepted file formats.
- **Media Logic:** Add logic to distinguish GIF vs. VIDEO and adjust the media upload URL dynamically:
  https://upload.twitter.com/1.1/media/upload.json?media_category={{ $json.media_type === 'VIDEO' ? 'TWEET_VIDEO' : ($json.media_type === 'GIF' ? 'TWEET_GIF' : 'TWEET_IMAGE') }}
- **Error Handling:** Add Error Trigger nodes to catch and manage failures gracefully.
- **Tweet Text:** Customize the tweet text with extra formatting or default content.
- **Advanced Ideas:** Schedule tweets, post to multiple accounts, add content approval steps.

🔐 Required Credentials

1. Twitter OAuth1 API
- **Used by:** Upload Media (X)
- **Required for:** Media upload via v1.1
- **Credentials:** Consumer Key, Consumer Secret, Access Token, Access Token Secret
- **Workflow Credential Name:** X OAuth - Akhil

2. Twitter OAuth2 API
- **Used by:** X, X1
- **Required for:** Posting tweets
- **Scopes:** tweet.read, tweet.write, users.read, offline.access
- **Workflow Credential Name:** X OAuth2 - Akhil

💡 Use Cases
- **Easy Tweet Tool:** For non-technical users to share content.
- **Content Approval:** Internal review system before posting.
- **Announcements:** Quickly broadcast updates.
- **Campaign Posting:** Streamline recurring content sharing.

🧑‍💻 Node Naming Suggestions

| Old Name | Suggested Name |
|----------|----------------|
| If Image Exists | If Media Exists |
| X | Post Tweet with Media |
| X1 | Post Text-Only Tweet |

❤️ Made with love by Akhil using n8n
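The "Extract Media Details" step can be approximated with an n8n Code node along these lines. The form field label ("Post Content") and the binary property lookup are assumptions based on the form described above, so adjust them to your configuration:

```javascript
// Rough reconstruction of the "Extract Media Details" Code node
// (n8n Code node, "Run Once for All Items"). Field and binary property names
// depend on your form configuration — adjust as needed.
const item = $input.first();

const content = item.json["Post Content"];
const binaryKey = Object.keys(item.binary ?? {})[0]; // first uploaded file, if any
const mimeType = binaryKey ? item.binary[binaryKey].mimeType : null;

let mediaType = null;
if (mimeType) {
  if (mimeType === "image/gif") mediaType = "GIF";
  else if (mimeType.startsWith("video/")) mediaType = "VIDEO";
  else if (mimeType.startsWith("image/")) mediaType = "IMAGE";
}

return [{
  json: { content, mime_type: mimeType, media_type: mediaType },
  binary: item.binary, // pass the file through for the upload step
}];
```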
by Yaron Been
**Bytedance Seedance 1 Pro Video Generator**

**Description**
A pro version of Seedance that offers text-to-video and image-to-video support for 5s or 10s videos, at 480p and 1080p resolution.

**Overview**
This n8n workflow integrates with the Replicate API to use the bytedance/seedance-1-pro model. This powerful AI model can generate high-quality video content based on your inputs.

**Features**
- Easy integration with the Replicate API
- Automated status checking and result retrieval (see the polling sketch below)
- Support for all model parameters
- Error handling and retry logic
- Clean output formatting

**Parameters**
Required:
- **prompt** (string): Text prompt for video generation

Optional:
- **fps** (string, default: 24): Frame rate (frames per second)
- **seed** (integer, default: none): Random seed. Set for reproducible generation.
- **image** (string, default: none): Input image for image-to-video generation
- **duration** (string, default: 5): Video duration in seconds
- **resolution** (string, default: 1080p): Video resolution
- **aspect_ratio** (string, default: 16:9): Video aspect ratio. Ignored if an image is used.
- **camera_fixed** (boolean, default: false): Whether to fix the camera position

**How to Use**
1. Set up your Replicate API key in the workflow.
2. Configure the required parameters for your use case.
3. Run the workflow to generate video content.
4. Access the generated output from the final node.

**API Reference**
- Model: bytedance/seedance-1-pro
- API Endpoint: https://api.replicate.com/v1/predictions

**Requirements**
- Replicate API key
- n8n instance
- Basic understanding of video generation parameters
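The "automated status checking" feature corresponds to polling the prediction until it finishes. A rough standalone sketch of what the workflow's HTTP Request and Wait nodes do (Node.js 18+, REPLICATE_API_TOKEN assumed):

```javascript
// Sketch of the status-polling step (Node.js 18+, REPLICATE_API_TOKEN env var).
// The workflow's HTTP Request + Wait nodes do the same thing: fetch the
// prediction by id until it succeeds or fails.
const token = process.env.REPLICATE_API_TOKEN;

async function waitForPrediction(id, intervalMs = 5000) {
  while (true) {
    const res = await fetch(`https://api.replicate.com/v1/predictions/${id}`, {
      headers: { Authorization: `Bearer ${token}` },
    });
    const prediction = await res.json();

    if (prediction.status === "succeeded") return prediction.output; // video URL(s)
    if (["failed", "canceled"].includes(prediction.status)) {
      throw new Error(`Prediction ${id} ended with status ${prediction.status}`);
    }
    await new Promise((r) => setTimeout(r, intervalMs)); // pause before retrying
  }
}
```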
by Dataki
*Workflow updated on 17/06/2024: added a "Summarize" node to avoid creating a row for each Notion content block in the Supabase table.*

**Store Notion's Pages as Vector Documents into Supabase**

This workflow assumes you have a Supabase project with a table that has a vector column. If you don't have one, follow the instructions here: Supabase Langchain Guide.

**Workflow Description**
This workflow automates the process of storing Notion pages as vector documents in a Supabase database with a vector column. The steps are as follows:

1. **Notion Page Added Trigger:** Monitors a specified Notion database for newly added pages. You can create a dedicated Notion database into which you copy the pages you want to store in Supabase. (Node: Page Added in Notion Database)
2. **Retrieve Page Content:** Fetches all block content from the newly added Notion page. (Node: Get Blocks Content)
3. **Filter Non-Text Content:** Excludes blocks of type "image" and "video" to focus on textual content. (Node: Filter - Exclude Media Content)
4. **Summarize Content:** Concatenates the Notion blocks' content into a single text for embedding — see the sketch below. (Node: Summarize - Concatenate Notion's blocks content)
5. **Store in Supabase:** Stores the processed documents and their embeddings in a Supabase table with a vector column. (Node: Store Documents in Supabase)
6. **Generate Embeddings:** Uses OpenAI's API to generate embeddings for the textual content. (Node: Generate Text Embeddings)
7. **Create Metadata and Load Content:** Loads the block content and creates associated metadata, such as page ID and block ID. (Node: Load Block Content & Create Metadata)
8. **Split Content into Chunks:** Divides the text into smaller chunks for easier processing and embedding generation. (Node: Token Splitter)
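Step 4 (the Summarize node with a concatenate operation) can also be expressed as a Code node; a rough equivalent is sketched below, assuming each incoming item exposes its block text in a `content` field — adjust to your Notion node's actual output:

```javascript
// Rough Code-node equivalent of "Summarize - Concatenate Notion's blocks content"
// (n8n Code node, "Run Once for All Items"). Assumes each item's block text is
// in item.json.content — adjust the field name to your Notion node's output.
const text = $input
  .all()
  .map((item) => item.json.content ?? "")
  .filter((t) => t.trim() !== "")
  .join("\n"); // one text blob per page instead of one row per block

return [{ json: { concatenated_content: text } }];
```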
by Ludwig
**Overview**
This template helps n8n Cloud plan users export all executions to a CSV file for easy data analysis — for example, to identify which workflows generate the most executions or could be optimized.

**How this workflow works**
1. Click "Test Workflow" to manually execute the workflow.
2. Open the "Convert to CSV" node to access the binary data of the CSV file.
3. Download the CSV file.

Nodes included:
- n8n node
- Convert to File
- No Operation, do nothing (replace with another node of your choice)

**Set up steps**
1. Import the workflow to your workspace.
2. Add your n8n API credential.

A sketch of the underlying n8n API call is shown below.

**Benefits of Exporting n8n Cloud Executions to CSV**
Exporting n8n Cloud executions to CSV offers significant advantages for workflow management and data analysis. Here are three key benefits:

1. **Enhanced Data Analysis**
   - Comprehensive insights: execution data allows in-depth analysis of workflow performance, helping identify bottlenecks and optimize processes.
   - Custom reporting: CSV files can be imported into data analysis tools (e.g., Excel, Google Sheets, or BI software) to create custom reports and visualizations tailored to specific business needs.
2. **Improved Workflow Monitoring**
   - Historical data review: access to historical execution data lets users track workflow changes and their impact over time, facilitating better decision-making.
   - Error tracking and debugging: by reviewing execution logs, users can quickly identify and address errors or failures, ensuring smoother and more reliable workflow operations.
3. **Regulatory Compliance and Auditing**
   - Audit trails: keeping a record of all executions provides a clear audit trail, essential for regulatory compliance and internal audits.
   - Data retention: exported data ensures execution records are preserved according to organizational data retention policies, safeguarding against data loss.

By leveraging CSV exports, users can gain valuable insights, streamline workflow management, and ensure robust data handling practices, ultimately driving better performance and efficiency in their n8n Cloud operations.
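The n8n node in this template reads executions through the n8n public API. A standalone sketch of the equivalent call (Node.js 18+, with assumed N8N_BASE_URL and N8N_API_KEY environment variables):

```javascript
// Sketch of what the n8n node does under the hood: pull executions from the
// n8n public API. Assumes Node.js 18+ and N8N_BASE_URL / N8N_API_KEY env vars.
async function fetchExecutions(cursor) {
  const url = new URL("/api/v1/executions", process.env.N8N_BASE_URL);
  url.searchParams.set("limit", "250");
  if (cursor) url.searchParams.set("cursor", cursor);

  const res = await fetch(url, {
    headers: { "X-N8N-API-KEY": process.env.N8N_API_KEY },
  });
  return res.json(); // { data: [...executions], nextCursor }
}

// Paginate through all executions before handing them to a CSV converter.
(async () => {
  let cursor, all = [];
  do {
    const page = await fetchExecutions(cursor);
    all.push(...page.data);
    cursor = page.nextCursor;
  } while (cursor);
  console.log(`Fetched ${all.length} executions`);
})();
```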
by ConvertAPI
**Who is this for?**
Developers and organizations that need to convert HTML files to PDF.

**What problem is this workflow solving?**
The file format conversion problem.

**What this workflow does**
1. Converts the HTML content to an HTML file.
2. Converts the HTML file to PDF via ConvertAPI.
3. Stores the PDF file in the local file system.

**How to customize this workflow to your needs**
1. Open the HTTP Request node.
2. Adjust the URL parameter (all endpoints can be found here).
3. Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret.
4. Optionally, add extra Body Parameters for the converter.

A sketch of the equivalent ConvertAPI request is shown below.
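For orientation, here is a rough standalone sketch of the HTML-to-PDF request the HTTP Request node makes, using query-string auth ("Secret") as described above. Parameter names should be verified against ConvertAPI's documentation; this is an illustration, not the template's exact configuration:

```javascript
// Rough sketch of a ConvertAPI HTML-to-PDF request (Node.js 18+).
// Verify the converter's parameter names against ConvertAPI's docs.
const { writeFileSync } = require("node:fs");
const secret = process.env.CONVERTAPI_SECRET; // your ConvertAPI secret

async function htmlToPdf(htmlUrl) {
  const res = await fetch(
    `https://v2.convertapi.com/convert/html/to/pdf?Secret=${secret}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        Parameters: [{ Name: "File", FileValue: { Url: htmlUrl } }],
      }),
    }
  );
  const data = await res.json();
  // The converted file comes back base64-encoded in data.Files
  return Buffer.from(data.Files[0].FileData, "base64");
}

htmlToPdf("https://example.com/page.html").then((pdf) =>
  writeFileSync("page.pdf", pdf) // mirrors the "store in local file system" step
);
```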
by Paul Taylor
📩 Gmail → GPT → Supabase | Task Extractor

This n8n workflow automates the extraction of actionable tasks from unread Gmail messages using OpenAI's GPT API, stores the resulting task metadata in Supabase, and avoids re-processing previously handled emails.

✅ What It Does
1. Triggers on a schedule to check for unread emails in your Gmail inbox.
2. Loops through each email individually using SplitInBatches.
3. Checks Supabase to see if the email has already been processed.
4. If it's a new email:
   - Formats the email content into a structured GPT prompt
   - Calls GPT-4o to extract structured task data
   - Inserts the result into your emails table in Supabase

🧰 Prerequisites
Before using this workflow, you must have:
- An active n8n Cloud or self-hosted instance
- A connected Gmail account with OAuth credentials in n8n
- A Supabase project with an emails table and:
  ALTER TABLE emails ADD CONSTRAINT unique_email_id UNIQUE (email_id);
- An OpenAI API key with access to GPT-4o or GPT-3.5-turbo

🔐 Required Credentials

| Name | Type | Description |
|------|------|-------------|
| Gmail OAuth | Gmail | To pull unread messages |
| OpenAI API Key | OpenAI | To generate task summaries |
| Supabase API | HTTP | For inserting rows via REST API |

🔁 Environment Variables or Replacements
- Supabase_TaskManagement_URI → e.g., https://your-project.supabase.co
- Supabase_TaskManagement_ANON_KEY → Your Supabase anon key

These are used in the HTTP requests to Supabase; a sketch of those calls follows this description.

⏰ Scheduling / Trigger
- Triggered using a Schedule node
- Default: every X minutes (adjust to your preference)
- Uses a Gmail API filter: unread emails with label = INBOX

🧠 Intended Use Case
> Designed for productivity-minded professionals who want to extract, summarize, and store actionable tasks from incoming email — without processing the same email twice or wasting GPT API credits.

This is part of a larger system integrating GPT, calendar scheduling, and optional task platforms (like ClickUp).

📦 Output (Stored in Supabase)
Each processed email includes:
- email_id
- subject
- sender
- received_at
- body (email snippet)
- gpt_summary (structured task)
- requires_deep_work (from GPT logic)
- deleted (initially false)
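The Supabase dedup check and insert are plain REST (PostgREST) calls against the emails table. Here is a standalone sketch (Node.js 18+) using the URI and anon key placeholders above — treat it as an approximation of the template's HTTP Request nodes, not their exact configuration:

```javascript
// Approximation of the workflow's Supabase HTTP requests (PostgREST API).
const base = process.env.SUPABASE_TASKMANAGEMENT_URI;   // e.g. https://your-project.supabase.co
const key = process.env.SUPABASE_TASKMANAGEMENT_ANON_KEY;
const headers = {
  apikey: key,
  Authorization: `Bearer ${key}`,
  "Content-Type": "application/json",
};

// Has this Gmail message already been processed?
async function alreadyProcessed(emailId) {
  const res = await fetch(
    `${base}/rest/v1/emails?select=email_id&email_id=eq.${encodeURIComponent(emailId)}`,
    { headers }
  );
  return (await res.json()).length > 0;
}

// Insert the GPT-extracted row; column names must match your emails table.
async function insertEmail(row) {
  await fetch(`${base}/rest/v1/emails`, {
    method: "POST",
    headers: { ...headers, Prefer: "return=minimal" },
    body: JSON.stringify(row),
  });
}
```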
by Daniel Ng
This n8n workflow template uses community nodes and is only compatible with the self-hosted version of n8n.

**Auto Backup n8n Credentials to Google Drive**

This workflow automates the backup of all your n8n credentials. It can be triggered manually for on-demand backups or run automatically on a schedule (daily by default). It executes a command to export decrypted credentials, formats them into a JSON file, and uploads this file to a specified Google Drive folder. This process is essential for creating secure backups of your sensitive credential data, facilitating instance recovery or migration.

We recommend using this backup workflow in conjunction with a restore solution like our "Restore Credentials from Google Drive Backups" template. For more powerful n8n templates, visit our website or contact us at AI Automation Pro. We help your business build custom AI workflow automation and apps.

**Who is this for?**
This workflow is designed for n8n administrators and users who require a reliable method to back up their n8n credentials. It is particularly beneficial for those managing self-hosted n8n instances, where direct server access allows for command-line operations.

**What problem is this workflow solving? / use case**
Managing and backing up n8n credentials manually is tedious, error-prone, and often overlooked. This workflow provides an automated, secure, and consistent way to back up all credential data. The primary use case is to ensure a recovery point for credentials exists — safeguarding against data loss, assisting in instance migrations, and supporting general disaster recovery preparedness — ideally on a regular, automated basis.

**What this workflow does**
1. **Triggers:** The workflow includes two triggers: an "On Click Trigger" for on-demand execution, and a "Schedule Trigger" designed for automated daily backups.
2. **Export Credentials:** An "Execute Command" node runs the shell command npx n8n export:credentials --all --decrypted, which exports all credentials from the n8n instance in decrypted JSON format.
3. **Format JSON Data:** The command output is processed by a "Code" node ("JSON Formatting Data") that extracts, parses, and formats the JSON so it is well-structured (a sketch of this step follows this description).
4. **Aggregate Credentials:** An "Aggregate" node ("Aggregate Cridentials") combines individual credential entries into a single JSON array.
5. **Convert to File:** The "Convert To File" node transforms the aggregated JSON array into a binary file named n8n_backup_credentials.json.
6. **Upload to Google Drive:** The "Google Drive Upload File" node uploads the generated JSON file to a specified folder in Google Drive.

**Step-by-step setup**
1. **n8n instance environment:** The instance must have access to the npx command and the n8n CLI, and the "Execute Command" node must be able to run shell commands on the server where n8n is hosted.
2. **Google Drive credentials:** In the "Google Drive Upload File" node, select or create your Google Drive OAuth2 API credentials.
3. **Google Drive folder ID:** Update the folderId parameter in the "Google Drive Upload File" node with the ID of your desired Google Drive folder.
4. **File name (optional):** The backup file will be named n8n_backup_credentials.json. You can customize this in the "Google Drive Upload File" node.
5. **Configure Schedule Trigger:** Review the "Schedule Trigger" configuration to ensure it runs daily at your preferred time.

**How to customize this workflow to your needs**
- **Adjust schedule:** Fine-tune the "Schedule Trigger" for different intervals (e.g., weekly, hourly) or specific days/times.
- **Notifications:** Add notification nodes (e.g., Slack, Email, Discord) after the "Google Drive Upload File" node to receive alerts on successful backups or failures.
- **Enhanced error handling:** Incorporate error handling branches using "Error Trigger" nodes or conditional logic to manage potential failures.
- **Client-side encryption (advanced):** If your security policy requires the backup file itself to be encrypted at rest in Google Drive, add a step *before* uploading: insert a "Code" node or an "Execute Command" node with an encryption utility (such as GPG) to encrypt n8n_backup_credentials.json. Remember that you will then need a corresponding decryption process.
- **Dynamic file naming:** Modify the "Google Drive Upload File" node to include a timestamp in the filename (e.g., n8n_backup_credentials_{{$now.toFormat('yyyyMMddHHmmss')}}.json) to keep multiple backup versions.

**Important Note on Credential Security**
To simplify setup and use, the exported credentials are stored in the resulting JSON file in a decrypted state; the backup file itself is not further encrypted by this workflow. Consequently, it is critically important to:
- Ensure the Google Drive account used for backups is highly secure (e.g., strong password, two-factor authentication).
- Restrict access to the Google Drive folder where these backups are stored to only authorized personnel.
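The "JSON Formatting Data" step described above can be approximated with a Code node like the following. It assumes the Execute Command node exposes the CLI output as `stdout` and that the CLI prints a JSON array, so treat it as a sketch rather than the template's exact code:

```javascript
// Rough reconstruction of the "JSON Formatting Data" Code node: parse the
// stdout of `npx n8n export:credentials --all --decrypted` into one item per
// credential. Assumes the Execute Command node's output field is `stdout`
// and that the CLI prints a JSON array (possibly preceded by log lines).
const stdout = $input.first().json.stdout;

// Cut from the first "[" in case the CLI prints log lines before the JSON.
const jsonText = stdout.slice(stdout.indexOf("["));
const credentials = JSON.parse(jsonText);

return credentials.map((credential) => ({ json: credential }));
```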
by Daniel Ng
**Restore All n8n Workflows from Google Drive Backups**

Restoring multiple n8n workflows manually — especially when migrating your n8n instance to another host or server — is daunting and time-consuming. Individually exporting and importing hundreds of workflows is a recipe for errors and significant downtime.

This workflow provides a streamlined way to restore all your n8n workflows from backup JSON files stored in a designated Google Drive folder. It's an essential tool for disaster recovery, migrating workflows to a new n8n instance, or recovering from accidental deletions, and is ideally used in conjunction with a backup solution like our "Auto Backup Workflows To Google Drive" template.

For more powerful n8n templates, visit our website or contact us at AI Automation Pro. We help your business build custom AI workflow automation and apps.

**Who is this for?**
- **n8n users and administrators** who have previously backed up their n8n workflows as JSON files to Google Drive.
- **Anyone needing to recover their n8n setup,** whether due to system failure, data corruption, accidental deletions, or during an instance migration.

**What problem is this workflow solving? / use case**
Restoring multiple n8n workflows manually is slow and error-prone. This workflow solves that by:
- **Automating bulk restore:** Quickly re-imports all workflows from a specified Google Drive backup folder, drastically cutting down on manual effort.
- **Disaster recovery:** Enables rapid recovery of your automation environment, minimizing downtime after a system failure or data corruption.
- **Simplified instance migration:** Makes transferring your entire workflow suite to a new n8n server significantly more manageable and less error-prone than manual imports.
- **Data integrity:** Restores workflows to a known good state from your backups, ensuring consistency after a recovery or migration.

**What this workflow does**
1. **Manual trigger:** You initiate the workflow manually whenever a restore operation is needed.
2. **List backup files:** The workflow accesses a specific Google Drive folder (which you must configure) and lists all the files within it, assuming these are your n8n workflow JSON backup files.
3. **Iterate and process:** It loops through each file found in the folder:
   - **Download workflow:** Downloads the individual workflow JSON file from Google Drive.
   - **Extract content:** Parses the downloaded file to extract the JSON data representing the workflow.
   - **Import to n8n:** Uses the n8n API to create a new workflow (or update an existing one if an ID match is found) in your current n8n instance using the extracted JSON data (a sketch of this call follows this description).
   - **Wait step:** Pauses for 3 seconds after each workflow is created to manage system load and avoid potential API rate limiting.

**Step-by-step setup**
1. **Import template:** Upload the provided JSON file into your n8n instance.
2. **Configure credentials:**
   - Google Drive nodes: create or select existing Google Drive OAuth2 API credentials.
   - n8n node: configure your n8n API credentials to allow the workflow to create/update workflows in your instance.
3. **Specify the Google Drive backup folder (CRITICAL):** Open the "Google Drive Get All Workflows" node and locate the "Folder ID" parameter in the "Filter" section. The default value is a placeholder URL; you MUST change it to the direct URL of the Google Drive folder that contains your n8n workflow .json backup files. This would typically be one of the hourly folders (e.g., n8n_backup_YYYY-MM-DD_HH) created by the companion backup workflow.
4. **Activate workflow:** Although manually triggered, the workflow needs to be active in your n8n instance to be runnable.

**How to customize this workflow to your needs**
- **Selective restore:**
  - Option 1 (manual): Before running the workflow, move only the specific workflow JSON files you want to restore into the source Google Drive folder configured in the "Google Drive Get All Workflows" node.
  - Option 2 (automated filter): Insert an "Edit Fields" or "Filter" node after the "Google Drive Get All Workflows" node to programmatically select which files (e.g., based on filename patterns) proceed to the "Loop Over Items" node.
- **Adjust wait time:** The "Wait" node is set to 3 seconds. Increase it for very large numbers of workflows or if your n8n instance needs more time between API calls; for smaller batches on powerful instances, you might decrease it.
- **Error handling:** For enhanced robustness, add error handling branches (e.g., "Error Trigger" nodes or "Continue on Fail" settings) to log or send notifications if a specific workflow fails to import.

**Important considerations**
- **Workflow overwriting/updating:** If a workflow with the same id as one in a backup JSON file already exists in your n8n instance, the restore will typically update/overwrite it with the version from the backup. If the id does not correspond to any existing workflow, a new workflow is created.
- **Idempotency:** Running this workflow multiple times on the same backup folder re-processes all files — existing workflows are updated/overwritten again and missing ones are created. Ensure this is the intended behavior.
- **Companion backup workflow:** This restore workflow is ideally paired with backups created by a process like our "Auto Backup Workflows To Google Drive" template, which saves workflows in the expected JSON format.
- **Test safely:** It's highly recommended to test this workflow on a non-production or development n8n instance first, especially when restoring many critical workflows or if you're unsure about the overwrite behavior in your setup.
- **Source folder content:** Ensure the specified Google Drive folder only contains n8n workflow JSON files you intend to restore; other file types may cause errors in the "Extract from File" node.
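The "Import to n8n" step corresponds to a call against the n8n public API. Here is a standalone sketch of the create case (Node.js 18+, assumed N8N_BASE_URL and N8N_API_KEY environment variables); the template's n8n node can also update an existing workflow by id:

```javascript
// Sketch of the import step: create a workflow from a backup JSON via the
// n8n public API. Forwards only the core properties, since the API rejects
// read-only fields from exported files.
async function importWorkflow(backup) {
  const body = {
    name: backup.name,
    nodes: backup.nodes,
    connections: backup.connections,
    settings: backup.settings ?? {},
  };

  const res = await fetch(`${process.env.N8N_BASE_URL}/api/v1/workflows`, {
    method: "POST",
    headers: {
      "X-N8N-API-KEY": process.env.N8N_API_KEY,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(body),
  });
  return res.json(); // the created workflow, including its new id
}
```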
by Anthony
This n8n workflow automates company research by gathering data such as traffic volume, founding details, funding information, founders, and more. It leverages the ProspectLens API, which is particularly useful for researching companies commonly found on Crunchbase and LinkedIn.

ProspectLens is an API that provides very detailed company data — all you need to supply is the company's domain name. You can obtain your ProspectLens API key here: https://apiroad.net/marketplace/apis/prospectlens

Setup:
1. In n8n, create a new "HTTP Header" credential.
2. Set x-apiroad-key as the "Name" and enter your APIRoad API key as the "Value".
3. Use this credential in the HTTP Request node of the workflow.

A sketch of the request is shown below.
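A standalone sketch of the call the HTTP Request node makes (Node.js 18+, assumed APIROAD_KEY environment variable). The endpoint URL and query parameter name below are placeholders — copy the real ones from the ProspectLens page on the APIRoad marketplace:

```javascript
// Sketch of the ProspectLens lookup. The endpoint URL and the "domain"
// query parameter name are PLACEHOLDER assumptions — use the values shown
// on the APIRoad marketplace page. Only the x-apiroad-key header is taken
// from the setup steps above.
const ENDPOINT = "https://REPLACE-WITH-PROSPECTLENS-ENDPOINT"; // placeholder

async function lookupCompany(domain) {
  const url = new URL(ENDPOINT);
  url.searchParams.set("domain", domain); // ProspectLens only needs the company's domain

  const res = await fetch(url, {
    headers: { "x-apiroad-key": process.env.APIROAD_KEY },
  });
  return res.json(); // traffic, funding, founders, and other company data
}

lookupCompany("example.com").then(console.log);
```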