by Emmanuel Bernard
Automatically Add Captions to Your Video

**Who Is This For?**
This workflow is ideal for content creators, marketers, educators, and businesses that regularly produce video content and want to enhance accessibility and viewer engagement by effortlessly adding subtitles.

**What Problem Does This Workflow Solve?**
Manually adding subtitles or captions to videos can be tedious and time-consuming. Accurate captions significantly boost viewer retention, accessibility, and SEO rankings.

**What Does This Workflow Do?**
This automated workflow quickly adds accurate subtitles to your video content by leveraging the Json2Video API. It accepts a publicly accessible video URL as input and makes an HTTP request to Json2Video, where AI analyzes the video, generates captions, and applies them seamlessly. The workflow returns a URL to the final subtitled video. The second part of the workflow polls the Json2Video API every 10 seconds to monitor the processing status.

Try Json2Video for Free

**Key Features**
- **Automatic & Synced Captions:** Captions are generated automatically and synchronized perfectly with your video.
- **Fully Customizable Design:** Easily adjust fonts, colors, sizes, and more to match your unique style.
- **Word-by-Word Display:** Supports precise, word-by-word captioning for improved clarity and viewer engagement.
- **Super Fast Processing:** Rapid caption generation saves time, allowing you to focus more on creating great content.

**Preconditions**
To use this workflow, you must have:
- A Json2Video API account.
- A video hosted at a publicly accessible URL.

**Why You Need This Workflow**
Adding subtitles to your videos significantly enhances their reach and effectiveness by:
- Improving SEO visibility, enabling search engines to effectively index your video content.
- Enhancing viewer engagement and accessibility, accommodating viewers who watch without sound or who have hearing impairments.
- Streamlining your content production process, allowing more focus on creativity.
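The status-polling logic in the second part of the workflow can be sketched as a small decision function. This is a minimal sketch assuming a response shaped like `{ movie: { status, url } }` with statuses such as "running", "done", and "error"; check the Json2Video API documentation for the exact field names your project returns.

```javascript
// Sketch of the 10-second polling decision described above.
// NOTE: the response shape ({ movie: { status, url } }) and the
// status values are assumptions for illustration only.
function nextAction(statusResponse) {
  const status = statusResponse?.movie?.status;
  if (status === "done") {
    return { action: "finish", url: statusResponse.movie.url };
  }
  if (status === "error") {
    return { action: "fail", message: statusResponse.movie.message };
  }
  // Still rendering: wait 10 seconds and poll again.
  return { action: "wait", retryInMs: 10_000 };
}
```

In the workflow this decision is made by an If node after each status request, with a Wait node providing the 10-second delay.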
**Specific Use Cases**
- **Social Media Content:** Boost viewer retention by adding subtitles.
- **Educational Videos:** Enhance understanding and improve learning outcomes.
- **Marketing Videos:** Reach broader and more diverse audiences.
by Harshil Agrawal
This workflow gets the top 5 products from Product Hunt and shares them on a Discord server.
- Cron node: triggers the workflow every hour. Based on your use case, you can update the node to trigger the workflow at a different time.
- GraphQL node: makes the API call to the Product Hunt GraphQL API. You will need an API token from Product Hunt to make the call.
- Item Lists node: transforms the single item returned by the previous node into multiple items.
- Set node: returns only the name, description, and votes of each product.
- Discord node: sends the top 5 products to the Discord server.
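In plain JavaScript, the work done by the Item Lists and Set nodes amounts to splitting one GraphQL response into per-product items and keeping three fields. This is a sketch only: the response shape below mirrors a typical GraphQL connection (`posts.edges[].node`), and the field names (`tagline`, `votesCount`) are assumptions, not necessarily Product Hunt's exact schema.

```javascript
// Sketch of the Item Lists + Set steps: one response item in,
// one trimmed item per product out. Field names are illustrative.
function topProducts(response) {
  return response.data.posts.edges.map(({ node }) => ({
    name: node.name,
    description: node.tagline,
    votes: node.votesCount,
  }));
}
```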
by Harshil Agrawal
This workflow demonstrates the use of the $item(index) method. This method is useful when you want to reference an item at a particular index. This example workflow makes POST HTTP requests to a dummy URL.
- Set node: sets the API key that will be used later in the workflow. This node returns a single item. It can be replaced with other nodes, based on the use case.
- Customer Datastore node: returns the data of customers that will be sent in the body of the HTTP request. This node returns 5 items. It can be replaced with other nodes, based on the use case.
- HTTP Request node: uses the information from both the Set node and the Customer Datastore node. Since the node runs 5 times, once for each item of the Customer Datastore node, the API key is needed for each request. However, the Set node returns the API key only once. Using the expression {{ $item(0).$node["Set"].json["apiKey"] }}, you tell n8n to use the same API key for all 5 requests.
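The behavior of the expression can be illustrated in plain JavaScript: one item from the Set node is pinned at index 0 and reused for every item of the Customer Datastore node. The function below is a sketch, not n8n internals.

```javascript
// Plain-JavaScript analogue of {{ $item(0).$node["Set"].json["apiKey"] }}:
// always read the Set node's item at index 0, regardless of which
// customer item the HTTP Request node is currently processing.
function buildRequests(setItems, customerItems) {
  const apiKey = setItems[0].json.apiKey; // fixed index 0, like $item(0)
  return customerItems.map((customer) => ({
    apiKey,
    body: customer.json,
  }));
}
```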
by Harshil Agrawal
This workflow updates your Twitter profile banner when you have a new follower. To use this workflow:
- Configure Header Auth in the Fetch New Followers node to connect to your Twitter account.
- Update the URL of the template image in the Fetch BG node.
- Create and configure your Twitter OAuth 1.0 credentials in the last HTTP Request node.
You can configure the size and position of the avatar images in the Edit Image nodes. Check out this video to learn how to build it from scratch: How to automatically update your Twitter Profile Banner
by Tom
This workflow automatically deletes user data from different apps/services when a specific slash command is issued in Slack. Watch this talk and demo to learn more about this use case. The demo uses Slack, but Mattermost is Slack-compatible, so you can also connect Mattermost in this workflow.

Prerequisites
- Accounts and credentials for the apps/services you want to use.
- Some basic knowledge of JavaScript.

Nodes
- Webhook node triggers the workflow when a Slack slash command is issued.
- IF nodes confirm Slack's verification token and verify that the data has the expected format.
- Set node simplifies the payload.
- Switch node chooses the correct path for the operation to perform.
- Respond to Webhook nodes send responses back to Slack.
- Execute Workflow nodes call sub-workflows tailored to deleting data from each individual service.
- Function node, Crypto node, and Airtable node generate and store a log entry containing a hash value.
- HTTP Request node sends the final response back to Slack.
by Samir Saci
Tags: Supply Chain Management, Logistics, Transportation, Data Transmission

Context
Hey! I'm Samir, a Supply Chain Engineer and Data Scientist from Paris and founder of LogiGreen Consulting. We help small and medium businesses improve their logistics processes using AI, data analytics, and automation.

> Sustainable and efficient supply chains with n8n!

For business inquiries, you can add me here.

What is an EDI Message?
Electronic Data Interchange (EDI) is a standardized method of automatically transferring data between computer systems. EDI messages ensure the smooth flow of essential transactional data, such as purchase orders, invoices, shipping notices, and more. For instance, a manufacturing company can receive purchase orders from a retailer via EDI. However, transmitting and processing these messages normally requires complex integrations.

Who is this template for?
This workflow template is designed for small companies that cannot connect directly to their customers' systems and need to manually process the EDI messages they receive.

How does it work?
This workflow uses a Gmail Trigger that analyzes all incoming emails.
- Gmail Trigger: detects emails with "EDI" in the subject.
- Parse EDI Message: uses a JavaScript Code node to extract structured data.
- Format the Data: converts it into a table-friendly format.
- Update Google Sheets: automatically logs the processed orders.

Prerequisites
This workflow does not require any additional paid subscription.
- A Google Drive account with a folder including a Google Sheet
- API credentials: Google Drive API, Google Sheets API, and Gmail API
- A Google Sheet to store the shipment records. You do not need to prepare the columns.

Next Steps
Follow the sticky notes to set up the parameters inside each node and get ready to improve your logistics operations!

Watch the Step-by-Step Guide: Check My Tutorial

Interested in applications of n8n for Logistics & Supply Chain Management?
Let's connect on LinkedIn.

Notes
This template includes an example of an EDI message to test the workflow. If you want to learn more about Electronic Data Interchange: Blog Article about Electronic Data Interchange (EDI)

This workflow was created with n8n 1.82.1. Submitted: March 19th, 2025
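The "Parse EDI Message" Code node described above can be sketched as a small segment parser. This assumes an X12-style message where segments end with "~" and elements are separated by "*"; real EDI parsing depends on the document type (e.g. an 850 purchase order) and the delimiters agreed with your customer, so treat this as an illustrative starting point.

```javascript
// Minimal sketch of parsing an X12-style EDI message into structured
// rows: one object per segment, with the segment tag and its elements.
// Delimiters ("~" and "*") are assumptions for illustration.
function parseEdiSegments(message) {
  return message
    .split("~")
    .map((segment) => segment.trim())
    .filter((segment) => segment.length > 0)
    .map((segment) => {
      const [tag, ...elements] = segment.split("*");
      return { tag, elements };
    });
}
```

A later step would map known tags (e.g. line-item segments) into the table-friendly rows written to Google Sheets.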
by Davide
This workflow enables users to perform web searches directly from Telegram using the Brave search engine. By simply sending the command /brave followed by a query, the workflow retrieves search results from Brave and returns them as a Telegram message. This workflow is ideal for users who want a quick and private way to search the web without switching between apps. It is a powerful tool for automating interactions with Brave tools through Telegram, providing users with quick and easy access to information directly in their chat. Below is a breakdown of the workflow:

1. How It Works
The workflow processes user queries from Telegram, executes a Brave tool via the MCP Client, and sends the results back to the user:
- Telegram Trigger: listens for new messages in a Telegram chat. When a message is received, the workflow checks if it starts with the command /brave.
- Filter Messages: the If node filters messages that start with /brave. If the message doesn't start with /brave, the workflow stops.
- Edit Fields: extracts the text of the message for further processing.
- Clean Query: removes the /brave command from the message, leaving only the user's query.
- List Brave Tools: retrieves the list of available tools from the MCP Client.
- Exec Brave Tool: executes the first tool in the list using the cleaned query as input.
- Send Message: sends the result of the Brave tool execution back to the user in the Telegram chat.

2. Preliminary Steps
You need access to an n8n self-hosted instance with the Community node "n8n-nodes-mcp" installed.
Please see this easy guide.
- Get your Brave Search API key: https://brave.com/search/api/
- Get a Telegram bot access token.
- In "List Brave Tools", create a new credential as shown in this image. In the Environment field, set this value: BRAVE_API_KEY=your-api-key

3. Set Up Steps
To set up and use this workflow in n8n, follow these steps:
- Telegram Configuration: set up Telegram credentials in n8n for the Telegram Trigger and Send Message nodes. Ensure the Telegram bot is authorized to read messages and send responses in the chat.
- MCP Client Configuration: set up MCP Client credentials in n8n for the List Brave Tools and Exec Brave Tool nodes. Ensure the MCP Client is configured to interact with Brave tools.
- Test the Workflow: send a message starting with /brave followed by a query (e.g., /brave search for AI tools) to the Telegram chat. The workflow will process the query, execute the Brave tool via the MCP Client, and send the result back to the Telegram chat.
- Optional Customization: modify the workflow to include additional features, such as adding more commands or tools, integrating with other APIs or services for advanced use cases, or sending notifications via other channels (e.g., email, Slack).

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
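The filter-and-clean step (the If node plus the Clean Query node) boils down to a few lines of JavaScript. This is a sketch of the logic, not the nodes' exact configuration:

```javascript
// Sketch of the If + Clean Query steps: keep only messages that start
// with /brave, and strip the command to leave the user's query.
function extractBraveQuery(messageText) {
  if (!messageText.startsWith("/brave")) return null; // If node: stop here
  return messageText.slice("/brave".length).trim();   // Clean Query node
}
```

For example, "/brave search for AI tools" yields the query "search for AI tools", which is then passed to the Brave tool.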
by bangank36
This workflow retrieves all Shopify Customers and saves them into a Google Sheets spreadsheet using the Shopify Admin REST API. It uses pagination to ensure all customers are collected efficiently. n8n does not have built-in actions for Customers, so I built the workflow using an HTTP Request node.

How It Works
This workflow uses the HTTP Request node to fetch paginated chunks manually. Shopify uses cursor-based pagination (page_info) instead of traditional page numbers. Pagination data is stored in the response headers, so we need to enable Include Response Headers and Status in the HTTP Request node. The workflow processes customer data, saves it to Google Sheets, and formats a compatible CSV for Squarespace Contacts import. This workflow can be run on demand or scheduled to keep your data up to date.

Parameters
You can adjust these parameters in the HTTP Request node:
- **limit**: the number of customers per request (default: 50, max: 250).
- **fields**: comma-separated list of fields to retrieve.
- **page_info**: used for pagination.

Note: when you query paginated chunks with page_info, only the limit and fields parameters are allowed.

Credentials
- **Shopify API Key**: required for authentication.
- **Google Sheets API credentials**: needed to insert data into the spreadsheet.

Google Sheets Template
Clone this spreadsheet: Google Sheets Template

According to Squarespace documentation, your spreadsheet can have up to three columns and must be arranged in this order (no header):
- Email Address
- First Name (optional)
- Last Name (optional)
- Shopify Customer ID (this field will be ignored)

Exporting a Compatible CSV for Squarespace Contacts
This workflow also generates a CSV file that can be imported into Squarespace Contacts.

How to Import the CSV to Squarespace:
- Open the Lists & Segments panel and click on your mailing list.
- Click Add Subscribers, then select Upload a list.
- Click Add a CSV file and select the file to import.
- Toggle "These subscribers accept marketing" to confirm permission.
- Preview your list, then click Import.

Who Is This For?
- **Shopify store owners** who need to export all customers to Google Sheets.
- Anyone looking for a **flexible and scalable** Shopify customer extraction solution.
- **Squarespace website owners** who want to bulk-create their Contacts using CSV.

Explore More Templates
Check out my other n8n templates
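The cursor-pagination step described above hinges on reading the next page_info token out of Shopify's Link response header. The sketch below shows one way to do that in a Code node; the header format (`<url>; rel="next"`) follows Shopify's documented cursor pagination, but treat the parsing details as an illustrative sketch.

```javascript
// Sketch: extract the next page_info cursor from Shopify's Link header
// (visible once "Include Response Headers and Status" is enabled on the
// HTTP Request node). Returns null when there is no next page.
function nextPageInfo(linkHeader) {
  if (!linkHeader) return null;
  for (const part of linkHeader.split(",")) {
    const match = part.match(/<[^>]*[?&]page_info=([^&>]+)[^>]*>;\s*rel="next"/);
    if (match) return match[1];
  }
  return null;
}
```

The workflow loops: request a page with `limit` and the current `page_info`, append the customers to the sheet, then repeat until `nextPageInfo` returns null.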
by ParquetReader
Convert Parquet, Feather, ORC & Avro Files with ParquetReader

This workflow allows you to upload and inspect Parquet, Feather, ORC, or Avro files via the ParquetReader API. It instantly returns a structured JSON preview of your data, including rows, schema, and metadata, without needing to write any custom code.

Perfect For
- Validating schema and structure before syncing or transformation
- Previewing raw columnar files on the fly
- Automating QA, ETL, or CI/CD workflows
- Converting Parquet, Avro, Feather, or ORC to JSON

Use Cases
- Catch schema mismatches before pipeline runs
- Automate column audits in incoming data files
- Enrich metadata catalogs with real-time schema detection
- Integrate file validation into automated workflows

How to Use This Workflow

Trigger via File Upload
You can trigger this flow by sending a POST request with a file using curl, Postman, or from another n8n flow.

Example (via curl):

```shell
curl -X POST http://localhost:5678/webhook-test/convert \
  -F "file=@converted.parquet"
```

> Replace converted.parquet with your local file path. You can also send Avro, ORC, or Feather files.

Reuse from Other Flows
You can reuse this flow by calling the webhook from another n8n workflow using an HTTP Request node. Make sure to send the file as form-data with the field name file.
What This Flow Does
- Receives the uploaded file via webhook (file)
- Sends it to https://api.parquetreader.com/parquet as multipart/form-data (field name: file)
- Receives parsed data (rows), schema, and metadata in JSON format

Example JSON Response from this flow

```json
{
  "data": [
    {
      "full_name": "Pamela Cabrera",
      "email": "bobbyharrison@example.net",
      "age": "24",
      "active": "True",
      "latitude": "-36.1577385",
      "longitude": "63.014954",
      "company": "Carter, Shaw and Parks",
      "country": "Honduras"
    }
  ],
  "meta_data": {
    "created_by": "pyarrow",
    "num_columns": 21,
    "num_rows": 10,
    "serialized_size": 7598,
    "format_version": "0.12"
  },
  "schema": [
    { "column_name": "full_name", "column_type": "string" },
    { "column_name": "email", "column_type": "string" },
    { "column_name": "age", "column_type": "int64" },
    { "column_name": "active", "column_type": "bool" },
    { "column_name": "latitude", "column_type": "double" },
    { "column_name": "longitude", "column_type": "double" },
    { "column_name": "company", "column_type": "string" },
    { "column_name": "country", "column_type": "string" }
  ]
}
```

API Info
- Authentication: none required
- Supported formats: .parquet, .avro, .orc, .feather
- Free usage: no signup needed; the API is currently open to the public
- Limits: usage and file size limits may apply in the future (TBD)
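For the schema-validation use cases listed above, a small helper in a Code node can turn the `schema` array from the response into a column-to-type lookup. A sketch based on the example response shown in this section:

```javascript
// Turn the ParquetReader response's `schema` array into a
// { column_name: column_type } lookup, e.g. to assert expected
// column types in a QA step before a pipeline run.
function schemaLookup(response) {
  return Object.fromEntries(
    response.schema.map((col) => [col.column_name, col.column_type])
  );
}
```

With the lookup in hand, a downstream If node can fail the run when, say, `lookup.age !== "int64"`.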
by Jonathan
This workflow is part of an MSP collection, which is publicly available on GitHub. This workflow archives or unarchives Clockify projects, depending on a Syncro status. Note that Syncro should be set up with a webhook via 'Notification Set for Ticket - Status was changed'. The workflow doesn't handle merging of tickets: as Syncro doesn't support a 'Notification Set' for merged tickets, you should change a ticket to 'Resolved' before merging it.

Prerequisites
A Clockify account and credentials

Nodes
- Webhook node triggers the workflow.
- IF node filters projects that don't have the status 'Resolved'.
- Clockify nodes get all projects that do (or don't) have the status 'Resolved', based on the IF route.
- HTTP Request nodes unarchive unresolved projects and archive resolved projects, respectively.
by Tom
This is the workflow powering the n8n demo shown at StrapiConf 2022. The workflow searches matching Tweets every 30 minutes using the Interval node and listens to Form submissions using the Webhook node. Sentiment analysis is handled by Google using the Google Cloud Natural Language node before the result is stored in Strapi using the Strapi node. (These were originally two separate workflows that have been combined into one to simplify sharing.)
by Airtop
About The LinkedIn Profile Discovery Automation
Are you tired of manually searching for LinkedIn profiles or paying expensive data providers for often outdated information? If you spend countless hours trying to find accurate LinkedIn URLs for your prospects or candidates, this automation will change your workflow forever. Just give this workflow the information you have about a contact, and it will automatically augment it with a LinkedIn profile.

How to Find a LinkedIn Profile Link
In this guide, you'll learn how to automate LinkedIn profile link discovery using Airtop's built-in node in n8n. Using this automation, you'll have a fully automated workflow that saves you hours of manual searching while providing accurate, validated LinkedIn URLs.

What You'll Need
- A free Airtop API key
- A Google Workspace account. If you have a Gmail account, you're all set.
- Estimated setup time: 10 minutes

Understanding the Process
This automation leverages the power of intelligent search algorithms combined with LinkedIn validation to ensure accuracy. Here's how it works:
- Takes your input data (name, company, etc.) and constructs intelligent search queries
- Utilizes Google search to identify potential LinkedIn profile URLs
- Validates the discovered URLs directly against LinkedIn to ensure accuracy
- Returns confirmed, accurate LinkedIn profile URLs

Setting Up Your Automation
Getting started with this automation is straightforward:

Prepare Your Google Sheet
Create a new Google Sheet with columns for input data (name, company, domain, etc.)
Add columns for the output LinkedIn URL and validation status (see this example)

Configure the Automation
- Connect your Google Workspace account to n8n if you haven't already
- Add your Airtop API credentials
- (Optionally) configure your Airtop Profile and sign in to LinkedIn in order to validate profile URLs

Run Your First Test
- Add a few test entries to your Google Sheet
- Run the workflow
- Check the results in your output columns

Customization Options
While the default setup uses Google Sheets, this automation is highly flexible:
- **Webhook Integration**: perfect for integrating with tools like Clay, Instantly, or your custom applications
- **Alternatives**: replace Google Sheets with Airtable, Notion, or any other tools you already use for more robust database capabilities
- **Custom Output Formatting**: modify the output structure to match your existing systems
- **Batch Processing**: configure for bulk processing of multiple profiles

Real-World Applications
This automation has the potential to transform how organizations handle profile enrichment.

Recruiting Firm Success Story
With this automation, a recruiting firm could save hundreds of dollars a month in data enrichment fees, achieve better accuracy, and eliminate subscription costs. They would also be able to process thousands of profiles weekly with near-perfect accuracy.

Sales Team Integration
A B2B sales team could integrate this automation with their CRM, automatically enriching new leads with validated LinkedIn profiles and saving their SDRs hours per week on manual research.
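The query-construction step described under "Understanding the Process" can be sketched as a small function that combines the input fields into a site-restricted Google search. This is an illustrative sketch: which fields exist (name, company, location) depends on your sheet's columns, and the actual workflow builds its queries inside the Airtop node.

```javascript
// Sketch: build a site-restricted Google query from the input row.
// Field names (name, company, location) are assumptions matching the
// example sheet columns, not the workflow's exact configuration.
function buildLinkedInQuery({ name, company, location }) {
  const terms = ['site:linkedin.com/in', `"${name}"`];
  if (company) terms.push(`"${company}"`);
  if (location) terms.push(location);
  return terms.join(" ");
}
```

Quoting the full name and company narrows the results, which is why the Best Practices below recommend including both.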
Best Practices
To maximize the accuracy of your results:
- Always include company information (domain or company name) with your search queries
- Use full names rather than nicknames or initials when possible
- Consider including location data for more accurate results with common names
- Implement rate limiting to respect LinkedIn's usage guidelines
- Keep your input data clean and standardized for best results
- Use the integrated proxy to navigate more effectively through Google and LinkedIn

What's Next?
Now that you've automated LinkedIn profile discovery, consider exploring related automations:
- Automated lead scoring based on LinkedIn profile data
- Email finder automation using validated LinkedIn profiles
- Integration with your CRM for automated contact enrichment