by Yaron Been
This workflow provides automated access to the Stability AI Stable Video Diffusion model through the Replicate API. It saves you time by eliminating the need to interact with the model manually and integrates video generation tasks seamlessly into your n8n automation workflows.

## Overview

This workflow handles the complete video generation process using the Stability AI Stable Video Diffusion model. It manages API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation (a sketch of the underlying Replicate request follows this section).

**Model Description:** Advanced AI model by stability-ai for automated processing tasks.

## Key Capabilities

- **AI-powered video generation and processing**
- **High-quality video synthesis from inputs**
- **Advanced video manipulation capabilities**

## Tools Used

- **n8n**: The automation platform that orchestrates the workflow
- **Replicate API**: Access to the stability-ai/stable-video-diffusion model
- **Stability AI Stable Video Diffusion**: The core AI model for video generation
- **Built-in Error Handling**: Automatic retry logic and comprehensive error management

## How to Install

1. **Import the Workflow**: Download the .json file and import it into your n8n instance
2. **Configure Replicate API**: Add your Replicate API token to the 'Set API Token' node
3. **Customize Parameters**: Adjust the model parameters in the 'Set Video Parameters' node
4. **Test the Workflow**: Run the workflow with your desired inputs
5. **Integrate**: Connect this workflow to your existing automation pipelines

## Use Cases

- **Video Content Creation**: Generate videos for social media, marketing, and presentations
- **Animation & Motion Graphics**: Create animated content and visual effects
- **Video Editing**: Enhance and transform existing video content
- **Educational Content**: Produce instructional and explainer videos

## Connect with Me

- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Replicate API**: https://replicate.com (sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #videogeneration #aivideo #videoai #motion #videoautomation #videocreation #stablediffusion #diffusion #machinelearning #artificialintelligence #aitools #digitalart #contentcreation #productivity #innovation
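For context, here is a hedged sketch of the kind of request the HTTP Request node sends to Replicate. The model version ID and input parameter names are placeholders: copy the real values from the model's page on replicate.com.

```js
// Sketch of the Replicate prediction-creation call (not the exact node config).
// MODEL_VERSION and the "input_image" parameter are assumptions; check the
// model's schema on replicate.com before using them.
const REPLICATE_API_TOKEN = process.env.REPLICATE_API_TOKEN;
const MODEL_VERSION = "<stable-video-diffusion-version-id>"; // placeholder

async function createPrediction(inputImageUrl) {
  const res = await fetch("https://api.replicate.com/v1/predictions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${REPLICATE_API_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      version: MODEL_VERSION,
      input: { input_image: inputImageUrl },
    }),
  });
  if (!res.ok) throw new Error(`Replicate returned ${res.status}`);
  return res.json(); // includes an id and a status such as "starting"
}
```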
by Amjid Ali
## Detailed Title

Triathlon Coach AI Workflow: Strava Data Analysis and Personalized Training Insights using n8n

## Description

This n8n workflow enables you to build an AI-driven virtual triathlon coach that integrates seamlessly with Strava to analyze activity data and give athletes actionable training insights. The workflow processes data from activities such as swimming, cycling, and running, delivers personalized feedback, and sends motivational and performance-improvement advice via email or WhatsApp.

## Workflow Details

### Trigger: Strava Activity Updates
- **Node:** Strava Trigger
- **Purpose:** Captures updates from Strava whenever an activity is recorded or modified. The data includes metrics like distance, pace, elevation, heart rate, and more.
- **Integration:** Uses the Strava API for real-time synchronization.

### Step 1: Data Preprocessing
- **Node:** Code
- **Purpose:** Combines and flattens the raw Strava activity data into a structured format for easier processing in subsequent nodes.
- **Logic:** A recursive function flattens the JSON input into a clean, readable structure (see the sketch after this section).

### Step 2: AI Analysis with Google Gemini
- **Node:** Google Gemini Chat Model
- **Purpose:** Leverages Google Gemini's advanced language model to analyze the activity data.
- **Functionality:** Identifies key performance metrics; provides feedback and insights specific to the type of activity (e.g., running, swimming, or cycling); offers tailored recommendations and motivational advice.

### Step 3: Generate Structured Output
- **Node:** Structure Output
- **Purpose:** Processes the AI-generated response into a structured format with headings, paragraphs, and bullet lists.
- **Output:** Formats the response for clear communication.

### Step 4: Convert to HTML
- **Node:** Convert to HTML
- **Purpose:** Converts the structured output into HTML suitable for email or other presentation methods.
- **Output:** Ensures the response is visually appealing and easy to understand.

### Step 5: Send Email with Training Insights
- **Node:** Send Email
- **Purpose:** Sends the athlete a detailed email with performance insights, training recommendations, and motivational messages.
- **Integration:** Uses Gmail or SMTP for secure and efficient email delivery.

### Optional Step: WhatsApp Notifications
- **Node:** WhatsApp Business Cloud
- **Purpose:** Sends a summary of the activity analysis and key recommendations via WhatsApp for instant access.
- **Integration:** Connects to WhatsApp Business Cloud for automated messaging.

## Additional Notes

- **Customization:** You can modify the AI prompt to adapt the recommendations to the athlete's specific goals or fitness level. The workflow is flexible and can accommodate additional nodes for more advanced analysis or output formats.
- **Scalability:** Ideal for individual athletes or coaches managing multiple athletes; can be expanded to include additional metrics or insights based on user preferences.
- **Performance Metrics Handled:**
  - Swimming: SWOLF, stroke count, pace
  - Cycling: Cadence, power zones, elevation
  - Running: Pacing, stride length, heart rate zones

## Implementation Steps

1. **Set Up Strava API Key:** Log in to Strava Developers to generate your API key, then add it to the Strava Trigger node.
2. **Configure Google Gemini Integration:** Use your Google Gemini (PaLM) API credentials in the Google Gemini Chat Model node.
3. **Customize Email and WhatsApp Messaging:** Update the Send Email and WhatsApp Business Cloud nodes with the recipient's details.
4. **Automate Execution:** Deploy the workflow and use n8n's scheduling features or cron jobs for periodic execution.
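As a reference, here is a minimal sketch of the recursive flattening logic described in Step 1, written for an n8n Code node. The dotted key names in the output are illustrative; the real keys depend on the Strava payload.

```js
// Minimal sketch of the recursive flattening applied to the raw Strava data.
// Nested objects become dotted keys, e.g. { athlete: { id: 1 } } -> { "athlete.id": 1 }.
function flatten(obj, prefix = "", out = {}) {
  for (const [key, value] of Object.entries(obj)) {
    const path = prefix ? `${prefix}.${key}` : key;
    if (value !== null && typeof value === "object" && !Array.isArray(value)) {
      flatten(value, path, out); // recurse into nested objects
    } else {
      out[path] = value; // leaves (numbers, strings, arrays) are copied as-is
    }
  }
  return out;
}

// In an n8n Code node ("Run Once for All Items") this runs over each item:
return items.map((item) => ({ json: flatten(item.json) }));
```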
## Developer Notes

- **Author:** Amjid Ali
- **See it in action:** SyncBricks YouTube channel
- **Support the developer:** PayPal
- **Courses:** SyncBricks LMS (n8n course and book)

By using this workflow, triathletes and coaches can elevate training to the next level with AI-powered insights and actionable recommendations.
by Yaron Been
This workflow provides automated access to the Black Forest Labs Flux Schnell AI model through the Replicate API. It saves you time by eliminating the need to interact with the model manually and integrates image generation tasks seamlessly into your n8n automation workflows.

## Overview

This workflow handles the complete image generation process using the Black Forest Labs Flux Schnell model. It manages API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation (the retry/polling pattern is sketched after this section).

**Model Description:** Advanced AI model by black-forest-labs for automated processing tasks.

## Key Capabilities

- **High-quality image generation from text prompts**
- **Advanced AI-powered visual content creation**
- **Customizable image parameters and styles**

## Tools Used

- **n8n**: The automation platform that orchestrates the workflow
- **Replicate API**: Access to the black-forest-labs/flux-schnell model
- **Black Forest Labs Flux Schnell**: The core AI model for image generation
- **Built-in Error Handling**: Automatic retry logic and comprehensive error management

## How to Install

1. **Import the Workflow**: Download the .json file and import it into your n8n instance
2. **Configure Replicate API**: Add your Replicate API token to the 'Set API Token' node
3. **Customize Parameters**: Adjust the model parameters in the 'Set Image Parameters' node
4. **Test the Workflow**: Run the workflow with your desired inputs
5. **Integrate**: Connect this workflow to your existing automation pipelines

## Use Cases

- **Content Creation**: Generate unique images for blogs, social media, and marketing materials
- **Design Prototyping**: Create visual concepts and mockups for design projects
- **Art & Creativity**: Produce artistic images for personal or commercial use
- **Marketing Materials**: Generate eye-catching visuals for campaigns and advertisements

## Connect with Me

- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Replicate API**: https://replicate.com (sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #imagegeneration #aiart #texttoimage #visualcontent #aiimages #generativeart #flux #machinelearning #artificialintelligence #aitools #digitalart #contentcreation #productivity #innovation
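The retry/polling pattern behind the built-in error handling can be sketched as follows. The endpoint shape follows Replicate's predictions API; the attempt count and delay are illustrative choices, not requirements.

```js
// Sketch of polling a Replicate prediction until it succeeds or fails.
// Timing values are illustrative; tune them to your workload.
async function waitForPrediction(predictionId, token, { maxAttempts = 30, delayMs = 2000 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const res = await fetch(`https://api.replicate.com/v1/predictions/${predictionId}`, {
      headers: { Authorization: `Bearer ${token}` },
    });
    if (!res.ok) throw new Error(`Replicate returned ${res.status}`);
    const prediction = await res.json();
    if (prediction.status === "succeeded") return prediction.output; // image URL(s)
    if (prediction.status === "failed" || prediction.status === "canceled") {
      throw new Error(`Prediction ${prediction.status}: ${prediction.error}`);
    }
    // Still "starting" or "processing": wait before the next poll
    await new Promise((r) => setTimeout(r, delayMs));
  }
  throw new Error("Timed out waiting for prediction");
}
```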
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically scrapes and summarizes the latest industry news, delivering a curated digest to your team. Stay informed without sifting through countless articles.

## Overview

Bright Data scrapes top news sites, blogs, and press-release feeds relevant to your sector. OpenAI summarizes each article and tags it by topic (see the prompt sketch after this section). The daily digest is compiled into Markdown and sent via Slack and email, while full summaries are archived in Notion.

## Tools Used

- **n8n** – Automation framework
- **Bright Data** – Scrapes news sources reliably
- **OpenAI** – Generates concise summaries and tags
- **Slack & Gmail** – Distribute the daily digest
- **Notion** – Stores detailed article notes

## How to Install

1. Import the workflow into n8n.
2. Configure Bright Data credentials.
3. Set up your OpenAI API key.
4. Authorize Slack, Gmail, and Notion.
5. Customize the source list & keywords in the Set node.

## Use Cases

- **Executive Briefings**: Keep leadership updated.
- **Product Teams**: Track competitor announcements.
- **Marketing**: Identify content trends quickly.
- **Investors**: Monitor sector developments.

## Connect with Me

- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (using this link supports my free workflows with a small commission)

#n8n #automation #industrynews #webscraping #brightdata #openai #newsdigest #n8nworkflow #nocode
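A hedged sketch of the summarize-and-tag step is below. The model name and the JSON output shape are assumptions; adjust them to match your OpenAI node configuration and digest format.

```js
// Sketch of the per-article summarization call. The prompt asks for a strict
// JSON reply so downstream nodes can compile the Markdown digest reliably.
async function summarizeArticle(articleText, apiKey) {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // assumed; any chat model works
      response_format: { type: "json_object" },
      messages: [
        {
          role: "system",
          content:
            "Summarize the article in 2-3 sentences and tag it by topic. " +
            'Reply as JSON: {"summary": string, "topics": string[]}',
        },
        { role: "user", content: articleText },
      ],
    }),
  });
  if (!res.ok) throw new Error(`OpenAI returned ${res.status}`);
  const data = await res.json();
  return JSON.parse(data.choices[0].message.content);
}
```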
by Harshil Agrawal
This workflow gets the top 5 products from Product Hunt and shares them on a Discord server.

- **Cron node**: Triggers the workflow every hour. Based on your use case, you can update the node to trigger the workflow at a different time.
- **GraphQL node**: Makes the API call to the Product Hunt GraphQL API (an example query is sketched below). You will need an API token from Product Hunt to make the call.
- **Item Lists node**: Transforms the single item returned by the previous node into multiple items.
- **Set node**: Returns only the name, description, and votes of each product.
- **Discord node**: Sends the top 5 products to the Discord server.
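For illustration, a query along these lines returns the top posts. The field names follow the public Product Hunt GraphQL API v2 schema, so verify them against the current schema before use.

```graphql
# Illustrative query for the GraphQL node: top 5 posts ordered by votes
query TopProducts {
  posts(first: 5, order: VOTES) {
    edges {
      node {
        name
        tagline
        votesCount
        url
      }
    }
  }
}
```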
by Samir Saci
Tags: Supply Chain Management, Logistics, Transportation, Data Transmission

## Context

Hey! I'm Samir, a Supply Chain Engineer and Data Scientist from Paris, and founder of LogiGreen Consulting. We help small and medium businesses improve their logistics processes using AI, data analytics, and automation.

> Sustainable and efficient supply chains with n8n!

📬 For business inquiries, you can reach me on LinkedIn.

## What is an EDI Message?

Electronic Data Interchange (EDI) is a standardized method of automatically transferring data between computer systems. EDI messages ensure the smooth flow of essential transactional data, such as purchase orders, invoices, shipping notices, and more. For instance, a manufacturing company can receive purchase orders from a retailer via EDI. However, transmitting and processing these messages normally requires complex integration.

## Who is this template for?

This workflow template is designed for small companies that cannot connect their systems directly to their customers' and therefore need to process received EDI messages manually.

## How does it work?

This workflow uses a Gmail trigger that analyzes all incoming emails:

- 📧 Gmail Trigger → Detects emails with "EDI" in the subject.
- 📜 Parse EDI Message → Uses a JavaScript Code node to extract structured data (see the parsing sketch after this section).
- 📊 Format the Data → Converts it into a table-friendly format.
- 📑 Update Google Sheets → Automatically logs the processed orders.

## Prerequisites

This workflow does not require any paid subscription.

- A Google Drive account with a folder containing a Google Sheet
- API credentials: Google Drive API, Google Sheets API, and Gmail API
- A Google Sheet to store the shipment records (you do not need to prepare the columns)

## Next Steps

Follow the sticky notes to set up the parameters inside each node and get ready to improve your logistics operations!

📺 Watch the step-by-step guide: 🎥 check my tutorial.

🚀 Interested in applications of n8n for logistics & supply chain management? Let's connect on LinkedIn.

## Notes

- This template includes an example EDI message to test the workflow.
- If you want to learn more about Electronic Data Interchange: 🚚 blog article about EDI.
- This workflow was created with n8n 1.82.1.

Submitted: March 19th, 2025
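Here is a minimal sketch of the parsing logic for the Code node, assuming an X12-style message where "~" terminates segments and "*" separates elements. Adjust the delimiters and field names to match the example message included in the template.

```js
// Minimal X12-style EDI parsing sketch: split the message into segments,
// then split each segment into its tag and data elements.
function parseEdi(message) {
  const segments = message
    .split("~")
    .map((s) => s.trim())
    .filter(Boolean);

  return segments.map((segment) => {
    const [tag, ...elements] = segment.split("*");
    return { tag, elements }; // e.g. { tag: "PO1", elements: ["1", "10", "EA", ...] }
  });
}

// Example: pull line items (PO1 segments) out of a purchase order.
// The "ediMessage" field name is an assumption for illustration.
const lineItems = parseEdi($json.ediMessage).filter((seg) => seg.tag === "PO1");
return [{ json: { lineItems } }];
```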
by Davide
This workflow enables users to perform web searches directly from Telegram using the Brave search engine. By simply sending the command /brave followed by a query, the workflow retrieves search results from Brave and returns them as a Telegram message. It is ideal for users who want a quick, private way to search the web without switching between apps. 🚀 Below is a breakdown of the workflow.

## 1. How It Works

The workflow processes user queries from Telegram, executes a Brave tool via the MCP Client, and sends the results back to the user (a sketch of the command filtering and cleaning logic follows this section):

- **Telegram Trigger**: The workflow starts with the Telegram Trigger node, which listens for new messages in a Telegram chat. When a message is received, the workflow checks whether it starts with the command /brave.
- **Filter Messages**: The If node filters messages that start with /brave. If the message doesn't start with /brave, the workflow stops.
- **Edit Fields**: Extracts the text of the message for further processing.
- **Clean Query**: Removes the /brave command from the message, leaving only the user's query.
- **List Brave Tools**: Retrieves the list of available tools from the MCP Client.
- **Exec Brave Tool**: Executes the first tool in the list, using the cleaned query as input.
- **Send Message**: Sends the result of the Brave tool execution back to the user in the Telegram chat.

## 2. Preliminary Steps

1. Access an n8n self-hosted instance and install the community node "n8n-nodes-mcp" (see this easy guide).
2. Get your Brave Search API key: https://brave.com/search/api/
3. Get a Telegram bot access token.
4. In "List Brave Tools", create a new credential as shown in the image, and set the Environment field to: BRAVE_API_KEY=your-api-key

## 3. Set Up Steps

- **Telegram Configuration**: Set up Telegram credentials in n8n for the Telegram Trigger and Send Message nodes. Ensure the Telegram bot is authorized to read messages and send responses in the chat.
- **MCP Client Configuration**: Set up MCP Client credentials in n8n for the List Brave Tools and Exec Brave Tool nodes. Ensure the MCP Client is configured to interact with Brave tools.
- **Test the Workflow**: Send a message starting with /brave followed by a query (e.g., /brave search for AI tools) to the Telegram chat. The workflow will process the query, execute the Brave tool via the MCP Client, and send the result back to the Telegram chat.
- **Optional Customization**: Extend the workflow with additional commands or tools, integrations with other APIs or services, or notifications via other channels (e.g., email, Slack).

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
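Here is a sketch of the filter and clean-query steps combined into one Code node. The message.text field path is the usual Telegram Trigger output, but verify it against your trigger's data.

```js
// Sketch of the If + Clean Query logic in one place.
const text = ($json.message?.text ?? "").trim();

// If-node equivalent: only continue for messages that start with /brave
if (!text.startsWith("/brave")) {
  return []; // no items means downstream nodes do not run
}

// Clean Query equivalent: strip the command, keep only the user's query
const query = text.replace(/^\/brave\s*/, "");
return [{ json: { query } }];
```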
by ParquetReader
## 📄 Convert Parquet, Feather, ORC & Avro Files with ParquetReader

This workflow allows you to upload and inspect Parquet, Feather, ORC, or Avro files via the ParquetReader API. It instantly returns a structured JSON preview of your data — including rows, schema, and metadata — without needing to write any custom code.

### ✅ Perfect For

- Validating schema and structure before syncing or transformation
- Previewing raw columnar files on the fly
- Automating QA, ETL, or CI/CD workflows
- Converting Parquet, Avro, Feather, or ORC to JSON

### ⚙️ Use Cases

- Catch schema mismatches before pipeline runs
- Automate column audits in incoming data files
- Enrich metadata catalogs with real-time schema detection
- Integrate file validation into automated workflows

### 🚀 How to Use This Workflow

#### 📥 Trigger via File Upload

You can trigger this flow by sending a POST request with a file using curl, Postman, or another n8n flow.

🔧 Example (via curl):

```
curl -X POST http://localhost:5678/webhook-test/convert \
  -F "file=@converted.parquet"
```

> Replace converted.parquet with your local file path. You can also send Avro, ORC, or Feather files.

#### 🔁 Reuse from Other Flows

You can reuse this flow by calling the webhook from another n8n workflow using an HTTP Request node. Make sure to send the file as form-data with the field name file (a sketch follows this section).

#### 🔍 What This Flow Does

1. Receives the uploaded file via webhook (file)
2. Sends it to https://api.parquetreader.com/parquet as multipart/form-data (field name: file)
3. Receives parsed data (rows), schema, and metadata in JSON format

#### 🧪 Example JSON Response from This Flow

```json
{
  "data": [
    {
      "full_name": "Pamela Cabrera",
      "email": "bobbyharrison@example.net",
      "age": "24",
      "active": "True",
      "latitude": "-36.1577385",
      "longitude": "63.014954",
      "company": "Carter, Shaw and Parks",
      "country": "Honduras"
    }
  ],
  "meta_data": {
    "created_by": "pyarrow",
    "num_columns": 21,
    "num_rows": 10,
    "serialized_size": 7598,
    "format_version": "0.12"
  },
  "schema": [
    { "column_name": "full_name", "column_type": "string" },
    { "column_name": "email", "column_type": "string" },
    { "column_name": "age", "column_type": "int64" },
    { "column_name": "active", "column_type": "bool" },
    { "column_name": "latitude", "column_type": "double" },
    { "column_name": "longitude", "column_type": "double" },
    { "column_name": "company", "column_type": "string" },
    { "column_name": "country", "column_type": "string" }
  ]
}
```

### 🔐 API Info

- **Authentication**: None required
- **Supported formats**: .parquet, .avro, .orc, .feather
- **Free usage**: No signup needed; the API is currently open to the public
- **Limits**: Usage and file-size limits may apply in the future (TBD)
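If you prefer to call the webhook from code rather than an HTTP Request node, a minimal Node.js sketch looks like this. The URL is the test webhook from the curl example; swap in your production webhook path.

```js
// Sketch of calling the converter webhook with multipart form-data.
// Requires Node 18+ for the global fetch, FormData, and Blob.
import { readFile } from "node:fs/promises";

async function convertFile(path) {
  const form = new FormData();
  form.append("file", new Blob([await readFile(path)]), path);

  const res = await fetch("http://localhost:5678/webhook-test/convert", {
    method: "POST",
    body: form, // fetch sets the multipart boundary header automatically
  });
  if (!res.ok) throw new Error(`Webhook returned ${res.status}`);
  return res.json(); // { data, meta_data, schema } as shown above
}

const preview = await convertFile("converted.parquet");
console.log(preview.schema);
```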
by Airtop
## About the Airtop Automation

Are you tired of being shocked by unexpectedly high energy bills? With this automation using Airtop and n8n, you can take control of your daily energy costs and ensure you're always informed.

## How to Monitor Your Daily Energy Consumption

This automation retrieves your PG&E (Pacific Gas and Electric) energy usage data, calculates costs, and emails you the details — all without manual effort.

### What You'll Need

- A free Airtop API key
- PG&E account credentials — with minor adaptations, this will also work with other providers
- An email address to receive the energy cost updates

Estimated setup time: 5 minutes

### Understanding the Process

This automation works by:

1. Logging into your PG&E account using your credentials
2. Navigating to your energy usage data
3. Extracting relevant details about energy consumption and costs
4. Emailing the daily summary directly to your inbox

The automation is straightforward and gives you real-time insight into your energy usage, empowering you to adjust your habits and save money.

### Setting Up Your Automation

1. **Insert your credentials**: In the tools section, add your PG&E login details as variables; in Airtop, add your Airtop API key; configure the email address that should receive the updates.
2. **Run the automation**: Start the scenario and watch the automation retrieve your energy data and send you a detailed email summary.

### Customization Options

While the default setup works seamlessly, you can tweak it to suit your needs:

- **Data storage**: Store energy usage data in a database for long-term tracking and analysis
- **Visualization**: Plot graphs of your energy usage trends over time for better insights
- **Notifications**: Send alerts only on high usage instead of a daily email

### Real-World Applications

This automation isn't just about monitoring energy usage — it's about taking control. Here are some practical applications:

- **Daily energy management**: Receive updates every morning and adjust your consumption based on costs
- **Smart home integration**: Use the data to schedule appliances during off-peak hours
- **Budgeting**: Track energy expenses over weeks or months to plan your budget more effectively

Happy automating!
by Samir Saci
Tags: Automation, AI, Marketing, Content Creation

## Context

I'm a Supply Chain Data Scientist and content creator who writes regularly about data-driven optimization, logistics, and sustainability. Promoting blog articles on LinkedIn used to be a manual task — until I decided to automate it with n8n and GPT-4o.

This workflow lets you automatically extract blog posts, clean the content, and generate a professional LinkedIn post using an AI agent powered by GPT-4o — all in one seamless automation.

> Save hours of repetitive work and boost your reach with AI.

📬 For business inquiries, you can add me on LinkedIn.

## Who is this template for?

This template is perfect for:

- **Bloggers and writers** who want to promote their content on LinkedIn
- **Marketing teams** looking to automate professional post generation
- **Content creators** using the Ghost platform

It generates polished LinkedIn posts with:

- A hook
- A quick summary
- A call-to-action
- A signature that drives readers to your contact page

## How does it work?

This workflow runs in n8n and performs the following steps:

1. 🚀 Triggers manually (or you can add a scheduler)
2. 📰 Pulls recent blog posts from your Ghost site (via API)
3. 🧼 Cleans the HTML content for AI input (see the sketch after this section)
4. 🤖 Sends the content to GPT-4o with a tailored prompt to create a LinkedIn post
5. 📄 Records all data (post content + LinkedIn output) in a Google Sheet

## What do I need to start?

You don't need to write a single line of code.

Prerequisites:

- A Ghost CMS account with blog content
- A Google Sheet to store generated posts
- An OpenAI API key
- The Google Sheets API connected via OAuth2

## Next Steps

Use the sticky notes in the workflow to learn how to:

- Add your Ghost API credentials
- Link your Google Sheet
- Customize the AI prompt (e.g., change the author name or tone)
- Optionally add auto-posting to LinkedIn using tools like Buffer or Make

🎥 Watch my tutorial.

🚀 Want to explore how automation can scale your brand or business? 📬 Let's connect on LinkedIn.

## Notes

- You can adapt this template for Twitter, Facebook, or even email newsletters by adjusting the prompt and output channel.
- This workflow was built using n8n 1.85.4.

Submitted: April 9th, 2025
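A minimal sketch of the HTML-cleaning step (step 3), written for an n8n Code node. A regex strip is usually enough for prompt input; the html field name is assumed from the Ghost Content API response, so swap in a proper HTML parser and the right field if your posts embed complex markup.

```js
// Strip HTML down to plain text suitable for a GPT-4o prompt.
function cleanHtml(html) {
  return html
    .replace(/<(script|style)[\s\S]*?<\/\1>/gi, "") // drop script/style blocks
    .replace(/<[^>]+>/g, " ") // strip remaining tags
    .replace(/&nbsp;/g, " ")
    .replace(/\s+/g, " ") // collapse whitespace
    .trim();
}

return items.map((item) => ({
  json: { ...item.json, plainText: cleanHtml(item.json.html ?? "") },
}));
```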
by Yaron Been
## LinkedIn Hiring Signal Scraper — Jobs & Prospecting Using Bright Data

**Purpose:** Discover recent job posts from LinkedIn using Bright Data's Dataset API, clean the results, and log them into Google Sheets — for both job hunting and identifying high-intent B2B leads based on hiring activity.

### Use Cases

- **Job Seekers** – Spot relevant openings filtered by role, city, and country.
- **Sales & Prospecting** – Use job posts as buying signals. If a company is hiring for a role you support (e.g. marketers, developers, ops), it's the perfect time to reach out and offer your services.

### Tools Needed

**n8n Nodes:**
- Form Trigger
- HTTP Request
- Wait
- If
- Code
- Google Sheets
- Sticky Notes (for embedded guidance)

**External Services:**
- Bright Data (Dataset API)
- Google Sheets

### API Keys & Authentication Required

- **Bright Data API Key** → Add in the HTTP Request headers: Authorization: Bearer YOUR_BRIGHTDATA_API_KEY
- **Google Sheets OAuth2** → Connect your account in n8n to allow read/write access to the spreadsheet.

### General Guidelines

- Use descriptive names for all nodes.
- Include retry logic in polling to avoid infinite loops.
- Flatten nested fields (like job_poster and base_salary).
- Strip HTML tags from job descriptions for clean output.

### Things to Be Aware Of

- Bright Data snapshots take ~1–3 minutes — use a Wait node and polling.
- Form filters affect output significantly: 🔍 we recommend filtering by "Last 7 days" or "Past 24 hours" for fresher data.
- Avoid hardcoding values in the form — leave optional filters empty if unsure.

### Post-Processing & Outreach

After the data lands in Google Sheets, you can use it to:

- Personalize cold emails based on job titles, locations, and hiring signals.
- Send thoughtful LinkedIn messages (e.g., "Saw you're hiring a CMO...").
- Prioritize outreach to companies actively growing in your niche.

### Additional Notes

📄 Copy the Google Sheet template (click the link to make your copy) and rename it for each campaign or client.

Form fields include:
- Job Location (city or region)
- Keyword (e.g., CMO, Backend Developer)
- Country (2-letter code, e.g., US, UK)

This workflow gives you a competitive edge — 📌 for candidates: be first to apply; 📌 for sellers: be first to pitch. All based on live hiring signals from LinkedIn.

## Step-by-Step Walkthrough

### Step 1: Set up your Google Sheet

1. Open the template
2. Go to File → Make a copy
3. Use this copy as the destination for the scraped job posts

### Step 2: Fill out the Input Form in n8n

The form defines what kind of job posts you want to scrape.

Fields:
- **Job Location** → e.g. New York, Berlin, Remote
- **Keyword** → e.g. CMO, AI Architect, Ecommerce Manager
- **Country Code (2-letter)** → e.g. US, UK, IL

💡 Pro tip: for best results, set the filter inside the workflow to time_range = "Past 24 hours" or "Last 7 days". This keeps results relevant and fresh.

### Step 3: Trigger Bright Data Snapshot

The workflow sends a request to Bright Data with your input.

Example API call body:

```json
[
  {
    "location": "New York",
    "keyword": "Marketing Manager",
    "country": "US",
    "time_range": "Past 24 hours",
    "job_type": "Part-time",
    "experience_level": "",
    "remote": "",
    "company": ""
  }
]
```

Bright Data will start preparing the dataset in the background.

### Step 4: Wait for the Snapshot to Complete

The workflow includes a Wait node and a polling loop that checks every few minutes until the data is ready. You don't need to do anything here — it's all automated.
### Step 5: Clean Up the Results

Once Bright Data responds with the full job post list:

- ✔️ Nested fields like job_poster and base_salary are flattened
- ✔️ HTML in job descriptions is removed
- ✔️ The final data is formatted for export

### Step 6: Export to Google Sheets

The cleaned list is added to your Google Sheet (first tab). Each row is one job post, with columns like: job_title, company_name, location, salary_min, apply_link, job_description_plain

### Step 7: Use the Data for Outreach or Research

Example for job seekers — you search for:
- Location: Berlin
- Keyword: Product Designer
- Country: DE
- Time range: Past 7 days

Now you've got a live list of roles — with salary, recruiter info, and apply links. → Use it to apply faster than others.

Example for prospecting (sales / SDR) — you search for:
- Location: London
- Keyword: Growth Marketing
- Country: UK

...and find companies hiring growth marketers. → That's your signal to offer help with media buying, SEO, CRO, or your relevant service.

Use the data to:
- Write personalized cold emails ("Saw you're hiring a Growth Marketer…")
- Start warm LinkedIn outreach
- Build lead lists of companies actively expanding in your niche

### API Credentials Required

- **Bright Data API Key** — used in HTTP headers: Authorization: Bearer YOUR_BRIGHTDATA_API_KEY
- **Google Sheets OAuth2** — allows n8n to read/write to your spreadsheet

### Adjustments & Customization Tips

- Modify the HTTP Request body to add more filters (e.g. job_type, remote, company)
- Increase or reduce the polling wait time depending on Bright Data speed
- Add scoring logic to prioritize listings based on title or location (a sketch follows below)

### Final Notes

- 📄 Google Sheet template: make your copy via the link above
- ⚙️ Bright Data Dataset API: visit BrightData.com
- 📬 Personalization works best when you act quickly. Use the freshest data to reach out with context — not generic pitches.

This workflow turns LinkedIn job posts into sales insights and job leads — all in one click, fully automated, and ready for your next move.
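Here is a hedged sketch of the optional scoring logic mentioned in the customization tips, written for an n8n Code node. The keyword lists, weights, and field names are illustrative; tune them to your campaign.

```js
// Rank listings so the best-fit roles land at the top of the sheet.
const TITLE_KEYWORDS = ["growth", "marketing", "cmo"]; // roles you can serve
const PREFERRED_LOCATIONS = ["london", "berlin"];

function scoreJob(job) {
  let score = 0;
  const title = (job.job_title ?? "").toLowerCase();
  const location = (job.location ?? "").toLowerCase();

  for (const kw of TITLE_KEYWORDS) {
    if (title.includes(kw)) score += 10;
  }
  if (PREFERRED_LOCATIONS.some((loc) => location.includes(loc))) score += 5;
  if (job.salary_min) score += 2; // listings with salary data are easier to qualify

  return score;
}

// Sort incoming items so the highest-scoring jobs are written first
return items
  .map((item) => ({ json: { ...item.json, score: scoreJob(item.json) } }))
  .sort((a, b) => b.json.score - a.json.score);
```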
by n8n Team
This n8n workflow automates the process of identifying incoming Gmail emails that request an appointment, evaluating their content, checking calendar availability, and then composing and sending a response email. Note that to use this template, you need to be on n8n version 1.19.4 or later.