by Yaron Been
## Google Veo 3 Fast Video Generator

### Description
A faster and cheaper version of Google's Veo 3 video model, with audio.

### Overview
This n8n workflow integrates with the Replicate API to use the google/veo-3-fast model. This powerful AI model can generate high-quality video content based on your inputs.

### Features
- Easy integration with the Replicate API
- Automated status checking and result retrieval
- Support for all model parameters
- Error handling and retry logic
- Clean output formatting

### Parameters

**Required Parameters**
- **prompt** (string): Text prompt for video generation

**Optional Parameters**
- **seed** (integer, default: None): Random seed. Omit for random generations
- **resolution** (string, default: 720p): Resolution of the generated video
- **negative_prompt** (string, default: None): Description of what to discourage in the generated video

### How to Use
1. Set up your Replicate API key in the workflow
2. Configure the required parameters for your use case
3. Run the workflow to generate video content
4. Access the generated output from the final node

### API Reference
- Model: google/veo-3-fast
- API Endpoint: https://api.replicate.com/v1/predictions

### Requirements
- Replicate API key
- n8n instance
- Basic understanding of video generation parameters
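For orientation, here is a minimal TypeScript sketch of the prediction-creation call the workflow's HTTP node performs. The model-scoped endpoint shown is an assumption; the workflow may instead POST to the generic `/v1/predictions` endpoint listed above with an explicit version id.

```typescript
// Hedged sketch of the request sent to Replicate to start a Veo 3 Fast prediction.
async function createVeoPrediction(apiToken: string, prompt: string) {
  const res = await fetch(
    "https://api.replicate.com/v1/models/google/veo-3-fast/predictions",
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        input: {
          prompt,              // required
          resolution: "720p",  // optional; seed and negative_prompt may also go here
        },
      }),
    }
  );
  return res.json(); // contains id and status; poll until the video URL appears in output
}
```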
by Yaron Been
## Prunaai Hidream E1.1 Image Generator

### Description
Edit an image with a prompt. This is the hidream-e1.1 model accelerated with the Pruna optimisation engine.

### Overview
This n8n workflow integrates with the Replicate API to use the prunaai/hidream-e1.1 model. This powerful AI model can generate high-quality image content based on your inputs.

### Features
- Easy integration with the Replicate API
- Automated status checking and result retrieval
- Support for all model parameters
- Error handling and retry logic
- Clean output formatting

### Parameters

**Required Parameters**
- **prompt** (string): Prompt

**Optional Parameters**
- **seed** (integer, default: -1): Random seed (-1 for random)
- **image** (string, default: None): Input image to edit
- **speed_mode** (string, default: Juiced 🔥 (more speed)): Speed optimization level
- **clip_cfg_norm** (boolean, default: True): Whether to use CLIP CFG normalization
- **output_format** (string, default: webp): Output format
- **guidance_scale** (number, default: 2.5): Guidance scale
- **output_quality** (integer, default: 100): Output quality (for jpg and webp)
- **refine_strength** (number, default: 0.3): Strength of refinement
- **num_inference_steps** (integer, default: 28): Number of inference steps
- **image_guidance_scale** (number, default: 1): Image guidance scale

### How to Use
1. Set up your Replicate API key in the workflow
2. Configure the required parameters for your use case
3. Run the workflow to generate image content
4. Access the generated output from the final node

### API Reference
- Model: prunaai/hidream-e1.1
- API Endpoint: https://api.replicate.com/v1/predictions

### Requirements
- Replicate API key
- n8n instance
- Basic understanding of image generation parameters
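The "automated status checking and result retrieval" listed under Features boils down to polling the prediction until it reaches a terminal state. A hedged sketch, assuming the standard Replicate predictions endpoint:

```typescript
// Poll a Replicate prediction by id until it succeeds, fails, or is canceled.
async function waitForPrediction(apiToken: string, predictionId: string) {
  for (;;) {
    const res = await fetch(
      `https://api.replicate.com/v1/predictions/${predictionId}`,
      { headers: { Authorization: `Bearer ${apiToken}` } }
    );
    const prediction = await res.json();
    if (["succeeded", "failed", "canceled"].includes(prediction.status)) {
      return prediction; // prediction.output holds the edited image URL(s) on success
    }
    await new Promise((resolve) => setTimeout(resolve, 2000)); // wait before re-checking
  }
}
```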
by Yaron Been
## Dessix Moss Ttsd Text Generator

### Description
MOSS-TTSD (text to spoken dialogue) is an open-source bilingual spoken dialogue synthesis model that supports both Chinese and English. It can transform dialogue scripts between two speakers into natural, expressive conversational speech.

### Overview
This n8n workflow integrates with the Replicate API to use the dessix/moss-ttsd model. This powerful AI model can generate high-quality spoken dialogue audio based on your inputs.

### Features
- Easy integration with the Replicate API
- Automated status checking and result retrieval
- Support for all model parameters
- Error handling and retry logic
- Clean output formatting

### Parameters

**Optional Parameters**
- **seed** (integer, default: 42): Random seed for reproducibility
- **text** (string, default: [S1]你好[S2]你好,最近怎么样[S1]还不错,你呢[S2]我也挺好的,谢谢关心): Dialogue text, format: [S1]Speaker 1 content[S2]Speaker 2 content[S1]...
- **use_normalize** (boolean, default: True): Whether to use text normalization (recommended for better handling of numbers, punctuation, etc.)
- **reference_text_speaker1** (string, default: 周一到周五每天早晨七点半到九点半的直播片段,言下之意呢就是废话有点多,大家也别嫌弃,因为这都是直播间最真实的状态了): Reference text for speaker 1 (corresponding to the reference audio)
- **reference_text_speaker2** (string, default: 如果大家想听到更丰富更及时的直播内容,记得在周一到周五准时进入直播间,和大家一起畅聊新消费新科技新趋势): Reference text for speaker 2 (corresponding to the reference audio)
- **reference_audio_speaker1** (string, default: None): Reference audio file for speaker 1 (optional, for voice cloning)
- **reference_audio_speaker2** (string, default: None): Reference audio file for speaker 2 (optional, for voice cloning)

### How to Use
1. Set up your Replicate API key in the workflow
2. Configure the required parameters for your use case
3. Run the workflow to generate the spoken dialogue audio
4. Access the generated output from the final node

### API Reference
- Model: dessix/moss-ttsd
- API Endpoint: https://api.replicate.com/v1/predictions

### Requirements
- Replicate API key
- n8n instance
- Basic understanding of text-to-speech parameters
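To illustrate the `[S1]`/`[S2]` dialogue format expected by the `text` parameter, here is a small hypothetical helper (not part of the workflow) that assembles a script from individual turns:

```typescript
// Build a MOSS-TTSD dialogue string: speaker tags directly concatenated with their lines.
function buildDialogue(turns: Array<{ speaker: 1 | 2; line: string }>): string {
  return turns.map((t) => `[S${t.speaker}]${t.line}`).join("");
}

// buildDialogue([
//   { speaker: 1, line: "Hi there" },
//   { speaker: 2, line: "Hello, how have you been?" },
// ]) => "[S1]Hi there[S2]Hello, how have you been?"
```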
by n8n Team
This workflow performs various Git operations. It starts with a manual trigger, sets the local repository path, decodes a file, updates the file's content, then adds, commits, and pushes the changes to a GitHub repository, and finally pulls changes.

The upper branch of the workflow retrieves a specific file ("README.md") from a GitHub repository ("git_push_article") owned by "teds-tech-talks". It then decodes the file's binary data into readable text using a Code node. The decoded content is used to update the file by adding a timestamp and data. Finally, the modified file is pushed back to the repository using a GitHub node, completing the process of editing and updating the file directly from the workflow.

The bottom branch of the workflow makes changes to a local Git repository. It starts by updating the "README.md" file with a timestamp and some content. Then it adds the modified files, commits the changes with a message, and pushes them to a remote GitHub repository owned by "teds-tech-talks". Additionally, the workflow can pull changes from the remote repository into the local repository.

The goal is to demonstrate how to perform various Git operations using n8n nodes, including adding, committing, pushing, and pulling changes.
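As a hedged illustration of the decode step in the upper branch, here is roughly what the Code node does. The binary property name `data` and in-memory binary storage are assumptions; check the actual node in the workflow.

```typescript
// n8n Code node sketch: decode the base64 content returned by the GitHub
// "get file" operation into readable text, then append a timestamp line.
const item = $input.first();
const base64 = item.binary.data.data; // assumed default binary property name
const text = Buffer.from(base64, "base64").toString("utf8");

// Append a timestamp plus some data, as the workflow's update step does.
const updated = `${text}\n${new Date().toISOString()} - updated via n8n`;

return [{ json: { content: updated } }];
```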
by Robert Breen
This workflow automates invoice creation using Google Sheets for structured input and Google Docs for templated output — all built inside n8n.

## 🛠️ Step-by-Step Instructions

### Step 1: Manual Trigger
Start the workflow manually for testing or development purposes.

### Step 2: Google Sheets — Load Invoice Data
Pulls invoice data from a Google Sheet.

📄 Sheet URL: Copy This Sheet

**Expected Columns**:
- Company From
- Company To
- Terms
- Invoice
- Description
- Amount

> 🔑 Credentials Required:
> Connect to Google Sheets OAuth2 API in n8n.
> Be sure your sheet is shared with the connected Google account.

### Step 3: Get Invoice Template — Load Google Doc
Loads a static Google Docs template containing placeholder values.

🧾 Template URL: Copy This Template

**Required Placeholders** in the document:
- FromCompany#
- ToCompany#
- Terms#
- Invoice#
- Description#
- Amount#

> 🔑 Credentials Required:
> Connect to Google Docs OAuth2 API in n8n.

### Step 4: Create New Doc — Make Invoice File
Creates a new Google Doc by duplicating the invoice template.

- **Title Format**: Invoice: {{ $json.Invoice }}
- **Destination Folder ID**: 1TnDibwPPPUm3VbmETiqWDVhtaUTLJ6mn (you can change this to your own Google Drive folder)

> 🔐 Make sure your Google Docs credential has write access to this folder.

### Step 5: Merge — Combine Data
Merges the loaded document and spreadsheet row together for downstream updates.

### Step 6: Insert Content into Doc (Optional)
You can insert additional content here if needed. For example, a note, header, or footer pulled from your database or a custom field.

### Step 7: Input Invoice Details — Replace Fields
Uses the Google Docs API to replace all placeholders from the original template with the actual values (see the sketch after this section).

Replacements:

| Placeholder    | Replaced With           |
|----------------|-------------------------|
| FromCompany#   | Company From from sheet |
| ToCompany#     | Company To from sheet   |
| Terms#         | Terms from sheet        |
| Invoice#       | Invoice number          |
| Description#   | Description of service  |
| Amount#        | Amount of invoice       |

## 📤 Final Output
Each row from the Google Sheet results in a completed, branded Google Doc invoice stored in your Drive.

## 🙋 Need Help?
Robert Breen
Automation Consultant
🌐 ynteractive.com
📧 robert.j.breen@gmail.com
🔗 LinkedIn

## 🔒 Required APIs

| Service           | Purpose                         |
|-------------------|---------------------------------|
| Google Sheets API | Pull structured invoice data    |
| Google Docs API   | Load & modify invoice documents |
| n8n OAuth2        | Connect both services securely  |

Let me know if you'd like a follow-up step to export invoices as PDFs or auto-email them to clients!
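For reference, here is a hedged sketch of what the Step 7 replacement amounts to at the API level: a `documents.batchUpdate` call containing one `replaceAllText` request per placeholder. The document id, access token, and row object are assumed inputs; in the workflow, the Google Docs node handles this call for you.

```typescript
// Replace each invoice placeholder in the duplicated Google Doc with the sheet values.
async function fillInvoice(documentId: string, accessToken: string, row: Record<string, string>) {
  const replacements: Record<string, string> = {
    "FromCompany#": row["Company From"],
    "ToCompany#": row["Company To"],
    "Terms#": row["Terms"],
    "Invoice#": row["Invoice"],
    "Description#": row["Description"],
    "Amount#": row["Amount"],
  };
  const requests = Object.entries(replacements).map(([placeholder, value]) => ({
    replaceAllText: {
      containsText: { text: placeholder, matchCase: true },
      replaceText: value ?? "",
    },
  }));
  await fetch(`https://docs.googleapis.com/v1/documents/${documentId}:batchUpdate`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ requests }),
  });
}
```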
by Evoort Solutions
## 🎬 YouTube Video to Blog – Multilingual Blog Generator

Convert YouTube videos into SEO-friendly blog posts in just seconds using this fully automated n8n workflow. Perfect for content creators, marketers, educators, and bloggers looking to repurpose video content without manual transcription or formatting.

### 🔧 What It Does
- 📥 Accepts a YouTube video URL and preferred language via a simple form
- 🧠 Uses a third-party API to convert the video into a blog-style article
- 📄 Automatically inserts the generated content into a Google Docs document

### 🌍 Supported Languages
Supports all major languages, including but not limited to:
- English
- Hindi
- French
- German
- Gujarati

🎯 The workflow is flexible and can generate blog content in any language supported by the API. Just select your language when submitting the form.

### 🚀 Benefits
- ⏱️ **Time-Saving**: Eliminate manual video transcription and formatting
- 🌐 **Multilingual**: Easily generate blogs in multiple languages
- 📚 **Centralized Storage**: Store all generated blogs in a single Google Docs file
- 🔧 **Customizable**: Extend the flow to auto-publish, email, or analyze content

### 🧠 Use Cases
- Repurpose YouTube content into keyword-rich blog posts
- Generate multilingual content for global reach
- Convert educational videos into study guides or summaries
- Create email newsletters or social media posts from video content

### 🛠️ Requirements
- ✅ An n8n instance (self-hosted or cloud)
- 🔑 RapidAPI key for youtube-to-blog.p.rapidapi.com
- 🧾 A Google Docs account with API access

🚨 Note: Be sure to update the API key and Google Docs URL with your own credentials before activating the workflow.

Create your free n8n account and set up the workflow in just a few minutes using the link below:
👉 Start Automating with n8n

Save time, stay consistent, and grow your online presence effortlessly!
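Under the hood, the form submission feeds an HTTP Request node that calls the RapidAPI endpoint. The sketch below is illustrative only: the path and body field names are assumptions (check the workflow's HTTP Request node for the real values), while the `X-RapidAPI-Key` and `X-RapidAPI-Host` headers are RapidAPI's standard authentication headers.

```typescript
// Hypothetical call to the youtube-to-blog RapidAPI service; adjust path and
// field names to the actual API documentation.
async function videoToBlog(videoUrl: string, language: string, rapidApiKey: string) {
  const res = await fetch("https://youtube-to-blog.p.rapidapi.com/blog", { // hypothetical path
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-RapidAPI-Key": rapidApiKey,
      "X-RapidAPI-Host": "youtube-to-blog.p.rapidapi.com",
    },
    body: JSON.stringify({ url: videoUrl, language }), // hypothetical field names
  });
  return res.json(); // blog-style article text to insert into Google Docs
}
```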
by n8n Team
This workflow imports multiple CSV files and appends or updates them in a Google Sheets document. Here's a step-by-step breakdown:

1. When "Execute Workflow" is clicked, the process starts.
2. The "Read Binary Files" node reads all the '.csv' files from the specified directory.
3. The files are then split into batches (one file per batch) by the "Split In Batches" node.
4. For each file, the "Read CSV" node reads the data from the CSV file.
5. The "Assign source file name" node assigns the source file name to the data.
6. The data is then processed by the "Remove duplicates" node, which removes any duplicate entries based on the 'user_name' field.
7. The "Keep only subscribers" node filters the data to keep only those entries where the 'subscribed' field is set to 'TRUE'.
8. The data is then sorted by the 'date_subscribed' field using the "Sort by date" node.
9. Finally, the processed data is appended or updated in the specified Google Sheets document by the "Upload to spreadsheet" node. It matches on the 'user_name' field: if data for that 'user_name' already exists, it is updated; otherwise the new data is appended.
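Expressed as a single function, the dedupe/filter/sort stages (steps 6–8) look roughly like the sketch below; in the workflow they are separate nodes, and the column names match the CSV fields described above.

```typescript
// Per-file transformation: remove duplicates by user_name, keep only
// subscribed rows, then sort by date_subscribed.
type Row = {
  user_name: string;
  subscribed: string;
  date_subscribed: string;
  [key: string]: string;
};

function processRows(rows: Row[]): Row[] {
  const seen = new Set<string>();
  return rows
    .filter((row) => {
      // Remove duplicates based on user_name
      if (seen.has(row.user_name)) return false;
      seen.add(row.user_name);
      return true;
    })
    .filter((row) => row.subscribed === "TRUE") // keep only subscribers
    .sort((a, b) => a.date_subscribed.localeCompare(b.date_subscribed)); // sort by date_subscribed
}
```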
by Nicolas Le Gallo
**Who is this template for?**
Basically anyone involved in recurring recruiting processes who is looking to save a considerable amount of time and energy (talent acquisition managers, recruiting consultants, hiring managers, founders, etc.).

**What it does:**
It takes a messy, raw transcript from an "intake meeting" between a recruiter and a hiring manager and turns it into a clean, exhaustive brief plus scorecard templates for each interview round. It does this in under 1 minute, while the usual "manual" process typically takes several hours.

**How to customize this workflow to your needs:**
- Google Docs is the default choice because it allows easy modification of the output, but you can output this in any format and/or store it wherever you want.
- I strongly suggest choosing one of the latest LLM models for better output quality.
- Both LLM prompts can be revised to better match your expectations.
by MilanWR
## Telegram n8n workflow (de)activator

### What does it do?
This workflow helps you quickly activate or deactivate a workflow through Telegram. Sometimes we are not able to access a PC to resolve an issue when something goes wrong with a workflow. If you, like me, use Telegram to send yourself error reports, you can react quickly in case of urgency. Just by sending '/stop' combined with the name you use for a workflow, you can deactivate that workflow, or reactivate it with '/start'. For example: '/stop marketing'.

Walkthrough: https://watch.screencastify.com/v/uWQ88gZKj57WTGOOqSW2 (6 min)

### Instructions
1. Create a Telegram API key through BotFather (https://t.me/botfather). Add it to the Telegram credentials.
2. For the n8n nodes, go to Settings in your n8n instance, then 'n8n API', and 'Create an API key'.
3. To ensure that only we can send commands to the bot, we need the chat ID of our DM with the newly created bot. Open the Telegram trigger and click on 'Listen to events'. Go to Telegram and send a direct message to the bot; this will trigger the Telegram node.
4. Go to the filter node and fill in the chat ID you want to filter for, using the data you got from the test event in the Telegram node.
5. In the first Switch node you can find the commands, in this case '/start' and '/stop'. When you send a message to your bot starting with either of those, it will go to the next Switch nodes.
6. Next it will check which other word the message contains. As an example I have used the words 'marketing' and 'sales', corresponding to a marketing and a sales workflow.
7. The last nodes will either activate or deactivate a workflow.
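For context, the final n8n nodes effectively call the n8n public API to toggle the target workflow. A hedged sketch, with the base URL and workflow id as placeholders:

```typescript
// Activate or deactivate a workflow via the n8n public API, authenticated
// with the API key created in step 2 of the instructions.
async function setWorkflowActive(
  baseUrl: string,   // e.g. "https://your-n8n-instance.example.com" (placeholder)
  apiKey: string,
  workflowId: string,
  active: boolean
) {
  const action = active ? "activate" : "deactivate";
  const res = await fetch(`${baseUrl}/api/v1/workflows/${workflowId}/${action}`, {
    method: "POST",
    headers: { "X-N8N-API-KEY": apiKey },
  });
  if (!res.ok) {
    throw new Error(`Failed to ${action} workflow ${workflowId}: ${res.status}`);
  }
  return res.json();
}
```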
by bangank36
This workflow captures Squarespace newsletter signups in a Google Sheet and automatically creates new Mailchimp contacts in the selected audience. It overcomes the limitation in Squarespace's native Mailchimp integration, which only supports new, empty audiences. You can trigger the workflow manually or schedule it for continuous synchronization.

### Step-by-step tutorial
1. First, you need to connect the Squarespace newsletter block submissions to Google Drive (see the next section).
2. In the Mailchimp node, choose your targeted audience in **List Name or ID**.

### Connect a Squarespace Form to Google Drive
To connect a form to Google Drive:
1. In the form's storage options, click Connect on Google Drive.
2. Log into your Google account.
3. Click Allow to permit Squarespace to connect to Google Drive.
4. Enter a Spreadsheet Name. This creates a new spreadsheet for your form submissions.

Columns in my sheet:
- Submitted On
- Email Address
- Name

This structure is inspired by Squarespace's newsletter block connection, but you can modify it based on your preferred data format.

👉 Clone my Google Sheets template

### Requirements

**Credentials**
To use this workflow, you need:
- **Mailchimp API Key** – Required to add contacts to Mailchimp.
- **Google Sheets API credentials** – Required to retrieve signups from the spreadsheet.

📌 Mailchimp API Authentication Guide

### Explore More Templates
👉 Check out my other n8n templates
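For each new signup row, the Mailchimp node effectively performs a call like the sketch below. This is a hedged illustration, not the node's internals: the data center prefix comes from the suffix of your API key, and `FNAME` assumes the audience's default first-name merge field.

```typescript
// Add a signup from the Google Sheet as a subscribed Mailchimp contact.
async function addMailchimpContact(
  dc: string,      // data center prefix, e.g. "us21" from the API key suffix
  apiKey: string,
  listId: string,  // audience chosen in the Mailchimp node
  email: string,
  name: string
) {
  const auth = Buffer.from(`anystring:${apiKey}`).toString("base64");
  const res = await fetch(`https://${dc}.api.mailchimp.com/3.0/lists/${listId}/members`, {
    method: "POST",
    headers: {
      Authorization: `Basic ${auth}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      email_address: email,
      status: "subscribed",
      merge_fields: { FNAME: name }, // assumes the default FNAME merge field
    }),
  });
  return res.json();
}
```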
by Oneclick AI Squad
### Description
Automates error detection and notification to prevent production downtime. Monitors incoming webhooks, filters critical errors, and triggers alerts or bug reports. Ensures rapid response to critical issues in real time.

### Essential Information
- Processes webhook triggers to detect errors instantly.
- Filters and categorizes errors as critical or non-critical.
- Sends Slack alerts for critical errors and creates Jira bugs as needed.

### System Architecture
**Error Detection Pipeline**
- Webhook Trigger: Captures incoming error data via POST requests.
- Filter Critical Errors: Identifies and separates critical errors.

**Alert Generation Flow**
- Send Slack Alert: Notifies the team via Slack for critical errors.
- Create Jira Bug: Logs critical errors as Jira issues.

**Non-Critical Handling**
- No Action for Non-Critical: Skips non-critical errors with no further action.

### Implementation Guide
1. Import the workflow JSON into n8n.
2. Configure the webhook URL and test with sample error data.
3. Set up Slack and Jira credentials for alerts and bug creation.
4. Test the error filtering and notification flows.
5. Monitor alert accuracy and adjust filter rules as needed.

### Technical Dependencies
- Webhook service for error data ingestion.
- Slack API for real-time notifications.
- Jira API for bug tracking and issue creation.
- n8n for workflow automation.

### Customization Possibilities
- Adjust the Filter Critical Errors node to refine error severity rules.
- Customize Slack alert messages in the Send Slack Alert node.
- Modify Jira issue templates in the Create Jira Bug node.
- Add a logging node to track all errors for analysis.
- Integrate with additional notification tools (e.g., email).
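The sketch below illustrates how the critical/non-critical split might look. The payload shape and severity rule are assumptions to adapt to whatever your services actually POST to the webhook and to the conditions configured in the Filter Critical Errors node.

```typescript
// Assumed shape of an incoming error event posted to the webhook.
interface ErrorEvent {
  service: string;
  severity: "critical" | "error" | "warning";
  message: string;
  timestamp: string;
}

// Critical events continue to the Slack alert and Jira bug branches;
// everything else takes the "No Action for Non-Critical" path.
function isCritical(event: ErrorEvent): boolean {
  return event.severity === "critical";
}
```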
by scrapeless official
### Brief Overview
This automation template helps you track the latest real estate listings from the LoopNet platform. By using Scrapeless to scrape property listings, n8n to orchestrate the workflow, and Google Sheets to store the results, you can build a real estate data pipeline that runs automatically on a weekly schedule.

### How It Works
- **Trigger on a Schedule:** The workflow runs automatically every week (can be adjusted to every 6 hours, daily, etc.).
- **Scrape Property Listings:** Scrapeless crawls the LoopNet real estate website and returns structured Markdown data.
- **Extract & Parse Content:** JavaScript nodes use regex to parse property titles, links, sizes, and year built from the Markdown (a hedged parsing sketch appears at the end of this section).
- **Flatten Data:** Each property listing becomes a single row with structured fields.
- **Save to Google Sheets:** Property data is appended to your Google Sheet for easy analysis, sharing, and reporting.

### Features
- No-code, automated real estate listing scraper.
- Scrapes and structures the latest commercial property listings (for sale or lease).
- Saves structured listing data directly to Google Sheets.
- Fully automated, scheduled scraping—no manual scraping is required.
- Extensible: add filters, deduplication, Slack/Email notifications, or multi-city scraping.

### Requirements
- **Scrapeless API Key:** Sign up on the Scrapeless Dashboard. Go to Settings → API Key Management → Create API Key, then copy the generated key.
- **n8n Instance:** Self-hosted or n8n.cloud account.
- **Google Account:** For Google Sheets API access.
- **Target Site:** This template is configured for LoopNet real estate listings but can be adapted for other property platforms like Crexi.

### Installation
1. Deploy n8n on your preferred platform.
2. Install the Scrapeless node from the community marketplace.
3. Import this workflow JSON file into your n8n workspace.
4. Create and add your Scrapeless API Key in n8n's credential manager.
5. Connect your Google Sheets account in n8n.
6. Update the target LoopNet URL and Google Sheet details.

### Usage
This automated real estate scraper is ideal for:

| Industry / Role        | Use Case                                                           |
| ---------------------- | ------------------------------------------------------------------ |
| Real Estate Agencies   | Monitor new commercial properties and streamline lead generation.  |
| Market Research Teams  | Track market dynamics and property availability in real time.      |
| BI/Data Analysts       | Automate data collection for dashboards and market insights.       |
| Investors              | Keep tabs on the latest commercial property opportunities.         |
| Automation Enthusiasts | Example use case for learning web scraping + automation.           |

### Output Example
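### Parsing Sketch

A minimal, hedged sketch of the Extract & Parse Content step, assuming Scrapeless returns standard Markdown links for each listing. The regex is an assumption about the page's Markdown shape; extend it with further patterns for size and year built based on the actual output you see in your workflow.

```typescript
// Pull listing titles and links out of the Markdown returned by Scrapeless.
function parseListings(markdown: string): Array<{ title: string; link: string }> {
  const listings: Array<{ title: string; link: string }> = [];
  // Assumed pattern: [Listing Title](https://www.loopnet.com/Listing/...)
  const linkPattern = /\[([^\]]+)\]\((https:\/\/www\.loopnet\.com\/Listing\/[^)\s]+)\)/g;
  let match: RegExpExecArray | null;
  while ((match = linkPattern.exec(markdown)) !== null) {
    listings.push({ title: match[1].trim(), link: match[2] });
  }
  return listings; // one object per property -> one appended row in Google Sheets
}
```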