by Flavien
Audio Generator – Documentation

🎯 Purpose: Generate audio files from text scripts stored in Google Drive.

🔁 Flow:
1. Receive repo IDs.
2. Fetch text scripts.
3. Generate .wav files using the local Bark model.
4. Upload back to Drive.

📦 Dependencies:
- Python script: /scripts/generate_voice.py
- Bark (voice generation system)
- n8n instance with access to a local shell
- Google Drive OAuth2 credentials

✏️ Notes:
- Script filenames must end with .txt
- Only works with plain text
- No external API used = 100% free

📦 /scripts/generate_voice.py:

```python
import sys
import re

import torch
import numpy
from bark import SAMPLE_RATE, generate_audio, preload_models
from scipy.io.wavfile import write as write_wav

# Patch to allow numpy._core.multiarray.scalar during checkpoint loading
torch.serialization.add_safe_globals([numpy._core.multiarray.scalar])

# Monkey-patch torch.load to force weights_only=False
_original_torch_load = torch.load

def patched_torch_load(f, *args, **kwargs):
    if 'weights_only' not in kwargs:
        kwargs['weights_only'] = False
    return _original_torch_load(f, *args, **kwargs)

torch.load = patched_torch_load

# Preload Bark models
preload_models()

def split_text(text, max_len=300):
    # Split on punctuation to avoid mid-sentence cuts
    sentences = re.split(r'(?<=[.?!])\s+', text)
    chunks = []
    current = ""
    for sentence in sentences:
        if len(current) + len(sentence) < max_len:
            current += sentence + " "
        else:
            chunks.append(current.strip())
            current = sentence + " "
    if current:
        chunks.append(current.strip())
    return chunks

# Input text file and output path
input_text_path = sys.argv[1]
output_wav_path = sys.argv[2]

with open(input_text_path, 'r', encoding='utf-8') as f:
    full_text = f.read()

voice_preset = "v2/en_speaker_7"
chunks = split_text(full_text)

# Generate and concatenate audio chunks
audio_arrays = []
for chunk in chunks:
    print(f"Generating audio for chunk: {chunk[:50]}...")
    audio = generate_audio(chunk, history_prompt=voice_preset)
    audio_arrays.append(audio)

# Merge all audio chunks
final_audio = numpy.concatenate(audio_arrays)

# Write the final .wav file
write_wav(output_wav_path, SAMPLE_RATE, final_audio)
print(f"Full audio generated at: {output_wav_path}")
```
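A minimal sketch of how an Execute Command (or similar) step in n8n might invoke the script; the file paths are placeholders for whatever your workflow downloads from and uploads back to Drive:

```python
import subprocess

# Hypothetical paths – replace with the files your n8n workflow handles.
input_txt = "/tmp/script_01.txt"
output_wav = "/tmp/script_01.wav"

# Equivalent to the shell command an Execute Command node would run:
#   python3 /scripts/generate_voice.py /tmp/script_01.txt /tmp/script_01.wav
subprocess.run(
    ["python3", "/scripts/generate_voice.py", input_txt, output_wav],
    check=True,
)
```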
by bangank36
This workflow automates the Mark as Fulfilled action in Squarespace for each order, ensuring a seamless fulfillment process without manual intervention.

How It Works
This workflow retrieves all pending Squarespace orders and processes their fulfillment automatically. It follows these steps:
1️⃣ Get all pending orders using the HTTP Request node (since Squarespace does not have an n8n node)
2️⃣ Create a fulfillment request using the Fulfill Order node

The Filter Orders node can be used to filter the valid pending orders to process.

Step-by-step
The workflow can be run on demand or on a schedule. You can adjust these parameters within the Global and Filter nodes:

Global node (API settings)
- **api-version** (string, required) – The current API version (see the Squarespace Orders API documentation).
- **modifiedAfter**={a-datetime} (string, conditional) – Fetch orders modified after a specific date (ISO 8601 format).
- **modifiedBefore**={b-datetime} (string, conditional) – Fetch orders modified before a specific date (ISO 8601 format).
- **cursor**={c} (string, conditional) – Used for pagination; cannot be combined with other filters.
- **fulfillmentStatus**={status} (optional, enum) – Filter by fulfillment status: PENDING, FULFILLED, or CANCELED.
- **maxPage** – Set to -1 to enable infinite pagination and fetch all available orders.

Filter Orders node
- Order Filtering – Ensures only valid orders are fulfilled, particularly useful if:
  - You sell digital downloads or gift cards exclusively.
  - You use third-party fulfillment services for all products.

Requirements
Credentials
To use this workflow, you need:
- Squarespace API Key – Retrieve it from your Squarespace settings.

Who Is This For?
- Squarespace store owners looking to automate their fulfillment process.
- Merchants selling digital or personalized products who need instant fulfillment.

Explore More Templates
- Get all orders in Squarespace to Google Sheets
- Convert Squarespace Profiles to Shopify Customers in Google Sheets
- Fetch Squarespace Blog & Event Collections to Google Sheets

👉 Check out my other n8n templates
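A minimal Python sketch of the two calls the workflow makes, assuming a personal Squarespace API key; the exact fulfillment body fields should be verified against the Squarespace Orders API documentation:

```python
import requests

API_KEY = "YOUR_SQUARESPACE_API_KEY"  # assumption: personal API key from Squarespace settings
BASE = "https://api.squarespace.com/1.0/commerce/orders"
headers = {"Authorization": f"Bearer {API_KEY}", "User-Agent": "n8n-fulfillment-demo"}

# 1) Fetch pending orders (mirrors the HTTP Request node)
resp = requests.get(BASE, headers=headers, params={"fulfillmentStatus": "PENDING"})
resp.raise_for_status()
orders = resp.json().get("result", [])

# 2) Mark each order as fulfilled (mirrors the Fulfill Order node)
for order in orders:
    fulfill = requests.post(
        f"{BASE}/{order['id']}/fulfillments",
        headers=headers,
        json={"shouldSendNotification": False, "shipments": []},  # illustrative body
    )
    fulfill.raise_for_status()
```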
by PiAPI
What's the workflow used for?
Leverage this Kling API (unofficial) workflow provided by PiAPI to streamline virtual try-on video creation. This tool is designed for e-commerce platforms, fashion brands, content creators, and influencers. By uploading model and clothing images and linking a PiAPI account, users can swiftly generate a realistic video of the model sporting the outfit with a 360° turn, offering an immersive viewing experience.

Step-by-step Instructions
1. For basic virtual try-on settings, check the API doc for best practices.
2. Fill in the X-API-Key of your PiAPI account in the Preset Parameters node.
3. Upload the model photo and provide the target clothing image URLs.
4. Click Test Workflow to generate the virtual try-on image.
5. Get the video output in the final node.

Param Settings
- To change into a dress, input the model_input URL and the dress_input URL in the parameters.
- To change into separates, input the model_input URL, upper_input URL, and lower_input URL in Preset Parameters.

Use Case
Input images:
Output video:
The output demonstrates that the model is wearing the clothing from the specified image and showcases a rotating, runway-style view. This workflow lets you efficiently test garment-on-model presentation effects while reducing business-model validation costs.
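A minimal sketch of the request the Preset Parameters and task nodes would issue. The endpoint path, task type, and input field names here are assumptions for illustration only; consult the PiAPI docs for the exact schema:

```python
import requests

API_KEY = "YOUR_PIAPI_X_API_KEY"

payload = {
    "model": "kling",
    "task_type": "ai_try_on",          # hypothetical task type name
    "input": {
        "model_input": "https://example.com/model.jpg",
        "dress_input": "https://example.com/dress.jpg",
    },
}

resp = requests.post(
    "https://api.piapi.ai/api/v1/task",   # assumed PiAPI task endpoint
    headers={"x-api-key": API_KEY},
    json=payload,
)
resp.raise_for_status()
print(resp.json())  # returns a task id to poll for the generated try-on image/video
```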
by David w/ SimpleGrow
Scheduled Trigger: Every X day at Y pm, the workflow is automatically triggered.

Fetch User Data: The workflow retrieves all user records from the "WhatsApp Engagement Database" in Airtable. Each record contains the user’s WhatsApp ID, current points, and the number of raffle vouchers.

Personalized Message Preparation: For each user, a personalized WhatsApp message is prepared. The message includes:
- The user’s current point total
- The number of raffle vouchers they have for the week
- Encouragement to keep engaging for more chances to win
- Information about the weekly raffle and available prizes

Send WhatsApp Message: The workflow sends this personalized message to each user via the Whapi API, using their WhatsApp ID (see the sketch below).

Result: Every active user receives a weekly update about their engagement status, raffle tickets, and a motivational message to encourage further participation. This helps boost engagement and keeps users informed about their progress and chances in the weekly raffle.
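A minimal sketch of the message preparation and the Whapi call; the Airtable field names are illustrative, and the endpoint and body fields reflect Whapi's text-message API as commonly documented, so verify them against your Whapi account:

```python
import requests

WHAPI_TOKEN = "YOUR_WHAPI_TOKEN"  # assumption: channel token from your Whapi account

# Example record as it might come from the Airtable node (field names are illustrative)
user = {"whatsapp_id": "15551234567", "points": 120, "vouchers": 3}

message = (
    f"Hi! You currently have {user['points']} points and "
    f"{user['vouchers']} raffle vouchers this week. "
    "Keep engaging for more chances to win in the weekly raffle!"
)

resp = requests.post(
    "https://gate.whapi.cloud/messages/text",   # assumed Whapi text-message endpoint
    headers={"Authorization": f"Bearer {WHAPI_TOKEN}"},
    json={"to": user["whatsapp_id"], "body": message},
)
resp.raise_for_status()
```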
by Pablo
Get Scaleway Server Info with Dynamic Filtering

Description
This workflow is designed for developers, system administrators, and DevOps engineers who need to retrieve and filter Scaleway server information quickly and efficiently. It gathers data from Scaleway instances and baremetal servers across multiple zones and is ideal for:
- Quickly identifying servers by tags, names, public IPs, or zones.
- Automating server status checks in production, staging, or test environments.
- Integrating Scaleway data into broader monitoring or inventory systems.

High-Level Steps
- **Webhook Trigger:** Receives an HTTP POST request (with basic authentication) containing the search criteria (search_by and search).
- **Server Data Collection:** Fetches server data from Scaleway’s API endpoints for both instances and baremetal servers across the defined zones.
- **Data Processing:** Aggregates and normalizes the fetched data using a Code node with helper functions.
- **Dynamic Filtering:** Routes data to dedicated filtering routines (by tags, name, public_ip, or zone) based on the input criteria.
- **Response:** Returns the filtered data (or an error message) via a webhook response.

Set Up Steps
1. Insert your Scaleway token: in the "Edit Fields" node, replace the placeholder "Your personal Scaleway X Auth Token" with your Scaleway API token.
2. Configure zones: review or update the zone lists (ZONE_INSTANCE and ZONE_BAREMETAL) to suit your environment.
3. Send a request: make a POST request to the workflow’s webhook endpoint with a JSON payload, for example: { "search_by": "tags", "search": "Apiv1" }
4. View the results: the workflow returns a JSON array of servers matching your criteria, including details like name, tags, public IP, type, state, zone, and user.
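A minimal sketch of a client calling the workflow's webhook; the URL, webhook path, basic-auth credentials, and exact response field names are placeholders to adapt to your n8n instance:

```python
import requests

# Assumption: your webhook URL and basic-auth credentials
WEBHOOK_URL = "https://your-n8n-host/webhook/scaleway-server-info"

resp = requests.post(
    WEBHOOK_URL,
    auth=("webhook_user", "webhook_password"),   # basic auth configured on the Webhook node
    json={"search_by": "tags", "search": "Apiv1"},
)
resp.raise_for_status()

# Print a short summary of each matching server
for server in resp.json():
    print(server.get("name"), server.get("zone"), server.get("public_ip"), server.get("state"))
```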
by Eduard
Are you a visual thinker working with n8n? 🎨 View and understand workflow structures at a glance with this template! Built with mermaid.js, Bootstrap 5, and AJAX to create an interactive web page displaying n8n workflows as flowcharts. 🌟 Perfect for documentation, presentations, or just getting a clearer picture of your automation processes. Need customization help? Reach out to Eduard!

Benefits
📊 Instant workflow visualization
📱 Responsive design
🔗 Direct links to n8n workflows
🧩 Special shapes for different node types
🚫 Disabled node indication
🔒 No external dependencies – just paste the workflow and call the webhook
🛠️ Easily customizable – enhance the JS script or add custom styling

⚠️ Important note for cloud users ⚠️
Since the cloud version doesn't support environment variables, please make the following changes in the CONFIG node:
- Update the instance_url variable: enter your n8n URL instead of {{$env["N8N_PROTOCOL"]}}://{{$env["N8N_HOST"]}}
- Change the webhook_path to simply "webhook" instead of {{$env["N8N_ENDPOINT_WEBHOOK"] || "webhook"}}

🌟 Examples
- Multiple flowcharts on a single page
- Several shapes for different nodes
- Langchain nodes with special connections styling
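To illustrate the core idea, here is a minimal Python sketch (the template itself uses a JS script) that turns a standard n8n workflow export into a mermaid.js flowchart definition; it only follows the "main" connections and skips the special Langchain connection styling:

```python
import json

def workflow_to_mermaid(workflow: dict) -> str:
    lines = ["flowchart LR"]
    ids = {node["name"]: f"n{i}" for i, node in enumerate(workflow["nodes"])}

    # One mermaid node per n8n node
    for node in workflow["nodes"]:
        label = node["name"].replace('"', "'")
        lines.append(f'    {ids[node["name"]]}["{label}"]')

    # connections: {source_name: {"main": [[{"node": target_name, ...}, ...], ...]}}
    for source, outputs in workflow.get("connections", {}).items():
        for branch in outputs.get("main", []):
            for target in branch or []:
                lines.append(f'    {ids[source]} --> {ids[target["node"]]}')

    return "\n".join(lines)

with open("my_workflow.json") as f:
    print(workflow_to_mermaid(json.load(f)))
```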
by Jonathan | NEX
Effortlessly integrate the NixGuard API into your n8n workflows for real-time security insights using your API key. This connector enables seamless interaction with Nix, providing rapid Retrieval-Augmented Generation (RAG) event knowledge with Wazuh integration – completely free and set up in under 5 minutes!

🚀 Features:
✅ Query NixGuard's AI-driven security insights via API authentication
✅ Real-time security event knowledge integration
✅ Plug-and-play workflow trigger for effortless automation
✅ Wazuh compatibility for full security visibility

🛠 How to Use:
1️⃣ Add your API key to authenticate with NixGuard.
2️⃣ Integrate with your existing n8n workflows using the workflow trigger (enabled by default).
3️⃣ (Optional) Activate the chat trigger to streamline security queries via chat-based inputs.
4️⃣ Run the workflow and get instant security intelligence!

📢 Perfect for: startup CTOs, SOC teams, security engineers, and developers needing real-time security automation within their infrastructure.

🔗 Learn more about NixGuard: thenex.world
🔗 Get started with a free security subscription: thenex.world/security/subscribe
by Klaasjan te Voortwis
This n8n workflow template uses community nodes and is only compatible with the self-hosted version of n8n.

Export workflows with readable names, tagged for different environments
To ensure understandable workflow exports, ease of use in delivery pipelines, and a better developer experience, this workflow helps with exporting workflows.

Inner workings
1. First, the workflow ensures that the directory structure for storing the workflows is correct.
2. It exports all workflows.
3. Next, it processes all workflow files and stores them with readable names (see the renaming sketch below).
4. Based on tags, it also exports to dev and prod folders for easy commit and usage in a delivery pipeline.

Configuration
No special setup is required for readable exporting.

Usage
1. Create a workflow and tag it with 'Auto deploy to dev'.
2. Run the workflow; this creates the needed folders and workflow files with readable names.
3. Commit these in your version control.
4. Have a CI/CD pipeline build an n8n container – see the attached Dockerfile.
5. Check our Auto Starter workflow for auto-starting workflows after deployment.

CI/CD Bonus
Attached are two nodes with example configuration for building your own automated n8n deployment:
- A Dockerfile, to get the new entrypoint and exported workflows packaged in the container.
- An updated entrypoint to build your own container, import the workflows, and run the Auto Starter.

Set the following environment variables:
- STARTUP_WORKFLOWS_LOAD_LOCATION: the folder to import from, to distinguish between environments.
- STARTUP_WORKFLOW_ID: the ID of the workflow to run after starting n8n.

> Note: The 'Instance Started' n8n trigger won't work, as all workflows are disabled upon import.
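A minimal sketch of the renaming and tag-based copying step, assuming workflows were exported as separate JSON files (for example via `n8n export:workflow --all --separate`); the folder names and the 'Auto deploy to prod' tag are assumptions, only 'Auto deploy to dev' is named above:

```python
import json
import re
import shutil
from pathlib import Path

EXPORT_DIR = Path("./export")
TARGETS = {"Auto deploy to dev": Path("./dev"), "Auto deploy to prod": Path("./prod")}

for target in TARGETS.values():
    target.mkdir(parents=True, exist_ok=True)

for file in EXPORT_DIR.glob("*.json"):
    workflow = json.loads(file.read_text())

    # Build a readable, filesystem-safe filename from the workflow title
    readable = re.sub(r"[^A-Za-z0-9_-]+", "_", workflow["name"]).strip("_") + ".json"
    renamed = file.with_name(readable)
    file.rename(renamed)

    # Copy into dev/prod folders based on the workflow's tags
    tags = {tag["name"] for tag in workflow.get("tags", [])}
    for tag_name, target in TARGETS.items():
        if tag_name in tags:
            shutil.copy(renamed, target / readable)
```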
by Oneclick AI Squad
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This automated n8n workflow streamlines the process of screening CVs and validating candidate information using AI and email parsing. The system listens for new emails with CV attachments, extracts and processes the data, and either saves valid CVs to a target directory or notifies HR of invalid submissions.

Good to Know
- The workflow improves efficiency by automating CV screening and validation.
- Only CVs with the essential fields (e.g., name, email, skills) are processed further.
- Email notifications alert HR to incomplete or invalid CVs for timely follow-up.
- The system pauses until all CV data is fully loaded to avoid processing errors.

How It Works
1. Trigger on New CV Email – Detects new emails with CV attachments.
2. Extract Text from PDF CV – Parses content from attached PDF files.
3. Ensure All CV Data Loaded – Waits until all data is ready for processing.
4. Parse & Structure CV Information – Extracts structured details like name, skills, and experience using AI or custom logic.
5. Check CV for Required Fields – Verifies the presence of essential fields (e.g., name, email, skills); a sketch of this check follows below.
6. Save Valid CV to Folder – Stores successfully validated CVs in a target directory.
7. Notify HR of Invalid CV – Sends an email alert for incomplete or invalid CVs.

Data Sources
The workflow processes data from email attachments:
- CV PDF files – contain candidate information in PDF format.

How to Use
1. Import the workflow into n8n.
2. Configure email account credentials for monitoring new CV emails.
3. Set up a target directory for storing validated CVs.
4. Test with sample CV PDFs to verify extraction and validation.
5. Adjust the AI or custom logic based on your specific required fields.
6. Monitor email notifications for invalid CVs and refine the process as needed.

Requirements
- Email account access with IMAP/POP3 support.
- PDF parsing capabilities (e.g., OCR or text extraction tools).
- AI or custom logic for data extraction and validation.
- A target directory for storing validated CVs.

Customizing This Workflow
- Modify the "Check CV for Required Fields" node to include additional required fields (e.g., education, certifications).
- Adjust the email notification format to include more details about invalid CVs.
- Integrate with HR software for seamless candidate tracking.

Details
The workflow ensures efficient CV screening by automating repetitive tasks. Notifications help maintain a high-quality candidate pool by addressing issues early.
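A minimal sketch of the "Check CV for Required Fields" logic; the parsed-CV field names are illustrative, so adapt them to whatever your parsing step produces:

```python
import re

REQUIRED_FIELDS = ["name", "email", "skills"]

def validate_cv(parsed_cv: dict) -> tuple[bool, list[str]]:
    # Collect any required field that is missing or empty
    missing = [field for field in REQUIRED_FIELDS if not parsed_cv.get(field)]

    # Basic sanity check on the email format
    email = parsed_cv.get("email", "")
    if email and not re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", email):
        missing.append("email (invalid format)")

    return (len(missing) == 0, missing)

is_valid, missing = validate_cv(
    {"name": "Jane Doe", "email": "jane@example.com", "skills": ["Python", "SQL"]}
)
print(is_valid, missing)  # -> True []
```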
by Oneclick AI Squad
This automated n8n workflow efficiently manages the setup, creation, and deletion of PostgreSQL and MySQL databases on a Linux server, executing tasks in approximately 10 seconds. It automates installation, configuration, and user management, with support for remote access.

Core Elements
- **Set Parameters** – Defines server details, database type, action, and credentials
- **Type Check** – Confirms the selected database type
- **PostgreSQL Action Check** – Identifies the action for PostgreSQL
- **PostgreSQL Create Check** – Validates creation conditions for PostgreSQL
- **Install PostgreSQL** – Sets up and configures PostgreSQL
- **Create PostgreSQL DB** – Establishes a new PostgreSQL database with user access
- **Delete PostgreSQL DB** – Removes a PostgreSQL database and user
- **MySQL Action Check** – Identifies the action for MySQL
- **MySQL Create Check** – Validates creation conditions for MySQL
- **Install MySQL** – Sets up and configures MySQL
- **Create MySQL DB** – Establishes a new MySQL database with user access
- **Delete MySQL DB** – Removes a MySQL database and user
- **Format Output** – Structures the final workflow output

Getting Started Guide
1. Import the workflow into n8n.
2. Adjust the parameters in the Set Parameters node.
3. Execute the workflow.
4. Confirm the database operation on the server.

Necessary Requirements
- SSH-enabled Linux server
- Root-level access rights

Customization Options
- Switch db_type between PostgreSQL and MySQL.
- Select the action (install, create, delete) via the action parameter.
- Tailor database_name, db_user, and db_password as needed.

Features
- Install Database Server – Deploys PostgreSQL or MySQL with optimal configuration
- Create Database – Generates new databases with assigned users and permissions
- Delete Database – Removes databases and their associated users

Parameters to Configure
- server_host: your Linux server's IP address
- server_user: SSH username (typically 'root')
- server_password: SSH password
- db_type: select 'postgresql' or 'mysql'
- action: select 'install', 'create', or 'delete'
- database_name: name of the database to create or delete
- db_user: database username
- db_password: database password

How to Use
1. Copy the JSON code from the artifact.
2. Access your n8n workspace.
3. Choose "Import from JSON" or "+" → "From JSON".
4. Insert the JSON code.
5. Set the parameters in the "Set Parameters" node with your server information.
6. Run the workflow.

Workflow Actions
- Install: Sets up the database server, enables remote access, and initializes the database.
- Create: Establishes a new database with a specific user (see the command sketch below).
- Delete: Erases the database and its associated user.

The workflow automatically manages:
- Ubuntu/Debian package setup
- Service initialization and configuration
- Remote access setup
- User and permission assignments
- Authentication configuration

Update the parameters in the "Set Parameters" node with your server specifics and execute the workflow!
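A minimal sketch of what the 'create' action boils down to for PostgreSQL, run over password-based SSH; the parameter values are placeholders that mirror the Set Parameters node, and the exact commands the workflow runs may differ:

```python
import paramiko

params = {
    "server_host": "203.0.113.10",
    "server_user": "root",
    "server_password": "********",
    "database_name": "app_db",
    "db_user": "app_user",
    "db_password": "s3cret",
}

# CREATE DATABASE cannot run inside a transaction block, so issue statements separately
statements = [
    f"CREATE DATABASE {params['database_name']};",
    f"CREATE USER {params['db_user']} WITH ENCRYPTED PASSWORD '{params['db_password']}';",
    f"GRANT ALL PRIVILEGES ON DATABASE {params['database_name']} TO {params['db_user']};",
]

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(params["server_host"], username=params["server_user"],
               password=params["server_password"])

for sql in statements:
    _, stdout, stderr = client.exec_command(f'sudo -u postgres psql -c "{sql}"')
    print(stdout.read().decode(), stderr.read().decode())

client.close()
```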
by CustomJS
This n8n workflow illustrates how to convert PDF files into text with the PDF Toolkit from www.customjs.space (@custom-js/n8n-nodes-pdf-toolkit).

Notice
Community nodes can only be installed on self-hosted instances of n8n.

What this workflow does
- Converts the requested HTML to PDF.
- Extracts text from the PDF.
- Uses a Code node to handle URLs that point to PDF files.
- Converts the PDF to text.

Requirements
- Self-hosted n8n instance.
- CustomJS API key for converting PDF to text.
- HTML data to convert to PDF.
- Code node for handling URLs that point to PDF files.

Workflow Steps
1. Manual Trigger: runs on user interaction.
2. HTML to PDF: request the HTML data, then convert it to PDF.
3. Convert PDF to Text: extract the text from the generated PDF.

Usage
Get an API key from CustomJS:
1. Sign up to the CustomJS platform.
2. Navigate to your profile page.
3. Press the "Show" button to get your API key.

Set credentials for the CustomJS API in n8n:
- Copy and paste the API key generated from CustomJS.

Design the workflow:
- A Manual Trigger for starting the workflow.
- HTTP Request nodes for downloading PDF files.
- A Code node for handling URLs that point to PDF files (a sketch of this check follows below).
- Convert PDF to Text.

You can replace the logic for triggering and returning results. For example, you can trigger this workflow by calling a webhook and get the result back as the webhook response. Simply replace the Manual Trigger and Write to Disk nodes.
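A minimal sketch of the URL check the Code node performs: decide whether a URL already points to a PDF (by extension or Content-Type) or to HTML that still needs converting. n8n Code nodes are typically JavaScript; this Python version only illustrates the logic:

```python
import requests

def is_pdf_url(url: str) -> bool:
    # Cheap check on the path, ignoring any query string
    if url.lower().split("?")[0].endswith(".pdf"):
        return True
    # Fall back to the Content-Type reported by the server
    head = requests.head(url, allow_redirects=True, timeout=10)
    return head.headers.get("Content-Type", "").startswith("application/pdf")

url = "https://example.com/report.pdf"
print("extract text directly" if is_pdf_url(url) else "convert HTML to PDF first")
```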
by Yaron Been
This workflow provides automated access to the Fire Part Crafter AI model through the Replicate API. It saves you time by eliminating the need to interact with AI models manually and integrates generation tasks seamlessly into your n8n automation workflows.

Overview
This workflow handles the complete generation process using the Fire Part Crafter model. It manages API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation.

Model Description: PartCrafter is a structured 3D mesh generation model that creates multiple parts and objects from a single RGB image.

Key Capabilities
- Structured 3D part and object generation from a single RGB image
- Advanced AI-powered visual content creation
- Customizable generation parameters

Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Replicate API**: Access to the fire/part-crafter AI model
- **Fire Part Crafter**: The core AI model for generation
- **Built-in Error Handling**: Automatic retry logic and comprehensive error management

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance.
2. Configure Replicate API: Add your Replicate API token to the 'Set API Token' node.
3. Customize Parameters: Adjust the model parameters in the 'Set Image Parameters' node.
4. Test the Workflow: Run the workflow with your desired inputs (a sketch of the underlying API call follows below).
5. Integrate: Connect this workflow to your existing automation pipelines.

Use Cases
- Content Creation: Generate unique visuals for blogs, social media, and marketing materials
- Design Prototyping: Create visual concepts and mockups for design projects
- Art & Creativity: Produce artistic output for personal or commercial use
- Marketing Materials: Generate eye-catching visuals for campaigns and advertisements

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Replicate API: https://replicate.com (Sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #imagegeneration #aiart #texttoimage #visualcontent #aiimages #generativeart #machinelearning #artificialintelligence #aitools #automation #digitalart #contentcreation #productivity #innovation
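A minimal sketch of the Replicate call the workflow automates, using the official Python client. The model identifier comes from the description above, but the input parameter names (e.g. "image") and whether a version hash is required are assumptions; check the model's page on Replicate for the exact schema:

```python
import replicate  # pip install replicate; reads the REPLICATE_API_TOKEN env var

output = replicate.run(
    "fire/part-crafter",                              # may need ":<version>" appended
    input={"image": "https://example.com/object.jpg"},  # illustrative input field
)
print(output)  # typically a URL (or list of URLs) pointing to the generated result
```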