by Yaron Been
**Description**

This workflow automatically generates Facebook ad headlines for your product using OpenAI and evaluates their quality against custom, AI-generated criteria. It delivers high-quality, scroll-stopping headlines without needing a copywriter.

**Overview**

This workflow captures a product description via a form, generates a Facebook ad headline, invents a scoring rubric, evaluates the headline against it, and optionally loops for revisions, all autonomously (the iteration gate is sketched at the end of this entry). Ideal for marketers and media buyers looking to scale creative testing.

**Tools Used**

- **n8n**: The automation platform that powers and orchestrates the entire workflow.
- **OpenAI**: Used for headline generation, scoring-criteria creation, and evaluation logic.
- **(Optional) Google Sheets / Notion / Email**: For logging approved headlines or sharing results.

**How to Install**

1. **Import the Workflow**: Download the .json file and import it into your n8n instance.
2. **Connect OpenAI**: Add your OpenAI credentials to the GPT nodes.
3. **Customize the Prompt (optional)**: Tweak the system prompt inside the Set_PromptForHeadline node.
4. **Add Output Handling (optional)**: Connect the "NO" path of the If_NeedMoreIterations node to Google Sheets, Slack, etc.
5. **(Optional)**: Add loop limits or storage logic to manage iterations or save results.

**Use Cases**

- **Media Buyers**: Generate and test hooks at scale with no creative bottlenecks.
- **Solo Marketers**: Get high-converting headlines even without a copywriter.
- **Agencies**: Streamline copy testing and evaluation in client campaigns.
- **Startup Teams**: Automate creative generation during product launches or A/B tests.

**Connect with Me**

- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/

**Hashtags**

#n8n #openai #automation #copywriting #facebookads #headlines #aicopy #promptengineering #marketingautomation #nocode #llm #creativeautomation #mediabuying #adtesting #adcreative #marketingtools #digitalmarketing #copytesting #scalablecreative #chatgpt #adhooks #growthmarketing #automatedworkflows #aiworkflow #creativeops #marketingops #growthtools
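A minimal sketch of the iteration gate that the If_NeedMoreIterations node routes on, written as an n8n Code node. The input field names (headline, score, iteration), the passing score, and the loop limit are illustrative assumptions, not values taken from the workflow itself:

```javascript
// n8n Code node: decide whether the headline needs another revision pass.
// Assumed input shape: { headline, score, iteration } from the evaluation step.
const { headline, score = 0, iteration = 0 } = items[0].json;

const MAX_ITERATIONS = 3; // hypothetical loop limit
const PASSING_SCORE = 8;  // hypothetical threshold on a 1-10 rubric

const needsMoreIterations = score < PASSING_SCORE && iteration < MAX_ITERATIONS;

return [{
  json: {
    headline,
    score,
    iteration: iteration + 1,
    needsMoreIterations, // route on this field in If_NeedMoreIterations
  },
}];
```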
by WeblineIndia
This workflow automatically fetches newly uploaded files from a specific folder in Google Drive, shares them via email with specified recipients, and logs the file details (name, ID, created time, modified time) into Airtable for easy tracking. It streamlines file sharing and management while keeping important metadata in one central place.

**Step-by-Step Instructions**

1. **Google Drive Node (Fetch New File)**
   - Action: Fetches newly uploaded files from the Google Drive folder you specify.
   - Configuration: Set the folder ID in the Google Drive node where the files are uploaded, and use the "New File in Folder" trigger to automatically detect new files added to the folder.

2. **Send Email Node (Share File via Email)**
   - Action: After a new file is detected, this node shares it via email with the recipient you specify.
   - Configuration: Set the recipient's email address. Include the file URL from the Google Drive node in the email body for easy access, and add the file name to the subject or body so the recipient knows which file arrived.

3. **Airtable Node (Store File Metadata)**
   - Action: Stores the file's metadata (name, ID, creation time, modification time, and the email address it was sent to) in your Airtable base.
   - Configuration: Set up Airtable with a table, map the output from the Google Drive node to the metadata fields (see the sketch after this entry), and use the email address from the email node for tracking.

**About WeblineIndia**

WeblineIndia specializes in delivering innovative and custom AI solutions to simplify and automate business processes. If you need any help, please reach out to us.
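A minimal sketch of how a Code node could reshape the Google Drive trigger output into the Airtable record described in step 3. The Drive field names follow the Google Drive v3 API; the Airtable column names and the recipient address are illustrative assumptions:

```javascript
// n8n Code node: map Google Drive file metadata to an Airtable record.
const file = items[0].json;

return [{
  json: {
    // Assumed Airtable column names; rename these to match your table.
    "File Name": file.name,
    "File ID": file.id,
    "Created Time": file.createdTime,
    "Modified Time": file.modifiedTime,
    "Sent To": "recipient@example.com", // or pull from the email node's output
  },
}];
```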
by Davide
**Workflow Overview**

This workflow automates the creation and management of a custom OpenAI Assistant for a travel agency ("Travel with us"), using Google Drive for document storage.

**How It Works**

1. **Create the OpenAI Assistant**
   - **Node**: OpenAI
   - Action: Creates a custom assistant named "Travel with us" Assistant using the gpt-4o-mini model.
   - Instructions: Respond only using the provided document (e.g., agency-specific info); stay friendly, brief, and focused on travel-related queries; politely ignore irrelevant questions.
   - Credentials: Requires an OpenAI API key.

2. **Upload the Agency Document**
   - **Google Drive Node**: Downloads a Google Doc as a PDF.
   - **OpenAI2 Node**: Uploads the PDF to OpenAI with purpose: "assistants". The upload returns a file_id.

3. **Update the Assistant with the Document**
   - **OpenAI Node**: Updates the assistant to include the uploaded file (see the sketch after this entry).

4. **Chat Interaction**
   - **Chat Trigger**: Activates when a message is received ("When chat message received").
   - **OpenAI Assistant Node**: Uses the updated assistant to respond to user queries.
   - **Memory**: Window Buffer Memory retains chat context for coherent conversations.

**Set Up Steps**

1. **Prepare the Document**: Store your travel agency guide in Google Drive (e.g., as a Google Doc) and update the Google Drive node with your document's ID.
2. **Configure Credentials**: Connect Google Drive via OAuth2 (googleDriveOAuth2Api) and add your OpenAI API key to all OpenAI nodes.
3. **Customize the Assistant**: Modify the instructions in the OpenAI node to reflect your agency's needs, and make sure the document includes FAQs, policies, and travel info.
4. **Test the Workflow**: Trigger manually ("Test workflow") to create the assistant and upload the file, then send a chat message (e.g., "What are your travel packages?") to test responses.

**Dependencies**

- **Google Drive account**: To store and retrieve the agency document.
- **OpenAI API access**: For assistant creation and file uploads.
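A minimal sketch of the glue between steps 2 and 3: a Code node that takes the file_id returned by the upload and shapes a body for the assistant-update call. The exact payload shape differs between Assistants API versions (newer versions nest files under tool_resources), so treat the field names here as assumptions and check the OpenAI docs for your version:

```javascript
// n8n Code node: build the assistant-update body from the upload response.
const uploadResponse = items[0].json; // output of the OpenAI2 (file upload) node

if (uploadResponse.purpose !== 'assistants') {
  throw new Error('Uploaded file is not marked for assistant use');
}

return [{
  json: {
    // Assumed payload shape; verify against your Assistants API version.
    file_ids: [uploadResponse.id],
  },
}];
```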
by Arnaud MARIE
**Replicate Line Items on New Deal in HubSpot**

**Use Case**

This workflow solves the problem of manually copying line items from one deal to another in HubSpot, reducing manual work and minimizing errors.

**What this workflow does**

- **Triggers** upon receiving a webhook with deal IDs.
- **Retrieves** the IDs of the won and created deals.
- **Fetches** line items associated with the won deal.
- **Extracts** product SKUs from the retrieved line items.
- **Fetches** product details based on SKUs.
- **Creates** new line items for the created deal and associates them.
- **Sends** a Slack notification with success details.

**Setup steps**

1. Create a HubSpot deal workflow:
   1.1 Set up your trigger (e.g., when deal stage = Won).
   1.2 Add a step: Create Record (deal).
   1.3 Add a step: Send webhook. The webhook should be a GET request to your workflow's first n8n trigger. Set two query parameters (the sketch after this entry shows how to read them in n8n):
       - deal_id_won: the Record ID of the deal triggering the HubSpot workflow
       - deal_id_create: the Record ID of the deal created above (click Insert Data -> The created object)
2. Set up your HubSpot app token in HubSpot -> Settings -> Integration -> Private Apps.
3. Set up your HubSpot token integration using the predefined model.
4. Set up your Slack connection.
5. Add an error workflow to monitor errors.
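As referenced in step 1.3, a minimal sketch of reading the two query parameters in a Code node placed right after the n8n Webhook trigger. The query object is where n8n's Webhook node exposes GET parameters; the validation logic is illustrative:

```javascript
// n8n Code node: extract the deal IDs sent by the HubSpot workflow.
const query = items[0].json.query ?? {};
const dealIdWon = query.deal_id_won;
const dealIdCreate = query.deal_id_create;

if (!dealIdWon || !dealIdCreate) {
  throw new Error('Missing deal_id_won or deal_id_create query parameter');
}

return [{ json: { dealIdWon, dealIdCreate } }];
```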
by Jonathan
This workflow automatically posts a message in Slack when a new invoice is created in Stripe, and it updates the corresponding fields in the HubSpot CRM.

**Prerequisites**

- A Slack account and credentials
- A HubSpot account and credentials
- A Stripe account and credentials

**Nodes**

- **Stripe Trigger** node triggers the workflow when a new invoice is created.
- **IF** nodes filter out invoices that have no PO number, and check whether a deal exists for the PO (the first check is sketched below).
- **HubSpot** nodes retrieve deals with the specific PO number and update the deal status to 'paid'.
- **Slack** nodes post messages about the deals in a Slack channel.
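A minimal sketch of the first IF-style check as a Code node, assuming the PO number lives in the Stripe invoice's metadata. Where the PO number is actually stored is workflow-specific, so treat metadata.po_number as an assumption:

```javascript
// n8n Code node: flag whether the incoming invoice carries a PO number.
const invoice = items[0].json;

// Assumed location of the PO number; adjust to where your checkout stores it.
const poNumber = invoice.metadata?.po_number;

return [{
  json: {
    invoiceId: invoice.id,
    amountDue: invoice.amount_due, // Stripe amounts are in the smallest currency unit
    poNumber: poNumber ?? null,
    hasPoNumber: Boolean(poNumber), // route on this in the IF node
  },
}];
```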
by David Olusola
**Overview**

A comprehensive educational workflow that demonstrates practical JavaScript usage in n8n's Code node through real-world business scenarios. Perfect for learning the data manipulation, transformation, and automation patterns you can immediately apply to client projects.

**What This Template Teaches**

- **Data Filtering & Transformation**: Filter employees by age, calculate bonuses, format contact information.
- **Statistical Analysis**: Generate team statistics, averages, role distributions, and KPIs.
- **Multi-Format Export**: Create CSV files, email lists, and API-ready payloads from raw data.
- **n8n Best Practices**: Proper JSON handling, return formats, and data flow patterns.

**How It Works**

1. **Manual Trigger** starts the workflow with sample employee data.
2. **Set Sample Data** provides realistic business data (employees with roles, salaries, ages).
3. **Three Code node examples** process the same data differently (the first is sketched after this entry):
   - **Filter & Transform**: Creates an adult-employee list with calculated bonuses.
   - **Calculate Stats**: Generates comprehensive team analytics and reports.
   - **Format for Export**: Prepares data for external systems (APIs, emails, CSV).

**Key Learning Points**

- Access input data using items[0].json.propertyName.
- Return the proper n8n format with the [{ json: data }] structure.
- Use JSON.parse() for string-to-object conversion.
- Apply JavaScript array methods (filter, map, reduce) for data processing.
- Handle multiple output scenarios and data aggregation.

**Perfect For**

- n8n beginners learning Code node fundamentals
- Developers transitioning to n8n automation
- Client demos showing data processing capabilities
- Team training and onboarding sessions
- A foundation for building custom business automation workflows

**Business Use Cases**

Adapt this template for lead qualification, customer segmentation, report generation, data enrichment, and API integrations. Each Code node pattern can be tailored to different industries and automation needs.
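A minimal sketch of the first Code node pattern (Filter & Transform), using the access and return conventions listed above. The field names match the sample data described (name, role, salary, age), but where the array lives (items[0].json.employees) and the 10% bonus rate are illustrative assumptions:

```javascript
// n8n Code node: keep adult employees and attach a calculated bonus.
const employees = items[0].json.employees; // from the Set Sample Data node

const adults = employees
  .filter((emp) => emp.age >= 18)
  .map((emp) => ({
    name: emp.name,
    role: emp.role,
    salary: emp.salary,
    bonus: Math.round(emp.salary * 0.1), // assumed 10% bonus rate
  }));

// Return one n8n item per employee, in the required [{ json }] shape.
return adults.map((emp) => ({ json: emp }));
```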
by Deborah
Use n8n to bring data from any API to your AI. This workflow uses the Chat Trigger to provide the chat interface, and the Custom n8n Workflow Tool to call a second workflow that calls the API. The second workflow uses AI functionality to refine the API request based on the user's query, makes the API call, and returns the response to the main workflow (a sketch of that hand-off follows).

This workflow is used in Advanced AI examples | Call an API to fetch data in the documentation.

To use this workflow:

1. Load it into your n8n instance.
2. Add your credentials as prompted by the notes.

Requires n8n 1.28.0 or above.
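A minimal sketch of the hand-off inside the second workflow: a Code node that turns the AI-refined query into parameters for a downstream HTTP Request node. The API URL and parameter names are placeholders, since the template works with any API:

```javascript
// n8n Code node: shape the refined query for the downstream HTTP Request node.
const refined = items[0].json; // output of the AI step that refined the request

return [{
  json: {
    // Hypothetical endpoint and parameter; replace with your target API.
    url: 'https://api.example.com/search',
    qs: { q: refined.searchTerm ?? refined.query },
  },
}];
```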
by Flavien
**Audio Generator – Documentation**

🎯 **Purpose**: Generate audio files from text scripts stored in Google Drive.

🔁 **Flow**:

1. Receive repo IDs.
2. Fetch text scripts.
3. Generate .wav files using the local Bark model.
4. Upload back to Drive.

📦 **Dependencies**:

- Python script: /scripts/generate_voice.py
- Bark (voice generation system)
- n8n instance with access to a local shell
- Google Drive OAuth2 credentials

✏️ **Notes**:

- Script filenames must end with .txt
- Only works with plain text
- No external API used = 100% free

📦 **/scripts/generate_voice.py**:

```python
import sys
import torch
import numpy
import re
from bark import SAMPLE_RATE, generate_audio, preload_models
from scipy.io.wavfile import write as write_wav

# Patch to allow numpy._core.multiarray.scalar during loading
torch.serialization.add_safe_globals([numpy._core.multiarray.scalar])

# Monkey patch torch.load to force weights_only=False
_original_torch_load = torch.load

def patched_torch_load(f, *args, **kwargs):
    if 'weights_only' not in kwargs:
        kwargs['weights_only'] = False
    return _original_torch_load(f, *args, **kwargs)

torch.load = patched_torch_load

# Preload Bark models
preload_models()

def split_text(text, max_len=300):
    # Split on punctuation to avoid mid-sentence cuts
    sentences = re.split(r'(?<=[.?!])\s+', text)
    chunks = []
    current = ""
    for sentence in sentences:
        if len(current) + len(sentence) < max_len:
            current += sentence + " "
        else:
            chunks.append(current.strip())
            current = sentence + " "
    if current:
        chunks.append(current.strip())
    return chunks

# Input text file and output path
input_text_path = sys.argv[1]
output_wav_path = sys.argv[2]

with open(input_text_path, 'r', encoding='utf-8') as f:
    full_text = f.read()

voice_preset = "v2/en_speaker_7"
chunks = split_text(full_text)

# Generate and concatenate audio chunks
audio_arrays = []
for chunk in chunks:
    print(f"Generating audio for chunk: {chunk[:50]}...")
    audio = generate_audio(chunk, history_prompt=voice_preset)
    audio_arrays.append(audio)

# Merge all audio chunks
final_audio = numpy.concatenate(audio_arrays)

# Write final .wav file
write_wav(output_wav_path, SAMPLE_RATE, final_audio)
print(f"Full audio generated at: {output_wav_path}")
```
by Friedemann Schuetz
**What this workflow does**

This workflow retrieves Google Analytics data from the last 7 days and from the same period in the previous year. The data is then formatted as a table by AI, analyzed, and given a short summary. The summary is sent by email to an address of your choice and, shortened and summarized once more, sent to a Telegram account.

The workflow runs in the following sequence:

1. Time trigger (e.g., every Monday at 7 a.m.)
2. Retrieval of Google Analytics data from the last 7 days
3. Mapping and summary of that data
4. Retrieval of Google Analytics data for the same 7 days of the previous year
5. Mapping and summary of that data
6. Preparation in tabular form and brief analysis by AI (the year-over-year comparison is sketched after this entry)
7. Sending the report as an email
8. Preparation of a short version by AI for Telegram (optional)
9. Sending as a Telegram message

**Requirements**

The workflow needs the following access:

- Google Analytics (via the Google Analytics API): Documentation
- AI API access (e.g., via OpenAI, Anthropic, Google or Ollama)
- SMTP credentials (for sending the mail)
- Telegram credentials (optional, for sending the Telegram message): Documentation

Feel free to contact me via LinkedIn if you have any questions!
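A minimal sketch of the comparison step as a Code node, merging the two Google Analytics pulls into one year-over-year row per metric. The metric names and the input wiring (current period as the first item, previous year as the second) are assumptions; real GA responses need flattening to this shape first:

```javascript
// n8n Code node: compare this week's GA metrics with last year's.
const current = items[0].json;  // e.g. { sessions: 1200, totalUsers: 950 }
const lastYear = items[1].json; // same metrics, previous-year period

const rows = Object.keys(current).map((metric) => {
  const now = Number(current[metric]);
  const then = Number(lastYear[metric] ?? 0);
  const changePct = then === 0 ? null : ((now - then) / then) * 100;
  return {
    metric,
    current: now,
    previousYear: then,
    changePct: changePct === null ? 'n/a' : changePct.toFixed(1) + '%',
  };
});

// One item per metric, ready for the AI node to render as a table.
return rows.map((row) => ({ json: row }));
```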
by bangank36
This workflow automates the Mark as Fulfilled action in Squarespace for each order, ensuring a seamless fulfillment process without manual intervention.

**How It Works**

This workflow retrieves all pending Squarespace orders and processes their fulfillment automatically, in two steps:

1️⃣ Get all pending orders using the HTTP Request node (since Squarespace does not have an n8n node).
2️⃣ Create a fulfillment request using the Fulfill Order node.

The Filter Orders node can be used to restrict processing to valid pending orders.

**Step-by-step**

The workflow can be run on demand or on a schedule. You can adjust these parameters within the Global and Filter Orders nodes:

Global node (API settings):

- **api-version** (string, required): The current API version (see the Squarespace Orders API documentation).
- **modifiedAfter**={a-datetime} (string, conditional): Fetch orders modified after a specific date (ISO 8601 format).
- **modifiedBefore**={b-datetime} (string, conditional): Fetch orders modified before a specific date (ISO 8601 format).
- **cursor**={c} (string, conditional): Used for pagination; cannot be combined with other filters.
- **fulfillmentStatus**={status} (optional, enum): Filter by fulfillment status: PENDING, FULFILLED, or CANCELED.
- **maxPage**: Set to -1 to enable infinite pagination and fetch all available orders.

Filter Orders node:

- Order filtering ensures only valid orders are fulfilled (see the sketch after this entry), which is particularly useful if:
  - You sell digital downloads or gift cards exclusively.
  - You use third-party fulfillment services for all products.

**Requirements**

Credentials: To use this workflow, you need a Squarespace API key, retrieved from your Squarespace settings.

**Who Is This For?**

- Squarespace store owners looking to automate their fulfillment process.
- Merchants selling digital or personalized products who need instant fulfillment.

**Explore More Templates**

- Get all orders in Squarespace to Google Sheets
- Convert Squarespace Profiles to Shopify Customers in Google Sheets
- Fetch Squarespace Blog & Event Collections to Google Sheets

👉 Check out my other n8n templates
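A minimal sketch of the Filter Orders step as a Code node, keeping only orders that are still pending and actually contain line items. The fulfillmentStatus enum matches the query parameter above; the lineItems check is an assumption, so verify field names against the Orders API documentation for your api-version:

```javascript
// n8n Code node: keep only orders that should be auto-fulfilled.
return items.filter((item) => {
  const order = item.json;

  // Matches the enum used by the Orders API query above.
  if (order.fulfillmentStatus !== 'PENDING') return false;

  // Assumed shape: skip orders with no line items at all.
  return Array.isArray(order.lineItems) && order.lineItems.length > 0;
});
```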
by PiAPI
**What's the workflow used for?**

This workflow leverages the Kling API (unofficial) provided by PiAPI to streamline virtual try-on video creation. It is designed for e-commerce platforms, fashion brands, content creators, and influencers. By uploading model and clothing images and linking a PiAPI account, users can quickly generate a realistic video of the model wearing the outfit with a 360° turn, offering an immersive viewing experience.

**Step-by-step Instructions**

1. For the basic virtual try-on settings, check the API doc for best practice.
2. Fill in the X-API-Key of your PiAPI account in the Preset Parameters node.
3. Upload the model photo and provide the target clothing image URLs.
4. Click Test Workflow to generate the virtual try-on image.
5. Get the video output in the final node.

**Param Settings**

- To change into a dress, set the model_input URL and the dress_input URL in the parameters (see the sketch after this entry).
- To change into separates, set the model_input URL, upper_input URL and lower_input URL in Preset Parameters.

**Use Case**

Given the input images, the output video shows the model wearing the clothing from the specified image in a rotating, runway-style view. This workflow lets you efficiently test garment-on-model presentation while reducing business-model validation costs.
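A minimal sketch of a Code node that assembles the Preset Parameters payload, choosing between the dress and separates modes described above. The parameter names (model_input, dress_input, upper_input, lower_input) come from this template; the incoming property names and the surrounding request (endpoint, headers) are assumptions handled by the PiAPI HTTP Request node:

```javascript
// n8n Code node: build the try-on parameter set for PiAPI.
const { modelUrl, dressUrl, upperUrl, lowerUrl } = items[0].json;

const params = { model_input: modelUrl };

if (dressUrl) {
  // One-piece mode: a single dress image.
  params.dress_input = dressUrl;
} else if (upperUrl && lowerUrl) {
  // Separates mode: upper and lower garment images.
  params.upper_input = upperUrl;
  params.lower_input = lowerUrl;
} else {
  throw new Error('Provide either a dress URL or both upper and lower URLs');
}

return [{ json: params }];
```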
by David w/ SimpleGrow
1. **Scheduled Trigger**: Every X day at Y pm, the workflow is automatically triggered.
2. **Fetch User Data**: The workflow retrieves all user records from the "WhatsApp Engagement Database" in Airtable. Each record contains the user's WhatsApp ID, current points, and number of raffle vouchers.
3. **Personalized Message Preparation**: For each user, a personalized WhatsApp message is prepared (see the sketch below). The message includes:
   - The user's current point total
   - The number of raffle vouchers they have for the week
   - Encouragement to keep engaging for more chances to win
   - Information about the weekly raffle and available prizes
4. **Send WhatsApp Message**: The workflow sends the personalized message to each user via the Whapi API, using their WhatsApp ID.

**Result**: Every active user receives a weekly update about their engagement status, raffle tickets, and a motivational note encouraging further participation. This boosts engagement and keeps users informed about their progress and chances in the weekly raffle.
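A minimal sketch of the message-preparation step as a Code node. The Airtable field names (WhatsApp ID, Points, Vouchers) and the message wording are illustrative assumptions; the actual send happens in the downstream Whapi node:

```javascript
// n8n Code node: compose one personalized WhatsApp message per Airtable record.
return items.map((item) => {
  const user = item.json;

  const body = [
    `Hi! You currently have ${user.Points} points`,
    `and ${user.Vouchers} raffle voucher(s) for this week's draw.`,
    'Keep engaging for more chances to win - prizes are announced every week!',
  ].join(' ');

  return {
    json: {
      to: user['WhatsApp ID'], // recipient for the Whapi send node
      body,
    },
  };
});
```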