by Yaron Been
Ndreca Hunyuan3d 2 Test AI Generator

Overview
This n8n workflow integrates with the Replicate API to run the ndreca/hunyuan3d-2-test model, which generates high-quality 3D shapes (meshes) from an input image.

Features
- Easy integration with the Replicate API
- Automated status checking and result retrieval
- Support for all model parameters
- Error handling and retry logic
- Clean output formatting

Parameters

Required:
- **image** (string): Input image for generating the 3D shape

Optional:
- **seed** (integer, default: 1234): Random seed for generation
- **steps** (integer, default: 50): Number of inference steps
- **num_chunks** (integer, default: 200000): Number of chunks for mesh generation
- **max_facenum** (integer, default: 40000): Maximum number of faces for mesh generation
- **guidance_scale** (number, default: 5.5): Guidance scale for generation
- **octree_resolution** (string, default: 512): Octree resolution for mesh generation
- **remove_background** (boolean, default: true): Whether to remove the background from the input image

How to Use
1. Set up your Replicate API key in the workflow.
2. Configure the required parameters for your use case.
3. Run the workflow to generate the 3D output.
4. Access the generated output from the final node.

API Reference
- Model: ndreca/hunyuan3d-2-test
- API Endpoint: https://api.replicate.com/v1/predictions

Requirements
- Replicate API key
- n8n instance
- Basic understanding of 3D generation parameters
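For orientation, here is a minimal JavaScript sketch of what the workflow's HTTP Request nodes do against the Replicate predictions endpoint: create a prediction, then poll until it finishes. The model version hash and the `REPLICATE_API_TOKEN` variable are placeholders you must supply; inside n8n this logic lives in the HTTP Request and retry nodes rather than in code.

```javascript
// Minimal sketch: create a prediction for ndreca/hunyuan3d-2-test and poll until it finishes.
const API = 'https://api.replicate.com/v1/predictions';
const headers = {
  'Authorization': `Bearer ${process.env.REPLICATE_API_TOKEN}`, // placeholder token variable
  'Content-Type': 'application/json',
};

async function generate3dShape(imageUrl) {
  // Create the prediction with the parameters documented above.
  const createRes = await fetch(API, {
    method: 'POST',
    headers,
    body: JSON.stringify({
      version: '<model-version-id>', // replace with the model's current version hash
      input: {
        image: imageUrl,
        seed: 1234,
        steps: 50,
        num_chunks: 200000,
        max_facenum: 40000,
        guidance_scale: 5.5,
        octree_resolution: '512',
        remove_background: true,
      },
    }),
  });
  let prediction = await createRes.json();

  // Poll until Replicate reports a terminal status.
  while (!['succeeded', 'failed', 'canceled'].includes(prediction.status)) {
    await new Promise((r) => setTimeout(r, 3000));
    const pollRes = await fetch(`${API}/${prediction.id}`, { headers });
    prediction = await pollRes.json();
  }
  return prediction.output; // URL(s) to the generated mesh on success
}
```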
by Airtop
LinkedIn Post Engagement Data Extractor

Use Case
This automation extracts key engagement metrics and audience data from a LinkedIn post. It's useful for analyzing the impact of content and identifying engaged users for lead generation, marketing, or research purposes.

What It Does
Given a LinkedIn post URL and an Airtop profile, this automation extracts:
- Total number of reactions
- Total number of comments
- Total number of reposts
- A list of users who reacted or commented, including:
  - Their full name
  - Their job title
  - A link to their LinkedIn profile

Input Parameters

| Name              | Description                                                | Required |
|-------------------|------------------------------------------------------------|----------|
| airtop_profile    | The name of an Airtop Profile that's logged into LinkedIn | Yes      |
| linkedin_post_url | The full URL of the LinkedIn post you want to analyze      | Yes      |

How It Works
1. The workflow starts when triggered manually or from another workflow/form.
2. It maps the input fields for the Airtop profile and post URL.
3. Airtop opens a browser session and loads the LinkedIn post.
4. An AI agent is instructed to extract engagement data via prompt-based analysis.
5. The response is parsed and output in a structured format.

Output Format
The output is a structured JSON object with the following fields:

```json
{
  "interactors": [
    {
      "name": "Jane Doe",
      "job_title": "Marketing Director at ExampleCorp",
      "profile_url": "https://linkedin.com/in/janedoe"
    }
    // ... more interactors
  ],
  "reactions_count": 153,
  "comments_count": 21,
  "reposts_count": 8
}
```

Read more about how to extract LinkedIn post comments and reactions.
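If you want to fan the result out into one row per engaged user (for a sheet or CRM), a small n8n Code node after the extraction could do it. This is an optional addition, not part of the template; the field names simply mirror the output format above.

```javascript
// Hypothetical n8n Code node: turn the structured output above into one item per interactor,
// carrying the post-level counts along so each row is self-contained.
const result = $input.first().json;

return (result.interactors || []).map((person) => ({
  json: {
    name: person.name,
    job_title: person.job_title,
    profile_url: person.profile_url,
    reactions_count: result.reactions_count,
    comments_count: result.comments_count,
    reposts_count: result.reposts_count,
  },
}));
```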
by Miquel Colomer
Disclaimer: This template contains a community node and therefore only works for n8n self-hosted users.

This is Miquel from Aprende n8n and Automate with n8n. We have created a new community node, Document Generator, that generates dynamic content using templates. With this node you can compose complex content without SET or FUNCTION ITEM nodes, for example:
- Send one email with a list of items in the body (e.g., one email with the latest entries of an RSS feed).
- Send one email per item (e.g., one invoice per email).

Emails are just one example. You can create complex dynamic content to:
- Send messages to Telegram/Slack.
- Create WordPress entries.
- Create HTML pages for your website.
- Create tickets.
- And more! The sky is the limit ;)

If you want to use this workflow, install the community node n8n-nodes-document-generator from Settings > Community nodes. Type "n8n-nodes-document-generator", check "I understand the risks...", and click "Install". Then copy and paste this workflow into your n8n.

This workflow uses the Customer Datastore node to generate sample input items. You can render one template with all items (enable "Render All Items with One Template") or one template per input item. Visit the official NPM page to see more samples.

Learning n8n by yourself is nice, but a bit tricky :) We offer n8n video training courses at Aprende n8n. If you need custom trainings, let us know. Additionally, you can contact us at Automate with n8n if you need any of the following services:
- Custom installations.
- Custom nodes.
- Monitoring and alarms.
- Delegated 12/5 or 24/7 workflow issue resolution.
- Automated backups of your workflows.
- HTTP integrations of non-supported APIs.
- Complex workflows.

I hope you enjoy this new node and this workflow. Automate your life! Automate it with n8n!
by David Olusola
🚀 Automated Lead Scraper Workflow (Apify + n8n + Google Sheets)

🧠 What It Does
This n8n workflow automates the process of scraping leads using Apify, cleaning the extracted data, and exporting it to Google Sheets—ready for use in outreach, prospecting, or CRM pipelines.

🔄 Workflow Steps
1. ✅ Start – Manually triggers the workflow.
2. 🧩 Set Variables – Stores the required Apify credentials:
   - APIFY_TOKEN: Your Apify token.
   - APIFY_TASK_ID: The Apify task to run.
3. 🕸️ Run Apify Scraper – Launches the scraper and fetches the dataset.
4. 🧹 Clean Data – Processes scraped results to:
   - ✂️ Strip non-numeric characters from phone numbers.
   - ✉️ Format emails (lowercase + trimmed).
5. 📊 Export to Google Sheets – Appends the clean data to your spreadsheet:
   - 🏢 company name → from title
   - 📞 phone → cleaned number
   - 📍 address → from scraped info

🛠️ Requirements
- 🕷️ Apify account with a valid APIFY_TOKEN and an existing Apify task (APIFY_TASK_ID)
- 📗 Google Sheets access: OAuth2 credentials set up in n8n (e.g., "Google Sheets account 2")

🚦 How to Use
1. ⚙️ Open the Set Variables node and plug in your Apify credentials.
2. 📄 Confirm the Google Sheets node points to your desired spreadsheet.
3. ▶️ Run the workflow manually from the Start node.

📥 Output
A ready-to-use sheet of cleaned lead data containing:
- Company names
- Phone numbers
- Addresses

💼 Perfect For
- Sales teams doing outbound prospecting
- Marketers building lead lists
- Agencies running data aggregation tasks
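For reference, the "Clean Data" step could look roughly like the following n8n Code node. This is a hedged sketch, not the template's exact code; the incoming field names (title, phone, email, address) are assumptions about the Apify dataset.

```javascript
// Hypothetical "Clean Data" Code node: normalize phone numbers and emails before Google Sheets.
return $input.all().map((item) => {
  const lead = item.json;
  return {
    json: {
      'company name': lead.title || '',                  // company name comes from the title field
      phone: (lead.phone || '').replace(/\D/g, ''),      // strip non-numeric characters
      email: (lead.email || '').trim().toLowerCase(),    // trim + lowercase
      address: lead.address || '',
    },
  };
});
```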
by Yang
This workflow helps digital marketers and outreach specialists automate the research and creation of cold email icebreakers for local businesses.

What it does:
1. Starts with a Form Trigger, where you input a search keyword (e.g., "Dentist in New York").
2. Uses Dumpling AI's Google Maps API to search for local businesses matching the keyword.
3. Extracts individual business data, including website URLs.
4. Sends each website to Dumpling AI to extract:
   - A website summary for personalization
   - An email address (if available)
5. Sends the summary and business info to GPT-4 via OpenAI to write a short, warm, and customized icebreaker message.
6. Filters out results with missing email addresses.
7. Logs the business name, email, website, phone number, website summary, and generated icebreaker into Google Sheets.
8. Optionally pushes the lead and personalization to Instantly.ai for automated cold outreach.

Tools Used:
- Form Trigger (n8n)
- Dumpling AI (Search & Extraction APIs)
- OpenAI GPT-4 (via LangChain node)
- Google Sheets
- Instantly.ai (optional lead delivery)

🛠️ How to Customize the Workflow
- **Change the search region or business type:** Adjust the default keyword in the **Form Trigger** or connect a different input source (like Google Sheets).
- **Customize the prompt:** Modify the **GPT-4 node prompt** to match your agency tone or outreach style.
- **Add or remove data fields:** Edit the **Google Sheets node** to store additional business data or remove unnecessary ones.
- **Connect to your CRM or outreach tool:** Replace or extend the **Instantly API node** with your own CRM (e.g., HubSpot, Close, Pipedrive) using HTTP Request or native integrations.
- **Control batching size:** The **Split In Batches node** is set to 2 by default. You can increase this to speed up processing or reduce it to avoid rate limits.

This automation is ideal for sales teams, digital marketing freelancers, and agencies who want to scale lead generation while keeping emails personal and relevant.
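Steps 5 and 6 (prompting GPT-4 and filtering out leads without an email) could be combined in a single Code node like the sketch below. The field names and prompt wording are assumptions for illustration, not the template's actual implementation.

```javascript
// Hypothetical Code node: skip businesses without an email and build the GPT-4 prompt.
const out = [];

for (const item of $input.all()) {
  const biz = item.json;
  if (!biz.email) continue; // filter out results with missing email addresses

  const prompt =
    `Write a short, warm, personalized cold-email icebreaker for ${biz.businessName}. ` +
    `Here is a summary of their website: ${biz.websiteSummary}. ` +
    `Keep it to one or two sentences and mention something specific from the summary.`;

  out.push({ json: { ...biz, prompt } });
}

return out;
```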
by Audun
Description
This workflow reads a sitemap.xml file, extracts all URLs, and lets you filter out specific types of links—such as PDF files, images, or any other content—based on your needs.

Who Is This For?
- **SEO specialists** looking to analyze specific URLs in their sitemap.
- **Developers** who need to extract links for automated processing.
- **Content managers** filtering out downloadable assets like PDFs or images.

How It Works
1. Fetch sitemap.xml – The workflow reads the sitemap file from a given URL.
2. Extract URLs – Parses all the URLs listed in the sitemap.
3. Filter URLs – Use a simple filter to extract only the links you need (e.g., *.pdf).
4. Export or Process – The filtered list can be sent via email, stored in a database, or used in another workflow.

Customization
- Edit the "Set sitemap URL" node and change the sitemapUrl value to the sitemap you want to fetch.
- Edit the "Filter URLs" node and adjust the filter conditions to meet your needs.
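The same fetch-extract-filter idea, written as a standalone JavaScript sketch (the sitemap URL is a placeholder, and the regex-based extraction is a simplification of what the workflow's XML parsing does):

```javascript
// Fetch a sitemap, pull every <loc> entry, and keep only the PDF links.
const sitemapUrl = 'https://example.com/sitemap.xml'; // placeholder

async function getPdfLinks() {
  const xml = await (await fetch(sitemapUrl)).text();

  // Grab the contents of every <loc>...</loc> element.
  const urls = [...xml.matchAll(/<loc>\s*(.*?)\s*<\/loc>/g)].map((m) => m[1]);

  // Keep only links ending in .pdf (adjust the test for images or other assets).
  return urls.filter((url) => /\.pdf$/i.test(url));
}

getPdfLinks().then((links) => console.log(links));
```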
by Emmanuel Bernard
🎉 Do you want to master AI automation so you can save time and build cool stuff? I've created a welcoming Skool community for non-technical yet resourceful learners. 👉🏻 Join the AI Atelier 👈🏻

This workflow provides an API endpoint to generate speech from text using Elevenlabs.io, a popular text-to-speech service.

Step 1: Configure Custom Credentials in n8n
To set up your credentials in n8n, create a new custom authentication entry with the following JSON structure:

```json
{
  "headers": {
    "xi-api-key": "your-elevenlabs-api-key"
  }
}
```

Replace "your-elevenlabs-api-key" with your actual Elevenlabs API key.

Step 2: Send a POST Request to the Webhook
Send a POST request to the workflow's webhook endpoint with these two parameters:
- voice_id: The ID of the voice from Elevenlabs that you want to use.
- text: The text you want to convert to speech.

This workflow has been a significant time-saver in my video production tasks. I hope it proves just as useful to you!

Happy automating!
The n8Ninja
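Here is one way to call the webhook from JavaScript. The webhook URL and the voice ID are placeholders, and whether the response body is the raw audio depends on how the workflow's final node is configured.

```javascript
// Example call to the workflow's webhook endpoint.
const fs = require('fs');

async function speak(text) {
  const res = await fetch('https://your-n8n-instance/webhook/text-to-speech', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      voice_id: 'your-elevenlabs-voice-id', // an ElevenLabs voice ID of your choice
      text,
    }),
  });
  // Assuming the workflow returns the generated audio, save it to a file.
  fs.writeFileSync('speech.mp3', Buffer.from(await res.arrayBuffer()));
}

speak('Hello from n8n and ElevenLabs!');
```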
by Emmanuel Bernard
🎉 Do you want to master AI automation so you can save time and build cool stuff? I've created a welcoming Skool community for non-technical yet resourceful learners. 👉🏻 Join the AI Atelier 👈🏻

This workflow exposes an API endpoint that lets you dynamically replace an image in Google Slides, perfect for automating deck presentations like updating backgrounds or client logos.

📺 YouTube Overview 📺

Here's how to get started:

Step 1: Set Up a Key Identifier in Google Slides
Add a unique key identifier to the images you want to replace:
1. Click on the image.
2. Go to Format Options and then Alt Text.
3. Enter your unique identifier, like client_logo or background.

Step 2: Use a POST Request to Update the Image
Send a POST request to the workflow endpoint with the following parameters in the body:
- presentation_id: The ID of your Google Slides presentation. You can find it in the URL of your presentation: https://docs.google.com/presentation/d/<this-part>/edit
- image_key: The unique identifier you created.
- image_url: The URL of the new image.

That's it! The specified image in your Google Slides presentation will be replaced with the new one from the provided URL. This workflow is designed to be flexible, allowing you to use the same identifier across multiple slides and presentations. I hope it streamlines your slide automation process!

Example curl request to execute:

```bash
curl --location 'https://workflow.url' \
  --form 'presentation_id="google-presentation-id"' \
  --form 'image_key="background"' \
  --form 'image_url="https://picsum.photos/536/354"'
```

Happy automating!
The n8Ninja 🥷
by Strategiflows
Who Is This For?
E-commerce managers, data analysts, and n8n beginners who need a hands-off way to pull all Shopify orders—even for stores with thousands of orders—into Google Sheets for reporting or BI.

What Problem Does It Solve?
Shopify's GraphQL API only returns up to 250 orders per call, forcing you to manually manage cursors and loops. This template handles the "get next 250" logic for you, so you never miss an order.

What This Workflow Does
1. Schedule Trigger – Runs at your chosen cadence (daily, hourly, or manual).
2. Set Date Range – Defines startDay and endDay based on $now.
3. GraphQL Loop – Fetches orders 250 at a time, using pageInfo.hasNextPage and endCursor until complete (see the pagination sketch below).
4. Code Node – Flattens orders into line-item rows and summarizes by SKU/vendor.
5. Google Sheets – Appends the results to your sheet for easy analysis.
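The cursor loop from step 3 boils down to the sketch below. The shop domain, API version, access-token variable, and selected order fields are assumptions: adapt them to your store and reporting needs, since the template itself drives this loop with n8n nodes rather than a single script.

```javascript
// Minimal sketch of Shopify cursor pagination: keep requesting 250 orders until hasNextPage is false.
const ENDPOINT = 'https://your-store.myshopify.com/admin/api/2024-07/graphql.json';

const QUERY = `
  query ($cursor: String, $search: String) {
    orders(first: 250, after: $cursor, query: $search) {
      pageInfo { hasNextPage endCursor }
      edges { node { id name createdAt totalPriceSet { shopMoney { amount } } } }
    }
  }`;

async function fetchAllOrders(startDay, endDay) {
  const search = `created_at:>=${startDay} created_at:<=${endDay}`;
  const orders = [];
  let cursor = null;
  let hasNextPage = true;

  while (hasNextPage) {
    const res = await fetch(ENDPOINT, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'X-Shopify-Access-Token': process.env.SHOPIFY_TOKEN, // placeholder credential
      },
      body: JSON.stringify({ query: QUERY, variables: { cursor, search } }),
    });
    const { data } = await res.json();

    orders.push(...data.orders.edges.map((edge) => edge.node));
    hasNextPage = data.orders.pageInfo.hasNextPage; // keep looping until Shopify says stop
    cursor = data.orders.pageInfo.endCursor;        // "get next 250" starts from here
  }
  return orders;
}
```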
by Khaled
🧾 Description
This automation uses GPT-4o to scan unread Gmail emails and intelligently classify them as:
- Action → Requires your attention (reply, review, schedule, or respond)
- No Action → Informational or promotional; no action needed

The result? You eliminate inbox noise and gain a clear daily routine: only check what's in Action Required.

⚙️ How It Works
1. Trigger: Runs on a customizable schedule.
2. Fetch Emails: Pulls unread messages from Gmail.
3. Classify via GPT-4o: Determines whether each email needs action.
4. Sort Emails:
   - Labels actionable emails as Action Required
   - Labels non-actionable ones as No Action
   - Removes the Inbox label to clean your primary inbox view

✅ Emails stay in your account—just better organized.

🚀 How to Use
1. Import the workflow into your n8n instance.
2. Set up Gmail and OpenAI credentials.
3. Create the Gmail labels: Action Required and No Action.
4. Activate the workflow.
5. Start your day by checking only the Action Required label.

📦 Requirements
- n8n (self-hosted or cloud)
- Gmail OAuth2 account
- OpenAI API key (GPT-4o or GPT-4o-mini)
- Gmail labels: Action Required, No Action

💡 Why It Matters
Stop manually filtering emails. This workflow helps you focus only on what matters while keeping everything else out of your way—without deleting or archiving anything.
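Conceptually, the classify-and-route step looks like the sketch below. The prompt wording, field names, and the fallback rule are illustrative assumptions, not the template's exact logic.

```javascript
// Hypothetical sketch of the classification step: build a prompt per email and map the
// model's answer to the Gmail label each message should receive.
function buildPrompt(email) {
  return (
    'Classify the following email as exactly "Action" (needs a reply, review, or scheduling) ' +
    'or "No Action" (informational or promotional). Answer with one of those two words only.\n\n' +
    `From: ${email.from}\nSubject: ${email.subject}\n\n${email.snippet}`
  );
}

function labelFor(modelAnswer) {
  // Anything the model does not clearly mark as actionable falls through to "No Action".
  return /^action$/i.test(modelAnswer.trim()) ? 'Action Required' : 'No Action';
}

// Example: labelFor(' Action ') === 'Action Required'
```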
by LuisBetancourt.co
Description
Whenever a Zoom "Meeting assets" email arrives in your Gmail inbox, this workflow will:
1. Trigger on new Gmail messages filtered by the subject "Meeting assets".
2. Extract from the email (HTML or plain text):
   - Type of session (e.g., "1 hour", "2 hours", or "exploratory call").
   - Client's full name.
   - Session date & time (from the GMT… timestamp).
   - Duration (HH:MM:SS).
   - Recording link.
   - Quick summary.
   - Detailed summary.
   - List of next steps.
3. Look up the client in your Master Airtable base, table People, by full name.
4. Send a personalized Gmail to the client with all extracted details.
5. Create a new record in your Sessions table in Airtable, linking back to that client.

Quick Start
1. Import this JSON into n8n as a new workflow.
2. Connect your Gmail credentials (OAuth2).
3. Connect your Airtable credentials (Personal Access Token).
4. In the Search Records node:
   - Base → your Master base ID.
   - Table → your People table.
   - Filter By Formula → ={Full Name} = '{{ $json.clientName }}'.
5. In the Create Record node:
   - Table → "Sessions".
   - Map each field (dateTime, duration, summaries, next steps, client link).
6. Activate the workflow.

Prerequisites
- n8n v1.50 or higher
- A Gmail account with OAuth2 credentials configured
- An Airtable base containing:
  - Table People with a Full Name field (and email).
  - Table Sessions with fields: DateTime, Duration, Quick Summary, Detailed Summary, Next Steps, and a linked record to People.
- An Airtable Personal Access Token with read/write access to that base

Tips & Extensions
- Timezone conversion: Use a Function node with moment-timezone to convert from UTC if needed.
- Error handling: Add a catch node to log or notify if any field fails to parse.
- Alternate notifications: Swap the Gmail node for Slack, Microsoft Teams, or SMS integrations.

With this documentation, your team can import and deploy the workflow in minutes. Enjoy!
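To illustrate the extraction in step 2, a Code or Function node could pull the fields out of the email's plain-text body with regular expressions along these lines. The patterns and labels here are assumptions: adjust them to the actual wording of your Zoom "Meeting assets" emails.

```javascript
// Hypothetical parser for a Zoom "Meeting assets" email body.
function parseMeetingAssets(body) {
  const grab = (re) => (body.match(re) || [])[1]?.trim() ?? null;

  return {
    clientName: grab(/Client:\s*(.+)/i),
    sessionDateTime: grab(/(GMT[+-]\d{4}.+)/),            // raw GMT… timestamp line
    duration: grab(/Duration:\s*(\d{2}:\d{2}:\d{2})/i),   // HH:MM:SS
    recordingLink: grab(/(https?:\/\/\S*zoom\.us\/rec\/\S+)/i),
    quickSummary: grab(/Quick summary:\s*([\s\S]*?)\n\n/i),
  };
}
```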
by Łukasz
Who is it for?
If you have a lot of meetings as a project manager, CFO, CTO, CEO, or any other role that requires handling many meetings, AND you work with people in different timezones, you may have noticed that the day on which daylight saving time changes can differ from timezone to timezone. This can be very troublesome: if the DST change day differs between timezones, you might need to adjust your meeting times accordingly, and this happens twice a year. So it's good to get a notification beforehand (at least a day before). This automation notifies you if tomorrow brings a DST change in any timezone you provide.

How It Works
1. The script runs daily and loops through the provided timezones.
2. It checks whether a change into or out of DST happens tomorrow (if you want to be notified sooner, just adjust the number of days).
3. If there is a DST change, it sends you a Slack notification (replace with email if needed).

How to set up?
1. Add and/or edit the timezones you want to monitor in the "Timezones List" node.
2. Adjust "Calculate Tomorrow's Date" if you want to be notified sooner than 1 day before the DST change.
3. Adjust "Send Notification on Upcoming Change" to set where on Slack you want to be notified.

And that's it. Hope you won't miss any more meetings because of DST!
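For the curious, the core check can be done with nothing but the built-in Intl API: compare a timezone's UTC offset at the start of tomorrow with its offset 24 hours later; if they differ, a DST transition happens during that day. This is an illustrative sketch of the idea (using UTC day boundaries as an approximation), not the template's exact code.

```javascript
// Compute a timezone's UTC offset (in minutes) at a given instant.
function offsetMinutes(date, timeZone) {
  const parts = Object.fromEntries(
    new Intl.DateTimeFormat('en-US', {
      timeZone, hourCycle: 'h23',
      year: 'numeric', month: '2-digit', day: '2-digit',
      hour: '2-digit', minute: '2-digit', second: '2-digit',
    }).formatToParts(date).map((p) => [p.type, p.value])
  );
  const local = Date.UTC(parts.year, parts.month - 1, parts.day, parts.hour, parts.minute, parts.second);
  return (local - date.getTime()) / 60000;
}

// True if the offset changes at some point during tomorrow (measured in UTC days).
function dstChangesTomorrow(timeZone, now = new Date()) {
  const startOfTomorrowUTC = new Date(now);
  startOfTomorrowUTC.setUTCHours(24, 0, 0, 0); // rolls over to midnight UTC of the next day
  const dayAfter = new Date(startOfTomorrowUTC.getTime() + 24 * 60 * 60 * 1000);
  return offsetMinutes(startOfTomorrowUTC, timeZone) !== offsetMinutes(dayAfter, timeZone);
}

// Example: check the monitored zones and collect the ones with an upcoming change.
const zones = ['Europe/Warsaw', 'America/New_York', 'Australia/Sydney'];
const changing = zones.filter((tz) => dstChangesTomorrow(tz));
console.log(changing.length ? `DST change tomorrow in: ${changing.join(', ')}` : 'No DST changes tomorrow');
```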