by DanielV
This workflow is designed to translate SRT subtitle files from one language to another using Google Translate. The workflow follows these main steps:

- Accept an SRT file upload and target language selection
- Extract and parse the SRT file content
- Split the content into translatable segments
- Translate each segment using Google Translate
- Reassemble the translated content into a proper SRT format
- Return the translated file to the user

You'll need a Google Cloud Console account to access the Translate API.

Who is this for?
This workflow is designed for content creators, video editors, translators, and anyone who needs to translate subtitle files (.srt) from one language to another. It's particularly useful for those working with international content, educational materials, or preparing videos for global audiences.

What problem does this workflow solve?
Translating subtitle files manually is time-consuming and error-prone. Professional translation services can be expensive, especially for multiple videos or long content. This workflow automates the translation process while maintaining the proper SRT format, including timestamps and subtitle numbering.

Setup
Set up Google Translate credentials:
- Create a Google Cloud project and enable the Google Translate API
- Create OAuth credentials and configure them in the Google Translate node

Customize language options:
- The default workflow includes English (EN) and Japanese (JP) options
- Add more language options by editing the dropdown field in the "Receive SRT File to Translate" node
- Use standard language codes that Google Translate supports

Add more languages:
- Edit the form trigger node to include additional language options in the dropdown
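The extract-and-parse step can run in an n8n Code node. Below is a minimal sketch of splitting SRT content into translatable segments; the input field name (`data`) is an assumption, so adjust it to whatever your extraction node actually outputs:

```javascript
// Minimal sketch: split SRT text into blocks, one n8n item per subtitle.
// Assumes the uploaded file's text arrives as item.json.data (hypothetical field name).
const srt = $input.first().json.data;

// An SRT block looks like:
//   1
//   00:00:01,000 --> 00:00:04,000
//   Subtitle text (one or more lines)
const blocks = srt.split(/\r?\n\r?\n/).filter(Boolean).map(block => {
  const lines = block.split(/\r?\n/);
  return {
    index: lines[0],                  // subtitle number, passed through untouched
    timing: lines[1],                 // timestamp line, must not be translated
    text: lines.slice(2).join('\n'),  // only this part goes to Google Translate
  };
});

return blocks.map(b => ({ json: b }));
```

Reassembly is the reverse: join `index`, `timing`, and the translated `text` with newlines, then join the blocks with blank lines.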
by Rajeet Nair
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Description
This workflow automatically collects daily trending topics from Twitter and YouTube, filters them for relevance, and uses an AI model (such as Mistral Cloud or another OpenAI-compatible API) to generate engaging social media hashtags. The final results, including source platform and date, are saved into a connected Google Sheet for easy access, tracking, or team collaboration.

Ideal for content creators, marketers, and social media managers, this automation eliminates the manual effort of trend research and hashtag writing by combining real-time scraping with LLM-powered generation. The result is a scalable, daily strategy tool to stay aligned with what's trending across major platforms.

How It Works
- Daily Trigger: Starts the workflow automatically on a daily schedule.
- Trend Scraping: Scrapes current trending content from Twitter and YouTube using the Crawl and Scrape community node.
- Filtering & Slicing: Removes irrelevant or duplicate entries and limits each platform's list to top-performing trends.
- Merge Trends: Combines Twitter and YouTube trends into a single dataset.
- AI Hashtag Generation: Sends each trend topic to an AI model to generate relevant hashtags.
- Output to Google Sheets: Loops through AI results and writes them to a Google Sheet, including trend, platform, hashtags, and timestamp.

Setup Instructions
Estimated time: 10–15 minutes

Prerequisites
- A self-hosted instance of n8n (required for community nodes)
- API key for Mistral Cloud or any OpenAI-compatible LLM
- Google Sheets account connected via OAuth2 credentials
- Twitter and YouTube trend URLs (or scraping logic for target regions)

Example: Crawl and Scrape Node for Twitter Trends
You can use the following configuration in the Crawl and Scrape node to extract Twitter trends from Trends24:

```json
{
  "parameters": {
    "url": "https://trends24.in/",
    "selectors": [
      {
        "label": "Twitter Trends",
        "selector": ".trend-card__list li a",
        "type": "text"
      }
    ]
  },
  "name": "Scrape Twitter Trends",
  "type": "n8n-nodes-crawl-and-scrape.crawlAndScrape",
  "typeVersion": 1,
  "position": [300, 200]
}
```

Google Sheet Column Format
Column A: Generated Hashtags
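The Filtering & Slicing step is a natural fit for a Code node. A minimal sketch, assuming each scraped item carries a `trend` field and using a hypothetical cut-off of 10 entries per platform:

```javascript
// Deduplicate scraped trends and keep only the top entries.
// Assumes incoming items look like { json: { trend: "...", platform: "twitter" } }.
const seen = new Set();
const deduped = $input.all().filter(item => {
  const key = item.json.trend?.trim().toLowerCase();
  if (!key || seen.has(key)) return false; // drop empties and duplicates
  seen.add(key);
  return true;
});

// Keep the first 10 trends (illustrative cut-off; adjust to taste).
return deduped.slice(0, 10);
```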
by Sherlockes
What this template is made for:
I have a personal Telegram channel with a bot inside it where I save interesting links that I want to keep or read later. The idea is that n8n takes care of reading the new links added to this channel and sends them, through the corresponding API, to the Hoarder and Readeck installations.

How it works
Since my server where n8n runs is not always on, a Schedule Trigger checks every so often whether there is any new content in the Telegram channel where I store the links. This request is made through an HTTP Request node and the Telegram API. Next, a Code node filters out everything that is not a hyperlink. At this point, the flow splits into two so that parallel, similar processes run for Hoarder and Readeck:
- The corresponding API is accessed to get a list of all the links already saved in that service.
- A Code node filters the list of hyperlinks previously obtained from Telegram so that only those not already saved in the service continue.
- Finally, another HTTP Request node uses the service's API to save the link in the corresponding service.

Configuration instructions
The template makes use of environment variables that I have declared in the n8n docker-compose.yml file through an external .env file. These are the variables I use:

```
# Telegram Bot Token Sherlink
TG_SHERLINK_BOT_TOKEN=XXXXXXXX:XXXXXXXXXXXXXXXX
# Id Telegram Channel Sherlink
TG_SHERLINK_ID=-XXXXXXXXXXXXX
# Readeck server
READECK_SERVER=http://readeck.midomain.com
READECK_API_KEY=xxxxxxxxxxxxx
# Hoarder server
HOARDER_SERVER=http://hoarder.midomain.com
HOARDER_API_KEY=xxxxxxxxxxxxxx
```

Created in n8n version 1.85.4
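As a rough sketch, the hyperlink-filtering Code node could look like this; the exact field paths depend on how the Telegram getUpdates response is shaped by the HTTP Request node:

```javascript
// Keep only Telegram updates that contain a hyperlink, emitting one item per link.
// Channel messages arrive under channel_post in the Bot API's getUpdates response.
const urlPattern = /https?:\/\/\S+/g;

const links = [];
for (const item of $input.all()) {
  const text = item.json.channel_post?.text ?? item.json.message?.text ?? '';
  for (const url of text.match(urlPattern) ?? []) {
    links.push({ json: { url } });
  }
}
return links;
```

The per-service dedupe works the same way: build a Set of already-saved URLs from the Hoarder or Readeck API response, then filter this list against it.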
by Ranjan Dailata
Who is this for?
This workflow is designed for HR professionals, employer branding teams, talent acquisition strategists, market researchers, and business intelligence analysts who want to monitor, understand, and act upon employee sentiment and company perception on Glassdoor. It's ideal for organizations that value real-time feedback, track employer brand perception, or need summarized insights for leadership reporting without sifting through thousands of raw reviews.

What problem is this workflow solving?
Manually reviewing and analyzing Glassdoor reviews is tedious, subjective, and not scalable, especially for larger companies or those with many subsidiaries. This workflow:
- Automates review collection by making a Glassdoor company request via the Bright Data Web Scraper API.
- Uses Google Gemini to summarize the content.
- Sends an actionable summary to HR dashboards, leadership teams, or alert systems via a webhook notification.

What this workflow does
- Makes an HTTP request to Glassdoor via the Bright Data Web Scraper API.
- Polls Bright Data for completion of the Glassdoor request.
- Downloads the Glassdoor response when a new snapshot is ready.
- Sends the prompt to Google Gemini for summarization.
- Delivers the summarized insights (strengths, weaknesses, sentiment, patterns) to a configured webhook or dashboard endpoint.

Setup
- Sign up at Bright Data.
- Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
- In n8n, configure a Header Auth account under Credentials (Generic Auth Type: Header Authentication). The Value field should be set to Bearer XXXXXXXXXXXXXX, where XXXXXXXXXXXXXX is replaced by the Web Unlocker token.
- A Google Gemini API key (or access through Vertex AI or a proxy).
- A webhook or endpoint to receive the summary (e.g., Slack, Notion, or a custom HR dashboard).

How to customize this workflow to your needs
- Change the summary focus by updating the summarization methods and prompts in the "Summarization of Glassdoor Response" node to extract specific insights: cultural feedback, leadership issues, compensation comments, exit motivation.
- Update the "HTTP Request to Glassdoor" node with the specific Glassdoor company information you are looking for.
- Format the output to produce a customized summary in Markdown or HTML for rich delivery.
- Integrate with HR systems (BambooHR, Workday, SAP SuccessFactors) via API, or with Google Sheets or Airtable.
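The polling step can be modeled as an HTTP Request node followed by an IF node that loops through a Wait node until the snapshot is ready. A sketch of the readiness check in a Code node (the progress endpoint and `status` field follow Bright Data's dataset API; verify both against the current docs):

```javascript
// Assumes the previous HTTP Request node called Bright Data's progress endpoint,
// e.g. GET https://api.brightdata.com/datasets/v3/progress/{snapshot_id},
// and that its JSON response carries a status field.
const status = $input.first().json.status;

// The downstream IF node routes on `ready`: true continues to the snapshot
// download, false loops back through a Wait node to poll again.
return [{ json: { ready: status === 'ready', status } }];
```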
by Joseph LePage
🔍 This n8n workflow integrates Tavily's search and extract APIs with AI summarization capabilities to process web content efficiently.

Quick Setup
1. Get your Tavily API key from https://app.tavily.com/home
2. Replace tvly-YOUR_API_KEY in the "Tavily API Key" node
3. Connect your OpenAI credentials to the "OpenAI Chat Model" node
4. Deploy the workflow and start the chat trigger

Core Features
Search & Extract 🎯
- Intelligent web searching with relevance filtering
- Automated content extraction from top results
- AI-powered content summarization in markdown format

User Interaction 💬
- Chat-based search topic input
- Real-time processing pipeline
- Structured markdown output

The workflow demonstrates a practical implementation of Tavily's API endpoints, handling the complete process from search to summarization in a single automated pipeline.
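For orientation, a Tavily search call from an HTTP Request node is a POST to https://api.tavily.com/search with a JSON body along these lines (a sketch; confirm the exact auth style and field names against Tavily's current docs):

```json
{
  "api_key": "tvly-YOUR_API_KEY",
  "query": "n8n workflow automation",
  "search_depth": "basic",
  "max_results": 3,
  "include_answer": false
}
```

The extract endpoint works the same way, taking the URLs of the top search results and returning their page content for the summarization step.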
by Richard Uren
Task
Read a list of customers from a Google Sheet and create them in Shopify using Shopify's Admin API (GraphQL).

Why?
- Generate test users for development stores.
- Migrate customers from other platforms.
- Easy intro to Shopify's GraphQL API.

Setup
Setting up Google Sheets access
Follow the instructions in the n8n docs for granting OAuth2 access to Google services. You'll need to grant API access to Google Sheets and Google Drive (to list available sheets).

Setting up Shopify access
Shopify's Admin API uses header auth with a key of X-Shopify-Access-Token and a value of your Shopify access token, which starts with shpat_.

How to generate a Shopify Access Token
To generate a Shopify access token, create an app, grant the app the necessary scopes, then generate a token. From inside a store, do the following:
- Click Settings (nav link)
- Click Apps and sales channels (nav link)
- Click Develop apps (button)
- Click Create app (button)
- Give the app a name
- Click Configure Admin API scopes (button)
- At a minimum, grant the read_customers and write_customers scopes. Grant additional scopes if you plan on accessing other parts of the API.
- Click Save

To generate the token:
- Click Install app (button)
- Click Install in the dialog that pops up (button)
- Click Reveal token once (button)
- Copy the token into a password vault or somewhere secure.

Template Updates
To test this out you'll need to make the following changes:
1) Create a header credential where the key is X-Shopify-Access-Token and the value is your Shopify access token (it starts with shpat_).
2) In the GraphQL node, change the endpoint URL to your store, e.g. https://{your store goes here}.myshopify.com/admin/api/2025-04/graphql.json

Google Sheet Structure
Columns can be in any order, because the rows will be mapped to fields in a JSON object. n8n will treat the first row in the sheet as column names, so at a minimum use the column names below in row 1 of your sheet:
- first_name: any string
- last_name: any string
- email: valid email
- mobile_phone: international mobile phone format with no spaces, e.g. +61414708406 (Shopify will reject anything else).

Example CSV
```
"first_name","last_name","email","mobile_phone"
"Bob","Smith","bob@example.com","+61414999999"
```
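Inside the GraphQL node, creating a customer comes down to a customerCreate mutation. A minimal sketch (the mutation and input fields follow Shopify's Admin API; map the variables from your sheet columns):

```graphql
mutation customerCreate($input: CustomerInput!) {
  customerCreate(input: $input) {
    customer { id email }
    userErrors { field message }
  }
}
```

With variables built from each row, e.g. { "input": { "firstName": "Bob", "lastName": "Smith", "email": "bob@example.com", "phone": "+61414999999" } }. Check userErrors in the response: Shopify reports rejected rows (such as malformed phone numbers) there rather than failing the whole request.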
by Joseph
This workflow automates invoice generation from form submissions, ensuring unique order IDs, creating PDF invoices, storing files, emailing customers, and logging invoice data — all seamlessly integrated.

🔹 Workflow Overview
- Trigger (Webhook): Starts when an order form is submitted, capturing customer and order details.
- Generate Random Order ID: A Function node creates a unique alphanumeric invoice ID (e.g., INV-X92B7D).
- Check for Duplicate Order ID: Google Sheets looks up the generated order ID in your invoice log sheet to prevent duplicates.
- Conditional Check (IF Node): If the ID already exists → regenerates a new ID (loops back). If unique → proceeds to invoice creation.
- Prepare Invoice Data: A Set node formats customer info, date, order items, and the unique order ID to fit your invoice template.
- Convert HTML to PDF: An HTTP Request node sends your invoice HTML to the RapidAPI HTML-to-PDF service and receives the PDF file.
- Upload PDF to Cloud Storage: Saves the PDF in Google Drive or Dropbox with a clear file name like Invoice-INV-X92B7D.pdf.
- Send Invoice Email to Customer: An Email node attaches the PDF and includes the order ID in the email subject/body.
- Log Invoice Details: Appends invoice data (customer info, order ID, total, PDF link) to your Google Sheet for tracking.

⚙️ Node Details & Setup
1. Webhook Trigger
Configure it to receive form submissions (order details like name, email, items, total).
2. Function: Generate Random Order ID
Sample JS code generates unique IDs prefixed with INV- (see the sketch below).
3. Google Sheets: Lookup Row
Set up the connection to your invoice log sheet. Search for an existing order ID to avoid duplicates.
4. IF Node: Check Order ID Existence
Condition: if the order ID is found → loop back to regenerate; else → continue the workflow.
5. Set Node: Prepare Invoice HTML
Define variables like customer name, date, items, and order ID. This data populates your HTML invoice template.
6. HTTP Request: Convert HTML to PDF
Use the API URL to get your key. Send the invoice HTML in the request body and receive the PDF file blob or a download URL.
7. Google Drive (or Dropbox) Upload
Upload the PDF file. Use the file name format: Invoice-{{$json["order_id"]}}.pdf
8. Email Node
Recipient: customer email from the form data. Attach the generated PDF and include the order ID in the email subject or body for reference.
9. Google Sheets: Append Row
Log invoice metadata to keep records updated.

📁 Google Sheets Template
You can make a copy of the invoice log template here. This sheet includes columns for order_id, customer name, email, total, and invoice PDF link. Customize it as needed.

📌 Additional Notes
- Customize the invoice HTML template inside the Set node to match your branding.
- Ensure API credentials for RapidAPI, Google Drive/Dropbox, and email are properly set up in your n8n credentials.
- You can expand this workflow by adding payment processing or SMS notifications.

Need help or want a custom workflow? Reach out via email at joseph@uppfy.com.
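A minimal sketch of the ID generator from step 2, written for an n8n Code node; the six-character length and alphabet are illustrative choices:

```javascript
// Generate a random alphanumeric order ID such as INV-X92B7D.
const alphabet = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789';
let suffix = '';
for (let i = 0; i < 6; i++) {
  suffix += alphabet[Math.floor(Math.random() * alphabet.length)];
}

// Pass the ID along with the incoming form data for the duplicate check.
return [{ json: { ...$input.first().json, order_id: `INV-${suffix}` } }];
```

Because the ID is random rather than sequential, the Google Sheets lookup in step 3 is what actually guarantees uniqueness; the loop simply regenerates on the rare collision.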
by Avkash Kakdiya
How it works
This workflow enhances contact intelligence by retrieving new or updated contact data, enriching it using AI and external APIs, and then updating your CRM or contact management system with intelligent insights. It automates the process of gathering, enriching, and organizing contact information to improve targeting, personalization, and engagement.

Step-by-step
1. Trigger & Input
- The workflow is triggered by a scheduler or webhook event.
- It reads a new (or updated) contact entry from your source, such as a spreadsheet or form.
- Basic fields like name, email, and company are used as the starting point for enrichment.

2. Contact Lookup & Parsing
- The contact's domain or company is extracted and used to perform a lookup via an external data source.
- Data such as company details, job title, or LinkedIn profile is retrieved.
- Results are parsed and cleaned to remove duplicates, missing values, or invalid entries.

3. AI Enrichment
- The enriched contact is passed through an AI model (such as GPT or another NLP service).
- The model analyzes job role, seniority, and inferred interests based on available data.
- Insights like intent, persona category, or engagement score are generated.

4. Validation & Tagging
- The AI-enriched data is validated to ensure consistency and accuracy.
- Tags and segments (e.g., "Decision Maker", "Technical Buyer") are assigned based on rules or AI inference.
- This enables smart filtering, targeting, and routing later in your CRM or campaigns.

5. Output & Integration
- The final enriched and validated contact is written back to your CRM, sheet, or marketing platform.
- The system also sends a Slack/email alert with a summary, updates the original contact entry with a "Processed" or "Enriched" status, and triggers next steps such as personalized outreach or nurture sequences.

Benefits
- Enhances contact profiles with AI-generated insights and third-party data.
- Improves segmentation and targeting through smart tags and persona classification.
- Automates manual research, saving time and improving accuracy.
- Easily extendable by adding more AI models, data sources, or CRM integrations.
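Step 2's lookup starts from a usable company domain. A minimal sketch of that extraction in a Code node, with assumed field names:

```javascript
// Derive a company domain from each contact's email address, skipping
// free-mail providers that say nothing about the employer.
const FREE_MAIL = new Set(['gmail.com', 'yahoo.com', 'outlook.com', 'hotmail.com']);

return $input.all().map(item => {
  const email = (item.json.email ?? '').toLowerCase().trim();
  const domain = email.split('@')[1] ?? null;
  return {
    json: {
      ...item.json,
      // Only keep domains that can plausibly be looked up as a company.
      company_domain: domain && !FREE_MAIL.has(domain) ? domain : null,
    },
  };
});
```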
by Robert Breen
This no-code n8n workflow finds recent Instagram posts by hashtag, scrapes profile data, and uses an AI agent to evaluate whether each account is a good collaboration lead. The workflow filters based on the number of followers and the content of their bio, and outputs structured reasoning for outreach decisions.

Perfect for creators, marketers, or business developers looking to automate influencer or community partnership prospecting—especially in niche ecosystems like n8n.

✅ Key Features
- **🔍 Hashtag Discovery**: Finds recent Instagram posts from a specified hashtag (e.g., #n8n)
- **👤 Account Scraping**: Retrieves profile details such as follower count and biography
- **🧠 AI Evaluation**: Uses OpenAI and LangChain to determine if the profile is a good fit for outreach
- **📦 Structured Output**: Returns a JSON object with "Yes/No" lead status and reasoning
- **🛠️ Manual Execution**: Run on demand using the manual trigger

🧰 What You'll Need
| Tool / API | Purpose | Setup Steps |
|-------------------------|------------------------------------------|-------------|
| Apify Account | To access Instagram scraping actors | Create account → Generate API Token → Use in httpQueryAuth credential in n8n |
| OpenAI API Key | To power the AI decision-making agent | Sign up at OpenAI → Create API key → Paste into OpenAI credential in n8n |
| LangChain Plugin for n8n | AI Orchestration with System Message | Install LangChain nodes from Community Nodes (already installed in this workflow) |

🔧 Step-by-Step Setup

1️⃣ Manual Trigger
- **Node**: When clicking ‘Execute workflow’
- **Use**: Allows you to run the workflow manually while testing.

2️⃣ Define Hashtag
- **Node**: Create Search Term
- **Value**: Sets "n8n" as the default Instagram hashtag to scan. You can edit this to any other hashtag you'd like.

3️⃣ Find Recent Posts
- **Node**: Find Recent Posts
- **API**: Apify Instagram Hashtag Scraper
- **Auth Setup**: Go to your Apify Console → click “Create new token” → in n8n, create a new HTTP Query Auth credential → set the token in the token query param (e.g., ?token=yourTokenHere) → choose the credential in this node

4️⃣ Scrape Each Profile
- **Node**: Scrape Accounts
- **API**: Apify Instagram Profile Scraper
- **Body**: JSON with usernames from the hashtag search
- **Note**: Uses the same httpQueryAuth credential as the previous node.

5️⃣ Extract Fields
- **Node**: Set bio and follower count
- **What it does**: Extracts biography and followersCount from the profile JSON and stores them in clean variables for AI input (a Code-node alternative is sketched below).

6️⃣ AI Lead Scoring
- **Node**: AI Agent
- **Purpose**: Uses GPT-4o-mini to analyze the bio and follower count
- **Prompt Details**:

7️⃣ AI Model
- **Node**: OpenAI Chat Model
- **Model**: gpt-4o-mini
- **Credential**: Connect your OpenAI account via API key. Go to OpenAI API Keys, copy your key, and create a new OpenAI API credential in n8n.

8️⃣ Output Parser
- **Node**: Structured Output Parser
- **What it does**: Parses the response from the AI into structured JSON for further use (e.g., storing leads, sending to Airtable, etc.)

🧪 Sample Output
```json
{
  "lead status": "Yes",
  "Reasoning": "The user has 3.5k followers and their bio shows they build automations with n8n."
}
```

📬 Need More Help?
If you'd like assistance setting this up, customizing it to your niche, or expanding it to score and store leads automatically — I can help!

👤 Robert Breen
Automation Consultant | AI Workflow Designer | n8n Expert
📧 robert@ynteractive.com
🌐 ynteractive.com
🔗 LinkedIn
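For reference, the extraction in step 5️⃣ can also be done in a Code node instead of a Set node. A minimal sketch, assuming the Apify profile items expose `username`, `biography`, and `followersCount` fields:

```javascript
// Reduce each scraped profile to just the fields the AI agent needs.
return $input.all().map(item => ({
  json: {
    username: item.json.username,
    bio: item.json.biography ?? '',
    followersCount: item.json.followersCount ?? 0,
  },
}));
```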
by Davide
This workflow is designed to automate the generation and updating of SEO meta titles and descriptions for WooCommerce products using n8n. It leverages Google Sheets for data input, a FREE language model (Gemini 2.0 Flash Exp via OpenRouter) for generating SEO-optimized meta tags, and WooCommerce for updating product details.

How It Works
- Trigger: The workflow can be triggered manually or on a schedule. The manual trigger allows for testing, while the schedule trigger can be set to run at regular intervals (e.g., every few minutes) to process new products.
- Data Retrieval: The workflow starts by retrieving product IDs from a Google Sheets document, looking for products that do not yet have meta titles or descriptions. Using the retrieved product ID, the workflow fetches the corresponding product details from WooCommerce, including the product name, description, short description, and categories.
- Meta Tag Generation: The product details are passed to a language model (Gemini 2.0 Flash Exp) via OpenRouter. The model generates SEO-optimized meta titles and descriptions based on the provided content. The generated meta tags are structured and validated to ensure they meet SEO best practices, such as character limits and keyword inclusion.
- Update WooCommerce: The generated meta title and description are then written to the WooCommerce product metadata using the Yoast SEO fields.
- Update Google Sheets: Finally, the workflow updates the Google Sheets document with the newly generated meta tags, along with the product URL, title, and the timestamp of the update.

Set Up Steps
- Google Sheets Setup: Create a copy of the provided Google Sheets template and insert WooCommerce product IDs in column "B". Ensure the Google Sheets document has columns for METATITLE, METADESCRIPTION, URL, TITLE POST, and DATA (timestamp).
- n8n Workflow Configuration:
  - Google Sheets Node: Configure the "Get product ID" node to connect to your Google Sheets document. Use OAuth2 for authentication.
  - WooCommerce Node: Set up the WooCommerce nodes to connect to your WooCommerce store using the WooCommerce API credentials.
  - OpenRouter Node: Configure the "Gemini 2.0 Flash Exp" node with your OpenRouter API credentials to access the language model.
  - Structured Output Parser: Ensure the output parser is set to handle the structured data format for meta titles and descriptions.
- Workflow Execution: Trigger the workflow manually to test the process, or set up a schedule trigger to automate it at regular intervals. Monitor the workflow execution to ensure that meta tags are generated and updated correctly in both WooCommerce and Google Sheets.
- Validation: After the workflow runs, verify that the meta titles and descriptions in WooCommerce are correctly updated and that the Google Sheets document reflects the changes.

This workflow streamlines the process of optimizing WooCommerce product pages for SEO, saving time and ensuring consistency in meta tag generation.

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
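The "Update WooCommerce" step writes the generated tags into the product's meta_data via the WooCommerce API. As a sketch, the update payload looks roughly like this; the `_yoast_wpseo_*` keys are the meta keys Yoast SEO conventionally uses for title and description, so verify them against your Yoast version:

```json
{
  "meta_data": [
    { "key": "_yoast_wpseo_title", "value": "Handmade Oak Desk | Example Store" },
    { "key": "_yoast_wpseo_metadesc", "value": "Solid oak desk, handmade to order with free shipping. Order yours today." }
  ]
}
```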
by Automate With Marc
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

🧠 AI-Powered Blog Post Generator
Category: Content Automation / AI Writing / Marketing

Description:
This automated workflow helps you generate fresh, SEO-optimized blog posts daily using AI tools—perfect for solo creators, marketers, and content teams looking to stay on top of the latest AI trends without manual research or writing.

For more of such builds and step-by-step tutorial guides, check out: https://www.youtube.com/@Automatewithmarc

Here's how it works:
- A Schedule Trigger kicks off the workflow daily (or at your preferred interval).
- A Perplexity AI node researches the most interesting recent AI news, tailored for a non-technical audience.
- An AI Agent (Claude via Anthropic) turns that news into a full-length blog post based on a structured prompt that includes a title, intro, 3+ section headers, takeaway, and meta description, designed for clarity, engagement, and SEO.
- Optional Memory & Perplexity Tool nodes enhance the agent's responses by allowing it to clarify facts or fetch more context.
- A Google Docs node automatically saves the final blog post to your selected document, ready for review, scheduling, or publishing.

Key Features:
- Combines Perplexity AI + Claude AI (Anthropic) for research + writing
- Built-in memory and retrieval logic for deeper contextual accuracy
- Non-technical, friendly writing style ideal for general audiences
- Output saved directly to Google Docs
- Fully no-code, customizable, and extendable

Use Cases:
- Automate weekly blog content for your newsletter or site
- Repurpose content into social posts or scripts
- Keep your brand relevant in the fast-moving AI landscape

Setup Requirements:
- Perplexity API key
- Anthropic API key
- Google Docs (OAuth2 connected)
by Viktor
Nightly Discord Channel Cleanup

This workflow runs every day at 9:00 p.m. and:
- Retrieves all Discord channels using your provided credentials.
- Pauses briefly to respect Discord API rate limits.
- Loops through each channel and fetches messages.
- Filters the results down to messages older than seven days.
- Deletes those older messages, again pausing to stay within deletion rate limits.

By setting up this workflow on a schedule, you can automatically keep Discord channels tidy and compliant with retention policies.

👨🎤 Setup
- Add your Discord credentials
- Change the server in each Discord node to the correct one
- Click the Test Workflow button
- Activate the workflow to run on a schedule
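A minimal sketch of the age filter as a Code node; the message timestamp field name is an assumption, so adjust it to whatever your Discord node actually returns:

```javascript
// Keep only messages older than seven days so the next node can delete them.
const SEVEN_DAYS_MS = 7 * 24 * 60 * 60 * 1000;
const cutoff = Date.now() - SEVEN_DAYS_MS;

return $input.all().filter(item => {
  const created = new Date(item.json.timestamp).getTime(); // assumed ISO timestamp field
  return Number.isFinite(created) && created < cutoff;
});
```

Note that Discord's bulk-delete endpoint refuses messages older than two weeks, so older messages must be deleted one at a time, which is why the rate-limit pauses matter.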