by Daniel Shashko
This workflow automates the creation of user-generated-content (UGC) style product videos by combining Gemini's image generation with OpenAI's Sora 2 video generation. It accepts webhook requests with product descriptions, generates images and videos, stores them in Google Drive, and logs all outputs to Google Sheets for easy tracking.

## Main Use Cases
- Automate product video creation for e-commerce catalogs and social media.
- Generate UGC-style content at scale without manual design work.
- Create engaging video content from simple text prompts for marketing campaigns.
- Build a centralized library of product videos with automated tracking and storage.

## How it works
The workflow operates as a webhook-triggered process, organized into these stages:

1. **Webhook Trigger & Input**
   - Accepts POST requests to the `/create-ugc-video` endpoint.
   - Required payload includes: product prompt, video prompt, Gemini API key, and OpenAI API key (see the sample payload below).
2. **Image Generation (Gemini)**
   - Sends the product prompt to Google's Gemini 2.5 Flash Image model.
   - Generates a product image based on the description provided.
3. **Data Extraction**
   - A Code node extracts the base64 image data from Gemini's response.
   - Preserves all prompts and API keys for subsequent steps.
4. **Video Generation (Sora 2)**
   - Sends the video prompt to OpenAI's Sora 2 API.
   - Initiates video generation at 720x1280 resolution with an 8-second duration.
   - Returns a video generation job ID for polling.
5. **Video Status Polling**
   - Checks video generation status via the OpenAI API.
   - If the status is "completed": proceeds to download.
   - If the video is still processing: waits 1 minute and retries (polling loop).
6. **Video Download & Storage**
   - Downloads the completed video file from OpenAI.
   - Uploads the MP4 file to Google Drive (root folder).
   - Generates a shareable Google Drive link.
7. **Logging to Google Sheets**
   - Records all generation details in a tracking spreadsheet: product description, video URL (Google Drive link), generation status, and timestamp.

## Summary Flow
Webhook Request → Generate Product Image (Gemini) → Extract Image Data → Generate Video (Sora 2) → Poll Status
- **If complete:** Download Video → Upload to Google Drive → Log to Google Sheets → Return Response
- **If not complete:** Wait 1 Minute → Poll Status Again

## Benefits
- Fully automated video creation pipeline from text to finished product.
- Scalable solution for generating multiple product videos on demand.
- Combines cutting-edge AI models (Gemini + Sora 2) for high-quality output.
- Centralized storage in Google Drive with automatic logging in Google Sheets.
- Flexible webhook interface allows integration with any application or service.
- Retry mechanism ensures videos are captured even with longer processing times.
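For clarity, a minimal sketch of the POST body the webhook expects. The four required fields come from the description above, but the exact key names are illustrative assumptions; match them to however the Webhook node actually parses the request:

```json
{
  "product_prompt": "Matte-black insulated water bottle on a marble counter, soft morning light",
  "video_prompt": "Handheld UGC-style clip of a creator unboxing the bottle and showing it off",
  "gemini_api_key": "YOUR_GEMINI_API_KEY",
  "openai_api_key": "YOUR_OPENAI_API_KEY"
}
```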
by Paolo Ronco
# Amazon Luna Prime Games Catalog Tracker (Auto-Sync to Google Sheets)

Automatically fetch, organize, and maintain an updated catalog of Amazon Luna "Included with Prime" games. This workflow regularly queries Amazon's official Luna endpoint, extracts complete metadata, and syncs everything into Google Sheets without duplicates.

Ideal for:
- tracking monthly Prime Luna rotations
- keeping a personal archive of games
- monitoring new games appearing on Amazon Games / Prime Gaming, so you can instantly play titles you're interested in
- building dashboards or gaming databases
- powering notification systems (Discord, Telegram, email, etc.)

## Overview
Amazon Luna's "Included with Prime" lineup changes frequently, with new games added and old ones removed. Instead of checking manually, this n8n template fully automates the process:
- Fetches the latest list from Amazon's backend
- Extracts detailed metadata from the response
- Syncs the data into Google Sheets
- Avoids duplicates by updating existing rows
- Supports all major Amazon regions

Once configured, it runs automatically, keeping your game catalog correct, clean, and always up to date.

## 🛠️ How the workflow works

### 1. Scheduled Trigger
Starts the workflow on a set schedule (default: every 5 days at 3:00 PM). You can change both frequency and time freely.

### 2. HTTP Request to Amazon Luna
Calls Amazon Luna's regional endpoint and retrieves the full "Included with Prime" catalog.

### 3. JavaScript Code Node – Data Extraction
Parses the JSON response and extracts structured fields:
- Title
- Genres
- Release Year
- ASIN
- Image URLs
- Additional metadata

The result is a clean, ready-to-use dataset.

### 4. Google Sheets – Insert or Update Rows
Each game is written into the selected Google Sheet:
- Existing games get updated
- New games are appended

The Title acts as the unique identifier to prevent duplicates.

## ⚙️ Configuration Parameters

| Parameter | Description | Recommended values |
| --- | --- | --- |
| x-amz-locale | Language + region | it_IT 🇮🇹 · en_US 🇺🇸 · de_DE 🇩🇪 · fr_FR 🇫🇷 · es_ES 🇪🇸 · en_GB 🇬🇧 · ja_JP 🇯🇵 · en_CA 🇨🇦 |
| x-amz-marketplace-id | Marketplace backend ID | APJ6JRA9NG5V4 🇮🇹 · ATVPDKIKX0DER 🇺🇸 · A1PA6795UKMFR9 🇩🇪 · A13V1IB3VIYZZH 🇫🇷 · A1RKKUPIHCS9HS 🇪🇸 · A1F83G8C2ARO7P 🇬🇧 · A1VC38T7YXB528 🇯🇵 · A2EUQ1WTGCTBG2 🇨🇦 |
| Accept-Language | Response language | Example: it-IT,it;q=0.9,en;q=0.8 |
| User-Agent | Browser-like request | Default or updated UA |
| Trigger interval | Refresh frequency | Every 5 days at 3:00 PM (modifiable) |
| Google Sheet | Storage output | Select your file + sheet |

You can adapt these headers to fetch data from any supported country (see the header sketch below).

## 💡 Tips & Customization

**🌍 Regional catalogs**
Duplicate the HTTP Request + Code + Sheet block to track multiple countries (US, DE, JP, UK…).

**🧹 No duplicates**
The workflow updates rows intelligently, ensuring a clean catalog even after many runs.

**🗂️ Move data anywhere**
Send the output to:
- Airtable
- Databases (MySQL, Postgres, MongoDB…)
- Notion
- CSV
- REST APIs
- BI dashboards

**🔔 Add notifications (Discord, Telegram, Email, etc.)**
You can pair this template with a notification workflow. When used with Discord, the notification message can include:
- the game title
- description or metadata
- the game's image, automatically downloaded and attached

This makes notifications visually informative and perfect for tracking new Prime titles.

## 🔒 Important Notes
All retrieved data belongs to Amazon. The workflow is intended for personal, testing, or educational use only. Do not republish or redistribute collected data without permission.
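As a concrete illustration, the HTTP Request node's headers for the US marketplace could look like the JSON below. The header names and values come straight from the configuration table; the User-Agent string is just one plausible browser UA, and the Luna endpoint URL itself is not reproduced here:

```json
{
  "x-amz-locale": "en_US",
  "x-amz-marketplace-id": "ATVPDKIKX0DER",
  "Accept-Language": "en-US,en;q=0.9",
  "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/124.0 Safari/537.36"
}
```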
by Joseph
## Overview
This n8n workflow creates an intelligent AI agent that automates browser interactions through Airtop's browser automation platform. The agent can control real browser sessions, navigate websites, interact with web elements, and maintain detailed session records, all while providing live viewing capabilities for real-time monitoring.

YouTube tutorial: https://www.youtube.com/watch?v=XoZqFY7QFps

## What This Workflow Does
The AI agent acts as your virtual assistant in the browser, capable of:
- **Session Management**: Creates, monitors, and terminates browser sessions with proper tracking
- **Web Navigation**: Visits websites, clicks elements, fills forms, and performs complex interactions
- **Multi-Window Support**: Manages multiple browser windows within sessions
- **Live Monitoring**: Provides real-time viewing URLs so you can watch the automation
- **Data Tracking**: Maintains comprehensive records of all browser activities
- **Profile Integration**: Uses Airtop profiles for authenticated sessions
- **Email Notifications**: Sends live URLs and status updates via Gmail

## Demo Use Case: Automated Reddit Posting
The tutorial demonstrates the agent's capabilities by:
- Logging into Reddit using pre-configured Airtop profile credentials
- Navigating to a specific subreddit based on user input
- Creating and publishing a new post with title and content
- Tracking the entire process with detailed session records
- Providing live viewing access throughout the automation

## Core Workflow Components

### 1. Chat Interface Trigger
- **Node Type**: Chat Trigger
- **Purpose**: Accepts user commands for browser automation tasks
- **Input**: Natural language instructions (e.g., "Create a Reddit post in r/automation")

### 2. AI Agent Processing
- **Node Type**: OpenAI GPT-4
- **Purpose**: Interprets user requests and determines appropriate browser actions
- **System Message**: Contains the comprehensive agent instructions from your documentation
- **Capabilities**:
  - Understands complex web interaction requests
  - Plans multi-step browser workflows
  - Manages session states intelligently
  - Handles error scenarios gracefully

### 3. Google Sheets Data Management
Multiple Google Sheets nodes manage different aspects of session tracking:

**Browser Sessions Sheet**
- **Fields**: session_id, description, status, created_date
- **Purpose**: Tracks active browser sessions
- **Operations**: Create, read, update session records

**Window Sessions Sheet**
- **Fields**: session_id, window_id, description, airtop_live_view_url, status
- **Purpose**: Tracks individual browser windows within sessions
- **Operations**: Create, read, update window records

**Airtop Profiles Sheet**
- **Fields**: platform_name, platform_url, profile_name
- **Purpose**: Stores available authenticated profiles
- **Operations**: Read available profiles for session creation

### 4. Airtop Browser Automation Nodes
Multiple specialized nodes for browser control:

**Session Management**
- **create_session**: Creates new browser sessions with optional profile authentication
- **terminate_session**: Closes browser sessions and updates records
- **read_airtop_profiles**: Retrieves available authentication profiles

**Window Management**
- **create_window**: Opens new browser windows with specified URLs
- **query_page**: Analyzes page content and identifies interactive elements

**Web Interaction**
- **click_element**: Clicks specific page elements based on AI descriptions
- **type_text**: Inputs text into form fields and input elements

### 5. Gmail Integration
- **Node Type**: Gmail Send
- **Purpose**: Sends live viewing URLs and status updates
- **Recipients**: User email for real-time monitoring
- **Content**: Complete Airtop live view URLs for browser session observation

### 6. Error Handling & Validation
- **Input Validation**: Ensures required parameters are present
- **Session State Checks**: Verifies browser session status before operations
- **Error Recovery**: Handles failed operations gracefully
- **Data Consistency**: Maintains accurate session records even during failures

## Technical Requirements

### API Credentials Needed
1. **Airtop.ai API Key**
   - Sign up at airtop.ai
   - Generate an API key from the dashboard
   - Required for all browser automation functions
2. **OpenAI API Key**
   - OpenAI account with GPT-4 access
   - Required for AI agent intelligence and decision-making
3. **Google Sheets Access**
   - Google account with Google Sheets API access
   - Copy the provided template and get your sheet URL
   - Required for session and profile data management
4. **Gmail OAuth**
   - Google account with Gmail API access
   - Required for sending live viewing URLs and notifications

### Airtable Base Structure
Create three tables in your Airtable base:

**1. Browser Details (Sessions)**
- session_id (Single line text)
- description (Single line text)
- status (Single select: Open, Closed)
- created_date (Date)

**2. Window Details (Windows)**
- session_id (Single line text)
- window_id (Single line text)
- description (Single line text)
- airtop_live_view_url (URL)
- status (Single select: Open, Closed)

**3. Airtop Profiles**
- platform_name (Single line text)
- platform_url (URL)
- profile_name (Single line text)

## Workflow Logic Flow

### User Request Processing
1. **User Input**: Natural language command via the chat interface
2. **AI Analysis**: OpenAI processes the request and determines required actions
3. **Session Check**: The agent reads the current browser session status
4. **Action Planning**: AI creates a step-by-step execution plan

### Browser Session Lifecycle
1. **Session Creation**:
   - Check for existing open sessions
   - Ask the user about profile usage if needed
   - Create a new Airtop session
   - Record session details in Airtable
2. **Window Management**:
   - Create a browser window with the target URL
   - Capture the live viewing URL
   - Record window details in Airtable
   - Send the live URL via Gmail
3. **Web Interactions**:
   - Query page content for element identification
   - Execute clicks, form fills, and navigation
   - Monitor page state changes
   - Handle dynamic content loading
4. **Session Cleanup**:
   - Terminate the browser session when complete
   - Update all related records to "Closed" status
   - Send a completion notification

### Data Flow Architecture
User Input → AI Processing → Session Management → Browser Actions → Data Recording → User Notifications

A sketch of the tracked records is shown at the end of this section.

## Key Features & Benefits

### Intelligent Automation
- **Natural Language Control**: Users can describe tasks in plain English
- **Context Awareness**: AI understands complex multi-step workflows
- **Adaptive Responses**: Handles unexpected page changes and errors
- **Profile Integration**: Seamlessly uses stored authentication credentials

### Real-Time Monitoring
- **Live View URLs**: Watch browser automation as it happens
- **Status Updates**: Real-time notifications of task progress
- **Session Tracking**: Complete audit trail of all browser activities
- **Multi-Window Support**: Handle complex workflows across multiple tabs

### Enterprise-Ready Features
- **Error Recovery**: Robust handling of network issues and page failures
- **Session Persistence**: Maintains state across workflow interruptions
- **Data Integrity**: Consistent record-keeping even during failures
- **Scalable Architecture**: Can handle multiple concurrent automation tasks

## Use Cases Beyond Reddit
This workflow architecture supports automation for any website:

### Social Media Management
- **Multi-platform posting**: Facebook, Twitter, LinkedIn, Instagram
- **Community engagement**: Responding to comments and messages
- **Content scheduling**: Publishing posts at optimal times
- **Analytics gathering**: Collecting engagement metrics

### Business Process Automation
- **CRM data entry**: Updating customer records across platforms
- **Support ticket management**: Creating, updating, and routing tickets
- **E-commerce operations**: Product listings, inventory updates
- **Report generation**: Gathering data from multiple web sources

### Personal Productivity
- **Travel booking**: Comparing prices, making reservations
- **Bill management**: Paying utilities, checking statements
- **Job applications**: Submitting applications, tracking status
- **Research tasks**: Gathering information from multiple sources

## Advanced Configuration Options

### Custom Profiles
- Create Airtop profiles for different websites
- Store authentication credentials securely
- Switch between different user accounts
- Handle multi-factor authentication flows

### Workflow Customization
- Modify AI system prompts for specific use cases
- Add custom validation rules
- Implement retry logic for failed operations
- Create domain-specific interaction patterns

### Integration Extensions
- Connect to additional data sources
- Add webhook notifications
- Implement approval workflows
- Create audit logs and reporting

## Getting Started
1. 📊 Copy the Google Sheets template (just click and make a copy!)
2. Set up credentials for Airtop, OpenAI, and Gmail
3. Import the workflow into your n8n instance
4. Configure node credentials with your API keys and Google Sheets URL
5. Test with simple commands like "Visit google.com"
6. Expand to complex workflows as you become comfortable

## Best Practices

### Session Management
- Always check for existing sessions before creating new ones
- Properly terminate sessions to avoid resource waste
- Use descriptive names for sessions and windows
- Regularly clean up old session records

### Error Handling
- Implement timeout handling for slow-loading pages
- Add retry logic for network failures
- Validate element existence before interactions
- Log detailed error information for debugging

### Security Considerations
- Store sensitive credentials in Airtop profiles, not the workflow
- Use webhook authentication for production deployments
- Implement rate limiting to avoid being blocked by websites
- Regularly audit browser session activities

This workflow transforms n8n into a powerful browser automation platform, enabling you to automate virtually any web-based task while maintaining full visibility and control over the automation process.
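A minimal sketch of one tracked session and its window as the workflow might record them, using only the column names listed above; every value (IDs, dates, and the live-view URL) is a placeholder:

```json
{
  "browser_session": {
    "session_id": "sess_abc123",
    "description": "Reddit posting session",
    "status": "Open",
    "created_date": "2024-05-01"
  },
  "window_session": {
    "session_id": "sess_abc123",
    "window_id": "win_001",
    "description": "r/automation new post form",
    "airtop_live_view_url": "https://example.airtop.ai/live/...",
    "status": "Open"
  }
}
```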
by Anirudh Aeran
This workflow is a complete, AI-powered content engine designed to help automation experts build their personal brand on LinkedIn. It transforms a technical n8n workflow (in JSON format) into a polished, engaging LinkedIn post, complete with a custom-generated AI image and a strategic call-to-action. This system acts as your personal content co-pilot, handling the creative heavy lifting so you can focus on building, not just writing.

## Who's it for
This template is for n8n developers, automation consultants, and tech content creators who want to consistently showcase their work on LinkedIn but lack the time or desire to write marketing copy and design visuals from scratch. If you want to turn your projects into high-quality content with minimal effort, this is your solution.

## How it works
This workflow is divided into two main parts that work together through Telegram:

**Content Generation & Image Creation**
1. You send an n8n workflow's JSON file to your first Telegram bot.
2. The workflow sends the JSON to Google Gemini with a sophisticated prompt, instructing it to analyze the workflow and write a compelling LinkedIn post in one of two high-engagement styles ("Builder" or "Strategist").
3. Gemini also generates a detailed prompt for an AI image model, including a specific headline to be embedded in the visual.
4. This image prompt is then sent to the Cloudflare Workers AI model to generate a unique, high-quality image for your post (see the request sketch below).
5. The final image and the AI-generated text prompt are sent back to you via Telegram for review.

**Posting to LinkedIn**
1. You use a second Telegram bot for publishing. Simply reply to the image you received from the first bot with the final, polished post text.
2. The workflow triggers on your reply, grabs the image and the text, and automatically publishes them as a new post on your LinkedIn profile.

## Why Two Different Workflows?
The first workflow sends you the image and the post content. You can make changes to the content or the image, then send the image to Bot 2. Copy the post content and send it to Bot 2 as a reply to the image. Both the image and the content will then be posted to LinkedIn as a single post.

## How to set up
1. **Create Two Telegram Bots**: You need two separate bots. Use BotFather on Telegram to create them and get their API tokens.
   - Bot 1 (Generator): For submitting JSON and receiving the generated content/image.
   - Bot 2 (Publisher): For replying to the image to post on LinkedIn (after human verification).
2. **Set Up Accounts & Credentials**: Add credentials for Google Gemini, Cloudflare (with an API token), Google Sheets, and LinkedIn. For Cloudflare, you will also need your Account ID.
3. **Google Sheet for Tracking**: Create a Google Sheet with the columns Keyword, Image Prompt, and Style Used to keep a log of your generated content.
4. **Configure Nodes**:
   - In all Telegram nodes, select the correct credential for each bot.
   - In the Google Gemini node, ensure your API credential is selected.
   - In the Cloudflare nodes ("Get accounts" and "Get Flux Schnell image"), select your Cloudflare credential and replace the placeholder in the URL with your Account ID.
   - In the LinkedIn node, select your credential and choose the author (your profile).
   - In the Google Sheets node, enter your Sheet ID.
5. **Activate**: Activate both Telegram Triggers in the workflow.

## Requirements
- An n8n instance.
- Credentials for Google Gemini, Cloudflare, LinkedIn, and Google Sheets.
- Two Telegram bots with their API tokens.
- A Cloudflare Account ID.
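For orientation, the "Get Flux Schnell image" call might look like the sketch below if written as an n8n Code node. The endpoint shape follows Cloudflare's public Workers AI REST API; treat the exact model ID and response field as assumptions to verify against your Cloudflare account, and ACCOUNT_ID, API_TOKEN, and imagePrompt as placeholders:

```javascript
// Hypothetical sketch of the Cloudflare Workers AI image request.
const response = await fetch(
  `https://api.cloudflare.com/client/v4/accounts/${ACCOUNT_ID}/ai/run/@cf/black-forest-labs/flux-1-schnell`,
  {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${API_TOKEN}`,
      'Content-Type': 'application/json',
    },
    // imagePrompt is the detailed prompt Gemini produced in the previous step
    body: JSON.stringify({ prompt: imagePrompt }),
  }
);

const data = await response.json();
// Workers AI typically returns the generated image as a base64 string
const base64Image = data.result.image;
return [{ json: { base64Image } }];
```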
by Rahul Joshi
## 📊 Description
Automate your content repurposing workflow by transforming long-form articles, blogs, and newsletters into short, high-signal, AI-ready social media snippets. ✍️🤖

This workflow fetches pending content from Airtable, generates 30-word snippets, data points, and quote-style insights using GPT-4o-mini, and updates the original record with all generated fields. If Facebook is selected as a target platform, the workflow automatically posts the best snippet via the Meta Graph API and logs the result. Perfect for content, marketing, and social media teams scaling daily publishing without manual rewriting. 🚀📣

## 🔁 What This Template Does
1️⃣ Fetches "pending" long-form content from Airtable. 📥
2️⃣ Processes all records in batches to avoid rate limits. 🔁
3️⃣ Sends full content + metadata to GPT-4o-mini to generate structured snippets. 🤖
4️⃣ Ensures valid JSON output via the structured parser (see the sample output below). 📐
5️⃣ Updates Airtable with:
   - 30-word snippets
   - data points
   - quote insights
   - a recommended primary snippet
   - timestamps & status
6️⃣ Checks if Facebook is selected as a posting platform. ⚙️
7️⃣ Automatically publishes the recommended snippet using the Meta Graph API. 📤
8️⃣ Updates Airtable again with post status + response. 📝
9️⃣ Sends a success notification to Slack with full details. 💬

## ⭐ Key Benefits
✅ Automates creation of platform-ready social media snippets
✅ Produces AI-friendly, high-signal content that works for LLM discovery
✅ Eliminates manual rewriting for LinkedIn, Facebook, Twitter, Instagram
✅ Automatically posts to Meta if selected, for hands-free publishing
✅ Maintains clean, structured content in Airtable for future reuse
✅ Saves time for marketing, growth, and social teams

## 🧩 Features
- Airtable integration for content fetch + update
- GPT-4o-mini AI snippet generation
- Structured JSON parser for clean, reliable AI output
- Auto-detection of selected social platforms
- Facebook Graph API publishing
- Slack notifications for success
- Scheduled automation for hands-free daily processing
- Full audit trail with timestamps

## 🔐 Requirements
- Airtable Personal Access Token
- OpenAI API key (GPT-4o-mini)
- Facebook Graph API credentials (for auto-posting)
- Slack API credentials
- n8n with LangChain nodes enabled

## 🎯 Target Audience
- Content marketing teams repurposing long-form content
- Social media managers publishing daily posts
- Growth teams optimizing content for AI search engines
- Agencies producing content at scale for multiple clients
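A sketch of the structured JSON the parser might enforce per record. The field names are illustrative assumptions derived from the Airtable fields listed above, not the template's exact schema:

```json
{
  "snippets": [
    "A 30-word hook distilled from the article, written for LinkedIn...",
    "A second platform-ready variant with a different angle..."
  ],
  "data_points": ["72% of teams still repurpose content manually"],
  "quote_insights": ["\"Distribution is the other half of writing.\""],
  "recommended_primary_snippet": "A 30-word hook distilled from the article, written for LinkedIn...",
  "status": "processed",
  "processed_at": "2024-05-01T09:00:00Z"
}
```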
by Avkash Kakdiya
## How it works
This workflow starts whenever a new domain is added to a Google Sheet. It cleans the domain, fetches traffic insights from SimilarWeb, extracts the most relevant metrics, and updates the sheet with enriched data. Optionally, it can also send this information to Airtable for further tracking or analysis.

## Step-by-step

1. **Trigger on New Domain**
   - The workflow starts when a new row is added in the Google Sheet.
   - Captures the raw URL/domain entered by the user.
2. **Clean Domain URL**
   - Strips unnecessary parts like http://, https://, www., and trailing slashes.
   - Stores a clean domain format (e.g., example.com) along with the row number (see the cleaning sketch below).
3. **Fetch Website Analysis**
   - Uses the SimilarWeb API to pull traffic and engagement insights for the domain.
   - Data includes global rank, country rank, category rank, total visits, bounce rate, and more.
4. **Extract Key Metrics**
   - Processes raw SimilarWeb data into a simplified structure. Extracted insights include:
     - Ranks: Global, Country, and Category.
     - Traffic Overview: Total Visits, Bounce Rate, Pages per Visit, Avg Visit Duration.
     - Top Traffic Sources: Direct, Search, Social.
     - Top Countries (Top 3): with traffic share percentages.
     - Device Split: Mobile vs Desktop.
5. **Update Google Sheet**
   - Writes the cleaned and enriched domain data back into the same (or another) Google Sheet.
   - Ensures each row is updated with the new traffic insights.
6. **Export to Airtable (Optional)**
   - Creates a new record in Airtable with the enriched traffic metrics.
   - Useful if you want to manage or visualize company/domain data outside of Google Sheets.

## Why use this?
- Automatically enriches domain lists with live traffic data from SimilarWeb.
- Cleans messy URLs into a standard format.
- Saves hours of manual research on company traffic insights.
- Provides structured, comparable metrics for better decision-making.
- Flexible: update sheets, export to Airtable, or both.
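A minimal sketch of the "Clean Domain URL" step as an n8n Code node. It assumes the raw value arrives in a column named url and that sheet data starts on row 2; both are assumptions about how your sheet is laid out:

```javascript
// n8n Code node, "Run Once for All Items" mode
return $input.all().map((item, index) => {
  const raw = String(item.json.url ?? '');
  const domain = raw
    .trim()
    .replace(/^https?:\/\//i, '') // drop the protocol
    .replace(/^www\./i, '')       // drop the www. prefix
    .replace(/\/.*$/, '')         // drop any path and trailing slash
    .toLowerCase();
  // rowNumber assumes row 1 holds headers, so data begins at row 2
  return { json: { domain, rowNumber: index + 2 } };
});
```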
by Takuya Ojima
## Who's it for
Teams that monitor multiple news sources and want an automated, tagged, and prioritized briefing: PMM, PR/Comms, Sales/CS, founders, and research ops.

## What it does / How it works
Each morning the workflow reads your RSS feeds, summarizes articles with an LLM, assigns tags from a maintained dictionary, saves structured records to Notion, and posts a concise Slack digest of top items.

Core steps: Daily Morning Trigger → Workflow Configuration (Set) → Read RSS Feeds → Get Tag Dictionary → AI Summarizer and Tagger → Parse AI Output → Write to Notion Database → Sort by Priority → Top 3 Headlines → Format Slack Message → Post to Slack.

## How to set up
1. Open Workflow Configuration (Set) and edit: rssFeeds (array of URLs), notionDatabaseId, slackChannel (see the configuration sketch below).
2. Connect your own credentials in n8n for Notion, Slack, Google Sheets (if used for the tag dictionary), and your LLM provider.
3. Adjust the trigger time in Daily Morning Trigger (e.g., weekdays at 09:00).

## Requirements
- n8n (Cloud or self-hosted)
- Slack app with the chat:write scope for the target channel
- Notion database with properties: summary (rich_text), tags (multi_select), priority (number), url (url), publishedDate (date)
- Optional Google Sheet for the tag dictionary (or replace with another source)

## How to customize the workflow
- **Scoring & selection**: Change priority rules, increase the "Top N" items, or sort by recency.
- **Taxonomy**: Extend the tag dictionary; refine the AI prompt for stricter tagging.
- **Outputs**: Post per-tag Slack threads, send DMs, or create Notion relations to initiatives.
- **Sources**: Add more feeds or mix in APIs/newsletters.
- **Security**: Do not hardcode API keys in HTTP nodes; keep credentials in n8n.
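A sketch of the Workflow Configuration (Set) values. The three keys come from the setup step above; the feed URLs, database ID, and channel are placeholders to replace with your own:

```json
{
  "rssFeeds": [
    "https://techcrunch.com/feed/",
    "https://news.ycombinator.com/rss"
  ],
  "notionDatabaseId": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "slackChannel": "#news-digest"
}
```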
by Julian Kaiser
# Automatically Classify Support Tickets in Zoho Desk with AI (Gemini)

Transform your customer support workflow with intelligent ticket classification. This automation leverages AI to automatically categorize incoming support tickets in Zoho Desk, reducing manual work and ensuring faster ticket routing to the right teams.

## How It Works
1. Fetches all tickets from Zoho Desk with pagination support (see the pagination sketch below)
2. Filters unclassified tickets (where the classification field is null)
3. Retrieves complete ticket threads for full conversation context
4. Uses OpenRouter AI (GPT-4, Claude, or other models) to classify tickets into predefined categories
5. Updates tickets in Zoho Desk with accurate classifications automatically

## Use Cases
- **Customer Support Teams**: Automatically route tickets to specialized departments (billing, technical, sales)
- **Help Desks**: Prioritize urgent issues and categorize feature requests

## Prerequisites
- Active Zoho Desk account with API access
- OpenRouter API account (supports multiple AI models)
- Basic understanding of OAuth2 authentication
- Predefined ticket categories in your Zoho Desk setup

## Setup Steps
Time: ~15 minutes
1. **Configure Zoho Desk OAuth2** - follow our step-by-step GitHub guide for OAuth2 credential setup
2. **Set up the OpenRouter API** - create an account and generate API keys at openrouter.ai
3. **Customize classifications** - define your ticket categories (e.g., Technical, Billing, Feature Request, Bug Report)
4. **Adapt the workflow** - modify it for any field: status, priority, tags, assignment, or custom fields
5. **Review the API documentation** - check the Zoho Desk Search API docs for advanced filtering options
6. **Test thoroughly** - run manual triggers before automating

Note: This workflow demonstrates proper Zoho Desk API integration, including OAuth2 authentication and pagination handling, two common integration challenges.
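A hypothetical sketch of the pagination loop as an n8n Code node. It assumes Zoho Desk's list endpoint pages with `from`/`limit` query parameters and returns tickets under a `data` key; confirm both against the current API docs, and treat `accessToken` as a placeholder supplied by your OAuth2 credential:

```javascript
// Fetch every ticket page by page before filtering unclassified ones.
const tickets = [];
let from = 0;
const limit = 100;

while (true) {
  const res = await this.helpers.httpRequest({
    url: `https://desk.zoho.com/api/v1/tickets?from=${from}&limit=${limit}`,
    headers: { Authorization: `Zoho-oauthtoken ${accessToken}` },
    json: true,
  });
  const batch = res.data ?? [];
  tickets.push(...batch);
  if (batch.length < limit) break; // short page means we reached the end
  from += limit;
}

return tickets.map((t) => ({ json: t }));
```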
by JKingma
# 🛍️ Automated Product Description Generation for Adobe Commerce (Magento 2)

## Description
This n8n template demonstrates how to automatically generate product descriptions for items in Adobe Commerce (Magento 2) that are missing one. The workflow retrieves product data, converts raw attribute values (like numeric IDs) into human-readable labels, and passes the enriched product data to an LLM (Azure OpenAI by default). The LLM generates a compelling description, which is then saved back to Magento using the API. This ensures all products have professional descriptions without manual writing effort.

Use cases include:
- Auto-generating missing descriptions for catalog completeness.
- Creating consistent descriptions across large product datasets.
- Reducing manual workload for content managers.
- Tailoring descriptions for SEO and customer readability.

## Good to know
- All attribute options are resolved to human-readable labels before being sent to the LLM (see the resolution sketch below).
- The flow uses Azure OpenAI, but you can replace it with OpenAI, Anthropic, Gemini, or other LLM providers.
- The LLM prompt can be customised to adjust tone, length, SEO focus, or specific brand style.
- Works out-of-the-box with Adobe Commerce (Magento 2) APIs, but can be adapted for other ecommerce systems.

## How it works
1. **Get Product from Magento**
   - Retrieves a product that has no description.
   - Collects all product attributes.
2. **Generate Description with LLM**
   - Resolves attribute option IDs into human-readable values (e.g., color_id = 23 → "Red").
   - Passes the readable product attributes to an Azure OpenAI model.
   - The LLM creates a clear, engaging product description.
   - The prompt can be customised (e.g., SEO-optimized, short catalog text, or marketing style).
3. **Save Description in Magento**
   - Updates the product via the Magento API with the generated description.
   - Ensures product data is enriched and visible in the webshop immediately.

## How to use
1. Configure your Magento 2 API credentials in n8n.
2. Replace the Azure OpenAI node with another provider if needed.
3. Adjust the prompt to match your brand's tone of voice.
4. Run the workflow to automatically process products missing descriptions.

## Requirements
- ✅ n8n instance (self-hosted or cloud)
- ✅ Adobe Commerce (Magento 2) instance with API access
- ✅ Azure OpenAI (or other LLM provider) credentials
- (Optional) Prompt customisations for SEO or brand voice

## Customising this workflow
This workflow can be adapted for:
- **Other attributes**: Include or exclude attributes (e.g., only color & size for apparel).
- **Different LLMs**: Swap Azure OpenAI for OpenAI, Anthropic, Gemini, or any supported n8n AI node.
- **Prompt tuning**: Adjust instructions to generate shorter, longer, or SEO-rich descriptions.
- **Selective updates**: Target only specific categories (e.g., electronics, fashion).
- **Multi-language support**: Generate product descriptions in multiple languages for international shops.
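A minimal sketch of the attribute-resolution step, using Magento 2's standard REST endpoint for attribute options. The template's own code node may differ; `baseUrl`, `token`, and the helper name are placeholders:

```javascript
// Hypothetical sketch: turn a stored option ID into its human-readable label.
async function resolveAttribute(attributeCode, optionId) {
  const options = await fetch(
    `${baseUrl}/rest/V1/products/attributes/${attributeCode}/options`,
    { headers: { Authorization: `Bearer ${token}` } }
  ).then((r) => r.json());

  // Magento returns options as { label: "Red", value: "23" }
  const match = options.find((o) => String(o.value) === String(optionId));
  return match ? match.label : optionId; // fall back to the raw ID
}

// e.g. await resolveAttribute('color', 23) → "Red"
```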
by Jitesh Dugar
## What This Does
Automatically finds relevant Reddit posts where your brand can add value, generates helpful AI comments, and sends the best opportunities to your Slack channel for review.

## Setup Requirements
- Reddit API credentials
- OpenAI API key
- Slack webhook URL

## Quick Setup
1. **Reddit API**
   - Create an app at reddit.com/prefs/apps (select "script" type)
   - Add the client ID and secret to your n8n credentials
2. **Configure Subreddits**
   - Edit the workflow to monitor subreddits relevant to your business: entrepreneur, startups, smallbusiness, [your_niche]
3. **AI Prompt Setup**
   - Customize the OpenAI node with your brand context:
     "You're helping in [subreddit] discussions. When relevant, mention how [your_product] solves similar problems. Be helpful first, promotional second."
4. **Slack Integration**
   - Add your webhook URL to get notifications with:
     - Post title and link
     - AI-generated comment
     - Engagement score (1-10)
   - A sample notification payload is sketched below.

## Key Features
- **Smart Filtering**: AI evaluates whether a post is worth engaging with
- **Brand-Aware Comments**: Generated responses stay on-brand and helpful
- **Team Review**: All opportunities go to Slack before posting
- **Multiple Subreddits**: Monitor several communities simultaneously

## Customization Tips
**Adjust AI Scoring** by modifying what makes a "good" opportunity:
- Post engagement level
- Relevance to your product
- Tone of the discussion

**Comment Templates** can set different styles for different subreddits:
- Technical advice for developer communities
- Business insights for entrepreneur groups
- User experience for product discussions

## Best Practices
- Start with 2-3 subreddits to test effectiveness
- Review and approve comments in Slack before posting
- Follow Reddit's 90/10 rule (90% helpful content, 10% self-promotion)
- Adjust the AI prompt based on what works in your communities

## Why Use This
- Saves hours of manual Reddit browsing
- Maintains consistent brand voice
- Never miss relevant conversations
- Your team can review before engaging publicly
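A minimal sketch of what the Slack incoming-webhook payload could look like. Only the three fields (post title/link, AI comment, engagement score) come from the description above; the message layout, URL, and comment text are illustrative:

```json
{
  "text": "*New Reddit opportunity* (score: 8/10)\n<https://reddit.com/r/startups/comments/abc123|How do you automate lead gen?>\n\nSuggested comment:\n> We ran into the same problem and ended up scripting it. Happy to share what worked."
}
```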
by Pavlo Hurhu
This n8n workflow promotes your brand/company/platform by mentioning it in Twitter comments. The responses look human-like, and the workflow is robust and designed to avoid bans.

## Good to know
- The workflow is configured to maximize efficiency while minimizing costs and ensuring your Twitter account won't get banned or shadow-banned.
- Generating more than 17 comments per day would require a paid Twitter subscription plan.

## How it works
1. The user sets a keyword that is used to find relevant posts.
2. An AI agent analyzes each post and writes a response promoting the user's brand.
3. After each response is submitted, the result is logged in a report table for tracking and convenience.

## Set up steps
Set your target keyword and start the workflow. Detailed instructions and tutorials can be found in the workflow's sticky notes.

## Requirements
- Twitter and Google accounts.
- A twitterapi.io subscription (used to overcome official Twitter API limitations).
- An Anthropic subscription (GPT models are also supported, but I personally recommend Anthropic Claude Sonnet 4 for text generation).
by Fei Wu
# Reddit Post Saver & Summarizer with AI-Powered Filtering

## Who This Is For
Perfect for content curators, researchers, developers, and community managers who want to build a structured database of valuable Reddit content without manual data entry. If you're tracking industry trends, gathering user feedback, or building a knowledge base from Reddit discussions, this workflow automates the entire process.

## The Problem It Solves
Reddit has incredible discussions, but manually copying posts, extracting insights, and organizing them into a database is time-consuming. This workflow automatically transforms your saved Reddit posts into structured, searchable data, complete with AI-generated summaries of both the post and its comment section.

## How It Works

### 1. Save Posts Manually
Simply use Reddit's built-in save feature on any post you find valuable.

### 2. Automated Daily Processing
The workflow triggers once per day and:
- Fetches all your saved Reddit posts via the Reddit API
- Filters posts by subreddit and custom conditions (e.g., "only posts about JavaScript frameworks" or "posts with more than 100 upvotes")
- Uses an LLM (Google Gemini) to verify posts match your natural language criteria
- Generates comprehensive summaries of both the original post and top comments

### 3. Structured Database Storage
Filtered and summarized posts are automatically saved to your Supabase database with this structure:

```json
{
  "reddit_id": "unique post identifier",
  "title": "post title",
  "url": "direct link to Reddit post",
  "summary": "AI-generated summary of post and comments",
  "tags": ["array", "of", "relevant", "tags"],
  "post_date": "original post creation date",
  "upvotes": "number of upvotes",
  "num_comments": "total comment count"
}
```

## Setup Requirements
- **Reddit API credentials** (client ID and secret)
- **Supabase account** with a database table
- **Google Gemini API key** (or alternative LLM provider)
- Basic configuration of filter conditions (subreddit names and natural language criteria)

## Use Cases
- **Product Research**: Track competitor mentions and feature requests
- **Content Creation**: Build a library of trending topics in your niche
- **Community Management**: Monitor feedback across multiple subreddits
- **Academic Research**: Collect and analyze discussions on specific topics
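For reference, writing one processed post with the official supabase-js client might look like this. The table name `saved_posts` is an assumption, the column names mirror the structure above, and all values plus SUPABASE_URL/SUPABASE_KEY are placeholders:

```javascript
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(SUPABASE_URL, SUPABASE_KEY);

const { error } = await supabase.from('saved_posts').upsert(
  {
    reddit_id: 't3_1abcde',
    title: 'What JS framework would you pick in 2024?',
    url: 'https://www.reddit.com/r/javascript/comments/1abcde/',
    summary: 'OP asks for framework picks; top comments compare options...',
    tags: ['javascript', 'frameworks'],
    post_date: '2024-05-01T12:00:00Z',
    upvotes: 143,
    num_comments: 87,
  },
  { onConflict: 'reddit_id' } // keeps daily reruns idempotent on the Reddit ID
);
if (error) throw error;
```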