by Matthew
AI-Powered Viral Video Factory 🚀

This workflow automates the entire process of creating short, cinematic, fact-based videos ready for social media. It takes a single concept, generates a script and visuals, creates video clips, adds a voiceover, and assembles a final video, which is then uploaded directly to your Google Drive. It's perfect for content creators and marketing agencies looking to scale video production with minimal manual effort.

How It Works 🎬

1. Generate a Viral Idea 💡: The workflow begins with the Create New Idea1 (OpenAI) node, which generates a viral-ready video concept, including a punchy title, hashtags, and a brief description based on a core theme (e.g., space, black holes). This idea is then logged in a Google Sheet.
2. Create a Cinematic Script & Voiceover 📜: An OpenAI node (Generating scenes1) creates a detailed 12-scene script outlining the visuals for a 60-second video. The script text for all scenes is combined and prepared for voiceover generation by another OpenAI node (Generate Voiceover).
3. Generate Scene-by-Scene Visuals ✨: The workflow loops through each of the 12 scenes to create an animated clip:
   - Image Generation: An HTTP Request node sends the scene's prompt to the fal-ai/flux model to create a photorealistic still image.
   - Animation Prompting: The Video Prompts1 (OpenAI Vision) node analyzes the generated image and creates a new, specific prompt to animate it cinematically.
   - Image-to-Video: Another HTTP Request node uses the fal-ai/kling-video model to turn the still image into a 5-second animated video clip based on the new animation prompt.
4. Assemble the Final Video 🎞️:
   - Stitch Clips: Once all 12 clips are generated, the Merge Clips node uses the fal-ai/ffmpeg-api to concatenate them into a single, seamless 60-second video.
   - Add Audio: The Combine Voice and Video node then layers the AI-generated voiceover onto the stitched video.
5. Deliver to Google Drive 📂: Finally, the completed video is converted from a URL to a file and automatically uploaded to your specified Google Drive folder for easy access and publishing.

Key Technologies Used

- **n8n**: Orchestrates the entire automated workflow.
- **OpenAI (GPT-4.1 & GPT-4o)**: Idea generation, scriptwriting, voiceover, and vision analysis.
- **Fal.ai**: High-performance, API-based image generation (Flux), video animation (Kling), and video processing (FFMPEG API).
- **Google Drive & Sheets**: Logging ideas and storing the final video output.

Setup Instructions

1. Add Credentials: In n8n, add your OpenAI API key and connect your Google account for Google Sheets and Google Drive access. You will also need a Fal.ai API key.
2. Configure the Fal.ai API Key: Crucially, you must replace the placeholder API key in all HTTP Request nodes that call the fal.run URL. Find the Authorization header in each of these nodes and replace the existing key with your own, in the form `Key YOUR_FAL_AI_KEY_HERE` (a minimal request sketch follows this section). Nodes to update: Create Images1, Get Images1, Create Video1, Get Video1, Merge Clips, Get Final video, Combine Voice and Video.
3. Configure OpenAI Nodes: Select each OpenAI node (e.g., Create New Idea1, Generating scenes1) and choose your OpenAI credential. You can customize the main prompt in the Create New Idea1 node to change the theme of the videos you want to generate.
4. Configure Google Sheets & Drive: In the Organise idea, caption etc1 node, select your Google Sheets credential and specify the Spreadsheet and Sheet ID you want to use for logging ideas. In the Upload file to drive node, select your Google Drive credential and choose the destination folder for your final videos.
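For orientation, here is a minimal sketch of what one of those fal.run calls looks like, written as plain JavaScript rather than the node's exact configuration. The endpoint path and body field are assumptions inferred from the node names above (fal-ai/flux); the part the setup step actually refers to is the `Authorization: Key ...` header.

```javascript
// Minimal sketch (not the workflow's exact node config): a direct call to a
// fal.run endpoint, showing where the "Key ..." Authorization header goes.
// The endpoint path and body fields are assumptions based on the node names above.
const FAL_API_KEY = process.env.FAL_API_KEY; // your own key, never hard-coded

async function generateSceneImage(scenePrompt) {
  const response = await fetch("https://fal.run/fal-ai/flux", {
    method: "POST",
    headers: {
      // This is the header the setup step asks you to update in each node:
      Authorization: `Key ${FAL_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ prompt: scenePrompt }),
  });
  if (!response.ok) throw new Error(`fal.run request failed: ${response.status}`);
  return response.json(); // typically contains a URL to the generated asset
}

generateSceneImage("A photorealistic shot of a black hole bending starlight")
  .then((result) => console.log(result))
  .catch(console.error);
```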
by NODA shuichi
Description: Don't just get a recipe. Get a Strategy. (Speed / Healthy / Creative) 🍳🤖

This workflow solves the "What should I eat?" problem by using Google Gemini to generate 3 distinct recipe variations simultaneously based on your fridge leftovers. It demonstrates advanced n8n concepts like Array Processing and Data Aggregation.

Key Features:

- Array Processing: Demonstrates how to handle JSON lists (Gemini outputs an array -> n8n splits it -> API calls run for each item); see the sketch after this section.
- Aggregation: Shows how to combine processed items back into a single summary email.
- Visual Enrichment: Automatically searches for recipe images using Google Custom Search.

How it works:

1. Input: Enter ingredients via the Form Trigger.
2. Generate: Gemini creates 3 JSON objects: "Speed (5min)", "Healthy", and "Creative".
3. Process: The workflow iterates through the 3 recipes, searching for images and logging data to Google Sheets.
4. Aggregate: The results are combined into one HTML comparison table.
5. Deliver: You receive an email with 3 options to choose from.

Setup Requirements:

- Google Sheets: Create a sheet named Recipes with headers: date, ingredients, style, recipe_name, recipe_text, image_url.
- Credentials: Google Gemini API, Google Custom Search (API Key & Engine ID), Gmail, Google Sheets.
- Configuration: Enter your IDs in the "1. Configuration" node.
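To illustrate the Array Processing step, here is a minimal Code-node sketch, assuming Gemini returns its three recipes as a JSON array in a `content` field. The field and key names are placeholders to adapt to your actual Gemini node output.

```javascript
// Hypothetical n8n Code node sketch: turn a Gemini response containing a JSON
// array of 3 recipes into one n8n item per recipe, so downstream nodes
// (image search, Sheets logging) run once per recipe.
// The field name `content` and the recipe keys are assumptions, not the
// template's exact schema; adjust them to match your Gemini node output.
const raw = $input.first().json.content; // e.g. '[{"style":"Speed (5min)",...}, ...]'
const recipes = typeof raw === "string" ? JSON.parse(raw) : raw;

return recipes.map((recipe) => ({
  json: {
    style: recipe.style,            // "Speed (5min)" | "Healthy" | "Creative"
    recipe_name: recipe.recipe_name,
    recipe_text: recipe.recipe_text,
  },
}));
```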
by Cheng Siong Chin
How It Works

The webhook receives incoming profiles and extracts relevant demographic, financial, and credential data. The workflow then queries the programs database to identify suitable options, while the AI generates personalized recommendations based on eligibility and preferences. A formal recommendation letter is created, followed by a drafted outreach email tailored to coordinators. Parsers extract structured data from the letters and emails, a Slack summary is prepared for internal visibility, and the final response is sent to the appropriate recipients.

Setup Steps

1. Configure AI agents by adding OpenAI credentials and setting prompts for the Program Matcher, Letter Writer, and Email Drafter.
2. Connect the programs database (Airtable or PostgreSQL) and configure queries to retrieve matching program data.
3. Set up the webhook by defining the trigger endpoint and payload structure for incoming profiles (see the payload sketch after this section).
4. Configure JSON parsers to extract relevant information from profiles, letters, and emails.
5. Add the Slack webhook URL and define the summary format for generated communications.

Prerequisites

- OpenAI API key
- Financial programs database
- Slack workspace with webhook
- User profile structure (income, GPA, demographics)

Use Cases

- Universities automating 500+ annual applicant communications
- Scholarship foundations personalizing outreach at scale

Customization

- Add multilingual support for international applicants
- Include PDF letter generation with signatures

Benefits

Reduces communication time from 30 minutes to 2 minutes per applicant and ensures consistent professional quality.
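As a reference for the webhook setup, here is a hypothetical sketch of a Code node that pulls the profile fields out of the incoming payload. The payload shape is an assumption based on the profile attributes mentioned above (income, GPA, demographics), not the template's fixed schema.

```javascript
// Hypothetical Code node sketch: extract the demographic, financial, and
// credential fields from an incoming webhook profile. The payload shape shown
// here is an assumption based on the fields this template mentions
// (income, GPA, demographics); align it with your real webhook body.
const profile = $input.first().json.body ?? $input.first().json;

return [{
  json: {
    name: profile.name,
    income: Number(profile.income),        // financial data
    gpa: Number(profile.gpa),              // credential data
    demographics: profile.demographics,    // e.g. { country, age, firstGen }
  },
}];
```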
by Janak Patel
Who’s it for

This template is ideal for YouTube video creators who spend a lot of time manually generating SEO assets like descriptions, tags, titles, keywords, and thumbnails. If you're looking to automate your YouTube SEO workflow, this is the perfect solution for you.

How it works / What it does

1. Connect a Google Sheet to n8n and pull in the Hindi script (or any language).
2. Use OpenAI to generate SEO content: video description, tags, keywords, titles, thumbnail titles, etc.
3. Use the generated description as input to create a thumbnail image using an image generation API.
4. Store all outputs in the same Google Sheet in separate columns (see the sketch after this section).
5. Optionally, use tools like VidIQ or TubeBuddy to test the SEO strength of generated titles, tags, and keywords.

💡 Note: This example uses Runway’s image generation API, but you can plug in any other image-generation service of your choice.

Requirements

- A Google Sheet with clearly named columns
- Hindi, English, or other language scripts in the sheet
- OpenAI API key
- Runway API key (or any other image generation API)

How to set up

- You can set up this workflow in about 15 minutes by following the pre-defined steps.
- Replace the manual Google Sheet trigger with a scheduled trigger for daily or timed automation.
- You may also swap Google Sheets with any database or data source of your choice.
- No Google Sheets API required. Requires minimal JavaScript or Python knowledge for advanced customizations.
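As a rough illustration of step 4, this hypothetical Code-node sketch maps a JSON response from OpenAI onto separate sheet columns. The column names and the assumption that the model replies with JSON are illustrative only; match them to your actual sheet headers and prompt.

```javascript
// Hypothetical Code node sketch: parse the OpenAI response into the separate
// Google Sheet columns this template describes. The column names and the
// assumption that the model returns JSON are illustrative; match them to
// your actual sheet headers and prompt.
const seo = JSON.parse($input.first().json.message.content);

return [{
  json: {
    description: seo.description,
    tags: (seo.tags ?? []).join(", "),
    keywords: (seo.keywords ?? []).join(", "),
    title: seo.title,
    thumbnail_title: seo.thumbnail_title,
  },
}];
```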
by rana tamure
This n8n workflow automates the creation of high-quality, SEO-optimized blog posts using AI. It pulls keyword data from Google Sheets, conducts research via Perplexity AI, generates structured content (title, introduction, key takeaways, body, conclusion, and FAQs) with OpenAI and Anthropic models, assembles the post, performs final edits, converts it to HTML, and publishes directly to WordPress. Ideal for content marketers, bloggers, or agencies looking to scale content production while maintaining relevance and engagement.

Key Features

- Keyword-Driven Generation: Fetches primary keywords, search intent, and related terms from a Google Sheets spreadsheet to inform content strategy.
- AI Research & Structuring: Uses Perplexity for in-depth topic research and OpenAI/Anthropic for semantic analysis, outlines, and full content drafting.
- Modular Content Creation: Generates sections like introductions, key takeaways, outlines, body, conclusions, and FAQs with tailored prompts for tone, style, and SEO.
- Assembly & Editing: Combines sections into a cohesive Markdown post, adds internal/external links, and applies final refinements for readability and flow (see the sketch after this section).
- Publishing Automation: Converts Markdown to styled HTML and posts drafts to WordPress.
- Customization Points: Easily adjust AI prompts, research depth, or output formats via Code and Set nodes.

Requirements

- Credentials: OpenAI API (for GPT models), Perplexity API (for research), Google Sheets OAuth2 (for keyword input), WordPress API (for publishing).
- Setup: Configure your Google Sheets with columns like "keyword", "search intent", "related keyword", etc. Ensure the sheet is shared with your Google account.
- Dependencies: No additional packages needed; relies on n8n's built-in nodes for AI, HTTP, and data processing.

How It Works

1. Trigger & Input: Start manually or on a schedule; pulls keyword data from Google Sheets.
2. Research Phase: Uses Perplexity to gather topic insights and citations from reputable sources.
3. Content Generation: AI nodes create the title, structure, intro, takeaways, outline, body, conclusion, and FAQs based on research and SEO guidelines.
4. Assembly & Refinement: Merges sections, embeds links, edits for polish, and converts to HTML.
5. Output: Publishes as a WordPress draft or outputs the final HTML for manual use.

Benefits

- Time Savings: Automates 80-90% of content creation, reducing manual writing from hours to minutes.
- SEO Optimization: Incorporates primary/related keywords naturally, aligns with search intent, and includes semantic structures for better rankings.
- Scalability: Process multiple keywords in batches; perfect for content calendars or high-volume blogging.
- Quality Assurance: Built-in editing ensures engaging, error-free content with real-world examples and data-backed insights.
- Versatility: Adaptable for any niche (e.g., marketing, tech, finance) by tweaking prompts or sheets.

Potential Customizations

- Add more AI models (e.g., via custom nodes) for varied tones.
- Integrate image generation or social sharing for full content pipelines.
- Filter sheets for specific topics or add notifications on completion.
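To make the assembly step concrete, here is a minimal Code-node sketch that joins the generated sections into one Markdown post before HTML conversion. The section field names are assumptions, not the workflow's actual node outputs.

```javascript
// Hypothetical Code node sketch: assemble the individually generated sections
// into one Markdown post before HTML conversion. The field names (intro,
// takeaways, body, ...) are assumptions; rename them to match the Set/AI
// nodes feeding this step.
const s = $input.first().json;

const markdown = [
  `# ${s.title}`,
  s.intro,
  "## Key Takeaways",
  (s.takeaways ?? []).map((t) => `- ${t}`).join("\n"),
  s.body,
  "## Conclusion",
  s.conclusion,
  "## FAQs",
  s.faqs,
].join("\n\n");

return [{ json: { markdown } }];
```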
by gotoHuman
Auto-detect news from n8n and turn it into a human-approved LinkedIn post. gotoHuman is used to keep a human in the loop: there you can manually edit the AI draft of the post or request to regenerate it.

How it works

- The workflow is triggered each day to fetch the latest version of https://blog.n8n.io.
- It then fetches each article, checks if it was published in the last 24 hours (see the sketch after this section), and uses an LLM to summarize it.
- An LLM then drafts a related LinkedIn post, which is sent to gotoHuman for approval.
- In gotoHuman, the reviewer can manually edit it or ask to regenerate it, with the option to even edit the prompt (retries loop back to the AI Draft LinkedIn Post node).
- Approved posts are automatically published to LinkedIn.

How to set up

- Most importantly, install the gotoHuman node before importing this template! (Just add the node to a blank canvas before importing.)
- Set up your credentials for gotoHuman, OpenAI, and LinkedIn.
- In gotoHuman, select and create the pre-built review template "Blog scraper agent" or import the ID: sMxevC9tSAgdfWsr6XIW
- Select this template in the gotoHuman node.

Requirements

You need accounts for:
- gotoHuman (human supervision)
- OpenAI (summary, draft)
- LinkedIn

How to customize

- Change the blog URL to monitor and adapt the workflow to its HTML structure.
- Provide the AI Draft LinkedIn Post node with examples of previous posts so it picks up your writing style (consider adding gotoHuman's dataset of approved examples).
- Use the workflow to target other publications, like your newsletter, blog, or other socials.
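A minimal sketch of the 24-hour check, assuming each article item carries a `pubDate` field; adjust the field name to whatever your extraction step produces.

```javascript
// Hypothetical Code node sketch: keep only articles published in the last
// 24 hours. The `pubDate` field name is an assumption; use whatever date
// field your article-extraction step produces.
const cutoff = Date.now() - 24 * 60 * 60 * 1000;

return $input.all().filter((item) => {
  const published = new Date(item.json.pubDate).getTime();
  return !Number.isNaN(published) && published >= cutoff;
});
```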
by Sk developer
📊 Automated Website Traffic Tracker with Google Sheets Logging

Track website traffic and backlinks effortlessly using the Website Traffic Checker - Ahref API. This n8n workflow automates data retrieval and logging into Google Sheets, making it perfect for SEO professionals and digital marketers.

🧩 What This Workflow Does (Summary)

- Accepts a domain via a simple web form.
- Sends the domain to the Website Traffic Checker - Ahref API.
- If successful: extracts backlink and traffic data and appends the results to two separate Google Sheets.
- If failed: sends an email alert with the domain and status code.

🔧 Node-by-Node Explanation

| Node | Purpose |
| --- | --- |
| 🟢 Form Trigger | Starts the workflow when a domain is submitted via form. |
| 🟩 Set Domain Value | Stores the submitted domain into a variable. |
| 🌐 HTTP Request | Calls the Website Traffic Checker - Ahref API. |
| ✅ IF Node | Checks if the API responded with statusCode = 200. |
| ❌ Email Node (Fail) | Sends an alert email if the API call fails. |
| 📦 Code (Backlink Info) | Extracts backlink data from the API response. |
| 📄 Google Sheet: Backlink Info | Appends backlink data to a sheet. |
| 📦 Code (Traffic Info) | Extracts traffic data from the API response. |
| 📄 Google Sheet: Traffic Data | Appends traffic metrics to another sheet. |

A sketch of the backlink-extraction Code node follows this section.

📁 Google Sheet Columns

Backlink Info Sheet

| Column | Description |
| --- | --- |
| website | Domain submitted |
| ascore | Authority score |
| referring domain | Number of referring domains |
| total backlinks | Total backlinks |

Traffic Data Sheet

| Column | Description |
| --- | --- |
| accuracy | Accuracy level of the traffic data |
| bounce_rate | Bounce rate percentage |
| desktop_share | Percentage of traffic from desktop devices |
| direct | Direct traffic sources |
| display_ad | Display ad traffic sources |
| display_date | Date when traffic data was captured |
| mail | Traffic from email campaigns |
| mobile_share | Percentage of traffic from mobile devices |
| pages_per_visit | Average number of pages per visit |
| paid | Paid traffic sources |
| prev_bounce_rate | Bounce rate in the previous period |
| prev_direct | Previous period's direct traffic |
| prev_display_ad | Previous period's display ad traffic |
| prev_mail | Previous period's email traffic |
| prev_pages_per_visit | Previous period's pages per visit |
| prev_referral | Previous period's referral traffic |
| prev_search_organic | Previous organic search traffic |
| prev_search_paid | Previous paid search traffic |
| prev_social_organic | Previous organic social traffic |
| prev_social_paid | Previous paid social traffic |
| prev_time_on_site | Previous time spent on site |
| prev_users | Number of users in the previous period |
| prev_visits | Visits in the previous period |
| rank | Global rank of the website |
| referral | Referral traffic |
| search | Total search traffic |
| search_organic | Organic search traffic |
| search_paid | Paid search traffic |
| social | Total social traffic |
| social_organic | Organic social traffic |
| social_paid | Paid social traffic |
| target | Targeted country or demographic |
| time_on_site | Average time spent on site |
| unknown_channel | Traffic from unknown sources |
| users | Number of unique users |
| visits | Total number of visits |

🔐 How to Configure

🔑 Get API Key

1. Go to Website Traffic Checker - Ahref API on RapidAPI.
2. Sign in or create a free RapidAPI account.
3. Subscribe to the API plan.
4. Copy your x-rapidapi-key from the Endpoints tab.

📝 Add Key in n8n

Go to your HTTP Request node. Under Headers, set:
- x-rapidapi-host = website-traffic-checker-ahref.p.rapidapi.com
- x-rapidapi-key = your API key

📄 How to Set Up Google Sheets in n8n

1. Connect a Google account via Google Sheets credentials in n8n.
2. Use the full Google Sheet URL in the documentId field.
3. Set the correct Sheet name or GID (e.g., "Traffic Data").
4. Use Auto Map or Custom Map to define columns.

> Make sure your Google Sheet has edit access and headers already created.

🧠 Use Case & Benefits

👤 Ideal For:
- SEO analysts
- Digital marketers
- Agencies managing multiple clients
- Web analytics consultants

✅ Benefits:
- Fully automated data collection.
- No manual copy-paste from tools.
- Real-time insights delivered to Google Sheets.
- Easy monitoring of backlinks and traffic trends.
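For reference, a hypothetical version of the backlink-extraction Code node. The response paths and the `domain` field name are assumptions; inspect the real API response in n8n and adjust them.

```javascript
// Hypothetical sketch of the "Code (Backlink Info)" step: pull the backlink
// fields out of the API response and shape them to match the Backlink Info
// Sheet columns. The response paths (data.ascore, ...) are assumptions;
// inspect the actual API response in n8n and adjust the paths.
const response = $input.first().json;
const domain = $('Set Domain Value').first().json.domain; // assumed field name

return [{
  json: {
    website: domain,
    ascore: response.data?.ascore,
    "referring domain": response.data?.referring_domains,
    "total backlinks": response.data?.backlinks_total,
  },
}];
```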
by Shun Nakayama
Turn your favorite podcast episodes into engaging social media content automatically. This workflow fetches new episodes from an RSS feed, transcribes the audio using OpenAI Whisper, generates a concise summary using GPT-4o, and drafts a tweet. It then sends the draft to Slack for your review before posting it to X (Twitter).

Who is this for

Content creators, social media managers, and podcast enthusiasts who want to share insights without manually listening to and typing out every episode.

Key Features

- **Large File Support:** Includes custom logic to download audio in chunks, ensuring stability even with long episodes (preventing timeouts); a sketch of this idea appears after this section.
- **Human-in-the-Loop:** Nothing gets posted without your approval. You can review the AI-generated draft in Slack before it goes live.
- **High-Quality AI:** Uses OpenAI's Whisper for accurate transcription and GPT-4o for intelligent summarization.

How it works

1. Monitor: Checks the podcast RSS feed daily for new episodes.
2. Process: Downloads the audio (handling large files via chunking) and transcribes it.
3. Draft: AI summarizes the transcript into bullet points and formats it for X (Twitter).
4. Approve: Sends the draft to a Slack channel.
5. Publish: Once approved by you, it posts the tweet to your X account.

Requirements

- OpenAI API Key
- Slack Account & App (Bot Token)
- X (Twitter) Developer Account (OAuth2)

Setup instructions

1. RSS Feed: The template defaults to "TED Talks Daily" for demonstration. Open the [Step 1] RSS node and replace the URL with your target podcast.
2. Connect Credentials: Set up your credentials for OpenAI, Slack, and X (Twitter) in the respective nodes.
3. Slack Channel: In the [Step 12] Slack node, select the Channel ID where you want to receive the approval request.
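The template ships its own chunking logic; the sketch below is only one way such a download could look, assuming the podcast host honors HTTP Range requests.

```javascript
// Minimal sketch (not the template's exact node code) of downloading a large
// audio file in chunks via HTTP Range requests, one way to implement the
// "download in chunks" logic described above. The chunk size and the use of
// Range headers are assumptions; the podcast host must support them.
const CHUNK_SIZE = 5 * 1024 * 1024; // 5 MB per request

async function downloadInChunks(url) {
  const head = await fetch(url, { method: "HEAD" });
  const total = Number(head.headers.get("content-length"));
  const chunks = [];

  for (let start = 0; start < total; start += CHUNK_SIZE) {
    const end = Math.min(start + CHUNK_SIZE - 1, total - 1);
    const res = await fetch(url, { headers: { Range: `bytes=${start}-${end}` } });
    chunks.push(Buffer.from(await res.arrayBuffer()));
  }
  return Buffer.concat(chunks); // full audio file, ready for transcription
}
```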
by sato rio
This workflow streamlines the entire inventory replenishment process by leveraging AI for demand forecasting and intelligent logic for supplier selection. It aggregates data from multiple sources—POS systems, weather forecasts, SNS trends, and historical sales—to predict future demand. Based on these predictions, it calculates shortages, requests quotes from multiple suppliers, selects the optimal vendor based on cost and lead time, and executes the order automatically.

🚀 Who is this for?

- **Retail & E-commerce Managers** aiming to minimize stockouts and reduce overstock.
- **Supply Chain Operations** looking to automate procurement and vendor selection.
- **Data Analysts** wanting to integrate external factors (weather, trends) into inventory planning.

💡 How it works

1. Data Aggregation: Fetches data from POS systems, MySQL (historical sales), OpenWeatherMap (weather), and SNS trend APIs.
2. AI Forecasting: Formats the data and sends it to an AI prediction API to forecast demand for the next 7 days.
3. Shortage Calculation: Compares the forecast against current stock and safety stock to determine necessary order quantities (see the sketch after this section).
4. Supplier Optimization: For items needing replenishment, the workflow requests quotes from multiple suppliers (A, B, C) in parallel. It selects the best supplier based on the lowest total cost within a 7-day lead time.
5. Execution & Logging: Places the order via API, updates the inventory system, and logs the transaction to MySQL.
6. Anomaly Detection: If the AI's confidence score is low, it skips the auto-order and sends an alert to Slack for manual review.

⚙️ Setup steps

1. Configure Credentials: Set up credentials for MySQL and Slack in n8n.
2. API Keys: You will need an API key for OpenWeatherMap (or a similar service).
3. Update Endpoints: The HTTP Request nodes use placeholder URLs (e.g., pos-api.example.com, ai-prediction-api.example.com). Replace these with your actual internal APIs, ERP endpoints, or AI service (like OpenAI).
4. Database Prep: Ensure your MySQL database has a table named forecast_order_log to store the order history.
5. Schedule: The workflow is set to run daily at 03:00. Adjust the Schedule Trigger node as needed.

📋 Requirements

- **n8n** (Self-hosted or Cloud)
- **MySQL** database
- **Slack** workspace
- External APIs for POS, Inventory, and Supplier communication (or mock endpoints for testing).
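A minimal sketch of the shortage calculation, assuming each item carries `forecast_7d`, `current_stock`, and `safety_stock` fields; map these to your actual inventory and forecast payloads.

```javascript
// Hypothetical Code node sketch of the shortage calculation: compare the
// 7-day forecast with current stock and safety stock. The field names
// (forecast_7d, current_stock, safety_stock) are assumptions; map them to
// your inventory and forecast payloads.
return $input.all()
  .map((item) => {
    const { sku, forecast_7d, current_stock, safety_stock } = item.json;
    const required = forecast_7d + safety_stock;             // what we need on hand
    const shortage = Math.max(0, required - current_stock);  // 0 means no order
    return { json: { sku, shortage, needs_order: shortage > 0 } };
  })
  .filter((item) => item.json.needs_order); // only items to send out for quotes
```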
by Chandan Singh
This workflow creates a daily, automated backup of all workflows in a self-hosted n8n instance and stores them in Google Drive. Instead of exporting every workflow on every run, it uses content hashing to detect meaningful changes and only updates backups when a workflow has actually been modified.

To keep Google Drive clean and predictable, the workflow intentionally deletes the existing backup file before uploading the updated version. This avoids duplicate files and ensures there is always one authoritative backup per workflow.

A Data Table is used as an index to track workflow IDs, hash values, and timestamps. This allows the workflow to quickly determine whether a workflow already exists, whether its content has changed, or whether it should be skipped entirely.

How it works

- Runs daily using a Cron Trigger.
- Fetches all workflows from the n8n API.
- Processes workflows one-by-one for reliability.
- Generates a SHA-256 hash for each workflow (see the sketch after this section).
- Compares hashes against a stored Data Table.
- Deletes existing Google Drive backups when changes are detected.
- Uploads updated workflows and skips unchanged ones.
- Stores new or updated workflow details in the Data Table.
- Filters workflows based on the configured backup scope (all | active | tagged): backs up all workflows, only active workflows, or only workflows matching a specific tag. The scope filter is applied before hashing and comparison, ensuring only relevant workflows are processed.

Setup steps

1. **Set the Cron schedule** – Open the Cron Trigger node and choose the time you want the backup to run (for example, once daily during off-peak hours).
2. **Create a Data Table** – Create a new n8n Data Table with the title defined in dataTableTitle. This table stores workflowId, workflowName, hashCode, and DriveFiveId.
3. **Configure the Set node** – In the Set Backup Configuration node, provide the following values:

```json
{
  "n8nHost": "https://your-n8n-domain",
  "apiKey": "your-n8n-api-key",
  "backupFolder": "/n8n/workflow-backups",
  "hashAlgorithm": "sha256",
  "dataTableTitle": "n8n_workflow_backup_index",
  "backupScope": "",
  "requiredTag": ""
}
```

   Then choose how workflows should be selected for backup:
   - all – backs up every workflow (default)
   - active – backs up only enabled workflows
   - tagged – backs up only workflows containing a specific tag

   If using the tagged option, provide the required tag name to match:

```json
{
  "backupScope": "tagged",
  "requiredTag": "production"
}
```

4. **Connect Google Drive credentials** – Authorize your Google Drive account and ensure the backup folder exists.
5. **Activate the workflow** – Once enabled, backups run automatically with no further action required.
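A minimal sketch of the hashing step as a Code node. It assumes the built-in crypto module is allowed in your instance (on self-hosted n8n this may require NODE_FUNCTION_ALLOW_BUILTIN=crypto); n8n's Crypto node is an alternative. Hashing only `nodes` and `connections` is also an assumption about what counts as a meaningful change.

```javascript
// Hypothetical Code node sketch of the hashing step: produce a SHA-256 hash
// of each workflow's JSON so unchanged workflows can be skipped. Hashing only
// `nodes` and `connections` (rather than the full export) is an assumption;
// include whatever fields you consider a "meaningful change".
const crypto = require("crypto");

return $input.all().map((item) => {
  const wf = item.json;
  const content = JSON.stringify({ nodes: wf.nodes, connections: wf.connections });
  const hashCode = crypto.createHash("sha256").update(content).digest("hex");
  return { json: { workflowId: wf.id, workflowName: wf.name, hashCode } };
});
```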
by Connor Provines
[Meta] Multi-Format Documentation Generator for N8N Creators (+More)

One-Line Description

Transform n8n workflow JSON into five ready-to-publish documentation formats including technical guides, social posts, and marketplace submissions.

Detailed Description

What it does: This workflow takes an exported n8n workflow JSON file and automatically generates a complete documentation package with five distinct formats: technical implementation guide, LinkedIn post, Discord community snippet, detailed use case narrative, and n8n Creator Commons submission documentation. All outputs are compiled into a single Google Doc for easy access and distribution.

Who it's for:
- **n8n creators** preparing workflows for the template library or community sharing
- **Automation consultants** documenting client solutions across multiple channels
- **Developer advocates** creating content about automation workflows for different audiences
- **Teams** standardizing workflow documentation for internal knowledge bases

Key Features:
- **Parallel AI generation** - Creates all five documentation formats simultaneously using Claude, saving 2+ hours of manual writing
- **Automatic format optimization** - Each output follows platform-specific best practices (LinkedIn character limits, Discord casual tone, n8n marketplace guidelines)
- **Single Google Doc compilation** - All documentation consolidated with clear section separators and automatic workflow name detection
- **JSON upload interface** - Simple form-based trigger accepts workflow exports without technical setup
- **Smart content adaptation** - Same workflow data transformed into technical depth for developers, engaging narratives for social media, and searchable descriptions for marketplaces
- **Ready-to-publish outputs** - No editing required—each format follows platform submission guidelines and style requirements

How it works:
1. User uploads an exported n8n workflow JSON through a web form interface
2. Five AI agents process the workflow data in parallel, each generating format-specific documentation (technical guide, LinkedIn post, Discord snippet, use case story, marketplace listing)
3. All outputs merge into a formatted document with section headers and separators (see the sketch after this section)
4. Google Docs creates a new document with an auto-generated title from the workflow name and timestamp
5. The final document is populated with all five documentation formats, ready for copying to the respective platforms

Setup Requirements

Prerequisites:
- **Anthropic API** (Claude AI) - Powers all documentation generation; requires paid API access or credits
- **Google Docs API** - Creates and updates documentation; free with a Google Workspace account
- **n8n instance** - Cloud or self-hosted with AI agent node support (v1.0+)

Estimated Setup Time: 20-25 minutes (15 minutes for API credentials, 5-10 minutes for testing with a sample workflow)

Installation Notes

- **API costs**: Each workflow documentation run uses ~15,000-20,000 tokens across five parallel AI calls (approximately $0.30-0.50 per generation at current Claude pricing)
- **Google Docs folder**: Update the folderId parameter in the "Create a document" node to your target folder—the default points to a specific folder that won't exist in your Drive
- **Testing tip**: Use a simple 3-5 node workflow for your first test to verify all AI agents complete successfully before processing complex workflows
- **Wait node purpose**: The 5-second wait between document creation and content update prevents Google Docs API race conditions—don't remove this step
- **Form URL**: After activation, save the form trigger URL for easy access—bookmark it or share with team members who need to generate documentation

Customization Options

Swappable integrations:
- Replace Google Docs with Notion, Confluence, or file system storage by swapping the final nodes
- Switch from Claude to GPT-4, Gemini, or other LLMs by changing the language model node (may require prompt adjustments)
- Add Slack/email notification nodes after completion to alert when documentation is ready

Adjustable parameters:
- Modify AI prompts in each agent node to match your documentation style preferences or add company-specific guidelines
- Add/remove documentation formats by duplicating or deleting agent nodes and updating the merge configuration
- Change document formatting in the JavaScript code node (section separators, headers, metadata)

Extension possibilities:
- Add automatic posting to LinkedIn/Discord by connecting their APIs after doc generation
- Create version history tracking by appending to existing docs instead of creating new ones
- Build an approval workflow by adding human-in-the-loop steps before final document creation
- Generate visual diagrams by adding Mermaid chart generation from the workflow structure
- Create multi-language versions by adding translation nodes after English generation

Category

Development

Tags

documentation n8n content-generation ai claude google-docs workflow automation-publishing

Use Case Examples

- **Marketplace contributors**: Generate complete n8n template submission packages in minutes instead of hours of manual documentation writing across multiple format requirements
- **Agency documentation**: Automation consultancies can deliver client workflows with a professional documentation suite—technical guides for client IT teams, social posts for client marketing, and narrative case studies for portfolios
- **Internal knowledge base**: Development teams standardize workflow documentation across projects, ensuring every automation has consistent technical details, use case examples, and setup instructions for team onboarding
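A hypothetical sketch of the merge/formatting Code node referenced in step 3 and in the customization notes. The input order, the `output` field name, and the section titles are assumptions to adapt to your agent nodes.

```javascript
// Hypothetical sketch of the merge/formatting Code node: combine the five AI
// outputs into one document body with section headers and separators. The
// input field names and section titles are assumptions; match them to the
// agent nodes feeding the merge.
const sections = [
  ["Technical Implementation Guide", $input.all()[0]?.json.output],
  ["LinkedIn Post", $input.all()[1]?.json.output],
  ["Discord Snippet", $input.all()[2]?.json.output],
  ["Use Case Narrative", $input.all()[3]?.json.output],
  ["Marketplace Submission", $input.all()[4]?.json.output],
];

const separator = "\n\n" + "=".repeat(60) + "\n\n";
const documentBody = sections
  .map(([title, text]) => `${title}\n${"-".repeat(title.length)}\n\n${text ?? ""}`)
  .join(separator);

return [{ json: { documentBody } }];
```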
by Robert Breen
This workflow introduces beginners to one of the most fundamental concepts in n8n: looping over items. Using a simple use case—generating LinkedIn captions for content ideas—it demonstrates how to split a dataset into individual items, process them with AI, and collect the output for review or export.

✅ Key Features

- **🧪 Create Dummy Data**: Simulate a small dataset of content ideas.
- **🔁 Loop Over Items**: Process each row independently using the SplitInBatches node.
- **🧠 AI Caption Creation**: Automatically generate LinkedIn captions using OpenAI.
- **🧰 Tool Integration**: Enhance AI output with creativity-injection tools.
- **🧾 Final Output Set**: Collect the original idea and generated caption.

🧰 What You’ll Need

- ✅ An OpenAI API key
- ✅ The LangChain nodes enabled in your n8n instance
- ✅ Basic knowledge of how to trigger and run workflows in n8n

🔧 Step-by-Step Setup

1️⃣ Run Workflow
- **Node**: Manual Trigger (Run Workflow)
- **Purpose**: Manually start the workflow for testing or learning.

2️⃣ Create Random Data
- **Node**: Create Random Data (Code)
- **What it does**: Simulates incoming data with multiple content ideas.
- **Code**:

```javascript
return [
  { json: { row_number: 2, id: 1, Date: '2025-07-30', idea: 'n8n rises to the top', caption: '', complete: '' } },
  { json: { row_number: 3, id: 2, Date: '2025-07-31', idea: 'n8n nodes', caption: '', complete: '' } },
  { json: { row_number: 4, id: 3, Date: '2025-08-01', idea: 'n8n use cases for marketing', caption: '', complete: '' } }
];
```

3️⃣ Loop Over Items
- **Node**: Loop Over Items (SplitInBatches)
- **Purpose**: Sends one record at a time to the next node.
- **Why It Matters**: Loops in n8n are created using this node when you want to iterate over multiple items.

4️⃣ Create Captions with AI
- **Node**: Create Captions (LangChain Agent)
- **Prompt**: idea: {{ $json.idea }}
- **System Message**: You are a helpful assistant creating captions for a LinkedIn post. Please create a LinkedIn caption for the idea.
- **Model**: GPT-4o Mini or GPT-3.5
- **Credentials Required**: OpenAI Credential. Go to OpenAI API Keys, create a key, and add it in n8n under credentials as “OpenAi account”.

5️⃣ Inject Creativity (Optional)
- **Node**: Tool: Inject Creativity (LangChain Tool)
- **Purpose**: Demonstrates optional LangChain tools that can enhance or manipulate input/output.
- **Why It’s Cool**: A great way to show chaining tools to AI agents.

6️⃣ Output Table
- **Node**: Output Table (Set)
- **Purpose**: Combines original ideas and generated captions into the final structure.
- **Fields**:
  - idea: ={{ $('Create Random Data').item.json.idea }}
  - output: ={{ $json.output }}

💡 Educational Value

This workflow demonstrates:
- Creating dynamic inputs with the Code node
- Using SplitInBatches to simulate looping
- Sending dynamic prompts to an AI model
- Using Set to structure the output data

Beginners will understand how item-level processing works in n8n and how powerful looping combined with AI can be.

📬 Need Help or Want to Customize This?

Robert Breen
Automation Consultant | AI Workflow Designer | n8n Expert
📧 robert@ynteractive.com
🌐 ynteractive.com
🔗 LinkedIn

🏷️ Tags

n8n loops OpenAI LangChain workflow training beginner LinkedIn automation caption generator