by Muhammad Farooq Iqbal
This n8n template demonstrates how to create authentic-looking User Generated Content (UGC) advertisements using AI image generation, voice synthesis, and lip-sync technology. The workflow transforms product images into realistic customer testimonial videos that mimic genuine user reviews and social media content.

Use cases are many: generate authentic UGC-style ads for social media campaigns, create customer testimonial videos without hiring influencers, produce localized UGC content for different markets, automate TikTok/Instagram-style product reviews, or scale UGC ad production for e-commerce brands!

**Good to know**
- The workflow creates UGC-style content that appears genuine and authentic
- Uses multiple AI services: OpenAI GPT-4o for analysis, ElevenLabs for voice synthesis, and WaveSpeed AI for image generation and lip-sync
- Voice synthesis costs vary by ElevenLabs plan (typically $0.18-$0.30 per 1K characters)
- WaveSpeed AI pricing: ~$0.039 per image generation, with additional costs for lip-sync processing
- Processing time: ~3-5 minutes per complete UGC video
- Optimized for Malaysian-English content but easily adaptable for global markets

**How it works**
1. Product input: the Telegram bot receives product images to create UGC ads for (see the input-handling sketch at the end of this template)
2. AI analysis: GPT-4o analyzes the product to understand brand, colors, and target demographics
3. UGC content creation: AI generates authentic-sounding testimonial scripts and detailed prompts for realistic customer scenarios
4. Character generation: WaveSpeed AI creates believable customer avatars that look like real users reviewing products
5. Voice synthesis: ElevenLabs generates natural, conversational audio using gender-appropriate voice models
6. UGC video production: WaveSpeed AI combines generated characters with audio to create TikTok/Instagram-style review videos
7. Content delivery: final UGC videos are delivered via Telegram, ready for social media posting

The workflow produces UGC-style content that maintains authenticity while showcasing products in realistic, relatable scenarios that resonate with target audiences.

**How to use**
1. Set up credentials: configure OpenAI API, ElevenLabs API, WaveSpeed AI API, Cloudinary, and Telegram Bot credentials
2. Deploy the workflow: import the template and activate the workflow
3. Send product images: use the Telegram bot to send product images you want to create UGC ads for
4. Automatic UGC generation: the workflow automatically creates authentic-looking customer testimonial videos
5. Receive UGC content: get both testimonial images and final UGC videos ready for social media campaigns

Pro tip: the workflow automatically detects product demographics and creates appropriate customer personas. For best UGC results, use clear product images that show the item in use.

**Requirements**
- **OpenAI API** account for GPT-4o product analysis and UGC script generation
- **ElevenLabs API** account for authentic voice synthesis (requires voice cloning credits)
- **WaveSpeed AI API** account for realistic character generation and lip-sync processing
- **Cloudinary** account for UGC content storage and hosting
- **Telegram Bot** setup for content input and delivery
- **n8n** instance (cloud or self-hosted)

**Customizing this workflow**
- Platform-specific UGC: modify prompts to create UGC content optimized for TikTok, Instagram Reels, YouTube Shorts, or Facebook Stories.
- Brand voice: adjust testimonial scripts and character personas to match your brand's target audience and tone.
- Regional adaptation: customize language, cultural references, and character demographics for different markets.
- UGC style variations: create different UGC formats - unboxing videos, before/after comparisons, day-in-the-life content, or product demonstrations.
- Influencer personas: develop specific customer personas (age groups, lifestyles, interests) to create targeted UGC content for different audience segments.
- Content scaling: set up batch processing to generate multiple UGC variations for A/B testing different approaches and styles.
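A minimal sketch of the product-input step, assuming the standard n8n Telegram Trigger payload: a Code node that guards against non-image messages and selects the highest-resolution version of the uploaded photo before analysis. The Telegram Bot API fields (`photo` as an array of sizes, `file_id`) are real; the surrounding field names are illustrative.

```javascript
// Hypothetical n8n Code node: pick the highest-resolution product photo
// from the Telegram Trigger output. Telegram delivers each photo as an
// array of PhotoSize objects; sorting by area makes the choice explicit.
const message = $input.first().json.message;

if (!message.photo || message.photo.length === 0) {
  throw new Error('Please send a product image, not a text message.');
}

const largest = [...message.photo].sort(
  (a, b) => b.width * b.height - a.width * a.height
)[0];

return [{
  json: {
    chatId: message.chat.id,
    fileId: largest.file_id, // pass to a Telegram "get file" step for download
    caption: message.caption || '',
  },
}];
```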
by Jordan
This n8n template acts as your automated social media data analyst. Instead of manually checking your analytics across different dashboards every week, this workflow scrapes your latest stats, calculates your week-over-week growth, and uses AI to write a strategic performance report delivered straight to your inbox.

Use cases are many: perfect for content creators tracking growth, agencies managing client reporting, or community managers monitoring Skool engagement alongside social channels.

**Good to know**
- **First-run setup:** since this workflow calculates growth (current vs. last week), it needs a baseline to start. You will need to manually add one row to the linked Airtable template with your current stats before the first scheduled run.
- **Cost:** this uses the Apify API for scraping TikTok and OpenRouter for the LLM analysis. Costs are generally very low (pennies per run), but you will need active accounts for both.

**How it works**
1. Data collection: every Sunday, the workflow triggers and pulls your live follower counts from YouTube, TikTok, and Skool. It also grabs the transcripts and stats for every video you posted in the last 7 days.
2. Growth calculation: it searches your Airtable database for the previous week's record and compares it against the live numbers to calculate exactly how many subscribers and followers you gained (see the sketch after this description).
3. AI analysis: the data and video transcripts are fed into an LLM (via OpenRouter). The AI analyzes why certain videos performed well based on the content, identifying trends and engagement patterns.
4. Reporting: the LLM generates a clean, formatted HTML report.
5. Delivery & archiving: the workflow emails the report to you and saves the new raw data into Airtable to serve as the baseline for next week.

**How to use**
- The workflow is set to run weekly (Sundays), but you can change the Schedule Trigger to whatever day you prefer to receive reports.
- You will need to configure the CONFIG node at the start with your specific profile URLs and Channel IDs so the scrapers know where to look.
- A link to the required Airtable template is included in the workflow sticky notes.

**Requirements**
- **n8n** (self-hosted recommended)
- **Apify** account (for TikTok and Skool scraping)
- **Google Cloud** project (for the YouTube Data API)
- **OpenRouter** or OpenAI API key
- **Airtable** account
- **Gmail** account

**Customising this workflow**
- You can easily swap out the LLM model in the OpenRouter node if you prefer a specific model (like Claude 3.5 Sonnet or GPT-4o) for the analysis.
- You can also add more platforms like LinkedIn or Instagram if you have the appropriate Apify actors or API credentials.
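The growth-calculation step boils down to a subtraction against last week's baseline. A minimal sketch of what that comparison Code node might look like, assuming the live counts and the previous Airtable row have been merged into one item (all field names are illustrative):

```javascript
// Hypothetical n8n Code node: week-over-week growth calculation.
// Assumes the previous Airtable record and the live scrape results
// have been merged into one item upstream.
const { current, lastWeek } = $input.first().json;

const platforms = ['youtube', 'tiktok', 'skool'];

const growth = platforms.map((platform) => {
  const now = current[platform] ?? 0;
  const prev = lastWeek[platform] ?? 0;
  const delta = now - prev;
  return {
    platform,
    followers: now,
    gained: delta,
    // guard against a zero baseline on the very first comparison
    growthPct: prev > 0 ? ((delta / prev) * 100).toFixed(2) + '%' : 'n/a',
  };
});

return growth.map((g) => ({ json: g }));
```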
by Intuz
This n8n template from Intuz provides a complete end-to-end content factory to automate the entire lifecycle of creating and publishing AI-driven videos. It transforms a single text prompt into a fully scripted, visually rich video with AI-generated images and voiceovers, then distributes it across multiple social media platforms.

**Who's this workflow for?**
- Content creators & YouTubers
- Social media managers & agencies
- Digital marketers & entrepreneurs
- Brands looking to scale video content production

**Objective**
- Generate viral video scripts with Gemini AI (via LangChain).
- Break scripts into structured scenes (hooks, retention, CTA).
- Create image prompts and high-quality background visuals automatically.
- Store all scenes, prompts, images, and metadata in Airtable.
- Handle file formatting, uploads, and naming automatically.
- Add error handling and retry logic for smooth execution.
- Upload the finished video to social media platforms.

**How it works**
1. AI script generation: the workflow starts with a single command. A powerful Google Gemini AI model, acting as a "Content Brain," generates a complete, viral video script with a title, description, and multiple scenes.
2. Content management in Airtable: the entire script is broken down and saved systematically into an Airtable base, which acts as the central Content Management System (CMS) for the video production pipeline.
3. AI image generation: the workflow loops through each scene in Airtable. It uses an AI agent to enhance the image prompts and sends them to an image generation API (like Pollinations.ai) to create a unique, high-quality image for each scene. These images are then automatically uploaded back into Airtable.
4. Automated video rendering: once all images are ready, the workflow gathers the script text and the corresponding image URLs from Airtable and sends them to Creatomate. Creatomate uses a pre-defined template to assemble the final video, complete with AI-generated voiceovers for each scene.
5. Multi-platform publishing: after a brief wait for the video to render, the workflow retrieves the final video file and automatically publishes it across your connected social media channels, including YouTube and Instagram.

**Key requirements to use this template**
Before you start, please ensure you have the following accounts and assets ready:
1. n8n instance & required nodes: an active n8n account (cloud or self-hosted). This workflow relies on the official n8n LangChain integration (@n8n/n8n-nodes-langchain). If you are using a self-hosted version of n8n, please ensure this package is installed on your instance.
2. AI & video accounts:
  - Google Gemini AI account: a Google Cloud account with the Vertex AI API enabled and an API key.
  - Creatomate account: a platform for video rendering; you need an account, your API key, and a pre-designed video template ID.
3. Content & social media accounts:
  - Airtable account: an Airtable base set up to manage the video content. Using the complementary Airtable template is highly recommended.
  - YouTube account: a YouTube channel with API access enabled via the Google Cloud Console.
  - Upload-Post.com account: an account for posting to other platforms like Instagram.

**Workflow steps**
1. ▶️ Trigger (manual/auto): start the workflow manually or via a schedule.
2. 🧠 Content Brain (Gemini + LangChain): a role-trained viral-strategist prompt generates six scene scripts following Hook → Retention → Value → CTA and applying viral content rules (engagement triggers, curiosity gaps, shareable moments).
3. 📑 Structured Output Parser: enforces a JSON schema with video_id, video_title, description, and scenes[] (scene_number, text, image_prompt).
4. 📄 Airtable – store scenes: each scene is stored with its Video ID, Title, Description, Scene Number & Text, Image Prompt, and Generated Image link.
5. 🤖 AI Agent – image prompt creator: transforms scene text into optimized image prompts using structured rules.
6. 🎨 Pollinations AI – image generation: generates vertical 9:16 visuals with the parameters model: flux, resolution: 1080x1920, steps: 12, guidance scale: 3.5, safety checker enabled, and options seed=42, nologo=true (see the sketch at the end of this template).
7. 📂 File handling & conversion: assigns filenames automatically (e.g., images_001.png) and converts the binary image to base64 for Airtable storage.
8. 📤 Airtable upload – store images: attaches generated visuals directly to the Generated Image field.
9. ⚡ Switch & error handling: branches for ✅ success → continue, ❌ failed → stop with an error message, ⏳ processing → wait/retry.
10. Social media upload: to YouTube via the official YouTube API, and to Instagram via the Upload Post API.

**Setup instructions**
1. AI configuration: in the Google Gemini Chat Model nodes, connect your Google Gemini API account. In the Content Brain node, you can customize the core prompt to change the video's niche, style, or topic.
2. Airtable setup (crucial): connect your Airtable account in the Airtable nodes. Set up your Airtable base using the provided template, or ensure your base has identical table and field names. Update the Base ID and Table IDs in the Airtable nodes.
3. Video rendering setup (Creatomate): in the Video Rendering - Creatomate node, add your Creatomate API key to the header authorization. In the Template for Creatomate node, replace the template_id with the ID of your own video template from your Creatomate account.
4. Social media connections: in the Upload on YouTube node, connect your YouTube account via OAuth2. In the Upload on Instagram node, replace the API key in the header authorization with your key from Upload-Post.com.
5. Execute the workflow: click "Execute workflow" to kick off your automated video content factory.

**Connect with us**
Website: https://www.intuz.com/services
Email: getstarted@intuz.com
LinkedIn: https://www.linkedin.com/company/intuz
Get started: https://n8n.partnerlinks.io/intuz
For custom workflow automation, click here: Get Started
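For orientation, step 6 above amounts to a single GET request against the Pollinations image endpoint. A sketch of how the URL might be assembled for one parsed scene; width, height, model, seed, and nologo are documented Pollinations query parameters, while how the template passes steps, guidance scale, and the safety checker may differ, so treat this as illustrative rather than the exact node configuration:

```javascript
// Hypothetical n8n Code node: build a Pollinations image URL for one
// scene. The scene object follows the schema enforced in step 3.
const scene = $input.first().json; // { scene_number, text, image_prompt }

const params = new URLSearchParams({
  model: 'flux',
  width: '1080',
  height: '1920',
  seed: '42',
  nologo: 'true',
});

const url =
  'https://image.pollinations.ai/prompt/' +
  encodeURIComponent(scene.image_prompt) +
  '?' + params.toString();

return [{ json: { ...scene, imageUrl: url } }];
```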
by zawanah
**Categorise and route emails with GPT-5**

This workflow demonstrates how to use an AI text classifier to classify incoming emails, with a multi-agent architecture that responds to each email category respectively.

**Use cases**
Business owners with a lot of incoming email, or anyone with a huge influx of emails.

**How it works**
Incoming emails are read by the text classifier powered by GPT-5 and routed to the defined categories, where the respective agents take the next steps.
1. The workflow is triggered when an email comes in.
2. GPT reads the email's "subject", "from", and "content" to route it accurately to its designated category (see the sketch after this description).
3. For customer support enquiries, the customer support agent takes knowledge from the Pinecone vector database about FAQs and policies, replies via Gmail, and labels the email "Customer Support".
4. For finance-related queries, the finance agent labels the email "Finance" and assesses whether it is about making payments or receiving from customers. If payment-related, the email is sent to the payments team to take action; if receipts-related, it is sent to the receivables team. The user is notified via Telegram after any email is sent.
5. For sales/leads enquiries, the leads agent labels the email "Sales Opportunities", takes knowledge from the Pinecone vector database about the business to generate a response, drafts it in Gmail, and notifies the user via Telegram to review and send. If there is not enough information for the agent to generate a response, the user is notified of this via Telegram as well.
6. Any internal team member emails are routed to the internal agent, which labels the message "Internal" and sends the user a summary of the email via Telegram.

**How to set up**
1. Set up a Telegram bot via BotFather. See setup instructions here.
2. Set up the OpenAI API for transcription services (credits required) here.
3. Set up an OpenRouter account. See details here.
4. Set up a Pinecone database. See details here.

**Customization options**
- Other than Gmail, it is possible to connect to Outlook as well.
- Other than Pinecone, other vector databases should serve the same purpose, e.g. Supabase, Qdrant, Weaviate.

**Requirements**
- Gmail account
- Telegram bot
- Pinecone account
- OpenRouter account
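Routing accuracy depends on what the classifier actually sees. A minimal sketch of preparing the classifier input from the incoming email, assuming loosely Gmail-Trigger-shaped fields (all names here are illustrative, not the template's exact expressions):

```javascript
// Hypothetical n8n Code node: assemble the text the classifier reads,
// combining the "from", "subject", and "content" fields named above.
const email = $input.first().json;

const classifierInput = [
  `From: ${email.from}`,
  `Subject: ${email.subject}`,
  '',
  email.text ?? email.snippet ?? '',
].join('\n');

return [{ json: { classifierInput, messageId: email.id } }];
```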
by Pixcels Themes
**Who’s it for**
This template is perfect for content creators, marketers, solopreneurs, agencies, and social media strategists who want to understand what audiences are talking about online. It helps teams quickly turn broad topics into structured insights, trend opportunities, and actionable content ideas.

**What it does / How it works**
This workflow begins with a form where the user enters a single topic. An AI agent expands the topic into subtopics and generates multiple relevant keywords. For each keyword, the workflow automatically gathers content from Reddit and X (formerly Twitter), extracting posts, titles, text, engagement metrics, and links (see the normalization sketch after this description). Each collected post is then analyzed by an AI model to determine:
- Trend potential
- Audience relevance
- Platform suitability
- Recommended content formats
- Categories and keywords

Once all posts are processed, a final AI agent synthesizes the results, identifies the strongest emerging trends, groups similar insights, and generates strategic content recommendations.

**Requirements**
- Google Gemini (PaLM) API credentials
- X / Twitter OAuth2 credentials
- Access to the n8n Form Trigger (publicly accessible URL)

**How to set up**
1. Connect your Gemini API and Twitter API credentials.
2. Make sure the Form Trigger node is accessible.
3. Review and adjust the AI prompts if you want different output formats.
4. Run the form, enter a topic, and execute the workflow.

**How to customize the workflow**
- Add more platforms (YouTube, TikTok, Instagram, Hacker News)
- Add sentiment scoring or engagement ranking
- Export insights to Google Sheets or Notion
- Generate ready-to-post content from the trends
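Since posts arrive from two very different APIs, a small normalization step keeps the downstream AI prompt uniform. A sketch of what that mapping might look like; the Reddit fields follow its public JSON listing (title, selftext, ups, num_comments, permalink), while the X fields are assumptions that depend on the node and API version used:

```javascript
// Hypothetical n8n Code node: normalize Reddit and X posts into one
// shape before AI analysis.
const items = $input.all();

return items.map(({ json }) => {
  if (json.subreddit !== undefined) {
    return { json: {
      platform: 'reddit',
      title: json.title,
      text: json.selftext || '',
      engagement: (json.ups ?? 0) + (json.num_comments ?? 0),
      link: 'https://www.reddit.com' + json.permalink,
    }};
  }
  return { json: {
    platform: 'x',
    title: '',
    text: json.text ?? '',
    engagement: (json.like_count ?? 0) + (json.retweet_count ?? 0),
    link: json.url ?? '',
  }};
});
```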
by Servify
This n8n template demonstrates how to build an autonomous AI assistant that handles real business tasks through natural conversation on Telegram. The example shows meeting scheduling with CRM lookup and calendar management, but the architecture supports any business automation you can imagine: simply add tools and the AI learns to use them automatically.

Use cases are many: try automating appointment scheduling, customer support tickets, invoice generation, lead qualification, email management, report generation, data entry, or task coordination!

**Good to know**
- OpenAI API costs are minimal at ~$0.001 per conversation with GPT-4o-mini
- The AI agent makes autonomous decisions and can chain multiple tool calls to complete complex tasks
- Conversation context is not persisted between sessions (this can be extended with a memory database)
- Calendar availability is checked for business hours (9 AM - 4 PM) by default; see the availability sketch at the end of this template
- The workflow assumes contacts are stored in Google Sheets with Name and Email columns
- This is production-ready code that can be deployed immediately for real business use

**How it works**
1. A user sends a natural-language message to the Telegram bot requesting a meeting
2. The workflow extracts the message content, chat ID, and user information
3. The CRM database is loaded from Google Sheets containing contact information
4. The AI agent analyzes the request and autonomously decides which tools to use
5. The AI searches the CRM for contacts, checks Google Calendar availability, and proposes 3 available time slots
6. The user confirms their preferred time through a conversational reply
7. Upon confirmation, the workflow creates a Google Calendar event with both parties invited
8. A professional confirmation email is automatically sent via Gmail to the meeting attendee
9. The entire multi-step process executes autonomously through simple conversation

**How to use**
1. Set up a Google Sheet as your CRM with columns: Name, Email, Phone
2. Create a Telegram bot via BotFather and get your bot token
3. Import this workflow and connect your credentials (Telegram, OpenAI, Google Sheets, Calendar, Gmail)
4. Replace placeholder IDs with your actual Google Sheet ID and Calendar ID in the workflow nodes
5. Activate the workflow to start listening for Telegram messages
6. Test with: "Schedule a meeting with [contact name] tomorrow at 2 PM"
7. Customize the AI Agent system prompt to match your scheduling preferences and timezone

**Requirements**
- Telegram Bot token (free from BotFather)
- OpenAI API account with GPT-4o-mini access
- Google Sheets OAuth2 credentials for CRM database access
- Google Calendar OAuth2 credentials for availability checking and event creation
- Gmail OAuth2 credentials for sending confirmation emails

**Customising this workflow**
- Add new tools (APIs, databases, services) and the AI automatically learns to use them - no code changes needed
- Replace Telegram with Slack, WhatsApp, or SMS for different communication channels
- Extend capabilities beyond scheduling: invoice generation, customer support, lead qualification, report generation
- Connect external systems like HubSpot, Salesforce, QuickBooks, Asana, or custom APIs
- Add conversation memory by integrating a vector database for context-aware multi-turn conversations
- Implement approval workflows where the AI drafts actions for human review before execution
- Build multiple specialized assistants with different tools and expertise for various business functions
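To make the availability logic concrete, here is a minimal sketch of how "check business hours and propose 3 slots" could be computed from a Google Calendar free/busy response. The busy-interval shape matches the Calendar API's freebusy output; the input and output field names around it are illustrative:

```javascript
// Hypothetical n8n Code node: derive free 1-hour slots inside business
// hours (9 AM - 4 PM) from a Google Calendar busy list.
const input = $input.first().json;
const busy = (input.busy ?? []).map((b) => ({
  start: new Date(b.start).getTime(),
  end: new Date(b.end).getTime(),
}));

const day = new Date(input.date); // requested day, midnight local time
const free = [];

for (let hour = 9; hour < 16; hour++) {
  const start = new Date(day); start.setHours(hour, 0, 0, 0);
  const end = new Date(day);   end.setHours(hour + 1, 0, 0, 0);
  const clashes = busy.some(
    (b) => b.start < end.getTime() && b.end > start.getTime()
  );
  if (!clashes) free.push(start.toISOString());
}

// Offer the first three openings, as the agent does in conversation.
return [{ json: { proposedSlots: free.slice(0, 3) } }];
```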
by Karol
**How it works**
This workflow turns any URL sent to a Telegram bot into ready-to-publish social posts:
1. Trigger: Telegram message (checks if it contains a URL; see the sketch after this description).
2. Fetch & parse: downloads the page and extracts readable text + title.
3. AI writing: generates platform-specific copy (Facebook, Instagram, LinkedIn).
4. Image: creates an AI image and stores it in Supabase Storage.
5. Publish: posts to Facebook Pages, Instagram Business, LinkedIn.
6. Logging: updates Google Sheets with post URLs and sends a Telegram confirmation (image + links).

**Setup**
1. Telegram – create a bot, connect via n8n Telegram credentials.
2. OpenAI / Gemini – add an API key in n8n Credentials and select it in the AI nodes.
3. Facebook/Instagram (Graph API) – create a credential called facebookGraph with:
• accessToken (page-scoped or system user)
• pageId (for Facebook Page photos)
• igUserId (Instagram Business account ID)
• optional fbApiVersion (default v19.0)
4. LinkedIn – connect with OAuth2 in the LinkedIn node (leave as credential).
5. Supabase – credential supabase with url and apiKey. Ensure a bucket exists (the default used in the Set node is social-media).
6. Google Sheets – replace YOUR_GOOGLE_SHEET_ID and Sheet1, and grant your n8n Google OAuth2 access.

**Notes**
• No API keys are stored in the template; everything runs via n8n Credentials.
• You can change the bucket name, image size/quality, and AI prompts in the respective nodes.
• The confirmation message on Telegram includes direct permalinks to the published posts.

**Required credentials**
• Telegram Bot
• OpenAI (or Gemini)
• Facebook/Instagram Graph
• LinkedIn OAuth2
• Supabase (url + apiKey)
• Google Sheets OAuth2

**Inputs**
• A Telegram message that contains a URL.

**Outputs**
• Social posts published on Facebook, Instagram, LinkedIn.
• Row appended/updated in Google Sheets with post URLs and image link.
• Telegram confirmation with the generated image + post links.
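The trigger step's URL check can be as small as one regex. A sketch of how the Telegram text might be screened before fetching, assuming the standard Telegram Trigger payload (the template's actual implementation may differ):

```javascript
// Hypothetical n8n Code node: check the incoming Telegram message for a
// URL and pass the first one forward. The regex is deliberately simple.
const message = $input.first().json.message ?? {};
const text = message.text ?? '';

const match = text.match(/https?:\/\/[^\s]+/);

if (!match) {
  // No URL found: stop this branch (or reply with usage hints instead).
  return [];
}

return [{
  json: {
    chatId: message.chat.id,
    url: match[0].replace(/[).,]+$/, ''), // trim trailing punctuation
  },
}];
```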
by yu-ya
**Schedule and optimize social media posts to Twitter and LinkedIn using AI**

This workflow automates the entire lifecycle of social media management, from fetching draft content to AI-driven optimization and multi-platform publishing.

**Who’s it for**
Social media managers, marketing teams, and content creators who use Google Sheets to plan their content but want to leverage AI for better engagement and automate the repetitive task of cross-platform posting.

**How it works**
The workflow is triggered either hourly or manually via a webhook. It fetches scheduled content from a designated Google Sheet and identifies posts ready for publication (see the filter sketch after this description). An AI Agent (OpenAI) then analyzes the raw content to generate two optimized versions: a punchy, character-limited post for Twitter and a more professional, detailed version for LinkedIn. After generating relevant hashtags and engagement tips, the workflow publishes the posts simultaneously. Finally, it logs the live URLs back to your spreadsheet and sends a performance summary to a Slack channel for easy tracking.

**How to set up**
1. Google Sheet: create a sheet with columns for status, content, platforms, scheduled_time, hashtags, and tone.
2. Credentials: connect your Google Sheets, OpenAI, Twitter (X), LinkedIn, and Slack accounts.
3. Node configuration: select your specific spreadsheet and worksheet in both the "Fetch Content" and "Update Content" nodes.
4. Slack: specify the channel name or ID in the Slack node to receive notifications.
5. Activation: test with the manual webhook, then toggle the workflow to "Active."

**Requirements**
- **Google Sheets OAuth2**
- **OpenAI API key** (GPT-4o-mini or higher)
- **Twitter (X) OAuth2**
- **LinkedIn OAuth2**
- **Slack Bot Token**

**How to customize the workflow**
- **AI tone**: modify the "System Message" in the AI Content Optimizer node to match your brand's unique voice.
- **Additional platforms**: extend the branching logic after the AI Parse node to include platforms like Discord, Facebook, or Mastodon.
- **Advanced scheduling**: adjust the Filter node's JavaScript code if you use a different date format or status labels in your spreadsheet.
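The "identifies posts ready for publication" step is a date-and-status filter over the sheet rows. A sketch of that check, using the columns named in the setup above; the "scheduled" status label and the date format are assumptions to adjust to your sheet:

```javascript
// Hypothetical version of the Filter/Code logic that picks rows due for
// publishing, based on the status and scheduled_time columns.
const now = new Date();

const ready = $input.all().filter(({ json }) => {
  const isScheduled = (json.status ?? '').toLowerCase() === 'scheduled';
  const due = new Date(json.scheduled_time); // e.g. "2024-06-01 14:00"
  return isScheduled && !isNaN(due) && due <= now;
});

return ready;
```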
by Mariela Slavenova
This template crawls a website from its sitemap, deduplicates URLs in Supabase, scrapes pages with Crawl4AI, cleans and validates the text, then stores content + metadata in a Supabase vector store using OpenAI embeddings. It’s a reliable, repeatable pipeline for building searchable knowledge bases, SEO research corpora, and RAG datasets.

**Good to know**
• Built-in de-duplication via a scrape_queue table (status: pending/completed/error).
• Resilient flow: waits, retries, and marks failed tasks.
• Costs depend on Crawl4AI usage and OpenAI embeddings.
• Replace any placeholders (API keys, tokens, URLs) before running.
• Respect website robots/ToS and applicable data laws when scraping.

**How it works**
1. Sitemap fetch & parse: load sitemap.xml and extract all URLs.
2. De-dupe: normalize URLs, check the Supabase scrape_queue, and insert only new ones (see the sketch after this description).
3. Scrape: send URLs to Crawl4AI; poll task status until completed.
4. Clean & score: remove boilerplate/markup, detect content type, compute quality metrics, and extract metadata (title, domain, language, length).
5. Chunk & embed: split text and create OpenAI embeddings.
6. Store: upsert into the Supabase vector store (documents) with metadata; update job status.

**Requirements**
• Supabase (Postgres + vector extension enabled)
• Crawl4AI API key (or header auth)
• OpenAI API key (for embeddings)
• n8n credentials set for HTTP and Postgres/Supabase

**How to use**
1. Configure credentials (Supabase/Postgres, Crawl4AI, OpenAI).
2. (Optional) Run the provided SQL to create scrape_queue and documents.
3. Set your sitemap URL in the HTTP Request node.
4. Execute the workflow (manual trigger) and monitor Supabase statuses.
5. Query your documents table or vector store from your app/RAG stack.

**Potential use cases**
This automation is ideal for:
• Market research teams collecting competitive data
• Content creators monitoring web trends
• SEO specialists tracking website content updates
• Analysts gathering structured data for insights
• Anyone needing reliable, structured web content for analysis

Need help customizing? Contact me for consulting and support: LinkedIn
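De-duplication is only as good as the URL normalization in front of it. A sketch of the kind of normalization step 2 implies, before the scrape_queue lookup; the template's exact rules may differ:

```javascript
// Hypothetical n8n Code node: normalize sitemap URLs so trivially
// different spellings of the same page deduplicate to one queue row.
function normalizeUrl(raw) {
  const u = new URL(raw);
  u.hostname = u.hostname.toLowerCase();
  u.hash = ''; // fragments never change page content
  // strip common tracking parameters
  for (const p of [...u.searchParams.keys()]) {
    if (p.startsWith('utm_')) u.searchParams.delete(p);
  }
  // drop a trailing slash on non-root paths
  if (u.pathname.length > 1 && u.pathname.endsWith('/')) {
    u.pathname = u.pathname.slice(0, -1);
  }
  return u.toString();
}

return $input.all().map(({ json }) => ({
  json: { url: normalizeUrl(json.url), status: 'pending' },
}));
```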
by Growth AI
**SEO Content Generation Workflow - n8n Template Instructions**

**Who's it for**
This workflow is designed for SEO professionals, content marketers, digital agencies, and businesses who need to generate optimized meta tags, H1 headings, and content briefs at scale. Perfect for teams managing multiple clients or large keyword lists who want to automate competitor analysis and SEO content creation while maintaining quality and personalization.

**How it works**
The workflow automates the entire SEO content creation process by analyzing your target keywords against top competitors, then generating optimized meta elements and comprehensive content briefs. It uses AI-powered analysis combined with real competitor data to create SEO-friendly content that's tailored to your specific business context. The system processes keywords in batches, performs Google searches, scrapes competitor content, analyzes heading structures, and generates personalized SEO content using your company's database information for maximum relevance.

**Requirements**
Required services and credentials:
- **Google Sheets API**: for reading configuration and updating results
- **Anthropic API**: for AI content generation (Claude Sonnet 4)
- **OpenAI API**: for embeddings and vector search
- **Apify API**: for Google search results
- **Firecrawl API**: for competitor website scraping
- **Supabase**: for the vector database (optional but recommended)

Template spreadsheet: copy this template spreadsheet and configure it with your information: Template Link

**How to set up**

Step 1: Copy and configure the template
1. Make a copy of the template spreadsheet
2. Fill in the Client Information sheet:
  - Client name: your company or client's name
  - Client information: brief business description
  - URL: website address
  - Supabase database: database name (prevents AI hallucination)
  - Tone of voice: content style preferences
  - Restrictive instructions: topics or approaches to avoid
3. Complete the SEO sheet with your target pages:
  - Page: the page you're optimizing (e.g., "Homepage", "Product Page")
  - Keyword: the main search term to target
  - Awareness level: user familiarity with your business
  - Page type: category (homepage, blog, product page, etc.)
Step 2: Import the workflow
1. Import the n8n workflow JSON file
2. Configure all required API credentials in n8n: Google Sheets OAuth2, Anthropic API key, OpenAI API key, Apify API key, Firecrawl API key, Supabase credentials (if using the vector database)

Step 3: Test the configuration
1. Activate the workflow
2. Send your Google Sheets URL to the chat trigger
3. Verify that all sheets are readable and credentials work
4. Test with a single keyword row first

**Workflow process overview**

Phase 0: Setup and configuration
- Copy the template spreadsheet
- Configure client information and SEO parameters
- Set up API credentials in n8n

Phase 1: Data input and processing
- The chat trigger receives the Google Sheets URL
- The system reads the client configuration and SEO data
- Filters valid keywords and empty H1 fields
- Initiates batch processing

Phase 2: Competitor research and analysis
- Searches Google for the top 10 results per keyword
- Scrapes the first 5 competitor websites
- Extracts heading structures (H1-H6); a sketch follows at the end of these instructions
- Analyzes competitor meta tags and content organization

Phase 3: Meta tags and H1 generation
- The AI analyzes keyword context and competitor data
- Accesses the client database for personalization
- Generates an optimized meta title (65 chars max)
- Creates a compelling meta description (165 chars max)
- Produces a user-focused H1 (70 chars max)

Phase 4: Content brief creation
- Analyzes search intent percentages
- Develops a content strategy based on competitor analysis
- Creates a detailed MECE page structure
- Suggests rich media elements
- Provides writing recommendations and detail-level scoring

Phase 5: Data integration and updates
- Combines all generated content into a unified structure
- Updates Google Sheets with the new SEO elements
- Preserves existing data while adding new content
- Continues batch processing for the remaining keywords

**How to customize the workflow**
Adjusting AI models:
- Replace Anthropic Claude with other LLM providers
- Modify system prompts for different content styles
- Adjust character limits for meta elements

Modifying competitor analysis:
- Change the number of competitors analyzed (currently 5)
- Adjust scraping parameters in the Firecrawl nodes
- Modify the heading extraction logic in the JavaScript nodes

Customizing output format:
- Update the Google Sheets column mapping in the Code node
- Modify the structured output parser schema
- Change the batch size in the Split in Batches node

Adding quality controls:
- Insert validation nodes between phases
- Add error handling and retry logic
- Implement content quality scoring

Extending functionality:
- Add keyword research capabilities
- Include image optimization suggestions
- Integrate social media content generation
- Connect to CMS platforms for direct publishing

**Best practices**
- Test with small batches before processing large keyword lists
- Monitor API usage and costs across all services
- Regularly update system prompts based on output quality
- Maintain clean data in your Google Sheets template
- Use descriptive node names for easier workflow maintenance

**Troubleshooting**
- **API errors**: check credential configuration and usage limits
- **Scraping failures**: Firecrawl nodes have error handling enabled
- **Empty results**: verify keyword formatting and competitor availability
- **Sheet updates**: ensure proper column mapping in the final Code node
- **Processing stops**: check batch processing limits and timeout settings
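For reference, the heading-extraction logic mentioned in Phase 2 can be a few lines over the markdown Firecrawl returns for a competitor page. A sketch, with illustrative field names around the core parsing:

```javascript
// Hypothetical version of the heading-extraction Code node: pull H1-H6
// out of a scraped page's markdown by matching leading '#' characters.
const markdown = $input.first().json.markdown ?? '';

const headings = [];
for (const line of markdown.split('\n')) {
  const m = line.match(/^(#{1,6})\s+(.*)$/);
  if (m) {
    headings.push({ level: m[1].length, text: m[2].trim() });
  }
}

return [{ json: { headings } }];
```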
by Growth AI
**SEO Content Generation Workflow (Basic Version) - n8n Template Instructions**

**Who's it for**
This workflow is designed for SEO professionals, content marketers, digital agencies, and businesses who need to generate optimized meta tags, H1 headings, and content briefs at scale. Perfect for teams managing multiple clients or large keyword lists who want to automate competitor analysis and SEO content creation without the complexity of vector databases.

**How it works**
The workflow automates the entire SEO content creation process by analyzing your target keywords against top competitors, then generating optimized meta elements and comprehensive content briefs. It uses AI-powered analysis combined with real competitor data to create SEO-friendly content that's tailored to your specific business context. The system processes keywords in batches, performs Google searches, scrapes competitor content, analyzes heading structures, and generates personalized SEO content using your company information for maximum relevance.

**Requirements**
Required services and credentials:
- **Google Sheets API**: for reading configuration and updating results
- **Anthropic API**: for AI content generation (Claude Sonnet 4)
- **Apify API**: for Google search results
- **Firecrawl API**: for competitor website scraping

Template spreadsheet: copy this template spreadsheet and configure it with your information: Template Link

**How to set up**

Step 1: Copy and configure the template
1. Make a copy of the template spreadsheet
2. Fill in the Client Information sheet:
  - Client name: your company or client's name
  - Client information: brief business description
  - URL: website address
  - Tone of voice: content style preferences
  - Restrictive instructions: topics or approaches to avoid
3. Complete the SEO sheet with your target pages:
  - Page: the page you're optimizing (e.g., "Homepage", "Product Page")
  - Keyword: the main search term to target
  - Awareness level: user familiarity with your business
  - Page type: category (homepage, blog, product page, etc.)
Step 2: Import the workflow
1. Import the n8n workflow JSON file
2. Configure all required API credentials in n8n: Google Sheets OAuth2, Anthropic API key, Apify API key, Firecrawl API key

Step 3: Test the configuration
1. Activate the workflow
2. Send your Google Sheets URL to the chat trigger
3. Verify that all sheets are readable and credentials work
4. Test with a single keyword row first

**Workflow process overview**

Phase 0: Setup and configuration
- Copy the template spreadsheet
- Configure client information and SEO parameters
- Set up API credentials in n8n

Phase 1: Data input and processing
- The chat trigger receives the Google Sheets URL
- The system reads the client configuration and SEO data
- Filters valid keywords and empty H1 fields
- Initiates batch processing

Phase 2: Competitor research and analysis
- Searches Google for the top 10 results per keyword using Apify
- Scrapes the first 5 competitor websites using Firecrawl
- Extracts heading structures (H1-H6) from competitor pages
- Analyzes competitor meta tags and content organization
- Processes markdown content to identify heading hierarchies

Phase 3: Meta tags and H1 generation
- The AI analyzes keyword context and competitor data using Claude
- Incorporates client information for personalization
- Generates an optimized meta title (65 characters maximum)
- Creates a compelling meta description (165 characters maximum)
- Produces a user-focused H1 (70 characters maximum)
- Uses structured output parsing for consistent formatting (a length-validation sketch follows at the end of this template)

Phase 4: Content brief creation
- Analyzes search intent percentages (informational, transactional, navigational)
- Develops a content strategy based on competitor analysis
- Creates a detailed MECE page structure with H2 and H3 sections
- Suggests rich media elements (images, videos, infographics, tables)
- Provides writing recommendations and detail-level scoring (1-10 scale)
- Ensures SEO optimization while maintaining user relevance

Phase 5: Data integration and updates
- Combines all generated content into a unified structure
- Updates Google Sheets with the new SEO elements
- Preserves existing data while adding new content
- Continues batch processing for the remaining keywords

**Key differences from the advanced version**
This basic version focuses on core SEO functionality without additional complexity:
- **No vector database**: removes the Supabase integration for a simpler setup
- **Streamlined architecture**: fewer dependencies and configuration steps
- **Essential features only**: core competitor analysis and content generation
- **Faster setup**: reduced time to deployment
- **Lower costs**: fewer API services required

**How to customize the workflow**
Adjusting AI models:
- Replace Anthropic Claude with other LLM providers in the agent nodes
- Modify system prompts for different content styles or languages
- Adjust character limits for meta elements in the structured output parser

Modifying competitor analysis:
- Change the number of competitors analyzed (currently 5) by adding/removing Scrape nodes
- Adjust scraping parameters in the Firecrawl nodes for different content types
- Modify the heading extraction logic in the JavaScript Code nodes

Customizing output format:
- Update the Google Sheets column mapping in the final Code node
- Modify the structured output parser schema for different data structures
- Change the batch size in the Split in Batches node

Adding quality controls:
- Insert validation nodes between workflow phases
- Add error handling and retry logic to critical nodes
- Implement content quality scoring mechanisms

Extending functionality:
- Add keyword research capabilities with additional APIs
- Include image optimization suggestions
- Integrate social media content generation
- Connect to CMS platforms for direct publishing

**Best practices**
Setup and testing:
- Always test with small batches before processing large keyword lists
- Monitor API usage and costs across all services
- Regularly update system prompts based on output quality
- Maintain clean data in your Google Sheets template

Content quality:
- Review generated content before publishing
- Customize system prompts to match your brand voice
- Use descriptive node names for easier workflow maintenance
- Keep competitor analysis current by running it regularly

Performance optimization:
- Process keywords in small batches to avoid timeouts
- Set appropriate retry policies for external API calls
- Monitor workflow execution times and optimize bottlenecks

**Troubleshooting**
Common issues and solutions:

API errors:
- Check credential configuration in n8n settings
- Verify API usage limits and billing status
- Ensure proper authentication for each service

Scraping failures:
- Firecrawl nodes have error handling enabled to continue on failures
- Some websites may block scraping; this is normal behavior
- Check whether competitor URLs are accessible and valid

Empty results:
- Verify keyword formatting in Google Sheets
- Ensure competitor websites contain the expected content structure
- Check that meta tags are properly formatted in the system prompts

Sheet update errors:
- Ensure proper column mapping in the final Code node
- Verify Google Sheets permissions and sharing settings
- Check that target sheet names match exactly

Processing stops:
- Review batch processing limits and timeout settings
- Check for errors in individual nodes using the execution logs
- Verify that all required fields are populated in the input data

**Template structure**
Required sheets:
- Client Information: business details and configuration
- SEO: target keywords and page information
- Results sheet: where generated content will be written

Expected columns:
- **Keywords**: target search terms
- **Description**: brief page description
- **Type de page**: page category
- **Awareness level**: user familiarity level
- **title, meta-desc, h1, brief**: generated output columns

This streamlined version provides all essential SEO content generation capabilities while being easier to set up and maintain than the advanced version with vector database integration.
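As a concrete example of the quality controls suggested above, a validation node could enforce the Phase 3 character budgets before anything is written back to the sheet. A sketch with illustrative field names:

```javascript
// Hypothetical n8n Code node: enforce the Phase 3 length budgets
// (meta title <= 65, meta description <= 165, H1 <= 70) on the parsed
// AI output before the sheet update.
const LIMITS = { metaTitle: 65, metaDescription: 165, h1: 70 };

return $input.all().map(({ json }) => {
  const issues = [];
  for (const [field, max] of Object.entries(LIMITS)) {
    const value = json[field] ?? '';
    if (value.length === 0) {
      issues.push(`${field} is empty`);
    } else if (value.length > max) {
      issues.push(`${field} is ${value.length} chars (max ${max})`);
    }
  }
  return { json: { ...json, valid: issues.length === 0, issues } };
});
```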
by David Olusola
**🤖 Automated AI News Video Creation and Social Media Publishing Workflow**
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

🎯 **Overview**
This workflow fully automates the creation and social media distribution of AI-generated news videos. It fetches news, crafts captions, generates avatar videos via HeyGen, stores them, and publishes them across Instagram, Facebook, and YouTube via Postiz.

🔄 **Workflow process**
1. News fetching: reads the latest news from an RSS feed.
2. AI captioning: generates concise, engaging captions using an AI agent (GPT-4o-mini).
3. Video generation: creates an AI avatar video using HeyGen with the generated caption.
4. Video storage: downloads the video and uploads it to Google Drive for archival.
5. Data logging: records all news and video metadata in Google Sheets.
6. Postiz upload: uploads the video to Postiz's internal storage for publishing.
7. Social publishing: fetches Postiz integrations and routes the video to Instagram, Facebook, and YouTube after platform-specific content cleaning (see the sketch after this description).

⚙️ **Key technologies**
- **RSS feeds**: news source.
- **LangChain (n8n nodes)**: AI Agent and Chat OpenAI for caption generation.
- **HeyGen API**: AI avatar video creation.
- **Google Drive**: video file storage.
- **Google Sheets**: data logging and tracking.
- **Postiz API**: unified social media publishing platform.

⚠️ **Critical configurations**
- **API keys**: ensure the HeyGen and Postiz API keys are correctly set in credentials and the 'Setup Heygen Parameters' node.
- **HeyGen IDs**: verify avatar_id and voice_id in 'Setup Heygen Parameters'.
- **Postiz URL**: confirm https://postiz.yourdomain.com is your correct Postiz instance URL across all HTTP Request nodes.
- **Credentials**: all Google, OpenAI, and Postiz credentials must be properly linked.

📈 **Benefits**
- Automated content creation and distribution, saving significant time.
- Consistent branding and messaging across multiple platforms.
- Centralized logging for tracking and performance analysis.
- Scalable solution for high-volume content demands.
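The platform-specific content cleaning in step 7 usually means trimming one master caption to each network's limits. A sketch of such a cleaning node; the Instagram caption cap (2,200 characters) and YouTube title cap (100 characters) are the platforms' documented limits, while the object structure and field names are illustrative:

```javascript
// Hypothetical n8n Code node: fan one caption out into per-platform
// payloads before the Postiz publish calls.
const { caption, title } = $input.first().json;

const truncate = (text, max) =>
  text.length <= max ? text : text.slice(0, max - 1).trimEnd() + '…';

return [{
  json: {
    instagram: { caption: truncate(caption, 2200) },
    facebook:  { caption }, // Facebook allows much longer posts
    youtube:   { title: truncate(title ?? caption, 100), description: caption },
  },
}];
```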