by Pikor
Different Articles Summarizer & Social Media Auto-Poster

This n8n template demonstrates how to extract full-text articles from different news websites, summarize them with AI, and automatically generate content for social networks (Twitter, Instagram, Threads, LinkedIn, YouTube). You can use it for any news topic. Example: posting summaries of breaking news articles.

Possible use cases:
- Automate press article summarization with GPT.
- Create social media posts optimized for young audiences.
- Publish content simultaneously across multiple platforms with the Late API.

How it works
- The workflow starts manually or with a trigger.
- URLs of news articles are defined in the Edit Fields node.
- Each URL is processed separately via Split Out.
- HTTP Request fetches the article HTML.
- A Custom Code node extracts clean text (title, content, main image).
- OpenAI summarizes each article factually.
- Aggregate combines the results.
- Another OpenAI node (Message a model) creates structured JSON summaries for young readers.
- A final OpenAI node (Message a model1) generates short social media posts (hook, summary, CTA, hashtags).
- Images are extracted via HTML1 and uploaded to Google Drive.
- Posts (text + image) are sent to the Late API for multi-platform scheduling (Twitter, Instagram, Threads, LinkedIn, YouTube).

Requirements
- OpenAI API key connected to n8n.
- Google Drive account (for storing article images).
- Late API credentials with platform account IDs.
- Valid list of article URLs.
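The Custom Code extraction step described above could be sketched as an n8n Code node like the one below. The regexes and property names (`item.json.data` for the fetched HTML) are illustrative assumptions, not the template's exact code; real news sites vary, so adjust the patterns per site:

```javascript
// Sketch of the "Custom Code" article-extraction step (illustrative regexes).
// Assumes the HTTP Request node put the raw HTML on item.json.data.
function extractArticle(html) {
  const pick = (re) => { const m = html.match(re); return m ? m[1].trim() : ''; };
  const title = pick(/<meta property="og:title" content="([^"]+)"/i) ||
                pick(/<title[^>]*>([^<]+)<\/title>/i);
  const image = pick(/<meta property="og:image" content="([^"]+)"/i);
  // Keep paragraph text only, strip remaining tags, collapse whitespace.
  const paragraphs = [...html.matchAll(/<p[^>]*>([\s\S]*?)<\/p>/gi)]
    .map(m => m[1].replace(/<[^>]+>/g, ' '))
    .join('\n');
  const content = paragraphs.replace(/\s+/g, ' ').trim();
  return { title, image, content };
}

// Inside the n8n Code node you would then return:
// return $input.all().map(item => ({ json: extractArticle(item.json.data || '') }));
```

For robust production use, a proper HTML parser (e.g. an external service, or Cheerio on self-hosted n8n) would handle malformed markup better than regexes.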
by Dr. Firas
💥 Automate YouTube thumbnail creation from video links (with templated.io)

Who is this for?
This workflow is designed for content creators, YouTubers, and automation enthusiasts who want to automatically generate stunning YouTube thumbnails and streamline their publishing workflow — all within n8n. If you regularly post videos and spend hours designing thumbnails manually, this automation is built for you.

What problem is this workflow solving?
Creating thumbnails is time-consuming — yet crucial for video performance. This workflow completely automates that process: no more manual design, no more downloading screenshots, no more repetitive uploads. In less than 2 minutes, you can refresh your entire YouTube thumbnail library and make your channel look brand new.

What this workflow does
Once activated, this workflow can:
✅ Receive YouTube video links via Telegram
✅ Extract metadata (title, description, channel info) via the YouTube API
✅ Generate a custom thumbnail automatically using Templated.io
✅ Upload the new thumbnail to Google Drive
✅ Log data in Google Sheets
✅ Send email and Telegram notifications when ready
✅ Create and publish AI-generated social posts on LinkedIn, Facebook, and Twitter via Blotato

Bonus: you can re-create dozens of YouTube covers in minutes — saving up to 5 hours per week and around $500/month in manual design effort.
Setup
1️⃣ Get a YouTube Data API v3 key from Google Cloud Console
2️⃣ Create a Templated.io account and get your API key + template ID
3️⃣ Set up a Telegram bot using @BotFather
4️⃣ Create a Google Drive folder and copy the folder ID
5️⃣ Create a Google Sheet with columns: Date, Video ID, Video URL, Title, Thumbnail Link, Status
6️⃣ Get your Blotato API key from the dashboard
7️⃣ Connect your social media accounts to Blotato
8️⃣ Fill in all credentials in the Workflow Configuration node
9️⃣ Test by sending a YouTube URL to your Telegram bot

How to customize this workflow
- Replace the Templated.io template ID with your own custom thumbnail layout
- Modify the OpenAI node prompts to change text tone or style
- Add or remove social platforms in the Blotato section
- Adjust the wait time (default: 5 minutes) based on template complexity
- Localize or translate the generated captions as needed

Expected Outcome
With one Telegram message, you'll receive:
- A professional custom thumbnail
- An instant email + Telegram notification
- A Google Drive link with your ready-to-use design
And your social networks will be automatically updated — no manual uploads.

Credits
- Thumbnail generation powered by Templated.io
- Social publishing powered by Blotato
- Automation orchestrated via n8n

👋 Need help or want to customize this?
📩 Contact: LinkedIn
📺 YouTube: @DRFIRASS
🚀 Workshops: Mes Ateliers n8n
🎥 Watch This Tutorial
📄 Documentation: Notion Guide

Need help customizing? Contact me for consulting and support: LinkedIn / YouTube / 🚀 Mes Ateliers n8n
by Hemanth Arety
Generate AEO strategy from brand input using AI competitor analysis

This workflow automatically creates a comprehensive Answer Engine Optimization (AEO) strategy by identifying your top competitors, analyzing their positioning, and generating custom recommendations to help your brand rank in AI-powered search engines like ChatGPT, Perplexity, and Google SGE.

Who it's for
This template is perfect for:
- **Digital marketing agencies** offering AEO services to clients
- **In-house marketers** optimizing content for AI search engines
- **Brand strategists** analyzing competitive positioning
- **Content teams** creating AI-optimized content strategies
- **SEO professionals** expanding into Answer Engine Optimization

What it does
The workflow automates the entire AEO research and strategy process in 6 steps:
1. Collects brand information via a user-friendly web form (brand name, website, niche, product type, email)
2. Identifies the top 3 competitors using Google Gemini AI based on product overlap, market position, digital presence, and geographic factors
3. Scrapes the target brand's website with Firecrawl to extract value propositions, features, and content themes
4. Scrapes competitor websites in parallel to gather competitive intelligence
5. Generates a comprehensive AEO strategy using OpenAI GPT-4 with 15+ actionable recommendations
6. Delivers a formatted report via email with executive summary, competitive analysis, and implementation roadmap

The entire process runs automatically and takes approximately 5-7 minutes to complete.
How to set up

Requirements
You'll need API credentials for:
- **Google Gemini API** (for competitor analysis) - Get API key
- **OpenAI API** (for strategy generation) - Get API key
- **Firecrawl API** (for web scraping) - Get API key
- **Gmail account** (for email delivery) - Use OAuth2 authentication

Setup Steps
1. Import the workflow into your n8n instance
2. Configure credentials:
   - Add your Google Gemini API key to the "Google Gemini Chat Model" node
   - Add your OpenAI API key to the "OpenAI Chat Model" node
   - Add your Firecrawl API key as HTTP Header Auth credentials
   - Connect your Gmail account using OAuth2
3. Activate the workflow and copy the form webhook URL
4. Test the workflow by submitting a real brand through the form
5. Check your email for the generated AEO strategy report

Credentials Setup Tips
- For Firecrawl: create HTTP Header Auth credentials with header name Authorization and value Bearer YOUR_API_KEY
- For Gmail: use OAuth2 to avoid authentication issues with 2FA
- Test each API credential individually before running the full workflow

How it works

Competitor Identification
The Google Gemini AI agent analyzes your brand based on 4 weighted criteria: product/service overlap (40%), market position (30%), digital presence (20%), and geographic overlap (10%). It returns structured JSON data with competitor names, URLs, overlap percentages, and detailed reasoning.

Web Scraping
Firecrawl extracts structured data from websites using custom schemas. For each site, it captures: company name, products/services, value proposition, target audience, key features, pricing info, and content themes. This runs asynchronously with 60-second waits to allow for complete extraction.
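The Firecrawl credential described above boils down to one HTTP header. As a sketch, a request built in a Code node might look like the following; the endpoint and body fields are assumptions based on Firecrawl's scrape API, so check their current docs before relying on them:

```javascript
// Illustrative Firecrawl request builder showing the Header Auth shape:
// header name "Authorization", value "Bearer <your API key>".
function buildFirecrawlRequest(apiKey, targetUrl) {
  return {
    url: 'https://api.firecrawl.dev/v1/scrape', // assumed endpoint; verify in docs
    options: {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${apiKey}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({ url: targetUrl, formats: ['markdown'] }),
    },
  };
}

// Usage sketch:
// const { url, options } = buildFirecrawlRequest(process.env.FIRECRAWL_KEY, 'https://example.com');
// const res = await fetch(url, options);
```

In n8n itself you would normally store the key once as an HTTP Header Auth credential and let the HTTP Request node add the header, rather than building requests by hand.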
Strategy Generation
OpenAI GPT-4 analyzes the combined brand and competitor data to generate a comprehensive report including: executive summary, competitive analysis, 15+ specific AEO tactics across 4 categories (content optimization, structural improvements, authority building, answer engine targeting), a content priority matrix with 10 ranked topics, and a detailed implementation roadmap.

Email Delivery
The strategy is formatted as a professional HTML email with clear sections, visual hierarchy, and actionable next steps. Recipients get an immediately implementable roadmap for improving their AEO performance.

How to customize the workflow

Change AI Models
- **Replace Google Gemini** with Claude, GPT-4, or another LLM in the competitor analysis node
- **Replace OpenAI** with Anthropic Claude or Google Gemini in the strategy generation node
- Both use LangChain agent nodes, making model swapping straightforward

Modify Competitor Analysis
- **Find more competitors**: Edit the AI prompt to request 5 or 10 competitors instead of 3
- **Add filtering criteria**: Include factors like company size, funding stage, or geographic focus
- **Change ranking weights**: Adjust the 40/30/20/10 weighting in the prompt

Enhance Data Collection
- **Add social media scraping**: Include LinkedIn, Twitter/X, or Facebook page analysis
- **Pull review data**: Integrate G2, Capterra, or Trustpilot APIs for customer sentiment
- **Include traffic data**: Add SimilarWeb or Semrush API calls for competitive metrics

Change Output Format
- **Export to Google Docs**: Replace Gmail with a Google Docs node to create shareable documents
- **Send to Slack/Discord**: Post strategy summaries to team channels for collaboration
- **Save to a database**: Store results in Airtable, PostgreSQL, or MongoDB for tracking
- **Create presentations**: Generate PowerPoint slides using automation tools

Add More Features
- **Schedule periodic analysis**: Run monthly competitive audits for specific brands
- **A/B test strategies**: Generate multiple strategies and compare results over time
- **Multi-language support**: Add translation nodes for international brands
- **Custom branding**: Modify email templates with your agency's logo and colors

Adjust Scraping Behavior
- **Change the Firecrawl schema**: Customize extracted data fields based on industry needs
- **Add timeout handling**: Implement retry logic for failed scraping attempts
- **Scrape more pages**: Extend beyond the homepage to include blog, pricing, and about pages
- **Use different scrapers**: Replace Firecrawl with Apify, Browserless, or custom solutions

Tips for best results
- **Provide clear brand information**: The more specific the product type and niche, the better the competitor identification
- **Ensure websites are accessible**: Some sites block scrapers; consider adding user agents or rotating IPs
- **Monitor API costs**: Firecrawl and OpenAI charges can add up; set usage limits
- **Review generated strategies**: AI recommendations should be reviewed and customized for your specific context
- **Iterate on prompts**: Fine-tune the AI prompts based on output quality over multiple runs

Common use cases
- **Client onboarding** for marketing agencies: generate initial AEO assessments
- **Content strategy planning**: identify topics and angles competitors are missing
- **Quarterly audits**: track competitive positioning changes over time
- **Product launches**: understand the competitive landscape before entering a market
- **Sales enablement**: equip sales teams with competitive intelligence

Note: This workflow uses community and AI nodes that require external API access. Make sure your n8n instance can make outbound HTTP requests and has the necessary LangChain nodes installed.
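The "retry logic for failed scraping attempts" suggested above could be added in a Code node with a small helper like this. It is a generic sketch, not part of the template; note that n8n nodes also have a built-in "Retry On Fail" setting that covers simple cases:

```javascript
// Hypothetical retry helper with exponential backoff for flaky scraping calls.
async function withRetry(fn, attempts = 3, baseMs = 1000) {
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      if (i === attempts - 1) throw err;   // out of attempts: surface the error
      const delay = baseMs * 2 ** i;       // 1s, 2s, 4s, ...
      await new Promise(resolve => setTimeout(resolve, delay));
    }
  }
}

// Usage sketch: const page = await withRetry(() => fetchPage(url));
```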
by franck fambou
Overview
This advanced automation workflow enables deep web scraping combined with Retrieval-Augmented Generation (RAG) to transform websites into intelligent, queryable knowledge bases. The system recursively crawls target websites, extracts content, and indexes all data in a vector database for AI conversational access.

How the system works

Intelligent Web Scraping and RAG Pipeline
1. Recursive Web Scraper - Automatically crawls every accessible page of a target website
2. Data Extraction - Collects text, metadata, emails, links, and PDF documents
3. Supabase Integration - Stores content in PostgreSQL tables for scalability
4. RAG Vectorization - Generates embeddings and stores them for semantic search
5. AI Query Layer - Connects embeddings to an AI chat engine with citations
6. Error Handling - Automatically retriggers failed queries

Setup Instructions
Estimated setup time: 30-45 minutes

Prerequisites
- Self-hosted n8n instance (v0.200.0 or higher)
- Supabase account and project (PostgreSQL enabled)
- OpenAI/Gemini/Claude API key for embeddings and chat
- Optional: external vector database (Pinecone, Qdrant)

Detailed configuration steps

Step 1: Supabase configuration
- **Project creation**: New Supabase project with PostgreSQL enabled
- **Generating credentials**: API keys (anon key and service_role key) and connection string
- **Security configuration**: RLS policies according to your access requirements

Step 2: Connect Supabase to n8n
- **Configure Supabase node**: Add credentials to n8n Credentials
- **Test connection**: Verify with a simple query
- **Configure PostgreSQL**: Direct connection for advanced operations

Step 3: Preparing the database
- **Main tables**:
  - pages: URLs, content, metadata, scraping statuses
  - documents: Extracted and processed PDF files
  - embeddings: Vectors for semantic search
  - links: Link graph for navigation
- **Management functions**: Scripts to reactivate failed URLs and manage retries

Step 4: Configuring automation
- **Recursive scraper**: Starting URL, crawling depth, CSS selectors
- **HTTP extraction**: User-Agent, headers, timeouts, and retry policies
- **Supabase backup**: Batch insertion, data validation, duplicate management

Step 5: Error handling and re-executions
- **Failure monitoring**: Automatic detection of failed URLs
- **Manual triggers**: Selective re-execution by domain or date
- **Recovery sub-streams**: Retry logic with exponential backoff

Step 6: RAG processing
- **Embedding generation**: Text-embedding models with intelligent chunking
- **Vector storage**: Supabase pgvector or an external database
- **Conversational engine**: Connection to chat models with source citations

Data structure

Main Supabase tables

| Table | Content | Usage |
|-------|---------|-------|
| pages | URLs, HTML content, metadata | Main storage for scraped content |
| documents | PDF files, extracted text | Downloaded and processed documents |
| embeddings | Vectors, text chunks | Semantic search and RAG |
| links | Link graph, navigation | Relationships between pages |

Use cases

Business and enterprise
- Competitive intelligence with conversational querying
- Market research from complex web domains
- Compliance monitoring and regulatory watch

Research and academia
- Literature extraction with semantic search
- Building datasets from fragmented sources

Legal and technical
- Scraping legal repositories with intelligent queries
- Technical documentation transformed into a conversational assistant

Key features

Advanced scraping
- Recursive crawling with automatic link discovery
- Multi-format extraction (HTML, PDF, emails)
- Intelligent error handling and retries

Intelligent RAG
- Contextual embeddings for semantic search
- Multi-document queries with citations
- Intuitive conversational interface

Performance and scalability
- Processing of thousands of pages per execution
- Embedding cache for fast responses
- Scalable architecture with Supabase

Technical Architecture
Main flow: Target URL → Recursive scraping → Content extraction → Supabase storage → Vectorization → Conversational interface
Supported types: HTML pages, PDF documents, metadata, links, emails

Performance specifications
- **Capacity**: 10,000+ pages per run
- **Response time**: < 5 seconds for RAG queries
- **Accuracy**: > 90% relevance for specific domains
- **Scalability**: Distributed architecture via Supabase

Advanced configuration

Customization
- Crawling depth and scope controls
- Domain and content type filters
- Chunking settings to optimize RAG

Monitoring
- Real-time monitoring in Supabase
- Cost and performance metrics
- Detailed conversation logs
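The "intelligent chunking" in the RAG processing step can be sketched as a Code-node helper that cuts text into overlapping windows, preferring sentence boundaries. The sizes and the sentence heuristic below are assumptions to tune for your embedding model, not the template's exact settings:

```javascript
// Illustrative chunker for the RAG vectorization step: fixed-size chunks with
// overlap, breaking at the last sentence end inside the window when possible.
function chunkText(text, maxLen = 800, overlap = 100) {
  const chunks = [];
  let start = 0;
  while (start < text.length) {
    let end = Math.min(start + maxLen, text.length);
    if (end < text.length) {
      const slice = text.slice(start, end);
      const lastStop = slice.lastIndexOf('. ');
      // Only use the sentence break if it doesn't make the chunk too small.
      if (lastStop > maxLen / 2) end = start + lastStop + 1;
    }
    chunks.push(text.slice(start, end).trim());
    if (end === text.length) break;
    start = end - overlap; // overlap preserves context across chunk borders
  }
  return chunks;
}
```

Each chunk would then be embedded and inserted into the embeddings table (pgvector) along with its source page URL for citation.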
by DevCode Journey
Who is this for?
This workflow is designed for content creators, marketers, and entrepreneurs who want to automate their video production and social media publishing process. If you regularly post promotional or viral-style content on platforms like TikTok, YouTube Shorts, Instagram Reels, LinkedIn, and more, this template will save you hours of manual work.

What problem is this workflow solving? / Use case
Creating viral short-form videos is often time-consuming: you need to generate visuals, write scripts, edit videos, and then manually upload them to multiple platforms. Staying consistent across TikTok, YouTube Shorts, Instagram Reels, LinkedIn, Twitter/X, and others requires constant effort. This workflow solves the problem by automating the entire pipeline from idea → video creation → multi-platform publishing.

What this workflow does
1. Collects an idea and image from Telegram
2. Enhances visuals with NanoBanana for a user-generated-content style
3. Generates a complete video script with AI (OpenAI + structured prompts)
4. Creates the final video with VEO3 using your custom prompt and visuals
5. Rewrites captions with GPT to be short, catchy, and optimized for social platforms
6. Saves metadata in Google Sheets for tracking and management
7. Auto-uploads the video to all major platforms via Blotato: TikTok, YouTube, Instagram, LinkedIn, Threads, Pinterest, X/Twitter, Bluesky, Facebook
8. Notifies you on Telegram with a preview link once publishing is complete

Setup
Connect your accounts:
- **Google Sheets** (for video tracking)
- **Telegram** (to receive and send media)
- **Blotato** (for multi-platform publishing)
- **OpenAI API** (for captions, prompts, and image analysis)
- **VEO3 API** (for video rendering)
- **Fal.ai** (for NanoBanana image editing)
- **Google Drive** (to store processed images)

Then set your credentials in the respective nodes, adjust the Google Sheet IDs to match your own sheet structure, and insert your Telegram bot token in the Set: Bot Token (Placeholder) node.
🙋 For Help & Community
🌐 Website: devcodejourney.com
🔗 LinkedIn: Connect with Shakil
📱 WhatsApp Channel: Join Now
💬 Direct Chat: Message Now
by Codint
📰 Related News to Content Marketing Automation

Overview
This workflow automatically collects news from an RSS feed, identifies the most relevant article(s), and generates ready-to-use social media and blog content tailored for Medium, LinkedIn, and Instagram.

It's ideal for:
• Marketing teams who want a steady flow of fresh content.
• Social media managers looking to save time on research and writing.
• Startups and creators who want consistent posting with minimal effort.

Instead of manually scanning articles and drafting posts, this automation gives you AI-generated content with your preferred tone of voice — and even sends confirmation emails so you can review before posting.

Prerequisites
Before using this workflow, make sure you have:
• ✅ An n8n account (self-hosted or cloud).
• ✅ An OpenAI API key for content generation.
• ✅ An RSS feed URL for your industry or niche.
• ✅ A Gmail account (or another configured email service) to receive content confirmation messages.

Setup Instructions
1. Import the Workflow: Download and import this workflow into your n8n instance.
2. Configure the RSS Feed: Open the RSS Read node and replace the sample URL with your preferred news source(s).
3. Connect OpenAI: Open the OpenAI Chat Model node and add your OpenAI API key in the credentials. (Optional) Adjust the prompts in the "Tone of Voice Writer" or "Instagram & LinkedIn Writer" nodes to match your brand's style.
4. Set Up Gmail: Open the Send Content Confirmation nodes, connect your Gmail (or another email service), and add the email address where you want to receive content drafts.
5. Run & Automate: Trigger the workflow manually with Execute Workflow to test. Once tested, enable the Schedule Trigger node to run it automatically (e.g., daily).

Customization Options
• 🔄 Add more platforms: extend the workflow for Twitter, Facebook, or Slack.
• ✏️ Adjust tone & length: update prompts in AI nodes (casual, professional, humorous, etc.).
• ✅ Human-in-the-loop: add approval steps before publishing directly.
• 🌍 Change news sources: swap RSS feeds for different industries or niches.

Example Output
LinkedIn Draft: "The future of marketing is AI-driven. 🚀 A new study shows that brands leveraging automation see 3x faster content turnaround. Read more here: [link]"
Instagram Caption: "Stay ahead of the curve 🌟 Today's top story in digital marketing: AI tools are reshaping how we create content. What do you think — game-changer or hype? 🤔 #AI #Marketing"

Limitations
• AI may produce errors → always review generated content before posting.
• Gmail integration only sends drafts for confirmation — direct posting is not included.
• RSS feeds vary in quality — results depend on your chosen source.

How It Works (Workflow Overview)
1. Collecting the news → fetch articles via RSS feed.
2. Best Article Finder → selects the most relevant item.
3. AI Writing → generates content tailored for Medium, LinkedIn, and Instagram.
4. Email Confirmation → sends drafts to your inbox for review.
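The Best Article Finder step is AI-driven in this template, but a cheap deterministic pre-filter could run first in a Code node, scoring RSS items by keyword relevance and recency before the AI sees them. The keywords, field names (`title`, `contentSnippet`, `isoDate` as produced by typical RSS nodes), and decay window below are illustrative assumptions:

```javascript
// Optional pre-filter before the AI "Best Article Finder": score each RSS item
// by niche-keyword hits plus a recency bonus, then keep the top-scoring one.
function pickBestArticle(items, keywords) {
  const scored = items.map(item => {
    const text = `${item.title} ${item.contentSnippet || ''}`.toLowerCase();
    const kwScore = keywords.reduce(
      (sum, kw) => sum + (text.includes(kw.toLowerCase()) ? 1 : 0), 0);
    const ageHours = (Date.now() - new Date(item.isoDate).getTime()) / 36e5;
    const recency = Math.max(0, 1 - ageHours / 48); // linear decay over 48 hours
    return { item, score: kwScore + recency };
  });
  scored.sort((a, b) => b.score - a.score);
  return scored[0] && scored[0].item;
}
```

A word-boundary regex instead of `includes` would avoid accidental substring hits (e.g. "ai" inside "rain") if that matters for your niche.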
by Marth
Automated AI-Driven Competitor & Market Intelligence System

**Problem Solved:** Small and medium-sized IT companies often struggle to stay ahead in a rapidly evolving market. Manually tracking competitor moves, pricing changes, product updates, and emerging market trends is time-consuming, inconsistent, and often too slow for agile sales strategies. This leads to missed sales opportunities, ineffective pitches, and a reactive rather than proactive market approach.

**Solution Overview:** This n8n workflow automates the continuous collection and AI-powered analysis of competitor data and market trends. By leveraging web scraping, RSS feeds, and advanced AI models, it transforms raw data into actionable insights for your sales and marketing teams. The system generates structured reports, notifies relevant stakeholders, and stores intelligence in your database, empowering your team with real-time, strategic information.

**For Whom:** This high-value workflow is perfect for:
- IT Solution Providers & SaaS Companies: to maintain a competitive edge and tailor sales pitches based on competitor weaknesses and market opportunities.
- Sales & Marketing Leaders: to gain comprehensive, automated market intelligence without extensive manual research.
- Product Development Teams: to identify market gaps and validate new feature development based on competitive landscapes and customer sentiment.
- Business Strategists: to inform strategic planning with data-driven insights into industry trends and competitive threats.

How It Works (Scope of the Workflow) ⚙️
This system establishes a powerful, automated pipeline for market and competitor intelligence:
1. Scheduled Data Collection: The workflow runs automatically at predefined intervals (e.g., weekly), initiating data retrieval from various online sources.
2. Diverse Information Gathering: It pulls data from competitor websites (pricing, features, blogs via web scraping services), industry news and blogs (via RSS feeds), and potentially other sources.
3. Intelligent Data Preparation: Collected data is aggregated, cleaned, and pre-processed using custom code to ensure it's in an optimal format for AI analysis, removing noise and extracting relevant text.
4. AI-Powered Analysis: An advanced AI model (like OpenAI's GPT-4o) performs in-depth analysis on the cleaned data. It identifies competitor strengths, weaknesses, new offerings, pricing changes, customer sentiment from reviews, and emerging market trends, and suggests specific opportunities and threats for your company.
5. Automated Report Generation: The AI's structured insights are automatically populated into a professional Google Docs report using a predefined template, making the intelligence easily digestible for your team.
6. Team Notification: Stakeholders (sales leads, marketing managers) receive automated notifications via Slack (or email), alerting them to the new report and key insights.
7. Strategic Data Storage & Utilization: All analyzed insights are stored in a central database (e.g., PostgreSQL). This builds a historical record for long-term trend analysis and can optionally trigger sub-workflows to generate personalized sales talking points directly relevant to ongoing deals or specific prospects.

Setup Steps 🛠️ (Building the Workflow)
To implement this workflow in your n8n instance, follow these detailed steps:

1. Prepare Your Digital Assets & Accounts:
- Google Sheet (optional, if using for CRM data): For simpler CRM, create a sheet with CompetitorName, LastAnalyzedDate, Strengths, Weaknesses, Opportunities, Threats, SalesTalkingPoints.
- API Keys & Credentials:
  - OpenAI API Key: Essential for the AI analysis.
  - Web Scraping Service API Key: For services like Apify, Crawlbase, or similar (e.g., Bright Data, ScraperAPI).
  - Database Access: Credentials for your PostgreSQL/MySQL database. Ensure you've created the necessary tables (competitor_profiles, market_trends) with appropriate columns.
  - Google Docs Credential: To link n8n to your Google Drive for report generation. Create a template Google Doc with placeholders (e.g., {{competitorName}}, {{strengths}}).
  - Slack Credential: For sending team notifications to specific channels.
  - CRM API Key (optional): If directly integrating with HubSpot, Salesforce, or a custom CRM via API.

2. Identify Data Sources for Intelligence:
- Compile a list of competitor website URLs you want to monitor (e.g., pricing pages, blog sections, news).
- Identify relevant online review platforms (e.g., G2, Capterra) for competitor products.
- Gather RSS feed URLs from key industry news sources, tech blogs, and competitors' own blogs.
- Define keywords for general market trends or competitor mentions, if using tools that provide RSS feeds (like Google Alerts).

3. Build the n8n Workflow (10 Key Nodes):
Start a new workflow in n8n and add the following nodes, configuring their parameters and connections carefully:
- Cron (Scheduled Analysis Trigger): Set this to trigger daily or weekly at a specific time (e.g., Every Week, At Hour: 0, At Minute: 0).
- HTTP Request (Fetch Competitor Web Data): Configure this to call your chosen web scraping service's API. Set Method to POST, URL to the service's API endpoint, and build the JSON/Raw Body with the startUrls (competitor websites, review sites) for scraping, including your API key in Authentication (e.g., Header Auth).
- RSS Feed (Fetch News & Blog RSS): Add the URLs of competitor blogs and industry news RSS feeds.
- Merge (Combine Data Sources): Connect inputs from both Fetch Competitor Web Data and Fetch News & Blog RSS. Use Merge By Position.
- Code (Pre-process Data for AI): Write JavaScript code to iterate through the merged items, extract the relevant text content, perform basic cleaning (e.g., HTML stripping), and limit text length for AI input. Output should be an array of objects with content, title, url, and source.
- OpenAI (AI Analysis & Competitor Insights): Select your OpenAI credential.
  Set Resource to Chat Completion and Model to gpt-4o. In Messages, create a System message defining the AI's role and a User message containing the dynamic prompt (referencing {{ $json.map(item => ... ).join('\n\n') }} for content, title, url, source) and requesting a structured JSON output for analysis. Set Output to Raw Data.
- Google Docs (Generate Market Intelligence Report): Select your Google Docs credential. Set Operation to Create document from template. Provide your Template Document ID and map the Values from the parsed AI output (using JSON.parse($json.choices[0].message.content).PropertyName) to your template placeholders.
- Slack (Sales & Marketing Team Notification): Select your Slack credential. Set Chat ID to your team's Slack channel ID. Compose the Text message, referencing the report link ({{ $json.documentUrl }}) and key AI insights (e.g., {{ JSON.parse($json.choices[0].message.content).Competitor_Name }}).
- PostgreSQL (Store Insights to Database): Select your PostgreSQL credential. Set Operation to Execute Query. Write an INSERT ... ON CONFLICT DO UPDATE SQL query to store the AI insights into your competitor_profiles or market_trends table, mapping values from the parsed AI output.
- OpenAI (Generate Personalized Sales Talking Points - Optional Branch): This node can be part of the main workflow or a separate, manually triggered workflow. Configure it similarly to the main AI node, but with a prompt tailored to generate sales talking points based on a specific sales context and the stored insights.

4. Final Testing & Activation:
- Run a Test: Before going live, manually trigger the workflow from the first node. Carefully review the data at each stage to ensure correct processing and output. Verify that reports are generated, notifications are sent, and data is stored correctly.
- Activate Workflow: Once testing is complete and successful, activate the workflow in n8n.
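The Pre-process Data for AI Code node described in the build steps could look like the sketch below. The output fields (content, title, url, source) follow the spec above; the input property names and the character limit are assumptions to adapt to your Merge node's actual output:

```javascript
// Sketch of the "Pre-process Data for AI" Code node: strip HTML, collapse
// whitespace, and cap length so each item fits comfortably in the AI prompt.
const MAX_CHARS = 4000; // assumed limit — tune for your model and item count

function clean(html) {
  return (html || '')
    .replace(/<script[\s\S]*?<\/script>/gi, ' ')
    .replace(/<style[\s\S]*?<\/style>/gi, ' ')
    .replace(/<[^>]+>/g, ' ')   // strip remaining tags
    .replace(/\s+/g, ' ')
    .trim()
    .slice(0, MAX_CHARS);
}

function preprocess(items) {
  return items.map(i => ({
    content: clean(i.content || i.description || i.text),
    title: i.title || '',
    url: i.url || i.link || '',
    source: i.source || (i.link ? 'rss' : 'scraper'), // assumed heuristic
  }));
}

// In the n8n Code node you would then return:
// return preprocess($input.all().map(x => x.json)).map(json => ({ json }));
```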
This system will empower your IT company's sales team with invaluable, data-driven intelligence, enabling them to close more deals and stay ahead in the market.
by Dr. Firas
Generate AI viral videos with NanoBanana & VEO3, shared on socials via Blotato

Who is this for?
This workflow is designed for content creators, marketers, and entrepreneurs who want to automate their video production and social media publishing process. If you regularly post promotional or viral-style content on platforms like TikTok, YouTube Shorts, Instagram Reels, LinkedIn, and more, this template will save you hours of manual work.

What problem is this workflow solving? / Use case
Creating viral short-form videos is often time-consuming: you need to generate visuals, write scripts, edit videos, and then manually upload them to multiple platforms. Staying consistent across TikTok, YouTube Shorts, Instagram Reels, LinkedIn, Twitter/X, and others requires constant effort. This workflow solves the problem by automating the entire pipeline from idea → video creation → multi-platform publishing.

What this workflow does
1. Collects an idea and image from Telegram.
2. Enhances visuals with NanoBanana for a user-generated-content style.
3. Generates a complete video script with AI (OpenAI + structured prompts).
4. Creates the final video with VEO3 using your custom prompt and visuals.
5. Rewrites captions with GPT to be short, catchy, and optimized for social platforms.
6. Saves metadata in Google Sheets for tracking and management.
7. Auto-uploads the video to all major platforms via Blotato (TikTok, YouTube, Instagram, LinkedIn, Threads, Pinterest, X/Twitter, Bluesky, Facebook).
8. Notifies you on Telegram with a preview link once publishing is complete.

Setup
Connect your accounts:
- Google Sheets (for video tracking)
- Telegram (to receive and send media)
- Blotato (for multi-platform publishing)
- OpenAI API (for captions, prompts, and image analysis)
- VEO3 API (for video rendering)
- Fal.ai (for NanoBanana image editing)
- Google Drive (to store processed images)

Then set your credentials in the respective nodes and adjust the Google Sheet IDs to match your own sheet structure.
Insert your Telegram bot token in the Set: Bot Token (Placeholder) node.

How to customize this workflow to your needs
- **Platforms**: Disable or enable only the Blotato social accounts you want to post to.
- **Video style**: Adjust the master prompt schema in the Set Master Prompt node to fine-tune tone, camera style, or video format.
- **Captions**: Modify the GPT prompt in the Rewrite Caption with GPT-4o node to control length and tone.
- **Notifications**: Customize the Telegram nodes to notify team members, not just yourself.
- **Scheduling**: Add a Cron trigger if you want automatic posting at specific times.

✨ With this workflow, you go from idea → AI-enhanced video → instant multi-platform publishing in just minutes, with almost no manual work.

📄 Documentation: Notion Guide
Need help customizing? Contact me for consulting and support: LinkedIn / YouTube
by Diptamoy Barman
Social Media Spark (SMS) — Automated Viral Content Engine

Automate your entire content workflow: discover viral ideas, generate posts in your tone, repurpose for X, and auto-publish — reducing 90% of manual effort.

🚀 What it Does
- Scrapes competitor or niche profiles on LinkedIn to find high-performing posts.
- Classifies and saves evergreen content ideas for later use.
- Generates fresh posts in your own voice with matching images.
- Repurposes content for X (Twitter) in a platform-optimized style.
- Automatically publishes content to LinkedIn and X on your schedule.
- Allows on-demand commands via Telegram for research or instant content generation.

🧩 Why Use It
- **Save time:** no more manual scraping, idea collection, or formatting.
- **Stay consistent:** maintain a daily content pipeline.
- **Multi-platform leverage:** create once, adapt for LinkedIn and X.
- **Creative control:** mix automation with optional human review.
- **Scalable:** extend to more platforms, analytics, and workflows as you grow.

🔧 Prerequisites & Setup
Before importing or activating the workflow, prepare these:
- **AI Provider (OpenAI / Gemini / OpenRouter)**: for classifying posts, generating new content, and repurposing for X.
- **Google Sheets**: central database for competitors, ideas, generated posts, and posting status.
- **Google Drive**: stores generated images.
- **Apify & Browseract**: scrape LinkedIn profiles and posts, and perform research tasks.
- **LinkedIn API**: needed for automated LinkedIn publishing.
- **X (Twitter) API**: requires OAuth 1.0a for image uploads and OAuth 2.0 for text posting.
- **Telegram Bot**: enables on-demand commands and notifications. Set your Telegram user ID in the trigger node.

> 🔎 In each sub-workflow, look for nodes marked "Configure Me!" to replace example prompts, search keywords, sheet IDs, etc.

⚙️ How It Works (Simplified Flow)
1. Scrape & Classify: collect high-engagement posts → keep evergreen ones.
2. Generate Content: rewrite ideas into new posts in your voice → create images.
- **Repurpose for X:** Adapt LinkedIn posts for short-form, high-impact tweets.
- **Auto-Publish:** Post daily on LinkedIn and X.
- **Control via Telegram:** Manually trigger scraping, research, or post generation.

💡 Best Practices & Tips
- Keep all API keys private — never share them in public repos or screenshots.
- Adjust cron schedules (e.g., scraping on weekends, posting on weekdays) to fit your content rhythm.
- Add human-in-the-loop review steps for brand-sensitive content.
- Extend to other platforms (Instagram, TikTok, YouTube Shorts) as needed.
- Experiment with prompt variations for different tones or creative styles.
- Add analytics logging (likes, comments, clicks) to measure content performance.

🙋‍♂️ Who Is This For
- **Solo creators & founders** who want to post consistently but don't have time for daily ideation.
- **Small marketing teams** that need to keep up with trends without spending hours on research.
- **Consultants & thought leaders** who want to amplify their personal brand on LinkedIn and X.
- **Startups & bootstrapped businesses** that need a lean but reliable content engine.
- **Content strategists** who want a data-driven, repeatable pipeline for finding and using what works.
- Or anyone who wants to boost their social presence by 300%.

💡 Why SMS Stands Out
- **Authentic voice:** Uses your own tone and style (defined in prompts and examples), so posts feel personal — not generic AI fluff.
- **Data-driven:** Pulls from real, viral posts in your niche to inspire fresh content.
- **Quality over quantity:** Focuses only on proven viral ideas instead of churning out random posts.
- **Consistent growth:** Keeps your posting regular, so you stay visible and relevant.
- **Efficient workflow:** Minimizes manual work while letting you step in when needed (e.g., for approvals or special campaigns).

> ⚡ SMS combines real market data with your unique voice — so you post smarter, not just more often.
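The repurposing step above is done by an AI prompt in the workflow itself. As a rough illustration of the hard constraint that step must satisfy, here is a hypothetical post-processing helper (not part of the template) that trims adapted copy to X's 280-character limit while keeping hashtags intact:

```javascript
// Hypothetical helper: fit AI-repurposed copy into X's 280-char limit.
// Truncates at a word boundary and appends an ellipsis when needed,
// always leaving room for the hashtags.
function fitToTweet(text, hashtags = [], limit = 280) {
  const tag = hashtags.length ? ' ' + hashtags.join(' ') : '';
  const budget = limit - tag.length;
  let body = text.trim();
  if (body.length > budget) {
    // cut at the last word boundary that fits, leaving room for '…'
    body = body.slice(0, budget - 1);
    const lastSpace = body.lastIndexOf(' ');
    if (lastSpace > 0) body = body.slice(0, lastSpace);
    body += '…';
  }
  return body + tag;
}
```

A helper like this could run in a Code node after the AI repurposing step, as a safety net before the post is sent to the X API.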
by Franz
🚀 AI Lead Generation and Follow-Up Template

📋 Overview

This n8n workflow template automates your lead generation and follow-up process using AI. It captures leads through a form, enriches them with company data, classifies them into different categories, and sends appropriate follow-up sequences automatically.

Key Features:
- 🤖 AI-powered lead classification (Demo-ready, Nurture, Drop)
- 📊 Automatic lead enrichment with company data
- 📧 Intelligent email responses and follow-up sequences
- 📅 Automated demo scheduling for qualified leads
- 📝 Complete lead logging in Google Sheets
- 💬 AI assistant for immediate query responses

🛠️ Prerequisites

Before setting up this workflow, ensure you have:
- n8n instance: self-hosted or cloud version
- OpenAI API key: for AI-powered features
- Google Workspace account with Gmail, Google Sheets, and Google Calendar access
- A basic understanding of your Ideal Customer Profile (ICP)

⚡ Quick Start Guide

Step 1: Import the Workflow
1. Copy the workflow JSON.
2. Import it into your n8n instance.
3. The workflow will appear with all nodes connected.

Step 2: Configure Credentials
You'll need to set up the following credentials:
- **OpenAI API**: for AI agents and classification
- **Gmail OAuth2**: for sending emails
- **Google Sheets OAuth2**: for lead logging
- **Google Calendar OAuth2**: for demo scheduling

Step 3: Create Your Lead Log Sheet
Create a Google Sheet with these columns: Date, Name, Email, Company, Job Title, Message, Number of Employees, Industry, Geography, Annual Revenue, Technology, Pain Points, Lead Classification.

Step 4: Update Configuration Nodes
1. Replace the sheet ID in all Google Sheets nodes with your own.
2. Customize all email templates.
3. Replace "your-email@company.com" with your team's escalation email.
4. Configure ICP criteria in the "Define ICP and Lead Criteria" node.

🎯 Lead Classification Setup

Define Your ICP (Ideal Customer Profile)
Edit the "Define ICP and Lead Criteria" node to set your criteria.

📌 ICP Criteria Example:
- Company size: 50+ employees
- Industry: SaaS, Finance, Healthcare, Manufacturing
- Geography: North America, Europe
- Pain points: manual processes, compliance needs, scaling challenges
- Annual revenue: $5M+

✅ Demo-Ready Criteria (high-intent prospects who meet multiple qualifying factors):
- Large company size (your threshold)
- Clear pain points mentioned
- Urgent timeline
- Budget authority indicated
- Specific solution requests

🌱 Nurture Criteria (prospects with future potential):
- Meet basic size requirements
- In target industry
- General interest expressed
- Planning future implementation
- Exploring options

❌ Drop Criteria (only drop leads that clearly don't fit):
- Outside target geography
- Wrong industry (B2C if you're B2B)
- Too small with no growth
- Already with a competitor
- Spam or test messages

📧 Email Customization

Customize the follow-up sequences:

Demo-Ready Sequence:
1. Immediate calendar invitation
2. Personalized demo confirmation
3. Meeting reminder (optional)

Nurture Sequence:
1. Welcome email with resources
2. Educational content (Day 2)
3. Webinar/event invitation (Day 3)
4. Demo offer (Day 4)

Drop Message:
- Polite acknowledgment
- Clear explanation
- Keep the door open for the future

🔧 Advanced Configuration

AI Answer Agent Setup:
- Update the system prompt with your company information
- Add common Q&A patterns
- Set escalation rules
- Configure language preferences

Lead Enrichment Options:
- Add API keys for additional data sources
- Configure enrichment fields
- Set data quality thresholds
- Enable duplicate detection

Calendar Integration:
- Set available meeting times
- Configure meeting duration
- Add buffer times
- Set timezone handling

📊 Monitoring and Optimization

Track key metrics:
- Lead volume by classification
- Response rates
- Demo conversion rates
- Time to first response
- Enrichment success rate

Optimization tips:
- **Regular review**: check classification accuracy weekly
- **A/B testing**: test different email sequences
- **Feedback loop**: use outcomes to refine ICP criteria
- **AI training**: update prompts based on results

🎉 Best Practices
- **Start simple**: begin with basic criteria and refine over time
- **Test thoroughly**: use test leads before going live
- **Monitor daily**: check logs for the first week
- **Iterate quickly**: adjust based on results
- **Document changes**: keep track of criteria updates

📈 Scaling Your Workflow

As your lead volume grows:
- **Add sub-workflows**: separate complex processes
- **Implement queuing**: handle high volumes
- **Add CRM integration**: sync with your sales tools
- **Enable analytics**: track detailed metrics
- **Set up alerts**: monitor for issues
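In the template itself the Demo-ready / Nurture / Drop call is made by an AI node, but the ICP criteria above can be sketched as deterministic rules. A minimal illustration for an n8n Code node — the field names (`geography`, `industry`, `employees`, `painPoints`, `urgent`) and thresholds are hypothetical, taken from the example criteria:

```javascript
// Hypothetical rule-based sketch of the Demo-ready / Nurture / Drop logic.
// The real template delegates this decision to an AI classifier; field
// names and thresholds here mirror the example ICP and are illustrative.
const TARGET_INDUSTRIES = ['SaaS', 'Finance', 'Healthcare', 'Manufacturing'];
const TARGET_GEOS = ['North America', 'Europe'];

function classifyLead(lead) {
  // Hard disqualifiers: wrong geography or industry -> Drop
  const inGeo = TARGET_GEOS.includes(lead.geography);
  const inIndustry = TARGET_INDUSTRIES.includes(lead.industry);
  if (!inGeo || !inIndustry) return 'Drop';

  // High intent: big enough, clear pain points, urgent timeline
  const bigEnough = lead.employees >= 50;
  const hasPain = (lead.painPoints || []).length > 0;
  if (bigEnough && hasPain && lead.urgent) return 'Demo-ready';

  // Everything else in-ICP goes to the nurture sequence
  return 'Nurture';
}
```

A rule layer like this can also act as a cheap pre-filter in front of the AI classifier, so obvious Drops never consume API tokens.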
by Julian Kaiser
What problem does this solve?

Earlier this year, as I got more involved with n8n, I committed to helping users on our community forums and the n8n subreddit. The volume of questions was growing, and I found it was a real challenge to keep up and make sure no one was left without an answer. I needed a way to quickly see what people were struggling with, without spending hours just searching for new posts.

So, I built this workflow. It acts as my personal AI research assistant. Twice a day, it automatically scans Reddit and the n8n forums for me. It finds relevant questions, summarizes the key points using AI, and sends me a digest with direct links to each post. This allows me to jump straight into the conversations that matter and provide support effectively.

While I built this for n8n support, you can adapt it to monitor any community, track product feedback, or stay on top of any topic you care about. It transforms noisy forums into an actionable intelligence report delivered right to your inbox.

How it works

Here's the technical breakdown of my two-part system:

AI Reddit Digest (daily at 9 AM / 5 PM):
1. Fetches the latest 50 posts from a specified subreddit.
2. Uses an AI text classifier to categorize each post (e.g., QUESTION, JOB_POST).
3. Isolates the posts classified as questions and uses an AI model to generate a concise summary for each.
4. Formats the original post link and its new summary into an email-friendly format and sends the digest.

AI n8n Forum Digest (daily at 9 AM / 5 PM):
1. Scrapes the n8n community forum to get a list of the latest post links.
2. Processes each link individually, fetching the full post content.
3. Filters these posts to keep only those containing a specific keyword (e.g., "2025").
4. Summarizes the filtered posts using an AI model.
5. Combines the original post link with its AI summary and sends it in a separate email report.

Set up steps

This workflow is quite powerful and requires a few configurations. Setup should take about 15 minutes.
1. **Add credentials**: First, add your credentials for your AI provider (like OpenRouter) and your email service (like Gmail or SMTP) in the Credentials section of your n8n instance.
2. **Configure the Reddit digest**: In the Get latest 50 reddit posts node, enter the name of the subreddit you want to follow. Fine-tune the AI's behavior by editing the prompt in the Summarize Reddit Questions node. (Optional) Add more examples to the Text Classifier node to improve its accuracy.
3. **Configure the n8n forum digest**: In the Filter 2025 posts node, change the keyword to track topics you're interested in. Edit the prompt in the Summarize n8n Forum Posts node to guide the AI's summary style.
4. **Activate the workflow**: Once configured, just set the workflow to Active. It will run automatically on schedule. You can also trigger it manually with the When clicking 'Test workflow' node.
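Under the hood, the "Get latest 50 reddit posts" step boils down to Reddit's public JSON listing endpoint (`https://www.reddit.com/r/<subreddit>/new.json`). A minimal sketch of the same fetch in a Code node — the function names and the exact output fields are illustrative, not the template's actual node code:

```javascript
// Sketch of fetching the latest posts from a subreddit via Reddit's
// public JSON listing API. Function names and output shape are
// illustrative; the template uses n8n's built-in Reddit/HTTP nodes.
function redditListingUrl(subreddit, limit = 50) {
  return `https://www.reddit.com/r/${subreddit}/new.json?limit=${limit}`;
}

// Extract the fields the digest needs from one listing "child".
function toDigestItem(child) {
  const d = child.data;
  return {
    title: d.title,
    link: `https://www.reddit.com${d.permalink}`,
    body: d.selftext,
  };
}

async function fetchLatestPosts(subreddit) {
  // Reddit rejects requests without a User-Agent header.
  const res = await fetch(redditListingUrl(subreddit), {
    headers: { 'User-Agent': 'n8n-digest-demo/0.1' },
  });
  const json = await res.json();
  return json.data.children.map(toDigestItem);
}
```

The `title`/`link`/`body` trio is all the downstream classifier and summarizer nodes need, which keeps the payload sent to the AI model small.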
by KPendic
This n8n flow demonstrates a basic DevOps task: DNS record management. An AI agent with a light, simple prompt acts as a getter and setter for DNS records. In this case, we manage a remote DNS server via API calls handled on the Cloudflare platform side.

This flow can be used standalone, or you can chain it into your pipeline to build powerful infrastructure flows for your needs.

How it works
- We created a basic agent and gave it a prompt describing one tool: cf_tool, a sub-routine (pointing back to this flow itself, though it can also be a separate dedicated one).
- The prompt defines the arguments that must be passed when calling the agent, for each action specifically.
- The tool itself contains a basic if/switch that, based on the action called, calls the corresponding Cloudflare API endpoint (and passes down the args from the tool).

Requirements
For storing and processing data in this flow you will need:
- A Cloudflare.com API key/token for retrieving your data (https://dash.cloudflare.com/?to=/:account/api-tokens)
- Saved OpenAI credentials (or any other LLM provider) for agent chat
- (Optional) A Postgres table for saving chat history

Official Cloudflare API documentation
For full details and specifications, please use the API documentation at: https://developers.cloudflare.com/api/

LinkedIn post
Let me know if you found this flow useful on my LinkedIn post > here.

tags: #cloudflare, #dns, #domain
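The if/switch inside cf_tool essentially maps the agent's action to a Cloudflare DNS endpoint. A minimal sketch of that dispatch — the URLs are Cloudflare's documented v4 DNS-records routes, but the action names ("get"/"set") and the argument shape are assumptions, not the template's exact contract:

```javascript
// Sketch of the cf_tool dispatch: map an agent action to a Cloudflare
// v4 API request. Endpoints follow the documented DNS-records routes;
// the action names and args shape are assumptions for illustration.
const CF_BASE = 'https://api.cloudflare.com/client/v4';

function buildDnsRequest(action, args) {
  const { zoneId, recordId, record } = args;
  switch (action) {
    case 'get':
      // List DNS records in a zone
      return { method: 'GET', url: `${CF_BASE}/zones/${zoneId}/dns_records` };
    case 'set':
      // Update an existing record if a recordId is given, else create one
      return recordId
        ? { method: 'PUT', url: `${CF_BASE}/zones/${zoneId}/dns_records/${recordId}`, body: record }
        : { method: 'POST', url: `${CF_BASE}/zones/${zoneId}/dns_records`, body: record };
    default:
      throw new Error(`Unknown action: ${action}`);
  }
}
```

Each resulting request would also carry an `Authorization: Bearer <token>` header built from the Cloudflare credential stored in n8n.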