by Juan Carlos Cavero Gracia
This automation workflow is designed for e-commerce businesses, digital marketers, and entrepreneurs who need to create high-quality promotional content for their products quickly and efficiently. From a single product image and description, the system automatically generates 4 promotional carousel-style images, perfect for social media, advertising campaigns, or web catalogs.

Note: This workflow uses the Gemini 2.5 Flash API for image generation, imgbb for image storage, and upload-post.com for automatic publishing to Instagram, TikTok, Facebook, and YouTube.

Who Is This For?
- **E-commerce Owners:** Transform basic product photos into professional promotional content featuring real people using products in authentic situations.
- **Digital Marketers & Agencies:** Generate multiple advertising content variations for Facebook Ads, Instagram Stories, and digital marketing campaigns.
- **Small Businesses & Entrepreneurs:** Create professional promotional material without expensive photo shoots or graphic designers.
- **Social Media Managers:** Produce engaging, authentic content that drives engagement and conversions across all social platforms.

What Problem Does This Workflow Solve?
Creating quality promotional content requires time, resources, and design skills. This workflow addresses these challenges with:
- **Automatic Carousel Generation:** Converts a single product photo into 4 promotional images featuring people using the product naturally.
- **Authentic & Engaging Content:** Generates images showing real product usage, increasing credibility and conversions.
- **Integrated Promotional Text:** Automatically includes visible offers, benefits, and calls to action in the images.
- **Social Media Optimization:** Produces vertical 9:16 images, perfect for Instagram, TikTok, and Facebook Stories.
- **Automatic Publishing:** Optionally publishes the complete carousel directly to Instagram with AI-generated, optimized descriptions.

How It Works
1. Product Upload: Upload a product image and provide a detailed description through the web form.
2. Smart Analysis: The AI agent analyzes the product and creates a storyboard of 4 different promotional images.
3. Image Generation: Gemini 2.5 Flash generates 4 variations showing people using the product in authentic contexts.
4. Automatic Processing: Images are automatically processed, optimized, and stored in imgbb.
5. Promotional Description: GPT-4 generates an attractive, social-media-optimized description based on the created images.
6. Optional Publishing: The system can automatically publish the complete carousel to Instagram.

Setup
1. fal.ai credentials: Sign up at fal.ai and add your API token to the Gemini 2.5 Flash nodes.
2. imgbb API: Create an account at imgbb.com, get your API key, and configure it in the "Set APIs Vars" node (a minimal sketch of this step follows below).
3. Upload-Post (optional), for automatic Instagram publishing: register your account at upload-post.com, connect your Instagram business account, and configure credentials in the "Upload Post" node.
4. OpenAI API: Configure your OpenAI API key for promotional description generation.

Requirements
- **Accounts:** n8n, fal.ai, imgbb.com, OpenAI, upload-post.com (optional), Instagram business (optional).
- **API Keys:** fal.ai token, imgbb API key, OpenAI API key, upload-post.com credentials.
- **Image Format:** Any standard image format (JPG, PNG, WebP) of the product to promote.
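To make the imgbb step concrete, here is a minimal Code-node sketch of uploading a generated image, assuming the API key was placed upstream in "Set APIs Vars" under an illustrative `IMGBB_API_KEY` field and that the image arrives as binary data on the item; the endpoint and `image` form field follow imgbb's public upload API, but treat the field names as assumptions, not the template's exact wiring.

```javascript
// Minimal sketch (assumptions: IMGBB_API_KEY set upstream, generated
// image present as the item's binary "data" property).
const apiKey = items[0].json.IMGBB_API_KEY;    // illustrative field name
const imageBase64 = items[0].binary.data.data; // n8n stores binary payloads base64-encoded

const response = await this.helpers.httpRequest({
  method: 'POST',
  url: `https://api.imgbb.com/1/upload?key=${apiKey}`,
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
  body: `image=${encodeURIComponent(imageBase64)}`,
});

// Parse defensively in case the helper returns the raw body as a string.
const data = typeof response === 'string' ? JSON.parse(response) : response;

// imgbb returns the hosted image URL under data.url
return [{ json: { imageUrl: data.data.url } }];
```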
Features
- **Advanced Generative AI:** Uses Gemini 2.5 Flash to create realistic images of people using products.
- **Smart Storyboard:** Automatically creates 4 different concepts to maximize engagement.
- **Integrated Promotional Text:** Includes offers, benefits, and CTAs directly in the images.
- **Optimized Format:** Generates vertical 9:16 images perfect for social media.
- **Parallel Processing:** Generates all 4 images simultaneously for maximum efficiency.
- **Automatic Publishing:** Option to publish directly to Instagram with optimized descriptions.

Use this template to transform basic product photos into complete promotional campaigns, saving time and resources while generating high-quality content that converts visitors into customers.
by Avkash Kakdiya
How it works
This workflow automatically generates personalized follow-up messages for leads or customers after key interactions (e.g., demos, sales calls). It enriches contact details from HubSpot (or optionally Monday.com), uses AI to draft a professional follow-up email, and distributes it across multiple communication channels (Slack, Telegram, Teams) as reminders for the sales team.

Step-by-step
1. Trigger & Input
- Schedule Trigger – Runs automatically at a defined interval (e.g., daily).
- Set Sample Data – Captures the contact's name, email, and context from the last interaction (e.g., "had a product demo yesterday and showed strong interest"); see the sketch at the end of this section.
2. Contact Enrichment
- HubSpot Contact Lookup – Searches HubSpot CRM by email to confirm or enrich contact details.
- Monday.com Contact Fetch (optional) – Can pull additional CRM details if enabled.
3. AI Message Generation
- AI Language Model (OpenAI) – Provides the underlying engine for message creation.
- Generate Follow-Up Message – Drafts a short, professional, and friendly follow-up email that references the previous interaction context, suggests clear next steps (call, resources, etc.), and ends with a standardized signature block for consistency.
4. Multi-Channel Communication
- Slack Reminder – Posts the generated message as a reminder in the sales team's Slack channel.
- Telegram Reminder – Sends the follow-up draft to a Telegram chat.
- Teams Reminder – Shares the same message in a Microsoft Teams channel.

Benefits
- Personalized Outreach at Scale – AI ensures each follow-up feels tailored and professional.
- Context-Aware Messaging – Pulls in CRM details and past interactions for relevance.
- Cross-Platform Delivery – Distributes reminders via Slack, Teams, and Telegram so no follow-up is missed.
- Time-Saving for Sales Teams – Eliminates manual drafting of repetitive follow-up emails.
- Consistent Branding – Ensures every message includes a unified signature block.
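For reference, the "Set Sample Data" step might shape its output like the following Code-node sketch; the field names (`name`, `email`, `lastInteraction`) are illustrative assumptions based on the data the step is described as capturing, not names fixed by the template.

```javascript
// Illustrative sample-data item for the follow-up generator
// (field names are assumptions; adapt them to your CRM fields).
return [{
  json: {
    name: 'Jane Doe',
    email: 'jane.doe@example.com',
    lastInteraction: 'had a product demo yesterday and showed strong interest',
  },
}];
```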
by Bhuvanesh R
Your Cold Email Is Now Researched: this pipeline finds specific bottlenecks on prospect websites and instantly crafts an irresistible pitch.

🎯 Problem Statement
Traditional high-volume cold email outreach is stuck on generic personalization (e.g., "Love your website!"). Sales teams, especially those selling high-value AI Receptionists, struggle to efficiently find the one Unique Operational Hook (like manual scheduling dependency or high call volume) needed to make the pitch relevant. This forces reliance on expensive, slow manual research, leading to low reply rates and inefficient spending on bulk outreach tools.

✨ Solution
This workflow deploys a resilient dual-AI personalization pipeline that runs on a batch basis. It uses the Filter (Qualified Leads) node as a cost-saving quality gate to prevent processing bad leads. It executes a targeted deep dive on successful leads, using GPT-4 for analytical insight extraction and Claude Sonnet for coherent, human-like copy generation. The entire process outputs campaign-ready data directly to Google Sheets and sends a critical QA draft via Gmail.

⚙️ How It Works (Multi-Step Execution)
1. Ingestion and Cost Control (The Quality Gate)
- **Trigger and Ingestion:** The workflow starts via a **Manual Trigger**, pulling leads directly from **Get All Leads** (Google Sheets).
- **Cost Filtering:** The **Filter (Qualified Leads)** node removes leads that lack a working email or website URL.
- **Execution Isolation:** The **Loop Over Leads** node initiates individual processing. The **Capture Lead Data (Set)** node immediately captures and locks down the original lead context for stability throughout the loop.
- **Hybrid Scraping:** The **Scrape Site (HTTP Request)** and **Extract Text & Links (HTML)** nodes execute the hybrid scraping strategy, simultaneously capturing website text and external links.
- **Data Shaping & Status:** The **Filter Social & Status (Code)** node is the control center. It filters links, bundles the context, and, critically, assigns a status of 'Success' or 'Scrape Fail'.
- **Cost Control Branch:** The **If** node checks this status. Items with 'Scrape Fail' bypass all AI steps (saving 100% of AI token costs) and jump directly to **Log Final Result**. Successful items proceed to the AI core.
2. Dual-AI Coherence & Dispatch (The Executive Output)
- **Analytical Synthesis:** The **Summarize Website (OpenAI)** node uses GPT-4 to synthesize the full context and extract the Unique Operational Hook (e.g., manual booking overhead).
- **Coherent Copy Generation:** The **Generate Subject & Body (Anthropic)** node uses the Claude Sonnet model to generate the subject and the multi-line body, guaranteeing coherence by creating both simultaneously in a single JSON output.
- **Final Parsing:** The **Parse AI Output (Code)** node reliably strips markdown wrappers and extracts the clean subject and body strings (see the sketch at the end of this section).
- **Final Delivery:** The data is logged via **Log Final Result** (Google Sheets), and the completed email is sent to the user via **Create a draft** (Gmail) for final quality assurance before sending.

🛠️ Setup Steps
Before running the workflow, ensure these credentials and data structures are correctly configured:

Credentials
- **Anthropic:** Configure credentials for the language model (Claude Sonnet).
- **OpenAI:** Configure credentials for the analytical model (GPT-4/GPT-4o).
- **Google Services:** Set up OAuth2 credentials for Google Sheets (input/output) and Gmail (draft QA and completion alert).
Configuration
- **Google Sheet Setup:** Your input sheet must include the columns **email**, **website_url**, and an empty **Icebreaker** column for initial filtering.
- **HTTP URL:** Verify that the **Scrape Site** node's URL parameter is set to pull the website URL from the stabilized data structure: ={{ $json.website_url }}.
- **AI Prompts:** Ensure the Anthropic prompt contains your current irresistible sales offer and the required nested JSON output structure.

✅ Benefits
- **Coherence Guarantee:** A single Anthropic node generates both the subject and body, guaranteeing the message is perfectly aligned and hits the same unique insight.
- **Maximum Cost Control:** The If node prevents spending tokens on bad or broken websites, making the campaign highly budget-efficient.
- **Deep Personalization:** Combines website text and social media links, creating an icebreaker that implies thorough, manual research.
- **High Reliability:** Uses robust Code nodes for data structuring and parsing, ensuring the workflow runs consistently under real-world conditions without crashing.
- **Zero-Risk QA:** The final Gmail (Create a draft) step ensures human review of the generated copy before any cold emails are sent out.
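To illustrate the "Parse AI Output" step, here is a minimal Code-node sketch under the stated behavior (strip a possible markdown wrapper, then extract `subject` and `body` from the model's single JSON output); the input field name `content` is an assumption about where the Anthropic node leaves its text.

```javascript
// Sketch of "Parse AI Output": remove a ```json ... ``` wrapper if
// present, then pull out the subject and body strings.
const raw = items[0].json.content || ''; // illustrative field name
const cleaned = raw
  .replace(/^```(?:json)?\s*/i, '')
  .replace(/```\s*$/, '')
  .trim();

let subject = '';
let body = '';
try {
  const parsed = JSON.parse(cleaned);
  subject = parsed.subject || '';
  body = parsed.body || '';
} catch (e) {
  // Leave both fields empty so downstream nodes can flag the item for review.
}

return [{ json: { subject, body } }];
```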
by zawanah
Categorise and route emails with GPT-5
This workflow demonstrates how to use an AI text classifier to categorise incoming emails, and uses a multi-agent architecture to respond to each email category respectively.

Use cases
Business owners with a lot of incoming emails, or anyone who handles a huge influx of email.

How it works
Any incoming email is read by the text classifier powered by GPT-5 and routed according to the defined categories, where the respective agents take the next steps (see the illustrative category list at the end of this section).
1. The workflow is triggered when an email comes in.
2. GPT reads the email's "subject", "from", and "content" to route it accurately to its designated category.
3. For customer support enquiries, the customer support agent takes knowledge from the Pinecone vector database about FAQs and policies, replies via Gmail, and labels the email as "Customer Support".
4. For finance-related queries, the finance agent labels the email as "Finance" and assesses whether it is about making payments or receiving from customers. If payment-related, the email is sent to the payments team to take action; if receipts-related, it is sent to the receivables team. The user is notified via Telegram after any email is sent.
5. For sales/leads enquiries, the leads agent labels the email as "Sales Opportunities", takes knowledge from the Pinecone vector database about the business to generate a response, drafts it in Gmail, and notifies the user via Telegram to review and send. If there is not enough information for the agent to generate a response, the user is notified of this via Telegram as well.
6. Any internal team member emails are routed to the internal agent, which labels the message as "Internal" and sends the user a summary of the email via Telegram.

How to set up
1. Set up a Telegram bot via BotFather. See setup instructions here.
2. Set up the OpenAI API (credits required) here.
3. Set up an OpenRouter account. See details here.
4. Set up a Pinecone database. See details here.

Customization options
- Other than Gmail, it is possible to connect Outlook as well.
- Other than Pinecone, other vector databases should serve the same purpose, e.g., Supabase, Qdrant, Weaviate.

Requirements
- Gmail account
- Telegram bot
- Pinecone account
- OpenRouter account
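The four category names below come from the template; the one-line descriptions are illustrative assumptions of the kind of guidance you would give the text classifier, shown here as a small Code-node sketch for reference.

```javascript
// Illustrative category definitions for the text classifier.
// The category names come from the workflow; the descriptions are
// assumptions you should adapt to your own routing rules.
const categories = [
  { category: 'Customer Support', description: 'Questions about FAQs, policies, or product issues' },
  { category: 'Finance', description: 'Making payments to suppliers or receiving payments from customers' },
  { category: 'Sales Opportunities', description: 'New leads or sales enquiries' },
  { category: 'Internal', description: 'Messages from internal team members' },
];

return categories.map(c => ({ json: c }));
```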
by Muhammad Farooq Iqbal
This n8n template demonstrates how to create authentic-looking User Generated Content (UGC) advertisements using AI image generation, voice synthesis, and lip-sync technology. The workflow transforms product images into realistic customer testimonial videos that mimic genuine user reviews and social media content.

Use cases are many: generate authentic UGC-style ads for social media campaigns, create customer testimonial videos without hiring influencers, produce localized UGC content for different markets, automate TikTok/Instagram-style product reviews, or scale UGC ad production for e-commerce brands!

Good to know
- The workflow creates UGC-style content that appears genuine and authentic.
- Uses multiple AI services: OpenAI GPT-4o for analysis, ElevenLabs for voice synthesis, and WaveSpeed AI for image generation and lip-sync.
- Voice synthesis costs vary by ElevenLabs plan (typically $0.18–$0.30 per 1K characters).
- WaveSpeed AI pricing: ~$0.039 per image generation, with additional costs for lip-sync processing.
- Processing time: ~3–5 minutes per complete UGC video.
- Optimized for Malaysian-English content but easily adaptable for global markets.

How it works
1. Product Input: The Telegram bot receives product images to create UGC ads for.
2. AI Analysis: GPT-4o analyzes the product to understand brand, colors, and target demographics.
3. UGC Content Creation: AI generates authentic-sounding testimonial scripts and detailed prompts for realistic customer scenarios.
4. Character Generation: WaveSpeed AI creates believable customer avatars that look like real users reviewing products.
5. Voice Synthesis: ElevenLabs generates natural, conversational audio using gender-appropriate voice models.
6. UGC Video Production: WaveSpeed AI combines generated characters with audio to create TikTok/Instagram-style review videos.
7. Content Delivery: Final UGC videos are delivered via Telegram, ready for social media posting.

The workflow produces UGC-style content that maintains authenticity while showcasing products in realistic, relatable scenarios that resonate with target audiences.

How to use
1. Set up credentials: Configure OpenAI API, ElevenLabs API, WaveSpeed AI API, Cloudinary, and Telegram Bot credentials.
2. Deploy the workflow: Import the template and activate the workflow.
3. Send product images: Use the Telegram bot to send product images you want to create UGC ads for.
4. Automatic UGC generation: The workflow automatically creates authentic-looking customer testimonial videos.
5. Receive UGC content: Get both testimonial images and final UGC videos ready for social media campaigns.

Pro tip: The workflow automatically detects product demographics and creates appropriate customer personas. For best UGC results, use clear product images that show the item in use.

Requirements
- **OpenAI API** account for GPT-4o product analysis and UGC script generation
- **ElevenLabs API** account for authentic voice synthesis (requires voice cloning credits)
- **WaveSpeed AI API** account for realistic character generation and lip-sync processing
- **Cloudinary** account for UGC content storage and hosting
- **Telegram Bot** setup for content input and delivery
- **n8n** instance (cloud or self-hosted)

Customizing this workflow
- Platform-Specific UGC: Modify prompts to create UGC content optimized for TikTok, Instagram Reels, YouTube Shorts, or Facebook Stories.
- Brand Voice: Adjust testimonial scripts and character personas to match your brand's target audience and tone.
- Regional Adaptation: Customize language, cultural references, and character demographics for different markets.
- UGC Style Variations: Create different UGC formats: unboxing videos, before/after comparisons, day-in-the-life content, or product demonstrations.
- Influencer Personas: Develop specific customer personas (age groups, lifestyles, interests) to create targeted UGC content for different audience segments.
- Content Scaling: Set up batch processing to generate multiple UGC variations for A/B testing different approaches and styles.
by explorium
Inbound Agent – AI-Powered Lead Qualification with Product Usage Intelligence

This n8n workflow automatically qualifies and scores inbound leads by combining their product usage patterns with deep company intelligence. The workflow pulls new leads from your CRM, analyzes which API endpoints they've been testing, enriches them with firmographic data, and generates comprehensive qualification reports with personalized talking points, giving your sales team everything they need to prioritize and convert high-quality leads.

DEMO: Template Demo

Credentials Required
To use this workflow, set up the following credentials in your n8n environment:

Salesforce
- **Type:** OAuth2 or Username/Password
- **Used for:** Pulling lead reports and creating follow-up tasks
- Alternative CRM options: HubSpot, Zoho, Pipedrive
- Get credentials at Salesforce Setup

Databricks (or Analytics Platform)
- **Type:** HTTP Request with Bearer Token
- **Header:** Authorization
- **Value:** Bearer YOUR_DATABRICKS_TOKEN
- **Used for:** Querying product usage and API endpoint data
- Alternative options: Datadog, Mixpanel, Amplitude, custom data warehouse

Explorium API
- **Type:** Generic Header Auth
- **Header:** Authorization
- **Value:** Bearer YOUR_API_KEY
- **Used for:** Business matching and firmographic enrichment
- Get your API key at Explorium Dashboard

Explorium MCP
- **Type:** HTTP Header Auth
- **Used for:** Real-time company intelligence and supplemental research
- Connect to: https://mcp.explorium.ai/mcp

Anthropic API
- **Type:** API Key
- **Used for:** AI-powered lead qualification and analysis
- Get your API key at Anthropic Console

Go to Settings → Credentials, create these credentials, and assign them in the respective nodes before running the workflow.

Workflow Overview

Node 1: When clicking 'Execute workflow'
Manual trigger that initiates the lead qualification process.
- **Type:** Manual Trigger
- **Purpose:** On-demand execution for testing or manual runs
Alternative trigger options:
- **Schedule Trigger:** Run automatically (hourly, daily, weekly)
- **Webhook:** Trigger on CRM updates or new lead events
- **CRM Trigger:** Real-time activation when leads are created

Node 2: GET SF Report
Pulls lead data from a pre-configured Salesforce report.
- **Method:** GET
- **Endpoint:** Salesforce Analytics Reports API
- **Authentication:** Salesforce OAuth2
Returns raw Salesforce report data, including lead contact information, company names, lead source and status, created dates, and custom fields.
CRM alternatives: this node can be replaced with HubSpot, Zoho, or any CRM's reporting API.

Node 3: Extract Records
Parses the Salesforce report structure and extracts individual lead records.
Extraction logic: navigates the report's factMap['T!T'].rows structure and maps data cells to named fields.

Node 4: Extract Tenant Names
Prepares tenant identifiers for usage data queries.
Purpose: formats tenant names as SQL-compatible strings for the Databricks query.
Output: a comma-separated, quoted list: 'tenant1', 'tenant2', 'tenant3' (see the sketch at the end of this section).

Node 5: Query Databricks
Queries your analytics platform to retrieve API usage data for each lead.
- **Method:** POST
- **Endpoint:** /api/2.0/sql/statements
- **Authentication:** Bearer token in headers
- **Warehouse ID:** Your Databricks cluster ID
Platform alternatives:
- **Datadog:** Query logs via the Logs API
- **Mixpanel:** Event segmentation API
- **Amplitude:** Behavioral cohorts API
- **Custom warehouse:** PostgreSQL, Snowflake, BigQuery queries

Node 6: Split Out
Splits the Databricks result array into individual items for processing.
- **Field:** result.data_array
- **Purpose:** Transform a single response with multiple rows into separate items

Node 7: Rename Keys
Normalizes column names from the database query to readable field names.
Mapping: 0 → TenantNames, 1 → endpoints, 2 → endpointsNum

Node 8: Extract Business Names
Prepares company names for Explorium enrichment.

Node 9: Loop Over Items
Iterates through each company for individual enrichment.

Node 10: Explorium API: Match Businesses
Matches company names to Explorium's business entity database.
- **Method:** POST
- **Endpoint:** /v1/businesses/match
- **Authentication:** Header Auth (Bearer token)
Returns:
- business_id: unique Explorium identifier
- matched_businesses: array of potential matches
- Match confidence scores

Node 11: Explorium API: Firmographics
Enriches matched businesses with comprehensive company data.
- **Method:** POST
- **Endpoint:** /v1/businesses/firmographics/bulk_enrich
- **Authentication:** Header Auth (Bearer token)
Returns:
- Company name, website, description
- Industry categories (NAICS, SIC, LinkedIn)
- Size: employee count range, revenue range
- Location: headquarters address, city, region, country
- Company age and founding information
- Social profiles: LinkedIn, Twitter
- Logo and branding assets

Node 12: Merge
Combines API usage data with firmographic enrichment data.

Node 13: Organize Data as Items
Structures merged data into clean, standardized lead objects.
Data organization:
- Maps API usage by tenant name
- Maps enrichment data by company name
- Combines with the original lead information
- Creates a complete lead profile for analysis

Node 14: Loop Over Items1
Iterates through each qualified lead for AI analysis.
- **Batch Size:** 1 (analyzes leads individually)
- **Purpose:** Generate personalized qualification reports

Node 15: Get many accounts1
Fetches the associated Salesforce account for context.
- **Resource:** Account
- **Operation:** Get All
- **Filter:** Match by company name
- **Limit:** 1 record
Purpose: link lead qualification back to the Salesforce account for task creation.

Node 16: AI Agent
Analyzes each lead to generate comprehensive qualification reports.
Input data: lead contact information, API usage patterns (which endpoints were tested), firmographic data (company profile), lead source and status.
Analysis process:
- Evaluates lead quality based on usage, company fit, and signals
- Identifies which Explorium APIs the lead explored
- Assesses company size, industry, and potential value
- Detects quality signals (legitimate company email, active usage) and red flags
- Determines the optimal sales approach and timing
- Connected to Explorium MCP for supplemental company research if needed
Output: a structured qualification report with:
- **Lead Score:** High Priority, Medium Priority, Low Priority, or Nurture
- **Quick Summary:** Executive overview of lead potential
- **API Usage Analysis:** Endpoints used, usage insights, potential use case
- **Company Profile:** Overview, fit assessment, potential value
- **Quality Signals:** Positive indicators and concerns
- **Recommended Actions:** Next steps, timing, and approach
- **Talking Points:** Personalized conversation starters based on actual API usage

Node 18: Clean Outputs
Formats the AI qualification report for Salesforce task creation.

Node 19: Update Salesforce Records
Creates follow-up tasks in Salesforce with qualification intelligence.
- **Resource:** Task
- **Operation:** Create
- **Authentication:** Salesforce OAuth2
Alternative output options:
- **HubSpot:** Create tasks or update deal stages
- **Outreach/SalesLoft:** Add to sequences with custom messaging
- **Slack:** Send qualification reports to sales channels
- **Email:** Send reports to account owners
- **Google Sheets:** Log qualified leads for tracking

Workflow Flow Summary
1. Trigger: Manual execution or scheduled run
2. Pull leads: Fetch new/updated leads from a Salesforce report
3. Extract: Parse lead records and tenant identifiers
4. Query usage: Retrieve API endpoint usage data from the analytics platform
5. Prepare: Format data for enrichment
6. Match: Identify companies in the Explorium database
7. Enrich: Pull comprehensive firmographic data
8. Merge: Combine usage patterns with company intelligence
9. Organize: Structure complete lead profiles
10. Analyze: AI evaluates each lead with quality scoring
11. Format: Structure qualification reports for the CRM
12. Create tasks: Automatically populate Salesforce with actionable intelligence

This workflow eliminates manual lead research and qualification, automatically analyzing product engagement patterns alongside company fit to help sales teams prioritize and personalize their outreach to the highest-value inbound leads.

Customization Options

Flexible triggers – replace the manual trigger with:
- **Schedule:** Run hourly/daily to continuously qualify new leads
- **Webhook:** Real-time qualification when leads are created
- **CRM Trigger:** Activate on specific lead status changes

Analytics platform integration – the Databricks query can be adapted for:
- **Datadog:** Query application logs and events
- **Mixpanel:** Analyze user behavior and feature adoption
- **Amplitude:** Track product engagement metrics
- **Custom databases:** PostgreSQL, MySQL, Snowflake, BigQuery

CRM flexibility – works with multiple CRMs:
- **Salesforce:** Full integration (pull reports, create tasks)
- **HubSpot:** Contact properties and deal updates
- **Zoho:** Lead enrichment and task creation
- **Pipedrive:** Deal qualification and activity creation

Enrichment depth – add more Explorium endpoints:
- **Technographics:** Tech stack and product usage
- **News & Events:** Recent company announcements
- **Funding Data:** Investment rounds and financial events
- **Hiring Signals:** Job postings and growth indicators

Output destinations – route qualification reports to:
- **CRM updates:** Salesforce, HubSpot (update lead scores/fields)
- **Task creation:** Any CRM task/activity system
- **Team notifications:** Slack, Microsoft Teams, email
- **Sales tools:** Outreach, SalesLoft sequences
- **Reporting:** Google Sheets, Data Studio dashboards

AI model options – swap AI providers:
- Default: Anthropic Claude (Sonnet 4)
- Alternatives: OpenAI GPT-4, Google Gemini

Setup Notes
- Salesforce report configuration: create a report with the required fields (name, email, company, tenant ID) and use its API endpoint.
- Tenant identification: ensure your product usage data includes identifiers that link to CRM leads.
- Usage data query: customize the SQL query to match your database schema and table structure.
- MCP configuration: Explorium MCP requires Header Auth; configure the credentials properly.
- Lead scoring logic: adjust the AI system prompts to match your ideal customer profile and qualification criteria.
- Task assignment: configure Salesforce task assignment rules or add logic to route to specific sales reps.

This workflow acts as an intelligent lead qualification system that combines behavioral signals (what they're testing) with firmographic fit (who they are) to give sales teams actionable intelligence for every inbound lead.
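As a reference for Node 4 (Extract Tenant Names), here is a minimal Code-node sketch that builds the quoted, comma-separated list the Databricks SQL IN clause expects; the incoming `tenantName` field is an illustrative assumption about how the lead records are shaped.

```javascript
// Sketch of "Extract Tenant Names": collect the tenant identifier from
// each lead item and emit one quoted, comma-separated string for the
// SQL IN clause. Escape single quotes to keep the SQL well-formed.
const tenants = items
  .map(item => item.json.tenantName) // illustrative field name
  .filter(Boolean)
  .map(name => `'${String(name).replace(/'/g, "''")}'`);

// e.g. "'tenant1', 'tenant2', 'tenant3'"
return [{ json: { tenantList: tenants.join(', ') } }];
```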
by Madame AI
Automate social media content aggregation to a Telegram channel

This n8n template automatically aggregates and analyzes key updates from your social media platform's home page, delivering them as curated posts to a Telegram channel. This workflow is perfect for digital marketers, brand managers, data analysts, or anyone busy who wants to monitor real-time trends and competitor activity without manual effort.

How it works
1. The workflow is triggered automatically on a schedule to aggregate the latest social media posts.
2. A series of If and Wait nodes monitor the data processing job until the full data is ready.
3. An AI Agent, powered by Google Gemini, refines the content by summarizing posts and removing duplicates.
4. An If node checks for an image in the post to decide whether a photo or a text message should be sent.
5. Finally, the curated posts are sent to your Telegram channel as rich media messages.

How to use
1. Set up the BrowserAct template: In your BrowserAct account, set up the "Twitter/X Content Aggregation" template.
2. Set up credentials: Add your credentials for BrowserAct in the Run node, Google Gemini in the Agent node, and Telegram in the Send node.
3. Add the workflow ID: Change the workflow_id value inside the HTTP Request in the Run node to match the one from your BrowserAct workflow.
4. Activate the workflow: To enable the automated schedule, simply activate the workflow.

Requirements
- **BrowserAct** API account
- BrowserAct **"Twitter/X Content Aggregation"** template
- **Gemini** account
- **Telegram** credentials

Customizing this workflow
This workflow provides a powerful foundation for social media monitoring. You could:
- Replace the Telegram node with an email or Slack node to send notifications to a different platform.
- Add more detailed prompts to the AI Agent for more specific analysis or summarization.
- Customize the BrowserAct workflow to fit your needs.

Need help?
- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

Workflow Guidance and Showcase: Automate Your Social Media – Get All X/Twitter Updates Directly in Telegram!
by Pawan
This template sets up a scheduled automation that scrapes the latest news from The Hindu website, uses a Google Gemini AI agent to filter and analyze the content for relevance to competitive exams like the UPSC Civil Services Examination (CSE) syllabus, and compiles a structured daily digest directly into a Google Sheet. It saves hours of manual reading and note-taking by providing concise summaries, subject categorization, and explicit UPSC importance notes.

Who's it for
This workflow is essential for:
- UPSC/CSE aspirants who require a curated, focused, and systematic daily current affairs digest.
- Coaching institutes aiming to instantly generate structured, high-quality study material for their students.
- Educators and content creators focused on governance, economy, international relations, and science & technology.

How it works / What it does
This workflow runs automatically every morning (scheduled for 7 AM by default) to generate a ready-to-study current affairs document.
1. Scraping: The Schedule Trigger fires an HTTP Request to fetch the latest news links from The Hindu's front page.
2. Data curation: The HTML and Code in JavaScript nodes work together to extract and pair every article URL with its title (see the sketch at the end of this section).
3. Content retrieval: For each identified link, a second HTTP Request node fetches the entire article body.
4. AI analysis and filtering: The AI Agent uses a detailed prompt and the Google Gemini Chat Model to perform two critical tasks:
   - Filter: It filters out all irrelevant articles (e.g., sports results, local crime) to keep only the 5-6 most important UPSC-relevant pieces (polity, economy, IR, etc.).
   - Analyze: For the selected articles, it generates a brief summary, identifies the main subject, and clearly articulates why it is important for the UPSC exam.
5. Storage: The AI Agent calls the integrated Google Sheets tool to automatically append the structured, analyzed data to your designated Google Sheet, creating your daily ready-made notes.

Requirements
To deploy this workflow, you need:
- n8n account (cloud or self-hosted).
- Google Gemini API key: for connecting the Google Gemini Chat Model and powering the AI Agent.
- Google Sheets credentials: for reading/writing the final compiled digest.
- Target Google Sheet: a spreadsheet with the following columns: Date, URL, Subject, Brief Summary, and What is Important.

How to set up
- **Credentials setup:** Connect your Google Gemini and Google Sheets accounts via the n8n Credentials Manager.
- **Google Sheet linking:** In the **Append row in sheet** and **Append row in sheet in Google Sheets1** nodes, replace the placeholder IDs and GIDs with the actual ID and sheet name of your dedicated UPSC notes spreadsheet.
- **Scheduling:** Adjust the time in the **Schedule Trigger: Daily at 7 AM** node if you want the daily analysis to run at a different hour.
- **AI customization (optional):** You can refine the system message in the **AI Agent: Filter & Analyze UPSC News** node to focus the analysis on specific exam phases (e.g., Prelims only) or adjust the priority of subjects.
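To illustrate the data-curation step, here is a minimal sketch of the kind of logic the "Code in JavaScript" node applies to pair URLs with titles; the input field names (`urls`, `titles`) are assumptions about how the upstream HTML node is configured.

```javascript
// Sketch of the URL/title pairing step: combine the link and title
// arrays extracted by the HTML node into one item per article.
const { urls = [], titles = [] } = items[0].json;

return urls.map((url, i) => ({
  json: {
    // Resolve relative links against the site root.
    url: url.startsWith('http') ? url : `https://www.thehindu.com${url}`,
    title: (titles[i] || '').trim(),
  },
}));
```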
by Karol
How it works
This workflow turns any URL sent to a Telegram bot into ready-to-publish social posts:
1. Trigger: Telegram message (checks whether it contains a URL; see the sketch at the end of this section).
2. Fetch & parse: Downloads the page and extracts readable text + title.
3. AI writing: Generates platform-specific copy (Facebook, Instagram, LinkedIn).
4. Image: Creates an AI image and stores it in Supabase Storage.
5. Publish: Posts to Facebook Pages, Instagram Business, LinkedIn.
6. Logging: Updates Google Sheets with post URLs and sends a Telegram confirmation (image + links).

Setup
1. Telegram – create a bot and connect it via n8n Telegram credentials.
2. OpenAI / Gemini – add the API key in n8n Credentials and select it in the AI nodes.
3. Facebook/Instagram (Graph API) – create a credential called facebookGraph with:
   • accessToken (page-scoped or system user)
   • pageId (for Facebook Page photos)
   • igUserId (Instagram Business account ID)
   • optional fbApiVersion (default v19.0)
4. LinkedIn – connect with OAuth2 in the LinkedIn node (leave as credential).
5. Supabase – credential supabase with url and apiKey. Ensure a bucket exists (the default used in the Set node is social-media).
6. Google Sheets – replace YOUR_GOOGLE_SHEET_ID and Sheet1, and grant your n8n Google OAuth2 access.

Notes
• No API keys are stored in the template; everything runs via n8n Credentials.
• You can change the bucket name, image size/quality, and AI prompts in the respective nodes.
• The Telegram confirmation message includes direct permalinks to the published posts.

Required credentials
• Telegram Bot
• OpenAI (or Gemini)
• Facebook/Instagram Graph
• LinkedIn OAuth2
• Supabase (url + apiKey)
• Google Sheets OAuth2

Inputs
• A Telegram message that contains a URL.

Outputs
• Social posts published on Facebook, Instagram, and LinkedIn.
• Row appended/updated in Google Sheets with post URLs and image link.
• Telegram confirmation with the generated image + post links.
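For reference, the URL check in the trigger branch might look like this Code-node sketch: it pulls the first http(s) link out of the Telegram message text, or flags the item so the workflow can stop. The Telegram update shape (`message.text`) follows the Bot API; the output field names are illustrative.

```javascript
// Sketch of the URL check on the incoming Telegram update: extract
// the first http(s) link from the message text, if any.
const text = items[0].json.message?.text || '';
const match = text.match(/https?:\/\/\S+/);

return [{
  json: {
    hasUrl: Boolean(match),
    url: match ? match[0] : null,
  },
}];
```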
by Mariela Slavenova
This template crawls a website from its sitemap, deduplicates URLs in Supabase, scrapes pages with Crawl4AI, cleans and validates the text, then stores content + metadata in a Supabase vector store using OpenAI embeddings. It's a reliable, repeatable pipeline for building searchable knowledge bases, SEO research corpora, and RAG datasets.

Good to know
• Built-in de-duplication via a scrape_queue table (status: pending/completed/error).
• Resilient flow: waits, retries, and marks failed tasks.
• Costs depend on Crawl4AI usage and OpenAI embeddings.
• Replace any placeholders (API keys, tokens, URLs) before running.
• Respect website robots/ToS and applicable data laws when scraping.

How it works
1. Sitemap fetch & parse – Load sitemap.xml, extract all URLs.
2. De-dupe – Normalize URLs, check the Supabase scrape_queue, and insert only new ones (see the sketch at the end of this section).
3. Scrape – Send URLs to Crawl4AI; poll task status until completed.
4. Clean & score – Remove boilerplate/markup, detect content type, compute quality metrics, extract metadata (title, domain, language, length).
5. Chunk & embed – Split text, create OpenAI embeddings.
6. Store – Upsert into the Supabase vector store (documents) with metadata; update job status.

Requirements
• Supabase (Postgres + vector extension enabled)
• Crawl4AI API key (or header auth)
• OpenAI API key (for embeddings)
• n8n credentials set for HTTP and Postgres/Supabase

How to use
1. Configure credentials (Supabase/Postgres, Crawl4AI, OpenAI).
2. (Optional) Run the provided SQL to create scrape_queue and documents.
3. Set your sitemap URL in the HTTP Request node.
4. Execute the workflow (manual trigger) and monitor the Supabase statuses.
5. Query your documents table or vector store from your app/RAG stack.

Potential use cases
This automation is ideal for:
• Market research teams collecting competitive data
• Content creators monitoring web trends
• SEO specialists tracking website content updates
• Analysts gathering structured data for insights
• Anyone needing reliable, structured web content for analysis

Need help customizing? Contact me for consulting and support: LinkedIn
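To make the de-dupe step concrete, here is a minimal Code-node sketch of URL normalization before the scrape_queue lookup, so trivial variants (fragments, trailing slashes, host casing) do not create duplicate jobs; the incoming `url` field name is an assumption about the sitemap-parsing node's output.

```javascript
// Sketch of the normalization applied before checking scrape_queue:
// canonicalize each sitemap URL so near-duplicates collapse to one key.
function normalizeUrl(raw) {
  const u = new URL(raw);
  u.hash = '';                                        // drop fragments
  u.hostname = u.hostname.toLowerCase();              // case-insensitive host
  u.pathname = u.pathname.replace(/\/+$/, '') || '/'; // trim trailing slash
  return u.toString();
}

return items.map(item => ({
  json: { ...item.json, normalizedUrl: normalizeUrl(item.json.url) },
}));
```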
by Growth AI
SEO Content Generation Workflow – n8n Template Instructions

Who's it for
This workflow is designed for SEO professionals, content marketers, digital agencies, and businesses who need to generate optimized meta tags, H1 headings, and content briefs at scale. Perfect for teams managing multiple clients or large keyword lists who want to automate competitor analysis and SEO content creation while maintaining quality and personalization.

How it works
The workflow automates the entire SEO content creation process by analyzing your target keywords against top competitors, then generating optimized meta elements and comprehensive content briefs. It uses AI-powered analysis combined with real competitor data to create SEO-friendly content that's tailored to your specific business context. The system processes keywords in batches, performs Google searches, scrapes competitor content, analyzes heading structures, and generates personalized SEO content using your company's database information for maximum relevance.

Requirements

Required services and credentials
- **Google Sheets API**: For reading configuration and updating results
- **Anthropic API**: For AI content generation (Claude Sonnet 4)
- **OpenAI API**: For embeddings and vector search
- **Apify API**: For Google search results
- **Firecrawl API**: For competitor website scraping
- **Supabase**: For the vector database (optional but recommended)

Template spreadsheet
Copy this template spreadsheet and configure it with your information: Template Link

How to set up

Step 1: Copy and configure the template
1. Make a copy of the template spreadsheet.
2. Fill in the Client Information sheet:
   - Client name: Your company or client's name
   - Client information: Brief business description
   - URL: Website address
   - Supabase database: Database name (prevents AI hallucination)
   - Tone of voice: Content style preferences
   - Restrictive instructions: Topics or approaches to avoid
3. Complete the SEO sheet with your target pages:
   - Page: Page you're optimizing (e.g., "Homepage", "Product Page")
   - Keyword: Main search term to target
   - Awareness level: User familiarity with your business
   - Page type: Category (homepage, blog, product page, etc.)
Step 2: Import the workflow
1. Import the n8n workflow JSON file.
2. Configure all required API credentials in n8n:
   - Google Sheets OAuth2
   - Anthropic API key
   - OpenAI API key
   - Apify API key
   - Firecrawl API key
   - Supabase credentials (if using the vector database)

Step 3: Test the configuration
1. Activate the workflow.
2. Send your Google Sheets URL to the chat trigger.
3. Verify that all sheets are readable and credentials work.
4. Test with a single keyword row first.

Workflow Process Overview
- Phase 0: Setup and configuration – copy the template spreadsheet, configure client information and SEO parameters, set up API credentials in n8n.
- Phase 1: Data input and processing – the chat trigger receives the Google Sheets URL, the system reads the client configuration and SEO data, filters valid keywords and empty H1 fields, and initiates batch processing.
- Phase 2: Competitor research and analysis – searches Google for the top 10 results per keyword, scrapes the first 5 competitor websites, extracts heading structures (H1-H6), and analyzes competitor meta tags and content organization. (A sketch of the heading-extraction step follows this section.)
- Phase 3: Meta tags and H1 generation – the AI analyzes keyword context and competitor data, accesses the client database for personalization, generates an optimized meta title (65 chars max), creates a compelling meta description (165 chars max), and produces a user-focused H1 (70 chars max).
- Phase 4: Content brief creation – analyzes search intent percentages, develops a content strategy based on competitor analysis, creates a detailed MECE page structure, suggests rich media elements, and provides writing recommendations and detail-level scoring.
- Phase 5: Data integration and updates – combines all generated content into a unified structure, updates Google Sheets with the new SEO elements, preserves existing data while adding new content, and continues batch processing for the remaining keywords.

How to customize the workflow
- Adjusting AI models: Replace Anthropic Claude with other LLM providers; modify system prompts for different content styles; adjust character limits for meta elements.
- Modifying competitor analysis: Change the number of competitors analyzed (currently 5); adjust scraping parameters in the Firecrawl nodes; modify the heading-extraction logic in the JavaScript nodes.
- Customizing output format: Update the Google Sheets column mapping in the Code node; modify the structured output parser schema; change the batch size in the Split in Batches node.
- Adding quality controls: Insert validation nodes between phases; add error handling and retry logic; implement content quality scoring.
- Extending functionality: Add keyword research capabilities; include image optimization suggestions; integrate social media content generation; connect to CMS platforms for direct publishing.

Best practices
- Test with small batches before processing large keyword lists.
- Monitor API usage and costs across all services.
- Regularly update system prompts based on output quality.
- Maintain clean data in your Google Sheets template.
- Use descriptive node names for easier workflow maintenance.

Troubleshooting
- **API errors**: Check credential configuration and usage limits.
- **Scraping failures**: Firecrawl nodes have error handling enabled.
- **Empty results**: Verify keyword formatting and competitor availability.
- **Sheet updates**: Ensure proper column mapping in the final Code node.
- **Processing stops**: Check batch processing limits and timeout settings.
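As a reference for the heading-extraction logic, here is a minimal Code-node sketch that walks Firecrawl's markdown output and collects H1-H6 lines with their levels; the `markdown` input field name is an assumption about the scraper node's output.

```javascript
// Sketch of heading extraction from a competitor page's markdown:
// collect each heading with its level so the AI agent can compare
// competitor page structures.
const markdown = items[0].json.markdown || ''; // illustrative field name

const headings = [];
for (const line of markdown.split('\n')) {
  const m = line.match(/^(#{1,6})\s+(.*)$/);
  if (m) headings.push({ level: m[1].length, text: m[2].trim() });
}

return [{ json: { headings } }];
```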
by Growth AI
SEO Content Generation Workflow (Basic Version) – n8n Template Instructions

Who's it for
This workflow is designed for SEO professionals, content marketers, digital agencies, and businesses who need to generate optimized meta tags, H1 headings, and content briefs at scale. Perfect for teams managing multiple clients or large keyword lists who want to automate competitor analysis and SEO content creation without the complexity of vector databases.

How it works
The workflow automates the entire SEO content creation process by analyzing your target keywords against top competitors, then generating optimized meta elements and comprehensive content briefs. It uses AI-powered analysis combined with real competitor data to create SEO-friendly content that's tailored to your specific business context. The system processes keywords in batches, performs Google searches, scrapes competitor content, analyzes heading structures, and generates personalized SEO content using your company information for maximum relevance.

Requirements

Required services and credentials
- **Google Sheets API**: For reading configuration and updating results
- **Anthropic API**: For AI content generation (Claude Sonnet 4)
- **Apify API**: For Google search results
- **Firecrawl API**: For competitor website scraping

Template spreadsheet
Copy this template spreadsheet and configure it with your information: Template Link

How to set up

Step 1: Copy and configure the template
1. Make a copy of the template spreadsheet.
2. Fill in the Client Information sheet:
   - Client name: Your company or client's name
   - Client information: Brief business description
   - URL: Website address
   - Tone of voice: Content style preferences
   - Restrictive instructions: Topics or approaches to avoid
3. Complete the SEO sheet with your target pages:
   - Page: Page you're optimizing (e.g., "Homepage", "Product Page")
   - Keyword: Main search term to target
   - Awareness level: User familiarity with your business
   - Page type: Category (homepage, blog, product page, etc.)
Step 2: Import the workflow
1. Import the n8n workflow JSON file.
2. Configure all required API credentials in n8n:
   - Google Sheets OAuth2
   - Anthropic API key
   - Apify API key
   - Firecrawl API key

Step 3: Test the configuration
1. Activate the workflow.
2. Send your Google Sheets URL to the chat trigger.
3. Verify that all sheets are readable and credentials work.
4. Test with a single keyword row first.

Workflow Process Overview
- Phase 0: Setup and configuration – copy the template spreadsheet, configure client information and SEO parameters, set up API credentials in n8n.
- Phase 1: Data input and processing – the chat trigger receives the Google Sheets URL, the system reads the client configuration and SEO data, filters valid keywords and empty H1 fields, and initiates batch processing.
- Phase 2: Competitor research and analysis – searches Google for the top 10 results per keyword using Apify, scrapes the first 5 competitor websites using Firecrawl, extracts heading structures (H1-H6) from competitor pages, analyzes competitor meta tags and content organization, and processes markdown content to identify heading hierarchies.
- Phase 3: Meta tags and H1 generation – Claude analyzes keyword context and competitor data, incorporates client information for personalization, generates an optimized meta title (65 characters maximum), creates a compelling meta description (165 characters maximum), produces a user-focused H1 (70 characters maximum), and uses structured output parsing for consistent formatting.
- Phase 4: Content brief creation – analyzes search intent percentages (informational, transactional, navigational), develops a content strategy based on competitor analysis, creates a detailed MECE page structure with H2 and H3 sections, suggests rich media elements (images, videos, infographics, tables), provides writing recommendations and detail-level scoring (1-10 scale), and ensures SEO optimization while maintaining user relevance.
- Phase 5: Data integration and updates – combines all generated content into a unified structure, updates Google Sheets with the new SEO elements, preserves existing data while adding new content, and continues batch processing for the remaining keywords.

Key Differences from the Advanced Version
This basic version focuses on core SEO functionality without additional complexity:
- **No vector database**: Removes the Supabase integration for simpler setup
- **Streamlined architecture**: Fewer dependencies and configuration steps
- **Essential features only**: Core competitor analysis and content generation
- **Faster setup**: Reduced time to deployment
- **Lower costs**: Fewer API services required

How to customize the workflow
- Adjusting AI models: Replace Anthropic Claude with other LLM providers in the agent nodes; modify system prompts for different content styles or languages; adjust character limits for meta elements in the structured output parser.
- Modifying competitor analysis: Change the number of competitors analyzed (currently 5) by adding/removing Scrape nodes; adjust scraping parameters in the Firecrawl nodes for different content types; modify the heading-extraction logic in the JavaScript Code nodes.
- Customizing output format: Update the Google Sheets column mapping in the final Code node; modify the structured output parser schema for different data structures; change the batch size in the Split in Batches node.
- Adding quality controls: Insert validation nodes between workflow phases; add error handling and retry logic to critical nodes; implement content quality scoring mechanisms (a sketch of a simple length-validation node follows this section).
- Extending functionality: Add keyword research capabilities with additional APIs; include image optimization suggestions; integrate social media content generation; connect to CMS platforms for direct publishing.

Best Practices

Setup and testing
- Always test with small batches before processing large keyword lists.
- Monitor API usage and costs across all services.
- Regularly update system prompts based on output quality.
- Maintain clean data in your Google Sheets template.

Content quality
- Review generated content before publishing.
- Customize system prompts to match your brand voice.
- Use descriptive node names for easier workflow maintenance.
- Keep competitor analysis current by running the workflow regularly.

Performance optimization
- Process keywords in small batches to avoid timeouts.
- Set appropriate retry policies for external API calls.
- Monitor workflow execution times and optimize bottlenecks.

Troubleshooting

Common issues and solutions
- API errors: Check the credential configuration in n8n settings; verify API usage limits and billing status; ensure proper authentication for each service.
- Scraping failures: Firecrawl nodes have error handling enabled to continue on failures; some websites may block scraping, which is normal behavior; check that competitor URLs are accessible and valid.
- Empty results: Verify keyword formatting in Google Sheets; ensure competitor websites contain the expected content structure; check that meta tags are properly formatted in the system prompts.
- Sheet update errors: Ensure proper column mapping in the final Code node; verify Google Sheets permissions and sharing settings; check that target sheet names match exactly.
- Processing stops: Review batch processing limits and timeout settings; check for errors in individual nodes using the execution logs; verify all required fields are populated in the input data.

Template Structure

Required sheets
- Client Information: Business details and configuration
- SEO: Target keywords and page information
- Results sheet: Where generated content will be written

Expected columns
- **Keywords**: Target search terms
- **Description**: Brief page description
- **Type de page**: Page category
- **Awareness level**: User familiarity level
- **title, meta-desc, h1, brief**: Generated output columns

This streamlined version provides all essential SEO content generation capabilities while being easier to set up and maintain than the advanced version with its vector database integration.
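As an example of the quality controls mentioned above, here is a minimal validation Code-node sketch enforcing the template's stated character limits (meta title ≤ 65, meta description ≤ 165, H1 ≤ 70) against the output columns (title, meta-desc, h1) before rows are written back to Google Sheets; the `valid`/`issues` output fields are illustrative.

```javascript
// Sketch of an optional length-validation node: flag any generated
// element that exceeds the template's character limits.
const limits = { title: 65, 'meta-desc': 165, h1: 70 };

return items.map(item => {
  const issues = Object.entries(limits)
    .filter(([field, max]) => String(item.json[field] || '').length > max)
    .map(([field, max]) => `${field} exceeds ${max} characters`);

  return { json: { ...item.json, valid: issues.length === 0, issues } };
});
```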