by Yaron Been
Amplify Social Media Presence with O3 and GPT-4 Multi-Agent Team

🌍 Overview

This n8n workflow acts like a virtual social media department. A Social Media Director Agent coordinates multiple specialized AI agents (Instagram, Twitter/X, Facebook, TikTok, YouTube, and Analytics). Each agent creates or analyzes content for its platform, powered by OpenAI models. The result? A fully automated, cross-platform social media strategy, from content creation to performance tracking.

🟢 Section 1: Trigger & Director Setup

🔗 Nodes:
- **When chat message received (Trigger)** → Starts the workflow whenever you send a request (e.g., "Plan a TikTok campaign for my product launch").
- **Social Media Director Agent** (connected to the **OpenAI O3 model**) → Acts as the strategist.
- **Think Tool** → Helps the Director "reason" before delegating.

💡 Beginner takeaway: This section makes your workflow interactive. You send a request → the Director decides the best approach → then it assigns tasks.

📈 Advantage: Instead of manually planning content per platform, you send one command and the AI Director handles the strategy.

🟦 Section 2: Specialized Social Media Agents

🔗 Nodes (each paired with GPT-4.1-mini):
- 📸 **Instagram Content Creator** → Visual storytelling, Reels, hashtags
- 🐦 **Twitter/X Strategist** → Viral tweets, trends, engagement
- 👥 **Facebook Community Manager** → Audience growth, ads, group engagement
- 🎵 **TikTok Video Creator** → Short-form video, viral trends
- 🎬 **YouTube Content Planner** → Long-form strategy, SEO, thumbnails
- 📊 **Analytics Specialist** → Performance insights, cross-platform reporting

💡 Beginner takeaway: Each platform has its own AI expert. They receive the Director's strategy and produce tailored content for their platform.

📈 Advantage: Instead of one-size-fits-all posts, you get optimized content per platform, increasing reach and engagement.

🟣 Section 3: Models & Connections

🔗 Nodes: **OpenAI Chat Models** (O3 + multiple GPT-4.1-mini models). Each model is connected to its respective agent.

💡 Beginner takeaway: Think of these as the "brains" behind each specialist. The Director uses O3 for advanced reasoning, while the specialists use GPT-4.1-mini (cheaper and faster) for content generation.

📈 Advantage: This keeps costs low while maintaining quality output.

📊 Final Overview Table

| Section | Nodes | Purpose | Benefit |
| --- | --- | --- | --- |
| 🟢 Trigger & Director | Chat Trigger, Director, Think Tool | Capture requests & plan strategy | One command → full social plan |
| 🟦 Specialists | Instagram, Twitter, Facebook, TikTok, YouTube, Analytics | Platform-specific content | Optimized posts per platform |
| 🟣 Models | O3 + GPT-4.1-mini | Provide reasoning & content generation | High-quality, cost-efficient |

🚀 Why This Workflow is Powerful

- **Multi-platform coverage**: All major platforms handled in one flow
- **Human-like strategy**: The Director agent makes real marketing decisions
- **Scalable & fast**: Generate a full campaign in minutes
- **Cost-effective**: O3 only for strategy, GPT-4.1-mini for bulk content
- **Beginner-friendly**: Just type your request → get full campaign output
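The cost-tier split above (O3 for the one strategic call, GPT-4.1-mini for the many content calls) can be sketched as a simple per-role model map. This is an illustrative sketch of the pattern, not the actual n8n agent configuration; the role names are assumptions.

```python
# Per-role model routing: the director gets the expensive reasoning model,
# every specialist defaults to the cheap, fast content model.
ROLE_MODELS = {
    "director": "o3",              # advanced reasoning, called once per request
    "instagram": "gpt-4.1-mini",   # bulk content generation
    "twitter": "gpt-4.1-mini",
    "analytics": "gpt-4.1-mini",
}

def pick_model(role: str) -> str:
    """Return the model a given agent role should call, defaulting to the cheap tier."""
    return ROLE_MODELS.get(role, "gpt-4.1-mini")
```

Because only one O3 call happens per request while the specialists fan out on the cheap tier, total cost stays close to the mini-model price even for multi-platform campaigns.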
by Dr. Firas
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Automate Video Creation from Voice Input with HeyGen & n8n

👥 Who is this for?

This workflow is ideal for:
- Content creators who want to turn ideas into videos in minutes
- Marketers managing multi-platform video campaigns
- Agencies needing scalable video workflows for multiple clients
- Entrepreneurs looking to automate their social media presence

💡 What problem is this workflow solving?

Creating and publishing videos across TikTok, YouTube, Instagram, and more is:
- Time-consuming (writing scripts, creating videos, uploading manually)
- Inconsistent (different platforms, formats, captions)
- Hard to scale without automation

This workflow solves that by turning a voice note into a complete AI video: scripted, generated, and published automatically.

⚙️ What this workflow does

1. Capture an idea via Telegram voice note
2. Transcribe the audio to text using OpenAI Whisper
3. Generate the script, title, and caption with GPT-5
4. Create an avatar video with HeyGen based on your script
5. Save the final video to Google Drive and log metadata in Google Sheets
6. Upload the video to Blotato
7. Auto-publish to 9 platforms (TikTok, YouTube Shorts, Instagram, LinkedIn, Facebook, Twitter (X), Threads, Bluesky, Pinterest)
8. Send a Telegram notification once published

🧰 Setup

Before you start, you'll need:
- ✅ A Telegram Bot connected to n8n
- 🔑 An OpenAI API key (Whisper + GPT-5)
- 🎭 A HeyGen API key for avatar video generation
- 📂 Google Drive + Sheets integrations for storage & logs
- 🧩 A Blotato Pro account with API access enabled
- 📦 Verified Community Nodes enabled in the n8n Admin Panel
- ⚙️ The Blotato node installed + credentials configured
- 📄 A Google Sheet template to log titles, captions, and video links

🛠️ How to customize this workflow

- **Change prompts** → Adjust the GPT-5 prompts to fit your tone or brand
- **Select avatars** → Configure the HeyGen avatar and voice to match your style
- **Choose platforms** → Activate only TikTok, YouTube, Instagram, etc.
- **Add approvals** → Insert a Telegram or Slack approval step before publishing
- **Extend reporting** → Push analytics or engagement data into Sheets or Notion

This workflow transforms a simple voice message into a ready-to-publish viral video: fully automated, consistent, and scalable.

📄 Documentation: Notion Guide

Need help customizing? Contact me for consulting and support: LinkedIn / YouTube
by Dr. Firas
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Automate Social Media with HeyGen and GPT-5: Publish Videos to TikTok, YouTube & Instagram

👥 Who is this for?

This workflow is designed for:
- Content creators who want to scale their short-form video production
- Marketing teams seeking consistent, automated publishing pipelines
- Agencies managing multiple social accounts for clients
- Entrepreneurs looking to save time by automating repetitive content tasks

💡 What problem is this workflow solving?

Publishing on multiple platforms like YouTube Shorts, TikTok, and Instagram is often:
- Time-consuming (manual editing, caption writing, uploads)
- Inconsistent (different requirements for each platform)
- Prone to delays (switching between tools)

This workflow solves these issues by creating a fully automated video pipeline powered by GPT-5, HeyGen, and Blotato.

⚙️ What this workflow does

1. Capture a voice idea via Telegram
2. Transcribe voice to text using OpenAI Whisper
3. Generate a catchy title and caption with GPT-5
4. Create an AI avatar video with HeyGen
5. Save and organize assets in Google Drive and Google Sheets
6. Upload the final video to Blotato
7. Auto-publish to YouTube Shorts, TikTok, and Instagram (optional: Facebook, X/Twitter, LinkedIn, Pinterest, Threads, Bluesky)
8. Update logs in Google Sheets
9. Send a Telegram confirmation once published

🧰 Setup

Before using this workflow, ensure you have:
- A Telegram Bot connected to n8n for voice input
- An OpenAI API key for transcription (Whisper) and GPT-5 processing
- A HeyGen account & API key for avatar video generation
- A Google Drive & Google Sheets integration for storing assets and logs
- A Blotato account (Pro plan) with API access enabled
- Verified Community Nodes enabled in the n8n Admin Panel
- The Blotato node installed and credentials configured

🛠️ How to customize this workflow

- **Prompts** → Adjust the GPT-5 prompts to match your brand voice and niche
- **Avatars** → Use custom avatars or voices via the HeyGen configuration
- **Platforms** → Activate only the social nodes you need (e.g., focus on TikTok & YouTube Shorts)
- **Approval steps** → Add a Telegram or Slack confirmation before publishing
- **Analytics** → Extend the workflow to track engagement data in Google Sheets, Airtable, or Notion

This workflow turns a simple spoken idea into a viral-ready video, automatically generated, styled, and posted across your most important platforms.

📄 Documentation: Notion Guide

Need help customizing? Contact me for consulting and support: LinkedIn / YouTube
by Vasu Gupta
AI Video Generator & Social Media Publisher (Telegram Bot)

Turn a single text message into a fully produced video and publish it across 5 social media platforms automatically. This workflow combines AI video generation (Fal.ai) with a "Human-in-the-loop" approval system via Telegram, ensuring you never post content without checking it first.

Who is this for?
- Social media managers who want to automate Short/Reel creation.
- Content creators looking to scale their output on Instagram, TikTok, and YouTube.
- Brands that need consistent, on-brand video content generated from simple text prompts.

How it works
1. Trigger: Send a text message to your Telegram bot (e.g., "A futuristic city with neon lights").
2. AI Drafting: An AI Agent (GPT-4o) converts your message into a detailed cinematic video prompt based on your brand guidelines.
3. Generation: Fal.ai (Veo3 model) generates a high-quality 9:16 video.
4. Approval Loop: The bot sends you the video preview.
   - Reply "Great": the workflow proceeds to publishing.
   - Reply with feedback (e.g., "Too dark, make it brighter"): the AI refines the prompt and regenerates the video.
5. Publishing: Once approved, the workflow generates platform-specific captions and hashtags, logs the post to Google Sheets, and uploads the video to Blotato to publish on Instagram, Facebook, LinkedIn, TikTok, and YouTube Shorts.

Requirements
- **OpenAI API Key**
- **Fal.ai API Key** (for video generation)
- **Telegram Bot Token**
- **Google Sheets** (for logging posts)
- **Blotato Account** (for multi-channel publishing)

How to set up
1. Credentials: Configure your credentials for OpenAI, Fal.ai, Telegram, Google Sheets, and Blotato.
2. Fal.ai Setup: Create a Header Auth credential named Authorization with the value Key <YOUR_FAL_KEY>.
3. Google Sheets: In the "Log to Google Sheets" node, select the specific Sheet you want to use for tracking.
4. Blotato: In the 5 publishing nodes (Instagram, Facebook, etc.), select your specific Account IDs from the dropdowns.
5. Brand Customization: Open the "Draft Video Prompt" and "Generate Caption" nodes to edit the System Prompt with your specific brand voice and aesthetic guidelines.
by Dr. Firas
Convert Viral Videos into AI Avatar Swaps and Publish on TikTok with Blotato

Who is this for?

This workflow is designed for content creators, agencies, influencers, and automation builders who want to transform viral videos into personalized avatar-based edits and automatically publish them on TikTok (and other platforms) without manual editing or video software.

What problem is this workflow solving?

Replacing a character in a video, transforming the voice, merging audio/video, and publishing to social networks typically requires multiple tools, editing skills, and a lot of time. This workflow automates the entire pipeline end-to-end:
- No manual video editing
- No audio processing
- No API debugging
- No uploading/publishing hassles

It's a full AI-powered transformation system that produces ready-to-post content in minutes.

What this workflow does

This workflow receives an avatar image + a viral video URL and automatically:
1. Extracts the audio using Replicate
2. Replaces the character in the video using FAL WAN Replace
3. Transforms the voice using FAL Chatterbox Speech-to-Speech
4. Merges the new video and audio using FAL FFmpeg
5. Saves results to Google Sheets for tracking
6. Publishes the final video to TikTok via Blotato (and optionally to Instagram, Facebook, LinkedIn, X, and YouTube)
7. Sends a confirmation message when publishing is complete

Everything runs automatically, in parallel, for maximum speed.

Setup

1. Telegram Bot
   - Add your Telegram Bot credentials in the Telegram Trigger node.
   - Send an avatar photo + video URL in one message (URL in the caption).
2. Workflow Configuration
   - Add your FAL API key
   - Add your Replicate API key
   - Add your targetVoiceAudioUrl (this is the voice the output will use)
3. Google Sheets
   - Connect your Google Sheets OAuth credentials
   - Select your sheet and ensure the expected columns exist (e.g. original URL, output URL)
4. Blotato Publishing
   - Install the community node @blotato/n8n-nodes-blotato
   - Connect your Blotato API credentials
   - Select the TikTok account (and optional other accounts)
5. Test the workflow
   - Send a Telegram message with a photo (avatar) and the video URL in the caption.
   - The workflow will process everything automatically.

How to customize this workflow to your needs

- **Change platforms**: remove or add publishing outputs (TikTok, Instagram, LinkedIn, Facebook, YouTube, X).
- **Change voice style**: update the targetVoiceAudioUrl in the Workflow Configuration node.
- **Use your own avatar**: send any image in Telegram; the workflow automatically makes it public and ready for AI processing.
- **Adjust video logic**: swap FAL models or update the merging step.

👋 Need help or want to customize this?
📩 Contact: LinkedIn
📺 YouTube: @DRFIRASS
🚀 Workshops: Mes Ateliers n8n
📄 Documentation: Notion Guide
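The "runs in parallel" claim above boils down to the fact that audio extraction and character replacement are independent steps that only meet at the merge. A minimal sketch of that fan-out, with placeholder stubs standing in for the real Replicate and FAL API calls (the function names here are illustrative, not the workflow's actual node names):

```python
from concurrent.futures import ThreadPoolExecutor

# Placeholder stubs standing in for the real Replicate / FAL API calls.
def extract_audio(video_url):
    return f"audio-from:{video_url}"

def replace_character(video_url, avatar_url):
    return f"swapped:{video_url}+{avatar_url}"

def run_pipeline(video_url, avatar_url):
    # Audio extraction and character replacement are independent,
    # so they can run concurrently before the merge step.
    with ThreadPoolExecutor() as pool:
        audio = pool.submit(extract_audio, video_url)
        video = pool.submit(replace_character, video_url, avatar_url)
        return {"audio": audio.result(), "video": video.result()}
```

In n8n the same effect comes from branching the flow after the trigger, so the slower of the two API calls, not their sum, dominates the runtime.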
by WeblineIndia
IPA Size Tracker with Trend Alerts – Automated iOS App Size Monitoring

This workflow runs on a daily schedule and monitors IPA file sizes from configured URLs. It stores historical size data in Google Sheets, compares current vs. previous builds, and sends email alerts only when significant size changes occur (default: ±10%). A DRY_RUN toggle allows safe testing before real notifications go out.

Who's it for
- iOS developers tracking app binary size growth over time.
- DevOps teams monitoring build artifacts and deployment sizes.
- Product managers ensuring app size budgets remain acceptable.
- QA teams detecting unexpected size changes in release builds.
- Mobile app teams optimizing user experience by keeping apps lightweight.

How it works
1. Schedule Trigger (daily at 09:00 UTC) kicks off the workflow.
2. Configuration: Define monitored apps with {name, version, build, ipa_url}.
3. HTTP Request downloads the IPA file from its URL.
4. Size Calculation: Compute file sizes in bytes, KB, and MB, and attach timestamp metadata.
5. Google Sheets: Append size data to the IPA Size History sheet.
6. Trend Analysis: Compare current vs. previous build sizes.
7. Alert Logic: Evaluate thresholds (>10% increase or >10% decrease).
8. Email Notification: Send formatted alerts with comparisons and trend indicators.
9. Rate Limit: Space out notifications to avoid spamming recipients.

How to set up

1. Spreadsheet
   Create a Google Sheet with a tab named IPA Size History containing:
   Date, Timestamp, App_Name, Version, Build_Number, Size_Bytes, Size_KB, Size_MB, IPA_URL
2. Credentials
   - **Google Sheets (OAuth)** → for reading/writing size history.
   - **Gmail** → for sending alert emails (use an App Password if 2FA is enabled).
3. Open the "Set: Configuration" node and define your workflow variables:
   - APP_CONFIGS = array of monitored apps ({name, version, build, ipa_url})
   - SPREADSHEET_ID = Google Sheet ID
   - SHEET_NAME = IPA Size History
   - SMTP_FROM = sender email (e.g., devops@company.com)
   - ALERT_RECIPIENTS = comma-separated emails
   - SIZE_INCREASE_THRESHOLD = 0.10 (10%)
   - SIZE_DECREASE_THRESHOLD = 0.10 (10%)
   - LARGE_APP_WARNING = 300 (MB)
   - SCHEDULE_TIME = 09:00
   - TIMEZONE = UTC
   - DRY_RUN = false (set true to test without sending emails)
4. File Hosting
   Host IPA files on Google Drive, Dropbox, or a web server. Ensure direct download URLs are used (not preview links).
5. Activate the workflow
   Once configured, it will run automatically at the scheduled time.

Requirements
- Google Sheet with the IPA Size History tab.
- Accessible IPA file URLs.
- SMTP / Gmail account (Gmail recommended).
- n8n (cloud or self-hosted) with Google Sheets + Email nodes.
- Sufficient local storage for IPA file downloads.

How to customize the workflow
- **Multiple apps**: Add more configs to APP_CONFIGS.
- **Thresholds**: Adjust SIZE_INCREASE_THRESHOLD / SIZE_DECREASE_THRESHOLD.
- **Notification templates**: Customize subject/body with variables: {{app_name}}, {{current_size}}, {{previous_size}}, {{change_percent}}, {{trend_status}}.
- **Schedule**: Change the Cron from daily to hourly, weekly, etc.
- **Large app warnings**: Adjust LARGE_APP_WARNING.
- **Trend analysis**: Extend beyond one build (7-day, 30-day averages).
- **Storage backend**: Swap Google Sheets for CSV, a database, or S3.

Add-ons to level up
- **Slack Notifications**: Add Slack webhook alerts with emojis & formatting.
- **Size History Charts**: Generate trend graphs with Chart.js or the Google Charts API.
- **Environment separation**: Monitor dev/staging/prod builds separately.
- **Regression detection**: Statistical anomaly checks.
- **Build metadata**: Log bundle ID, SDK versions, architectures.
- **Archive management**: Auto-clean old records to save space.
- **Dashboards**: Connect to Grafana, DataDog, or custom BI.
- **CI/CD triggers**: Integrate with pipelines via a webhook trigger.

Common Troubleshooting
- **No size data** → check that URLs return a binary IPA (not an HTML error page).
- **Download failures** → confirm hosting permissions & direct links.
- **Missing alerts** → ensure thresholds are set and prior history exists.
- **Google Sheets errors** → check sheet/tab names & OAuth credentials.
- **Email issues** → validate SMTP credentials, the spam folder, and sender reputation.
- **Large file timeouts** → raise the HTTP timeout for >100MB files.
- **Trend errors** → make sure at least 2 builds exist.
- **No runs** → confirm the workflow is active and the timezone is correct.

Need Help?

If you'd like to customize this workflow to suit your app development process, simply reach out to us here and we'll help you adapt the template to your exact use case.
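The alert logic described above (±10% thresholds, a large-app warning, and a DRY_RUN toggle) can be sketched as a small decision function. This is an illustrative sketch, not the workflow's actual Code node:

```python
# Thresholds mirror the Set: Configuration defaults from the template.
SIZE_INCREASE_THRESHOLD = 0.10   # alert on +10% or more
SIZE_DECREASE_THRESHOLD = 0.10   # alert on -10% or more
LARGE_APP_WARNING_MB = 300
DRY_RUN = True                   # when True, report the decision without emailing

def should_alert(prev_bytes: int, curr_bytes: int) -> dict:
    """Compare the current build to the previous one and decide whether to alert."""
    change = (curr_bytes - prev_bytes) / prev_bytes
    alert = change >= SIZE_INCREASE_THRESHOLD or change <= -SIZE_DECREASE_THRESHOLD
    return {
        "change_percent": round(change * 100, 2),
        "alert": alert,
        "large_app": curr_bytes / (1024 * 1024) > LARGE_APP_WARNING_MB,
        "send_email": alert and not DRY_RUN,
    }
```

For example, a build growing from 100 MB to 112 MB is a 12% increase, which crosses the threshold; with DRY_RUN enabled the decision is logged but no email goes out.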
by Surya Vardhan Yalavarthi
Submit a research topic through a form and receive a professionally styled executive report in your inbox, fully automated, with built-in scraping resilience.

The workflow searches Google via SerpApi, scrapes each result with Jina.ai (free, no key needed), and uses Claude to extract key findings. If a page is blocked by a CAPTCHA or login wall, it automatically retries with Firecrawl. Blocked sources are gracefully skipped after two attempts. Once all sources are processed, Claude synthesises a structured executive report and delivers it as a styled HTML email via Gmail.

How it works
1. A web form collects the research topic, number of sources (5–7), and recipient email.
2. SerpApi searches Google and returns a buffer of results (2× requested + 3, to survive domain filtering).
3. Junk domains are filtered out automatically (Reddit, YouTube, Twitter, PDFs, etc.).
4. Each URL is processed one at a time in a serial loop:
   - Round 1 (Jina.ai): free Markdown scraper, no API key required.
   - Claude checks the content; if it's a CAPTCHA or wall, it returns RETRY_NEEDED.
   - Round 2 (Firecrawl): the paid fallback scraper retries the blocked URL.
   - If still blocked, the source is marked as unavailable and the loop continues.
5. All extracted findings are aggregated and Claude writes a structured executive report (Executive Summary, Key Findings, Detailed Analysis, Data & Evidence, Conclusions, Sources).
6. The report is converted to styled HTML (with tables, headings, and lists) and emailed to the recipient.

Setup steps

Required credentials

| Service | Where to get it | Where to paste it |
|---|---|---|
| SerpApi | serpapi.com (free tier: 100 searches/month) | SerpApi Search node → query param api_key |
| Firecrawl | firecrawl.dev (free tier: 500 pages/month) | Firecrawl (Fallback) node → Authorization header |
| Anthropic | n8n credentials → Anthropic API | Connect to: Claude Extractor, Claude Re-Analyzer, Claude Synthesizer |
| Gmail | n8n credentials → Gmail OAuth2 | Connect to: Send Gmail |

Error handler (optional)

The workflow includes a built-in error handler that captures the failed node name, error message, and execution URL. To activate it:
1. Workflow Settings → Error Workflow → select this workflow.
2. Add a Slack or Gmail node after Format Error to receive failure alerts.

Nodes used
- **n8n Form Trigger**: collects topic, source count, and recipient email
- **HTTP Request** × 3: SerpApi (Google Search), Jina.ai (primary scraper), Firecrawl (fallback scraper)
- **Code** × 6: URL filtering, response normalisation, prompt assembly, HTML rendering
- **Split In Batches**: serial loop (one URL at a time, prevents rate-limit collisions)
- **IF** × 2: CAPTCHA/block detection after each scrape attempt
- **Wait**: 3-second pause before the Firecrawl retry
- **Basic LLM Chain** × 3: page analysis (×2) and report synthesis (×1), all powered by Claude
- **Aggregate**: collects all per-URL findings before synthesis
- **Gmail**: sends the final HTML report
- **Error Trigger + Set**: error-handler sub-flow

Notes
- Jina.ai is free and works without an API key for most public pages.
- Firecrawl is only called when Jina is blocked; most runs won't consume Firecrawl credits.
- SerpApi fetches numSources × 2 + 3 results to ensure enough survive domain filtering.
- The Claude model is set to claude-sonnet-4-5; swap to any Anthropic model in the three Claude nodes.
- The HTML email renders markdown tables, headings, lists, and bold correctly in Gmail.
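Two of the ideas above, the over-fetch buffer and the two-round scrape with a paid fallback, can be sketched in a few lines. Here `is_blocked` is a simplified keyword stand-in for the Claude RETRY_NEEDED check, and the scrapers are passed in as plain callables rather than real Jina/Firecrawl HTTP calls:

```python
def results_to_fetch(num_sources: int) -> int:
    # Over-fetch so enough URLs survive the junk-domain filter (2x requested + 3).
    return num_sources * 2 + 3

BLOCK_MARKERS = ("captcha", "verify you are human", "log in to continue")

def is_blocked(page_text: str) -> bool:
    # Simplified stand-in for the Claude "RETRY_NEEDED" classification.
    lowered = page_text.lower()
    return any(marker in lowered for marker in BLOCK_MARKERS)

def scrape_with_fallback(url, primary, fallback):
    """Round 1: free scraper; Round 2: paid fallback; give up after two attempts."""
    text = primary(url)
    if not is_blocked(text):
        return text
    text = fallback(url)
    return text if not is_blocked(text) else None
```

Returning `None` after two failed rounds is what lets the serial loop skip a blocked source gracefully instead of aborting the whole report.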
by Kevin Meneses
What this workflow does

This workflow automatically audits web pages for SEO issues and generates an executive-friendly SEO report using AI. It is designed for marketers, founders, and SEO teams who want fast, actionable insights without manually reviewing HTML, meta tags, or SERP data.

The workflow:
1. Reads URLs from Google Sheets
2. Scrapes page content using Decodo (reliable scraping, even on protected sites)
3. Extracts key SEO elements (title, meta description, canonical, H1/H2, visible text)
4. Uses an AI Agent to analyze the page and generate:
   - Overall SEO status
   - Top issues
   - Quick wins
   - Title & meta description recommendations
5. Saves results to Google Sheets
6. Sends a formatted HTML executive report by email (Gmail)

Who this workflow is for

This template is ideal for:
- SEO consultants and agencies
- SaaS marketing teams
- Founders monitoring their landing pages
- Content teams doing SEO quality control

It focuses on on-page SEO fundamentals, not backlink analysis or technical crawling.

Setup (step by step)

1. Google Sheets
   - Create an input sheet with one URL per row
   - Create an output sheet to store SEO results
2. Decodo
   - Add your Decodo API credentials
   - The URL is automatically taken from the input sheet
   - 👉 Decodo – Web Scraper for n8n
3. AI Agent
   - Connect your LLM credentials (OpenAI, Gemini, etc.)
   - The prompt is already optimized for non-technical SEO summaries
4. Gmail
   - Connect your Gmail account
   - Set the recipient email address
   - Emails are sent in clean HTML format

Notes & disclaimer
- This is a community template
- Results depend on page accessibility and content structure
- It focuses on on-page SEO, not backlinks or rankings
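The extraction step above (title, meta description, canonical, H1/H2) can be approximated with Python's standard-library HTML parser. This is an illustrative sketch of what "extract key SEO elements" means, not the template's actual Decodo or Code-node logic:

```python
from html.parser import HTMLParser

class SEOExtractor(HTMLParser):
    """Collect the basic on-page SEO elements from raw HTML."""
    def __init__(self):
        super().__init__()
        self.data = {"title": "", "meta_description": "", "canonical": "", "h1": [], "h2": []}
        self._current = None  # tag whose inner text we are currently capturing

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1", "h2"):
            self._current = tag
        elif tag == "meta" and attrs.get("name") == "description":
            self.data["meta_description"] = attrs.get("content", "")
        elif tag == "link" and attrs.get("rel") == "canonical":
            self.data["canonical"] = attrs.get("href", "")

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, text):
        if self._current == "title":
            self.data["title"] += text.strip()
        elif self._current in ("h1", "h2") and text.strip():
            self.data[self._current].append(text.strip())

def extract_seo(html: str) -> dict:
    parser = SEOExtractor()
    parser.feed(html)
    return parser.data
```

Feeding this dict to the AI Agent (instead of the raw HTML) keeps prompts short and makes the "missing meta description" style of finding trivial to verify.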
by vinci-king-01
Property Listing Aggregator with Mailchimp and Notion

⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow scrapes multiple commercial-real-estate websites, consolidates new property listings into Notion, and emails weekly availability updates or immediate space alerts to a Mailchimp audience. It automates the end-to-end process so business owners can stay on top of the latest spaces without manual searching.

Prerequisites
- n8n instance (self-hosted or n8n.cloud)
- ScrapeGraphAI community node installed
- Active Notion workspace with permission to create/read databases
- Mailchimp account with at least one Audience list
- Basic understanding of JSON; ability to add API credentials in n8n

Required Credentials
- **ScrapeGraphAI API Key** – enables web scraping functionality
- **Notion OAuth2 / Integration Token** – writes data into the Notion database
- **Mailchimp API Key** – sends campaigns and individual emails
- (Optional) Proxy credentials – if target real-estate sites block your IP

Specific Setup Requirements

| Resource | Requirement | Example |
|----------|-------------|---------|
| Notion | Database with property fields (Address, Price, SqFt, URL, Availability) | Database ID: abcd1234efgh |
| Mailchimp | Audience list where alerts are sent | Audience ID: f3a2b6c7d8 |
| ScrapeGraphAI | YAML/JSON config per site | Stored inside the ScrapeGraphAI node |

How it works

Key steps:
1. **Manual Trigger / CRON**: Starts the workflow weekly or on demand.
2. **Code (Site List Builder)**: Generates an array of target URLs for ScrapeGraphAI.
3. **Split In Batches**: Processes URLs in manageable groups to avoid rate limits.
4. **ScrapeGraphAI**: Extracts property details from each site.
5. **IF (New vs Existing)**: Checks whether the listing already exists in Notion.
6. **Notion**: Inserts new listings or updates existing records.
7. **Set**: Formats the email content (HTML & plaintext).
8. **Mailchimp**: Sends a campaign or automated alert to subscribers.
9. **Sticky Notes**: Provide documentation and future-enhancement pointers.

Set up steps (15–25 minutes)
1. Install the community node: navigate to Settings → Community Nodes and install "ScrapeGraphAI".
2. Create a Notion integration: go to Notion Settings → Integrations → Develop your own integration. Copy the integration token and share your target database with the integration.
3. Add your Mailchimp API key: in Mailchimp, go to Account → Extras → API keys. Copy an existing key or create a new one, then add it to n8n credentials.
4. Build the scrape config: in the ScrapeGraphAI node, paste a YAML/JSON selector config for each website (address, price, sqft, url, availability).
5. Configure the URL list: open the first Code node and replace the placeholder array with your target listing URLs.
6. Map Notion fields: open the Notion node, map scraped fields to your database properties, and save.
7. Design the email template: in the Set node, tweak the HTML and plaintext blocks to match your brand.
8. Test the workflow: trigger it manually and check that Notion rows are created and Mailchimp sends the message.
9. Schedule: add a CRON node (weekly) or leave the Manual Trigger for ad-hoc runs.

Node Descriptions
- **Manual Trigger / CRON** – kicks off the workflow either on demand or on a schedule.
- **Code (Site List Builder)** – holds an array of commercial real-estate URLs and outputs one item per URL.
- **Split In Batches** – prevents hitting anti-bot limits by processing URLs in groups (default: 5).
- **ScrapeGraphAI** – crawls each URL, parses the DOM with CSS/XPath selectors, and returns structured JSON.
- **IF (New Listing?)** – compares scraped listing IDs against existing Notion database rows.
- **Notion** – creates or updates pages representing property listings.
- **Set (Email Composer)** – builds the dynamic email subject, body, and merge tags for Mailchimp.
- **Mailchimp** – uses the **Send Campaign** endpoint to email your audience.
- **Sticky Note** – contains inline documentation and customization reminders.

Data Flow:
Manual Trigger/CRON → Code (URLs) → Split In Batches → ScrapeGraphAI → IF (New?)
- True path → Notion (Create) → Set (Email) → Mailchimp
- False path → (skip)

Customization Examples

Filter listings by maximum budget (inside the IF node, as a custom expression):

{{$json["price"] <= 3500}}

Change email frequency to daily digests:

{
  "nodes": [
    {
      "name": "Daily CRON",
      "type": "n8n-nodes-base.cron",
      "parameters": {
        "triggerTimes": [
          { "hour": 8, "minute": 0 }
        ]
      }
    }
  ]
}

Data Output Format

The workflow outputs structured JSON data:

{
  "address": "123 Market St, Suite 400",
  "price": 3200,
  "sqft": 950,
  "url": "https://examplebroker.com/listing/123",
  "availability": "Immediate",
  "new": true
}

Troubleshooting
- Scraper returns empty objects – verify selectors in the ScrapeGraphAI config; inspect the site's HTML for changes.
- Duplicate entries in Notion – ensure the IF node checks a unique ID (e.g., the listing URL) before creating a page.

Performance Tips
- Reduce the batch size or add delays in ScrapeGraphAI to avoid site blocking.
- Cache previously scraped URLs in an external file or database for faster runs.

Pro Tips:
- Rotate proxies in ScrapeGraphAI for heavily protected sites.
- Use Notion rollups to calculate total available square footage automatically.
- Leverage Mailchimp merge tags (|FNAME|) in the Set node for personalized alerts.
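The "IF (New Listing?)" check above, and the duplicate-prevention advice in Troubleshooting, both reduce to treating the listing URL as the unique ID and comparing against what Notion already holds. An illustrative sketch (a plain set stands in for the Notion query):

```python
def split_new_and_existing(scraped, known_urls):
    """Partition scraped listings by whether their URL already exists in Notion."""
    new, existing = [], []
    for listing in scraped:
        (existing if listing["url"] in known_urls else new).append(listing)
    return new, existing
```

New listings go down the True path to Notion create and the Mailchimp email; existing ones are skipped (or routed to an update).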
by Shayan Ali Bakhsh
About this Workflow

This workflow helps you repurpose your YouTube videos across multiple social media platforms with zero manual effort. It's designed for creators, businesses, and marketers who want to maximize reach without spending hours re-uploading content everywhere.

How It Works

1. Trigger from YouTube
   - The workflow checks your YouTube channel every 10 minutes via RSS feed.
   - It compares the latest video ID with the last saved one to detect whether a new video was uploaded.
   - Tutorial: How to get a YouTube Channel RSS Feed
2. Generate Descriptions with AI
   - Uses Gemini 2.5 Flash to automatically generate fresh, engaging descriptions for your posts.
3. Create Images with ContentDrips
   - ContentDrips offers multiple templates (carousel, single image, branding templates, etc.).
   - The workflow generates a custom promotional image using your video description and thumbnail.
   - Install node: npm install n8n-nodes-contentdrips
   - Docs: ContentDrips Blog Tutorial
4. Publish Across Social Platforms with SocialBu
   - Instead of manually connecting each social media API, this workflow uses SocialBu. From a single connection, you can post to Facebook, Instagram, TikTok, YouTube, Twitter (X), LinkedIn, Threads, Pinterest, and more.
   - Website: SocialBu
5. Get Real-Time Notifications via Discord
   - After each run, the workflow sends updates to your Discord channel. You'll know if the upload was successful, or if an error occurred (e.g., API limits).
   - Setup guide: Discord OAuth Credentials

Why Use This Workflow?
- Saves time by automating the entire repurposing process.
- Ensures consistent branding and visuals across platforms.
- Works around platform restrictions by leveraging SocialBu's integrations.
- Keeps you updated with Discord notifications, so there's no guessing if something failed.

Requirements
- YouTube channel RSS feed link
- ContentDrips API key, template ID, and branding setup
- SocialBu account with connected social media platforms
- Discord credentials (for webhook updates)

Need Help?

Message me on LinkedIn: Shayan Ali Bakhsh

Happy Automation 🚀
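The new-video check in step 1 above is a simple comparison of the feed's newest entry against the last saved ID. A sketch using the standard library, assuming the usual layout of a YouTube channel Atom feed (newest entry first, video IDs in the `yt:` namespace):

```python
import xml.etree.ElementTree as ET

# Namespaces used by YouTube channel feeds (Atom + the yt: extension).
NS = {
    "atom": "http://www.w3.org/2005/Atom",
    "yt": "http://www.youtube.com/xml/schemas/2015",
}

def latest_video_id(feed_xml):
    """Return the newest video ID from a YouTube channel Atom feed, or None."""
    root = ET.fromstring(feed_xml)
    entry = root.find("atom:entry", NS)  # entries are ordered newest-first
    if entry is None:
        return None
    return entry.findtext("yt:videoId", namespaces=NS)

def is_new_upload(feed_xml, last_saved_id):
    latest = latest_video_id(feed_xml)
    return latest is not None and latest != last_saved_id
```

In the workflow, `last_saved_id` would be whatever the previous run stored; when `is_new_upload` is true, the description, image, and publishing steps fire.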
by PhilanthropEAK Automation
Who's it for Marketing teams, social media managers, content creators, and small businesses looking to maintain consistent social media presence across multiple platforms. Perfect for organizations that want to automate content creation while maintaining quality and brand consistency. How it works This workflow creates a complete social media automation system that generates platform-specific content using AI and schedules posts across Twitter, LinkedIn, and Instagram based on your content calendar in Google Sheets. The system runs daily at 9 AM, reading your content calendar to identify scheduled topics for the day. It uses OpenAI's GPT-4 to generate platform-optimized content that follows each platform's best practices - concise engaging posts for Twitter, professional thought leadership for LinkedIn, and visual storytelling for Instagram. DALL-E creates accompanying images that match your brand style and topic themes. Each piece of content is automatically formatted for optimal engagement, including appropriate hashtags, character limits, and platform-specific calls-to-action. The workflow then schedules posts through Buffer's API at optimal times and updates your spreadsheet with posting status, content previews, and generated image URLs for tracking and approval workflows. 
## How to set up

**Prerequisites:**
- Google account with Sheets access
- OpenAI API key with GPT-4 and DALL-E access
- Buffer account with connected social media profiles
- Slack workspace (optional, for notifications)

**Setup steps:**

1. **Create your content calendar**
   - Copy the provided Google Sheets template
   - Set up columns: Date, Topic, Platforms, Content Type, Keywords, Status, Generated Content, Image URL
   - Fill in your content schedule with topics and target platforms
2. **Configure credentials in n8n**
   - Add an OpenAI API credential with your API key
   - Set up Google Sheets OAuth2 for spreadsheet access
   - Add a Buffer API token from your Buffer dashboard
   - Add a Slack API credential for success notifications (optional)
3. **Update configuration variables**
   - Set your Google Sheet ID from the spreadsheet URL
   - Define your brand voice and company messaging
   - Specify the target audience for content personalization
   - Set image style preferences for consistent visuals
4. **Configure the Buffer integration**
   - Connect your social media accounts to Buffer
   - Get profile IDs for Twitter, LinkedIn, and Instagram
   - Update the Schedule Post node with your specific profile IDs
   - Set optimal posting times in Buffer settings
5. **Test the workflow**
   - Add test content to tomorrow's date in your calendar
   - Run the workflow manually to verify content generation
   - Check that posts appear correctly in Buffer's queue
   - Verify that spreadsheet updates and Slack notifications work

## Requirements

- Google Sheets with the template structure and editing permissions
- OpenAI API key with GPT-4 and DALL-E access (estimated cost: $0.10-0.30 per day for content generation)
- Buffer account (the free plan supports up to 3 social accounts; paid plans for more)
- Social media accounts connected through Buffer (Twitter, LinkedIn, Instagram)
- n8n instance (cloud subscription or self-hosted)

## How to customize the workflow

**Adjust content generation:**
- Modify the AI prompts in the OpenAI node to match your industry's tone and style
- Add custom content types (promotional, educational, behind-the-scenes, user-generated)
- Include seasonal or event-based content variations in your prompts
- Customize hashtag strategies per platform and content type

**Enhance scheduling logic:**
- Add time-zone handling for global audiences
- Implement different posting schedules for weekdays vs. weekends
- Create urgency-based posting for time-sensitive content
- Add approval workflows before scheduling sensitive content

**Expand platform support:**
- Add Facebook, TikTok, or YouTube Shorts using their respective APIs
- Integrate Hootsuite or Later as alternative scheduling platforms
- Include Pinterest for visual content with optimized descriptions
- Add LinkedIn Company Page posting alongside personal profiles

**Improve content intelligence:**
- Integrate trending-hashtag research using social media APIs
- Add competitor content analysis for inspiration and differentiation
- Include sentiment analysis to adjust tone based on current events
- Implement A/B testing for different content variations

**Advanced automation features:**
- Add engagement monitoring and response workflows
- Create monthly performance reports sent via email
- Implement content recycling for evergreen topics
- Build user-generated content curation from brand mentions
- Add crisis communication protocols for sensitive topics

**Integration enhancements:**
- Connect your CRM to include customer success stories
- Link to email marketing for cross-channel content consistency
- Integrate with project management tools for campaign coordination
- Add analytics dashboards for content performance tracking
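The weekday-vs-weekend scheduling customization mentioned above can be sketched as a tiny helper. This is an assumption-laden illustration (the function name and the specific hours are invented for the example, not recommendations from Buffer or the workflow):

```python
from datetime import datetime

# Illustrative sketch of weekday-vs-weekend scheduling: pick a posting
# hour based on the day of the week. The hours here are assumptions.
WEEKDAY_HOUR = 9    # post at 9 AM on business days
WEEKEND_HOUR = 11   # post later on weekends

def posting_hour(when: datetime) -> int:
    # datetime.weekday(): Monday is 0, Sunday is 6
    return WEEKEND_HOUR if when.weekday() >= 5 else WEEKDAY_HOUR

assert posting_hour(datetime(2024, 1, 6)) == 11  # a Saturday
```

In practice you would put logic like this in a Code or Set node before the Buffer scheduling step, or configure the equivalent schedules directly in Buffer.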
by Omer Fayyaz
Automatically discover and extract article URLs from any website, using AI to identify valid content links while filtering out navigation, category pages, and irrelevant content. Perfect for building content pipelines, news aggregators, and research databases.

## What Makes This Different

- **AI-Powered Intelligence** - Uses GPT-5-mini to understand webpage context and distinguish actual articles from navigation pages, eliminating false positives
- **Browser Spoofing** - Includes realistic User-Agent headers and request patterns to avoid bot detection on publisher sites
- **Smart URL Normalization** - Automatically strips tracking parameters (utm_*, fbclid, etc.), removes duplicates, and standardizes URLs
- **Source Categorization** - AI assigns logical source names based on domain and content type for easy filtering
- **Rate Limiting Built-In** - Configurable delays between requests prevent IP blocking and respect website resources
- **Deduplication on Save** - A Google Sheets append-or-update pattern ensures no duplicate URLs in your database

## Key Benefits of AI-Powered Content Discovery

- **Save 10+ Hours Weekly** - Automate manual link hunting across dozens of publisher sites
- **Higher-Quality Results** - AI filters out 95%+ of the junk links (nav pages, categories, footers) that rule-based scrapers miss
- **Scale Effortlessly** - Add new seed URLs to your sheet and the same workflow handles any website structure
- **Industry Agnostic** - Works for news, blogs, research papers, product pages, or any other content type
- **Always Up to Date** - Schedule daily runs to catch new content as it's published
- **Full Audit Trail** - Track discovered URLs with timestamps and sources in Google Sheets

## Who's it for

This template is designed for content marketers, SEO professionals, researchers, media monitors, and anyone who needs to aggregate content from multiple sources.
It's perfect for organizations that need to track competitor blogs, curate industry news, build research databases, monitor brand mentions, or aggregate content for newsletters, without manually checking dozens of websites daily or writing complex scraping rules for each source.

## How it works / What it does

This workflow creates an intelligent content discovery pipeline that automatically finds and extracts article URLs from any webpage. The system:

1. **Reads Seed URLs** - Pulls a list of webpages to crawl from your Google Sheets (blog indexes, news feeds, publication homepages)
2. **Fetches with Stealth** - Downloads each webpage's HTML using browser-like headers to avoid bot detection
3. **Converts for AI** - Transforms messy HTML into clean Markdown that the AI can easily process
4. **AI Extraction** - GPT-5-mini analyzes the content and identifies valid article URLs while filtering out navigation, categories, and junk links
5. **Normalizes & Saves** - Cleans URLs (removes tracking parameters), deduplicates, and saves to Google Sheets with source tracking

**Key Innovation: Context-Aware Link Filtering** - Unlike traditional scrapers that rely on CSS selectors or URL patterns (which break when sites update), the AI understands the semantic difference between an article link and a navigation link. It reads the page like a human would, identifying content worth following regardless of the website's structure.

## How to set up

**1. Create Your Google Sheets Database**
- Create a new Google Spreadsheet with two sheets:
  - "Seed URLs" - add a URL column with webpages to crawl (blog homepages, news feeds, etc.)
  - "Discovered URLs" - add columns: URL, Source, Status, Discovered At
- Add 3-5 seed URLs to start (e.g., https://abc.com/, https://news.xyz.com/)

**2. Connect Your Credentials**
- **Google Sheets**: Click the "Read Seed URLs" and "Save Discovered URLs" nodes → select your Google Sheets account
- **OpenAI**: Click the "OpenAI GPT-5-mini" node → add your OpenAI API key
- Select your spreadsheet and sheet names in both Google Sheets nodes

**3. Customize the AI Prompt (Optional)**
- Open the "AI URL Extractor" node
- Modify the system message to add industry-specific rules, for example:

  ```
  // Example: additions to the system message for tech blogs
  For tech sites, also extract:
  - Tutorial and guide URLs
  - Product announcement pages
  - Changelog and release notes
  ```
- Adjust source naming conventions to match your taxonomy

**4. Test Your Configuration**
- Click "Test Workflow" or use the Manual Trigger
- Check the execution to verify:
  - Seed URLs are being read correctly
  - HTML is fetched successfully (check for a 200 status)
  - The AI returns a valid JSON array of URLs
  - URLs are saved to your output sheet
- Review the "Discovered URLs" sheet for results

**5. Schedule and Monitor**
- Adjust the Schedule Trigger (default: daily at 6 AM)
- Enable the workflow to run automatically
- Monitor execution logs for errors:
  - Rate limiting: increase the wait time if sites block you
  - Empty results: check whether the seed URLs' structure has changed
  - AI errors: review the AI output in the execution data
- Set up error notifications via email or Slack (add nodes after the Completion Summary)

## Requirements

- **Google Sheets Account** - OAuth2 connection for reading seed URLs and saving results
- **OpenAI API Key** - For GPT-5-mini (or swap in any LangChain-compatible LLM)
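The "Smart URL Normalization" step described above (strip tracking parameters, deduplicate, standardize) and the domain-based source naming can be sketched in Python. This is a minimal illustration: the template only names utm_* and fbclid explicitly, so the `TRACKING_PARAMS` set and the `source_name` rule are assumptions, not the workflow's exact logic.

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

# Assumed list of extra tracking parameters to strip; the template only
# specifies utm_* and fbclid explicitly.
TRACKING_PARAMS = {"fbclid", "gclid", "ref"}

def normalize_url(url: str) -> str:
    """Strip tracking parameters and fragments to standardize a URL."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if k not in TRACKING_PARAMS and not k.startswith("utm_")]
    return urlunparse(parts._replace(query=urlencode(query), fragment=""))

def source_name(url: str) -> str:
    """Derive a simple source label from the domain (illustrative rule)."""
    host = urlparse(url).netloc.removeprefix("www.")
    return host.split(".")[0].capitalize()

# Deduplication falls out of normalization: both variants below collapse
# to the same canonical URL, so a set keeps only one copy.
urls = ["https://abc.com/post?utm_source=x&id=7",
        "https://abc.com/post?id=7#top"]
deduped = sorted({normalize_url(u) for u in urls})
```

In the workflow itself this runs in a Code node before the Google Sheets save, where the append-or-update pattern provides a second layer of deduplication against URLs discovered on earlier runs.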