by David P
# Curate & post AI news to X, Bluesky, Threads & more via GPT-5 mini & Cue

This n8n template automatically curates AI news from RSS feeds and generates platform-tailored social media posts using GPT-5 mini. Posts are saved as drafts in Cue for review before publishing to X, Bluesky, Threads, Mastodon, and Facebook.

Use cases include:
- Daily automated AI/tech news curation
- Multi-platform social media content creation
- Building thought leadership with consistent posting
- Staying on top of industry news without manual effort

## Who is this for?

This workflow is ideal for:
- Tech content creators who want to share AI news across multiple platforms
- Social media managers handling multiple accounts
- Anyone building an audience around AI/tech topics
- Teams who want consistent daily content without manual curation

## What problem does this workflow solve?

Manually curating news, writing platform-specific posts, and publishing across 5 different social networks is time-consuming. This workflow automates the entire process:
- **Curation** - Pulls from 4 trusted AI/tech RSS feeds daily
- **Deduplication** - Tracks posted articles in Google Sheets so you never share the same story twice
- **Content creation** - GPT-5 mini writes posts tailored to each platform's style and character limits
- **Review workflow** - Creates drafts in Cue so you can review before publishing

## How it works

1. **Schedule Trigger** - Runs daily at 9am (configurable)
2. **RSS Feeds** - Fetches articles from TechCrunch AI, Ars Technica AI, The Verge AI, and MIT Tech Review
3. **Filter & Merge** - Combines all feeds and filters to articles from the last 7 days
4. **Deduplication** - Compares against Google Sheets to find unposted articles
5. **Random Selection** - Picks one random article from the available stories
6. **AI Generation** - GPT-5 mini generates 5 platform-specific posts with appropriate tone and length
7. **Save to Cue** - Creates a draft post with all 5 platform variations
8. **Log to Sheet** - Records the article URL to prevent future duplicates

## Setup

### Requirements
- Cue account with
connected social accounts
- OpenAI API key
- Google account for Sheets

### Step 1: Install the Cue community node
1. Go to Settings → Community Nodes
2. Click Install
3. Enter `@cuehq/n8n-nodes-cue`

### Step 2: Create tracking spreadsheet
1. Create a new Google Sheet named "AI News Tracker"
2. Add these column headers in row 1: `article_url`, `title`, `source`, `processed_at`

### Step 3: Configure credentials
- **Google Sheets** - Add OAuth2 credentials and connect to the "Get Recent Posts" node
- **OpenAI** - Add your API key and connect to the "GPT-5 mini" node
- **Cue** - Add your API key from Cue Settings

### Step 4: Configure the Cue node
1. Open the Create Draft in Cue node
2. Select your Profile
3. For each platform slot, select your social account:
   - Slot 1 → X/Twitter
   - Slot 2 → Bluesky
   - Slot 3 → Threads
   - Slot 4 → Mastodon
   - Slot 5 → Facebook

Don't have all 5 platforms? Simply delete the unused slots.

### Step 5: Publish
Save and click Publish to activate the workflow.

## Customizing this workflow

- **Change the schedule** - Edit the Daily 9am Trigger node to run at a different time or frequency.
- **Use different RSS feeds** - Replace the feed URLs with sources relevant to your niche. The workflow handles any standard RSS feed. Keep 3-6 feeds for best results.
- **Auto-publish instead of drafts** - To publish immediately instead of creating drafts, enable Publish Immediately in the Cue node settings.
- **Adjust the AI tone** - Modify the system prompt in the Write Social Posts node to match your brand voice or adjust platform-specific guidelines.

## Good to know

- **Cost** - Each run uses one OpenAI API call. With GPT-5 mini, this costs approximately $0.01-0.02 per execution.
- **Draft review** - Posts are created as drafts in Cue, giving you a chance to review and edit before publishing.
- **Deduplication** - The Google Sheet tracks all posted URLs, so the same article is never shared twice.

## About Cue

Cue is a social media scheduling platform that lets you manage and publish content across X, Bluesky, Threads, Mastodon, Facebook, LinkedIn, TikTok, and Instagram from a single dashboard.
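The deduplication and random-selection steps described above can be sketched as a small n8n Code-node function. This is an illustrative assumption, not the template's actual node code: `articles` stands for the merged RSS items and `postedUrls` for the URL column read from the tracking sheet.

```javascript
// Hypothetical sketch of the Deduplication + Random Selection steps.
// Input names (articles, postedUrls) are illustrative assumptions.
function pickUnpostedArticle(articles, postedUrls) {
  const posted = new Set(postedUrls);
  // Keep only articles whose URL has not been logged to the sheet yet
  const fresh = articles.filter((a) => !posted.has(a.link));
  if (fresh.length === 0) return null; // nothing new today
  // Pick one random article from the remaining pool
  return fresh[Math.floor(Math.random() * fresh.length)];
}
```

Using a `Set` keeps the membership check fast even as the tracking sheet grows over months of daily runs.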
Key features:
- **Multi-platform publishing** - Schedule once, publish everywhere
- **Platform-specific content** - Tailor each post for different audiences
- **Draft workflow** - Review and edit before publishing
- **API & integrations** - Connect with n8n, Zapier, Make, and custom apps

Get started free · Documentation · n8n Community Node
by Jitesh Dugar
# Schedule social media posts from local files using UploadToURL, OpenAI, and Buffer

Marketing teams often have design files sitting locally — campaign images, product videos, event graphics — that need to be published on social media. The usual process means downloading files, switching apps, uploading to each platform separately, and writing captions by hand.

This workflow removes those steps. Send a file link or binary upload to the webhook. UploadToURL hosts it instantly and returns a clean public URL. OpenAI GPT-4.1 mini reads the filename and context to generate a platform-specific caption, hashtags, alt text, and a scroll-stopping hook. A Switch node routes to the correct Buffer profile — Twitter/X, Instagram, or LinkedIn — and the post is scheduled at the AI-suggested best time.

## What this workflow does

1. Receives a file URL or binary upload via webhook, along with platform, tone, and brand preferences
2. Validates the payload — checks the platform, detects content type from the file extension, cleans the filename into readable words for the AI prompt
3. Uploads the file to UploadToURL and retrieves a permanent public link
4. Sends the link and context to OpenAI, which returns a structured JSON caption with hashtags, alt text, a hook line, and a UTC posting time
5. Routes to the correct Buffer profile based on the platform field
6. Schedules the post and returns a confirmation with the schedule ID, caption, hashtags, and estimated engagement

## Who this is for

- **Marketing agencies** managing multiple brand accounts who need to go from a finished design file to a scheduled post without switching tools
- **Solo creators** who want to publish immediately after finishing a piece of content without manually uploading to each platform
- **E-commerce teams** who want to trigger social posts whenever new product photos are ready

## Setup

1. Install the UploadToURL community node: `n8n-nodes-uploadtourl`
2. Add credentials for UploadToURL API, OpenAI API, and Buffer (as HTTP Header Auth with your Buffer access
token)
3. Set three workflow variables: `BUFFER_PROFILE_TWITTER`, `BUFFER_PROFILE_INSTAGRAM`, `BUFFER_PROFILE_LINKEDIN` — find these IDs in your Buffer account under each profile's settings
4. Activate and copy the webhook URL

## Webhook payload

```json
{
  "fileUrl": "https://cdn.example.com/summer-campaign.jpg",
  "filename": "summer-campaign.jpg",
  "platform": "instagram",
  "tone": "casual",
  "brand": "Acme Studio",
  "hashtags": true
}
```

To upload a binary file instead, send it as multipart/form-data with field name `file` and omit `fileUrl`. Pass `scheduleTime` as an ISO 8601 string to override the AI scheduling suggestion.

## Notes

- The OpenAI node uses gpt-4.1-mini with `response_format: json_object` to guarantee structured output — no post-processing of free text required
- Caption length is validated against per-platform limits before scheduling (Twitter: 280, Instagram: 2200, LinkedIn: 3000)
- To add Facebook or TikTok, add a new output on the Switch node and duplicate one of the Buffer HTTP request nodes
- The error handler returns a structured JSON 400 response so calling apps receive actionable feedback without needing to check n8n logs
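The per-platform caption length check mentioned in the notes could look like this. The limits come straight from the template (280/2200/3000); the function shape and return format are illustrative assumptions, not the workflow's actual validation node.

```javascript
// Sketch of the per-platform caption length validation (limits from the
// template's notes; names and return shape are assumptions).
const CAPTION_LIMITS = { twitter: 280, instagram: 2200, linkedin: 3000 };

function validateCaption(platform, caption) {
  const limit = CAPTION_LIMITS[platform];
  if (limit === undefined) {
    return { ok: false, error: `Unsupported platform: ${platform}` };
  }
  if (caption.length > limit) {
    // Reject before hitting Buffer so the webhook can return a clean 400
    return { ok: false, error: `Caption is ${caption.length} chars; ${platform} allows ${limit}` };
  }
  return { ok: true };
}
```

Failing fast here is what lets the error handler return a structured 400 instead of surfacing a Buffer API rejection later.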
by Kyriakos Papadopoulos
# Auto-Summarize Blog Posts to Social Media with Gemma and Postiz

This workflow automates fetching the latest post from a Blogspot RSS feed, summarizes it with an LLM (e.g., Gemma via Ollama), extracts and uploads an image, generates three relevant hashtags, and posts to Facebook, LinkedIn, X (Twitter), and Instagram via the Postiz API. It ensures content fits platform limits (e.g., 280 characters for X) and prevents duplicates using hashing.

**Pros:**
- Efficient for content creators
- Local LLM ensures privacy
- Customizable for any RSS/blog source

**Cons:**
- Dependent on stable APIs (Postiz/social platforms)
- LLM outputs may vary in quality without human review

**Target Audience:** Bloggers, content marketers, or social media managers looking to automate cross-platform posting from RSS sources, especially those focused on niches like health, tech, or personal development. Ideal for users with technical setup skills for self-hosting.

**Customization Options:**
- Adapt prompts in "Generate Summary and Hashtags with LLM" for tone/style (e.g., professional vs. casual).
- Modify maxChars/hashtag reserve in "Calculate Summary Character Limit" for different platforms.
- Extend for multiple RSS feeds by adjusting the "Calculate Summary Character Limit" array.
- Add error handling (e.g., an IF node after "Create and Post Content via Postiz API") for API failures.

**Disclaimer:** This template is designed for self-hosted n8n instances to leverage local Ollama for privacy. For cloud use, modify as follows: 1) Use an n8n cloud account, 2) Replace Ollama with a cloud API-based LLM like ChatGPT in the "Configure Local LLM Model (Ollama)" node, 3) Switch to cloud-hosted Postiz in the HTTP Request node.

## How it works

1. Set the RSS feed URL in "Set RSS Feed URLs".
2. Fetch the latest post via RSS.
3. Normalize fields and calculate the maximum summary length.
4. Use the LLM to summarize the text, append hashtags, and include the link.
5. Extract and process an image from the post HTML.
6. Validate inputs and post to social platforms via the Postiz API.

## Setup Instructions

1. Install n8n (self-hosted recommended for Ollama integration).
2. Set up Ollama with the Gemma (or a similar) model using "Ollama Model" credentials.
3. Add Postiz API credentials in the "Create and Post Content via Postiz API" node.
4. Replace placeholders:
   - RSS URL in "Set News RSS Feeds"
   - Integration IDs in the Postiz HTTP body
5. (Optional) Add error handling for API failures.
6. Activate the workflow and test with a sample post.

## Uncertainties

- Changes in social media APIs may break posting functionality.
- LLM output consistency depends on model choice and prompt configuration.

## Required n8n Version

Tested on n8n v1.107.3 (self-hosted). Works with the community node n8n-nodes-langchain.

## Resources

- n8n Docs: RSS Feed Read
- n8n Docs: HTTP Request
- Ollama Setup
- Postiz Documentation
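The "Calculate Summary Character Limit" step can be sketched as follows. The 280-character X cap is from the template; the link length and hashtag reserve values below are illustrative assumptions you would tune in the node itself.

```javascript
// Sketch of the summary-length calculation: reserve room inside the
// platform cap for the post link and three hashtags. The reserve sizes
// are illustrative assumptions.
function summaryCharLimit(maxChars, link, hashtagReserve) {
  // The link plus a separating space plus the hashtags all consume characters
  const reserved = link.length + 1 + hashtagReserve;
  return Math.max(0, maxChars - reserved);
}
```

The LLM prompt would then instruct the model to stay under the returned limit, so the final summary + hashtags + link string fits the platform (e.g., 280 characters on X).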
by DataMinex
# Transform property searches into personalized experiences!

This powerful automation delivers dream home matches straight to clients' inboxes with professional CSV reports — all from a simple web form.

## 🚀 What this workflow does

Create a complete real estate search experience that works 24/7:
- ✨ **Smart Web Form** - Beautiful property search form captures client preferences
- 🧠 **Dynamic SQL Builder** - Intelligently creates optimized queries from user input
- ⚡ **Lightning Database Search** - Scans 1000+ properties in milliseconds
- 📊 **Professional CSV Export** - Excel-ready reports with complete property details
- 📧 **Automated Email Delivery** - Personalized emails with property previews and attachments

## 🎯 Perfect for

- **Real Estate Agents** - Generate leads and impress clients with instant service
- **Property Managers** - Automate tenant matching and recommendations
- **Brokerages** - Provide 24/7 self-service property discovery
- **Developers** - Showcase available properties with professional automation

## 💡 Why this workflow is a game-changer

> "From property search to professional report delivery in under 30 seconds!"
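The "Dynamic SQL Builder" idea — only criteria the client actually specifies become WHERE conditions — could be sketched like this. Column names follow the table schema given in the setup guide below; the function itself is an illustrative assumption, not the template's exact node code.

```javascript
// Sketch of a dynamic, parameterized query builder: unfilled form fields
// simply produce no WHERE clause. Names are illustrative assumptions.
function buildPropertyQuery(filters) {
  const clauses = [];
  const params = [];
  if (filters.city)     { clauses.push("city = ?");   params.push(filters.city); }
  if (filters.state)    { clauses.push("state = ?");  params.push(filters.state); }
  if (filters.maxPrice) { clauses.push("price <= ?"); params.push(filters.maxPrice); }
  if (filters.minBeds)  { clauses.push("bed >= ?");   params.push(filters.minBeds); }
  const where = clauses.length ? ` WHERE ${clauses.join(" AND ")}` : "";
  // TOP 100 keeps the CSV report a manageable size (SQL Server syntax)
  return { sql: `SELECT TOP 100 * FROM realtor_usa_price${where}`, params };
}
```

Building the query with placeholders and a separate `params` array (rather than string concatenation) keeps the form input safe from SQL injection.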
- ⚡ **Instant Results**: Zero wait time for property matches
- 🎨 **Professional Output**: Beautiful emails that showcase your expertise
- 📱 **Mobile Optimized**: Works flawlessly on all devices
- 🧠 **Smart Filtering**: Only searches criteria clients actually specify
- 📈 **Infinitely Scalable**: Handles unlimited searches simultaneously

## 📊 Real Estate Data Source

Built on authentic US market data from GitHub:
- 🏘️ 1000+ real properties across all US states
- 💰 Actual market prices from legitimate listings
- 🏠 Complete property details (bedrooms, bathrooms, square footage, lot size)
- 📍 Verified locations with accurate cities, states, and ZIP codes
- 🏢 Broker information for authentic real estate context

## 🛠️ Quick Setup Guide

### Prerequisites Checklist

- [ ] SQL Server database (MySQL/PostgreSQL also supported)
- [ ] Gmail account for automated emails
- [ ] n8n instance (cloud or self-hosted)
- [ ] 20 minutes setup time

### Step 1: Import Real Estate Data 📥

1. 🌟 Download the data
2. 💾 Download the CSV file (1000+ properties included)
3. 🗄️ Create a SQL Server table with this exact schema:

```sql
CREATE TABLE [REALTOR].[dbo].[realtor_usa_price] (
    brokered_by BIGINT,
    status NVARCHAR(50),
    price DECIMAL(12,2),
    bed INT,
    bath DECIMAL(3,1),
    acre_lot DECIMAL(10,8),
    street BIGINT,
    city NVARCHAR(100),
    state NVARCHAR(50),
    zip_code INT,
    house_size INT,
    prev_sold_date NVARCHAR(50)
);
```

4. 📊 Import your CSV data into this table

### Step 2: Configure Database Connection 🔗

1. 🔐 Set up Microsoft SQL Server credentials in n8n
2. ✅ Test the connection to ensure everything works
3. 🎯 The workflow is pre-configured for the table structure above

### Step 3: Gmail Setup 📧

1. 🌐 Visit the Google Cloud Console
2. 🆕 Create a new project (or use an existing one)
3. 🔓 Enable the Gmail API in the API Library
4. 🔑 Create OAuth2 credentials (Web Application)
5. ⚙️ Add your n8n callback URL to the authorized redirects
6. 🔗 Configure Gmail OAuth2 credentials in n8n
7. ✨ Authorize your Google account

### Step 4: Launch Your Property Search Portal 🚀

1. 📋 Import this workflow template (the form is pre-configured)
2. 🌍 Copy your webhook URL from
the Property Search Form node
3. 🔍 Test with a sample property search
4. 📨 Check email delivery with CSV attachment
5. 🎉 Go live and start impressing clients!

## 🎨 Customization Playground

### 🏷️ Personalize Your Brand

```js
// Customize email subjects in the Gmail node
"🏠 Exclusive Properties Curated Just for You - ${results.length} Perfect Matches!"
"✨ Your Dream Home Portfolio - Handpicked by Our Experts"
"🎯 Hot Market Alert - ${results.length} Premium Properties Inside!"
```

### 🔧 Advanced Enhancements

- 🎨 **HTML Email Templates**: Create stunning visual emails with property images
- 📊 **Analytics Dashboard**: Track popular searches and user engagement
- 🔔 **Smart Alerts**: Set up automated price drop notifications
- 📱 **Mobile Integration**: Connect to React Native or Flutter apps
- 🤖 **AI Descriptions**: Add ChatGPT for compelling property descriptions

### 🌍 Multi-Database Flexibility

```js
// Easy database switching
// MySQL: Replace Microsoft SQL node → MySQL node
// PostgreSQL: Swap for PostgreSQL node
// MongoDB: Use MongoDB node with JSON queries
// Even CSV files: Use CSV reading nodes for smaller datasets
```

## 🚀 Advanced Features & Extensions

### 🔥 Pro Tips for Power Users

- 🔄 **Bulk Processing**: Handle multiple searches simultaneously
- 💾 **Smart Caching**: Store popular searches for lightning-fast results
- 📈 **Lead Scoring**: Track which properties generate the most interest
- 📅 **Follow-up Automation**: Schedule nurturing email sequences

### 🎯 Integration Possibilities

- 🏢 **CRM Connection**: Auto-add qualified leads to your CRM
- 📅 **Calendar Integration**: Add property viewing scheduling
- 📊 **Price Monitoring**: Track market trends and price changes
- 📱 **Social Media**: Auto-share featured properties to social platforms
- 💬 **Chat Integration**: Connect to WhatsApp or SMS for instant alerts

## 🔗 Expand Your Real Estate Automation

### 🌟 Related Workflow Ideas

- 🤖 **AI Property Valuation** - Add machine learning for price predictions
- 📊 **Market Analysis Reports** - Generate comprehensive market insights
- 📱 **SMS Property Alerts** - Instant text notifications for hot properties
- 🏢 **Commercial Property Search** - Adapt for office and retail spaces
- 💹 **Investment ROI Calculator** - Add financial analysis for investors
- 🏘️ **Neighborhood Analytics** - Include school ratings and demographics

### 🛠️ Technical Extensions

- 📷 **Image Processing**: Auto-resize and optimize property photos
- 🗺️ **Map Integration**: Add interactive property location maps
- 📱 **Progressive Web App**: Create a mobile app experience
- 🔔 **Push Notifications**: Real-time alerts for saved searches

## 🚀 Get Started Now

1. Import this workflow template
2. Configure your database and Gmail
3. Customize branding and messaging
4. Launch your professional property search portal
5. Watch client satisfaction soar!
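The "Professional CSV Export" step — turning query rows into an Excel-ready attachment — can be sketched as below. The column list mirrors the table schema in the setup guide; the helper itself is an illustrative assumption, not the template's actual node.

```javascript
// Sketch of the CSV export: serialize query rows with proper quoting so
// city names or captions containing commas don't break the file.
function toCsv(rows) {
  const columns = ["price", "bed", "bath", "city", "state", "zip_code", "house_size"];
  const escape = (v) => {
    const s = String(v ?? "");
    // Quote fields that contain commas, quotes, or newlines (RFC 4180 style)
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  const header = columns.join(",");
  const body = rows.map((r) => columns.map((c) => escape(r[c])).join(","));
  return [header, ...body].join("\n");
}
```

The resulting string would be attached to the Gmail node as a binary file so clients can open it directly in Excel.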
by Surya Vardhan Yalavarthi
Submit a research topic through a form and receive a professionally styled executive report in your inbox — fully automated, with built-in scraping resilience.

The workflow searches Google via SerpApi, scrapes each result with Jina.ai (free, no key needed), and uses Claude to extract key findings. If a page is blocked by a CAPTCHA or login wall, it automatically retries with Firecrawl. Blocked sources are gracefully skipped after two attempts. Once all sources are processed, Claude synthesises a structured executive report and delivers it as a styled HTML email via Gmail.

## How it works

1. A web form collects the research topic, number of sources (5–7), and recipient email
2. SerpApi searches Google and returns a buffer of results (2× requested + 3 to survive domain filtering)
3. Junk domains are filtered out automatically (Reddit, YouTube, Twitter, PDFs, etc.)
4. Each URL is processed one at a time in a serial loop:
   - **Round 1 — Jina.ai:** free Markdown scraper, no API key required
   - Claude checks the content — if it's a CAPTCHA or wall, it returns `RETRY_NEEDED`
   - **Round 2 — Firecrawl:** the paid fallback scraper retries the blocked URL
   - If still blocked, the source is marked as unavailable and the loop continues
5. All extracted findings are aggregated and Claude writes a structured executive report (Executive Summary, Key Findings, Detailed Analysis, Data & Evidence, Conclusions, Sources)
6. The report is converted to styled HTML (with tables, headings, and lists) and emailed to the recipient

## Setup steps

### Required credentials

| Service | Where to get it | Where to paste it |
|---|---|---|
| SerpApi | serpapi.com — free tier: 100 searches/month | SerpApi Search node → query param `api_key` |
| Firecrawl | firecrawl.dev — free tier: 500 pages/month | Firecrawl (Fallback) node → Authorization header |
| Anthropic | n8n credentials → Anthropic API | Connect to: Claude Extractor, Claude Re-Analyzer, Claude Synthesizer |
| Gmail | n8n credentials → Gmail OAuth2 | Connect to: Send Gmail |

### Error handler
(optional)

The workflow includes a built-in error handler that captures the failed node name, error message, and execution URL. To activate it:
1. Workflow Settings → Error Workflow → select this workflow.
2. Add a Slack or Gmail node after Format Error to receive failure alerts.

## Nodes used

- **n8n Form Trigger** — collects topic, source count, and recipient email
- **HTTP Request** × 3 — SerpApi (Google Search), Jina.ai (primary scraper), Firecrawl (fallback scraper)
- **Code** × 6 — URL filtering, response normalisation, prompt assembly, HTML rendering
- **Split In Batches** — serial loop (one URL at a time, prevents rate limit collisions)
- **IF** × 2 — CAPTCHA/block detection after each scrape attempt
- **Wait** — 3-second pause before the Firecrawl retry
- **Basic LLM Chain** × 3 — page analysis (×2) and report synthesis (×1), all powered by Claude
- **Aggregate** — collects all per-URL findings before synthesis
- **Gmail** — sends the final HTML report
- **Error Trigger + Set** — error handler sub-flow

## Notes

- Jina.ai is free and works without an API key for most public pages
- Firecrawl is only called when Jina is blocked — most runs won't consume Firecrawl credits
- SerpApi fetches `numSources × 2 + 3` results to ensure enough survive domain filtering
- The Claude model is set to claude-sonnet-4-5 — swap to any Anthropic model in the three Claude nodes
- The HTML email renders markdown tables, headings, lists, and bold correctly in Gmail
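The over-fetch-then-filter step described above (fetch `numSources × 2 + 3`, drop junk domains, trim back) can be sketched as follows. The blocklist entries beyond Reddit/YouTube/Twitter and all function names are illustrative assumptions.

```javascript
// Sketch of the junk-domain filter applied to the over-fetched SerpApi
// results. The buffer formula (2× requested + 3) is from the template;
// the blocklist contents are illustrative.
const JUNK_DOMAINS = ["reddit.com", "youtube.com", "twitter.com", "x.com"];

function filterResults(results, numSources) {
  const usable = results.filter((r) => {
    const url = r.link.toLowerCase();
    if (url.endsWith(".pdf")) return false; // scrapers handle HTML, not PDFs
    return !JUNK_DOMAINS.some((d) => url.includes(d));
  });
  // We over-fetched numSources × 2 + 3 earlier, so trim to the requested count
  return usable.slice(0, numSources);
}
```

Over-fetching up front is cheaper than issuing a second SerpApi search when the filter removes too many results.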
by Dr. Firas
# 💥 Automate AI Video Creation & Multi-Platform Publishing with Veo 3.1 & Blotato

## 🎯 Who is this for?

This workflow is designed for content creators, marketers, and automation enthusiasts who want to produce professional AI-generated videos and publish them automatically on social media — without editing or manual uploads. Perfect for those using Veo 3.1, GPT-4, and Blotato to scale video creation.

## 💡 What problem is this workflow solving?

Creating short-form content (TikTok, Instagram Reels, YouTube Shorts) is time-consuming — from writing scripts to video editing and posting. This workflow eliminates the manual steps by combining AI storytelling + video generation + automated publishing, letting you focus on creativity while your system handles production and distribution.

## ⚙️ What this workflow does

1. Reads new ideas from Google Sheets
2. Generates story scripts using GPT-4
3. Creates cinematic videos using Veo 3.1 (fal.ai/veo3.1/reference-to-video) with 3 input reference images
4. Uploads the final video automatically to Google Drive
5. Publishes the video across multiple platforms (TikTok, Instagram, Facebook, X, LinkedIn, YouTube) via Blotato
6. Updates Google Sheets with the video URL and status (Completed / Failed)

## 🧩 Setup

Required accounts:
- OpenAI → GPT-4 API key
- fal.ai → Veo 3.1 API key
- Google Cloud Console → Sheets & Drive connection
- Blotato → API key for social media publishing

Configuration steps:
1. Copy the Google Sheets structure: A: id_video, B: niche, C: idea, D: url_1, E: url_2, F: url_3, G: url_final, H: status
2. Add your API keys to the Workflow Configuration node.
3. Insert three image URLs and a short idea into your sheet.
4. Wait for the automation to process and generate your video.
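A hedged sketch of the Veo 3.1 request body this template lets you customize (duration, aspect ratio, three reference images). The exact field names accepted by the fal.ai endpoint are assumptions — check the JSON body of the Veo 3.1 node in the imported workflow before relying on them.

```javascript
// Hypothetical request-body builder for the fal.ai Veo 3.1 call.
// Field names (reference_image_urls, duration, aspect_ratio) are
// assumptions based on the customization notes, not verified API schema.
function buildVeoRequestBody(prompt, referenceImageUrls) {
  return {
    prompt,                                   // optimized prompt from GPT-4
    reference_image_urls: referenceImageUrls, // the 3 URLs from columns D-F
    duration: "8s",                           // tweakable in the Veo 3.1 node
    aspect_ratio: "9:16",                     // vertical for Shorts/Reels/TikTok
  };
}
```

Switching `aspect_ratio` to "16:9" would target YouTube's standard player instead of the vertical short-form feeds.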
## 🧠 How to customize this workflow

- **Change duration or aspect ratio** → Edit the Veo 3.1 node JSON body (duration, aspect_ratio)
- **Modify prompt style** → Adjust the "Optimize Prompt for Veo" node for your desired tone or cinematic look
- **Add more platforms** → Extend the Blotato integration to publish on Pinterest, Reddit, or Threads
- **Enable Telegram Trigger** → Allow users to submit ideas and images directly via Telegram

## 🚀 Expected Outcome

Within 2–3 minutes, your idea is transformed into a full cinematic AI video — complete with storytelling, visuals, and automatic posting to your social media channels. Save hours of editing and focus on strategy, creativity, and growth.

## 👋 Need help or want to customize this?

📩 Contact: LinkedIn · 📺 YouTube: @DRFIRASS · 🚀 Workshops: Mes Ateliers n8n · 📄 Documentation: Notion Guide
by isaWOW
## Description

Automate Facebook post scheduling from a Google Sheets content calendar. Runs 4 times daily, reads approved posts scheduled for today, downloads images from Google Drive, schedules via the Facebook Graph API, and updates the tracking sheet with published URLs — perfect for social media managers and agencies.

## What this workflow does

This workflow eliminates manual Facebook posting by automating the entire scheduling process from a centralized Google Sheets content calendar. It runs four times daily (9:35 AM, 10:35 AM, 11:35 AM, 12:35 PM) to catch posts scheduled at different times throughout the morning.

The workflow reads your Google Sheet, filters posts marked with Approval Status = "Good" and Platform = "Facebook", then checks which posts are scheduled for today. For each approved post, it determines whether it's a text-only post or a photo post — if there's a Media URL, it downloads the image from Google Drive; otherwise, it schedules just the text. Both types are scheduled via the Facebook Graph API with future publishing times (not posted immediately), and once successfully scheduled, the workflow updates your Google Sheet with the published post URL and changes the Approval Status to "Published". This creates a complete audit trail of all scheduled content while supporting team collaboration through the approval workflow.

Perfect for social media managers handling multiple Facebook pages, marketing agencies scheduling client content with approval checkpoints, content creators batch-planning posts in Google Sheets, and teams needing collaborative content calendars with centralized image management.

## Key features

- **Google Sheets content calendar:** Manage all Facebook posts in a familiar spreadsheet with columns for Scheduled On, Platform, Post Type, Caption, Media URL, and Approval Status — no complex social media management tools needed.
- **Built-in approval workflow:** Only posts marked "Good" in the Approval Status column are published.
Team members can review, approve, or reject posts directly in Google Sheets before they go live.
- **Dual post type support:** Handles both text-only posts (scheduled via the /feed endpoint) and photo posts (scheduled via the /photos endpoint with binary image data) — automatically detects the type based on Media URL presence.
- **Google Drive image integration:** Stores all images in Google Drive (centralized, shared storage), then automatically downloads them when scheduling photo posts — no manual file management needed.
- **Runs 4 times daily:** The schedule trigger fires at 9:35 AM, 10:35 AM, 11:35 AM, and 12:35 PM to catch posts scheduled at different morning times — handles busy posting schedules without missing slots.
- **Facebook Graph API scheduling:** Uses the official Facebook Graph API v24.0 with the scheduled_publish_time parameter (published: false) to schedule posts for future times — not immediate posting, actual scheduling.
- **Post URL tracking:** After successfully scheduling, updates the Google Sheet with the published Facebook post URL — creates a complete audit trail and enables easy post performance tracking.
- **Multi-platform ready:** Uses the "Platform" column to filter Facebook posts only — the same Google Sheet can manage Instagram, LinkedIn, and Twitter content by adding more platform-specific workflows.
- **Story post filtering:** Automatically skips posts where Post Type = "Story" (Facebook Stories scheduling is not supported by this workflow) — only processes Feed and Photo posts.

## How it works

### 1. Scheduled trigger fires 4 times daily

A cron trigger runs at 9:35 AM, 10:35 AM, 11:35 AM, and 12:35 PM every day. This catches posts scheduled at different times throughout the morning without needing to run the workflow every minute.

### 2.
Load Facebook credentials

The workflow reads a separate ".env" sheet in your Google Sheets document containing:
- **Facebook Page ID:** Your Facebook Page's unique ID
- **Facebook Page Access Token:** Long-lived access token with pages_manage_posts and pages_read_engagement permissions

These credentials are used for all Facebook Graph API calls later in the workflow.

### 3. Read approved Facebook posts

The workflow reads your main "Post URL" sheet and applies two filters:
- **Approval Status = "Good":** Only processes approved posts
- **Platform = "Facebook":** Filters out Instagram, LinkedIn, etc.

This returns all approved Facebook posts regardless of scheduled date.

### 4. Filter posts scheduled for today

A Code node compares the "Scheduled On" column value against today's date (ignoring the time, just checking the date part). Posts scheduled for today pass through; others are filtered out. Supported date formats:
- "2025-10-30 10:00"
- "2025-10-30 06-42"
- Any format with YYYY-MM-DD at the beginning

### 5. Loop through each post

The Split in Batches node processes one post at a time, preventing API rate limits and ensuring each post is handled individually. If there are 5 approved posts for today, it loops 5 times.

### 6. Platform verification

A Switch node double-checks that Platform = "Facebook" (redundant, but it ensures accuracy). This allows the same workflow structure to be copied for other platforms.

### 7. Story post filtering

An If node checks that Post Type != "Story". Facebook Stories scheduling is not supported in this workflow, so Story posts are skipped and merged back into the loop to continue with the next post.

### 8. Determine post type (text-only vs. photo)

An If node checks whether the "Media URL" column is empty:
- **Empty → Text-only post** (routes to Branch A)
- **Has value → Photo post** (routes to Branch B)

### Branch A: Text-Only Post

#### 9a.
Schedule Facebook text post

HTTP POST request to the Facebook Graph API: `https://graph.facebook.com/v24.0/{page-id}/feed`

Parameters:
- message: Caption text from the Google Sheet
- access_token: From the credentials sheet
- published: false (schedules instead of posting immediately)
- scheduled_publish_time: Unix timestamp converted from the "Scheduled On" field

Example: If Scheduled On = "2025-10-30 14:00", the workflow converts this to a Unix timestamp (1761832800 if interpreted as UTC) and Facebook schedules the post for that exact time.

#### 10a. Update sheet with text post URL

After a successful API response, the workflow constructs the Facebook post URL from the response ID: `https://www.facebook.com/{page-id}/posts/{post-id}`

Then it updates the Google Sheet row:
- **Approval Status:** "Published"
- **Post URL:** Constructed Facebook URL

This marks the post as published and provides a clickable link to view it on Facebook.

### Branch B: Photo Post

#### 9b. Download image from Google Drive

Uses the Media URL (Google Drive sharing link) to download the image file. Supports:
- Direct Google Drive file URLs
- Shared Drive files
- Public or private files (as long as the OAuth account has access)

The image is downloaded as binary data and passed to the next node.

#### 10b. Schedule Facebook photo post

HTTP POST request to the Facebook Graph API: `https://graph.facebook.com/v24.0/{page-id}/photos` (Content-Type: multipart/form-data)

Parameters:
- source: Binary image data (from the Google Drive download)
- caption: Caption text from the Google Sheet
- access_token: From the credentials sheet
- published: false (schedules instead of posting immediately)
- scheduled_publish_time: Unix timestamp + 15-minute buffer

Note: Photo posts get an extra 15-minute buffer in the scheduled time to account for image processing delays on Facebook's side.

#### 11b.
Update sheet with photo post URL

After a successful API response, constructs the Facebook photo URL: `https://www.facebook.com/photo/?fbid={photo-id}`

Then updates the Google Sheet row:
- **Approval Status:** "Published"
- **Post URL:** Constructed Facebook photo URL

### 12. Merge and loop

All three branches (text posts, photo posts, skipped stories) merge back together. The loop then proceeds to the next post until all approved posts for today are processed.

## Setup requirements

Tools you'll need:
- Active n8n instance (self-hosted or n8n Cloud)
- Google Sheets with OAuth access
- Google Drive with OAuth access
- Facebook Page (not a personal profile)
- Facebook Page Access Token with the proper permissions

Estimated setup time: 30–35 minutes

## Configuration steps

### 1. Create a Facebook Page Access Token

1. Go to the Facebook Developer Console
2. Create an app (or use an existing one)
3. Add the "Facebook Login" product
4. Under Tools → Graph API Explorer: select your Page and request these permissions: pages_manage_posts, pages_read_engagement, publish_to_groups
5. Generate a long-lived access token (follow Facebook's token extension process)
6. Save the Page ID and Access Token

### 2. Set up Google Sheets

Create two sheets in one Google Sheets document:

**Sheet 1: ".env" (credentials)**

| Facebook Page ID | Facebook Page Access Token |
|---|---|
| 123456789 | EAAxxxxxxx... |

**Sheet 2: "Post URL" (content calendar)**

| Scheduled On | Platform | Post Type | Caption | Media URL | Approval Status | Post URL | row_number |
|---|---|---|---|---|---|---|---|
| 2025-10-30 10:00 | Facebook | Photo | Check out our new product! | https://drive.google.com/file/d/xxx | Good | | 1 |
| 2025-10-30 14:00 | Facebook | Feed | Happy Monday everyone! | | Good | | 2 |

Important column details:
- **Scheduled On:** Format must be YYYY-MM-DD HH-MM (24-hour format)
- **Platform:** Must be "Facebook" (case-sensitive)
- **Post Type:** "Feed" (text-only), "Photo" (with image), or "Story" (skipped)
- **Media URL:** Google Drive sharing link (leave empty for text-only posts)
- **Approval Status:** "Good" (publish), "Pending" (hold), "Rejected" (skip)
- **Post URL:** Leave empty (auto-filled after publishing)
- **row_number:** Auto-generated by Google Sheets

### 3. Connect Google Sheets OAuth

1. In n8n: Credentials → Add credential → Google Sheets OAuth2 API
2. Complete the OAuth authentication
3. Open these nodes and select your credential:
   - "Load Facebook Credentials from Sheet"
   - "Read Approved Facebook Posts"
   - "Update Sheet with Photo Post URL"
   - "Update Sheet with Text Post URL"

### 4. Connect Google Drive OAuth

1. In n8n: Credentials → Add credential → Google Drive OAuth2 API
2. Complete the OAuth authentication
3. Open the "Download Image from Google Drive" node and select your Google Drive credential

### 5. Update sheet URLs

Open the following nodes and update the documentId value with your Google Sheets URL:
- **"Load Facebook Credentials from Sheet"** → Point to your .env sheet
- **"Read Approved Facebook Posts"** → Point to your Post URL sheet
- **"Update Sheet with Photo Post URL"** → Point to your Post URL sheet
- **"Update Sheet with Text Post URL"** → Point to your Post URL sheet

### 6. Test with sample posts

1. Add 2 test rows in your Google Sheet:
   - Row 1: Text-only post (no Media URL) scheduled for today
   - Row 2: Photo post (with a Google Drive URL) scheduled for today
2. Set both Approval Status values to "Good"
3. Manually trigger the workflow (or wait for the next scheduled run)
4. Verify:
   - Posts appear in Facebook's Publishing Tools as scheduled
   - The Google Sheet is updated with Post URLs
   - Approval Status changed to "Published"

### 7.
Activate the workflow
- Toggle the workflow to Active
- The workflow will now run automatically at 9:35 AM, 10:35 AM, 11:35 AM, and 12:35 PM daily
- Monitor the first few days to ensure posts are scheduled correctly

Use cases

Social media managers: Schedule 20–30 Facebook posts per week from a centralized Google Sheets calendar. Team members add content, you approve in the sheet, and the workflow handles publishing—no manual Facebook Business Suite logins.

Marketing agencies: Manage 10+ client Facebook Pages from one Google Sheet. Each client gets their own rows, separate Facebook credentials are loaded per Page, and scheduling is automated with URL tracking for client reporting.

Content creators: Batch-create a month of posts in one sitting (captions plus images in Google Drive), mark them "Good" when ready, and let the workflow publish them at the scheduled times—focus on creation, not distribution.

Small businesses: Schedule promotional posts, event announcements, and product launches without paying for Buffer, Hootsuite, or Later. Free automation with Google Sheets as the interface.

E-commerce stores: Schedule new product announcements with product images from Google Drive. The workflow downloads images, posts to Facebook with captions, and tracks URLs for performance analysis.

Agencies with approval workflows: The content team creates posts and marks them "Pending". A manager reviews and changes the status to "Good" or "Rejected". Only approved posts publish—built-in quality control without third-party tools.

Resources
- n8n documentation
- Facebook Graph API
- Facebook Page Access Tokens
- Google Sheets API
- Google Drive API
- n8n Schedule Trigger
- n8n Google Sheets node

Support

Need help or custom development?
📧 Email: info@isawow.com
🌐 Website: https://isawow.com/
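The photo-post branch described above can be sketched as a small Code-node function. This is a hedged illustration, not the template's actual node: it assumes the Graph API photo upload returns `{ id: "<photo-id>" }` and uses the sheet column names from the table above; `buildPhotoPostUpdate` is a hypothetical helper name.

```javascript
// Sketch of the "Update sheet with photo post URL" step.
// Assumption: a successful photo upload returns an object with an `id` field.
function buildPhotoPostUpdate(apiResponse, rowNumber) {
  // Construct the public photo URL from the returned photo id.
  const photoUrl = `https://www.facebook.com/photo/?fbid=${apiResponse.id}`;
  return {
    row_number: rowNumber,            // which sheet row to update
    'Approval Status': 'Published',   // mark the post as done
    'Post URL': photoUrl,             // auto-filled for reporting
  };
}

// Example: a successful upload for the photo post in row 1.
const update = buildPhotoPostUpdate({ id: '1234567890' }, 1);
console.log(update['Post URL']); // https://www.facebook.com/photo/?fbid=1234567890
```

In the workflow, the Google Sheets update node would receive this object and match on `row_number` to fill the last two columns.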
by vinci-king-01
Customer Support Analysis Dashboard with AI and Automated Insights

🎯 Target Audience
- Customer support managers and team leads
- Customer success teams monitoring satisfaction
- Product managers analyzing user feedback
- Business analysts measuring support metrics
- Operations managers optimizing support processes
- Quality assurance teams monitoring support quality
- Customer experience (CX) professionals

🚀 Problem Statement

Manual analysis of customer support tickets and feedback is time-consuming and often misses critical patterns or emerging issues. This template solves the challenge of automatically collecting, analyzing, and visualizing customer support data to identify trends, improve response times, and enhance overall customer satisfaction.

🔧 How it Works

This workflow automatically monitors customer support channels using AI-powered analysis, processes tickets and feedback, and provides actionable insights for improving customer support operations.

Key Components
1. Scheduled Trigger - Runs the workflow at specified intervals to maintain real-time monitoring
2. AI-Powered Ticket Analysis - Uses advanced NLP to categorize, prioritize, and analyze support tickets
3. Multi-Channel Integration - Monitors email, chat, help desk systems, and social media
4. Automated Insights - Generates reports on trends, response times, and satisfaction scores
5. Dashboard Integration - Stores all data in Google Sheets for comprehensive analysis and reporting

📊 Google Sheets Column Specifications

The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| timestamp | DateTime | When the ticket was processed | "2024-01-15T10:30:00Z" |
| ticket_id | String | Unique ticket identifier | "SUP-2024-001234" |
| customer_email | String | Customer contact information | "john@example.com" |
| subject | String | Ticket subject line | "Login issues with new app" |
| description | String | Full ticket description | "I can't log into the mobile app..." |
| category | String | AI-categorized ticket type | "Technical Issue" |
| priority | String | Calculated priority level | "High" |
| sentiment_score | Number | Customer sentiment (-1 to 1) | -0.3 |
| urgency_indicator | String | Urgency classification | "Immediate" |
| response_time | Number | Time to first response (hours) | 2.5 |
| resolution_time | Number | Time to resolution (hours) | 8.0 |
| satisfaction_score | Number | Customer satisfaction rating | 4.2 |
| agent_assigned | String | Support agent name | "Sarah Johnson" |
| status | String | Current ticket status | "Resolved" |

🛠️ Setup Instructions

Estimated setup time: 20-25 minutes

Prerequisites
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Google Sheets account with API access
- Help desk system API access (Zendesk, Freshdesk, etc.)
- Email service integration (optional)

Step-by-Step Configuration

1. Install Community Nodes

Install the required community nodes:

npm install n8n-nodes-scrapegraphai
npm install n8n-nodes-slack

2. Configure ScrapeGraphAI Credentials
- Navigate to Credentials in your n8n instance
- Add new ScrapeGraphAI API credentials
- Enter your API key from the ScrapeGraphAI dashboard
- Test the connection to ensure it's working

3. Set up Google Sheets Connection
- Add Google Sheets OAuth2 credentials
- Grant the necessary permissions for spreadsheet access
- Create a new spreadsheet for customer support analysis
- Configure the sheet name (default: "Support Analysis")

4. Configure Support System Integration
- Update the websiteUrl parameters in the ScrapeGraphAI nodes
- Add URLs for your help desk system or support portal
- Customize the user prompt to extract specific ticket data
- Set up categories and priority thresholds

5. Set up Notification Channels
- Configure Slack webhook or API credentials for alerts
- Set up email service credentials for critical issues
- Define alert thresholds for different priority levels
- Test notification delivery

6.
Configure Schedule Trigger
- Set the analysis frequency (hourly, daily, etc.)
- Choose appropriate time zones for your business hours
- Consider support system rate limits

7. Test and Validate
- Run the workflow manually to verify all connections
- Check Google Sheets for proper data formatting
- Test ticket analysis with sample data

🔄 Workflow Customization Options

Modify Analysis Targets
- Add or remove support channels (email, chat, social media)
- Change ticket categories and priority criteria
- Adjust analysis frequency based on ticket volume

Extend Analysis Capabilities
- Add more sophisticated sentiment analysis
- Implement customer churn prediction models
- Include agent performance analytics
- Add automated response suggestions

Customize Alert System
- Set different thresholds for different ticket types
- Create tiered alert systems (info, warning, critical)
- Add SLA breach notifications
- Include trend analysis alerts

Output Customization
- Add data visualization and reporting features
- Implement support trend charts and graphs
- Create executive dashboards with key metrics
- Add customer satisfaction trend analysis

📈 Use Cases
- **Support Ticket Management**: Automatically categorize and prioritize tickets
- **Response Time Optimization**: Identify bottlenecks in support processes
- **Customer Satisfaction Monitoring**: Track and improve satisfaction scores
- **Agent Performance Analysis**: Monitor and improve agent productivity
- **Product Issue Detection**: Identify recurring problems and feature requests
- **SLA Compliance**: Ensure support teams meet service level agreements

🚨 Important Notes
- Respect support system API rate limits and terms of service
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update your analysis parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Consider data privacy and GDPR compliance for customer data

🔧 Troubleshooting

Common Issues:
- ScrapeGraphAI connection errors: Verify API key and account status
- Google Sheets permission errors: Check OAuth2 scope and permissions
- Ticket parsing errors: Review the Code node's JavaScript logic
- Rate limiting: Adjust analysis frequency and implement delays
- Alert delivery failures: Check notification service credentials

Support Resources:
- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Help desk system API documentation
- Customer support analytics best practices
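The tiered alert idea (info, warning, critical) combined with the sentiment_score and response_time columns above can be sketched as a small Code-node function. This is an illustrative assumption, not the template's actual logic; the thresholds and the `classifyTicket` helper name are placeholders to adapt.

```javascript
// Hypothetical tiered-alert classifier over the sheet columns above.
// Thresholds are assumptions: tune them to your own SLA targets.
function classifyTicket(ticket) {
  // Very negative sentiment or a badly missed first response is critical.
  if (ticket.sentiment_score <= -0.5 || ticket.response_time > 24) return 'critical';
  // Mildly negative sentiment or a slow first response warrants a warning.
  if (ticket.sentiment_score < 0 || ticket.response_time > 8) return 'warning';
  return 'info';
}

// The example row above (sentiment -0.3, first response in 2.5h):
const level = classifyTicket({ sentiment_score: -0.3, response_time: 2.5 });
console.log(level); // "warning"
```

A Switch node downstream could then route "critical" tickets to Slack and email while "info" results are only logged to the sheet.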
by vinci-king-01
Competitor Price Monitoring Dashboard with AI and Real-time Alerts

🎯 Target Audience
- E-commerce managers and pricing analysts
- Retail business owners monitoring competitor pricing
- Marketing teams tracking market positioning
- Product managers analyzing the competitive landscape
- Data analysts conducting pricing intelligence
- Business strategists making pricing decisions

🚀 Problem Statement

Manual competitor price monitoring is inefficient and often leads to missed opportunities or delayed responses to market changes. This template solves the challenge of automatically tracking competitor prices, detecting significant changes, and providing actionable insights for strategic pricing decisions.

🔧 How it Works

This workflow automatically monitors competitor product prices using AI-powered web scraping, analyzes price trends, and sends real-time alerts when significant changes are detected.

Key Components
1. Scheduled Trigger - Runs the workflow at specified intervals to maintain up-to-date price data
2. AI-Powered Scraping - Uses ScrapeGraphAI to intelligently extract pricing information from competitor websites
3. Price Analysis Engine - Processes historical data to detect trends and anomalies
4. Alert System - Sends notifications via Slack and email when price changes exceed thresholds
5. Dashboard Integration - Stores all data in Google Sheets for comprehensive analysis and reporting

📊 Google Sheets Column Specifications

The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| timestamp | DateTime | When the price was recorded | "2024-01-15T10:30:00Z" |
| competitor_name | String | Name of the competitor | "Amazon" |
| product_name | String | Product name and model | "iPhone 15 Pro 128GB" |
| current_price | Number | Current price in USD | 999.00 |
| previous_price | Number | Previous recorded price | 1099.00 |
| price_change | Number | Absolute price difference | -100.00 |
| price_change_percent | Number | Percentage change | -9.09 |
| product_url | URL | Direct link to product page | "https://amazon.com/iphone15" |
| alert_triggered | Boolean | Whether alert was sent | true |
| trend_direction | String | Price trend analysis | "Decreasing" |

🛠️ Setup Instructions

Estimated setup time: 15-20 minutes

Prerequisites
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Google Sheets account with API access
- Slack workspace for notifications (optional)
- Email service for alerts (optional)

Step-by-Step Configuration

1. Install Community Nodes

Install the required community nodes:

npm install n8n-nodes-scrapegraphai
npm install n8n-nodes-slack

2. Configure ScrapeGraphAI Credentials
- Navigate to Credentials in your n8n instance
- Add new ScrapeGraphAI API credentials
- Enter your API key from the ScrapeGraphAI dashboard
- Test the connection to ensure it's working

3. Set up Google Sheets Connection
- Add Google Sheets OAuth2 credentials
- Grant the necessary permissions for spreadsheet access
- Create a new spreadsheet for price monitoring data
- Configure the sheet name (default: "Price Monitoring")

4. Configure Competitor URLs
- Update the websiteUrl parameters in the ScrapeGraphAI nodes
- Add URLs for each competitor you want to monitor
- Customize the user prompt to extract specific pricing data
- Set appropriate price thresholds for alerts

5. Set up Notification Channels
- Configure Slack webhook or API credentials
- Set up email service credentials (SendGrid, SMTP, etc.)
- Define alert thresholds and notification preferences
- Test notification delivery

6. Configure Schedule Trigger
- Set the monitoring frequency (hourly, daily, etc.)
- Choose appropriate time zones for your business hours
- Consider competitor website rate limits

7.
Test and Validate
- Run the workflow manually to verify all connections
- Check Google Sheets for proper data formatting
- Test alert notifications with sample data

🔄 Workflow Customization Options

Modify Monitoring Targets
- Add or remove competitor websites
- Change product categories or specific products
- Adjust monitoring frequency based on market volatility

Extend Price Analysis
- Add more sophisticated trend analysis algorithms
- Implement price prediction models
- Include competitor inventory and availability tracking

Customize Alert System
- Set different thresholds for different product categories
- Create tiered alert systems (info, warning, critical)
- Add SMS notifications for urgent price changes

Output Customization
- Add data visualization and reporting features
- Implement price history charts and graphs
- Create executive dashboards with key metrics

📈 Use Cases
- **Dynamic Pricing**: Adjust your prices based on competitor movements
- **Market Intelligence**: Understand competitor pricing strategies
- **Promotion Planning**: Time your promotions based on competitor actions
- **Inventory Management**: Optimize stock levels based on market conditions
- **Customer Communication**: Proactively inform customers about price changes

🚨 Important Notes
- Respect competitor websites' terms of service and robots.txt
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update your monitoring parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Consider the legal implications of automated price monitoring

🔧 Troubleshooting

Common Issues:
- ScrapeGraphAI connection errors: Verify API key and account status
- Google Sheets permission errors: Check OAuth2 scope and permissions
- Price parsing errors: Review the Code node's JavaScript logic
- Rate limiting: Adjust monitoring frequency and implement delays
- Alert delivery failures: Check notification service credentials

Support Resources:
- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Slack API documentation for notification setup
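The arithmetic behind the price_change, price_change_percent, alert_triggered, and trend_direction columns above can be written out as a short Code-node sketch. The 5% default threshold and the `comparePrices` helper name are assumptions for illustration; the template lets you set your own thresholds.

```javascript
// Sketch of the price-comparison math for the columns described above.
function comparePrices(currentPrice, previousPrice, thresholdPercent = 5) {
  const change = currentPrice - previousPrice;
  const changePercent = (change / previousPrice) * 100;
  return {
    price_change: Number(change.toFixed(2)),
    price_change_percent: Number(changePercent.toFixed(2)),
    // Alert when the absolute move exceeds the configured threshold.
    alert_triggered: Math.abs(changePercent) >= thresholdPercent,
    trend_direction: change < 0 ? 'Decreasing' : change > 0 ? 'Increasing' : 'Stable',
  };
}

// The iPhone example row: 1099.00 → 999.00 is roughly a 9% drop, so an alert fires.
console.log(comparePrices(999.0, 1099.0));
```

An IF node checking `alert_triggered` would then decide whether the Slack/email branch runs before the row is appended to the sheet.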
by Max aka Mosheh
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

How it works
• Publishes content to 9 social platforms (Instagram, YouTube, TikTok, Facebook, LinkedIn, Threads, Twitter/X, Bluesky, Pinterest) from a single Airtable base
• Automatically uploads media to Blotato, handles platform-specific requirements (YouTube titles, Pinterest boards), and tracks success/failure for each post
• Includes smart features like GPT-powered YouTube title optimization, a Pinterest Board ID finder tool, and random delays to avoid rate limits

Set up steps
• Takes ~20–35 minutes to configure all 9 platforms (or less if you only need specific ones)
• Requires an Airtable personal access token, a Blotato API key, and connecting your social accounts in the Blotato dashboard
• The workflow includes comprehensive sticky notes with step-by-step Airtable base setup, credential configuration, platform ID locations, and quick debugging links for each social network

Pro tip: The workflow is modular - you can disable any platforms you don't use by deactivating their respective nodes, making it flexible for any social media strategy, from single-platform to full omnichannel publishing.
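The "random delays to avoid rate limits" feature mentioned above could look like this in an n8n Code node between posting steps. The 5–30 second range and the helper names are assumptions; the actual limits of each platform and of Blotato may differ.

```javascript
// Hypothetical jittered-delay helper, as might sit between posting nodes.
function randomDelayMs(minSeconds = 5, maxSeconds = 30) {
  // Uniform random delay within [minSeconds, maxSeconds) seconds.
  return Math.floor(minSeconds * 1000 + Math.random() * (maxSeconds - minSeconds) * 1000);
}

// Wrap a posting call so consecutive platforms don't fire back-to-back.
async function withJitter(postFn) {
  const delay = randomDelayMs();
  await new Promise((resolve) => setTimeout(resolve, delay));
  return postFn();
}
```

Spacing out the nine platform calls this way keeps bursts below per-minute API quotas without needing a full queueing system.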
by Yang
Who’s it for

This template is perfect for content marketers, social media managers, and creators who want to repurpose YouTube videos into platform-specific posts without manual work. If you spend hours brainstorming captions, resizing content, or creating images for different platforms, this workflow automates the entire process from video selection to ready-to-publish posts.

What it does

The workflow takes a topic from a Google Sheet, finds the most relevant and recent YouTube video using Dumpling AI and GPT-4o, then automatically generates unique posts for Instagram, Facebook, and LinkedIn. Each post comes with a tailored AI-generated image, and all content is saved back into a Google Sheet for easy scheduling and review.

Here’s what happens step by step:
1. Picks an unsearched topic from Google Sheets
2. Searches YouTube via Dumpling AI and sorts videos
3. Uses GPT-4o to select the most relevant video
4. Extracts the video transcript using Dumpling AI
5. Generates three platform-specific posts using GPT-4o
6. Creates matching images for each post using Dumpling AI image generation
7. Saves the final Instagram, Facebook, and LinkedIn posts into a Google Sheet
8. Marks the topic as processed so it won’t repeat

How it works
- Scheduled Trigger: Starts the workflow automatically on a set schedule
- Google Sheets: Retrieves one unprocessed topic from the YouTube Topics sheet
- Dumpling AI: Finds and filters YouTube videos matching the topic
- GPT-4o: Chooses the best video and turns the transcript into three unique posts
- Dumpling AI (Image): Generates platform-specific visuals for each post
- Google Sheets: Saves all posts and images to the Social Media Post sheet for publishing

Requirements
- ✅ Dumpling AI API key stored as credentials
- ✅ OpenAI GPT-4o credentials
- ✅ Google Sheets connection with the following sheets:
  - YouTube Topics with columns Youtube Topics and Searched?
  - Social Media Post with columns platform, Content, Image

How to customize
- Adjust the GPT prompt to match your brand voice or content style
- Add or remove platforms depending on your posting strategy
- Change the schedule trigger frequency to fit your content calendar
- Integrate with scheduling tools like Buffer or Hootsuite for auto-publishing
- Add review or approval steps before posts are finalized

> This workflow helps you transform a single YouTube video into three polished, platform-ready posts with matching visuals, in minutes—not hours.
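The "pick one unsearched topic" step described above can be sketched in Code-node JavaScript. Column names come from the sheet structure listed in the requirements; treating "yes" as the processed marker and the `pickUnsearchedTopic` helper name are assumptions.

```javascript
// Hypothetical sketch: given rows from the YouTube Topics sheet,
// return the first topic not yet marked as searched.
function pickUnsearchedTopic(rows) {
  return rows.find((row) => String(row['Searched?'] || '').toLowerCase() !== 'yes') || null;
}

const rows = [
  { 'Youtube Topics': 'AI agents', 'Searched?': 'yes' },
  { 'Youtube Topics': 'RAG pipelines', 'Searched?': '' },
];
console.log(pickUnsearchedTopic(rows)['Youtube Topics']); // "RAG pipelines"
```

After the posts are generated, the same row's Searched? cell would be set to "yes" so the next scheduled run moves on to a fresh topic.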
by Yang
Who's it for

This workflow is perfect for marketers, social media managers, recruiters, sales teams, and researchers who need to collect and organize public profile data from TikTok and LinkedIn. Whether you're building influencer databases, enriching CRM data, conducting competitor research, or gathering prospect information, this workflow automates the entire data extraction and storage process.

What it does

This AI-powered Telegram bot automatically scrapes public profile data from TikTok and LinkedIn, then saves it directly to Google Sheets. Simply send a TikTok username or LinkedIn profile URL via text or voice message, and the workflow handles everything.

For TikTok profiles:
- Username and verification status
- Follower, following, and friend counts
- Total hearts (likes) and video count
- Bio link and secure user ID

For LinkedIn profiles:
- Full name and profile picture
- Location and follower count
- Bio/about section
- Recent posts activity link

All data is automatically organized into separate Google Sheets tabs for easy reference and analysis. You receive an email notification when extraction is complete.

How it works

The workflow uses an AI Agent as an intelligent router that determines which platform to scrape based on your input. Here's the flow:
1. Input Processing: Send a message via Telegram (text or voice)
2. Voice Transcription: If you send a voice note, OpenAI Whisper transcribes it to text
3. AI Routing: The agent identifies whether the input is a TikTok or LinkedIn profile
4. Profile Scraping: Calls Dumpling AI's specialized scraper for that platform
5. Data Extraction: Parses the profile metrics and details
6. Database Storage: Saves all data to the appropriate Google Sheets tab
7. Confirmation: Sends an email notification when complete

The AI agent ensures proper tool pairing - it always scrapes first, then saves, preventing partial data or errors.
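The routing decision the AI agent makes can be illustrated as plain logic: LinkedIn inputs arrive as profile URLs, TikTok inputs as @usernames or bare handles. This deterministic sketch is an assumption for clarity — the template itself delegates the decision to an LLM agent.

```javascript
// Hypothetical fallback router mirroring the agent's platform decision.
function routeProfileInput(text) {
  const input = text.trim();
  // LinkedIn profile URLs contain the /in/ path segment.
  if (/linkedin\.com\/in\//i.test(input)) {
    return { platform: 'LinkedIn', target: input };
  }
  // Otherwise treat the input as a TikTok handle, stripping a leading "@".
  const handle = input.replace(/^@/, '');
  return { platform: 'TikTok', target: handle };
}

console.log(routeProfileInput('https://www.linkedin.com/in/example').platform); // LinkedIn
console.log(routeProfileInput('@charlidamelio').target); // charlidamelio
```

A rule like this could also serve as a cheap pre-filter before the agent runs, rejecting inputs that match neither pattern.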
Setup Requirements

Accounts & Credentials Needed:
- Telegram Bot Token (create via @BotFather)
- OpenAI API Key (for voice transcription and AI routing)
- Dumpling AI API Key (for profile scraping)
- Google Sheets OAuth2 credentials
- Gmail OAuth2 credentials (for notifications)

Google Sheets Structure:

Create a spreadsheet with two tabs:
- TikTok Tab - Columns: Username, verified, secUid, bioLink, followerCount, followingCount, heartCount, videoCount, friendCount
- LinkedIn Tab - Columns: name, image, location, followers, about, recentPosts, link

How to set up

Step 1: Create Telegram Bot
- Open Telegram and message @BotFather
- Use the /newbot command and follow the prompts
- Save your bot token for later

Step 2: Configure Credentials
- Add the Telegram bot token to the "Receive Telegram Message" node
- Add the OpenAI API key to the "OpenAI Chat Model" and "Transcribe Audio" nodes
- Add Dumpling AI credentials as HTTP Header Auth
- Connect Google Sheets OAuth2
- Connect Gmail OAuth2

Step 3: Set Up Google Sheets
- Create a new Google Spreadsheet
- Create two tabs: "TikTok" and "LinkedIn"
- Add the column headers as specified above
- Copy the spreadsheet ID from the URL

Step 4: Update Workflow
- Replace the Google Sheets document ID in both database saver nodes
- Update the email address in the "Send Completion Email" node
- Remove personal credential names ("Nneka")

Step 5: Test the Workflow
- Activate the workflow
- Message your bot with: "Scrape TikTok profile: @charlidamelio"
- Or try: "Extract this LinkedIn: https://www.linkedin.com/in/example"
- Check your Google Sheets for the data

How to customize

Add More Social Platforms: Create new scraper/saver tool pairs for Instagram, Twitter/X, or YouTube by:
- Adding new HTTP Request Tool nodes for scraping
- Adding corresponding Google Sheets Tool nodes
- Updating the AI Agent's system prompt with new protocols

Enhance Voice Input:
- Add language detection for multilingual voice notes
- Implement speaker identification for team usage
- Add voice response capability

Advanced Data Enrichment:
- Chain multiple profile lookups for followers
- Add sentiment analysis on bios and recent posts
- Implement automatic categorization/tagging

Notification Improvements:
- Send results directly to Telegram instead of email
- Add Slack notifications for team collaboration
- Create detailed extraction reports with statistics

Batch Processing:
- Modify the workflow to accept CSV files with multiple profiles
- Add rate limiting to avoid API throttling
- Implement a queue system for large-scale scraping
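The TikTok tab columns listed above suggest a mapping step between the scraper and the sheet. The response shape used here (a `stats` object holding the counts) is purely an assumption about Dumpling AI's payload; adjust the field paths to match the real response before using this.

```javascript
// Illustrative mapping from an assumed scraped-profile object
// to the TikTok tab columns listed in the setup section.
function toTikTokRow(profile) {
  return {
    Username: profile.username,
    verified: profile.verified,
    secUid: profile.secUid,
    bioLink: profile.bioLink || '',
    followerCount: profile.stats.followerCount,
    followingCount: profile.stats.followingCount,
    heartCount: profile.stats.heartCount,
    videoCount: profile.stats.videoCount,
    friendCount: profile.stats.friendCount,
  };
}
```

Keeping the mapping in one place makes it easy to extend the pattern to new platforms: each added scraper tool gets its own row-builder matched to its tab's columns.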