by Automate With Marc
Viral Marketing Reel & Autopost with Sora2 + Blotato

Create funny, ultra-realistic marketing reels on autopilot using n8n, Sora2, Blotato, and OpenAI. This beginner-friendly template generates a comedic video prompt, creates a 12-second Sora2 video, writes a caption, and auto-posts to Instagram/TikTok — all on a schedule.

🎥 Watch the full step-by-step tutorial: https://www.youtube.com/watch?v=lKZknEzhivo

What this template does

This workflow automates an entire short-form content production pipeline:

- Scheduled Trigger: runs automatically at your chosen time (e.g., every evening at 7 PM).
- AI "Video Prompt Agent": creates a cinematic, funny, 12-second Sora2 text-to-video prompt designed to promote a product (default: Sally's Coffee).
- Insert Row (Data Table): logs each generated video prompt for tracking, reuse, or inspiration.
- Sora2 (via Wavespeed): sends a POST request to generate the video, waits 30 seconds, then polls the prediction endpoint until the video is completed.
- Blotato Integration: uploads the finished video to your connected social account(s) and automatically publishes or schedules the post.
- Caption Generator: uses an AI agent to create an Instagram/TikTok-ready caption with relevant hashtags.

This turns n8n into a hands-free comedic marketing engine that writes, creates, and posts content for you.

Why it's useful

- Create daily or weekly marketing reels without filming, editing, or writing scripts.
- Experiment with new comedic formats, hooks, and product placements in seconds.
- Perfect for small businesses, agencies, creators, and social media managers.
- Demonstrates how to combine AI agents, Sora2, polling, and external posting services inside one workflow.

Requirements

Before running this template, configure:

- OpenAI API key (for the prompt agent & caption model)
- Wavespeed / Sora2 API credentials
- Blotato account connected to Instagram/TikTok (for posting)
- n8n Data Table (optional, or replace with your own)

⚠️ All credentials must be added manually after import.
No real credentials are included in the template.

How it works

1. Schedule Trigger: runs at a fixed time or interval.
2. Video Prompt Agent (LangChain Agent): generates a cinematic, realistic comedic video idea. Built with a detailed system prompt that ensures brand integration (e.g., Sally's Coffee) happens naturally.
3. Insert Row (Data Table): logs each generated prompt so future videos can be referenced or reused.
4. Sora2 POST Request: sends the generated prompt to Sora2 via Wavespeed's /text-to-video endpoint.
5. Wait 30s + GET Sora2 Result: polls the result until data.status === "completed"; continues looping while still "processing".
6. Upload Media (Blotato): uploads the finished video file.
7. Caption Generator: creates a funny, platform-ready Instagram/TikTok caption with hashtags.
8. Create Post (Blotato): publishes (or schedules) the video + caption.

Setup Instructions (Step-by-Step)

1. Import the template into n8n.
2. Open the Video Prompt Agent → review or customize the brand name, style, and humor tone.
3. Add your OpenAI API credentials (used for both prompt generation and caption generation).
4. Add your Wavespeed/Sora2 credentials to the POST and GET nodes.
5. Connect your Blotato credential for uploading and posting.
6. (Optional) Replace the Data Table ID with your own table.
7. Adjust the Schedule Trigger time to your desired posting schedule.
8. Run once manually to confirm: the prompt is generated, the video is created, the caption is written, and the video uploads successfully.
9. Enable the workflow → your daily/weekly comedic autoposter is live.

Customization Ideas

- Change the brand from Sally's Coffee to any business, product, or influencer brand.
- Modify the prompt agent to enforce specific camera styles, settings, or comedic tones.
- Swap posting destinations: Blotato supports multiple networks — configure IG/TikTok/Facebook/YouTube Shorts.
- Add approval steps: insert a Slack/Telegram "Approve before posting" step.
- Add analytics logging: store video URLs, captions, and AI cost estimates.
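The Wait 30s + GET polling step above can be sketched as a small loop. This is a minimal illustration, not Wavespeed's actual client: the `fetchResult` callback, the "failed" status, and the retry cap are assumptions; only the `data.status === "completed"` check comes from the workflow description.

```javascript
// Minimal sketch of the poll-until-complete pattern used by the
// Wait + GET Sora2 Result nodes. `fetchResult` stands in for the HTTP
// Request node (e.g. GET on the prediction endpoint) -- an assumption
// for illustration; adapt the response shape to Wavespeed's API.
async function pollSoraResult(fetchResult, { intervalMs = 30000, maxAttempts = 20 } = {}) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const res = await fetchResult();
    if (res.data.status === "completed") return res.data; // video is ready
    if (res.data.status === "failed") throw new Error("Sora2 generation failed");
    // Mirror the 30-second Wait node before polling again.
    await new Promise(resolve => setTimeout(resolve, intervalMs));
  }
  throw new Error("Timed out waiting for Sora2 video");
}
```

If your videos regularly need more than `maxAttempts × intervalMs` to render, raise either knob (the same fix the Troubleshooting section suggests for videos stuck in processing).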
Troubleshooting

- Sora video stuck in processing: increase the wait time or add another polling loop.
- Upload fails: ensure the media URL exists and the Blotato account has posting permissions.
- Caption empty: reconnect the OpenAI credential or check model availability.
- Posting fails: confirm your Blotato API key is valid and linked to a connected account.

Category: Marketing, AI Video, Social Media Automation
Difficulty: Beginner–Intermediate
Core Nodes: LangChain Agent, HTTP Request, Wait, Data Table, Blotato, OpenAI
Includes: System prompts, polling logic, caption generator, posting workflow
by Davide
This workflow automates the process of collecting, analyzing, and storing Facebook post comments, with AI-powered sentiment analysis, for YOUR Facebook Page.

Typical Use Cases:

- Social media sentiment monitoring
- Brand reputation analysis
- Campaign performance evaluation
- Community management and moderation insights
- Reporting and analytics for marketing teams

Key Advantages

✅ 1. Full Automation: eliminates manual work by automatically collecting and analyzing Facebook comments end-to-end.
✅ 2. AI-Powered Sentiment Analysis: uses Google Gemini to accurately classify user sentiment, enabling deeper insights into audience perception and engagement.
✅ 3. Structured Data Storage: saves results directly into Google Sheets, making the data easy to analyze, share, and visualize with dashboards or reports.
✅ 4. Duplicate-Safe Updates: the "append or update" logic ensures comments are not duplicated and can be refreshed if the sentiment analysis changes.
✅ 5. Scalable and Robust: pagination handling, batch processing, and wait nodes allow the workflow to scale to large volumes of comments without hitting API limits.
✅ 6. Modular Architecture: the use of sub-workflows makes the solution reusable and easy to integrate into larger automation pipelines (e.g., monitoring multiple posts or pages).
✅ 7. Flexible Triggering: can be run manually for testing or automatically as part of a broader workflow ecosystem.

How it works

This workflow fetches Facebook post comments, performs sentiment analysis on each comment, and stores the results in a Google Sheet. It operates in two modes:

- Manual execution mode: starts with a Manual Trigger, where the user enters a Facebook Post ID. The workflow fetches the post details, then retrieves all comments (including pagination). It calls a separate "Facebook" workflow (via the Call 'Facebook' node) to process each comment batch through sentiment analysis and save the results to Google Sheets.
- Triggered execution mode: activated via the "When Executed by Another Workflow" trigger, receiving comment data directly. It splits and batches the incoming comments, processes each through the sentiment analysis model (Google Gemini), and appends/updates records in Google Sheets.

Set up steps

1. Configure Facebook Graph API credentials: add your Facebook Graph API credentials to both the "Get Fb Post" and "Get Fb comments" nodes.
2. Set up Google Gemini API credentials: configure the "Google Gemini Chat Model" node with valid Google PaLM/Gemini API credentials.
3. Prepare the Google Sheet: ensure the sheet exists and is accessible via the Google Sheets OAuth2 credentials. It should have (or will automatically create) the columns POST ID, COMMENT ID, COMMENT, SENTIMENT.
4. Configure the sub-workflow call: ensure the "Call 'Facebook'" node points to a valid, existing workflow that can process comment data.
5. Optional: adjust batch and wait settings. Modify the "Loop Over Items" node if different batch sizes are needed, and adjust the "Wait" node delay if required to avoid rate limits.
6. Activate the workflow: toggle the workflow to active if scheduled or webhook execution is desired, then test using the Manual Trigger with a sample Facebook Post ID.

👉 Subscribe to my new YouTube channel. Here I'll share videos and Shorts with practical tutorials and FREE templates for n8n.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
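The "retrieves all comments (including pagination)" step can be sketched as a cursor-following loop. The Graph API returns comments in pages with a `paging.next` URL; the helper below keeps fetching until no next page remains. `fetchPage` stands in for the HTTP Request node and the exact response shape is an assumption for illustration.

```javascript
// Sketch of the comment-pagination loop described above: follow
// `paging.next` until the API stops returning one. Each page is
// expected to look like { data: [...], paging: { next?: url } }.
async function fetchAllComments(fetchPage, firstUrl) {
  const comments = [];
  let url = firstUrl;
  while (url) {
    const page = await fetchPage(url);
    comments.push(...page.data);
    url = page.paging && page.paging.next; // undefined ends the loop
  }
  return comments;
}
```

In the workflow this accumulation happens across nodes rather than in one function, but the control flow is the same: keep requesting while a next-page cursor exists.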
by Rahul Joshi
📘 Description

This workflow automates end-to-end pre-surgery checklist reminders and confirmation tracking for healthcare operations teams. It ensures patients receive timely preparation instructions, can confirm completion with a single click, and allows staff to monitor confirmation status in real time, without manual coordination.

Every day at 9:00 AM, the workflow fetches surgery events from Google Calendar, extracts patient details from event descriptions, and generates a unique confirmation link for each patient. An AI assistant then creates a personalized, patient-friendly pre-op checklist email (subject + styled HTML body) that includes surgery details and a confirmation button.

When a patient clicks the confirmation link, a webhook captures the confirmation, updates Google Sheets as the source of truth, and records the timestamp. A separate periodic scheduler scans the sheet for patients who have not confirmed within the expected window. If confirmation is missing, the assigned nurse or owner is alerted in Slack with full patient and surgery context for immediate follow-up.

This creates a closed-loop system: reminder → confirmation capture → tracking → escalation.

⚙️ What This Workflow Does (Step-by-Step)

▶️ Daily 9:00 AM Trigger: starts the pre-op reminder cycle automatically each morning.
📅 Fetch Today's Surgery Events (Google Calendar): pulls calendar events and filters only surgery-related entries.
🧾 Extract Patient Details from Event Description: parses patient name, email, phone, ID, procedure, and surgery time.
🔗 Generate Unique Confirmation Link: creates a secure confirmUrl per patient with tracking parameters.
✉️ AI Email Generation (Subject + HTML Body): uses Azure OpenAI (gpt-4o) to generate a calm, professional checklist email with surgery details, a basic preparation checklist, and a green confirmation button linked to the confirmUrl.
📤 Send Pre-Op Reminder via Gmail: delivers the styled HTML email directly to the patient.
✅ Confirmation Webhook (GET /confirm): captures patient clicks, parses query parameters, and marks the checklist as confirmed.
🧾 Upsert Confirmation Status (Google Sheets): stores and updates patient confirmation records as the operational source of truth.
⏳ Periodic Confirmation Check: runs on a schedule to scan all patient rows for missing confirmations.
🚨 Slack Alert for Missing Confirmations: notifies the nurse or owner with full patient context when confirmation is not received in time.

🧩 Prerequisites

• Google Calendar OAuth2
• Gmail OAuth2
• Azure OpenAI (gpt-4o) credentials
• Google Sheets OAuth2
• Slack API credentials
• Publicly accessible webhook URL for confirmation tracking

💡 Key Benefits

✔ Automated pre-op reminders at a fixed daily time
✔ Personalized AI-generated patient emails
✔ One-click confirmation tracking
✔ Google Sheets audit trail for operations
✔ Proactive Slack alerts for missing confirmations
✔ Reduced manual follow-ups and missed preparations

👥 Perfect For

- Hospitals and surgical centers
- Clinic operations teams
- Care coordinators
- Day-surgery facilities
- Healthcare admin automation systems
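The confirmation round trip described above (build a per-patient confirmUrl, then parse it back in the GET /confirm webhook) can be sketched in a few lines. The parameter names `patientId` and `surgeryId` are assumptions for illustration; the template only specifies "tracking parameters", so match these to your own webhook node.

```javascript
// Sketch of the confirmation-link round trip. Both the query parameter
// names and the /confirm path are assumptions; only the pattern
// (unique URL per patient, parsed back on click) comes from the text.
function buildConfirmUrl(baseUrl, patient) {
  const url = new URL("/confirm", baseUrl);
  url.searchParams.set("patientId", patient.id);
  url.searchParams.set("surgeryId", patient.surgeryId);
  return url.toString();
}

// Webhook side: recover the record and stamp the confirmation time,
// which is what gets upserted into the Google Sheet.
function parseConfirmation(requestUrl) {
  const { searchParams } = new URL(requestUrl);
  return {
    patientId: searchParams.get("patientId"),
    surgeryId: searchParams.get("surgeryId"),
    confirmedAt: new Date().toISOString(),
  };
}
```

For production use, consider adding a signed token to the URL so confirmations cannot be forged by guessing patient IDs.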
by Rahul Joshi
Description:

Keep your API documentation accurate and reliable with this n8n automation template. The workflow automatically tests your FAQ content related to authentication and rate limits, evaluating each answer with Azure OpenAI GPT-4o-mini for completeness, edge-case coverage, and technical clarity. It logs all results to Google Sheets, scores FAQs from 0–10, and sends Slack alerts when low-quality answers are detected.

Ideal for API teams, developer relations managers, and technical writers who want to maintain high-quality documentation with zero manual review effort.

✅ What This Template Does (Step-by-Step)

▶️ Manual Trigger or On-Demand Run: start the evaluation anytime you update your FAQs — perfect for regression testing before documentation releases.
📖 Fetch FAQ Q&A from Google Sheets: reads FAQ questions and answers from your designated test sheet (columns A:B). Each Q&A pair becomes a test case for AI evaluation.
🤖 AI Evaluation via GPT-4o-mini: uses Azure OpenAI GPT-4o-mini to evaluate how well each FAQ covers critical aspects of API authentication and rate limiting. The AI provides a numeric score (0–10) and a short explanation.
🔍 Parse & Format AI Results: extracts structured JSON data (Question, Score, Explanation, Timestamp) and prepares it for reporting and filtering.
💾 Save Evaluation to Google Sheets: appends all results to a Results Sheet (A:D), creating a running history of FAQ quality audits.
⚠️ Filter for Low-Scoring FAQs: identifies any FAQ with a score below 7, flagging it as needing review or rewrite.
🔔 Send Slack Alerts for Weak Entries: posts an alert message in your chosen Slack channel, including the question text, the score received, the AI's explanation, and a link to the full results sheet. This ensures your documentation team can quickly address weak or incomplete FAQ answers.
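The parse-and-filter steps above can be sketched as two small functions: one that pulls the structured fields out of the AI's JSON reply, and one that applies the below-7 threshold. The lowercase field names in the AI reply are an assumption; the capitalized output keys mirror the Question/Score/Explanation/Timestamp columns listed in the description.

```javascript
// Sketch of "Parse & Format AI Results": extract the structured fields
// and coerce the score to a number before it is appended to the sheet.
// The input key names (question, score, explanation) are assumptions.
function parseEvaluation(aiReply) {
  const r = JSON.parse(aiReply);
  return {
    Question: r.question,
    Score: Number(r.score),
    Explanation: r.explanation,
    Timestamp: new Date().toISOString(),
  };
}

// Sketch of "Filter for Low-Scoring FAQs": anything below 7 is flagged
// for the Slack alert branch.
const needsReview = results => results.filter(r => r.Score < 7);
```

Coercing the score with `Number()` guards against models that quote numeric values as strings, which would otherwise break a `<` comparison in a filter node.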
🧠 Key Features

🤖 AI-powered FAQ quality scoring (0–10)
📊 Automated tracking of doc health over time
📥 Seamless Google Sheets integration for results storage
⚙️ Slack notifications for underperforming FAQs
🧩 Ideal for continuous documentation improvement

💼 Use Cases

📘 Validate FAQ accuracy before API documentation updates
⚡ Auto-test new FAQ sets during content refresh cycles
🧠 Ensure API rate-limit and auth topics cover all edge cases
📢 Alert documentation owners about weak answers instantly

📦 Required Integrations

- Google Sheets API: for reading and storing FAQs and test results
- Azure OpenAI (GPT-4o-mini): for evaluating FAQ coverage and clarity
- Slack API: for sending quality alerts and notifications

🎯 Why Use This Template?

✅ Ensures API FAQ accuracy and completeness automatically
✅ Replaces tedious manual content reviews with AI scoring
✅ Builds an ongoing record of documentation improvements
✅ Keeps technical FAQs consistent, relevant, and developer-friendly
by oka hironobu
Who is this for

Gym-goers, runners, home workout enthusiasts, and personal trainers who want to track workouts without fiddling with complicated fitness apps. Just type what you did and let AI handle the rest.

What this workflow does

1. Fill in a simple web form with your workout in plain text — something like "Bench press 60kg x10 x3, 5km run 28min, squats bodyweight x20 x4."
2. Google Gemini AI parses each exercise, identifies the muscle groups worked, estimates calories burned, and rates the workout intensity.
3. A Code node validates the AI output and structures it into a clean record.
4. Everything gets saved to a Google Sheet as a running training log.
5. You receive a Slack notification and an email summary with personalized tips and a suggestion for your next session.

How to set up

1. Create a Google Sheet with a "Workouts" tab and add the column headers: Date, Workout, Duration, Feeling, Type, Exercises, Calories, Muscles, Intensity, Tips, Next Suggestion, Notes, Logged At
2. Add your Google Sheets and Gmail credentials in n8n
3. Get a free Gemini API key from Google AI Studio
4. Connect Slack and set the channel ID in the Slack node
5. Update the email address in the Gmail node
6. Activate the workflow and bookmark the form URL

Requirements

- Google Gemini API key (free tier available)
- Google Sheets and Gmail credentials
- Slack workspace with a connected app

How to customize

- Edit the AI prompt to add sport-specific analysis (swimming, cycling, climbing)
- Remove Slack or Gmail if you only need one notification channel
- Add a weekly summary node that aggregates your training volume
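The validation step performed by the Code node can be sketched as below. This is an illustrative guess at the shape of Gemini's output (field names like `exercises` and `calories` are assumptions); the output keys mirror the column headers from the setup instructions.

```javascript
// Sketch of the Code node's validation: reject empty AI output and
// coerce fields into a clean sheet row. Input field names are
// assumptions for illustration; output keys match the sheet columns.
function validateWorkout(ai) {
  if (!Array.isArray(ai.exercises) || ai.exercises.length === 0) {
    throw new Error("AI output missing exercises");
  }
  return {
    Date: ai.date || new Date().toISOString().slice(0, 10),
    Exercises: ai.exercises.join(", "),
    Calories: Number(ai.calories) || 0,       // guard against non-numeric output
    Muscles: (ai.muscles || []).join(", "),
    Intensity: ai.intensity || "unknown",
    "Logged At": new Date().toISOString(),
  };
}
```

Throwing on missing exercises stops the workflow before a blank row reaches the training log, which is easier to debug than silently appending empty data.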
by hayatofujita
Predict churn risk from customer data and send retention emails via OpenAI

👥 Who's it for

This workflow is designed for Customer Success Managers, Growth Teams, and SaaS Business Owners who want to proactively reduce churn using AI. It automates the analysis of customer health and the delivery of personalized retention offers without manual intervention.

🚀 What it does

This template acts as an intelligent retention system that connects your data, AI, and communication channels.

1. Aggregates Data: pulls customer profiles from your CRM, support ticket history via API, and product usage logs from PostgreSQL.
2. Predicts Risk: uses OpenAI to analyze the combined data and calculate a "Churn Risk Score" for each customer.
3. Automates Action: for customers identified as high-risk (score > 0.7), it generates a unique, dynamic discount coupon via Stripe, drafts a highly personalized retention email using OpenAI, and sends the email via Gmail.
4. Tracks Effectiveness: logs all actions to Google Sheets, then checks back (via SendGrid and CRM data) to track email opens and verify whether the customer was retained after 30 days.

⚙️ How to set up

1. Prepare the Google Sheet: create a sheet with columns for customer_id, risk_score, offer_type, email_status, and retention_result.
2. Configure Credentials: set up your credentials for OpenAI, Stripe, Gmail, Google Sheets, SendGrid, and PostgreSQL.
3. API Endpoints: update the HTTP Request nodes to point to your specific CRM and support tool APIs (replace the placeholder URLs).
4. Customize Logic: in the Postgres node, adjust the SQL query to match your product's event table. In the Code node (Offer Decision), define your rules for discounts (e.g., "Give 20% off if MRR > $10k").
5. Activate: the workflow is set to run daily at 3:00 AM. Toggle the Schedule Trigger to Active when ready.
📦 Requirements

- **n8n** (v1.0 or later)
- **OpenAI** API key
- **Google Workspace** (Gmail, Sheets)
- **Stripe** account (for coupon generation)
- **SendGrid** account (for email tracking)
- **PostgreSQL** (or a similar database)
- Access to CRM and support tool APIs

🎨 How to customize the workflow

- **Adjust the AI Prompt**: edit the **OpenAI node** system message to match your brand's tone of voice.
- **Change the Threshold**: modify the **If node** to target customers with a risk score higher or lower than 0.7.
- **Internal Alerts**: instead of emailing the customer directly, replace the **Gmail node** with a **Slack node** to notify your CSM team to reach out manually.
- **Swap Database**: if you use MySQL or Snowflake, simply replace the PostgreSQL node with the corresponding n8n node.
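The Offer Decision logic can be sketched as below, combining the 0.7 high-risk threshold with the example rule from the setup notes ("Give 20% off if MRR > $10k"). The 10% fallback tier is an assumption added for illustration; define your own tiers in the Code node.

```javascript
// Sketch of the Offer Decision Code node: skip customers at or below
// the 0.7 risk threshold, then pick a discount tier by MRR. The 10%
// fallback tier is an illustrative assumption, not part of the template.
function decideOffer(customer) {
  if (customer.riskScore <= 0.7) return null;        // not high-risk: no action
  const percentOff = customer.mrr > 10000 ? 20 : 10; // richer offer for bigger accounts
  return { customerId: customer.id, offerType: `${percentOff}% discount` };
}
```

Returning `null` for low-risk customers lets a downstream If node route them past the Stripe and email branches entirely.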
by Kendra McClanahan
Champion Migration Tracker

Automatically detect when your champion contacts change companies and respond with intelligent, personalized AI outreach before your competitors do.

THE PROBLEM

When champions move to new companies, sales teams lose track and miss high-value opportunities. Manual LinkedIn monitoring doesn't scale, and by the time you notice, the relationship has gone cold.

THE SOLUTION

This workflow automates champion migration tracking end-to-end, combining Explorium's data intelligence with Claude AI agents to maintain relationships and prioritize opportunities.

HOW IT WORKS

1. Automated Job Change Detection
- Uses Explorium person enrichment to detect when champions move companies
- Eliminates manual LinkedIn monitoring
- Triggers immediately when employment changes

2. Intelligent Company Enrichment
- Enriches new companies with Explorium data: firmographics, funding, tech stack, hiring velocity
- Checks if the company already exists in your CRM (Customer vs. Prospect)
- Identifies open opportunities and account owners

3. Multi-Dimensional Opportunity Scoring (0-100)
- ICP Fit (40%): company size, funding stage, revenue, tech stack alignment
- Relationship Strength (40%): past deals influenced, relationship warmth, CRM status
- Timing (20%): days at new company, recent funding/acquisition signals
- Results in a Hot/Warm/Cold priority classification

4. Smart Routing by Context
- Customers: notify the account manager with a congratulations message
- Hot Prospects (75+ score): draft detailed strategic outreach for rep review

5. AI-Powered Personalization
- Claude AI agents generate contextually relevant emails
- References the past relationship, deals influenced, and company intelligence
- Adapts tone and content based on opportunity priority and CRM status

DEMO SETUP (Google Sheets)

This demo uses Google Sheets for simplicity.
For production use, replace with your actual CRM: Salesforce, HubSpot, Pipedrive, or any CRM with an n8n integration.

Important Fields to Consider:

Champions: champion_id, name, email, company, title, last_checked_date, relationship_strength (Hot/Warm/Cold), last_contact_date, deals_influenced, relationship_notes, isChampion (TRUE/FALSE), linkedin_url, explorium_prospect_id

Companies: company_ID, companyName, domain, relationship_type (Customer/Prospect/None), open_opportunity (TRUE/FALSE), opportunity_stage, account_owner, account_owner_email, contractValue, notes, ExploriumBusinessID

REQUIRED CREDENTIALS

- Anthropic API Key: powers the Claude AI agents for email generation
- Explorium API Key: provides person and company enrichment data
- Google Sheets or your CRM (production): data source and logging

SETUP INSTRUCTIONS

1. Connect credentials in n8n Settings → Credentials.
2. Update data sources: replace the Google Sheets nodes with your CRM nodes (or create demo sheets with the structure above).
3. Configure scoring: adjust the ICP scoring criteria in the "Score Company" node to match your ideal customer profile.
4. Test with sample data: run with 2-3 test champions to verify routing and email generation.
5. Schedule Trigger: set it to run daily or weekly based on your needs.

CUSTOMIZATION TIPS

- Scoring Weights: adjust the 40/40/20 weighting in the scoring node to prioritize what matters most to your business.
- Tech Stack Matching: update the relevantTech array with tools your champions likely use.
- Email Tone: modify the Claude prompts to match your brand voice (formal, casual, technical, etc.).
- Routing Logic: add additional branches for specific scenarios (e.g., churned customers, enterprise accounts).
- Agentic Experience: consider adding an agent that sends the email automatically for Cold prospects.
- Integrations: add Slack notifications, CRM updates, or calendar booking links to the output.

BUSINESS VALUE

- Prevent Revenue Leakage: never lose track of champion relationships
- Prioritize Intelligently: focus on the opportunities with the highest potential
- Scale Relationship Building: automate what used to require manual effort
- Act Before Competitors: reach out while champions are still settling into new roles
- Data-Driven Decisions: quantifiable scores replace gut feelings

USE CASES

- Sales Teams: re-engage champions at new prospect companies
- Customer Success: track champions who move to existing accounts
- Account-Based Marketing: identify high-fit accounts through champion networks
- Revenue Operations: automate champion tracking at scale

NOTES

- Production Recommendation: replace Google Sheets with your production CRM for real-time data
- Privacy: all API keys are credential-referenced (not hardcoded) for security
- Explorium Credits: person + company enrichment uses ~2-3 credits per champion
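The multi-dimensional score can be sketched as a weighted sum. The 40/40/20 weights and the 75+ "Hot" cutoff come from the description; the assumption here is that each sub-score is already normalized to 0-100, and the 50 "Warm" boundary is an illustrative choice.

```javascript
// Sketch of the "Score Company" node's 40/40/20 weighting. Sub-scores
// are assumed to be 0-100; the 75+ Hot threshold is from the template,
// the Warm/Cold boundary at 50 is an illustrative assumption.
function scoreOpportunity({ icpFit, relationshipStrength, timing }) {
  const score = 0.4 * icpFit + 0.4 * relationshipStrength + 0.2 * timing;
  const priority = score >= 75 ? "Hot" : score >= 50 ? "Warm" : "Cold";
  return { score: Math.round(score), priority };
}
```

To re-weight (per the Customization Tips), change only the three coefficients; keeping them summing to 1.0 preserves the 0-100 scale.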
by WeblineIndia
AI Investment Idea Generator using n8n, Groq AI, Google Sheets & Stock API

This workflow automatically generates 2–3 high-quality Indian stock investment ideas daily by combining trending stock data with the latest market news, processing it with AI (Groq/OpenAI), avoiding duplicates using Google Sheets, and storing the results for tracking.

Quick Implementation Steps

1. Import the workflow into n8n
2. Add credentials: Stock API (HTTP Header Auth), Google Sheets OAuth2, Groq / OpenAI API
3. Connect your Google Sheet (same structure as defined)
4. Activate the workflow (runs daily at 9 AM)
5. Check your Google Sheet for generated ideas

What It Does

This workflow acts as a mini AI-powered equity research assistant for the Indian stock market. It automatically gathers real-time market insights by pulling:

- Trending stocks (top gainers & losers)
- The latest market-related news (via Google News RSS)

It then processes this data into a structured format and feeds it into an AI agent configured with strict rules to generate actionable, non-generic investment ideas.

To maintain quality and avoid repetition, the workflow:

- Fetches historical ideas from Google Sheets
- Uses memory to avoid recent duplicates
- Ensures each new idea is unique, data-backed, and sector-driven

Finally, it stores the structured results in Google Sheets for easy tracking and analysis.
Who's It For

- Retail investors looking for daily stock ideas
- Financial analysts and research teams
- Algo trading enthusiasts
- Content creators (finance blogs, newsletters)
- Automation enthusiasts using n8n

Requirements

To use this workflow, you need:

- An n8n instance (self-hosted or cloud)
- Stock API access (via HTTP header auth)
- A Google Sheets account (read + append permissions)
- Groq or OpenAI API access
- A Google Sheet with the columns: Date | Title | Stock | Reason | Horizon

How It Works & Set Up

1. Schedule Trigger (runs daily at 9 AM)
2. Fetch trending stocks via API
3. Fetch the latest market news via RSS
4. Merge and format the data
5. Generate investment ideas using AI
6. Validate and parse the output
7. Store results in Google Sheets
8. Handle errors and send alerts

How To Customize Nodes

- Modify the API endpoint in the stock node
- Change the RSS keywords for news
- Adjust the AI prompt rules
- Change the number of generated ideas
- Modify the Google Sheets mapping
- Update the schedule timing

Add-ons (Enhancements)

- Slack/Telegram alerts
- Email digest
- Sentiment analysis
- Performance tracking
- Global market support

Use Case Examples

- Daily investment newsletter
- Personal research assistant
- Trading strategy ideas
- Content creation
- Portfolio tracking

Troubleshooting Guide

| Issue | Possible Cause | Solution |
| --- | --- | --- |
| No stock data | API issue | Check API credentials |
| AI empty output | Model issue | Retry / check logs |
| JSON error | Format issue | Fix prompt output |
| No sheet data | Mapping issue | Verify columns |
| No alerts | Gmail not set | Reconnect Gmail |

Need Help?

For setup, customization, or advanced automation solutions, contact our n8n workflow developers at WeblineIndia.
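The duplicate-avoidance step (fetch historical ideas, skip recent repeats) can be sketched as below. The `Stock` field name follows the sheet columns above; the 30-row lookback window is an illustrative assumption.

```javascript
// Sketch of the dedupe step: compare each candidate idea's stock
// symbol against the most recent rows pulled from the sheet and keep
// only unseen ones. The lookback window size is an assumption.
function filterNewIdeas(candidates, historicalRows, lookback = 30) {
  const recent = new Set(historicalRows.slice(-lookback).map(r => r.Stock));
  return candidates.filter(idea => !recent.has(idea.Stock));
}
```

A symbol-level check like this is stricter than comparing idea titles, which an LLM can rephrase while still recommending the same stock.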
by Rahul Joshi
📘 Description

This workflow automates keyword-level SEO research and content opportunity discovery using live Google SERP data and AI-driven analysis. It takes a single keyword request, pulls real-time search results for the India market, converts raw SERP data into a structured SEO dataset, analyzes search intent and competition, identifies content gaps and high-impact opportunities, and delivers client-ready insights via email while logging results for tracking and audits.

Instead of manual keyword research, competitive scanning, and reporting, the system derives actionable SEO strategy directly from live search behavior. Outputs are structured for UI consumption, professional email delivery, and historical storage in Google Sheets. Any workflow failure triggers an automated Slack alert with diagnostic details.

This workflow replaces manual SEO research, gap analysis, reporting, and documentation with a repeatable, automated SEO intelligence pipeline.

⚙️ What This Workflow Does (Step-by-Step)

🟢 Receive SEO Keyword Analysis Request via Webhook: accepts a POST request containing the target keyword for SEO analysis.
🧹 Extract Keyword from Request Payload: cleans and isolates the keyword field for SERP processing.
🌐 Run Google SERP Search for Keyword (India): executes a live Google search via SerpAPI with India as the target region. Fetches organic search results, related searches, video SERP features, and result count metadata.
🧩 Normalize SERP Results into SEO Dataset: transforms raw SERP output into a structured dataset containing rankings, titles, snippets, and sources; video presence and platforms; and related-search intent signals.
🧠 Analyze Keyword SEO Opportunities Using AI: uses GPT-4o to determine search intent, competition level, content gaps, high-impact content opportunities, and recommended content formats. Returns strictly structured JSON output.
🧪 Parse Market Analysis Output JSON: validates and enforces the predefined SEO analysis schema.
🔄 Flatten AI Output for Downstream Use: removes nested AI structures to simplify reporting and UI usage.
🧭 Map SEO Fields for UI & Reporting: aligns SEO insights to UI-friendly keys, including confidence scoring and reporting fields.
📧 Generate Client-Ready SEO Insights Email Using AI: converts the SEO analysis into a professional HTML email containing the keyword & country context, a search-intent explanation, the competition level, content gaps, top content opportunities with difficulty, recommended formats, and an overall confidence score.
📤 Send SEO Opportunity Report via Email: delivers the formatted SEO opportunity report to the configured recipient via Gmail.
📊 Log SEO Analysis Result to Google Sheets: stores the keyword, intent, competition, confidence, gaps, and opportunities for tracking, audits, and historical SEO analysis.
🚨 Error Handler Trigger → Slack Alert: any workflow failure sends an automated Slack alert with the node name, error message, and timestamp.

🧩 Prerequisites

• SerpAPI account
• OpenAI API key
• Gmail OAuth credentials
• Google Sheets OAuth access
• Slack API credentials
• Valid webhook endpoint for keyword submission

💡 Key Benefits

✔ Automates keyword-level SEO research using live SERP data
✔ Identifies content gaps and opportunities based on real search behavior
✔ Produces client-ready SEO insight reports automatically
✔ Eliminates manual competitor analysis and documentation
✔ Logs SEO intelligence for long-term tracking and audits
✔ Provides immediate error visibility through Slack alerts

👥 Perfect For

- SEO agencies delivering keyword opportunity reports
- Content teams planning data-driven content calendars
- Founders validating SEO demand before investing in content
- Marketing teams prioritizing high-impact keywords
- Operators needing repeatable SEO research workflows
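The "Normalize SERP Results" step can be sketched as a simple mapping over organic results. The input shape follows SerpAPI's `organic_results` naming (title, link, snippet), which is treated here as an assumption rather than a guaranteed contract; only the rankings/titles/snippets/sources output comes from the description.

```javascript
// Sketch of the SERP normalization: reduce raw organic results to the
// rank/title/snippet/source dataset the AI analysis consumes. The
// `organic_results` input shape is an assumption for illustration.
function normalizeSerp(raw) {
  return (raw.organic_results || []).map((r, i) => ({
    rank: i + 1,                                      // position in the SERP
    title: r.title,
    snippet: r.snippet || "",
    source: r.link ? new URL(r.link).hostname : "",   // competing domain
  }));
}
```

Reducing each result to a domain (rather than a full URL) makes the downstream competition analysis less noisy, since the AI sees one competitor per row.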
by Samir Saci
Tags: Image Compression, Tinify API, TinyPNG, SEO Optimisation, E-commerce, Marketing

Context

Hi! I'm Samir Saci, a Supply Chain Engineer and Data Scientist based in Paris, and founder of LogiGreen. I built this workflow for an agency specialising in e-commerce, to automate the daily compression of the images stored in a Google Drive folder.

This is particularly useful when managing large libraries of product photos, website assets, or marketing visuals that need to stay lightweight for SEO, website performance, or storage optimisation.

> Test this workflow with the free tier of the API!

📬 For business inquiries, you can find me on LinkedIn

Who is this template for?

This template is designed for:

- E-commerce managers who need to keep product images optimised
- Marketing teams handling large volumes of visuals
- Website owners wanting automatic image compression for SEO
- Anyone using Google Drive to store images that gradually become too heavy

What does this workflow do?

This workflow acts as an automated image compressor and reporting system using Tinify, Google Drive, and Gmail.

1. Runs every day at 08:00 using a Schedule Trigger
2. Fetches all images from the Google Drive Input folder
3. Downloads each file and sends it to the Tinify API for compression
4. Downloads the optimised image and saves it to the Compressed folder
5. Moves the original file to the Original Images archive
6. Logs fileName, originalSize, compressedSize, imageId, outputUrl, and processingId into a Data Table
7. After processing, retrieves all logs for the current batch
8. Generates a clean HTML report summarising the compression results
9. Sends the report via Gmail, including the total space saved

Here is an example from my personal folder:

Here is the report generated for these images:

P.S.: You can customise the report to match your company branding or visual identity.
🎥 Tutorial

A complete tutorial (with explanations of every node) is available on YouTube:

Next Steps

Before running the workflow, follow the sticky notes and configure the following:

1. Get your Tinify API key for the free tier here: Get your key
2. Replace the Google Drive folder IDs in: Input, Compressed, and Original Images
3. Replace the Data Table reference with your own (fields required: fileName, originalSize, compressedSize, imageId, outputUrl, processingId)
4. Add your Tinify API key in the HTTP Basic Auth credentials
5. Set up your Gmail credentials and recipient email
6. (Optional) Customise the HTML report in the Generate Report Code node
7. (Optional) Adjust the daily schedule to your preferred time

Submitted: 18 November 2025
Template designed with n8n version 1.116.2
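The report maths in the Generate Report Code node can be sketched as below: given the per-file log rows (the fileName/originalSize/compressedSize fields from the Data Table, with sizes assumed to be in bytes), compute the totals the HTML report summarises, including the total space saved.

```javascript
// Sketch of the batch summary behind the HTML report: totals and the
// "total space saved" figure from the Data Table rows. Sizes are
// assumed to be in bytes.
function summarizeBatch(rows) {
  const originalTotal = rows.reduce((sum, r) => sum + r.originalSize, 0);
  const compressedTotal = rows.reduce((sum, r) => sum + r.compressedSize, 0);
  const saved = originalTotal - compressedTotal;
  return {
    files: rows.length,
    savedBytes: saved,
    savedPercent: originalTotal ? Math.round((saved / originalTotal) * 100) : 0,
  };
}
```

Guarding the percentage against an empty batch (`originalTotal === 0`) keeps the daily report from emitting `NaN` on days with no new images.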
by riandra
## Description
This n8n template automatically monitors news sources daily, analyzes article sentiment using AI, and delivers structured intelligence reports to your team — all without any manual reading. It uses MrScraper to discover and extract articles, GPT-4o-mini to score sentiment and flag urgent issues, and delivers results to both Notion (for archiving) and Slack (for real-time alerts and daily digests).

Whether you're tracking brand reputation, monitoring a competitor, or staying on top of industry trends, this workflow turns the open web into a fully automated radar system that runs every morning before your team starts their day.

## How It Works
- **Phase 1 – Trigger & Config:** A Schedule Trigger fires daily at 8AM. The workflow reads your list of target news source URLs from a Google Sheet, then loops through each source one by one.
- **Phase 2 – URL Discovery:** For each news source, the Map Agent crawls the page and extracts individual article URLs. URLs are filtered using include patterns to keep only actual article links, deduplicated, and capped at your configured maxArticles limit.
- **Phase 3 – Article Extraction:** Each article URL is processed by the General Agent, which extracts the full content, including title, body text, author, and publication date. Articles with fewer than 50 words or that don't mention your defined brand/topic keywords are automatically filtered out and skipped.
- **Phase 4 – AI Sentiment Analysis:** Each relevant article is sent to GPT-4o-mini with a structured prompt. The model returns a complete JSON analysis including a sentiment label (positive/neutral/negative), a sentiment score from -1 to 1, a 2–3 sentence summary, key topics, a tone classification, the most impactful quote, and whether the article requires an urgent response.
- **Phase 5 – Storage & Reporting:** Every analyzed article is saved as a page in your Notion database with full metadata. If an article is flagged as action-required, an immediate Slack alert is fired.
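Because Phase 4 depends on the model returning well-formed JSON, it is worth validating the payload before it reaches Notion. A hedged sketch of that check — the key names (`sentiment`, `sentiment_score`, etc.) are assumptions mirroring the analysis fields described above, not the template's actual prompt schema:

```python
import json

# Assumed key names; adjust to match the fields your prompt actually requests
REQUIRED = {"sentiment", "sentiment_score", "summary", "action_required"}

def parse_analysis(raw):
    """Parse and sanity-check the model's JSON analysis.

    Returns the dict on success, or None if the payload is malformed,
    so a bad model response never crashes the run.
    """
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        return None
    if not REQUIRED.issubset(data):
        return None
    if data["sentiment"] not in ("positive", "neutral", "negative"):
        return None
    if not -1.0 <= data["sentiment_score"] <= 1.0:
        return None
    return data

good = parse_analysis('{"sentiment": "negative", "sentiment_score": -0.7, '
                      '"summary": "Critical review.", "action_required": true}')
bad = parse_analysis('{"sentiment": "angry", "sentiment_score": 5}')
```

Returning `None` instead of raising mirrors the template's "never breaks" behaviour: a malformed response simply skips the article.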
At the end of each run, a Daily Digest is compiled with sentiment breakdowns, average scores, flagged items, and top article summaries — then posted to your Slack channel.

## How to Set Up
1. Create 2 scrapers in your MrScraper account:
   - Map Agent Scraper (for discovering article URLs from news pages)
   - General Agent Scraper (for extracting full article content: title, body, date, author)
   - Copy the scraperId for each.
2. Enable AI Scraper API access in your MrScraper account settings.
3. Prepare your Google Sheet:
   - Create a sheet with a Domain column listing the news source URLs you want to monitor
   - Add as many sources as needed — each will be looped through on every run
4. Add your credentials in n8n: MrScraper API token, OpenAI API key, Slack OAuth, Notion OAuth.
5. Configure your Notion database with these properties: Title, Source URL, Sentiment (select), Sentiment Score (number), Tone (select), Key Topics (text), Summary (text), Action Required (checkbox), Action Reason (text), Published At (text), Scraped At (text).
6. Update the workflow settings:
   - Set your brandName and brandKeywords (comma-separated keywords to filter relevant articles)
   - Set your mapScraperId and generalScraperId
   - Set your slackChannel name (e.g. #brand-monitoring)
   - Set your notionDatabaseId (from your Notion database URL)
   - Set maxArticles to control how many articles are processed per source per run
   - Adjust the Schedule Trigger to your preferred run time (default: every 24 hours)
   - Customize the topic/brand keywords inside the Pick Best Content + Filter Irrelevant code node to match your monitoring targets

## Requirements
- **MrScraper** account with API access enabled
- **OpenAI** account (GPT-4o-mini used by default)
- **Slack** workspace with OAuth connected
- **Notion** workspace with a database set up and OAuth connected
- **Google Sheets** (OAuth2 connected) for storing your list of news sources

## Good to Know
GPT-4o-mini is used by default to keep costs low — processing costs approximately $0.0001 per article.
Switch to GPT-4o only if you need higher analysis quality. The workflow has two distinct Slack outputs: an immediate urgent alert fires the moment a high-priority article is detected, while the Daily Digest summarizes the entire run at the end. Articles that are too short (under 50 words) or that don't match your defined keywords are silently skipped — the workflow never breaks even if a source returns no usable content. Sentiment scores range from -1.0 (strongly negative) to 1.0 (strongly positive), making it easy to track trends over time in Notion or a connected dashboard.

## Customising This Workflow
- **Track multiple brands or topics:** Duplicate the keyword filter and AI prompt steps to run parallel analysis pipelines for different subjects in a single workflow.
- **Add email reporting:** Insert a Gmail node after the Slack Daily Digest step to also send the daily summary as a formatted email report.
- **Connect to a dashboard:** Pipe the Notion database into a tool like Google Looker Studio or Retool to visualize sentiment trends over time with charts and filters.
- **Adjust scrape frequency:** Change the Schedule Trigger from daily to hourly for breaking news monitoring, or weekly for slower trend tracking.
- **Expand news sources:** Add Google News search URLs (e.g. https://news.google.com/search?q=YOUR_BRAND) to your input sheet for broader coverage beyond specific news sites.
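The Daily Digest described above boils down to one pass over the run's analyses: count sentiments, average the scores, and collect flagged titles. A minimal sketch under the same assumed field names as the AI analysis step (not the template's actual code node):

```python
def build_digest(analyses):
    """Compile the sentiment breakdown, average score, and flagged items."""
    counts = {"positive": 0, "neutral": 0, "negative": 0}
    total = 0.0
    flagged = []
    for a in analyses:
        counts[a["sentiment"]] += 1
        total += a["sentiment_score"]
        if a.get("action_required"):
            flagged.append(a["title"])
    avg = round(total / len(analyses), 2) if analyses else 0.0
    return {"breakdown": counts, "averageScore": avg, "flagged": flagged}

# Hypothetical run of three analyzed articles
run = [
    {"title": "Glowing review", "sentiment": "positive", "sentiment_score": 0.8},
    {"title": "Neutral mention", "sentiment": "neutral", "sentiment_score": 0.0},
    {"title": "Data breach story", "sentiment": "negative",
     "sentiment_score": -0.9, "action_required": True},
]
digest = build_digest(run)
```

The `averageScore` field is what makes trend tracking easy: posting it daily gives a single number per day that a Notion view or dashboard can chart over time.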
by Abi Odedeyi
## How It Works
1. **Trigger:** Watches for new emails in Gmail with PDF/image attachments.
2. **OCR:** Sends the attachment to the OCR.space API (https://ocr.space/OCRAPI) to extract the invoice text.
3. **Parsing:** Extracts key fields: vendor, invoice number, amount, currency, invoice date, due date, description.
4. **Validation logic:**
   - Checks that the amount is valid
   - Ensures the vendor and invoice number are present
   - Flags high-value invoices (e.g., over $10,000)
5. **Routing:**
   - If invalid: sends a Slack message highlighting the issues and labels the email as Rejected
   - If valid: logs the invoice into Google Sheets, sends a Slack message to the finance team for approval, creates a draft invoice in Xero after approval, and labels the email as Processed in Gmail

## Set up steps
- Estimated setup time: 45–60 mins
- You'll need connected credentials for Gmail, Slack, Google Sheets, and Xero
- Replace the default API key for OCR.space with your own (in the HTTP Request node)
- Update Slack channel IDs and label IDs to match your workspace
- Adjust invoice validation rules as needed (e.g. currency, red-flag conditions)

All detailed explanations and field mappings are provided in sticky notes within the workflow.
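The validation and flagging rules above can be sketched as a small function. The field names and the $10,000 threshold follow the description; treat the helper itself as illustrative rather than the workflow's actual code:

```python
def validate_invoice(invoice, high_value_threshold=10_000):
    """Apply the routing rules: collect issues and flag high-value invoices."""
    issues = []
    amount = invoice.get("amount")
    if not isinstance(amount, (int, float)) or amount <= 0:
        issues.append("invalid amount")
    if not invoice.get("vendor"):
        issues.append("missing vendor")
    if not invoice.get("invoice_number"):
        issues.append("missing invoice number")
    return {
        "valid": not issues,                      # route: Sheets + Slack approval
        "issues": issues,                         # route: Slack alert + Rejected label
        "high_value": isinstance(amount, (int, float))
                      and amount > high_value_threshold,
    }

ok = validate_invoice({"vendor": "Acme Ltd", "invoice_number": "INV-042",
                       "amount": 12_500})
bad = validate_invoice({"vendor": "", "amount": -3})
```

Returning a list of issues (rather than a bare boolean) is what lets the rejection path post a Slack message that names every problem at once.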