by Milo Bravo
AI YouTube Trend Intelligence Report: YouTube API + GPT-4o + PDF Dashboard

**Who is this for?** AI creators, marketers, agencies, and researchers tracking YouTube trends who need weekly high-signal insights without 4+ hours of manual research.

**What problem is this workflow solving?** Trend hunting is exhausting:
- Scanning 500+ videos across keywords
- Manual engagement calculations
- No automated filtering or analysis
- Scattered spreadsheets instead of polished reports

This workflow auto-discovers top videos, ranks them by engagement, and delivers a branded PDF + Sheets dashboard.

**What this workflow does**
1. Trigger: form input (keywords, days back) or weekly cron
2. YouTube API: searches 10 keywords → ~500 videos (past 7 days)
3. Ranking: views + engagement rates → top performers
4. Google Sheets: exports channels/videos/keywords/stats
5. GPT-4o: analyzes trends → content recommendations
6. PDF.co: HTML charts → branded PDF report
7. Gmail: delivers to inbox

**Setup (5 minutes)**
- YouTube Data API v3 key (HTTP Query Auth)
- Google Sheets OAuth2 for exports
- OpenAI API (GPT-4o-mini)
- PDF.co for HTML-to-PDF
- Gmail OAuth2 + recipient email

Fully configurable env vars—no hardcoded IDs.

**How to customize**
- Keywords: edit the 10-term list for your niche
- Filters: adjust min views (1k) and engagement (2%)
- Schedule: daily/weekly cron
- Output: swap Gmail for Slack/Notion
- Scale: 1000s of videos/month

**ROI**
- 4+ hours saved weekly
- 20% higher content performance
- Automated competitive intel
- Zero manual spreadsheet work

Need help customizing? Contact me for consulting and support: LinkedIn / [Message](https://tally.so/r/E

Keywords: YouTube trend analysis, AI YouTube research, YouTube analytics automation, content trend tracker, video engagement ranking, YouTube API n8n, weekly YouTube report, YouTube keyword monitoring
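The ranking step above can be sketched as a plain function of the kind you would drop into an n8n Code node. This is a minimal sketch, assuming the YouTube Data API v3 statistics field names (`viewCount`, `likeCount`, `commentCount`); the thresholds mirror the template defaults of 1k minimum views and a 2% minimum engagement rate.

```javascript
// Rank videos by engagement rate, applying the template's default filters.
// Engagement rate here is (likes + comments) / views.
function rankVideos(videos, minViews = 1000, minEngagement = 0.02, topN = 10) {
  return videos
    .map((v) => {
      // The API returns statistics as strings, so coerce to numbers
      const views = Number(v.viewCount) || 0;
      const likes = Number(v.likeCount) || 0;
      const comments = Number(v.commentCount) || 0;
      const engagement = views > 0 ? (likes + comments) / views : 0;
      return { ...v, views, engagement };
    })
    .filter((v) => v.views >= minViews && v.engagement >= minEngagement)
    .sort((a, b) => b.engagement - a.engagement)
    .slice(0, topN);
}
```

Adjusting the `minViews` and `minEngagement` arguments is the code-level equivalent of the filter customization described above.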
by Oneclick AI Squad
This n8n workflow automates the transformation of raw text ideas into structured visual diagrams and content assets using NapkinAI. It connects Claude AI (Anthropic) with NapkinAI to take any rough concept or unstructured idea, intelligently enrich and structure it, generate polished visual diagrams, and deliver the results directly to your team via email, Slack, and a database log — all hands-free.

🔄 Workflow Stages

**1. Input Collection**
- Webhook trigger accepts raw idea text
- Manual trigger for testing
- Configurable idea metadata (title, type, audience)

**2. AI Idea Enrichment (Claude)**
- Expands raw idea into a structured concept
- Extracts key themes and relationships
- Suggests optimal diagram types
- Generates multiple visual concepts

**3. NapkinAI Processing**
- Authenticate with NapkinAI API
- Submit enriched text for visual generation
- Poll for completion status
- Retrieve generated diagram assets

**4. Asset Management**
- Download generated visuals
- Store metadata in database
- Organize by project/category

**5. Delivery**
- Email with visual assets attached
- Slack notification with preview
- Save to Google Drive / Notion

⚙️ Setup Required
- NapkinAI API credentials
- Anthropic API key (Claude)
- SMTP for email delivery
- Slack webhook (optional)
- PostgreSQL for logging (optional)

**Why Does It Matter?**

Creating visual content is time-consuming. Turning an idea into a diagram, infographic, or visual asset typically requires:
- Writing a clear brief
- Choosing the right diagram format
- Using design tools manually
- Iterating on structure and layout

This workflow compresses all of that into a single automated pipeline.

**Key benefits:**
- Speed: Ideas become visuals in minutes, not hours
- Consistency: Every output follows a structured process
- Scale: Process multiple ideas simultaneously
- Repurposing: Each idea generates 3+ content format suggestions
- Team Visibility: Slack + email notifications keep everyone aligned
- Tracking: Database logging enables analytics over time
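The "poll for completion status" stage can be sketched as a generic polling loop. This is a sketch only: the `fetchStatus` callback stands in for whatever HTTP request the NapkinAI API actually requires, and the `status`/`assets` field names are assumptions to illustrate the pattern, not the real API shape.

```javascript
// Poll a status-returning callback until generation completes, fails,
// or the attempt budget is exhausted.
async function pollForCompletion(fetchStatus, { intervalMs = 2000, maxAttempts = 30 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const { status, assets } = await fetchStatus();
    if (status === 'completed') return assets;
    if (status === 'failed') throw new Error('Visual generation failed');
    // Wait before the next poll to avoid hammering the API
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error('Timed out waiting for NapkinAI generation');
}
```

In n8n the same effect is usually achieved with an HTTP Request node in a loop with a Wait node between iterations.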
by Luis Acosta
📰 Reddit to Newsletter (Automated Curation with OpenAI 4o Mini)

Turn the best posts from a subreddit into a ready-to-send HTML newsletter — no copy-pasting, no wasted time. This workflow fetches new posts, filters by topic of interest, analyzes comments, summarizes insights, and composes a clean HTML email delivered straight to your inbox with Gmail.

💡 What this workflow does
- ✅ Fetches posts from your chosen subreddit (default: r/microsaas, sorted by "new")
- 🏆 Selects the Top 10 by upvotes, comments, and recency
- 🧭 Defines a topic of interest and runs a lightweight AI filter (true/false) without altering the original JSON
- 💬 Pulls and flattens comments into a clean, structured list
- 🧠 Summarizes each post + comments into main_post_summary, comment_insights, and key_learnings
- ✍️ Generates a newsletter in HTML (not Markdown) with headline, outline, sections per post, quotes, and "by the numbers"
- 📤 Sends the HTML email via Gmail with subject "Reddit Digest" (editable)

🛠 What you'll need
- 🔑 Reddit OAuth2 connected in n8n
- 🔑 OpenAI API key (e.g., gpt-4o-mini) for filtering and summarization
- 🔑 Gmail OAuth2 to deliver the newsletter
- 🧵 A target subreddit and a clearly defined topic of interest

🧩 How it works (high-level)
1. Manual Trigger → Get many posts (from subreddit)
2. Select Top 10 (Code node, ranking by ups + comments + date)
3. Set topic of interest → AI filter → String to JSON → If topic of interest
4. Loop Over Items for each valid post
5. Fetch post comments → Clean comments (Code) → Merge comments → Merge with post
6. Summarize post + comments (AI) → Merge summaries → Create newsletter HTML
7. Send Gmail message with the generated HTML

⚙️ Key fields to adjust
- **Subreddit name** and "new" filter in **Get many posts**
- **Ranking logic** inside the **Top 10** Code node
- Text inside **Set topic of interest**
- **Prompts** for **AI filter**, **Summarize**, and **Create newsletter** (tone & structure)
- **Recipient & subject line** in **Send Gmail message**

✨ Use cases
- **Weekly digest** of your niche community
- **Podcast or newsletter prep** with community insights
- **Monitoring specific themes** (e.g., "how to get first customers") and delivering insights to a team or client

🧠 Tips & gotchas
- ⏱️ Reddit API limits: tune batch size and rate if the subreddit is very active
- 🧹 Robust JSON parsing: the String to JSON node handles clean, fenced, or escaped JSON; failures return error + raw for debugging
- 📨 Email client quirks: test long newsletters; some clients clip lengthy HTML
- 💸 AI cost: the two-step process (summarization + HTML generation) improves quality but can be merged to reduce cost

🧭 Quick customization
- Change microsaas to your target subreddit
- Rewrite the topic of interest (e.g., "growth strategies", "fundraising", etc.)
- Adapt the newsletter outline prompt for a different tone/format
- Schedule with a Cron node for daily or weekly digests

📬 Contact & Feedback
Need help tailoring this workflow to your stack?
📩 Luis.acosta@news2podcast.com
🐦 @guanchehacker

If you're building something more advanced with curation + AI (like turning the digest into a podcast or video), let's connect — I may have the missing piece you need.
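The "String to JSON" behavior described in the gotchas can be sketched as follows. This is an illustrative reimplementation, not the node's exact code: it handles clean JSON, JSON wrapped in Markdown fences, and double-encoded (escaped) JSON, and on failure returns an error flag plus the raw input for debugging.

```javascript
// Parse possibly-messy LLM output into JSON, tolerating code fences and
// double-encoded strings; never throws, always returns a result object.
function stringToJson(raw) {
  let text = String(raw).trim();
  // Strip Markdown code fences like ```json ... ```
  const fenced = text.match(/^```(?:json)?\s*([\s\S]*?)\s*```$/);
  if (fenced) text = fenced[1];
  try {
    let parsed = JSON.parse(text);
    // Handle double-encoded JSON: a JSON string that itself contains JSON
    if (typeof parsed === 'string') parsed = JSON.parse(parsed);
    return { ok: true, data: parsed };
  } catch (err) {
    return { ok: false, error: err.message, raw };
  }
}
```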
by WeblineIndia
Snyk Vulnerability Automation Workflow with Webhook, Jira, Slack & Airtable

This workflow receives vulnerability data from Snyk (or any security scanner, e.g., Dependabot) through a webhook, standardizes and validates the payload, checks Jira for duplicates using a unique vulnerability key, and either updates an existing Jira issue or creates a new one. It also sends real-time alerts to Slack and stores every new vulnerability in Airtable for reporting and auditing. The workflow ensures fast triage, prevents duplicate Jira tickets and centralizes all data for easy tracking.

Quick Start – Implementation Steps
1. Add the n8n Webhook URL to Snyk.
2. Configure Jira, Slack and Airtable credentials in n8n.
3. Adjust severity rules or Jira fields if required.
4. Activate the workflow — vulnerability triage becomes fully automated.

What It Does

This workflow automates how your team processes vulnerabilities reported by Snyk. When a new vulnerability arrives, the system first normalizes the payload into a clean, consistent format. It then validates required fields such as the vulnerability ID, CVSS score, title and URL. If anything is missing, the workflow instantly sends a Slack alert so the team can review.

If the payload is valid, the workflow assigns a severity level and generates a unique "vulnerability key." This key is used to search Jira for existing issues. If a match is found, the workflow updates the existing Jira ticket and notifies the team. If no match exists, the workflow creates a brand-new Jira issue, sends a Slack alert and also writes the data into Airtable for centralized tracking and analytics. This ensures accurate documentation, avoids duplicates and gives teams visibility through both Jira and Airtable.
Who’s It For

This workflow is ideal for:
- DevOps and platform engineering teams
- Security engineers
- QA and development teams
- Companies using Snyk for vulnerability scanning
- Teams needing automated Jira creation and Airtable reporting

Requirements to Use This Workflow
- An n8n instance (cloud or self-hosted)
- A Snyk webhook configured to send vulnerability notifications
- A Jira Software Cloud account
- A Slack workspace with bot permissions
- An Airtable base and personal access token
- Basic understanding of JSON fields

How It Works
1. Receive Vulnerability – Snyk posts data to an n8n webhook.
2. Normalize Payload – Converts inconsistent Snyk formats into a standard structure.
3. Validate Required Fields – Missing fields trigger a Slack alert.
4. Assign Severity – CVSS score is mapped to Low/Medium/High/Critical.
5. Generate Vulnerability Key – Used for deduplication (e.g., vuln-SNYK-12345).
6. Check Jira for Matches – Searches by label to detect duplicates.
7. Duplicate Handling – Updates the existing Jira issue and sends a Slack notification.
8. Create New Issue – If no duplicate exists, creates a new Jira ticket.
9. Store in Airtable – Adds a new vulnerability row for reporting and history.
10. Slack Alerts – Notifies the team of new or updated vulnerabilities.

Setup Steps
1. Import the workflow JSON file into n8n.
2. Configure credentials: Jira, Slack, Airtable.
3. Add the generated webhook URL inside your Snyk project settings.
4. Update the Jira project ID, issue type, or description fields as needed.
5. Map Airtable fields (Title, CVSS, Severity, URL, Key, etc.).
6. Update Slack channel IDs.
7. Activate the workflow.
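The severity assignment and key generation steps above can be sketched as two small functions. The CVSS thresholds follow the standard CVSS v3 rating bands; the key format mirrors the example given (vuln-SNYK-12345). Treat this as a sketch to adapt, not the workflow's exact node code.

```javascript
// Map a numeric CVSS score to a severity label (CVSS v3 rating bands).
function assignSeverity(cvss) {
  if (cvss >= 9.0) return 'Critical';
  if (cvss >= 7.0) return 'High';
  if (cvss >= 4.0) return 'Medium';
  return 'Low';
}

// Build a stable, label-safe deduplication key from the scanner's ID.
function vulnerabilityKey(vulnId) {
  return `vuln-${String(vulnId).toUpperCase().replace(/[^A-Z0-9-]/g, '-')}`;
}
```

Because the key is deterministic, searching Jira by this label reliably detects whether the same vulnerability was already filed.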
How To Customize Nodes

Customize Severity Rules – modify the node that maps CVSS score ranges:
- Change thresholds
- Add custom severity levels
- Map severity to Jira priority

Customize Jira Fields – inside the Create or Update Jira Issue nodes, you can modify:
- Project ID
- Issue type
- Labels
- Description template
- Assigned user

Customize Slack Messages – adjust Slack text blocks to:
- Change formatting
- Add emojis or styling
- Mention specific users or teams
- Send different messages based on severity

Customize Airtable Storage – update the Airtable node to:
- Add new columns
- Save timestamps
- Link vulnerabilities to other Airtable tables
- Store more metadata for reporting

Add-Ons (Optional Enhancements)

You can extend this workflow with:
- Auto-close Jira tickets when Snyk marks vulnerabilities as "fixed".
- Severity-based Slack routing (e.g., Critical → #security-alerts).
- Email notifications for high-risk vulnerabilities.
- Google Sheets or Notion logging for long-term tracking.
- Weekly summary reports generated using OpenAI.
- Mapping vulnerabilities to microservices or repositories.
- Automated dashboards using Airtable Interfaces.

Use Case Examples
- Automatic Vulnerability Triage – Instantly logs new Snyk findings into Jira.
- Duplicate Prevention – Ensures every vulnerability is tracked only once.
- Slack Alerts – Real-time notifications for new or updated issues.
- Airtable Reporting – Creates a central, filterable database for analysis.
- Security Team Automation – Reduces manual reviews and saves time.
Troubleshooting Guide

| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| Slack alert not sent | Wrong API credentials or channel ID | Re-check Slack configuration |
| Jira issue not created | Incorrect project ID / issue type | Update Jira node details |
| Duplicate detection not working | Vulnerability key or label mismatch | Confirm key generation and JQL settings |
| Airtable row not added | Wrong base or field mapping | Reconfigure Airtable node |
| Webhook not triggered | Snyk not pointing to correct URL | Re-add the n8n webhook in Snyk |
| Severity not correct | CVSS parsing error | Check normalization and mapping node |

Need Help?

If you need help setting up this workflow, customizing the logic, integrating new nodes or adding advanced reporting, feel free to reach out to our n8n automation development team at WeblineIndia. We can help automate advanced security processes, build dashboards, integrate additional tools or expand the workflow to fit your business needs.
by Adnan Azhar
Template Overview

This n8n workflow provides an intelligent, timezone-aware AI voice calling system for e-commerce businesses to automatically confirm customer orders via phone calls. The system uses VAPI (Voice AI Platform) to make natural, conversational calls while respecting customer time zones and business hours.

🎯 Use Case

Perfect for e-commerce businesses that want to:
- Automatically confirm high-value or important orders via phone
- Reduce order cancellations and disputes
- Provide personalized customer service at scale
- Maintain human-like interactions while automating the process
- Respect customer time zones and calling hours

✨ Key Features

**Timezone Intelligence**
- Automatically detects customer timezone from shipping address or phone number
- Only calls during appropriate business hours (10 AM - 3 PM local time, weekdays)
- Schedules calls for later when outside calling hours
- Uses timezone-aware greetings (Good morning/afternoon/evening)

**AI-Powered Conversations**
- Natural, context-aware conversations using VAPI
- Personalized greetings with customer names and local time awareness
- Intelligent confirmation detection from call transcripts
- Handles customer concerns and change requests gracefully

**Smart Call Management**
- Automatic retry logic with attempt tracking
- Call quality assessment and cost tracking
- Detailed transcript analysis and sentiment detection
- Follow-up alerts for calls requiring human intervention

**Comprehensive Tracking**
- Complete call history and analytics in Airtable
- Real-time status updates throughout the process
- Detailed reporting on confirmation rates and call quality
- Cost tracking and ROI analysis

🏗️ Workflow Architecture

**Main Flow (Order Confirmation)**
1. Order Webhook - Receives order data from the e-commerce platform
2. Data Validation - Validates required fields (phone, status)
3. Timezone Detection - Determines customer timezone and calling eligibility
4. Call Routing - Either initiates an immediate call or schedules one for later
5. VAPI Integration - Makes the actual AI voice call
6. Status Tracking - Updates the database with call results

**Scheduled Flow (Retry System)**
- Runs every 15 minutes to check for scheduled calls
- Respects retry limits and calling hours
- Automatically processes queued confirmations

**Webhook Handler (Results Processing)**
- Receives VAPI call completion webhooks
- Analyzes call transcripts for confirmation status
- Sends follow-up alerts or confirmation emails
- Updates final order status

🔧 Prerequisites & Setup

Required Services:
- VAPI Account - For AI voice calling functionality
- Airtable Base - For order tracking and analytics
- SMTP Server - For email notifications
- n8n Instance - Self-hosted or cloud
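The calling-eligibility check described above (10 AM - 3 PM local time, weekdays only) can be sketched with the built-in `Intl.DateTimeFormat` API, so no timezone library is needed. This assumes the timezone has already been resolved to an IANA name such as `America/New_York`; the function name and window boundaries are illustrative.

```javascript
// Decide whether a call may go out right now in the customer's timezone.
function canCallNow(timezone, now = new Date()) {
  const parts = new Intl.DateTimeFormat('en-US', {
    timeZone: timezone,
    hour: 'numeric',
    hour12: false,
    weekday: 'short',
  }).formatToParts(now);
  const hour = Number(parts.find((p) => p.type === 'hour').value);
  const weekday = parts.find((p) => p.type === 'weekday').value;
  const isWeekday = !['Sat', 'Sun'].includes(weekday);
  // Window: 10:00 inclusive to 15:00 exclusive, local time
  return isWeekday && hour >= 10 && hour < 15;
}
```

When this returns `false`, the workflow would instead write the order to the retry queue that the 15-minute scheduled flow drains.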
by Khairul Muhtadin
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Who is this for? Automation enthusiasts, content creators, or social media managers who post article-based threads to Bluesky and want to automate the process end-to-end.

What problem is this solving? Manual content repackaging and posting is repetitive and time-consuming. This workflow automates the process from capturing article URLs (via Telegram or RSS) to scraping content, transforming it into a styled thread, and posting it on the Bluesky platform.

What this workflow does
- Listens on Telegram or fetches from RSS feeds (AI Trends, Machine Learning Mastery, Technology Review).
- Extracts content from URLs using JinaAI.
- Converts the article into a neat, scroll-stopping thread via LangChain + Gemini / OpenAI ChatGPT.
- Splits the thread into multiple posts. The first post is published with "Create a Post", while subsequent posts are replies.
- Adds short delays between posts to avoid rate limits.

Setup
- Add credentials for Telegram Bot API, JinaAI, Google Gemini, and Bluesky App Password.
- Add or customize RSS feeds if needed.
- Test with a sample URL to validate the posting sequence.

How to customize
- Swap out RSS feeds or trigger sources.
- Modify prompt templates or thread formatting rules in the LangChain/Gemini node.
- Adjust wait times or content parsing logic.
- Replace Bluesky with another posting target if desired.

Made by: Khaisa Studio
Need custom workflows? Contact Me!
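The thread-splitting step can be sketched as a small function that breaks the generated text into posts on sentence boundaries. The 300-character cap reflects Bluesky's post length limit, but it is passed as a parameter in case the limit differs or changes; note this sketch leaves a single over-long sentence intact, so a production version would also hard-wrap.

```javascript
// Split generated text into posts that fit a character limit,
// preferring sentence boundaries.
function splitThread(text, maxLen = 300) {
  // Greedily match sentences ending in . ! or ? (plus trailing space)
  const sentences = text.match(/[^.!?]+[.!?]*\s*/g) || [text];
  const posts = [];
  let current = '';
  for (const s of sentences) {
    if ((current + s).length > maxLen && current) {
      posts.push(current.trim());
      current = '';
    }
    current += s;
  }
  if (current.trim()) posts.push(current.trim());
  return posts;
}
```

The first element would feed the "Create a Post" node, with each subsequent element posted as a reply after the delay step.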
by Abdullah Alshiekh
What Problem Does It Solve?

SEO professionals and marketers spend hours manually searching keywords to analyze competitor content. Copying and pasting SERP results into spreadsheets is tedious and prone to formatting errors. Analyzing "why" a page ranks requires significant mental effort and time for every single keyword.

This workflow solves these problems by:
- Automatically fetching live Google search results for a list of keywords.
- Using AI to instantly analyze the top-ranking pages for Intent, Strengths, and Weaknesses.
- Delivering a consolidated, strategic SEO report directly to your email inbox.

How to Configure It

API Setup:
- Connect your Decodo credentials (for scraping Google results).
- Connect your Google Gemini credentials (for the AI analysis).
- Connect your Gmail account (to send the final report).

Keyword Input: Open the "Edit Fields" node and replace the placeholder items (keyword_1, etc.) with the actual search terms you want to track.

Email Recipient: Update the "Send a message" node with your email address.

How It Works
1. The workflow triggers manually (or can be scheduled).
2. It loops through your defined list of keywords one by one.
3. Decodo performs a real-time Google search for each term and extracts organic results.
4. A JavaScript node cleans the data, removing ads and irrelevant snippets.
5. The AI Agent acts as an expert SEO analyst, processing the top results to generate a concise audit.
6. Finally, the workflow compiles all insights into a single email report and sends it to you.

Customization Ideas
- Change the output: Save the analysis to a Google Sheet or Notion database instead of email.
- Adjust the AI persona: Modify the system prompt to focus on specific metrics (e.g., content gaps or backlink opportunities).
- Automate the input: Connect a Google Sheet to dynamically pull new keywords every week.
- Schedule it: Replace the Manual Trigger with a Cron node to run this report automatically every Monday morning.

If you need any help, Get in Touch.
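The cleaning step in the flow above might look like this. The field names (`type`, `url`, `title`, `snippet`) are assumptions about the scraper's output shape, so map them to whatever Decodo actually returns; the point is the pattern of filtering non-organic entries and normalizing what remains before it reaches the AI Agent.

```javascript
// Keep only organic results with a title and URL, drop ads/shopping
// entries, and renumber positions for the AI prompt.
function cleanSerpResults(results) {
  return results
    .filter((r) => r.type === 'organic' && r.url && r.title)
    .map((r, i) => ({
      position: i + 1,
      title: r.title.trim(),
      url: r.url,
      snippet: (r.snippet || '').trim(),
    }));
}
```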
by oka hironobu
TimeRex AI-Powered Booking Automation

Description (for n8n template submission)

Transform your TimeRex booking management with AI-powered automation. This workflow automatically processes bookings, enriches data with AI insights, and keeps your team informed via Slack—all in real time.

What This Workflow Does

🤖 AI-Powered Intelligence
- **Smart Company Detection**: Automatically identifies company names from guest email domains
- **Booking Categorization**: Uses Google Gemini to classify bookings (Sales/Support/Interview/Partnership/Media)
- **Meeting Brief Generation**: AI creates actionable preparation notes for hosts before each meeting

⚡ Automated Processing
- Receives webhooks from TimeRex for confirmed and cancelled bookings
- Validates requests with security token verification
- Logs enriched booking data to Google Sheets
- Sends detailed Slack notifications with AI-generated insights

🛡️ Security & Reliability
- Token-based webhook authentication
- Security alerts for unauthorized access attempts
- Automatic cancellation handling with data cleanup

Use Cases
- **Sales Teams**: Automatically categorize leads and prepare meeting briefs
- **Recruitment**: Streamline interview scheduling with AI-powered candidate insights
- **Customer Success**: Track support meetings and prepare context for calls
- **Media Relations**: Manage press interviews with automated briefings

How It Works
1. TimeRex sends a webhook when a booking is confirmed or cancelled.
2. The security token is verified (failed attempts trigger Slack alerts).
3. For confirmed bookings:
   - Media source is detected from the calendar name
   - Company name is extracted from the email domain
   - AI categorizes the booking purpose
   - AI generates a meeting preparation brief
   - Enriched data is saved to Google Sheets
   - A Slack notification is sent with AI insights
4. For cancellations:
   - The booking is found by Event ID
   - The row is deleted from Google Sheets
   - A cancellation alert is sent to Slack

Setup Instructions
1. Webhook Configuration: Copy the webhook URL from the "TimeRex Webhook" node and paste it in TimeRex Settings → Webhook.
2. Security Token: Copy your TimeRex security token and update the Verify Security Token node with it.
3. Google Sheets: Create a spreadsheet with these columns: event_id, booking_date, guest_name, guest_email, calendar_name, meeting_url, host_name, media_source, company_name, booking_category, ai_meeting_brief, created_at. Update all Google Sheets nodes with your Sheet ID.
4. AI Credentials: Connect your Google Gemini API credentials to both AI model nodes.
5. Slack: Connect your Slack account and select your notification channel in all Slack nodes.
6. Activate: Turn on the workflow and start receiving AI-enhanced booking notifications!

Requirements
- TimeRex account with webhook access
- Google Cloud account (for Sheets & Gemini API)
- Slack workspace
- n8n instance (self-hosted or cloud)

Customization Tips
- Modify the Filter by Calendar Type node to match your calendar naming convention
- Adjust AI prompts in the LLM Chain nodes for different categorization or brief styles
- Add more media sources to the Media Master sheet for accurate source tracking
- Extend the workflow with email confirmations or calendar event creation

Short Description (100 characters max)
Automate TimeRex bookings with AI-powered categorization, meeting briefs, and Slack notifications.

Categories: Sales, Productivity, AI, Scheduling
Tags: TimeRex, Booking, AI, Google Gemini, Slack, Google Sheets, Automation, Meeting Management, LLM, Scheduling
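The "company name from email domain" step can be sketched like this. The free-mail provider list is a small illustrative subset (a real deployment would use a fuller list), and the simple title-casing of the first domain label is a heuristic, not a lookup against a company database.

```javascript
// Domains that indicate a personal address rather than a company one.
const FREE_MAIL = new Set(['gmail.com', 'yahoo.com', 'outlook.com', 'hotmail.com', 'icloud.com']);

// Derive a best-guess company name from an email address, or null
// if the address is personal or malformed.
function companyFromEmail(email) {
  const domain = email.split('@')[1]?.toLowerCase();
  if (!domain || FREE_MAIL.has(domain)) return null;
  const label = domain.split('.')[0];
  return label.charAt(0).toUpperCase() + label.slice(1);
}
```

A `null` result is the signal to skip enrichment and fall back to the guest name alone in the Slack message and sheet row.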
by Yaron Been
Track companies adopting tools that complement yours and send AI-drafted co-marketing outreach emails to new adopters.

This workflow reads a list of complementary tools (with their PredictLeads technology IDs) from Google Sheets, discovers companies that recently adopted each tool via the PredictLeads Technology Detections API, compares against previously scanned domains to find new adopters, enriches each new company, and uses OpenAI to draft a personalized co-marketing partnership email.

How it works:
1. A schedule trigger runs the workflow daily at 8 AM.
2. Reads complementary tool names and PredictLeads tech IDs from Google Sheets.
3. Loops through each tool and discovers recent technology adopters via PredictLeads.
4. Reads previously scanned domains from a separate Google Sheets tab.
5. Compares current detections against previous scans to identify new adopters only.
6. Limits processing to 2 new companies per tool per run (adjustable).
7. Enriches each new adopter with PredictLeads company data.
8. Builds a structured prompt and sends it to OpenAI to draft a co-marketing email.
9. Sends the email via Gmail.
10. Logs the domain, tool name, and timestamp to the Previous Scan sheet to prevent duplicates.

Setup:
- Create a Google Sheet with two tabs:
  - "Complementary Tools" with columns: tool_name, tech_id (PredictLeads technology ID).
  - "Previous Scan" with columns: domain, tool_name, detected_at, email_sent.
- Connect your Gmail account (OAuth2) for sending outreach emails.
- Add your OpenAI API key in the Draft Co-Marketing Email HTTP Request node.
- Add your PredictLeads API credentials (X-Api-Key and X-Api-Token headers).

Requirements:
- Google Sheets OAuth2 credentials.
- Gmail OAuth2 credentials.
- OpenAI API account (uses gpt-4o-mini, ~$0.003-0.008 per call).
- PredictLeads API account (https://docs.predictleads.com).

Notes:
- The Limit node caps outreach at 2 companies per tool per run -- adjust as needed.
- Technology IDs for the complementary tools can be found via the PredictLeads API.
- The Previous Scan tab prevents the same company from being contacted twice.
- PredictLeads Technology Detections and Company API docs: https://docs.predictleads.com
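The new-adopter comparison and Limit steps amount to a set difference plus a cap, which can be sketched as follows; the `domain` field name is an assumption about the detection records, and the default cap matches the template's 2-per-tool setting.

```javascript
// Keep only detections whose domain is not already in the Previous Scan
// tab, capped at `limit` companies per run.
function findNewAdopters(detections, previousDomains, limit = 2) {
  const seen = new Set(previousDomains.map((d) => d.toLowerCase()));
  return detections
    .filter((d) => !seen.has(d.domain.toLowerCase()))
    .slice(0, limit);
}
```

Lower-casing both sides guards against the same domain being logged with different capitalization across runs.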
by WeblineIndia
WooCommerce Weekly Sales KPI Reporting to Slack & Google Sheets

This workflow automatically generates a weekly sales performance report from WooCommerce and shares it with your team. It runs on a weekly schedule, fetches last week’s orders and refunds, calculates key sales KPIs, stores the results in Google Sheets and sends a summarized report to a Slack channel.

Quick Implementation Steps (Get Started Fast)
1. Connect WooCommerce, Slack and Google Sheets credentials in n8n.
2. Update the WooCommerce store domain in the Configure WooCommerce Store node.
3. Review the Slack channel and Google Sheet settings.
4. Activate the workflow.

That’s it — your weekly sales KPIs will now be generated and shared automatically.

What It Does

This workflow helps you track and share weekly WooCommerce performance without manual effort. It automatically calculates key sales metrics such as total orders, total revenue, average order value, refunds and top-performing products based on the previous week’s data.

The workflow begins on a weekly schedule and determines the exact date range for the last completed week. Using this date range, it pulls sales orders and refund data from WooCommerce through HTTP requests. Multiple calculations are then performed to generate meaningful KPIs that are useful for both operational and leadership-level reporting.

Once the KPIs are calculated, the workflow consolidates them into a clean report format. The data is saved in Google Sheets for long-term tracking and a readable summary is sent to a Slack channel so stakeholders can quickly review weekly performance.
Who’s It For
- E-commerce store owners using WooCommerce
- Operations and sales teams tracking weekly performance
- Business managers who want automated KPI reporting
- Teams using Slack and Google Sheets for collaboration and reporting

Requirements to Use This Workflow
- An active WooCommerce store with REST API access
- WooCommerce Consumer Key and Secret (Basic Auth)
- An n8n instance with scheduled workflows enabled
- A Slack workspace with permission to post messages
- A Google Sheets account with access to the target spreadsheet

How the Workflow Works
1. Weekly Schedule Trigger – The workflow runs once per week; the exact day and time are configurable.
2. Calculate Last Week’s Date Range – A Code node calculates the start and end dates of the previous week.
3. Configure WooCommerce Store – The WooCommerce store domain is defined once and reused across API requests.
4. Fetch Weekly Data from WooCommerce – Orders with completed and processing status, plus refund data for the same date range.
5. Calculate KPIs – Separate Code nodes calculate total orders and total revenue, average order value, refund count and refund amount, and top products based on revenue.
6. Merge KPI Results – All calculated KPIs are combined into a single dataset.
7. Prepare Final KPI Report Fields – Only required, clean fields are retained for reporting.
8. Store Data in Google Sheets – Each workflow run appends one new row with weekly KPI data.
9. Send Weekly Report to Slack – A formatted summary is posted to the selected Slack channel.

Setup Instructions
1. Update the WooCommerce domain in the Configure WooCommerce Store node.
2. Verify WooCommerce API credentials in all HTTP Request nodes.
3. Select the desired Slack channel in the Slack node.
4. Confirm the target Google Sheet and worksheet.
5. Adjust the weekly schedule if needed.
6. Activate the workflow.

How To Customize Nodes
- **Weekly Sales KPI Trigger**: Change the day or time to run the workflow at any point during the week.
- **Configure WooCommerce Store**: Update the domain if you move to a different store or environment.
- **HTTP Request Nodes**: Modify order statuses or add filters as needed.
- **KPI Calculation Code Nodes**: Add new metrics or adjust existing calculations.
- **Slack Node**: Send reports to a different channel or workspace.
- **Google Sheets Node**: Store data in another sheet or spreadsheet.

Add-ons (Additional Features)
- Monthly or daily KPI reporting
- Email-based KPI reports
- Separate reports for different WooCommerce stores
- Alerting when revenue drops below a threshold
- Dashboard integration using BI tools

Use Case Examples
- Weekly sales performance review for management
- Tracking revenue and refund trends over time
- Sharing automated reports with remote teams
- Maintaining a historical KPI log in Google Sheets
- Supporting business decisions with consistent weekly data

There can be many more use cases depending on how this workflow is customized or extended.

Troubleshooting Guide

| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| No data in Slack | Workflow not active | Activate the workflow |
| Empty KPIs | No orders in the selected week | Verify WooCommerce data |
| Incorrect dates | Schedule misconfiguration | Review trigger timing |
| Google Sheets not updating | Permission issue | Reconnect Google Sheets credentials |
| WooCommerce API error | Invalid credentials | Check Consumer Key and Secret |

Need Help?

If you need help setting up this workflow, customizing KPIs or building advanced reporting automation, our n8n workflow developers at WeblineIndia are here to help. Our team has strong expertise in n8n workflow automation, WooCommerce integrations and business intelligence reporting. Whether you want to extend this workflow or build a similar solution tailored to your business needs, feel free to reach out to WeblineIndia for expert support.
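The "Calculate Last Week's Date Range" Code node described above might look like the following sketch, which assumes a Monday-to-Sunday week; if your reporting week starts on a different day, shift the offset accordingly.

```javascript
// Return the ISO start and end dates of the most recently completed
// Monday-to-Sunday week, computed in UTC to avoid DST surprises.
function lastWeekRange(today = new Date()) {
  const d = new Date(Date.UTC(today.getUTCFullYear(), today.getUTCMonth(), today.getUTCDate()));
  // getUTCDay(): 0 = Sunday ... 6 = Saturday; re-index so 0 = Monday
  const dayOfWeek = (d.getUTCDay() + 6) % 7;
  const thisMonday = new Date(d);
  thisMonday.setUTCDate(d.getUTCDate() - dayOfWeek);
  const start = new Date(thisMonday);
  start.setUTCDate(thisMonday.getUTCDate() - 7); // last week's Monday
  const end = new Date(thisMonday);
  end.setUTCDate(thisMonday.getUTCDate() - 1); // last week's Sunday
  return {
    start: start.toISOString().slice(0, 10),
    end: end.toISOString().slice(0, 10),
  };
}
```

The returned `start`/`end` strings can be passed straight into the WooCommerce REST API's `after`/`before` order query parameters.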
by Rajeet Nair
Overview

This workflow automates bulk email campaigns with built-in validation, deliverability protection, and smart send-time optimization. It processes CSV leads, validates emails, enriches data, and schedules campaigns intelligently. Emails are sent using controlled inbox rotation, while engagement tracking and analytics continuously improve performance.

How It Works
1. Campaign Input – Receives campaign data and CSV leads via webhook.
2. Lead Processing – Extracts CSV data, splits leads, and validates email format.
3. Domain & Quality Checks – Verifies domains using MX records and filters invalid leads.
4. Lead Enrichment – Adds timezone, domain type, and engagement score for better targeting.
5. Lead Storage – Stores valid leads and separates invalid ones for tracking.
6. Campaign Execution – Scheduler fetches active campaigns and selects top leads.
7. Send Optimization – Calculates the best send time per lead based on timezone and historical performance, while selecting inboxes within sending limits.
8. Email Delivery – Waits until the optimal time and sends emails using the selected inbox.
9. Tracking & Logging – Logs sent emails and updates inbox usage statistics.
10. Event Tracking – Captures opens, clicks, replies, and bounces via webhook.
11. Performance Analytics – Updates campaign stats and analyzes engagement trends.
12. Continuous Optimization – Updates send-time rules to improve future campaign performance.
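The inbox-selection part of the send-optimization step can be sketched as a least-used-first pick among inboxes that still have daily quota. The `sentToday`/`dailyLimit` field names are illustrative placeholders for whatever columns your Postgres inbox table actually uses.

```javascript
// Pick the inbox with the most remaining quota; return null when every
// inbox has hit its daily limit, so the send can be deferred.
function selectInbox(inboxes) {
  const available = inboxes.filter((i) => i.sentToday < i.dailyLimit);
  if (available.length === 0) return null;
  return available.reduce((best, next) => (best.sentToday <= next.sentToday ? best : next));
}
```

Spreading sends across the least-used inbox is what keeps any single sender under throttle limits and protects its reputation.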
Setup Instructions
1. Connect the webhook for campaign and CSV upload.
2. Configure send limits, delays, and the MX API.
3. Set up Postgres with the required tables.
4. Connect Gmail or SMTP for sending.
5. Configure the event webhook for tracking.
6. Enable the campaign and analytics schedulers.
7. Test with a sample campaign before activating.

Use Cases
- Running cold email campaigns at scale
- Improving email deliverability and sender reputation
- Automating lead validation and enrichment
- Optimizing send times based on engagement data
- Managing multi-inbox outbound systems

Requirements
- n8n instance with webhook support
- Postgres database
- Gmail or SMTP email account(s)
- MX record lookup API (e.g., Google DNS)
- Email tracking system or webhook integration

Notes
- Inbox rotation and throttling help prevent spam and protect reputation.
- Engagement-based lead scoring improves campaign performance.
- Send-time optimization is continuously refined using real data.
- You can extend this workflow with personalization or AI-generated emails.
by Databox
Your paid ads and website analytics live in separate tools. This workflow bridges both via Databox MCP, runs three specialized AI agents in sequence, and emails a daily intelligence report with a correlation layer that surfaces insights neither dataset could show alone.

Who's it for
- **Performance marketers** who want to understand how ads influence website quality
- **Growth teams** looking for daily cross-channel signals without building custom dashboards
- **Marketing managers** who need one morning briefing covering paid spend and website behavior

How it works
1. A Schedule Trigger fires every day at 8 AM.
2. Agent 1 fetches website performance from Databox: sessions, bounce rate, goal completions, conversion rate.
3. Agent 2 fetches paid channel data from Databox: spend, CPC, CTR, ROAS per platform.
4. Agent 3 synthesizes both outputs - ranks channel efficiency, estimates cost per quality visit, and writes 3 actionable recommendations.
5. A styled HTML email report is delivered to your inbox.

Requirements
- **Databox account** with website analytics and at least one paid ads platform connected (free plan works)
- OpenAI API key (or Anthropic)
- Gmail account

How to set up
1. Click each Databox MCP Tool node - set Authentication to OAuth2 and authorize.
2. Add your OpenAI API key to each of the three Chat Model nodes.
3. Connect Gmail and set the recipient address in the Send Email node.
4. Activate - your first report arrives tomorrow at 8 AM.