by Václav Čikl
## Overview
Transform your Gmail sent folder into a comprehensive, enriched contact database automatically. This workflow processes hundreds or thousands of sent emails, extracting and enriching contact information using AI and web search, saving days of manual work.

## What This Workflow Does
- Loads sent Gmail messages and extracts basic contact information
- Deduplicates contacts against your existing Google Sheets database
- Searches for email conversation history with each contact
- AI-powered extraction from email threads (phone, socials, websites)
- Fallback web search via the Brave API when no email history exists
- Saves enriched data to Google Sheets with all discovered contact details

## Perfect For
- **Musicians & bands** organizing booker/venue contacts
- **Freelancers & agencies** building client databases
- **Sales teams** enriching prospect lists from outbound campaigns
- **Consultants** creating structured contact databases from years of emails

## Key Features
### Intelligent Two-Path Enrichment
- **Path A (Email History)**: Analyzes existing email threads to extract contact details from signatures and message content
- **Path B (Web Search)**: Falls back to Brave API search + HTML scraping when no email history exists (a request-level sketch of this call appears at the end of this description)

### AI-Powered Data Extraction
Uses GPT-5 Nano to intelligently parse:
- Phone numbers
- Website URLs
- LinkedIn profiles
- Instagram, Twitter, Facebook, YouTube, TikTok, LinkTree, BandCamp, and more
- Alternative email addresses

### Built-in Deduplication
Prevents duplicate entries by checking existing Google Sheets records before processing.

### Free-Tier Friendly
Runs entirely on free or low-cost tiers:
- Gmail API (free)
- OpenAI GPT-5 Nano (cost-effective)
- Brave Search API (2,000 free searches/month)
- Google Sheets (free)

## Setup Requirements
### Required Accounts & Credentials
- **Gmail Account** - OAuth2 credentials for Gmail API access
- **OpenAI API Key** - for the GPT-5 Nano model
- **Brave Search API Key** - free tier (2,000 searches/month)
- **Google Sheets** - OAuth2 credentials

### Google Sheets Structure
Create a Google Sheet with these columns (see template link).
Template Sheet: Make a copy here

## How to Use
1. Clone this workflow to your n8n instance
2. Configure credentials for Gmail, OpenAI, Brave Search, and Google Sheets
3. Create/connect your Google Sheet using the template structure
4. Run manually to process all sent emails and build your initial database
5. Review results in Google Sheets, enriched with the discovered contact info

### First Run Tips
- Start with a smaller Gmail query (e.g., the last 6 months) to test
- Check your Brave API quota before processing large volumes
- The manual trigger means you control when processing happens
- Processing time varies with email volume (typically 2-5 seconds per contact)

## Customization Ideas
### Extend the Enrichment
- Include company information parsing
- Extract job titles from email signatures

### Automate Regular Updates
- Convert the manual trigger to a scheduled trigger
- Process only recent sent emails for incremental updates
- Add an email notification when new contacts are added

### Integration Options
- Push enriched contacts to a CRM (HubSpot, Salesforce)
- Send Slack notifications for high-value contacts
- Export to Airtable for relational database features

### Improve Accuracy
- Add human-in-the-loop review for uncertain extractions
- Implement confidence scoring for AI-extracted data
- Add validation checks for phone numbers and URLs

## Use Case Example
**Music promoter building a venue database:**
- Processed 1,835 sent emails to bookers and venues
- AI extracted contact details for 60% via email signatures
- Brave search found websites for the remaining 40%
- Final database: 1,835 enriched contacts ready for outreach
- Time saved: ~40 hours of manual data entry

## Technical Notes
- **Rate Limiting**: Brave API free tier = 2,000 searches/month
- **Duplicates**: Handled at workflow start, not during processing
- **Empty Results**: Stores email + name even when enrichment fails
- **Model**: Uses GPT-5 Nano for cost-effective parsing
- **Gmail Scope**: Reads sent emails only (not the inbox)

## Cost Estimate
For processing 1,000 contacts:
- **Gmail API**: free
- **GPT-5 Nano**: ~$0.50-2 (depending on email length)
- **Brave Search**: free (within the 2K/month limit)
- **Google Sheets**: free
- **Total**: under $2 for 1,000 enriched contacts

## Template Author
Questions or need help with setup?
📧 Email: xciklv@gmail.com
💼 LinkedIn: https://www.linkedin.com/in/vaclavcikl/
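For orientation, here is a minimal sketch of the Path B fallback as a raw HTTP call to the Brave Search API. The query construction, result parsing, and environment variable name are illustrative assumptions, not the template's exact node configuration.

```javascript
// Path B sketch: query the Brave Search API for a contact with no email
// history, returning candidate URLs for the HTML-scraping step.
const BRAVE_API_KEY = process.env.BRAVE_API_KEY; // free tier: 2,000 searches/month

async function searchContact(name, domainHint) {
  const query = encodeURIComponent(`${name} ${domainHint ?? ''} contact`.trim());
  const res = await fetch(
    `https://api.search.brave.com/res/v1/web/search?q=${query}&count=5`,
    { headers: { 'X-Subscription-Token': BRAVE_API_KEY, Accept: 'application/json' } }
  );
  if (!res.ok) throw new Error(`Brave API error: ${res.status}`);
  const data = await res.json();
  // Keep only what the enrichment step needs: candidate pages to scrape.
  return (data.web?.results ?? []).map(r => ({ title: r.title, url: r.url }));
}

searchContact('Jane Doe', 'example-venue.com').then(console.log);
```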
by Dr. Firas
# Generate AI viral videos with NanoBanana & VEO3, shared on socials via Blotato

## Who is this for?
This workflow is designed for content creators, marketers, and entrepreneurs who want to automate their video production and social media publishing process. If you regularly post promotional or viral-style content on platforms like TikTok, YouTube Shorts, Instagram Reels, LinkedIn, and more, this template will save you hours of manual work.

## What problem is this workflow solving? / Use case
Creating viral short-form videos is often time-consuming:
- You need to generate visuals, write scripts, edit videos, and then manually upload them to multiple platforms.
- Staying consistent across TikTok, YouTube Shorts, Instagram Reels, LinkedIn, Twitter/X, and others requires constant effort.

This workflow solves the problem by automating the entire pipeline from idea → video creation → multi-platform publishing.

## What this workflow does
1. Collects an idea and image from Telegram.
2. Enhances visuals with NanoBanana for a user-generated-content style.
3. Generates a complete video script with AI (OpenAI + structured prompts).
4. Creates the final video with VEO3 using your custom prompt and visuals.
5. Rewrites captions with GPT to be short, catchy, and optimized for social platforms.
6. Saves metadata in Google Sheets for tracking and management.
7. Auto-uploads the video to all major platforms via Blotato (TikTok, YouTube, Instagram, LinkedIn, Threads, Pinterest, X/Twitter, Bluesky, Facebook).
8. Notifies you on Telegram with a preview link once publishing is complete.

## Setup
Connect your accounts:
- Google Sheets (for video tracking)
- Telegram (to receive and send media)
- Blotato (for multi-platform publishing)
- OpenAI API (for captions, prompts, and image analysis)
- VEO3 API (for video rendering)
- Fal.ai (for NanoBanana image editing)
- Google Drive (to store processed images)

Then:
1. Set your credentials in the respective nodes.
2. Adjust the Google Sheet IDs to match your own sheet structure.
3. Insert your Telegram bot token in the Set: Bot Token (Placeholder) node.

## How to customize this workflow to your needs
- **Platforms**: Disable or enable only the Blotato social accounts you want to post to.
- **Video style**: Adjust the master prompt schema in the Set Master Prompt node to fine-tune tone, camera style, or video format.
- **Captions**: Modify the GPT prompt in the Rewrite Caption with GPT-4o node to control length and tone.
- **Notifications**: Customize the Telegram nodes to notify team members, not just yourself.
- **Scheduling**: Add a Cron trigger if you want automatic posting at specific times.

✨ With this workflow, you go from idea → AI-enhanced video → instant multi-platform publishing in just minutes, with almost no manual work.

📄 Documentation: Notion Guide
Need help customizing? Contact me for consulting and support: Linkedin / Youtube
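For context on the caption-rewriting step, here is an illustrative sketch using the OpenAI Chat Completions API. The template drives this through an n8n node, so the system prompt wording below is an assumption; only the gpt-4o model choice comes from the node name.

```javascript
// Caption-rewrite sketch: ask GPT-4o for a short, catchy, platform-ready
// caption and return just the text.
async function rewriteCaption(draft) {
  const res = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'gpt-4o',
      messages: [
        { role: 'system', content: 'Rewrite captions to be short, catchy, and optimized for short-form social platforms. Return only the caption.' },
        { role: 'user', content: draft },
      ],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content.trim();
}

rewriteCaption('Our new AI tool just launched, here is everything it can do...').then(console.log);
```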
by Țugui Dragoș
This workflow automates the post-publish process for YouTube videos, combining advanced SEO optimization, cross-platform promotion, and analytics reporting. It is designed for creators, marketers, and agencies who want to maximize the reach and performance of their YouTube content with minimal manual effort.

## Features
**SEO Automation**
- Fetches video metadata and analyzes competitor and trending data.
- Uses AI to generate SEO-optimized titles, descriptions, and tags.
- Calculates an SEO score and applies A/B testing logic to select the best title (a hypothetical sketch of this step follows this description).
- Updates the video metadata on YouTube automatically.

**Cross-Platform Promotion**
- Generates platform-specific promotional content (LinkedIn, X/Twitter, Instagram, Facebook, etc.) using AI.
- Publishes posts to each connected social channel.
- Extracts video clips and analyzes thumbnails for enhanced promotion.

**Engagement Monitoring & Analytics**
- Monitors YouTube comments, detects negative sentiment, and drafts AI-powered replies.
- Logs all key data (videos, comments, analytics) to Google Sheets for tracking and reporting.
- Runs a weekly analytics job to aggregate performance, calculate engagement/viral indicators, and email a detailed report.

**Notifications & Alerts**
- Sends Slack alerts when a new video is published or when viral potential or negative comments are detected.

## How It Works
1. **Trigger** - The workflow starts automatically when a new YouTube video is published (via webhook) or on a weekly schedule for analytics.
2. **Video Intake & SEO** - Fetches video details (title, description, tags, stats), gathers competitor and trending-topic data, uses AI to generate improved SEO assets, calculates an SEO score, selects the best title (A/B test), and updates the video metadata.
3. **Clip & Thumbnail Processing** - If the video is long enough, runs thumbnail analysis and extracts short clips for social media.
4. **Cross-Platform Promotion** - Generates and formats promotional posts for each social platform and publishes automatically to enabled channels.
5. **Engagement & Comment Monitoring** - Fetches comments, detects negative sentiment, drafts AI-powered replies, and logs comments and responses to Google Sheets.
6. **Analytics & Reporting** - Aggregates weekly analytics, calculates engagement and viral indicators, logs insights, and sends a weekly report via email.
7. **Notifications** - Sends Slack alerts for new video publications and viral/negative comment detection.

## Setup Instructions
1. **Connect YouTube** - Set up YouTube API credentials and the required IDs in the Workflow Configuration node.
2. **Connect OpenAI** - Add your OpenAI credentials for AI-powered content generation.
3. **Connect Slack** - Configure Slack credentials and specify alert channels.
4. **Connect Google Sheets** - Set up service account credentials for logging video, comment, and analytics data.
5. **Configure Social Platforms** - Add credentials for LinkedIn, Twitter (X), Instagram, and Facebook as needed.
6. **Test the Workflow** - Publish a test video and verify that metadata updates, social posts, logging, and weekly reports work as expected.

## Use Cases
- **YouTube Creators:** Automate SEO, promotion, and analytics to grow your channel faster.
- **Marketing Teams:** Streamline multi-channel video campaigns and reporting.
- **Agencies:** Deliver consistent, data-driven YouTube growth for multiple clients.

## Requirements
- YouTube API credentials
- OpenAI API key
- Slack API token
- Google Sheets service account
- (Optional) LinkedIn, Twitter, Instagram, Facebook API credentials

## Limitations
- Requires valid API credentials for all connected services.
- AI-powered features depend on OpenAI API access.
- Social posting is limited to platforms with available n8n nodes and valid credentials.

Tip: You can easily customize prompts, scoring logic, and enabled platforms to fit your channel's unique needs.
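To make the SEO score / A/B title selection concrete, here is a purely hypothetical sketch. The template's actual weights and checks live inside its own nodes; every criterion and point value below is an illustrative assumption.

```javascript
// Hypothetical SEO scoring: reward keyword presence, a title-length sweet
// spot, and a few CTR heuristics, then pick the highest-scoring candidate.
function seoScore(title, keywords) {
  let score = 0;
  if (title.length >= 30 && title.length <= 60) score += 30;            // length sweet spot
  if (keywords.some(k => title.toLowerCase().includes(k))) score += 40; // target keyword present
  if (/\d/.test(title)) score += 15;                                    // numbers tend to lift CTR
  if (/[?!]/.test(title)) score += 15;                                  // curiosity / emphasis
  return score;
}

function pickBestTitle(candidates, keywords) {
  return candidates
    .map(t => ({ title: t, score: seoScore(t, keywords) }))
    .sort((a, b) => b.score - a.score)[0];
}

console.log(pickBestTitle(
  ['My new video', '5 n8n Automations That Save Hours Every Week'],
  ['n8n', 'automation']
));
```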
by Dr. Firas
# 💥 Automate YouTube thumbnail creation from video links (with Templated.io)

## Who is this for?
This workflow is designed for content creators, YouTubers, and automation enthusiasts who want to automatically generate stunning YouTube thumbnails and streamline their publishing workflow, all within n8n. If you regularly post videos and spend hours designing thumbnails manually, this automation is built for you.

## What problem is this workflow solving?
Creating thumbnails is time-consuming, yet crucial for video performance. This workflow completely automates that process:
- No more manual design.
- No more downloading screenshots.
- No more repetitive uploads.

In less than 2 minutes, you can refresh your entire YouTube thumbnail library and make your channel look brand new.

## What this workflow does
Once activated, this workflow can:
✅ Receive YouTube video links via Telegram
✅ Extract metadata (title, description, channel info) via the YouTube API
✅ Generate a custom thumbnail automatically using Templated.io
✅ Upload the new thumbnail to Google Drive
✅ Log data in Google Sheets
✅ Send email and Telegram notifications when ready
✅ Create and publish AI-generated social posts on LinkedIn, Facebook, and Twitter via Blotato

Bonus: You can re-create dozens of YouTube covers in minutes, saving up to 5 hours per week and around $500/month in manual design effort.

## Setup
1️⃣ Get a YouTube Data API v3 key from the Google Cloud Console
2️⃣ Create a Templated.io account and get your API key + template ID
3️⃣ Set up a Telegram bot using @BotFather
4️⃣ Create a Google Drive folder and copy the folder ID
5️⃣ Create a Google Sheet with columns: Date, Video ID, Video URL, Title, Thumbnail Link, Status
6️⃣ Get your Blotato API key from the dashboard
7️⃣ Connect your social media accounts to Blotato
8️⃣ Fill in all credentials in the Workflow Configuration node
9️⃣ Test by sending a YouTube URL to your Telegram bot

## How to customize this workflow
- Replace the Templated.io template ID with your own custom thumbnail layout
- Modify the OpenAI node prompts to change text tone or style
- Add or remove social platforms in the Blotato section
- Adjust the wait time (default: 5 minutes) based on template complexity
- Localize or translate the generated captions as needed

## Expected Outcome
With one Telegram message, you'll receive:
- A professional custom thumbnail
- An instant email + Telegram notification
- A Google Drive link with your ready-to-use design

And your social networks will be automatically updated, with no manual uploads.

## Credits
- Thumbnail generation powered by Templated.io
- Social publishing powered by Blotato
- Automation orchestrated via n8n

👋 Need help or want to customize this?
📩 Contact: LinkedIn
📺 YouTube: @DRFIRASS
🚀 Workshops: Mes Ateliers n8n
🎥 Watch This Tutorial
📄 Documentation: Notion Guide
Need help customizing? Contact me for consulting and support: Linkedin / Youtube / 🚀 Mes Ateliers n8n
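As a reference for the metadata-extraction step, here is a minimal sketch: pull the video ID out of a link received on Telegram, then fetch its snippet from the YouTube Data API v3. Error handling and the exact fields used downstream are simplified assumptions.

```javascript
// Extract the 11-character video ID from common YouTube URL shapes.
function extractVideoId(url) {
  const match = url.match(/(?:youtu\.be\/|v=|\/shorts\/)([A-Za-z0-9_-]{11})/);
  return match ? match[1] : null;
}

// Fetch title, description, and channel via the Data API "videos" endpoint.
async function fetchVideoMetadata(url) {
  const id = extractVideoId(url);
  if (!id) throw new Error('Not a recognizable YouTube link');
  const res = await fetch(
    `https://www.googleapis.com/youtube/v3/videos?part=snippet&id=${id}&key=${process.env.YOUTUBE_API_KEY}`
  );
  const data = await res.json();
  const snippet = data.items?.[0]?.snippet;
  return { id, title: snippet?.title, description: snippet?.description, channel: snippet?.channelTitle };
}

fetchVideoMetadata('https://youtu.be/dQw4w9WgXcQ').then(console.log);
```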
by Ehsan
## Who is this for?
This workflow is for Product Managers, Indie Hackers, and Customer Success teams who collect feature requests but struggle to notify specific users when those features actually ship. It helps you turn old feedback into customer loyalty and potential upsells.

## What it does
This workflow creates a "Semantic Memory" of user requests. Instead of relying on exact keyword tags, it uses vector embeddings to understand the meaning of a request. For example, if a user asks for "Night theme," and months later you release "Dark Mode," this workflow understands they are the same thing, finds that user, and drafts a personal email to them.

## How it works
1. **Listen:** Receives new requests via Tally Forms, vectorizes the text using Nomic Embed Text (via Ollama or OpenAI), and stores them in Supabase.
2. **Watch:** Monitors your changelog (RSS) or waits for a manual trigger when you ship a new feature.
3. **Match:** Performs a vector similarity search in Supabase to find users who requested semantically similar features in the past.
4. **Notify:** An AI agent drafts a hyper-personalized email connecting the user's specific past request to the new feature, saving it as a Gmail draft (for safety).

## Requirements
- **Supabase Project:** You need a project with the vector extension enabled.
- **AI Model:** This template is pre-configured for **Ollama (local)** to keep it free, but works perfectly with OpenAI.
- **Tally Forms & Gmail:** For input and output.

## Setup steps
1. **Database Setup (crucial):** Copy the SQL script provided in the workflow's red sticky note and run it in your Supabase SQL Editor. This creates the necessary tables and the vector search function.
2. **Credentials:** Add your credentials for Tally, Supabase, and Gmail.
3. **URL Config:** Update the HTTP Request node with your specific Supabase project URL.

## SQL Script
Open your Supabase SQL Editor and paste this script to set up the tables and search function:

```sql
-- 1. Enable Vector Extension
create extension if not exists vector;

-- 2. Create Request Table (Smart Columns)
create table feature_requests (
  id bigint generated by default as identity primary key,
  content text,
  metadata jsonb,
  embedding vector(768), -- 768 for Nomic, 1536 for OpenAI
  created_at timestamp with time zone default timezone('utc'::text, now()),
  user_email text generated always as (metadata->>'user_email') stored,
  user_name text generated always as (metadata->>'user_name') stored
);

-- 3. Create Search Function
create or replace function match_feature_requests (
  query_embedding vector(768),
  match_threshold float,
  match_count int
)
returns table (
  id bigint,
  user_email text,
  user_name text,
  content text,
  similarity float
)
language plpgsql
as $$
begin
  return query
  select
    feature_requests.id,
    feature_requests.user_email,
    feature_requests.user_name,
    feature_requests.content,
    1 - (feature_requests.embedding <=> query_embedding) as similarity
  from feature_requests
  where 1 - (feature_requests.embedding <=> query_embedding) > match_threshold
  order by feature_requests.embedding <=> query_embedding
  limit match_count;
end;
$$;
```

⚠️ **Dimension Warning:** This SQL is set up for 768 dimensions (compatible with the local nomic-embed-text model included in the template). If you decide to switch the Embeddings node to use OpenAI's text-embedding-3-small, you must change all instances of 768 to 1536 in the SQL script above before running it.

## How to customize
- **Change Input:** Swap the Tally node for Typeform, Intercom, or Google Sheets.
- **Change AI:** The template includes notes on how to swap the local Ollama nodes for OpenAI nodes if you prefer cloud hosting.
- **Change Output:** Swap Gmail for Slack, SendGrid, or HubSpot to notify your sales team instead of the user directly.
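For reference, here is a minimal sketch of the "Match" step as a raw HTTP call: Supabase exposes the SQL function above through PostgREST at `/rest/v1/rpc/<function_name>`, which is what the workflow's HTTP Request node hits. The environment variable names and the 0.5 threshold are assumptions you should tune for your embedding model.

```javascript
// Vector similarity search via the Supabase RPC endpoint.
async function matchRequests(queryEmbedding) {
  const res = await fetch(`${process.env.SUPABASE_URL}/rest/v1/rpc/match_feature_requests`, {
    method: 'POST',
    headers: {
      apikey: process.env.SUPABASE_SERVICE_KEY,
      Authorization: `Bearer ${process.env.SUPABASE_SERVICE_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      query_embedding: queryEmbedding, // 768-dim array from nomic-embed-text
      match_threshold: 0.5,            // assumed starting point; tune per model
      match_count: 10,
    }),
  });
  return res.json(); // [{ id, user_email, user_name, content, similarity }, ...]
}
```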
by TOMOMITSU ASANO
# IoT Sensor Data Aggregation with AI-Powered Anomaly Detection

## How it works
This workflow monitors IoT sensors in real time. It ingests data via MQTT or on a schedule, normalizes the format, and removes duplicates using data fingerprinting. An AI agent then analyzes readings against defined thresholds to detect anomalies. Finally, it routes alerts to Slack or email based on severity and logs everything to Google Sheets.

1. **Data Ingestion** - An MQTT trigger (topic `sensors/+/data`) captures real-time streams, and a 15-minute schedule trigger covers batch processing. Both streams are merged for unified handling.
2. **Normalization & Deduplication** - A Set node defines the monitoring thresholds (temperature -10 to 50 °C, humidity 20-90 %, pressure 950-1050 hPa, CO2 400-2000 ppm) plus the alert config (critical channel `#iot-critical`, warning channel `#iot-alerts`, email recipients). A Code node standardizes each reading's JSON structure, then a Crypto node creates a SHA-256 fingerprint of `sensorId + '-' + timestamp + '-' + JSON.stringify(readings)` so a Remove Duplicates node can drop repeated readings and prevent redundant API calls.
3. **AI Anomaly Detection** - An AI Agent (OpenAI gpt-4o-mini chat model, temperature 0.3) evaluates each reading against its thresholds and responds with structured JSON: hasAnomaly, severity (critical/warning/normal), a list of anomalies, reasoning, and a recommended action. A Code node parses this response, falling back to "manual review required" if the JSON cannot be parsed, and flags whether an alert is needed.
4. **Routing & Archiving** - A Switch node routes by severity: critical readings trigger both a Gmail email and a Slack message to `#iot-critical`, warnings go to Slack `#iot-alerts`, and everything else passes straight through. All branches merge, and every data point is archived to Google Sheets for historical analysis.

## Setup steps
1. Configure the MQTT Trigger with your broker details.
2. Set your specific limits in the Define Sensor Thresholds node (it also holds the alert channels and email recipients).
3. Connect your OpenAI credential to the Chat Model node.
4. Authenticate the Gmail, Slack, and Google Sheets nodes.
5. Create a Google Sheet with headers: timestamp, sensorId, location, readings, analysis.
by Simeon Penev
## Who’s it for
Marketing, growth, and analytics teams who want a decision-ready GA4 summary: automatically calculated, clearly color-coded, and emailed as a polished HTML report.

## How it works / What it does
- **Get Client (Form Trigger)** collects the **GA4 Property ID (“Account ID”)**, **Key Event**, date ranges (current & previous), Client Name, and recipient email.
- **Overall Metrics This Period / Previous Period (GA4 Data API)** pull sessions, users, engagement, bounce rate, and more for each range.
- **Form Submits This Period / Previous Period (GA4 Data API)** fetch key-event counts for conversion comparisons.
- **Code** normalizes form dates for API requests.
- **AI Agent** builds a **valid HTML email**: calculates % deltas, applies green for positive (#10B981) and red for negative (#EF4444) changes, writes a summary and recommendations, and produces the final HTML only.
- **Send a message (Gmail)** sends the formatted HTML report to the specified email address with a contextual subject.

## How to set up
1. Add credentials: Google Analytics OAuth2, OpenAI (Chat), Gmail OAuth2.
2. Ensure the form fields match your GA4 property and event names; “Account ID” = GA4 Property ID.
   - Property ID - https://take.ms/vO2MG
   - Key event - https://take.ms/hxwQi
3. Publish the form URL and run a test submission.

## Requirements
GA4 property access (Viewer/Analyst) • OpenAI API key • Gmail account with send permission.

## Resources
- Google OAuth2 (GA4) - https://docs.n8n.io/integrations/builtin/credentials/google/oauth-generic/
- OpenAI credentials - https://docs.n8n.io/integrations/builtin/credentials/openai/
- Gmail OAuth2 - https://docs.n8n.io/integrations/builtin/credentials/google/
- GA4 Data API overview - https://developers.google.com/analytics/devguides/reporting/data/v1
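For reference, here is a minimal sketch of one "Overall Metrics" pull against the GA4 Data API (runReport). The metric list is illustrative, and the template's nodes use the Google Analytics OAuth2 credential rather than the raw bearer token shown here.

```javascript
// One runReport call for a single date range.
async function runGa4Report(propertyId, accessToken, startDate, endDate) {
  const res = await fetch(
    `https://analyticsdata.googleapis.com/v1beta/properties/${propertyId}:runReport`,
    {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${accessToken}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        dateRanges: [{ startDate, endDate }],
        metrics: [
          { name: 'sessions' },
          { name: 'totalUsers' },
          { name: 'engagementRate' },
          { name: 'bounceRate' },
        ],
      }),
    }
  );
  return res.json(); // rows feed the AI Agent's % delta calculations
}
```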
by Charles
# 🚀 Daily IndieHackers Reddit Trend Analysis to Slack
> Transform Reddit chaos into actionable startup intelligence
> Get AI-powered insights from r/indiehackers delivered to your Slack every morning

## 🎯 Who's It For
This template is designed for startup founders, growth teams, and product managers who need to:
- Stay ahead of indie hacker trends without manual Reddit browsing
- Understand what's working in the entrepreneurial community
- Get actionable insights for product and marketing decisions
- Keep their team informed about emerging opportunities

Perfect for teams building products for entrepreneurs or anyone wanting to leverage community intelligence for competitive advantage.

## ✨ What It Does
Transform your morning routine with automated intelligence gathering that delivers structured, AI-powered summaries of the hottest r/indiehackers discussions directly to your Slack channel.

### 🧠 Smart Analysis Features
| Feature | Description |
|---------|-------------|
| 🔥 Hotness Scoring | Calculates engagement scores using time-decay algorithms |
| 📊 Topic Extraction | Identifies key themes and trending subjects |
| 💰 Traction Signals | Spots revenue, metrics, and growth indicators |
| 🎯 Theme Clustering | Groups posts into actionable categories |
| ⚡ Action Items | Generates specific recommendations for your team |

### 📱 Slack Integration
Receive beautifully formatted messages with:
- Executive summaries and key takeaways
- Top 3 hottest posts with engagement metrics
- Interactive buttons for deeper exploration
- Team discussion prompts

## ⚙️ How It Works
```mermaid
graph LR
  A[🕐 Daily 8AM Trigger] --> B[📱 Fetch Reddit Posts]
  B --> C[🔄 Process Data]
  C --> D[🤖 Gemini AI Analysis]
  D --> E[✨ Groq Slack Formatting]
  E --> F[💬 Deliver to Slack]
```

### 🔄 The Complete Process
1. **Automated Trigger** - Every morning at 8 AM, the workflow springs into action.
2. **Reddit Data Collection** - Fetches the latest 5 posts from r/indiehackers with full metadata.
3. **Data Processing** - Structures raw Reddit data for optimal AI analysis.
4. **AI-Powered Analysis** - Gemini AI performs deep analysis, calculating hotness scores, extracting topics, and identifying patterns.
5. **Slack Formatting** - A Groq AI agent transforms insights into Slack Block Kit messages.
6. **Team Delivery** - Your designated Slack channel receives the formatted analysis.

## 🛠️ Requirements
You'll need API access for: Reddit (OAuth2), Google Gemini, Groq, and Slack (OAuth2). All have free tiers available.

## 🚀 Setup Guide
### 1️⃣ Configure Your Credentials
Add these credentials in n8n: Reddit OAuth2, Google Gemini, Groq, and Slack OAuth2. The workflow will guide you through each setup.

### 2️⃣ Customize the Schedule
Default: daily at 8:00 AM. To modify, edit the "Daily Schedule" cron trigger node:
```json
{ "triggerTimes": { "item": [{ "hour": 9, "minute": 30 }] } }
```
(Example: run at 9:30 AM.)

### 3️⃣ Set Your Slack Destination
- Open the "Send to Slack" node
- Select your target channel
- Configure notification preferences

### 4️⃣ Adjust Analysis Parameters
Post limit (change from the default 5 posts in the "Get many posts" Reddit node):
```json
"limit": 10
```
(Recommended: 3-10 posts.)

Context customization:
```json
{
  "channel_type": "team",
  "audience": "Growth, Product, and Founders",
  "cta_link": "https://your-dashboard.com",
  "timeframe_label": "This Week"
}
```

## 🎨 Customization Options
### 🔍 Analysis Focus Areas
Transform the workflow for different insights:
- **SaaS-focused analysis** - add to the Gemini prompt: "Focus on SaaS and B2B insights, prioritizing recurring revenue and product-market fit signals"
- **Geographic targeting** - add: "Prioritize posts relevant to [your region/market]"
- **Stage-specific insights** - add: "Focus on [early-stage/growth-stage] startup challenges"

### 📈 Hotness Algorithm Tweaking
A runnable sketch of the default formula appears at the end of this description.
- Default formula: `(ups + 2*num_comments) * freshness_decay`
- Emphasize comments: `(ups + 3*num_comments) * freshness_decay`
- Include upvote ratio: `(ups * upvote_ratio + 2*num_comments) * freshness_decay`

### 🌐 Multi-Subreddit Analysis
Expand beyond r/indiehackers with additional communities:
- r/startups
- r/entrepreneur
- r/SideProject
- r/buildinpublic
- r/nocode

### 💾 Data Storage Extensions
Enhance with historical tracking:
| Node Type | Purpose | Benefit |
|-----------|---------|---------|
| Google Sheets | Trend storage | Historical analysis |
| Airtable | Advanced data management | Rich analytics |
| Webhook | External analytics | Custom dashboards |

## 📊 Expected Output
### 📱 Daily Slack Message Structure
```
🚀 IndieHackers Trends — This Week
📋 TL;DR: [One-sentence key insight]

🔥 Hot Posts (Top 3)
[Post Title] (Hotness: 8.7)
Topics: SaaS launch, pricing strategy
💬 23 comments | 👍 156 ups | 📅 Posted 4 hours ago
[Open Reddit Button]

🧭 Themes Summary
Go-to-market tactics — 3 posts, hotness: 24.1
Product launches — 2 posts, hotness: 18.3

✅ What to Do Now
- Test pricing page variations based on community feedback
- Consider cold email strategies mentioned in hot posts
- Validate product ideas using discussed frameworks
[Open Dashboard Button]
```

## 💡 Pro Tips for Success
### 🎯 Optimization Strategies
- **Week 1-2: Baseline** - Monitor output quality and team engagement; note which insights generate the most discussion.
- **Week 3-4: Refinement** - Adjust AI prompts based on feedback; fine-tune hotness scoring for your needs.
- **Month 2+: Advanced Usage** - Add historical trend analysis, create custom dashboards with stored data, and build feedback loops for continuous improvement.

### 🚨 Common Pitfalls to Avoid
| Issue | Solution |
|-------|----------|
| API rate limits | Reduce post count or increase time intervals |
| Poor insight quality | Refine prompts with specific examples |
| Team engagement drop | Rotate focus areas and encourage thread discussions |
| Information overload | Limit to top 3 posts and key themes only |

## 🔧 Troubleshooting
### ❌ Common Issues & Solutions
- **"Model not found" error** - Cause: Gemini regional availability. Fix: check supported regions or switch to an alternative AI model.
- **Slack formatting broken** - Cause: invalid Block Kit JSON. Fix: validate the JSON structure in the AI agent output.
- **Missing Reddit data** - Cause: API credentials or rate limits. Fix: verify the OAuth2 setup and check usage quotas.
- **AI timeouts** - Cause: too much data or complex prompts. Fix: reduce post count or simplify analysis requests.

### ⚡ Performance Optimization
- Keep analysis under 10 posts for optimal speed
- Monitor execution times in n8n logs
- Add error-handling nodes for production reliability
- Use webhook timeouts for external API calls

## 🌟 Advanced Use Cases
- **📈 Competitive Intelligence** - Modify prompts to track specific competitors or market segments mentioned in discussions.
- **🎯 Product Validation** - Focus analysis on posts related to your product category for market research.
- **📝 Content Strategy** - Use trending topics to inform your content calendar and thought leadership.
- **🤝 Community Engagement** - Identify opportunities to participate in discussions and build relationships.

Ready to transform your startup intelligence gathering? 🚀 Deploy this workflow and start receiving actionable insights tomorrow morning!
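Here is a minimal sketch of the default hotness formula from the Hotness Algorithm section, `(ups + 2*num_comments) * freshness_decay`. The exponential half-life is an assumed tuning parameter; the template computes this score as part of its analysis step.

```javascript
// Hotness = engagement weighted by an exponential time decay.
function hotness(post, halfLifeHours = 12) {
  const ageHours = (Date.now() / 1000 - post.created_utc) / 3600;
  const freshnessDecay = Math.pow(0.5, ageHours / halfLifeHours); // halves every halfLifeHours
  return (post.ups + 2 * post.num_comments) * freshnessDecay;
}

console.log(hotness({
  ups: 156,
  num_comments: 23,
  created_utc: Date.now() / 1000 - 4 * 3600, // posted 4 hours ago
}).toFixed(1));
```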
by Evervise
# 🤖 AI Business Automation Opportunity Finder
Turn automation audits into high-ticket sales with this ROI-focused n8n workflow powered by 4 specialized AI agents that identify and quantify automation opportunities in any business.

## What It Does
This workflow analyzes any business and delivers a comprehensive automation blueprint with concrete ROI calculations in under 60 seconds. Perfect for agencies, consultants, and automation experts looking to generate qualified leads and close high-value deals. Unlike generic automation advice, this delivers personalized, quantified opportunities ranked by return on investment, making it incredibly easy for prospects to say yes.

## 🤖 Four Specialized AI Agents
1. **Business Analyst** - Deep analysis of business model, workflows, pain points, tech stack, and scalability challenges
2. **Process Mapper** - Maps all repetitive processes, calculates time waste, and identifies bottlenecks across the entire operation
3. **Automation Architect** - Designs 15+ specific automation solutions with tools, complexity ratings, and implementation steps
4. **ROI Calculator** - Calculates detailed ROI for each automation, ranks the top 10, and creates a 90-day implementation roadmap

## ✨ Key Features
- **Concrete Dollar Savings**: Every automation shows exact time saved, labor cost saved, and payback period
- **Top 10 Ranked by ROI**: Opportunities prioritized by impact vs. effort with detailed financial analysis
- **90-Day Implementation Roadmap**: Month-by-month plan showing progressive savings milestones
- **Comprehensive Process Mapping**: Identifies inefficiencies they didn't even mention
- **Tool-Specific Recommendations**: Exact tools and platforms needed (n8n, Zapier, Make, etc.)
- **Beautiful HTML Reports**: Professional, conversion-focused email with 3-tier pricing built in
- **Multiple CTAs**: Strategically placed conversion points throughout the report

## 📊 What Gets Analyzed
### Business Analysis
- Business model and revenue streams
- Operational workflows and processes
- Current tech stack assessment
- Team capacity and resource allocation
- Growth stage and scalability blockers
- Industry-specific automation patterns

### Process Mapping
- Comprehensive workflow documentation
- Time waste analysis (hours per month)
- Bottleneck identification
- Process dependencies and integration opportunities
- Quick win vs. strategic project categorization

### Automation Architecture
For each of 15+ automation opportunities:
- Clear description of what it automates
- Specific tools required
- Step-by-step implementation flow
- Complexity rating (Easy/Medium/Hard)
- Prerequisites and requirements
- Additional benefits beyond time savings
- Real-world use case examples

### ROI Calculations
For each automation (a hedged sketch of this arithmetic appears at the end of this description):
- Time saved per week/month/year
- Labor cost savings (calculated from team size/industry)
- One-time implementation cost
- Ongoing monthly costs
- Payback period in months
- 12-month net savings
- ROI percentage
- Priority score (0-10)

## 💼 Perfect For
- **Automation Agencies**: High-value lead magnet that pre-sells your services
- **Business Consultants**: Demonstrate ROI before engagement
- **No-Code Developers**: Show concrete value of your expertise
- **Digital Transformation Consultants**: Quantify the opportunity
- **SaaS Companies**: Lead gen for automation/workflow tools
- **Freelancers**: Land bigger clients with data-driven proposals

## 🚀 Why This Converts Better Than Other Lead Magnets
Traditional lead magnets:
- Generic advice ("You should automate")
- Subjective benefits ("Save time")
- No clear next steps
- Conversion rate: 5-10%

This workflow:
- **Specific to their business** (personalized analysis)
- **Quantified in dollars** ($50K+ annual savings)
- **Prioritized action plan** (top 10 ranked by ROI)
- **Clear implementation path** (90-day roadmap)
- **Conversion rate: 20-30%** to strategy call
- **40-50% of calls close** to paid engagement

## 📈 Expected Business Results
Per 100 form submissions:
- **25-30 strategy calls booked** (25-30% conversion)
- **10-15 deals closed** (40-50% call-to-close rate)
- **$12K-18K in initial revenue** (mix of Tier 1 & 2)
- **2-4 retainer clients** ($30K-60K annual value)
- **Total potential: $42K-78K** from 100 leads

Why it works:
- **Self-qualifying**: Detailed form filters serious prospects
- **Pre-sold**: They see the value before the call
- **ROI-focused**: Speaks CFO language (dollars, not features)
- **Urgency**: Shows money being wasted daily
- **Social proof**: Built-in testimonials and case studies

## 📋 What You Need
Required:
- n8n instance (self-hosted or cloud)
- Anthropic API key (Claude Sonnet 4.5)
- Gmail account or SMTP provider

Optional enhancements:
- CRM integration (HubSpot, Salesforce, Pipedrive)
- Slack notifications for high-value leads
- Calendly for automatic call booking
- Zapier/Make for additional workflows
- Analytics tracking (Mixpanel, Segment)

## ⚙️ Technical Details
- **AI Model**: Claude Sonnet 4.5 (4 sequential agents)
- **Average Runtime**: 50-70 seconds
- **Cost Per Analysis**: ~$0.20-0.30
- **Form Fields**: 9 (business description, industry, team size, tasks, tools, bottleneck, revenue, email, name)
- **Output**: Comprehensive HTML email with all analyses, pricing, and CTAs

## 🎨 Customization Options
The workflow is fully customizable and includes detailed documentation:
- Adjust ROI calculation parameters (labor rates by industry)
- Modify agent prompts for specific niches
- Customize pricing tiers and packages
- Add/remove form fields
- White-label the entire report
- Integrate with your CRM/marketing stack
- Segment responses by company size or revenue
- Add video walkthroughs or personalized messages
- Create industry-specific versions

## 📊 Form Fields Explained
The 9-field form is strategically designed to gather intelligence:
1. **Business Description** (textarea): Core operations and offerings
2. **Industry/Niche** (text): Context for automation patterns
3. **Team Size** (dropdown): Affects ROI calculations and tool recommendations
4. **Repetitive Tasks** (textarea): Gold mine for automation opportunities
5. **Current Tools** (textarea): Integration points and tech stack assessment
6. **Biggest Bottleneck** (textarea): Primary pain point for targeting
7. **Monthly Revenue** (optional dropdown): For accurate ROI estimates and lead scoring
8. **Email** (required): For report delivery
9. **Name** (required): For personalization

## 🔧 Setup Difficulty
Basic - requires basic n8n knowledge and API configuration.

Setup steps:
1. Import the workflow JSON to n8n
2. Add Anthropic API credentials
3. Configure Gmail/SMTP credentials
4. Customize branding and pricing in the email template
5. Test with sample business scenarios
6. Deploy the form on your website
7. Set up follow-up sequences (recommended)

## 📚 Included Documentation
- **Comprehensive sticky notes** for every component
- **Setup instructions** with prerequisites
- **Customization guide** for different industries
- **Pricing strategy** breakdown and alternatives
- **Conversion optimization** tips
- **Follow-up sequence** recommendations
- **Sales script** suggestions for strategy calls
- **Marketing promotion** ideas

## 🌟 Advanced Use Cases
1. **Lead Magnet** - Embed on your website to capture qualified automation leads continuously
2. **Discovery Tool** - Use during sales calls to demonstrate immediate value and build credibility
3. **Content Marketing** - Offer in LinkedIn posts, email campaigns, and YouTube videos for viral growth
4. **Partner Program** - White-label for partners/affiliates to generate leads in their networks
5. **Upsell Sequence** - For existing clients, identify additional automation opportunities
6. **Industry Templates** - Create versions for specific industries (real estate, e-commerce, agencies)
7. **Competitive Intelligence** - Analyze competitor operations and position your services

## ⚡ Why This Workflow Stands Out
Compared to generic automation audits:
- ✅ Quantified in dollars vs. vague "save time" claims
- ✅ Personalized to their business vs. generic templates
- ✅ Prioritized by ROI vs. random feature lists
- ✅ Actionable roadmap vs. overwhelming possibilities
- ✅ Tool-specific vs. theoretical concepts

Compared to manual analysis:
- ✅ 60 seconds vs. 2-3 hours of consultant time
- ✅ $0.25 cost vs. $300-500 in labor
- ✅ Consistent quality vs. variable analyst experience
- ✅ Scalable vs. bottlenecked by human capacity
- ✅ 24/7 available vs. business hours only

## 🤝 Support & Community
- 📖 Website: https://evervise.ai/
- ✨ Support: mark.marin@evervise.com
- N8N Link

## 🎁 Bonus Resources Included
- **Follow-up email sequence** (3 emails over 10 days)
- **Sales call script** for strategy calls
- **Objection handling** guide
- **Pricing calculator** spreadsheet
- **Marketing assets** (social media templates)
- **Case study template** for testimonials

Tags: automation, lead-generation, roi-calculator, business-analysis, process-mapping, ai-agents, anthropic, claude, workflow-automation, business-consulting, no-code, n8n-workflows, high-ticket-sales, conversion-optimization, saas-tools

Ready to turn automation audits into recurring revenue? Import this workflow and start attracting qualified leads who can see the exact dollar value you provide before they even talk to you. Average user results: $42K-78K revenue from the first 100 form submissions.
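To ground the ROI Calculations list above, here is a hedged sketch of the underlying arithmetic. The agent performs this reasoning in-prompt, so the exact formulas and the hourly labor rate below are illustrative assumptions, not the template's internals.

```javascript
// Payback period, 12-month net savings, and ROI % for one automation.
function automationRoi({ hoursSavedPerWeek, hourlyRate, implementationCost, monthlyCost }) {
  const monthlySavings = hoursSavedPerWeek * 4.33 * hourlyRate; // avg weeks per month
  const netMonthly = monthlySavings - monthlyCost;
  const paybackMonths = implementationCost / netMonthly;
  const netSavings12mo = netMonthly * 12 - implementationCost;
  const roiPercent = (netSavings12mo / (implementationCost + monthlyCost * 12)) * 100;
  return { monthlySavings, paybackMonths, netSavings12mo, roiPercent };
}

console.log(automationRoi({
  hoursSavedPerWeek: 10,
  hourlyRate: 40,          // assumed labor rate; varies by industry
  implementationCost: 2000,
  monthlyCost: 100,
}));
```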
by Julian Kaiser
## What problem does this solve?
Earlier this year, as I got more involved with n8n, I committed to helping users on our community forums and the n8n subreddit. The volume of questions was growing, and I found it was a real challenge to keep up and make sure no one was left without an answer. I needed a way to quickly see what people were struggling with, without spending hours just searching for new posts.

So, I built this workflow. It acts as my personal AI research assistant. Twice a day, it automatically scans Reddit and the n8n forums for me. It finds relevant questions, summarizes the key points using AI, and sends me a digest with direct links to each post. This allows me to jump straight into the conversations that matter and provide support effectively.

While I built this for n8n support, you can adapt it to monitor any community, track product feedback, or stay on top of any topic you care about. It transforms noisy forums into an actionable intelligence report delivered right to your inbox.

## How it works
Here's the technical breakdown of my two-part system:

**AI Reddit Digest (daily at 9 AM / 5 PM):**
1. Fetches the latest 50 posts from a specified subreddit.
2. Uses an AI text classifier to categorize each post (e.g., QUESTION, JOB_POST).
3. Isolates the posts classified as questions and uses an AI model to generate a concise summary for each.
4. Formats the original post link and its new summary into an email-friendly format and sends the digest.

**AI n8n Forum Digest (daily at 9 AM / 5 PM):**
1. Scrapes the n8n community forum to get a list of the latest post links.
2. Processes each link individually, fetching the full post content.
3. Filters these posts to keep only those containing a specific keyword (e.g., "2025").
4. Summarizes the filtered posts using an AI model.
5. Combines the original post link with its AI summary and sends it in a separate email report.

## Set up steps
This workflow is quite powerful and requires a few configurations. Setup should take about 15 minutes.
1. **Add Credentials:** First, add your credentials for your AI provider (like OpenRouter) and your email service (like Gmail or SMTP) in the Credentials section of your n8n instance.
2. **Configure Reddit Digest:** In the Get latest 50 reddit posts node, enter the name of the subreddit you want to follow. Fine-tune the AI's behavior by editing the prompt in the Summarize Reddit Questions node. (Optional) Add more examples to the Text Classifier node to improve its accuracy.
3. **Configure n8n Forum Digest:** In the Filter 2025 posts node, change the keyword to track topics you're interested in. Edit the prompt in the Summarize n8n Forum Posts node to guide the AI's summary style.
4. **Activate Workflow:** Once configured, set the workflow to Active. It will run automatically on schedule. You can also trigger it manually with the When clicking 'Test workflow' node.
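For reference, here is a simplified sketch of the "Get latest 50 reddit posts" step using Reddit's public JSON listing. The workflow itself uses n8n's Reddit node with OAuth2, so this anonymous endpoint is a stand-in for experimentation only.

```javascript
// Fetch the newest posts from a subreddit's public JSON feed.
async function fetchLatestPosts(subreddit, limit = 50) {
  const res = await fetch(`https://www.reddit.com/r/${subreddit}/new.json?limit=${limit}`, {
    headers: { 'User-Agent': 'n8n-digest-demo/0.1' }, // Reddit rejects blank user agents
  });
  const data = await res.json();
  return data.data.children.map(({ data: p }) => ({
    title: p.title,
    selftext: p.selftext,
    url: `https://www.reddit.com${p.permalink}`,
  }));
}

fetchLatestPosts('n8n').then(posts => console.log(posts.length, 'posts fetched'));
```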
by Incrementors
Overview: This n8n workflow automates the complete blog publishing process from topic research to WordPress publication. It researches topics, writes SEO-optimized content, generates images, publishes posts, and notifies teams—all automatically from Google Sheets input. How It Works: Step 1: Client Management & Scheduling Client Data Retrieval:** Scans master Google Sheet for clients with "Active" project status and "Automation" blog publishing setting Publishing Schedule Validation:** Checks if current day matches client's weekly frequency (Mon, Tue, Wed, Thu, Fri, Sat, Sun) or if set to "Daily" Content Source Access:** Connects to client-specific Google Sheet using stored document ID and sheet name Step 2: Content Planning & Selection Topic Filtering:** Retrieves rows where "Status for Approval" = "Approved" and "Live Link" = "Pending" Content Validation:** Ensures Focus Keyword field is populated before proceeding Single Topic Processing:** Selects first available topic to maintain quality and prevent API rate limits Step 3: AI-Powered Research & Writing Comprehensive Research:** Google Gemini analyzes search intent, competitor content, audience needs, trending subtopics, and LSI keywords Content Generation:** Creates 800-1000 word articles with natural keyword integration, internal linking, and conversational tone optimized for Indian investors Quality Assessment:** Evaluates content for human-like writing, conversational tone, readability, and engagement factors Content Optimization:** Automatically fixes grammar, punctuation, sentence flow, and readability issues while maintaining HTML structure Step 4: Visual Content Creation Image Prompt Generation:** OpenAI creates detailed prompts based on blog title and content for professional visuals Image Generation:** Ideogram AI produces 1248x832 resolution images with realistic styling and professional appearance Binary Processing:** Downloads and converts generated images to binary format for WordPress upload Step 5: WordPress Publication Media Upload:** Uploads generated image to WordPress media library with proper filename and headers Content Publishing:** Creates new WordPress post with title, optimized content, and embedded image Featured Image Assignment:** Sets uploaded image as post's featured thumbnail for proper display Category Assignment:** Automatically assigns posts to predefined category Step 6: Tracking & Communication Status Updates:** Updates Google Sheet with live blog URL in "Live Link" column using S.No. as identifier Team Notification:** Sends Discord message to designated channel with published blog link and review request Process Completion:** Triggers next iteration or workflow conclusion based on remaining topics Setup Steps: Estimated Setup Time: 45-60 minutes Required API Credentials: 1. Google Sheets API Service account with sheets access OAuth2 credentials for client-specific sheets Proper sharing permissions for all target sheets 2. Google Gemini API Active API key with sufficient quota Access to Gemini Pro model for content generation Rate limiting considerations for bulk processing 3. OpenAI API GPT-4 access for creative prompt generation Sufficient token allocation for daily operations Fallback handling for API unavailability 4. Ideogram AI API Premium account for quality image generation API key with generation permissions Understanding of rate limits and pricing 5. 
WordPress REST API Application passwords for each client site Basic authentication setup with proper encoding REST API enabled in WordPress settings User permissions for post creation and media upload 6. Discord Bot API Bot token with message sending permissions Channel ID for notifications Guild access and proper bot roles Master Sheet Configuration: Document Structure: Create primary tracking sheet with columns Client Name:** Business identifier Project Status:** Active/Inactive/Paused Blog Publishing:** Automation/Manual/Disabled Website URL:** Full WordPress site URL with trailing slash Blog Posting Auth Code:** Base64 encoded username: password On Page Sheet:** Google Sheets document ID for content planning WeeklyFrequency:** Daily/Mon/Tue/Wed/Thu/Fri/Sat/Sun Discord Channel:** Channel ID for notifications Content Planning Sheet Structure: Required Columns (exact naming required): S.No.:** Unique identifier for tracking Focus Keyword:** Primary SEO keyword Content Topic** Article title/subject Target Page:** Internal linking target Words:** Target word count Brief URL:** Content brief reference Content URL:** Draft content location Status for Approval:** Pending/Approved/Rejected Live Link:** Published URL (auto-populated) WordPress Configuration: REST API Activation:** Ensure wp-json endpoint accessibility User Permissions:** Create dedicated user with Editor or Administrator role Application Passwords:** Generate secure passwords for API authentication Category Setup:** Create or identify category ID for automated posts Media Settings:** Configure upload permissions and file size limits Security:** Whitelist IP addresses if using security plugins Discord Integration Setup: Bot Creation:** Create application and bot in Discord Developer Portal Permissions:** Grant Send Messages, Embed Links, and Read Message History Channel Configuration:** Set up dedicated channel for blog notifications User Mentions:** Configure user ID for targeted notifications Message Templates:** Customize notification format and content Workflow Features & Capabilities: Content Quality Standards: SEO Optimization:** Natural keyword integration with LSI keywords and related terms Readability:** Conversational tone with short sentences and clear explanations Structure:** Proper HTML formatting with headings, lists, and internal links Length:** Consistent 800-1000 word count for optimal engagement Audience Targeting:** Content tailored for Indian investor audience with relevant examples Image Generation Specifications: Resolution:** 1248x832 pixels optimized for blog headers Style:** Realistic professional imagery with human subjects Design:** Clean layout with heading text placement (bottom or left side) Quality:** High-resolution output suitable for web publishing Branding:** Light beige to gradient backgrounds with golden overlay effects Error Handling & Reliability: Graceful Failures:** Workflow continues even if individual steps encounter errors API Rate Limits:** Built-in delays and retry mechanisms for external services Data Validation:** Checks for required fields before processing Backup Processes:** Alternative paths for critical failure points Logging:** Comprehensive tracking of successes and failures Security & Access Control: Credential Encryption:** All API keys stored securely in n8n vault Limited Permissions:** Service accounts with minimum required access Authentication:** Basic auth for WordPress with encoded credentials Data Privacy:** No sensitive information exposed in logs or outputs Access 
Troubleshooting:
Common Issues:
- **API Rate Limits:** Check your API quotas and usage limits
- **WordPress Authentication:** Verify that your basic auth credentials are correct
- **Sheet Access:** Ensure the Google Sheets API has the proper permissions
- **Image Generation Fails:** Check your Ideogram API key and quotas

Need Help?
For technical support or questions:
- Email: info@incrementors.com
- Contact Form: https://www.incrementors.com/contact-us/
by Dean Pike
CV → Match → Screen → Decide, all automated
This workflow automatically processes candidate CVs from email, intelligently matches them to job descriptions, performs AI-powered screening analysis, and sends actionable summaries to your team in Slack.

Good to know
- Handles both PDF and Word document CVs automatically
- Two-stage JD matching: prioritizes the role mentioned in the candidate's email, falls back to CV analysis if needed
- Uses the Google Gemini API for AI screening (generous free tier and rate limits, typically enough to avoid paying for API requests, but check the latest pricing at Google AI Pricing)
- All CVs stored in Google Drive with standardized naming (candidate name + date/time)
- Complete audit trail logged in Google Sheets

Who's it for
Hiring teams and recruiters who want to automate first-round CV screening while maintaining quality. Perfect for companies receiving high volumes of applications across multiple roles, especially in tech, sales, or automation-focused positions.

How it works
1. Gmail monitors the inbox for CVs with a specific label and downloads attachments
2. Detects the file type (PDF or Word) and converts/standardizes the format for text extraction
3. An AI agent matches the candidate to the best-fit job description by analyzing email context first (if the candidate mentioned a role), or CV content as a fallback, selecting up to 3 potential JD matches
4. If multiple JDs matched, a second AI agent selects the single best fit
5. An AI recruiter agent analyzes the CV against the selected JD and generates a structured screening report (strengths, weaknesses, risk/reward factors, overall fit score 0-10 with justification)
6. Extracts candidate details (name, email) from the CV text
7. Logs the complete analysis to a Google Sheets tracker
8. Sends a formatted summary to Slack with Proceed/Reject action buttons for instant team decisions

Requirements
- Gmail account with API access
- Google Drive account (OAuth2)
- Google Sheets account (OAuth2)
- Slack workspace with bot permissions
- Google Gemini API key (Get free key here)
- Google Drive folders: one for CVs, one for Job Descriptions (as PDFs or Google Docs)

How to set up
1. Add credentials: Gmail OAuth2, Google Drive OAuth2, Google Sheets OAuth2, Slack OAuth2, Google Gemini API
2. Create a Gmail label (e.g., "CV-Screening") for incoming candidate emails
3. In the "Receive CV via Email" node, select your Gmail label for filtering
4. Create two Google Drive folders: "Candidate CVs" and "Job Descriptions"
5. In the "Upload CV - PDF" and "Stream Doc/Docx File" nodes, update the folder ID to your "Candidate CVs" folder
6. In the "Access JD Files" node, update the folder ID to your "Job Descriptions" folder
7. Create a Google Sheet named "AI Candidate Screening" with columns matching the sample AI Candidate Screening sheet
8. In the "Append row in sheet" node, select your Google Sheet
9. In the "Send Candidate Screening Confirmation" node, select your Slack channel
10. Activate the workflow

Customizing this workflow
- Change JD matching logic: Edit the "JD Matching Agent" node prompt to adjust how CVs are matched to roles (e.g., weight technical skills vs. experience)
- Change the company description in AI prompts: Insert your company description into the System Message sections of the "JD Matching Agent" and "Detailed JD Matching Agent" nodes
- Modify screening criteria: Edit the "Recruiter Scoring Agent" node system message to focus on specific qualities (culture fit, leadership, technical depth, etc.)
- Add more storage locations: Add nodes to save CVs to other systems (Notion, Airtable, ATS platforms)
- Customize the Slack message: Edit the "Send Candidate Screening Confirmation" node to change formatting, add more context, or include additional candidate data (see the Block Kit sketch after this list)
- Auto-proceed logic: Add an IF node after screening to auto-proceed candidates with a fit score above a threshold, e.g., 8+/10 (see the threshold sketch after this list)
- Add email responses: Connect nodes to automatically email candidates (confirmation, rejection, interview invite)
- Add human-in-the-loop: A sub-workflow triggered by a Slack or email response that updates the Sheet with approve/reject status
- Add candidate email responses + interview scheduling: For approved candidates, trigger an email with a Cal.com or Calendly link so they can book their interview
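The Proceed/Reject buttons are plain Slack Block Kit. Below is a minimal sketch of the kind of payload the "Send Candidate Screening Confirmation" node produces, sent via Slack's standard chat.postMessage method; the token, channel ID, candidate details, and action IDs are placeholders.

```python
import requests

SLACK_BOT_TOKEN = "xoxb-placeholder"  # placeholder bot token
CHANNEL_ID = "C0123456789"            # placeholder channel ID

blocks = [
    {
        "type": "section",
        "text": {
            "type": "mrkdwn",
            "text": "*Jane Doe* | Marketing Director JD\nFit score: *8/10*\nStrengths: ...\nRisks: ...",
        },
    },
    {
        "type": "actions",
        "elements": [
            {"type": "button", "style": "primary",
             "text": {"type": "plain_text", "text": "Proceed"},
             "action_id": "proceed_candidate", "value": "SUB-0042"},
            {"type": "button", "style": "danger",
             "text": {"type": "plain_text", "text": "Reject"},
             "action_id": "reject_candidate", "value": "SUB-0042"},
        ],
    },
]

resp = requests.post(
    "https://slack.com/api/chat.postMessage",
    headers={"Authorization": f"Bearer {SLACK_BOT_TOKEN}"},
    json={
        "channel": CHANNEL_ID,
        "text": "Candidate screening: Jane Doe (8/10)",  # fallback for notifications
        "blocks": blocks,
    },
    timeout=30,
)
resp.raise_for_status()
assert resp.json().get("ok"), resp.json()
```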
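The suggested auto-proceed rule is just a threshold on the fit score. A sketch follows, assuming the recruiter agent's report is parsed into fields like these; the actual field names depend on what your "Recruiter Scoring Agent" prompt asks for.

```python
import json

# Hypothetical shape of the agent's structured output.
raw_report = '{"fit_score": 8, "strengths": ["..."], "weaknesses": ["..."], "justification": "..."}'
report = json.loads(raw_report)

AUTO_PROCEED_THRESHOLD = 8  # mirrors the suggested 8+/10 rule

decision = "proceed" if report["fit_score"] >= AUTO_PROCEED_THRESHOLD else "manual review"
print(decision)
```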
Quick Troubleshooting
- No CVs being processed: Check that the Gmail label is correctly set in the "Receive CV via Email" node and that emails are actually being labeled
- Word documents failing: Verify that the "Stream Doc/Docx File" node has the correct parent folder ID and that Google Drive credentials are authorized
- JD matching returns no results: Check that the "Access JD Files" node folder ID points to your Job Descriptions folder and that JD files are named clearly (e.g., "Marketing Director JD.pdf")
- JD matching is not relevant for my company: Update the company description in the System Messages of the "JD Matching Agent" and "Detailed JD Matching Agent" nodes
- "Can't find matching JD": Ensure the candidate's email mentions the role name OR their CV clearly indicates relevant experience for the available JDs
- Google Sheets errors: Verify that the sheet name is "AI Candidate Screening" and that the column headers exactly match workflow expectations (Submission ID, Date, CV, First Name, etc.)
- Slack message not appearing: Re-authorize Slack credentials and confirm the channel ID in the "Send Candidate Screening Confirmation" node
- Missing candidate name/email: The CV text must be readable; check PDF extraction quality or try converting complex CVs to a simpler format
- 401/403 API errors: Re-authorize all OAuth2 credentials (Gmail, Google Drive, Google Sheets, Slack)
- AI analysis quality issues: Edit the system prompts in the "JD Matching Agent" and "Recruiter Scoring Agent" nodes to refine screening criteria

Sample Outputs
- Google Sheets - AI Candidate Screening - sample
- Slack confirmation message

Acknowledgments
This workflow was inspired by Nate Herk's YouTube demonstration on building a resume analysis system. This implementation builds upon that foundation by adding dynamic job description matching (initial + detailed JD matching agents), Slack Block Kit integration with interactive buttons, updated Google Drive API methods for document handling, and enhanced candidate data capture in Google Sheets.