by DevCode Journey
Who is this for?
This workflow is designed for content creators, marketers, and entrepreneurs who want to automate their video production and social media publishing process. If you regularly post promotional or viral-style content on platforms like TikTok, YouTube Shorts, Instagram Reels, LinkedIn, and more, this template will save you hours of manual work.

What problem is this workflow solving? / Use case
Creating viral short-form videos is often time-consuming: you need to generate visuals, write scripts, edit videos, and then manually upload them to multiple platforms. Staying consistent across TikTok, YouTube Shorts, Instagram Reels, LinkedIn, Twitter/X, and others requires constant effort. This workflow solves the problem by automating the entire pipeline from idea → video creation → multi-platform publishing.

What this workflow does
1. Collects an idea and image from Telegram
2. Enhances visuals with NanoBanana for a user-generated-content style
3. Generates a complete video script with AI (OpenAI + structured prompts)
4. Creates the final video with VEO3 using your custom prompt and visuals
5. Rewrites captions with GPT to be short, catchy, and optimized for social platforms
6. Saves metadata in Google Sheets for tracking and management
7. Auto-uploads the video to all major platforms via Blotato (TikTok, YouTube, Instagram, LinkedIn, Threads, Pinterest, X/Twitter, Bluesky, Facebook)
8. Notifies you on Telegram with a preview link once publishing is complete

Setup
Connect your accounts:
- **Google Sheets** (for video tracking)
- **Telegram** (to receive and send media)
- **Blotato** (for multi-platform publishing)
- **OpenAI API** (for captions, prompts, and image analysis)
- **VEO3 API** (for video rendering)
- **Fal.ai** (for NanoBanana image editing)
- **Google Drive** (to store processed images)

Then:
1. Set your credentials in the respective nodes.
2. Adjust the Google Sheet IDs to match your own sheet structure.
3. Insert your Telegram bot token in the Set: Bot Token (Placeholder) node.
🙋 For Help & Community
- 🌐 Website: devcodejourney.com
- 🔗 LinkedIn: Connect with Shakil
- 📱 WhatsApp Channel: Join Now
- 💬 Direct Chat: Message Now
by Pikor
Different Articles Summarizer & Social Media Auto-Poster

This n8n template demonstrates how to extract full-text articles from different news websites, summarize them with AI, and automatically generate content for social networks (Twitter, Instagram, Threads, LinkedIn, YouTube). You can use it for any news topic. Example: posting summaries of breaking news articles.

Possible use cases:
- Automate press article summarization with GPT.
- Create social media posts optimized for young audiences.
- Publish content simultaneously across multiple platforms with the Late API.

How it works
1. The workflow starts manually or with a trigger.
2. URLs of news articles are defined in the Edit Fields node.
3. Each URL is processed separately via Split Out.
4. HTTP Request fetches the article HTML.
5. A custom Code node extracts clean text (title, content, main image).
6. OpenAI summarizes each article factually.
7. Aggregate combines the results.
8. Another OpenAI node (Message a model) creates structured JSON summaries for young readers.
9. A final OpenAI node (Message a model1) generates short social media posts (hook, summary, CTA, hashtags).
10. Images are extracted via HTML1 and uploaded to Google Drive.
11. Posts (text + image) are sent to the Late API for multi-platform scheduling (Twitter, Instagram, Threads, LinkedIn, YouTube).

Requirements
- OpenAI API key connected to n8n.
- Google Drive account (for storing article images).
- Late API credentials with platform account IDs.
- Valid list of article URLs.
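The template's actual Code node isn't shown here, but the text-extraction step it describes could be sketched like this. This is a minimal, regex-based sketch only (real article pages vary widely and often need site-specific selectors); the function name and the `og:image` heuristic are assumptions, not the template's exact implementation:

```javascript
// Hypothetical sketch of the "Custom Code" extraction node:
// pull title, main image, and body text out of raw article HTML.
function extractArticle(html) {
  const titleMatch = html.match(/<title[^>]*>([^<]*)<\/title>/i);
  // Many news sites expose the lead image via the og:image meta tag.
  const imageMatch = html.match(/<meta[^>]+property=["']og:image["'][^>]+content=["']([^"']+)["']/i);

  // Strip scripts, styles, and tags, then collapse whitespace into plain text.
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, ' ')
    .replace(/<style[\s\S]*?<\/style>/gi, ' ')
    .replace(/<[^>]+>/g, ' ')
    .replace(/\s+/g, ' ')
    .trim();

  return {
    title: titleMatch ? titleMatch[1].trim() : null,
    mainImage: imageMatch ? imageMatch[1] : null,
    content: text,
  };
}
```

In an n8n Code node you would call this once per item and return `{ json: extractArticle($json.data) }`, then feed the `content` field into the OpenAI summarization node.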
by Václav Čikl
Overview
Transform your Gmail sent folder into a comprehensive, enriched contact database automatically. This workflow processes hundreds or thousands of sent emails, extracting and enriching contact information using AI and web search, saving days of manual work.

What This Workflow Does
1. Loads sent Gmail messages and extracts basic contact information
2. Deduplicates contacts against your existing Google Sheets database
3. Searches for email conversation history with each contact
4. Runs AI-powered extraction on email threads (phone, socials, websites)
5. Falls back to a Brave API web search when no email history exists
6. Saves enriched data to Google Sheets with all discovered contact details

Perfect For
- **Musicians & bands** organizing booker/venue contacts
- **Freelancers & agencies** building client databases
- **Sales teams** enriching prospect lists from outbound campaigns
- **Consultants** creating structured contact databases from years of emails

Key Features

Intelligent Two-Path Enrichment
- **Path A (Email History)**: Analyzes existing email threads to extract contact details from signatures and message content
- **Path B (Web Search)**: Falls back to Brave API search + HTML scraping when no email history exists

AI-Powered Data Extraction
Uses GPT-5 Nano to intelligently parse:
- Phone numbers
- Website URLs
- LinkedIn profiles
- Instagram, Twitter, Facebook, YouTube, TikTok, LinkTree, BandCamp...
- Alternative email addresses

Built-in Deduplication
Prevents duplicate entries by checking existing Google Sheets records before processing.
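The deduplication step can be pictured as a simple set lookup over the existing sheet rows. The sketch below is an assumption about how such a check might work, not the template's exact code, and the `email` field name is hypothetical, adapt it to your sheet's column names:

```javascript
// Hypothetical sketch of the deduplication step: keep only extracted
// contacts whose email is not already present in the Google Sheet rows.
function dedupeContacts(extracted, existingRows) {
  // Normalize existing emails for case- and whitespace-insensitive matching.
  const seen = new Set(
    existingRows.map(r => (r.email || '').trim().toLowerCase()).filter(Boolean)
  );
  return extracted.filter(c => {
    const key = (c.email || '').trim().toLowerCase();
    if (!key || seen.has(key)) return false;
    seen.add(key); // also dedupe within the current batch
    return true;
  });
}
```

Running this before any AI or web-search calls is what keeps repeat runs cheap: already-known contacts never reach the paid enrichment paths.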
Free-Tier Friendly
Runs entirely on free tiers:
- Gmail API (free)
- OpenAI GPT-5 Nano (cost-effective)
- Brave Search API (2,000 free searches/month)
- Google Sheets (free)

Setup Requirements

Required Accounts & Credentials
- Gmail Account - OAuth2 credentials for Gmail API access
- OpenAI API Key - for the GPT-5 Nano model
- Brave Search API Key - free tier (2,000 searches/month)
- Google Sheets - OAuth2 credentials

Google Sheets Structure
Create a Google Sheet with these columns (see template link). Template Sheet: Make a copy here

How to Use
1. Clone this workflow to your n8n instance
2. Configure credentials for Gmail, OpenAI, Brave Search, and Google Sheets
3. Create/connect your Google Sheet using the template structure
4. Run manually to process all sent emails and build your initial database
5. Review results in Google Sheets, enriched with discovered contact info

First Run Tips
- Start with a smaller Gmail query (e.g., last 6 months) to test
- Check your Brave API quota before processing large volumes
- The manual trigger means you control when processing happens
- Processing time varies with email volume (typically 2-5 seconds per contact)

Customization Ideas

Extend the Enrichment
- Include company information parsing
- Extract job titles from email signatures

Automate Regular Updates
- Convert the manual trigger to a scheduled trigger
- Process only recent sent emails for incremental updates
- Add an email notification when new contacts are added

Integration Options
- Push enriched contacts to a CRM (HubSpot, Salesforce)
- Send Slack notifications for high-value contacts
- Export to Airtable for relational database features

Improve Accuracy
- Add human-in-the-loop review for uncertain extractions
- Implement confidence scoring for AI-extracted data
- Add validation checks for phone numbers and URLs

Use Case Example
Music Promoter Building Venue Database:
- Processed 1,835 sent emails to bookers and venues
- AI extracted contact details from 60% via email signatures
- Brave search found websites for the remaining 40%
- Final database: 1,835 enriched contacts ready for outreach
- Time saved: ~40 hours of manual data entry

Technical Notes
- **Rate Limiting**: Brave API free tier = 2,000 searches/month
- **Duplicates**: Handled at workflow start, not during processing
- **Empty Results**: Stores email + name even when enrichment fails
- **Model**: Uses GPT-5 Nano for cost-effective parsing
- **Gmail Scope**: Reads sent emails only (not inbox)

Cost Estimate
For processing 1,000 contacts:
- **Gmail API**: Free
- **GPT-5 Nano**: ~$0.50-2 (depending on email length)
- **Brave Search**: Free (within the 2K/month limit)
- **Google Sheets**: Free
- **Total**: Under $2 for 1,000 enriched contacts

Template Author
Questions or need help with setup?
📧 Email: xciklv@gmail.com
💼 LinkedIn: https://www.linkedin.com/in/vaclavcikl/
by Dr. Firas
Generate AI viral videos with NanoBanana & VEO3, shared on socials via Blotato

Who is this for?
This workflow is designed for content creators, marketers, and entrepreneurs who want to automate their video production and social media publishing process. If you regularly post promotional or viral-style content on platforms like TikTok, YouTube Shorts, Instagram Reels, LinkedIn, and more, this template will save you hours of manual work.

What problem is this workflow solving? / Use case
Creating viral short-form videos is often time-consuming: you need to generate visuals, write scripts, edit videos, and then manually upload them to multiple platforms. Staying consistent across TikTok, YouTube Shorts, Instagram Reels, LinkedIn, Twitter/X, and others requires constant effort. This workflow solves the problem by automating the entire pipeline from idea → video creation → multi-platform publishing.

What this workflow does
1. Collects an idea and image from Telegram.
2. Enhances visuals with NanoBanana for a user-generated-content style.
3. Generates a complete video script with AI (OpenAI + structured prompts).
4. Creates the final video with VEO3 using your custom prompt and visuals.
5. Rewrites captions with GPT to be short, catchy, and optimized for social platforms.
6. Saves metadata in Google Sheets for tracking and management.
7. Auto-uploads the video to all major platforms via Blotato (TikTok, YouTube, Instagram, LinkedIn, Threads, Pinterest, X/Twitter, Bluesky, Facebook).
8. Notifies you on Telegram with a preview link once publishing is complete.

Setup
Connect your accounts:
- Google Sheets (for video tracking)
- Telegram (to receive and send media)
- Blotato (for multi-platform publishing)
- OpenAI API (for captions, prompts, and image analysis)
- VEO3 API (for video rendering)
- Fal.ai (for NanoBanana image editing)
- Google Drive (to store processed images)

Then:
1. Set your credentials in the respective nodes.
2. Adjust the Google Sheet IDs to match your own sheet structure.
3. Insert your Telegram bot token in the Set: Bot Token (Placeholder) node.

How to customize this workflow to your needs
- **Platforms**: Disable or enable only the Blotato social accounts you want to post to.
- **Video style**: Adjust the master prompt schema in the Set Master Prompt node to fine-tune tone, camera style, or video format.
- **Captions**: Modify the GPT prompt in the Rewrite Caption with GPT-4o node to control length and tone.
- **Notifications**: Customize the Telegram nodes to notify team members, not just yourself.
- **Scheduling**: Add a Cron trigger if you want automatic posting at specific times.

✨ With this workflow, you go from idea → AI-enhanced video → instant multi-platform publishing in just minutes, with almost no manual work.

📄 Documentation: Notion Guide
Need help customizing? Contact me for consulting and support: LinkedIn / YouTube
by Dr. Firas
💥 Automate YouTube thumbnail creation from video links (with Templated.io)

Who is this for?
This workflow is designed for content creators, YouTubers, and automation enthusiasts who want to automatically generate stunning YouTube thumbnails and streamline their publishing workflow — all within n8n. If you regularly post videos and spend hours designing thumbnails manually, this automation is built for you.

What problem is this workflow solving?
Creating thumbnails is time-consuming — yet crucial for video performance. This workflow completely automates that process: no more manual design, no more downloading screenshots, no more repetitive uploads. In less than 2 minutes, you can refresh your entire YouTube thumbnail library and make your channel look brand new.

What this workflow does
Once activated, this workflow can:
✅ Receive YouTube video links via Telegram
✅ Extract metadata (title, description, channel info) via the YouTube API
✅ Generate a custom thumbnail automatically using Templated.io
✅ Upload the new thumbnail to Google Drive
✅ Log data in Google Sheets
✅ Send email and Telegram notifications when ready
✅ Create and publish AI-generated social posts on LinkedIn, Facebook, and Twitter via Blotato

Bonus: You can re-create dozens of YouTube covers in minutes — saving up to 5 hours per week and around $500/month in manual design effort.
Setup
1️⃣ Get a YouTube Data API v3 key from the Google Cloud Console
2️⃣ Create a Templated.io account and get your API key + template ID
3️⃣ Set up a Telegram bot using @BotFather
4️⃣ Create a Google Drive folder and copy the folder ID
5️⃣ Create a Google Sheet with columns: Date, Video ID, Video URL, Title, Thumbnail Link, Status
6️⃣ Get your Blotato API key from the dashboard
7️⃣ Connect your social media accounts to Blotato
8️⃣ Fill in all credentials in the Workflow Configuration node
9️⃣ Test by sending a YouTube URL to your Telegram bot

How to customize this workflow
- Replace the Templated.io template ID with your own custom thumbnail layout
- Modify the OpenAI node prompts to change text tone or style
- Add or remove social platforms in the Blotato section
- Adjust the wait time (default: 5 minutes) based on template complexity
- Localize or translate the generated captions as needed

Expected Outcome
With one Telegram message, you'll receive:
- A professional custom thumbnail
- An instant email + Telegram notification
- A Google Drive link with your ready-to-use design
And your social networks will be automatically updated — no manual uploads.

Credits
- Thumbnail generation powered by Templated.io
- Social publishing powered by Blotato
- Automation orchestrated via n8n

👋 Need help or want to customize this?
📩 Contact: LinkedIn
📺 YouTube: @DRFIRASS
🚀 Workshops: Mes Ateliers n8n
🎥 Watch This Tutorial
📄 Documentation: Notion Guide
Need help customizing? Contact me for consulting and support: LinkedIn / YouTube / 🚀 Mes Ateliers n8n
by Adem Tasin
Overview
This workflow turns a single YouTube video into multiple content formats and publishes them across different platforms, with an optional human approval step. It helps content creators, marketers, and agencies automatically repurpose long-form video content into blog posts and social media updates.

How it works
1. Fetches the transcript from a YouTube video
2. Uses AI to generate blog and social media content
3. Optionally waits for manual approval
4. Publishes content to the selected platforms
5. Logs the result for tracking

Setup steps
1. Add your API credentials (AI, transcript, and social platforms)
2. Paste a YouTube video URL
3. Choose auto-publish or approval mode
4. Run the workflow

Use cases
- Content repurposing for YouTube creators
- Automated blog and social media publishing
- Marketing automation for agencies
- AI-assisted content production pipelines

🧑‍💻 Creator Information
Developed by: Adem Tasin
🌐 Website: ademtasin.com
💼 LinkedIn: Adem Tasin
by WeblineIndia
Community Contest Tracker (FB Comments) → Sentiment Analysis → Telegram Winner Alerts + Airtable Proof

This workflow automatically monitors a Facebook post, extracts comments, enforces a "past winner" blocklist, analyzes sentiment with AI to find positive entries, randomly selects a winner, stores the winner in Airtable, and announces the result via Telegram.

The workflow runs every night to manage your daily community giveaways. It fetches fresh comments from a specific Facebook post and cross-references users against a list of previous winners stored in Airtable to ensure fairness. It uses OpenAI to filter for genuinely positive sentiment (removing spam), selects a random winner, saves the record, and sends a celebratory announcement to your Telegram channel.

You receive:
- **Daily automated comment collection**
- **Fairness enforcement (blocklist for past winners)**
- **AI-powered sentiment filtering (positive vibes only)**
- **Automated winner selection & notification**

Ideal for community managers and brand owners who want to run fair, high-engagement contests without manually reading hundreds of comments or tracking past winners in spreadsheets.

Quick Start – Implementation Steps
1. Add your Facebook Graph API credentials in the HTTP Request node.
2. Connect and configure your Airtable base (Winners table).
3. Add your OpenAI API key for sentiment analysis.
4. Connect your Telegram bot credentials and set the Chat ID.
5. Update the Post ID in the "Get FB Comments" node.
6. Activate the workflow — daily contest automation begins instantly.

What It Does
This workflow automates the entire lifecycle of a social media contest:
1. Daily Trigger: Runs automatically at 9:00 PM every day.
2. Data Ingestion: Fetches the latest comments from Facebook and the full list of past winners from Airtable simultaneously.
3. Pre-Processing: Creates a blocklist of users who won in the last 30 days, then filters out spam, short comments (e.g., single emojis), and blocklisted users.
4. AI Analysis: Uses GPT-4o-mini to analyze the text of eligible comments, filtering specifically for "Positive" sentiment.
5. Selection: Randomly picks one winner from the pool of positive comments.
6. Storage: Saves the winner's name, Facebook ID, and comment to Airtable.
7. Notification: Sends a "Winner Announcement" to your public Telegram channel. If any errors occur (e.g., a failed DB save), they are logged to Supabase and the admin is alerted.

This ensures your contests are fair, spam-free, and consistently managed with zero manual effort.

Who's It For
This workflow is ideal for:
- Social Media Managers
- Community Moderators
- Digital Marketing Agencies
- Brand Owners running daily giveaways
- Influencers managing high-volume comment sections
- Customer Experience teams rewarding positive feedback

To run this workflow, you will need:
- **n8n instance** (cloud or self-hosted)
- **Facebook Developer App** (Graph API access token)
- **Airtable base** + personal access token
- **OpenAI API key** (or compatible LLM)
- **Telegram bot token**
- **Supabase project** (optional, for error logging)

How It Works
1. Daily Trigger – The schedule node initiates the process.
2. Fetch Data – Comments are pulled from Facebook; winners are pulled from Airtable.
3. Code Filter – A JavaScript node removes past winners and low-quality spam.
4. Sentiment Analysis – AI determines whether each comment is Positive, Neutral, or Negative.
5. Pick Winner – A randomized logic block selects one "Positive" user.
6. Record Keeping – The winner is officially logged in your database.
7. Broadcast – The winner is announced to the community via Telegram.

Setup Steps
1. Import the provided n8n JSON file.
2. Open the Get FB Comments node → add credentials and paste your specific Post ID.
3. Open the Get Past Winners node → link to your Airtable "Winners" table.
4. Open the OpenAI Chat Model node → add your API key.
5. Open Create a record (Airtable) → map the fields: Name, Facebook ID, Date.
6. Open Send a text message (Telegram) → add your Chat ID (e.g., @mychannel).
7. Activate the workflow — done!
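The Pre-Filter and Pick Winner steps can be sketched in a few lines of Code-node JavaScript. This is an illustrative sketch only; the field names (`from.id`, `message`, `facebookId`, `date`) are assumptions based on the Graph API and the Airtable columns described here, not the template's exact code:

```javascript
// Hypothetical sketch of the blocklist filter and random winner selection.
const MIN_LENGTH = 2;   // default minimum comment length
const BLOCK_DAYS = 30;  // past winners are excluded for 30 days

function eligibleComments(comments, pastWinners, now = new Date()) {
  const cutoff = now.getTime() - BLOCK_DAYS * 24 * 60 * 60 * 1000;
  // Users who won within the blocklist window.
  const blocked = new Set(
    pastWinners
      .filter(w => new Date(w.date).getTime() >= cutoff)
      .map(w => w.facebookId)
  );
  // Drop short/spam comments and blocklisted users.
  return comments.filter(
    c => c.message.trim().length > MIN_LENGTH && !blocked.has(c.from.id)
  );
}

function pickWinner(positiveComments) {
  if (positiveComments.length === 0) return null;
  return positiveComments[Math.floor(Math.random() * positiveComments.length)];
}
```

Changing `BLOCK_DAYS` to 7 is the one-line edit the customization section below refers to when it suggests letting users win again after a week.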
How To Customize Nodes

Customize Filtering Logic
Modify the Pre-Filter (Blocklist) Code node:
- Change the minimum character length (default is 2).
- Adjust the blocklist duration (e.g., allow users to win again after 7 days instead of 30).

Customize AI Criteria
Modify the Sentiment Analysis or OpenAI prompt:
- Look for "Creative" or "Humorous" comments instead of just "Positive".
- Filter for specific keywords related to your brand.

Customize Notifications
Replace Telegram with:
- **Slack** (for internal team updates).
- **Discord** (for gaming communities).
- **Email** (SMTP/Gmail) to notify the marketing team.

Customize Storage
Replace Airtable with:
- Google Sheets
- Notion
- PostgreSQL / MySQL

Add-Ons (Optional Enhancements)
You can extend this workflow to:
- **Auto-Reply:** Use the Facebook API to reply to the winner's comment automatically ("Congrats! DM us to claim.").
- **Image Generation:** Use OpenAI DALL-E or Bannerbear to generate a "Winner Certificate" image.
- **Cross-Posting:** Automatically post the winner's name to Twitter/X or LinkedIn.
- **Sentiment Report:** Create a weekly summary of overall community sentiment (Positive vs. Negative ratio).
- **Prize Tiering:** Assign different prizes based on the quality of the comment.

Use Case Examples
1. Daily Product Giveaways: Reward one user every day who comments on why they love your product.
2. Feedback Drives: Encourage users to leave constructive feedback and reward the most helpful positive comment.
3. Community Engagement: Keep your group active by automating "Best Comment of the Day" rewards.
4. Brand Loyalty Programs: Track "super fans" by counting how many times they participate (even if they don't win).

Troubleshooting Guide

| Issue | Possible Cause | Solution |
| :--- | :--- | :--- |
| No comments found | Invalid Post ID or token | Check the Facebook Graph API token and Post ID. |
| No winner selected | No positive comments | The AI found no "Positive" sentiment. Check the prompt or input data. |
| Airtable error | Field mismatch | Ensure column names in Airtable match exactly (Name, Facebook ID). |
| Telegram error | Bot permissions | Ensure the bot is an admin in the channel. |
| Workflow stuck | API rate limit | Check OpenAI or Facebook API usage limits. |

Need Help?
If you need help customizing or extending this workflow, adding multi-platform support (Instagram/YouTube), integrating complex prize logic, or setting up advanced dashboards, feel free to hire n8n automation developers at WeblineIndia. We are happy to assist you with advanced automation solutions.
by Țugui Dragoș
This workflow automates the post-publish process for YouTube videos, combining advanced SEO optimization, cross-platform promotion, and analytics reporting. It is designed for creators, marketers, and agencies who want to maximize the reach and performance of their YouTube content with minimal manual effort.

Features

**SEO Automation**
- Fetches video metadata and analyzes competitor and trending data.
- Uses AI to generate SEO-optimized titles, descriptions, and tags.
- Calculates an SEO score and applies A/B testing logic to select the best title.
- Updates the video metadata on YouTube automatically.

**Cross-Platform Promotion**
- Generates platform-specific promotional content (LinkedIn, X/Twitter, Instagram, Facebook, etc.) using AI.
- Publishes posts to each connected social channel.
- Extracts video clips and analyzes thumbnails for enhanced promotion.

**Engagement Monitoring & Analytics**
- Monitors YouTube comments, detects negative sentiment, and drafts AI-powered replies.
- Logs all key data (videos, comments, analytics) to Google Sheets for tracking and reporting.
- Runs a weekly analytics job to aggregate performance, calculate engagement/viral indicators, and email a detailed report.

**Notifications & Alerts**
- Sends Slack alerts when a new video is published or when viral potential/negative comments are detected.

How It Works
1. Trigger: The workflow starts automatically when a new YouTube video is published (via webhook) or on a weekly schedule for analytics.
2. Video Intake & SEO: Fetches video details (title, description, tags, stats), gathers competitor and trending-topic data, uses AI to generate improved SEO assets, calculates an SEO score, selects the best title (A/B test), and updates the video metadata.
3. Clip & Thumbnail Processing: If the video is long enough, runs thumbnail analysis and extracts short clips for social media.
4. Cross-Platform Promotion: Generates and formats promotional posts for each social platform and publishes them automatically to the enabled channels.
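The SEO score and A/B title selection described above are not spelled out in this summary, so the sketch below is purely illustrative: the weights, length ranges, and the 0-100 scale are assumptions meant to show where such scoring logic would live in a Code node, not the workflow's actual formula:

```javascript
// Illustrative SEO-scoring heuristic (all thresholds are assumptions).
function seoScore({ title, description, tags }, keyword) {
  let score = 0;
  if (title.length >= 30 && title.length <= 70) score += 30;        // title length sweet spot
  if (keyword && title.toLowerCase().includes(keyword.toLowerCase())) score += 25; // keyword in title
  if (description.length >= 200) score += 25;                        // substantial description
  score += Math.min(tags.length, 10) * 2;                            // up to 20 points for tags
  return score; // 0-100
}

// A/B test logic: keep the candidate title with the higher score.
function pickTitle(video, candidates, keyword) {
  return candidates.reduce((best, t) =>
    seoScore({ ...video, title: t }, keyword) >
    seoScore({ ...video, title: best }, keyword) ? t : best
  );
}
```

Whatever formula you adopt, keeping it in one function like this makes the "Tip" at the end of this section practical: the scoring logic becomes a single place to customize.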
5. Engagement & Comment Monitoring: Fetches comments, detects negative sentiment, drafts AI-powered replies, and logs comments and responses to Google Sheets.
6. Analytics & Reporting: Aggregates weekly analytics, calculates engagement and viral indicators, logs insights, and sends a weekly report via email.
7. Notifications: Sends Slack alerts for new video publications and viral/negative comment detection.

Setup Instructions
1. Connect YouTube: Set up YouTube API credentials and the required IDs in the Workflow Configuration node.
2. Connect OpenAI: Add your OpenAI credentials for AI-powered content generation.
3. Connect Slack: Configure Slack credentials and specify alert channels.
4. Connect Google Sheets: Set up service account credentials for logging video, comment, and analytics data.
5. Configure Social Platforms: Add credentials for LinkedIn, Twitter (X), Instagram, and Facebook as needed.
6. Test the Workflow: Publish a test video and verify that metadata updates, social posts, logging, and weekly reports work as expected.

Use Cases
- **YouTube Creators:** Automate SEO, promotion, and analytics to grow your channel faster.
- **Marketing Teams:** Streamline multi-channel video campaigns and reporting.
- **Agencies:** Deliver consistent, data-driven YouTube growth for multiple clients.

Requirements
- YouTube API credentials
- OpenAI API key
- Slack API token
- Google Sheets service account
- (Optional) LinkedIn, Twitter, Instagram, Facebook API credentials

Limitations
- Requires valid API credentials for all connected services.
- AI-powered features depend on OpenAI API access.
- Social posting is limited to platforms with available n8n nodes and valid credentials.

Tip: You can easily customize prompts, scoring logic, and enabled platforms to fit your channel's unique needs.
by Ehsan
Who is this for?
This workflow is for Product Managers, Indie Hackers, and Customer Success teams who collect feature requests but struggle to notify specific users when those features actually ship. It helps you turn old feedback into customer loyalty and potential upsells.

What it does
This workflow creates a "Semantic Memory" of user requests. Instead of relying on exact keyword tags, it uses vector embeddings to understand the meaning of a request. For example, if a user asks for a "Night theme" and months later you release "Dark Mode," this workflow understands they are the same thing, finds that user, and drafts a personal email to them.

How it works
1. Listen: Receives new requests via Tally Forms, vectorizes the text using Nomic Embed Text (via Ollama or OpenAI), and stores them in Supabase.
2. Watch: Monitors your changelog (RSS) or waits for a manual trigger when you ship a new feature.
3. Match: Performs a vector similarity search in Supabase to find users who requested semantically similar features in the past.
4. Notify: An AI Agent drafts a hyper-personalized email connecting the user's specific past request to the new feature, saving it as a Gmail draft (for safety).

Requirements
- **Supabase Project:** You need a project with the vector extension enabled.
- **AI Model:** This template is pre-configured for **Ollama (local)** to keep it free, but works perfectly with OpenAI.
- **Tally Forms & Gmail:** For input and output.

Setup steps
1. Database Setup (crucial): Copy the SQL script provided in the workflow's red sticky note and run it in your Supabase SQL Editor. This creates the necessary tables and the vector search function.
2. Credentials: Add your credentials for Tally, Supabase, and Gmail.
3. URL Config: Update the HTTP Request node with your specific Supabase project URL.

SQL Script
Open your Supabase SQL Editor and paste this script to set up the tables and search function:

```sql
-- 1. Enable Vector Extension
create extension if not exists vector;

-- 2. Create Request Table (Smart Columns)
create table feature_requests (
  id bigint generated by default as identity primary key,
  content text,
  metadata jsonb,
  embedding vector(768), -- 768 for Nomic, 1536 for OpenAI
  created_at timestamp with time zone default timezone('utc'::text, now()),
  user_email text generated always as (metadata->>'user_email') stored,
  user_name text generated always as (metadata->>'user_name') stored
);

-- 3. Create Search Function
create or replace function match_feature_requests (
  query_embedding vector(768),
  match_threshold float,
  match_count int
)
returns table (
  id bigint,
  user_email text,
  user_name text,
  content text,
  similarity float
)
language plpgsql
as $$
begin
  return query
  select
    feature_requests.id,
    feature_requests.user_email,
    feature_requests.user_name,
    feature_requests.content,
    1 - (feature_requests.embedding <=> query_embedding) as similarity
  from feature_requests
  where 1 - (feature_requests.embedding <=> query_embedding) > match_threshold
  order by feature_requests.embedding <=> query_embedding
  limit match_count;
end;
$$;
```

⚠️ Dimension Warning: This SQL is set up for 768 dimensions (compatible with the local nomic-embed-text model included in the template). If you switch the Embeddings node to OpenAI's text-embedding-3-small, you must change all instances of 768 to 1536 in the SQL script above before running it.

How to customize
- **Change Input:** Swap the Tally node for Typeform, Intercom, or Google Sheets.
- **Change AI:** The template includes notes on how to swap the local Ollama nodes for OpenAI nodes if you prefer cloud hosting.
- **Change Output:** Swap Gmail for Slack, SendGrid, or HubSpot to notify your sales team instead of the user directly.
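A note on the math behind the SQL: `<=>` is pgvector's cosine-distance operator, so `1 - distance` is cosine similarity. The sketch below reproduces that computation in plain JavaScript, which can help you sanity-check embeddings locally and pick a sensible `match_threshold` (typical values are in the 0.5-0.8 range, but this depends on your embedding model and is something to tune, not a given):

```javascript
// Cosine similarity of two equal-length vectors, matching the
// `1 - (embedding <=> query_embedding)` expression in the SQL function.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```

Identical vectors score 1, orthogonal (unrelated) vectors score 0, so "Night theme" and "Dark Mode" embeddings clearing your threshold is exactly what the Match step relies on.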
by Gtaras
Who's it for
This workflow is for hotel managers, travel agencies, and hospitality teams who receive booking requests via email. It eliminates the need for manual data entry by automatically parsing emails and attachments, assigning booking cases to the right teams, and tracking performance metrics.

What it does
This workflow goes beyond simple automation by including enterprise-grade logic and security:
- 🛡️ **Gatekeeper:** Watches your Gmail and filters irrelevant emails before spending money on AI tokens.
- 🧠 **AI Brain:** Uses OpenAI (GPT-5-mini) to extract structured data from unstructured email bodies and PDF attachments.
- ⚖️ **Business Logic:** Automatically routes tasks to different teams based on urgency, room count, and VIP status.
- 🔒 **Security:** Catches PII (like credit card numbers) and scrubs it before it hits your database.
- 🚨 **Safety Net:** If anything breaks, a dedicated error-handling path logs the issue immediately so no booking is lost.
- 📈 **ROI Tracking:** Calculates the time saved per booking to prove the value of the automation.

How to set up
1. Create your Google Sheet: Create a new sheet and rename the tabs to: Cases, Team Assignments, Error Logs, Success Metrics.
2. Add Credentials: Go to n8n Settings → Credentials and add your Gmail (OAuth2), Google Sheets, and OpenAI API keys.
3. Configure User Settings: Open the "Configuration: User Settings" node at the start of the workflow and paste your specific Google Sheet ID and admin email there.
4. Adjust Business Rules: Open the "Apply Business Rules" node (a Code node) to adjust the logic for team assignment (e.g., defining what counts as a "VIP" booking).
5. Customize Templates: Modify the email templates in the Gmail nodes to match your hotel's branding.
6. Test: Send a sample booking email to yourself to verify the filters and data extraction.
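To give a feel for what the "Apply Business Rules" Code node contains, here is a minimal sketch. The thresholds, team names, and field names below are assumptions for illustration; the template expects you to replace them with your hotel's own rules:

```javascript
// Hypothetical routing logic: assign team and priority from
// urgency (days until check-in), room count, and VIP status.
function assignTeam(booking) {
  const daysUntilCheckin = Math.ceil(
    (new Date(booking.checkInDate) - new Date()) / (24 * 60 * 60 * 1000)
  );
  let team = 'reservations';
  let priority = 'normal';

  // Large groups and VIP guests go to a dedicated team.
  if (booking.isVip || booking.numberOfRooms >= 10) {
    team = 'groups-and-vip';
    priority = 'high';
  }
  // Imminent check-ins override everything else.
  if (daysUntilCheckin <= 3) priority = 'urgent';

  return { ...booking, assigned_team: team, priority, days_until_checkin: daysUntilCheckin };
}
```

The output fields deliberately mirror the `assigned_team`, `priority`, and `days_until_checkin` columns of the Cases tab described below, so the node's result can be written to the sheet without remapping.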
Setup requirements Gmail account (OAuth2 connected) Google Sheets (with the 4 tabs listed below) OpenAI API key (GPT-5-mini recommended) n8n Cloud or self-hosted instance How to customize Filter Booking Emails:** Update the trigger node keywords to match your specific email subjects (e.g., "Reservation", "Booking Request"). Apply Business Rules:** Edit the Javascript in the Code node to fit your company’s internal logic (e.g., changing priority thresholds). New Metrics:** Add new columns in the Google Sheet (e.g., “Revenue Metrics”) and map them in the "Update Sheet" node. AI Model:** Switch to GPT-5 if you need higher reasoning capabilities for complex PDF layouts. Google Sheets Structure Description This workflow uses a Google Sheets document with four main tabs to track and manage hotel booking requests. 1. Cases This is the main data log for all incoming booking requests. case_id:** Unique identifier generated by the workflow. processed_date:** Timestamp when the workflow processed the booking. travel_agency / contact_details:** Extracted from the email. number_of_rooms / check_in_date:** Booking details parsed by the AI. special_requests:** Optional notes (e.g., airport transfer). assigned_team / priority:** Automatically set based on business rules. days_until_checkin:** Dynamic field showing urgency. 2. Team Assignments Stores internal routing and assignment details. timestamp:** When the case was routed. case_id:** Link to the corresponding record in the Cases tab. assigned_team / team_email:** Which department handles this request. priority:** Auto-set based on room count or urgency. 3. Error Log A critical audit trail that captures details about any failed processing steps. error_type:** Categorization of the failure (e.g., MISSING_REQUIRED_FIELDS). error_message:** Detailed technical explanation for debugging. original_sender / snippet:** Context to help you manually process the request if needed. 4. 
4. Success Metrics

Tracks the results of your automation to prove its value.

- **processing_time_seconds:** The time savings achieved by the automation (workflow run time vs. estimated human handling time).
- **record_updated:** Confirmation that the database was updated.

🙋 Support

If you encounter any issues during setup or have questions about customization, please reach out to our dedicated support email: foivosautomationhelp@gmail.com
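The ROI figure compares the workflow's run time against how long a human would take. A minimal sketch of that calculation, assuming a hypothetical 5-minute manual baseline (the template's real baseline may differ):

```javascript
// Illustrative ROI calculation behind a processing_time_seconds column.
// The 300-second manual baseline is an assumption -- measure your own.
function timeSavedSeconds(workflowStartedAt, workflowFinishedAt, manualBaselineSeconds = 300) {
  const runSeconds = (workflowFinishedAt - workflowStartedAt) / 1000;
  // Never report negative savings if a run happens to exceed the baseline.
  return Math.max(0, Math.round(manualBaselineSeconds - runSeconds));
}
```

Summing this column across the Cases tab gives a running total of hours saved by the automation.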
by TOMOMITSU ASANO
{ "name": "IoT Sensor Data Aggregation with AI-Powered Anomaly Detection", "nodes": [ { "parameters": { "content": "## How it works\nThis workflow monitors IoT sensors in real-time. It ingests data via MQTT or a schedule, normalizes the format, and removes duplicates using data fingerprinting. An AI Agent then analyzes readings against defined thresholds to detect anomalies. Finally, it routes alerts to Slack or Email based on severity and logs everything to Google Sheets.\n\n## Setup steps\n1. Configure the MQTT Trigger with your broker details.\n2. Set your specific limits in the Define Sensor Thresholds node.\n3. Connect your OpenAI credential to the Chat Model node.\n4. Authenticate the Gmail, Slack, and Google Sheets nodes.\n5. Create a Google Sheet with headers: timestamp, sensorId, location, readings, analysis.", "height": 484, "width": 360 }, "id": "298da7ff-0e47-4b6c-85f5-2ce77275cdf3", "name": "Main Overview", "type": "n8n-nodes-base.stickyNote", "typeVersion": 1, "position": [ -2352, -480 ] }, { "parameters": { "content": "## 1. Data Ingestion\nCaptures sensor data via MQTT for real-time streams or runs on a schedule for batch processing. Both streams are merged for unified handling.", "height": 488, "width": 412, "color": 7 }, "id": "4794b396-cd71-429c-bcef-61780a55d707", "name": "Section: Ingestion", "type": "n8n-nodes-base.stickyNote", "typeVersion": 1, "position": [ -1822, -48 ] }, { "parameters": { "content": "## 2. Normalization & Deduplication\nSets monitoring thresholds, standardizes the JSON structure, creates a content hash, and filters out duplicate readings to prevent redundant API calls.", "height": 316, "width": 884, "color": 7 }, "id": "339e7cb7-491e-44c9-b561-983e147237d8", "name": "Section: Processing", "type": "n8n-nodes-base.stickyNote", "typeVersion": 1, "position": [ -1376, 32 ] }, { "parameters": { "content": "## 3. 
AI Anomaly Detection\nAn AI Agent evaluates sensor data against thresholds to identify anomalies, assigning severity levels and providing actionable recommendations.", "height": 528, "width": 460, "color": 7 }, "id": "ebcb7ca3-f70c-4a90-8a2a-f489e7be4c73", "name": "Section: AI Analysis", "type": "n8n-nodes-base.stickyNote", "typeVersion": 1, "position": [ -422, 24 ] }, { "parameters": { "content": "## 4. Routing & Archiving\nRoutes alerts based on severity (Critical = Email+Slack, Warning = Slack) and archives all data points to Google Sheets for historical analysis.", "height": 756, "width": 900, "color": 7 }, "id": "7f2b32a5-d3b2-4fea-844f-4b39b8e8a239", "name": "Section: Alerting", "type": "n8n-nodes-base.stickyNote", "typeVersion": 1, "position": [ 94, -196 ] }, { "parameters": { "topics": "sensors/+/data", "options": {} }, "id": "bc86720b-9de9-4693-b090-343d3ebad3a3", "name": "MQTT Sensor Trigger", "type": "n8n-nodes-base.mqttTrigger", "typeVersion": 1, "position": [ -1760, 88 ] }, { "parameters": { "rule": { "interval": [ { "field": "minutes", "minutesInterval": 15 } ] } }, "id": "1c38f2d0-aa00-447e-bdae-bffd08c38461", "name": "Batch Process Schedule", "type": "n8n-nodes-base.scheduleTrigger", "typeVersion": 1.2, "position": [ -1760, 280 ] }, { "parameters": { "mode": "chooseBranch" }, "id": "f9b41822-ee61-448b-b324-38483036e0e1", "name": "Merge Triggers", "type": "n8n-nodes-base.merge", "typeVersion": 3, "position": [ -1536, 184 ] }, { "parameters": { "mode": "raw", "jsonOutput": "{\n \"thresholds\": {\n \"temperature\": {\"min\": -10, \"max\": 50, \"unit\": \"C\"},\n \"humidity\": {\"min\": 20, \"max\": 90, \"unit\": \"%\"},\n \"pressure\": {\"min\": 950, \"max\": 1050, \"unit\": \"hPa\"},\n \"co2\": {\"min\": 400, \"max\": 2000, \"unit\": \"ppm\"}\n },\n \"alertConfig\": {\n \"criticalChannel\": \"#iot-critical\",\n \"warningChannel\": \"#iot-alerts\",\n \"emailRecipients\": \"ops@example.com\"\n }\n}", "options": {} }, "id": 
"308705a8-edc7-4435-9250-487aa528e033", "name": "Define Sensor Thresholds", "type": "n8n-nodes-base.set", "typeVersion": 3.4, "position": [ -1312, 184 ] }, { "parameters": { "jsCode": "const items = $input.all();\nconst thresholds = $('Define Sensor Thresholds').first().json.thresholds;\nconst results = [];\n\nfor (const item of items) {\n let sensorData;\n try {\n sensorData = typeof item.json.message === 'string' \n ? JSON.parse(item.json.message) \n : item.json;\n } catch (e) {\n sensorData = item.json;\n }\n \n const now = new Date();\n const reading = {\n sensorId: sensorData.sensorId || sensorData.topic?.split('/')[1] || 'unknown',\n location: sensorData.location || 'Main Facility',\n timestamp: now.toISOString(),\n readings: {\n temperature: sensorData.temperature ?? null,\n humidity: sensorData.humidity ?? null,\n pressure: sensorData.pressure ?? null,\n co2: sensorData.co2 ?? null\n },\n metadata: {\n receivedAt: now.toISOString(),\n source: item.json.topic || 'batch',\n thresholds: thresholds\n }\n };\n \n results.push({ json: reading });\n}\n\nreturn results;" }, "id": "a2008189-5ace-418b-b0db-d51d63dcf2d8", "name": "Parse Sensor Payload", "type": "n8n-nodes-base.code", "typeVersion": 2, "position": [ -1088, 184 ] }, { "parameters": { "type": "SHA256", "value": "={{ $json.sensorId + '-' + $json.timestamp + '-' + JSON.stringify($json.readings) }}", "dataPropertyName": "dataHash" }, "id": "bf8db555-a10e-4468-a44a-cdc4c97e5b80", "name": "Generate Data Fingerprint", "type": "n8n-nodes-base.crypto", "typeVersion": 1, "position": [ -864, 184 ] }, { "parameters": { "compare": "selectedFields", "fieldsToCompare": "dataHash", "options": {} }, "id": "a45405e2-d211-449d-84d7-4538eaf56fcd", "name": "Remove Duplicate Readings", "type": "n8n-nodes-base.removeDuplicates", "typeVersion": 1, "position": [ -640, 184 ] }, { "parameters": { "text": "=Analyze this IoT sensor reading and determine if there are any anomalies:\n\nSensor ID: {{ $json.sensorId }}\nLocation: {{ 
$json.location }}\nTimestamp: {{ $json.timestamp }}\n\nReadings:\n- Temperature: {{ $json.readings.temperature }}°C (Normal: {{ $json.metadata.thresholds.temperature.min }} to {{ $json.metadata.thresholds.temperature.max }})\n- Humidity: {{ $json.readings.humidity }}% (Normal: {{ $json.metadata.thresholds.humidity.min }} to {{ $json.metadata.thresholds.humidity.max }})\n- CO2: {{ $json.readings.co2 }} ppm (Normal: {{ $json.metadata.thresholds.co2.min }} to {{ $json.metadata.thresholds.co2.max }})\n\nProvide your analysis in this exact JSON format:\n{\n \"hasAnomaly\": true/false,\n \"severity\": \"critical\"/\"warning\"/\"normal\",\n \"anomalies\": [\"list of detected issues\"],\n \"reasoning\": \"explanation of your analysis\",\n \"recommendation\": \"suggested action\"\n}", "options": { "systemMessage": "You are an IoT monitoring expert. Analyze sensor data and detect anomalies based on the provided thresholds. Be precise and provide actionable recommendations. Always respond in valid JSON format." } }, "id": "b60194ba-7b99-44e0-b0d7-9f1632dce4d4", "name": "AI Anomaly Detector", "type": "@n8n/n8n-nodes-langchain.agent", "typeVersion": 1.7, "position": [ -416, 184 ] }, { "parameters": { "jsCode": "const item = $input.first();\nconst originalData = $('Remove Duplicate Readings').first().json;\n\nlet aiAnalysis;\ntry {\n const responseText = item.json.output || item.json.text || '';\n const jsonMatch = responseText.match(/\\{[\\s\\S]*\\}/);\n aiAnalysis = jsonMatch ? 
JSON.parse(jsonMatch[0]) : {\n hasAnomaly: false,\n severity: 'normal',\n anomalies: [],\n reasoning: 'Unable to parse AI response',\n recommendation: 'Manual review required'\n };\n} catch (e) {\n aiAnalysis = {\n hasAnomaly: false,\n severity: 'normal',\n anomalies: [],\n reasoning: 'Parse error: ' + e.message,\n recommendation: 'Manual review required'\n };\n}\n\nreturn [{\n json: {\n ...originalData,\n analysis: aiAnalysis,\n alertLevel: aiAnalysis.severity,\n requiresAlert: aiAnalysis.hasAnomaly && aiAnalysis.severity !== 'normal'\n }\n}];" }, "id": "a145a8c7-538c-411a-95c6-9485acdcb969", "name": "Parse AI Analysis", "type": "n8n-nodes-base.code", "typeVersion": 2, "position": [ -64, 184 ] }, { "parameters": { "rules": { "values": [ { "conditions": { "options": { "caseSensitive": true, "typeValidation": "strict" }, "combinator": "and", "conditions": [ { "id": "critical", "operator": { "type": "string", "operation": "equals" }, "leftValue": "={{ $json.alertLevel }}", "rightValue": "critical" } ] }, "renameOutput": true, "outputKey": "Critical" }, { "conditions": { "options": { "caseSensitive": true, "typeValidation": "strict" }, "combinator": "and", "conditions": [ { "id": "warning", "operator": { "type": "string", "operation": "equals" }, "leftValue": "={{ $json.alertLevel }}", "rightValue": "warning" } ] }, "renameOutput": true, "outputKey": "Warning" } ] }, "options": { "fallbackOutput": "extra" } }, "id": "1ab9785d-9f7f-4840-b1e9-0afc62b00b12", "name": "Route by Severity", "type": "n8n-nodes-base.switch", "typeVersion": 3.2, "position": [ 160, 168 ] }, { "parameters": { "sendTo": "={{ $('Define Sensor Thresholds').first().json.alertConfig.emailRecipients }}", "subject": "=CRITICAL IoT Alert: {{ $json.sensorId }} - {{ $json.analysis.anomalies[0] || 'Anomaly Detected' }}", "message": "=CRITICAL IoT SENSOR ALERT\n\nSensor: {{ $json.sensorId }}\nLocation: {{ $json.location }}\nTime: {{ $json.timestamp }}\n\nReadings:\n- Temperature: {{ 
$json.readings.temperature }}°C\n- Humidity: {{ $json.readings.humidity }}%\n- CO2: {{ $json.readings.co2 }} ppm\n\nAI Analysis:\n{{ $json.analysis.reasoning }}\n\nDetected Issues:\n{{ $json.analysis.anomalies.join('\\n- ') }}\n\nRecommendation:\n{{ $json.analysis.recommendation }}", "options": {} }, "id": "28201a6c-10b5-4387-be89-10a57c634622", "name": "Send Critical Email", "type": "n8n-nodes-base.gmail", "typeVersion": 2.1, "position": [ 384, -80 ], "webhookId": "35b9f8fa-4a50-456e-b552-9fd20a25ccc5" }, { "parameters": { "select": "channel", "channelId": { "__rl": true, "mode": "name", "value": "#iot-critical" }, "text": "=🚨 CRITICAL IoT ALERT\n\nSensor: {{ $json.sensorId }}\nLocation: {{ $json.location }}\n\nReadings:\n• Temperature: {{ $json.readings.temperature }}°C\n• Humidity: {{ $json.readings.humidity }}%\n• CO2: {{ $json.readings.co2 }} ppm\n\nAI Analysis: {{ $json.analysis.reasoning }}\nRecommendation: {{ $json.analysis.recommendation }}", "otherOptions": {} }, "id": "c5a297be-ccef-40ba-9178-65805262efba", "name": "Slack Critical Alert", "type": "n8n-nodes-base.slack", "typeVersion": 2.2, "position": [ 384, 112 ], "webhookId": "19113595-0208-4b37-b68c-c9788c19f618" }, { "parameters": { "select": "channel", "channelId": { "__rl": true, "mode": "name", "value": "#iot-alerts" }, "text": "=⚠️ IoT Warning\n\nSensor: {{ $json.sensorId }} | Location: {{ $json.location }}\nIssue: {{ $json.analysis.anomalies[0] || 'Threshold approaching' }}\nRecommendation: {{ $json.analysis.recommendation }}", "otherOptions": {} }, "id": "5c3d7acf-0211-44dd-9f4b-a43d3796abb1", "name": "Slack Warning Alert", "type": "n8n-nodes-base.slack", "typeVersion": 2.2, "position": [ 384, 400 ], "webhookId": "37abfb19-f82f-4449-bd69-a65635b99606" }, { "parameters": {}, "id": "6bcbb42f-ec14-4f00-a091-babcc2d2d5c4", "name": "Merge Alert Outputs", "type": "n8n-nodes-base.merge", "typeVersion": 3, "position": [ 608, 184 ] }, { "parameters": { "operation": "append", "documentId": { "__rl": 
true, "mode": "list", "value": "" }, "sheetName": { "__rl": true, "mode": "list", "value": "" } }, "id": "6243aa23-408d-4928-a512-811eeb3b5f9e", "name": "Archive to Google Sheets", "type": "n8n-nodes-base.googleSheets", "typeVersion": 4.5, "position": [ 832, 184 ] }, { "parameters": { "model": "gpt-4o-mini", "options": { "temperature": 0.3 } }, "id": "61081e8a-ebc9-465f-8beb-88af225e59f2", "name": "OpenAI Chat Model", "type": "@n8n/n8n-nodes-langchain.lmChatOpenAi", "typeVersion": 1.2, "position": [ -344, 408 ] } ], "pinData": {}, "connections": { "MQTT Sensor Trigger": { "main": [ [ { "node": "Merge Triggers", "type": "main", "index": 0 } ] ] }, "Batch Process Schedule": { "main": [ [ { "node": "Merge Triggers", "type": "main", "index": 1 } ] ] }, "Merge Triggers": { "main": [ [ { "node": "Define Sensor Thresholds", "type": "main", "index": 0 } ] ] }, "Define Sensor Thresholds": { "main": [ [ { "node": "Parse Sensor Payload", "type": "main", "index": 0 } ] ] }, "Parse Sensor Payload": { "main": [ [ { "node": "Generate Data Fingerprint", "type": "main", "index": 0 } ] ] }, "Generate Data Fingerprint": { "main": [ [ { "node": "Remove Duplicate Readings", "type": "main", "index": 0 } ] ] }, "Remove Duplicate Readings": { "main": [ [ { "node": "AI Anomaly Detector", "type": "main", "index": 0 } ] ] }, "AI Anomaly Detector": { "main": [ [ { "node": "Parse AI Analysis", "type": "main", "index": 0 } ] ] }, "Parse AI Analysis": { "main": [ [ { "node": "Route by Severity", "type": "main", "index": 0 } ] ] }, "Route by Severity": { "main": [ [ { "node": "Send Critical Email", "type": "main", "index": 0 }, { "node": "Slack Critical Alert", "type": "main", "index": 0 } ], [ { "node": "Slack Warning Alert", "type": "main", "index": 0 } ], [ { "node": "Merge Alert Outputs", "type": "main", "index": 0 } ] ] }, "Send Critical Email": { "main": [ [ { "node": "Merge Alert Outputs", "type": "main", "index": 0 } ] ] }, "Slack Critical Alert": { "main": [ [ { "node": "Merge Alert 
Outputs", "type": "main", "index": 0 } ] ] }, "Slack Warning Alert": { "main": [ [ { "node": "Merge Alert Outputs", "type": "main", "index": 0 } ] ] }, "Merge Alert Outputs": { "main": [ [ { "node": "Archive to Google Sheets", "type": "main", "index": 0 } ] ] }, "OpenAI Chat Model": { "ai_languageModel": [ [ { "node": "AI Anomaly Detector", "type": "ai_languageModel", "index": 0 } ] ] } }, "active": false, "settings": { "executionOrder": "v1" }, "versionId": "", "meta": { "instanceId": "15d6057a37b8367f33882dd60593ee5f6cc0c59310ff1dc66b626d726083b48d" }, "tags": [] }
by Simeon Penev
Who’s it for

Marketing, growth, and analytics teams who want a decision-ready GA4 summary: automatically calculated, clearly color-coded, and emailed as a polished HTML report.

How it works / What it does

- **Get Client (Form Trigger)** collects the **GA4 Property ID (“Account ID”)**, **Key Event**, date ranges (current & previous), Client Name, and recipient email.
- **Overall Metrics This Period / Previous Period (GA4 Data API)** pull sessions, users, engagement, bounce rate, and more for each range.
- **Form Submits This Period / Previous Period (GA4 Data API)** fetch key-event counts for conversion comparisons.
- **Code** normalizes the form dates for the API requests.
- **AI Agent** builds a **valid HTML email**: it calculates the percentage deltas, applies green (#10B981) for positive and red (#EF4444) for negative changes, writes a summary and recommendations, and produces the final HTML only.
- **Send a message (Gmail)** sends the formatted HTML report to the specified email address with a contextual subject.

How to set up

1. Add credentials: Google Analytics OAuth2, OpenAI (Chat), Gmail OAuth2.
2. Ensure the form fields match your GA4 property and event names; “Account ID” = GA4 Property ID.
   - Property ID: https://take.ms/vO2MG
   - Key event: https://take.ms/hxwQi
3. Publish the form URL and run a test submission.

Requirements

- GA4 property access (Viewer/Analyst)
- OpenAI API key
- Gmail account with send permission

Resources

- Google OAuth2 (GA4): https://docs.n8n.io/integrations/builtin/credentials/google/oauth-generic/
- OpenAI credentials: https://docs.n8n.io/integrations/builtin/credentials/openai/
- Gmail OAuth2: https://docs.n8n.io/integrations/builtin/credentials/google/
- GA4 Data API overview: https://developers.google.com/analytics/devguides/reporting/data/v1
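The delta-and-color logic the AI Agent is prompted to apply can be sketched deterministically. The hex colors come from the workflow description; the function name, rounding, and zero-previous handling are assumptions for illustration:

```javascript
// Illustrative period-over-period delta with the report's color coding:
// green #10B981 for positive/zero change, red #EF4444 for negative.
function metricDelta(current, previous) {
  const pct = previous === 0 ? 0 : ((current - previous) / previous) * 100;
  const rounded = Math.round(pct * 10) / 10; // one decimal place
  return {
    deltaPct: rounded,
    color: rounded >= 0 ? '#10B981' : '#EF4444',
    label: `${rounded >= 0 ? '+' : ''}${rounded}%`,
  };
}
```

In the template this arithmetic is delegated to the LLM inside the HTML-building prompt; computing it in a Code node first, as above, would make the numbers in the email fully reproducible.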