by Rully Saputra
Automated SEO Watchlist: Continuous Audits Powered by Decodo, Gemini and Google Sheets

Automate continuous SEO audits with Decodo and Gemini AI — live data, smart insights, and Google Sheets tracking with team alerts.

**Who's it for**
This workflow is designed for SEO specialists, marketing teams, agencies, and website owners who want an effortless, automated way to monitor SEO health. It's perfect for ongoing audits, content monitoring, and proactive SEO management — without the manual workload.

**How it works / What it does**
Every five days, the workflow:
- Reads a list of URLs from Google Sheets.
- Uses Decodo to fetch live on-page data — titles, meta descriptions, headings, schema, links, and Core Web Vitals.
- Passes that data to Gemini AI for an advanced SEO analysis and scoring based on key factors (content, metadata, links, speed, and structure).
- Parses results via a Structured Output Parser for clean JSON output (see the example after this section).
- Stores findings in Google Sheets and sends a Telegram alert when the audit completes.

**Why Decodo matters**
Decodo is the backbone of this workflow. It powers the real-time page inspection, ensuring Gemini AI has complete, accurate data to analyze. Decodo transforms static audits into live, intelligent monitoring — making your SEO insights far more actionable and reliable.

**How to set up**
1. Connect your Decodo API credentials.
2. Add your Google Sheets URL list.
3. Configure your Telegram bot credentials.
4. Enable the workflow — it runs automatically every 5 days.

**Requirements**
- Decodo API credentials
- Google Sheets OAuth connection
- Telegram Bot token
- n8n instance (Cloud or Self-hosted)

**How to customize the workflow**
- Change the trigger interval in the Schedule Trigger node.
- Modify the SEO Analyzer (LLM Chain) weights for different scoring.
- Extend the Store Result node to integrate with dashboards or databases.
- Adjust the AI prompt for additional SEO checks (e.g., backlinks, readability, image optimization).

✅ **Highlights**
- Automated SEO auditing
- Real-time data from Decodo
- Smart analysis powered by Gemini AI
- Structured reporting in Google Sheets
- Team notifications via Telegram
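The template relies on a Structured Output Parser to keep Gemini's response machine-readable. The exact schema is not reproduced here; the shape below is an illustrative assumption of what the parsed audit result could look like, with field names you would align to your own Google Sheets columns and scoring weights.

```javascript
// Illustrative only: one possible JSON shape the Structured Output Parser
// could enforce on Gemini's response. All field names are assumptions.
const exampleAuditResult = {
  url: "https://example.com/pricing",
  overallScore: 78,            // weighted total across the factors below
  scores: {
    content: 82,               // weights are set in the LLM Chain prompt
    metadata: 70,
    links: 75,
    speed: 68,
    structure: 85,
  },
  issues: [
    "Meta description exceeds 160 characters",
    "Missing H2 headings on long sections",
  ],
  recommendations: [
    "Shorten the meta description",
    "Add descriptive H2s to break up content",
  ],
};

// Each top-level key maps to one column written by the Store Result step.
console.log(JSON.stringify(exampleAuditResult, null, 2));
```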
by vinci-king-01
Social Media Sentiment Analysis Dashboard with AI and Real-time Monitoring

🎯 Target Audience
- Social media managers and community managers
- Marketing teams monitoring brand reputation
- PR professionals tracking public sentiment
- Customer service teams identifying trending issues
- Business analysts measuring social media ROI
- Brand managers protecting brand reputation
- Product managers gathering user feedback

🚀 Problem Statement
Manual social media monitoring is overwhelming and often misses critical sentiment shifts or trending topics. This template solves the challenge of automatically collecting, analyzing, and visualizing social media sentiment data across multiple platforms to provide actionable insights for brand management and customer engagement.

🔧 How it Works
This workflow automatically monitors social media platforms using AI-powered sentiment analysis, processes mentions and conversations, and provides real-time insights through a comprehensive dashboard.

Key Components
- Scheduled Trigger – Runs the workflow at specified intervals to maintain real-time monitoring
- AI-Powered Sentiment Analysis – Uses advanced NLP to analyze sentiment, emotions, and topics
- Multi-Platform Integration – Monitors Twitter, Reddit, and other social platforms
- Real-time Alerting – Sends notifications for critical sentiment changes or viral content
- Dashboard Integration – Stores all data in Google Sheets for comprehensive analysis and reporting

📊 Google Sheets Column Specifications
The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| timestamp | DateTime | When the mention was recorded | "2024-01-15T10:30:00Z" |
| platform | String | Social media platform | "Twitter" |
| username | String | User who posted the content | "@john_doe" |
| content | String | Full text of the post/comment | "Love the new product features!" |
| sentiment_score | Number | Sentiment score (-1 to 1) | 0.85 |
| sentiment_label | String | Sentiment classification | "Positive" |
| emotion | String | Primary emotion detected | "Joy" |
| topics | Array | Key topics identified | ["product", "features"] |
| engagement | Number | Likes, shares, comments | 1250 |
| reach_estimate | Number | Estimated reach | 50000 |
| influence_score | Number | User influence metric | 0.75 |
| alert_priority | String | Alert priority level | "High" |

🛠️ Setup Instructions
Estimated setup time: 20-25 minutes

Prerequisites
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Google Sheets account with API access
- Slack workspace for notifications (optional)
- Social media API access (Twitter, Reddit, etc.)

Step-by-Step Configuration

1. Install Community Nodes
Install the required community nodes: `npm install n8n-nodes-scrapegraphai` and `npm install n8n-nodes-slack`.

2. Configure ScrapeGraphAI Credentials
- Navigate to Credentials in your n8n instance
- Add new ScrapeGraphAI API credentials
- Enter your API key from the ScrapeGraphAI dashboard
- Test the connection to ensure it's working

3. Set up Google Sheets Connection
- Add Google Sheets OAuth2 credentials
- Grant necessary permissions for spreadsheet access
- Create a new spreadsheet for sentiment analysis data
- Configure the sheet name (default: "Sentiment Analysis")

4. Configure Social Media Monitoring
- Update the websiteUrl parameters in the ScrapeGraphAI nodes
- Add URLs for the social media platforms you want to monitor
- Customize the user prompt to extract specific sentiment data
- Set up keywords, hashtags, and brand mentions to track

5. Set up Notification Channels
- Configure Slack webhook or API credentials
- Set up email service credentials for alerts
- Define sentiment thresholds for different alert levels
- Test notification delivery

6. Configure Schedule Trigger
- Set monitoring frequency (every 15 minutes, hourly, etc.)
- Choose appropriate time zones for your business hours
- Consider social media platform rate limits

7. Test and Validate
- Run the workflow manually to verify all connections
- Check Google Sheets for proper data formatting
- Test sentiment analysis with sample content

🔄 Workflow Customization Options

Modify Monitoring Targets
- Add or remove social media platforms
- Change keywords, hashtags, or brand mentions
- Adjust monitoring frequency based on platform activity

Extend Sentiment Analysis
- Add more sophisticated emotion detection
- Implement topic clustering and trend analysis
- Include influencer identification and scoring

Customize Alert System
- Set different thresholds for different sentiment levels
- Create tiered alert systems (info, warning, critical)
- Add sentiment trend analysis and predictions

Output Customization
- Add data visualization and reporting features
- Implement sentiment trend charts and graphs
- Create executive dashboards with key metrics
- Add competitor sentiment comparison

📈 Use Cases
- **Brand Reputation Management**: Monitor and respond to brand mentions
- **Crisis Management**: Detect and respond to negative sentiment quickly
- **Customer Feedback Analysis**: Understand customer satisfaction and pain points
- **Product Launch Monitoring**: Track sentiment around new product releases
- **Competitor Analysis**: Monitor competitor sentiment and engagement
- **Influencer Identification**: Find and engage with influential users

🚨 Important Notes
- Respect social media platforms' terms of service and rate limits
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update your monitoring keywords and parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Consider privacy implications and data protection regulations

🔧 Troubleshooting
Common Issues:
- ScrapeGraphAI connection errors: Verify API key and account status
- Google Sheets permission errors: Check OAuth2 scope and permissions
- Sentiment analysis errors: Review the Code node's JavaScript logic (a minimal sketch follows this section)
- Rate limiting: Adjust monitoring frequency and implement delays
- Alert delivery failures: Check notification service credentials

Support Resources:
- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Social media platform API documentation
- Sentiment analysis best practices and guidelines
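The troubleshooting notes above point at the Code node's JavaScript logic. Here is a minimal sketch of one way the alert-priority classification could look, assuming the sheet columns listed earlier; the thresholds themselves are illustrative, not the template's defaults.

```javascript
// Minimal sketch of an alert-priority classifier for an n8n Code node.
// Thresholds and field names are illustrative assumptions.
const items = $input.all();

return items.map((item) => {
  const { sentiment_score, engagement } = item.json;

  let alert_priority = "Low";
  if (sentiment_score <= -0.6 && engagement > 1000) {
    alert_priority = "Critical";   // strongly negative and spreading fast
  } else if (sentiment_score <= -0.3) {
    alert_priority = "High";       // clearly negative mention
  } else if (sentiment_score < 0) {
    alert_priority = "Medium";     // mildly negative
  }

  return { json: { ...item.json, alert_priority } };
});
```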
by Avkash Kakdiya
**How it works**
This workflow runs on a daily schedule to analyze all active HubSpot deals and their latest engagement activity. It applies AI-driven behavioral scoring to predict conversion probability and deal health. High-risk or stalled deals automatically trigger Slack alerts. All insights are logged in Google Sheets for forecasting and performance tracking.

**Step-by-step**

**Step 1 – Trigger and collect active deals**
- Schedule Trigger – Runs the workflow automatically at a fixed time each day.
- Get Active Deals from HubSpot – Retrieves all non-closed deals with key properties like value, stage, and activity dates.
- Formatting Data – Cleans and normalizes deal data while calculating metrics such as deal age and inactivity duration (see the sketch after this section).

**Step 2 – Enrich deals with engagement data**
- If – Filters only active deals to ensure closed deals are excluded.
- Loop Over Items – Processes each deal individually to handle enrichment safely.
- HTTP Request – Fetches engagement associations linked to each deal.
- Get an engagement – Retrieves detailed engagement records from HubSpot.
- Extracts Data – Structures engagement content, timestamps, and internal notes for AI analysis.

**Step 3 – Analyze risk and notify the team**
- AI Agent – Analyzes behavioral signals and predicts conversion probability, risk level, and next actions.
- Format Data – Parses the AI output into structured fields and risk indicators.
- Filter Alerts Needed – Identifies deals that require immediate attention.
- Send Slack Alert – Sends a detailed alert with risks, signals, and recommended actions.
- Append or update row in sheet – Stores analysis results in Google Sheets for tracking and forecasting.

**Why use this?**
- Detect deal risk early using consistent, AI-based analysis
- Reduce manual pipeline reviews for sales managers
- Provide clear, actionable next steps to sales reps
- Keep a historical log of deal health and forecasts
- Improve close rates through timely, data-driven intervention
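A minimal sketch of the deal-age and inactivity math described in the Formatting Data step, written as an n8n Code node. The HubSpot property names used here (createdate, notes_last_updated) are assumptions; map them to whatever properties your Get Active Deals node actually returns.

```javascript
// Sketch of the deal-age / inactivity calculation for an n8n Code node.
// HubSpot property names (createdate, notes_last_updated) are assumptions;
// adjust to the properties returned by your "Get Active Deals" node.
const MS_PER_DAY = 1000 * 60 * 60 * 24;
const now = Date.now();

return $input.all().map((item) => {
  const deal = item.json.properties ?? item.json;

  const created = new Date(deal.createdate).getTime();
  const lastActivity = new Date(deal.notes_last_updated || deal.createdate).getTime();

  return {
    json: {
      ...item.json,
      dealAgeDays: Math.floor((now - created) / MS_PER_DAY),
      daysInactive: Math.floor((now - lastActivity) / MS_PER_DAY),
    },
  };
});
```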
by Rahul Joshi
📊 Description
Monitor daily brand visibility and reputation with an automated AI-powered mention tracker. 🔍🤖

This workflow checks Hacker News every morning for new stories matching your brand keyword, classifies each mention's sentiment and urgency using GPT-4o-mini, and delivers a clean daily summary to Slack. If no mentions are found, the workflow sends a simple "no mentions today" update instead—ensuring your team is always informed without manual monitoring. Perfect for reputation tracking, competitive intelligence, and early warning alerts. 📈💬

🔁 What This Template Does
1️⃣ Triggers every morning at 09:00 to begin the analysis. ⏰
2️⃣ Loads brand name + keyword filters from configuration. 🏷️
3️⃣ Fetches relevant mentions from Hacker News using the Algolia API (see the request sketch after this section). 🌐
4️⃣ Normalizes raw API data into clean fields (title, URL, snippet, author, points). 📄
5️⃣ Classifies each mention's sentiment, stance, topic, and urgency using GPT-4o-mini. 🤖
6️⃣ Builds a ranked daily summary including top 10 mentions and sentiment totals. 📊
7️⃣ Sends the report to Slack, formatted cleanly and ready for team consumption. 💬
8️⃣ If no mentions exist, sends a simple "no new mentions today" message. 🚫
9️⃣ Includes an error handler that notifies Slack of any workflow failures. ⚠️

⭐ Key Benefits
✅ Automatically tracks brand presence without manual searching
✅ AI-powered sentiment & urgency analysis for deeper insights
✅ Clean Slack summaries keep teams aligned and aware
✅ Early detection of negative or high-urgency mentions
✅ Zero manual monitoring — runs fully on schedule
✅ Suitable for brand monitoring, PR, marketing, and leadership teams

🧩 Features
- Daily schedule trigger
- Hacker News API (Algolia) integration
- Structured data normalization
- GPT-4o-mini classification (sentiment, stance, topic, urgency)
- Slack notifications (detailed report or no-mention message)
- Error-handling pipeline with Slack alerts
- Fully configurable brand keywords

🔐 Requirements
- Slack API credentials
- OpenAI API key (GPT-4o-mini)
- No authentication required for the Hacker News API
- n8n with LangChain nodes enabled

🎯 Target Audience
- Brand monitoring & PR teams
- AI companies tracking public sentiment
- Founders monitoring mentions of their product
- Marketing teams watching trends & community feedback
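The Hacker News search API (Algolia) is public and needs no authentication, so the fetch step boils down to a single HTTP request. Below is a sketch of that call and the normalization into the fields listed in step 4; the brand keyword and the 24-hour window are the configurable parts.

```javascript
// Sketch of the Hacker News (Algolia) search call used to fetch mentions.
// The brand keyword and the time window are the parts you configure.
const brand = "your-brand";                                   // from your keyword config
const since = Math.floor(Date.now() / 1000) - 24 * 60 * 60;   // last 24 hours

const url =
  "https://hn.algolia.com/api/v1/search_by_date" +
  `?query=${encodeURIComponent(brand)}` +
  "&tags=story" +
  `&numericFilters=created_at_i>${since}`;

const res = await fetch(url);
const data = await res.json();

// Normalize into the fields the classification and summary steps expect.
const mentions = data.hits.map((hit) => ({
  title: hit.title,
  url: hit.url,
  author: hit.author,
  points: hit.points,
  snippet: (hit.story_text || "").slice(0, 200),
}));

console.log(mentions.length, "mentions found");
```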
by Rahul Joshi
📊 Description
Simplify your social media publishing process by automating post scheduling from Google Sheets directly to Meta (Facebook Pages). 📅💬 This workflow detects pending posts, uploads images with captions to your Facebook Page, updates the sheet status, and sends real-time notifications via Slack and email — keeping your marketing team always in sync.

🚀 What This Template Does
1️⃣ Trigger – Monitors a Google Sheet for new or pending posts every minute. ⏰
2️⃣ Filter – Identifies the latest "pending" entry for publishing. 🔍
3️⃣ Extract – Captures post details like caption, image URL, and ID. 🧾
4️⃣ Publish – Uploads the post to your Meta (Facebook) Page using the Graph API (see the sketch after this section). 📤
5️⃣ Validate – Confirms success or failure of the post operation. ✅
6️⃣ Notify – Sends instant Slack and email updates on publishing status. 💌
7️⃣ Update – Marks the published post as "Completed" in Google Sheets. 📊

Key Benefits
✅ Hands-free publishing from Google Sheets to Meta
✅ Instant Slack and email alerts for post outcomes
✅ Prevents duplicate or failed post uploads
✅ Centralized content tracking and status updates
✅ Improves consistency and speed in social media operations

Features
- Google Sheets trigger for post scheduling
- Facebook Graph API integration for auto-posting
- Slack and Outlook notifications for success/error alerts
- Automatic sheet updates post-publication
- Error handling and reporting for failed posts

Requirements
- Google Sheets OAuth2 credentials
- Facebook Page Access Token via Graph API
- Slack Bot token for notifications
- Outlook or SMTP credentials for email updates

Target Audience
- Marketing teams managing Facebook content calendars 📆
- Social media managers seeking automated posting 📣
- Agencies coordinating client content delivery 📋
- Teams tracking campaign publishing performance 📊
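The Publish step is a Graph API call to the Page's /photos edge. A hedged sketch of that request is below; the page ID, access token, and Graph API version are placeholders you substitute from your own setup, and the image URL and caption come from the sheet row.

```javascript
// Sketch of the Graph API call behind the "Publish" step.
// PAGE_ID, the access token and the Graph API version are placeholders.
const PAGE_ID = "YOUR_PAGE_ID";
const ACCESS_TOKEN = "YOUR_PAGE_ACCESS_TOKEN";

const params = new URLSearchParams({
  url: "https://example.com/creative.jpg",    // image URL from the sheet row
  caption: "Caption text from the sheet row",
  access_token: ACCESS_TOKEN,
});

const res = await fetch(
  `https://graph.facebook.com/v21.0/${PAGE_ID}/photos`,
  { method: "POST", body: params }
);
const result = await res.json();

// A successful response contains the new photo/post id, which the workflow
// can use before marking the sheet row as "Completed".
console.log(result.id ? "published" : "failed", result);
```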
by Fahmi Fahreza
Create Airtable records from new ClickUp Doc pages

This workflow automates the process of turning content from ClickUp Docs into structured data in Airtable. When a new task is created in ClickUp with a link to a ClickUp Doc in its name, this workflow triggers, fetches the entire content of that Doc, parses it into individual records, and then creates a new record for each item in a specified Airtable base and table.

**Who's it for**
This template is perfect for content creators, project managers, and operations teams who use ClickUp Docs for drafting or knowledge management and Airtable for tracking and organizing data. It helps bridge the gap between unstructured text and a structured database.

**How it works**
1. Trigger: The workflow starts when a new task is created in a specific ClickUp Team.
2. Fetch & Parse URL: It gets the new task's details and extracts the ClickUp Doc URL from the task name.
3. Get Doc Content: It uses the URL to fetch the main Doc and all its sub-pages from the ClickUp API.
4. Process Content: A Code node parses the text from each page. It's designed to split content by `* * *` and separate notes by looking for the "notes:" keyword (see the sketch after this section).
5. Find Airtable Destination: The workflow finds the correct Airtable Base and Table IDs by matching the names you provide.
6. Create Records: It loops through each parsed content piece and creates a new record in your specified Airtable table.

**How to set up**
1. Configure the Set Node: Open the "Configure Variables" node and set the following values:
   - clickupTeamId: Your ClickUp Team ID. Find it in your ClickUp URL (e.g., app.clickup.com/9014329600/...).
   - airtableBaseName: The exact name of your target Airtable Base.
   - airtableTableName: The exact name of your target Airtable Table.
   - airtableVerticalsTableName: The name of the table in your base that holds "Vertical" records, which are linked in the main table.
2. Set Up Credentials: Add your ClickUp (OAuth2) and Airtable (Personal Access Token) credentials to the respective nodes.
3. Airtable Fields: Ensure your Airtable table has fields corresponding to the ones in the Create New Record in Airtable node (e.g., Text, Status, Vertical, Notes). You can customize the mapping in this node.
4. Activate Workflow: Save and activate the workflow.
5. Test: Create a new task in your designated ClickUp team. In the task name, include the full URL of the ClickUp Doc you want to process.

**How to customize the workflow**
- **Parsing Logic:** You can change how the content is parsed by modifying the JavaScript in the Parse Content from Doc Pages Code node. For example, you could change the delimiter from `* * *` to something else.
- **Field Mapping:** Adjust the Create New Record in Airtable node to map data to different fields or add more fields from the source data.
- **Trigger Events:** Modify the Trigger on New ClickUp Task node to respond to different events, such as taskUpdated or taskCommentPosted.
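Here is a minimal sketch of the parsing approach described in step 4: split each page's text on the `* * *` delimiter, then peel off an optional "notes:" section. Treat it as an approximation of the described behaviour, not the exact code shipped in the Code node.

```javascript
// Sketch of the "Parse Content from Doc Pages" logic: split each page's
// text on the "* * *" delimiter, then separate an optional "notes:" part.
// This approximates the described behaviour; adjust to your Doc layout.
const records = [];

for (const item of $input.all()) {
  const pageText = item.json.content || "";

  for (const chunk of pageText.split(/\n?\* \* \*\n?/)) {
    const trimmed = chunk.trim();
    if (!trimmed) continue;

    // Everything after a "notes:" keyword becomes the Notes field.
    const [body, ...noteParts] = trimmed.split(/notes:/i);
    records.push({
      json: {
        Text: body.trim(),
        Notes: noteParts.join("notes:").trim() || null,
      },
    });
  }
}

return records;
```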
by Oneclick AI Squad
This automated n8n workflow monitors real-time cryptocurrency prices using the CoinGecko API and sends smart alerts when price conditions are met. It supports multi-coin tracking, dynamic conditions, and instant notifications via Email, Telegram, and Discord.

**Good to Know**
- Reads crypto watchlist data from Google Sheets.
- Monitors prices at defined intervals (24/7 monitoring).
- Handles upper and lower price limits with direction-based alerts (above, below, both).
- Implements cooldown logic to avoid duplicate alerts.
- Updates last alert price and timestamp in Google Sheets.
- Supports multiple alert channels: Email, Telegram, Discord.
- Uses the CoinGecko API for price data (free tier supported).

**How It Works**
1. 24/7 Crypto Trigger – Runs every minute (or a custom interval) to check the latest prices.
2. Read Crypto Watchlist – Fetches symbols and conditions from Google Sheets.
3. Parse Crypto Data – Converts Google Sheet data into structured JSON.
4. Fetch Live Crypto Price – Uses the CoinGecko API to get the latest market price for each coin.
5. Smart Crypto Alert Logic – Compares the live price with the upper/lower limits and evaluates conditions (see the sketch after this section):
   - Above – Trigger an alert if price > upper_limit.
   - Below – Trigger an alert if price < lower_limit.
   - Both – Trigger an alert if either condition is met.
   - Implements cooldown_minutes to prevent repeated alerts.
6. Check Crypto Alert Conditions – Validates alerts before sending notifications.
7. Send Crypto Email Alert – Sends an email alert if the condition is true.
8. Send Telegram Crypto Alert – Sends a Telegram alert.
9. Send Discord Crypto Alert – Sends a Discord alert.
10. Update Crypto Alert History – Updates last_alert_price and last_alert_time in the Google Sheet.
11. Crypto Alert Status Check – Ensures the alert process completed successfully.
12. Success Notification – Sends a confirmation message on success.
13. Error Notification – Sends an error alert if something fails.

**Google Sheet Columns (A-G)**

| Column | Description |
| ------ | ---------------------------------- |
| A | symbol (BTC, ETH, SOL, etc.) |
| B | upper_limit (e.g., 45000) |
| C | lower_limit (e.g., 40000) |
| D | direction (both / above / below) |
| E | cooldown_minutes (e.g., 10) |
| F | last_alert_price (auto-updated) |
| G | last_alert_time (auto-updated) |

**How to Use**
1. Import the workflow into n8n.
2. Configure Google Sheets credentials and link your watchlist sheet.
3. Add your CoinGecko API endpoint in the Fetch Price node (free tier).
4. Set up Email, Telegram, and Discord credentials for notifications.
5. Test with sample data, for example: BTC, upper_limit=45000, lower_limit=40000, direction=both.
6. Execute the workflow and monitor alerts.

**Requirements**
- n8n environment with execution permissions.
- Google Sheets integration (with API credentials).
- CoinGecko API (free tier supported).
- Notification channels:
  - Email (SMTP settings in n8n).
  - Telegram Bot Token.
  - Discord Webhook URL.

**Customizing This Workflow**
- Add more coins in the Google Sheet.
- Modify alert conditions (e.g., percentage change, moving averages).
- Add SMS or WhatsApp notifications.
- Integrate with Slack or Microsoft Teams.
- Use AI-based price predictions for smarter alerts.
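A minimal sketch of the Smart Crypto Alert Logic step as an n8n Code node, using the sheet columns described above plus a live_price field assumed to come from the CoinGecko fetch step.

```javascript
// Sketch of the "Smart Crypto Alert Logic" step: compare the live price to
// the sheet's limits, respect the direction setting, and enforce a cooldown.
// Field names match the sheet columns above; live_price is assumed to be
// attached by the CoinGecko fetch step.
return $input.all().map((item) => {
  const row = item.json;
  const price = Number(row.live_price);
  const upper = Number(row.upper_limit);
  const lower = Number(row.lower_limit);

  const crossedUp = price > upper;
  const crossedDown = price < lower;

  let triggered = false;
  if (row.direction === "above") triggered = crossedUp;
  else if (row.direction === "below") triggered = crossedDown;
  else triggered = crossedUp || crossedDown;   // "both"

  // Cooldown: skip if the last alert fired too recently.
  const minutesSinceLast = row.last_alert_time
    ? (Date.now() - new Date(row.last_alert_time).getTime()) / 60000
    : Infinity;
  const sendAlert = triggered && minutesSinceLast >= Number(row.cooldown_minutes);

  const alert_reason = crossedUp
    ? "price above upper_limit"
    : crossedDown
      ? "price below lower_limit"
      : null;

  return { json: { ...row, sendAlert, alert_reason } };
});
```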
by n8n Automation Expert | Template Creator | 2+ Years Experience
🎯 Smart Job Hunter Pro - AI-Powered Multi-Platform Job Automation

Transform your job search with this comprehensive n8n workflow that automatically searches, analyzes, and applies to relevant positions across multiple job platforms. Perfect for developers, engineers, and tech professionals looking to streamline their job hunting process.

✨ Key Features
- 🔄 **Multi-Platform Job Search**: Simultaneously searches the Jooble, JobStreet, Indeed, and WhatJobs APIs
- 🤖 **AI-Powered Job Analysis**: Uses Google Gemini AI to analyze job compatibility and generate tailored cover letters
- 📊 **Smart Scoring System**: Automatically scores job matches based on your skills and requirements
- 📝 **Auto-Apply Threshold**: Only applies to jobs above your specified compatibility score
- 📋 **Notion Integration**: Automatically tracks applications in an organized Notion database
- 💬 **Telegram Notifications**: Real-time alerts for high-match job opportunities
- ☁️ **Google Drive Storage**: Saves personalized cover letters for each application
- ⚠️ **Error Handling**: Comprehensive error tracking with Telegram notifications
- ⏰ **Automated Scheduling**: Runs every 8 hours to find fresh opportunities

🛠 What This Workflow Does
1. Scheduled Search: Automatically searches multiple job platforms every 8 hours
2. Data Normalization: Standardizes job data from different API sources
3. AI Analysis: Gemini AI evaluates each job posting against your skills profile
4. Smart Filtering: Only processes jobs above your compatibility threshold (default: 75%; a sketch appears at the end of this description)
5. Application Tracking: Creates detailed records in Notion with match scores and status
6. Instant Alerts: Sends Telegram notifications for promising opportunities
7. Cover Letter Generation: AI creates personalized cover letters for each position
8. Document Management: Automatically saves all cover letters to Google Drive

🔧 Required Integrations
- **Job APIs**: Jooble API, WhatJobs API (JobStreet & Indeed use web scraping)
- **AI Service**: Google Gemini API for job analysis
- **Productivity**: Notion database for application tracking
- **Communication**: Telegram bot for notifications
- **Storage**: Google Drive for cover letter management

💡 Perfect For
- **Software Developers** seeking JavaScript, React, Node.js positions
- **Full-Stack Engineers** wanting automated job discovery
- **Tech Professionals** needing organized application tracking
- **Remote Workers** searching across multiple platforms
- **Career Changers** looking for systematic job hunting

🎛 Customizable Variables
- **Job Keywords**: Define your target roles and skills
- **Location & Radius**: Set geographic search parameters
- **Auto-Apply Threshold**: Control compatibility score requirements
- **Results Limit**: Adjust the number of jobs per platform
- **Schedule Frequency**: Modify search intervals

📈 Benefits
- **Save 10+ hours weekly** on manual job searching
- **Never miss opportunities** with automated monitoring
- **Professional application tracking** with detailed analytics
- **Personalized cover letters** for every application
- **Instant notifications** for high-match positions
- **Complete audit trail** of all job search activities

🚀 Getting Started
1. Import the workflow to your n8n instance
2. Configure API credentials for all job platforms
3. Set up a Notion database with the provided template structure
4. Create a Telegram bot and Google Drive folder
5. Customize job search parameters for your profile
6. Activate the workflow and start receiving opportunities!
📝 Additional Notes
- Uses placeholder credentials for security ({{PLACEHOLDER_API_KEY}})
- Comprehensive error handling prevents workflow failures
- Includes detailed setup instructions via sticky notes
- Optimized for the Indonesian job market (JobStreet.co.id)
- Easily adaptable for other regions and job types

Perfect for developers, engineers, and automation enthusiasts who want to leverage AI and n8n's power to dominate their job search process! 🚀
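A rough sketch of the normalization and auto-apply threshold filter described in "What This Workflow Does". The source-specific field names are assumptions: each job API returns different keys, so map them to what your instance actually receives.

```javascript
// Sketch of the normalization + threshold filter. Field names such as
// job.position or job.company_name are assumptions about per-source APIs;
// matchScore is assumed to be filled in by the Gemini analysis step.
const AUTO_APPLY_THRESHOLD = 75; // compatibility score (%)

return $input.all()
  .map((item) => {
    const job = item.json;
    return {
      json: {
        title: job.title || job.position,
        company: job.company || job.company_name,
        location: job.location || job.city,
        url: job.link || job.url,
        source: job.source,                      // e.g. "jooble", "whatjobs"
        matchScore: Number(job.matchScore) || 0,
      },
    };
  })
  .filter((item) => item.json.matchScore >= AUTO_APPLY_THRESHOLD);
```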
by Rahul Joshi
Description:
Automate your developer onboarding quality checks with this n8n workflow template. Whenever a new onboarding task is created in ClickUp, the workflow logs it to Google Sheets, evaluates its completeness using Azure OpenAI GPT-4o-mini, and alerts your team in Slack if critical details are missing.

Perfect for engineering managers, DevOps leads, and HR tech teams who want to maintain consistent onboarding quality and ensure every developer gets the tools, credentials, and environment setup they need — without manual review.

✅ What This Template Does (Step-by-Step)

⚡ Step 1: Auto-Trigger on ClickUp Task Creation
Listens for new task creation events (taskCreated) in your ClickUp workspace to initiate the audit automatically.

📊 Step 2: Log Task Details to Google Sheets
Records essential task data — task name, assignee, and description — creating a central audit trail for all onboarding activities.

🧠 Step 3: AI Completeness Analysis (GPT-4o-mini)
Uses Azure OpenAI GPT-4o-mini to evaluate each onboarding task for completeness across key areas:
- Tooling requirements
- Credential setup
- Environment configuration
- Instruction clarity

Outputs:
- ✅ Score (0–100)
- ⚠️ List of Missing Items
- 💡 Suggestions for Improvement

🚦 Step 4: Apply Quality Gate
Checks whether the AI-generated completeness score is below 80. Incomplete tasks automatically move to the alert stage for review (see the sketch after this section).

📢 Step 5: Alert Team via Slack
Sends a structured Slack message summarizing the issue, including:
- Task name & assignee
- Completeness score
- Missing checklist items
- Recommended next actions

This ensures your team fixes incomplete onboarding items before they impact new hires.

🧠 Key Features
- 🤖 AI-driven task completeness scoring
- 📊 Automatic task logging for audit visibility
- ⚙️ Smart quality gate (score threshold < 80)
- 📢 Instant Slack alerts for incomplete tasks
- 🔄 End-to-end automation from ClickUp to Slack

💼 Use Cases
- 🎓 Audit onboarding checklists for new developers
- 🧩 Standardize environment setup and credential handover
- 🚨 Identify missing steps before onboarding deadlines
- 📈 Maintain onboarding consistency across teams

📦 Required Integrations
- ClickUp API – to detect new onboarding tasks
- Google Sheets API – to store audit logs and history
- Azure OpenAI (GPT-4o-mini) – to evaluate completeness
- Slack API – to alert the team on incomplete entries

🎯 Why Use This Template?
✅ Ensures every new developer receives a full, ready-to-start setup
✅ Eliminates manual checklist verification
✅ Improves onboarding quality and compliance tracking
✅ Creates a transparent audit trail for continuous improvement
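The quality gate in Step 4 only needs the AI to return a small, predictable JSON object. Below is an illustrative sketch of that contract and the threshold check; the field names are assumptions you would align with your own prompt.

```javascript
// Illustrative shape of the GPT-4o-mini audit output and the quality gate.
// Field names are assumptions; align them with your prompt's JSON contract.
const aiResult = {
  score: 65,
  missingItems: ["VPN credentials", "Local environment setup guide"],
  suggestions: ["Link the infra runbook", "Assign a credential owner"],
};

const QUALITY_THRESHOLD = 80;

if (aiResult.score < QUALITY_THRESHOLD) {
  // This branch is the one routed to the Slack alert step.
  console.log(
    `⚠️ Onboarding task incomplete (score ${aiResult.score}/100): ` +
      aiResult.missingItems.join(", ")
  );
} else {
  console.log("✅ Onboarding task passes the quality gate");
}
```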
by Rahul Joshi
📘 Description
This workflow automates end-to-end AI-driven inventory intelligence, transforming Airtable stock data into optimized reorder recommendations, daily operational summaries, and instant Slack alerts. It fetches all inventory rows, validates structure, computes reorder and safety-stock metrics using strict formulas, merges multi-batch AI output into a unified dataset, and distributes actionable insights across Email and Slack. Invalid or corrupted Airtable rows are logged to Google Sheets for audit and cleanup. The workflow ensures deterministic inventory math (zero improvisation), strict JSON compliance, and reliable multi-channel reporting for operations teams.

⚙️ What This Workflow Does (Step-by-Step)

▶️ Manual Trigger – Start Inventory Optimization
Runs the full optimization and reporting pipeline on demand.

📦 Fetch Inventory Records from Airtable
Retrieves all SKU records (ID, ItemName, SKU, quantities, reorder levels) from the Airtable Inventory table.

🔍 Validate Inventory Record Structure (IF)
Ensures each record contains a valid id.
- Valid → routed to AI optimization
- Invalid → saved to Google Sheets

📄 Log Invalid Inventory Rows to Google Sheet
Captures malformed or incomplete Airtable items for audit checks and data hygiene.

🧠 Configure GPT-4o — Inventory Optimization Model
Defines the AI model for stock-level calculations using strict formulas (see the sketch after this section):
- SuggestedReorderPoint = ReorderLevel × 1.2
- SuggestedSafetyStock = ReorderLevel × 0.5
- StockStatus logic:
  - Critical if QuantityInStock ≤ SuggestedSafetyStock
  - Needs Reorder if QuantityInStock ≤ SuggestedReorderPoint
  - OK otherwise

🤖 Generate Inventory Optimization Output (AI)
The AI engine analyzes each SKU and returns:
- Suggested reorder point
- Suggested safety stock
- Updated stock status
- Clean structured JSON for each item
All without markdown, hallucination, or additional logic.

🧩 Merge AI Optimization Results (Code)
Consolidates all partial AI responses into one complete JSON dataset containing all SKUs.

🧠 Configure GPT-4o – Email Summary Model
Prepares the AI model used for generating a professional operations-team email.

📧 Generate Inventory Email Summary (AI)
Creates a manager-ready email including:
- High-level inventory health
- Detailed SKU summaries
- Alerts for low, reorder-level, or critical stock
- Recommended actions for today's operations

📨 Email Inventory Summary to Manager (Gmail)
Sends the completed inventory summary to the operations manager.

🧠 Configure GPT-4o – Slack Summary Model
Sets up GPT-4o to produce a compact, emoji-supported Slack summary.

💬 Generate Inventory Slack Summary (AI)
Builds a Slack-optimized message containing:
- A one-line inventory health statement
- A bullet list of SKUs with stock status
- Clear alerts for reorder-level or critical items
- One recommended action line

📡 Notify Operations Team on Slack
Delivers the optimized Slack summary to the operations Slack user/channel for real-time visibility.

🧩 Prerequisites
- Airtable access token
- Azure OpenAI GPT-4o credentials
- Google Sheets OAuth
- Slack API credentials
- Gmail OAuth

💡 Key Benefits
✔ AI-powered stock calculations with strict formulas
✔ Reliable reorder and safety-stock predictions
✔ Instant multi-channel reporting (Email + Slack)
✔ Full audit logging for invalid data
✔ Zero hallucinations—pure structured JSON
✔ Faster decision-making for operations teams

👥 Perfect For
- Operations & supply-chain teams
- Inventory managers
- Retail & e-commerce units
- Businesses using Airtable for stock tracking
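Because the formulas are fixed, the optimization math can also be expressed directly in code. Here is a short sketch that mirrors the rules above, useful as a Code-node fallback or to sanity-check the AI output against the deterministic math.

```javascript
// Direct implementation of the stated formulas, useful for sanity-checking
// the AI output: the numbers must match these results exactly.
function optimizeSku(sku) {
  const reorderPoint = sku.ReorderLevel * 1.2;
  const safetyStock = sku.ReorderLevel * 0.5;

  let status = "OK";
  if (sku.QuantityInStock <= safetyStock) status = "Critical";
  else if (sku.QuantityInStock <= reorderPoint) status = "Needs Reorder";

  return {
    ...sku,
    SuggestedReorderPoint: reorderPoint,
    SuggestedSafetyStock: safetyStock,
    StockStatus: status,
  };
}

// Example: ReorderLevel 100 gives reorder point 120 and safety stock 50,
// so a quantity of 45 is classified as "Critical" (45 <= 50).
console.log(
  optimizeSku({ SKU: "SKU-001", ItemName: "Widget", QuantityInStock: 45, ReorderLevel: 100 })
);
```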
by WeblineIndia
Smart Contract Event Monitor (Web3)

This workflow automatically monitors the Ethereum blockchain, extracts USDT transfer events, filters large-value transactions, stores them in Airtable, and sends a clean daily summary alert to Slack.

It checks the latest Ethereum block every day and identifies high-value USDT transfers. It fetches on-chain logs using Alchemy, extracts sender/receiver/value details, filters transactions above a threshold, stores them in Airtable and finally sends a single, clear summary alert to Slack.

You receive:
- Daily blockchain check (automated)
- Airtable tracking of all high-value USDT transfers
- A Slack alert summarizing the count + the largest transfer

Ideal for teams wanting simple, automated visibility of suspicious or large crypto movements without manually scanning the blockchain.

**Quick Start – Implementation Steps**
1. Add your Alchemy Ethereum Mainnet API URL in both HTTP nodes.
2. Connect and configure your Airtable base & table.
3. Connect your Slack credentials and set the channel for alerts.
4. Adjust the value threshold in the IF node (default: 1,000,000,000).
5. Activate the workflow — daily monitoring begins instantly.

**What It Does**
This workflow automates detection of high-value USDT transfers on Ethereum:
1. Fetches the latest block number using Alchemy.
2. Retrieves all USDT Transfer logs from that block.
3. Extracts structured data: sender, receiver, amount, contract, block number, transaction hash.
4. Filters only transactions above a configurable threshold.
5. Saves each high-value transaction into Airtable for record-keeping.
6. Generates a summary including the total number of high-value transfers and the single largest transfer.
7. Sends one clean alert message to Slack.

This ensures visibility of suspicious or large fund movements with no repeated alerts.

**Who's It For**
This workflow is ideal for:
- Crypto analytics teams
- Blockchain monitoring platforms
- Compliance teams tracking high-value activity
- Web3 product teams
- Developers needing automated USDT transfer tracking
- Anyone monitoring whale movements / suspicious transactions

**Requirements to Use This Workflow**
To run this workflow, you need:
- n8n instance (cloud or self-hosted)
- Alchemy API URL (Ethereum Mainnet)
- Airtable base + Personal Access Token
- Slack workspace with API permissions
- Basic understanding of Ethereum logs, hex values & JSON data

**How It Works**
1. Daily Check – The workflow runs automatically at your set time.
2. Get Latest Block Number – Fetches the newest Ethereum block from Alchemy.
3. Fetch USDT Logs – Queries all Transfer events (ERC-20 standard).
4. Extract Transaction Details – Converts hex data into readable values (a decoding sketch follows the setup steps).
5. Filter High-Value Transactions – Keeps only large-value transfers.
6. Save to Airtable – Adds each transfer record to your database.
7. Generate Summary – Finds the largest transfer & total count.
8. Send Slack Alert – Notifies your team with one clean summary.

**Setup Steps**
1. Import the provided n8n JSON file.
2. Open the Get Latest Block and Fetch Logs HTTP nodes → add your Alchemy API URL.
3. Ensure the USDT contract address (already included): 0xdAC17F958D2ee523a2206206994597C13D831ec7
4. Connect your Airtable account and map: Contract, From Address, To Address, Value, Block Number, txHash.
5. Connect Slack API credentials and choose your channel.
6. Change the threshold limit in the IF node if needed (default: 1B).
7. Activate the workflow — done!
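For reference, the two Alchemy calls and the hex decoding behind steps 2 to 4 look roughly like this. The endpoint URL is a placeholder; the Transfer topic hash is the standard keccak256 of Transfer(address,address,uint256), and USDT amounts use 6 decimals. Align the final filter with whatever threshold your IF node actually uses.

```javascript
// Sketch of the JSON-RPC calls and hex decoding behind the block/log steps.
// ALCHEMY_URL is a placeholder; the topic hash is the standard ERC-20
// Transfer event signature hash.
const ALCHEMY_URL = "https://eth-mainnet.g.alchemy.com/v2/YOUR_KEY";
const USDT = "0xdAC17F958D2ee523a2206206994597C13D831ec7";
const TRANSFER_TOPIC =
  "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef";

async function rpc(method, params) {
  const res = await fetch(ALCHEMY_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ jsonrpc: "2.0", id: 1, method, params }),
  });
  return (await res.json()).result;
}

const latestBlock = await rpc("eth_blockNumber", []);
const logs = await rpc("eth_getLogs", [
  { fromBlock: latestBlock, toBlock: latestBlock, address: USDT, topics: [TRANSFER_TOPIC] },
]);

// topics[1]/[2] hold the zero-padded from/to addresses; data holds the value.
// USDT has 6 decimals, so the raw integer divided by 1e6 is the token amount.
const transfers = logs.map((log) => ({
  from: "0x" + log.topics[1].slice(26),
  to: "0x" + log.topics[2].slice(26),
  rawValue: BigInt(log.data).toString(),   // integer in 6-decimal units
  value: Number(BigInt(log.data)) / 1e6,   // human-readable USDT
  txHash: log.transactionHash,
  blockNumber: parseInt(log.blockNumber, 16),
}));

// Adjust this threshold to match your IF node's configuration.
console.log(transfers.filter((t) => t.value >= 1_000_000));
```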
**How To Customize Nodes**

Customize Value Threshold
Modify the IF node:
- Increase or decrease the minimum transfer value
- Change the logic for smaller or larger whale-tracking

Customize Airtable Storage
You can add fields like:
- Timestamp
- Token symbol
- USD price (using a price API)
- Transaction status
- Risk classification

Customize Slack Alerts
You may add:
- Emojis
- Mentions (@channel, @team)
- Links to Etherscan
- Highlighted blocks for critical transfers

Customize Web3 Provider
Replace Alchemy with:
- Infura
- QuickNode
- Public RPC (not recommended for reliability)

**Add-Ons (Optional Enhancements)**
You can extend this workflow to:
- Track multiple ERC-20 tokens
- Process several blocks instead of just the latest
- Add price conversion (USDT → USD value)
- Detect transfers to suspicious wallets
- Generate daily or weekly summary reports in Slack
- Create a dashboard using Airtable Interfaces
- Add OpenAI-based insights (large spikes, suspicious patterns, etc.)

**Use Case Examples**
1. Whale Tracking – Detect large USDT movements (>1M or >5M).
2. Compliance Monitoring – Log high-value transfers in Airtable for audits.
3. Real-Time Alerts – Slack alerts tell your team instantly about big movements.
4. On-Chain Analytics – Automate structured extraction of Ethereum logs.
5. Exchange Monitoring – Detect large inflows/outflows to known addresses.

**Troubleshooting Guide**

| Issue | Possible Cause | Solution |
|------------------------|-----------------------------------|---------------------------------------------------------|
| No data in Airtable | Logs returned empty | Ensure USDT transfer events exist in that block |
| Values are "zero" | Hex parsing failed | Check the extract-code logic |
| Slack alert not sent | Invalid credentials | Update the Slack API key |
| Airtable error | Wrong field names | Match Airtable column names exactly |
| HTTP request fails | Wrong RPC URL | Re-check the Alchemy API key |
| Workflow not running | Schedule disabled | Enable the "Daily Check" node |

**Need Help?**
If you need help customizing or extending this workflow — adding multi-token monitoring, setting up dashboards, improving alerts or scaling this for production — our n8n workflow developers at WeblineIndia can assist you with advanced automation.
by Rahi
n8n Workflow: AI-Personalized Email Outreach (Smartlead)

🔄 Purpose
This workflow automates cold email campaigns by:
- Fetching leads
- Generating hyper-personalized email content using AI
- Sending emails via the Smartlead API
- Logging campaign activity into Google Sheets

🧩 Workflow Structure
1. Schedule Trigger – Starts the workflow automatically at scheduled intervals, ensuring continuous campaign execution.
2. Get Leads – Fetches lead data (name, email, company, role, industry) that serves as the input for personalization.
3. Loop Over Leads – Processes each lead one by one to keep email generation individualized.
4. Aggregate Lead Data – Collects and formats lead attributes, preparing structured input for the AI model.
5. Basic LLM Chain #1 – Generates personalized snippets/openers using AI, tailored to company, role, and industry.
6. Update Row (Google Sheets) – Saves the AI outputs (snippets) for tracking and QA.
7. Basic LLM Chain #2 – Expands the snippet into a full personalized email draft, including subject line + email body.
8. Information Extractor – Extracts structured fields from the AI output: subject, greeting, call-to-action (CTA), closing.
9. Update Row (Google Sheets) – Stores the finalized draft in Google Sheets for visibility and an audit trail.
10. Code – Formats the email into a Smartlead-compatible payload, mapping fields like subject, body, and recipient details (see the sketch after this section).
11. Smartlead API Request – Sends the personalized email through Smartlead and returns the message ID and delivery status.
12. Basic LLM Chain #3 (Optional) – Generates follow-up versions for multi-step campaigns to ensure varied engagement over time.
13. Information Extractor (Follow-ups) – Structures follow-up emails into a ready-to-send format.
14. Update Row (Google Sheets) – Updates campaign logs with the Smartlead send status, message IDs, and AI personalization notes.

⚙️ Data Flow Summary
- **Trigger** → Runs the workflow
- **Get Leads** → Fetch lead records
- **LLM Personalization** → Create openers + full emails
- **Google Sheets** → Save drafts & logs
- **Smartlead API** → Send personalized email
- **Follow-ups** → Generate and log structured follow-up messages

📊 Use Case
- Automates hyper-personalized cold email outreach at scale.
- Uses AI to improve response rates with contextual personalization.
- Provides full visibility by saving drafts and send logs in Google Sheets.
- Integrates seamlessly with Smartlead for sending and tracking.
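The Code step's payload mapping depends on your Smartlead campaign setup. The sketch below uses hypothetical field names and a placeholder campaign ID, not Smartlead's documented API, so verify the shape against the Smartlead docs and your HTTP Request node before use.

```javascript
// Hypothetical sketch of the Code node that maps AI output to a payload for
// the Smartlead HTTP Request node. Field names and campaign_id are
// assumptions; verify them against the Smartlead API documentation.
const lead = $input.first().json;

const payload = {
  campaign_id: "YOUR_CAMPAIGN_ID",            // placeholder
  email: lead.email,
  first_name: lead.name,
  custom_fields: {
    subject: lead.ai_subject,                 // from the Information Extractor
    body: [lead.ai_greeting, lead.ai_body, lead.ai_cta, lead.ai_closing]
      .filter(Boolean)
      .join("\n\n"),
  },
};

// The downstream HTTP Request node POSTs this JSON to Smartlead with your API key.
return [{ json: payload }];
```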