# Prometheus Spike Alert to Slack
by John Pranay Kumar Reddy

## 🧾 Summary
This workflow monitors Kubernetes pod CPU usage using Prometheus and sends real-time Slack alerts when CPU consumption crosses a threshold (e.g., 0.8 cores). It groups pods by application name to reduce noise and improve clarity, making it ideal for observability across multi-pod deployments such as Argo CD, Loki, Promtail, and other applications.

## 👥 Who’s it for
Designed for DevOps, SRE, and platform teams, this workflow is 100% no-code, plug-and-play, and can easily be extended to cover memory, disk, or network spikes. It eliminates the need for Alertmanager by routing critical alerts directly into Slack using native n8n nodes.

## ⚙️ What it does
This n8n workflow polls Prometheus every 5 minutes ⏱️, checks whether any pod's CPU usage crosses a defined threshold (e.g., 0.8 cores) 🚨, groups the offending pods by app 🧩, and sends structured alerts to a Slack channel 💬 (see the query sketch at the end of this entry).

## 🛠️ How to set up
- 🔗 Set your Prometheus URL with the required metrics (container_cpu_usage_seconds_total, kube_pod_container_resource_limits)
- 🔐 Add your Slack bot token with the chat:write scope
- 🧩 Import the workflow and customize:
  - Threshold (e.g., 0.8 cores)
  - Slack channel
  - Cron schedule

## 📋 Requirements
- A working Prometheus stack with kube-state-metrics
- Slack bot credentials
- n8n instance (self-hosted or cloud)

## 🧑‍💻 How to customize
- 🧠 Adjust threshold values or the query interval
- 📈 Add memory/disk/network usage metrics

💡 This is a plug-and-play Kubernetes alerting template for real-time observability.

🏷️ Tags: Prometheus, Slack, Kubernetes, Alert, n8n, DevOps, Observability, CPU Spike, Monitoring
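As referenced in "What it does": the workflow evaluates a threshold query against Prometheus. The template's exact PromQL isn't reproduced here, but a sketch built from the metrics named above could look like this:

```promql
# Hypothetical sketch: per-pod CPU usage (in cores) averaged over 5 minutes,
# keeping only pods above the 0.8-core threshold. App-level grouping is then
# done inside n8n on the returned pod labels.
sum by (namespace, pod) (
  rate(container_cpu_usage_seconds_total{container!=""}[5m])
) > 0.8
```

The `> 0.8` comparison mirrors the template's default threshold; change it together with the threshold setting in the workflow.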
by Agent Circle
This n8n template makes it easy to extract key YouTube video data - including title, view count, like count, comment count, and more - and save it directly into a connected Google Sheet.

Use cases are many: whether you're a YouTuber, content strategist, growth marketer, or automation engineer, this tool gives you fast, structured access to video-level insights in seconds.

## How It Works
1. The workflow begins when you click Execute Workflow or Test Workflow manually in n8n.
2. It reads the list of video URLs in the connected Google Sheet. Only the URLs marked with the Ready status will be processed.
3. The tool loops through each video and prepares the necessary data for the upcoming YouTube API call.
4. For each available URL, the tool extracts the video ID and sends a request to the YouTube API to fetch key metrics (see the request sketch at the end of this entry).
5. The response is checked:
   - If successful: the video's statistics are written back to the corresponding row in the Google Sheet and the row's status is marked as Finished.
   - If unsuccessful: the row's status is updated to Error for later review.

## How To Use
1. Download the workflow package.
2. Import the workflow package into your n8n interface.
3. Duplicate the YouTube - Get Video Statistics Google Sheet template into your Google Sheets account.
4. Set up Google Cloud Console credentials in the following nodes in n8n, ensuring enabled access and suitable rights to the Google Sheets and YouTube services:
   - For Google Sheets access, ensure each node is properly connected to the correct tab in your connected Google Sheet template:
     - Node Google Sheets - Get Video URLs → connected to Tab Video URLs
     - Node Google Sheets - Update Data → connected to Tab Video URLs
     - Node Google Sheets - Update Data - Error → connected to Tab Video URLs
   - For YouTube access, set up a GET method to connect to the YouTube API in the following node:
     - Node HTTP - Find Video Data
5. In your connected Google Sheet, enter the video URLs that you want to crawl and set the rows' status to Ready.
6. Run the workflow by clicking Execute Workflow or Test Workflow in n8n.
7. View the results in your connected Google Sheet:
   - Successful fetches will update the rows' status in Column A of the Video URLs tab to Finished, and the video metrics will populate.
   - If a call fails, the row's status in Column A will be marked as Error.

## Requirements
Basic setup in Google Cloud Console (OAuth or API Key method enabled) with access enabled for the YouTube and Google Sheets services.

## How To Customize
- By default, the workflow is manually triggered in n8n. However, you can automate the process by adding a Google Sheets trigger that monitors new entries automatically.
- If you want to fetch additional video fields or analytics (like tags, category ID, etc.), you can expand the HTTP - Find Video Data node to include those.

## Need Help?
Join our community on different platforms for support, inspiration, and tips from others.
- Website: https://www.agentcircle.ai/
- Etsy: https://www.etsy.com/shop/AgentCircle
- Gumroad: http://agentcircle.gumroad.com/
- Discord Global: https://discord.gg/d8SkCzKwnP
- FB Page Global: https://www.facebook.com/agentcircle/
- FB Group Global: https://www.facebook.com/groups/aiagentcircle/
- X: https://x.com/agent_circle
- YouTube: https://www.youtube.com/@agentcircle
- LinkedIn: https://www.linkedin.com/company/agentcircle
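As referenced in step 4 of "How It Works", the HTTP - Find Video Data node calls the YouTube Data API v3 videos endpoint. A minimal sketch of that request (the ID extraction shown handles only watch?v= URLs, and YOUR_API_KEY is a placeholder for the key created in Google Cloud Console):

```javascript
// Minimal sketch: extract a video ID from a watch URL and fetch its statistics.
const url = "https://www.youtube.com/watch?v=dQw4w9WgXcQ";
const videoId = new URL(url).searchParams.get("v"); // "dQw4w9WgXcQ"

const endpoint =
  "https://www.googleapis.com/youtube/v3/videos" +
  `?part=statistics,snippet&id=${videoId}&key=YOUR_API_KEY`;

const res = await fetch(endpoint);
const data = await res.json();

// items is empty if the ID was invalid — that case maps to the Error status.
const { viewCount, likeCount, commentCount } = data.items[0].statistics;
console.log(data.items[0].snippet.title, viewCount, likeCount, commentCount);
```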
by Rosh Ragel
# Automatically Send Square Summary Report for Yesterday's Sales via Microsoft Outlook

## What It Does
This workflow automatically connects to the Square API and generates a daily sales summary report for all your Square locations. The report matches the figures displayed in Square Dashboard > Reports > Sales Summary. It's designed to run daily, pull the previous day's sales into a CSV file, and send that file to a manager or finance team for analysis.

This workflow builds on my previous template, which lets users automatically pull data from the Square API into n8n for processing (see here: https://n8n.io/workflows/6358).

## Prerequisites
To use this workflow, you'll need:
- A Square API credential (configured as a Header Auth credential)
- A Microsoft Outlook credential

## How to Set Up Square Credentials
1. Go to Credentials > Create New
2. Choose Header Auth
3. Set the Name to Authorization
4. Set the Value to your Square Access Token (e.g., Bearer <your-api-key>)

## How It Works
1. **Trigger**: The workflow runs every day at 4:00 AM
2. **Fetch Locations**: An HTTP request retrieves all Square locations linked to your account
3. **Fetch Orders**: For each location, an HTTP request pulls completed orders for the specified report_date (see the request sketch at the end of this entry)
4. **Filter Empty Locations**: Locations with no sales are ignored
5. **Aggregate Sales Data**: A Code node processes the order data and produces a summary identical to Square's built-in Sales Summary report
6. **Create CSV File**: A CSV file is created containing the relevant data
7. **Send Email**: An email is sent to the chosen recipient

## Example Use Cases
- Automatically send Square sales data to management to improve the quality of planning and scheduling decisions
- Automatically send data to an external third party, such as a landlord or agent, who is paid via commission
- Automatically send data to a bookkeeper for entry into QuickBooks

## How to Use
1. Configure both HTTP Request nodes to use your Square API credential
2. Set the workflow to Active so it runs automatically
3. Enter the email address of the person you want to send the report to and update the message body
4. If you want to remove the n8n attribution, you can do so in the last node

## Customization Options
- Add pagination to handle locations with more than 1,000 orders per day
- Instead of a daily summary, modify the workflow to produce a weekly summary once a week

## Why It's Useful
This workflow saves time, reduces manual report pulling from Square, and enables smarter automation around sales data — whether for operations, finance, or performance monitoring.
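For reference, the two endpoints involved are GET https://connect.squareup.com/v2/locations and POST https://connect.squareup.com/v2/orders/search. A sketch of an orders-search body covering one day's completed orders (the template's exact filter isn't shown here, so treat the values below as illustrative):

```json
{
  "location_ids": ["YOUR_LOCATION_ID"],
  "query": {
    "filter": {
      "state_filter": { "states": ["COMPLETED"] },
      "date_time_filter": {
        "closed_at": {
          "start_at": "2024-05-01T00:00:00-04:00",
          "end_at": "2024-05-02T00:00:00-04:00"
        }
      }
    },
    "sort": { "sort_field": "CLOSED_AT" }
  }
}
```

Note that Square requires the sort field to match the date_time_filter field (here, CLOSED_AT), and the start/end timestamps should use your location's timezone offset.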
by Eumentis
## What It Does
This workflow runs automatically when a new email is received in the user's Gmail account. It sends the email content to OpenAI (GPT-4.1-mini), which intelligently determines whether the message requires action. If the email is identified as actionable, the workflow sends a structured alert message to the user in Microsoft Teams. This keeps the user informed of high-priority emails in real time without the need to manually check every message. The workflow does not log any execution data, ensuring that email content remains secure and unreadable by others.

## How It Works
- **Trigger on New Email**: The workflow is triggered automatically when a new email is received in the user's Gmail account.
- **Email Evaluation with OpenAI**: The email content is sent to GPT-4.1-mini, which evaluates whether the message requires user action.
- **Filter Actionable Emails**: Only emails identified as actionable by the AI proceed through the rest of the workflow.
- **Send Notification to Teams**: For actionable emails, the workflow sends a structured alert message to the user in a Microsoft Teams chat via a Power Automate webhook.

## Prerequisites
- Gmail IMAP credentials
- OpenAI API key
- Microsoft Teams webhook URL
- Power Automate flow to send messages to a Teams chat

## How to Set It Up

### 1. Set Up the Power Automate Workflow

**1.1 Open the Workflow app in Microsoft Teams**
Open the Workflow app from Microsoft Teams. If it's not already added, go to Apps → search "Workflow" → click Add → open it.

**1.2 Create a New Flow**
Click New Flow → select Create from blank.

**1.3 Add a Trigger: "When a Teams webhook request is received"**
In the trigger setup, set "Who can trigger the flow?" to Anyone. After saving the flow, a webhook URL will be generated — this URL will be used in the n8n workflow.

**1.4 Add Action: Parse JSON**
Set Content to: Body. Use the following schema:

```json
{
  "type": "object",
  "properties": {
    "from": { "type": "string" },
    "receivedAt": { "type": "string" },
    "subject": { "type": "string" },
    "message": { "type": "string" }
  }
}
```

**1.5 Add Action: "Get an @mention token for a user"**
Set the User field to the Microsoft Teams email address of the person to notify (e.g., yourname@domain.com).

**1.6 Add Action: "Post message in a chat or channel"**
In this action, configure the following:
- Post as: Flow bot
- Post in: Chat with Flow bot
- Recipient: Your Microsoft Teams email address (e.g., yourname@domain.com)

Paste the following into the Message field (in code view). Note that the field references must match the Parse JSON schema above:

```
Hello @{outputs('Get_an_@mention_token_for_a_user')?['body/atMention']},
You have received a new email that requires your attention:
From: @{body('Parse_JSON')?['from']}
Received On: @{body('Parse_JSON')?['receivedAt']}
Subject: @{body('Parse_JSON')?['subject']}
Please review the message at your earliest convenience. Click here to search this mail in your mailbox.
```

**1.7 Save and Enable the Flow**
Click Save and turn the flow On. The webhook URL is now active and available in the first trigger step; copy it to use in n8n.

Need help with the setup? Feel free to contact us.

### 2. Configure the IMAP Email Trigger
First, enable 2-Step Verification in your Google Account and generate an App Password for n8n. Then, in the IMAP node → Create Credential, connect using the following details:
- User: your Gmail address
- Password: the App Password
- Host: imap.gmail.com
- Port: 993
- SSL/TLS: Enabled

Follow the n8n documentation to complete the setup.

### 3. Configure the OpenAI Integration
Add your OpenAI API key as a credential in n8n. Follow the n8n documentation to complete the setup.

### 4. Set Up the HTTP Request to Trigger the Power Automate Workflow
Paste the webhook URL generated by the Power Automate workflow into the URL field of the HTTP Request node (an example payload is shown at the end of this entry).

### 5. Disable Execution Logging for Privacy
To ensure that email content is not stored in logs and remains fully secure, you can disable execution logging in n8n:
1. In the n8n Workflow Editor, click the three dots (•••) in the top right corner and select Settings.
2. In the settings panel:
   - Set "Save manual executions" to: Do not save
   - Set "Save successful production executions" to: Do not save
   - Set "Save failed production executions" to: Do not save, if you also want to avoid logging errors
3. Save the changes.

Refer to the official n8n documentation for more details.

### 6. Activate the Workflow
Set the workflow status to Active in n8n so it runs automatically when new mail arrives in Gmail.

Need Help? Contact us for support and custom workflow development.
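As referenced in step 4, the body the HTTP Request node posts to the webhook should mirror the Parse JSON schema from step 1.4. An illustrative payload (all values are made up):

```json
{
  "from": "alice@example.com",
  "receivedAt": "2025-01-15T09:42:00Z",
  "subject": "Contract renewal deadline this Friday",
  "message": "Hi, the renewal paperwork needs your signature before Friday..."
}
```

If you rename or add fields here, update the Parse JSON schema and the Teams message template to match, or Power Automate will resolve them to empty strings.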
by Garri
## Description
This workflow is an n8n-based automation that lets users download TikTok/Reels videos without watermarks simply by sending the video link to a Telegram bot. It uses a Telegram Trigger to receive the link from the user, then makes an HTTP request to a third-party API (mediadl.app) to process the link and retrieve a download URL. The workflow filters the results to find the "Download without watermark" link, downloads the video in MP4 format, and sends it back to the user directly in their Telegram chat.

Key features:
- Supports the best available video quality (bestvideo+bestaudio).
- Automatically removes watermarks.
- Instant response directly in the Telegram chat.
- Fully automated — no manual downloads required.

## How It Works
1. **Telegram Trigger** - The user sends a TikTok or Reels link to the Telegram bot. The workflow captures and stores the link for processing.
2. **HTTP Request – MediaDL API** - The link is sent via POST to https://mediadl.app/api/download. The API processes the link and returns video file data.
3. **Wait Delay** - The workflow waits a few seconds to ensure the API response is fully ready.
4. **Edit Fields** - Extracts the video file URL from the API response.
5. **Additional Wait Delay** - Adds a short pause to avoid connection errors during the download.
6. **HTTP Request – Proxy Download** - Downloads the MP4 video file directly from the extracted URL.
7. **Send Video via Telegram** - The downloaded video is sent back to the user in their Telegram chat (see the sketch at the end of this entry).

## How to Set Up
1. **Create & configure a Telegram bot**
   - Open Telegram and search for BotFather.
   - Send /newbot → choose a name & username for your bot.
   - Copy the Bot Token provided — you'll need it in n8n.
2. **Prepare your n8n environment**
   - Log in to your n8n instance (self-hosted or n8n Cloud).
   - Go to Credentials → create new Telegram API credentials using your Bot Token.
3. **Import the workflow**
   - In n8n, click Import and select the PROJECT_DOWNLOAD_TIKTOK_REELS.json file.
4. **Configure the Telegram nodes**
   - In the Telegram Trigger and Send Video nodes, connect your Telegram API credentials.
5. **Configure the HTTP Request nodes**
   - Ensure the Download2 and HTTP Request nodes have the correct URL and headers (pre-configured for mediadl.app).
   - Make sure responseFormat is set to file in the final download node.
6. **Activate the workflow**
   - Toggle Activate in the top right corner of n8n.
   - Test by sending a TikTok or Reels link to your bot; you should receive the no-watermark video in return.
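As referenced in step 7 of "How It Works", the final delivery is a standard Telegram Bot API sendVideo call; the workflow's Telegram node performs this for you, but a sketch of the equivalent request (token, chat ID, and video URL are placeholders) may help when debugging:

```javascript
// Illustrative sketch only — in the workflow, the Send Video node does this.
const BOT_TOKEN = "123456:ABC-DEF";   // from BotFather
const CHAT_ID = 987654321;            // the user's chat id from the Telegram Trigger

const res = await fetch(`https://api.telegram.org/bot${BOT_TOKEN}/sendVideo`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    chat_id: CHAT_ID,
    video: "https://example.com/no-watermark.mp4", // URL extracted in Edit Fields
    caption: "Here is your video without watermark",
  }),
});
console.log((await res.json()).ok); // true on success
```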
by Puspak
# 🚀 Remote Job Automation Workflow
Automatically fetch, clean, and broadcast the latest remote job listings — powered by RemoteOK, Airtable, and Telegram.

## 🔧 Key Features
- **Seamless Data Fetching**: Pulls the latest job listings from the RemoteOK API using an HTTP Request node.
- **Smart Data Processing (via Code node)** (see the cleanup sketch at the end of this entry):
  - Filters out irrelevant metadata
  - Cleans and sanitizes job descriptions (e.g., HTML tags, special characters)
  - Handles malformed or encoded text gracefully
  - Extracts and formats salary ranges for clarity
- **Airtable Integration (Upsert)**:
  - Stores each job as a unique record using the job ID
  - Avoids duplication through conditional upserts using Airtable's Personal Access Token
- **Telegram Bot Broadcasting**:
  - Automatically formats and sends job posts to a Telegram group or channel
  - Keeps your community or team updated in real time

## 📦 Tech Stack
- RemoteOK API – source of curated remote job listings
- Airtable – lightweight, accessible job database
- Telegram Bot API – real-time job notifications
- n8n – workflow automation engine that ties everything together

## 🔁 Workflow Overview
1. Fetch jobs from the RemoteOK API
2. Clean & normalize job descriptions and metadata
3. Extract salary ranges and standardize them
4. Upsert to Airtable (avoiding duplicates)
5. Format the post for visual clarity
6. Send to Telegram via the bot integration

## 🧠 Perfect For
- Remote job boards or aggregators
- Recruitment agencies/startups
- Developers building personal job feeds
- Communities or channels sharing curated remote opportunities
- Automating newsletters or job digests

## ✅ Benefits
- Near real-time updates
- Minimal maintenance
- Full control and extensibility with n8n
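A rough sketch of what the Code node's cleanup step could look like. Field names on the job object (description, salary_min, salary_max, position, company) are assumptions about the RemoteOK payload, not copied from the workflow — verify them against an actual API response:

```javascript
// Minimal sketch of description cleaning and salary formatting in an
// n8n Code node ("Run Once for All Items" mode).
function cleanJob(job) {
  const description = (job.description || "")
    .replace(/<[^>]*>/g, " ")   // strip HTML tags
    .replace(/&amp;/g, "&")     // decode a few common entities
    .replace(/&nbsp;/g, " ")
    .replace(/\s+/g, " ")       // collapse whitespace
    .trim();

  const salary =
    job.salary_min && job.salary_max
      ? `$${job.salary_min.toLocaleString()} – $${job.salary_max.toLocaleString()}`
      : "Not specified";

  return { id: job.id, position: job.position, company: job.company, description, salary };
}

return items.map((item) => ({ json: cleanJob(item.json) }));
```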
by Yusei Miyakoshi
## Who’s it for
Teams that start their day in Slack and want a concise, automated summary of yesterday’s emails — ops leads, PMs, founders, and anyone handling busy inboxes without writing code.

## What it does / How it works
Runs every morning at 08:00 (cron 0 0 8 * * *), fetches all emails received yesterday, and routes the result: if none were found, it posts a polite “no emails” notice; if emails exist, it aggregates them and asks an AI agent to produce a structured digest, then formats and posts it to your chosen Slack channel. The flow uses Gmail → If → Aggregate (Item Lists) → AI Agent (OpenRouter model with structured output) → Code (Slack formatter; a formatter sketch appears at the end of this entry) → Slack. A set of sticky notes on the canvas explains each step and the required inputs.

## How to set up
1. Connect Gmail (OAuth2) and keep the default date window (yesterday → today at 00:00).
2. Connect Slack (OAuth2) and select your target channel.
3. Add OpenRouter credentials and pick a compact model (e.g., gpt-4o-mini).
4. Keep the provided structured-output schema and formatter code.
5. Adjust the schedule/timezone if needed (the fallback message includes an Asia/Tokyo example).
6. Paste this description into the yellow sticky note at the top of the canvas.

## Requirements
- Gmail & Slack accounts with appropriate scopes
- OpenRouter API key stored in Credentials (no hard-coded keys)
- n8n Cloud or self-host with LangChain agent nodes enabled

## How to customize the workflow
- Narrow Gmail results with label/search filters (e.g., from:, subject:).
- Change the digest sections or tone in the AI Agent system prompt.
- Swap the model for cost/quality needs and tweak temperature/max tokens.
- Localize dates/timezones in the formatter code and Slack messages.
- Branch the output to email, Google Docs, or Sheets for archival.

## Security & publishing tips
Rename all nodes clearly, do not hardcode API keys, remove real channel IDs/emails before sharing, and group end-user variables in a Set (Fields) node. Keep the sticky notes — they’re mandatory for reviewers.
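The template ships its own formatter code; purely to illustrate the pattern, a Slack-formatter Code node might look like the sketch below. The summary/highlights fields are invented for this sketch — use the structured-output schema that actually ships with the template:

```javascript
// Illustrative Code-node sketch: turn the agent's structured output into
// Slack mrkdwn. Field names here are assumptions, not the template's schema.
const digest = items[0].json;

const lines = [
  `*📬 Email digest — ${digest.date}*`,
  digest.summary,
  "",
  ...digest.highlights.map((h) => `• *${h.from}*: ${h.subject} — ${h.note}`),
];

return [{ json: { text: lines.join("\n") } }];
```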
by Arunava
This workflow finds fresh Reddit posts that match your keywords, decides if they’re actually relevant to your brand, writes a short human-style reply using AI, posts it, and logs everything to Baserow.

## 💡 Perfect for
- Lead gen without spam: drop helpful replies where your audience hangs out.
- Getting discovered by AI surfaces (AI Overviews / SGE, AISEO/GSEO) via high-quality brand mentions.
- Customer support in the wild: answer troubleshooting threads fast.
- Community presence: steady, non-salesy contributions in niche subreddits.

## 🧠 What it does
- Searches Reddit for your keyword query on a schedule (e.g., every 30 min)
- Checks Baserow first so you don’t reply twice to the same post
- Uses an AI prompt tuned for short, no-fluff, subreddit-friendly comments
- Softly mentions your brand only when it’s clearly relevant
- Posts the comment via Reddit’s API
- Saves post_id, comment_id, reply, permalink, and status to Baserow
- Processes posts one by one with an optional short wait to be nice to Reddit

## ⚡ Requirements
- Reddit developer API
- Baserow account, table, and API token
- AI provider API (OpenAI / Anthropic / Gemini)

## ⚙️ Setup Instructions
1. **Create the Baserow table**
   Fields (user-field names exactly): post_id (unique), permalink, subreddit, title, created_utc, reply (long text), replied (boolean), created_on (datetime).
2. **Add credentials in n8n**
   - **Reddit OAuth2** (scopes: read, submit, identity) and set a proper **User-Agent** string (Reddit requires it).
   - **LLM**: Google Gemini and/or Anthropic (both can be added; one can be the fallback in the AI Agent).
   - **Baserow**: API token.
3. **Set the Schedule Trigger (cron)**
   Start hourly (or every 2–3h). Pacing is mainly enforced by the Wait nodes.
4. **Update “Check duplicate row” (HTTP Request)**
   - URL: https://api.baserow.io/api/database/rows/table/{TABLE_ID}/?user_field_names=true&filter__post_id__equal={{$json.post_id}}
   - Header: Authorization: Token YOUR_BASEROW_TOKEN
   (Use your own Baserow domain if self-hosted.)
5. **Configure “Filter Replied Posts”**
   Ensure it skips items where your Baserow record shows replied === true (so you don’t comment twice).
6. **Configure “Fetch Posts from Reddit”**
   Set your keyword/search query (and time/sort). Keep the User-Agent header present.
7. **Configure “Write Reddit Comment (AI)”**
   - Update your brand name (and optional link).
   - Edit the prompt/tone to your voice; ensure it outputs a short reply field (≤80 words, helpful, non-salesy).
8. **Configure “Post Reddit Comment” (HTTP Request)** (see the request sketch at the end of this entry)
   - Endpoint: POST https://oauth.reddit.com/api/comment
   - Body: thing_id: "t3_{{$json.post_id}}", text: "{{$json.reply}}"
   - Uses your Reddit OAuth credential and User-Agent header.
   - Update the user_agent header value with your username: n8n:reddit-autoreply:1.0 (by /u/{reddit-username})
9. **Store comment data in Baserow (HTTP Request)**
   - POST https://api.baserow.io/api/database/rows/table/{TABLE_ID}/?user_field_names=true
   - Header: Authorization: Token YOUR_BASEROW_TOKEN
   - Map: post_id, permalink, subreddit, title, created_utc, reply, replied, created_on={{$now}}.
10. **Keep the default pacing**
    Leave Wait 5m (cool-off) and Wait 6h (global pace) → ~4 comments/day. Reduce the waits gradually as account health allows.
11. **Test & enable**
    Run once manually, verify a Baserow row and one test comment, then enable the schedule.

## 🤝 Need a hand?
I’m happy to help you get this running smoothly, or tailor it to your brand. Reach out to me via email: imarunavadas@gmail.com
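As referenced in step 8, the comment post boils down to a form-encoded request; a sketch of the equivalent call (token and IDs are placeholders; in the workflow, the HTTP Request node and the OAuth credential handle this):

```javascript
// Sketch of the "Post Reddit Comment" request.
const body = new URLSearchParams({
  thing_id: "t3_1abc2de", // "t3_" + the post id from the search step
  text: "Short, helpful reply generated by the AI step",
});

const res = await fetch("https://oauth.reddit.com/api/comment", {
  method: "POST",
  headers: {
    Authorization: `bearer ${process.env.REDDIT_ACCESS_TOKEN}`, // placeholder env var
    "User-Agent": "n8n:reddit-autoreply:1.0 (by /u/your-username)",
  },
  body, // sent as application/x-www-form-urlencoded
});
console.log(res.status); // 200 on success
```

Reddit rejects requests without a descriptive User-Agent, which is why the setup steps insist on it.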
by Luka Zivkovic
## Description

### Who's it for
This workflow is designed for developers, entrepreneurs, and startup enthusiasts who want personalized, AI-driven startup idea generation and analysis. Perfect for solo developers seeking side-project inspiration, startup accelerators evaluating concepts, or anyone looking to validate business ideas with professional-grade analysis.

### How it works
The workflow uses a three-stage Claude AI agent pipeline to create comprehensive startup analyses. The first agent generates innovative startup ideas based on your technical skills and preferences. The second agent acts as a venture capitalist, critically analyzing market viability, competition, and execution challenges. The third agent performs sentiment analysis and synthesizes a final recommendation with actionable next steps.

### How to set up
1. Configure Anthropic API credentials for all three Claude AI model nodes
2. Set up Gmail OAuth2 for email delivery
3. Fill out the "My Information" node with your developer profile
4. Update the recipient email address in the Gmail node
5. Test with the manual trigger before enabling daily automation

### Requirements
- n8n account
- Anthropic API account for Claude AI access
- Gmail account with OAuth2 configured
- Basic understanding of developer skills and market preferences

### How to customize the workflow
Modify the AI agent prompts to focus on specific industries or business models. Adjust temperature settings for different creativity levels. Add database storage to track idea history. Configure the form trigger for team-wide idea generation, or integrate with Slack for automated sharing.

Got a good idea? Visit my site https://techpoweredgrowth.com to get help getting to the next level, or reach out to luka.zivkovic@techpoweredgrowth.com
by DevCode Journey
## Who is this for?
This workflow is designed for business founders, CMOs, marketing teams, and landing page designers who want to automatically analyze their landing pages and get personalized, unconventional, high-impact conversion rate optimization (CRO) recommendations. It works by scraping the landing page content, then leveraging multiple AI models to roast the page and generate creative CRO ideas tailored specifically to that page.

## What this Workflow Does / Key Features
- Captures a landing page URL through a user-friendly form trigger.
- Scrapes the landing page HTML content using an HTTP Request node (see the sketch at the end of this entry).
- Sends the scraped content to a LangChain AI Agent, which orchestrates various AI models (OpenAI, Google Gemini, Mistral, etc.) for deep analysis.
- The AI Agent produces a friendly, fun, and unconventional “roast” of the landing page, explaining what’s wrong in a human tone.
- Generates 10 detailed, personalized, easy-to-implement, and 2024-relevant CRO recommendations with a “wow” factor.
- Delivers the analysis and recommendations via Telegram message, Gmail email, and WhatsApp (via Rapiwa).
- Utilizes multiple AI tools and search APIs to enhance the quality and creativity of the output.

## Requirements
- OpenAI API credentials configured in n8n.
- Google Gemini (PaLM) API credentials for LangChain integration.
- Mistral Cloud API credentials for text extraction.
- Telegram bot credentials for sending messages.
- Gmail OAuth2 credentials for email delivery.
- Rapiwa API credentials for WhatsApp notifications.
- Running n8n instance with these nodes: Form Trigger, HTTP Request, LangChain AI Agent, Telegram, Gmail, and the custom Rapiwa node.

## How to Use — Step-by-Step Setup

### 1) Credentials
- Add your OpenAI API key under n8n credentials (OpenAi account 2).
- Add your Google Gemini API key (Google Gemini (PaLM) Api account).
- Add your Mistral Cloud API key (Mistral Cloud account).
- Set up Telegram Bot credentials (Telegram account).
- Set up Gmail OAuth2 credentials (Gmail account).
- Add the Rapiwa API key for WhatsApp messages (Rapiwa).

### 2) Configure the Form Trigger
Customize the form title, description, and landing page URL input placeholder if desired.

### 3) Customize the Delivery Nodes
Modify the Telegram, Gmail, and Rapiwa nodes with your desired recipient info and messaging preferences.

### 4) Run the Workflow
Open the form URL webhook and submit the landing page URL to get a detailed AI-powered CRO roast and recommendations sent directly to your communication channels.

## Important Notes
- The AI Agent prompt is designed to create a fun and unconventional roast that engages users emotionally. Avoid generic advice.
- All CRO recommendations are personalized and contextual, based on the scraped content of the provided landing page.
- Keep all API credentials secure and never hard-code them; use n8n credentials management.
- Adjust the delivery nodes to match your preferred communication channels and recipients.
- The workflow supports expansion with additional AI models or messaging platforms as needed.

## 🙋 For Help & Community
- 👾 Discord: n8n channel
- 🌐 Website: devcodejourney.com
- 🔗 LinkedIn: Connect with Shakil
- 📱 WhatsApp Channel: Join Now
- 💬 Direct Chat: Message Now
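As referenced in the features list above: before scraped HTML goes to an AI agent, it often helps to reduce it to plain text to keep prompts small. A minimal sketch of that preprocessing (the template may do this differently, and the `data` field name on the HTTP Request output is an assumption):

```javascript
// Minimal sketch: strip scripts/styles/tags from scraped HTML before
// handing it to the AI Agent. Not the template's exact code.
function htmlToText(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, " ")
    .replace(/<style[\s\S]*?<\/style>/gi, " ")
    .replace(/<[^>]+>/g, " ")  // drop remaining tags
    .replace(/\s+/g, " ")
    .trim()
    .slice(0, 20000);          // rough cap to keep the prompt small
}

// `data` is an assumed field name for the HTTP Request node's string response.
return items.map((item) => ({ json: { pageText: htmlToText(item.json.data) } }));
```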
by Oneclick AI Squad
Monitor Indian (NSE/BSE) and US stock markets with intelligent price alerts, cooldown periods, and multi-channel notifications (Email + Telegram). The workflow automatically tracks price movements and sends alerts when stocks cross predefined upper/lower limits. Perfect for day traders, investors, and portfolio managers who need instant notifications for price breakouts and breakdowns.

## How It Works
1. **Market Hours Trigger** - Runs every 2 minutes during market hours
2. **Read Stock Watchlist** - Fetches your stock list from Google Sheets
3. **Parse Watchlist Data** - Processes stock symbols and alert parameters
4. **Fetch Live Stock Price** - Gets real-time prices from the Twelve Data API
5. **Smart Alert Logic** - Intelligent price checking with cooldown periods (see the logic sketch at the end of this entry)
6. **Check Alert Conditions** - Validates whether alerts should be triggered
7. **Send Email Alert** - Sends detailed email notifications
8. **Send Telegram Alert** - Instant mobile notifications
9. **Update Alert History** - Records alert timestamps in Google Sheets
10. **Alert Status Check** - Monitors workflow success/failure
11. **Success/Error Notifications** - Admin notifications for monitoring

## Key Features
- **Smart Cooldown**: Prevents alert spam
- **Multi-Market**: Supports Indian & US stocks
- **Dual Alerts**: Email + Telegram notifications
- **Auto-Update**: Tracks last alert times
- **Error Handling**: Built-in failure notifications

## Setup Requirements

### 1. Google Sheets Setup
Create a Google Sheet with these columns (in exact order):
- **A**: symbol (e.g., TCS, AAPL, RELIANCE.BSE)
- **B**: upper_limit (e.g., 4000)
- **C**: lower_limit (e.g., 3600)
- **D**: direction (both/above/below)
- **E**: cooldown_minutes (e.g., 15)
- **F**: last_alert_price (auto-updated)
- **G**: last_alert_time (auto-updated)

### 2. API Keys & IDs to Replace
- YOUR_GOOGLE_SHEET_ID_HERE - Replace with your Google Sheet ID
- YOUR_TWELVE_DATA_API_KEY - Get a free API key from twelvedata.com
- YOUR_TELEGRAM_CHAT_ID - Your Telegram chat ID (optional)
- your-email@gmail.com - Your sender email
- alert-recipient@gmail.com - Alert recipient email

### 3. Stock Symbol Format
- **US stocks**: Use simple symbols like AAPL, TSLA, MSFT
- **Indian stocks**: Use the .BSE or .NSE suffix, like TCS.NSE, RELIANCE.BSE

### 4. Credentials Setup in n8n
- **Google Sheets**: Service Account credentials
- **Email**: SMTP credentials
- **Telegram**: Bot token (optional)

## Example Google Sheet Data

| symbol | upper_limit | lower_limit | direction | cooldown_minutes |
|---|---|---|---|---|
| TCS.NSE | 4000 | 3600 | both | 15 |
| AAPL | 180 | 160 | both | 10 |
| RELIANCE.BSE | 2800 | 2600 | above | 20 |

## Output Example
Alert: TCS crossed the upper limit. Current Price: ₹4100, Upper Limit: ₹4000.
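As referenced in step 5, the heart of the workflow is the cooldown decision. A sketch of that logic, using the sheet columns described above (this is an illustration, not the template's actual Code node):

```javascript
// Sketch of the smart-alert decision for one watchlist row.
function shouldAlert(row, currentPrice, now = new Date()) {
  const crossedUpper = currentPrice > Number(row.upper_limit);
  const crossedLower = currentPrice < Number(row.lower_limit);

  // Respect the row's direction setting (both / above / below).
  const directionOk =
    (row.direction === "both" && (crossedUpper || crossedLower)) ||
    (row.direction === "above" && crossedUpper) ||
    (row.direction === "below" && crossedLower);
  if (!directionOk) return false;

  // Cooldown: skip if the last alert fired too recently.
  if (row.last_alert_time) {
    const elapsedMin = (now - new Date(row.last_alert_time)) / 60000;
    if (elapsedMin < Number(row.cooldown_minutes)) return false;
  }
  return true;
}

// Example: TCS.NSE at ₹4100 with no prior alert → true (upper limit crossed)
shouldAlert({ upper_limit: 4000, lower_limit: 3600, direction: "both",
              cooldown_minutes: 15, last_alert_time: "" }, 4100);
```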
by Jitesh Dugar
Automatically qualify inbound demo requests, scrape prospect websites, and send AI-personalized outreach emails — all on autopilot.

## What This Workflow Does
This end-to-end lead automation workflow helps SaaS companies qualify and nurture inbound leads with zero manual work until human approval.

## Key Features
- ✅ **Smart Email Filtering** - Automatically flags personal emails (Gmail, Yahoo, etc.) and routes them to a polite regret message (see the sketch at the end of this entry)
- ✅ **Website Intelligence** - Scrapes prospect websites and extracts business context
- ✅ **AI Analysis** - Uses OpenAI to score ICP fit, identify pain points, and find personalization opportunities
- ✅ **Personalized Outreach** - AI drafts custom emails referencing specific details from the prospect's website
- ✅ **Human-in-the-Loop** - Approval gate before sending, to ensure quality control
- ✅ **Professional Branding** - Even rejected leads get a thoughtful response

## Perfect For
- B2B SaaS companies with inbound lead forms
- Sales teams drowning in demo requests
- Businesses wanting to personalize at scale
- Anyone needing intelligent lead qualification

## What You'll Need
- Jotform account (or any form tool with webhooks) — create your form for free on Jotform using this link
- OpenAI API key
- Gmail account (or any email service)
- n8n instance (cloud or self-hosted)

## Workflow Sections
1. 📧 **Lead Intake & Qualification** - Capture form submissions and filter personal emails
2. 🕷️ **Website Scraping** - Extract company information from the prospect's domain
3. ❌ **Regret Flow** - Send a polite rejection to unqualified leads
4. 🤖 **AI Analysis** - Analyze prospects and draft personalized emails
5. 📨 **Approved Outreach** - Human review + send welcome email

## Customization Tips
- Update the AI prompt with your company's ICP and value proposition
- Modify the personal email provider list based on your market
- Adjust the regret email template to match your brand voice
- Add Slack notifications for high-value leads
- Connect your CRM to log all activities

**Time Saved**: ~15-20 minutes per lead
**Lead Response**: Under 5 minutes (vs. hours/days manually)
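As referenced in the Smart Email Filtering feature, a minimal personal-email check can be as simple as a domain lookup. The provider list below is illustrative, not the template's exact list; extend it to match your market:

```javascript
// Minimal sketch of personal-email filtering for lead qualification.
const PERSONAL_DOMAINS = new Set([
  "gmail.com", "yahoo.com", "hotmail.com", "outlook.com", "icloud.com", "aol.com",
]);

function isPersonalEmail(email) {
  const domain = email.split("@")[1]?.toLowerCase() ?? "";
  return PERSONAL_DOMAINS.has(domain);
}

// Route: personal → regret flow; business → website scraping on the domain.
console.log(isPersonalEmail("jane@gmail.com"));    // true  → regret flow
console.log(isPersonalEmail("jane@acme-saas.io")); // false → scrape acme-saas.io
```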