by Vigh Sandor
Workflow Overview

This n8n workflow provides automated monitoring of YouTube channels and sends real-time notifications to RocketChat when new videos are published. It supports all YouTube URL formats, uses dual-source video fetching for reliability, and intelligently filters videos to prevent duplicate notifications.

Key Features

- **Multi-Format URL Support:** Handles @handle, /user/, and /channel/ URL formats
- **Dual Fetching Strategy:** Uses both RSS feeds and HTML scraping for maximum reliability
- **Smart Filtering:** Only notifies about videos published in the last hour
- **Shorts Exclusion:** Automatically excludes YouTube Shorts from notifications
- **Rate Limiting:** 30-second delay between notifications to prevent spam
- **Batch Processing:** Processes multiple channels sequentially
- **Error Handling:** Continues execution even if one channel fails
- **Customizable Schedule:** Default hourly checks, adjustable as needed

Use Cases

Monitor competitor channels, track favorite creators, aggregate content from multiple channels, build content curation workflows, stay updated on educational channels, monitor brand mentions, or track news channels for breaking updates.

Setup Instructions

Prerequisites
- n8n instance (self-hosted or cloud), version 1.0+
- RocketChat server with admin or bot access
- RocketChat API credentials
- Internet connectivity for YouTube access

Step 1: Obtain RocketChat Credentials

Create the bot user:
1. Log in to RocketChat as administrator
2. Navigate to Administration → Users → New
3. Fill in the details: Name (YouTube Monitor Bot), Username (youtube-bot), Email, Password, Roles (bot)
4. Click Save

Get API credentials:
1. Log in as the bot user
2. Navigate to My Account → Personal Access Tokens
3. Click Generate New Token
4. Enter the token name: n8n YouTube Monitor
5. Copy the generated token immediately
6. Note the User ID from account settings

Step 2: Configure RocketChat in n8n

1. Open the n8n web interface
2. Navigate to the Credentials section
3. Click Add Credential → RocketChat API
4. Fill in:
   - Domain: your RocketChat URL (e.g., https://rocket.yourdomain.com)
   - User: bot username (e.g., youtube-bot)
   - Password: bot password or personal access token
5. Click Save and test the connection

Step 3: Prepare RocketChat Channel

1. Create a new channel in RocketChat: youtube-notifications
2. Add the bot user to the channel: click the channel menu → Members → Add Users, search for the bot username, and click Add

Step 4: Collect YouTube Channel URLs

- Handle format: https://www.youtube.com/@ChannelHandle
- User format: https://www.youtube.com/user/Username
- Channel ID format: https://www.youtube.com/channel/UCxxxxxxxxxx

All formats are supported. Find a channel ID in the page source or use a browser extension.

Step 5: Import Workflow

1. Copy the workflow JSON
2. In n8n: Workflows → Import from File/URL
3. Paste the JSON or upload the file
4. Click Import

Step 6: Configure Channel List

1. Locate the Channel List node
2. Enter YouTube URLs in the channel_urls field, one per line:
   https://www.youtube.com/@NoCopyrightSounds/videos
   https://www.youtube.com/@chillnation/videos
3. Include the /videos suffix, or the workflow adds it automatically

Step 7: Configure RocketChat Notification

1. Locate the RocketChat Notification node
2. Replace YOUR-CHANNEL-NAME with your channel name
3. Select the RocketChat credential
4. Customize the message template if needed

Step 8: Configure Schedule (Optional)

Default: every 1 hour. To change it, open the Hourly Check node and modify the interval (Minutes, Hours, Days).

Recommended intervals:
- Every hour (default): good balance
- Every 30 minutes: more frequent
- Every 2 hours: less frequent
- Avoid intervals shorter than 15 minutes

Important: YouTube RSS updates roughly every 15 minutes, and hourly checks match the 1-hour filter window.

Step 9: Test the Workflow

1. Click the Execute Workflow button
2. Monitor the execution (green = success, red = errors)
3. Check node outputs:
   - Channel List: shows URLs
   - Filter New Videos: shows found videos (may be empty)
   - RocketChat Notification: shows sent messages
4. Verify notifications in RocketChat

Receiving no notifications is normal if no videos were posted in the last hour.

Step 10: Activate Workflow

1. Toggle the Active switch in the top-right
2. The workflow now runs on schedule automatically
3. Monitor the RocketChat channel for notifications

How to Use

Understanding Workflow Execution

On the default hourly schedule, the workflow executes every hour, checks all channels, processes videos from the last 60 minutes, and prevents duplicate notifications. Execution duration is roughly 1–5 minutes for 10 channels; rate limiting adds 30 seconds per video.

Adding new channels: open the Channel List node, add the new URL on a new line, and save (Ctrl+S). The change takes effect on the next run.

Removing channels: open the Channel List node, delete the line or comment it out with # at the start, then save your changes.

Changing check frequency: open the Hourly Check node and modify the interval. If you change it from hourly, update the Filter New Videos node: find cutoffDate.setHours(cutoffDate.getHours() - 1); and change -1 to match the new interval (-2 for 2 hours, -6 for 6 hours). Important: the time window should match or exceed the check interval.

Understanding Video Sources

RSS feed (primary): official YouTube RSS, fast and reliable, with a 5–15 minute delay for new videos, and structured data.
HTML scraping (fallback): immediate results, works when RSS is unavailable, but more fragile.

Benefits of the dual approach:
- Reliability: if one source fails, the other still works
- Speed: scraping catches videos immediately
- Completeness: RSS ensures nothing is missed
- Videos are deduplicated automatically

Excluding YouTube Shorts

Shorts are filtered by checking the URL for the /shorts/ path. To include Shorts, open the Filter New Videos node, find if (videoUrl && !videoUrl.includes('/shorts/')) and remove the !videoUrl.includes('/shorts/') check. (A sketch of this node's logic appears under Technical Reference below.)

Rate Limiting

A 30-second wait between notifications prevents flooding RocketChat, allows users to read each notification, and avoids rate limits. Impact: 5 videos = 2.5 minutes, 10 videos = 5 minutes. To adjust it, open the Wait 30 sec node and change the amount field (15–60 seconds recommended).

Handling Multiple Channels

Channels are processed sequentially: this prevents overwhelming the workflow, ensures reliable execution, and means one failed channel doesn't stop the others. A practical limit is 20–50 channels per workflow.

FAQ

Q: How many channels can I monitor?
A: 20–50 per workflow is recommended. Split into multiple workflows for more.

Q: Why use both RSS and scraping?
A: RSS is reliable but delayed; scraping is immediate but fragile. Using both ensures no videos are missed.

Q: Can I exclude specific video types?
A: Yes, add filtering logic in the Filter New Videos node. Shorts are already excluded.

Q: Will this get my IP blocked?
A: Unlikely with hourly checks. Don't check more often than every 15 minutes.

Q: How do I prevent duplicate notifications?
A: Ensure the time window matches the schedule interval. This is already implemented.

Q: What if a channel changes its handle?
A: Update the URL in the Channel List node. YouTube maintains redirects.

Q: Can I monitor playlists?
A: Not directly; this would require modifications for playlist RSS feeds.

Technical Reference

YouTube URL formats:
- Handle: https://www.youtube.com/@handlename
- User: https://www.youtube.com/user/username
- Channel ID: https://www.youtube.com/channel/UCxxxxxx

RSS feed format: https://www.youtube.com/feeds/videos.xml?channel_id=UCxxxxxx — contains up to 15 recent videos with title, link, publish date, and thumbnail.
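The two code fragments quoted above (the cutoffDate adjustment and the /shorts/ check) both live in the Filter New Videos node. For orientation, here is a minimal sketch of what that Code node's logic might look like; the field names videoUrl, title, and published are illustrative assumptions and may differ from the workflow's actual property names.

```javascript
// n8n Code node sketch — "Filter New Videos" (field names are illustrative)
// Keeps regular videos published within the last hour and drops duplicates
// found by both the RSS feed and the HTML scraper.
const cutoffDate = new Date();
cutoffDate.setHours(cutoffDate.getHours() - 1); // change -1 to match a longer check interval

const seen = new Set();
const results = [];

for (const item of $input.all()) {
  const { videoUrl, title, published } = item.json;

  // Skip items without a URL and exclude YouTube Shorts
  if (!videoUrl || videoUrl.includes('/shorts/')) continue;

  // Deduplicate: the same video can arrive from RSS and from scraping
  if (seen.has(videoUrl)) continue;
  seen.add(videoUrl);

  // Only keep videos published inside the notification window
  if (new Date(published) >= cutoffDate) {
    results.push({ json: { videoUrl, title, published } });
  }
}

return results;
```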
APIs Used: YouTube RSS (public), RocketChat API (requires authentication)
License: Open for modification and commercial use
by Milan Vasarhelyi - SmoothWork
Overview

This workflow automates the complete purchase order cycle for Airtable-based inventory management using n8n. It continuously monitors your stock levels, automatically creates purchase orders when inventory falls below reorder thresholds, and sends formatted order emails to suppliers—eliminating manual tracking and reducing stockout risk.

The system handles three critical automation processes: ensuring each supplier always has a draft purchase order ready, intelligently adding products to those orders based on forecasted stock levels versus threshold quantities, and automatically emailing suppliers when orders are approved for sending.

Key Features

- **Automated stock monitoring:** Hourly checks identify products that need reordering based on customizable threshold levels
- **Smart purchase order management:** Automatically creates and maintains draft purchase orders for each supplier
- **Supplier email automation:** Sends formatted order details directly to suppliers when purchase orders are ready
- **Movement-based ledger:** Tracks every stock-in and stock-out transaction for complete audit trails
- **Test data generator:** Includes a manual-trigger workflow to simulate random sales orders for testing

Setup Requirements

Required first step: copy the Airtable inventory management base into your own Airtable account from https://airtable.com/appN9ivOwGQt1FwT5/shr1ApcBSi4SOVoPh

After copying the base, you'll need to configure:
- **Airtable credentials:** Personal Access Token with read/write permissions to your copied base
- **Gmail credentials:** OAuth2 connection for sending purchase order emails to suppliers
- **Base connections:** Update all Airtable nodes to point to your copied base URL and table IDs

Configuration

The workflow runs on an hourly schedule by default to check for products needing reorder. You can adjust this frequency in the "Check Products Hourly" Schedule Trigger node based on your business needs.

All supplier-specific settings, including email addresses, reorder thresholds, and refill quantities, are managed directly in the Airtable base—not in the workflow itself. This allows non-technical team members to adjust inventory parameters without touching the automation.

The included test workflow ("Generate random SO") is manual-trigger only and simulates daily sales by randomly reducing product quantities, making it easy to test the reorder automation without waiting for real sales data.
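To make the reorder decision described above concrete, here is a minimal sketch of the kind of logic that sits between reading products from Airtable and updating each supplier's draft purchase order. The field names (Forecasted Stock, Reorder Threshold, Refill Quantity, Supplier, Name) are assumptions based on the description, not necessarily the columns used in the actual base.

```javascript
// n8n Code node sketch — select products below their reorder threshold and group them by supplier.
// Field names are illustrative; align them with your copied Airtable base.
const bySupplier = {};

for (const item of $input.all()) {
  const p = item.json;
  if (p['Forecasted Stock'] <= p['Reorder Threshold']) {
    const supplier = p['Supplier'];
    if (!bySupplier[supplier]) bySupplier[supplier] = [];
    bySupplier[supplier].push({ product: p['Name'], quantity: p['Refill Quantity'] });
  }
}

// Emit one item per supplier, ready to merge into that supplier's draft purchase order.
return Object.entries(bySupplier).map(([supplier, lines]) => ({ json: { supplier, lines } }));
```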
by Ladies Build With AI
Who is it for

This workflow is designed for anyone who wants to simplify email automation without leaving Google Sheets; you can even send out emails automatically without visiting Google Sheets at all. It's especially useful for:
- Marketers sending bulk or personalized campaigns
- Recruiters managing outreach from candidate lists
- Small business owners who want automated follow-ups
- Anyone who wants to trigger emails directly from sheet updates, e.g. event updates

How it works

The workflow connects Google Sheets with Gmail to let you send emails in either of two ways:
- Bulk emails (mail merge): Use data from your sheet to send an email to multiple email addresses, one by one.
- Triggered emails: Automatically send an email whenever specific values or conditions in your sheet are met.

There is no need to manually copy, paste, or switch to Gmail, because the process is fully automated.

How to set it up

1. Make a copy of this Google Sheets template: https://docs.google.com/spreadsheets/d/1fWg_GOU0m_2cQpah7foDiz1WqTRKjCbJJCLBGCvJlXc/edit?usp=sharing
2. Connect your Google Sheets and Gmail accounts to this workflow in n8n.
3. Select the spreadsheet and sheet you want to use.
4. Customize the email nodes with your subject line, body text, and variables (e.g., names or links from your sheet).
5. Test the workflow, then activate it to start sending emails automatically.

For a step-by-step walkthrough, check out this video guide on YouTube: https://www.youtube.com/watch?v=XJQ0W3yWR-0

Requirements

- A Google Sheets account with your data organized in rows and columns
- A Gmail account for sending emails
- An active n8n account to run the workflow
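For the mail-merge path, the per-row personalization can be pictured as a small Code node (or equivalent expressions in the Gmail node) that turns each sheet row into a ready-to-send message. The column names Name, Email, and Link below are assumptions; substitute whatever columns your sheet actually uses.

```javascript
// n8n Code node sketch — build one personalized email per spreadsheet row.
// Column names (Name, Email, Link) are illustrative placeholders.
return $input.all().map(item => {
  const row = item.json;
  return {
    json: {
      to: row.Email,
      subject: `Quick update for ${row.Name}`,
      body: `Hi ${row.Name},\n\nHere is the link we discussed: ${row.Link}\n\nBest regards`,
    },
  };
});
```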
by Xavier Tai
🔄 Daily Follow-Up System with Multi-Stage Sequences

What It Does

Automatically sends timed follow-up emails to leads based on a 4-stage sequence (Day 1, 3, 7, 14), updates tracking automatically, and calculates next follow-up dates. Set it once, add leads, and never manually track follow-ups again. It converts cold leads into warm opportunities through consistent, professional touchpoints.

How It Works

1. Schedule Trigger → runs daily at 9 AM
2. Read Tracker Sheet → gets all leads from the Follow-Up Tracker
3. Filter Today's Follow-Ups → only processes leads where "Next Follow-Up Date" = today
4. Process Individually → handles each lead one at a time (prevents rate limits)
5. Route by Stage → routes to the appropriate email based on the Day 1/3/7/14 stage
6. Send Stage Email → 4 different templates, one for each follow-up milestone
7. Update Last Sent → records when the email was sent
8. Calculate Next Date → automatically schedules the next follow-up (or marks the lead complete)

🚀 SETUP INSTRUCTIONS

Step 1: Create Follow-Up Tracker Sheet
- Create a Google Sheet with a tab named "Follow-Up Tracker"
- Add columns: Name | Email | Project/Interest | Timeline | Next Step | Stage | Next Follow-Up Date | Last Sent Date | Status
- Populate with leads: set Stage = "Day 1", Status = "Active", Next Follow-Up Date = desired start date
- Update YOUR_GOOGLE_SHEET_ID in nodes 2, 7, and 8

Step 2: Configure Email Templates
- Edit nodes 6-9 with your email templates
- Replace YOUR_CALENDAR_LINK with your actual booking link (Calendly, etc.)
- Replace YOUR_RESOURCE_LINK in the Day 3 email with relevant content
- Customize the sender name/signature in all templates

Step 3: Set Up Gmail Connection
- Add Gmail OAuth2 credentials to all email nodes
- Test the workflow with one test lead first
- Monitor Gmail sending limits (500/day for free accounts)

Step 4: Test the Sequence
- Add one test lead with Next Follow-Up Date = today
- Manually execute the workflow to verify the email sends
- Check that the Google Sheet updates correctly
- Verify that the next stage is calculated properly
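The "Calculate Next Date" step above can be pictured as a small Code node that advances each lead through the Day 1 → 3 → 7 → 14 sequence. This is a sketch only, using the column names from Step 1; the actual node may implement the same idea differently.

```javascript
// n8n Code node sketch — advance a lead to its next follow-up stage, or mark it complete.
// Uses the tracker columns described in Step 1 (Stage, Next Follow-Up Date, Last Sent Date, Status).
const gaps = {
  'Day 1': { next: 'Day 3', days: 2 },   // Day 1 -> Day 3
  'Day 3': { next: 'Day 7', days: 4 },   // Day 3 -> Day 7
  'Day 7': { next: 'Day 14', days: 7 },  // Day 7 -> Day 14
};

return $input.all().map(item => {
  const lead = { ...item.json };
  const step = gaps[lead.Stage];
  const today = new Date();

  if (step) {
    const nextDate = new Date(today);
    nextDate.setDate(nextDate.getDate() + step.days);
    lead.Stage = step.next;
    lead['Next Follow-Up Date'] = nextDate.toISOString().split('T')[0];
  } else {
    lead.Status = 'Complete'; // the Day 14 email was the last one in the sequence
  }

  lead['Last Sent Date'] = today.toISOString().split('T')[0];
  return { json: lead };
});
```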
by Chandan Singh
++Who’s it for++

This template is ideal for anyone who needs reliable, real-time visibility into failed executions in n8n. Whether you’re a developer, operator, founder, or part of a small team, this workflow helps you detect issues quickly without digging through execution logs. It’s especially useful for users who want the flexibility to enable AI-powered diagnostics when needed.

++What it does++

The workflow sends an automated email alert whenever any workflow in your n8n instance encounters an error. It captures key details such as workflow name, timestamp, node name, and error message. If you enable AI analysis, the alert also includes a Severity Level and a Quick Resolution—giving you an instant, actionable understanding of the problem. If AI is disabled, you receive a clean, minimal error notification.

++How it works++

1. Error Trigger activates when any workflow fails.
2. Config — Set Fields stores your SMTP settings and the AnalyzeErrorWithAI toggle.
3. Use AI Analysis? decides whether to run the AI node.
4. If enabled, Analyze Error with AI generates structured recommendations.
5. Format Email Body builds the message based on the selected mode.
6. Send Email delivers the notification.

++Requirements++

1. SMTP credentials
2. A valid sender and recipient email
3. Optional: OpenAI credentials if using AI analysis

++How to set up++

1. Open the Config node and fill in the email settings and the AI toggle.
2. Add your SMTP and (optional) OpenAI credentials.
3. Save, activate, and test the workflow.
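As an illustration of step 5, the Format Email Body node could be a Code node along these lines. The property names (workflowName, nodeName, timestamp, message, analyzeWithAI, severity, resolution) are assumptions about what the earlier nodes pass along, not the template's exact field names.

```javascript
// n8n Code node sketch — assemble the alert email, with or without the AI section.
// Field names are illustrative; match them to the output of your Config and AI nodes.
const err = $input.first().json;

let body =
  `Workflow: ${err.workflowName}\n` +
  `Failed node: ${err.nodeName}\n` +
  `Time: ${err.timestamp}\n` +
  `Error: ${err.message}\n`;

if (err.analyzeWithAI) {
  // Extra section produced by the optional "Analyze Error with AI" node
  body += `\nSeverity: ${err.severity}\nQuick Resolution: ${err.resolution}\n`;
}

return [{ json: { subject: `n8n failure: ${err.workflowName}`, body } }];
```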
by Razvan Bara
How it works

This workflow automates the process of fetching weather forecasts for your home location, including severe weather alerts, and sends timely notifications. It uses the Visual Crossing API for detailed weather data and integrates with Telegram (or other messaging services) for messaging and alerts.

Step-by-step

In summary, the workflow runs every hour, grabs the current day's weather conditions for [your city/location of interest], and passes along only those items that actually contain one or more weather alerts.

📅 Step 1: Hourly Trigger
The workflow begins with the Hourly Trigger node, a scheduleTrigger. This node acts as the clock that initiates the entire process at regular hourly intervals.

🌤️ Step 2: Fetch Weather Data
Immediately after the trigger, the workflow moves to the Meteo node, an httpRequest. This node makes an external API call to fetch weather data for your specified location.
- API used: Visual Crossing Web Services
- Authentication: uses your API key (key=[API KEY])
- Response format: JSON

🌪🌀 Step 3: Check for Severe Weather
The JSON weather data is analyzed, and if severe weather conditions or alerts are detected, the workflow sends the alert via your preferred communication channel(s).

Optional

You can replace the Telegram node with email, WhatsApp, or SMS notifications, or add multiple notification nodes to receive severe weather alerts across all desired channels.
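A minimal sketch of the Step 3 check, assuming the Visual Crossing JSON exposes an alerts array when warnings are active for the location (verify the exact response shape for your API plan):

```javascript
// n8n Code node sketch — pass data through only when the forecast contains active weather alerts.
// Assumes the Visual Crossing response includes "alerts" and "resolvedAddress"; adjust if yours differs.
const results = [];

for (const item of $input.all()) {
  const alerts = item.json.alerts || [];
  if (alerts.length > 0) {
    results.push({
      json: {
        location: item.json.resolvedAddress,
        alerts: alerts.map(a => ({ event: a.event, headline: a.headline })),
      },
    });
  }
}

return results; // no output items means no alert message is sent downstream
```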
by Rodrigo
How it works

This workflow automatically responds to incoming emails identified as potential leads using AI-generated text. It connects to your email inbox via IMAP, classifies incoming messages with an AI model, filters out non-leads, and sends a personalized reply to relevant messages.

Steps

1. Email Trigger (IMAP): Watches your inbox for new emails in real time.
2. Is Lead? (Message Model): Uses AI to determine whether the sender is a lead.
3. Filter: Passes only lead emails to the next step.
4. Write Customized Reply (Message Model): Generates a personalized response using AI.
5. Get Message: Retrieves the original email details to ensure correct threading.
6. Reply to Message: Sends the AI-generated reply to the sender.

Setup Instructions

1. Connect your IMAP email credentials to the first node and set the folder to watch (e.g., INBOX).
2. In the "Filter leads" node, adjust the AI prompt to match your lead qualification criteria.
3. In the "Reply with customized message" node, edit the AI prompt to reflect your product, service, or business tone.
4. Connect your Gmail (or other email provider) credentials in the Get Message and Reply to Message nodes.
5. Test with a few sample emails before activating.

Requirements

- IMAP-enabled email account (for receiving messages)
- Gmail API access (or modify to suit your email provider)
- OpenAI or other AI model credentials for message analysis and reply generation

This template is ready to use, with all steps documented inside sticky notes for easy customization.
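The Filter step can be as simple as checking the classifier's answer. Below is a sketch that assumes the "Is Lead?" model is prompted to reply with a single keyword and that its output lands in item.json.text; both the field name and the keyword are illustrative.

```javascript
// n8n Code node sketch — keep only emails the AI classified as leads.
// Assumes the classification result is a short string such as "LEAD" or "NOT_LEAD" in item.json.text.
return $input.all().filter(item =>
  (item.json.text || '').trim().toUpperCase().startsWith('LEAD')
);
```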
by Yang
📄 What this workflow does

This workflow automatically scrapes product information from any website URL entered into a Google Sheet and stores the extracted product details in another sheet. It uses Dumpling AI to extract product data such as name, price, description, and reviews.

👤 Who is this for

This is ideal for:
- Lead generation specialists capturing product info from prospect websites
- eCommerce researchers collecting data on competitor product listings
- Sales teams building enriched product databases from lead URLs
- Anyone who needs to automate product scraping from multiple websites

✅ Requirements

- A Google Sheet with a column labeled Website where URLs will be added
- A second sheet (e.g., product details) where extracted data will be saved
- Dumpling AI API access to perform the extraction
- Connected Google Sheets credentials in n8n

⚙️ How to set up

1. Replace the Google Sheet and tab IDs in the workflow with your own.
2. Make sure your source sheet includes a Website column.
3. Connect your Dumpling AI and Google Sheets credentials.
4. Make sure the output sheet has the following headers: productName, price, productDescription (the workflow supports review, but it's optional).
5. Activate the workflow to start processing new rows.

🔁 How it works (Workflow Steps)

1. Watch New Website URL in Google Sheets: Triggers when a new row is added with a website URL.
2. Extract Product Info with Dumpling AI: Sends the URL to Dumpling AI's extract endpoint using a defined schema for product details.
3. Split Extracted Products: Separates multiple products into individual items if the page contains more than one.
4. Append Product Info to Google Sheets: Adds the structured results to the specified product details sheet.

🛠️ Customization Ideas

- Add a column to store the original source URL alongside each product
- Use OpenAI to generate short SEO summaries for each product
- Add filters to ignore pages without valid product details
- Send Slack or email notifications when new products are added to the sheet
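Of the steps above, "Split Extracted Products" is the one piece of glue logic worth illustrating: when a page contains several products, the single extraction result has to be fanned out into one item per product before the append. A minimal sketch, assuming Dumpling AI returns a products array shaped like the headers listed in the setup:

```javascript
// n8n Code node sketch — fan one extraction result out into one item per product.
// Assumes the response contains a "products" array; adjust the path to match your Dumpling AI schema.
const results = [];

for (const item of $input.all()) {
  const products = item.json.products || [];
  for (const p of products) {
    results.push({
      json: {
        productName: p.productName,
        price: p.price,
        productDescription: p.productDescription,
        review: p.review || '', // optional column, per the setup notes above
      },
    });
  }
}

return results;
```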
by Ziad Adel
What this workflow does

This workflow sends a daily Slack report with the current number of subscribers in your Mailchimp list. It's a simple way to keep your marketing or growth team informed without logging into Mailchimp.

How it works

1. A Cron Trigger starts the workflow once per day (default: 09:00).
2. The Mailchimp node retrieves the total number of subscribers for a specific list.
3. The Slack node posts a formatted message with the subscriber count into your chosen Slack channel.

Pre-conditions / Requirements

- A Mailchimp account with API access enabled.
- At least one Mailchimp audience list created (you'll need the List ID).
- A Slack workspace with permission to post to your chosen channel.
- n8n connected to both Mailchimp and Slack via credentials.

Setup

Cron Trigger
- Default is set to 09:00 AM daily. Adjust the time or frequency as needed.

Mailchimp: Get Subscribers
- Connect your Mailchimp account in n8n credentials.
- Replace {{MAILCHIMP_LIST_ID}} with the List ID of the audience you want to monitor.
- To find the List ID: log into Mailchimp → Audience → All contacts → Settings → Audience name and defaults.

Slack: Send Summary
- Connect your Slack account in n8n credentials.
- Replace {{SLACK_CHANNEL}} with the name of the channel where the summary should appear (e.g., #marketing).
- The message template can be customized, e.g., to include emojis or additional Mailchimp stats.

Customization Options

- **Multiple lists:** Duplicate the Mailchimp node for different audience lists and send combined stats.
- **Formatting:** Add more details, such as new subscribers in the last 24h, by comparing with previous runs (using Google Sheets or a database).
- **Notifications:** Instead of Slack, send the update to email or Microsoft Teams by swapping the output node.

Benefits

- **Automation:** Removes the need for manual Mailchimp checks.
- **Visibility:** Keeps the whole team updated on subscriber growth in real time.
- **Motivation:** Celebrate growth milestones directly in team channels.

Use Cases

- Daily subscriber growth tracking for newsletters.
- Sharing metrics with leadership without giving Mailchimp access.
- Monitoring the effectiveness of campaigns in near real time.
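As a small illustration of the message formatting and of the "compare with previous runs" idea mentioned under Customization Options, a Code node placed between the Mailchimp and Slack nodes could look like the following. The property names (name, stats.member_count, previous_count) are assumptions about what the upstream nodes provide; check your Mailchimp node's actual output.

```javascript
// n8n Code node sketch — shape the daily subscriber summary for Slack.
// Field names are illustrative; verify what your Mailchimp node actually returns.
const list = $input.first().json;
const count = list.stats?.member_count ?? list.member_count;

let text = `📈 Mailchimp update: ${count} subscribers on "${list.name}"`;

// Optional: if a previous count was loaded earlier in the workflow (e.g. from Google Sheets),
// show the change since the last run.
if (typeof list.previous_count === 'number') {
  const diff = count - list.previous_count;
  text += ` (${diff >= 0 ? '+' : ''}${diff} since yesterday)`;
}

return [{ json: { text } }];
```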
by Shahrear
Process Physician Orders into Google Sheets with VLM Run AI Extraction

What this workflow does

1. Monitors Google Drive for new physician order files in a target folder
2. Downloads the file automatically inside n8n for processing
3. Sends the file to VLM Run for AI transcription and structured data extraction
4. Parses healthcare-specific details from the healthcare.physician-order domain as JSON
5. Appends the normalized attributes to a Google Sheet as a new row

Setup

Prerequisites: Google account, VLM Run API credentials, Google Sheets access, n8n.

Install the verified VLM Run node from the n8n node list, then click Install. Once installed, you can integrate it directly into your workflow.

Quick setup:
1. Create the Drive folder you want to watch and copy its Folder ID
2. Create a Google Sheet with headers such as: timestamp, file_name, file_id, mime_type, size_bytes, uploader_email, patient_name, patient_dob, patient_gender, patient_address, patient_phone_no, physician_name, physician_phone_no, physician_email, referring_clinic, diagnosis, exam_date, form_signed_in, …and other physician order fields as needed
3. Configure Google Drive OAuth2 for the trigger and download nodes
4. Add your VLM Run API credentials from the VLM Run Dashboard to the VLM Run node
5. Configure Google Sheets OAuth2 and set the Spreadsheet ID + target tab
6. Upload a sample physician order file to the Drive folder to test, then activate

Perfect for

- Converting physician order documents into structured, machine-readable text
- Automating extraction of patient, physician, and clinical details with VLM Run
- Creating a centralized archive of orders for compliance, auditing, or reporting
- Reducing manual data entry and ensuring consistent formatting

Key Benefits

- **End-to-end automation** from Drive upload to structured Google Sheets row
- **AI-powered accuracy** using VLM Run's healthcare-specific extraction models
- **Standardized attribute mapping** for patient and physician records
- **Instantly searchable archive** directly in Google Sheets
- **Hands-free processing** once the workflow is activated

How to customize

Extend by adding:
- Attribute-specific parsing (e.g., ICD/CPT diagnosis codes, insurance details)
- Automatic classification of orders by specialty, urgency, or exam type
- Slack, Teams, or email notifications when new physician orders are logged
- Keyword tagging for fast filtering and downstream workflows
- Error-handling rules that move failed conversions into a review folder or error sheet
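If you need to adjust the column mapping, the step that appends normalized attributes can be pictured as a Code node that flattens the extraction result together with the Drive file metadata into one sheet row. The shapes of the file and order objects below are purely illustrative assumptions; inspect the actual output of the Drive and VLM Run nodes and adapt the paths before use.

```javascript
// n8n Code node sketch — flatten file metadata plus extracted fields into one sheet row.
// The "file" and "order" shapes are hypothetical; map them to the real node outputs in your workflow.
const data = $input.first().json;
const file = data.file || {};
const order = data.order || {};

return [{
  json: {
    timestamp: new Date().toISOString(),
    file_name: file.name,
    file_id: file.id,
    mime_type: file.mimeType,
    size_bytes: file.size,
    uploader_email: file.uploaderEmail,
    patient_name: order.patient_name,
    patient_dob: order.patient_dob,
    patient_gender: order.patient_gender,
    physician_name: order.physician_name,
    referring_clinic: order.referring_clinic,
    diagnosis: order.diagnosis,
    exam_date: order.exam_date,
  },
}];
```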