by vinci-king-01
Product Price Monitor with Slack and Jira

⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow automatically scrapes multiple e-commerce sites, analyses weekly seasonal price trends, and notifies your team in Slack while opening Jira tickets for items that require price adjustments. It helps retailers plan inventory and pricing by surfacing actionable insights every week.

**Pre-conditions/Requirements**

Prerequisites:
- n8n instance (self-hosted or n8n cloud)
- ScrapeGraphAI community node installed
- Slack workspace & channel for notifications
- Jira Software project (cloud or server)
- Basic JavaScript knowledge for optional custom code edits

Required Credentials:
- **ScrapeGraphAI API Key** – Enables web scraping
- **Slack OAuth Access Token** – Required by the Slack node
- **Jira Credentials** – Email & API token (cloud) or username & password (server)
- (Optional) Proxy credentials – If target websites block direct scraping

Specific Setup Requirements:

| Resource | Purpose | Example |
|----------|---------|---------|
| Product URL list | Seed URLs to monitor | https://example.com/products-winter-sale |
| Slack Channel | Receives trend alerts | #pricing-alerts |
| Jira Project Key | Tickets are created here | ECOM |

**How it works**

Key Steps:
- **Webhook Trigger**: Kicks off the workflow via a weekly schedule or manual call.
- **Set Product URLs**: Prepares the list of product pages to analyse.
- **SplitInBatches**: Processes URLs in manageable batches to avoid rate limits.
- **ScrapeGraphAI**: Extracts current prices, stock, and seasonality hints from each URL.
- **Code (Trend Logic)**: Compares scraped prices against historical averages.
- **If (Threshold Check)**: Determines whether price deviations exceed ±10%.
- **Slack Node**: Sends a formatted message to the pricing channel for each deviation.
- **Jira Node**: Creates/updates a ticket linking to the product for further action.
- **Merge**: Collects all batch results for summary reporting.

**Set up steps**

Setup Time: 15-20 minutes

1. Install Community Nodes: In n8n, go to Settings → Community Nodes, search for "ScrapeGraphAI", and install.
2. Add Credentials:
   a. Slack → Credentials → New, paste your Bot/User OAuth token.
   b. Jira → Credentials → New, enter your domain, email/username, API token/password.
   c. ScrapeGraphAI → Credentials → New, paste your API key.
3. Import Workflow: Upload or paste the JSON template into n8n.
4. Edit the "Set Product URLs" Node: Replace placeholder URLs with your real product pages.
5. Configure Schedule: Replace the Webhook Trigger with a Cron node (e.g., every Monday 09:00) or keep it as a webhook for manual runs.
6. Map Jira Fields: In the Jira node, ensure Project Key, Issue Type (e.g., Task), and Summary fields match your instance.
7. Test Run: Execute the workflow. Confirm the Slack message appears and a Jira issue is created.
8. Activate: Toggle the workflow to Active so it runs automatically.

**Node Descriptions**

Core Workflow Nodes:
- **Webhook** – Default trigger, can be swapped with Cron for weekly automation.
- **Set (Product URLs)** – Stores an array of product links for scraping.
- **SplitInBatches** – Limits each ScrapeGraphAI call to five URLs to reduce load.
- **ScrapeGraphAI** – Crawls and parses HTML, returning JSON with title, price, availability.
- **Code (Trend Logic)** – Calculates percentage change vs. historical data (stored externally or hard-coded for demo; see the sketch at the end of this template).
- **If (Threshold Check)** – Routes items above/below the set variance.
- **Slack** – Posts a rich-format message containing product title, old vs. new price, and link.
- **Jira** – Creates or updates a ticket with priority set to **Medium** and assigns it to the Pricing team lead.
- **Merge** – Recombines batch streams for optional reporting or storage.

Data Flow:
Webhook → Set (Product URLs) → SplitInBatches → ScrapeGraphAI → Code (Trend Logic) → If → Slack / Jira → Merge

**Customization Examples**

Change the price deviation threshold:

```javascript
// Code (Trend Logic) node
const threshold = 0.05; // 5% instead of the default 10%
```

Alter the Slack message template:

```javascript
{
  "text": `${item.name} price changed from $${item.old} to $${item.new} (${item.diff}%).`,
  "attachments": [
    {
      "title": "Product Link",
      "title_link": item.url,
      "color": "#4E79A7"
    }
  ]
}
```

**Data Output Format**

The workflow outputs structured JSON data:

```json
{
  "product": "Winter Jacket",
  "url": "https://example.com/winter-jacket",
  "oldPrice": 129.99,
  "newPrice": 99.99,
  "change": -23.08,
  "scrapedAt": "2023-11-04T09:00:00Z",
  "status": "Below Threshold",
  "slackMsgId": "A1B2C3",
  "jiraIssueKey": "ECOM-101"
}
```

**Troubleshooting**

Common Issues:
- ScrapeGraphAI returns empty data – Verify selectors; many sites use dynamic rendering, which requires a headless browser flag.
- Slack message not delivered – Check that the OAuth token scopes include chat:write; also confirm the channel ID.
- Jira ticket creation fails – Field mapping mismatch; ensure the Issue Type is valid and required custom fields are supplied.

Performance Tips:
- Batch fewer URLs (e.g., 3 instead of 5) to reduce timeout risk.
- Cache historical prices in an external DB (Postgres, Airtable) instead of reading large CSVs in the Code node.

Pro Tips:
- Rotate proxies/IPs within ScrapeGraphAI to bypass aggressive e-commerce anti-bot measures.
- Add a Notion or Sheets node after Merge for historical logging.
- Use the Error Trigger workflow in n8n to alert when ScrapeGraphAI fails more than X times per run.
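As referenced in the node descriptions, here is a minimal sketch of what the Code (Trend Logic) node could look like, assuming each scraped item carries `url` and `price` fields and the historical averages are hard-coded for the demo (the field names and the `historicalPrices` map are illustrative assumptions, not the template's exact code):

```javascript
// Code (Trend Logic) node – minimal demo sketch.
// Historical averages are hard-coded here; in production you would
// read them from an external DB as suggested in the performance tips.
const historicalPrices = {
  "https://example.com/winter-jacket": 129.99,
};

const threshold = 0.10; // default ±10% deviation

return items.map((item) => {
  const url = item.json.url;
  const newPrice = Number(item.json.price);
  const oldPrice = historicalPrices[url] ?? newPrice;

  // Percentage change vs. the historical average
  const change = oldPrice ? ((newPrice - oldPrice) / oldPrice) * 100 : 0;

  return {
    json: {
      ...item.json,
      oldPrice,
      newPrice,
      change: Number(change.toFixed(2)),
      exceedsThreshold: Math.abs(change) >= threshold * 100,
    },
  };
});
```

The `exceedsThreshold` flag is what the downstream If (Threshold Check) node would branch on.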
by Kevin Meneses
What this workflow does

This workflow automatically audits web pages for SEO issues and generates an executive-friendly SEO report using AI. It is designed for marketers, founders, and SEO teams who want fast, actionable insights without manually reviewing HTML, meta tags, or SERP data.

The workflow:
- Reads URLs from Google Sheets
- Scrapes page content using Decodo (reliable scraping, even on protected sites)
- Extracts key SEO elements (title, meta description, canonical, H1/H2, visible text) — see the sketch after this section
- Uses an AI Agent to analyze the page and generate:
  - Overall SEO status
  - Top issues
  - Quick wins
  - Title & meta description recommendations
- Saves results to Google Sheets
- Sends a formatted HTML executive report by email (Gmail)

Who this workflow is for

This template is ideal for:
- SEO consultants and agencies
- SaaS marketing teams
- Founders monitoring their landing pages
- Content teams doing SEO quality control

It focuses on on-page SEO fundamentals, not backlink analysis or technical crawling.

Setup (step by step)

1. Google Sheets
   - Create an input sheet with one URL per row
   - Create an output sheet to store SEO results
2. Decodo
   - Add your Decodo API credentials
   - The URL is automatically taken from the input sheet
   - 👉 Decodo – Web Scraper for n8n
3. AI Agent
   - Connect your LLM credentials (OpenAI, Gemini, etc.)
   - The prompt is already optimized for non-technical SEO summaries
4. Gmail
   - Connect your Gmail account
   - Set the recipient email address
   - Emails are sent in clean HTML format

Notes & disclaimer

- This is a community template
- Results depend on page accessibility and content structure
- It focuses on on-page SEO, not backlinks or rankings
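For illustration, the extraction step could pull the basic on-page elements out of raw HTML in a Code node along these lines; a rough sketch assuming the scraper returns the page source in an `html` field (the field name and regex-based parsing are assumptions, not the template's exact implementation):

```javascript
// Code node sketch: pull basic on-page SEO elements out of raw HTML.
const html = $json.html || '';

const pick = (re) => {
  const m = html.match(re);
  return m ? m[1].trim() : null;
};

return [{
  json: {
    title: pick(/<title[^>]*>([\s\S]*?)<\/title>/i),
    metaDescription: pick(/<meta[^>]+name=["']description["'][^>]+content=["']([^"']*)["']/i),
    canonical: pick(/<link[^>]+rel=["']canonical["'][^>]+href=["']([^"']*)["']/i),
    h1: pick(/<h1[^>]*>([\s\S]*?)<\/h1>/i),
    // Rough visible-text sample for the AI prompt (first 2000 chars, tags stripped)
    textSample: html
      .replace(/<script[\s\S]*?<\/script>/gi, '')
      .replace(/<style[\s\S]*?<\/style>/gi, '')
      .replace(/<[^>]+>/g, ' ')
      .replace(/\s+/g, ' ')
      .slice(0, 2000),
  },
}];
```

Regex parsing is brittle on unusual attribute ordering; it is shown here only to make the extracted fields concrete.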
by DataMinex
Transform property searches into personalized experiences! This powerful automation delivers dream home matches straight to clients' inboxes with professional CSV reports - all from a simple web form.

🚀 What this workflow does

Create a complete real estate search experience that works 24/7:
- ✨ Smart Web Form - Beautiful property search form captures client preferences
- 🧠 Dynamic SQL Builder - Intelligently creates optimized queries from user input (see the query-builder sketch below)
- ⚡ Lightning Database Search - Scans 1000+ properties in milliseconds
- 📊 Professional CSV Export - Excel-ready reports with complete property details
- 📧 Automated Email Delivery - Personalized emails with property previews and attachments

🎯 Perfect for:
- **Real Estate Agents** - Generate leads and impress clients with instant service
- **Property Managers** - Automate tenant matching and recommendations
- **Brokerages** - Provide 24/7 self-service property discovery
- **Developers** - Showcase available properties with professional automation

💡 Why this workflow is a game-changer

> "From property search to professional report delivery in under 30 seconds!"

- ⚡ Instant Results: Zero wait time for property matches
- 🎨 Professional Output: Beautiful emails that showcase your expertise
- 📱 Mobile Optimized: Works flawlessly on all devices
- 🧠 Smart Filtering: Only searches criteria clients actually specify
- 📈 Infinitely Scalable: Handles unlimited searches simultaneously

📊 Real Estate Data Source

Built on authentic US market data from GitHub:
- 🏘️ 1000+ Real Properties across all US states
- 💰 Actual Market Prices from legitimate listings
- 🏠 Complete Property Details (bedrooms, bathrooms, square footage, lot size)
- 📍 Verified Locations with accurate cities, states, and ZIP codes
- 🏢 Broker Information for authentic real estate context

🛠️ Quick Setup Guide

Prerequisites Checklist ✅
- [ ] SQL Server database (MySQL/PostgreSQL also supported)
- [ ] Gmail account for automated emails
- [ ] n8n instance (cloud or self-hosted)
- [ ] 20 minutes setup time

Step 1: Import Real Estate Data 📥
1. 🌟 Download the data
2. 💾 Download the CSV file (1000+ properties included)
3. 🗄️ Create the SQL Server table with this exact schema:

```sql
CREATE TABLE [REALTOR].[dbo].[realtor_usa_price] (
    brokered_by BIGINT,
    status NVARCHAR(50),
    price DECIMAL(12,2),
    bed INT,
    bath DECIMAL(3,1),
    acre_lot DECIMAL(10,8),
    street BIGINT,
    city NVARCHAR(100),
    state NVARCHAR(50),
    zip_code INT,
    house_size INT,
    prev_sold_date NVARCHAR(50)
);
```

4. 📊 Import your CSV data into this table

Step 2: Configure Database Connection 🔗
1. 🔐 Set up Microsoft SQL Server credentials in n8n
2. ✅ Test the connection to ensure everything works
3. 🎯 The workflow is pre-configured for the table structure above

Step 3: Gmail Setup (The Magic Touch) 📧
1. 🌐 Visit Google Cloud Console
2. 🆕 Create a new project (or use an existing one)
3. 🔓 Enable the Gmail API in the API Library
4. 🔑 Create OAuth2 credentials (Web Application)
5. ⚙️ Add your n8n callback URL to the authorized redirects
6. 🔗 Configure Gmail OAuth2 credentials in n8n
7. ✨ Authorize your Google account

Step 4: Launch Your Property Search Portal 🚀
1. 📋 Import this workflow template (the form is pre-configured)
2. 🌍 Copy your webhook URL from the Property Search Form node
3. 🔍 Test with a sample property search
4. 📨 Check email delivery with the CSV attachment
5. 🎉 Go live and start impressing clients!

🎨 Customization Playground

🏷️ Personalize Your Brand

```javascript
// Customize email subjects in the Gmail node
`🏠 Exclusive Properties Curated Just for You - ${results.length} Perfect Matches!`
`✨ Your Dream Home Portfolio - Handpicked by Our Experts`
`🎯 Hot Market Alert - ${results.length} Premium Properties Inside!`
```
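🧠 Dynamic SQL Builder Sketch

As a reference for the Dynamic SQL Builder step above, here is a minimal Code-node sketch. It assumes the form fields arrive as `city`, `state`, `minPrice`, `maxPrice`, and `minBeds`; those names and the basic escaping are illustrative assumptions, not the template's exact implementation:

```javascript
// Code node sketch: build a WHERE clause from only the criteria the
// client actually filled in, then pass the query to the SQL Server node.
const f = $json; // form submission fields (names assumed for illustration)
const esc = (s) => String(s).replace(/'/g, "''"); // basic T-SQL quote escaping

const conditions = [];
if (f.city) conditions.push(`city = '${esc(f.city)}'`);
if (f.state) conditions.push(`state = '${esc(f.state)}'`);
if (f.minPrice) conditions.push(`price >= ${Number(f.minPrice)}`);
if (f.maxPrice) conditions.push(`price <= ${Number(f.maxPrice)}`);
if (f.minBeds) conditions.push(`bed >= ${Number(f.minBeds)}`);

const where = conditions.length ? `WHERE ${conditions.join(' AND ')}` : '';
const query = `SELECT TOP 50 * FROM [REALTOR].[dbo].[realtor_usa_price] ${where} ORDER BY price ASC;`;

return [{ json: { query } }];
```

Numeric fields are coerced with `Number()` and strings quote-escaped to limit injection risk; for production, prefer parameterized queries where your database node supports them.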
🔧 Advanced Enhancements
- 🎨 **HTML Email Templates**: Create stunning visual emails with property images
- 📊 **Analytics Dashboard**: Track popular searches and user engagement
- 🔔 **Smart Alerts**: Set up automated price drop notifications
- 📱 **Mobile Integration**: Connect to React Native or Flutter apps
- 🤖 **AI Descriptions**: Add ChatGPT for compelling property descriptions

🌍 Multi-Database Flexibility

```javascript
// Easy database switching
// MySQL: Replace the Microsoft SQL node → MySQL node
// PostgreSQL: Swap for the PostgreSQL node
// MongoDB: Use the MongoDB node with JSON queries
// Even CSV files: Use CSV reading nodes for smaller datasets
```

🚀 Advanced Features & Extensions

🔥 Pro Tips for Power Users
- 🔄 **Bulk Processing**: Handle multiple searches simultaneously
- 💾 **Smart Caching**: Store popular searches for lightning-fast results
- 📈 **Lead Scoring**: Track which properties generate the most interest
- 📅 **Follow-up Automation**: Schedule nurturing email sequences

🎯 Integration Possibilities
- 🏢 **CRM Connection**: Auto-add qualified leads to your CRM
- 📅 **Calendar Integration**: Add property viewing scheduling
- 📊 **Price Monitoring**: Track market trends and price changes
- 📱 **Social Media**: Auto-share featured properties to social platforms
- 💬 **Chat Integration**: Connect to WhatsApp or SMS for instant alerts

🔗 Expand Your Real Estate Automation

🌟 Related Workflow Ideas
- 🤖 AI Property Valuation - Add machine learning for price predictions
- 📊 Market Analysis Reports - Generate comprehensive market insights
- 📱 SMS Property Alerts - Instant text notifications for hot properties
- 🏢 Commercial Property Search - Adapt for office and retail spaces
- 💹 Investment ROI Calculator - Add financial analysis for investors
- 🏘️ Neighborhood Analytics - Include school ratings and demographics

🛠️ Technical Extensions
- 📷 Image Processing: Auto-resize and optimize property photos
- 🗺️ Map Integration: Add interactive property location maps
- 📱 Progressive Web App: Create a mobile app experience
- 🔔 Push Notifications: Real-time alerts for saved searches

🚀 Get Started Now
1. Import this workflow template
2. Configure your database and Gmail
3. Customize branding and messaging
4. Launch your professional property search portal
5. Watch client satisfaction soar!
by Omer Fayyaz
Automatically discover and extract article URLs from any website using AI to identify valid content links while filtering out navigation, category pages, and irrelevant content—perfect for building content pipelines, news aggregators, and research databases.

What Makes This Different:
- **AI-Powered Intelligence** - Uses GPT-5-mini to understand webpage context and identify actual articles vs navigation pages, eliminating false positives
- **Browser Spoofing** - Includes realistic User-Agent headers and request patterns to avoid bot detection on publisher sites
- **Smart URL Normalization** - Automatically strips tracking parameters (`utm_*`, `fbclid`, etc.), removes duplicates, and standardizes URLs
- **Source Categorization** - AI assigns logical source names based on domain and content type for easy filtering
- **Rate Limiting Built-In** - Configurable delays between requests prevent IP blocking and respect website resources
- **Deduplication on Save** - Google Sheets append-or-update pattern ensures no duplicate URLs in your database

Key Benefits of AI-Powered Content Discovery:
- **Save 10+ Hours Weekly** - Automate manual link hunting across dozens of publisher sites
- **Higher Quality Results** - AI filters out 95%+ of junk links (nav pages, categories, footers) that rule-based scrapers miss
- **Scale Effortlessly** - Add new seed URLs to your sheet and the same workflow handles any website structure
- **Industry Agnostic** - Works for news, blogs, research papers, product pages—any content type
- **Always Up-to-Date** - Schedule daily runs to catch new content as it's published
- **Full Audit Trail** - Track discovered URLs with timestamps and sources in Google Sheets

Who's it for

This template is designed for content marketers, SEO professionals, researchers, media monitors, and anyone who needs to aggregate content from multiple sources. It's perfect for organizations that need to track competitor blogs, curate industry news, build research databases, monitor brand mentions, or aggregate content for newsletters without manually checking dozens of websites daily or writing complex scraping rules for each source.

How it works / What it does

This workflow creates an intelligent content discovery pipeline that automatically finds and extracts article URLs from any webpage. The system:
1. Reads Seed URLs - Pulls a list of webpages to crawl from your Google Sheets (blog indexes, news feeds, publication homepages)
2. Fetches with Stealth - Downloads each webpage's HTML using browser-like headers to avoid bot detection
3. Converts for AI - Transforms messy HTML into clean Markdown that the AI can easily process
4. AI Extraction - GPT-5-mini analyzes the content and identifies valid article URLs while filtering out navigation, categories, and junk links
5. Normalizes & Saves - Cleans URLs (removes tracking params), deduplicates, and saves to Google Sheets with source tracking (a sketch of this step follows this section)

Key Innovation: Context-Aware Link Filtering - Unlike traditional scrapers that rely on CSS selectors or URL patterns (which break when sites update), the AI understands the semantic difference between an article link and a navigation link. It reads the page like a human would, identifying content worth following regardless of the website's structure.

How to set up

1. Create Your Google Sheets Database
   - Create a new Google Spreadsheet with two sheets:
     - "Seed URLs" - Add a URL column with webpages to crawl (blog homepages, news feeds, etc.)
     - "Discovered URLs" - Add columns: URL, Source, Status, Discovered At
   - Add 3-5 seed URLs to start (e.g., https://abc.com/, https://news.xyz.com/)
2. Connect Your Credentials
   - **Google Sheets**: Click the "Read Seed URLs" and "Save Discovered URLs" nodes → Select your Google Sheets account
   - **OpenAI**: Click the "OpenAI GPT-5-mini" node → Add your OpenAI API key
   - Select your spreadsheet and sheet names in both Google Sheets nodes
3. Customize the AI Prompt (Optional)
   - Open the "AI URL Extractor" node
   - Modify the system message to add industry-specific rules, for example:

     // Example: Add to the system message for tech blogs
     For tech sites, also extract:
     - Tutorial and guide URLs
     - Product announcement pages
     - Changelog and release notes

   - Adjust source naming conventions to match your taxonomy
4. Test Your Configuration
   - Click "Test Workflow" or use the Manual Trigger
   - Check the execution to verify:
     - Seed URLs are being read correctly
     - HTML is fetched successfully (check for 200 status)
     - AI returns a valid JSON array of URLs
     - URLs are saved to your output sheet
   - Review the "Discovered URLs" sheet for results
5. Schedule and Monitor
   - Adjust the Schedule Trigger (default: daily at 6 AM)
   - Enable the workflow to run automatically
   - Monitor execution logs for errors:
     - Rate limiting: Increase the wait time if sites block you
     - Empty results: Check if seed URLs have changed structure
     - AI errors: Review the AI output in the execution data
   - Set up error notifications via email or Slack (add nodes after Completion Summary)

Requirements

- **Google Sheets Account** - OAuth2 connection for reading seed URLs and saving results
- **OpenAI API Key** - For GPT-5-mini (or swap for any LangChain-compatible LLM)
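As a concrete reference for the normalize-and-save step, the URL cleanup could be done in a Code node like this sketch; it assumes the AI node outputs an array of URL strings in `$json.urls` (the field name and the exact tracking-parameter list are assumptions):

```javascript
// Code node sketch: strip tracking parameters, standardize, and dedupe URLs.
const TRACKING = /^(utm_|fbclid|gclid|mc_cid|mc_eid)/i;

const seen = new Set();
const cleaned = [];

for (const raw of $json.urls || []) {
  try {
    const url = new URL(raw);
    // Drop tracking parameters such as utm_source, fbclid, gclid
    for (const key of [...url.searchParams.keys()]) {
      if (TRACKING.test(key)) url.searchParams.delete(key);
    }
    url.hash = '';                          // fragments never change the article
    url.hostname = url.hostname.toLowerCase();
    const normalized = url.toString().replace(/\/$/, ''); // trim trailing slash
    if (!seen.has(normalized)) {
      seen.add(normalized);
      cleaned.push({ json: { URL: normalized } });
    }
  } catch (e) {
    // Skip anything that is not a valid absolute URL
  }
}

return cleaned;
```

Each output item maps onto one row of the "Discovered URLs" sheet, where the append-or-update pattern handles any remaining duplicates across runs.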
by WeblineIndia
Automated Failed Login Detection with Jira Security Tasks, Slack Notifications

Webhook: Failed Login Attempts → Jira Security Case → Slack Warnings

This n8n workflow monitors failed login attempts from any application, normalizes incoming data, detects repeated attempts within a configurable time window and automatically:
- Sends detailed alerts to Slack,
- Creates Jira security tasks (single or grouped based on repetition),
- Logs all failed login attempts into a Notion database.

It ensures fast, structured and automated responses to potential account compromise or brute-force attempts while maintaining persistent records.

Quick Implementation Steps
1. Import this JSON workflow into n8n.
2. Connect your application to the failed-login webhook endpoint.
3. Add Jira Cloud API credentials.
4. Add Slack API credentials.
5. Add Notion API credentials and configure the database for storing login attempts.
6. Enable the workflow — done!

What It Does

Receives Failed Login Data
- Accepts POST requests containing failed login information.
- Normalizes the data, ensuring consistent fields: username, ip, timestamp and error.

Validates Input
- Checks for missing username or IP.
- Sends a Slack alert if any required field is missing.

Detects Multiple Attempts
- Uses a sliding time window (default: 5 minutes) to detect multiple failed login attempts from the same username + IP.
- Single attempts → standard Jira task + Slack notification.
- Multiple attempts → grouped Jira task + detailed Slack notification.

Logs Attempts in Notion
- Records all failed login events into a Notion database with fields: Username, IP, Total Attempts, Attempt List, Attempt Type.

Formats Slack Alerts
- Single attempt → lightweight notification.
- Multiple attempts → summary including timestamps, errors, total attempts, and the Jira ticket link.

Who's It For

This workflow is ideal for:
- Security teams monitoring authentication logs.
- DevOps/SRE teams maintaining infrastructure access logs.
- SaaS platform teams with high login traffic.
- Organizations aiming to automate breach detection.
- Teams using Jira + Slack + Notion + n8n for incident workflows.

Requirements
- n8n (Self-Hosted or Cloud).
- Your application must POST failed login data to the webhook.
- Jira Software Cloud credentials (Email, API Token, Domain).
- Slack Bot Token with message-posting permissions.
- Notion API credentials with access to a database.
- Basic understanding of your login event sources.

How It Works
1. Webhook Trigger: The workflow starts when a failed-login event is sent to the failed-login webhook.
2. Normalization: Converts single objects or arrays into a uniform format. Ensures username, IP, timestamp and error are present. Prepares a logMessage for the Slack and Jira nodes.
3. Validation: An IF node checks whether username and IP exist. If missing → Slack alert for missing information.
4. Multiple Attempt Detection: A Function node detects repeated login attempts within a 5-minute sliding window and flags attempts as multiple: true or false. (A sketch of this node follows at the end of this section.)
5. Branching:
   - Multiple attempts → build summary, create Jira ticket, format Slack message, store in Notion.
   - Single attempts → create Jira ticket, format Slack message, store in Notion.
6. Slack Alerts:
   - Single attempt → concise message
   - Multiple attempts → detailed summary with timestamps and the Jira ticket link
7. Notion Logging: Stores username, IP, total attempts, attempt list, attempt type in a dedicated database for recordkeeping.

How To Set Up
1. Import Workflow → Workflows → Import from File in n8n.
2. Webhook Setup → copy the URL from the Failed Login Trigger node and integrate it with your application.
3. Jira Credentials → connect your Jira account to both Jira nodes and configure the project/issue type.
4. Slack Credentials → connect your Slack Bot and select the alert channel.
5. Notion Credentials → connect your Notion account and select the database for storing login attempts.
6. Test the Workflow → send sample events: missing fields, single attempts, multiple attempts.
7. Enable Workflow → turn on the workflow once testing passes.

Logic Overview

| Step | Node | Description |
|------|------|-------------|
| Normalize input | Normalize Login Event | Ensures each event has the required fields and prepares a logMessage. |
| Validate fields | Check Username & IP present | IF node → alerts Slack if data is incomplete. |
| Detect repeats | Detect Multiple Attempts | Finds multiple attempts within a 5-minute window; sets the multiple flag. |
| Multiple attempts | IF - Multiple Attempts + Build Multi-Attempt Summary | Prepares a grouped summary for Slack & Jira. |
| Single attempt | Create Ticket - Single Attempt | Creates a Jira task & Slack alert for one-off events. |
| Multiple attempt ticket | Create Ticket - Multiple Attempts | Creates a detailed Jira task. |
| Slack alert formatting | Format Fields For Single/Multiple Attempt | Prepares a structured message for Slack. |
| Slack alert delivery | Slack Alert - Single/Multiple Attempts | Posts the alert in the selected Slack channel. |
| Notion logging | Login Attempts Data Store in DB | Stores structured attempt data in the Notion database. |

Customization Options
- **Webhook Node** → adjust the endpoint path for your application.
- **Normalization Function** → add fields such as device, OS, location or user-agent.
- **Multiple Attempt Logic** → change the sliding window duration or repetition threshold.
- **Jira Nodes** → modify issue type, labels or project.
- **Slack Nodes** → adjust markdown formatting, channel routing or severity-based channels.
- **Notion Node** → add or modify database fields to store additional context.

Optional Enhancements:
- Geo-IP lookup for country/city info.
- Automatic IP blocking via firewall or WAF.
- User notification for suspicious login attempts.
- Database logging in MySQL/Postgres/MongoDB.
- Threat intelligence enrichment (e.g., AbuseIPDB).

Use Case Examples
- Detect brute-force attacks targeting user accounts.
- Identify credential stuffing across multiple users.
- Monitor admin portal access failures with Jira task creation.
- Alert security teams instantly when login attempts originate from unusual locations.
- Centralize failed login monitoring across multiple applications with Notion logging.

Troubleshooting Guide

| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| Workflow not receiving data | Webhook misconfigured | Verify the webhook URL & POST payload format |
| Jira ticket creation fails | Invalid credentials or insufficient permissions | Update the Jira API token and project access |
| Slack alert not sent | Incorrect channel ID or missing bot scopes | Fix Slack credentials and permissions |
| Multiple attempts not detected | Sliding window logic misaligned | Adjust the Detect Multiple Attempts node code |
| Notion logging fails | Incorrect database ID or missing credentials | Update Notion node credentials and database configuration |
| Errors in normalization | Payload format mismatch | Update the Normalize Login Event function code |

Need Help?
If you need help setting up, customizing or extending this workflow, WeblineIndia can assist with full n8n development, workflow automation, security event processing and custom integrations.
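For reference, the Detect Multiple Attempts step could be implemented along these lines: a minimal sketch that keeps recent attempts in workflow static data, assuming normalized fields named `username`, `ip`, `timestamp`, and `error` (the persistence approach is an assumption; the template's actual node code may differ). Note that workflow static data only persists for active workflows, not manual test runs.

```javascript
// Function/Code node sketch: flag repeated failed logins from the
// same username + IP within a sliding 5-minute window.
const WINDOW_MS = 5 * 60 * 1000;

// Workflow static data survives between executions of an active workflow
const store = $getWorkflowStaticData('global');
store.attempts = store.attempts || {};

const event = $json;
const key = `${event.username}|${event.ip}`;
const now = new Date(event.timestamp || Date.now()).getTime();

// Keep only attempts still inside the window, then add this one
const recent = (store.attempts[key] || []).filter((a) => now - a.time < WINDOW_MS);
recent.push({ time: now, error: event.error });
store.attempts[key] = recent;

return [{
  json: {
    ...event,
    multiple: recent.length > 1,
    totalAttempts: recent.length,
    attemptList: recent.map((a) => new Date(a.time).toISOString()),
  },
}];
```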
by vinci-king-01
Property Listing Aggregator with Mailchimp and Notion

⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow scrapes multiple commercial-real-estate websites, consolidates new property listings into Notion, and emails weekly availability updates or immediate space alerts to a Mailchimp audience. It automates the end-to-end process so business owners can stay on top of the latest spaces without manual searching.

**Pre-conditions/Requirements**

Prerequisites:
- n8n instance (self-hosted or n8n.cloud)
- ScrapeGraphAI community node installed
- Active Notion workspace with permission to create/read databases
- Mailchimp account with at least one Audience list
- Basic understanding of JSON; ability to add API credentials in n8n

Required Credentials:
- **ScrapeGraphAI API Key** – Enables web scraping functionality
- **Notion OAuth2 / Integration Token** – Writes data into the Notion database
- **Mailchimp API Key** – Sends campaigns and individual emails
- (Optional) Proxy credentials – If target real-estate sites block your IP

Specific Setup Requirements:

| Resource | Requirement | Example |
|----------|-------------|---------|
| Notion | Database with property fields (Address, Price, SqFt, URL, Availability) | Database ID: abcd1234efgh |
| Mailchimp | Audience list where alerts are sent | Audience ID: f3a2b6c7d8 |
| ScrapeGraphAI | YAML/JSON config per site | Stored inside the ScrapeGraphAI node |

**How it works**

Key Steps:
- **Manual Trigger / CRON**: Starts the workflow weekly or on demand.
- **Code (Site List Builder)**: Generates an array of target URLs for ScrapeGraphAI (see the sketch at the end of this template).
- **Split In Batches**: Processes URLs in manageable groups to avoid rate limits.
- **ScrapeGraphAI**: Extracts property details from each site.
- **IF (New vs Existing)**: Checks whether the listing already exists in Notion.
- **Notion**: Inserts new listings or updates existing records.
- **Set**: Formats email content (HTML & plaintext).
- **Mailchimp**: Sends a campaign or automated alert to subscribers.
- **Sticky Notes**: Provide documentation and future-enhancement pointers.

**Set up steps**

Setup Time: 15-25 minutes

1. Install Community Node: Navigate to Settings → Community Nodes and install "ScrapeGraphAI".
2. Create Notion Integration: Go to Notion Settings → Integrations → Develop your own integration. Copy the integration token and share your target database with the integration.
3. Add Mailchimp API Key: In Mailchimp: Account → Extras → API keys. Copy an existing key or create a new one, then add it to n8n credentials.
4. Build Scrape Config: In the ScrapeGraphAI node, paste a YAML/JSON selector config for each website (address, price, sqft, url, availability).
5. Configure the URL List: Open the first Code node. Replace the placeholder array with your target listing URLs.
6. Map Notion Fields: Open the Notion node and map scraped fields to your database properties. Save.
7. Design Email Template: In the Set node, tweak the HTML and plaintext blocks to match your brand.
8. Test the Workflow: Trigger manually, check that Notion rows are created and Mailchimp sends the message.
9. Schedule: Add a CRON node (weekly) or leave the Manual Trigger for ad-hoc runs.

**Node Descriptions**

Core Workflow Nodes:
- **Manual Trigger / CRON** – Kicks off the workflow either on demand or on a schedule.
- **Code (Site List Builder)** – Holds an array of commercial real-estate URLs and outputs one item per URL.
- **Split In Batches** – Prevents hitting anti-bot limits by processing URLs in groups (default: 5).
- **ScrapeGraphAI** – Crawls each URL, parses the DOM with CSS/XPath selectors, returns structured JSON.
- **IF (New Listing?)** – Compares scraped listing IDs against existing Notion database rows.
- **Notion** – Creates or updates pages representing property listings.
- **Set (Email Composer)** – Builds the dynamic email subject, body, and merge tags for Mailchimp.
- **Mailchimp** – Uses the **Send Campaign** endpoint to email your audience.
- **Sticky Note** – Contains inline documentation and customization reminders.

Data Flow:
Manual Trigger/CRON → Code (URLs) → Split In Batches → ScrapeGraphAI → IF (New?)
- True path → Notion (Create) → Set (Email) → Mailchimp
- False path → (skip)

**Customization Examples**

Filter listings by maximum budget:

```javascript
// Inside the IF node (custom expression)
{{$json["price"] <= 3500}}
```

Change email frequency to daily digests:

```json
{
  "nodes": [
    {
      "name": "Daily CRON",
      "type": "n8n-nodes-base.cron",
      "parameters": {
        "triggerTimes": [
          {
            "hour": 8,
            "minute": 0
          }
        ]
      }
    }
  ]
}
```

**Data Output Format**

The workflow outputs structured JSON data:

```json
{
  "address": "123 Market St, Suite 400",
  "price": 3200,
  "sqft": 950,
  "url": "https://examplebroker.com/listing/123",
  "availability": "Immediate",
  "new": true
}
```

**Troubleshooting**

Common Issues:
- Scraper returns empty objects – Verify the selectors in the ScrapeGraphAI config; inspect the site's HTML for changes.
- Duplicate entries in Notion – Ensure the "IF" node checks a unique ID (e.g., the listing URL) before creating a page.

Performance Tips:
- Reduce the batch size or add delays in ScrapeGraphAI to avoid site blocking.
- Cache previously scraped URLs in an external file or database for faster runs.

Pro Tips:
- Rotate proxies in ScrapeGraphAI for heavily protected sites.
- Use Notion rollups to calculate total available square footage automatically.
- Leverage Mailchimp merge tags (`*|FNAME|*`) in the Set node for personalized alerts.
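As referenced in the Key Steps, the Code (Site List Builder) node can be as simple as the following sketch; the URLs are placeholders to replace with your own targets:

```javascript
// Code (Site List Builder) node sketch: output one n8n item per target URL
// so Split In Batches can group them downstream.
const siteUrls = [
  'https://examplebroker.com/commercial/listings',
  'https://another-broker.example/available-spaces',
];

return siteUrls.map((url) => ({ json: { url } }));
```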
by Aslamul Fikri Alfirdausi
This n8n template demonstrates how to build O'Carla, an advanced all-in-one Discord AI assistant. It intelligently handles natural conversations, professional image generation, and visual file analysis within a single server integration.

Use cases are many: deploy a smart community manager that remembers past interactions, an on-demand artistic tool for your members, or an AI that can "read" and explain uploaded documents and images!

Good to know
- **API Costs:** Costs per interaction vary depending on the model used (Gemini vs. OpenRouter). Check your provider's dashboard for updated pricing.
- **Infrastructure:** This workflow requires a separate Discord bot script (e.g., Node.js) to forward events to the n8n Webhook. It is recommended to host the bot using **PM2** for 24/7 uptime.

How it works
1. Webhook Trigger: Receives incoming data (text and attachments) from your Discord bot.
2. Intent Routing: The workflow uses conditional logic to detect whether the user wants an image (via the keyword gambar:), a vision analysis (via attachments), or a standard chat. (A sketch of this routing follows below.)
3. Multi-Model Intelligence:
   - Gemini 2.5: Powers rapid and high-quality general chat reasoning.
   - Llama 3.2 Vision (via OpenRouter): Specifically used to describe and analyze images or text-based files.
   - Flux (via Pollinations): Uses a specialized AI Agent to refine prompts and generate professional-grade images.
4. Contextual Memory: A 50-message buffer window ensures O'Carla maintains the context of your conversation based on your Discord User ID.
5. Clean UI Output: Generated image links are automatically shortened via TinyURL to keep the Discord chat interface tidy.

How to use
1. Connect your Google Gemini and OpenRouter API keys in the respective nodes.
2. Replace the Webhook URL in your bot script with this workflow's Production Webhook URL.
3. Type gambar: [your prompt] in Discord to generate images.
4. Upload an image or file to Discord to trigger the AI Vision analysis.

Requirements
- n8n instance (Self-hosted or Cloud).
- Google Gemini API Key.
- OpenRouter API Key.
- Discord Bot Token and hosting environment.

Customising this workflow

O'Carla is highly flexible. You can change her personality by modifying the System Message in the Agent nodes, adjust the memory window length, or swap the LLM models for specialized ones like Claude 3.5 or GPT-4o.
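For illustration, the intent routing could be expressed in a single Code node like this; a minimal sketch assuming the bot forwards `content` and an `attachments` array in the webhook payload (those field names are assumptions, not the template's exact schema):

```javascript
// Code node sketch: classify each incoming Discord message into one of
// three intents so the downstream branches can route accordingly.
const body = $json.body || $json;
const content = (body.content || '').trim();
const attachments = body.attachments || [];

let intent = 'chat';
if (content.toLowerCase().startsWith('gambar:')) {
  intent = 'image';                         // image generation request
} else if (attachments.length > 0) {
  intent = 'vision';                        // analyze the uploaded file/image
}

return [{
  json: {
    intent,
    userId: body.userId,                    // used as the memory session key
    prompt: intent === 'image' ? content.slice('gambar:'.length).trim() : content,
    attachments,
  },
}];
```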
by Growth AI
AI-powered alt text generation from Google Sheets to WordPress media

Who's it for

WordPress site owners, content managers, and accessibility advocates who need to efficiently add alt text descriptions to multiple images for better SEO and web accessibility compliance.

What it does

This workflow automates the process of generating and updating alt text for WordPress media files using AI analysis. It reads image URLs from a Google Sheet, analyzes each image with Claude AI to generate accessibility-compliant descriptions, updates the sheet with the generated alt text, and automatically applies the descriptions to the corresponding WordPress media files. The workflow includes error handling to skip unsupported media formats and continue processing.

How it works
1. Input: Provide a Google Sheets URL containing image URLs and WordPress media IDs
2. Authentication: Retrieves WordPress credentials from a separate sheet and builds the Base64-encoded Basic Auth header
3. Processing: Loops through each image URL in the sheet
4. AI Analysis: Claude AI analyzes each image and generates concise, accessible alt text (max 125 characters)
5. Error Handling: Automatically skips unsupported media formats and continues with the next item
6. Update Sheet: Writes the generated alt text back to the Google Sheet
7. WordPress Update: Updates the WordPress media library with the new alt text via the REST API (see the sketch below)

Requirements
- Google Sheets with image URLs and WordPress media IDs
- WordPress site with Application Passwords enabled
- Claude AI (Anthropic) API credentials
- WordPress admin credentials stored in Google Sheets
- Export Media URLs WordPress plugin for generating the media list

How to set up

Step 1: Export your WordPress media URLs
1. Install the "Export Media URLs" plugin on your WordPress site
2. Go to the plugin settings and check both ID and URL columns for export (these are mandatory for the workflow)
3. Export your media list to get the required data

Step 2: Configure WordPress Application Passwords
1. Go to WordPress Admin → Users → Your Profile
2. Scroll down to the "Application Passwords" section
3. Enter an application name (e.g., "n8n API")
4. Click "Add New Application Password"
5. Copy the generated password immediately (it won't be shown again)

Step 3: Set up Google Sheets

Duplicate this Google Sheets template to get the correct structure. The template includes two sheets:

Sheet 1: "Export media" - Paste your exported media data with columns:
- ID (WordPress media ID)
- URL (image URL)
- Alt text (will be populated by the workflow)

Sheet 2: "Infos client" - Add your WordPress credentials:
- Admin Name: Your WordPress username
- KEY: The application password you generated
- Domaine: Your site URL without https:// (format: "example.com")

Step 4: Configure API credentials
1. Add your Anthropic API credentials to the Claude node
2. Connect your Google Sheets account to the Google Sheets nodes

How to customize
- Language: The Claude prompt is in French - modify it in the "Analyze image" node for other languages
- Alt text length: Adjust the 125-character limit in the Claude prompt
- Batch processing: Change the batch size in the Split in Batches node
- Error handling: The workflow automatically handles unsupported formats, but you can modify the error handling logic
- Authentication: Customize for different WordPress authentication methods

This workflow is perfect for managing accessibility compliance across large WordPress media libraries while maintaining consistent, AI-generated descriptions. It's built to be resilient and will continue processing even when encountering unsupported media formats.
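For reference, the WordPress update step boils down to one authenticated REST call per media item; the workflow's HTTP Request node performs the equivalent of this Node.js sketch (the domain, username, password, and IDs below are placeholders):

```javascript
// Node.js sketch (run as an ES module, Node 18+) of the call the workflow
// makes: POST /wp-json/wp/v2/media/<id> with an alt_text field, using Basic
// auth built from the username and Application Password.
const domain = 'example.com';              // "Domaine" value from the sheet
const user = 'admin';                      // "Admin Name" value from the sheet
const appPassword = 'xxxx xxxx xxxx xxxx'; // the Application Password ("KEY")
const mediaId = 42;                        // WordPress media ID from the sheet
const altText = 'A red bicycle leaning against a brick wall';

const auth = Buffer.from(`${user}:${appPassword}`).toString('base64');

const res = await fetch(`https://${domain}/wp-json/wp/v2/media/${mediaId}`, {
  method: 'POST',
  headers: {
    Authorization: `Basic ${auth}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({ alt_text: altText }),
});

if (!res.ok) throw new Error(`WordPress returned ${res.status} for media ${mediaId}`);
console.log(await res.json());
```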
by Yusuke
🧠 Overview

Generate empathetic, professional reply drafts for customer or user messages. The workflow detects sentiment, tone, and risk level, drafts a concise response, sanitizes PII/links/emojis, and auto-escalates risky or low-confidence cases to human review.

⚙️ How It Works
1. Input — Manual Test or Webhook Trigger
2. AI Agent (Empathy) — returns { sentiment, tone, reply, confidence, needs_handover }
3. Post-Process & Sanitize — removes URLs/hashtags, masks PII, caps length (a sketch of this step follows below)
4. Risk & Handover Rules — checks the confidence threshold, risk words, and negativity
5. Routing — auto-send safe replies or flag to Needs Review

🧩 Setup Instructions (3–5 min)
1. Open Set Config1 and adjust:
   - MAX_LEN (default 600)
   - ADD_FOLLOWUP_QUESTION (true/false)
   - FORMALITY (auto | casual | polite)
   - EMOJI_ALLOWED (true/false), BLOCK_LINKS (true/false)
   - RISK_WORDS (e.g., refund, lawsuit, self-harm)
2. Connect an Anthropic credential to the Anthropic Chat Model
3. (Optional) Replace the Manual Trigger with a Webhook Trigger for real-time use

> Tip: If you need to show literal angle brackets in messages, use backticks like `<example>` (no HTML entities needed).

📚 Use Cases

1) SaaS Billing Complaints
- **Input:** "I was billed after canceling. This is unacceptable."
- **Output:** Calm, apologetic reply with refund steps; escalates if refund is in RISK_WORDS or confidence < 0.45.

2) Product Bug Reports
- **Input:** "Upload fails on large files since yesterday."
- **Output:** Acknowledges impact, requests logs, offers a workaround; routes to auto-send if low risk and high confidence.

3) Delivery/Logistics Delays
- **Input:** "My order is late again. Should I file a complaint?"
- **Output:** Empathetic apology, ETA guidance, partial credit policy note; escalates if language indicates legal action.

4) Community Moderation / Abuse
- **Input:** "Support is useless—you're all scammers."
- **Output:** De-escalating, policy-aligned response; auto-flags due to negative sentiment + risk keyword match.

5) Safety / Self-harm Mentions
- **Input:** "I feel like hurting myself if this isn't fixed."
- **Output:** **Immediate escalation**, inserts approved resources; never auto-sends.

🚨 Auto-Escalation Rules (defaults)
- **Negative** sentiment
- Message matches any RISK_WORDS
- confidence < 0.45
- Mentions of legal, harassment, or self-harm context

🧪 Notes & Best Practices
- 🔐 No hardcoded API keys — use n8n Credentials
- 🧭 Tune thresholds and RISK_WORDS to your org policy
- 🧩 Works on self-hosted or cloud n8n
- ✅ Treat outputs as drafts; ship after human/policy review

🔗 Resources
- **GitHub (template JSON):** https://github.com/yskmtb0714/n8n-workflows/blob/main/empathy-reply-assistant.json
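As referenced in step 3 of How It Works, the sanitize step could look roughly like this in a Code node; a sketch in which the masking rules and hard-coded config values are illustrative, not the template's exact code:

```javascript
// Code node sketch: strip links/hashtags, mask obvious PII, cap length.
const cfg = { MAX_LEN: 600, BLOCK_LINKS: true, EMOJI_ALLOWED: false };
let reply = $json.reply || '';

if (cfg.BLOCK_LINKS) {
  reply = reply.replace(/https?:\/\/\S+/g, '[link removed]');
}
reply = reply.replace(/#\w+/g, '');                           // hashtags
reply = reply.replace(/[\w.+-]+@[\w-]+\.[\w.]+/g, '[email]'); // email addresses
reply = reply.replace(/\+?\d[\d\s().-]{7,}\d/g, '[phone]');   // phone-like numbers
if (!cfg.EMOJI_ALLOWED) {
  reply = reply.replace(/[\u{1F300}-\u{1FAFF}\u{2600}-\u{27BF}]/gu, '');
}
if (reply.length > cfg.MAX_LEN) {
  reply = reply.slice(0, cfg.MAX_LEN - 1).trimEnd() + '…';
}

return [{ json: { ...$json, reply } }];
```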
by Meelioo
How it works

This beginner-friendly workflow demonstrates the core building blocks of n8n. It guides you through:
- **Triggers** – Start workflows manually, on a schedule, via webhooks, or through chat.
- **Data processing** – Use Set and Code nodes to create, transform, and enrich data (a small example follows below).
- **Logic and branching** – Apply conditions with IF nodes and merge different branches back together.
- **API integrations** – Fetch external data (e.g., users from an API), split arrays into individual items, and extract useful fields.
- **AI-powered steps** – Connect to OpenAI for generating fun facts or build interactive assistants with chat triggers, memory, and tools.
- **Responses** – Return structured results via webhooks or summary nodes.

By the end, it demonstrates a full flow: creating data → transforming it → making decisions → calling APIs → using AI → responding with outputs.

Set up steps

Time required: 5–10 minutes.

What you need:
- An n8n instance (cloud or self-hosted).
- Optional: API credentials (e.g., OpenAI) if you want to test AI features.

Setup flow:
1. Import this workflow.
2. Add your API keys where needed (OpenAI, etc.).
3. Trigger the workflow manually or test with webhooks.

> 👉 Detailed node explanations and examples are already included as sticky notes inside the workflow itself, so you can learn step by step as you explore.
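To give a feel for the data-processing building block, here is the kind of minimal Code-node snippet this template teaches; a generic sketch, assuming items carry `firstName` and `lastName` fields:

```javascript
// Code node example: enrich each incoming item with computed fields.
return items.map((item) => ({
  json: {
    ...item.json,
    fullName: `${item.json.firstName} ${item.json.lastName}`, // combine fields
    processedAt: new Date().toISOString(),                    // add a timestamp
  },
}));
```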
by Ruthwik
📧 AI-Powered Email Categorization & Labeling in Zoho Mail

This n8n template demonstrates how to use AI text classification to automatically categorize incoming emails in Zoho Mail and apply the correct label (e.g., Support, Billing, HR). It saves time by keeping your inbox structured and ensures emails are routed to the right category.

Use cases include:
- Routing customer support requests to the correct team.
- Organizing billing and finance communications separately.
- Streamlining HR and recruitment email handling.
- Reducing inbox clutter and ensuring no important message is missed.

ℹ️ Good to know
- You'll need to configure Zoho OAuth credentials — see Self Client Overview, Authorization Code Flow, and the Zoho Mail OAuth Guide.
- The labels must already exist in Zoho Mail (e.g., Support, Billing, HR). The workflow fetches these labels and applies them automatically.
- The Zoho Mail API domain changes depending on your account region:
  - .com → Global accounts (https://mail.zoho.com/api/...)
  - .eu → EU accounts (https://mail.zoho.eu/api/...)
  - .in → India accounts (https://mail.zoho.in/api/...)
  - Example: For an EU account, the endpoint would be: https://mail.zoho.eu/api/accounts/<accountID>/updatemessage
- The AI model used for text classification may incur costs depending on your provider (e.g., OpenRouter).
- Start by testing with a small set of emails before enabling it for your full inbox.

🔄 How it works
1. A new email in Zoho Mail triggers the workflow.
2. OAuth authentication retrieves access to Zoho Mail's API.
3. All available labels are fetched, and a label map (display name → ID) is created (a sketch of this step follows below).
4. The AI model analyzes the subject and body to predict the correct category.
5. The workflow routes the email to the right category branch.
6. The matching Zoho Mail label is applied (the final node is deactivated by default).

🛠️ How to use
1. Create the required labels (e.g., Support, Billing, HR, etc.) in your Zoho Mail account before running the workflow.
2. Replace the Zoho Mail Account ID in the Set Account ID node.
3. Configure your Zoho OAuth credentials in the Get Access Token node.
4. Update the API base URL to match your Zoho account's region (.com, .eu, .in, etc.).
5. Activate the Apply Label to Email node once ready for production.
6. Optionally, adjust the categories in the AI classifier prompt to fit your organization's needs.

📋 Requirements
- Zoho Mail account with API access enabled.
- Labels created in Zoho Mail for each category you want to classify.
- OAuth credentials set up in n8n.
- The correct Zoho Mail API domain (.com, .eu, .in) based on your account region.
- An AI model (via OpenRouter or another provider) for text classification.

🎨 Customising this workflow

This workflow can be adapted to many inbox management scenarios. Examples include:
- Auto-routing customer inquiries to specific departments.
- Prioritizing VIP client emails with special labels.
- Filtering job applications directly into an HR-managed folder.
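For illustration, building the label map could look like this in a Code node; a sketch that assumes the label-list call returns an array with `displayName` and `labelId` fields (the actual Zoho response shape may differ by API version, so adjust the field names accordingly):

```javascript
// Code node sketch: turn the fetched label list into a displayName → ID map
// so the category branches can look up the right labelId to apply.
const labels = $json.data || []; // assumed shape of the Zoho label-list response

const labelMap = {};
for (const label of labels) {
  labelMap[label.displayName] = label.labelId;
}

// e.g., { "Support": "2800000000050001", "Billing": "2800000000050003", ... }
return [{ json: { labelMap } }];
```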
by John Alejandro SIlva
🤖💬 Smart Telegram AI Assistant with Memory Summarization & Dynamic Model Selection

> Optimize your AI workflows, cut costs, and get faster, more accurate answers.

📋 Description

Tired of expensive AI calls, slow responses, or bots that forget your context? This Telegram AI Assistant template is designed to optimize cost, speed, and precision in your AI-powered conversations.

By combining PostgreSQL chat memory, AI summarization, and dynamic model selection, this workflow ensures you only pay for what you really need. Simple queries get routed to lightweight models, while complex requests automatically trigger more advanced ones. The result? Smarter context, lower costs, and better answers. (A sketch of the routing idea follows below.)

This template is perfect for anyone who wants to:
- ⚡ Save money by using cheaper models for easy tasks.
- 🧠 Keep context relevant with AI-powered summarization.
- ⏱️ Respond faster thanks to optimized chat memory storage.
- 💬 Deliver better answers directly inside Telegram.

✨ Key Benefits
- 💸 Cost Optimization: Automatically routes simple requests to Gemini Flash Lite and reserves Gemini Pro only for complex reasoning.
- 🧠 Smarter Context: Summarization ensures only the most relevant chat history is used.
- ⏱️ Faster Workflows: Storing user + agent messages in a single row halves DB queries and saves ~0.3s per response.
- 🎤 Voice Message Support: Converts Telegram voice notes to text and replies intelligently.
- 🛡️ Error-Proof Formatting: Safe MarkdownV2 ensures Telegram-ready answers.

💼 Use Case

This template is for anyone who needs an AI chatbot on Telegram that balances cost, performance, and intelligence. Customer support teams can reduce expenses by using lightweight models for FAQs. Freelancers and consultants can offer faster AI-powered chats without losing context. Power users can handle voice + text seamlessly while keeping conversations memory-aware. Whether you're scaling a business or just want a smarter assistant, this workflow adapts to your needs and budget.

💬 Example Interactions
- **Quick Q&A** → Routed to Gemini Flash Lite for fast, low-cost answers.
- **Complex problem-solving** → Sent to Gemini Pro for in-depth reasoning.
- **Voice messages** → Automatically transcribed, summarized, and answered.
- **Long conversations** → Context is summarized, ensuring precise and efficient replies.

🔑 Required Credentials
- **Telegram Bot API** (Bot Token)
- **PostgreSQL** (Database connection)
- **Google Gemini API** (Flash Lite, Flash, Pro)

⚙️ Setup Instructions
1. 🗄️ Create the PostgreSQL table (chat_memory) from the Gray section SQL.
2. 🔌 Configure the Telegram Trigger with your bot token.
3. 🤖 Connect your Gemini API credentials.
4. 🗂️ Set up the PostgreSQL nodes with your DB details.
5. ▶️ Activate the workflow and start chatting with your AI-powered Telegram bot.

🏷 Tags

telegram ai-assistant chatbot postgresql summarization memory gemini dynamic-routing workflow-optimization cost-saving voice-to-text

🙏 Acknowledgement

A special thank you to Davide for the inspiration behind this template. His work on the AI Orchestrator that dynamically selects models based on input type served as a foundational guide for this architecture.

💡 Need Assistance?

Want to customize this workflow for your business or project? Let's connect:
- 📧 Email: johnsilva11031@gmail.com
- 🔗 LinkedIn: John Alejandro Silva Rodríguez
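To make dynamic model selection concrete, a Code node could score the incoming message and pick a tier along these lines; a simplified heuristic sketch with illustrative model names, not the template's actual selection logic:

```javascript
// Code node sketch: pick a Gemini tier based on rough message complexity.
const text = $json.message?.text || '';

// Crude complexity signals: length, multi-step wording, code or SQL fragments
const signals =
  (text.length > 400 ? 1 : 0) +
  (/step by step|explain|compare|analyze|why/i.test(text) ? 1 : 0) +
  (/```|\bfunction\b|\bSELECT\b/i.test(text) ? 1 : 0);

let model = 'gemini-flash-lite';      // cheap default for simple queries
if (signals === 1) model = 'gemini-flash';
if (signals >= 2) model = 'gemini-pro';

return [{ json: { ...$json, model } }];
```

A cheap classifier LLM can replace the heuristic once you want routing quality to improve beyond keyword signals.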