by vinci-king-01
Product Price Monitor with Slack and Jira ⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template. This workflow automatically scrapes multiple e-commerce sites, analyses weekly seasonal price trends, and notifies your team in Slack while opening Jira tickets for items that require price adjustments. It helps retailers plan inventory and pricing by surfacing actionable insights every week. Pre-conditions/Requirements Prerequisites n8n instance (self-hosted or n8n cloud) ScrapeGraphAI community node installed Slack workspace & channel for notifications Jira Software project (cloud or server) Basic JavaScript knowledge for optional custom code edits Required Credentials ScrapeGraphAI API Key** – Enables web scraping Slack OAuth Access Token** – Required by the Slack node Jira Credentials** – Email & API token (cloud) or username & password (server) (Optional) Proxy credentials – If target websites block direct scraping Specific Setup Requirements | Resource | Purpose | Example | |----------|---------|---------| | Product URL list | Seed URLs to monitor | https://example.com/products-winter-sale | | Slack Channel | Receives trend alerts | #pricing-alerts | | Jira Project Key | Tickets are created here | ECOM | How it works This workflow automatically scrapes multiple e-commerce sites, analyses weekly seasonal price trends, and notifies your team in Slack while opening Jira tickets for items that require price adjustments. It helps retailers plan inventory and pricing by surfacing actionable insights every week. Key Steps: Webhook Trigger**: Kicks off the workflow via a weekly schedule or manual call. Set Product URLs**: Prepares the list of product pages to analyse. SplitInBatches**: Processes URLs in manageable batches to avoid rate limits. ScrapeGraphAI**: Extracts current prices, stock, and seasonality hints from each URL. Code (Trend Logic)**: Compares scraped prices against historical averages. If (Threshold Check)**: Determines if price deviations exceed ±10%. Slack Node**: Sends a formatted message to the pricing channel for each deviation. Jira Node**: Creates/updates a ticket linking to the product for further action. Merge**: Collects all batch results for summary reporting. Set up steps Setup Time: 15-20 minutes Install Community Nodes: In n8n, go to Settings → Community Nodes, search for “ScrapeGraphAI”, and install. Add Credentials: a. Slack → Credentials → New, paste your Bot/User OAuth token. b. Jira → Credentials → New, enter your domain, email/username, API token/password. c. ScrapeGraphAI → Credentials → New, paste your API key. Import Workflow: Upload or paste the JSON template into n8n. Edit the “Set Product URLs” Node: Replace placeholder URLs with your real product pages. Configure Schedule: Replace the Webhook Trigger with a Cron node (e.g., every Monday 09:00) or keep as webhook for manual runs. Map Jira Fields: In the Jira node, ensure Project Key, Issue Type (e.g., Task), and Summary fields match your instance. Test Run: Execute the workflow. Confirm Slack message appears and a Jira issue is created. Activate: Toggle the workflow to Active so it runs automatically. Node Descriptions Core Workflow Nodes: Webhook** – Default trigger, can be swapped with Cron for weekly automation. Set (Product URLs)** – Stores an array of product links for scraping. 
SplitInBatches** – Limits each ScrapeGraphAI call to five URLs to reduce load. ScrapeGraphAI** – Crawls and parses HTML, returning JSON with title, price, availability. Code (Trend Logic)** – Calculates percentage change vs. historical data (stored externally or hard-coded for demo). If (Threshold Check)** – Routes items above/below the set variance. Slack** – Posts a rich-format message containing product title, old vs. new price, and link. Jira** – Creates or updates a ticket with priority set to Medium and assigns it to the Pricing team lead. Merge** – Recombines batch streams for optional reporting or storage. Data Flow: Webhook → Set (Product URLs) → SplitInBatches → ScrapeGraphAI → Code (Trend Logic) → If → Slack / Jira → Merge Customization Examples Change Price Deviation Threshold // Code (Trend Logic) node const threshold = 0.05; // 5% instead of default 10% Alter Slack Message Template { text: `${item.name} price changed from $${item.old} to $${item.new} (${item.diff}%).`, attachments: [ { title: "Product Link", title_link: item.url, color: "#4E79A7" } ] } Data Output Format The workflow outputs structured JSON data: { "product": "Winter Jacket", "url": "https://example.com/winter-jacket", "oldPrice": 129.99, "newPrice": 99.99, "change": -23.08, "scrapedAt": "2023-11-04T09:00:00Z", "status": "Below Threshold", "slackMsgId": "A1B2C3", "jiraIssueKey": "ECOM-101" } Troubleshooting Common Issues ScrapeGraphAI returns empty data – Verify selectors; many sites use dynamic rendering and require a headless-browser flag. Slack message not delivered – Check that the OAuth token scopes include chat:write; also confirm the channel ID. Jira ticket creation fails – Field mapping mismatch; ensure the Issue Type is valid and required custom fields are supplied. Performance Tips Batch fewer URLs (e.g., 3 instead of 5) to reduce timeout risk. Cache historical prices in an external DB (Postgres, Airtable) instead of reading large CSVs in the Code node. Pro Tips: Rotate proxies/IPs within ScrapeGraphAI to bypass aggressive e-commerce anti-bot measures. Add a Notion or Sheets node after Merge for historical logging. Use the Error Trigger workflow in n8n to alert when ScrapeGraphAI fails more than X times per run.
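To make the Code (Trend Logic) step more concrete, here is a minimal sketch for an n8n Code node (run once for all items). The historical-average lookup and field names are illustrative assumptions — adapt them to wherever you actually store past prices.

```javascript
// n8n Code node ("Run Once for All Items") – minimal trend-logic sketch.
// Assumption: each incoming item carries { name, url, price } from ScrapeGraphAI,
// and historical averages are hard-coded here for demo purposes.
const threshold = 0.10; // flag deviations beyond ±10%

const historicalAverages = {
  'https://example.com/products-winter-sale': 129.99, // placeholder seed data
};

return $input.all().map((item) => {
  const { name, url, price } = item.json;
  const oldPrice = historicalAverages[url] ?? price;
  const diff = oldPrice ? (price - oldPrice) / oldPrice : 0;

  return {
    json: {
      product: name,
      url,
      oldPrice,
      newPrice: price,
      change: Number((diff * 100).toFixed(2)),      // percentage change
      exceedsThreshold: Math.abs(diff) > threshold,  // routes the If (Threshold Check) node
      scrapedAt: new Date().toISOString(),
    },
  };
});
```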
by Jitesh Dugar
Automated Customer Statement Generator with Risk Analysis & Credit Monitoring Transform account statement management from hours to minutes - automatically compile transaction histories, calculate aging analysis, monitor credit limits, assess payment risk, and deliver professional PDF statements while syncing with accounting systems and alerting your team about high-risk accounts. What This Workflow Does Revolutionizes customer account management with intelligent statement generation, credit monitoring, and risk assessment: Webhook-Triggered Generation** - Automatically creates statements from accounting systems, CRM updates, or scheduled monthly triggers Smart Data Validation** - Verifies transaction data, validates account information, and ensures statement accuracy before generation Running Balance Calculation** - Automatically computes running balances through all transactions with opening and closing balance tracking Comprehensive Aging Analysis** - Calculates outstanding balances by age buckets (Current, 31-60 days, 61-90 days, 90+ days) Overdue Detection & Highlighting** - Automatically identifies overdue amounts with visual color-coded alerts on statements Professional HTML Design** - Creates beautifully branded statements with modern layouts, aging breakdowns, and payment information PDF Conversion** - Transforms HTML into print-ready, professional-quality PDF statements with preserved formatting Automated Email Delivery** - Sends branded emails to customers with PDF attachments and account summary details Google Drive Archival** - Automatically saves statements to organized folders with searchable filenames by account Credit Limit Monitoring** - Tracks credit utilization, detects over-limit accounts, and generates alerts at 75%, 90%, and 100%+ thresholds Risk Scoring System** - Calculates 0-100 risk scores based on payment behavior, aging, credit utilization, and overdue patterns Payment Behavior Analysis** - Tracks days since last payment, average payment time, and payment reliability trends Automated Recommendations** - Generates prioritized action items like "escalate to collections" or "suspend new credit" Accounting System Integration** - Syncs statement delivery, balance updates, and risk assessments to QuickBooks, Xero, or FreshBooks Conditional Team Notifications** - Different Slack alerts for overdue accounts (urgent) vs current accounts (standard) with risk metrics Transaction History Table** - Detailed itemization of all charges, payments, and running balances throughout statement period Multiple Payment Options** - Includes bank details, online payment links, and account manager contact information Key Features Automatic Statement Numbering**: Generates unique sequential statement numbers with format STMT-YYYYMM-AccountNumber for easy tracking and reference Aging Bucket Analysis**: Breaks down outstanding balances into current (0-30 days), 31-60 days, 61-90 days, and 90+ days overdue categories Credit Health Dashboard**: Visual indicators show credit utilization percentage, available credit, and over-limit warnings in statement Risk Assessment Engine**: Analyzes multiple factors including overdue amounts, credit utilization, payment frequency to calculate comprehensive risk score Payment Behavior Tracking**: Monitors days since last payment, identifies patterns like "Excellent - Pays on Time" or "Poor - Chronic Late Payment" Intelligent Recommendations**: Automatically generates prioritized action items based on account status, risk level, and payment history 
Transaction Running Balance**: Shows balance after each transaction so customers can verify accuracy and reconcile their records Over-Limit Detection**: Immediate alerts when accounts exceed credit limits with escalation recommendations to suspend new charges Good Standing Indicators**: Visual green checkmarks and positive messaging for accounts with no overdue balances Account Manager Details**: Includes dedicated contact person for questions, disputes, and payment arrangements Dispute Process Documentation**: Clear instructions on how customers can dispute transactions within required timeframe Multi-Currency Support**: Handles USD, EUR, GBP, INR with proper currency symbols and formatting throughout statement Accounting System Sync**: Logs statement delivery, balance updates, and risk assessments in QuickBooks, Xero, FreshBooks, or Wave Conditional Workflow Routing**: Different automation paths for high-risk overdue accounts vs healthy current accounts Activity Notes Generation**: Creates detailed CRM notes with account summary, recommendations, and delivery confirmation Print-Optimized PDFs**: A4 format with proper margins and color preservation for professional printing and digital distribution Perfect For B2B Companies with Trade Credit** - Manufacturing, wholesale, distribution businesses offering net-30 or net-60 payment terms Professional Services Firms** - Consulting, legal, accounting firms with monthly retainer clients and time-based billing Subscription Services (B2B)** - SaaS platforms, software companies, membership organizations with recurring monthly charges Equipment Rental Companies** - Construction equipment, party rentals, medical equipment with ongoing rental agreements Import/Export Businesses** - International traders managing accounts receivable across multiple customers and currencies Healthcare Billing Departments** - Medical practices, clinics, hospitals tracking patient account balances and payment plans Educational Institutions** - Private schools, universities, training centers with tuition payment plans and installments Telecommunications Providers** - Phone, internet, cable companies sending monthly account statements to business customers Utilities & Energy Companies** - Electric, gas, water utilities managing commercial account statements and collections Property Management Companies** - Real estate firms tracking tenant charges, rent payments, and maintenance fees Credit Card Companies & Lenders** - Financial institutions providing detailed account activity and payment due notifications Wholesale Suppliers** - Distributors supplying restaurants, retailers, contractors on credit terms with monthly settlements Commercial Insurance Agencies** - Agencies tracking premium payments, policy charges, and outstanding balances Construction Contractors** - General contractors billing for progress payments, change orders, and retention releases What You Will Need Required Integrations HTML to PDF API - PDF conversion service (API key required) - supports HTML/CSS to PDF API, PDFShift, or similar providers (approximately 1-5 cents per statement) Gmail or SMTP - Email delivery service for sending statements to customers (OAuth2 or SMTP credentials) Google Drive - Cloud storage for statement archival and compliance record-keeping (OAuth2 credentials required) Optional Integrations Slack Webhook** - Team notifications for overdue and high-risk accounts (free incoming webhook) Accounting Software Integration** - QuickBooks, Xero, FreshBooks, Zoho Books API for automatic 
statement logging and balance sync CRM Integration** - HubSpot, Salesforce, Pipedrive for customer activity tracking and collections workflow triggers Payment Gateway** - Stripe, PayPal, Square payment links for one-click online payment from statements Collections Software** - Integrate with collections management platforms for automatic escalation of high-risk accounts SMS Notifications** - Twilio integration for payment due reminders and overdue alerts via text message Quick Start Import Template - Copy JSON workflow and import into your n8n instance Configure PDF Service - Add HTML to PDF API credentials in the "HTML to PDF" node Setup Gmail - Connect Gmail OAuth2 credentials in "Send Email to Customer" node and update sender email Connect Google Drive - Add Google Drive OAuth2 credentials and set folder ID for statement archival Customize Company Info - Edit "Enrich with Company Data" node to add company name, address, contact details, bank information Configure Credit Limits - Set default credit limits and payment terms for your customer base Adjust Risk Thresholds - Modify risk scoring logic in "Credit Limit & Risk Analysis" node based on your policies Update Email Template - Customize email message in Gmail node with your branding and messaging Configure Slack - Add Slack webhook URLs in both notification nodes (overdue and current accounts) Connect Accounting System - Replace code in "Update Accounting System" node with actual API call to QuickBooks/Xero/FreshBooks Test Workflow - Submit sample transaction data via webhook to verify PDF generation, email delivery, and notifications Schedule Monthly Run - Set up scheduled trigger for automatic end-of-month statement generation for all customers Customization Options Custom Aging Buckets** - Modify aging periods to match your business (e.g., 0-15, 16-30, 31-45, 46-60, 60+ days) Industry-Specific Templates** - Create different statement designs for different customer segments or business units Multi-Language Support** - Translate statement templates for international customers (Spanish, French, German, Mandarin) Dynamic Credit Terms** - Configure different payment terms by customer type (VIP net-45, standard net-30, new customers due on receipt) Late Fee Calculation** - Add automatic late fee calculation and inclusion for overdue balances Payment Plan Tracking** - Track installment payment plans with remaining balance and next payment due Interest Charges** - Calculate and add interest charges on overdue balances based on configurable rates Partial Payment Allocation** - Show how partial payments were applied across multiple invoices Customer Portal Integration** - Generate secure links for customers to view statements and make payments online Batch Processing** - Process statements for hundreds of customers simultaneously with bulk email delivery White-Label Branding** - Create different branded templates for multiple companies or subsidiaries Custom Risk Models** - Adjust risk scoring weights based on your industry and historical payment patterns Collections Workflow Integration** - Automatically create tasks in collections software for high-risk accounts Early Payment Incentives** - Highlight early payment discounts or prompt payment benefits on statements Dispute Management** - Track disputed transactions and adjust balances accordingly with audit trail Expected Results 90% time savings** - Reduce statement creation from 2-3 hours to 5 minutes per customer 100% accuracy** - Eliminate calculation errors and missing transactions 
through automated processing 50% faster payment collection** - Professional statements with clear aging drive faster customer payments Zero filing time** - Automatic Google Drive organization with searchable filenames by account 30% reduction in overdue accounts** - Proactive credit monitoring and risk alerts prevent bad debt Real-time risk visibility** - Instant identification of high-risk accounts before they become uncollectible Automated compliance** - Complete audit trail with timestamped statement delivery and accounting sync Better customer communication** - Professional statements improve customer satisfaction and reduce disputes Reduced bad debt write-offs** - Early warning system catches payment issues before they escalate Improved cash flow** - Faster statement delivery and payment reminders accelerate cash collection Pro Tips Schedule Monthly Batch Generation** - Run workflow automatically on last day of month to generate statements for all customers simultaneously Customize Aging Thresholds** - Adjust credit alert levels (75%, 90%, 100%) based on your risk tolerance and industry norms Segment Customer Communications** - Use different email templates for VIP customers vs standard customers vs delinquent accounts Track Payment Patterns** - Monitor days-to-pay metrics by customer to identify chronic late payers proactively Integrate with Collections** - Connect workflow to collections software to automatically escalate 90+ day accounts Include Payment Portal Links** - Add unique payment links to each statement for one-click online payment Automate Follow-Up Reminders** - Build workflow extension to send payment reminders 7 days before due date Create Executive Dashboards** - Export risk scores and aging data to business intelligence tools for trend analysis Document Dispute Resolutions** - Log all disputed transactions in accounting system with resolution notes Test with Sample Data First** - Validate aging calculations with known test data before processing real customer accounts Archive Statements for Compliance** - Maintain 7-year archive in Google Drive organized by year and customer Monitor Credit Utilization Trends** - Track credit utilization changes month-over-month to predict cash flow needs Benchmark Against Industry** - Compare your DSO and bad debt ratios to industry averages to identify improvement areas Personalize Account Manager Info** - Assign dedicated contacts to customers and include their direct phone and email Use Descriptive Transaction Details** - Ensure transaction descriptions clearly explain charges to reduce disputes Business Impact Metrics Track these key metrics to measure workflow success: Statement Generation Time** - Measure average minutes from trigger to delivered statement (target: under 5 minutes) Statement Volume Capacity** - Count monthly statements generated through automation (expect 10-20x increase in capacity) Aging Calculation Accuracy** - Track statements with aging errors (target: 0% error rate) Days Sales Outstanding (DSO)** - Monitor average days to collect payment (expect 15-30% reduction) Bad Debt Write-Offs** - Track uncollectible accounts as percentage of revenue (expect 30-50% reduction) Collection Rate** - Monitor percentage of invoices collected within terms (expect 10-20% improvement) Customer Disputes** - Count statement disputes and billing inquiries (expect 50-70% reduction) Over-Limit Accounts** - Track number of accounts exceeding credit limits (early detection prevents losses) High-Risk Account Identification** - 
Measure days between risk detection and collection action (target: within 48 hours) Cash Flow Improvement** - Calculate working capital improvement from faster collections (typical: 20-35% improvement) Template Compatibility Compatible with n8n version 1.0 and above Works with n8n Cloud and Self-Hosted instances Requires HTML to PDF API service subscription (1-5 cents per statement) No coding required for basic setup Fully customizable for industry-specific requirements Integrates with major accounting platforms via API Multi-currency and multi-language ready Supports batch processing for large customer bases Compliant with financial record-keeping regulations Ready to transform your account receivables management? Import this template and start generating professional statements with credit monitoring, risk assessment, and automated collections alerts - improving your cash flow, reducing bad debt, and freeing your accounting team to focus on strategic financial management!
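To illustrate the aging-bucket and risk-scoring logic described above, here is a minimal Code-node sketch. The bucket boundaries follow the defaults mentioned earlier, but the input field names and scoring weights are assumptions to align with your own "Credit Limit & Risk Analysis" node and credit policy.

```javascript
// Minimal sketch: aging analysis + 0-100 risk score per customer item.
// Assumed input shape: { invoices: [{ amount, dueDate, paid }], creditLimit, balance }
const now = new Date();

return $input.all().map((item) => {
  const { invoices = [], creditLimit = 0, balance = 0 } = item.json;

  // Aging buckets: Current (0-30), 31-60, 61-90, 90+ days past due.
  const aging = { current: 0, d31_60: 0, d61_90: 0, d90plus: 0 };
  for (const inv of invoices.filter((i) => !i.paid)) {
    const daysOverdue = Math.floor((now - new Date(inv.dueDate)) / 86400000);
    if (daysOverdue <= 30) aging.current += inv.amount;
    else if (daysOverdue <= 60) aging.d31_60 += inv.amount;
    else if (daysOverdue <= 90) aging.d61_90 += inv.amount;
    else aging.d90plus += inv.amount;
  }

  // Simple weighted risk score (0-100); weights are assumptions, tune to your policy.
  const utilization = creditLimit > 0 ? balance / creditLimit : 0;
  const overdue = aging.d31_60 + aging.d61_90 + aging.d90plus;
  const overdueRatio = balance > 0 ? overdue / balance : 0;
  const riskScore = Math.min(100, Math.round(
    utilization * 40 + overdueRatio * 40 + (aging.d90plus > 0 ? 20 : 0)
  ));

  return {
    json: {
      ...item.json,
      aging,
      creditUtilization: Number((utilization * 100).toFixed(1)),
      riskScore,
      overLimit: creditLimit > 0 && balance > creditLimit,
    },
  };
});
```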
by WeblineIndia
Automated Failed Login Detection with Jira Security Tasks, Slack Notifications Webhook: Failed Login Attempts → Jira Security Case → Slack Warnings This n8n workflow monitors failed login attempts from any application, normalizes incoming data, detects repeated attempts within a configurable time window and automatically: Sends detailed alerts to Slack, Creates Jira security tasks (single or grouped based on repetition), Logs all failed login attempts into a Notion database. It ensures fast, structured and automated responses to potential account compromise or brute-force attempts while maintaining persistent records. Quick Implementation Steps Import this JSON workflow into n8n. Connect your application to the failed-login webhook endpoint. Add Jira Cloud API credentials. Add Slack API credentials. Add Notion API credentials and configure the database for storing login attempts. Enable the workflow — done! What It Does Receives Failed Login Data Accepts POST requests containing failed login information. Normalizes the data, ensuring consistent fields: username, ip, timestamp and error. Validates Input Checks for missing username or IP. Sends a Slack alert if any required field is missing. Detects Multiple Attempts Uses a sliding time window (default: 5 minutes) to detect multiple failed login attempts from the same username + IP. Single attempts → standard Jira task + Slack notification. Multiple attempts → grouped Jira task + detailed Slack notification. Logs Attempts in Notion Records all failed login events into a Notion database with fields: Username, IP, Total Attempts, Attempt List, Attempt Type. Formats Slack Alerts Single attempt → lightweight notification. Multiple attempts → summary including timestamps, errors, total attempts, and Jira ticket link. Who’s It For This workflow is ideal for: Security teams monitoring authentication logs. DevOps/SRE teams maintaining infrastructure access logs. SaaS platform teams with high login traffic. Organizations aiming to automate breach detection. Teams using Jira + Slack + Notion + n8n for incident workflows. Requirements n8n (Self-Hosted or Cloud). Your application must POST failed login data to the webhook. Jira Software Cloud credentials (Email, API Token, Domain). Slack Bot Token with message-posting permissions. Notion API credentials with access to a database. Basic understanding of your login event sources. How It Works Webhook Trigger: Workflow starts when a failed-login event is sent to the failed-login webhook. Normalization: Converts single objects or arrays into a uniform format. Ensures username, IP, timestamp and error are present. Prepares a logMessage for Slack and Jira nodes. Validation: IF node checks whether username and IP exist. If missing → Slack alert for missing information. Multiple Attempt Detection: Function node detects repeated login attempts within a 5-minute sliding window. Flags attempts as multiple: true or false. Branching: Multiple attempts → build summary, create Jira ticket, format Slack message, store in Notion. Single attempts → create Jira ticket, format Slack message, store in Notion. Slack Alerts: Single attempt → concise message Multiple attempts → detailed summary with timestamps and Jira ticket link Notion Logging: Stores username, IP, total attempts, attempt list, attempt type in a dedicated database for recordkeeping. How To Set Up Import Workflow → Workflows → Import from File in n8n. Webhook Setup → copy the URL from the Failed Login Trigger node and integrate it with your application.
Jira Credentials → connect your Jira account to both Jira nodes and configure project/issue type. Slack Credentials → connect your Slack Bot and select the alert channel. Notion Credentials → connect your Notion account and select the database for storing login attempts. Test the Workflow → send sample events: missing fields, single attempts, multiple attempts. Enable Workflow → turn on workflow once testing passes. Logic Overview | Step Node | Description | |---------------------------------|-----------------------------------------------| | Normalize input | Normalize Login Event — Ensures each event has required fields and prepares a logMessage. | | Validate fields | Check Username & IP present — IF node → alerts Slack if data is incomplete. | | Detect repeats | Detect Multiple Attempts — Finds multiple attempts within a 5-minute window; sets multiple flag. | | Multiple attempts | IF - Multiple Attempts + Build Multi-Attempt Summary — Prepares grouped summary for Slack & Jira. | | Single attempt | Create Ticket - Single Attempt — Creates Jira task & Slack alert for one-off events. | | Multiple attempt ticket | Create Ticket - Multiple Attempts — Creates detailed Jira task. | | Slack alert formatting | Format Fields For Single/Multiple Attempt — Prepares structured message for Slack. | | Slack alert delivery | Slack Alert - Single/Multiple Attempts — Posts alert in selected Slack channel. | | Notion logging | Login Attempts Data Store in DB — Stores structured attempt data in Notion database. | Customization Options Webhook Node** → adjust endpoint path for your application. Normalization Function** → add fields such as device, OS, location or user-agent. Multiple Attempt Logic** → change the sliding window duration or repetition threshold. Jira Nodes** → modify issue type, labels or project. Slack Nodes** → adjust markdown formatting, channel routing or severity-based channels. Notion Node** → add or modify database fields to store additional context. Optional Enhancements: Geo-IP lookup for country/city info. Automatic IP blocking via firewall or WAF. User notification for suspicious login attempts. Database logging in MySQL/Postgres/MongoDB. Threat intelligence enrichment (e.g., AbuseIPDB). Use Case Examples Detect brute-force attacks targeting user accounts. Identify credential stuffing across multiple users. Monitor admin portal access failures with Jira task creation. Alert security teams instantly when login attempts originate from unusual locations. Centralize failed login monitoring across multiple applications with Notion logging. Troubleshooting Guide | Issue | Possible Cause | Solution | |-------------------------------|---------------------------------------------------|-------------------------------------------------------------| | Workflow not receiving data | Webhook misconfigured | Verify webhook URL & POST payload format | | Jira ticket creation fails | Invalid credentials or insufficient permissions | Update Jira API token and project access | | Slack alert not sent | Incorrect channel ID or missing bot scopes | Fix Slack credentials and permissions | | Multiple attempts not detected| Sliding window logic misaligned | Adjust Detect Multiple Attempts node code | | Notion logging fails | Incorrect database ID or missing credentials | Update Notion node credentials and database configuration | | Errors in normalization | Payload format mismatch | Update Normalize Login Event function code | Need Help? 
If you need help setting up, customizing or extending this workflow, WeblineIndia can assist with full n8n development, workflow automation, security event processing and custom integrations.
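As a rough sketch of the multiple-attempt detection described above, the snippet below keeps recent attempts in workflow static data keyed by username + IP and applies the five-minute window. The persistence approach and field names are assumptions — the shipped Function node may track attempts differently.

```javascript
// Sketch for an n8n Code node: flag repeated failed logins within a 5-minute window.
// Assumption: normalized items carry { username, ip, timestamp }; recent attempts are
// kept in workflow static data (note: static data only persists for active workflows).
const WINDOW_MS = 5 * 60 * 1000;
const store = $getWorkflowStaticData('global');
store.attempts = store.attempts || {};

const now = Date.now();

return $input.all().map((item) => {
  const { username, ip, timestamp } = item.json;
  const key = `${username}|${ip}`;

  // Keep only attempts inside the sliding window, then record this one.
  const history = (store.attempts[key] || []).filter((t) => now - t < WINDOW_MS);
  history.push(new Date(timestamp || now).getTime());
  store.attempts[key] = history;

  return {
    json: {
      ...item.json,
      totalAttempts: history.length,
      multiple: history.length > 1, // routes to the grouped Jira ticket branch
      attemptList: history.map((t) => new Date(t).toISOString()),
    },
  };
});
```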
by Shayan Ali Bakhsh
About this Workflow This workflow helps you repurpose your YouTube videos across multiple social media platforms with zero manual effort. It’s designed for creators, businesses, and marketers who want to maximize reach without spending hours re-uploading content everywhere. How It Works Trigger from YouTube The workflow checks your YouTube channel every 10 minutes via RSS feed. It compares the latest video ID with the last saved one to detect if a new video was uploaded. Tutorial: How to get YouTube Channel RSS Feed Generate Descriptions with AI Uses Gemini 2.5 Flash to automatically generate fresh, engaging descriptions for your posts. Create Images with ContentDrips ContentDrips offers multiple templates (carousel, single image, branding templates, etc.). The workflow generates a custom promotional image using your video description and thumbnail. Install node: npm install n8n-nodes-contentdrips Docs: ContentDrips Blog Tutorial Publish Across Social Platforms with SocialBu Instead of manually connecting each social media API, this workflow uses SocialBu. From a single connection, you can post to: Facebook, Instagram, TikTok, YouTube, Twitter (X), LinkedIn, Threads, Pinterest, and more. Website: SocialBu Get Real-Time Notifications via Discord After each run, the workflow sends updates to your Discord channel. You’ll know if the upload was successful, or if an error occurred (e.g., API limits). Setup guide: Discord OAuth Credentials Why Use This Workflow? Saves time by automating the entire repurposing process. Ensures consistent branding and visuals across platforms. Works around platform restrictions by leveraging SocialBu’s integrations. Keeps you updated with Discord notifications—no guessing if something failed. Requirements YouTube channel RSS feed link ContentDrips API key, template ID, and branding setup SocialBu account with connected social media platforms Discord credentials (for webhook updates) Need Help? Message me on LinkedIn: Shayan Ali Bakhsh Happy Automation 🚀
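A minimal sketch of the new-video check described under "Trigger from YouTube": compare the newest RSS entry's ID with the last one saved in workflow static data. The field names (guid/link) and the storage mechanism are assumptions; adjust them to how the template actually persists the last video ID.

```javascript
// Sketch: detect whether the newest RSS entry is a new upload.
// Assumes the RSS read step outputs items with a `guid` (or `link`) per video.
const store = $getWorkflowStaticData('global');
const latest = $input.first().json;
const latestId = latest.guid || latest.link;

const isNew = Boolean(latestId) && latestId !== store.lastVideoId;
if (isNew) {
  store.lastVideoId = latestId; // remember it for the next 10-minute run
}

return [{ json: { ...latest, isNew } }];
```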
by WeblineIndia
IPA Size Tracker with Trend Alerts – Automated iOS Apps Size Monitoring This workflow runs on a daily schedule and monitors IPA file sizes from configured URLs. It stores historical size data in Google Sheets, compares current vs. previous builds and sends email alerts only when significant size changes occur (default: ±10%). A DRY_RUN toggle allows safe testing before real notifications go out. Who’s it for iOS developers tracking app binary size growth over time. DevOps teams monitoring build artifacts and deployment sizes. Product managers ensuring app size budgets remain acceptable. QA teams detecting unexpected size changes in release builds. Mobile app teams optimizing user experience by keeping apps lightweight. How it works Schedule Trigger (daily at 09:00 UTC) kicks off the workflow. Configuration: Define monitored apps with {name, version, build, ipa_url}. HTTP Request downloads the IPA file from its URL. Size Calculation: Compute file sizes in bytes, KB, MB and attach timestamp metadata. Google Sheets: Append size data to the IPA Size History sheet. Trend Analysis: Compare current vs. previous build sizes. Alert Logic: Evaluate thresholds (>10% increase or >10% decrease). Email Notification: Send formatted alerts with comparisons and trend indicators. Rate Limit: Space out notifications to avoid spamming recipients. How to set up 1. Spreadsheet Create a Google Sheet with a tab named IPA Size History containing: Date, Timestamp, App_Name, Version, Build_Number, Size_Bytes, Size_KB, Size_MB, IPA_URL 2. Credentials Google Sheets (OAuth)** → for reading/writing size history. Gmail** → for sending alert emails (use App Password if 2FA is enabled). 3. Open “Set: Configuration” node Define your workflow variables: APP_CONFIGS = array of monitored apps ({name, version, build, ipa_url}) SPREADSHEET_ID = Google Sheet ID SHEET_NAME = IPA Size History SMTP_FROM = sender email (e.g., devops@company.com) ALERT_RECIPIENTS = comma-separated emails SIZE_INCREASE_THRESHOLD = 0.10 (10%) SIZE_DECREASE_THRESHOLD = 0.10 (10%) LARGE_APP_WARNING = 300 (MB) SCHEDULE_TIME = 09:00 TIMEZONE = UTC DRY_RUN = false (set true to test without sending emails) 4. File Hosting Host IPA files on Google Drive, Dropbox or a web server. Ensure direct download URLs are used (not preview links). 5. Activate the workflow Once configured, it will run automatically at the scheduled time. Requirements Google Sheet with the IPA Size History tab. Accessible IPA file URLs. SMTP / gmail account (Gmail recommended). n8n (cloud or self-hosted) with Google Sheets + Email nodes. Sufficient local storage for IPA file downloads. How to customize the workflow Multiple apps**: Add more configs to APP_CONFIGS. Thresholds**: Adjust SIZE_INCREASE_THRESHOLD / SIZE_DECREASE_THRESHOLD. Notification templates**: Customize subject/body with variables: {{app_name}}, {{current_size}}, {{previous_size}}, {{change_percent}}, {{trend_status}}. Schedule**: Change Cron from daily to hourly, weekly, etc. Large app warnings**: Adjust LARGE_APP_WARNING. Trend analysis**: Extend beyond one build (7-day, 30-day averages). Storage backend**: Swap Google Sheets for CSV, DB or S3. Add-ons to level up Slack Notifications**: Add Slack webhook alerts with emojis & formatting. Size History Charts**: Generate trend graphs with Chart.js or Google Charts API. Environment separation**: Monitor dev/staging/prod builds separately. Regression detection**: Statistical anomaly checks. Build metadata**: Log bundle ID, SDK versions, architectures. 
Archive management**: Auto-clean old records to save space. Dashboards**: Connect to Grafana, Datadog or custom BI. CI/CD triggers**: Integrate with pipelines via webhook trigger. Common Troubleshooting No size data** → check that URLs return the binary IPA (not an HTML error page). Download failures** → confirm hosting permissions & direct links. Missing alerts** → ensure thresholds & prior history exist. Google Sheets errors** → check sheet/tab names & OAuth credentials. Email issues** → validate SMTP credentials, spam folder, sender reputation. Large file timeouts** → raise the HTTP timeout for >100MB files. Trend errors** → make sure at least 2 builds exist. No runs** → confirm the workflow is active and the timezone is correct. Need Help? If you’d like us to customize this workflow to suit your app development process, simply reach out to us here and we’ll adapt the template to your exact use case.
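For orientation, the size-calculation and trend-comparison steps can be sketched roughly as follows; the previous-size value is assumed to come from the Google Sheets lookup, and the field names are illustrative.

```javascript
// Sketch: compute sizes and the change vs. the previous build, then decide on an alert.
// Assumed inputs per item: download length in bytes plus the previous row from the sheet.
const SIZE_INCREASE_THRESHOLD = 0.10;
const SIZE_DECREASE_THRESHOLD = 0.10;
const LARGE_APP_WARNING_MB = 300;

return $input.all().map((item) => {
  const { app_name, version, build, size_bytes, previous_size_bytes } = item.json;

  const sizeMB = size_bytes / (1024 * 1024);
  const change = previous_size_bytes
    ? (size_bytes - previous_size_bytes) / previous_size_bytes
    : 0;

  const shouldAlert =
    change > SIZE_INCREASE_THRESHOLD ||
    change < -SIZE_DECREASE_THRESHOLD ||
    sizeMB > LARGE_APP_WARNING_MB;

  return {
    json: {
      app_name, version, build,
      size_mb: Number(sizeMB.toFixed(2)),
      change_percent: Number((change * 100).toFixed(2)),
      trend_status: change > 0 ? 'increase' : change < 0 ? 'decrease' : 'unchanged',
      shouldAlert, // drives the email notification branch
    },
  };
});
```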
by vinci-king-01
Customer Support Analysis Dashboard with AI and Automated Insights 🎯 Target Audience Customer support managers and team leads Customer success teams monitoring satisfaction Product managers analyzing user feedback Business analysts measuring support metrics Operations managers optimizing support processes Quality assurance teams monitoring support quality Customer experience (CX) professionals 🚀 Problem Statement Manual analysis of customer support tickets and feedback is time-consuming and often misses critical patterns or emerging issues. This template solves the challenge of automatically collecting, analyzing, and visualizing customer support data to identify trends, improve response times, and enhance overall customer satisfaction. 🔧 How it Works This workflow automatically monitors customer support channels using AI-powered analysis, processes tickets and feedback, and provides actionable insights for improving customer support operations. Key Components Scheduled Trigger - Runs the workflow at specified intervals to maintain real-time monitoring AI-Powered Ticket Analysis - Uses advanced NLP to categorize, prioritize, and analyze support tickets Multi-Channel Integration - Monitors email, chat, help desk systems, and social media Automated Insights - Generates reports on trends, response times, and satisfaction scores Dashboard Integration - Stores all data in Google Sheets for comprehensive analysis and reporting 📊 Google Sheets Column Specifications The template creates the following columns in your Google Sheets: | Column | Data Type | Description | Example | |--------|-----------|-------------|---------| | timestamp | DateTime | When the ticket was processed | "2024-01-15T10:30:00Z" | | ticket_id | String | Unique ticket identifier | "SUP-2024-001234" | | customer_email | String | Customer contact information | "john@example.com" | | subject | String | Ticket subject line | "Login issues with new app" | | description | String | Full ticket description | "I can't log into the mobile app..." | | category | String | AI-categorized ticket type | "Technical Issue" | | priority | String | Calculated priority level | "High" | | sentiment_score | Number | Customer sentiment (-1 to 1) | -0.3 | | urgency_indicator | String | Urgency classification | "Immediate" | | response_time | Number | Time to first response (hours) | 2.5 | | resolution_time | Number | Time to resolution (hours) | 8.0 | | satisfaction_score | Number | Customer satisfaction rating | 4.2 | | agent_assigned | String | Support agent name | "Sarah Johnson" | | status | String | Current ticket status | "Resolved" | 🛠️ Setup Instructions Estimated setup time: 20-25 minutes Prerequisites n8n instance with community nodes enabled ScrapeGraphAI API account and credentials Google Sheets account with API access Help desk system API access (Zendesk, Freshdesk, etc.) Email service integration (optional) Step-by-Step Configuration 1. Install Community Nodes Install required community nodes npm install n8n-nodes-scrapegraphai npm install n8n-nodes-slack 2. Configure ScrapeGraphAI Credentials Navigate to Credentials in your n8n instance Add new ScrapeGraphAI API credentials Enter your API key from ScrapeGraphAI dashboard Test the connection to ensure it's working 3. Set up Google Sheets Connection Add Google Sheets OAuth2 credentials Grant necessary permissions for spreadsheet access Create a new spreadsheet for customer support analysis Configure the sheet name (default: "Support Analysis") 4. 
Configure Support System Integration Update the websiteUrl parameters in ScrapeGraphAI nodes Add URLs for your help desk system or support portal Customize the user prompt to extract specific ticket data Set up categories and priority thresholds 5. Set up Notification Channels Configure Slack webhook or API credentials for alerts Set up email service credentials for critical issues Define alert thresholds for different priority levels Test notification delivery 6. Configure Schedule Trigger Set analysis frequency (hourly, daily, etc.) Choose appropriate time zones for your business hours Consider support system rate limits 7. Test and Validate Run the workflow manually to verify all connections Check Google Sheets for proper data formatting Test ticket analysis with sample data 🔄 Workflow Customization Options Modify Analysis Targets Add or remove support channels (email, chat, social media) Change ticket categories and priority criteria Adjust analysis frequency based on ticket volume Extend Analysis Capabilities Add more sophisticated sentiment analysis Implement customer churn prediction models Include agent performance analytics Add automated response suggestions Customize Alert System Set different thresholds for different ticket types Create tiered alert systems (info, warning, critical) Add SLA breach notifications Include trend analysis alerts Output Customization Add data visualization and reporting features Implement support trend charts and graphs Create executive dashboards with key metrics Add customer satisfaction trend analysis 📈 Use Cases Support Ticket Management**: Automatically categorize and prioritize tickets Response Time Optimization**: Identify bottlenecks in support processes Customer Satisfaction Monitoring**: Track and improve satisfaction scores Agent Performance Analysis**: Monitor and improve agent productivity Product Issue Detection**: Identify recurring problems and feature requests SLA Compliance**: Ensure support teams meet service level agreements 🚨 Important Notes Respect support system API rate limits and terms of service Implement appropriate delays between requests to avoid rate limiting Regularly review and update your analysis parameters Monitor API usage to manage costs effectively Keep your credentials secure and rotate them regularly Consider data privacy and GDPR compliance for customer data 🔧 Troubleshooting Common Issues: ScrapeGraphAI connection errors: Verify API key and account status Google Sheets permission errors: Check OAuth2 scope and permissions Ticket parsing errors: Review the Code node's JavaScript logic Rate limiting: Adjust analysis frequency and implement delays Alert delivery failures: Check notification service credentials Support Resources: ScrapeGraphAI documentation and API reference n8n community forums for workflow assistance Google Sheets API documentation for advanced configurations Help desk system API documentation Customer support analytics best practices
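As a rough illustration of how the analysis step can shape rows for the Google Sheets columns above, the sketch below derives priority and urgency from the sentiment score and simple keyword checks. The thresholds and keyword list are assumptions to replace with your own rules (or with the AI classification output).

```javascript
// Sketch: derive priority/urgency for each ticket before appending it to Google Sheets.
// Assumed input fields: { ticket_id, customer_email, subject, description, sentiment_score }
const URGENT_KEYWORDS = ['outage', 'down', 'cannot log in', 'data loss']; // illustrative

return $input.all().map((item) => {
  const t = item.json;
  const text = `${t.subject || ''} ${t.description || ''}`.toLowerCase();

  const urgent = URGENT_KEYWORDS.some((k) => text.includes(k));
  const negative = (t.sentiment_score ?? 0) < -0.2;

  return {
    json: {
      timestamp: new Date().toISOString(),
      ...t,
      priority: urgent ? 'High' : negative ? 'Medium' : 'Low',
      urgency_indicator: urgent ? 'Immediate' : 'Standard',
    },
  };
});
```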
by vinci-king-01
Competitor Price Monitoring Dashboard with AI and Real-time Alerts 🎯 Target Audience E-commerce managers and pricing analysts Retail business owners monitoring competitor pricing Marketing teams tracking market positioning Product managers analyzing competitive landscape Data analysts conducting pricing intelligence Business strategists making pricing decisions 🚀 Problem Statement Manual competitor price monitoring is inefficient and often leads to missed opportunities or delayed responses to market changes. This template solves the challenge of automatically tracking competitor prices, detecting significant changes, and providing actionable insights for strategic pricing decisions. 🔧 How it Works This workflow automatically monitors competitor product prices using AI-powered web scraping, analyzes price trends, and sends real-time alerts when significant changes are detected. Key Components Scheduled Trigger - Runs the workflow at specified intervals to maintain up-to-date price data AI-Powered Scraping - Uses ScrapeGraphAI to intelligently extract pricing information from competitor websites Price Analysis Engine - Processes historical data to detect trends and anomalies Alert System - Sends notifications via Slack and email when price changes exceed thresholds Dashboard Integration - Stores all data in Google Sheets for comprehensive analysis and reporting 📊 Google Sheets Column Specifications The template creates the following columns in your Google Sheets: | Column | Data Type | Description | Example | |--------|-----------|-------------|---------| | timestamp | DateTime | When the price was recorded | "2024-01-15T10:30:00Z" | | competitor_name | String | Name of the competitor | "Amazon" | | product_name | String | Product name and model | "iPhone 15 Pro 128GB" | | current_price | Number | Current price in USD | 999.00 | | previous_price | Number | Previous recorded price | 1099.00 | | price_change | Number | Absolute price difference | -100.00 | | price_change_percent | Number | Percentage change | -9.09 | | product_url | URL | Direct link to product page | "https://amazon.com/iphone15" | | alert_triggered | Boolean | Whether alert was sent | true | | trend_direction | String | Price trend analysis | "Decreasing" | 🛠️ Setup Instructions Estimated setup time: 15-20 minutes Prerequisites n8n instance with community nodes enabled ScrapeGraphAI API account and credentials Google Sheets account with API access Slack workspace for notifications (optional) Email service for alerts (optional) Step-by-Step Configuration 1. Install Community Nodes Install required community nodes npm install n8n-nodes-scrapegraphai npm install n8n-nodes-slack 2. Configure ScrapeGraphAI Credentials Navigate to Credentials in your n8n instance Add new ScrapeGraphAI API credentials Enter your API key from ScrapeGraphAI dashboard Test the connection to ensure it's working 3. Set up Google Sheets Connection Add Google Sheets OAuth2 credentials Grant necessary permissions for spreadsheet access Create a new spreadsheet for price monitoring data Configure the sheet name (default: "Price Monitoring") 4. Configure Competitor URLs Update the websiteUrl parameters in ScrapeGraphAI nodes Add URLs for each competitor you want to monitor Customize the user prompt to extract specific pricing data Set appropriate price thresholds for alerts 5. Set up Notification Channels Configure Slack webhook or API credentials Set up email service credentials (SendGrid, SMTP, etc.) 
Define alert thresholds and notification preferences Test notification delivery 6. Configure Schedule Trigger Set monitoring frequency (hourly, daily, etc.) Choose appropriate time zones for your business hours Consider competitor website rate limits 7. Test and Validate Run the workflow manually to verify all connections Check Google Sheets for proper data formatting Test alert notifications with sample data 🔄 Workflow Customization Options Modify Monitoring Targets Add or remove competitor websites Change product categories or specific products Adjust monitoring frequency based on market volatility Extend Price Analysis Add more sophisticated trend analysis algorithms Implement price prediction models Include competitor inventory and availability tracking Customize Alert System Set different thresholds for different product categories Create tiered alert systems (info, warning, critical) Add SMS notifications for urgent price changes Output Customization Add data visualization and reporting features Implement price history charts and graphs Create executive dashboards with key metrics 📈 Use Cases Dynamic Pricing**: Adjust your prices based on competitor movements Market Intelligence**: Understand competitor pricing strategies Promotion Planning**: Time your promotions based on competitor actions Inventory Management**: Optimize stock levels based on market conditions Customer Communication**: Proactively inform customers about price changes 🚨 Important Notes Respect competitor websites' terms of service and robots.txt Implement appropriate delays between requests to avoid rate limiting Regularly review and update your monitoring parameters Monitor API usage to manage costs effectively Keep your credentials secure and rotate them regularly Consider legal implications of automated price monitoring 🔧 Troubleshooting Common Issues: ScrapeGraphAI connection errors: Verify API key and account status Google Sheets permission errors: Check OAuth2 scope and permissions Price parsing errors: Review the Code node's JavaScript logic Rate limiting: Adjust monitoring frequency and implement delays Alert delivery failures: Check notification service credentials Support Resources: ScrapeGraphAI documentation and API reference n8n community forums for workflow assistance Google Sheets API documentation for advanced configurations Slack API documentation for notification setup
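A minimal sketch of the price-analysis step — computing the change against the previously recorded price and deciding whether an alert fires. The 10% threshold and field names are illustrative; the previous price is assumed to come from the Google Sheets history.

```javascript
// Sketch: compare the scraped price with the last recorded one and flag alerts.
const ALERT_THRESHOLD_PERCENT = 10; // illustrative default

return $input.all().map((item) => {
  const { competitor_name, product_name, product_url, current_price, previous_price } = item.json;

  const priceChange = previous_price != null ? current_price - previous_price : 0;
  const changePercent = previous_price ? (priceChange / previous_price) * 100 : 0;

  return {
    json: {
      timestamp: new Date().toISOString(),
      competitor_name,
      product_name,
      product_url,
      current_price,
      previous_price,
      price_change: Number(priceChange.toFixed(2)),
      price_change_percent: Number(changePercent.toFixed(2)),
      trend_direction: priceChange < 0 ? 'Decreasing' : priceChange > 0 ? 'Increasing' : 'Stable',
      alert_triggered: Math.abs(changePercent) >= ALERT_THRESHOLD_PERCENT,
    },
  };
});
```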
by DataMinex
Transform property searches into personalized experiences! This powerful automation delivers dream home matches straight to clients' inboxes with professional CSV reports - all from a simple web form. 🚀 What this workflow does Create a complete real estate search experience that works 24/7: ✨ Smart Web Form - Beautiful property search form captures client preferences 🧠 Dynamic SQL Builder - Intelligently creates optimized queries from user input ⚡ Lightning Database Search - Scans 1000+ properties in milliseconds 📊 Professional CSV Export - Excel-ready reports with complete property details 📧 Automated Email Delivery - Personalized emails with property previews and attachments 🎯 Perfect for: Real Estate Agents** - Generate leads and impress clients with instant service Property Managers** - Automate tenant matching and recommendations Brokerages** - Provide 24/7 self-service property discovery Developers** - Showcase available properties with professional automation 💡 Why this workflow is a game-changer > "From property search to professional report delivery in under 30 seconds!" ⚡ Instant Results: Zero wait time for property matches 🎨 Professional Output: Beautiful emails that showcase your expertise 📱 Mobile Optimized: Works flawlessly on all devices 🧠 Smart Filtering: Only searches criteria clients actually specify 📈 Infinitely Scalable: Handles unlimited searches simultaneously 📊 Real Estate Data Source Built on authentic US market data from GitHub: 🏘️ 1000+ Real Properties across all US states 💰 Actual Market Prices from legitimate listings 🏠 Complete Property Details (bedrooms, bathrooms, square footage, lot size) 📍 Verified Locations with accurate cities, states, and ZIP codes 🏢 Broker Information for authentic real estate context 🛠️ Quick Setup Guide Prerequisites Checklist ✅ [ ] SQL Server database (MySQL/PostgreSQL also supported) [ ] Gmail account for automated emails [ ] n8n instance (cloud or self-hosted) [ ] 20 minutes setup time Step 1: Import Real Estate Data 📥 🌟 Download the data 💾 Download CSV file (1000+ properties included) 🗄️ Create SQL Server table with this exact schema: CREATE TABLE [REALTOR].[dbo].[realtor_usa_price] ( brokered_by BIGINT, status NVARCHAR(50), price DECIMAL(12,2), bed INT, bath DECIMAL(3,1), acre_lot DECIMAL(10,8), street BIGINT, city NVARCHAR(100), state NVARCHAR(50), zip_code INT, house_size INT, prev_sold_date NVARCHAR(50) ); 📊 Import your CSV data into this table Step 2: Configure Database Connection 🔗 🔐 Set up Microsoft SQL Server credentials in n8n ✅ Test connection to ensure everything works 🎯 Workflow is pre-configured for the table structure above Step 3: Gmail Setup (The Magic Touch) 📧 🌐 Visit Google Cloud Console 🆕 Create new project (or use existing) 🔓 Enable Gmail API in API Library 🔑 Create OAuth2 credentials (Web Application) ⚙️ Add your n8n callback URL to authorized redirects 🔗 Configure Gmail OAuth2 credentials in n8n ✨ Authorize your Google account Step 4: Launch Your Property Search Portal 🚀 📋 Import this workflow template (form is pre-configured) 🌍 Copy your webhook URL from the Property Search Form node 🔍 Test with a sample property search 📨 Check email delivery with CSV attachment 🎉 Go live and start impressing clients! 🎨 Customization Playground 🏷️ Personalize Your Brand // Customize email subjects in the Gmail node `🏠 Exclusive Properties Curated Just for You - ${results.length} Perfect Matches!` `✨ Your Dream Home Portfolio - Handpicked by Our Experts` `🎯 Hot Market Alert - ${results.length} Premium Properties Inside!`
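Beyond branding, the Dynamic SQL Builder mentioned in the overview can be sketched as a Code node that assembles a query from only the criteria the client actually filled in. The form field names here are assumptions, and the escaping shown is minimal — prefer parameterized queries where your database node supports them.

```javascript
// Sketch: build a SQL Server query from whichever form fields the client filled in.
// Field names (minPrice, maxPrice, beds, city, state) are assumptions about the form.
const f = $input.first().json;
const where = [];

if (f.minPrice) where.push(`price >= ${Number(f.minPrice)}`);
if (f.maxPrice) where.push(`price <= ${Number(f.maxPrice)}`);
if (f.beds)     where.push(`bed >= ${Number(f.beds)}`);
if (f.city)     where.push(`city = '${String(f.city).replace(/'/g, "''")}'`);   // basic escaping only
if (f.state)    where.push(`state = '${String(f.state).replace(/'/g, "''")}'`);

const query = `
  SELECT TOP 50 *
  FROM [REALTOR].[dbo].[realtor_usa_price]
  ${where.length ? 'WHERE ' + where.join(' AND ') : ''}
  ORDER BY price ASC`;

return [{ json: { query } }];
```

The downstream Microsoft SQL node can then reference {{ $json.query }} in its query field.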
🔧 Advanced Enhancements 🎨 HTML Email Templates**: Create stunning visual emails with property images 📊 Analytics Dashboard**: Track popular searches and user engagement 🔔 Smart Alerts**: Set up automated price drop notifications 📱 Mobile Integration**: Connect to React Native or Flutter apps 🤖 AI Descriptions**: Add ChatGPT for compelling property descriptions 🌍 Multi-Database Flexibility // Easy database switching // MySQL: Replace Microsoft SQL node → MySQL node // PostgreSQL: Swap for PostgreSQL node // MongoDB: Use MongoDB node with JSON queries // Even CSV files: Use CSV reading nodes for smaller datasets 🚀 Advanced Features & Extensions 🔥 Pro Tips for Power Users 🔄 Bulk Processing**: Handle multiple searches simultaneously 💾 Smart Caching**: Store popular searches for lightning-fast results 📈 Lead Scoring**: Track which properties generate most interest 📅 Follow-up Automation**: Schedule nurturing email sequences 🎯 Integration Possibilities 🏢 CRM Connection**: Auto-add qualified leads to your CRM 📅 Calendar Integration**: Add property viewing scheduling 📊 Price Monitoring**: Track market trends and price changes 📱 Social Media**: Auto-share featured properties to social platforms 💬 Chat Integration**: Connect to WhatsApp or SMS for instant alerts 🔗 Expand Your Real Estate Automation 🌟 Related Workflow Ideas 🤖 AI Property Valuation - Add machine learning for price predictions 📊 Market Analysis Reports - Generate comprehensive market insights 📱 SMS Property Alerts - Instant text notifications for hot properties 🏢 Commercial Property Search - Adapt for office and retail spaces 💹 Investment ROI Calculator - Add financial analysis for investors 🏘️ Neighborhood Analytics - Include school ratings and demographics 🛠️ Technical Extensions 📷 Image Processing: Auto-resize and optimize property photos 🗺️ Map Integration: Add interactive property location maps 📱 Progressive Web App: Create mobile app experience 🔔 Push Notifications: Real-time alerts for saved searches 🚀 Get Started Now Import this workflow template Configure your database and Gmail Customize branding and messaging Launch your professional property search portal Watch client satisfaction soar!
by AOE Agent Lab
This n8n template demonstrates how to audit your brand’s visibility across multiple AI systems and automatically log the results to Google Sheets. It sends the same prompt to OpenAI, Perplexity, and (optionally) a ChatGPT web actor, then runs sentiment and brand-hierarchy analysis on the responses. Use cases are many: benchmark how often (and how positively) your brand appears in AI answers, compare responses across models, and build a repeatable “AI visibility” report for marketing and comms teams. 💡 Good to know You’ll bring your own API keys for OpenAI and Perplexity. Usage costs depend on your providers’ pricing. The optional APIfy actor automates the ChatGPT web UI and may violate terms of service. Use strictly at your own risk. ⁉ How it works A Manual Trigger starts the workflow (you can replace it with any trigger). Input prompts are read from a Google Sheet (or you can use the included “manual input” node). The prompt is sent to three tools: -- OpenAI (via API) to check baseline LLM knowledge. -- Perplexity (API) to retrieve an answer with citations. -- Optionally, an APIfy actor that scrapes a ChatGPT response (web interface). Responses are normalized and mapped (including citations where available). An LLM-powered sentiment pass classifies each response into: -- Basic Polarity: Positive, Neutral, or Negative -- Emotion Category: Joy, Sadness, Anger, Fear, Disgust, or Surprise -- Brand Hierarchy: ordered list such as Nike>Adidas>Puma The consolidated record (Prompt, LLM, Response, Brand mentioned flag, Brand Hierarchy, Basic Polarity, Emotion Category, Source 1–3/4) is appended to your “Output many models” Google Sheet. A simplified branch shows how to take a single response and push it to a separate sheet. 🗺️ How to use Connect your Google Sheets OAuth and create two tabs: -- Input: a single “Prompt” column -- Output: columns for Prompt, LLM, Response, Brand mentioned, Brand Hierarchy, Basic Polarity, Emotion Category, Source 1, Source 2, Source 3, Source 4 Add your OpenAI and Perplexity credentials. (Optional) Add an APIfy credential (Query Auth with token) if you want the ChatGPT web actor path. Run the Manual Trigger to process prompts in batches and write results to Sheets. Adjust the included “Limit for testing” node or remove it to process more rows. ⚒️ Requirements OpenAI API access (e.g., GPT-4.1-mini / GPT-5 as configured in the template) Perplexity API access (model: sonar) Google Sheets account with OAuth connected in n8n (Optional) APIfy account/token for the ChatGPT web actor 🎨 Customising this workflow Swap the Manual Trigger for a webhook or schedule to run audits automatically. Extend the sentiment analyzer instructions to include brand-specific rules or compliance checks. Track more sources (e.g., additional models or vertical search tools) by duplicating the request→map→append pattern. Add scoring (e.g., “visibility score” per prompt) and charts by pointing the output sheet into Looker Studio or a BI tool.
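For clarity, one consolidated record appended to the "Output many models" sheet might look roughly like this — all values below are illustrative placeholders, not output from the template:

```javascript
// Illustrative example of a single consolidated row written to the output sheet.
const exampleRow = {
  Prompt: 'What are the best running shoe brands?',
  LLM: 'Perplexity (sonar)',
  Response: 'Popular options include Nike, Adidas and Puma...',
  'Brand mentioned': true,
  'Brand Hierarchy': 'Nike>Adidas>Puma',
  'Basic Polarity': 'Positive',
  'Emotion Category': 'Joy',
  'Source 1': 'https://example.com/review',
  'Source 2': '',
  'Source 3': '',
  'Source 4': '',
};
```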
by Jitesh Dugar
Verified Gym Trial Pass with Photo ID Overview Automate gym trial pass generation with email verification, photo ID integration, QR codes, and professional PDF passes. This workflow handles the complete member onboarding process - from signup to verified pass delivery - in under 10 seconds. What This Workflow Does Receives signup data via webhook (name, email, photo URL, validity dates) Verifies email authenticity using VerifiEmail API (blocks disposable emails) Generates unique Pass ID in format GYM-{timestamp} Creates QR code for quick check-in at gym entrance Builds branded pass design with gradient styling, member photo, and validity dates Exports to PDF format for mobile-friendly viewing Sends email with PDF attachment and welcome message Logs all registrations in Google Sheets for record-keeping Returns API response with complete pass details Key Features ✅ Email Verification - Blocks fake and disposable email addresses ✅ Photo ID Integration - Displays member photo on digital pass ✅ QR Code Generation - Instant check-in scanning capability ✅ Professional Design - Gradient purple design with modern styling ✅ PDF Export - Mobile-friendly format that members can save ✅ Automated Emails - Welcome message with pass attachment ✅ Spreadsheet Logging - Automatic record-keeping in Google Sheets ✅ Error Handling - Proper 400 responses for invalid signups ✅ Success Responses - Detailed JSON with all pass information Use Cases Gyms & Fitness Centers** - Trial pass management for new members Yoga Studios** - Week-long trial class passes Sports Clubs** - Guest pass generation with photo verification Wellness Centers** - Temporary access cards for trial periods Co-working Spaces** - Day pass generation with member photos Swimming Pools** - Verified trial memberships with photo IDs What You Need Required Credentials VerifiEmail API - Email verification service Get API key: https://verifi.email HTMLCSSToImage API - PNG image generation Get credentials: https://htmlcsstoimg.com HTMLCSSToPDF API - PDF conversion Get credentials: https://pdfmunk.com Gmail OAuth2 - Email delivery Connect your Google account Enable Gmail API in Google Cloud Console Google Sheets API - Data logging Connect your Google account Same OAuth2 as Gmail Setup Instructions Step 1: Create Google Sheet Create a new Google Sheet named "Gym Trial Passes 2025" Add these column headers in Row 1: Pass ID Name Email Start Date Valid Till Issued At Email Verified Status Step 2: Configure Credentials Add VerifiEmail API credentials Add HTMLCSSToImage credentials Add HTMLCSSToPDF credentials Connect Gmail OAuth2 Connect Google Sheets OAuth2 Step 3: Update Google Sheets Node Open "Log to Google Sheets" node Select your "Gym Trial Passes 2025" sheet Confirm column mappings match your headers Step 4: Test the Workflow Copy the webhook URL from the Webhook node Open Postman and create a POST request Use this test payload: { "name": "Rahul Sharma", "email": "your-email@gmail.com", "photo_url": "https://images.unsplash.com/photo-1633332755192-727a05c4013d?w=400", "start_date": "2025-11-15", "valid_till": "2025-11-22" } Send the request and check: ✅ Email received with PDF pass ✅ Google Sheet updated with new row ✅ Success JSON response returned Step 5: Activate & Use Click "Active" toggle to enable the workflow Integrate webhook URL with your gym's website form Members receive instant verified passes upon signup Expected Responses ✅ Success Response (200 OK) { "status": "success", "message": "Gym trial pass verified and sent successfully! 
**Expected Responses**

✅ Success Response (200 OK)

```json
{
  "status": "success",
  "message": "Gym trial pass verified and sent successfully! 🎉",
  "data": {
    "pass_id": "GYM-1731398400123",
    "email": "member@example.com",
    "name": "Rahul Sharma",
    "valid_from": "November 15, 2025",
    "valid_till": "November 22, 2025",
    "email_verified": true,
    "recorded_in_sheets": true,
    "pass_sent_to_email": true
  },
  "timestamp": "2025-11-12T10:30:45.123Z"
}
```

❌ Error Response (400 Bad Request)

```json
{
  "status": "error",
  "message": "Invalid or disposable email address. Please use a valid email to register.",
  "email_verified": false,
  "email_provided": "test@tempmail.com"
}
```

**Customization Options**

*Modify Pass Design* - Edit the "Build HTML Pass" node to customize:
- Colors and gradient (currently a purple gradient)
- Layout and spacing
- Fonts and typography
- Logo placement (add your gym logo)
- Additional branding elements

*Change Email Template* - Edit the "Send Email with Pass" node to modify:
- Subject line
- Welcome message
- Instructions
- Branding elements
- Footer content

*Adjust Validity Period* - The workflow accepts custom start_date and valid_till values from the webhook payload. You can also hardcode validity periods in the "Generate Pass Details" node.

*Add Additional Fields* - Extend the workflow to capture:
- Phone number
- Emergency contact
- Medical conditions
- Membership preferences
- Referral source

**Performance**
- **Average execution time**: 8-12 seconds
- **Handles**: 100+ passes per hour
- **PDF size**: ~150-250 KB
- **Email delivery**: Instant (Gmail API)
- **Success rate**: 99%+ with valid emails

**Security & Privacy**
- ✅ Email verification prevents fake signups
- ✅ Unique Pass IDs prevent duplication
- ✅ All data logged in your private Google Sheet
- ✅ No data stored in n8n (passes through only)
- ✅ HTTPS webhook for secure data transmission
- ✅ OAuth2 authentication for Google services

**Tags**
gym, fitness, trial-pass, email-verification, qr-code, pdf-generation, member-onboarding, automation, verification, photo-id
by Typhoon Team
This n8n template demonstrates how to use Typhoon OCR + LLM to digitize business cards, enrich the extracted details, and save them directly into Google Sheets or any CRM. It works with both Thai and English business cards and even includes an optional step to draft greeting emails automatically.

Use cases: Automatically capture leads at events, enrich contact details before saving them into your CRM, or simply keep a structured database of your professional network.

**Good to know**
- Two versions of the workflow are provided:
  - 🟢 Without Search API → cost-free option using only Typhoon OCR + LLM
  - 🔵 With Search API → adds Google Search enrichment for richer profiles (may incur API costs via SerpAPI)
- The Send Email step is optional — include it if you want to follow up instantly, or disable it if not needed.
- Typhoon provides a free API for anyone to sign up and use → opentyphoon.ai

**How it works**
- A form submission triggers the workflow with a business card image (JPG/PNG).
- Typhoon OCR extracts text from the card (supports Thai & English).
- Typhoon LLM parses the extracted text into structured JSON fields (e.g., name, job title, organization, email); an illustrative record is shown at the end of this template description.
- Depending on your chosen path:
  - Version 1: Typhoon LLM enriches the record with job type, level, and sector.
  - Version 2: The workflow calls the Search API (via SerpAPI) to add a profile/company summary.
- The cleaned and enriched contact is saved to Google Sheets (can be swapped with your preferred CRM or database).
- (Optional) Typhoon LLM drafts a short, friendly greeting email, which can be sent automatically via Gmail.

**How to use**
- The included form trigger is just one example. You can replace it with:
  - A webhook for uploads
  - A file drop in cloud storage
  - Or even a manual trigger for testing
- You can easily change the destination from Google Sheets to HubSpot, Notion, Airtable, or Salesforce.
- The enrichment prompt is customizable — adjust it to classify contacts based on your organization's needs.

**Requirements**
- Typhoon API key
- Google Sheets API credentials + a prepared spreadsheet
- (Optional) Gmail API credentials for sending emails
- (Optional) SerpAPI key for the Search API enrichment path

**Customising this workflow**
This AI-powered business card reader can be adapted to many scenarios:
- Event lead capture: Collect cards at conferences and sync them to your CRM automatically.
- Sales enablement: Draft instant greeting emails for new contacts.
- Networking: Keep a clean and enriched database of your professional connections.
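For illustration only, a parsed and enriched contact record from the steps above might look like the example below. The name, job title, organization, and email keys come from the template description; the remaining keys are assumptions, since the exact schema depends on your prompt and spreadsheet columns.

```json
{
  "name": "Somsak Rattanakul",
  "job_title": "Head of Partnerships",
  "organization": "Example Co., Ltd.",
  "email": "somsak@example.co.th",
  "job_type": "Business Development",
  "level": "Management",
  "sector": "Technology"
}
```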
by Anshul Chauhan
Automate Your Life: The Ultimate AI Assistant in Telegram (Powered by Google Gemini)

Transform your Telegram messenger into a powerful, multi-modal personal or team assistant. This n8n workflow creates an intelligent agent that can understand text, voice, images, and documents, and take action by connecting to your favorite tools like Google Calendar, Gmail, Todoist, and more.

At its core, a powerful Manager Agent, driven by Google Gemini, interprets your requests, orchestrates a team of specialized sub-agents, and delivers a coherent, final response, all while maintaining a persistent memory of your conversations.

**Key Features**
- 🧠 Intelligent Automation: Uses Google Gemini as a central "Manager Agent" to understand complex requests and delegate tasks to the appropriate tool.
- 🗣️ Multi-Modal Input: Interact naturally by sending text, voice notes, photos, or documents directly into your Telegram chat.
- 🔌 Integrated Toolset: Comes pre-configured with agents to manage your memory, tasks, emails, calendar, research, and project sheets.
- 🗂️ Persistent Memory: Leverages Airtable as a knowledge base, allowing the assistant to save and recall personal details, company information, or past conversations for context-rich interactions.
- ⚙️ Smart Routing: Automatically detects the type of message you send and routes it through the correct processing pipeline (e.g., voice is transcribed, images are analyzed).
- 🔄 Conversational Context: Utilizes a window buffer to maintain short-term memory, ensuring follow-up questions and commands are understood within the current conversation.

**How It Works**
1. The Telegram Trigger node acts as the entry point, receiving all incoming messages (text, voice, photo, document).
2. A Switch node routes the message based on its type (see the routing sketch after this list):
   - **Voice**: The audio file is downloaded and transcribed into text using a voice-to-text service.
   - **Photo**: The image is downloaded, converted to a base64 string, and prepared for visual analysis.
   - **Document**: The file is routed to a document handler that extracts its text content for processing.
   - **Text**: The message is used as-is.
3. A Merge node gathers the processed input into a unified prompt.
4. The Manager Agent receives this prompt, analyzes the user's intent, and orchestrates one or more specialized agents/tools:
   - memory_base (Airtable): Saves and retrieves information from your long-term knowledge base.
   - todo_and_task_manager (Todoist): Creates, assigns, or checks tasks.
   - email_agent (Gmail): Composes, searches, or sends emails.
   - calendar_agent (Google Calendar): Schedules events or checks your agenda.
   - research_agent (Wikipedia/Web Search): Looks up information.
   - project_management (Google Sheets): Provides updates on project trackers.
5. After executing the required tasks, the Manager Agent formulates a final response and sends it back to you via the Telegram node.
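The Switch node's routing decision can be pictured as a simple check on which field the incoming Telegram message carries. The sketch below is a plain-JavaScript illustration of that logic, assuming the standard Telegram Bot API message fields (message.voice, message.photo, message.document, message.text); it is not the node's literal configuration.

```javascript
// Illustrative routing logic, mirroring what the Switch node decides.
// `message` is assumed to be the Telegram message object from the trigger.
function routeMessage(message) {
  if (message.voice) return 'voice';       // audio note -> transcription branch
  if (message.photo) return 'photo';       // image -> base64 + visual analysis branch
  if (message.document) return 'document'; // file -> text-extraction branch
  if (message.text) return 'text';         // plain text -> used as-is
  return 'unsupported';                    // anything else is ignored or rejected
}

// Example: a voice note is routed to the transcription branch.
console.log(routeMessage({ voice: { file_id: 'abc123' } })); // "voice"
```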
**Setup Instructions**
Follow these steps to get your AI assistant up and running.

1. Telegram Bot:
   - Create a new bot using the BotFather in Telegram to get your Bot Token.
   - In the n8n workflow, configure the Telegram Trigger node's webhook.
   - Add your Bot Token to the credentials in all Telegram nodes.
   - For proactive messages, replace the chatId placeholders with your personal Telegram Chat ID.
2. Google Gemini AI:
   - In the Google Gemini nodes, add your credentials by providing your Google Gemini API key.
3. Airtable Knowledge Base:
   - Set up an Airtable base to act as your assistant's long-term memory.
   - In the memory_base nodes (Airtable nodes), configure the credentials and provide the Base ID and Table ID.
4. Google Workspace APIs:
   - Connect your Google account credentials for Gmail, Google Calendar, and Google Sheets.
   - In the relevant nodes, specify the Document/Sheet IDs you want the assistant to manage.
5. Connect Other Tools:
   - Add your credentials for Todoist and any other integrated tool APIs.
6. Configure Conversational Memory:
   - This workflow is designed for multi-user support. Verify that the Session Key in the "Window Buffer Memory" nodes is set to a unique user identifier from Telegram (e.g., {{ $json.chat.id }}). This keeps conversations from different users separate.
7. Review Schedule Triggers:
   - Check any nodes designed to run on a schedule (e.g., "At a regular time"). Adjust their cron expressions, times, and timezone to fit your needs (e.g., for daily summaries).
8. Test the Workflow:
   - Activate the workflow.
   - Send a text message to your bot (e.g., "Hello!").

**Estimated Setup Time**
- **30–60 minutes:** If you already have your API keys, account credentials, and service IDs (like Sheet IDs) ready.
- **2–3 hours:** For a complete, first-time setup, which includes creating API keys, setting up new spreadsheets or Airtable bases, and configuring detailed permissions.