by Avkash Kakdiya
How it works

This workflow runs on a schedule and retrieves newly created HubSpot contacts from the past 24 hours. It processes each contact individually and generates a tailored marketing campaign using an AI model. The generated content is formatted into a clean HTML structure. Finally, a personalized email is sent to each contact with their campaign strategy.

Step-by-step

**Trigger and fetch contacts**
- Schedule Trigger – Starts the workflow at defined intervals.
- Search contacts – Retrieves contacts created in the last 24 hours from HubSpot.

**Process and generate campaign**
- Loop Over Contacts – Splits contacts into individual items for processing.
- AI Agent – Generates a personalized marketing campaign strategy.
- Groq Chat Model – Sub-node providing the AI model for content generation.
- Format AI's output – Converts AI text into HTML-friendly format.
- Send a message – Sends the personalized email to each contact.

Why use this?
- Automates personalized outreach for every new lead
- Delivers instant value with AI-generated campaign strategies
- Reduces manual marketing effort and response time
- Improves engagement through highly tailored messaging
- Easily scalable and customizable for different business needs
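The "Format AI's output" step can be sketched as a small Code-node function. This is an illustrative sketch, not the template's exact code: the `escapeHtml` helper and the assumption that the AI returns plain text with blank-line paragraph breaks are both hypothetical.

```javascript
// Hypothetical sketch of the "Format AI's output" Code node:
// wrap the AI's plain-text campaign into simple HTML paragraphs.
function escapeHtml(text) {
  return text
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
}

function toHtml(aiText) {
  // Split on blank lines into paragraphs, escape, and wrap in <p> tags.
  return aiText
    .split(/\n\s*\n/)
    .map((p) => `<p>${escapeHtml(p.trim())}</p>`)
    .join("\n");
}

// In an n8n Code node this would typically end with something like:
// return items.map(item => ({ json: { html: toHtml(item.json.output) } }));
console.log(toHtml("Campaign idea <draft>\n\nSecond paragraph"));
```

Escaping before wrapping keeps any angle brackets in the AI text from breaking the email's HTML structure.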
by isaWOW
Automatically scan every PBN site in your Google Sheet, check if any removed client domain is still linked in the live HTML, and log all matches back into your tracking sheet — row by row, hands-free.

What this workflow does

Managing a network of PBN sites becomes risky the moment a client project gets offboarded. If the old link still exists on a PBN page, it can trigger a Google penalty, waste crawl budget, or simply point to a dead destination. Manually checking each PBN every time a project is removed is slow and error-prone.

This workflow solves that completely. It reads your full PBN list from Google Sheets, skips any site you have already checked, then loops through each remaining PBN one at a time. For every site, it fetches the live HTML using an HTTP request, pulls your current list of offboarded project domains from a separate sheet, and runs a domain-matching check directly against that HTML. If a match is found, the workflow writes the detected domain into the "Offboarded Links" column automatically. It then pauses briefly and moves to the next PBN — no manual work required at any stage.

Perfect for SEO agencies and link builders who manage multiple PBN networks and need a reliable, automatic way to keep their offboarding records up to date.

Key features

- Automatic HTML scanning: Fetches the live page of every PBN site in real time using HTTP requests, so you are always checking the current version of the page — not a cached or outdated copy.
- Smart row filtering: Skips all PBN rows that have already been processed. The workflow only picks up rows where the "Offboarded Links" column is still empty, saving time and avoiding duplicate checks.
- Domain matching against offboarded projects: Pulls the complete list of removed client domains from your "offboard projects" sheet and checks each one against the fetched HTML. If any domain appears anywhere in the page source, it is flagged immediately.
- One-by-one loop with pause: Processes each PBN individually with a short pause between each iteration. This keeps the workflow stable, avoids rate-limit issues, and makes it easy to monitor progress in real time.
- Auto-update in Google Sheets: Writes the matched domain directly into your PBNs tracking sheet as soon as it is detected. No copy-paste, no manual entry — your records stay current automatically.

How it works

- Trigger the workflow manually — Click the execute button to start the scan whenever you need to run a check across your PBN network.
- Read all PBN sites from Google Sheets — The workflow pulls every row from the "PBNs" sheet, which contains your site URLs, row numbers, and the "Offboarded Links" column.
- Filter out already-processed rows — A code node scans through all rows and removes any row where "Offboarded Links" is already filled in. Only unprocessed PBNs move forward.
- Loop through each PBN one by one — The remaining rows enter a loop. Each PBN is handled individually, one after the other, to keep the process clean and trackable.
- Fetch the live HTML of the PBN site — An HTTP request node sends a GET call to the PBN's URL and retrieves the full page HTML. Retries are enabled in case of temporary failures.
- Read the offboarded project domain list — A separate Google Sheets node pulls all domains from Column A of the "offboard projects" sheet. This is your master list of removed client websites.
- Match domains against the fetched HTML — A code node compares every domain from the offboarded list against the HTML content. If any domain is found, it is returned as a match. If none are found, the result is set to zero.
- Prepare the data for the sheet update — A Set node organizes the matched domain, the row number, and the HTML into a clean payload that is ready to write back into Google Sheets.
- Write the matched domain into the PBNs sheet — The workflow updates the correct row in the "PBNs" sheet by matching the row number and filling in the "Offboarded Links" column with the detected domain.
- Pause before the next iteration — A short wait is added after each update. This gives the system time to settle and prevents any issues before the loop continues to the next PBN.

Setup requirements

Tools you will need:
- Active n8n instance (self-hosted or n8n Cloud)
- Google Sheets with OAuth access for reading and updating PBN data
- A "PBNs" sheet containing your site URLs and tracking columns
- A separate "offboard projects" sheet with removed client domains in Column A

Estimated setup time: 10–15 minutes

Configuration steps

1. Add credentials in n8n: Google Sheets OAuth API — used for reading the PBNs sheet and the offboard projects sheet, and for writing matched results back.
2. Set up the PBNs sheet. Create a Google Sheet with the following columns:
   - **Site URL** — The full domain of each PBN site (e.g., example.com)
   - **Offboarded Links** — Leave this blank initially. The workflow will auto-fill this column when a match is found.
   - **row_number** — A unique number for each row. This is used by the workflow to identify and update the correct row. You can add this manually or use a simple formula.
3. Set up the offboard projects sheet. Create a second sheet (or a new tab) named "offboard projects" with the following structure:
   - **Column A** — List every client domain that has been offboarded or removed. Include the full URL format (e.g., https://clientsite.com). The workflow reads this column directly.
4. Update the Google Sheet URL in the workflow: Open the "Read PBN Sites from Sheet" node and paste your Google Sheet URL. Open the "Read Offboarded Project Domains" node and make sure the sheet tab is set to your "offboard projects" sheet. Open the "Write Matched Domain to PBNs Sheet" node and confirm it is pointing to the correct PBNs sheet and tab.
5. Run the workflow: Click the manual trigger to start. Watch the loop process each PBN one by one. Check your PBNs sheet — any matched offboarded domains will appear in the "Offboarded Links" column automatically.

Use cases

- SEO agencies: Quickly audit the entire PBN network after a client project is offboarded. Instead of checking 50+ sites manually, run this workflow once and get a complete report in your Google Sheet within minutes.
- Link builders: Keep your PBN health records accurate at all times. Every time a project is removed, run this workflow to confirm whether any PBN still carries the old link before it causes issues.
- Freelance SEO consultants: Offer a professional offboarding audit as part of your service. This workflow handles the technical scanning, and you just need to review the results and take action on any flagged links.
- In-house SEO teams: Automate a task that used to take hours of manual checking. Run the scan on a regular schedule or whenever a project goes offline, and trust that your tracking sheet stays up to date without any extra effort.

Customization options

- Add more domains to check: Simply add new rows to your "offboard projects" sheet in Column A. The workflow will automatically pick them up the next time it runs — no changes needed inside n8n.
- Run on a schedule instead of manually: Replace the manual trigger with a Schedule Trigger node. Set it to run daily, weekly, or at any interval that fits your workflow.
- Add error notifications: Connect a notification node (such as Slack, Email, or Discord) after the loop to alert you when the workflow finishes or when a match is detected. This way you do not need to check the sheet yourself.
- Reset and re-scan: If you need to re-check all PBN sites from scratch, simply clear the "Offboarded Links" column in your PBNs sheet. The filter node will treat all rows as unprocessed and scan them again on the next run.
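The domain-matching step described in "How it works" can be sketched as a small Code-node function. This is an illustrative sketch, not the template's exact code; the `www.`-stripping behaviour mentioned in the troubleshooting section is assumed.

```javascript
// Illustrative sketch of the domain-matching check (not the template's exact code).
// Strips protocol and "www." from each offboarded domain, then tests whether
// the remainder appears anywhere in the fetched HTML source.
function normalizeDomain(url) {
  return url
    .trim()
    .replace(/^https?:\/\//i, "")
    .replace(/^www\./i, "")
    .replace(/\/+$/, "");
}

function findMatches(html, offboardedDomains) {
  const source = html.toLowerCase();
  return offboardedDomains
    .map(normalizeDomain)
    .filter((d) => d && source.includes(d.toLowerCase()));
}

const html = '<a href="https://www.clientsite.com/page">old link</a>';
console.log(findMatches(html, ["https://clientsite.com", "https://gone.example"]));
// → ["clientsite.com"]
```

Because the check is plain substring inclusion, a domain is flagged whether it appears in a link, visible text, or a meta tag, which matches the behaviour described in the important notes.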
Troubleshooting

- HTTP request fails or returns no data: Check that the PBN site URL in your sheet is correct and the site is currently live. The workflow has retry enabled, but if the site is down or blocked, the request will still fail. You can skip those rows manually or add an error-handling branch.
- No matches found even though a link exists: Make sure the domain in your "offboard projects" sheet matches exactly what appears in the PBN's HTML. For example, if the HTML contains "www.clientsite.com" but your sheet only has "clientsite.com," the match will still work because the code strips "www." However, double-check for any spelling differences or trailing slashes.
- Offboarded Links column not updating: Verify that the row_number in your PBNs sheet is correct and unique for each row. The update node uses row_number to find and write to the right row. If row_number is missing or duplicated, the update will fail silently.
- Workflow stops mid-loop: This can happen if Google Sheets returns a rate-limit error. The pause node helps reduce this, but if it still occurs, increase the wait time in the "Pause Before Next Iteration" node or reduce the batch size.
- Filter skips rows that should be processed: Check if the "Offboarded Links" column has any hidden spaces or formatting. Even an empty-looking cell with a space character will be treated as filled. Clear those cells completely and run the workflow again.

Important notes

- This workflow is designed to be run manually or on a schedule. It does not trigger automatically when a new project is offboarded — you need to initiate the scan yourself.
- The HTTP request fetches the live HTML of each PBN site. Make sure you have permission or ownership of the sites you are scanning to avoid any access issues.
- The domain matching is based on simple text inclusion in the HTML source. It will detect domains in links, text content, meta tags, or anywhere else they appear on the page. If you need stricter matching (e.g., only in anchor href tags), the code node can be updated accordingly.
- Keep your "offboard projects" sheet updated regularly. The workflow only checks domains that are listed there at the time of the scan.

Support

Need help or custom development?
📧 Email: info@isawow.com
🌐 Website: https://isawow.com/
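For the stricter matching mentioned above, the code node could limit the check to anchor `href` attributes. A minimal sketch under stated assumptions: domains are already normalized (no protocol, no `www.`), and a real implementation might prefer an HTML parser over this regex.

```javascript
// Sketch of stricter matching: only flag domains that appear inside
// href="..." attributes, not in plain text or meta tags.
// Assumes the domain list is already normalized.
function findHrefMatches(html, domains) {
  const hrefs = [...html.matchAll(/href\s*=\s*["']([^"']+)["']/gi)].map(
    (m) => m[1].toLowerCase()
  );
  return domains.filter((d) => hrefs.some((h) => h.includes(d.toLowerCase())));
}

const page = '<p>clientsite.com mentioned in text</p><a href="https://other.com">x</a>';
console.log(findHrefMatches(page, ["clientsite.com"])); // → [] (text mention only)
```

Swapping this in place of plain substring matching means a bare text mention of a domain no longer counts as a live link.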
by Neloy Barman
Who’s it for

Affiliate managers, partner programs, and teams collecting leads via public forms who want automated, error-free routing to separate spreadsheets per affiliate — no more manual copying or sheet hunting.

What it does

This workflow captures new submissions from a Tally.so form, extracts the affiliate code, finds or creates the corresponding Google Sheet, appends the lead data, and sends a real-time Slack notification. New affiliates are handled automatically: if no sheet exists for the submitted code, the workflow creates one and places it in a designated Google Drive folder.

Key features
- 📝 Real-time Tally form capture (Name, Email, Affiliation Code, Phone)
- 🔍 Smart routing by affiliation code
- 📊 Auto-creates new affiliate sheets when needed
- 📁 Organized storage in a single Google Drive folder
- 🔔 Instant Slack notifications with full lead details

🔄 How it works
1. Tally submission → Webhook trigger
2. Extract fields: Name, Email Address, Phone Number, Affiliation Code
3. Add current Submission Date and Submission Time
4. Search Google Drive for sheet matching the Affiliation Code
   - ✅ Found → Append row with all data
   - ❌ Not found → Create new sheet → Move to folder → Append row
5. Send formatted Slack alert with full lead info

📋 Requirements
- Tally.so form including Name, Email Address, Phone Number, and Affiliation Code fields
- Google Drive folder for storing affiliate sheets
- Google Sheets & Drive OAuth2 credentials (with Drive + Sheets APIs enabled)
- Slack bot token with chat:write scope
- Slack channel ID where notifications will be sent

⚙️ How to set up
1. Import the workflow into n8n
2. Create credentials: Tally API, Google Sheets OAuth2, Google Drive OAuth2, Slack Bot
3. Configure nodes: select your Tally form in the webhook, set the parent Folder ID for affiliate sheets, and enter your Slack Channel ID
4. Activate the workflow and start receiving leads!
🛠️ How to customize

- **Add extra fields**: Capture additional data (e.g., UTM parameters, notes, referral source) by adding the fields to your Tally form, then map them in the webhook parsing and Google Sheets Append nodes.
- **Change notification channels**: Replace the Slack node with Discord Webhook, Send Email, Microsoft Teams, or any other notification node to fit your team’s preferred platform.
- **Add validation & duplicate checks**: Insert IF nodes or a Code node before the sheet append to validate data quality, reject incomplete entries, or check for duplicate emails/phone numbers.
- **Support multiple programs or priority routing**: Extend the routing logic with Switch nodes or additional Drive searches to handle different affiliate programs, tiers, or priority queues.
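A duplicate check like the one suggested above could look like this in a Code node. This is a sketch under stated assumptions: the field names (`Email Address`, `Phone Number`) and the shape of the existing rows are hypothetical, not the template's actual schema.

```javascript
// Hypothetical duplicate check before appending a lead.
// `existingRows` would come from a Google Sheets read step;
// `lead` is the parsed webhook submission. Field names are assumptions.
function isDuplicate(lead, existingRows) {
  const email = (lead["Email Address"] || "").trim().toLowerCase();
  const phone = (lead["Phone Number"] || "").replace(/\D/g, "");
  return existingRows.some((row) => {
    const rowEmail = (row["Email Address"] || "").trim().toLowerCase();
    const rowPhone = (row["Phone Number"] || "").replace(/\D/g, "");
    return (email && rowEmail === email) || (phone && rowPhone === phone);
  });
}

const rows = [{ "Email Address": "a@x.com", "Phone Number": "+1 555-0100" }];
console.log(isDuplicate({ "Email Address": "A@x.com" }, rows)); // true
console.log(isDuplicate({ "Phone Number": "555 0199" }, rows)); // false
```

Normalizing case and stripping non-digits before comparing keeps trivial formatting differences from slipping past the check.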
by George Dan
How it works
- Submit one or more Apple Podcast episode URLs via the built-in n8n form
- The workflow queries the iTunes API to retrieve each podcast's public RSS feed, then parses the XML to locate the matching episode's MP3 file
- ElevenLabs Scribe transcribes the full audio by passing the MP3 URL directly - no file download needed
- GPT-5-MINI generates a structured summary for each episode: title, key points, useful info, and a bottom line
- All summaries are combined into a formatted HTML email and delivered to your inbox

Set up steps
- Setup takes about 5 minutes
- Connect three credentials: ElevenLabs (HTTP Header Auth with your API key), OpenAI API, and Gmail OAuth2
- Update the recipient email address in the "Send Summary Email" node
- Detailed instructions are in the sticky notes inside the workflow
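The iTunes lookup starts from the podcast ID embedded in the Apple Podcasts URL. A minimal sketch of that extraction; the URL shape (`/id<digits>` plus an optional `?i=<episode-id>` parameter) follows common Apple Podcasts links, but the function name and return shape are illustrative, not the template's actual code.

```javascript
// Sketch: pull the podcast ID (and optional episode ID) out of an
// Apple Podcasts episode URL, ready for an iTunes lookup call such as
// https://itunes.apple.com/lookup?id=<podcastId> (whose response
// includes the feedUrl of the public RSS feed).
function parsePodcastUrl(url) {
  const podcast = url.match(/\/id(\d+)/);
  const episode = url.match(/[?&]i=(\d+)/);
  return {
    podcastId: podcast ? podcast[1] : null,
    episodeId: episode ? episode[1] : null,
  };
}

const parsed = parsePodcastUrl(
  "https://podcasts.apple.com/us/podcast/some-show/id123456789?i=1000600000001"
);
console.log(parsed); // { podcastId: "123456789", episodeId: "1000600000001" }
```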
by Rajeet Nair
Overview

This workflow demonstrates an AI task routing system using multiple agents in n8n. It analyzes incoming user requests, determines their complexity, and routes them to the most appropriate AI agent for processing.

A Supervisor Agent evaluates each request and classifies it as either simple or complex, returning a confidence score and reasoning. Based on this classification, an orchestrator agent delegates the task to the correct specialized agent.

The workflow also includes a confidence validation mechanism. If the classification confidence falls below a defined threshold, an email alert is sent to an administrator for manual review. This architecture helps build scalable AI systems where tasks are intelligently routed to agents optimized for different levels of complexity.

How It Works
1. Webhook Trigger: The workflow starts when a request is received through a webhook endpoint.
2. Workflow Configuration: The request and a configurable confidence threshold are stored using a Set node.
3. Supervisor Agent Classification: The Supervisor Agent analyzes the user request and determines whether the task is simple or complex, returning a confidence score and reasoning.
4. Structured Output Parsing: The classification result is parsed using a structured output parser to ensure reliable JSON formatting.
5. Confidence Validation: An IF node checks whether the confidence score meets the configured threshold.
6. Agent Orchestration: If the confidence is sufficient, an orchestrator agent delegates the task to either the Simple Task Agent for straightforward questions or the Complex Task Agent for tasks requiring deeper reasoning.
7. Fallback Handling: If the confidence score is too low, the workflow sends an email alert requesting manual review.
8. Webhook Response: The final AI response is returned to the original requester through the Respond to Webhook node.
Setup Instructions
1. Add OpenAI credentials to all OpenAI model nodes: Supervisor Model, Executor Model, Simple Agent Model, Complex Agent Model.
2. Configure the Workflow Configuration node: set the userRequest placeholder if testing manually, and adjust the confidenceThreshold if required.
3. Configure the Email Send node: enter sender and administrator email addresses, and connect SMTP or your preferred email credentials.
4. Activate the workflow and send requests to the Webhook endpoint to start task processing.

Use Cases
- AI support systems that route queries based on complexity
- Customer service automation with intelligent escalation
- Multi-agent AI architectures for research or analysis tasks
- AI workflow orchestration for automation platforms
- Intelligent request classification and routing systems

Requirements
- **OpenAI API credentials**
- **Email (SMTP) credentials** for alert notifications
- A system capable of sending requests to the workflow webhook
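The threshold-and-routing logic of steps 5 and 6 can be sketched as plain code. The field names (`complexity`, `confidence`) are assumptions about the structured output parser's schema, not the template's exact fields.

```javascript
// Sketch of the routing decision after the Supervisor Agent classifies a request.
// Field names in `classification` are illustrative assumptions.
function routeRequest(classification, confidenceThreshold) {
  const { complexity, confidence } = classification;
  if (confidence < confidenceThreshold) {
    // Low confidence: fall back to emailing an administrator for manual review.
    return { route: "manual-review" };
  }
  return { route: complexity === "complex" ? "complex-agent" : "simple-agent" };
}

console.log(routeRequest({ complexity: "simple", confidence: 0.92 }, 0.7));
// → { route: "simple-agent" }
console.log(routeRequest({ complexity: "complex", confidence: 0.55 }, 0.7));
// → { route: "manual-review" }
```

In the workflow itself the first branch corresponds to the IF node's "false" output and the second to the orchestrator's delegation choice.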
by Andrew Loh
How it works
- Complaints arrive via Gmail or a web form webhook
- Claude AI classifies each complaint: fault category, priority (P1/P2/P3), tenant tone, and drafts an acknowledgement email
- The right technician is looked up in Airtable by fault category
- A work order is created and the tenant receives an ACK email with their ticket reference and SLA commitment
- The FM team is notified in Slack with a ticket summary
- An hourly schedule checks open tickets — any past their SLA deadline trigger an urgent escalation to FM management

How to set up
- Connect Gmail to the Gmail Trigger and Send ACK email nodes
- Create your Airtable base with a Complaints table and a Technician table (one row per fault category)
- Connect Airtable, Anthropic, and Slack in their respective nodes
- If using a web form, point it to the Webhook URL
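The hourly SLA check amounts to: compute a deadline from the ticket's priority and creation time, then flag any ticket past it. A minimal sketch, assuming a priority-to-hours mapping; the actual SLA windows in the workflow may differ.

```javascript
// Sketch of the hourly SLA escalation check. The SLA windows per
// priority are illustrative assumptions, not the template's actual values.
const SLA_HOURS = { P1: 4, P2: 24, P3: 72 };

function isPastSla(ticket, now = new Date()) {
  const hours = SLA_HOURS[ticket.priority] ?? 72;
  const deadline = new Date(
    new Date(ticket.createdAt).getTime() + hours * 3600 * 1000
  );
  return now > deadline;
}

const ticket = { priority: "P1", createdAt: "2024-01-01T08:00:00Z" };
console.log(isPastSla(ticket, new Date("2024-01-01T13:00:00Z"))); // true (5h > 4h SLA)
console.log(isPastSla(ticket, new Date("2024-01-01T10:00:00Z"))); // false
```

Tickets where this returns true would be the ones routed to the urgent escalation message for FM management.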
by Nirav Gajera
💰 AI Expense Tracker — Chat to Track Spending Instantly

Track your expenses by chatting naturally. No forms, no apps — just type and it's saved.

📖 Description

This workflow turns a simple chat interface into a powerful personal expense tracker. Just describe your spending in plain language — the AI understands it, categorizes it, and saves it to Google Sheets automatically.

Example inputs the AI understands:
- spent 500 on lunch
- uber 150
- paid 1200 electricity bill
- lunch in feb 25 cost 500 ← handles past dates too
- netflix 499
- $50 hotel booking ← detects currency

No rigid formats. No dropdowns. Just type naturally.

✨ Key Features
- **Natural language input** — type expenses exactly how you'd say them
- **AI-powered parsing** — Claude Haiku extracts amount, category, date, currency automatically
- **9 auto-detected categories** — Food, Transport, Shopping, Bills, Entertainment, Health, Business, Education, Other
- **Multi-currency support** — INR, USD, EUR, GBP
- **Past date handling** — "lunch in feb 25 cost 500" saves to February 2025, not today
- **Running monthly total** — each row stores the cumulative month total
- **Monthly summary** — type SUMMARY or summary february for any month
- **Works on empty sheet** — no errors on first use
- **Invalid input handling** — friendly error if no amount detected

💬 Commands

| What you type | What happens |
| :--- | :--- |
| spent 500 on lunch | ✅ Saved: 🍕 Food & Dining — Lunch · ₹500 |
| uber 150 | ✅ Saved: 🚗 Transport — Uber · ₹150 |
| 1200 electricity bill | ✅ Saved: 💡 Bills & Utilities · ₹1200 |
| lunch in feb 25 cost 500 | ✅ Saved to February 2025 correctly |
| SUMMARY | 📊 Current month report with breakdown |
| summary february | 📊 February report (current year) |
| summary february 2025 | 📊 February 2025 specific report |
| HELP | 📖 Shows all commands and categories |

🛠 Setup Requirements

1. Google Sheet

Create a new Google Sheet with these exact headers in Row 1:

| Col | Header |
| :---: | :--- |
| A | Date |
| B | Amount |
| C | Category |
| D | Description |
| E | Currency |
| F | Month |
| G | Raw Message |
| H | Total |

2. Credentials needed

| Credential | Used for | Free? |
| :--- | :--- | :--- |
| Anthropic API | Claude Haiku AI parsing | Paid (very low cost) |
| Google Sheets OAuth2 | Read & write expenses | Free |

3. After importing
- Connect your Anthropic credential to the Claude Haiku node
- Connect your Google Sheets credential to all sheet nodes
- Update the Sheet ID in all Google Sheets nodes to point to your sheet
- Open the workflow chat and type your first expense

🏗 How It Works

You type: "spent 500 on car wash"
↓ Detect Intent → classified as: expense
↓ Read All Expenses → loads sheet (works even if empty)
↓ Prepare Data → calculates existing month total
↓ AI Parse Expense (Claude Haiku) → amount: 500 · category: Transport · description: Car wash · date: today · currency: INR
↓ Parse & Total → derives Month from parsed date → computes new running total
↓ Is Valid? (amount > 0 and is_expense = true)
✅ YES → Save to Sheet → Reply with confirmation
❌ NO → Ask user to include an amount

Summary flow:

You type: "summary february"
↓ Detect Intent → classified as: summary
↓ Read for Summary → loads all rows
↓ Build Summary → detects "february" in message → filters rows by February (current year) → calculates total, breakdown by category, daily avg
↓ Returns formatted report

📊 Sample Summary Output

📊 March 2026 Report
💳 Total: ₹8,450
📝 Entries: 12
📈 Daily avg: ₹470
🔝 Top: 🍕 Food & Dining

Breakdown:
🍕 Food & Dining: ₹3,200 (38%)
🚗 Transport: ₹1,800 (21%)
💡 Bills & Utilities: ₹1,200 (14%)
🛍️ Shopping: ₹1,050 (12%)
🎬 Entertainment: ₹800 (9%)
🏥 Health: ₹400 (5%)

📂 Auto-Detected Categories

| Emoji | Category | Example keywords |
| :---: | :--- | :--- |
| 🍕 | Food & Dining | lunch, dinner, restaurant, zomato, swiggy, grocery |
| 🚗 | Transport | uber, ola, petrol, metro, flight, car wash, parking |
| 🛍️ | Shopping | amazon, flipkart, clothes, electronics, shoes |
| 💡 | Bills & Utilities | electricity, wifi, rent, recharge, emi, gas |
| 🎬 | Entertainment | netflix, movie, spotify, concert, gaming |
| 🏥 | Health | medicine, doctor, gym, pharmacy, hospital |
| 💼 | Business | office, software, domain, hosting, tools |
| 📚 | Education | course, books, tuition, udemy, fees |
| 💰 | Other | anything that doesn't match above |

⚙️ Workflow Nodes

| Node | Type | Purpose |
| :--- | :--- | :--- |
| When chat message received | Chat Trigger | Entry point |
| Detect Intent | Code | Classify: expense / summary / help |
| Intent Switch | Switch | Route to correct path |
| Read All Expenses | Google Sheets | Load rows (alwaysOutputData: true) |
| Prepare Data | Code | Compute month total, handle empty sheet |
| AI Parse Expense | LLM Chain | Extract fields using Claude Haiku |
| Claude Haiku | Anthropic Model | AI model for parsing |
| Parse & Total | Code | Validate, derive month, compute total |
| Is Valid Expense? | IF | Check amount > 0 |
| Save Expense to Sheet | Google Sheets | Append new row |
| Reply Saved | Code | Format confirmation message |
| Reply Invalid | Code | Request amount from user |
| Read for Summary | Google Sheets | Load all rows for report |
| Build Summary | Code | Filter by month, compute breakdown |
| Send Help | Code | Return command reference |

🔧 Customisation Ideas
- **Add a budget alert** — warn when monthly total exceeds a set limit
- **Telegram integration** — replace chat trigger with Telegram bot
- **WhatsApp integration** — use Twilio WhatsApp as the input channel
- **Weekly digest** — add a Schedule Trigger for automatic weekly reports
- **Multi-user** — store user ID with each row to support team expense tracking
- **Export to PDF** — generate monthly expense report as a PDF

⚠️ Important Notes
- The Read All Expenses node has Always Output Data enabled — this is required so the flow works on an empty sheet
- Month is derived from the parsed date, not today's date — so past-dated entries file correctly
- The Total column stores the running month total at the time of each entry — it does not update retroactively if you delete rows

📦 Requirements Summary
- n8n (cloud or self-hosted)
- Anthropic API key (Claude Haiku — very low token usage)
- Google account with Sheets access

Built with n8n · Claude Haiku · Google Sheets
By Nirav Gajera
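The budget-alert customisation idea could be added as a small Code node after "Parse & Total". A hedged sketch: the running-total input, the limit, and the message format are all illustrative assumptions.

```javascript
// Hypothetical budget-alert check for the customisation idea above.
// `monthTotal` would be the running total computed by "Parse & Total";
// the limit value and message wording are illustrative assumptions.
function budgetAlert(monthTotal, limit) {
  if (monthTotal <= limit) return null; // under budget: no alert
  const over = monthTotal - limit;
  return `⚠️ Budget alert: you are ₹${over} over your ₹${limit} monthly limit.`;
}

console.log(budgetAlert(9500, 8000)); // "⚠️ Budget alert: you are ₹1500 over your ₹8000 monthly limit."
console.log(budgetAlert(4000, 8000)); // null
```

When the function returns a string, the workflow could append it to the "Reply Saved" confirmation message.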
by Cheng Siong Chin
How It Works

This workflow automates tax compliance by aggregating multi-channel revenue data, calculating jurisdiction-specific tax obligations, detecting anomalies, and generating submission-ready reports for tax authorities. Designed for finance teams, tax professionals, and e-commerce operations, it solves the challenge of manually reconciling transactions across multiple sales channels, applying complex tax rules, and preparing compliant filings under tight deadlines.

The system triggers monthly or on-demand, fetching revenue data from e-commerce platforms, payment processors, and accounting systems. Transaction records flow through validation layers that merge historical context, classify revenue streams, and calculate tax obligations using jurisdiction-specific rules engines. AI models detect anomalies in tax calculations, identify unusual deduction patterns, and flag potential audit risks.

The workflow routes revenue data by tax jurisdiction, applies progressive tax brackets, and generates formatted reports matching authority specifications. Critical anomalies trigger immediate alerts to tax teams via Gmail, while finalized reports store in Google Sheets and Airtable for audit trails. This eliminates 80% of manual tax preparation work, ensures multi-jurisdiction compliance, and reduces filing errors.

Setup Steps
- Configure e-commerce API credentials for transaction access
- Set up payment processor integrations (Stripe, PayPal) for revenue reconciliation
- Add accounting system credentials (QuickBooks, Xero) for financial data
- Configure OpenAI API key for anomaly detection and tax analysis
- Set Gmail OAuth credentials for tax team alert notifications
- Link Google Sheets for report storage and audit trail documentation
- Connect Airtable workspace for structured tax record management

Prerequisites

Active e-commerce platform accounts with API access. Payment processor credentials.

Use Cases

Automated monthly sales tax calculations for multi-state e-commerce.

Customization

Modify tax calculation rules for specific jurisdiction requirements.

Benefits

Reduces tax preparation time by 80% through end-to-end automation.
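The progressive-bracket step can be sketched as plain code. The brackets below are invented for illustration only; real rates belong in the workflow's jurisdiction-specific rules engine.

```javascript
// Illustrative progressive tax calculation. The brackets are made-up
// demonstration values, not any jurisdiction's real rates.
const BRACKETS = [
  { upTo: 10000, rate: 0.1 },
  { upTo: 40000, rate: 0.2 },
  { upTo: Infinity, rate: 0.3 },
];

function progressiveTax(income) {
  let tax = 0;
  let lower = 0;
  for (const { upTo, rate } of BRACKETS) {
    if (income <= lower) break;
    // Only the slice of income inside this bracket is taxed at its rate.
    const taxable = Math.min(income, upTo) - lower;
    tax += taxable * rate;
    lower = upTo;
  }
  return tax;
}

console.log(progressiveTax(50000)); // 1000 + 6000 + 3000 = 10000
```

The same shape generalizes to per-jurisdiction rule tables: swap in a different `BRACKETS` array per routed jurisdiction.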
by WeblineIndia
Daily Inventory Monitoring & Reorder System

This workflow automatically monitors your WooCommerce store inventory, calculates stock health based on recent sales, classifies products, computes reorder quantities, assigns urgency levels and sends actionable alerts to Slack.

This workflow runs daily to track your inventory and prevent stock issues. It fetches all active products and recent completed orders, calculates units sold in the last 30 days, evaluates stock health, and classifies products as Top Performer, Steady, At Risk, or Consider Discontinue.

You receive:
- **Daily inventory check (automated)**
- **Database record of each product’s stock and recommended action**
- **Slack alerts for urgent items and a daily summary**

Ideal for teams wanting simple, automated visibility of inventory without manually reviewing stock levels.

Quick Start – Implementation Steps
1. Connect your WooCommerce account (products and orders).
2. Connect Supabase to store inventory records.
3. Connect Slack to receive alerts and daily summaries.
4. Set the schedule time for daily checks.
5. Review and adjust stock thresholds (lead time, safety days) if needed.
6. Activate the workflow — daily inventory monitoring begins automatically.

What It Does

This workflow automates inventory monitoring:
- Fetches all published products from WooCommerce with current stock.
- Retrieves completed orders from the last 30 days to calculate sales.
- Calculates units sold per product and estimates average daily demand.
- Merges product and sales data for stock evaluation.
- Classifies products based on stock and demand: Top Performer, Steady, At Risk, or Consider Discontinue.
- Calculates safety stock, reorder points, and reorder quantities.
- Assigns urgency levels (Normal, High, Critical) with clear action messages.
- Sends Slack alerts for high-priority products.
- Saves all inventory data into Supabase for tracking.
- Builds and sends a daily summary with totals, at-risk products, and reorder needs.
This ensures your team always knows stock status and can act quickly to prevent shortages.

Who’s It For

This workflow is ideal for:
- Inventory managers
- Operations teams
- E-commerce teams
- Supply chain planners
- Anyone needing automated stock monitoring and alerts

Requirements to Use This Workflow

To run this workflow, you need:
- **n8n instance** (cloud or self-hosted)
- **WooCommerce API credentials** (products & orders)
- **Supabase account** (database for inventory tracking)
- **Slack workspace** with API permissions
- Basic understanding of inventory management and reorder logic

How It Works
1. Daily Check – Workflow triggers automatically at the scheduled time.
2. Fetch Products & Orders – Gets all published products and completed orders from the last 30 days.
3. Calculate Sales & Demand – Determines units sold and average daily demand per product.
4. Merge Data – Combines stock data with sales to evaluate inventory health.
5. Inventory Classification – Categorizes products as Top Performer, Steady, At Risk, or Consider Discontinue.
6. Reorder Calculations – Computes safety stock, reorder point, and recommended reorder quantity.
7. Assign Urgency & Actions – Flags products as Normal, High, or Critical and sets clear action messages.
8. Immediate Action Check – Identifies high-priority products that need urgent attention.
9. Save to Database – Stores inventory status and recommendations in Supabase.
10. Daily Summary – Builds summary and sends Slack notifications for overall stock health.

Setup Steps
1. Import the provided n8n JSON workflow.
2. Connect your WooCommerce account (products and orders).
3. Connect Supabase account and configure the table for inventory tracking.
4. Connect Slack and select channels for urgent alerts and daily summary.
5. Adjust lead time, safety stock days, and any thresholds if needed.
6. Activate the workflow — daily automated inventory monitoring and reporting begins.
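The reorder calculations described above can be sketched with the conventional formulas (safety stock = daily demand × safety days; reorder point = daily demand × lead time + safety stock). The exact formulas and the reorder-quantity rule in the Reorder Calculator node may differ; this is a conventional sketch, not the template's code.

```javascript
// Conventional reorder-point sketch; the template's Reorder Calculator
// node may use different formulas or thresholds.
function reorderPlan({ stock, unitsSold30d, leadTimeDays, safetyDays }) {
  const dailyDemand = unitsSold30d / 30;
  const safetyStock = Math.ceil(dailyDemand * safetyDays);
  const reorderPoint = Math.ceil(dailyDemand * leadTimeDays) + safetyStock;
  // Illustrative reorder rule: top back up to twice the reorder point.
  const reorderQty =
    stock <= reorderPoint ? Math.max(reorderPoint * 2 - stock, 0) : 0;
  return { dailyDemand, safetyStock, reorderPoint, reorderQty };
}

const plan = reorderPlan({
  stock: 20,
  unitsSold30d: 90, // 3 units/day
  leadTimeDays: 7,
  safetyDays: 5,
});
console.log(plan); // { dailyDemand: 3, safetyStock: 15, reorderPoint: 36, reorderQty: 52 }
```

Adjusting `leadTimeDays` and `safetyDays` is exactly the threshold tuning the setup steps mention.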
How To Customize Nodes

- Customize Reorder Calculations: Adjust safety stock days, lead time, or reorder formulas in the Reorder Calculator node.
- Customize Urgency & Actions: Modify logic in the Urgency & Recommendation node to change thresholds or messaging.
- Customize Slack Alerts: You can change the Slack channel, the message format, and whether to include emojis or tags.
- Customize Database Storage: Add extra fields in Supabase to store more product information if needed.

Add-Ons (Optional Enhancements)

You can extend this workflow to:
- Track multiple warehouses
- Send alerts only for specific categories
- Generate weekly inventory reports
- Include stock valuation or cost metrics
- Integrate with other communication channels (email, Teams)

Use Case Examples
- Daily Inventory Check: Automatically tracks stock levels for all products.
- Urgent Stock Alerts: Notifies the team immediately when items are At Risk or need reorder.
- Reporting & Tracking: Keeps a historical record of stock health in the database.

Troubleshooting Guide

| Issue | Possible Cause | Solution |
|-------|----------------|---------|
| Slack alerts not sent | Invalid credentials | Update Slack API key |
| Supabase row not saved | Wrong table/field mapping | Check table and field names |
| Wrong stock classification | Thresholds incorrect | Adjust lead time, safety days, or demand calculation |
| Workflow not running | Schedule not active | Enable Schedule Trigger node |

Need Help?

If you need help in customizing or extending this workflow with multi-warehouse tracking, advanced alerts, dashboards or scaling, then our n8n automation developers at WeblineIndia will be happy to assist you.
by Marián Današ
Generate personalized concert ticket PDFs with QR codes using PDF Generator API, then email them to attendees, log sales to Google Sheets, and notify organizers via Slack — all triggered from a simple web form.

## Who is this for

Event organizers, ticketing teams, and developers who need an automated pipeline to issue branded PDF concert tickets with unique QR codes for venue entry — without building a custom backend.

## How it works

1. An attendee fills out a web form with their name, email, event details, seat number, and ticket tier (General / VIP / Backstage).
2. The workflow generates a unique ticket ID and prepares all data for the PDF template.
3. PDF Generator API renders a personalized PDF ticket. The QR code is a native template component that encodes the ticket ID automatically.
4. A styled HTML confirmation email with a download link is sent to the attendee via Gmail.
5. The ticket details are logged to a Google Sheets spreadsheet for tracking and attendance management.
6. A Slack notification alerts the event organizer with a summary of the newly issued ticket.

## Set up

1. **PDF Generator API** — Sign up at pdfgeneratorapi.com, create a ticket template with a QR Code component bound to {{ ticket_id }}, and note your template ID.
2. **Template ID** — Open the "Prepare Ticket Data" Code node and replace the TEMPLATE_ID value with your own.
3. **Credentials** — Connect your accounts in each node: PDF Generator API, Gmail, Google Sheets, and Slack.
4. **Google Sheets** — Create a spreadsheet with the columns: Ticket ID, Attendee, Email, Event, Venue, Date, Seat, Tier, PDF URL, Issued At. Set the spreadsheet ID in the "Log Ticket Sale" node.
5. **Slack** — Choose a channel (e.g. #tickets) in the "Notify Event Organizer" node.

## Requirements

- PDF Generator API account (free trial available)
- Gmail account (OAuth)
- Google Sheets account (OAuth)
- Slack workspace (optional — remove the last node if not needed)

## How to customize

- **Output format** — The PDF node returns a hosted URL by default (valid for 30 days). Switch to File output to attach the PDF directly to the email instead.
- **Ticket tiers** — Add or rename tiers in the form node and update the tier mapping logic in the "Prepare Ticket Data" Code node.
- **Email design** — Edit the "Build Confirmation Email" Code node to match your brand colors and layout.
- **Remove Slack** — Simply delete the "Notify Event Organizer" node if you don't need organizer alerts.
- **Add payment** — Insert a Stripe or other payment node before the form confirmation to handle paid tickets.
by Masaki Go
## About This Template

This workflow creates high-quality, text-rich advertising banners from simple LINE messages. It combines Google Gemini (for marketing-focused prompt engineering) and Nano Banana Pro (accessed via the Kie.ai API) to generate images with superior text rendering capabilities. It also handles the asynchronous API polling required for high-quality image generation.

## How It Works

1. **Input:** Users send a banner concept via LINE (e.g., "Coffee brand, morning vibe").
2. **Prompt Engineering:** Gemini optimizes the request into a detailed prompt, specifying lighting, composition, and Japanese catch-copy placement.
3. **Async Generation:** The workflow submits a job to Nano Banana Pro (Kie API) and intelligently waits/polls until the image is ready.
4. **Hosting:** The final image is downloaded and uploaded to a public AWS S3 bucket.
5. **Delivery:** The image is pushed back to the user on LINE.

## Who It's For

- Marketing teams creating A/B test assets.
- Japanese-market advertisers needing accurate text rendering.
- Developers looking for an example of async API polling patterns in n8n.

## Requirements

- **n8n** (Cloud or self-hosted).
- **Kie.ai API key** (for the Nano Banana Pro model).
- **Google Gemini API key**.
- **AWS S3 bucket** (public access enabled).
- **LINE Official Account** (Messaging API).

## Setup Steps

1. **Credentials:** Configure the "Header Auth" credential for the Kie.ai nodes (Header: Authorization, Value: Bearer YOUR_API_KEY).
2. **AWS:** Ensure your S3 bucket allows public read access so LINE can display the image.
3. **Webhook:** Add the production webhook URL to your LINE Developers console.
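The submit-then-poll pattern from the "Async Generation" step can be sketched in plain JavaScript. This is a generic illustration of the polling loop, not the Kie.ai API itself — the status field names and the `checkJob` callback are assumptions to be replaced with real HTTP Request calls:

```javascript
// Generic poll-until-done helper: call checkJob() repeatedly until the
// job reports success, failure, or the attempt budget runs out.
async function pollUntilDone(checkJob, { intervalMs = 5000, maxAttempts = 24 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const status = await checkJob(); // e.g. an HTTP GET on the job-status endpoint
    if (status.state === "succeeded") return status.resultUrl;
    if (status.state === "failed") throw new Error("Image generation failed");
    // Wait before re-polling so the API is not hammered.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("Timed out waiting for image generation");
}
```

In n8n this loop is typically expressed as a Wait node feeding back into an HTTP Request node with an If node checking the status, but the control flow is the same.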
by Kirill Khatkevich
This workflow continuously monitors the TikTok Ads Library for new creatives from specific advertisers or keyword searches, scrapes them via Apify, logs them into Google Sheets, and sends concise notifications to Telegram or Slack with the number of newly discovered ads. It is built as a safe, idempotent loop that can run on a schedule without creating duplicates in your sheet.

## Use Case

Manually checking the TikTok Ads Library for competitor creatives is time-consuming, and it's easy to lose track of which ads you've already seen. This workflow is ideal if you want to:

- **Track competitor creatives over time** in a structured Google Sheet.
- **Avoid duplicates** by matching ads via their unique adId field.
- **Get lightweight notifications** in Telegram or Slack that tell you *how many* new ads appeared, without spamming you with full ad lists.
- **Run the process on autopilot** (daily, weekly, etc.) with a single schedule.
- **Monitor by advertiser ID or keywords** with flexible search parameters.

## How it Works

The workflow is organized into four logical blocks:

### 1. Configuration & Date Conversion

- **Configuration:** The Set Parameters Set node stores all key request variables: Ad target country (e.g., all or specific ISO country codes), Ad published date From (automatically set to yesterday by default), Ad published To (automatically set to today by default), Advertiser name or keyword (for keyword-based searches), adv_biz_ids (advertiser business IDs for specific advertiser tracking), and Ad limit (an optional limit on the number of results to scrape).
- **Date Conversion:** Convert Dates to Unix transforms the human-readable date format (DD/MM/YYYY) into Unix timestamps in milliseconds, which are required by the TikTok Ads Library API.

### 2. Request Building & Data Fetching

- **Body Construction:** Build Apify Body creates the JSON request body for the Apify actor. It builds the TikTok Ads Library URL with all search parameters (region, date range, advertiser name/keyword, advertiser IDs), and conditionally adds resultsLimit to the request body only if the Ad limit field is not empty, allowing you to scrape all results or limit them as needed.
- **Data Fetching:** Get TT Ads through Apify executes the Apify actor (Tiktok Ads Scraper) and retrieves all matching ads from the TikTok Ads Library.

### 3. Data Preparation & De-duplication

- **Data Extraction:** Prepare Data for Sheets safely extracts nested data from the API response: the first video URL from the videos array (if available), the cover image URL from the first video object, and the TikTok username from the tiktokUser object (if available). It handles cases where arrays are empty or objects are missing without throwing errors.
- **Load Existing IDs:** Read existing IDs pulls the existing adId column from your Google Sheet (configured to read a specific column/range, e.g., column K). Collect ID list converts these into a unique, normalized string array existingIds, which represents all ads you have already logged.
- **Attach State:** Attach existing ids (Merge node) combines, for each execution, the freshly fetched TikTok response with the historical existingIds array from Sheets.
- **Filter New Creatives:** The Filter new creatives Code node compares each ad's adId (string) against the existingIds set and builds a new array containing only ads that are not yet present in the sheet. It also protects against duplicates inside the same batch by tracking seen IDs in a local Set.

### 4. Data Logging & Notification

- **Write New Ads:** Append or update row in sheet performs an appendOrUpdate into Google Sheets, mapping core fields such as adId, adName, advertiserName, advertiserId, paidBy, impressions, regionStats, targeting, tiktokUser, startUrl, videos, and coverImageURL (using the =IMAGE() formula to display images directly in the sheet). The column mapping uses adId as the matching column so that existing rows can be updated if needed.
- **Count:** In parallel with the write step, Filter new creatives also feeds into Count new ads. This Code node returns a single summary item with newCount = items.length, i.e., the total number of new creatives processed in this run.
- **Guard:** Any new ads? checks whether newCount is greater than 0. If not, the workflow ends silently and no message is sent, avoiding noise.
- **Notify:** When there are new creatives, both Send a text message (Telegram) and Send a message (Slack) send notifications to the configured channels. The message includes {{$json.newCount}} and a fixed link to the Google Sheet, giving you a quick heads-up without listing individual ads.

## Setup Instructions

To use this template, configure the following components.

### 1. Credentials

- **Apify:** Configure the Apify account credentials used by Get TT Ads through Apify. You'll need an Apify account with access to the Tiktok Ads Scraper actor.
- **Google Sheets:** Connect your Google account in Read existing IDs and Append or update row in sheet.
- **Telegram (optional):** Connect your Telegram account credentials in Send a text message.
- **Slack (optional):** Configure your Slack credentials in Send a message.

### 2. The Set Parameters Node

Open the Set Parameters Set node and customize:

- Ad target country: which countries to monitor (all for all countries, or specific ISO 3166 country codes like US, GB, etc.).
- Ad published date From: start date for the search range (defaults to yesterday using {{ $now.minus({ days: 1 }).toFormat('dd/MM/yyyy') }}).
- Ad published To: end date for the search range (defaults to today using {{ $now.toFormat('dd/MM/yyyy') }}).
- Advertiser name or keyword: search by advertiser name or keywords (URL-encoded format, e.g., %22Applicave%20LLC%22).
- adv_biz_ids: specific advertiser business IDs to track (comma-separated if multiple).
- Ad limit: optional limit on the number of results (leave empty to scrape all available results).

### 3. Google Sheets Configuration

- **Read existing IDs:** Set documentId and sheetName to your tracking spreadsheet and sheet (e.g., Sheet1). Configure the range to read only the column holding the adId values (e.g., column K: K:K).
- **Append or update row in sheet:** Point documentId and sheetName to the same spreadsheet/sheet. Make sure your sheet has the columns expected by the node (e.g., adId, coverImageURL, adName, Impressions, regionStats, targeting, tiktokUser, advertiserID, paidBy, advertiserName, startURL, videos). Confirm that adId is included in matchingColumns so de-duplication works correctly.

### 4. Notification Configuration

- **Telegram:** In Send a text message, set chatId to your target Telegram chat or channel ID, and customize the text template as needed — but keep {{$json.newCount}} to show the number of new creatives.
- **Slack:** In Send a message, set channelId to your target Slack channel ID, and customize the text template as needed — but keep {{$json.newCount}} to show the number of new creatives.

### 5. Schedule

Open Schedule Trigger and configure when you want the workflow to run (e.g., every morning). Save and activate the workflow.

## Further Ideas & Customization

This workflow is a solid foundation for systematic TikTok competitor monitoring. You can extend it to:

- **Track multiple advertisers** by turning adv_biz_ids into a list and iterating over it with a loop or separate executions.
- **Enrich the log with performance data** by creating a second workflow that reads the sheet, pulls engagement metrics (likes, shares, comments) for each logged adId from TikTok's API (if available), and merges the metrics back.
- **Add more notification channels** such as email, or send a weekly summary that aggregates new ads by advertiser, format, or country.
- **Tag or categorize creatives** (e.g., "video vs image", "country", "language", "advertiser type") directly in the sheet to make later analysis easier.
- **Combine with Meta Ads monitoring** by running both workflows in parallel and creating a unified competitor intelligence dashboard.
- **Add image analysis** by integrating the Google Vision API to automatically detect objects, text, and themes in the cover images, similar to the Meta Ads creative analysis workflow.
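The de-duplication at the heart of this workflow's idempotency (the Filter new creatives step) can be sketched as a Code-node style function. A minimal sketch — the input shapes are assumptions matching the description above:

```javascript
// Keep only ads whose adId is not already logged in the sheet,
// and guard against duplicates inside the same fetched batch.
function filterNewCreatives(fetchedAds, existingIds) {
  const known = new Set(existingIds.map(String)); // normalize sheet IDs to strings
  const seen = new Set();                         // in-batch duplicate guard
  const fresh = [];
  for (const ad of fetchedAds) {
    const id = String(ad.adId);                   // normalize API IDs to strings
    if (known.has(id) || seen.has(id)) continue;
    seen.add(id);
    fresh.push(ad);
  }
  return fresh;
}
```

Normalizing both sides to strings matters because Google Sheets may return IDs as numbers while the Apify response returns them as strings; without it, the same ad would be appended on every run.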