by Oneclick AI Squad
This workflow scrapes property listings, enriches them with market data, and uses Claude AI to score each listing's investment potential based on rental yield, capital growth trends, location desirability, and risk factors.

## How it works

1. **Trigger** – A scheduled run initiates a scrape job.
2. **Scrape Listings** – Fetches property listings from target URL(s) via HTTP.
3. **Parse Listings** – Extracts structured data (price, bedrooms, suburb, etc.).
4. **Fetch Market Data** – Pulls suburb median prices, rental yields, and vacancy rates.
5. **Fetch Demographics** – Gets population growth, income levels, and infrastructure data.
6. **Combine Enrichment** – Merges all data per listing.
7. **AI Investment Scoring** – Claude AI scores each listing (0–100) with a rationale.
8. **Filter Top Picks** – Keeps listings above a configurable score threshold.
9. **Format Report** – Builds a clean investment report.
10. **Save to Google Sheets** – Logs all scored listings for tracking.
11. **Send Digest** – Posts top picks to Slack or email.

## Setup steps

1. Import the workflow into n8n.
2. Configure credentials:
   - **Anthropic API** – Claude AI for investment scoring
   - **Google Sheets** – Results & historical tracking
   - **Slack OAuth** – Daily digest notifications
   - **RapidAPI / Zillow / Domain API** – Property market data
3. Set your target listing URLs in the 'Configure Scrape Targets' node.
4. Set your score threshold (default: 65) in 'Filter Top Picks'.
5. Set your Google Sheet ID and Slack channel.
6. Activate the workflow or POST to the webhook.

## Sample webhook payload

```json
{
  "searchUrl": "https://www.domain.com.au/sale/sydney/?bedrooms=2-4&price=500000-900000",
  "suburb": "Parramatta",
  "maxListings": 20,
  "scoreThreshold": 65
}
```

## Scoring criteria (Claude AI)

- **Rental Yield** – Gross and estimated net yield vs. suburb average
- **Capital Growth** – 5-year suburb price trend
- **Location Score** – Transport, schools, amenities proximity
- **Vacancy Risk** – Suburb rental demand and vacancy rate
- **Cash Flow** – Estimated weekly cash flow after mortgage
- **Risk Flags** – Flood zones, high crime, oversupply signals

## Features

- Multi-source market enrichment
- AI-powered investment scoring with SWOT analysis
- Automated filtering of top-performing listings
- Google Sheets audit trail with historical scores
- Slack/email digest of daily top picks
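The 'Filter Top Picks' step can be sketched as a small helper that would run inside an n8n Code node. This is an illustrative sketch, not the template's actual code; the `score` field name is an assumption about the AI output schema:

```javascript
// Keep only listings whose AI investment score meets the configurable
// threshold (default 65), best-scoring first for the digest.
// `score` is an assumed field name, not the template's exact schema.
function filterTopPicks(listings, threshold = 65) {
  return listings
    .filter((l) => typeof l.score === "number" && l.score >= threshold)
    .sort((a, b) => b.score - a.score);
}
```

In a real Code node this would be applied to `$input.all()` and the survivors returned as `{ json }` items for the 'Format Report' step.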
by Cheng Siong Chin
## How it works

This workflow automates data privacy compliance governance for privacy officers, legal operations teams, and data protection leads. It eliminates the manual effort of monitoring data usage events, classifying privacy risks, routing approval requests, and generating audit-ready compliance reports.

Data usage events arrive via a webhook trigger while a scheduled audit runs in parallel, ensuring both continuous and periodic coverage. Both feeds pass to the Privacy Governance Agent, which is backed by a governance model and shared memory and coordinates three specialist tools: a Data Privacy Agent Tool (privacy policy assessment using a privacy model and a Legal Database API), a Risk Detection Agent Tool (risk classification using a dedicated risk model), and an Audit Log Tool. Approval requests are routed via an Approval Request Tool with Slack notifications, and outputs are structured via a Compliance Output Parser and Approval History Tool.

Results are routed by risk level: critical alerts trigger Slack notifications immediately and high-risk alerts follow a parallel Slack path, before all cases converge to prepare an audit record, store a compliance record in Google Sheets, prepare a compliance report, and distribute it via Gmail.

## Setup steps

1. Import the workflow; configure the Data Usage Event Trigger webhook URL and the Scheduled Compliance Audit interval.
2. Add AI model credentials to the Privacy Governance Agent, Data Privacy Agent Tool, and Risk Detection Agent Tool.
3. Connect the Legal Database API Tool with your privacy regulatory database endpoint and credentials.
4. Link Slack credentials to the Slack Notification Tool, Send Critical Alert, and Send High Risk Alert nodes.
5. Link Gmail credentials to the Send Compliance Report node.
6. Connect Google Sheets credentials; set sheet IDs for the Compliance Record and Audit Log tabs.
## Prerequisites

- OpenAI API key (or a compatible LLM)
- Slack workspace with bot credentials
- Gmail account with OAuth credentials
- Google Sheets with compliance and audit tabs pre-created

## Use cases

- Privacy officers automating GDPR and PDPA data usage event monitoring and risk classification

## Customisation

- Swap the Legal Database API to target jurisdiction-specific frameworks (GDPR, CCPA, PDPA, HIPAA)

## Benefits

- Dual-trigger ingestion ensures continuous and scheduled privacy coverage with no monitoring gaps
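The risk-level routing described above (critical alerts go to Slack immediately, high-risk alerts take a parallel Slack path, everything else goes straight to the audit record) can be sketched as a Code-node helper. The status labels and return fields are assumptions, not the template's exact schema:

```javascript
// Illustrative routing of a compliance result by risk level, mirroring
// the Switch-style branching in the workflow. Labels are assumed values.
function routeByRisk(result) {
  switch (result.riskLevel) {
    case "critical":
      return { channel: "slack-critical", immediate: true };
    case "high":
      return { channel: "slack-high-risk", immediate: true };
    default:
      // Lower risk: no alert, converge directly on the audit record path.
      return { channel: "audit-only", immediate: false };
  }
}
```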
by Cheng Siong Chin
## How it works

This workflow automates credit operations onboarding by running KYC verification, credit bureau checks, identity validation, and sanctions screening through a single AI-powered agent. Built for credit operations teams, compliance officers, and fintech platforms, it eliminates manual eligibility reviews that are slow and error-prone.

Triggered via webhook, the Credit Operations Agent orchestrates all verification tools simultaneously, then routes customers by eligibility status: eligible, ineligible, pending documentation, or compliance escalation. Each path prepares structured data stored in Airtable, triggers appropriate follow-up actions (email, Slack alerts), and logs a full audit trail. A final formatted response is returned to the originating system, closing the loop end-to-end with no manual handoffs.

## Setup steps

1. Set the webhook URL and connect the Credit Operations webhook node to your intake system.
2. Add your OpenAI API key to the OpenAI Chat Model node.
3. Configure KYC, Credit Bureau, Identity, and Sanctions tool credentials.
4. Add Gmail OAuth2 and a Slack bot token for the notification nodes.
5. Connect your Airtable API key; set base/table IDs for the eligible and ineligible customer stores.

## Prerequisites

- KYC & Credit Bureau API credentials
- Sanctions screening API access
- Gmail OAuth2 and Slack bot token
- Airtable API key

## Use cases

- Fintech platforms automating loan application eligibility screening

## Customisation

- Add extra verification tools (e.g., biometric or document OCR APIs)

## Benefits

- Eliminates manual KYC and sanctions review bottlenecks
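The four-way eligibility routing described above can be sketched as a small lookup that a Switch or Code node would implement. The status values, store names, and notification channels here are illustrative assumptions:

```javascript
// Hypothetical eligibility routing: maps an agent verdict to an Airtable
// store and the follow-up notifications for that path. Unknown statuses
// fall back to compliance escalation as the safe default.
function routeByEligibility(customer) {
  const routes = {
    eligible: { store: "eligible", notify: ["email"] },
    ineligible: { store: "ineligible", notify: ["email"] },
    pending_docs: { store: "pending", notify: ["email", "slack"] },
    escalate: { store: "compliance", notify: ["slack"] },
  };
  return routes[customer.status] ?? routes.escalate;
}
```

Falling back to escalation (rather than silently approving) keeps ambiguous verdicts in front of a human reviewer.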
by WeblineIndia
## WooCommerce Product Category Sales Performance Report

This workflow automatically analyzes sales data by product category, compares performance across time periods (daily, weekly, or monthly), stores structured results in Airtable, and sends a clear summary to Slack for quick decision-making.

It pulls order data for two time periods (current and previous), groups sales by product category, and calculates key metrics like revenue, units sold, and share of total sales. Each category is then classified (Top Performer, Steady, Needs Attention, etc.) with a recommended action. The results are saved to Airtable for tracking and history, and a short, easy-to-read summary is sent to Slack so stakeholders can understand performance at a glance.

You get:

- **Automated sales comparison** (daily / weekly / monthly)
- **Category-wise performance classification**
- **Historical tracking in Airtable**
- **One clean Slack summary** — no dashboards required

Ideal for product, sales, and operations teams who want fast, consistent insights without manual reporting.

## Quick start – implementation steps

1. Configure the date granularity (daily, weekly, or monthly).
2. Connect your Orders data source (API, DB, or platform node).
3. Connect and configure your Airtable base & table.
4. Connect your Slack workspace and choose a channel.
5. Activate the workflow — reports start running automatically.

## What it does

This workflow automates category-level sales analysis:

1. Builds current and previous date ranges dynamically.
2. Fetches orders for both time periods.
3. Normalizes and aggregates orders by product category.
4. Calculates key metrics: current revenue, previous revenue, units sold, and share of total sales.
5. Classifies each category (Top Performer, Steady, At Risk, etc.).
6. Adds a recommended business action for each category.
7. Saves the final results to Airtable.
8. Generates a short summary message.
9. Sends a single Slack report to stakeholders.

This ensures consistent, repeatable insights with no manual effort.
## Who's it for

This workflow is ideal for:

- Sales & revenue teams
- Product managers
- E-commerce operations teams
- Business analysts
- Startup founders & leadership
- Anyone needing automated sales performance insights

## Requirements

- **n8n instance** (cloud or self-hosted)
- Access to orders data (API, database, or platform integration)
- **Airtable base** + Personal Access Token
- **Slack workspace** with API permissions
- Basic understanding of sales metrics (revenue, units, categories)

## How it works

1. **Scheduler Trigger** – Workflow runs on a defined schedule.
2. **Build Date Ranges** – Calculates current and previous periods.
3. **Fetch Orders (Current)** – Pulls orders for the active period.
4. **Fetch Orders (Previous)** – Pulls orders for comparison.
5. **Aggregate by Category** – Groups sales and calculates metrics.
6. **Classify Performance** – Assigns tags and actions.
7. **Save to Airtable** – Stores structured results.
8. **Build Slack Summary** – Creates a readable summary message.
9. **Send to Slack** – Delivers insights to the team.

## Setup steps

1. Import the provided n8n workflow JSON.
2. Configure the Scheduler timing.
3. Set your preferred granularity (daily / weekly / monthly).
4. Connect and map your Orders data source.
5. Connect Airtable and map the fields: Category ID / Name, Current Revenue, Previous Revenue, Units, Share, Tag, Recommended Action.
6. Connect Slack API credentials and select a channel.
7. Activate the workflow — done!

## How to customize

**Change the time period**

- Switch between daily, weekly, or monthly comparisons.
- Adjust rolling windows for testing or analysis.

**Adjust performance thresholds**

- Modify revenue or share thresholds.
- Change category labels or actions.
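The 'Aggregate by Category' step can be sketched as a pure function that would run inside an n8n Code node. The order-line field names (`category`, `price`, `qty`) are assumptions about your orders data source:

```javascript
// Group order lines by category and compute revenue, units sold, and
// each category's share of total revenue (as a percentage, 1 decimal).
// Field names are illustrative, not tied to a specific platform schema.
function aggregateByCategory(orderLines) {
  const byCat = {};
  let total = 0;
  for (const line of orderLines) {
    const cat = line.category || "Uncategorized";
    byCat[cat] = byCat[cat] || { category: cat, revenue: 0, units: 0 };
    const lineTotal = line.price * line.qty;
    byCat[cat].revenue += lineTotal;
    byCat[cat].units += line.qty;
    total += lineTotal;
  }
  return Object.values(byCat).map((c) => ({
    ...c,
    share: total ? +((c.revenue / total) * 100).toFixed(1) : 0,
  }));
}
```

The same function would run once per period (current and previous), after which the classification step compares the two result sets.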
**Customize Airtable storage**

Add optional fields such as:

- Report date
- Growth percentage
- Notes or owner
- Review status

**Customize the Slack summary**

You may add:

- Emojis or highlights
- Mentions (@channel, @team)
- Links to Airtable records
- Separate sections for risks or wins

## Add-ons (optional enhancements)

You can extend this workflow to:

- Add Teams or email notifications
- Track trends over multiple periods
- Generate charts or dashboards
- Add alerts for sudden drops or spikes
- Include AI-based insights or explanations
- Export reports to Google Sheets or CSV

## Use case examples

1. **Weekly sales review** – Automatically send category performance every week.
2. **Product decision support** – Identify which categories to promote or discontinue.
3. **Leadership updates** – Share clear performance summaries with management.
4. **E-commerce optimization** – Spot declining categories before revenue drops.
5. **Historical analysis** – Track performance trends over time in Airtable.

## Troubleshooting guide

| Issue | Possible Cause | Solution |
|-----------------------|--------------------------|------------------------------------------|
| No Slack message | Slack node not connected | Verify Slack credentials |
| No Airtable data | Field mapping mismatch | Match Airtable column names |
| Missing current orders| Date range incorrect | Check UTC date logic |
| Empty summary | No category data | Verify aggregation step |
| Workflow not running | Trigger disabled | Enable the Scheduler node |

## Need help?

If you need help extending or customizing this workflow (adding alerts, dashboards, or AI insights, or scaling it for production), our n8n workflow developers at WeblineIndia can assist with advanced automation and reporting solutions.
by Renan Miller
## How it works

This workflow automatically extracts specific data from received emails and saves it into a Google Sheets document for easy tracking and analysis.

It connects to a Gmail account, searches for emails received within a defined date range from a specific sender, opens links inside those emails, extracts data from the linked pages (such as case ID, patient name, birth date, complaint, and location), processes and cleans the information using custom JavaScript logic, and finally saves the structured results into a Google Sheet.

## Setup steps

1. Connect Gmail using OAuth2 credentials.
2. Adjust the date filters and sender email in the "Search Emails" node.
3. Customize the CSS selectors in the HTML extraction nodes to match the desired elements from your email or linked page.
4. Open the Code node and modify the logic if you need to calculate or transform additional fields.
5. Link your Google Sheets account and specify the spreadsheet and sheet name where the results will be appended.
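As a rough illustration of the cleanup logic the Code node performs, here is a hedged sketch. The field names and the `D/M/YYYY` birth-date format are assumptions about the scraped pages, not the template's actual schema:

```javascript
// Normalize scraped strings (collapse whitespace, trim) and convert an
// assumed D/M/YYYY birth date into ISO YYYY-MM-DD for the spreadsheet.
function cleanRecord(raw) {
  const clean = (s) => (s || "").replace(/\s+/g, " ").trim();
  const [d, m, y] = clean(raw.birthDate).split("/");
  return {
    caseId: clean(raw.caseId),
    patientName: clean(raw.patientName),
    birthDate:
      y && m && d ? `${y}-${m.padStart(2, "0")}-${d.padStart(2, "0")}` : "",
    complaint: clean(raw.complaint),
    location: clean(raw.location),
  };
}
```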
by Wessel Bulte
## Automatically Back Up Your n8n Workflows to OneDrive

This workflow automates the backup of your self-hosted n8n instance by exporting all workflows and saving them as individual `.json` files to a designated OneDrive folder. Each file is timestamped for easy versioning and audit tracking. After a successful backup, the workflow optionally cleans up old backup files and sends a confirmation email to notify you that the process completed.

## How it works

1. Uses the HTTP Request node to fetch all workflows via the `/rest/workflows` API.
2. Iterates through each workflow using SplitInBatches.
3. Converts each workflow to a `.json` file using Set and Function nodes.
4. Uploads each file to a target Microsoft OneDrive folder using OAuth2.
5. Deletes old backup files from OneDrive after upload, with the option to keep backups for a configurable amount of time.
6. Sends an email notification once all backups have completed successfully.

## Setup instructions

1. Enter your n8n base URL and authentication details in the HTTP Request node.
2. Set up Microsoft OneDrive OAuth2 credentials for the cloud upload.
3. Configure the Email node with SMTP credentials to receive the backup confirmation.
4. (Optional) Adjust the file retention logic to keep backups for a set duration.
5. Add a Cron trigger to schedule the workflow automatically (e.g., daily or weekly).

👉 Sticky notes inside the workflow explain each step for easy setup.

## Need help?

🔗 LinkedIn – Wessel Bulte
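The Function-node step that turns each exported workflow into a timestamped `.json` file could look roughly like this. The file-naming scheme is illustrative, not the template's exact one:

```javascript
// Build a timestamped, filesystem-safe backup file for one workflow.
// Colons and dots in the ISO timestamp are replaced so the name is
// valid on OneDrive; the workflow name is sanitized the same way.
function toBackupFile(workflow, now = new Date()) {
  const stamp = now.toISOString().replace(/[:.]/g, "-");
  const safeName = workflow.name.replace(/[^a-zA-Z0-9_-]+/g, "_");
  return {
    fileName: `${safeName}_${stamp}.json`,
    content: JSON.stringify(workflow, null, 2), // pretty-printed for diffing
  };
}
```

In the workflow, the `content` string would then be converted to binary data before the OneDrive upload node.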
by plemeo
## Who's it for

Growth hackers, community builders, and marketers who want to keep their Twitter (X) accounts active by automatically liking posts from selected profiles.

## How it works / What it does

1. Schedule Trigger fires hourly.
2. Profile Post Extractor fetches up to 20 tweets for each profile in your CSV.
3. Select Cookie rotates Twitter session cookies.
4. Get Random Post checks candidates against `twitter_posts_already_liked.csv`.
5. Builds `twitter_posts_to_like.csv` and uploads it to SharePoint.
6. Phantombuster Autolike Agent likes the tweet.
7. Logs the liked URL to avoid duplicates.

## How to set up

1. Add Phantombuster + SharePoint credentials.
2. In the SharePoint "Phantombuster" folder, create:
   - `twitter_session_cookies.txt`
   - `twitter_posts_already_liked.csv` (header `postUrl`)
   - `profiles_twitter.csv` (list of profiles)

## Profile CSV format

Your `profiles_twitter.csv` must contain a header `profileUrl` and direct links to the Twitter profiles. Example:

```
profileUrl
https://twitter.com/elonmusk
https://twitter.com/openai
```
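The 'Get Random Post' dedupe logic can be sketched as follows. It assumes the already-liked CSV has been parsed into an array of `postUrl` strings (matching the `postUrl` header described in the setup):

```javascript
// Pick a random tweet URL that has not been liked yet; return null when
// every candidate is already in twitter_posts_already_liked.csv.
function pickUnlikedPost(candidateUrls, alreadyLiked) {
  const liked = new Set(alreadyLiked);
  const fresh = candidateUrls.filter((u) => !liked.has(u));
  if (fresh.length === 0) return null;
  return fresh[Math.floor(Math.random() * fresh.length)];
}
```

Returning `null` lets a downstream IF node skip the Phantombuster call entirely when there is nothing new to like.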
by Cheng Siong Chin
## Introduction

Automate price monitoring for e-commerce competitors—ideal for retailers, analysts, and pricing teams.

⚠️ **Self-hosted only:** Requires a self-hosted n8n instance.

## How it works

Scrapes competitor URLs, extracts data via AI, detects price/stock changes, and logs results to Google Sheets with email alerts.

**Workflow template:**

Trigger → Scrape → AI Extract → Parse → Compare → Detect Changes → Update Sheets + Alert

## Workflow steps

1. **Scraping:** Firecrawl fetches Nike, Adidas, and sneaker product data.
2. **AI extraction:** Processes product details.
3. **Parsing:** Structures the response.
4. **Historical check:** Reads existing Sheets data.
5. **Change detection:** Identifies price/stock updates.
6. **Dual output:** Updates Sheets and sends alerts.

## Setup instructions

1. **Firecrawl API** – Get a key from the dashboard → add to n8n.
2. **OpenAI API** – Get a key from the platform → add to n8n.
3. **Google Sheets OAuth2** – Create OAuth2 credentials in Google Cloud Console → authorize in n8n → enable the API.
4. **Gmail OAuth2** – Use the same project → authorize in n8n → enable the API.
5. **Spreadsheet setup** – Create a Sheet with the required columns → copy the ID from the URL → paste it into the workflow.

## Prerequisites

Self-hosted n8n, a Firecrawl account, an OpenAI key, and a Google account (Sheets + Gmail OAuth2).

## Customization

Add URLs, adjust thresholds, integrate Slack.

## Benefits

Saves 2+ hours daily, real-time tracking, automated alerts.

## Google Sheets structure

Required columns:

- **Product Name** (Column A)
- **Current Price** (Column B)
- **Previous Price** (Column C)
- **Stock Status** (Column D)
- **Last Updated** (Column E)
- **URL** (Column F)
- **Change Detected** (Column G)
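The 'Detect Changes' comparison between a freshly scraped product and its previously stored row can be sketched as a small function. The `price` and `stock` field names are assumptions mirroring the sheet columns above:

```javascript
// Compare the current scrape against the stored row; a missing previous
// row counts as a change so new products trigger an alert too.
function detectChanges(current, previous) {
  if (!previous) return { changed: true, reason: "new product" };
  const changes = [];
  if (current.price !== previous.price) {
    changes.push(`price ${previous.price} -> ${current.price}`);
  }
  if (current.stock !== previous.stock) {
    changes.push(`stock ${previous.stock} -> ${current.stock}`);
  }
  return { changed: changes.length > 0, reason: changes.join("; ") };
}
```

The `reason` string doubles as the body of the email alert, and `changed` would populate the "Change Detected" column.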
by Dominic Spatz
## Overview

Automate UniFi Controller updates on self-hosted instances. This workflow checks the official UniFi Debian repo for a fresh release in the last 24 hours and, if found, upgrades the `unifi` package via SSH. It can also summarize changes and ping you on Telegram. Sticky notes are included to guide setup.

## How it works

1. **Schedule** runs daily (default **13:13**).
2. **HTTP Request** fetches `InRelease` and parses `Codename` + `Date`.
3. **IF gate** continues only if the repo changed within 24 h.
4. **SSH** runs:
   - `apt-get --allow-releaseinfo-change update`
   - `apt-get upgrade -y unifi`
5. (Optional) **LLM** creates a short summary → **Telegram** sends it.

## Setup

1. Bind credentials: SSH (required), OpenAI (optional), Telegram (optional).
2. Set the env var `TELEGRAM_CHAT_ID` for notifications.
3. Adjust the Schedule Trigger to your maintenance window.
4. Import inactive, test once, then activate.

## Customize

- Change the 24 h freshness window in the Code node.
- Swap Telegram for Slack/email if preferred.
- Add pre/post steps (backups, restarts) around the upgrade.

## Safety

- Test on a non-production controller first.
- No hardcoded secrets—uses n8n credentials and environment variables.
- If you want approval before upgrades, stop after the IF gate and notify only.
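The 24-hour freshness check in the Code node can be sketched like this: parse the `Date:` header from the fetched `InRelease` text and compare it with the current time. A minimal sketch, assuming the standard Debian release-file date format:

```javascript
// Return true only when the repo's InRelease Date header is newer than
// the freshness window (default 24 h). Missing or unparseable dates
// fail closed so no upgrade is attempted.
function isFreshRelease(inReleaseText, now = Date.now(), windowMs = 24 * 60 * 60 * 1000) {
  const match = inReleaseText.match(/^Date:\s*(.+)$/m);
  if (!match) return false;
  const releaseTime = Date.parse(match[1]);
  return !Number.isNaN(releaseTime) && now - releaseTime < windowMs;
}
```

Widening the window is then just a matter of changing `windowMs`, as the Customize section suggests.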
by Eugen
## 👥 Who the automation is for

This automation is perfect for bloggers, solopreneurs, business owners, and marketing teams who want to scale SEO content creation. Instead of spending hours on research and drafting, you can go from a single keyword idea to a ready-to-edit WordPress draft in minutes.

## ⚙️ How the automation works

1. Collect keywords in a Google Sheet and mark the ones you want as "prioritized."
2. Click "Prepare Content" → your keyword(s) are sent to n8n.
3. n8n pulls the top 10 Google SERP results.
4. AI analyzes competitors (tone, content type, gaps) and creates a content brief.
5. Another AI generates a blog draft based on the brief.
6. The draft is automatically uploaded to WordPress and your sheet updates.

👉 In short: Keyword → SERP → Brief → Draft → WordPress.

## 🛠 How to set up

Full Setup Guide

1. Copy the Google Sheets template.
2. Import the workflow into n8n.
3. Add your API keys: Google Custom Search, Claude AI, and WordPress credentials.
4. Test the webhook connection from Google Sheets.

🎉 Done — you now have a one-click pipeline from keyword idea to WordPress draft.
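The payload that the "Prepare Content" button sends to the n8n webhook could be built roughly like this. The `status` and `keyword` column names are assumptions about the Google Sheets template, not its documented schema:

```javascript
// Collect only the rows marked "prioritized" into a webhook payload.
// Column names (keyword, status) are hypothetical.
function buildWebhookPayload(rows) {
  return {
    keywords: rows
      .filter((r) => (r.status || "").toLowerCase() === "prioritized")
      .map((r) => r.keyword),
  };
}
```

The resulting object would then be POSTed to the workflow's webhook URL (e.g. from Apps Script or any HTTP client).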
by isaWOW
An intelligent n8n workflow that automates your entire blog content pipeline—from keyword research to WordPress publishing. Using Google Gemini, DeepSeek, and Perplexity, this workflow generates SEO-optimized blog posts and publishes them automatically while tracking everything in Google Sheets.

## What this workflow does

- **Scheduled content research:** Fetches approved topics from Google Sheets and conducts deep SEO research using Perplexity.
- **AI-powered writing:** Uses DeepSeek for competitor analysis and Google Gemini to write 800–1000-word SEO-optimized articles with FAQs.
- **Automated publishing:** Publishes directly to WordPress via the REST API and updates tracking in Google Sheets.
- **Smart scheduling:** Runs daily at 7 AM, respects weekly frequency settings, and processes multiple clients in batches.

## Setup requirements

Tools you'll need:

- Active n8n instance (self-hosted or n8n Cloud)
- Google Sheets with OAuth access
- WordPress site with the REST API enabled
- API keys: Google Gemini, DeepSeek, Perplexity (optional)

Estimated setup time: 20–25 minutes

## Step-by-step setup

### 1. Prepare your Google Sheets

Create two sheets.

**Client projects sheet:**

Columns: `Client ID | Website URL | Blog API | GMB Name | Weekly Frequency | On Page Sheet`

Example: `CLIENT001 | https://example.com | xxxx xxxx xxxx | Example Co | Mon,Wed,Fri | Sheet URL`

**Content topics sheet** (one per client, named "Content Requirement & Posting"):

Columns: `S.No. | Focus Keyword | Content Topic | Internal Linking URLs | Words | Topic Approval | Content Approval | Publish URLs | Weekly Frequency`

Example: `1 | best investment tips | Top 10 Investment Tips | https://example.com/page | 1000 | Approved | Approved | (empty) | Mon,Wed`

### 2. Connect Google Sheets

1. In n8n: Credentials → Add credential → Google Sheets OAuth2 API.
2. Complete the OAuth authentication.
3. Open the "Load Active Client Projects" node → select your client sheet URL and credential.
4. Open the "Get Approved Blog Topics from Sheet" node → select the credential (the document URL is dynamic).

### 3. Add AI API credentials

**Google Gemini:**

1. Get an API key: https://makersuite.google.com/app/apikey
2. Add the credential in n8n: Google PaLM API.
3. Select it in the "Gemini - Content Writing Model" node.

**DeepSeek:**

1. Get an API key: https://platform.deepseek.com/
2. Add the credential in n8n: DeepSeek API.
3. Select it in the "DeepSeek - Research Model" node.

**Perplexity (optional):**

1. Get an API key: https://www.perplexity.ai/settings/api
2. Add the credential in n8n: Perplexity API.
3. Select it in the "Perplexity - Web Search Tool" node.

### 4. Set up WordPress

1. WordPress admin → Users → Profile → Application Passwords.
2. Create a new application password (format: `xxxx xxxx xxxx xxxx xxxx xxxx`).
3. Add it to your Google Sheet:
   - Blog API column: paste the application password.
   - Website URL column: enter the full URL (e.g., https://example.com).

### 5. Configure the schedule

1. Open the "Daily Blog Publishing Schedule" node.
2. Set the time (recommended: 7:00 AM).
3. Choose your timezone and save the settings.

### 6. Test and activate

1. Add one test row in your sheets with today's day in Weekly Frequency.
2. Click "Daily Blog Publishing Schedule" → Execute node.
3. Verify: blog published in WordPress, URL updated in the Google Sheet.
4. Toggle the workflow Active at the top.

## How it works

1. **Schedule & filtering (7 AM daily):** Loads client projects from Google Sheets and filters clients by Weekly Frequency (e.g., only those publishing today).
2. **Content fetching:** Loops through each client, fetches approved topics (Topic Approval = "Approved", Publish URLs = empty), and selects the first pending topic.
3. **AI content creation:** Research phase: DeepSeek + Perplexity analyze competitors, search intent, content gaps, and LSI keywords. Writing phase: Google Gemini writes an 800–1000-word article with FAQs in conversational English.
4. **Publishing:** Extracts the title and body content, publishes to WordPress via the REST API, updates the Google Sheet with the publish URL, and continues to the next topic/client.

## Key features

- ✅ **Automated research:** Deep competitor analysis and SEO insights with Perplexity
- ✅ **Dual AI models:** DeepSeek for research, Gemini for writing
- ✅ **SEO optimized:** Natural keyword integration, LSI keywords, FAQs
- ✅ **Batch processing:** Handles multiple clients and topics in one run
- ✅ **Smart scheduling:** Publishes only on specified weekdays
- ✅ **Complete tracking:** End-to-end visibility in Google Sheets
- ✅ **WordPress ready:** Direct publishing with proper HTML formatting

## Troubleshooting

**Google Sheets not connecting:**

- Re-authenticate the OAuth credentials.
- Verify sheet URLs and column names match exactly (case-sensitive).
- Check sharing permissions on the sheets.

**AI API errors:**

- Verify API keys are active and have credits.
- Check rate limits on the API dashboards.
- Reduce token usage if hitting limits.

**WordPress publishing fails:**

- Test the REST API: visit https://yoursite.com/wp-json/wp/v2/posts
- Verify the application password is correct (including spaces).
- Ensure the user has an Author/Editor role.
- Check that the Website URL includes https://.

**No topics being processed:**

- Verify Topic Approval = "Approved" and Content Approval = "Approved".
- Ensure the Publish URLs column is empty.
- Check that today's day matches the Weekly Frequency setting.
- Confirm the sheet name is exactly "Content Requirement & Posting".

## Use cases

- **Marketing agencies:** Manage 10+ client blogs and scale without hiring writers
- **SEO teams:** Execute keyword strategies at scale with consistent quality
- **Solo bloggers:** Save 5–10 hours/week and maintain a regular publishing schedule
- **Content teams:** Run the company blog on autopilot with oversight and tracking
- **Publishers:** Operate multiple niche blogs and reduce costs by 70–80%

## Expected results

- **Time savings:** 5–10 hours per week per client
- **Output:** 10–20 SEO-optimized posts per week
- **SEO impact:** Improved rankings within 2–3 months
- **Cost efficiency:** 70–80% reduction vs. hiring writers
- **Reliability:** Never miss a publishing deadline

## Workflow customization & next steps

This workflow (Part 1) focuses on content research and writing. To complete the full automation, you will need Part 2, which will be shared in the next post. Please set it up as well, as it manages WordPress publishing along with featured images.

## Resources

- n8n documentation
- Google Gemini API
- DeepSeek API docs
- WordPress REST API

## Support

Need help or custom development?

- 📧 Email: info@isawow.com
- 🌐 Website: https://isawow.com/
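The topic-selection rule described above (Topic Approval = "Approved", Content Approval = "Approved", Publish URLs empty, take the first pending topic) can be sketched as a Code-node helper. The camelCase property names are assumptions about how the sheet rows are mapped in n8n:

```javascript
// Return the first topic that has both approvals and no publish URL yet,
// or null when every approved topic has already been published.
function nextPendingTopic(topics) {
  return (
    topics.find(
      (t) =>
        t.topicApproval === "Approved" &&
        t.contentApproval === "Approved" &&
        !(t.publishUrl || "").trim()
    ) || null
  );
}
```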
by Kirill Khatkevich
This workflow continuously monitors the Meta Ads Library for new creatives from specific competitor pages, logs them into Google Sheets, and sends a concise Telegram notification with the number of newly discovered ads. It is built as a safe, idempotent loop that can run on a schedule without creating duplicates in your sheet.

## Use case

Manually checking the Meta Ads Library for competitor creatives is time‑consuming, and it's easy to lose track of which ads you've already seen. This workflow is ideal if you want to:

- **Track competitor creatives over time** in a structured Google Sheet.
- **Avoid duplicates** by matching ads via their unique `id` field.
- **Get lightweight notifications** in Telegram that tell you **how many** new ads appeared, without spamming you with full ad lists.
- **Run the process on autopilot** (daily, weekly, etc.) with a single schedule.

## How it works

The workflow is organized into three logical blocks.

### 1. Fetch ads & handle pagination

- **Configuration:** The `Add parameters` Set node stores all key request variables: `ad_active_status` (e.g. `active`), `search_page_ids` (competitor page IDs), `ad_reached_countries`, and `access_token`.
- **Routing:** `Page or keywords` routes execution into one of two HTTP Request nodes: `Facebook Ads API by page` — the main branch that queries ads by page ID — or `Facebook Ads API by keywords`, an optional branch for keyword‑based searches.
- **Normalization:** `Facebook Ads API by ...` returns the raw `ads_archive` response. `Check the pagination` then extracts `data` (the array of ad objects) into a dedicated field and reads `paging.next` into `next_url` for pagination.
- **Pagination loop:** `If` checks whether `next_url` is not empty. `Set Next URL` assigns `next_url` to a generic `url` field, and `Facebook Ads API pagination` requests the next page and feeds it back into `Check the pagination`. This loop continues until there is no `next_url`, ensuring all pages of the Ads Library response are processed.

### 2. De‑duplicate ads & log to Google Sheets

- **Load existing IDs:** `Read existing IDs` pulls the existing `id` column from your Google Sheet (configured to read a specific column/range). `Collect ID list` converts these into a unique, normalized string array `existingIds`, which represents all ads you have already logged.
- **Attach state:** `Attach existing ids` (a Merge node) combines, for each execution, the freshly fetched Meta response (`data`) with the historical `existingIds` array from Sheets.
- **Filter new creatives:** The `Filter new creatives` Code node compares each ad's `id` (as a string) against the `existingIds` set and builds a new `data` array containing only ads that are not yet present in the sheet. It also protects against duplicates inside the same batch by tracking seen IDs in a local Set.
- **Write new ads:** `Split Out` expands the filtered `data` array into individual items (one item per new ad). `Add to sheet` then performs an appendOrUpdate into Google Sheets, mapping core fields such as `id`, `ad_creation_time`, `page_name`, `ad_creative_bodies`, `ad_snapshot_url`, `languages`, `publisher_platforms`, and link fields. The column mapping uses `id` as the matching column so that existing rows can be updated if needed.

### 3. Count new ads & notify in Telegram

- **Count:** In parallel with the write step, `Split Out` also feeds into `Count new ads`. This Code node returns a single summary item with `newCount = items.length`, i.e. the total number of new creatives processed in this run.
- **Guard:** `Any new ads?` checks whether `newCount` is greater than 0. If not, the workflow ends silently and no message is sent, avoiding noise.
- **Notify:** When there are new creatives, `Send a text message` sends a Telegram message to the configured `chatId`. The message includes `{{$json.newCount}}` and a fixed link to the Google Sheet, giving you a quick heads‑up without listing individual ads.

## Setup instructions

To use this template, configure the following components.

### 1. Credentials

- **Meta Ads / HTTP Header Auth:** Configure the Meta Ads HTTP Header credentials used by `Facebook Ads API by page`, `Facebook Ads API by keywords`, and `Facebook Ads API pagination`.
- **Google Sheets:** Connect your Google account in `Read existing IDs` and `Add to sheet`.
- **Telegram:** Connect your Telegram account credentials in `Send a text message`.

### 2. The `Add parameters` node

Open the `Add parameters` Set node and customize:

- `ad_active_status`: Which ads to monitor (`active`, `all`, etc.).
- `search_page_ids`: The numeric ID of the competitor Facebook Page you want to track.
- `ad_reached_countries`: Comma‑separated list of country codes (e.g. `US, CA`).
- `access_token`: A valid long‑lived access token with permission to query the Ads Library.

### 3. Google Sheets configuration

- **Read existing IDs:** Set `documentId` and `sheetName` to your tracking spreadsheet and sheet (e.g. an `ads` tab). Configure the range to read only the column holding the ad `id` values.
- **Add to sheet:** Point `documentId` and `sheetName` to the same spreadsheet/sheet. Make sure your sheet has the columns expected by the node (e.g. id, creation time, page, title, description, delivery_start_time, snapshot, languages, platforms, link). Confirm that `id` is included in `matchingColumns` so de‑duplication works correctly.

### 4. Telegram configuration

In `Send a text message`, set:

- `chatId`: Your target Telegram chat or channel ID.
- `text`: Customize the message template as needed, but keep `{{$json.newCount}}` to show the number of new creatives.

### 5. Schedule

Open `Schedule Trigger` and configure when you want the workflow to run (e.g. every morning). Save and activate the workflow.

## Further ideas & customization

This workflow is a solid foundation for systematic competitor monitoring. You can extend it to:

- **Track multiple competitors** by turning `search_page_ids` into a list and iterating over it with a loop or separate executions.
- **Enrich the log with performance data** by creating a second workflow that reads the sheet, pulls spend/impressions/CTR for each logged `ad_id` from Meta, and merges the metrics back.
- **Add more notification channels** such as Slack or email, or send a weekly summary that aggregates new ads by page, format, or country.
- **Tag or categorize creatives** (e.g. "video vs image", "country", "language") directly in the sheet to make later analysis easier.
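The `Filter new creatives` Code node described above can be sketched as a pure function. It mirrors the string-normalized `id` comparison and the local Set that guards against duplicates within the same batch:

```javascript
// Keep only ads whose id is neither already logged in the sheet nor
// repeated earlier in this batch. IDs are compared as strings to avoid
// number-vs-string mismatches between the API and the sheet.
function filterNewCreatives(ads, existingIds) {
  const seen = new Set(existingIds.map(String));
  const fresh = [];
  for (const ad of ads) {
    const id = String(ad.id);
    if (seen.has(id)) continue; // already logged, or duplicate in batch
    seen.add(id);
    fresh.push(ad);
  }
  return fresh;
}
```

The length of the returned array is exactly the `newCount` that the Telegram notification reports.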