by Felix Kemeth
Overview

Staying up to date with fast-moving topics like AI, machine learning, or your specific industry can be overwhelming. You either drown in daily noise or miss important developments between weekly digests. This AI News Agent workflow delivers a curated newsletter only when there's genuinely relevant news. I use it myself for AI and n8n topics.

Key features:

- **AI-driven send decision**: An AI agent evaluates whether today's news is worth sending.
- **Deduplication**: Compares candidate articles against past newsletters to avoid repetition.
- **Real-time news**: Uses SerpAPI's DuckDuckGo News engine for fresh results.
- **Frequency guardrails**: Configure minimum and maximum days between newsletters.

In this post, I'll walk you through the complete workflow, explain each component, and show you how to set it up yourself.

What this workflow does

At a high level, the AI News Agent:

1. Fetches fresh news twice daily via SerpAPI's DuckDuckGo News engine.
2. Stores articles in a persistent data table with automatic deduplication.
3. Filters for freshness - only considers articles newer than your last newsletter.
4. Applies frequency guardrails - respects your min/max sending preferences.
5. Makes an editorial decision - AI evaluates if the news is worth sending.
6. Enriches selected articles - uses Tavily web search for fact-checking and depth.
7. Delivers via Telegram - sends a clean, formatted newsletter.
8. Remembers what it sent - stores each edition to prevent future repetition.

This allows you to get newsletters only when there's genuinely relevant news - in contrast to a fixed schedule.

Requirements

To run this workflow, you need:

- **SerpAPI key**: Create an account at serpapi.com and generate an API key. They offer 250 free searches/month.
- **Tavily API key**: Sign up at app.tavily.com and create an API key. A generous free tier is available.
- **OpenAI API key**: Get one from OpenAI - required for the AI agent calls.
- **Telegram bot + chat ID**: A free Telegram bot (via BotFather) and the chat/channel ID where you want the newsletter. See Telegram's bot tutorial for setup.

How it works

The workflow is organized into five logical stages.

Stage 1: Schedule & Configuration

**Schedule Trigger**: Runs the workflow on a cron schedule. Default: `0 0 9,17 * * *` (twice daily at 9:00 and 17:00). These frequent checks let the AI send a newsletter at either of these times when it observes genuinely relevant news, not only once a week. I picked 09:00 and 17:00 as natural check-in points at the start and end of a typical workday, so you see updates when you're most likely to read them without being interrupted in the middle of deep work. With SerpAPI's 250 free searches/month, running twice per day with a small set of topics (e.g. 2-3) keeps you comfortably below the limit; if you add more topics or increase the schedule frequency, either tighten the cron window or move to a paid SerpAPI plan to avoid hitting the cap.

**Set topics and language**: A Set node that defines your configuration:

- `topics`: comma-separated list (e.g., AI, n8n)
- `language`: output language (e.g., English)
- `minDaysBetween`: minimum days to wait (0 = no minimum)
- `maxDaysBetween`: maximum days without sending (triggers a "must-send" fallback)

Stage 2: Fetch & Store News

**Build topic queries**: Splits your comma-separated topics into individual search queries. In DuckDuckGo News via SerpAPI, a query like `AI,n8n` looks for news where both "AI" and "n8n" appear. For a niche tool like n8n, this is often almost identical to just searching for n8n (docs). It's therefore better to split the topics, search for each of them separately, and let the AI later decide which news articles to select:

```javascript
return $input.first().json.topics.split(',').map(topic => ({ json: { topic: topic.trim() } }));
```
**Fetch news from SerpAPI (DuckDuckGo News)**: An HTTP Request node calling SerpAPI with:

- `engine`: duckduckgo_news
- `q`: your topic
- `df`: d (last day)

Auth is handled via httpQueryAuth credentials with your SerpAPI key. SerpAPI also offers other news engines such as the Google News API (see here). DuckDuckGo News is used here because, unlike Google News, it returns an excerpt/snippet in addition to the title, source, and URL (see here), giving the AI more context to work with. _Another option is NewsAPI, but its free tier delays articles by 24 hours, so you miss the freshness window that makes these twice-daily checks valuable. DuckDuckGo News through SerpAPI keeps the workflow real-time without that lag._

n8n has official SerpAPI nodes, but as of writing there is no dedicated node for the DuckDuckGo News API. That's why this workflow uses a custom HTTP Request node instead, which works the same under the hood while giving you full control over the DuckDuckGo News parameters.
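For reference, the call the HTTP Request node makes can be prototyped outside n8n like this - a minimal sketch: the endpoint and `api_key` parameter follow SerpAPI's standard query API, and the exact name of the results array (e.g. `news_results`) should be confirmed against a raw response:

```javascript
// Minimal sketch of the SerpAPI call the HTTP Request node performs.
// Assumes SERPAPI_KEY is set in the environment.
const params = new URLSearchParams({
  engine: 'duckduckgo_news',
  q: 'n8n',   // one topic per request, as produced by "Build topic queries"
  df: 'd',    // restrict results to the last day
  api_key: process.env.SERPAPI_KEY,
});

const response = await fetch(`https://serpapi.com/search.json?${params}`);
const data = await response.json();

// Each article carries title, source, url, excerpt, and date; the array
// name below is an assumption - inspect the raw response to confirm it.
console.log(data.news_results?.length ?? 0, 'articles fetched');
```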
**Split SerpAPI results into articles**: Expands the results array so each article becomes its own item.

**Upsert articles into News table**: Stores each article in an n8n data table with the fields title, source, url, excerpt, and date. Uses upsert on title + URL to avoid duplicates. The date is normalized to ISO UTC:

```javascript
DateTime.fromSeconds(Number($json.date), { zone: 'utc' }).toISO()
```

Stage 3: Filtering & Frequency Guardrails

This is where the workflow gets smart about what to consider and when to send.

**Get previous newsletters → Sort → Get most recent**: Pulls all editions from the Newsletters table and isolates the latest one with its createdAt timestamp.

**Combine articles with last newsletter metadata**: Attaches the last newsletter timestamp to each candidate article.

**Filter articles newer than last newsletter**: Keeps only articles published after the last edition. Uses a safe default date (2024-01-01) if no previous newsletter exists:

```javascript
$json.date_2 > ($json.createdAt_1 || DateTime.fromISO('2024-01-01T00:00:00.000Z'))
```

**Stop if last newsletter is too recent**: Compares createdAt against your minDaysBetween setting. If you're still in the "too soon to send" window, the workflow short-circuits here.
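The template implements this check as a dedicated node; if you'd rather express it in a Code node, a minimal sketch (assuming the same `minDaysBetween` setting and `createdAt` field described above) could look like:

```javascript
// Hypothetical Code-node version of the minDaysBetween guardrail.
// DateTime (Luxon) is available in n8n Code nodes.
const minDays = Number($('Set topics and language').first().json.minDaysBetween);
const items = $input.all();
const createdAt = items[0]?.json.createdAt; // timestamp of the latest edition

const daysSince = createdAt
  ? DateTime.now().diff(DateTime.fromISO(createdAt), 'days').days
  : Infinity; // no previous newsletter: always allowed to send

// Too soon: return no items, so the rest of the workflow has nothing to do.
return daysSince < minDays ? [] : items;
```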
Stage 4: AI Editorial Decision

This is the core intelligence of the workflow - an AI that decides whether to send and what to include. This stage is also the actual agentic part of the workflow, where the system makes its own decisions instead of just following a fixed schedule.

**Aggregate candidate articles for AI**: Bundles today's filtered articles into a compact list with title, excerpt, source, and url.

**Limit previous newsletters to last 5 → Aggregate**: Prepares the last 5 newsletter contents for the AI to check against for repetition.

**Combine candidate articles with past newsletters**: Merges both lists so the AI sees "today's candidates" + "recent history" side by side.

**AI: decide send + select articles**: The heart of the workflow. A GPT-5.1 call with a comprehensive editorial prompt:

You are an AI Newsletter Editor. Your job is to decide whether today's newsletter edition should be sent, and to select the best articles. You will receive a list of articles with: 'title', 'excerpt', source, url. You will also receive the content of previously sent newsletters (markdown).

Your Tasks

1. Decide whether to send the newsletter

Output "YES" only if all of the following are satisfied OR the fallback rule applies:

Base Criteria

- There are at least 3 meaningful articles. Meaningful = not trivial, not purely promotional, not clickbait, contains actual informational value.
- Articles must be non-duplicate and non-overlapping: not the same topic/headline rephrased, not reporting identical events with minor variations, not the same news covered by multiple sources without distinct insights.
- Articles must be relevant to the user's topics: {{ $('Set topics and language').item.json.topics }}
- Articles must be novel relative to the topics in previous newsletters: compare against all previous newsletters below and exclude articles that discuss topics already substantially covered.
- Articles must offer clear value: new information, impact that matters to the user, insight, analysis, or meaningful expansion.

Fallback rule: Newsletter frequency requirement

If at least 1 relevant article exists and the last newsletter was sent more than {{ $('Set topics and language').item.json.maxDaysBetween }} days ago, then you MUST return "YES" as a decision even if the other criteria are not completely met. Last newsletter was sent {{ $('Get most recent newsletter').item.json.createdAt ? Math.floor($now.diff(DateTime.fromISO($('Get most recent newsletter').item.json.createdAt), 'days').days) : 999 }} days ago.

Otherwise → "NO"

2. If "YES": Select Articles

Select the top 3-5 articles that best fulfill the criteria above. For each selected article, output:

- **title** (rewrite for clarity, conciseness, and impact)
- **summary** (1-2 sentences; written in the output language)
- **source**
- **url**

All summaries must be written in: {{ $('Set topics and language').item.json.language }}

Output Format (JSON): { "decision": "YES or NO", "articles": [ { "title": "...", "summary": "...", "source": "...", "url": "..." } ] }

When "decision": "NO", return an empty array for "articles".

Article Input

Use these articles: {{ $json.results.map( article => `Title: ${article.title_2} Excerpt: ${article.excerpt_2} Source: ${article.source_2} URL: ${article.url_2}` ).join('\n---\n') }}

You must also consider the topics already covered in previous newsletters to avoid repetition: {{ $json.newsletters.map(x => `Newsletter: ${x.content}`).join('\n---\n') }}

The AI outputs structured JSON:

```json
{ "decision": "YES", "articles": [ { "title": "...", "summary": "...", "source": "...", "url": "..." } ] }
```

**If AI decided to send newsletter**: Routes based on decision === "YES". If NO, the workflow ends gracefully.

Stage 5: Content Enrichment & Delivery

**Split selected articles for enrichment**: Each selected article becomes its own item for individual processing.

**AI: enrich & write article**: An AI Agent node with GPT-5.1 + the Tavily web search tool. For each article:

You are a research writer that updates short news summaries into concise, factual articles.

Input: Title: {{ $json["title"] }} Summary: {{ $json["summary"] }} Source: {{ $json["source"] }} Original URL: {{ $json["url"] }} Language: {{ $('Set topics and language').item.json.language }}

Instructions: Use Tavily Search to gather 2-3 reliable, recent, and relevant sources on this topic. Update the title if a more accurate or engaging one exists. Write 1-2 sentences summarizing the topic, combining the original summary and information from the new sources. Return the original source name and url as well.

Output (JSON): { "title": "final article title", "content": "concise 1-2 sentence article content", "source": "the name of the original source", "url": "the url of the original source" }

Rules: Ensure the topic is relevant, informative, and timely. Translate the article if necessary to comply with the desired language {{ $('Set topics and language').item.json.language }}.

The Output Parser enforces the JSON schema with title, content, source, and url fields.

**Aggregate enriched articles**: Collects all enriched articles back into a single array.

**Insert newsletter content into Newsletters table**: Stores the final markdown content for future deduplication:

```javascript
$json.output.map(article => {
  const title = JSON.stringify(article.title).slice(1, -1);
  const content = JSON.stringify(article.content).slice(1, -1);
  const source = JSON.stringify(article.source).slice(1, -1);
  const url = JSON.stringify(article.url).slice(1, -1);
  return `${title}\n${content}\nSource: ${source}`;
}).join('\n\n')
```

**Send newsletter to Telegram**: Sends the formatted newsletter to your Telegram chat/channel.
Why this workflow is powerful

- **Intelligent send decisions**: The AI evaluates news quality before sending, leading to a less noisy and more relevant news digest.
- **Memory across editions**: By persisting newsletters and comparing against history, the workflow avoids repetition.
- **Frequency guardrails with flexibility**: Set boundaries (e.g., "at least 1 day between sends" and "must send within 5 days"), but let the AI decide the optimal moment within those bounds.
- **Source-level deduplication**: The news table with upsert prevents the same article from being considered multiple times across runs.
- **Grounded in facts**: SerpAPI provides real news sources; Tavily enriches with additional verification. The newsletter stays factual.
- **Configurable and extensible**: Change topics, language, and frequency - all in one Set node. In addition, the workflow is modular, allowing you to add new news sources or delivery channels without touching the core logic.

Configuration guide

To customize this workflow for your needs:

Topics and language: Open Set topics and language and modify:
- `topics`: your interests (e.g., machine learning, startups, TypeScript)
- `language`: your preferred output language

Frequency settings:
- `minDaysBetween`: minimum days between newsletters (0 = no minimum)
- `maxDaysBetween`: maximum gap before forcing a send

For very high-volume topics (such as "AI"), expect the workflow to send almost every time once minDaysBetween has passed, because the content-quality criteria are usually met.

Schedule: Modify the Schedule Trigger cron expression. The default runs twice daily at 9:00 am and 5:00 pm; adjust to your preference.

Telegram: Update the chatId in the Telegram node to your chat/channel.

Credentials: Set up credentials for SerpAPI (httpQueryAuth), Tavily, OpenAI, and Telegram.

Next steps and improvements

Here are concrete directions to take this workflow further:

- **Multi-agent architecture**: Split the current AI calls into specialized agents - signal detection, relevance scoring, editorial decision, content enhancement, and formatting - each with a single responsibility.
- **1:1 personalization**: Move from static topics to weighted preferences. Learn from click behavior and feedback.
- **Telegram feedback buttons**: Add inline buttons (👍 Useful / 👎 Not relevant / 🔎 More like this) and feed signals back into ranking.
- **Email with HTML template**: For more flexibility, send the newsletter via email.
- **Incorporating other news APIs or RSS feeds**: Add more sources such as other news APIs and RSS feeds from blogs, newsletters, or communities.
- **Adjust for arXiv paper search and research news**: Swap SerpAPI for arXiv search or other academic sources to obtain a personal research digest.
- **Images and thumbnails**: Fetch representative images for each article and include them in the newsletter.
- **Web archive**: Auto-publish each edition as a web page with permalinks.
- **Retry logic and error handling**: Add exponential backoff for external APIs and route failures to an error workflow.
- **Prompt versioning**: Move prompts to a data table with versioning for A/B testing and rollback.
- **Audio and video news**: Use audio or video models for richer news delivery.

Wrap-up

This AI News Agent workflow is a significant evolution from simple scheduled newsletters. By adding intelligent send decisions, historical deduplication, and frequency guardrails, you get a newsletter that respects the quality of the available news. I use this workflow myself to stay informed on AI and automation topics without the overload of daily news or the delayed delivery of a fixed newsletter schedule. Need help with your automations? Contact me here.
by Takumi Oku
Who is this for

- Entrepreneurs looking for verified technology to license.
- R&D teams tracking aerospace innovation.
- Content creators covering tech trends.

How it works

1. Fetch: Gets the latest patents from NASA's Tech Transfer API.
2. Filter & Loop: Removes empty entries and processes each patent individually.
3. Analyze: Translates the abstract (DeepL) and uses OpenAI to brainstorm practical business applications.
4. Archive: Saves the details to Google Sheets.
5. Notify: Compiles a summary and sends it to Slack.
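For reference, the Fetch step can be prototyped outside n8n against the public api.nasa.gov endpoint - a minimal sketch, assuming the documented query format and the shared DEMO_KEY (swap in your own key; inspect one result entry to confirm the field layout):

```javascript
// Sketch of the Fetch step: search NASA Tech Transfer patents for "engine".
const url = 'https://api.nasa.gov/techtransfer/patent/?engine&api_key=DEMO_KEY';

const res = await fetch(url);
const data = await res.json();

// The API returns a results array; each entry bundles fields such as the
// patent title and abstract (confirm positions against a live response).
for (const patent of data.results ?? []) {
  console.log(patent);
}
```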
by Jimmy Gay
🔧 AI-Powered Auto-Maintenance System for n8n

Transform your n8n instance management with this advanced automation system featuring AI-driven workflow selection. This template provides comprehensive maintenance operations with smart filtering capabilities.

✨ Key Features

🤖 Artificial Intelligence Engine
- Multi-criteria scoring system for intelligent workflow selection
- Semantic analysis for business-critical pattern recognition
- Automated decision-making with configurable thresholds

🎯 Core Maintenance Operations
- **Security Audits**: Automated vulnerability scanning with Google Sheets reporting
- **Smart Pause/Resume**: Intelligent workflow suspension during maintenance windows
- **AI Backup Creation**: Selective duplication of high-value workflows
- **Intelligent Export**: Comprehensive system backups with metadata

🔐 Enterprise Security
- Token-based authentication with request validation
- Protected workflow safeguards (never modifies critical systems)
- Comprehensive error handling and logging

⚡ Automation & Scheduling
- Configurable maintenance schedules (daily, weekly, monthly)
- Webhook-driven operations for external integration
- Real-time monitoring and statistics

🎯 Perfect For
- **DevOps Teams**: Streamline n8n maintenance operations
- **Enterprise Users**: Manage large-scale workflow environments
- **System Administrators**: Automated security and backup management
- **Advanced Users**: Leverage AI for intelligent workflow management

🚀 Quick Setup
1. Import the template
2. Configure 4 credentials (n8n API, Google Sheets, Google Drive, Webhook Auth)
3. Set your security token and Google Sheet ID
4. Activate and enjoy automated maintenance!

🧠 AI Intelligence Highlights

The system evaluates workflows using 6+ criteria including activity status, complexity, priority tags, business criticality, and recent updates. Workflows are automatically scored and selected based on intelligent thresholds.

Selection Logic:
- Duplicate threshold: ≥3 points (smart backup selection)
- Export threshold: ≥5 points (comprehensive backup)
- System workflows always protected

A sketch of this scoring idea is shown below.
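To make the selection logic concrete, here is a minimal, hypothetical scoring function in the spirit of the thresholds above - the criteria names and weights are illustrative assumptions, not the template's exact implementation:

```javascript
// Hypothetical scoring sketch: weights and criteria are illustrative only.
function scoreWorkflow(wf) {
  let score = 0;
  if (wf.active) score += 2;                        // activity status
  if ((wf.nodes?.length ?? 0) > 10) score += 1;     // complexity
  if (wf.tags?.includes('priority')) score += 2;    // priority tag
  if (wf.tags?.includes('business-critical')) score += 3;
  const ageDays = (Date.now() - new Date(wf.updatedAt)) / 864e5;
  if (ageDays < 7) score += 1;                      // recently updated
  return score;
}

const wf = { active: true, nodes: new Array(12), tags: ['priority'], updatedAt: '2025-01-01' };
const s = scoreWorkflow(wf);
console.log({ duplicate: s >= 3, export: s >= 5 }); // thresholds from the template
```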
📊 Includes
- 25+ configured nodes with emoji naming
- 4 detailed markdown documentation cards
- Pre-configured schedules and examples
- Comprehensive error handling
- Statistical reporting and monitoring

Perfect for organizations looking to implement intelligent, automated n8n maintenance with minimal manual intervention.

by Didac Fernandez
AI-Powered Financial Document Processing with Google Gemini

This comprehensive workflow automates the complete financial document processing pipeline using AI. Upload invoices via chat, drop expense receipts into a folder, or add bank statements - the system automatically extracts, categorizes, and organizes all your financial data into structured Google Sheets.

What this workflow does

Processes three types of financial documents automatically:

- **Invoice Processing**: Upload PDF invoices through a chat interface and get structured data extraction with automatic file organization
- **Expense Management**: Monitor a Google Drive folder for new receipts and automatically categorize expenses using AI
- **Bank Statement Processing**: Extract and organize transaction data from bank statements with multi-transaction support
- **Financial Analysis**: Query all your financial data using natural language with an AI agent

Key Features

- **Multi-AI Persona System**: Four specialized AI personas (Mark, Donna, Victor, Andrew) handle different financial functions
- **Google Gemini Integration**: Advanced document understanding and data extraction from PDFs
- **Smart Expense Categorization**: Automatic classification into 17 business expense categories using an LLM
- **Real-time Monitoring**: Continuous folder watching for new documents with automatic processing
- **Natural Language Queries**: Ask questions about your financial data in plain English
- **Automatic File Management**: Intelligent file naming and organization in Google Drive
- **Comprehensive Error Handling**: Robust processing that continues even when individual documents fail

How it works

Invoice Processing Flow
1. User uploads a PDF invoice via the chat interface
2. File is saved to the Google Drive "Invoices" folder
3. Google Gemini extracts structured data (vendor, amounts, line items, dates)
4. Data is parsed and saved to the "Invoice Records" Google Sheet
5. File is renamed as "{Vendor Name} - {Invoice Number}"
6. Confirmation message sent to the user

Expense Processing Flow
1. User drops a receipt PDF into the "Expense Receipts" Google Drive folder
2. System detects the new file within 1 minute
3. Google Gemini extracts expense data (merchant, amount, payment method)
4. OpenRouter LLM categorizes the expense into the appropriate business category
5. All data saved to the "Expenses Recording" Google Sheet

Bank Statement Processing Flow
1. User uploads a bank statement to the "Bank Statements" folder
2. Google Gemini extracts multiple transactions from the statement
3. A custom JavaScript parser handles various bank formats (see the sketch below)
4. Individual transactions are saved to the "Bank Transactions Record" Google Sheet
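The template's actual parser isn't reproduced here; as a rough illustration, a Code-node sketch that normalizes Gemini's extracted transactions into one item per sheet row might look like this (the input field names are assumptions based on the sheet columns described below):

```javascript
// Illustrative sketch only: one output item per transaction row for the
// "Bank Transactions Record" sheet. Input field names are assumed.
const transactions = $json.transactions ?? [];

return transactions.map((t, i) => ({
  json: {
    'Transaction ID': t.id ?? `${$json.statementId ?? 'stmt'}-${i + 1}`,
    'Date': t.date,
    'Description/Payee': t.description,
    'Debit (-)': t.amount < 0 ? Math.abs(t.amount) : '',
    'Credit (+)': t.amount > 0 ? t.amount : '',
    'Currency': t.currency ?? 'EUR',
    'Running Balance': t.balance ?? '',
    'Notes/Category': t.category ?? '',
  },
}));
```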
Financial Analysis
1. Enable the analysis trigger when needed
2. Ask questions in natural language about your financial data
3. The AI agent accesses all three spreadsheets to provide insights
4. Get reports, summaries, and trend analysis

What you need to set up

Required APIs and Credentials
- **Google Drive API** - for file storage and monitoring
- **Google Sheets API** - for data storage and retrieval
- **Google Gemini API** - for document processing and data extraction
- **OpenRouter API** - for expense categorization (supports multiple LLM providers)

Google Drive Folder Structure

Create these folders in your Google Drive:
- "Invoices" - processed invoice storage
- "Expense Receipts" - drop zone for expense receipts (monitored)
- "Bank Statements" - drop zone for bank statements (monitored)

Google Sheets Setup

Create three spreadsheets with these column headers:

Invoice Records Sheet: Vendor Name, Invoice Number, Invoice Date, Due Date, Total Amount, VAT Amount, Line Item Description, Quantity, Unit Price, Total Price

Expenses Recording Sheet: Merchant Name, Transaction Date, Total Amount, Tax Amount, Payment Method, Line Item Description, Quantity, Unit Price, Total Price, Category

Bank Transactions Record Sheet: Transaction ID, Date, Description/Payee, Debit (-), Credit (+), Currency, Running Balance, Notes/Category

Use Cases
- **Small Business Accounting**: Automate invoice and expense tracking for bookkeeping
- **Freelancer Financial Management**: Organize client invoices and business expenses
- **Corporate Expense Management**: Streamline employee expense report processing
- **Financial Data Analysis**: Generate insights from historical financial data
- **Bank Reconciliation**: Automate transaction recording and account reconciliation
- **Tax Preparation**: Maintain organized records with proper categorization

Technical Highlights
- **Expense Categories**: 17 predefined business expense categories (Cost of Goods Sold, Marketing, Payroll, etc.)
- **Multi-format Support**: Handles various PDF layouts and bank statement formats
- **Scalable Processing**: Processes multiple documents simultaneously
- **Error Recovery**: Continues processing even when individual documents fail
- **Natural Language Interface**: No technical knowledge required for financial queries
- **Real-time Processing**: Documents processed within minutes of upload

Benefits
- **Time Savings**: Eliminates manual data entry from financial documents
- **Accuracy**: AI-powered extraction reduces human error
- **Organization**: Automatic file naming and categorization
- **Insights**: Query financial data using natural language
- **Compliance**: Maintains organized records for accounting and audit purposes
- **Scalability**: Handles growing document volumes without additional overhead

This workflow transforms tedious financial document processing into an automated, intelligent system that grows with your business needs.
by Lucía Maio Brioso
🧑💼 Who is this for?

This workflow is for any YouTube user who wants to bulk delete all playlists from their own channel - whether to start fresh, clean up old content, or prepare the account for a new purpose. It's useful for:

- Creators reorganizing their channel
- People transferring content to another account
- Anyone who wants to avoid deleting playlists manually one by one

🧠 What problem is this workflow solving?

YouTube does not offer a built-in way to delete multiple playlists at once. If you have dozens or hundreds of playlists, removing them manually is extremely time-consuming. This workflow automates the entire deletion process in seconds, saving you hours of repetitive effort.

⚙️ What this workflow does

1. Connects to your YouTube account
2. Fetches all playlists you've created (excluding system playlists)
3. **Deletes them one by one** automatically

> ⚠️ This action is irreversible. Once a playlist is deleted, it cannot be recovered. Use with caution.

🛠️ Setup

1. 🔐 Create a YouTube OAuth2 credential in n8n for your channel.
2. 🧭 Assign the credential to both YouTube nodes.
3. ✅ Click "Test workflow" to execute.

> 🟨 By default, this workflow deletes everything. If you want to be more selective, see the customization tips below.

🧩 How to customize this workflow to your needs

- ✅ Add a confirmation flag: Insert a Set node with a custom field like confirm_delete = true, and follow it with an IF node to prevent accidental execution.
- ✂️ Delete only some playlists: Add a Filter node after fetching playlists - you can match by title, ID, or keyword (e.g. only delete playlists containing "old"); see the sketch below.
- 🛑 Add a pause before deletion: Insert a Wait or NoOp node to give you a moment to cancel before it runs.
- 🔁 Adapt to scheduled cleanups: Use a Cron trigger if you want to periodically clear temporary playlists.
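If you'd rather express that selective filter in a Code node than a Filter node, a minimal sketch (assuming the standard YouTube playlist item shape, where the title lives at snippet.title) could be:

```javascript
// Keep only playlists whose title contains "old" (case-insensitive);
// whatever this Code node returns goes on to the delete step.
const keyword = 'old';

return $input.all().filter(item =>
  (item.json.snippet?.title ?? '').toLowerCase().includes(keyword)
);
```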
by Luka Zivkovic
Complete Telegram Trivia Bot with AI Question Generation

Build a fully featured Telegram trivia bot that automatically generates fresh questions daily using OpenAI and tracks user progress with NocoDB. Perfect for communities, education, or entertainment!

✨ Key Features

- 🤖 **AI Question Generation**: Automatically creates 40+ new trivia questions daily across 8 categories
- 📊 **Smart User Management**: Tracks scores, prevents question repeats, maintains leaderboards
- 🎮 **Game Mechanics**: Star-based difficulty scoring, answer history, progress tracking
- 🏆 **Competitive Elements**: Real-time leaderboards with emoji rankings and user positioning
- 🛡️ **Robust Architecture**: Error handling, state management, and data validation

🚀 Perfect For

- **Community Engagement**: Keep Telegram groups active with daily trivia challenges
- **Educational Content**: Create learning experiences with categorized questions
- **Business Applications**: Employee training, customer engagement, lead generation
- **Personal Projects**: Learn n8n automation while building something fun

📱 Supported Commands

- /start - Welcome new users with setup instructions
- /question - Get personalized trivia questions (never repeats correctly answered ones)
- /score - View current points and statistics
- /leaderboard - See top 10 players with rankings
- /stats - Detailed accuracy and performance metrics
- /help - Complete command reference

🔧 How It Works

User Journey:
1. User sends the /question command to the bot
2. System checks their answer history to avoid repeats
3. Displays a fresh question with multiple-choice options
4. Processes the answer and updates the score based on difficulty stars
5. Saves the complete answer history for future filtering

AI Content Pipeline:
1. Daily scheduler triggers question generation
2. OpenAI creates 5 questions per category (8 categories total)
3. Questions are automatically saved to NocoDB with difficulty ratings
4. Content includes explanations and proper formatting
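The repeat-avoidance and star-based scoring described above might look roughly like this in a Code node - a sketch under assumed field names, not the template's exact code:

```javascript
// Sketch: pick a question the user hasn't answered correctly yet, and
// award points equal to its star rating (1-5). Field names are assumed.
const questions = $json.questions;        // rows from the questions table
const answered = new Set($json.history    // rows from the answer-history table
  .filter(h => h.correct)
  .map(h => h.questionId));

const fresh = questions.filter(q => !answered.has(q.id));
if (fresh.length === 0) {
  return [{ json: { message: 'No new questions left - check back tomorrow!' } }];
}

const pick = fresh[Math.floor(Math.random() * fresh.length)];
return [{ json: { ...pick, points: pick.stars } }]; // stars double as points
```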
🛠️ Set Up Steps

Prerequisites:
- n8n instance (cloud or self-hosted)
- NocoDB database (free tier works)
- OpenAI API key (not required if you want to add questions yourself)
- Telegram bot token

Database Setup: Create 3 NocoDB tables with the exact field specifications provided in the sticky notes. The workflow includes complete schema documentation.

Configuration Time: ~15 minutes for database setup + API keys

Detailed Setup Instructions: All setup steps, database schemas, and configuration details are documented in the workflow's sticky notes for easy implementation.

📈 Advanced Features

- **Question History Tracking**: Users never see correctly answered questions again
- **Difficulty-Based Scoring**: 1-5 star rating system with corresponding points
- **Category Management**: 8 different trivia categories for variety
- **State Management**: Proper game flow with idle/waiting states
- **Error Handling**: Graceful fallbacks for all edge cases
- **Scalable Architecture**: Supports unlimited concurrent users

🎯 Business Applications

- **Lead Generation**: Capture user data through engaging trivia
- **Employee Training**: Create custom questions for onboarding
- **Customer Engagement**: Keep users active in your Telegram community
- **Educational Tools**: Subject-specific learning with progress tracking
- **Event Activation**: Conferences, workshops, or team building

💡 Customization Options

- Modify question categories for your niche
- Adjust scoring systems and difficulty levels
- Add custom commands and features
- Integrate with other platforms or APIs
- Create specialized question sets

🔗 Get Started

Ready to build your own AI-powered trivia bot? Start with n8n and follow the comprehensive setup guide included in this workflow template.

Next Steps:
1. Import this workflow template
2. Follow the database setup instructions in the sticky notes
3. Configure your API credentials
4. Test with sample questions
5. Launch your trivia bot!

Turn your friend group into trivia champions with AI-generated questions that spark friendly competition!
by Anna Bui
🎯 LinkedIn ICP Lead Qualification Automation

Automatically identify and qualify ideal customer prospects from LinkedIn post reactions using AI-powered profile analysis and intelligent data enrichment. Perfect for sales teams and marketing professionals who want to convert LinkedIn engagement into qualified leads without manual research. This workflow transforms post reactions into actionable prospect data with AI-driven ICP classification.

Good to know

- **LinkedIn Safety**: Only use cookie-free Apify actors to avoid account detection and suspension risks
- **Daily Processing Limits**: Scrape a maximum of 1 page of reactions per day (50-100 profiles) to stay under LinkedIn's radar
- Apify actors cost approximately $0.01-0.05 per profile scraped - budget accordingly for daily processing
- Includes intelligent rate limiting to prevent API restrictions and maintain LinkedIn account safety
- AI classification requires a clear definition of your Ideal Customer Profile criteria
- Processing too many profiles or running too frequently will trigger LinkedIn's anti-scraping measures
- Always monitor your LinkedIn account health and Apify usage patterns for warning signs

How it works

1. Scrapes LinkedIn post reactions using Apify's specialized actor to identify engaged users
2. Extracts and cleans profile data including names, job titles, and LinkedIn URLs
3. Checks against existing Airtable records to prevent duplicate processing and save costs
4. Creates new prospect records with basic information for tracking purposes
5. Enriches profiles with comprehensive LinkedIn data including company details and experience
6. Aggregates and formats profile data for AI analysis and classification
7. Uses AI to analyze prospects against your ICP criteria with detailed reasoning
8. Updates records with ICP classification results and extracted email addresses
9. Implements smart batching and delays to respect API rate limits throughout the process
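Step 7's classification prompt is fully customizable. A hypothetical prompt-builder sketch is shown below - the ICP criteria, field names, and response format are placeholders to replace with your own definition:

```javascript
// Hypothetical classification prompt builder - criteria are placeholders.
const prospect = $json;

const prompt = `You are a lead qualification assistant.
Classify the prospect below as ICP_FIT or NOT_ICP_FIT and explain why.

ICP criteria (example placeholders):
- Job title includes: Head of Sales, VP Marketing, Founder
- Company size: 11-200 employees
- Industry: B2B SaaS

Prospect:
Name: ${prospect.name}
Title: ${prospect.jobTitle}
Company: ${prospect.company}

Respond as JSON: {"classification": "...", "reasoning": "..."}`;

return [{ json: { prompt } }];
```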
How to use

- **IMPORTANT**: Select cookie-free Apify actors only to avoid LinkedIn account suspension
- Set up Apify API credentials in both HTTP Request nodes for safe LinkedIn scraping
- Configure Airtable OAuth2 authentication and select your prospect tracking base
- Replace the LinkedIn post URL with your target post in the initial scraper node
- **Daily Usage**: Process only 1 page of reactions per day (typically 50-100 profiles) maximum
- Customize the AI classification prompt with your specific ICP criteria and job titles
- Test with a small batch first to verify the setup and monitor both API costs and LinkedIn account health
- Schedule the workflow to run daily rather than processing large batches, to maintain account safety

Requirements

- Apify account with API access and sufficient credits for profile scraping
- Airtable account with OAuth2 authentication configured
- OpenAI or compatible AI model credentials for prospect classification
- LinkedIn post URL with reactions to analyze (minimum 10+ reactions recommended)
- Clear definition of your Ideal Customer Profile criteria for accurate AI classification

Customising this workflow

- **Safety First**: Always verify that Apify actors are cookie-free before configuring, to protect your LinkedIn account
- Modify the ICP classification criteria in the AI prompt to match your specific target customer profile
- Set up daily scheduling (not hourly/frequent) to respect LinkedIn's usage patterns and avoid detection
- Adjust rate-limiting delays based on your comfort level with LinkedIn scraping frequency
- Add additional data fields to the Airtable schema for storing custom prospect information
- Integrate with CRM systems like HubSpot or Salesforce for automatic lead import
- Set up Slack notifications for new qualified prospects or daily summary reports
- Create email marketing sequences in tools like Mailchimp for nurturing qualified leads
- Add lead scoring based on company size, industry, or engagement level for prioritization
- Consider rotating between different LinkedIn posts to diversify your prospect sources while maintaining daily limits
by Nikan Noorafkan
🚀 Channable + Google Ads + Relevance AI: Scalable AI Workflow for Automated Ad Copy Generation & Publishing

🧩 Overview

This workflow automates the entire ad creation process for Google Ads by integrating product data, AI-generated copy, compliance checks, and publication into your marketing pipeline. It connects n8n, Relevance AI, Google Sheets, and optionally Channable to:

- Fetch product data from your catalog
- Generate Google Text Ad headlines and descriptions using Relevance AI
- Validate character limits and ensure Google Ads compliance
- Route non-compliant ads to a Slack review channel
- Save compliant, ready-to-publish ads in Google Sheets
- Notify your marketing team automatically after each generation cycle

🧠 Key Benefits

✅ 100% automated ad copy pipeline
✅ AI-generated, human-quality Google Ads text
✅ Built-in compliance verification (Google Ads policy)
✅ Google Sheets integration for team review
✅ Daily automatic schedule (zero manual effort)
✅ Slack alerts for QA and transparency
✅ Modular design - extendable for Shopping and Performance Optimization
✅ Scalable for 10 → 10,000+ product ads

⚙️ System Architecture

Tech Stack
- **n8n** – automation orchestrator
- **Relevance AI** – AI tools for copy generation and policy compliance
- **Google Sheets** – data storage and team collaboration
- **Slack** – real-time alerts and notifications
- (Optional) Channable – product feed integration

🧭 Workflow Logic

Daily Trigger (00:00)
⬇️ 1️⃣ Get Product Feed (Channable or custom API)
⬇️ 2️⃣ Split Into Batches (50 products each)
⬇️ 3️⃣ Generate Ad Copy (Relevance AI tool → Claude 3.5 prompt)
⬇️ 4️⃣ Validate Character Limits (JS node: max 30 headline / 90 description; sketched below)
⬇️ 5️⃣ Compliance Check (Relevance AI agent → Google Ads policies)
⬇️ 6️⃣ IF Compliant → CSV / Google Sheets ↳ ❌ Non-Compliant → Slack Alert
⬇️ 7️⃣ Aggregate Batches + Generate CSV
⬇️ 8️⃣ Save to Google Sheets ("Generated Ads" tab)
⬇️ 9️⃣ Slack Notification → Summary Report
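Step 4️⃣'s validation node enforces Google's 30-character headline and 90-character description limits. A minimal sketch of that JS node follows; the "truncate smartly" behavior shown here - cutting at a word boundary - is an assumption about the implementation:

```javascript
// Enforce Google Ads text limits: 30 chars per headline, 90 per description.
// Word-boundary truncation is an assumed "smart" strategy.
function smartTruncate(text, max) {
  if (text.length <= max) return text;
  const cut = text.slice(0, max);
  const lastSpace = cut.lastIndexOf(' ');
  return (lastSpace > 0 ? cut.slice(0, lastSpace) : cut).trim();
}

return $input.all().map(item => ({
  json: {
    ...item.json,
    headline: smartTruncate(item.json.headline, 30),
    description: smartTruncate(item.json.description, 90),
  },
}));
```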
📋 Environment Variables

Set these in n8n → Settings → Variables → Add Variable. Copy-paste from your ENVIRONMENT_VARIABLES_CORRECTED.txt. Includes:

✅ Relevance AI region, API key, tool & agent IDs
✅ Google Ads, Merchant Center, and Sheets credentials
✅ Slack channel name
✅ Optional Channable endpoint

Example:

```
RELEVANCE_AI_API_URL=https://api-f1db6c.stack.tryrelevance.com/latest
RELEVANCE_TOOL_AD_COPY_ID=bueQG8io04dw
RELEVANCE_AGENT_COMPLIANCE_ID=xT29mQ4QKsl
GOOGLE_SHEET_ID=1q2w3e4r5t6y7u8i9o0p
SLACK_CHANNEL=#google-ads-automation
```

🏗️ Node-by-Node Breakdown

| Node | Description | Endpoint / Logic |
| --- | --- | --- |
| 🕓 Schedule Trigger | Runs daily at 00:00 | Cron 0 0 * * * |
| 📦 Get Product Feed | Pulls product data from Channable or custom API | GET {{$env.CHANNABLE_API_URL}}/v1/projects/{{$env.PROJECT_ID}}/items |
| 🧮 Split Into Batches | Processes 50 products at a time | Avoids rate limits |
| ✍️ Generate Ad Copy (Relevance AI) | Calls AI tool for each product | POST {{$env.RELEVANCE_AI_API_URL}}/tools/google_text_ad_copy_generator/run |
| 🔍 Validate Character Limits | JS validation (≤30 headline / ≤90 description) | Truncates smartly |
| 🧠 Compliance Check Agent | Verifies Google Ads compliance | POST {{$env.RELEVANCE_AI_API_URL}}/agents/google_ads_compliance_checker/run |
| ⚖️ IF Compliant | Routes APPROVED vs REJECTED | "contains 'APPROVED'" |
| 💾 Format for CSV | Formats compliant ads for export | Maps ID, headline, desc, URLs |
| 📊 Aggregate Batches | Combines all results | Merges datasets |
| 🧱 Generate CSV File | Converts JSON → CSV | Escaped string-safe format |
| 📑 Save to Google Sheets | Saves reviewed ads | Sheet: Generated Ads |
| 📢 Slack Notification (Success) | Posts completion summary | Shows ad count, timestamp |
| 🚨 Slack Alert (Non-Compliant) | Notifies team for review | Includes issues, category |

🔑 API Authentication Setup

🔹 Relevance AI: Create an "HTTP Header Auth" credential. Header Name: Authorization. Header Value: Bearer {{$env.RELEVANCE_AI_API_KEY}}

🔹 Google Sheets: Credential type "Google OAuth2 API". Scopes: https://www.googleapis.com/auth/spreadsheets and https://www.googleapis.com/auth/drive.file

🔹 Slack: Create a Slack App → add the Bot Token scope chat:write → paste the token into an n8n "Slack API" credential.

🔹 (Optional) Channable: Header Auth: Bearer {{$env.CHANNABLE_API_TOKEN}}

🧩 Google Sheet Template

Sheet name: Generated Ads

Columns: product_id, headline, description, final_url, display_url, generated_at

Optional: add compliance_status or notes columns for QA.

⚙️ Testing Procedure

1. Manual Trigger: Disable the schedule → click "Execute Workflow".
2. Batch Size: Start small (3 products).
3. Expected Output: ✅ ad copy generated, ✅ character limits validated, ✅ Slack alerts for rejects, ✅ Google Sheet filled.
4. Check the logs in Executions for errors.
5. Re-enable the cron trigger after successful validation.
🧾 Example Output

| product_id | headline | description | final_url | display_url | generated_at |
| --- | --- | --- | --- | --- | --- |
| 12243 | "Eco Bamboo Socks" | "Soft, breathable comfort for everyday wear." | https://shop.com/socks | shop.com | 2025-10-22T00:00:00Z |

📬 Slack Alert Templates

✅ Success Notification

✅ Google Ads Generation Complete
📊 Summary:
• Total Ads Generated: 50
• Saved to Google Sheets: Generated Ads
• Timestamp: 2025-10-22T00:00:00Z

⚠️ Non-Compliant Alert

⚠️ Non-Compliant Ad Flagged
Product: Bamboo Socks
Issues: Contains "Free Shipping"; headline too long
Timestamp: 2025-10-22T00:00:00Z

🧰 Maintenance & Monitoring

| Frequency | Task |
| --- | --- |
| Daily | Check Slack alerts for rejects |
| Weekly | Review ad performance metrics |
| Monthly | Update Relevance AI prompts |
| Quarterly | Refresh API tokens and variables |

📊 Success Metrics

✅ Compliance approval rate: >85%
🚫 Disapproval rate: <5%
📈 CTR improvement: +15-25%
⏱️ Time saved: 10-15 hours/week
🌐 Scalable: 1,000+ ads/day

🪜 Next Steps

1. Deploy and monitor for 7 days.
2. After 30 days → activate Workflow 2: Performance Optimization Loop.
3. Extend to Shopping Feed Optimization.
4. Add multi-language generation using Relevance AI.
5. Integrate Google Ads API publishing (full automation).

🔗 Resources

n8n Docs · Relevance AI Docs · Google Ads API · Merchant API · Channable Help

🎉 Conclusion

You now have a production-ready, scalable AI-powered ad generation system integrating Channable, Google Ads, and Relevance AI - built entirely on n8n. This delivers:

💡 AI creativity at scale
✅ Google Ads policy compliance
⚙️ Hands-free daily automation
📊 Transparent reporting and collaboration

> Start small → validate → scale to 10,000+ ads per day.
> Within weeks, you'll have a self-learning, always-on ad pipeline driving consistent performance.
by Ramon David
This workflow manages subscription billing reminders and data updates via Telegram. It runs daily at 8:00 AM to check for upcoming due subscriptions, formats the relevant information, and sends reminders to users. It also processes user messages for subscription management - adding, updating, or retrieving billing info - using AI-powered natural language understanding. The main outcomes are automated subscription tracking, timely reminders, and conversational interaction through Telegram, reducing manual tracking effort and improving billing accuracy.

Automation Benefits

Time & Cost Savings
- Manual process: several hours/week spent managing subscriptions and reminders manually.
- Automated process: the workflow completes checks, reminders, and data updates in under a minute.
- Time savings: approximately 5 hours weekly, translating to significant productivity gains and cost reduction.
- ROI: the automation pays for itself within the first month due to saved labor.
- Error reduction: minimized manual entry errors, ensuring accurate billing records and timely reminders.

Business Impact
- Solves the problem of manual subscription tracking and reminders.
- Scales effortlessly as the subscription list grows.
- Opens new opportunities for proactive customer engagement, personalized messaging, and integrated billing insights.

Setup Guide

Prerequisites
- Google Sheets account with a subscription data sheet.
- OpenAI API key with access to GPT-4.
- Telegram bot token with messaging permissions.
- Email SMTP setup if email reminders are used.

API Configuration
- Google Sheets: Generate OAuth2 credentials, enable the Sheets API, and authorize access.
- OpenAI: Create an API key, set the model to GPT-4, and test connectivity.
- Telegram: Create a bot via BotFather, retrieve the token, and set the webhook URL.
- Webhook URL: Use the provided URL in the Telegram bot settings.

Node-by-Node Setup
- OpenAI Chat Model: Enter API credentials and select the GPT-4 model.
- Google Sheets: Input the spreadsheet ID and sheet name, and ensure correct permissions.
- Telegram nodes: Insert the chat ID, message parsing, and response formatting.
- Schedule Trigger: Confirm the cron expression for daily execution.
- For AI nodes, test with sample messages to verify formatting and extraction.

Testing & Validation
- Run the workflow manually.
- Confirm data is retrieved, processed, and responses are sent.
- Verify subscription updates in Google Sheets.
- Check Telegram chats for the correct message flow.

n8n Documentation References: Google Sheets Node, OpenAI Node, Telegram Node, Schedule Trigger

Maintenance & Troubleshooting

Regular Maintenance (Monthly)
- Check API credentials and renew tokens if expired.
- Monitor workflow logs for errors.
- Review Google Sheets data for consistency.
- Update API keys when new versions or permissions are granted.
- Verify currency conversion accuracy periodically.

Common Issues & Solutions
- Workflow not triggering: check schedule settings and webhook URLs.
- Data not updating: verify Google Sheets credentials and permissions.
- Incorrect responses: test AI prompt inputs and outputs.
- API failures: regenerate API keys or check quota limits.
- Reconfigure nodes if an external API changes.

Monitoring & Alerts
- Set up email or Slack alerts for failures.
- Regularly review execution logs.
- Track key metrics like successful runs, error rates, and response times.

Support & Escalation
- Check the n8n logs first for errors.
- Export the workflow for support if needed.
- Use the n8n community forums for common issues.
- Contact API providers for account-specific problems.
- Emergency procedures: restart the workflow, regenerate tokens.
Updates & Improvements
- Review workflow performance quarterly.
- Optimize AI prompts for better accuracy.
- Back up workflow configurations before major changes.
- Incorporate user feedback for feature enhancements.
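To close, here is a rough sketch of the daily 8:00 AM due-subscription check described at the top - the sheet field names (name, nextBillingDate, amount) and the 3-day window are assumptions to adapt to your own layout:

```javascript
// Sketch: keep only subscriptions due within the reminder window and
// shape each into a Telegram-ready message.
const windowDays = 3; // remind this many days ahead - adjust to taste

const due = $input.all().filter(item => {
  const days = DateTime.fromISO(item.json.nextBillingDate)
    .diff(DateTime.now(), 'days').days;
  return days >= 0 && days <= windowDays;
});

return due.map(item => ({
  json: {
    text: `⏰ ${item.json.name} renews on ${item.json.nextBillingDate} (${item.json.amount})`,
  },
}));
```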
by Cheng Siong Chin
How It Works

Automates financial aggregation, validation, and intelligent tax assessment. The workflow integrates revenue, expenses, and invoices via scheduled connectors, merges the data into unified records, and applies AI-driven analysis for anomaly detection and tax calculations. The system evaluates tax liability against configurable thresholds, intelligently routes filings to government portals or tax agents based on jurisdiction rules, and triggers automated email notifications for compliance deadlines and payment reminders. Designed for accountants, small business owners, and finance teams managing quarterly tax obligations while minimizing manual errors and compliance risks across multiple entities.

Setup Steps
1. Configure OpenAI, Gmail, and Google Sheets credentials
2. Connect revenue and expense data sources
3. Define tax thresholds and jurisdiction-specific rules in the workflow nodes
4. Map output fields to government or tax agent systems
5. Create email templates for notifications
6. Test the workflow with sample financial data before enabling it

Prerequisites: OpenAI API key, Gmail account, Google Sheets, accounting software or data source connectivity

Use Cases: Quarterly tax filing automation, multi-client accountant workflows, enterprise compliance monitoring

Customization: Adjust tax thresholds by jurisdiction, integrate additional data sources

Benefits: Significant reduction in calculation errors, faster filing timelines, automated deadline alerts
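The threshold-and-jurisdiction routing from setup step 3 could be expressed roughly like this - a sketch where the jurisdictions, threshold values, and route names are all placeholders:

```javascript
// Illustrative routing sketch - rule values are placeholders only.
const rules = {
  SG: { threshold: 1000, route: 'government_portal' },
  DE: { threshold: 500,  route: 'tax_agent' },
};

return $input.all().map(item => {
  const { jurisdiction, taxLiability } = item.json;
  const rule = rules[jurisdiction] ?? { threshold: 0, route: 'tax_agent' };
  return {
    json: {
      ...item.json,
      route: taxLiability >= rule.threshold ? rule.route : 'no_filing_required',
    },
  };
});
```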
by shae
How it works

This Lead Capture & Auto-Qualification workflow transforms raw leads into qualified prospects through intelligent automation. Here's the high-level flow:

Lead Intake → Data Validation → Enrichment → Scoring → Smart Routing → CRM Integration & Notifications

The system captures leads from any source, validates the data, enriches it with company intelligence, scores leads against qualification criteria, and automatically routes high-value prospects to sales while nurturing lower-priority leads.

Set up steps

Time to set up: approximately 30-45 minutes
Prerequisites: active accounts with HubSpot, Clearbit, Apollo, and Slack

Step 1: Import Workflow (2 minutes)
- Copy the workflow JSON and import it into your n8n instance
- The workflow will appear with all nodes and sticky-note documentation

Step 2: Configure Environment Variables (5 minutes)
Set these in your n8n environment:
- APOLLO_API_URL
- SLACK_SALES_CHANNEL_ID
- SLACK_MARKETING_CHANNEL_ID
- CRM_ASSIGNMENT_URL

Step 3: Set Up API Credentials (15 minutes)
Create credential connections for:
- Clearbit API (enrichment)
- Apollo API (HTTP Header Auth)
- HubSpot API (CRM integration)
- Slack API (notifications)

Step 4: Customize Scoring Logic (10 minutes)
- Review the qualification criteria in the Code node (see the sketch below)
- Adjust scoring weights based on your ideal customer profile
- Modify industry targeting and company-size thresholds

Step 5: Test & Activate (8 minutes)
- Send test webhook requests to validate the flow
- Verify CRM contact creation and Slack notifications
- Activate the workflow for live lead processing
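For orientation, Step 4's scoring Code node might look something like this sketch - the weights, field names, and routing threshold are illustrative assumptions, not the template's exact logic:

```javascript
// Illustrative lead-scoring sketch - tune weights to your ICP.
const lead = $json;
let score = 0;

if (['SaaS', 'Fintech'].includes(lead.industry)) score += 30; // industry fit
if (lead.companySize >= 50 && lead.companySize <= 500) score += 25;
if (/vp|head|director|founder/i.test(lead.jobTitle ?? '')) score += 25;
if (lead.emailVerified) score += 10;
if (lead.requestedDemo) score += 10;

return [{
  json: {
    ...lead,
    score,
    route: score >= 70 ? 'sales' : 'nurture', // smart-routing threshold
  },
}];
```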
by vinci-king-01
How it works

This workflow automatically analyzes website visitors in real time, enriches their data with company intelligence, and provides lead scoring and sales alerts.

Key Steps
1. Webhook Trigger - receives visitor data from your website tracking system.
2. AI-Powered Company Intelligence - uses ScrapeGraphAI to extract comprehensive company information from visitor domains.
3. Visitor Enrichment - combines visitor behavior data with company intelligence to create detailed visitor profiles.
4. Lead Scoring - automatically scores leads based on company size, industry, engagement, and intent signals.
5. CRM Integration - updates your CRM with enriched visitor data and lead scores.
6. Sales Alerts - sends real-time notifications to your sales team for high-priority leads.

Set up steps

Setup time: 10-15 minutes
1. Configure ScrapeGraphAI credentials - add your ScrapeGraphAI API key for company intelligence gathering.
2. Set up the HubSpot connection - connect your HubSpot CRM to automatically update contact records.
3. Configure the Slack integration - set up your Slack workspace and specify the sales alert channel.
4. Customize the lead scoring criteria - adjust the scoring algorithm to match your target customer profile.
5. Set up website tracking - configure your website to send visitor data to the webhook endpoint (see the example payload below).
6. Test the workflow - verify all integrations are working correctly with a test visitor.

Key Features
- **Real-time visitor analysis** with company intelligence enrichment
- **Automated lead scoring** based on multiple factors (company size, industry, engagement)
- **Intent signal detection** (pricing interest, demo requests, contact intent)
- **Priority-based sales alerts** with recommended actions
- **CRM integration** for seamless lead management
- **Deal size estimation** based on company characteristics
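For step 5, the tracking snippet on your site needs to POST visitor data to the workflow's webhook. A minimal browser-side sketch follows - the URL and field names are placeholders to match to whatever your Webhook Trigger and enrichment nodes expect:

```javascript
// Hypothetical browser-side snippet - adjust the URL and fields to your setup.
fetch('https://your-n8n-instance.example.com/webhook/visitor-intel', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    email: 'jane@acme.com',          // identified visitor, if known
    domain: 'acme.com',              // used for the company intelligence lookup
    page: window.location.pathname,  // e.g. "/pricing" signals intent
    timeOnPageSec: 94,
    referrer: document.referrer,
  }),
});
```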