by Jonathan
This workflow takes Dialpad call information after a call is disconnected and pushes it into Syncro as a ticket timer update, matching the start and end times provided by Dialpad and adding a note containing the contact or customer name and number. > This workflow is part of an MSP collection. The original can be found here: https://github.com/bionemesis/n8nsyncro
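A minimal sketch of the mapping step is shown below. The Dialpad fields (date_started, date_ended, contact, external_number) and the Syncro timer fields (start_at, end_at, notes) are illustrative assumptions; the exact payloads depend on your Dialpad webhook and Syncro setup.

```javascript
// Hypothetical Code-node sketch: map a Dialpad call-disconnected payload to a
// Syncro ticket timer entry. All field names are assumptions; adjust them to
// the payloads your Dialpad webhook and Syncro account actually use.
const call = $json.call || $json;

const startTime = new Date(Number(call.date_started)).toISOString();
const endTime = new Date(Number(call.date_ended)).toISOString();
const who = call.contact?.name || call.customer?.name || 'Unknown caller';
const number = call.external_number || call.contact?.phone || '';

return [{
  json: {
    start_at: startTime,
    end_at: endTime,
    notes: `Call with ${who} (${number})`,
  },
}];
```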
by JYLN
This updated workflow will automatically archive your Spotify Discover Weekly tracks to another manually created playlist, without the nuisance of duplicate tracks. It utilizes the latest versions of n8n's Schedule trigger, Spotify, Switch, Merge, and IF nodes. Special thanks to trey for their original version of the workflow, as well as ihortom for their help with navigating the Switch node's outputs. To use this workflow, you'll need to: Create a playlist for use as the archive playlist within your Spotify account Create and select your Spotify credentials within each Spotify node within the workflow See workflow README for additional information and optional setup steps.
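The duplicate check can be pictured as a set comparison on track URIs. Below is a minimal Code-node sketch under the assumption that the Discover Weekly tracks and the archive playlist tracks are fetched by two separate Spotify nodes; the node names and the track.uri path are illustrative and may differ in your copy of the workflow.

```javascript
// Hypothetical dedup step: keep only Discover Weekly tracks whose URI is not
// already in the archive playlist. Node names and the track.uri path are
// assumptions; adjust them to your actual Spotify node outputs.
const archiveUris = new Set(
  $('Get Archive Playlist Tracks').all().map(item => item.json.track.uri)
);

return $('Get Discover Weekly Tracks')
  .all()
  .filter(item => !archiveUris.has(item.json.track.uri));
```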
by Airtop
About The LinkedIn Profile Discovery Automation Are you tired of manually searching for LinkedIn profiles or paying expensive data providers for often outdated information? If you spend countless hours trying to find accurate LinkedIn URLs for your prospects or candidates, this automation will change your workflow forever. Just give this workflow the information you have about a contact, and it will automatically augment it with a LinkedIn profile. How to find a LinkedIn Profile Link In this guide, you'll learn how to automate LinkedIn profile link discovery using Airtop's built-in node in n8n. Using this automation, you'll have a fully automated workflow that saves you hours of manual searching while providing accurate, validated LinkedIn URLs. What You'll Need A free Airtop API key A Google Workspace account. If you have a Gmail account, you're all set Estimated setup time: 10 minutes Understanding the Process This automation leverages the power of intelligent search algorithms combined with LinkedIn validation to ensure accuracy. Here's how it works: Takes your input data (name, company, etc.) and constructs intelligent search queries Utilizes Google search to identify potential LinkedIn profile URLs Validates the discovered URLs directly against LinkedIn to ensure accuracy Returns confirmed, accurate LinkedIn profile URLs Setting Up Your Automation Getting started with this automation is straightforward: Prepare Your Google Sheet Create a new Google Sheet with columns for input data (name, company, domain, etc.) Add columns for the output LinkedIn URL and validation status (see this example) Configure the Automation Connect your Google Workspace account to n8n if you haven't already Add your Airtop API credentials (Optionally) Configure your Airtop Profile and sign in to LinkedIn in order to validate profile URLs Run Your First Test Add a few test entries to your Google Sheet Run the workflow Check the results in your output columns Customization Options While the default setup uses Google Sheets, this automation is highly flexible: Webhook Integration**: Perfect for integrating with tools like Clay, Instantly, or your custom applications Alternatives**: Replace Google Sheets with Airtable, Notion, or any other tools you already use for more robust database capabilities Custom Output Formatting**: Modify the output structure to match your existing systems Batch Processing**: Configure for bulk processing of multiple profiles Real-World Applications This automation has the potential to transform how organizations handle profile enrichment. Recruiting Firm Success Story With this automation, a recruiting firm could save hundreds of dollars a month in data enrichment fees, achieve better accuracy, and eliminate subscription costs. They would also be able to process thousands of profiles weekly with near-perfect accuracy. Sales Team Integration A B2B sales team could integrate this automation with their CRM, automatically enriching new leads with validated LinkedIn profiles and saving their SDRs hours per week on manual research.
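To make the "constructs intelligent search queries" step concrete, here is a minimal sketch of how a search query could be assembled from a sheet row before it is handed to the Google search step. The column names (name, company, location) follow the example sheet and are assumptions; your sheet may use different headers.

```javascript
// Hypothetical query builder for the Google search step. Column names mirror
// the example sheet (name, company, location) and may differ in your sheet.
const { name, company, location } = $json;

const parts = [
  'site:linkedin.com/in',
  name ? `"${name}"` : '',
  company ? `"${company}"` : '',
  location || '',
];

return [{ json: { searchQuery: parts.filter(Boolean).join(' ') } }];
```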
Best Practices To maximize the accuracy of your results: Always include company information (domain or company name) with your search queries Use full names rather than nicknames or initials when possible Consider including location data for more accurate results with common names Implement rate limiting to respect LinkedIn's usage guidelines Keep your input data clean and standardized for best results Use the integrated proxy to navigate more effectively through Google and LinkedIn What's Next? Now that you've automated LinkedIn profile discovery, consider exploring related automations: Automated lead scoring based on LinkedIn profile data Email finder automation using validated LinkedIn profiles Integration with your CRM for automated contact enrichment
by Preston Zeller
How It Works This workflow automates the real estate lead qualification process by leveraging property data from BatchData. The automation follows these steps: When a new lead is received through your CRM webhook, the workflow captures their address information It then makes an API call to BatchData to retrieve comprehensive property details A sophisticated scoring algorithm evaluates the lead based on property characteristics like: Property value (higher values earn more points) Square footage (larger properties score higher) Property age (newer constructions score higher) Investment status (non-owner occupied properties earn bonus points) Lot size (larger lots receive additional points) Leads are automatically classified into categories (high-value, qualified, potential, or unqualified) The workflow updates your CRM with enriched property data and qualification scores High-value leads trigger immediate follow-up tasks for your team Notifications are sent to your preferred channel (Slack in this example) The entire process happens within seconds of receiving a new lead, ensuring your sales team can prioritize the most valuable opportunities immediately. Who It's For This workflow is perfect for: Real estate agents and brokers looking to prioritize high-value property leads Mortgage lenders who need to qualify borrowers based on property assets Home service providers (renovators, contractors, solar installers) targeting specific property types Property investors seeking specific investment opportunities Real estate marketers who want to segment audiences by property value Home insurance agents qualifying leads based on property characteristics Any business that bases lead qualification on property details will benefit from this automated qualification system. About BatchData BatchData is a comprehensive property data provider that offers detailed information about residential and commercial properties across the United States. Their API provides: Property valuation and estimates Ownership information Property characteristics (size, age, bedrooms, bathrooms) Tax assessment data Transaction history Occupancy status (owner-occupied vs. investment) Lot details and dimensions By integrating BatchData with your lead management process, you can automatically verify and enrich leads with accurate property information, enabling more intelligent lead scoring and routing based on actual property characteristics rather than just contact information. This workflow demonstrates how to leverage BatchData's property API to transform your lead qualification process from manual research into an automated, data-driven system that ensures high-value leads receive immediate attention.
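To make the scoring step concrete, here is an illustrative sketch of how such a weighted score could be computed in a Code node. The point values, thresholds, and BatchData field names (estimatedValue, squareFeet, yearBuilt, ownerOccupied, lotSizeAcres) are assumptions, not the exact values used by the workflow.

```javascript
// Illustrative lead-scoring sketch; thresholds, points, and BatchData field
// names are assumptions to show the shape of the logic, not the real config.
const p = $json.property || {};
let score = 0;

if (p.estimatedValue > 500000) score += 30;
else if (p.estimatedValue > 250000) score += 15;

if (p.squareFeet > 2500) score += 20;        // larger properties score higher
if (p.yearBuilt > 2010) score += 15;         // newer construction scores higher
if (p.ownerOccupied === false) score += 10;  // non-owner-occupied bonus
if (p.lotSizeAcres > 0.5) score += 10;       // larger lots earn extra points

const category =
  score >= 60 ? 'high-value' :
  score >= 40 ? 'qualified'  :
  score >= 20 ? 'potential'  : 'unqualified';

return [{ json: { ...$json, leadScore: score, leadCategory: category } }];
```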
by Encoresky
This workflow automates the process of handling conversation transcriptions and distributing key information across your organization. Here's what it does: Trigger: The workflow is initiated via a webhook that receives a transcription (e.g., from a call or meeting). Summarization & Extraction: Using AI, the transcription is summarized, and key information is extracted, such as action items, departments involved, and client details. Department Notifications: The relevant summarized information is automatically routed to specific departments via email based on content classification. CRM Sync: The summarized version is saved to the associated contact or deal in HubSpot for future reference and visibility. Multi-Channel Alerts: The summary is also sent via WhatsApp and Slack to keep internal teams instantly informed, regardless of platform. Use Case: Ideal for sales, customer service, or operations teams who manage client conversations and want to ensure seamless cross-departmental communication, documentation, and follow-up. Apps Used: Webhook (Trigger) OpenAI (or other AI/NLP for summarization) HubSpot Email Slack WhatsApp (via Twilio or third-party provider)
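As a rough illustration of the routing step, the AI's output can be fanned out into one item per department so each department gets its own email. The field names (summary, actionItems, departments) and the inbox mapping below are assumptions for the sketch; adapt them to whatever structure your AI node actually returns.

```javascript
// Hypothetical fan-out: one item per department for the email node.
// Assumes the AI step returned { summary, actionItems, departments }.
const { summary, actionItems = [], departments = [] } = $json;

const inboxes = {
  sales: 'sales@example.com',
  support: 'support@example.com',
  operations: 'ops@example.com',
};

return departments.map(dept => ({
  json: {
    department: dept,
    to: inboxes[dept] || 'info@example.com',
    subject: `Conversation summary for ${dept}`,
    body: `${summary}\n\nAction items:\n- ${actionItems.join('\n- ')}`,
  },
}));
```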
by WeblineIndia
This smart automation workflow created by the AI development team at WeblineIndia helps with the daily collection and storage of weather data. Using the OpenWeatherMap API and Airtable, this solution gathers vital weather details such as temperature, humidity, and wind speed. The automation ensures daily updates, creating a dependable historical record of weather patterns for future reference and analysis. Steps: Set Schedule Trigger Configure a Cron node to trigger the workflow daily, for example, at 7 AM. Fetch Weather Data (HTTP Request) Use the HTTP Request node to retrieve weather data from the OpenWeatherMap API. Include your API key and query parameters (e.g., q=London, units=metric) to specify the city and desired units. Parse Weather Data Utilize a JSON Parse node to extract key weather details, such as temperature, humidity, and wind speed, from the API response. Store Data in Airtable Use the Airtable node to insert the parsed data into the designated Airtable table. Ensure proper mapping of fields like temperature, humidity, and wind speed. Save and Execute Save the workflow and activate it to ensure weather data is fetched and stored automatically every day. Outcome This robust solution, developed by WeblineIndia, reliably collects and archives daily weather data, providing businesses and individuals with an accessible record of weather trends for analysis and decision-making. About WeblineIndia We specialize in creating custom automation solutions and innovative software workflows to help businesses streamline operations and achieve efficiency. This weather data fetcher is just one example of our expertise in delivering value through technology.
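The parsing step boils down to picking a handful of fields out of the OpenWeatherMap response. A minimal Code-node sketch is shown below; the paths (name, main.temp, main.humidity, wind.speed) follow OpenWeatherMap's current-weather response, and the output keys should match your Airtable columns.

```javascript
// Extract the values written to Airtable from the OpenWeatherMap
// current-weather response (main.temp, main.humidity, wind.speed).
const data = $json;

return [{
  json: {
    date: new Date().toISOString().slice(0, 10),
    city: data.name,
    temperature: data.main.temp,   // °C when units=metric
    humidity: data.main.humidity,  // %
    windSpeed: data.wind.speed,    // m/s when units=metric
  },
}];
```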
by Robert Breen
Email Unsubscribe Handler for Outlook Description This n8n workflow automatically scans recent email replies from your Outlook inbox and identifies unsubscribe requests. If a contact replies with any variation of "unsubscribe" within the past 7 days, the system performs two key actions: Saves the contact's email address in a BigQuery unsubscribes table (for compliance and tracking). Deletes that contact from the active leads table in BigQuery (to stop future outreach). This flow can be triggered on a schedule (e.g. every 4 hours) or run manually as needed. Key Features Email Parsing from Outlook: Automatically monitors for replies that contain unsubscribe language. Smart Filtering: Captures unsubscribes based on message content, not just subject lines. BigQuery Integration: Logs unsubscribed emails and removes them from your leads dataset. Connect with Me Description I'm Robert Breen, founder of Ynteractive, a consulting firm that helps businesses automate operations using n8n, AI agents, and custom workflows. I've helped clients build everything from intelligent chatbots to complex sales automations, and I'm always excited to collaborate or support new projects. If you found this workflow helpful or want to talk through an idea, I'd love to hear from you. Links Website: https://www.ynteractive.com YouTube: @ynteractivetraining LinkedIn: https://www.linkedin.com/in/robert-breen Email: rbreen@ynteractive.com
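The unsubscribe check is essentially a case-insensitive match on the message text within the 7-day window. Below is a minimal sketch assuming the Outlook node returns Graph-style messages (subject, body.content, from.emailAddress.address, receivedDateTime); adjust the paths if your node output differs.

```javascript
// Filter recent Outlook replies down to unsubscribe requests and emit the
// addresses to log in BigQuery. Field paths follow the Microsoft Graph
// message format and may need adjusting for your node configuration.
const cutoff = Date.now() - 7 * 24 * 60 * 60 * 1000;

return $input.all()
  .filter(item => {
    const msg = item.json;
    const text = `${msg.subject || ''} ${msg.body?.content || ''}`.toLowerCase();
    return new Date(msg.receivedDateTime).getTime() >= cutoff && text.includes('unsubscribe');
  })
  .map(item => ({
    json: {
      email: item.json.from?.emailAddress?.address,
      unsubscribedAt: new Date().toISOString(),
    },
  }));
```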
by Bastien Laval
Description Boost your productivity and keep your Asana workspace clutter-free with this n8n workflow. It automatically scans for tasks whose due dates have passed and reschedules them to the current date, ensuring no important to-dos slip through the cracks. Additionally, any completed tasks in Asana with an overdue date are removed, maintaining a clear, organized task list. Key Benefits Streamline Task Management**: No more manual updates; let the workflow reschedule overdue tasks for you. Optimize Workspace Organization**: Eliminate finished tasks to focus on active priorities and reduce clutter. Save Time and Effort**: Automate repetitive maintenance, freeing you to concentrate on what truly matters. Configuration Steps Add your Asana credentials Schedule the workflow to run at desired intervals (e.g., daily or weekly). Select your Workspace Name and your Assignee Name (user) in the Get user tasks node (Optional) Tailor filtering conditions to match your preferred due-date rules and removal criteria. Activate the workflow and watch your Asana workspace stay up to date and clutter-free.
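The filtering condition can be expressed as a simple date comparison. The sketch below assumes the Asana node returns tasks with due_on (YYYY-MM-DD) and completed fields, which the Asana task object provides; the action labels are illustrative.

```javascript
// Split overdue Asana tasks into "reschedule" (incomplete) and "delete"
// (completed) buckets, based on the due_on and completed fields.
const today = new Date().toISOString().slice(0, 10);

return $input.all()
  .filter(item => item.json.due_on && item.json.due_on < today)
  .map(item => ({
    json: {
      ...item.json,
      action: item.json.completed ? 'delete' : 'reschedule',
      new_due_on: item.json.completed ? null : today,
    },
  }));
```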
by Arlin Perez
Make your n8n instance faster, cleaner, and more efficient by deleting old workflow executions while keeping only the most recent ones you actually need. Whether you're using n8n Cloud or self-hosted, this lightweight workflow helps reduce database/storage usage and improves UI responsiveness, using only official n8n nodes. Description Automatically clean up old executions in your n8n instance using only official nodes, with no external database queries required. Whether you're on the Cloud version or running self-hosted, this workflow helps you optimize performance and keep your instance tidy by maintaining only the most recent executions per workflow. Ideal for users managing dozens or hundreds of workflows, this solution reduces storage usage and improves the responsiveness of the n8n UI, especially in environments where execution logs can accumulate quickly. What It Does Retrieves up to 250 recent executions across all workflows Groups executions by workflow Keeps only the most recent N executions per workflow (value is configurable) Deletes all older executions (regardless of their status: success, error, etc.) Works entirely with native n8n nodes, so no external database access is required Optionally: set the number of executions to keep as 0 to delete all past executions from your instance in a single run How to Set Up 1. Create a Personal API Key in your n8n instance: Go to Settings > API Keys > Create a new key 2. Create a new n8n API Credential (used by both nodes): In your n8n credentials panel: Name: anything you like (e.g., "Internal API Access") API Key: paste the Personal API Key you just created Base URL: your full n8n instance URL with the /api/v1 path, e.g. https://your-n8n-instance.com/api/v1 3. Use this credential in both: The Get Many Executions node (to fetch recent executions) The Delete Many Executions node (to remove outdated executions) 4. In the "Set Executions to Keep" node: Edit the variable executionsToKeep and set the number of most recent executions to retain per workflow (e.g. 10) Tip: Set it to 0 to delete all executions Note: The "Get Many Executions" node will retrieve up to 250 executions per run; this is the maximum allowed by the n8n API. No further setup is required: the filtering and grouping logic is handled inside the Code Node automatically (see the sketch below). Included Nodes Overview Schedule Trigger - Set to run daily, weekly, etc. Get Many Executions - Fetches past executions via n8n API Set Executions to Keep - Set how many recent ones to keep Code Node - Filters out executions to delete per workflow Delete Executions - Deletes outdated executions Why Use This? Reduce clutter and improve performance in your n8n instance Maintain execution logs only when they're useful Avoid bloating your storage or database with obsolete data Compatible with both n8n Cloud and self-hosted setups Uses only official, supported n8n nodes: no SQL, no extra setup Note: This workflow modifies and deletes execution data. Always review and test it first on a staging instance or on a limited set of workflows before using it in production.
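Here is a rough sketch of what that Code Node logic looks like: group the fetched executions by workflow, keep the newest executionsToKeep per workflow, and emit the rest for deletion. The exact code in the template may differ; the field names (id, workflowId, startedAt) follow the n8n executions API.

```javascript
// Sketch of the grouping/filtering step: keep the newest N executions per
// workflow and return the older ones so the delete node can remove them.
const executionsToKeep = $('Set Executions to Keep').first().json.executionsToKeep;

const byWorkflow = {};
for (const item of $input.all()) {
  const wfId = item.json.workflowId;
  (byWorkflow[wfId] = byWorkflow[wfId] || []).push(item.json);
}

const toDelete = [];
for (const executions of Object.values(byWorkflow)) {
  executions.sort((a, b) => new Date(b.startedAt) - new Date(a.startedAt));
  toDelete.push(...executions.slice(executionsToKeep));
}

return toDelete.map(execution => ({ json: { id: execution.id } }));
```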
by Sankalp Dev
This automation workflow transforms Meta advertising data into executive-ready presentation decks, eliminating manual report creation while ensuring stakeholders receive consistent performance insights. It generates professional Google Slides presentations from your ad campaigns and delivers them automatically via email to designated recipients. By combining scheduled data extraction with AI-powered analysis and automated presentation building, you'll receive polished, actionable reports that facilitate strategic advertising decisions and client communication. Key Features: Scheduled automated summary deck generation (daily, weekly, or monthly) AI-powered data analysis using advanced language models Intelligent presentation generation with actionable recommendations Direct email delivery of formatted summary decks Prerequisites: GoMarble MCP account and API access Anthropic account Google Slides, Google Drive & Gmail accounts n8n instance (cloud or self-hosted) Configuration Time: ~15-20 minutes Step By Step Setup: 1. Connect GoMarble MCP to n8n Follow the integration guide: GoMarble MCP Setup Configure your Meta Ads account credentials in the GoMarble platform 2. Configure the Schedule Trigger 3. Customize the Ad Account Settings. Update the account name to match your ad account name. 4. Customize the Report Prompt (although the workflow includes a pre-configured template report prompt) Define specific metrics and KPIs to track Set analysis parameters and report format preferences 5. Set up AI Agent Configuration Configure Anthropic Claude model with your API credentials Connect the GoMarble MCP tools for Meta advertising data 6. Configure Google Services Integration Set up Google Slides OAuth2 API for presentation creation Configure Google Drive OAuth2 API for file management Link Gmail OAuth2 for automated email delivery 7. Customize Email Delivery Set recipient email addresses for stakeholders Customize email subject line and message content Advanced Configuration Modify report prompt to include specific metrics and KPIs Adjust slide content structure (5-slide format: Executive Snapshot, Channel KPIs, Top Campaigns, Under-performers, Action Recommendations) What You'll Get Automated Presentation Creation: Weekly Google Slides decks generated without manual intervention Professional Ads Analysis: Executive-ready performance summaries with key metrics and insights Structured Intelligence: Consistent 5-slide format covering spend, ROAS, campaign performance, and strategic recommendations Direct Stakeholder Delivery: Presentations automatically emailed as attachments to specified recipients Data-Driven Insights: AI-powered analysis of campaign performance with actionable next steps Scalable Reporting: Easy to modify timing, recipients, or content structure as business needs evolve Perfect for marketing teams, agencies, and business owners who need regular Meta advertising performance updates delivered professionally without manual report creation.
by Hemanth Arety
Automatically fetch, curate, and distribute Reddit content digests using AI-powered filtering. This workflow monitors multiple subreddits, ranks posts by relevance, removes spam and duplicates, then delivers beautifully formatted digests to Telegram, Discord, or Slack. Who's it for Perfect for content creators tracking trends, marketers monitoring discussions, researchers following specific topics, and community managers staying informed. Anyone who wants high-quality Reddit updates without manually browsing multiple subreddits. How it works The workflow fetches top posts from your chosen subreddits using Reddit's JSON API (no authentication required). Posts are cleaned, deduplicated, and filtered by upvote threshold and custom keywords. An AI model (Google Gemini, OpenAI, or Claude) then ranks remaining posts by relevance, filters out low-quality content, and generates a formatted digest. The final output is delivered to your preferred messaging platform on a schedule or on-demand. Setup requirements n8n version 1.0+ AI provider API key (Google Gemini recommended - has free tier) At least one messaging platform configured: Telegram bot token + chat ID Discord webhook URL Slack OAuth token + channel access How to set up Open the Configuration node and edit subreddit list, post counts, and keywords Configure the Schedule Trigger or use manual execution Add your AI provider credentials in the AI Content Curator node Enable and configure your preferred delivery platform (Telegram/Discord/Slack) Test with manual execution, then activate the workflow Customization options Subreddits**: Add unlimited subreddits to monitor (comma-separated) Time filters**: Choose from hour, day, week, month, year, or all-time top posts Keywords**: Set focus keywords to prioritize and exclude keywords to filter out Post count**: Adjust how many posts to fetch vs. how many appear in final digest AI prompt**: Customize ranking criteria and output format in the AI node Schedule**: Use cron expressions for hourly, daily, or weekly digests Output format**: Modify the formatting code to match your brand style Add email notifications, database storage, or RSS feed generation by extending the workflow with additional nodes.
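Because Reddit's public JSON endpoints need no authentication, fetching "top" posts is just a matter of building the right URL per subreddit. The sketch below shows one way to do that from the configuration values; the property names (subreddits, timeFilter, postCount) are assumptions and may not match the actual Configuration node keys.

```javascript
// Build one request URL per subreddit for Reddit's public JSON listing.
// Configuration keys (subreddits, timeFilter, postCount) are assumed names.
const { subreddits, timeFilter = 'day', postCount = 25 } = $json;

return subreddits.split(',').map(name => ({
  json: {
    subreddit: name.trim(),
    url: `https://www.reddit.com/r/${name.trim()}/top.json?t=${timeFilter}&limit=${postCount}`,
  },
}));
```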
by Felix Kemeth
Overview Staying up to date with fast-moving topics like AI, machine learning, or your specific industry can be overwhelming. You either drown in daily noise or miss important developments between weekly digests. This AI News Agent workflow delivers a curated newsletter only when there's genuinely relevant news. I use it myself for AI and n8n topics. Key features: AI-driven send decision**: An AI agent evaluates whether today's news is worth sending. Deduplication**: Compares candidate articles against past newsletters to avoid repetition. Real-time news**: Uses SerpAPI's DuckDuckGo News engine for fresh results. Frequency guardrails**: Configure minimum and maximum days between newsletters. In this post, I'll walk you through the complete workflow, explain each component, and show you how to set it up yourself. What this workflow does At a high level, the AI News Agent: Fetches fresh news twice daily via SerpAPI's DuckDuckGo News engine. Stores articles in a persistent data table with automatic deduplication. Filters for freshness - only considers articles newer than your last newsletter. Applies frequency guardrails - respects your min/max sending preferences. Makes an editorial decision - AI evaluates if the news is worth sending. Enriches selected articles - uses Tavily web search for fact-checking and depth. Delivers via Telegram - sends a clean, formatted newsletter. Remembers what it sent - stores each edition to prevent future repetition. This allows you to get newsletters only when there's genuinely relevant news - in contrast to a fixed schedule. Requirements To run this workflow, you need: SerpAPI key** Create an account at serpapi.com and generate an API key. They offer 250 free searches/month. Tavily API key** Sign up at app.tavily.com and create an API key. Generous free tier available. OpenAI API key** Get one from OpenAI - required for AI agent calls. Telegram bot + chat ID** A free Telegram bot (via BotFather) and the chat/channel ID where you want the newsletter. See Telegram's bot tutorial for setup. How it works The workflow is organized into five logical stages. Stage 1: Schedule & Configuration Schedule Trigger** Runs the workflow on a cron schedule. Default: 0 0 9,17 * * * (twice daily at 9:00 and 17:00). These frequent checks enable the AI to send newsletters at these times when it observes actually relevant news, not only once a week. I picked 09:00 and 17:00 as natural check-in points at the start and end of a typical workday, so you see updates when you're most likely to read them without being interrupted in the middle of deep work. With SerpAPI's 250 free searches/month, running twice per day with a small set of topics (e.g. 2-3) keeps you comfortably below the limit; if you add more topics or increase the schedule frequency, either tighten the cron window or move to a paid SerpAPI plan to avoid hitting the cap. Set topics and language** A Set node that defines your configuration: topics: comma-separated list (e.g., AI, n8n) language: output language (e.g., English) minDaysBetween: minimum days to wait (0 = no minimum) maxDaysBetween: maximum days without sending (triggers a "must-send" fallback) Stage 2: Fetch & Store News Build topic queries** Splits your comma-separated topics into individual search queries: In DuckDuckGo News via SerpAPI, a query like AI,n8n looks for news where both "AI" and "n8n" appear. For a niche tool like n8n, this is often almost identical to just searching for n8n (docs).
It's therefore better to split the topics, search for each of them separately, and let the AI later decide which news articles to select. return $input.first().json.topics.split(',').map(topic => ({ json: { topic: topic.trim() } })); Fetch news from SerpAPI (DuckDuckGo News)** HTTP Request node calling SerpAPI with: engine: duckduckgo_news q: your topic df: d (last day) Auth is handled via httpQueryAuth credentials with your SerpAPI key. SerpAPI also offers other news engines such as the Google News API (see here). DuckDuckGo News is used here because, unlike Google News, it returns an excerpt/snippet in addition to the title, source, and URL (see here), giving the AI more context to work with. _Another option is NewsAPI, but its free tier delays articles by 24 hours, so you miss the freshness window that makes these twice-daily checks valuable. DuckDuckGo News through SerpAPI keeps the workflow real-time without that lag._ n8n has official SerpAPI nodes, but as of writing there is no dedicated node for the DuckDuckGo News API. That's why this workflow uses a custom HTTP Request node instead, which works the same under the hood while giving you full control over the DuckDuckGo News parameters. Split SerpAPI results into articles** Expands the results array so each article becomes its own item. Upsert articles into News table** Stores each article in an n8n data table with fields: title, source, url, excerpt, date. Uses upsert on title + URL to avoid duplicates. Date is normalized to ISO UTC: DateTime.fromSeconds(Number($json.date), {zone: 'utc'}).toISO() Stage 3: Filtering & Frequency Guardrails This is where the workflow gets smart about what to consider and when to send. Get previous newsletters → Sort → Get most recent** Pulls all editions from the Newsletters table and isolates the latest one with its createdAt timestamp. Combine articles with last newsletter metadata** Attaches the last newsletter timestamp to each candidate article. Filter articles newer than last newsletter** Keeps only articles published after the last edition. Uses a safe default date (2024-01-01) if no previous newsletter exists: $json.date_2 > ($json.createdAt_1 || DateTime.fromISO('2024-01-01T00:00:00.000Z')) Stop if last newsletter is too recent** Compares createdAt against your minDaysBetween setting. If you're still in the "too soon to send" window, the workflow short-circuits here. Stage 4: AI Editorial Decision This is the core intelligence of the workflow - an AI that decides whether to send and what to include. This stage is also the actual agentic part of the workflow, where the system makes its own decisions instead of just following a fixed schedule. Aggregate candidate articles for AI** Bundles today's filtered articles into a compact list with title, excerpt, source, and url. Limit previous newsletters to last 5 → Aggregate** Prepares the last 5 newsletter contents for the AI to check against for repetition. Combine candidate articles with past newsletters** Merges both lists so the AI sees "today's candidates" + "recent history" side by side. AI: decide send + select articles** The heart of the workflow. A GPT-5.1 call with a comprehensive editorial prompt: You are an AI Newsletter Editor. Your job is to decide whether today's newsletter edition should be sent, and to select the best articles. You will receive a list of articles with: 'title', 'excerpt', source, url. You will also receive content of previously sent newsletters (markdown). Your Tasks 1.
Decide whether to send the newsletter Output "YES" only if all of the following are satisfied OR the fallback rule applies: Base Criteria There are at least 3 meaningful articles. Meaningful = not trivial, not purely promotional, not clickbait, contains actual informational value. Articles must be non-duplicate and non-overlapping: Not the same topic/headline rephrased Not reporting identical events with minor variations Not the same news covered by multiple sources without distinct insights Articles must be relevant to the user's topics: {{ $('Set topics and language').item.json.topics }} Articles must be novel relative to the topics in previous newsletters: Compare against all previous newsletters below Exclude articles that discuss topics already substantially covered Articles must offer clear value: New information Impact that matters to the user Insight, analysis, or meaningful expansion Fallback rule: Newsletter frequency requirement If at least 1 relevant article exists and the last newsletter was sent more than {{ $('Set topics and language').item.json.maxDaysBetween }} days ago, then you MUST return "YES" as a decision even if the other criteria are not completely met. Last newsletter was sent {{ $('Get most recent newsletter').item.json.createdAt ? Math.floor($now.diff(DateTime.fromISO($('Get most recent newsletter').item.json.createdAt), 'days').days) : 999 }} days ago. Otherwise → "NO" 2. If "YES": Select Articles Select the top 3-5 articles that best fulfill the criteria above. For each selected article, output: title** (rewrite for clarity, conciseness, and impact) summary** (1-2 sentences; written in the output language) source** url** All summaries must be written in: {{ $('Set topics and language').item.json.language }} Output Format (JSON) { "decision": "YES or NO", "articles": [ { "title": "...", "summary": "...", "source": "...", "url": "..." } ] } When "decision": "NO", return an empty array for "articles". Article Input Use these articles: {{ $json.results.map( article => `Title: ${article.title_2} Excerpt: ${article.excerpt_2} Source: ${article.source_2} URL: ${article.url_2}` ).join('\n---\n') }} You must also consider the topics already covered in previous newsletters to avoid repetition: {{ $json.newsletters.map(x => `Newsletter: ${x.content}`).join('\n---\n') }} The AI outputs structured JSON: { "decision": "YES", "articles": [ { "title": "...", "summary": "...", "source": "...", "url": "..." } ] } If AI decided to send newsletter** Routes based on decision === "YES". If NO, the workflow ends gracefully. Stage 5: Content Enrichment & Delivery Split selected articles for enrichment** Each selected article becomes its own item for individual processing. AI: enrich & write article** An AI Agent node with GPT-5.1 + Tavily web search tool. For each article: You are a research writer that updates short news summaries into concise, factual articles. Input: Title: {{ $json["title"] }} Summary: {{ $json["summary"] }} Source: {{ $json["source"] }} Original URL: {{ $json["url"] }} Language: {{ $('Set topics and language').item.json.language }} Instructions: Use Tavily Search to gather 2-3 reliable, recent, and relevant sources on this topic. Update the title if a more accurate or engaging one exists. Write 1-2 sentences summarizing the topic, combining the original summary and information from the new sources. Return the original source name and url as well.
Output (JSON): { "title": "final article title", "content": "concise 1-2 sentence article content", "source": "the name of the original source", "url": "the url of the original source" } Rules: Ensure the topic is relevant, informative, and timely. Translate the article if necessary to comply with the desired language {{ $('Set topics and language').item.json.language }}. The Output Parser enforces the JSON schema with title, content, source, and url fields. Aggregate enriched articles** Collects all enriched articles back into a single array. Insert newsletter content into Newsletters table** Stores the final markdown content for future deduplication: $json.output.map(article => { const title = JSON.stringify(article.title).slice(1, -1); const content = JSON.stringify(article.content).slice(1, -1); const source = JSON.stringify(article.source).slice(1, -1); const url = JSON.stringify(article.url).slice(1, -1); return `${title}\n${content}\nSource: ${source}`; }).join('\n\n') Send newsletter to Telegram** Sends the formatted newsletter to your Telegram chat/channel. Why this workflow is powerful Intelligent send decisions** The AI evaluates news quality before sending, leading to a less noisy and more relevant news digest. Memory across editions** By persisting newsletters and comparing against history, the workflow avoids repetition. Frequency guardrails with flexibility** Set boundaries (e.g., "at least 1 day between sends" and "must send within 5 days"), but let the AI decide the optimal moment within those bounds. Source-level deduplication** The news table with upsert prevents the same article from being considered multiple times across runs. Grounded in facts** SerpAPI provides real news sources; Tavily enriches with additional verification. The newsletter stays factual. Configurable and extensible** Change topics, language, frequency - all in one Set node. In addition, the workflow is modular, allowing you to add new news sources or new delivery channels without touching the core logic. Configuration guide To customize this workflow for your needs: Topics and language Open Set topics and language and modify: topics: your interests (e.g., machine learning, startups, TypeScript) language: your preferred output language Frequency settings minDaysBetween: minimum days between newsletters (0 = no limit) maxDaysBetween: maximum gap before forcing a send For very high-volume topics (such as "AI"), expect the workflow to send almost every time once minDaysBetween has passed, because the content-quality criteria are usually met. Schedule Modify the Schedule Trigger cron expression. Default runs twice daily at 9:00 am and 5:00 pm; adjust to your preference. Telegram Update the chatId in the Telegram node to your chat/channel. Credentials Set up credentials for: SerpAPI (httpQueryAuth), Tavily, OpenAI, Telegram. Next steps and improvements Here are concrete directions to take this workflow further: Multi-agent architecture** Split the current AI calls into specialized agents: signal detection, relevance scoring, editorial decision, content enhancement, and formatting - each with a single responsibility. 1:1 personalization** Move from static topics to weighted preferences. Learn from click behavior and feedback. Telegram feedback buttons** Add inline buttons (Useful / Not relevant / More like this) and feed signals back into ranking. Email with HTML template** For more flexibility, send the newsletter via email.
Incorporating other news APIs or RSS feeds** Add more sources such as other news APIs and RSS feeds from blogs, newsletters, or communities. Adjust for arXiv paper search and research news** Swap SerpAPI for arXiv search or other academic sources to obtain a personal research digest newsletter. Images and thumbnails** Fetch representative images for each article and include them in the newsletter. Web archive** Auto-publish each edition as a web page with permalinks. Retry logic and error handling** Add exponential backoff for external APIs and route failures to an error workflow. Prompt versioning** Move prompts to a data table with versioning for A/B testing and rollback. Audio and video news** Use audio or video models for better news communication. Wrap-up This AI News Agent workflow represents a significant evolution from simple scheduled newsletters. By adding intelligent send decisions, historical deduplication, and frequency guardrails, you get a newsletter that respects the quality of available news. I use this workflow myself to stay informed on AI and automation topics without the overload of daily news or the delayed delivery caused by a fixed newsletter schedule. Need help with your automations? Contact me here.