by Ranjan Dailata
This workflow automates competitor keyword research using an OpenAI LLM and Decodo for intelligent web scraping.

**Who this is for**

- SEO specialists, content strategists, and growth marketers who want to automate keyword research and competitive intelligence.
- Marketing analysts managing multiple clients or websites who need consistent SEO tracking without manual data pulls.
- Agencies or automation engineers using Google Sheets as an SEO data dashboard for keyword monitoring and reporting.

**What problem this workflow solves**

Tracking competitor keywords manually is slow and inconsistent. Most SEO tools provide limited API access or lack contextual keyword analysis. This workflow solves that by:

- Automatically scraping any competitor's webpage with Decodo.
- Using OpenAI GPT-4.1-mini to interpret keyword intent, density, and semantic focus.
- Storing structured keyword insights directly in Google Sheets for ongoing tracking and trend analysis.

**What this workflow does**

1. **Trigger**: Manually start the workflow or schedule it to run periodically.
2. **Input Setup**: Define the website URL and target country (e.g., https://dev.to, france).
3. **Data Scraping (Decodo)**: Fetch competitor web content and metadata.
4. **Keyword Analysis (OpenAI GPT-4.1-mini)**: Extract primary and secondary keywords, identify focus topics and semantic entities, generate a keyword density summary and SEO strength score, and recommend optimization and internal linking opportunities.
5. **Data Structuring**: Clean and convert GPT output into JSON format.
6. **Data Storage (Google Sheets)**: Append structured keyword data to a Google Sheet for long-term tracking.

**Setup**

Prerequisites:

- If you are new to Decodo, please sign up at visit.decodo.com
- n8n account with workflow editor access
- Decodo API credentials
- OpenAI API key
- Google Sheets account connected via OAuth2
- Make sure to install the Decodo Community node.

Create a Google Sheet with columns for primary_keywords, seo_strength_score, keyword_density_summary, etc., and share it with your n8n Google account.

Connect credentials:

- Decodo API: register, log in, and obtain the Basic Authentication token via the Decodo Dashboard
- OpenAI API (for GPT-4.1-mini)
- Google Sheets OAuth2

Configure input fields: edit the "Set Input Fields" node to set your target site and region.

Run the workflow: click Execute Workflow in n8n, then view structured results in your connected Google Sheet.

**How to customize this workflow**

- **Track Multiple Competitors** → Use a Google Sheet or CSV list of URLs; loop through them using the Split In Batches node.
- **Add Language Detection** → Add a Gemini or GPT node before keyword analysis to detect content language and adjust prompts.
- **Enhance the SEO Report** → Expand the GPT prompt to include backlink insights, metadata optimization, or readability checks.
- **Integrate Visualization** → Connect your Google Sheet to Looker Studio for SEO performance dashboards.
- **Schedule Auto-Runs** → Use the Cron node to run weekly or monthly for competitor keyword refreshes.

**Summary**

This workflow automates competitor keyword research using:

- **Decodo** for intelligent web scraping
- **OpenAI GPT-4.1-mini** for keyword and SEO analysis
- **Google Sheets** for live tracking and reporting

It's a complete AI-powered SEO intelligence pipeline ideal for teams that want actionable insights on keyword gaps, optimization opportunities, and content focus trends, without relying on expensive SEO SaaS tools.
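The Data Structuring step described above (cleaning GPT output into the Sheet columns) can be sketched as a small Code-node helper. This is a minimal sketch, not the template's actual code: the field names are taken from the Sheet columns listed in the setup, and the fence-stripping logic is an assumption about how the model's reply arrives.

```javascript
// Hypothetical helper: clean a GPT reply and map it to the Google Sheet
// columns used in this workflow. Assumes the model was asked to answer
// with a JSON object, possibly wrapped in markdown code fences.
function parseKeywordReport(raw) {
  // strip leading/trailing ``` fences that LLMs often add around JSON
  const cleaned = raw
    .replace(/^```(?:json)?\s*/i, '')
    .replace(/```\s*$/, '')
    .trim();
  const data = JSON.parse(cleaned);
  return {
    primary_keywords: (data.primary_keywords || []).join(', '),
    seo_strength_score: data.seo_strength_score ?? null,
    keyword_density_summary: data.keyword_density_summary || '',
  };
}
```

In n8n, a Code node would return `[{ json: parseKeywordReport($json.text) }]` and feed the result straight into the Google Sheets append node.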
by Sankalp Dev
This automation workflow transforms Meta advertising data into executive-ready presentation decks, eliminating manual report creation while ensuring stakeholders receive consistent performance insights. It generates professional Google Slides presentations from your ad campaigns and delivers them automatically via email to designated recipients. By combining scheduled data extraction with AI-powered analysis and automated presentation building, you'll receive polished, actionable reports that facilitate strategic advertising decisions and client communication.

**Key Features**

- Scheduled automated summary deck generation (daily, weekly, or monthly)
- AI-powered data analysis using advanced language models
- Intelligent presentation generation with actionable recommendations
- Direct email delivery of formatted summary decks

**Prerequisites**

- GoMarble MCP account and API access
- Anthropic account
- Google Slides, Google Drive & Gmail accounts
- n8n instance (cloud or self-hosted)

Configuration time: ~15-20 minutes

**Step-by-Step Setup**

1. Connect GoMarble MCP to n8n: follow the integration guide (GoMarble MCP Setup) and configure your Meta Ads account credentials in the GoMarble platform.
2. Configure the Schedule Trigger.
3. Customize the ad account settings: update the account name to match your ad account name.
4. Customize the report prompt (the workflow includes a pre-configured template): define specific metrics and KPIs to track, and set analysis parameters and report format preferences.
5. Set up the AI Agent: configure the Anthropic Claude model with your API credentials and connect the GoMarble MCP tools for Meta advertising data.
6. Configure Google services: set up Google Slides OAuth2 for presentation creation, Google Drive OAuth2 for file management, and Gmail OAuth2 for automated email delivery.
7. Customize email delivery: set recipient email addresses for stakeholders, and customize the email subject line and message content.

**Advanced Configuration**

- Modify the report prompt to include specific metrics and KPIs
- Adjust the slide content structure (5-slide format: Executive Snapshot, Channel KPIs, Top Campaigns, Under-performers, Action Recommendations)

**What You'll Get**

- **Automated Presentation Creation**: Weekly Google Slides decks generated without manual intervention
- **Professional Ads Analysis**: Executive-ready performance summaries with key metrics and insights
- **Structured Intelligence**: Consistent 5-slide format covering spend, ROAS, campaign performance, and strategic recommendations
- **Direct Stakeholder Delivery**: Presentations automatically emailed as attachments to specified recipients
- **Data-Driven Insights**: AI-powered analysis of campaign performance with actionable next steps
- **Scalable Reporting**: Easy to modify timing, recipients, or content structure as business needs evolve

Perfect for marketing teams, agencies, and business owners who need regular Meta advertising performance updates delivered professionally without manual report creation.
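The 5-slide structure described above can be thought of as a fixed outline the AI fills in on every run. A minimal sketch: the slide titles come from the description, while the metric lists are illustrative assumptions only.

```javascript
// Fixed 5-slide outline for the Meta ads report deck.
// Titles match the structure described above; metrics are examples only.
const slideOutline = [
  { title: 'Executive Snapshot', metrics: ['spend', 'roas', 'conversions'] },
  { title: 'Channel KPIs', metrics: ['cpm', 'ctr', 'cpc'] },
  { title: 'Top Campaigns', metrics: ['roas', 'spend'] },
  { title: 'Under-performers', metrics: ['roas', 'frequency'] },
  { title: 'Action Recommendations', metrics: [] },
];

// Helper the slide-building step could use to iterate the deck in order.
function deckTitles(outline) {
  return outline.map(slide => slide.title);
}
```

Keeping the outline in one place makes it easy to add or reorder slides without touching the prompt or the Google Slides node mapping.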
by oka hironobu
Forecast sales trends and generate reports with Stripe, Sheets, and Gemini AI

**Who is this for**

Revenue operations teams, SaaS growth managers, and sales directors who need automated weekly insights from their Stripe payment data. Perfect for small to medium businesses tracking subscription revenue, one-time charges, and refund patterns without manual spreadsheet work.

**How it works**

Every Monday morning, the workflow pulls the previous week's charges, subscriptions, and refunds from Stripe's API. It merges this fresh data with historical sales records stored in Google Sheets, then calculates key metrics like week-over-week growth, moving averages, and MRR estimates. Google Gemini AI analyzes the compiled data to identify trends, predict next week's performance, and flag unusual revenue patterns. When significant changes are detected (20%+ variance), the system triggers targeted alerts through a separate Slack channel. All insights get logged to a Google Sheets history for tracking, while a comprehensive dashboard page updates automatically in Notion. The weekly summary posts to your main sales Slack channel, and executives receive detailed email reports with strategic recommendations.

**How to set up**

1. Configure Stripe API credentials with read access to charges, subscriptions, and refunds.
2. Set up Google Sheets OAuth for both reading historical data and writing analysis logs.
3. Create a Notion integration with page update permissions for your sales dashboard.
4. Add Slack OAuth credentials for posting to your chosen sales and alerts channels.
5. Configure Gmail SMTP for executive email delivery.
6. Update the Configuration Settings node with your specific IDs, channels, and email addresses.

**Requirements**

- Stripe account with API access
- Google Sheets with historical sales data
- Notion workspace for the dashboard
- Slack workspace with posting permissions
- Gmail account for executive reports
- Google Gemini API access

**How to customize**

Adjust the anomaly detection threshold in the Calculate Trends code node (currently a 50% variance triggers alerts). Modify the Slack message templates, email formatting, or add additional metrics to the Notion dashboard. Change the schedule trigger from weekly to daily or monthly based on your reporting needs.
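The week-over-week variance check that drives the Slack alerts can be sketched like this. It is a simplified stand-in for the template's Calculate Trends Code node, with the threshold passed in as a fraction so you can match whatever value the node is configured with.

```javascript
// Simplified week-over-week variance check. The threshold is a fraction
// (0.5 = 50%); tune it to match the setting in Calculate Trends.
function weekOverWeek(currentRevenue, previousRevenue, threshold = 0.5) {
  const change = previousRevenue === 0
    ? 0
    : (currentRevenue - previousRevenue) / previousRevenue;
  return {
    changePct: Math.round(change * 1000) / 10, // one decimal, e.g. 23.5
    triggerAlert: Math.abs(change) >= threshold,
  };
}
```

A drop from 1,000 to 400 (a 60% decline) would set `triggerAlert` and route the run down the alerts branch to the separate Slack channel.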
by Hemanth Arety
Automatically fetch, curate, and distribute Reddit content digests using AI-powered filtering. This workflow monitors multiple subreddits, ranks posts by relevance, removes spam and duplicates, then delivers beautifully formatted digests to Telegram, Discord, or Slack.

**Who's it for**

Perfect for content creators tracking trends, marketers monitoring discussions, researchers following specific topics, and community managers staying informed. Anyone who wants high-quality Reddit updates without manually browsing multiple subreddits.

**How it works**

The workflow fetches top posts from your chosen subreddits using Reddit's JSON API (no authentication required). Posts are cleaned, deduplicated, and filtered by upvote threshold and custom keywords. An AI model (Google Gemini, OpenAI, or Claude) then ranks the remaining posts by relevance, filters out low-quality content, and generates a formatted digest. The final output is delivered to your preferred messaging platform on a schedule or on demand.

**Setup requirements**

- n8n version 1.0+
- AI provider API key (Google Gemini recommended; it has a free tier)
- At least one messaging platform configured:
  - Telegram bot token + chat ID
  - Discord webhook URL
  - Slack OAuth token + channel access

**How to set up**

1. Open the Configuration node and edit the subreddit list, post counts, and keywords
2. Configure the Schedule Trigger or use manual execution
3. Add your AI provider credentials in the AI Content Curator node
4. Enable and configure your preferred delivery platform (Telegram/Discord/Slack)
5. Test with manual execution, then activate the workflow

**Customization options**

- **Subreddits**: Add unlimited subreddits to monitor (comma-separated)
- **Time filters**: Choose from hour, day, week, month, year, or all-time top posts
- **Keywords**: Set focus keywords to prioritize and exclude keywords to filter out
- **Post count**: Adjust how many posts to fetch vs. how many appear in the final digest
- **AI prompt**: Customize ranking criteria and output format in the AI node
- **Schedule**: Use cron expressions for hourly, daily, or weekly digests
- **Output format**: Modify the formatting code to match your brand style

Add email notifications, database storage, or RSS feed generation by extending the workflow with additional nodes.
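The clean/dedupe/filter step described above boils down to a few lines of Code-node JavaScript. A minimal sketch, assuming each post carries the `title`, `url`, and `ups` fields that Reddit's JSON API returns:

```javascript
// Deduplicate by URL, enforce an upvote threshold, and drop posts that
// match any excluded keyword. Post shape mirrors Reddit's JSON API.
function filterPosts(posts, { minUpvotes = 50, excludeKeywords = [] } = {}) {
  const seen = new Set();
  return posts.filter(post => {
    if (seen.has(post.url)) return false; // duplicate (crossposts, reposts)
    seen.add(post.url);
    if (post.ups < minUpvotes) return false; // below upvote threshold
    const title = post.title.toLowerCase();
    return !excludeKeywords.some(k => title.includes(k.toLowerCase()));
  });
}
```

The surviving posts are then handed to the AI Content Curator node for relevance ranking.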
by Felix Kemeth
**Overview**

Staying up to date with fast-moving topics like AI, machine learning, or your specific industry can be overwhelming. You either drown in daily noise or miss important developments between weekly digests. This AI News Agent workflow delivers a curated newsletter only when there's genuinely relevant news. I use it myself for AI and n8n topics.

Key features:

- **AI-driven send decision**: An AI agent evaluates whether today's news is worth sending.
- **Deduplication**: Compares candidate articles against past newsletters to avoid repetition.
- **Real-time news**: Uses SerpAPI's DuckDuckGo News engine for fresh results.
- **Frequency guardrails**: Configure minimum and maximum days between newsletters.

In this post, I'll walk you through the complete workflow, explain each component, and show you how to set it up yourself.

**What this workflow does**

At a high level, the AI News Agent:

1. Fetches fresh news twice daily via SerpAPI's DuckDuckGo News engine.
2. Stores articles in a persistent data table with automatic deduplication.
3. Filters for freshness: only considers articles newer than your last newsletter.
4. Applies frequency guardrails: respects your min/max sending preferences.
5. Makes an editorial decision: AI evaluates if the news is worth sending.
6. Enriches selected articles: uses Tavily web search for fact-checking and depth.
7. Delivers via Telegram: sends a clean, formatted newsletter.
8. Remembers what it sent: stores each edition to prevent future repetition.

This lets you get newsletters only when there's genuinely relevant news, in contrast to a fixed schedule.

**Requirements**

To run this workflow, you need:

- **SerpAPI key**: Create an account at serpapi.com and generate an API key. They offer 250 free searches/month.
- **Tavily API key**: Sign up at app.tavily.com and create an API key. Generous free tier available.
- **OpenAI API key**: Get one from OpenAI; required for the AI agent calls.
- **Telegram bot + chat ID**: A free Telegram bot (via BotFather) and the chat/channel ID where you want the newsletter. See Telegram's bot tutorial for setup.

**How it works**

The workflow is organized into five logical stages.

**Stage 1: Schedule & Configuration**

- **Schedule Trigger**: Runs the workflow on a cron schedule. Default: `0 0 9,17 * * *` (twice daily at 9:00 and 17:00). These frequent checks let the AI send a newsletter at those times when it observes genuinely relevant news, not only once a week. I picked 09:00 and 17:00 as natural check-in points at the start and end of a typical workday, so you see updates when you're most likely to read them without being interrupted in the middle of deep work. With SerpAPI's 250 free searches/month, running twice per day with a small set of topics (e.g. 2-3) keeps you comfortably below the limit; if you add more topics or increase the schedule frequency, either tighten the cron window or move to a paid SerpAPI plan to avoid hitting the cap.
- **Set topics and language**: A Set node that defines your configuration:
  - `topics`: comma-separated list (e.g., AI, n8n)
  - `language`: output language (e.g., English)
  - `minDaysBetween`: minimum days to wait (0 = no minimum)
  - `maxDaysBetween`: maximum days without sending (triggers a "must-send" fallback)

**Stage 2: Fetch & Store News**

- **Build topic queries**: Splits your comma-separated topics into individual search queries. In DuckDuckGo News via SerpAPI, a query like `AI,n8n` looks for news where both "AI" and "n8n" appear. For a niche tool like n8n, this is often almost identical to just searching for n8n (docs). It's therefore better to split the topics, search for each of them separately, and let the AI later decide which news articles to select.
  ```javascript
  return $input.first().json.topics.split(',').map(topic => ({
    json: { topic: topic.trim() }
  }));
  ```

- **Fetch news from SerpAPI (DuckDuckGo News)**: HTTP Request node calling SerpAPI with:
  - `engine`: duckduckgo_news
  - `q`: your topic
  - `df`: d (last day)

  Auth is handled via httpQueryAuth credentials with your SerpAPI key. SerpAPI also offers other news engines such as the Google News API (see here). DuckDuckGo News is used here because, unlike Google News, it returns an excerpt/snippet in addition to the title, source, and URL (see here), giving the AI more context to work with. _Another option is NewsAPI, but its free tier delays articles by 24 hours, so you miss the freshness window that makes these twice-daily checks valuable. DuckDuckGo News through SerpAPI keeps the workflow real-time without that lag._ n8n has official SerpAPI nodes, but as of writing there is no dedicated node for the DuckDuckGo News API. That's why this workflow uses a custom HTTP Request node instead, which works the same under the hood while giving you full control over the DuckDuckGo News parameters.
- **Split SerpAPI results into articles**: Expands the results array so each article becomes its own item.
- **Upsert articles into News table**: Stores each article in an n8n data table with fields: title, source, url, excerpt, date. Uses upsert on title + URL to avoid duplicates. Date is normalized to ISO UTC:

  ```javascript
  DateTime.fromSeconds(Number($json.date), { zone: 'utc' }).toISO()
  ```

**Stage 3: Filtering & Frequency Guardrails**

This is where the workflow gets smart about what to consider and when to send.

- **Get previous newsletters → Sort → Get most recent**: Pulls all editions from the Newsletters table and isolates the latest one with its createdAt timestamp.
- **Combine articles with last newsletter metadata**: Attaches the last newsletter timestamp to each candidate article.
- **Filter articles newer than last newsletter**: Keeps only articles published after the last edition.
  Uses a safe default date (2024-01-01) if no previous newsletter exists:

  ```javascript
  $json.date_2 > ($json.createdAt_1 || DateTime.fromISO('2024-01-01T00:00:00.000Z'))
  ```

- **Stop if last newsletter is too recent**: Compares createdAt against your minDaysBetween setting. If you're still in the "too soon to send" window, the workflow short-circuits here.

**Stage 4: AI Editorial Decision**

This is the core intelligence of the workflow: an AI that decides whether to send and what to include. This stage is also the actual agentic part of the workflow, where the system makes its own decisions instead of just following a fixed schedule.

- **Aggregate candidate articles for AI**: Bundles today's filtered articles into a compact list with title, excerpt, source, and url.
- **Limit previous newsletters to last 5 → Aggregate**: Prepares the last 5 newsletter contents for the AI to check against for repetition.
- **Combine candidate articles with past newsletters**: Merges both lists so the AI sees "today's candidates" + "recent history" side by side.
- **AI: decide send + select articles**: The heart of the workflow. A GPT-5.1 call with a comprehensive editorial prompt:

```
You are an AI Newsletter Editor. Your job is to decide whether today's newsletter edition should be sent, and to select the best articles.

You will receive a list of articles with: 'title', 'excerpt', source, url.
You will also receive content of previously sent newsletters (markdown).

Your Tasks

1. Decide whether to send the newsletter

Output "YES" only if all of the following are satisfied OR the fallback rule applies:

Base Criteria
- There are at least 3 meaningful articles. Meaningful = not trivial, not purely promotional, not clickbait, contains actual informational value.
- Articles must be non-duplicate and non-overlapping:
  - Not the same topic/headline rephrased
  - Not reporting identical events with minor variations
  - Not the same news covered by multiple sources without distinct insights
- Articles must be relevant to the user's topics: {{ $('Set topics and language').item.json.topics }}
- Articles must be novel relative to the topics in previous newsletters:
  - Compare against all previous newsletters below
  - Exclude articles that discuss topics already substantially covered
- Articles must offer clear value:
  - New information
  - Impact that matters to the user
  - Insight, analysis, or meaningful expansion

Fallback rule: Newsletter frequency requirement
If at least 1 relevant article exists and the last newsletter was sent more than {{ $('Set topics and language').item.json.maxDaysBetween }} days ago, then you MUST return "YES" as a decision even if the other criteria are not completely met.
Last newsletter was sent {{ $('Get most recent newsletter').item.json.createdAt ? Math.floor($now.diff(DateTime.fromISO($('Get most recent newsletter').item.json.createdAt), 'days').days) : 999 }} days ago.

Otherwise → "NO"

2. If "YES": Select Articles

Select the top 3-5 articles that best fulfill the criteria above. For each selected article, output:
- title (rewrite for clarity, conciseness, and impact)
- summary (1-2 sentences; written in the output language)
- source
- url

All summaries must be written in: {{ $('Set topics and language').item.json.language }}

Output Format (JSON)
{
  "decision": "YES or NO",
  "articles": [
    { "title": "...", "summary": "...", "source": "...", "url": "..." }
  ]
}

When "decision": "NO", return an empty array for "articles".

Article Input
Use these articles:
{{ $json.results.map(article => `Title: ${article.title_2}\nExcerpt: ${article.excerpt_2}\nSource: ${article.source_2}\nURL: ${article.url_2}`).join('\n---\n') }}

You must also consider the topics already covered in previous newsletters to avoid repetition:
{{ $json.newsletters.map(x => `Newsletter: ${x.content}`).join('\n---\n') }}
```

The AI outputs structured JSON:

```json
{
  "decision": "YES",
  "articles": [
    { "title": "...", "summary": "...", "source": "...", "url": "..." }
  ]
}
```

- **If AI decided to send newsletter**: Routes based on decision === "YES". If NO, the workflow ends gracefully.

**Stage 5: Content Enrichment & Delivery**

- **Split selected articles for enrichment**: Each selected article becomes its own item for individual processing.
- **AI: enrich & write article**: An AI Agent node with GPT-5.1 + the Tavily web search tool. For each article:

```
You are a research writer that updates short news summaries into concise, factual articles.

Input:
Title: {{ $json["title"] }}
Summary: {{ $json["summary"] }}
Source: {{ $json["source"] }}
Original URL: {{ $json["url"] }}
Language: {{ $('Set topics and language').item.json.language }}

Instructions:
- Use Tavily Search to gather 2-3 reliable, recent, and relevant sources on this topic.
- Update the title if a more accurate or engaging one exists.
- Write 1-2 sentences summarizing the topic, combining the original summary and information from the new sources.
- Return the original source name and url as well.

Output (JSON):
{
  "title": "final article title",
  "content": "concise 1-2 sentence article content",
  "source": "the name of the original source",
  "url": "the url of the original source"
}

Rules:
- Ensure the topic is relevant, informative, and timely.
- Translate the article if necessary to comply with the desired language {{ $('Set topics and language').item.json.language }}.
```

The Output Parser enforces the JSON schema with title, content, source, and url fields.
- **Aggregate enriched articles**: Collects all enriched articles back into a single array.
- **Insert newsletter content into Newsletters table**: Stores the final markdown content for future deduplication:

  ```javascript
  $json.output.map(article => {
    const title = JSON.stringify(article.title).slice(1, -1);
    const content = JSON.stringify(article.content).slice(1, -1);
    const source = JSON.stringify(article.source).slice(1, -1);
    const url = JSON.stringify(article.url).slice(1, -1);
    return `${title}\n${content}\nSource: ${source}`;
  }).join('\n\n')
  ```

- **Send newsletter to Telegram**: Sends the formatted newsletter to your Telegram chat/channel.

**Why this workflow is powerful**

- **Intelligent send decisions**: The AI evaluates news quality before sending, leading to a less noisy and more relevant news digest.
- **Memory across editions**: By persisting newsletters and comparing against history, the workflow avoids repetition.
- **Frequency guardrails with flexibility**: Set boundaries (e.g., "at least 1 day between sends" and "must send within 5 days"), but let the AI decide the optimal moment within those bounds.
- **Source-level deduplication**: The news table with upsert prevents the same article from being considered multiple times across runs.
- **Grounded in facts**: SerpAPI provides real news sources; Tavily enriches with additional verification. The newsletter stays factual.
- **Configurable and extensible**: Change topics, language, and frequency all in one Set node. The workflow is also modular, allowing you to add new news sources or delivery channels without touching the core logic.
**Configuration guide**

To customize this workflow for your needs:

- **Topics and language**: Open Set topics and language and modify:
  - `topics`: your interests (e.g., machine learning, startups, TypeScript)
  - `language`: your preferred output language
- **Frequency settings**:
  - `minDaysBetween`: minimum days between newsletters (0 = no limit)
  - `maxDaysBetween`: maximum gap before forcing a send

  For very high-volume topics (such as "AI"), expect the workflow to send almost every time once minDaysBetween has passed, because the content-quality criteria are usually met.
- **Schedule**: Modify the Schedule Trigger cron expression. The default runs twice daily at 9:00 am and 5:00 pm; adjust to your preference.
- **Telegram**: Update the chatId in the Telegram node to your chat/channel.
- **Credentials**: Set up credentials for SerpAPI (httpQueryAuth), Tavily, OpenAI, and Telegram.

**Next steps and improvements**

Here are concrete directions to take this workflow further:

- **Multi-agent architecture**: Split the current AI calls into specialized agents (signal detection, relevance scoring, editorial decision, content enhancement, and formatting), each with a single responsibility.
- **1:1 personalization**: Move from static topics to weighted preferences. Learn from click behavior and feedback.
- **Telegram feedback buttons**: Add inline buttons (👍 Useful / 👎 Not relevant / 🔎 More like this) and feed signals back into ranking.
- **Email with HTML template**: For more flexibility, send the newsletter via email.
- **Other news APIs or RSS feeds**: Add more sources such as other news APIs and RSS feeds from blogs, newsletters, or communities.
- **arXiv paper search and research news**: Swap SerpAPI for arXiv search or other academic sources to get a personal research digest newsletter.
- **Images and thumbnails**: Fetch representative images for each article and include them in the newsletter.
- **Web archive**: Auto-publish each edition as a web page with permalinks.
- **Retry logic and error handling**: Add exponential backoff for external APIs and route failures to an error workflow.
- **Prompt versioning**: Move prompts to a data table with versioning for A/B testing and rollback.
- **Audio and video news**: Use audio or video models for richer news delivery.

**Wrap-up**

This AI News Agent workflow represents a significant evolution from simple scheduled newsletters. By adding intelligent send decisions, historical deduplication, and frequency guardrails, you get a newsletter that respects the quality of available news. I use this workflow myself to stay informed on AI and automation topics without the overload of daily news or the delayed delivery of a fixed newsletter schedule.

Need help with your automations? Contact me here.
by Takumi Oku
**Who is this for**

- Entrepreneurs looking for verified technology to license
- R&D teams tracking aerospace innovation
- Content creators covering tech trends

**How it works**

1. **Fetch**: Gets the latest patents from NASA's Tech Transfer API.
2. **Filter & Loop**: Removes empty entries and processes each patent individually.
3. **Analyze**: Translates the abstract (DeepL) and uses OpenAI to brainstorm practical business applications.
4. **Archive**: Saves the details to Google Sheets.
5. **Notify**: Compiles a summary and sends it to Slack.

**How to set up**

1. **Prepare a Google Sheet**: Create a new sheet with these exact headers in Row 1: Date, Title, Abstract_Translated, Business_Idea, Link
2. **Edit Settings**: Double-click the Edit Settings node to add your Google Sheet ID, Sheet Name, and Slack Channel ID.
3. **Credentials**: Configure credentials for OpenAI, DeepL, Google Sheets, and Slack.
4. **Activate**: Run a test execution, then switch the workflow to Active.

**Requirements**

- OpenAI: API key (gpt-4o or gpt-3.5-turbo)
- DeepL: API key (Free or Pro)
- Google Sheets: OAuth2 credentials with Drive/Sheets scopes
- Slack: Bot User OAuth Token with chat:write scope

**How to customize**

- **Change the prompt**: Edit the Generate Business Ideas node to tailor ideas for a specific niche (e.g., "Applications for medical devices").
- **Adjust the schedule**: Change the trigger in the Weekly Schedule node to run daily or monthly.
- **Different output**: Swap Slack for Microsoft Teams or Email nodes if preferred.
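The Filter step above (dropping patents with empty entries before translation) can be sketched as a one-liner. This assumes each API result has already been mapped to an object with `title` and `abstract` fields; the exact mapping depends on the shape of NASA's Tech Transfer API response.

```javascript
// Keep only patents that have both a title and a non-empty abstract,
// so DeepL and OpenAI never receive blank input.
function filterPatents(items) {
  return items.filter(
    p => p && typeof p.title === 'string' && p.title.trim() !== ''
      && typeof p.abstract === 'string' && p.abstract.trim() !== ''
  );
}
```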
by Milo Bravo
Event Alumni Re-engager: HubSpot → Gemini → Personalized Outreach

**Who is this for?**

Event marketers and conference organizers who want to reactivate past attendees with AI-personalized emails at 3-5x the ROI of cold leads.

**What problem is this workflow solving?**

Alumni gold is untapped:

- Past attendees convert 3-5x better
- Manual segmentation takes hours
- Generic emails get ignored

This workflow auto-segments and personalizes outreach from HubSpot data.

**What this workflow does**

CRM → AI alumni machine:

- **Trigger**: Manual (or schedule 8-12 weeks pre-event)
- **HubSpot Pull**: Past event_registration contacts
- **Smart Filter**: Exclude current registrants
- **3-Tier Segments**: Champions (3+ events) / Returning (2) / One-timers (1)
- **Gemini Personalization**: Unique copy per attendee
- **Email Send**: Alumni-exclusive CTAs
- **Slack Summary**: Campaign stats posted

Main workflow required: this is a sub-workflow triggered by Event Registration with Auto-Enrichment.

**Setup (7 minutes)**

- **HubSpot**: Header Auth (Bearer YOUR_API_KEY)
- **Custom property**: event_registration (comma-separated event IDs)
- **Gemini**: Flash Lite API key
- **Email**: SMTP + Slack OAuth2
- **Config**: Event details in Set Campaign Config

Fully configurable; no code changes needed.

**How to customize to your needs**

- **Segments**: Add VIPs / Speakers / High-LTV
- **CRM**: HubSpot → Salesforce → Sheets
- **Copy**: Edit Gemini prompts for tone/industry
- **Channels**: Add WhatsApp / LinkedIn
- **Timing**: Cron for automated runs

HubSpot setup: the event_registration property is auto-created by the companion template.

**ROI**

- **3-5x conversion** vs. cold leads
- **60% lower CAC** (alumni segment)
- **2h → 2min** campaign launch

Need help customizing? Contact me for consulting and support: LinkedIn / Message

Keywords: event participant re-engagement, conference registration, HubSpot automation, personalized outreach, conference marketing
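The 3-tier segmentation above reduces to counting the IDs in the event_registration property. A minimal sketch, assuming the comma-separated format described in the setup:

```javascript
// Assign a HubSpot contact to an alumni tier based on how many event IDs
// appear in its comma-separated event_registration property.
function segmentContact(contact) {
  const events = (contact.event_registration || '')
    .split(',')
    .map(id => id.trim())
    .filter(Boolean);
  if (events.length >= 3) return 'Champion';
  if (events.length === 2) return 'Returning';
  if (events.length === 1) return 'One-timer';
  return 'None'; // no history: exclude from the alumni campaign
}
```

Each tier can then receive its own Gemini prompt, so Champions get loyalty-flavored copy while One-timers get a "come back" angle.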
by Didac Fernandez
# AI-Powered Financial Document Processing with Google Gemini

This comprehensive workflow automates the complete financial document processing pipeline using AI. Upload invoices via chat, drop expense receipts into a folder, or add bank statements - the system automatically extracts, categorizes, and organizes all your financial data into structured Google Sheets.

## What this workflow does

Processes three types of financial documents automatically:

- **Invoice Processing**: Upload PDF invoices through a chat interface and get structured data extraction with automatic file organization
- **Expense Management**: Monitor a Google Drive folder for new receipts and automatically categorize expenses using AI
- **Bank Statement Processing**: Extract and organize transaction data from bank statements with multi-transaction support
- **Financial Analysis**: Query all your financial data using natural language with an AI agent

## Key Features

- **Multi-AI Persona System**: Four specialized AI personas (Mark, Donna, Victor, Andrew) handle different financial functions
- **Google Gemini Integration**: Advanced document understanding and data extraction from PDFs
- **Smart Expense Categorization**: Automatic classification into 17 business expense categories using an LLM
- **Real-time Monitoring**: Continuous folder watching for new documents with automatic processing
- **Natural Language Queries**: Ask questions about your financial data in plain English
- **Automatic File Management**: Intelligent file naming and organization in Google Drive
- **Comprehensive Error Handling**: Robust processing that continues even when individual documents fail

## How it works

### Invoice Processing Flow

1. User uploads a PDF invoice via the chat interface
2. File is saved to the Google Drive "Invoices" folder
3. Google Gemini extracts structured data (vendor, amounts, line items, dates)
4. Data is parsed and saved to the "Invoice Records" Google Sheet
5. File is renamed as "{Vendor Name} - {Invoice Number}"
6. Confirmation message is sent to the user

### Expense Processing Flow

1. User drops a receipt PDF into the "Expense Receipts" Google Drive folder
2. System detects the new file within 1 minute
3. Google Gemini extracts expense data (merchant, amount, payment method)
4. OpenRouter LLM categorizes the expense into the appropriate business category
5. All data is saved to the "Expenses Recording" Google Sheet

### Bank Statement Processing Flow

1. User uploads a bank statement to the "Bank Statements" folder
2. Google Gemini extracts multiple transactions from the statement
3. Custom JavaScript parser handles various bank formats
4. Individual transactions are saved to the "Bank Transactions Record" Google Sheet

### Financial Analysis

1. Enable the analysis trigger when needed
2. Ask questions in natural language about your financial data
3. AI agent accesses all three spreadsheets to provide insights
4. Get reports, summaries, and trend analysis

## What you need to set up

### Required APIs and Credentials

- **Google Drive API** - for file storage and monitoring
- **Google Sheets API** - for data storage and retrieval
- **Google Gemini API** - for document processing and data extraction
- **OpenRouter API** - for expense categorization (supports multiple LLM providers)

### Google Drive Folder Structure

Create these folders in your Google Drive:

- "Invoices" - processed invoice storage
- "Expense Receipts" - drop zone for expense receipts (monitored)
- "Bank Statements" - drop zone for bank statements (monitored)

### Google Sheets Setup

Create three spreadsheets with these column headers:

- **Invoice Records Sheet**: Vendor Name, Invoice Number, Invoice Date, Due Date, Total Amount, VAT Amount, Line Item Description, Quantity, Unit Price, Total Price
- **Expenses Recording Sheet**: Merchant Name, Transaction Date, Total Amount, Tax Amount, Payment Method, Line Item Description, Quantity, Unit Price, Total Price, Category
- **Bank Transactions Record Sheet**: Transaction ID, Date, Description/Payee, Debit (-), Credit (+), Currency, Running Balance, Notes/Category

## Use Cases

- **Small Business Accounting**: Automate invoice and expense tracking for bookkeeping
- **Freelancer Financial Management**: Organize client invoices and business expenses
- **Corporate Expense Management**: Streamline employee expense report processing
- **Financial Data Analysis**: Generate insights from historical financial data
- **Bank Reconciliation**: Automate transaction recording and account reconciliation
- **Tax Preparation**: Maintain organized records with proper categorization

## Technical Highlights

- **Expense Categories**: 17 predefined business expense categories (Cost of Goods Sold, Marketing, Payroll, etc.)
- **Multi-format Support**: Handles various PDF layouts and bank statement formats
- **Scalable Processing**: Processes multiple documents simultaneously
- **Error Recovery**: Continues processing even when individual documents fail
- **Natural Language Interface**: No technical knowledge required for financial queries
- **Real-time Processing**: Documents processed within minutes of upload

## Benefits

- **Time Savings**: Eliminates manual data entry from financial documents
- **Accuracy**: AI-powered extraction reduces human error
- **Organization**: Automatic file naming and categorization
- **Insights**: Query financial data using natural language
- **Compliance**: Maintains organized records for accounting and audit purposes
- **Scalability**: Handles growing document volumes without additional overhead

This workflow transforms tedious financial document processing into an automated, intelligent system that grows with your business needs.
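The "custom JavaScript parser" step of the bank statement flow can be sketched as a plain function, written so it could be dropped into an n8n Code node. The input field names (`date`, `description`, `amount`) and the running-balance logic are assumptions for illustration - the template's actual Gemini output schema may differ.

```javascript
// Sketch of a bank-statement normalizer: coerce transactions extracted in
// varying bank formats into rows matching the "Bank Transactions Record"
// sheet columns. Input shape is a hypothetical example, not the exact schema.
function normalizeTransactions(extracted, currency = "EUR") {
  let balance = 0;
  return extracted.map((tx, i) => {
    // Some banks report signed amounts as strings, others as numbers;
    // strip currency symbols and coerce to one signed number.
    const amount = typeof tx.amount === "number"
      ? tx.amount
      : parseFloat(String(tx.amount).replace(/[^0-9.-]/g, ""));
    balance += amount;
    return {
      "Transaction ID": tx.id || `TX-${i + 1}`,
      "Date": tx.date,
      "Description/Payee": (tx.description || "").trim(),
      "Debit (-)": amount < 0 ? Math.abs(amount) : "",
      "Credit (+)": amount > 0 ? amount : "",
      "Currency": tx.currency || currency,
      "Running Balance": Math.round(balance * 100) / 100,
      "Notes/Category": tx.category || "",
    };
  });
}

// Example: two transactions in mixed formats
const rows = normalizeTransactions([
  { date: "2024-04-01", description: "ACME Corp ", amount: "-120.50" },
  { date: "2024-04-02", description: "Salary", amount: 2500 },
]);
console.log(rows[1]["Running Balance"]); // 2379.5
```

In a real Code node the input would come from `$input.all()` and each row would be returned as `{ json: row }`; the function body stays the same.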
by Lucía Maio Brioso
## 🧑💼 Who is this for?

This workflow is for any YouTube user who wants to bulk delete all playlists from their own channel - whether to start fresh, clean up old content, or prepare the account for a new purpose.

It's useful for:

- Creators reorganizing their channel
- People transferring content to another account
- Anyone who wants to avoid deleting playlists manually, one by one

## 🧠 What problem is this workflow solving?

YouTube does not offer a built-in way to delete multiple playlists at once. If you have dozens or hundreds of playlists, removing them manually is extremely time-consuming. This workflow automates the entire deletion process in seconds, saving you hours of repetitive effort.

## ⚙️ What this workflow does

1. Connects to your YouTube account
2. Fetches all playlists you've created (excluding system playlists)
3. **Deletes them one by one** automatically

> ⚠️ This action is irreversible. Once a playlist is deleted, it cannot be recovered. Use with caution.

## 🛠️ Setup

1. 🔐 Create a YouTube OAuth2 credential in n8n for your channel.
2. 🧭 Assign the credential to both YouTube nodes.
3. ✅ Click "Test workflow" to execute.

> 🟨 By default, this workflow deletes everything. If you want to be more selective, see the customization tips below.

## 🧩 How to customize this workflow to your needs

- ✅ **Add a confirmation flag**: Insert a Set node with a custom field like `confirm_delete = true`, and follow it with an IF node to prevent accidental execution.
- ✂️ **Delete only some playlists**: Add a Filter node after fetching playlists - you can match by title, ID, or keyword (e.g. only delete playlists containing "old").
- 🛑 **Add a pause before deletion**: Insert a Wait or NoOp node to give you a moment to cancel before it runs.
- 🔁 **Adapt to scheduled cleanups**: Use a Cron trigger if you want to periodically clear temporary playlists.
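The "delete only some playlists" customization can be sketched as a small filter, shown here as a plain function (in n8n you would use a Filter node or a Code node). The playlist shape `{ id, title }` mirrors what the YouTube node returns, but treat it as an assumption and check your node's actual output.

```javascript
// Sketch: keep only playlists whose title contains a keyword
// (case-insensitive). Everything returned here would flow on to the
// YouTube delete node; everything else is spared.
function selectPlaylistsToDelete(playlists, keyword) {
  const needle = keyword.toLowerCase();
  return playlists.filter((p) => p.title.toLowerCase().includes(needle));
}

const toDelete = selectPlaylistsToDelete(
  [
    { id: "PL1", title: "Old mixes 2019" },
    { id: "PL2", title: "Keep: favorites" },
    { id: "PL3", title: "old drafts" },
  ],
  "old"
);
console.log(toDelete.map((p) => p.id)); // [ 'PL1', 'PL3' ]
```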
by vinci-king-01
# CRM Contact Sync with Mailchimp and Pipedrive

This workflow keeps your contact records perfectly aligned between your CRM (e.g. HubSpot / Salesforce / Pipedrive) and Mailchimp. Whenever a contact is created or updated in one system, the automation propagates the change to the other platform, ensuring every email address, phone number, and custom field stays in sync.

## Pre-conditions / Requirements

### Prerequisites

- n8n instance (self-hosted or cloud)
- Community nodes: Pipedrive, Mailchimp
- A dedicated service account in each platform with permission to read & write contacts
- Basic understanding of how webhooks work (for CRM → n8n triggers)

### Required Credentials

- **Pipedrive API Token** - used for creating, updating, and searching contacts in Pipedrive
- **Mailchimp API Key** - grants access to lists/audiences and contact operations
- **CRM Webhook Secret** (optional) - if your CRM supports signing webhook payloads

### Specific Setup Requirements

| Environment Variable | Description | Example |
|----------------------|--------------------------------------|---------------------------------|
| `PIPEDRIVE_API_KEY` | Stored in n8n credential manager | `123abc456def` |
| `MAILCHIMP_API_KEY` | Stored in n8n credential manager | `us-1:abcd1234efgh5678` |
| `MAILCHIMP_DC` | Mailchimp datacenter (sub-domain) | `us-1` |
| `CRM_WEBHOOK_URL` | Generated by the Webhook node | `https://n8n.myserver/webhook/...` |

## How it works

Key steps:

1. **Inbound Webhook**: Receives contact-change events from the CRM.
2. **Split in Batches**: Processes contacts in chunks to stay within API rate limits.
3. **Mailchimp Upsert**: Adds or updates each contact in the specified Mailchimp audience.
4. **Pipedrive Upsert**: Mirrors the same change in Pipedrive (or vice versa).
5. **Merge & IF nodes**: Decide whether to create or update a contact by checking existence.
6. **Error Trigger**: Captures any API failures and posts them to the configured alert channel.

## Set up steps

Setup time: 15-25 minutes

1. **Create credentials** - In n8n, add new credentials for Pipedrive and Mailchimp using your API keys. Name them clearly (e.g. "Pipedrive Main", "Mailchimp Main").
2. **Import the workflow** - Download or paste the JSON template into n8n, then save the workflow.
3. **Configure the Webhook node** - Set the HTTP Method to POST. Copy the generated URL and register it as a webhook for your CRM's contact-update events.
4. **Map CRM fields** - Open the first Set node and match CRM field names (firstName, lastName, email, etc.) to the standard keys used later in the flow.
5. **Select the Mailchimp audience** - In the Mailchimp node, choose the audience/list that should receive the contacts.
6. **Define Pipedrive person fields** - If you have custom fields, add them in the Pipedrive node's Additional Fields section.
7. **Enable the workflow** - Turn the workflow from "Inactive" to "Active", then send a test update from the CRM to verify that contacts appear in both Mailchimp and Pipedrive.

## Node Descriptions

Core workflow nodes:

- **Webhook** - accepts contact-change payloads from the CRM.
- **Set** - normalises field names to a common schema.
- **SplitInBatches** - loops through contacts in controllable group sizes.
- **HTTP Request** - generic calls (e.g. HubSpot/Salesforce look-ups when required).
- **Pipedrive** - searches for existing persons; creates or updates accordingly.
- **Mailchimp** - performs the contact upsert into an audience.
- **If** - branches logic on "contact exists?".
- **Merge** - re-assembles branch data back into a single execution line.
- **Code** - small JS snippets for complex field transformations.
- **Error Trigger** - listens for any node failure and routes it to alerts.
- **StickyNote** - documentation hints inside the workflow.

Data flow:

1. Webhook → Set (Normalise) → SplitInBatches
2. SplitInBatches → Mailchimp (Get) → If (Exists?) → Mailchimp (Upsert)
3. SplitInBatches → Pipedrive (Search) → If (Exists?) → Pipedrive (Upsert)
4. Merge → End / Success

## Customization Examples

Add a tag to Mailchimp contacts:

```javascript
// Place inside a Code node before the Mailchimp Upsert
item.tags = ['Synced from CRM', 'High-Value'];
return item;
```

Apply a deal stage in Pipedrive (Pipedrive node → Additional Fields):

```json
"deal": {
  "title": "New Lead from Mailchimp",
  "stage_id": 2
}
```

## Data Output Format

The workflow outputs structured JSON data:

```json
{
  "id": 1472,
  "status": "updated",
  "email": "alex@example.com",
  "source": "CRM",
  "synced": {
    "pipedrive": "success",
    "mailchimp": "success"
  },
  "timestamp": "2024-04-27T10:15:00Z"
}
```

## Troubleshooting

Common issues:

- **HTTP 401 Unauthorized** - verify that the API keys are still valid and have not been revoked.
- **Webhook receives no data** - double-check that the CRM webhook URL matches exactly and that the event is enabled.

Performance tips:

- Batch contacts in groups of 50-100 to respect Mailchimp & Pipedrive rate limits.
- Use Continue On Fail in non-critical nodes to prevent the entire run from stopping.

Pro tips:

- Map your CRM's custom fields once in the Set node to avoid touching each downstream node.
- Use the Merge + If pattern to keep "create vs update" logic tidy.
- Enable workflow execution logs only in development to reduce storage usage.

*Community Template Disclaimer: This workflow is provided by the n8n community "as is". n8n GmbH makes no warranties regarding its performance, security or compliance. Always review and test in a development environment before using it in production.*
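The field-normalisation step (the Set node, or a Code node for trickier mappings) can be sketched as below. The incoming field names (`first_name`, `primary_email`, etc.) are assumptions about one possible CRM payload - map them to whatever your CRM actually sends.

```javascript
// Hypothetical sketch of normalising a CRM webhook payload onto the common
// schema (firstName, lastName, email, phone) used by the downstream
// Mailchimp and Pipedrive upsert nodes.
function normalizeContact(payload) {
  return {
    firstName: payload.first_name || payload.firstName || "",
    lastName: payload.last_name || payload.lastName || "",
    // Lower-case and trim emails so existence checks match reliably
    email: (payload.primary_email || payload.email || "").toLowerCase().trim(),
    phone: payload.phone || payload.mobile || "",
  };
}

const contact = normalizeContact({
  first_name: "Alex",
  last_name: "Doe",
  primary_email: " Alex@Example.com ",
});
console.log(contact.email); // "alex@example.com"
```

Normalising the email before the "contact exists?" check matters because Mailchimp keys subscribers on the email address; inconsistent casing would cause duplicate creates instead of updates.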
by Kumar SmartFlow Craft
## 🚀 How it works

Fully automates your service order pipeline from incoming booking to supplier confirmation - with built-in SLA enforcement and automatic escalation if a supplier goes silent.

- 📥 Receives orders via webhook from your booking form or website
- 💳 Verifies payment against Stripe before processing anything
- 🤖 Extracts and structures order details (service type, address, date, priority) using Claude AI
- 👤 Upserts the customer contact and creates a deal in Freshworks CRM automatically
- 📧 Sends branded confirmation emails to the customer and assigned supplier via Postmark
- ⏱️ Enforces a 4-hour supplier acceptance SLA - escalates automatically if no response
- 🔁 Reassigns to a backup supplier and retries for 2 hours before flagging for manual review
- 🚨 Alerts your team on Slack if manual intervention is required
- 📊 Logs every outcome (confirmed, reassigned, escalated) to Google Sheets for a full audit trail

## 🛠️ Set up steps

Estimated setup time: ~30 minutes

1. **Webhook** - copy the webhook URL and point it from your booking form or website checkout
2. **Stripe** - add your Stripe secret key to the HTTP Request node; set the correct `payment_intent` field name from your payload
3. **Claude (Anthropic)** - connect your Anthropic API credential; claude-sonnet-4-6 or higher recommended
4. **Freshworks CRM** - connect your Freshworks credential; set your domain in the HTTP Request upsert node (e.g. yourcompany.freshsales.io)
5. **Postmark** - add your Postmark Server Token to the HTTP Request nodes; update the sender email address
6. **Slack** - connect Slack OAuth2; set your ops/dispatch channel in the alert nodes (e.g. #dispatch-alerts)
7. **Google Sheets** - connect Google Sheets OAuth2; set your spreadsheet ID and sheet name in the log nodes
8. Follow the sticky notes inside the workflow - each section has a one-liner setup guide

## 📋 Prerequisites

- Stripe account with payment intents enabled
- Anthropic API key (Claude API access)
- Freshworks CRM account (Growth plan or higher for API access)
- Postmark account with a verified sender domain
- Slack workspace with a bot or OAuth2 app
- Google Sheets spreadsheet set up as your audit log

---

Custom Workflow Request with Personal Dashboard: kumar@smartflowcraft.com · https://www.smartflowcraft.com/contact

More free templates: https://www.smartflowcraft.com/n8n-templates
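The 4-hour SLA check at the heart of the escalation branch can be sketched as a plain function (in the workflow this would be a Wait node followed by an IF/Code step). The field names (`sentAt`, `acceptedAt`) and the status labels are assumptions for illustration, not the template's exact schema.

```javascript
// Sketch of the supplier-acceptance SLA decision: if the supplier has not
// accepted within SLA_HOURS of the confirmation email being sent, the order
// is escalated (reassigned to a backup supplier; a second breach after the
// 2-hour retry window would be flagged for manual review instead).
const SLA_HOURS = 4;

function slaStatus(order, now = new Date()) {
  if (order.acceptedAt) return "confirmed";
  const hoursWaiting = (now - new Date(order.sentAt)) / 3_600_000;
  return hoursWaiting >= SLA_HOURS ? "escalate" : "waiting";
}

const now = new Date("2024-05-01T14:30:00Z");
console.log(slaStatus({ sentAt: "2024-05-01T10:00:00Z" }, now)); // "escalate"
console.log(slaStatus({ sentAt: "2024-05-01T12:00:00Z" }, now)); // "waiting"
```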
by Kumar SmartFlow Craft
## 🚀 How it works

Fully automates your inbound and outbound voice sales pipeline - from live call qualification to CRM pipeline management - with multi-agent AI and automatic lead nurturing if a prospect doesn't book.

- 📞 Receives end-of-call reports from Vapi or Retell AI via webhook - works with both providers out of the box
- 🧠 Qualifies every inbound lead using BANT scoring (Budget · Authority · Need · Timeline) powered by Claude Haiku
- 📅 Detects appointment intent and preferred meeting time using GPT-4o before touching your CRM
- 🗂️ Upserts the contact and creates a pipeline opportunity in GoHighLevel automatically - no duplicates
- 💬 Analyses objections and generates a rebuttal script using Claude Sonnet (feel-felt-found + Challenger Sale)
- 📝 Writes a professional CRM note from the call summary using Gemini 2.0 Flash - ready to sync
- 🔁 Enrols unqualified leads into a GoHighLevel nurture workflow automatically for long-term follow-up
- 📤 Fires prioritised outbound calls every morning at 9 AM via Vapi - GPT-4o Mini ranks leads by conversion probability
- 📊 Logs every call (inbound + outbound) to Supabase and Google Sheets for full pipeline reporting

## 🛠️ Set up steps

Estimated setup time: ~45 minutes

1. **Webhook** - copy the webhook URL and paste it into your Vapi or Retell dashboard as the end-of-call report URL
2. **GoHighLevel** - connect your HighLevel OAuth2 credential; set your Pipeline ID, Hot Stage ID, and Nurturing Stage ID in the opportunity nodes (Opportunities → Settings → Pipelines)
3. **Anthropic** - connect your Anthropic API credential; used for Claude Haiku (BANT qualification) and Claude Sonnet (objection handling)
4. **OpenAI** - connect your OpenAI API credential; used for GPT-4o (booking intent detection) and GPT-4o Mini (outbound lead ranking)
5. **Google Gemini** - connect your Google Gemini API credential; used for CRM note writing with gemini-2.0-flash
6. **Vapi** - add your Vapi API key to the HTTP Request node header; set your Phone Number ID and Assistant ID in the outbound call node (Vapi Dashboard → Phone Numbers / Assistants)
7. **Supabase** - connect your Supabase API credential; create the voice_call_logs table using the SQL in the setup sticky note inside the workflow
8. **Google Sheets** - connect Google Sheets OAuth2; set your Spreadsheet ID and ensure a sheet named Voice Call Log exists with the columns listed in the setup sticky note
9. Follow the sticky notes inside the workflow - each section has a one-liner setup guide

## 📋 Prerequisites

- Vapi or Retell AI account with an active phone number and assistant configured
- Anthropic API key (Claude API access)
- OpenAI API key (GPT-4o and GPT-4o Mini access)
- Google Gemini API key
- GoHighLevel account with at least one pipeline and automation workflow set up
- Supabase project with the voice_call_logs table created
- Google Sheets spreadsheet set up as your call log

---

Custom Workflow Request with Personal Dashboard: kumar@smartflowcraft.com · https://www.smartflowcraft.com/contact

More free templates: https://www.smartflowcraft.com/n8n-templates
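How the BANT qualification feeds pipeline routing can be sketched as below. In the template, Claude Haiku does the actual scoring from the call transcript; the boolean field names and the "3 of 4 dimensions" threshold here are assumptions purely for illustration.

```javascript
// Hypothetical sketch: turn a BANT verdict (Budget, Authority, Need,
// Timeline, each extracted as a boolean from the call) into a routing
// decision - the hot pipeline stage vs the nurture workflow.
function routeLead(bant) {
  const score =
    Number(Boolean(bant.budget)) +
    Number(Boolean(bant.authority)) +
    Number(Boolean(bant.need)) +
    Number(Boolean(bant.timeline));
  // Threshold is an assumption; tune it to your own qualification bar.
  return { score, stage: score >= 3 ? "hot" : "nurture" };
}

console.log(routeLead({ budget: true, authority: true, need: true, timeline: false }).stage);
// "hot"
console.log(routeLead({ budget: false, authority: false, need: true, timeline: true }).stage);
// "nurture"
```

A numeric score like this is also what makes the 9 AM outbound ranking step possible: leads can be sorted by score (or by the LLM's conversion-probability estimate) before dialing.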