by Panth1823
# Automate Personalized HR Email Outreach with Rate Limiting

This workflow streamlines HR outreach by fetching contact data, validating emails, enforcing daily sending limits, and sending personalized emails with attachments, all while logging activity.

## How it works
1. Read HR contact data from Google Sheets.
2. Remove duplicates and validate email formats.
3. Apply dynamic daily email sending limits.
4. Generate personalized email content.
5. Download resumes for attachments.
6. Send emails via Gmail with attachments.
7. Log sending status (success/failure) to Google Sheets.

## Setup
1. Configure Google Sheets credentials.
2. Configure Gmail OAuth2 credentials.
3. Update 'Google Sheets - Read HR Data' with your document and sheet IDs.
4. Define email content in the 'Email Creator' node.
5. Set the 'Download Resume' URL to your resume repository.
6. Update 'Log to Google Sheets' with your tracking sheet IDs.

## Customization
Adjust the 'Rate Limiter' node's RAMP_START and LIMIT_BY_WEEK variables to match your desired sending schedule and volume.
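The ramp-up behaviour can be sketched in a Code node. This is a minimal illustration, not the template's actual implementation: the names RAMP_START and LIMIT_BY_WEEK come from the description above, but their exact semantics (a week-one cap and a week-number-to-cap map) are assumptions here.

```javascript
// Sketch of a daily-limit Code node. Assumed semantics: RAMP_START is the
// day-one sending cap, LIMIT_BY_WEEK maps week number -> daily cap.
const RAMP_START = 20;                                  // assumed: emails/day before week 1
const LIMIT_BY_WEEK = { 1: 20, 2: 40, 3: 80, 4: 150 }; // assumed ramp-up schedule

function dailyLimit(startDate, today = new Date()) {
  const msPerWeek = 7 * 24 * 60 * 60 * 1000;
  const week = Math.floor((today - startDate) / msPerWeek) + 1;
  // After the ramp-up period, stay at the last configured cap.
  const maxWeek = Math.max(...Object.keys(LIMIT_BY_WEEK).map(Number));
  return LIMIT_BY_WEEK[Math.min(week, maxWeek)] ?? RAMP_START;
}

// Keep only as many contacts as today's limit allows.
const start = new Date('2025-01-06');
const limit = dailyLimit(start, new Date('2025-01-20')); // 14 days in -> week 3
const contacts = [...Array(200).keys()].map(i => ({ email: `hr${i}@example.com` }));
const batch = contacts.slice(0, limit);
```

Contacts beyond the daily cap are simply deferred to the next run, which is what keeps the sending volume inside the configured schedule.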
by Massimo Bensi
# Automate Google News Digests with AI & Gmail Approval Workflow in n8n

## Overview
This n8n automation template shows how to automatically collect and format daily Google News digests on your chosen topics, send them to your Gmail inbox for quick approval, and—if declined—generate the next set of curated news articles until you approve.

⚠️ **Disclaimer:** This workflow template uses community nodes and works only on n8n self-hosted instances.

## Use case
Streamline content curation for social media, newsletters, or blog posts by scheduling an AI-formatted Google News workflow that saves time in finding trending stories.

## How to use
- ⏰ Schedule the automation to run at your preferred time.
- 🔎 Fetch the latest trending Google News on your selected topic with SerpApi integration.
- 🤖 Send articles in batches of 10 to an AI content formatter that generates clean HTML output.
- 📧 Receive an approval email in your Gmail inbox with the AI-formatted news digest.
- ❌ Decline the digest to trigger the next batch of 10 curated news articles until you approve.
- 📊 Workflow logic uses AirTable counters and a custom Code node to manage batching.

## Setup instructions
1. Connect your SerpApi, AirTable, OpenAI, and Gmail accounts.
2. In the Gmail node, set the variable $env.EMAIL_ADDRESS_ME or replace the "To" field with your email.
3. In AirTable, create a free-tier base with two columns: WorkflowID and Counter. The workflow will manage row creation and deletion automatically.
4. Define your news topic or keyword in the SerpApi "Search Query (q)" field.
5. Run the workflow and check your Gmail inbox for your curated AI-powered news digest.

## Requirements
- AirTable account
- Gmail account
- SerpApi account
- OpenAI account

## Customising this workflow
- ⏱ Adjust the schedule in the "Schedule Trigger" node for daily, weekly, or custom timing.
- 🔑 Enter your niche news keyword in the "Search Query (q)" field of the SerpApi node.
- 📦 Change the batch size (default 10) inside the Code node "Extract Details."
- 🎨 Personalize the Gmail approval email template inside the AI Agent node "Prepare Content Review Email" for branding or formatting preferences.
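The counter-driven batching that "Extract Details" performs can be sketched as follows. The function name and variable names are illustrative; only the batch size of 10 and the counter-in-AirTable design come from the template.

```javascript
// Sketch of the batching logic, assuming `articles` is the full SerpApi
// result list and `counter` is the batch index persisted in AirTable
// (0 on the first run, incremented each time the digest is declined).
const BATCH_SIZE = 10; // default batch size; change here to customise

function nextBatch(articles, counter) {
  const start = counter * BATCH_SIZE;
  return articles.slice(start, start + BATCH_SIZE); // empty array = no more batches
}

// Example: third batch (counter = 2) of 35 articles -> items 20..29.
const articles = Array.from({ length: 35 }, (_, i) => ({ title: `Story ${i}` }));
const batch = nextBatch(articles, 2);
```

Declining the approval email would bump the Counter row and re-run this slice, so each decline surfaces the next 10 stories.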
by Daniel Shashko
This workflow automates the process of monitoring multiple RSS feeds, intelligently identifying new articles, maintaining a record of processed content, and delivering timely notifications to a designated Slack channel. It leverages AI to ensure only truly new and relevant articles are dispatched, preventing duplicate alerts and information overload.

## 🚀 Main Use Cases
- **Automated News Aggregation:** Continuously monitor industry news, competitor updates, or specific topics from various RSS feeds. 📈
- **Content Curation:** Filter and deliver only new, unprocessed articles to a team or personal Slack channel. 🎯
- **Duplicate Prevention:** Maintain a persistent record of seen articles to avoid redundant notifications. 🛡️
- **Enhanced Information Delivery:** Provide a streamlined and intelligent way to stay updated without manual checking. 📧

## How it works
The workflow operates in distinct, interconnected phases to ensure efficient and intelligent article delivery:

### 1. RSS Feed Data Acquisition 📥
- **Initiation:** The workflow is manually triggered to begin the process. 🖱️
- **RSS Link Retrieval:** It connects to a Baserow database to fetch a list of configured RSS feed URLs. 🔗
- **Individual Feed Processing:** Each RSS feed URL is then processed independently. 🔄
- **Content Fetching & Parsing:** An HTTP Request node downloads the raw XML content of each RSS feed, which is then parsed into a structured JSON format for easy manipulation. 📄➡️🌳

### 2. Historical Data Management 📚
- **Seen Articles Retrieval:** Concurrently, the workflow queries another Baserow table to retrieve a comprehensive list of article GUIDs or links that have been previously processed and notified. This forms the basis for duplicate detection. 🔍

### 3. Intelligent Article Filtering with AI 🧠
- **Data Structuring for AI:** A Code node prepares the newly fetched articles and the list of already-seen articles into a specific JSON structure required by the AI Agent. 🏗️
- **AI-Powered Filtering:** An AI Agent, powered by an OpenAI Chat Model and supported by a Simple Memory component, receives this structured data. It is precisely prompted to compare the new articles against the historical "seen" list and return only those articles that are genuinely new and unprocessed. 🤖
- **Output Validation:** A Structured Output Parser ensures that the AI Agent's response adheres to a predefined JSON schema, guaranteeing data integrity for subsequent steps. ✅
- **JSON Cleaning:** A final Code node takes the AI's raw JSON string output, parses it, and formats it into individual n8n items, ready for notification and storage. 🧹

### 4. Notification & Record Keeping 🔔
- **Persistent Record:** For each newly identified article, its link is saved to the Baserow "seen products" table, marking it as processed and preventing future duplicate notifications. 💾
- **Slack Notification:** The details of the new article (title, content, link) are then formatted and sent as a rich message to a specified Slack channel, providing real-time updates. 💬

## Summary Flow
Manual Trigger → RSS Link Retrieval (Baserow) → HTTP Request → XML Parsing | Seen Articles Retrieval (Baserow) → Data Structuring (Code) → AI-Powered Filtering (AI Agent, OpenAI, Memory, Parser) → JSON Cleaning (Code) → Save Seen Articles (Baserow) → Slack Notification

## 🎉 Benefits
- **Fully Automated:** Eliminates manual checking of RSS feeds and Slack notifications. ⏱️
- **Intelligent Filtering:** Leverages AI to accurately identify and deliver only new content, avoiding duplicates. 💡
- **Centralized Data Management:** Utilizes Baserow for robust storage of RSS feed configurations and processed article history. 🗄️
- **Real-time Alerts:** Delivers timely updates directly to your team or personal Slack channel. ⚡
- **Scalable & Customizable:** Easily adaptable to monitor various RSS feeds and integrate with different Baserow tables and Slack channels.

## ⚙️ Setup Requirements
- **Baserow API Key:** Required for accessing and updating your Baserow databases. 🔑
- **OpenAI API Key:** Necessary for the AI Agent to function. 🤖
- **Slack Credentials:** Either a Slack OAuth token (recommended for full features) or a Webhook URL for sending messages. 🗣️
- **Baserow Table Configuration:**
  - A table with an `rssLink` column to store your RSS feed URLs.
  - A table with a `Nom` column to store the links of processed articles.

For any questions or further assistance, feel free to connect with me on LinkedIn: https://www.linkedin.com/in/daniel-shashko/
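For intuition, the comparison the AI Agent is prompted to perform is equivalent to a set-membership filter. This deterministic sketch assumes each article exposes a `guid` or `link` field, mirroring the GUIDs/links stored in the Baserow "seen" table:

```javascript
// Deterministic sketch of the "new vs. seen" check, assuming `seen` is the
// list of identifiers previously saved to Baserow.
function filterNewArticles(articles, seen) {
  const seenSet = new Set(seen);
  return articles.filter(a => !seenSet.has(a.guid ?? a.link));
}

const articles = [
  { title: 'A', guid: 'g1' },
  { title: 'B', guid: 'g2' },
  { title: 'C', guid: 'g3' },
];
const seen = ['g1', 'g3'];
const fresh = filterNewArticles(articles, seen);
```

The template routes this decision through an AI Agent instead, which also tolerates feeds whose GUIDs change between fetches; the sketch above is only the baseline behaviour it is prompted to reproduce.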
by Deepak Singh
## How it works
This workflow automatically generates a daily Indian marketing & advertising newsletter. It fetches articles from Campaign India and Economic Times BrandEquity feeds, merges them, and evaluates each story using an AI relevance filter. Only meaningful updates, such as brand launches, marketing campaigns, and changes to digital media, are retained. Relevant stories are stored in an n8n Data Table and later used to build a clean HTML newsletter.

Every day at 7:30 PM IST, the workflow composes the email and sends it via Gmail, with an optional SMTP fallback if Gmail fails. After sending, processed entries are removed to keep the next day's digest fresh.

## Set up steps
1. Add your Gmail and (optional) SMTP credentials.
2. Update the recipient email inside the Gmail/SMTP nodes.
3. Confirm the Data Table exists or let n8n create it automatically.
4. Adjust the schedule timing if you want the newsletter at a different time.
5. Add or remove RSS feeds as needed (inside the brown "RSS Fetching Block").

(Full explanations for each block are included as sticky notes inside the workflow.)

By Deepak Singh. If you need help or want custom automations: deepakbiz@outlook.com
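The HTML-composition step can be sketched as below. The field names `title`, `link`, and `summary` are assumptions; the workflow's Data Table may use different column names.

```javascript
// Sketch of composing the HTML digest from Data Table rows. Column names
// (title, link, summary) are hypothetical placeholders.
function buildNewsletter(rows, dateLabel) {
  const items = rows
    .map(r => `<li><a href="${r.link}">${r.title}</a>: ${r.summary}</li>`)
    .join('\n');
  return `<h2>Marketing & Advertising Digest (${dateLabel})</h2>\n<ul>\n${items}\n</ul>`;
}

const html = buildNewsletter(
  [{ title: 'New brand launch', link: 'https://example.com/story', summary: 'FMCG major enters D2C.' }],
  '20 Nov 2025'
);
```

Because processed rows are deleted after sending, each day's call sees only fresh stories.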
by Yassin Zehar
## Description
Automatically detect and escalate critical Product UAT bugs using AI, create Jira issues, notify engineering teams, and close the feedback loop with testers.

This workflow analyzes raw UAT feedback submitted via a webhook, classifies it with an AI model, validates severity, and automatically escalates confirmed critical bugs to Jira and Slack. Testers are notified, and the original webhook receives a structured response for full traceability. It is designed for teams that want fast, reliable critical bug handling during UAT without manual triage.

## Context
During Product UAT and beta testing, critical bugs are often buried in unstructured feedback coming from forms, Slack, or internal tools. Missing or delaying these issues can block releases and create friction between Product and Engineering.

This workflow ensures:
- Faster detection of critical bugs
- Immediate escalation to engineering
- Clear ownership and visibility
- Consistent communication with testers

It combines AI-based classification with deterministic routing to keep UAT feedback actionable and production-ready.

## Who is this for?
- Product Managers running UAT or beta programs
- Project Managers coordinating QA and release readiness
- Engineering teams who need fast, clean bug escalation
- Product Ops teams standardizing feedback workflows
- Any team handling high-volume UAT feedback

Perfect for teams that want speed, clarity, and traceability during UAT.

## Requirements
- Webhook trigger (form, Slack integration, internal tool, etc.)
- OpenAI account (for AI triage)
- Jira (critical bug tracking)
- Slack (engineering alerts)
- Gmail or Slack (tester notifications)

## How it works
1. **Trigger:** The workflow starts when UAT feedback is submitted via a webhook.
2. **Normalize & Clean:** Incoming data is normalized (tester, build, page, message) and cleaned to ensure a consistent, AI-ready structure.
3. **AI Triage & Validation:** An AI model analyzes the feedback and returns a structured triage result (type, severity, summary, confidence), which is parsed and validated.
4. **Critical Bug Escalation:** Validated critical bugs automatically create a Jira issue with full context and trigger an engineering Slack alert.
5. **Closed Loop:** The tester is notified via Slack or email, and the workflow responds to the original webhook with a structured status payload.

## What you get
- Automated critical bug detection during UAT
- Instant Jira ticket creation
- Real-time engineering alerts in Slack
- Automatic tester communication
- Full traceability via structured webhook responses

## About me
I'm Yassin, a Product Manager scaling tech products with a data-driven mindset.
📬 Feel free to connect with me on LinkedIn.
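The parse-and-validate step after AI triage might look like this sketch. The severity scale and the 0.8 confidence threshold are assumptions for illustration, not values taken from the template:

```javascript
// Sketch of validating the AI triage result { type, severity, summary,
// confidence } before escalation. Threshold and scale are assumed.
const CONFIDENCE_THRESHOLD = 0.8; // assumed cut-off

function shouldEscalate(triage) {
  const severities = ['low', 'medium', 'high', 'critical'];
  if (!severities.includes(triage.severity)) {
    throw new Error(`Unexpected severity: ${triage.severity}`);
  }
  return triage.severity === 'critical' && triage.confidence >= CONFIDENCE_THRESHOLD;
}

const triage = {
  type: 'bug',
  severity: 'critical',
  summary: 'Checkout crashes on submit',
  confidence: 0.93,
};
const escalate = shouldEscalate(triage);
```

Keeping this gate deterministic is what makes the routing reliable: the LLM classifies, but a plain predicate decides whether Jira and Slack actually fire.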
by Cheng Siong Chin
## How It Works
This workflow automates tenant screening by analyzing payment history, credit, and employment data to predict rental risks. Designed for property managers, landlords, and real estate agencies, it solves the challenge of objectively evaluating tenant reliability and preventing payment defaults.

The system runs daily assessments, fetching rent payment history, credit bureau reports, and employment records. An AI agent merges this data, calculates risk scores, and routes alerts based on severity. High-risk tenants trigger immediate email notifications for intervention, medium-risk cases post to Slack for monitoring, and low-risk updates save quietly to databases. Automated collection workflows initiate for high-risk cases.

## Setup Steps
1. Configure payment history, credit bureau, and employment credentials in the fetch nodes.
2. Add an OpenAI API key for risk analysis and set Gmail/Slack credentials for alerts.
3. Customize risk score thresholds and routing rules in the workflow logic.

## Prerequisites
Payment system API, credit bureau access, employment verification API

## Use Cases
Rental application screening, existing tenant monitoring

## Customization
Modify risk scoring criteria, adjust alert thresholds

## Benefits
Reduces defaults through early detection, eliminates screening bias
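The severity routing could be sketched like this. The 0–100 scale and the 70/40 thresholds are placeholder assumptions; the template explicitly lets you customize them:

```javascript
// Sketch of the alert routing, assuming a 0-100 risk score. Thresholds
// are assumed values meant to be tuned in the workflow logic.
const HIGH_RISK = 70;   // assumed: immediate email alert + collection workflow
const MEDIUM_RISK = 40; // assumed: Slack monitoring

function routeTenant(score) {
  if (score >= HIGH_RISK) return 'email-alert';
  if (score >= MEDIUM_RISK) return 'slack-monitor';
  return 'database-only';
}

const route = routeTenant(82);
```

The same three-way split maps directly onto an n8n Switch node, with each output wired to the Gmail, Slack, or database branch.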
by Cheng Siong Chin
## Introduction
Automatically imports Excel schedules from Google Drive, validates data with AI, syncs to Google Calendar, and emails smart summaries. Ideal for educators, managers, and administrators handling recurring academic or project schedules.

## How It Works
Trigger → Download Excel → Filter events → Dual AI analysis (OpenAI + Parser) → Merge insights → Enrich data → Create/Update Google Calendar events → Generate and email AI summary.

## Workflow Template
Trigger → Download Excel → Filter Events → AI Analysis → Merge Insights → Enrich Data → Create/Update Calendar → AI Summary → Email Report

## Workflow Steps
1. **Trigger:** Runs on schedule to detect new files.
2. **Read Excel:** Converts spreadsheet data to JSON.
3. **Filter Events:** Removes invalid entries.
4. **AI Context Analysis:** Understands event links and conflicts.
5. **Structured Parser:** Formats AI output for consistency.
6. **Merge Insights:** Combines multi-AI results.
7. **Enrich Data:** Prepares Google Calendar-ready events.
8. **Calendar Actions:** Creates or updates events.
9. **AI Summary:** Generates executive overview.
10. **Email Delivery:** Sends formatted summary report.

## Setup
1. **Google Drive:** Connect OAuth2 → get file ID.
2. **Calendar:** Enable API → authorize in n8n.
3. **OpenAI:** Add API key → select GPT model.
4. **Email (Gmail/SMTP):** Configure sender and recipients.
5. **Trigger:** Set timezone and frequency.
6. **Excel Format:** Include Name, Date, Time, Location, Staff, etc.
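The "Enrich Data" step, which turns a spreadsheet row into a Google Calendar-ready event, might look like this sketch. The Name/Date/Time/Location/Staff columns come from the Excel format above; the one-hour default duration is an assumption:

```javascript
// Sketch of mapping one Excel row (as JSON) to a calendar-ready event.
function toCalendarEvent(row) {
  const start = new Date(`${row.Date}T${row.Time}:00`);
  const end = new Date(start.getTime() + 60 * 60 * 1000); // assumed 1-hour slots
  return {
    summary: row.Name,
    location: row.Location,
    description: `Staff: ${row.Staff}`,
    start: start.toISOString(),
    end: end.toISOString(),
  };
}

const event = toCalendarEvent({
  Name: 'Algebra Lecture',
  Date: '2025-03-10',
  Time: '09:00',
  Location: 'Room 2B',
  Staff: 'Dr. Lim',
});
```

Rows missing a parsable Date or Time would be the ones the "Filter Events" step drops before this mapping runs.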
by Arunava
Automatically track your Android app's keyword rankings on Google Play. This workflow checks ranks via SerpApi, updates a Baserow table, and posts a heads-up in Slack so your team can review changes quickly.

## 💡 Perfect for
- ASO teams tracking daily keyword positions
- Growth & marketing standups that want quick rank visibility
- Lightweight historical logging without a full BI stack

## 🧠 What it does
- Runs on a schedule (e.g., weekly)
- Queries SerpApi for each keyword's Play Store ranking
- Saves results to Baserow: Current Rank, Previous Rank, Last modified
- Sends a Slack alert: "Ranks updated — review in Baserow"

## ⚡ Requirements
- SerpApi account & API key
- Baserow account + API token
- Slack connection (bot/app or credential in n8n)

## ⚙️ Setup Instructions

### 1) Create a Baserow table
Create a new table (any name). Add these user-field names exactly:
- Keywords (text)
- Current Rank (number)
- Previous Rank (number)
- Last modified (date/time)

Optional fields you can add later: Notes, Locale, Store Country, App Package ID.

### 2) Connect credentials in n8n
- Baserow: add your API token and select your Database and Table in the Baserow nodes.
- Slack: connect your Slack account/workspace in the Slack node.
- SerpApi: open the HTTP Request node and put your API key under Query Parameters → api_key = YOUR_KEY.

### 3) Verify field mapping
In the Baserow (Update Row) node, map:
- Row ID → {{$json.id}}
- Current Rank → {{$json["Current Rank"]}}
- Previous Rank → your Code node should set this (the template copies the old "Current Rank" into "Previous Rank" before writing the new one)
- Last modified → {{$now}} (or the timestamp you compute)

## 🛟 Notes & Tips
- If you prefer a single daily Slack summary instead of multiple pings, add a Code node after the updates to aggregate lines and send one message.
- Treat 0 or missing ranks as "not found" and flag them in Slack if helpful.
- For multi-country tracking, include hl/gl (locale/country) in your SerpApi query params and store them as columns.

## 🤝 Need a hand?
I'm happy to help you get this running smoothly—or tailor it to your brand. Reach out to me via email: imarunavadas@gmail.com
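The rank-shifting Code node described in step 3 can be sketched as follows; treating a rank of 0 or a missing value as "not found" follows the tip above:

```javascript
// Sketch of the Code node that shifts ranks before the Baserow update:
// the old "Current Rank" becomes "Previous Rank", and 0/missing new
// ranks are stored as null ("not found").
function shiftRanks(row, newRank) {
  return {
    ...row,
    'Previous Rank': row['Current Rank'],
    'Current Rank': newRank && newRank > 0 ? newRank : null,
  };
}

const row = { id: 7, Keywords: 'habit tracker', 'Current Rank': 12, 'Previous Rank': 15 };
const updated = shiftRanks(row, 9);
```

Doing the shift in code before the Update Row node means a single Baserow write per keyword, which keeps the field mapping in step 3 simple.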
by Marián Današ
## Why
Creating and sending invoices manually is a major administrative bottleneck. It's not only slow but also prone to human error, such as creating duplicate invoice numbers or sending sensitive financial data in an unsecured format. This workflow solves these problems by creating a robust, end-to-end automation. It ensures every invoice has a unique ID, is professionally generated, is password-protected, and is delivered to your customer automatically.

## What
This workflow provides a complete, secure solution for automated invoicing. It is designed to be triggered by a Webhook (e.g., from your e-commerce store, CRM, or billing platform) that provides customer and order details. The workflow then executes the following steps:

1. **Generate & Verify ID:** It first generates a new invoice ID, then performs a critical check by reading your master Google Sheet to ensure this ID is unique, preventing duplicate invoices.
2. **Generate PDF:** Once the ID is verified, it passes the data to the PDF Generator API. This service dynamically populates your custom invoice template. (PDF Generator API makes it incredibly easy to build and manage your document templates via their web-based editor.)
3. **Encrypt Document:** For enhanced security, the workflow uses a PDF Generator API operation to encrypt the newly generated invoice with a password, protecting your client's sensitive data.
4. **Store & Deliver:** Finally, it uploads the secure PDF to a specified Google Drive folder for your records and then automatically sends it to the customer as an attachment using Gmail.

## How

### Prerequisites
You will need active accounts for:
- PDF Generator API (for both generation and encryption)
- Google Suite (for Sheets, Drive, and Gmail)

### PDF Generator API Setup
1. Log in to your PDF Generator API account and use their template builder to create your invoice design. Note your Template ID, API Key, and API Secret.
2. In the n8n PDFGeneratorAPI node (Generate a PDF document), create new credentials using your Key and Secret.
3. In the node's parameters, select your Template ID from the list.

### Google Sheets Setup
1. Create a Google Sheet to act as your master list of invoices.
2. In the Check If ID Already Exists node, authenticate your Google Sheets account. Set the Spreadsheet ID and Sheet Name.
3. In the "Columns to Return" field, enter the name of the column where you store your invoice IDs.

### Security & Delivery Setup
- **Encrypt Node:** In the Encrypt PDF document node, authenticate your PDF Generator API credentials (the same ones from Step 2). You can set a static password, or for better security, use an expression to set a dynamic password from the webhook data (e.g., the customer's postal code or order ID).
- **Google Drive Node:** Authenticate the Upload file node and specify the Drive and Folder ID where invoices should be stored.
- **Gmail Node:** Authenticate the Send a message + file node. Use an expression to map the customer's email from the trigger data into the "To" field.

### Test & Activate
The Webhook node has pinned test data. You can click "Test workflow" to run the entire process with this sample data. Once you confirm the file is generated, encrypted, and sent, connect your live app (e.g., Shopify, Stripe, etc.) to the production Webhook URL and activate the workflow.
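The generate-and-verify step could be sketched like this. The INV-YYYY-NNNN format is purely hypothetical; the template's actual ID scheme may differ, and `existingIds` stands in for the column read from the master Google Sheet:

```javascript
// Sketch of generating a new invoice ID and verifying it against the
// IDs already present in the master sheet. The ID format is assumed.
function nextInvoiceId(existingIds, year) {
  const prefix = `INV-${year}-`;
  const used = existingIds
    .filter(id => id.startsWith(prefix))
    .map(id => parseInt(id.slice(prefix.length), 10));
  const next = (used.length ? Math.max(...used) : 0) + 1;
  const candidate = `${prefix}${String(next).padStart(4, '0')}`;
  // Duplicate guard: mirrors the "Check If ID Already Exists" node.
  if (existingIds.includes(candidate)) {
    throw new Error(`Duplicate invoice ID: ${candidate}`);
  }
  return candidate;
}

const id = nextInvoiceId(['INV-2025-0001', 'INV-2025-0002'], 2025);
```

Taking the max of existing numbers rather than counting rows keeps the sequence monotonic even if old invoices are deleted from the sheet.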
by Rully Saputra
Sign up for Decodo — get better pricing here

## Who's it for
This workflow is designed for e-commerce operators, pricing analysts, retail founders, and procurement teams who need reliable, automated price intelligence without manual tracking. If you manage competitive pricing across Amazon or multiple product URLs stored in Google Sheets, this workflow gives you consistent monitoring and automated alerts.

## What it does
This automation checks product prices on a schedule, scrapes real-time Amazon data using Decodo, compares it with your baseline price, and routes alerts depending on whether the price increases, stays normal, or drops. High increases trigger Telegram alerts and automatically create a Google Calendar meeting. Price drops send a rich HTML email to stakeholders. All items are processed in controlled batches to avoid rate limits.

## How it works
1. Reads product URLs + baseline prices from Google Sheets
2. Uses Decodo to extract the current Amazon price, title, and product data
3. Calculates the percentage difference via a Code node
4. Routes items through the Switch node (High / Normal / Low)
5. Sends alerts or emails accordingly
6. Loops continuously until all rows are processed

## Requirements
- Google Sheets, Gmail, Google Calendar, and Telegram credentials
- Active Decodo API credentials
- A sheet containing URL + baseline columns

## How to customize
You can adjust alert thresholds, add more channels (Slack, WhatsApp, Notion), change the batch size, modify email templates, or extend the Google Sheet with additional product metadata. The routing logic is easily expandable for more pricing scenarios.
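The percentage-difference Code node and the Switch routing can be sketched together. The ±10% thresholds are assumptions chosen to illustrate the High / Normal / Low split; tune them to your alerting policy:

```javascript
// Sketch of computing the percentage difference against the baseline and
// the category consumed by the Switch node. Thresholds are assumed.
const THRESHOLD = 10; // assumed percent

function classifyPrice(baseline, current) {
  const diffPct = ((current - baseline) / baseline) * 100;
  let route = 'Normal';
  if (diffPct >= THRESHOLD) route = 'High';      // Telegram alert + Calendar meeting
  else if (diffPct <= -THRESHOLD) route = 'Low'; // HTML email to stakeholders
  return { diffPct: Math.round(diffPct * 10) / 10, route };
}

const result = classifyPrice(49.99, 39.99); // a 20% drop
```

Returning both the rounded percentage and the route lets the same item feed the Switch node and the alert message without recomputation.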
by Amirul Hakimi
# Advanced AI Lead Enrichment & Cold Email Personalization with n8n, Airtable, Apify, and LLMs

**Automated B2B Lead Nurturing: Hyper-Personalization for High-Converting Cold Email Campaigns**

This powerful n8n automation workflow is designed to execute advanced B2B lead enrichment and hyper-personalization for cold email outreach. By orchestrating a complex chain of data scraping, AI analysis (via LLMs/GPT-4.1), and CRM synchronization (using Airtable), this workflow ensures every lead receives a highly tailored and relevant outreach message, maximizing conversion rates and minimizing manual effort.

## Workflow Execution & Key Features

### Airtable Trigger & Lead Qualification
- The workflow is triggered by an Airtable webhook, pulling a new lead record (including name, email, and company URLs).
- **Email Validation** is performed using **NeverBounce** to filter out invalid contacts.
- **Initial Lead Filtering** screens for key demographic criteria (e.g., US: Yes or No? and target Headcount: >5, <30?). Only qualified B2B leads proceed, ensuring optimal resource allocation.

### Deep Web & Social Scraping (Apify Integration)
- A **LinkedIn Company Scraper** and a **LinkedIn Profile Scraper** (via **Apify**) extract raw data from the lead's company and personal profiles.
- The **Company Homepage Scraper** pulls the main website content for analysis.
- The **Scrape Personal LinkedIn Posts** node retrieves recent activity for the ultimate personalization hook.

### AI-Powered Data Synthesis & Variable Determination
- Multiple OpenAI (GPT-4.1-mini/4.1) nodes analyze and structure the raw, cleaned text (Remove HTML nodes ensure clean inputs).
- **Determine Valuable URLs** uses an LLM to smartly categorize and select key company pages (e.g., ==/about==, ==/solutions==, ==/case-studies==) for deeper scraping.
- The **Analyze Company/Mission**, **Analyze Offerings & Positioning**, **Analyze Process & Differentiation**, and **Analyze Proof of Success** nodes create factual, structured business summaries for the ultimate ICP research.
- **Determine Variables** nodes create **pre-written, personalized cold email variables** (==company_specialty==, ==ICPofLead==, ==PainPointLeadSolves==, etc.) for different outreach strategies.

### LinkedIn Post Personalization
- An LLM (Craft Opening Line - Posts) analyzes recent LinkedIn activity to generate a hyper-specific, conversation-starting opener (e.g., "Saw your LinkedIn post about...").
- Conditional logic (Posts Available?) determines whether to use the post-based opener or fall back to the standard, company-based personalization.

### CRM Update & Campaign Launch (Instantly.ai)
- Finalized, enriched lead data and the crafted personalization variables are synchronized back to the Airtable CRM for record-keeping and lead status updates (Update Lead W/ Enrichment).
- The lead is then seamlessly pushed to the Instantly.ai outbound platform, injecting the AI-generated custom variables directly into the cold email sequence for mass deployment.

This blueprint automates the tedious, high-effort task of prospect research and personalization, providing a scalable lead generation solution that increases both outreach quality and sales velocity. Stop sending generic emails—start leveraging AI automation today.
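The initial lead-filtering criteria (US-based, headcount above 5 and below 30) reduce to a simple predicate. The field names here are assumptions about the Airtable record shape, used only for illustration:

```javascript
// Sketch of the Initial Lead Filtering step. `country` and `headcount`
// are hypothetical field names on the incoming Airtable record.
function qualifies(lead) {
  return lead.country === 'US' && lead.headcount > 5 && lead.headcount < 30;
}

const lead = { name: 'Jane Doe', country: 'US', headcount: 12 };
const proceed = qualifies(lead);
```

Gating on cheap demographic checks before any Apify or LLM calls is what keeps the per-lead enrichment cost proportional to qualified leads only.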
by David Olusola
# 📄 AI Summarize Weekly Google Docs Updates → Send Email

This workflow automatically reviews selected Google Docs every week, checks for updates, and generates a professional weekly summary email using AI. It's perfect for keeping your team or leadership informed without manually digging through multiple documents.

## ⚙️ How It Works
1. **Weekly Monday 9AM Trigger:** A Cron node runs every Monday at 9 AM (adjustable), ensuring summaries arrive at the start of the workweek.
2. **Prepare Docs List:** A Code node defines which Google Docs to monitor, including doc IDs, names, and categories (e.g., Projects, Meetings, Updates), and sets a 7-day date range for updates.
3. **Fetch Google Docs & Metadata:** Retrieves document content and metadata (last modified, user, version) and filters only docs that have been updated in the past week.
4. **Process Doc Data:** Extracts plain text content from Google Docs, cleans and normalizes it for summarization, and collects key details: word count, modified by, last updated date.
5. **Aggregate Updated Docs:** Gathers all updated docs into one combined content block and prepares context for the AI model to create a weekly digest.
6. **Generate AI Summary:** Uses GPT-4 to generate a business-style weekly summary, including an executive summary, key updates by document, important changes, action items, and next week's focus.
7. **Prepare Email Content:** Formats the AI response into both plain text and HTML email, and adds a list of updated documents with last-modified info.
8. **Send Summary Email:** Sends the final summary to the configured team emails via Gmail. The subject line includes the date range for easy reference.

## 🛠️ Setup Steps

### 1. Google Docs Setup
- Add document IDs in the Prepare Docs List node.
- Enable the Google Drive API in your Google account.
- Connect Google OAuth credentials in n8n.

### 2. OpenAI API Key
- Get your key from platform.openai.com.
- Add it to your n8n credentials.
- Uses GPT-4 for high-quality summaries.

### 3. Email Recipients
- Update the Gmail node with your team's email addresses.
- Customize the subject line and template if needed.

### 4. Schedule
- Default: every Monday at 9 AM.
- Adjust the Cron node if you prefer a different time.

## 📧 Example Output

Subject: 📄 Weekly Document Updates Summary – 08/22/2025 – 08/29/2025

Body (excerpt):

> Dear Team,
>
> Here's your weekly document updates summary:
>
> **Executive Summary:**
> - Project Status Doc updated with new timelines.
> - Meeting Notes highlight three key decisions from leadership.
> - Team Updates document includes two new hires and onboarding tasks.
>
> **Key Updates by Document:**
> - Project Status Doc (Projects) - Updated by Alice on 08/27
> - Meeting Notes (Meetings) - Updated by Bob on 08/28
> - Team Updates (Updates) - Updated by Sarah on 08/29
>
> **Action Items:**
> - Confirm revised project deadlines.
> - Follow up on onboarding checklist.
>
> This summary was automatically generated by your n8n workflow.

⚡ With this workflow, you'll never miss important document changes — your team gets a clear, AI-generated weekly digest straight in their inbox.
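The seven-day update filter from the "Fetch Google Docs & Metadata" step can be sketched as below, assuming Drive-style ISO `modifiedTime` metadata on each doc:

```javascript
// Sketch of keeping only docs modified inside the 7-day window.
const WINDOW_DAYS = 7;

function updatedThisWeek(docs, now = new Date()) {
  const cutoff = now.getTime() - WINDOW_DAYS * 24 * 60 * 60 * 1000;
  return docs.filter(d => new Date(d.modifiedTime).getTime() >= cutoff);
}

const docs = [
  { name: 'Project Status Doc', modifiedTime: '2025-08-27T10:00:00Z' },
  { name: 'Archive', modifiedTime: '2025-07-01T10:00:00Z' },
];
const recent = updatedThisWeek(docs, new Date('2025-08-29T09:00:00Z'));
```

Only the docs that survive this filter are aggregated into the combined content block, so unchanged documents never reach the GPT-4 summarization step.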