by Oneclick AI Squad
This automated n8n workflow processes student applications on a scheduled basis, validates data, updates databases, and sends welcome communications to students and guardians.

Main Components

- **Trigger at Every Day 7 am** - Scheduled trigger that runs the workflow daily
- **Read Student Data** - Reads pending applications from Excel/database
- **Validate Application Data** - Checks data completeness and format
- **Process Application Data** - Processes validated applications
- **Update Student Database** - Updates records in the student database
- **Prepare Welcome Email** - Creates personalized welcome messages
- **Send Email** - Sends welcome emails to students/guardians
- **Success Response** - Confirms successful processing
- **Error Response** - Handles any processing errors

Essential Prerequisites

- Excel file with student applications (student_applications.xlsx)
- Database access for student records
- SMTP server credentials for sending emails
- File storage access for reading application data

Required Excel File Structure (student_applications.xlsx):

Application ID | First Name | Last Name | Email | Phone | Program Interest | Grade Level | School | Guardian Name | Guardian Phone | Application Date | Status | Notes

Expected Input Data Format:

{
  "firstName": "John",
  "lastName": "Doe",
  "email": "john.doe@example.com",
  "phone": "+1234567890",
  "program": "Computer Science",
  "gradeLevel": "10th Grade",
  "school": "City High School",
  "guardianName": "Jane Doe",
  "guardianPhone": "+1234567891"
}

Key Features

- ⏰ **Scheduled Processing:** Runs daily at 7 AM automatically
- 📊 **Data Validation:** Ensures application completeness
- 💾 **Database Updates:** Maintains student records
- 📧 **Auto Emails:** Sends welcome messages
- ❌ **Error Handling:** Manages processing failures

Quick Setup

- Import the workflow JSON into n8n
- Configure the schedule trigger (default: 7 AM daily)
- Set the Excel file path in the "Read Student Data" node
- Configure the database connection in the "Update Student Database" node
- Add SMTP settings in the "Send Email" node
- Test with sample data
- Activate the workflow

Parameters to Configure

- excel_file_path: Path to student applications file
- database_connection: Student database credentials
- smtp_host: Email server address
- smtp_user: Email username
- smtp_password: Email password
- admin_email: Administrator notification email
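The "Validate Application Data" step can be sketched as a small function in an n8n Code node. This is an illustrative sketch only: the required-field list and the email regex are assumptions based on the input format shown above, not the template's actual implementation.

```javascript
// Sketch of the "Validate Application Data" logic. The required fields are
// taken from the expected input format above; adjust to your own schema.
const REQUIRED = ["firstName", "lastName", "email", "phone", "program"];

function validateApplication(app) {
  const missing = REQUIRED.filter((f) => !app[f] || String(app[f]).trim() === "");
  // Minimal email shape check; not a full RFC 5322 validator.
  const emailOk = /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(app.email || "");
  const errors = missing.map((f) => `Missing field: ${f}`);
  if (!emailOk) errors.push("Invalid email format");
  return { valid: missing.length === 0 && emailOk, errors };
}
```

Applications that fail validation would be routed to the Error Response branch instead of the database update.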
by Marián Današ
Who’s this for 💼

This template is designed for teams and developers who need to generate PDF documents automatically from HTML templates. It’s suitable for use cases such as invoices, confirmations, reports, certificates, or any custom document that needs to be created dynamically based on incoming data.

What this workflow does ⚙️

This workflow automates the full lifecycle of document generation, from request validation to delivery and storage. It is triggered by a POST webhook that receives structured JSON data describing the requested document and client information. Before generating the document, the workflow validates the client’s email address using Hunter Email Verification to prevent invalid or mistyped emails. If the email is valid, the workflow loads the appropriate HTML template from a Postgres database, fills it with the incoming data, and converts it into a PDF using PDF Generator API. Once the PDF is generated, it is sent to the client via Gmail, uploaded to Supabase Storage, and the transaction is recorded in the database for tracking and auditing purposes.

How it works 🛠️

- Receives a document generation request via a POST webhook.
- Validates the client’s email address using Hunter.
- Generates a PDF document from an HTML template using PDF Generator API.
- Sends the PDF via Gmail and uploads it to Supabase Storage.
- Stores a document generation record in the database.

How to set up 🖇️

Before activating the workflow, make sure all required services and connections are prepared and available in your n8n environment.

- Create a POST webhook endpoint that accepts structured JSON input.
- Add Hunter API credentials for email verification.
- Add PDF Generator API credentials for HTML to PDF conversion.
- Prepare a Postgres database with tables for HTML templates and document generation records.
- Set up Gmail or SMTP credentials for email delivery.
- Configure Supabase Storage for storing generated PDF files.
Requirements ✅

- PDF Generator API account
- Hunter account
- Postgres database
- Gmail or SMTP-compatible email provider
- Supabase project with Storage enabled

How to customize the workflow 🤖

This workflow can be adapted to different document generation scenarios by extending or modifying its existing steps:

- Add extra validation steps before document generation if required.
- Extend delivery options by sending the generated PDF to additional services or webhooks.
- Enhance security by adding document encryption or access control.
- Add support for additional document types by storing more HTML templates in the database.
- Modify the database schema or queries to store additional metadata related to generated documents.
- Adjust the data mapping logic in the Code node to match your input structure.
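The data-mapping step in the Code node can be sketched as a simple placeholder substitution. This is a hypothetical sketch: the `{{placeholder}}` syntax and field names are assumptions, not taken from the stored templates themselves.

```javascript
// Sketch of filling an HTML template (loaded from Postgres) with fields
// from the webhook body. Unknown placeholders are replaced with "".
function fillTemplate(html, data) {
  return html.replace(/\{\{(\w+)\}\}/g, (_, key) =>
    key in data ? String(data[key]) : ""
  );
}
```

The filled HTML would then be sent to PDF Generator API for conversion.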
by Milo Bravo
Automated Email Outreach: Telegram → Gmail → Sheets Dashboard

Who is this for?

Solo founders, sales teams, and event organizers who need email outreach without expensive tools but want full control from Telegram.

What problem is this workflow solving?

Email campaigns are painful:

- Expensive tools ($50+/month)
- No mobile control
- Manual tracking
- Unsubscribe nightmares

This workflow gives you Zapier-level outreach for free, from Telegram → Gmail → Sheets.

What this workflow does

- **Telegram Control** - /outreach command launches campaigns
- **Smart Sending** - Gmail + random delays (anti-spam)
- **Real-time Tracking** - Open pixels + unsubscribe webhooks
- **Sheets Dashboard** - Leads, logs, stats in one place
- **Compliance** - Auto-unsubscribe + opt-out tracking

Full flow: Telegram Bot → Parse Command → Sheets Leads → Gmail Send → Pixel/Unsub Track → Update Dashboard

Setup (7 minutes)

- Telegram: Create bot → Get token → Update chatId
- Gmail: OAuth2 credential (any account)
- Google Sheets: Create sheet with tabs:
  - Dashboard (stats)
  - Leads (email, name, status)
  - Logs (sends, opens, unsubs)
- Config: Update Sheet ID + webhook URLs
- Test: /outreach cap:2 → Verify sends

Telegram commands:

/outreach sender:you@domain.com subject:"Event Invite" body:"Hi {{name}}..." cap:50
/status → Campaign stats
/stop → Pause sends

How to customize to your needs

Campaign Types:

- Event invites → {{name}} for {{event}}
- Sales outreach → {{company}} pricing inquiry
- Newsletter → {{name}} weekly update

Scale Up:

- Multiple senders (Gmail aliases)
- A/B testing (subject lines)
- Segmentation (lead status)
- CRM sync (HubSpot/Airtable)

Anti-spam:

- Random delays (30s-2m)
- HTML tracking pixel
- Auto-unsubscribe
- Send caps
- Bounce handling

ROI:

- $0/month (vs Zapier $50+, Mailchimp $20+)
- Telegram control (no desktop needed)
- 5min campaigns (vs hours setup)
- Real-time dashboard (opens, unsubs, sends)
- GDPR compliant (auto-unsub)

Proven: Used for 5k+ event invites, 28% open rate.
Need help customizing? Contact me for consulting and support: LinkedIn / Message

Keywords: n8n email outreach, telegram automation, gmail campaigns, google sheets dashboard, no-code email marketing, sales outreach automation, event invite workflow.
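The Parse Command step can be sketched as a key:value parser for the /outreach command shown above. This is an illustrative sketch, assuming the `key:value` and `key:"quoted value"` syntax from the example command; the template's actual parser may differ.

```javascript
// Sketch of parsing: /outreach sender:you@domain.com subject:"Event Invite" cap:50
// Returns a plain object of command parameters (all values as strings).
function parseOutreach(text) {
  const params = {};
  const re = /(\w+):(?:"([^"]*)"|(\S+))/g;
  let m;
  while ((m = re.exec(text)) !== null) {
    params[m[1]] = m[2] !== undefined ? m[2] : m[3]; // quoted or bare value
  }
  return params;
}
```

Downstream nodes would then read `params.sender`, `params.subject`, and `params.cap` to drive the Gmail send loop.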
by Klardaten
This workflow watches your Outlook inbox for new emails with attachments and archives those attachments in DATEV DMS. It skips emails without attachments, looks up a related DATEV client, processes each attachment separately, uploads the file to DATEV DMS, creates the matching document entry, optionally sends a Slack notification, and then marks the email as archived in Outlook.

How it works

- The workflow starts when a new email arrives in Outlook.
- Emails without attachments are skipped.
- A DATEV client is retrieved for the email.
- Attachments are processed one by one.
- Each file is uploaded to DATEV DMS and linked to a document.
- A Slack message can be sent after a successful upload.
- The Outlook email is then marked as archived.

Setup steps

- Connect your Microsoft Outlook OAuth2 credentials.
- Connect your DATEVconnect credentials.
- Optionally connect Slack credentials.
- Replace the demo client lookup with your own client-matching logic.
- Adjust the folder, registry, and metadata fields in the document creation step.
- Update the Outlook archive action to match your process.
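The demo client lookup you are asked to replace could look like this minimal sketch, which maps the sender's email domain to a client record. The mapping table, field names, and domain-based matching rule are all placeholders; in practice you would query DATEVconnect or your own client master data.

```javascript
// Hypothetical client-matching sketch: sender email domain -> DATEV client.
// Replace CLIENT_BY_DOMAIN with a real lookup against your client data.
const CLIENT_BY_DOMAIN = {
  "example-gmbh.de": { clientNumber: 10001, name: "Example GmbH" },
};

function lookupClient(senderEmail) {
  const domain = (senderEmail.split("@")[1] || "").toLowerCase();
  return CLIENT_BY_DOMAIN[domain] || null; // null -> no matching client
}
```

Emails whose sender cannot be matched to a client could be routed to a manual-review branch instead of DATEV DMS.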
by Yaron Been
Detect when target accounts adopt competitor technology by enriching company watchlists with PredictLeads tech detection data and emailing the assigned AE.

This workflow monitors a Google Sheets watchlist of company domains for competitor technology adoption. It checks each company against the PredictLeads Technology Detections API, and when a match is found (e.g., a prospect starts using Salesforce), it sends an alert email to the account executive responsible for that account.

How it works:

- Schedule trigger runs the workflow daily at 8 AM.
- Reads the company watchlist from Google Sheets (domain, company name, AE email).
- Loops through each company and fetches technology detections from PredictLeads.
- Checks if any detected technology matches the configured competitor tool.
- If a match is found, extracts the assigned AE's email address.
- Sends a Gmail alert to the AE with company name, domain, detected tech, and date.
- If no match, moves to the next company in the loop.

Setup:

- Create a Google Sheet with a "CompetitorWatchlist" tab containing columns: domain, company_name, ae_email.
- Set the competitor technology name in the Check Competitor Tech code node (default: "Salesforce").
- Connect your Gmail account (OAuth2) for sending alert emails.
- Add your PredictLeads API credentials (X-Api-Key and X-Api-Token headers).

Requirements:

- Google Sheets OAuth2 credentials.
- Gmail OAuth2 credentials.
- PredictLeads API account (https://docs.predictleads.com).

Notes:

- Change the COMPETITOR_TECH variable in the code node to match your specific competitor (e.g., "HubSpot", "Marketo", "Zendesk").
- Each row in the watchlist should have a valid AE email -- rows without one are skipped.
- PredictLeads Technology Detections API docs: https://docs.predictleads.com
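The Check Competitor Tech code node can be sketched as a case-insensitive match against the detection list. This sketch assumes detections have already been simplified to objects with a `name` field; check the PredictLeads API response for the actual field names before adapting it.

```javascript
// Sketch of the "Check Competitor Tech" logic. COMPETITOR_TECH is the
// variable mentioned in the notes above; change it to your competitor.
const COMPETITOR_TECH = "Salesforce";

function findCompetitorMatch(detections) {
  return (
    detections.find(
      (d) => (d.name || "").toLowerCase() === COMPETITOR_TECH.toLowerCase()
    ) || null
  );
}
```

A non-null result would trigger the Gmail alert branch; null continues the loop.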
by PollupAI
Who’s it for

This workflow is built for B2B SaaS and CX teams that are drowning in unstructured customer feedback across tools. It’s ideal for Customer Success, Product and Support leaders who want a light “voice of customer engine” without rebuilding their stack: Gmail for interactions, Slack for conversations, Pipedrive for notes and Zendesk for tickets, plus Notion for follow-up tasks.

How it works / What it does

The workflow runs on a schedule or manual trigger and first sets the CSM’s email address. It then uses an AI “Data agent” to pull recent customer signals from multiple sources: Gmail messages, Slack messages, Pipedrive notes and Zendesk tickets. A “Signals agent” compresses each piece of feedback into a concise, neutral summary, which is then grouped by topic via a “Clustering agent”. Each cluster gets a label, count and examples. Finally, an “Action agent” routes clusters based on their label:

- Create Zendesk tickets for product/performance issues
- Post to a dedicated Slack channel for billing / contract topics
- Create Notion tasks for sales-related feedback
- Send targeted Gmail messages to the CSM for high-risk or engagement-related items

How to set up

- Import the workflow into n8n.
- Connect credentials for Gmail, Slack, Pipedrive, Zendesk, Notion and OpenAI.
- Update the CSM email in the “Set CSM email” node.
- Adjust date filters, send-to addresses and Slack channel IDs as needed.
- Enable the schedule trigger for weekly or daily digests.

Requirements

- Active accounts & credentials for: Gmail, Slack, Pipedrive, Zendesk and Notion
- OpenAI (or compatible) API key for the LLM node
- At least one Slack channel for posting feedback (e.g. #billing-feedback)

How to customize the workflow

- Change the time window or filters (sender, channel, query) for each data source.
- Edit the clustering and routing prompts to match your own categories and teams.
- Add new destinations (e.g. Jira, HubSpot) by connecting more tools to the Action agent.
- Modify thresholds (e.g. minimum count) before a cluster triggers an action.
- Localize labels and email copy to your team’s language and tone.
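The Action agent's routing rule described above can be sketched as a simple label dispatcher. The label strings here are illustrative assumptions; align them with whatever labels your Clustering agent actually emits.

```javascript
// Sketch of routing a cluster to a destination based on its label,
// following the four destinations described above.
function routeCluster(cluster) {
  const label = (cluster.label || "").toLowerCase();
  if (label.includes("product") || label.includes("performance")) return "zendesk";
  if (label.includes("billing") || label.includes("contract")) return "slack";
  if (label.includes("sales")) return "notion";
  return "gmail"; // high-risk / engagement items go straight to the CSM
}
```

A minimum-count threshold check (see "Modify thresholds" above) would naturally sit just before this dispatch.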
by Veena Pandian
Who is this for?

Founders, product managers, content strategists, indie hackers, and anyone who wants to automatically monitor tech industry trends across multiple sources — without manually browsing Hacker News and Product Hunt every day.

What this workflow does

This workflow scans public RSS feeds from Hacker News and Product Hunt daily, scores every item against configurable keyword groups (AI, SaaS, No-Code, Dev Tools, etc.), clusters the results into ranked themes, and delivers a prioritized intelligence report via Slack and email. All signals and themes are logged to Google Sheets for historical trend analysis.

How it works

- Daily trigger fires on a configurable schedule (default: every 24 hours).
- Fetches RSS feeds from Hacker News (posts with 50+ points) and Product Hunt in parallel.
- Parses and normalizes all feed items — extracting titles, descriptions, URLs, and publish dates from RSS/Atom XML.
- Scores each item against 7 weighted keyword groups. Title matches receive a bonus multiplier. Source weights (Hacker News 1.5x, Product Hunt 1.3x) amplify signals from higher-authority sources.
- Clusters into themes — groups scored items by primary category, calculates theme strength using source diversity and volume bonuses, and classifies each as VERY_STRONG, STRONG, MODERATE, or WEAK.
- Builds an intelligence report with theme rankings, top 10 signals, and action items for surging topics. Generated in both plain text (Slack) and HTML (email).
- Delivers and logs — posts to Slack, sends HTML email, and appends both individual signals and theme summaries to separate Google Sheet tabs.

Setup steps

- Connect Google Sheets OAuth2 credentials and update the Sheet ID in both "Log Signals to Sheet" and "Log Themes to Sheet" nodes.
- Create a Google Sheet with two tabs:
  - signal — headers: date, title, source, score, category, url
  - themes — headers: date, category, signal_level, theme_strength, item_count, sources, top_keywords
- Connect Slack OAuth2 credentials and configure your target channel in the "Post Report to Slack" node.
- Connect Gmail OAuth2 credentials and update YOUR_EMAIL@EXAMPLE.COM in the "Email Daily Report" node.
- Activate the workflow.

Requirements

- n8n instance (self-hosted or cloud)
- Google Cloud project with Sheets API enabled
- Slack workspace with a bot configured
- Gmail account with OAuth2 credentials (or swap for SMTP)
- No API keys needed for RSS feeds — they are publicly accessible

How to customize

- **Add more RSS feeds** — duplicate a feed node (e.g., TechCrunch, Reddit, Lobsters), connect it to the Merge node as an additional input, and add the parsing logic in the "Parse All RSS Feeds" code node.
- **Edit keyword groups** — modify the keywordGroups object in the "Score and Classify Signals" node. Add your industry-specific keywords, adjust weights, and rename categories.
- **Adjust source weights** — change the weight multipliers in the parser node to reflect which sources you trust most.
- **Theme thresholds** — modify the strength cutoffs (30 = VERY_STRONG, 15 = STRONG, 8 = MODERATE) in the "Aggregate Signals into Themes" node.
- **Schedule** — change from daily to hourly for real-time monitoring, or weekly for a digest format.
- **Add AI analysis** — insert an LLM node after the report builder to generate strategic commentary on detected trends.
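The scoring scheme described above (weighted keyword groups, title bonus, source multipliers) can be sketched like this. The group contents, weights, and the 2x title bonus are illustrative assumptions; the template's real keywordGroups object lives in the "Score and Classify Signals" node.

```javascript
// Illustrative scoring sketch. Note the naive substring matching:
// "ai" also matches inside longer words, so real keyword lists should
// prefer distinctive terms or word-boundary regexes.
const keywordGroups = {
  AI: { weight: 3, terms: ["ai", "llm", "gpt"] },
  SaaS: { weight: 2, terms: ["saas", "subscription"] },
};
const sourceWeights = { "Hacker News": 1.5, "Product Hunt": 1.3 };

function scoreItem(item) {
  const title = item.title.toLowerCase();
  const desc = (item.description || "").toLowerCase();
  let score = 0;
  for (const { weight, terms } of Object.values(keywordGroups)) {
    for (const term of terms) {
      if (title.includes(term)) score += weight * 2; // title match bonus
      else if (desc.includes(term)) score += weight;
    }
  }
  return score * (sourceWeights[item.source] || 1); // unknown source -> 1x
}
```

With these example weights, an AI-titled Hacker News post scores 3 * 2 * 1.5 = 9.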
by Frederik Duchi
This n8n template demonstrates how to automatically create tasks (or, in general, records) in Baserow based on template or blueprint tables. The first blueprint table is the master table that holds the general information about the template. For example: a standard procedure to handle incidents. The second table is the details table that holds multiple records for the template. Each record in that table is a specific task that needs to be assigned to someone with a certain deadline. This makes it easy to streamline task creation for recurring processes.

Use cases are many:

- Project management (generate tasks for employees based on a project template)
- HR & onboarding (generate tasks for employee onboarding based on a template)
- Operations (create checklists for maintenance, audits, or recurring procedures)

Good to know

- The Baserow template for handling Standard Operating Procedures works perfectly as a base schema to try out this workflow.
- Authentication is done through a database token. Check the documentation on how to create such a token.
- Tasks are inserted using the HTTP request node instead of a dedicated Baserow node. This is to support batch import instead of importing records one by one.

Requirements

- Baserow account (cloud or self-hosted)
- A Baserow database with at least the following tables:
  - Assignee / employee table. This is required to be able to assign someone to a task.
  - Master table with procedure or template information. This is required to be able to select a certain template.
  - Details table with all the steps associated with a procedure or template. This is required to convert each step into a specific task. A step must have a field Days to complete with the number of days to complete the step. This field will be used to calculate the deadline.
  - Tasks table that contains the actual tasks with an assignee and deadline.

How it works

**Trigger task creation (webhook)**
The automation starts when the webhook is triggered through a POST request.
It should contain an assignee, template, date and note in the body of the request. It will send a success or failure response once all steps are completed.

**Configure settings and ids**
Stores the ids of the involved Baserow database and tables, together with the API credentials and the data from the webhook.

**Get all template steps**
Gets all the steps from the template Details table that are associated with the id of the Master template table. For example: the master template can have a record about handling customer complaints, and the details table contains all the steps to handle this procedure.

**Calculate deadlines for each step**
Prepares the input of the tasks by using the same property names as the fields of the Tasks table. Adjust these names, and add or remove fields, if required for your database structure. The deadline of each step is calculated by adding the step's Days to complete (a field in the template Details table) to the schedule date. For example: if the schedule_date property in the webhook is set to 2025-10-01 and Days to complete for the step is 3, then the deadline will be 2025-10-04.

**Avoid scheduling during the weekend**
It might happen that the calculated deadline falls on a Saturday or Sunday. This Code node moves those dates to the first Monday to avoid scheduling during the weekend.

**Aggregate tasks for insert**
Aggregates the data from the previous nodes as an array in a property named items. This matches perfectly with the Baserow API for inserting new records in batch.

**Generate tasks in batch**
Calls the API endpoint /api/database/rows/table/{table_id}/batch/ to insert multiple records at once in the tasks table. Check the Baserow API documentation for further details.

**Success / Error response**
Sends a simple text response to indicate the success or failure of the record creation.
This is to offer feedback when triggering the automation from a Baserow application, but can be replaced with a JSON response.

How to use

Call the Trigger task creation node with the required parameters through a POST request. This can be done from any web application. For example: the application builder in Baserow supports an action to send an HTTP request. The Procedure details page in the Standard Operating Procedures template demonstrates this action.

The following information is required in the body of the request to create the actual tasks:

{
  "assignee_id": integer referring to the id of the assignee in the database,
  "template_id": integer referring to the id of the template or procedure in the master table,
  "schedule_date": the date the tasks need to start scheduling,
  "note": text with an optional note about the tasks
}

Set the corresponding ids in the Configure settings and ids node. Check the names of the properties in the Calculate deadlines for each step node, and make sure they match the field names of your Tasks table. You can replace the text message in the Success response and Failure response with a more structured format if this is necessary in your application.

Customising this workflow

- Add support for public holidays (e.g., using an external calendar API).
- Modify the task assignment logic (e.g., pre-assign tasks in the details table).
- Combine with notifications (email, Slack, etc.) to alert employees when new tasks are generated.
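The deadline calculation and weekend shift described above can be sketched as one function in the Code node. This is an illustrative sketch, assuming schedule_date arrives as an ISO date string (YYYY-MM-DD).

```javascript
// Deadline = schedule_date + "Days to complete"; Saturdays and Sundays
// are moved forward to the first Monday, as described above.
function calculateDeadline(scheduleDate, daysToComplete) {
  const d = new Date(scheduleDate + "T00:00:00Z");
  d.setUTCDate(d.getUTCDate() + daysToComplete);
  const day = d.getUTCDay(); // 0 = Sunday, 6 = Saturday
  if (day === 6) d.setUTCDate(d.getUTCDate() + 2);
  else if (day === 0) d.setUTCDate(d.getUTCDate() + 1);
  return d.toISOString().slice(0, 10); // YYYY-MM-DD
}
```

Note that the document's example (2025-10-01 + 3 days = 2025-10-04) lands on a Saturday, so the weekend rule would then push it to Monday 2025-10-06.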
by WeblineIndia
AI Resume Screening Workflow using Gmail, OpenAI (GPT-4 Turbo) & Slack

This workflow automatically fetches resumes from Gmail, extracts text from PDF attachments, analyzes candidate profiles using AI, and shortlists qualified candidates by sending their details to Slack. It significantly reduces manual effort in resume screening and speeds up hiring decisions.

Quick Implementation Steps

- Connect your Gmail account in n8n
- Connect your OpenAI API (GPT-4 Turbo)
- Connect your Slack account
- Ensure resumes are received as PDF attachments in Gmail
- Update job requirements inside the AI Resume Analyzer node
- Run the workflow manually or activate it

What It Does

This workflow automates the resume screening process by integrating Gmail, AI, and Slack into a seamless pipeline. It starts by fetching emails containing resume attachments, then processes and validates these files to ensure only PDF resumes are analyzed. Once validated, the workflow extracts text from each resume and sends it to an AI model. The AI evaluates candidate details such as skills, experience, and contact information, and calculates a match score based on predefined job requirements. Finally, the workflow determines whether a candidate should be shortlisted or rejected. Shortlisted candidates are instantly shared on Slack with complete details, enabling faster hiring decisions and reducing manual screening effort.

Who’s It For

- HR teams and recruiters
- Hiring managers
- Startups handling large volumes of applications
- Recruitment agencies
- Businesses looking to automate candidate screening

Requirements to Use This Workflow

- n8n (self-hosted or cloud)
- Gmail account (OAuth connected)
- OpenAI API key (GPT-4 Turbo access)
- Slack account (with channel access)
- Resumes in PDF format
- Basic understanding of n8n workflows

How It Works & Set Up

Setup Instructions

Manual Trigger Setup
Use the Start Workflow Manually node to initiate the workflow.

Gmail Configuration
Connect your Gmail account in the Fetch Gmail node.
Ensure emails contain resume attachments.

Attachment Processing
The workflow formats and checks attachments. Only emails with attachments proceed further.

PDF Validation
Ensure resumes are in PDF format. Non-PDF files are ignored automatically.

Text Extraction
The Extract Resume Text node converts PDF resumes into plain text.

AI Configuration
Add your OpenAI credentials in the OpenAI Chat Model node. Customize job requirements inside the AI Resume Analyzer node prompt.

Evaluation Logic
Candidates are scored based on:
- Skills (50%)
- Experience (30%)
- Location (20%)

Output Handling
If score ≥ 70 → sent to Slack as shortlisted. Otherwise → marked as rejected.

Activate Workflow
Run manually or activate for continuous use.

How To Customize Nodes

- **AI Resume Analyzer** - Modify job requirements (skills, experience, location). Adjust scoring logic or threshold.
- **Evaluate Candidate Score** - Change minimum score (default: 70).
- **Slack Node** - Customize message format. Change Slack channel.
- **Gmail Node** - Add filters (labels, subjects, senders).
- **Extract Resume Text** - Extend for other file formats if needed.

Add-ons (Enhancements)

- Store shortlisted candidates in Google Sheets or a database
- Send rejection emails automatically
- Add ATS (Applicant Tracking System) integration
- Include a resume ranking system
- Add duplicate candidate detection
- Enable multi-role job matching

Use Case Examples

- Automating resume screening for tech hiring
- Filtering candidates for specific skill sets (e.g., React Native developers)
- Handling bulk job applications efficiently
- Shortlisting candidates for remote roles
- Pre-screening candidates before interviews

There can be many more use cases depending on hiring workflows and business needs.
Troubleshooting Guide

| Issue | Possible Cause | Solution |
|---|---|---|
| No emails fetched | Gmail not connected or incorrect filters | Reconnect Gmail and check filters |
| Attachments not processed | Emails don’t contain attachments | Ensure resumes are attached |
| PDF not detected | Incorrect file type | Ensure resumes are in PDF format |
| AI response parsing error | Invalid JSON from AI | Check prompt and enforce strict JSON |
| No Slack message | Slack credentials or channel issue | Verify Slack connection and channel ID |
| Low-quality matches | Incorrect job requirements | Update AI prompt with accurate criteria |

Need Help?

If you need assistance setting up this workflow or customizing it for your business needs, we’re here to help. Whether you want to:

- Extend this workflow with advanced features
- Integrate it with your existing systems
- Build similar automation solutions

Feel free to reach out to our n8n workflow development experts at WeblineIndia for support and tailored workflow development. 👉 Let us help you automate smarter and scale faster.
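The weighted scoring described above (Skills 50%, Experience 30%, Location 20%, threshold 70) can be sketched as a small function for the Evaluate Candidate Score node. The assumption here is that the AI analyzer returns each sub-score on a 0 to 100 scale.

```javascript
// Sketch of the candidate evaluation: weighted sub-scores vs. a threshold.
const THRESHOLD = 70; // the "minimum score (default: 70)" mentioned above

function evaluateCandidate({ skills, experience, location }) {
  const score = skills * 0.5 + experience * 0.3 + location * 0.2;
  return { score, shortlisted: score >= THRESHOLD };
}
```

Shortlisted candidates go to the Slack branch; the rest are marked rejected.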
by Yassin Zehar
Description

Automatically triage Product UAT feedback with AI, deduplicate it against your existing Notion backlog, create or update the right Notion item, and close the loop with the tester (Slack or email). This workflow standardizes incoming UAT feedback, runs AI classification (type, severity, summary, suggested title, confidence), searches Notion to prevent duplicates, and upserts the roadmap entry for product review. It then confirms receipt to the tester and returns a structured webhook response.

Context

Feature requests often arrive unstructured and get lost across channels. Product teams waste time re-triaging the same ideas, creating duplicates, and manually confirming receipt. This workflow ensures:

- Faster feature request triage
- Fewer duplicates in your roadmap/backlog
- Consistent structure for every feedback item
- Automatic tester acknowledgement
- Full traceability via webhook response

Who is this for?

- Product Managers running UAT or beta programs
- Product Ops teams managing a roadmap backlog
- Teams collecting feature requests via forms, Slack, or internal tools
- Anyone who wants AI speed with clean backlog hygiene

Requirements

- Webhook trigger (form / Slack / internal tool)
- OpenAI account (AI triage)
- Notion account (roadmap/backlog database)
- Slack and/or Gmail (tester notification)

How it works

- Trigger: feedback received via webhook
- Normalize & Clean: standardizes fields and cleans the message
- AI Triage: returns structured JSON (type, severity, title, confidence…)
- Notion Dedupe & Upsert: search by suggested title → update if found, else create
- Closed Loop: notify the tester (Slack or email) + webhook response payload

What you get

- One workflow to capture and structure feature requests
- A clean Notion backlog without duplicates
- Automatic tester confirmation
- Structured output for downstream automation

About me

I’m Yassin, a Product Manager scaling tech products with a data-driven mindset. 📬 Feel free to connect with me on LinkedIn.
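The Normalize & Clean step can be sketched as a function that coerces the AI triage JSON into a fixed shape with safe defaults. The field names and allowed values below are assumptions based on the description (type, severity, title, confidence), not the template's exact schema.

```javascript
// Hypothetical normalization sketch: clamps confidence to [0, 1] and
// falls back to defaults for unrecognized type/severity values.
const TYPES = ["bug", "feature", "question"];
const SEVERITIES = ["low", "medium", "high"];

function normalizeTriage(raw) {
  return {
    type: TYPES.includes(raw.type) ? raw.type : "feature",
    severity: SEVERITIES.includes(raw.severity) ? raw.severity : "medium",
    title: (raw.title || "Untitled feedback").trim(),
    confidence: Math.min(1, Math.max(0, Number(raw.confidence) || 0)),
  };
}
```

Guarding the AI output like this keeps the Notion dedupe search and upsert from receiving malformed fields.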
by Kevin Meneses
What this workflow does

This workflow automatically generates a daily stock market email digest, combining price movements and recent financial news into a clean, actionable report. Instead of manually checking charts and news every morning, this workflow does it for you.

It combines:

- Market data from EODHD APIs
- Financial news with sentiment analysis
- Smart processing using JavaScript (no raw data overload)
- ✉️ Automated email delivery via Gmail

Only relevant insights reach your inbox.

How it works (overview)

- The workflow runs on a schedule (daily at 7 AM) or manually
- It reads your stock watchlist from Google Sheets
- It fetches:
  - Historical price data (last 14 days)
  - Latest news (last 7 days)
- JavaScript processes the data:
  - Calculates daily % change
  - Filters recent news
  - Assigns sentiment (positive / neutral / negative)
- A clean HTML email is generated
- The digest is sent automatically via Gmail

👉 The result: a complete market snapshot in seconds

How to configure it

EODHD APIs (Market Data)

- Add your API key in the ⚙️ Config node
- Used for: price data, financial news, sentiment signals

Google Sheets

- Add your credentials
- Create a column named ticker
- Add one stock per row (e.g. AAPL, TSLA, MSFT)

Gmail

- Add your Gmail credentials
- Set your recipient email

Schedule

- Default: runs daily at 7 AM
- You can adjust it easily in the Schedule node

Why this workflow is powerful

Most people:

- Check multiple sites every morning
- Waste time jumping between charts and news
- Miss important market moves

This workflow:

✔ Automates everything
✔ Centralizes data
✔ Adds context with sentiment analysis
✔ Saves hours every week

📊 Data powered by EODHD APIs

This workflow uses EODHD APIs to retrieve:

- Historical stock prices
- Financial news
- Sentiment indicators

👉 If you want to build your own financial automations, dashboards, or trading workflows, get started here (10% discount): https://eodhd.com/pricing-special-10?via=kmg&ref1=Meneses

Final result

You get a daily email like this:

- Top movers (price changes)
- Curated news per stock
- Sentiment insights
- Clean, readable format

All generated automatically.
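The daily % change step can be sketched as a one-liner over the price history. This sketch assumes the 14-day history has already been reduced to an oldest-first array of closing prices; the EODHD response itself is richer (open, high, low, close per day).

```javascript
// Daily % change = (latest close - previous close) / previous close * 100.
function dailyChange(closes) {
  const last = closes[closes.length - 1];
  const prev = closes[closes.length - 2];
  return ((last - prev) / prev) * 100;
}
```

The top movers section of the digest would then sort the watchlist by the absolute value of this number.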
by Itunu
Automatically detect unsubscribe replies in your outreach emails and clean your Google Sheets contact list, keeping your domain reputation and deliverability strong.

🎯 Who it’s for

This template is designed for freelancers, lead generation specialists, and outreach managers, particularly those running email outreach campaigns for clients or personal lead-gen projects. If you send cold emails, manage multiple lead lists, or handle outreach at scale, this workflow helps you automatically manage unsubscribe requests to maintain healthy email deliverability and protect your domain reputation.

⚙️ How it works

- Trigger: The workflow starts when a new reply is received in your Gmail inbox.
- Intent Detection: The email text is analyzed using OpenAI to detect unsubscribe intent (“unsubscribe”, “remove me”, “opt out”, etc.).
- Normalization & Filtering: A Code node verifies the AI output for accuracy and ensures the result is standardized as either "unsubscribe" or "keep".
- Check & Update Sheets: If the contact requested removal, the workflow checks your Unsubscribe Sheet to see if they’re already listed. If not, the contact is added to the Unsubscribe Sheet and simultaneously removed from your Main Outreach Sheet.
- Optional Gmail Label: Adds an “Unsubscribe” tag in Gmail for quick visual tracking (optional customization).

🧩 Requirements

To run this workflow, you’ll need:

- **Gmail Credentials** → for reading incoming replies and applying labels.
- **Google Sheets Credentials** → to manage both the “Main” and “Unsubscribe” spreadsheets.
- **OpenAI API Key** → used for detecting unsubscribe intent from message text.

All credentials can be added through the n8n Credentials Manager.

🧠 How to Customize

- **Polling Time:** Adjust how often the Gmail Trigger checks for new replies (default: every 5 minutes).
- **Sheets:** Replace the linked Google Sheets IDs with your own. You can change sheet names and columns freely.
- **Intent Rules:** Modify the Code node to include additional unsubscribe phrases or alternate keywords.
- **Optional Gmail Tagging:** Enable or disable tagging for unsubscribed messages.
- **Secondary Validation:** Enable the second “If” check after the OpenAI node to double-confirm unsubscribe intent before moving contacts.

💡 Why this workflow matters

By automatically managing unsubscribe requests, you:

- Respect recipients’ opt-out preferences
- Reduce spam complaints
- Protect your domain reputation and increase deliverability
- Save hours of manual list cleaning

This is a must-have automation for anyone running cold email outreach, especially freelancers managing multiple client inboxes.

🪄 Quick Setup Tips

- Replace all "Gmail account" and "Google Service Account account" credential references with your actual credentials.
- Ensure your sheet has an EMAIL column for lookup.
- Test with a few mock replies before activating for production.
- Optional: Add a time-based trigger to run the sheet cleanup periodically.
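The Normalization & Filtering Code node described above can be sketched like this: accept the AI's answer when it is already one of the two allowed values, otherwise fall back to keyword matching. The phrase list mirrors the examples in the description and is exactly what the "Intent Rules" customization would extend.

```javascript
// Standardizes the result to exactly "unsubscribe" or "keep".
const PHRASES = ["unsubscribe", "remove me", "opt out", "opt-out"];

function normalizeIntent(aiOutput, emailText) {
  const ai = String(aiOutput || "").trim().toLowerCase();
  if (ai === "unsubscribe" || ai === "keep") return ai; // AI output is valid
  // Fallback: simple keyword scan of the reply text.
  const text = (emailText || "").toLowerCase();
  return PHRASES.some((p) => text.includes(p)) ? "unsubscribe" : "keep";
}
```

Only "unsubscribe" results proceed to the sheet update branch; everything else is kept.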