by RealSimple Solutions
# Analyze and Diagnose n8n Workflow Errors Automatically via OpenAI and Email

> This template is available on Cloud & self-hosted n8n instances with the OpenAI node enabled.

## Who is this for?

This workflow is designed for n8n developers, automation engineers, and DevOps teams who want to automatically capture and analyze workflow errors, and receive professional HTML-styled diagnostics directly in their inbox.

## What problem does this solve?

Manually troubleshooting failed workflows in n8n is time-consuming. This template streamlines error detection by:

- Capturing workflow failures using the Error Trigger node
- Diagnosing root causes with the help of OpenAI
- Sending a fully formatted, human-readable HTML error report via email
- Including practical resolutions and next-step suggestions

This helps you or your team resolve issues faster and avoid repeated manual debugging.

## What this workflow does

- Triggers on any n8n workflow error
- Extracts relevant error metadata, including the failed node, execution ID, and timestamps
- Sends the error content to OpenAI for analysis and recommendations (see the sketch at the end of this section)
- Generates an HTML email report with inline styles and clear formatting
- Emails the result to a system administrator or support address

## Setup

1. Install the OpenAI node in your self-hosted n8n instance.
2. Add your OpenAI API key securely in credentials.
3. Configure the SMTP Email node with your email credentials.
4. Adjust the Error Trigger to monitor specific workflows or all workflows.
5. Set your preferred admin or dev email address in the final node.

## How to customize this workflow to your needs

- Use a Set node to define your variables, such as the default admin email and an optional workflow filter
- Customize the prompt sent to OpenAI if you want deeper or more specific analysis
- Modify the email HTML styles to match your brand or internal format
- Add additional logging (e.g., to Airtable, Google Sheets, or Notion) for long-term error tracking

## Sticky Note

**Title:** Automated Error Reporter with AI-Powered Diagnosis
**Description:** Captures any n8n error, sends it to OpenAI, and emails a polished HTML report to the administrator with steps to resolve the issue. Requires OpenAI credentials and SMTP configured.
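The analysis step boils down to flattening the Error Trigger payload into a prompt. A minimal sketch, assuming n8n's standard error-workflow payload fields (`execution`, `workflow`); this is not the template's exact code, so verify the field names against your own trigger output:

```javascript
// Code node: turn the Error Trigger payload into a prompt for the OpenAI node.
const { execution = {}, workflow = {} } = $json;

const prompt = [
  `Workflow "${workflow.name}" (ID ${workflow.id}) failed.`,
  `Failed node: ${execution.lastNodeExecuted}`,
  `Execution ID: ${execution.id}`,
  `Error message: ${execution.error?.message ?? 'unknown'}`,
  'Diagnose the likely root cause and suggest concrete next steps.',
].join('\n');

return [{ json: { prompt } }];
```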
by Amjid Ali
# Automate Digital Delivery After PayPal Purchase Using n8n

A complete step-by-step guide to seamless template delivery. Built by Amjid Ali, SyncBricks.

Deliver personalized files instantly after PayPal transactions using n8n, without writing a single backend line.

## What This n8n Workflow Does

This automation template automatically delivers a digital product (such as an n8n template or JSON file) to customers who pay via PayPal, within seconds. You can:

- Automatically extract customer info
- Identify what was purchased
- Send a clean, branded email with the product file
- Promote your other courses, books, and tools

## Use Case Example

Product: AI-Powered Social Media Content Generator & Publisher

When a customer buys this product through PayPal, this automation:

1. Listens for a successful payment event
2. Fetches order details via API
3. Sends an HTML email with the template attached
4. Promotes your other offerings with embedded links

## Prerequisites

You'll need:

- An n8n instance (self-hosted or n8n Cloud)
- A PayPal developer account
- PayPal OAuth2 credentials configured in n8n
- Your product hosted as a downloadable .json file (Oracle, Dropbox, GitHub, etc.)
- SMTP email credentials in n8n

## Step-by-Step Setup

1. **Webhook** (Webhook node): Listens for a POST request from PayPal's webhook for PAYMENT.CAPTURE.COMPLETED events. Add the webhook to your PayPal Developer App > Webhooks.
2. **Wait** (Wait node): Adds a brief delay to ensure the payment is completely processed before continuing.
3. **Filter Event Type** (Switch node): Processes only when the event is PAYMENT.CAPTURE.COMPLETED.
4. **Fetch Order Details** (HTTP Request node): Retrieves the order information from PayPal's Orders API. URL format: `https://api.paypal.com/v2/checkout/orders/{{ order_id }}`
5. **Extract Email & Product Info** (Set node): Extracts first name, last name, email address, and the purchased item name.
6. **Identify Product Purchased** (Switch node): Checks if the product is "AI-Powered Social Media Content Generator & Publisher".
7. **Download Workflow File** (HTTP Request node): Fetches the hosted workflow JSON from object storage (Oracle in this case).
8. **Convert to Downloadable File** (Code node): Converts the JSON content into a binary file and attaches it (sketched below).
9. **Send Custom Email** (Send Email node): Sends a rich HTML email to the buyer with their name, the file attachment, the product name, and helpful resource links:
   - Mastering n8n Course on Udemy
   - Step-by-Step Guide (n8n Book)
   - n8n Video Tutorials (Free Course)
   - Sign up for n8n Cloud (use code AMJID10)
   - YouTube Video Walkthrough

## Additional Learning Resources

Explore more and master n8n with these resources:

- Mastering n8n (Full Udemy Course)
- Get Your Step-by-Step Guide (n8n Book)
- Get Step-by-Step Tutorials (Video Course)
- Sign up for n8n Cloud
- Templates, Tools, and More
- YouTube Channel: SyncBricks

## Need Help or Customization?

Reach out!
Email: amjid@amjidali.com
LinkedIn: linkedin.com/in/amjidali
Website: syncbricks.com
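Step 8 is the only part of this flow that needs actual code. A hedged sketch of such a Code node, assuming the previous HTTP Request node returned the template JSON in `$json` and using n8n's Code-node binary helper; the file name is a hypothetical placeholder:

```javascript
// Code node: turn the fetched workflow JSON into a binary email attachment.
const fileContent = JSON.stringify($json, null, 2);

return [{
  json: {},
  binary: {
    data: await this.helpers.prepareBinaryData(
      Buffer.from(fileContent, 'utf8'),
      'ai-social-media-generator.json', // hypothetical file name
      'application/json'
    ),
  },
}];
```

The Send Email node can then reference the `data` binary property as its attachment.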
by Luciano Gutierrez
# Instagram Auto-Comment Responder with AI Agent Integration

Version: 1.1.0 | n8n Version: 1.88.0+ | License: MIT

A fully automated workflow for managing and responding to Instagram comments using AI agents. Designed to improve engagement and save time, this system listens for new Instagram comments, verifies and filters them, fetches relevant post data, processes valid messages with a natural-language AI, and posts context-aware replies directly on the original post.

## Key Features

- **AI-Driven Engagement:** Intelligent responses to comments via a GPT-powered agent.
- **Webhook Verification:** Handles the Instagram webhook handshake to ensure secure integration.
- **Data Extraction:** Maps incoming payload fields (user ID, username, message text, media ID) for processing.
- **Self-Comment Filtering:** Automatically skips comments made by the account owner to prevent loops.
- **Post Data Retrieval:** Fetches the media's id and caption from the Graph API (v22.0) before generating a reply.
- **Natural Language Processing:** Uses a custom system prompt to maintain brand tone and context.
- **Automated Replies:** Posts the AI-generated message back to the comment thread using Instagram's API.
- **Modular Architecture:** Clear separation of steps via sticky notes and dedicated HTTP Request and Agent nodes.

## Use Cases

- **Social Media Automation:** Keep followers engaged 24/7 with instant, relevant replies.
- **Community Building:** Maintain a consistent voice and tone across all interactions.
- **Brand Reputation Management:** Ensure no valid comment goes unanswered.
- **AI Customer Support:** Triage simple questions and direct followers to resources or support.

## Technical Implementation

1. **Webhook Verification** (Webhook + Respond to Webhook): Echoes hub.challenge to confirm the subscription and secure incoming events (sketched below).
2. **Data Extraction** (Set): Maps payload fields into structured variables: conta.id, usuario.id, usuario.name, usuario.message.id, usuario.message.text, usuario.media.id, endpoint.
3. **User Validation** (Filter): Skips processing if conta.id equals usuario.id (self-comments).
4. **Post Data Retrieval** (HTTP Request "Get post data"): `GET https://graph.instagram.com/v22.0/{{ $json.usuario.media.id }}?fields=id,caption&access_token={{ credentials }}` captures the media's caption for richer context in replies.
5. **AI Response Generation** (AI Agent + OpenRouter Chat Model): Uses a detailed system prompt with a profile persona (expert in AI & automations, friendly tone), input data (username, comment text, post caption), and filtering logic (spam, praise, questions, vague comments). Returns either the reply text or [IGNORE] for irrelevant content.
6. **Posting the Reply** (HTTP Request "Post comment"): `POST {{ $json.endpoint }}/{{ $json.usuario.message.id }}/replies` with `message={{ $json.output }}` sends the AI answer back under the original comment.

## Instructions for Setup

1. **Import Workflow:** In n8n > Workflows > Import from File, upload the provided .json template.
2. **Configure Credentials:** Instagram Graph API (Header Auth or FacebookGraphApi) with the instagram_basic and instagram_manage_comments scopes, plus an OpenRouter/OpenAI API key for the AI agent.
3. **Customize System Prompt:** Edit the AI Agent's prompt to adjust brand tone, language (Brazilian Portuguese), length, or emoji usage.
4. **Test & Activate:** Publish a test comment on an Instagram post, then verify each node's execution, ensuring the webhook, filter, data extraction, HTTP requests, and AI Agent respond as expected.
5. **Extend & Monitor:** Add sentiment analysis or lead capture nodes as needed. Monitor execution logs for errors or rate-limit events.

Tags: Social Media, Instagram Automation, Webhook Verification, AI Agent, HTTP Request, Auto Reply, Community Management
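For reference, the handshake in step 1 follows Meta's standard webhook verification: the platform sends a GET request carrying `hub.mode`, `hub.verify_token`, and `hub.challenge`, and expects the challenge echoed back. In n8n this is typically a single expression on the Respond to Webhook node; the equivalent logic as a sketch (the verify token is a placeholder, not a value from this template):

```javascript
// Verification logic: echo hub.challenge only for a matching verify token.
const query = $json.query ?? {};

if (query['hub.mode'] === 'subscribe' && query['hub.verify_token'] === 'MY_VERIFY_TOKEN') {
  // MY_VERIFY_TOKEN is a placeholder; use the token configured in your Meta app
  return [{ json: { challenge: query['hub.challenge'] } }];
}
return [{ json: { challenge: null } }]; // reject unknown callers
```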
by Sean Lon
## Target Audience

This template is a perfect fit if you are in an internal talent acquisition team, a recruitment agency, an HR role, or a hiring manager seeking to bulk-automate the initial screening of CVs and resumes, e.g., automatically receiving each candidate's shortlisted/rejected result together with its rationale and score. By eliminating manual evaluation and screening, you get a smart AI agent providing a standardized, efficient, and scalable solution for handling large volumes of applications. With bulk automation, you can focus on strategic decision-making rather than tedious screening tasks, ensuring a faster, more accurate, and fairer hiring process.

## Key focus

This workflow focuses on organized file-folder management, trackable candidate CVs, a maintainable job description, and an autonomous AI agent:

- **Organized Folder-File Structure:** CVs are automatically categorized based on their status, ensuring a structured workflow and easy retrieval.
- **Candidate Tracker:** A real-time tracking system records the state of each CV, allowing recruiters to monitor shortlisted, rejected, or KIV (Keep in View) candidates.
- **AI Agent for Decision Automation:** The AI autonomously orchestrates screening decisions, replacing manual LLM configurations with dynamic AI-driven evaluations for scalability and accuracy.
- **Maintainable Job Description Management:** A structured job description file ensures continuous updates, keeping hiring criteria flexible and aligned with recruitment needs.
- **Email Notifications:** The system automatically sends receipt confirmations upon processing completion, providing timely updates to recruiters.

## Features - Workflow

This workflow leverages Groq Llama 4 for intelligent resume analysis, speeding up the screening process by generating a matching score, a result (shortlisted/rejected/KIV), and key insights into each candidate's suitability for the provided job description.

Step-by-step process:

1. **Monitor Google Drive:** Listens and checks for new CV resumes in Google Drive.
2. **Retrieve Resume:** Downloads the CV resumes from Google Drive.
3. **Extract Resume Data:** Extracts text content from CV resume PDF files.
4. **Extract Job Description Data:** Extracts text content from the job description.
5. **Analyze with Groq:** Generates a matching score based on job requirements [SCORE: 1-10], provides a decision on job suitability [SHORTLISTED/REJECTED/KIV], and provides actionable insights into that suitability [REASON]. A sketch of parsing this output appears below.

This ensures a fast, efficient, and accurate screening process, eliminating manual evaluation.

## Setup Guide

1. Ensure all credentials are ready and set up (Groq, Google Drive, Gmail, Google Sheets, Google Docs). See the official n8n documentation on node setup, as well as the setup notes.
2. Create a Google Drive folder (view directory example).
3. Create a job description file (view file example).
4. Configure a tracker with the columns Candidate Name, AI Score, AI Verdict, AI Reason (view file example).
5. Configure email conversation reports as you like.

You are ready to go!
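Since the Groq analysis comes back as free text in the [SCORE]/[SHORTLISTED/REJECTED/KIV]/[REASON] shape described above, a small parsing step is needed before writing to the tracker sheet. A minimal sketch, assuming the model is prompted to answer with labeled lines (the exact labels are an assumption, not the template's prompt):

```javascript
// Code node: parse the LLM's screening answer into structured fields.
// Assumed answer shape: "SCORE: 7\nRESULT: SHORTLISTED\nREASON: ..."
const text = $json.output ?? '';

const score = Number((text.match(/SCORE:\s*(\d+)/i) ?? [])[1] ?? 0);
const verdict = ((text.match(/RESULT:\s*(SHORTLISTED|REJECTED|KIV)/i) ?? [])[1] ?? 'KIV').toUpperCase();
const reason = ((text.match(/REASON:\s*([\s\S]*)/i) ?? [])[1] ?? '').trim();

return [{ json: { score, verdict, reason } }];
```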
by Mark de Jonge
## About the workflow

The workflow reads every reply received from a cold email campaign and qualifies whether the lead is interested in a meeting. If the lead is interested, a deal is created in Pipedrive. You can add as many email inboxes as you need! A sketch of the qualification step follows below.

## Setup

1. Add credentials to the Gmail, OpenAI, and Pipedrive nodes.
2. Add an in_campaign field in Pipedrive for persons: in Pipedrive, click on your credentials at the top right, go to Company settings > Data fields > Person, and click Add custom field. Make it a single option [TRUE/FALSE].
3. If you have only one email inbox, you can delete one of the Gmail nodes. If you have more than two email inboxes, you can duplicate a Gmail node as many times as you like; just connect it to the Get email node, and you are good to go!
4. In the Gmail inbox nodes, select Inbox under Label Names and uncheck Simplify.
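A rough sketch of how the qualification step could be framed; the prompt wording and field names here are illustrative assumptions, not the template's exact code:

```javascript
// Code node: build a yes/no classification prompt from the reply text.
const reply = $json.text ?? ''; // assumed field holding the email body

const prompt = `A lead replied to a cold email:\n"""${reply}"""\n` +
  'Answer with exactly INTERESTED or NOT_INTERESTED: is the lead open to a meeting?';

// Pass `prompt` to the OpenAI node; downstream, an IF node can branch on the
// model's answer before the Pipedrive "create deal" node runs.
return [{ json: { prompt } }];
```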
by Ranjan Dailata
## Who this is for?

The Automate Etsy Data Mining with Bright Data Scrape & Google Gemini workflow is designed for eCommerce analysts, product researchers, and AI developers seeking to extract actionable insights from Etsy listings at scale. It is ideal for:

- **eCommerce Entrepreneurs:** Researching product demand and competition.
- **Market Analysts:** Tracking pricing, reviews, and trends across Etsy categories.
- **Product Managers:** Identifying niche opportunities and design inspirations.
- **Data Scientists & AI Engineers:** Automating product intelligence pipelines.
- **Growth Hackers:** Leveraging Etsy insights to refine product-market fit.

## What problem is this workflow solving?

Manually browsing Etsy to analyze product listings, pricing, reviews, and seller activity is slow, inconsistent, and unscalable. Scraping Etsy requires unlocking JavaScript-heavy content and structuring noisy data for analysis. This workflow provides:

- Automated and scalable scraping of Etsy product listings using Bright Data's infrastructure.
- Structured, fully paginated extraction of Etsy product data via the Google Gemini LLM.
- Faster decision-making for product research and competitive analysis through fully automated, paginated data extraction.

## What this workflow does

1. Receives input: sets the Etsy URL for data extraction and analysis.
2. Uses Bright Data's Web Unlocker to extract content from the relevant pages.
3. Cleans and preprocesses the scraped content for readability (see the sketch after this section).
4. Sends the content to Google Gemini for analysis and enrichment.
5. Persists the enriched results to disk.
6. Sends the response to a target system via webhook notification.

## Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure a Header Auth account under Credentials (Generic Auth Type: Header Authentication). Set the Value field to `Bearer XXXXXXXXXXXXXX`, replacing XXXXXXXXXXXXXX with your Web Unlocker token.
4. Add a Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Set Etsy Search Query node with the brand content URL and the Bright Data zone name.
6. Update the Webhook HTTP Request node with the webhook endpoint of your choice.

## How to customize this workflow to your needs

- **Input Sources:** Replace the static URL with dynamic input from Google Sheets, a Webhook, or Airtable to research multiple niches.
- **Prompt Customization:** Adjust the Gemini prompts to extract specific insights, for example listing key features of the product or summarizing review themes.
- **Data Output Options:** Update the webhook notification to save data to Google Sheets, Notion or Airtable, SQL/NoSQL stores, or Slack/email.
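The cleanup in step 3 is typically a small Code node. A sketch under the assumption that the Web Unlocker response lands in `$json.data` as raw HTML (the field name is an assumption):

```javascript
// Code node: strip tags and collapse whitespace so Gemini sees readable listing text.
const html = $json.data ?? '';

const text = html
  .replace(/<script[\s\S]*?<\/script>/gi, ' ') // drop inline scripts
  .replace(/<style[\s\S]*?<\/style>/gi, ' ')   // drop inline styles
  .replace(/<[^>]+>/g, ' ')                    // drop remaining tags
  .replace(/\s+/g, ' ')                        // collapse whitespace
  .trim();

return [{ json: { text } }];
```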
by Davi Saranszky Mesquita
# Log errors and avoid sending too many emails

## Use case

Most of the time, it's necessary to log all errors that occur. However, in some cases, a scheduled task or a service consuming excessive resources might trigger a surge of errors. To address this, we can log all errors but limit alerts to a maximum of one notification every 5 minutes (a sketch of this gate appears at the end).

## What this workflow does

This workflow can be configured to receive error events, or you can integrate it before your own error-handling logic. If used as the primary error handler, note that this flow will only add a database log entry and take no further action; you'll need to add your own alerts (e.g., email or push notifications). Below is an example of a notification setup I prefer to use. At the end, there's an error cleanup option. This feature is particularly useful in development environments.

If you already have an error-handling workflow, you can call this one as a sub-workflow. Its final steps include cleanup logic to reset the execution state and terminate the workflow.

## Setup

Verify all Postgres nodes and credentials when using the "Error Handling Sample".

## How to adjust it to your needs

1. You can set this workflow as a sub-workflow within your existing error-handling setup.
2. Alternatively, you can add the "Error Handling Sample" at the end of this workflow, which sends email and push notifications.

## Configuration Requirements

**Warning:** You must create a database table for this to work! DDL for this sample:

```sql
create table p1gq6ljdsam3x1m."N8Err"
(
    id         serial primary key,
    created_at timestamp,
    updated_at timestamp,
    created_by varchar,
    updated_by varchar,
    nc_order   numeric,
    title      text,
    "URL"      text,
    "Stack"    text,
    json       json,
    "Message"  text,
    "LastNode" text
);

alter table p1gq6ljdsam3x1m."N8Err" owner to postgres;

create index "N8Err_order_idx" on p1gq6ljdsam3x1m."N8Err" (nc_order);
```

by Davi Saranszky Mesquita
https://www.linkedin.com/in/mesquitadavi/
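The "at most one alert every 5 minutes" rule can be enforced by checking the `N8Err` table for the most recent alert before notifying. A sketch of such a gate in a Code node, assuming a preceding Postgres node ran a query like `SELECT max(created_at) AS last_alert FROM "N8Err"` and that the alert path branches on `shouldNotify` (all names here are illustrative):

```javascript
// Code node: decide whether enough time has passed since the last alert.
const lastAlert = $json.last_alert ? new Date($json.last_alert) : null;
const fiveMinutesMs = 5 * 60 * 1000;

// Notify if no alert has ever been sent, or the last one is older than 5 minutes.
const shouldNotify = !lastAlert || (Date.now() - lastAlert.getTime()) > fiveMinutesMs;

return [{ json: { shouldNotify } }];
```

An IF node downstream can then route to the email/push nodes only when `shouldNotify` is true, while the database insert always runs.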
by Krishna Kumar Eswaran
## Problem This Solves

For developers and creators, consistently posting quality content on LinkedIn can be time-consuming. This workflow automates the process by:

- Fetching the latest Dev.to articles
- Posting them to LinkedIn twice daily
- Preventing duplicates using Airtable
- Sending success alerts to Telegram

This ensures you're always active on LinkedIn, with zero manual effort.

## Who This Template Is For

- Developers who want to build their presence on LinkedIn
- Tech creators or solo founders looking to grow an audience
- Community/page managers who want regular, curated content
- Busy professionals aiming for consistent LinkedIn engagement without doing it manually

## Workflow Breakdown

This automation runs twice a day (9:00 AM and 7:00 PM) and performs the following steps:

1. Fetches Dev.to articles based on a tag
2. Checks Airtable to avoid reposting the same article
3. Posts to LinkedIn if it's new
4. Sends a Telegram message after posting successfully

## Step-by-Step Setup Instructions

### 1. Airtable Configuration

Create a new base in Airtable with just one table and one column:

- Table name: PostedArticles
- Column: ArticleID (single line text; stores the unique ID of each Dev.to article posted)

This column is used to track posted articles and prevent duplicates.

### 2. Dev.to API Setup

Use the following endpoint in the HTTP Request node:

https://dev.to/api/articles?tag=YOUR_TAG_HERE&per_page=10

Replace YOUR_TAG_HERE with a tag like android, webdev, ai, etc.

### 3. Telegram Bot Setup

1. Open @BotFather in Telegram and create a new bot
2. Save the bot token
3. Get your chat ID using @userinfobot or via the Telegram API
4. Add a Telegram node in n8n using this token and chat ID

This will notify you when a post is successfully published.

### 4. LinkedIn Setup

1. Create a LinkedIn Developer App
2. Use OAuth2 to connect it in n8n
3. Choose to post on either a user profile or a company page

### 5. n8n Workflow Structure

Here's the basic structure of the workflow (a sketch of the duplicate filter follows at the end):

1. Cron node: triggers at 9:00 AM and 7:00 PM daily
2. HTTP Request: fetches the latest articles from Dev.to
3. Airtable Search: checks if the ArticleID already exists
4. IF node: filters new vs. already-posted articles
5. LinkedIn Post: publishes the new article
6. Airtable Create: saves the new ArticleID
7. Telegram Message: sends success confirmation

## Customization Tips

- Change the Dev.to tag in the API URL
- Modify the LinkedIn post format (add hashtags, emojis, personal notes)
- Adjust posting times in the Cron node
- Use additional filters (e.g., only post articles with a cover image or a certain word count)
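The IF-node logic in step 4 of the workflow structure can be pictured as follows. A sketch assuming the Airtable results were merged in as a `postedIds` array and the Dev.to response as `articles`; both field names and the post format are assumptions, not the template's exact code:

```javascript
// Code node: keep only Dev.to articles whose id is not yet recorded in Airtable.
const postedIds = $json.postedIds ?? [];
const articles = $json.articles ?? [];

const fresh = articles.filter(a => !postedIds.includes(String(a.id)));

return fresh.map(a => ({
  json: {
    id: a.id,
    // illustrative LinkedIn post body built from Dev.to's public article fields
    message: `${a.title}\n\n${a.description ?? ''}\n\nRead more: ${a.url}`,
  },
}));
```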
by Krishna Kumar Eswaran
## Problem This Solves

Manually sharing Medium articles to LinkedIn every day is repetitive and time-consuming. This automation:

- Fetches the latest Medium articles based on a tag (e.g., android)
- Posts them on LinkedIn twice daily
- Uses Airtable to prevent duplicates
- Sends a confirmation to Telegram once posted

Stay consistently active on LinkedIn without lifting a finger.

## Who This Template Is For

- Developers who write or follow Medium content
- Tech creators or founders looking to grow an audience
- Community or page managers needing regular curated posts
- Busy professionals who want hands-free LinkedIn engagement

## Workflow Breakdown

This automation runs at 9:00 AM and 7:00 PM daily and performs these steps:

1. Fetch articles from MediumAPI.com by tag
2. Check Airtable to prevent reposting the same article
3. Post on LinkedIn if it's new
4. Store the article ID in Airtable
5. Send a Telegram message after successful posting

## Step-by-Step Setup Instructions

### 1. Airtable Configuration

Create a base with:

- Table name: PostedArticles
- Column: ArticleID (single line text, used to track posted articles)

### 2. MediumAPI Setup

1. Go to https://mediumapi.com
2. Sign up and generate your API key from the dashboard
3. Use this API endpoint in an HTTP Request node:

   GET https://mediumapi.com/api/tag/YOUR_TAG/latest
   Headers: Authorization: Bearer YOUR_API_KEY

Replace YOUR_TAG with a topic like android, ai, webdev, etc.

### 3. Telegram Bot Setup

1. Go to @BotFather and create a new bot
2. Save the bot token
3. Use @userinfobot to get your Telegram chat ID
4. Add a Telegram node in n8n with the token + chat ID

### 4. LinkedIn Setup

1. Create a LinkedIn Developer App
2. Connect it via OAuth2 in n8n
3. Choose to post on your profile or company page

### 5. n8n Workflow Structure

| Node Type | Description |
| --- | --- |
| Cron | Triggers the flow twice a day |
| HTTP Request | Fetches articles from MediumAPI.com |
| Airtable Search | Checks if the article ID already exists |
| IF Node | Skips duplicates |
| LinkedIn Post | Publishes to your LinkedIn profile/page |
| Airtable Create | Stores the posted article ID |
| Telegram Node | Sends the success notification |

A sketch of the final bookkeeping step follows below.

## Customization Tips

- Change the tag in the API URL to match your niche
- Add hashtags or personal comments to the LinkedIn message
- Schedule different posting times in the Cron node
- Optionally filter Medium posts based on length or title keywords
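After a successful LinkedIn post, two writes remain: the Airtable record that prevents reposting and the Telegram confirmation. A sketch of a Code node preparing both payloads; the `{ id, title, url }` shape is an assumption about the MediumAPI response, not a documented contract:

```javascript
// Code node: prepare the Airtable record and the Telegram message text.
const article = $json; // assumed shape: { id, title, url }

return [{
  json: {
    airtableRecord: { ArticleID: String(article.id) }, // matches the sheet column
    telegramText: `Posted to LinkedIn: "${article.title}" ${article.url}`,
  },
}];
```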
by Yar Malik (Asfandyar)
## Intro

This template is for teams, individuals, or businesses who want to automatically send daily email reminders (e.g., updates, status alerts, follow-ups) using n8n + Gmail.

## How it works

1. A Cron Trigger fires every day at your specified time.
2. A Google Sheets node reads all rows from your sheet.
3. An If node filters rows matching your condition (e.g., Status = "Pending").
4. A Send a message (Gmail) node sends a customized email for each filtered row.

A sketch of the filter-and-template logic appears at the end of this guide.

## Required Google Sheet Structure

| Column Name | Type | Example | Notes |
|-------------|--------|--------------------------|------------------------------------|
| Email | string | user@example.com | Recipient email address |
| Status | string | Pending | Filter criterion |
| Subject | string | Daily Status Update | Email subject (supports variables) |
| Body | string | "Please update your task" | Email body (text or HTML) |

## Detailed Setup Steps

1. **Google Sheets:** Build your sheet with the columns above. In n8n > Credentials, add the Google Sheets API (avoid sensitive names).
2. **Gmail:** In n8n > Credentials > Gmail (OAuth2 or SMTP), connect your account. Do not include your real email in the credential name.
3. **Import & Configure:** Export the workflow JSON (three-dot menu > Export) and paste it under Template Code in the Creator form. In each node, select your Google Sheets and Gmail credentials.
4. **Sticky Notes:** On the If node: "Defines which rows to email." On the Gmail node: "Sends the email."

## Customization Guidance

- **Adjust schedule:** Change the Cron expression in the Cron Trigger.
- **Modify filter:** Edit the condition in the If node.
- **Customize email:** Use expressions like `{{$node["Get row(s) in sheet"].json["Subject"]}}`.

## Troubleshooting

- Verify the Google Sheet is shared with the connected service account.
- Check your Cron timezone and expression.
- Ensure Gmail credentials are valid and not rate-limited.

## Security & Best Practices

- **Remove** any real email addresses and sheet IDs.
- **Use** n8n Credentials or environment variables; never hard-code secrets.
- **Add** sticky notes for any complex logic.
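The If-plus-Gmail pair can be condensed into code for clarity. A minimal sketch of the filtering and templating, assuming each item carries the sheet columns exactly as named in the table above; this mirrors, rather than replaces, the template's If node:

```javascript
// Code node: keep only rows with Status = "Pending" and map them to email fields.
const rows = $input.all().map(item => item.json);

const pending = rows.filter(r => r.Status === 'Pending');

return pending.map(r => ({
  json: {
    to: r.Email,
    subject: r.Subject, // reference in the Gmail node as {{ $json.subject }}
    body: r.Body,       // plain text or HTML
  },
}));
```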
by Naveen Choudhary
## Who is this template for?

Growth teams, SDRs, recruiters, or anyone who routinely hunts for hard-to-find business emails and would rather spend time reaching out than guessing formats.

## What problem does this workflow solve?

Manually piecing together email patterns, cross-checking them in a verifier, and updating a tracking sheet is slow and error-prone. This template automates the entire loop (research, guess, verify, and log) so you hit Start and watch rows fill up with ready-to-send addresses.

## What this workflow does

1. **Pull fresh leads:** Grabs only the rows in your Google Sheet where Status = FALSE.
2. **Find the company pattern:** Queries Serper.dev for snippets and feeds them to Gemini Flash (via OpenRouter) to spot the dominant email format.
3. **Build the address:** Constructs a likely email for every first/last name (see the sketch at the end).
4. **Verify in real time:** Pings Prospeo by default (API) or lets you bulk-clean in Sparkle.io.
5. **Write it back:** Updates the sheet with pattern, email, confidence, and verification status, and flips Status to TRUE.
6. **Loop until done:** Runs batch by batch so you never hit API limits.

## Work free-tier magic (up to ~2,500 contacts/month)

| Service | Free allowance | How this template uses it |
| --- | --- | --- |
| Serper.dev | 2,500 searches/mo | Scrapes three public email snippets per domain to learn the pattern |
| Sparkle.io | 10,000 bulk verifications/day | Manual upload-download option, perfect to clean your first 2.5k emails at zero cost |
| Prospeo | 75 API calls/mo | Built-in if you prefer fully automated verification |

Quick Sparkle workflow:

1. Let the template generate emails.
2. Export the "Email" column to CSV and upload it to Sparkle.io.
3. Download the results and paste the "verification_status" values back into the sheet (or add a small n8n import sub-flow).

## Setup (5 minutes)

1. Copy the Google Sheet linked in the sticky note and paste its ID into the Get Rows and Update Rows nodes.
2. Add credentials for Google Sheets, Serper (X-API-KEY), OpenRouter, and optionally Prospeo.
3. Hit Execute Workflow. That's it.

## How to customise

- **Prefer Sparkle for volume:** Skip the Prospeo node, export emails in one click, bulk-verify in Sparkle, and re-import the results.
- **Swap the search source:** Replace the Get Email Pattern HTTP node with Bing, Brave, etc.
- **Extend enrichment:** Add phone look-ups or LinkedIn scrapers before the Update Rows node.
- **Auto-run:** Replace the Manual Trigger with a Cron node so the sheet cleans itself every morning.

## Additional resources

| Tool | Purpose | Link |
| --- | --- | --- |
| Prospeo (API-ready email verification; special offer: 20% free credits for the first 3 months on any plan using this link) | Real-time, single-call mailbox validation | prospeo.io |
| Sparkle.io (high-volume bulk verifier, manual upload) | Free daily quota of 10,000 verifications | app.sparkle.io/sign-up |
| OpenRouter (API gateway for Gemini Flash & other LLMs) | One key unlocks multiple frontier models | openrouter.ai |
| Serper.dev (Google Search API) | 2,500 searches/month on the free tier | serper.dev |

Add the relevant keys or signup details from these links, drop them into the matching n8n credentials, and you're all set to enrich your first 2,500 contacts at zero cost.
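Step 3 (building the address) amounts to applying the detected pattern to each name. A sketch with hypothetical pattern labels; how the LLM is asked to name the format is an assumption, so align these tokens with your actual prompt:

```javascript
// Code node: apply a detected email pattern to a contact's name and domain.
function buildEmail(pattern, first, last, domain) {
  const f = first.toLowerCase(), l = last.toLowerCase();
  const local = {
    'first.last': `${f}.${l}`,
    'firstlast':  `${f}${l}`,
    'f.last':     `${f[0]}.${l}`,
    'flast':      `${f[0]}${l}`,
    'first':      f,
  }[pattern] ?? `${f}.${l}`; // fall back to the most common format
  return `${local}@${domain}`;
}

// Example: buildEmail('f.last', 'Ada', 'Lovelace', 'example.com') -> "a.lovelace@example.com"
return [{ json: { email: buildEmail($json.pattern, $json.first, $json.last, $json.domain) } }];
```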
Happy building!
by Khairul Muhtadin
## Who is this for?

This workflow is perfect for Gmail users who want a tidy inbox without manual effort. It's especially great for those overwhelmed by SPAM, social media updates, or promotional emails who want them automatically removed on a regular schedule.

## What problem is this workflow solving?

Unwanted emails like SPAM, social notifications, and promotions can clutter your Gmail inbox, making it hard to focus on what matters. Manually deleting them is repetitive and time-consuming. This workflow automates the cleanup, keeping your inbox streamlined.

## What this workflow does

Every 3 days, this workflow deletes emails from Gmail's SPAM, Social, and Promotions categories. It uses n8n's Gmail node to fetch these emails, merges them into a single list, splits out individual email IDs, and deletes each one (see the sketch below). The scheduled process ensures consistent inbox maintenance.

## Setup

1. Set up valid Gmail OAuth2 credentials in n8n.
2. Import the "Clean My Mail" workflow into your n8n instance.
3. Confirm the Gmail nodes target the SPAM, CATEGORY_SOCIAL, and CATEGORY_PROMOTIONS labels.
4. Adjust the "Run Every 3 Days (Trigger)" node's schedule if needed.
5. Activate the workflow to begin automated cleaning.

## How to customize this workflow to your needs

- Change the Gmail node labels to target other categories or custom labels.
- Adjust the schedule frequency in the trigger node.
- Add filters to spare specific emails from deletion.
- Extend functionality with nodes for archiving or notifications.

Made by khmuhtadin. Need a custom workflow? Contact me on LinkedIn or via the web.
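The merge-and-split step can be condensed into one Code node. A sketch assuming the three Gmail "get many" nodes feed into it and each fetched message exposes its ID as `id` (standard for the Gmail node, but worth verifying on your instance):

```javascript
// Code node: combine message IDs from all three Gmail fetches into one
// deduplicated list, emitting one item per ID for the downstream delete node.
const ids = $input.all()
  .map(item => item.json.id)
  .filter(Boolean); // drop items without an ID

return [...new Set(ids)].map(id => ({ json: { id } }));
```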