by PollupAI
## Who’s it for

This workflow is built for B2B SaaS and CX teams that are drowning in unstructured customer feedback across tools. It’s ideal for Customer Success, Product, and Support leaders who want a light “voice of customer” engine without rebuilding their stack: Gmail for interactions, Slack for conversations, Pipedrive for notes and Zendesk for tickets, plus Notion for follow-up tasks.

## How it works / What it does

The workflow runs on a schedule or manual trigger and first sets the CSM’s email address. It then uses an AI “Data agent” to pull recent customer signals from multiple sources: Gmail messages, Slack messages, Pipedrive notes and Zendesk tickets. A “Signals agent” compresses each piece of feedback into a concise, neutral summary, which is then grouped by topic via a “Clustering agent”. Each cluster gets a label, count and examples. Finally, an “Action agent” routes clusters based on their label:

- Create Zendesk tickets for product/performance issues
- Post to a dedicated Slack channel for billing/contract topics
- Create Notion tasks for sales-related feedback
- Send targeted Gmail messages to the CSM for high-risk or engagement-related items

## How to set up

1. Import the workflow into n8n.
2. Connect credentials for Gmail, Slack, Pipedrive, Zendesk, Notion and OpenAI.
3. Update the CSM email in the “Set CSM email” node.
4. Adjust date filters, send-to addresses and Slack channel IDs as needed.
5. Enable the schedule trigger for weekly or daily digests.

## Requirements

- Active accounts & credentials for Gmail, Slack, Pipedrive, Zendesk and Notion
- OpenAI (or compatible) API key for the LLM node
- At least one Slack channel for posting feedback (e.g. #billing-feedback)

## How to customize the workflow

- Change the time window or filters (sender, channel, query) for each data source.
- Edit the clustering and routing prompts to match your own categories and teams.
- Add new destinations (e.g. Jira, HubSpot) by connecting more tools to the Action agent.
- Modify thresholds (e.g. minimum count) before a cluster triggers an action.
- Localize labels and email copy to your team’s language and tone.
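The label-based routing and minimum-count threshold can be sketched as follows. This is a hypothetical illustration, assuming each cluster arrives as `{ label, count, examples }` from the Clustering agent; the label names and `MIN_COUNT` value are assumptions, not part of the template.

```javascript
const MIN_COUNT = 2; // clusters below this size trigger no action (adjustable threshold)

function routeCluster(cluster) {
  if (cluster.count < MIN_COUNT) return { destination: "none", cluster };
  switch (cluster.label) {
    case "product":
    case "performance":
      return { destination: "zendesk", cluster }; // create a ticket
    case "billing":
    case "contract":
      return { destination: "slack", cluster };   // post to the billing channel
    case "sales":
      return { destination: "notion", cluster };  // create a follow-up task
    default:
      return { destination: "gmail", cluster };   // alert the CSM directly
  }
}
```

Adding a new destination (e.g. Jira) would just mean another `case` plus the corresponding tool node in the Action agent.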
by Veena Pandian
## Who is this for?

Founders, product managers, content strategists, indie hackers, and anyone who wants to automatically monitor tech industry trends across multiple sources — without manually browsing Hacker News and Product Hunt every day.

## What this workflow does

This workflow scans public RSS feeds from Hacker News and Product Hunt daily, scores every item against configurable keyword groups (AI, SaaS, No-Code, Dev Tools, etc.), clusters the results into ranked themes, and delivers a prioritized intelligence report via Slack and email. All signals and themes are logged to Google Sheets for historical trend analysis.

## How it works

1. A daily trigger fires on a configurable schedule (default: every 24 hours).
2. Fetches RSS feeds from Hacker News (posts with 50+ points) and Product Hunt in parallel.
3. Parses and normalizes all feed items, extracting titles, descriptions, URLs, and publish dates from RSS/Atom XML.
4. Scores each item against 7 weighted keyword groups. Title matches receive a bonus multiplier, and source weights (Hacker News 1.5x, Product Hunt 1.3x) amplify signals from higher-authority sources.
5. Clusters into themes: groups scored items by primary category, calculates theme strength using source diversity and volume bonuses, and classifies each as VERY_STRONG, STRONG, MODERATE, or WEAK.
6. Builds an intelligence report with theme rankings, top 10 signals, and action items for surging topics, generated in both plain text (Slack) and HTML (email).
7. Delivers and logs: posts to Slack, sends the HTML email, and appends both individual signals and theme summaries to separate Google Sheet tabs.

## Setup steps

1. Connect Google Sheets OAuth2 credentials and update the Sheet ID in both "Log Signals to Sheet" and "Log Themes to Sheet" nodes.
2. Create a Google Sheet with two tabs:
   - **signal** — headers: date, title, source, score, category, url
   - **themes** — headers: date, category, signal_level, theme_strength, item_count, sources, top_keywords
3. Connect Slack OAuth2 credentials and configure your target channel in the "Post Report to Slack" node.
4. Connect Gmail OAuth2 credentials and update YOUR_EMAIL@EXAMPLE.COM in the "Email Daily Report" node.
5. Activate the workflow.

## Requirements

- n8n instance (self-hosted or cloud)
- Google Cloud project with the Sheets API enabled
- Slack workspace with a bot configured
- Gmail account with OAuth2 credentials (or swap for SMTP)
- No API keys needed for the RSS feeds — they are publicly accessible

## How to customize

- **Add more RSS feeds** — duplicate a feed node (e.g., TechCrunch, Reddit, Lobsters), connect it to the Merge node as an additional input, and add the parsing logic in the "Parse All RSS Feeds" code node.
- **Edit keyword groups** — modify the keywordGroups object in the "Score and Classify Signals" node. Add your industry-specific keywords, adjust weights, and rename categories.
- **Adjust source weights** — change the weight multipliers in the parser node to reflect which sources you trust most.
- **Theme thresholds** — modify the strength cutoffs (30 = VERY_STRONG, 15 = STRONG, 8 = MODERATE) in the "Aggregate Signals into Themes" node.
- **Schedule** — change from daily to hourly for real-time monitoring, or weekly for a digest format.
- **Add AI analysis** — insert an LLM node after the report builder to generate strategic commentary on detected trends.
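The scoring and classification logic above can be sketched like this. The keyword groups, weights, and title-bonus formula here are illustrative assumptions; the actual `keywordGroups` object lives in the "Score and Classify Signals" node, and the 30/15/8 cutoffs in "Aggregate Signals into Themes".

```javascript
// Illustrative keyword groups (the template ships 7; two shown here).
const keywordGroups = {
  ai:   { weight: 3, terms: ["ai", "llm", "gpt"] },
  saas: { weight: 2, terms: ["saas", "subscription"] },
};
const sourceWeights = { hackernews: 1.5, producthunt: 1.3 };
const TITLE_BONUS = 2; // keyword hits in the title count double (assumed multiplier)

function scoreItem(item) {
  const title = item.title.toLowerCase();
  const body = (item.description || "").toLowerCase();
  let score = 0;
  for (const group of Object.values(keywordGroups)) {
    for (const term of group.terms) {
      if (title.includes(term)) score += group.weight * TITLE_BONUS;
      else if (body.includes(term)) score += group.weight;
    }
  }
  // Source weight amplifies signals from higher-authority feeds.
  return score * (sourceWeights[item.source] || 1);
}

// Theme-strength cutoffs from the "Aggregate Signals into Themes" node.
function classifyTheme(strength) {
  if (strength >= 30) return "VERY_STRONG";
  if (strength >= 15) return "STRONG";
  if (strength >= 8) return "MODERATE";
  return "WEAK";
}
```

Tuning the cutoffs or `sourceWeights` is exactly the "Theme thresholds" / "Adjust source weights" customization described above.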
by Samyotech
## What this workflow does

This workflow implements a two-stage news automation system designed for reusable, topic-driven email delivery. News articles are continuously collected from multiple platforms using RSS feeds and stored in a vector database with semantic embeddings and category metadata.

Instead of fetching news on demand, the workflow separates daily ingestion from weekly delivery. This allows the same news data to be reused across different topics, audiences, or delivery schedules.

On a weekly basis, relevant articles are retrieved from the vector store based on defined areas of interest and item limits. The selected news is then processed by an AI agent, which converts the raw articles into a structured, email-ready format before sending the final content to users.

## How it works

1. News articles are collected daily from multiple RSS feeds.
2. Articles are categorized and stored in a vector database.
3. On a weekly trigger, topic preferences are evaluated.
4. Relevant articles are retrieved using vector-based search.
5. An AI agent formats the content for email delivery.
6. The email is sent to the user.

## Setup

To use this workflow, complete the following steps:

1. Add and configure your RSS feed sources.
2. Connect a vector database and embedding model.
3. Configure AI model credentials for content generation.
4. Set up email service credentials.
5. Define weekly scheduling and topic inputs.
6. Test retrieval and email output.

## Customization

You can customize this workflow by:

- Adding or removing RSS feed sources
- Adjusting news categories or topic filters
- Changing the number of articles retrieved per topic
- Modifying the AI agent’s writing tone or structure
- Reusing the vector store for other content workflows
- Updating email frequency or delivery format

## Requirements

- RSS feed URLs
- Vector database credentials
- AI model credentials
- Email service credentials
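Conceptually, the weekly retrieval step is a similarity search over stored articles, filtered by category and capped by an item limit. The sketch below shows that idea in plain JavaScript; the article shape and the in-memory cosine search are illustrative only, since a real vector database performs this search server-side.

```javascript
function cosineSimilarity(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] ** 2;
    nb += b[i] ** 2;
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Retrieve the top `limit` articles for a topic, restricted to one category.
function retrieveArticles(topicEmbedding, articles, category, limit) {
  return articles
    .filter(art => art.category === category)
    .map(art => ({ ...art, score: cosineSimilarity(topicEmbedding, art.embedding) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, limit);
}
```

Changing `limit` per topic corresponds to the "number of articles retrieved per topic" customization listed above.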
by Frederik Duchi
This n8n template demonstrates how to automatically create tasks (or, more generally, records) in Baserow based on template or blueprint tables.

The first blueprint table is the master table that holds the general information about the template, for example a standard procedure to handle incidents. The second table is the details table that holds multiple records for the template. Each record in that table is a specific task that needs to be assigned to someone with a certain deadline. This makes it easy to streamline task creation for recurring processes.

Use cases are many:

- Project management (generate tasks for employees based on a project template)
- HR & onboarding (generate tasks for employee onboarding based on a template)
- Operations (create checklists for maintenance, audits, or recurring procedures)

## Good to know

- The Baserow template for handling Standard Operating Procedures works perfectly as a base schema to try out this workflow.
- Authentication is done through a database token. Check the documentation on how to create such a token.
- Tasks are inserted using the HTTP Request node instead of a dedicated Baserow node. This is to support batch import instead of importing records one by one.

## Requirements

- Baserow account (cloud or self-hosted)
- A Baserow database with at least the following tables:
  - **Assignee / employee table.** Required to be able to assign someone to a task.
  - **Master table** with procedure or template information. Required to be able to select a certain template.
  - **Details table** with all the steps associated with a procedure or template. Required to convert each step into a specific task. A step must have a field `Days to complete` with the number of days to complete the step; this field is used to calculate the deadline.
  - **Tasks table** that contains the actual tasks with an assignee and deadline.

## How it works

- **Trigger task creation (webhook)**: The automation starts when the webhook is triggered through a POST request. The request body should contain an assignee, template, date and note. It sends a success or failure response once all steps are completed.
- **Configure settings and ids**: Stores the ids of the involved Baserow database and tables, together with the API credentials and the data from the webhook.
- **Get all template steps**: Gets all the steps from the template Details table that are associated with the id of the Master template table. For example, the master template can have a record about handling customer complaints, and the details table contains all the steps to handle this procedure.
- **Calculate deadlines for each step**: Prepares the input of the tasks by using the same property names as the fields of the Tasks table. Adjust these names, and add or remove fields if required by your database structure. The deadline of each step is calculated by adding the number of days a step can take, taken from the `Days to complete` field in the template Details table, to the deadline of the first step. For example, if the `schedule_date` property in the webhook is set to 2025-10-01 and `Days to complete` for the step is 3, the deadline will be 2025-10-04.
- **Avoid scheduling during the weekend**: The calculated deadline may fall on a Saturday or Sunday. This Code node moves those dates to the next Monday to avoid scheduling during the weekend.
- **Aggregate tasks for insert**: Aggregates the data from the previous nodes as an array in a property named `items`. This matches perfectly with the Baserow API for inserting new records in batch.
- **Generate tasks in batch**: Calls the API endpoint `/api/database/rows/table/{table_id}/batch/` to insert multiple records at once into the Tasks table. Check the Baserow API documentation for further details.
- **Success / Error response**: Sends a simple text response to indicate the success or failure of the record creation. This offers feedback when triggering the automation from a Baserow application, but can be replaced with a JSON response.

## How to use

1. Call the **Trigger task creation** node with the required parameters through a POST request. This can be done from any web application; for example, the application builder in Baserow supports an action to send an HTTP request, and the Procedure details page in the Standard Operating Procedures template demonstrates this action. The following information is required in the body of the request to create the actual tasks:

   ```json
   {
     "assignee_id": "integer referring to the id of the assignee in the database",
     "template_id": "integer referring to the id of the template or procedure in the master table",
     "schedule_date": "the date the tasks need to start scheduling",
     "note": "text with an optional note about the tasks"
   }
   ```

2. Set the corresponding ids in the **Configure settings and ids** node.
3. Check the names of the properties in the **Calculate deadlines for each step** node. Make sure the names of those properties match the field names of your Tasks table.
4. You can replace the text message in the **Success response** and **Failure response** nodes with a more structured format if necessary in your application.

## Customising this workflow

- Add support for public holidays (e.g., using an external calendar API).
- Modify the task assignment logic (e.g., pre-assign tasks in the details table).
- Combine with notifications (email, Slack, etc.) to alert employees when new tasks are generated.
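The deadline calculation and weekend shift can be sketched as below. This is an illustrative combination of the two Code-node steps (deadline offset, then weekend shift) into one function; the actual nodes in the workflow keep them separate and use your Tasks-table field names.

```javascript
// Given the webhook's schedule_date (YYYY-MM-DD) and a step's "Days to complete",
// compute the deadline and move Saturday/Sunday results to the next Monday.
function calculateDeadline(scheduleDate, daysToComplete) {
  const deadline = new Date(scheduleDate + "T00:00:00Z");
  deadline.setUTCDate(deadline.getUTCDate() + daysToComplete);
  const day = deadline.getUTCDay(); // 0 = Sunday, 6 = Saturday
  if (day === 6) deadline.setUTCDate(deadline.getUTCDate() + 2);
  else if (day === 0) deadline.setUTCDate(deadline.getUTCDate() + 1);
  return deadline.toISOString().slice(0, 10);
}
```

Note that with the example above (2025-10-01 plus 3 days), the raw deadline 2025-10-04 is a Saturday, so the weekend-shift step would move it to Monday 2025-10-06.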
by WeblineIndia
## AI Resume Screening Workflow using Gmail, OpenAI (GPT-4 Turbo) & Slack

This workflow automatically fetches resumes from Gmail, extracts text from PDF attachments, analyzes candidate profiles using AI, and shortlists qualified candidates by sending their details to Slack. It significantly reduces manual effort in resume screening and speeds up hiring decisions.

## Quick Implementation Steps

1. Connect your Gmail account in n8n.
2. Connect your OpenAI API (GPT-4 Turbo).
3. Connect your Slack account.
4. Ensure resumes are received as PDF attachments in Gmail.
5. Update job requirements inside the AI Resume Analyzer node.
6. Run the workflow manually or activate it.

## What It Does

This workflow automates the resume screening process by integrating Gmail, AI, and Slack into a seamless pipeline. It starts by fetching emails containing resume attachments, then processes and validates these files to ensure only PDF resumes are analyzed. Once validated, the workflow extracts text from each resume and sends it to an AI model. The AI evaluates candidate details such as skills, experience, and contact information, and calculates a match score based on predefined job requirements. Finally, the workflow determines whether a candidate should be shortlisted or rejected. Shortlisted candidates are instantly shared on Slack with complete details, enabling faster hiring decisions and reducing manual screening effort.

## Who’s It For

- HR teams and recruiters
- Hiring managers
- Startups handling large volumes of applications
- Recruitment agencies
- Businesses looking to automate candidate screening

## Requirements to Use This Workflow

- n8n (self-hosted or cloud)
- Gmail account (OAuth connected)
- OpenAI API key (GPT-4 Turbo access)
- Slack account (with channel access)
- Resumes in PDF format
- Basic understanding of n8n workflows

## How It Works & Setup

1. **Manual Trigger Setup**: Use the Start Workflow Manually node to initiate the workflow.
2. **Gmail Configuration**: Connect your Gmail account in the Fetch Gmail node and ensure emails contain resume attachments.
3. **Attachment Processing**: The workflow formats and checks attachments; only emails with attachments proceed further.
4. **PDF Validation**: Ensure resumes are in PDF format; non-PDF files are ignored automatically.
5. **Text Extraction**: The Extract Resume Text node converts PDF resumes into plain text.
6. **AI Configuration**: Add your OpenAI credentials in the OpenAI Chat Model node and customize job requirements inside the AI Resume Analyzer node prompt.
7. **Evaluation Logic**: Candidates are scored based on skills (50%), experience (30%), and location (20%).
8. **Output Handling**: If the score is ≥ 70, the candidate is sent to Slack as shortlisted; otherwise they are marked as rejected.
9. **Activate Workflow**: Run manually or activate for continuous use.

## How To Customize Nodes

- **AI Resume Analyzer** - Modify job requirements (skills, experience, location); adjust the scoring logic or threshold.
- **Evaluate Candidate Score** - Change the minimum score (default: 70).
- **Slack Node** - Customize the message format; change the Slack channel.
- **Gmail Node** - Add filters (labels, subjects, senders).
- **Extract Resume Text** - Extend for other file formats if needed.

## Add-ons (Enhancements)

- Store shortlisted candidates in Google Sheets or a database
- Send rejection emails automatically
- Add ATS (Applicant Tracking System) integration
- Include a resume ranking system
- Add duplicate candidate detection
- Enable multi-role job matching

## Use Case Examples

- Automating resume screening for tech hiring
- Filtering candidates for specific skill sets (e.g., React Native developers)
- Handling bulk job applications efficiently
- Shortlisting candidates for remote roles
- Pre-screening candidates before interviews

There can be many more use cases depending on hiring workflows and business needs.
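The evaluation logic (skills 50%, experience 30%, location 20%, shortlist at 70) can be sketched like this. It is a hypothetical illustration: the sub-scores (0–100 each) are assumed to come out of the AI Resume Analyzer node, and the exact combination formula in the template may differ.

```javascript
const WEIGHTS = { skills: 0.5, experience: 0.3, location: 0.2 };
const SHORTLIST_THRESHOLD = 70; // the "Evaluate Candidate Score" default

function evaluateCandidate(scores) {
  // Weighted sum of the three sub-scores.
  const total = WEIGHTS.skills * scores.skills
              + WEIGHTS.experience * scores.experience
              + WEIGHTS.location * scores.location;
  return { score: total, shortlisted: total >= SHORTLIST_THRESHOLD };
}
```

Changing the weights or the threshold here corresponds to the "adjust scoring logic or threshold" customization described below.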
## Troubleshooting Guide

| Issue | Possible Cause | Solution |
|---|---|---|
| No emails fetched | Gmail not connected or incorrect filters | Reconnect Gmail and check filters |
| Attachments not processed | Emails don’t contain attachments | Ensure resumes are attached |
| PDF not detected | Incorrect file type | Ensure resumes are in PDF format |
| AI response parsing error | Invalid JSON from AI | Check the prompt and enforce strict JSON |
| No Slack message | Slack credentials or channel issue | Verify the Slack connection and channel ID |
| Low-quality matches | Incorrect job requirements | Update the AI prompt with accurate criteria |

## Need Help?

If you need assistance setting up this workflow or customizing it for your business needs, we’re here to help. Whether you want to:

- Extend this workflow with advanced features
- Integrate it with your existing systems
- Build similar automation solutions

Feel free to reach out to our n8n workflow development experts at WeblineIndia for expert support and tailored workflow development.

👉 Let us help you automate smarter and scale faster.
by Yassin Zehar
## Description

Automatically triage Product UAT feedback with AI, deduplicate it against your existing Notion backlog, create or update the right Notion item, and close the loop with the tester (Slack or email).

This workflow standardizes incoming UAT feedback, runs AI classification (type, severity, summary, suggested title, confidence), searches Notion to prevent duplicates, and upserts the roadmap entry for product review. It then confirms receipt to the tester and returns a structured webhook response.

## Context

Feature requests often arrive unstructured and get lost across channels. Product teams waste time re-triaging the same ideas, creating duplicates, and manually confirming receipt. This workflow ensures:

- Faster feature request triage
- Fewer duplicates in your roadmap/backlog
- Consistent structure for every feedback item
- Automatic tester acknowledgement
- Full traceability via the webhook response

## Who is this for?

- Product Managers running UAT or beta programs
- Product Ops teams managing a roadmap backlog
- Teams collecting feature requests via forms, Slack, or internal tools
- Anyone who wants AI speed with clean backlog hygiene

## Requirements

- Webhook trigger (form / Slack / internal tool)
- OpenAI account (AI triage)
- Notion account (roadmap/backlog database)
- Slack and/or Gmail (tester notification)

## How it works

1. **Trigger**: feedback received via webhook.
2. **Normalize & Clean**: standardizes fields and cleans the message.
3. **AI Triage**: returns structured JSON (type, severity, title, confidence, …).
4. **Notion Dedupe & Upsert**: search by suggested title; update if found, else create.
5. **Closed Loop**: notify the tester (Slack or email) and return the webhook response payload.

## What you get

- One workflow to capture and structure feature requests
- A clean Notion backlog without duplicates
- Automatic tester confirmation
- Structured output for downstream automation

## About me

I’m Yassin, a Product Manager scaling tech products with a data-driven mindset.

📬 Feel free to connect with me on LinkedIn.
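The triage output and the dedupe decision can be sketched as below. The example object and the case-insensitive title match are assumptions for illustration; the template's actual JSON schema and Notion search behaviour may differ.

```javascript
// Hypothetical shape of the AI Triage output (fields named in the description above).
const triageExample = {
  type: "feature_request",
  severity: "medium",
  summary: "Tester wants bulk export of UAT results",
  suggested_title: "Bulk export for UAT results",
  confidence: 0.87,
};

// The search-by-suggested-title result decides between update and create.
function decideUpsert(triage, existingTitles) {
  const match = existingTitles.find(
    t => t.toLowerCase() === triage.suggested_title.toLowerCase()
  );
  return match
    ? { action: "update", title: match }
    : { action: "create", title: triage.suggested_title };
}
```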
by Kevin Meneses
## What this workflow does

This workflow automatically generates a daily stock market email digest, combining price movements and recent financial news into a clean, actionable report. Instead of manually checking charts and news every morning, this workflow does it for you.

It combines:

- Market data from EODHD APIs
- Financial news with sentiment analysis
- Smart processing using JavaScript (no raw data overload)
- ✉️ Automated email delivery via Gmail

Only relevant insights reach your inbox.

## How it works (overview)

1. The workflow runs on a schedule (daily at 7 AM) or manually.
2. It reads your stock watchlist from Google Sheets.
3. It fetches historical price data (last 14 days) and the latest news (last 7 days).
4. JavaScript processes the data: calculates the daily % change, filters recent news, and assigns sentiment (positive / neutral / negative).
5. A clean HTML email is generated.
6. The digest is sent automatically via Gmail.

👉 The result: a complete market snapshot in seconds.

## How to configure it

**EODHD APIs (Market Data)**

- Add your API key in the ⚙️ Config node.
- Used for price data, financial news, and sentiment signals.

**Google Sheets**

- Add your credentials.
- Create a column named `ticker`.
- Add one stock per row (e.g. AAPL, TSLA, MSFT).

**Gmail**

- Add your Gmail credentials.
- Set your recipient email.

**Schedule**

- Default: runs daily at 7 AM.
- You can adjust it easily in the Schedule node.

## Why this workflow is powerful

Most people check multiple sites every morning, waste time jumping between charts and news, and miss important market moves. This workflow:

- ✔ Automates everything
- ✔ Centralizes data
- ✔ Adds context with sentiment analysis
- ✔ Saves hours every week

## 📊 Data powered by EODHD APIs

This workflow uses EODHD APIs to retrieve historical stock prices, financial news, and sentiment indicators.

👉 If you want to build your own financial automations, dashboards, or trading workflows, get started here (10% discount): https://eodhd.com/pricing-special-10?via=kmg&ref1=Meneses

## Final result

You get a daily email like this:

- Top movers (price changes)
- Curated news per stock
- Sentiment insights
- Clean, readable format

All generated automatically.
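The daily % change step in the JavaScript processing can be sketched like this. The positive/neutral/negative labelling here is a simplified stand-in: in the template, sentiment comes from EODHD's news data rather than from price moves, and the ±1% thresholds are purely illustrative.

```javascript
// Daily % change from the two most recent closing prices.
function dailyChangePercent(closes) {
  // closes: array of daily closing prices, oldest first
  const prev = closes[closes.length - 2];
  const last = closes[closes.length - 1];
  return ((last - prev) / prev) * 100;
}

// Simplified move label (illustrative thresholds, not the template's sentiment source).
function labelMove(changePct) {
  if (changePct > 1) return "positive";
  if (changePct < -1) return "negative";
  return "neutral";
}
```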
by Itunu
Automatically detect unsubscribe replies in your outreach emails and clean your Google Sheets contact list, keeping your domain reputation and deliverability strong.

## 🎯 Who it’s for

This template is designed for freelancers, lead generation specialists, and outreach managers, particularly those running email outreach campaigns for clients or personal lead-gen projects. If you send cold emails, manage multiple lead lists, or handle outreach at scale, this workflow helps you automatically manage unsubscribe requests to maintain healthy email deliverability and protect your domain reputation.

## ⚙️ How it works

1. **Trigger**: The workflow starts when a new reply is received in your Gmail inbox.
2. **Intent Detection**: The email text is analyzed using OpenAI to detect unsubscribe intent (“unsubscribe”, “remove me”, “opt out”, etc.).
3. **Normalization & Filtering**: A Code node verifies the AI output for accuracy and ensures the result is standardized as either "unsubscribe" or "keep".
4. **Check & Update Sheets**: If the contact requested removal, the workflow checks your Unsubscribe Sheet to see if they’re already listed. If not, the contact is added to the Unsubscribe Sheet and simultaneously removed from your Main Outreach Sheet.
5. **Optional Gmail Label**: Adds an “Unsubscribe” tag in Gmail for quick visual tracking (optional customization).

## 🧩 Requirements

To run this workflow, you’ll need:

- **Gmail Credentials** → for reading incoming replies and applying labels.
- **Google Sheets Credentials** → to manage both the “Main” and “Unsubscribe” spreadsheets.
- **OpenAI API Key** → used for detecting unsubscribe intent from message text.

All credentials can be added through the n8n Credentials Manager.

## 🧠 How to Customize

- **Polling Time**: Adjust how often the Gmail Trigger checks for new replies (default: every 5 minutes).
- **Sheets**: Replace the linked Google Sheets IDs with your own. You can change sheet names and columns freely.
- **Intent Rules**: Modify the Code node to include additional unsubscribe phrases or alternate keywords.
- **Optional Gmail Tagging**: Enable or disable tagging for unsubscribed messages.
- **Secondary Validation**: Enable the second “If” check after the OpenAI node to double-confirm unsubscribe intent before moving contacts.

## 💡 Why this workflow matters

By automatically managing unsubscribe requests, you:

- Respect recipients’ opt-out preferences
- Reduce spam complaints
- Protect your domain reputation and increase deliverability
- Save hours of manual list cleaning

This is a must-have automation for anyone running cold email outreach, especially freelancers managing multiple client inboxes.

## 🪄 Quick Setup Tips

1. Replace all "Gmail account" and "Google Service Account account" credential references with your actual credentials.
2. Ensure your sheet has an EMAIL column for lookup.
3. Test with a few mock replies before activating for production.
4. Optional: Add a time-based trigger to run the sheet cleanup periodically.
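The normalization Code node described above might look roughly like this. The phrase list and the fallback re-check of the raw email text are assumptions; extend `UNSUBSCRIBE_PHRASES` when customizing the Intent Rules.

```javascript
const UNSUBSCRIBE_PHRASES = ["unsubscribe", "remove me", "opt out", "opt-out", "take me off"];

// Standardize the AI output to exactly "unsubscribe" or "keep".
function normalizeIntent(aiOutput, emailText) {
  const label = String(aiOutput || "").trim().toLowerCase();
  if (label === "unsubscribe" || label === "keep") return label;
  // Fallback: if the AI returned anything unexpected, re-check the raw text.
  const text = emailText.toLowerCase();
  return UNSUBSCRIBE_PHRASES.some(p => text.includes(p)) ? "unsubscribe" : "keep";
}
```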
by Nabin Bhandari
## Who’s it for

This template is for businesses, customer support teams, and professionals who want to deliver AI-powered WhatsApp assistance. It helps automate conversations, schedule meetings, answer FAQs, and send follow-up emails — all from WhatsApp.

## How it works

1. A customer sends a WhatsApp message, which is captured by the Twilio Trigger.
2. The incoming text is formatted and passed to the AI Support Agent.
3. Based on the request, the agent can:
   - Manage Google Calendar events (create, list, delete)
   - Answer questions from your knowledge base (Supabase + embeddings)
   - Draft and send emails via Gmail
   - Reply directly on WhatsApp with the appropriate response

## How to set up

1. Connect your Twilio account with WhatsApp enabled.
2. Add your OpenAI API key.
3. Connect Google Calendar.
4. Connect Gmail.
5. Configure Supabase for knowledge base storage.

## Requirements

- Twilio account (with WhatsApp number)
- OpenAI API key
- Google Calendar
- Gmail account
- Supabase project

## How to customize

- Update the Set Fields node with your Twilio number, API keys, and Gmail details.
- Add custom documents to Supabase for domain-specific FAQs.
- Adjust AI prompts for different roles (e.g., booking bot, HR assistant, customer support).
- Extend by adding more tools (CRM, Slack, Notion, etc.) as needed.
by kota
## What this workflow does

This workflow monitors your Gmail inbox for new, unreplied emails and automatically generates a professional reply draft using AI. Instead of sending the email automatically, the draft is sent to Slack so a human can review it and decide whether to send it. This makes it ideal for teams that want to save time on email replies while keeping full control over outgoing communication.

## How it works

1. Checks Gmail on a schedule for new, unreplied emails.
2. Limits the number of emails processed per run to avoid overload.
3. Extracts the email body and sends it to an AI model.
4. Generates a polite, professional reply draft.
5. Sends the draft to a Slack channel for review.
6. Adds a Gmail label to prevent duplicate processing.

## Setup time

~10–15 minutes

## Who this is for

- Customer support teams
- Freelancers and consultants
- Small businesses handling frequent email inquiries
- Anyone who wants AI-assisted email replies with human approval

## Requirements

- Gmail account
- Slack workspace
- OpenAI (or compatible AI) credentials
by András Farkas
## UPDATES

- 2025-12-03: fix JS code in the "calculate hourly sum" node

## E.ON W1000 → n8n → Home Assistant (Spook) “Integration”

This workflow processes emails from the E.ON portal containing 15-minute +A/-A (import/export) data and daily 1.8.0 / 2.8.0 meter readings. It extracts the required columns from the attached XLSX file, groups the 15-minute values by hour, then:

- updates the Spook/Recorder statistics under the IDs `sensor.grid_energy_import` and `sensor.grid_energy_export`, and
- sets the current meter readings for the entities `input_number.grid_import_meter` and `input_number.grid_export_meter`.

> You may need to modify the workflow if there are changes in how E.ON sends scheduled exports. If the exported data format changes, please report it on GitHub!

## Requirements

- n8n (cloud or self-hosted)
  - HACS addon available here: Rbillon59/home-assistant-addons
  - Official n8n Docker Compose template
  - Simplified n8n Docker Compose template available on GitHub
- (For Gmail) Gmail API authentication (OAuth2) with read-only email access to the account receiving the messages. Setup guide available here.
- (For IMAP) IMAP provider credentials.
- Home Assistant access via a Long-Lived Access Token or API key. Setup guide available here.
- Spook integration. Documentation and installation guide available here.

## E.ON Portal Setup

Create a scheduled export on the E.ON portal with the following parameters:

1. Under the Remote Meter Reading menu, click on the "+ new scheduled export setting" button.
2. Specify POD identifier(s): choose one of the PODs you want to query.
3. Specify meter variables, selecting the following:
   - +A Active Power Consumption
   - -A Active Power Feed-In
   - DP_1-1:1.8.0*0 Active Power Consumption Daily Reading
   - DP_1-1:2.8.0*0 Active Power Feed-In Daily Reading
4. Export sending frequency: daily.
5. Days of historical data to include: 7 days recommended, to backfill missed data.
6. Email subject: by default, use [EON-W1000]. If processing multiple PODs with the workflow, give each a unique identifier.

## Home Assistant Preparation

Create the following input_number entities in configuration.yaml or via Helpers:

```yaml
input_number:
  grid_import_meter:
    name: grid_import_meter
    mode: box
    initial: 0
    min: 0
    max: 9999999999
    step: 0.001
    unit_of_measurement: kWh
  grid_export_meter:
    name: grid_export_meter
    mode: box
    initial: 0
    min: 0
    max: 9999999999
    step: 0.001
    unit_of_measurement: kWh
```

> If you name the entities differently, make sure to reflect these changes in the workflow.

Create the following template sensor entities in configuration.yaml:

```yaml
template:
  - sensor:
      - name: "grid_energy_import"
        state: "{{ states('input_number.grid_import_meter') | float(0) }}"
        unit_of_measurement: "kWh"
        device_class: energy
        state_class: total_increasing
      - name: "grid_energy_export"
        state: "{{ states('input_number.grid_export_meter') | float(0) }}"
        unit_of_measurement: "kWh"
        device_class: energy
        state_class: total_increasing
```

> If you name the entities differently, make sure to reflect these changes in the workflow.

## n8n import and authentication

1. **Importing the workflow**: In n8n → Workflows → Import from File/Clipboard, paste the JSON. Alternatively, copy the downloaded JSON and paste it into a new workflow using Ctrl+V.
2. **Set up n8n credentials**: The credentials must be configured in the Home Assistant and Gmail nodes. The setup process is described in the Requirements section.
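The hourly grouping performed by the "calculate hourly sum" node can be sketched as below. The input row shape (`{ timestamp, value }` per 15-minute reading) is an assumption for illustration; the real node operates on the parsed E.ON XLSX export.

```javascript
// Sum 15-minute +A/-A values into hourly buckets keyed by "YYYY-MM-DDTHH".
function sumByHour(rows) {
  const hourly = {};
  for (const row of rows) {
    const hourKey = row.timestamp.slice(0, 13); // truncate to the hour
    hourly[hourKey] = (hourly[hourKey] || 0) + row.value;
  }
  return hourly;
}
```

Four 15-minute readings per hour collapse into one hourly total, which is what gets written into the Spook/Recorder statistics.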
by Kumar SmartFlow Craft
## 🚀 How it works

Fully automates your Day 0–30 employee onboarding sequence the moment HR submits a webhook. No manual steps, no missed tasks.

- 🔐 Provisions a Google Workspace account via the Admin API
- 💬 Posts a personalised welcome message to Slack
- 📝 Creates a Notion onboarding page pre-filled with the employee's details
- 📧 Sends a welcome email via Gmail with first-day instructions
- ⏱️ Waits 7 days, then checks task completion — alerts the manager if anything is overdue
- ✅ Waits 30 days, runs a final completion check and closes the onboarding loop

## 🛠️ Set up steps

Estimated setup time: ~20 minutes

1. **Webhook** — copy the webhook URL and send it from your HR system (BambooHR, HiBob, Workday, or a simple form)
2. **Google Workspace** — connect a Service Account with Domain-Wide Delegation; grant the admin.directory.user scope
3. **Slack** — connect Slack OAuth2; set the welcome channel in the node (e.g. #general)
4. **Notion** — connect Notion OAuth2; set your Onboarding database ID in the Create Page node
5. **Gmail** — connect Gmail OAuth2; customise the welcome email template in the Send Email node
6. Follow the sticky notes inside the workflow — each key node has a one-liner guide

## 📋 Prerequisites

- Google Workspace (Business Starter or higher)
- Slack workspace with a bot or OAuth2 app
- Notion workspace with an onboarding database
- Gmail account for sending welcome emails

Custom workflow request with personal dashboard: kumar@smartflowcraft.com / https://www.smartflowcraft.com/contact

More free templates: https://www.smartflowcraft.com/n8n-templates
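A hypothetical example of the webhook payload an HR system might send to start the sequence, plus a simple guard the Webhook branch could apply before provisioning anything. All field names here are illustrative assumptions; match them to whatever your Webhook node and downstream nodes actually read.

```javascript
// Illustrative HR-system payload (field names are assumptions, not the template's schema).
const onboardingPayload = {
  first_name: "Ada",
  last_name: "Lovelace",
  personal_email: "ada@example.com",
  start_date: "2025-03-01",
  department: "Engineering",
  manager_email: "manager@example.com",
};

// Return the list of required fields that are missing, so the workflow can
// reject incomplete submissions before creating any accounts.
function validatePayload(p) {
  const required = ["first_name", "last_name", "personal_email", "start_date", "manager_email"];
  return required.filter(key => !p[key]);
}
```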