by Panth1823
Automate Personalized HR Email Outreach with Rate Limiting

This workflow streamlines HR outreach by fetching contact data, validating emails, enforcing daily sending limits, and sending personalized emails with attachments, all while logging activity.

How it works
- Read HR contact data from Google Sheets.
- Remove duplicates and validate email formats.
- Apply dynamic daily email sending limits.
- Generate personalized email content.
- Download resumes for attachments.
- Send emails via Gmail with attachments.
- Log sending status (success/failure) to Google Sheets.

Setup
- Configure Google Sheets credentials.
- Configure Gmail OAuth2 credentials.
- Update 'Google Sheets - Read HR Data' with your document and sheet IDs.
- Define email content in the 'Email Creator' node.
- Set the 'Download Resume' URL to your resume repository.
- Update 'Log to Google Sheets' with your tracking sheet IDs.

Customization
Adjust the 'Rate Limiter' node's RAMP_START and LIMIT_BY_WEEK variables to match your desired sending schedule and volume.
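The 'Rate Limiter' node's internals aren't shown in this listing; a minimal sketch of how RAMP_START and LIMIT_BY_WEEK could drive a ramping daily cap (the variable names come from the template, the scheduling logic is an illustrative assumption):

```javascript
// Hypothetical ramp-up limiter: the daily cap grows the longer the
// campaign has been running. RAMP_START and LIMIT_BY_WEEK are named in
// the template; the values and logic here are assumptions.
const RAMP_START = '2025-01-06';         // date the ramp begins
const LIMIT_BY_WEEK = [20, 40, 80, 150]; // daily cap for weeks 1..4+

function dailyLimit(today, rampStart = RAMP_START, limits = LIMIT_BY_WEEK) {
  const msPerWeek = 7 * 24 * 60 * 60 * 1000;
  const elapsed = new Date(today) - new Date(rampStart);
  if (elapsed < 0) return 0; // campaign not started yet
  const week = Math.floor(elapsed / msPerWeek);
  // Past the end of the ramp table, stay at the final week's limit.
  return limits[Math.min(week, limits.length - 1)];
}
```

In an n8n Code node you would then slice the incoming items down to `dailyLimit(new Date())` minus the count already sent today.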
by Avkash Kakdiya
How it works
This workflow captures webinar feedback through a webhook and normalizes the submitted data for processing. It stores raw feedback in Google Sheets, uses an AI model to understand sentiment and intent, and generates a personalized response. A professional HTML thank-you email is sent automatically to each attendee. All replies and delivery details are logged back into the spreadsheet for tracking.

Step-by-step

**Receive webinar feedback**
- Feedback Webhook – Accepts feedback submissions from a webinar form in real time.
- ID Generation – Creates a human-readable, unique feedback ID for tracking.
- Normalize Feedback – Cleans and standardizes incoming fields like name, email, rating, and comments.

**Store and enrich feedback**
- Store Partial – Saves the raw feedback data into Google Sheets.
- Common Resources – Attaches shared webinar resources such as recordings and slides.

**Analyze feedback with AI**
- Message a model – Evaluates sentiment, engagement level, and intent using an AI model.
- Parse AI Response – Extracts structured insights like segment, reply text, and next steps.

**Generate and send follow-up**
- Merge – Combines feedback data, AI response, and resources.
- Build Email HTML – Creates a clean, professional HTML email tailored to each attendee.
- Send AI Thank You Email – Sends the personalized follow-up via Gmail.

**Log final outcome**
- Store Feedback – Updates Google Sheets with the sent email content, timestamp, and status.

Why use this?
- Save time by automating webinar feedback follow-ups end to end.
- Ensure every attendee receives a thoughtful, personalized response.
- Maintain a complete feedback and communication log in one place.
- Improve engagement without sounding promotional or generic.
- Scale post-webinar communication without manual effort.
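The ID Generation and Normalize Feedback steps aren't spelled out in the listing; a minimal Code-node sketch of one way to build a human-readable feedback ID and clean the incoming fields (field names and ID format are illustrative assumptions, not taken from the template):

```javascript
// Hypothetical normalizer for incoming webhook fields. The field names
// and the FB-<date>-<random> ID format are assumptions for illustration.
function normalizeFeedback(body, now = new Date()) {
  const datePart = now.toISOString().slice(0, 10).replace(/-/g, '');
  // Pad so the random suffix is always exactly 4 characters.
  const randPart = (Math.random().toString(36).slice(2) + '0000')
    .slice(0, 4)
    .toUpperCase();
  return {
    feedbackId: `FB-${datePart}-${randPart}`, // e.g. FB-20250106-K3QZ
    name: String(body.name || '').trim(),
    email: String(body.email || '').trim().toLowerCase(),
    rating: Math.min(5, Math.max(1, Number(body.rating) || 1)), // clamp to 1..5
    comments: String(body.comments || '').trim(),
  };
}
```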
by vinci-king-01
Medical Research Tracker with Email and Pipedrive

⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow automatically scans authoritative healthcare policy websites for new research, bills, or regulatory changes, stores relevant findings in Pipedrive, and immediately notifies key stakeholders via email. It is ideal for healthcare administrators and policy analysts who need to stay ahead of emerging legislation or guidance that could impact clinical operations, compliance, and strategy.

Pre-conditions/Requirements

Prerequisites
- n8n instance (self-hosted or n8n cloud)
- ScrapeGraphAI community node installed
- Pipedrive account and API token
- SMTP credentials (or native n8n Email credentials) for sending alerts
- List of target URLs or RSS feeds from government or healthcare policy organizations
- Basic familiarity with n8n credential setup

Required Credentials

| Service | Credential Name | Purpose |
|---------|-----------------|---------|
| ScrapeGraphAI | API Key | Perform web scraping |
| Pipedrive | API Token | Create / update deals & notes |
| Email (SMTP/Nodemailer) | SMTP creds | Send alert emails |

Environment Variables (optional)

| Variable | Example Value | Description |
|----------|---------------|-------------|
| N8N_DEFAULT_EMAIL_FROM | policy-bot@yourorg.com | Default sender for the Email Send node |
| POLICY_KEYWORDS | telehealth, Medicare, HIPAA | Comma-separated keywords for filtering |

How it works

Key Steps:
- **Manual Trigger**: Kick-starts the workflow, or schedule it via cron.
- **Set → URL List**: Defines the list of healthcare policy pages or RSS feeds to scrape.
- **Split In Batches**: Iterates through each URL so scraping happens sequentially.
- **ScrapeGraphAI**: Extracts headlines, publication dates, and links.
- **Code (Filter & Normalize)**: Removes duplicates, standardizes the JSON structure, and applies keyword filters.
- **HTTP Request**: Optionally enriches data with summary content using external APIs (e.g., OpenAI, SummarizeBot).
- **If Node**: Checks whether the policy item is new (not already logged in Pipedrive).
- **Pipedrive**: Creates a new deal or note for tracking and collaboration.
- **Email Send**: Sends an alert to compliance or leadership teams with the policy summary.
- **Sticky Note**: Provides inline documentation inside the workflow.

Set up steps

Setup Time: 15–20 minutes

1. Install ScrapeGraphAI: In n8n, go to "Settings → Community Nodes" and install n8n-nodes-scrapegraphai.
2. Create Credentials:
   a. Pipedrive → "API Token" from your Pipedrive settings → add in n8n.
   b. ScrapeGraphAI → obtain an API key → add it as a credential.
   c. Email SMTP → configure sender details in n8n.
3. Import Workflow: Copy the JSON template into n8n ("Import from clipboard").
4. Update URL List: Open the initial Set node and replace the placeholder URLs with the sites you monitor (e.g., cms.gov, nih.gov, who.int, state health departments).
5. Define Keywords (optional):
   a. Open the Code node "Filter & Normalize".
   b. Adjust the const keywords = [...] array to match the topics you care about.
6. Test Run: Trigger manually and verify that:
   - Scraped items appear in the execution logs.
   - New deals/notes show up in Pipedrive.
   - The alert email lands in your inbox.
7. Schedule: Add a Cron node (e.g., every 6 hours) in place of the Manual Trigger for automated execution.
Node Descriptions

Core Workflow Nodes:
- **Manual Trigger** – Launches the workflow on demand.
- **Set – URL List** – Holds an array of target policy URLs/RSS feeds.
- **Split In Batches** – Processes each URL one at a time to avoid rate limiting.
- **ScrapeGraphAI** – Scrapes page content and parses structured data.
- **Code – Filter & Normalize** – Cleans results, removes duplicates, applies the keyword filter.
- **HTTP Request – Summarize** – Calls a summarization API (optional).
- **If – Duplicate Check** – Queries Pipedrive to see if the policy item already exists.
- **Pipedrive (Deal/Note)** – Logs a new deal or adds a note with policy details.
- **Email Send – Alert** – Notifies subscribed stakeholders.
- **Sticky Note** – Embedded instructions inside the canvas.

Data Flow:
Manual Trigger → Set (URLs) → Split In Batches → ScrapeGraphAI → Code (Filter) → If (Duplicate?) → Pipedrive → Email Send

Customization Examples

1. Add Slack notifications. Insert a Slack node after Email Send (the `text` value uses n8n expression syntax):

```json
{
  "node": "Slack",
  "parameters": {
    "channel": "#policy-alerts",
    "text": "=New policy update: {{ $json.title }} - {{ $json.url }}"
  }
}
```

2. Use a different CRM (HubSpot). Replace the Pipedrive node config:

```json
{
  "resource": "deal",
  "operation": "create",
  "title": "={{ $json.title }}",
  "properties": {
    "dealstage": "appointmentscheduled",
    "description": "={{ $json.summary }}"
  }
}
```

Data Output Format

The workflow outputs structured JSON data:

```json
{
  "title": "Telehealth Expansion Act of 2024",
  "date": "2024-05-30",
  "url": "https://www.congress.gov/bill/118th-congress-house-bill/1234",
  "summary": "This bill proposes expanding Medicare reimbursement for telehealth services...",
  "source": "congress.gov",
  "status": "new"
}
```

Troubleshooting

Common Issues
- Empty Scrape Results – Check whether the target site uses JavaScript rendering; ScrapeGraphAI may need a headless-browser option enabled.
- Duplicate Deals in Pipedrive – Ensure the "If Duplicate?" node compares a unique field (e.g., URL or title) before creating a new deal.

Performance Tips
- Limit batch size to avoid API rate limits.
- Cache or store the last scraped timestamp to skip unchanged pages.

Pro Tips:
- Combine this workflow with an n8n "Cron" or "Webhook" trigger for fully automated monitoring.
- Use environment variables for keywords and email recipients to avoid editing nodes each time.
- Leverage Pipedrive's automations to notify additional teams (e.g., legal) when high-priority items are logged.
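The "Filter & Normalize" Code node is only described above; a minimal sketch of the dedupe-and-keyword-filter step it performs (the structure is an assumption; the keyword list mirrors the POLICY_KEYWORDS example):

```javascript
// Hypothetical filter for scraped policy items: drop duplicates by URL
// and keep only items whose title matches one of the keywords.
const keywords = ['telehealth', 'medicare', 'hipaa']; // from POLICY_KEYWORDS

function filterAndNormalize(items, kws = keywords) {
  const seen = new Set();
  const out = [];
  for (const item of items) {
    const url = (item.url || '').trim();
    if (!url || seen.has(url)) continue; // de-duplicate by URL
    seen.add(url);
    const title = (item.title || '').toLowerCase();
    if (!kws.some((k) => title.includes(k))) continue; // keyword filter
    out.push({
      title: item.title.trim(),
      url,
      date: item.date || null,
      status: 'new',
    });
  }
  return out;
}
```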
by Cheng Siong Chin
How It Works
This workflow automates tenant screening by analyzing payment history, credit, and employment data to predict rental risks. Designed for property managers, landlords, and real estate agencies, it solves the challenge of objectively evaluating tenant reliability and preventing payment defaults.

The system runs daily assessments, fetching rent payment history, credit bureau reports, and employment records. An AI agent merges this data, calculates risk scores, and routes alerts based on severity. High-risk tenants trigger immediate email notifications for intervention, medium-risk cases post to Slack for monitoring, and low-risk updates save quietly to databases. Automated collection workflows initiate for high-risk cases.

Setup Steps
- Configure payment history, credit bureau, and employment credentials in the fetch nodes
- Add an OpenAI API key for risk analysis and set Gmail/Slack credentials for alerts
- Customize risk score thresholds and routing rules in the workflow logic

Prerequisites
- Payment system API, credit bureau access, employment verification API

Use Cases
- Rental application screening, existing tenant monitoring

Customization
- Modify risk scoring criteria, adjust alert thresholds

Benefits
- Reduces defaults through early detection, eliminates screening bias
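The severity routing described above can be sketched as a simple threshold function (the 70/40 thresholds are illustrative defaults, not from the template, which lets you customize them):

```javascript
// Hypothetical routing: map an AI risk score (0-100) to an alert channel.
// The threshold values are assumptions for illustration.
function routeByRisk(score, high = 70, medium = 40) {
  if (score >= high) return 'email';   // high risk: immediate email alert
  if (score >= medium) return 'slack'; // medium risk: post for monitoring
  return 'database';                   // low risk: quiet record update
}
```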
by Oneclick AI Squad
This automated n8n workflow processes student applications on a scheduled basis, validates data, updates databases, and sends welcome communications to students and guardians.

Main Components
- **Trigger at Every Day 7 am** – Scheduled trigger that runs the workflow daily
- **Read Student Data** – Reads pending applications from Excel/database
- **Validate Application Data** – Checks data completeness and format
- **Process Application Data** – Processes validated applications
- **Update Student Database** – Updates records in the student database
- **Prepare Welcome Email** – Creates personalized welcome messages
- **Send Email** – Sends welcome emails to students/guardians
- **Success Response** – Confirms successful processing
- **Error Response** – Handles any processing errors

Essential Prerequisites
- Excel file with student applications (student_applications.xlsx)
- Database access for student records
- SMTP server credentials for sending emails
- File storage access for reading application data

Required Excel File Structure (student_applications.xlsx):
Application ID | First Name | Last Name | Email | Phone
Program Interest | Grade Level | School | Guardian Name | Guardian Phone
Application Date | Status | Notes

Expected Input Data Format:

```json
{
  "firstName": "John",
  "lastName": "Doe",
  "email": "john.doe@example.com",
  "phone": "+1234567890",
  "program": "Computer Science",
  "gradeLevel": "10th Grade",
  "school": "City High School",
  "guardianName": "Jane Doe",
  "guardianPhone": "+1234567891"
}
```

Key Features
- ⏰ **Scheduled Processing:** Runs daily at 7 AM automatically
- 📊 **Data Validation:** Ensures application completeness
- 💾 **Database Updates:** Maintains student records
- 📧 **Auto Emails:** Sends welcome messages
- ❌ **Error Handling:** Manages processing failures

Quick Setup
1. Import the workflow JSON into n8n
2. Configure the schedule trigger (default: 7 AM daily)
3. Set the Excel file path in the "Read Student Data" node
4. Configure the database connection in the "Update Student Database" node
5. Add SMTP settings in the "Send Email" node
6. Test with sample data
7. Activate the workflow

Parameters to Configure
- excel_file_path: Path to the student applications file
- database_connection: Student database credentials
- smtp_host: Email server address
- smtp_user: Email username
- smtp_password: Email password
- admin_email: Administrator notification email
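The "Validate Application Data" node's checks aren't shown; a minimal sketch of the completeness and format validation it might perform (field names follow the input example above; the required-field list and regex rules are illustrative assumptions):

```javascript
// Hypothetical validator for one application record. Required fields and
// the email/phone patterns are assumptions, not taken from the template.
function validateApplication(app) {
  const errors = [];
  for (const field of ['firstName', 'lastName', 'email', 'guardianName']) {
    if (!app[field] || !String(app[field]).trim()) errors.push(`missing ${field}`);
  }
  if (app.email && !/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(app.email)) {
    errors.push('invalid email');
  }
  if (app.phone && !/^\+?\d{7,15}$/.test(app.phone)) {
    errors.push('invalid phone');
  }
  return { valid: errors.length === 0, errors };
}
```

Records that fail would be routed to the Error Response branch instead of the database update.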
by Rakin Jakaria
Who this is for
This workflow is for digital marketing agencies or sales teams who want to automatically find business leads based on industry & location, gather their contact details, and send personalized cold emails — all from one form submission.

What this workflow does
This workflow starts every time someone submits the Lead Machine Form. It then:
- **Scrapes business data** (company name, website, phone, address, category) using **Apify** based on business type & location.
- **Extracts the best email address** from each business website using **Google Gemini AI**.
- **Stores valid leads** in **Google Sheets**.
- **Generates cold email content** (subject + body) with AI based on your preferred tone (Friendly, Professional, Simple).
- **Sends the cold email** via Gmail.
- **Updates the sheet** with send status & timestamp.

Setup
To set this workflow up:
1. Form Trigger – Customize the "Lead Machine" form fields if needed (Business Type, Location, Lead Number, Email Style).
2. Apify API – Add your Apify Actor Endpoint URL in the HTTP Request node.
3. Google Gemini – Add credentials for extracting email addresses.
4. Google Sheets – Connect your sheet for storing leads & email status.
5. OpenAI – Add your credentials for cold email generation.
6. Gmail – Connect your Gmail account for sending cold emails.

How to customize this workflow to your needs
- Change the AI email prompt to reflect your brand's voice and offer.
- Add filters to only target leads that meet specific criteria (e.g., the website must exist, the email must be verified).
- Modify the Google Sheets structure to track extra info like "Follow-up Date" or "Lead Source".
- Switch Gmail to another email provider if preferred.
by Carl Fung
✨ Intro
This workflow shows how to go beyond a "plain" AI chatbot by:
- 🧠 **Adding a Personality Layer** — Link an extra LLM to inject a custom tone and style. Here, it's Nova, a sassy, high-fashion assistant. You can swap in any personality without changing the main logic.
- 🎨 **Custom Styling with CSS** — Easily restyle the chatbot to match your brand or project theme.

Together, these make your bot smart, stylish, and uniquely yours.

⚙️ How it Works
- 📥 **Route Input**: The chat trigger sends messages to a Switch. If a Telegram video note exists → runs the audio path. Otherwise → runs the text path.
- 🎤 **Audio Path**: Telegram Get a File → OpenAI Speech-to-Text → pass the transcript to the agent.
- 💬 **Text Path**: Chat text is normalized and sent to the agent.
- 🛠 **Agent Brain**: Uses tools like Gmail 📧, Google Calendar 📅, Google Drive 📂, Airtable 📋, SerpAPI 🌐, Wikipedia 📚, Hacker News 📰, and Calculator ➗.
- 🧾 **Memory**: Keeps the last 20 messages for context-aware replies.
- 💅 **Optional Personality Polish**: An LLM Chain adds a witty or cheeky tone on top of the agent's response.

🛠 Setup Steps
- ⏱ **Time Required**: ~10–15 minutes (+5 minutes for each Google/Airtable connection).
- 🔑 **Connect Credentials**: OpenAI (and/or Anthropic), Telegram Bot, Gmail, Google Calendar, Google Drive, Airtable, SerpAPI.
- 📌 **Configure IDs**: Set the Airtable base/table, set the Calendar email, and adjust Drive search query defaults if needed.
- 🎙 **Voice Optional**: Disable the Telegram + Transcribe nodes if you only want text chat.
- 🎭 **Choose Tone**: Edit the Chat Trigger's welcome text/CSS for a custom look, or disable the persona chain for a neutral voice.
- 🚀 **Publish**: Activate the workflow and share the chat URL.

💡 Detailed behavior notes are available as sticky notes inside the workflow.
by Ishita Virmani
This N8N template helps keep track of multiple school websites for admission updates and sends an email notification.

Good To Know
This template uses only free-tier tools: Gemini for the LLM and email for alerting.

How it works
For each school website provided:
- Get & clean the content through the HTTP Request node.
- The Gemini model takes the HTML content and a defined prompt that instructs it how to identify whether Pre-nursery admissions for the year 2026-27 have been announced yet.
- If the LLM response confirms the announcement, trigger an email to the configured address.

Features
- Scheduled daily checks
- HTTP scraping
- Google Gemini text extraction for Pre-nursery admissions
- Email alerts

How to use
1. Import the workflow.
2. Provide an existing or new Gemini API key within the "Are admissions open" node.
3. Set up SMTP account credentials within the "Send Email" node, along with the From-Email and To-Email.
4. Finally, update your list of schools and their admission URLs within the "Shortlisted Schools" node.

Customizing the workflow
- It can be used for tracking school admissions for any class, including Pre-Nursery, Bal-vatika, 1st, etc., by modifying the prompt.
- It can be used for tracking any school that publishes admission details on its website, as long as they can be extracted via the HTTP Request node.
by Hyrum Hurst
Who this is for
Property management teams handling multiple properties with high package/visitor traffic who want automated tenant and management notifications.

What this workflow does
Automatically classifies package and visitor events, sends notifications to tenants, alerts property managers, and logs activity for accountability.

How it works
1. The package/visitor system triggers the workflow.
2. AI classifies urgency and type.
3. Notifications are sent via Email, SMS, and Slack.
4. Google Sheets logs all events.
5. Optional AI follow-up suggestions for unclaimed packages or missed visitors.

How to set up
1. Connect the Webhook, Slack, Email, SMS, and AI nodes.
2. Test routing and logging.
3. Adjust AI prompts for local building protocols.

Requirements
- AI node
- Webhook from the package/visitor system
- Slack, Email, and SMS credentials
- Google Sheets

Built by QuarterSmart. Created by Hyrum Hurst.
by Davide
This workflow automates the process of generating advertising (ADV) images from multiple reference images with the Seedream v4 AI model and publishing them directly to social media (Instagram and Facebook via Upload-Post). The process is triggered by a user submitting a web form with a text prompt and up to 6 reference images.

Key Advantages
- ✅ **Automated Image Creation** – Generates high-quality, consistent visuals from multiple references without manual editing.
- ✅ **Seamless Social Media Publishing** – Automatically posts to Instagram and Facebook with minimal effort.
- ✅ **SEO-Optimized Titles** – Ensures your posts get better reach with AI-generated, keyword-friendly titles.
- ✅ **Scalable Workflow** – Can be triggered manually, on a schedule, or via form submissions.
- ✅ **Time-Saving** – Reduces manual steps from design to publishing, enabling faster content production.
- ✅ **Multi-Platform Support** – Easily extendable to other platforms (TikTok, LinkedIn, etc.) with the Upload-Post API.

How It Works
1. **Form Trigger**: A user fills out a form with a "Prompt" (text description) and a list of "Reference images" (comma-separated URLs).
2. **Data Processing**: The workflow converts the submitted image URL string into a proper array for the AI API.
3. **AI Image Generation**: The workflow sends the prompt and image URLs to the fal.ai API (specifically, the ByteDance Seedream model) to generate a new, consistent image.
4. **Status Polling**: It periodically checks the status of the AI job until the image generation is COMPLETED.
5. **Result Retrieval**: Once complete, it fetches the URL of the generated image and downloads the image file itself.
6. **SEO Title Generation**: The original user prompt is sent to OpenAI's GPT-4o-mini model to generate an optimized, engaging social media title.
7. **Cloud Backup**: The generated image is uploaded to a specified Google Drive folder for storage.
8. **Social Media Posting**: Finally, the workflow posts the downloaded image file to both Instagram and Facebook via the Upload-Post.com API, using the AI-generated title.

Set Up Steps
To make this workflow functional, you need to configure several third-party services and their corresponding credentials within n8n.

1. **Obtain a fal.ai API Key**: Create an account at fal.ai and locate your API key in your account settings. In the "Create Video" and "Get status" nodes, edit the HTTP Header Auth credentials: set the Header Name to Authorization and the Value to Key YOUR_FAL_AI_API_KEY.
2. **Configure the Upload-Post.com API**: Create an account at Upload-Post.com and get your API key. Create a profile within the Upload-Post app (e.g., test1); this profile manages your social account connections. In both the "Post to Instagram" and "Post to Facebook" nodes, edit the HTTP Header Auth credentials: set the Header Name to Authorization and the Value to Apikey YOUR_UPLOAD_POST_API_KEY. Crucially, in the same nodes, find the user parameter in the body and replace the placeholder YOUR_USERNAME with the profile name you created (e.g., test1).
3. **Configure OpenAI/OpenRouter (optional, for title generation)**: The "Generate title" node uses an OpenAI-compatible API; the provided example uses OpenRouter. Ensure you have valid credentials (e.g., for OpenRouter or directly for OpenAI) configured in n8n and selected in this node.
4. **Configure Google Drive (optional, for backup)**: The "Upload Image" node requires Google OAuth credentials. Set up a Google Cloud project, enable the Drive API, and create OAuth 2.0 credentials in the n8n settings. Authenticate and select the desired destination folder in the node's parameters.

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
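The "Data Processing" step (turning the comma-separated "Reference images" field into an array for the fal.ai request) could look like the following in a Code node; the field name and cleanup rules are illustrative assumptions:

```javascript
// Hypothetical Code-node step: split the form's comma-separated URL
// string into a clean array, capped at the 6 reference images the
// workflow supports.
function toImageUrlArray(referenceImages, max = 6) {
  return String(referenceImages || '')
    .split(',')
    .map((u) => u.trim())     // drop surrounding whitespace
    .filter((u) => u.length)  // drop empty entries from stray commas
    .slice(0, max);           // enforce the reference-image limit
}
```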
by oka hironobu
TimeRex AI-Powered Booking Automation

Description (for n8n template submission)
Transform your TimeRex booking management with AI-powered automation. This workflow automatically processes bookings, enriches data with AI insights, and keeps your team informed via Slack, all in real time.

What This Workflow Does

🤖 AI-Powered Intelligence
- **Smart Company Detection**: Automatically identifies company names from guest email domains
- **Booking Categorization**: Uses Google Gemini to classify bookings (Sales/Support/Interview/Partnership/Media)
- **Meeting Brief Generation**: AI creates actionable preparation notes for hosts before each meeting

⚡ Automated Processing
- Receives webhooks from TimeRex for confirmed and cancelled bookings
- Validates requests with security token verification
- Logs enriched booking data to Google Sheets
- Sends detailed Slack notifications with AI-generated insights

🛡️ Security & Reliability
- Token-based webhook authentication
- Security alerts for unauthorized access attempts
- Automatic cancellation handling with data cleanup

Use Cases
- **Sales Teams**: Automatically categorize leads and prepare meeting briefs
- **Recruitment**: Streamline interview scheduling with AI-powered candidate insights
- **Customer Success**: Track support meetings and prepare context for calls
- **Media Relations**: Manage press interviews with automated briefings

How It Works
1. TimeRex sends a webhook when a booking is confirmed or cancelled.
2. The security token is verified (failed attempts trigger Slack alerts).
3. For confirmed bookings:
   - The media source is detected from the calendar name
   - The company name is extracted from the email domain
   - AI categorizes the booking purpose
   - AI generates a meeting preparation brief
   - Enriched data is saved to Google Sheets
   - A Slack notification is sent with AI insights
4. For cancellations:
   - The booking is found by Event ID
   - The row is deleted from Google Sheets
   - A cancellation alert is sent to Slack

Setup Instructions

Webhook Configuration
- Copy the webhook URL from the "TimeRex Webhook" node
- Paste it in TimeRex Settings → Webhook

Security Token
- Copy your TimeRex security token
- Update the Verify Security Token node with your token

Google Sheets
- Create a spreadsheet with these columns: event_id, booking_date, guest_name, guest_email, calendar_name, meeting_url, host_name, media_source, company_name, booking_category, ai_meeting_brief, created_at
- Update all Google Sheets nodes with your Sheet ID

AI Credentials
- Connect your Google Gemini API credentials to both AI model nodes

Slack
- Connect your Slack account
- Select your notification channel in all Slack nodes

Activate
- Turn on the workflow and start receiving AI-enhanced booking notifications!

Requirements
- TimeRex account with webhook access
- Google Cloud account (for Sheets & Gemini API)
- Slack workspace
- n8n instance (self-hosted or cloud)

Customization Tips
- Modify the Filter by Calendar Type node to match your calendar naming convention
- Adjust AI prompts in the LLM Chain nodes for different categorization or brief styles
- Add more media sources to the Media Master sheet for accurate source tracking
- Extend the workflow with email confirmations or calendar event creation

Short Description (100 characters max)
Automate TimeRex bookings with AI-powered categorization, meeting briefs, and Slack notifications.

Categories
Sales, Productivity, AI, Scheduling

Tags
TimeRex, Booking, AI, Google Gemini, Slack, Google Sheets, Automation, Meeting Management, LLM, Scheduling
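The "Smart Company Detection" step (deriving a company name from the guest's email domain) could be sketched like this; the free-mail list and the formatting rule are assumptions for illustration:

```javascript
// Hypothetical extractor: derive a company name from an email domain,
// skipping common free-mail providers. List and formatting are illustrative.
const FREE_MAIL = new Set(['gmail.com', 'yahoo.com', 'outlook.com', 'hotmail.com']);

function companyFromEmail(email) {
  const domain = String(email).split('@')[1]?.toLowerCase();
  if (!domain || FREE_MAIL.has(domain)) return null; // personal / unknown address
  const name = domain.split('.')[0]; // "acme" from "acme.co.jp"
  return name.charAt(0).toUpperCase() + name.slice(1);
}
```

A real deployment would likely use a larger provider list, or a company-enrichment API, instead of this heuristic.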
by Frederik Duchi
This n8n template demonstrates how to automatically create tasks (or, more generally, records) in Baserow based on template or blueprint tables. The first blueprint table is the master table that holds the general information about the template, for example a standard procedure to handle incidents. The second table is the details table that holds multiple records for the template. Each record in that table is a specific task that needs to be assigned to someone with a certain deadline. This makes it easy to streamline task creation for recurring processes.

Use cases are many:
- Project management (generate tasks for employees based on a project template)
- HR & onboarding (generate tasks for employee onboarding based on a template)
- Operations (create checklists for maintenance, audits, or recurring procedures)

Good to know
- The Baserow template for handling Standard Operating Procedures works perfectly as a base schema to try out this workflow.
- Authentication is done through a database token. Check the documentation on how to create such a token.
- Tasks are inserted using the HTTP Request node instead of a dedicated Baserow node. This is to support batch import instead of importing records one by one.

Requirements
- Baserow account (cloud or self-hosted)
- A Baserow database with at least the following tables:
  - Assignee / employee table. This is required to be able to assign someone to a task.
  - Master table with procedure or template information. This is required to be able to select a certain template.
  - Details table with all the steps associated with a procedure or template. This is required to convert each step into a specific task. A step must have a field Days to complete with the number of days to complete the step. This field will be used to calculate the deadline.
  - Tasks table that contains the actual tasks with an assignee and deadline.

How it works
- **Trigger task creation (webhook)**: The automation starts when the webhook is triggered through a POST request. It should contain an assignee, template, date, and note in the body of the request. It will send a success or failure response once all steps are completed.
- **Configure settings and ids**: Stores the ids of the involved Baserow database and tables, together with the API credentials and the data from the webhook.
- **Get all template steps**: Gets all the steps from the template Details table that are associated with the id of the Master template table. For example: the master template can have a record about handling customer complaints; the details table contains all the steps to handle this procedure.
- **Calculate deadlines for each step**: Prepares the input of the tasks by using the same property names as the fields of the Tasks table. Adjust these names, and add or remove fields, if this is required for your database structure. The deadline of each step is calculated by adding the number of days a step can take to the schedule date, using the Days to complete field in the template Details table. For example, if the schedule_date property in the webhook is set to 2025-10-01 and Days to complete for the step is 3, then the deadline will be 2025-10-04.
- **Avoid scheduling during the weekend**: It might happen that the calculated deadline falls on a Saturday or Sunday. This Code node moves those dates to the first Monday to avoid scheduling during the weekend.
- **Aggregate tasks for insert**: Aggregates the data from the previous nodes as an array in a property named items. This matches the shape the Baserow API expects for inserting new records in batch.
- **Generate tasks in batch**: Calls the API endpoint /api/database/rows/table/{table_id}/batch/ to insert multiple records at once in the tasks table. Check the Baserow API documentation for further details.
- **Success / Error response**: Sends a simple text response to indicate the success or failure of the record creation. This is to offer feedback when triggering the automation from a Baserow application, but it can be replaced with a JSON response.

How to use
Call the Trigger task creation node with the required parameters through a POST request. This can be done from any web application. For example: the application builder in Baserow supports an action to send an HTTP request. The Procedure details page in the Standard Operating Procedures template demonstrates this action.

The following information is required in the body of the request to create the actual tasks:

```json
{
  "assignee_id": "integer referring to the id of the assignee in the database",
  "template_id": "integer referring to the id of the template or procedure in the master table",
  "schedule_date": "the date the tasks need to start scheduling",
  "note": "text with an optional note about the tasks"
}
```

- Set the corresponding ids in the Configure settings and ids node.
- Check the names of the properties in the Calculate deadlines for each step node. Make sure the names of those properties match the field names of your Tasks table.
- You can replace the text message in the Success response and Failure response with a more structured format if this is necessary in your application.

Customising this workflow
- Add support for public holidays (e.g., using an external calendar API).
- Modify the task assignment logic (e.g., pre-assign tasks in the details table).
- Combine with notifications (email, Slack, etc.) to alert employees when new tasks are generated.
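The deadline calculation and the weekend shift described above can be sketched as a single Code-node helper (the property names follow the schedule_date example; the implementation is an illustrative assumption, and it combines the two nodes: a deadline that lands on Saturday, like 2025-10-01 + 3 days = Saturday 2025-10-04, is moved to the following Monday):

```javascript
// Hypothetical deadline helper: schedule_date + "Days to complete",
// shifted to the following Monday if the result lands on a weekend.
function stepDeadline(scheduleDate, daysToComplete) {
  const d = new Date(scheduleDate + 'T00:00:00Z');
  d.setUTCDate(d.getUTCDate() + daysToComplete);
  const day = d.getUTCDay(); // 0 = Sunday, 6 = Saturday
  if (day === 6) d.setUTCDate(d.getUTCDate() + 2); // Sat -> Mon
  if (day === 0) d.setUTCDate(d.getUTCDate() + 1); // Sun -> Mon
  return d.toISOString().slice(0, 10);
}
```

Each item produced this way is then collected into the `items` array for the Baserow batch endpoint.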