by Ahmed Saadawi
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

🧠 Vtiger CRM – Auto-Answer FAQs with DeepSeek AI

Description: This workflow automates the process of answering FAQ drafts in Vtiger CRM using DeepSeek LLM via LangChain. It's perfect for teams who want to accelerate knowledge base creation, improve support response consistency, or reduce the manual effort of writing FAQ content.

Every 1 minute, this workflow:
📥 Retrieves the most recent FAQ record marked as Draft in Vtiger CRM
🧠 Sends the question to a LangChain agent powered by DeepSeek AI
📝 Receives a plain-text answer
📤 Updates the original FAQ with the generated answer and changes its status to Published

⚙️ How It Works
- **Trigger:** Scheduled to run every 1 minute
- **Query:** Pulls the latest FAQ from Vtiger where faqstatus = 'Draft' (see the filtering sketch at the end of this section)
- **AI Agent:** Uses LangChain + DeepSeek to generate a natural-language answer
- **Memory Buffer:** Keeps context using LangChain memory
- **Update:** Pushes the answer back to Vtiger and marks it as Published

🛠️ Setup Instructions
1. Connect credentials for:
   - Vtiger CRM API
   - DeepSeek API
2. Ensure your Vtiger CRM has a Faq module with the fields: question, faq_answer, faqstatus
3. Install the required community node:
   - Go to Settings → Community Nodes
   - Click Install Node and enter: n8n-nodes-vtiger-crm
   - Restart your instance when prompted.
4. Optionally customize the schedule or field names as needed.

👤 Who Is This For?
- Customer support teams building a knowledge base
- Businesses using Vtiger as a CRM or internal helpdesk
- Teams looking to automate repetitive content creation using LLMs

🔐 Credentials Required
✅ Vtiger CRM API credentials
✅ DeepSeek AI API key

✅ Highlights
- Fully automated LLM-powered FAQ generation
- Uses a custom community node for Vtiger support
- Lightweight and runs on a short interval (1 min)
- Includes a sticky note for clarity and onboarding
- Clean conditional logic and memory context built-in

🏷 Tags
vtiger, crm, faq automation, ai automation, deepseek, langchain, llm, open source crm, faq generation, customer support, n8n, n8n community nodes, workflow automation, ai generated answers, vtiger integration, deepseek ai, langchain integration
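For illustration, the "latest Draft FAQ" selection step could look roughly like the Code-node sketch below. This is a hedged example, assuming the Vtiger node returns FAQ records with `faqstatus` and `modifiedtime` fields — the field names are assumptions, and in the packaged template the community node performs this query for you:

```javascript
// Hypothetical n8n Code node: pick the most recent FAQ still marked as Draft.
// Assumes incoming items carry `faqstatus` and `modifiedtime` fields as
// returned by the Vtiger CRM node; adjust names to your module's schema.
const drafts = items.filter(item => item.json.faqstatus === 'Draft');

if (drafts.length === 0) {
  return []; // nothing to answer this run
}

// Sort newest first by modification time and keep a single record,
// so the downstream AI agent handles one question per execution.
drafts.sort((a, b) =>
  new Date(b.json.modifiedtime) - new Date(a.json.modifiedtime)
);

return [drafts[0]];
```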
by Kees Bosch - Browserflow
Auto find & invite LinkedIn Leads

This n8n template automates LinkedIn lead generation by scraping profiles, filtering out existing connections, and sending connection requests — all in a controlled, looped workflow. Ideal for outreach campaigns, recruitment, or lead gen efforts.

⚠️ Disclaimer – Community Node Notice
This template uses a verified community node available inside the n8n cloud environment. To use it, go to "Nodes" → search for: Browserflow for Linkedin …and click Install. It's officially verified and accessible directly from n8n cloud. If you wish to run this template locally, go to Settings, click Community Nodes, and search for n8n-nodes-browserflow. After installing, you can start using the actions in this node.

🛠️ How to Use
1. Trigger: Manual Start – Initiates the workflow manually via the "Test workflow" button, giving you full control.
2. Scrape LinkedIn Profiles – Uses the Browserflow automation to extract profile links from a LinkedIn search or keyword query.
3. Split Out Results – Converts the list of profiles into individual items for single-profile processing.
4. Loop Through Each Profile – Ensures each LinkedIn profile is handled one at a time, avoiding simultaneous actions.
5. Check Existing Connection – Verifies whether you're already connected with the lead on LinkedIn.
6. Conditional Logic – ✅ Already Connected → skip to the next profile; ❌ Not Connected → continue to the next step.
7. Send Connection Invite – Sends a LinkedIn connection request, optionally with a personalized message (see the sketch after this section).

📦 Requirements
- n8n (cloud or self-hosted)
- Installed community node: Browserflow for Linkedin
- LinkedIn account
- Valid Browserflow account (you can set up a free 7-day trial at https://browserflow.io)

⚙️ Setup Instructions
1. Install the Browserflow community node – Search "Browserflow for Linkedin" > Install.
2. Get your API key – Get your API key at https://browserflow.io
3. Set up your Browserflow account – After registering, set up Browserflow and connect it with LinkedIn using the wizard at https://browserflow.io
4. Connect with Browserflow by creating a credential – Click on the Browserflow actions to set up a connection with Browserflow by adding your API key to a credential.

🧩 Customization Tips
- Targeting: Adjust the Browserflow actions to scrape specific roles, industries, or locations.
- Messaging: You can add a message to the connection invite, but keep in mind that LinkedIn limits the number of invitation messages that can be sent each month. Use variables in the message for personalization (e.g., {firstName}).
- Trigger: Replace the manual trigger with a Cron node for scheduled outreach.
- Integration: Combine with CRM tools (e.g., HubSpot, Notion, Airtable) for syncing leads, or integrate with AI Agents.
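As an illustration of the message-personalization tip, here is a minimal Code-node sketch that fills a {firstName} placeholder and trims the note to a conservative length. The 200-character cap and the `firstName` field name are assumptions for illustration; check LinkedIn's current invite-note limits and the actual fields your Browserflow scrape returns:

```javascript
// Hypothetical n8n Code node: build a personalized connection note per profile.
// Assumes each incoming item carries a `firstName` field from the scraped profile.
const TEMPLATE = 'Hi {firstName}, I came across your profile and would love to connect!';
const MAX_NOTE_LENGTH = 200; // assumed cap; LinkedIn limits invite note length

return items.map(item => {
  const firstName = item.json.firstName || 'there';
  const note = TEMPLATE.replace('{firstName}', firstName).slice(0, MAX_NOTE_LENGTH);
  return { json: { ...item.json, inviteNote: note } };
});
```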
by Miquel Colomer
This n8n workflow template automates the process of collecting and delivering the "Top Deals of the Day" from MediaMarkt, tailored to user preferences. By combining user-submitted forms, Bright Data web scraping, GPT-4o-mini deal generation, and email delivery, this workflow sends personalized product recommendations straight to a user's inbox.

> ⚠️ Note: This workflow uses community nodes (Bright Data and Document Generator) which only work on *self-hosted n8n instances*.

🚀 What It Does
- Collects user preferences via a form (categories + email)
- Scrapes MediaMarkt's deals page using Bright Data
- Uses GPT-4o-mini (OpenAI) to recommend top deals
- Generates a structured HTML email using a template
- Sends the personalized deals directly via email

🧩 Community Node Integration
We created and used the following community nodes:
- **Bright Data** – To scrape MediaMarkt deals using proxy-based scraping
- **Document Generator** – To generate a templated HTML document from deal data
These nodes are not available in n8n Cloud and require self-hosted n8n.

🛠️ Step-by-Step Setup
1. Install Community Nodes – Make sure you're on a self-hosted n8n instance. Install: n8n-nodes-brightdata and n8n-nodes-document-generator.
2. Configure Credentials – Bright Data API Key (proxy + scraping setup), OpenAI API Key (GPT-4o-mini access), and SMTP credentials for sending emails.
3. Customize the Form – Adapt the form node to collect desired categories and email addresses. Typical categories include appliances, phones, laptops, etc.
4. Design Your HTML Template – In the Document Generator node, you can tweak the HTML/CSS to change how deals appear in the final email.
5. Test the Workflow – Submit the form with test data and check that the entire flow—from scraping to email—executes as expected.

🧠 How It Works: Workflow Overview
1. User Interaction via Form – Users select product categories and enter their email. This triggers the workflow.
2. Data Extraction via Bright Data – Bright Data scrapes the MediaMarkt offers page and returns HTML content.
3. HTML Parsing – Key elements like product names, prices, and links are extracted for processing (a minimal extraction sketch follows this section).
4. GPT-4o-mini Recommendation Generation – The extracted data is sent to OpenAI (GPT-4o-mini), which filters, ranks, and enhances deals based on the user's preferences.
5. Data Structuring & Split – The result is split into individual deal items to be formatted.
6. HTML Document Creation – Document Generator populates a clean HTML template with the top recommended deals.
7. Email Delivery – The final document is emailed via SMTP to the user with a friendly message.

📨 Final Output
Users receive a custom HTML email featuring a curated list of top MediaMarkt deals based on their selected categories.

🔐 Credentials Used
- **Bright Data API** – Web scraping with proxy support
- **OpenAI API** – Generating personalized recommendations
- **SMTP** – Sending personalized deal emails

✨ Customization Tips
- **Change the Data Source**: You can adapt this to scrape other e-commerce sites.
- **Update the Email Template**: Make it match your branding or include images.
- **Extend the Form**: Add preferences like price range or specific brands.
- **Add Scheduling**: Use Cron to run the workflow daily or weekly.

❓ Questions?
Template and node created by Miquel Colomer and n8nhackers.com. Need help customizing or deploying? Contact us for consulting and support.
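To make the HTML-parsing step concrete, here is a minimal Code-node sketch that pulls product names, prices, and links out of the scraped HTML with a regular expression. The `html` input field and the card pattern are placeholders, not MediaMarkt's real markup; in practice you would inspect the page (or use n8n's HTML Extract node) and adjust the selectors:

```javascript
// Hypothetical n8n Code node: extract deal data from scraped HTML.
// The regex below is a placeholder — inspect the real page structure
// and adjust it (or use the HTML Extract node) before relying on this.
const html = items[0].json.html || '';

const deals = [];
// Assumed pattern: each product card is an <a> with a title and a price span.
const cardRegex = /<a[^>]+href="([^"]+)"[^>]*>[\s\S]*?class="title"[^>]*>([^<]+)<[\s\S]*?class="price"[^>]*>([^<]+)</g;

let match;
while ((match = cardRegex.exec(html)) !== null) {
  deals.push({
    link: match[1],
    name: match[2].trim(),
    price: match[3].trim(),
  });
}

return deals.map(deal => ({ json: deal }));
```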
by The O Suite
This n8n workflow automates website security audits. It combines direct website scanning, threat intelligence from AlienVault OTX, and advanced analysis from an OpenAI large language model (LLM) to generate and email a comprehensive security report.

How it Works (Workflow Flow)
1. Input: A user provides a website URL via a simple web form.
2. Data Collection: An HTTP Request node visits the provided URL to gather initial data (status code, headers). An AlienVault HTTP Request node queries AlienVault OTX for known threats associated with the website's hostname.
3. Data Preparation (Prepare Data for AI): A custom code node consolidates the collected website data and AlienVault intelligence, performing initial checks for common issues (e.g., error codes, missing security headers, AlienVault warnings). A minimal sketch of these checks follows this section.
4. AI Analysis (Security Configuration Audit): The prepared data is sent to an OpenAI Chat Model, which acts as a cybersecurity expert. The AI analyzes the data to identify vulnerabilities, explain their impact, suggest exploitation methods, and outline mitigation steps.
5. Report Formatting (Format Report for Email): Another custom code node takes the AI's plain-text report and converts it into a structured HTML format suitable for email.
6. Delivery (Send Security Report): The final HTML report is sent via Gmail to a specified email address.

Setup Steps
To use this workflow, you'll need an n8n instance and the following credentials:
1. n8n Instance: Ensure your n8n environment is running.
2. OpenAI API Key: Generate a key from OpenAI. Add an "OpenAI API" credential in n8n (e.g., "OpenAI account").
3. AlienVault OTX API Key: Obtain a key from your AlienVault OTX profile. Add an "AlienVault OTX API" credential in n8n (e.g., "AlienVault account").
4. Gmail Account: Set up a "Gmail OAuth2" credential in n8n for sending emails (recommended for security; involves Google Cloud setup).
5. Import Workflow: Copy the workflow's JSON code. In n8n, import the workflow via "Workflows" > "New" > "Import from JSON".
6. Configure Recipient: In the "Send Security Report" node, specify the email address where reports should be sent.
7. Activate: Enable the workflow to start processing submissions.

Once activated, access the "On form submission" webhook URL to input a URL and trigger an audit.
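For reference, the "initial checks" performed in the Prepare Data for AI step could look roughly like this in an n8n Code node — a hedged sketch, assuming the HTTP Request node's status code and headers are available on the incoming item (the field names below are assumptions, not the template's exact ones):

```javascript
// Hypothetical n8n Code node: flag obvious issues before handing data to the AI.
const response = items[0].json;
const status = response.statusCode ?? response.status; // assumed field names
const headers = response.headers ?? {};                 // assumed field names
const findings = [];

// HTTP-level problems
if (status >= 400) {
  findings.push(`Site returned an error status code: ${status}`);
}

// Commonly recommended security headers (compared case-insensitively)
const expectedHeaders = [
  'strict-transport-security',
  'content-security-policy',
  'x-frame-options',
  'x-content-type-options',
];
const present = Object.keys(headers).map(h => h.toLowerCase());
for (const header of expectedHeaders) {
  if (!present.includes(header)) {
    findings.push(`Missing security header: ${header}`);
  }
}

return [{ json: { status, findings } }];
```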
by Hostinger
Quickly transform any LinkedIn profile URL into a concise, AI‑generated professional summary — perfect for recruiters, sales teams, and hiring managers who need instant insights into prospects or candidates without manual research.

How it works
1. The workflow polls a Google Sheet for new or updated rows containing LinkedIn profile URLs.
2. For each URL, the Real‑Time LinkedIn Scraper API (via RapidAPI) pulls the experience and education sections.
3. The extracted profile data is sent to OpenAI's GPT model, which generates a clean, structured summary highlighting key strengths, career trajectory, and differentiators (a prompt-building sketch follows this section).
4. The generated summary is written back into a new column in the same row of your Google Sheet for easy review and sharing.

Set up steps
1. Connect your Google account and select the spreadsheet + worksheet containing your list of LinkedIn URLs.
2. Sign up for the Real‑Time LinkedIn Scraper API on RapidAPI, copy your API key, and add it to the workflow's HTTP Request node.
3. Insert your OpenAI API key credentials.
4. Ensure your Google Sheet has one column for "linkedin_url" and create two empty columns named "full_name" and "summary" (or customize them based on your needs).
5. Run a single row through the workflow to verify scraping accuracy and summary formatting, then turn on the workflow for continuous automation.

With this template, eliminate hours of manual profile review — instantly gain actionable insights and focus on what really matters: building relationships and closing deals.
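As an illustration of the step that prepares scraped data for GPT, here is a minimal Code-node sketch that flattens the experience and education sections into a prompt. The field names (`positions`, `schools`, etc.) are assumptions about the scraper's response shape rather than the API's documented schema — map them to whatever your RapidAPI response actually contains:

```javascript
// Hypothetical n8n Code node: turn scraped profile data into a GPT prompt.
// Field names below are illustrative — adjust them to the scraper's real output.
const profile = items[0].json;

const experience = (profile.positions || [])
  .map(p => `- ${p.title} at ${p.companyName} (${p.startDate || '?'} – ${p.endDate || 'present'})`)
  .join('\n');

const education = (profile.schools || [])
  .map(s => `- ${s.degree || ''} ${s.fieldOfStudy || ''}, ${s.schoolName}`.trim())
  .join('\n');

const prompt = [
  'Summarize this professional profile for a recruiter in 3-4 sentences.',
  'Highlight key strengths, career trajectory, and differentiators.',
  '',
  `Experience:\n${experience || 'n/a'}`,
  '',
  `Education:\n${education || 'n/a'}`,
].join('\n');

return [{ json: { full_name: profile.fullName || '', prompt } }];
```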
by bangank36
Overview
This workflow retrieves all blog and event collection items from a Squarespace site and saves them into a Google Sheets spreadsheet. It uses pagination to fetch 20 items per request, ensuring all content is collected efficiently.

How It Works
1. The workflow queries your Squarespace blog and event collections.
2. It fetches data in paginated batches (20 items per page) — see the pagination sketch after this section.
3. The retrieved data is formatted and inserted into Google Sheets.
4. The workflow runs on demand or on a schedule, ensuring your data stays up to date.

Requirements
Credentials
To use this template, you need:
- Your Squarespace collection URL
- Google Sheets API credentials

Google Sheets Setup
Use this sample Google Sheets template to get started quickly.

Who Is This For?
This template is designed for:
- Bloggers looking to manage and analyze content externally.
- Businesses and marketers tracking content performance.
- Anyone who needs an automated way to extract Squarespace blog and event data.

Explore More Templates
Check out my other n8n templates: 👉 n8n.io/creators/bangank36
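To show the shape of the pagination loop, here is a hedged Code-node sketch that keeps requesting a collection's JSON feed until no more pages remain. It assumes the Squarespace `?format=json` response exposes an `items` array plus a `pagination` object with `nextPage`/`nextPageOffset` fields, and that `this.helpers.httpRequest` is available in your Code node version — verify both against your own site and n8n instance before reusing it:

```javascript
// Hypothetical n8n Code node: fetch a Squarespace collection page by page.
// The collection URL and pagination field names are assumptions — confirm
// against your site's actual ?format=json response.
const collectionUrl = 'https://example.squarespace.com/blog'; // placeholder URL
const allItems = [];
let offset = null;

do {
  const url = offset
    ? `${collectionUrl}?format=json&offset=${offset}`
    : `${collectionUrl}?format=json`;

  const page = await this.helpers.httpRequest({ url, json: true });

  allItems.push(...(page.items || []));
  offset = page.pagination && page.pagination.nextPage
    ? page.pagination.nextPageOffset
    : null;
} while (offset);

return allItems.map(item => ({ json: item }));
```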
by Robert Breen
This n8n training workflow demonstrates how to connect a sub-workflow as a tool to an AI Agent. In this example, the main workflow is a Website Chatbot that engages visitors, collects contact information, and sends that data to a CRM process. The CRM process itself is a separate sub-workflow, connected to the agent as a tool via the Tool Workflow node.

Step-by-Step Setup Instructions

1. Create the Sub-Workflow (CRM Tool)
This sub-workflow will be triggered by the AI agent to process collected information. It will:
- Receive inputs (email, description) from the main chatbot workflow.
- Format the data into a structured JSON format.
- Append the data to a Google Sheet (acting as the CRM database).
- Send a confirmation message back to the main workflow.

Steps inside the sub-workflow:
- **When Executed by Another Workflow** – Triggered by the main workflow's tool node.
- **Convert Conversation (Agent)** – Uses OpenAI to extract and format the input into a JSON structure:

    {
      "email": "jane.doe@example.com",
      "description": "Wants help automating lead intake and sending Slack notifications."
    }

- Structured Output Parser – Ensures the extracted data matches the expected JSON schema (a sample schema appears after this section).
- Append row in sheet (Google Sheets) – Adds the new lead data to your CRM sheet.
- Code Node – Returns a simple text confirmation like "Thanks for the info, we will be in touch soon".

Required setup for Google Sheets:
- Enable the Google Sheets API and connect your Google account in n8n.
- Create a sheet with at least the columns email and description.
- Use the sheet's Document ID and tab name in the Google Sheets node.

2. Create the Main Workflow (Website Chatbot)
This workflow acts as the main AI Agent handling incoming chat messages.

Steps in the main workflow:
- When chat message received – Starts the workflow whenever a visitor sends a message via your chatbot integration.
- Website Chatbot (Agent Node) – Configured with a System Message that:
  - Briefly explains your services.
  - Asks the visitor what processes they want to automate.
  - Requests their name and email.
  - Sends collected data to the CRM tool once email and description are available.
- OpenAI Chat Model – Connects to the AI agent as its language model.
- Simple Memory – Stores short-term context for the ongoing chat.
- CRM Tool (Tool Workflow Node) – Points to the sub-workflow created in Step 1, allowing the chatbot to trigger it directly.

3. Connecting the Sub-Workflow to the AI Agent
- Add a Tool Workflow node to the main workflow.
- Select "Parameter" as the source.
- Paste in your sub-workflow JSON or select it from your n8n workflows.
- Connect the Tool Workflow node to your AI Agent using the ai_tool connection.
- Give the tool a clear description (e.g., crm tool to store lead information) so the agent knows when to use it.

4. How It Works in Action
1. A visitor sends a message through the chatbot.
2. The AI Agent engages, asks questions, and collects their name, email, and request.
3. Once collected, the agent triggers the CRM Tool.
4. The sub-workflow formats the data, stores it in Google Sheets, and sends a confirmation.
5. The chatbot confirms with the visitor that their request was received.

5. Customization Ideas
- Replace Google Sheets with your actual CRM API.
- Add validation to ensure the email format is correct before saving.
- Expand the CRM tool to send a Slack or email notification after storing the lead.

Created by Robert A. – Ynteractive
Website: https://ynteractive.com
Email: robert@ynteractive.com
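For reference, a Structured Output Parser schema matching the JSON example above could look like the following. This is a minimal sketch — the actual template may define the schema differently or rely on an example-based definition instead:

```json
{
  "type": "object",
  "properties": {
    "email": {
      "type": "string",
      "description": "The visitor's email address"
    },
    "description": {
      "type": "string",
      "description": "What the visitor wants to automate"
    }
  },
  "required": ["email", "description"]
}
```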
by Juan Carlos Cavero Gracia
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Description
This automation template is designed for content curators, marketers, and anyone looking to supercharge their content sharing strategy. It transforms any web article, blog post, or news link into a series of platform-specific social media posts, generated by AI. It also captures a live screenshot of the webpage to use as the post image, automating the entire process of publishing them across X (Twitter), LinkedIn, Threads, and Reddit.

Note: The default example is configured to share n8n templates, but this workflow can promote any web page, article, or news story — just change the URL! The upload-post node only works on self-hosted n8n instances, but you can use the standard HTTP Request node for uploading the content.

Who Is This For?
- **Content Curators & Marketers:** Effortlessly share valuable industry news and articles with tailored messages and visuals for each audience.
- **Social Media Managers:** Keep your social feeds consistently active with relevant, high-quality content without the manual overhead.
- **Community Builders & Brand Evangelists:** Quickly disseminate product updates, tutorials, and blog posts to your community on all relevant platforms.
- **Professionals & Thought Leaders:** Build your personal brand by easily sharing insightful articles with automated visuals, adding your unique perspective.

What Problem Does This Workflow Solve?
Sharing a single piece of content across multiple social platforms is tedious. You need to manually write unique posts, create visuals, and then publish everything. This workflow addresses these challenges by:
- **Automating Content Creation:** Uses a powerful AI agent (Google Gemini) to read any URL and write compelling, unique posts for each social network.
- **Generating Visuals Automatically:** Captures a high-quality screenshot of the source webpage to use as a visually appealing image in your posts, increasing engagement.
- **Ensuring Platform-Specific Tone:** The AI is instructed to generate professional posts for LinkedIn, concise threads for X, conversational updates for Threads, and community-focused posts for Reddit.
- **One-Click Distribution:** Takes a single URL as input and handles the entire content creation and sharing process across multiple platforms automatically.

How It Works
1. Input a URL: In the "Set Input Data" node, simply paste the URL of the article or page you want to share.
2. AI Analysis & Generation: The workflow sends the URL to the AI agent, which scrapes the content and generates four distinct, ready-to-publish posts.
3. Screenshot Generation: At the same time, it uses the ScreenshotOne service to capture a high-quality image of the provided URL (see the example request after this section).
4. Cross-Platform Publishing: The generated content and the screenshot are automatically sent to the corresponding nodes to be posted on X, LinkedIn, and Threads, while the text-only version is sent to Reddit.

Setup
1. AI Model Credentials: Add your Google Gemini API key to the Google Gemini Chat Model node to power the AI agent.
2. Screenshot Service (ScreenshotOne): The workflow uses ScreenshotOne to generate images for your posts. Create a free account at screenshotone.com to get your own API key. The free plan includes 100 screenshots per month. In the Upload Post X, Upload Post LinkedIn, and Upload Post Threads nodes, go to the Photos parameter (under Additional Fields) and replace the existing access_key in the URL with your own.
3. Upload-Post Account: This workflow uses upload-post.com for multi-platform posting. Create a free account at upload-post.com to get your API Token and User ID. Add the credentials in the Upload Post X, Upload Post LinkedIn, and Upload Post Threads nodes.
4. Reddit Credentials: Connect your Reddit account using OAuth2 in the Reddit node to enable posting.
5. Customize the AI (optional): Edit the prompt in the Social Media Agent node to match your content. The default prompt is optimized for sharing n8n templates, but you can easily adapt it for any topic to fit your brand's voice and style.

Requirements
- **Accounts:** n8n, Google (for Gemini API), ScreenshotOne, upload-post.com, Reddit.
- **API Keys & Credentials:** Google Gemini API Key, ScreenshotOne API Key, Upload-post.com API Token & User ID, Reddit OAuth2 credentials.

Use this template to become a content-sharing powerhouse, saving hours of work while increasing your reach and engagement across the web.
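For clarity, the screenshot image URL consumed by the Photos parameter can be pictured as a simple parameterized request to ScreenshotOne's take endpoint. The snippet below is a hedged sketch for building such a URL; the parameter names (`access_key`, `url`, `format`, `viewport_width`) are assumptions to verify against ScreenshotOne's current documentation, and YOUR_ACCESS_KEY is a placeholder:

```javascript
// Hypothetical sketch: building the ScreenshotOne image URL used as the post photo.
// Replace YOUR_ACCESS_KEY with your own key and confirm parameter names
// against ScreenshotOne's docs before use.
const targetUrl = 'https://example.com/article-to-share'; // placeholder page to capture
const accessKey = 'YOUR_ACCESS_KEY';

const screenshotUrl =
  'https://api.screenshotone.com/take' +
  `?access_key=${accessKey}` +
  `&url=${encodeURIComponent(targetUrl)}` +
  '&format=png' +
  '&viewport_width=1280';

return [{ json: { screenshotUrl } }];
```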
by Robert Breen
n8n Workflow: OpenAI DALL·E 2 Image Generation & Google Drive Upload

Description
This n8n workflow automates the process of generating multiple AI-created images from a single prompt using OpenAI's DALL·E 2, then uploads the results directly to a Google Drive folder. It includes a loop to produce several image variations for the same prompt, making it ideal for creative projects, marketing materials, or content experimentation.

Step-by-Step Setup Instructions

1. Prepare Your API Keys
**OpenAI API Key**
- Sign up or log in at https://platform.openai.com/
- Go to API Keys and create a new one.
- Copy and store this securely — you'll need it in n8n.
**Google Drive API**
- Go to https://console.cloud.google.com/
- Create a project and enable the Google Drive API.
- Create OAuth 2.0 credentials and set the redirect URI to your n8n OAuth redirect (found in your n8n Google Drive node setup).
- Connect your Google account when adding credentials in n8n.

2. Workflow Nodes Overview
- Manual Trigger – Starts the workflow manually.
- Set Image Prompt – Stores the prompt text and base file name (e.g., "Make an image of an attractive woman standing in New York City").
- Duplicate Rows (Code Node) – Creates multiple "runs" of the same prompt for variation.
- Loop Over Items – Processes each variation one at a time.
- Generate an image (OpenAI DALL·E 2) – Sends the prompt to OpenAI and retrieves an image.
- Upload to Google Drive – Saves each generated image to your chosen Google Drive folder.

3. Building the Workflow in n8n

Step 1 — Manual Trigger
Add a Manual Trigger node to start the workflow manually when testing.

Step 2 — Set Image Prompt
Add a Set node with two fields: Prompt → the image description text, and Name → the base name for the saved file. Example:

| Name   | Value                                                           |
|--------|-----------------------------------------------------------------|
| Prompt | Make an image of an attractive woman standing in New York City |
| Name   | woman-nyc                                                       |

Step 3 — Duplicate Rows (Code Node)
Use this JavaScript to create three copies of the prompt (run 1, run 2, run 3):

    const original = items[0].json;

    return [
      { json: { ...original, run: 1 } },
      { json: { ...original, run: 2 } },
      { json: { ...original, run: 3 } },
    ];

Step 4 — Loop Over Items
Insert a Split in Batches node and set the batch size to 1. This ensures each prompt variation runs through the image generation process individually. Connect this node so it runs after the Duplicate Rows node.

Step 5 — Generate Image
Add the OpenAI Image Generation node and configure it as follows:
- **Model**: dall-e-2
- **Prompt**: ={{ $json.Prompt }}
- Leave other options at their defaults unless you want to specify image size or style.
- Connect your OpenAI API credentials created in Step 1.
This node will send the current prompt in the batch to OpenAI's DALL·E 2 model and return an AI-generated image.

Step 6 — Upload to Google Drive
Add a Google Drive node and configure it to store the generated image:
- **File Name**: ={{ $('Set Image Prompt').item.json.Name }} - {{ $('Duplicate Rows').item.json.run }}
- **Folder ID**: Select the target Google Drive folder where images should be saved.
- Connect your Google Drive OAuth2 API credentials.
The node will upload each generated image to your chosen Google Drive location, with a unique filename for each variation.

Running the Workflow
Execute the workflow manually. The process will:
1. Loop through each prompt variation.
2. Generate an image using OpenAI DALL·E 2.
3. Upload the image to Google Drive with a unique name.
You will find all generated images in the selected Google Drive folder.
Customization Tips
- Change the number of variations by editing the Duplicate Rows code.
- Adjust the prompt dynamically from other data sources like Google Sheets, webhooks, or forms.
- Schedule the workflow to run at specific times or trigger it via an API call.

Created by Robert A. – Ynteractive
Website: https://ynteractive.com
Email: robert@ynteractive.com
by Tomek
How it works
- Use Telegram to send in new phrases (flashcard front). You can also input a phrase manually in the workflow itself.
- ChatGPT generates a description of the provided phrase (in English, but you can change the language), including multiple meanings, and generates examples of using the phrase in a sample sentence (flashcard back).

Steps to set up
1. Provide your Telegram bot API key (optional)
2. Provide your OpenAI key
3. Provide Google Sheets credentials

How to import flashcards from Google Sheets into Anki
- Use the Google Sheets to Anki add-on: 1871608121
- In Anki, simply click Sync Decks and you're done :)

Enjoy
by Intuz
This n8n template from Intuz provides a complete and automated solution for hyper-personalized email outreach. It powerfully combines AI with Gmail and Google Sheets, using specific keywords and prospect data to automatically craft unique, compelling email content that boosts engagement and secures more replies.

Instead of manually replying to every lead or inquiry, this template does the heavy lifting for you, ensuring every response is relevant, thoughtful, and timely. It reads each person's unique inquiry, uses OpenAI to craft a perfectly tailored and human-like response, and sends it directly from your Gmail account. Ideal for sales, marketing, and customer support teams looking to boost engagement and save hours of manual work.

Use Cases:
- Sales Teams: Instantly follow up with new leads from your website's contact form with a personalized touch.
- Customer Support: Provide initial, intelligent responses to support tickets, answering common questions or acknowledging receipt of a complex issue.
- Marketing Automation: Nurture leads by responding to content downloads or webinar sign-ups with relevant, non-generic information.
- Founders & Solopreneurs: Manage all incoming business inquiries (partnerships, media, etc.) efficiently without sacrificing quality.

How It Works:
1. Trigger the Flow (Manual): Start the automation whenever you're ready to process a new batch of inquiries from your sheet.
2. Fetch Inquiries from Google Sheets: The workflow connects to your specified Google Sheet and reads each row. It pulls the contact's First Name, Email ID, the Inquiry Intent (e.g., "Demo Request," "Pricing Inquiry"), and the full text of their Original Inquiry.
3. Sync Your Signature: Before writing the email, an HTTP Request node dynamically fetches your display name from your Gmail account settings (a hedged example appears after this section). This ensures the signature in the generated email (Thanks, {{Your Name}}) is always accurate.
4. Craft a Hyper-Personalized Reply with AI: It uses this context to generate a high-quality, professional, and friendly email reply in HTML format. For example: if the intent is "Technical Support," the AI will generate a helpful, empathetic response addressing the technical issue; if the intent is "Partnership Proposal," it will draft a professional reply acknowledging the proposal and outlining the next steps.
5. Send via Gmail: The final node takes the AI-generated message, adds a relevant subject line (e.g., "Re: Your Demo Request"), and sends it directly to the contact's email address from your connected Gmail account.

This process loops for every single row in your Google Sheet, turning a list of names into a series of meaningful conversations.

Setup Instructions:
To get this workflow running, you'll need to configure a few things:

Credentials:
- Google: Connect your Google account via OAuth2 and ensure you have enabled access for Google Sheets, Google Drive, and Gmail.
- OpenAI: Add your OpenAI API key as a credential.

Google Sheet Setup:
Create a Google Sheet with the following exact column headers:
- First Name
- Email ID
- Inquiry Intent (a short category like "Demo Request", "Billing Issue", etc.)
- Original Inquiry (the full text of the email or message you received)

Node Configuration:
- Get row(s) in sheet: Select your Google Sheet document and the specific sheet name.
- Message a model (OpenAI): Choose your preferred OpenAI model (e.g., gpt-4-turbo, gpt-3.5-turbo).
- HTTP Request & Send Personalized emails: These nodes should automatically use your configured Gmail credentials. No changes are typically needed.
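For orientation, the "Sync Your Signature" step can be pictured as an HTTP Request node calling Gmail's sendAs settings endpoint (https://gmail.googleapis.com/gmail/v1/users/me/settings/sendAs with the Gmail OAuth2 credential attached), followed by a small Code node that picks out the primary display name. The sketch below shows that follow-up parsing step; the `sendAs`, `isPrimary`, and `displayName` fields follow the Gmail API's sendAs resource, but treat this wiring as an assumption rather than the template's exact configuration:

```javascript
// Hypothetical n8n Code node: extract the primary "send as" display name
// from the Gmail settings response fetched by the previous HTTP Request node.
const settings = items[0].json;

// The sendAs list contains one entry per alias; the primary one carries
// the display name used in outgoing mail.
const primary = (settings.sendAs || []).find(entry => entry.isPrimary) || {};

return [{ json: { displayName: primary.displayName || '' } }];
```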
Connect with us
Website: https://www.intuz.com/cloud/stack/n8n
Email: getstarted@intuz.com
LinkedIn: https://www.linkedin.com/company/intuz
Get Started: https://n8n.partnerlinks.io/intuz
by Oneclick AI Squad
📚 Automated School Fee Reminder Workflow with Payment Link

Automatically sends fee reminders (via email and WhatsApp) to parents with secure payment links, 3 days before the due date.

🔧 Main Components
- **Daily Fee Check – 8 AM** – Scheduled trigger that starts the workflow daily at 8 AM.
- **Read Pending Fees** – Fetches student fee records from an Excel sheet (using the getAll method).
- **Process Fee Reminders** – Filters records to find pending fees due within the next 3 days (a minimal filtering sketch appears at the end of this section).
- **Prepare Email Reminder** – Generates personalized email messages with payment links.
- **Wait for Email Preparation** – Adds a delay/wait condition for email logic readiness.
- **Send Email Reminder** – Sends the fee reminder email with a secure payment link to the parent.
- **Prepare WhatsApp Reminder** – Generates WhatsApp-friendly messages with fee and payment details.
- **Wait for WhatsApp Preparation** – Waits for the WhatsApp message logic to complete.
- **Send WhatsApp Message** – Sends the message to the parent's WhatsApp number using a message API.
- **Update Reminder Status** – Updates the Excel file to mark reminders as sent to avoid duplicates.

🧩 Channels Used
- 📧 Email – with personalized payment link
- 💬 WhatsApp – formatted reminder message

🔐 Payment Integration
Secure payment links are auto-generated per student to enable direct and safe online fee payments.

✅ Essential Prerequisites
- Excel sheet with fee records (student_fee_data.xlsx)
- SMTP credentials for sending email
- WhatsApp API or provider integration (like Twilio or Gupshup)
- Access to a payment gateway or service for link generation
- File storage access to update reminder status in Excel

📁 Required Excel File Structure (student_fee_data.xlsx)

| Student ID | Name | Email | Phone | Fee Due Date | Amount | Reminder Sent |
| ---------- | ---- | ----- | ----- | ------------ | ------ | ------------- |

🧾 Expected Input Format Example

    {
      "studentId": "ST123",
      "name": "Ria Mehta",
      "email": "ria.mehta@example.com",
      "phone": "+919123456789",
      "dueDate": "2025-08-10",
      "amount": "₹5000",
      "reminderSent": "No"
    }

🚀 Key Features
- ⏰ Scheduled Daily Execution – Fully automated at 8 AM
- 🧮 Due-Date Filtering – Only targets fees due in the next 3 days
- 💬 Multi-Channel Notifications – Sends reminders via both Email and WhatsApp
- 🔗 Secure Payment Links – Auto-generated for each student
- 🔄 Reminder Tracking – Prevents duplicate reminders by updating status

⚙️ Quick Setup Guide
1. Import the workflow JSON into your n8n instance.
2. Configure the schedule in the "Daily Fee Check" node (default: 8 AM).
3. Set the Excel file path in the "Read Pending Fees" node.
4. Update your fee processing logic in the "Process Fee Reminders" node.
5. Add email credentials in the "Send Email Reminder" node.
6. Integrate the WhatsApp provider API in the "Send message" node.
7. Define how you generate secure payment links.
8. Test with sample data and activate the workflow.

🛠️ Parameters to Configure

| Parameter | Description |
| ------------------ | ------------------------------------------ |
| excel_file_path | Path to the fee tracking Excel file |
| smtp_host | SMTP server for sending email reminders |
| smtp_user | Email username |
| smtp_password | Email password |
| whatsapp_api_key | WhatsApp API key for sending messages |
| payment_api_url | URL for generating payment links |
| admin_email | (Optional) Admin email for error reporting |
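To illustrate the due-date filtering, here is a minimal Code-node sketch that keeps only records whose due date falls within the next 3 days and that have not been reminded yet. Field names follow the input-format example above (dueDate, reminderSent); the template's actual "Process Fee Reminders" node may be implemented differently:

```javascript
// Hypothetical n8n Code node: keep fees due within the next 3 days
// that have not already received a reminder.
const MS_PER_DAY = 24 * 60 * 60 * 1000;
const today = new Date();
today.setHours(0, 0, 0, 0);

const dueSoon = items.filter(item => {
  const record = item.json;
  if (record.reminderSent === 'Yes') {
    return false; // already reminded, skip to avoid duplicates
  }
  const dueDate = new Date(record.dueDate); // e.g. "2025-08-10"
  const daysUntilDue = Math.round((dueDate - today) / MS_PER_DAY);
  return daysUntilDue >= 0 && daysUntilDue <= 3;
});

return dueSoon;
```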