by Robert Breen
This n8n workflow template creates an intelligent data analysis chatbot that can answer questions about data stored in Google Sheets using OpenAI's GPT-5 Mini model. The system automatically analyzes your spreadsheet data and provides insights through natural language conversations.

**What This Workflow Does**
- **Chat Interface**: Provides a conversational interface for asking questions about your data
- **Smart Data Analysis**: Uses AI to understand column structures and data relationships
- **Google Sheets Integration**: Connects directly to your Google Sheets data
- **Memory Buffer**: Maintains conversation context for follow-up questions
- **Automated Column Detection**: Automatically identifies and describes your data columns

**Try It Out!**

1. **Set Up OpenAI Connection.** Visit the OpenAI API Keys page, go to OpenAI Billing, and add funds to your billing account. Copy your API key into your OpenAI credentials in n8n (or your chosen platform).
2. **Prepare Your Google Sheet.** Data must follow the sample marketing data format: the first row contains column names, and data should be in rows 2–100. Log in using OAuth, then select your workbook and sheet.
3. **Ask Questions of Your Data.** You can ask natural-language questions to analyze your marketing data, such as:
   - Total spend across all campaigns
   - Spend for Paid Search only
   - Month-over-month changes in ad spend
   - Top-performing campaigns by conversion rate
   - Cost per lead for each channel

**Need Help or Want to Customize This?**
Email: rbreen@ynteractive.com | LinkedIn | n8n Automation Experts
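The "Automated Column Detection" feature can be illustrated with a small Code-node-style sketch: given the sheet rows, it builds a short description of each column for the AI agent to reason over. This is a minimal illustration under assumed data shapes, not the template's actual node code.

```javascript
// Minimal sketch of automated column detection (illustrative only).
// Each sheet row is assumed to be an object keyed by its column name.
function describeColumns(rows) {
  if (rows.length === 0) return [];
  return Object.keys(rows[0]).map((name) => {
    const sample = rows[0][name];
    // Crude type guess: numeric if the first sample parses as a finite number.
    const type = Number.isFinite(Number(sample)) ? "number" : "text";
    return { name, type, sample };
  });
}

const rows = [{ Campaign: "Paid Search", Spend: "1200", Leads: "34" }];
console.log(describeColumns(rows));
```

A description like this can be injected into the agent's system prompt so follow-up questions ("total spend?") map cleanly onto the right columns.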
by Robert Breen
This n8n workflow automates bulk AI video generation using Freepik's Image-to-Video API powered by Minimax Hailuo-02-768p. It reads video prompts from a Google Sheet, generates multiple variations of each video using Freepik's AI, handles asynchronous video processing with intelligent polling, and automatically uploads completed videos to Google Drive with organized file names. This is perfect for content creators, marketers, or video producers who need to generate multiple AI videos in bulk and store them systematically.

**Key Features**
- Bulk video generation from Google Sheets prompts
- Multiple variations per prompt (configurable duplicates)
- Asynchronous processing with smart status polling
- Automatic retry mechanism for processing delays
- Direct upload to Google Drive with organized naming
- Freepik Minimax Hailuo-02 AI-powered video generation (768p quality)
- Intelligent wait/retry system for video rendering

**Step-by-Step Implementation Guide**

**Prerequisites.** Before setting up this workflow, you'll need:
- An n8n instance (cloud or self-hosted)
- A Freepik API account with Video Generation access
- A Google account with access to Sheets and Drive
- A Google Sheet with your video prompts

**Step 1: Set Up Freepik API Credentials**
1. Go to the Freepik API Developer Portal and create an account or sign in.
2. Navigate to your API dashboard and generate an API key with Video Generation permissions.
3. Copy the API key and save it securely.
4. In n8n, go to Credentials → Add Credential → HTTP Header Auth and configure as follows:
   - Name: "Header Auth account"
   - Header Name: x-freepik-api-key
   - Header Value: your Freepik API key

**Step 2: Set Up Google Credentials**

Google Sheets access:
1. Go to the Google Cloud Console and create a new project or select an existing one.
2. Enable the Google Sheets API and create OAuth2 credentials.
3. In n8n, go to Credentials → Add Credential → Google Sheets OAuth2 API, enter your OAuth2 credentials, and authorize with the spreadsheets.readonly scope.

Google Drive access:
1. In the Google Cloud Console, enable the Google Drive API.
2. In n8n, go to
Credentials → Add Credential → Google Drive OAuth2 API, then enter your OAuth2 credentials and authorize.

**Step 3: Create Your Google Sheet**
1. Create a new Google Sheet at sheets.google.com.
2. Set up your sheet with these columns:
   - Column A: Prompt (your video generation prompts)
   - Column B: Name (identifier for file naming)

Example data:

| Prompt | Name |
|---|---|
| A butterfly landing on a flower in slow motion | butterfly-01 |
| Ocean waves crashing on rocky coastline | ocean-waves |
| Time-lapse of clouds moving across blue sky | clouds-timelapse |

3. Copy the Sheet ID from the URL (the long string between /d/ and /edit).

**Step 4: Set Up Google Drive Folder**
1. Create a folder in Google Drive for your generated videos.
2. Copy the Folder ID from the URL when viewing the folder.
3. Note: the workflow is configured to use a folder called "n8n workflows".

**Step 5: Import and Configure the Workflow**
1. Copy the provided workflow JSON.
2. In n8n, click Import from File or Import from Clipboard and paste the workflow JSON.
3. Configure each node as detailed below.

**Node Configuration Details**

Get prompt from google sheet (Google Sheets)
- **Document ID**: your Google Sheet ID (from Step 3)
- **Sheet Name**: Sheet1 (or your sheet name)
- **Operation**: Read
- **Credentials**: select your "Google Sheets account"

Duplicate Rows2 (Code node)
- **Purpose**: creates multiple variations of each prompt
- **JavaScript code**:

```javascript
const original = items[0].json;
return [
  { json: { ...original, run: 1 } },
  { json: { ...original, run: 2 } },
];
```

- **Customization**: add more runs for additional variations

Loop Over Items (Split in Batches)
- Processes items in batches to manage API rate limits
- **Options**: keep default settings
- **Reset**: false

Create Video (HTTP Request)
- **Method**: POST
- **URL**: https://api.freepik.com/v1/ai/image-to-video/minimax-hailuo-02-768p
- **Authentication**: Generic → HTTP Header Auth
- **Credentials**: select your "Header Auth account"
- **Send Body**: true
- **Body Parameters**: Name: prompt, Value: `={{ $json.Prompt }}`

Get Video URL (HTTP Request)
- **Method**: GET
- **URL**: https://api.freepik.com/v1/ai/image-to-video/minimax-hailuo-02-768p/{{ $json.data.task_id }}
- **Authentication**: Generic → HTTP Header Auth
- **Credentials**: select your "Header Auth account"
- **Timeout**: 120000 (2 minutes)
- **Purpose**: polls the API for video completion status

Switch (Switch node)
- **Purpose**: routes the workflow based on video generation status
- **Conditions**:
  - Completed: {{ $json.data.status }} equals COMPLETED
  - Failed: {{ $json.data.status }} equals FAILED
  - Created: {{ $json.data.status }} equals CREATED
  - In Progress: {{ $json.data.status }} equals IN_PROGRESS

Wait (Wait node)
- **Amount**: 30 seconds
- **Purpose**: waits before re-checking video status
- **Webhook ID**: auto-generated for resume functionality

Download Video as Base64 (HTTP Request)
- **Method**: GET
- **URL**: ={{ $json.data.generated[0] }}
- **Purpose**: downloads the completed video file

Upload to Google Drive1 (Google Drive)
- **Operation**: Upload
- **Name**: =video - {{ $('Get prompt from google sheet').item.json.Name }} - {{ $('Duplicate Rows2').item.json.run }}
- **Drive ID**: My Drive
- **Folder ID**: your Google Drive folder ID (from Step 4)
- **Credentials**: select your "Google Drive account"

**Step 6: Customize for Your Use Case**
- Modify the duplicate count: edit the "Duplicate Rows2" code to create more variations.
- Update file naming: change the naming pattern in the Google Drive upload node.
- Adjust wait time: modify the Wait node duration based on typical processing times.
- Add video parameters: enhance the Create Video request with additional Freepik parameters.

**Step 7: Test the Workflow**
1. Ensure your Google Sheet has test data.
2. Click Execute Workflow on the manual trigger (if present).
3. Monitor the execution flow; note that video generation takes time.
4. Watch the Switch node handle different status responses.
5. Verify videos are uploaded to Google Drive when completed.

**Step 8: Production Deployment**
- Set up error handling for API failures and timeouts.
- Configure appropriate batch sizes
based on your Freepik API limits.
- Add logging for successful uploads and failed generations.
- Consider webhook triggers for automated execution.
- Set up monitoring for stuck or failed video generations.

**Freepik Video API Details**

Video generation process:
1. Submit request: send a prompt to generate a video.
2. Get task ID: receive a task_id for tracking.
3. Poll status: check the generation status periodically.
4. Download: retrieve the completed video URL.

Status types:
- CREATED: video generation task created
- IN_PROGRESS: video is being generated
- COMPLETED: video ready for download
- FAILED: generation failed

Model specifications:
- **Model**: minimax-hailuo-02-768p
- **Resolution**: 768p
- **Duration**: typically 5-10 seconds
- **Format**: MP4

Example enhanced parameters:

```json
{
  "prompt": "{{ $json.Prompt }}",
  "duration": 5,
  "aspect_ratio": "16:9",
  "fps": 24
}
```

**Workflow Flow Summary**
1. Start → read prompts from Google Sheets
2. Duplicate → create multiple runs for variations
3. Loop → process items in batches
4. Generate → submit the video generation request to Freepik
5. Poll → check video generation status
6. Switch → route based on status:
   - Completed → download video
   - Processing/Created → wait and retry
   - Failed → handle error
7. Download → retrieve the completed video file
8. Upload → save to Google Drive with organized naming
9. Continue → process the next batch

**Troubleshooting Tips**

Common issues:
- **Long processing times**: video generation can take 2-5 minutes per video
- **Timeout errors**: increase the timeout in the "Get Video URL" node
- **Rate limits**: reduce the batch size and add longer waits between requests
- **Failed generations**: check prompt complexity and API limits
- **Upload failures**: verify Google Drive folder permissions

Error handling:
- Add Try/Catch nodes around API calls.
- Implement exponential backoff for retries.
- Log failed generations to Google Sheets.
- Set up email notifications for critical failures.

Performance optimization:
- Adjust wait times based on typical generation duration.
- Use smaller batch sizes for more reliable processing.
- Monitor API usage and costs in the Freepik dashboard.

**Cost Considerations**

Freepik API:
- Video generation typically costs more than image generation.
- Check your plan's video generation limits.
- Monitor usage through the Freepik dashboard.
- Consider upgrading for higher volume needs.

Processing time:
- Each video can take 2-5 minutes to generate.
- Plan workflow execution time accordingly.
- Consider running during off-peak hours for large batches.

**Contact Information**
Robert A, Ynteractive. For support, customization, or questions about this workflow:
- Email: rbreen@ynteractive.com
- Website: https://ynteractive.com/
- LinkedIn: https://www.linkedin.com/in/robert-breen-29429625/

Need help implementing this workflow or want custom automation solutions? Get in touch for professional n8n consulting and workflow development services.
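As a concrete example of the "add more runs" customization from Step 6, the "Duplicate Rows2" Code node can be generalized so the number of variations per prompt is a single constant. This is a sketch in n8n Code-node style; a sample `items` array is defined here so it runs standalone (in n8n, `items` is supplied by the previous node).

```javascript
// Sketch: configurable duplicate count for the "Duplicate Rows2" Code node.
// n8n provides `items` at runtime; a sample is defined so the snippet runs
// standalone.
const items = [{ json: { Prompt: "Ocean waves crashing on rocky coastline", Name: "ocean-waves" } }];

const RUNS = 3; // number of variations per prompt; adjust as needed
const out = [];
for (const item of items) {
  for (let run = 1; run <= RUNS; run++) {
    out.push({ json: { ...item.json, run } });
  }
}
// In the real Code node you would end with: return out;
console.log(out.map((i) => i.json.run)); // logs the run numbers 1..RUNS per item
```

Because the `run` value is carried through to the Google Drive upload name, each variation still gets a unique file name.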
by Calistus Christian
**Summary**
Turns the latest CVEs from NVD into a clean, sortable email digest (table + plaintext) and sends it via Gmail. The flow pulls the newest CVEs, extracts Vendor / Product / Version, severity and CVSS, highlights public exploit references, drafts an HTML table, then asks OpenAI to tighten the copy before emailing it. Optionally, you can swap the Gmail node for Signal, Slack, Microsoft Teams, etc. Perfect for SecOps leads who want a low-noise digest of what changed recently, grouped and ranked by severity.

**What this workflow does**
1. Triggers on a schedule (every 30 minutes by default).
2. Calls the NVD 2.0 API to fetch recent CVEs.
3. Parses each CVE to extract:
   - Vendor / Product / Version(s) (from CPE 2.3 where available, with a text fallback)
   - Severity + CVSS (V3.1/V3.0/V2 fallback) and vector string
   - Exploit signal (tags/links like Exploit-DB, GitHub PoCs, etc.)
   - Short English summary + direct NVD link
4. Builds an HTML email (and a plaintext fallback) ranked by severity, then score.
5. Uses OpenAI to polish the subject line and copy into a concise, professional digest (JSON-only contract).
6. Sends the digest with the Gmail node.

**Prerequisites**
- **NVD API key** (free): create one at https://nvd.nist.gov/developers/request-an-api-key
- **OpenAI API key** with access to gpt-4o-mini (or change the model)
- **Email sending**: Gmail node with OAuth2 (recommended), or swap in the generic Email Send (SMTP) node if you prefer

**Quick start**
1. Import the workflow JSON below.
2. Open HTTP Request → Headers and confirm apiKey uses {{$env.NVD_API_KEY}}.
3. Open Send a message (Gmail) and set To to {{$env.RECIPIENT_EMAIL}} (or your address).
4. Open OpenAI Email Crafter and connect your OpenAI credential (or change the model if needed).
5. Hit Execute to test, then Activate when happy.

**Credits**
Created by ca7ai (n8n Creator).

**Tags**
security, cve, cisa, nvd, email, monitoring, openai, gmail, automation
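The severity extraction with V3.1/V3.0/V2 fallback can be sketched as a small helper in n8n Code-node style. The field names follow the NVD 2.0 API response shape, but treat this as an illustrative assumption rather than the template's exact parser:

```javascript
// Sketch: pick the best available CVSS metric for an NVD 2.0 CVE record,
// preferring V3.1, then V3.0, then V2 (field names per the NVD 2.0 API).
function extractCvss(cve) {
  const m = cve.metrics || {};
  const metrics = m.cvssMetricV31 || m.cvssMetricV30 || m.cvssMetricV2 || [];
  if (metrics.length === 0) return { score: null, severity: "UNKNOWN", vector: null };
  const entry = metrics[0];
  return {
    score: entry.cvssData.baseScore,
    // V3.x keeps baseSeverity inside cvssData; V2 keeps it on the metric entry.
    severity: entry.cvssData.baseSeverity || entry.baseSeverity || "UNKNOWN",
    vector: entry.cvssData.vectorString || null,
  };
}

const sample = {
  metrics: {
    cvssMetricV31: [{ cvssData: { baseScore: 9.8, baseSeverity: "CRITICAL", vectorString: "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H" } }],
  },
};
console.log(extractCvss(sample).severity); // prints CRITICAL
```

Sorting the digest by severity, then `score`, falls out naturally from the returned object.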
by Marth
**How It Works: The 5-Node Security Flow**
This workflow efficiently performs a scheduled data breach scan.

1. **Scheduled Check (Cron node).** This is the workflow's trigger; it schedules the workflow to run at a specific, regular interval.
   - **Function**: continuously runs on a set schedule, for example every Monday morning.
   - **Process**: the Cron node automatically initiates the workflow, ensuring routine data breach scans are performed without manual intervention.
2. **List Emails to Check (Code node).** This node acts as your static database, defining which email addresses to monitor for breaches.
   - **Function**: stores a list of email addresses from your team or customers in a single, easy-to-update array.
   - **Process**: it configures the list of emails that are then processed by the subsequent nodes, making it simple to add or remove addresses as needed.
3. **Query HIBP API (HTTP Request node).** This node connects to the HaveIBeenPwned (HIBP) API to check for breaches.
   - **Function**: queries the HIBP API for each email address on your list.
   - **Process**: it sends a request to the HIBP API, which responds with a list of the data breaches that email was found in, if any.
4. **Is Breached? (If node).** This is the core detection logic; it checks the API response to see if any breach data was returned.
   - **Function**: compares the API's response to an empty array.
   - **Process**: if the API response is not empty, a breach has been found and the workflow is routed to the notification node. If the response is empty, the workflow ends safely.
5. **Send High-Priority Alert (Slack node) / End Workflow (No-Op node).** These nodes represent the final action of the workflow.
   - **Function**: responds to a detected breach.
   - **Process**: if a breach is found, the Slack node sends an urgent alert to your team's security channel, notifying them of the compromised email. If no breaches are found, the No-Op node ends the workflow without any notification.
**How to Set Up**
Implementing this essential cybersecurity monitor in your n8n instance is quick and straightforward.

1. **Prepare your credentials and API.** Before building the workflow, ensure all necessary accounts are set up and their credentials are ready.
   - **HIBP API key**: get an API key from haveibeenpwned.com; it is required to access the API.
   - **Slack credential**: set up a Slack credential in n8n and note the Channel ID of your security alert channel (e.g., #security-alerts).
2. **Import the workflow JSON.** In your n8n instance, navigate to the "Workflows" section. Click the "New" or "+" icon, then select "Import from JSON." Paste the provided JSON code into the import dialog and import the workflow.
3. **Configure the nodes.** Customize the imported workflow to fit your specific monitoring needs.
   - **Scheduled Check (Cron)**: set the schedule according to your preference (e.g., every Monday at 8:00 AM).
   - **List Emails to Check (Code)**: open this node and edit the emailsToCheck array, entering the list of company email addresses you want to monitor.
   - **Query HIBP API (HTTP Request)**: open this node and, in the "Headers" section, add the header hibp-api-key with your HIBP API key as its value.
   - **Send High-Priority Alert (Slack)**: select your Slack credential and replace YOUR_SECURITY_ALERT_CHANNEL_ID with your actual Channel ID.
4. **Test and activate.** Verify that your workflow is working correctly before setting it live.
   - **Manual test**: run the workflow manually. You can test with a known breached email address (examples are easy to find online) to ensure the alert is triggered.
   - **Verify**: check your specified Slack channel to confirm that the alert is sent with the correct information.
   - **Activate**: once you're confident in its function, activate the workflow. n8n will now automatically monitor your important accounts for data breaches on the schedule you set.
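The "List Emails to Check" Code node and the "Is Breached?" comparison can be sketched as follows. The addresses are placeholders, and the empty-array test mirrors the If-node logic the template describes (a sketch, not the template's exact node code):

```javascript
// Sketch of the "List Emails to Check" Code node output (placeholder addresses).
const emailsToCheck = ["alice@example.com", "bob@example.com"];
const items = emailsToCheck.map((email) => ({ json: { email } }));

// Sketch of the "Is Breached?" test: HIBP returns an array of breach objects
// for a pwned address, so a non-empty array means a breach was found.
function isBreached(apiResponse) {
  return Array.isArray(apiResponse) && apiResponse.length > 0;
}

console.log(isBreached([{ Name: "ExampleBreach" }])); // true
console.log(isBreached([])); // false
```

Keeping the list in one array means adding or removing a monitored address is a one-line change.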
by Oneclick AI Squad
This automated n8n workflow tracks hourly cloud spending across AWS, Azure, and GCP. It detects cost spikes or budget overruns in real time, tags affected resources, and sends alerts via email, WhatsApp, or Slack. This ensures proactive cost management and prevents budget breaches.

**Good to Know**
- AWS, Azure, and GCP APIs must have read access to billing data.
- Use secure credentials for API keys or service accounts.
- The workflow runs every hour for near real-time cost tracking.
- Alerts can be sent to multiple channels (Email, WhatsApp, Slack).
- Tags are applied automatically to affected resources for easy tracking.

**How It Works**
1. Hourly Cron Trigger: starts the workflow every hour to fetch updated billing data.
2. AWS Billing Fetch: retrieves the latest cost and usage data via the AWS Cost Explorer API.
3. Azure Billing Fetch: retrieves subscription cost data from the Azure Cost Management API.
4. GCP Billing Fetch: retrieves project-level spend data using the GCP Cloud Billing API.
5. Data Parser: combines and cleans data from all three clouds into a unified format.
6. Cost Spike Detector: identifies unusual spending patterns or budget overruns.
7. Owner Identifier: matches resources to their respective owners or teams.
8. Auto-Tag Resource: tags the affected resource for quick identification and follow-up.
9. Alert Sender: sends notifications through Email, WhatsApp, and Slack with detailed cost reports.

**How to Use**
1. Import the workflow into n8n.
2. Configure credentials for the AWS, Azure, and GCP billing APIs.
3. Set your budget threshold in the Cost Spike Detector node.
4. Test the workflow to ensure all APIs fetch data correctly.
5. Adjust the Cron Trigger for your preferred monitoring frequency.
6. Monitor alert logs to track and manage cost spikes.

**Requirements**
- AWS Access Key & Secret Key with Cost Explorer read permissions.
- Azure Client ID, Tenant ID, and Client Secret with the Cost Management Reader role.
- GCP Service Account JSON key with the Billing Account Viewer role.
**Customizing This Workflow**
- Change the trigger frequency in the Cron node (e.g., every 15 minutes for faster alerts).
- Modify alert channels to include additional messaging platforms.
- Adjust cost spike detection thresholds to suit your organization's budget rules.
- Extend the Data Parser to generate more detailed cost breakdowns.

Want a tailored workflow for your business? Our experts can craft it quickly. Contact our team.
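The Cost Spike Detector's threshold logic can be sketched as a Code-node-style function. The field names and threshold values here are illustrative assumptions; the idea is to compare current spend against an absolute budget and against the previous hour:

```javascript
// Sketch of a cost spike detector: flag a resource when its spend exceeds an
// absolute hourly budget, or jumps sharply versus the previous hour.
// Thresholds and field names are illustrative assumptions.
const BUDGET_THRESHOLD = 100; // absolute hourly budget (USD)
const SPIKE_RATIO = 1.5;      // a 50% jump vs the previous hour counts as a spike

function detectSpike(entry) {
  const overBudget = entry.currentCost > BUDGET_THRESHOLD;
  const spiked =
    entry.previousCost > 0 && entry.currentCost / entry.previousCost >= SPIKE_RATIO;
  return { ...entry, alert: overBudget || spiked };
}

const result = detectSpike({ cloud: "aws", resource: "ec2", previousCost: 20, currentCost: 45 });
console.log(result.alert); // true (45 / 20 = 2.25 >= 1.5)
```

Tuning `SPIKE_RATIO` per team or per cloud is the natural place to encode your organization's budget rules.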
by Rahul Joshi
**Description**
Automate your personal email management with this AI-powered inbox triage system built entirely in n8n. This template connects Gmail, Azure OpenAI (GPT-4o-mini), and Notion to classify, summarize, and store your incoming emails, helping you focus only on what matters.

The workflow fetches unread emails from Gmail, runs them through a custom AI classification model (Important, Ignore, Delegate, Reply Later), creates clear summaries, and stores the results in Notion for future reference. No more clutter: your inbox is automatically sorted, prioritized, and documented.

**What This Template Does (Step-by-Step)**
1. **Fetch unread emails from Gmail.** Retrieves only unread, inbox-labeled emails via the Gmail API, capturing sender, subject, and email content for processing.
2. **Split emails for individual processing.** Breaks down the bulk email retrieval into single-item batches for parallel AI classification.
3. **Clean and structure email data.** Extracts subject, sender, and message text, removing unnecessary metadata for cleaner AI inputs.
4. **AI classification with Azure OpenAI (GPT-4o-mini).** Categorizes emails into Important, Ignore, Delegate, or Reply Later, using a precise, prompt-engineered LLM chain for consistent results.
5. **Generate clear, actionable summaries.** Combines the classification and key email details into concise summaries.
6. **Aggregate results into a digest.** Merges all processed email summaries into a batch report.
7. **Store insights in Notion.** Saves structured summaries and classifications into a Notion page for easy tracking and retrieval.

**Perfect For**
- Busy professionals who want a clutter-free inbox
- Founders and executives managing high email volume
- Remote teams needing quick email triage and visibility
- Productivity enthusiasts looking to integrate AI into their workflow
**Built With**
- Gmail API (email retrieval)
- n8n Split In Batches (parallel processing)
- Azure OpenAI GPT-4o-mini (classification & summarization)
- Notion API (data storage & archiving)

**Key Benefits**
- Saves hours of manual email triage
- Ensures no important emails are missed
- AI-driven, consistent prioritization
- Centralized email intelligence in Notion
- Fully no-code and customizable
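The "clean and structure" step can be sketched as a small Code-node-style helper. The input field names are assumptions about the Gmail node's output; the point is to keep only what the classifier needs and cap the text length:

```javascript
// Sketch: reduce a raw email to the fields the classifier needs.
// Field names are assumptions; adjust to your Gmail node's actual output.
function cleanEmail(raw) {
  return {
    subject: raw.subject || "(no subject)",
    from: raw.from || "",
    // Collapse whitespace and cap the body so the LLM input stays small.
    text: String(raw.text || "").replace(/\s+/g, " ").trim().slice(0, 4000),
  };
}

const cleaned = cleanEmail({ subject: "Q3 report", from: "cfo@example.com", text: "Hi,\n\n  please review  the numbers. " });
console.log(cleaned.text); // "Hi, please review the numbers."
```

Capping the body also keeps token usage (and cost) predictable across large unread backlogs.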
by Sayone Technologies
**Meeting Notes Summarizer & Slack Notifier**
Easily keep your team aligned by summarizing meeting notes, extracting action items, and delivering them directly to Slack.

**What This Workflow Does**
- Triggers on a schedule to fetch meeting data from your note-taking tool
- Retrieves meeting summaries and action items using the MeetGeek API
- Uses Google Gemini AI to generate concise summaries and action points
- Restructures the output into Slack Block Kit format
- Sends daily Slack notifications with clear summaries and actionable tasks

**Who Is This For?**
- Teams that want automated daily meeting briefs
- Project managers who need action items clearly assigned
- Remote or hybrid teams using Slack as their main communication hub
- Anyone looking to reduce the time spent reviewing long meeting notes

**Technical Requirements**
- API key & credentials for your meeting note-taking app
- Google Gemini AI credentials
- Slack workspace with proper OAuth setup

**Set Up the Workflow with Ease**
1. Configure your meeting note API in the "Get Meetings List" and "Summary" nodes.
2. Add Gemini AI credentials for generating summaries.
3. Connect your Slack channel for notifications.
4. Activate the workflow so your team starts receiving daily meeting insights automatically.

**Want to Customize It Further?**
- Change the trigger schedule (daily, weekly, or after each meeting).
- Modify the Slack Block Kit layout for different formatting styles.
- Add extra integrations like email, Notion, or Google Docs to save notes.
- Adjust the AI prompt for different summary styles (short/long, formal/casual, etc.).
- Filter meetings by specific teams, projects, or keywords.
- Customize the API URL in the HTTP Request node to connect with other note-taking tools or different API endpoints.
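The "restructure into Slack Block Kit" step can be sketched as follows. The block structure follows Slack's Block Kit format; the shape of the AI summary object is an assumption for illustration:

```javascript
// Sketch: turn an AI-generated summary into a Slack Block Kit payload.
// Block types follow Slack Block Kit; the summary object's shape is assumed.
function toSlackBlocks(summary) {
  return {
    blocks: [
      // Header blocks only accept plain_text.
      { type: "header", text: { type: "plain_text", text: "Daily Meeting Brief" } },
      { type: "section", text: { type: "mrkdwn", text: summary.overview } },
      {
        type: "section",
        text: {
          type: "mrkdwn",
          text: "*Action items:*\n" + summary.actionItems.map((a) => `- ${a}`).join("\n"),
        },
      },
    ],
  };
}

const payload = toSlackBlocks({
  overview: "Weekly sync: roadmap review and Q3 planning.",
  actionItems: ["Alice: draft Q3 OKRs", "Bob: schedule design review"],
});
console.log(payload.blocks.length); // 3
```

Changing the layout (for example, one section per meeting) only means editing this one mapping function.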
by Dhruv from Saleshandy
Automatically import new user signups from any database, filter by signup date, and enroll users into your Saleshandy email sequence for immediate engagement. Activity is logged to a spreadsheet (e.g., Google Sheets) for tracking and analytics. Fully configurable, with no hardcoded values.

**Prerequisites**
- A database with a users table (fields: id, full_name, email, created_at)
- Saleshandy account with an API key and an active sequence
- Spreadsheet (e.g., Google Sheets) with columns: ID, Name, Email, Created_at
- Configured OAuth/API credentials for each service

**How It Works**
1. Fetches new signups from your database within your desired date range (e.g., daily or weekly).
2. Splits user names and formats user data as needed.
3. Adds each user to your Saleshandy sequence using their name and email.
4. Logs every processed record in your spreadsheet for further tracking and analytics.
5. Runs automatically on your defined schedule (example: daily trigger).

**Set Up Steps** (estimated time: 10–20 minutes)
1. Collect your database and Saleshandy access credentials, plus your spreadsheet info.
2. Edit the database node(s) to include your connection and the correct date filter.
3. Set your Saleshandy API key and target sequence ID.
4. Enter your spreadsheet link or ID and authenticate as needed.
5. Test the workflow with a small user batch before scheduling it for routine runs.
6. Check the sticky notes by each workflow node for details and best practices.

**Requirements**
- Database connection credentials and access
- Saleshandy API key and sequence ID
- Google Sheets (or alternative) setup credentials

**Customisation Tips**
- Edit the date filter to adjust the range (last day, week, month, or custom).
- Add error-handling nodes to catch issues with API calls or data.
- Set up notifications (email, Slack, etc.) for process success/failure.
- Rename nodes to reflect your business logic or steps.
- Replace the manual trigger with a webhook or scheduled cron if desired.
- Configure workflow variables for all credentials and IDs; avoid hardcoding.
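The "splits user names" step can be sketched as a simple helper: take `full_name` from the users table and derive the first/last name fields a sequence tool typically expects. This is a sketch under that assumption, not Saleshandy's exact field names:

```javascript
// Sketch: split a full_name into first and last name for sequence enrollment.
// Handles single-word names and extra whitespace.
function splitName(fullName) {
  const parts = String(fullName || "").trim().split(/\s+/).filter(Boolean);
  return {
    firstName: parts[0] || "",
    lastName: parts.slice(1).join(" "),
  };
}

console.log(splitName("Ada Lovelace King")); // { firstName: 'Ada', lastName: 'Lovelace King' }
```

Guarding against empty and single-word names here avoids malformed personalization tokens downstream.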
by Rapiwa
**Who is this for?**
This workflow is ideal for WooCommerce store owners who want to automatically send promotional WhatsApp messages to their customers when new coupons are created. It's designed for marketers and eCommerce managers looking to boost engagement, streamline coupon sharing, and track campaign performance effortlessly through Google Sheets.

**Overview**
This workflow listens for WooCommerce coupon creation events (coupon.created) and uses customer billing data to send promotional WhatsApp messages via the Rapiwa API. The flow formats the coupon data, cleans phone numbers, verifies WhatsApp registration with Rapiwa, sends the promotional message when verified, and logs each attempt to Google Sheets (separate sheets for verified/sent and unverified/not sent).

**What this Workflow Does**
- Listens for new coupon creation events in WooCommerce via the WooCommerce Trigger node
- Retrieves all customer data from the WooCommerce store
- Processes customers in batches to control throughput
- Cleans and formats customer phone numbers for WhatsApp
- Verifies that phone numbers are valid WhatsApp accounts using the Rapiwa API
- Sends personalized WhatsApp messages with coupon details to verified numbers
- Logs all activities to Google Sheets for tracking and analysis
- Handles both verified and unverified numbers appropriately

**Key Features**
- Automated coupon distribution: triggers when new coupons are created in WooCommerce
- Customer data retrieval: fetches all customer information from WooCommerce
- Phone number validation: verifies WhatsApp numbers before sending messages
- Personalized messaging: includes the customer name and coupon details in messages
- Dual logging system: tracks both successful and failed message attempts
- Rate limiting: uses batching and wait nodes to prevent API overload
- Data formatting: structures coupon information for consistent messaging

**Google Sheet Column Structure**
A Google Sheet formatted like the sample below. The workflow uses a Google Sheet with the following columns to track
coupon distribution:

| name | number | email | address1 | couponCode | couponTitle | couponType | couponAmount | createDate | expireDate | validity | status |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Abdul Mannan | 8801322827799 | contact@spagreen.net | mirpur-DOHS | 62dhryst | eid offer 2025 | percent | 20.00 | 2025-09-11 06:08:02 | 2025-09-15 00:00:00 | unverified | not sent |
| Abdul Mannan | 8801322827799 | contact@spagreen.net | mirpur-DOHS | 62dhryst | eid offer 2025 | percent | 20.00 | 2025-09-11 06:08:02 | 2025-09-15 00:00:00 | verified | sent |

**Requirements**
- n8n instance with the following nodes: WooCommerce Trigger, Code, SplitInBatches, HTTP Request, IF, Google Sheets, Wait
- WooCommerce store with API access
- Rapiwa account with API access for WhatsApp verification and messaging
- Google account with Sheets access
- Customer phone numbers in WooCommerce (stored in the billing.phone field)

**Important Notes**
- **Phone number format**: the workflow cleans phone numbers by removing all non-digit characters. Ensure your WooCommerce phone numbers are in a compatible format.
- **API rate limits**: Rapiwa and WooCommerce APIs have rate limits. Adjust batch sizes and wait times accordingly.
- **Data privacy**: ensure compliance with data protection regulations when sending marketing messages.
- **Error handling**: the workflow logs unverified numbers but doesn't have extensive error handling. Consider adding error notifications for failed API calls.
- **Message content**: the current message template references the first coupon only (coupons[0]). Adjust it if you need to handle multiple coupons.
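The phone-cleaning step described above (strip every non-digit character before the Rapiwa check) can be sketched as a one-line Code-node helper:

```javascript
// Sketch: normalize a billing.phone value by removing all non-digit
// characters, as the workflow's cleaning step describes.
function cleanPhone(raw) {
  return String(raw || "").replace(/\D/g, "");
}

console.log(cleanPhone("+880 132-282-7799")); // "8801322827799"
```

Note this only strips formatting; if some customers omit the country code, you may also need to prepend it before verification.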
**Useful Links**
- Dashboard: https://app.rapiwa.com
- Official website: https://rapiwa.com
- Documentation: https://docs.rapiwa.com

**Support & Help**
- WhatsApp: Chat on WhatsApp
- Discord: SpaGreen Community
- Facebook Group: SpaGreen Support
- Website: https://spagreen.net
- Developer portfolio: Codecanyon SpaGreen
by Jason Stelo
This workflow uses Tally.so to collect client input (entered by you during or after the meeting) from a simple form, and sends that data to n8n via an API webhook. Once received, n8n processes the information and uses OpenAI to expand on the provided details, transforming the short client notes into a complete, well-structured proposal. After generating the proposal, the workflow automatically:
- Drafts a professional follow-up email using the generated proposal details.
- Prepares the email inside Gmail, ready for your review and final send.

This creates a fast, automated process, turning raw meeting notes into a polished, client-ready deliverable within minutes.
by Intuz
This n8n template from Intuz provides a complete and automated solution for secure document archiving. It automatically saves new QuickBooks invoice PDFs directly into Google Drive, creating a reliable backup system. For perfect organization, the workflow uses keywords from the invoice, like the client name or invoice number, to dynamically name the PDF files, ensuring you have a complete and easily searchable financial record.

**Use Cases**
1. Automated document archiving: eliminate the manual work of downloading and saving invoices. Set it up once and let it run.
2. Compliance & auditing: maintain a clean, chronological, and separate record of all issued invoices for easy access during audits.
3. Secure backup: create a redundant, secure backup of your critical financial documents in your own cloud storage.
4. Enhanced team access: share the Google Drive folder with accountants, bookkeepers, or team members who need access to invoices but not to your full QuickBooks account.

**How It Works**
1. Real-time invoice trigger: the workflow starts the instant a new invoice is created in your QuickBooks account. A configured webhook sends a notification to n8n, kicking off the automation immediately.
2. Fetch invoice metadata: the workflow uses the invoice ID from the webhook to retrieve the full invoice details, such as the customer's name and the transaction date. This information is used in the next steps.
3. Generate the invoice PDF: a crucial HTTP Request node makes a direct API call to QuickBooks, requesting a PDF version of the invoice. This ensures the archived document is the official, formatted PDF, exactly as it appears in QuickBooks.
4. Upload and archive in Google Drive: the final node takes the binary PDF data and uploads it to your specified Google Drive folder. It dynamically names the file for easy identification (e.g., CustomerName_TransactionDate.pdf), creating a perfectly organized and searchable archive.
**Setup Instructions**
To get this workflow running, follow these key setup steps:
1. **Credentials**
   - QuickBooks: connect your QuickBooks account credentials to n8n.
   - Google: connect your Google account using OAuth2 credentials and ensure the Google Drive API is enabled.
2. **QuickBooks webhook configuration**
   - First, activate this n8n workflow to make the webhook URL live.
   - Copy the Production URL from the QuickBooks Webhook node.
   - In your Intuit Developer Portal, go to the webhooks section for your app, paste the URL, and subscribe to Invoice creation events.
3. **Node configuration**
   - Get an invoice & Generate PDF File: these nodes will use your configured QuickBooks credentials automatically.
   - Upload file (Google Drive): in the parameters for this node, you must select the Folder ID where you want your invoices to be saved.

**Support**
If you need help setting up this workflow or require a custom version tailored to your specific use case, please feel free to reach out to the template author:
- Website: https://www.intuz.com/services
- Email: getstarted@intuz.com
- LinkedIn: https://www.linkedin.com/company/intuz
- Get Started: https://n8n.partnerlinks.io/intuz

For custom workflow automation, click here: Get Started
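The dynamic file name from step 4 of "How It Works" (e.g., CustomerName_TransactionDate.pdf) can be sketched as a small helper. The invoice fields follow the QuickBooks API's Invoice object (CustomerRef.name, TxnDate), but treat the exact shape as an assumption:

```javascript
// Sketch: build the archive file name from QuickBooks invoice metadata,
// e.g. "CustomerName_TransactionDate.pdf". Field names follow the QuickBooks
// Invoice object (CustomerRef.name, TxnDate) but are assumptions here.
function buildFileName(invoice) {
  const customer = ((invoice.CustomerRef && invoice.CustomerRef.name) || "Unknown")
    .replace(/\s+/g, ""); // drop spaces so the name is filesystem-friendly
  const date = invoice.TxnDate || "undated";
  return `${customer}_${date}.pdf`;
}

console.log(buildFileName({ CustomerRef: { name: "Acme Corp" }, TxnDate: "2025-01-15" }));
// "AcmeCorp_2025-01-15.pdf"
```

Because TxnDate is in ISO order (YYYY-MM-DD), an alphabetical sort of the Drive folder is also a chronological one per customer.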
by Kaden Reese
**AI-Powered Mortgage Rate Updates with Client Messaging**
Keep your clients informed without the repetitive work. This workflow automatically pulls the latest mortgage rates, cleans the data, and uses AI to craft polished messages you can send directly to clients. Whether you want professional emails, quick SMS-style updates, or even CRM-ready messages, this setup saves time while making you look on top of the market.

**How it Works**
1. Daily Trigger: runs on a schedule you choose (default: multiple times per day).
2. Fetch Rates: pulls the latest mortgage rates from Mortgage News Daily (you can swap in another source).
3. Clean Data: prepares and formats the raw rate data for messaging.
4. AI Messaging: uses Google AI Studio (Gemini) to generate text/email content that's clear, professional, and client-ready. You can customize the prompt to adjust tone or style, and include variables (like client names or CRM fields) for personalized outreach.
5. Send Updates: delivers the AI-crafted message to Discord by default, for you to copy and send to your clients or upload to your bulk iMessage or email tool. It can also be adapted for Slack, Telegram, WhatsApp, or Gmail.

**Why Use This**
- **Save hours**: no more copy-pasting rates into client messages.
- **Look prepared**: clients see you as proactive, not reactive.
- **Customizable**: use AI prompts to match your personal voice, include client-specific details, or change the delivery channel.
- **Scalable**: works for one agent or an entire brokerage team.

With this workflow, by the time your client asks "what are rates today?", they'll already have a polished update waiting in their inbox or chat.
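The "Clean Data" step can be sketched as a Code-node-style formatter that turns scraped rate rows into a compact summary string for the AI prompt. The row shape (product, rate, change) is an illustrative assumption:

```javascript
// Sketch: format scraped mortgage-rate rows into a compact summary for the
// AI prompt. The row shape (product, rate, change) is an assumption.
function formatRates(rates) {
  return rates
    .map((r) => `${r.product}: ${r.rate}% (${r.change >= 0 ? "+" : ""}${r.change} vs yesterday)`)
    .join("\n");
}

const summary = formatRates([
  { product: "30-yr fixed", rate: 6.85, change: -0.03 },
  { product: "15-yr fixed", rate: 6.12, change: 0.02 },
]);
console.log(summary);
// 30-yr fixed: 6.85% (-0.03 vs yesterday)
// 15-yr fixed: 6.12% (+0.02 vs yesterday)
```

Feeding the model a pre-cleaned string like this keeps the generated client messages numerically accurate, since the AI only rewords rather than recalculates.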