by Roman Rozenberger
This workflow is perfect for technical writers, content creators, marketers, and developers who write in Markdown but need to collaborate or publish using Google Docs format. Ideal for teams that want to streamline their content creation and review process. What problem does this workflow solve? Manual conversion from Markdown to Google Docs is time-consuming and often loses formatting. This workflow eliminates the tedious copy-paste process, automatically preserves formatting, and creates organized, timestamped documents in your Google Drive. Perfect for content teams who write in Markdown but need Google Docs for collaboration and review. What this workflow does Converts Markdown to HTML** with proper formatting preservation (headers, lists, links, tables) Creates timestamped Google Docs** documents with automatic naming Adds Drive location metadata** for better organization and reference Maintains document structure** including emojis, tables, and text formatting Automates file creation** in specified Google Drive folders Setup Google Drive OAuth2 credentials configured in n8n Target Google Drive folder URL Input your content title and Markdown text in the "Set Input Data" node How to customize this workflow to your needs Modify HTML formatting options** in the Markdown conversion node Change file naming patterns** to match your organization system Adjust Drive folder structure** and metadata inclusion Update MIME type handling** for different output requirements Add additional processing steps** like notifications or integrations Perfect for technical documentation workflows, content publishing pipelines, blog preparation, and automated report generation. Setup Instructions - Markdown to Google Docs Converter Prerequisites n8n instance** (local or cloud) Google account** with Google Drive access Basic understanding** of n8n workflow configuration Step 1: Import the Workflow Open n8n and navigate to Workflows Click "Add workflow" → "Import from JSON" Upload the Export_Markdown_Content_do_Google_Docs_Document.json file Save the workflow with a descriptive name Step 2: Configure Google Drive Credentials Create Google Drive OAuth2 Credentials In n8n, go to Settings → Credentials Click "Add credential" → "Google Drive OAuth2 API" Follow the OAuth setup to authorize n8n access to Google Drive: Visit Google Cloud Console Create or select a project Enable Google Drive API Create OAuth2 credentials Add authorized redirect URI for your n8n instance Name the credential (e.g., "Google Drive - Markdown Converter") Configure Google Drive Nodes Update these nodes with your Google Drive credentials: Create Empty File Update Document with Correct HTML Formatting In each node: Select your Google Drive credential from the dropdown Test the connection to ensure it works properly Step 3: Prepare Your Google Drive Create Target Folder Go to Google Drive (drive.google.com) Create a new folder for your converted documents Copy the folder URL (will look like: https://drive.google.com/drive/folders/FOLDER_ID) Ensure the folder has proper permissions for your Google account Step 4: Configure Input Data Set Your Default Values Open the "Set Input Data" node Update the assignments with your preferences: Google Drive URL: Replace the example URL with your target folder URL Format: https://drive.google.com/drive/folders/YOUR_FOLDER_ID Content Title: Set a default title or leave placeholder text This will be used in the document filename Content in Markdown: Add your Markdown content or keep example for testing 
Supports standard Markdown syntax (headers, lists, links, tables) Step 5: Test the Workflow Initial Test Run Ensure all credentials are configured Click the "Test workflow" button on the Manual Trigger node Monitor the execution - check for any errors in node outputs Verify the result: Check your Google Drive folder Look for a new document with timestamp in the name Open the document to verify formatting Troubleshooting Common Issues Google Drive Permission Errors: Verify OAuth2 credentials are properly configured Check that the target folder exists and is accessible Ensure Google Drive API is enabled in Google Cloud Console Markdown Conversion Issues: Check that your Markdown syntax is valid Test with simple content first (headers, paragraphs, lists) Verify the "Change Markdown To HTML" node settings File Creation Problems: Confirm the Google Drive folder URL format is correct Check that the folder ID in the URL is valid Ensure your Google account has write permissions to the folder Step 6: Customize for Your Needs Modify HTML Formatting Options In the "Change Markdown To HTML" node: Enable/disable emoji support** (currently enabled) Adjust table formatting** (currently enabled) Modify header ID generation** (currently disabled) Configure space requirements** for headers Update File Naming Pattern In the "Create Empty File" node: Change the naming convention**: Currently uses _PUB {Content Title} {timestamp} Modify timestamp format**: Currently yyyy-MM-dd HH:mm:ss Add prefixes or suffixes** as needed for your organization Step 7: Production Usage Regular Workflow Execution Update the "Set Input Data" node with new content Execute the workflow manually or set up triggers Monitor execution logs for any issues Check Google Drive for generated documents Integration Options Webhook Integration: Add a Webhook trigger to accept external Markdown content Useful for automated content publishing workflows Email Integration: Add email notifications when documents are created Include links to generated Google Docs Advanced Configuration Error Handling Add error handling nodes after critical operations Implement retry logic for API failures Set up notifications for failed executions Performance Optimization Adjust the "Wait for Document Creation" timing if needed Consider file size limits for Google Docs Support and Troubleshooting Common Solutions Timeout errors**: Increase wait time in "Wait for Document Creation" Authentication failures**: Refresh Google OAuth2 credentials Formatting issues**: Test with simpler Markdown first Getting Help Check n8n community forums for Google Drive integration issues Review Google Drive API documentation for rate limits Test with minimal Markdown content to isolate problems Total setup time: ~15-20 minutes Difficulty level: Intermediate Requirements: Google account, n8n instance, basic OAuth2 setup knowledge
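For reference, here is a minimal JavaScript sketch (not the template's actual node code) of two pieces of the setup above: pulling the folder ID out of a Google Drive folder URL and building the `_PUB {Content Title} {timestamp}` file name. The function names and the plain `Date`-based timestamp are illustrative assumptions — the workflow itself handles this inside its nodes.

```javascript
// Minimal sketch (assumptions noted above): derive the Drive folder ID and the
// "_PUB {Content Title} {timestamp}" file name from the Set Input Data values.
function extractFolderId(driveUrl) {
  // e.g. https://drive.google.com/drive/folders/FOLDER_ID
  const match = driveUrl.match(/\/folders\/([a-zA-Z0-9_-]+)/);
  return match ? match[1] : null;
}

function buildFileName(contentTitle, date = new Date()) {
  const pad = (n) => String(n).padStart(2, "0");
  const timestamp =
    `${date.getFullYear()}-${pad(date.getMonth() + 1)}-${pad(date.getDate())} ` +
    `${pad(date.getHours())}:${pad(date.getMinutes())}:${pad(date.getSeconds())}`;
  return `_PUB ${contentTitle} ${timestamp}`;
}

// Example usage with placeholder values:
console.log(extractFolderId("https://drive.google.com/drive/folders/1AbC_dEf-123"));
console.log(buildFileName("My Article"));
```

If the extracted ID comes back null, double-check that the URL follows the /drive/folders/FOLDER_ID format described in Step 3.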
by Damian Karzon
This workflow randomly selects recipes from a Mealie instance (optionally from a specific category) and then creates a meal plan in Mealie with those recipes.

How it works:
The workflow has a scheduled trigger (set to run weekly on a Friday)
A Config node sets a few properties to configure the workflow
A call to the Mealie API gets the list of recipes
The Code node holds most of the logic: it loops through the number of recipes defined in the Config node and randomly selects a recipe from the list (making sure not to double up any recipes)
Once all the recipes are selected, it calls the Mealie API to set up the meal plan on the corresponding days

Setup
Add your Mealie API token as a credential and set it on the HTTP Request nodes
Set the schedule trigger to run when you like
Update the Config node with the configuration you want:
numberOfRecipes - Number of recipes to populate for the meal plan
offsetPlanDays - Number of days in the future to start the plan (0 will start it today, 1 tomorrow, etc.)
mealieCategoryId - The category ID of the category you want to pull recipes from (defaults to selecting from all recipes)
mealieBaseUrl - The base URL of your Mealie instance
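To make the recipe-picking step concrete, here is an illustrative JavaScript sketch of the logic the Code node is described as performing — selecting numberOfRecipes unique recipes at random. It is a standalone sketch with placeholder recipe objects, not the template's exact code.

```javascript
// Pick `numberOfRecipes` unique recipes at random from the Mealie recipe list.
function pickRandomRecipes(recipes, numberOfRecipes) {
  const pool = [...recipes];        // copy so we can remove picked items
  const selected = [];
  while (selected.length < numberOfRecipes && pool.length > 0) {
    const index = Math.floor(Math.random() * pool.length);
    selected.push(pool.splice(index, 1)[0]); // splice guarantees no duplicates
  }
  return selected;
}

// Example with placeholder recipe objects:
const recipes = [{ id: "a" }, { id: "b" }, { id: "c" }, { id: "d" }];
console.log(pickRandomRecipes(recipes, 2));
```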
by Gain FLow AI
Inquiry Form to Personalised WhatsApp Message Overview This workflow creates a smart, automated system for capturing leads from an inquiry form, initiating a personalized WhatsApp message via the Unipile API, and updating your Google Sheet CRM. It uses AI to craft initial outreach messages and logs the success or failure of each message sent, ensuring you track every lead effectively. This automation helps you engage leads quickly and efficiently, without manual effort. Use Case This workflow is ideal for: **Sales Teams**: Automate the first touchpoint with new leads, qualifying them and initiating conversations. **Small Businesses**: Provide immediate, personalized responses to inquiries, enhancing customer experience. **Customer Support**: Quickly gather more context from users after they fill out a help form. **Lead Generation**: Streamline the process from form submission to active lead engagement and CRM tracking. How It Works Form Submission Trigger: The workflow is activated when someone submits an "Inquiry Form." This form collects essential lead details such as: Full Name, Email, WhatsApp number, Company Name, and "How can we help you?" (a notes field). AI Crafts Personalized Message: An OpenAI node, acting as "Alex" (a friendly, approachable human assistant), generates a short, personalized, and engaging opening message for the lead. This message directly addresses the lead by their first name and includes an open-ended question to encourage them to share more details about their needs. WhatsApp Outreach: The AI then uses the WhatsApp API (via Unipile) to send this personalized message directly to the lead's WhatsApp number. Unipile is key here, as it allows sending messages without prior chat history and can connect to your personal WhatsApp. Log Success or Failure: The AI checks the response from the WhatsApp API. If the WhatsApp message is sent successfully: The lead's details, along with the personalized message, WhatsApp chat ID, and message ID, are logged into a "Successful" sheet in your Google Sheet CRM. If the WhatsApp message fails to send: The lead's information, the attempted message, and the reason for failure are logged into a "Failed" sheet in your Google Sheet CRM. This helps you identify and follow up on problematic leads. How to Set It Up To set up your Lead Capture Agent, follow these steps: Google Sheet Setup: Copy the Template: Make a copy of the provided Google Sheet Template ("Sales Agent" with "Successful" and "Failed" sheets) into your own Google Drive. Connect Google Sheets: Ensure your Google Sheets OAuth2 API credentials are set up in n8n and linked to the "Google Sheets" and "Google Sheets3" nodes. Update Sheet IDs: In both "Google Sheets" and "Google Sheets3" nodes, update the documentId with the ID of your copied "Sales Agent" Google Sheet. Unipile (WhatsApp API) Credentials: Sign up for Unipile: Get your DSN and API key from Unipile (they offer a 7-day free trial). Replace Placeholders: In the "Whatsapp API" node, replace <YOUR_DSN>, <YOUR_API_KEY>, and <YOUR_ACCOUNT_ID> with your actual Unipile credentials. OpenAI API Key: Connect your OpenAI API key as an API credential in n8n and link it to the "OpenAI" node. Inquiry Form Setup: The "Enquiry Form" node generates a public webhook URL. You can embed this form on your website or share the URL directly. Alternatively, if you use your own form solution, configure it to send data via a webhook to the URL provided by the "Enquiry Form" node. Import the Workflow: Import the provided workflow JSON into your n8n instance. 
Activate and Test: Once all settings are complete, activate the workflow. Test it by submitting a new entry through the "Inquiry Form." Check your Google Sheet to see the lead captured and the message status. This workflow is designed to ensure no lead falls through the cracks, giving your sales or support team a powerful edge!
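As a rough illustration of the success/failure logging described above, the sketch below shows how a Code-style step might shape the CRM row depending on the WhatsApp API response. The response fields (message_id, chat_id, error) and the row keys are assumptions for illustration only — check the actual Unipile response and match your sheet columns.

```javascript
// Hedged sketch of the success/failure routing. The response shape is an assumption:
// many messaging APIs return an id on success and an error message otherwise.
function buildCrmRow(lead, outreachMessage, apiResponse) {
  const base = {
    fullName: lead.fullName,      // field names here are illustrative, not the exact sheet columns
    email: lead.email,
    whatsapp: lead.whatsapp,
    company: lead.companyName,
    message: outreachMessage,
  };
  if (apiResponse && apiResponse.message_id) {
    return { sheet: "Successful", row: { ...base, chatId: apiResponse.chat_id, messageId: apiResponse.message_id } };
  }
  return { sheet: "Failed", row: { ...base, failureReason: (apiResponse && apiResponse.error) || "Unknown error" } };
}

// Example usage with placeholder data:
console.log(buildCrmRow(
  { fullName: "Jane Doe", email: "jane@example.com", whatsapp: "+15550100", companyName: "Acme" },
  "Hi Jane, thanks for reaching out!",
  { chat_id: "chat_123", message_id: "msg_456" }
));
```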
by Teddy
Retrieve 20 Latest TechCrunch Articles

Who is this for?
This workflow is designed for developers, content creators, and data analysts who need to scrape recent articles from TechCrunch. It’s perfect for anyone looking to aggregate news articles or create custom feeds for analysis, reporting, or integration into other systems.

What problem is this workflow solving?
This workflow automates the process of scraping recent articles from TechCrunch. Manually collecting article data can be time-consuming and inefficient, but with this workflow, you can quickly gather up-to-date news articles with relevant metadata, saving time and effort.

What this workflow does
This workflow retrieves the latest 20 news articles from TechCrunch’s “Recent” page. It extracts the article URLs, metadata (such as titles and publication dates), and main content for each article, allowing you to access the information you need without any manual effort.

Setup
Clone or download the workflow template.
Ensure you have a working n8n environment.
Configure the HTTP Request nodes with your desired parameters to connect to the TechCrunch website.
(Optional) Customize the workflow to target specific sections or topics of interest.
Run the workflow to retrieve the latest 20 articles.

How to customize this workflow to your needs
Modify the HTTP request to pull articles from different pages or sections of TechCrunch.
Adjust the number of articles to retrieve by changing the selection criteria.
Add additional processing steps to further filter or analyze the article data.

Workflow Steps
Send an HTTP request to the TechCrunch "Recent" page.
Parse the posts container that holds the list of articles.
Parse all posts to extract the individual articles.
Split out the posts so each article becomes its own item (see the sketch below).
Extract the URL and metadata from each article.
Send an HTTP request for each article using its URL.
Locate and parse the main content of each article.

Note: Be sure to update the HTTP Request nodes with any necessary headers or authentication to work with TechCrunch’s website.
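The "split out the posts" step can be pictured as a small n8n Code-node snippet like the sketch below. The posts array and its url/title fields are assumptions about what the preceding HTML-extraction node produces — adjust the names to match your extraction settings.

```javascript
// Hedged sketch of the split step in an n8n Code node (Run Once for All Items).
// Assumes an earlier HTML Extract node produced a `posts` array with `url` and `title`.
const posts = items[0].json.posts || [];

return posts.slice(0, 20).map((post) => ({
  json: {
    url: post.url,     // passed to the per-article HTTP Request node
    title: post.title,
  },
}));
```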
by Mauricio Perera
n8n Workflow: Calculate the Centroid of a Set of Vectors Overview This workflow receives an array of vectors in JSON format, validates that all vectors have the same dimensions, and computes the centroid. It is designed to be reusable across different projects. Workflow Structure Nodes and Their Functions: Receive Vectors (Webhook): Accepts a GET request containing an array of vectors in the vectors parameter. Expected Input: vectors parameter in JSON format. Example Request: /webhook/centroid?vectors=[[2,3,4],[4,5,6],[6,7,8]] Output: Passes the received data to the next node. Extract & Parse Vectors (Set Node): Converts the input string into a proper JSON array for processing. Ensures vectors is a valid array. If the parameter is missing, it may generate an error. Expected Output Example: { "vectors": [[2,3,4],[4,5,6],[6,7,8]] } Validate & Compute Centroid (Code Node): Validates vector dimensions and calculates the centroid. Validation: Ensures all vectors have the same number of dimensions. Computation: Averages each dimension to determine the centroid. If validation fails: Returns an error message indicating inconsistent dimensions. Successful Output Example: { "centroid": [4,5,6] } Error Output Example: { "error": "Vectors have inconsistent dimensions." } Return Centroid Response (Respond to Webhook Node): Sends the final response back to the client. If the computation is successful, it returns the centroid. If an error occurs, it returns a descriptive error message. Example Response: { "centroid": [4, 5, 6] } Inputs JSON array of vectors, where each vector is an array of numerical values. Example Input:

```json
{
  "vectors": [
    [1, 2, 3],
    [4, 5, 6],
    [7, 8, 9]
  ]
}
```

Setup Guide Create a new workflow in n8n. Add a Webhook node (Receive Vectors) to receive JSON input. Add a Set node (Extract & Parse Vectors) to extract and convert the data. Add a Code node (Validate & Compute Centroid) to: Validate dimensions. Compute the centroid. Add a Respond to Webhook node (Return Centroid Response) to return the result. Function Node Script Example:

```javascript
const input = items[0].json;
const vectors = input.vectors;

if (!Array.isArray(vectors) || vectors.length === 0) {
  return [{ json: { error: "Invalid input: Expected an array of vectors." } }];
}

const dimension = vectors[0].length;
if (!vectors.every(v => v.length === dimension)) {
  return [{ json: { error: "Vectors have inconsistent dimensions." } }];
}

const centroid = new Array(dimension).fill(0);
vectors.forEach(vector => {
  vector.forEach((val, index) => {
    centroid[index] += val;
  });
});

for (let i = 0; i < dimension; i++) {
  centroid[i] /= vectors.length;
}

return [{ json: { centroid } }];
```

Testing Use a tool like Postman or the n8n UI to send sample inputs and verify the responses. Modify the input vectors to test different scenarios. This workflow provides a simple yet flexible solution for vector centroid computation, ensuring validation and reliability.
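Once the workflow is active, you can also exercise it from Node.js (version 18+ ships a global fetch) instead of Postman. This snippet assumes the webhook path /webhook/centroid from the example request above; replace the host with your own n8n instance.

```javascript
// Quick test of the centroid webhook from Node.js 18+.
const vectors = [[2, 3, 4], [4, 5, 6], [6, 7, 8]];
const url =
  "https://your-n8n-instance/webhook/centroid?vectors=" +
  encodeURIComponent(JSON.stringify(vectors));

fetch(url)
  .then((res) => res.json())
  .then((data) => console.log(data)); // expected: { "centroid": [4, 5, 6] }
```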
by Roger Filomeno
Introduction:
This workflow template helps you determine if a Twitch user's stream is currently live or offline.

Setup Instructions:
The Document node holds the sample Twitch username you wish to check. You may adapt it in your workflow by replacing this with a chain that contains the Twitch username you want to check. This value is passed to the GraphQL node query as $('Document').item.json.twitch, so make sure to change this based on your workflow.

How it Works:
The important nodes here are the GraphQL and IF nodes. The GraphQL node queries the Twitch API, and the output returns a document with the stream property. The IF node then checks if this property has a value: if it is null, the user is offline; otherwise the user is online or live.

Common Use Cases:
You can use this with other workflow templates to post live stream alerts to Twitter/X, Bluesky, and Discord via webhooks, etc., to notify your community to join your stream. You may also use an LLM node to write a custom alert based on the value of the property title.

How to adjust this template
If you want to check a list of Twitch channels, you can simply exchange the Document set node at the beginning with your list of channels.

For more information on the GraphQL output, please see the official Twitch API documentation: Get Streams
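If it helps to see the IF-node decision spelled out, here is a tiny JavaScript sketch of the live/offline check. The user.stream shape is an assumption about how your GraphQL query is structured; the key point is simply that a null stream means offline.

```javascript
// Minimal sketch of the live/offline decision the IF node makes.
function isLive(apiResult) {
  return Boolean(apiResult && apiResult.user && apiResult.user.stream);
}

// Example usage with placeholder responses:
console.log(isLive({ user: { stream: { title: "Speedrunning!" } } })); // true  -> post alert
console.log(isLive({ user: { stream: null } }));                       // false -> do nothing
```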
by Zacharia Kimotho
This workflow replaces the usual approach of regularly backing up your workflows to GitHub, using Google Drive as the backup host instead. It is a good way to keep track of your workflows so that you never lose any of them in case your n8n instance goes down.

How does it work
Creates a new folder, named with the backup time, inside a specified parent folder
Loops over all workflows, converts each one to a JSON file, and uploads it to the created folder
Gets the previous backups and deletes them

This keeps the backup clean and simple, without keeping a cache of old workflow copies on your Drive.

Setup
Create a new folder
Create new service account credentials
Share the folder with the service account email
Upload this workflow to your canvas and map the credentials
Set the schedule that you need your backups to run on
Activate the workflow

Happy Productivity!
@Imperol
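As a rough sketch of the "convert workflows to JSON files" step, an n8n Code node could shape each workflow into a binary item for the Drive upload roughly like this. The binary property name data and the base64 string follow common n8n Code-node patterns, but treat this as an illustration rather than the template's exact node.

```javascript
// Hedged sketch: turn each incoming workflow object into a downloadable JSON file.
// Assumes each item holds one workflow under `json`.
return items.map((item) => {
  const workflow = item.json;
  const fileContent = JSON.stringify(workflow, null, 2);
  return {
    json: { name: workflow.name },
    binary: {
      data: {
        data: Buffer.from(fileContent).toString("base64"),
        mimeType: "application/json",
        fileName: `${workflow.name || "workflow"}.json`,
      },
    },
  };
});
```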
by David Olusola
AI Lead Capture System - Complete Setup Guide Prerequisites n8n instance (cloud or self-hosted) Google AI Studio account (free tier available) Google account for Sheets integration Website with chat widget capability Phase 1: Core Infrastructure Setup Step 1: Set Up Google AI Studio Go to Google AI Studio Create account or sign in with Google Navigate to "Get API Key" Create new API key for your project Copy and securely store the API key Free tier limits: 15 requests/minute, 1 million tokens/month Step 2: Configure Google Sheets Create new Google Sheet for lead storage Add column headers (exact names): Full Name Company Name Email Address Phone Number Project Intent/Needs Project Timeline Budget Range Preferred Communication Channel How they heard about DAEX AI Copy the Google Sheet ID from URL (between /d/ and /edit) Ensure sheet is accessible to your Google account Step 3: Import n8n Workflow Open your n8n instance Create new workflow Click "..." menu → Import from JSON Paste the provided workflow JSON Workflow will appear with all nodes connected Phase 2: Credential Configuration Step 4: Set Up Google Gemini API In n8n, go to Credentials → Add Credential Search for "Google PaLM API" Enter your API key from Step 1 Test connection Link to the "Google Gemini Chat Model" node Step 5: Configure Google Sheets Access Go to Credentials → Add Credential Select "Google Sheets OAuth2 API" Follow OAuth flow to authorize your Google account Test connection with your sheet Link to the "Google Sheets" node Phase 3: Workflow Customization Step 6: Update Company Information Open the AI Agent node In the system message, replace all mentions of: Company name and description Service offerings and specializations FAQ knowledge base Typical project timelines and pricing ranges Adjust conversation tone to match your brand voice Step 7: Configure Lead Qualification Fields In the AI Agent system message, modify the required information list: Add/remove qualification questions Adjust budget ranges for your services Customize timeline options Update communication channel preferences In Google Sheets node, update column mappings if you changed fields Step 8: Set Up Sheet Integration Open Google Sheets node Click on Document ID dropdown Select your lead capture sheet Verify all column mappings match your sheet headers Test with sample data Phase 4: Website Integration Step 9: Get Webhook URL Open Webhook node in n8n Copy the webhook URL (starts with your n8n domain) Note: URL format is https://your-n8n-domain.com/webhook/[unique-id] Step 10: Connect Your Chat Widget Choose your integration method: Option A: Direct JavaScript Integration

```javascript
// Add to your website
function sendMessage(message, sessionId) {
  fetch('YOUR_WEBHOOK_URL', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      message: message,
      sessionId: sessionId || 'visitor-' + Date.now()
    })
  })
    .then(response => response.json())
    .then(data => {
      // Display AI response in your chat widget
      displayMessage(data.message);
    });
}
```

Option B: Chat Platform Webhook Open your chat platform settings (Intercom, Crisp, etc.) 
Find webhook/integration section Add webhook URL pointing to your n8n endpoint Configure to send message and session data Option C: Zapier/Make.com Integration Create new Zap/Scenario Trigger: New chat message from your platform Action: HTTP POST to your n8n webhook Map message content and session ID Phase 5: Testing & Optimization Step 11: Test Complete Flow Send test message through your chat widget Verify AI responds appropriately Check conversation context is maintained Confirm lead data appears in Google Sheets Test with various conversation scenarios Step 12: Monitor Performance Check n8n execution logs for errors Monitor Google Sheets for data quality Review conversation logs for improvement opportunities Track response times and conversion rates Step 13: Fine-Tune Conversations Analyze real conversation logs Update system prompts based on common questions Add new FAQ knowledge to the AI agent Adjust qualification questions based on lead quality Optimize for your specific customer patterns Phase 6: Advanced Features (Optional) Step 14: Add Lead Scoring Create new column in Google Sheets for "Lead Score" Update AI agent to calculate scores based on: Budget range (higher budget = higher score) Timeline urgency (sooner = higher score) Project complexity (complex = higher score) Add conditional formatting in Google Sheets to highlight high-value leads Step 15: Set Up Notifications Add email notification node after Google Sheets Configure to send alerts for high-priority leads Include lead details and conversation summary Set up different notification rules for different lead scores Step 16: Analytics Dashboard Connect Google Sheets to Google Data Studio or similar Create dashboard showing: Daily lead volume Conversion rates by source Average qualification time Lead quality scores Revenue pipeline from captured leads Troubleshooting Common Issues AI Not Responding Check Google Gemini API key validity Verify API quota not exceeded Review n8n execution logs for errors Data Not Saving to Sheets Confirm Google Sheets permissions Check column name matching Verify sheet ID is correct Chat Widget Not Connecting Test webhook URL directly with curl/Postman Verify JSON format matches expected structure Check CORS settings if browser-based integration Conversation Context Lost Ensure sessionId is unique per visitor Check memory node configuration Verify sessionId is passed consistently
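For Step 14's lead scoring, a small helper like the sketch below could run before the Google Sheets node. The budget bands, timeline labels, and weights are placeholders — tune them to the qualification fields you configured in Step 7.

```javascript
// Illustrative lead-scoring sketch; thresholds and field values are assumptions.
function scoreLead(lead) {
  let score = 0;
  if (lead.budgetRange === "$10k+") score += 3;
  else if (lead.budgetRange === "$5k-$10k") score += 2;
  else score += 1;

  if (lead.timeline === "ASAP") score += 3;
  else if (lead.timeline === "1-3 months") score += 2;
  else score += 1;

  if (lead.projectComplexity === "complex") score += 2;
  return score; // e.g. 7+ could be flagged as high priority in the sheet
}

console.log(scoreLead({ budgetRange: "$10k+", timeline: "ASAP", projectComplexity: "complex" })); // 8
```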
by JaredCo
This n8n workflow demonstrates how to transform natural language date and time expressions into structured data with 96%+ accuracy. Parse complex expressions like "early next July", "2 weeks after project launch", or "end of Q3" into precise datetime objects with confidence scoring, timezone intelligence, and business rules validation for any automation workflow. Good to know Achieves 96%+ accuracy on complex natural language date expressions At time of writing, this is the most advanced open-source date parser available Includes AI learning that improves over time with user corrections Supports 6 languages with auto-detection (English, Spanish, French, German, Italian, Portuguese) Sub-millisecond response times with intelligent caching Enterprise-grade with business intelligence and timezone handling How it works Natural Language Input**: Receives date expressions via webhook, form, email, or chat AI-Powered Parsing**: Your world-class date parser processes the text through: 50+ custom rule patterns for complex expressions Multi-language auto-detection and smart translation Confidence scoring (0.0-1.0) for AI decision-making Ambiguity detection with helpful suggestions Business Intelligence**: Applies enterprise rules automatically: Holiday calendar awareness (US + International) Working hours validation and warnings Business day auto-adjustment Timezone normalization (IANA format) Smart Scheduling**: Creates calendar events with: Structured datetime objects (start/end times) Confidence metadata for workflow decisions Alternative interpretations for ambiguous inputs Rich context for follow-up actions Integration Ready**: Outputs connect seamlessly to: Google Calendar, Outlook, Apple Calendar CRM systems (HubSpot, Salesforce) Project management tools (Notion, Asana) Communication platforms (Slack, Teams) How to use The webhook trigger receives natural language date requests from any source Replace the MCP server URL with your deployed date parser endpoint Configure timezone preferences for your organization Customize business rules (working hours, holidays) in the parser settings Connect calendar integration nodes for automatic event creation Add notification workflows for scheduling confirmations Use Cases Meeting Scheduling**: "Schedule our quarterly review for early Q3" Project Management**: "Set deadline 2 weeks after product launch" Event Planning**: "Book venue for the weekend before Labor Day" Personal Assistant**: "Remind me about dentist appointment next Tuesday morning" International Teams**: "Team standup tomorrow morning" (auto-timezone conversion) Seasonal Planning**: "Launch campaign in late spring 2025" Requirements Natural Language Date Parser MCP server (provided code) Webhook endpoint or form trigger Calendar integration (Google Calendar, Outlook, etc.) 
Optional: Slack/Teams for notifications Optional: Database for learning pattern storage Customizing this workflow **Multi-language Support**: Enable auto-detection for global teams **Business Rules**: Configure company holidays and working hours **Learning System**: Enable AI learning from user corrections **Integration Depth**: Connect to your existing calendar and CRM systems **Confidence Thresholds**: Set minimum confidence levels for auto-scheduling **Ambiguity Handling**: Route unclear dates to human review or clarification requests Sample Input/Output Input Examples: "early next July" "2 weeks after Thanksgiving" "next Wednesday evening" "Q3 2025" "mañana por la mañana" (Spanish) "first thing Monday" Rich Output:

```json
{
  "parsed": [{
    "start": "2025-07-01T00:00:00Z",
    "end": "2025-07-10T23:59:59Z",
    "timezone": "America/New_York"
  }],
  "confidence": 0.95,
  "method": "custom_rules",
  "business_insights": [{
    "type": "business_warning",
    "message": "Selected date range includes July 4th holiday"
  }],
  "predictions": [{
    "type": "time_preference",
    "suggestion": "You usually schedule meetings at 10 AM"
  }],
  "ambiguities": [],
  "alternatives": [{
    "interpretation": "Early July 2026",
    "confidence": 0.15
  }],
  "performance": {
    "cache_hit": true,
    "response_time": "0.8ms"
  }
}
```

Why This Workflow is Unique **World-Class Accuracy**: 96%+ success rate on complex expressions **AI Learning**: Improves over time with user feedback **Global Ready**: Multi-language and timezone intelligence **Business Smart**: Enterprise rules and holiday awareness **Performance Optimized**: Sub-millisecond cached responses **Context Aware**: Provides confidence scores and alternatives for AI decision-making Transform your scheduling workflows from rigid form inputs to natural, conversational date requests that your users will love!
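A practical way to use the confidence and ambiguity fields shown above is a small gate before auto-scheduling. The sketch below is an assumption about how you might wire that decision, not part of the parser itself; the 0.8 threshold is arbitrary.

```javascript
// Hedged sketch of a confidence gate: auto-schedule only when confidence is high
// and no ambiguities were flagged, otherwise route to human review.
function routeParsedDate(parserOutput, threshold = 0.8) {
  const { confidence = 0, ambiguities = [], parsed = [] } = parserOutput;
  if (confidence >= threshold && ambiguities.length === 0 && parsed.length > 0) {
    return { action: "auto_schedule", slot: parsed[0] };
  }
  return { action: "human_review", reason: ambiguities[0] || "low confidence" };
}

// Example using the sample output shown above:
console.log(routeParsedDate({ confidence: 0.95, ambiguities: [], parsed: [{ start: "2025-07-01T00:00:00Z" }] }));
```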
by Hostinger
This n8n workflow template is designed to help system administrators and DevOps professionals monitor key resource usage metrics — CPU, RAM, and Disk — on a VPS (Virtual Private Server). The workflow automatically checks these resources every 15 minutes and sends an email alert if any resource usage exceeds the 80% threshold. This proactive monitoring helps maintain optimal server performance and prevents resource-related downtimes. Who This Workflow Is For • System Administrators managing Linux-based servers who need to ensure their systems are running smoothly without manual monitoring. • DevOps Professionals who manage multiple environments and need automated tools to alert them to potential issues before they affect operations. • IT Support Teams who require an easy way to keep tabs on server health across an organization’s infrastructure. How It Works Schedule Trigger: The workflow is triggered every 15 minutes by a Cron node. Resource Checks: Separate SSH Command nodes are configured to execute specific commands that check the current usage of RAM, Disk, and CPU. Data Aggregation: The results from each check are merged using a Merge node, which combines the data into a single payload for analysis. Threshold Analysis: A Function node evaluates whether any resource’s usage exceeds the predefined 80% threshold. Alerts: If any metric exceeds the threshold, an email alert is sent through an Email node, ensuring that administrators can react promptly to potential issues. Setup Steps Configure SSH Nodes: Update each SSH node with the appropriate credentials and target server details where the resource checks will be performed. Set Thresholds: If different sensitivity levels are required, review and adjust the resource usage thresholds within the Function node. Email Configuration: Enter the correct email addresses in the Email node for where alerts should be sent. Ensure that your email-sending credentials and server details are correctly configured.
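To illustrate the threshold analysis step, the Function node's check could look roughly like the sketch below. The cpu/ram/disk keys are assumptions about how you name the merged SSH results — map them to whatever your commands actually return.

```javascript
// Hedged sketch of the 80% threshold check on the merged metrics.
function findBreaches(metrics, threshold = 80) {
  return Object.entries(metrics)
    .filter(([, value]) => value > threshold)
    .map(([name, value]) => `${name.toUpperCase()} usage at ${value}% (threshold ${threshold}%)`);
}

// Example usage with placeholder values:
const merged = { cpu: 42, ram: 91, disk: 85 };
const breaches = findBreaches(merged);
if (breaches.length > 0) {
  console.log("ALERT:\n" + breaches.join("\n")); // this text could feed the email body
}
```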
by Oneclick AI Squad
AI-Powered Email Draft Automation Workflow In this guide, we’ll walk you through setting up an AI-driven workflow that automatically processes incoming emails using a custom AI model (e.g., Llama), prepares email content, and saves it as a Gmail draft. Ready to automate your email drafting process? Let’s dive in! What’s the Goal? Automatically detect and process new emails via IMAP. Use a custom AI model to analyze and generate email content. Prepare structured and relevant email responses. Save the generated content as a Gmail draft for review or sending. Enable 24/7 email automation with seamless integration. By the end, you’ll have a self-running email assistant that drafts responses effortlessly. Why Does It Matter? Manual email drafting is time-consuming and prone to delays. Here’s why this workflow is a game changer: Zero Human Error:** AI ensures consistent and accurate drafts. Time-Saving Automation:** Instantly process and draft emails, boosting efficiency. 24/7 Availability:** Handle emails anytime without manual intervention. Focus on Strategy:** Free your team from repetitive drafting tasks. Think of it as your tireless email drafting assistant that never misses a beat. How It Works Here’s the step-by-step magic behind the automation: Step 1: Trigger the Workflow Detect new emails using IMAP via the Check New Email (IMAP) node. Capture incoming email content for processing. Step 2: Process Email with AI Send the email text to a custom AI model (e.g., Llama) for analysis. Use the Custom AI Model node to generate a context-aware response or draft content. Step 3: Prepare Email Content Format the AI-generated content into a polished email structure using the Prepare Email Content node. Ensure the content is ready for drafting with proper salutations and structure. Step 4: Save as Gmail Draft Route the prepared email content to the Save as Gmail Draft node. Save the draft in Gmail for review or manual sending. Step 5: Log & Optimize Log all processed emails and drafts in a database (e.g., Airtable, Google Sheets). Continuously improve the AI model based on feedback or new email patterns. How to Use the Workflow? Importing a workflow in n8n is a straightforward process that allows you to use pre-built or shared workflows to save time. Below is a step-by-step guide to importing the Smart Email Draft Generator workflow in n8n, based on the official documentation and community resources. Steps to Import a Workflow in n8n 1. Obtain the Workflow JSON Source the Workflow:** Workflows are typically shared as JSON files or code snippets. You might receive them from: The n8n community (e.g., n8n.io workflows page). A colleague or tutorial (e.g., a .json file or copied JSON code). Exported from another n8n instance. Format:** Ensure you have the workflow in JSON format, either as a file (e.g., workflow.json) or as text copied to your clipboard. 2. Access the n8n Workflow Editor Log in to n8n:** Open your n8n instance (via n8n Cloud or your self-hosted instance). Navigate to the Workflows tab in the n8n dashboard. Open a New Workflow:** Click Add Workflow to create a blank workflow, or open an existing workflow if you want to merge the imported workflow. 3. Import the Workflow Option 1: Import via JSON Code (Clipboard): In the n8n editor, click the three dots (⋯) in the top-right corner to open the menu. Select Import from Clipboard. Paste the JSON code of the workflow into the provided text box. Click Import to load the workflow into the editor. 
Option 2: Import via JSON File: In the n8n editor, click the three dots (⋯) in the top-right corner. Select Import from File. Choose the .json file from your computer. Click Open to import the workflow. Setup Notes: IMAP Credentials:** Configure IMAP settings in the Check New Email (IMAP) node with your email account credentials (e.g., Gmail IMAP settings). Custom AI Model:** Set up the Custom AI Model node with your AI model credentials (e.g., Llama API key or endpoint). Gmail Integration:** Authorize the Save as Gmail Draft node with Gmail API credentials to save drafts. Content Customization:** Adjust the Prepare Email Content node to tailor the email structure or tone as needed.
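As an illustration of the Prepare Email Content step, a helper like the one below could wrap the AI-generated text with a salutation and signature before the Gmail draft is created. The field names (fromName, fromAddress, subject) are placeholders, not the exact output of the IMAP node.

```javascript
// Hedged sketch: shape the AI reply into a draft-ready structure.
function prepareDraft(originalEmail, aiBody) {
  const senderName = (originalEmail.fromName || "there").split(" ")[0];
  return {
    to: originalEmail.fromAddress,
    subject: `Re: ${originalEmail.subject}`,
    text: `Hi ${senderName},\n\n${aiBody.trim()}\n\nBest regards,\nYour Team`,
  };
}

// Example usage with placeholder data:
console.log(prepareDraft(
  { fromName: "Sam Lee", fromAddress: "sam@example.com", subject: "Pricing question" },
  "Thanks for reaching out — here is an overview of our pricing tiers..."
));
```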
by Khairul Muhtadin
❓ What Problem Does It Solve? Manual exporting or copying of leads and newsletter signups from web forms to spreadsheets is time-consuming, error-prone, and delays follow-ups or marketing activities. Traditional workflows can lose data due to mistakes or lack of automation. The Fluentform Export workflow automates the capture and organization of form submissions and newsletter signups into Google Sheets 💡 Why Use this workflow? Save Time:** Automate tedious manual data entry for form leads and newsletter signups Avoid Data Loss:** Ensure all submissions are reliably logged with real-time updates Organized Data:** Separate sheets for newsletter and contact form data maintain clarity Easy Integration:** Works seamlessly with Fluentform submissions and Google Sheets Flexible & Scalable:** Quickly adapt to changes in form structure or spreadsheet columns ⚡ Who Is This For? Marketers & Growth Teams:** Automatically gather leads and newsletter contacts to fuel campaigns Small to Medium Businesses:** Reduce overhead from manual data management and errors Customer Support Teams:** Keep track of form submissions in a centralized, accessible place Website Admins:** Simplify data workflow from Fluentform plugins without coding 🔧 What This Workflow Does ⏱ Trigger:** Listens for incoming POST requests from Fluentform via webhook 📎 Step 2:** Evaluates if the submission is a newsletter signup or a form based on a specific token 🔄 Step 3 (Newsletter Path):** Maps email from newsletter submissions and appends/updates Google Sheets "News Letter" tab 🔄 Step 3 (Form Path):** Extracts full name, email, phone, subject, and message fields and appends/updates the Google Sheets "form" tab 💌 Step 4:** Sends a JSON success response back to Fluentform confirming receipt 🔐 Setup Instructions Import the provided .json workflow file into your n8n instance Set up credentials: Google Sheets OAuth2 credential with access to your target spreadsheets Customize workflow elements: Update Fluentform webhook URL in your Fluentform settings to the n8n webhook URL generated Adjust field names or spreadsheet columns if your form structure changes Update spreadsheet IDs and sheet names used in the Google Sheets nodes to match your own Sheets Test workflow thoroughly with actual Fluentform submissions to verify data flows correctly 🧩 Pre-Requirements Running n8n instance (Cloud or self-hosted) Google account with access to Google Sheets and OAuth credentials Fluentform installed on your website with ability to set webhook URL Target Google Sheets prepared with tabs named "News Letter" and "form" with expected columns 🧠 Nodes Used Webhook (POST - Retrieve Leads) If (Form or newsletter?) Set (newsletter and form data preparation) Google Sheets (Append/update for newsletter and form sheets) Respond to Webhook 📞 Support Made by: khaisa Studio Tag: automation, Google Sheets, Fluentform, Leads Category: Marketing Need a custom? Contact Me
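To make the branch-and-map logic concrete, here is an illustrative sketch of how the If and Set steps decide between the "News Letter" and "form" sheets and shape the row. The form_token field and the form field names are assumptions — align them with your actual Fluentform payload and sheet columns.

```javascript
// Hedged sketch of the newsletter-vs-form branch and field mapping.
function mapSubmission(body) {
  if (body.form_token === "newsletter") {
    return { sheet: "News Letter", row: { email: body.email } };
  }
  return {
    sheet: "form",
    row: {
      fullName: body.full_name,
      email: body.email,
      phone: body.phone,
      subject: body.subject,
      message: body.message,
    },
  };
}

// Example usage:
console.log(mapSubmission({ form_token: "newsletter", email: "reader@example.com" }));
```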