by Gwa Shark
**Who is it for?** Gamers who are tired of poor, laggy in-game performance and want to find a well-performing server near them.

**Setup?** None!

**Modes available**
- Auto - Optimized (recommended and default)
- Ping - Finds the server with the lowest ping (see the sketch below)
- Latency - Lowest ping and highest FPS
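A minimal sketch of how a lowest-ping mode can pick a server: time a TCP connection to each candidate host and choose the fastest. The host list and port below are placeholders, not the template's actual server list.

```typescript
import { Socket } from "node:net";

// Hypothetical candidate servers; replace with the region list you actually use.
const servers = [
  { name: "us-east", host: "example-us-east.example.com", port: 443 },
  { name: "eu-west", host: "example-eu-west.example.com", port: 443 },
];

// Time a TCP connection as a rough ping estimate; unreachable hosts score Infinity.
function measurePing(host: string, port: number, timeoutMs = 2000): Promise<number> {
  return new Promise((resolve) => {
    const start = Date.now();
    const socket = new Socket();
    socket.setTimeout(timeoutMs);
    socket.once("connect", () => { socket.destroy(); resolve(Date.now() - start); });
    socket.once("error", () => { socket.destroy(); resolve(Number.POSITIVE_INFINITY); });
    socket.once("timeout", () => { socket.destroy(); resolve(Number.POSITIVE_INFINITY); });
    socket.connect(port, host);
  });
}

async function pickLowestPing() {
  const results = await Promise.all(
    servers.map(async (s) => ({ ...s, ping: await measurePing(s.host, s.port) }))
  );
  results.sort((a, b) => a.ping - b.ping);
  console.log("Best server:", results[0]);
}

pickLowestPing();
```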
by Rodrigo
**How it works**
This workflow automatically responds to incoming emails identified as potential leads using AI-generated text. It connects to your email inbox via IMAP, classifies incoming messages with an AI model, filters out non-leads, and sends a personalized reply to relevant messages.

**Steps**
1. Email Trigger (IMAP): Watches your inbox for new emails in real time.
2. Is Lead? (Message Model): Uses AI to determine whether the sender is a lead.
3. Filter: Passes only lead emails to the next step (see the sketch below).
4. Write Customized Reply (Message Model): Generates a personalized response using AI.
5. Get Message: Retrieves original email details to ensure correct threading.
6. Reply to Message: Sends the AI-generated reply to the sender.

**Setup Instructions**
1. Connect your IMAP email credentials to the first node and set the folder to watch (e.g., INBOX).
2. In the "Filter leads" node, adjust the AI prompt to match your lead qualification criteria.
3. In the "Reply with customized message" node, edit the AI prompt to reflect your product, service, or business tone.
4. Connect your Gmail (or other email provider) credentials in the Get Message and Reply to Message nodes.
5. Test with a few sample emails before activating.

**Requirements**
- IMAP-enabled email account (for receiving messages)
- Gmail API access (or adapt the workflow to your email provider)
- OpenAI or other AI model credentials for message analysis and reply generation

This template is ready to use, with all steps documented inside sticky notes for easy customization.
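To picture what the Filter step evaluates, here is a minimal sketch. It assumes the "Is Lead?" model returns a short text verdict on each item; the `verdict` field name and accepted values are illustrative, not the template's actual output.

```typescript
// Hypothetical classifier output attached to each incoming email item.
type ClassifiedEmail = {
  from: string;
  subject: string;
  verdict: string; // e.g. "LEAD" or "NOT_LEAD" from the "Is Lead?" model
};

const incoming: ClassifiedEmail[] = [
  { from: "buyer@example.com", subject: "Pricing question", verdict: "LEAD" },
  { from: "newsletter@example.com", subject: "Weekly digest", verdict: "NOT_LEAD" },
];

// Keep only items the model marked as leads, tolerating casing and whitespace noise.
const leads = incoming.filter((mail) => mail.verdict.trim().toUpperCase() === "LEAD");

console.log(leads); // only buyer@example.com continues to the reply step
```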
by Yang
**What this workflow does**
This workflow automatically scrapes product information from any website URL entered into a Google Sheet and stores the extracted product details into another sheet. It uses Dumpling AI to extract product data such as name, price, description, and reviews.

**Who is this for**
This is ideal for:
- Lead generation specialists capturing product info from prospect websites
- eCommerce researchers collecting data on competitor product listings
- Sales teams building enriched product databases from lead URLs
- Anyone who needs to automate product scraping from multiple websites

**Requirements**
- A Google Sheet with a column labeled Website where URLs will be added
- A second sheet (e.g., product details) where extracted data will be saved
- **Dumpling AI** API access to perform the extraction
- Connected Google Sheets credentials in n8n

**How to set up**
1. Replace the Google Sheet and tab IDs in the workflow with your own.
2. Make sure your source sheet includes a Website column.
3. Connect your Dumpling AI and Google Sheets credentials.
4. Make sure the output sheet has the following headers: productName, price, productDescription (the workflow supports review, but it's optional).
5. Activate the workflow to start processing new rows.

**How it works (Workflow Steps)**
1. Watch New Website URL in Google Sheets: Triggers when a new row is added with a website URL.
2. Extract Product Info with Dumpling AI: Sends the URL to Dumpling AI's extract endpoint using a defined schema for product details.
3. Split Extracted Products: Separates multiple products into individual items if the page contains more than one (see the sketch below).
4. Append Product Info to Google Sheets: Adds the structured results to the specified product details sheet.

**Customization Ideas**
- Add a column to store the original source URL alongside each product
- Use OpenAI to generate short SEO summaries for each product
- Add filters to ignore pages without valid product details
- Send Slack or email notifications when new products are added to the sheet
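The "Split Extracted Products" step can be pictured as the sketch below: one extraction response containing several products is fanned out into one row per product. The response shape and field names are assumptions chosen to match the sheet headers above, not Dumpling AI's documented schema.

```typescript
// Assumed shape of a single extraction response for one website.
type ExtractionResponse = {
  sourceUrl: string;
  products: Array<{
    productName: string;
    price: string;
    productDescription: string;
    review?: string; // optional, as noted above
  }>;
};

const response: ExtractionResponse = {
  sourceUrl: "https://shop.example.com",
  products: [
    { productName: "Desk Lamp", price: "$29", productDescription: "LED lamp", review: "4.5/5" },
    { productName: "Desk Mat", price: "$15", productDescription: "Felt mat" },
  ],
};

// Fan out: one sheet row per product, carrying the source URL along.
const rows = response.products.map((p) => ({ website: response.sourceUrl, ...p }));

console.log(rows); // each element becomes one appended row in the product details sheet
```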
by Ziad Adel
**What this workflow does**
This workflow sends a daily Slack report with the current number of subscribers in your Mailchimp list. It's a simple way to keep your marketing or growth team informed without logging into Mailchimp.

**How it works**
1. Cron Trigger starts the workflow once per day (default: 09:00).
2. Mailchimp node retrieves the total number of subscribers for a specific list.
3. Slack node posts a formatted message with the subscriber count into your chosen Slack channel (see the sketch below).

**Pre-conditions / Requirements**
- A Mailchimp account with API access enabled.
- At least one Mailchimp audience list created (you'll need the List ID).
- A Slack workspace with permission to post to your chosen channel.
- n8n connected to both Mailchimp and Slack via credentials.

**Setup**
- Cron Trigger: Default is set to 09:00 AM daily. Adjust the time or frequency as needed.
- Mailchimp: Get Subscribers: Connect your Mailchimp account in n8n credentials. Replace {{MAILCHIMP_LIST_ID}} with the List ID of the audience you want to monitor. To find the List ID, log into Mailchimp → Audience → All contacts → Settings → Audience name and defaults.
- Slack: Send Summary: Connect your Slack account in n8n credentials. Replace {{SLACK_CHANNEL}} with the name of the channel where the summary should appear (e.g., #marketing). The message template can be customized, e.g., to include emojis or additional Mailchimp stats.

**Customization Options**
- **Multiple lists:** Duplicate the Mailchimp node for different audience lists and send combined stats.
- **Formatting:** Add more details like new subscribers in the last 24h by comparing with previous runs (using Google Sheets or a database).
- **Notifications:** Instead of Slack, send the update to email or Microsoft Teams by swapping the output node.

**Benefits**
- **Automation:** Removes the need for manual Mailchimp checks.
- **Visibility:** Keeps the whole team updated on subscriber growth in real time.
- **Motivation:** Celebrate growth milestones directly in team channels.

**Use Cases**
- Daily subscriber growth tracking for newsletters.
- Sharing metrics with leadership without giving Mailchimp access.
- Monitoring the effectiveness of campaigns in near real time.
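A small sketch of how the Slack message might be composed from the Mailchimp response. The `stats.member_count` field name is an assumption about what the Mailchimp node returns; adjust it to match the output you see in a test run.

```typescript
// Assumed (simplified) shape of the Mailchimp list response.
type MailchimpList = {
  name: string;
  stats: { member_count: number };
};

// Build the payload the Slack node would post to the configured channel.
function buildSlackMessage(list: MailchimpList, channel: string) {
  const today = new Date().toISOString().slice(0, 10);
  return {
    channel, // e.g. "#marketing"
    text: `Daily Mailchimp report (${today}): "${list.name}" has ${list.stats.member_count} subscribers.`,
  };
}

console.log(
  buildSlackMessage({ name: "Newsletter", stats: { member_count: 1234 } }, "#marketing")
);
```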
by Shahrear
Process Physician Orders into Google Sheets with VLM Run AI Extraction

**What this workflow does**
- Monitors Google Drive for new physician order files in a target folder
- Downloads the file automatically inside n8n for processing
- Sends the file to VLM Run for AI transcription and structured data extraction
- Parses healthcare-specific details from the healthcare.physician-order domain as JSON
- Appends normalized attributes to a Google Sheet as a new row (see the sketch below)

**Setup**
Prerequisites: Google account, VLM Run API credentials, Google Sheets access, n8n. Install the verified VLM Run node from the n8n node list, then click Install. Once installed, you can integrate it directly in your workflow.

Quick Setup:
1. Create the Drive folder you want to watch and copy its Folder ID.
2. Create a Google Sheet with headers such as: timestamp, file_name, file_id, mime_type, size_bytes, uploader_email, patient_name, patient_dob, patient_gender, patient_address, patient_phone_no, physician_name, physician_phone_no, physician_email, referring_clinic, diagnosis, exam_date, form_signed_in, ...other physician order fields as needed.
3. Configure Google Drive OAuth2 for the trigger and download nodes.
4. Add your VLM Run API credentials from the VLM Run Dashboard to the VLM Run node.
5. Configure Google Sheets OAuth2 and set the Spreadsheet ID + target tab.
6. Upload a sample physician order file to the Drive folder to test, then activate.

**Perfect for**
- Converting physician order documents into structured, machine-readable text
- Automating extraction of patient, physician, and clinical details with VLM Run
- Creating a centralized archive of orders for compliance, auditing, or reporting
- Reducing manual data entry and ensuring consistent formatting

**Key Benefits**
- **End-to-end automation** from Drive upload to structured Google Sheets row
- **AI-powered accuracy** using VLM Run's healthcare-specific extraction models
- **Standardized attribute mapping** for patient and physician records
- **Instantly searchable archive** directly in Google Sheets
- **Hands-free processing** once the workflow is activated

**How to customize**
Extend by adding:
- Attribute-specific parsing (e.g., ICD/CPT diagnosis codes, insurance details)
- Automatic classification of orders by specialty, urgency, or exam type
- Slack, Teams, or email notifications when new physician orders are logged
- Keyword tagging for fast filtering and downstream workflows
- Error-handling rules that move failed conversions into a review folder or error sheet
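To make the row-mapping step concrete, here is a sketch of flattening a nested extraction result into one row whose keys match the headers listed above. The nested structure and sample values are assumptions for illustration; map whatever fields the VLM Run node actually returns.

```typescript
// Assumed nested extraction result for one physician order document (illustrative only).
type PhysicianOrder = {
  patient: { name: string; dob: string };
  physician: { name: string; email: string };
  diagnosis: string;
  exam_date: string;
};

// Flatten the nested JSON into one object whose keys mirror the sheet headers.
function toSheetRow(order: PhysicianOrder, file: { name: string; id: string }) {
  return {
    timestamp: new Date().toISOString(),
    file_name: file.name,
    file_id: file.id,
    patient_name: order.patient.name,
    patient_dob: order.patient.dob,
    physician_name: order.physician.name,
    physician_email: order.physician.email,
    diagnosis: order.diagnosis,
    exam_date: order.exam_date,
    // ...remaining headers follow the same pattern
  };
}

console.log(
  toSheetRow(
    {
      patient: { name: "Jane Roe", dob: "1980-02-14" },
      physician: { name: "Dr. Smith", email: "dr.smith@clinic.example" },
      diagnosis: "Example diagnosis",
      exam_date: "2024-06-01",
    },
    { name: "order-123.pdf", id: "drive-file-id" }
  )
);
```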
by Amir Safavi-Naini
LLM Cost Monitor & Usage Tracker for n8n

**What This Workflow Does**
This workflow provides comprehensive monitoring and cost tracking for all LLM/AI agent usage across your n8n workflows. It extracts detailed token usage data from any workflow execution and calculates precise costs based on current model pricing.

**The Problem It Solves**
When running LLM nodes in n8n workflows, the token usage and intermediate data are not directly accessible within the same workflow. This monitoring workflow bridges that gap by:
- Retrieving execution data using the execution ID
- Extracting all LLM usage from any nested structure
- Calculating costs with customizable pricing (see the sketch below)
- Providing detailed analytics per node and model

WARNING: it works only after the monitored workflow has fully executed (i.e., you can't get this data before all tasks in that workflow have completed).

**Setup Instructions**
Prerequisites:
- Experience required: basic familiarity with n8n LLM nodes and AI agents
- Agent configuration: in your monitored workflows, go to agent settings and enable "Return Intermediate Steps"
- To retrieve execution data, you need to set up the n8n API in your instance (also available on the free version)

Installation Steps:
1. Import this monitoring workflow into your n8n instance.
2. Go to Settings >> select n8n API from the left bar >> define an API key. You can then add this as the credential for your "Get an Execution" node.
3. Configure your model name mappings in the "Standardize Names" node.
4. Update model pricing in the "Model Prices" node (prices per 1M tokens).
5. To monitor a workflow: add an "Execute Workflow" node at the end of your target workflow, select this monitoring workflow, turn OFF "Wait For Sub-Workflow Completion" (important), and pass the execution ID as input.

**Customization**
When You See Errors: if the workflow enters the error path, an undefined model was detected. Simply:
1. Add the model name to the standardize_names_dic.
2. Add its pricing to the model_price_dic.
3. Re-run the workflow.

Configurable Elements:
- **Model Name Mapping**: Standardize different model name variations (e.g., "gpt-4-0613" → "gpt-4")
- **Pricing Dictionary**: Set costs per million tokens for input/output
- **Extraction Depth**: Captures tokens from any nesting level automatically

**Output Data**
Per LLM Call:
- **Cost Breakdown**: Prompt, completion, and total costs in USD
- **Token Metrics**: Prompt tokens, completion tokens, total tokens
- **Performance**: Execution time, start time, finish reason
- **Content Preview**: First 100 chars of input/output for debugging
- **Model Parameters**: Temperature, max tokens, timeout, retry count
- **Execution Context**: Workflow name, node name, execution status
- **Flow Tracking**: Previous nodes chain

Summary Statistics:
- Total executions and costs
- Breakdown by model type
- Breakdown by node
- Average cost per call
- Total execution time

**Key Benefits**
- **No External Dependencies**: Everything runs within n8n
- **Universal Compatibility**: Works with any workflow structure
- **Automatic Detection**: Finds LLM usage regardless of nesting
- **Real-time Monitoring**: Track costs as workflows execute
- **Debugging Support**: Preview actual prompts and responses
- **Scalable**: Handles multiple models and complex workflows

**Example Use Cases**
- **Cost Optimization**: Identify expensive nodes and optimize prompts
- **Usage Analytics**: Track token consumption across teams/projects
- **Budget Monitoring**: Set alerts based on cost thresholds
- **Performance Analysis**: Find slow-running LLM calls
- **Debugging**: Review actual inputs/outputs without logs
- **Compliance**: Audit AI usage across your organization

**Quick Start**
1. Import the workflow.
2. Update model prices (if needed).
3. Add monitoring to any workflow with the Execute Workflow node.
4. View detailed cost breakdowns instantly.

Note: Prices are configured per million tokens. Defaults include GPT-4, GPT-3.5, Claude, and other popular models. Add custom models as needed.
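The core cost arithmetic is simple: normalize the reported model name, look up the per-million-token prices, and scale the prompt and completion token counts. A minimal sketch, assuming the dictionary shapes below (the actual standardize_names_dic and model_price_dic nodes may structure them differently, and the prices are placeholders):

```typescript
// Hypothetical name-standardization map (raw model id -> canonical name).
const standardizeNamesDic: Record<string, string> = {
  "gpt-4-0613": "gpt-4",
  "gpt-4o-2024-08-06": "gpt-4o",
};

// Hypothetical prices in USD per 1M tokens: [input, output].
const modelPriceDic: Record<string, [number, number]> = {
  "gpt-4": [30, 60],
  "gpt-4o": [2.5, 10],
};

function llmCallCost(rawModel: string, promptTokens: number, completionTokens: number) {
  const model = standardizeNamesDic[rawModel] ?? rawModel;
  const prices = modelPriceDic[model];
  // Mirrors the workflow's error path: unknown models must be added to both dictionaries.
  if (!prices) throw new Error(`Undefined model: ${model} - add it to both dictionaries`);
  const [inputPrice, outputPrice] = prices;
  const promptCost = (promptTokens / 1_000_000) * inputPrice;
  const completionCost = (completionTokens / 1_000_000) * outputPrice;
  return { model, promptCost, completionCost, totalCost: promptCost + completionCost };
}

console.log(llmCallCost("gpt-4-0613", 1200, 350));
```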
by KlickTipp
**Community Node Disclaimer:** This workflow uses KlickTipp community nodes.

**How It Works**
Automate transactional emails from KlickTipp to Gmail. This workflow receives contact data from a KlickTipp Outbound rule, generates a personalized HTML email, and sends it via Gmail. Key fields (e.g., first name, company, website, phone, or other custom fields) are dynamically mapped into the body. After sending, the workflow saves the email's HTML content and writes back an Email delivery status ("Sent" or "Failed") to the contact in KlickTipp for clear visibility.

**Key Features**
- KlickTipp Outbound Trigger: Starts when a KlickTipp Outbound rule calls the webhook (e.g., after a tag is applied). Accepts a payload with the recipient email and optional custom fields (first name, company, website, phone, etc.). Easy to adapt for confirmations, updates, welcomes, and announcements.
- HTML Email Composer: Builds a clean, brandable HTML template with safe fallbacks (see the sketch below). Supports per-contact personalization via mapped fields.
- Gmail Delivery: Sends via Gmail (OAuth) with From/Reply-To, Subject, and HTML body. Supports CC/BCC and attachments if needed.
- Delivery Status Write-Back: On success, updates a KlickTipp custom field (e.g., Email delivery status = Sent). On error, updates the same field to Failed (error details available in the execution logs).

**Setup Instructions**
1. Install and configure nodes: add/enable the KlickTipp community nodes and authenticate with valid API credentials.
2. Create/authorize a Gmail credential (OAuth) and select it in the Send an email node.
3. Paste/import your HTML into the Generate HTML template node.
4. Activate the workflow.

**Workflow Logic**
1. Trigger from KlickTipp: Outbound sends contact data to the workflow.
2. Generate HTML: Build personalized HTML (and optional plain-text).
3. Send via Gmail: Deliver the message with the Gmail node.
4. On Success: Update the KlickTipp contact → Email delivery status: Sent.
5. On Error: Update the KlickTipp contact → Email delivery status: Failed (see logs for details).

**Benefits**
- **Immediate, personalized communication** without manual steps.
- **Consistent branding** with reusable HTML templates.
- **Clear observability** by writing back delivery status to KlickTipp.
- **Flexible & extensible** for many message types beyond payments.

**Testing and Deployment**
1. Tag a test contact in KlickTipp to trigger the Outbound rule.
2. Verify the Gmail email arrives with correct personalization.
3. Confirm the Email delivery status field updates to Sent (or Failed for negative tests).
4. Review execution logs and adjust field mappings if necessary.

**Notes**
- **Customization:** Swap templates, add CC/BCC, attachments, or a plain-text part for deliverability.
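A minimal sketch of the "safe fallbacks" idea in the HTML composer: every personalization field gets a default so a missing custom field never produces a broken email. The field names and copy below are illustrative; map them to your actual KlickTipp fields and template.

```typescript
// Hypothetical contact payload from the KlickTipp Outbound webhook.
type Contact = { email: string; firstName?: string; company?: string; website?: string };

function renderEmailHtml(contact: Contact): string {
  const firstName = contact.firstName?.trim() || "there"; // fallback if the field is empty
  const company = contact.company?.trim() || "your company";
  return `
    <html>
      <body style="font-family: Arial, sans-serif;">
        <p>Hi ${firstName},</p>
        <p>Thanks for getting in touch on behalf of ${company}. We have received your request
           and will follow up shortly.</p>
        <p>Best regards,<br/>The Team</p>
      </body>
    </html>`;
}

// A contact with no first name still renders a valid, polite email.
console.log(renderEmailHtml({ email: "jane@example.com", company: "Example GmbH" }));
```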
by Biznova
Transform Product Photos into Marketing Images with AI
Made by Biznova | TikTok

**Who's it for**
E-commerce sellers, social media marketers, small business owners, and content creators who need professional product advertising images without expensive photoshoots or graphic designers.

**What it does**
This workflow automatically transforms simple product photos into polished, professional marketing images featuring:
- Professional models showcasing your product
- Aesthetically pleasing, contextual backgrounds
- Professional lighting and composition
- Lifestyle scenes that help customers envision using the product
- Commercial-ready quality suitable for ads and e-commerce

**How it works**
1. Upload your basic product photo via the web form.
2. AI analyzes your product and generates a complete marketing scene.
3. Download your professional marketing image automatically.
4. Use it immediately in ads, social media, or product listings.

**Setup Requirements**
1. OpenRouter Account: Create a free account at openrouter.ai.
2. API Key: Generate your API key from the OpenRouter dashboard.
3. Add Credentials: Configure the OpenRouter API credentials in the "AI Marketing Image Generator" node.
4. Test: Upload a sample product image to test the workflow.

**How to customize**
- **Edit the prompt** in the "AI Marketing Image Generator" node to match your brand style
- **Adjust file formats** in the upload form (currently accepts JPG/PNG)
- **Modify the response message** in the final form node
- **Add your branding** by including brand colors or style preferences in the prompt

**Pro Tips**
- Use high-resolution product images for best results
- Test different prompt variations to find your ideal style
- Save successful prompts for consistent brand imagery
- Batch process multiple products by running the workflow multiple times

**Quick Setup Guide**
Prerequisites:
- OpenRouter account (sign up here)
- API key from the OpenRouter dashboard

Configuration Steps:
1. Click on the "AI Marketing Image Generator" node.
2. Add your OpenRouter API credentials.
3. Save and activate the workflow.
4. Test with a product image.

Customization: To change the image style, edit the prompt in the "AI Marketing Image Generator" node, add specific instructions about colors, mood, or setting, and include brand-specific requirements. Example custom prompt additions (see the sketch below for one way to combine them):
- "Use a minimalist white background"
- "Feature a modern, urban setting"
- "Include warm, natural lighting"
- "Show the product in a luxury lifestyle context"
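One way to keep brand styling consistent is to build the image prompt from a fixed base plus reusable brand additions, as in this sketch. The wording below is an example for illustration, not the prompt shipped in the node.

```typescript
// Reusable brand-style fragments you might append to the base prompt.
const brandAdditions = [
  "Use a minimalist white background",
  "Include warm, natural lighting",
];

// Compose the final prompt sent to the image generator node.
function buildMarketingPrompt(productDescription: string): string {
  const base =
    "Transform this product photo into a professional marketing image with a model, " +
    "a contextual background, and commercial-ready lighting and composition.";
  return [base, `Product: ${productDescription}.`, ...brandAdditions].join(" ");
}

console.log(buildMarketingPrompt("matte black ceramic coffee mug"));
```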
by Nabin Bhandari
WhatsApp Voice Agent with Twilio, VAPI, Google Calendar, Gmail & Supabase

This workflow turns WhatsApp voice messages into an AI assistant using Twilio, VAPI, and modular MCP servers. It handles scheduling, email, and knowledge queries, all by voice.

**How it works**
1. WhatsApp → Twilio → VAPI: A WhatsApp Business number (via a TwiML app) receives a voice message. Twilio streams the audio into VAPI for processing.
2. VAPI → n8n Webhook: VAPI interprets the intent and routes the request to the correct MCP server (see the conceptual sketch below).
3. MCP Servers in n8n:
   - Calendar MCP - create, fetch, update, delete Google Calendar events
   - Gmail MCP - send confirmation or reminder emails
   - Knowledge Base MCP - query a Supabase Vector Store with OpenAI embeddings
4. n8n → VAPI → WhatsApp: n8n executes the task and returns the result via VAPI back to the user.

**How to use**
1. Import this workflow into your n8n instance.
2. Configure a Twilio WhatsApp-enabled number and connect it to a TwiML app.
3. Point the TwiML app to your VAPI project.
4. Add credentials for Google Calendar, Gmail, Supabase, and OpenAI in n8n.
5. Test by sending a WhatsApp voice command like:
   - "Book a meeting tomorrow at 3pm"
   - "Send a confirmation email to the client"
   - "What's included in the AI receptionist package?"

**Customisation ideas**
- Add more MCP servers (e.g., CRM, Notion, Slack).
- Swap Supabase for another vector database.
- Extend Gmail flows with templates or multiple senders.
- Adjust the VAPI assistant's tone and role to fit your brand.

**Requirements**
- Twilio WhatsApp-enabled number + TwiML app (verified in WhatsApp Manager)
- VAPI project (assistant configured)
- n8n instance (Cloud or self-hosted)
- Google Calendar & Gmail credentials
- Supabase project
- OpenAI API key

**Good to know**
- Twilio must have a verified WhatsApp Business number.
- VAPI handles the voice infrastructure and intent routing; n8n only executes actions.
- The design is modular, so it is easy to expand with new MCP servers.
- Works best when tested with short, clear commands.

**Use cases**
- Hands-free scheduling with Google Calendar.
- Voice-triggered email confirmations & reminders.
- Conversational knowledge base access.
- Extendable to CRMs, team chat, or business workflows.

With this setup, you get a scalable, voice-first AI agent on WhatsApp that connects seamlessly to your business systems.
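Conceptually, each voice request is dispatched by intent to one of the three MCP servers. The sketch below is illustrative only: the actual routing happens inside VAPI, and the intent names and handlers are assumptions, but it shows the modular shape of the design.

```typescript
type Intent = "calendar" | "email" | "knowledge";

// Hypothetical handlers standing in for the three MCP servers exposed by n8n.
const handlers: Record<Intent, (query: string) => string> = {
  calendar: (q) => `Calendar MCP: creating event for "${q}"`,
  email: (q) => `Gmail MCP: sending confirmation about "${q}"`,
  knowledge: (q) => `Knowledge MCP: searching the vector store for "${q}"`,
};

// Dispatch an interpreted intent to the matching handler.
function route(intent: Intent, query: string): string {
  return handlers[intent](query);
}

console.log(route("calendar", "Book a meeting tomorrow at 3pm"));
```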
by RJ Nelson
**Who's it for**
This workflow is ideal for content creators, marketers, and business professionals who want to automatically repurpose their LinkedIn content into email newsletters. If you're actively posting on LinkedIn and want to maximize your content's reach by delivering it directly to your email subscribers, this automation saves you hours of manual work each week.

**What it does**
This workflow automatically transforms your latest LinkedIn posts into professionally formatted email newsletters. It monitors your LinkedIn profile, extracts your most recent post, and intelligently converts it into engaging newsletter content with proper HTML formatting. The final newsletter is then delivered directly to your target email inbox, ready to be reviewed and sent to your subscribers.

**How it works**
The workflow operates through four key stages:
1. Content Extraction: Uses an Apify actor node to scrape all posts from your specified LinkedIn account.
2. Latest Post Filter: Automatically identifies and isolates your most recent LinkedIn post from the scraped data (see the sketch below).
3. AI Transformation: Leverages OpenAI to convert the LinkedIn post into newsletter-style content with improved formatting and structure.
4. Email Delivery: Applies email-friendly HTML formatting and sends the polished newsletter to your designated email inbox.

**Requirements**
To use this workflow, you'll need:
- Apify account - for LinkedIn data scraping (free tier available)
- OpenAI API key - for content transformation and formatting
- Gmail API credentials - for email delivery

**Setup instructions**
1. Configure Apify: Add your Apify API credentials and specify the LinkedIn profile URL you want to monitor.
2. Connect OpenAI: Insert your OpenAI API key in the AI transformation node.
3. Set up Gmail: Authenticate your Gmail account and specify the recipient email address.
4. Customize prompts: Adjust the OpenAI prompts to match your newsletter's tone and style.
5. Test the workflow: Run manually first to ensure all connections work properly.

**How to customize**
- Modify the OpenAI prompts to adjust newsletter tone, length, and formatting style.
- Change the filtering logic to select posts based on engagement metrics instead of recency.
- Add additional nodes to schedule automatic runs or post to multiple platforms.
- Integrate with email marketing platforms like Mailchimp or SendGrid for direct subscriber delivery.
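The "Latest Post Filter" boils down to sorting the scraped posts by date and keeping the newest one, as in this sketch. The `postedAt` and `text` field names are assumptions about the Apify actor's output; check a sample run and adjust the mapping.

```typescript
type ScrapedPost = { postedAt: string; text: string; url: string };

const posts: ScrapedPost[] = [
  { postedAt: "2024-05-01T09:00:00Z", text: "Older post", url: "https://www.linkedin.com/posts/example-1" },
  { postedAt: "2024-05-08T14:30:00Z", text: "Newest post", url: "https://www.linkedin.com/posts/example-2" },
];

// Sort newest-first and take the top item for the AI transformation step.
const latest = [...posts].sort(
  (a, b) => Date.parse(b.postedAt) - Date.parse(a.postedAt)
)[0];

console.log(latest?.text); // "Newest post"
```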
by PrideVel
**Overview**
This workflow automatically fetches user data from an API, formats it, and stores it in Google Sheets and CSV files.

**Use Cases**
- Collect user records for analytics or reporting
- Maintain centralized spreadsheets for marketing or CRM
- Export CSV backups for offline analysis or integrations

**Good to Know**
- Uses the BASE_URL environment variable for the API endpoint
- Google Sheets integration requires OAuth2 credentials
- CSV export saves a local copy for backup or external use
- Optional error handler ensures the workflow stops safely if any step fails

**How It Works**
1. Manual Trigger - start the workflow manually or via a schedule
2. HTTP Request - fetch data from the API endpoint
3. Set Node - extract and format fields like name (first + last) and country (see the sketch below)
4. Google Sheets - append processed data to a Google Sheet (Sheet ID: qwertz, Range: A:C)
5. Spreadsheet File - export data to CSV (users_spreadsheet.csv)
6. Error Handler - stops the workflow if any step fails

**How to Use**
1. Trigger the workflow manually or schedule it.
2. Ensure environment variables and credentials are configured.
3. Check the Google Sheet for appended user data.
4. Access the CSV file for backups or external use.

**Requirements**
- API endpoint (BASE_URL)
- Google Sheets OAuth2 credentials
- Permission to write files for CSV export

**Setup Instructions**
1. Set environment variables: add BASE_URL in the workflow environment variables.
2. Configure the Google Sheets node: authenticate with your Google account using OAuth2, then specify the target Sheet ID and range.
3. Check file permissions: ensure the workflow has write access to store users_spreadsheet.csv.
4. Test the API response format: expected JSON fields are first_name, last_name, country, etc. Adjust the Set node to map any additional fields as required.

**Customizing the Workflow**
- **Change API source:** Update the HTTP Request node to any user data API.
- **Add more fields:** Modify the Set node to map additional fields from the API response.
- **Multi-sheet distribution:** Append data to multiple Google Sheets if needed.
- **Automated scheduling:** Trigger the workflow automatically at intervals.
- **Add notifications:** Send email or Slack notifications after data collection.
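The Set-node mapping and CSV export can be sketched as plain code: combine first_name and last_name into name, keep country, and write the rows out as users_spreadsheet.csv. The sample records are placeholders; the field names follow the expected API response described above.

```typescript
import { writeFileSync } from "node:fs";

type ApiUser = { first_name: string; last_name: string; country: string };

// Placeholder records standing in for the HTTP Request node's response.
const apiUsers: ApiUser[] = [
  { first_name: "Ada", last_name: "Lovelace", country: "UK" },
  { first_name: "Alan", last_name: "Turing", country: "UK" },
];

// Mirror the Set node: build "name" from first + last and keep "country".
const rows = apiUsers.map((u) => ({ name: `${u.first_name} ${u.last_name}`, country: u.country }));

// Mirror the Spreadsheet File node: write a simple CSV backup.
const csv = ["name,country", ...rows.map((r) => `${r.name},${r.country}`)].join("\n");
writeFileSync("users_spreadsheet.csv", csv);
```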
by Naveen Choudhary
**Description**
This workflow automatically validates and enriches contact form submissions from JotForm, ensuring you only store high-quality leads with complete business information.

**Who's it for**
Marketing teams, sales professionals, and business owners who collect leads through forms and want to automatically verify email validity and enrich contact data before adding them to their CRM or database.

**What it does**
When someone submits a contact form on JotForm, this workflow:
- Captures the submission data (name, email, phone, message)
- Creates a new record in Google Sheets
- Verifies the email address using Reoon's email verification API
- Saves validation metrics (deliverability, spam trap detection, disposable email check)
- Filters out unsafe or invalid emails
- Enriches validated contacts with professional data from Apollo (LinkedIn URL, job title, company name)
- Updates the Google Sheet with enriched information

**How it works**
1. JotForm Trigger - listens for new form submissions
2. Initial Storage - creates a contact record in Google Sheets with basic form data
3. Email Verification - sends the email to the Reoon API for comprehensive validation
4. Save Verification Results - updates the sheet with email quality metrics
5. Safety Filter - only passes emails marked as "safe" to enrichment (see the sketch below)
6. Contact Enrichment - queries the Apollo API to find professional information
7. Final Update - saves enriched data (LinkedIn, title, company) back to the sheet

**Requirements**
Services you'll need:
- JotForm account (free plan available)
- Reoon Email Verifier API access
- Apollo.io account for contact enrichment
- Google account for Google Sheets access

Setup steps:
1. Copy this Google Sheet template (make your own copy).
2. Create a JotForm with fields: First Name, Last Name, E-mail, Phone, Message.
3. Get your Reoon API key from their dashboard.
4. Get your Apollo.io API key from account settings.
5. Connect your Google Sheets account in n8n.

**How to customize**
- **Change verification level**: Modify the "mode" parameter in the Reoon API call (options: quick, power).
- **Adjust filtering criteria**: Update the IF node to filter by different email quality metrics.
- **Add more enrichment**: Apollo returns additional data fields you can map to your sheet.
- **Notification layer**: Add a Send Email or Slack node after enrichment to notify your team of high-quality leads.
- **CRM integration**: Replace or supplement Google Sheets with HubSpot, Salesforce, or Pipedrive nodes.

This workflow provides a complete lead qualification pipeline that saves time and ensures only high-quality, validated contacts make it into your database with enriched professional information.
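The safety filter amounts to a single condition on the verification result: only contacts whose email was judged "safe" continue to Apollo enrichment. The `status` and `is_spamtrap` field names below are assumptions about the Reoon response, simplified for illustration; compare them with a real verification payload before adjusting the IF node.

```typescript
// Hypothetical, simplified shape of a contact after the Reoon verification step.
type VerifiedContact = {
  email: string;
  status: string;          // e.g. "safe", "disposable", "invalid"
  is_spamtrap?: boolean;
};

const submissions: VerifiedContact[] = [
  { email: "jane.doe@company.com", status: "safe" },
  { email: "temp@mailinator.com", status: "disposable" },
];

// Only "safe", non-spam-trap emails move on to Apollo enrichment.
const toEnrich = submissions.filter((c) => c.status === "safe" && !c.is_spamtrap);

console.log(toEnrich.map((c) => c.email)); // ["jane.doe@company.com"]
```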