by Benjamin Jones (SaaS Alerts)
Collect and Email Authentication IP Addresses from SaaS Alerts (Last 24 Hours)

**Description**

This n8n workflow automates the process of collecting sign-in IP addresses from SaaS Alerts over the past 24 hours and emailing the results using SMTP2Go. Designed for security teams, IT administrators, and compliance officers, it helps monitor user authentication activity, detect unusual sign-ins, and respond to potential security threats in real time. By automating data collection and email alerts, organizations can proactively track login patterns, ensure compliance with security policies, and mitigate risks associated with unauthorized access.

**Use Case**

This workflow is ideal for businesses and IT teams that need to:

- Monitor user authentication activity across SaaS applications.
- Identify login attempts from suspicious IPs.
- Automate security reporting and compliance tracking.
- Receive real-time alerts for unusual sign-in behaviors.

**Pre-Conditions & Requirements**

Before using this workflow, ensure you have:

- A SaaS Alerts account or another system that logs authentication IPs.
- An SMTP2Go account for sending email notifications.
- n8n set up with proper API credentials and database access (if applicable).

**Setup Instructions**

1. **Configure the SaaS Alerts API** – Obtain an API key from the SaaS Alerts platform under the Settings menu.
2. **Set Up SMTP2Go for Email Alerts** – Create an SMTP2Go account if you don't have one, generate an SMTP2Go API key, verify that your sending email address has been configured and verified, and define recipient email addresses for security alerts.
3. **Customize the Workflow** – Modify filtering rules to track specific users, IP ranges, or flagged login attempts. Adjust the email content to include relevant details for your team.
4. **Test & Deploy** – Run the workflow manually to verify data retrieval and email notifications, then schedule it to run daily for automated monitoring.

**Workflow Steps**

1. **Trigger** – Starts manually or on a scheduled interval (e.g., every 24 hours).
2. **Fetch Authentication Logs** – Retrieves sign-in IPs from SaaS Alerts or a custom API.
3. **Filter & Process Data** – Extracts relevant login attempts based on defined criteria.
4. **Format Data for Reporting** – Structures the data for readability in an email alert.
5. **Send Email Notification via SMTP2Go** – Delivers the security report to designated recipients.

**Customization Options**

- **Modify Filtering Rules** – Track specific login behaviors, flagged IPs, or unusual patterns.
- **Change Email Recipients** – Update the recipient list based on security team needs.
- **Integrate with Security Dashboards** – Expand the workflow to log data into a SIEM system or incident response platform.
- **Add Additional Triggers** – Configure alerts for specific login anomalies, such as failed login attempts.

**Keywords**

n8n security automation, authentication monitoring, login IP tracking, SMTP2Go email alerts, SaaS Alerts workflow, IT security automation, login anomaly detection
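A minimal sketch of what the Filter & Process Data and Format Data for Reporting steps might look like inside an n8n Code node. The event field names (`user`, `ip`, `timestamp`) are assumptions, since the actual SaaS Alerts response schema isn't shown here; adjust them to match your API output.

```javascript
// Keep only sign-in events from the last 24 hours, de-duplicate by
// user + IP, and build a plain-text body for the SMTP2Go email.
// Field names (user, ip, timestamp) are assumed -- map them to the
// real SaaS Alerts response fields.
function buildIpReport(events, now = Date.now()) {
  const dayAgo = now - 24 * 60 * 60 * 1000;
  const recent = events.filter(e => new Date(e.timestamp).getTime() >= dayAgo);
  const seen = new Set();
  const rows = [];
  for (const e of recent) {
    const key = `${e.user}|${e.ip}`;
    if (seen.has(key)) continue; // skip duplicate user/IP pairs
    seen.add(key);
    rows.push(`${e.user}\t${e.ip}\t${e.timestamp}`);
  }
  return `Sign-in IPs (last 24h):\nUser\tIP\tTime\n${rows.join('\n')}`;
}
```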
by Nick Saraev
AI Upwork Application Agent with OpenAI & Google Docs

Categories: AI Agents, Freelance Automation, Proposal Generation

This workflow creates an intelligent AI agent that automates Upwork job applications by generating highly personalized proposals, professional Google Doc presentations, and visual workflow diagrams. Built by someone who earned over $500,000 on Upwork, this system demonstrates the exact templates and strategies that achieve superior response rates through perceived customization and value demonstration.

**Benefits**

- **Complete Application Automation** – Transform job descriptions into custom proposals, documents, and diagrams in minutes
- **Proven Templates** – Based on $500K+ in Upwork earnings using exact strategies for high-converting applications
- **Intelligent Personalization** – AI analyzes job requirements and customizes responses with relevant social proof
- **Professional Asset Generation** – Creates Google Doc proposals and Mermaid workflow diagrams for enhanced perceived value
- **Modular Architecture** – Three specialized sub-workflows handle different aspects of proposal generation
- **High Response Rates** – Focuses on perceived customization and value demonstration over generic applications

**How It Works**

*AI Agent Orchestration:*

- Receives Upwork job descriptions through a chat interface
- Maintains conversation context with window buffer memory
- Coordinates three specialized sub-workflows for comprehensive proposal generation
- Automatically integrates generated assets into cohesive application packages

*Application Copy Generation:*

- Uses proven templates based on $500K+ Upwork success
- Follows the structure: "Hi, I do [thing] all the time. So confident I created a demo: [link]"
- Incorporates personal social proof and achievements automatically
- Generates concise, spartan-toned applications that avoid generic AI language

*Google Doc Proposal Creation:*

- Copies a professional proposal template from Google Drive
- Generates structured content including system title, explanation, scope, and timeline
- Uses find-and-replace to populate the template with AI-generated, personalized content
- Creates shareable documents with proper permissions for immediate client access

*Mermaid Diagram Visualization:*

- Analyzes job requirements to create relevant workflow diagrams
- Generates Mermaid.js code for professional flowchart visualization
- Provides a visual representation of proposed solutions
- Enhances perceived value through custom diagram creation

*Smart Template Integration:*

- Automatically replaces placeholder text with generated Google Doc links
- Maintains consistent messaging across all generated assets
- Ensures cohesive presentation of application, proposal, and supporting materials

**Required Setup Configuration**

*Personal Information Setup:* Update the "aboutMe" variable in both Set Variable nodes with your credentials:

- Professional background and specializations
- Notable client achievements with specific revenue numbers
- Social proof elements (community size, subscriber count, etc.)
- Relevant project examples with quantified results

*Google Services Integration:*

- **Google Drive API Setup** – Enable the Google Drive API in Google Cloud Console, create OAuth2 credentials (Client ID and Client Secret), and connect n8n to Google Drive with proper permissions
- **Google Docs Template** – Copy the provided Google Docs proposal template to your Drive, update the template ID in the Google Drive node, and customize the template with your branding and standard language
- **Google Docs API** – Ensure the Google Docs API is enabled in your Google Cloud project, and test document creation and sharing permissions

*OpenAI API Configuration:*

- Set up OpenAI API credentials across all OpenAI nodes
- Configure appropriate models (GPT-4o-mini recommended for speed)
- Set temperature to 0.7 for an optimal personalization balance
- Monitor API usage to control costs

*Template Customization:*

- **Application Template** – Modify the proposal structure in the OpenAI prompts to match your services
- **Google Doc Template** – Update the document template with your standard proposal format
- **Personal Details** – Replace all placeholder information with your actual achievements and social proof

**Business Use Cases**

- **Freelance Professionals** – Automate high-quality Upwork applications across multiple job categories
- **Automation Specialists** – Demonstrate capabilities through the automated proposal generation itself
- **Service Providers** – Scale application volume while maintaining personalization quality
- **Agency Owners** – Offer proposal automation services to freelance clients
- **Consultants** – Streamline business development with automated custom proposals
- **Content Creators** – Generate professional project proposals with visual workflow representations

**Revenue Potential**

This system transforms freelance business development:

- **10x Application Speed** – Generate comprehensive proposals in minutes vs. hours
- **Higher Response Rates** – Perceived customization and value demonstration increase client engagement
- **Scalable Outreach** – Apply to more jobs with maintained quality through automation
- **Professional Positioning** – Visual diagrams and structured proposals demonstrate expertise
- **Competitive Advantage** – Deliver proposals faster than competitors through intelligent automation

Difficulty Level: Advanced
Estimated Build Time: 3-4 hours
Monthly Operating Cost: ~$30 (OpenAI + Google APIs)

**Watch My Complete Live Build**

Want to see me build this entire system from scratch? I walk through every component live, including the AI agent setup, prompt engineering strategies, Google Docs integration, and all the debugging that goes into creating a production-ready freelance automation system.

🎥 See My Live Build Process: "I Built An AI Agent That Automates Upwork ($500K+ Earned)"

This comprehensive tutorial shows the real development process, including advanced prompt engineering, modular workflow design, and the exact business strategies that generated $500K+ in Upwork revenue.
**Set Up Steps**

1. **AI Agent Foundation** – Configure the chat trigger and AI agent node with OpenAI integration, set up window buffer memory for conversation context, and define a system message with clear agent instructions and behavior rules.
2. **Sub-Workflow Creation** – Build three specialized workflows (Application Copy, Google Doc Proposal, Mermaid Code), configure execute-workflow triggers for each sub-workflow, and set up proper data passing between the agent and sub-workflows.
3. **Google Services Configuration** – Create a Google Cloud Console project with the Drive and Docs APIs enabled, set up OAuth2 credentials and connect them to n8n, then copy and customize the proposal template document.
4. **Personalization Setup** – Update all "aboutMe" variables with your specific achievements and social proof, customize prompt templates to match your service offerings and communication style, and test individual sub-workflows with sample job descriptions.
5. **Agent Tool Integration** – Connect the sub-workflows as tools in the main AI agent, configure proper tool descriptions and response property names, and test complete agent functionality with realistic job posting scenarios.
6. **Template Optimization** – Refine proposal templates based on your specific service offerings, adjust AI prompts for optimal personalization and response quality, and test with various job types to ensure consistent quality output.

**Advanced Optimizations**

Scale the system with:

- **Job Scraping Integration** – Automatically discover and apply to relevant Upwork jobs
- **Response Tracking** – Monitor application success rates and optimize templates
- **Multi-Platform Support** – Extend to other freelance platforms (Fiverr, Freelancer, etc.)
- **Client Communication** – Automate follow-up sequences for proposal responses
- **Portfolio Integration** – Automatically include relevant portfolio pieces based on job requirements

**Important Considerations**

- **Template Authenticity** – Customize templates significantly to avoid detection as automated
- **Upwork Compliance** – Ensure applications meet platform guidelines and quality standards
- **Personal Branding** – Maintain a consistent voice and positioning across all generated content
- **Response Management** – Be prepared to handle increased application volume and client responses
- **Quality Control** – Regularly review and refine generated content for accuracy and relevance

**Why This System Works**

The competitive advantage lies in proven strategies:

- **Perceived Customization** – AI generates content that appears manually crafted for each job
- **Value Demonstration** – Visual diagrams and structured proposals show immediate value
- **Speed Advantage** – Deliver comprehensive proposals before competitors finish reading job posts
- **Professional Presentation** – Consistent quality and formatting across all applications
- **Scalable Personalization** – Maintain individual attention at volume through intelligent automation

**Check Out My Channel**

For more advanced automation systems and proven freelance business strategies that generate real revenue, explore my YouTube channel where I share the exact methodologies used to build successful automation agencies and scale to $72K+ monthly revenue.
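The "find-and-replace to populate template" step described above corresponds to the Google Docs API's `documents.batchUpdate` method with `replaceAllText` requests. Here is a sketch of building that request body; the placeholder tokens (`{{systemTitle}}` etc.) are hypothetical and should match whatever markers your own template contains.

```javascript
// Build a Google Docs batchUpdate body that swaps template placeholders
// for AI-generated content. Placeholder names are illustrative -- use the
// markers that actually appear in your proposal template.
function buildReplaceRequests(fields) {
  return {
    requests: Object.entries(fields).map(([placeholder, value]) => ({
      replaceAllText: {
        containsText: { text: `{{${placeholder}}}`, matchCase: true },
        replaceText: value,
      },
    })),
  };
}
```

The resulting object is what an n8n HTTP Request node would POST to `https://docs.googleapis.com/v1/documents/{documentId}:batchUpdate`.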
by Akash Kankariya
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

🎯 Overview

This n8n workflow template automates the process of monitoring Instagram comments and sending predefined responses based on specific comment keywords. It integrates Instagram's Graph API with Google Sheets to manage comment responses and maintains an interaction log for customer relationship management (CRM) purposes.

🔧 Workflow Components

The workflow consists of 9 main nodes organized into two primary sections:

📡 Section 1: Webhook Verification
- ✅ Get Verification (Webhook node)
- 🔄 Respond to Verification Message (Respond to Webhook node)

🤖 Section 2: Auto Comment Response
- 📬 Insta Update (Webhook node)
- ❓ Check if update is of comment? (Switch node)
- 👤 Comment if of other user (If node)
- 📊 Comment List (Google Sheets node)
- 💬 Send Message for Comment (HTTP Request node)
- 📝 Add Interaction in Sheet (CRM) (Google Sheets node)

🛠️ Prerequisites and Setup Requirements

1. 🔵 Meta/Facebook Developer Setup

📱 Create Facebook App

> 📋 Action Items:
> - [ ] Navigate to Facebook Developers
> - [ ] Click "Create App" and select "Business" type
> - [ ] Configure the following products:
>   - ✅ Instagram Graph API
>   - ✅ Facebook Login for Business
>   - ✅ Webhooks

🔐 Required Permissions

Configure the following permissions in your Meta app:

| Permission | Purpose |
|------------|---------|
| instagram_basic | 📖 Read Instagram account profile info and media |
| instagram_manage_comments | 💬 Create, delete, and manage comments |
| instagram_manage_messages | 📤 Send and receive Instagram messages |
| pages_show_list | 📄 Access connected Facebook pages |

🎫 Access Token Generation

> ⚠️ Important Setup:
> - [ ] Use Facebook's Graph API Explorer
> - [ ] Generate a User Access Token with the required permissions
> - [ ] ⚡ Important: Tokens expire periodically and need refreshing

2. 🌐 Webhook Configuration

🔗 Setup Webhook URL

> 📌 Configuration Checklist:
> - [ ] In the Meta App Dashboard, navigate to Products → Webhooks
> - [ ] Subscribe to the Instagram object
> - [ ] Configure the webhook URL: your-n8n-domain/webhook/instagram
> - [ ] Set a verification token (use "test" or create a secure token)
> - [ ] Select webhook fields:
>   - ✅ comments – for comment notifications
>   - ✅ messages – for DM notifications (if needed)

✅ Webhook Verification Process

The workflow handles Meta's webhook verification automatically:
- 📡 Meta sends a GET request with the hub.challenge parameter
- 🔄 The workflow responds with the challenge value to confirm the subscription

3. 📊 Google Sheets Setup

Example: https://docs.google.com/spreadsheets/d/1ONPKJZOpQTSxbasVcCB7oBjbZcCyAm9gZ-UNPoXM21A/edit?usp=sharing

📋 Create Response Management Sheet

Set up a Google Sheets document with the following structure:

📝 Sheet 1 – Comment Responses:

| Column | Description | Example |
|--------|-------------|---------|
| 💬 Comment | Trigger keywords | "auto", "info", "help" |
| 📝 Message | Corresponding response message | "Thanks for your comment! We'll get back to you soon." |

📈 Sheet 2 – Interaction Log:

| Column | Description | Purpose |
|--------|-------------|---------|
| ⏰ Time | Timestamp of interaction | Track when interactions occur |
| 🆔 User Id | Instagram user ID | Identify unique users |
| 👤 Username | Instagram username | Human-readable identification |
| 📝 Note | Additional notes or error messages | Debugging and analytics |

🔧 Built By – akash@codescale.tech
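The verification handshake described above can be sketched in plain JavaScript. This mirrors the logic of the "Respond to Verification Message" node: `hub.mode`, `hub.verify_token`, and `hub.challenge` are the query parameters Meta sends in its GET request, and the token value is whatever you configured in the dashboard.

```javascript
// Echo hub.challenge back when the verify token matches; otherwise reject.
// verifyToken is your own configured token (e.g. "test" from the checklist).
function handleVerification(query, verifyToken) {
  if (
    query['hub.mode'] === 'subscribe' &&
    query['hub.verify_token'] === verifyToken
  ) {
    return { status: 200, body: query['hub.challenge'] };
  }
  return { status: 403, body: 'verification failed' };
}
```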
by Yang
Who is this for?

This workflow is perfect for eCommerce teams, market researchers, and product analysts who want to track or extract product information from websites that restrict scraping tools. It's also useful for virtual assistants handling product comparison tasks.

What problem is this workflow solving?

Many eCommerce and retail sites use dynamic content or anti-bot protections that make traditional scraping methods unreliable. This workflow bypasses those issues by taking a screenshot of the full page, using OCR to extract visible text, and summarizing product information with GPT-4o, all fully automated.

What this workflow does

This workflow monitors a Google Sheet for new URLs. Once a new link is added, it performs the following steps:

1. Trigger on New URL in Sheet – Watches for new rows added to a Google Sheet.
2. Screenshot URL via Dumpling AI – Sends the URL to Dumpling AI's screenshot endpoint to capture a full-page image of the product webpage.
3. Save Screenshot to Drive Folder – Uploads the screenshot to a specific Google Drive folder for reference or logging.
4. Extract Text from Screenshot with Dumpling AI – Uses Dumpling AI's image-to-text endpoint to pull all visible content from the screenshot.
5. Extract Product Info from Screenshot Text with GPT-4o – Sends the extracted raw text to GPT-4o, prompting it to identify structured product information such as product name, price, ratings, deals, and purchase options.
6. Split Each Product Entry – Splits the GPT response (an array of product objects) so each product becomes an individual item for saving.
7. Save Products info to Google Sheet – Appends each product's structured details to a separate sheet in the same spreadsheet.

Setup

Google Sheet
- Create a Google Sheet with at least two sheets:
  - Sheet1 should contain a header row with a column labeled URL.
  - Sheet2 should contain headers: Product Name, price, purchased, ratings, deal, buyingOptions.
- Connect your Google account in both the trigger and the final write-back node.
Dumpling AI
- Sign up at Dumpling AI.
- Create an API key and use it for both HTTP modules: Screenshot URL via Dumpling AI, and Extract Text from Screenshot with Dumpling AI.
- The screenshot endpoint used is https://app.dumplingai.com/api/v1/screenshot.

Google Drive
- Create a folder for storing screenshots.
- In the Save Screenshot to Drive Folder node, select the correct folder or provide the folder ID.
- Make sure permissions allow uploading from n8n.

OpenAI
- Provide an API key for GPT-4o in the Extract Product Info from Screenshot Text with GPT-4o node.
- The prompt is structured to return structured product listings in JSON format.

Split & Save
- Split Each Product Entry takes the array of product objects from GPT and makes each one a separate execution.
- Save Products info to Google Sheet writes structured fields into Sheet2 under: Product Name, price, purchased, ratings, deal, buyingOptions.

How to customize this workflow
- Adjust the GPT prompt to return different product fields (e.g., shipping info, product categories).
- Use a filter node to limit which types of products get written to the final sheet.
- Add sentiment analysis to analyze review content if available.
- Replace Google Drive with Dropbox or another file storage app.

Notes

Make sure you monitor your API usage on both Dumpling AI and OpenAI to avoid rate limits. This setup is great for snapshot-based extraction where scraping is blocked or unreliable.
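For orientation, the call to the screenshot endpoint can be sketched as a request description like the one below. The endpoint URL comes from the workflow itself; the body and header shapes are assumptions, so check Dumpling AI's API docs for the exact parameter names.

```javascript
// Sketch of the HTTP Request node that captures a full-page screenshot.
// The { url } body field and Bearer header are assumed -- verify against
// Dumpling AI's API reference.
function buildScreenshotRequest(pageUrl, apiKey) {
  return {
    method: 'POST',
    url: 'https://app.dumplingai.com/api/v1/screenshot',
    headers: {
      Authorization: `Bearer ${apiKey}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ url: pageUrl }),
  };
}
```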
by David Olusola
n8n Set Node Tutorial – Complete Guide

🎯 How It Works

This tutorial workflow teaches you everything about n8n's Set node through hands-on examples. The Set node is one of the most powerful tools in n8n: it allows you to create, modify, and transform data as it flows through your workflow.

What makes this tutorial special:

- **Progressive Learning** – Starts simple, builds to complex concepts
- **Interactive Examples** – Real working nodes you can modify and test
- **Visual Guidance** – Sticky notes explain every concept
- **Branching Logic** – Shows how Set nodes work in different workflow paths
- **Real Data** – Uses practical examples you'll encounter in automation

The workflow demonstrates 6 core concepts:

1. Basic data types (strings, numbers, booleans)
2. Expression syntax with {{ }} and $json references
3. Complex data structures (objects and arrays)
4. "Keep Only Set" option for clean outputs
5. Conditional data setting with branching logic
6. Data transformation and aggregation techniques

📋 Setup Steps

Step 1: Import the Workflow

1. Copy the JSON from the code artifact above
2. Open your n8n instance in your browser
3. Navigate to the Workflows section
4. Click "Import from JSON" or the import button (usually a "+" or import icon)
5. Paste the JSON into the import dialog
6. Click "Import" to load the workflow
7. Save the workflow (Ctrl+S or click the Save button)

Step 2: Choose Your Starting Point

Option A: Default Tutorial Mode (recommended for beginners)
- The workflow is ready to run as-is
- Uses a simple "Welcome" message as starting data
- Click "Execute Workflow" to begin

Option B: Rich Test Data Mode (recommended for experimentation)
- Locate the nodes: find "Start (Manual Trigger)" and "0. Test Data Input"
- Disconnect the default: click the connection line between "Start (Manual Trigger)" → "1. Set Basic Values" and delete it
- Connect test data: drag from the "0. Test Data Input" output to the "1. Set Basic Values" input
- Execute: click "Execute Workflow" to run with rich test data

Step 3: Execute and Learn

1. Run the workflow: click the "Execute Workflow" button
2. Check outputs: click on each node to see its output data
3. Read the notes: each sticky note explains what's happening
4. Follow the flow: data flows from left to right, top to bottom

Step 4: Experiment and Modify

Try these experiments:

🔧 Change Basic Values:
- Click on "1. Set Basic Values"
- Modify user_age (try 20 vs 35)
- Change user_name to see how it propagates
- Execute and see the changes flow through

📊 Test Conditional Logic:
- Set user_age to 20 → triggers the "Student Discount" path
- Set user_age to 30 → triggers the "Premium Access" path
- Watch how the workflow branches differently

🎨 Modify Expressions:
- In "2. Set with Expressions", try changing:
  - ={{ $json.score * 2 }} to ={{ $json.score * 3 }}
  - ={{ $json.user_name }} Smith to ={{ $json.user_name }} Johnson

🏗️ Complex Data Structures:
- In "3. Set Complex Data", modify the JSON structure
- Add new properties to the user_profile object
- Try nested expressions

🎓 Learning Path

Beginner Level (Nodes 1-2)
- **Focus** – Understanding basic Set operations
- **Learn** – Data types, static values, simple expressions
- **Time** – 10-15 minutes

Intermediate Level (Nodes 3-4)
- **Focus** – Complex data and output control
- **Learn** – Objects, arrays, "Keep Only Set" option
- **Time** – 15-20 minutes

Advanced Level (Nodes 5-6)
- **Focus** – Conditional logic and data aggregation
- **Learn** – Branching workflows, merging data, complex expressions
- **Time** – 20-25 minutes

🔍 What Each Node Teaches

| Node | Concept | Key Learning |
|------|---------|-------------|
| 1. Set Basic Values | Data Types | String, number, boolean basics |
| 2. Set with Expressions | Dynamic Data | {{ }} syntax, $json references, $now functions |
| 3. Set Complex Data | Advanced Structures | Objects, arrays, nested properties |
| 4. Set Clean Output | Data Management | "Keep Only Set" for clean final outputs |
| 5a/5b. Conditional Sets | Branching Logic | Different data based on conditions |
| 6. Tutorial Summary | Data Aggregation | Combining and summarizing workflow data |

💡 Pro Tips

🚀 Quick Wins:
- Always check node outputs after execution
- Use sticky notes as your learning guide
- Experiment with small changes first
- Copy nodes to try variations

🛠️ Advanced Techniques:
- Use Keep Only Set for API responses
- Combine static and dynamic data in complex objects
- Leverage conditional paths for different user types
- Reference nested object properties with dot notation

🐛 Troubleshooting:
- If expressions don't work, check the {{ }} syntax
- Ensure field names match exactly (case-sensitive)
- Use the expression editor for complex logic
- Check that data types match your expectations

🎯 Next Steps After Tutorial

1. Create your own Set nodes in a new workflow
2. Practice with real data from APIs or databases
3. Build data transformation workflows for your specific use cases
4. Combine Set nodes with other n8n nodes like HTTP, Webhook, etc.
5. Explore advanced expressions using JavaScript functions

Congratulations! You now have the foundation to use Set nodes effectively in any n8n workflow. The Set node is truly the "Swiss Army knife" of n8n automation! 🛠️
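To see concretely what a Set node does to an item, here is a simplified plain-JavaScript model of its behavior (not n8n's actual implementation), covering expressions like {{ $json.score * 2 }} and the "Keep Only Set" option from the tutorial.

```javascript
// Simplified model of a Set node: evaluate new fields against the incoming
// item, then either merge them into the item or ("Keep Only Set") return
// only the new fields. Expressions are modeled as functions of $json.
function setNode(item, fields, keepOnlySet = false) {
  const out = {};
  for (const [name, fn] of Object.entries(fields)) {
    // A field can be a static value or an "expression" (function of the item).
    out[name] = typeof fn === 'function' ? fn(item) : fn;
  }
  return keepOnlySet ? out : { ...item, ...out };
}
```

For example, `setNode({ score: 21 }, { doubled: j => j.score * 2 })` mirrors the node "2. Set with Expressions", while passing `true` as the third argument mirrors node "4. Set Clean Output".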
by assert
Who this template is for

This template is for every engineer who wants to automate their code reviews or just get a second opinion on a merge request.

How it works

This workflow automatically reviews your changes in a GitLab merge request using the power of AI. It triggers whenever you comment "+0" on a GitLab merge request, gets the code changes, analyzes them with GPT, and replies to the merge request discussion.

Set up Steps

1. Set up a webhook for note_events in your GitLab repository (see here on how to do it)
2. Configure ChatGPT credentials
3. Comment "+0" on a merge request to trigger an automatic review by ChatGPT
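The trigger check can be sketched as follows. The payload fields follow GitLab's note-event webhook schema (`object_kind`, `object_attributes.noteable_type`, `object_attributes.note`); the "+0" marker is the convention this template uses.

```javascript
// Decide whether an incoming note_events payload should trigger a review:
// a comment containing "+0" posted on a merge request.
function shouldTriggerReview(payload) {
  return (
    payload.object_kind === 'note' &&
    payload.object_attributes?.noteable_type === 'MergeRequest' &&
    /\+0/.test(payload.object_attributes.note || '')
  );
}
```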
by Sherlockes
What does this template help with?

Save the data of activities recorded and stored in Strava to a Google Sheets document.

How it works

We have a Google Sheets spreadsheet where each row represents a Strava activity with the date, reference, distance, time, and elevation. Periodically, the workflow checks the latest activities in our Strava account to see if any are missing from the spreadsheet and adds them to the list. All fields must be properly formatted according to how they are stored in the Google Sheets spreadsheet.

Set up instructions

- Complete the "Set up credentials" step when you first open the workflow. You'll need a Google Sheets and a Strava account.
- In the 'activities' node, enter the name of the file and the sheet where you want to save the imported data.
- In the 'Strava' node, select the corresponding credential.
- You can adjust the format of dates, times, and distances according to your needs in the 'strava_last' node.

The rest of the information is available at sherblog.es. This template was created in n8n v1.72.1.
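As an example of the kind of formatting the 'strava_last' node performs: Strava's API reports distance in meters and moving time in seconds, so conversions like these are typically needed before writing to the sheet (the exact output format is a choice, not prescribed by the template).

```javascript
// Convert Strava's raw units into spreadsheet-friendly strings.
// Strava reports distance in meters and moving_time in seconds.
function formatDistanceKm(meters) {
  return (meters / 1000).toFixed(2) + ' km';
}
function formatDuration(seconds) {
  const h = Math.floor(seconds / 3600);
  const m = Math.floor((seconds % 3600) / 60);
  const s = seconds % 60;
  return `${h}:${String(m).padStart(2, '0')}:${String(s).padStart(2, '0')}`;
}
```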
by Ferenc Erb
Use Case

Automate chat interactions in Bitrix24 with a customizable bot that can handle various events and respond to user messages.

What This Workflow Does

- Processes incoming webhook requests from Bitrix24
- Handles authentication and token validation
- Routes different event types (messages, joins, installations)
- Provides automated responses and bot registration
- Manages secure communication between Bitrix24 and external services

Setup Instructions

1. Configure Bitrix24 webhook endpoints
2. Set up authentication credentials
3. Customize bot responses and behavior
4. Deploy and test the workflow
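A sketch of the event-routing step described above. The event names (ONIMBOTMESSAGEADD, ONIMBOTJOINCHAT, ONAPPINSTALL) follow Bitrix24's bot API naming, but verify them against the payloads your portal actually sends.

```javascript
// Map Bitrix24 webhook event names to the workflow branches: messages,
// chat joins, and app installation (bot registration). Event names are
// assumed from Bitrix24's bot API -- confirm against real payloads.
function routeEvent(event) {
  switch (event) {
    case 'ONIMBOTMESSAGEADD': return 'handleMessage';
    case 'ONIMBOTJOINCHAT':   return 'handleJoin';
    case 'ONAPPINSTALL':      return 'registerBot';
    default:                  return 'ignore';
  }
}
```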
by InfraNodus
This template can be used to generate research questions from PDF documents (e.g. research papers, market reports) based on the content gaps found in the text, using the InfraNodus GraphRAG knowledge graph representation. Simply upload several PDF files (research papers, corporate or market reports, etc.) and generate a research question / AI prompt in seconds.

The template is useful for:

- generating research questions
- generating AI prompts that drive research further
- finding blind spots in any discourse and generating ideas that address them
- avoiding the generic bias of LLM models and focusing on what's important in your particular context

Using Content Gaps for Generating Research Questions

Knowledge graphs represent any text as a network: the main concepts are the nodes, and their co-occurrences are the connections between them. Based on this representation, we build a graph and apply network science metrics to rank the most important nodes (concepts) that serve as the crossroads of meaning, and also the main topical clusters that they connect. Naturally, some of the clusters will be disconnected and will have gaps between them. These are the topics (groups of concepts) that exist in this context (the documents you uploaded) but that are not very well connected. Addressing those gaps can help you see which groups of concepts you could connect with your own ideas. This is exactly what InfraNodus does: it builds the structure, finds the gaps, then uses the built-in AI to generate research questions that bridge those gaps.

How it works

1) Step 1: First, you upload your PDF files using an online web form, which you can run from n8n or even make publicly available.

2) Steps 2-4: The documents are processed using the Code and PDF-to-Text nodes to extract plain text from them.
3) Step 5: This text is then sent to the InfraNodus GraphRAG node that creates a knowledge graph, identifies structural gaps in this graph, and then uses built-in AI to generate research questions / prompts.

4) Step 6: The questions are then shown to the user in the same web form.

Optionally, you can hook this template into your own workflow and send the generated question to an InfraNodus expert or your own AI model / agent for further processing. If you'd like to sync this workflow to PDF files in a Google Drive folder, you can copy our Google Drive PDF processing workflow for n8n.

How to use

You need an InfraNodus GraphRAG API account and key to use this workflow:

1. Create an InfraNodus account.
2. Get the API key at https://infranodus.com/api-access and create a Bearer authorization key.
3. Add this key into the InfraNodus GraphRAG HTTP node(s) you use in this workflow.

You do not need any OpenAI keys for this to work. Optionally, you can change the settings in Step 4 of this workflow and force it to always use the biggest gap it identifies.

Requirements

- An InfraNodus account and API key
- Note: an OpenAI key is not required. You will have direct access to the InfraNodus AI with the API key.

Customizing this workflow

You can use this same workflow with a Telegram bot or Slack (to be notified of the summaries and ideas). You can also hook up automated social media content creation workflows at the end of this template, so you can generate posts that are relevant (covering the important topics in your niche) but also novel (because they connect them in a new way).
Check out our n8n templates for ideas at https://n8n.io/creators/infranodus/

Also check the full tutorial with a conceptual explanation at https://support.noduslabs.com/hc/en-us/articles/20454382597916-Beat-Your-Competition-Target-Their-Content-Gaps-with-this-n8n-Automation-Workflow

Also check out the video introduction to InfraNodus to better understand how knowledge graphs and content gaps work.

For support and help with this workflow, please contact us at https://support.noduslabs.com
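For orientation, the InfraNodus GraphRAG HTTP node boils down to a Bearer-authenticated POST. The endpoint path and body fields below are illustrative assumptions only; copy the exact values from the HTTP node configured in the template.

```javascript
// Sketch of the Bearer-authenticated call to InfraNodus. The URL path and
// body shape are assumptions -- use the exact configuration from the
// template's InfraNodus GraphRAG HTTP node.
function buildInfranodusRequest(text, apiKey) {
  return {
    method: 'POST',
    url: 'https://infranodus.com/api/v1/graphAndStatements', // assumed path
    headers: {
      Authorization: `Bearer ${apiKey}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ text }),
  };
}
```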
by Jimleuk
This n8n template demonstrates how to calculate the evaluation metric "Similarity", which in this scenario measures the consistency of the agent. The scoring approach is adapted from the open-source evaluations project RAGAS; you can see the source at https://github.com/explodinggradients/ragas/blob/main/ragas/src/ragas/metrics/_answer_similarity.py

How it works

This evaluation works best where questions are close-ended or about facts, where the answer can have little to no deviation. For our scoring, we generate embeddings for both the AI's response and the ground truth and calculate the cosine similarity between them. A high score indicates LLM consistency with expected results, whereas a low score could signal model hallucination.

Requirements

- n8n version 1.94+
- Check out this Google Sheet for sample data: https://docs.google.com/spreadsheets/d/1YOnu2JJjlxd787AuYcg-wKbkjyjyZFgASYVV0jsij5Y/edit?usp=sharing
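The core of the metric is plain cosine similarity over the two embedding vectors, which could be computed in an n8n Code node like this:

```javascript
// Cosine similarity between two embedding vectors, as used in RAGAS-style
// answer-similarity scoring: a value near 1.0 means the AI response and the
// ground truth point the same way in embedding space.
function cosineSimilarity(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}
```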
by Yaron Been
🧨 VIP Radar: Instantly Spot & Summarize High-Value Shopify Orders with AI + Slack Alerts

Automatically detect when a new Shopify order exceeds $200, fetch the customer's purchase history, generate an AI-powered summary, and alert your team in Slack, so no VIP goes unnoticed.

🛠️ Workflow Overview

| Feature | Description |
|-------------------|-----------------------------------------------------------------------------|
| Trigger | Shopify "New Order" webhook |
| Conditional Check | Filters for orders > $200 |
| Data Enrichment | Pulls full order history for the customer from Shopify |
| AI Summary | Uses OpenAI to summarize buying behavior |
| Notification | Sends detailed alert to Slack with name, order total, and customer insights |
| Fallback | Ignores low-value orders and terminates flow |

📘 What This Workflow Does

This automation monitors your Shopify store and reacts to any high-value order (over $200). When triggered:

It fetches all past orders of that customer,
Summarizes the history using OpenAI,
Sends a full alert with context to your Slack channel.

No more guessing who's worth a closer look. Your team gets instant insights, and your VIPs get the attention they deserve.

🧩 Node-by-Node Breakdown

🔔 1. Trigger: New Shopify Order
**Type**: Shopify Trigger
**Event**: orders/create
**Purpose**: Starts the workflow on each new order
**Pulls**: Order total, customer ID, name, etc.

🔣 2. Set: Convert Order Total to Number
Ensures the total_price is treated as a number for comparison.

❓ 3. If: Is Order > $200?
**Condition**: $json.total_price > 200
**Yes** → Continue
**No** → End workflow

🔗 4. HTTP: Fetch Customer Order History
Uses the Shopify Admin API to retrieve all orders from this customer. Requires your Shopify access token.

🧾 5. Set: Convert Orders Array to String
Formats the order data so it's prompt-friendly for OpenAI.

🧠 6. LangChain Agent: Summarize Order History
**Prompt**: "Summarize the customer's order history for Slack. Here is their order data: {{ $json.orders }}"
**Model**: GPT-4o Mini (customizable)

📨 7. Slack: Send VIP Alert
Sends a rich message to a Slack channel. Includes:
Customer name
Order value
Summary of past behavior

🧱 8. No-Op (Optional)
Used to safely end the workflow if the order is not high-value.

🔧 How to Customize

| What | How |
|--------------------|----------------------------------------------------------------------|
| Order threshold | Change 200 in the If node |
| Slack channel | Update channelId in the Slack node |
| AI prompt style | Edit the text in the LangChain Agent node |
| Shopify auth token | Replace shpat_abc123xyz... with your actual private token |

🚀 Setup Instructions

Open the n8n editor.
Go to Workflows → Import → Paste JSON.
Paste this workflow JSON.
Replace your Shopify token and Slack credentials.
Save and activate.
Place a test order in Shopify to watch it work.

💡 Real-World Use Cases

🎯 Notify the sales team when a potential VIP buys
🛎️ Prep support reps with customer history
📈 Detect repeat buyers and upsell opportunities

🔗 Resources & Support

👨💻 Creator: Yaron Been
📺 YouTube: NoFluff with Yaron Been
🌐 Website: https://nofluff.online
📩 Contact: Yaron@nofluff.online

🏷️ Tags
#shopify, #openai, #slack, #vip-customers, #automation, #n8n, #workflow, #ecommerce, #customer-insights, #ai-summaries, #gpt4o
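The filtering and prompt-building steps in the node breakdown above can be sketched as plain functions. This is a minimal illustration, with hypothetical function names; in the actual workflow these steps live in the Set, If, and LangChain Agent nodes, and Shopify delivers `total_price` as a string, which is why the conversion step exists.

```python
def is_vip_order(order: dict, threshold: float = 200.0) -> bool:
    """Steps 2-3: coerce total_price to a number and compare to the threshold."""
    return float(order.get("total_price", 0)) > threshold

def build_summary_prompt(orders: list[dict]) -> str:
    """Steps 5-6: flatten the order history into a prompt-friendly string."""
    lines = [
        f"- Order {o.get('name', '?')}: ${o.get('total_price', '0')} "
        f"on {o.get('created_at', 'unknown date')}"
        for o in orders
    ]
    return (
        "Summarize the customer's order history for Slack. "
        "Here is their order data:\n" + "\n".join(lines)
    )

# A $249.99 order clears the default $200 threshold.
order = {"total_price": "249.99"}
print(is_vip_order(order))
```

Raising the VIP bar is then a one-argument change, e.g. `is_vip_order(order, threshold=500.0)`, mirroring the "Change 200 in the If node" customization above.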
by phil
This workflow automates voice reminders for upcoming appointments by generating a professional audio message and sending it to clients via email with the voice file attached. It integrates Google Calendar to track appointments, ElevenLabs to generate high-quality voice messages, and Gmail to deliver them efficiently.

Who Needs Automated Voice Appointment Reminders?

This automated voice appointment reminder system is ideal for businesses that rely on scheduled appointments. It helps reduce no-shows, improve client engagement, and streamline communication.

Medical Offices & Clinics – Ensure patients receive timely appointment reminders.
Real Estate Agencies – Keep potential buyers and renters informed about property visits.
Service-Based Businesses – Perfect for salons, consultants, therapists, and coaches.
Legal & Financial Services – Help clients remember important meetings and consultations.

If your business depends on scheduled appointments, this workflow saves time and enhances client satisfaction. 🚀

Why Use This Workflow?

Ensures clients receive timely reminders.
Reduces appointment no-shows and scheduling issues.
Automates the process with a personalized voice message.

Step-by-Step: How This Workflow Automates Voice Reminders

Trigger the Workflow – The system runs manually or on a schedule to check upcoming appointments in Google Calendar.
Retrieve Appointment Data – It fetches event details (client name, time, and location) from Google Calendar. The workflow uses the summary, start.dateTime, location, and attendees[0].email fields to personalize and send the voice reminders.
Generate a Voice Reminder – Using ElevenLabs, the workflow converts the appointment details into a natural-sounding voice message.
Send via Email – The generated audio file is attached to an email and sent to the client as a reminder.
Customization: Tailor the Workflow to Your Business Needs

Adjust Trigger Frequency – Modify the schedule to run daily, hourly, or at specific intervals.
Customize Voice Message Format – Change the script structure and voice tone to match your business needs.
Change Notification Method – Integrate SMS or WhatsApp for delivery instead of email.

🔑 Prerequisites

**Google Calendar Access** – Ensure you have access to the calendar with scheduled appointments.
**ElevenLabs API Key** – Required for generating voice messages (you can start for free).
**Gmail API Access** – Needed for sending reminder emails.
**n8n Setup** – The workflow runs on an n8n instance (self-hosted or cloud).

🚀 Step-by-Step Installation & Setup

**Set Up Google Calendar API**
Go to the Google Cloud Console.
Create a new project and enable the Google Calendar API.
Generate OAuth 2.0 credentials and save them for n8n.

**Get an ElevenLabs API Key**
Sign up at ElevenLabs.
Retrieve your API key from the dashboard.

**Configure Gmail API**
Enable the Gmail API in the Google Cloud Console.
Create OAuth credentials and authorize your email address for sending.

**Deploy n8n & Install the Workflow**
Install n8n (Installation Guide).
Add the required Google Calendar, ElevenLabs, and Gmail nodes.
Import or build the workflow with the correct credentials.
Test and fine-tune as needed.

⚠ Important: The LangChain Community node used in this workflow only works on self-hosted n8n instances. It is not compatible with n8n Cloud. Please ensure you are running a self-hosted instance before using this workflow.

Summary

This workflow ensures a professional and seamless experience for your clients, keeping them informed and engaged. 🚀🔊

Phil | Inforeole
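The "Retrieve Appointment Data" and "Generate a Voice Reminder" steps hinge on turning calendar fields into a spoken script. Here is a minimal sketch of that text-building step, using the same event fields the workflow reads (summary, start.dateTime, location, attendees[0].email); the helper name and message wording are illustrative, and the ElevenLabs text-to-speech call is assumed to happen afterwards with this string as input.

```python
from datetime import datetime

def build_reminder_script(event: dict) -> str:
    """Compose the reminder text ElevenLabs will speak, from a Google
    Calendar event dict (summary, start.dateTime, location)."""
    start = datetime.fromisoformat(event["start"]["dateTime"])
    when = start.strftime("%A, %B %d at %I:%M %p")
    location = event.get("location", "our office")
    return (
        f"Hello! This is a friendly reminder about your appointment "
        f"'{event['summary']}' on {when}, at {location}. See you soon!"
    )

event = {
    "summary": "Dental check-up",
    "start": {"dateTime": "2025-05-06T14:30:00"},
    "location": "12 Main Street",
    "attendees": [{"email": "client@example.com"}],  # email recipient
}
print(build_reminder_script(event))
```

Adapting the voice message format, as suggested above, then means editing only this script template rather than the calendar or email plumbing.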