by Rodrigue Gbadou
How it works

- Automatic Detection: Instantly identifies abandoned carts via a webhook from your e-commerce store.
- Progressive Sequence: Automatically sends 3 recovery emails over 7 days with increasing incentives.
- Dynamic Personalization: Inserts the abandoned products, the customer's name, and unique promo codes.
- Performance Tracking: Analyzes conversion rates and recovered revenue.

Set up steps

1. Configure the webhook: Connect your e-commerce platform (Shopify, WooCommerce, Magento) to trigger the workflow when a cart is abandoned.
2. Email service: Set up your email sending service (Gmail, SendGrid, Mailgun) with proper credentials.
3. Customization: Adapt the email templates with your brand guidelines, logo, and tone of voice.
4. Promo codes: Integrate your discount code system (10%, 15%, 20%).
5. Analytics tracking: Connect a Google Sheet to track recovery performance.
6. Testing: Validate the workflow with test data before activation.

Key Features

- Smart targeting: Automatically filters qualified carts (minimum value, valid email)
- Optimized timing: Scientifically timed sequence (1h, 24h, 72h) to maximize conversions
- Progressive incentives: Increasing discounts (10% → 15% → 20%) to create urgency
- Responsive design: Email templates optimized for all devices
- Unique codes: Automatically generates personalized promo codes for each customer
- Built-in analytics: Real-time tracking of open rates, clicks, and conversions
- Error handling: Robust system with notifications in case of technical issues
- Professional templates: Modern email designs with optimized calls-to-action

Advanced Features

- **Customer segmentation**: Differentiates between new and returning customers
- **Automatic exclusions**: Avoids sending to customers who already purchased
- **Multi-language**: Supports different languages based on location
- **A/B testing**: Tests different email versions to optimize performance
- **CRM integration**: Syncs data with your customer management system

Metrics Tracked

- Recovery rate per email in the sequence
- Real-time recovered revenue
- Open and click-through rates for each email
- Promo codes used and their effectiveness
- Average delay between abandonment and conversion

Customization Options

- **Flexible timing**: Adjust sending delays to fit your industry
- **Variable incentives**: Change discount percentages as needed
- **Dynamic content**: Adjust messages based on product types
- **Configurable thresholds**: Set your own qualification criteria
- **Full branding**: Integrate your complete visual identity

> This workflow automatically turns abandoned carts into sales opportunities with a scientific and personalized approach, generating measurable ROI for your e-commerce store.
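As a rough sketch of the "Smart targeting" filter above, a Code node version might look like this. The field names (`total`, `email`) and the minimum cart value are illustrative assumptions, not the template's actual configuration:

```javascript
// Hedged sketch: keep only abandoned carts above a minimum value with a
// plausibly valid email address. Adjust field names to your store's payload.
const MIN_CART_VALUE = 50; // hypothetical threshold
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

const qualified = $input.all().filter(item => {
  const cart = item.json;
  return cart.total >= MIN_CART_VALUE && EMAIL_RE.test(cart.email ?? "");
});

// Only qualified carts continue into the 1h / 24h / 72h email sequence.
return qualified;
```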
by David Roberts
AI evaluation in n8n

This is a template for n8n's evaluation feature. Evaluation is a technique for gaining confidence that your AI workflow performs reliably, by running a test dataset containing different inputs through the workflow. By calculating a metric (score) for each input, you can see where the workflow is performing well and where it isn't.

How it works

This template shows how to calculate a workflow evaluation metric: text similarity, measured character by character. The workflow takes images of hand-written codes, extracts each code, and compares it with the expected answer from the dataset.

The workflow works as follows:

- We use an evaluation trigger to read in our dataset. It is wired up in parallel with the regular trigger so that the workflow can be started from either one. More info
- We download the image and use AI to extract the code
- If we're evaluating (i.e. the execution started from the evaluation trigger), we calculate the string distance metric
- We pass this information back to n8n as a metric
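For intuition, a character-by-character similarity metric of this kind can be expressed as a normalized Levenshtein distance. This is a generic sketch, not the template's exact node code:

```javascript
// Classic Levenshtein edit distance between two strings.
function levenshtein(a, b) {
  const m = a.length, n = b.length;
  const dp = Array.from({ length: m + 1 }, (_, i) => [i, ...Array(n).fill(0)]);
  for (let j = 0; j <= n; j++) dp[0][j] = j;
  for (let i = 1; i <= m; i++) {
    for (let j = 1; j <= n; j++) {
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1,                                   // deletion
        dp[i][j - 1] + 1,                                   // insertion
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1)  // substitution
      );
    }
  }
  return dp[m][n];
}

// Normalize to a 0..1 score where 1 means an exact match.
function similarity(actual, expected) {
  const maxLen = Math.max(actual.length, expected.length) || 1;
  return 1 - levenshtein(actual, expected) / maxLen;
}

console.log(similarity("A1B2C3", "A1B2C8")); // ≈ 0.833
```

A score like this works well as an evaluation metric because a one-character OCR slip still earns partial credit instead of a flat zero.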
by InfraNodus
Set Up ElevenLabs Voice Chat Agent using Graph RAG Knowledge Graphs as Experts

This workflow creates an AI voice chatbot agent that has access to several knowledge bases at the same time (used as "experts"). These knowledge bases are provided through InfraNodus GraphRAG, which uses knowledge graphs to deliver high-quality responses without the need to set up complex RAG vector store workflows. We use ElevenLabs to set up a voice agent that can be embedded into any website or used via their API.

The advantages of using GraphRAG instead of standard vector stores for knowledge are:

- Easy and quick to set up (no complex data import workflows needed) and to update with new knowledge
- A knowledge graph has a holistic overview of your knowledge base
- Better retrieval of relations between the document chunks = higher-quality responses
- Ability to reuse in other n8n workflows

How it works

This template uses the n8n AI Agent node as an orchestrating agent that decides which tool (knowledge graph) to use based on the user's prompt. The user's prompt is received from the ElevenLabs Conversational AI agent via an n8n Webhook, which also takes care of the voice interaction. The response from n8n is then sent back to the webhook, which is polled by the ElevenLabs voice agent. This agent processes the response and provides the final answer.

Here's a step-by-step description:

1. The user submits a question using the ElevenLabs voice interface.
2. The question is sent via the knowledge_base tool in ElevenLabs to the n8n Webhook, with a POST request containing the user's prompt and a sessionID for the Chat Memory node in n8n.
3. The n8n AI Agent node checks the list of tools it has access to. Each tool has a description of the knowledge auto-generated by InfraNodus (we call each tool an "expert").
4. The n8n AI agent decides which tool should be used to generate a response. It may reformulate the user's query to be more suitable for the expert.
5. The query is then sent to the InfraNodus HTTP node endpoint, which queries the graph that corresponds to that expert.
6. Each InfraNodus GraphRAG expert provides a rich response that takes the whole context into account, returning an answer from each expert (graph) along with a list of relevant statements retrieved using a combination of RAG and GraphRAG.
7. The n8n AI Agent node integrates the responses received from the experts to produce the final answer.
8. The final answer is sent back to the webhook endpoint.
9. The ElevenLabs conversational AI agent picks up the response arriving from the knowledge_base tool via the webhook, condenses it for conversational format, and transforms the text into voice.

How to use

You need an InfraNodus GraphRAG API account and key to use this workflow.

1. Create an InfraNodus account.
2. Get the API key at https://infranodus.com/api-access and create a Bearer authorization key for the InfraNodus HTTP nodes.
3. Create a separate knowledge graph for each expert (using the PDF / content import options) in InfraNodus.
4. For each graph, go to the workflow and paste the name of the graph into the body name field. Keep the other settings intact or learn more about them at the InfraNodus access points page.
5. Once you add one or more graphs as experts to your flow, add the LLM key to the OpenAI node and launch the workflow.
6. You will also need to set up an ElevenLabs account and create a conversational AI agent there.
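As a rough illustration of step 2 above, here is a minimal sketch of the POST request the ElevenLabs knowledge_base tool sends to the n8n webhook. The field names, webhook path, and host are assumptions for illustration; adapt them to your own ElevenLabs tool configuration:

```javascript
// Hypothetical payload shape: the template only specifies that the request
// carries the user's prompt and a sessionID for the Chat Memory node.
const payload = {
  prompt: "What does my knowledge base say about network structure?",
  sessionId: "elevenlabs-conversation-42",
};

// "voice-agent" is a placeholder webhook path, not the template's real one.
const res = await fetch("https://your-n8n-host/webhook/voice-agent", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(payload),
});

console.log(await res.json()); // final answer assembled by the AI Agent node
```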
See the Post note in the n8n workflow for a complete step-by-step description, or our support article on setting up an ElevenLabs AI voice agent.

Once the voice AI agent is ready, you might want to combine it with a text AI chatbot workflow so your users have a choice between text and voice interaction. In that case, you may be interested in our free open-source website popup chat widget popupchat.dev, where you can create an embed code to add to your blog or website and let the user choose between text and voice interaction.

Requirements

- An InfraNodus account and API key
- An OpenAI (or any other LLM) API key
- An ElevenLabs account

FAQ

1. How many "experts" should I aim for?

We recommend aiming for the same number of experts as the optimal number of people in a team, which is usually 2-7. If you add more experts, your orchestrating AI agent will have trouble choosing the most suitable "expert" tool for the user's query. You can mitigate this by specifying in the AI agent description that it can choose a maximum of 3-7 experts to provide a response.

2. Why use InfraNodus GraphRAG and not a standard vector store for knowledge?

First, vector stores are complex to set up and to update. You'd need a separate workflow for that, decide on the vector dimensions, add metadata to your knowledge, etc. With InfraNodus, you have a complete RAG / GraphRAG solution under the hood that is easy to set up and provides high-quality responses that take the overall structure and the relations between your ideas into account.

3. Why not use ElevenLabs' own knowledge base?

One reason is that you want your knowledge base to be in one place so you can reuse it in other n8n workflows. Another reason is that you will not have as good a separation between the "experts" when you converse with the agent: the answers you get will be based on top matches from all the books / articles you upload, while with the InfraNodus GraphRAG setup you can better control which graphs are consulted as experts and have an explicit way to display this data.

Customizing this workflow

You can use this same workflow with a Telegram bot, so you can interact with it using Telegram. There are many more customizations available on our GitHub repo for n8n workflows.

Check out the complete setup guide for this workflow at https://support.noduslabs.com/hc/en-us/articles/20318967066396-How-to-Build-a-Text-Voice-AI-Agent-Chatbot-with-n8n-Elevenlabs-and-InfraNodus

Also check out the video tutorial with a demo.
by Benjamin Jones (SaaS Alerts)
Collect and Email Authentication IP Addresses from SaaS Alerts (Last 24 Hours)

Description

This n8n workflow automates the process of collecting sign-in IP addresses from SaaS Alerts over the past 24 hours and emailing the results using SMTP2Go. Designed for security teams, IT administrators, and compliance officers, this workflow helps monitor user authentication activity, detect unusual sign-ins, and respond to potential security threats in real time. By automating data collection and email alerts, organizations can proactively track login patterns, ensure compliance with security policies, and mitigate risks associated with unauthorized access.

Use Case

This workflow is ideal for businesses and IT teams that need to:

- Monitor user authentication activity across SaaS applications.
- Identify login attempts from suspicious IPs.
- Automate security reporting and compliance tracking.
- Receive real-time alerts for unusual sign-in behaviors.

Pre-Conditions & Requirements

Before using this workflow, ensure you have:

- A SaaS Alerts account or another system that logs authentication IPs.
- An SMTP2Go account for sending email notifications.
- n8n set up with proper API credentials and database access (if applicable).

Setup Instructions

1. Configure the SaaS Alerts API: Obtain an API key from the SaaS Alerts platform under the Settings menu.
2. Set up SMTP2Go for email alerts: Create an SMTP2Go account if you don't have one, generate an SMTP2Go API key, verify that your sending email address has been configured and verified, and define recipient email addresses for security alerts.
3. Customize the workflow: Modify the filtering rules to track specific users, IP ranges, or flagged login attempts, and adjust the email content to include relevant details for your team.
4. Test & deploy: Run the workflow manually to verify data retrieval and email notifications, then schedule the workflow to run daily for automated monitoring.

Workflow Steps

1. Trigger - Starts manually or on a scheduled interval (e.g., every 24 hours).
2. Fetch Authentication Logs - Retrieves sign-in IPs from SaaS Alerts or a custom API.
3. Filter & Process Data - Extracts relevant login attempts based on defined criteria.
4. Format Data for Reporting - Structures the data for readability in an email alert.
5. Send Email Notification via SMTP2Go - Delivers the security report to designated recipients.

Customization Options

- **Modify Filtering Rules** - Track specific login behaviors, flagged IPs, or unusual patterns.
- **Change Email Recipients** - Update the recipient list based on security team needs.
- **Integrate with Security Dashboards** - Expand the workflow to log data into a SIEM system or incident response platform.
- **Add Additional Triggers** - Configure alerts for specific login anomalies, such as failed login attempts.

Keywords: n8n security automation, authentication monitoring, login IP tracking, SMTP2Go email alerts, SaaS Alerts workflow, IT security automation, login anomaly detection
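A minimal sketch of the "Filter & Process Data" step as an n8n Code node. The event field names (`createdAt`, `ipAddress`) are assumptions about the SaaS Alerts response shape, not documented keys:

```javascript
// Hedged sketch: keep only sign-in events from the last 24 hours and
// collect their source IPs for the email report.
const cutoff = Date.now() - 24 * 60 * 60 * 1000;

const recent = $input.all()
  .map(item => item.json)
  .filter(evt => new Date(evt.createdAt).getTime() >= cutoff);

// De-duplicate IPs so the report stays compact.
const ips = [...new Set(recent.map(evt => evt.ipAddress))];

return [{ json: { count: recent.length, ips } }];
```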
by Khairul Muhtadin
Who is this for?

This workflow is perfect for Gmail users who want a tidy inbox without manual effort. It's especially great for those overwhelmed by SPAM, social media updates, or promotional emails who want them automatically removed on a regular basis.

What problem is this workflow solving?

Unwanted emails like SPAM, social notifications, and promotions can clutter your Gmail inbox, making it hard to focus on what matters. Manually deleting them is repetitive and time-consuming. This workflow automates the cleanup, keeping your inbox streamlined.

What this workflow does

Every 3 days, this workflow deletes emails from Gmail's SPAM, Social, and Promotions categories. It uses n8n's Gmail node to fetch these emails, merges them into a single list, splits out the individual email IDs, and deletes each one. The scheduled process ensures consistent inbox maintenance.

Setup

1. Set up valid Gmail OAuth2 credentials in n8n.
2. Import the "Clean My Mail" workflow into your n8n instance.
3. Confirm the Gmail nodes target the SPAM, CATEGORY_SOCIAL, and CATEGORY_PROMOTIONS labels.
4. Adjust the "Run Every 3 Days (Trigger)" node's schedule if needed.
5. Activate the workflow to begin automated cleaning.

How to customize this workflow to your needs

- Change the Gmail node labels to target other categories or custom labels.
- Adjust the schedule frequency in the trigger node.
- Add filters to spare specific emails from deletion.
- Extend functionality with nodes for archiving or notifications.

Made by: khmuhtadin. Need a custom workflow? Contact me on LinkedIn or the Web.
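For illustration, the merge-and-split step could be condensed into a single Code node like this. The node names inside `$('...')` are assumptions for this sketch, not the workflow's actual node titles:

```javascript
// Hedged sketch: combine the message lists from three Gmail "get many"
// nodes into one stream of message IDs, one item per message, ready for
// the Gmail delete operation that follows.
const all = [
  ...$('Get SPAM').all(),
  ...$('Get Social').all(),
  ...$('Get Promotions').all(),
];

return all.map(item => ({ json: { id: item.json.id } }));
```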
by Zakaria Ben
This workflow template is designed for dental assistants and anyone looking to automate appointment scheduling. It integrates Google Calendar for booking appointments and Google Sheets as a database to store patient information.

How It Works

1. The user interacts with the chatbot to schedule an appointment.
2. The chatbot collects the necessary details and checks availability via Google Calendar.
3. If the requested time is available, the AI books the appointment. If unavailable, the AI suggests alternative time slots.
4. Once booked, the AI logs the appointment details into Google Sheets for record-keeping.

Setup Instructions

Watch this Setup Video for detailed instructions on running and customizing this workflow.

Step 1: Set Up Credentials

- OpenAI API key (for chatbot functionality).
- Google account (for Google Sheets & Google Calendar integration).

Step 2: Choose the Right Tools

- Select the correct Google Calendar in the Google Calendar tool.
- Choose the appropriate Google Sheets file in the Google Sheets tool.

Step 3: Test

Run a test to ensure everything works correctly, then activate the workflow.

Example Templates

Below is a sample Google Sheets template to help you get started.
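As a rough idea of the availability check in step 2 of "How It Works", an interval-overlap test over the events returned by Google Calendar might look like this. The node name, slot values, and wiring are illustrative assumptions:

```javascript
// Hedged sketch: the requested slot is free if no existing event overlaps it.
const requestedStart = new Date("2025-05-20T10:00:00Z"); // example slot
const requestedEnd = new Date("2025-05-20T10:30:00Z");

const events = $('Get Calendar Events').all().map(i => i.json);

// Two intervals overlap iff each starts before the other ends.
const conflict = events.some(evt =>
  new Date(evt.start.dateTime) < requestedEnd &&
  new Date(evt.end.dateTime) > requestedStart
);

return [{ json: { available: !conflict } }];
```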
by Nick Saraev
AI Upwork Application Agent with OpenAI & Google Docs

Categories: AI Agents, Freelance Automation, Proposal Generation

This workflow creates an intelligent AI agent that automates Upwork job applications by generating highly personalized proposals, professional Google Doc presentations, and visual workflow diagrams. Built by someone who earned over $500,000 on Upwork, this system demonstrates the exact templates and strategies that achieve superior response rates through perceived customization and value demonstration.

Benefits

- **Complete Application Automation** - Transform job descriptions into custom proposals, documents, and diagrams in minutes
- **Proven Templates** - Based on $500K+ in Upwork earnings using exact strategies for high-converting applications
- **Intelligent Personalization** - AI analyzes job requirements and customizes responses with relevant social proof
- **Professional Asset Generation** - Creates Google Doc proposals and Mermaid workflow diagrams for enhanced perceived value
- **Modular Architecture** - Three specialized sub-workflows handle different aspects of proposal generation
- **High Response Rates** - Focuses on perceived customization and value demonstration over generic applications

How It Works

AI Agent Orchestration:
- Receives Upwork job descriptions through a chat interface
- Maintains conversation context with window buffer memory
- Coordinates three specialized sub-workflows for comprehensive proposal generation
- Automatically integrates generated assets into cohesive application packages

Application Copy Generation:
- Uses proven templates based on $500K+ Upwork success
- Follows the structure: "Hi, I do [thing] all the time. So confident I created a demo: [link]"
- Incorporates personal social proof and achievements automatically
- Generates concise, spartan-toned applications that avoid generic AI language

Google Doc Proposal Creation:
- Copies a professional proposal template from Google Drive
- Generates structured content including system title, explanation, scope, and timeline
- Uses find-and-replace to populate the template with AI-generated, personalized content
- Creates shareable documents with proper permissions for immediate client access

Mermaid Diagram Visualization:
- Analyzes job requirements to create relevant workflow diagrams
- Generates Mermaid.js code for professional flowchart visualization
- Provides a visual representation of proposed solutions
- Enhances perceived value through custom diagram creation

Smart Template Integration:
- Automatically replaces placeholder text with generated Google Doc links
- Maintains consistent messaging across all generated assets
- Ensures cohesive presentation of application, proposal, and supporting materials

Required Setup Configuration

Personal Information Setup: Update the "aboutMe" variable in both Set Variable nodes with your credentials:
- Professional background and specializations
- Notable client achievements with specific revenue numbers
- Social proof elements (community size, subscriber count, etc.)
- Relevant project examples with quantified results

Google Services Integration:

Google Drive API Setup:
- Enable the Google Drive API in the Google Cloud Console
- Create OAuth2 credentials (Client ID and Client Secret)
- Connect n8n to Google Drive with proper permissions

Google Docs Template:
- Copy the provided Google Docs proposal template to your Drive
- Update the template ID in the Google Drive node
- Customize the template with your branding and standard language

Google Docs API:
- Ensure the Google Docs API is enabled in your Google Cloud project
- Test document creation and sharing permissions

OpenAI API Configuration:
- Set up OpenAI API credentials across all OpenAI nodes
- Configure appropriate models (GPT-4o-mini recommended for speed)
- Set temperature to 0.7 for optimal personalization balance
- Monitor API usage to control costs

Template Customization:
- **Application Template**: Modify the proposal structure in the OpenAI prompts to match your services
- **Google Doc Template**: Update the document template with your standard proposal format
- **Personal Details**: Replace all placeholder information with your actual achievements and social proof

Business Use Cases

- **Freelance Professionals** - Automate high-quality Upwork applications across multiple job categories
- **Automation Specialists** - Demonstrate capabilities through the automated proposal generation itself
- **Service Providers** - Scale application volume while maintaining personalization quality
- **Agency Owners** - Offer proposal automation services to freelance clients
- **Consultants** - Streamline business development with automated custom proposals
- **Content Creators** - Generate professional project proposals with visual workflow representations

Revenue Potential

This system transforms freelance business development:

- **10x Application Speed**: Generate comprehensive proposals in minutes vs. hours
- **Higher Response Rates**: Perceived customization and value demonstration increase client engagement
- **Scalable Outreach**: Apply to more jobs with maintained quality through automation
- **Professional Positioning**: Visual diagrams and structured proposals demonstrate expertise
- **Competitive Advantage**: Deliver proposals faster than competitors through intelligent automation

Difficulty Level: Advanced
Estimated Build Time: 3-4 hours
Monthly Operating Cost: ~$30 (OpenAI + Google APIs)

Watch My Complete Live Build

Want to see me build this entire system from scratch? I walk through every component live - including the AI agent setup, prompt engineering strategies, Google Docs integration, and all the debugging that goes into creating a production-ready freelance automation system.

See My Live Build Process: "I Built An AI Agent That Automates Upwork ($500K+ Earned)"

This comprehensive tutorial shows the real development process - including advanced prompt engineering, modular workflow design, and the exact business strategies that generated $500K+ in Upwork revenue.
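The "find-and-replace" population of the copied template can be done with the Google Docs batchUpdate API's replaceAllText request, which the Google Docs node (or an HTTP Request node) wraps. A minimal sketch, assuming hypothetical placeholder tags and document ID:

```javascript
// Hedged sketch: swap placeholder tags in the copied proposal template for
// AI-generated content. Tags and replacement text are illustrative; in n8n,
// the credential normally supplies the OAuth token instead of an env var.
const docId = "YOUR_COPIED_TEMPLATE_DOC_ID";

const requests = [
  { replaceAllText: { containsText: { text: "{{systemTitle}}", matchCase: true },
                      replaceText: "Automated Lead-Enrichment Pipeline" } },
  { replaceAllText: { containsText: { text: "{{scope}}", matchCase: true },
                      replaceText: "Webhook intake, enrichment, CRM sync" } },
];

await fetch(`https://docs.googleapis.com/v1/documents/${docId}:batchUpdate`, {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.GOOGLE_OAUTH_TOKEN}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({ requests }),
});
```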
Set Up Steps

AI Agent Foundation:
- Configure the chat trigger and AI agent node with OpenAI integration
- Set up window buffer memory for conversation context
- Define a system message with clear agent instructions and behavior rules

Sub-Workflow Creation:
- Build three specialized workflows: Application Copy, Google Doc Proposal, Mermaid Code
- Configure execute-workflow triggers for each sub-workflow
- Set up proper data passing between the agent and sub-workflows

Google Services Configuration:
- Create a Google Cloud Console project with the Drive and Docs APIs enabled
- Set up OAuth2 credentials and connect them to n8n
- Copy and customize the proposal template document

Personalization Setup:
- Update all "aboutMe" variables with your specific achievements and social proof
- Customize prompt templates to match your service offerings and communication style
- Test individual sub-workflows with sample job descriptions

Agent Tool Integration:
- Connect the sub-workflows as tools in the main AI agent
- Configure proper tool descriptions and response property names
- Test complete agent functionality with realistic job posting scenarios

Template Optimization:
- Refine proposal templates based on your specific service offerings
- Adjust AI prompts for optimal personalization and response quality
- Test with various job types to ensure consistent quality output

Advanced Optimizations

Scale the system with:

- **Job Scraping Integration**: Automatically discover and apply to relevant Upwork jobs
- **Response Tracking**: Monitor application success rates and optimize templates
- **Multi-Platform Support**: Extend to other freelance platforms (Fiverr, Freelancer, etc.)
- **Client Communication**: Automate follow-up sequences for proposal responses
- **Portfolio Integration**: Automatically include relevant portfolio pieces based on job requirements

Important Considerations

- **Template Authenticity**: Customize templates significantly to avoid detection as automated
- **Upwork Compliance**: Ensure applications meet platform guidelines and quality standards
- **Personal Branding**: Maintain a consistent voice and positioning across all generated content
- **Response Management**: Be prepared to handle increased application volume and client responses
- **Quality Control**: Regularly review and refine generated content for accuracy and relevance

Why This System Works

The competitive advantage lies in proven strategies:

- **Perceived Customization**: AI generates content that appears manually crafted for each job
- **Value Demonstration**: Visual diagrams and structured proposals show immediate value
- **Speed Advantage**: Deliver comprehensive proposals before competitors finish reading job posts
- **Professional Presentation**: Consistent quality and formatting across all applications
- **Scalable Personalization**: Maintain individual attention at volume through intelligent automation

Check Out My Channel

For more advanced automation systems and proven freelance business strategies that generate real revenue, explore my YouTube channel, where I share the exact methodologies used to build successful automation agencies and scale to $72K+ monthly revenue.
by sayamol thiramonpaphakul
This workflow automatically checks the status of your websites using the UptimeRobot API. If any site is down or unstable, it will:

- Generate a natural-language alert message using GPT-4o
- Push the message to a LINE group (with funny IT-style encouragement)
- Log all DOWN status entries into your Supabase database
- Wait 30 minutes before repeating

How It Works

1. Schedule Trigger - Runs on a fixed interval (every few minutes).
2. UptimeRobot Node - Fetches website monitor data.
3. Code Node (Filter) - Filters only websites with status 8 (may be down) or 9 (down).
4. IF Node - If any site is down, proceed.
5. LangChain LLM Node - Formats the alert with a humorous message using GPT-4o.
6. LINE Notify (HTTP Request) - Sends the alert to your LINE group.
7. Loop Over Items - Loops through all monitors.
8. Filter Down (Status = 9) - Selects only "fully down" sites.
9. Supabase Node - Logs these into the synlora_uptime_down table.
10. Wait Node - Delays the next alert by 30 minutes to avoid spamming.

Setup Steps

Required:

- UptimeRobot API key
- LINE Channel Access Token and Group ID
- OpenAI key (GPT-4o Mini)
- Supabase project & table

Step by step:

1. Go to UptimeRobot, get an API key, and ensure your monitors are set up.
2. Create a Supabase table with the fields: website, status, uptime_id.
3. Create a LINE Messaging API bot, join it to your group, and get the Access Token and Group ID (userId or groupId).
4. Add your OpenAI API key for GPT-4o Mini (or switch to your preferred LLM).
5. Import the workflow JSON into n8n.
6. Set credentials in all necessary nodes.
7. Activate the workflow.
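The Code node filter (step 3) could look roughly like this. It assumes the UptimeRobot response exposes a monitors array on each item; adjust the path to match the node's actual output:

```javascript
// Hedged sketch: keep only monitors UptimeRobot reports as
// "seems down" (status 8) or "down" (status 9).
const monitors = $input.all().flatMap(item => item.json.monitors ?? []);

const troubled = monitors.filter(m => m.status === 8 || m.status === 9);

// One item per troubled monitor, shaped for the Supabase log downstream.
return troubled.map(m => ({
  json: { website: m.friendly_name, url: m.url, status: m.status, uptime_id: m.id },
}));
```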
by Akash Kankariya
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Overview

This n8n workflow template automates the process of monitoring Instagram comments and sending predefined responses based on specific comment keywords. It integrates Instagram's Graph API with Google Sheets to manage comment responses and maintains an interaction log for customer relationship management (CRM) purposes.

Workflow Components

The workflow consists of 9 main nodes organized into two primary sections:

Section 1: Webhook Verification

- Get Verification (Webhook node)
- Respond to Verification Message (Respond to Webhook node)

Section 2: Auto Comment Response

- Insta Update (Webhook node)
- Check if update is of comment? (Switch node)
- Comment if of other user (If node)
- Comment List (Google Sheets node)
- Send Message for Comment (HTTP Request node)
- Add Interaction in Sheet (CRM) (Google Sheets node)

Prerequisites and Setup Requirements

1. Meta/Facebook Developer Setup

Create Facebook App

> Action Items:
> - [ ] Navigate to Facebook Developers
> - [ ] Click "Create App" and select "Business" type
> - [ ] Configure the following products:
>   - Instagram Graph API
>   - Facebook Login for Business
>   - Webhooks

Required Permissions

Configure the following permissions in your Meta app:

| Permission | Purpose |
|------------|---------|
| instagram_basic | Read Instagram account profile info and media |
| instagram_manage_comments | Create, delete, and manage comments |
| instagram_manage_messages | Send and receive Instagram messages |
| pages_show_list | Access connected Facebook pages |

Access Token Generation

> Important Setup:
> - [ ] Use Facebook's Graph API Explorer
> - [ ] Generate a User Access Token with the required permissions
> - [ ] Important: Tokens expire periodically and need refreshing

2. Webhook Configuration

Setup Webhook URL

> Configuration Checklist:
> - [ ] In the Meta App Dashboard, navigate to Products → Webhooks
> - [ ] Subscribe to the Instagram object
> - [ ] Configure the webhook URL: your-n8n-domain/webhook/instagram
> - [ ] Set a verification token (use "test" or create a secure token)
> - [ ] Select webhook fields:
>   - comments - For comment notifications
>   - messages - For DM notifications (if needed)

Webhook Verification Process

The workflow handles Meta's webhook verification automatically:

1. Meta sends a GET request with a hub.challenge parameter
2. The workflow responds with the challenge value to confirm the subscription

3. Google Sheets Setup

Example: https://docs.google.com/spreadsheets/d/1ONPKJZOpQTSxbasVcCB7oBjbZcCyAm9gZ-UNPoXM21A/edit?usp=sharing

Create Response Management Sheet

Set up a Google Sheets document with the following structure:

Sheet 1 - Comment Responses:

| Column | Description | Example |
|--------|-------------|---------|
| Comment | Trigger keywords | "auto", "info", "help" |
| Message | Corresponding response message | "Thanks for your comment! We'll get back to you soon." |

Sheet 2 - Interaction Log:

| Column | Description | Purpose |
|--------|-------------|---------|
| Time | Timestamp of interaction | Track when interactions occur |
| User Id | Instagram user ID | Identify unique users |
| Username | Instagram username | Human-readable identification |
| Note | Additional notes or error messages | Debugging and analytics |

Built by: akash@codescale.tech
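For reference, the verification handshake the two webhook nodes implement could be expressed in a Code node roughly like this. The "test" token mirrors the suggestion above, and the exact wiring depends on your Respond to Webhook configuration:

```javascript
// Hedged sketch: Meta sends GET ?hub.mode=subscribe&hub.verify_token=...
// &hub.challenge=...; the endpoint must echo hub.challenge back as-is.
const query = $input.first().json.query; // n8n webhook exposes query params here

if (query["hub.mode"] === "subscribe" && query["hub.verify_token"] === "test") {
  // The Respond to Webhook node should return this value as plain text.
  return [{ json: { challenge: query["hub.challenge"] } }];
}

return [{ json: { error: "verification token mismatch" } }];
```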
by Yang
Who is this for?

This workflow is perfect for eCommerce teams, market researchers, and product analysts who want to track or extract product information from websites that restrict scraping tools. It's also useful for virtual assistants handling product comparison tasks.

What problem is this workflow solving?

Many eCommerce and retail sites use dynamic content or anti-bot protections that make traditional scraping methods unreliable. This workflow bypasses those issues by taking a screenshot of the full page, using OCR to extract the visible text, and summarizing the product information with GPT-4o, all fully automated.

What this workflow does

This workflow monitors a Google Sheet for new URLs. Once a new link is added, it performs the following steps:

1. Trigger on New URL in Sheet - Watches for new rows added to a Google Sheet.
2. Screenshot URL via Dumpling AI - Sends the URL to Dumpling AI's screenshot endpoint to capture a full-page image of the product webpage.
3. Save Screenshot to Drive Folder - Uploads the screenshot to a specific Google Drive folder for reference or logging.
4. Extract Text from Screenshot with Dumpling AI - Uses Dumpling AI's image-to-text endpoint to pull all visible content from the screenshot.
5. Extract Product Info from Screenshot Text with GPT-4o - Sends the extracted raw text to GPT-4o, prompting it to identify structured product information such as product name, price, ratings, deals, and purchase options.
6. Split Each Product Entry - Splits the GPT response (an array of product objects) so each product becomes an individual item for saving.
7. Save Products info to Google Sheet - Appends each product's structured details to a separate sheet in the same spreadsheet.

Setup

Google Sheet

- Create a Google Sheet with at least two sheets: Sheet1 should contain a header row with a column labeled URL; Sheet2 should contain the headers Product Name, price, purchased, ratings, deal, buyingOptions.
- Connect your Google account in both the trigger and the final write-back node.

Dumpling AI

- Sign up at Dumpling AI.
- Create an API key and use it for both HTTP modules: Screenshot URL via Dumpling AI and Extract Text from Screenshot with Dumpling AI.
- The screenshot endpoint used is https://app.dumplingai.com/api/v1/screenshot.

Google Drive

- Create a folder for storing screenshots.
- In the Save Screenshot to Drive Folder node, select the correct folder or provide the folder ID.
- Make sure permissions allow uploading from n8n.

OpenAI

- Provide an API key for GPT-4o in the Extract Product Info from Screenshot Text with GPT-4o node.
- The prompt is structured to return product listings in structured JSON format.

Split & Save

- Split Each Product Entry takes the array of product objects from GPT and makes each one a separate execution.
- Save Products info to Google Sheet writes the structured fields into Sheet2 under: Product Name, price, purchased, ratings, deal, buyingOptions.

How to customize this workflow

- Adjust the GPT prompt to return different product fields (e.g., shipping info, product categories).
- Use a filter node to limit which types of products get written to the final sheet.
- Add sentiment analysis to analyze review content if available.
- Replace Google Drive with Dropbox or another file storage app.

Notes

- Make sure you monitor your API usage on both Dumpling AI and OpenAI to avoid rate limits.
- This setup is great for snapshot-based extraction where scraping is blocked or unreliable.
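A minimal sketch of the "Split Each Product Entry" step as a Code node. It assumes the upstream node left the parsed product array on `$json.products`; adjust the path to your GPT node's actual output:

```javascript
// Hedged sketch: emit one n8n item per product object so each row can be
// appended to Sheet2 individually.
const products = $input.first().json.products;

return products.map(product => ({ json: product }));
```

The same effect can usually be achieved with n8n's built-in Split Out node pointed at the array field, which is why the template treats this as a simple fan-out step.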
by David Olusola
n8n Set Node Tutorial - Complete Guide

How It Works

This tutorial workflow teaches you everything about n8n's Set node through hands-on examples. The Set node is one of the most powerful tools in n8n: it allows you to create, modify, and transform data as it flows through your workflow.

What makes this tutorial special:

- **Progressive Learning**: Starts simple, builds to complex concepts
- **Interactive Examples**: Real working nodes you can modify and test
- **Visual Guidance**: Sticky notes explain every concept
- **Branching Logic**: Shows how Set nodes work in different workflow paths
- **Real Data**: Uses practical examples you'll encounter in automation

The workflow demonstrates 6 core concepts:

1. Basic data types (strings, numbers, booleans)
2. Expression syntax with {{ }} and $json references
3. Complex data structures (objects and arrays)
4. "Keep Only Set" option for clean outputs
5. Conditional data setting with branching logic
6. Data transformation and aggregation techniques

Setup Steps

Step 1: Import the Workflow

1. Copy the JSON from the code artifact above
2. Open your n8n instance in your browser
3. Navigate to the Workflows section
4. Click "Import from JSON" or the import button (usually a "+" or import icon)
5. Paste the JSON into the import dialog
6. Click "Import" to load the workflow
7. Save the workflow (Ctrl+S or click the Save button)

Step 2: Choose Your Starting Point

Option A: Default Tutorial Mode (Recommended for beginners)

- The workflow is ready to run as-is
- Uses a simple "Welcome" message as starting data
- Click "Execute Workflow" to begin

Option B: Rich Test Data Mode (Recommended for experimentation)

1. Locate the nodes: Find "Start (Manual Trigger)" and "0. Test Data Input"
2. Disconnect default: Click the connection line between "Start (Manual Trigger)" → "1. Set Basic Values" and delete it
3. Connect test data: Drag from the "0. Test Data Input" output to the "1. Set Basic Values" input
4. Execute: Click "Execute Workflow" to run with rich test data

Step 3: Execute and Learn

1. Run the workflow: Click the "Execute Workflow" button
2. Check outputs: Click on each node to see its output data
3. Read the notes: Each sticky note explains what's happening
4. Follow the flow: Data flows from left to right, top to bottom

Step 4: Experiment and Modify

Try These Experiments:

Change Basic Values:

1. Click on "1. Set Basic Values"
2. Modify user_age (try 20 vs 35)
3. Change user_name to see how it propagates
4. Execute and see the changes flow through

Test Conditional Logic:

- Set user_age to 20 → triggers the "Student Discount" path
- Set user_age to 30 → triggers the "Premium Access" path
- Watch how the workflow branches differently

Modify Expressions:

In "2. Set with Expressions", try changing:

- ={{ $json.score * 2 }} to ={{ $json.score * 3 }}
- ={{ $json.user_name }} Smith to ={{ $json.user_name }} Johnson

Complex Data Structures:

- In "3. Set Complex Data", modify the JSON structure
- Add new properties to the user_profile object
- Try nested expressions

Learning Path

Beginner Level (Nodes 1-2)

- Focus: Understanding basic Set operations
- Learn: Data types, static values, simple expressions
- Time: 10-15 minutes

Intermediate Level (Nodes 3-4)

- Focus: Complex data and output control
- Learn: Objects, arrays, "Keep Only Set" option
- Time: 15-20 minutes

Advanced Level (Nodes 5-6)

- Focus: Conditional logic and data aggregation
- Learn: Branching workflows, merging data, complex expressions
- Time: 20-25 minutes

What Each Node Teaches

| Node | Concept | Key Learning |
|------|---------|--------------|
| 1. Set Basic Values | Data Types | String, number, boolean basics |
| 2. Set with Expressions | Dynamic Data | {{ }} syntax, $json references, $now functions |
| 3. Set Complex Data | Advanced Structures | Objects, arrays, nested properties |
| 4. Set Clean Output | Data Management | "Keep Only Set" for clean final outputs |
| 5a/5b. Conditional Sets | Branching Logic | Different data based on conditions |
| 6. Tutorial Summary | Data Aggregation | Combining and summarizing workflow data |

Pro Tips

Quick Wins:

- Always check node outputs after execution
- Use the sticky notes as your learning guide
- Experiment with small changes first
- Copy nodes to try variations

Advanced Techniques:

- Use Keep Only Set for API responses
- Combine static and dynamic data in complex objects
- Leverage conditional paths for different user types
- Reference nested object properties with dot notation

Troubleshooting:

- If expressions don't work, check the {{ }} syntax
- Ensure field names match exactly (case-sensitive)
- Use the expression editor for complex logic
- Check that data types match your expectations

Next Steps After Tutorial

1. Create your own Set nodes in a new workflow
2. Practice with real data from APIs or databases
3. Build data transformation workflows for your specific use cases
4. Combine Set nodes with other n8n nodes like HTTP, Webhook, etc.
5. Explore advanced expressions using JavaScript functions

Congratulations! You now have the foundation to use Set nodes effectively in any n8n workflow. The Set node is truly the "Swiss Army knife" of n8n automation!
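If it helps to see what the {{ }} syntax does, here is a plain JavaScript sketch of how the two expressions from the "Modify Expressions" experiment evaluate; the sample field values are illustrative:

```javascript
// Hedged sketch: an n8n expression like ={{ $json.score * 2 }} evaluates
// the JavaScript inside the braces against the incoming item's JSON.
const $json = { user_name: "Ada", score: 21 }; // example incoming item

const doubledScore = $json.score * 2;          // ={{ $json.score * 2 }}
const fullName = `${$json.user_name} Smith`;   // ={{ $json.user_name }} Smith

console.log(doubledScore, fullName); // 42 "Ada Smith"
```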
by Oneclick AI Squad
This automated n8n workflow continuously monitors airline schedule changes by fetching real-time flight data, comparing it with stored schedules, and instantly notifying both internal teams and affected passengers through multiple communication channels. The system ensures stakeholders are immediately informed of any flight delays, cancellations, gate changes, or other critical updates.

Good to Know

- Flight data accuracy depends on the aviation API provider's update frequency and reliability
- Critical notifications (cancellations, major delays) trigger immediate passenger alerts via SMS and email
- Internal Slack notifications keep operations teams informed in real time
- Database logging maintains a complete audit trail of all schedule changes
- The system processes only confirmed schedule changes to avoid false notifications
- Passenger notifications are sent only to those with confirmed tickets for affected flights

How It Works

1. Schedule Trigger - Automatically runs every 30 minutes to check for flight schedule updates
2. Fetch Airline Data - Retrieves current flight information from aviation APIs
3. Get Current Schedules - Pulls existing schedule data from the internal database
4. Process Changes - Compares the API data with database records to identify schedule changes
5. Check for Changes - Determines if any updates require processing and notifications
6. Update Database - Saves schedule changes to the internal flight database
7. Notify Slack Channel - Sends operational updates to the flight operations team
8. Check Urgent Notifications - Identifies critical changes requiring immediate passenger alerts
9. Get Affected Passengers - Retrieves contact information for passengers on changed flights
10. Send Email Notifications - Dispatches detailed schedule change emails via SendGrid
11. Send SMS (Critical Only) - Sends urgent text alerts for cancellations and major delays
12. Update Internal Systems - Syncs changes with other airline systems via webhooks
13. Log Sync Activity - Records all synchronization activities for audit and monitoring

Data Sources

The workflow integrates with multiple data sources and systems:

Aviation API (Primary Data Source)

- Real-time flight status and schedule data
- Departure/arrival times, gates, terminals
- Flight status (on-time, delayed, cancelled, diverted)
- Aircraft and route information

Internal Flight Database

flight_schedules table - Current schedule data with columns:

- flight_number (text) - Flight identifier (e.g., "AA123")
- departure_time (timestamp) - Scheduled departure time
- arrival_time (timestamp) - Scheduled arrival time
- status (text) - Flight status (active, delayed, cancelled, diverted)
- gate (text) - Departure gate number
- terminal (text) - Terminal identifier
- airline_code (text) - Airline IATA code
- origin_airport (text) - Departure airport code
- destination_airport (text) - Arrival airport code
- aircraft_type (text) - Aircraft model
- updated_at (timestamp) - Last update timestamp
- created_at (timestamp) - Record creation timestamp

passengers table - Passenger contact information with columns:

- passenger_id (integer) - Unique passenger identifier
- name (text) - Full passenger name
- email (text) - Email address for notifications
- phone (text) - Mobile phone number for SMS alerts
- notification_preferences (json) - Communication preferences
- created_at (timestamp) - Registration timestamp
- updated_at (timestamp) - Last profile update

tickets table - Booking and ticket status with columns:

- ticket_id (integer) - Unique ticket identifier
- passenger_id (integer) - Foreign key to passengers table
- flight_number (text) - Flight identifier
- flight_date (date) - Travel date
- seat_number (text) - Assigned seat
- ticket_status (text) - Status (confirmed, cancelled, checked-in)
- booking_reference (text) - Booking confirmation code
- fare_class (text) - Ticket class (economy, business, first)
- created_at (timestamp) - Booking timestamp
- updated_at (timestamp) - Last modification timestamp

sync_logs table - Audit trail and system logs with columns:

- log_id (integer) - Unique log identifier
- workflow_name (text) - Name of the workflow that created the log
- total_changes (integer) - Number of schedule changes processed
- sync_status (text) - Status (completed, failed, partial)
- sync_timestamp (timestamp) - When the sync occurred
- details (json) - Detailed log information and changes
- error_message (text) - Error details if sync failed
- execution_time_ms (integer) - Processing time in milliseconds

Communication Channels

- Slack - Internal team notifications
- SendGrid - Passenger email notifications
- Twilio - Critical SMS alerts
- Internal webhooks - System integrations

How to Use

1. Import the workflow into your n8n instance
2. Configure aviation API credentials (AviationStack, FlightAware, or airline-specific APIs)
3. Set up the PostgreSQL database connection with the required tables
4. Configure a Slack bot token for operations team notifications
5. Set up a SendGrid API key and email templates for passenger notifications
6. Configure Twilio credentials for SMS alerts (critical notifications only)
7. Test with sample flight data to verify all notification channels
8. Adjust monitoring frequency and severity thresholds based on operational needs
9. Monitor the sync logs to ensure reliable data synchronization

Requirements

API Access

- Aviation data provider (AviationStack, FlightAware, etc.)
- SendGrid account for email delivery
- Twilio account for SMS notifications
- Slack workspace and bot token

Database Setup

- PostgreSQL database with flight schedule tables
- Passenger and ticket management tables
- Audit logging tables for tracking changes

Infrastructure

- n8n instance with the appropriate node modules
- Reliable internet connection for API calls
- Proper credential management and security

Customizing This Workflow

Modify the Process Changes node to adjust change detection sensitivity, add custom business rules, or integrate additional data sources like weather or airport operational data. Customize the notification templates in the email and SMS nodes to match your airline's branding and communication style. Adjust the Schedule Trigger frequency based on your operational requirements and API rate limits.
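As an illustration of the Process Changes comparison, here is a hedged Code-node sketch that diffs the fetched API data against the stored flight_schedules rows. The node names follow the step list above and the field names follow the schema columns, but treat it as a starting point rather than the template's exact logic:

```javascript
// Hedged sketch: emit only flights whose key fields differ between the
// API payload and the stored schedule.
const apiFlights = $('Fetch Airline Data').all().map(i => i.json);
const stored = new Map(
  $('Get Current Schedules').all().map(i => [i.json.flight_number, i.json])
);

// Fields whose changes should trigger notifications.
const watched = ["departure_time", "arrival_time", "status", "gate", "terminal"];

const changes = apiFlights.flatMap(flight => {
  const prev = stored.get(flight.flight_number);
  if (!prev) return []; // unknown flight: skip (or insert, per your rules)
  const changedFields = watched.filter(k => String(flight[k]) !== String(prev[k]));
  return changedFields.length
    ? [{ json: { ...flight, changed_fields: changedFields } }]
    : [];
});

return changes;
```

Downstream nodes can then branch on `changed_fields` (for example, treating a status of "cancelled" as urgent) to decide between the Slack-only path and the passenger email/SMS path.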