by Louis Chan
## How it works

Transform medical documents into structured data using Google Gemini AI with enterprise-grade accuracy.

- Classifies document types (receipts, prescriptions, lab reports, clinical notes)
- Extracts text with 95%+ accuracy using advanced OCR
- Structures data according to medical taxonomy standards
- Supports multiple languages (English, Chinese, auto-detect)
- Tracks processing costs and quality metrics automatically

## Set up steps

**Prerequisites**
- Google Gemini API key (get one from Google AI Studio)

**Quick setup**
1. Import this workflow template
2. Configure Google Gemini API credentials in n8n
3. Test with a sample medical document URL
4. Deploy your webhook endpoint

## Usage

Send a POST request to your webhook:

```json
{
  "image_url": "https://example.com/medical-receipt.jpg",
  "expected_type": "financial",
  "language_hint": "auto"
}
```

Get a structured response (a client-side sketch follows at the end of this section):

```json
{
  "success": true,
  "result": {
    "documentType": "financial",
    "metadata": {
      "providerName": "Dr. Smith Clinic",
      "createdDate": "2025-01-06",
      "currency": "USD"
    },
    "content": {
      "amount": 150.00,
      "services": [...]
    },
    "quality_metrics": {
      "overall_confidence": 0.95
    }
  }
}
```

## Use cases

**Healthcare Organizations**
- Medical billing automation - Process receipts and invoices automatically
- Insurance claim processing - Extract data from claim documents
- Clinical documentation - Digitize patient records and notes
- Data standardization - Consistent structured output format

**System Integrators**
- EMR integration - Connect with existing healthcare systems
- Workflow automation - Reduce manual data entry by 90%
- Multi-language support - Handle international medical documents
- Quality assurance - Built-in confidence scoring and validation

## Supported Document Types

- **Financial**: Medical receipts, bills, insurance claims, invoices
- **Clinical**: Medical charts, progress notes, consultation reports
- **Prescription**: Prescriptions, medication lists, pharmacy records
- **Administrative**: Referrals, authorizations, patient registration
- **Diagnostic**: Lab reports, test results, screening reports
- **Legal**: Medical certificates, documentation forms
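For reference, here is a minimal client-side sketch of the POST call described above. The webhook path (`/webhook/medical-extract`) is a placeholder for whatever endpoint your n8n deployment exposes; the field names follow the example payload.

```javascript
// Minimal sketch: send a medical document to the deployed webhook.
// The URL is a placeholder - substitute your own n8n webhook endpoint.
const response = await fetch("https://your-n8n-host/webhook/medical-extract", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    image_url: "https://example.com/medical-receipt.jpg",
    expected_type: "financial", // or "clinical", "prescription", ...
    language_hint: "auto",
  }),
});

const data = await response.json();
if (data.success) {
  // overall_confidence comes from the quality_metrics block shown above
  console.log(data.result.documentType, data.result.quality_metrics.overall_confidence);
}
```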
by simonscrapes
## Use Case

Transform and optimize images for web use:

- You need to host local images online
- You want to reduce image file sizes automatically
- You need image URLs for web content
- You want to generate and optimize AI-created images

## What this Workflow Does

The workflow processes images through two services:

- Uploads images to ImgBB for hosting and URL generation (free, but an API key is required)
- Optimizes images using ReSmush.it to reduce file size (free)
- Optional: Creates images using OpenAI's image generation
- Returns optimized image URLs ready for use

## Setup

1. Create an ImgBB account and get your API key
2. Add your ImgBB API key to the HTTP Request node (`key` parameter) - see the sketch below
3. Optional: Configure OpenAI credentials for image generation
4. Connect your image input source

## How to Adjust it to Your Needs

- Skip the OpenAI nodes if using your own image files
- Adjust image optimization parameters
- Customize image hosting settings
- Modify the output format for your needs

More templates and n8n workflows >>> @simonscrapes
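As a reference for step 2, here is a hedged sketch of the upload request the HTTP Request node makes. It follows ImgBB's public v1 upload endpoint; verify the parameters against the current ImgBB documentation.

```javascript
// Sketch of the ImgBB upload performed by the HTTP Request node
// (based on ImgBB's public v1 upload endpoint).
const base64EncodedImage = "<base64-image-data>"; // placeholder image payload

const params = new URLSearchParams({
  key: "YOUR_IMGBB_API_KEY", // the "key" parameter from step 2
  image: base64EncodedImage, // base64 string, or a remote image URL
});

const res = await fetch("https://api.imgbb.com/1/upload", {
  method: "POST",
  body: params,
});

const json = await res.json();
console.log(json.data.url); // hosted image URL returned by ImgBB
```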
by Nick Saraev
This workflow creates an end-to-end Instagram content pipeline that automatically discovers trending content from competitor channels, extracts valuable insights, and generates new high-quality scripts for your own content creation. The system helped scale an Instagram channel from 0 to 10,000 followers in just 15 days through intelligent content repurposing.

## Benefits

- **Complete Content Automation** - Monitors competitor Instagram accounts, downloads new reels, and processes them without manual intervention
- **AI-Powered Script Generation** - Uses ChatGPT and Perplexity to analyze content, identify tools/technologies, and rewrite scripts with fresh angles
- **Smart Duplicate Prevention** - Automatically tracks processed content in a database to avoid redundant work
- **Multi-Platform Intelligence** - Combines Instagram scraping, AI transcription, web research, and content generation in one seamless flow
- **Scalable Content Strategy** - Process content from multiple niches and creators to fuel unlimited content ideas
- **Revenue-Focused Approach** - Specifically designed to identify monetizable tools and technologies for business-focused content

## How It Works

**Instagram Content Discovery:**
- Uses Apify's Instagram scraper to monitor specified creator accounts for new reels
- Automatically downloads video content and metadata from target accounts
- Filters content based on engagement metrics and relevance

**Intelligent Processing Pipeline:**
- Transcribes video content using OpenAI Whisper for accurate text extraction
- Filters content using AI to identify tools, technologies, and automation opportunities
- Cross-references against the existing database to prevent duplicate processing

**Enhanced Research & Analysis:**
- Searches Perplexity AI for additional insights about discovered tools
- Generates step-by-step usage guides and implementation instructions
- Identifies unique angles and opportunities for content improvement

**Script Generation & Optimization:**
- Creates new, original scripts optimized for your specific audience
- Maintains a consistent brand voice while adding fresh perspectives
- Includes strategic call-to-action elements for audience engagement

## Required Google Sheets Database Setup

Before running this workflow, create a Google Sheets database with these exact column headers:

**Essential Columns:**
- `id` - Unique Instagram post identifier (primary key for duplicate detection)
- `timestamp` - When the reel was posted
- `caption` - Original reel caption text
- `hashtags` - Hashtags used in the post
- `videoUrl` - Direct link to download the video file
- `username` - Account that posted the reel
- `scrapedTranscript` - Original transcript from the video (added by the workflow)
- `newTranscript` - AI-generated script for your content (added by the workflow)

**Additional Tracking Columns:**
- `shortCode` - Instagram's internal post code
- `url` - Public Instagram post URL
- `commentsCount` - Number of comments
- `firstComment` - Top comment on the post
- `likesCount` - Number of likes
- `videoViewCount` - View count metrics
- `videoDuration` - Length of video in seconds

**Setup Instructions:**
1. Create a new Google Sheet with these column headers in the first row
2. Name the sheet "Reels"
3. Connect your Google Sheets OAuth credentials in n8n
4. Update the document ID in the workflow nodes

The merge logic relies on the `id` column to prevent duplicate processing, so this structure is essential for the workflow to function correctly (see the duplicate-check sketch below).
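A minimal sketch of that duplicate check, written as an n8n Code node, might look like this. The node name "Get Sheet Rows" is hypothetical; use whatever your Google Sheets read node is actually called.

```javascript
// Minimal sketch of the duplicate check, as it might run in an n8n Code node.
// Assumes two upstream inputs: freshly scraped reels and existing sheet rows,
// each carrying the Instagram post "id" column described above.
const existingIds = new Set(
  $("Get Sheet Rows").all().map((row) => row.json.id) // hypothetical node name
);

// Keep only reels whose id is not already in the "Reels" sheet.
return $input.all().filter((item) => !existingIds.has(item.json.id));
```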
## Business Use Cases

- **Content Creators** - Scale content production by 10x while maintaining quality and originality
- **Marketing Agencies** - Offer content research and ideation as a premium service
- **Course Creators** - Identify trending tools and technologies for educational content

**Revenue Potential:** This exact system can be sold as a service for $3,000-$5,000 to growing channels or agencies. The automation saves 10+ hours weekly of manual research and content planning.

- **Difficulty Level:** Intermediate
- **Estimated Build Time:** 1-2 hours
- **Monthly Operating Cost:** ~$30 (API usage)

## Watch the Complete Build Process

Want to see exactly how this system was built from scratch? Nick Saraev walks through the entire development process in this comprehensive tutorial, including all the debugging, dead ends, and problem-solving that goes into building real automation systems.

Watch: "The N8N Instagram Parasite System (10K Followers In 15 Days)"

This 1.5-hour deep-dive shows the actual build process - not a polished demo, but real system development with all the thinking and iteration included.

## Set Up Steps

1. **Configure Apify Integration:**
   - Sign up for an Apify account and obtain an API key
   - Replace the bearer token in the "Run Actor Synchronously" node
   - Customize the username array with your target Instagram accounts
2. **Set Up AI Services:**
   - Add OpenAI API credentials for transcription and content generation
   - Configure the Perplexity API for enhanced research capabilities
   - Set up appropriate rate limiting for cost control
3. **Database Configuration:**
   - Create the Google Sheets database with the provided column structure
   - Connect Google Sheets OAuth credentials
   - Configure the merge logic for duplicate detection
4. **Content Filtering Setup:**
   - Customize the AI prompts for your specific niche and requirements
   - Adjust the filtering criteria for tool/technology detection
   - Set up the script generation template to match your brand voice
5. **Automation Schedule:**
   - Configure the schedule trigger for daily content monitoring
   - Set optimal timing based on your content creation workflow
   - Test the complete flow with a small number of accounts first
6. **Advanced Customization:**
   - Add additional content sources beyond Instagram
   - Integrate with your existing content management systems
   - Scale up monitoring to dozens of competitor accounts

**More AI Automation Systems:** For more advanced automation tutorials and business systems, check out my YouTube channel, where I share proven automation strategies that generate real revenue.
by Airtop
## Automating Person Data Enrichment and CRM Update

## Use Case

This automation enriches a person's professional profile using their name and work email, scores them against an ICP (Ideal Customer Profile), and updates their record in HubSpot. It's ideal for sales, marketing, and recruitment teams needing reliable contact insights.

## What This Automation Does

This automation performs the following using the input parameters:

- **Person name**: The full name of the individual.
- **Work email**: The professional email address of the contact.
- **Airtop Profile (connected to LinkedIn)**: An authenticated Airtop Profile used for LinkedIn-based enrichment.
- **HubSpot object id**: The internal HubSpot ID for the contact to be updated.

## How It Works

1. Initiates the workflow using a form or external trigger.
2. Uses the name and email to extract and enrich the person's data, including:
   - LinkedIn profile and company page
   - About section, job title, location
   - ICP score, seniority level, AI interest, technical depth, connection and follower counts
3. Formats and maps the enriched data.
4. Pushes the updated data to HubSpot using the object ID (see the sketch below).

## Setup Requirements

- Airtop API key
- Airtop Profile logged in to LinkedIn
- HubSpot access with an object ID field for each contact to update

## Next Steps

- **Combine with Lead Generation**: Use as part of an end-to-end workflow that sources leads and enriches them in real time.
- **Trigger from CRM**: Initiate this workflow when a new contact is added in HubSpot or another CRM.
- **Customize Scoring Logic**: Tailor the ICP calculation to your team's specific criteria.

Read more about person data enrichment
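For orientation, the final update step corresponds to a call like the following against HubSpot's CRM v3 contacts endpoint. The enrichment property names (`icp_score`, `linkedin_url`) are illustrative assumptions; map them to the properties your portal actually defines.

```javascript
// Sketch of the final HubSpot update step, using HubSpot's CRM v3 contacts
// endpoint. Property names below are illustrative, not the template's exact mapping.
const objectId = "12345"; // the "HubSpot object id" input parameter
const res = await fetch(
  `https://api.hubapi.com/crm/v3/objects/contacts/${objectId}`,
  {
    method: "PATCH",
    headers: {
      Authorization: `Bearer ${process.env.HUBSPOT_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      properties: {
        jobtitle: "Head of Data", // enriched job title (default HubSpot property)
        icp_score: "82",          // hypothetical custom property
        linkedin_url: "https://www.linkedin.com/in/example", // hypothetical custom property
      },
    }),
  }
);
console.log(res.status); // 200 on success
```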
by Ron
## Objective

In industry and production environments, machine data is sometimes available in databases. That might be sensor data like temperature or pressure, or just binary information. This sample flow reads machine data and sends an alert to your SIGNL4 team when the machine is down. When the machine is up again, the alert in SIGNL4 is closed automatically.

## Setup

We simulate the machine data using a Notion table. When we un-check the Up box, we simulate a machine-down event. At set intervals, n8n checks the database for down items. If such an item is found, an alert is sent using SIGNL4 and the item in Notion is updated (so it is not read again). Status updates from SIGNL4 (acknowledgement, close, annotation, escalation, etc.) are received via webhook, and we update the Notion item accordingly (a webhook sketch follows below).

This is how the alert looks in the SIGNL4 app. The flow can easily be adapted to other database monitoring scenarios.
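For reference, a sketch of the two SIGNL4 webhook calls (raise and resolve) is shown below. SIGNL4 correlates the pair through the X-S4-ExternalID field; the team secret and payload fields are placeholders to check against your own SIGNL4 setup.

```javascript
// Sketch of the two SIGNL4 webhook calls: one to raise the alert, one to
// resolve it. SIGNL4 correlates the pair via the X-S4-ExternalID field.
const SIGNL4_URL = "https://connect.signl4.com/webhook/YOUR_TEAM_SECRET";

// Machine went down: open a new alert.
await fetch(SIGNL4_URL, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    Title: "Machine down: Press 01",
    Message: "Pressure sensor reports failure state.",
    "X-S4-ExternalID": "press-01", // stable ID for this machine
    "X-S4-Status": "new",
  }),
});

// Machine recovered: resolve the alert with the same external ID.
await fetch(SIGNL4_URL, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    "X-S4-ExternalID": "press-01",
    "X-S4-Status": "resolved",
  }),
});
```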
by Gain FLow AI
## Overview

This workflow automates the process of sending personalized cold email sequences to your prospects. It fetches un-emailed leads from your Google Sheet, validates their email addresses, and then dispatches tailored emails according to a predefined schedule. It updates your CRM (Google Sheet) with the status of each sent email, ensuring your outreach efforts are tracked and efficient.

## Use Case

This workflow is perfect for:

- **Sales Teams**: Automate the delivery of multi-stage cold email campaigns to a large volume of leads.
- **Business Development**: Nurture prospects over time with a structured email sequence.
- **Recruiters**: Send out introductory emails to potential candidates for open positions.
- **Marketers**: Distribute personalized outreach for events, content, or product launches.
- **Anyone doing cold outreach**: Ensure consistent follow-up and track email performance without manual effort.

## How It Works

1. **Scheduled Trigger**: The workflow runs automatically at a defined interval (e.g., every 6 hours, as currently configured by the "Set Timer" node). This ensures regular outreach without manual intervention.
2. **Fetch Unsent Emails**: The "Get Emails" node queries your Google Sheet to identify prospects who haven't yet received the current email in the sequence (i.e., "Email Sent " is "No").
3. **Control Volume**: A "Limit" node can be used to control the number of emails sent in each batch, preventing you from sending too many emails at once and hitting sending limits.
4. **Loop Through Prospects**: The "Loop Over Items" node processes each selected prospect individually.
5. **Email Validation (Conditional Send)**: An "If" node checks that the prospect's "Email Address " exists and is valid. This prevents sending emails to invalid addresses, improving deliverability (see the validation sketch below).
6. **Send Email**:
   - **"Send Email" node**: For valid email addresses, this node dispatches the personalized email to the prospect. It retrieves the recipient's email, subject, and body from your Google Sheet.
   - **"connect" node**: (Note: the provided JSON uses a generic emailSend node named "connect" that links to an SMTP credential. This represents the actual email sending mechanism, whether Gmail or a custom SMTP server.)
7. **Update CRM**: After successfully sending an email, the "Update Records" node updates your Google Sheet. It marks the "Email Sent " column as "Yes" and records the "Sent on" timestamp and a "Message Id" for tracking.
8. **Delay Between Sends**: A "Wait" node introduces a delay between sending emails to individual prospects. This helps mimic human sending behavior and can improve deliverability.

## How to Set It Up

To set up your Automated Cold Email Sender, follow these steps:

1. **Google Sheet Setup**:
   - **Duplicate the provided template**: Make a copy of the Google Sheet template (1TjXelyGPg5G8lbPDI9_XOReTzmU1o52z2R3v8dYaoQM) into your own Google Drive. This sheet should contain columns for "Name", "Email Address ", "Sender Email", "Email Subject", "Email Body", "Email Sent ", "Sent on", and "Message Id".
   - **Connect Google Sheets**: Ensure your Google Sheets OAuth2 API credentials are set up in n8n and linked to the "Get Emails" and "Update Records" nodes.
   - **Update Sheet IDs**: In both the "Get Emails" and "Update Records" nodes, update the documentId with the ID of your copied template.
2. **Email Sending Service Credentials**:
   - **Gmail**: If using Gmail, ensure your Gmail OAuth2 credentials are configured and connected to the "Send Email" node (or the "connect" node, if that's your chosen sender).
   - **Other Email Services (SMTP)**: If you use a different email service, set up an SMTP credential in n8n and connect it to the "connect" node. Refer to "Sticky Note4" for guidance on non-Google email services.
3. **Configure Timer**: In the "Set Timer" node, adjust the hoursInterval or other time settings to define how frequently the email sending process runs (e.g., every 6 hours, once a day, etc.).
4. **Control Volume (Optional)**: In the "Limit" node, set maxItems to control how many emails are processed and sent in each batch. This is useful for managing email sending limits or gradual outreach.
5. **Import the Workflow**: Import the provided workflow JSON into your n8n instance.
6. **Populate Your Sheet**: Fill your copied Google Sheet with prospect data, including the email subject and body for each email you wish to send. Ensure the "Email Sent " column is initially "No".
7. **Activate and Monitor**: Activate the workflow. It will begin fetching and sending emails on your configured schedule. Monitor your Google Sheet to track the "Email Sent " status.

This workflow provides a robust and automated solution for managing your cold email campaigns, saving you time and increasing your outreach efficiency.
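As a reference for step 5 of "How It Works", a Code-node equivalent of the email check might look like this minimal sketch. The regex is a pragmatic format check, not a full RFC 5322 validator.

```javascript
// Minimal sketch of the email check the "If" node performs, written as an
// equivalent n8n Code node filter.
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

return $input.all().filter((item) => {
  // note: the sheet's column header has a trailing space
  const email = (item.json["Email Address "] || "").trim();
  return EMAIL_RE.test(email);
});
```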
by Rodrigue Gbadou
## How it works

1. **Automatic Detection**: Instantly identifies abandoned carts via webhook from your e-commerce store.
2. **Progressive Sequence**: Automatically sends 3 recovery emails over 7 days with increasing incentives.
3. **Dynamic Personalization**: Inserts abandoned products, customer name, and unique promo codes.
4. **Performance Tracking**: Analyzes conversion rates and recovered revenue.

## Set up steps

1. **Configure the webhook**: Connect your e-commerce platform (Shopify, WooCommerce, Magento) to trigger the workflow when a cart is abandoned.
2. **Email service**: Set up your email sending service (Gmail, SendGrid, Mailgun) with proper credentials.
3. **Customization**: Adapt email templates with your brand guidelines, logo, and tone of voice.
4. **Promo codes**: Integrate your discount code system (10%, 15%, 20%).
5. **Analytics tracking**: Connect a Google Sheet to track recovery performance.
6. **Testing**: Validate the workflow with test data before activation.

## Key Features

- **Smart targeting**: Automatically filters qualified carts (minimum value, valid email)
- **Optimized timing**: Scientifically timed sequence (1h, 24h, 72h) to maximize conversions
- **Progressive incentives**: Increasing discounts (10% → 15% → 20%) to create urgency
- **Responsive design**: Email templates optimized for all devices
- **Unique codes**: Automatically generates personalized promo codes for each customer (see the sketch below)
- **Built-in analytics**: Real-time tracking of open rates, clicks, and conversions
- **Error handling**: Robust system with notifications in case of technical issues
- **Professional templates**: Modern email designs with optimized call-to-actions

## Advanced Features

- **Customer segmentation**: Differentiates between new and returning customers
- **Automatic exclusions**: Avoids sending to customers who already purchased
- **Multi-language**: Supports different languages based on location
- **A/B Testing**: Tests different email versions to optimize performance
- **CRM integration**: Syncs data with your customer management system

## Metrics Tracked

- Recovery rate per email in the sequence
- Real-time recovered revenue
- Open and click-through rates for each email
- Promo codes used and their effectiveness
- Average delay between abandonment and conversion

## Customization Options

- **Flexible timing**: Adjust sending delays to fit your industry
- **Variable incentives**: Change discount percentages as needed
- **Dynamic content**: Adjust messages based on product types
- **Configurable thresholds**: Set your own qualification criteria
- **Full branding**: Integrate your complete visual identity

> This workflow automatically turns abandoned carts into sales opportunities with a scientific and personalized approach, generating measurable ROI for your e-commerce store.
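As an illustration of the unique-code feature, a promo code could be generated in a Code node along these lines. The code format (prefix, discount, random suffix) is an assumption for illustration, not the template's exact scheme.

```javascript
// Sketch: generate a unique promo code in an n8n Code node, with the
// discount escalating across the three emails in the sequence.
const step = 1; // 1st, 2nd, or 3rd email in the sequence
const discounts = { 1: 10, 2: 15, 3: 20 }; // escalating incentives

// Random 6-character suffix, e.g. "K3F9QZ" (illustrative format)
const suffix = Math.random().toString(36).slice(2, 8).toUpperCase();
const promoCode = `CART${discounts[step]}-${suffix}`;

return [{ json: { promoCode, discountPercent: discounts[step] } }];
```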
by Benjamin Jones (SaaS Alerts)
## Collect and Email Authentication IP Addresses from SaaS Alerts (Last 24 Hours)

## Description

This n8n workflow automates the process of collecting sign-in IP addresses from SaaS Alerts over the past 24 hours and emailing the results using SMTP2Go. Designed for security teams, IT administrators, and compliance officers, this workflow helps monitor user authentication activity, detect unusual sign-ins, and respond to potential security threats in real time.

By automating data collection and email alerts, organizations can proactively track login patterns, ensure compliance with security policies, and mitigate risks associated with unauthorized access.

## Use Case

This workflow is ideal for businesses and IT teams that need to:

- Monitor user authentication activity across SaaS applications.
- Identify login attempts from suspicious IPs.
- Automate security reporting and compliance tracking.
- Receive real-time alerts for unusual sign-in behaviors.

## Pre-Conditions & Requirements

Before using this workflow, ensure you have:

- A SaaS Alerts account or another system that logs authentication IPs.
- An SMTP2Go account for sending email notifications.
- n8n set up with proper API credentials and database access (if applicable).

## Setup Instructions

1. **Configure the SaaS Alerts API**
   - Obtain an API key from the SaaS Alerts platform under the Settings menu.
2. **Set Up SMTP2Go for Email Alerts**
   - Create an SMTP2Go account if you don't have one.
   - Generate an SMTP2Go API key.
   - Verify that your sending email address has been configured and verified.
   - Define recipient email addresses for security alerts.
3. **Customize the Workflow**
   - Modify filtering rules to track specific users, IP ranges, or flagged login attempts.
   - Adjust email content to include relevant details for your team.
4. **Test & Deploy**
   - Run the workflow manually to verify data retrieval and email notifications.
   - Schedule the workflow to run daily for automated monitoring.

## Workflow Steps

1. **Trigger** - Starts manually or on a scheduled interval (e.g., every 24 hours).
2. **Fetch Authentication Logs** - Retrieves sign-in IPs from SaaS Alerts or a custom API.
3. **Filter & Process Data** - Extracts relevant login attempts based on defined criteria.
4. **Format Data for Reporting** - Structures the data for readability in an email alert.
5. **Send Email Notification via SMTP2Go** - Delivers the security report to designated recipients (see the sketch below).

## Customization Options

- **Modify Filtering Rules** - Track specific login behaviors, flagged IPs, or unusual patterns.
- **Change Email Recipients** - Update the recipient list based on security team needs.
- **Integrate with Security Dashboards** - Expand the workflow to log data into a SIEM system or incident response platform.
- **Add Additional Triggers** - Configure alerts for specific login anomalies, such as failed login attempts.

## Keywords

n8n security automation, authentication monitoring, login IP tracking, SMTP2Go email alerts, SaaS Alerts workflow, IT security automation, login anomaly detection
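For reference, the SMTP2Go send (workflow step 5) corresponds to a request like the sketch below against SMTP2Go's v3 `/email/send` endpoint; the report rows and addresses are placeholders.

```javascript
// Sketch of the SMTP2Go send step. Field names follow SMTP2Go's documented
// JSON API; the report body is a placeholder built from the filtered IP list.
const ipRecords = [
  { username: "jdoe", ip: "203.0.113.7", timestamp: "2025-01-06T08:12:00Z" },
]; // illustrative rows from the "Filter & Process Data" step

const report = ipRecords
  .map((r) => `${r.username}\t${r.ip}\t${r.timestamp}`)
  .join("\n");

await fetch("https://api.smtp2go.com/v3/email/send", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    api_key: process.env.SMTP2GO_API_KEY,
    sender: "alerts@example.com", // must be a verified sender
    to: ["security-team@example.com"],
    subject: "Sign-in IPs - last 24 hours",
    text_body: report,
  }),
});
```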
by Akash Kankariya
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## Overview

This n8n workflow template automates the process of monitoring Instagram comments and sending predefined responses based on specific comment keywords. It integrates Instagram's Graph API with Google Sheets to manage comment responses and maintains an interaction log for customer relationship management (CRM) purposes.

## Workflow Components

The workflow consists of 9 main nodes organized into two primary sections:

**Section 1: Webhook Verification**
- Get Verification (Webhook node)
- Respond to Verification Message (Respond to Webhook node)

**Section 2: Auto Comment Response**
- Insta Update (Webhook node)
- Check if update is of comment? (Switch node)
- Comment if of other user (If node)
- Comment List (Google Sheets node)
- Send Message for Comment (HTTP Request node)
- Add Interaction in Sheet (CRM) (Google Sheets node)

## Prerequisites and Setup Requirements

### 1. Meta/Facebook Developer Setup

**Create Facebook App**

Action items:
- [ ] Navigate to Facebook Developers
- [ ] Click "Create App" and select "Business" type
- [ ] Configure the following products: Instagram Graph API, Facebook Login for Business, Webhooks

**Required Permissions**

Configure the following permissions in your Meta app:

| Permission | Purpose |
|---|---|
| instagram_basic | Read Instagram account profile info and media |
| instagram_manage_comments | Create, delete, and manage comments |
| instagram_manage_messages | Send and receive Instagram messages |
| pages_show_list | Access connected Facebook pages |

**Access Token Generation**

> Important setup:
> - [ ] Use Facebook's Graph API Explorer
> - [ ] Generate a User Access Token with the required permissions
> - [ ] Important: tokens expire periodically and need refreshing

### 2. Webhook Configuration

**Setup Webhook URL**

Configuration checklist:
- [ ] In the Meta App Dashboard, navigate to Products → Webhooks
- [ ] Subscribe to the Instagram object
- [ ] Configure the webhook URL: your-n8n-domain/webhook/instagram
- [ ] Set a verification token (use "test" or create a secure token)
- [ ] Select webhook fields: comments (for comment notifications) and messages (for DM notifications, if needed)

**Webhook Verification Process**

The workflow handles Meta's webhook verification automatically (see the sketch below):

1. Meta sends a GET request with a hub.challenge parameter.
2. The workflow responds with the challenge value to confirm the subscription.

### 3. Google Sheets Setup

Example - https://docs.google.com/spreadsheets/d/1ONPKJZOpQTSxbasVcCB7oBjbZcCyAm9gZ-UNPoXM21A/edit?usp=sharing

**Create Response Management Sheet**

Set up a Google Sheets document with the following structure:

Sheet 1 - Comment Responses:

| Column | Description | Example |
|--------|-------------|---------|
| Comment | Trigger keywords | "auto", "info", "help" |
| Message | Corresponding response message | "Thanks for your comment! We'll get back to you soon." |

Sheet 2 - Interaction Log:

| Column | Description | Purpose |
|--------|-------------|---------|
| Time | Timestamp of interaction | Track when interactions occur |
| User Id | Instagram user ID | Identify unique users |
| Username | Instagram username | Human-readable identification |
| Note | Additional notes or error messages | Debugging and analytics |

Built by akash@codescale.tech
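The verification handshake in Section 1 boils down to the logic below, shown as a plain Node.js handler for clarity (in the workflow itself, the Webhook and Respond to Webhook nodes play this role).

```javascript
// Sketch of Meta's webhook verification handshake.
const http = require("http");

const VERIFY_TOKEN = "test"; // must match the token set in the Meta dashboard

http.createServer((req, res) => {
  const url = new URL(req.url, "http://localhost");
  // Meta sends: GET ...?hub.mode=subscribe&hub.verify_token=...&hub.challenge=...
  if (
    url.searchParams.get("hub.mode") === "subscribe" &&
    url.searchParams.get("hub.verify_token") === VERIFY_TOKEN
  ) {
    res.end(url.searchParams.get("hub.challenge")); // echo the challenge back
  } else {
    res.statusCode = 403;
    res.end();
  }
}).listen(3000);
```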
by David Olusola
## n8n Set Node Tutorial - Complete Guide

## How It Works

This tutorial workflow teaches you everything about n8n's Set node through hands-on examples. The Set node is one of the most powerful tools in n8n - it allows you to create, modify, and transform data as it flows through your workflow.

What makes this tutorial special:

- **Progressive Learning**: Starts simple, builds to complex concepts
- **Interactive Examples**: Real working nodes you can modify and test
- **Visual Guidance**: Sticky notes explain every concept
- **Branching Logic**: Shows how Set nodes work in different workflow paths
- **Real Data**: Uses practical examples you'll encounter in automation

The workflow demonstrates 6 core concepts:

1. Basic data types (strings, numbers, booleans)
2. Expression syntax with {{ }} and $json references
3. Complex data structures (objects and arrays)
4. "Keep Only Set" option for clean outputs
5. Conditional data setting with branching logic
6. Data transformation and aggregation techniques

## Setup Steps

### Step 1: Import the Workflow

1. Copy the JSON from the code artifact above
2. Open your n8n instance in your browser
3. Navigate to the Workflows section
4. Click "Import from JSON" or the import button (usually a "+" or import icon)
5. Paste the JSON into the import dialog
6. Click "Import" to load the workflow
7. Save the workflow (Ctrl+S or click the Save button)

### Step 2: Choose Your Starting Point

**Option A: Default Tutorial Mode** (recommended for beginners)
- The workflow is ready to run as-is
- Uses a simple "Welcome" message as starting data
- Click "Execute Workflow" to begin

**Option B: Rich Test Data Mode** (recommended for experimentation)
1. Locate the nodes: find "Start (Manual Trigger)" and "0. Test Data Input"
2. Disconnect the default: click the connection line between "Start (Manual Trigger)" → "1. Set Basic Values" and delete it
3. Connect the test data: drag from the "0. Test Data Input" output to the "1. Set Basic Values" input
4. Execute: click "Execute Workflow" to run with rich test data

### Step 3: Execute and Learn

1. Run the workflow: click the "Execute Workflow" button
2. Check outputs: click on each node to see its output data
3. Read the notes: each sticky note explains what's happening
4. Follow the flow: data flows from left to right, top to bottom

### Step 4: Experiment and Modify

Try these experiments:

**Change Basic Values:**
1. Click on "1. Set Basic Values"
2. Modify user_age (try 20 vs 35)
3. Change user_name to see how it propagates
4. Execute and watch the changes flow through

**Test Conditional Logic:**
- Set user_age to 20 → triggers the "Student Discount" path
- Set user_age to 30 → triggers the "Premium Access" path
- Watch how the workflow branches differently

**Modify Expressions** (evaluated in the sketch after this guide):
In "2. Set with Expressions", try changing:
- `={{ $json.score * 2 }}` to `={{ $json.score * 3 }}`
- `={{ $json.user_name }} Smith` to `={{ $json.user_name }} Johnson`

**Complex Data Structures:**
- In "3. Set Complex Data", modify the JSON structure
- Add new properties to the user_profile object
- Try nested expressions

## Learning Path

**Beginner Level (Nodes 1-2)**
- Focus: Understanding basic Set operations
- Learn: Data types, static values, simple expressions
- Time: 10-15 minutes

**Intermediate Level (Nodes 3-4)**
- Focus: Complex data and output control
- Learn: Objects, arrays, the "Keep Only Set" option
- Time: 15-20 minutes

**Advanced Level (Nodes 5-6)**
- Focus: Conditional logic and data aggregation
- Learn: Branching workflows, merging data, complex expressions
- Time: 20-25 minutes

## What Each Node Teaches

| Node | Concept | Key Learning |
|------|---------|-------------|
| 1. Set Basic Values | Data Types | String, number, boolean basics |
| 2. Set with Expressions | Dynamic Data | {{ }} syntax, $json references, $now functions |
| 3. Set Complex Data | Advanced Structures | Objects, arrays, nested properties |
| 4. Set Clean Output | Data Management | "Keep Only Set" for clean final outputs |
| 5a/5b. Conditional Sets | Branching Logic | Different data based on conditions |
| 6. Tutorial Summary | Data Aggregation | Combining and summarizing workflow data |

## Pro Tips

**Quick Wins:**
- Always check node outputs after execution
- Use sticky notes as your learning guide
- Experiment with small changes first
- Copy nodes to try variations

**Advanced Techniques:**
- Use Keep Only Set for API responses
- Combine static and dynamic data in complex objects
- Leverage conditional paths for different user types
- Reference nested object properties with dot notation

**Troubleshooting:**
- If expressions don't work, check the {{ }} syntax
- Ensure field names match exactly (case-sensitive)
- Use the expression editor for complex logic
- Check that data types match your expectations

## Next Steps After the Tutorial

1. Create your own Set nodes in a new workflow
2. Practice with real data from APIs or databases
3. Build data transformation workflows for your specific use cases
4. Combine Set nodes with other n8n nodes like HTTP, Webhook, etc.
5. Explore advanced expressions using JavaScript functions

Congratulations! You now have the foundation to use Set nodes effectively in any n8n workflow. The Set node is truly the "Swiss Army knife" of n8n automation!
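To make the expression syntax concrete, the sketch below evaluates the same logic in plain JavaScript, using illustrative input values for the fields set in "1. Set Basic Values".

```javascript
// Sketch of what the expressions in "2. Set with Expressions" evaluate to,
// given sample input. Field names follow the tutorial; values are illustrative.
const $json = { user_name: "Jane", score: 21, user_age: 30 };

const doubled = $json.score * 2;             // ={{ $json.score * 2 }}        -> 42
const fullName = `${$json.user_name} Smith`; // ={{ $json.user_name }} Smith  -> "Jane Smith"
const premium = $json.user_age >= 25;        // condition behind the branch   -> true

console.log({ doubled, fullName, premium });
```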
by Oneclick AI Squad
This automated n8n workflow continuously monitors airline schedule changes by fetching real-time flight data, comparing it with stored schedules, and instantly notifying both internal teams and affected passengers through multiple communication channels. The system ensures stakeholders are immediately informed of any flight delays, cancellations, gate changes, or other critical updates.

## Good to Know

- Flight data accuracy depends on the aviation API provider's update frequency and reliability
- Critical notifications (cancellations, major delays) trigger immediate passenger alerts via SMS and email
- Internal Slack notifications keep operations teams informed in real time
- Database logging maintains a complete audit trail of all schedule changes
- The system processes only confirmed schedule changes to avoid false notifications
- Passenger notifications are sent only to those with confirmed tickets for affected flights

## How It Works

1. **Schedule Trigger** - Automatically runs every 30 minutes to check for flight schedule updates
2. **Fetch Airline Data** - Retrieves current flight information from aviation APIs
3. **Get Current Schedules** - Pulls existing schedule data from the internal database
4. **Process Changes** - Compares API data with database records to identify schedule changes (see the sketch below)
5. **Check for Changes** - Determines if any updates require processing and notifications
6. **Update Database** - Saves schedule changes to the internal flight database
7. **Notify Slack Channel** - Sends operational updates to the flight operations team
8. **Check Urgent Notifications** - Identifies critical changes requiring immediate passenger alerts
9. **Get Affected Passengers** - Retrieves contact information for passengers on changed flights
10. **Send Email Notifications** - Dispatches detailed schedule change emails via SendGrid
11. **Send SMS (Critical Only)** - Sends urgent text alerts for cancellations and major delays
12. **Update Internal Systems** - Syncs changes with other airline systems via webhooks
13. **Log Sync Activity** - Records all synchronization activities for audit and monitoring

## Data Sources

The workflow integrates with multiple data sources and systems:

**Aviation API (Primary Data Source)**
- Real-time flight status and schedule data
- Departure/arrival times, gates, terminals
- Flight status (on-time, delayed, cancelled, diverted)
- Aircraft and route information

**Internal Flight Database**

`flight_schedules` table - current schedule data with columns:
- flight_number (text) - Flight identifier (e.g., "AA123")
- departure_time (timestamp) - Scheduled departure time
- arrival_time (timestamp) - Scheduled arrival time
- status (text) - Flight status (active, delayed, cancelled, diverted)
- gate (text) - Departure gate number
- terminal (text) - Terminal identifier
- airline_code (text) - Airline IATA code
- origin_airport (text) - Departure airport code
- destination_airport (text) - Arrival airport code
- aircraft_type (text) - Aircraft model
- updated_at (timestamp) - Last update timestamp
- created_at (timestamp) - Record creation timestamp

`passengers` table - passenger contact information with columns:
- passenger_id (integer) - Unique passenger identifier
- name (text) - Full passenger name
- email (text) - Email address for notifications
- phone (text) - Mobile phone number for SMS alerts
- notification_preferences (json) - Communication preferences
- created_at (timestamp) - Registration timestamp
- updated_at (timestamp) - Last profile update

`tickets` table - booking and ticket status with columns:
- ticket_id (integer) - Unique ticket identifier
- passenger_id (integer) - Foreign key to passengers table
- flight_number (text) - Flight identifier
- flight_date (date) - Travel date
- seat_number (text) - Assigned seat
- ticket_status (text) - Status (confirmed, cancelled, checked-in)
- booking_reference (text) - Booking confirmation code
- fare_class (text) - Ticket class (economy, business, first)
- created_at (timestamp) - Booking timestamp
- updated_at (timestamp) - Last modification timestamp

`sync_logs` table - audit trail and system logs with columns:
- log_id (integer) - Unique log identifier
- workflow_name (text) - Name of the workflow that created the log
- total_changes (integer) - Number of schedule changes processed
- sync_status (text) - Status (completed, failed, partial)
- sync_timestamp (timestamp) - When the sync occurred
- details (json) - Detailed log information and changes
- error_message (text) - Error details if the sync failed
- execution_time_ms (integer) - Processing time in milliseconds

**Communication Channels**
- Slack - Internal team notifications
- SendGrid - Passenger email notifications
- Twilio - Critical SMS alerts
- Internal webhooks - System integrations

## How to Use

1. Import the workflow into your n8n instance
2. Configure aviation API credentials (AviationStack, FlightAware, or airline-specific APIs)
3. Set up the PostgreSQL database connection with the required tables
4. Configure a Slack bot token for operations team notifications
5. Set up a SendGrid API key and email templates for passenger notifications
6. Configure Twilio credentials for SMS alerts (critical notifications only)
7. Test with sample flight data to verify all notification channels
8. Adjust the monitoring frequency and severity thresholds based on operational needs
9. Monitor the sync logs to ensure reliable data synchronization

## Requirements

**API Access**
- Aviation data provider (AviationStack, FlightAware, etc.)
- SendGrid account for email delivery
- Twilio account for SMS notifications
- Slack workspace and bot token

**Database Setup**
- PostgreSQL database with flight schedule tables
- Passenger and ticket management tables
- Audit logging tables for tracking changes

**Infrastructure**
- n8n instance with appropriate node modules
- Reliable internet connection for API calls
- Proper credential management and security

## Customizing This Workflow

Modify the Process Changes node to adjust change detection sensitivity, add custom business rules, or integrate additional data sources like weather or airport operational data. Customize the notification templates in the email and SMS nodes to match your airline's branding and communication style. Adjust the Schedule Trigger frequency based on your operational requirements and API rate limits.
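For orientation, the Process Changes comparison could look like the sketch below, written as an n8n Code node. Field names mirror the `flight_schedules` schema above; the upstream node names and the severity rule are illustrative assumptions.

```javascript
// Sketch of the "Process Changes" comparison between API flights and the
// stored flight_schedules rows. Node names are hypothetical.
const apiFlights = $("Fetch Airline Data").all().map((i) => i.json);
const stored = new Map(
  $("Get Current Schedules").all().map((i) => [i.json.flight_number, i.json])
);

const changes = [];
for (const flight of apiFlights) {
  const prev = stored.get(flight.flight_number);
  if (!prev) continue; // unknown flight: handled elsewhere
  const fields = ["departure_time", "arrival_time", "status", "gate", "terminal"];
  const changed = fields.filter((f) => flight[f] !== prev[f]);
  if (changed.length > 0) {
    changes.push({
      json: {
        ...flight,
        changed_fields: changed,
        // illustrative rule: cancellations and status changes are critical
        is_critical: changed.includes("status") || flight.status === "cancelled",
      },
    });
  }
}
return changes;
```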
by InfraNodus
This template can be used to generate research questions from PDF documents (e.g. research papers, market reports) based on the content gaps found in the text, using the InfraNodus GraphRAG knowledge graph representation. Simply upload several PDF files (research papers, corporate or market reports, etc.) and generate a research question / AI prompt in seconds.

The template is useful for:

- generating research questions
- generating AI prompts that drive research further
- finding blind spots in any discourse and generating ideas that address them
- avoiding the generic bias of LLM models and focusing on what's important in your particular context

## Using Content Gaps for Generating Research Questions

Knowledge graphs represent any text as a network: the main concepts are the nodes, their co-occurrences are the connections between them. Based on this representation, we build a graph and apply network science metrics to rank the most important nodes (concepts) that serve as the crossroads of meaning, as well as the main topical clusters they connect.

Naturally, some of the clusters will be disconnected and will have gaps between them. These are the topics (groups of concepts) that exist in this context (the documents you uploaded) but that are not very well connected. Addressing those gaps can help you see which groups of concepts you could connect with your own ideas. This is exactly what InfraNodus does: builds the structure, finds the gaps, then uses the built-in AI to generate research questions that bridge those gaps.

## How it works

1. Step 1: First, you upload your PDF files using an online web form, which you can run from n8n or even make publicly available.
2. Steps 2-4: The documents are processed using the Code and PDF to Text nodes to extract plain text from them.
3. Step 5: This text is then sent to the InfraNodus GraphRAG node, which creates a knowledge graph, identifies structural gaps in this graph, and then uses the built-in AI to generate research questions / prompts.
4. Step 6: The generated questions are then shown to the user in the same web form.

Optionally, you can hook this template into your own workflow and send the generated question to an InfraNodus expert or your own AI model / agent for further processing. If you'd like to sync this workflow to PDF files in a Google Drive folder, you can copy our Google Drive PDF processing workflow for n8n.

## How to use

You need an InfraNodus GraphRAG API account and key to use this workflow (a hedged request sketch follows below).

1. Create an InfraNodus account.
2. Get the API key at https://infranodus.com/api-access and create a Bearer authorization key.
3. Add this key into the InfraNodus GraphRAG HTTP node(s) you use in this workflow.

You do not need any OpenAI keys for this to work. Optionally, you can change the settings in Step 4 of this workflow and force it to always use the biggest gap it identifies.

## Requirements

- An InfraNodus account and API key

Note: an OpenAI key is not required. You will have direct access to the InfraNodus AI with the API key.

## Customizing this workflow

You can use this same workflow with a Telegram bot or Slack (to be notified of the summaries and ideas). You can also hook up automated social media content creation workflows at the end of this template, so you can generate posts that are relevant (covering the important topics in your niche) but also novel (because they connect them in a new way).
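The HTTP node call is sketched below for orientation only. The endpoint path, request fields, and response shape are illustrative placeholders, not the documented InfraNodus API; check https://infranodus.com/api-access for the actual contract.

```javascript
// Hedged sketch of the request the InfraNodus GraphRAG HTTP node makes.
// Endpoint and field names are placeholders - consult the official API docs.
const extractedPdfText = "Plain text extracted from the uploaded PDFs..."; // from steps 2-4

const res = await fetch("https://infranodus.com/api/v1/example-endpoint", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.INFRANODUS_API_KEY}`, // key from step 2
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    name: "my-research-context", // graph/context name (assumed field)
    text: extractedPdfText,
  }),
});

const graph = await res.json();
console.log(graph); // expected to include topical clusters, gaps, and AI output
```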
Check out our n8n templates for ideas at https://n8n.io/creators/infranodus/

A full tutorial with a conceptual explanation is available at https://support.noduslabs.com/hc/en-us/articles/20454382597916-Beat-Your-Competition-Target-Their-Content-Gaps-with-this-n8n-Automation-Workflow

Also check out the video introduction to InfraNodus to better understand how knowledge graphs and content gaps work.

For support and help with this workflow, please contact us at https://support.noduslabs.com