by Lucas Walter
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

# AI dental appointment booking with Google Calendar and Sheets

## Who's it for

This workflow is perfect for dental practices, medical offices, and healthcare providers who want to automate their appointment scheduling process. It's ideal for practices that receive high volumes of appointment requests and want to reduce manual booking while maintaining accurate patient records.

## What it does

This AI-powered voice agent handles complete appointment booking workflows for "Pearly Whites Dental." When patients call or submit requests, the system:

- Analyzes the request using Google Gemini AI to understand patient needs
- Checks calendar availability in real-time via Google Calendar integration
- Automatically finds and offers up to 2 available appointment slots when the preferred time isn't available
- Books confirmed appointments directly to the practice calendar
- Logs all patient information (name, insurance, concerns) to Google Sheets for record-keeping
- Maintains conversation context across interactions for natural dialogue flow

The workflow operates in Central Time Zone and assumes standard business hours (8 AM - 5 PM, excluding lunch).

## How it works

The system receives webhook requests containing patient interaction data. The AI agent processes this information and determines which tools to use based on the request type. For availability checks, it intelligently searches multiple time slots in 30-minute increments until finding suitable options (a sketch of this search appears at the end of this section). All appointments are automatically formatted as "Dental Appointment | [Patient Name]" and logged with complete patient details.

## Requirements

- Google Calendar API access with OAuth2 credentials
- Google Sheets API access for patient data logging
- Google Gemini API key for AI processing
- Webhook endpoint for receiving requests
- Pre-configured Google Calendar and Sheets document

## How to set up

1. Configure Google Calendar credentials in the calendar tool nodes
2. Set up Google Sheets integration with your patient tracking spreadsheet
3. Add your Google Gemini API key to the language model node
4. Update the calendar ID in both calendar nodes to match your practice calendar
5. Modify the Google Sheets document ID to point to your patient records sheet
6. Test the webhook endpoint to ensure proper request processing

## How to customize the workflow

- **Adjust business hours** by modifying the availability checking logic in the system prompt
- **Change appointment duration** by updating the end time calculation (currently set to 1 hour)
- **Modify patient data fields** by updating the Google Sheets column mapping
- **Update practice name** by changing "Pearly Whites Dental" references in the system prompt
- **Customize response format** by adjusting the AI agent's instructions for different appointment types
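As referenced in the How it works section, below is a minimal n8n Code node sketch of the 30-minute slot search and event-title formatting. The input field names (`requestedStart`, `busy`, `patientName`), the one-week search cutoff, and the 12 PM lunch hour are assumptions for illustration, not details taken from the template; it also assumes the n8n instance runs in Central time.

```javascript
// Hypothetical sketch: find up to 2 open slots in 30-minute increments within
// business hours (8 AM - 5 PM, skipping an assumed 12 PM lunch hour).
const STEP_MS = 30 * 60 * 1000;       // 30-minute search increments
const DURATION_MS = 60 * 60 * 1000;   // 1-hour appointments (per the template)
const OPEN_HOUR = 8, CLOSE_HOUR = 17, LUNCH_HOUR = 12;

const { requestedStart, busy = [], patientName = 'Patient' } = $input.first().json;

const overlapsBusy = (start, end) =>
  busy.some(b => start < Date.parse(b.end) && end > Date.parse(b.start));

const slots = [];
const searchLimit = Date.parse(requestedStart) + 7 * 24 * 60 * 60 * 1000; // give up after a week
for (let start = Date.parse(requestedStart); slots.length < 2 && start < searchLimit; start += STEP_MS) {
  const begin = new Date(start);
  const end = start + DURATION_MS;
  const endsAtHour = new Date(end).getHours() + new Date(end).getMinutes() / 60;
  const inBusinessHours =
    begin.getHours() >= OPEN_HOUR &&
    begin.getHours() < CLOSE_HOUR &&
    begin.getHours() !== LUNCH_HOUR &&
    endsAtHour <= CLOSE_HOUR;
  if (inBusinessHours && !overlapsBusy(start, end)) {
    slots.push({ start: begin.toISOString(), end: new Date(end).toISOString() });
  }
}

// Event title format used when booking
return [{ json: { title: `Dental Appointment | ${patientName}`, slots } }];
```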
by Oneclick AI Squad
This automated n8n workflow delivers an instant DevOps toolkit by installing Docker, K3s, Jenkins, Grafana, and more on a Linux server within 10 seconds. It optimizes performance, enhances security, and provides ready-to-use templates for DevOps projects.

## Main Components

- **Configure Parameters** - Defines server details, tool versions, and credentials
- **System Preparation** - Updates the system and installs base packages
- **Install Docker** - Deploys Docker Engine and Docker Compose
- **Install Kubernetes** - Sets up a K3s cluster with kubectl, Helm, and k9s
- **Install Jenkins** - Configures the Jenkins CI/CD server with Docker integration
- **Install Monitoring** - Deploys Prometheus and Grafana using Helm charts
- **Create DevOps User** - Establishes a dedicated user with appropriate permissions
- **Security Configuration** - Configures the firewall and installs VS Code and Terraform
- **Final Configuration** - Sets up sample projects and configuration files
- **Setup Complete** - Provides a summary and access details

## Essential Prerequisites

- Linux server with SSH access
- Root-level administrative privileges

## Customization Guide

- Adjust tool versions or credentials in the Configure Parameters node
- Modify the number of nodes or security settings as needed

## Features

**🔧 Core DevOps Tools Installed:**

- Docker - Container platform with Docker Compose
- Kubernetes - K3s (lightweight) with kubectl and Helm
- Jenkins - CI/CD automation server
- Prometheus - Monitoring and alerting
- Grafana - Visualization and dashboards

**⚡ Optimizations Made:**

- Streamlined Commands - Combined multiple operations into single bash scripts
- Reduced Nodes - 10 nodes vs. 12 in the original (more efficient)
- Better Error Handling - Each step includes verification
- Cloud-Ready - Includes AWS CLI, Azure CLI, and Google Cloud SDK
- Security First - Proper firewall configuration and user permissions

## Parameters to Configure

- server_host: Your Linux server IP address
- server_user: SSH username (typically 'root')
- server_password: SSH password
- docker_version: Docker version to install
- k3s_version: K3s version to install
- username: DevOps username
- user_password: Password for the DevOps user

A sketch of these parameters as node output appears at the end of this section.

## How to Use

1. Copy the JSON code from the artifact
2. Open your n8n workspace
3. Select "Import from JSON" or "+" → "From JSON"
4. Paste the JSON code
5. Configure parameters in the "Configure Parameters" node with your server details
6. Run the workflow

## Workflow Actions

- Install: Deploys Docker, K3s, Jenkins, Prometheus, and Grafana with optimizations
- Create User: Sets up a DevOps user with necessary permissions
- Configure: Applies security settings and provides templates
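As a closing illustration for the Parameters to Configure list, here is a hypothetical sketch of what the Configure Parameters node could return if implemented as a Code node. All values are placeholders to replace with your own server details; the actual template may use a Set node instead.

```javascript
// Hypothetical sketch of the "Configure Parameters" output. All values are placeholders.
return [{
  json: {
    server_host: '203.0.113.10',      // your Linux server IP address
    server_user: 'root',              // SSH username
    server_password: 'CHANGE_ME',     // SSH password (consider n8n credentials instead)
    docker_version: '24.0',           // Docker version to install
    k3s_version: 'v1.29.0+k3s1',      // K3s version to install
    username: 'devops',               // dedicated DevOps username
    user_password: 'CHANGE_ME_TOO',   // password for the DevOps user
  },
}];
```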
by Oneclick AI Squad
This automated n8n workflow continuously monitors airline schedule changes by fetching real-time flight data, comparing it with stored schedules, and instantly notifying both internal teams and affected passengers through multiple communication channels. The system ensures stakeholders are immediately informed of any flight delays, cancellations, gate changes, or other critical updates.

## Good to Know

- Flight data accuracy depends on the aviation API provider's update frequency and reliability
- Critical notifications (cancellations, major delays) trigger immediate passenger alerts via SMS and email
- Internal Slack notifications keep operations teams informed in real-time
- Database logging maintains a complete audit trail of all schedule changes
- The system processes only confirmed schedule changes to avoid false notifications
- Passenger notifications are sent only to those with confirmed tickets for affected flights

## How It Works

1. **Schedule Trigger** - Automatically runs every 30 minutes to check for flight schedule updates
2. **Fetch Airline Data** - Retrieves current flight information from aviation APIs
3. **Get Current Schedules** - Pulls existing schedule data from the internal database
4. **Process Changes** - Compares API data with database records to identify schedule changes (a sketch of this comparison appears at the end of this section)
5. **Check for Changes** - Determines if any updates require processing and notifications
6. **Update Database** - Saves schedule changes to the internal flight database
7. **Notify Slack Channel** - Sends operational updates to the flight operations team
8. **Check Urgent Notifications** - Identifies critical changes requiring immediate passenger alerts
9. **Get Affected Passengers** - Retrieves contact information for passengers on changed flights
10. **Send Email Notifications** - Dispatches detailed schedule change emails via SendGrid
11. **Send SMS (Critical Only)** - Sends urgent text alerts for cancellations and major delays
12. **Update Internal Systems** - Syncs changes with other airline systems via webhooks
13. **Log Sync Activity** - Records all synchronization activities for audit and monitoring

## Data Sources

The workflow integrates with multiple data sources and systems:

### Aviation API (Primary Data Source)

- Real-time flight status and schedule data
- Departure/arrival times, gates, terminals
- Flight status (on-time, delayed, cancelled, diverted)
- Aircraft and route information

### Internal Flight Database

**flight_schedules table** - Current schedule data with columns:

- flight_number (text) - Flight identifier (e.g., "AA123")
- departure_time (timestamp) - Scheduled departure time
- arrival_time (timestamp) - Scheduled arrival time
- status (text) - Flight status (active, delayed, cancelled, diverted)
- gate (text) - Departure gate number
- terminal (text) - Terminal identifier
- airline_code (text) - Airline IATA code
- origin_airport (text) - Departure airport code
- destination_airport (text) - Arrival airport code
- aircraft_type (text) - Aircraft model
- updated_at (timestamp) - Last update timestamp
- created_at (timestamp) - Record creation timestamp

**passengers table** - Passenger contact information with columns:

- passenger_id (integer) - Unique passenger identifier
- name (text) - Full passenger name
- email (text) - Email address for notifications
- phone (text) - Mobile phone number for SMS alerts
- notification_preferences (json) - Communication preferences
- created_at (timestamp) - Registration timestamp
- updated_at (timestamp) - Last profile update

**tickets table** - Booking and ticket status with columns:

- ticket_id (integer) - Unique ticket identifier
- passenger_id (integer) - Foreign key to passengers table
- flight_number (text) - Flight identifier
- flight_date (date) - Travel date
- seat_number (text) - Assigned seat
- ticket_status (text) - Status (confirmed, cancelled, checked-in)
- booking_reference (text) - Booking confirmation code
- fare_class (text) - Ticket class (economy, business, first)
- created_at (timestamp) - Booking timestamp
- updated_at (timestamp) - Last modification timestamp

**sync_logs table** - Audit trail and system logs with columns:

- log_id (integer) - Unique log identifier
- workflow_name (text) - Name of the workflow that created the log
- total_changes (integer) - Number of schedule changes processed
- sync_status (text) - Status (completed, failed, partial)
- sync_timestamp (timestamp) - When the sync occurred
- details (json) - Detailed log information and changes
- error_message (text) - Error details if sync failed
- execution_time_ms (integer) - Processing time in milliseconds

### Communication Channels

- Slack - Internal team notifications
- SendGrid - Passenger email notifications
- Twilio - Critical SMS alerts
- Internal webhooks - System integrations

## How to Use

1. Import the workflow into your n8n instance
2. Configure aviation API credentials (AviationStack, FlightAware, or airline-specific APIs)
3. Set up the PostgreSQL database connection with the required tables
4. Configure the Slack bot token for operations team notifications
5. Set up the SendGrid API key and email templates for passenger notifications
6. Configure Twilio credentials for SMS alerts (critical notifications only)
7. Test with sample flight data to verify all notification channels
8. Adjust monitoring frequency and severity thresholds based on operational needs
9. Monitor sync logs to ensure reliable data synchronization

## Requirements

### API Access

- Aviation data provider (AviationStack, FlightAware, etc.)
- SendGrid account for email delivery
- Twilio account for SMS notifications
- Slack workspace and bot token

### Database Setup

- PostgreSQL database with flight schedule tables
- Passenger and ticket management tables
- Audit logging tables for tracking changes

### Infrastructure

- n8n instance with appropriate node modules
- Reliable internet connection for API calls
- Proper credential management and security

## Customizing This Workflow

Modify the Process Changes node to adjust change detection sensitivity, add custom business rules, or integrate additional data sources like weather or airport operational data. Customize notification templates in the email and SMS nodes to match your airline's branding and communication style. Adjust the Schedule Trigger frequency based on your operational requirements and API rate limits.
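To make the Process Changes step more concrete, here is a hedged Code node sketch of the comparison between API data and stored schedules. The input field names and the 60-minute delay threshold for "urgent" are assumptions based on the description above, not the template's exact implementation.

```javascript
// Hypothetical sketch of the "Process Changes" node. Assumes the two upstream nodes
// provide flight objects with the columns listed for the flight_schedules table.
const apiFlights = $('Fetch Airline Data').all().map(i => i.json);
const dbFlights = $('Get Current Schedules').all().map(i => i.json);

const byNumber = new Map(dbFlights.map(f => [f.flight_number, f]));
const changes = [];

for (const flight of apiFlights) {
  const stored = byNumber.get(flight.flight_number);
  if (!stored) continue; // only track flights already in the database

  const delayMinutes =
    (Date.parse(flight.departure_time) - Date.parse(stored.departure_time)) / 60000;

  const changedFields = ['departure_time', 'arrival_time', 'status', 'gate', 'terminal']
    .filter(field => flight[field] !== stored[field]);

  if (changedFields.length === 0) continue;

  changes.push({
    flight_number: flight.flight_number,
    changedFields,
    delayMinutes,
    // cancellations and delays over an hour treated as critical (assumed threshold)
    urgent: flight.status === 'cancelled' || delayMinutes >= 60,
    newValues: flight,
  });
}

return changes.map(change => ({ json: change }));
```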
by Juan Carlos Cavero Gracia
## Description

This automation template is designed for Instagram marketers, influencers, and businesses looking to supercharge their Instagram engagement strategy. It automatically monitors Instagram post comments and sends personalized direct messages (DMs) to new commenters, while maintaining a smart tracking system to prevent duplicate messages. The workflow runs continuously, checking for new comments every 15 minutes and responding instantly to maintain high engagement rates.

*Note: This workflow uses the upload-post.com API for Instagram interactions and Google Sheets for contact tracking. The workflow is configured to monitor a specific Instagram post.*

## Who Is This For?

- **Instagram Marketers & Influencers:** Automatically engage with every commenter by sending personalized DMs with valuable content, links, or offers.
- **E-commerce Businesses:** Convert Instagram comments into sales opportunities by instantly sending product links, discount codes, or catalog information via DM.
- **Content Creators & Coaches:** Build deeper relationships with your audience by automatically reaching out to commenters with additional resources, course links, or exclusive content.
- **Social Media Managers:** Scale client engagement without manual monitoring, ensuring no potential lead or follower interaction goes unnoticed.

## What Problem Does This Workflow Solve?

Manually monitoring Instagram comments and sending follow-up DMs is time-consuming and often leads to missed opportunities. This workflow addresses these challenges by:

- **Automated Comment Monitoring:** Continuously checks for new comments on your specified Instagram post every 15 minutes.
- **Smart Duplicate Prevention:** Uses Google Sheets to track already contacted users, preventing spam and maintaining professional communication.
- **Instant Response System:** Sends personalized DMs immediately when new comments are detected, maximizing engagement while the interaction is fresh.
- **Scalable Engagement:** Handles multiple commenters simultaneously without manual intervention, perfect for viral posts or high-engagement content.
- **Comprehensive Tracking:** Maintains detailed logs of all interactions including timestamps, usernames, and message content for analytics and follow-up.

## How It Works

1. **Post Configuration:** Set your Instagram post URL, reply message, and profile username in the configuration node.
2. **Comment Monitoring:** The workflow fetches all comments from your specified Instagram post using the upload-post.com API.
3. **Smart Filtering:** Compares new comments against your Google Sheets database to identify users who haven't been contacted yet (a sketch of this step appears at the end of this section).
4. **Automated DM Sending:** Sends personalized direct messages to new commenters with your configured message.
5. **Contact Tracking:** Records each successful interaction in Google Sheets with comment ID, username, message sent, timestamp, and post URL.
6. **Continuous Monitoring:** Automatically repeats the process every 15 minutes using the built-in scheduler.

## Setup

1. **Upload-Post API Credentials:** Create an account at upload-post.com, connect your Instagram account, and add your API credentials to the HTTP request nodes.
2. **Google Sheets Setup:**
   - Create a Google Sheet with columns: comment_id, username, message_sent, timestamp, post_url
   - Connect your Google account to the Google Sheets nodes
   - Update the document ID in the "Read Contacted Users" and "Record Contacted User" nodes
3. **Instagram Post Configuration:** In the "Configure Post & Message" node, update:
   - postUrl: Your Instagram post URL to monitor
   - replyMessage: The DM message to send to commenters
   - profileUsername: Your Upload-post profile username
4. **Monitoring Schedule:** The workflow is set to run every 15 minutes. You can adjust this in the "Schedule Trigger" node based on your needs.

## Requirements

- **Accounts:** n8n, upload-post.com, Google (for Sheets access), Instagram business account.
- **API Keys & Credentials:** Upload-post.com API token, Google Sheets OAuth2 credentials.
- **Instagram Setup:** Business/Creator account with API access through upload-post.com.

## Features

- **Duplicate Prevention:** Advanced comment ID tracking prevents sending multiple DMs to the same user
- **Error Handling:** Robust error handling for API failures and edge cases
- **Detailed Logging:** Comprehensive console logging for debugging and monitoring
- **Flexible Configuration:** Easy to modify for different posts, messages, and monitoring intervals
- **Success Tracking:** Monitors both successful and failed DM attempts for analytics

Use this template to transform your Instagram engagement strategy, automatically converting every comment into a potential lead or deeper connection while maintaining professional communication standards.
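As referenced in the Smart Filtering step, here is a rough Code node sketch of how new commenters could be separated from already contacted users. The "Get Post Comments" node name and the exact field names returned by upload-post.com are assumptions; only the "Read Contacted Users" node and the sheet columns come from the description above.

```javascript
// Hypothetical sketch: keep only commenters that are not yet in the tracking sheet.
const comments = $('Get Post Comments').all().map(i => i.json);
const contacted = new Set(
  $('Read Contacted Users').all().map(i => String(i.json.comment_id))
);

const newCommenters = comments.filter(c => !contacted.has(String(c.comment_id)));

console.log(`Found ${newCommenters.length} new commenter(s) out of ${comments.length} comment(s)`);

return newCommenters.map(c => ({
  json: {
    comment_id: c.comment_id,
    username: c.username,
  },
}));
```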
by Flavio Angeleu
# WhatsApp Flows Encrypted Data Exchange Workflow

## Summary

This workflow enables secure end-to-end encrypted data exchange with WhatsApp Flows for interactive applications inside WhatsApp. It implements the WhatsApp Business Encryption protocol using RSA for key exchange and AES-GCM for payload encryption, providing a secure channel for sensitive data transmission while interfacing with WhatsApp's Business API. This follows the official WhatsApp Business Encryption specifications to establish an encrypted data exchange channel between your business and the WhatsApp consumer client.

## How It Works

### Encryption Flow

1. **Webhook Reception:** Receives encrypted data from WhatsApp containing:
   - encrypted_flow_data: The AES-encrypted payload
   - encrypted_aes_key: The RSA-encrypted AES key
   - initial_vector: Initialization vector for AES decryption
2. **Decryption Process:**
   - The workflow decrypts the AES key using an RSA private key
   - It then uses this AES key to decrypt the payload data
   - The inverted IV is used for response encryption
3. **Data Processing:**
   - The workflow parses the decrypted JSON data
   - Routes requests based on the screen parameter
4. **Response Generation:**
   - Generates appropriate response data based on the request type
   - Encrypts the response using the same AES key and inverted IV
   - Returns the base64-encoded encrypted response

A sketch of the decrypt-and-respond round trip appears at the end of this section.

### Key Components

- **Webhook Endpoint**: Entry point for encrypted WhatsApp requests
- **Decryption Pipeline**: RSA and AES decryption components
- **Business Logic Router**: Screen-based routing for different functionality
- **Encryption Pipeline**: Secure response encryption

## How to Use

1. **Deploy the Workflow:** Import the workflow JSON into your n8n instance
2. **Set Up WhatsApp Integration:**
   - Configure your WhatsApp Business API to send requests to your n8n webhook URL
   - Ensure your WhatsApp integration encrypts data using the public key that pairs with the private key used in this workflow
3. **Test the Flow:**
   - Send an encrypted test message from WhatsApp to verify connectivity
   - Check that appointment data is being retrieved correctly
   - Validate that seat selection is functioning as expected
4. **Production Use:**
   - Monitor the workflow's performance in production
   - Set up error notifications if needed

## Requirements

### Authentication Keys

- RSA Private Key: Required for decrypting the AES key (included in the workflow)
- WhatsApp Business Public Key: Must be registered with the WhatsApp Business API
- PostgreSQL Credentials: For accessing appointment data from the database

## WhatsApp Business Encryption Setup

As specified in the WhatsApp Business Encryption documentation:

1. **Generate a 2048-bit RSA Key Pair:**
   - The private key remains with your business (used in this workflow)
   - The public key is shared with WhatsApp
2. **Register the Public Key with WhatsApp:**
   - Use the WhatsApp Cloud API to register your public key
   - Set up the public key using the /v17.0/{WhatsApp-Business-Account-ID}/whatsapp_business_encryption endpoint
3. **Key Registration API Call:**

```
POST /v17.0/{WhatsApp-Business-Account-ID}/whatsapp_business_encryption
{ "business_public_key": "YOUR_PUBLIC_KEY" }
```

4. **Verification:**
   - Verify your public key is registered using a GET request to the same endpoint
   - Ensure the key status is "active"
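For orientation, here is a hedged sketch of the decrypt-and-respond round trip as a single n8n Code node, following the pattern shown in Meta's WhatsApp Business Encryption examples (RSA-OAEP with SHA-256 for the key, AES-128-GCM for the payload, bitwise-inverted IV for the response). It assumes the Code node is allowed to `require('crypto')` and read the private key from an environment variable, and the response payload is a placeholder; the actual workflow splits this logic across several nodes.

```javascript
// Hypothetical sketch of the WhatsApp Flows decrypt/encrypt round trip.
const crypto = require('crypto');

const { encrypted_flow_data, encrypted_aes_key, initial_vector } = $json.body;
const privatePem = process.env.WA_FLOW_PRIVATE_KEY; // assumed: PEM-encoded 2048-bit RSA key

// 1. Decrypt the AES key with the RSA private key (OAEP + SHA-256)
const aesKey = crypto.privateDecrypt(
  {
    key: crypto.createPrivateKey(privatePem),
    padding: crypto.constants.RSA_PKCS1_OAEP_PADDING,
    oaepHash: 'sha256',
  },
  Buffer.from(encrypted_aes_key, 'base64')
);

// 2. Decrypt the flow payload with AES-128-GCM (auth tag is the last 16 bytes)
const flowData = Buffer.from(encrypted_flow_data, 'base64');
const iv = Buffer.from(initial_vector, 'base64');
const TAG_LENGTH = 16;
const decipher = crypto.createDecipheriv('aes-128-gcm', aesKey, iv);
decipher.setAuthTag(flowData.subarray(flowData.length - TAG_LENGTH));
const decrypted = JSON.parse(
  Buffer.concat([
    decipher.update(flowData.subarray(0, flowData.length - TAG_LENGTH)),
    decipher.final(),
  ]).toString('utf8')
);

// 3. Build a response for the requested screen (placeholder business logic)
const response = { screen: decrypted.screen, data: { acknowledged: true } };

// 4. Encrypt the response with the same AES key and the bitwise-inverted IV
const invertedIv = Buffer.from(iv.map(byte => ~byte & 0xff));
const cipher = crypto.createCipheriv('aes-128-gcm', aesKey, invertedIv);
const encrypted = Buffer.concat([
  cipher.update(JSON.stringify(response), 'utf8'),
  cipher.final(),
  cipher.getAuthTag(),
]);

// The webhook response body must be the base64-encoded encrypted payload
return [{ json: { responseBody: encrypted.toString('base64') } }];
```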
by explorium
# HubSpot Contact Enrichment with Explorium Template

Download the following json file and import it to a new n8n workflow: hubspot_flow.json

## Overview

This n8n workflow monitors your HubSpot instance for newly created contacts and automatically enriches them with additional contact information. When a contact is created, the workflow:

1. Detects the new contact via HubSpot webhook trigger
2. Retrieves recent contact details from HubSpot
3. Matches the contact against Explorium's database using name, company, and email
4. Enriches the contact with professional emails and phone numbers
5. Updates the HubSpot contact record with discovered information

This automation ensures your sales and marketing teams have complete contact information, improving outreach success rates and data quality.

## Key Features

- **Real-time Webhook Trigger**: Instantly processes new contacts as they're created
- **Intelligent Matching**: Uses multiple data points (name, company, email) for accurate matching
- **Comprehensive Enrichment**: Adds both professional and work emails, plus phone numbers
- **Batch Processing**: Efficiently handles multiple contacts to optimize API usage
- **Smart Data Mapping**: Intelligently maps multiple emails and phone numbers
- **Profile Enrichment**: Optional additional enrichment for deeper contact insights
- **Error Resilience**: Continues processing other contacts if some fail to match

## Prerequisites

Before setting up this workflow, ensure you have:

- n8n instance (self-hosted or cloud)
- HubSpot account with:
  - Developer API access (for webhooks)
  - Private App or OAuth2 app created
  - Contact object permissions (read/write)
- Explorium API credentials (Bearer token) - Get explorium api key
- Understanding of HubSpot contact properties

## HubSpot Requirements

### Required Contact Properties

The workflow uses these HubSpot contact properties:

- firstname - Contact's first name
- lastname - Contact's last name
- company - Associated company name
- email - Primary email (read and updated)
- work_email - Work email (updated by workflow)
- phone - Phone number (updated by workflow)

### API Access Setup

1. Create a Private App in HubSpot:
   - Navigate to Settings → Integrations → Private Apps
   - Create new app with Contact read/write scopes
   - Copy the Access Token
2. Set up Webhooks (for Developer API):
   - Create app in HubSpot Developers portal
   - Configure webhook for contact.creation events
   - Note the App ID and Developer API Key

### Custom Properties (Optional)

Consider creating custom properties for:

- Multiple email addresses
- Mobile vs. office phone numbers
- Data enrichment timestamps
- Match confidence scores

## Installation & Setup

### Step 1: Import the Workflow

1. Copy the workflow JSON from the template
2. In n8n: Navigate to Workflows → Add Workflow → Import from File
3. Paste the JSON and click Import

### Step 2: Configure HubSpot Developer API (Webhook)

1. Click on the HubSpot Trigger node
2. Under Credentials, click Create New
3. Enter your HubSpot Developer credentials:
   - App ID: From your HubSpot app
   - Developer API Key: From your developer account
   - Client Secret: From your app settings
4. Save as "HubSpot Developer account"

### Step 3: Configure HubSpot App Token

1. Click on the HubSpot Recently Created node
2. Under Credentials, click Create New (App Token)
3. Enter your Private App access token
4. Save as "HubSpot App Token account"
5. Apply the same credentials to the Update HubSpot node

### Step 4: Configure Explorium API Credentials

1. Click on the Explorium Match Prospects node
2. Under Credentials, click Create New (HTTP Header Auth)
3. Configure the authentication:
   - Name: Authorization
   - Value: Bearer YOUR_EXPLORIUM_API_TOKEN
4. Save as "Header Auth Connection"
5. Apply to all Explorium nodes:
   - Explorium Enrich Contacts Information
   - Explorium Enrich Profiles

### Step 5: Configure Webhook Subscription

In the HubSpot Developers portal:

1. Go to your app's webhook settings
2. Add a subscription for contact.creation events
3. Set the target URL from the HubSpot Trigger node
4. Activate the subscription

### Step 6: Activate the Workflow

1. Save the workflow
2. Toggle the Active switch to ON
3. The webhook is now listening for new contacts

## Node Descriptions

- **HubSpot Trigger**: Webhook that fires when new contacts are created
- **HubSpot Recently Created**: Fetches details of recently created contacts
- **Loop Over Items**: Processes contacts in batches of 6
- **Explorium Match Prospects**: Finds matching person in Explorium database
- **Filter**: Validates successful matches
- **Extract Prospect IDs**: Collects matched prospect identifiers
- **Enrich Contacts Information**: Fetches emails and phone numbers
- **Enrich Profiles**: Gets additional profile data (optional)
- **Merge**: Combines all enrichment results
- **Split Out**: Separates individual enriched records
- **Update HubSpot**: Updates contact with new information

## Data Mapping Logic

The workflow maps Explorium data to HubSpot properties:

| Explorium Data | HubSpot Property | Notes |
| --- | --- | --- |
| professions_email | email | Primary professional email |
| emails[].address | work_email | All email addresses joined |
| phone_numbers[].phone_number | phone | All phones joined with commas |
| mobile_phone | phone (fallback) | Used if no other phones found |

## Data Processing

The workflow handles complex data scenarios:

- **Multiple emails**: Joins all discovered emails with commas
- **Phone numbers**: Combines all phone numbers into a single field
- **Missing data**: Uses "null" as placeholder for empty fields
- **Name parsing**: Cleans sample data and special characters

## Usage & Operation

### Automatic Processing

Once activated:

1. Every new contact triggers the webhook immediately
2. The contact is enriched within seconds
3. The HubSpot record is updated automatically
4. The process repeats for each new contact

### Manual Testing

To test the workflow:

1. Use the pinned test data in the HubSpot Trigger node, or
2. Create a test contact in HubSpot
3. Monitor the execution in n8n
4. Verify the contact was updated in HubSpot

### Monitoring Performance

Track workflow health:

1. Go to Executions in n8n
2. Filter by this workflow
3. Monitor success rates
4. Review any failed executions
5. Check webhook delivery in HubSpot

## Troubleshooting

### Common Issues

**Webhook not triggering**

- Verify the webhook subscription is active in HubSpot
- Check the webhook URL is correct and accessible
- Ensure the workflow is activated in n8n
- Test webhook delivery in the HubSpot developers portal

**Contacts not matching**

- Verify the contact has firstname, lastname, and company
- Check for typos or abbreviations in company names
- Some individuals may not be in Explorium's database
- Email matching improves accuracy significantly

**Updates failing in HubSpot**

- Check the API token has contact write permissions
- Verify the property names exist in HubSpot
- Ensure rate limits haven't been exceeded
- Check for validation rules on properties

**Missing enrichment data**

- Not all prospects have all data types
- Phone numbers may be less available than emails
- Profile enrichment is optional and may not always return data

### Error Handling

Built-in error resilience:

- Failed matches don't block other contacts
- Each batch processes independently
- Partial enrichment is possible
- All errors are logged for review

### Debugging Tips

- Check webhook logs: HubSpot shows delivery attempts
- Review executions: n8n logs show detailed error messages
- Test with pinned data: Use the sample data for isolated testing
- Verify API responses: Check Explorium API returns expected data

## Best Practices

### Data Quality

- Complete contact records: Ensure name and company are populated
- Standardize company names: Use official names, not abbreviations
- Include existing emails: Improves match accuracy
- Regular data hygiene: Clean up test and invalid contacts

### Performance Optimization

- Batch size: 6 is optimal for rate limits
- Webhook reliability: Monitor delivery success
- API quotas: Track usage in both platforms
- Execution history: Regularly clean old executions

### Compliance & Privacy

- GDPR compliance: Ensure lawful basis for enrichment
- Data minimization: Only enrich necessary fields
- Access controls: Limit who can modify enriched data
- Audit trail: Document enrichment for compliance

## Customization Options

### Additional Enrichment

Extend with more Explorium data:

- Job titles and departments
- Social media profiles
- Professional experience
- Skills and interests
- Company information

### Enhanced Processing

Add workflow logic for:

- Lead scoring based on enrichment
- Routing based on data quality
- Notifications for high-value matches
- Custom field mapping

### Integration Extensions

Connect to other systems:

- Sync enriched data to CRM
- Trigger marketing automation
- Update data warehouse
- Send notifications to Slack

## API Considerations

### HubSpot Limits

- **API calls**: Monitor daily limits
- **Webhook payload**: Max 200 contacts per trigger
- **Rate limits**: 100 requests per 10 seconds
- **Property limits**: Max 1000 custom properties

### Explorium Limits

- **Match API**: Batched for efficiency
- **Enrichment calls**: Two parallel enrichments
- **Rate limits**: Based on your plan
- **Data freshness**: Real-time matching

## Architecture Considerations

This workflow integrates with:

- HubSpot workflows and automation
- Marketing campaigns and sequences
- Sales engagement tools
- Reporting and analytics
- Other enrichment services

## Security Best Practices

- **Webhook validation**: Verify requests are from HubSpot
- **Token security**: Rotate API tokens regularly
- **Access control**: Limit workflow modifications
- **Data encryption**: All API calls use HTTPS
- **Audit logging**: Track all enrichments

## Advanced Configuration

### Custom Field Mapping

Modify the Update HubSpot node to map to custom properties:

```
// Example custom mapping
{
  "custom_mobile": "{{ $json.data.mobile_phone }}",
  "custom_linkedin": "{{ $json.data.linkedin_url }}",
  "enrichment_date": "{{ $now.toISO() }}"
}
```
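To illustrate the joining behavior described under Data Processing, a Code node placed before Update HubSpot might look roughly like the sketch below. This is an assumption-laden example: the exact shape of Explorium's enrichment response may differ from the field names in the mapping table.

```javascript
// Hypothetical sketch of the email/phone joining described under "Data Processing".
const data = $json.data ?? {};

const workEmails = (data.emails ?? [])
  .map(e => e.address)
  .filter(Boolean)
  .join(', ');

const phones = (data.phone_numbers ?? [])
  .map(p => p.phone_number)
  .filter(Boolean)
  .join(', ');

return [{
  json: {
    email: data.professions_email ?? 'null',       // primary professional email
    work_email: workEmails || 'null',               // all addresses joined with commas
    phone: phones || data.mobile_phone || 'null',   // fall back to mobile if nothing else found
  },
}];
```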
### Conditional Processing

Add logic to process only certain contacts:

- Filter by contact source
- Check for specific properties
- Validate email domains
- Exclude test contacts

## Support Resources

For assistance:

- **n8n issues**: Check n8n documentation and forums
- **HubSpot API**: Reference HubSpot developers documentation
- **Explorium API**: Contact Explorium support
- **Webhook issues**: Use HubSpot webhook testing tools
by Agus Narestha
# 🔒 SSL Certificate Monitoring & Expiry Alert with Spreadsheet [FREE APIs]

## ✅ What This Workflow Does

This n8n template automatically monitors SSL certificates of websites listed in a Google Sheet and sends email alerts if any are expiring within 14 days. It helps ensure you avoid downtime, security issues, and trust warnings due to expired certificates.

## 🧩 Key Features

- 📅 Weekly Automation: Runs every Monday at 7:00 AM (configurable).
- 📄 Google Sheets Integration: Fetches and updates data in a spreadsheet.
- 🔍 SSL Check via API: Uses ssl-checker.io to get certificate details.
- ⚠️ SSL Expiry Filter: Identifies certificates expiring within 14 days.
- 📧 Email Alerts: Sends notifications for certificates close to expiration.

## 📂 Input Spreadsheet Format

Your Google Sheet should have the following columns:

| No | Name | Link | SSL Issued On | SSL Expired On | SSL Status |
|----|-----------------|-----------------------|-------------------|-------------------|------------|
| 1 | Example Site | https://example.com | 2024-07-01 | 2025-07-01 | Valid |
| 2 | My Blog | https://myblog.org | 2024-07-05 | 2024-07-20 | Expiring |

Each row should include a valid website URL in the Link column.

## 🛠️ How It Works

1. **Scheduled Trigger** - Executes weekly (Monday 7:00 AM).
2. **Fetch Website List** - Reads all website entries from the Google Sheet.
3. **Check SSL Certificates** - Uses the ssl-checker.io API to retrieve certificate details for each website.
4. **Update Spreadsheet** - Writes the "Issued On" and "Expired On" fields back to the spreadsheet.
5. **Evaluate SSL Expiry** - Filters for certificates expiring within 14 days (a sketch of this check appears at the end of this section).
6. **Check Condition** - Determines whether to send alerts based on the filtered results.
7. **Send Email Alert** - Notifies via email if any certificates are expiring soon.

## 📬 Example Email Output

Subject: ⚠️ ALERT!! SSL EXPIRED

SSL certificates expiring soon:

- example.com (expires in 5 days)
- anotherdomain.net (expires in 3 days)

## 🧰 Setup Requirements

- A Google Sheet with the correct columns and website links.
- SMTP credentials to send alert emails.
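As referenced in the Evaluate SSL Expiry step, the 14-day filter could be implemented as a Code node roughly like this. It assumes each item carries the sheet row with an "SSL Expired On" column, as in the spreadsheet format above; the exact field handling is illustrative.

```javascript
// Hypothetical sketch: keep only rows whose certificate expires within 14 days.
const ALERT_WINDOW_DAYS = 14;
const now = Date.now();

const expiring = [];
for (const item of $input.all()) {
  const row = item.json;
  const expiresAt = Date.parse(row['SSL Expired On']);
  if (Number.isNaN(expiresAt)) continue; // skip rows without a parsable date

  const daysLeft = Math.floor((expiresAt - now) / (24 * 60 * 60 * 1000));
  if (daysLeft <= ALERT_WINDOW_DAYS) {
    expiring.push({ json: { site: row.Link, daysLeft } });
  }
}

return expiring; // feeds the condition check and the email alert
```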
by Marketing Canopy
# UTM Link Creator & QR Code Generator with Scheduled Google Analytics Reports

This workflow enables marketers to generate UTM-tagged links, convert them into QR codes, and automate performance tracking in Google Analytics with scheduled reports every 7 days. This solution helps monitor traffic sources from different marketing channels and optimize campaign performance based on analytics data.

## Prerequisites

Before implementing this workflow, ensure you have the following:

**Google Analytics 4 (GA4) Account & Access**

- Ensure you have a GA4 property set up.
- Access to the GA4 Data API to schedule performance tracking.
- Refer to the Google Analytics Data API Overview for more information.

**Airtable Account & API Key**

- Create an Airtable base to store UTM links, QR codes, and analytics data.
- Obtain an Airtable API key from your Account Settings.
- Detailed instructions are available in the Airtable API Authentication Guide.

## Step-by-Step Guide to Setting Up the Workflow

### 1. Generate UTM Links

Create a form or interface to input:

- **Base URL** (e.g., https://example.com)
- **Campaign Name** (utm_campaign)
- **Source** (utm_source)
- **Medium** (utm_medium)
- **Term** (Optional: utm_term)
- **Content** (Optional: utm_content)

Append the UTM parameters to generate a trackable URL (see the sketch at the end of this section).

### 2. Store UTM Links & QR Codes in Airtable

Set up an Airtable base with the following columns:

- **UTM Link**
- **QR Code**
- **Campaign Name**
- **Source**
- **Medium**
- **Date Created**

Adjust as needed based on your tracking requirements. For guidance on setting up your Airtable base and using the API, refer to the Airtable Web API Documentation.

### 3. Convert UTM Links to QR Codes

Use a QR code generator API (e.g., goqr.me, qrserver.com) to generate QR codes for each UTM link and store them in Airtable.

### 4. Schedule Google Analytics Performance Reports (Every 7 Days)

Use the Google Analytics Data API to pull weekly performance reports based on UTM parameters. Extract key metrics such as:

- Sessions
- Users
- Bounce Rate
- Conversions
- Revenue (if applicable)

Store the data in Airtable for tracking and analysis, and adjust the timeframe as needed. For more details on accessing and using the Google Analytics Data API, consult the Google Analytics Data API Overview.

## Benefits of This Workflow

- ✅ Track Marketing Campaigns: Easily monitor which channels drive traffic.
- ✅ Automate QR Code Creation: Seamless integration of UTM links with QR codes.
- ✅ Scheduled Google Analytics Reports: No manual reporting—everything runs automatically.
- ✅ Improve Data-Driven Decisions: Optimize ad spend and marketing strategies based on performance insights.
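As referenced in step 1, here is a hedged Code node sketch that appends the UTM parameters and builds a QR code image URL. The input field names are assumptions, and the qrserver.com endpoint shown is one of the free generators mentioned in step 3; verify its current parameters before relying on it.

```javascript
// Hypothetical sketch: build a UTM-tagged link and a QR code image URL for Airtable.
const { baseUrl, campaign, source, medium, term, content } = $json;

const params = new URLSearchParams({
  utm_campaign: campaign,
  utm_source: source,
  utm_medium: medium,
});
if (term) params.set('utm_term', term);       // optional
if (content) params.set('utm_content', content); // optional

const utmLink = `${baseUrl}?${params.toString()}`;

// QR image URL that can be stored in the Airtable "QR Code" field
const qrCodeUrl =
  `https://api.qrserver.com/v1/create-qr-code/?size=300x300&data=${encodeURIComponent(utmLink)}`;

return [{ json: { utmLink, qrCodeUrl, dateCreated: new Date().toISOString() } }];
```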
by Airtop
## About The Product Hunt Automation

Staying up-to-date with specific topics and launches on Product Hunt can be time-consuming. Manually checking the site multiple times a day interrupts your workflow and risks missing important launches. What if you could automatically get relevant launches delivered to your Slack workspace?

## How to Monitor Product Hunt

In this guide, you'll learn how to create a Product Hunt monitoring system using Airtop's built-in node in n8n. This automation will scan Product Hunt for your chosen topics and deliver the most relevant launches directly to Slack.

## What You'll Need

- A free Airtop API key
- A Slack workspace with permissions to add incoming webhooks

Estimated setup time: 5 minutes

## Understanding the Process

The Monitor Product Hunt automation uses Airtop's cloud browser capabilities to access Product Hunt and extract launch information. Here's what happens:

1. Airtop visits Product Hunt and navigates the page
2. It searches for and extracts up to 5 launches related to your chosen topic
3. The information is formatted and sent to your specified Slack channel

This process can run on your preferred schedule, ensuring you never miss relevant launches.

## Setting Up Your Automation

We've created a ready-to-use template that handles all the complex parts. Here's how to get started:

1. Connect your Airtop account by adding the API key you created
2. Connect your Slack account
3. Set your prompt in the Airtop node. For this example, we've set it to "Extract up to 5 launches related to AI products"
4. Choose your preferred monitoring schedule

## Customization Options

While our template works immediately, you might want to customize it for your specific needs:

- Adjust the prompt and the maximum number of launches to monitor
- Customize the Slack message format
- Change the monitoring frequency
- Add filters for particular keywords or companies

## Real-World Applications

Here's how teams can use this automation:

- A startup's engineering team could track trends in other products' tech stacks, helping them stay informed about potential issues and improvements.
- A product manager can track launches of competitor products, enabling them to gather valuable market insights and user feedback directly from the tech community on that launch.

## Best Practices

To get the most out of this automation:

- **Choose Specific Search Terms**: For more relevant results, instead of broad terms like "AI," use specific phrases like "machine learning for healthcare"
- **Optimize Scheduling**: When setting the monitoring frequency, consider your team's workflow. Running the scenario every 4 hours during working hours often provides a good balance between staying updated and avoiding notification fatigue.
- **Set Up Error Handling**: Enable n8n's error output to alert you if the automation encounters any issues with accessing Product Hunt or sending messages to Slack.
- **Regular Topic Review**: Schedule a monthly review of your monitored topics to ensure they're still relevant to your needs and adjust as necessary.

## What's Next?

Now that you've set up your Product Hunt monitor automation, you might be interested in:

- Creating a similar monitor for other tech websites
- Setting up automated content curation for your team's newsletter
- Building a competitive intelligence dashboard using web monitoring

Happy Automating!
by Airtop
## About the Automation

Staying on top of competitor pricing changes can be a full-time job. Manual price tracking is time-consuming and prone to errors, especially when dealing with complex pricing structures and multiple subscription tiers. Paid competitor price monitoring tools like Competera, Visualping, and Fluxguard can be expensive. What if you could automate this process and get instant alerts when competitors adjust their pricing?

## How to easily monitor competitor pricing

With this automation, you'll learn how to set up an automated price monitoring system using Airtop's built-in node in n8n. By the end, your system will automatically track competitor pricing changes and notify you of any modifications.

## What You'll Need

- A free Airtop API Key
- A Google Sheets account with a copy of this sheet
- URLs of competitors' pricing pages

## Understanding the Process

This automation continuously monitors competitor pricing pages and compares them against your baseline data. The workflow:

1. Tracks all different pricing plans (monthly, yearly, etc.)
2. Monitors feature changes across different tiers
3. Detects and logs pricing structure modifications
4. Alerts you via Slack when changes are detected

A sketch of the baseline comparison appears at the end of this section.

## Setting Up Your Automation

We've created a ready-to-use blueprint for seamless price monitoring. Here's how to get started:

1. Connect your Google Sheets account
2. Set up your Airtop API connection
3. Define the update frequency

## Customization Options

Enhance the basic template with these popular modifications:

- Add other notification channels (Email, Telegram, etc.)
- Include feature comparison tracking
- Set up threshold-based alerts for significant price changes
- Track historical pricing trends

## Real-World Applications

- **Case Study 1:** A B2B SaaS company can use this automation to track competitors' pricing changes. When they identify a market-wide pricing shift, they can adjust their strategy proactively within minutes.
- **Case Study 2:** An online Ecommerce retailer automates monitoring of 100+ competitor products, maintaining optimal pricing positions and increasing profit margins.

## Best Practices

To ensure accurate tracking:

- Include detailed baseline data for each pricing tier
- Specify both monthly and annual pricing clearly
- List all features included in each plan
- Update your baseline data whenever you verify changes
- Include any promotional pricing or special offers
- Document currency and regional variations if applicable

Example Structure in Google Sheets:

- Competitor: Acme Tools
  - Basic Plan: Monthly: $29, Annual: $290 ($24.17/mo), Features: 5 users, 10GB storage, basic support
  - Pro Plan: Monthly: $79, Annual: $790 ($65.83/mo), Features: 20 users, 50GB storage, priority support

## What's Next?

After setting up your price monitoring automation, consider the following:

- Creating automated competitive analysis reports
- Setting up market trend analysis
- Implementing automatic pricing recommendations
- Expanding monitoring to feature changes

Happy monitoring!
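As referenced under Understanding the Process, the baseline comparison could be prototyped in a Code node roughly as below. The node names, column names, and plan/price fields are assumptions for illustration; the template's actual Airtop extraction output will likely differ.

```javascript
// Hypothetical sketch: compare extracted pricing against the baseline sheet.
const extracted = $('Extract Pricing').all().map(i => i.json);    // assumed: [{ plan, monthly, annual }]
const baseline = $('Read Baseline Sheet').all().map(i => i.json); // assumed: [{ Plan, Monthly, Annual }]

const baselineByPlan = new Map(baseline.map(row => [row.Plan, row]));

const changes = [];
for (const current of extracted) {
  const known = baselineByPlan.get(current.plan);
  if (!known) {
    changes.push({ plan: current.plan, change: 'new plan detected', current });
    continue;
  }
  if (String(current.monthly) !== String(known.Monthly) ||
      String(current.annual) !== String(known.Annual)) {
    changes.push({
      plan: current.plan,
      change: 'price changed',
      before: { monthly: known.Monthly, annual: known.Annual },
      after: { monthly: current.monthly, annual: current.annual },
    });
  }
}

// Downstream nodes would notify Slack only when something changed
return changes.map(c => ({ json: c }));
```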
by Ron
Get weather alerts on your mobile phone via push, SMS, or voice call. This flow gets weather information every morning and sends out an alert to your SIGNL4 on-call team. For example, you can send out weather alerts in case of freezing temperatures, snow, rain, hail storms, hot weather, etc.

The flow also supports automatic alert resolution: if the temperature goes up again, the alert is closed automatically in the app (see the sketch after this section).

Use cases:

- Dispatch snow removal teams
- Inform car dealers to protect the cars outside in case of hail storms
- Set sails if there are high winds
- And much more...

The flow can be adapted easily to other weather warnings, like rain, hail storms, etc.
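A minimal sketch of the alert/resolve decision, assuming the weather node returns a temperature in °C and that alerts are matched to their resolutions by a stable external ID. The 0 °C threshold and the exact SIGNL4 parameters are illustrative assumptions; check the SIGNL4 node or webhook documentation for the fields it actually expects.

```javascript
// Hypothetical sketch: raise an alert when it freezes, resolve it when it warms up.
const FREEZING_THRESHOLD_C = 0;
const { temperature, city } = $json;

const freezing = temperature <= FREEZING_THRESHOLD_C;

return [{
  json: {
    // reuse a stable key per location so the resolution matches the original alert
    externalId: `weather-${city}`,
    status: freezing ? 'new' : 'resolved', // assumed status values
    message: freezing
      ? `Freezing temperatures in ${city}: ${temperature} °C`
      : `Temperatures back above freezing in ${city}: ${temperature} °C`,
  },
}];
```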
by SpaGreen Creative
# WhatsApp Bulk Message Broadcast via Google Sheets (n8n Workflow)

## Use Case

This workflow enables automated bulk WhatsApp message broadcasting using the WhatsApp Business Cloud API. It pulls recipient and message data from a Google Sheet, sends templated messages (optionally with image headers), and updates the sheet with the message status. It is ideal for marketing teams, support agents, and businesses handling high-volume outreach.

## Who Is This For?

- Businesses conducting WhatsApp marketing or outreach campaigns
- Customer support or notification teams
- Administrators seeking an automated, no-code message distribution system using Google Sheets

## What This Workflow Does

- Triggers automatically every minute to scan for pending messages
- Fetches unsent entries from a Google Sheet
- Limits the number of messages processed per execution to comply with API usage guidelines
- Sanitizes WhatsApp numbers for proper formatting
- Sends messages using a pre-approved WhatsApp template (text and optional image)
- Marks the row as "Sent" in the sheet upon successful delivery

## Workflow Breakdown (Node by Node)

1. **Trigger Every 5 Minutes** - Initiates the workflow on a scheduled trigger to continuously monitor pending rows.
2. **Fetch All Pending Queries for Messaging** - Reads rows from the Google Sheet where the Status column is empty, indicating they haven't been processed yet.
3. **Limit** - Restricts processing to 2 rows per execution to manage API throughput.
4. **Loop Over Items** - Uses SplitInBatches to iterate through each row individually.
5. **Clean WhatsApp Number** - A Code node that strips non-numeric characters from the WhatsApp No field, ensuring the format is valid for the API (a sketch of this node appears at the end of this section).
6. **Send Message to 300 Phone No** - Sends a WhatsApp message using the WhatsApp Cloud API and a pre-approved template. The template includes:
   - An image from the Image URL column (as the header, optional)
   - Dynamic variables for the recipient's Name and Message fields
   - Template variables must be pre-defined and approved in the Meta Developer Portal, such as {{1}}, {{2}}.
7. **Change State of Rows in Sent1** - Updates the Status column to Sent for each successfully processed row, using the row number as a reference.

## Google Sheet Format

Structure your Google Sheet as shown below:

| WhatsApp No | Name | Message | Image URL | Status |
|--------------|------------|---------------------------|---------------------|--------|
| +8801XXXXXXX | John Doe | Hello, your order shipped | https://.../img.jpg | |

Leave the Status column empty for rows that need to be processed.

## Requirements

- WhatsApp Business Cloud API access via Meta for Developers
- A properly structured Google Sheet as described above
- Active OAuth2 credentials configured in n8n for:
  - googleSheetsOAuth2Api
  - whatsAppApi

## Customization Options

- Update the Limit node to control how many rows are processed in each run
- Adjust the trigger schedule (e.g., change to every 5 minutes)
- Replace the message template ID with your own custom-approved one from Meta
- Add error-handling logic (e.g., IF or Try/Catch nodes) to log failures or set Status = Failed

## Sample Sheet Template

View Sample Google Sheet

## Workflow Highlights

- Automated execution every 1 minute
- Reads and processes only pending records
- Verifies WhatsApp numbers and delivers templated messages
- Updates the Google Sheet after each attempt

## Support & Community

Need help setting up or customizing the workflow?

- WhatsApp: Contact Support
- Discord: Join SpaGreen Server
- Facebook Group: SpaGreen Community
- Website: Visit SpaGreen Creative
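As referenced in the Clean WhatsApp Number step, a minimal sketch of that Code node is shown below. It assumes the sheet column is named "WhatsApp No" (as in the table above) and that the Cloud API expects digits-only international numbers.

```javascript
// Hypothetical sketch of the "Clean WhatsApp Number" Code node.
return $input.all().map(item => {
  const raw = String(item.json['WhatsApp No'] ?? '');
  const digitsOnly = raw.replace(/\D/g, ''); // strip "+", spaces, dashes, etc.
  return {
    json: {
      ...item.json,
      'WhatsApp No': digitsOnly,
    },
  };
});
```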