by Francis Njenga
# Workflow Documentation: Auto-Retry Engine – Error Recovery Workflow

## Detailed Description
The Auto-Retry Engine: Error Recovery Workflow is designed to automate the process of identifying and retrying failed executions in n8n workflows. By leveraging scheduled triggers, API integrations, and conditional logic, this workflow ensures that any failed executions are automatically retried on an hourly basis. This reduces manual intervention, improves system reliability, and ensures smoother workflow operations.

## Who is this for?
This workflow is ideal for:
- **Automation Engineers**: Managing and maintaining workflows with minimal manual intervention.
- **DevOps Teams**: Ensuring high availability and reliability of automated processes.
- **IT Administrators**: Reducing downtime and improving system performance by automating error recovery.

## What problem does this workflow solve?
- **Manual Error Handling**: Eliminates the need for manual monitoring and retrying of failed executions.
- **Improved Reliability**: Automatically retries failed executions, reducing downtime and improving workflow success rates.
- **Time Efficiency**: Saves time by automating repetitive error recovery tasks, allowing teams to focus on higher-priority work.

## What this workflow does
This workflow automates the following steps:
1. **Scheduled Monitoring**: Checks for failed executions hourly using a Schedule Trigger.
2. **Error Filtering**: Identifies executions that have failed and filters out those that have already been successfully retried (a sketch of this step appears at the end of this description).
3. **Authentication**: Logs into the n8n instance using API credentials to retrieve session details.
4. **Automatic Retry**: Retries the failed executions using the n8n API.
5. **Batch Processing**: Processes multiple failed executions in batches to avoid overloading the system.

## Setup
### Prerequisites
To use this workflow, you'll need:
- **n8n Account**: To create and run the workflow.
- **n8n API Credentials**: For logging into the n8n instance and retrying executions.
- **HTTP Request Node**: Configured to interact with the n8n API.
- **Schedule Trigger**: Set to run the workflow hourly.

### Setup Process
1. **Configure Schedule Trigger**: Set the trigger to run hourly to check for failed executions.
2. **Set Login Credentials**: Add your n8n instance URL, username, and password in the Set node.
3. **Integrate the n8n API**: Use the HTTP Request node to log into the n8n instance and retrieve session details.
4. **Retry Failed Executions**: Configure the HTTP Request node to retry failed executions using the session details.
5. **Batch Processing**: Use the Split in Batches node to process multiple failed executions in batches.

## How to customize this workflow
Tailor the workflow to fit your specific needs:
- **Adjust Schedule Frequency**: Modify the Schedule Trigger to run at different intervals (e.g., every 30 minutes).
- **Add Notifications**: Integrate email or Slack notifications to alert teams about failed retries.
- **Refine Error Filtering**: Customize the filtering logic to exclude specific types of failed executions.
- **Scale Batch Size**: Adjust the batch size in the Split in Batches node to optimize performance.

## Conclusion
The Auto-Retry Engine: Error Recovery Workflow is a powerful tool for automating error recovery in n8n workflows. By reducing manual intervention and ensuring failed executions are retried automatically, this workflow enhances system reliability and operational efficiency. Whether you're managing a few workflows or a complex automation ecosystem, this workflow ensures your processes run smoothly and consistently.
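The error-filtering sketch referenced above might be expressed as a Code node like the following. This is a minimal sketch, not the template's exact logic: the `status` and `retried` field names are assumptions about how your instance labels executions.

```javascript
// Code node sketch ("Run Once for All Items"): keep only failed executions
// that have not been retried yet. `status` and `retried` are assumed field
// names -- adjust them to match your instance's data.
const failed = $input.all().filter((item) => {
  const execution = item.json;
  return execution.status === 'error' && !execution.retried;
});
return failed;
```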
by ankitkansaldev
# 🎬 TikTok Influencer Scraper (URL Input) via Bright Data + n8n & Sheets

A comprehensive n8n automation that scrapes TikTok influencer profiles using Bright Data's TikTok dataset and automatically saves detailed profile information to Google Sheets.

## 📋 Overview
This workflow provides an automated TikTok influencer data collection solution that scrapes comprehensive profile information and saves it to Google Sheets. Perfect for influencer marketing research, competitor analysis, social media monitoring, and marketing campaign planning.

## ✨ Key Features
- 📝 **Form-Based Input**: Simple web form to submit TikTok profile URLs
- 🤖 **Bright Data Integration**: Uses Bright Data's TikTok dataset for reliable scraping
- ⏳ **Status Monitoring**: Intelligent polling system to check scraping progress
- 🔄 **Retry Logic**: Automatic retry mechanism with 30-second intervals
- 📊 **Data Extraction**: Comprehensive profile data including engagement metrics
- 📈 **Google Sheets Storage**: Automatic data storage and organization
- ⚡ **Error Handling**: Built-in error handling and status reporting
- 🎯 **Custom Fields**: Configurable output fields for specific data needs

## 🎯 What This Workflow Does
### Input
- **Profile URLs**: TikTok profile URLs submitted through the web form
- **Custom Fields**: Configurable data fields for extraction
- **Country Settings**: Geo-targeting for accurate data collection

### Processing
1. **Form Submission**: User submits a TikTok profile URL through the web form
2. **API Trigger**: Sends profile data to Bright Data for scraping
3. **Status Polling**: Continuously checks scraping progress
4. **Wait & Retry**: Implements 30-second delays between status checks
5. **Data Retrieval**: Fetches complete profile data when ready
6. **Sheet Update**: Saves extracted data to Google Sheets
7. **Status Reporting**: Provides completion status and messages

### Output Data Points
| Field | Description | Example |
|-------|-------------|---------|
| Account ID | Unique TikTok account identifier | @username123 |
| Nickname | Display name on profile | "John Doe" |
| Biography | Profile bio/description | "Content creator & influencer" |
| Followers | Number of followers | 1,250,000 |
| Following | Number of accounts following | 500 |
| Likes | Total likes across all videos | 50,000,000 |
| Videos Count | Total number of videos posted | 1,200 |
| Profile URL | Direct link to TikTok profile | https://www.tiktok.com/@username |
| Profile Picture | Profile image URL | https://p16-sign-sg.tiktokcdn.com/... |
| Profile Picture HD | High-definition profile image | https://p16-sign-sg.tiktokcdn.com/... |
| Is Verified | Verification status | true/false |
| Bio Link | External link in bio | https://linktr.ee/username |
| Like Engagement Rate | Engagement rate based on likes | 5.2% |
| Comment Engagement Rate | Engagement rate based on comments | 2.1% |
| Top Videos | List of top performing videos | [video_objects] |
| Region | Geographic region | "US" |
| Is Under Age 18 | Age status indicator | true/false |

## 🚀 Setup Instructions
### Prerequisites
- n8n instance (self-hosted or cloud)
- Google account with Sheets access
- Bright Data account with TikTok dataset access
- Valid TikTok profile URLs for testing
- 10-15 minutes for setup

### Step 1: Import the Workflow
1. Copy the JSON workflow code from the provided file
2. In n8n: Workflows → + Add workflow → Import from JSON
3. Paste the JSON and click Import

### Step 2: Configure Bright Data
1. Set up Bright Data credentials:
   - In n8n: Credentials → + Add credential → HTTP Request Generic Credential
   - Name: "Bright Data API"
   - Authentication: Bearer Token
   - Token: your Bright Data API key
   - Test the connection
2. Configure the dataset:
   - Ensure you have access to the TikTok dataset (gd_l1villgoiiidt09ci)
   - Verify dataset permissions in the Bright Data dashboard
   - Check dataset limits and pricing

### Step 3: Configure Google Sheets Integration
1. Create a Google Sheet:
   - Go to Google Sheets
   - Create a new spreadsheet named "TikTok Influencer Data"
   - Create a sheet tab named "TikTok profile by url"
   - Copy the Sheet ID from the URL: https://docs.google.com/spreadsheets/d/SHEET_ID_HERE/edit
2. Set up Google Sheets credentials:
   - In n8n: Credentials → + Add credential → Google Sheets OAuth2 API
   - Complete the OAuth setup and test the connection
3. Prepare your data sheet with columns:
   - Column A: Account ID
   - Column B: Nickname
   - Column C: Biography
   - Column D: Followers
   - Column E: Following
   - Column F: Likes
   - Column G: Videos Count
   - Column H: Profile URL
   - Column I: Is Verified
   - Column J: Bio Link
   - Column K: Like Engagement Rate
   - Column L: Comment Engagement Rate
   - Column M: Region
   - Column N: Status
   - Column O: Message

### Step 4: Update Workflow Settings
1. Update API credentials:
   - Open the "Sends profile URLs to Bright Data to trigger scraping" node
   - Replace BRIGHT_DATA_API_KEY with your actual API key
   - Update the dataset ID if different
2. Update the Google Sheets nodes:
   - Open the "Google Sheets" node
   - Replace the document ID: 1OeqtCFm4Wek9DI5YFOWQXTpQJS-SJxC10iAPKEKkmiY
   - Select your Google Sheets credential
   - Choose the correct sheet/tab name
3. Configure form settings:
   - Open the "Search by Profile URL" node
   - Customize the form title and field labels as needed
   - Note the webhook URL for form access

### Step 5: Test & Activate
1. Add test profiles:
   - Access the form using the webhook URL
   - Submit 1-2 TikTok profile URLs for testing
   - Use full URLs (e.g., https://www.tiktok.com/@username)
2. Test the workflow:
   - Submit a test profile through the form
   - Monitor the execution in n8n
   - Verify data appears in the Google Sheet
   - Check for any error messages

## 📖 Usage Guide
### Submitting TikTok Profiles
1. Navigate to your form URL (found in the Form Trigger node)
2. Enter a TikTok profile URL in the format: https://www.tiktok.com/@username
3. Click Submit to start the scraping process
4. Wait for processing (typically 1-3 minutes)

### Understanding the Process
The workflow follows this sequence:
1. **Form Submission** → profile URL captured
2. **API Trigger** → scraping job submitted to Bright Data
3. **Status Polling** → checks every 30 seconds whether data is ready
4. **Data Retrieval** → fetches complete profile information
5. **Sheet Update** → saves data to Google Sheets

### Monitoring Progress
- Check n8n execution logs for real-time status
- The Bright Data dashboard shows scraping progress
- Google Sheets will populate when data is ready
- The Status column shows "ready" when complete
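As a rough illustration of the status-polling step above, the check-and-wait loop might resemble the sketch below. The endpoint path and response fields follow Bright Data's dataset API conventions but are assumptions here; verify them against your Bright Data dashboard.

```javascript
// Sketch: poll Bright Data until the snapshot is ready (assumed endpoint/fields).
const snapshotId = 'YOUR_SNAPSHOT_ID'; // returned by the trigger call
const apiKey = 'BRIGHT_DATA_API_KEY';

async function waitForSnapshot() {
  for (let attempt = 0; attempt < 20; attempt++) {
    const res = await fetch(
      `https://api.brightdata.com/datasets/v3/progress/${snapshotId}`,
      { headers: { Authorization: `Bearer ${apiKey}` } },
    );
    const body = await res.json();
    if (body.status === 'ready') return body; // data can now be retrieved
    await new Promise((resolve) => setTimeout(resolve, 30_000)); // 30-second retry interval
  }
  throw new Error('Snapshot not ready after 20 checks');
}
```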
### Reading the Results
Your Google Sheet will show:
- Complete TikTok profile information
- Engagement metrics and statistics
- Profile verification status
- Bio links and external connections
- Timestamp of data collection

## 🔧 Customization Options
### Adding More Data Points
Edit the JSON body in the "Sends profile URLs to Bright Data" node to include additional fields:

```json
"custom_output_fields": [
  "account_id",
  "nickname",
  "biography",
  "followers",
  "following",
  "likes",
  "videos_count",
  "language",
  "creation_time",
  "last_post_time",
  "avg_video_duration",
  "hashtags_used",
  "music_used"
]
```

### Modifying Input Parameters
Customize the scraping parameters:
- **Country targeting**: Change the "country" field in the input
- **Search limits**: Adjust the "limit_per_input" value
- **Discovery method**: Modify the "discover_by" parameter
- **Error handling**: Toggle the "include_errors" setting

### Batch Processing Multiple Profiles
To process multiple profiles simultaneously (see the sketch after the troubleshooting section below):
- Modify the input array in the API call
- Add multiple profile URLs in a single request
- Implement loop logic for processing results
- Add rate limiting between requests

### Custom Form Fields
Enhance the form with additional inputs:
1. Open the "Search by Profile URL" node
2. Add form fields for:
   - Country selection
   - Number of videos to analyze
   - Specific date ranges
   - Custom tags or categories

## 🚨 Troubleshooting
### Common Issues & Solutions
**"Bright Data connection failed"**
- Cause: Invalid API credentials or dataset access
- Solution: Verify the API key in the Bright Data dashboard; check dataset permissions

**"Profile not found or private"**
- Cause: Invalid TikTok URL or private profile
- Solution: Verify the profile URL format; ensure the profile is public

**"Google Sheets permission denied"**
- Cause: Incorrect credentials or sheet permissions
- Solution: Re-authenticate Google Sheets; check sheet sharing settings

**"Scraping timeout"**
- Cause: Profile data taking too long to process
- Solution: Increase the wait time or implement longer polling intervals

**"Invalid dataset ID"**
- Cause: Incorrect or expired dataset configuration
- Solution: Check the Bright Data dashboard for the correct dataset ID

**"Form submission failed"**
- Cause: Webhook configuration issues
- Solution: Verify the webhook URL and Form Trigger settings

### Advanced Troubleshooting
- **Check execution logs** in n8n for detailed error messages
- **Test individual nodes** by running them separately
- **Verify data formats**: ensure URLs are properly formatted
- **Monitor API limits**: check Bright Data usage quotas
- **Add error handling**: implement try-catch logic for robust operation
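Returning to the batch-processing customization above, a multi-profile trigger payload could look like the following sketch. The field names mirror the single-profile body used by this workflow, but verify them against your dataset's input schema.

```javascript
// Sketch: assumed request body for triggering several profiles in one job.
const inputs = [
  { url: 'https://www.tiktok.com/@creator_one', country: 'US' },
  { url: 'https://www.tiktok.com/@creator_two', country: 'US' },
];
// Pass `inputs` as the JSON body of the Bright Data trigger request,
// then loop over the returned results with the Split in Batches node.
```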
## 📊 Use Cases & Examples
### 1. Influencer Marketing Research
Goal: Identify and analyze potential influencers for campaigns
- Research influencers in specific niches
- Analyze engagement rates and audience size
- Compare multiple influencers for campaign selection
- Track influencer growth over time

### 2. Competitive Analysis
Goal: Monitor competitors' TikTok presence and performance
- Track competitor follower growth
- Analyze content strategies and engagement
- Monitor posting frequency and timing
- Identify trending content themes

### 3. Social Media Monitoring
Goal: Track brand mentions and user-generated content
- Monitor branded hashtag usage
- Track brand advocates and micro-influencers
- Analyze sentiment and engagement patterns
- Identify trending topics in your industry

### 4. Market Research Pipeline
Goal: Gather social media intelligence for business decisions
- Analyze target audience behavior
- Study content preferences and trends
- Generate reports for stakeholders
- Support marketing strategy development

## ⚙ Advanced Configuration
### Rate Limiting and Performance
To optimize for large-scale scraping:
- Adjust wait times between status checks
- Implement exponential backoff for retries
- Add batch processing for multiple profiles
- Monitor API usage to avoid limits

### Data Validation and Cleaning
Enhance data quality with validation:
- Add data type validation for numeric fields
- Implement URL format checking
- Clean and standardize text fields
- Add data completeness checks

### Integration with Business Tools
Connect the workflow to your existing systems:
- **CRM Integration**: Update customer records with influencer data
- **Slack Notifications**: Send alerts when new data is available
- **Database Storage**: Store data in PostgreSQL/MySQL for analysis
- **BI Tools**: Connect to Tableau/Power BI for visualization

### Webhook Integration
For real-time updates:
- Add webhook triggers for immediate profile checks
- Integrate with external systems via webhooks
- Create API endpoints for programmatic access
- Implement authentication for secure access

## 📈 Performance & Limits
### Expected Performance
- **Single Profile**: 30-60 seconds average processing time
- **Concurrent Requests**: 5-10 simultaneous (depends on Bright Data plan)
- **Data Accuracy**: 95%+ for public TikTok profiles
- **Success Rate**: 90%+ for accessible profiles
- **Daily Capacity**: 100-1,000 profiles (depends on rate limits)

### Resource Usage
- **Memory**: ~50 MB per execution
- **Storage**: Minimal (data stored in Google Sheets)
- **API Calls**: 3-5 Bright Data calls per profile (including status checks)
- **Bandwidth**: ~1-2 MB per profile scraped
- **Execution Time**: 1-2 minutes per profile

### Scaling Considerations
- **Rate Limiting**: Add delays for high-volume scraping
- **Error Handling**: Implement retry logic for failed requests
- **Data Validation**: Add checks for malformed profile data
- **Monitoring**: Track success/failure rates over time
- **Cost Optimization**: Monitor API usage to control costs

## 🤝 Support & Community
### Getting Help
- **n8n Community Forum**: community.n8n.io
- **Documentation**: docs.n8n.io
- **Bright Data Support**: contact through your dashboard
- **GitHub Issues**: report bugs and feature requests

### Contributing
- Share improvements with the community
- Report issues and suggest enhancements
- Create variations for specific use cases
- Document best practices and lessons learned

## 📋 Quick Setup Checklist
### Before You Start
- ☐ n8n instance running (self-hosted or cloud)
- ☐ Google account with Sheets access
- ☐ Bright Data account with TikTok dataset access
- ☐ Valid TikTok profile URLs for testing
- ☐ 15 minutes for setup

### Setup Steps
- ☐ **Import Workflow** – copy the JSON and import it into n8n
- ☐ **Configure Bright Data** – set up API credentials and test
- ☐ **Create Google Sheet** – new sheet with the proper column structure
- ☐ **Set up Google Sheets credentials** – OAuth setup and test
- ☐ **Update workflow settings** – replace the sheet ID and API keys
- ☐ **Test with sample profiles** – submit 1-2 URLs and verify results
- ☐ **Activate workflow** – enable the Form Trigger for production use

### Ready to Use! 🎉
Your form URL: https://your-n8n-instance.com/form/[webhook-id]

**🎯 Happy TikTok Scraping!** This workflow provides a solid foundation for automated TikTok influencer data collection. Customize it to fit your specific needs and use cases for influencer marketing, competitive analysis, and social media research.
by Rui Borges
## Workflow Purpose
This workflow periodically checks a service's availability and sends an SMS notification if the service is down.

## High-Level Steps
1. **Schedule Trigger**: The workflow is triggered at a specified interval, such as every minute.
2. **HTTP Request**: An HTTP request is sent to the URL of the service being monitored.
3. **If**: The HTTP status code of the response is checked. If the status code is 200 (OK), the workflow ends. If the status code is not 200, indicating a potential issue, an SMS notification is sent using Twilio.

## Setup
Setting up this workflow is relatively straightforward and should only take a few minutes:
1. Create a new n8n workflow.
2. Add the nodes: Schedule Trigger, HTTP Request, If, and Twilio.
3. Configure the nodes:
   - **Schedule Trigger**: Specify the desired interval.
   - **HTTP Request**: Enter the URL of the service to be monitored.
   - **If**: Set the condition to check for a status code other than 200 (a Code-node alternative is sketched below the notes).
   - **Twilio**: Enter the Twilio account credentials and the phone numbers for sending and receiving the SMS notification.
4. Connect the nodes as shown in the workflow diagram.
5. Activate the workflow: save it and switch it on.

## Additional Notes
The workflow can be customized by changing the interval, the URL, the Twilio credentials, and the SMS message. This workflow is a simple example, and more complex workflows can be created to meet specific needs.
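If you prefer a Code node over the If node, the same status check could be sketched as follows. It assumes the HTTP Request node is configured to return the full response (so a `statusCode` field is present on the incoming item); that setting name is an assumption to verify in your n8n version.

```javascript
// Sketch: flag the service as down when the HTTP status is anything but 200.
// Assumes the HTTP Request node returns the full response, including statusCode.
const { statusCode } = $input.first().json;
return [{ json: { down: statusCode !== 200, statusCode } }];
```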
by Elie Kattar
# Multi-Channel Customer Support Automation Suite

Transform your customer support operations with this enterprise-grade automation workflow that unifies, categorizes, and intelligently routes support tickets from multiple channels.

## 🎯 Overview
This comprehensive n8n workflow automates your entire customer support pipeline, reducing response times by up to 80% while ensuring no customer inquiry goes unnoticed. It seamlessly integrates email, web forms, and webhooks into a single, intelligent support system that works 24/7.

## 💡 Key Benefits
- **Unified Inbox**: Consolidate support requests from email, web forms, chat, and social media into one streamlined workflow
- **Instant Response**: Automatically acknowledge tickets with intelligent, category-specific responses within seconds
- **Smart Routing**: Use AI-powered categorization to route tickets to the right team instantly
- **Priority Detection**: Automatically identify and escalate urgent issues and VIP customers
- **Team Collaboration**: Real-time Slack notifications with color-coded priority alerts
- **Zero Setup Hassle**: Pre-configured with industry best practices and ready to deploy

## 🚀 Core Features
### Intelligent Ticket Processing
(A keyword-matching sketch follows the use cases below.)
- Automatic categorization into billing, technical, account, feature requests, and complaints
- Sentiment analysis to detect frustrated customers
- Priority assignment based on keywords, customer status, and urgency indicators
- Custom tagging for easy tracking and reporting

### Multi-Channel Integration
- IMAP email monitoring for support inboxes
- Webhook endpoints for web forms and chat widgets
- Expandable architecture for social media channels
- Unified message format regardless of source

### Automated Response System
- Category-specific email templates
- Personalized responses with ticket IDs
- Smart logic to skip auto-responses for urgent/negative cases
- Customizable templates for your brand voice

### Team Notifications & Escalation
- Real-time Slack alerts with full ticket context
- Color-coded priorities (red/urgent, orange/high, green/normal)
- One-click actions to view or claim tickets
- Automatic escalation rules for time-sensitive issues

### CRM & Analytics Ready
- Pre-configured for major CRM systems (Zendesk, HubSpot, Salesforce)
- Comprehensive logging for performance metrics
- Error handling with admin notifications
- Built-in success/failure tracking

## 📊 Use Cases
- **SaaS Companies**: Handle subscription issues, technical bugs, and feature requests with specialized routing to product, engineering, and billing teams.
- **E-commerce**: Manage order inquiries, shipping issues, and returns while maintaining high customer satisfaction scores.
- **Agencies**: Provide white-label support services with customizable branding and client-specific routing rules.
- **Startups**: Scale support operations without hiring additional staff by automating 70% of routine inquiries.
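The keyword-matching sketch promised above might look like this in a Code node. The category names and keyword lists here are illustrative assumptions, not the template's exact rules:

```javascript
// Code node sketch ("Run Once for Each Item"): keyword-based categorization.
// Categories and keywords below are illustrative only -- edit to taste.
const RULES = {
  billing: ['invoice', 'refund', 'charge', 'payment'],
  technical: ['error', 'bug', 'crash', 'broken'],
  account: ['password', 'login', 'access'],
};

const text = `${$json.subject} ${$json.body}`.toLowerCase();
let category = 'general';
for (const [name, keywords] of Object.entries(RULES)) {
  if (keywords.some((keyword) => text.includes(keyword))) {
    category = name;
    break;
  }
}
return [{ json: { ...$json, category } }];
```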
## 🛠️ Technical Specifications
- **Channels Supported**: Email (IMAP), web forms, webhooks; expandable to social media
- **Response Time**: < 2 seconds for auto-responses
- **Categorization Accuracy**: 85%+ with keyword matching, 95%+ with AI enhancement
- **Scalability**: Handles 1,000+ tickets/day on standard n8n infrastructure
- **Integration Ready**: Slack, all major CRMs, SMTP, custom APIs

## 💰 ROI & Impact
Typical results from implementing this workflow:
- **80% reduction** in first response time
- **60% decrease** in ticket handling time
- **40% of tickets** resolved automatically
- **95% customer satisfaction** for auto-responded tickets
- **Save 20+ hours/week** of manual ticket sorting

## 🎁 What's Included
- Complete n8n workflow JSON (ready to import)
- 5 pre-configured auto-response templates
- Intelligent categorization rules for common support scenarios
- Priority detection algorithms
- Slack notification formatting
- Error handling and recovery logic
- Setup documentation and customization guide

## 🔧 Requirements
- n8n instance (self-hosted or cloud)
- Email account with IMAP/SMTP access
- Slack workspace (for notifications)
- CRM system (optional but recommended)

## 🚦 Quick Setup
1. Import the workflow JSON
2. Configure email and Slack credentials
3. Customize the auto-response templates
4. Connect your CRM
5. Go live in under 30 minutes

Perfect for businesses handling 50-5,000 support tickets monthly who want to deliver exceptional customer service while reducing operational costs.
by Ron
## Objective
In industry and production environments, machine data is often available in databases. That might be sensor data, such as temperature or pressure, or simply binary up/down information. This sample flow reads machine data and sends an alert to your SIGNL4 team when the machine is down. When the machine is up again, the alert in SIGNL4 is closed automatically.

## Setup
We simulate the machine data using a Notion table. Un-checking the Up box simulates a machine-down event. At regular intervals, n8n checks the database for down items. If such an item is found, an alert is sent using SIGNL4 and the item in Notion is updated (so it is not read again). Status updates from SIGNL4 (acknowledgement, close, annotation, escalation, etc.) are received via webhook, and the Notion item is updated accordingly.

This is how the alert looks in the SIGNL4 app. The flow can easily be adapted to other database monitoring scenarios.
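For reference, raising and auto-resolving the alert might look like the sketch below. The webhook URL format and the `X-S4-*` parameters follow SIGNL4's documented conventions, but treat them as assumptions and confirm against your team's SIGNL4 documentation:

```javascript
// Sketch: raise or close a SIGNL4 alert via its inbound webhook (assumed API).
const TEAM_SECRET = 'YOUR_SIGNL4_TEAM_SECRET';

async function sendMachineAlert(machineId, isDown) {
  await fetch(`https://connect.signl4.com/webhook/${TEAM_SECRET}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      Title: `Machine ${machineId} is ${isDown ? 'down' : 'up'}`,
      'X-S4-ExternalID': machineId,               // ties the up/down events together
      'X-S4-Status': isDown ? 'new' : 'resolved', // "resolved" closes the open alert
    }),
  });
}
```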
by Agus Narestha
# 🔒 SSL Certificate Monitoring & Expiry Alert with Spreadsheet [FREE APIs]

## ✅ What This Workflow Does
This n8n template automatically monitors SSL certificates of websites listed in a Google Sheet and sends email alerts if any are expiring within 14 days. It helps ensure you avoid downtime, security issues, and trust warnings due to expired certificates.

## 🧩 Key Features
- 📅 **Weekly Automation**: Runs every Monday at 7:00 AM (configurable).
- 📄 **Google Sheets Integration**: Fetches and updates data in a spreadsheet.
- 🔍 **SSL Check via API**: Uses ssl-checker.io to get certificate details.
- ⚠️ **SSL Expiry Filter**: Identifies certificates expiring within 14 days.
- 📧 **Email Alerts**: Sends notifications for certificates close to expiration.

## 📂 Input Spreadsheet Format
Your Google Sheet should have the following columns:

| No | Name | Link | SSL Issued On | SSL Expired On | SSL Status |
|----|------|------|---------------|----------------|------------|
| 1 | Example Site | https://example.com | 2024-07-01 | 2025-07-01 | Valid |
| 2 | My Blog | https://myblog.org | 2024-07-05 | 2024-07-20 | Expiring |

Each row should include a valid website URL in the Link column.

## 🛠️ How It Works
1. **Scheduled Trigger**: Executes weekly (Monday, 7:00 AM).
2. **Fetch Website List**: Reads all website entries from the Google Sheet.
3. **Check SSL Certificates**: Uses the ssl-checker.io API to retrieve certificate details for each website.
4. **Update Spreadsheet**: Writes the "Issued On" and "Expired On" fields back to the spreadsheet.
5. **Evaluate SSL Expiry**: Filters for certificates expiring within 14 days (a sketch of this step appears at the end of this description).
6. **Check Condition**: Determines whether to send alerts based on the filtered results.
7. **Send Email Alert**: Notifies via email if any certificates are expiring soon.

## 📬 Example Email Output
Subject: ⚠️ ALERT!! SSL EXPIRED

SSL certificates expiring soon:
- example.com (expires in 5 days)
- anotherdomain.net (expires in 3 days)

## 🧰 Setup Requirements
- A Google Sheet with the correct columns and website links.
- SMTP credentials to send alert emails.
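The expiry-filter step (step 5 above) might be sketched as a Code node like this. The "SSL Expired On" field name matches the spreadsheet column; parsing it with `new Date()` assumes the column holds an ISO-style date string.

```javascript
// Code node sketch: keep only certificates expiring within the next 14 days.
const MS_PER_DAY = 24 * 60 * 60 * 1000;
const expiringSoon = $input.all().filter((item) => {
  const expires = new Date(item.json['SSL Expired On']);
  const daysLeft = Math.ceil((expires.getTime() - Date.now()) / MS_PER_DAY);
  return daysLeft >= 0 && daysLeft <= 14;
});
return expiringSoon;
```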
by explorium
# HubSpot Contact Enrichment with Explorium Template

Download the following JSON file and import it into a new n8n workflow: hubspot\_flow.json

## Overview
This n8n workflow monitors your HubSpot instance for newly created contacts and automatically enriches them with additional contact information. When a contact is created, the workflow:
1. Detects the new contact via a HubSpot webhook trigger
2. Retrieves recent contact details from HubSpot
3. Matches the contact against Explorium's database using name, company, and email
4. Enriches the contact with professional emails and phone numbers
5. Updates the HubSpot contact record with the discovered information

This automation ensures your sales and marketing teams have complete contact information, improving outreach success rates and data quality.

## Key Features
- **Real-time Webhook Trigger**: Instantly processes new contacts as they're created
- **Intelligent Matching**: Uses multiple data points (name, company, email) for accurate matching
- **Comprehensive Enrichment**: Adds both professional and work emails, plus phone numbers
- **Batch Processing**: Efficiently handles multiple contacts to optimize API usage
- **Smart Data Mapping**: Intelligently maps multiple emails and phone numbers
- **Profile Enrichment**: Optional additional enrichment for deeper contact insights
- **Error Resilience**: Continues processing other contacts if some fail to match

## Prerequisites
Before setting up this workflow, ensure you have:
- An n8n instance (self-hosted or cloud)
- A HubSpot account with:
  - Developer API access (for webhooks)
  - A Private App or OAuth2 app created
  - Contact object permissions (read/write)
- Explorium API credentials (Bearer token) – get an Explorium API key
- An understanding of HubSpot contact properties

## HubSpot Requirements
### Required Contact Properties
The workflow uses these HubSpot contact properties:
- firstname – contact's first name
- lastname – contact's last name
- company – associated company name
- email – primary email (read and updated)
- work_email – work email (updated by the workflow)
- phone – phone number (updated by the workflow)

### API Access Setup
1. Create a Private App in HubSpot:
   - Navigate to Settings → Integrations → Private Apps
   - Create a new app with Contact read/write scopes
   - Copy the Access Token
2. Set up webhooks (for the Developer API):
   - Create an app in the HubSpot Developers portal
   - Configure a webhook for contact.creation events
   - Note the App ID and Developer API Key

### Custom Properties (Optional)
Consider creating custom properties for:
- Multiple email addresses
- Mobile vs. office phone numbers
- Data enrichment timestamps
- Match confidence scores
## Installation & Setup
### Step 1: Import the Workflow
1. Copy the workflow JSON from the template
2. In n8n: navigate to Workflows → Add Workflow → Import from File
3. Paste the JSON and click Import

### Step 2: Configure HubSpot Developer API (Webhook)
1. Click on the HubSpot Trigger node
2. Under Credentials, click Create New
3. Enter your HubSpot Developer credentials:
   - App ID: from your HubSpot app
   - Developer API Key: from your developer account
   - Client Secret: from your app settings
4. Save as "HubSpot Developer account"

### Step 3: Configure HubSpot App Token
1. Click on the HubSpot Recently Created node
2. Under Credentials, click Create New (App Token)
3. Enter your Private App access token
4. Save as "HubSpot App Token account"
5. Apply the same credentials to the Update HubSpot node

### Step 4: Configure Explorium API Credentials
1. Click on the Explorium Match Prospects node
2. Under Credentials, click Create New (HTTP Header Auth)
3. Configure the authentication:
   - Name: Authorization
   - Value: Bearer YOUR_EXPLORIUM_API_TOKEN
4. Save as "Header Auth Connection"
5. Apply to all Explorium nodes:
   - Explorium Enrich Contacts Information
   - Explorium Enrich Profiles

### Step 5: Configure Webhook Subscription
In the HubSpot Developers portal:
1. Go to your app's webhook settings
2. Add a subscription for contact.creation events
3. Set the target URL from the HubSpot Trigger node
4. Activate the subscription

### Step 6: Activate the Workflow
1. Save the workflow
2. Toggle the Active switch to ON
3. The webhook is now listening for new contacts

## Node Descriptions
- **HubSpot Trigger**: Webhook that fires when new contacts are created
- **HubSpot Recently Created**: Fetches details of recently created contacts
- **Loop Over Items**: Processes contacts in batches of 6
- **Explorium Match Prospects**: Finds the matching person in Explorium's database
- **Filter**: Validates successful matches
- **Extract Prospect IDs**: Collects matched prospect identifiers
- **Enrich Contacts Information**: Fetches emails and phone numbers
- **Enrich Profiles**: Gets additional profile data (optional)
- **Merge**: Combines all enrichment results
- **Split Out**: Separates individual enriched records
- **Update HubSpot**: Updates the contact with the new information

## Data Mapping Logic
The workflow maps Explorium data to HubSpot properties:

| Explorium Data | HubSpot Property | Notes |
|----------------|------------------|-------|
| professions_email | email | Primary professional email |
| emails[].address | work_email | All email addresses joined |
| phone_numbers[].phone_number | phone | All phones joined with commas |
| mobile_phone | phone (fallback) | Used if no other phones found |

### Data Processing
The workflow handles complex data scenarios:
- **Multiple emails**: Joins all discovered emails with commas
- **Phone numbers**: Combines all phone numbers into a single field
- **Missing data**: Uses "null" as a placeholder for empty fields
- **Name parsing**: Cleans sample data and special characters
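As a sketch of the data-processing rules just described, the mapping step might look like this in a Code node. The field names follow the mapping table above but are approximations of the actual node expressions:

```javascript
// Code node sketch ("Run Once for Each Item"): collapse Explorium arrays
// into the single-string fields HubSpot expects.
const data = $json.data ?? {};
const workEmail =
  (data.emails ?? []).map((e) => e.address).join(', ') || 'null';
const phones = (data.phone_numbers ?? []).map((p) => p.phone_number);
const phone = phones.length
  ? phones.join(', ')
  : (data.mobile_phone ?? 'null'); // mobile number as fallback
return [
  {
    json: {
      email: data.professions_email ?? 'null',
      work_email: workEmail,
      phone,
    },
  },
];
```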
## Usage & Operation
### Automatic Processing
Once activated:
1. Every new contact triggers the webhook immediately
2. The contact is enriched within seconds
3. The HubSpot record is updated automatically
4. The process repeats for each new contact

### Manual Testing
To test the workflow:
1. Use the pinned test data in the HubSpot Trigger node, or
2. Create a test contact in HubSpot
3. Monitor the execution in n8n
4. Verify the contact was updated in HubSpot

### Monitoring Performance
Track workflow health:
- Go to Executions in n8n
- Filter by this workflow
- Monitor success rates
- Review any failed executions
- Check webhook delivery in HubSpot

## Troubleshooting
### Common Issues
**Webhook not triggering**
- Verify the webhook subscription is active in HubSpot
- Check the webhook URL is correct and accessible
- Ensure the workflow is activated in n8n
- Test webhook delivery in the HubSpot developers portal

**Contacts not matching**
- Verify the contact has firstname, lastname, and company
- Check for typos or abbreviations in company names
- Some individuals may not be in Explorium's database
- Email matching improves accuracy significantly

**Updates failing in HubSpot**
- Check the API token has contact write permissions
- Verify the property names exist in HubSpot
- Ensure rate limits haven't been exceeded
- Check for validation rules on properties

**Missing enrichment data**
- Not all prospects have all data types
- Phone numbers may be less available than emails
- Profile enrichment is optional and may not always return data

### Error Handling
Built-in error resilience:
- Failed matches don't block other contacts
- Each batch processes independently
- Partial enrichment is possible
- All errors are logged for review

### Debugging Tips
- **Check webhook logs**: HubSpot shows delivery attempts
- **Review executions**: n8n logs show detailed error messages
- **Test with pinned data**: use the sample data for isolated testing
- **Verify API responses**: check that the Explorium API returns the expected data

## Best Practices
### Data Quality
- **Complete contact records**: Ensure name and company are populated
- **Standardize company names**: Use official names, not abbreviations
- **Include existing emails**: Improves match accuracy
- **Regular data hygiene**: Clean up test and invalid contacts

### Performance Optimization
- **Batch size**: 6 is optimal for rate limits
- **Webhook reliability**: Monitor delivery success
- **API quotas**: Track usage in both platforms
- **Execution history**: Regularly clean old executions

### Compliance & Privacy
- **GDPR compliance**: Ensure a lawful basis for enrichment
- **Data minimization**: Only enrich necessary fields
- **Access controls**: Limit who can modify enriched data
- **Audit trail**: Document enrichment for compliance

## Customization Options
### Additional Enrichment
Extend with more Explorium data:
- Job titles and departments
- Social media profiles
- Professional experience
- Skills and interests
- Company information

### Enhanced Processing
Add workflow logic for:
- Lead scoring based on enrichment
- Routing based on data quality
- Notifications for high-value matches
- Custom field mapping

### Integration Extensions
Connect to other systems:
- Sync enriched data to a CRM
- Trigger marketing automation
- Update a data warehouse
- Send notifications to Slack

## API Considerations
### HubSpot Limits
- **API calls**: Monitor daily limits
- **Webhook payload**: Max 200 contacts per trigger
- **Rate limits**: 100 requests per 10 seconds
- **Property limits**: Max 1,000 custom properties

### Explorium Limits
- **Match API**: Batched for efficiency
- **Enrichment calls**: Two parallel enrichments
- **Rate limits**: Based on your plan
- **Data freshness**: Real-time matching

## Architecture Considerations
This workflow integrates with:
- HubSpot workflows and automation
- Marketing campaigns and sequences
- Sales engagement tools
- Reporting and analytics
- Other enrichment services

## Security Best Practices
- **Webhook validation**: Verify requests are from HubSpot
- **Token security**: Rotate API tokens regularly
- **Access control**: Limit workflow modifications
- **Data encryption**: All API calls use HTTPS
- **Audit logging**: Track all enrichments

## Advanced Configuration
### Custom Field Mapping
Modify the Update HubSpot node to map to custom properties:

```javascript
// Example custom mapping
{
  "custom_mobile": "{{ $json.data.mobile_phone }}",
  "custom_linkedin": "{{ $json.data.linkedin_url }}",
  "enrichment_date": "{{ $now.toISO() }}"
}
```
### Conditional Processing
Add logic to process only certain contacts:
- Filter by contact source
- Check for specific properties
- Validate email domains
- Exclude test contacts

## Support Resources
For assistance:
- **n8n issues**: Check the n8n documentation and forums
- **HubSpot API**: Reference the HubSpot developers documentation
- **Explorium API**: Contact Explorium support
- **Webhook issues**: Use HubSpot's webhook testing tools
by Cheng Siong Chin
## How It Works
This workflow automates comprehensive real estate investment analysis by orchestrating specialized AI agents to evaluate property data, market trends, and financial metrics. Designed for real estate investors, portfolio managers, and property analysts managing multiple properties or evaluating acquisition opportunities, it eliminates the manual research and analysis that typically requires days of work across multiple data sources.

The system aggregates data from real estate APIs, market databases, and local statistics, then deploys specialized agents: performance analysis evaluates ROI and cash flow, recommendation engines identify optimal properties, market analysis assesses location trends, sentiment analysis mines reviews and local feedback, and workflow tools calculate financial projections. An orchestrator coordinates these agents to generate consolidated investment reports with property rankings, risk assessments, and portfolio recommendations. Results populate Google Sheets dashboards and trigger email notifications, transforming weeks of analysis into automated insights delivered in hours.

## Setup Steps
1. Configure real estate API credentials (Zillow/Realtor.com)
2. Add market data API keys for local statistics and demographics
3. Input NVIDIA API keys for all OpenAI Model nodes
4. Set the OpenAI API key in the Team Collaboration Agent/Orchestrator
5. Configure Calculator Tool parameters for financial projections
6. Connect Google Sheets and specify the portfolio tracking spreadsheet ID
7. Set up Gmail credentials and specify recipient addresses for reports

## Prerequisites
NVIDIA API access, OpenAI API key, real estate data API subscriptions

## Use Cases
Multi-property portfolio analysis, acquisition opportunity screening

## Customization
Adjust investment criteria thresholds, add custom financial metrics

## Benefits
Reduces analysis time by 90%, evaluates unlimited properties simultaneously
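To make the Calculator Tool step more concrete, a minimal financial projection of the kind it might compute is sketched below. All figures and the formula choice (cap rate from net operating income) are illustrative assumptions, not outputs of the workflow:

```javascript
// Sketch: simple annual cash-flow and cap-rate projection (illustrative only).
function annualMetrics({ price, monthlyRent, monthlyCosts }) {
  const noi = (monthlyRent - monthlyCosts) * 12; // net operating income
  return { capRate: noi / price, annualCashFlow: noi };
}

console.log(annualMetrics({ price: 350_000, monthlyRent: 2_400, monthlyCosts: 900 }));
// -> { capRate: ~0.051, annualCashFlow: 18000 }
```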
by Marketing Canopy
# UTM Link Creator & QR Code Generator with Scheduled Google Analytics Reports

This workflow enables marketers to generate UTM-tagged links, convert them into QR codes, and automate performance tracking in Google Analytics with scheduled reports every 7 days. This solution helps monitor traffic sources from different marketing channels and optimize campaign performance based on analytics data.

## Prerequisites
Before implementing this workflow, ensure you have the following:
1. **Google Analytics 4 (GA4) Account & Access**
   - A GA4 property set up.
   - Access to the GA4 Data API to schedule performance tracking. Refer to the Google Analytics Data API Overview for more information.
2. **Airtable Account & API Key**
   - An Airtable base to store UTM links, QR codes, and analytics data.
   - An Airtable API key from your Account Settings. Detailed instructions are available in the Airtable API Authentication Guide.

## Step-by-Step Guide to Setting Up the Workflow
### 1. Generate UTM Links
Create a form or interface to input:
- **Base URL** (e.g., https://example.com)
- **Campaign Name** (utm_campaign)
- **Source** (utm_source)
- **Medium** (utm_medium)
- **Term** (optional: utm_term)
- **Content** (optional: utm_content)

Append the UTM parameters to generate a trackable URL (a sketch appears at the end of this guide).

### 2. Store UTM Links & QR Codes in Airtable
Set up an Airtable base with the following columns:
- **UTM Link**
- **QR Code**
- **Campaign Name**
- **Source**
- **Medium**
- **Date Created**

Adjust as needed based on your tracking requirements. For guidance on setting up your Airtable base and using the API, refer to the Airtable Web API Documentation.

### 3. Convert UTM Links to QR Codes
Use a QR code generator API (e.g., goqr.me, qrserver.com) to generate QR codes for each UTM link and store them in Airtable.

### 4. Schedule Google Analytics Performance Reports (Every 7 Days)
Use the Google Analytics Data API to pull weekly performance reports based on UTM parameters. Extract key metrics such as:
- Sessions
- Users
- Bounce Rate
- Conversions
- Revenue (if applicable)

Store the data in Airtable for tracking and analysis, and adjust the timeframe as needed. For more details on accessing and using the Google Analytics Data API, consult the Google Analytics Data API Overview.

## Benefits of This Workflow
- ✅ **Track Marketing Campaigns**: Easily monitor which channels drive traffic.
- ✅ **Automate QR Code Creation**: Seamless integration of UTM links with QR codes.
- ✅ **Scheduled Google Analytics Reports**: No manual reporting—everything runs automatically.
- ✅ **Improve Data-Driven Decisions**: Optimize ad spend and marketing strategies based on performance insights.
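The sketch referenced in step 1 shows one way to build the UTM link and a QR code image URL for it. The qrserver.com endpoint is the public goqr.me API mentioned in step 3; the helper function itself is illustrative:

```javascript
// Sketch: build a UTM-tagged link, then a QR code image URL for it.
function buildUtmLink(baseUrl, { campaign, source, medium, term, content }) {
  const url = new URL(baseUrl);
  url.searchParams.set('utm_campaign', campaign);
  url.searchParams.set('utm_source', source);
  url.searchParams.set('utm_medium', medium);
  if (term) url.searchParams.set('utm_term', term);           // optional
  if (content) url.searchParams.set('utm_content', content);  // optional
  return url.toString();
}

const link = buildUtmLink('https://example.com', {
  campaign: 'spring_sale',
  source: 'newsletter',
  medium: 'email',
});
const qrCodeUrl =
  `https://api.qrserver.com/v1/create-qr-code/?size=300x300&data=${encodeURIComponent(link)}`;
```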
by Ron
Get weather alerts on your mobile phone via push, SMS, or voice call.

This flow gets weather information every morning and sends an alert to your SIGNL4 on-call team. For example, you can send out weather alerts for freezing temperatures, snow, rain, hail storms, hot weather, etc. The flow also supports automatic alert resolution: if the temperature goes up again, for example, the alert is closed automatically in the app.

Use cases:
- Dispatch snow removal teams
- Inform car dealers to protect the cars outside in case of hail storms
- Set sails if there are high winds
- And much more...

The flow can easily be adapted to other weather warnings, like rain, hail storms, etc.
by Airtop
## About the Automation
Staying on top of competitor pricing changes can be a full-time job. Manual price tracking is time-consuming and prone to errors, especially when dealing with complex pricing structures and multiple subscription tiers. Paid competitor price monitoring tools like Competera, Visualping, and Fluxguard can be expensive. What if you could automate this process and get instant alerts when competitors adjust their pricing?

## How to Easily Monitor Competitor Pricing
With this automation, you'll learn how to set up an automated price monitoring system using Airtop's built-in node in n8n. By the end, your system will automatically track competitor pricing changes and notify you of any modifications.

## What You'll Need
- A free Airtop API key
- A Google Sheets account with a copy of this sheet
- URLs of competitors' pricing pages

## Understanding the Process
This automation continuously monitors competitor pricing pages and compares them against your baseline data. The workflow:
1. Tracks all different pricing plans (monthly, yearly, etc.)
2. Monitors feature changes across different tiers
3. Detects and logs pricing structure modifications
4. Alerts you via Slack when changes are detected

## Setting Up Your Automation
We've created a ready-to-use blueprint for seamless price monitoring. Here's how to get started:
1. Connect your Google Sheets
2. Set up your Airtop API connection
3. Define the update frequency

## Customization Options
Enhance the basic template with these popular modifications:
- Add other notification channels (email, Telegram, etc.)
- Include feature comparison tracking
- Set up threshold-based alerts for significant price changes
- Track historical pricing trends

## Real-World Applications
**Case Study 1**: A B2B SaaS company can use this automation to track competitors' pricing changes. When they identify a market-wide pricing shift, they can adjust their strategy proactively within minutes.

**Case Study 2**: An online e-commerce retailer automates monitoring of 100+ competitor products, maintaining optimal pricing positions and increasing profit margins.

## Best Practices
To ensure accurate tracking:
- Include detailed baseline data for each pricing tier
- Specify both monthly and annual pricing clearly
- List all features included in each plan
- Update your baseline data whenever you verify changes
- Include any promotional pricing or special offers
- Document currency and regional variations if applicable

Example structure in Google Sheets:
- Competitor: Acme Tools
  - Basic Plan
    - Monthly: $29
    - Annual: $290 ($24.17/mo)
    - Features: 5 users, 10GB storage, basic support
  - Pro Plan
    - Monthly: $79
    - Annual: $790 ($65.83/mo)
    - Features: 20 users, 50GB storage, priority support

## What's Next?
After setting up your price monitoring automation, consider the following:
- Creating automated competitive analysis reports
- Setting up market trend analysis
- Implementing automatic pricing recommendations
- Expanding monitoring to feature changes

Happy monitoring!
by Airtop
## About The Product Hunt Automation
Staying up-to-date with specific topics and launches on Product Hunt can be time-consuming. Manually checking the site multiple times a day interrupts your workflow and risks missing important launches. What if you could automatically get relevant launches delivered to your Slack workspace?

## How to Monitor Product Hunt
In this guide, you'll learn how to create a Product Hunt monitoring system using Airtop's built-in node in n8n. This automation will scan Product Hunt for your chosen topics and deliver the most relevant launches directly to Slack.

## What You'll Need
- A free Airtop API key
- A Slack workspace with permissions to add incoming webhooks

Estimated setup time: 5 minutes

## Understanding the Process
The Monitor Product Hunt automation uses Airtop's cloud browser capabilities to access Product Hunt and extract launch information. Here's what happens:
1. Airtop visits Product Hunt and navigates the page
2. It searches for and extracts up to 5 launches related to your chosen topic
3. The information is formatted and sent to your specified Slack channel

This process can run on your preferred schedule, ensuring you never miss relevant launches.

## Setting Up Your Automation
We've created a ready-to-use template that handles all the complex parts. Here's how to get started:
1. Connect your Airtop account by adding the API key you created
2. Connect your Slack account
3. Set your prompt in the Airtop node. For this example, we've set it to "Extract up to 5 launches related to AI products"
4. Choose your preferred monitoring schedule

## Customization Options
While our template works immediately, you might want to customize it for your specific needs:
- Adjust the prompt and the maximum number of launches to monitor
- Customize the Slack message format
- Change the monitoring frequency
- Add filters for particular keywords or companies

## Real-World Applications
Here's how teams can use this automation:
- A startup's engineering team could track trends in other products' tech stacks, helping them stay informed about potential issues and improvements.
- A product manager can track launches of competitor products, gathering valuable market insights and user feedback directly from the tech community around each launch.

## Best Practices
To get the most out of this automation:
- **Choose Specific Search Terms**: For more relevant results, instead of broad terms like "AI," use specific phrases like "machine learning for healthcare."
- **Optimize Scheduling**: When setting the monitoring frequency, consider your team's workflow. Running the scenario every 4 hours during working hours often provides a good balance between staying updated and avoiding notification fatigue.
- **Set Up Error Handling**: Enable n8n's error output to alert you if the automation encounters any issues with accessing Product Hunt or sending messages to Slack.
- **Regular Topic Review**: Schedule a monthly review of your monitored topics to ensure they're still relevant to your needs and adjust as necessary.

## What's Next?
Now that you've set up your Product Hunt monitor automation, you might be interested in:
- Creating a similar monitor for other tech websites
- Setting up automated content curation for your team's newsletter
- Building a competitive intelligence dashboard using web monitoring

Happy Automating!