by n8n Team
This template shows how you can take any event from any service, transform its data, and send an alert to your desired app. Specifically, this example monitors a Linear project for new bug submissions and sends a Slack notification to a channel only if a new bug is urgent. You can swap the Linear trigger for another task management app such as Jira or Asana, or adapt the workflow to an entirely different use case. Setup instructions are located inside the workflow template.
by Martech Mafia
Problem
Monitoring SEO performance from Google Search Console (GSC) manually is repetitive and prone to human error. For marketers or analysts managing multiple domains, checking reports manually and copying data into spreadsheets or databases is time-consuming. There is a strong need for an automated solution that collects, stores, and updates SEO metrics regularly for easier analysis and dashboarding.

Solution
This workflow automatically pulls performance metrics from Google Search Console — including queries, pages, CTR, impressions, positions, and devices — and stores them in a structured format inside a NocoDB table. It’s ideal for SEO specialists, marketing teams, or data analysts who need to automate SEO reporting and centralize data for analytics or dashboards (like Superset or Metabase).

Setup Instructions
1. Authorize your Google Search Console account: connect via OAuth2 (requires GSC API access).
2. Create a NocoDB table: define fields to match the GSC response: query (text), page (URL), device (text), clicks (number), impressions (number), ctr (percentage), position (number).
3. Add credentials in n8n: use credential nodes for both Google OAuth2 and the NocoDB API token.
4. Customize the schedule trigger: set the frequency (e.g., weekly) and adjust the domain/date range as needed.
5. Generalize domains: replace specific domains like martechmafia.net with your-domain.com before submission.

NocoDB Table Structure
The NocoDB table must match the fields coming from GSC's Search Analytics API. Here's a sample schema:

    {
      "query": "string",
      "page": "string",
      "device": "string",
      "clicks": "number",
      "impressions": "number",
      "ctr": "number",
      "position": "number"
    }
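To illustrate the mapping between the GSC response and the table above, here is a minimal sketch in the style of an n8n Code node. It assumes the Search Analytics request used the dimensions query, page, and device in that order; adjust the indexing if your request differs.

```javascript
// Minimal sketch: reshape GSC Search Analytics rows into records matching
// the NocoDB schema shown above. Assumes each incoming item has a `rows`
// array whose `keys` are ordered [query, page, device].
const records = [];

for (const item of $input.all()) {
  for (const row of item.json.rows ?? []) {
    records.push({
      json: {
        query: row.keys[0],
        page: row.keys[1],
        device: row.keys[2],
        clicks: row.clicks,
        impressions: row.impressions,
        ctr: row.ctr,           // fraction (0-1); multiply by 100 if your column expects %
        position: row.position,
      },
    });
  }
}

return records;
```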
by M Shehroz Sajjad
Monitor BeyondPresence video agent conversations in real-time to automatically score leads (0-100+) based on buying signals and send instant Slack alerts when hot opportunities or competitors are mentioned. This template helps sales teams prioritize leads immediately, never miss competitor mentions, and respond to high-intent prospects while they're still engaged.

How it works
- **Real-time webhook** processes each user message as it happens during calls
- **Scoring engine** analyzes for buying signals (+points) and objections (-points)
- **Competitor detection** instantly identifies when alternatives are mentioned
- **Smart routing** sends alerts to different Slack channels based on urgency
- **Hot leads** (70+ score) trigger immediate notifications with recommendations
- **Call summary (Optional)** provides final qualification score when conversation ends

Set up steps
1. Connect Slack OAuth2 - Use n8n's built-in Slack integration (no webhooks needed!)
2. Create Slack channels - Set up #sales-hot-leads, #sales-competitors, #sales-qualified
3. Add webhook to BeyondPresence - Copy URL from n8n to BeyondPresence Settings → Webhooks
4. Customize competitors - Edit the scoring node to add your specific competitor names
5. Adjust scoring weights (optional) - Tune point values for your sales process

Setup time: 10-15 minutes
Requirements: BeyondPresence account, Slack workspace admin access
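For orientation, here is a rough sketch of the kind of logic the scoring node applies. The keyword lists, point values, and field names below are illustrative placeholders, not the template's actual configuration.

```javascript
// Illustrative scoring logic: buying signals add points, objections subtract,
// and competitor mentions are flagged for routing to #sales-competitors.
const buyingSignals = { pricing: 20, budget: 20, 'how soon': 15, trial: 15, demo: 10 };
const objections    = { 'too expensive': -15, 'not interested': -20, 'maybe later': -10 };
const competitors   = ['competitor a', 'competitor b'];   // replace with your own names

return $input.all().map(item => {
  const text = (item.json.message || '').toLowerCase();

  let score = 0;
  for (const [phrase, points] of Object.entries(buyingSignals)) if (text.includes(phrase)) score += points;
  for (const [phrase, points] of Object.entries(objections))    if (text.includes(phrase)) score += points;

  return {
    json: {
      ...item.json,
      score,
      competitorMentioned: competitors.some(c => text.includes(c)),
      isHotLead: score >= 70,   // routed to #sales-hot-leads downstream
    },
  };
});
```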
by Rui Borges
Workflow Purpose
This workflow periodically checks a service's availability and sends an SMS notification if the service is down.

High-Level Steps
1. Schedule Trigger: The workflow is triggered at a specified interval, such as every minute.
2. HTTP Request: An HTTP request is sent to the URL of the service being monitored.
3. If: The HTTP status code of the response is checked. If the status code is 200 (OK), the workflow ends. If the status code is not 200, indicating a potential issue, an SMS notification is sent using Twilio.

Setup
Setting up this workflow is relatively straightforward and should only take a few minutes:
1. Create a new n8n workflow.
2. Add the nodes: Schedule Trigger, HTTP Request, If, and Twilio.
3. Configure the nodes:
   - Schedule Trigger: Specify the desired interval.
   - HTTP Request: Enter the URL of the service to be monitored.
   - If: Set the condition to check for a status code other than 200.
   - Twilio: Enter the Twilio account credentials and the phone numbers for sending and receiving the SMS notification.
4. Connect the nodes: Connect the nodes as shown in the workflow diagram.
5. Activate the workflow: Save the workflow and activate it.

Additional Notes
The workflow can be customized by changing the interval, the URL, the Twilio credentials, and the SMS message. This workflow is a simple example, and more complex workflows can be created to meet specific needs.
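If you would rather perform the check in a single Code node instead of the HTTP Request + If pair, a minimal sketch could look like the following. The URL is a placeholder, and the helper options used here (returnFullResponse, ignoreHttpStatusErrors) should be verified against your n8n version.

```javascript
// Minimal availability check in a Code node. The monitored URL is a placeholder.
const url = 'https://example.com/health';

const response = await this.helpers.httpRequest({
  url,
  method: 'GET',
  returnFullResponse: true,       // assumption: expose the status code
  ignoreHttpStatusErrors: true,   // assumption: don't throw on non-2xx
});

const isDown = response.statusCode !== 200;

// Downstream, an If node on `isDown` decides whether Twilio sends the SMS.
return [{ json: { url, statusCode: response.statusCode, isDown } }];
```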
by explorium
Salesforce Lead Enrichment with Explorium

Template
Download the following json file and import it to a new n8n workflow: salesforce_Workflow.json

Overview
This n8n workflow monitors your Salesforce instance for new leads and automatically enriches them with missing contact information. When a lead is created, the workflow:
1. Detects the new lead via Salesforce trigger
2. Matches the lead against Explorium's database using name and company
3. Enriches the lead with professional email addresses and phone numbers
4. Updates the Salesforce lead record with the discovered contact information

This automation ensures your sales team always has the most up-to-date contact information for new leads, improving reach rates and accelerating the sales process.

Key Features
- **Real-time Processing**: Triggers automatically when new leads are created in Salesforce
- **Intelligent Matching**: Uses lead name and company to find the correct person in Explorium's database
- **Contact Enrichment**: Adds professional emails, mobile phones, and office phone numbers
- **Batch Processing**: Efficiently handles multiple leads to optimize API usage
- **Error Handling**: Continues processing other leads even if some fail to match
- **Selective Updates**: Only updates leads that successfully match in Explorium

Prerequisites
Before setting up this workflow, ensure you have:
- n8n instance (self-hosted or cloud)
- Salesforce account with: OAuth2 API access enabled, Lead object permissions (read/write), API usage limits available
- Explorium API credentials (Bearer token) - Get explorium api key
- Basic understanding of Salesforce lead management

Salesforce Requirements

Required Lead Fields
The workflow expects these standard Salesforce lead fields:
- FirstName - Lead's first name
- LastName - Lead's last name
- Company - Company name
- Email - Will be populated/updated by the workflow
- Phone - Will be populated/updated by the workflow
- MobilePhone - Will be populated/updated by the workflow

API Permissions
Your Salesforce integration user needs:
- Read access to Lead object
- Write access to Lead object fields (Email, Phone, MobilePhone)
- API enabled on the user profile
- Sufficient API calls remaining in your org limits

Installation & Setup

Step 1: Import the Workflow
1. Copy the workflow JSON from the template
2. In n8n: Navigate to Workflows → Add Workflow → Import from File
3. Paste the JSON and click Import

Step 2: Configure Salesforce OAuth2 Credentials
1. Click on the Salesforce Trigger node
2. Under Credentials, click Create New
3. Follow the OAuth2 flow:
   - Client ID: From your Salesforce Connected App
   - Client Secret: From your Salesforce Connected App
   - Callback URL: Copy from n8n and add to your Connected App
4. Authorize the connection
5. Save the credentials as "Salesforce account connection"

Note: Use the same credentials for all Salesforce nodes in the workflow.
Step 3: Configure Explorium API Credentials
1. Click on the Match_prospect node
2. Under Credentials, click Create New (HTTP Header Auth)
3. Configure the header:
   - Name: Authorization
   - Value: Bearer YOUR_EXPLORIUM_API_TOKEN
4. Save as "Header Auth account"
5. Apply the same credentials to the Explorium Enrich Contacts Information node

Step 4: Verify Node Settings
- Salesforce Trigger: Trigger On: Lead Created; Poll Time: Every minute (adjust based on your needs)
- Salesforce Get Leads: Operation: Get All; Condition: CreatedDate = TODAY (fetches today's leads); Limit: 20 (adjust based on volume)
- Loop Over Items: Batch Size: 6 (optimal for API rate limits)

Step 5: Activate the Workflow
1. Save the workflow
2. Toggle the Active switch to ON
3. The workflow will now monitor for new leads every minute

Detailed Node Descriptions
- Salesforce Trigger: Polls Salesforce every minute for new leads
- Get Today's Leads: Retrieves all leads created today to ensure none are missed
- Loop Over Items: Processes leads in batches of 6 for efficiency
- Match Prospect: Searches Explorium for matching person using name + company
- Filter: Checks if a valid match was found
- Extract Prospect IDs: Collects all matched prospect IDs
- Enrich Contacts: Fetches detailed contact information from Explorium
- Merge: Combines original lead data with enrichment results
- Split Out: Separates individual enriched records
- Update Lead: Updates Salesforce with new contact information

Data Mapping
The workflow maps Explorium data to Salesforce fields as follows (see the sketch at the end of this template):

| Explorium Field   | Salesforce Field | Fallback Logic                  |
| ----------------- | ---------------- | ------------------------------- |
| emails[0].address | Email            | Falls back to professions_email |
| mobile_phone      | MobilePhone      | Falls back to phone_numbers[1]  |
| phone_numbers[0]  | Phone            | Falls back to mobile_phone      |

Usage & Monitoring

Automatic Operation
Once activated, the workflow runs automatically:
- Checks for new leads every minute
- Processes any leads created since the last check
- Updates leads with discovered contact information
- Continues running until deactivated

Manual Testing
To test the workflow manually:
1. Create a test lead in Salesforce
2. Click "Execute Workflow" in n8n
3. Monitor the execution to see each step
4. Verify the lead was updated in Salesforce

Monitoring Executions
Track workflow performance:
1. Go to Executions in n8n
2. Filter by this workflow
3. Review successful and failed executions
4. Check logs for any errors or issues

Troubleshooting

Common Issues

No leads are being processed
- Verify the workflow is activated
- Check Salesforce API limits haven't been exceeded
- Ensure new leads have FirstName, LastName, and Company populated
- Confirm OAuth connection is still valid

Leads not matching in Explorium
- Verify company names are accurate (not abbreviations)
- Check that first and last names are properly formatted
- Some individuals may not be in Explorium's database
- Try testing with known companies/contacts

Contact information not updating
- Check Salesforce field-level security
- Verify the integration user has edit permissions
- Ensure Email, Phone, and MobilePhone fields are writeable
- Check for validation rules blocking updates

Authentication errors
- Salesforce: Re-authorize OAuth connection
- Explorium: Verify Bearer token is valid and not expired
- Check API quotas haven't been exceeded

Error Handling
The workflow includes built-in error handling:
- Failed matches don't stop other leads from processing
- Each batch is processed independently
- Failed executions are logged for review
- Partial successes are possible (some leads updated, others skipped)

Best Practices

Data Quality
- Ensure complete lead data: FirstName, LastName, and Company should be populated
- Use full company names: "Microsoft Corporation" matches better than "MSFT"
- Standardize data entry: consistent formatting improves match rates

Performance Optimization
- Adjust batch size: lower if hitting API limits, higher for efficiency
- Modify polling frequency: every minute for high volume, less frequent for lower volume
- Set appropriate limits: balance between processing speed and API usage

Compliance & Privacy
- Data permissions: ensure you have rights to enrich lead data
- GDPR compliance: consider privacy regulations in your region
- Data retention: follow your organization's data policies
- Audit trail: monitor who has access to enriched data

Customization Options

Extend the Enrichment
Add more Explorium enrichment by:
- Adding firmographic data (company size, revenue)
- Including technographic information
- Appending social media profiles
- Adding job title and department verification

Modify Trigger Conditions
Change when enrichment occurs:
- Trigger on lead updates (not just creation)
- Add specific lead source filters
- Process only leads from certain campaigns
- Include lead score thresholds

Add Notifications
Enhance with alerts:
- Email sales reps when leads are enriched
- Send Slack notifications for high-value matches
- Create tasks for leads that couldn't be enriched
- Log enrichment metrics to dashboards

API Considerations

Salesforce Limits
- API calls: Each execution uses ~4 Salesforce API calls
- Polling frequency: Consider your daily API limit
- Batch processing: Reduces API usage vs. individual processing

Explorium Limits
- Match API: One call per batch of leads
- Enrichment API: One call per batch of matched prospects
- Rate limits: Respect your plan's requests per minute

Integration Architecture
This workflow can be part of a larger lead management system:
- Lead Capture → This Workflow → Lead Scoring → Assignment
- Can trigger additional workflows based on enrichment results
- Compatible with existing Salesforce automation (Process Builder, Flows)
- Works alongside other enrichment tools

Security Considerations
- **Credentials**: Stored securely in n8n's credential system
- **Data transmission**: Uses HTTPS for all API calls
- **Access control**: Limit who can modify the workflow
- **Audit logging**: All executions are logged with details

Support Resources
For assistance with:
- **n8n issues**: Consult n8n documentation or community forum
- **Salesforce integration**: Reference Salesforce API documentation
- **Explorium API**: Contact Explorium support for API questions
- **Workflow logic**: Review execution logs for debugging
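As referenced in the Data Mapping section, the following is a minimal sketch of that fallback logic in n8n Code node style. The field names mirror the table; the exact shape of the Explorium enrichment response may differ, so treat it as illustrative rather than the template's actual node.

```javascript
// Illustrative only: apply the fallback rules from the Data Mapping table
// to each enriched prospect before the Salesforce "Update Lead" step.
return $input.all().map(item => {
  const p = item.json;   // enriched prospect record (shape assumed from the table above)

  const email = (p.emails && p.emails[0] && p.emails[0].address) || p.professions_email || null;
  const mobilePhone = p.mobile_phone || (p.phone_numbers && p.phone_numbers[1]) || null;
  const phone = (p.phone_numbers && p.phone_numbers[0]) || p.mobile_phone || null;

  return {
    json: {
      ...p,
      // Fields consumed by the Salesforce update
      Email: email,
      Phone: phone,
      MobilePhone: mobilePhone,
    },
  };
});
```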
by Stathis Askaridis
Integrate Xero with FileMaker using Webhooks

Workflow Description
This n8n workflow automates the integration between Xero and FileMaker, allowing for seamless data transfer between the two platforms. By listening for webhooks from Xero (e.g., new invoices, payments, or contacts), this workflow ensures that data is automatically sent and recorded in a FileMaker database.

Who is This For?
This workflow template is ideal for:
- **Accountants** who need a streamlined process to sync financial data between Xero and FileMaker.
- **Business Owners** looking to automate data entry and improve accuracy across their systems.
- **Developers** building solutions for clients that require integration between accounting software and databases.
- **Operations Teams** focused on minimizing manual work and improving efficiency.

Key Steps
1. Xero Webhook Trigger: The workflow starts by capturing events from Xero via a webhook.
2. Data Processing: Transforms and maps the incoming data to match FileMaker's required format.
3. FileMaker Node: Utilizes the FileMaker node to create or update records directly in the FileMaker database.
4. Logging & Error Handling: Tracks successful entries and manages any errors with automated alerts.

Setup Instructions
1. Set Up the Xero Webhook: Create a webhook in Xero and point it to your n8n webhook node URL. Configure the types of events to trigger the workflow (e.g., new invoices or payments). Xero will then send test calls to verify that your endpoint performs the required signature (hash) validation correctly.
2. Connect the FileMaker Node: Set up your FileMaker node with the appropriate credentials and database configuration. Map the fields between the incoming Xero data and your FileMaker database structure.
3. Customize Data Processing: Adjust data transformations as needed to ensure compatibility with your FileMaker schema.
4. Test and Deploy: Run the workflow with sample data to ensure everything is functioning correctly. Monitor the execution log to verify data transfer and make any adjustments as needed.
5. Error Handling Configuration: Configure error-handling nodes or alerts to notify you of any issues during data processing.

Benefits
This setup facilitates real-time data synchronization between Xero and FileMaker, reducing the need for manual data entry and improving overall operational efficiency.
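For the hash validation mentioned in step 1, Xero signs each webhook payload with HMAC-SHA256 using your webhook signing key and sends the base64 result in the x-xero-signature header. A minimal verification sketch, assuming a Code node with access to Node's crypto module (on self-hosted instances this may require allowing built-in modules via NODE_FUNCTION_ALLOW_BUILTIN), could look like this:

```javascript
// Sketch of Xero webhook signature verification, placed after the Webhook node.
// Verify header name, signing details, and raw-body handling against Xero's docs;
// the signing key below is a placeholder.
const crypto = require('crypto');

const signingKey = 'YOUR_XERO_WEBHOOK_SIGNING_KEY';
const rawBody = $json.body;                              // ideally the raw, unparsed payload
const receivedSignature = $json.headers['x-xero-signature'];

const computedSignature = crypto
  .createHmac('sha256', signingKey)
  .update(typeof rawBody === 'string' ? rawBody : JSON.stringify(rawBody))
  .digest('base64');

// Xero expects HTTP 200 when the signature matches and 401 when it does not.
return [{ json: { signatureValid: computedSignature === receivedSignature } }];
```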
by Angel Menendez
Who is this for?
This workflow is ideal for IT operations teams or system administrators who use ServiceNow to track incidents and Slack for team communication. It provides real-time updates on new ServiceNow incidents directly in a designated Slack channel, ensuring timely response and collaboration.

What problem is this workflow solving? / Use case
Manually monitoring ServiceNow for new incidents can be time-consuming and prone to delays. This workflow automates the process, ensuring that team members are instantly notified of new incidents, complete with all relevant details, in a Slack channel. It enhances operational efficiency and incident response time.

What this workflow does
1. Schedule or Manual Trigger: The workflow can be triggered manually or set to run automatically every 5 minutes.
2. Retrieve New Incidents: Fetches incidents created in ServiceNow within the last 5 minutes.
3. Error Handling: Posts an error message in Slack if there are issues connecting to ServiceNow.
4. Incident Processing: If new incidents are found, they are sorted in ascending order by their number. Detailed incident information is formatted and sent to a specified Slack channel.
5. No Incidents: If no new incidents are found, the workflow does nothing.

Setup
1. ServiceNow API Credentials: Configure ServiceNow Basic Authentication in the workflow to connect to your ServiceNow instance.
2. Slack API Credentials: Add your Slack API credentials to enable message posting.
3. Slack Channel Configuration: Define the Slack channel where notifications should be sent. Ensure the channel ID is correctly set in the Slack node.
4. Adjust the Schedule: Modify the schedule in the Schedule Trigger node to suit your requirements.

How to customize this workflow to your needs
- Notification Format: Customize the Slack message format to include additional or fewer details. Update the Blocks section in the Slack node for personalized messages.
- Incident Query Parameters: Adjust the sysparm_query parameter in the ServiceNow node to filter incidents based on specific criteria.
- Error Handling: Modify the error message in the Slack node for more detailed troubleshooting information.

Features
- **Real-Time Notifications**: Immediate updates on new ServiceNow incidents.
- **Error Handling**: Alerts in Slack if the workflow encounters issues connecting to ServiceNow.
- **Customizable Notifications**: Flexibility to modify incident details sent to Slack.

This workflow streamlines incident management and fosters collaboration by delivering actionable updates directly to your team.
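For reference, the "last 5 minutes" filter and the ascending sort can be expressed as shown below. The sysparm_query in the comment is one common encoded-query pattern and should be verified against your ServiceNow instance; the Code node sketch simply re-sorts the fetched incidents by number.

```javascript
// Example sysparm_query for the ServiceNow node (verify against your instance):
//   sys_created_on>=javascript:gs.minutesAgoStart(5)^ORDERBYnumber
// If you prefer to sort in n8n instead, a Code node can order the fetched
// incidents ascending by their number before they are sent to Slack:
const incidents = $input.all();

incidents.sort((a, b) =>
  String(a.json.number || '').localeCompare(String(b.json.number || ''))
);

return incidents;
```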
by Ron
Objective
In industry and production, machine data is sometimes available in databases. That might be sensor data like temperature or pressure, or just binary information. This sample flow reads machine data and sends an alert to your SIGNL4 team when the machine is down. When the machine is up again, the alert in SIGNL4 is closed automatically.

Setup
We simulate the machine data using a Notion table. When we un-check the Up box, we simulate a machine-down event. At certain intervals, n8n checks the database for down items. If such an item is found, an alert is sent using SIGNL4 and the item in Notion is updated (so it is not read again). Status updates from SIGNL4 (acknowledgement, close, annotation, escalation, etc.) are received via webhook, and we update the Notion item accordingly.

This is how the alert looks in the SIGNL4 app. The flow can be easily adapted to other database monitoring scenarios.
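A minimal sketch of the "check for down items" step is shown below, assuming the Notion items expose an Up checkbox and an Alerted flag. The property names are examples only; map them to your own table.

```javascript
// Illustrative filter for the polling step. Property names ("Up", "Alerted")
// are examples; match them to the fields your Notion node actually returns.
const downItems = $input.all().filter(item => {
  const props = item.json.properties || {};
  const isUp = props.Up && props.Up.checkbox === true;
  const alreadyAlerted = props.Alerted && props.Alerted.checkbox === true;
  return !isUp && !alreadyAlerted;   // machine is down and not yet alerted
});

// Each remaining item goes on to the SIGNL4 node, after which the Notion
// item is updated so the same down event is not alerted twice.
return downItems;
```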
by Francis Njenga
Workflow Documentation: Auto-Retry Engine – Error Recovery Workflow

Detailed Description
The Auto-Retry Engine: Error Recovery Workflow is designed to automate the process of identifying and retrying failed executions in n8n workflows. By leveraging scheduled triggers, API integrations, and conditional logic, this workflow ensures that any failed executions are automatically retried on an hourly basis. This reduces manual intervention, improves system reliability, and ensures smoother workflow operations.

Who is this for?
This workflow is ideal for:
- **Automation Engineers**: Managing and maintaining workflows with minimal manual intervention.
- **DevOps Teams**: Ensuring high availability and reliability of automated processes.
- **IT Administrators**: Reducing downtime and improving system performance by automating error recovery.

What problem does this workflow solve?
- **Manual Error Handling**: Eliminates the need for manual monitoring and retrying of failed executions.
- **Improved Reliability**: Automatically retries failed executions, reducing downtime and improving workflow success rates.
- **Time Efficiency**: Saves time by automating repetitive error recovery tasks, allowing teams to focus on higher-priority work.

What this workflow does
This workflow automates the following steps:
1. Scheduled Monitoring: Checks for failed executions hourly using a schedule trigger.
2. Error Filtering: Identifies executions that have failed and filters out those that have already been successfully retried.
3. Authentication: Logs into the n8n instance using API credentials to retrieve session details.
4. Automatic Retry: Retries the failed executions using the n8n API.
5. Batch Processing: Processes multiple failed executions in batches to avoid overloading the system.

Setup

Prerequisites
To use this workflow, you'll need:
- **n8n Account**: To create and run the workflow.
- **n8n API Credentials**: For logging into the n8n instance and retrying executions.
- **HTTP Request Node**: Configured to interact with the n8n API.
- **Schedule Trigger**: Set to run the workflow hourly.

Setup Process
1. Configure Schedule Trigger: Set the trigger to run hourly to check for failed executions.
2. Set Login Credentials: Add your n8n instance URL, username, and password in the Set node.
3. Integrate n8n API: Use the HTTP Request node to log into the n8n instance and retrieve session details.
4. Retry Failed Executions: Configure the HTTP Request node to retry failed executions using the session details.
5. Batch Processing: Use the Split in Batches node to process multiple failed executions in batches.

How to customize this workflow
Tailor the workflow to fit your specific needs:
- **Adjust Schedule Frequency**: Modify the schedule trigger to run at different intervals (e.g., every 30 minutes).
- **Add Notifications**: Integrate email or Slack notifications to alert teams about failed retries.
- **Refine Error Filtering**: Customize the filtering logic to exclude specific types of failed executions.
- **Scale Batch Size**: Adjust the batch size in the Split in Batches node to optimize performance.

Conclusion
The Auto-Retry Engine: Error Recovery Workflow is a powerful tool for automating error recovery in n8n workflows. By reducing manual intervention and ensuring failed executions are retried automatically, this workflow enhances system reliability and operational efficiency. Whether you're managing a few workflows or a complex automation ecosystem, this workflow ensures your processes run smoothly and consistently.
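A minimal sketch of the error-filtering step (step 2 above) for an n8n Code node. It assumes the previous node returned executions with status, retryOf, and retrySuccessId fields; adjust the names to whatever your n8n version's executions API actually returns.

```javascript
// Keep only failed executions that have not already been retried successfully.
// Field names (status, retryOf, retrySuccessId) are assumptions; adjust them
// to match the response of your n8n instance.
const executions = $input.all();

const toRetry = executions.filter(item => {
  const e = item.json;
  const failed = e.status === 'error' || e.status === 'crashed';
  const alreadyRetried = Boolean(e.retrySuccessId);   // a successful retry exists
  const isItselfARetry = Boolean(e.retryOf);          // skip retries of retries
  return failed && !alreadyRetried && !isItselfARetry;
});

// Downstream: Split in Batches, then an HTTP Request node that calls the
// retry endpoint for each execution id using the session details.
return toRetry;
```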
by Elie Kattar
Multi-Channel Customer Support Automation Suite

Transform your customer support operations with this enterprise-grade automation workflow that unifies, categorizes, and intelligently routes support tickets from multiple channels.

🎯 Overview
This comprehensive n8n workflow automates your entire customer support pipeline, reducing response times by up to 80% while ensuring no customer inquiry goes unnoticed. It seamlessly integrates email, web forms, and webhooks into a single, intelligent support system that works 24/7.

💡 Key Benefits
- **Unified Inbox**: Consolidate support requests from email, web forms, chat, and social media into one streamlined workflow
- **Instant Response**: Automatically acknowledge tickets with intelligent, category-specific responses within seconds
- **Smart Routing**: Use AI-powered categorization to route tickets to the right team instantly
- **Priority Detection**: Automatically identify and escalate urgent issues and VIP customers
- **Team Collaboration**: Real-time Slack notifications with color-coded priority alerts
- **Zero Setup Hassle**: Pre-configured with industry best practices and ready to deploy

🚀 Core Features

Intelligent Ticket Processing
- Automatic categorization into billing, technical, account, feature requests, and complaints
- Sentiment analysis to detect frustrated customers
- Priority assignment based on keywords, customer status, and urgency indicators
- Custom tagging for easy tracking and reporting

Multi-Channel Integration
- IMAP email monitoring for support inboxes
- Webhook endpoints for web forms and chat widgets
- Expandable architecture for social media channels
- Unified message format regardless of source

Automated Response System
- Category-specific email templates
- Personalized responses with ticket IDs
- Smart logic to skip auto-responses for urgent/negative cases
- Customizable templates for your brand voice

Team Notifications & Escalation
- Real-time Slack alerts with full ticket context
- Color-coded priorities (red/urgent, orange/high, green/normal)
- One-click actions to view or claim tickets
- Automatic escalation rules for time-sensitive issues

CRM & Analytics Ready
- Pre-configured for major CRM systems (Zendesk, HubSpot, Salesforce)
- Comprehensive logging for performance metrics
- Error handling with admin notifications
- Built-in success/failure tracking

📊 Use Cases
- SaaS Companies: Handle subscription issues, technical bugs, and feature requests with specialized routing to product, engineering, and billing teams.
- E-commerce: Manage order inquiries, shipping issues, and returns while maintaining high customer satisfaction scores.
- Agencies: Provide white-label support services with customizable branding and client-specific routing rules.
- Startups: Scale support operations without hiring additional staff by automating 70% of routine inquiries.
🛠️ Technical Specifications
- **Channels Supported**: Email (IMAP), Web Forms, Webhooks, expandable to social media
- **Response Time**: < 2 seconds for auto-responses
- **Categorization Accuracy**: 85%+ with keyword matching, 95%+ with AI enhancement
- **Scalability**: Handles 1,000+ tickets/day on standard n8n infrastructure
- **Integration Ready**: Slack, all major CRMs, SMTP, custom APIs

💰 ROI & Impact
Typical results from implementing this workflow:
- **80% reduction** in first response time
- **60% decrease** in ticket handling time
- **40% of tickets** resolved automatically
- **95% customer satisfaction** for auto-responded tickets
- **Save 20+ hours/week** of manual ticket sorting

🎁 What's Included
- Complete n8n workflow JSON (ready to import)
- 5 pre-configured auto-response templates
- Intelligent categorization rules for common support scenarios
- Priority detection algorithms
- Slack notification formatting
- Error handling and recovery logic
- Setup documentation and customization guide

🔧 Requirements
- n8n instance (self-hosted or cloud)
- Email account with IMAP/SMTP access
- Slack workspace (for notifications)
- CRM system (optional but recommended)

🚦 Quick Setup
1. Import the workflow JSON
2. Configure email and Slack credentials
3. Customize auto-response templates
4. Connect your CRM
5. Go live in under 30 minutes

Perfect for businesses handling 50-5,000 support tickets monthly who want to deliver exceptional customer service while reducing operational costs.
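As a rough illustration of the keyword-based categorization and priority detection described under Intelligent Ticket Processing, a Code node could apply rules along these lines. The categories follow the feature list above, while the keyword lists are illustrative placeholders.

```javascript
// Illustrative keyword-based categorization; the actual rules ship with the
// workflow and these keyword lists are only examples.
const categories = {
  billing:   ['invoice', 'charge', 'refund', 'payment'],
  technical: ['error', 'bug', 'crash', 'not working'],
  account:   ['password', 'login', 'access', 'profile'],
  feature:   ['feature request', 'suggestion', 'would be great'],
  complaint: ['unacceptable', 'disappointed', 'terrible'],
};

return $input.all().map(item => {
  const text = `${item.json.subject || ''} ${item.json.body || ''}`.toLowerCase();

  let category = 'general';
  for (const [name, keywords] of Object.entries(categories)) {
    if (keywords.some(k => text.includes(k))) { category = name; break; }
  }

  const urgent = ['urgent', 'asap', 'immediately'].some(k => text.includes(k));
  const priority = urgent || category === 'complaint' ? 'urgent' : 'normal';

  return { json: { ...item.json, category, priority } };
});
```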
by Oneclick AI Squad
Description
AI-Powered Multi-language Customer Support

In this guide, we'll walk you through setting up a comprehensive AI-driven workflow that handles customer messages in any language through WhatsApp and email channels, providing intelligent translation, summarization, and automated responses. Ready to revolutionize your customer support? Let's get started!

What's the Goal?
- **Automatically handle customer messages** from WhatsApp and email in any language
- **Translate and validate** incoming messages with smart language detection
- **Generate intelligent summaries** with priority classification for support teams
- **Provide automated responses** back to customers via their preferred channel
- **Log all interactions** to database for tracking and analytics
- **Send notifications** to admin team for high-priority cases
- **Deliver 24/7 multilingual customer support** without manual effort
- **Integrate seamlessly** with WhatsApp Business API and email systems

By the end, you'll have a fully automated customer support system that handles multilingual communications, prioritizes urgent cases, and maintains comprehensive interaction logs.

Why Does It Matter?
Manual handling of multilingual customer support can be overwhelming and inefficient. Here's why this workflow is a game-changer:
- **Break Global Language Barriers**: Handle customer inquiries in any language effortlessly
- **Never Miss Important Messages**: Priority detection ensures urgent cases get immediate attention
- **Save 80% of Manual Work**: Automation handles routine inquiries and escalates complex ones
- **24/7 Availability**: Respond to customers anytime, enhancing satisfaction and retention
- **Professional Customer Experience**: Consistent, well-formatted responses in the customer's language
- **Complete Audit Trail**: Database logging provides insights and accountability
- **Scalable Solution**: Handle growing customer base without proportional staff increase

Think of it as your always-on, multilingual customer support team that never sleeps and never misses a beat.
How It Works
Here's the step-by-step magic behind the automation:

Step 1: Multi-Channel Message Capture
- **WhatsApp Trigger**: Captures incoming WhatsApp messages via Business API webhook
- **Email Trigger (IMAP)**: Monitors designated customer support email for new messages
- Both channels feed into the same processing pipeline for consistent handling

Step 2: Data Normalization & Validation
- **Data Normalizer & Validator**: Standardizes message format regardless of source channel
- Extracts key information: sender details, message content, timestamp, channel source
- Validates data integrity and handles malformed inputs gracefully

Step 3: Smart Language Translation
- **Smart Language Translator**: Automatically detects source language and translates to English
- Preserves original message context and cultural nuances
- Stores both original and translated versions for reference

Step 4: Enhanced Summary & Priority Processing
- **Enhanced Summary & Priority Processor**: Uses AI to analyze translated content
- Generates concise summaries highlighting key customer concerns
- **Priority Classification**: Automatically tags messages as:
  - 🔴 High Priority: Urgent issues, complaints, billing problems
  - 🟡 Medium Priority: Product inquiries, general support
  - 🟢 Low Priority: Thank you messages, general feedback
- Creates structured output with priority flags for support team triage

Step 5: Message Source Intelligence
- **Check Message Source**: Determines optimal response channel and method
- Routes WhatsApp messages back to WhatsApp, emails back to email
- Maintains conversation context and threading

Step 6: Automated Customer Response
- **Customer WhatsApp Auto-Response**: Sends acknowledgment via WhatsApp
- **Customer Email Auto-Response**: Sends professional email replies
- Responses include: confirmation of message receipt, estimated response time based on priority, a reference number for tracking, and next steps or immediate solutions for common issues

Step 7: Database Logging & Analytics
- **Log to Database**: Stores complete interaction history, including the original message and translation, priority classification and reasoning, response sent and timestamp, customer contact information, and channel and source metadata
- Enables analytics, reporting, and quality assurance

Step 8: Admin Notifications & Alerts
- **Admin Email Notification**: Immediate email alerts for high-priority cases
- **Admin WhatsApp Alert**: SMS/WhatsApp notifications for urgent escalations
- **Workflow Completion & Metrics**: Performance tracking and completion confirmations

Workflow Architecture
WhatsApp Trigger / Email Trigger (IMAP)
  → Data Normalizer & Validator
  → Smart Language Translator
  → Enhanced Summary & Priority Processor
  → Check Message Source
  → Customer WhatsApp Response / Customer Email Response
  → Log to Database
  → Admin Email Notification
  → Admin WhatsApp Alert
  → Workflow Completion & Metrics

How to Use the Workflow?
Importing a workflow in n8n is straightforward and allows you to use pre-built or shared workflows to save time. Below is a step-by-step guide to importing the Multi-language Customer Support workflow in n8n.

Steps to Import a Workflow in n8n

1. Obtain the Workflow JSON
- Source the workflow: Workflows are typically shared as JSON files or code snippets. You might receive them from the n8n community (e.g., the n8n.io workflows page), a colleague or tutorial (e.g., a .json file or copied JSON code), or an export from another n8n instance.
- Format: Ensure you have the workflow in JSON format, either as a file (e.g., customer-support-workflow.json) or as text copied to your clipboard.

2. Access the n8n Workflow Editor
- Log in to n8n: Open your n8n instance (via n8n Cloud or your self-hosted instance) and navigate to the Workflows tab in the n8n dashboard.
- Open a new workflow: Click Add Workflow to create a blank workflow, or open an existing workflow if you want to merge the imported workflow.

3. Import the Workflow
- Option 1: Import via JSON code (clipboard): In the n8n editor, click the three dots (⋯) in the top-right corner to open the menu, select Import from Clipboard, paste the JSON code of the workflow into the provided text box, and click Import to load the workflow into the editor.
- Option 2: Import via JSON file: In the n8n editor, click the three dots (⋯) in the top-right corner, select Import from File, choose the .json file from your computer, and click Open to import the workflow.

Configuration Requirements

Essential Setup Notes:

WhatsApp Integration:
- Configure WhatsApp Business API credentials in the WhatsApp Trigger node
- Set up the webhook URL in your WhatsApp Business account
- Test the connection with a sample message

Email Configuration:
- Set up IMAP credentials for your customer support email in the Email Trigger node
- Configure SMTP settings for outbound email responses
- Ensure proper email authentication (SPF, DKIM records)

Translation Services:
- Add Google Translate API credentials in the Smart Language Translator node
- Alternative: Configure Azure Translator or AWS Translate based on preference
- Set up language detection and translation parameters

Database Connection:
- Configure database credentials in the "Log to Database" node
- Create required tables for storing customer interactions:

    CREATE TABLE customer_interactions (
      id SERIAL PRIMARY KEY,
      customer_contact VARCHAR(255),
      channel VARCHAR(50),
      original_message TEXT,
      translated_message TEXT,
      summary TEXT,
      priority VARCHAR(20),
      response_sent TEXT,
      timestamp TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    );

Admin Notifications:
- Set up admin email addresses in notification nodes
- Configure WhatsApp/SMS credentials for urgent alerts
- Customize notification templates and thresholds

Priority Classification Rules:
- Customize the JavaScript code in the "Enhanced Summary & Priority Processor" node
- Define keywords and patterns for priority detection:

    // High Priority Keywords
    const urgentKeywords = ['urgent', 'emergency', 'billing issue', 'not working', 'broken', 'refund', 'complaint'];

    // Medium Priority Keywords
    const mediumKeywords = ['question', 'how to', 'support', 'help', 'information'];

    // Classification logic
    if (urgentKeywords.some(keyword => message.toLowerCase().includes(keyword))) {
      priority = 'HIGH';
    } else if (mediumKeywords.some(keyword => message.toLowerCase().includes(keyword))) {
      priority = 'MEDIUM';
    } else {
      priority = 'LOW';
    }

Response Templates:
- Customize auto-response templates in both the WhatsApp and Email response nodes
- Include your company branding and contact information
- Set up response templates for different priority levels and common scenarios

Testing and Deployment:
- Test each channel: Send test messages via WhatsApp and email to verify the end-to-end flow
- Verify translations: Test with messages in different languages
- Check database logging: Confirm all interactions are properly stored
- Test admin notifications: Verify alerts are sent for high-priority cases
- Monitor performance: Set up workflow execution monitoring and error handling

Your Multi-language Customer Support workflow is now ready to handle customer communications 24/7 across multiple channels with intelligent automation and human oversight where needed!
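For completeness, here is an illustrative sketch of what the Data Normalizer & Validator step might do in a Code node. The input field names are assumptions and must be adapted to the actual output of your WhatsApp Trigger and Email Trigger (IMAP) nodes.

```javascript
// Illustrative normalizer: map a WhatsApp or email trigger payload into one
// common shape used by the rest of the pipeline. Field names are assumptions.
return $input.all().map(item => {
  const src = item.json;
  const isWhatsApp = Boolean(src.messages || src.wa_id);   // crude channel detection

  const normalized = isWhatsApp
    ? {
        channel: 'whatsapp',
        sender: src.from || src.wa_id,
        message: (src.text && src.text.body) || src.body || '',
        receivedAt: src.timestamp || new Date().toISOString(),
      }
    : {
        channel: 'email',
        sender: src.from || src.sender,
        message: src.textPlain || src.text || '',
        receivedAt: src.date || new Date().toISOString(),
      };

  // Basic validation flag; an If node downstream can drop malformed inputs.
  normalized.isValid = Boolean(normalized.sender && normalized.message.trim());

  return { json: normalized };
});
```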
by ankitkansaldev
🎬 TikTok Influencer Scraper (URL Input) via Bright Data + n8n & Sheets

A comprehensive n8n automation that scrapes TikTok influencer profiles using Bright Data's TikTok dataset and automatically saves detailed profile information to Google Sheets.

📋 Overview
This workflow provides an automated TikTok influencer data collection solution that scrapes comprehensive profile information and saves it to Google Sheets. Perfect for influencer marketing research, competitor analysis, social media monitoring, and marketing campaign planning.

✨ Key Features
- 📝 Form-Based Input: Simple web form to submit TikTok profile URLs
- 🤖 Bright Data Integration: Uses Bright Data's TikTok dataset for reliable scraping
- ⏳ Status Monitoring: Intelligent polling system to check scraping progress
- 🔄 Retry Logic: Automatic retry mechanism with 30-second intervals
- 📊 Data Extraction: Comprehensive profile data including engagement metrics
- 📈 Google Sheets Storage: Automatic data storage and organization
- ⚡ Error Handling: Built-in error handling and status reporting
- 🎯 Custom Fields: Configurable output fields for specific data needs

🎯 What This Workflow Does

Input
- **Profile URLs**: TikTok profile URLs submitted through web form
- **Custom Fields**: Configurable data fields for extraction
- **Country Settings**: Geo-targeting for accurate data collection

Processing
1. Form Submission: User submits TikTok profile URL through web form
2. API Trigger: Sends profile data to Bright Data for scraping
3. Status Polling: Continuously checks scraping progress
4. Wait & Retry: Implements 30-second delays between status checks
5. Data Retrieval: Fetches complete profile data when ready
6. Sheet Update: Saves extracted data to Google Sheets
7. Status Reporting: Provides completion status and messages

Output Data Points

| Field | Description | Example |
|-------|-------------|---------|
| Account ID | Unique TikTok account identifier | @username123 |
| Nickname | Display name on profile | "John Doe" |
| Biography | Profile bio/description | "Content creator & influencer" |
| Followers | Number of followers | 1,250,000 |
| Following | Number of accounts following | 500 |
| Likes | Total likes across all videos | 50,000,000 |
| Videos Count | Total number of videos posted | 1,200 |
| Profile URL | Direct link to TikTok profile | https://www.tiktok.com/@username |
| Profile Picture | Profile image URL | https://p16-sign-sg.tiktokcdn.com/... |
| Profile Picture HD | High-definition profile image | https://p16-sign-sg.tiktokcdn.com/... |
| Is Verified | Verification status | true/false |
| Bio Link | External link in bio | https://linktr.ee/username |
| Like Engagement Rate | Engagement rate based on likes | 5.2% |
| Comment Engagement Rate | Engagement rate based on comments | 2.1% |
| Top Videos | List of top performing videos | [video_objects] |
| Region | Geographic region | "US" |
| Is Under Age 18 | Age status indicator | true/false |

🚀 Setup Instructions

Prerequisites
- n8n instance (self-hosted or cloud)
- Google account with Sheets access
- Bright Data account with TikTok dataset access
- Valid TikTok profile URLs for testing
- 10-15 minutes for setup

Step 1: Import the Workflow
1. Copy the JSON workflow code from the provided file
2. In n8n: Workflows → + Add workflow → Import from JSON
3. Paste JSON and click Import

Step 2: Configure Bright Data
1. Set up Bright Data credentials:
   - In n8n: Credentials → + Add credential → HTTP Request Generic Credential
   - Name: "Bright Data API"
   - Authentication: Bearer Token
   - Token: Your Bright Data API key
   - Test the connection
2. Configure dataset:
   - Ensure you have access to the TikTok dataset (gd_l1villgoiiidt09ci)
   - Verify dataset permissions in the Bright Data dashboard
   - Check dataset limits and pricing

Step 3: Configure Google Sheets Integration
1. Create a Google Sheet:
   - Go to Google Sheets
   - Create a new spreadsheet named "TikTok Influencer Data"
   - Create a sheet tab named "TikTok profile by url"
   - Copy the Sheet ID from the URL: https://docs.google.com/spreadsheets/d/SHEET_ID_HERE/edit
2. Set up Google Sheets credentials:
   - In n8n: Credentials → + Add credential → Google Sheets OAuth2 API
   - Complete OAuth setup and test connection
3. Prepare your data sheet with columns:
   - Column A: Account ID
   - Column B: Nickname
   - Column C: Biography
   - Column D: Followers
   - Column E: Following
   - Column F: Likes
   - Column G: Videos Count
   - Column H: Profile URL
   - Column I: Is Verified
   - Column J: Bio Link
   - Column K: Like Engagement Rate
   - Column L: Comment Engagement Rate
   - Column M: Region
   - Column N: Status
   - Column O: Message

Step 4: Update Workflow Settings
1. Update API credentials:
   - Open the "Sends profile URLs to Bright Data to trigger scraping" node
   - Replace BRIGHT_DATA_API_KEY with your actual API key
   - Update the dataset ID if different
2. Update Google Sheets nodes:
   - Open the "Google Sheets" node
   - Replace the document ID: 1OeqtCFm4Wek9DI5YFOWQXTpQJS-SJxC10iAPKEKkmiY
   - Select your Google Sheets credential
   - Choose the correct sheet/tab name
3. Configure form settings:
   - Open the "Search by Profile URL" node
   - Customize form title and field labels as needed
   - Note the webhook URL for form access

Step 5: Test & Activate
1. Add test profiles:
   - Access the form using the webhook URL
   - Submit 1-2 TikTok profile URLs for testing
   - Use full URLs (e.g., https://www.tiktok.com/@username)
2. Test the workflow:
   - Submit a test profile through the form
   - Monitor execution in n8n
   - Verify data appears in the Google Sheet
   - Check for any error messages

📖 Usage Guide

Submitting TikTok Profiles
1. Navigate to your form URL (found in the Form Trigger node)
2. Enter the TikTok profile URL in the format: https://www.tiktok.com/@username
3. Click Submit to start the scraping process
4. Wait for processing (typically 1-3 minutes)

Understanding the Process
The workflow follows this sequence:
1. Form Submission → Profile URL captured
2. API Trigger → Scraping job submitted to Bright Data
3. Status Polling → Checks every 30 seconds if data is ready
4. Data Retrieval → Fetches complete profile information
5. Sheet Update → Saves data to Google Sheets

Monitoring Progress
- Check n8n execution logs for real-time status
- The Bright Data dashboard shows scraping progress
- Google Sheets will populate when data is ready
- The Status column shows "ready" when complete
Reading the Results
Your Google Sheet will show:
- Complete TikTok profile information
- Engagement metrics and statistics
- Profile verification status
- Bio links and external connections
- Timestamp of data collection

🔧 Customization Options

Adding More Data Points
Edit the JSON body in the "Sends profile URLs to Bright Data" node to include additional fields:

    "custom_output_fields": [
      "account_id",
      "nickname",
      "biography",
      "followers",
      "following",
      "likes",
      "videos_count",
      "language",
      "creation_time",
      "last_post_time",
      "avg_video_duration",
      "hashtags_used",
      "music_used"
    ]

Modifying Input Parameters
Customize the scraping parameters:
- Country targeting: Change the "country" field in the input
- Search limits: Adjust the "limit_per_input" value
- Discovery method: Modify the "discover_by" parameter
- Error handling: Toggle the "include_errors" setting

Batch Processing Multiple Profiles
To process multiple profiles simultaneously:
- Modify the input array in the API call
- Add multiple profile URLs in a single request
- Implement loop logic for processing results
- Add rate limiting between requests

Custom Form Fields
Enhance the form with additional inputs:
- Open the "Search by Profile URL" node
- Add form fields for: country selection, number of videos to analyze, specific date ranges, custom tags or categories

🚨 Troubleshooting

Common Issues & Solutions

"Bright Data connection failed"
- Cause: Invalid API credentials or dataset access
- Solution: Verify the API key in the Bright Data dashboard, check dataset permissions

"Profile not found or private"
- Cause: Invalid TikTok URL or private profile
- Solution: Verify the profile URL format, ensure the profile is public

"Google Sheets permission denied"
- Cause: Incorrect credentials or sheet permissions
- Solution: Re-authenticate Google Sheets, check sheet sharing settings

"Scraping timeout"
- Cause: Profile data taking too long to process
- Solution: Increase wait time or implement longer polling intervals

"Invalid dataset ID"
- Cause: Incorrect or expired dataset configuration
- Solution: Check the Bright Data dashboard for the correct dataset ID

"Form submission failed"
- Cause: Webhook configuration issues
- Solution: Verify the webhook URL and form trigger settings

Advanced Troubleshooting
- Check execution logs in n8n for detailed error messages
- Test individual nodes by running them separately
- Verify data formats: ensure URLs are properly formatted
- Monitor API limits: check Bright Data usage quotas
- Add error handling: implement try-catch logic for robust operation

📊 Use Cases & Examples

1. Influencer Marketing Research
Goal: Identify and analyze potential influencers for campaigns
- Research influencers in specific niches
- Analyze engagement rates and audience size
- Compare multiple influencers for campaign selection
- Track influencer growth over time

2. Competitive Analysis
Goal: Monitor competitors' TikTok presence and performance
- Track competitor follower growth
- Analyze content strategies and engagement
- Monitor posting frequency and timing
- Identify trending content themes

3. Social Media Monitoring
Goal: Track brand mentions and user-generated content
- Monitor branded hashtag usage
- Track brand advocates and micro-influencers
- Analyze sentiment and engagement patterns
- Identify trending topics in your industry

4. Market Research Pipeline
Goal: Gather social media intelligence for business decisions
- Analyze target audience behavior
- Study content preferences and trends
- Generate reports for stakeholders
- Support marketing strategy development

⚙ Advanced Configuration

Rate Limiting and Performance
To optimize for large-scale scraping:
- Adjust wait times between status checks
- Implement exponential backoff for retries
- Add batch processing for multiple profiles
- Monitor API usage to avoid limits

Data Validation and Cleaning
Enhance data quality with validation:
- Add data type validation for numeric fields
- Implement URL format checking
- Clean and standardize text fields
- Add data completeness checks

Integration with Business Tools
Connect the workflow to your existing systems:
- **CRM Integration**: Update customer records with influencer data
- **Slack Notifications**: Send alerts when new data is available
- **Database Storage**: Store data in PostgreSQL/MySQL for analysis
- **BI Tools**: Connect to Tableau/Power BI for visualization

Webhook Integration
For real-time updates:
- Add webhook triggers for immediate profile checks
- Integrate with external systems via webhooks
- Create API endpoints for programmatic access
- Implement authentication for secure access

📈 Performance & Limits

Expected Performance
- **Single Profile**: 30-60 seconds average processing time
- **Concurrent Requests**: 5-10 simultaneous (depends on Bright Data plan)
- **Data Accuracy**: 95%+ for public TikTok profiles
- **Success Rate**: 90%+ for accessible profiles
- **Daily Capacity**: 100-1000 profiles (depends on rate limits)

Resource Usage
- **Memory**: ~50MB per execution
- **Storage**: Minimal (data stored in Google Sheets)
- **API Calls**: 3-5 Bright Data calls per profile (including status checks)
- **Bandwidth**: ~1-2MB per profile scraped
- **Execution Time**: 1-2 minutes per profile

Scaling Considerations
- **Rate Limiting**: Add delays for high-volume scraping
- **Error Handling**: Implement retry logic for failed requests
- **Data Validation**: Add checks for malformed profile data
- **Monitoring**: Track success/failure rates over time
- **Cost Optimization**: Monitor API usage to control costs

🤝 Support & Community

Getting Help
- **n8n Community Forum**: community.n8n.io
- **Documentation**: docs.n8n.io
- **Bright Data Support**: Contact through your dashboard
- **GitHub Issues**: Report bugs and feature requests

Contributing
- Share improvements with the community
- Report issues and suggest enhancements
- Create variations for specific use cases
- Document best practices and lessons learned

📋 Quick Setup Checklist

Before You Start
- ☐ n8n instance running (self-hosted or cloud)
- ☐ Google account with Sheets access
- ☐ Bright Data account with TikTok dataset access
- ☐ Valid TikTok profile URLs for testing
- ☐ 15 minutes for setup

Setup Steps
- ☐ Import Workflow - Copy JSON and import to n8n
- ☐ Configure Bright Data - Set up API credentials and test
- ☐ Create Google Sheet - New sheet with proper column structure
- ☐ Set up Google Sheets credentials - OAuth setup and test
- ☐ Update workflow settings - Replace sheet ID and API keys
- ☐ Test with sample profiles - Submit 1-2 URLs and verify results
- ☐ Activate workflow - Enable form trigger for production use

Ready to Use! 🎉
Your form URL: https://your-n8n-instance.com/form/[webhook-id]

🎯 Happy TikTok Scraping!
This workflow provides a solid foundation for automated TikTok influencer data collection. Customize it to fit your specific needs and use cases for influencer marketing, competitive analysis, and social media research.
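Finally, a small sketch of the status-polling decision described in the Understanding the Process section, written for an n8n Code node placed after the progress-check HTTP Request. The status values used here are assumptions; check them against the actual Bright Data response before relying on them.

```javascript
// Illustrative polling decision: after the HTTP Request that queries Bright
// Data for scraping progress, decide whether the data is ready or whether to
// loop back through the 30-second Wait node. The `status` values are assumptions.
return $input.all().map(item => {
  const status = (item.json.status || '').toLowerCase();

  return {
    json: {
      ...item.json,
      isReady: status === 'ready',          // proceed to data retrieval
      shouldRetry: status === 'running' ||  // loop back through the Wait node
                   status === 'building',
      hasFailed: status === 'failed' || status === 'error',
    },
  };
});
```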