by PDF Vector
## Overview
Organizations dealing with high-volume document processing face challenges in efficiently handling diverse document types while maintaining quality and tracking performance metrics. This enterprise-grade workflow provides a scalable solution for batch processing documents, including PDFs, scanned documents, and images (JPG, PNG), with comprehensive analytics, error handling, and quality assurance.

## What You Can Do
- Process thousands of documents in parallel batches efficiently
- Monitor performance metrics and success rates in real time
- Handle diverse document formats with automatic format detection
- Generate comprehensive analytics dashboards and reports
- Implement automated quality assurance and error handling

## Who It's For
Large organizations, document processing centers, digital transformation teams, enterprise IT departments, and businesses that need to process thousands of documents reliably with detailed performance tracking and analytics.

## The Problem It Solves
High-volume document processing without proper monitoring leads to bottlenecks, quality issues, and inefficient resource usage. Organizations struggle to track processing success rates, identify problematic document types, and optimize their workflows. This template provides enterprise-grade batch processing with comprehensive analytics and automated quality assurance.

## Setup Instructions
1. Configure Google Drive credentials for document folder access
2. Install the PDF Vector community node from the n8n marketplace
3. Configure PDF Vector API credentials with appropriate rate limits
4. Set up batch processing parameters (batch size, retry logic)
5. Configure quality thresholds and validation rules
6. Set up analytics dashboard and reporting preferences
7. Configure error handling and notification systems

## Key Features
- Parallel batch processing for maximum throughput
- Support for mixed document formats (PDFs, Word docs, images)
- OCR processing for handwritten and scanned documents
- Comprehensive analytics dashboard with success rates and performance metrics
- Automatic document prioritization based on size and complexity
- Intelligent error handling with automatic retry logic
- Quality assurance checks and validation
- Real-time processing monitoring and alerts

## Customization Options
- Configure custom document categories and processing rules
- Set up specific extraction templates for different document types
- Implement automated workflows for documents that fail quality checks
- Configure credit usage optimization to minimize costs
- Set up custom analytics and reporting dashboards
- Add integration with existing document management systems
- Configure automated notifications for processing completion or errors

## Implementation Details
The workflow uses intelligent batching to process documents efficiently while monitoring performance metrics in real time (see the sketch below). It automatically handles different document formats, applies OCR when needed, and provides detailed analytics to help organizations optimize their document processing operations. The system includes sophisticated error recovery and quality assurance mechanisms.

**Note:** This workflow uses the PDF Vector community node. Make sure to install it from the n8n community nodes collection before using this template.
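To illustrate the batching idea, here is a minimal Code-node sketch. It is not taken from the template itself; the batch size, retry limit, and item shape are illustrative assumptions.

```javascript
// Hypothetical sketch: split incoming documents into fixed-size batches
// and carry counters that a downstream analytics node can aggregate.
const BATCH_SIZE = 25;   // assumed; tune to your PDF Vector rate limits
const MAX_RETRIES = 3;   // assumed retry budget per document

const docs = $input.all().map(item => item.json);
const batches = [];
for (let i = 0; i < docs.length; i += BATCH_SIZE) {
  batches.push(docs.slice(i, i + BATCH_SIZE));
}

// Emit one item per batch; downstream nodes process batches in parallel
// and a final node can total the `succeeded`/`failed` counts.
return batches.map((batch, index) => ({
  json: {
    batchIndex: index,
    documents: batch,
    maxRetries: MAX_RETRIES,
    succeeded: 0,
    failed: 0,
  },
}));
```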
by vinci-king-01
# Creative Asset Manager with ScrapeGraphAI Analysis and Brand Compliance

## Target Audience
- Creative directors and design managers
- Marketing teams managing brand assets
- Digital asset management (DAM) administrators
- Brand managers ensuring compliance
- Content creators and designers
- Marketing operations teams
- Creative agencies managing client assets
- Brand compliance officers

## Problem Statement
Managing creative assets manually is inefficient and error-prone, often leading to inconsistent branding, poor organization, and compliance issues. This template solves the challenge of automatically analyzing, organizing, and ensuring brand compliance for creative assets using AI-powered analysis and automated workflows.

## How it Works
This workflow automatically processes uploaded creative assets using ScrapeGraphAI for intelligent analysis, generates comprehensive tags, checks brand compliance, organizes files systematically, and maintains a centralized dashboard for creative teams.

### Key Components
1. **Asset Upload Trigger** - Webhook endpoint that activates when new creative assets are uploaded
2. **ScrapeGraphAI Asset Analyzer** - Uses AI to extract detailed information from visual assets
3. **Tag Generator** - Creates comprehensive, searchable tags based on asset analysis
4. **Brand Compliance Checker** - Evaluates assets against brand guidelines and standards
5. **Asset Organizer** - Creates organized folder structures and standardized naming
6. **Creative Team Dashboard** - Updates Google Sheets with organized asset information

## Google Sheets Column Specifications
The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| asset_id | String | Unique asset identifier | "asset_1703123456789_abc123def" |
| name | String | Standardized filename | "image-social-media-2024-01-15T10-30-00.jpg" |
| path | String | Storage location path | "/creative-assets/2024/01/image/social-media" |
| asset_type | String | Type of creative asset | "image" |
| dimensions | String | Asset dimensions | "1920x1080" |
| file_format | String | File format | "jpg" |
| primary_colors | Array | Extracted color palette | ["#FF6B35", "#004E89"] |
| content_description | String | AI-generated content description | "Modern office workspace with laptop" |
| text_content | String | Any text visible in asset | "Welcome to our workspace" |
| style_elements | Array | Detected style characteristics | ["modern", "minimalist"] |
| generated_tags | Array | Comprehensive tag list | ["high-resolution", "brand-logo", "social-media"] |
| usage_context | String | Suggested usage context | "social-media" |
| brand_elements | Array | Detected brand elements | ["logo", "typography"] |
| compliance_score | Number | Brand compliance score (0-100) | 85 |
| compliance_status | String | Approval status | "approved-with-warnings" |
| compliance_issues | Array | List of compliance problems | ["Non-brand colors detected"] |
| upload_date | DateTime | Asset upload timestamp | "2024-01-15T10:30:00Z" |
| searchable_keywords | String | Search-optimized keywords | "image social-media modern brand-logo" |

## Setup Instructions
Estimated setup time: 25-30 minutes

### Prerequisites
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Google Sheets account with API access
- File upload system or DAM integration
- Brand guidelines document (for compliance configuration)

### Step-by-Step Configuration

#### 1. Install Community Nodes
Install the required community node: `npm install n8n-nodes-scrapegraphai`

#### 2. Configure ScrapeGraphAI Credentials
- Navigate to Credentials in your n8n instance
- Add new ScrapeGraphAI API credentials
- Enter your API key from the ScrapeGraphAI dashboard
- Test the connection to ensure it's working

#### 3. Set up Google Sheets Connection
- Add Google Sheets OAuth2 credentials
- Grant the necessary permissions for spreadsheet access
- Create a new spreadsheet for creative asset management
- Configure the sheet name (default: "Creative Assets Dashboard")

#### 4. Configure Webhook Trigger
- Set up the webhook endpoint for asset uploads
- Configure the webhook URL in your file upload system
- Ensure the asset_url parameter is passed in the webhook payload (see the payload sketch at the end of this section)
- Test webhook connectivity

#### 5. Customize Brand Guidelines
- Update the Brand Compliance Checker node with your brand colors
- Configure approved file formats and size limits
- Set required brand elements and fonts
- Define resolution standards and quality requirements

#### 6. Configure Asset Organization
- Customize folder structure preferences
- Set up naming conventions for different asset types
- Configure metadata extraction preferences
- Set up search optimization parameters

#### 7. Test and Validate
- Upload a test asset to trigger the workflow
- Verify all analysis steps complete successfully
- Check Google Sheets for proper data formatting
- Validate brand compliance scoring

## Workflow Customization Options

### Modify Analysis Parameters
- Adjust ScrapeGraphAI prompts for specific asset types
- Customize tag generation algorithms
- Modify color analysis sensitivity
- Add industry-specific analysis criteria

### Extend Brand Compliance
- Add more sophisticated brand guideline checks
- Implement automated correction suggestions
- Include legal compliance verification
- Add accessibility compliance checks

### Customize Organization Structure
- Modify folder hierarchy based on team preferences
- Implement custom naming conventions
- Add version control and asset history
- Configure backup and archiving rules

### Output Customization
- Add integration with DAM systems
- Implement asset approval workflows
- Create automated reporting and analytics
- Add team collaboration features

## Use Cases
- **Brand Asset Management**: Automatically organize and tag brand assets
- **Compliance Monitoring**: Ensure all assets meet brand guidelines
- **Creative Team Collaboration**: Centralized asset management and sharing
- **Marketing Campaign Management**: Organize assets by campaign and context
- **Asset Discovery**: AI-powered search and recommendation system
- **Quality Control**: Automated quality and compliance checks

## Important Notes
- Respect ScrapeGraphAI API rate limits and terms of service
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update brand guidelines in the compliance checker
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Consider data privacy and copyright compliance for creative assets
- Ensure proper backup and version control for important assets

## Troubleshooting

### Common Issues
- **ScrapeGraphAI connection errors**: Verify API key and account status
- **Webhook trigger failures**: Check the webhook URL and payload format
- **Google Sheets permission errors**: Check OAuth2 scope and permissions
- **Asset analysis errors**: Review the ScrapeGraphAI prompt configuration
- **Brand compliance false positives**: Adjust guideline parameters
- **File organization issues**: Check folder permissions and naming conventions

### Support Resources
- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Digital asset management best practices
- Brand compliance and governance guidelines
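As referenced in step 4 above, your upload system must pass an asset_url in the webhook payload. Here is a hedged sketch of such a call; only the asset_url field is required by this template, and the URLs shown are hypothetical placeholders.

```javascript
// Hypothetical upload-side call to the n8n webhook.
// Replace the URL with the one shown on your Webhook trigger node.
const webhookUrl = 'https://your-n8n-instance/webhook/creative-assets'; // assumed path

await fetch(webhookUrl, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    asset_url: 'https://cdn.example.com/uploads/summer-banner.png', // required by the workflow
  }),
});
```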
by vinci-king-01
# Influencer Content Monitor with ScrapeGraphAI Analysis and ROI Tracking

## Target Audience
- Marketing managers and brand managers
- Influencer marketing agencies
- Social media managers
- Digital marketing teams
- Brand partnerships coordinators
- Marketing analysts and strategists
- Campaign managers
- ROI and performance analysts

## Problem Statement
Manual monitoring of influencer campaigns is time-consuming and often misses critical performance insights, brand mentions, and ROI calculations. This template solves the challenge of automatically tracking influencer content, analyzing engagement metrics, detecting brand mentions, and calculating campaign ROI using AI-powered analysis and automated workflows.

## How it Works
This workflow automatically monitors influencer profiles and content using ScrapeGraphAI for intelligent analysis, tracks brand mentions and sponsored content, calculates performance metrics, and provides comprehensive ROI analysis for marketing campaigns.

### Key Components
1. **Daily Schedule Trigger** - Runs automatically every day at 9:00 AM to monitor influencer campaigns
2. **ScrapeGraphAI - Influencer Profiles** - Uses AI to extract profile data and recent posts from Instagram
3. **Content Analyzer** - Analyzes post content for engagement rates and quality scoring
4. **Brand Mention Detector** - Identifies brand mentions and sponsored content indicators
5. **Campaign Performance Tracker** - Tracks campaign metrics and KPIs
6. **Marketing ROI Calculator** - Calculates return on investment for campaigns (see the calculation sketch at the end of this section)

## Data Analysis Specifications
The template analyzes and tracks the following metrics:

| Metric Category | Data Points | Description | Example |
|----------------|-------------|-------------|---------|
| Profile Data | Username, Followers, Following, Posts Count, Bio, Verification Status | Basic influencer profile information | "@influencer", "100K followers", "Verified" |
| Post Analysis | Post URL, Caption, Likes, Comments, Date, Hashtags, Mentions | Individual post performance data | "5,000 likes", "150 comments" |
| Engagement Metrics | Engagement Rate, Content Quality Score, Performance Tier | Calculated performance indicators | "3.2% engagement rate", "High performance" |
| Brand Detection | Brand Mentions, Sponsored Content, Mention Count | Brand collaboration tracking | "Nike mentioned", "Sponsored post detected" |
| Campaign Performance | Total Reach, Total Engagement, Average Engagement, Performance Score | Overall campaign effectiveness | "50K total reach", "85.5 performance score" |
| ROI Analysis | Total Investment, Estimated Value, ROI Percentage, Cost per Engagement | Financial performance metrics | "$2,500 investment", "125% ROI" |

## Setup Instructions
Estimated setup time: 20-25 minutes

### Prerequisites
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Instagram accounts to monitor (influencer usernames)
- Campaign budget and cost data for ROI calculations

### Step-by-Step Configuration

#### 1. Install Community Nodes
Install the required community node: `npm install n8n-nodes-scrapegraphai`

#### 2. Configure ScrapeGraphAI Credentials
- Navigate to Credentials in your n8n instance
- Add new ScrapeGraphAI API credentials
- Enter your API key from the ScrapeGraphAI dashboard
- Test the connection to ensure it's working

#### 3. Set up Schedule Trigger
- Configure the daily schedule (default: 9:00 AM UTC)
- Adjust the timezone to match your business hours
- Set an appropriate frequency for your monitoring needs

#### 4. Configure Influencer Monitoring
- Update the websiteUrl parameter with target influencer usernames
- Customize the user prompt to extract specific profile data
- Set up monitoring for multiple influencers if needed
- Configure brand keywords for mention detection

#### 5. Customize Brand Detection
- Update brand keywords in the Brand Mention Detector node
- Add sponsored content indicators (#ad, #sponsored, etc.)
- Configure brand mention sensitivity levels
- Set up competitor brand monitoring

#### 6. Configure ROI Calculations
- Update cost estimates in the Marketing ROI Calculator
- Set value per engagement and reach metrics
- Configure campaign management costs
- Adjust ROI calculation parameters

#### 7. Test and Validate
- Run the workflow manually with test data
- Verify all analysis steps complete successfully
- Check data accuracy and calculation precision
- Validate ROI calculations against actual campaign data

## Workflow Customization Options

### Modify Monitoring Parameters
- Adjust monitoring frequency (hourly, daily, weekly)
- Add more social media platforms (TikTok, YouTube, etc.)
- Customize engagement rate calculations
- Modify content quality scoring algorithms

### Extend Brand Detection
- Add more sophisticated brand mention detection
- Implement sentiment analysis for brand mentions
- Include competitor brand monitoring
- Add automated alert systems for brand mentions

### Customize Performance Tracking
- Modify performance tier thresholds
- Add more detailed engagement metrics
- Implement trend analysis and forecasting
- Include audience demographic analysis

### Output Customization
- Add integration with marketing dashboards
- Implement automated reporting systems
- Create alert systems for performance drops
- Add campaign comparison features

## Use Cases
- **Influencer Campaign Monitoring**: Track performance of influencer partnerships
- **Brand Mention Detection**: Monitor brand mentions across influencer content
- **ROI Analysis**: Calculate return on investment for marketing campaigns
- **Competitive Intelligence**: Monitor competitor brand mentions
- **Performance Optimization**: Identify top-performing content and influencers
- **Campaign Reporting**: Generate automated reports for stakeholders

## Important Notes
- Respect Instagram's terms of service and rate limits
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update brand keywords and detection parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Consider data privacy and compliance requirements
- Ensure accurate cost data for ROI calculations

## Troubleshooting

### Common Issues
- **ScrapeGraphAI connection errors**: Verify API key and account status
- **Instagram access issues**: Check account accessibility and rate limits
- **Brand detection false positives**: Adjust keyword sensitivity
- **ROI calculation errors**: Verify cost and value parameters
- **Schedule trigger failures**: Check the timezone and cron expression
- **Data parsing errors**: Review the Code node's JavaScript logic

### Support Resources
- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Instagram API documentation and best practices
- Influencer marketing analytics best practices
- ROI calculation methodologies and standards
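As referenced in the Key Components above, here is a hedged Code-node sketch of the engagement-rate and ROI arithmetic, using one common formulation: engagement rate = (likes + comments) / followers, and ROI = (estimated value - investment) / investment. The field names and cost parameters are illustrative assumptions, not the template's exact variable names.

```javascript
// Hypothetical engagement and ROI calculation sketch.
const profile = { followers: 100000 };        // from the profile scrape
const posts = [
  { likes: 5000, comments: 150 },             // from the post scrape
  { likes: 3200, comments: 90 },
];

// One common definition: (likes + comments) / followers, as a percentage.
const withEngagement = posts.map(p => ({
  ...p,
  engagementRate: ((p.likes + p.comments) / profile.followers) * 100,
}));

const totalEngagement = withEngagement.reduce((s, p) => s + p.likes + p.comments, 0);

// ROI: (estimated value - investment) / investment, as a percentage.
const totalInvestment = 2500;                 // assumed campaign cost in USD
const valuePerEngagement = 0.75;              // assumed value heuristic
const estimatedValue = totalEngagement * valuePerEngagement;
const roiPercent = ((estimatedValue - totalInvestment) / totalInvestment) * 100;
const costPerEngagement = totalInvestment / totalEngagement;

return [{ json: { withEngagement, totalEngagement, estimatedValue, roiPercent, costPerEngagement } }];
```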
by Dariusz Koryto
Get automated weather updates delivered directly to your Telegram chat at scheduled intervals. This workflow fetches current weather data from OpenWeatherMap and sends formatted weather reports via a Telegram bot.

## Use Cases
- Daily morning weather briefings
- Regular weather monitoring for outdoor activities
- Automated weather alerts for specific locations
- Personal weather assistant for travel planning

## Prerequisites
Before setting up this workflow, ensure you have:
- An OpenWeatherMap API account (free tier available)
- A Telegram bot token
- Your Telegram chat ID
- An n8n instance (cloud or self-hosted)

## Setup Instructions

### Step 1: Create OpenWeatherMap Account
1. Go to OpenWeatherMap and sign up for a free account
2. Navigate to the API keys section in your account
3. Copy your API key (you'll need this for the workflow configuration)

### Step 2: Create Telegram Bot
1. Open Telegram and search for @BotFather
2. Start a chat and use the /newbot command
3. Follow the prompts to create your bot and get the bot token
4. Save the bot token securely

### Step 3: Get Your Telegram Chat ID
1. Start a conversation with your newly created bot
2. Send any message to the bot
3. Visit https://api.telegram.org/bot<YourBOTToken>/getUpdates in your browser
4. Look for your chat ID in the response (it will be a number like 123456789)

### Step 4: Configure the Workflow
Import this workflow into your n8n instance and configure each node with your credentials:

**Schedule Trigger node**
- Set your preferred schedule (default: daily at 8:00 AM)
- Use cron expression format (e.g., `0 8 * * *` for 8 AM daily)

**Get Weather node**
- Add your OpenWeatherMap credentials
- Update the cityName parameter to your desired location
- Format: "CityName,CountryCode" (e.g., "London,UK")

**Send a text message node**
- Add your Telegram bot credentials (bot token)
- Replace XXXXXXX in the chatId field with your actual chat ID

## Customization Options

### Location Settings
In the "Get Weather" node, modify the cityName parameter to change the location. You can specify:
- City name only: "Paris"
- City with country: "Paris,FR"
- City with state and country: "Miami,FL,US"

### Schedule Frequency
In the "Schedule Trigger" node, adjust the cron expression:
- Every 6 hours: `0 */6 * * *`
- Twice daily (8 AM & 6 PM): `0 8,18 * * *`
- Weekly on Mondays at 9 AM: `0 9 * * 1`

### Message Format
In the "Format Weather" node, you can customize the message template by modifying the message variable in the function code.
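A minimal sketch of what such a formatting function might look like. This is not the template's exact code; the emoji choices and layout are illustrative, but the field names (main.temp, weather[0].description, etc.) follow OpenWeatherMap's current-weather response.

```javascript
// Hypothetical Format Weather sketch for an n8n Code node.
const w = $input.first().json;   // raw OpenWeatherMap current-weather response

// Note: sunrise/sunset are unix seconds; this renders them in the server's timezone.
const toTime = (unixSeconds) =>
  new Date(unixSeconds * 1000).toLocaleTimeString('en-GB', { hour: '2-digit', minute: '2-digit' });

const message =
  `🌤️ Weather for ${w.name}\n` +
  `🌡️ ${Math.round(w.main.temp)}°C (feels like ${Math.round(w.main.feels_like)}°C)\n` +
  `↕️ Min ${Math.round(w.main.temp_min)}°C / Max ${Math.round(w.main.temp_max)}°C\n` +
  `📝 ${w.weather[0].description}\n` +
  `💨 Wind ${w.wind.speed} m/s at ${w.wind.deg}°\n` +
  `☁️ Cloud cover ${w.clouds.all}%\n` +
  `🌅 Sunrise ${toTime(w.sys.sunrise)} / 🌇 Sunset ${toTime(w.sys.sunset)}`;

return [{ json: { message } }];
```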
The current format includes:
- Current temperature with "feels like" temperature
- Min/max temperatures for the day
- Weather description and precipitation
- Wind speed and direction
- Cloud coverage percentage
- Sunrise and sunset times

### Language Support
In the "Get Weather" node, change the language parameter to get weather descriptions in different languages:
- English: "en"
- Spanish: "es"
- French: "fr"
- German: "de"
- Polish: "pl"

## Troubleshooting

### Common Issues
**Weather data not updating:**
- Verify your OpenWeatherMap API key is valid and active
- Check if you've exceeded your API rate limits
- Ensure the city name format is correct

**Messages not being sent:**
- Confirm your Telegram bot token is correct
- Verify the chat ID is accurate (it should be a number, not a username)
- Make sure you've started a conversation with your bot

**Workflow not triggering:**
- Check if the workflow is activated (the toggle switch should be ON)
- Verify the cron expression syntax is correct
- Ensure your n8n instance is running continuously

### Testing the Workflow
1. Use the "Test workflow" button to run manually
2. Check each node's output for errors
3. Verify the final message format in Telegram

## Node Descriptions
- **Schedule Trigger**: Automatically starts the workflow based on a cron schedule. Runs at specified intervals to fetch fresh weather data.
- **Get Weather**: Connects to the OpenWeatherMap API to retrieve current weather conditions for the specified location.
- **Format Weather**: Processes the raw weather data and creates a user-friendly message with emojis and organized information.
- **Send a text message**: Delivers the formatted weather report to your Telegram chat using the configured bot.

## Additional Features
You can extend this workflow by:
- Adding weather alerts for specific conditions (temperature thresholds, rain warnings)
- Including weather forecasts for multiple days
- Sending reports to multiple chat recipients
- Adding location-based emoji selection
- Integrating with other notification channels (email, Slack, Discord)

## Security Notes
- Keep your API keys and bot tokens secure
- Don't share your chat ID publicly
- Consider using n8n's credential system for storing sensitive information
- Regularly rotate your API keys for better security

Special thanks to Arkadiusz, the only person who supports me in the n8n mission to make automation great again.
by Milan Vasarhelyi - SmoothWork
## Video Introduction
Want to automate your inbox or need a custom workflow? Book a Call | DM me on LinkedIn

## Overview
This workflow automates sending personalized SMS messages directly from a Google Sheet using Twilio. Simply update a row's status to "To send" and the workflow automatically sends the text message, then updates the status to "Success" or "Error" based on delivery results.

Perfect for event reminders, bulk notifications, appointment confirmations, or any scenario where you need to send customized messages to multiple recipients.

## Key Features
- **Simple trigger mechanism**: Change the status column to "To send" to queue messages
- **Personalization support**: Use [First Name] and [Last Name] placeholders in message templates (see the sketch at the end of this section)
- **Automatic status tracking**: The workflow updates your spreadsheet with delivery results
- **Error handling**: Failed deliveries are clearly marked, making it easy to identify issues like invalid phone numbers
- **Runs every minute**: The workflow polls your sheet continuously when active

## Setup Instructions

### Step 1: Copy the Template Spreadsheet
Make a copy of the Google Sheets template by going to File → Make a copy. You must use your own copy so the workflow has permission to update status values.

### Step 2: Connect Your Accounts
- **Google Sheets**: Add your Google account credentials to the 'Monitor Google Sheet for SMS Queue' trigger node
- **Twilio**: Sign up for a free Twilio account (a trial works for testing). From your Twilio dashboard, get your Account SID, Auth Token, and Twilio phone number, then add these credentials to the 'Send SMS via Twilio' node

### Step 3: Configure the Workflow
In the Config node, update:
- sheet_url: Paste the URL of your copied Google Sheet
- from_number: Enter your Twilio phone number (include the country code, e.g., +1234567890)

### Step 4: Activate and Test
Activate the workflow using the toggle in the top right corner. Add a row to your sheet with the required information (ID, First Name, Phone Number, Message Template) and set the Status to "To send". Within one minute, the workflow will process the message and update the status accordingly.
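As referenced in the Key Features above, here is a hedged sketch of how the [First Name]/[Last Name] substitution can work in a Code node. The column names match the template spreadsheet described above; the template's actual node code may differ.

```javascript
// Hypothetical placeholder-substitution sketch.
// One sheet row, e.g. { 'First Name': 'Ada', 'Last Name': 'Lovelace', 'Message Template': '...' }
const row = $input.first().json;

const message = (row['Message Template'] ?? '')
  .replaceAll('[First Name]', row['First Name'] ?? '')
  .replaceAll('[Last Name]', row['Last Name'] ?? '');

return [{ json: { ...row, message } }];
```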
by Pauline
This workflow sends you a Slack alert whenever one of your n8n workflows fails with an error.

- **Error trigger**: This node launches the workflow when one of your active workflows encounters an error
- **Slack node**: This node sends you a customized message to alert you so you can check the error

⚠️ You don't have to activate this workflow for it to be effective.
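The Error Trigger node passes details about the failed execution that you can reference when customizing the Slack message. A hedged Code-node sketch of composing such a message; the field names shown (workflow.name, execution.url, etc.) are the ones the Error Trigger typically provides, but verify them against your n8n version.

```javascript
// Hypothetical sketch: compose a Slack alert from the Error Trigger payload.
const { workflow, execution } = $input.first().json;

const text =
  `🚨 Workflow "${workflow.name}" failed\n` +
  `Node: ${execution.lastNodeExecuted}\n` +
  `Error: ${execution.error?.message ?? 'unknown'}\n` +
  `Execution: ${execution.url}`;

return [{ json: { text } }];   // feed `text` into the Slack node's message field
```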
by Elodie Tasia
Create centralized, structured logs directly from your n8n workflows, using Supabase as your scalable log database. Whether you're debugging a workflow, monitoring execution status, or tracking error events, this template makes it easy to log messages in a consistent, structured format inspired by Log4j2 levels (DEBUG, INFO, WARN, ERROR, FATAL). You'll get a reusable sub-workflow that lets you log any message with optional metadata, tied to a workflow execution and a specific node.

## What this template does
Provides a sub-workflow that inserts log entries into Supabase. Each log entry supports the following fields:
- workflow_name: Your n8n workflow identifier
- node_name: The last executed node
- execution_id: The n8n execution ID, for correlation
- log_level: One of DEBUG, INFO, WARN, ERROR, FATAL
- message: Textual message for the log
- metadata: Optional JSON metadata (flexible format)

Comes with examples for different log levels: easily call the sub-workflow from any step with an Execute Workflow node and pass dynamic parameters.

## Use Cases
- Debug complex workflows without relying on internal n8n logs.
- Catch and trace errors with contextual metadata.
- Integrate logs into external dashboards or monitoring tools via Supabase SQL or APIs.
- Analyze logs by level, time, or workflow.

## Requirements
To use this template, you'll need:
- A Supabase project with:
  - A log_level_type enum
  - A logs table matching the expected structure
- A service role key or Supabase credentials available in n8n.

The table schema and SQL scripts are given in the template file.

## How to Use This Template
1. Clone the sub-workflow into your n8n instance.
2. Set up Supabase credentials (in the Supabase node).
3. Call the sub-workflow using the Execute Workflow node.
4. Provide input values like:

```json
{
  "workflow_name": "sync_crm_to_airtable",
  "execution_id": {{$execution.id}},
  "node_name": "Airtable Insert",
  "log_level": "INFO",
  "message": "New contact pushed to Airtable successfully",
  "metadata": {
    "recordId": "rec123",
    "fields": ["email", "firstName"]
  }
}
```

5. Repeat anywhere you need to log custom events.
by Amir Safavi-Naini
# LLM Cost Monitor & Usage Tracker for n8n

## What This Workflow Does
This workflow provides comprehensive monitoring and cost tracking for all LLM/AI agent usage across your n8n workflows. It extracts detailed token usage data from any workflow execution and calculates precise costs based on current model pricing.

### The Problem It Solves
When running LLM nodes in n8n workflows, the token usage and intermediate data are not directly accessible within the same workflow. This monitoring workflow bridges that gap by:
- Retrieving execution data using the execution ID
- Extracting all LLM usage from any nested structure
- Calculating costs with customizable pricing
- Providing detailed analytics per node and model

**Warning**: This works only after the full execution of the monitored workflow (i.e., you can't get this data before all tasks in that workflow have completed).

## Setup Instructions

### Prerequisites
- **Experience Required**: Basic familiarity with n8n LLM nodes and AI agents
- **Agent Configuration**: In your monitored workflows, go to agent settings and enable "Return Intermediate Steps"
- To fetch execution data, you need to set up the n8n API in your instance (also available on the free version)

### Installation Steps
1. Import this monitoring workflow into your n8n instance
2. Go to Settings → n8n API (left sidebar) → create an API key. You can now add this as the credential for your "Get an Execution" node
3. Configure your model name mappings in the "Standardize Names" node
4. Update model pricing in the "Model Prices" node (prices per 1M tokens; see the sketch below)
5. To monitor a workflow:
   - Add an "Execute Workflow" node at the end of your target workflow
   - Select this monitoring workflow
   - **Important**: Turn OFF "Wait For Sub-Workflow Completion"
   - Pass the execution ID as input
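To make the pricing model concrete, here is a hedged sketch of the per-call cost arithmetic (prices per million tokens, as noted above). The dictionary keys and prices are illustrative placeholders, not the template's shipped values.

```javascript
// Hypothetical cost-calculation sketch (prices in USD per 1M tokens).
const modelPriceDic = {
  'gpt-4':         { input: 30.0, output: 60.0 },   // illustrative prices only
  'gpt-3.5-turbo': { input: 0.5,  output: 1.5 },
};

function llmCallCost(model, promptTokens, completionTokens) {
  const price = modelPriceDic[model];
  if (!price) throw new Error(`Undefined model: ${model}`); // triggers the error path
  const promptCost = (promptTokens / 1_000_000) * price.input;
  const completionCost = (completionTokens / 1_000_000) * price.output;
  return { promptCost, completionCost, totalCost: promptCost + completionCost };
}

// e.g. llmCallCost('gpt-4', 1200, 350)
// => { promptCost: 0.036, completionCost: 0.021, totalCost: 0.057 }
```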
## Customization

### When You See Errors
If the workflow enters the error path, an undefined model name was detected. Simply:
1. Add the model name to the standardize_names_dic
2. Add its pricing to the model_price_dic
3. Re-run the workflow

### Configurable Elements
- **Model Name Mapping**: Standardize different model name variations (e.g., "gpt-4-0613" → "gpt-4")
- **Pricing Dictionary**: Set costs per million tokens for input/output
- **Extraction Depth**: Captures tokens from any nesting level automatically

## Output Data

### Per LLM Call
- **Cost Breakdown**: Prompt, completion, and total costs in USD
- **Token Metrics**: Prompt tokens, completion tokens, total tokens
- **Performance**: Execution time, start time, finish reason
- **Content Preview**: First 100 chars of input/output for debugging
- **Model Parameters**: Temperature, max tokens, timeout, retry count
- **Execution Context**: Workflow name, node name, execution status
- **Flow Tracking**: Previous nodes chain

### Summary Statistics
- Total executions and costs
- Breakdown by model type
- Breakdown by node
- Average cost per call
- Total execution time

## Key Benefits
- **No External Dependencies**: Everything runs within n8n
- **Universal Compatibility**: Works with any workflow structure
- **Automatic Detection**: Finds LLM usage regardless of nesting
- **Real-time Monitoring**: Track costs as workflows execute
- **Debugging Support**: Preview actual prompts and responses
- **Scalable**: Handles multiple models and complex workflows

## Example Use Cases
- **Cost Optimization**: Identify expensive nodes and optimize prompts
- **Usage Analytics**: Track token consumption across teams/projects
- **Budget Monitoring**: Set alerts based on cost thresholds
- **Performance Analysis**: Find slow-running LLM calls
- **Debugging**: Review actual inputs/outputs without logs
- **Compliance**: Audit AI usage across your organization

## Quick Start
1. Import the workflow
2. Update model prices (if needed)
3. Add monitoring to any workflow with the Execute Workflow node
4. View detailed cost breakdowns instantly

**Note**: Prices are configured per million tokens. The defaults include GPT-4, GPT-3.5, Claude, and other popular models. Add custom models as needed.
by Milan Vasarhelyi - SmoothWork
## Video Introduction
Want to automate your inbox or need a custom workflow? Book a Call | DM me on LinkedIn

## Overview
This workflow automatically sends personalized SMS notifications to customers when their order status changes in Airtable. Monitor your order management base and instantly notify customers about updates like "Confirmed" or "Shipped" without manual intervention.

When an order status changes in your Orders table, a notification record is automatically created in a Status Notifications table. The workflow monitors this table, prepares personalized messages using the customer's name and order status, sends the SMS via Twilio, and updates the delivery status back to Airtable for complete tracking and logging.

## Key Features
- Automated SMS sending triggered by Airtable record changes
- Personalized messages with customer name and order status
- Complete delivery tracking with success/error status updates
- Error handling for invalid phone numbers
- Works with Twilio's free trial account for testing

## Common Use Cases
- E-commerce order status updates
- Shipping notifications
- Order confirmation messages
- Customer communication logging

## Setup Instructions

### Step 1: Duplicate the Airtable Base
Copy the Order Management Base template to your Airtable workspace. You must use your own copy, as the workflow needs write permissions.

### Step 2: Connect Your Accounts
- Add your Airtable Personal Access Token credentials to the workflow nodes
- Create a Twilio account (a free trial is available)
- From your Twilio dashboard, obtain your Account SID, Auth Token, and Twilio phone number
- Add the Twilio credentials to the "Send Order Status SMS" node

### Step 3: Configure the Workflow
In the Config node, update these values for your duplicated Airtable base:
- notifications_table_url: Your Status Notifications table URL
- orders_table_url: Your Orders table URL
- from_number: Your Twilio phone number

### Step 4: Customize the Message
Modify the "Prepare SMS Content" node to personalize the message template with your brand voice and additional order details (see the sketch below).

### Step 5: Activate
Toggle the workflow to 'Active' and the automation will monitor your Airtable base every minute, sending notifications automatically.
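A hedged sketch of what the "Prepare SMS Content" step might do: map each order status to a message template. The status values match the examples above; the field names and wording are illustrative assumptions, not the template's exact code.

```javascript
// Hypothetical message-preparation sketch, one template per order status.
const order = $input.first().json;   // e.g. { customerName: 'Ada', orderId: '1042', status: 'Shipped', phone: '+1234567890' }

const templates = {
  Confirmed: (o) => `Hi ${o.customerName}, your order #${o.orderId} is confirmed. We'll text you when it ships!`,
  Shipped:   (o) => `Good news ${o.customerName}: order #${o.orderId} has shipped.`,
};

const buildMessage = templates[order.status];
if (!buildMessage) {
  throw new Error(`No SMS template for status: ${order.status}`);
}

return [{ json: { ...order, smsBody: buildMessage(order) } }];
```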
by Sk developer
# TikTok Account Monitoring Automation
This n8n workflow automates the daily process of fetching TikTok account analytics using the TikTok API and logging the results to Google Sheets. It helps marketing teams, social media managers, and influencer agencies track video performance and audience growth across multiple TikTok usernames without manual effort.

## Workflow Summary

### Trigger via Schedule
The workflow runs automatically every day (or at any custom interval), ensuring data is consistently updated without manual input.

### Sheet 1 - Read TikTok Usernames
A Google Sheet stores the list of TikTok usernames you want to monitor.
Example columns: username, category, priority, notes

### Loop Through Each Username
Each username is processed individually in a loop, to make separate API calls and avoid data conflicts.

### Fetch Analytics via RapidAPI
The following TikTok API endpoint is used:
POST https://tiktok-api42.p.rapidapi.com/videos_view_count.php

You get per-user stats like:
- Number of videos
- Total views
- Recent video views

This endpoint is highly stable and works without TikTok login or auth.

### Sheet 2 - Append Analytics Results
Fetched data is logged in another Google Sheet for performance tracking.
Example columns: username, total_videos, total_views, average_views, fetch_date, category

### Sheet 3 - Log API History or Errors
A third sheet stores logs of API fetch status, failures, or skipped usernames for debugging.
Example columns: username, status (e.g., success, failed, skipped), message, timestamp

## RapidAPI Notes
- You must have an API key for the TikTok API
- All requests are made to https://tiktok-api42.p.rapidapi.com
- The main endpoint in use is: POST https://tiktok-api42.p.rapidapi.com/videos_view_count.php
- Each request uses POST with params like username, region, number (see the request sketch at the end of this section)
- The response is JSON and easy to parse in n8n workflows

## Optional Extensions (Same API, More Insights)
This same TikTok API also supports other advanced endpoints that can be added to enrich your workflow:

| Endpoint Name | Functionality |
|---------------------------|------------------------------------------------------------------|
| User Profile Data | Get bio, profile image, followers, likes, etc. |
| User Account Stats | Extract detailed user metrics (likes, comments, shares) |
| User Audience Stats | Know where their followers are from and the gender split |
| Historical Data | Track historical performance trends (useful for growth charts) |
| HashTags Scraper | Find trending or related hashtags used by the user |
| Related User Info | Suggest accounts similar to the one queried |
| Videos Views Counts | Already used to get view stats for multiple videos |

Each of these can be added using HTTP Request nodes in n8n and plugged into the same sheet or separate ones.

## Benefits
- **Fully Automated**: No manual copy-paste or login required
- **Centralized Analytics**: Track all creators or clients in one dashboard
- **Performance Insights**: Daily growth visibility with historical tracking
- **Data Export Ready**: Stored in Google Sheets for easy share/report/export
- **Scalable & Flexible**: Add hashtags, followers, or audience demographics

## Use Cases
- **Influencer Agencies** tracking clients' TikTok growth daily
- **Brands running UGC Campaigns** who want to monitor video traction
- **Analysts** building dashboards from Sheet-to-DataStudio/Looker
- **Marketers** analyzing viral trends or creators across niches
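As referenced in the RapidAPI notes above, here is a hedged sketch of the per-username request. The X-RapidAPI-* headers follow RapidAPI's usual convention, and the form-encoded body uses the params the listing names (username, region, number); verify both against the API's RapidAPI page before relying on them.

```javascript
// Hypothetical request sketch (equivalent to an n8n HTTP Request node).
const response = await fetch('https://tiktok-api42.p.rapidapi.com/videos_view_count.php', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/x-www-form-urlencoded',
    'X-RapidAPI-Key': 'YOUR_RAPIDAPI_KEY',              // from your RapidAPI account
    'X-RapidAPI-Host': 'tiktok-api42.p.rapidapi.com',
  },
  body: new URLSearchParams({
    username: 'some_creator',   // from Sheet 1
    region: 'US',               // assumed param value
    number: '20',               // assumed: how many recent videos to count
  }),
});

const stats = await response.json();  // per-user video/view stats, per the listing
```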
## Final Note
This workflow is extendable. You can:
- Merge multiple endpoints per user
- Schedule it weekly or monthly
- Send email summaries
- Push to Slack or Google Data Studio

> API used in this workflow: TikTok API
by Luis Hernandez
# GLPI Pending Tickets Notification to Microsoft Teams

## Overview
Automate daily notifications for pending GLPI tickets directly to Microsoft Teams. Never miss critical support cases with this workflow, which monitors assigned tickets and sends personal alerts.

## How It Works
1. **Connect to GLPI** - Authenticates and searches for your assigned tickets
2. **Filter Results** - Finds tickets in "In Progress" status within your entity
3. **Send Notifications** - Delivers formatted alerts to your Teams chat
4. **Clean Up** - Properly closes the GLPI session for security

A sketch of this session flow appears at the end of this section.

## What Gets Monitored
- Tickets assigned to a specific technician (configurable)
- Status: "In Progress/Assigned"
- Entity: Your organization (customizable)
- Date range: Tickets after a specified date

## Key Benefits
- **Never Miss Deadlines** - Daily automated reminders
- **Personal Focus** - Only your assigned tickets
- **Time Savings** - Eliminates manual checking (15-30 min daily)
- **Rich Details** - Shows ticket title, ID, and due date

## Setup Steps
Time required: ~30 minutes
1. **Import Template** - Add the workflow to your n8n instance
2. **Configure GLPI** - Set server URL, credentials, and app token
3. **Set Technician ID** - Update to your GLPI user ID
4. **Connect Teams** - Link your Microsoft Teams account
5. **Customize Filters** - Adjust entity name and date range
6. **Test & Schedule** - Verify notifications and set the daily trigger

## Easy Customization
- Change the technician ID for different users
- Adjust the notification schedule (default: 8 AM daily)
- Modify entity filters for your organization
- Add multiple technicians by duplicating the workflow

## Prerequisites
- GLPI instance with the API enabled
- GLPI user account with ticket read permissions
- Microsoft Teams account (basic license)
- n8n with the Microsoft Teams integration

Perfect for support technicians who want automated reminders about their pending GLPI tickets without manual daily checks.
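For orientation, here is a hedged sketch of the GLPI REST session flow that the first and last steps describe (initSession, an authenticated search, then killSession). The header names and endpoints follow GLPI's documented REST API, but the search criteria shown are illustrative; check the search option IDs for your GLPI version.

```javascript
// Hypothetical GLPI session sketch (roughly what the HTTP Request nodes do).
const base = 'https://glpi.example.com/apirest.php';  // assumed server URL
const appToken = 'YOUR_APP_TOKEN';

// 1. Authenticate: initSession returns a session_token.
const init = await fetch(`${base}/initSession`, {
  headers: { 'App-Token': appToken, 'Authorization': 'user_token YOUR_USER_TOKEN' },
});
const { session_token } = await init.json();
const auth = { 'App-Token': appToken, 'Session-Token': session_token };

// 2. Search tickets (criteria are illustrative; field IDs vary by setup).
const tickets = await (await fetch(
  `${base}/search/Ticket?criteria[0][field]=12&criteria[0][searchtype]=equals&criteria[0][value]=2`,
  { headers: auth },
)).json();

// 3. Clean up: close the session.
await fetch(`${base}/killSession`, { headers: auth });
```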
by Nima Salimi
# Automated SEO Keyword and SERP Analysis with DataForSEO for High-Converting Content | n8n workflow template

## Overview
This is a complete SEO automation workflow built for professionals who want to manage all their DataForSEO operations inside n8n, with no coding required.

You can easily choose your operator (action), such as:
- **SERP Analysis** - Get ranking data for specific keywords
- **Keyword Data** - Retrieve search volume, CPC, and trends
- **Competitor Research** - Analyze which domains dominate target queries

Once the workflow runs, it automatically creates a new Google Sheet (if it doesn't exist) and appends the results, including metrics like keyword, rank, domain, and date, to keep a growing historical record of your SEO data.

Ideal for SEO specialists, agencies, and growth teams who want a single automation to handle all keyword and ranking data pipelines using DataForSEO + Google Sheets + n8n.

Example (related keyword sheet): each operator (SERP, Keywords Data, Competitors) automatically creates a separate Google Sheet.

## Who's it for?
- SEO Specialists who need accurate keyword & SERP insights daily
- Content Marketers planning new blog posts or landing pages
- Digital Marketing Teams tracking top-performing keywords and competitors
- Agencies managing multiple websites or niches with automated reports
- AI-Driven SEOs building GPT-powered content strategies using live ranking data

## How It Works
1. **Trigger & Input Setup**
   - Start the workflow manually or schedule it to run daily / weekly
   - Import a keyword list from Google Sheets, NocoDB, or an internal database
2. **Keyword Data Retrieval (DataForSEO Keyword API)**
   - Sends requests to the keywords_data endpoint of DataForSEO
   - Gathers search volume, CPC, competition level, and trend data
   - Identifies the most promising keywords for conversion-focused content
3. **SERP Analysis (DataForSEO SERP API)**
   - Fetches the top organic results for each keyword (see the request sketch below)
   - Extracts domains, titles, snippets, and ranking positions
   - Highlights which competitors dominate the search landscape
4. **Data Enrichment & Filtering**
   - Uses Code nodes to clean and normalize the DataForSEO JSON output
   - Filters out low-intent or irrelevant keywords automatically
   - Optionally integrates OpenAI or GPT nodes for insight generation
5. **Store & Visualize**
   - Saves results into Google Sheets, Airtable, or NocoDB for tracking
   - Each run adds fresh data, building a performance history over time
6. **Optional AI Layer (Advanced)**
   - Use an OpenAI Chat Model to summarize SERP insights, e.g.:
     > "Top 3 competitors for cloud storage pricing focus on cost transparency; recommend including pricing tables."
   - Automatically generate content briefs or keyword clusters

## Workflow Highlights
- **Multiple DataForSEO Endpoints Supported** (keywords_data, serp, competitors)
- **Automated Scheduling** for daily / weekly updates
- **Data Normalization** for clean, structured SEO metrics
- **Easy Export** to Google Sheets or NocoDB
- **Expandable Design**: integrate GPT, Google Search Console, or Analytics
- **Multi-Language & Multi-Location Support** via language_code and location_code

## Example Output (Google Sheets)
| keyword | rank | domain | volume | cpc | competition | date |
|----------|------|----------------|---------|---------|---------------|------------|
| cloud hosting | 1 | cloud.google.com | 18,100 | $2.40 | 0.62 | 2025-10-25 |
| cloud server | 3 | aws.amazon.com | 12,900 | $3.10 | 0.75 | 2025-10-25 |
| hybrid cloud | 5 | vmware.com | 9,800 | $2.90 | 0.58 | 2025-10-25 |

Each run appends new keyword metrics for trend and performance tracking.
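As referenced in the How It Works section, here is a hedged sketch of a live SERP request. The endpoint path and task-array body follow DataForSEO's documented v3 API, but treat the response-field access as an assumption and confirm it against the API docs.

```javascript
// Hypothetical DataForSEO SERP request sketch (Basic Auth with your API login).
const auth = Buffer.from('login:password').toString('base64'); // your dataforseo.com credentials

const res = await fetch('https://api.dataforseo.com/v3/serp/google/organic/live/advanced', {
  method: 'POST',
  headers: { 'Authorization': `Basic ${auth}`, 'Content-Type': 'application/json' },
  body: JSON.stringify([
    { keyword: 'cloud hosting', location_code: 2840, language_code: 'en' }, // 2840 = US
  ]),
});

const data = await res.json();
// Flatten the organic items into sheet rows (field names per the v3 response shape).
const rows = (data.tasks?.[0]?.result?.[0]?.items ?? [])
  .filter(item => item.type === 'organic')
  .map(item => ({
    keyword: 'cloud hosting',
    rank: item.rank_group,
    domain: item.domain,
    date: new Date().toISOString().slice(0, 10),
  }));
```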
## Pro Tips
- Combine this workflow with Google Search Console for even richer insights
- Adjust the location_code and language_code for local SEO targeting
- Add a Slack or Gmail alert to receive weekly keyword opportunity reports
- Extend with OpenAI to automatically create content briefs or topic clusters

## Integrations Used
- **DataForSEO API** - Keyword & SERP data source
- **Google Sheets / Airtable / NocoDB** - Storage and visualization
- **OpenAI Chat Model** (optional) - Insight generation and summarization
- **Code Nodes** - JSON parsing and custom data processing

## Features
- **Choose from 100+ Locations**: Select your target country, region, or city using the location_code parameter. Perfect for local SEO tracking or multi-market analysis.
- **Choose from 50+ Languages**: Define the language_code to get accurate, language-specific keyword and SERP data. Supports English (en), Spanish (es), French (fr), German (de), and more.
- **Auto-Creates Google Sheets for You**: No need to manually set up a spreadsheet; the workflow automatically creates a new Google Sheet (if it doesn't exist) and structures it with the right columns (query, rank, domain, date, etc.).
- **Appends New Data Automatically**: Every run adds fresh SEO metrics to your sheet, building a continuous daily or weekly ranking history.
- **Flexible Operator Selection**: Choose which DataForSEO operator (action) you want to run: keywords_data, serp, or competitors. Each operator retrieves a different type of SEO insight.
- **Fully Expandable**: Add Slack alerts, Airtable sync, or AI summaries using OpenAI, all within the same workflow.

## How to Set Up
1. **Add DataForSEO Credentials**
   - Get your API login from dataforseo.com
   - Add it under HTTP Request → Basic Auth in n8n
2. **Connect Google Sheets**
   - Authorize your Google account
   - The workflow will auto-create the sheet if it doesn't exist
3. **Choose Operator (Action)**
   - Pick one: serp, keywords_data, or competitors
   - Each operator runs a different SEO analysis
4. **Set Location & Language**
   - Example: location_code: 2840 (US), language_code: en
5. **Run or Schedule**
   - Trigger manually or set a daily schedule
   - New results will append to your Google Sheet automatically

## Check Out My Channel
Learn more about SEO automation, n8n, and AI-powered content workflows. Connect with me on LinkedIn: Nima Salimi. Follow for more templates, AI workflows, and SEO automation tutorials.