by Elodie Tasia
Create centralized, structured logs directly from your n8n workflows, using Supabase as your scalable log database. Whether you're debugging a workflow, monitoring execution status, or tracking error events, this template makes it easy to log messages in a consistent, structured format inspired by Log4j2 levels (DEBUG, INFO, WARN, ERROR, FATAL). You’ll get a reusable sub-workflow that lets you log any message with optional metadata, tied to a workflow execution and a specific node.

## What this template does

Provides a sub-workflow that inserts log entries into Supabase. Each log entry supports the following fields:

- **workflow_name**: your n8n workflow identifier
- **node_name**: the last executed node
- **execution_id**: the n8n execution ID, for correlation
- **log_level**: one of DEBUG, INFO, WARN, ERROR, FATAL
- **message**: textual message for the log
- **metadata**: optional JSON metadata (flexible format)

The template comes with examples for the different log levels: simply call the sub-workflow from any step with an Execute Workflow node and pass dynamic parameters.

## Use Cases

- Debug complex workflows without relying on internal n8n logs.
- Catch and trace errors with contextual metadata.
- Integrate logs into external dashboards or monitoring tools via Supabase SQL or APIs.
- Analyze logs by level, time, or workflow.

## Requirements

To use this template, you'll need:

- A Supabase project with:
  - a log_level_type enum
  - a logs table matching the expected structure
- A service role key or Supabase credentials available in n8n.

The table schema and SQL scripts are given in the template file.

## How to Use This Template

1. Clone the sub-workflow into your n8n instance.
2. Set up Supabase credentials (in the Supabase node).
3. Call the sub-workflow using the Execute Workflow node.
4. Provide input values like:

```json
{
  "workflow_name": "sync_crm_to_airtable",
  "execution_id": {{$execution.id}},
  "node_name": "Airtable Insert",
  "log_level": "INFO",
  "message": "New contact pushed to Airtable successfully",
  "metadata": { "recordId": "rec123", "fields": ["email", "firstName"] }
}
```

5. Repeat anywhere you need to log custom events. A minimal Code-node sketch for assembling this payload dynamically follows below.
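To make the call concrete, here is a minimal sketch of an n8n Code node that assembles the log payload before the Execute Workflow node. It relies on n8n's built-in `$workflow`, `$execution`, and `$prevNode` variables; the field names mirror the logs table described above, and the message and metadata values are purely illustrative.

```javascript
// Minimal sketch: assemble a structured log entry for the logging sub-workflow.
// Field names follow the logs table described above; values are examples only.
const logEntry = {
  workflow_name: $workflow.name,   // current workflow identifier
  execution_id: $execution.id,     // correlates all log lines of one run
  node_name: $prevNode.name,       // last executed node
  log_level: "ERROR",              // DEBUG | INFO | WARN | ERROR | FATAL
  message: "Airtable insert failed",
  metadata: { recordId: $json.id ?? null }, // any extra JSON context
};
return [{ json: logEntry }];
```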
by Amir Safavi-Naini
# LLM Cost Monitor & Usage Tracker for n8n

## 🎯 What This Workflow Does

This workflow provides comprehensive monitoring and cost tracking for all LLM/AI agent usage across your n8n workflows. It extracts detailed token usage data from any workflow execution and calculates precise costs based on current model pricing.

### The Problem It Solves

When running LLM nodes in n8n workflows, the token usage and intermediate data are not directly accessible within the same workflow. This monitoring workflow bridges that gap by:

- Retrieving execution data using the execution ID
- Extracting all LLM usage from any nested structure
- Calculating costs with customizable pricing
- Providing detailed analytics per node and model

**Warning**: it works only after the full execution of the monitored workflow (i.e. you can't get this data before all tasks in that workflow have completed).

## ⚙️ Setup Instructions

### Prerequisites

- **Experience required**: basic familiarity with n8n LLM nodes and AI agents
- **Agent configuration**: in your monitored workflows, go to the agent settings and enable "Return Intermediate Steps"
- To fetch execution data, you need to set up the n8n API in your instance (also available on the free version)

### Installation Steps

1. Import this monitoring workflow into your n8n instance
2. Go to Settings >> select n8n API from the left bar >> define an API key. You can then add this as the credential for your "Get an Execution" node
3. Configure your model name mappings in the "Standardize Names" node
4. Update model pricing in the "Model Prices" node (prices per 1M tokens)

### To Monitor a Workflow

1. Add an "Execute Workflow" node at the end of your target workflow
2. Select this monitoring workflow
3. **Important**: turn OFF "Wait For Sub-Workflow Completion"
4. Pass the execution ID as input

## 🔧 Customization

### When You See Errors

If the workflow enters the error path, it means an undefined model was detected.
Simply:

1. Add the model name to the standardize_names_dic
2. Add its pricing to the model_price_dic
3. Re-run the workflow

### Configurable Elements

- **Model Name Mapping**: standardize different model name variations (e.g., "gpt-4-0613" → "gpt-4")
- **Pricing Dictionary**: set costs per million tokens for input/output
- **Extraction Depth**: captures tokens from any nesting level automatically

## 📊 Output Data

### Per LLM Call

- **Cost Breakdown**: prompt, completion, and total costs in USD
- **Token Metrics**: prompt tokens, completion tokens, total tokens
- **Performance**: execution time, start time, finish reason
- **Content Preview**: first 100 chars of input/output for debugging
- **Model Parameters**: temperature, max tokens, timeout, retry count
- **Execution Context**: workflow name, node name, execution status
- **Flow Tracking**: previous nodes chain

### Summary Statistics

- Total executions and costs
- Breakdown by model type
- Breakdown by node
- Average cost per call
- Total execution time

## ✨ Key Benefits

- **No External Dependencies**: everything runs within n8n
- **Universal Compatibility**: works with any workflow structure
- **Automatic Detection**: finds LLM usage regardless of nesting
- **Real-time Monitoring**: track costs as workflows execute
- **Debugging Support**: preview actual prompts and responses
- **Scalable**: handles multiple models and complex workflows

## 📝 Example Use Cases

- **Cost Optimization**: identify expensive nodes and optimize prompts
- **Usage Analytics**: track token consumption across teams/projects
- **Budget Monitoring**: set alerts based on cost thresholds
- **Performance Analysis**: find slow-running LLM calls
- **Debugging**: review actual inputs/outputs without logs
- **Compliance**: audit AI usage across your organization

## 🚀 Quick Start

1. Import the workflow
2. Update model prices (if needed)
3. Add monitoring to any workflow with the Execute Workflow node
4. View detailed cost breakdowns instantly

Note: prices are configured per million tokens. Defaults include GPT-4, GPT-3.5, Claude, and other popular models. Add custom models as needed.
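As a rough illustration of the extraction and pricing logic (not the template's exact code), the sketch below walks the execution JSON at any depth, collects token-usage objects, and prices them per million tokens. The dictionary names come from the workflow's nodes; their exact shapes here are assumptions.

```javascript
// Illustrative sketch only: recursive token-usage extraction + cost pricing.
// Dictionary shapes are assumptions; adapt them to the actual nodes.
const standardize_names_dic = { "gpt-4-0613": "gpt-4" };
const model_price_dic = { "gpt-4": { input: 30, output: 60 } }; // USD per 1M tokens

function collectUsage(node, found = []) {
  if (node && typeof node === "object") {
    if (node.tokenUsage) found.push(node); // an LLM call at any nesting level
    for (const value of Object.values(node)) collectUsage(value, found);
  }
  return found;
}

// $json would hold the data returned by the "Get an Execution" node.
return collectUsage($json).map((call) => {
  const model = standardize_names_dic[call.model] ?? call.model;
  const price = model_price_dic[model];
  if (!price) throw new Error(`Undefined model: ${model}`); // triggers the error path
  const promptCost = (call.tokenUsage.promptTokens / 1e6) * price.input;
  const completionCost = (call.tokenUsage.completionTokens / 1e6) * price.output;
  return { json: { model, promptCost, completionCost, totalCost: promptCost + completionCost } };
});
```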
by Samir Saci
**Tags**: Supply Chain, Inventory Management, ABC Analysis, Pareto Principle, Demand Variability, Automation, Google Sheets

## Context

Hi! I’m Samir — a Supply Chain Engineer and Data Scientist based in Paris, and founder of LogiGreen Consulting. I help companies optimise inventory and logistics operations by combining data analytics and workflow automation. This workflow is part of our inventory optimisation toolkit, allowing businesses to perform ABC classification and Pareto analysis directly from their transactional sales data.

> Automate inventory segmentation with n8n!

📬 For business inquiries, feel free to connect with me on LinkedIn.

## Who is this template for?

This workflow is designed for supply chain analysts, demand planners, or inventory managers who want to:

- Identify their top-performing items (Pareto 80/20 principle)
- Classify products into ABC categories based on sales contribution
- Evaluate demand variability (XYZ classification support)

Imagine you have a Google Sheet where daily sales transactions are stored: the workflow aggregates sales by item, calculates cumulative contribution, and assigns A, B, or C classes. It also computes the mean, standard deviation, and coefficient of variation (CV) to highlight demand volatility.

## How does it work?

This workflow automates the process of ABC & Pareto analysis from raw sales data:

- 📊 A Google Sheets input provides daily transactional sales
- 🧮 Aggregation and Code nodes compute sales, turnover, and cumulative shares
- 🧠 ABC class mapping assigns items into A/B/C buckets
- 📈 Demand variability metrics (XYZ) are calculated
- 📑 Results are appended into dedicated Google Sheets tabs for reporting

🎥 Watch My Tutorial

Steps:

1. 📝 Load daily sales records from Google Sheets
2. 🔎 Filter out items with zero sales
3. 📊 Aggregate sales by store, item, and day
4. 📈 Perform Pareto analysis to calculate cumulative turnover share
5. 🧮 Compute demand variability (mean, stdev, CV)
6. 🧠 Assign ABC classes based on cumulative share thresholds
7. 📥 Append results into the ABC XYZ and Pareto output sheets

A rough sketch of the classification step appears at the end of this section.

## What do I need to get started?

You’ll need:

- A Google Sheet with sales transactions (date, item, quantity, turnover), available here: Test Sheet
- A Google Sheets account connected in n8n
- Basic knowledge of inventory analysis (ABC/XYZ)

## Next Steps

🗒️ Use the sticky notes in the n8n canvas to:

- Add your Google Sheets credentials
- Replace the Sheet ID with your own sales dataset
- Run the workflow and check the output tabs: ABC XYZ, Pareto, and Store Sales

This template was built using n8n v1.107.3

Submitted: September 15, 2025
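For readers who want to see the logic, here is a rough Code-node sketch of the Pareto/ABC classification plus the variability metrics, assuming per-item aggregates of the shape { item, turnover, dailySales } and the common 80% / 95% cumulative-share thresholds; the template's own thresholds and field names may differ.

```javascript
// Rough sketch of the ABC + variability step; thresholds and field names
// are assumptions, adjust them to your own classification rules.
const items = $input.all().map((i) => i.json);
items.sort((a, b) => b.turnover - a.turnover); // highest turnover first

const total = items.reduce((sum, i) => sum + i.turnover, 0);
let running = 0;

return items.map((i) => {
  running += i.turnover;
  const cumShare = running / total; // cumulative turnover share (Pareto)
  const abc = cumShare <= 0.8 ? "A" : cumShare <= 0.95 ? "B" : "C";

  // Demand variability: coefficient of variation = stdev / mean
  const n = i.dailySales.length;
  const mean = i.dailySales.reduce((s, x) => s + x, 0) / n;
  const stdev = Math.sqrt(i.dailySales.reduce((s, x) => s + (x - mean) ** 2, 0) / n);

  return { json: { item: i.item, turnover: i.turnover, cumShare, abc, mean, stdev, cv: mean ? stdev / mean : null } };
});
```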
by Kirill Khatkevich
This workflow is a production-ready Meta Ads Webhook dispatcher for the Ad Account object. It receives webhook deliveries from Meta, returns the required acknowledgement, routes events by field, logs them to Google Sheets (separate tabs per event type), and sends a compact Slack notification with counts.

## Use Case

Meta webhooks are powerful, but the “last mile” is usually missing: storing raw events, making them readable, and triggering the next automation reliably. This workflow is ideal if you want to:

- **Centralize all Meta Ads webhook events** in one place (one endpoint, one workflow).
- **Log every event** for auditing and analysis (Google Sheets).
- **Get lightweight alerts** in Slack (count-based, not spammy).
- **Trigger downstream workflows** depending on the webhook type (e.g., Creative Fatigue → replace creatives in the affected ad set).

## How it Works

The workflow is organized into clear blocks:

### 1. Webhook endpoint + verification

A Webhook trigger receives requests from Meta. The workflow detects verification requests (hub.mode=subscribe) and validates the Verify Token. It responds with the hub.challenge value so you can successfully subscribe your webhook in Facebook Developers. (A sketch of this handshake follows at the end of this section.)

### 2. Route by event field

For real webhook deliveries (object=ad_account), the workflow routes execution by:

- creative_fatigue
- ad_recommendations
- ads_async_creation_request
- in_process_ad_objects
- product_set_issue
- with_issues_ad_objects

### 3. Acknowledge every webhook delivery

For each branch, the workflow immediately returns a JSON acknowledgement (e.g., { "status": "received", "field": "...", "webhook_type": "..." }), so Meta considers the delivery successful.

### 4. Normalize + log to Google Sheets

The workflow splits array payloads (Meta can send multiple entry items and multiple changes). Each event type is appended to its own Google Sheets tab (one spreadsheet, multiple sheets), with the raw webhook body also saved for future debugging.

### 5. Summarize + notify

All event logs are merged and summarized to compute a compact count per field. A Slack node sends a short message like “New Meta Webhook / Type / Count” with a link to the spreadsheet.

## Setup Instructions

### 1. Create and configure a Meta app

- Create an app in Facebook Developers.
- Add the Webhooks product.
- Subscribe to the Ad Account object.
- Configure the Callback URL (your n8n webhook URL) and the Verify Token (must match the value in the workflow).

### 2. Configure the Webhook node

- Set the webhook path.
- Use the Production URL when you go live.

### 3. Connect credentials

- Connect Google Sheets OAuth credentials in all Google Sheets nodes.
- Connect Slack credentials and choose your target channel.

### 4. Update Google Sheets destinations

Set your Spreadsheet ID and ensure the expected sheet tabs exist (one per supported webhook field).

### 5. Activate

Save and activate the workflow. Trigger a verification request from Facebook Developers to confirm everything is wired correctly.

## Testing

To test the dispatcher before going live, use the open-source tool meta-ads-webhook-tester.

## Further Ideas & Customization

- **Add Telegram**: duplicate the Slack notification step and send the same summary message to Telegram.
- **Trigger automations**: after routing by field, execute another workflow (e.g., Creative Fatigue → fetch affected ad set → rotate creatives).
- **Hardening**: add de-duplication (event IDs), retries, and a dead-letter/error sheet tab for failed log writes.
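As an illustration of the verification handshake in step 1, a Code-node sketch might look like the following; VERIFY_TOKEN is a placeholder you would define both here and in the Facebook Developers webhook configuration (the actual workflow implements this logic with its own nodes).

```javascript
// Sketch of Meta's webhook verification handshake (hub.mode=subscribe).
// VERIFY_TOKEN is a placeholder; it must match the token set in Meta's app.
const query = $json.query ?? {};
const VERIFY_TOKEN = "my-verify-token";

if (query["hub.mode"] === "subscribe" && query["hub.verify_token"] === VERIFY_TOKEN) {
  // Echo the raw challenge back so Meta accepts the subscription.
  return [{ json: { response: query["hub.challenge"] } }];
}

// Real deliveries (object=ad_account) continue to the routing branch instead.
return [{ json: { response: "forbidden" } }];
```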
by Luis Hernandez
# GLPI Pending Tickets Notification to Microsoft Teams

## 📋 Overview

Automate daily notifications for pending GLPI tickets directly to Microsoft Teams. Never miss critical support cases with this workflow that monitors assigned tickets and sends personal alerts.

## 🔧 How It Works

1. **Connect to GLPI** - authenticates and searches for your assigned tickets
2. **Filter Results** - finds tickets in "In Progress" status within your entity
3. **Send Notifications** - delivers formatted alerts to your Teams chat
4. **Clean Up** - properly closes the GLPI session for security

A hedged sketch of the underlying GLPI API calls appears at the end of this section.

## 📊 What Gets Monitored

- Tickets assigned to a specific technician (configurable)
- Status: "In Progress/Assigned"
- Entity: your organization (customizable)
- Date range: tickets after a specified date

## ⚡ Key Benefits

- **Never Miss Deadlines** - daily automated reminders
- **Personal Focus** - only your assigned tickets
- **Time Savings** - eliminates manual checking (15-30 min daily)
- **Rich Details** - shows ticket title, ID, and due date

## ⚙️ Setup Steps

Time required: ~30 minutes

1. **Import Template** - add the workflow to your n8n instance
2. **Configure GLPI** - set the server URL, credentials, and app token
3. **Set Technician ID** - update to your GLPI user ID
4. **Connect Teams** - link your Microsoft Teams account
5. **Customize Filters** - adjust the entity name and date range
6. **Test & Schedule** - verify notifications and set the daily trigger

## 🎨 Easy Customization

- Change the technician ID for different users
- Adjust the notification schedule (default: 8 AM daily)
- Modify entity filters for your organization
- Add multiple technicians by duplicating the workflow

## 📋 Prerequisites

- A GLPI instance with the API enabled
- A GLPI user account with ticket read permissions
- A Microsoft Teams account (basic license)
- n8n with the Microsoft Teams integration

Perfect for support technicians who want automated reminders about their pending GLPI tickets without manual daily checks.
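For orientation, here is a hedged sketch of the GLPI REST calls the workflow wraps in its HTTP Request nodes. The endpoint paths follow GLPI's standard apirest.php API; GLPI_URL, the tokens, and the search-criteria field IDs are placeholders that vary per installation.

```javascript
// Hedged sketch of the GLPI session lifecycle; tokens and the criteria
// field ID (here 5, commonly "assigned technician") are placeholders.
const GLPI_URL = "https://glpi.example.com/apirest.php";

// 1. Open a session and obtain a session token
const init = await this.helpers.httpRequest({
  url: `${GLPI_URL}/initSession`,
  headers: { "App-Token": "YOUR_APP_TOKEN", Authorization: "user_token YOUR_USER_TOKEN" },
  json: true,
});

// 2. Search tickets assigned to a given technician ID
const tickets = await this.helpers.httpRequest({
  url: `${GLPI_URL}/search/Ticket`,
  headers: { "App-Token": "YOUR_APP_TOKEN", "Session-Token": init.session_token },
  qs: {
    "criteria[0][field]": 5,
    "criteria[0][searchtype]": "equals",
    "criteria[0][value]": 42, // your GLPI user ID
  },
  json: true,
});

// 3. Always close the session for security
await this.helpers.httpRequest({
  url: `${GLPI_URL}/killSession`,
  headers: { "App-Token": "YOUR_APP_TOKEN", "Session-Token": init.session_token },
});

return [{ json: tickets }];
```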
by Milan Vasarhelyi - SmoothWork
Video Introduction

Want to automate your inbox or need a custom workflow? 📞 Book a Call | 💬 DM me on Linkedin

## Overview

This workflow automatically sends personalized SMS notifications to customers when their order status changes in Airtable. Monitor your order management base and instantly notify customers about updates like "Confirmed" or "Shipped" without manual intervention.

When an order status changes in your Orders table, a notification record is automatically created in a Status Notifications table. The workflow monitors this table, prepares personalized messages using the customer's name and order status, sends the SMS via Twilio, and updates the delivery status back to Airtable for complete tracking and logging.

## Key Features

- Automated SMS sending triggered by Airtable record changes
- Personalized messages with customer name and order status
- Complete delivery tracking with success/error status updates
- Error handling for invalid phone numbers
- Works with Twilio's free trial account for testing

## Common Use Cases

- E-commerce order status updates
- Shipping notifications
- Order confirmation messages
- Customer communication logging

## Setup Instructions

### Step 1: Duplicate the Airtable Base

Copy the Order Management Base template to your Airtable workspace. You must use your own copy as the workflow needs write permissions.

### Step 2: Connect Your Accounts

- Add your Airtable Personal Access Token credentials to the workflow nodes
- Create a Twilio account (free trial available)
- From your Twilio dashboard, obtain your Account SID, Auth Token, and Twilio phone number
- Add the Twilio credentials to the "Send Order Status SMS" node

### Step 3: Configure the Workflow

In the Config node, update these values for your duplicated Airtable base:

- notifications_table_url: your Status Notifications table URL
- orders_table_url: your Orders table URL
- from_number: your Twilio phone number

### Step 4: Customize the Message

Modify the "Prepare SMS Content" node to personalize the message template with your brand voice and additional order details. (A sketch of what this node might look like follows below.)

### Step 5: Activate

Toggle the workflow to 'Active' and the automation will monitor your Airtable base every minute, sending notifications automatically.
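Step 4 is where most customization happens. Here is one possible shape for the "Prepare SMS Content" node, assuming the notification record carries customer_name, order_status, and phone fields; rename these to match your Status Notifications table.

```javascript
// Possible sketch of "Prepare SMS Content"; field names are assumptions
// based on the base template, adjust them to your own columns.
const { customer_name, order_status, phone } = $json;

const statusLines = {
  Confirmed: "your order has been confirmed and is being prepared.",
  Shipped: "your order has shipped and is on its way!",
};

return [{
  json: {
    to: phone,
    body: `Hi ${customer_name}, ${statusLines[order_status] ?? `your order status is now "${order_status}".`}`,
  },
}];
```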
by Ehsan
## Who's it for

This template is for sales teams, marketing operations (M-Ops), or freelancers who use Airtable as a "control panel" or staging area for new leads. If you're tired of manually copying and pasting approved leads into HubSpot, this workflow automates the entire process for you.

## How it works

This workflow runs on a schedule (e.g., every 5 minutes) to check for new leads.

1. **Before**: your Airtable has new leads with a '📥 New Lead' status.
2. **The Trigger**: you (or a teammate) manually review and change a lead's status to '👍 Ready to Sync'.
3. **The Workflow Runs**: n8n fetches all leads in that view (up to 50 at a time) and loops through them one by one. For each lead, it:
   - Finds (or creates) a Company in HubSpot based on the email domain (see the sketch at the end of this section).
   - Creates (or updates) a Contact in HubSpot based on the email.
   - Automatically associates that Contact with that Company.
4. **After**: the workflow automatically updates the same Airtable row with the new HubSpot IDs and a '✅ Synced' status, completing the 2-way sync.

This template includes a full batch-processing loop, robust error handling (it logs failures back to Airtable), and detailed sticky notes to guide you.

## How to set up

Setup should take less than 10 minutes. All detailed instructions are in the sticky notes inside the workflow.

1. **Copy the Airtable Base**: this is a mandatory first step! You must use this template. ➡️ Click Here to Copy the Base Template (First time using Airtable? Sign up here with my link)
2. **Add Your Credentials**:
   - How to connect Airtable to n8n (Video)
   - How to connect HubSpot to n8n (Video)
3. **Configure 3 Nodes**:
   - Schedule Trigger: set how often you want it to run (e.g., every 5 minutes).
   - get 👍Ready to Sync: select your Airtable credential and the Base you copied. Do this for the other Airtable nodes as well.
   - Search company: select your HubSpot credential. Do this for the other HubSpot nodes as well.
4. **Activate!** Save and activate the workflow. To test it, just change a lead's 'Status' in Airtable to '👍 Ready to Sync'.

## Requirements

- An Airtable account.
- A HubSpot account (a free developer sandbox account is recommended for testing).
- n8n credentials for both Airtable and HubSpot (using a Private App Token for HubSpot).

## How to customize the workflow

- **Add More Fields**: easily sync more data (like 'Phone Number' or 'Lead Source') by adding columns in your Airtable, then adding those fields to the Create or update a contact node in n8n.
- **Change the Schedule**: adjust the Schedule Trigger to run more or less frequently.
- **Add Notifications**: connect a Slack or email node to the 👍Done! Going for next record (success) or, especially, the 👎Failed! Going for next record1 (error) path to get real-time alerts.
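As a small illustration of the company-matching step, the sketch below derives the email domain a HubSpot Company search could key on. The 'Email' field name is an assumption from the base template, and the free-provider filter is one optional refinement, not necessarily part of the original workflow.

```javascript
// Sketch: derive the company domain from a lead's email address.
// "Email" is an assumed Airtable column name; adjust as needed.
const email = ($json.Email ?? "").toLowerCase().trim();
const domain = email.includes("@") ? email.split("@")[1] : null;

// Optional refinement: skip free mail providers so personal addresses
// don't create bogus companies in HubSpot.
const freeProviders = ["gmail.com", "yahoo.com", "outlook.com", "hotmail.com"];
const companyDomain = domain && !freeProviders.includes(domain) ? domain : null;

return [{ json: { ...$json, companyDomain } }];
```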
by Sk developer
# 📊 TikTok Account Monitoring Automation

This n8n workflow automates the daily process of fetching TikTok account analytics using the TikTok API and logging the results to Google Sheets. It helps marketing teams, social media managers, and influencer agencies track video performance and audience growth across multiple TikTok usernames without manual effort.

## 🔁 Workflow Summary

### ⏰ Trigger via Schedule

The workflow runs automatically every day (or at any custom interval), ensuring data is consistently updated without manual input.

### 📥 Sheet 1 – Read TikTok Usernames

A Google Sheet stores the list of TikTok usernames you want to monitor.

✅ Example columns: username, category, priority, notes

### 🔁 Loop Through Each Username

Each username is processed individually in a loop to make separate API calls and avoid data conflicts.

### 📡 Fetch Analytics via RapidAPI

The following TikTok API endpoint is used:

POST https://tiktok-api42.p.rapidapi.com/videos_view_count.php

You get per-user stats like:

- Number of videos
- Total views
- Recent video views

This endpoint is highly stable and works without TikTok login or auth.

### 📤 Sheet 2 – Append Analytics Results

Fetched data is logged in another Google Sheet for performance tracking.

✅ Example columns: username, total_videos, total_views, average_views, fetch_date, category

### 📦 Sheet 3 – Log API History or Errors

A third sheet stores logs of API fetch status, failures, or skipped usernames for debugging.

✅ Example columns: username, status (e.g., success, failed, skipped), message, timestamp

## 🔐 RapidAPI Notes

- You must have an API key for the TikTok API
- All requests are made to https://tiktok-api42.p.rapidapi.com
- The main endpoint in use is POST https://tiktok-api42.p.rapidapi.com/videos_view_count.php
- Each request uses POST with params like username, region, number
- The response is JSON and easy to parse in n8n workflows

A hedged sketch of this request appears at the end of this section.

## 📌 Optional Extensions (Same API, More Insights)

This same TikTok API also supports other advanced endpoints that can be added to enrich your workflow:

| Endpoint Name | Functionality |
|---------------------------|------------------------------------------------------------------|
| User Profile Data | Get bio, profile image, followers, likes, etc. |
| User Account Stats | Extract detailed user metrics (likes, comments, shares) |
| User Audience Stats | Know where their followers are from and the gender split |
| Historical Data | Track historical performance trends (useful for growth charts) |
| HashTags Scraper | Find trending or related hashtags used by the user |
| Related User Info | Suggest accounts similar to the one queried |
| Videos Views Counts | Already used to get view stats for multiple videos |

Each of these can be added using HTTP Request nodes in n8n and plugged into the same sheet or separate ones.

## ✅ Benefits

- 🔄 **Fully Automated**: no manual copy-paste or login required
- 📊 **Centralized Analytics**: track all creators or clients in one dashboard
- 📈 **Performance Insights**: daily growth visibility with historical tracking
- 📤 **Data Export Ready**: stored in Google Sheets for easy sharing, reporting, and export
- 🔧 **Scalable & Flexible**: add hashtags, followers, or audience demographics

## 🧠 Use Cases

- **Influencer Agencies** tracking clients' TikTok growth daily
- **Brands running UGC Campaigns** who want to monitor video traction
- **Analysts** building dashboards from Sheet-to-DataStudio/Looker
- **Marketers** analyzing viral trends or creators across niches

## 📌 Final Note

This workflow is extendable.
You can:

- Merge multiple endpoints per user
- Schedule it weekly or monthly
- Send email summaries
- Push to Slack or Google Data Studio

> API used in this workflow: TikTok API
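For reference, here is a hedged sketch of the per-username call as it might look in an n8n Code node (the same request can be configured in an HTTP Request node instead). The header names follow RapidAPI's standard convention, the parameter names come from the description above, and YOUR_RAPIDAPI_KEY is a placeholder; verify the exact parameter encoding against the API's docs on RapidAPI.

```javascript
// Hedged sketch of the RapidAPI call; confirm the parameter list and
// encoding against the TikTok API docs before relying on it.
const response = await this.helpers.httpRequest({
  method: "POST",
  url: "https://tiktok-api42.p.rapidapi.com/videos_view_count.php",
  headers: {
    "x-rapidapi-key": "YOUR_RAPIDAPI_KEY",
    "x-rapidapi-host": "tiktok-api42.p.rapidapi.com",
  },
  body: { username: $json.username, region: "US", number: 10 },
  json: true, // parse the JSON response
});
return [{ json: response }];
```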
by Marth
## How It Works: The 5-Node Monitoring Flow

This concise workflow efficiently captures, filters, and delivers crucial cybersecurity-related mentions.

### 1. Monitor: Cybersecurity Keywords (X/Twitter Trigger)

This is the entry point of your workflow. It actively searches X (formerly Twitter) for tweets containing the specific keywords you define.

- **Function**: continuously polls X for tweets that match your specified queries (e.g., your company name, "Log4j," "CVE-2024-XXXX," "ransomware").
- **Process**: as soon as a matching tweet is found, it triggers the workflow to begin processing that information.

### 2. Format Notification (Code Node)

This node prepares the raw tweet data, transforming it into a clean, actionable message for your alerts.

- **Function**: extracts key details from the raw tweet and structures them into a clear, concise message.
- **Process**: it pulls out the tweet's text, the user's handle (@screen_name), and the direct URL to the tweet. These pieces are then combined into a user-friendly notificationMessage. You can also include basic filtering logic here if needed. (A sketch of this node appears at the end of this section.)

### 3. Valid Mention? (If Node)

This node acts as a quick filter to help reduce noise and prevent irrelevant alerts from reaching your team.

- **Function**: serves as a simple conditional check to validate the mention's relevance.
- **Process**: it evaluates the notificationMessage against specific criteria (e.g., ensuring it doesn't contain common spam words like "bot"). If the mention passes this basic validation, the workflow continues. Otherwise, it quietly ends for that particular tweet.

### 4. Send Notification (Slack Node)

This is the delivery mechanism for your alerts, ensuring your team receives instant, visible notifications.

- **Function**: delivers the formatted alert message directly to your designated communication channel.
- **Process**: the notificationMessage is sent straight to your specified **Slack channel** (e.g., #cyber-alerts or #security-ops).

### 5. End Workflow (No-Op Node)

This node simply marks the successful completion of the workflow's execution path.

- **Function**: indicates the end of the workflow's process for a given trigger.

## How to Set Up

Implementing this simple cybersecurity monitor in your n8n instance is quick and straightforward.

### 1. Prepare Your Credentials

Before building the workflow, ensure all necessary accounts are set up and their respective credentials are ready for n8n.

- **X (Twitter) API**: you'll need an X (Twitter) developer account to create an application and obtain your Consumer Key/Secret and Access Token/Secret. Use these to set up your **Twitter credential** in n8n.
- **Slack API**: set up your **Slack credential** in n8n. You'll also need the **Channel ID** of the Slack channel where you want your security alerts to be posted (e.g., #security-alerts or #it-ops).

### 2. Import the Workflow JSON

Get the workflow structure into your n8n instance.

- **Import**: in your n8n instance, go to the "Workflows" section. Click the "New" or "+" icon, then select "Import from JSON." Paste the provided JSON code into the import dialog and import the workflow.

### 3. Configure the Nodes

Customize the imported workflow to fit your specific monitoring needs.

- **Monitor: Cybersecurity Keywords (X/Twitter)**: click on this node. Select your newly created Twitter credential. CRITICAL: modify the "Query" parameter to include your specific brand names, relevant CVEs, or general cybersecurity terms. For example: "YourCompany" OR "CVE-2024-1234" OR "phishing alert". Use OR to combine multiple terms.
- **Send Notification (Slack)**: click on this node. Select your Slack credential. Replace "YOUR_SLACK_CHANNEL_ID" with the actual Channel ID you noted earlier for your security alerts. (Optional: you can adjust the "Valid Mention?" node's condition if you find specific patterns of false positives in your search results that you want to filter out.)

### 4. Test and Activate

Verify that your workflow is working correctly before setting it live.

- **Manual Test**: click the "Test Workflow" button (usually in the top right corner of the n8n editor). This will execute the workflow once.
- **Verify Output**: check your specified Slack channel to confirm that any detected mentions are sent as notifications in the correct format. If no matching tweets are found, you won't see a notification, which is expected.
- **Activate**: once you're satisfied with the test results, toggle the "Active" switch (usually in the top right corner of the n8n editor) to ON. Your workflow will then automatically monitor X (Twitter) at the specified polling interval.
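To make node 2 concrete, here is a minimal sketch of the "Format Notification" Code node. It assumes the X trigger outputs text, user.screen_name, and id_str fields; actual field names vary by X API version, so treat them as placeholders.

```javascript
// Minimal sketch of "Format Notification"; field names are placeholders
// that depend on the X API version your trigger uses.
const tweet = $json;
const handle = tweet.user?.screen_name ?? "unknown";
const url = `https://x.com/${handle}/status/${tweet.id_str}`;

const notificationMessage =
  `🚨 Cybersecurity mention by @${handle}\n` +
  `${tweet.text}\n${url}`;

return [{ json: { notificationMessage, handle, url } }];
```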
by Nima Salimi
# 🧠 Automated SEO Keyword and SERP Analysis with DataForSEO for High-Converting Content | n8n workflow template

## Overview 🌐

This is a complete SEO automation workflow built for professionals who want to manage all their DataForSEO operations inside n8n — no coding required ⚙️

You can easily choose your operator (action), such as:

- 🔍 SERP Analysis – get ranking data for specific keywords
- 📈 Keyword Data – retrieve search volume, CPC, and trends
- 🧠 Competitor Research – analyze which domains dominate target queries

Once the workflow runs, it automatically creates a new Google Sheet 📊 (if it doesn’t exist) and appends the results — including metrics like keyword, rank, domain, and date — to keep a growing historical record of your SEO data 📅

💡 Ideal for SEO specialists, agencies, and growth teams who want a single automation to handle all keyword and ranking data pipelines using DataForSEO + Google Sheets + n8n.

### Examples

*Related keyword sheet*: each operator (SERP, Keywords Data, Competitors) automatically creates a separate Google Sheet 📊

## 👤 Who’s it for?

- 🧩 SEO Specialists who need accurate keyword & SERP insights daily
- ✍️ Content Marketers planning new blog posts or landing pages
- 📊 Digital Marketing Teams tracking top-performing keywords and competitors
- 💼 Agencies managing multiple websites or niches with automated reports
- 🧠 AI-Driven SEOs building GPT-powered content strategies using live ranking data

## ⚙️ How It Works

1. **Trigger & Input Setup**
   - Start the workflow manually or schedule it to run daily / weekly 🕒
   - Import a keyword list from Google Sheets 📄, NocoDB, or an internal database
2. **Keyword Data Retrieval (DataForSEO Keyword API)**
   - Sends requests to the keywords_data endpoint of DataForSEO
   - Gathers search volume, CPC, competition level, and trend data
   - Identifies the most promising keywords for conversion-focused content
3. **SERP Analysis (DataForSEO SERP API)**
   - Fetches the top organic results for each keyword (see the request sketch at the end of this section)
   - Extracts domains, titles, snippets, and ranking positions
   - Highlights which competitors dominate the search landscape
4. **Data Enrichment & Filtering**
   - Uses Code nodes to clean and normalize the DataForSEO JSON output
   - Filters out low-intent or irrelevant keywords automatically
   - Optionally integrates OpenAI or GPT nodes for insight generation ✨
5. **Store & Visualize**
   - Saves results into Google Sheets, Airtable, or NocoDB for tracking
   - Each run adds fresh data, building a performance history over time 📈
6. **Optional AI Layer (Advanced)**
   - Use an OpenAI Chat Model to summarize SERP insights:
     > “Top 3 competitors for cloud storage pricing focus on cost transparency — recommend including pricing tables.”
   - Automatically generate content briefs or keyword clusters

## 🧩 Workflow Highlights

- ⚡ Multiple DataForSEO Endpoints Supported (keywords_data, serp, competitors)
- 🔁 Automated Scheduling for daily / weekly updates
- 🧠 Data Normalization for clean, structured SEO metrics
- 📊 Easy Export to Google Sheets or NocoDB
- 🧩 Expandable Design — integrate GPT, Google Search Console, or Analytics
- 🌎 Multi-Language & Multi-Location Support via language_code and location_code

## 📊 Example Output (Google Sheets)

| keyword | rank | domain | volume | cpc | competition | date |
|----------|------|----------------|---------|---------|---------------|------------|
| cloud hosting | 1 | cloud.google.com | 18,100 | $2.40 | 0.62 | 2025-10-25 |
| cloud server | 3 | aws.amazon.com | 12,900 | $3.10 | 0.75 | 2025-10-25 |
| hybrid cloud | 5 | vmware.com | 9,800 | $2.90 | 0.58 | 2025-10-25 |

Each run appends new keyword metrics for trend and performance tracking.
## 💡 Pro Tips

- 🔍 Combine this workflow with Google Search Console for even richer insights
- ⚙️ Adjust the location_code and language_code for local SEO targeting
- 💬 Add a Slack or Gmail alert to receive weekly keyword opportunity reports
- 🤖 Extend with OpenAI to automatically create content briefs or topic clusters

## 📚 Integrations Used

- 🧭 DataForSEO API – keyword & SERP data source
- 📄 Google Sheets / Airtable / NocoDB – storage and visualization
- 🤖 OpenAI Chat Model (optional) – insight generation and summarization
- ⚙️ Code Nodes – JSON parsing and custom data processing

## ✅ Features

- 🌎 **Choose from 100+ Locations**: select your target country, region, or city using the location_code parameter. Perfect for local SEO tracking or multi-market analysis.
- 🗣️ **Choose from 50+ Languages**: define the language_code to get accurate, language-specific keyword and SERP data. Supports English (en), Spanish (es), French (fr), German (de), and more.
- 📊 **Auto-Creates Google Sheets for You**: no need to manually set up a spreadsheet — the workflow automatically creates a new Google Sheet (if it doesn’t exist) and structures it with the right columns (query, rank, domain, date, etc.).
- 🔁 **Append New Data Automatically**: every run adds fresh SEO metrics to your sheet, building a continuous daily or weekly ranking history.
- ⚙️ **Flexible Operator Selection**: choose which DataForSEO operator (action) you want to run: keywords_data, serp, or competitors. Each operator retrieves a different type of SEO insight.
- 🧠 **Fully Expandable**: add Slack alerts, Airtable sync, or AI summaries using OpenAI — all within the same workflow.

## ⚙️ How to Set Up

1. 🔑 **Add DataForSEO Credentials**
   - Get your API login from dataforseo.com
   - Add it under HTTP Request → Basic Auth in n8n
2. 📄 **Connect Google Sheets**
   - Authorize your Google account
   - The workflow will auto-create the sheet if it doesn’t exist
3. 🎛 **Choose Operator (Action)**
   - Pick one: serp, keywords_data, or competitors
   - Each operator runs a different SEO analysis
4. 🌍 **Set Location & Language**
   - Example: location_code: 2840 (US), language_code: en
5. 🕒 **Run or Schedule**
   - Trigger manually or set a daily schedule
   - New results will append to your Google Sheet automatically

## 📺 Check Out My Channel

💬 Learn more about SEO automation, n8n, and AI-powered content workflows.

👉 Connect with me on LinkedIn: Nima Salimi — follow for more templates, AI workflows, and SEO automation tutorials 💥
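For orientation, here is a hedged sketch of a live SERP request as the workflow's HTTP Request or Code node might issue it. The endpoint and task shape follow DataForSEO's public v3 API, but verify them against your account and plan; the credentials are placeholders.

```javascript
// Hedged sketch of a DataForSEO v3 live SERP request; verify endpoint and
// response paths against the official docs before relying on them.
const res = await this.helpers.httpRequest({
  method: "POST",
  url: "https://api.dataforseo.com/v3/serp/google/organic/live/advanced",
  auth: { username: "DATAFORSEO_LOGIN", password: "DATAFORSEO_PASSWORD" },
  body: [{ keyword: $json.keyword, location_code: 2840, language_code: "en" }],
  json: true,
});

// Flatten the ranked items into rows for the Google Sheet.
const items = res.tasks?.[0]?.result?.[0]?.items ?? [];
return items.map((item) => ({
  json: {
    keyword: $json.keyword,
    rank: item.rank_absolute,
    domain: item.domain,
    date: new Date().toISOString().slice(0, 10),
  },
}));
```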
by Milan Vasarhelyi - SmoothWork
Video Introduction

Want to automate your inbox or need a custom workflow? 📞 Book a Call | 💬 DM me on Linkedin

## Overview

This workflow automatically exports customer balance data from QuickBooks to Google Sheets on a monthly basis. It eliminates manual data entry and creates a historical record of customer balances that updates automatically, making it easy to track payment trends, identify outstanding balances, and monitor customer financial health over time.

## Key Features

- **Automated Monthly Reporting**: runs on the first day of each month to capture a snapshot of all customer balances
- **Clean Data Structure**: extracts only the essential fields (Customer ID, Balance, Email, and Period) for easy analysis
- **Historical Tracking**: each monthly run appends new data to your Google Sheet, building a timeline of customer balances
- **No Manual Work**: once configured, the workflow runs completely hands-free

## Common Use Cases

- Track customer payment patterns and identify accounts with growing balances
- Create monthly reports for management or finance teams
- Build dashboards and visualizations from historical QuickBooks data
- Monitor customer account health without logging into QuickBooks

## Setup Requirements

1. **QuickBooks Developer Account**: register at developer.intuit.com and create a new app in the App Dashboard. Select the 'Accounting' scope for permissions. You'll receive a Client ID and Client Secret to configure your n8n credentials.
2. **Credentials**: set up QuickBooks OAuth2 credentials in n8n using your app's Client ID and Client Secret. Use the 'Sandbox' environment for testing or 'Production' for live data (requires Intuit app approval). Also connect your Google Sheets account.
3. **Google Sheet**: create a spreadsheet with column headers matching the workflow output: Period, Id, Balance, and Email.

## Configuration

- **Schedule**: the workflow runs monthly on the first day at 8 AM. Modify the Schedule Trigger to change the timing or frequency.
- **Spreadsheet URL**: update the 'Export to Google Sheets' node with your destination spreadsheet URL.
- **Data Fields**: customize the 'Prepare Customer Data' node to extract different customer fields if needed; a sketch of this node follows below.
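As a sketch of the 'Prepare Customer Data' step referenced above, the node might reduce each QuickBooks customer object to the four output columns. Id, Balance, and PrimaryEmailAddr are standard fields on the QuickBooks Customer entity; the period format shown is an assumption.

```javascript
// Sketch of "Prepare Customer Data": keep only the four sheet columns.
// PrimaryEmailAddr.Address is the standard QuickBooks email field.
const period = new Date().toISOString().slice(0, 7); // e.g. "2025-01"
return $input.all().map(({ json: customer }) => ({
  json: {
    Period: period,
    Id: customer.Id,
    Balance: customer.Balance,
    Email: customer.PrimaryEmailAddr?.Address ?? "",
  },
}));
```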
by Mohammed Abid
# Shopify Order Data to Airtable

This n8n template demonstrates how to capture incoming Shopify order webhooks, transform the data into a structured format, and insert each product line item as a separate record in an Airtable sheet. It provides both high-level order information and detailed product-level metrics, making it ideal for analytics, reporting, inventory management, and customer insights.

## Good to Know

- **Airtable API Rate Limits**: by default, Airtable allows 5 requests per second per base. Consider batching or adding delays if you process high volumes of orders.
- **Shopify Webhook Configuration**: ensure you have configured the orders/create webhook in your Shopify Admin to point to the n8n webhook node.
- **Field Mapping**: the template maps standard Shopify fields; if your store uses custom order or line item properties, update the Function nodes accordingly.

## How It Works

1. **Webhook Trigger**: a Shopify orders/create webhook fires when a new order is placed.
2. **Normalize Order Data**: the Function node extracts core order, customer, shipping, and billing details and computes financial totals (subtotal, tax, shipping, discounts).
3. **Line Item Breakdown**: a second Function node builds an array of objects — one per line item — calculating per-item totals, tax/shipping allocation, and product attributes (color, size, material). A sketch of this step appears at the end of this section.
4. **Check Customer Record**: optionally check against an Airtable "Customers" sheet to flag new vs. existing customers.
5. **Auto-Increment Record ID**: a Function node generates a running serial number for each Airtable record.
6. **Insert Records**: the Airtable node writes each line item object into the target base and table, creating rich records with both order-level and product-level details.

## How to Use

1. **Clone the Template**: click "Use Template" in your n8n instance to import this workflow.
2. **Configure Credentials**:
   - Shopify Trigger: add your Shopify store domain and webhook secret.
   - Airtable Node: set up your Airtable API key and select the base and table.
3. **Review Field Names**: match the field names in the Function nodes to the columns in your Airtable table.
4. **Activate Workflow**: turn on the workflow and place a test order in your Shopify store.
5. **Verify Records**: check your Airtable sheet to see the new order and its line items.

## Requirements

- n8n@latest
- A Shopify store with the orders/create webhook configured
- An Airtable account with a base and table ready to receive records

## Customizing This Workflow

- **Add Custom Fields**: extend the Function nodes to include additional Shopify metafields, discounts, or customer tags.
- **Alternative Destinations**: replace the Airtable node with Google Sheets, Supabase, or another database by swapping in the corresponding node.
- **Error Handling**: insert If/Wait nodes to retry on API failures or send notifications on errors.
- **Multi-Currency Support**: adapt the currency logic to convert totals based on dynamic exchange rates.
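To illustrate step 3, here is a rough sketch of the line-item breakdown, using standard Shopify order webhook fields. The proportional shipping allocation shown is one simple approach and not necessarily the template's exact math.

```javascript
// Rough sketch of the line-item breakdown Function node; allocation logic
// is illustrative, adjust it to match your reporting needs.
const order = $json;
const subtotal = parseFloat(order.subtotal_price);
const shipping = (order.shipping_lines ?? []).reduce((s, l) => s + parseFloat(l.price), 0);

return order.line_items.map((li) => {
  const lineTotal = parseFloat(li.price) * li.quantity;
  const share = subtotal > 0 ? lineTotal / subtotal : 0; // item's share of the order
  return {
    json: {
      order_id: order.id,
      order_number: order.order_number,
      sku: li.sku,
      title: li.title,
      quantity: li.quantity,
      line_total: lineTotal,
      shipping_allocated: +(shipping * share).toFixed(2),
    },
  };
});
```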