by WeblineIndia
# Weekly WooCommerce Finance KPI Automation with HTTP APIs & Slack

This workflow automatically gathers weekly WooCommerce order and refund data, calculates essential financial KPIs, detects potential refund-related risks and sends a clear weekly finance summary to Slack. Once configured, it runs on a schedule and delivers leadership-ready insights without any manual reporting.

## Quick Implementation Steps

1. Import the workflow into your automation platform.
2. Update the WooCommerce store domain in the configuration step.
3. Add WooCommerce Consumer Key and Consumer Secret for API access.
4. Connect your Slack account and choose a destination channel.
5. Enable the workflow to receive weekly finance updates automatically.

## What It Does

This workflow automates the weekly finance reporting process for WooCommerce stores by combining sales and refund data into a single, structured summary. It collects completed orders, cleans and standardizes the data and processes refund records to ensure accurate totals and counts.

Using this data, the workflow calculates key metrics such as total sales amount, number of orders, total refunds and refund ratios. These KPIs help teams quickly assess store performance and identify refund patterns that may require attention.

The workflow concludes by sending a well-formatted, executive-friendly digest to Slack, ensuring that finance and leadership teams always have timely and reliable insights.

## Who’s It For

This workflow is designed for:

- Finance and accounting teams
- CFOs and business leaders
- WooCommerce store owners
- Operations and revenue managers
- Agencies managing WooCommerce stores

## Requirements to Use This Workflow

To use this workflow, you need:

- A workflow automation platform
- A WooCommerce store with REST API access enabled
- WooCommerce Consumer Key and Consumer Secret
- Access to a Slack workspace
- Permission to configure API credentials and Slack integrations

## How It Works & How To Set Up

### 1. Weekly Schedule Trigger
- Automatically runs the workflow once every week.
- Controls when KPI data is generated.

### 2. WooCommerce Store Configuration
- Defines the WooCommerce domain used for all API calls.
- Makes it easy to reuse or update the workflow for another store.

### 3. Fetch WooCommerce Orders
- Retrieves order data using the WooCommerce Orders API.
- Pulls data relevant to the weekly reporting period.
- Uses HTTP Basic Authentication.

### 4. Filter Completed Orders
- Keeps only orders with a completed status.
- Ensures only successful sales are included.

### 5. Normalize Order Data
- Extracts essential finance fields: Order ID, order date, order total, line items.
- Creates a clean data structure for KPI calculations.

### 6. Fetch WooCommerce Refunds
- Retrieves refund records using the WooCommerce Refunds API.
- Ensures refunds are analyzed alongside sales data.

### 7. Normalize Refund Data
- Extracts refund ID, parent order ID and refund amount.
- Standardizes refund information for accurate aggregation.

### 8. Combine Orders & Refunds
- Merges sales and refund datasets into a single input.
- Prepares the data for KPI calculations.

### 9. Calculate Finance KPIs
- Calculates: total sales amount, total order count, total refund amount, total refund count, refund-to-sales ratio, refund-to-order ratio.
- Removes duplicate refunds.
- Adds automatic risk flags when thresholds are exceeded (see the sketch below).

### 10. Send Weekly KPI Digest to Slack
- Posts a formatted summary message to Slack.
- Users can select any Slack channel for delivery.
- Designed for quick review by leadership teams.
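To make step 9 concrete, here is a minimal sketch of the KPI math as it might look in an n8n Code node. The field names (type, total, refundId, amount) and the 10% risk threshold are assumptions for illustration, not values taken from the template.

```javascript
// Minimal sketch of the KPI calculation (field names and threshold assumed).
// Input: normalized orders and refunds merged into one item stream.
const orders = items.filter(i => i.json.type === 'order').map(i => i.json);
const refunds = items.filter(i => i.json.type === 'refund').map(i => i.json);

// Remove duplicate refunds by refund ID.
const uniqueRefunds = [...new Map(refunds.map(r => [r.refundId, r])).values()];

const totalSales = orders.reduce((sum, o) => sum + Number(o.total), 0);
const totalRefunds = uniqueRefunds.reduce((sum, r) => sum + Number(r.amount), 0);

const kpis = {
  totalSales,
  orderCount: orders.length,
  totalRefunds,
  refundCount: uniqueRefunds.length,
  refundToSalesRatio: totalSales > 0 ? totalRefunds / totalSales : 0,
  refundToOrderRatio: orders.length > 0 ? uniqueRefunds.length / orders.length : 0,
};

// Hypothetical risk flag: refunds above 10% of sales get highlighted in the digest.
kpis.riskFlag = kpis.refundToSalesRatio > 0.10;

return [{ json: kpis }];
```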
## How To Customize Nodes

- **Schedule**: Change the weekly run day or time.
- **Order Filters**: Include additional order statuses if required.
- **KPI Logic**: Modify ratios, thresholds or calculations.
- **Slack Message**: Adjust formatting, wording or emojis.
- **Store Setup**: Reuse the workflow for different WooCommerce stores.

## Add-Ons (Optional Enhancements)

This workflow can be extended with:

- Explicit weekly date filters
- Spreadsheet or database exports
- Email delivery in addition to Slack
- Multi-store KPI reporting
- Product-level or category-level metrics
- Automated alerts for unusual refund activity

## Use Case Examples

Common use cases include:

- Weekly WooCommerce finance performance reporting
- Refund trend monitoring for leadership teams
- Automated CFO-level summaries
- Operations and revenue review meetings
- Agency reporting for managed WooCommerce stores

There are many additional business-specific use cases where this workflow can be applied.

## Troubleshooting Guide

| Issue | Possible Cause | Solution |
| --- | --- | --- |
| Slack message not received | Slack integration not configured | Reconnect Slack account and select a channel |
| Sales totals appear as zero | No completed orders for the period | Verify order status and store activity |
| Refund data missing | API permission issue | Confirm WooCommerce API access |
| Authentication error | Invalid credentials | Regenerate Consumer Key and Secret |
| Workflow not running | Automation not activated | Enable the workflow |

## Need Help?

If you need assistance setting up this workflow, customizing KPIs or extending it with advanced reporting features, WeblineIndia can help you:

- Configure and deploy automation workflows
- Customize finance and reporting logic
- Integrate WooCommerce with Slack and other tools
- Build similar workflows tailored to your business

👉 Reach out to our n8n automation experts at WeblineIndia for expert support and custom automation solutions.
by Cheng Siong Chin
## How It Works

This workflow automates clinical trial signal validation and regulatory governance through intelligent AI-driven oversight. Designed for clinical research organizations, pharmaceutical companies, and regulatory affairs teams, it solves the critical challenge of ensuring trial compliance while managing post-market surveillance obligations across multiple regulatory frameworks.

The system operates on scheduled intervals, fetching data from clinical trial databases and laboratory production signals, then merging these sources for comprehensive analysis. It employs dual AI agents for clinical signal validation and governance assessment, detecting protocol deviations, safety signals, and compliance violations. The workflow intelligently routes findings based on governance action requirements, orchestrating parallel processes for regulatory reporting, batch result documentation, and post-market surveillance logging. By maintaining synchronized audit trails across regulatory reports, batch records, post-market surveillance, and comprehensive action logs, it ensures complete traceability while automating escalation to quality teams when intervention is required.

## Setup Steps

1. Configure Schedule Trigger with monitoring frequency for trial oversight
2. Connect Workflow Configuration node with trial parameters and compliance rules
3. Set up Fetch Clinical Trial Data and Fetch Lab & Production Signals nodes
4. Configure Merge Signal Sources node for data consolidation logic
5. Connect Clinical Signal Validation Agent with OpenAI/Nvidia API credentials
6. Set up parallel AI processing
7. Configure Regulatory Governance Agent with AI API credentials for compliance assessment
8. Connect Route by Governance Action node with classification logic (a sketch follows below)

## Prerequisites

- OpenAI or Nvidia API credentials for AI validation agents
- Clinical trial database API access

## Use Cases

- Pharmaceutical companies managing Phase III trial monitoring
- CROs overseeing multi-site clinical studies

## Customization

Adjust signal validation criteria for therapeutic area-specific protocols

## Benefits

- Reduces regulatory review cycles by 70%
- Eliminates manual signal triage
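As an illustration only, the classification logic behind the Route by Governance Action node could look like the sketch below. The action labels and field names are assumptions, since the template does not publish the agent's output schema.

```javascript
// Hypothetical routing helper for the Route by Governance Action node.
// Assumes the governance agent emits { action, severity }; labels are invented.
const branches = {
  regulatory_report: 'Regulatory Reporting',
  batch_documentation: 'Batch Result Documentation',
  surveillance_log: 'Post-Market Surveillance Logging',
};

return items.map(item => {
  const { action, severity } = item.json;
  return {
    json: {
      ...item.json,
      branch: branches[action] ?? 'Action Log',
      // Escalation to the quality team when the agent flags a critical finding.
      escalate: severity === 'critical',
    },
  };
});
```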
by Alexandra Spalato
## Short Description

This LinkedIn automation workflow monitors post comments for specific trigger words and automatically sends direct messages with lead magnets to engaged users. The system checks connection status, handles non-connected users with connection requests, and prevents duplicate outreach by tracking all interactions in a database.

## Key Features

- **Comment Monitoring**: Scans LinkedIn post comments for customizable trigger words
- **Connection Status Check**: Determines if users are 1st-degree connections
- **Automated DMs**: Sends personalized messages with lead magnet links to connected users
- **Connection Requests**: Asks non-connected users to connect via comment replies
- **Duplicate Prevention**: Tracks interactions in NocoDB to avoid repeat messages
- **Message Rotation**: Uses different comment reply variations for authenticity
- **Batch Processing**: Handles multiple comments with built-in delays

## Who This Workflow Is For

- Content creators looking to convert post engagement into leads
- Coaches and consultants sharing valuable LinkedIn content
- Anyone wanting to automate lead capture from LinkedIn posts

## How It Works

1. **Setup**: Configure post ID, trigger word, and lead magnet link via form
2. **Comment Extraction**: Retrieves all comments from the specified post using Unipile
3. **Trigger Detection**: Filters comments containing the specified trigger word (see the sketch at the end of this listing)
4. **Connection Check**: Determines if commenters are 1st-degree connections
5. **Smart Routing**: Connected users receive DMs, others get connection requests
6. **Database Logging**: Records all interactions to prevent duplicates

## Setup Requirements

### Required Credentials

- **Unipile API Key**: For LinkedIn API access
- **NocoDB API Token**: For database tracking

### Database Structure

**Table: leads**

- linkedin_id: LinkedIn user ID
- name: User's full name
- headline: LinkedIn headline
- url: Profile URL
- date: Interaction date
- posts_id: Post reference
- connection_status: Network distance
- dm_status: Interaction type (sent/connection request)

## Customization Options

- **Message Templates**: Modify DM and connection request messages
- **Trigger Words**: Change the words that activate the workflow
- **Timing**: Adjust delays between messages (8-12 seconds default)
- **Reply Variations**: Add more comment reply options for authenticity

## Installation Instructions

1. Import the workflow into your n8n instance
2. Set up NocoDB database with required table structure
3. Configure Unipile and NocoDB credentials
4. Set environment variables for Unipile root URL and LinkedIn account ID
5. Test with a sample post before full use
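For reference, the trigger detection and duplicate check might reduce to a Code-node filter like this sketch. The node name, comment fields, and trigger word are hypothetical examples, not the template's actual names.

```javascript
// Sketch of the trigger-word filter and duplicate check (names assumed).
const triggerWord = 'GUIDE'; // hypothetical trigger word, normally set via the form

// IDs already logged in NocoDB; the node name here is hypothetical.
const alreadyContacted = new Set(
  $('Get Leads From NocoDB').all().map(i => i.json.linkedin_id)
);

return items.filter(item => {
  const { text, author_id } = item.json; // assumed comment fields
  const mentionsTrigger = (text || '').toLowerCase().includes(triggerWord.toLowerCase());
  // Skip commenters we have already messaged or invited.
  return mentionsTrigger && !alreadyContacted.has(author_id);
});
```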
by Growth AI
# Google Ads automated reporting to spreadsheets with Airtable

## Who's it for

Digital marketing agencies, PPC managers, and marketing teams who manage multiple Google Ads accounts and need automated monthly performance reporting organized by campaign types and conversion metrics.

## What it does

This workflow automatically retrieves Google Ads performance data from multiple client accounts and populates organized spreadsheets with campaign metrics. It differentiates between e-commerce (conversion value) and lead generation (conversion count) campaigns, then organizes data by advertising channel (Performance Max, Search, Display, etc.) with monthly tracking for budget and performance analysis.

## How it works

The workflow follows an automated data collection and reporting process:

1. **Account Retrieval**: Fetches client information from Airtable (project names, Google Ads IDs, campaign types)
2. **Active Filter**: Processes only accounts marked as "Actif" for budget reporting
3. **Campaign Classification**: Routes accounts through e-commerce or lead generation workflows based on "Typologie ADS"
4. **Google Ads Queries**: Executes different API calls depending on campaign type (conversion value vs. conversion count)
5. **Data Processing**: Organizes metrics by advertising channel (Performance Max, Search, Display, Video, Shopping, Demand Gen)
6. **Dynamic Spreadsheet Updates**: Automatically fills the correct monthly column in client spreadsheets
7. **Sequential Processing**: Handles multiple accounts with wait periods to avoid API rate limits

## Requirements

- Airtable account with client database
- Google Ads API access with developer token
- Google Sheets API access
- Client-specific spreadsheet templates (provided)

## How to set up

### Step 1: Prepare your reporting template

- Copy the Google Sheets reporting template
- Create individual copies for each client
- Ensure proper column structure (months B-M for January-December)
- Link template URLs in your Airtable database

### Step 2: Configure your Airtable database

Set up the following fields in your Airtable:

- **Project names**: Client project identifiers
- **ID GADS**: Google Ads customer IDs
- **Typologie ADS**: Campaign classification ("Ecommerce" or "Lead")
- **Status - Prévisionnel budgétaire**: Account status ("Actif" for active accounts)
- **Automation budget**: URLs to client-specific reporting spreadsheets

### Step 3: Set up API credentials

Configure the following authentication:

- **Airtable Personal Access Token**: For client database access
- **Google Ads OAuth2**: For advertising data retrieval
- **Google Sheets OAuth2**: For spreadsheet updates
- **Developer Token**: Required for Google Ads API access
- **Login Customer ID**: Manager account identifier

### Step 4: Configure Google Ads API settings

Update the HTTP request nodes with your credentials:

- **Developer Token**: Replace "[Your token]" with your actual developer token
- **Login Customer ID**: Replace "[Your customer id]" with your manager account ID
- **API Version**: Currently using v18 (update as needed)

### Step 5: Set up scheduling

- **Default schedule**: Runs on the 3rd of each month at 5 AM
- **Cron expression**: 0 5 3 * *
- **Recommended timing**: Early month execution for complete previous month data
- **Processing delay**: 1-minute waits between accounts to respect API limits

## How to customize the workflow

### Campaign type customization

**E-commerce campaigns:**

- Tracks: Cost and conversion value metrics
- Query: metrics.conversions_value for revenue tracking
- Use case: Online stores, retail businesses

**Lead generation campaigns:**

- Tracks: Cost and conversion count metrics
- Query: metrics.conversions for lead quantity
- Use case: Service businesses, B2B companies

A query sketch for both variants follows below.
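As a reference, the two query variants likely resemble the GAQL below, shown here as the strings an HTTP Request node would post to the Google Ads searchStream endpoint. The exact field list in the template may differ.

```javascript
// GAQL variants for the two campaign typologies (field list may differ from the template).
const baseFields = 'campaign.advertising_channel_type, metrics.cost_micros';

// E-commerce accounts: pull conversion value (revenue).
const ecommerceQuery = `
  SELECT ${baseFields}, metrics.conversions_value
  FROM campaign
  WHERE segments.date DURING LAST_MONTH`;

// Lead generation accounts: pull conversion count.
const leadQuery = `
  SELECT ${baseFields}, metrics.conversions
  FROM campaign
  WHERE segments.date DURING LAST_MONTH`;

// Posted to: https://googleads.googleapis.com/v18/customers/{customerId}/googleAds:searchStream
return [{ json: { ecommerceQuery, leadQuery } }];
```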
### Advertising channel expansion

Current channels tracked:

- **Performance Max**: Automated campaign type
- **Search**: Text ads on search results
- **Display**: Visual ads on partner sites
- **Video**: YouTube and video partner ads
- **Shopping**: Product listing ads
- **Demand Gen**: Audience-focused campaigns

Add new channels by modifying the data processing code nodes.

### Reporting period adjustment

- **Current setting**: Last month data (DURING LAST_MONTH)
- **Alternative periods**: Last 30 days, specific date ranges, quarterly reports
- **Custom timeframes**: Modify the Google Ads query date parameters

### Multi-account management

- **Sequential processing**: Handles multiple accounts automatically
- **Error handling**: Continues processing if individual accounts fail
- **Rate limiting**: Built-in waits prevent API quota issues
- **Batch size**: No limit on number of accounts processed

## Data organization features

### Dynamic monthly columns

- **Automatic detection**: Determines previous month column (B-M)
- **Column mapping**: January=B, February=C, ..., December=M
- **Data placement**: Updates correct month automatically
- **Multi-year support**: Handles year transitions seamlessly

### Campaign performance breakdown

Each account populates 10 rows of data:

- Performance Max Cost (Row 2)
- Performance Max Conversions/Value (Row 3)
- Demand Gen Cost (Row 4)
- Demand Gen Conversions/Value (Row 5)
- Search Cost (Row 6)
- Search Conversions/Value (Row 7)
- Video Cost (Row 8)
- Video Conversions/Value (Row 9)
- Shopping Cost (Row 10)
- Shopping Conversions/Value (Row 11)

### Data processing logic

- **Cost conversion**: Automatically converts micros to euros (÷1,000,000)
- **Precision rounding**: Rounds to 2 decimal places for clean presentation
- **Zero handling**: Shows 0 for campaign types with no activity
- **Data validation**: Handles missing or null values gracefully

A sketch of the column mapping and micros conversion appears at the end of this listing.

## Results interpretation

### Monthly performance tracking

- **Historical data**: Year-over-year comparison across all channels
- **Channel performance**: Identify best-performing advertising types
- **Budget allocation**: Data-driven decisions for campaign investments
- **Trend analysis**: Month-over-month growth or decline patterns

### Account-level insights

- **Multi-client view**: Consolidated reporting across all managed accounts
- **Campaign diversity**: Understanding which channels clients use most
- **Performance benchmarks**: Compare similar account types and industries
- **Resource allocation**: Focus on high-performing accounts and channels

## Use cases

### Agency reporting automation

- **Client dashboards**: Automated population of monthly performance reports
- **Budget planning**: Historical data for next month's budget recommendations
- **Performance reviews**: Ready-to-present data for client meetings
- **Trend identification**: Spot patterns across multiple client accounts

### Internal performance tracking

- **Team productivity**: Track account management efficiency
- **Campaign optimization**: Identify underperforming channels for improvement
- **Growth analysis**: Monitor client account growth and expansion
- **Forecasting**: Use historical data for future performance predictions

### Strategic planning

- **Budget allocation**: Data-driven distribution across advertising channels
- **Channel strategy**: Determine which campaign types to emphasize
- **Client retention**: Proactive identification of declining accounts
- **New business**: Performance data to support proposals and pitches

## Workflow limitations

- **Monthly execution**: Designed for monthly reporting (not real-time)
- **API dependencies**: Requires stable Google Ads and Sheets API access
- **Rate limiting**: Sequential processing prevents parallel account handling
- **Template dependency**: Requires specific spreadsheet structure for proper data placement
- **Previous month focus**: Optimized for completed month data (run early in the new month)
- **Manual credential setup**: Requires individual configuration of API tokens and customer IDs
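Here is a minimal sketch of the month-to-column mapping and micros-to-euros conversion described under Data processing logic, written in n8n Code-node style. The column letters follow the template's stated layout (January=B through December=M); the input field name cost_micros is the Google Ads API's, but how the template carries it may differ.

```javascript
// Sketch of the dynamic column mapping and micros-to-euros conversion.
// Column letters follow the template layout: January=B ... December=M.
const COLUMNS = ['B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M'];

// Previous month, handling the January -> December year transition.
const now = new Date();
const prevMonthIndex = (now.getMonth() + 11) % 12;
const targetColumn = COLUMNS[prevMonthIndex];

// Convert Google Ads cost_micros to euros, rounded to 2 decimal places.
const toEuros = (micros) => Math.round((Number(micros || 0) / 1_000_000) * 100) / 100;

return items.map(item => ({
  json: {
    ...item.json,
    targetColumn,
    costEuros: toEuros(item.json.cost_micros),
  },
}));
```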
by Grigory Frolov
# 📊 YouTube Personal Channel Videos → Google Sheets

Automatically sync your YouTube videos (title, description, tags, publish date, captions, etc.) into Google Sheets — perfect for creators and marketers who want a clean content database for analysis or reporting.

## 🚀 What this workflow does

- ✅ Connects to your personal YouTube channel via Google OAuth
- 🔁 Fetches all uploaded videos automatically (with pagination)
- 🏷 Extracts metadata: title, description, tags, privacy status, upload status, thumbnail, etc.
- 🧾 Retrieves captions (SRT format) if available
- 📈 Writes or updates data in your Google Sheets document
- ⚙️ Can be run manually or scheduled via Cron

## 🧩 Nodes used

- **Manual Trigger** — to start manually or connect with Cron
- **HTTP Request (YouTube API v3)** — fetches channel, uploads, and captions
- **Code Nodes** — manage pagination and collect IDs
- **SplitOut** — iterates through video lists
- **Google Sheets (appendOrUpdate)** — stores data neatly
- **If Conditions** — control data flow and prevent empty responses

## ⚙️ Setup guide

1. **Connect your Google Account**: Used for both the YouTube API and Google Sheets. Make sure the credentials are set up in the Google OAuth2 API and Google Sheets OAuth2 API nodes.
2. **Create a Google Sheet**: Add a tab named Videos with these columns: youtube_id | title | description | tags | privacyStatus | uploadStatus | thumbnail | captions. You can also include categoryId, maxres, or published if you’d like.
3. **Replace the sample Sheet ID**: In each Google Sheets node, open the “Spreadsheet” field and choose your own document. Make sure the sheet name matches the tab name (Videos).
4. **Run the workflow**: Execute it manually first to pull your latest uploads. Optionally add a Cron Trigger node for daily sync (e.g., once per day).
5. **Check your Sheet**: Your data should appear instantly, with each video’s metadata and captions (if available).

## 🧠 Notes & tips

- ⚙️ The flow loops through all pages of your upload playlist automatically — no manual pagination needed (see the sketch at the end of this listing).
- 🕒 The workflow uses YouTube’s “contentDetails.relatedPlaylists.uploads” to ensure you only fetch your own uploads.
- 💡 Captions fetch may fail for private videos — use “Continue on Fail” if you want the rest to continue.
- 🧮 Ideal for dashboards, reporting sheets, SEO analysis, or automation triggers.
- 💾 To improve speed, you can disable the “Captions” branch if you only need metadata.

## 👥 Ideal for

- 🎬 YouTube creators maintaining a video database
- 📊 Marketing teams tracking SEO performance
- 🧠 Digital professionals building analytics dashboards
- ⚙️ Automation experts using YouTube data in other workflows

## 💛 Credits

Created by Grigory Frolov
YouTube: @gregfrolovpersonal
More workflows and guides → ozwebexpert.com/n8n
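For orientation, the pagination loop spread across the HTTP Request and Code nodes condenses to something like this sketch. The uploadsPlaylistId and accessToken field names are assumptions; the playlistItems endpoint and pageToken mechanics are YouTube API v3 as documented.

```javascript
// Sketch of the pagination loop (condensed into one Code-node snippet here).
// Assumes the uploads playlist ID and an OAuth2 access token arrive as fields.
const { uploadsPlaylistId, accessToken } = $input.first().json; // assumed field names
const allVideoIds = [];
let pageToken = '';

do {
  const res = await this.helpers.httpRequest({
    url: 'https://www.googleapis.com/youtube/v3/playlistItems',
    qs: {
      part: 'contentDetails',
      playlistId: uploadsPlaylistId, // from contentDetails.relatedPlaylists.uploads
      maxResults: 50,
      pageToken,
    },
    headers: { Authorization: `Bearer ${accessToken}` },
    json: true,
  });
  allVideoIds.push(...res.items.map(i => i.contentDetails.videoId));
  pageToken = res.nextPageToken || ''; // empty when the last page is reached
} while (pageToken);

return allVideoIds.map(id => ({ json: { videoId: id } }));
```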
by Rajeet Nair
# Autonomous PostgreSQL Data Quality Monitoring & Remediation

## Overview

This workflow automatically monitors PostgreSQL database data quality and detects structural or statistical anomalies before they impact analytics, pipelines, or applications. Running every 6 hours, it scans database metadata, table statistics, and historical baselines to identify:

- Schema drift
- Null value explosions
- Abnormal data distributions

Detected issues are evaluated using a confidence scoring system that considers severity, frequency, and affected data volume. When issues exceed the defined threshold, the workflow generates SQL remediation suggestions, logs the issue to an audit table, and sends alerts to Slack.

This automation enables teams to proactively maintain database reliability, detect unexpected schema changes, and quickly respond to data quality problems.

## How It Works

### 1. Scheduled Monitoring

A Schedule Trigger starts the workflow every 6 hours to run automated database quality checks.

### 2. Metadata & Statistics Collection

The workflow retrieves important metadata from PostgreSQL:

- **Schema metadata** from information_schema.columns
- **Table statistics** from pg_stat_user_tables
- **Historical baselines** from a baseline tracking table

These datasets allow the workflow to compare current database conditions against historical norms.

### 3. Data Quality Detection Engine

Three parallel detection checks analyze the database:

**Schema Drift Detection**

- Identifies new tables or columns
- Detects removed columns or tables
- Detects datatype or nullability changes

**Null Explosion Detection**

- Calculates null percentage per column
- Flags columns exceeding configured null thresholds (a query sketch follows after the setup steps)

**Outlier Distribution Detection**

- Compares current column statistics against historical baselines
- Uses statistical deviation (z-score) to detect abnormal distributions

### 4. Issue Aggregation & Confidence Scoring

All detected issues are aggregated and evaluated using a confidence scoring system based on:

- Severity of the issue
- Data volume affected
- Historical frequency
- Consistency of detection

Only issues above the configured confidence threshold proceed to remediation (a scoring sketch appears at the end of this listing).

### 5. SQL Remediation Suggestions

For high-confidence issues, the workflow automatically generates SQL investigation or remediation queries, such as:

- ALTER TABLE fixes
- NULL cleanup queries
- Outlier review queries

### 6. Logging & Alerting

Confirmed issues are:

- Stored in a PostgreSQL audit table
- Sent as alerts to Slack

### 7. Baseline Updates

Finally, the workflow updates the data quality baseline table, improving anomaly detection accuracy in future runs.

## Setup Instructions

1. Configure a PostgreSQL credential in n8n.
2. Replace <target schema name> in the SQL queries with your database schema.
3. Create the following tables in PostgreSQL:
   - **Audit table** (data_quality_audit): stores detected data quality issues and remediation suggestions.
   - **Baseline table** (data_quality_baselines): stores historical statistics used for anomaly detection.
4. Configure your Slack credential.
5. Replace the placeholder Slack channel ID in the Send Alert to Team node.
6. Optional configuration parameters can be modified in the Workflow Configuration node:
   - confidenceThreshold
   - maxNullPercentage
   - outlierStdDevThreshold
   - auditTableName
   - baselineTableName
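To make the null-explosion check concrete, here is one way such a detection might be built from an n8n Code node. The template's actual SQL may differ; the query below leans on the standard pg_stats catalog view, and the 20% threshold mirrors a hypothetical maxNullPercentage setting.

```javascript
// Sketch of the null-explosion check (the template's actual SQL may differ).
// pg_stats.null_frac is the sampled null fraction per column, refreshed by ANALYZE.
const maxNullPercentage = 0.20; // mirrors the configurable maxNullPercentage

// Query run by a Postgres node; '<target schema name>' as in the setup steps.
const detectionSql = `
  SELECT schemaname, tablename, attname, null_frac
  FROM pg_stats
  WHERE schemaname = '<target schema name>'
    AND null_frac > ${maxNullPercentage}
  ORDER BY null_frac DESC;`;

// A downstream Code node then turns each returned row into one issue item.
return items.map(item => ({
  json: {
    type: 'null_explosion',
    table: `${item.json.schemaname}.${item.json.tablename}`,
    column: item.json.attname,
    nullFraction: item.json.null_frac,
    detectionSql,
  },
}));
```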
## Use Cases

### Database Reliability Monitoring

Detect unexpected schema changes or structural modifications in production databases.

### Data Pipeline Validation

Identify anomalies in datasets used by ETL pipelines before they propagate errors downstream.

### Analytics Data Quality Monitoring

Prevent reporting inaccuracies caused by missing data or abnormal values.

### Production Database Observability

Provide automated alerts when critical database quality issues occur.

### Data Governance & Compliance

Maintain a historical audit log of database quality issues and remediation actions.

## Requirements

This workflow requires the following services:

- **PostgreSQL Database**
- **Slack Workspace**
- **n8n**

Nodes used:

- Schedule Trigger
- Set
- Postgres
- Code (Python)
- Aggregate
- IF
- Slack

## Key Features

- Automated database health monitoring
- **Schema drift detection**
- **Null explosion detection**
- **Statistical anomaly detection**
- Confidence-based issue filtering
- Automated SQL remediation suggestions
- Slack alerting
- Historical baseline learning system

## Summary

This workflow provides an automated data quality monitoring system for PostgreSQL. It continuously analyzes schema structure, column statistics, and historical baselines to detect anomalies, generate remediation suggestions, and notify teams in real time. By automating database quality checks, teams can identify issues early, reduce debugging time, and maintain reliable data pipelines.
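As a closing illustration, the confidence score might combine the four factors listed under Issue Aggregation & Confidence Scoring along these lines. The weights and scales here are invented for the sketch, not taken from the template.

```javascript
// Hypothetical confidence scoring over the four factors named above.
// Weights and saturation points are illustrative only.
const confidenceThreshold = 0.7; // mirrors the configurable confidenceThreshold

function scoreIssue({ severity, rowsAffected, priorOccurrences, detectorAgreement }) {
  const severityScore = { low: 0.2, medium: 0.5, high: 0.9 }[severity] ?? 0.2;
  const volumeScore = Math.min(rowsAffected / 100000, 1);  // saturate at 100k rows
  const frequencyScore = Math.min(priorOccurrences / 5, 1); // repeated issues score higher
  const consistencyScore = detectorAgreement;               // 0..1, share of checks that agree
  return 0.4 * severityScore + 0.2 * volumeScore + 0.2 * frequencyScore + 0.2 * consistencyScore;
}

return items
  .map(item => ({ json: { ...item.json, confidence: scoreIssue(item.json) } }))
  .filter(item => item.json.confidence >= confidenceThreshold);
```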
by Stephan Koning
# Real-Time ClickUp Time Tracking to HubSpot Project Sync

This workflow automates the synchronization of time tracked on ClickUp tasks directly to a custom project object in HubSpot, ensuring your project metrics are always accurate and up-to-date.

## Use Case & Problem

This workflow is designed for teams that use a custom object in HubSpot for high-level project overviews (tracking scoped vs. actual hours per sprint) but manage daily tasks and time logging in ClickUp. The primary challenge is the constant, manual effort required to transfer tracked hours from ClickUp to HubSpot, a process that is both time-consuming and prone to errors. This automation eliminates that manual work entirely.

## How It Works

- **Triggers on Time Entry:** The workflow instantly starts whenever a user updates the time tracked on any task in a specified ClickUp space. ⏱️
- **Fetches Task & Time Details:** It immediately retrieves all relevant data about the task (like its name and custom fields) and the specific time entry that was just updated.
- **Identifies the Project & Sprint:** The workflow processes the task data to determine which HubSpot project it belongs to and categorizes the work into the correct sprint (e.g., Sprint 1, Sprint 2, Additional Requests).
- **Updates HubSpot in Real-Time:** It finds the corresponding project record in HubSpot and updates the master actual_hours_tracked property. It then updates the specific field for the corresponding sprint (e.g., actual_sprint_1_hours), ensuring your reporting remains granular and accurate. A sketch of this update call follows below.

## Requirements

- ✅ ClickUp Account with the following custom fields on your tasks:
  - A Dropdown custom field named Sprint to categorize tasks.
  - A Short Text custom field named HubSpot Deal ID or similar to link to the HubSpot record.
- ✅ HubSpot Account with:
  - A Custom Object used for project tracking.
  - **Custom Properties** on that object to store total and sprint-specific hours (e.g., actual_hours_tracked, actual_sprint_1_hours, total_time_remaining, etc.).

> Note: Since this workflow interacts with a custom HubSpot object, it uses flexible HTTP Request nodes instead of the standard n8n HubSpot nodes.

## Setup Instructions

1. **Configure Credentials:** Add your ClickUp (OAuth2) and HubSpot (Header Auth with a Private App Token) credentials to the respective nodes in the workflow.
2. **Set ClickUp Trigger:** In the Time Tracked Update Trigger node, select your ClickUp team and the specific space you want to monitor for time updates.
3. **Update HubSpot Object ID:** Find the objectTypeId of your custom project object in HubSpot. In the HubSpot HTTP Request nodes (e.g., OnProjectFolder), replace the placeholder objectTypeId in the URL with your own.

## How to Customize

- Adjust the **Extract Sprint & Task Data** Code node to change how sprint names are mapped or how time is calculated.
- Update the URLs in the HubSpot HTTP Request nodes if your custom object or property names differ.
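For orientation, the sprint-to-property mapping and the HubSpot update likely reduce to something like this sketch. The property names come from the examples above; the sprint labels, input field names, and token handling are assumptions. The endpoint is HubSpot's documented CRM v3 objects API.

```javascript
// Sketch of the sprint mapping and HubSpot custom-object update.
// Property names follow the examples above; sprint labels are assumed.
const sprintPropertyMap = {
  'Sprint 1': 'actual_sprint_1_hours',
  'Sprint 2': 'actual_sprint_2_hours',
  'Additional Requests': 'actual_additional_hours', // hypothetical property
};

// Assumed input fields produced by the earlier extraction step.
const { sprint, hoursTracked, totalHours, hubspotRecordId, objectTypeId } = $input.first().json;

// PATCH the record via HubSpot's CRM v3 objects endpoint.
await this.helpers.httpRequest({
  method: 'PATCH',
  url: `https://api.hubapi.com/crm/v3/objects/${objectTypeId}/${hubspotRecordId}`,
  headers: { Authorization: `Bearer ${$env.HUBSPOT_PRIVATE_APP_TOKEN}` }, // assumed env var
  body: {
    properties: {
      actual_hours_tracked: totalHours,
      [sprintPropertyMap[sprint] ?? 'actual_additional_hours']: hoursTracked,
    },
  },
  json: true,
});

return [{ json: { updated: true, sprint } }];
```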
by Fahmi Fahreza
Sign up for Decodo HERE for Discount

Automatically scrape, structure, and log forum or news content using Decodo and Google Gemini AI. This workflow extracts key details like titles, URLs, authors, and engagement stats, then appends them to a Google Sheet for tracking and analysis.

## Who’s it for?

Ideal for data journalists, market researchers, or AI enthusiasts who want to monitor trending topics across specific domains.

## How it works

1. **Trigger**: Workflow runs on schedule.
2. **Data Setup**: Defines forum URLs and geolocation.
3. **Scraping**: Extracts raw text data using the Decodo API.
4. **AI Extraction**: Gemini parses and structures the scraped text into clean JSON (see the sketch below).
5. **Data Storage**: Each news item is appended or updated in Google Sheets.
6. **Logging**: Records scraping results for monitoring and debugging.

## How to set up

1. Add your Decodo, Google Gemini, and Google Sheets credentials in n8n.
2. Adjust the forum URLs, geolocation, and Google Sheet ID in the Workflow Config node.
3. Set your preferred trigger interval in Schedule Trigger.
4. Activate and monitor from the n8n dashboard.
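The JSON that Gemini is asked to produce likely resembles the shape sketched in the comments below; the field names are inferred from the description (titles, URLs, authors, engagement stats) rather than taken from the template, and the output field name is assumed.

```javascript
// Sketch: validate Gemini's structured output before the Sheets append.
// Expected shape (field names inferred, not copied from the template):
//   { items: [{ title, url, author, engagement: { replies, views } }] }
return items.flatMap(item => {
  const raw = item.json.output; // 'output' field name assumed
  const parsed = typeof raw === 'string' ? JSON.parse(raw) : raw; // Gemini may return a string
  return (parsed.items || [])
    .filter(n => n.title && n.url) // drop incomplete extractions
    .map(n => ({ json: n }));
});
```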
by Gilbert Onyebuchi
This workflow automates the full invoicing and payment process using n8n and Xero. It allows businesses to generate invoices, track payments, send WhatsApp notifications, and keep records synced automatically, without manual follow-ups or repetitive admin work. It’s designed to plug into your existing tools and scale as your operations grow.

## How It Works

1. A webhook receives invoice or payment data from your app, form, or system (a sample payload sketch follows below)
2. Xero automatically creates or updates the invoice
3. Payments are tracked and verified in real time
4. Clients receive WhatsApp notifications for invoices, reminders, or payments
5. All records are logged in a database and synced to Google Calendar and Google Sheets
6. Automated responses confirm successful actions or errors

Everything runs in the background once connected.

## Setup

1. Connect your Xero account to n8n
2. Set up a database (PostgreSQL via Supabase) for logging invoices and payments
3. Connect Google Calendar for scheduling and tracking
4. Connect Twilio WhatsApp for client notifications
5. Point your system or payment source to the provided webhook URL

No complex coding required. I guide you through the setup and ensure everything is tested.

## Need Help or Customization?

If you’d like this workflow customized for your business or want help setting it up properly, feel free to reach out.

🔗 Connect with me on LinkedIn: 👉 Click here to connect

I’m happy to walk you through it or adapt it to your specific use case.
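As a rough illustration, the webhook might receive a payload like the one below. Every field name here is a hypothetical example, since the template does not publish its schema; downstream nodes would map such fields onto Xero's invoice structure and the WhatsApp notification.

```javascript
// Hypothetical webhook payload for an invoice event (all field names assumed).
const samplePayload = {
  event: 'invoice.created',
  client: { name: 'Acme Ltd', phone: '+15551234567' }, // phone used for WhatsApp notices
  invoice: {
    reference: 'INV-1042',
    currency: 'USD',
    dueDate: '2025-07-31',
    lineItems: [{ description: 'Consulting', quantity: 10, unitAmount: 120 }],
  },
};

// Downstream nodes map this onto Xero invoice fields and log the record.
return [{ json: samplePayload }];
```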
by Rahul Joshi
## Description

Turn incoming Gmail messages into Zendesk tickets and keep a synchronized log in Google Sheets. Uses Gmail as the trigger, creates Zendesk tickets, and appends or updates a central sheet for tracking. Gain a clean, auditable pipeline from inbox to support queue. ✨

## What This Template Does

- Fetches new emails via Gmail Trigger. ✉️
- Normalizes Gmail payload for consistent fields. 🧹
- Creates a Zendesk ticket from the email content. 🎫
- Formats data for Sheets and appends or updates a row. 📊
- Executes helper sub-workflows and writes logs for traceability. 🔁🧾

## Key Benefits

- Converts emails to actionable support tickets automatically. ⚡
- Maintains a single source of truth in Google Sheets. 📒
- Reduces manual triage and data entry. 🕒
- Improves accountability with structured logs. ✅

## Features

- Gmail Trigger for real-time intake. ⏱️
- Normalize Gmail Data for consistent fields (see the sketch below). 🧩
- Create Zendesk Ticket (create: ticket). 🎟️
- Format Sheet Data for clean columns. 🧱
- Log to Google Sheets with appendOrUpdate. 🔄
- Execute workflow (sub-workflow) steps for modularity. 🧩

## Requirements

- n8n instance (cloud or self-hosted). 🛠️
- Gmail credentials configured in n8n (with read access to the monitored inbox). ✉️
- Zendesk credentials (API token or OAuth) with permission to create tickets. 🔐
- Google Sheets credentials with access to the target spreadsheet for append/update. 📊
- Access to any sub-workflows referenced by the Execute workflow nodes. 🔁

## Target Audience

- IT support and helpdesk teams managing email-based requests. 🖥️
- Ops teams needing auditable intake logs. 🧾
- Agencies and service providers converting client emails to tickets. 🤝
- Small teams standardizing email-to-ticket flows. 🧑💼

## Step-by-Step Setup Instructions

1. Connect Gmail, Zendesk, and Google Sheets in n8n Credentials. 🔑
2. Set the Gmail Trigger to watch the desired label/inbox. 📨
3. Map Zendesk fields (description) from normalized Gmail data. 🧭
4. Point the Google Sheets node to your spreadsheet and confirm appendOrUpdate mode. 📄
5. Assign credentials to all nodes, including any Execute workflow steps. 🔁
6. Run once to test end-to-end; then activate the workflow. ✅
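For reference, the normalization step likely resembles this Code-node sketch. The exact field names emitted by the Gmail Trigger depend on its simplify setting, so the accessors below are hedged with fallbacks rather than copied from the template.

```javascript
// Sketch of the Gmail normalization step (exact field names in the template may differ).
return items.map(item => {
  const g = item.json;
  return {
    json: {
      messageId: g.id,
      from: g.from ?? g.headers?.from,          // sender address
      subject: g.subject ?? g.headers?.subject, // becomes the Zendesk ticket subject
      body: g.text ?? g.snippet ?? '',          // becomes the ticket description
      receivedAt: g.date,
    },
  };
});
```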
by Reinhard Schmidbauer
## Overview

This template automatically exports Meta (Facebook) Ads campaign performance into Google Sheets — both daily and for historical backfills. It’s ideal for performance marketers, agencies, and analytics teams who want a reliable data pipeline from Meta Ads into their reporting stack.

## What this workflow does

- Runs a daily cron job to pull yesterday’s campaign-level performance from the Meta Ads Insights API.
- Flattens the API response and calculates key KPIs like CPL, CPA, ROAS, CTR, CPC, CPM, frequency and more.
- Appends one row per campaign per day to a Google Sheet (for dashboards and further analysis).
- Provides a separate Manual Backfill section to import historical data using a time_range parameter (e.g. last 12–24 months).

## Use cases

- Build Looker Studio / Power BI dashboards on top of a clean, daily Meta Ads dataset.
- Track ROAS, CPL, CPA, CTR, and frequency trends over time.
- Combine campaign data with CRM or ecommerce data in the same spreadsheet.
- Quickly backfill past performance when onboarding a new Meta Ads account.

## How it works

### Daily Incremental Flow

1. A Schedule Trigger runs every day at 05:00.
2. The Set config node defines ad account, date preset (yesterday), and Google Sheet details.
3. The Meta Insights node calls the Facebook Graph insights edge at level=campaign.
4. The Code node flattens the data and derives CPL, CPA, ROAS, and other KPIs (a sketch follows at the end of this listing).
5. The Google Sheets node appends the rows to your Meta_Daily_Data sheet.

### Manual Backfill Flow

1. A Manual Trigger lets you run the flow on demand.
2. The Set backfill config node defines backfillSince and backfillUntil.
3. The Meta Insights (time_range) node fetches performance for that historical range.
4. The same transform logic is applied, and rows are appended to the same sheet.

## Prerequisites

- A Meta Business account with a system user and a long-lived access token with ads_read / read_insights.
- A Google Sheet with a header row that matches the mapped column names.
- n8n credentials for:
  - Facebook Graph API
  - Google Sheets OAuth2

## Setup steps

1. Import this template into your n8n instance.
2. Open the Set config and Set backfill config nodes:
   - Set your adAccountId (e.g. act_123456789012345).
   - Set your sheetId (Google Sheet ID) and sheet name (e.g. Meta_Daily_Data).
3. Configure your Facebook Graph API and Google Sheets credentials in n8n.
4. (Optional) Run the Manual Backfill section for your desired historical ranges (e.g. per quarter).
5. Enable the workflow so the Daily Incremental section runs automatically.

## Customization

- Change level from campaign to adset or ad if you need more granular reporting.
- Add breakdowns (e.g. publisher_platform, platform_position) to split by platform and placement.
- Extend the transform code with additional KPIs or dimensions that match your reporting needs.
- Use a separate sheet for raw data and build dashboards on top of a cleaned or pivoted view.

## Consulting & support

If you need help with:

- **E-Commerce Strategy & Development** (Shopify, Shopware 6, Magento 2, SAP Commerce Cloud, etc.)
- **Growth & Performance Marketing** (Google / Meta / Microsoft Ads, etc.)
- **Data & Analytics Setups** (tracking, dashboards, attribution, GDPR, etc.)

please reach out to Serendipity Technologies:
👉 https://www.serendipity.at

We can help you turn this workflow into a full analytics stack and reporting system tailored to your business.
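A condensed sketch of the KPI derivation the Code node performs is shown below. The insights field names (spend, actions, action_values, etc.) are the Meta Insights API's documented ones; the action_type values used for leads and purchases ('lead', 'purchase') are common defaults and may differ per account.

```javascript
// Sketch of the KPI derivation from one Meta Insights row
// (action_type names 'lead' and 'purchase' may differ per account).
const getAction = (list, type) =>
  Number((list || []).find(a => a.action_type === type)?.value || 0);

return items.map(item => {
  const r = item.json;
  const spend = Number(r.spend || 0);
  const leads = getAction(r.actions, 'lead');
  const purchases = getAction(r.actions, 'purchase');
  const revenue = getAction(r.action_values, 'purchase');
  return {
    json: {
      date: r.date_start,
      campaign: r.campaign_name,
      spend,
      impressions: Number(r.impressions || 0),
      clicks: Number(r.clicks || 0),
      cpl: leads ? spend / leads : 0,
      cpa: purchases ? spend / purchases : 0,
      roas: spend ? revenue / spend : 0,
      ctr: Number(r.ctr || 0),
      cpc: Number(r.cpc || 0),
      cpm: Number(r.cpm || 0),
      frequency: Number(r.frequency || 0),
    },
  };
});
```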
by Hiroshi Hashimoto
# AI Palm Health Tracker – Overview

This workflow receives palm images sent via LINE and provides AI-generated health insights.

Step-by-step process:

1. User sends a palm image via LINE
2. Webhook receives the image
3. Image is saved to Google Drive
4. Past records are checked in Google Sheets

If this is the first submission:
→ AI will perform a palm reading

If previous records exist:
→ Retrieve the latest saved image
→ Validate that two images are available
→ AI performs a comparison analysis

All results are saved in Google Sheets and sent back to the user via LINE. A sketch of the branch between first submissions and follow-ups appears below.
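To illustrate that branch, a Code-node check over the Google Sheets lookup might look like this; the column name image_url is an assumption, not taken from the template.

```javascript
// Sketch of the first-submission vs. comparison branch (field names assumed).
// Input items: rows returned by the Google Sheets lookup for this LINE user.
const priorImages = $input.all()
  .map(i => i.json)
  .filter(r => r.image_url); // hypothetical column holding the Drive image link

if (priorImages.length === 0) {
  // First submission: AI performs a single-image palm reading.
  return [{ json: { mode: 'palm_reading' } }];
}

// Previous records exist: pass the latest image along for a comparison analysis.
const latest = priorImages[priorImages.length - 1];
return [{ json: { mode: 'comparison', previousImageUrl: latest.image_url } }];
```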