by Evoort Solutions
# Job Search Automation with Job Search Global API & Google Sheet Logging

**Description:** Automate your job search process by querying the Job Search Global API via RapidAPI every 6 hours for a specified keyword like "Web Developer." This workflow extracts job listings and saves them directly to Google Sheets, with alerts sent for any API failures.

## Workflow Overview

- **Schedule Trigger**: Runs the workflow automatically every 6 hours to ensure up-to-date job listings.
- **Set Search Term**: Defines the dynamic job keyword, e.g., "Web Developer," used in API requests.
- **Fetch Job Listings**: Sends a POST request to the Job Search Global API (via RapidAPI) to retrieve job listings with pagination.
- **Check API Response**: Validates the API response status, branching the workflow on success or failure.
- **Extract Job Data**: Parses the job listings array from the API response for processing.
- **Save to Google Sheet**: Appends or updates job listings in Google Sheets, avoiding duplicates by matching job titles.
- **Send Failure Notification Email**: Sends an alert email if the API response fails or returns an error.

## How to Obtain Your RapidAPI Key (Quick Steps)

1. Go to the Job Search Global API page on RapidAPI.
2. Sign up or log in to your RapidAPI account.
3. Subscribe to the API plan that suits your needs.
4. Copy your unique X-RapidAPI-Key from the dashboard.
5. Insert this key into your workflow's HTTP Request node headers.

## How to Configure Google Sheets

1. Create a new Google Sheet for job listings.
2. Share the sheet with your Google Service Account email to enable API access.
3. Use the sheet URL in the Google Sheets node within your workflow.
4. Map columns correctly based on the job data fields.

## Google Sheet Columns Used

| Column Name | Description |
| ----------- | ----------------------------------- |
| title | Job title |
| url | Job posting URL |
| company | Company name |
| postDate | Date job was posted |
| jobSource | Source of the job listing |
| slug | Unique job identifier or slug |
| sentiment | Sentiment analysis score (if any) |
| dateAdded | Date the job was added to the sheet |
| tags | Associated tags or keywords |
| viewCount | Number of views for the job post |

## Use Cases & Benefits

- **Automated Job Tracking:** Get fresh job listings without manual searching by automatically querying the Job Search Global API multiple times per day.
- **Centralized Job Data:** Save and update listings in Google Sheets for easy filtering, sharing, and tracking.
- **Failure Alerts:** Get notified immediately if API calls fail, helping maintain workflow reliability.
- **Customizable Search:** Change keywords anytime to tailor job searches for different roles or industries.

## Who Is This Workflow For?

- **Recruiters** looking to monitor job market trends in real time.
- **Job Seekers** who want to automate job discovery for specific roles like "Web Developer."
- **HR Teams** managing talent pipelines and job postings.
- **Data Analysts** needing structured job market data for research or reporting.

Create your free n8n account and set up the workflow in just a few minutes using the link below:

👉 Start Automating with n8n

Save time, stay consistent, and never miss a fresh job listing!
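For reference, the Fetch Job Listings node performs a standard RapidAPI call. A minimal sketch of that request outside n8n is shown below; the endpoint path, host header, body fields, and the `jobs` response field are assumptions for illustration, so copy the exact values from the Job Search Global API page on RapidAPI.

```javascript
// Minimal sketch of the RapidAPI request the HTTP Request node makes.
// Endpoint path, host, body shape and response fields are assumptions;
// take the exact values from the API's RapidAPI documentation.
const response = await fetch("https://job-search-global.p.rapidapi.com/search", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "X-RapidAPI-Key": process.env.RAPIDAPI_KEY,           // key from your RapidAPI dashboard
    "X-RapidAPI-Host": "job-search-global.p.rapidapi.com" // assumed host value
  },
  body: JSON.stringify({ keyword: "Web Developer", page: 1 }) // assumed body shape
});
const data = await response.json();
console.log(data.jobs?.length ?? 0, "listings returned");     // response field name is an assumption
```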
by Vigh Sandor
## Setup Instructions

### Overview
This n8n workflow monitors your Proxmox VE server and sends automated reports to Telegram every 15 minutes. It tracks VM status, host resource usage, temperature sensors, and detects recently stopped VMs.

### Prerequisites

**Required Software**
- n8n instance (self-hosted or cloud)
- Proxmox VE server with API access
- Telegram account with a bot created via BotFather
- lm-sensors package installed on the Proxmox host

**Required Access**
- Proxmox admin credentials (username and password)
- SSH access to the Proxmox server
- Telegram Bot API token
- Telegram Chat ID

### Installation Steps

#### Step 1: Install Temperature Sensors on Proxmox
SSH into your Proxmox server and run:

```
apt-get update
apt-get install -y lm-sensors
sensors-detect
```

Press ENTER to accept the default answers during sensors-detect setup. Test that sensors work:

```
sensors | grep -E 'Package|Core'
```

#### Step 2: Create Telegram Bot
1. Open Telegram and search for BotFather.
2. Send the /newbot command.
3. Follow the prompts to create your bot.
4. Save the API token provided.
5. Get your Chat ID by sending a message to your bot, then visiting:
   https://api.telegram.org/bot<YOUR_TOKEN>/getUpdates
6. Look for "chat":{"id": YOUR_CHAT_ID in the response.

#### Step 3: Configure n8n Credentials

**SSH Password Credential**
1. In n8n, go to the Credentials menu.
2. Create a new credential: SSH Password.
3. Enter:
   - Host: your Proxmox IP address
   - Port: 22
   - Username: root (or your admin user)
   - Password: your Proxmox password

**Telegram API Credential**
1. Create a new credential: Telegram API.
2. Enter the Bot Token from BotFather.

#### Step 4: Import and Configure Workflow
1. Import the JSON workflow into n8n.
2. Open the "Set Variables" node and update the following values:
   - PROXMOX_IP: your Proxmox server IP address
   - PROXMOX_PORT: API port (default: 8006)
   - PROXMOX_NODE: node name (default: pve)
   - TELEGRAM_CHAT_ID: your Telegram chat ID
   - PROXMOX_USER: Proxmox username with realm (e.g., root@pam)
   - PROXMOX_PASSWORD: Proxmox password
3. Connect credentials:
   - SSH - Get Sensors node: select your SSH credential
   - Send Telegram Report node: select your Telegram credential
4. Save the workflow.
5. Activate the workflow.

### Configuration Options

**Adjust Monitoring Interval**
Edit the "Schedule Every 15min" node:
- Change the minutesInterval value to the desired interval (in minutes)
- Recommended: 5-30 minutes

**Adjust Recently Stopped VM Detection Window**
Edit the "Process Data" node:
- Find the line: `const fifteenMinutesAgo = now - 900;`
- Change 900 to the desired number of seconds (900 = 15 minutes)
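The detection window above comes down to a single timestamp comparison inside the Process Data node. A simplified sketch is shown below, assuming the task entries expose `type`, `starttime`, and `id` as in the Proxmox task-log API; adapt the field names to what your `/tasks` endpoint actually returns.

```javascript
// Simplified sketch of the "recently stopped VM" check in the Process Data node.
// 'tasks' stands for the parsed array returned by /api2/json/nodes/{node}/tasks.
const now = Math.floor(Date.now() / 1000);
const fifteenMinutesAgo = now - 900; // change 900 to widen or narrow the window

const recentlyStopped = tasks
  .filter(t => t.type === 'qmstop' || t.type === 'qmshutdown')
  .filter(t => t.starttime >= fifteenMinutesAgo)
  .map(t => ({
    vmid: t.id,                                       // VM ID taken from the task entry
    minutesAgo: Math.round((now - t.starttime) / 60), // elapsed time since the stop
  }));
```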
**Modify Temperature Warning Threshold**
The workflow uses the "high" threshold defined by sensors. To set a threshold manually, edit the "Process Data" node:
- Modify the temperature parsing logic
- Change the comparison `if (current >= high)` to use a custom value

### Testing

**Test Individual Components**
1. Execute the "Set Variables" node manually and verify the output.
2. Execute the "Proxmox Login" node and check for a valid ticket.
3. Execute "API - VM List" and confirm VM data is received.
4. Execute the complete workflow and check Telegram for the message.

### Troubleshooting

**Login fails:**
- Verify the PROXMOX_USER format includes the realm (e.g., root@pam)
- Check the password is correct
- Ensure allowUnauthorizedCerts is enabled for self-signed certificates

**No temperature data:**
- Verify lm-sensors is installed on Proxmox
- Run the sensors command manually via SSH
- Check the SSH credentials are correct

**Recently stopped VMs not detected:**
- Check that the task log API endpoint returns data
- Verify the VM was stopped within the detection window
- Ensure task types qmstop or qmshutdown are logged

**Telegram not receiving messages:**
- Verify the bot token is correct
- Confirm the chat ID is accurate
- Check the bot was started (send /start to the bot)
- Verify parse_mode is set to HTML in the Telegram node

## How It Works

### Workflow Architecture
The workflow executes in a sequential chain of nodes that gather data from multiple sources, process it, and deliver a formatted report.

**Execution Flow**
1. Schedule Trigger (15min)
2. Set Variables
3. Proxmox Login (get authentication ticket)
4. Prepare Auth (prepare credentials for API calls)
5. API - VM List (get all VMs and their status)
6. API - Node Tasks (get recent task log)
7. API - Node Status (get host CPU, memory, uptime)
8. SSH - Get Sensors (get temperature data)
9. Process Data (analyze and structure all data)
10. Generate Formatted Message (create Telegram message)
11. Send Telegram Report (deliver via Telegram)

### Data Collection

**VM Information (Proxmox API)**
- Endpoint: /api2/json/nodes/{node}/qemu
- Retrieves: total VM count, running VM count, stopped VM count, VM names and IDs

**Task Log (Proxmox API)**
- Endpoint: /api2/json/nodes/{node}/tasks?limit=100
- Retrieves recent tasks to detect: qmstop operations (VM stop commands), qmshutdown operations (VM shutdown commands), task timestamps, task status

**Host Status (Proxmox API)**
- Endpoint: /api2/json/nodes/{node}/status
- Retrieves: CPU usage percentage, memory total and used (in GB), system uptime (in seconds)

**Temperature Data (SSH)**
- Command: `sensors | grep -E 'Package|Core'`
- Retrieves: CPU package temperature, individual core temperatures, high and critical thresholds

### Data Processing

**VM Status Analysis**
- Counts total, running, and stopped VMs
- Queries the task log for stop/shutdown operations
- Filters tasks within the 15-minute window
- Extracts the VM ID from the task UPID string
- Matches the VM ID to the VM name from the VM list
- Calculates time elapsed since the stop operation

**Temperature Intelligence**
The workflow implements smart temperature reporting:

*Normal Operation* (all temps below the high threshold):
- Calculates the average temperature across all cores
- Displays min, max, and average values
- Example: "Average: 47.5 C (Min: 44.0 C, Max: 52.0 C)"

*Warning State* (any temp at or above the high threshold):
- Displays all temperature readings in detail
- Shows full sensor output with thresholds
- Changes the section title to "Temperature Warning"
- Adds a fire emoji indicator

**Resource Calculation**
- CPU Usage: the API returns a decimal (0.0 to 1.0), converted to a percentage: `cpu * 100`
- Memory: the API returns bytes, converted to GB: `bytes / (1024^3)`; percentage: `(used / total) * 100`
- Uptime: the API returns seconds, converted to days and hours: `days = seconds / 86400`, `hours = (seconds % 86400) / 3600`
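The calculations above map directly to a few lines of JavaScript. A sketch of how they might look in the Process Data code node, where `status` stands for the JSON returned by `/api2/json/nodes/{node}/status` (the `cpu`, `memory.total`, `memory.used`, and `uptime` fields follow the Proxmox node-status API):

```javascript
// Resource calculations as described above; 'status' is the parsed node-status response.
const cpuPercent  = (status.cpu * 100).toFixed(1);                      // 0.0-1.0 -> %
const memUsedGb   = (status.memory.used  / (1024 ** 3)).toFixed(2);     // bytes -> GB
const memTotalGb  = (status.memory.total / (1024 ** 3)).toFixed(2);
const memPercent  = ((status.memory.used / status.memory.total) * 100).toFixed(1);
const uptimeDays  = Math.floor(status.uptime / 86400);                  // seconds -> days
const uptimeHours = Math.floor((status.uptime % 86400) / 3600);         // remainder -> hours
```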
### Report Generation

**Message Structure**
The Telegram message uses HTML formatting and is structured as follows:

- **Header Section**: report title, generation timestamp
- **Virtual Machines Section**: total VM count, running VMs with a checkmark, stopped VMs with a stop sign, recently stopped count with a warning, and a detailed list if VMs were stopped in the last 15 minutes
- **Host Resources Section**: CPU usage percentage, memory used/total with percentage, host uptime in days and hours
- **Temperature Section**: smart display (summary or detailed), warning indicator if thresholds are exceeded, monospace formatting for sensor output

**HTML Formatting Features**
- Bold tags for headers and labels
- Italic for timestamps
- Code blocks for temperature data
- Unicode separators for visual structure
- Emoji indicators for status (checkmark, stop, warning, fire)

### Security Considerations

**Credential Storage**
- Passwords stored in the n8n Set node (encrypted in the database)
- Alternative: use n8n environment variables
- Recommendation: use Proxmox API tokens instead of passwords

**API Communication**
- HTTPS with self-signed certificate acceptance
- Authentication via session tickets (15-minute validity)
- CSRF token validation for API requests

**SSH Access**
- Password-based authentication (can use key-based)
- Commands limited to read-only operations
- No privilege escalation required

### Performance Impact

**API Load**
- 3 API calls per execution (VM list, tasks, status)
- Lightweight endpoints with minimal data
- The 15-minute interval reduces server load

**Execution Time**
- Typical workflow execution: 5-10 seconds
- Login: 1-2 seconds
- API calls: 2-3 seconds
- SSH command: 1-2 seconds
- Processing: less than 1 second

**Resource Usage**
- Minimal CPU impact on Proxmox
- Small memory footprint
- Negligible network bandwidth

### Extensibility

**Adding Additional Metrics**
To monitor additional data points:
1. Add a new API call node after "Prepare Auth"
2. Update the "Process Data" node to include the new data
3. Modify "Generate Formatted Message" for display

**Integration with Other Services**
The workflow can be extended to:
- Send to Discord, Slack, or email
- Write to a database or log file
- Trigger alerts based on thresholds
- Generate charts or graphs

**Multi-Node Monitoring**
To monitor multiple Proxmox nodes:
1. Duplicate the API call nodes
2. Update the node names in the URLs
3. Merge data in the processing step
4. Generate a combined report
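To illustrate how the Generate Formatted Message node assembles the report, here is an approximate sketch. Telegram's HTML parse mode supports `<b>`, `<i>`, `<code>`, and `<pre>`; the exact layout, separators, and emoji of the real template differ, and the interpolated variables are assumed to come from the processing step shown earlier.

```javascript
// Approximate sketch of the Telegram report body (HTML parse mode).
const message = [
  `<b>Proxmox Status Report</b>`,
  `<i>${new Date().toISOString()}</i>`,
  `━━━━━━━━━━━━━━`,
  `<b>Virtual Machines</b>`,
  `Total: ${totalVms} | ✅ Running: ${runningVms} | ⛔ Stopped: ${stoppedVms}`,
  `<b>Host Resources</b>`,
  `CPU: ${cpuPercent}% | RAM: ${memUsedGb}/${memTotalGb} GB (${memPercent}%)`,
  `Uptime: ${uptimeDays}d ${uptimeHours}h`,
  `<b>Temperature</b>`,
  `<pre>${temperatureSummary}</pre>`,
].join('\n');
// Send with parse_mode set to HTML in the Telegram node.
```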
by Rahul Joshi
## Description
This workflow automates the evaluation of interviewer feedback using AI. It retrieves raw notes from Google Sheets, processes them through GPT-4o-mini for structured scoring, validates outputs, and calculates weighted quality scores. The system provides real-time Slack feedback to interviewers, logs AI errors for transparency, and recommends training if the feedback quality is low.

## What This Template Does (Step-by-Step)
⚡ **Manual Trigger** – Runs the workflow manually to start evaluation.
📋 **Fetch Raw Feedback Data (Google Sheets)** – Reads all feedback entries (Role, Stage, Interviewer Email, Feedback Text, row_number).
🧠 **AI Quality Evaluator (Azure GPT-4o-mini)** – Processes feedback into structured JSON across 5 dimensions.
🔍 **Analyze Feedback Quality (LLM Chain)** – Applies scoring rules (Specificity, STAR, Bias-Free, Actionability, Depth) and outputs structured JSON.
✅ **Validate AI Response** – Ensures the AI output isn't undefined or malformed.
🚨 **Log AI Errors (Google Sheets)** – Records invalid AI responses for debugging and auditing.
🔄 **Parse AI JSON Output (Code Node)** – Converts AI JSON text into structured n8n objects with error handling.
🧮 **Calculate Weighted Quality Score (Code Node)** – Computes the final weighted score (0–100), generates flags, formats vague phrases, and preserves context.
💾 **Save Scores to Spreadsheet (Google Sheets)** – Updates the original feedback row with Score, Flags, and AI JSON.
💬 **Send Feedback Summary to Interviewer (Slack)** – Sends interviewers a structured Slack report (score, flags, vague phrases, STAR improvement tips).
🎯 **Check if Training Needed** – Applies threshold logic: if score < 50, route to training recommendations.
📚 **Send Training Recommendations (Slack)** – Delivers STAR method guides and bias-free interviewing resources to low scorers.

## Prerequisites
- Google Sheets (Raw_Feedback + Error Log Sheet)
- Azure OpenAI API credentials (for GPT-4o-mini)
- Slack API credentials (for sending feedback & training notifications)
- n8n instance (cloud or self-hosted)

## Key Benefits
✅ Automated interview feedback quality scoring
✅ Bias detection and vague feedback flagging
✅ Real-time Slack feedback to interviewers
✅ Error logging for AI reliability tracking
✅ Training recommendations for low scorers
✅ Audit trail maintained in Google Sheets

## Perfect For
- HR & Recruitment teams ensuring structured interviewer feedback
- Organizations enforcing the STAR method & bias-free hiring
- Teams seeking continuous interviewer coaching
- Companies needing audit-ready records of interview quality
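As a rough illustration of the scoring step, the sketch below shows how the Calculate Weighted Quality Score code node could combine the five dimensions. The dimension names and the score < 50 training threshold come from the description above; the 0-10 per-dimension scale and the weights are assumptions to adjust to your own rubric.

```javascript
// Sketch of the Calculate Weighted Quality Score code node (run once per item).
// Weights and the 0-10 per-dimension scale are illustrative assumptions.
const weights = { specificity: 0.25, star: 0.25, biasFree: 0.20, actionability: 0.15, depth: 0.15 };
const scores = $json.aiScores || {};   // parsed LLM output, e.g. { specificity: 7, star: 5, ... }

let weighted = 0;
for (const [dimension, weight] of Object.entries(weights)) {
  weighted += (scores[dimension] ?? 0) * weight;   // each dimension assumed scored 0-10
}
const finalScore = Math.round(weighted * 10);      // rescale to 0-100

const flags = [];
if (finalScore < 50) flags.push('TRAINING_RECOMMENDED');   // threshold used by the Slack routing
if ((scores.biasFree ?? 10) < 5) flags.push('POSSIBLE_BIAS');

return { json: { ...$json, finalScore, flags } };
```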
by Rahul Joshi
## 📊 Description
This workflow automatically classifies new Stack Overflow questions by topic, generates structured FAQ content using GPT-4o-mini, logs each entry in Google Sheets, saves formatted FAQs in Notion, and notifies your team on Slack — ensuring your product and support teams stay aligned with real-world developer discussions. 🤖💬📚

## ⚙️ What This Template Does
- **Step 1:** Monitors Stack Overflow RSS feeds for new questions related to your selected tags. ⏱️
- **Step 2:** Filters out irrelevant or incomplete questions before processing. 🧹
- **Step 3:** Uses OpenAI GPT-4o-mini to classify each question into a topic category (Frontend, Backend, DevOps, etc.). 🧠
- **Step 4:** Generates structured FAQ content including summaries, technical insights, and internal guidance. 📄
- **Step 5:** Saves formatted entries into your Notion knowledge-base database. 📚
- **Step 6:** Logs all FAQ data into a connected Google Sheet for analytics and tracking. 📊
- **Step 7:** Sends real-time Slack notifications with quick links to the new FAQ and the original Stack Overflow post. 🔔
- **Step 8:** Provides automatic error detection — any failed AI or Notion step triggers an instant Slack alert. 🚨

## 💡 Key Benefits
✅ Builds a continuously updated, AI-driven knowledge base
✅ Reduces repetitive support and documentation work
✅ Keeps product and dev teams aware of trending community issues
✅ Enhances internal docs with verified Stack Overflow insights
✅ Maintains an audit trail via Google Sheets
✅ Alerts your team instantly on errors or new FAQs

## 🧩 Features
- Automatic Stack Overflow RSS monitoring
- Dual-layer OpenAI integration (topic classification + FAQ generation)
- Structured Notion database integration
- Google Sheets logging for analytics
- Slack notifications for new FAQs and error alerts
- Custom tag-based question filtering
- Near real-time updates (every minute)
- Built-in error handling for reliability

## 🔐 Requirements
- OpenAI API Key (GPT-4o-mini access)
- Notion API credentials with database access
- Google Sheets OAuth2 credentials
- Slack bot token with chat:write permissions
- Stack Overflow RSS feed URL for your preferred tags

## 👥 Target Audience
- SaaS or product teams building internal FAQ and knowledge systems
- Developer relations and documentation teams
- Customer-support teams automating knowledge reuse
- Technical communities curating content from Stack Overflow

## 🧭 Setup Instructions
1. Add your OpenAI API credentials in n8n.
2. Connect your Notion database and update the page or database ID.
3. Connect Google Sheets credentials and select your tracking sheet.
4. Connect your Slack account and specify your notification channel.
5. Update the RSS Feed URL with your chosen Stack Overflow tags.
6. Run the workflow manually once to test connectivity, then enable automation.
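As an illustration of Steps 1-2, the snippet below sketches a per-tag feed URL and a simple pre-filter. The URL pattern is the common Stack Overflow per-tag RSS format, and the item fields (`title`, `contentSnippet`) mirror typical RSS Feed Read output; treat both as assumptions to verify against your own feed.

```javascript
// Sketch of the tag feed and the pre-filter step; 'rssItems' stands for the
// items emitted by the RSS node. Field names are assumptions to adapt.
const feedUrl = 'https://stackoverflow.com/feeds/tag?tagnames=n8n&sort=newest';

const relevant = rssItems.filter(item =>
  item.title &&
  item.contentSnippet &&
  item.contentSnippet.length > 80 &&                     // skip very short, low-context questions
  !item.title.toLowerCase().includes('[duplicate]')      // drop closed duplicates
);
```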
by n8n Automation Expert | Template Creator | 2+ Years Experience
## 🎯 What This Workflow Does
Transform your digital payment business with a fully featured Telegram bot that handles everything from product listings to transaction processing. Perfect for entrepreneurs looking to automate their PPOB (mobile credit, data packages, bill payments) business operations without coding expertise.

## ✨ Key Features

### 📱 Complete Transaction Management
- **Prepaid Services**: Mobile credit, data packages, PLN tokens
- **Gaming**: Game vouchers for popular platforms
- **E-Wallet**: OVO, DANA, GoPay, ShopeePay top-ups
- **Bill Payments**: PLN postpaid, Telkom, cable TV, internet, credit cards

### 💰 Smart Business Operations
- Real-time balance checking with low-balance alerts
- Automated transaction processing with MD5 security
- Interactive product catalog with categorized browsing
- Transaction history and status tracking
- Deposit request management

### 🤖 User-Friendly Interface
- Intuitive inline keyboard navigation
- Multi-step transaction flows with validation
- Comprehensive error handling and user feedback
- Professional messaging with emojis and formatting

## 🛠️ Technical Highlights

### Robust Architecture
- **Switch-based routing** for efficient command handling
- **MD5 signature authentication** for secure API communications
- **Session management** for multi-step user interactions
- **Comprehensive error handling** with user-friendly messages

### API Integrations
- **Digiflazz API**: Balance checking, product listings, transactions, bill inquiries
- **Telegram Bot API**: Message handling, inline keyboards, callback queries
- **Secure credential management** with environment variables

## 📋 Setup Requirements

### Prerequisites
- Active Digiflazz account with API credentials
- Telegram Bot Token from @BotFather
- n8n instance (cloud or self-hosted)

### Environment Variables
```
DIGIFLAZZ_USERNAME=your_digiflazz_username
DIGIFLAZZ_API_KEY=your_digiflazz_api_key
```

## 🎮 How to Use

### Customer Commands
- /start - Welcome message and main menu
- /menu - Access main navigation
- /balance - Check account balance
- /products - Browse product catalog
- /topup - Process prepaid transactions
- /checkbill - Inquire about postpaid bills
- /paybill - Pay postpaid services
- /deposit - Request balance deposit
- /history - View transaction history

### Business Features
- **Automated balance monitoring** with threshold alerts
- **Product categorization** for easy browsing
- **Transaction confirmation** with detailed receipts
- **Multi-payment type support** across various service providers

## 🔒 Security & Compliance
- **MD5 signature verification** for all API calls
- **Input validation** and sanitization
- **Session timeout management**
- **Error logging** and monitoring
- **HTTPS-only communications**

## 💡 Business Benefits

### For PPOB Entrepreneurs
- **Reduce manual work** by 90% through automation
- **24/7 customer service** without human intervention
- **Professional presentation** builds customer trust
- **Scalable operations** handle unlimited transactions

### For Customers
- **Instant transactions** with real-time confirmations
- **Easy navigation** through intuitive menus
- **Multiple service options** in one convenient bot
- **Reliable service** with comprehensive error handling

## 📊 Performance Features
- **Sub-second response times** for balance checks
- **Concurrent transaction processing**
- **Automatic retry logic** for failed operations
- **Detailed logging** for business analytics

## 🎯 Perfect For
- **Digital payment entrepreneurs** starting PPOB businesses
- **Existing businesses** looking to automate customer service
- **Resellers** wanting professional transaction interfaces
- **Developers** seeking proven automation templates

## 📱 Supported Services

### Prepaid Products
- Mobile credit (all Indonesian operators)
- Data packages and internet vouchers
- PLN electricity tokens
- Game vouchers (Mobile Legends, Free Fire, PUBG, etc.)

### Postpaid Services
- PLN electricity bills
- Telkom phone bills
- Cable TV subscriptions (First Media, MNC, etc.)
- Internet service providers
- Credit card payments
- Multifinance installments

## 🚀 Getting Started
1. Import the workflow JSON into your n8n instance
2. Configure Telegram and Digiflazz credentials
3. Set up environment variables
4. Activate the workflow
5. Test with your Telegram bot
6. Start serving customers immediately!

## 💎 Premium Features
- **Comprehensive documentation** with setup guides
- **Error handling** for all edge cases
- **Professional UI/UX** design
- **Scalable architecture** for business growth
- **Community support** and updates

Transform your digital payment business today with this production-ready Telegram bot automation. No coding required – just configure and launch! Perfect for the Indonesian PPOB market with full Digiflazz integration and professional customer experience.
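For orientation, the sketch below shows the kind of MD5 signing the Digiflazz calls rely on. The exact string that is hashed (for example, the ref_id for transactions or a fixed keyword for balance checks) and the payload field names must be confirmed against the Digiflazz API documentation; treat everything here as an assumption. Built-in modules such as crypto are only available in n8n Code nodes if your instance allows them.

```javascript
// Illustrative MD5 signature for a Digiflazz-style request (Code node or any Node.js context).
// Payload fields and the exact string being hashed are assumptions to verify.
const crypto = require('crypto');

const username = $env.DIGIFLAZZ_USERNAME;
const apiKey   = $env.DIGIFLAZZ_API_KEY;
const refId    = 'trx-' + Date.now();

const sign = crypto.createHash('md5')
  .update(username + apiKey + refId)
  .digest('hex');

return [{ json: { cmd: 'topup', username, ref_id: refId, sign } }];
```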
by Alexandra Spalato
## Short Description
This LinkedIn automation workflow monitors post comments for specific trigger words and automatically sends direct messages with lead magnets to engaged users. The system checks connection status, handles non-connected users with connection requests, and prevents duplicate outreach by tracking all interactions in a database.

## Key Features
- **Comment Monitoring**: Scans LinkedIn post comments for customizable trigger words
- **Connection Status Check**: Determines if users are 1st-degree connections
- **Automated DMs**: Sends personalized messages with lead magnet links to connected users
- **Connection Requests**: Asks non-connected users to connect via comment replies
- **Duplicate Prevention**: Tracks interactions in NocoDB to avoid repeat messages
- **Message Rotation**: Uses different comment reply variations for authenticity
- **Batch Processing**: Handles multiple comments with built-in delays

## Who This Workflow Is For
- Content creators looking to convert post engagement into leads
- Coaches and consultants sharing valuable LinkedIn content
- Anyone wanting to automate lead capture from LinkedIn posts

## How It Works
1. **Setup**: Configure the post ID, trigger word, and lead magnet link via a form
2. **Comment Extraction**: Retrieves all comments from the specified post using Unipile
3. **Trigger Detection**: Filters comments containing the specified trigger word
4. **Connection Check**: Determines if commenters are 1st-degree connections
5. **Smart Routing**: Connected users receive DMs, others get connection requests
6. **Database Logging**: Records all interactions to prevent duplicates

## Setup Requirements

### Required Credentials
- **Unipile API Key**: For LinkedIn API access
- **NocoDB API Token**: For database tracking

### Database Structure
**Table: leads**
- linkedin_id: LinkedIn user ID
- name: User's full name
- headline: LinkedIn headline
- url: Profile URL
- date: Interaction date
- posts_id: Post reference
- connection_status: Network distance
- dm_status: Interaction type (sent/connection request)

## Customization Options
- **Message Templates**: Modify DM and connection request messages
- **Trigger Words**: Change the words that activate the workflow
- **Timing**: Adjust delays between messages (8-12 seconds default)
- **Reply Variations**: Add more comment reply options for authenticity

## Installation Instructions
1. Import the workflow into your n8n instance
2. Set up a NocoDB database with the required table structure
3. Configure Unipile and NocoDB credentials
4. Set environment variables for the Unipile root URL and LinkedIn account ID
5. Test with a sample post before full use
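The trigger-detection and duplicate-prevention steps can be pictured as a small filter, sketched below. The comment and lead field names (`text`, `author.id`, `linkedin_id`, `trigger_word`) are illustrative assumptions; align them with the actual Unipile response and your NocoDB leads table.

```javascript
// Sketch of trigger detection plus duplicate prevention. 'comments' stands for
// the Unipile comment list and 'leads' for the rows already stored in NocoDB.
const triggerWord = $json.trigger_word.toLowerCase();             // e.g. "guide"
const alreadyContacted = new Set(leads.map(l => l.linkedin_id));

const toProcess = comments
  .filter(c => c.text && c.text.toLowerCase().includes(triggerWord))
  .filter(c => !alreadyContacted.has(c.author.id));               // skip users already handled

return toProcess.map(c => ({ json: c }));
```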
by Growth AI
# Google Ads automated reporting to spreadsheets with Airtable

## Who's it for
Digital marketing agencies, PPC managers, and marketing teams who manage multiple Google Ads accounts and need automated monthly performance reporting organized by campaign types and conversion metrics.

## What it does
This workflow automatically retrieves Google Ads performance data from multiple client accounts and populates organized spreadsheets with campaign metrics. It differentiates between e-commerce (conversion value) and lead generation (conversion count) campaigns, then organizes data by advertising channel (Performance Max, Search, Display, etc.) with monthly tracking for budget and performance analysis.

## How it works
The workflow follows an automated data collection and reporting process:

1. **Account Retrieval**: Fetches client information from Airtable (project names, Google Ads IDs, campaign types)
2. **Active Filter**: Processes only accounts marked as "Actif" for budget reporting
3. **Campaign Classification**: Routes accounts through e-commerce or lead generation workflows based on "Typologie ADS"
4. **Google Ads Queries**: Executes different API calls depending on campaign type (conversion value vs. conversion count)
5. **Data Processing**: Organizes metrics by advertising channel (Performance Max, Search, Display, Video, Shopping, Demand Gen)
6. **Dynamic Spreadsheet Updates**: Automatically fills the correct monthly column in client spreadsheets
7. **Sequential Processing**: Handles multiple accounts with wait periods to avoid API rate limits

## Requirements
- Airtable account with a client database
- Google Ads API access with a developer token
- Google Sheets API access
- Client-specific spreadsheet templates (provided)

## How to set up

### Step 1: Prepare your reporting template
- Copy the Google Sheets reporting template
- Create individual copies for each client
- Ensure the proper column structure (months in columns B-M for January-December)
- Link template URLs in your Airtable database

### Step 2: Configure your Airtable database
Set up the following fields in your Airtable:
- **Project names**: Client project identifiers
- **ID GADS**: Google Ads customer IDs
- **Typologie ADS**: Campaign classification ("Ecommerce" or "Lead")
- **Status - Prévisionnel budgétaire**: Account status ("Actif" for active accounts)
- **Automation budget**: URLs to client-specific reporting spreadsheets

### Step 3: Set up API credentials
Configure the following authentication:
- **Airtable Personal Access Token**: For client database access
- **Google Ads OAuth2**: For advertising data retrieval
- **Google Sheets OAuth2**: For spreadsheet updates
- **Developer Token**: Required for Google Ads API access
- **Login Customer ID**: Manager account identifier

### Step 4: Configure Google Ads API settings
Update the HTTP request nodes with your credentials:
- **Developer Token**: Replace "[Your token]" with your actual developer token
- **Login Customer ID**: Replace "[Your customer id]" with your manager account ID
- **API Version**: Currently using v18 (update as needed)

### Step 5: Set up scheduling
- Default schedule: runs on the 3rd of each month at 5 AM
- Cron expression: 0 5 3 * *
- Recommended timing: early-month execution for complete previous-month data
- Processing delay: 1-minute waits between accounts to respect API limits

## How to customize the workflow

### Campaign type customization

**E-commerce campaigns:**
- Tracks: cost and conversion value metrics
- Query: metrics.conversions_value for revenue tracking
- Use case: online stores, retail businesses

**Lead generation campaigns:**
- Tracks: cost and conversion count metrics
- Query: metrics.conversions for lead quantity
- Use case: service businesses, B2B companies
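The two branches differ mainly in which metric the Google Ads query requests. A sketch of the e-commerce query is shown below (the lead-generation branch selects metrics.conversions instead); the GAQL field names are standard, while the surrounding request details are simplified and not the exact node configuration.

```javascript
// Sketch of the e-commerce branch's GAQL query, sent by the HTTP Request node
// to the googleAds:search endpoint of the Google Ads API.
const query = `
  SELECT
    campaign.advertising_channel_type,
    metrics.cost_micros,
    metrics.conversions_value
  FROM campaign
  WHERE segments.date DURING LAST_MONTH`;

const body = { query };
// POST https://googleads.googleapis.com/v18/customers/{customerId}/googleAds:search
// Headers: developer-token, login-customer-id, plus OAuth2 authorization.
```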
### Advertising channel expansion
Current channels tracked:
- **Performance Max**: Automated campaign type
- **Search**: Text ads on search results
- **Display**: Visual ads on partner sites
- **Video**: YouTube and video partner ads
- **Shopping**: Product listing ads
- **Demand Gen**: Audience-focused campaigns

Add new channels by modifying the data-processing code nodes.

### Reporting period adjustment
- Current setting: last month's data (DURING LAST_MONTH)
- Alternative periods: last 30 days, specific date ranges, quarterly reports
- Custom timeframes: modify the Google Ads query date parameters

### Multi-account management
- **Sequential processing**: Handles multiple accounts automatically
- **Error handling**: Continues processing if individual accounts fail
- **Rate limiting**: Built-in waits prevent API quota issues
- **Batch size**: No limit on the number of accounts processed

## Data organization features

### Dynamic monthly columns
- **Automatic detection**: Determines the previous month's column (B-M)
- **Column mapping**: January=B, February=C, ..., December=M
- **Data placement**: Updates the correct month automatically
- **Multi-year support**: Handles year transitions seamlessly

### Campaign performance breakdown
Each account populates 10 rows of data:
- Performance Max Cost (Row 2)
- Performance Max Conversions/Value (Row 3)
- Demand Gen Cost (Row 4)
- Demand Gen Conversions/Value (Row 5)
- Search Cost (Row 6)
- Search Conversions/Value (Row 7)
- Video Cost (Row 8)
- Video Conversions/Value (Row 9)
- Shopping Cost (Row 10)
- Shopping Conversions/Value (Row 11)

### Data processing logic
- **Cost conversion**: Automatically converts micros to euros (÷1,000,000)
- **Precision rounding**: Rounds to 2 decimal places for clean presentation
- **Zero handling**: Shows 0 for campaign types with no activity
- **Data validation**: Handles missing or null values gracefully

## Results interpretation

### Monthly performance tracking
- **Historical data**: Year-over-year comparison across all channels
- **Channel performance**: Identify the best-performing advertising types
- **Budget allocation**: Data-driven decisions for campaign investments
- **Trend analysis**: Month-over-month growth or decline patterns

### Account-level insights
- **Multi-client view**: Consolidated reporting across all managed accounts
- **Campaign diversity**: Understanding which channels clients use most
- **Performance benchmarks**: Compare similar account types and industries
- **Resource allocation**: Focus on high-performing accounts and channels

## Use cases

### Agency reporting automation
- **Client dashboards**: Automated population of monthly performance reports
- **Budget planning**: Historical data for next month's budget recommendations
- **Performance reviews**: Ready-to-present data for client meetings
- **Trend identification**: Spot patterns across multiple client accounts

### Internal performance tracking
- **Team productivity**: Track account management efficiency
- **Campaign optimization**: Identify underperforming channels for improvement
- **Growth analysis**: Monitor client account growth and expansion
- **Forecasting**: Use historical data for future performance predictions

### Strategic planning
- **Budget allocation**: Data-driven distribution across advertising channels
- **Channel strategy**: Determine which campaign types to emphasize
- **Client retention**: Proactive identification of declining accounts
- **New business**: Performance data to support proposals and pitches

## Workflow limitations
- **Monthly execution**: Designed for monthly reporting (not real-time)
- **API dependencies**: Requires stable Google Ads and Sheets API access
- **Rate limiting**: Sequential processing prevents parallel account handling
- **Template dependency**: Requires a specific spreadsheet structure for proper data placement
- **Previous month focus**: Optimized for completed-month data (run early in the new month)
- **Manual credential setup**: Requires individual configuration of API tokens and customer IDs
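Two of the steps above, the micros-to-euro conversion and the month-to-column mapping, are easy to picture in code. A minimal sketch with illustrative variable names rather than the exact code-node contents:

```javascript
// Cost conversion: micros -> euros, rounded to 2 decimals.
// 'costMicros' stands for the cost value returned by the Google Ads API.
const costEuros = Math.round((costMicros / 1_000_000) * 100) / 100;

// Previous-month column mapping: January=B ... December=M.
const now = new Date();
const previousMonth = (now.getMonth() + 11) % 12;                        // 0 = January ... 11 = December
const column = String.fromCharCode('B'.charCodeAt(0) + previousMonth);   // "B".."M"
```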
by Grigory Frolov
## 📊 YouTube Personal Channel Videos → Google Sheets
Automatically sync your YouTube videos (title, description, tags, publish date, captions, etc.) into Google Sheets — perfect for creators and marketers who want a clean content database for analysis or reporting.

### 🚀 What this workflow does
✅ Connects to your personal YouTube channel via Google OAuth
🔁 Fetches all uploaded videos automatically (with pagination)
🏷 Extracts metadata: title, description, tags, privacy status, upload status, thumbnail, etc.
🧾 Retrieves captions (SRT format) if available
📈 Writes or updates data in your Google Sheets document
⚙️ Can be run manually or scheduled via Cron

### 🧩 Nodes used
- **Manual Trigger** — to start manually or connect with Cron
- **HTTP Request (YouTube API v3)** — fetches channel, uploads, and captions
- **Code Nodes** — manage pagination and collect IDs
- **SplitOut** — iterates through video lists
- **Google Sheets (appendOrUpdate)** — stores data neatly
- **If Conditions** — control data flow and prevent empty responses

### ⚙️ Setup guide
1. **Connect your Google Account**: Used for both the YouTube API and Google Sheets. Make sure the credentials are set up in the Google OAuth2 API and Google Sheets OAuth2 API nodes.
2. **Create a Google Sheet**: Add a tab named Videos with these columns:
   youtube_id | title | description | tags | privacyStatus | uploadStatus | thumbnail | captions
   You can also include categoryId, maxres, or published if you'd like.
3. **Replace the sample Sheet ID**: In each Google Sheets node, open the "Spreadsheet" field and choose your own document. Make sure the sheet name matches the tab name (Videos).
4. **Run the workflow**: Execute it manually first to pull your latest uploads. Optionally add a Cron Trigger node for daily sync (e.g., once per day).
5. **Check your Sheet**: Your data should appear instantly, with each video's metadata and captions (if available).

### 🧠 Notes & tips
⚙️ The flow loops through all pages of your upload playlist automatically — no manual pagination needed.
🕒 The workflow uses YouTube's "contentDetails.relatedPlaylists.uploads" to ensure you only fetch your own uploads.
💡 Captions fetch may fail for private videos — use "Continue on Fail" if you want the rest to continue.
🧮 Ideal for dashboards, reporting sheets, SEO analysis, or automation triggers.
💾 To improve speed, you can disable the "Captions" branch if you only need metadata.

### 👥 Ideal for
🎬 YouTube creators maintaining a video database
📊 Marketing teams tracking SEO performance
🧠 Digital professionals building analytics dashboards
⚙️ Automation experts using YouTube data in other workflows

### 💛 Credits
Created by Grigory Frolov
YouTube: @gregfrolovpersonal
More workflows and guides → ozwebexpert.com/n8n
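For readers curious how the pagination works under the hood, the sketch below walks the uploads playlist page by page. The endpoint and parameters follow the YouTube Data API v3; `uploadsPlaylistId` (taken from contentDetails.relatedPlaylists.uploads) and `accessToken` are assumed to be available, and error handling is omitted for brevity.

```javascript
// Sketch of the pagination loop over the uploads playlist.
const base = 'https://www.googleapis.com/youtube/v3/playlistItems';
let pageToken = '';
const videoIds = [];

do {
  const url = `${base}?part=snippet,contentDetails&playlistId=${uploadsPlaylistId}`
            + `&maxResults=50${pageToken ? `&pageToken=${pageToken}` : ''}`;
  const res  = await fetch(url, { headers: { Authorization: `Bearer ${accessToken}` } });
  const page = await res.json();
  page.items.forEach(i => videoIds.push(i.contentDetails.videoId));
  pageToken = page.nextPageToken;               // undefined on the last page, which ends the loop
} while (pageToken);
```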
by Stephan Koning
# Real-Time ClickUp Time Tracking to HubSpot Project Sync
This workflow automates the synchronization of time tracked on ClickUp tasks directly to a custom project object in HubSpot, ensuring your project metrics are always accurate and up-to-date.

## Use Case & Problem
This workflow is designed for teams that use a custom object in HubSpot for high-level project overviews (tracking scoped vs. actual hours per sprint) but manage daily tasks and time logging in ClickUp. The primary challenge is the constant, manual effort required to transfer tracked hours from ClickUp to HubSpot, a process that is both time-consuming and prone to errors. This automation eliminates that manual work entirely.

## How It Works
- **Triggers on Time Entry:** The workflow instantly starts whenever a user updates the time tracked on any task in a specified ClickUp space. ⏱️
- **Fetches Task & Time Details:** It immediately retrieves all relevant data about the task (like its name and custom fields) and the specific time entry that was just updated.
- **Identifies the Project & Sprint:** The workflow processes the task data to determine which HubSpot project it belongs to and categorizes the work into the correct sprint (e.g., Sprint 1, Sprint 2, Additional Requests).
- **Updates HubSpot in Real-Time:** It finds the corresponding project record in HubSpot and updates the master actual_hours_tracked property. It then intelligently updates the specific field for the corresponding sprint (e.g., actual_sprint_1_hours), ensuring your reporting remains granular and accurate.

## Requirements
✅ **ClickUp Account** with the following custom fields on your tasks:
- A Dropdown custom field named Sprint to categorize tasks.
- A Short Text custom field named HubSpot Deal ID (or similar) to link to the HubSpot record.

✅ **HubSpot Account** with:
- A Custom Object used for project tracking.
- **Custom Properties** on that object to store total and sprint-specific hours (e.g., actual_hours_tracked, actual_sprint_1_hours, total_time_remaining, etc.).

> Note: Since this workflow interacts with a custom HubSpot object, it uses flexible HTTP Request nodes instead of the standard n8n HubSpot nodes.

## Setup Instructions
1. **Configure Credentials:** Add your ClickUp (OAuth2) and HubSpot (Header Auth with a Private App Token) credentials to the respective nodes in the workflow.
2. **Set ClickUp Trigger:** In the Time Tracked Update Trigger node, select your ClickUp team and the specific space you want to monitor for time updates.
3. **Update HubSpot Object ID:** Find the objectTypeId of your custom project object in HubSpot. In the HubSpot HTTP Request nodes (e.g., OnProjectFolder), replace the placeholder objectTypeId in the URL with your own.

## How to Customize
- Adjust the "Code: Extract Sprint & Task Data" node to change how sprint names are mapped or how time is calculated.
- Update the URLs in the HubSpot HTTP Request nodes if your custom object or property names differ.
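For context, the update the HTTP Request nodes perform looks roughly like the sketch below. The CRM v3 objects endpoint and the properties body shape are standard HubSpot API; the objectTypeId, record ID, and property values must match your own portal, and the lookup logic that finds the record is omitted.

```javascript
// Sketch of the HubSpot custom-object update (PATCH crm/v3/objects/{objectTypeId}/{recordId}).
const url = `https://api.hubapi.com/crm/v3/objects/${objectTypeId}/${recordId}`;

await fetch(url, {
  method: 'PATCH',
  headers: {
    'Authorization': `Bearer ${privateAppToken}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    properties: {
      actual_hours_tracked: totalHours,    // master total across all sprints
      actual_sprint_1_hours: sprintHours,  // sprint-specific field chosen from the task's Sprint dropdown
    },
  }),
});
```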
by Fahmi Fahreza
Sign up for Decodo HERE for a discount.

Automatically scrape, structure, and log forum or news content using Decodo and Google Gemini AI. This workflow extracts key details like titles, URLs, authors, and engagement stats, then appends them to a Google Sheet for tracking and analysis.

## Who's it for?
Ideal for data journalists, market researchers, or AI enthusiasts who want to monitor trending topics across specific domains.

## How it works
1. **Trigger:** The workflow runs on a schedule.
2. **Data Setup:** Defines forum URLs and geolocation.
3. **Scraping:** Extracts raw text data using the Decodo API.
4. **AI Extraction:** Gemini parses and structures the scraped text into clean JSON.
5. **Data Storage:** Each news item is appended or updated in Google Sheets.
6. **Logging:** Records scraping results for monitoring and debugging.

## How to set up
1. Add your Decodo, Google Gemini, and Google Sheets credentials in n8n.
2. Adjust the forum URLs, geolocation, and Google Sheet ID in the Workflow Config node.
3. Set your preferred trigger interval in Schedule Trigger.
4. Activate and monitor from the n8n dashboard.
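To make the AI Extraction step concrete, the sketch below shows the kind of structured item the Gemini prompt aims to produce, based on the fields listed above (title, URL, author, engagement). The exact keys are an assumption; match them to your Google Sheet columns.

```javascript
// Illustrative shape of one extracted item; values are placeholders, not real data.
const exampleItem = {
  title: "Example thread title",
  url: "https://example-forum.com/thread/123",
  author: "some_user",
  publishedAt: "2024-01-15",
  engagement: { replies: 42, views: 1337 },
};
```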
by Rahul Joshi
## Description
Turn incoming Gmail messages into Zendesk tickets and keep a synchronized log in Google Sheets. Uses Gmail as the trigger, creates Zendesk tickets, and appends or updates a central sheet for tracking. Gain a clean, auditable pipeline from inbox to support queue. ✨

## What This Template Does
- Fetches new emails via the Gmail Trigger. ✉️
- Normalizes the Gmail payload for consistent fields. 🧹
- Creates a Zendesk ticket from the email content. 🎫
- Formats data for Sheets and appends or updates a row. 📊
- Executes helper sub-workflows and writes logs for traceability. 🔁🧾

## Key Benefits
- Converts emails to actionable support tickets automatically. ⚡
- Maintains a single source of truth in Google Sheets. 📒
- Reduces manual triage and data entry. 🕒
- Improves accountability with structured logs. ✅

## Features
- Gmail Trigger for real-time intake. ⏱️
- Normalize Gmail Data for consistent fields. 🧩
- Create Zendesk Ticket (create: ticket). 🎟️
- Format Sheet Data for clean columns. 🧱
- Log to Google Sheets with appendOrUpdate. 🔄
- Execute workflow (sub-workflow) steps for modularity. 🧩

## Requirements
- n8n instance (cloud or self-hosted). 🛠️
- Gmail credentials configured in n8n (with read access to the monitored inbox). ✉️
- Zendesk credentials (API token or OAuth) with permission to create tickets. 🔐
- Google Sheets credentials with access to the target spreadsheet for append/update. 📊
- Access to any sub-workflows referenced by the Execute workflow nodes. 🔁

## Target Audience
- IT support and helpdesk teams managing email-based requests. 🖥️
- Ops teams needing auditable intake logs. 🧾
- Agencies and service providers converting client emails to tickets. 🤝
- Small teams standardizing email-to-ticket flows. 🧑💼

## Step-by-Step Setup Instructions
1. Connect Gmail, Zendesk, and Google Sheets in n8n Credentials. 🔑
2. Set the Gmail Trigger to watch the desired label/inbox. 📨
3. Map Zendesk fields (description) from the normalized Gmail data. 🧭
4. Point the Google Sheets node to your spreadsheet and confirm appendOrUpdate mode. 📄
5. Assign credentials to all nodes, including any Execute workflow steps. 🔁
6. Run once to test end-to-end; then activate the workflow. ✅
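As a rough picture of the normalization step, the sketch below shows how a Code node could flatten the Gmail payload into the fields the Zendesk and Sheets nodes expect. The incoming field names vary with the Gmail Trigger's options, so treat this mapping as an assumption to adapt rather than a specification.

```javascript
// Sketch of the "Normalize Gmail Data" step (Code node, run once for all items).
return items.map(item => {
  const mail = item.json;
  return {
    json: {
      subject: mail.subject || '(no subject)',
      requesterEmail: mail.from?.value?.[0]?.address || mail.from || 'unknown',
      body: mail.text || mail.snippet || '',
      receivedAt: mail.date || new Date().toISOString(),
    },
  };
});
```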
by Abdul Matheen
## Description
This workflow automates the entire student exam evaluation process using AI and Google Workspace tools — no manual correction needed!

Teachers simply submit a form with their name and a scanned copy of a student's answer sheet. The flow then:

1. Uses Gemini Document Analysis to extract answers from the scanned sheet.
2. Passes the extracted answers to an AI Evaluation Agent, equipped with the Question Paper and Correct Answer Sheet (connected via Google Docs tools).
3. The AI cross-checks each student answer, counts correct and incorrect responses, and calculates the total marks.
4. The results are recorded in two Google Sheets:
   - A Summary Sheet with overall student performance (Name, Teacher, Total Marks, etc.)
   - A Detailed Report Sheet logging each question, correct answer, student's answer, and correctness status.

This workflow turns the tedious task of exam evaluation into a seamless AI-driven automation — ensuring speed, accuracy, and transparency in academic grading.

## Highlights
✅ AI Document Understanding (Gemini Model)
✅ Intelligent Answer Comparison
✅ Automated Mark Calculation
✅ Real-Time Google Sheets Update
✅ No Code — Fully Built in n8n