by Yaron Been
This workflow automatically tracks inventory stock levels across multiple products and suppliers to prevent stockouts and optimize inventory management. It saves you time by eliminating the need to manually check stock levels and provides automated alerts when inventory reaches critical thresholds.

Overview

This workflow automatically scrapes supplier websites, e-commerce platforms, and inventory systems to monitor real-time stock levels and availability. It uses Bright Data to access inventory data and AI to intelligently parse stock information, detect low-inventory conditions, and track supply-chain trends.

Tools Used

- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping inventory and supplier websites without being blocked
- **OpenAI**: AI agent for intelligent stock level analysis and trend detection
- **Google Sheets**: For storing inventory data and tracking stock movements

How to Install

1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set Up OpenAI: Configure your OpenAI API credentials
4. Configure Google Sheets: Connect your Google Sheets account and set up your inventory tracking spreadsheet
5. Customize: Define product URLs and inventory monitoring parameters

Use Cases

- **E-commerce**: Monitor product availability across multiple suppliers
- **Retail Management**: Track inventory levels to prevent stockouts
- **Supply Chain**: Monitor supplier stock levels and lead times
- **Procurement**: Identify restocking needs and optimize purchasing decisions

Connect with Me

- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #inventorytracking #stockmonitoring #brightdata #webscraping #inventorymanagement #n8nworkflow #workflow #nocode #stocklevels #supplychain #inventoryautomation #stockalerts #ecommerce #procurement #inventorycontrol #stockanalysis #suppliermonitoring #inventoryoptimization #stocktracking #warehousemanagement #retailautomation #inventorydata #stockmanagement #supplymanagement #inventorymonitoring #productavailability #stockforecasting #inventoryinsights
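The actual thresholds are defined in the workflow's monitoring parameters; as a rough sketch of what the alert-check step can look like in an n8n Code node, consider the snippet below. Field names such as `stockLevel` and `reorderThreshold` are illustrative assumptions, not the template's actual schema.

```javascript
// n8n Code node (Run Once for All Items) - a minimal alert-check sketch.
// Assumes each incoming item carries { product, stockLevel, reorderThreshold }
// as parsed by the AI step; these names are assumptions.
const alerts = [];

for (const item of $input.all()) {
  const { product, stockLevel, reorderThreshold } = item.json;

  // Flag anything at or below its threshold as a critical-stock alert
  if (typeof stockLevel === 'number' && stockLevel <= reorderThreshold) {
    alerts.push({
      json: {
        product,
        stockLevel,
        reorderThreshold,
        severity: stockLevel === 0 ? 'stockout' : 'low',
        checkedAt: new Date().toISOString(),
      },
    });
  }
}

// Downstream nodes (e.g. a notification node) receive only the flagged products
return alerts;
```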
by Oneclick AI Squad
This automated n8n workflow performs weekly forecasting of restaurant sales and raw material requirements using historical data from Google Sheets and AI predictions powered by Google Gemini. The forecast is then emailed to stakeholders for efficient planning and waste reduction.

What is Google Gemini AI?

Google Gemini is an advanced AI model that analyzes historical sales data, seasonal patterns, and market trends to generate forecasts for restaurant sales and inventory requirements, helping optimize purchasing decisions and reduce waste.

Good to Know

- Google Gemini AI forecasting accuracy improves over time as more historical data accumulates
- Weekly forecasting supports better strategic planning than daily predictions
- Google Sheets access must be properly authorized to avoid data sync issues
- Email notifications ensure stakeholders review weekly forecasts promptly
- The system analyzes trends and predicts upcoming needs for efficient planning and waste reduction

How It Works

1. Trigger Weekly Forecast - Automatically starts the workflow every week at a scheduled time
2. Load Historical Sales Data - Pulls weekly sales and material usage data from Google Sheets
3. Format Input for AI Agent - Transforms raw data into a structured format suitable for the AI Agent
4. Generate Forecast with AI - Uses Gemini AI to analyze trends and predict upcoming needs
5. Interpret AI Forecast Output - Parses the AI's response into readable, usable JSON
6. Log Forecast to Google Sheets - Stores the new forecast data back into a Google Sheet
7. Email Forecast Summary - Sends a summary of the forecast via Gmail for stakeholder review

Data Sources

The workflow uses Google Sheets as the primary data source.

Historical Sales Data Sheet - Contains weekly sales and inventory data with columns:
- Week/Date (date)
- Menu Item (text)
- Sales Quantity (number)
- Revenue (currency)
- Raw Material Used (number)
- Inventory Level (number)
- Category (text)

Forecast Output Sheet - Contains AI-generated predictions with columns:
- Forecast Week (date)
- Menu Item (text)
- Predicted Sales (number)
- Recommended Inventory (number)
- Material Requirements (number)
- Confidence Level (percentage)
- Notes (text)

How to Use

1. Import the workflow into n8n
2. Configure Google Sheets API access and authorize the application
3. Set up Gmail credentials for forecast report delivery
4. Create the required Google Sheets with the specified column structures
5. Configure Google Gemini AI API credentials
6. Test with sample historical sales data to verify predictions and email delivery
7. Adjust forecasting parameters based on your restaurant's specific needs
8. Monitor and refine the system based on actual vs. predicted results

Requirements

- Google Sheets API access
- Gmail API credentials
- Google Gemini AI API credentials
- Historical sales and inventory data for initial training

Customizing This Workflow

Modify the Generate Forecast with AI node to focus on specific menu categories, seasonal adjustments, or local market conditions. Adjust the email summary format to match your restaurant's reporting preferences, and add additional data sources such as supplier information, weather data, or a special events calendar for more accurate predictions.
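The "Interpret AI Forecast Output" step is where most setups need care, since model output is free text until parsed. Here is a minimal Code-node sketch, assuming the Gemini node returns its text in `item.json.text` and the prompt requested a JSON array of forecast rows; both are assumptions, not the template's guaranteed shape.

```javascript
// n8n Code node - a sketch of parsing Gemini's forecast into sheet-ready rows.
const results = [];

for (const item of $input.all()) {
  let raw = item.json.text ?? '';

  // Models often wrap JSON in ```json fences - strip them before parsing
  raw = raw.replace(/```json|```/g, '').trim();

  let rows;
  try {
    rows = JSON.parse(raw);
  } catch (err) {
    // Pass the failure downstream instead of crashing the workflow
    results.push({ json: { error: 'Unparseable forecast', raw } });
    continue;
  }

  if (!Array.isArray(rows)) {
    results.push({ json: { error: 'Expected a JSON array', raw } });
    continue;
  }

  // One n8n item per forecast row, ready for the Google Sheets append node
  for (const row of rows) {
    results.push({ json: row });
  }
}

return results;
```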
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically tracks email campaign performance metrics and triggers smart follow-up actions based on engagement data. It saves you time by eliminating the need to manually monitor campaign reports and provides intelligent re-engagement strategies for improving email marketing ROI.

Overview

This workflow automatically scrapes email service provider (ESP) reports to extract campaign performance metrics such as open rates, click-through rates, and bounce rates. It uses AI to analyze the data and automatically sends targeted follow-up emails to re-engage subscribers who opened but didn't click, maximizing campaign effectiveness.

Tools Used

- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping ESP campaign reports without being blocked
- **OpenAI**: AI agent for intelligent campaign data analysis and decision making
- **Gmail**: For sending automated follow-up engagement emails

How to Install

1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set Up OpenAI: Configure your OpenAI API credentials
4. Configure Gmail: Connect your Gmail account for sending follow-up emails
5. Customize: Set ESP report URLs and define engagement thresholds for triggering follow-ups

Use Cases

- **Email Marketing**: Automatically optimize campaign performance with smart follow-ups
- **Marketing Automation**: Trigger re-engagement campaigns based on behavior data
- **Performance Tracking**: Monitor email metrics without manual ESP login
- **Customer Retention**: Re-engage subscribers who showed interest but didn't convert

Connect with Me

- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #emailmarketing #campaigntracking #brightdata #webscraping #emailautomation #n8nworkflow #workflow #nocode #emailcampaigns #marketingautomation #emailperformance #campaignanalysis #emailmetrics #reengagement #marketingdata #emailoptimization #campaignmonitoring #emailanalytics #digitalmarketing #performancetracking #emailstrategy #conversionoptimization #marketinganalytics #emailroi #campaigninsights #emailengagement #marketingefficiency #automatedemail
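As an illustration of the engagement threshold that triggers follow-ups, here is a minimal Code-node sketch. The `opened`/`clicked` flags are assumed names for whatever the AI extracts from the scraped ESP report, not the template's actual fields.

```javascript
// n8n Code node - a sketch of the "opened but didn't click" segmentation.
const followUps = [];

for (const item of $input.all()) {
  const { email, opened, clicked } = item.json;

  // Target the classic re-engagement segment: showed interest, didn't act
  if (opened && !clicked) {
    followUps.push({
      json: {
        email,
        segment: 'opened_no_click',
        action: 'send_follow_up',
      },
    });
  }
}

return followUps; // feeds the Gmail node that sends the follow-up
```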
by Matheus Weckwerth
This workflow automates daily LinkedIn posts using Notion. It starts by fetching the day's post from a Notion database, processes and formats the content, including images, then publishes it on LinkedIn. Finally, it updates the post status in the Notion database. Set up Notion and LinkedIn credentials as required.
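A minimal sketch of how the "fetch the day's post" step can be filtered in a Code node follows; the `date` property name is an assumption about the Notion database, not something this template specifies.

```javascript
// n8n Code node - select today's post from the Notion query results.
// Assumes each page's "Date" property has been flattened to item.json.date
// (ISO string); that property name is an assumption.
const today = new Date().toISOString().slice(0, 10); // YYYY-MM-DD

const todaysPosts = $input.all().filter(
  (item) => (item.json.date ?? '').slice(0, 10) === today
);

return todaysPosts; // downstream nodes format and publish this post
```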
by n8n Team
This n8n workflow automates the analysis of email messages received in a Microsoft Outlook inbox to identify indicators of compromise (IOCs), specifically suspicious URLs. It can be triggered manually or scheduled to run daily at midnight.

The workflow begins by retrieving up to 100 read email messages from the Outlook inbox. Note that this appears to be a configuration issue: the node should retrieve unread messages, not read ones. It then marks these messages as read to avoid processing them again in the future. The messages are split into individual items using the Split In Batches node for sequential processing.

For each email, the workflow analyzes its content to find URLs, which are treated as potential IOCs. If URLs are found, the workflow checks them for potential threats using two services, URLScan.io and VirusTotal, in parallel. In the first path, URLScan.io scans each URL; if there are no errors, the results from URLScan.io and VirusTotal are merged. If there are errors, the workflow waits 1 minute before attempting to retrieve the URLScan results again, and the loop then continues with the next email. In the second path, VirusTotal scans the URLs and the results are retrieved.

Finally, the workflow checks that the data field is not empty, filtering out items where no data was found. It then sends a summarized Slack message reporting details about the analyzed email: the subject, sender, date, URLScan report URL, and the VirusTotal verdict for URLs that were reported as malicious.

Potential issues during setup include configuring the Outlook node to retrieve unread messages, resolving a configuration issue in the VirusTotal node, and handling authentication and API keys for both the URLScan.io and VirusTotal nodes. Proper error handling and testing with various email content types and URLs are also essential to ensure the workflow accurately identifies IOCs and reports them to the Slack channel.
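The URL-extraction step is not spelled out above; as a rough illustration of what it can look like in a Code node (assuming the Outlook node's Graph-style fields such as `body.content` and `from.emailAddress.address`), consider:

```javascript
// n8n Code node - extract candidate IOC URLs from each email body.
// The regex is deliberately simple; a production workflow would want
// stricter matching and URL defanging.
const urlPattern = /https?:\/\/[^\s"'<>)]+/gi;
const results = [];

for (const item of $input.all()) {
  const body = item.json.body?.content ?? '';
  const urls = [...new Set(body.match(urlPattern) ?? [])]; // de-duplicate

  if (urls.length > 0) {
    results.push({
      json: {
        subject: item.json.subject,
        sender: item.json.from?.emailAddress?.address,
        urls,
      },
    });
  }
}

return results; // each item then fans out to URLScan.io and VirusTotal
```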
by Deborah
This is a workflow that tries to answer user queries using the standard GPT-4 model. If it can't answer, it sends a message to Slack to ask for human help. It prompts the user to supply an email address. This workflow is used in Advanced AI examples | Ask a human in the documentation.

To use this workflow:

1. Load it into your n8n instance.
2. Add your credentials as prompted by the notes.
3. Configure the Slack node to use your Slack details, or swap out Slack for a different service.
by Łukasz
What is This?

This automation simulates the Scrum Master role in daily meetings. Essentially, it is an AI Scrum Master that draws on different sources of data: an intelligent support system for Scrum Masters that leverages data from Asana, Slack, and direct developer responses for comprehensive sprint status analysis and identification of areas requiring intervention. As such, it is useful for Scrum Masters (of course), but also for the Scrum Team, the Product Owner, and possibly the Business Owner.

Who is it For?

This automation is designed for Agile teams to support the Scrum Master role by collecting and analyzing data from various sources to identify potential impediments and support the team in sprint delivery.

How Does It Work?

The workflow has four main data entry points, launched either on-click or on workdays:

1. Collecting project section information from Asana. The automation retrieves the project structure, available sections, and their organization, allowing the AI to understand the team's work context.
2. Getting recently modified tasks in the Asana project. The system tracks changes in tasks, their status, assignments, and updates to detect potential delays or issues.
3. Obtaining communication in the team's Slack channel. The flow collects data about recent conversations, discussion threads, and team communication to identify warning signals or areas requiring attention.
4. Directly collecting responses from developers about the current sprint - their progress, impediments, concerns, and support needs.

All collected data is passed to an AI model that analyzes it within the Scrum methodology context and identifies:

- Potential impediments in sprint delivery
- Areas requiring Scrum Master intervention
- Recommendations for team support
- Warning signals regarding Sprint Goal achievement

Output is pushed to a Slack channel, so it can potentially be consumed by the next iteration of the same flow via the Slack channel history.

Requirements

- Asana OAuth credentials
- OpenAI (or an alternative AI) for processing the data
- A Slack app with the proper permissions: channels:history, chat:write, groups:history, im:history, mpim:history, users.profile:write, users:write

Configuration

1. Set up the "Asana Project and Slack Channel" node. Provide the Asana project ID and the Slack channel ID (optional).
2. Set up the "Get Scrum Master Answers" node. It defines the daily questions/answers that are sent to the channel.

Alternative Use

You can remove the whole "Ask Users Daily ScrumMaster Questions" part if you don't want to run it like daily Scrum standups. In that case, the flow becomes a static analyzer of project status based on Slack and Asana.

Extensions and Customizations

There are many ways to extend this automation depending on team needs. For example, you can add integrations with additional project management tools, implement different notification schemes based on detected issue criticality, or adjust the data collection frequency to match the team's work rhythm.

Disclaimers and Notes

The automation makes one important assumption: the project runs on a single Slack channel and a single Asana board. Of course this can be extended, but that is beyond the currently designed scope. Adding a new source for the AI to analyze should be fairly easy - just add another branch of data and push it into the AI prompt (see the sketch after these notes). This automation represents a proof of concept and should not replace an actual Scrum Master.
The Scrum Master role extends far beyond data collection and analysis - it requires a deep understanding of team dynamics, business context, and interpersonal skills. As Scrum.org emphasizes, the Scrum Master doesn't need to be present during the Daily Scrum; their role is to ensure the meeting happens, but the developers are responsible for conducting it. Mindlessly executing daily questions without proper context analysis can lead to situations where the Scrum Master becomes a team manager instead of a facilitator of self-organization.

A real Scrum Master analyzes much more data than this automation collects - they observe team dynamics, understand business context, identify deeper root causes of problems, and support the team in developing self-organization skills. AI can be a valuable support tool, but it cannot replace the human intuition, empathy, and experience essential to this role. The automation should be treated as a tool supporting the team's work, providing additional insights and helping identify areas requiring attention, but always under the supervision and interpretation of an experienced Scrum practitioner.
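To illustrate the "add another branch of data" note above, here is a minimal Code-node sketch of merging several branches into one AI prompt. The `source`/`payload` fields are illustrative names, not the template's actual schema.

```javascript
// n8n Code node - merge labeled data branches into a single prompt string.
// Assumes upstream branches were merged so each item carries a `source`
// label ('asana_sections', 'asana_tasks', 'slack', 'developer_answers')
// and a `payload`; those names are assumptions.
const sections = {};

for (const item of $input.all()) {
  const { source, payload } = item.json;
  sections[source] ??= [];
  sections[source].push(payload);
}

// One block per data source, so the model can attribute its findings
const prompt = Object.entries(sections)
  .map(([source, entries]) =>
    `## ${source}\n${entries.map((e) => JSON.stringify(e)).join('\n')}`
  )
  .join('\n\n');

return [{ json: { prompt } }];
```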
by Md. Nazmul Islam
AI-Powered MCQ Quiz Generator from YouTube Videos

Transform any YouTube video into an interactive MCQ quiz automatically! This workflow uses Google Gemini AI to analyze video content and generate comprehensive multiple-choice questions with automatic grading - perfect for educators, trainers, and content creators.

Who is this For

This workflow is perfect for:

- **Educators** creating quizzes from educational YouTube content
- **Corporate Trainers** developing assessments from training videos
- **Content Creators** engaging their audience with interactive quizzes
- **Students** testing their knowledge on video lectures
- **Online Course Creators** building assessments from video content

Features

- **AI Video Analysis**: Google Gemini 2.5 Flash analyzes entire YouTube videos (up to 50 minutes)
- **Dynamic Question Generation**: Creates up to 90 MCQ questions with 3 options each
- **Automatic Form Creation**: Generates Google Forms with quiz functionality
- **Smart Grading**: Built-in correct answer identification and scoring
- **Error Handling**: Robust error management with user feedback

How It Works

1. User Input via n8n Web Form: Form Name (Quiz Title), Email Address, YouTube Video URL, Number of Questions (1-90)
2. AI Processing Pipeline: Google Gemini analyzes the YouTube video content, extracts key concepts, and generates relevant questions; a structured output parser formats the questions into JSON
3. Google Forms Integration: Automatically creates a new Google Form, adds all generated questions with multiple-choice options, and configures quiz settings with correct answers and scoring (see the request-body sketch at the end of this entry)
4. Completion & Access: The user receives a direct link to the generated quiz, ready for immediate use or sharing

Video Demo: See this YouTube video to explore how it works.

Set Up Steps

1. Import the Workflow: Create a new workflow in n8n, then import the JSON file by clicking the three dots (upper right corner) > "Import from file..."
2. Configure Google Gemini API: Get your API key from Google AI Studio. In the "HTTP Request to Gemini" node, replace "API_KEY" in the URL with your API key. Create a "Google Gemini (PaLM) API" credential in n8n, add your API key to it, and connect the credential to the "Google Gemini Chat Model" node.
3. Set Up Google Forms Integration: Enable the Google Forms API in Google Cloud Console, create a "Google OAuth2 API" credential in n8n, authorize it with Forms permissions, and connect it to both HTTP Request nodes (the "Create a Google Form" node and the "Create MCQ Quizzes" node).
4. Configure Form Trigger: The workflow includes a built-in form trigger; no additional setup is needed, and the form URL is generated automatically. Customize form fields if needed in the "Input YouTube URL" node.
5. Test the Workflow: Activate the workflow, submit the form to generate a test quiz, and verify the Google Form is created successfully.

Pre-requisites

- **Necessary Accounts:** Google Account (for Forms API access), Google AI Studio account (for Gemini API access), n8n instance (cloud or self-hosted)
- **API Access:** Google Forms API enabled, Google Drive API enabled, Google Generative AI API access, valid API keys and OAuth credentials
- **n8n Requirements:** n8n version 1.95.2 or higher, LangChain nodes package installed, internet access for API calls

Customization Guidance

Question Generation Prompts:

- Modify the prompt in the "Set Prompt and model" node for different question styles
- Adjust difficulty levels or focus areas
- Change the question format (True/False, fill-in-the-blanks, etc.)
Form Customization:

- Update form title and description templates
- Add additional input fields (difficulty level, subject area)
- Customize success/error messages

Advanced Features You Can Add:

- Email Notifications: Send quiz links via email
- Analytics Integration: Track quiz performance and completion rates
- Multi-language Support: Generate quizzes in different languages
- Question Bank Storage: Save generated questions to a database
- Batch Processing: Generate multiple quizzes from a YouTube playlist

Error Handling Enhancements:

- Add retry logic for API failures
- Implement fallback question generation
- Create detailed error logging

Technical Specifications

- **Video Length**: Up to 50 minutes supported
- **Question Limit**: 1-90 questions per quiz
- **Processing Time**: 2-10 minutes depending on video length
- **Supported Formats**: YouTube videos (public and unlisted)
- **Output Format**: Google Forms with automatic grading

Limitations & Considerations

- The YouTube video must be publicly accessible or unlisted
- Processing time increases with video length and question count
- API rate limits may apply for high-volume usage
- Some complex visual content may not be fully analyzed

Ready to Transform Videos into Quizzes?

This workflow streamlines the entire process from video analysis to quiz deployment - perfect for educators and trainers looking to create engaging assessments from video content quickly and efficiently.
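For reference, the "Create MCQ Quizzes" step posts to the Google Forms `batchUpdate` endpoint. Below is a hedged Code-node sketch of how parsed questions might be mapped to that request body; the `{ question, options, correctIndex }` item shape is an assumption about the structured parser's output, not the template's exact schema.

```javascript
// n8n Code node - build a forms.batchUpdate body from parsed questions.
const requests = $input.all().map((item, index) => {
  const { question, options, correctIndex } = item.json;

  return {
    createItem: {
      item: {
        title: question,
        questionItem: {
          question: {
            required: true,
            // Quiz grading: 1 point, correct answer marked by its text value
            grading: {
              pointValue: 1,
              correctAnswers: {
                answers: [{ value: options[correctIndex] }],
              },
            },
            choiceQuestion: {
              type: 'RADIO',
              options: options.map((value) => ({ value })),
            },
          },
        },
      },
      location: { index },
    },
  };
});

// The HTTP Request node can then POST { requests } to
// https://forms.googleapis.com/v1/forms/{formId}:batchUpdate
return [{ json: { requests } }];
```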
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically monitors local event platforms (Eventbrite, Meetup, Facebook Events) and aggregates upcoming events that match your criteria. Never miss a networking or sponsorship opportunity again.

Overview

A scheduled trigger scrapes multiple event sites via Bright Data, filtering by location, date range, and keywords. OpenAI classifies each event (conference, meetup, workshop) and extracts key details such as venue, organizers, and ticket price. Updates are posted to Slack and archived in Airtable for quick lookup.

Tools Used

- **n8n** - Core automation engine
- **Bright Data** - Reliable multi-site scraping
- **OpenAI** - NLP-based event categorization
- **Slack** - Delivers daily event digests
- **Airtable** - Stores enriched event records

How to Install

1. Import the Workflow: Add the .json file to n8n.
2. Configure Bright Data: Provide your account credentials.
3. Set Up OpenAI: Insert your API key.
4. Connect Slack & Airtable: Authorize both services.
5. Customize Filters: Edit the initial Set node to adjust city, radius, and keywords.

Use Cases

- **Community Managers**: Curate a calendar of relevant events.
- **Sales Teams**: Identify trade shows and meetups for prospecting.
- **Event Planners**: Track competing events when choosing dates.
- **Marketers**: Spot speaking or sponsorship opportunities.

Connect with Me

- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #eventmonitoring #brightdata #openscraping #openai #slackalerts #n8nworkflow #nocode #meetup #eventbrite
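A minimal sketch of the keyword/location/date filtering that the Set node's values feed into follows; field names like `city`, `startDate`, and `title` are assumptions about the parsed event records, not the template's actual output.

```javascript
// n8n Code node - filter scraped events by city, date horizon, and keywords.
const KEYWORDS = ['ai', 'automation', 'n8n']; // mirror the Set node's filters
const CITY = 'Berlin';                        // illustrative value
const horizon = new Date();
horizon.setDate(horizon.getDate() + 30);      // only events in the next 30 days

const matches = $input.all().filter(({ json }) => {
  const inCity = json.city === CITY;
  const upcoming = new Date(json.startDate) <= horizon;
  const relevant = KEYWORDS.some((kw) =>
    (json.title ?? '').toLowerCase().includes(kw)
  );
  return inCity && upcoming && relevant;
});

return matches; // these go on to OpenAI classification
```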
by dirogar
Telegram Tasker Bot is an n8n workflow that receives voice messages in Telegram, automatically converts them to text, extracts the key task fields from that text, and creates a card on the appropriate Trello board. The user simply speaks the task - the bot formats it and replies with a link to the finished card.

To use it, you will need a Telegram bot, which you can create via the BotFather bot. You will also need access to the ChatGPT API - it is used only to transcribe the audio into text; you can use any other service of your choice. Finally, you need a Trello account with API access.

Note! The Trello board ID can be taken from the board's URL. The ID of a list (column) on the Trello board can be obtained via the browser's developer tools (at least that's how I got it).
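As a rough sketch of the final mapping step, assuming the AI extraction returns `{ title, details, dueDate }` (illustrative names, not this workflow's actual fields), a Code node could shape the data for Trello's Create Card operation like this:

```javascript
// n8n Code node - map extracted task fields onto Trello card fields.
const results = $input.all().map(({ json }) => {
  const { title, details, dueDate } = json; // assumed AI-extraction output

  return {
    json: {
      name: title ?? 'Untitled task',
      desc: details ?? '',
      due: dueDate ?? null, // Trello accepts an ISO 8601 date here
    },
  };
});

return results;
```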
by Anir Agram
🛡️📥 Telegram Invoice Agent → 🔎 OCR → 🤖 AI Parsing → 📄 Google Sheets + 🗂️ Drive

What this workflow does

- 🤖 Captures invoices from Telegram and auto-downloads PDFs/images.
- 🔎 Runs OCR, then uses AI to structure clean invoice fields.
- 📄 Appends parsed data to a Google Sheets "Invoice Database."
- 🗂️ Uploads the original file to Google Drive with a neat name.
- 💬 Sends a friendly Telegram summary with totals, due date, notes, and link.

Why it's useful

- ⚡ Faster bookkeeping with zero manual copy-paste.
- 🧱 Consistent schema for reliable reporting and pivots.
- 👥 Team-friendly drop-and-log via Telegram.
- 🧩 Easy to extend with approvals, ERP/CRM sync, or vendor routing.

How it works

1. 📲 Telegram Trigger → file received.
2. 🌐 HTTP OCR (OCR.space) → text extracted.
3. 🤖 AI Agent → maps to strict JSON schema (see the schema sketch below).
4. 📄 Google Sheets → appends structured row.
5. 🗂️ Google Drive → saves original invoice.
6. 💬 Telegram → concise confirmation and links.

What you'll need

- 🤖 Telegram Bot token.
- 🔑 OCR API key (OCR.space: free tier; upgrade for volume/accuracy).
- 🔐 Google OAuth for Sheets + Drive.
- 🧠 LLM account (e.g., Gemini/OpenAI-compatible).

Setup steps

1. 🔗 Connect credentials: Telegram, Google, OCR, AI.
2. 📄 Prepare Sheet columns: Invoice Number, Date, Total Amount ($), Billing Address, Due Date, Notes.
3. 🧭 Update the sheet ID and Drive folder ID.
4. 🧪 Test: send a sample invoice and validate OCR, AI output, row append, and Drive link.

Customization ideas

- 🎯 Higher-accuracy OCR: swap to Google Vision.
- 📊 Line items: extract into a second tab for analytics.
- ✅ Approvals: add Telegram keyboard confirmation before write.
- 🧯 Robustness: IF/Retry on empty OCR; prompt the user to retake the photo.

Who it's for

- 🧑💻 Freelancers/agencies needing fast invoice intake via Telegram.
- 🧾 Small finance teams wanting a searchable ledger with links to originals.
- 🏗️ Builders extending to ERPs/CRMs and custom accounting flows.

Want help customizing?

📧 anirpoke@gmail.com
🔗 LinkedIn
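As a sketch of the "strict JSON schema" the AI Agent maps to, mirroring the sheet columns above, a Code node could hand the agent something like the following; the exact key names are assumptions that should be aligned with your own "Invoice Database" headers.

```javascript
// n8n Code node - define the schema the AI Agent must fill for each invoice.
const schema = {
  type: 'object',
  properties: {
    invoiceNumber: { type: 'string' },
    date: { type: 'string', description: 'ISO 8601 invoice date' },
    totalAmount: { type: 'number' },
    billingAddress: { type: 'string' },
    dueDate: { type: 'string' },
    notes: { type: 'string' },
  },
  required: ['invoiceNumber', 'date', 'totalAmount'],
};

// Handing this schema to the agent as part of its instructions keeps the
// Sheets rows consistent even when invoice layouts vary.
return [{ json: { schema } }];
```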
by Oneclick AI Squad
This automated n8n workflow scrapes job listings from Upwork using Apify, processes and cleans the data, and generates daily email reports with job summaries. The system uses Google Sheets for data storage and keyword management, providing a comprehensive solution for tracking relevant job opportunities and market trends.

What is Apify?

Apify is a web scraping and automation platform that provides reliable APIs for extracting data from websites like Upwork. It handles the complexities of web scraping, including rate limiting, proxy management, and data extraction, while maintaining compliance with website terms of service.

Good to Know

- Apify API calls may incur costs based on usage; check Apify pricing for details
- Google Sheets access must be properly authorized to avoid data sync issues
- The workflow includes data cleaning and deduplication to ensure high-quality results
- Email reports provide structured summaries for easy review and decision-making
- Keyword management through Google Sheets allows for flexible job targeting

How It Works

The workflow is organized into three main phases.

Phase 1: Job Scraping & Initial Processing

This phase handles the core data collection and initial storage:

1. Trigger Manual Run - Manually starts the workflow for on-demand job scraping
2. Fetch Keywords from Google Sheet - Reads the list of job-related keywords from the All Keywords sheet
3. Loop Through Keywords - Iterates over each keyword to trigger Apify scraping
4. Trigger Apify Scraper - Sends an HTTP request to start the Apify actor for job scraping
5. Wait for Apify Completion - Waits for the Apify actor to finish execution
6. Delay Before Dataset Read - Waits a few seconds to ensure the dataset is ready for processing
7. Fetch Scraped Job Dataset - Fetches the latest dataset from Apify
8. Process Raw Job Data - Filters jobs posted in the last 24 hours and formats the data
9. Save Jobs to Daily Sheet - Appends new job data to the daily Google Sheet
10. Update Keyword Job Count - Updates the job count in the All Keywords summary sheet

Phase 2: Data Cleaning & Deduplication

This phase ensures data quality and removes duplicates (see the deduplication sketch at the end of this entry):

1. Load Today's Daily Jobs - Loads all jobs added in today's sheet for processing
2. Remove Duplicates by Title/Desc - Removes duplicates based on title and description matching
3. Save Clean Job Data - Saves the cleaned, unique entries back to the sheet
4. Clear Old Daily Sheet Data - Deletes old or duplicate entries from the sheet
5. Reload Clean Job Data - Loads the clean data again after deletion for final processing

Phase 3: Daily Summary & Email Report

This phase generates summaries and delivers the final report:

1. Generate Keyword Summary Stats - Counts job totals per keyword for analysis
2. Update Summary Sheet - Updates the summary sheet with keyword statistics
3. Fetch Final Summary Data - Reads the summary sheet for reporting purposes
4. Build Email Body - Formats the email with statistics and the sheet link
5. Send Daily Report Email - Sends the structured daily summary email to recipients

Data Sources

The workflow uses Google Sheets for data management.

AI Keywords Sheet - Contains keyword management data with columns:
- Keyword (text) - Job search terms
- Job Count (number) - Number of jobs found for each keyword
- Status (text) - Active/Inactive status
- Last Updated (timestamp) - When the keyword was last processed

Daily Jobs Sheet - Contains scraped job data with columns:
- Job Title (text) - Title of the job posting
- Description (text) - Job description content
- Budget (text) - Job budget or hourly rate
- Client Rating (number) - Client's rating on Upwork
- Posted Date (timestamp) - When the job was posted
- Job URL (text) - Direct link to the job posting
- Keyword (text) - Which keyword found this job
- Scraped At (timestamp) - When the data was collected

Summary Sheet - Contains daily statistics with columns:
- Date (date) - Report date
- Total Jobs (number) - Total jobs found
- Keywords Processed (number) - Number of keywords searched
- Top Keyword (text) - Most productive keyword
- Average Budget (currency) - Average job budget
- Report Generated (timestamp) - When the summary was created

How to Use

1. Import the workflow into n8n
2. Configure Apify API credentials and Google Sheets API access
3. Set up email credentials for daily report delivery
4. Create three Google Sheets with the specified column structures
5. Add relevant job keywords to the AI Keywords sheet
6. Test with sample keywords and adjust as needed

Requirements

- Apify API credentials and actor access
- Google Sheets API access
- Email service credentials (Gmail, SMTP, etc.)
- Upwork job search keywords for targeting

Customizing This Workflow

Modify the Process Raw Job Data node to filter jobs by additional criteria such as budget range, client rating, or job type. Adjust the email report format to include more detailed statistics or visual aids, such as charts. Customize the data cleaning logic to better handle duplicate detection based on your specific requirements, or add data sources beyond Upwork for a comprehensive job market analysis.
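As a rough illustration of the "Remove Duplicates by Title/Desc" step referenced in Phase 2, a Code-node sketch might look like this; the JSON key names mirror the Daily Jobs columns above but are assumptions about the sheet node's output.

```javascript
// n8n Code node - drop repeat listings keyed on normalized title + description.
const seen = new Set();
const unique = [];

for (const item of $input.all()) {
  const title = (item.json['Job Title'] ?? '').trim().toLowerCase();
  const desc = (item.json['Description'] ?? '').trim().toLowerCase();

  // A short description prefix is usually enough to catch repeats
  const key = `${title}::${desc.slice(0, 120)}`;

  if (!seen.has(key)) {
    seen.add(key);
    unique.push(item);
  }
}

return unique; // clean rows are written back to the daily sheet
```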