by ueharayuuki
Automated Multi-lingual News Curator & Archiver

Overview
This workflow automates news monitoring by fetching RSS feeds, rewriting content using AI, translating it (EN/ZH/KO), and archiving it.

Who is this for?
Content Curators, Localization Teams, and Travel Bloggers.

How it works
1. Fetch & Filter: Pulls NHK RSS and filters for keywords (e.g., "Tokyo").
2. AI Processing: Google Gemini rewrites articles, extracts locations, and translates text.
3. Archive & Notify: Saves structured data to Google Sheets and alerts Slack.

Setup Requirements
- Credentials: Google Gemini, Google Sheets, Slack.
- Google Sheet: Create headers: title, summary, location, en, zh, ko, url.
- Slack: Configure Channel IDs.

Customization
- **RSS Read:** Change feed URL.
- **If Node:** Update filter keywords.
- **AI Agent:** Adjust system prompts for tone.

1. Fetch & Filter
Runs on a schedule to fetch the latest RSS items. Filters articles based on specific keywords (e.g., "Tokyo" or "Season") before processing.

2. AI Analysis & Parsing
Uses Google Gemini to rewrite the news, extract specific locations, and translate content. The Code node cleans the JSON output for the database (see the sketch at the end of this section).

3. Archive & Notify
Appends the structured data to Google Sheets and sends a formatted notification to Slack (or alerts if an article was skipped).

Output Example (JSON)
The translation agent outputs data in this format:

```json
{
  "en": "Tokyo Tower is...",
  "zh": "东京塔是...",
  "ko": "도쿄 타워는..."
}
```
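For reference, here is a minimal sketch of what that cleanup step can look like as an n8n Code node. The `output`/`text` field names are assumptions to match against your own agent node's output:

```javascript
// Code node (Run Once for All Items): strip the Markdown fences Gemini
// sometimes wraps around JSON, then parse into the sheet's column structure.
const results = [];
for (const item of $input.all()) {
  // Assumed field name; AI Agent nodes often emit their text as `output`.
  const raw = String(item.json.output ?? item.json.text ?? '');
  const cleaned = raw
    .replace(/^\s*`{3}(?:json)?\s*/i, '') // leading ```json fence, if any
    .replace(/`{3}\s*$/, '')              // trailing fence
    .trim();
  try {
    // Expected shape: { title, summary, location, en, zh, ko, url }
    results.push({ json: JSON.parse(cleaned) });
  } catch (e) {
    // Surface bad output so the Slack branch can report a skipped article.
    results.push({ json: { error: 'Model returned invalid JSON', raw: cleaned } });
  }
}
return results;
```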
by Aslamul Fikri Alfirdausi
How it works
This workflow is a professional-grade market intelligence tool designed to bridge the gap between search interest and social media engagement. It automates the end-to-end process of trend discovery and content strategy.

1. Detection: Polls Google Trends RSS daily for rising regional search queries.
2. Parallel Extraction: Concurrently triggers industrial-grade Apify actors to scrape TikTok, Instagram, and X (Twitter) without the risk of account bans.
3. Data Aggregation: Uses custom JavaScript logic to clean and merge disparate data points, optimizing them for LLM processing (see the sketch below).
4. AI Analysis: Google Gemini Flash analyzes the data to identify core topics, sentiment, and trend strength.
5. Granular Delivery: Delivers individual, structured reports for each identified trend directly to Discord via webhooks.

Set up steps
1. API Credentials: Prepare your Apify API Token and Google Gemini API Key.
2. Discord Setup: Create a webhook in your Discord server and paste the URL into the Discord node.
3. Regional Configuration: Set your target country code (e.g., JP, ID, US) in the "Edit Fields" node at the start of the workflow.
4. Node Settings: Ensure all scraper nodes are set to "Continue on Fail" to maintain workflow resilience.

Requirements
- Apify account
- Google Gemini API key
- Discord server for report delivery
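The aggregation logic itself is not spelled out in this description, but a minimal sketch of such a Code node might look like the following; the `trend`, `platform`, and `posts` field names are illustrative assumptions, not the template's actual schema:

```javascript
// Code node: merge per-platform scrape results into one record per trend,
// keeping only the fields the LLM needs in order to cut token usage.
const byTrend = {};
for (const item of $input.all()) {
  const { trend, platform, posts = [] } = item.json; // assumed field names
  if (!byTrend[trend]) byTrend[trend] = { trend, sources: {} };
  byTrend[trend].sources[platform] = posts.map(p => ({
    text: String(p.text ?? '').slice(0, 280), // truncate long captions
    likes: p.likes ?? 0,
  }));
}
// One n8n item per trend, ready for the Gemini analysis step.
return Object.values(byTrend).map(t => ({ json: t }));
```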
by Cheng Siong Chin
How It Works
This workflow provides automated Chinese text translation with high-quality audio synthesis for language learning platforms, content creators, and international communication teams. It addresses the challenge of converting Chinese text into accurate multilingual translations with natural-sounding voiceovers.

The system receives Chinese text via webhook, validates input formatting, and processes it through an AI translation agent that generates multiple language versions. Each translation is converted to speech using ElevenLabs' neural voice models, then formatted into professional audio responses. A quality review agent evaluates translation accuracy, cultural appropriateness, and audio clarity against predefined criteria. High-scoring outputs are returned via webhook for immediate use, while low-quality results trigger review processes, ensuring consistent delivery of publication-ready multilingual audio content.

Setup Steps
1. Obtain an OpenAI API key and configure it in the "Translation Agent".
2. Set up an ElevenLabs account and generate an API key.
3. Configure the webhook URL and update it in the source applications that trigger the workflow.
4. Customize target languages and voice settings in the translation and ElevenLabs nodes.
5. Adjust quality thresholds in "Check Quality Score" (see the sketch below).
6. Update the output webhook endpoint in the "Return Audio Files" node.

Prerequisites
Active accounts: OpenAI API access, ElevenLabs subscription.

Use Cases
Chinese language learning apps, international marketing content localization.

Customization
Add additional target languages; modify voice characteristics and speaking rates.

Benefits
Automates roughly 95% of the translation workflow, delivering publication-ready audio in minutes.
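As a rough illustration of that quality gate, a Code node feeding the "Check Quality Score" check could normalize the review agent's score like this; the `qualityScore` field name and the 80-point threshold are assumptions to adjust:

```javascript
// Code node: normalize the review agent's score and flag each item for
// release vs. manual review. The downstream IF node branches on `passed`.
const THRESHOLD = 80; // assumed cutoff; tune to your quality standards
return $input.all().map(item => {
  const score = Number(item.json.qualityScore ?? 0); // assumed field name
  return {
    json: {
      ...item.json,
      qualityScore: score,
      passed: score >= THRESHOLD,
    },
  };
});
```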
by Ruben AI
AI-Powered Flyer & Video Generator with Airtable, Klie.ai, and n8n

Who is this for?
This template is perfect for e-commerce entrepreneurs, marketers, agencies, and creative teams who want to turn simple product photos and short descriptions into professional flyers or product videos, automatically and at scale. If you want to generate polished marketing assets without relying on designers or editors, this is for you.

What problem is this workflow solving?
Creating product ads, flyers, or videos usually involves multiple tools and manual steps:
- Collecting and cleaning product photos
- Writing ad copy or descriptions
- Designing flyers or visuals for campaigns
- Producing animations or video ads
- Managing multiple revisions and approvals

This workflow automates the entire pipeline. Upload a raw product image into Airtable, type a quick description, and receive back a flyer or video animation tailored to your brand and context, ready to use for ads, websites, or campaigns.

What this workflow does
- Uses Airtable as the central interface where you upload raw product photos and enter descriptions
- Processes the content automatically via n8n
- Generates flyers and visuals using OpenAI Image 1
- Produces custom product videos with Google's VEO3
- Runs through Klie.ai to unify the image + video generation process
- Sends the final creative assets back into Airtable for review and download

Setup
1. Download the n8n files and connect your Airtable token to n8n.
2. Duplicate the Airtable base and make sure you're on an Airtable Team plan.
3. Add your API key on the Airtable interface under API setup.
4. Create your agency inside the interface.
5. Start generating concept images and videos instantly.

How to customize this workflow to your needs
- Edit the prompts to match your brand voice and ad style
- Extend Airtable fields to include more creative parameters (colors, layout, target audience)
- Add approval steps via email, Slack, or Airtable statuses before finalizing
- Integrate with publishing platforms (social media, e-commerce CMS) for auto-posting
- Track generated assets inside Airtable for team collaboration

Demo Video
by Avkash Kakdiya
How it works
This workflow automatically collects a list of companies from Google Sheets, searches for their competitors using SerpAPI, extracts up to 10 relevant competitor names with source links, and logs the results into both Google Sheets and Airtable. It runs on a set schedule, cleans and formats the company list, processes each entry individually, checks if competitors exist, and separates results into successful and "no competitors found" lists for organized tracking.

Step-by-step

1. Trigger & Input
- Auto Run (Scheduled): Executes every day at the set time (e.g., 9 AM).
- Read Companies Sheet: Pulls the list of companies from a Google Sheet (List column).
- Clean & Format Company List: Removes empty rows, trims names, and attaches row numbers for tracking.
- Loop Over Companies: Processes each company one at a time in batches.

2. Competitor Search
- Search Company Competitors (SerpAPI): Sends a query like "{Company} competitors" to SerpAPI, retrieving structured search results in JSON format.

3. Data Extraction & Validation
- Extract Competitor Data from Search: Parses SerpAPI results to identify the company name, extract up to 10 competitor names, capture the top source URL, and count total search results (see the sketch below).
- Has Competitors?: Checks if any competitors were found. Yes: proceeds to logging. No: logs the company in the "no results" list.

4. Logging Results
- Log to Result Sheet: Appends or updates competitor data in the results Google Sheet.
- Log Companies Without Results: Records companies with zero competitors found in a separate section of the results sheet.
- Sync to Airtable: Pushes all results (successful or not) into Airtable for unified storage and analysis.

Benefits
- Automated Competitor Research: Eliminates the need for manual Google searching.
- Daily Insights: Runs automatically at your chosen schedule.
- Clean Data Output: Stores structured competitor lists with sources for easy review.
- Multi-Destination Sync: Saves to both Google Sheets and Airtable for flexibility.
- Scalable & Hands-Free: Handles hundreds of companies without extra effort.
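The extraction logic itself isn't shown in this description; a rough, assumption-laden sketch of a Code node doing that parsing might look like this (SerpAPI payloads vary by query type, so verify field paths such as `organic_results` against a real response):

```javascript
// Code node: extract up to 10 competitor names from a SerpAPI response.
const res = $input.first().json;
const company = $('Loop Over Companies').first().json.List; // "List" column
const organic = res.organic_results ?? [];

// Crude heuristic: split titles/snippets on common separators, then dedupe
// and drop the searched company itself. Real-world parsing needs tuning.
const candidates = organic
  .flatMap(r => `${r.title ?? ''} ${r.snippet ?? ''}`.split(/[,|]| vs\.? | and /i))
  .map(s => s.trim())
  .filter(s => s && s.length < 60 && s.toLowerCase() !== company.toLowerCase());

return [{ json: {
  company,
  competitors: [...new Set(candidates)].slice(0, 10),
  topSource: organic[0]?.link ?? '',
  totalResults: res.search_information?.total_results ?? 0,
} }];
```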
by Vinay Gangidi
LOB Underwriting with AI
This template ingests borrower documents from OneDrive, extracts text with OCR, classifies each file (ID, paystub, bank statement, utilities, tax forms, etc.), aggregates everything per borrower, and asks an LLM to produce a clear underwriting summary and decision (plus next steps).

Good to know
- AI and OCR usage consume credits (OpenAI + your OCR provider).
- Folder lookups by name can be ambiguous; use a fixed folderId in production.
- Scanned image quality drives OCR accuracy; bad scans yield weak text.
- This flow handles PII; mask sensitive data in logs and control access.
- Start small: batch size and pagination keep costs/memory sane.

How it works
1. Import & locate docs: A manual trigger kicks off a OneDrive folder search (e.g., "LOBs") and lists the files inside.
2. Per-file loop: Download each file → run OCR → classify the document type using the filename + extracted text.
3. Aggregate: Combine per-file results into a borrower payload (make BorrowerName dynamic).
4. LLM analysis: Feed the payload to an AI Agent (OpenAI model) to extract underwriting-relevant facts and produce a decision + next steps.
5. Output: Return a human-readable summary (and optionally structured JSON for downstream systems).

How to use
- Start with the Manual Trigger to validate end-to-end on a tiny test folder.
- Once stable, swap in a Schedule/Cron or Webhook trigger.
- Review the generated underwriting summary; handle only flagged exceptions (unknown/unreadable docs, low confidence).

Setup steps
1. Connect accounts: Add credentials for OneDrive, OCR, and OpenAI.
2. Configure inputs:
   - In Search a folder, point to your borrower docs (prefer folderId; otherwise tighten the name query).
   - In Get items in a folder, enable pagination if the folder is large.
   - In Split in Batches, set a conservative batch size to control costs.
3. Wire the file path:
   - Download a file must receive the current file's id from the folder listing.
   - Make sure the OCR node receives binary input (PDFs/images).
4. Classification:
   - Update keyword rules to match your region/lenders/utilities/tax forms (see the sketch after this list).
   - Keep a fallback Unknown class and log it for review.
5. Combine: Replace the hard-coded BorrowerName with a Set node field, a form input, or parsing from folder/file naming conventions.
6. AI Agent:
   - Set your OpenAI model/credentials.
   - Ask the model to output JSON first (structured fields) and Markdown second (readable summary).
   - Keep temperature low for consistent, audit-friendly results.
7. Optional outputs: Persist JSON/Markdown to Notion/Docs/DB or write to storage.

Customize if needed
- Doc types: add/remove categories and keywords without touching core logic.
- Error handling: add IF paths for empty folders, failed downloads, empty OCR, or the Unknown class; retry transient API errors.
- Privacy: redact IDs/account numbers in logs; restrict execution visibility.
- Scale: add MIME/size filters, duplicate detection, and multi-borrower folder patterns (parent → subfolders).
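As an illustration of such keyword rules, a Code-node classifier might look like the sketch below; the categories, keywords, and the `fileName`/`ocrText` field names are assumptions to adapt to your region and OCR node:

```javascript
// Code node: keyword-rule classifier for the per-file loop. First matching
// rule wins; anything unmatched falls through to the Unknown class for review.
const RULES = [
  { type: 'ID',            keywords: ['driver license', 'passport', 'date of birth'] },
  { type: 'Paystub',       keywords: ['gross pay', 'net pay', 'pay period'] },
  { type: 'BankStatement', keywords: ['account summary', 'ending balance'] },
  { type: 'TaxForm',       keywords: ['form 1040', 'w-2', 'irs'] },
  { type: 'Utility',       keywords: ['utility', 'electric', 'water bill'] },
];

return $input.all().map(item => {
  // Classify on filename + extracted text, as described above.
  const text = `${item.json.fileName ?? ''} ${item.json.ocrText ?? ''}`.toLowerCase();
  const match = RULES.find(rule => rule.keywords.some(k => text.includes(k)));
  return { json: { ...item.json, docType: match ? match.type : 'Unknown' } };
});
```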
by Paul Abraham
This n8n template demonstrates how to turn a Telegram bot into a personal AI-powered assistant that understands both voice notes and text messages. The assistant can transcribe speech, interpret user intent with AI, and perform smart actions such as managing calendars, sending emails, or creating notes.

Use cases
- Hands-free scheduling with Google Calendar
- Quickly capturing ideas as Notion notes via voice
- Sending Gmail messages directly from Telegram
- A personal productivity assistant available on the go

Good to know
- Voice notes are automatically transcribed into text before being processed.
- This template uses Google Gemini for AI reasoning. The AI agent supports memory, enabling more natural and contextual conversations.

How it works
1. Telegram Trigger: Starts when you send a text or voice note to your Telegram bot.
2. Account Check: Ensures only authorized users can interact with the bot.
3. Audio Handling: If it's a voice message, the workflow retrieves and transcribes the recording.
4. AI Agent: Both transcribed voice and text are sent to the AI Agent powered by Google Gemini + Simple Memory (see the sketch below for normalizing the two input paths).
5. Smart Actions: Based on the query, the AI can read or create events in Google Calendar, create notes in Notion, or send messages in Gmail.
6. Reply in Telegram: The bot sends a response confirming the action or providing the requested information.

How to use
1. Clone this workflow into your n8n instance.
2. Replace the Telegram Trigger credentials with your bot credentials.
3. Connect Google Calendar, Notion, and Gmail accounts where required.
4. Start chatting with your Telegram bot to add events, create notes, or send emails using just your voice or text.

Requirements
- Telegram bot & API key
- Google Gemini account for AI
- Google Calendar, Notion, and Gmail integrations (optional, depending on use case)

Customising this workflow
- Add more integrations (Slack, Trello, Airtable, etc.) for extended productivity.
- Modify the AI prompt in the agent node to fine-tune personality or task focus.
- Swap in another transcription service if preferred.
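One way to keep the agent prompt simple is to collapse the two input paths into a single field before the AI Agent. A minimal Code-node sketch, where the `transcript` field name is an assumption for whatever your transcription branch outputs:

```javascript
// Code node placed just before the AI Agent: whether the message arrived as
// a voice note (already transcribed upstream) or plain text, emit one field.
const data = $input.first().json;
const msg = data.message ?? {};          // Telegram Trigger payload
const userText = data.transcript         // assumed: set by transcription branch
  ?? msg.text                            // plain Telegram text message
  ?? '';

return [{ json: {
  chatId: msg.chat?.id,                  // needed for the Telegram reply node
  userText,
  isVoice: Boolean(data.transcript),
} }];
```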
by SpaGreen Creative
WhatsApp Bulk Number Verification in Google Sheets Using Unofficial Rapiwa API

Who's it for
This workflow is for marketers, small business owners, freelancers, and support teams who want to automate WhatsApp messaging from a Google Sheet without the official WhatsApp Business API. It's suitable when you need a budget-friendly, easy-to-maintain solution that uses your personal or business WhatsApp number via an unofficial API service such as Rapiwa.

How it works / What it does
- The workflow looks for rows in a Google Sheet where the Status column is pending.
- It cleans each phone number (removes non-digits; see the sketch below).
- It verifies the number with the Rapiwa verify endpoint (/api/verify-whatsapp).
- If the number is verified: the workflow can send a message (optional) and updates the sheet: Verification = verified, Status = sent (or leaves Status for the send node to update).
- If the number is not verified: it skips sending and updates the sheet: Verification = unverified, Status = not sent.
- The workflow processes rows in batches and inserts short delays between items to avoid rate limits.
- The whole process runs on a schedule (configurable).

Key features
- Scheduled automatic checks (configurable interval; 5–10 minutes recommended).
- Cleans phone numbers to a proper format before verification.
- Verifies WhatsApp registration using Rapiwa.
- Batch processing with limits to control workload (recommended max per run is configurable).
- Short delay between items to reduce throttling and temporary blocks.
- Automatic sheet updates for auditability (verified/unverified, sent/not sent).

Defaults recommended in this workflow
- Trigger interval: every 5–10 minutes (adjustable).
- Max items per run: configurable (example: 200 per cycle).
- Delay between items: 2–5 seconds (the example uses 3 seconds).

How to set up
1. Duplicate the sample Google Sheet (linked as Sample), fill contact rows, and set Status = pending. Include columns like WhatsApp No, Name, Message, Verification, Status.
2. In n8n, add and authenticate a Google Sheets node pointed at your sheet.
3. Create an HTTP Bearer credential in n8n and paste your Rapiwa API key.
4. Configure the workflow nodes (Trigger → Google Sheets → Limit/SplitInBatches → Code (clean) → HTTP Request (verify) → If → Update Sheet → Wait).
5. Enable the workflow and monitor the first runs with a small test batch.

Requirements
- n8n instance with Google Sheets and HTTP Request nodes enabled.
- Google Sheets OAuth2 credentials configured in n8n.
- Rapiwa account and Bearer token (stored in n8n credentials).
- Google Sheet formatted to match the workflow columns.

Why use Rapiwa
- Cost-effective and developer-friendly REST API for WhatsApp verification and sending.
- Simple integration via HTTP requests and n8n.
- Useful when you prefer not to use the official WhatsApp Business API.
- Note: Rapiwa is an unofficial service; review its terms and risks before production use.

How to customize
- Change the schedule frequency in the Trigger node.
- Adjust maxItems in Limit/SplitInBatches for throughput control.
- Change the Wait node delay for safer sending.
- Modify the HTTP Request body to support media or templates if the provider supports it.
- Add logging or a separate audit sheet to record API responses and errors.

Best practices
- Test with a small batch first.
- Keep the sheet headers exact and consistent; the workflow matches columns by name.
- Store API keys in n8n credentials (do not hardcode them).
- Increase the Wait time or reduce the batch size if you see rate limits or temporary blocks.
- Keep a log sheet of verified/unverified rows for troubleshooting.
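A minimal sketch of that cleaning step as an n8n Code node, using the "WhatsApp No" column named above:

```javascript
// Code node ("clean" step): strip everything except digits so the Rapiwa
// verify endpoint receives a bare international number.
return $input.all().map(item => {
  const raw = String(item.json['WhatsApp No'] ?? '');
  const cleaned = raw.replace(/\D/g, ''); // removes +, spaces, dashes, etc.
  return { json: { ...item.json, cleanedNumber: cleaned } };
});
```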
Example HTTP verify body (n8n HTTP Request node)

```json
{
  "number": "{{ $json['WhatsApp No'] }}"
}
```

Optional
- Add a send-message HTTP Request node after verification to send messages.
- Append successful and failed rows to separate sheets for easy review.

Support & Community
Need help setting up or customizing the workflow? Reach out here:
- WhatsApp: Chat with Support
- Discord: Join SpaGreen Server
- Facebook Group: SpaGreen Community
- Website: SpaGreen Creative
- Envato: SpaGreen Portfolio
by Raphael De Carvalho Florencio
What this workflow is (About)
This workflow turns a Telegram bot into an AI-powered lyrics assistant. Users send a command plus a lyrics URL, and the flow downloads, cleans, and analyzes the text, then replies on Telegram with translated lyrics, summaries, vocabulary, poetic devices, or an interpretation, all generated by AI (OpenAI).

What problems it solves
- Centralizes lyrics retrieval + cleanup + AI analysis in one automated flow
- Produces study-ready outputs (translation, vocabulary, figures of speech)
- Saves time for teachers, learners, and music enthusiasts with instant results in chat

Key features
- **AI analysis** using OpenAI (no secrets hardcoded; uses n8n Credentials)
- **Line-by-line translation**, **concise summaries**, **vocabulary lists**
- **Poetic/literary device detection** and **emotional/symbolic interpretation**
- Robust ETL (extract, download, sanitize) and error handling
- Clear Sticky Notes documenting routing, ETL, AI prompts, and messaging

Who it's for
- Language learners & teachers
- Musicians, lyricists, and music bloggers
- Anyone studying lyrics for meaning, style, or vocabulary

Input & output
- **Input:** Telegram command with a public **lyrics URL**
- **Output:** Telegram messages (Markdown/MarkdownV2), split into chunks if long

How it works
1. **Telegram → Webhook** receives a user message (e.g., /get_lyrics <URL>).
2. **Routing (If/Switch)** detects which command was sent.
3. **Extract URL + Download (HTTP Request)** fetches the lyrics page.
4. **Cleanup (Code)** strips HTML/scripts/styles and normalizes whitespace (see the sketch below).
5. **OpenAI (Chat)** formats the result per command (translation, summary, vocabulary, analysis).
6. **Telegram (Send Message)** returns the final text; long outputs are split into chunks.
7. **Error handling** replies with friendly guidance for unsupported/incomplete commands.

Set up steps
1. Create a Telegram bot with @BotFather and copy the bot token.
2. In n8n, create Credentials → Telegram API and paste your token (no hardcoded keys in nodes).
3. Create Credentials → OpenAI and paste your API key.
4. Import the workflow and set a short webhook path (e.g., /lyrics-bot).
5. Publish the webhook and set it on Telegram:
   https://api.telegram.org/bot<YOUR_BOT_TOKEN>/setWebhook?url=https://[YOUR_DOMAIN]/webhook/lyrics-bot
6. (Optional) Restrict update types:

```bash
curl -X POST https://api.telegram.org/bot<YOUR_BOT_TOKEN>/setWebhook \
  -H "Content-Type: application/json" \
  -d '{
    "url": "https://[YOUR_DOMAIN]/webhook/lyrics-bot",
    "allowed_updates": ["message"]
  }'
```

7. Test by sending /start and then /get_lyrics <PUBLIC_URL> to your bot.
8. If messages are long, ensure MarkdownV2 is used and special characters are escaped.
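A minimal sketch of that Cleanup step; regex-based stripping is fragile on unusual markup, and the `data` field name (the HTTP Request response body) is an assumption to match against your node:

```javascript
// Code node ("Cleanup"): drop scripts/styles/tags and normalize whitespace.
const html = String($input.first().json.data ?? '');
const text = html
  .replace(/<script[\s\S]*?<\/script>/gi, '') // remove inline scripts
  .replace(/<style[\s\S]*?<\/style>/gi, '')   // remove inline styles
  .replace(/<[^>]+>/g, ' ')                   // strip remaining tags
  .replace(/&nbsp;/g, ' ')
  .replace(/\s+/g, ' ')                       // collapse whitespace
  .trim();

return [{ json: { lyrics: text } }];
```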
by Rakin Jakaria
Who this is for
This workflow is for content creators, digital marketers, or YouTube strategists who want to automatically discover trending videos in their niche, analyze engagement metrics, and get data-driven insights for their content strategy, all from one simple form submission.

What this workflow does
This workflow starts every time someone submits the YouTube Trends Finder Form. It then:
- **Searches YouTube videos** based on your topic and specified time range using the **YouTube Data API**.
- **Fetches detailed analytics** (views, likes, comments, engagement rates) for each video found.
- **Calculates engagement rates** and filters out low-performing content (below 2% engagement); see the sketch below.
- **Applies smart filters** to exclude videos with fewer than 1,000 views, content outside your timeframe, and hashtag-heavy titles.
- **Removes duplicate videos** to ensure clean data.
- **Creates a Google Spreadsheet** with all trending video data organized by performance metrics.
- **Delivers the results** via a completion form with a direct link to your analytics report.

Setup
To set this workflow up:
1. Form Trigger: Customize the "YouTube Trends Finder" form fields if needed (Topic Name, Last How Many Days).
2. YouTube Data API: Add your YouTube OAuth2 credentials and API key in the respective nodes.
3. Google Sheets: Connect your Google Sheets account for automatic report generation.
4. Engagement Filters: Adjust the 2% engagement rate threshold based on your quality standards.
5. View Filters: Modify the minimum view count (currently 1,000+) in the filter conditions.
6. Regional Settings: Update the region code (currently "US") to target specific geographic markets.

How to customize this workflow to your needs
- Change the engagement rate threshold to be more or less strict based on your niche requirements.
- Add additional filters like video duration, subscriber count, or specific keywords to refine results.
- Modify the Google Sheets structure to include extra metrics like "Channel Name", "Video Duration", or "Trending Score".
- Switch to different output formats like CSV export or direct email reports instead of Google Sheets.
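A minimal Code-node sketch of the engagement calculation and filtering described above; the `statistics` field names follow the YouTube Data API `videos.list` response, so adjust them if your node maps the data differently:

```javascript
// Code node: compute engagement rate per video, then apply the 2% engagement
// and 1,000-view thresholds. The API returns counts as strings, hence Number().
const MIN_VIEWS = 1000;
const MIN_ENGAGEMENT = 0.02; // 2%

return $input.all()
  .map(item => {
    const s = item.json.statistics ?? {};
    const views = Number(s.viewCount ?? 0);
    const interactions = Number(s.likeCount ?? 0) + Number(s.commentCount ?? 0);
    const engagementRate = views > 0 ? interactions / views : 0;
    return { json: { ...item.json, views, engagementRate } };
  })
  .filter(i => i.json.views >= MIN_VIEWS && i.json.engagementRate >= MIN_ENGAGEMENT);
```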
by vinci-king-01
Public Transport Delay Tracker with Microsoft Teams and Todoist

⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow continuously monitors public-transportation websites and apps for real-time schedule changes and delays, then posts an alert to a Microsoft Teams channel and creates a follow-up task in Todoist. It is ideal for commuters or travel coordinators who need instant, actionable updates about transit disruptions.

Pre-conditions/Requirements

Prerequisites
- An n8n instance (self-hosted or n8n cloud)
- ScrapeGraphAI community node installed
- Microsoft Teams account with permission to create an Incoming Webhook
- Todoist account with at least one project
- Access to target transit authority websites or APIs

Required Credentials
- **ScrapeGraphAI API Key**: Enables scraping of transit data
- **Microsoft Teams Webhook URL**: Sends messages to a specific channel
- **Todoist API Token**: Creates follow-up tasks
- (Optional) Transit API key if you are using a protected data source

Specific Setup Requirements

| Resource          | What you need                                                |
|-------------------|--------------------------------------------------------------|
| Teams Channel     | Create a channel → add "Incoming Webhook" → copy the URL     |
| Todoist Project   | Create a "Transit Alerts" project and note its Project ID    |
| Transit URLs/APIs | Confirm the URLs/pages contain the schedule & delay elements |

How it works

Key Steps:
1. **Webhook (Trigger)**: Starts the workflow on a schedule or via HTTP call.
2. **Set Node**: Defines target transit URLs and parsing rules.
3. **ScrapeGraphAI Node**: Scrapes live schedule and delay data.
4. **Code Node**: Normalizes scraped data, converts times, and flags delays.
5. **IF Node**: Determines if a delay exceeds the user-defined threshold.
6. **Microsoft Teams Node**: Sends a formatted alert message to the selected Teams channel.
7. **Todoist Node**: Creates a "Check alternate route" task with a due date equal to the delayed departure time.
8. **Sticky Note Node**: Holds a blueprint-level explanation for future editors.

Set up steps
Setup time: 15–20 minutes
1. Install the community node: In n8n, go to "Manage Nodes" → "Install" → search for "ScrapeGraphAI" → install and restart n8n.
2. Create the Teams webhook: In Microsoft Teams, open the target channel → "Connectors" → "Incoming Webhook" → give it a name/icon → copy the URL.
3. Create a Todoist API token: Todoist → Settings → Integrations → copy your personal API token.
4. Add credentials in n8n: Settings → Credentials → create new entries for ScrapeGraphAI, Microsoft Teams, and Todoist.
5. Import the workflow template: File → Import Workflow JSON → select this template.
6. Configure the Set node: Replace the example transit URLs with those of your local transit authority.
7. Adjust the delay threshold: In the Code node, edit `const MAX_DELAY_MINUTES = 5;` as needed.
8. Activate the workflow: Toggle "Active" and monitor executions to ensure messages and tasks are created.

Node Descriptions

Core Workflow Nodes:
- **Webhook**: Triggers the workflow on a schedule or external HTTP request.
- **Set**: Supplies the list of URLs and scraping selectors.
- **ScrapeGraphAI**: Scrapes timetable, status, and delay indicators.
- **Code**: Parses results, converts delays to minutes, and builds payloads.
- **IF**: Compares delay duration to the threshold.
- **Microsoft Teams**: Posts a formatted, adaptive-card-style message.
- **Todoist**: Adds a task with priority and due date.
- **Sticky Note**: Internal documentation inside the workflow canvas.

Data Flow:
Webhook → Set → ScrapeGraphAI → Code → IF
a. IF (true branch) → Microsoft Teams → Todoist
b. IF (false branch) → (workflow ends)

Customization Examples

Change alert message formatting:

```javascript
// In the Code node (mode: Run Once for All Items)
const item = $input.first().json;
const message = `⚠️ Delay Alert:
Route: ${item.route}
Expected: ${item.scheduled}
New Time: ${item.newTime}
Delay: ${item.delay} min
Link: ${item.url}`;
return [{ json: { message } }];
```

Post to multiple Teams channels:

```javascript
// Duplicate the Microsoft Teams node and reference a different credential
items.forEach(item => {
  item.json.webhookUrl = $node["Set"].json["secondaryChannelWebhook"];
});
return items;
```

Data Output Format
The workflow outputs structured JSON data:

```json
{
  "route": "Blue Line",
  "scheduled": "2024-12-01T14:25:00Z",
  "newTime": "2024-12-01T14:45:00Z",
  "delay": 20,
  "status": "Delayed",
  "url": "https://transit.example.com/blue-line/status"
}
```

Troubleshooting

Common Issues
- Scraping returns empty data → Verify the CSS selectors/XPath in the Set node and ensure the target site hasn't changed its markup.
- Teams message not sent → Check that the stored webhook URL is correct and the connector is still active.
- Todoist task duplicated → Add a unique key (e.g., route + timestamp) to avoid inserting duplicates (see the sketch below).

Performance Tips
- Limit the number of URLs per execution when monitoring many routes.
- Cache previous scrape results to avoid hitting site rate limits.

Pro Tips:
- Use n8n's built-in Cron instead of Webhook if you only need periodic polling.
- Add a SplitInBatches node after scraping to process large route lists incrementally.
- Enable execution logging to an external database for detailed audit trails.
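One way to build that unique key is with n8n's workflow static data, which persists between executions of an active workflow. A minimal sketch, assuming the `route`/`scheduled` fields shown in the output format above:

```javascript
// Code node placed before the Todoist node: skip alerts we've already sent.
// Note: static data persists only for active (non-manual) executions.
const seen = $getWorkflowStaticData('global');
seen.alerts = seen.alerts ?? {};

const fresh = [];
for (const item of $input.all()) {
  const key = `${item.json.route}|${item.json.scheduled}`; // route + timestamp
  if (!seen.alerts[key]) {
    seen.alerts[key] = true;
    fresh.push(item);
  }
}
return fresh; // only previously-unseen delays continue to Teams/Todoist
```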
by Yasir
Workflow Overview: AI-Powered Jobs Scraper & Relevancy Evaluator

This workflow automates the process of finding highly relevant job listings based on a user's resume, career preferences, and custom filters. It scrapes fresh job data, evaluates relevance using OpenAI GPT models, and automatically appends the results to your Google Sheet tracker, skipping any jobs already in your sheet so you don't have to worry about duplicates.

Perfect for recruiters, job seekers, or virtual assistants who want to automate job research and filtering.

What the Workflow Does
1. Takes user input through a form, including resume, preferences, target score, and Google Sheet link.
2. Fetches job listings via an Apify LinkedIn Jobs API actor.
3. Filters and deduplicates results (removes duplicates and blacklisted companies).
4. Evaluates job relevancy using GPT-4o-mini, scoring each job (0–100) against the user's resume & preferences.
5. Applies a relevancy threshold to keep only top-matching jobs.
6. Checks your Google Sheet for existing jobs and prevents duplicates (see the sketch below).
7. Appends new, relevant jobs directly into your provided Google Sheet.

What You'll Get
- A personal Job Scraper Form (public URL you can share or embed).
- Automatic job collection & filtering based on your inputs.
- Relevance scoring (0–100) for each job using your resume and preferences.
- A real-time job tracking Google Sheet that includes: Job Title, Company Name & Profile, Job URLs, Location, Salary, HR Contact (if available), and Relevancy Score.

Setup Instructions

1. Required Accounts
You'll need:
- n8n account (self-hosted or Cloud)
- Google account (for Sheets integration)
- OpenAI account (for GPT API access)
- Apify account (to fetch job data)

2. Connect Credentials
In your n8n instance, go to Credentials → Add New:
- Google Sheets OAuth2 API: connect your Google account.
- OpenAI API: add your OpenAI API key.
- Apify API: replace <your_apify_api> with your Apify API key.

Set up the Apify actor:
- Get your Apify API key at https://console.apify.com/settings/integrations and copy it.
- Rent the required Apify actor before running this workflow: go to https://console.apify.com/actors/BHzefUZlZRKWxkTck/input and click "Rent Actor". Once rented, your Apify account can use it to fetch job listings.

3. Set Up Your Google Sheet
- Make a copy of the Google Sheet Template.
- Enable edit access for anyone with the link.
- Copy your sheet's URL; you'll provide this when submitting the workflow form.

4. Deploy & Run
- Import this workflow (jobs_scraper.json) into your n8n workspace.
- Activate the workflow.
- Visit your form trigger endpoint (e.g., https://your-n8n-domain/webhook/jobs-scraper).
- Fill out the form with: job title(s); location; contract type, experience level, working mode, date posted; target relevancy score; Google Sheet link; resume text; job preferences or ranking criteria.
- Submit; within minutes, new high-relevance job listings will appear in your Google Sheet automatically.

Example Use Cases
- Automate daily job scraping for clients or yourself.
- Filter jobs by AI-based relevance instead of keywords.
- Build a smart job board or job alert system.
- Support a career agency offering done-for-you job search services.

Tips
- Adjust the "Target Relevancy Score" (e.g., 70–85) to control how strict the filtering is.
- You can add your own blacklisted companies in the Filter & Dedup Jobs node.
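A rough sketch of that threshold-plus-dedup step as a Code node; the node names (`Form Trigger`, `Read Sheet`) and the field/column names (`relevancyScore`, `jobUrl`, "Job URL") are assumptions to map onto the actual workflow:

```javascript
// Code node: keep only jobs at or above the user's target score and not
// already present in the tracking sheet.
const targetScore = Number($('Form Trigger').first().json.targetScore ?? 75);

// URLs already logged in the sheet, read earlier in the workflow.
const existingUrls = new Set(
  $('Read Sheet').all().map(row => row.json['Job URL'])
);

return $input.all().filter(item =>
  Number(item.json.relevancyScore ?? 0) >= targetScore &&
  !existingUrls.has(item.json.jobUrl)
);
```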