by Atta
## What it does

The job search process is filled with manual, frustrating tasks: reading endless job descriptions only to find the seniority is wrong, the role requires a language you don't speak, or a "hybrid" job has an impossible commute. This workflow acts as a personal AI assistant that automates the entire top of your job search funnel. It doesn't just find jobs; it reads the full description, checks the commute time from your home, filters by your specific criteria, and even compares the job requirements against your CV to calculate a match score. It's a personalized decision-making engine that only alerts you to the opportunities that are a perfect fit.

## How it works

The workflow is fully customized from a single Config node and runs in a multi-layered sequence to find and qualify job opportunities.

1. **Scrape Jobs:** The workflow triggers and uses Apify to find new job postings on LinkedIn based on a list of keywords you define (e.g., "AI Workflow Engineer," "Automation Specialist").
2. **AI Triage & Smart Filtering:** For each job found, a Google Gemini AI performs an initial triage, extracting key data like the job's language, work model (Remote, Hybrid, On-site), and seniority level. The workflow then applies a series of smart filters based on your personal preferences:
   - **Language & Seniority:** It discards any jobs that don't match your target language and experience level.
   - **Commute Check:** For hybrid or on-site roles, it uses the Google Maps API to calculate the commute time from your home address and filters out any that exceed your maximum desired travel time.
3. **AI Deep Analysis vs. CV:** For the handful of jobs that pass the initial filters, a second, more advanced Google Gemini agent performs a deep analysis. It compares the job description against your personal CV (which you paste into the config) to generate a summary, a list of key required skills, and a final match score (e.g., 8/10).
4. **Log & Alert:** The final step is action. The full analysis of every qualified job is logged in a Supabase database for your records. However, only jobs with a match score above your set threshold trigger an immediate, detailed alert in Telegram, ensuring you only focus on the best opportunities.

## Setup Instructions

This workflow is designed for easy setup, with most personal preferences controlled from a single node.

### Required Credentials

- **Apify:** You will need an Apify API token.
- **Google Cloud:** You will need credentials for a Google Cloud project with the Google AI (Gemini) and Google Maps APIs enabled.
- **Supabase:** You will need your Supabase Project URL and Service Role Key.
- **Telegram:** You will need a Telegram Bot Token and the Chat ID for the channel where you want to receive alerts.

### Step-by-Step Configuration

Almost all customization is done in the Config node. Open it and set the following parameters to match your personal job search criteria:

- **MyCV:** Paste the full text of your CV/resume here. This is used by the AI to compare your skills against the job requirements.
- **JobKeywords:** Search keywords for jobs (e.g., "engineer", "product manager").
- **JobsToScrape:** The maximum number of relevant job postings to scrape in each run.
- **HomeLocation:** Your home city and country (e.g., "Breda, Netherlands"). This is used as the starting point for calculating commute times for hybrid or on-site jobs.
- **MaxCommuteMinutes:** Your personal maximum one-way commute time in minutes. The workflow will filter out any jobs that require a longer travel time.
- **TargetLanguage:** Your preferred language for job postings. The workflow will filter out any jobs not written in this language. You can list multiple languages, separated by a comma.
- **ExperienceLevel:** The seniority level you are looking for. The AI will validate this against the job description. The value can be:
  - `""` (Any)
  - `"internship"` (Internship)
  - `"entry"` (Entry Level)
  - `"associate"` (Associate)
  - `"mid_senior"` (Mid-Senior Level)
  - `"director"` (Director)
  - `"executive"` (Executive)
- **Under10Applicants:** Set to `true` if you only want to see jobs with fewer than 10 applicants; set to `false` to see all jobs.

After setting up the Config node, configure the Supabase and Telegram nodes with your specific credentials and table/chat details.

## How to Adapt the Template

This workflow is a powerful framework for any search and qualification process.

- **Change Job Source:** Swap the **Apify** node to scrape different job boards, or use an RSS Feed Reader node to get jobs from sites that provide feeds.
- **Refine AI Logic:** The prompts in the two **Google Gemini** nodes are the core of the engine. You can edit them to extract different data points, change the scoring criteria, or even ask the AI to evaluate a company's culture based on the tone of the job description.
- **Change the Database:** Replace the **Supabase** node with **Airtable**, **Google Sheets**, or a traditional database node like **Postgres** to log your results.
- **Modify Alerts:** Change the **Telegram** node to send alerts via **Slack**, **Discord**, or **Email**. You could also add a step to automatically create a draft application or add the job to a personal CRM.
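The commute check described above can be sketched in a few lines. This is a minimal illustration, not the template's actual node configuration: the function names are assumptions, and the URL follows Google's Distance Matrix API pattern (origins, destinations, key).

```javascript
// Build the Distance Matrix request for one job, then filter on the
// returned travel duration (seconds) against MaxCommuteMinutes.
function buildDistanceMatrixUrl(home, jobLocation, apiKey) {
  const base = "https://maps.googleapis.com/maps/api/distancematrix/json";
  const params = new URLSearchParams({
    origins: home,
    destinations: jobLocation,
    key: apiKey,
  });
  return `${base}?${params.toString()}`;
}

function passesCommuteFilter(durationSeconds, maxCommuteMinutes) {
  return durationSeconds / 60 <= maxCommuteMinutes;
}
```

A job whose one-way drive comes back as 46 minutes would be dropped when MaxCommuteMinutes is 45.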
by NAZIA AI ACADEMY
## How it works

This workflow lets users generate AI images directly from Telegram messages using:

- **Google Gemini API** – to convert text into detailed, high-quality image prompts.
- **Pollinations API** – to generate free AI images from those prompts.
- **Telegram Bot** – to interact with users and return generated images instantly.

It's fully automated using n8n — from text message to stunning image, all in one flow. Perfect for creators, content marketers, or anyone wanting quick visuals on the go.

## Set up steps

🧩 Estimated setup time: ~10–15 minutes

1. Create a Telegram Bot via @BotFather, copy your token, and set up the Telegram Trigger node in n8n with your credentials.
2. Set up the Google Gemini API via Google AI Studio or Cloud Console. Make sure your API key is added in the credentials section of the Gemini node.
3. Customize the prompt structure or image size in the Fields - Set Values or Prompt Agent node.
4. (Optional) Enable Save to Disk if you want to keep a local copy of every image.
5. Deploy and run the workflow — done 🎉

🛠️ All technical details and logic are fully documented inside the workflow using sticky notes.

## ⚠️ Requirements

- n8n (self-hosted or Cloud)
- Telegram Bot Token
- Google Gemini API key (with billing enabled — includes some free usage)
- No key needed for the Pollinations API — it's 100% free 🆓
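As a rough illustration of the Pollinations call, the image is fetched with a plain GET on a prompt-in-the-URL endpoint. The URL pattern and width/height parameters below reflect Pollinations' public endpoint as commonly documented; verify them against the current Pollinations docs before relying on them.

```javascript
// Build the image URL an HTTP Request node would GET; the response body
// is the generated image itself.
function pollinationsUrl(prompt, width = 1024, height = 1024) {
  const encoded = encodeURIComponent(prompt);
  return `https://image.pollinations.ai/prompt/${encoded}?width=${width}&height=${height}`;
}
```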
by Ninja - Abbas
## Overview

This workflow automates quiz delivery from a Google Sheet directly into a Telegram group. It ensures that quizzes are posted one by one as polls, and once a quiz is sent, its status in the sheet is automatically updated to prevent duplication. If no pending quiz is available, the workflow notifies a separate Telegram channel to refill the sheet.

## Target Audience

This template is designed for teachers, community managers, and Telegram group admins who want an easy way to run quizzes, trivia games, or knowledge checks without manual posting.

## Problem It Solves

Manually sending quizzes is repetitive and prone to mistakes (like re-posting the same question). This template ensures quizzes are sent in order, tracked, and managed automatically with minimal human effort.

## Requirements

- An active Google account with Google Sheets enabled
- A Telegram bot token (via BotFather)
- Chat IDs for both the quiz group and the notification group

## Google Sheet Structure

Create a sheet with the following columns:

quiz_number | question | option_a | option_b | option_c | option_d | status

- **status**: Use 🟨 for pending quizzes, ✅ once completed

## Setup Instructions

1. Copy the workflow into your n8n instance.
2. Add your Google Sheets credentials.
3. Replace the YOUR_SHEET_ID placeholder with your sheet ID.
4. Set environment variables:
   - TELEGRAM_BOT_TOKEN
   - TELEGRAM_CHAT_ID (for the quiz group)
   - TELEGRAM_NOTIFY_CHAT_ID (for refill notifications)
5. Run the workflow.

## Customization Options

- Adjust the sheet name if not using "Sheet1"
- Change the emoji markers (🟨 / ✅) to your preferred system
- Modify the Telegram notification message
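The selection logic above is simple enough to sketch. Assuming row objects whose field names match the sheet columns, the workflow takes the first row still marked 🟨, or falls through to the refill notification when none remain:

```javascript
// Pick the first pending quiz; null means the sheet needs a refill.
function nextPendingQuiz(rows) {
  const pending = rows.find((row) => row.status === "🟨");
  return pending ?? null;
}

const rows = [
  { quiz_number: 1, question: "2 + 2 = ?", status: "✅" },
  { quiz_number: 2, question: "Capital of France?", status: "🟨" },
];
// nextPendingQuiz(rows) returns the quiz_number 2 row; after sending,
// that row's status is written back as ✅ so it is never re-posted.
```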
by Trung Tran
# Beginner's Tutorial: Manage Azure Storage Account Container & Blob with n8n

> This beginner-friendly n8n workflow shows you how to generate AI images using OpenAI, store them in Azure Blob Storage, and manage blob containers, all with zero code.

## 👤 Who's it for

This workflow is perfect for:

- **Beginners** learning Azure + OpenAI integration
- **No-code developers** experimenting with image generation
- **Cloud learners** who want hands-on Blob Storage use cases
- Anyone who wants to automate storing AI-generated content in the cloud

## ⚙️ How it works / What it does

1. 🖱️ Trigger the workflow manually using the Execute Workflow node.
2. ✏️ Use the Edit Fields node to input:
   - containerName (e.g., demo-images)
   - imageIdea (e.g., "a robot holding a coffee cup")
3. 📦 Create a new Azure Blob container (Create container).
4. 🤖 Use an OpenAI-powered Prompt Generation Agent to craft the perfect image prompt.
5. 🎨 Generate an image using OpenAI's DALL·E model.
6. ☁️ Upload the generated image to Azure Blob Storage (Create Blob).
7. 📂 List blobs in the container (Get many blobs).
8. 🧹 Delete any blob as needed (Delete Blob).
9. (Optional) 🗑️ Remove the entire container (Delete container).

## 🔧 How to set up

### 🧠 Set up OpenAI

1. Create an OpenAI account and get your API key.
2. In n8n, go to Credentials → OpenAI and paste your key.

### 🪣 Set up Azure Blob Storage

1. Log in to your Azure Portal.
2. Create a Storage Account (e.g., mystorageaccount).
3. Go to the Access Keys tab and copy:
   - Storage Account Name
   - Key1
4. In n8n, create a new Azure Blob Storage credential using:
   - Account Name = your storage account name
   - Access Key = key1 value

> 📝 This demo uses Access Key authentication. You can also configure Shared Access Signatures (SAS) or OAuth in production setups.

### Run the Workflow

1. Enter your image idea and container name.
2. Click "Execute Workflow" to test it.
## 📋 Requirements

| Requirement | Description |
|------------------------|--------------------------------------------------|
| Azure Storage Account | With container-level read/write access |
| OpenAI API Key | For image and prompt generation |
| n8n Version | v1.0+ recommended |
| Image Credits | OpenAI charges tokens for DALL·E image creation |

## 🛠️ How to customize the workflow

### 🧠 Adjust Prompt Generation

Update the Prompt Agent to include:

- A specific style (3D, anime, cyberpunk)
- Brand elements
- Multiple language options

### 📁 Organize by Date/User

Modify the containerName to auto-include:

- Date (e.g., images-2025-08-20)
- Username or session ID

### 📤 Send Image Output

- Add Slack, Telegram, or Email nodes to deliver the image
- Create public links using Azure's blob permissions

### 🔁 Cleanup Logic

- Auto-delete blobs after X days
- Add versioning or backup logic
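The "Organize by Date/User" customization amounts to building the containerName dynamically. One way to sketch it (the function name is an assumption): Azure container names must be 3–63 characters of lowercase letters, digits, and hyphens, so the helper normalizes its input.

```javascript
// Build a container name like "images-2025-08-20" that satisfies
// Azure's container naming rules (lowercase, digits, hyphens, max 63).
function dateContainerName(prefix, date = new Date()) {
  const day = date.toISOString().slice(0, 10); // YYYY-MM-DD
  return `${prefix}-${day}`
    .toLowerCase()
    .replace(/[^a-z0-9-]/g, "-")
    .slice(0, 63);
}
// dateContainerName("images", new Date("2025-08-20")) → "images-2025-08-20"
```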
by Aliz
This workflow automates the daily backfill of Google Analytics 4 (GA4) data into BigQuery. It fetches 13 essential pre-processed reports (including User Acquisition, Traffic, and E-commerce), uploads them to automatically created tables in BigQuery, and then sends an alert in Telegram.

## How it works

- **Configuration:** You define your Project ID, Dataset, and Date Range in a central "Config" node.
- **Parallel Fetching:** The workflow runs 13 parallel API calls to GA4 to retrieve key reports (e.g., ga4_traffic_sources, ga4_ecommerce_items).
- **Dynamic Tables:** It automatically checks whether the target BigQuery table exists and creates it with the correct schema if it's missing.
- **Telegram Alerts:** After execution, it sends a summary message to Telegram indicating success or failure for the day's run.

## Set up steps

1. **Google Credentials (OAuth):** This workflow uses n8n's built-in "Google OAuth2 API" credential. You do not need a Service Account key. Connect your Google account and ensure you grant scopes for the Google Analytics API and BigQuery API.
2. **Config Node:** Open the "Backfill Config" node and fill in:
   - GA4 Property ID
   - Google Cloud Project ID
   - BigQuery Dataset ID
3. **Telegram Setup (Optional):** If you want alerts, configure the Telegram node with your Bot Token and Chat ID. If not, you can disable or remove this node.
4. **Schedule:** By default, this is set to run daily. It is recommended to use a date expression (e.g., Today - 2 Days) to allow GA4 time to process data fully before fetching.
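The "Today - 2 Days" recommendation can be sketched as a small helper. In an n8n expression you would typically reach for Luxon, e.g. `{{ $now.minus({ days: 2 }).toFormat('yyyy-MM-dd') }}`; below is the plain JavaScript equivalent (the function name is illustrative):

```javascript
// Compute the backfill target date with a processing lag, formatted as
// the YYYY-MM-DD string the GA4 Data API expects in date ranges.
function backfillDate(today = new Date(), lagDays = 2) {
  const d = new Date(today);
  d.setUTCDate(d.getUTCDate() - lagDays);
  return d.toISOString().slice(0, 10);
}
// backfillDate(new Date("2025-03-10T12:00:00Z")) → "2025-03-08"
```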
by Dmitrij Zykovic
# Invoice Budget Tracker

Drop invoices into Google Drive and let AI handle the rest: OCR extraction, automatic categorization, budget tracking, and Telegram alerts when spending reaches thresholds.

## ✨ Key Features

- 📄 **Invoice OCR** - Extracts data from PDF/image invoices automatically
- 🤖 **AI Categorization** - Detects the document type (skips contracts and delivery acts) and categorizes real invoices
- 💰 **Budget Tracking** - Set monthly budgets per category, track spending in real time
- 🔔 **Smart Alerts** - Get notified when category spending reaches 80%+ of budget
- 📁 **Auto-Organization** - Files renamed and sorted into monthly folders
- 🔄 **Deduplication** - SHA256 hash prevents duplicate processing
- 📈 **Scheduled Reports** - Weekly progress (Fridays) and monthly summaries (10th)
- 💬 **Telegram Control** - Manage budgets via natural-language chat

## 🎯 How It Works

1. **Drop an invoice into the Google Drive watched folder:**
   - PDF invoices
   - Image scans (JPG, PNG)
2. **AI processes it automatically (hourly):**
   - OCR extracts text via Ainoflow
   - AI detects whether it's actually an invoice (skips contracts, receipts, etc.)
   - Extracts: vendor, amount, date, invoice number
   - Categorizes based on the service provided
   - Checks for duplicates
3. **Get organized:**
   - File renamed: `[2026-01-15] - Vendor (INV-123, 150.00 EUR).pdf`
   - Moved to monthly folder: `/Invoices/2026-01/`
   - Budget updated, alerts sent if a threshold is reached
4. **Manage budgets via Telegram:**
   - "Set budget Software 500"
   - "Show budgets"
   - "Budget status"

## 📋 Expense Categories

Software, Marketing, Travel, Office, Professional Services, Infrastructure, Vehicle, Other

## 🔧 Setup Requirements

- **Google Drive** — OAuth setup for file operations
- **Telegram Bot** — Create a bot for notifications and budget management
- **OpenRouter** — Get an API key for AI processing
- **Ainoflow** — Sign up for OCR and JSON storage

## 🏗️ Workflow Architecture

| Section | Description |
|---------|-------------|
| Document Processing | Hourly scan → OCR → AI categorization → Budget tracking → File organization |
| Budget Management | Telegram bot for budget CRUD via AI Agent with MCP storage |
| Weekly Report | Friday summary of current month progress |
| Monthly Report | 10th of month detailed report for previous month |
| Data Reset | Manual trigger to delete all invoice data (requires approval) |

## 💬 Usage Examples

### Invoice Processing

📄 Drop "invoice_aws.pdf" into /Invoices/
→ ✅ AWS | 150.00 EUR | Software
→ File: [2026-01-15] - AWS (INV-2026-01, 150.00 EUR).pdf
→ Moved to: /Invoices/2026-01/

📄 Drop "contract.pdf" into /Invoices/
→ ⚠️ Skipped: This is a service agreement, not an invoice
→ File renamed: [REVIEW] - contract.pdf

### Budget Management (Telegram)

"Set budget Software 500"
→ ✅ Budget set: Software - €500/month

"Show budgets"
→ 📋 Monthly Budgets: • Software: €500 • Marketing: €1000 • Total: €1500/month

"Budget status"
→ 📊 January 2026: • Software: €150/€500 (30%) ✅ • Marketing: €850/€1000 (85%) ⚠️

### Scheduled Reports

📅 Weekly (Friday):
→ 📊 Week Summary: Processed: 12 invoices | Total: €2,450 | Top: Software €800, Marketing €650

📅 Monthly (10th):
→ 📈 January 2026 Report: Total: €4,200 (28 invoices)
[████████░░] Software 80%
[██████░░░░] Marketing 60%

## 📦 Data Storage

Invoices are stored in Ainoflow JSON Storage by month (key = SHA256 hash):

```json
{
  "vendor": "Amazon Web Services",
  "vendor_normalized": "AWS",
  "amount": 150.00,
  "currency": "EUR",
  "date": "2026-01-15",
  "invoice_number": "INV-2026-01",
  "category": "Software",
  "file_id": "1BxiMVs0XRA5nFMd...",
  "processed_at": "2026-01-15T10:30:00Z"
}
```

## ⚠️ Important Notes

- **Run /start first** - Registers your chat_id and locks the bot to you
- **Document type detection** - Contracts, delivery acts, and receipts are skipped automatically
- **Budget alerts at 80%** - Only trigger if a budget is set for the category
- **Duplicates detected** - The same file won't be processed twice (SHA256 hash)
- **Data Reset is permanent** - Requires manual approval
- **Single currency** - All invoices are assumed to use the same currency (no conversion)

## 🛠️ Customization

### Categories

Edit the SetDefaults node → allowed_categories, then send /start to re-register with the new categories.

### Processing Thresholds

Edit the WorkflowConfig node:

- alert_threshold - Budget alert % (default: 0.8)
- review_prefix - Failed-files prefix (default: "[REVIEW] - ")
- duplicate_prefix - Duplicate prefix (default: "[DUPLICATE] - ")

### AI Models

Swap OpenRouter models in the Gpt4oCategorizer and Gpt4oBudgetAgent nodes.

## 💼 Need Customization?

Want to adapt this template for your business? Custom integrations, multi-user support, or enterprise deployment? Contact us at Ainova Systems - we build AI automation solutions for businesses.

Tags: google-drive, invoice-processing, budget-tracking, ai-agent, ocr, telegram, openrouter, mcp-tools, business-automation
by Dahiana
# Generate hyper-realistic images from Telegram messages with Nano Banana 2

Send any image description to your Telegram bot and receive a hyper-realistic AI-generated photo back in seconds.

## How it works

1. A user sends a natural-language image request to the Telegram bot.
2. The bot confirms receipt.
3. Gemini Pro 3 expands the request into a detailed JSON prompt: focal length, aperture, ISO, lighting behavior, material physics, etc.
4. Gemini Flash generates the image and returns it as a base64 string.
5. The image is sent back as a photo in the same Telegram chat.

## Setup steps

1. **Telegram** — Create a bot via @BotFather, copy the token, and add it as a Telegram credential in n8n. Connect it to the Telegram Trigger node and both Telegram send nodes.
2. **OpenRouter** — Add your OpenRouter API key (or any other tool you use) as a credential in n8n. Connect it to the Expand to JSON Prompt and Generate Image nodes.
3. Activate the workflow and send your bot an image description to test.

### Optional

Modify the config node with your system prompt preferences for a more diverse pool of results.
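To make step 3 concrete, here is a hypothetical example of what the expanded JSON prompt might look like. The exact schema is defined by the system prompt in the Expand to JSON Prompt node; every field name below is illustrative only.

```javascript
// Illustrative expanded prompt (assumed schema, not the template's own).
const expandedPrompt = {
  subject: "an elderly fisherman mending a net at dawn",
  camera: { focal_length_mm: 85, aperture: "f/1.8", iso: 200 },
  lighting: "soft golden-hour side light with gentle haze",
  materials: "weathered skin, wet rope fibers, coarse wool sweater",
  style: "hyper-realistic photograph",
};

// A downstream node would serialize it for the image model:
const promptText = JSON.stringify(expandedPrompt);
```

Encoding camera physics (focal length, aperture, ISO) rather than adjectives is what pushes the model toward photo-realistic output.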
by Sankar Battula
Send a product name to Telegram, compare offers across Amazon, Walmart, and Google Shopping, and get the best sensible buying recommendation with a confidence level, price range, and a direct purchase link.

## What this template does

This template transforms Telegram into a smart product price comparison tool. Once a user sends a product name, the workflow simultaneously searches Amazon, Walmart, and Google Shopping, standardizes all results into a unified format, and delivers the most practical buying recommendation. Rather than blindly picking the lowest price, an AI step filters out poor matches, suspiciously cheap listings, and mismatched product variants to ensure the recommendation is trustworthy.

## How it works

1. A user sends a product name to the Telegram bot.
2. The workflow queries Walmart, Amazon, and Google Shopping in parallel.
3. Each marketplace response is cleaned and mapped into a consistent offer structure.
4. All offers are merged and consolidated into a single dataset.
5. An AI agent evaluates price, seller, reviews, and match quality to identify the best sensible offer.
6. A Telegram message is sent back with the recommended price, seller, price range, offer count, confidence level, reasoning, and a direct purchase link.

## Requirements

Before using this template, connect the following credentials:

- Telegram API
- OpenAI
- SerpApi

You'll also need to add your own SerpApi key, review the marketplace query settings, and adjust or remove any fixed search location values as needed.

## Example input

DJI Osmo Pocket 3 Creator Combo

## Example output

🔎 DJI Osmo Pocket 3 (Creator Combo / Osmo Pocket 3 family)
✅ Best Price: $557.91
🏪 Seller: B&Z Legend
📊 Range: $6.85 - $1349.99
🧩 Comparable Offers: 69
🎯 Confidence: Medium
📝 Why this was chosen: Chose the lowest sensible price among New/unspecified offers: B&Z Legend at $557.91. Much cheaper listings (< $200) appear anomalous or risky (very low prices, unclear sellers). There are lower open-box offers (e.g., $498.75), but they carry added risk; established retailers (Best Buy, the official DJI store, OE USA) list in the $569–$999 range if you prefer retailer security.
🔗 Link: https://www.walmart.com/ip/DJI-Osmo-Pocket-3-Creator-Combo-4K-120fps-Camera-1-CMOS-3-Axis-Stabilization-Face-Object-Tracking-Mic-Clear-Sound-Vlogging-Photography/10583721243?classType=VARIANT

If no strong match is found, the workflow prompts the user to try a more specific product name.

## Good to know

- Results may vary depending on marketplace availability, search query quality, and regional settings.
- Specific product names typically yield more accurate recommendations than broad or generic searches.
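Step 3 of the flow above, mapping each marketplace response into a consistent offer structure, can be sketched like this. The field names are illustrative; the actual SerpApi response shapes differ per marketplace, so each branch would adapt its own raw keys into this common shape before the merge.

```javascript
// Normalize one raw marketplace result into the unified offer shape.
function normalizeOffer(raw, source) {
  return {
    source, // "amazon" | "walmart" | "google_shopping"
    title: raw.title ?? "",
    // Strip currency symbols/commas so prices compare numerically.
    price: Number(String(raw.price ?? "").replace(/[^0-9.]/g, "")) || null,
    seller: raw.seller ?? "unknown",
    link: raw.link ?? null,
  };
}

// normalizeOffer({ title: "DJI Osmo Pocket 3", price: "$557.91",
//                  seller: "B&Z Legend" }, "walmart").price → 557.91
```

With every offer in this shape, the merge step is a plain array concatenation and the AI agent sees one uniform dataset.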
by Patrick Jennings
# Sleeper NFL Team Chatbot Starter

A Telegram chatbot built to look up your fantasy football team in the Sleeper app and return your roster details: player names, positions, and team info. This starter workflow is perfect for users who want a simple, conversational way to view their Sleeper team in-season or pre-draft.

## What It Does

When a user types their Sleeper username into Telegram, this workflow:

1. Extracts the username from Telegram
2. Pulls their Sleeper User ID
3. Retrieves their leagues and selects the first one (by default)
4. Pulls the full league rosters
5. Finds the matching roster owned by that user
6. Uses player_ids to look up full player info from a connected database (e.g., Airtable or Google Sheets)
7. Returns a clean list of player names, positions, and teams via Telegram

## Requirements

To get this running, you'll need:

- A Telegram bot (set up through BotFather)
- A Sleeper Fantasy Football account
- A synced player database that matches player_id to full player details (we recommend using the companion template: Sleeper NFL Players Daily Sync)

## Setup Instructions

1. Import the workflow into your n8n instance.
2. Add the required credentials:
   - Telegram (API key from BotFather)
   - Airtable (or replace with another database method like Google Sheets or an HTTP request to a hosted JSON file)
3. Trigger the workflow by sending your exact Sleeper username to the bot.
4. Your full team roster will return as a formatted message.

> If the user is in multiple Sleeper leagues, the current logic returns the first league found.

## Example Output

You have 19 players on your roster: Cam Akers (RB - NO), Jared Goff (QB - DET), ...

## Customization Notes

- Replace the Telegram Trigger with any other input method (webhook, form input, etc.)
- Replace the Airtable node with Google Sheets, a SQL DB, or even a local file if preferred
- You can hardcode a Sleeper username if you're using this for a single user

## Related Templates

- Sleeper NFL Players Daily Sync (syncs player_id to player name, position, team). Create the Player Sync first, then either integrate it into this template or create a subworkflow from it and use the most recent data set.

## Difficulty Rating & Comment (from the author)

3 out of 10 if this ain't your first rodeo, respectfully. Just a little more work on adding the Players Sync as your data table and knowing how to GET from Sleeper. If you use Sleeper for fantasy football, let's go win some games!
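The GET chain the workflow makes against Sleeper can be sketched as follows. The endpoint paths reflect Sleeper's public read-only API as commonly documented (no auth token needed); verify them against Sleeper's current API docs before relying on them.

```javascript
// The three lookups this workflow chains: username → user, user → leagues,
// league → rosters.
const BASE = "https://api.sleeper.app/v1";

const sleeperUrls = {
  user: (username) => `${BASE}/user/${encodeURIComponent(username)}`,
  leagues: (userId, season) => `${BASE}/user/${userId}/leagues/nfl/${season}`,
  rosters: (leagueId) => `${BASE}/league/${leagueId}/rosters`,
};

// From the rosters response, keep the one owned by the user.
function myRoster(rosters, userId) {
  return rosters.find((r) => r.owner_id === userId) ?? null;
}
```

Each roster object carries a `players` array of player_ids, which is where the companion player database comes in to resolve names, positions, and teams.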
by ARofiqi Maulana
## 🚀 Overview

This workflow automatically generates IELTS practice content using AI and sends it to Telegram on a schedule. It covers three key IELTS skills:

- 🧠 Grammar Practice (Monday)
- ✍️ Writing Task (Wednesday)
- 📘 Reading Practice (Friday)

The workflow is designed to deliver structured, clean, and readable exercises directly to your Telegram chat.

## ⚙️ How it works

1. The Schedule Trigger runs the workflow on specific days.
2. The "Select Test by Day" node determines which practice to generate.
3. AI generates structured IELTS content in JSON format.
4. The output is parsed and formatted into readable messages.
5. Long messages are split into chunks.
6. The content is sent to Telegram.

## 🧩 Features

- AI-powered IELTS content generation
- Clean JSON parsing and formatting
- Telegram-ready message delivery
- Automatic scheduling by day
- Support for multiple IELTS skills in one workflow

## 🛠 Setup

1. Create a Telegram bot via @BotFather.
2. Copy your bot token.
3. Add Telegram credentials in n8n.
4. Add your AI credentials (Gemini or OpenAI).
5. Replace the Chat ID in the "Send" nodes with your own.

## 📩 Telegram Setup

To get your Chat ID:

1. Send a message to your bot.
2. Open: https://api.telegram.org/bot<TOKEN>/getUpdates
3. Copy the chat.id value.

## ⚠️ Important Notes

- This workflow uses a fixed Chat ID for scheduled messages.
- Make sure to replace the Chat ID before running.
- Do not hardcode sensitive credentials.
- Ensure the AI output format remains consistent (JSON).

## 💡 Customization

You can easily customize:

- Topics and difficulty level
- Message format
- Schedule timing
- New modules (e.g., Speaking, Vocabulary)

## 🎯 Use Cases

- IELTS learners who want daily practice
- Teachers sharing exercises with students
- Automated English learning systems
- AI-powered educational bots

## 🚀 Creator Note

This workflow demonstrates how to combine AI generation, structured parsing, and messaging automation into a scalable learning system. Feel free to adapt and extend it for your own use case.
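The chunking step exists because the Telegram Bot API caps a single text message at 4096 characters. A minimal sketch of that split (the function name is illustrative), preferring paragraph boundaries so exercises aren't cut mid-sentence:

```javascript
// Split text into Telegram-sized chunks, breaking at the last blank line
// before the limit when possible.
function chunkMessage(text, limit = 4096) {
  const chunks = [];
  let remaining = text;
  while (remaining.length > limit) {
    let cut = remaining.lastIndexOf("\n\n", limit);
    if (cut <= 0) cut = limit; // no paragraph break → hard split
    chunks.push(remaining.slice(0, cut));
    remaining = remaining.slice(cut).replace(/^\n+/, "");
  }
  if (remaining) chunks.push(remaining);
  return chunks;
}
```

Each chunk is then sent as its own Telegram message in order.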
by SpaGreen Creative
## Who Is This For?

This workflow is designed for individuals, families, and teams who want to automate birthday greetings for their contacts. It's particularly useful for those who maintain their contacts in Google and want to send personalized birthday wishes through multiple communication platforms without manual intervention.

## What This Workflow Does

This workflow functions as an automated birthday reminder and greeting system that:

- Checks contacts daily to identify whose birthday it is
- Sends personalized birthday wishes through multiple platforms
- Provides voice announcements through Google Home speakers
- Handles WhatsApp verification before sending messages
- Formats dates appropriately for the Bengali locale (bd)

## Key Features

- **Automated Scheduling** – Runs every day at 10 AM
- **Multi-Platform Messaging** – Telegram, Microsoft Teams, and WhatsApp
- **Voice Announcements** – Google Home speaker integration
- **Contact Management** – Google Contacts integration
- **Date Formatting** – Bengali locale (bd) date formatting
- **WhatsApp Verification** – Checks whether a contact has WhatsApp before sending
- **Loop Processing** – Handles multiple birthdays on the same day

## Requirements & Setup

- **Google Contacts API**: Required for accessing contact information and birthdays
- **Telegram Bot API**: For sending birthday messages
- **Microsoft Teams API**: For team notifications
- **Rapiwa API**: For WhatsApp messaging and verification
- **Home Assistant API**: For Google Home speaker integration
- **Scheduled Trigger**: Set to run daily at 10 AM

## Support & Help

- **WhatsApp**: Chat on WhatsApp
- **Discord**: SpaGreen Community
- **Facebook Group**: SpaGreen Support
- **Website**: https://spagreen.net
- **Developer Portfolio**: Codecanyon SpaGreen
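The daily check boils down to matching each contact's birthday month and day against today while ignoring the year. A minimal sketch with assumed field names (the actual Google Contacts response shape differs):

```javascript
// True if the stored birthday falls on today's month/day, any year.
function isBirthdayToday(birthday, today = new Date()) {
  const b = new Date(birthday);
  return (
    b.getUTCMonth() === today.getUTCMonth() &&
    b.getUTCDate() === today.getUTCDate()
  );
}

const contacts = [
  { name: "Rahim", birthday: "1990-05-14" },
  { name: "Karim", birthday: "1985-11-02" },
];
const todaysBirthdays = contacts.filter((c) =>
  isBirthdayToday(c.birthday, new Date("2025-05-14T08:00:00Z"))
);
// todaysBirthdays contains only Rahim
```

The resulting list feeds the loop that sends greetings per contact, which is how multiple birthdays on the same day are handled.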
by Harry Gunadi Permana
# Get Forex Factory News Releases to Telegram and Google Sheets

Record news data and live prices from MyFxBook for affected currency pairs. This n8n template demonstrates how to capture actual data releases as quickly as possible for trading decisions.

## Use cases

- Get notified whether the actual data release is positive or negative for the relevant currency.
- Use the Telegram chat message about the news release as a trigger to open a trading position in MetaTrader 4.
- Record news data and live prices to Google Sheets for analysis.

## Currency Pairs

EURUSD, GBPUSD, AUDUSD, NZDUSD, USDJPY, USDCHF, USDCAD, XAUUSD

## How it works

1. A news release event acts as the trigger.
2. Only news with a numerical Forecast value is processed. Events that cannot be measured numerically (e.g., speeches) are ignored.
3. Extract news details: currency, impact level (high/medium), release date, and news link.
4. Wait 10 seconds to ensure the Actual value is available on the news page.
5. Scrape the Actual value from the news link using Airtop.
6. If the Actual value is not available, wait another 5 seconds and retry scraping.
7. Extract both Actual and Forecast values from the scraped content.
8. Remove non-numeric characters (%, K, M, B, T) and convert the values to numbers.
9. Determine the effect: if the Actual value is lower than the Forecast value (and lower is better), send it to the True branch; otherwise, send it to the False branch.
10. Record news data and the live price from MyFxBook to Google Sheets.

## How to use

1. Enter all required credentials.
2. Create or download a Google Sheets file like this one in your Google Drive: https://docs.google.com/spreadsheets/d/1OhrbUQEc_lGegk5pRWWKz5nrnMbTZGT0lxK9aJqqId4/edit?usp=drive_link
3. Run the workflow.

## Requirements

- Google Calendar credentials
- Airtop API key
- Telegram Chat ID
- Telegram Bot API token
- Google Drive API enabled in Google Cloud Console
- Google Sheets credentials

## Need Help?

Join the Discord or ask in the Forum!

Thank you!

Update Sept 26, 2025: Added a new edit node.
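Steps 8–9 above can be sketched as a short Code-node snippet (function names are illustrative). Since the Actual and Forecast values for the same event share the same unit suffix (e.g., both in "K"), stripping %/K/M/B/T keeps the comparison valid without scaling.

```javascript
// Strip %, K, M, B, T, commas, and whitespace, then parse as a number.
function toNumber(value) {
  return parseFloat(String(value).replace(/[%KMBT,\s]/gi, ""));
}

// True branch: Actual lower than Forecast (for "lower is better" events).
function isTrueBranch(actual, forecast) {
  return toNumber(actual) < toNumber(forecast);
}
// isTrueBranch("210K", "215K") → true
// isTrueBranch("3.1%", "2.9%") → false
```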