by Axiomlab.dev
HubSpot Lead Refinement

How it works

Triggers:
- HubSpot Trigger: fires when contacts are created/updated.
- Manual Trigger: run on demand for testing or batch checks.

Nodes:
- Get Recently Created/Updated Contacts: pulls fresh contacts from HubSpot.
- Edit Fields (Set): maps key fields (First Name, Last Name, Email) for the Agent.
- AI Agent: first reads your Google Doc (via the Google Docs tool) to learn the research steps and output format, then uses SerpAPI (Google engine) to locate the contact's likely LinkedIn profile and produce a concise result.
- Code - Remove Think Part: cleans the model output (removes hidden "think" blocks and formatting) so only the final answer remains.
- HubSpot Update: writes the cleaned LinkedIn URL to the contact (matched by email).

Required Credentials
- HubSpot App Token (Private App) - for the Get/Update contact nodes.
- HubSpot Developer OAuth (optional) - if you use the HubSpot Trigger node for event-based runs.
- Google Service Account - for the Google Docs tool (share your playbook doc with this service account).
- OpenRouter - for the OpenRouter Chat Model used by the AI Agent.
- SerpAPI - for targeted Google searches from within the Agent.

Setup Instructions

HubSpot
1. Create a Private App and copy the Access Token.
2. Add or confirm the contact property linkedinUrl (Text).
3. Plug the token into the HubSpot nodes.
4. If using the HubSpot Trigger, connect your Developer OAuth app and subscribe to contact create/update events.

Google Docs (Living Instructions)
1. Copy the sample configuration doc file and modify it to your needs.
2. Share the doc with your Google Service Account (Viewer is fine).
3. In the Read Google Docs node, paste the Document URL.

OpenRouter & SerpAPI
1. Add your OpenRouter key to the OpenRouter Chat Model credential.
2. Add your SerpAPI key to the SerpAPI tool node.
3. (Optional) In your Google Doc or Agent prompt, set sensible defaults for SerpAPI (engine=google, hl=en, gl=us, num=5, max 1-2 searches).
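The "Remove Think Part" cleanup step can be sketched as a small function in the style of an n8n Code node. The `<think>...</think>` tag format is an assumption (reasoning models commonly emit it); adjust the pattern to whatever your model actually produces.

```javascript
// Sketch of the "Remove Think Part" cleanup, assuming the model wraps its
// hidden reasoning in <think>...</think> tags (adjust to your model's output).
function removeThinkPart(raw) {
  return raw
    .replace(/<think>[\s\S]*?<\/think>/g, '') // drop hidden reasoning blocks
    .replace(/```[a-z]*\n?|```/g, '')         // strip stray code fences
    .trim();                                  // keep only the final answer
}

const sample = '<think>Searching LinkedIn...</think>https://www.linkedin.com/in/jane-doe';
console.log(removeThinkPart(sample)); // → https://www.linkedin.com/in/jane-doe
```

In the workflow, the cleaned string is what gets written back to the contact's linkedinUrl property.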
What you get
- Auto-enriched contacts with a LinkedIn URL and profile insights (clean, validated output).
- A research process you can change anytime by editing the Google Doc - no workflow changes needed.
- Tight, low-noise searches via SerpAPI to keep costs down.

And that's it - publish and let the Agent enrich new leads automatically while you refine the rules in your doc. This makes it easy to hand the process off to a team that wouldn't necessarily tweak the automation nodes.
by Madame AI
Scrape Detailed GitHub Profiles to Google Sheets Using BrowserAct

This template is a sophisticated data enrichment and reporting tool that scrapes detailed GitHub user profiles and organizes the information into dedicated, structured reports within a Google Sheet. This workflow is essential for technical recruiters, talent acquisition teams, and business intelligence analysts who need to dive deep into a pre-qualified list of developers to understand their recent activity, repositories, and technical footprint.

Self-Hosted Only
This workflow uses a community node and is designed and tested for self-hosted n8n instances only.

How it works
1. The workflow is triggered manually but can be started by a Schedule Trigger or by integrating directly with a candidate sourcing workflow (like the "Source Top GitHub Contributors" template).
2. A Google Sheets node reads a list of target GitHub user profile URLs from a master candidate sheet.
3. The Loop Over Items node processes each user one by one.
4. A Slack notification is sent at the beginning of the loop to announce that the scraping process has started for the user.
5. A BrowserAct node visits the user's GitHub profile URL and scrapes all available data, including profile info, repositories, and social links.
6. A custom Code node (labeled "Code in JavaScript") performs a critical task: it cleans, fixes, and consolidates the complex, raw scraped data into a single, clean JSON object.
7. The workflow then dynamically manages your output: it creates a new sheet dedicated to the user (named after them) and clears it to ensure a fresh report every time.
8. The consolidated data is separated into three paths: main profile data, links, and repositories.
9. Three final Google Sheets nodes then append the structured data to the user's dedicated sheet, creating a clear, multi-section report (User Data, User Links, User Repositories).
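The consolidation Code node described above can be sketched roughly as follows. The field names (profile, links, repositories) are assumptions for illustration; map them to whatever your BrowserAct template actually returns.

```javascript
// Hypothetical sketch of the "Code in JavaScript" consolidation step:
// normalize messy scraped fields into one clean JSON object.
function consolidate(raw) {
  return {
    user: {
      name: (raw.profile?.name || '').trim(),
      bio: (raw.profile?.bio || '').trim(),
      followers: Number(raw.profile?.followers) || 0, // scraped values often arrive as strings
    },
    links: (raw.links || []).filter(Boolean),          // drop null/empty entries
    repositories: (raw.repositories || []).map(r => ({
      name: r.name,
      stars: Number(r.stars) || 0,
    })),
  };
}

const clean = consolidate({
  profile: { name: ' Ada Lovelace ', bio: 'analyst', followers: '42' },
  links: ['https://example.com', null],
  repositories: [{ name: 'engine', stars: '7' }],
});
console.log(clean.user.followers); // → 42
```

Each of the three output paths (User Data, User Links, User Repositories) then reads one branch of this object.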
Requirements
- BrowserAct API account for web scraping
- BrowserAct "Scraping GitHub Users Activity & Data" template
- Output from the BrowserAct "Source Top GitHub Contributors by Language & Location" template
- BrowserAct n8n Community Node (n8n Nodes BrowserAct)
- Google Sheets credentials for input (candidate list) and structured output (individual user sheets)
- Slack credentials for sending notifications

Need Help?
- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- How to Use the BrowserAct n8n Community Node
- Workflow Guidance and Showcase: GitHub Data Mining - Extracting User Profiles & Repositories with n8n
by Ranjan Dailata
This workflow automatically scrapes Amazon price-drop data via Decodo, extracts structured product details with OpenAI, generates summaries and sentiment insights for each item, and saves everything to Google Sheets - creating a fully automated price-intelligence pipeline.

Disclaimer
Please note: this workflow is only available on n8n self-hosted, as it makes use of the Decodo Web Scraping community node.

Who this is for
This workflow is designed for e-commerce analysts, product researchers, price-tracking teams, and affiliate marketers who want to:
- Monitor daily Amazon product price drops automatically.
- Extract key information such as product name, price, discount, and links.
- Generate AI-driven summaries and sentiment insights on the latest deals.
- Store all structured data directly in Google Sheets for trend analysis and reporting.

What problem this workflow solves
- Eliminates the need for manual data scraping or tracking.
- Turns unstructured web data into structured datasets.
- Adds AI-generated summaries and sentiment analysis for smarter decision-making.
- Enables automated, daily price intelligence tracking across multiple product categories.

What this workflow does
This automation combines Decodo's web scraping, OpenAI GPT-4.1-mini, and Google Sheets to deliver an end-to-end price intelligence system.

1. Trigger & Setup: manually start the workflow and input your price-drop URL (default: CamelCamelCamel Daily Drops).
2. Web Scraping via Decodo: Decodo scrapes the Amazon price-drop listings and extracts product details (title, price, savings, product link).
3. LLM-Powered Data Structuring: the extracted content is sent to OpenAI GPT-4.1-mini to format and clean the output into structured JSON fields.
4. Loop & Deep Analysis: each product URL is revisited by Decodo for content enrichment, and the AI performs two analyses per product:
   - Summarization: generates a comprehensive summary of the product.
   - Sentiment Analysis: detects tone (positive/neutral/negative), sentiment score, and key topics.
5. Aggregation & Storage: all enriched results are merged and aggregated, and the structured data is automatically appended to a connected Google Sheet.

End Result: a ready-to-use dataset showing each price-dropped product, its summary, sentiment polarity, and key highlights, refreshed with every run.

Setup

Pre-requisite
Please make sure to install the n8n community node for Decodo.

Import and Connect Credentials
- Import the workflow into your n8n self-hosted instance.
- Connect:
  - OpenAI API (GPT-4.1-mini) - for summarization and sentiment analysis
  - Decodo API - for real-time price-drop scraping
  - Google Sheets OAuth2 - to save structured results

Configure Input Fields
In the "Set input fields" node, update price_drop_url to your target URL (e.g., https://camelcamelcamel.com/top_drops?t=weekly).

Run the Workflow
Click "Execute Workflow" or schedule it to run daily to automatically fetch and analyze new price-drop listings.

Check Output
The aggregated data is saved to a Google Sheet (Pricedrop Info). Each record contains:
- Product name
- Current price and savings
- Product link
- AI-generated summary
- Sentiment classification and score

How to customize this workflow
- Change Source: replace price_drop_url with another CamelCamelCamel or Amazon Deals URL, or add multiple URLs and loop through them for category-based price tracking.
- Modify Extraction Schema: in the Structured Output Parser, extend the JSON schema with fields like category, brand, rating, or availability.
- Tune AI Prompts: edit the Summarize Content and Sentiment Analysis nodes to add tone analysis (e.g., promotional vs. factual) or include competitive product comparison.
- Integrate More Destinations: replace Google Sheets with Airtable (no-code dashboards), PostgreSQL/MySQL (large-scale storage), or Notion/Slack (instant price-drop alerts).
- Automate Scheduling: add a Cron Trigger node to run this workflow daily or hourly.
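As a starting point for customizing the extraction schema, a Structured Output Parser schema might look like the object below. The field names are illustrative assumptions, not the template's actual schema; the commented fields show where the suggested additions (category, rating, etc.) would go.

```javascript
// Illustrative JSON Schema for the Structured Output Parser.
// Field names here are assumptions - align them with your own prompt.
const productSchema = {
  type: 'object',
  properties: {
    product_name:  { type: 'string' },
    current_price: { type: 'number' },
    savings:       { type: 'string' },
    product_link:  { type: 'string' },
    // Customization examples from the list above:
    category:      { type: 'string' },
    rating:        { type: 'number' },
  },
  required: ['product_name', 'current_price', 'product_link'],
};

console.log(productSchema.required.length); // → 3
```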
Summary
This workflow creates a fully automated price intelligence system that:
- Scrapes Amazon product price drops via Decodo.
- Extracts structured data with OpenAI GPT-4.1-mini.
- Generates AI-powered summaries and sentiment insights.
- Updates a connected Google Sheet with each run.
by Marsel Bait
How it works
This workflow lets users extract full YouTube video transcripts directly from Slack using n8n and AssemblyAI. When a user submits a YouTube link via a Slack slash command, the workflow validates the video duration and converts the video into an audio file. The audio is then sent to AssemblyAI for transcription. Once the transcription is complete, the workflow cleans and formats the transcript for readability and posts the full text back to Slack asynchronously.

Features
- Triggers from a Slack slash command with a YouTube link
- Validates video length before processing (maximum 10 minutes)
- Converts YouTube videos to MP3 for transcription
- Polls transcription status until completion
- Cleans and reformats the transcript for easy reading
- Posts the full transcript back to Slack without blocking the command

Use cases & expected outcomes
- YouTube lectures or tutorials: get a full transcript in Slack for reference or note-taking
- Podcast or interview videos: extract spoken content as text for quoting or analysis
- Product demos or walkthroughs: review video content quickly without rewatching

Perfect for
- Teams that need quick access to YouTube transcripts inside Slack
- Researchers, content creators, and note-takers
- Developers learning how to connect Slack, external APIs, and async workflows in n8n
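The "poll until completion" pattern used for the transcription status can be sketched as a generic helper. The AssemblyAI-specific endpoints are omitted here; `checkStatus` stands in for whatever call returns the current job state, and the status strings are assumptions.

```javascript
// Generic polling loop: repeatedly check a job's status until it completes,
// fails, or runs out of attempts. checkStatus is any async function that
// returns the current state (here assumed to be 'processing'/'completed'/'error').
async function pollUntilDone(checkStatus, { intervalMs = 1000, maxAttempts = 10 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const status = await checkStatus();
    if (status === 'completed') return status;
    if (status === 'error') throw new Error('transcription failed');
    await new Promise(resolve => setTimeout(resolve, intervalMs)); // back off
  }
  throw new Error('timed out waiting for transcription');
}

// Simulated job that completes on the third check:
let calls = 0;
pollUntilDone(async () => (++calls >= 3 ? 'completed' : 'processing'), { intervalMs: 10 })
  .then(status => console.log(status, calls)); // → completed 3
```

In n8n this loop is typically built with a Wait node plus an IF node rather than in-code, but the control flow is the same.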
by Jose Castillo
This workflow scrapes Google Maps business listings (e.g., carpenters in Tarragona) to extract websites and email addresses - perfect for lead generation, local business prospecting, or agency outreach.

How it works
1. Manual Trigger: start manually using the "Test Workflow" button.
2. Scrape Google Maps: fetches the HTML from a Google Maps search URL.
3. Extract URLs: parses all business links from the page.
4. Filter Google URLs: removes unwanted Google/tracking links.
5. Remove Duplicates + Limit: keeps unique websites (default: 100).
6. Scrape Site: fetches each website's HTML.
7. Extract Emails: detects valid email addresses.
8. Filter Out Empties & Split Out: isolates each valid email per site.
9. (Optional) Add to Google Sheet: appends results to your Sheet.

Use cases
- Local business leads: find emails of carpenters, dentists, gyms, etc., in your city.
- Agency outreach: collect websites and contact emails to pitch marketing services.
- B2B prospecting: identify businesses by niche and region for targeted campaigns.

Requirements
- n8n instance with HTTP Request and Code nodes enabled.
- (Optional) Google Sheets OAuth2 credentials. Tip: add a "Google Sheets - Append Row" node and connect it to your account.

Security
No personal or sensitive data is included - only credential references. If sharing this workflow, anonymize the "credentials" field before publishing.
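The "Extract Emails" step boils down to running a regex over raw HTML, dropping common false positives, and de-duplicating. The sketch below is a pragmatic approximation of that step (not a full RFC 5322 validator), and the filename filter is an assumption about typical noise.

```javascript
// Rough sketch of the "Extract Emails" step: regex-match addresses in HTML,
// filter out image-filename false positives (logo@2x.png etc.), de-duplicate.
function extractEmails(html) {
  const pattern = /[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/g;
  const found = html.match(pattern) || [];
  return [...new Set(found)].filter(e => !/\.(png|jpg|jpeg|gif|svg)$/i.test(e));
}

const html = '<a href="mailto:info@carpinteria.example">info@carpinteria.example</a> logo@2x.png';
console.log(extractEmails(html)); // → [ 'info@carpinteria.example' ]
```

The Split Out node then turns this array into one item per email for the Sheet append.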
by Davide
This workflow automates the process of creating short videos from multiple image references (up to 7 images). It uses the "Vidu Reference to Video" model, a video generation API that transforms a user-provided prompt and image set into a consistent, AI-generated video, and then uploads the result to TikTok and YouTube. The process is initiated via a user-friendly web form.

Advantages
- Consistent Video Creation: uses multiple reference images to maintain subject consistency across frames.
- Easy Input: just a simple form with prompt + image URLs.
- Automation: no manual waiting - the workflow checks status until the video is ready.
- SEO Optimization: automatically generates a catchy, optimized YouTube title using AI.
- Multi-Platform Publishing: uploads directly to Google Drive, YouTube, and TikTok in one flow.
- Time Saving: removes the repetitive tasks of video generation, download, and manual uploading.
- Scalable: can run periodically or on demand - perfect for content creators and marketing teams.
- UGC & Social Media Ready: designed for creating viral short videos optimized for platforms like TikTok and YouTube Shorts.

How It Works
1. Form Trigger: a user submits a web form with two key pieces of information: a text Prompt describing the desired video and a list of Reference images (URLs separated by commas or new lines).
2. Data Processing: the workflow processes the submitted image URLs, converting them from a text string into a proper array format for the AI API.
3. AI Video Generation: the processed data (prompt and image array) is sent to the Fal.ai VIDU API endpoint (reference-to-video) to start the video generation job. This node returns a request_id.
4. Status Polling: the workflow enters a loop where it periodically checks the status of the generation job using the request_id. It waits 60 seconds and then checks whether the status is "COMPLETED"; if not, it waits and checks again.
5. Result Retrieval: once the video is ready, the workflow fetches the URL of the generated video file.
6. Title Generation: simultaneously, the original user prompt is sent to an AI model (GPT-4o-mini via OpenRouter) to generate an optimized, engaging title for the social media post.
7. Upload & Distribution: the video file is downloaded from the generated URL. A copy is saved to a specified Google Drive folder for storage, and the video, along with the AI-generated title, is automatically uploaded to YouTube and TikTok via the Upload-Post.com API service.

Set Up Steps
This workflow requires configuration and API keys from three external services to function correctly.

Step 1: Configure Fal.ai for Video Generation
- Create an account and obtain your API key.
- In the "Create Video" HTTP node, edit the "Header Auth" credentials and set:
  - Name: Authorization
  - Value: Key YOUR_FAL_API_KEY (replace YOUR_FAL_API_KEY with your actual key)

Step 2: Configure Upload-Post.com for Social Media Uploads
- Get an API key from your Upload-Post Manage Api Keys dashboard (10 free uploads per month).
- In both the "HTTP Request" (YouTube) and "Upload on TikTok" nodes, edit their "Header Auth" credentials and set:
  - Name: Authorization
  - Value: Apikey YOUR_UPLOAD_POST_API_KEY (replace YOUR_UPLOAD_POST_API_KEY with your actual key)
- Crucial: in the body parameters of both upload nodes, find the user field and replace YOUR_USERNAME with the exact name of the social media profile you configured on Upload-Post.com (e.g., my_youtube_channel).

Step 3: Configure Google Drive (Optional Storage)
The "Upload Video" node is pre-configured to save the video to a Google Drive folder named "Fal.run". Ensure your Google Drive credentials in n8n are valid and that you have access to this folder, or change the folderId parameter to your desired destination.

Step 4: Configure AI for Title Generation
The "Generate title" node uses OpenAI to access the gpt-5-mini model.

Need help customizing?
Contact me for consulting and support, or add me on LinkedIn.
by Elvis Sarvia
Measure how well your AI classifier actually performs. This template shows how to evaluate a support ticket classifier using n8n's built-in evaluation system, comparing AI predictions against expected labels with exact match scoring.

What you'll do
1. Open the workflow and review the production path (webhook receives a ticket, AI Agent classifies it by category and urgency, response is returned).
2. Open the Evaluations tab and click Run Test to feed each Data Table row through the AI Agent.
3. Inspect per-test-case scores and aggregate metrics to see which tickets the classifier got right and which it missed.
4. Tweak the prompt or model, re-run, and compare runs side by side.

What you'll learn
- How n8n's Evaluation Trigger, Data Tables, and Evaluation node fit together
- How to use the "Check if Evaluating" operation to keep evaluation traffic out of production
- How to score structured AI outputs against known correct answers using exact match
- How to seed a test set from real execution history rather than synthetic examples

Why it matters
Classification accuracy that looked great in testing can quietly drop the moment your inputs shift. Building an evaluation path next to your production workflow gives you a repeatable way to measure quality, catch regressions before users do, and ship prompt changes with data instead of vibes. This template is a learning companion to the Production AI Playbook, a series that explores strategies, shares best practices, and provides practical examples for building reliable AI systems in n8n.
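Exact match scoring, as used here, is simply a per-case equality check over the structured fields, averaged into an accuracy metric. A minimal sketch (field names like `category`/`urgency` follow the template's description; the exact output shape of your agent may differ):

```javascript
// Minimal exact-match scorer: 1 if every labeled field matches, else 0.
function exactMatch(predicted, expected) {
  return predicted.category === expected.category &&
         predicted.urgency === expected.urgency ? 1 : 0;
}

const cases = [
  { predicted: { category: 'billing', urgency: 'high' }, expected: { category: 'billing', urgency: 'high' } },
  { predicted: { category: 'billing', urgency: 'low' },  expected: { category: 'billing', urgency: 'high' } },
];
const accuracy = cases.reduce((sum, c) => sum + exactMatch(c.predicted, c.expected), 0) / cases.length;
console.log(accuracy); // → 0.5
```

The Evaluation node aggregates these per-case scores for you; the point of exact match is that partial credit is deliberately impossible, which makes regressions easy to spot.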
by Vedad Sose
Generate 50 Meta ad copy variations informed by target audience insights, then validate with real human feedback to identify the top 10 performers - all before spending a dollar on ads.

Why This Matters
Traditional ad copy generation relies on AI guesswork about what might resonate. This workflow grounds every variation in real human insight from your target audience - what actually matters to them, their specific concerns, the language they respond to, the benefits they care about. Instead of burning ad budget testing generic variations, you start with copy shaped by authentic audience perspectives, then pre-validated by those same people before you spend a single dollar. The top 10 ads aren't AI's best guesses - they're ranked by real human feedback on what would genuinely make your audience stop scrolling and click.

How It Works
This workflow uses real human perspectives at two critical stages:
1. Generation (Audience-Informed): AI queries Digital Twins from your target demographic to understand their preferences, concerns, and emotional drivers. These insights directly shape the 50 ad variations, ensuring copy that speaks to real human motivations.
2. Validation (Pre-Tested): each variation is evaluated by Digital Twins matching your audience. They score and provide specific feedback on what resonates and what falls flat. Only the top 10 make it to your Google Sheet.

The result: pre-validated ad copy ranked by actual target audience feedback, not AI assumptions.
The Process
1. Submit your brief: product details and target audience description
2. AI gathers audience insights: queries Digital Twins to understand what matters to your demographic
3. Generate 50 informed variations: copy crafted around real preferences, pain points, and emotional triggers
4. Digital Twins validate: target audience evaluates each ad for resonance and effectiveness
5. Top 10 ranked output: Google Sheet with scores and specific human feedback

Each variation includes:
- Primary Text: main ad copy (the first 125 characters are the crucial hook)
- Headline: bold text below the creative (under 40 characters)
by Robert Breen
A hands-on starter workflow that teaches beginners how to:
- Pull rows from a Google Sheet
- Append a new record that mimics a form submission
- Generate AI-powered text with GPT-4o based on a "Topic" column
- Write the AI output back into the correct row using an update operation

Along the way you'll learn the three essential Google Sheets operations in n8n (read → append → update), see how to pass sheet data into an OpenAI node, and document each step with sticky-note instructions - perfect for anyone taking their first steps in no-code automation.

0. Prerequisites
Google Sheets
- Open Google Cloud Console and create or select a project.
- Enable the Google Sheets API under APIs & Services.
- Create an OAuth Desktop credential and connect it in n8n.
- Share the spreadsheet with the Google account linked to the credential.
OpenAI
- Create a secret key at https://platform.openai.com/account/api-keys.
- In n8n, go to Credentials → New, choose OpenAI API, and paste the key.
Sample sheet to copy (make your own copy and use its link):
https://docs.google.com/spreadsheets/d/15i9WIYpqc5lNd5T4VyM0RRptFPdi9doCbEEDn8QglN4/edit?usp=sharing

1. Trigger
Manual Trigger lets you run on demand while learning. (Swap in a Schedule or Webhook once you automate.)

2. Read existing rows
Node: Get Rows from Google Sheets - reads every row from Sheet1 of your copied file.

3. Generate a demo row
Node: Generate 1 Row of Data (Set node) - pretends a form was submitted: Name, Email, Topic, Submitted = "Yes".

4. Append the new row
Node: Append Data to Google - operation append writes to the first empty line.

5. Create a description with GPT-4o
- OpenAI Chat Model: uses your OpenAI credential.
- Write description (AI Agent): prompt = the Topic.
- Structured Output Parser: forces JSON like { "description": "…" }.

6. Update that same row
Node: Update Sheets data - operation update. Matches on the Email column to find the correct line and writes the new Description cell returned by GPT-4o.
7. Why this matters
- Demonstrates the three core Google Sheets operations: read → append → update.
- Shows how to enrich sheet data with an AI step and push the result right back.
- Sticky Notes provide inline docs so anyone opening the workflow understands the flow instantly.

Need help?
Robert Breen - Automation Consultant
Email: robert.j.breen@gmail.com
LinkedIn: https://www.linkedin.com/in/robert-breen-29429625/
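Conceptually, the update step in section 6 works like the sketch below: locate the row whose Email matches, then overwrite its Description. In n8n the Google Sheets node handles this for you via its "column to match on" setting; this plain-JavaScript version just illustrates the idea with hypothetical rows.

```javascript
// Update-by-match, as the Google Sheets "update" operation does it:
// find rows where Email matches and write the generated Description back.
function updateByEmail(rows, email, description) {
  return rows.map(row =>
    row.Email === email ? { ...row, Description: description } : row
  );
}

const rows = [
  { Name: 'Ana', Email: 'ana@example.com', Topic: 'n8n basics', Description: '' },
  { Name: 'Ben', Email: 'ben@example.com', Topic: 'APIs',       Description: '' },
];
const updated = updateByEmail(rows, 'ana@example.com', 'A beginner guide to n8n.');
console.log(updated[0].Description); // → A beginner guide to n8n.
```

This is also why the demo row includes an Email field: it is the unique key that lets the AI output land in the right row.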
by Nishant
Automated daily swing-trade ideas from end-of-day (EOD) data, scored by an LLM, logged to Google Sheets, and pushed to Telegram.

What this workflow does
- Fetches EOD quotes for a chosen stock universe (example: NSE-100 via RapidAPI).
- Cleans & filters the universe using simple technical/quality gates (e.g., price/volume sanity, avoiding illiquid names).
- Packages market context and feeds it to OpenAI with a strict JSON schema to produce top swing-trade recommendations (entry, target, stop, rationale).
- Splits the structured output into rows and logs them to a Google Sheet for tracking.
- Sends an alert with the day's trade ideas to Telegram (channel or DM).

Ideal for
- Retail traders who want a daily, hands-off idea generator.
- PMs/engineers prototyping LLM-assisted quant sidekicks.
- Creators who publish daily trade notes to their audience.

Tech stack
- n8n (orchestration)
- RapidAPI (EOD quotes; pluggable data source)
- OpenAI (LLM for idea generation)
- Google Sheets (logging & performance tracking)
- Telegram (alerts)

Prerequisites
- RapidAPI key with access to an EOD quotes endpoint for your exchange.
- OpenAI API key.
- Google account with a Sheet named Trade_Recommendations_Tracker (or update the node).
- Telegram bot token (via @BotFather) and destination chat ID.

> You can replace any of the above vendors with equivalents (e.g., Alpha Vantage, Twelve Data, Polygon, etc.). Only the HTTP Request + Format nodes need tweaks.
Environment variables

| Key                | Example                  | Used in               |
| ------------------ | ------------------------ | --------------------- |
| RAPIDAPI_KEY       | xxxxxxxxxxxxxxxxxxxxxxxx | HTTP Request (quotes) |
| OPENAI_API_KEY     | sk-…                     | OpenAI node           |
| TELEGRAM_BOT_TOKEN | 123456:ABC-DEF…          | Telegram node         |
| TELEGRAM_CHAT_ID   | 5357385827               | Telegram node         |

Google Sheet schema
Create a Sheet (tab: EOD_Ideas) with the headers:
Date, Symbol, Direction, Entry, Target, StopLoss, Confidence, Reason, SourceModel, UniverseTag

Node map (name → purpose)
- Trigger - Daily Market Close → fires daily after market close (e.g., 4:15 PM IST).
- Prepare Stock List (NSE 100) → provides stock symbols to analyze (static list or from a Sheet/API).
- Fetch EOD Data (RapidAPI) → gets EOD data for all symbols in one or batched calls.
- Format EOD Data → normalizes the API response to a clean array (symbol, close, high, low, volume, etc.).
- Filter Valid Stock Data → drops illiquid/invalid rows (e.g., keeps volume > 200k and close > 50).
- Build LLM Prompt Input → creates compact market context & JSON instructions for the model.
- Generate Swing Trade Ideas (OpenAI) → returns strict JSON with the top ideas.
- Split JSON Output (Trade-wise) → explodes the JSON array into individual items.
- Log Trade to Google Sheet → appends each idea as a row.
- Send Trade Alert to Telegram → publishes a concise summary to Telegram.
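The "Filter Valid Stock Data" gates can be sketched as a simple predicate over the normalized EOD rows. The thresholds below (volume > 200k, close > 50) come from the node description; the field names are illustrative and should match your Format EOD Data output.

```javascript
// Sketch of the liquidity/price gates: keep only rows with sane numeric
// fields, sufficient traded volume, and a minimum closing price.
function filterValid(rows, minVolume = 200000, minClose = 50) {
  return rows.filter(r =>
    Number.isFinite(r.close) && Number.isFinite(r.volume) &&
    r.volume > minVolume && r.close > minClose
  );
}

const universe = [
  { symbol: 'AAA', close: 512.4, volume: 1250000 }, // passes both gates
  { symbol: 'BBB', close: 32.1,  volume: 900000 },  // too cheap
  { symbol: 'CCC', close: 210.0, volume: 50000 },   // illiquid
];
console.log(filterValid(universe).map(r => r.symbol)); // → [ 'AAA' ]
```

Tighten or relax the defaults per exchange; the point of the gate is to keep illiquid or junk tickers out of the LLM's prompt context.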
by Punit
This n8n workflow automates the process of generating and publishing LinkedIn posts that align with your personal brand tone and trending tech topics. It uses OpenAI to create engaging content and matching visuals, posts directly to LinkedIn, and sends a confirmation via Telegram with post details.

Key Features
- Random Hashtag Selection: picks a trending tag from a custom list for post inspiration.
- AI-Generated Content: GPT-4o crafts a LinkedIn-optimized post in your personal writing style.
- Custom Image Generation: uses OpenAI to generate a relevant image for visual appeal.
- Direct LinkedIn Publishing: posts are made automatically to your profile with public visibility.
- Telegram Notification: you get a real-time Telegram alert with the post URL, tag, and timestamp.
- Writing Style Alignment: past posts are injected as examples to maintain a consistent tone.

Ideal Use Case: automate your daily or weekly LinkedIn presence with minimal manual effort while maintaining high-quality, relevant, and visually engaging posts.
by Robin
HOW IT WORKS - AI TELEGRAM EXPENSE TRACKER

This workflow transforms natural Telegram messages into structured expenses using AI - without forms, manual typing, or complex inputs. Simply send a message like:

Groceries 23€ yesterday

The workflow validates the sender, understands the intent, extracts structured data, and prepares the expense for approval before saving.

────────────────
WORKFLOW OVERVIEW

1. Secure Input Layer
Incoming Telegram messages are checked against a list of approved Chat IDs to ensure only authorized users can create expenses.

2. AI Expense Detection
An AI layer analyzes the message and decides whether it represents a real financial transaction. Non-expense messages are safely ignored to avoid noise in your data.

3. Smart Category Intelligence
Existing categories are loaded from Google Sheets and compared with the message content. If no suitable category exists, the workflow can suggest and learn new categories over time.

4. Structured Data Extraction
AI converts natural language into structured fields: date, amount, category, description, and shared vs. personal expense. Supports German and English input.

5. Human Approval & Storage
Before saving, the user confirms the extracted result directly via Telegram. After approval, the expense is appended to Google Sheets automatically.

────────────────
SETUP REQUIREMENTS

Before using this workflow, make sure the following components are ready:

1. Telegram Bot
Create a Telegram bot using BotFather and connect it to the Telegram Trigger node in n8n. Detailed setup instructions can be found here.

2. LLM API Access
An API key for a Large Language Model (LLM) is required for expense detection, category matching, and structured data extraction. Add your API credentials inside the AI node configuration.

3. Google Sheets
Create two Google Sheets before importing the workflow.
EXPENSES - required columns: date, amount, category, description, common_expense, Person
EXPENSE_CATEGORIES - required columns: category, description, examples

The workflow reads existing data and appends new entries automatically.

────────────────
KEY FEATURES
- AI-powered expense detection from natural language
- Self-learning category system
- Human-in-the-loop approval step
- Multi-language support (DE & EN)
- Clean Google Sheets integration
- Designed for real-life shared finance tracking

────────────────
MULTI-USER SUPPORT
Built for couples, roommates, or teams. Add multiple Chat IDs in: Security → Allow Approved Chat IDs. Each expense is automatically tagged with the sender. Shared expenses are stored as true in the common_expense column, while personal expenses default to false unless shared spending is detected. This allows easy downstream analysis, dashboards, or automation.
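The mapping from the AI's extracted fields to an EXPENSES sheet row can be sketched as below. The column names mirror the schema above; the `shared` flag name and the default-to-false logic are assumptions about the extraction output, shown here with a hypothetical message.

```javascript
// Map an extracted expense (e.g., from "Groceries 23€ yesterday") to a row
// matching the EXPENSES sheet columns. common_expense defaults to false
// unless the extractor explicitly flagged shared spending.
function toExpenseRow(extracted, sender) {
  return {
    date: extracted.date,
    amount: extracted.amount,
    category: extracted.category,
    description: extracted.description || '',
    common_expense: extracted.shared === true, // strict check → false by default
    Person: sender,                            // tag the row with the sender
  };
}

const row = toExpenseRow(
  { date: '2024-05-01', amount: 23, category: 'Groceries', shared: undefined },
  'Robin'
);
console.log(row.common_expense); // → false
```

Tagging each row with the sender is what makes the per-person and shared/personal breakdowns possible downstream.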