by HoangSP
Medical Q&A Chatbot for Urology using RAG with Pinecone and GPT-4o

This template provides an AI-powered Q&A assistant for the Urology domain using Retrieval-Augmented Generation (RAG). It uses Pinecone for vector search and GPT-4o for conversational responses.

**Use Case**
This chatbot is designed for clinics or medical pages that want to automate question answering for Urology-related conditions. It uses a vector store of domain knowledge to return verified responses.

**Requirements**
- OpenAI API key (GPT-4o or GPT-4o-mini)
- Pinecone account with an active index
- Verified Urology documents embedded into Pinecone

**Setup Instructions**
1. Create a Pinecone vector index and connect it using the Pinecone credentials node.
2. Upload Urology-related documents to embed using the Create Embeddings for Urology Docs node.
3. Customize the chatbot system message to reflect your medical specialty.
4. Deploy this chatbot on your website or link it with Telegram via the chat trigger node.

**Components**
- chatTrigger: Listens for user messages and starts the workflow.
- Medical AI Agent: GPT-based agent guided by domain-specific instructions.
- RAG Tool Vector Store: Fetches relevant documents from Pinecone using vector search.
- Memory Buffer: Maintains conversation context.
- Create Embeddings for Urology Docs: Encodes documents into vector format.

**Customization**
You can replace the knowledge base with any other medical domain by:
- Updating the documents stored in Pinecone.
- Modifying the system prompt in the AI Agent node.

**CTA**
This chatbot is ideal for clinics, medical consultants, or educational websites wanting a reliable AI assistant in Urology.
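For orientation, the retrieval step the RAG tool performs looks roughly like the TypeScript sketch below when written against the Pinecone and OpenAI SDKs directly. The index name, embedding model, and metadata field are assumptions; match them to your own embedding node configuration.

```typescript
// Minimal RAG retrieval sketch (assumes an index named "urology-docs" and
// OPENAI_API_KEY / PINECONE_API_KEY in the environment).
import OpenAI from "openai";
import { Pinecone } from "@pinecone-database/pinecone";

const openai = new OpenAI();
const pinecone = new Pinecone({ apiKey: process.env.PINECONE_API_KEY! });

async function answerQuestion(question: string): Promise<string> {
  // 1. Embed the user question with the same model used to embed the documents.
  const embedding = await openai.embeddings.create({
    model: "text-embedding-3-small", // assumption: match your embedding node
    input: question,
  });

  // 2. Retrieve the most relevant Urology passages from Pinecone.
  const index = pinecone.index("urology-docs"); // assumption: your index name
  const results = await index.query({
    vector: embedding.data[0].embedding,
    topK: 5,
    includeMetadata: true,
  });
  const context = results.matches
    .map((m) => String(m.metadata?.text ?? "")) // assumption: documents stored under "text"
    .join("\n---\n");

  // 3. Ask GPT-4o to answer strictly from the retrieved context.
  const completion = await openai.chat.completions.create({
    model: "gpt-4o",
    messages: [
      { role: "system", content: "Answer Urology questions using only the provided context." },
      { role: "user", content: `Context:\n${context}\n\nQuestion: ${question}` },
    ],
  });
  return completion.choices[0].message.content ?? "";
}
```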
by Airtop
Recursive Web Scraping

**Use Case**
Automating web scraping with recursive depth is ideal for collecting content across multiple linked pages - perfect for content aggregation, lead generation, or research projects.

**What This Automation Does**
This automation reads a list of URLs from a Google Sheet, scrapes each page, stores the content in a document, and adds newly discovered links back to the sheet. It continues this process for a specified number of iterations based on the defined scraping depth.

**Input Parameters**
- Seed URL: The starting URL to begin the scraping process. Example: https://example.com/
- Links must contain: Restricts the links to those that contain this specified string. Example: https://example.com/
- Depth: The number of iterations (layers of links) to scrape beyond the initial set. Example: 3

**How It Works**
1. Starts by reading the Seed URL from the Google Sheet.
2. Scrapes each page and saves its content to the specified document.
3. Extracts new links from each page that match the Links must contain string and appends them to the Google Sheet.
4. Repeats steps 2–3 for the number of times specified by Depth - 1.

**Setup Requirements**
- Airtop API Key - free to generate.
- Credentials set up for Google Docs (requires creating a project on Google Console). Read how to.
- Credentials set up for Google Spreadsheet.

**Next Steps**
- **Add Filtering Rules**: Filter which links to follow based on domain, path, or content type.
- **Combine with Scheduler**: Run this automation on a schedule to continuously explore newly discovered pages.
- **Export Structured Data**: Extend the process to store extracted data in a CSV or database for analysis.

Read more about website scraping for LLMs.
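The depth-limited crawl the workflow performs can be summarized in plain TypeScript. This is a control-flow sketch only: in the actual workflow Airtop does the scraping and Google Docs/Sheets do the storage, so `scrapePage` and `extractLinks` below are simple stand-ins, and whether the seed counts toward the depth follows the workflow's own Depth - 1 convention.

```typescript
// Depth-limited recursive scrape sketch. Plain fetch and a regex stand in for
// the Airtop scrape and link-extraction steps so the control flow is self-contained.
async function scrapePage(url: string): Promise<string> {
  const res = await fetch(url);
  return res.text();
}

function extractLinks(html: string, baseUrl: string): string[] {
  const links: string[] = [];
  for (const match of html.matchAll(/href="([^"]+)"/g)) {
    try {
      links.push(new URL(match[1], baseUrl).toString()); // resolve relative links
    } catch {
      /* ignore malformed hrefs */
    }
  }
  return links;
}

async function crawl(seedUrl: string, mustContain: string, depth: number): Promise<Map<string, string>> {
  const visited = new Map<string, string>(); // url -> page content
  let frontier = [seedUrl];

  // One pass for the seed layer, then further layers up to the configured depth.
  for (let layer = 0; layer < depth && frontier.length > 0; layer++) {
    const nextFrontier: string[] = [];
    for (const url of frontier) {
      if (visited.has(url)) continue;
      const content = await scrapePage(url);
      visited.set(url, content); // the workflow appends this content to a Google Doc
      for (const link of extractLinks(content, url)) {
        // Only follow links that contain the "Links must contain" string.
        if (link.includes(mustContain) && !visited.has(link)) nextFrontier.push(link);
      }
    }
    frontier = nextFrontier; // the workflow appends these URLs to the Google Sheet
  }
  return visited;
}
```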
by David Roberts
AI evaluation in n8n

This is a template for n8n's evaluation feature. Evaluation is a technique for getting confidence that your AI workflow performs reliably, by running a test dataset containing different inputs through the workflow. By calculating a metric (score) for each input, you can see where the workflow is performing well and where it isn't.

**How it works**
This template shows how to calculate a workflow evaluation metric: whether a category matches the expected one. The workflow takes support tickets and generates a category and priority, which is then compared with the correct answers in the dataset.

1. We use an evaluation trigger to read in our dataset. It is wired up in parallel with the regular trigger so that the workflow can be started from either one. More info
2. Once the category is generated by the agent, we check whether it matches the expected one in the dataset.
3. Finally we pass this information back to n8n as a metric.
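The comparison itself is a simple exact-match score. A minimal sketch of that step, written as JavaScript-compatible TypeScript in the style of an n8n Code node, is shown below; the field names `category` and `expected_category` are assumptions about your dataset columns.

```typescript
// Exact-match metric: 1 if the generated category equals the expected one, else 0.
// Runs in a Code node set to "Run Once for All Items"; adjust field names to your dataset.
return $input.all().map((item) => {
  const generated = String(item.json.category ?? "").trim().toLowerCase();
  const expected = String(item.json.expected_category ?? "").trim().toLowerCase();
  return {
    json: {
      ...item.json,
      category_match: generated === expected ? 1 : 0, // value passed back to n8n as the metric
    },
  };
});
```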
by David Roberts
AI evaluation in n8n

This is a template for n8n's evaluation feature. Evaluation is a technique for getting confidence that your AI workflow performs reliably, by running a test dataset containing different inputs through the workflow. By calculating a metric (score) for each input, you can see where the workflow is performing well and where it isn't.

**How it works**
This template shows how to calculate a workflow evaluation metric: text similarity, measured character by character. The workflow takes images of hand-written codes, extracts the code and compares it with the expected answer from the dataset. The images look like this:

The workflow works as follows:
1. We use an evaluation trigger to read in our dataset. It is wired up in parallel with the regular trigger so that the workflow can be started from either one. More info
2. We download the image and use AI to extract the code.
3. If we're evaluating (i.e. the execution started from the evaluation trigger), we calculate the string distance metric.
4. We pass this information back to n8n as a metric.
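For reference, a character-level string distance of the kind used here is typically a Levenshtein (edit) distance. The TypeScript sketch below is a minimal illustration of that idea, not the exact implementation the n8n node ships.

```typescript
// Levenshtein edit distance: the number of single-character insertions,
// deletions, and substitutions needed to turn `a` into `b`.
function levenshtein(a: string, b: string): number {
  const dp: number[] = Array.from({ length: b.length + 1 }, (_, j) => j);
  for (let i = 1; i <= a.length; i++) {
    let prev = dp[0]; // holds dp[i-1][j-1]
    dp[0] = i;
    for (let j = 1; j <= b.length; j++) {
      const temp = dp[j]; // dp[i-1][j]
      dp[j] = Math.min(
        dp[j] + 1,      // deletion from `a`
        dp[j - 1] + 1,  // insertion into `a`
        prev + (a[i - 1] === b[j - 1] ? 0 : 1), // substitution
      );
      prev = temp;
    }
  }
  return dp[b.length];
}

// A similarity score in [0, 1], where 1 means the extracted code matches exactly.
function similarity(extracted: string, expected: string): number {
  const maxLen = Math.max(extracted.length, expected.length) || 1;
  return 1 - levenshtein(extracted, expected) / maxLen;
}
```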
by Nick Saraev
AI Upwork Application Agent with OpenAI & Google Docs

Categories: AI Agents, Freelance Automation, Proposal Generation

This workflow creates an intelligent AI agent that automates Upwork job applications by generating highly personalized proposals, professional Google Doc presentations, and visual workflow diagrams. Built by someone who earned over $500,000 on Upwork, this system demonstrates the exact templates and strategies that achieve superior response rates through perceived customization and value demonstration.

**Benefits**
- **Complete Application Automation** - Transform job descriptions into custom proposals, documents, and diagrams in minutes
- **Proven Templates** - Based on $500K+ in Upwork earnings using exact strategies for high-converting applications
- **Intelligent Personalization** - AI analyzes job requirements and customizes responses with relevant social proof
- **Professional Asset Generation** - Creates Google Doc proposals and Mermaid workflow diagrams for enhanced perceived value
- **Modular Architecture** - Three specialized sub-workflows handle different aspects of proposal generation
- **High Response Rates** - Focuses on perceived customization and value demonstration over generic applications

**How It Works**

AI Agent Orchestration:
- Receives Upwork job descriptions through a chat interface
- Maintains conversation context with window buffer memory
- Coordinates three specialized sub-workflows for comprehensive proposal generation
- Automatically integrates generated assets into cohesive application packages

Application Copy Generation:
- Uses proven templates based on $500K+ Upwork success
- Follows the structure: "Hi, I do [thing] all the time. So confident I created a demo: [link]"
- Incorporates personal social proof and achievements automatically
- Generates concise, spartan-toned applications that avoid generic AI language

Google Doc Proposal Creation:
- Copies a professional proposal template from Google Drive
- Generates structured content including system title, explanation, scope, and timeline
- Uses find-and-replace to populate the template with AI-generated, personalized content (see the API-level sketch at the end of this template)
- Creates shareable documents with proper permissions for immediate client access

Mermaid Diagram Visualization:
- Analyzes job requirements to create relevant workflow diagrams
- Generates Mermaid.js code for professional flowchart visualization
- Provides a visual representation of proposed solutions
- Enhances perceived value through custom diagram creation

Smart Template Integration:
- Automatically replaces placeholder text with generated Google Doc links
- Maintains consistent messaging across all generated assets
- Ensures cohesive presentation of application, proposal, and supporting materials

**Required Setup Configuration**

Personal Information Setup:
Update the "aboutMe" variable in both Set Variable nodes with your details:
- Professional background and specializations
- Notable client achievements with specific revenue numbers
- Social proof elements (community size, subscriber count, etc.)
- Relevant project examples with quantified results

Google Services Integration:
- Google Drive API Setup: Enable the Google Drive API in Google Cloud Console, create OAuth2 credentials (Client ID and Client Secret), and connect n8n to Google Drive with proper permissions
- Google Docs Template: Copy the provided Google Docs proposal template to your Drive, update the template ID in the Google Drive node, and customize the template with your branding and standard language
- Google Docs API: Ensure the Google Docs API is enabled in your Google Cloud project and test document creation and sharing permissions

OpenAI API Configuration:
- Set up OpenAI API credentials across all OpenAI nodes
- Configure appropriate models (GPT-4o mini recommended for speed)
- Set temperature to 0.7 for an optimal personalization balance
- Monitor API usage to control costs

Template Customization:
- **Application Template**: Modify the proposal structure in the OpenAI prompts to match your services
- **Google Doc Template**: Update the document template with your standard proposal format
- **Personal Details**: Replace all placeholder information with your actual achievements and social proof

**Business Use Cases**
- **Freelance Professionals** - Automate high-quality Upwork applications across multiple job categories
- **Automation Specialists** - Demonstrate capabilities through the automated proposal generation itself
- **Service Providers** - Scale application volume while maintaining personalization quality
- **Agency Owners** - Offer proposal automation services to freelance clients
- **Consultants** - Streamline business development with automated custom proposals
- **Content Creators** - Generate professional project proposals with visual workflow representations

**Revenue Potential**
This system transforms freelance business development:
- **10x Application Speed**: Generate comprehensive proposals in minutes vs. hours
- **Higher Response Rates**: Perceived customization and value demonstration increase client engagement
- **Scalable Outreach**: Apply to more jobs with maintained quality through automation
- **Professional Positioning**: Visual diagrams and structured proposals demonstrate expertise
- **Competitive Advantage**: Deliver proposals faster than competitors through intelligent automation

Difficulty Level: Advanced
Estimated Build Time: 3-4 hours
Monthly Operating Cost: ~$30 (OpenAI + Google APIs)

**Watch My Complete Live Build**
Want to see me build this entire system from scratch? I walk through every component live - including the AI agent setup, prompt engineering strategies, Google Docs integration, and all the debugging that goes into creating a production-ready freelance automation system.

See My Live Build Process: "I Built An AI Agent That Automates Upwork ($500K+ Earned)"

This comprehensive tutorial shows the real development process - including advanced prompt engineering, modular workflow design, and the exact business strategies that generated $500K+ in Upwork revenue.
**Set Up Steps**

1. AI Agent Foundation:
   - Configure the chat trigger and AI agent node with OpenAI integration
   - Set up window buffer memory for conversation context
   - Define a system message with clear agent instructions and behavior rules
2. Sub-Workflow Creation:
   - Build three specialized workflows: Application Copy, Google Doc Proposal, Mermaid Code
   - Configure execute workflow triggers for each sub-workflow
   - Set up proper data passing between the agent and sub-workflows
3. Google Services Configuration:
   - Create a Google Cloud Console project with the Drive and Docs APIs enabled
   - Set up OAuth2 credentials and connect to n8n
   - Copy and customize the proposal template document
4. Personalization Setup:
   - Update all "aboutMe" variables with your specific achievements and social proof
   - Customize prompt templates to match your service offerings and communication style
   - Test individual sub-workflows with sample job descriptions
5. Agent Tool Integration:
   - Connect sub-workflows as tools in the main AI agent
   - Configure proper tool descriptions and response property names
   - Test complete agent functionality with realistic job posting scenarios
6. Template Optimization:
   - Refine proposal templates based on your specific service offerings
   - Adjust AI prompts for optimal personalization and response quality
   - Test with various job types to ensure consistent quality output

**Advanced Optimizations**
Scale the system with:
- **Job Scraping Integration**: Automatically discover and apply to relevant Upwork jobs
- **Response Tracking**: Monitor application success rates and optimize templates
- **Multi-Platform Support**: Extend to other freelance platforms (Fiverr, Freelancer, etc.)
- **Client Communication**: Automate follow-up sequences for proposal responses
- **Portfolio Integration**: Automatically include relevant portfolio pieces based on job requirements

**Important Considerations**
- **Template Authenticity**: Customize templates significantly to avoid detection as automated
- **Upwork Compliance**: Ensure applications meet platform guidelines and quality standards
- **Personal Branding**: Maintain a consistent voice and positioning across all generated content
- **Response Management**: Be prepared to handle increased application volume and client responses
- **Quality Control**: Regularly review and refine generated content for accuracy and relevance

**Why This System Works**
The competitive advantage lies in proven strategies:
- **Perceived Customization**: AI generates content that appears manually crafted for each job
- **Value Demonstration**: Visual diagrams and structured proposals show immediate value
- **Speed Advantage**: Deliver comprehensive proposals before competitors finish reading job posts
- **Professional Presentation**: Consistent quality and formatting across all applications
- **Scalable Personalization**: Maintain individual attention at volume through intelligent automation

**Check Out My Channel**
For more advanced automation systems and proven freelance business strategies that generate real revenue, explore my YouTube channel, where I share the exact methodologies used to build successful automation agencies and scale to $72K+ monthly revenue.
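The find-and-replace step that populates the copied proposal template corresponds to a Google Docs `documents.batchUpdate` call with `replaceAllText` requests. A minimal sketch using the `googleapis` Node client is shown below; the placeholder token style (`{{systemTitle}}`, `{{scope}}`) and the already-authorized `auth` client are assumptions, not values taken from the workflow.

```typescript
// Populate a copied proposal template by replacing placeholder tokens.
// Sketch only: assumes `auth` is an authorized OAuth2 client and that the
// template contains {{systemTitle}} / {{scope}} style placeholders.
import { google } from "googleapis";

async function fillProposal(
  auth: any,
  documentId: string,
  values: Record<string, string>, // e.g. { systemTitle: "...", scope: "..." }
) {
  const docs = google.docs({ version: "v1", auth });
  await docs.documents.batchUpdate({
    documentId,
    requestBody: {
      requests: Object.entries(values).map(([key, text]) => ({
        replaceAllText: {
          containsText: { text: `{{${key}}}`, matchCase: true },
          replaceText: text,
        },
      })),
    },
  });
}
```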
by sayamol thiramonpaphakul
This workflow automatically checks the status of your websites using the UptimeRobot API. If any site is down or unstable, it will:
- Generate a natural-language alert message using GPT-4o
- Push the message to a LINE group (with funny IT-style encouragement)
- Log all DOWN status entries into your Supabase database
- Wait 30 minutes before repeating

**How It Works**
1. Schedule Trigger - Runs on a fixed interval (every few minutes).
2. UptimeRobot Node - Fetches website monitor data.
3. Code Node (Filter) - Filters only websites with status 8 (may be down) or 9 (down).
4. IF Node - If any site is down, proceed.
5. LangChain LLM Node - Formats the alert with a humorous message using GPT-4o.
6. LINE Notify (HTTP Request) - Sends the alert to your LINE group.
7. Loop Over Items - Loops through all monitors.
8. Filter Down (Status = 9) - Selects only "fully down" sites.
9. Supabase Node - Logs these into the synlora_uptime_down table.
10. Wait Node - Delays the next alert by 30 minutes to avoid spamming.

**Setup Steps**

Required:
- UptimeRobot API Key
- LINE Channel Access Token and Group ID
- OpenAI Key (GPT-4o Mini)
- Supabase Project & Table

Step-by-step:
1. Go to UptimeRobot, get an API key, and ensure monitors are set up.
2. Create a Supabase table with the fields: website, status, uptime_id.
3. Create a LINE Messaging API bot, join it to your group, and get the Access Token and Group ID (userId or groupId).
4. Add your OpenAI API Key for GPT-4o Mini (or switch to your preferred LLM).
5. Import the workflow JSON into n8n.
6. Set credentials in all necessary nodes.
7. Activate the workflow.
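The filter in step 3 is a short piece of Code-node logic. A minimal JavaScript-compatible sketch is shown below; the `status` field name follows UptimeRobot's monitor payload, so adjust it if your node maps the data differently.

```typescript
// Keep only monitors that UptimeRobot reports as "seems down" (8) or "down" (9).
// Runs in an n8n Code node set to "Run Once for All Items".
const DOWN_STATUSES = [8, 9];

return $input.all().filter((item) => DOWN_STATUSES.includes(Number(item.json.status)));
```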
by assert
**Who this template is for**
This template is for every engineer who wants to automate their code reviews or just get a second opinion on their PR.

**How it works**
This workflow automatically reviews your changes in a GitLab PR using the power of AI. It triggers whenever you comment "+0" on a GitLab PR, gets the code changes, analyzes them with GPT, and replies to the PR discussion.

**Set up Steps**
1. Set up a webhook for note_events in your GitLab repository (see here on how to do it).
2. Configure ChatGPT credentials.
3. Comment "+0" on a merge request to trigger an automatic review by ChatGPT.
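The "+0" check the workflow performs on each incoming note event amounts to a small predicate like the sketch below. The field names follow GitLab's note_events webhook payload as I understand it, so verify them against your GitLab version before relying on this.

```typescript
// Decide whether an incoming GitLab webhook should trigger a review.
// Field names are based on GitLab's note_events payload (object_kind, object_attributes).
interface GitlabNoteEvent {
  object_kind: string;
  object_attributes?: { note?: string; noteable_type?: string };
}

function shouldTriggerReview(event: GitlabNoteEvent): boolean {
  return (
    event.object_kind === "note" &&
    event.object_attributes?.noteable_type === "MergeRequest" &&
    event.object_attributes?.note?.trim() === "+0"
  );
}
```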
by Yaron Been
VIP Radar: Instantly Spot & Summarize High-Value Shopify Orders with AI + Slack Alerts

Automatically detect when a new Shopify order exceeds $200, fetch the customer's purchase history, generate an AI-powered summary, and alert your team in Slack, so no VIP goes unnoticed.

**Workflow Overview**

| Feature | Description |
|---------|-------------|
| Trigger | Shopify "New Order" webhook |
| Conditional Check | Filters for orders > $200 |
| Data Enrichment | Pulls full order history for the customer from Shopify |
| AI Summary | Uses OpenAI to summarize buying behavior |
| Notification | Sends detailed alert to Slack with name, order total, and customer insights |
| Fallback | Ignores low-value orders and terminates the flow |

**What This Workflow Does**
This automation monitors your Shopify store and reacts to any high-value order (over $200). When triggered:
- It fetches all past orders of that customer,
- Summarizes the history using OpenAI,
- Sends a full alert with context to your Slack channel.

No more guessing who's worth a closer look. Your team gets instant insights, and your VIPs get the attention they deserve.

**Node-by-Node Breakdown**

1. Trigger: New Shopify Order
   - **Type**: Shopify Trigger
   - **Event**: orders/create
   - **Purpose**: Starts the workflow on a new order
   - **Pulls**: Order total, customer ID, name, etc.
2. Set: Convert Order Total to Number
   - Ensures the total_price is treated as a number for comparison.
3. If: Is Order > $200?
   - **Condition**: $json.total_price > 200
   - **Yes**: Continue
   - **No**: End workflow
4. HTTP: Fetch Customer Order History
   - Uses the Shopify Admin API to retrieve all orders from this customer. Requires your Shopify access token.
5. Set: Convert Orders Array to String
   - Formats the order data so it's prompt-friendly for OpenAI.
6. LangChain Agent: Summarize Order History
   - **Prompt**: "Summarize the customer's order history for Slack. Here is their order data: {{ $json.orders }}"
   - **Model**: GPT-4o Mini (customizable)
7. Slack: Send VIP Alert
   - Sends a rich message to a Slack channel. Includes: customer name, order value, summary of past behavior.
8. No-Op (Optional)
   - Used to safely end the workflow if the order is not high-value.

**How to Customize**

| What | How |
|------|-----|
| Order threshold | Change 200 in the If node |
| Slack channel | Update channelId in the Slack node |
| AI prompt style | Edit the text in the LangChain Agent node |
| Shopify auth token | Replace shpat_abc123xyz... with your actual private token |

**Setup Instructions**
1. Open the n8n editor.
2. Go to Workflows > Import > Paste JSON.
3. Paste this workflow JSON.
4. Replace your Shopify token and Slack credentials.
5. Save and activate.
6. Place a test order in Shopify to watch it work.

**Real-World Use Cases**
- Notify the sales team when a potential VIP buys
- Prep support reps with customer history
- Detect repeat buyers and upsell opportunities

**Resources & Support**
- Creator: Yaron Been
- YouTube: NoFluff with Yaron Been
- Website: https://nofluff.online
- Contact: Yaron@nofluff.online

**Tags**
#shopify, #openai, #slack, #vip-customers, #automation, #n8n, #workflow, #ecommerce, #customer-insights, #ai-summaries, #gpt4o
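Step 4 (Fetch Customer Order History) maps to a single Shopify Admin REST call. A minimal TypeScript sketch follows; the API version in the URL and the environment variable names are assumptions, so match them to your store's setup.

```typescript
// Fetch every past order for a customer via the Shopify Admin REST API.
// Assumptions: SHOPIFY_SHOP (e.g. "my-store") and SHOPIFY_ACCESS_TOKEN are set,
// and the 2024-01 API version is available on your store.
async function fetchCustomerOrders(customerId: number): Promise<unknown[]> {
  const shop = process.env.SHOPIFY_SHOP;
  const url =
    `https://${shop}.myshopify.com/admin/api/2024-01/orders.json` +
    `?customer_id=${customerId}&status=any&limit=250`;

  const res = await fetch(url, {
    headers: { "X-Shopify-Access-Token": process.env.SHOPIFY_ACCESS_TOKEN! },
  });
  if (!res.ok) throw new Error(`Shopify API error: ${res.status}`);

  const body = (await res.json()) as { orders: unknown[] };
  return body.orders; // passed on to the "Convert Orders Array to String" step
}
```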
by phil
This workflow automates voice reminders for upcoming appointments by generating a professional audio message and sending it to clients via email with the voice file attached. It integrates Google Calendar to track appointments, ElevenLabs to generate high-quality voice messages, and Gmail to deliver them efficiently.

**Who Needs Automated Voice Appointment Reminders?**
This automated voice appointment reminder system is ideal for businesses that rely on scheduled appointments. It helps reduce no-shows, improve client engagement, and streamline communication.
- Medical Offices & Clinics - Ensure patients receive timely appointment reminders.
- Real Estate Agencies - Keep potential buyers and renters informed about property visits.
- Service-Based Businesses - Perfect for salons, consultants, therapists, and coaches.
- Legal & Financial Services - Help clients remember important meetings and consultations.

If your business depends on scheduled appointments, this workflow saves time and enhances client satisfaction.

**Why Use This Workflow?**
- Ensures clients receive timely reminders.
- Reduces appointment no-shows and scheduling issues.
- Automates the process with a personalized voice message.

**Step-by-Step: How This Workflow Automates Voice Reminders**
1. Trigger the Workflow - The system runs manually or on a schedule to check upcoming appointments in Google Calendar.
2. Retrieve Appointment Data - It fetches event details (client name, time, and location) from Google Calendar. The workflow uses the summary, start.dateTime, location, and attendees[0].email fields from Google Calendar to personalize and send the voice reminders.
3. Generate a Voice Reminder - Using ElevenLabs, the workflow converts the appointment details into a natural-sounding voice message.
4. Send via Email - The generated audio file is attached to an email and sent to the client as a reminder.

**Customization: Tailor the Workflow to Your Business Needs**
- Adjust Trigger Frequency - Modify the scheduling to run daily, hourly, or at specific intervals.
- Customize Voice Message Format - Change the script structure and voice tone to match your business needs.
- Change Notification Method - Instead of email, integrate SMS or WhatsApp for delivery.

**Prerequisites**
- **Google Calendar Access** - Ensure you have access to the calendar with scheduled appointments.
- **ElevenLabs API Key** - Required for generating voice messages (you can start for free).
- **Gmail API Access** - Needed for sending reminder emails.
- **n8n Setup** - The workflow runs on an n8n instance (self-hosted or cloud).

**Step-by-Step Installation & Setup**
1. Set Up Google Calendar API
   - Go to Google Cloud Console.
   - Create a new project and enable the Google Calendar API.
   - Generate OAuth 2.0 credentials and save them for n8n.
2. Get an ElevenLabs API Key
   - Sign up at ElevenLabs.
   - Retrieve your API key from the dashboard.
3. Configure Gmail API
   - Enable the Gmail API in Google Cloud Console.
   - Create OAuth credentials and authorize your email address for sending.
4. Deploy n8n & Install the Workflow
   - Install n8n (Installation Guide).
   - Add the required Google Calendar, ElevenLabs, and Gmail nodes.
   - Import or build the workflow with the correct credentials.
   - Test and fine-tune as needed.

Important: The LangChain Community node used in this workflow only works on self-hosted n8n instances. It is not compatible with n8n Cloud. Please ensure you are running a self-hosted instance before using this workflow.

**Summary**
This workflow ensures a professional and seamless experience for your clients, keeping them informed and engaged.

Phil | Inforeole
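The "Generate a Voice Reminder" step is an HTTP call to ElevenLabs' text-to-speech endpoint. A minimal sketch follows; the voice ID, model ID, and output path are placeholders, and you should confirm the endpoint details against the current ElevenLabs API documentation.

```typescript
// Generate an MP3 reminder from the appointment text via ElevenLabs text-to-speech.
// Assumptions: ELEVENLABS_API_KEY is set; VOICE_ID and model_id are placeholders.
import { writeFile } from "node:fs/promises";

const VOICE_ID = "your-voice-id"; // placeholder: pick a voice in the ElevenLabs dashboard

async function generateReminderAudio(message: string, outPath: string): Promise<void> {
  const res = await fetch(`https://api.elevenlabs.io/v1/text-to-speech/${VOICE_ID}`, {
    method: "POST",
    headers: {
      "xi-api-key": process.env.ELEVENLABS_API_KEY!,
      "Content-Type": "application/json",
      Accept: "audio/mpeg",
    },
    body: JSON.stringify({
      text: message, // e.g. "Hi Jane, this is a reminder of your appointment tomorrow at 10am."
      model_id: "eleven_multilingual_v2", // assumption: check the models available on your plan
    }),
  });
  if (!res.ok) throw new Error(`ElevenLabs API error: ${res.status}`);

  // The response body is raw MP3 audio; save it so it can be attached to the reminder email.
  await writeFile(outPath, Buffer.from(await res.arrayBuffer()));
}
```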
by Adam Bertram
An AI-powered chat assistant that analyzes Azure virtual machine activity and generates detailed timeline reports showing VM state changes, performance metrics, and operational events over time.

**How It Works**
The workflow starts with a chat trigger that accepts user queries about Azure VM analysis. A Google Gemini AI agent processes these requests and uses six specialized tools to gather comprehensive VM data from Azure APIs. The agent queries resource groups, retrieves VM configurations and instance views, pulls performance metrics (CPU, network, disk I/O), and collects activity log events. It then analyzes this data to create timeline reports showing what happened to VMs during specified periods, defaulting to the last 90 days unless the user specifies otherwise.

**Prerequisites**
To use this template, you'll need:
- n8n instance (cloud or self-hosted)
- Azure subscription with virtual machines
- Microsoft Azure Monitor OAuth2 API credentials
- Google Gemini API credentials
- Proper Azure permissions to read VM data and activity logs

**Setup Instructions**
1. Import the template into n8n.
2. Configure credentials:
   - Add Microsoft Azure Monitor OAuth2 API credentials with read permissions for VMs and activity logs
   - Add Google Gemini API credentials
3. Update workflow parameters:
   - Open the "Set Common Variables" node
   - Replace <your azure subscription id here> with your actual Azure subscription ID
4. Configure triggers:
   - The chat trigger will automatically generate a webhook URL for receiving chat messages
   - No additional trigger configuration is needed
5. Test the setup to ensure it works.

**Security Considerations**
Use the minimum required Azure permissions (Reader role on the subscription or resource groups). Store API credentials securely in the n8n credential store. The Azure Monitor API has rate limits, so avoid excessive concurrent requests. Chat sessions use session-based memory that persists during conversations but doesn't retain data between separate chat sessions.

**Extending the Template**
You can add more Azure monitoring tools like disk metrics, network security group logs, or Application Insights data. The AI agent can be enhanced with additional tools for Azure cost analysis, security recommendations, or automated remediation actions. You could also integrate with alerting systems or export reports to external storage or reporting platforms.
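For reference, the kind of performance-metric request the agent's tools issue corresponds to the Azure Monitor REST metrics endpoint. The sketch below is illustrative only; the subscription, resource group, and VM names are placeholders, the bearer token acquisition is out of scope, and the api-version should be checked against current Azure documentation.

```typescript
// Query daily average CPU for one VM over the last 90 days via the Azure Monitor REST API.
// Assumptions: `token` is a valid ARM bearer token; resource names are placeholders.
async function getVmCpuMetrics(
  token: string,
  subscriptionId: string,
  resourceGroup: string,
  vmName: string,
) {
  const resourceId =
    `/subscriptions/${subscriptionId}/resourceGroups/${resourceGroup}` +
    `/providers/Microsoft.Compute/virtualMachines/${vmName}`;

  const end = new Date();
  const start = new Date(end.getTime() - 90 * 24 * 60 * 60 * 1000); // default 90-day window

  const params = new URLSearchParams({
    "api-version": "2018-01-01", // assumption: verify the current Azure Monitor api-version
    metricnames: "Percentage CPU",
    timespan: `${start.toISOString()}/${end.toISOString()}`,
    interval: "P1D",
    aggregation: "Average",
  });

  const res = await fetch(
    `https://management.azure.com${resourceId}/providers/microsoft.insights/metrics?${params}`,
    { headers: { Authorization: `Bearer ${token}` } },
  );
  if (!res.ok) throw new Error(`Azure Monitor error: ${res.status}`);
  return res.json(); // time series of daily average CPU values
}
```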
by Oneclick AI Squad
This automated n8n workflow monitors ingredient price changes from external APIs or manual sources, analyzes historical trends, and provides smart buying recommendations. The system tracks price fluctuations in a PostgreSQL database, generates actionable insights, and sends alerts via email and Slack to help restaurants optimize their purchasing decisions.

**What is Price Trend Analysis?**
Price trend analysis uses historical price data to identify patterns and predict optimal buying opportunities. The system analyzes price movements over time and generates recommendations on when to buy ingredients based on current trends and historical patterns.

**Good to Know**
- Price data accuracy depends on the reliability of external API sources
- Historical data improves recommendation accuracy over time (recommended minimum 30 days)
- PostgreSQL provides robust data storage and complex trend analysis capabilities
- Real-time alerts help capture optimal buying opportunities
- The dashboard provides visual insights into price trends and recommendations

**How It Works**
1. Daily Price Check - Triggers the workflow daily to monitor price changes
2. Fetch API Prices - Retrieves the latest prices from an external ingredient pricing API
3. Setup Database - Ensures database tables are ready before inserting new data
4. Store Price Data - Saves current prices to the PostgreSQL database for tracking
5. Calculate Trends - Analyzes historical prices to detect patterns and price movements
6. Generate Recommendations - Suggests actions based on price trends (buy/wait/stock up)
7. Store Recommendations - Saves recommendations for future reporting
8. Get Dashboard Data - Gathers the data needed for dashboard generation
9. Generate Dashboard HTML - Builds an HTML dashboard to visualize insights
10. Send Email Report - Emails the dashboard report to stakeholders
11. Send Slack Alert - Sends key alerts or recommendations to Slack channels

**Database Structure**
The workflow uses PostgreSQL with two main tables:

price_history - historical price tracking, with columns:
- id (Primary Key)
- ingredient (VARCHAR 100) - Name of the ingredient
- price (DECIMAL 10,2) - Current price value
- unit (VARCHAR 50) - Unit of measurement (kg, lbs, etc.)
- supplier (VARCHAR 100) - Source supplier name
- timestamp (TIMESTAMP) - When the price was recorded
- created_at (TIMESTAMP) - Record creation time

buying_recommendations - AI-generated buying suggestions, with columns:
- id (Primary Key)
- ingredient (VARCHAR 100) - Ingredient name
- current_price (DECIMAL 10,2) - Latest price
- price_change_percent (DECIMAL 5,2) - Percentage change from the previous price
- trend (VARCHAR 20) - Price trend direction (INCREASING/DECREASING/STABLE)
- recommendation (VARCHAR 50) - Buying action (BUY_NOW/WAIT/STOCK_UP)
- urgency (VARCHAR 20) - Urgency level (HIGH/MEDIUM/LOW)
- reason (TEXT) - Explanation for the recommendation
- generated_at (TIMESTAMP) - When the recommendation was created

**Price Trend Analysis**
The system analyzes historical price data over the last 30 days to calculate percentage changes, identify trends (INCREASING/DECREASING/STABLE), and generate actionable buying recommendations based on price patterns and movement history.
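A minimal sketch of the Calculate Trends / Generate Recommendations logic is shown below. The 5% thresholds and the mapping from trend to action are illustrative assumptions, not the exact values used in the workflow's nodes, and should be tuned to your purchasing patterns.

```typescript
// Classify a price trend and derive a buying recommendation from recent history.
// The ±5% thresholds are illustrative; tune them to your ingredients and suppliers.
type Trend = "INCREASING" | "DECREASING" | "STABLE";
interface Recommendation {
  trend: Trend;
  changePercent: number;
  action: "BUY_NOW" | "WAIT" | "STOCK_UP";
  urgency: "HIGH" | "MEDIUM" | "LOW";
}

function recommend(history: { price: number; timestamp: string }[]): Recommendation {
  // `history` is assumed to be the last 30 days of prices for one ingredient, oldest first.
  const first = history[0].price;
  const latest = history[history.length - 1].price;
  const changePercent = ((latest - first) / first) * 100;

  const trend: Trend =
    changePercent > 5 ? "INCREASING" : changePercent < -5 ? "DECREASING" : "STABLE";

  if (trend === "DECREASING") {
    // Prices falling: a good window to stock up before they recover.
    return { trend, changePercent, action: "STOCK_UP", urgency: "MEDIUM" };
  }
  if (trend === "INCREASING") {
    // Prices rising: buy now before they climb further.
    return { trend, changePercent, action: "BUY_NOW", urgency: "HIGH" };
  }
  return { trend, changePercent, action: "WAIT", urgency: "LOW" };
}
```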
**How to Use**
1. Import the workflow into n8n
2. Configure PostgreSQL database connection credentials
3. Set up external ingredient pricing API access
4. Configure email credentials for dashboard reports
5. Set up Slack webhook or bot credentials for alerts
6. Run the Setup Database node to create the required tables and indexes
7. Test with sample ingredient data to verify price tracking and recommendations
8. Adjust trend analysis parameters based on your purchasing patterns
9. Monitor recommendations and refine thresholds based on actual buying decisions

**Requirements**
- PostgreSQL database access
- External ingredient pricing API credentials
- Email service credentials (Gmail, SMTP, etc.)
- Slack webhook URL or bot credentials
- Historical price data for initial trend analysis

**Customizing This Workflow**
Modify the Calculate Trends node to adjust the analysis period (currently 30 days) or add seasonal adjustments. Customize the recommendation logic to match your restaurant's buying patterns, budget constraints, or supplier agreements. Add additional data sources like weather forecasts or market reports for more sophisticated predictions.
by Yosua Surojo
**Who it's for**
This workflow is for anyone who wants to build an automated, AI-enhanced reading list. Ideal for:
- Knowledge workers and researchers who collect and organize articles
- Students managing study materials
- Productivity hackers who use Telegram and Notion for personal knowledge management
- Anyone using the AI-Enhanced Knowledge Base Tracker Notion Template

**How it works**
This workflow takes any article link sent to your Telegram bot and automatically:
1. Parses the article into a clean title and body
2. Uses OpenAI to generate a 1–2 sentence highlight and topic tag
3. Saves it into your Notion database
4. Sends a confirmation message with the highlight and Notion link back to Telegram

Main steps:
1. Telegram Trigger - Listens for incoming messages containing an article link.
2. Fetch Article Title & Content - Calls the article-parser-api deployed on Vercel to fetch and parse the article content into structured JSON (title and content).
3. Generate Highlight + Tag (AI Agent) - Processes the parsed content to generate Highlight and Type tag values.
4. Structured Metadata for Notion - Adjusts the extracted data before saving it to Notion.
5. Save Article to Notion Database - Inserts the article and generated metadata into your Notion knowledge base.
6. Confirm Save via Telegram - Sends a confirmation message and the Notion page link back to the Telegram bot chat after the entry is created.

**Setup**
1. Create and connect your API credentials:
   - Telegram Bot
   - OpenAI API Key
   - Notion Integration
2. Deploy the article parser:
   - Use this repo: article-parser-api
   - Deploy it to Vercel or any serverless environment
3. Link your Notion database:
   - Duplicate the AI-Enhanced Knowledge Base Tracker
   - Copy the database URL and connect it in the Notion node
4. Test your workflow:
   - Click Execute workflow
   - Send an article link to your Telegram bot
   - Once verified, activate the workflow so it runs automatically

**Requirements**
- Telegram bot token
- OpenAI API key
- Notion integration and shared database
- A deployed article parser (e.g., article-parser-api)

**Optional customization**
- Edit the AI Agent prompt to change the tone or tagging style
- Add filtering or additional fields in the Edit Fields node
- Trigger from other sources (e.g., Slack or Email)
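The "Save Article to Notion Database" step corresponds to a Notion pages.create call. A minimal sketch against the raw Notion REST API is shown below; the property names (Name, Highlight, Type, URL) are assumptions about the template's database schema, so align them with your own columns before use.

```typescript
// Create a page in the Notion reading-list database for a parsed article.
// Assumptions: NOTION_TOKEN and NOTION_DATABASE_ID are set, and the database has
// "Name" (title), "Highlight" (rich text), "Type" (select), and "URL" properties.
async function saveArticle(article: { title: string; url: string; highlight: string; type: string }) {
  const res = await fetch("https://api.notion.com/v1/pages", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.NOTION_TOKEN}`,
      "Notion-Version": "2022-06-28",
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      parent: { database_id: process.env.NOTION_DATABASE_ID },
      properties: {
        Name: { title: [{ text: { content: article.title } }] },
        Highlight: { rich_text: [{ text: { content: article.highlight } }] },
        Type: { select: { name: article.type } },
        URL: { url: article.url },
      },
    }),
  });
  if (!res.ok) throw new Error(`Notion API error: ${res.status}`);
  return res.json(); // includes the new page's URL for the Telegram confirmation message
}
```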