by Nskha
A robust n8n workflow designed to enhance Telegram bot functionality for user management and broadcasting. It facilitates automatic support ticket creation, efficient user data storage in Redis, and a sophisticated system for message forwarding and broadcasting.

How It Works

- **Telegram Bot Setup:** Initiate the workflow with a Telegram bot configured for handling different chat types (private, supergroup, channel).
- **User Data Management:** Formats and updates user data, storing it in a Redis database for efficient retrieval and management.
- **Support Ticket Creation:** Automatically generates chat tickets for user messages and saves the corresponding topic IDs in Redis.
- **Message Forwarding:** Forwards new messages to the appropriate chat thread, or creates a new thread if none exists.
- **Support Forum Management:** Handles messages within a support forum, differentiating between various chat types and user statuses.
- **Broadcasting System:** Implements a broadcasting mechanism that sends channel posts to all previous bot users, with a system to filter out blocked users.
- **Blocked User Management:** Identifies and manages blocked users, preventing them from receiving broadcasted messages.
- **Versatile Channel Handling:** Ensures that messages from verified channels are properly managed and broadcasted to relevant users.

Set Up Steps

- **Estimated Time:** Around 30 minutes.
- **Requirements:** A Telegram bot, a Redis database, and Telegram group/channel IDs are necessary.
- **Configuration:** Input the Telegram bot token and relevant group/channel IDs. Configure message handling and user data processing according to your needs.
- **Detailed Instructions:** Sticky notes within the workflow provide extensive setup information and guidance.

Live Demo Workflow

- Bot: Telegram Bot Link (Click here)
- Support Group: Telegram Group Link (Click here)
- Broadcasting Channel: Telegram Channel Link (Click here)

Keywords: n8n workflow, Telegram bot, chat ticket system, Redis database, message broadcasting, user data management, support forum automation
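To make the Redis side more concrete, here is a minimal sketch, assuming a node-redis client and hypothetical key names (`ticket:<chatId>`, `bot:users`, `bot:blocked`), of how a chat-to-topic mapping and a broadcast list might be stored. The actual workflow defines its own keys inside its nodes, so treat this purely as an illustration.

```typescript
// Hypothetical Redis layout: map each Telegram user chat to its support-forum topic,
// keep a set of known users for broadcasting, and a set of blocked users to skip.
import { createClient } from "redis";

async function main() {
  const redis = createClient({ url: "redis://localhost:6379" });
  await redis.connect();

  const userChatId = "123456789"; // Telegram private chat ID (placeholder)
  const forumTopicId = "42";      // message_thread_id of the support-forum topic (placeholder)

  // Remember which forum topic belongs to this user, and register the user for broadcasts.
  await redis.set(`ticket:${userChatId}`, forumTopicId);
  await redis.sAdd("bot:users", userChatId);

  // On an incoming message: reuse the existing topic, or create a new one first.
  const existingTopic = await redis.get(`ticket:${userChatId}`);
  console.log(existingTopic ? `Forward to topic ${existingTopic}` : "Create a new topic first");

  // Broadcast list = all known users minus those who blocked the bot.
  await redis.sAdd("bot:blocked", "987654321");
  const recipients = await redis.sDiff(["bot:users", "bot:blocked"]);
  console.log(`${recipients.length} broadcast recipients`);

  await redis.quit();
}

main().catch(console.error);
```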
by Juan Carlos Cavero Gracia
This automation template turns any long video into multiple viral-ready short clips and auto-schedules them to TikTok, Instagram Reels, and YouTube Shorts. It works with both vertical and horizontal inputs and respects the original input resolution (no unnecessary upscaling), cropping or letterboxing intelligently when needed. The workflow automatically extracts between 3 and 6 clips (based on video length and the most engaging segments) and schedules one short per consecutive day: e.g., 3 clips cover the next 3 days, 6 clips the next 6 days.

*Note: This workflow uses OpenAI Whisper for word-level transcription, Google's Gemini for clip selection and metadata, and Upload-Post's FFmpeg API for GPU-accelerated cutting/cropping and social scheduling. You can use the same Upload-Post API token for both FFmpeg jobs and publishing uploads. Upload-Post also offers a generous free trial with no credit card required.*

Who Is This For?

- **Creators & Editors:** Batch-convert long talks/podcasts into daily Shorts/Reels/TikToks.
- **Agencies & Social Teams:** Turn webinars/interviews into a reliable short-form stream.
- **Brands & Founders:** Maintain a steady posting cadence with minimal hands-on editing.

What Problem Does This Workflow Solve?

Manual clipping is slow and inconsistent. This workflow:

- **Finds Hooks Automatically:** AI picks 3–6 high-retention segments from the transcript and timestamps (count scales with video length/quality).
- **Cuts Cleanly:** Absolute-second FFmpeg timing to avoid mid-word cuts.
- **Vertical & Horizontal Friendly:** Handles both orientations and respects source resolution.
- **Schedules for You:** Posts one clip per day on consecutive days.

How It Works

1. Form Upload: Submit your long video.
2. Audio Extraction: An FFmpeg job extracts audio for accurate ASR.
3. Whisper Transcription: Word-level timestamps enable precise clipping.
4. AI Clip Mining (Gemini): Detects 3–6 "viral" moments (15–60s) and generates titles/descriptions.
5. Cut & Crop (FFmpeg): A GPU pipeline produces clean clips; preserves input resolution/orientation when possible and crops/pads appropriately for target platforms.
6. Status & Download: Polls job status and retrieves the final clips.
7. Auto-Scheduling (Consecutive Days): Schedules one short per day starting tomorrow, for as many days as clips were produced (e.g., 3 clips → 3 days, 6 clips → 6 days) at a configurable time (default 20:00 Europe/Madrid). A sketch of this date arithmetic appears at the end of this section.

Setup

- OpenAI (Whisper): Add your OpenAI API credentials.
- Google Gemini: Add the Gemini credentials used by the AI Agent node.
- Upload-Post (free trial, no credit card required): Generate your API token at https://app.upload-post.com/, connect your social media accounts, and add your API token credentials in n8n (the same token works for FFmpeg jobs and publishing).
- Scheduling: Adjust the posting time/intervals and timezone (Europe/Madrid by default).
- Metadata Mapping: Titles/descriptions are auto-generated per platform; tweak as needed.

Requirements

- **Accounts:** n8n, OpenAI, Google (Gemini), Upload-Post, and social platform connections.
- **API Keys:** OpenAI token, Gemini credentials, Upload-Post token.
- **Budget:** Whisper + Gemini inference + FFmpeg compute + optional posting costs.

Features

- **Word-Accurate Cuts:** Absolute-second timecodes with subtle pre/post-roll.
- **Orientation-Aware:** Supports vertical and horizontal inputs; preserves source resolution where possible.
- **Platform-Optimized Output:** 9:16-ready delivery with smart crop/pad behavior.
- **Consecutive-Day Scheduler:** 3–6 clips → 3–6 consecutive posting days, automatically.
- **Retry & Polling:** Built-in waits and status checks for robust processing.
- **Modular:** Swap models, adjust clip count/length, or add/remove platforms quickly.

Turn long-form video into a consistent sequence of Shorts/Reels/TikToks—automatically, day after day, while respecting your source resolution.
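As a hedged illustration of the consecutive-day scheduling step referenced in "How It Works" above, here is a minimal TypeScript sketch that computes one posting slot per clip, starting tomorrow, at a fixed time in the Europe/Madrid timezone. The slot structure is an assumption for illustration; the actual template assembles these values inside its n8n nodes before handing them to Upload-Post.

```typescript
// Minimal sketch: one posting slot per clip, starting tomorrow, at a fixed local time.
// The payload format expected by the scheduler is an assumption, not the template's exact shape.
interface PostingSlot {
  clipIndex: number;
  date: string;      // YYYY-MM-DD
  time: string;      // HH:mm, interpreted in `timeZone`
  timeZone: string;
}

function scheduleSlots(clipCount: number, time = "20:00", timeZone = "Europe/Madrid"): PostingSlot[] {
  const slots: PostingSlot[] = [];
  for (let i = 1; i <= clipCount; i++) {
    const d = new Date();
    d.setDate(d.getDate() + i);             // tomorrow, the day after, ...
    slots.push({
      clipIndex: i,
      date: d.toISOString().slice(0, 10),   // YYYY-MM-DD (UTC date; close enough for a sketch)
      time,
      timeZone,
    });
  }
  return slots;
}

// e.g. 4 extracted clips -> 4 consecutive posting days
console.log(scheduleSlots(4));
```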
by Femi Ad
Description

AI-Powered Business Idea Generation & Social Media Content Strategy Workflow

This intelligent content discovery and strategy system features 15 nodes that automatically monitor Reddit communities, analyze business opportunities, and generate targeted social media content for AI automation agencies and entrepreneurs. It leverages AI classification, structured analysis, and automated content creation to transform community discussions into actionable business insights and marketing materials.

Core Components

- Reddit Intelligence: Multi-subreddit monitoring across AI automation, n8n, and entrepreneur communities with keyword-based filtering.
- AI Classification Engine: Intelligent categorization of posts into "Questions" vs "Requests" using LangChain text classification.
- Dual Analysis System: Specialized AI agents for educational content (questions) and sales-focused content (service requests).
- Content Strategy Generator: Automated creation of LinkedIn and Twitter content tailored to different audience engagement strategies.
- Telegram Integration: Real-time delivery of formatted content strategies and business insights.
- Structured Output Processing: JSON-formatted analysis with relevancy scores, feasibility assessments, and actionable content recommendations.

Target Users

• AI Automation Agency Owners seeking consistent lead generation and thought leadership content
• Entrepreneurs wanting to identify market opportunities and position themselves as industry experts
• Content Creators in the automation/AI space needing data-driven content strategies
• Business Development Professionals looking for systematic opportunity identification
• Digital Marketing Agencies serving tech and automation clients

Setup Requirements

To get started, you'll need:

Reddit API Access: OAuth2 credentials for accessing Reddit's API and monitoring multiple subreddits.

Required APIs:
• OpenRouter (for AI model access - supports GPT-4, Claude, and other models)
• Reddit OAuth2 API (for community monitoring and data extraction)

n8n Prerequisites:
• Version 1.7+ with LangChain nodes enabled
• Webhook configuration for Telegram integration
• Proper credential storage and management setup

Telegram Bot: Create via @BotFather for receiving formatted content strategies and business insights.

Disclaimer: This template uses LangChain nodes and Reddit API integration. Ensure your n8n instance supports these features and verify API rate limits for production use.

Step-by-Step Setup Guide

1. Install n8n: Ensure you're running n8n version 1.7 or higher with LangChain node support enabled.
2. Set Up API Credentials:
   • Create Reddit OAuth2 application at reddit.com/prefs/apps
   • Set up OpenRouter account and obtain API key
   • Store credentials securely in n8n credential manager
3. Create Telegram Bot:
   • Go to Telegram, search for @BotFather
   • Create new bot and note the token
   • Configure webhook pointing to your n8n instance
4. Import the Workflow:
   • Copy the workflow JSON from the template submission
   • Import into your n8n dashboard
   • Verify all nodes are properly connected
5. Configure Monitoring Settings:
   • Adjust subreddit targets (currently: ArtificialIntelligence, n8n, entrepreneur)
   • Set keyword filters for relevant topics
   • Configure post limits and sorting preferences
6. Customize AI Analysis:
   • Update system prompts to match your business expertise
   • Adjust relevancy and feasibility scoring criteria
   • Modify content generation templates for your brand voice
7. Test the Workflow:
   • Run manual execution to verify Reddit data collection
   • Check AI classification and analysis outputs
   • Confirm Telegram delivery of formatted content
8. Schedule Automation:
   • Set up daily trigger (currently configured for 12 PM)
   • Monitor execution logs for any API rate limit issues
   • Adjust frequency based on content volume needs

Usage Instructions

Automated Discovery: The workflow runs daily at 12 PM, scanning three key subreddits for relevant posts about AI automation, business opportunities, and n8n workflows.

Intelligent Classification: Posts are automatically categorized as either "Questions" (educational opportunities) or "Requests" (potential service leads) using AI text classification.

Dual Analysis Approach:
• Questions → Educational content strategy with relevancy and detail scoring
• Requests → Sales-focused content with relevancy and feasibility scoring

Content Strategy Generation: Each analyzed post generates:
• 3 LinkedIn posts (thought leadership, case studies, educational frameworks)
• 3 Twitter posts (quick insights, engagement questions, thread starters)

Telegram Delivery: Receive formatted content strategies with:
• Post summaries and business context
• Relevancy/feasibility scores
• Ready-to-use social media content
• Strategic recommendations

Content Customization: Adapt generated content for different tones (business, educational, technical) and posting schedules.

Workflow Features

- Multi-Platform Monitoring: Simultaneous tracking of 3 key Reddit communities with customizable keyword filters.
- AI-Powered Classification: Automatic categorization of posts into actionable content types.
- Dual Scoring System:
  • Relevancy scores (0.05-0.95) for business alignment
  • Detail/Feasibility scores (0.05-0.95) for content quality assessment
- Content Variety: Generates both educational and sales-focused social media strategies.
- Structured Output: JSON-formatted analysis for easy integration with other systems (a hedged sketch of one possible shape follows below).
- Real-time Delivery: Instant Telegram notifications with formatted content strategies.
- Scalable Monitoring: Easy addition of new subreddits and keyword filters.
- Error Handling: Comprehensive validation with graceful failure management.

Performance Specifications

• Monitoring Frequency: Daily automated execution with manual trigger capability
• Post Analysis: 5 posts per subreddit (15 total daily)
• Content Generation: 6 social media posts per analyzed opportunity
• Classification Accuracy: AI-powered with structured output validation
• Delivery Method: Real-time Telegram integration
• Scoring Range: 0.05-0.95 scale for relevancy and feasibility assessment
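For readers who want to reuse the analysis downstream, here is a hedged sketch of what the structured JSON output could look like. The field names are assumptions derived from the description above (relevancy/feasibility scores, LinkedIn and Twitter drafts); check the workflow's structured output parser node for the exact schema it enforces.

```typescript
// Hypothetical shape of one analyzed Reddit post. Field names are illustrative only;
// the template's structured-output parser defines the authoritative schema.
interface RedditOpportunityAnalysis {
  postTitle: string;
  postUrl: string;
  category: "Question" | "Request";   // output of the AI classifier
  relevancyScore: number;             // 0.05 - 0.95, business alignment
  detailOrFeasibilityScore: number;   // 0.05 - 0.95, detail (questions) or feasibility (requests)
  summary: string;                    // short business context for the Telegram message
  linkedinPosts: string[];            // 3 drafts: thought leadership, case study, framework
  twitterPosts: string[];             // 3 drafts: insight, engagement question, thread starter
}

// Example payload the Telegram formatting step might receive:
const example: RedditOpportunityAnalysis = {
  postTitle: "Looking for someone to automate our client onboarding",
  postUrl: "https://www.reddit.com/r/n8n/comments/example",
  category: "Request",
  relevancyScore: 0.85,
  detailOrFeasibilityScore: 0.7,
  summary: "A small agency wants onboarding emails and CRM updates automated.",
  linkedinPosts: ["...", "...", "..."],
  twitterPosts: ["...", "...", "..."],
};

console.log(JSON.stringify(example, null, 2));
```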
Why This Workflow?

- Systematic Opportunity Identification: Never miss potential business opportunities or content ideas from key communities.
- AI-Enhanced Analysis: Leverage advanced language models for intelligent content categorization and strategy generation.
- Time-Efficient Content Creation: Transform community discussions into ready-to-use social media content.
- Data-Driven Insights: Quantified scoring helps prioritize opportunities and content strategies.
- Automated Lead Intelligence: Identify potential service requests and educational content opportunities automatically.

Workflow Image

Need help customizing this workflow for your specific use case? As a fellow entrepreneur passionate about automation and business development, I'd be happy to consult. Connect with me on LinkedIn: https://www.linkedin.com/in/femi-adedayo-h44/ or email for support. Let's make your AI automation agency even more efficient!
by Robert Breen
This workflow is designed for creators, marketers, and agencies who want to automate content publishing while keeping quality control through human review. It integrates four powerful tools — Google Sheets, OpenAI, GoToHuman, and Blotato — to deliver a seamless AI-assisted, human-approved, auto-publishing system for LinkedIn.

⚙️ What This Workflow Does

📅 Pulls Today's Topic from Google Sheets
You store ideas in a spreadsheet with a date column. The workflow runs daily (or manually) and selects the row matching today's date (a sketch of this date match appears at the end of this section).

🧠 Generates a Caption with OpenAI
The selected idea is passed to GPT-4 via an AI Agent node. OpenAI returns a short, emoji-rich LinkedIn caption (1–2 sentences). The result is saved back to the sheet.

👤 Sends the Caption for Human Review via GoToHuman
A human reviewer sees the AI-generated caption. They approve or reject it using a GoToHuman review template. Only approved captions move forward.

🚀 Publishes the Approved Caption to LinkedIn via Blotato
The caption is posted to a LinkedIn account via Blotato's API. No additional input is required — it's fully automated after approval.

🔧 Setup Requirements

✅ Google Sheets
Create or copy the provided sample sheet. Connect your Google Sheets account in n8n using OAuth2.

✅ OpenAI
Create an API key at platform.openai.com. Add it to n8n as an OpenAI credential.

✅ GoToHuman
Create an account and a Review Template at gotohuman.com. Add your API credential in n8n and use your reviewTemplateId in the node.

✅ Blotato
Create an account at blotato.com. Get your API key and Account ID. Insert them into the HTTP Request node that publishes the LinkedIn post.

🧪 Testing the Workflow
Use the Manual Trigger node for step-by-step debugging. Review nodes like AI Agent, Ask Human for Approval, and Post to LinkedIn to verify output. Once confirmed, activate the schedule for fully hands-free publishing.

👋 Built By
Robert Breen
Founder of Ynteractive — Automation, AI, and Data Strategy
🌐 Website: https://ynteractive.com
📧 Email: robert@ynteractive.com
🔗 LinkedIn: https://www.linkedin.com/in/robert-breen-29429625/
📺 YouTube: YnteractiveTraining

🏷 Tags
linkedin openai gotohuman social automation ai content approval workflow google sheets blotato marketing automation
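As referenced in the "Pulls Today's Topic" step above, here is a minimal sketch of the date-matching logic, assuming hypothetical column names (`Date`, `Idea`) and a `YYYY-MM-DD` date format; the template's own sheet and n8n expressions may use different names and formats.

```typescript
// Hypothetical sketch: pick the spreadsheet row whose Date column matches today.
// Column names and the date format are assumptions for illustration.
interface IdeaRow {
  Date: string;  // e.g. "2025-07-14"
  Idea: string;  // the topic to turn into a LinkedIn caption
}

function rowForToday(rows: IdeaRow[], today: Date = new Date()): IdeaRow | undefined {
  const todayKey = today.toISOString().slice(0, 10); // "YYYY-MM-DD"
  return rows.find((row) => row.Date === todayKey);
}

const rows: IdeaRow[] = [
  { Date: "2025-07-13", Idea: "3 ways AI saves agencies 10 hours a week" },
  { Date: "2025-07-14", Idea: "Why human review still matters in AI publishing" },
];

console.log(rowForToday(rows) ?? "No topic scheduled for today");
```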
by Jimmy Lee
This workflow gathers papers from arXiv in the arXiv categories you specify, uses AI to produce a summarized newsletter, and sends it to subscribers via Gmail.

Arxiv paper trend newsletter

Setup

Supabase table schema:
- user_email: Text - mandatory
- arxiv_cat: [Text]
- interested_papers: [Text]
- keyword: [Text]

Example row:

{
  "id": 8,
  "created_at": "2024-09-24T12:31:17.09491+00:00",
  "user_email": "test@test.com",
  "arxiv_cat": [
    "cs.AI",
    "cs.LG,cs.AR"
  ],
  "interested_papers": null,
  "keyword": [
    "AI architecture which includes long context problem"
  ]
}

Qdrant vector store: default setup.

Sub-workflows:
- Get arXiv category by AI for a given keyword
- Get arXiv categories
- Get this week's arXiv papers and score them by AI
- Filter by keyword within the given documents
- Extract paper information
- Write the newsletter by AI
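If you prefer working from types, here is a hedged sketch of the subscriber row as a TypeScript interface, derived from the schema and example above; `id` and `created_at` are the standard Supabase columns shown in the example, and the comments are assumptions about how each field is used.

```typescript
// Sketch of one row in the Supabase subscribers table, mirroring the example above.
interface NewsletterSubscriber {
  id: number;
  created_at: string;                 // ISO timestamp added by Supabase
  user_email: string;                 // mandatory
  arxiv_cat: string[];                // arXiv categories, e.g. ["cs.AI", "cs.LG"]
  interested_papers: string[] | null; // optional list of papers of interest
  keyword: string[];                  // free-text interests used for AI category lookup
}

const subscriber: NewsletterSubscriber = {
  id: 8,
  created_at: "2024-09-24T12:31:17.09491+00:00",
  user_email: "test@test.com",
  arxiv_cat: ["cs.AI", "cs.LG,cs.AR"],
  interested_papers: null,
  keyword: ["AI architecture which includes long context problem"],
};

console.log(subscriber.user_email);
```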
by Milan Vasarhelyi - SmoothWork
Video Introduction

Invoice Processing Automation

This template is the automation behind a simple incoming invoice automation (AP automation) tool built in Airtable. The link to the Airtable base, and to all other tools used, is in the notes on the left of the automation.

How it works

See how it works on video: Full Video Walkthrough

1) We receive an email with an invoice attachment.
2) The workflow processes it and adds the data to an Airtable interface.
3) Once we approve it and the due date approaches, it shows among Due invoices, where we can track whether it's paid.

Looking for customization or a custom business app?
📞 Book a Call | 💬 DM me on Linkedin
by Thibaud
How it works:

- **Schedule Trigger** runs on a daily basis (configured at 7:30 am)
- Connects to **Google Contacts** to get personal information from your contacts
- **Field Checker** on birthday dates and first names, to see if there is any celebration today
- Sends a **Telegram notification** and displays the message on a Google Home speaker via Home Assistant if any celebration has matched

Set up steps:

- **Download** the workflow and **import** it into your n8n instance
- **Configure accounts** for Google Contacts, Telegram and Home Assistant
- And you will be good to go
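As a hedged illustration of what the birthday check does conceptually, here is a minimal TypeScript sketch that compares each contact's birthday (month and day) with today's date; the field names are assumptions, since Google Contacts returns birthdays in its own People API format.

```typescript
// Minimal sketch: find contacts whose birthday (month + day) matches today.
// `firstName` and `birthday` are illustrative field names, not the exact
// structure returned by the Google Contacts node.
interface Contact {
  firstName: string;
  birthday?: { month: number; day: number }; // 1-based month and day
}

function birthdaysToday(contacts: Contact[], today: Date = new Date()): Contact[] {
  const month = today.getMonth() + 1;
  const day = today.getDate();
  return contacts.filter(
    (c) => c.birthday !== undefined && c.birthday.month === month && c.birthday.day === day
  );
}

const contacts: Contact[] = [
  { firstName: "Alice", birthday: { month: 7, day: 14 } },
  { firstName: "Bob" },
];

const celebrating = birthdaysToday(contacts);
console.log(
  celebrating.length > 0
    ? `🎉 Today: ${celebrating.map((c) => c.firstName).join(", ")}`
    : "No celebration today"
);
```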
by InfraNodus
The Ultimate Gmail Analysis and Visual Summarization Template

This workflow showcases various useful Gmail search, filter, and AI categorization operations and generates a knowledge graph of your mail using the InfraNodus GraphRAG API, which you can use to reveal the main topics and blind spots in your correspondence. InfraNodus will then target those blind spots to generate interesting research questions for you and send the topical summary and insights via Telegram. You can also click the generated graph and explore the blind spots inside InfraNodus using the interactive visual interface.

What is it useful for?

- Learn about advanced Gmail search, filtering, and AI categorization functions that can be useful for your other workflows
- Analyze all your personal messages for the last week to get an overview of the main topics
- Analyze all your Sent messages to find recurrent topics and gaps, and generate ideas based on those gaps
- Generate ideas based on specific message filters (Personal, Promos, from a specific person, AI-defined criteria, e.g. urgency)
- Get an overview of an interaction with a specific person / company
- Get an overview of your notes
- Generate new ideas based on your correspondence on a certain topic (e.g. "business")
- Learn about various n8n nodes useful for email processing, filtering, and data conversion
- Never miss important topics: use the AI filter to get notified of urgent and important emails via Telegram

How it works

This template can be triggered in multiple ways: automatically at regular intervals (daily, weekly), manually in n8n, or via a private password-protected URL form where you can specify your search and filtering criteria.

When you start the workflow, you specify:

- Your Gmail search filters (can be combined, e.g. after:2025/06/01 label:personal business to search for all emails received after 1 June 2025, filed in the Personal category, containing the word "business"). Optional: if empty, the workflow retrieves all emails, limited to the number you set in the Gmail node.
- Additional Gmail labels (e.g. SENT or CATEGORY_PERSONAL or your custom categories). Use the search filter for faster processing (e.g. prefer label:personal to CATEGORY_PERSONAL, but labels can be useful for additional filtering on top of your search queries). Optional: if empty, the workflow retrieves all emails.
- AI filtering criteria — an additional classification criterion used to filter the emails, e.g. "Only the urgent, personal emails". In that case, the AI classification node working with Google's Gemini AI will be activated and will only pass through the emails that match the criteria you specify.
- Whether you want to build a text graph or a social graph — see the workflow for a detailed explanation of each.
- Use snippets of emails (default) or full text (for thorough analysis). We prefer snippets as it's faster and your graph context doesn't get biased towards longer emails this way.

Once you set up your search parameters in Steps 1 and 2, the template follows these steps:

- Step 3 — retrieve the Gmail emails that satisfy your filter criteria. Filter them by the additional labels provided, if applicable.
- Step 4 — if you choose to analyze full text, use an additional Gmail node that retrieves the full text of the email message.
- Step 5 — if an AI filter rule is provided, use the AI Classifier node with the Google Gemini Pro 2.5 model to classify the email based on the rule provided. Bypassed if empty.
- Step 6 — format the text or the email snippets to add the sender metadata and category and to prepare the submission to InfraNodus.
- Step 7 — submit the data to the InfraNodus graphAndEntries HTTP endpoint and generate a knowledge graph.
- Step 8 — access this graph via the graphAndAdvice endpoint and generate a topical summary based on the GraphRAG representation, along with insight questions bridging the gaps identified. Send the results via a Telegram bot. We use Telegram because it takes only 30 seconds to set up a bot with an API, unlike Discord or Slack, which are long and cumbersome to set up. You can also attach a Gmail send node and generate an email instead.

How to use

You need an InfraNodus GraphRAG API account and key to use this workflow.

1. Create an InfraNodus account.
2. Get the API key at https://infranodus.com/api-access and create a Bearer authorization key for the InfraNodus HTTP nodes. Add this authorization code in Steps 7 and 8 of the workflow.
3. Come up with a name for the graph and change it in the InfraNodus HTTP nodes in Steps 7 and 8, and also in the Telegram nodes that send a link to the graph. For additional settings you can use in the InfraNodus HTTP nodes, see the InfraNodus access points page.
4. Authorize your Gmail account for the Step 2 and 3 Gmail nodes. The easiest way to set it up is to open a free Google Console API account and create an OAuth access point for n8n. You can then reuse it with other Google services like Google Sheets, Drive, etc., so it's a useful thing to have in general.
5. Set up the Gemini AI API key using the instructions in the Step 5 Gemini AI node.
6. Set up the Telegram bot for Step 8. It takes only 30 seconds: just go to @botfather and type in /newbot and you'll have an API key ready. To get the conversation ID, follow the n8n / Telegram instructions in the node itself.
7. Once everything is ready, run the default automated workflow to test that everything works well, then use the form to play around with specific filters that you may find useful.

Requirements

- An InfraNodus account and API key
- A Google Cloud API OAuth client and key for Gmail access
- A Gemini AI API key
- A Telegram bot API key

FAQ

1. What's the best search query to use?

I personally like starting with analyzing the messages Gmail tags as "personal" from the last week (using the after:2025/05/28 label:personal search query) with the social graph settings. It helps me see who I interacted with and what it was about, and gives me a good bird's eye view of my last week's interactions, helping me check that I didn't miss anything.

I also find it useful to analyze the sent messages (using the after:2025/05/28 label:sent search filter or the SENT category filter), as it helps me see what I was writing about recently and understand some recurrent topics and gaps in my interactions.

Finally, I also like to analyze notes (label:notes) or specific correspondence (from:your_friend@gmail.com) to get an overview and find gaps in the conversations.

2. Why use InfraNodus and not an AI summarization module?

You probably get a lot of spam, so your AI will get overwhelmed with content that's not really useful. The InfraNodus graph helps you see the important patterns and discover what's missing by focusing on the gaps. You can use the interactive graph to quickly remove the stuff you don't need and to focus on the most relevant topics and conversations.

Customizing this workflow

You can connect a Gmail node instead of the Telegram one if you prefer to receive notifications directly by email.
I don't like using Slack and Discord for this because their bots are too difficult and time-consuming to set up.

Check out the complete setup guide for this workflow at https://support.noduslabs.com/hc/en-us/articles/20394884531996-Build-a-Knowledge-Graph-and-Extract-Insights-from-Gmail-Emails-with-n8n-and-InfraNodus, with a video tutorial coming soon and links to other n8n workflows.

Check out our other n8n workflows at https://n8n.io/creators/infranodus/ for useful content gap analysis, expert panel, marketing, and research workflows that utilize GraphRAG for better AI generation.

Finally, check out https://infranodus.com to learn more about our network analysis technology used to build knowledge graphs from text.
by Niranjan G
Who is this for?

Professionals, solopreneurs, or productivity enthusiasts who want to keep their Gmail inbox clean and organized without manual effort.

What problem does this solve?

Manually archiving emails eats into your time and slows you down. This workflow automates inbox cleanup by removing the "INBOX" label from messages received more than 24 hours ago. A perfect companion to AI-based labeling workflows, it keeps your inbox light and relevant.

What this workflow does

- Triggers every day at 4 AM
- Fetches Gmail messages from the INBOX that are older than 24 hours
- Processes them one by one using the Split Out node
- Removes the INBOX label, effectively archiving the messages

Setup

- Connect your Gmail account using OAuth2 credentials.
- Customize the Schedule Trigger node to adjust the run time.
- Modify the Gmail filter if you want to archive unread or labeled emails instead.

How to customize this workflow to your needs

Schedule different frequencies (e.g. twice a day or weekly).

🔄 Pairs Well With

This complements the Intelligent Email Organization with AI-Powered Content Classification workflow. Use that to label emails smartly using AI, and this one to auto-archive them for a clean, clutter-free inbox.
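For reference, the "older than 24 hours in the inbox" selection can be expressed with standard Gmail search operators. A minimal sketch, assuming you paste the query into the Gmail node's search field (adjust the age or add label:/is:unread terms to match your own filter):

```typescript
// Gmail search query for messages still in the inbox and older than one day.
// `in:inbox` and `older_than:1d` are standard Gmail search operators.
const searchQuery = "in:inbox older_than:1d";

// Alternative: compute an explicit day-granular cutoff for a `before:` filter.
// Note that `before:` works in whole days, so it is not exactly 24 hours.
const cutoff = new Date(Date.now() - 24 * 60 * 60 * 1000);
const y = cutoff.getFullYear();
const m = String(cutoff.getMonth() + 1).padStart(2, "0");
const d = String(cutoff.getDate()).padStart(2, "0");
const explicitQuery = `in:inbox before:${y}/${m}/${d}`;

console.log(searchQuery, "|", explicitQuery);
```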
by Arthur Braghetto
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Your n8n Command Center in a Telegram Chat

Remotely manage and operate your n8n instance from Telegram with powerful admin commands. This workflow connects your n8n instance with a Telegram bot, giving you remote control over key admin operations through simple chat commands. 📱

You can:

- List your workflows (workflows)
- Execute a workflow (execute [name])
- Activate/deactivate workflows (activate [name], deactivate [name])
- List past executions (executions [name])
- Permanently delete archived workflows (cleanup)
- Create backups of all your workflows and credentials (backup)
- Get help (help)
- Get notified when a workflow fails and when the n8n instance starts

This is especially useful for self-hosted instances when you want quick access to your automation environment from your mobile device.

📌 Notes

- **backup** only works on self-hosted setups.
- **execute**, **activate**, **deactivate**, and **executions** require the workflow name as an argument.
- Workflows must contain the appropriate trigger nodes to be executed or activated.
- Commands and arguments are not case sensitive, there is no need to prefix them with a slash, and spaces in the argument name are supported (see the parsing sketch after this section).

⚙️ Setup

1. Create your credentials for the Telegram API and the n8n API.
2. Edit all Telegram and n8n nodes and select your credentials on them. On the Telegram nodes, provide your chat ID.
3. Detailed step-by-step instructions are available in the workflow notes.
4. For each workflow that you want to be warned about when it fails, configure this workflow as the Error Workflow in its settings.
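As referenced in the Notes above, here is a hedged sketch of how the incoming Telegram text might be parsed into a command plus a free-text workflow name; the actual template does this inside its own nodes, so treat the function and command list as illustrative.

```typescript
// Illustrative parser: case-insensitive command, no slash prefix required,
// everything after the command is the argument (so workflow names may contain spaces).
const KNOWN_COMMANDS = [
  "workflows", "execute", "activate", "deactivate",
  "executions", "cleanup", "backup", "help",
] as const;

type Command = (typeof KNOWN_COMMANDS)[number];

function parseCommand(text: string): { command: Command; argument: string } | null {
  const cleaned = text.trim().replace(/^\//, ""); // tolerate an optional leading slash
  const [first = "", ...rest] = cleaned.split(/\s+/);
  const lowered = first.toLowerCase();
  const command = KNOWN_COMMANDS.find((c) => c === lowered);
  if (!command) return null;
  return { command, argument: rest.join(" ") };
}

console.log(parseCommand("Execute My Daily Backup Workflow"));
// -> { command: "execute", argument: "My Daily Backup Workflow" }
console.log(parseCommand("status")); // -> null (unknown command)
```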
by Dr. Firas
Generate AI videos with Seedance & Blotato, upload to TikTok, YouTube & Instagram

Who is this for?

This template is ideal for creators, content marketers, social media managers, and AI enthusiasts who want to automate the production of short-form, visually captivating videos for platforms like TikTok, YouTube Shorts, and Instagram Reels — all without manual editing or publishing.

What problem is this workflow solving?

Creating engaging videos requires:

- Generating creative ideas
- Writing detailed scene prompts
- Producing realistic video clips and sound effects
- Editing and stitching the final video
- Publishing across multiple platforms

This workflow automates the entire process, saving hours of manual work and ensuring consistent, AI-driven content output ready for social distribution.

What this workflow does

This end-to-end AI video automation workflow:

1. Generates a creative idea using OpenAI and LangChain
2. Creates detailed video prompts with Seedance AI
3. Generates video clips via Wavespeed AI
4. Generates sound effects with Fal AI
5. Stitches the final video using Fal AI's ffmpeg API
6. Logs metadata and video links to Google Sheets
7. Uploads the video to Blotato
8. Auto-publishes to TikTok, YouTube, Instagram, and other platforms

Setup

- Add your OpenAI API key in the LLM nodes
- Set up Seedance and Wavespeed AI credentials for video prompt and clip generation
- Add your Fal AI API key for the sound and stitching steps
- Connect your Google Sheets account for tracking ideas and outputs
- Set your Blotato API key and fill in the platform account IDs in the Assign Social Media IDs node (a hedged sketch of that mapping follows below)
- Adjust the Schedule Trigger to control when the automation runs

How to customize this workflow to your needs

- **Change the AI prompts** to target your niche (e.g., ASMR, product videos, humor)
- **Add a Telegram or Slack step** for video preview before publishing
- **Tweak the scene structure** or video duration to match your style
- **Disable platforms** you don't want by turning off specific HTTP Request nodes
- **Edit the sound generation prompts** for different moods or effects

📄 Documentation: Notion Guide

Need help customizing? Contact me for consulting and support: Linkedin / Youtube
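As referenced in the Setup list above, the Assign Social Media IDs step boils down to a small mapping of platforms to your Blotato account IDs. A hedged sketch with placeholder names and values (check the node itself for the exact keys it expects):

```typescript
// Placeholder values only: replace with the account IDs from your Blotato dashboard.
// The property names expected by the workflow's HTTP Request nodes may differ.
const socialMediaIds = {
  tiktokAccountId: "YOUR_TIKTOK_ACCOUNT_ID",
  youtubeAccountId: "YOUR_YOUTUBE_ACCOUNT_ID",
  instagramAccountId: "YOUR_INSTAGRAM_ACCOUNT_ID",
};

console.log(socialMediaIds);
```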
by Joseph LePage
🤖 AI-Powered RAG Chatbot with Google Drive Integration

This workflow creates a powerful RAG (Retrieval-Augmented Generation) chatbot that can process, store, and interact with documents from Google Drive using Qdrant vector storage and Google's Gemini AI.

How It Works

Document Processing & Storage 📚
- Retrieves documents from a specified Google Drive folder
- Processes and splits documents into manageable chunks
- Extracts metadata using AI for enhanced search capabilities
- Stores document vectors in Qdrant for efficient retrieval

Intelligent Chat Interface 💬
- Provides a conversational interface powered by Google Gemini
- Uses RAG to retrieve relevant context from stored documents
- Maintains chat history in Google Docs for reference
- Delivers accurate, context-aware responses

Vector Store Management 🗄️
- Features secure delete operations with human verification
- Includes Telegram notifications for important operations
- Maintains data integrity with proper version control
- Supports batch processing of documents

Setup Steps

1. Configure API Credentials:
   - Set up Google Drive & Docs access
   - Configure the Gemini AI API
   - Set up the Qdrant vector store connection
   - Add a Telegram bot for notifications
   - Add an OpenAI API key to the 'Delete Qdrant Points by File ID' node
2. Configure Document Sources:
   - Set the Google Drive folder ID
   - Define the Qdrant collection name
   - Set up the document processing parameters
3. Test and Deploy:
   - Verify document processing
   - Test chat functionality
   - Confirm vector store operations
   - Check the notification system

This workflow is ideal for organizations needing to create intelligent chatbots that can access and understand large document repositories while maintaining context and providing accurate responses through RAG technology.
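To make the "splits documents into manageable chunks" step more concrete, here is a minimal sketch of fixed-size chunking with overlap, written without any library so it stays self-contained. The workflow itself uses an n8n text splitter node whose chunk size and overlap you can tune in the node settings, so the numbers below are assumptions.

```typescript
// Naive fixed-size chunking with overlap, as used conceptually before embedding.
// The real workflow relies on its splitter node; sizes here are illustrative.
function chunkText(text: string, chunkSize = 1000, overlap = 200): string[] {
  const chunks: string[] = [];
  const step = Math.max(1, chunkSize - overlap);
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // last chunk reached the end
  }
  return chunks;
}

// Each chunk would then be embedded and upserted into the Qdrant collection,
// typically together with metadata such as the Google Drive file ID.
const exampleChunks = chunkText("A long document fetched from Google Drive ... ".repeat(100));
console.log(`${exampleChunks.length} chunks ready for embedding`);
```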