by Raphael De Carvalho Florencio
## What this workflow is (About)
This workflow turns a Telegram bot into an AI-powered lyrics assistant. Users send a command plus a lyrics URL, and the flow downloads, cleans, and analyzes the text, then replies on Telegram with translated lyrics, summaries, vocabulary, poetic devices, or an interpretation, all generated by AI (OpenAI).

## What problems it solves
- Centralizes lyrics retrieval + cleanup + AI analysis in one automated flow
- Produces study-ready outputs (translation, vocabulary, figures of speech)
- Saves time for teachers, learners, and music enthusiasts with instant results in chat

## Key features
- **AI analysis** using OpenAI (no secrets hardcoded; uses n8n Credentials)
- **Line-by-line translation**, concise summaries, vocabulary lists
- **Poetic/literary device detection** and **emotional/symbolic interpretation**
- Robust ETL (extract, download, sanitize) and error handling
- Clear Sticky Notes documenting routing, ETL, AI prompts, and messaging

## Who it's for
- Language learners & teachers
- Musicians, lyricists, and music bloggers
- Anyone studying lyrics for meaning, style, or vocabulary

## Input & output
- **Input:** Telegram command with a public lyrics URL
- **Output:** Telegram messages (Markdown/MarkdownV2), split into chunks if long

## How it works
1. **Telegram → Webhook** receives a user message (e.g., `/get_lyrics <URL>`).
2. **Routing (If/Switch)** detects which command was sent.
3. **Extract URL + Download (HTTP Request)** fetches the lyrics page.
4. **Cleanup (Code)** strips HTML/scripts/styles and normalizes whitespace (see the sketch at the end of this template).
5. **OpenAI (Chat)** formats the result per command (translation, summary, vocabulary, analysis).
6. **Telegram (Send Message)** returns the final text; long outputs are split into chunks.
7. **Error handling** replies with friendly guidance for unsupported or incomplete commands.

## Set up steps
1. Create a Telegram bot with @BotFather and copy the bot token.
2. In n8n, create **Credentials → Telegram API** and paste your token (no hardcoded keys in nodes).
3. Create **Credentials → OpenAI** and paste your API key.
4. Import the workflow and set a short webhook path (e.g., `/lyrics-bot`).
5. Publish the webhook and set it on Telegram:
   `https://api.telegram.org/bot<YOUR_BOT_TOKEN>/setWebhook?url=https://[YOUR_DOMAIN]/webhook/lyrics-bot`
6. (Optional) Restrict update types:

```bash
curl -X POST https://api.telegram.org/bot<YOUR_BOT_TOKEN>/setWebhook \
  -H "Content-Type: application/json" \
  -d '{ "url": "https://[YOUR_DOMAIN]/webhook/lyrics-bot", "allowed_updates": ["message"] }'
```

7. Test by sending `/start` and then `/get_lyrics <PUBLIC_URL>` to your bot.
8. If messages are long, ensure MarkdownV2 is used and special characters are escaped.
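The Cleanup step and the MarkdownV2 escaping/chunking can each be a small Code node. Below is a minimal sketch of that logic; it assumes the HTTP Request node's HTML lands in `json.data` and uses a 4,000-character chunk size (Telegram's hard limit is 4,096 characters per message). Neither assumption is taken from the exported workflow.

```javascript
// Sketch of the Cleanup (Code) step plus MarkdownV2 escaping and chunking.
// Assumes the previous HTTP Request node put the raw page HTML in json.data.
const html = $input.first().json.data ?? '';

// Strip scripts, styles, and tags, then normalize whitespace.
const text = html
  .replace(/<script[\s\S]*?<\/script>/gi, '')
  .replace(/<style[\s\S]*?<\/style>/gi, '')
  .replace(/<[^>]+>/g, '\n')
  .replace(/[ \t]+/g, ' ')
  .replace(/\n{3,}/g, '\n\n')
  .trim();

// Escape Telegram MarkdownV2 special characters.
const escaped = text.replace(/[_*\[\]()~`>#+\-=|{}.!]/g, '\\$&');

// Split into chunks below Telegram's 4096-character message limit.
const chunks = [];
for (let i = 0; i < escaped.length; i += 4000) {
  chunks.push(escaped.slice(i, i + 4000));
}

return chunks.map(chunk => ({ json: { chunk } }));
```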
by Ruben AI
# AI-Powered Flyer & Video Generator with Airtable, Klie.ai, and n8n

## Who is this for?
This template is perfect for e-commerce entrepreneurs, marketers, agencies, and creative teams who want to turn simple product photos and short descriptions into professional flyers or product videos, automatically and at scale. If you want to generate polished marketing assets without relying on designers or editors, this is for you.

## What problem is this workflow solving?
Creating product ads, flyers, or videos usually involves multiple tools and manual steps:
- Collecting and cleaning product photos
- Writing ad copy or descriptions
- Designing flyers or visuals for campaigns
- Producing animations or video ads
- Managing multiple revisions and approvals

This workflow automates the entire pipeline. Upload a raw product image into Airtable, type a quick description, and receive back a flyer or video animation tailored to your brand and context, ready to use for ads, websites, or campaigns.

## What this workflow does
- Uses Airtable as the central interface where you upload raw product photos and enter descriptions
- Processes the content automatically via n8n
- Generates flyers and visuals using OpenAI Image 1
- Produces custom product videos with Google's VEO3
- Runs through Klie.ai to unify the image + video generation process
- Sends the final creative assets back into Airtable for review and download

## Setup
1. Download the n8n files and connect your Airtable token to n8n.
2. Duplicate the Airtable base and make sure you're on an Airtable Team plan.
3. Add your API key on the Airtable interface under API setup.
4. Create your agency inside the interface.
5. Start generating concept images and videos instantly.

## How to customize this workflow to your needs
- Edit the prompts to match your brand voice and ad style (see the sketch below)
- Extend Airtable fields to include more creative parameters (colors, layout, target audience)
- Add approval steps via email, Slack, or Airtable statuses before finalizing
- Integrate with publishing platforms (social media, e-commerce CMS) for auto-posting
- Track generated assets inside Airtable for team collaboration

🎥 Demo Video: Demo Video
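Prompt editing usually happens where the Airtable fields are turned into a generation prompt. The Code-node sketch below is purely hypothetical: the field names (`Product Name`, `Description`, `Brand Voice`) are illustrative assumptions, not this template's actual Airtable schema.

```javascript
// Hypothetical prompt builder for the image/video generation step.
// Field names are assumptions; align them with your Airtable base.
const record = $input.first().json;

const prompt = [
  `Create a clean, professional product flyer for "${record['Product Name']}".`,
  `Product description: ${record['Description']}`,
  `Brand voice: ${record['Brand Voice'] ?? 'modern and minimal'}`,
  'Layout: bold headline, product photo centered, short benefit bullets.',
].join('\n');

return [{ json: { ...record, prompt } }];
```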
by Supira Inc.
## Overview
This template automates invoice processing for teams that currently copy data from PDFs into spreadsheets by hand. It is ideal for small businesses, back-office teams, accounting, and operations who want to reduce manual entry, avoid human error, and never miss a payment deadline. The workflow watches a structured Google Drive folder, performs OCR, converts the text into clean structured JSON with an LLM, and appends one row per invoice into Google Sheets. It preserves a link back to the original file for easy review and audit.

- **Designed for small businesses and back-office teams.**
- **Eliminates manual typing** and reduces errors.
- **Prevents missed due dates** by centralizing data.
- Works with monthly subfolders like "2025年10月分" (meaning "October 2025").
- Keeps a Google Drive link to each invoice file.

## How It Works
The workflow runs on a schedule, scans your Drive folder hierarchy, OCRs the PDFs/images, cleans the text, extracts key fields with an LLM, and appends a row to Google Sheets per invoice. Each step is modular so you can swap services or tweak prompts without breaking the flow.

1. **Scheduled trigger** runs on a recurring cadence.
2. **Scan the parent folder** in Google Drive.
3. **Auto-detect the current-month folder** (e.g., a folder named "2025年10月分" meaning "October 2025").
4. **Download PDFs/images** from the detected folder.
5. **Extract text** using the OCR.Space API.
6. **Clean noise** and normalize with a Code node.
7. **Use an OpenAI model** to extract invoice_date, due_date, client_name, line items, totals, and bank info to JSON.
8. **Append one row per invoice** to Google Sheets.

## Requirements
Before you start, make sure you have access to the required services and that your Drive is organized into monthly subfolders so the workflow can find the right files.
- **n8n account**
- **Google Drive access**
- **Google Sheets access**
- **OCR.Space API key** (set as `<your_ocr_api_key>`)
- **OpenAI / LLM API credential** (e.g., `<your_openai_credential_name>`)
- **Invoice PDFs organized by month** on Google Drive (e.g., folders like "2025年10月分")

## Setup Instructions
Import the workflow, replace placeholder credentials and IDs with your own, and enable the schedule. You can also run it manually for testing. The parent-folder query and sheet ID must reflect your environment.
1. Replace `<your_google_drive_credential_id>` and `<your_google_drive_credential_name>` with your Google Drive credential.
2. Adjust the parent folder search query to your invoice repository name.
3. Replace the Sheets document ID `<your_google_sheet_id>` with your spreadsheet ID.
4. Ensure your OpenAI credential `<your_openai_credential_name>` is selected.
5. Set your OCR.Space key as `<your_ocr_api_key>`.
6. Enable the **Schedule Trigger** after testing.

## Customization
This workflow is easily extensible. You can adapt folder naming rules, enrich the spreadsheet schema, and expand the AI prompt to extract custom fields specific to your company. It also works beyond invoices, covering receipts, quotes, or purchase orders with minor changes.
- Change the monthly folder naming rule such as `{{$now.setZone("Asia/Tokyo").format("yyyy年MM月")}}分` to match your convention (see the sketch below).
- Modify or extend Google Sheets column mappings as needed.
- Tune the AI prompt to extract project codes, owner names, or custom fields.
- Repurpose for receipts, quotes, or purchase orders.
- Localize date formats and tax calculation rules to your standards.
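For reference, the current-month folder detection can be expressed as a small Code node like the sketch below. It assumes the Drive listing items carry `name` and `id` fields and uses plain `Date` arithmetic instead of the `$now` Luxon helper shown above; both approaches produce a name like `2025年10月分`.

```javascript
// Sketch: pick the current-month folder (e.g. "2025年10月分") from a Drive listing.
// Assumes upstream items each have json.name and json.id from Google Drive.
const now = new Date(
  new Date().toLocaleString('en-US', { timeZone: 'Asia/Tokyo' })
);
const yyyy = now.getFullYear();
const mm = String(now.getMonth() + 1).padStart(2, '0');
const target = `${yyyy}年${mm}月分`;

const match = $input.all().find(item => item.json.name === target);
if (!match) {
  throw new Error(`No folder named "${target}" found under the parent folder`);
}
return [match];
```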
by Vinay Gangidi
# LOB Underwriting with AI
This template ingests borrower documents from OneDrive, extracts text with OCR, classifies each file (ID, paystub, bank statement, utilities, tax forms, etc.), aggregates everything per borrower, and asks an LLM to produce a clear underwriting summary and decision (plus next steps).

## Good to know
- AI and OCR usage consume credits (OpenAI + your OCR provider).
- Folder lookups by name can be ambiguous; use a fixed folderId in production.
- Scanned image quality drives OCR accuracy; bad scans yield weak text.
- This flow handles PII; mask sensitive data in logs and control access.
- Start small: batch size and pagination keep costs and memory sane.

## How it works
1. **Import & locate docs:** A manual trigger kicks off a OneDrive folder search (e.g., "LOBs") and lists the files inside.
2. **Per-file loop:** Download each file → run OCR → classify the document type using filename + extracted text.
3. **Aggregate:** Combine per-file results into a borrower payload (make BorrowerName dynamic).
4. **LLM analysis:** Feed the payload to an AI Agent (OpenAI model) to extract underwriting-relevant facts and produce a decision + next steps.
5. **Output:** Return a human-readable summary (and optionally structured JSON for systems).

## How to use
- Start with the Manual Trigger to validate end-to-end on a tiny test folder.
- Once stable, swap in a Schedule/Cron or Webhook trigger.
- Review the generated underwriting summary; handle only flagged exceptions (unknown/unreadable docs, low confidence).

## Setup steps
1. **Connect accounts:** Add credentials for OneDrive, OCR, and OpenAI.
2. **Configure inputs:**
   - In **Search a folder**, point to your borrower docs (prefer folderId; otherwise tighten the name query).
   - In **Get items in a folder**, enable pagination if the folder is large.
   - In **Split in Batches**, set a conservative batch size to control costs.
3. **Wire the file path:**
   - **Download a file** must receive the current file's id from the folder listing.
   - Make sure the OCR node receives binary input (PDFs/images).
4. **Classification:**
   - Update the keyword rules to match your region/lenders/utilities/tax forms (see the sketch below).
   - Keep a fallback Unknown class and log it for review.
5. **Combine:** Replace the hard-coded BorrowerName with a Set node field, a form input, or parsing from folder/file naming conventions.
6. **AI Agent:**
   - Set your OpenAI model and credentials.
   - Ask the model to output JSON first (structured fields) and Markdown second (readable summary).
   - Keep temperature low for consistent, audit-friendly results.
7. **Optional outputs:** Persist JSON/Markdown to Notion/Docs/DB or write to storage.

## Customize if needed
- **Doc types:** add/remove categories and keywords without touching core logic.
- **Error handling:** add IF paths for empty folders, failed downloads, empty OCR, or the Unknown class; retry transient API errors.
- **Privacy:** redact IDs/account numbers in logs; restrict execution visibility.
- **Scale:** add MIME/size filters, duplicate detection, and multi-borrower folder patterns (parent → subfolders).
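A keyword classifier with an Unknown fallback can be a short Code node. The sketch below is illustrative: the category keywords and the `fileName`/`ocrText` field names are assumptions to adapt to your region and upstream nodes.

```javascript
// Sketch: classify a document from its filename plus OCR text.
// fileName/ocrText field names are assumptions; match them to your nodes.
const { fileName = '', ocrText = '' } = $input.first().json;
const haystack = `${fileName} ${ocrText}`.toLowerCase();

const rules = {
  id_document: ['driver license', 'passport', 'state id'],
  paystub: ['pay period', 'gross pay', 'net pay', 'ytd'],
  bank_statement: ['account summary', 'beginning balance', 'ending balance'],
  utility_bill: ['kwh', 'utility', 'service address'],
  tax_form: ['w-2', '1099', 'form 1040', 'irs'],
};

let docType = 'Unknown'; // fallback class, logged for manual review
for (const [type, keywords] of Object.entries(rules)) {
  if (keywords.some(k => haystack.includes(k))) {
    docType = type;
    break;
  }
}

return [{ json: { ...$input.first().json, docType } }];
```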
by Rahul Joshi
## 📊 Description
Streamline Facebook Messenger inbox management with an AI-powered categorization and response system. 💬⚙️ This workflow automatically classifies new messages as Lead, Query, or Spam using GPT-4, routes them for approval via Slack, responds on Facebook once approved, and logs all interactions into Google Sheets for tracking. Perfect for support and marketing teams managing high volumes of inbound DMs. 🚀📈

## What This Template Does
1️⃣ **Trigger** – Runs hourly to fetch new Facebook Page messages. ⏰
2️⃣ **Extract & Format** – Collects sender info, timestamps, and message content for analysis. 📋
3️⃣ **AI Categorization** – Uses GPT-4 to identify the message type (Lead, Query, Spam) and suggest replies. 🧠
4️⃣ **Slack Approval Flow** – Sends categorized leads and queries to Slack for quick team approval. 💬
5️⃣ **Facebook Response** – Posts AI-suggested replies back to the original sender once approved. 💌
6️⃣ **Data Logging** – Records every message, reply, and approval status into Google Sheets for analytics. 📊
7️⃣ **Error Handling** – Automatically alerts via Slack if the workflow encounters an error. 🚨

## Key Benefits
✅ Reduces manual message triage on Facebook Messenger
✅ Ensures consistent and professional customer replies
✅ Provides full visibility via Google Sheets logs
✅ Centralizes team approvals in Slack for faster response times
✅ Leverages GPT-4 for accurate categorization and natural replies

## Features
- Hourly Facebook message fetch with the Graph API
- GPT-4 powered text classification and reply suggestion (see the parsing sketch at the end of this template)
- Slack-based dual approval flow
- Automated Facebook replies post-approval
- Google Sheets logging for all categorized messages
- Built-in error detection and Slack alerting

## Requirements
- Facebook Graph API credentials with page message permissions
- OpenAI API key for GPT-4 processing
- Slack API credentials with chat:write permission
- Google Sheets OAuth2 credentials
- Environment variables: FACEBOOK_PAGE_ID, GOOGLE_SHEET_ID, GOOGLE_SHEET_NAME, SLACK_CHANNEL_ID

## Target Audience
- Marketing and lead-generation teams using Facebook Pages 📣
- Customer support teams managing Messenger queries 💬
- Businesses seeking automated lead routing and CRM sync 🧾
- Teams leveraging AI for customer engagement optimization 🤖

## Step-by-Step Setup Instructions
1️⃣ Connect Facebook Graph API credentials and set your page ID.
2️⃣ Add OpenAI API credentials for GPT-4.
3️⃣ Configure the Slack channel ID and credentials.
4️⃣ Link your Google Sheet for message logging.
5️⃣ Replace environment variable placeholders with your actual IDs.
6️⃣ Test the workflow manually before enabling automation.
7️⃣ Activate the schedule trigger for ongoing hourly execution. ✅
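The AI categorization step is easiest to route when the model is asked to return strict JSON. Below is a minimal sketch of a Code node that validates such output; the `{category, suggestedReply}` shape and the `json.message.content` path are assumptions about the prompt and the OpenAI node's output, not this template's guaranteed schema.

```javascript
// Sketch: parse and validate the GPT-4 classification output.
// Assumes the model was prompted to return {"category": ..., "suggestedReply": ...}.
const raw = $input.first().json.message?.content ?? '{}';

let parsed;
try {
  parsed = JSON.parse(raw);
} catch (e) {
  parsed = { category: 'Query', suggestedReply: '' }; // safe default on bad JSON
}

const allowed = ['Lead', 'Query', 'Spam'];
if (!allowed.includes(parsed.category)) {
  parsed.category = 'Query'; // unknown labels fall back to human review
}

return [{ json: parsed }];
```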
by Daniel Shashko
## How it Works
This workflow automates intelligent Reddit marketing by monitoring brand mentions, analyzing sentiment with AI, and engaging authentically with communities. Every 24 hours, the system searches Reddit for posts containing your configured brand keywords across all subreddits, finding up to 50 of the newest mentions to analyze.

Each discovered post is sent to OpenAI's GPT-4o-mini model for comprehensive analysis. The AI evaluates sentiment (positive/neutral/negative), assigns an engagement score (0-100), determines relevance to your brand, and generates contextual, helpful responses that add genuine value to the conversation. It also classifies the response type (educational/supportive/promotional) and provides reasoning for whether engagement is appropriate.

The workflow intelligently filters posts using a multi-criteria system: only posts that are relevant to your brand, score above 60 in engagement quality, and warrant a response type other than "pass" proceed to engagement (a sketch of this filter appears at the end of this template). This prevents spam and ensures every interaction is meaningful.

Selected posts are processed one at a time through a loop to respect Reddit's rate limits. For each worthy post, the AI-generated comment is posted, and complete interaction data is logged to Google Sheets, including timestamp, post details, sentiment, engagement scores, and success status. This creates a permanent audit trail and analytics database.

At the end of each run, the workflow aggregates all data into a comprehensive daily summary report with total posts analyzed, comments posted, engagement rate, sentiment breakdown, and the top 5 engagement opportunities ranked by score. This report is automatically sent to Slack with formatted metrics, giving your team instant visibility into your Reddit marketing performance.

## Who is this for?
- **Brand managers and marketing teams** needing automated social listening and engagement on Reddit
- **Community managers** responsible for authentic brand presence across multiple subreddits
- **Startup founders and growth marketers** who want to scale Reddit marketing without hiring a team
- **PR and reputation teams** monitoring brand sentiment and responding to discussions in real time
- **Product marketers** seeking organic engagement opportunities in product-related communities
- **Any business** that wants to build an authentic Reddit presence while avoiding spammy marketing tactics

## Setup Steps
**Setup time:** approx. 30-40 minutes (credential configuration, keyword setup, Google Sheets creation, Slack integration)

**Requirements:**
- Reddit account with OAuth2 application credentials (create at reddit.com/prefs/apps)
- OpenAI API key with GPT-4o-mini access
- Google account with a new Google Sheet for tracking interactions
- Slack workspace with posting permissions to a marketing/monitoring channel
- Brand keywords and subreddit strategy prepared

1. **Create a Reddit OAuth application:** Visit reddit.com/prefs/apps, create a "script" type app, and obtain your client ID and secret.
2. **Configure Reddit credentials in n8n:** Add Reddit OAuth2 credentials with your app credentials and authorize access.
3. **Set up the OpenAI API:** Obtain an API key from platform.openai.com and configure it in n8n OpenAI credentials.
4. **Create the Google Sheet:** Set up a new sheet with columns: timestamp, postId, postTitle, subreddit, postUrl, sentiment, engagementScore, responseType, commentPosted, reasoning.
5. **Configure these nodes:**
   - **Brand Keywords Config:** Edit the JavaScript code to include your brand name, product names, and relevant industry keywords.
   - **Search Brand Mentions:** Adjust the limit (default 50) and sort preference based on your needs.
   - **AI Post Analysis:** Customize the prompt to match your brand voice and engagement guidelines.
   - **Filter Engagement-Worthy:** Adjust the engagementScore threshold (default 60) based on your quality standards.
   - **Loop Through Posts:** Configure max iterations and batch size for rate-limit compliance.
   - **Log to Google Sheets:** Replace YOUR_SHEET_ID with your actual Google Sheets document ID.
   - **Send Slack Report:** Replace YOUR_CHANNEL_ID with your Slack channel ID.
6. **Test the workflow:** Run it manually first to verify all connections work, and adjust the AI prompts.
7. **Activate for daily runs:** Once tested, activate the Schedule Trigger to run automatically every 24 hours.

## Node Descriptions
- **Daily Marketing Check** – Schedule trigger runs the workflow automatically every 24 hours
- **Brand Keywords Config** – JavaScript Code node defining the brand keywords to monitor
- **Search Brand Mentions** – Reddit node searches all subreddits for brand keyword mentions
- **AI Post Analysis** – OpenAI analyzes sentiment and relevance, generates contextual helpful comment responses
- **Filter Engagement-Worthy** – Conditional node passes only high-quality, relevant posts worth engaging
- **Loop Through Posts** – Split-in-batches node processes each post individually, respecting rate limits
- **Post Helpful Comment** – Reddit node posts the AI-generated comment to worthy discussions
- **Log to Google Sheets** – Appends all interaction data to the spreadsheet for permanent tracking
- **Generate Daily Summary** – JavaScript aggregates metrics and sentiment breakdown into a comprehensive daily report
- **Send Slack Report** – Posts the formatted daily summary with metrics to the team Slack channel
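The multi-criteria filter referenced above can be expressed as a short condition. This is a sketch assuming the AI analysis step outputs `relevant`, `engagementScore`, and `responseType` fields on each item; the names may differ in your copy of the workflow.

```javascript
// Sketch: keep only posts worth engaging with.
// Assumes each item carries the AI analysis result on item.json.
const worthy = $input.all().filter(item => {
  const a = item.json;
  return a.relevant === true
    && a.engagementScore > 60     // quality threshold (adjustable)
    && a.responseType !== 'pass'; // AI recommended engaging
});

return worthy;
```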
by Nguyen Thieu Toan
# 🤖 Facebook Messenger Smart Chatbot – Batch, Format & Notify with n8n Data Table

## 🌟 What Is This Workflow?
This is a smart chatbot solution built with n8n, designed to integrate seamlessly with Facebook Messenger. It batches incoming messages, formats them for clarity, tracks conversation history, and sends natural replies using AI. Perfect for businesses, customer support, or personal AI agents.

## ⚙️ Key Features
- 🔄 **Smart batching:** Groups consecutive user messages to process them in one go, avoiding fragmented replies.
- 🧠 **Context formatting:** Automatically formats messages to fit Messenger's structure and length limits.
- 📋 **Conversation history tracking:** Stores and retrieves chat logs between user and bot using an n8n Data Table.
- 👀 **Seen & Typing effects:** Adds human-like responsiveness with Messenger's sender actions.
- 🧩 **AI Agent integration:** Easily connects to GPT, Gemini, or any LLM for natural replies, scheduling, or business logic.

## 🚀 How It Works
1. Connects to your Facebook Page via webhook to receive and send messages.
2. Stores incoming messages in a Data Table called `Batch_messages`, including fields like `user_text`, `bot_rep`, `processed`, etc.
3. Collects unprocessed messages, sorts them by `id`, and creates a `merged_message` and full history (see the sketch below).
4. Sends the history to an AI Agent for contextual response generation.
5. Sends the AI reply back to Messenger with Seen/Typing effects.
6. Updates the message status to `processed = true` to prevent duplicate handling.

## 🛠️ Setup Guide
1. Create a Facebook App and Messenger webhook, and link it to your Page.
2. Set up the `Batch_messages` Data Table in n8n with the required columns.
3. Import the workflow or build the nodes manually using the tutorial.
4. Configure your API tokens, webhook URLs, and AI Agent endpoint.
5. Deploy the workflow on a public n8n server.

📘 Full tutorial available at: 👉 Smart Chatbot Workflow Guide by Nguyen Thieu Toan

## 💡 Pro Tips
- Customize the AI prompt and persona to match your business tone.
- Add scheduling, lead capture, or CRM integration using n8n's flexible nodes.
- Monitor your Data Table regularly to ensure clean message flow and batching.

## 👤 About the Creator
Nguyen Thieu Toan (Nguyễn Thiệu Toàn/Jay Nguyen) is an expert in AI automation, business optimization, and chatbot development. With a background in marketing and deep knowledge of n8n workflows, Jay helps businesses harness AI to save time, boost performance, and deliver smarter customer experiences.
Website: https://nguyenthieutoan.com
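The batching step (collect unprocessed rows, sort, merge) can be pictured as a Code node like the sketch below. The column names follow the `Batch_messages` fields mentioned above, but the exact Data Table access pattern in your build may differ; here the rows are assumed to arrive from a preceding Data Table "get" node.

```javascript
// Sketch: merge unprocessed messages from the Batch_messages Data Table.
// Assumes a previous node fetched rows with id, user_text, bot_rep, processed.
const rows = $input.all().map(item => item.json);

const unprocessed = rows
  .filter(r => !r.processed)
  .sort((a, b) => a.id - b.id);

// One consolidated user turn, so the bot replies once instead of fragmenting.
const merged_message = unprocessed.map(r => r.user_text).join('\n');

// Full history (user and bot turns) for the AI Agent's context.
const history = rows
  .sort((a, b) => a.id - b.id)
  .map(r => `User: ${r.user_text}${r.bot_rep ? `\nBot: ${r.bot_rep}` : ''}`)
  .join('\n');

return [{ json: { merged_message, history, ids: unprocessed.map(r => r.id) } }];
```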
by Vlad Arbatov
## Summary
Chat with your AI agent in Telegram. It remembers important facts about you in Airtable, can transcribe your voice messages, search the web, read and manage Google Calendar, fetch Gmail, and query Notion. Responses are grounded in your recent memories and tool outputs, then sent back to Telegram.

## What this workflow does
- Listens to your Telegram messages (text or voice)
- Maintains short-term chat memory per user and long-term memory in Airtable
- Decides when to save new facts about you (auto "Save Memory" without telling you)
- Uses tools on demand:
  - Web search via SerpAPI
  - Google Calendar: list/create/update/delete events
  - Gmail: list and read messages
  - Notion: fetch database info
- Transcribes Telegram voice notes with OpenAI and feeds them to the agent
- Combines live tool results + recent memories and replies in Telegram

## Apps and credentials
- Telegram Bot API: personal_bot
- xAI Grok: Grok-4 model for chat
- OpenAI: speech-to-text (transcribe audio)
- Airtable: store long-term memories
- Google Calendar: calendar actions
- Gmail: email actions
- Notion: knowledge and reading lists
- SerpAPI: web search

## Typical use cases
- Personal assistant that remembers preferences, decisions, and tasks
- Create/update meetings by chatting, and get upcoming events
- Ask "what did I say I'm reading?" or "what's our plan from last week?"
- Voice-first capture: send a voice note → get a transcribed, actionable reply
- Fetch recent emails or look up info on the web without leaving Telegram
- Query a Notion database (e.g., "show me the Neurocracy entries")

## How it works (node-by-node)
1. **Telegram Trigger** – Receives messages from your Telegram chat (text and optional voice).
2. **Text vs Message Router** – Routes based on message contents (sketched just below):
   - Text path → goes directly to the Agent (AI).
   - Voice path → downloads the file and transcribes it before the AI.
   - Always also fetches recent Airtable memories for context.
3. **Get a file (Telegram)** – Downloads the voice file (`voice.file_id`) when present.
4. **Transcribe a recording (OpenAI)** – Converts audio to text so the agent can use it like a normal message.
5. **Get memories (Airtable)** – Searches your "Agent Memories" base/table, filtered by user, sorted by Created.
6. **Aggregate** – Bundles recent memory records into a single "Memories" array with text + timestamp.
7. **Merge** – Combines the current input (text or transcript) with the memory bundle before the agent.
8. **Simple Memory (agent memory window)** – Short-term session memory keyed by Telegram chat ID; keeps the recent 30 turns.
9. **Tools wired into the agent:**
   - SerpAPI
   - Google Calendar tools: Get many events, Create an event, Update an event, Delete an event
   - Gmail tools: Get many messages, Get a message
   - Notion tool: Get a database
   - Airtable tool: Save Memory (stores distilled facts about the user)
10. **Agent** – The system prompt defines role, tone, and rules:
    - Be a friendly assistant.
    - On each message, decide if it contains user info worth saving. If yes, call "Save Memory" to persist a short summary in Airtable. Don't announce memory saves; just continue helping.
    - Use tools when needed (web, calendar, Gmail, Notion).
    - Think with the provided memory context block.
    - Uses the xAI Grok Chat Model for reasoning and tool-calling; it can call Save Memory, Calendar, Gmail, Notion, and SerpAPI tools as needed.
11. **Save Memory (Airtable)** – Persists Memory and User fields to the "Agent Memories" base; Airtable adds the timestamp automatically.
12. **Send a text message (Telegram)** – Sends the agent's final answer back to the same Telegram chat ID.
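As a sketch, the text-vs-voice routing decision reduces to checking which fields the Telegram update carries. The shipped template routes with an If node, so treat the Code node below as a logical equivalent rather than the actual node.

```javascript
// Sketch: decide whether the incoming Telegram update is text or voice.
const msg = $input.first().json.message ?? {};

const route = msg.voice?.file_id
  ? 'voice'   // download the file, transcribe with OpenAI, then go to the agent
  : 'text';   // go straight to the agent

return [{ json: { route, chatId: msg.chat?.id, text: msg.text ?? '' } }];
```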
## Node map
| Node | Type | Purpose |
|---|---|---|
| Telegram Trigger | Trigger | Receive text/voice from Telegram |
| Text vs voice router | Flow control | Route text vs voice; also trigger memories fetch |
| Get a file | Telegram | Download voice audio |
| Transcribe a recording | OpenAI | Speech-to-text for voice notes |
| Get memories | Airtable | Load recent user memories |
| Aggregate | Aggregate | Pack memory records into "Memories" array |
| Merge | Merge | Combine input and memories before agent call |
| Simple Memory | Agent memory | Short-term chat memory per chat ID |
| xAI Grok Chat Model | LLM | Core reasoning model for the Agent |
| Search Web with SerpAPI | Tool | Web search |
| Google Calendar tools | Tool | List/create/update/delete events |
| Gmail tools | Tool | Search and read email |
| Notion tool | Tool | Query a Notion database |
| Save Memory | Airtable Tool | Persist distilled user facts |
| AI Agent | Agent | Orchestrates tools + memory, produces the answer |
| Send a text message | Telegram | Reply to the user in Telegram |

## Before you start
- Create a Telegram bot and get your token (via @BotFather).
- Put your Telegram user ID into the Telegram Trigger node (chatIds).
- Connect credentials:
  - xAI Grok (model: grok-4-0709)
  - OpenAI (for audio transcription)
  - Airtable (Agent Memories base and table)
  - Google Calendar OAuth
  - Gmail OAuth
  - Notion API
  - SerpAPI key
- Adjust the Airtable "User" value and the filterByFormula to match your name or account.

## Setup instructions
### 1) Telegram
- Telegram Trigger:
  - `additionalFields.chatIds` = your_telegram_id
  - `download = true` to allow voice handling
- Send a text message:
  - `chatId = {{ $('Telegram Trigger').item.json.message.chat.id }}`

### 2) Memory
- The Airtable base/table must exist with fields: Memory, User, Created (Created is auto-managed).
- In the Save Memory and Get memories nodes, align Base, Table, and filterByFormula with your setup (a formula sketch follows below).
- Simple Memory:
  - `sessionKey = {{ $('If').item.json.message.chat.id }}`
  - `contextWindowLength = 30` (adjust as needed)

### 3) Tools
- Google Calendar: choose your calendar; test get/create/update/delete.
- Gmail: set "returnAll/simplify/messageId" via $fromAI or static defaults.
- Notion: set your databaseId.
- SerpAPI: ensure the key is valid.

### 4) Agent (AI node)
- SystemMessage: customize role, name, and any constraints.
- Text input: concatenates the transcript or text into one prompt: `{{ $json.text }}{{ $json.message.text }}`

## How to use
- Send a text or voice message to your bot in Telegram.
- The agent replies in the same chat, optionally performing tool actions.
- New personal facts you mention are silently summarized and stored in Airtable for future context.

## Customization ideas
- Replace Grok with another LLM if desired.
- Add more tools: Google Drive, Slack, Jira, GitHub, etc.
- Expand the memory schema (e.g., tags, categories, confidence).
- Add guardrails: profanity filters, domain limits, or cost control.
- Multi-user support: store chat-to-user mapping and separate memories by user.
- Add summaries: a daily recap message created from new memories.

## Limits and notes
- Tool latency: calls to Calendar, Gmail, Notion, and SerpAPI add response time.
- Audio size/format: OpenAI transcription works best with common formats and short clips.
- Memory growth: periodically archive old Airtable entries, or change the Aggregate window.
- Timezone awareness: Calendar operations depend on your Google Calendar settings.
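For the Get memories step, an Airtable filterByFormula scoped to one user typically looks like the sketch below; the `User` and `Created` field names come from this template, while the user value is a placeholder to substitute with your own.

```javascript
// Sketch: build the Airtable filterByFormula for the Get memories node.
const user = 'your_name_here'; // assumption: matches your User field values
const filterByFormula = `{User} = '${user}'`;

// Sorting newest-first on the auto-managed Created field keeps recent
// memories at the top of the window the agent sees.
const sort = [{ field: 'Created', direction: 'desc' }];

return [{ json: { filterByFormula, sort } }];
```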
## Privacy and safety
- Sensitive info may be saved to Airtable; restrict access to the base.
- Tool actions operate under your connected accounts; review scopes and permissions.
- The agent may call external APIs (SerpAPI, OpenAI); ensure this aligns with your policies.

## Example interactions
- "Schedule a 30-min catch-up with Alex next Tuesday afternoon."
- "What meetings do I have in the next 4 weeks?"
- "Summarize my latest emails from Product Updates."
- "What did I say I'm reading?" (the agent recalls from memories)
- Voice note: "Remind me to call the dentist this Friday morning." → the agent transcribes it and creates an event.

## Tags
telegram, agent, memory, grok, openai, airtable, google-calendar, gmail, notion, serpapi, voice, automation

## Changelog
- v1: First release with Telegram agent, short/long-term memory, voice transcription, and tool integrations for web, calendar, email, and Notion.
by Laiba
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## How it works
1. **User uploads a PDF:** The workflow accepts a PDF via webhook.
2. **Extract text:** n8n extracts the text content from the PDF.
3. **Summarize with AI:** The extracted text is passed to an AI model on Groq (used via an OpenAI-compatible model node) for summarization (an optional chunking sketch follows below).
4. **Generate audio:** The summary text is sent to a TTS (text-to-speech) API (Qwen-TTS-Demo); you can use other free alternatives.
5. **Serve result:** The workflow outputs both the summary and an audio file URL (WAV link) that you can attach to your audio player. This lets users read or listen to the summary instantly.

## How to use / Requirements
- **Import workflow:** Copy/paste the workflow JSON into your n8n instance.
- **Set up the input trigger:** If you want users to upload directly, you can use a webhook or any other trigger.
- **Configure the AI node:** Add your own API key for Groq / OpenAI.
- **Configure the TTS node:** Add credentials for your chosen TTS service.
- **Run the workflow:** Upload a PDF and get back the summary and audio file URL.

n8n-smart pdf summarizer & voice generator

Please reach out to me at Laiba Zubair if you need further assistance with your n8n workflows and automations!
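Long PDFs can exceed the model's context window, so a small Code node between extraction and summarization can cap or chunk the text. This is an optional sketch, not part of the shipped workflow; the `text` field name and the character budget are assumptions.

```javascript
// Optional sketch: cap/chunk extracted PDF text before summarization.
// Assumes the PDF extraction node outputs the full text in json.text.
const text = $input.first().json.text ?? '';
const MAX_CHARS = 12000; // rough budget; tune to your model's context window

const chunks = [];
for (let i = 0; i < text.length; i += MAX_CHARS) {
  chunks.push(text.slice(i, i + MAX_CHARS));
}

// One item per chunk; summarize each, then merge the partial summaries.
return chunks.map((chunk, index) => ({ json: { chunk, index } }));
```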
by Paul Abraham
This n8n template demonstrates how to turn a Telegram bot into a personal AI-powered assistant that understands both voice notes and text messages. The assistant can transcribe speech, interpret user intent with AI, and perform smart actions such as managing calendars, sending emails, or creating notes.

## Use cases
- Hands-free scheduling with Google Calendar
- Quickly capturing ideas as Notion notes via voice
- Sending Gmail messages directly from Telegram
- A personal productivity assistant available on the go

## Good to know
- Voice notes are automatically transcribed into text before being processed.
- This template uses Google Gemini for AI reasoning. The AI agent supports memory, enabling more natural and contextual conversations.

## How it works
1. **Telegram Trigger** – Starts when you send a text or voice note to your Telegram bot.
2. **Account Check** – Ensures only authorized users can interact with the bot (see the sketch below).
3. **Audio Handling** – If it's a voice message, the workflow retrieves and transcribes the recording.
4. **AI Agent** – Both transcribed voice and text are sent to the AI Agent powered by Google Gemini + Simple Memory.
5. **Smart Actions** – Based on the query, the AI can:
   - Read or create events in Google Calendar
   - Create notes in Notion
   - Send messages in Gmail
6. **Reply in Telegram** – The bot sends a response confirming the action or providing the requested information.

## How to use
1. Clone this workflow into your n8n instance.
2. Replace the Telegram Trigger with your bot credentials.
3. Connect Google Calendar, Notion, and Gmail accounts where required.
4. Start chatting with your Telegram bot to add events, notes, or send emails using just your voice or text.

## Requirements
- Telegram bot & API key
- Google Gemini account for AI
- Google Calendar, Notion, and Gmail integrations (optional, depending on use case)

## Customising this workflow
- Add more integrations (Slack, Trello, Airtable, etc.) for extended productivity.
- Modify the AI prompt in the agent node to fine-tune personality or task focus.
- Swap in another transcription service if preferred.
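The Account Check step is essentially an allowlist test on the sender's Telegram ID. Below is a minimal sketch of equivalent logic in a Code node; the `ALLOWED_IDS` value is a placeholder, and the shipped template may implement this with an If node instead.

```javascript
// Sketch: allow only known Telegram user IDs to use the bot.
const ALLOWED_IDS = [123456789]; // placeholder: your own Telegram user ID(s)

const msg = $input.first().json.message ?? {};
const fromId = msg.from?.id;

if (!ALLOWED_IDS.includes(fromId)) {
  // Stop the workflow for strangers; an If node's false branch works too.
  throw new Error(`Unauthorized Telegram user: ${fromId}`);
}

return [{ json: { authorized: true, chatId: msg.chat?.id, text: msg.text ?? '' } }];
```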
by Jainik Sheth
## What is this?
This RAG workflow allows you to build a smart chat assistant that can answer user questions based on any collection of documents you provide. It automatically imports and processes files from Google Drive, stores their content in a searchable vector database, and retrieves the most relevant information to generate accurate, context-driven responses. The workflow manages chat sessions and keeps the document database current, making it adaptable for use cases like customer support, internal knowledge bases, or an HR assistant.

## How it works
### 1. Chat RAG Agent
- Uses OpenAI for responses, referencing only specific data from the vector store (data uploaded to the Google Drive folder).
- Maintains chat history in Postgres using a session key from the chat input.

### 2. Data Pipeline (File Ingestion)
- Monitors Google Drive for new/updated files and automatically updates them in the vector store.
- Downloads, extracts, and processes file content (PDFs, Google Docs).
- Generates embeddings and stores them in the Supabase vector store for retrieval (see the chunking sketch below).

### 3. Vector Store Cleanup
- Scheduled and manual routines remove duplicate or outdated entries from the Supabase vector store.
- Ensures only the latest and unique documents are available for retrieval.

### 4. File Management
- Handles folder and file creation, upload, and metadata assignment in Google Drive.
- Ensures files are organized and linked with their corresponding vector store entries.

## Getting Started
1. Create and connect all relevant credentials:
   - Google Drive
   - Postgres
   - Supabase
   - OpenAI
2. Run the table creation nodes first to set up your database tables in Postgres.
3. Upload your documents through Google Drive (or swap in a different file storage solution).
4. The agent will process them automatically (chunking text, storing tabular data in Postgres).
5. Start asking questions that leverage the agent's multiple reasoning approaches.

## Customization (optional)
This template provides a solid foundation that you can extend by:
- Tuning the system prompt for your specific use case
- Adding document metadata like summaries
- Implementing more advanced RAG techniques
- Optimizing for larger knowledge bases

Note: if you're using different nodes, e.g. for file storage or the vector store, the integration may vary a little.

## Prerequisites
- Google account (Google Drive)
- Supabase account
- OpenAI API
- Postgres account
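Before embedding, documents are typically split into overlapping chunks. The sketch below shows that idea in a plain Code node; the 1000/200 character sizes are common defaults, not values read from this template, which may use n8n's built-in text splitter instead.

```javascript
// Sketch: split extracted document text into overlapping chunks for embedding.
// Sizes are illustrative defaults; n8n's text splitter node can do this too.
const text = $input.first().json.text ?? '';
const CHUNK_SIZE = 1000;
const OVERLAP = 200;

const chunks = [];
for (let start = 0; start < text.length; start += CHUNK_SIZE - OVERLAP) {
  chunks.push(text.slice(start, start + CHUNK_SIZE));
}

// Each chunk becomes one item to embed and upsert into Supabase.
return chunks.map((chunk, i) => ({ json: { chunk, chunkIndex: i } }));
```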
by Rakin Jakaria
## Who this is for
This workflow is for content creators, digital marketers, or YouTube strategists who want to automatically discover trending videos in their niche, analyze engagement metrics, and get data-driven insights for their content strategy, all from one simple form submission.

## What this workflow does
This workflow starts every time someone submits the YouTube Trends Finder Form. It then:
1. **Searches YouTube videos** based on your topic and specified time range using the YouTube Data API.
2. **Fetches detailed analytics** (views, likes, comments, engagement rates) for each video found.
3. **Calculates engagement rates** and filters out low-performing content (below 2% engagement); a sketch of this calculation appears at the end of this template.
4. **Applies smart filters** to exclude videos with fewer than 1000 views, content outside your timeframe, and hashtag-heavy titles.
5. **Removes duplicate videos** to ensure clean data.
6. **Creates a Google Spreadsheet** with all trending video data organized by performance metrics.
7. **Delivers the results** via a completion form with a direct link to your analytics report.

## Setup
To set this workflow up:
1. **Form Trigger** – Customize the "YouTube Trends Finder" form fields if needed (Topic Name, Last How Many Days).
2. **YouTube Data API** – Add your YouTube OAuth2 credentials and API key in the respective nodes.
3. **Google Sheets** – Connect your Google Sheets account for automatic report generation.
4. **Engagement Filters** – Adjust the 2% engagement rate threshold based on your quality standards.
5. **View Filters** – Modify the minimum view count (currently 1000+) in the filter conditions.
6. **Regional Settings** – Update the region code (currently "US") to target specific geographic markets.

## How to customize this workflow to your needs
- Change the engagement rate threshold to be more or less strict based on your niche requirements.
- Add additional filters like video duration, subscriber count, or specific keywords to refine results.
- Modify the Google Sheets structure to include extra metrics like "Channel Name", "Video Duration", or "Trending Score".
- Switch to different output formats like CSV export or direct email reports instead of Google Sheets.
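For reference, the engagement calculation and the filters can be sketched as one Code node. The `viewCount`/`likeCount`/`commentCount` names mirror the YouTube Data API `statistics` fields, but the exact item shape coming out of the n8n YouTube node may differ, so treat this as an assumption; the (likes + comments) / views formula is a common definition of engagement rate, not necessarily the template's exact one.

```javascript
// Sketch: compute engagement rate and drop low performers.
// Assumes each item carries YouTube statistics plus a title on item.json.
const kept = $input.all().filter(item => {
  const v = item.json;
  const views = Number(v.viewCount ?? 0);
  const likes = Number(v.likeCount ?? 0);
  const comments = Number(v.commentCount ?? 0);

  const engagementRate = views > 0 ? ((likes + comments) / views) * 100 : 0;
  v.engagementRate = Number(engagementRate.toFixed(2));

  const hashtags = (v.title?.match(/#/g) ?? []).length;

  return views >= 1000          // minimum view filter
    && engagementRate >= 2      // 2% engagement threshold
    && hashtags <= 3;           // skip hashtag-heavy titles
});

return kept;
```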