by Amit Mehta
# Streamline Your Zoom Meetings with Secure, Automated Stripe Payments

This comprehensive workflow automates the entire process of setting up a paid online event, from scheduling a Zoom meeting and creating a Stripe payment link to tracking participants and sending confirmation emails.

## How it Works

This workflow has two primary, distinct branches: Event Creation and Participant Registration.

**Event Creation Flow (Triggered via Form):**
1. An administrator submits details (title, price, date/time) via a form.
2. The workflow creates a new Zoom meeting with a unique password.
3. It creates a Stripe Product and a Payment Link.
4. A dedicated Google Sheet tab is created for tracking participants.
5. An email is sent to the event organizer with all the details, including the Zoom link, payment link, and participant list URL.

**Participant Registration Flow (Triggered via Stripe Webhook):**
1. A webhook is triggered when a Stripe payment is completed (`checkout.session.completed`).
2. The participant's details are added to the dedicated Google Sheet tab.
3. A confirmation email is sent to the participant with the Zoom link and password.
4. A notification email is sent to the event organizer about the new registration.

## Use Cases
- **Webinar Sales**: Automate setup and registration for paid webinars.
- **Consulting/Coaching Sessions**: Streamline the booking and payment process for group coaching calls.
- **Online Classes**: Handle registration, payment, and access distribution for online courses or classes.

## Setup Instructions
1. **Credentials**: Add credentials for:
   - Zoom: for creating the meeting.
   - Google: you need both Gmail and Google Sheets credentials.
   - Stripe: for creating products and handling payment webhooks.
2. **Google Sheet**: Create a new, blank Google Sheet to hold meeting and participant information.
3. **Config Node**: Fill the Config node with:
   - `currency` (e.g., EUR)
   - `sheet_url` (the URL of the Google Sheet you created)
   - `teacher_email` (the organizer/host's email)

## Workflow Logic

The workflow splits into two logical parts handled by an If node.

**Part A: Event Creation (Triggered by Creation Form)**
1. Trigger: Creation Form (Form Trigger).
2. Check: "if is creation flow" (If) evaluates to true.
3. Zoom: "Create Zoom meeting" creates the session.
4. Stripe Product: "Create Stripe Product" creates a product and price in Stripe.
5. Stripe Link: "Create payment link" generates the public payment link, embedding Zoom and sheet metadata.
6. Google Sheet: "Create participant list" creates a new sheet tab for the event.
7. Email Host: "Send email to teacher" notifies the host of the successful setup.

**Part B: Participant Registration (Triggered by On payment)**
1. Trigger: On payment (Stripe Trigger, `checkout.session.completed`).
2. Format: "Format participant" extracts customer details.
3. Google Sheet: "Add participant to list" appends the new participant's info to the event's sheet.
4. Email Participant: "Send confirmation to participant" sends the Zoom access details.
5. Email Host: "Notify teacher" sends a registration alert.

## Node Descriptions

| Node Name | Description |
|-----------|-------------|
| Creation Form | A form trigger used to input the event's required details (title, price, start date/time). |
| On payment | A Stripe trigger that listens for the `checkout.session.completed` event, indicating a successful payment. |
| Create Zoom meeting | Creates a new Zoom meeting, calculating the start time based on the form inputs. |
| Create Stripe Product | Posts to the Stripe API to create a new product and price based on the form data. |
| Create payment link | Creates a Stripe Payment Link, embedding Zoom meeting and Google Sheet ID metadata. |
| Create participant list | Creates a new tab (named dynamically) in the configured Google Sheet for event tracking. |
| Add participant to list | Appends a new row to the event's Google Sheet tab upon payment completion. |
| Send email to teacher / Notify teacher | Sends emails to the host/organizer for creation confirmation and new participant registration, respectively. |
| Send confirmation to participant | Sends the welcome email to the paying customer with the Zoom access details retrieved from the Stripe metadata. |

## Customization Tips
- **Email Content**: Adapt the email contents in the Gmail nodes to fit your branding and tone.
- **Currency**: Change the currency in the Config node.
- **Zoom Password**: The password is a random 4-character string; you can modify the logic in the Create Zoom meeting node (see the sketch below).
- **Stripe Price**: The price is sent to Stripe in the smallest currency unit (e.g., cents, `* 100`); see the sketch below.
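The two tips above involve small bits of logic. Here is a minimal Code-node sketch, assuming form fields named `title` and `price`, that converts the form price to Stripe's smallest currency unit and generates the random 4-character Zoom password; treat the field names as placeholders for whatever your form trigger actually produces.

```javascript
// n8n Code node sketch for the event-creation branch.
// Assumption: the form trigger outputs `price` (e.g. "19.99").
const price = Number($json.price);            // e.g. 19.99 (EUR)
const unitAmount = Math.round(price * 100);   // Stripe expects cents: 1999

// Random 4-character alphanumeric password for the Zoom meeting
// (ambiguous characters like 0/O and 1/l are left out of the alphabet).
const chars = 'abcdefghijkmnpqrstuvwxyz23456789';
let zoomPassword = '';
for (let i = 0; i < 4; i++) {
  zoomPassword += chars[Math.floor(Math.random() * chars.length)];
}

return [{ json: { ...$json, unitAmount, zoomPassword } }];
```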
## Suggested Sticky Notes for Workflow
- **Setup**: "Add your credentials [Zoom, Google, Stripe]. Note: for Google, you need to add Gmail and Google Sheets. Create a new Google Sheet. Keep this sheet blank for now. And fill the Config node."
- **Creation Form**: "Your journey to easy event management starts here. Click this node, copy the production URL, and keep it handy. It's your personal admin tool for quickly creating new meetings."
- **Customize**: "Feel free to adapt email contents to your needs."
- **Config**: "Set up your flow."

## Required Files
- `2DT5BW5tOdy87AUl_Streamline_Your_Zoom_Meetings_with_Secure,_Automated_Stripe_Payments.json`: the n8n workflow export file.
- A new, blank Google Sheet (URL configured in the Config node).

## Testing Tips
- **Test Creation**: Run the Creation Form to trigger the Part A flow. Verify that a Zoom meeting and Stripe Payment Link are created, a new Google Sheet tab appears, and the host receives the setup email.
- **Test Registration**: Simulate a successful payment to the generated Stripe link to trigger the Part B flow. Verify that the participant is added to the Google Sheet, receives the confirmation email with Zoom details, and the host receives the notification.

## Suggested Tags & Categories
#Stripe #Zoom #Payment #E-commerce #GoogleSheets #Gmail #Automation #Webinar
by Atik
Automate video transcription and Q&A with async VLM processing that scales from short clips to long recordings.

## What this workflow does
- Monitors Google Drive for new files in a specific folder and grabs the file ID on create.
- Automatically downloads the binary to hand off for processing.
- Sends the video to VLM Run for async transcription, with a callback URL that posts results back to n8n.
- Receives the transcript JSON via Webhook and appends a row in Google Sheets with the video identifier and transcript data.
- Enables chat Q&A through the Chat Trigger + AI Agent. The agent fetches relevant rows from Sheets and answers only from those segments using the connected chat model.

## Setup

**Prerequisites:** Google Drive and Google Sheets accounts, VLM Run API credentials, OpenAI (or another supported) chat model credentials, and an n8n instance.

Install the verified VLM Run node by searching for "VLM Run" in the nodes list, then click Install. You can also confirm on npm if needed. After install, it integrates directly for robust async transcription.

**Quick Setup:**
1. **Google Drive folder watch** – Add a Google Drive Trigger and choose Specific folder. Set polling to every minute and the event to File Created. Connect Drive OAuth2.
2. **Download the new file** – Add a Google Drive node with the Download operation. Map `{{$json.id}}` and save the binary as `data`.
3. **Async transcription with VLM Run** – Add the VLM Run node. Operation: video. Domain: video.transcription. Enable Process Asynchronously and set the Callback URL to your Webhook path (for example `/transcript-video`). Add your VLM Run API key.
4. **Webhook to receive results** – Add a Webhook node with method POST and path `/transcript-video`. This is the endpoint VLM Run calls when the job completes. Use When Last Node Finishes, or respond via a Respond node if you prefer.
5. **Append to Google Sheets** – Add a Google Sheets node with the Append operation. Point it to your spreadsheet and sheet, and map (a hedged mapping sketch appears at the end of this section):
   - Video Name → the video identifier from the webhook payload
   - Data → the transcript text or JSON from the webhook payload
   Connect Google Sheets OAuth2.
6. **Chat entry point and Agent** – Add a Chat Trigger to receive user questions. Add an AI Agent and connect a Chat Model (for example OpenAI Chat Model) and the Google Sheets Tool to read relevant rows. In the Agent system message, instruct it to: use the Sheets tool to fetch transcript rows matching the question; answer only from those rows; cite or reference row context as needed.
7. **Test and activate** – Upload a sample video to the watched Drive folder. Wait for the callback to populate your sheet. Ask a question through the Chat Trigger and confirm the agent quotes only from the retrieved rows. Activate your template and let it automate the task.

## How to take this further
- **Team memory:** Ask "What did we decide on pricing last week?" and get the exact clip and answer.
- **Study helper:** Drop classes in, then ask for key points or formulas by topic.
- **Customer FAQ builder:** Turn real support calls into answers your team can reuse.
- **Podcast highlights:** Find quotes, tips, and standout moments from each episode.
- **Meeting catch-up:** Get decisions and action items from any recording, fast.
- **Marketing snippets:** Pull short, social-ready lines from long demos or webinars.
- **Team learning hub:** Grow a searchable video brain that remembers everything.

This workflow uses the VLM Run node for scalable, async video transcription and the AI Agent for grounded Q&A from Sheets, giving you a durable pipeline from upload to searchable answers with minimal upkeep.
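As a reference for step 5, here is a minimal Code-node sketch of the webhook-to-Sheets mapping. The callback field names (`videoName`, `response.segments`) are assumptions for illustration only; inspect a real VLM Run callback execution in n8n and adjust the paths to match.

```javascript
// n8n Code node: shape the VLM Run callback into one Sheets row.
// Assumption: payload field names below are illustrative -- check the
// actual webhook execution data and adjust.
const body = $json.body ?? $json;

const videoName = body.videoName ?? body.file_id ?? 'unknown-video';
const segments = body.response?.segments ?? [];

// Flatten segment texts into a single transcript string.
const transcript = segments
  .map(s => (s.text ?? '').trim())
  .filter(Boolean)
  .join(' ');

return [{ json: { 'Video Name': videoName, 'Data': transcript || JSON.stringify(body) } }];
```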
by Daniel Shashko
## How it Works

This workflow automates intelligent Reddit marketing by monitoring brand mentions, analyzing sentiment with AI, and engaging authentically with communities.

Every 24 hours, the system searches Reddit for posts containing your configured brand keywords across all subreddits, finding up to 50 of the newest mentions to analyze.

Each discovered post is sent to OpenAI's GPT-4o-mini model for comprehensive analysis. The AI evaluates sentiment (positive/neutral/negative), assigns an engagement score (0-100), determines relevance to your brand, and generates contextual, helpful responses that add genuine value to the conversation. It also classifies the response type (educational/supportive/promotional) and provides reasoning for whether engagement is appropriate.

The workflow filters posts using a multi-criteria system: only posts that are relevant to your brand, score above 60 in engagement quality, and warrant a response type other than "pass" proceed to engagement. This prevents spam and ensures every interaction is meaningful.

Selected posts are processed one at a time through a loop to respect Reddit's rate limits. For each worthy post, the AI-generated comment is posted, and complete interaction data is logged to Google Sheets, including timestamp, post details, sentiment, engagement scores, and success status. This creates a permanent audit trail and analytics database.

At the end of each run, the workflow aggregates all data into a comprehensive daily summary report with total posts analyzed, comments posted, engagement rate, sentiment breakdown, and the top 5 engagement opportunities ranked by score. This report is automatically sent to Slack with formatted metrics, giving your team instant visibility into your Reddit marketing performance.

## Who is this for?
- **Brand managers and marketing teams** needing automated social listening and engagement on Reddit
- **Community managers** responsible for authentic brand presence across multiple subreddits
- **Startup founders and growth marketers** who want to scale Reddit marketing without hiring a team
- **PR and reputation teams** monitoring brand sentiment and responding to discussions in real time
- **Product marketers** seeking organic engagement opportunities in product-related communities
- **Any business** that wants to build an authentic Reddit presence while avoiding spammy marketing tactics

## Setup Steps

**Setup time:** approx. 30–40 minutes (credential configuration, keyword setup, Google Sheets creation, Slack integration)

**Requirements:**
- Reddit account with OAuth2 application credentials (create at reddit.com/prefs/apps)
- OpenAI API key with GPT-4o-mini access
- Google account with a new Google Sheet for tracking interactions
- Slack workspace with posting permissions to a marketing/monitoring channel
- Brand keywords and subreddit strategy prepared

1. **Create a Reddit OAuth application:** Visit reddit.com/prefs/apps, create a "script" type app, and obtain your client ID and secret.
2. **Configure Reddit credentials in n8n:** Add Reddit OAuth2 credentials with your app credentials and authorize access.
3. **Set up the OpenAI API:** Obtain an API key from platform.openai.com and configure it in n8n OpenAI credentials.
4. **Create a Google Sheet:** Set up a new sheet with columns: timestamp, postId, postTitle, subreddit, postUrl, sentiment, engagementScore, responseType, commentPosted, reasoning.
5. **Configure these nodes:**
   - Brand Keywords Config: edit the JavaScript code to include your brand name, product names, and relevant industry keywords (a hedged sketch follows after this list).
   - Search Brand Mentions: adjust the limit (default 50) and sort preference based on your needs.
   - AI Post Analysis: customize the prompt to match your brand voice and engagement guidelines.
   - Filter Engagement-Worthy: adjust the engagementScore threshold (default 60) based on your quality standards.
   - Loop Through Posts: configure max iterations and batch size for rate-limit compliance.
   - Log to Google Sheets: replace YOUR_SHEET_ID with your actual Google Sheets document ID.
   - Send Slack Report: replace YOUR_CHANNEL_ID with your Slack channel ID.
6. **Test the workflow:** Run it manually first to verify all connections work, and adjust the AI prompts.
7. **Activate for daily runs:** Once tested, activate the Schedule Trigger to run automatically every 24 hours.
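For the Brand Keywords Config step above, a minimal sketch of the JavaScript Code node might look like this; the keyword values are placeholders, not part of the original template.

```javascript
// n8n Code node ("Brand Keywords Config"): one item per search keyword.
// The keyword values below are hypothetical -- replace them with your
// own brand, product, and industry terms.
const keywords = [
  'AcmeCRM',               // brand name (placeholder)
  'Acme CRM pricing',      // product + intent phrase (placeholder)
  'best CRM for startups', // industry keyword (placeholder)
];

// Emit one n8n item per keyword so the Reddit search node
// can run once for each term.
return keywords.map(keyword => ({ json: { keyword } }));
```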
## Node Descriptions
- **Daily Marketing Check** – Schedule trigger that runs the workflow automatically every 24 hours.
- **Brand Keywords Config** – JavaScript Code node defining the brand keywords to monitor.
- **Search Brand Mentions** – Reddit node that searches all subreddits for brand keyword mentions.
- **AI Post Analysis** – OpenAI node that analyzes sentiment and relevance and generates contextual, helpful comment responses.
- **Filter Engagement-Worthy** – Conditional node that passes only high-quality, relevant posts worth engaging.
- **Loop Through Posts** – Split In Batches node that processes each post individually, respecting rate limits.
- **Post Helpful Comment** – Reddit node that posts the AI-generated comment to worthy discussions.
- **Log to Google Sheets** – Appends all interaction data to the spreadsheet for permanent tracking.
- **Generate Daily Summary** – JavaScript node that aggregates metrics and sentiment breakdown into a comprehensive daily report.
- **Send Slack Report** – Posts the formatted daily summary with metrics to the team Slack channel.
by Ruben AI
# AI-Powered Flyer & Video Generator with Airtable, Klie.ai, and n8n

## Who is this for?
This template is perfect for e-commerce entrepreneurs, marketers, agencies, and creative teams who want to turn simple product photos and short descriptions into professional flyers or product videos—automatically and at scale. If you want to generate polished marketing assets without relying on designers or editors, this is for you.

## What problem is this workflow solving?
Creating product ads, flyers, or videos usually involves multiple tools and manual steps:
- Collecting and cleaning product photos
- Writing ad copy or descriptions
- Designing flyers or visuals for campaigns
- Producing animations or video ads
- Managing multiple revisions and approvals

This workflow automates the entire pipeline. Upload a raw product image into Airtable, type a quick description, and receive back a flyer or video animation tailored to your brand and context—ready to use for ads, websites, or campaigns.

## What this workflow does
- Uses Airtable as the central interface where you upload raw product photos and enter descriptions
- Processes the content automatically via n8n
- Generates flyers and visuals using OpenAI Image 1
- Produces custom product videos with Google's VEO3
- Runs through Klie.ai to unify the image + video generation process
- Sends the final creative assets back into Airtable for review and download

## Setup
1. Download the n8n files and connect your Airtable token to n8n.
2. Duplicate the Airtable base and make sure you're on an Airtable Team plan.
3. Add your API key on the Airtable interface under API setup.
4. Create your agency inside the interface.
5. Start generating concept images and videos instantly.

## How to customize this workflow to your needs
- Edit the prompts to match your brand voice and ad style
- Extend Airtable fields to include more creative parameters (colors, layout, target audience)
- Add approval steps via email, Slack, or Airtable statuses before finalizing
- Integrate with publishing platforms (social media, e-commerce CMS) for auto-posting
- Track generated assets inside Airtable for team collaboration

🎥 Demo Video: Demo Video
by Avkash Kakdiya
## How it works
This workflow automatically collects a list of companies from Google Sheets, searches for their competitors using SerpAPI, extracts up to 10 relevant competitor names with source links, and logs the results into both Google Sheets and Airtable. It runs on a set schedule, cleans and formats the company list, processes each entry individually, checks whether competitors exist, and separates results into "successful" and "no competitors found" lists for organized tracking.

## Step-by-step

### 1. Trigger & Input
- **Auto Run (Scheduled)** – Executes every day at the set time (e.g., 9 AM).
- **Read Companies Sheet** – Pulls the list of companies from a Google Sheet (`List` column).
- **Clean & Format Company List** – Removes empty rows, trims names, and attaches row numbers for tracking (a hedged sketch appears at the end of this section).
- **Loop Over Companies** – Processes each company one at a time in batches.

### 2. Competitor Search
- **Search Company Competitors (SerpAPI)** – Sends a query like `"{Company} competitors"` to SerpAPI, retrieving structured search results in JSON format.

### 3. Data Extraction & Validation
- **Extract Competitor Data from Search** – Parses SerpAPI results to:
  - Identify the company name
  - Extract up to 10 competitor names
  - Capture the top source URL
  - Count total search results
- **Has Competitors?** – Checks if any competitors were found:
  - Yes → proceeds to logging
  - No → logs in the "no results" list

### 4. Logging Results
- **Log to Result Sheet** – Appends or updates competitor data in the results Google Sheet.
- **Log Companies Without Results** – Records companies with zero competitors found in a separate section of the results sheet.
- **Sync to Airtable** – Pushes all results (successful or not) into Airtable for unified storage and analysis.

## Benefits
- **Automated Competitor Research** – Eliminates the need for manual Google searching.
- **Daily Insights** – Runs automatically at your chosen schedule.
- **Clean Data Output** – Stores structured competitor lists with sources for easy review.
- **Multi-Destination Sync** – Saves to both Google Sheets and Airtable for flexibility.
- **Scalable & Hands-Free** – Handles hundreds of companies without extra effort.
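As referenced in the step list above, a minimal Code-node sketch of the cleaning step could look like this, assuming the company column is named `List` and data rows start at sheet row 2 (below a header row); adjust both assumptions to your sheet.

```javascript
// n8n Code node ("Clean & Format Company List"): drop empty rows, trim
// names, and attach the original row number for later sheet updates.
// Assumption: column is named "List", header occupies row 1.
const items = $input.all();

const cleaned = items
  .map((item, index) => ({
    company: (item.json.List ?? '').toString().trim(),
    rowNumber: index + 2, // +2 skips the header row; sheet rows are 1-based
  }))
  .filter(entry => entry.company.length > 0);

return cleaned.map(entry => ({ json: entry }));
```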
by Vinay Gangidi
# LOB Underwriting with AI

This template ingests borrower documents from OneDrive, extracts text with OCR, classifies each file (ID, paystub, bank statement, utilities, tax forms, etc.), aggregates everything per borrower, and asks an LLM to produce a clear underwriting summary and decision (plus next steps).

## Good to know
- AI and OCR usage consume credits (OpenAI + your OCR provider).
- Folder lookups by name can be ambiguous—use a fixed folderId in production.
- Scanned image quality drives OCR accuracy; bad scans yield weak text.
- This flow handles PII—mask sensitive data in logs and control access.
- Start small: batch size and pagination keep costs and memory sane.

## How it works
1. **Import & locate docs:** A manual trigger kicks off a OneDrive folder search (e.g., "LOBs") and lists the files inside.
2. **Per-file loop:** Download each file → run OCR → classify the document type using the filename + extracted text.
3. **Aggregate:** Combine per-file results into a borrower payload (make BorrowerName dynamic).
4. **LLM analysis:** Feed the payload to an AI Agent (OpenAI model) to extract underwriting-relevant facts and produce a decision + next steps.
5. **Output:** Return a human-readable summary (and optionally structured JSON for downstream systems).

## How to use
- Start with the Manual Trigger to validate end-to-end on a tiny test folder.
- Once stable, swap in a Schedule/Cron or Webhook trigger.
- Review the generated underwriting summary; handle only flagged exceptions (unknown/unreadable docs, low confidence).

## Setup steps
1. **Connect accounts** – Add credentials for OneDrive, OCR, and OpenAI.
2. **Configure inputs**
   - In Search a folder, point to your borrower docs (prefer folderId; otherwise tighten the name query).
   - In Get items in a folder, enable pagination if the folder is large.
   - In Split in Batches, set a conservative batch size to control costs.
3. **Wire the file path**
   - Download a file must receive the current file's id from the folder listing.
   - Make sure the OCR node receives binary input (PDFs/images).
4. **Classification**
   - Update keyword rules to match your region/lenders/utilities/tax forms (a hedged sketch appears at the end of this section).
   - Keep a fallback Unknown class and log it for review.
5. **Combine** – Replace the hard-coded BorrowerName with a Set node field, a form input, or parsing from folder/file naming conventions.
6. **AI Agent**
   - Set your OpenAI model/credentials.
   - Ask the model to output JSON first (structured fields) and Markdown second (readable summary).
   - Keep temperature low for consistent, audit-friendly results.
7. **Optional outputs** – Persist JSON/Markdown to Notion/Docs/DB or write to storage.

## Customize if needed
- **Doc types:** Add or remove categories and keywords without touching the core logic.
- **Error handling:** Add IF paths for empty folders, failed downloads, empty OCR, or the Unknown class; retry transient API errors.
- **Privacy:** Redact IDs/account numbers in logs; restrict execution visibility.
- **Scale:** Add MIME/size filters, duplicate detection, and multi-borrower folder patterns (parent → subfolders).
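For the Classification step above, a minimal keyword-rule sketch might look like this; the categories, keyword lists, and input field names (`fileName`, `ocrText`) are illustrative and should be replaced with your own region- and lender-specific terms.

```javascript
// n8n Code node: keyword-based document classification over the
// filename plus OCR text. Assumption: upstream nodes provide
// $json.fileName and $json.ocrText; keywords are placeholders.
const rules = {
  paystub: ['pay stub', 'paystub', 'gross pay', 'net pay'],
  bank_statement: ['bank statement', 'account summary', 'ending balance'],
  tax_form: ['w-2', '1099', 'form 1040'],
  utility_bill: ['electric', 'water bill', 'utility'],
  id_document: ['driver license', 'passport', 'identification'],
};

const haystack = `${$json.fileName ?? ''} ${$json.ocrText ?? ''}`.toLowerCase();

// First category with any matching keyword wins; fall back to Unknown
// so unmatched files can be logged for manual review.
let docType = 'Unknown';
for (const [type, keywords] of Object.entries(rules)) {
  if (keywords.some(k => haystack.includes(k))) {
    docType = type;
    break;
  }
}

return [{ json: { ...$json, docType } }];
```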
by Paul Abraham
This n8n template demonstrates how to turn a Telegram bot into a personal AI-powered assistant that understands both voice notes and text messages. The assistant can transcribe speech, interpret user intent with AI, and perform smart actions such as managing calendars, sending emails, or creating notes.

## Use cases
- Hands-free scheduling with Google Calendar
- Quickly capturing ideas as Notion notes via voice
- Sending Gmail messages directly from Telegram
- A personal productivity assistant available on the go

## Good to know
- Voice notes are automatically transcribed into text before being processed.
- This template uses Google Gemini for AI reasoning. The AI agent supports memory, enabling more natural and contextual conversations.

## How it works
1. **Telegram Trigger** – Starts when you send a text or voice note to your Telegram bot.
2. **Account Check** – Ensures only authorized users can interact with the bot (a hedged sketch appears at the end of this section).
3. **Audio Handling** – If it's a voice message, the workflow retrieves and transcribes the recording.
4. **AI Agent** – Both transcribed voice and text are sent to the AI Agent powered by Google Gemini + Simple Memory.
5. **Smart Actions** – Based on the query, the AI can:
   - Read or create events in Google Calendar
   - Create notes in Notion
   - Send messages in Gmail
6. **Reply in Telegram** – The bot sends a response confirming the action or providing the requested information.

## How to use
1. Clone this workflow into your n8n instance.
2. Replace the Telegram Trigger with your bot credentials.
3. Connect Google Calendar, Notion, and Gmail accounts where required.
4. Start chatting with your Telegram bot to add events, take notes, or send emails using just your voice or text.

## Requirements
- Telegram bot & API key
- Google Gemini account for AI
- Google Calendar, Notion, and Gmail integrations (optional, depending on use case)

## Customising this workflow
- Add more integrations (Slack, Trello, Airtable, etc.) for extended productivity.
- Modify the AI prompt in the agent node to fine-tune personality or task focus.
- Swap in another transcription service if preferred.
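As a sketch of the Account Check step, here is one way to gate the bot with a Code node placed right after the Telegram Trigger; the allowed ID is a placeholder, and the template itself may implement this check differently (e.g., with an If node).

```javascript
// n8n Code node sketch ("Account Check"): allow only known Telegram
// user IDs to use the bot. Assumption: the trigger payload carries
// message.from.id, the standard Telegram update shape.
const ALLOWED_USER_IDS = [123456789]; // placeholder: your Telegram user ID(s)

const fromId = $json.message?.from?.id;

if (!ALLOWED_USER_IDS.includes(fromId)) {
  // Stop the workflow for unauthorized senders.
  throw new Error(`Unauthorized Telegram user: ${fromId}`);
}

return [{ json: $json }];
```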
by SpaGreen Creative
# WhatsApp Bulk Number Verification in Google Sheets Using Unofficial Rapiwa API

## Who's it for
This workflow is for marketers, small business owners, freelancers, and support teams who want to automate WhatsApp messaging from a Google Sheet without the official WhatsApp Business API. It suits anyone who needs a budget-friendly, easy-to-maintain solution that uses a personal or business WhatsApp number via an unofficial API service such as Rapiwa.

## How it works / What it does
- The workflow looks for rows in a Google Sheet where the Status column is `pending`.
- It cleans each phone number, removing non-digits (see the sketch below).
- It verifies the number with the Rapiwa verify endpoint (`/api/verify-whatsapp`).
- If the number is verified:
  - The workflow can send a message (optional).
  - It updates the sheet: Verification = `verified`, Status = `sent` (or leaves Status for the send node to update).
- If the number is not verified:
  - It skips sending.
  - It updates the sheet: Verification = `unverified`, Status = `not sent`.
- The workflow processes rows in batches and inserts short delays between items to avoid rate limits.
- The whole process runs on a configurable schedule.

## Key features
- Scheduled automatic checks (configurable interval; 5–10 minutes recommended).
- Cleans phone numbers to a proper format before verification.
- Verifies WhatsApp registration using Rapiwa.
- Batch processing with limits to control workload (recommended max per run is configurable).
- Short delay between items to reduce throttling and temporary blocks.
- Automatic sheet updates for auditability (verified/unverified, sent/not sent).

## Recommended defaults
- Trigger interval: every 5–10 minutes (adjustable).
- Max items per run: configurable (example: 200 per cycle).
- Delay between items: 2–5 seconds (the example uses 3 seconds).

## How to set up
1. Duplicate the sample Google Sheet: ➤ Sample
2. Fill in contact rows and set Status = `pending`. Include columns like WhatsApp No, Name, Message, Verification, Status.
3. In n8n, add and authenticate a Google Sheets node pointed at your sheet.
4. Create an HTTP Bearer credential in n8n and paste your Rapiwa API key.
5. Configure the workflow nodes (Trigger → Google Sheets → Limit/SplitInBatches → Code (clean) → HTTP Request (verify) → If → Update Sheet → Wait).
6. Enable the workflow and monitor the first runs with a small test batch.

## Requirements
- n8n instance with Google Sheets and HTTP Request nodes enabled.
- Google Sheets OAuth2 credentials configured in n8n.
- Rapiwa account and Bearer token (stored in n8n credentials).
- Google Sheet formatted to match the workflow columns.

## Why use Rapiwa
- Cost-effective and developer-friendly REST API for WhatsApp verification and sending.
- Simple integration via HTTP requests and n8n.
- Useful when you prefer not to use the official WhatsApp Business API.

Note: Rapiwa is an unofficial service — review its terms and risks before production use.

## How to customize
- Change the schedule frequency in the Trigger node.
- Adjust maxItems in Limit/SplitInBatches for throughput control.
- Change the Wait node delay for safer sending.
- Modify the HTTP Request body to support media or templates if the provider supports it.
- Add logging or a separate audit sheet to record API responses and errors.

## Best practices
- Test with a small batch before scaling.
- Keep the sheet headers exact and consistent; the workflow matches columns by name.
- Store the Rapiwa token in n8n credentials (do not hardcode it in node fields).
- Increase the Wait delay or reduce the batch size if you see rate limits or temporary blocks.
- Keep a log sheet of verified/unverified rows, and log API responses or errors for troubleshooting.
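A minimal sketch of the Code (clean) step referenced above, assuming the sample sheet's `WhatsApp No` column name:

```javascript
// n8n Code node ("clean"): strip non-digits from the sheet's
// "WhatsApp No" column before calling the Rapiwa verify endpoint.
// Assumption: the column name matches the sample sheet; adjust if yours differs.
const raw = ($json['WhatsApp No'] ?? '').toString();

// Keep digits only, e.g. "+1 (555) 010-9999" -> "15550109999".
const cleaned = raw.replace(/\D/g, '');

return [{ json: { ...$json, 'WhatsApp No': cleaned } }];
```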
## Example HTTP verify body (n8n HTTP Request node)

```json
{
  "number": "{{ $json['WhatsApp No'] }}"
}
```

## Optional
- Add a send-message HTTP Request node after verification to send messages.
- Append successful and failed rows to separate sheets for easy review.

## Support & Community
Need help setting up or customizing the workflow? Reach out here:
- WhatsApp: Chat with Support
- Discord: Join SpaGreen Server
- Facebook Group: SpaGreen Community
- Website: SpaGreen Creative
- Envato: SpaGreen Portfolio
by Raphael De Carvalho Florencio
## What this workflow is (About)
This workflow turns a Telegram bot into an AI-powered lyrics assistant. Users send a command plus a lyrics URL, and the flow downloads, cleans, and analyzes the text, then replies on Telegram with translated lyrics, summaries, vocabulary, poetic devices, or an interpretation—all generated by AI (OpenAI).

## What problems it solves
- Centralizes lyrics retrieval + cleanup + AI analysis in one automated flow
- Produces study-ready outputs (translation, vocabulary, figures of speech)
- Saves time for teachers, learners, and music enthusiasts with instant results in chat

## Key features
- **AI analysis** using OpenAI (no secrets hardcoded; uses n8n Credentials)
- **Line-by-line translation**, **concise summaries**, and **vocabulary lists**
- **Poetic/literary device detection** and **emotional/symbolic interpretation**
- Robust ETL (extract, download, sanitize) and error handling
- Clear Sticky Notes documenting routing, ETL, AI prompts, and messaging

## Who it's for
- Language learners & teachers
- Musicians, lyricists, and music bloggers
- Anyone studying lyrics for meaning, style, or vocabulary

## Input & output
- **Input:** Telegram command with a public lyrics URL
- **Output:** Telegram messages (Markdown/MarkdownV2), split into chunks if long

## How it works
1. **Telegram → Webhook** receives a user message (e.g., `/get_lyrics <URL>`).
2. **Routing (If/Switch)** detects which command was sent.
3. **Extract URL + Download (HTTP Request)** fetches the lyrics page.
4. **Cleanup (Code)** strips HTML/scripts/styles and normalizes whitespace (a hedged sketch appears at the end of this section).
5. **OpenAI (Chat)** formats the result per command (translation, summary, vocabulary, analysis).
6. **Telegram (Send Message)** returns the final text; long outputs are split into chunks.
7. **Error handling** replies with friendly guidance for unsupported or incomplete commands.

## Set up steps
1. Create a Telegram bot with @BotFather and copy the bot token.
2. In n8n, create Credentials → Telegram API and paste your token (no hardcoded keys in nodes).
3. Create Credentials → OpenAI and paste your API key.
4. Import the workflow and set a short webhook path (e.g., `/lyrics-bot`).
5. Publish the webhook and set it on Telegram:
   `https://api.telegram.org/bot<YOUR_BOT_TOKEN>/setWebhook?url=https://[YOUR_DOMAIN]/webhook/lyrics-bot`
6. (Optional) Restrict update types:
   ```bash
   curl -X POST https://api.telegram.org/bot<YOUR_BOT_TOKEN>/setWebhook \
     -H "Content-Type: application/json" \
     -d '{ "url": "https://[YOUR_DOMAIN]/webhook/lyrics-bot", "allowed_updates": ["message"] }'
   ```
7. Test by sending `/start` and then `/get_lyrics <PUBLIC_URL>` to your bot.
8. If messages are long, ensure MarkdownV2 is used and special characters are escaped.
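A minimal sketch of the Cleanup (Code) step referenced above, assuming the downloaded page HTML arrives in `$json.data`; a regex pass like this is approximate, and a proper HTML parser is safer for messy pages.

```javascript
// n8n Code node ("Cleanup") sketch: strip scripts, styles, and tags
// from the downloaded lyrics page, then normalize whitespace.
// Assumption: the HTTP Request node stored the page HTML in $json.data.
const html = $json.data ?? '';

const text = html
  .replace(/<script[\s\S]*?<\/script>/gi, ' ')  // drop inline scripts
  .replace(/<style[\s\S]*?<\/style>/gi, ' ')    // drop inline styles
  .replace(/<[^>]+>/g, ' ')                     // strip remaining tags
  .replace(/&nbsp;/g, ' ')                      // common HTML entity
  .replace(/\s+/g, ' ')                         // collapse whitespace
  .trim();

return [{ json: { lyricsText: text } }];
```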
by Rakin Jakaria
## Who this is for
This workflow is for content creators, digital marketers, or YouTube strategists who want to automatically discover trending videos in their niche, analyze engagement metrics, and get data-driven insights for their content strategy — all from one simple form submission.

## What this workflow does
This workflow starts every time someone submits the YouTube Trends Finder form. It then:
1. **Searches YouTube videos** based on your topic and specified time range using the **YouTube Data API**.
2. **Fetches detailed analytics** (views, likes, comments, engagement rates) for each video found.
3. **Calculates engagement rates** and filters out low-performing content below 2% engagement (a hedged sketch of this calculation appears at the end of this section).
4. **Applies smart filters** to exclude videos with fewer than 1,000 views, content outside your timeframe, and hashtag-heavy titles.
5. **Removes duplicate videos** to ensure clean data.
6. **Creates a Google Spreadsheet** with all trending video data organized by performance metrics.
7. **Delivers the results** via a completion form with a direct link to your analytics report.

## Setup
To set this workflow up:
1. **Form Trigger** – Customize the "YouTube Trends Finder" form fields if needed (Topic Name, Last How Many Days).
2. **YouTube Data API** – Add your YouTube OAuth2 credentials and API key in the respective nodes.
3. **Google Sheets** – Connect your Google Sheets account for automatic report generation.
4. **Engagement Filters** – Adjust the 2% engagement rate threshold based on your quality standards.
5. **View Filters** – Modify the minimum view count (currently 1,000+) in the filter conditions.
6. **Regional Settings** – Update the region code (currently "US") to target specific geographic markets.

## How to customize this workflow to your needs
- Change the engagement rate threshold to be more or less strict based on your niche requirements.
- Add additional filters like video duration, subscriber count, or specific keywords to refine results.
- Modify the Google Sheets structure to include extra metrics like "Channel Name", "Video Duration", or "Trending Score".
- Switch to different output formats like CSV export or direct email reports instead of Google Sheets.
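As referenced in step 3, here is a minimal Code-node sketch of the engagement calculation and filter. It reads the `statistics` fields returned by the YouTube Data API `videos.list` endpoint; the exact engagement formula in the template may differ, so treat this as one reasonable interpretation with thresholds mirroring the workflow defaults.

```javascript
// n8n Code node sketch: compute an engagement rate per video and keep
// only items above the thresholds. Assumption: each item carries the
// YouTube Data API "statistics" object (viewCount, likeCount, commentCount).
const MIN_ENGAGEMENT = 2;  // percent, mirrors the 2% workflow default
const MIN_VIEWS = 1000;    // mirrors the 1,000-view workflow default

const kept = [];
for (const item of $input.all()) {
  const stats = item.json.statistics ?? {};
  const views = Number(stats.viewCount ?? 0);
  const likes = Number(stats.likeCount ?? 0);
  const comments = Number(stats.commentCount ?? 0);

  // Engagement rate = (likes + comments) / views, as a percentage.
  const engagementRate = views > 0 ? ((likes + comments) / views) * 100 : 0;

  if (views >= MIN_VIEWS && engagementRate >= MIN_ENGAGEMENT) {
    kept.push({ json: { ...item.json, engagementRate: engagementRate.toFixed(2) } });
  }
}

return kept;
```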
by Supira Inc.
## Overview
This template automates invoice processing for teams that currently copy data from PDFs into spreadsheets by hand. It is ideal for small businesses, back-office teams, accounting, and operations staff who want to reduce manual entry, avoid human error, and never miss a payment deadline. The workflow watches a structured Google Drive folder, performs OCR, converts the text into clean structured JSON with an LLM, and appends one row per invoice to Google Sheets. It preserves a link back to the original file for easy review and audit.

- **Designed for small businesses and back-office teams.**
- **Eliminates manual typing** and reduces errors.
- **Prevents missed due dates** by centralizing data.
- Works with monthly subfolders like "2025年10月分" (meaning "October 2025").
- Keeps a Google Drive link to each invoice file.

## How It Works
The workflow runs on a schedule, scans your Drive folder hierarchy, OCRs the PDFs/images, cleans the text, extracts key fields with an LLM, and appends a row to Google Sheets per invoice. Each step is modular, so you can swap services or tweak prompts without breaking the flow.

1. **Scheduled trigger** runs on a recurring cadence.
2. **Scan the parent folder** in Google Drive.
3. **Auto-detect the current-month folder** (e.g., a folder named "2025年10月分", meaning "October 2025").
4. **Download PDFs/images** from the detected folder.
5. **Extract text** using the OCR.Space API.
6. **Clean noise** and normalize with a Code node.
7. **Use an OpenAI model** to extract invoice_date, due_date, client_name, line items, totals, and bank info to JSON (a hedged example of the JSON shape appears at the end of this section).
8. **Append one row per invoice** to Google Sheets.

## Requirements
Before you start, make sure you have access to the required services and that your Drive is organized into monthly subfolders so the workflow can find the right files.

- n8n account
- Google Drive access
- Google Sheets access
- OCR.Space API key (set as `<your_ocr_api_key>`)
- OpenAI / LLM API credential (e.g., `<your_openai_credential_name>`)
- Invoice PDFs organized by month on Google Drive (e.g., folders like "2025年10月分")

## Setup Instructions
Import the workflow, replace the placeholder credentials and IDs with your own, and enable the schedule. You can also run it manually for testing. The parent-folder query and sheet ID must reflect your environment.

1. Replace `<your_google_drive_credential_id>` and `<your_google_drive_credential_name>` with your Google Drive credential.
2. Adjust the parent folder search query to your invoice repository name.
3. Replace the Sheets document ID `<your_google_sheet_id>` with your spreadsheet ID.
4. Ensure your OpenAI credential `<your_openai_credential_name>` is selected.
5. Set your OCR.Space key as `<your_ocr_api_key>`.
6. Enable the Schedule Trigger after testing.

## Customization
This workflow is easily extensible. You can adapt the folder naming rules, enrich the spreadsheet schema, and expand the AI prompt to extract custom fields specific to your company. It also works beyond invoices, covering receipts, quotes, or purchase orders with minor changes.

- Change the monthly folder naming rule, such as `{{$now.setZone("Asia/Tokyo").format("yyyy年MM月")}}分`, to match your convention.
- Modify or extend the Google Sheets column mappings as needed.
- Tune the AI prompt to extract project codes, owner names, or custom fields.
- Repurpose it for receipts, quotes, or purchase orders.
- Localize date formats and tax calculation rules to your standards.
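As referenced in step 7, pinning the LLM to a fixed JSON schema keeps the Sheets append step simple and auditable. The shape below is a hedged example of what the extraction might return; fields beyond those named in step 7 (subtotal, tax, total as separate keys) and all values are illustrative additions, not the template's exact schema.

```json
{
  "invoice_date": "2025-10-01",
  "due_date": "2025-10-31",
  "client_name": "Example Trading Co.",
  "line_items": [
    { "description": "Consulting (October)", "quantity": 1, "unit_price": 100000, "amount": 100000 }
  ],
  "subtotal": 100000,
  "tax": 10000,
  "total": 110000,
  "bank_info": "Example Bank, Main Branch, Ordinary 1234567"
}
```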
by Nguyen Thieu Toan
# 🤖 Facebook Messenger Smart Chatbot – Batch, Format & Notify with n8n Data Table

## 🌟 What Is This Workflow?
This is a smart chatbot solution built with n8n, designed to integrate seamlessly with Facebook Messenger. It batches incoming messages, formats them for clarity, tracks conversation history, and sends natural replies using AI. Perfect for businesses, customer support, or personal AI agents.

## ⚙️ Key Features
- 🔄 **Smart batching:** Groups consecutive user messages to process them in one go, avoiding fragmented replies.
- 🧠 **Context formatting:** Automatically formats messages to fit Messenger's structure and length limits.
- 📋 **Conversation history tracking:** Stores and retrieves chat logs between user and bot using an n8n Data Table.
- 👀 **Seen & Typing effects:** Adds human-like responsiveness with Messenger's sender actions.
- 🧩 **AI Agent integration:** Easily connects to GPT, Gemini, or any LLM for natural replies, scheduling, or business logic.

## 🚀 How It Works
1. Connects to your Facebook Page via webhook to receive and send messages.
2. Stores incoming messages in a Data Table called `Batch_messages`, including fields like `user_text`, `bot_rep`, and `processed`.
3. Collects unprocessed messages, sorts them by `id`, and creates a `merged_message` plus the full history (a hedged sketch appears at the end of this section).
4. Sends the history to an AI Agent for contextual response generation.
5. Sends the AI reply back to Messenger with Seen/Typing effects.
6. Updates the message status to `processed = true` to prevent duplicate handling.

## 🛠️ Setup Guide
1. Create a Facebook App and Messenger webhook, and link it to your Page.
2. Set up the `Batch_messages` Data Table in n8n with the required columns.
3. Import the workflow or build the nodes manually using the tutorial.
4. Configure your API tokens, webhook URLs, and AI Agent endpoint.
5. Deploy the workflow on a public n8n server.

📘 Full tutorial available at:
👉 Smart Chatbot Workflow Guide by Nguyen Thieu Toan

## 💡 Pro Tips
- Customize the AI prompt and persona to match your business tone.
- Add scheduling, lead capture, or CRM integration using n8n's flexible nodes.
- Monitor your Data Table regularly to ensure clean message flow and batching.

## 👤 About the Creator
Nguyen Thieu Toan (Nguyễn Thiệu Toàn / Jay Nguyen) is an expert in AI automation, business optimization, and chatbot development. With a background in marketing and deep knowledge of n8n workflows, Jay helps businesses harness AI to save time, boost performance, and deliver smarter customer experiences.

Website: https://nguyenthieutoan.com
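As referenced in step 3, here is a minimal Code-node sketch of the batching step, assuming the `Batch_messages` columns described above (`user_text`, `bot_rep`, `processed`, `id`); the template's actual node may structure the history differently.

```javascript
// n8n Code node sketch: merge unprocessed rows from the Batch_messages
// Data Table into one prompt payload for the AI Agent.
// Assumption: rows carry user_text, bot_rep, processed, and id fields.
const rows = $input.all().map(i => i.json);

// Sort by id so messages are merged in arrival order.
rows.sort((a, b) => a.id - b.id);

// Consecutive unprocessed user messages become one merged message.
const unprocessed = rows.filter(r => r.processed !== true);
const mergedMessage = unprocessed.map(r => r.user_text).join('\n');

// Processed rows provide the running conversation history.
const history = rows
  .filter(r => r.processed === true)
  .map(r => `User: ${r.user_text}\nBot: ${r.bot_rep ?? ''}`)
  .join('\n');

return [{ json: { merged_message: mergedMessage, history } }];
```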