by Atik
Automate video transcription and Q&A with async VLM processing that scales from short clips to long recordings.

## What this workflow does
- Monitors Google Drive for new files in a specific folder and grabs the file ID on create
- Automatically downloads the binary to hand off for processing
- Sends the video to VLM Run for async transcription with a callback URL that posts results back to n8n
- Receives the transcript JSON via Webhook and appends a row in Google Sheets with the video identifier and transcript data
- Enables chat Q&A through the Chat Trigger + AI Agent. The agent fetches relevant rows from Sheets and answers only from those segments using the connected chat model

## Setup
**Prerequisites:** Google Drive and Google Sheets accounts, VLM Run API credentials, OpenAI (or another supported) chat model credentials, and an n8n instance.

Install the verified VLM Run node by searching for VLM Run in the nodes list, then click Install. You can also confirm on npm if needed. After install, it integrates directly for robust async transcription.

### Quick Setup
- **Google Drive folder watch** – Add a Google Drive Trigger and choose Specific Folder. Set polling to every minute and the event to File Created. Connect Drive OAuth2.
- **Download the new file** – Add a Google Drive node with the Download operation. Map `{{$json.id}}` and save the binary as `data`.
- **Async transcription with VLM Run** – Add a VLM Run node. Operation: video. Domain: `video.transcription`. Enable Process Asynchronously and set the Callback URL to your Webhook path (for example `/transcript-video`). Add your VLM Run API key.
- **Webhook to receive results** – Add a Webhook node with method POST and path `/transcript-video`. This is the endpoint VLM Run calls when the job completes. Use When Last Node Finishes, or respond via a Respond node if you prefer.
- **Append to Google Sheets** – Add a Google Sheets node with the Append operation. Point it to your spreadsheet and sheet.
Map:
- **Video Name** → the video identifier from the webhook payload
- **Data** → the transcript text or JSON from the webhook payload

Connect Google Sheets OAuth2.

- **Chat entry point and Agent** – Add a Chat Trigger to receive user questions. Add an AI Agent and connect:
  - a Chat Model (for example OpenAI Chat Model)
  - the Google Sheets Tool to read relevant rows

  In the Agent system message, instruct it to:
  - Use the Sheets tool to fetch transcript rows matching the question
  - Answer only from those rows
  - Cite or reference row context as needed
- **Test and activate** – Upload a sample video to the watched Drive folder. Wait for the callback to populate your sheet. Ask a question through the Chat Trigger and confirm the agent quotes only from the retrieved rows. Activate your template and let it automate the task.

## How to take this further
- **Team memory:** Ask "What did we decide on pricing last week?" and get the exact clip and answer.
- **Study helper:** Drop classes in, then ask for key points or formulas by topic.
- **Customer FAQ builder:** Turn real support calls into answers your team can reuse.
- **Podcast highlights:** Find quotes, tips, and standout moments from each episode.
- **Meeting catch-up:** Get decisions and action items from any recording, fast.
- **Marketing snippets:** Pull short, social-ready lines from long demos or webinars.
- **Team learning hub:** Grow a searchable video brain that remembers everything.

This workflow uses the VLM Run node for scalable, async video transcription and the AI Agent for grounded Q&A from Sheets, giving you a durable pipeline from upload to searchable answers with minimal upkeep.
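If you want a Code node between the Webhook and Google Sheets nodes to flatten the callback payload, a minimal sketch might look like the following. The payload field names (`videoId`, `transcript`) are assumptions; inspect a real VLM Run callback and adjust the paths to match.

```javascript
// Sketch of a Code node that maps the transcription callback to sheet columns.
// Field names in the incoming payload are hypothetical placeholders.
function toSheetRow(payload) {
  const transcript = Array.isArray(payload.transcript)
    ? payload.transcript.map(s => s.text).join(' ') // join timed segments into one string
    : String(payload.transcript ?? '');
  return {
    'Video Name': payload.videoId ?? 'unknown',
    'Data': transcript,
  };
}

// Example callback body (shape is hypothetical):
const row = toSheetRow({
  videoId: 'drive-file-123',
  transcript: [{ start: 0, text: 'Hello' }, { start: 2, text: 'world' }],
});
console.log(row);
```

In n8n, the same logic would read the payload from `$json` and return `[{ json: toSheetRow($json.body) }]` or similar, depending on how your Webhook node wraps the request body.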
by Automate With Marc
## Podcast on Autopilot — Generate Podcast Ideas, Scripts & Audio Automatically with Eleven Labs, GPT-5 and Claude Sonnet 4.0

Bring your solo podcast to life — on full autopilot. This workflow uses GPT-5 and Claude Sonnet to turn a single topic input into a complete podcast episode intro and ready-to-send audio file.

### How it works
1. **Chat trigger** – Enter a seed idea or topic (e.g., "habits," "failure," "technology and purpose").
2. **Podcast Idea Agent (GPT-5)** – Instantly crafts a thought-provoking, Rogan- or Bartlett-style episode concept with a clear angle and takeaway.
3. **Podcast Script Agent (Claude Sonnet 4.0)** – Expands that idea into a natural, engaging 60-second opening monologue ready for recording.
4. **Text-to-Speech via ElevenLabs** – Automatically converts the script into a high-quality voice track.
5. **Email automation** – Sends the finished MP3 directly to your inbox.

### Perfect for
• Solo creators who want to ideate, script and voice short podcasts effortlessly
• Content teams prototyping daily or weekly audio snippets
• Anyone testing AI-driven storytelling pipelines

### Customization tips
• Swap ElevenLabs for your preferred TTS service by editing the HTTP Request node.
• Adjust prompt styles for tone or audience in the Idea and Script Agents.
• Modify the Gmail (or other mail service) node to send audio to any destination (Drive, Slack, Notion, etc.).
• For reuse at scale, add variables for episode number, guest name, or theme category — just clone and update the trigger node.

Watch the step-by-step tutorial (how to build it yourself): https://www.youtube.com/watch?v=Dan3_W1JoqU

### Requirements & disclaimer
• Requires API keys for OpenAI + Anthropic + ElevenLabs (or your chosen TTS).
• You're responsible for managing costs incurred through AI or TTS usage.
• Avoid sharing sensitive or private data as input into prompt flows.
• Designed with modularity so you can turn off or swap any stage (idea → script → voice → email) without breaking the chain.
by Daniel Shashko
## How it Works
This workflow automates intelligent Reddit marketing by monitoring brand mentions, analyzing sentiment with AI, and engaging authentically with communities.

Every 24 hours, the system searches Reddit for posts containing your configured brand keywords across all subreddits, finding up to 50 of the newest mentions to analyze.

Each discovered post is sent to OpenAI's GPT-4o-mini model for comprehensive analysis. The AI evaluates sentiment (positive/neutral/negative), assigns an engagement score (0-100), determines relevance to your brand, and generates contextual, helpful responses that add genuine value to the conversation. It also classifies the response type (educational/supportive/promotional) and provides reasoning for whether engagement is appropriate.

The workflow intelligently filters posts using a multi-criteria system: only posts that are relevant to your brand, score above 60 in engagement quality, and warrant a response type other than "pass" proceed to engagement. This prevents spam and ensures every interaction is meaningful.

Selected posts are processed one at a time through a loop to respect Reddit's rate limits. For each worthy post, the AI-generated comment is posted, and complete interaction data is logged to Google Sheets, including timestamp, post details, sentiment, engagement scores, and success status. This creates a permanent audit trail and analytics database.

At the end of each run, the workflow aggregates all data into a comprehensive daily summary report with total posts analyzed, comments posted, engagement rate, sentiment breakdown, and the top 5 engagement opportunities ranked by score. This report is automatically sent to Slack with formatted metrics, giving your team instant visibility into your Reddit marketing performance.

## Who is this for?
- **Brand managers and marketing teams** needing automated social listening and engagement on Reddit
- **Community managers** responsible for authentic brand presence across multiple subreddits
- **Startup founders and growth marketers** who want to scale Reddit marketing without hiring a team
- **PR and reputation teams** monitoring brand sentiment and responding to discussions in real-time
- **Product marketers** seeking organic engagement opportunities in product-related communities
- **Any business** that wants to build an authentic Reddit presence while avoiding spammy marketing tactics

## Setup Steps
**Setup time:** Approx. 30–40 minutes (credential configuration, keyword setup, Google Sheets creation, Slack integration)

**Requirements:**
- Reddit account with OAuth2 application credentials (create at reddit.com/prefs/apps)
- OpenAI API key with GPT-4o-mini access
- Google account with a new Google Sheet for tracking interactions
- Slack workspace with posting permissions to a marketing/monitoring channel
- Brand keywords and subreddit strategy prepared

**Steps:**
1. **Create a Reddit OAuth application:** Visit reddit.com/prefs/apps, create a "script" type app, and obtain your client ID and secret.
2. **Configure Reddit credentials in n8n:** Add Reddit OAuth2 credentials with your app credentials and authorize access.
3. **Set up the OpenAI API:** Obtain an API key from platform.openai.com and configure it in n8n OpenAI credentials.
4. **Create a Google Sheet:** Set up a new sheet with the columns: timestamp, postId, postTitle, subreddit, postUrl, sentiment, engagementScore, responseType, commentPosted, reasoning.
5. **Configure these nodes:**
   - **Brand Keywords Config:** Edit the JavaScript code to include your brand name, product names, and relevant industry keywords.
   - **Search Brand Mentions:** Adjust the limit (default 50) and sort preference based on your needs.
   - **AI Post Analysis:** Customize the prompt to match your brand voice and engagement guidelines.
   - **Filter Engagement-Worthy:** Adjust the engagementScore threshold (default 60) based on your quality standards.
   - **Loop Through Posts:** Configure max iterations and batch size for rate-limit compliance.
   - **Log to Google Sheets:** Replace YOUR_SHEET_ID with your actual Google Sheets document ID.
   - **Send Slack Report:** Replace YOUR_CHANNEL_ID with your Slack channel ID.
6. **Test the workflow:** Run it manually first to verify all connections work, and adjust the AI prompts.
7. **Activate for daily runs:** Once tested, activate the Schedule Trigger to run automatically every 24 hours.

## Node Descriptions
- **Daily Marketing Check** – Schedule trigger that runs the workflow every 24 hours.
- **Brand Keywords Config** – JavaScript Code node defining the brand keywords to monitor.
- **Search Brand Mentions** – Reddit node that searches all subreddits for brand keyword mentions.
- **AI Post Analysis** – OpenAI node that analyzes sentiment and relevance and generates contextual, helpful comment responses.
- **Filter Engagement-Worthy** – Conditional node that keeps only high-quality, relevant posts worth engaging.
- **Loop Through Posts** – Split In Batches node that processes each post individually, respecting rate limits.
- **Post Helpful Comment** – Reddit node that posts the AI-generated comment to worthy discussions.
- **Log to Google Sheets** – Appends all interaction data to the spreadsheet for permanent tracking.
- **Generate Daily Summary** – JavaScript node that aggregates metrics and the sentiment breakdown into a comprehensive daily report.
- **Send Slack Report** – Posts the formatted daily summary with metrics to the team Slack channel.
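The Generate Daily Summary node aggregates the logged interaction rows into the report metrics. A standalone sketch of that logic, with sample rows standing in for the items a Code node would read via `$input.all()`:

```javascript
// Aggregate logged interactions into the daily summary metrics.
// Row field names mirror the sheet columns; the rows here are sample data.
function summarize(rows) {
  const analyzed = rows.length;
  const posted = rows.filter(r => r.commentPosted).length;
  const sentiment = { positive: 0, neutral: 0, negative: 0 };
  for (const r of rows) sentiment[r.sentiment] = (sentiment[r.sentiment] ?? 0) + 1;
  // Top 5 engagement opportunities, ranked by score.
  const top = [...rows]
    .sort((a, b) => b.engagementScore - a.engagementScore)
    .slice(0, 5)
    .map(r => r.postTitle);
  return {
    analyzed,
    posted,
    engagementRate: analyzed ? Math.round((posted / analyzed) * 100) : 0, // percent
    sentiment,
    topOpportunities: top,
  };
}

const report = summarize([
  { postTitle: 'Great tool!', sentiment: 'positive', engagementScore: 85, commentPosted: true },
  { postTitle: 'Any alternatives?', sentiment: 'neutral', engagementScore: 70, commentPosted: true },
  { postTitle: 'Off-topic rant', sentiment: 'negative', engagementScore: 20, commentPosted: false },
]);
console.log(report);
```

The exact field names and the engagement-rate formula are assumptions for illustration; align them with your sheet columns and the Slack message format you want.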
by Ruben AI
## AI-Powered Flyer & Video Generator with Airtable, Klie.ai, and n8n

### Who is this for?
This template is perfect for e-commerce entrepreneurs, marketers, agencies, and creative teams who want to turn simple product photos and short descriptions into professional flyers or product videos — automatically and at scale. If you want to generate polished marketing assets without relying on designers or editors, this is for you.

### What problem is this workflow solving?
Creating product ads, flyers, or videos usually involves multiple tools and manual steps:
- Collecting and cleaning product photos
- Writing ad copy or descriptions
- Designing flyers or visuals for campaigns
- Producing animations or video ads
- Managing multiple revisions and approvals

This workflow automates the entire pipeline. Upload a raw product image into Airtable, type a quick description, and receive back a flyer or video animation tailored to your brand and context — ready to use for ads, websites, or campaigns.

### What this workflow does
- Uses Airtable as the central interface where you upload raw product photos and enter descriptions
- Processes the content automatically via n8n
- Generates flyers and visuals using OpenAI Image 1
- Produces custom product videos with Google's VEO3
- Runs through Klie.ai to unify the image + video generation process
- Sends the final creative assets back into Airtable for review and download

### Setup
1. Download the n8n files and connect your Airtable token to n8n.
2. Duplicate the Airtable base and make sure you're on an Airtable Team plan.
3. Add your API key on the Airtable interface under API setup.
4. Create your agency inside the interface.
5. Start generating concept images and videos instantly.

### How to customize this workflow to your needs
- Edit the prompts to match your brand voice and ad style.
- Extend Airtable fields to include more creative parameters (colors, layout, target audience).
- Add approval steps via email, Slack, or Airtable statuses before finalizing.
- Integrate with publishing platforms (social media, e-commerce CMS) for auto-posting.
- Track generated assets inside Airtable for team collaboration.

🎥 Demo Video: Demo Video
by Avkash Kakdiya
## How it works
This workflow automatically collects a list of companies from Google Sheets, searches for their competitors using SerpAPI, extracts up to 10 relevant competitor names with source links, and logs the results into both Google Sheets and Airtable. It runs on a set schedule, cleans and formats the company list, processes each entry individually, checks if competitors exist, and separates results into successful and "no competitors found" lists for organized tracking.

## Step-by-step

### 1. Trigger & Input
- **Auto Run (Scheduled)** – Executes every day at the set time (e.g., 9 AM).
- **Read Companies Sheet** – Pulls the list of companies from a Google Sheet (List column).
- **Clean & Format Company List** – Removes empty rows, trims names, and attaches row numbers for tracking.
- **Loop Over Companies** – Processes each company one at a time in batches.

### 2. Competitor Search
- **Search Company Competitors (SerpAPI)** – Sends a query like "{Company} competitors" to SerpAPI, retrieving structured search results in JSON format.

### 3. Data Extraction & Validation
- **Extract Competitor Data from Search** – Parses SerpAPI results to:
  - Identify the company name
  - Extract up to 10 competitor names
  - Capture the top source URL
  - Count total search results
- **Has Competitors?** – Checks if any competitors were found:
  - Yes → proceeds to logging
  - No → logs in the "no results" list

### 4. Logging Results
- **Log to Result Sheet** – Appends or updates competitor data in the results Google Sheet.
- **Log Companies Without Results** – Records companies with zero competitors found in a separate section of the results sheet.
- **Sync to Airtable** – Pushes all results (successful or not) into Airtable for unified storage and analysis.

## Benefits
- **Automated Competitor Research** – Eliminates the need for manual Google searching.
- **Daily Insights** – Runs automatically at your chosen schedule.
- **Clean Data Output** – Stores structured competitor lists with sources for easy review.
- **Multi-Destination Sync** – Saves to both Google Sheets and Airtable for flexibility.
- **Scalable & Hands-Free** – Handles hundreds of companies without extra effort.
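The extraction step can be sketched as a Code node that walks a SerpAPI-style response. The response shape shown here (`organic_results`, `search_information`) and the crude capitalized-name heuristic are illustrative assumptions; the actual template likely uses its own parsing rules, so verify against a real SerpAPI payload.

```javascript
// Standalone sketch of the Extract Competitor Data step, run against a
// trimmed SerpAPI-style response. The name heuristic is intentionally crude.
function extractCompetitors(company, serp) {
  const results = serp.organic_results ?? [];
  const names = new Set();
  for (const r of results) {
    const text = `${r.title ?? ''} ${r.snippet ?? ''}`;
    // Pull capitalized one- or two-word name candidates from titles/snippets.
    for (const m of text.matchAll(/\b[A-Z][A-Za-z0-9]+(?: [A-Z][A-Za-z0-9]+)?\b/g)) {
      if (m[0].toLowerCase() !== company.toLowerCase()) names.add(m[0]);
      if (names.size >= 10) break; // cap at 10 competitors
    }
  }
  return {
    company,
    competitors: [...names].slice(0, 10),
    sourceUrl: results[0]?.link ?? null, // top source URL
    totalResults: serp.search_information?.total_results ?? 0,
  };
}

const out = extractCompetitors('Acme', {
  organic_results: [
    { title: 'Top Acme alternatives', link: 'https://example.com/a', snippet: 'Globex and Initech lead the list.' },
  ],
  search_information: { total_results: 1200 },
});
console.log(out);
```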
by Vinay Gangidi
## LOB Underwriting with AI
This template ingests borrower documents from OneDrive, extracts text with OCR, classifies each file (ID, paystub, bank statement, utilities, tax forms, etc.), aggregates everything per borrower, and asks an LLM to produce a clear underwriting summary and decision (plus next steps).

### Good to know
- AI and OCR usage consume credits (OpenAI + your OCR provider).
- Folder lookups by name can be ambiguous—use a fixed folderId in production.
- Scanned image quality drives OCR accuracy; bad scans yield weak text.
- This flow handles PII—mask sensitive data in logs and control access.
- Start small: batch size and pagination keep costs and memory sane.

### How it works
- **Import & locate docs:** A manual trigger kicks off a OneDrive folder search (e.g., "LOBs") and lists the files inside.
- **Per-file loop:** Download each file → run OCR → classify the document type using the filename + extracted text.
- **Aggregate:** Combine per-file results into a borrower payload (make BorrowerName dynamic).
- **LLM analysis:** Feed the payload to an AI Agent (OpenAI model) to extract underwriting-relevant facts and produce a decision + next steps.
- **Output:** Return a human-readable summary (and optionally structured JSON for systems).

### How to use
- Start with the Manual Trigger to validate end-to-end on a tiny test folder.
- Once stable, swap in a Schedule/Cron or Webhook trigger.
- Review the generated underwriting summary; handle only flagged exceptions (unknown/unreadable docs, low confidence).

### Setup steps
- **Connect accounts:** Add credentials for OneDrive, OCR, and OpenAI.
- **Configure inputs:**
  - In Search a Folder, point to your borrower docs (prefer folderId; otherwise tighten the name query).
  - In Get Items in a Folder, enable pagination if the folder is large.
  - In Split in Batches, set a conservative batch size to control costs.
- **Wire the file path:** Download a File must receive the current file's id from the folder listing. Make sure the OCR node receives binary input (PDFs/images).
- **Classification:** Update keyword rules to match your region/lenders/utilities/tax forms. Keep a fallback Unknown class and log it for review.
- **Combine:** Replace the hard-coded BorrowerName with a Set node field, a form input, or parsing from folder/file naming conventions.
- **AI Agent:** Set your OpenAI model and credentials. Ask the model to output JSON first (structured fields) and Markdown second (readable summary). Keep temperature low for consistent, audit-friendly results.
- **Optional outputs:** Persist JSON/Markdown to Notion/Docs/DB or write to storage.

### Customize if needed
- **Doc types:** Add or remove categories and keywords without touching core logic.
- **Error handling:** Add IF paths for empty folders, failed downloads, empty OCR, or the Unknown class; retry transient API errors.
- **Privacy:** Redact IDs/account numbers in logs; restrict execution visibility.
- **Scale:** Add MIME/size filters, duplicate detection, and multi-borrower folder patterns (parent → subfolders).
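The keyword-rule classification with a fallback Unknown class can be sketched as follows. The keyword lists are illustrative placeholders; replace them with terms that match your region's lenders, utilities, and tax forms.

```javascript
// Standalone sketch of the classification step: keyword rules over the
// filename plus OCR text, with a fallback 'unknown' class for review.
const RULES = [
  { type: 'id', keywords: ['driver license', 'passport', 'identification'] },
  { type: 'paystub', keywords: ['gross pay', 'net pay', 'pay period'] },
  { type: 'bank_statement', keywords: ['account summary', 'ending balance'] },
  { type: 'utility', keywords: ['electric', 'water', 'utility bill'] },
  { type: 'tax_form', keywords: ['w-2', '1099', 'tax year'] },
];

function classifyDoc(filename, ocrText) {
  const haystack = `${filename} ${ocrText}`.toLowerCase();
  for (const rule of RULES) {
    if (rule.keywords.some(k => haystack.includes(k))) return rule.type;
  }
  return 'unknown'; // fallback class — log these for manual review
}

console.log(classifyDoc('smith_paystub.pdf', 'Pay period 01/01-01/15, Gross Pay $4,200'));
console.log(classifyDoc('scan001.pdf', ''));
```

Rule order matters when a document matches several categories; put the most specific rules first, and route `unknown` results to an IF branch so they never reach the AI Agent unreviewed.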
by Paul Abraham
This n8n template demonstrates how to turn a Telegram bot into a personal AI-powered assistant that understands both voice notes and text messages. The assistant can transcribe speech, interpret user intent with AI, and perform smart actions such as managing calendars, sending emails, or creating notes.

## Use cases
- Hands-free scheduling with Google Calendar
- Quickly capturing ideas as Notion notes via voice
- Sending Gmail messages directly from Telegram
- A personal productivity assistant available on the go

## Good to know
- Voice notes are automatically transcribed into text before being processed.
- This template uses Google Gemini for AI reasoning. The AI agent supports memory, enabling more natural and contextual conversations.

## How it works
- **Telegram Trigger** – Starts when you send a text or voice note to your Telegram bot.
- **Account Check** – Ensures only authorized users can interact with the bot.
- **Audio Handling** – If it's a voice message, the workflow retrieves and transcribes the recording.
- **AI Agent** – Both transcribed voice and text are sent to the AI Agent powered by Google Gemini + Simple Memory.
- **Smart Actions** – Based on the query, the AI can:
  - Read or create events in Google Calendar
  - Create notes in Notion
  - Send messages in Gmail
- **Reply in Telegram** – The bot sends a response confirming the action or providing the requested information.

## How to use
- Clone this workflow into your n8n instance.
- Replace the Telegram Trigger credentials with your bot credentials.
- Connect Google Calendar, Notion, and Gmail accounts where required.
- Start chatting with your Telegram bot to add events, notes, or send emails using just your voice or text.

## Requirements
- Telegram bot & API key
- Google Gemini account for AI
- Google Calendar, Notion, and Gmail integrations (optional, depending on use case)

## Customising this workflow
- Add more integrations (Slack, Trello, Airtable, etc.) for extended productivity.
- Modify the AI prompt in the agent node to fine-tune personality or task focus.
- Swap in another transcription service if preferred.
by SpaGreen Creative
## WhatsApp Bulk Number Verification in Google Sheets Using Unofficial Rapiwa API

### Who's it for
This workflow is for marketers, small business owners, freelancers, and support teams who want to automate WhatsApp messaging using a Google Sheet without the official WhatsApp Business API. It's suitable when you need a budget-friendly, easy-to-maintain solution that uses your personal or business WhatsApp number via an unofficial API service such as Rapiwa.

### How it works / What it does
- The workflow looks for rows in a Google Sheet where the Status column is `pending`.
- It cleans each phone number (removes non-digits).
- It verifies the number with the Rapiwa verify endpoint (`/api/verify-whatsapp`).
- If the number is verified:
  - The workflow can send a message (optional).
  - It updates the sheet: Verification = `verified`, Status = `sent` (or leaves Status for the send node to update).
- If the number is not verified:
  - It skips sending.
  - It updates the sheet: Verification = `unverified`, Status = `not sent`.
- The workflow processes rows in batches and inserts short delays between items to avoid rate limits.
- The whole process runs on a schedule (configurable).

### Key features
- Scheduled automatic checks (configurable interval; 5–10 minutes recommended).
- Cleans phone numbers to a proper format before verification.
- Verifies WhatsApp registration using Rapiwa.
- Batch processing with limits to control workload (recommended max per run configurable).
- Short delay between items to reduce throttling and temporary blocks.
- Automatic sheet updates for auditability (verified/unverified, sent/not sent).

### Recommended defaults
- Trigger interval: every 5–10 minutes (adjustable).
- Max items per run: configurable (example: 200 max per cycle).
- Delay between items: 2–5 seconds (the example uses 3 seconds).

### How to set up
- Duplicate the sample Google Sheet: ➤ Sample
- Fill contact rows and set Status = `pending`. Include columns like WhatsApp No, Name, Message, Verification, Status.
- In n8n, add and authenticate a Google Sheets node pointed at your sheet.
- Create an HTTP Bearer credential in n8n and paste your Rapiwa API key.
- Configure the workflow nodes (Trigger → Google Sheets → Limit/SplitInBatches → Code (clean) → HTTP Request (verify) → If → Update Sheet → Wait).
- Enable the workflow and monitor the first runs with a small test batch.

### Requirements
- n8n instance with Google Sheets and HTTP Request nodes enabled.
- Google Sheets OAuth2 credentials configured in n8n.
- Rapiwa account and Bearer token (stored in n8n credentials).
- Google Sheet formatted to match the workflow columns.

### Why use Rapiwa
- Cost-effective and developer-friendly REST API for WhatsApp verification and sending.
- Simple integration via HTTP requests and n8n.
- Useful when you prefer not to use the official WhatsApp Business API.
- Note: Rapiwa is an unofficial service — review its terms and risks before production use.

### How to customize
- Change the schedule frequency in the Trigger node.
- Adjust maxItems in Limit/SplitInBatches for throughput control.
- Change the Wait node delay for safer sending.
- Modify the HTTP Request body to support media or templates if the provider supports them.
- Add logging or a separate audit sheet to record API responses and errors.

### Example HTTP verify body (n8n HTTP Request node)
```json
{ "number": "{{ $json['WhatsApp No'] }}" }
```

### Best practices
- Test with a small batch before scaling.
- Keep the sheet headers exact and consistent; the workflow matches columns by name.
- Store API keys in n8n credentials (do not hardcode them in node fields).
- Increase the Wait delay or reduce batch size if you see rate limits or temporary blocks.
- Keep a log sheet of verified/unverified rows and API responses for troubleshooting.
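The Code (clean) step that strips non-digits can be sketched like this. The 8–15 digit sanity bound is an assumption (a loose E.164-style check), not something the template specifies; adapt it to the number formats in your sheet.

```javascript
// Standalone sketch of the Code (clean) step: strip non-digits and apply a
// basic length sanity check before the Rapiwa verify call.
function cleanNumber(raw) {
  const digits = String(raw ?? '').replace(/\D/g, ''); // keep digits only
  // WhatsApp expects country code + number; 8–15 digits is a loose E.164 bound.
  return digits.length >= 8 && digits.length <= 15 ? digits : null;
}

console.log(cleanNumber('+1 (555) 010-9999')); // formatted input reduced to digits
console.log(cleanNumber('abc'));               // invalid input rejected
```

Rows where cleaning returns `null` can be routed to the "not sent" branch directly, saving a verify call on obviously bad data.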
### Optional
- Add a send-message HTTP Request node after verification to send messages.
- Append successful and failed rows to separate sheets for easy review.

### Support & Community
Need help setting up or customizing the workflow? Reach out here:
- WhatsApp: Chat with Support
- Discord: Join SpaGreen Server
- Facebook Group: SpaGreen Community
- Website: SpaGreen Creative
- Envato: SpaGreen Portfolio
by Rahul Joshi
📊 Description

Streamline Facebook Messenger inbox management with an AI-powered categorization and response system. 💬⚙️ This workflow automatically classifies new messages as Lead, Query, or Spam using GPT-4, routes them for approval via Slack, responds on Facebook once approved, and logs all interactions into Google Sheets for tracking. Perfect for support and marketing teams managing high volumes of inbound DMs. 🚀📈

## What This Template Does
1️⃣ **Trigger** – Runs hourly to fetch new Facebook Page messages. ⏰
2️⃣ **Extract & Format** – Collects sender info, timestamps, and message content for analysis. 📋
3️⃣ **AI Categorization** – Uses GPT-4 to identify the message type (Lead, Query, Spam) and suggest replies. 🧠
4️⃣ **Slack Approval Flow** – Sends categorized leads and queries to Slack for quick team approval. 💬
5️⃣ **Facebook Response** – Posts AI-suggested replies back to the original sender once approved. 💌
6️⃣ **Data Logging** – Records every message, reply, and approval status in Google Sheets for analytics. 📊
7️⃣ **Error Handling** – Automatically alerts via Slack if the workflow encounters an error. 🚨

## Key Benefits
✅ Reduces manual message triage on Facebook Messenger
✅ Ensures consistent and professional customer replies
✅ Provides full visibility via Google Sheets logs
✅ Centralizes team approvals in Slack for faster response times
✅ Leverages GPT-4 for accurate categorization and natural replies

## Features
- Hourly Facebook message fetch with the Graph API
- GPT-4 powered text classification and reply suggestion
- Slack-based dual approval flow
- Automated Facebook replies post-approval
- Google Sheets logging for all categorized messages
- Built-in error detection and Slack alerting

## Requirements
- Facebook Graph API credentials with page message permissions
- OpenAI API key for GPT-4 processing
- Slack API credentials with chat:write permission
- Google Sheets OAuth2 credentials
- Environment variables: FACEBOOK_PAGE_ID, GOOGLE_SHEET_ID, GOOGLE_SHEET_NAME, SLACK_CHANNEL_ID

## Target Audience
- Marketing and lead-generation teams using Facebook Pages 📣
- Customer support teams managing Messenger queries 💬
- Businesses seeking automated lead routing and CRM sync 🧾
- Teams leveraging AI for customer engagement optimization 🤖

## Step-by-Step Setup Instructions
1️⃣ Connect Facebook Graph API credentials and set your page ID.
2️⃣ Add OpenAI API credentials for GPT-4.
3️⃣ Configure the Slack channel ID and credentials.
4️⃣ Link your Google Sheet for message logging.
5️⃣ Replace environment variable placeholders with your actual IDs.
6️⃣ Test the workflow manually before enabling automation.
7️⃣ Activate the schedule trigger for ongoing hourly execution. ✅
by Raphael De Carvalho Florencio
## What this workflow is (About)
This workflow turns a Telegram bot into an AI-powered lyrics assistant. Users send a command plus a lyrics URL, and the flow downloads, cleans, and analyzes the text, then replies on Telegram with translated lyrics, summaries, vocabulary, poetic devices, or an interpretation—all generated by AI (OpenAI).

## What problems it solves
- Centralizes lyrics retrieval + cleanup + AI analysis in one automated flow
- Produces study-ready outputs (translation, vocabulary, figures of speech)
- Saves time for teachers, learners, and music enthusiasts with instant results in chat

## Key features
- **AI analysis** using OpenAI (no secrets hardcoded; uses n8n Credentials)
- **Line-by-line translation**, **concise summaries**, and **vocabulary lists**
- **Poetic/literary device detection** and **emotional/symbolic interpretation**
- Robust ETL (extract, download, sanitize) and error handling
- Clear Sticky Notes documenting routing, ETL, AI prompts, and messaging

## Who it's for
- Language learners & teachers
- Musicians, lyricists, and music bloggers
- Anyone studying lyrics for meaning, style, or vocabulary

## Input & output
- **Input:** Telegram command with a public lyrics URL
- **Output:** Telegram messages (Markdown/MarkdownV2), split into chunks if long

## How it works
- **Telegram → Webhook** receives a user message (e.g., /get_lyrics <URL>).
- **Routing (If/Switch)** detects which command was sent.
- **Extract URL + Download (HTTP Request)** fetches the lyrics page.
- **Cleanup (Code)** strips HTML/scripts/styles and normalizes whitespace.
- **OpenAI (Chat)** formats the result per command (translation, summary, vocabulary, analysis).
- **Telegram (Send Message)** returns the final text; long outputs are split into chunks.
- **Error handling** replies with friendly guidance for unsupported or incomplete commands.

## Set up steps
- Create a Telegram bot with @BotFather and copy the bot token.
- In n8n, create Credentials → Telegram API and paste your token (no hardcoded keys in nodes).
- Create Credentials → OpenAI and paste your API key.
- Import the workflow and set a short webhook path (e.g., /lyrics-bot).
- Publish the webhook and set it on Telegram:
  https://api.telegram.org/bot<YOUR_BOT_TOKEN>/setWebhook?url=https://[YOUR_DOMAIN]/webhook/lyrics-bot
- (Optional) Restrict update types:
  ```shell
  curl -X POST https://api.telegram.org/bot<YOUR_BOT_TOKEN>/setWebhook \
    -H "Content-Type: application/json" \
    -d '{ "url": "https://[YOUR_DOMAIN]/webhook/lyrics-bot", "allowed_updates": ["message"] }'
  ```
- Test by sending /start and then /get_lyrics <PUBLIC_URL> to your bot.
- If messages are long, ensure MarkdownV2 is used and special characters are escaped.
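The Cleanup (Code) step that strips HTML and normalizes whitespace can be sketched like this. Regex-based stripping is a simplification; the template's actual node may differ, and a real lyrics page with unusual markup may need a proper HTML parser.

```javascript
// Standalone sketch of the Cleanup step: drop <script>/<style> blocks,
// preserve <br> as line breaks, strip remaining tags, decode a few common
// entities, and normalize whitespace.
function cleanLyricsHtml(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, ' ')
    .replace(/<style[\s\S]*?<\/style>/gi, ' ')
    .replace(/<br\s*\/?>/gi, '\n')       // keep line breaks between lyric lines
    .replace(/<[^>]+>/g, ' ')            // strip all other tags
    .replace(/&amp;/g, '&')
    .replace(/&#39;|&apos;/g, "'")
    .replace(/&quot;/g, '"')
    .replace(/[ \t]+/g, ' ')             // collapse runs of spaces/tabs
    .replace(/\s*\n\s*/g, '\n')          // tidy whitespace around newlines
    .trim();
}

const text = cleanLyricsHtml(
  '<html><style>p{}</style><p>Hello &amp; goodbye<br>line two</p><script>x()</script></html>'
);
console.log(text);
```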
by Toshiki Hirao
You can turn messy business card photos into organized contact data automatically. With this workflow, you can upload a business card photo to Slack and instantly capture the contact details into Google Sheets using OCR. No more manual typing—each new card is scanned, structured, saved, and confirmed back in Slack, making contact management fast and effortless.

## How it works
1. **Slack Trigger** – The workflow starts when a business card photo is uploaded to Slack.
2. **HTTP Request** – The uploaded image is fetched from Slack.
3. **AI/OCR Parsing** – The card image is analyzed by an AI model and structured into contact fields (name, company, email, phone, etc.).
4. **Transform Data** – The extracted data is cleaned and mapped into the correct format.
5. **Google Sheets** – A new row is appended to your designated Google Sheet, creating an organized contact database.
6. **Slack Notification** – Finally, a confirmation message is sent back to Slack to let you know the contact has been successfully saved.

## How to use
- Copy the template into your n8n instance.
- Connect your Slack account to capture uploaded images.
- Set up your Google Sheets connection and choose the spreadsheet where contacts should be stored.
- Adjust the Contact Information extraction node if you want to capture custom fields (e.g., job title, address).
- Deploy and test: upload a business card image in Slack and confirm it's added to Google Sheets automatically.

## Requirements
- n8n running (cloud).
- A Slack account with access to the channel where photos will be uploaded.
- A Google Sheets account with a target sheet prepared for storing contacts.
- AI/OCR capability enabled in your n8n (e.g., OpenAI, Google Vision, or another OCR/LLM provider).
- Basic access rights in both Slack and Google Sheets to read and write data.
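The Transform Data step can be sketched as a small normalization function. The field names on both sides (what the OCR model returns and what the sheet columns are called) are assumptions; match them to your extraction node and spreadsheet headers.

```javascript
// Standalone sketch of the Transform Data step: take loosely structured
// OCR output and map it into the fixed columns the Sheets node appends.
function toContactRow(ocr) {
  const norm = v => (v ?? '').toString().trim();
  return {
    Name: norm(ocr.name),
    Company: norm(ocr.company),
    Email: norm(ocr.email).toLowerCase(),
    Phone: norm(ocr.phone).replace(/[^\d+]/g, ''), // keep digits and leading +
  };
}

const row = toContactRow({
  name: '  Jane Doe ',
  company: 'Acme Corp',
  email: 'Jane@Acme.COM',
  phone: '+1 (555) 010-2222',
});
console.log(row);
```

Normalizing here (trimmed names, lowercase emails, digit-only phones) keeps the sheet consistent even when the OCR output varies between card layouts.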
by Rakin Jakaria
## Who this is for
This workflow is for content creators, digital marketers, or YouTube strategists who want to automatically discover trending videos in their niche, analyze engagement metrics, and get data-driven insights for their content strategy — all from one simple form submission.

## What this workflow does
This workflow starts every time someone submits the YouTube Trends Finder form. It then:
- **Searches YouTube videos** based on your topic and specified time range using the YouTube Data API.
- **Fetches detailed analytics** (views, likes, comments, engagement rates) for each video found.
- **Calculates engagement rates** and filters out low-performing content (below 2% engagement).
- **Applies smart filters** to exclude videos with fewer than 1000 views, content outside your timeframe, and hashtag-heavy titles.
- **Removes duplicate videos** to ensure clean data.
- **Creates a Google Spreadsheet** with all trending video data organized by performance metrics.
- **Delivers the results** via a completion form with a direct link to your analytics report.

## Setup
To set this workflow up:
- **Form Trigger** – Customize the "YouTube Trends Finder" form fields if needed (Topic Name, Last How Many Days).
- **YouTube Data API** – Add your YouTube OAuth2 credentials and API key in the respective nodes.
- **Google Sheets** – Connect your Google Sheets account for automatic report generation.
- **Engagement filters** – Adjust the 2% engagement rate threshold based on your quality standards.
- **View filters** – Modify the minimum view count (currently 1000+) in the filter conditions.
- **Regional settings** – Update the region code (currently "US") to target specific geographic markets.

## How to customize this workflow to your needs
- Change the engagement rate threshold to be more or less strict based on your niche requirements.
- Add additional filters like video duration, subscriber count, or specific keywords to refine results.
- Modify the Google Sheets structure to include extra metrics like "Channel Name", "Video Duration", or "Trending Score".
- Switch to different output formats like CSV export or direct email reports instead of Google Sheets.
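The engagement calculation and filter conditions above can be sketched as follows. The YouTube Data API returns `statistics.viewCount`, `likeCount`, and `commentCount` as strings; the (likes + comments) / views formula and the hashtag cutoff of 3 are assumptions for illustration — match them to the template's actual filter nodes.

```javascript
// Engagement rate as (likes + comments) / views, in percent.
function engagementRate(stats) {
  const views = Number(stats.viewCount);
  if (!views) return 0;
  const interactions = Number(stats.likeCount ?? 0) + Number(stats.commentCount ?? 0);
  return (interactions / views) * 100;
}

// Combined filter mirroring the workflow's conditions.
function passesFilters(video) {
  const views = Number(video.statistics.viewCount);
  const hashtags = (video.title.match(/#/g) ?? []).length;
  return (
    views >= 1000 &&                          // minimum view count
    engagementRate(video.statistics) >= 2 &&  // 2% engagement threshold
    hashtags <= 3                             // skip hashtag-heavy titles (assumed cutoff)
  );
}

const sample = {
  title: 'How I plan my week',
  statistics: { viewCount: '12000', likeCount: '300', commentCount: '45' },
};
console.log(engagementRate(sample.statistics)); // (345 / 12000) * 100
console.log(passesFilters(sample));
```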