by Vedad Sose
Create TikTok videos grounded in real human insight. One agent researches viral trends, gathers authentic audience feedback on your product to inform script creation, tests script variations with target audiences to identify the most resonant option, generates the video, and sends it back to you on Telegram.

## Description

Most AI video tools generate content based on assumptions and AI guesswork, which is why so much content looks and feels the same. This workflow grounds every creative decision in an authentic human perspective.

The process:

1. **Research trends** — Apify scrapes what's actually viral on TikTok
2. **Understand your audience** — Digital Twins provide real feedback on your product/concept to inform creative direction
3. **Generate informed scripts** — the agent writes variations based on actual human insights, not AI assumptions
4. **Test for resonance** — Digital Twins evaluate which script will genuinely connect with your target audience
5. **Produce the winner** — Veo 3 generates a video from the most resonant script

By anchoring both script development and selection in real human perspectives, you can create TikTok videos that actually resonate with your target audience.
## Triggers

- **Telegram message** — send your product/concept to your bot to start the workflow

## Nodes Used

- **Set** (Workflow Configuration) — target audience demographics
- **Telegram Trigger** — starts the workflow with product/concept input
- **AI Agent** — orchestrates the entire research → insight → creation → resonance testing flow
- **OpenAI Chat Model** (GPT-4o) — powers the agent
- **MCP Client Tool** (Apify) — scrapes trending TikTok content
- **MCP Client Tool** (OriginalVoices) — queries Digital Twins for:
  - initial product/concept feedback to inform scripts
  - resonance testing to select the most compelling script version
- **Code** (Parse JSON) — extracts structured output from the agent
- **HTTP Request** (Veo 3) — generates 9:16 video via fal.ai
- **Wait** — pauses for video generation
- **HTTP Request** (Poll) — checks video status
- **Telegram** — sends the finished video URL back to you

## Required Credentials

- **Telegram Bot** — create via @BotFather
- **OpenAI API** — for GPT-4o
- **Apify API** — for TikTok trend scraping (apify.com)
- **OriginalVoices API** — for real-time audience insights and feedback (originalvoices.ai)
- **FAL.ai API** — for Veo 3 video generation (fal.ai)

## Configuration

1. Import `workflow.json` into n8n
2. Edit the Workflow Configuration node:
   - `productName` — your product/brand name
   - `productDescription` — what your product does
   - `targetAudience` — e.g., "US women, 25-40, interested in wellness and productivity" (defines which Digital Twins provide insights and resonance feedback)
   - `tone` — e.g., "casual, energetic, Gen-Z friendly"
3. Connect Telegram Bot credentials on both Telegram nodes
4. Connect OpenAI API credentials on the Chat Model node
5. Connect Header Auth on the Apify MCP node — Header: `Authorization`, Value: `Bearer YOUR_APIFY_TOKEN`
6. Connect Header Auth on the OriginalVoices node — Header: `X-Api-Key`, Value: `YOUR_API_KEY`
7. Connect Header Auth on the Veo 3 and Poll Video Status nodes — Header: `Authorization`, Value: `Key YOUR_FAL_API_KEY`
8. Activate the workflow

## Usage

Send a message to your Telegram bot describing your product/concept (e.g.,
"meditation app that uses AI-generated soundscapes"). The agent:

1. Researches current TikTok trends
2. Gathers authentic audience feedback on your concept from Digital Twins
3. Writes script variations informed by real human perspectives
4. Tests all scripts with Digital Twins of your target audience to identify which one resonates most
5. Sends the most resonant script to Veo 3 for video generation

You receive the video URL, audience-informed caption, and hashtags back on Telegram.

**Why this matters:** every creative decision — from initial direction to final script selection — is grounded in how real humans in your target demographic actually think, feel, and respond. Not AI assumptions or guesswork, but real human feedback that helps your content actually resonate.

## Customize

- **Audience targeting** — refine the `targetAudience` parameter to query specific Digital Twin demographics
- **Trend niche** — modify the agent prompt to focus on specific hashtags or content types
- **Script count** — adjust the prompt to generate more or fewer variations for testing
- **Video style** — update the Veo 3 prompt for different aesthetics (cinematic, UGC, etc.)
- **Resonance criteria** — specify what makes a script "compelling" for your use case
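The Wait and Poll nodes above implement a classic polling loop around the video-generation job. A minimal sketch of that pattern in a Code-node style, where `checkStatus` stands in for the actual fal.ai status request (its URL and response shape are assumptions, not documented here):

```javascript
// Generic polling helper: repeatedly call checkStatus() until it reports
// completion or we run out of attempts. In the workflow, checkStatus would
// wrap the "Poll Video Status" HTTP request; status names are illustrative.
async function pollUntilDone(checkStatus, { maxAttempts = 30, delayMs = 10000 } = {}) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const result = await checkStatus();
    if (result.status === 'COMPLETED') return result;   // video is ready
    if (result.status === 'FAILED') throw new Error('Video generation failed');
    await new Promise((r) => setTimeout(r, delayMs));   // the "Wait" node
  }
  throw new Error('Timed out waiting for video generation');
}
```

Capping attempts matters: without a timeout, a stuck job would hold the workflow execution open indefinitely.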
by moosa
# Daily Tech & Startup Digest: Notion-Powered News Curation

## Description

This n8n workflow automates the curation of a daily tech and startup news digest from articles stored in a Notion database. It filters articles from the past 24 hours, refines them using keyword matching and LLM classification, aggregates them into a single Markdown digest with categorized summaries, and publishes the result as a Notion page. Designed for manual testing or daily scheduled runs, it includes sticky notes (as required by the n8n creator page) to document each step clearly. This original workflow is for educational purposes, showcasing Notion integration, AI classification, and Markdown-to-Notion conversion.

Data in Notion

## Workflow Overview

### Triggers

- **Manual Trigger** — tests the workflow (When clicking 'Execute workflow')
- **Schedule Trigger** — runs daily at 8 PM (Schedule Trigger, disabled by default)

### Article Filtering

- **Fetch Articles** — queries the Notion database (Get many database pages) for articles from the last 24 hours using a date filter
- **Keyword Filtering** — JavaScript code (Code in JavaScript) filters articles containing tech/startup keywords (e.g., "tech," "AI," "startup") in title, summary, or full text
- **LLM Classification** — uses OpenAI's gpt-4.1-mini (OpenAI Chat Model) with a text classifier (Text Classifier) to categorize articles as "Tech/Startup" or "Other," keeping only relevant ones

### Digest Creation

- **Aggregate Articles** — combines filtered articles into a single object (Code in JavaScript1) for processing
- **Generate Digest** — an AI agent (AI Agent) with OpenAI's gpt-4.1-mini (OpenAI Chat Model1) creates a Markdown digest with an intro paragraph, categorized article summaries (e.g., AI & Developer Tools, Startups & Funding), clickable links, and a closing note
### Notion Publishing

- **Format for Notion** — JavaScript code (Code in JavaScript2) converts the Markdown digest into a Notion-compatible JSON payload, supporting headings, bulleted lists, and links, with a title like "Tech & Startup Daily Digest – YYYY-MM-DD"
- **Create Notion Page** — sends the payload via HTTP request (HTTP Request) to the Notion API to create a new page

## Credentials

Uses Notion API and OpenAI API credentials.

## Notes

- This workflow is for educational purposes, demonstrating Notion database querying, AI classification, and Markdown-to-Notion publishing.
- Enable and adjust the schedule trigger (e.g., 8 PM daily) for production use to create daily digests.
- Set up Notion and OpenAI API credentials in n8n before running.
- The date filter can be modified (e.g., hours instead of days) to adjust the article selection window.
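The "Format for Notion" step can be sketched as a small Markdown-to-blocks converter. This is a simplified stand-in for the actual Code node (it handles only headings, bullets, and paragraphs, not links); the block shapes follow the public Notion API format:

```javascript
// Convert simple Markdown lines into Notion API block objects.
// Only "# " headings and "- " bullets are handled; everything else
// becomes a paragraph. Link handling is omitted for brevity.
function markdownToNotionBlocks(markdown) {
  const blocks = [];
  const richText = (content) => [{ type: 'text', text: { content } }];
  for (const line of markdown.split('\n')) {
    const text = line.trim();
    if (!text) continue; // skip blank lines
    if (text.startsWith('# ')) {
      blocks.push({ object: 'block', type: 'heading_2',
        heading_2: { rich_text: richText(text.slice(2)) } });
    } else if (text.startsWith('- ')) {
      blocks.push({ object: 'block', type: 'bulleted_list_item',
        bulleted_list_item: { rich_text: richText(text.slice(2)) } });
    } else {
      blocks.push({ object: 'block', type: 'paragraph',
        paragraph: { rich_text: richText(text) } });
    }
  }
  return blocks;
}
```

The resulting array goes into the `children` field of the Notion "create page" request body.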
by Trung Tran
# AI-Powered YouTube Auto-Tagging Workflow (SEO Automation)

Watch the demo video below:

> Supercharge your YouTube SEO with this AI-powered workflow that automatically generates and applies smart, SEO-friendly tags to your new videos every week. No more manual tagging — just better discoverability, improved reach, and consistent optimization. Plus, get instant Slack notifications so your team stays updated on every video's SEO boost.

## Who's it for

- YouTube creators, channel admins, and marketing teams who publish regularly and want consistent, SEO-friendly tags without manual effort.
- Agencies managing multiple channels who need an auditable, automated tagging process with Slack notifications.

## How it works / What it does

1. **Weekly Schedule Trigger** — runs the workflow once per week.
2. **Get all videos uploaded last week** — queries YouTube for videos uploaded by the channel in the past 7 days.
3. **Get video detail** — retrieves each video's title, description, and ID.
4. **YouTube Video Auto Tagging Agent (LLM)** — inputs: `video.title`, `video.description`, `channelName`; uses an SEO-specialist system prompt to generate 15–20 relevant, comma-separated tags.
5. **Update video with AI-generated tags** — writes the tags back to the video via the YouTube Data API.
6. **Inform via Slack message** — posts a confirmation message (video title + ID + tags) to a chosen Slack channel for visibility.

## How to set up

**YouTube connection**
1. Create a Google Cloud project and enable YouTube Data API v3.
2. Configure an OAuth client (Web app / Desktop as required).
3. Authorize with the Google account that manages the channel.
4. In your automation platform, add the YouTube credential and grant scopes (see Requirements).

**Slack connection**
1. Create or use an existing Slack app/bot.
2. Install it to your workspace and capture the Bot Token.
3. Add the Slack credential in your automation platform.

**LLM / Chat Model**
1. Select your model (e.g., OpenAI GPT).
2. Paste the System Prompt (SEO expert) and the User Prompt template:
   - Inputs: `{{video_title}}`, `{{video_description}}`, `{{channel_name}}`.
   - Output: comma-separated list of 15–20 tags (no `#`, no duplicates).

**Node configuration**
- Weekly Schedule Trigger: choose day/time (e.g., Mondays 09:00 local).
- Get all videos uploaded last week: date filter = `now() - 7 days`.
- Get video detail: map each video ID from the previous node.
- Agent node: map fields to the prompt variables.
- Update video: map the agent's tag string to the YouTube tags field.
- Slack message: `The video "{{video_title}} - {{video_id}}" has been auto-tagged successfully. Tags: {{tags}}`

**Test run**
1. Manually run the workflow with one recent video.
2. Verify the tags appear in YouTube Studio and the Slack message posts.

## Requirements

**APIs & Scopes**
- **YouTube Data API v3**
  - `youtube.readonly` (to list videos / details)
  - `youtube` or `youtube.force-ssl` (to update video metadata, incl. tags)
- **Slack Bot Token scopes**
  - `chat:write` (post messages)
  - `channels:read` or `groups:read` if selecting channels dynamically (optional)

**Platform**
- Access to a chat/LLM provider (e.g., OpenAI).
- Outbound HTTPS allowed.

**Rate limits & quotas**
- YouTube updates consume quota, and tag updates are write operations — avoid re-writing unchanged tags.
- Add basic throttling (e.g., 1–2 updates/sec) if you process many videos.

## How to customize the workflow

- **Schedule:** switch to daily, or run on publish events instead of weekly.
- **Filtering:** process only videos matching rules (e.g., title contains "tutorial", or missing tags).
- **Prompt tuning:**
  - Add brand keywords to always include (e.g., "WiseStack AI").
  - Constrain to a language (e.g., "Vietnamese tags only").
  - Enforce a max of 500 chars total for tags if you want a stricter cap.
- **Safety guardrails:**
  - Validate model output: split by comma, trim whitespace, dedupe, drop empty/over-long tags.
  - If the agent fails, fall back to a heuristic generator (title/keywords extraction).
- **Change log:** write a row per update to a sheet/DB (videoId, oldTags, newTags, timestamp, runId).
- **Human-in-the-loop:** send tags to Slack as buttons ("Apply / Edit / Skip") before updating YouTube.
- **Multi-channel support:** loop through a list of channel credentials and repeat the pipeline.
- **Notifications:** add error Slack messages for failed API calls; summarize weekly results.

Tip: keep a small allow/deny list (e.g., banned terms, mandatory brand terms) and run a quick sanitizer right after the agent node to maintain consistency across your channel.
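The validation and allow/deny-list guardrails described above fit in a few lines of a Code node placed right after the agent. A sketch, where the length limits and list contents are illustrative values, not YouTube's official quotas:

```javascript
// Sanitize the agent's comma-separated tag string before writing to YouTube.
// maxTagLen / maxTotalChars and the deny list are illustrative, not official limits.
function sanitizeTags(raw, { denyList = [], mustInclude = [], maxTagLen = 30, maxTotalChars = 500 } = {}) {
  const seen = new Set();
  const tags = [];
  // Mandatory brand terms go first so they always survive the total-length cap.
  for (const candidate of [...mustInclude, ...raw.split(',')]) {
    const tag = candidate.trim().replace(/^#/, '');      // trim + strip stray "#"
    const key = tag.toLowerCase();
    if (!tag || tag.length > maxTagLen) continue;        // drop empty / over-long
    if (seen.has(key)) continue;                         // dedupe, case-insensitive
    if (denyList.some((d) => key.includes(d.toLowerCase()))) continue; // banned terms
    if (tags.join(',').length + tag.length + 1 > maxTotalChars) break; // overall cap
    seen.add(key);
    tags.push(tag);
  }
  return tags;
}
```

Putting `mustInclude` terms first is a deliberate choice: if the model output is long, the brand terms are kept and the model's trailing tags are the ones dropped.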
by Interlock GTM
## Summary

Turns a plain name + email into a fully enriched HubSpot contact by matching the person in Apollo, pulling their latest LinkedIn activity, summarising the findings with GPT-4o, and upserting the clean data into HubSpot.

## Key use-cases

- SDRs enriching inbound demo requests before routing
- RevOps teams keeping executive records fresh
- Marketers building highly segmented email audiences

## Inputs

| Field | Type | Example |
|-|-|-|
| name | string | "Jane Doe" |
| email | string | "jane@acme.com" |

## Required credentials

| Service | Node | Notes |
|-|-|-|
| Apollo.io API key | HTTP Request – "Enrich with Apollo" | Set in header `x-api-key` |
| RapidAPI key (Fresh-LinkedIn-Profile-Data) | "Get recent posts" | Header `x-rapidapi-key` |
| OpenAI | 3 LangChain nodes | Supply an API key; default model gpt-4o-mini |
| HubSpot OAuth2 | "Enrich in HubSpot" | Add/create any custom contact properties referenced |

## High-level flow

1. **Trigger** – runs when another workflow passes `name` & `email`.
2. **Clean** – a JS Code node normalises & deduplicates emails.
3. **Apollo match** – queries `/people/match`; skips if no person is found.
4. **LinkedIn fetch** – grabs up to 3 original posts from the last 30 days.
5. **AI summary chain** – OpenAI → structured/auto-fixing parsers; produces a strict JSON block with job title, location, summaries, etc.
6. **HubSpot upsert** – maps every key (plus five custom properties) into the contact record.

Sticky notes annotate the canvas; error-prone steps have retry logic.
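The "Clean" step can be sketched as a small Code-node function. The exact rules in the original node aren't shown, so trimming and lowercasing are assumptions here:

```javascript
// Normalise and deduplicate incoming {name, email} items before the Apollo match.
// Trimming and lowercasing are assumed normalisation rules; adjust as needed.
function cleanEmails(items) {
  const seen = new Set();
  const out = [];
  for (const { name, email } of items) {
    const normalised = (email || '').trim().toLowerCase();
    if (!normalised || seen.has(normalised)) continue; // skip blanks and duplicates
    seen.add(normalised);
    out.push({ name: (name || '').trim(), email: normalised });
  }
  return out;
}
```

Deduplicating before the Apollo call avoids burning `/people/match` credits on repeat inputs.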
by Samir Saci
Tags: Logistics, Supply Chain, Warehouse Operations, Paperless Processes, Inventory Management

## Context

Hi! I'm Samir, a Supply Chain Engineer and Data Scientist based in Paris, and founder of LogiGreen.

> Let's use AI with n8n to help SMEs digitalise their logistics operations!

Traditional inventory cycle counts often require clipboards, scanners, and manual reconciliation. With this workflow, the operator walks through the warehouse, sends voice messages, and the bot automatically updates the inventory records. Using AI-based transcription and structured extraction, we optimise the entire process with a simple mobile device connected to Telegram.

📬 For business inquiries, you can find me on LinkedIn.

## Demo of the workflow

In this example, the bot guides the operator through the cycle count for three locations. The workflow automatically records the results in Google Sheets.

## Who is this template for?

This template is ideal for companies with limited IT resources:

- **Inventory controllers** who need a hands-free, mobile-friendly counting process
- **Small 3PLs and retailers** looking to digitalise stock control

🎥 Tutorial: a complete walkthrough (with explanations of every node) is available on YouTube.

## What does this workflow do?

This automation uses Telegram and OpenAI's Whisper transcription:

1. The operator sends /start to the bot.
2. The bot identifies the first location that still needs to be counted.
3. The operator is guided to the location through a Telegram message.
4. The operator records a voice message with the location ID and the number of units counted.
5. AI nodes transcribe the audio and extract `location_id` and `quantity`.
6. If the message cannot be transcribed, the bot asks the operator to repeat.
7. If the location is valid and still pending, the Google Sheet is updated.
8. The bot sends the next location, until the final one is completed.
9. The operator receives a confirmation that the cycle count is finished.
## Next Steps

Before running the workflow, follow the sticky notes and configure:

1. Connect your Telegram Bot API credentials
2. Add your OpenAI API key to the transcription and extraction nodes
3. Connect your Google Sheets credentials
4. Update the Google Sheet ID and the worksheet name in all Spreadsheet nodes
5. Adjust the AI prompts to match your warehouse location naming conventions

Submitted: 20 November 2025
Template designed with n8n version 1.116.2
by Jason
Build a personal Telegram bot that looks up English vocabulary and saves every entry to Notion — supporting text, voice, and photo input.

Send a word by typing, voice message, or photo. The bot auto-detects the input type: voice is transcribed via Whisper, and photos are scanned via GPT-4o-mini Vision OCR. A GPT-4.1-mini agent spell-checks the word and returns its definition, translation, part of speech, and an example sentence. The translation language is fully configurable — just set `TARGET_LANGUAGE` in the Config node (e.g. Traditional Chinese, Japanese, Spanish).

Nodes used: Telegram Trigger, HTTP Request (Telegram file API), OpenAI (Whisper + GPT-4o-mini Vision + GPT-4.1-mini), Structured Output Parser, Notion.

💡 Want more AI automation templates? Check out the Content Automation Bundle:
👉 https://jasonchuang0818.gumroad.com/l/n8n-content-automation-bundle
by Rahul Joshi
📊 Description

This workflow allows users to ask questions about past meetings using their voice. It converts the voice question into text, searches stored meeting notes using Pinecone, and replies with a spoken answer generated by AI. It helps teams quickly recall decisions, tasks, and discussions without reading long meeting notes. 🧠🔊

🔁 What This Template Does

1️⃣ Receives a voice question through a webhook endpoint. 🌐
2️⃣ Converts the audio question into text using speech transcription. 🎤➡️📝
3️⃣ Cleans and prepares the question for searching. ✅
4️⃣ Converts the question into an embedding for semantic search. 🔍
5️⃣ Searches relevant meeting notes from Pinecone using the team namespace. 📚
6️⃣ Combines retrieved meeting context into a single readable format. 🧩
7️⃣ Uses AI to answer the question strictly from the meeting context. 🤖
8️⃣ Converts the AI's text answer into spoken audio. 🔊
9️⃣ Sends the audio response back to the user via webhook. 🔁

⭐ Key Benefits

✅ Hands-free access to meeting information
✅ Saves time searching through meeting notes
✅ Prevents AI hallucinations using RAG
✅ Supports multiple teams using namespaces
✅ Works with voice-based tools and assistants
✅ Improves meeting recall and clarity

🧩 Features

- Voice-based question input
- Speech-to-text transcription
- Semantic search using Pinecone
- Team-based data isolation
- Context-aware AI responses
- Text-to-speech output
- Webhook-driven architecture

🔐 Requirements

- OpenAI API key for transcription, embeddings, and speech generation
- Azure OpenAI credentials for chat responses
- Pinecone API key with a configured index
- A matching embedding model for ingest and query
- A webhook client capable of sending audio files

🎯 Target Audience

- Teams that conduct frequent meetings
- Managers needing quick decision recall
- Remote and distributed teams
- Product, engineering, and operations teams
- Automation builders using n8n
- Organizations adopting voice-based workflows
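Step 6️⃣ above, combining the retrieved matches into one readable context block, is typically a small Code node. A sketch, where the metadata field names (`meeting`, `text`) are assumptions about how the Pinecone index was populated:

```javascript
// Combine vector-search matches into a single context block for the chat prompt.
// Field names in metadata are assumed; a character budget keeps the prompt bounded.
function buildContext(matches, { maxChars = 4000 } = {}) {
  const parts = [];
  let used = 0;
  for (const m of matches) {
    const snippet = `[${m.metadata.meeting}] ${m.metadata.text}`;
    if (used + snippet.length > maxChars) break; // stay within the prompt budget
    parts.push(snippet);
    used += snippet.length;
  }
  return parts.join('\n\n');
}
```

Prefixing each snippet with its meeting label lets the answering model cite which meeting a fact came from, which supports the "answer strictly from context" instruction.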
by Dmitrij Zykovic
# Personal Expense Tracker Bot 💰

AI-powered Telegram bot for effortless expense tracking. Send receipts, voice messages, or text - the bot automatically extracts and categorizes your expenses.

## ✨ Key Features

- 📸 **Receipt & Invoice OCR** - send photos of receipts or PDF invoices; AI extracts expense data automatically
- 🎤 **Voice Messages** - speak your expenses naturally; audio is transcribed and processed
- 💬 **Natural Language** - just type "spent 50 on groceries" or any text format
- 🌍 **Multilingual** - processes documents in any language (EN, DE, PT, etc.)
- 📊 **Smart Statistics** - get monthly totals, category breakdowns, multi-month comparisons
- 🔒 **Private & Secure** - single-user authorization; only you can access your data
- ⚡ **Zero Confirmation** - expenses are added instantly, no annoying "confirm?" prompts

## 🎯 How It Works

1. **Send expense data via Telegram:** a photo of a receipt, a PDF invoice, a voice message, or a text message
2. **AI processes automatically:** extracts amount, date, and vendor; categorizes the expense; stores it in an organized format
3. **Query your expenses:** "Show my expenses for November", "How much did I spend on groceries?", "Compare last 3 months"

## 📋 Expense Categories

Groceries, Transportation, Housing, Utilities, Healthcare, Entertainment, Dining Out, Clothing, Education, Subscriptions, Personal Care, Gifts, Travel, Sports, Other

## 🔧 Setup Requirements

### 1. Telegram Bot

Create a Telegram bot via @BotFather and get your API token. Configure credentials for these nodes: Input, WelcomeMessage, GetAudioFile, GetAttachedFile, GetAttachedPhoto, ReplyText, NotAuthorizedMessage, DeleteProcessing.

### 2. OpenRouter API

Get an API key from OpenRouter for AI processing. Configure credentials for:

- Gpt4o (main processing)
- Sonnet45 (expense assistant)

### 3. Ainoflow API

Get an API key from Ainoflow for storage and OCR.
Configure Bearer credentials for:

- GetConfig, SaveConfig
- ExtractFileText, ExtractImageText
- TranscribeRecording
- JsonStorageMcp (MCP tool)

## 🏗️ Workflow Architecture

| Section | Description |
|---------|-------------|
| Message Trigger | Receives all Telegram messages |
| Bot Privacy | Locks bot to first user, rejects unauthorized access |
| Chat Message / Audio | Routes text and voice messages to AI |
| Document / Photo | Extracts text from files via OCR and forwards to AI |
| Root Agent | Routes messages to Expense Assistant, validates responses |
| Expense Assistant | Core logic: stores expenses, calculates statistics |
| Result / Reply | Sends formatted response back to Telegram |
| Cleanup / Reset | Manual trigger to delete all data (⚠️ use with caution) |

## 💬 Usage Examples

### Adding Expenses

- 📸 [Send receipt photo] → Added: 45.50 EUR - Groceries (Lidl)
- 🎤 "Bought coffee for five euros" → Added: 5.00 EUR - Dining Out (coffee)
- 💬 "50 uber" → Added: 50.00 EUR - Transportation (uber)

### Querying Expenses

- "Show my expenses" → November 2025: 1,250.50 EUR (23 expenses). Top: Groceries 450€, Transportation 280€, Dining 220€
- "How much on entertainment this month?"
  → Entertainment: 85.00 EUR (3 expenses)
- "Compare October and November" → Oct: 980€ | Nov: 1,250€ (+27%)

## 📦 Data Storage

Expenses are stored in JSON format, organized by month (YYYY-MM):

```json
{
  "id": "uuid",
  "amount": 45.50,
  "currency": "EUR",
  "category": "Groceries",
  "description": "Store name",
  "date": "2025-11-10T14:30:00Z",
  "created_at": "2025-11-10T14:35:22Z"
}
```

## ⚠️ Important Notes

- **First user locks the bot** - run /start to claim ownership
- **Default currency is EUR** - AI auto-detects other currencies
- **Cleanup deletes ALL data** - use the manual trigger with caution
- **No confirmation for adding** - only delete operations ask for confirmation

## 🛠️ Customization

- Change the default currency in the agent prompts
- Add/modify expense categories in ExpenseAssistant
- Extend the Root Agent with additional assistants
- Adjust AI models (swap GPT-4o/Sonnet as needed)

## 📚 Related Resources

- Create Telegram Bot
- OpenRouter Credentials
- Ainoflow Platform

## 💼 Need Customization?

Want to adapt this template for your specific needs? Custom integrations, additional features, or enterprise deployment? Contact us at Ainova Systems - we build AI automation solutions for businesses.

Tags: telegram, expense-tracker, ai-agent, ocr, voice-to-text, openrouter, mcp-tools, personal-finance
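The month-keyed layout described under Data Storage can be sketched as a simple grouping step. This is only an illustration of the YYYY-MM organization; the actual storage calls go through Ainoflow's JSON storage tool and are not shown here:

```javascript
// Group expense records under "YYYY-MM" keys, matching the storage layout above.
function groupByMonth(expenses) {
  const byMonth = {};
  for (const expense of expenses) {
    const month = expense.date.slice(0, 7); // "2025-11-10T14:30:00Z" -> "2025-11"
    (byMonth[month] ??= []).push(expense);
  }
  return byMonth;
}
```

Keying by month makes the "Compare October and November" style of query a lookup of two buckets rather than a scan over every record.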
by 小林幸一
# Analyze Amazon product reviews with Gemini and save to Google Sheets

## 📄 Description

This workflow automates the process of analyzing customer feedback on Amazon products. Instead of manually reading through hundreds of reviews, this template scrapes reviews (specifically targeting negative feedback), uses Google Gemini (AI) to analyze the root causes of dissatisfaction, and generates specific improvement suggestions. The results are automatically logged into a Google Sheet for easy tracking, and a Slack notification is sent to keep your team updated. This tool is essential for understanding "Voice of Customer" data efficiently without manual data entry.

## 🧍 Who is this for

- **Product Managers** looking for product improvement ideas
- **E-commerce Sellers (Amazon FBA, D2C)** monitoring brand reputation
- **Market Researchers** analyzing competitor weaknesses
- **Customer Support Teams** identifying recurring issues

## ⚙️ How it works

1. **Data Collection:** the workflow triggers the Apify actor (junglee/amazon-reviews-scraper) to fetch reviews from a specified Amazon product URL. It is currently configured to filter for 1- and 2-star reviews to focus on complaints.
2. **AI Analysis:** it loops through each review and sends the content to Google Gemini. The AI determines a sentiment score (1–5), categorizes the issue (Quality, Design, Shipping, etc.), summarizes the complaint, and proposes a concrete improvement plan.
3. **Formatting:** a Code node parses the AI's response to ensure it is in a clean JSON format.
4. **Storage:** the structured data is appended as a new row in a Google Sheet.
5. **Notification:** a Slack message is sent to your specified channel to confirm the batch analysis is complete.

## 🛠️ Requirements

- **n8n** (self-hosted or Cloud)
- **Apify account:** you need to rent the junglee/amazon-reviews-scraper actor
- **Google Cloud account:** for accessing the Gemini (PaLM) API and the Google Sheets API
- **Slack account:** for receiving notifications
## 🚀 How to set up

1. **Apify Config:** enter your Apify API token in the credentials. In the "Run an Actor" node, update the `startUrls` to the Amazon product page you want to analyze.
2. **Google Sheets:** create a new Google Sheet with the following header columns: `sentiment_score`, `category`, `summary`, `improvement`. Copy the Spreadsheet ID into the Google Sheets node.
3. **AI Prompt:** the "Message a model" node contains the prompt. It is currently set to output results in Japanese. If you need English output, simply translate the prompt text inside this node.
4. **Slack:** select the channel where you want to receive notifications in the Slack node.
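The "Formatting" step usually amounts to stripping the code fences LLMs tend to wrap around JSON and then parsing. A sketch of such a Code node; the fallback values on parse failure are an assumption, not necessarily what the original node does:

```javascript
// Extract clean JSON from a Gemini reply that may be wrapped in ```json fences
// or surrounded by extra prose. Fallback values are illustrative.
function parseModelJson(reply) {
  // Prefer fenced content if present, otherwise take the first {...} span.
  const fenced = reply.match(/```(?:json)?\s*([\s\S]*?)```/);
  const candidate = fenced
    ? fenced[1]
    : reply.slice(reply.indexOf('{'), reply.lastIndexOf('}') + 1);
  try {
    return JSON.parse(candidate);
  } catch {
    // Keep the row shape stable even when the model returns malformed output.
    return { sentiment_score: null, category: 'Other', summary: reply.trim(), improvement: '' };
  }
}
```

Returning a stable object shape on failure means the Google Sheets append never receives missing columns.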
by ueharayuuki
# 🤖 Automated Multi-lingual News Curator & Archiver

## Overview

This workflow automates news monitoring by fetching RSS feeds, rewriting content using AI, translating it (EN/ZH/KO), and archiving it.

## Who is this for?

Content curators, localization teams, and travel bloggers.

## How it works

1. **Fetch & Filter:** pulls the NHK RSS feed and filters for keywords (e.g., "Tokyo").
2. **AI Processing:** Google Gemini rewrites articles, extracts locations, and translates text.
3. **Archive & Notify:** saves structured data to Google Sheets and alerts Slack.

## Setup Requirements

- **Credentials:** Google Gemini, Google Sheets, Slack.
- **Google Sheet:** create headers: `title`, `summary`, `location`, `en`, `zh`, `ko`, `url`.
- **Slack:** configure Channel IDs.

## Customization

- **RSS Read:** change the feed URL.
- **If Node:** update the filter keywords.
- **AI Agent:** adjust system prompts for tone.

### 1. Fetch & Filter

Runs on a schedule to fetch the latest RSS items. Filters articles based on specific keywords (e.g., "Tokyo" or "Season") before processing.

### 2. AI Analysis & Parsing

Uses Google Gemini to rewrite the news, extract specific locations, and translate content. The Code node cleans the JSON output for the database.

### 3. Archive & Notify

Appends the structured data to Google Sheets and sends a formatted notification to Slack (or alerts if an article was skipped).

## Output Example (JSON)

The translation agent outputs data in this format:

```json
{
  "en": "Tokyo Tower is...",
  "zh": "东京塔是...",
  "ko": "도쿄 타워는..."
}
```
by Stefan Joulien
## Who this template is for

This workflow is for users who want to turn Telegram into a personal AI-powered assistant capable of handling everyday tasks through natural language. It's ideal for solo founders, operators, or professionals who want to manage communication, scheduling, calculations, and information retrieval from a single chat interface. No advanced n8n knowledge is required, and the workflow is designed to be easily extended with additional tools.

## What this workflow does

This workflow creates a Telegram-based AI assistant that can receive text or voice messages, understand user intent, and respond with text or audio. The assistant can reason about requests and use multiple tools such as contacts lookup, email drafting, calendar management, research, messaging, and calculations. Voice messages are automatically transcribed, processed like text input, and answered accordingly.

## How it works

1. The workflow listens for incoming Telegram messages and validates the sender.
2. It detects whether the message is text or voice — voice messages are transcribed using OpenAI before being passed to the AI agent.
3. The AI agent processes the request using a chat model, short-term memory, and a set of productivity tools (contacts, email, calendar, research, messaging, calculator).
4. The response is cleaned and formatted, then split into multiple chat bubbles with natural delays for a more human-like delivery.
5. Depending on the output type, the response is sent as plain text or converted into audio and returned to the user in Telegram.

## How to set up

1. Create a Telegram bot and connect it to the Telegram Trigger node.
2. Add your Telegram user ID to the authorization fields.
3. Connect your OpenAI credentials for chat, transcription, and text-to-speech.
4. Activate the workflow and start chatting with your assistant.

## Requirements

- Telegram account and bot token
- OpenAI API credentials
- n8n instance (cloud or self-hosted)
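The chat-bubble splitting step can be sketched as a small helper. Splitting at sentence boundaries and the 200-character threshold are assumptions about the original node, chosen for illustration:

```javascript
// Split a long reply into chat-bubble-sized chunks at sentence boundaries.
// The maxLen threshold is an illustrative choice, not a Telegram limit.
function splitIntoBubbles(text, maxLen = 200) {
  const sentences = text.match(/[^.!?]+[.!?]*\s*/g) || [text];
  const bubbles = [];
  let current = '';
  for (const sentence of sentences) {
    if (current && (current + sentence).length > maxLen) {
      bubbles.push(current.trim()); // flush the current bubble
      current = '';
    }
    current += sentence;
  }
  if (current.trim()) bubbles.push(current.trim());
  return bubbles;
}
```

Each bubble would then be sent as its own Telegram message, with a short Wait between sends to mimic typing.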
by Automate With Marc
## Podcast on Autopilot — Generate Podcast Ideas, Scripts & Audio Automatically with ElevenLabs, GPT-5 and Claude Sonnet 4.0

Bring your solo podcast to life — on full autopilot. This workflow uses GPT-5 and Claude Sonnet to turn a single topic input into a complete podcast episode intro and a ready-to-send audio file.

### How it works

1. **Chat trigger** – enter a seed idea or topic (e.g., "habits," "failure," "technology and purpose").
2. **Podcast Idea Agent (GPT-5)** instantly crafts a thought-provoking, Rogan- or Bartlett-style episode concept with a clear angle and takeaway.
3. **Podcast Script Agent (Claude 4.0 Sonnet)** expands that idea into a natural, engaging 60-second opening monologue ready for recording.
4. **Text-to-speech via ElevenLabs** automatically converts the script into a high-quality voice track.
5. **Email automation** sends the finished MP3 directly to your inbox.

### Perfect for

• Solo creators who want to ideate, script, and voice short podcasts effortlessly
• Content teams prototyping daily or weekly audio snippets
• Anyone testing AI-driven storytelling pipelines

### Customization tips

• Swap ElevenLabs for your preferred TTS service by editing the HTTP Request node.
• Adjust prompt styles for tone or audience in the Idea and Script Agents.
• Modify the Gmail (or other mail service) node to send audio to any destination (Drive, Slack, Notion, etc.).
• For reuse at scale, add variables for episode number, guest name, or theme category — just clone and update the trigger node.

### Watch the step-by-step tutorial (how to build it yourself)

https://www.youtube.com/watch?v=Dan3_W1JoqU

### Requirements & disclaimer

• Requires API keys for OpenAI + Anthropic + ElevenLabs (or your chosen TTS).
• You're responsible for managing costs incurred through AI or TTS usage.
• Avoid sharing sensitive or private data as input into prompt flows.
• Designed with modularity so you can turn off, swap, or deep-link any stage (idea → script → voice → email) without breaking the chain.