by Paul Abraham
This n8n template demonstrates how to turn a Telegram bot into a personal AI-powered assistant that understands both voice notes and text messages. The assistant can transcribe speech, interpret user intent with AI, and perform smart actions such as managing calendars, sending emails, or creating notes.

## Use cases

- Hands-free scheduling with Google Calendar
- Quickly capturing ideas as Notion notes via voice
- Sending Gmail messages directly from Telegram
- A personal productivity assistant available on the go

## Good to know

- Voice notes are automatically transcribed into text before being processed.
- This template uses Google Gemini for AI reasoning.
- The AI agent supports memory, enabling more natural and contextual conversations.

## How it works

1. **Telegram Trigger** – Starts when you send a text or voice note to your Telegram bot.
2. **Account Check** – Ensures only authorized users can interact with the bot.
3. **Audio Handling** – If it's a voice message, the workflow retrieves and transcribes the recording.
4. **AI Agent** – Transcribed voice or text is sent to the AI Agent powered by Google Gemini + Simple Memory.
5. **Smart Actions** – Based on the query, the AI can:
   - Read or create events in Google Calendar
   - Create notes in Notion
   - Send messages in Gmail
6. **Reply in Telegram** – The bot sends a response confirming the action or providing the requested information.

## How to use

1. Clone this workflow into your n8n instance.
2. Replace the Telegram Trigger with your bot credentials.
3. Connect Google Calendar, Notion, and Gmail accounts where required.
4. Start chatting with your Telegram bot to add events, notes, or send emails using just your voice or text.

## Requirements

- Telegram bot & API key
- Google Gemini account for AI
- Google Calendar, Notion, and Gmail integrations (optional, depending on use case)

## Customising this workflow

- Add more integrations (Slack, Trello, Airtable, etc.) for extended productivity.
- Modify the AI prompt in the agent node to fine-tune personality or task focus.
- Swap in another transcription service if preferred.
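The Account Check step above is usually just a comparison of the sender's Telegram ID against an allow-list. A minimal sketch of such a check in an n8n Code node (the ID value and the `isAuthorized` helper are illustrative, not taken from the template):

```javascript
// Hypothetical allow-list check over a Telegram trigger payload.
// Replace ALLOWED_IDS with your own Telegram user IDs.
const ALLOWED_IDS = [123456789];

function isAuthorized(update) {
  const fromId = update?.message?.from?.id;
  return ALLOWED_IDS.includes(fromId);
}

// Simplified Telegram update shape:
const update = { message: { from: { id: 123456789 }, text: "hello" } };
console.log(isAuthorized(update)); // prints: true
```

Unauthorized messages can then be routed to a branch that simply ends the workflow without replying.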
by Raphael De Carvalho Florencio
## What this workflow is (About)

This workflow turns a Telegram bot into an AI-powered lyrics assistant. Users send a command plus a lyrics URL, and the flow downloads, cleans, and analyzes the text, then replies on Telegram with translated lyrics, summaries, vocabulary, poetic devices, or an interpretation—all generated by AI (OpenAI).

## What problems it solves

- Centralizes lyrics retrieval + cleanup + AI analysis in one automated flow
- Produces study-ready outputs (translation, vocabulary, figures of speech)
- Saves time for teachers, learners, and music enthusiasts with instant results in chat

## Key features

- **AI analysis** using OpenAI (no secrets hardcoded; uses n8n Credentials)
- **Line-by-line translation**, concise summaries, vocabulary lists
- **Poetic/literary device detection** and **emotional/symbolic interpretation**
- Robust ETL (extract, download, sanitize) and error handling
- Clear Sticky Notes documenting routing, ETL, AI prompts, and messaging

## Who it's for

- Language learners & teachers
- Musicians, lyricists, and music bloggers
- Anyone studying lyrics for meaning, style, or vocabulary

## Input & output

- **Input:** Telegram command with a public **lyrics URL**
- **Output:** Telegram messages (Markdown/MarkdownV2), split into chunks if long

## How it works

1. **Telegram → Webhook** receives a user message (e.g., `/get_lyrics <URL>`).
2. **Routing (If/Switch)** detects which command was sent.
3. **Extract URL + Download (HTTP Request)** fetches the lyrics page.
4. **Cleanup (Code)** strips HTML/scripts/styles and normalizes whitespace.
5. **OpenAI (Chat)** formats the result per command (translation, summary, vocabulary, analysis).
6. **Telegram (Send Message)** returns the final text; long outputs are split into chunks.
7. **Error handling** replies with friendly guidance for unsupported/incomplete commands.

## Set up steps

1. Create a Telegram bot with @BotFather and copy the bot token.
2. In n8n, create Credentials → Telegram API and paste your token (no hardcoded keys in nodes).
3. Create Credentials → OpenAI and paste your API key.
4. Import the workflow and set a short webhook path (e.g., `/lyrics-bot`).
5. Publish the webhook and set it on Telegram:
   `https://api.telegram.org/bot<YOUR_BOT_TOKEN>/setWebhook?url=https://[YOUR_DOMAIN]/webhook/lyrics-bot`
6. (Optional) Restrict update types:

   ```
   curl -X POST https://api.telegram.org/bot<YOUR_BOT_TOKEN>/setWebhook \
     -H "Content-Type: application/json" \
     -d '{ "url": "https://[YOUR_DOMAIN]/webhook/lyrics-bot", "allowed_updates": ["message"] }'
   ```

7. Test by sending `/start` and then `/get_lyrics <PUBLIC_URL>` to your bot.
8. If messages are long, ensure MarkdownV2 is used and special characters are escaped.
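The escaping and chunking mentioned in the last step can live in a small Code node before the Telegram send. A rough sketch, assuming Telegram's 4096-character message limit and MarkdownV2's special-character list (`escapeMarkdownV2` and `chunkMessage` are hypothetical helpers, not nodes from this workflow):

```javascript
// Escape Telegram MarkdownV2 special characters, then split long
// replies into chunks that fit Telegram's 4096-character limit.
const MDV2_SPECIALS = /[_*\[\]()~`>#+\-=|{}.!]/g;

function escapeMarkdownV2(text) {
  return text.replace(MDV2_SPECIALS, (ch) => "\\" + ch);
}

function chunkMessage(text, limit = 4096) {
  const chunks = [];
  for (let i = 0; i < text.length; i += limit) {
    chunks.push(text.slice(i, i + limit));
  }
  return chunks;
}

console.log(escapeMarkdownV2("Lyrics (verse 1) - done!"));
// prints: Lyrics \(verse 1\) \- done\!
```

A real implementation would split on line boundaries rather than mid-word, but the fixed-size version shows the idea.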
by Toshiki Hirao
You can turn messy business card photos into organized contact data automatically. With this workflow, you can upload a business card photo to Slack and instantly capture the contact details into Google Sheets using OCR. No more manual typing—each new card is scanned, structured, saved, and confirmed back in Slack, making contact management fast and effortless.

## How it works

1. **Slack Trigger** – The workflow starts when a business card photo is uploaded to Slack.
2. **HTTP Request** – The uploaded image is fetched from Slack.
3. **AI/OCR Parsing** – The card image is analyzed by an AI model and structured into contact fields (name, company, email, phone, etc.).
4. **Transform Data** – The extracted data is cleaned and mapped into the correct format.
5. **Google Sheets** – A new row is appended to your designated Google Sheet, creating an organized contact database.
6. **Slack Notification** – Finally, a confirmation message is sent back to Slack to let you know the contact has been successfully saved.

## How to use

1. Copy the template into your n8n instance.
2. Connect your Slack account to capture uploaded images.
3. Set up your Google Sheets connection and choose the spreadsheet where contacts should be stored.
4. Adjust the Contact Information extraction node if you want to capture custom fields (e.g., job title, address).
5. Deploy and test: upload a business card image in Slack and confirm it's added to Google Sheets automatically.

## Requirements

- n8n running (cloud).
- A Slack account with access to the channel where photos will be uploaded.
- A Google Sheets account with a target sheet prepared for storing contacts.
- AI/OCR capability enabled in your n8n (e.g., OpenAI, Google Vision, or another OCR/LLM provider).
- Basic access rights in both Slack and Google Sheets to read and write data.
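The Transform Data step above is essentially field normalization. A sketch of what that mapping might look like in an n8n Code node (the field names are assumptions; adjust them to match whatever your OCR/AI node actually returns):

```javascript
// Hypothetical mapping from parsed OCR output to the contacts-sheet
// column layout: trim whitespace, lowercase emails, strip phone noise.
function toSheetRow(parsed) {
  return {
    name: (parsed.name || "").trim(),
    company: (parsed.company || "").trim(),
    email: (parsed.email || "").trim().toLowerCase(),
    phone: (parsed.phone || "").replace(/[^\d+]/g, ""), // keep digits and leading +
  };
}

console.log(toSheetRow({
  name: " Jane Doe ",
  company: "Acme",
  email: "Jane@Acme.COM",
  phone: "+1 (555) 010-1234",
}));
// prints: { name: 'Jane Doe', company: 'Acme', email: 'jane@acme.com', phone: '+15550101234' }
```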
by Vlad Arbatov
## Summary

Chat with your AI agent in Telegram. It remembers important facts about you in Airtable, can transcribe your voice messages, search the web, read and manage Google Calendar, fetch Gmail, and query Notion. Responses are grounded in your recent memories and tool outputs, then sent back to Telegram.

## What this workflow does

- Listens to your Telegram messages (text or voice)
- Maintains short-term chat memory per user and long-term memory in Airtable
- Decides when to save new facts about you (auto "Save Memory" without telling you)
- Uses tools on demand:
  - Web search via SerpAPI
  - Google Calendar: list/create/update/delete events
  - Gmail: list and read messages
  - Notion: fetch database info
- Transcribes Telegram voice notes with OpenAI and feeds them to the agent
- Combines live tool results + recent memories and replies in Telegram

## Apps and credentials

- Telegram Bot API: personal_bot
- xAI Grok: Grok-4 model for chat
- OpenAI: speech-to-text (transcribe audio)
- Airtable: store long-term memories
- Google Calendar: calendar actions
- Gmail: email actions
- Notion: knowledge and reading lists
- SerpAPI: web search

## Typical use cases

- Personal assistant that remembers preferences, decisions, and tasks
- Create/update meetings by chatting, and get upcoming events
- Ask "what did I say I'm reading?" or "what's our plan from last week?"
- Voice-first capture: send a voice note → get a transcribed, actionable reply
- Fetch recent emails or look up info on the web without leaving Telegram
- Query a Notion database (e.g., "show me the Neurocracy entries")

## How it works (node-by-node)

1. **Telegram Trigger** – Receives messages from your Telegram chat (text and optional voice).
2. **Text vs Message Router** – Routes based on message contents:
   - Text path → goes directly to the Agent (AI).
   - Voice path → downloads the file and transcribes before AI.
   - Always also fetches recent Airtable memories for context.
3. **Get a file (Telegram)** – Downloads the voice file (`voice.file_id`) when present.
4. **Transcribe a recording (OpenAI)** – Converts audio to text so the agent can use it like a normal message.
5. **Get memories (Airtable)** – Searches your "Agent Memories" base/table, filtered by user, sorted by Created.
6. **Aggregate** – Bundles recent memory records into a single array "Memories" with text + timestamp.
7. **Merge** – Combines current input (text or transcript) with the memory bundle before the agent.
8. **Simple Memory (agent memory window)** – Short-term session memory keyed by Telegram chat ID; keeps the most recent 30 turns.
9. **Tools wired into the agent:**
   - SerpAPI
   - Google Calendar tools: Get many events, Create an event, Update an event, Delete an event
   - Gmail tools: Get many messages, Get a message
   - Notion tool: Get a database
   - Airtable tool: Save Memory (stores distilled facts about the user)
10. **Agent** – The system prompt defines role, tone, and rules:
    - Be a friendly assistant.
    - On each message, decide if it contains user info worth saving. If yes, call "Save Memory" to persist a short summary in Airtable.
    - Don't announce memory saves—just continue helping.
    - Use tools when needed (web, calendar, Gmail, Notion).
    - Think with the provided memory context block.

    Uses the xAI Grok Chat Model for reasoning and tool-calling; can call Save Memory, Calendar, Gmail, Notion, and SerpAPI tools as needed.
11. **Save Memory (Airtable)** – Persists the Memory and User fields to the "Agent Memories" base; the timestamp is set automatically by Airtable.
12. **Send a text message (Telegram)** – Sends the agent's final answer back to the same Telegram chat ID.
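The Aggregate and Merge steps boil down to packing recent Airtable records into one context block for the agent prompt. A minimal sketch, assuming the Memory and Created field names described above (`buildMemoryBlock` is a hypothetical helper, not a node name):

```javascript
// Bundle Airtable memory records into a single "Memories" context string.
// Record shape follows Airtable's API: each record has a `fields` object.
function buildMemoryBlock(records) {
  return records
    .map((r) => ({ text: r.fields.Memory, created: r.fields.Created }))
    .map((m) => `- [${m.created}] ${m.text}`)
    .join("\n");
}

const block = buildMemoryBlock([
  { fields: { Memory: "Prefers morning meetings", Created: "2024-05-01" } },
  { fields: { Memory: "Reading Neurocracy", Created: "2024-05-03" } },
]);
console.log(block);
// prints:
// - [2024-05-01] Prefers morning meetings
// - [2024-05-03] Reading Neurocracy
```

The resulting block is what the agent "thinks with" alongside the current message.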
## Node map

| Node | Type | Purpose |
|---|---|---|
| Telegram Trigger | Trigger | Receive text/voice from Telegram |
| Text vs voice router | Flow control | Route text vs voice; also trigger memories fetch |
| Get a file | Telegram | Download voice audio |
| Transcribe a recording | OpenAI | Speech-to-text for voice notes |
| Get memories | Airtable | Load recent user memories |
| Aggregate | Aggregate | Pack memory records into "Memories" array |
| Merge | Merge | Combine input and memories before agent call |
| Simple Memory | Agent memory | Short-term chat memory per chat ID |
| xAI Grok Chat Model | LLM | Core reasoning model for the Agent |
| Search Web with SerpAPI | Tool | Web search |
| Google Calendar tools | Tool | List/create/update/delete events |
| Gmail tools | Tool | Search and read email |
| Notion tool | Tool | Query a Notion database |
| Save Memory | Airtable Tool | Persist distilled user facts |
| AI Agent | Agent | Orchestrates tools + memory, produces the answer |
| Send a text message | Telegram | Reply to the user in Telegram |

## Before you start

- Create a Telegram bot and get your token (via @BotFather).
- Put your Telegram user ID into the Telegram Trigger node (chatIds).
- Connect credentials:
  - xAI Grok (model: grok-4-0709)
  - OpenAI (for audio transcription)
  - Airtable (Agent Memories base and table)
  - Google Calendar OAuth
  - Gmail OAuth
  - Notion API
  - SerpAPI key
- Adjust the Airtable "User" value and the filterByFormula to match your name or account.

## Setup instructions

### 1) Telegram

- Telegram Trigger:
  - `additionalFields.chatIds` = your_telegram_id
  - `download` = true to allow voice handling
- Send a text message:
  - `chatId` = `{{ $('Telegram Trigger').item.json.message.chat.id }}`

### 2) Memory

- The Airtable base/table must exist with fields: Memory, User, Created (Created is auto-managed).
- In the Save Memory and Get memories nodes, align Base, Table, and filterByFormula with your setup.
- Simple Memory:
  - `sessionKey` = `{{ $('If').item.json.message.chat.id }}`
  - `contextWindowLength` = 30 (adjust as needed)

### 3) Tools

- Google Calendar: choose your calendar, test get/create/update/delete.
- Gmail: set "returnAll/simplify/messageId" via `$fromAI` or static defaults.
- Notion: set your databaseId.
- SerpAPI: ensure the key is valid.

### 4) Agent (AI node)

- SystemMessage: customize role, name, and any constraints.
- Text input concatenates the transcript or text into one prompt: `{{ $json.text }}{{ $json.message.text }}`

## How to use

- Send a text or voice message to your bot in Telegram.
- The agent replies in the same chat, optionally performing tool actions.
- New personal facts you mention are silently summarized and stored in Airtable for future context.

## Customization ideas

- Replace Grok with another LLM if desired.
- Add more tools: Google Drive, Slack, Jira, GitHub, etc.
- Expand the memory schema (e.g., tags, categories, confidence).
- Add guardrails: profanity filters, domain limits, or cost control.
- Multi-user support: store a chat-to-user mapping and separate memories by user.
- Add summaries: a daily recap message created from new memories.

## Limits and notes

- Tool latency: calls to Calendar, Gmail, Notion, and SerpAPI add response time.
- Audio size/format: OpenAI transcription works best with common formats and short clips.
- Memory growth: periodically archive old Airtable entries, or change the Aggregate window.
- Timezone awareness: Calendar operations depend on your Google Calendar settings.

## Privacy and safety

- Sensitive info may be saved to Airtable; restrict access to the base.
- Tool actions operate under your connected accounts; review scopes and permissions.
- The agent may call external APIs (SerpAPI, OpenAI); ensure this aligns with your policies.
## Example interactions

- "Schedule a 30-min catch-up with Alex next Tuesday afternoon."
- "What meetings do I have in the next 4 weeks?"
- "Summarize my latest emails from Product Updates."
- "What did I say I'm reading?" (agent recalls from memories)
- Voice note: "Remind me to call the dentist this Friday morning." → the agent transcribes it and creates an event.

## Tags

telegram, agent, memory, grok, openai, airtable, google-calendar, gmail, notion, serpapi, voice, automation

## Changelog

- v1: First release with Telegram agent, short/long-term memory, voice transcription, and tool integrations for web, calendar, email, and Notion.
by Laiba
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## How it works

1. **User Uploads PDF** – The workflow accepts a PDF via webhook.
2. **Extract Text** – n8n extracts the text content from the PDF.
3. **Summarize with AI** – The extracted text is passed to an AI model (Groq, with an OpenAI-compatible model) for summarization.
4. **Generate Audio** – The summary text is sent to a TTS (Text-to-Speech) API (Qwen-TTS-Demo); you can use other free alternatives.
5. **Serve Result** – The workflow outputs both the summary and an audio file URL (WAV link) that you can attach to your audio player. This allows users to read or listen to the summary instantly.

## How to use / Requirements

1. **Import Workflow** – Copy/paste the workflow JSON into your n8n instance.
2. **Set Up Input Trigger** – If you want users to upload directly, you can use a webhook or any other trigger.
3. **Configure AI Node** – Add your own API key for Groq / OpenAI.
4. **Configure TTS Node** – Add credentials for your chosen TTS service.
5. **Run Workflow** – Upload a PDF and get back the summary and audio file URL.

n8n Smart PDF Summarizer & Voice Generator — please reach out to me at Laiba Zubair if you need further assistance with your n8n workflows and automations!
by Fayzul Noor
This workflow is built for e-commerce businesses, retail store owners, and entrepreneurs who want to provide intelligent customer support and seamless order taking through Telegram. If you're tired of manually answering product questions, taking orders through scattered messages, and managing customer information across multiple platforms, this automation will revolutionize your sales process. It transforms your Telegram bot into a smart AI shopping assistant that understands both text and voice messages, answers product questions with accurate information, and automatically records customer orders in Google Sheets—all while maintaining conversation context.

## How it works / What it does

This n8n automation creates a complete conversational commerce experience on Telegram using AI, voice transcription, and intelligent order management. Here's a simple breakdown of how it works:

1. **Capture incoming messages** using the Telegram Trigger node, which monitors your Telegram bot for both text and voice messages from customers in real time.
2. **Route messages intelligently** through the Switch node, which automatically detects whether the customer sent a text message or a voice note and routes it to the appropriate processing pipeline.
3. **Process voice messages** by downloading the voice file through the Telegram node and transcribing it to text using OpenAI's Whisper audio transcription technology.
4. **Extract message content** with the Set node, which captures the customer's text and chat ID for processing and conversation tracking.
5. **Generate intelligent responses** using an AI Agent powered by GPT-4.1-nano that acts as a friendly, professional support agent for your men's clothing store. The agent responds in the same language the customer uses and maintains conversation context through memory.
6. **Search your product catalog** with a Pinecone-powered RAG (Retrieval-Augmented Generation) system. The AI Agent queries your vector database using OpenAI embeddings to provide accurate prices, brand information, product details, and key features.
7. **Manage conversation memory** through the Simple Memory node with an 8-message context window, allowing the AI to remember customer preferences and maintain natural, flowing conversations across multiple messages.
8. **Take and record orders automatically** when customers provide all required information (Name, Phone Number, Address, and Product Category). The AI Agent uses the Google Sheets tool to append complete orders to your spreadsheet.
9. **Send responses back to customers** through the Telegram Response node, delivering helpful answers and order confirmations directly in the chat.

Once everything is set up, your Telegram bot operates 24/7 as an intelligent sales assistant that never misses a message or forgets an order.

## How to set up

Follow these steps to get your AI-powered Telegram shopping assistant running:

1. Import the JSON file into your n8n instance.
2. Create and configure your Telegram bot:
   - Talk to @BotFather on Telegram to create a new bot
   - Save the bot token you receive
   - Add the token to your Telegram API credentials in n8n
3. Add your API credentials:
   - Telegram API credentials for the bot
   - OpenAI API key for the AI Agent, embeddings, and voice transcription
   - Pinecone API credentials for vector storage
   - Google Sheets OAuth2 credentials for order recording
4. Set up your Pinecone vector database:
   - Create a Pinecone index
   - Create a namespace
   - Upload your store data to the vector store
5. Configure your Google Sheet for orders:
   - Create a new Google Sheet or use an existing one
   - Set up columns: Name, Phone number, Address, Category
   - Update the Google Sheets node with your sheet's document ID
6. Customize the AI Agent's system message to match your brand voice, product categories, and support policies.
7. Test the workflow by sending both text and voice messages to your Telegram bot.
8. Activate the workflow to enable continuous operation.
9. Share your Telegram bot username with customers to start receiving inquiries.

## Requirements

Before running the workflow, make sure you have the following:

- An n8n account or instance (self-hosted or n8n Cloud)
- A Telegram bot created through @BotFather with an API token
- OpenAI API access for the AI Agent, embeddings, and Whisper transcription (GPT-4.1-nano model)
- A Pinecone account with a configured vector database containing your product information
- A Google Sheets account for storing customer orders
- Your store knowledge base prepared and uploaded to Pinecone
- A basic understanding of how n8n workflows and nodes operate

## How to customize the workflow

This workflow is highly flexible and can be customized to fit your specific business needs. Here's how you can tailor it:

- **Change the store knowledge base** by updating all references to your store in the AI Agent system message and node descriptions. Adapt it for whatever service your store provides.
- **Adjust order requirements** by modifying the AI Agent's system message to collect different customer information (e.g., email, delivery date, payment method) and updating the Google Sheets column mappings accordingly.
- **Modify conversation memory length** in the Simple Memory node. The default is 8 messages, but you can increase or decrease this based on your typical customer conversation length.
- **Add multilingual support** by enhancing the AI Agent's system message with specific language instructions or integrating translation nodes for automated language detection and response.
- **Integrate payment processing** by adding nodes that generate payment links (Stripe, PayPal) when customers complete their orders, creating a full end-to-end shopping experience.
- **Connect to your inventory system** by adding HTTP Request nodes or database connections that check product availability before confirming orders.
- **Implement order notifications** by adding email or SMS nodes to notify your team immediately when a new order is recorded in Google Sheets.
- **Add image recognition** by incorporating computer vision nodes that allow customers to send product photos and receive information about similar items in your catalog.
- **Create automated follow-ups** by adding scheduled triggers that check Google Sheets for new orders and send confirmation or shipping update messages to customers.
- **Enhance the RAG system** by creating multiple Pinecone namespaces for different product categories, seasonal collections, or promotional items, allowing the AI to provide more targeted responses.
- **Add customer segmentation** by creating additional Google Sheets or database nodes that track customer purchase history, preferences, and interaction patterns for personalized marketing.
- **Implement conversation analytics** by adding nodes that log conversation topics, common questions, and conversion rates to help you optimize your product descriptions and support responses.
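The Switch-node routing in step 2 can be pictured as a small classifier over the Telegram update payload. A sketch, assuming the standard Telegram Bot API update shape (`routeMessage` is an illustrative helper, not part of the workflow itself):

```javascript
// Classify an incoming Telegram update so text goes straight to the
// agent and voice notes go through download + Whisper transcription.
function routeMessage(update) {
  const msg = update.message || {};
  if (msg.voice && msg.voice.file_id) return "voice";
  if (typeof msg.text === "string" && msg.text.length > 0) return "text";
  return "unsupported";
}

console.log(routeMessage({ message: { text: "Do you have blue shirts?" } })); // prints: text
console.log(routeMessage({ message: { voice: { file_id: "abc123" } } }));     // prints: voice
```

An "unsupported" branch (stickers, photos, etc.) is a useful place for a polite fallback reply.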
by ueharayuuki
## 🤖 Automated Multi-lingual News Curator & Archiver

### Overview

This workflow automates news monitoring by fetching RSS feeds, rewriting content using AI, translating it (EN/ZH/KO), and archiving it.

### Who is this for?

Content curators, localization teams, and travel bloggers.

### How it works

1. **Fetch & Filter:** Pulls the NHK RSS feed and filters for keywords (e.g., "Tokyo").
2. **AI Processing:** Google Gemini rewrites articles, extracts locations, and translates text.
3. **Archive & Notify:** Saves structured data to Google Sheets and alerts Slack.

### Setup Requirements

- **Credentials:** Google Gemini, Google Sheets, Slack.
- **Google Sheet:** Create headers: title, summary, location, en, zh, ko, url.
- **Slack:** Configure channel IDs.

### Customization

- **RSS Read:** Change the feed URL.
- **If Node:** Update filter keywords.
- **AI Agent:** Adjust system prompts for tone.

### 1. Fetch & Filter

Runs on a schedule to fetch the latest RSS items. Filters articles based on specific keywords (e.g., "Tokyo" or "Season") before processing.

### 2. AI Analysis & Parsing

Uses Google Gemini to rewrite the news, extract specific locations, and translate content. The Code node cleans the JSON output for the database.

### 3. Archive & Notify

Appends the structured data to Google Sheets and sends a formatted notification to Slack (or alerts if an article was skipped).

### Output Example (JSON)

The translation agent outputs data in this format:

```json
{
  "en": "Tokyo Tower is...",
  "zh": "东京塔是...",
  "ko": "도쿄 타워는..."
}
```
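The Code node that "cleans the JSON output" typically just strips the Markdown code fence that models often wrap JSON in before parsing. A minimal sketch (`parseModelJson` is an illustrative helper; the real node may do more validation):

```javascript
// Remove a surrounding Markdown code fence (``` or ```json) that an LLM
// may have added around its JSON answer, then parse it.
function parseModelJson(raw) {
  const stripped = raw.replace(/`{3}(?:json)?/g, "").trim();
  return JSON.parse(stripped);
}

// Build a fenced sample the way a model might return it:
const fence = "`".repeat(3);
const sample = fence + 'json\n{"en": "Tokyo Tower is...", "location": "Tokyo"}\n' + fence;
const out = parseModelJson(sample);
console.log(out.en); // prints: Tokyo Tower is...
```

If parsing fails, the workflow's "skipped article" Slack alert is the natural fallback branch.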
by 小林幸一
## 📝 Analyze Amazon product reviews with Gemini and save to Google Sheets

### 📄 Description

This workflow automates the process of analyzing customer feedback on Amazon products. Instead of manually reading through hundreds of reviews, this template scrapes reviews (specifically targeting negative feedback), uses Google Gemini (AI) to analyze the root causes of dissatisfaction, and generates specific improvement suggestions. The results are automatically logged to a Google Sheet for easy tracking, and a Slack notification is sent to keep your team updated. This tool is essential for understanding "Voice of Customer" data efficiently without manual data entry.

### 🧍 Who is this for

- **Product Managers** looking for product improvement ideas.
- **E-commerce Sellers (Amazon FBA, D2C)** monitoring brand reputation.
- **Market Researchers** analyzing competitor weaknesses.
- **Customer Support Teams** identifying recurring issues.

### ⚙️ How it works

1. **Data Collection:** The workflow triggers the Apify actor (junglee/amazon-reviews-scraper) to fetch reviews from a specified Amazon product URL. It is currently configured to filter for 1- and 2-star reviews to focus on complaints.
2. **AI Analysis:** It loops through each review and sends the content to Google Gemini. The AI determines a sentiment score (1-5), categorizes the issue (Quality, Design, Shipping, etc.), summarizes the complaint, and proposes a concrete improvement plan.
3. **Formatting:** A Code node parses the AI's response to ensure it is in a clean JSON format.
4. **Storage:** The structured data is appended as a new row in a Google Sheet.
5. **Notification:** A Slack message is sent to your specified channel to confirm the batch analysis is complete.

### 🛠️ Requirements

- **n8n** (self-hosted or Cloud)
- **Apify account:** You need to rent the junglee/amazon-reviews-scraper actor.
- **Google Cloud account:** For accessing the Gemini (PaLM) API and Google Sheets API.
- **Slack account:** For receiving notifications.
### 🚀 How to set up

1. **Apify config:** Enter your Apify API token in the credentials. In the "Run an Actor" node, update the startUrls to the Amazon product page you want to analyze.
2. **Google Sheets:** Create a new Google Sheet with the following header columns: sentiment_score, category, summary, improvement. Copy the spreadsheet ID into the Google Sheets node.
3. **AI prompt:** The "Message a model" node contains the prompt. It is currently set to output results in Japanese. If you need English output, simply translate the prompt text inside this node.
4. **Slack:** Select the channel where you want to receive notifications in the Slack node.
by Oneclick AI Squad
Automatically creates complete videos from a text prompt—script, voiceover, stock footage, and subtitles all assembled and ready.

## How it works

Send a video topic via webhook (e.g., "Create a 60-second video about morning exercise"). The workflow uses OpenAI to generate a structured script with scenes, converts text to natural-sounding speech, searches Pexels for matching B-roll footage, and downloads everything. Finally, it merges audio with video, generates SRT subtitles, and prepares all components for final assembly.

The workflow handles parallel processing—while generating the voiceover, it simultaneously searches and downloads stock footage to save time.

## Setup steps

1. Add OpenAI credentials for script generation and text-to-speech.
2. Get a free Pexels API key from pexels.com/api for stock footage access.
3. Connect Google Drive for storing the final video output.
4. Install FFmpeg (optional) for automated video assembly, or manually combine the components.
5. Test the webhook by sending a POST request with your video topic.

## Input format

```json
{
  "prompt": "Your video topic here",
  "duration": 60,
  "style": "motivational"
}
```

## What you get

- ✅ AI-generated script broken into scenes
- ✅ Professional voiceover audio (MP3)
- ✅ Downloaded stock footage clips (MP4)
- ✅ Timed subtitles file (SRT)
- ✅ All components ready for final editing

Note: The final video assembly requires FFmpeg or a video editor. All components are prepared and organized by scene number for easy manual editing if needed.
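The SRT generation step can be sketched as a pure function from scenes to subtitle text. This assumes a scene shape with `text` and `duration` (seconds) fields, which is an illustration rather than the workflow's actual data model:

```javascript
// Build SubRip (SRT) subtitle text from sequential scenes. Each cue runs
// from the end of the previous scene for `duration` seconds.
function toTimestamp(totalSeconds) {
  const h = String(Math.floor(totalSeconds / 3600)).padStart(2, "0");
  const m = String(Math.floor((totalSeconds % 3600) / 60)).padStart(2, "0");
  const s = String(Math.floor(totalSeconds % 60)).padStart(2, "0");
  return `${h}:${m}:${s},000`; // SRT uses comma before milliseconds
}

function buildSrt(scenes) {
  let t = 0;
  return scenes
    .map((scene, i) => {
      const start = toTimestamp(t);
      t += scene.duration;
      const end = toTimestamp(t);
      return `${i + 1}\n${start} --> ${end}\n${scene.text}`;
    })
    .join("\n\n");
}

const srt = buildSrt([
  { text: "Wake up early.", duration: 5 },
  { text: "Stretch for two minutes.", duration: 7 },
]);
console.log(srt);
```

Real timing would come from the voiceover's actual audio lengths; the per-scene durations here stand in for that.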
by osama goda
## How it works

This workflow automates a full RAG ingestion pipeline. When a new OCR JSON file is added to a Google Drive folder, the workflow extracts lesson metadata, parses and cleans the Arabic text, generates semantic chunks, creates AI embeddings, and stores them in a Pinecone vector index. After processing, the file is automatically moved to an archive folder to prevent duplicates.

## Set up steps

1. Follow the sticky notes inside the workflow for detailed instructions.
2. Connect your Google Drive credentials.
3. Replace the input folder ID and archive folder ID with your own.
4. Connect your OpenAI account for embeddings.
5. Connect your Pinecone API key and select your index.

The workflow is ready to run once credentials and folder paths are configured.
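The "generates semantic chunks" step is conceptually a sliding window over the cleaned text. A minimal fixed-size sketch (real semantic chunking would split on sentence or section boundaries; the size and overlap values here are illustrative, not the workflow's settings):

```javascript
// Split text into overlapping chunks before embedding. Overlap keeps
// context that would otherwise be cut at a chunk boundary.
// Assumes size > overlap; adjust both to your embedding model's limits.
function chunkText(text, size = 500, overlap = 50) {
  const chunks = [];
  for (let start = 0; start < text.length; start += size - overlap) {
    chunks.push(text.slice(start, start + size));
    if (start + size >= text.length) break;
  }
  return chunks;
}

const chunks = chunkText("x".repeat(1200), 500, 50);
console.log(chunks.length); // prints: 3
```

Each chunk would then be embedded and upserted into the Pinecone index with its lesson metadata attached.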
by Isight
## Who's it for

This workflow is designed for marketers, content creators, agency owners, and solopreneurs who want to automate LinkedIn content creation using AI. It helps turn Google Sheets entries into complete LinkedIn posts, including text, image prompts, and AI-generated images.

## What it does / How it works

The workflow monitors a Google Sheet for new campaign entries. When a row is added, it automatically collects details about the campaign, searches LinkedIn via Tavily to identify relevant trends, and turns the information into an AI-generated LinkedIn post using a local Ollama model or an LLM of your choice. A second approval step lets you refine the text directly inside the sheet. Once approved, the workflow generates an image prompt, creates a ready-to-post visual with OpenAI Images, and finally publishes the post to LinkedIn.

## How to set up

- Add your own Google Sheets Trigger credentials.
- Add Tavily, Ollama/OpenAI, and LinkedIn OAuth credentials.
- Replace the sample Sheet URL with your own.
- Set your LinkedIn account/person ID in the posting node.

## Requirements

- Google Sheets account
- LinkedIn OAuth app
- Tavily API key
- Ollama (local) or OpenAI image generation

## How to customize

You can change:

- AI model
- Image generation provider
- Search query logic
- Content tone
- Approval step (manual or automatic)
by Pavlo Hurhu
This n8n workflow promotes your brand/company/platform by mentioning it in Twitter comments. The responses look human-like, and the workflow is robust and designed to avoid bans.

## Good to know

- The workflow is configured to maximize efficiency while minimizing costs and ensuring your Twitter account won't get banned or shadow-banned.
- Generating more than 17 comments per day would require a paid Twitter subscription plan.

## How it works

1. The user sets a keyword that is used to find relevant posts.
2. An AI Agent analyzes each post and writes a response promoting the user's brand.
3. After each response is submitted, the result is logged in a report table for tracking and convenience.

## Set up steps

- Set your target keyword and start the workflow.
- Detailed instructions and tutorials can be found in the workflow's sticky notes.

## Requirements

- Twitter and Google accounts.
- twitterapi.io subscription (used to overcome official Twitter API limitations).
- Anthropic subscription (GPT models are also supported, but I personally recommend using Anthropic Claude Sonnet 4 for text generation).