by Agent Studio
Overview

This n8n workflow processes user feedback automatically, tags it with sentiment, and links it to relevant insights in Notion. It uses GPT-4 to analyze each feedback entry, determine whether it corresponds to an existing insight or a new one, and update the Notion databases accordingly. It helps teams centralize and structure qualitative user feedback at scale.

Who It's For
- Product teams looking to organize and prioritize user feedback.
- Founders or solo builders seeking actionable insights from qualitative data.
- Anyone managing a Notion workspace where feedback is collected and needs to be tagged or linked to features and improvements.

Prerequisites
- A Notion account with:
  - A Feedback database (must include fields for feedback content and status).
  - An Insights database with multi-select fields for Solution, User Persona, and a relation to Feedback.
  - The Notion template (linked below) helps you get started quickly — just remove the mock data.
- A configured Notion API integration in n8n. 👉 Don't forget to connect the n8n integration to the correct Notion page.
- An OpenAI API key.

Notion Template

This workflow is designed to work seamlessly with a pre-configured Notion template that includes the required feedback and insights structure.
👉 User Feedback Analysis – Notion Template

How It Works
1. The workflow is triggered when a feedback item is updated in Notion (e.g. new feedback is submitted).
2. Sentiment analysis (Positive, Neutral, or Negative) is run using OpenAI and stored in a select field in Notion.
3. The AI agent analyzes the feedback to:
   - identify whether it matches an existing insight, or
   - create a new insight in Notion with a concise name, solution, and user persona.
4. The feedback is then linked to the appropriate insight and marked as "Processed."

How to Use It
- Connect your Notion databases in all Notion nodes (including those used by the AI agent) for both Feedback and Insights — follow the node names provided.
- Ensure your OpenAI and Notion credentials are correctly set.
- Set up your product context: define a "Product Overview" and list your "Core Features". This helps the AI agent categorize insights more accurately. (The Basecamp product is used as an example in the template.)
- (Optional) Modify the prompt to better fit your specific product context.
- Once feedback is added or updated in Notion, the workflow triggers automatically.

Notes
- Only feedback with the status Received is processed.
- New insights are only created if no relevant match is found.
- Feedback is linked to insights via Notion's relation property.
- A fallback parser is included to fix potential formatting issues in the AI output.
- You can swap the default n8n memory for a more robust backend like Supabase.

🙏 Please share your feedback with us. It helps us tremendously!
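The fallback parser mentioned in the notes can be approximated like this (a minimal sketch, not the template's exact node logic): strip markdown code fences from the model's reply, then fall back to extracting the first JSON object embedded in the text.

```python
import json
import re

def parse_agent_output(raw: str) -> dict:
    """Parse the AI agent's reply, tolerating code fences and extra prose.

    Illustrative fallback parser; the template's node may differ.
    """
    # Remove markdown code fences such as ```json ... ```
    cleaned = re.sub(r"```(?:json)?", "", raw).strip()
    try:
        return json.loads(cleaned)
    except json.JSONDecodeError:
        # Fall back to the first {...} block found in the text
        match = re.search(r"\{.*\}", cleaned, re.DOTALL)
        if match:
            return json.loads(match.group(0))
        raise
```

A parser like this keeps the workflow running even when the model wraps its JSON in prose or fences.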
by IvanCore
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Telegram Voice AI Assistant

This n8n template creates a multimodal Telegram bot that dynamically responds to users:
- **Replies with voice** when receiving voice messages (using ElevenLabs TTS)
- **Replies with text** for text-based queries
- Supports custom AI tools (e.g., crypto APIs, databases, or custom functions)

Built with LangChain Agents, it can integrate any external API or data source into conversations.

Key Features

🎙️ Smart Response Logic
- **Voice query → voice reply**: transcribes audio via ElevenLabs STT, processes it with AI (Groq/Gemini), and converts the text response to natural speech (ElevenLabs TTS).
- **Text query → text reply**: bypasses TTS/STT for faster responses.

🛠️ Extensible AI Tools
Add your own tools:
- Database lookups
- Weather/stock APIs
- Custom Python functions
- RAG (document retrieval)

Supports multi-step tool chaining (e.g., "Get BTC price → analyze trends → summarize").

🌐 Language & Context
- Auto-detects the user's language (via Telegram's language_code)
- Maintains session memory (remembers conversation history)

Use Cases
- **Voice-first customer support**
- **Crypto/analytics assistants** (e.g., "What's Ethereum's current gas fee?")
- **Multilingual FAQ bots**
- **Educational tutors** (voice-interactive learning)

Requirements
- **Telegram Bot Token**
- **ElevenLabs API Key** (for TTS/STT)
- **Groq API Key** or **Google Gemini API Key**

Customization Tips
- **Change AI personality**: modify the *systemMessage* in the Voice Assistant node
- **Add more models**: swap Groq/Gemini for OpenAI, Anthropic, etc.
- **Extend functionality**: add RAG (Retrieval-Augmented Generation) for document queries

Take this template to create a Siri-like AI assistant for Telegram in minutes! 🚀
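The voice-vs-text routing comes down to checking which field the incoming Telegram update carries. A simplified sketch (the template implements this with a Switch node rather than code):

```python
def route_message(update: dict) -> str:
    """Decide the reply modality for an incoming Telegram update.

    Simplified illustration of the template's Switch-node logic:
    voice messages get a voice reply, everything else gets text.
    """
    message = update.get("message", {})
    if "voice" in message:
        # Voice note: transcribe (STT) -> AI -> synthesize (TTS)
        return "voice"
    # Plain text: skip STT/TTS for a faster turnaround
    return "text"
```

Telegram marks voice notes with a `voice` field on the message object, which is what makes this single-field check sufficient.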
by AiAgent
What It Does

This intelligent workflow simplifies the complex task of determining whether a website is legitimate or potentially a scam. By simply submitting a URL through a form, the system initiates a multi-agent evaluation process. Four dedicated AI agents—each powered by GPT-4o and connected to SerpAPI—analyze different dimensions of the website: domain and technical details, search engine signals, product and pricing patterns, and on-site content analysis. Their findings are then passed to a fifth AI agent, the Analyzer, powered by GPT-4o mini, which consolidates the data, scores the site on a scale of 1–10 for scam likelihood, and presents the findings in a clear, structured format for the user.

Who It's For

This workflow is ideal for anyone who needs to quickly and reliably assess the trustworthiness of a website. Whether you're a consumer double-checking a store before making a purchase, a small business owner validating supplier sites, a cybersecurity analyst conducting threat assessments, or a developer building fraud detection into your platform — this tool offers fast, AI-powered insights without the need for manual research or technical expertise. It's designed for both individuals and teams who value accurate, scalable scam detection.

How It Works

The process begins with a simple form submission where the user enters the URL of the website they want to investigate. Once submitted, the workflow activates four specialized AI agents—each powered by GPT-4o and connected to SerpAPI—to independently analyze the site from different angles:
- Agent 1 examines domain age, SSL certificates, and TLD trustworthiness.
- Agent 2 reviews search engine results, forum mentions, and public scam reports.
- Agent 3 analyzes product pricing patterns and brand authenticity.
- Agent 4 assesses on-site content quality, grammar, legitimacy of claims, and presence of business info.

Each agent returns its findings, which are then aggregated and passed to a fifth AI agent—the Analyzer.
This final agent, powered by GPT-4o mini, evaluates all the input, assigns a scam likelihood score from 1 to 10, and compiles a neatly formatted summary with organized insights and a disclaimer for context.

Set Up
1. Obtain an OpenAI API key from platform.openai.com/api-keys.
2. Connect this OpenAI API key to the OpenAI Chat Model for all of the tools agents (Analyzer, Domain & Technical Details, Search Engine Signals, Product & Pricing Patterns, and Content Analysis Tools Agents).
3. Fund your OpenAI account. GPT-4o costs ~$0.01 to run the workflow.
4. Create a SerpAPI account at https://serpapi.com/users/sign_up
5. Obtain a SerpAPI key and use it to connect the SerpAPI tool for each of the tools agents (Domain & Technical Details, Search Engine Signals, Product & Pricing Patterns, and Content Analysis Tools Agents).

Tip: SerpAPI allows 100 free searches each month, and this workflow uses ~5–15 SerpAPI searches per run. If you want to run the workflow more often than that, create multiple SerpAPI accounts with an API key for each. When one account's 100 free searches are used up, switch to another account's API key within the workflow.

Disclaimer

This tool is designed to assist in evaluating the potential risk of websites using AI-generated insights. The scam likelihood score and analysis provided are based on publicly available information and should not be considered a definitive or authoritative assessment. This tool does not guarantee the accuracy, safety, or legitimacy of any website. Users should perform their own due diligence and use independent judgment before engaging with any site.
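Consolidating the four agents' findings into one 1–10 score could be sketched as a weighted average (purely illustrative: in the template this judgment is made by the GPT-4o mini Analyzer agent, not by a fixed formula):

```python
def scam_score(agent_risks: dict) -> int:
    """Aggregate per-agent risk ratings (0.0-1.0) into a 1-10 score.

    Illustrative only: the template delegates this consolidation to
    the GPT-4o mini Analyzer agent rather than a fixed formula.
    """
    # Weight the four analysis dimensions equally
    weights = {
        "domain_technical": 0.25,
        "search_signals": 0.25,
        "product_pricing": 0.25,
        "content_analysis": 0.25,
    }
    weighted = sum(weights[k] * agent_risks[k] for k in weights)
    # Map 0.0-1.0 onto the 1-10 scale used in the summary
    return max(1, min(10, round(1 + weighted * 9)))
```

An LLM-based analyzer can weigh conflicting signals contextually, which is why the template uses an agent instead of a formula like this.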
n8n, OpenAI, their affiliates, and the creators of this workflow are not responsible for any loss, damages, or consequences arising from the use of this tool or from actions taken based on its results.
by n8n Team
This workflow offers an effective way to handle a chatbot's functionality, making use of multiple tools for information retrieval, conversation context storage, and message sending. It's a setup tailored for a Slack environment, aiming to offer an interactive, AI-driven chatbot experience. Note that to use this template, you need to be on n8n version 1.19.4 or later.
by kenandrewmiranda
An automated n8n workflow that analyzes stocks using RSI and MACD, summarizes insights with OpenAI, and sends a Slack-ready market update every hour.

This workflow:
- Runs hourly from 6:30 AM to 2:30 PM PT, Mon–Fri
- Checks if the U.S. stock market is open using Alpaca's /clock API
- Pulls daily stock bars for a list of tickers via Alpaca's /v2/stocks/bars
- Calculates RSI and MACD using a Python code node
- Categorizes each stock as Buy / Hold / Sell
- Uses an OpenAI Assistant to summarize the results in Slack markdown
- Sends the message to a specific Slack user or channel
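The RSI/MACD code node could be sketched in pure Python like this (an illustrative version assuming simple-average RSI and standard 12/26/9 MACD; the template's actual node and its Buy/Hold/Sell thresholds may differ):

```python
def ema(values, period):
    """Exponential moving average over a list of closes."""
    k = 2 / (period + 1)
    out = [values[0]]
    for v in values[1:]:
        out.append(v * k + out[-1] * (1 - k))
    return out

def rsi(closes, period=14):
    """Relative Strength Index (simple-average variant) for the last bar."""
    gains, losses = [], []
    for prev, curr in zip(closes, closes[1:]):
        change = curr - prev
        gains.append(max(change, 0.0))
        losses.append(max(-change, 0.0))
    avg_gain = sum(gains[-period:]) / period
    avg_loss = sum(losses[-period:]) / period
    if avg_loss == 0:
        return 100.0
    return 100 - 100 / (1 + avg_gain / avg_loss)

def macd(closes, fast=12, slow=26, signal=9):
    """Return (MACD line, signal line) for the last bar."""
    macd_line = [f - s for f, s in zip(ema(closes, fast), ema(closes, slow))]
    return macd_line[-1], ema(macd_line, signal)[-1]

def categorize(closes):
    """Illustrative Buy/Hold/Sell rule combining RSI and MACD."""
    r = rsi(closes)
    m, s = macd(closes)
    if r < 30 and m > s:
        return "Buy"
    if r > 70 and m < s:
        return "Sell"
    return "Hold"
```

In the workflow, `closes` would come from the daily bars returned by Alpaca's /v2/stocks/bars endpoint for each ticker.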
by Mario
Template to get your public IP address and push it to Namecheap's Dynamic DNS per subdomain.

1. Open "yourdomain.com" and insert your domain and your Namecheap DDNS password.
2. Open "subdomains" and replace/insert your subdomains.
3. Execute the workflow.

Have fun!
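The two steps the workflow performs, looking up the public IP and calling Namecheap's DDNS update endpoint per subdomain, can be sketched like this (the ipify lookup is one common choice; verify the DDNS endpoint against Namecheap's documentation and your account settings before use):

```python
from urllib.parse import urlencode
from urllib.request import urlopen

def public_ip() -> str:
    """Look up the current public IP (ipify is one common service)."""
    with urlopen("https://api.ipify.org") as resp:
        return resp.read().decode().strip()

def namecheap_ddns_url(host: str, domain: str, password: str, ip: str) -> str:
    """Build the Namecheap Dynamic DNS update URL for one subdomain."""
    query = urlencode({"host": host, "domain": domain,
                       "password": password, "ip": ip})
    return f"https://dynamicdns.park-your-domain.com/update?{query}"

def update_subdomains(subdomains, domain, password):
    """Push the current public IP for each subdomain; returns responses."""
    ip = public_ip()
    return [urlopen(namecheap_ddns_url(h, domain, password, ip)).read()
            for h in subdomains]
```

Each subdomain requires its own update call, which is why the template loops over the "subdomains" list.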
by Max Tkacz
This n8n workflow template lets teams easily generate a custom AI chat assistant based on the schema of any Notion database. Simply provide the Notion database URL, and the workflow downloads the schema and creates a tailored AI assistant designed to interact with that specific database structure.

Set Up

Watch this quick set-up video 👇

Key Features
- **Instant Assistant Generation**: Enter a Notion database URL, and the workflow produces an AI assistant configured to the database schema.
- **Advanced Querying**: The assistant performs flexible queries, filtering records by multiple fields (e.g., tags, names). It can also search inside Notion pages to pull relevant content from specific blocks.
- **Schema Awareness**: Understands and interacts with various Notion column types like text, dates, and tags for accurate responses.
- **Reference Links**: Each query returns direct links to the exact Notion pages that inform the assistant's response, promoting transparency and easy access.
- **Self-Validation**: The workflow has logic to check the generated assistant, and if any errors are detected, it reruns the agent to fix them.

Ideal for
- **Product Managers**: Easily access and query product data across Notion databases.
- **Support Teams**: Quickly search through knowledge bases for precise information to enhance support accuracy.
- **Operations Teams**: Streamline access to HR, finance, or logistics data for fast, efficient retrieval.
- **Data Teams**: Automate large dataset queries across multiple properties and records.

How It Works

This AI assistant leverages two HTTP request tools—one for querying the Notion database and another for retrieving data within individual pages. It's powered by the Anthropic LLM (or can be swapped for GPT-4) and always provides reference links for added transparency.
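The first HTTP request tool hits Notion's database query endpoint. A minimal sketch of building such a request (the property name and filter here are placeholders for whatever the generated assistant derives from your schema):

```python
import json
from urllib.request import Request

NOTION_VERSION = "2022-06-28"  # pin the Notion API version you test against

def build_database_query(database_id: str, token: str,
                         prop: str, contains: str) -> Request:
    """Build a Notion database query filtering one rich_text property.

    Mirrors what the template's first HTTP request tool does; the
    property name and filter value are illustrative.
    """
    payload = {
        "filter": {
            "property": prop,
            "rich_text": {"contains": contains},
        }
    }
    return Request(
        f"https://api.notion.com/v1/databases/{database_id}/query",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Notion-Version": NOTION_VERSION,
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

The second tool retrieves block children of individual pages, which is how the assistant pulls content from specific blocks.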
by Davide
💬🗂️🤖 This workflow automates the translation of Google Slides presentations from any language, while preserving the original formatting and slide structure. It leverages Google APIs, AI translation (Gemini/PaLM), and modular execution for high flexibility and accuracy.

DISCLAIMER: texts are split by the Google Slides API into small blocks, so the translation will not always be contextualized.

Key Benefits
- ⚡ **Time-Saving**: Automates a typically manual and error-prone task of translating slides.
- 🌍 **AI-Powered Accuracy**: Uses Google Gemini to provide context-aware translations while respecting defined rules.
- 🔒 **Safe & Non-Destructive**: The original presentation is never modified — a new copy is always created.
- 🎯 **Precision**: Skips irrelevant text (e.g., emails, URLs, names) to avoid mistranslation.
- 🔁 **Modular & Scalable**: Uses subworkflows and batching, ideal for presentations with many slides.
- 🎨 **Layout Preservation**: Keeps the original design and formatting intact.

How it Works
- **Initialization**: The workflow starts with a manual trigger ("When clicking 'Execute workflow'"). Set the language to translate into (IMPORTANT: ISO-639 format). It duplicates a specified Google Slides presentation ("Duplicate presentation") to create a new copy for translation, preserving the original.
- **Slide Processing**: The workflow retrieves slides from the copied presentation ("Get slides from a presentation") and processes them in batches ("Loop Over Items"). For each slide, text content is extracted ("Extract Text") using a custom JavaScript snippet, which identifies and collects text elements while retaining the slide's objectId.
- **Translation**: The extracted texts are passed to a LangChain agent ("Translation expert"), which translates the content from Italian to English. The agent follows strict guidelines (e.g., skipping URLs, brand names, etc.).
- The translated text is sent to the "Translate Google Slides" node, which replaces the original text in the presentation using the slide's objectId for targeting.
- **Execution Flow**: The workflow includes delays ("Wait 10 sec" and "Wait 3 sec") to manage API rate limits and ensure smooth execution. The process repeats for each batch of slides until all content is translated.

Set Up Steps

Prerequisites:
- Ensure access to the source Google Slides presentation (specified by fileId in "Duplicate presentation").
- Set up Google OAuth2 credentials for Google Drive and Slides (nodes reference credentials like "Google Slides account").
- Configure the Google Gemini (PaLM) API credentials for the translation agent.

Configuration:
- Update the fileId in the "Duplicate presentation" node to point to your source presentation.
- Adjust the translation guidelines in the "Translation expert" node if needed (e.g., language pairs or exclusion rules).
- Modify batch sizes or wait times (e.g., "Wait 10 sec") based on API constraints.

Execution:
- Run the workflow manually or trigger it via the "Execute Workflow" node from another workflow.
- Monitor progress in n8n's execution log, as each slide is processed and translated sequentially.

Output:
- The translated presentation is saved as a new file in Google Drive, with the filename including a timestamp (e.g., NAME_PRESENTATION_{lang}_{timestamp}).

Note: The workflow is currently inactive ("active": false); enable it after configuration.

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
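Replacing text by objectId maps onto the Slides API's batchUpdate request format. A sketch of building that payload (illustrative; the actual node may batch and target text ranges differently):

```python
def build_replace_requests(translations: dict) -> dict:
    """Build Slides batchUpdate requests swapping each shape's text.

    `translations` maps a shape's objectId to its translated text.
    Sketch of what the "Translate Google Slides" step sends to
    presentations.batchUpdate.
    """
    requests = []
    for object_id, new_text in translations.items():
        # Clear the existing text in the shape, then insert the translation
        requests.append({"deleteText": {"objectId": object_id,
                                        "textRange": {"type": "ALL"}}})
        requests.append({"insertText": {"objectId": object_id,
                                        "insertionIndex": 0,
                                        "text": new_text}})
    return {"requests": requests}
```

Keeping the original objectId is what lets the replacement land in the right shape without touching layout or formatting.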
by Automate With Marc
🎬 Veo3 Instagram Reel Generator – AI-Powered Ad Creation in Minutes

Description:

This no-code workflow transforms your creative brief into an engaging Instagram Reel using OpenAI and the Veo3 API (via Wavespeed) — fully automated in n8n. Just type a product, theme, or trend via chat, and get a short-form video plus caption delivered and logged, ready to post. Perfect for marketers, creators, and content teams looking to scale their ad content output without hiring editors or creative agencies.

Watch the step-by-step build video tutorial here: https://www.youtube.com/@Automatewithmarc

⚙️ How It Works:
- 💬 Chat Trigger: Start by sending a message like "Create an ad for a minimalist perfume brand using the 'quiet luxury' trend."
- 🧠 Prompt Engineer (ChatGPT): Generates a 5–8 second descriptive video prompt suitable for Veo3 based on your input — including visual tone, motion, and hook.
- 📡 API Call to Veo3 via Wavespeed: Submits the prompt to create a short video (9:16 ratio, ~8 seconds), then polls for the final video URL.
- ✍️ Caption Generator (GPT): Creates an Instagram-friendly caption to pair with the video, using a playful, impactful writing style.
- 📄 Google Sheets Integration: Logs each generated video prompt, final video URL, caption, and status into a Google Sheet for easy management and scheduling.

🔌 Tools & Integrations:
- OpenAI GPT (prompt generation & caption copywriting)
- Veo3 via Wavespeed API (video generation)
- Google Sheets (content tracking and publishing queue)
- Telegram / Chat UI trigger (optional – easily swappable)

💡 Use Cases:
- Instagram & TikTok ad generation
- Creative automation for digital agencies
- Short-form UGC testing at scale
- Trend-driven campaign ideation
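The "polls for the final video URL" step is a standard submit-then-poll loop. A generic sketch (the response shape shown is an assumption; adapt the field names to Wavespeed's actual job-status payload):

```python
import time

def poll_for_video(fetch_status, timeout=300, interval=10):
    """Poll a job-status callable until the video URL is ready.

    `fetch_status` is any function returning a dict like
    {"status": "completed", "video_url": ...}; that exact shape is an
    assumption, not the documented Wavespeed response.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        job = fetch_status()
        if job.get("status") == "completed":
            return job["video_url"]
        if job.get("status") == "failed":
            raise RuntimeError("video generation failed")
        time.sleep(interval)
    raise TimeoutError("video not ready within timeout")
```

In n8n this is typically built with a Wait node plus an If node looping back until the status check succeeds.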
by Catalina Kuo
Overview

Do you often forget to record expenses? Let Spending Tracker Bot help you!

This AI image/text Spending Tracker LINE Bot workflow allows you to quickly create a customized spending tracker robot without writing a line of code. At any time, you can speak or send a photo, and the AI will parse it and automatically log the expense to your cloud ledger.

Preparation
① Enable the Google Sheets API in GCP and complete the OAuth setup
② Create the Google Sheet and populate the field names (feel free to modify based on your own needs)
③ Configure the Webhook URL in the LINE Developers Console
④ OpenAI API Key

Node Configurations

Webhook
- Purpose: The URL is used to receive incoming requests from LINE.
- Configuration: Paste this URL into the Webhook URL field in your LINE Developers Console.

Switch based on Expense Type & Set/Https
- Purpose: To distinguish whether the incoming message is text or an image.
- Configuration: Use a Switch node to route the flow accordingly.

AI Agent
- Purpose: To extract and organize the required fields.
- Configuration: Chat Model & Structured Output Parser.

Create a deduplication field
- Purpose: To prevent duplicate entries by creating a unique "for_deduplication" field.
- Configuration: Join multiple field names using hyphens (-) as separators.

Aggregate & Merge_all
- Purpose: To prevent duplicate entries in the data table.
- Configuration: Read the Google Sheet, extract the existing "for_deduplication" column into a dedupeList, and compare it against the newly generated "for_deduplication" value from the previous step.

Response Switch
- Purpose: To route data and send appropriate replies based on the content.
- Configuration: Use the replyToken to respond after the branching logic. Depending on the result, either write to the data table or return a message:
  - ✅ Expense recorded successfully: <for_deduplication> (irrelevant details or images will not be logged)
  - ⚠️ This entry has already been logged and will not be duplicated.

Step-by-step teaching notes: 【Auto Expense Tracker from LINE Messages with GPT-4 and Google Sheets】
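The deduplication logic described above, hyphen-joining the fields into a key and checking it against the sheet's existing column, can be sketched as:

```python
def dedupe_key(fields) -> str:
    """Join an entry's fields with hyphens to form the
    "for_deduplication" value, as the workflow's Set step does."""
    return "-".join(str(f) for f in fields)

def should_log(new_entry_fields, existing_keys) -> bool:
    """Return True when the entry is not already in the sheet.

    `existing_keys` plays the role of the workflow's dedupeList read
    from the Google Sheet's "for_deduplication" column.
    """
    return dedupe_key(new_entry_fields) not in set(existing_keys)
```

When `should_log` is False, the workflow skips the write and replies with the "already been logged" message instead.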
by Cristian Tala Sánchez
✨ SEO Blog Post Automation with Perplexity, GPT, Leonardo AI & WordPress

This workflow automates the creation and publishing of weekly SEO-optimized blog posts using AI and publishes them directly to WordPress — with featured images and tracking in Google Sheets.

🧠 Who is this for

This automation is ideal for:
- Startup platforms and tech blogs
- Content creators and marketers
- Solopreneurs who want consistent blog output
- Spanish-speaking audiences focused on startup trends

⚙️ What it does
- ⏰ Runs every Monday at 6:00 AM via CRON
- 📡 Uses Perplexity AI to research trending startup topics
- 📝 Generates a 1000–1500 word article with GPT in structured HTML
- 🎨 Creates a cinematic blog image using Leonardo AI
- 🖼️ Uploads the image to WordPress with alt text and an SEO-friendly filename
- 📰 Publishes the post in a pre-defined category
- 📊 Logs the post in Google Sheets for tracking

🚀 How to set it up
1. Connect your credentials:
   - Perplexity API
   - OpenAI (GPT-4.1 Mini or similar)
   - Leonardo AI (Bearer token)
   - WordPress (Basic Auth)
   - Google Sheets (OAuth2)
2. Customize your content:
   - Adjust the prompt inside the HTTP node to fit your tone or focus
   - Change the WordPress category ID
   - Update the schedule if you want a different publishing day
3. Test the workflow manually to ensure all steps function correctly

💡 Pro tips
- Add Slack or email nodes to get notified when a post goes live
- Use multiple categories or RSS feeds for content diversification
- Adjust the GPT prompt to support different languages or tones
- Add post-validation rules if needed before publishing

🎯 Why this matters

This workflow gives you a full editorial process on autopilot: research, writing, design, publishing, and tracking — all powered by AI. No more blank pages or manual posting. Use it to scale your content strategy, boost your SEO, and stay relevant — 100% hands-free.
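The image-upload step uses the WordPress REST API's media endpoint (POST /wp-json/wp/v2/media) with Basic Auth. A sketch of the headers involved (illustrative; in n8n the HTTP node's Basic Auth credential builds the Authorization header for you):

```python
import base64

def wp_media_headers(user: str, app_password: str, filename: str) -> dict:
    """Headers for uploading a featured image via the WordPress REST API.

    Sketch of what the workflow's upload step sends; filename and
    content type are examples.
    """
    token = base64.b64encode(f"{user}:{app_password}".encode()).decode()
    return {
        "Authorization": f"Basic {token}",
        # WordPress derives the attachment filename from this header,
        # which is where the SEO-friendly filename matters
        "Content-Disposition": f'attachment; filename="{filename}"',
        "Content-Type": "image/png",
    }
```

The response's attachment ID is then passed as `featured_media` when creating the post.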
by Markhah
Overview

This n8n workflow is a modular AI analyst system that provides real-time insights from CoinMarketCap's centralized and decentralized data sources. Using GPT-based AI, the system interprets natural language questions about the crypto market and delegates them to specialized agent workflows. It supports Telegram chat input and returns structured results such as coin quotes, DEX liquidity, exchange info, and community sentiment—all integrated from the CoinMarketCap API ecosystem.

Prerequisites
a. OpenAI or Gemini account (via GPT-4o-mini or an equivalent LLM).
b. Telegram Bot API token (for message input/output).
c. Valid CoinMarketCap API key.
d. 📦 Required subflows (all must be installed and configured before use; each acts as a specialized endpoint wrapper for the CoinMarketCap APIs):
   - CoinMarketCap_Crypto_Agent_Tool
   - CoinMarketCap_Exchange_and_Community_Agent_Tool
   - CoinMarketCap_DEXScan_Agent_Tool

How It Works
1. Telegram Input: Users send a query to the bot (e.g. "Top DEX pairs on Ethereum").
2. Session Memory & Agent Brain: The session is tracked via chat.id; GPT-4o-mini interprets the query and routes it to sub-agents.
3. Sub-Agent Workflows:
   - Crypto Agent: prices, rankings, conversions
   - Exchange Agent: community sentiment, exchange token holdings
   - DEX Agent: OHLCV data, liquidity pools, trades
4. Multi-Agent Coordination: The AI can combine queries across tools (e.g., get token ID → fetch quote → analyze liquidity), ensures valid parameters, and avoids API errors.
5. Telegram Output: The final analysis is sent back to the user as a formatted message.
Troubleshooting Tips

| Error Code | Meaning | Fix |
| --- | --- | --- |
| 400 | Bad request | Check symbol/slug/ID validity |
| 401 | Unauthorized | Verify CoinMarketCap API key |
| 429 | Rate limit exceeded | Throttle or upgrade API tier |
| 500 | Server error | Retry with backoff or report to CMC |

Example Telegram Queries
- "Show me top 5 coins by market cap"
- "Get price of ETH on Uniswap and Binance"
- "How much liquidity is in the SOL-USDC pair?"
- "Fear & Greed Index and trending tokens"

SEO Tags (hidden, or kept as a separate note): coinmarketcap, n8n crypto analyst, crypto ai telegram bot, dex liquidity, CMC price tracker, gpt-4o crypto market, token sentiment dashboard, fear and greed index
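The "retry with backoff" fix for 429/5xx responses can be sketched as a small wrapper (illustrative; `call` returning a `(status_code, payload)` pair is an assumption, so adapt it to however your HTTP node surfaces errors):

```python
import time

def with_backoff(call, retries=4, base_delay=1.0):
    """Retry a CoinMarketCap request on 429/5xx with exponential backoff.

    `call` returns (status_code, payload); that shape is illustrative.
    """
    for attempt in range(retries):
        status, payload = call()
        if status == 200:
            return payload
        if status in (429, 500, 502, 503):
            # Exponential backoff: 1s, 2s, 4s, ...
            time.sleep(base_delay * 2 ** attempt)
            continue
        raise RuntimeError(f"unrecoverable API error {status}")
    raise RuntimeError("retries exhausted")
```

400 and 401 are excluded from the retry set on purpose: resending an invalid symbol or a bad API key will never succeed.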