by Satoshi
**Overview**

The workflow automatically gathers weekly user and page view metrics, uses AI to analyze, compare, and compile a summary report, and finally sends the report to the manager's email.

**How it works**

**Get Data from GA**
- Automatically retrieve data from Google Analytics (GA) for the two most recent weeks.
- Compare the data and calculate the variances between the two weeks.

**Generate Report**
- Automatically analyze the data and generate reports using Artificial Intelligence (AI).
- Generate charts to visualize the data.
- Export the report to PDF.

**Send Report**
- Send the report via email to the manager.

**Set up steps**

**Google Cloud account**
- Create the credentials and replace them in the workflow.
- Enable the following APIs: Gmail API, Google Analytics Admin API, Google Analytics Data API.

**HTML to PDF account**
- Install the HTML to PDF node.
- Get an API key and replace it in the workflow.
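The week-over-week comparison step can be sketched as a small Code-node function. This is only a sketch: the GA4-style metric names (`totalUsers`, `screenPageViews`) and the input shape are illustrative assumptions, not the workflow's exact fields.

```javascript
// Sketch (assumed field names): compute week-over-week variance
// for the two weeks of metrics returned by the Google Analytics node.
function compareWeeks(lastWeek, prevWeek) {
  const delta = {};
  for (const metric of Object.keys(lastWeek)) {
    const current = lastWeek[metric];
    const previous = prevWeek[metric];
    const changePct =
      previous === 0 ? null : ((current - previous) / previous) * 100;
    delta[metric] = {
      current,
      previous,
      change: current - previous,
      changePct: changePct === null ? null : Number(changePct.toFixed(1)),
    };
  }
  return delta;
}

// Example with made-up numbers:
const result = compareWeeks(
  { totalUsers: 1200, screenPageViews: 5400 },
  { totalUsers: 1000, screenPageViews: 6000 }
);
```

The `delta` object is what the AI node would then turn into report prose ("users up 20%, page views down 10%").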
by Robert Breen
This n8n workflow template creates an intelligent data analysis chatbot that can answer questions about data stored in Google Sheets using OpenAI's GPT-5 Mini model. The system automatically analyzes your spreadsheet data and provides insights through natural language conversations.

**What This Workflow Does**
- **Chat Interface**: Provides a conversational interface for asking questions about your data
- **Smart Data Analysis**: Uses AI to understand column structures and data relationships
- **Google Sheets Integration**: Connects directly to your Google Sheets data
- **Memory Buffer**: Maintains conversation context for follow-up questions
- **Automated Column Detection**: Automatically identifies and describes your data columns

🚀 **Try It Out!**

**1. Set Up OpenAI Connection**
- Visit the OpenAI API Keys page.
- Go to OpenAI Billing and add funds to your billing account.
- Copy your API key into your OpenAI credentials in n8n (or your chosen platform).

**2. Prepare Your Google Sheet**

Data must follow this format (Sample Marketing Data):
- The **first row** contains column names.
- Data should be in rows 2–100.
- Log in using OAuth, then select your workbook and sheet.

**3. Ask Questions of Your Data**

You can ask natural language questions to analyze your marketing data, such as:
- **Total spend** across all campaigns
- **Spend for Paid Search only**
- **Month-over-month changes** in ad spend
- **Top-performing campaigns** by conversion rate
- **Cost per lead** for each channel

📬 **Need Help or Want to Customize This?**

📧 rbreen@ynteractive.com
🔗 LinkedIn
🔗 n8n Automation Experts
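The "automated column detection" idea can be sketched in plain JavaScript. The sample rows and column names below are hypothetical; the actual workflow reads them from your sheet.

```javascript
// Sketch: infer each column's name and a rough type from the first
// non-empty value in sheet rows (rows come in as objects keyed by
// the header row).
function describeColumns(rows) {
  if (rows.length === 0) return [];
  return Object.keys(rows[0]).map((name) => {
    const sample = rows
      .map((r) => r[name])
      .find((v) => v !== null && v !== undefined && v !== "");
    const type = typeof sample === "number" ? "number" : "text";
    return { name, type };
  });
}

// Hypothetical marketing rows:
const columns = describeColumns([
  { Campaign: "Paid Search", Spend: 1200, Leads: 34 },
  { Campaign: "Social", Spend: 800, Leads: 21 },
]);
```

A description like this is what the AI agent receives so it knows which columns it can sum or compare.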
by Pake.AI
**Overview**

This workflow converts a single topic into a full blog article through a structured multi-step process. Instead of generating everything in one pass, it breaks the task into clear stages to produce cleaner structure, better SEO consistency, and more predictable output quality.

**How this workflow differs from asking ChatGPT directly**

It does not produce an article in one step. It separates the process into two focused stages: outline generation and paragraph expansion. This approach gives you more control over tone, SEO, structure, and keyword placement.

**How it works**

1. **Generate outline**: The workflow sends your topic to an AI Agent, which returns a structured outline based on the topic, desired depth, language, and keyword focus.
2. **Expand each subtopic**: The workflow loops through each outline item. Every subtopic is expanded into a detailed, SEO-friendly paragraph. Output is consistent and optimized for readability.
3. **Produce final outputs**: Combines all expanded sections into a clean JSON object and a Markdown version ready for blogs or CMS. The JSON includes the title, HTML content, and Markdown content. You can send this directly to REST APIs such as WordPress, Notion, or documentation platforms.

Content is validated for readability and typically scores well in tools like Yoast SEO. Uses GPT-4o Mini by default, with average token usage between 2000 and 3000 depending on outline size.

**Use cases**
- Auto-generate long-form articles for blogs or content marketing.
- Turn Instagram or short-form scripts into complete SEO articles.
- Create documentation or educational content using consistent templates.

**Setup steps**

1. **Prepare credentials**: Add your OpenAI API Key inside n8n's credential manager.
2. **Adjust input parameters**: topic or main idea, number of outline items, language, primary keyword, tone or writing style (optional).
3. **Customize the workflow**: Switch the model if you want higher quality or lower token usage. Modify the prompt for the outline or paragraph generator to match your writing style. Add additional nodes if you want to auto-upload the final article to WordPress, Notion, or any API.
4. **Run the workflow**: Enter your topic, execute the workflow, and retrieve both JSON and Markdown outputs for immediate publishing.

If you need help expanding this into a full content pipeline or want to integrate it with other automation systems, feel free to customize further.
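The final assembly step (combining expanded sections into the JSON object with title, HTML, and Markdown) can be sketched like this; the section shape (`heading`/`paragraph`) is an assumption for illustration.

```javascript
// Sketch: combine expanded outline sections into the workflow's
// final output object (title + HTML + Markdown).
function buildArticle(title, sections) {
  const markdown = [
    `# ${title}`,
    ...sections.map((s) => `## ${s.heading}\n\n${s.paragraph}`),
  ].join("\n\n");
  const html = [
    `<h1>${title}</h1>`,
    ...sections.map((s) => `<h2>${s.heading}</h2><p>${s.paragraph}</p>`),
  ].join("");
  return { title, html, markdown };
}

const article = buildArticle("Why Habits Matter", [
  { heading: "The Science", paragraph: "Habits form through repetition." },
]);
```

The resulting object can be posted as-is to a CMS REST endpoint such as the WordPress posts API.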
by n8n Team
This workflow provides a simple example of how to use itemMatching(itemIndex: Number) in the Code node to retrieve linked items from earlier in the workflow.
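Inside an actual Code node the call looks like `$('Node Name').itemMatching($itemIndex)`. Outside n8n, the idea can be shown in plain JavaScript: each item produced by a later node is linked back to the item at the matching position in an earlier node's output (all names below are illustrative).

```javascript
// Output of an earlier node (n8n items wrap data in a `json` key):
const customerData = [
  { json: { name: "Alice", country: "DE" } },
  { json: { name: "Bob", country: "FR" } },
];

// A later node kept only a derived field, losing the rest:
const emailsOnly = customerData.map((item) => ({
  json: { email: `${item.json.name.toLowerCase()}@example.com` },
}));

// itemMatching(i) is, conceptually, "fetch index i from that earlier
// node's output" so the dropped fields can be re-attached:
const itemMatching = (i) => customerData[i];
const merged = emailsOnly.map((item, i) => ({
  json: { ...item.json, country: itemMatching(i).json.country },
}));
```

This is why `itemMatching` is useful after nodes that reshape or filter data: it restores the link to the original item.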
by Lucas Peyrin
How it works This workflow is an interactive, hands-on tutorial designed to teach you the absolute basics of JSON (JavaScript Object Notation) and, more importantly, how to use it within n8n. It's perfect for beginners who are new to automation and data structures. The tutorial is structured as a series of simple steps. Each node introduces a new, fundamental concept of JSON: Key/Value Pairs: The basic building block of all JSON. Data Types: It then walks you through the most common data types one by one: String (text) Number (integers and decimals) Boolean (true or false) Null (representing "nothing") Array (an ordered list of items) Object (a collection of key/value pairs) Using JSON with Expressions: The most important step! It shows you how to dynamically pull data from a previous node into a new one using n8n's expressions ({{ }}). Final Exam: A final node puts everything together, building a complete JSON object by referencing data from all the previous steps. Each node has a detailed sticky note explaining the concept in simple terms. Set up steps Setup time: 0 minutes! This is a tutorial workflow, so there is no setup required. Simply click the "Execute Workflow" button to run it. Follow the instructions in the main sticky note: click on each node in order, from top to bottom. For each node, observe the output in the right-hand panel and read the sticky note next to it to understand what you're seeing. By the end, you'll have a solid understanding of what JSON is and how to work with it in your own n8n workflows.
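All of the concepts the tutorial walks through fit in one small document. The values below are made up for illustration; in n8n, the expression `{{ $json.author.firstName }}` is ordinary property access on parsed JSON like this.

```javascript
// One object containing every JSON data type covered in the tutorial:
const doc = JSON.parse(`{
  "name": "n8n",
  "version": 1.5,
  "isAwesome": true,
  "nickname": null,
  "tags": ["automation", "low-code"],
  "author": { "firstName": "Jan" }
}`);

// string, number, boolean, null, array, object -- and the same
// dotted access an n8n expression performs under the hood:
const firstName = doc.author.firstName;
```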
by Mohamed Salama
Let AI agents communicate with your Bubble app automatically. The workflow connects directly to your Bubble Data API. It is designed for teams building AI tools or copilots that need seamless access to Bubble backend data via natural language queries.

**How it works**
- Triggered via a webhook from an AI agent using the MCP (Model Context Protocol).
- The agent selects the appropriate data tool (e.g., projects, users, bookings) based on user intent.
- The workflow queries your Bubble database and returns the result.

Ideal for integrating with ChatGPT, n8n AI Agents, assistants, or autonomous workflows that need real-time access to app data.

**Set up steps**
1. Enable access to your Bubble Data or backend APIs (as needed).
2. Create a Bubble admin token.
3. Add your Bubble node(s) to your n8n workflow.
4. Add your Bubble admin token and configure your Bubble node(s).
5. Copy the generated webhook URL from the MCP Server Trigger node and register it with your AI tool (e.g., LangChain tool loader).
6. (Optional) Adjust filters in the "Get an Object Details" node to match your dataset needs.

Once connected, your AI agents can automatically retrieve context-aware data from your Bubble app, with no manual lookups required.
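For orientation, here is a sketch of the kind of request the workflow's Bubble node issues against the Bubble Data API. The app name, data type, token, and constraint are placeholders; the URL pattern and Bearer header follow Bubble's documented Data API, but treat the details as assumptions to verify against your app.

```javascript
// Sketch: build a Bubble Data API "list things" request.
function buildBubbleRequest(appName, dataType, adminToken, constraints) {
  const params = new URLSearchParams();
  if (constraints) params.set("constraints", JSON.stringify(constraints));
  return {
    method: "GET",
    url: `https://${appName}.bubbleapps.io/api/1.1/obj/${dataType}?${params}`,
    headers: { Authorization: `Bearer ${adminToken}` },
  };
}

// Hypothetical example: confirmed bookings only.
const req = buildBubbleRequest("myapp", "bookings", "TOKEN", [
  { key: "status", constraint_type: "equals", value: "confirmed" },
]);
```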
by moosa
**Daily Tech & Startup Digest: Notion-Powered News Curation**

**Description**

This n8n workflow automates the curation of a daily tech and startup news digest from articles stored in a Notion database. It filters articles from the past 24 hours, refines them using keyword matching and LLM classification, aggregates them into a single Markdown digest with categorized summaries, and publishes the result as a Notion page. Designed for manual testing or daily scheduled runs, it includes sticky notes (as required by the n8n creator page) to document each step clearly. This original workflow is for educational purposes, showcasing Notion integration, AI classification, and Markdown-to-Notion conversion.

**Workflow Overview**

**Triggers**
- **Manual Trigger**: Tests the workflow (When clicking 'Execute workflow').
- **Schedule Trigger**: Runs daily at 8 PM (disabled by default).

**Article Filtering**
- **Fetch Articles**: Queries the Notion database (Get many database pages) for articles from the last 24 hours using a date filter.
- **Keyword Filtering**: A JavaScript Code node filters articles containing tech/startup keywords (e.g., "tech," "AI," "startup") in the title, summary, or full text.
- **LLM Classification**: Uses OpenAI's gpt-4.1-mini with a text classifier to categorize articles as "Tech/Startup" or "Other," keeping only relevant ones.

**Digest Creation**
- **Aggregate Articles**: Combines filtered articles into a single object for processing.
- **Generate Digest**: An AI agent with OpenAI's gpt-4.1-mini creates a Markdown digest with an intro paragraph, categorized article summaries (e.g., AI & Developer Tools, Startups & Funding), clickable links, and a closing note.
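The keyword-filtering Code node can be sketched as below. The keyword list and article fields are illustrative assumptions, and note that naive substring matching (e.g., "ai") will over-match; the LLM classification step exists precisely to clean up such false positives.

```javascript
// Sketch: keep only articles mentioning a tech/startup keyword in
// the title, summary, or full text (case-insensitive substring match).
const KEYWORDS = ["tech", "ai", "startup", "funding", "saas"];

function isRelevant(article) {
  const haystack = [article.title, article.summary, article.fullText]
    .filter(Boolean)
    .join(" ")
    .toLowerCase();
  return KEYWORDS.some((k) => haystack.includes(k));
}

const kept = [
  { title: "New AI model released", summary: "" },
  { title: "Local weather report", summary: "" },
].filter(isRelevant);
```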
**Notion Publishing**
- **Format for Notion**: A JavaScript Code node converts the Markdown digest into a Notion-compatible JSON payload, supporting headings, bulleted lists, and links, with a title like "Tech & Startup Daily Digest – YYYY-MM-DD".
- **Create Notion Page**: Sends the payload via HTTP request to the Notion API to create a new page.

**Credentials**

Uses Notion API and OpenAI API credentials.

**Notes**
- This workflow is for educational purposes, demonstrating Notion database querying, AI classification, and Markdown-to-Notion publishing.
- Enable and adjust the schedule trigger (e.g., 8 PM daily) for production use to create daily digests.
- Set up Notion and OpenAI API credentials in n8n before running.
- The date filter can be modified (e.g., hours instead of days) to adjust the article selection window.
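The Markdown-to-Notion conversion can be sketched for the two simplest cases, `##` headings and `-` bullets (the workflow's node also handles links). The block shapes follow Notion's public block-object format, but verify them against the Notion API reference before relying on this sketch.

```javascript
// Sketch: map one Markdown line to a Notion block object.
function mdLineToNotionBlock(line) {
  const textBlock = (content) => [{ type: "text", text: { content } }];
  if (line.startsWith("## ")) {
    return {
      object: "block",
      type: "heading_2",
      heading_2: { rich_text: textBlock(line.slice(3)) },
    };
  }
  if (line.startsWith("- ")) {
    return {
      object: "block",
      type: "bulleted_list_item",
      bulleted_list_item: { rich_text: textBlock(line.slice(2)) },
    };
  }
  return {
    object: "block",
    type: "paragraph",
    paragraph: { rich_text: textBlock(line) },
  };
}

const blocks = ["## AI & Developer Tools", "- New model released"].map(
  mdLineToNotionBlock
);
```

The resulting `blocks` array is what goes into the `children` field of the Notion "create page" request body.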
by Interlock GTM
**Summary**

Turns a plain name + email into a fully enriched HubSpot contact by matching the person in Apollo, pulling their latest LinkedIn activity, summarising the findings with GPT-4o, and upserting the clean data into HubSpot.

**Key use-cases**
- SDRs enriching inbound demo requests before routing
- RevOps teams keeping executive records fresh
- Marketers building highly segmented email audiences

**Inputs**

| Field | Type   | Example         |
|-------|--------|-----------------|
| name  | string | "Jane Doe"      |
| email | string | "jane@acme.com" |

**Required credentials**

| Service | Node | Notes |
|---------|------|-------|
| Apollo.io API key | HTTP Request – "Enrich with Apollo" | Set in header x-api-key |
| RapidAPI key (Fresh-LinkedIn-Profile-Data) | "Get recent posts" | Header x-rapidapi-key |
| OpenAI | 3 LangChain nodes | Supply an API key; default model gpt-4o-mini |
| HubSpot OAuth2 | "Enrich in HubSpot" | Add/create any custom contact properties referenced |

**High-level flow**
1. Trigger – Runs when another workflow passes name & email.
2. Clean – A JS Code node normalises & deduplicates emails.
3. Apollo match – Queries /people/match; skips if no person is found.
4. LinkedIn fetch – Grabs up to 3 original posts from the last 30 days.
5. AI summary chain – OpenAI → Structured/Auto-fixing parsers; produces a strict JSON block with job title, location, summaries, etc.
6. HubSpot upsert – Maps every key (plus five custom properties) into the contact record.

Sticky notes annotate the canvas; error-prone steps have retry logic.
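The "Clean" step can be sketched as a Code-node function: lowercase, trim, validate, and deduplicate incoming emails before the Apollo lookup. The input shape is an assumption for illustration.

```javascript
// Sketch: normalise and deduplicate contact emails.
function cleanEmails(items) {
  const seen = new Set();
  return items
    .map((i) => ({ ...i, email: i.email.trim().toLowerCase() }))
    .filter((i) => {
      // Drop obviously invalid addresses and duplicates.
      if (!i.email.includes("@") || seen.has(i.email)) return false;
      seen.add(i.email);
      return true;
    });
}

const cleaned = cleanEmails([
  { name: "Jane Doe", email: " Jane@Acme.com " },
  { name: "Jane Doe", email: "jane@acme.com" },
]);
```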
by Trung Tran
**AI-Powered YouTube Auto-Tagging Workflow (SEO Automation)**

Watch the demo video below:

> Supercharge your YouTube SEO with this AI-powered workflow that automatically generates and applies smart, SEO-friendly tags to your new videos every week. No more manual tagging: just better discoverability, improved reach, and consistent optimization. Plus, get instant Slack notifications so your team stays updated on every video's SEO boost.

**Who's it for**
- YouTube creators, channel admins, and marketing teams who publish regularly and want consistent, SEO-friendly tags without manual effort.
- Agencies managing multiple channels who need an auditable, automated tagging process with Slack notifications.

**How it works / What it does**
1. **Weekly Schedule Trigger**: Runs the workflow once per week.
2. **Get all videos uploaded last week**: Queries YouTube for videos uploaded by the channel in the past 7 days.
3. **Get video detail**: Retrieves each video's title, description, and ID.
4. **YouTube Video Auto Tagging Agent (LLM)**: Takes video.title, video.description, and channelName as inputs, and uses an SEO-specialist system prompt to generate 15–20 relevant, comma-separated tags.
5. **Update video with AI-generated tags**: Writes the tags back to the video via the YouTube Data API.
6. **Inform via Slack message**: Posts a confirmation message (video title + ID + tags) to a chosen Slack channel for visibility.

**How to set up**

**YouTube connection**
1. Create a Google Cloud project and enable YouTube Data API v3.
2. Configure an OAuth client (Web app / Desktop as required).
3. Authorize with the Google account that manages the channel.
4. In your automation platform, add the YouTube credential and grant scopes (see Requirements).

**Slack connection**
1. Create or use an existing Slack app/bot.
2. Install it to your workspace and capture the Bot Token.
3. Add the Slack credential in your automation platform.

**LLM / Chat Model**

Select your model (e.g., OpenAI GPT).
Paste the System Prompt (SEO expert) and the User Prompt template. Inputs: {{video_title}}, {{video_description}}, {{channel_name}}. Output: a comma-separated list of 15–20 tags (no #, no duplicates).

**Node configuration**
- Weekly Schedule Trigger: choose day/time (e.g., Mondays 09:00 local).
- Get all videos uploaded last week: date filter = now() - 7 days.
- Get video detail: map each video ID from the previous node.
- Agent node: map fields to the prompt variables.
- Update video: map the agent's tag string to the YouTube tags field.
- Slack message: The video "{{video_title}} - {{video_id}}" has been auto-tagged successfully. Tags: {{tags}}

**Test run**
- Manually run the workflow with one recent video.
- Verify the tags appear in YouTube Studio and the Slack message posts.

**Requirements**

**APIs & Scopes**
- **YouTube Data API v3**: youtube.readonly (to list videos / details); youtube or youtube.force-ssl (to update video metadata incl. tags)
- **Slack Bot Token Scopes**: chat:write (post messages); channels:read or groups:read if selecting channels dynamically (optional)

**Platform**
- Access to a chat/LLM provider (e.g., OpenAI).
- Outbound HTTPS allowed.

**Rate limits & quotas**
- YouTube updates consume quota, and tag updates are write operations, so avoid re-writing unchanged tags.
- Add basic throttling (e.g., 1–2 updates/sec) if you process many videos.

**How to customize the workflow**
- **Schedule:** switch to daily, or run on publish events instead of weekly.
- **Filtering:** process only videos matching rules (e.g., title contains "tutorial", or missing tags).
- **Prompt tuning:** add brand keywords to always include (e.g., "WiseStack AI"); constrain to a language (e.g., "Vietnamese tags only"); enforce a maximum of 500 chars total for tags if you want a stricter cap.
- **Safety guardrails:** validate model output (split by comma, trim whitespace, dedupe, drop empty/over-long tags); if the agent fails, fall back to a heuristic generator (title/keyword extraction).
- **Change log:** write a row per update to a sheet/DB (videoId, oldTags, newTags, timestamp, runId).
- **Human-in-the-loop:** send tags to Slack as buttons ("Apply / Edit / Skip") before updating YouTube.
- **Multi-channel support:** loop through a list of channel credentials and repeat the pipeline.
- **Notifications:** add error Slack messages for failed API calls; summarize weekly results.

Tip: Keep a small allow/deny list (e.g., banned terms, mandatory brand terms) and run a quick sanitizer right after the agent node to maintain consistency across your channel.
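The sanitizer guardrail described above can be sketched as a Code-node function. The per-tag and total length caps are illustrative assumptions (500 characters total is a commonly cited YouTube limit, but verify against current YouTube documentation).

```javascript
// Sketch: split, trim, dedupe (case-insensitive), drop empty or
// over-long tags, and stop before exceeding a total-length cap.
function sanitizeTags(raw, maxTagLen = 30, maxTotal = 500) {
  const seen = new Set();
  const tags = [];
  let total = 0;
  for (const t of raw.split(",").map((s) => s.trim())) {
    const key = t.toLowerCase();
    if (!t || t.length > maxTagLen || seen.has(key)) continue;
    if (total + t.length > maxTotal) break;
    seen.add(key);
    tags.push(t);
    total += t.length;
  }
  return tags;
}

const tags = sanitizeTags("n8n, automation, , N8N, seo tips,  automation ");
```

Running this right after the agent node removes the empty entry and both case-variant duplicates, leaving three clean tags.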
by Cuong Nguyen
**Description**

Start your day with a personalized news podcast delivered directly to your Telegram. This workflow helps you stay informed without scrolling through endless feeds. It automatically collects news from your favorite websites and YouTube channels, filters out the noise, and uses AI to turn it into a short, listenable audio briefing. It's like having a personal news assistant that reads the most important updates to you while you commute or drink your morning coffee.

**Who is this for**

This template is perfect for busy professionals, commuters, and learners who want to keep up with specific topics (like Tech, Finance, or AI) but don't have time to read dozens of articles every morning.

**How it works**
1. **Collects News**: The workflow automatically checks your chosen RSS feeds (e.g., TechCrunch, BBC) and searches for trending YouTube videos on topics you care about.
2. **Filters Noise**: It smartly removes duplicate stories and filters out promotional content or spam, ensuring you only get high-quality news.
3. **Summarizes**: Google Gemini (AI) reads the collected data, picks the top stories, and rewrites them into a clear, engaging script.
4. **Creates Audio**: OpenAI turns that script into a natural-sounding MP3 file (Text-to-Speech).
5. **Delivers**: You receive a neat text summary and the audio file in your Telegram chat, ready to play.

**Requirements**

API Keys: Google Gemini (PaLM), OpenAI, YouTube Data API, Telegram Bot Token

**How to set up**
1. **Get Credentials**: Sign up for the required services (Google, OpenAI, Telegram) and get your API keys.
2. **Connect Nodes**: Paste your API credentials into the respective nodes in the workflow.
3. **Set Chat ID**: Enter your Telegram Chat ID in the Telegram nodes (or set it as a variable) so the bot knows where to send the message.
4. **Turn on**: Activate the workflow to let it run automatically every morning at 7:00 AM (or any time you want).

**How to customize the workflow**
- **Your Interests:** Simply change the URLs in the RSS Feed Read nodes to follow your favorite blogs.
- **Your Topics:** Update the keywords in the YouTube - Search node (e.g., change "AI" to "Football" or "Marketing") to get relevant video news.
- **Your Voice:** Change the voice style (e.g., from alloy to echo) in the Code - Build TTS Payload node to suit your preference.

**Contact me for consulting and support**

Email: cuongnguyen@aiops.vn
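For reference, the "Code - Build TTS Payload" node might assemble something like the following for OpenAI's text-to-speech endpoint. The field names follow OpenAI's public speech API (`model`, `voice`, `input`), but treat the exact values as assumptions to check against the current API reference.

```javascript
// Sketch: build the body sent to OpenAI's /v1/audio/speech endpoint.
function buildTtsPayload(script, voice = "alloy") {
  return {
    model: "tts-1",
    voice, // e.g. "alloy", "echo"
    input: script,
    response_format: "mp3",
  };
}

const payload = buildTtsPayload(
  "Good morning! Here are today's top stories.",
  "echo"
);
```

Swapping the voice is a one-word change here, which is why the template suggests editing this node to customize the audio.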
by Dmitrij Zykovic
**Personal Expense Tracker Bot 💰**

AI-powered Telegram bot for effortless expense tracking. Send receipts, voice messages, or text: the bot automatically extracts and categorizes your expenses.

✨ **Key Features**
- 📸 **Receipt & Invoice OCR**: Send photos of receipts or PDF invoices; AI extracts expense data automatically
- 🎤 **Voice Messages**: Speak your expenses naturally; audio is transcribed and processed
- 💬 **Natural Language**: Just type "spent 50 on groceries" or any text format
- 🌍 **Multilingual**: Processes documents in any language (EN, DE, PT, etc.)
- 📊 **Smart Statistics**: Get monthly totals, category breakdowns, multi-month comparisons
- 🔒 **Private & Secure**: Single-user authorization; only you can access your data
- ⚡ **Zero Confirmation**: Expenses are added instantly, no annoying "confirm?" prompts

🎯 **How It Works**
1. Send expense data via Telegram: a photo of a receipt, a PDF invoice, a voice message, or a text message.
2. AI processes it automatically: extracts the amount, date, and vendor; categorizes the expense; stores it in an organized format.
3. Query your expenses: "Show my expenses for November", "How much did I spend on groceries?", "Compare last 3 months".

📋 **Expense Categories**

Groceries, Transportation, Housing, Utilities, Healthcare, Entertainment, Dining Out, Clothing, Education, Subscriptions, Personal Care, Gifts, Travel, Sports, Other

🔧 **Setup Requirements**

1. **Telegram Bot**: Create a Telegram bot via @BotFather and get your API token. Configure credentials for nodes: Input, WelcomeMessage, GetAudioFile, GetAttachedFile, GetAttachedPhoto, ReplyText, NotAuthorizedMessage, DeleteProcessing.
2. **OpenRouter API**: Get an API key from OpenRouter for AI processing. Configure credentials for: Gpt4o (main processing), Sonnet45 (expense assistant).
3. **Ainoflow API**: Get an API key from Ainoflow for storage and OCR. Configure Bearer credentials for: GetConfig, SaveConfig, ExtractFileText, ExtractImageText, TranscribeRecording, JsonStorageMcp (MCP tool).

🏗️ **Workflow Architecture**

| Section | Description |
|---------|-------------|
| Message Trigger | Receives all Telegram messages |
| Bot Privacy | Locks bot to first user, rejects unauthorized access |
| Chat Message / Audio | Routes text and voice messages to AI |
| Document / Photo | Extracts text from files via OCR and forwards to AI |
| Root Agent | Routes messages to Expense Assistant, validates responses |
| Expense Assistant | Core logic: stores expenses, calculates statistics |
| Result / Reply | Sends formatted response back to Telegram |
| Cleanup / Reset | Manual trigger to delete all data (⚠️ use with caution) |

💬 **Usage Examples**

Adding expenses:
- 📸 [Send receipt photo] → Added: 45.50 EUR - Groceries (Lidl)
- 🎤 "Bought coffee for five euros" → Added: 5.00 EUR - Dining Out (coffee)
- 💬 "50 uber" → Added: 50.00 EUR - Transportation (uber)

Querying expenses:
- "Show my expenses" → November 2025: 1,250.50 EUR (23 expenses). Top: Groceries 450€, Transportation 280€, Dining 220€
- "How much on entertainment this month?" → Entertainment: 85.00 EUR (3 expenses)
- "Compare October and November" → Oct: 980€ | Nov: 1,250€ (+27%)

📦 **Data Storage**

Expenses are stored in JSON format organized by month (YYYY-MM):

```json
{
  "id": "uuid",
  "amount": 45.50,
  "currency": "EUR",
  "category": "Groceries",
  "description": "Store name",
  "date": "2025-11-10T14:30:00Z",
  "created_at": "2025-11-10T14:35:22Z"
}
```

⚠️ **Important Notes**
- **First user locks the bot**: Run /start to claim ownership
- **Default currency is EUR**: AI auto-detects other currencies
- **Cleanup deletes ALL data**: Use the manual trigger with caution
- **No confirmation for adding**: Only delete operations ask for confirmation

🛠️ **Customization**
- Change the default currency in the agent prompts
- Add/modify expense categories in ExpenseAssistant
- Extend the Root Agent with additional assistants
- Adjust AI models (swap GPT-4o/Sonnet as needed)

📚 **Related Resources**
- Create Telegram Bot
- OpenRouter Credentials
- Ainoflow Platform

💼 **Need Customization?**

Want to adapt this template for your specific needs? Custom integrations, additional features, or enterprise deployment? Contact us at Ainova Systems. We build AI automation solutions for businesses.

Tags: telegram, expense-tracker, ai-agent, ocr, voice-to-text, openrouter, mcp-tools, personal-finance
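The category-breakdown statistics the Expense Assistant reports can be sketched as a simple aggregation over the stored records (the record shape matches the JSON storage format shown above; the numbers are made up).

```javascript
// Sketch: total spend per category for one month's expense records.
function categoryTotals(expenses) {
  const totals = {};
  for (const e of expenses) {
    totals[e.category] = (totals[e.category] || 0) + e.amount;
  }
  return totals;
}

const totals = categoryTotals([
  { category: "Groceries", amount: 45.5 },
  { category: "Transportation", amount: 50 },
  { category: "Groceries", amount: 12.3 },
]);
```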
by Automate With Marc
## Podcast on Autopilot — Generate Podcast Ideas, Scripts & Audio Automatically with Eleven Labs, GPT-5 and Claude Sonnet 4.0

Bring your solo podcast to life — on full autopilot. This workflow uses GPT-5 and Claude Sonnet to turn a single topic input into a complete podcast episode intro and ready-to-send audio file.

**How it works**
1. Start a chat trigger – enter a seed idea or topic (e.g., "habits," "failure," "technology and purpose").
2. Podcast Idea Agent (GPT-5) instantly crafts a thought-provoking, Rogan- or Bartlett-style episode concept with a clear angle and takeaway.
3. Podcast Script Agent (Claude 4.0 Sonnet) expands that idea into a natural, engaging 60-second opening monologue ready for recording.
4. Text-to-Speech via ElevenLabs automatically converts the script into a high-quality voice track.
5. Email automation sends the finished MP3 directly to your inbox.

**Perfect for**
• Solo creators who want to ideate, script and voice short podcasts effortlessly
• Content teams prototyping daily or weekly audio snippets
• Anyone testing AI-driven storytelling pipelines

**Customization tips**
• Swap ElevenLabs with your preferred TTS service by editing the HTTP Request node.
• Adjust prompt styles for tone or audience in the Idea and Script Agents.
• Modify the Gmail (or other mail service) node to send audio to any destination (Drive, Slack, Notion, etc.).
• For reuse at scale, add variables for episode number, guest name, or theme category — just clone and update the trigger node.

**Watch step-by-step tutorial (how to build it yourself)**
https://www.youtube.com/watch?v=Dan3_W1JoqU

**Requirements & disclaimer**
• Requires API keys for OpenAI + Anthropic + ElevenLabs (or your chosen TTS).
• You're responsible for managing costs incurred through AI or TTS usage.
• Avoid sharing sensitive or private data as input into prompt flows.
• Designed with modularity so you can turn off or swap any stage (idea → script → voice → email) without breaking the chain.
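If you edit the HTTP Request node (for instance to swap TTS providers), it helps to know the shape of the ElevenLabs call. This sketch follows ElevenLabs' public text-to-speech endpoint (`xi-api-key` header, `text` and `model_id` body fields), but the voice ID and model are placeholders you would replace with your own values.

```javascript
// Sketch: assemble the ElevenLabs text-to-speech request.
function buildElevenLabsRequest(apiKey, voiceId, text) {
  return {
    method: "POST",
    url: `https://api.elevenlabs.io/v1/text-to-speech/${voiceId}`,
    headers: { "xi-api-key": apiKey, "Content-Type": "application/json" },
    body: JSON.stringify({ text, model_id: "eleven_multilingual_v2" }),
  };
}

const req = buildElevenLabsRequest("KEY", "voice123", "Welcome to the show.");
```

To swap providers, only this node's URL, headers, and body shape need to change; the upstream idea and script agents stay untouched.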