by Lucas Perret
This workflow monitors G2 review URLs. When a new review is published, it will:

- trigger a Slack notification
- record the review in Google Sheets

To install it, you'll need access to Slack, Google Sheets, and ScrapingBee.

Full guide here: https://lempire.notion.site/Scrape-G2-reviews-with-n8n-3f46e280e8f24a68b3797f98d2fba433?pvs=4
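For context, the scraping step reduces to a single GET through ScrapingBee's HTML API. A minimal sketch, assuming a hypothetical G2 product URL as the target (the workflow's HTTP Request node does the equivalent):

```typescript
// Minimal sketch: fetch a G2 reviews page through ScrapingBee.
// YOUR_API_KEY and the target URL are placeholders.
const params = new URLSearchParams({
  api_key: "YOUR_API_KEY",
  url: "https://www.g2.com/products/some-product/reviews", // hypothetical target
});

const response = await fetch(`https://app.scrapingbee.com/api/v1/?${params}`);
const html = await response.text(); // raw HTML, parsed downstream for new reviews
console.log(html.length);
```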
by Romuald Członkowski
Social Media Intelligence Workflow with Bright Data and OpenAI: get a 360° social media presence report for a person.

**Who's it for**
Business development professionals, recruiters, sales teams, and market researchers who need comprehensive social media intelligence on individuals for lead qualification, due diligence, partnership evaluation, or candidate assessment.

**How it works**
1. Enter the target person's details through the web form (name, company, location)
2. The AI Discovery Agent searches across selected platforms using name variations
3. A profile validator verifies discovered profiles with confidence scoring (see the sketch at the end of this section)
4. Platform-specific agents analyze each profile using Bright Data MCP tools
5. GPT-4 synthesizes all data into a comprehensive intelligence report
6. The report is automatically generated as a formatted Google Doc with a direct link

**Requirements**
- Bright Data MCP account with PRO access (get your Bright Data API key here)
- OpenAI API key (or alternative LLM provider)
- Google Drive OAuth connection for report delivery
- n8n self-hosted instance or cloud account

**How to set up**
1. Update Bright Data credentials:
   - Find the "Bright Data MCP" node (look for the red warning note)
   - Replace YOUR_BRIGHT_DATA_TOKEN_HERE with your actual token
   - Update UNLOCKER_CODE_HERE with your unlocker code
2. Update Google Drive settings: find the "Create Empty Google Doc" node and select the target folder there
3. Configure your LLM credentials (OpenAI or alternative)
4. Test with your own name using "Basic" search depth

Watch the YouTube tutorial.

**How to customize the workflow**
- **Add platforms**: extend the Switch node with new cases and create corresponding prompt builders
- **Modify analysis depth**: edit the platform-specific prompt builders to focus on different metrics
- **Change report format**: update the final LLM Chain prompt to adjust the report structure
- **Add notifications**: insert Slack or email nodes after report generation
- **Adjust confidence thresholds**: modify the validators to change profile verification requirements
- **Alternative outputs**: replace Google Docs with PDF, Excel, or a webhook to your CRM
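To make the confidence-scoring idea concrete, here is a purely hypothetical sketch; the field names, weights, and threshold are illustrative and not the template's actual logic:

```typescript
// Hypothetical confidence scoring for a discovered profile.
// Fields and weights are illustrative only.
interface Candidate {
  nameMatch: boolean;     // display name matches a known name variation
  companyMatch: boolean;  // bio/employer mentions the target company
  locationMatch: boolean; // profile location matches the form input
}

function confidence(c: Candidate): number {
  let score = 0;
  if (c.nameMatch) score += 0.5;
  if (c.companyMatch) score += 0.3;
  if (c.locationMatch) score += 0.2;
  return score; // e.g. keep profiles scoring >= 0.7
}

console.log(confidence({ nameMatch: true, companyMatch: true, locationMatch: false })); // 0.8
```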
by Janak Patel
**Who's it for**
This template is ideal for YouTube video creators who spend a lot of time manually generating SEO assets like descriptions, tags, titles, keywords, and thumbnails. If you're looking to automate your YouTube SEO workflow, this is the perfect solution for you.

**How it works / What it does**
1. Connect a Google Sheet to n8n and pull in the Hindi script (or any language).
2. Use OpenAI to generate SEO content: video description, tags, keywords, titles, thumbnail titles, etc. (see the sketch at the end of this section).
3. Use the generated description as input to create a thumbnail image using an image generation API.
4. Store all outputs in the same Google Sheet in separate columns.
5. Optionally, use tools like VidIQ or TubeBuddy to test the SEO strength of the generated titles, tags, and keywords.

💡 Note: This example uses Runway's image generation API, but you can plug in any other image-generation service of your choice.

**Requirements**
- A Google Sheet with clearly named columns
- Hindi, English, or other language scripts in the sheet
- OpenAI API key
- Runway API key (or any other image generation API)

**How to set up**
- You can set up this workflow in 15 minutes by following the pre-defined steps.
- Replace the manual Google Sheet trigger with a scheduled trigger for daily or timed automation.
- You may also swap Google Sheets for any database or data source of your choice.
- No Google Sheets API required.
- Minimal JavaScript or Python knowledge is required for advanced customizations.
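The SEO-generation step is a single chat-completion call that asks the model for structured JSON. A minimal sketch against the OpenAI Chat Completions API; the prompt wording and output keys are illustrative, not the template's exact prompt:

```typescript
// Minimal sketch: generate SEO fields from a script via OpenAI.
// OPENAI_API_KEY is an environment variable; the prompt and keys are illustrative.
const scriptFromSheet = "…video script text pulled from the Google Sheet…";

const res = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify({
    model: "gpt-4o-mini",
    response_format: { type: "json_object" }, // force valid JSON output
    messages: [
      {
        role: "user",
        content:
          "From this video script, return JSON with keys: description, tags, " +
          "keywords, titles, thumbnail_title.\n\n" + scriptFromSheet,
      },
    ],
  }),
});

const seo = JSON.parse((await res.json()).choices[0].message.content);
console.log(seo.titles); // each field maps to its own Google Sheet column
```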
by Robert Breen
Give business users a chat box; get back valid BigQuery SQL and live query results. The workflow:

1. Captures a plain-language question from a chat widget or internal portal.
2. Fetches the current table + column schema from your BigQuery dataset (via INFORMATION_SCHEMA).
3. Feeds both the schema and the question to GPT-4o so it can craft a syntactically correct SQL query using only fields that truly exist.
4. Executes the AI-generated SQL in BigQuery and returns the results.
5. Stores a short-term memory by session, enabling natural follow-up questions.

Perfect for analysts, customer-success teams, or any stakeholder who needs data without writing SQL.

**⚙️ Setup Instructions**

1. Import the workflow: n8n → Workflows → Import from File (or Paste JSON) → Save
2. Add credentials:

| Service | Where to create credentials | Node(s) to update |
|---------|-----------------------------|-------------------|
| OpenAI | <https://platform.openai.com> → Create API key | OpenAI Chat Model |
| Google BigQuery | Google Cloud Console → IAM & Admin → Service Account JSON key | Google BigQuery (schema + query) |

3. Point the schema fetcher to your dataset. In the Google BigQuery1 node you'll see:

```sql
SELECT table_name, column_name, data_type
FROM n8nautomation-453001.email_leads_schema.INFORMATION_SCHEMA.COLUMNS
```

Replace n8nautomation-453001.email_leads_schema with YOUR_PROJECT.YOUR_DATASET. Keep the rest of the query the same; BigQuery's INFORMATION_SCHEMA always surfaces table_name, column_name, and data_type.

4. Update the execution node: open Google BigQuery (the second BigQuery node) and select your project in Project ID. The SQL Query field is already {{ $json.output.query }}, so it will run whatever the AI returns.
5. (Optional) Embed the chat interface.
6. Test end-to-end: open the embedded chat widget and ask, "How many distinct email leads were created last week?" After a few seconds the workflow will return a table of results, or an error if the schema lacks the requested fields. Ask specific questions about your data.
7. Activate: toggle Active so the chat assistant is available 24/7.

**🧩 Customization Ideas**
- **Row-limit safeguard**: automatically append LIMIT 1000 to every query (see the sketch at the end of this section).
- **Chart rendering**: send query results to Google Sheets + Looker Studio for instant visuals.
- **Slack bot**: forward both the question and the SQL result to a Slack channel for team visibility.
- **Schema caching**: store the INFORMATION_SCHEMA result for 24 hours to cut BigQuery costs.

**Contact**
- **Email:** rbreen@ynteractive.com
- **Website:** https://ynteractive.com
- **YouTube:** https://www.youtube.com/@ynteractivetraining
- **LinkedIn:** https://www.linkedin.com/in/robertbreen
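The row-limit safeguard mentioned under Customization Ideas fits in a few lines of a Code node. A sketch of one way to do it, as a hypothetical helper that is not part of the shipped template:

```typescript
// Hypothetical safeguard: append LIMIT 1000 to AI-generated SQL
// unless the model already set a limit. Naive string check, illustrative only.
function withRowLimit(sql: string, max = 1000): string {
  const trimmed = sql.trim().replace(/;+$/, ""); // drop trailing semicolons
  return /\blimit\s+\d+\s*$/i.test(trimmed) ? trimmed : `${trimmed} LIMIT ${max}`;
}

console.log(withRowLimit("SELECT email FROM leads"));
// "SELECT email FROM leads LIMIT 1000"
console.log(withRowLimit("SELECT email FROM leads LIMIT 10"));
// unchanged: "SELECT email FROM leads LIMIT 10"
```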
by Ranjan Kumar
**Who's it for**
This template is ideal for creators, bloggers, and automation enthusiasts who want to auto-generate blog posts from AI-generated content without lifting a finger. Whether you're running a tech blog or an AI newsletter, or just want to keep your WordPress site fresh, this workflow does the heavy lifting.

**How it works**
This n8n workflow automatically publishes WordPress posts using trending content from Reddit RSS feeds (like /r/artificial and /r/MachineLearning), enhanced with AI writing and royalty-free images.

1. **RSS Feed Trigger**: fetches new Reddit posts every minute from multiple AI-related subreddits.
2. **AI Blog Writer**: uses an LLM (Groq / GPT-4o) to convert Reddit titles + content into a full blog article (title, content, category, tags, image keyword).
3. **Image Generator**: queries the Pexels API using the keyword provided by the AI to fetch a relevant blog image (see the sketch at the end of this section).
4. **Category & Tag Manager**: automatically creates or reuses categories and tags in WordPress.
5. **WordPress Publisher**: posts the article in draft or published form, complete with featured image and metadata.

Everything is dynamically generated; no hardcoded text or API keys!

**How to set up**
Estimated time: 15–20 minutes

You'll need:
- 🧠 Groq or OpenAI API key (for AI article generation)
- 🖼️ Pexels API key (for fetching featured images)
- 📰 WordPress API credentials (with media + post permissions)

Customization via sticky notes:
- Choose your own RSS feeds (or subreddit URLs)
- Modify the AI prompt to match your writing style
- Set post status (draft or publish)
- Add your WordPress API URL and credentials

**Requirements**
- Free n8n account (or self-hosted instance)
- API credentials (Groq/OpenAI, Pexels, WordPress)
- Working WordPress site with REST API access
- Sticky notes explaining: setup instructions, the AI prompt format, and the required credential names
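The image step is one authenticated GET against the Pexels search endpoint. A minimal sketch, assuming "machine learning" as an example of the keyword the AI step produces:

```typescript
// Minimal sketch: fetch one royalty-free image for the AI-chosen keyword.
// PEXELS_API_KEY is an environment variable; the keyword is an example.
const keyword = "machine learning";
const res = await fetch(
  `https://api.pexels.com/v1/search?query=${encodeURIComponent(keyword)}&per_page=1`,
  { headers: { Authorization: process.env.PEXELS_API_KEY! } },
);

const data = await res.json();
const imageUrl = data.photos?.[0]?.src?.large; // used as the post's featured image
console.log(imageUrl);
```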
by Don Jayamaha Jr
Get real-time MEXC Spot Market data instantly in Telegram! This workflow connects the MEXC REST v3 API with Telegram and optional GPT-4.1-mini formatting, providing users with the latest prices, 24h stats, order book depth, trades, and candlesticks in structured, Telegram-ready messages.

**🔎 How It Works**
1. A Telegram Trigger node listens for commands.
2. User Authentication ensures only authorized Telegram IDs can access the bot.
3. A Session ID is generated from chat.id for lightweight memory.
4. The MEXC AI Agent coordinates multiple API calls via HTTP nodes (one of these calls is sketched at the end of this section):
   - Ticker (Latest Price) → /api/v3/ticker/price?symbol=BTCUSDT
   - 24h Stats → /api/v3/ticker/24hr?symbol=BTCUSDT
   - Order Book Depth → /api/v3/depth?symbol=BTCUSDT&limit=50
   - Best Bid/Ask Snapshot → /api/v3/ticker/bookTicker?symbol=BTCUSDT
   - Candlesticks (Klines) → /api/v3/klines?symbol=BTCUSDT&interval=15m&limit=200
   - Recent Trades → /api/v3/trades?symbol=BTCUSDT&limit=100
5. Utility nodes refine the data:
   - Calculator → spreads, averages, mid-prices.
   - Think → formats raw JSON into human-readable summaries.
   - Simple Memory → saves symbol, sessionId, and context across turns.
6. A Message Splitter prevents Telegram messages from exceeding 4,000 characters.
7. Results are sent back to Telegram as structured, readable reports.

**✅ What You Can Do with This Agent**
- Get latest prices & 24h stats for any spot pair.
- Retrieve order book depth (customizable levels).
- Monitor best bid/ask quotes for spreads.
- View candlestick OHLCV data for multiple timeframes.
- Check recent trades (up to 100).
- Receive clean Telegram reports, no raw JSON.

**🛠️ Setup Steps**
1. Create a Telegram bot: use @BotFather to create a bot and copy its API token.
2. Configure in n8n:
   - Import MEXC AI Agent v1.02.json.
   - Update the User Authentication node with your Telegram ID.
   - Add Telegram API credentials (bot token).
   - Add an OpenAI API key.
   - (Optional) Add a MEXC API key.
3. Deploy & test:
   - Activate the workflow in n8n.
   - Send a query like BTCUSDT to your bot.
   - Instantly receive structured MEXC Spot Market data in Telegram.

**📤 Output Rules**
- Output is grouped into Price, 24h Stats, Order Book, Candlesticks, and Trades.
- No raw JSON; formatted summaries only.
- Complies with Telegram's 4,000-character message limit (auto-split).

**📺 Setup Video Tutorial**
Watch the full setup guide on YouTube.

⚡ Unlock real-time MEXC Spot Market insights in Telegram: clean, fast, and API-key free.

**🧾 Licensing & Attribution**
© 2025 Treasurium Capital Limited Company. Architecture, prompts, and trade report structure are IP-protected. No unauthorized rebranding permitted.
🔗 For support: Don Jayamaha – LinkedIn
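As a reference for the endpoints listed under How It Works, each is a plain public GET. A sketch of the latest-price lookup (BTCUSDT as the sample symbol; the base URL is assumed to be MEXC's public api.mexc.com host):

```typescript
// Sketch: the latest-price lookup the agent performs, here for BTCUSDT.
const symbol = "BTCUSDT";
const res = await fetch(`https://api.mexc.com/api/v3/ticker/price?symbol=${symbol}`);
const { price } = await res.json(); // e.g. { "symbol": "BTCUSDT", "price": "..." }
console.log(`${symbol}: ${price}`);
```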
by Don Jayamaha Jr
Instantly access Upbit Spot Market data in Telegram with AI automation. This workflow integrates the Upbit REST API with GPT-4o-mini and Telegram, giving you real-time price data, order books, trades, and candles directly in chat. Perfect for crypto traders, market analysts, and investors who want structured Upbit data at their fingertips, no manual API calls required.

**⚙️ How It Works**
1. A Telegram bot listens for user queries like upbit KRW-BTC 15m.
2. The Upbit AI Agent parses the request and fetches live data from the official Upbit REST API (the ticker call is sketched at the end of this section):
   - Price & 24h stats (/v1/ticker)
   - Order book depth & best bid/ask (/v1/orderbook)
   - Recent trades (/v1/trades/ticks)
   - Dynamic OHLCV candles across all timeframes (/v1/candles/{seconds|minutes|days|weeks|months|years})
3. A built-in Calculator tool computes spreads, % change, and midpoints.
4. A Think module reshapes raw JSON into simplified, clean fields.
5. The agent formats results into concise, structured text and sends them back via Telegram.

**📊 What You Can Do with This Agent**
- ✅ Get real-time prices and 24h change for any Upbit trading pair.
- ✅ View order book depth and best bid/ask snapshots.
- ✅ Fetch multi-timeframe OHLCV candles (from 1s to 1y).
- ✅ Track recent trades with price, volume, side, and timestamp.
- ✅ Calculate midpoints, spreads, and percentage changes.
- ✅ Receive clean, human-readable reports in Telegram, no JSON parsing needed.

**🛠 Set Up Steps**
1. Create a Telegram bot: use @BotFather and save your bot token.
2. Configure Telegram API and OpenAI in n8n:
   - Add your bot token under Telegram credentials.
   - Replace your Telegram ID in the authentication node to restrict access.
3. Import the workflow:
   - Load Upbit AI Agent v1.02.json into n8n.
   - Ensure connections to the tools (Ticker, Orderbook, Trades, Klines).
4. Deploy and test:
   - Example query: upbit KRW-BTC 15m → returns price, order book, candles, and trades.
   - Example query: upbit USDT-ETH trades 50 → returns the 50 latest trades.

**📺 Setup Video Tutorial**
Watch the full setup guide on YouTube.

⚡ Unlock clean, structured Upbit Spot Market data instantly, directly in Telegram!

**🧾 Licensing & Attribution**
© 2025 Treasurium Capital Limited Company. Architecture, prompts, and trade report structure are IP-protected. No unauthorized rebranding permitted.
🔗 For support: Don Jayamaha – LinkedIn
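The underlying Upbit calls listed under How It Works are simple public GETs. A sketch of the /v1/ticker lookup for KRW-BTC (the pair from the example query); the response field names follow Upbit's public API:

```typescript
// Sketch: public ticker lookup for KRW-BTC on Upbit.
const res = await fetch("https://api.upbit.com/v1/ticker?markets=KRW-BTC");
const [ticker] = await res.json(); // the endpoint returns an array, one entry per market

// trade_price = last traded price; signed_change_rate = 24h change as a fraction
console.log(ticker.trade_price, (ticker.signed_change_rate * 100).toFixed(2) + "%");
```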
by Yahor Dubrouski
**📌 How it works**
This workflow turns voice or text messages from Telegram into structured tasks in Notion, using AI-powered intent detection and task generation. It supports:
- 🆕 Task creation
- ✏️ Task updates (like changing priority, title, or deadline)
- 🧠 Task analysis (e.g., workload, goal alignment, emotional fatigue)

The assistant uses OpenAI to:
- Detect intent (create, update, or analyze); see the sketch at the end of this section
- Extract or update task fields (title, priority, due date, etc.)
- Auto-format list-style descriptions with bullet points
- Detect relevant tags like health, money, sport, etc.

**⚙️ Setup steps**
1. Clone the GitHub repo or import the .json into n8n manually.
2. Configure:
   - OpenAI credentials
   - Telegram Bot Token
   - Notion credentials
3. Use Telegram to send messages like:
   - "Create a task to call mom tomorrow"
   - "Update the grocery task to add milk"
   - "Am I overbooked today?"
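Intent detection here amounts to asking the model to classify each message into a small JSON contract. A hypothetical sketch of such a contract; the template's actual prompt and field names may differ:

```typescript
// Hypothetical shape of the intent-detection output the OpenAI step could return.
type Intent = "create" | "update" | "analyze";

interface ParsedTask {
  intent: Intent;
  title?: string;                          // e.g. "Call mom"
  priority?: "low" | "medium" | "high";
  dueDate?: string;                        // ISO date, e.g. "2025-06-02"
  tags?: string[];                         // e.g. ["health", "family"]
}

// "Create a task to call mom tomorrow" might parse to:
const example: ParsedTask = {
  intent: "create",
  title: "Call mom",
  dueDate: "2025-06-02",
  tags: ["family"],
};
console.log(example);
```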
by MANISH KUMAR
This AI Blog Generator is an advanced n8n-powered automation workflow that leverages Google Gemini and Google Sheets to generate SEO-friendly blog articles for Shopify products. It automates the entire process, from fetching product data to creating structured HTML content, with zero manual effort.

**💡 Key Advantages**
Our AI Blog Generator offers five core advantages that make it the perfect solution for automated content creation:
- 🔗 **Shopify Product Sync**: automatically pulls product data (titles, descriptions, images, etc.) via the Shopify API.
- ✍️ **SEO Blog Generation**: Gemini generates blog titles, meta descriptions, and complete articles using product information.
- 🗂️ **Structured Content Output**: creates well-formatted HTML with headers and bullet points for seamless Shopify blog integration.
- 📄 **Google Sheets Integration**: tracks blog creation and prevents duplicate publishing using a centralized Google Sheet.
- 📤 **Shopify Blog API Integration**: publishes the generated blog to Shopify with a single API call (sketched at the end of this section).

**⚙️ How It Works**
The workflow follows a systematic 8-step process that ensures quality and efficiency:
1. **Manual Trigger**: start the workflow via a test trigger or scheduler.
2. **Fetch Products from Shopify**: retrieves all product details, including images and descriptions.
3. **Fix Input Format**: organizes and updates the input table using Code and Google Sheet nodes.
4. **Filter Duplicates**: ensures no previously used rows are processed again.
5. **Limit Control**: processes one row at a time and loops until all blogs are posted.
6. **Gemini AI Generation**: creates SEO-friendly blog content in HTML format from product data.
7. **HTML Structure Fix**: adjusts content for JSON compatibility by cleaning unsupported HTML tags.
8. **Article API Posting**: sends the finalized blog content to Shopify for publishing or drafting.

**🛠️ Setup Steps**
To implement this workflow, you'll need to configure the following n8n nodes:
- **Trigger Node**: starts the workflow instantly.
- **Shopify Node**: fetches product details.
- **Google Sheet Node**: stores input/output data and tracks blog creation status.
- **Code Node**: formats data as required.
- **Filter Node**: removes used rows to avoid duplication.
- **Limit Node**: processes one blog at a time.
- **Agent Node**: sends the prompt to Gemini and returns parsed SEO-ready content.
- **HTTP Node**: posts content to Shopify via the API.

**🔐 Credentials Required**
Before running the workflow, ensure you have the following credentials configured:
- **Shopify Access Token**: for fetching products and posting blogs
- **Gemini API Key**: for AI-powered blog generation
- **Google Sheets OAuth**: for logging and tracking workflow data

**👤 Ideal For**
This automation workflow is specifically designed for:
- **Ecommerce teams** automating blogs for hundreds of products
- **Shopify store owners** boosting organic traffic effortlessly
- **Marketing teams** building scalable, AI-driven content workflows

**💬 Bonus Tip**
The workflow is fully modular and highly customizable. You can easily extend it for:
- **Internal linking** between related products
- **Multi-language translation** for global markets
- **Social media sharing** automation
- **Email marketing** integration

All extensions can be implemented within the same n8n flow, making it a comprehensive content automation solution.
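The final posting step targets Shopify's REST Admin Article endpoint. A minimal sketch of what the HTTP Node sends; the store domain, blog ID, API version, and article body are placeholders:

```typescript
// Minimal sketch: publish a generated article via Shopify's REST Admin API.
// STORE, BLOG_ID, the API version, and the token are placeholders.
const res = await fetch(
  "https://STORE.myshopify.com/admin/api/2024-01/blogs/BLOG_ID/articles.json",
  {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Shopify-Access-Token": process.env.SHOPIFY_ACCESS_TOKEN!,
    },
    body: JSON.stringify({
      article: {
        title: "Generated blog title",
        body_html: "<h2>…</h2><p>Gemini-generated HTML goes here</p>",
        published: false, // keep as a draft for review
      },
    }),
  },
);

console.log((await res.json()).article?.id);
```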
by Jose Bossa
**👥 Who's it for**
This workflow is perfect for businesses or individuals who want to automate WhatsApp conversations 💬 with an intelligent AI chatbot that can handle text, voice notes 🎵, and images 🖼️. No advanced coding required!

**🤖 What it does**
It automatically receives WhatsApp messages through WasenderAPI, intelligently buffers consecutive messages to avoid fragmented responses, processes multimedia content (transcribing audio and analyzing images with AI), and responds naturally using GPT-4o mini with conversation memory. All while protecting your WhatsApp account from being banned.

**⚙️ How it works**
1. 📱 Webhook Trigger – receives new messages from WasenderAPI
2. 🗃️ Redis Buffer System – groups consecutive messages intelligently (7-second window; see the sketch at the end of this section)
3. 🔀 Content Classifier – routes messages by type (text, audio, or image)
4. 🎵 Audio Processing – decrypts and transcribes voice notes using OpenAI Whisper
5. 🖼️ Image Analysis – decrypts and analyzes images with GPT-4o Vision
6. 🧠 AI Agent (GPT-4o mini) – generates intelligent responses with 10-message memory
7. ⏱️ Anti-Ban Wait – 6-second delay to simulate human typing
8. 📤 Message Sender – delivers the response back to the WhatsApp user

**📋 Requirements**
- WasenderAPI account with a connected WhatsApp number: https://wasenderapi.com/
- Redis database (free tier works fine)
- OpenAI API key with access to GPT-4o mini and Whisper
- n8n's AI Agent, LangChain, and Redis nodes

**🛠️ How to set up**
1. Create your WasenderAPI account and connect a WhatsApp number
2. Set up a free Redis database and get the connection credentials
3. Configure your OpenAI API key in n8n credentials
4. Replace the WasenderAPI Bearer token in the "Get the audio", "Get the photo", and "Send Message to User" nodes
5. Change the Manual Trigger to a Webhook and configure it in WasenderAPI
6. Customize the AI Agent prompt to match your business needs
7. Adjust wait times if needed (default: 6 seconds for responses, 7 seconds for the buffer)
8. Save and activate the workflow ✅

**🎨 How to customize**
- Modify the AI Agent prompt to change the bot's personality and instructions
- Adjust the buffer wait time (7 seconds) for faster/slower message grouping
- Change the response delay (6 seconds) based on your use case; 30 seconds is recommended
- Add more content types (documents, videos) by extending the Switch Type node
- Configure the conversation memory window (default: 10 messages)
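The Redis buffer works by appending each incoming message to a per-chat list and only flushing after a quiet window. A hypothetical sketch of the idea using ioredis; the key naming and flush strategy are illustrative, and the workflow itself implements this with n8n's Redis and Wait nodes:

```typescript
// Hypothetical sketch of the 7-second message buffer, using ioredis.
import Redis from "ioredis";

const redis = new Redis(); // defaults to localhost:6379

async function bufferMessage(chatId: string, text: string): Promise<string[] | null> {
  const key = `buffer:${chatId}`;
  await redis.rpush(key, text);   // append this message to the chat's buffer
  await redis.expire(key, 60);    // safety TTL so stale buffers vanish

  await new Promise((r) => setTimeout(r, 7000)); // 7-second quiet window

  // Only the call that saw the last message flushes the buffer;
  // earlier calls return null because a newer message superseded them.
  const messages = await redis.lrange(key, 0, -1);
  if (messages[messages.length - 1] !== text) return null;
  await redis.del(key);
  return messages; // joined and sent to the AI Agent as one turn
}
```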
by Artem Boiko
Estimate material price and total cost for grouped BIM/CAD elements using an LLM-driven analysis. The workflow accepts an existing XLSX (from your model) or, if missing, can trigger a local RvtExporter.exe to generate one. It enriches each element group with quantities, pricing, and confidence, and produces a multi-sheet Excel report plus a polished HTML executive report.

**What it does**
- **Reads grouped element data** (from XLSX or extracted via RvtExporter.exe).
- Builds enhanced prompts with clear rules (volumes/areas are already aggregated per group); a sketch of this idea appears at the end of this section.
- Calls your selected LLM (OpenAI/Anthropic/etc.) to identify materials, pick the pricing unit, and estimate the price per unit and total cost.
- Parses the AI output, adds per-group KPIs (cost %, rank), and aggregates **project totals** (by material, by category).
- Exports a multi-sheet XLSX and an HTML executive report (charts, KPIs, top groups).

**Prerequisites**
- **LLM credentials** for your chosen provider (e.g., OpenAI, Anthropic). Enable exactly one chat node and connect credentials.
- **Windows host** only if you want to auto-extract from .rvt/.ifc via RvtExporter.exe. If you already have an XLSX, Windows is **not** required.
- Optional: internet access on the LLM side for price lookups (model/tooling dependent).

**How to use**
1. Import this JSON into n8n.
2. Open the Setup node(s) and set:
   - project_file: path to your .rvt/.ifc or to an existing grouped *_rvt.xlsx
   - path_to_converter: C:\\DDC_Converter_Revit\\datadrivenlibs\\RvtExporter.exe (optional)
   - country: used to guide price sources/standards (e.g., Germany)
3. In the canvas, enable one LLM node (e.g., OpenAI or Anthropic) and connect credentials; keep the others disabled.
4. Execute the workflow (Manual Trigger). It will detect or build the XLSX, run the analysis, then write the Excel file and open the HTML report.

**Outputs**
- **Excel**: Price_Estimation_Report_YYYY-MM-DD.xlsx with sheets: Summary, Detailed Elements, Material Summary, Top 10 Groups
- **HTML**: executive report with charts (project totals, top materials, top groups).
- Per-group fields include: Material (EU/DE/US), Quantity & Unit, Price per Unit (EUR), Total Cost (EUR), Assumptions, Confidence.

**Notes & tips**
- Quantities in the input are already aggregated per group; do not multiply by element count.
- If you prefer XLSX-only extraction, run your converter with a -no-collada flag upstream.
- Keep ASCII-safe paths and ensure write permissions to the output folder.

**Categories**: Data Extraction · Files & Storage · ETL · CAD/BIM · Cost Estimation

**Tags**: cad-bim, price-estimation, cost, revit, ifc, xlsx, html-report, llm, materials, qto

**Author**
DataDrivenConstruction.io
info@datadrivenconstruction.io

**Consulting and Training**
We work with leading construction, engineering, and consulting agencies and technology firms around the world to help them implement open-data principles, automate CAD/BIM processing, and build robust ETL pipelines. If you would like to test this solution with your own data, or are interested in adapting the workflow to real project tasks, feel free to contact us.

Docs & Issues: Full Readme on GitHub
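To picture the "enhanced prompt" step, here is a hypothetical sketch that assembles one group's data plus the aggregation rule into a single instruction. The field names are illustrative, not the workflow's actual data model:

```typescript
// Hypothetical prompt builder for one element group; field names are illustrative.
interface ElementGroup {
  name: string;           // e.g. "Basic Wall: Concrete 200mm"
  category: string;       // e.g. "Walls"
  totalVolumeM3: number;  // already aggregated across the group
  elementCount: number;
}

function buildPrompt(g: ElementGroup, country: string): string {
  return [
    `Estimate material and cost for this BIM element group in ${country}.`,
    `Group: ${g.name} (category: ${g.category}, ${g.elementCount} elements)`,
    `Aggregated volume: ${g.totalVolumeM3} m3.`,
    // Mirrors the workflow's key rule: quantities are per group, not per element.
    `Rule: quantities are already totals for the group; do NOT multiply by element count.`,
    `Return JSON: { material, unit, price_per_unit_eur, total_cost_eur, assumptions, confidence }.`,
  ].join("\n");
}

console.log(buildPrompt(
  { name: "Basic Wall: Concrete 200mm", category: "Walls", totalVolumeM3: 42.5, elementCount: 18 },
  "Germany",
));
```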
by Dayong Huang
**How it works**
This template creates a fully automated Twitter content system that discovers trending topics, analyzes why they're trending using AI, and posts intelligent commentary about them. The workflow uses MCP (Model Context Protocol) with the twitter154 MCP server from MCPHub to connect with Twitter APIs, and leverages OpenAI GPT models to generate brand-safe, engaging content about current trends.

**Key Features:**
- 🔍 **Smart Trend Discovery**: automatically finds US trending topics with engagement scoring
- 🤖 **AI-Powered Analysis**: uses GPT to explain "why it's trending" in 30–60 words
- 📊 **Duplicate Prevention**: a MySQL database tracks posted trends with 3-day cooldowns (see the sketch at the end of this section)
- 🛡️ **Brand Safety**: filters out NSFW content and low-quality hashtags
- ⚡ **Rate Limiting**: built-in delays to respect API limits
- 🐦 **Powered by twitter154**: uses the robust "Old Bird" MCP server for comprehensive Twitter data access

**Set up steps**
Setup time: ~10 minutes

Prerequisites:
- OpenAI API key for GPT models
- Twitter API access for posting
- MySQL database for trend tracking
- **MCP server access**: twitter154 from aigeon-ai via MCPHub

Configuration:
1. Set up the MCP integration with the twitter154 server endpoint: https://api.mcphub.com/mcp/aigeon-ai-twitter154
2. Configure credentials for the OpenAI, Twitter, and MySQL connections
3. Set up authentication for the twitter154 MCP server (Header Auth required)
4. Create the MySQL table for the keyword registry (schema provided in the workflow)
5. Test the workflow with manual execution before enabling automation
6. Set a schedule for automatic trend discovery (recommended: every 2–4 hours)

**MCP Server Features Used:**
- **Search Tweets**: core functionality for trend analysis
- **Get Trends Near Location**: discovers trending topics by geographic region
- **AI Tools**: leverages sentiment analysis and topic classification capabilities

**Customization Options:**
- Modify the trend scoring criteria in the AI agent prompts
- Adjust cooldown periods in the database queries
- Change the target locale from US to other regions (WOEID configuration)
- Customize tweet formatting and content style
- Configure different MCP server endpoints if needed

**Perfect for:** social media managers, content creators, and businesses wanting to stay current with trending topics while maintaining consistent, intelligent posting schedules.

**Powered by:** the twitter154 MCP server ("The Old Bird") provides robust access to Twitter data including tweets, user information, trends, and AI-powered text analysis tools.
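The duplicate-prevention step boils down to a keyword table plus a cooldown check. A purely hypothetical sketch of what that could look like; the workflow ships its own schema, which may differ:

```typescript
// Hypothetical keyword registry and 3-day cooldown check; the workflow's
// bundled schema may differ. SQL is embedded as strings for use with a
// MySQL client such as mysql2.
export const createTable = `
  CREATE TABLE IF NOT EXISTS posted_trends (
    keyword    VARCHAR(255) PRIMARY KEY,
    posted_at  DATETIME NOT NULL
  )`;

// A trend is eligible only if never posted, or last posted more than 3 days ago.
export const cooldownCheck = `
  SELECT keyword FROM posted_trends
  WHERE keyword = ? AND posted_at > NOW() - INTERVAL 3 DAY`;
// empty result => safe to post; afterwards, upsert the row with posted_at = NOW()
```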