by vinci-king-01
## How it works
This workflow automatically discovers and analyzes backlinks for any website, providing comprehensive SEO insights and competitive intelligence using AI-powered analysis.

### Key Steps
1. **Website Input** - Accepts target URLs via webhook or manual input for backlink analysis.
2. **Backlink Discovery** - Scrapes and crawls the web to find all backlinks pointing to the target website.
3. **AI-Powered Analysis** - Uses GPT-4 to analyze backlink quality, relevance, and SEO impact.
4. **Data Processing & Categorization** - Cleans, validates, and automatically categorizes backlinks by type, authority, and relevance.
5. **Database Storage** - Saves processed backlink data to a PostgreSQL database for ongoing analysis and reporting.
6. **API Response** - Returns a structured summary with backlink counts, domain authority scores, and SEO insights.

### Set up steps
Setup time: 8-12 minutes
1. **Configure OpenAI credentials** - Add your OpenAI API key for AI-powered backlink analysis.
2. **Set up PostgreSQL database** - Connect your PostgreSQL database and create the required table structure.
3. **Configure webhook endpoint** - The workflow provides a `/analyze-backlinks` endpoint for URL submissions.
4. **Customize analysis parameters** - Modify the AI prompt to include your preferred SEO metrics and analysis criteria.
5. **Test the workflow** - Submit a sample website URL to verify the backlink discovery and analysis process.
6. **Set up database table** - Ensure your PostgreSQL database has a `backlinks` table with appropriate columns.
## Features
- **Comprehensive backlink discovery**: Finds all backlinks pointing to target websites
- **AI-powered analysis**: GPT-4 analyzes backlink quality, relevance, and SEO impact
- **Automatic categorization**: Backlinks categorized by type (dofollow/nofollow), authority level, and relevance
- **Data validation**: Cleans and validates backlink data with error handling
- **Database storage**: PostgreSQL integration for data persistence and historical tracking
- **API responses**: Clean JSON responses with backlink summaries and SEO insights
- **Competitive intelligence**: Analyzes competitor backlink profiles and identifies link-building opportunities
- **Authority scoring**: Calculates domain authority and page authority metrics for each backlink
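The automatic categorization step can be sketched as a small function. The field names (`rel`, `domain_authority`, `relevance`) and the authority thresholds below are illustrative assumptions, not the workflow's actual schema:

```python
def categorize_backlink(link):
    """Categorize a backlink by type, authority tier, and topical relevance.

    `link` is a dict with hypothetical keys `rel`, `domain_authority`,
    and `relevance` (0-1); map them to whatever your scraper returns.
    """
    link_type = "nofollow" if "nofollow" in link.get("rel", "") else "dofollow"
    da = link.get("domain_authority", 0)
    if da >= 60:
        authority = "high"
    elif da >= 30:
        authority = "medium"
    else:
        authority = "low"
    relevance = "relevant" if link.get("relevance", 0) >= 0.5 else "off-topic"
    return {"type": link_type, "authority": authority, "relevance": relevance}
```

The categorized dict maps directly onto columns of the `backlinks` table.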
by Stéphane Heckel
## Scanning Email Inbox for Delivery Errors
**Prerequisite:** Automate Personalized Email Campaigns with Google Docs, Sheets, and SMTP.

### How It Works
After running your email campaign, some messages may fail to deliver. This workflow scans your email inbox for delivery errors (e.g., bounced messages), flags problematic email addresses in the Google Sheet, and ensures future campaigns skip them.

### How to Use
1. **Ensure prerequisite workflow**: You should have the Email Campaign Workflow configured and running.
2. **Google Sheet setup**: Use the Google Sheet Template. Identify your document's ID (the string after `/d/` and before `/edit` in the URL).
3. **Configure workflow**: Enter your Google Sheet ID in the settings node. Connect your Google credentials to n8n.
4. **Email inbox**: Set up the `readspamfolder` node to search for bounce/error messages in your mail (e.g., in the Spam or Inbox folders; adjust the label/folder if emails land elsewhere).
5. **Google Sheet update**: Configure the `lookupemail` and `update_err` nodes.

### Requirements
- **Google credentials** to access Gmail and Sheets.
- **Gmail account** (bounce/error messages must be accessible here).
- **n8n version:** Tested with 1.105.2 (Ubuntu).

### Need Help?
Comment on this post, contact me on LinkedIn, or ask in the n8n Community Forum!
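The bounce-detection logic behind the `readspamfolder` scan can be approximated in a few lines. The subject keywords and the assumption that the failed recipient's address appears in the message body are illustrative; real bounce formats vary by provider:

```python
import re

# Common phrases seen in bounce/NDR subjects (non-exhaustive assumption)
BOUNCE_SUBJECT = re.compile(
    r"(undeliver|delivery status|mail delivery|returned mail|failure)", re.I
)
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def extract_bounced_address(subject, body):
    """Return the failed recipient address if the message looks like a bounce,
    otherwise None. The address is what gets flagged in the Google Sheet."""
    if not BOUNCE_SUBJECT.search(subject):
        return None
    match = EMAIL.search(body)
    return match.group(0) if match else None
```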
by Balakrishnan C
## Personal AI Assistant on Telegram

**Who it's for:** This workflow is designed for developers, founders, community managers, and automation enthusiasts who want to bring a personal AI assistant directly into their Telegram chat. It lets users interact naturally, through text or voice messages, and get instant, AI-powered replies without switching apps or opening dashboards.

### ⚡ What It Does / How It Works
- 📥 **Message Trigger:** A Telegram Trigger node listens for incoming messages.
- 🧭 **Smart Routing:** A Switch node decides if the user sent a text or voice message.
- 🗣️ **Voice to Text:** If it's voice, the workflow uses OpenAI Whisper transcription to convert it into text.
- 🧠 **AI Processing:** The text is passed to an AI Agent powered by GPT-4.1-mini to understand intent and craft a response.
- 💬 **Reply:** The bot sends a clean, structured, and polite answer back to the user on Telegram.
- 🧠 **Memory:** A buffer memory node keeps short-term conversation history for a more contextual, human-like experience.

### 🧰 How to Set It Up
1. **Telegram integration**: Create a bot via @BotFather on Telegram (https://telegram.me/BotFather). Add your Telegram API key to n8n credentials. Connect the Telegram Trigger and Send a Message nodes.
2. **OpenAI setup**: Get your API key from platform.openai.com (https://platform.openai.com/api-keys). Configure the OpenAI Chat Model and Transcribe a Recording nodes with GPT-4.1-mini.
3. **Workflow logic**: Use the Switch node to detect message type (text or voice). Route voice messages through transcription before sending them to the AI agent. Add Simple Memory to maintain short conversational context.
4. **Go live**: Activate the workflow. Send a message or voice note to your bot. Get instant replies from your personal AI assistant. 🚀

### 🛡️ Requirements
- n8n (self-hosted or cloud)
- Telegram Bot API key
- OpenAI API key (for GPT-4.1-mini)
- Basic understanding of n8n nodes and connections

### 🌟 Why Use This Workflow
- ✅ **Hands-free experience:** Just talk or type.
- 🧠 **AI-powered responses:** Natural language understanding with GPT.
- ⚡ **Real-time interaction:** Fast replies via Telegram.
- 🔁 **Memory-aware conversations:** Feels more human.
- 🧩 **Modular design:** Easily extend to other AI tools or platforms.
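The Switch node's routing decision reduces to checking which field the incoming Telegram Message object carries (`text` and `voice` per the Telegram Bot API); a minimal sketch:

```python
def route_message(message):
    """Mimic the Switch node: decide whether a Telegram message is text or voice.

    `message` is the Message object from the Telegram Trigger's webhook payload.
    """
    if "voice" in message:
        return "voice"        # send the voice file to Whisper transcription first
    if "text" in message:
        return "text"         # pass the text straight to the AI agent
    return "unsupported"      # photos, stickers, etc. are not handled here
```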
by Robert Breen
Create a Fall 2025 course schedule for each student based on what they've already completed, catalog prerequisites, and term availability (Fall/Both). Reads students from Google Sheets → asks an AI agent to select exactly 5 courses (target 15–17 credits, no duplicates, prereqs enforced) → appends each student's schedule to a schedule tab.

### 🧠 Summary
- **Trigger:** Manual — *"When clicking 'Execute workflow'"*
- **I/O:** Google Sheets in → OpenAI decisioning → Google Sheets out
- **Ideal for:** Registrars, advisors, degree-planning prototypes

### ✅ What this template does
- **Reads:** StudentID, Name, Program, Year, CompletedCourses (pipe-separated CourseIDs) from **Sheet1**
- **Decides:** The AI **Scheduling Agent** chooses 5 courses per student following catalog rules and prerequisites
- **Writes:** Appends StudentID + Schedule strings to the **schedule** worksheet
- **Credits target:** 15–17 total per term
- **Catalog rules** (enforced in the agent prompt):
  - Use Fall or Both courses for Fall 2025
  - Enforce AND prereqs (e.g., CS-102|CS-103 means both)
  - Priority: Major Core → Major Elective → Gen Ed (include Gen Ed if needed)
  - No duplicates or already-completed courses
  - Prefer 200-level progression when prereqs allow

### ⚙️ Setup (only 2 steps)
**1) Connect Google Sheets (OAuth2)**
- In n8n → Credentials → New → Google Sheets (OAuth2), sign in and grant access
- In the Google Sheets nodes, select your spreadsheet and these tabs: Sheet1 (input students), schedule (output)

> Example spreadsheet (replace with your own):
> - Input: .../edit#gid=0
> - Output: .../edit#gid=572766543

**2) Connect OpenAI (API Key)**
- In n8n → Credentials → New → OpenAI API, paste your key
- In the OpenAI Chat Model node, select that credential and a chat model (e.g., gpt-4o)

### 📥 Required input (Sheet1)
- **Columns:** StudentID, Name, Program, Year, CompletedCourses
- **CompletedCourses:** pipe-separated IDs (e.g., GEN-101|GEN-103|CS-101)
- **Program** names should match those referenced in the embedded catalog (e.g., *Computer Science BS*, *Business Administration BBA*, etc.)
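The AND-prereq rule is enforced in the agent prompt, but the same check is easy to express as code against the pipe-separated CompletedCourses column, which is handy for verifying the agent's output:

```python
def parse_completed(completed_courses):
    """Split the pipe-separated CompletedCourses cell into a set of course IDs."""
    return {c.strip() for c in completed_courses.split("|") if c.strip()}

def prereqs_met(prereq_string, completed):
    """AND semantics: 'CS-102|CS-103' means the student needs BOTH courses.

    An empty prereq string means the course has no prerequisites.
    """
    if not prereq_string:
        return True
    return parse_completed(prereq_string) <= completed
```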
### 📤 Output (schedule tab)
- **Columns:** StudentID, Schedule → a selected course string (written one row per course after splitting)

### 🧩 Nodes in this template
*Manual Trigger* → *Get Student Data (Google Sheets)* → *Scheduling Agent (OpenAI)* → *Split Schedule* → *Set Fields* → *Clear sheet* → *Append Schedule (Google Sheets)*

### 🛠 Configuration tips
- If you rename tabs, update both Get Student Data and Append Schedule nodes
- Keep CompletedCourses consistent (use | as the delimiter)
- To store rationale as well, add a column to the output and map it from the agent's JSON

### 🧪 Test quickly
1. Add 2–3 sample student rows with realistic CompletedCourses
2. Run the workflow and verify:
   - 5 course rows per student in schedule
   - Course IDs respect prereqs & Fall/Both availability
   - Credits sum to ~15–17

### 🧯 Troubleshooting
- **Sheets OAuth error:** Reconnect "Google Sheets (OAuth2)" and re-select the spreadsheet & tabs
- **Empty schedules:** Ensure CompletedCourses uses | and that programs/courses align with the provided catalog names
- **Prereq violations:** Check that students actually have all AND-prereqs in CompletedCourses
- **OpenAI errors (401/429):** Verify API key, billing, and rate limits → retry with lower concurrency

### 🔒 Privacy & data handling
- Student rows are sent to OpenAI for decisioning. Remove or mask any fields you don't want shared.
- Google Sheets retains input/output. Use spreadsheet sharing controls to limit access.

### 💸 Cost & performance
- **OpenAI:** Billed per token; cost scales with student count and prompt size
- **Google Sheets:** Free within normal usage limits
- **Runtime:** Typically seconds to a minute depending on rows and rate limits

### 🧱 Limitations & assumptions
- Works for Fall 2025 only (as written). For Spring, update availability rules in the agent prompt
- Assumes the catalog in the agent system message is your source of truth
- Assumes Program names match catalog variants (case/spacing matter for clarity)

### 🧩 Customization ideas
- Add a Max Credits column to cap term credits per student
- Include Rationale in the sheet for advisor review
- Add a "Gen Ed needed?" flag per student to force at least one Gen Ed selection
- Export to PDF or email the schedules to advisors/students

### 🧾 Version & maintenance
- **n8n version:** Tested on recent n8n Cloud builds (2025)
- **Community nodes:** Not required
- **Maintenance:** Update the embedded catalog and offerings each term; keep prerequisites accurate

### 🗂 Tags & category
- **Category:** Education / Student Information Systems
- **Tags:** scheduling, registrar, google-sheets, openai, prerequisites, degree-planning, catalog, fall-term

### 🗒 Changelog
- **v1.0.0** — Initial release: Sheets in/out, Fall 2025 catalog rules, prereq enforcement, 5-course selection, credits target

### 📬 Contact
Need help customizing this (e.g., cohort logic, program-specific rules, adding rationale to the sheet, or emailing PDFs)?
- 📧 rbreen@ynteractive.com
- 🔗 Robert Breen — https://www.linkedin.com/in/robert-breen-29429625/
- 🌐 ynteractive.com — https://ynteractive.com
by Ilyass Kanissi
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## 🤖 AI-Powered Twitter Content Generator
Transform topic ideas into ready-to-post Twitter drafts (text + image) using fresh web data and AI agents.

### 🎯 What does this workflow do?
This end-to-end automation creates complete Twitter posts by:
- Taking your topic input (e.g., "Agentic AI") via a chat interface
- Generating fresh, research-backed content using AI agents:
  - The first agent uses GPT-4.1-mini + Tavily to bypass LLM knowledge limits with real-time web data
  - The second agent creates an optimized prompt for image generation
- Producing custom visuals through OpenAI's gpt-image-1
- Delivering polished drafts (text + image) via Gmail for review

### ⚙️ How it works
1. **User input:** You provide a topic through the chat node.
2. **Content research:** Agent 1 (GPT-4.1-mini + Tavily) researches current web data and generates factually fresh tweet content.
3. **Visual creation:** Agent 2 optimizes a prompt for image generation. An HTTP Request node calls OpenAI's gpt-image-1 model to generate the image, and a Convert to File node converts the base64 string to a file so it can be sent as an attachment.
4. **Delivery:** The Gmail node sends the compiled draft with a text body + image attachment.

### 🔑 Required setup
- Have a verified organization: OpenAI Org Settings
- OpenAI API key: Create a Key Here
- Tavily API key: Get it Here
- Gmail credentials: Google Cloud Console
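The base64-to-file conversion in the visual-creation step is simple to sketch. The `{"data": [{"b64_json": ...}]}` response shape is an assumption based on the OpenAI Images API, so verify it against the current docs:

```python
import base64

def decode_openai_image(response_json):
    """Extract and decode the base64 image from a gpt-image-1 style response.

    Returns raw bytes that n8n's Convert to File node would turn into a
    binary attachment for the Gmail node.
    """
    b64 = response_json["data"][0]["b64_json"]
    return base64.b64decode(b64)
```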
by Robert Breen
This workflow fetches live financial data from SerpApi and generates a daily market recap using OpenAI.

### ⚙️ Setup Instructions
**1️⃣ Set Up SerpApi Connection**
- Create a free account at SerpApi
- Copy your API key from the SerpApi dashboard
- In n8n → Credentials → New → SerpApi, paste your API key → Save
- In the workflow, select your SerpApi credential in the Finance Search node

**2️⃣ Set Up OpenAI Connection**
- Go to OpenAI Platform
- Navigate to OpenAI Billing and add funds to your billing account
- Copy your API key into the OpenAI credentials in n8n

### 🧠 How it works
- **SerpApi Finance Search** → pulls market data (example: S&P 500, ticker ^GSPC)
- **OpenAI Model** → summarizes it into a daily report with a paragraph + key bullet points

### 📬 Contact
Need help customizing (e.g., pulling multiple tickers, exporting to Google Sheets, or sending Slack/Email updates)?
- 📧 robert@ynteractive.com
- 🔗 Robert Breen
- 🌐 ynteractive.com
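The Finance Search node's request boils down to a small parameter dictionary. The `engine` and `q` names follow SerpApi's conventions, but confirm the exact engine name and ticker format for your use case in the SerpApi docs (the `^GSPC` default here just mirrors the example above):

```python
def finance_search_params(api_key, ticker="^GSPC"):
    """Query parameters the Finance Search node would send to SerpApi
    (https://serpapi.com/search). Parameter names are assumptions; check
    the SerpApi finance engine documentation."""
    return {
        "engine": "google_finance",
        "q": ticker,
        "api_key": api_key,
    }
```

To pull multiple tickers, call this once per ticker and feed each result to the summarization prompt.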
by Nabin Bhandari
Appointment Reminder Agent automates phone-call reminders for upcoming appointments by seamlessly connecting Google Calendar with Retell AI. This powerful workflow is designed to help businesses, clinics, and service providers ensure clients never miss their scheduled appointments, reducing no-shows and increasing efficiency.

## Use Cases
- Healthcare practitioners reminding patients of upcoming visits
- Salons, spas, and beauty services confirming bookings
- Consultants, therapists, and coaches sending appointment reminders
- Any service-based business wanting to reduce missed appointments

## Workflow Overview
1. **Trigger** – A scheduler node runs every day at 9 AM (configurable) to start the workflow.
2. **Fetch Events** – Pulls all events scheduled in the next 12 hours from Google Calendar.
3. **Extract Details** – A Code node parses each event's description for: Name, Phone number (must be in E.164 format, e.g., +14155552671), Reason for appointment, Start and end time.
4. **Configure Retell** – Uses credentials to set up `from_number` (Retell-registered phone number) and `agent_id` (Retell agent ID).
5. **Send Call** – Calls Retell AI's API to place a personalized reminder call to the client.

## Setup Instructions
1. Add your Retell API key to n8n credentials (never hardcode it).
2. Add your Google Calendar account to credentials.
3. Set the `from_number` (Retell-registered number).
4. Set the `agent_id` (Retell agent ID).
5. Ensure all calendar event descriptions include the required fields (Name, Phone number, Reason, Start & End times) in the specified format.
6. Adjust the scheduler trigger time if needed.

## Requirements
- Retell AI account with API key
- Registered Retell phone number
- Google Calendar account
- Event descriptions formatted properly with all required details

## Customization Options
- Modify the trigger schedule (e.g., nightly, hourly, or webhook-based).
- Add logging or tracking (e.g., use Google Sheets or Airtable to log call attempts/results).
- Tailor the Retell agent script to suit different appointment types (e.g., "Consultation," "Follow-up," "Service Visit").
- Expand with additional channels (e.g., SMS or email reminders before or after calls).
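The Extract Details Code node and the E.164 requirement can be sketched like this. The `Key: value`-per-line layout is an assumed description format; adapt the parsing to whatever convention your calendar events actually use:

```python
import re

# E.164: leading +, country code 1-9, up to 15 digits total
E164 = re.compile(r"^\+[1-9]\d{1,14}$")

def parse_event_description(description):
    """Parse 'Key: value' lines from a calendar event description and
    validate the phone number before it is handed to the Retell call."""
    fields = {}
    for line in description.splitlines():
        if ":" in line:
            key, value = line.split(":", 1)
            fields[key.strip().lower()] = value.strip()
    if not E164.match(fields.get("phone", "")):
        raise ValueError("Phone must be in E.164 format, e.g. +14155552671")
    return fields
```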
by Cristian Tala Sánchez
## Automate Market Problem Discovery from Reddit with OpenAI

### Who's it for
This workflow is perfect for entrepreneurs, startup founders, product managers, researchers, and market analysts who want to automatically discover and analyze real market problems from online discussions. While this example focuses on identifying issues in the future of work and future of education, it can be fully adapted to detect any type of market pain point by changing the data sources and AI prompts. If you're looking to automate market research, find customer pain points, or detect unmet needs in your industry, this template is for you.

### What it does
This no-code n8n automation:
1. Collects fresh discussions from Reddit (default: Teachers, Education, RemoteWork subreddits).
2. Filters posts by engagement (more than 5 upvotes by default).
3. Uses OpenAI GPT-4.1 to:
   - Detect if a post describes a real market problem.
   - Identify the underlying pain or unmet need.
   - Suggest a practical, tech/AI-based solution.
   - Score the problem's Impact, Confidence, and Ease of prototyping (ICE framework).
4. Saves results to Google Sheets for easy prioritization and action.

This allows you to automate the detection of market opportunities without manually reading through hundreds of posts.

### How it works
- **Schedule Trigger**: Runs the workflow at your preferred interval (daily, hourly, etc.).
- **Reddit Nodes**: Pull posts from the targeted subreddits (you can replace them with any niche communities).
- **Filter Node**: Keeps only posts with engagement above your threshold.
- **OpenAI Node**: Analyzes each post using a **User Prompt** and **System Prompt** in English, returning structured JSON data.
- **Google Sheets Node**: Stores all results with full scoring for later review and prioritization.

### Requirements
- Reddit OAuth2 API credentials.
- OpenAI API key.
- Google Sheets OAuth2 API credentials.
- A Google Sheet with these columns: SubReddit, Title, Content, Short Description, Detected Pain, Possible Solution, Impact, Confidence, Ease, URL, ICE Score

### How to customize
- **Change the market focus**: Replace the example subreddits with your own industry forums or communities.
- **Adjust the filter criteria**: Modify the upvote threshold or other parameters.
- **Edit the AI prompts**: Tailor them to detect specific types of problems (healthtech, fintech, sustainability, etc.).
- **Integrate more sources**: Add Twitter, LinkedIn, or customer support logs for richer insights.

💡 With this automation, you can continuously monitor and capture market pain points, turning raw online discussions into actionable business opportunities.
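A note on the ICE Score column: ICE is commonly computed as the product of the three 1-10 scores, though some teams average them instead. Adjust this sketch to match whichever formula your AI prompt specifies:

```python
def ice_score(impact, confidence, ease):
    """ICE prioritization: product of the three 1-10 scores returned by
    the OpenAI node (higher = prototype first)."""
    for v in (impact, confidence, ease):
        if not 1 <= v <= 10:
            raise ValueError("ICE inputs must be on a 1-10 scale")
    return impact * confidence * ease
```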
by Johnny Rafael
## AI-Enriched Cold Outreach: Research → Draft → QA → Write-back

### What this template does
Automates cold email drafting from a lead list by:
- Enriching each lead with LinkedIn profile, LinkedIn company, and Crunchbase data
- Generating a personalized subject + body with Gemini
- Auto-reviewing with a Judge agent and writing back only APPROVED drafts to your Data Table

### Highlights
- Hands-off enrichment via RapidAPI; raw JSON stored back on each row
- Two-agent pattern: Creative Outreach Agent (draft) + Outreach Email Judge (QA)
- Structured outputs guaranteed by LangChain Structured Output Parsers
- Data Table–native: reads "unprocessed" rows, writes results to the same row
- Async polling with Wait nodes for scraper task results

### How it works (flow)
1. **Trigger:** Manual (replace with Cron if needed)
2. **Fetch leads:** Data Table "Get row(s)" filters rows where email_subject is empty (pending)
3. **Loop:** Split in Batches iterates rows
4. **Enrichment (runs in parallel):**
   - LinkedIn profile: HTTP (company_url) → Wait → Results → Data Table update → linkedin_profile_scrape
   - LinkedIn company: HTTP (company_url) → Wait → Results → Data Table update → linkedin_company_scrape
   - Crunchbase company: HTTP (url_search) → Wait → Results → Data Table update → crunchbase_company_scrape
   - (All calls use host cold-outreach-enrichment-scraper with a RapidAPI key.)
5. **Draft (Gemini):** "Agent One" composes a concise, personalized email using row fields + enrichment + ABOUT ME block. The Structured Output Parser enforces: `{ "email_subject": "text", "email_content": "text" }`
6. **Prep for QA:** "Email Context" maps email_subject, email_content, and email for the judge.
7. **QA (Judge):** "Judge Agent" returns APPROVED or REVISE (brief feedback allowed).
8. **Route:** If APPROVED → Data Table "Update row(s)" writes email_subject + email_body (a.k.a. email_content) back to the row. If REVISE → skipped; the loop continues.

### Required setup
- **Data Table:** "email_linkedin_list" (or your own) with at least: email, First_name, Last_name, Title, Location, Company_Name, Company_site, Linkedin_URL, company_linkedin (if used), Crunchbase_URL, email_subject, email_body, linkedin_profile_scrape, linkedin_company_scrape, crunchbase_company_scrape (string fields for JSON).
- **Credentials:**
  - RapidAPI key for cold-outreach-enrichment-scraper (store securely as a credential, not hardcoded)
  - Google Gemini (PaLM) API configured in the Google Gemini Chat Model node
- **ABOUT ME block:** Replace the sample persona (James / CEO / Company Sample / AI Automations) with your own.

### Nodes used
- **Data Table**
- **HTTP Request**
- **AI Agent**
- **Google Gemini Chat Model**
- **Split in Batches:** Main Loop
- **Set:** RapidAPI-Key

### Customization ideas
- **Process flags:** Add email_generated_at or a processed boolean to prevent reprocessing.
- **Human-in-the-loop:** Send drafts to Slack/Email for a spot check before write-back.
- **Delivery:** After approval, optionally email the draft to the sender for review.

### Quotas & costs
- RapidAPI: Multiple calls per row (three tasks + result polls).
- Gemini: Token usage for generator + judge per row.
- Tune batch size and schedule accordingly.

### Privacy & compliance
You are scraping and storing person/company data. Ensure a lawful basis, respect ToS, and minimize stored data.
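The APPROVED/REVISE gate depends on the generator's structured output being well-formed. A minimal check mirroring the Structured Output Parser's schema, useful if you swap the parser for your own validation:

```python
import json

REQUIRED_KEYS = {"email_subject", "email_content"}

def parse_draft(raw):
    """Validate the drafting agent's JSON output against the schema the
    Structured Output Parser enforces, returning the parsed draft."""
    draft = json.loads(raw)
    missing = REQUIRED_KEYS - draft.keys()
    if missing:
        raise ValueError(f"Draft missing keys: {sorted(missing)}")
    return draft
```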
by sebastian pineda
## 🤖 AI-Powered Hardware Store Assistant with PostgreSQL & MCP
Supercharge your customer service with this conversational AI agent! This n8n workflow provides a complete solution for a hardware store chatbot that connects to a PostgreSQL database in real time. It uses Google Gemini for natural language understanding and the MCP (Model Context Protocol) nodes to securely expose database operations as tools for the AI agent.

### ✨ Key Features
- 💬 **Conversational product queries:** Allow users to ask for products by name, category, description, or even technical notes.
- 📦 **Real-time inventory & pricing:** The agent fetches live data directly from your PostgreSQL database, ensuring accurate stock and price information.
- 💰 **Automatic quote generation:** Ask the agent to create a detailed quote for a list of materials, and it will calculate quantities and totals.
- 🧠 **Smart project advice:** The agent is primed with a system message to act as an expert, helping users calculate materials for projects (e.g., "How much drywall do I need for a 10x12 foot room?").

### 🛠️ Tech Stack & Core Components
**Technologies used**
- 🗄️ **PostgreSQL:** For storing and managing product data.
- ✨ **Google Gemini API:** The large language model that powers the agent's conversational abilities.
- 🔗 **MCP (Model Context Protocol):** Securely exposes database queries as callable tools without exposing credentials directly to the agent.

**n8n nodes used**
- `@n8n/n8n-nodes-langchain.agent`: The core AI agent that orchestrates the workflow.
- `@n8n/n8n-nodes-langchain.chatTrigger`: To start a conversation.
- `@n8n/n8n-nodes-langchain.lmChatGoogleGemini`: The connection to the Google Gemini model.
- `n8n-nodes-base.postgresTool`: Individual nodes for querying products by ID, name, category, etc.
- `@n8n/n8n-nodes-langchain.mcpTrigger`: Exposes the PostgresTool nodes.
- `@n8n/n8n-nodes-langchain.mcpClientTool`: Allows the AI agent to consume the tools exposed by the MCP Trigger.
### 🚀 How to Get Started: Setup & Configuration
Follow these steps to get your AI assistant up and running:
1. **Configure your database:** This template assumes a PostgreSQL database named `bd_ferreteria` with a `productos` table. You can adapt the PostgresTool nodes to match your own schema.
2. **Set up credentials:** Create and assign your PostgreSQL credentials to each of the six PostgresTool nodes. Create and assign your Google Gemini API credentials in the Language Model (Google Gemini) node.
3. **Review the system prompt:** The main AI Agent node has a detailed system prompt that defines its persona and capabilities. Feel free to customize it to better fit your business's tone and product line.
4. **Activate the workflow:** Save and activate the workflow. You can now start interacting with your new AI sales assistant through the chat interface!

### 💡 Use Cases & Customization
While designed for a hardware store, this template is highly adaptable. You can use it for:
- Any e-commerce store with a product database (e.g., electronics, clothing, books).
- An internal IT support bot that queries a database of company assets.
- A booking assistant that checks availability in a database of appointments or reservations.
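As an illustration of the "smart project advice" feature, the drywall question above reduces to simple arithmetic. The assumptions here (8 ft ceilings, standard 4x8 ft sheets, a 10% cut-waste allowance, openings ignored) are illustrative; the agent's system prompt could encode different rules:

```python
import math

def drywall_sheets(room_l_ft, room_w_ft, ceiling_ft=8, sheet_sqft=32, waste=0.10):
    """Rough sheet count for the walls of a rectangular room.

    Wall area = perimeter x ceiling height; a standard 4x8 ft sheet
    covers 32 sq ft. Doors and windows are ignored, so treat this
    as an upper-bound estimate.
    """
    wall_area = 2 * (room_l_ft + room_w_ft) * ceiling_ft
    return math.ceil(wall_area * (1 + waste) / sheet_sqft)
```

For the 10x12 ft example: perimeter 44 ft x 8 ft = 352 sq ft of wall, plus 10% waste, works out to 13 sheets.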
by YungCEO
### What It Does
Stop chasing the market—let the market come to you. This Done-For-You AI Crypto Bot is a fully configured n8n workflow that scrapes CoinMarketCap for trending cryptocurrencies, analyzes them with OpenAI's GPT-4o-mini, and delivers concise, actionable insights directly to your Discord channel. Forget tedious manual research and complex setups; this system is ready for instant deployment, giving you and your community an edge by providing daily, automated crypto trend intelligence without lifting a finger. It's the ultimate shortcut to staying ahead in the fast-paced crypto world with a pre-built crypto automation solution.

### ⚙️ Key Features
- ⏰ **Automated daily crypto updates:** Pre-scheduled to run multiple times a day, ensuring you never miss a trending coin.
- 🧠 **AI-powered market analysis:** Leverages GPT-4o-mini to distill complex data into digestible, insightful summaries.
- 💬 **Seamless Discord integration:** Delivers beautifully formatted, Markdown-compatible messages directly to your chosen channel.
- ⚡ **Zero-setup n8n workflow:** Simply import the JSON, plug in your API keys, and go live within minutes.
- 📈 **Actionable insights:** Provides ticker, price, market cap, volume, and direct links for quick research and trading decisions.

### 😩 Pain Points Solved
- Tired of missing crucial crypto market trends and potential opportunities?
- Wasting countless hours on manual research and data aggregation from multiple sources?
- Struggling to provide timely, concise, and professional crypto insights to your community or personal trading strategy?
- Frustrated by the complexity and time investment of setting up custom AI and automation workflows from scratch?
- Need a reliable, hands-off solution to stay informed and competitive in the volatile cryptocurrency landscape?

### 📦 What's Included
- Fully configured n8n workflow JSON file (ready to import)
- Pre-optimized AI prompt for expert crypto analysis
- Step-by-step setup guide for API keys and Discord integration
- Lifetime access to workflow updates

### 🚀 Call to Action
Get your AI Crypto Bot live today. Automate insights, dominate trends.

### 🏷️ Optimized Tags
done for you crypto bot, n8n workflow, ai crypto analysis, discord bot, trending coins, coinmarketcap automation, crypto insights, market intelligence, ready made solution, pre built automation, digital product, crypto trading tool, passive income bot
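The Markdown-compatible Discord message can be sketched as below. The coin field names (`name`, `symbol`, `price`, `market_cap`, `slug`) are illustrative and should be mapped to whatever your CoinMarketCap scrape actually returns:

```python
def format_coin_update(coin):
    """Format one trending coin as a Discord Markdown block
    (bold title, quoted stats, angle-bracketed link to suppress embeds)."""
    return (
        f"**{coin['name']} ({coin['symbol']})**\n"
        f"> 💵 Price: ${coin['price']:,.2f}\n"
        f"> 📊 Market cap: ${coin['market_cap']:,.0f}\n"
        f"> 🔗 <https://coinmarketcap.com/currencies/{coin['slug']}/>"
    )
```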
by Yang
### Who's it for
This template is perfect for content creators, social media strategists, and marketing teams who want to uncover trending questions directly from real TikTok audiences. If you spend hours scrolling through videos to find content ideas or audience pain points, this workflow automates the entire research process and delivers clean, ready-to-use insights in minutes.

### What it does
The workflow takes a keyword, searches TikTok for matching creator profiles, retrieves their latest videos, extracts viewer comments, and uses GPT-4 to identify the most frequently asked questions. These questions can be used to inspire new content, shape engagement strategies, or create FAQ-style videos that directly address what your audience is curious about.

Here's what happens step by step:
1. Accepts a keyword from a form trigger
2. Uses Dumpling AI to search TikTok for relevant profiles
3. Fetches the most recent videos from each profile
4. Extracts and cleans comments using a Python script
5. Sends cleaned comments to GPT-4 to find recurring audience questions
6. Saves the top questions and video links into a Data Table for easy review

### How it works
- **Form Trigger:** Collects the keyword input from the user
- **Dumpling AI:** Searches TikTok to find relevant creators based on the keyword
- **Video Retrieval:** Gets recent videos from the discovered profiles
- **Comments Extraction:** Gathers and cleans all video comments using Python
- **GPT-4:** Analyzes the cleaned text to extract top audience questions
- **Data Table:** Stores the results for easy access and content planning

### Requirements
- ✅ Dumpling AI API key stored as credentials
- ✅ OpenAI GPT-4 credentials
- ✅ Python node enabled in n8n
- ✅ A Data Table in n8n to store questions and related video details

### How to customize
- Adjust the GPT prompt to refine the tone or format of the extracted questions
- Add filters to target specific types of TikTok profiles or content niches
- Integrate the output with your content calendar or idea tracking tool
- Set up scheduled runs to build a constantly updating library of audience questions
- Extend the workflow to analyze TikTok hashtags or trends alongside comments

> This workflow turns TikTok keyword searches into structured audience insights, helping you quickly discover real questions your audience is asking—perfect for fueling content strategies, campaigns, and engagement.
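The comment-cleaning Python step can look roughly like this. The exact normalization (stripping @mentions and URLs, collapsing whitespace, case-insensitive de-duplication) is an assumption; tune it to your data before sending the text to GPT-4:

```python
import re

def clean_comments(comments):
    """Normalize raw TikTok comments: strip @mentions and URLs,
    collapse whitespace, and drop empty or duplicate entries."""
    seen, cleaned = set(), []
    for text in comments:
        text = re.sub(r"@\w+", "", text)          # remove user mentions
        text = re.sub(r"https?://\S+", "", text)  # remove links
        text = " ".join(text.split())             # collapse whitespace
        if text and text.lower() not in seen:
            seen.add(text.lower())
            cleaned.append(text)
    return cleaned
```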