by Ilyass Kanissi
# 🛠️ Smart Email Classifier Workflow

Intelligent AI-powered email classification system that automatically sorts incoming Gmail messages into Business, Meetings, Cold Emails, and other categories using OpenAI.

## ⚡ Quick Setup
1. Import this workflow into your n8n instance
2. Set up your OpenAI credentials (OpenAI API key)
3. Configure your Gmail credentials via the Google Cloud Console and you're ready to go
4. Activate the workflow to start automatic email classification

## 🔧 How it Works
- **Gmail Trigger**: Monitors incoming emails in real time
- **Text Classifier**: AI-powered categorization using the OpenAI Chat Model
- **Smart Routing**: Automatically sorts emails into predefined categories
- **Gmail Integration**: Adds appropriate labels and organizes emails automatically
- **Fallback Handling**: A "No Operation" path catches unclassifiable emails

Every email gets intelligently sorted into:

**🏢 Business**
- Work-related correspondence
- Client communications
- Project updates

**📅 Meetings**
- Meeting invitations and requests
- Calendar-related emails
- Scheduling communications

**❄️ Cold Emails**
- Sales outreach and pitches
- Unsolicited business proposals
- Marketing communications

**🔀 Random**
- Personal emails
- Newsletters
- Miscellaneous content
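The routing step described above can be sketched in plain Python. Note this is an illustration of the pattern, not the n8n nodes themselves: the label names come from the category list, while the `NO_OP` fallback value is an assumption standing in for the "No Operation" path.

```python
# Map the classifier's category output to Gmail label names.
# Unknown categories fall through to the "No Operation" path.
GMAIL_LABELS = {
    "Business": "Business",
    "Meetings": "Meetings",
    "Cold Emails": "Cold Emails",
    "Random": "Random",
}

def route_email(category: str) -> str:
    """Return the Gmail label for a classified category,
    or 'NO_OP' (hypothetical sentinel) for the fallback path."""
    return GMAIL_LABELS.get(category, "NO_OP")
```

In the actual workflow this branching is handled by the Text Classifier node's outputs rather than code.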
by Babish Shrestha
This Database SQL Query Agent converts natural language into SQL queries and returns the results. Turn your PostgreSQL database into a conversational AI agent! Ask questions in plain English and get instant data results without writing SQL.

## ✨ What It Does
- **Natural Language Queries**: "Show laptops under $500 in stock" → automatic SQL generation
- **Smart Column Mapping**: Understands your terms and maps them to actual database columns
- **Conversational Memory**: Maintains context across multiple questions
- **Universal Compatibility**: Works with any PostgreSQL table structure

## 🎯 Perfect For
- Business analysts querying data without SQL knowledge
- Customer support finding information quickly
- Product managers analyzing inventory/sales data
- Anyone who needs database insights fast

## 🚀 Quick Setup
**Step 1: Prerequisites**
- n8n instance (cloud/self-hosted)
- PostgreSQL database with read access
- OpenAI API key (other LLMs work as well)

**Step 2: Import & Configure**
1. Import this workflow template into n8n
2. Add credentials: your OpenAI API key and the PostgreSQL database connection
3. Set the table name: edit the "Set Table Name" node and replace "table_name" with your actual table
4. Test the connection: ensure your database user has SELECT permissions

**Step 3: Deploy & Use**
1. Start the workflow
2. Open the chat interface
3. Ask questions like:
   - "Show all active users"
   - "Find orders from last month over $100"
   - "List products with low inventory"

## 🔧 Configuration Details
**Required Settings**
- **Table Name**: Update in the "Set Table Name" node
- **Database Schema**: Default is 'public' (modify the SQL if different)
- **Result Limit**: Default 50 rows (adjustable in the system prompt)

**Optional Customizations**
- **Multi-table Support**: Modify the system prompt and add table-selection logic
- **Custom Filters**: Add business rules to restrict data access
- **Output Format**: Customize response formatting in the agent prompt

## 💡 Example Queries
- E-commerce: "Show me all electronics under $200 that are in stock"
- HR Database: "List employees hired in 2024 with salary over 70k"
- Customer Data: "Find VIP customers from California with recent orders"

## 🛡️ Security Features
- **Read-only Operations**: Only SELECT queries allowed
- **SQL Injection Prevention**: Parameterized queries and validation
- **Result Limits**: Prevents overwhelming queries
- **Safe Schema Discovery**: Uses information_schema tables

## 🔍 How It Works
1. **Schema Discovery**: The agent fetches the table structure and column info
2. **Query Planning**: Maps natural language to database columns
3. **SQL Generation**: Creates safe, optimized queries
4. **Result Formatting**: Returns clean, user-friendly data

## ⚡ Quick Troubleshooting
- **No Results**: Check the table name and ensure the data exists
- **Permission Error**: Verify the database user has SELECT access
- **Connection Failed**: Confirm PostgreSQL credentials and network access
- **Unexpected Results**: Try more specific queries with exact column names

## 🎨 Use Cases
- **Inventory Management**: "Show low-stock items by category"
- **Sales Analysis**: "Top 10 products by revenue this quarter"
- **Customer Support**: "Find customer orders with status 'pending'"
- **Data Exploration**: "What are the unique product categories?"

## 🔧 Advanced Tips
- **Performance**: Add database indexes on frequently queried columns
- **Customization**: Modify the system prompt for domain-specific terminology
- **Scaling**: Use read replicas for high query volumes
- **Integration**: Connect to Slack/Teams for team-wide data access

Tags: AI, PostgreSQL, Natural Language, SQL, Business Intelligence, LangChain, Database Query
Difficulty: Beginner to Intermediate
Setup Time: 10–15 minutes
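The "read-only operations" and "result limit" safeguards can be sketched as a small guard function. This is an illustration of the idea under stated assumptions; the template enforces these rules through its system prompt and node configuration, and its exact validation may differ.

```python
import re

def make_safe(query: str, limit: int = 50) -> str:
    """Allow only a single SELECT statement and enforce a row limit,
    mirroring the template's default of 50 rows."""
    q = query.strip().rstrip(";")
    # Reject anything that is not a SELECT (no INSERT/UPDATE/DROP/etc.).
    if not re.match(r"(?is)^\s*select\b", q):
        raise ValueError("Only SELECT queries are allowed")
    # Reject stacked statements such as "select 1; drop table x".
    if ";" in q:
        raise ValueError("Multiple statements are not allowed")
    # Append a LIMIT clause if the query does not already end with one.
    if not re.search(r"(?is)\blimit\s+\d+\s*$", q):
        q += f" LIMIT {limit}"
    return q
```

A guard like this complements, but does not replace, running the agent against a database user that only has SELECT permissions.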
by InfyOm Technologies
## ✅ What problem does this workflow solve?
Most e-commerce chatbots are transactional; they answer one question at a time and forget your context right after. This workflow changes that. It introduces a smart, memory-enabled shopping assistant that remembers user preferences, past orders, and previous queries to offer deeply personalized, natural conversations.

## ⚙️ What does this workflow do?
- Accepts real-time chat messages from users.
- Uses Zep Memory to store and recall personalized context.
- Integrates with:
  - 🛒 Product Inventory
  - 📦 Order History
  - 📜 Return Policy
- Answers complex queries based on historical context.
- Provides:
  - Personalized product recommendations
  - Context-aware order lookups
  - Seamless return processing
  - Policy discussions with minimal user input

## 🧠 Why Context & Memory Matter
Traditional bots:
- ❌ Forget what the user said 2 messages ago
- ❌ Ask repetitive questions (name, order ID, etc.)
- ❌ Can’t personalize beyond basic filters

With Zep-powered memory, your bot:
- ✅ Remembers preferences (e.g., favorite categories, past questions)
- ✅ Builds persistent context across sessions
- ✅ Gives dynamic, user-specific replies (e.g., "You ordered this last week…")
- ✅ Offers a frictionless support experience

## 🔧 Setup Instructions
**🧠 Zep Memory Setup**
Create a Zep instance and connect it via the Zep Memory node. It will automatically store user conversations and summarize facts.

**💬 Chat Trigger**
Use the "When chat message received" trigger to initiate the conversation workflow.

**🤖 AI Agent Configuration**
Connect:
- Chat Model → OpenAI GPT-4 or GPT-3.5
- Memory → Zep
- Tools:
  - Get_Orders – fetch user order history from Google Sheets
  - Get_Inventory – recommend products based on stock and preferences
  - Get_ReturnPolicy – answer policy-related questions

**📄 Google Sheets**
Store orders, inventory, and return policies in structured sheets. Use read-access nodes to fetch data dynamically during conversations.

## 🧠 How it Works – Step-by-Step
1. **Chat Trigger** – The user sends a message.
2. **AI Agent (w/ Zep Memory)** – Reads past interactions to build context and pulls memory facts (e.g., "User prefers men's sneakers").
3. **External Tools** – Looks up orders, return policies, or available products.
4. **Personalized Response** – Generated using OpenAI.
5. **Reply** – Sent back to the user through chat.

## 🧩 What the Bot Can Do
- 🛍 Suggest products based on past browsing or purchase behavior.
- 📦 Check order status and history without requiring the user to provide order IDs.
- 📃 Explain return policies in detail, adapting answers based on context.
- 🤖 Engage in more human-like conversations across multiple sessions.

## 👤 Who can use this?
This is ideal for:
- 🛒 E-commerce store owners
- 🤖 Product-focused AI startups
- 📦 Customer service teams
- 🧠 Developers building intelligent commerce bots

If you're building a chatbot that goes beyond canned responses, this memory-first shopping assistant is the upgrade you need.

## 🛠 Customization Ideas
- Connect with Shopify, WooCommerce, or Notion instead of Google Sheets.
- Add payment processing or shipping-tracking integrations.
- Customize the memory expiration or fact-summarization rules in Zep.
- Integrate with voice AI to make it work as a phone-based shopping assistant.

## 🚀 Ready to Launch?
Just connect:
- ✅ OpenAI Chat Model
- ✅ Zep Memory Engine
- ✅ Your Product/Order/Policy Sheets

And you’re ready to deliver truly personalized shopping conversations.
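The "persistent context" idea can be illustrated with a toy session store. Zep itself stores, summarizes, and retrieves conversation facts server-side; this minimal `SessionMemory` class is only a sketch of the recall pattern, and its names are not part of any real API.

```python
class SessionMemory:
    """Toy stand-in for a memory engine: accumulate facts per session
    and recall them on later turns."""

    def __init__(self):
        self._facts: dict[str, list[str]] = {}

    def add_fact(self, session_id: str, fact: str) -> None:
        self._facts.setdefault(session_id, []).append(fact)

    def recall(self, session_id: str) -> list[str]:
        # Unknown sessions simply have no prior context.
        return self._facts.get(session_id, [])

memory = SessionMemory()
memory.add_fact("user-42", "prefers men's sneakers")
memory.add_fact("user-42", "ordered running shoes last week")
```

With recalled facts injected into the prompt, the agent can answer "you ordered this last week" without re-asking for the order ID.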
by Matthew
# Automated Personalized Email Icebreakers

This workflow automates creating personalized email icebreakers. It reads leads from a Google Sheet, scrapes their company website, uses OpenAI to analyze the data and craft a unique opening line, and then saves that icebreaker back into the original sheet.

## How It Works
- **Fetch Lead**: The workflow starts, loops through your leads, and pulls one from your Google Sheet.
- **Scrape & Summarize**: It scrapes the lead's company website and uses a fast OpenAI model to summarize the key points about the company and the person.
- **Generate Icebreaker**: This summary is then sent to a more powerful OpenAI model, which follows specific instructions to write a short, personalized icebreaker.
- **Update Sheet**: The new icebreaker is saved back into the correct lead's row in your Google Sheet, using their email to match the record.

## Requirements
- An n8n instance.
- An OpenAI API key with available credits.
- A Google account with a Sheet for your leads.
  - The Sheet must have columns for lead data (e.g., Email, Website, Company Name) and an empty column named icebreaker.
  - The Email column must be unique for each lead.

## Setup Instructions
1. **Add Credentials**: In n8n, add your OpenAI API key and connect your Google account via the Credentials menu.
2. **Configure Google Sheets Nodes**: Select each of the two Google Sheets nodes (Client data and Add icebreaker to sheet). In each, choose your credential, select your spreadsheet and the specific sheet name, and ensure the column mapping is correct.
3. **Configure OpenAI Nodes**: Select both OpenAI nodes (Summarising prospect data and Creating icebreaker) and choose your OpenAI credential from the dropdown.
4. **Verify Update Node**: On the final Add icebreaker to sheet node, ensure the Operation is set to Append Or Update and the Matching Columns field is set to Email.

## Customization Options 💡
- **Trigger**: Change the manual start to an automatic trigger, such as when a new row is added to the sheet or on a daily schedule (Cron).
- **AI Prompt**: Modify the prompt in the "Creating icebreaker" node to change the tone, style, or length of the output.
- **AI Model**: Experiment with different OpenAI models (like gpt-4o) for a different balance of cost, speed, and quality.
- **Data Source**: Replace Google Sheets with a CRM like HubSpot or a database like Postgres.
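The "Append Or Update, matching on Email" behavior of the final sheet node can be sketched against a plain list of row dicts. The column names mirror the Requirements section (Email, icebreaker); the function itself is illustrative, not part of the workflow.

```python
def upsert_icebreaker(rows: list[dict], email: str, icebreaker: str) -> list[dict]:
    """Update the row whose Email matches, or append a new row.
    Assumes Email is unique per lead, as the template requires."""
    for row in rows:
        if row.get("Email") == email:
            row["icebreaker"] = icebreaker
            return rows
    rows.append({"Email": email, "icebreaker": icebreaker})
    return rows
```

This is why the Email column must be unique: with duplicates, only the first matching row would ever receive the icebreaker.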
by Ertay Kaya
This workflow automatically reviews new Zendesk tickets and tags them using OpenAI’s language model. It runs every 24 hours, fetches tickets created in the last day (for specified brands), and uses an AI agent to analyze each ticket’s content. Based on customizable rules, the agent suggests and applies relevant tags, ensuring existing tags are preserved. This helps automate ticket categorization and improves support team efficiency.

## Key Features
- Scheduled daily execution
- Brand filtering for targeted ticket processing
- AI-powered tagging based on ticket content and custom rules
- Preserves existing tags while adding new ones

## Setup Instructions
1. Replace the placeholder brand IDs/names and tag rules with your own.
2. Connect your Zendesk and OpenAI accounts.
by Recrutei Automações
## Overview: Automated Candidate Creation with AI Vacancy Matching

This workflow automates the creation of new candidates in the Recrutei ATS directly from an n8n Form submission, ensuring a seamless "Apply Now" funnel. Its core feature is an AI Agent (OpenAI + Tool) that dynamically identifies the correct Recrutei vacancy_id based on the applicant's selection in the form. The workflow also automatically extracts the text content from the candidate's PDF curriculum and uploads it as an internal observation (note) to the profile.

This template eliminates manual data entry, guarantees that candidates are associated with the correct vacancy, and makes the resume content easily searchable within your Recrutei ATS.

## Workflow Logic & Steps
1. **On Form Submission (Form Trigger)**: The workflow starts when a candidate submits the n8n Form, capturing Name, Email, Phone, the selected Vacancy Name (e.g., "Javascript Developer"), and the Resume (PDF file).
2. **Get Vacancy ID from AI (OpenAI)**: The text name of the vacancy is sent to an AI Agent. The AI, guided by a specific System Prompt, uses Recrutei's MCP Tool to accurately find the official vacancy_id corresponding to that job title in your ATS.
3. **Set Vacancy ID (Set)**: Extracts the clean vacancy_id (a number) returned by the AI.
4. **Get Pipe Stages (HTTP Request)**: Fetches the pipeline stages associated with the identified vacancy ID.
5. **Create Prospect in Recrutei (HTTP Request)**: Creates the new candidate (Prospect) in the Recrutei ATS, associating them with the correct vacancy_id and the first available pipe stage.
6. **Merge Candidate Data (Merge)**: Merges the prospect-creation output with the original form data to ensure all necessary details (like the resume file) are available for the next steps.
7. **Extract Text from PDF Resume (Extract from File)**: Reads and extracts all text content from the uploaded PDF resume file.
8. **Add Curriculum as Observation (HTTP Request)**: Adds the extracted CV text as an internal observation/note (talent_observation_type_id: 11) to the newly created candidate's profile in Recrutei.

## Setup Instructions
To implement this workflow, you must configure the following:

**Recrutei API Credential**
Create a Header Auth credential named Recrutei API (or similar) with:
- Header Name: Authorization
- Header Value: Bearer YOUR_API_KEY_HERE

This credential must be selected in the nodes: Get Pipe Stages, Create Prospect in Recrutei, and Add Curriculum as Observation.

**AI Configuration**
- OpenAI: Configure your API key in the Get Vacancy ID from AI node.
- Recrutei's MCP: Replace YOUR_MCP_ENDPOINT_URL_HERE in the Endpoint URL field of the Recrutei's MCP node with your actual Recrutei MCP Server endpoint URL.

For more information about the Recrutei API, please refer to: https://developers.recrutei.com.br/docs/obtendo-token#
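The Header Auth credential above boils down to sending a Bearer token on every request the three HTTP Request nodes make. A sketch of the header construction (the Content-Type line is an assumption for JSON bodies, not something the credential itself sets):

```python
def recrutei_headers(api_key: str) -> dict:
    """Build the headers the Header Auth credential injects into
    each Recrutei API call."""
    return {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",  # assumed for JSON payloads
    }
```

In n8n you never write this by hand; selecting the credential on each node attaches the Authorization header automatically.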
by Avkash Kakdiya
## How it works
This workflow starts whenever you add a new company name to a Google Sheet. It checks that the company name is filled in, then uses AI to find more details about the company, such as industry, size, location, and website. Next, it looks for the company in your HubSpot CRM; if the company is not there, it adds it automatically. Finally, it updates the Google Sheet with all the new company information so you have everything organized in one place.

## Step-by-step
1. **Start with Google Sheets**
   - The workflow watches your Google Sheet for new company names.
   - It ignores any empty or incomplete entries.
2. **Get Company Details**
   - Uses AI (OpenAI GPT-4o-mini) to find more information about the company.
   - Formats the AI results so they can be used easily.
3. **Check and Update HubSpot**
   - Searches your HubSpot CRM to see if the company already exists.
   - If the company is new, it creates a record in HubSpot with the AI details.
4. **Save Everything in Google Sheets**
   - Prepares the enriched data for saving.
   - Adds or updates the company information in the Google Sheet for easy tracking.

## Why use this?
- Automatically adds useful company info without manual work.
- Keeps your CRM clean by avoiding duplicates.
- Stores all updated company details in one place for easy access.
- Runs smoothly in the background without you needing to do anything after setup.
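The deduplication check in step 3 can be sketched as a simple guard: only create a record when no company with that name is already in the CRM. The list of existing names here stands in for the HubSpot search the node actually performs; the case-insensitive match is an assumption about how duplicates should be treated.

```python
def should_create(company: str, existing_names: list[str]) -> bool:
    """Return True when the company is new to the CRM
    (case-insensitive, whitespace-trimmed comparison)."""
    known = {n.strip().lower() for n in existing_names}
    return company.strip().lower() not in known
```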
by Robert Breen
Use the n8n Data Tables feature to store, retrieve, and analyze survey results — then let OpenAI automatically recommend the most relevant course for each respondent.

## 🧠 What this workflow does
This workflow demonstrates how to use n8n’s built-in Data Tables to create an internal recommendation system powered by AI. It:
1. Collects survey responses through a Form Trigger
2. Saves responses to a Data Table called Survey Responses
3. Fetches a list of available courses from another Data Table called Courses
4. Passes both Data Tables into an OpenAI Chat Agent, which selects the most relevant course
5. Returns a structured recommendation with:
   - course: the course title
   - reasoning: why it was selected

> Trigger: Form submission (manual or public link)

## 👥 Who it’s for
Perfect for educators, training managers, or anyone wanting to use n8n Data Tables as a lightweight internal database — ideal for AI-driven recommendations, onboarding workflows, or content personalization.

## ⚙️ How to set it up

### 1️⃣ Create your n8n Data Tables
This workflow uses two Data Tables — both created directly inside n8n.

**🧾 Table 1: Survey Responses**
Columns:
- Name
- Q1 — Where did you learn about n8n?
- Q2 — What is your experience with n8n?
- Q3 — What kind of automations do you need help with?

To create:
1. Add a Data Table node to your workflow.
2. From the list, click “Create New Data Table.”
3. Name it Survey Responses and add the columns above.

**📚 Table 2: Courses**
Columns:
- Course
- Description

To create:
1. Add another Data Table node.
2. Click “Create New Data Table.”
3. Name it Courses and create the columns above.
4. Copy course data from this Google Sheet: 👉 https://docs.google.com/spreadsheets/d/1Y0Q0CnqN0w47c5nCpbA1O3sn0mQaKXPhql2Bc1UeiFY/edit?usp=sharing

This Courses Data Table is where you’ll store all available learning paths or programs for the AI to compare against survey inputs.

### 2️⃣ Connect OpenAI
1. Go to the OpenAI Platform
2. Create an API key
3. In n8n, open Credentials → OpenAI API and paste your key

The workflow uses the gpt-4.1-mini model via the LangChain integration.

## 🧩 Key Nodes Used
| Node | Purpose | n8n Feature |
|------|---------|-------------|
| Form Trigger | Collect survey responses | Forms |
| Data Table (Upsert) | Stores results in Survey Responses | Data Tables |
| Data Table (Get) | Retrieves Courses | Data Tables |
| Aggregate + Set | Combines and formats table data | Core nodes |
| OpenAI Chat Model (LangChain Agent) | Analyzes responses and courses | AI |
| Structured Output Parser | Returns structured JSON output | LangChain |

## 💡 Tips for customization
- Add more Data Table columns (e.g., email, department, experience years)
- Use another Data Table to store AI recommendations or performance results
- Modify the Agent system message to customize how the AI chooses courses
- Send recommendations via Email, Slack, or Google Sheets

## 🧾 Why Data Tables?
This workflow shows how n8n’s Data Tables can act as your internal database:
- Create and manage tables directly inside n8n
- No external integrations needed
- Store structured data for AI prompts
- Share tables across multiple workflows

All user data and course content are stored securely and natively in n8n Cloud or Self-Hosted environments.

## 📬 Contact
Need help customizing this (e.g., expanding Data Tables, connecting multiple surveys, or automating follow-ups)?
📧 robert@ynteractive.com
🔗 Robert Breen
🌐 ynteractive.com
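The Structured Output Parser's job can be sketched as a small validator: the agent must return JSON with exactly the `course` and `reasoning` fields described above. The schema here is inferred from this description, not copied from the workflow's parser configuration.

```python
import json

def parse_recommendation(raw: str) -> dict:
    """Validate the agent's JSON reply against the expected shape:
    {"course": ..., "reasoning": ...}."""
    data = json.loads(raw)
    missing = {"course", "reasoning"} - data.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return data
```

In the workflow, a parse failure like this is what causes the LangChain agent to retry or error out rather than pass malformed output downstream.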
by Marth
# Automated Instagram Carousel Post (Blotato + GPT-4.1)

This workflow is an end-to-end solution for automating the creation and publishing of highly engaging Instagram Carousel content on a recurring schedule. It leverages the intelligence of an AI Agent (GPT-4.1) for idea generation and sharp copywriting, combined with the visual rendering capabilities of Blotato, all orchestrated by the n8n automation platform. The core objective is to drastically cut content production time, enabling creators and marketing teams to consistently generate high-impact, scroll-stopping educational or inspirational content without manual intervention.

## How It Works
The workflow executes in five automated phases:

**1. Trigger and Idea Generation**
The workflow starts with the Schedule Trigger node, running at your specified time interval (e.g., daily). It takes the initial subject from the Topic node and feeds it to the Topic1 AI Agent. This agent is specifically prompted to create a short, viral hook/title (max. 6 words) in the style of confident, tactical copywriters (like Alex Hormozi), maximizing the content's initial draw.

**2. Content Creation and Output Structuring**
The viral hook is then passed to the AI Agent Carousel Maker. This agent uses the GPT-4.1 model, following strict system instructions, to generate all necessary content elements in a structured JSON format:
- Punchy, concise text for each Carousel slide.
- A long, detailed Instagram Caption with explanations and a CTA.
- A short final title for internal reference.

**3. Visual Rendering (Blotato Tool)**
The slide-text output is sent to the Simple tweet cards monocolor (Blotato Tool) node. Blotato acts as a graphic-generation API, rendering the text onto a chosen template to create a series of Carousel images (using the 4:5 aspect ratio). This replaces the need for manual design work in tools like Canva.

**4. Status Check and Retry Mechanism**
Visual rendering takes time, so the workflow pauses:
- The Wait node holds the execution for 3 minutes.
- The Get carousel node retrieves the image-generation status using the ID provided by the previous Blotato node.
- The If carousel ready node checks whether the status is done. If not, the flow is routed back to the Wait node, implementing a simple built-in retry mechanism until the visuals are complete.

**5. Final Posting**
Once the status is confirmed as done, the workflow proceeds to the final step: the Instagram [BLOTATO] node uses the media URLs retrieved from Blotato and the long caption from the AI Agent to automatically publish the entire Carousel post (multiple images plus text) to your linked Instagram account.

## Set Up Steps
To successfully activate and personalize this n8n workflow, follow these steps:

**Step 1: Import and Connect Credentials**
1. Import Workflow: Import the provided JSON file (Automated Instagram Carousel Post with Blotato + Gpt 4.1.json) into your n8n instance.
2. OpenAI Credentials: Ensure you have valid OpenAI API credentials connected to the OpenAI Chat Model node.
3. Blotato Credentials: Ensure your valid Blotato API credentials are connected to all three Blotato-related nodes (Simple tweet cards monocolor, Get carousel, and Instagram [BLOTATO]).

**Step 2: Configure Workflow Inputs**
1. Set Topic: Open the Topic node. Change the default initial topic expression =Top ai tools for finance to any general subject matter you want your Carousels to cover.
2. Set Schedule: Open the Schedule Trigger node and configure the Rule to define how often you want the content to be created and posted (e.g., set it to run every day at a specific time).

**Step 3: Personalize Content and Visuals**
1. Customize AI Persona: Open the AI Agent Carousel Maker node. Review and modify the long System Message to refine the AI's output:
   - Adjust the # ROLE and # STYLE sections to match your brand's voice (e.g., change the Alex Hormozi style to a more formal, academic tone if needed).
   - Do not change the structure defined in # OUTPUT, as this JSON format is essential for downstream nodes.
2. Personalize Visuals: Open the Simple tweet cards monocolor (Blotato Tool) node. Under templateInputs, customize fields like authorName, handle, and profileImage URLs to ensure the generated visuals are consistent with your personal or brand identity.

**Step 4: Final Posting Setup**
1. Select Instagram Account: Open the Instagram [BLOTATO] node. In the accountId parameter, use the dropdown list to select the specific Instagram account that is connected via your Blotato service.
2. Activate: Once all steps are complete, save the workflow and toggle the main switch to Active to allow the Schedule Trigger to begin running the automation.
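The Wait → Get carousel → If carousel ready loop from phase 4 can be sketched as a polling function. The 3-minute interval and "done" status come from the description; `check_status`, `max_checks`, and the injectable `sleep` are illustrative additions so the loop has a bound and can be exercised without actually waiting.

```python
import time

def wait_for_render(check_status, interval_s: int = 180,
                    max_checks: int = 20, sleep=time.sleep) -> bool:
    """Poll until check_status() reports 'done'.
    Returns True on success, False if it never completes."""
    for _ in range(max_checks):
        if check_status() == "done":
            return True
        sleep(interval_s)  # the Wait node's 3-minute pause
    return False
```

Note the n8n version of this loop has no `max_checks` cap as described, so adding an execution timeout to the workflow is a sensible safeguard.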
by MAMI YAMANE
# Generate SEO content outlines from SERP analysis to Google Docs

## Overview
Stop wasting hours on manual competitor research and content briefing. This workflow automates the creation of data-backed content briefs by analyzing the current top-ranking pages for your specific keyword. It scrapes the Google Search Engine Results Page (SERP), extracts the content structure (headings H1–H3) from competitor articles, and uses AI to generate a comprehensive article outline based on what is already ranking. The final outline is automatically saved to a Google Doc, streamlining your content production process.

## Who is this for?
- **Content Marketers**: to drastically reduce the time needed to create detailed content briefs.
- **SEO Specialists**: to analyze competitor content structures at scale without manual checking.
- **Bloggers & Writers**: to overcome writer's block and ensure their content covers all necessary topics to rank.

## How it works
1. **Input**: You enter a "Target Keyword" and "Target Audience" via the built-in n8n Form.
2. **SERP Scraping**: The workflow uses Apify (Google Search Scraper) to fetch the top results for that keyword.
3. **Filtering**: It automatically removes non-article URLs (such as Amazon product pages, YouTube videos, and PDFs) so that only relevant content competitors are analyzed.
4. **Deep Extraction**: It visits each competitor's URL using Apify (Cheerio Scraper) to extract article metadata and heading structure (H1, H2, H3).
5. **AI Analysis**: The aggregated data is sent to OpenAI, which analyzes common patterns and generates an optimized article outline.
6. **Output**: A new Google Doc is created with the generated outline, and the request details are logged in Google Sheets for your records.

## Requirements
- **Apify Account**: with access to the Google Search Result Scraper and Cheerio Scraper actors.
- **OpenAI Account**: an API key for OpenAI (GPT-3.5 or GPT-4 recommended).
- **Google Cloud**: credentials to access Google Docs and Google Sheets.

## How to set up
1. **Configure Credentials**: Connect your Apify, OpenAI, and Google accounts in the respective nodes.
2. **Workflow Configuration**: Open the Workflow Configuration node. You can change the countryCode (default is "jp" for Japan) to your target region (e.g., "us", "uk") and adjust maxResults if needed.
3. **Google Sheets Setup**: Create a Google Sheet with a column header named target_keyword. Copy the Spreadsheet ID and paste it into the Store Form Responses node.
4. **Run**: Click "Chat" or "Open Form" in the trigger node to start the workflow.

## How to customize
- **Change the AI Model**: In the AI Content Structure Analysis node, you can switch between different OpenAI models or adjust the system prompt to change the tone/format of the outline.
- **Adjust Filters**: Modify the Filter Non-Article URLs node to exclude specific domains you don't want to analyze (e.g., wikipedia.org).
- **Output Format**: You can modify the Create Google Doc node to include more specific data, such as the list of competitor URLs analyzed.
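The Filter Non-Article URLs step can be sketched as a predicate over SERP results. The blocked domains mirror the examples in the description (Amazon, YouTube, PDFs) and are meant to be extended, just as the "Adjust Filters" customization suggests; the exact filter inside the node may differ.

```python
from urllib.parse import urlparse

# Host fragments to exclude; extend this tuple to block more domains
# (e.g., "wikipedia.org").
BLOCKED_DOMAINS = ("amazon.", "youtube.com", "youtu.be")

def is_article_url(url: str) -> bool:
    """Keep only URLs that look like scrapable articles."""
    parsed = urlparse(url)
    host = parsed.netloc.lower()
    if any(blocked in host for blocked in BLOCKED_DOMAINS):
        return False
    if parsed.path.lower().endswith(".pdf"):
        return False
    return True
```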
by Neal Mcleod
# 🧠 FB Group Problem Solver – Auto-Generate Helpful Posts

**For**: Community managers, coaches, consultants, and business owners who want to build authentic relationships in Facebook groups without spending hours scrolling and crafting responses.

**Pain Point Solved**: Tired of manually browsing Facebook groups to find engagement opportunities? This workflow automatically discovers what your community is struggling with and writes genuine, helpful posts that position you as a trusted problem-solver.

## How It Works
This workflow runs on autopilot to:
1. Scan your target Facebook groups for recent posts
2. Identify the most common problems and pain points
3. Analyze the community's language and communication style
4. Generate authentic, value-packed posts that solve real problems
5. Save ready-to-publish content to your Google Sheet

## What You'll Need
- Google Sheets account (for group URLs and post storage)
- Paid Apify account with the Facebook Groups Scraper actor
- OpenAI API key (GPT-4 recommended)
- n8n instance (self-hosted or cloud)

## Quick Setup
1. Import the workflow and connect your Google Sheets
2. Add your Apify API key and configure the Facebook scraper
3. Insert your OpenAI API key in the three AI nodes
4. List your FB groups in the input sheet (URL, Name, Niche)
5. Test manually, then schedule to run daily or weekly

## Results
Get two post variations for each identified problem, written in the group's natural tone and style. Posts are non-promotional, genuinely helpful, and designed to spark engagement while building trust.

**Time saved**: 3–5 hours per week of manual group monitoring and content creation
by Yang
## Who’s it for
This workflow is for marketers, influencer agencies, or outreach teams who want to quickly check whether a TikTok user meets certain criteria before adding them to an influencer list. No manual profile checking—just drop in a username, and the system does the rest.

## What it does
This workflow takes a TikTok username submitted via form, fetches the user’s profile using Dumpling AI, then evaluates the user with GPT-4 to decide whether they qualify for influencer outreach based on predefined rules:
- 40+ videos
- 100,000+ followers
- 300,000+ total likes

It then checks Google Sheets:
- If the user does not exist, it adds a new row
- If the user already exists, it updates the row

## How it works
1. **Form Trigger**: Collects the TikTok username
2. **Dumpling AI**: Pulls the TikTok profile (username, ID, followers, videos, likes, etc.)
3. **GPT-4**: Checks whether the user meets the outreach criteria
4. **Google Sheets**: Checks whether the user already exists, then updates or appends the user data plus qualification status

## Requirements
- ✅ Dumpling AI API key (HTTP Header Auth)
- ✅ OpenAI API key (GPT-4)
- ✅ Google Sheets integration with the following columns:
  - Tik Tok user
  - User ID
  - Follower Count
  - Following Count
  - Heart Count
  - Video Count
  - Qualified?

## How to customize
- Change the qualification logic in the GPT-4 prompt
- Add additional TikTok data (bio, profile pic, location, etc.)
- Send a notification if the user is qualified
- Push qualified leads to Airtable, Notion, or your CRM

> This workflow gives you a plug-and-play tool to qualify TikTok influencers instantly using AI—without leaving your browser or spreadsheet.
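The three qualification thresholds can be expressed as a plain function. In the template the rules are applied inside a GPT-4 prompt; this deterministic sketch just makes the criteria explicit (parameter names follow the sheet's Video Count, Follower Count, and Heart Count columns).

```python
def qualifies(video_count: int, follower_count: int, heart_count: int) -> bool:
    """True when the profile meets all three outreach thresholds:
    40+ videos, 100,000+ followers, 300,000+ total likes."""
    return (
        video_count >= 40
        and follower_count >= 100_000
        and heart_count >= 300_000
    )
```

If your criteria are this simple, a Code or If node could replace the GPT-4 check entirely; the LLM earns its keep when the rules get fuzzier.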