by Jimleuk
This n8n template is one of a 3-part series exploring use-cases for clustering vector embeddings:

- Survey Insights
- Customer Insights
- Community Insights

This template demonstrates the Survey Insights scenario, where survey participant responses can be quickly grouped by similarity and an AI agent can generate insights on those groupings. With this workflow, researchers can save days or even weeks of work breaking down cohorts of participants and identifying frequently mentioned positives and negatives.

Sample Output: https://docs.google.com/spreadsheets/d/e/2PACX-1vT6m8XH8JWJTUAfwojc68NAUGC7q0lO7iV738J7aO5fuVjiVzdTRRPkMmT1C4N8TwejaiT0XrmF1Q48/pubhtml#

How it works
- All survey questions and responses are imported from a Google Sheet.
- Responses are then inserted into a Qdrant collection, carefully tagged with the question and survey metadata.
- For each question, all relevant responses are put through a clustering algorithm using the Python Code node. The Qdrant points are returned in clustered groups.
- Each group is looped over to fetch the payloads of the points, which are fed to the AI agent to summarise and generate insights for.
- The resulting insights and raw responses are then saved to the Google Spreadsheet for further analysis by the researcher.

Requirements
- Survey data and format as shown in the attached Google Sheet.
- Qdrant Vector Store for storing embeddings.
- OpenAI account for embeddings and LLM.

Customising the Template
- Adjust clustering parameters so they make sense for your data. Use more clusters for open-ended questions and fewer clusters when responses are multiple choice. A minimal sketch of the clustering step follows below.
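A minimal standalone sketch of the clustering step, assuming scikit-learn is available in your Python environment; the point IDs and vectors are illustrative stand-ins for embeddings already fetched from Qdrant:

```python
# Minimal sketch: group Qdrant points by embedding similarity with KMeans.
# In an n8n Python Code node you would adapt the input/output to items/json
# instead of plain lists; point IDs and vectors here are illustrative.
from collections import defaultdict

import numpy as np
from sklearn.cluster import KMeans

point_ids = ["resp-1", "resp-2", "resp-3", "resp-4"]  # hypothetical Qdrant point IDs
vectors = np.random.rand(4, 1536)                     # embeddings fetched from Qdrant

n_clusters = 2  # tune per question: more for open-ended, fewer for multiple choice
labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=42).fit_predict(vectors)

# Return clustered groups of point IDs, ready to loop over downstream.
groups = defaultdict(list)
for point_id, label in zip(point_ids, labels):
    groups[int(label)].append(point_id)

print(dict(groups))  # e.g. {0: ["resp-1", "resp-3"], 1: ["resp-2", "resp-4"]}
```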
by Brian Money
Overview
This template is designed for Amazon sellers and advertisers who want to automate their campaign performance analysis and bidding strategy. It solves the common challenge of manually reviewing Sponsored Products reports and guessing how to adjust keywords, placements, and budgets. By combining Amazon Advertising reports with OpenAI's GPT-4o, this workflow delivers real-time, personalized optimization instructions — automatically.

Features
📥 Automatically downloads Sponsored Products reports from Google Drive
🧠 Uses AI to analyze campaign, keyword, placement, targeting, and budget performance
📊 Supports both .csv and .xlsx report formats (see the parsing sketch below)
🔁 Handles multiple ASINs and scales easily across ad accounts
📧 Sends structured optimization recommendations to your inbox via Gmail
🗂 Built-in logic to normalize filenames and correctly map reports
🧹 Includes error handling and formatting cleanup for AI-ready input

Requirements
To use this workflow, you’ll need:
- An Amazon Ads account with access to Sponsored Products reports
- A Google Drive folder where Amazon Ads reports are delivered (manually or via Gmail automation)
- A Gmail account (for sending summaries)
- An OpenAI API key with access to GPT-4o
- Optional: a developer account for the Amazon Ads API to fully automate report generation in the future

Setup Instructions
📂 Connect your Amazon Ads reports folder in the Google Drive node
🔐 Add your credentials to the OpenAI and Gmail nodes
📝 Schedule five reports in the Amazon Ads Console:
  - Search Term Report → Detailed
  - Targeting Report → Detailed
  - Campaign Report → Summary
  - Placement Report → Summary
  - Budget Report → Summary
  Use “Last 30 Days”, “Daily”, and .xlsx or .csv format
🔁 (Optional) Automate report ingestion using Gmail + Drive workflows
🧪 Test with one account, then replicate across additional ad accounts as needed

⏱️ Setup time: 15–30 minutes
📌 All field-specific guidance is included in workflow notes
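Outside n8n, the report-loading step is conceptually equivalent to this sketch; the file name and the use of pandas are illustrative assumptions, not the workflow's actual implementation:

```python
# Minimal sketch: load an Amazon Ads report whether it arrives as .csv or .xlsx.
# The file name and columns are illustrative, not the exact report schema.
from pathlib import Path

import pandas as pd

def load_report(path: str) -> pd.DataFrame:
    """Read a Sponsored Products report in either .csv or .xlsx format."""
    suffix = Path(path).suffix.lower()
    if suffix == ".csv":
        return pd.read_csv(path)
    if suffix in (".xlsx", ".xls"):
        return pd.read_excel(path)  # requires openpyxl for .xlsx files
    raise ValueError(f"Unsupported report format: {suffix}")

# Hypothetical file name for illustration; point this at a real downloaded report.
df = load_report("Sponsored_Products_Search_term_report.xlsx")
print(df.head())
```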
by Dvir Sharon
📍 Extract Google My Business Leads by Service & Location with Bright Data to Google Sheets

This template requires a self-hosted n8n instance to run.

A comprehensive n8n automation that extracts Google My Business listings by service type and geographic location using Bright Data's Google Maps dataset, with intelligent city expansion and automatic duplicate removal.

👥 Who is this for?
- Lead generation professionals
- Sales teams
- Marketing agencies
- Business development representatives
- Entrepreneurs conducting outreach or market research

❓ What problem is this solving?
Manual lead generation from Google Maps is time-consuming and inefficient. This workflow automates the process of finding businesses by service type and location, expanding searches across cities, removing duplicates, and organizing results in a structured format.

⚙️ What this workflow does

Input Processing
- Accepts service type, state, and country via web form
- Uses Claude AI to generate city lists
- Auto-categorizes services
- Creates search queries per city

Data Collection
- Uses Bright Data's Google Maps dataset
- Processes in batches with rate limits
- Monitors scraping with retry logic
- Formats and handles API responses

Quality Control
- Removes duplicates by name and phone (a minimal dedup sketch follows this listing)
- Maintains clean data in Google Sheets
- Ensures structured, usable datasets

📄 Output Data Points

| Field | Description | Example |
| :-------------- | :-------------------------- | :---------------------------- |
| Business Name | Company or business name | TechFix Computer Repair |
| Category | Business category type | Electronics |
| Country | Country location | US |
| City | Specific city searched | Austin |
| Phone Number | Contact phone number | +1 (555) 123-4567 |
| Website URL | Business website | https://techfix.com |
| Google Maps URL | Direct Maps link | https://maps.google.com/... |
| Address | Full business address | 123 Main St, Austin, TX |
| Operating Hours | Business hours | Mon-Fri 9AM-6PM |
| Google Rating | Star rating | 4.5 |
| Total Reviews | Number of reviews | 127 |
| Reviews URL | Link to reviews | https://maps.google.com/reviews... |

🚀 Setup Instructions

Prerequisites
- n8n instance (self-hosted or cloud)
- Google account with Sheets access
- Bright Data account with Google Maps dataset access
- Anthropic API key for Claude AI

Step-by-Step
- Import the workflow JSON into n8n
- Configure Bright Data credentials and dataset access
- Set up Google Sheets and OAuth2 credentials
- Configure Claude AI with your API key
- Replace all placeholder credential IDs and tokens. For improved security, use credentials instead of hardcoding the API token placeholder in the HTTP Request node.
- Test with sample data (e.g., "Coffee Shop" in California, US)
- Activate the workflow and use the form for submissions

🛠 How to Customize

Modify Geographic Scope
- Add countries to the form dropdown
- Customize Claude prompts for city generation
- Adjust search logic for international markets

Enhance Data Collection
- Add more fields from Bright Data
- Include revenue, employee count, social profiles

Improve Duplicate Detection
- Use fuzzy matching for similar names
- Include address-based checks

Customize Output Format
- Transform data for CRM compatibility
- Export to CSV, database, or multiple destinations

Implement Advanced Features
- Integrate email finder services
- Include lead scoring logic
- Discover social media profiles

Batch Processing Optimization
- Adjust batch sizes per Bright Data limits
- Use parallel processing and retry logic

Integration Options
- Connect to CRMs like HubSpot or Salesforce
- Trigger email automation
- Integrate with marketing platforms
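A minimal sketch of the duplicate-removal step referenced under Quality Control above; field names mirror the output columns, and the normalisation rules are illustrative assumptions:

```python
# Minimal sketch: drop duplicate listings that share a business name or phone number.
# Field names mirror the output columns above; normalisation rules are illustrative.
import re

def normalise_phone(phone: str) -> str:
    """Keep digits only, so '+1 (555) 123-4567' and '555-123-4567' can match."""
    return re.sub(r"\D", "", phone or "")

def dedupe(listings: list[dict]) -> list[dict]:
    seen_names, seen_phones, unique = set(), set(), []
    for row in listings:
        name = (row.get("Business Name") or "").strip().lower()
        phone = normalise_phone(row.get("Phone Number", ""))
        if name in seen_names or (phone and phone in seen_phones):
            continue  # duplicate by name or phone
        seen_names.add(name)
        if phone:
            seen_phones.add(phone)
        unique.append(row)
    return unique

rows = [
    {"Business Name": "TechFix Computer Repair", "Phone Number": "+1 (555) 123-4567"},
    {"Business Name": "TechFix Computer Repair ", "Phone Number": "555-123-4567"},
]
print(dedupe(rows))  # only the first row survives
```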
by James Carter
This n8n workflow automatically fetches trending news articles based on your chosen country, category, and keyword — then enriches the data with AI-powered business insights before posting a concise summary to Slack. Ideal for sales teams, executives, marketers, or anyone who wants fast, actionable news briefings directly in their Slack workspace.

⸻

Who it’s for
Executives, analysts, sales teams, or marketing professionals who want curated, AI-enhanced news summaries tailored to business opportunities, risks, and trends — delivered automatically to Slack.

⸻

How it works / What it does
1. A Schedule Trigger runs on a daily, weekly, or custom frequency.
2. It queries the NewsAPI to retrieve top headlines by country, category, or keyword (see the request sketch below).
3. Headlines are formatted and enriched with your configured query context.
4. The AI model (GPT-4) analyzes articles and summarizes key insights, categorizing them as Opportunities, Risks, or Trends.
5. Finally, the summarized insights are posted directly into a Slack channel of your choice.

⸻

How to set up
1. Set your schedule frequency in the Schedule Trigger node.
2. Configure your preferred country, category, and keyword in the Inject Config node.
3. Add your NewsAPI key inside the Fetch News Articles node.
4. Connect your Slack credentials in the Post to Slack node.
5. Optional: Adjust the AI prompt for more tailored analysis.

⸻

Requirements
- A NewsAPI account to fetch headlines.
- An OpenAI API key for GPT-4 summarization.
- A Slack workspace and connected credentials via n8n.

⸻

How to customize the workflow
- Change the country, category, or keyword in the Inject Config node to focus on specific markets or sectors.
- Adjust the AI prompt in the GPT node to prioritize certain insights like ESG factors, M&A activity, or market sentiment.
- Extend the workflow to log results to Google Sheets, email summaries, or send SMS alerts.
- Replace the Schedule Trigger with a Webhook if you want to trigger summaries on demand.

This template is designed to be modular, making it easy to adapt for competitive intelligence, investment tracking, or industry news curation.
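A minimal sketch of the headline fetch against NewsAPI's top-headlines endpoint; parameter values are illustrative, and the Fetch News Articles node makes a comparable request using your Inject Config values:

```python
# Minimal sketch: fetch top headlines from NewsAPI by country, category, and keyword.
# Parameter values are illustrative; replace YOUR_NEWSAPI_KEY with your own key.
import requests

params = {
    "country": "us",                 # country configured in Inject Config
    "category": "business",          # category configured in Inject Config
    "q": "artificial intelligence",  # optional keyword filter
    "pageSize": 20,
    "apiKey": "YOUR_NEWSAPI_KEY",
}
resp = requests.get("https://newsapi.org/v2/top-headlines", params=params, timeout=30)
resp.raise_for_status()

for article in resp.json().get("articles", []):
    print(article["title"], "-", article["source"]["name"])
```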
by HoangSP
SEO Blog Generator with GPT-4o, Perplexity, and Telegram Integration

This workflow helps you automatically generate SEO-optimized blog posts using Perplexity.ai, OpenAI GPT-4o, and optionally Telegram for interaction.

🚀 Features
🧠 Topic research via Perplexity sub-workflow
✍️ AI-written blog post generated with GPT-4o
📊 Structured output with metadata: title, slug, meta description
📩 Integration with Telegram to trigger workflows or receive outputs (optional)

⚙️ Requirements
✅ OpenAI API key (GPT-4o or GPT-3.5)
✅ Perplexity API key (with access to /chat/completions; see the request sketch below)
✅ (Optional) Telegram Bot Token and webhook setup

🛠 Setup Instructions
1. Credentials:
   - Add your OpenAI credentials (openAiApi)
   - Add your Perplexity credentials under httpHeaderAuth
   - Optional: Set up Telegram credentials under telegramApi
2. Inputs: Use the Form Trigger or Telegram input node to send a Research Query
3. Subworkflow: Make sure to import and activate the subworkflow Perplexity_Searcher to fetch recent search results
4. Customization:
   - Edit prompt texts inside the Blog Content Generator and Metadata Generator to change writing style or target industry
   - Add or remove output nodes like Google Sheets, Notion, etc.

📦 Output Format
The final blog post includes:
✅ Blog content (1500–2000 words)
✅ Metadata: title, slug, and meta description
✅ Extracted summary in JSON
✅ Delivered to Telegram (if connected)

Need help? Reach out on the n8n community forum
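A minimal sketch of a Perplexity /chat/completions request like the one the Perplexity_Searcher sub-workflow makes; the model name is an assumption, so check Perplexity's current model list and use whichever your plan offers:

```python
# Minimal sketch: topic research via Perplexity's chat completions endpoint.
# The model name is an assumption; swap in a model your Perplexity plan supports.
import requests

headers = {
    "Authorization": "Bearer YOUR_PERPLEXITY_API_KEY",
    "Content-Type": "application/json",
}
payload = {
    "model": "sonar",  # assumed model name
    "messages": [
        {"role": "system", "content": "You are a research assistant for SEO blog writing."},
        {"role": "user", "content": "Summarise the latest developments in on-page SEO."},
    ],
}
resp = requests.post(
    "https://api.perplexity.ai/chat/completions", headers=headers, json=payload, timeout=60
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```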
by Femi Ad
"Ade Technical Analyst" is a dual-workflow AI system combining conversational intelligence with visual chart analysis through Telegram. The system features 11 primary nodes for conversation management and 8 secondary nodes for chart generation and analysis. Core Components: Telegram Integration: Message handling with dynamic typing indicators AI Personality: "Ade" - a financial analyst with 50+ years NYSE/LSE experience using Claude 3.5 Sonnet Chart Generation: TradingView integration via Chart-IMG API with MACD and volume indicators Visual Analysis: GPT-4O vision for technical pattern recognition Memory System: Session-based conversation context retention Target Users Individual traders seeking professional-grade analysis without subscription costs Financial advisors wanting 24/7 AI-powered client support Investment educators needing interactive learning tools Fintech companies requiring white-label analysis solutions Setup Requirements Critical Security Fix Needed: Remove hardcoded API key from Chart-IMG node immediately Store all credentials securely in n8n credential manager Required APIs: OpenRouter (Claude 3.5 Sonnet) OpenAI (GPT-4O vision) Chart-IMG API Telegram Bot Token Technical Prerequisites: n8n version 1.7+ with Langchain nodes Webhook configuration for Telegram Dual-workflow setup with proper ID referencing Workflow Requirements Security Compliance: Never hardcode API keys in workflow JSON files Use n8n credential manager for all sensitive data Implement proper session isolation for user data Include mandatory financial disclaimers Performance Specifications: Model temperature: 0.8 for balanced responses Token limit: 500 for optimized performance Dark theme charts with professional indicators Session-based memory management Need help customizing? Contact me for consulting and support or add me on LinkedIn
by Naveen Choudhary
Who is this for?
Marketing, content, and enablement teams that need a quick, human-readable summary of every new video published by the YouTube channels they care about—without leaving Slack.

What problem does this workflow solve?
Manually checking multiple channels, skimming long videos, and pasting the highlights into Slack wastes time. This template automates the whole loop: detect a fresh upload from your selected channels → pull subtitles → distill the key take-aways with GPT-4o-mini → drop a neatly-formatted digest in Slack.

What this workflow does
1. A Schedule Trigger fires every 10 min, then grabs a list of YouTube RSS feeds from a Google Sheet.
2. HTTP + XML nodes fetch and parse each feed; only brand-new videos continue (see the feed-parsing sketch below).
3. The YouTube API fetches the title/description, and RapidAPI grabs English subtitles.
4. Code nodes build an AI payload; OpenAI returns a JSON summary + article.
5. A formatter turns that JSON into Slack Block Kit, and Slack posts it.
6. Processed links are appended back to the “Video Links” sheet to prevent dupes.

Setup
1. Make a copy of this Google Sheet and connect a Google Sheets OAuth2 credential with edit rights.
2. Slack App: create it → add chat:write, channels:read, app_mention; enable Event Subscriptions; install and store the Bot OAuth token in an n8n Slack credential.
3. RapidAPI key for https://yt-api.p.rapidapi.com/subtitles (300 free calls/mo) → save as HTTP Header Auth.
4. OpenAI key → save in an OpenAI credential.
5. Add your RSS feed URLs to the “RSS Feed URLs” tab; press Execute Workflow.

How to customise
- Adjust the schedule interval or freshness window in “If newly published”.
- Swap the OpenAI model or prompt for shorter/longer digests.
- Point the Slack node at a different channel or DM.
- Extend the AI payload to include thumbnails or engagement stats.

Use-case ideas
- **Product marketing**: Instantly brief sales & CS teams when a competitor uploads a feature demo.
- **Internal learning hub**: Auto-summarise conference talks and share bullet-point notes with engineers.
- **Social media managers**: Get ready-to-post captions and key moments for re-purposing across platforms.
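A minimal sketch of the fetch-and-parse step for a single YouTube channel RSS feed; the channel ID is a placeholder, and inside the workflow this is handled by the HTTP Request and XML nodes:

```python
# Minimal sketch: fetch a YouTube channel RSS feed and keep only recent uploads.
# The channel ID is a placeholder; n8n does this with HTTP Request + XML nodes.
from datetime import datetime, timedelta, timezone
import xml.etree.ElementTree as ET

import requests

FEED_URL = "https://www.youtube.com/feeds/videos.xml?channel_id=UC_x5XG1OV2P6uZZ5FSM9Ttw"
NS = {"atom": "http://www.w3.org/2005/Atom"}

xml_text = requests.get(FEED_URL, timeout=30).text
root = ET.fromstring(xml_text)

freshness_window = datetime.now(timezone.utc) - timedelta(minutes=10)
for entry in root.findall("atom:entry", NS):
    title = entry.find("atom:title", NS).text
    link = entry.find("atom:link", NS).attrib["href"]
    published = datetime.fromisoformat(entry.find("atom:published", NS).text)
    if published >= freshness_window:  # only brand-new videos continue
        print(published, title, link)
```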
by Evoort Solutions
🖼️ Text-to-Image Generator using n8n + Flux AI

This n8n workflow automates image generation from text prompts using the Text-to-Image Flux AI API. It reads prompts from Google Sheets, generates images via the API, uploads them to Google Drive, and logs the outcome.

🌟 Key Features
- Integrates with Text-to-Image Flux AI on RapidAPI
- Converts base64 image data to downloadable files (see the conversion sketch below)
- Stores images on Google Drive
- Updates logs and errors back into Google Sheets
- Skips prompts already processed

📄 Google Sheet Column Structure
Your source Google Sheet should include the following columns:

| Column Name | Description |
|-------------------|--------------------------------------------------|
| Prompt | The text prompt to generate an image from |
| drive path | (Optional) File path or URL of saved image |
| Generated Date | Date/time the image was generated |
| Base64 | Base64 string or error message (for logging) |

Only rows with a non-empty Prompt and an empty drive path will be processed.

📌 Use Case
Perfect for:
- Bulk AI image generation for content marketing
- Creative automation with prompt-based image creation
- Building image assets based on structured datasets
- Any workflow where prompts are tracked via Google Sheets

Uses the Text-to-Image Flux AI API to generate high-quality images on demand.

🔧 Workflow Summary

| Step | Node | Description |
|------|------|-------------|
| 1 | Manual Trigger | Manually start the workflow |
| 2 | Google Sheets2 | Reads prompts from Google Sheets |
| 3 | Loop Over Items | Processes rows one by one |
| 4 | If2 | Skips rows that already have images |
| 5 | HTTP Request1 | Calls Text-to-Image Flux AI via RapidAPI |
| 6 | Code1 | Converts base64 image to binary file |
| 7 | Google Drive1 | Uploads the image file to a Drive folder |
| 8 | Google Sheets1 | Logs base64 result and timestamp back |
| 9 | If1 | Handles errors from the API |
| 10 | Google Sheets4 | Logs errors to the sheet |
| 11 | Wait | Adds delay between batches to prevent rate-limiting |

🚀 RapidAPI: Text-to-Image Flux AI
This flow is powered by Text-to-Image Flux AI. Be sure to:
1. Sign up at RapidAPI and subscribe to the API.
2. Copy your API key.
3. Replace "your key" in the HTTP Request1 node’s x-rapidapi-key header.

You can test the API directly here before connecting it to n8n.

✅ Tips for Setup
- Ensure you’ve set up a Google Service Account with access to both Sheets and Drive.
- Fill only the Prompt column — leave drive path and Base64 empty for new prompts.
- Monitor your RapidAPI dashboard for usage and quota.

Create your free n8n account and set up the workflow in just a few minutes using the link below:
👉 Start Automating with n8n
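A minimal sketch of the base64-to-file conversion handled by the Code1 node; inside n8n the decoded bytes are attached as binary data rather than written to disk, and the payload below is fabricated so the example runs:

```python
# Minimal sketch: decode a base64 image string into a binary image file.
# In real use the string comes from the Flux AI API response; here we fabricate a
# tiny valid payload so the sketch runs end to end (it is not a real PNG).
import base64

api_response = {"image_base64": base64.b64encode(b"\x89PNG fake image bytes").decode()}

image_bytes = base64.b64decode(api_response["image_base64"])
with open("generated_image.png", "wb") as f:
    f.write(image_bytes)
print(f"Wrote {len(image_bytes)} bytes to generated_image.png")
```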
by Julian Kaiser
This automated workflow scrapes and processes the monthly "Who is Hiring" thread from Hacker News, transforming raw job listings into structured data for analysis or integration with other systems. Perfect for job seekers, recruiters, or anyone looking to monitor tech job market trends.

How it works
1. Automatically fetches the latest "Who is Hiring" thread from Hacker News
2. Extracts and cleans relevant job posting data using the HN API
3. Splits and processes individual job listings into a structured format
4. Parses key information like location, role, requirements, and company details
5. Outputs clean, structured data ready for analysis or export

Set up steps
1. Configure API access to [Hacker News](https://github.com/HackerNews/API) (no authentication required)
2. Follow the steps to get your cURL command from https://hn.algolia.com/ (see the request sketch below)
3. Set up the desired output format (JSON structured data or custom format)
4. Optional: Configure additional parsing rules for specific job listing information
5. Optional: Set up integration with preferred storage or analysis tools

The workflow transforms unstructured job listings into clean, structured data following this pattern:
- Input: Raw HN thread comments
- Process: Extract, clean, and parse text
- Output: Structured job listing data

This template saves hours of manual work collecting and organizing job listings, making it easier to track and analyze tech job opportunities from Hacker News's popular monthly hiring threads.
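A minimal sketch of locating the latest thread and pulling its top-level comments via the public Algolia Hacker News API; the query string and field handling are illustrative assumptions:

```python
# Minimal sketch: locate the latest "Who is hiring?" thread and fetch its
# top-level comments via the public Algolia Hacker News API (no auth needed).
import requests

search = requests.get(
    "https://hn.algolia.com/api/v1/search_by_date",
    params={"query": "Ask HN: Who is hiring?", "tags": "story,author_whoishiring"},
    timeout=30,
).json()
thread_id = search["hits"][0]["objectID"]  # newest matching thread

thread = requests.get(f"https://hn.algolia.com/api/v1/items/{thread_id}", timeout=30).json()
listings = [child["text"] for child in thread["children"] if child.get("text")]

print(f"Thread {thread_id}: {len(listings)} top-level job listings")
print(listings[0][:200])  # raw HTML text of the first listing
```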
by AlexAy
Who is this workflow template for?
This workflow template is perfect for freelancers, small business owners, accounting teams, or anyone responsible for managing and recording invoices regularly. If you deal with multiple invoices and spend considerable time manually entering invoice data into a database, this automation will significantly simplify your daily operations and reduce potential errors.

What this workflow does
The workflow automates the entire invoice logging process. It continuously monitors a designated Google Drive folder every minute for new PDF invoice uploads. Once a new invoice is detected, it is automatically converted from PDF to an image format using the ILovePDF API. After conversion, Google's Gemini AI analyzes the image, intelligently extracting essential details such as vendor name, item description, invoice amount, invoice date, payment date, and bank reference numbers. Finally, this structured data is automatically recorded in an Airtable database (or optionally in a Google Sheet), ensuring organized, accessible records.

Detailed Workflow Explanation
- **Step 1: Invoice Detection** – Monitors Google Drive for newly uploaded PDF invoices.
- **Step 2: PDF to Image Conversion** – Converts PDFs into images using ILovePDF.
- **Step 3: Data Extraction via Gemini AI** – Uses Gemini AI to analyze the invoice image. Extracts data such as Vendor, Description, Amount, Invoice Date, Paid Date, and Bank Reference. Provides clear descriptions even when original invoice descriptions are vague or missing by analyzing vendor context.
- **Step 4: Structured Data Storage** – Automatically sends extracted data to Airtable or Google Sheets.
- **Step 5: File Management** – Moves processed PDF files into a separate "Done" folder to clearly differentiate between processed and unprocessed invoices.

Step-by-Step Setup Instructions
1. Set Up Google Drive:
   - Log in to Google Drive and create two folders: one named Invoices (for incoming PDF files) and one named Processed (for processed files).
2. Obtain API Credentials:
   - ILovePDF API: Sign up at ILovePDF Developers and retrieve your API key from your account dashboard.
   - Google Gemini AI API: Register at Google AI and generate an API key.
3. Airtable Database Preparation:
   - Create an Airtable base with the following columns: Vendor (Text), Description (Text), Amount (Number or Text), Invoice Date (Date), Paid Date (Date), Bank Reference (Text).
4. Import and Configure Workflow in n8n:
   - Import the provided workflow JSON file into your n8n instance.
   - Connect your Google Drive, ILovePDF, Google Gemini AI, and Airtable accounts by entering your credentials in their respective nodes.
5. Adjust Workflow Settings:
   - In the Google Drive nodes, ensure your newly created Invoices and Processed folders are correctly selected.
   - Update the ILovePDF public key in the appropriate HTTP Request node.
   - Customize the Gemini AI prompt to refine or expand data extraction according to your specific needs.
6. Testing Your Setup:
   - Upload a sample PDF invoice into the Invoices folder.
   - Execute the workflow by clicking Test Workflow in n8n and verify if data extraction and Airtable logging operate correctly.
Airtable Column Specifications
Ensure your Airtable includes the following structure:
- **Vendor**: Single Line Text
- **Description**: Single Line Text
- **Amount**: Currency or Single Line Text
- **Invoice Date**: Date (formatted as YYYY-MM-DD)
- **Paid Date**: Date (formatted as YYYY-MM-DD)
- **Bank Reference**: Single Line Text

How to Customize the Workflow
- **System Prompt:** Adjust the AI instructions by modifying the prompt text to focus on additional or fewer invoice details.
- **Structured Output Parser:** Modify the JSON schema in the parser node to match the structure and data points your project specifically requires (a sample schema sketch follows below).

By following these instructions, you’ll have a fully automated, reliable system for handling and logging invoice data, significantly enhancing your productivity.
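A minimal sketch of a JSON schema matching the Airtable columns above, expressed as a Python dict so it can be printed and pasted into the Structured Output Parser node; treat the exact shape as an assumption and adapt it to what the node expects:

```python
# Minimal sketch: a JSON schema covering the Airtable columns above, as a starting
# point for the Structured Output Parser node. Adjust required fields as needed.
import json

invoice_schema = {
    "type": "object",
    "properties": {
        "Vendor": {"type": "string"},
        "Description": {"type": "string"},
        "Amount": {"type": "string", "description": "Invoice total, e.g. '149.99 USD'"},
        "Invoice Date": {"type": "string", "description": "YYYY-MM-DD"},
        "Paid Date": {"type": "string", "description": "YYYY-MM-DD"},
        "Bank Reference": {"type": "string"},
    },
    "required": ["Vendor", "Description", "Amount", "Invoice Date"],
}

print(json.dumps(invoice_schema, indent=2))  # paste the printed schema into the parser node
```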
by Robert Breen
Extract Local Business Contacts with Google Sheets, SerpAPI & GPT‑4o

Status: Ready for Use ✅

Disclaimer: This workflow relies on community nodes that are not part of n8n’s core package. Install the following from n8n → Community Nodes before running:
- **n8n-nodes-langchain**
- **n8n-nodes-openai** (Structured Output Parser)
- **n8n-nodes-apify**

📝 Description
This n8n workflow automates discovery of local‑business contact details by search term and location, then enriches the results with publicly listed email addresses using GPT‑4o AI.

🔑 Key Features

🔗 Google Sheets Integration
- Reads search terms and locations from a Google Sheet.
- Processes only rows that are not marked Complete, preventing duplicates.

🗺️ Google Maps Search via SerpAPI
- Queries Google Maps through SerpAPI for every search‑term‑and‑location pair (see the request sketch at the end of this listing).
- Retrieves the following fields: business name, website, street address, and phone number.

🧠 Website Scraping & Email Extraction
- Scrapes the business homepage content with Apify’s Fast Website Content Crawler.
- Sends the scraped HTML to a GPT‑4o AI Agent.
- Extracts any publicly listed email address.
- Returns a clean, structured JSON object for downstream use.

💾 Data Storage & Tracking
- Writes every result to a Results tab in the same Google Sheet.
- Marks the corresponding row in the Searches tab as Complete once finished.

🧱 Extensible Design
The workflow uses modular sub‑workflows and AI agents. You can easily extend it to add:
- Phone‑number verification with Twilio
- Social‑media enrichment with Clearbit
- Exports to HubSpot, Salesforce, Airtable, PostgreSQL, or CSV files

📄 Google Sheet Setup
Create a Searches tab with these exact columns (one header row):
Search | Area | Area Name | Complete

Create a Results tab with these columns:
title | website | address | phone | Search | Search Name | Area | email (Manual Entry)

⚙️ Prerequisites
- Google Cloud Project with the Google Sheets API and Google Drive API enabled
- SerpAPI account (free trial or paid) – obtain an API key
- Apify account (free trial or paid) with the Fast Website Content Crawler actor installed
- OpenAI account with an API key that can access GPT‑4o models

🚀 Setup Instructions
1. Copy the Google Sheet
   - Make a personal copy of the template sheet. Ensure the tab names are Searches and Results.
   - https://docs.google.com/spreadsheets/d/1QgcVMlXRlM_5ZFFUHr6bVK-93Tzia9XseTX03ZYnowI/edit?usp=sharing
2. Configure Google Sheets nodes in n8n
   - Open the workflow.
   - Update the nodes Extract Search Terms and Save Emails to Sheet to point at your copied sheet.
   - Authenticate using Google OAuth2 credentials that have access to the sheet.
3. Add SerpAPI credentials
   - Sign in at https://serpapi.com and copy your API key.
   - In the Search Google Maps node, create a new credential and paste the key.
4. Set up Apify
   - Sign up at https://apify.com.
   - Add the Fast Website Content Crawler actor to your account.
   - In the Scrape Web Page HTTP node, append ?token=YOUR_API_KEY to the actor URL.
5. Add your OpenAI API key
   - Go to https://platform.openai.com and generate an API key.
   - Add it to the AI Agent and OpenAI Chat Model node credentials.

✅ Running the Workflow
Click Execute Workflow in n8n. For each unprocessed row in the Searches tab, the automation will:
1. Retrieve business information from Google Maps via SerpAPI.
2. Scrape the business website using Apify.
3. Use GPT‑4o to extract a public email address.
4. Write all collected data to the Results tab.
5. Mark the original row as Complete.

🧩 Example Use Cases
- Build highly targeted lead lists for sales and marketing outreach.
- Compile local business directories for regional websites or apps.
- Automate contact‑information collection for lead‑generation campaigns and reduce manual data entry.

🤝 Connect with Me
I’m Robert Breen, founder of Ynteractive — a consulting firm that helps businesses automate operations using n8n, AI agents, and custom workflows. I’ve helped clients build everything from intelligent chatbots to complex sales automations, and I’m always excited to collaborate or support new projects. If you found this workflow helpful or want to talk through an idea, I’d love to hear from you.

Links
🌐 Website: https://www.ynteractive.com
📺 YouTube: @ynteractivetraining
💼 LinkedIn: https://www.linkedin.com/in/robert-breen
📬 Email: rbreen@ynteractive.com
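As referenced under Key Features, a minimal sketch of the SerpAPI Google Maps query behind the Search Google Maps node; parameter names follow SerpAPI's Google Maps engine, but treat the exact response fields as assumptions to verify against their docs:

```python
# Minimal sketch: query SerpAPI's Google Maps engine for one search-term/location pair.
# Parameter values are illustrative; replace YOUR_SERPAPI_KEY with your own key.
import requests

params = {
    "engine": "google_maps",
    "q": "plumber in Austin, TX",  # built from the Search and Area columns
    "type": "search",
    "api_key": "YOUR_SERPAPI_KEY",
}
resp = requests.get("https://serpapi.com/search.json", params=params, timeout=60)
resp.raise_for_status()

for place in resp.json().get("local_results", []):
    print(place.get("title"), "|", place.get("website"), "|",
          place.get("address"), "|", place.get("phone"))
```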
by Jimleuk
This n8n template introduces the Dynamic Prompts AI workflow pattern, which is incredibly useful for certain types of data extraction tasks where attributes are unknown or need to remain flexible. The general idea behind this pattern is that the prompts for the attributes to be extracted live outside the template and so can be changed at any time - without needing to edit the template. This seriously cuts down on maintenance requirements and is reusable for any number of tables at little cost.

Check out the video demo I did for n8n Studio here: https://www.youtube.com/watch?v=_fNAD1u8BZw
Check out the example Airtable here: https://airtable.com/appAyH3GCBJ56cfXl/shrXzR1Tj99kuQbyL
Looking for the Baserow version? https://n8n.io/workflows/2780-ai-data-extraction-with-dynamic-prompts-and-baserow/

How it works
- Given we have an "input" field for context and a number of fields for the data we want to extract, this template runs in the background, reacts to any changes to either the "input" or the fields, and automatically updates the rows accordingly.
- The key is that Airtable fields have a special property called the "field description". In this pattern, we use this property to let the user store a simple prompt describing the data that should exist in the column.
- Our n8n template reads these column descriptions, aka "prompts", and uses them as instructions to perform tasks on the "input" (see the sketch below).
- In this template, the "input" is a PDF of a resume/CV and the columns are attributes an HR person would want to extract from it - such as full name, address, last position, years of experience, etc.

How to use
- First publish this template and ensure it's accessible via its webhook URL.
- You then have to run the "create airtable webhooks" mini-flow to configure your Airtable to send change events to the n8n template. This mini-flow exists in the template but you'll have to update the IDs. Check the template for more instructions.

Requirements
- Airtable for Tables/Database
- OpenAI for LLM and extraction. Feel free to choose another LLM if preferred.

Customising this workflow
- If you're not using files, you can replace the "input" field with anything you like. For example, the "input" could be single line text.
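A minimal sketch of the core trick, reading Airtable field descriptions via the base schema endpoint and treating them as extraction prompts; the base and table identifiers are placeholders, and the response shape should be verified against Airtable's Web API docs:

```python
# Minimal sketch: read Airtable field descriptions and use them as extraction prompts.
# Base/table names are placeholders; check Airtable's "Get base schema" API docs for
# the exact response shape before relying on this.
import requests

BASE_ID = "appXXXXXXXXXXXXXX"  # placeholder base ID
TABLE_NAME = "Resumes"         # placeholder table name
headers = {"Authorization": "Bearer YOUR_AIRTABLE_TOKEN"}

schema = requests.get(
    f"https://api.airtable.com/v0/meta/bases/{BASE_ID}/tables", headers=headers, timeout=30
).json()

table = next(t for t in schema["tables"] if t["name"] == TABLE_NAME)
prompts = {
    field["name"]: field["description"]
    for field in table["fields"]
    if field.get("description") and field["name"] != "input"
}

# Each column's description becomes an instruction the LLM applies to the "input" document.
for column, prompt in prompts.items():
    print(f"{column}: {prompt}")
```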