by Guillaume Duvernay
Description

This template provides a simple and powerful backend for adding speech-to-text capabilities to any application. It creates a dedicated webhook that receives an audio file, transcribes it using OpenAI's gpt-4o-mini model, and returns the clean text. To help you get started immediately, you'll find a complete, ready-to-use HTML code example right inside the workflow in a sticky note. This code creates a functional recording interface you can use for testing or as a foundation for your own design.

Who is this for?

- **Developers:** Quickly add a transcription feature to your application by calling this webhook from your existing frontend or backend code.
- **No-code/Low-code builders:** Embed a functional audio recorder and transcription service into your projects by using the example code found inside the workflow.
- **API enthusiasts:** A lean, practical example of how to use n8n to wrap a service like OpenAI into your own secure and scalable API endpoint.

What problem does this solve?

- **Provides a ready-made API:** Instantly gives you a secure webhook to handle audio file uploads and transcription processing without any server setup.
- **Decouples frontend from backend:** Your application only needs to know about one simple webhook URL, so you can change the backend logic in n8n without touching your app's code.
- **Offers a clear implementation pattern:** The included example code provides a working demonstration of how to send an audio file from a browser and handle the response—a pattern you can replicate in any framework.

How it works

This solution works by defining a clear API contract between your application (the client) and the n8n workflow (the backend).

The client-side technique:
- Your application's interface records or selects an audio file.
- It makes a POST request to the n8n webhook URL, sending the audio file as multipart/form-data.
- It waits for the response from the webhook, parses the JSON body, and extracts the value of the Transcript key.
- You can see this exact pattern in action in the example code provided in the workflow's sticky note.

The n8n workflow (backend):
- The Webhook node catches the incoming POST request and grabs the audio file.
- The HTTP Request node sends this file to the OpenAI API.
- The Set node isolates the transcript text from the API's response.
- The Respond to Webhook node sends a clean JSON object ({"Transcript": "your text here..."}) back to your application.

Setup

Configure the n8n workflow:
1. In the Transcribe with OpenAI node, add your OpenAI API credentials.
2. Activate the workflow to enable the endpoint.
3. Click the "Copy" button on the Webhook node to get your unique Production Webhook URL.

Integrate with the frontend:
1. Inside the workflow, find the sticky note labeled "Example Frontend Code Below".
2. Copy the complete HTML from the note below it.
3. ⚠️ Important: In the code you just copied, find the line const WEBHOOK_URL = 'YOUR WEBHOOK URL'; and replace the placeholder with the Production Webhook URL from n8n.
4. Save the code as an HTML file and open it in your browser to test.

Taking it further

- **Save transcripts:** Add an **Airtable** or **Google Sheets** node to log every transcript that comes through the workflow.
- **Error handling:** Enhance the workflow to catch potential errors from the OpenAI API and respond with a clear error message.
- **Analyze the transcript:** Add a **Language Model** node after the transcription step to summarize the text, classify its sentiment, or extract key entities before sending the response.
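For reference, here is a minimal browser-side sketch of the client call described above. The form field name ("audio") and the file name are assumptions; the actual example in the workflow's sticky note may differ.

```javascript
// Minimal sketch: send a recorded audio blob to the webhook and read the transcript.
const WEBHOOK_URL = 'YOUR WEBHOOK URL'; // replace with your Production Webhook URL

async function transcribe(audioBlob) {
  // Send the audio as multipart/form-data (fetch sets the boundary header itself).
  const form = new FormData();
  form.append('audio', audioBlob, 'recording.webm');

  const response = await fetch(WEBHOOK_URL, { method: 'POST', body: form });
  if (!response.ok) {
    throw new Error(`Webhook request failed: ${response.status}`);
  }

  // The workflow responds with {"Transcript": "..."}.
  const data = await response.json();
  return data.Transcript;
}
```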
by Agent Studio
Overview

This workflow answers user requests sent via Mac Shortcuts. Several Shortcuts call the same webhook, each with a query and a type of query.

Types of query are:
- translate to English
- translate to Spanish
- correct grammar (without changing the actual content)
- make content shorter
- make content longer

How it works

1. Select a text you are writing.
2. Launch the shortcut.
3. The text is sent to the webhook.
4. Depending on the type of request, a different prompt is used.
5. Each request is sent to an OpenAI node.
6. The workflow responds to the request with the response from GPT.
7. The shortcut replaces the selected text with the new one.

For a demo and setup instructions:

How to use it

1. Activate the workflow.
2. Download this Shortcut template.
3. Install the shortcut.
4. In step 2 of the shortcut, change the URL of the Webhook.
5. In Shortcut details, "Add Keyboard Shortcut" with the key you want to use to launch the shortcut.
6. Go to Settings → Advanced and check "Allow Running Scripts".
7. You are ready to use the shortcut. Select a text and hit the keyboard shortcut you just defined.
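To illustrate the contract between the Shortcut and the webhook, here is a hedged sketch of the request the Shortcut sends. The field names "query" and "type" and the URL are assumptions; check the Webhook node and the Shortcut template for the exact names.

```javascript
// Hypothetical test call that mimics what the Mac Shortcut posts to the webhook.
const WEBHOOK_URL = 'https://your-n8n-instance/webhook/text-assistant'; // placeholder

async function rewrite(selectedText, type) {
  const response = await fetch(WEBHOOK_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query: selectedText, type }),
  });
  // The workflow replies with the GPT-rewritten text,
  // which the Shortcut then pastes over the selection.
  return response.text();
}

// Example: rewrite('Ths sentense has typos', 'correct grammar');
```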
by Viktor Klepikovskyi
Google Sheets UI for Workflow Control

This n8n template provides a practical and efficient way to manage your n8n workflows using Google Sheets as a user-friendly interface. It demonstrates how to leverage a simple spreadsheet to control inputs, capture outputs, and track the processing status of individual data rows, offering a clear and visual overview of your automation tasks.

Purpose of This Template

The primary purpose of this template is to illustrate how Google Sheets can serve as a dynamic UI for your n8n automations. It's designed for n8n users who need:
- A structured method to feed specific data into their workflows.
- The ability to selectively trigger workflow execution based on data status.
- A centralized place to view and store workflow outputs alongside original inputs.
- A simple, no-code solution for managing workflow data without building custom applications.

Setup Instructions

1. Create a Google Sheet: Set up a new Google Sheet (see the template here) with three columns: Color, Status, and Number. Populate the Color column with some sample data (e.g., color names) and set the Status for the rows you want to process to READY.
2. Import the n8n Workflow: Import this n8n template into your n8n instance.
3. Configure Google Sheets Nodes: For the first Google Sheets node (Read operation), ensure it's connected to your newly created Google Sheet and configured to read rows where the Status column is READY. You will need to authenticate your Google Sheets account. For the second Google Sheets node (Update operation), ensure it's also connected to the same Google Sheet. The node should automatically map the row_number, Number, and Status fields from the preceding nodes.
4. Execute the Workflow: Run the workflow. Observe how it reads READY rows, processes them (calculates string length), and updates the Number and Status columns in your Google Sheet to DONE.
5. Control Execution: To process new data, simply add new rows to your Google Sheet and set their Status to READY. Rerunning the workflow will then only process these new entries.

For more details and context on this approach, you can refer to the related blog post here.
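For orientation, a minimal sketch of the processing step that sits between the Read and Update Google Sheets nodes, assuming the Color/Status/Number columns described above. The template's own Code node may be implemented differently; this just mirrors the "calculate string length, mark as DONE" behavior.

```javascript
// n8n Code node (Run Once for All Items): compute the output for each READY row.
return $input.all().map((item) => {
  const color = item.json.Color ?? '';
  return {
    json: {
      row_number: item.json.row_number, // used by the Update node to locate the row
      Number: color.length,             // the computed result written back to the sheet
      Status: 'DONE',                   // marks the row as processed
    },
  };
});
```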
by Calistus Christian
How it works

• Webhook → urlscan.io → GPT-4o mini → Gmail
• Payload example: { "url": "https://example.com" }
• urlscan.io returns a Scan ID and raw JSON.
• The AI node classifies the scan as malicious / suspicious / benign, assigns a 1-10 risk score, and writes a two-sentence summary.
• Gmail sends an alert that includes the URL, Scan ID, AI verdict, screenshot link, and full report link.

Set-up steps (~5 min)

• Create three credentials in n8n:
  - urlscan.io API key
  - OpenAI API key (GPT-4o mini access)
  - Gmail OAuth (or SMTP)
• Replace those fields in the nodes, or reference env vars like {{ $env.OPENAI_API_KEY }}.
• Switch the Webhook to Production → copy the live URL.
• Test with:

  curl -X POST <your-webhook-url> \
    -H "Content-Type: application/json" \
    -d '{ "url": "https://example.com" }'
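As a rough sketch of how the alert could be assembled before the Gmail node, here is a hypothetical Code node. The node names ('Scan URL', 'Webhook') and the AI output fields (verdict, risk_score, summary) are placeholders and not taken from the template; verify them against the actual node outputs.

```javascript
// Hypothetical: combine the urlscan.io submission response with the AI verdict.
const scan = $('Scan URL').first().json;   // assumed urlscan.io response with a "uuid"
const ai = $input.first().json;            // assumed AI output: verdict, risk_score, summary

return [{
  json: {
    subject: `URL scan verdict: ${ai.verdict} (risk ${ai.risk_score}/10)`,
    body: [
      `URL: ${$('Webhook').first().json.body.url}`,
      `Scan ID: ${scan.uuid}`,
      `Verdict: ${ai.verdict}`,
      `Summary: ${ai.summary}`,
      `Screenshot: https://urlscan.io/screenshots/${scan.uuid}.png`,
      `Full report: https://urlscan.io/result/${scan.uuid}/`,
    ].join('\n'),
  },
}];
```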
by Ria
This is a very simple workflow that lets you subscribe to any GitHub repository for the latest release (using n8n as the example).

How it works:
- Polls the GitHub repository daily for the latest (stable) release of n8n
- Parses the content to HTML
- Sends a Gmail message

Setup steps:
- Add your Gmail credentials (or use another email node of your choice)
- Change the URL to the GitHub repository you want to check regularly
- Change the "To" email address to the address that should receive the updates

Feedback & Questions

If you have any questions or feedback about this workflow, feel free to get in touch at ria@n8n.io
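For context, a minimal sketch of the request the HTTP Request node makes against the public GitHub REST API. The owner/repo pair and the field usage below are illustrative; the workflow may read slightly different fields.

```javascript
// Fetch the latest published release of a repository via the GitHub REST API.
async function latestRelease(owner, repo) {
  const response = await fetch(`https://api.github.com/repos/${owner}/${repo}/releases/latest`, {
    headers: { Accept: 'application/vnd.github+json' },
  });
  return response.json();
}

// Fields typically used when composing the email:
// release.tag_name -> version tag of the release
// release.html_url -> link to the release page
// release.body     -> release notes in Markdown (converted to HTML downstream)
latestRelease('n8n-io', 'n8n').then((release) => console.log(release.tag_name, release.html_url));
```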
by Robert Breen
This workflow introduces beginners to one of the most fundamental concepts in n8n: looping over items. Using a simple use case—generating LinkedIn captions for content ideas—it demonstrates how to split a dataset into individual items, process them with AI, and collect the output for review or export.

✅ Key Features

- 🧪 **Create Dummy Data**: Simulate a small dataset of content ideas.
- 🔁 **Loop Over Items**: Process each row independently using the SplitInBatches node.
- 🧠 **AI Caption Creation**: Automatically generate LinkedIn captions using OpenAI.
- 🧰 **Tool Integration**: Enhance AI output with creativity-injection tools.
- 🧾 **Final Output Set**: Collect the original idea and generated caption.

🧰 What You'll Need

- ✅ An OpenAI API key
- ✅ The LangChain nodes enabled in your n8n instance
- ✅ Basic knowledge of how to trigger and run workflows in n8n

🔧 Step-by-Step Setup

1️⃣ Run Workflow
- **Node**: Manual Trigger (Run Workflow)
- **Purpose**: Manually start the workflow for testing or learning.

2️⃣ Create Random Data
- **Node**: Create Random Data (Code)
- **What it does**: Simulates incoming data with multiple content ideas.
- **Code**:

  return [
    { json: { row_number: 2, id: 1, Date: '2025-07-30', idea: 'n8n rises to the top', caption: '', complete: '' } },
    { json: { row_number: 3, id: 2, Date: '2025-07-31', idea: 'n8n nodes', caption: '', complete: '' } },
    { json: { row_number: 4, id: 3, Date: '2025-08-01', idea: 'n8n use cases for marketing', caption: '', complete: '' } }
  ];

3️⃣ Loop Over Items
- **Node**: Loop Over Items (SplitInBatches)
- **Purpose**: Sends one record at a time to the next node.
- **Why It Matters**: Loops in n8n are created using this node when you want to iterate over multiple items.

4️⃣ Create Captions with AI
- **Node**: Create Captions (LangChain Agent)
- **Prompt**: idea: {{ $json.idea }}
- **System Message**: You are a helpful assistant creating captions for a LinkedIn post. Please create a LinkedIn caption for the idea.
- **Model**: GPT-4o Mini or GPT-3.5
- **Credentials Required**: OpenAI credential — go to the OpenAI API Keys page, create a key, and add it in n8n under credentials as "OpenAi account".

5️⃣ Inject Creativity (Optional)
- **Node**: Tool: Inject Creativity (LangChain Tool)
- **Purpose**: Demonstrates optional LangChain tools that can enhance or manipulate input/output.
- **Why It's Cool**: A great way to show chaining tools to AI agents.

6️⃣ Output Table
- **Node**: Output Table (Set)
- **Purpose**: Combines original ideas and generated captions into the final structure.
- **Fields**:
  - idea: ={{ $('Create Random Data').item.json.idea }}
  - output: ={{ $json.output }}

💡 Educational Value

This workflow demonstrates:
- Creating dynamic inputs with the Code node
- Using SplitInBatches to simulate looping
- Sending dynamic prompts to an AI model
- Using Set to structure the output data

Beginners will understand how item-level processing works in n8n and how powerful looping combined with AI can be.

📬 Need Help or Want to Customize This?

Robert Breen
Automation Consultant | AI Workflow Designer | n8n Expert
📧 robert@ynteractive.com
🌐 ynteractive.com
🔗 LinkedIn

🏷️ Tags

n8n loops OpenAI LangChain workflow training beginner LinkedIn automation caption generator
by Airtop
Automating LinkedIn Company URL Verification

Use Case

This automation verifies that a given LinkedIn URL actually belongs to a company by comparing the website listed on their LinkedIn page against the expected company domain. It is essential for ensuring data accuracy in lead qualification, enrichment, and CRM updates.

What This Automation Does

Input parameters:
- **Company LinkedIn**: The LinkedIn URL to be verified.
- **Company Domain**: The expected domain (e.g., example.com) for validation.
- **Airtop Profile (connected to LinkedIn)**: Airtop Profile with LinkedIn authentication.

Output:
- Confirmation whether the LinkedIn page corresponds to the provided domain.
- Returns the verified LinkedIn URL if the match is confirmed.

How It Works

1. Extracts the website URL from the specified LinkedIn company profile.
2. Compares the extracted URL with the provided company domain.
3. If the domain is contained in the extracted website, the LinkedIn profile is confirmed as valid.
4. Returns the original LinkedIn URL if the match is successful.

Setup Requirements

- Airtop API Key
- LinkedIn-authenticated Airtop Profile

Next Steps

- **Use for LinkedIn Discovery Validation**: Ensure correctness after automated LinkedIn page discovery.
- **Combine with CRM Updates**: Prevent incorrect LinkedIn links from being stored in CRM.
- **Automate in Data Pipelines**: Use this as a validation gate before enrichment or scoring steps.
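A minimal sketch of the "domain contained in extracted website" comparison. The normalization shown here is an assumption; the template's own matching logic may be stricter or looser.

```javascript
// Hypothetical helper: does the website listed on LinkedIn match the expected domain?
function matchesDomain(extractedWebsite, expectedDomain) {
  if (!extractedWebsite) return false;
  // Normalize: strip protocol, path, and a leading "www." before comparing.
  const hostname = extractedWebsite
    .replace(/^https?:\/\//i, '')
    .replace(/^www\./i, '')
    .split('/')[0]
    .toLowerCase();
  return hostname.includes(expectedDomain.toLowerCase());
}

// Example: matchesDomain('https://www.example.com/about', 'example.com') // -> true
```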
by Jaruphat J.
⚠️ Note: This template requires a community node and works only on self-hosted n8n installations. It uses the Typhoon OCR Python package and custom command execution. Make sure to install the required dependencies locally.

Who is this for?

This template is for developers, operations teams, and automation builders in Thailand (or any Thai-speaking environment) who regularly process PDFs or scanned documents in Thai and want to extract structured text into a Google Sheet. It is ideal for:
- Local government document processing
- Thai-language enterprise paperwork
- AI automation pipelines requiring Thai OCR

What problem does this solve?

Typhoon OCR is one of the most accurate OCR tools for Thai text. However, integrating it into an end-to-end workflow usually requires manual scripting and data wrangling. This template solves that by:
- Running Typhoon OCR on PDF files
- Using AI to extract structured data fields
- Automatically storing results in Google Sheets

What this workflow does

1. Trigger: Run manually or from any automation source
2. Read Files: Load local PDF files from a doc/ folder
3. Execute Command: Run Typhoon OCR on each file using a Python command
4. LLM Extraction: Send the OCR markdown to an AI model (e.g., GPT-4 or OpenRouter) to extract fields
5. Code Node: Parse the LLM output as JSON
6. Google Sheets: Append structured data into a spreadsheet

Setup

1. Install requirements
- Python 3.10+
- typhoon-ocr: pip install typhoon-ocr
- Install Poppler and add it to the system PATH (needed for pdftoppm, pdfinfo)

2. Create folders
- Create a folder called doc in the same directory where n8n runs (or mount it via Docker)

3. Google Sheet
Create a Google Sheet with the following column headers:

| book_id | date | subject | detail | signed_by | signed_by2 | contact | download_url |
| ------- | ---- | ------- | ------ | --------- | ---------- | ------- | ------------ |

You can use this example Google Sheet as a reference.

4. API Key
Export your TYPHOON_OCR_API_KEY and OPENAI_API_KEY in your environment (or set them inside the command string in the Execute Command node).

How to customize this workflow

- Replace the LLM provider in the Basic LLM Chain node (currently supports OpenRouter)
- Change the output fields to match your data structure (adjust the prompt and the Google Sheet headers)
- Add trigger nodes (e.g., Dropbox Upload, Webhook) to automate input

About Typhoon OCR

Typhoon is a multilingual LLM and toolkit optimized for Thai NLP. It includes typhoon-ocr, a Python OCR library designed for Thai-centric documents. It is open source, highly accurate, and works well in automation pipelines. Perfect for government paperwork, PDF reports, and multilingual documents in Southeast Asia.
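As a rough illustration of step 5 (the Code node that parses the LLM output as JSON), here is a hedged sketch. It assumes the LLM is prompted to answer with a JSON object whose keys match the sheet headers and that its text lands in a field called text; both are assumptions to verify against the actual workflow.

```javascript
// Hypothetical Code node: turn the LLM's JSON answer into one item per document.
const raw = $input.first().json.text ?? '';

// Strip a possible fenced ```json ... ``` wrapper before parsing.
const cleaned = raw.replace(/^```(?:json)?\s*/i, '').replace(/```\s*$/, '').trim();

let fields;
try {
  fields = JSON.parse(cleaned);
} catch (error) {
  throw new Error(`LLM output was not valid JSON: ${error.message}`);
}

// Emit the columns expected by the Google Sheets node.
return [{
  json: {
    book_id: fields.book_id,
    date: fields.date,
    subject: fields.subject,
    detail: fields.detail,
    signed_by: fields.signed_by,
    signed_by2: fields.signed_by2,
    contact: fields.contact,
    download_url: fields.download_url,
  },
}];
```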
by John Alejandro SIlva
🤖🥗 Telegram Nutrition AI Assistant (Alternative to Cal AI App)

> AI-powered nutrition assistant for Telegram — log meals, set goals, and get personalized daily reports with Google Sheets integration.

📋 Description

This n8n template creates a Telegram-based Nutrition AI Assistant 🥑🔥 designed as an open-source alternative to the Cal AI mobile app. It allows users to interact with an AI agent via text, voice, or images to track meals, calculate macros, and monitor nutrition goals directly from Telegram. The system integrates Google Sheets as the database, handling both user profiles and meal logs, while leveraging Gemini AI for natural conversation, food recognition, and daily progress reports.

✨ Key Features

- 💬 **Multi-input support**: Text, voice messages (transcribed), and food images (AI analysis).
- 📊 **Macro calculation**: Automatic estimation of calories, proteins, carbs, and fats.
- 📝 **User-friendly registration**: Simple onboarding without storing personal health data (no weight/height required).
- 🎯 **Goal tracking**: Users can set and update calorie and protein targets.
- 📈 **Daily reports**: Personalized progress messages with visual progress bars.
- 🗂 **Google Sheets integration**: A Profile table for user targets and a Meals table for food logs.
- 🔄 **Advanced n8n nodes**: Includes use of Merge, Subworkflow, and Code nodes for data processing and report generation.

💡 Acknowledgment

Inspired by the Cal AI concept 💡 — this template demonstrates how to reproduce its main functionality with n8n, Telegram, and AI agents as a flexible, open-source automation workflow.

🏷 Tags

telegram ai-assistant nutrition meal-tracking google-sheets food-logging voice-transcription image-analysis daily-reports n8n-template merge-node subworkflow-node code-node telegram-trigger google-gemini

💼 Use Case

Use this template if you want to:
- 🥗 Log meals using text, images, or voice messages.
- 📊 Track nutrition goals (calories, proteins) with daily progress updates.
- 🤖 Provide a chat-based nutrition assistant without building a full app.
- 🗂 Store structured nutrition data in Google Sheets for easy access and analysis.

💬 Example User Interactions

- 📸 User sends a photo of a meal → AI analyzes the food and logs calories/macros.
- 🎤 User sends a voice message → AI transcribes and logs the meal.
- ⌨️ User types "report" → AI returns a daily nutrition summary with progress bars.
- 🥅 User says "update my protein goal" → AI updates the profile in Google Sheets.

🔑 Required Credentials

- Telegram Bot API (Bot Token)
- Google Sheets API credentials
- AI Provider API (Google Gemini or compatible LLM)

⚙️ Setup Instructions

1. 🗂 Create two Google Sheets tables:
   - Profile: User_ID, Name, Calories_target, Protein_target
   - Meals: User_ID, Date, Meal_description, Calories, Proteins, Carbs, Fats
2. 🔌 Configure the Telegram Trigger with your bot token.
3. 🤖 Connect your AI provider credentials (Gemini recommended).
4. 📑 Connect Google Sheets with your credentials.
5. ▶️ Deploy the workflow in n8n.
6. 🎯 Start interacting with your nutrition assistant via Telegram.

📌 Extra Notes

- 🟩 Green section: Handles the Telegram trigger and user check.
- 🟥 Red section: Registers new users and sets goals.
- 🟦 Blue section: Processes text, voice, and images.
- 🟨 Yellow section: Generates nutrition reports.
- 🟪 Purple section: Main AI agent controlling tools and logic.

💡 Need Assistance?

If you'd like help customizing or extending this workflow, feel free to reach out:
📧 Email: johnsilva11031@gmail.com
🔗 LinkedIn: John Alejandro Silva Rodríguez
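As a rough idea of the report-generation Code node (yellow section), here is a sketch that renders the progress bars mentioned above. The column names follow the Profile/Meals tables from the setup instructions, but the node names ('Get Profile', 'Get Today Meals') are placeholders and the template's actual implementation may differ.

```javascript
// Hypothetical daily-report builder for one user.
function progressBar(current, target, width = 10) {
  const ratio = target > 0 ? Math.min(current / target, 1) : 0;
  const filled = Math.round(ratio * width);
  return '▓'.repeat(filled) + '░'.repeat(width - filled);
}

const profile = $('Get Profile').first().json;               // Calories_target, Protein_target
const meals = $('Get Today Meals').all().map((i) => i.json);  // today's rows from the Meals sheet

const calories = meals.reduce((sum, m) => sum + Number(m.Calories || 0), 0);
const proteins = meals.reduce((sum, m) => sum + Number(m.Proteins || 0), 0);

return [{
  json: {
    report:
      '📈 Daily report\n' +
      `Calories: ${calories}/${profile.Calories_target} ${progressBar(calories, profile.Calories_target)}\n` +
      `Protein:  ${proteins}/${profile.Protein_target} ${progressBar(proteins, profile.Protein_target)}`,
  },
}];
```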
by Yaron Been
Workflow Overview

This n8n automation is a market research and intelligence gathering tool designed to transform web content discovery into actionable insights. By combining web crawling, AI-powered filtering, and smart summarization, this workflow:

Discovers Relevant Content:
- Automatically crawls target websites
- Identifies trending topics
- Extracts comprehensive article details

Intelligent Content Filtering:
- Applies custom keyword matching
- Filters for the most relevant articles
- Ensures high-quality information capture

AI-Powered Summarization:
- Generates concise, meaningful summaries
- Extracts key insights
- Provides quick, digestible information

Seamless Delivery:
- Sends summaries directly to Slack
- Enables instant team communication
- Facilitates rapid information sharing

Key Benefits

- 🤖 Full Automation: Continuous market intelligence
- 💡 Smart Filtering: Precision content discovery
- 📊 AI-Powered Insights: Intelligent summarization
- 🚀 Instant Delivery: Real-time team updates

Workflow Architecture

🔹 Stage 1: Content Discovery
- **Scheduled Trigger**: Daily market research
- **FireCrawl Integration**: Web content crawling
- **Comprehensive Site Scanning**: Extracts article metadata, captures full article content, identifies key information sources

🔹 Stage 2: Intelligent Filtering
- **Keyword-Based Matching**
- **Relevance Assessment**
- **Custom Domain Optimization**: AI and technology focus, startup and innovation tracking

🔹 Stage 3: AI Summarization
- **OpenAI GPT Integration**
- **Contextual Understanding**
- **Concise Insight Generation**: 3-point summary format, captures essential information

🔹 Stage 4: Team Notification
- **Slack Integration**
- **Instant Information Sharing**
- **Formatted Insight Delivery**

Potential Use Cases

- **Market Research Teams**: Trend tracking
- **Innovation Departments**: Technology monitoring
- **Startup Ecosystems**: Competitive intelligence
- **Product Management**: Industry insights
- **Strategic Planning**: Rapid information gathering

Setup Requirements

FireCrawl API
- Web crawling credentials
- Configured crawling parameters

OpenAI API
- GPT model access
- Summarization configuration
- API key management

Slack Workspace
- Channel for insights delivery
- Appropriate app permissions
- Webhook configuration

n8n Installation
- Cloud or self-hosted instance
- Workflow configuration
- API credential management

Future Enhancement Suggestions

- 🤖 Multi-source crawling
- 📊 Advanced sentiment analysis
- 🔔 Customizable alert mechanisms
- 🌐 Expanded topic tracking
- 🧠 Machine learning refinement

Technical Considerations

- Implement robust error handling
- Use exponential backoff for API calls
- Maintain flexible crawling strategies
- Ensure compliance with website terms of service

Ethical Guidelines

- Respect content creator rights
- Use data for legitimate research
- Maintain transparent information gathering
- Provide proper attribution

Workflow Visualization

[Daily Trigger]
⬇️
[Web Crawling]
⬇️
[Content Filtering]
⬇️
[AI Summarization]
⬇️
[Slack Delivery]

Connect With Me

Ready to revolutionize your market research?
📧 Email: Yaron@nofluff.online
🎥 YouTube: @YaronBeen
💼 LinkedIn: Yaron Been

Transform your information gathering with intelligent, automated workflows!

#AIResearch #MarketIntelligence #AutomatedInsights #TechTrends #WebCrawling #AIMarketing #InnovationTracking #BusinessIntelligence #DataAutomation #TechNews
by Yaron Been
Automated workflow that transforms BuiltWith technology data into actionable sales leads in Trello, creating a visual sales pipeline.

🚀 What It Does
- Converts tech stack data into Trello cards
- Organizes leads by technology stack
- Tracks sales pipeline stages
- Enables team collaboration
- Updates automatically

🎯 Perfect For
- Sales teams
- Business development
- Account executives
- Tech startups
- Digital agencies

⚙️ Key Benefits
- ✅ Visual sales pipeline
- ✅ Easy lead qualification
- ✅ Team collaboration
- ✅ Technology-based filtering
- ✅ Automated data entry

🔧 What You Need
- BuiltWith API access
- Trello account
- n8n instance
- Google account (for authentication)

📊 Data Mapped to Trello
- Company details
- Technology stack
- Contact information
- Website metrics
- Custom labels

🛠️ Setup & Support
- Quick Setup: Start in 20 minutes with our step-by-step guide
- 📺 Watch Tutorial
- 💼 Get Expert Support
- 📧 Direct Help

Turn technology intelligence into sales opportunities with automated lead management.
by Manuel
Effortlessly optimize your workflow by automatically importing hundreds of manufacturers from a Google Sheet into your Shopware online store, saving countless hours of manual work.

How it works

1. Retrieve all manufacturers from a Google Sheet.
2. Add each manufacturer to Shopware via the Shopware sync API endpoint.
3. Upload a logo for each manufacturer from a provided public URL to Shopware.

Set Up Steps

1. Add your Shopware URL to the first node, called Settings.
2. Create a Google Sheet in your Google account with the following columns (Demo Sheet):
   - name (the name of the manufacturer; has to be unique and is required)
   - website (URL of the manufacturer's website)
   - description
   - logo_url (public manufacturer logo URL; has to be a PNG, JPG, or SVG file)
   - translation_language_code_1 (optional; language code of your language, for example 'es-ES' for Spanish. Make sure a language with this code exists in your Shopware shop.)
   - translation_name_1 (optional; manufacturer name translated to the language you defined at translation_language_code_1)
   - translation_description_1 (optional; manufacturer description translated to the language you defined at translation_language_code_1)
   - translation_language_code_2 (optional; same as translation_language_code_1 for another language)
   - translation_name_2 (optional; same as translation_name_1 for another language)
   - translation_description_2 (optional; same as translation_description_1 for another language)
   - translation_language_code_3 (optional; same as translation_language_code_1 for another language)
   - translation_name_3 (optional; same as translation_name_1 for another language)
   - translation_description_3 (optional; same as translation_description_1 for another language)
3. Connect to your Google account.
4. Connect to your Shopware account:
   - Create a Shopware Integration.
   - Connect to Shopware at the nodes "Import Manufacturer" and "Upload Manufacturer Logo" using a Generic OAuth2 API authentication with grant type "Client Credentials". The Access Token URL is https://your-shopware-domain.com/api/oauth/token.
5. Run the workflow.
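For orientation, a hedged sketch of the two calls the workflow's HTTP Request nodes perform: fetching an access token with the Integration's client credentials, then upserting a manufacturer via the sync endpoint. The entity and field names follow the Shopware 6 Admin API as commonly documented ("product_manufacturer" with name, link, description), but verify them against your Shopware version before relying on this.

```javascript
// 1) Obtain an access token using the Integration's client credentials.
const SHOP_URL = 'https://your-shopware-domain.com'; // replace with your shop URL

async function importManufacturer(manufacturer) {
  const tokenResponse = await fetch(`${SHOP_URL}/api/oauth/token`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      grant_type: 'client_credentials',
      client_id: 'YOUR_CLIENT_ID',         // from the Shopware Integration
      client_secret: 'YOUR_CLIENT_SECRET',
    }),
  });
  const { access_token } = await tokenResponse.json();

  // 2) Upsert the manufacturer via the sync endpoint.
  await fetch(`${SHOP_URL}/api/_action/sync`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${access_token}`,
    },
    body: JSON.stringify({
      'write-manufacturers': {
        entity: 'product_manufacturer',
        action: 'upsert',
        payload: [
          {
            name: manufacturer.name,               // required, must be unique
            link: manufacturer.website,            // maps the "website" column
            description: manufacturer.description, // maps the "description" column
          },
        ],
      },
    }),
  });
}

// Example: importManufacturer({ name: 'Acme', website: 'https://acme.example', description: 'Demo' });
```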