by Joey D’Anna
This workflow is a building block designed to be called from other workflows via an Execute Workflow node. When called from another workflow with a JSON input containing a "pulse" field holding the ID of the monday.com item to pull, this workflow returns:

- The item's name and ID
- All column data, indexable by column name
- All column data, indexable by the column's ID string
- All board relation columns, with their data and column values
- All subitems, with their data and column values

++Prerequisites++

- A monday.com account and credential
- A workflow that needs to get detailed data from a monday.com row
- The pulse ID of the monday.com row to retrieve data from

++Setup++

- Import the workflow
- Configure all monday.com nodes with your credentials and save the workflow
- Copy the workflow ID from its URL
- In a different workflow, add an Edit Fields node that outputs the field "pulse", set to the monday.com item you want to retrieve
- Feed the Edit Fields node into an Execute Workflow node and paste the workflow ID from above into it

The "pulse" field tells the workflow which pulse to retrieve, and it can be populated by an expression in your workflow. There is an example of the Edit Fields and Execute Workflow nodes in the template.
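As a minimal sketch, the JSON the calling workflow passes into the Execute Workflow node only needs the "pulse" field; the ID below is a placeholder:

```json
{
  "pulse": 1234567890
}
```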
by Mario
Purpose

This ensures that executions of scheduled workflows do not overlap when they take longer than expected.

How it works

- This is a separate workflow which monitors the execution of the main workflow
- It stores a flag in Redis (a key dynamically named after the workflow ID) which indicates whether the main workflow is running or idle
- It only calls the main workflow if the last execution has finished

Setup

- Update the credentials to suit your Redis instance
- Replace the Schedule Trigger of your main workflow with an Execute Workflow Trigger
- Copy the workflow ID from the URL
- Paste the workflow ID into the Execute Workflow node of this workflow
- Configure the Schedule Trigger node
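To illustrate the locking scheme, the Redis flag might look like the sketch below; the exact key prefix and values are assumptions, since the template derives the key name from the monitored workflow's ID:

```json
{
  "key": "workflow-status-AbC123xYz",
  "value": "running"
}
```

When the schedule fires, the monitor only triggers the main workflow if this flag indicates idle (or is absent), and it resets the flag once the main workflow finishes.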
by Jimleuk
This n8n template demonstrates how to calculate the evaluation metric "Correctness", which in this scenario compares the agent's response against a set of ground truths and classifies how well they match. The scoring approach is adapted from the open-source evaluations project RAGAS; you can see the source here: https://github.com/explodinggradients/ragas/blob/main/ragas/src/ragas/metrics/_answer_correctness.py

How it works

- This evaluation works best where the agent's response is allowed to be more verbose and conversational.
- For our scoring, we classify the agent's response into 3 buckets: True Positive (in answer and ground truth), False Positive (in answer but not ground truth) and False Negative (not in answer but in ground truth).
- We also calculate an average similarity score on the agent's response against all ground truths.
- The classification and the similarity score are then averaged to give the final score.
- A high score indicates the agent is accurate, whereas a low score could indicate the agent has incorrect training data or is not providing a comprehensive enough answer.

Requirements

- n8n version 1.94+
- Check out this Google Sheet for sample data: https://docs.google.com/spreadsheets/d/1YOnu2JJjlxd787AuYcg-wKbkjyjyZFgASYVV0jsij5Y/edit?usp=sharing
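As a rough sketch of the scoring described above: the linked RAGAS source combines the three buckets into an F1-style factuality score, which is then averaged with the mean similarity score (the template's exact weighting may differ slightly):

```
f1    = TP / (TP + 0.5 * (FP + FN))   # factual overlap from the three classification buckets
score = (f1 + avg_similarity) / 2     # averaged with the mean similarity against ground truths
```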
by Aitor | 1Node
This n8n workflow processes incoming Telegram messages, differentiating between text and voice messages.

How it works:

- Message Trigger: The workflow initiates when a new message is received via the Telegram "Message Trigger" node.
- Switch Node: This node acts as a router. It examines the incoming message: if the message is text, it directs the flow along the "text" branch; if the message contains voice, it directs the flow along the "voice" branch.
- Get Audio File: For voice messages, this node downloads the audio file from Telegram.
- Transcribe Audio: The downloaded audio file is then sent to an "OpenAI Transcribe Recording" node, which uses OpenAI's whisper-1 speech-to-text model to convert the audio into a text transcript.
- Send Transcription Message: Regardless of whether the original message was text or transcribed audio, the final text content is then passed to a "Send transcription message" node.

Setup Requirements:

- **Telegram Bot Token**: You will need a Telegram bot token configured in the "Message Trigger" node to receive messages.
- **OpenAI API Key**: An OpenAI API key is required for the "Transcribe audio" node to perform speech transcription.

Additional Notes:

This workflow provides a foundational step for building more complex AI-driven applications. The transcribed text or original text message can easily be piped into an AI agent (e.g., a large language model) for analysis, response generation, or interaction with other tools, extending the bot's capabilities beyond simple message reception and transcription.

👉 Need Help? Feel free to contact us at 1 Node. Get instant access to a library of free resources we created.
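For reference, a trimmed Telegram update for a voice message looks roughly like the example below (IDs are placeholders); the Switch node can route on whether message.voice or message.text is present:

```json
{
  "message": {
    "message_id": 42,
    "chat": { "id": 123456789 },
    "voice": {
      "file_id": "VOICE_FILE_ID_PLACEHOLDER",
      "duration": 7,
      "mime_type": "audio/ogg"
    }
  }
}
```

A plain text message carries a "text" field instead of the "voice" object.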
by Eric Mooney
Use case: When a new service ticket is created in Taiga, it's often unclear whether it contains sufficient detail to begin work. This workflow automates the triage process by:

- Using an AI model to extract key information from the ticket description.
- Automatically assigning values for:
  - Type (Bug, Enhancement, Onboarding, Question)
  - Severity (Wishlist, Minor, Normal, Important, Critical)
  - Priority (Low, Normal, High)
  - Status (New, Needs More Info, etc.)
- Detecting missing critical data and blocking the ticket if incomplete.

Setup instructions here: https://github.com/emooney/Service_Ticket_Triage_Helper
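As an illustration, the AI extraction step might return a structured object like the one below; the field names here are hypothetical, not necessarily those the template's prompt emits:

```json
{
  "type": "Bug",
  "severity": "Important",
  "priority": "High",
  "status": "Needs More Info",
  "missing_info": ["steps to reproduce", "affected version"]
}
```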
by NovaNode
Who is this for?

This template is designed for internal support teams, product specialists, and knowledge managers who want to build an AI-powered knowledge assistant with retrieval-augmented generation (RAG) and reinforcement learning from human feedback (RLHF) via Telegram.

What problem is this workflow solving?

Manual knowledge management and answering support queries can be time-consuming and error-prone. This solution automates importing and indexing official documentation into MongoDB vector search and enhances AI responses with Telegram-based user feedback to continuously improve answer quality.

What these workflows do

Workflow 1: Document ingestion & indexing

- A manually triggered workflow imports product documentation from Google Docs.
- Documents are split into manageable chunks and embedded using OpenAI embeddings.
- Embedded document chunks are stored in a MongoDB Atlas vector store to enable semantic search.

Workflow 2: Telegram chat with RLHF feedback loop

- Listens for user messages via Telegram bot integration.
- Uses vector similarity search on MongoDB to retrieve relevant documentation chunks.
- Generates answers with the OpenAI GPT-4o-mini model using retrieval-augmented generation.
- Sends answers back via Telegram and waits for user feedback (approval or disapproval).
- Captures feedback, maps it as positive or negative, and stores it with the conversation data for future model improvement.

Setup

Setting up vector embeddings

- Authenticate Google Docs and connect the Google Docs URL containing the product documentation you want to index.
- Authenticate MongoDB Atlas and connect the collection where you want to store the vector embeddings.
- Create a search index on this collection to support vector similarity queries. Ensure the index name matches the one configured in n8n (data_index). See the example MongoDB search index template below for reference.

Setting up chat with Telegram RLHF

- Create a bot in Telegram with @botFather using the /newbot command.
- Connect the MongoDB database and search index used for vector search in the previous workflow.
- Also create two new collections in MongoDB Atlas: one for feedback and one for chat history.
- Create a search index for feedback, copying the provided template.
- Configure the AI system prompt in the “Knowledge Base Agent” node, making sure it references all three tools connected (productDocs, feedbackPositive, feedbackNegative) as provided in the template prompt.
- Make sure the product documentation and feedback collections connect to the same MongoDB database. There are three distinct MongoDB collections: one for product documentation, one for feedback, and one for chat history (the chat history collection can be separate).
- Ensure Telegram API credentials are valid and webhook URLs are correctly set up.

MongoDB Search Index Templates

Documentation Collection Index

```json
{
  "mappings": {
    "dynamic": false,
    "fields": {
      "_id": { "type": "string" },
      "text": { "type": "string" },
      "embedding": {
        "type": "knnVector",
        "dimensions": 1536,
        "similarity": "cosine"
      },
      "source": { "type": "string" },
      "doc_id": { "type": "string" }
    }
  }
}
```

Feedback Collection Index

```json
{
  "mappings": {
    "dynamic": false,
    "fields": {
      "prompt": { "type": "string" },
      "response": { "type": "string" },
      "text": { "type": "string" },
      "embedding": {
        "type": "knnVector",
        "dimensions": 1536,
        "similarity": "cosine"
      },
      "feedback": { "type": "token" }
    }
  }
}
```
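To show how the feedback index above lines up with stored data, a feedback document might look like the following (illustrative values; the embedding array is truncated to three of its 1536 dimensions):

```json
{
  "prompt": "How do I reset my password?",
  "response": "Open Settings, choose Security, then select Reset password.",
  "text": "How do I reset my password?",
  "embedding": [0.0123, -0.0456, 0.0789],
  "feedback": "positive"
}
```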
by Henry
Automated Multilingual Gmail Draft Reply with OpenAI GPT-4o in n8n

Who is this for?

This workflow is ideal for anyone who receives a high volume of Gmail inquiries, especially those providing multilingual customer support or handling diverse client communications.

What problem is this workflow solving?

Managing frequent emails in multiple languages can be overwhelming. This workflow reduces manual drafting by automatically generating context-aware replies using OpenAI GPT-4o, letting users focus on personalization and quality assurance.

What this workflow does

- Monitors your Gmail inbox for new emails with a specific label (e.g., "Inquiry").
- Uses OpenAI GPT-4o for message assessment and language detection.
- Parses information using a JSON parser.
- Generates an AI-powered draft reply in the detected language via OpenAI GPT-4o.
- Converts the reply to HTML and saves it as a draft in the original Gmail thread for your review.

Setup

- Connect your Gmail account and set up relevant labels in both Gmail and the workflow.
- Integrate your OpenAI credentials in n8n.
- Configure the workflow trigger for your desired labels.

How to customize this workflow to your needs

- Adjust label names in both Gmail and the workflow for different email categories.
- Define custom starting and ending phrases for draft replies per supported language.
- Expand supported languages or modify AI prompt instructions to suit your brand’s tone.
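As a sketch of the assessment step, the GPT-4o output could be parsed by the JSON parser into something like the object below; the field names are hypothetical, and the template's own schema may differ:

```json
{
  "language": "fr",
  "is_inquiry": true,
  "topic": "shipping delay",
  "suggested_tone": "apologetic"
}
```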
by Jimleuk
This n8n template demonstrates the beginnings of building your own n8n-powered WhatsApp chatbot! Under the hood, it utilises n8n's powerful AI features to handle different message types and uses an AI agent to respond to the user. A powerful tool for any use-case!

How it works

- The incoming WhatsApp Trigger provides a way to get messages into the workflow.
- The message received is extracted and sent through 1 of 4 branches for processing.
- Each processing branch uses AI to analyse, summarize or transcribe the message so that the AI agent can understand it. The supported types are text, image, audio (voice notes) and video.
- The AI Agent is used to generate a response generally and uses a Wikipedia tool for more complex queries.
- Finally, the response message is sent back to the WhatsApp user using the WhatsApp node.

How to use

Once you have set up and configured your WhatsApp account, you'll need to activate your workflow to start processing messages. Good to know: large media files may negatively impact workflow performance.

Requirements

- WhatsApp Business account
- Google Gemini for the LLM. Gemini is used specifically because it can accept audio and video files whereas, at the time of writing, many other providers, like OpenAI's GPT, do not.

Customising this workflow

- For performance reasons, consider detecting large audio and video files before sending them to the LLM. Pre-processing such files may allow your agent to perform better.
- Go beyond and create rich and engaging customer experiences by responding using images, audio and video instead of just text!
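For reference, WhatsApp Business (Cloud API) messages carry a type field the four branches can route on. A trimmed, illustrative inbound payload for a voice note might look like this (the ID is a placeholder, and the real webhook nests this under entry/changes):

```json
{
  "messages": [
    {
      "from": "15551234567",
      "type": "audio",
      "audio": {
        "id": "MEDIA_ID_PLACEHOLDER",
        "mime_type": "audio/ogg; codecs=opus",
        "voice": true
      }
    }
  ]
}
```

Text, image and video messages use "type": "text", "image" or "video" with a matching payload object.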
by Sebastian/OptiLever
Tired of spending HOURS writing product descriptions that don’t rank or convert? This could be your solution. This free Product Description Writer workflow for n8n uses a multi-agent AI system to turn your product list into conversion-focused, SEO-ready copy. It analyzes your product images, identifies key features, and writes optimized titles and descriptions for platforms like Shopify and Google Shopping. It can process your entire catalog in minutes, saving you countless hours of manual work.

This workflow is perfect for:

- 🛒 Shopify stores
- 🛒 Etsy sellers
- 🛒 Product managers
- 🛒 Digital marketers
- 🛒 Anyone who hates writing product copy manually!

How it works

This workflow automates the entire product description process in a few high-level steps:

- Reads Your Products: The workflow starts by reading product data from your specified Google Sheet, including the product name, an image URL, and optional fields like brand voice or target market.
- Analyzes Product Images: It downloads each product image and uses an AI vision model (GPT-4o-mini) to perform a detailed visual analysis, extracting objective information like materials, colors, features, and structure.
- Writes Optimized Copy: The visual analysis and your original data are passed to two specialized AI agents. The first drafts a Shopify-optimized title and description, while the second refines it and generates additional SEO-focused copy for Google Merchant Center.
- Updates Your Spreadsheet: The final, optimized product titles and descriptions for both Shopify and Google are automatically written back to the original Google Sheet.

Set up steps

Setting up this workflow takes only a few minutes. You will need to configure credentials for the following services:

- **Google Sheets**: To allow the workflow to read your product list and write back the results.
- **OpenAI**: To power the AI agents that analyze images and generate the copy.

Detailed instructions and customization tips are included in the sticky notes inside the workflow itself.

Benefits

- **Automated Vision-Based Copywriting**: Reduces manual description writing time.
- **Multi-Channel Ready**: Outputs are optimized for both Shopify and Google Merchant Center standards.
- **Brand Alignment**: Uses optional user-provided draft descriptions and brand voice to maintain brand tone.
- **SEO and Conversion Focus**: Titles and descriptions are optimized for both search engines and consumer engagement.
- **Image-Centric Accuracy**: Uses actual product images for accurate attribute extraction, minimizing errors from missing or vague text data.

Tips & Customization

- To adjust brand voice or tone, modify the system prompts in the Shopify and GMC AI agents.
- To extend the workflow for scheduled runs, add a cron trigger or a Google Sheets "status column" filter.
- For QA/debugging, consider adding logging nodes to Slack or Discord, or export AI outputs to a review sheet before updating the main sheet.
- To improve Shopify or GMC field mappings, edit the final Google Sheets update node's column settings.
- For speed optimization, the batch size in the Loop Over Items node can be adjusted, but be mindful of API rate limits.
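To make the sheet round trip concrete, an input row plus the fields the workflow writes back might look like the sketch below; the column names are illustrative, not necessarily the template's exact headers:

```json
{
  "product_name": "Ceramic Pour-Over Coffee Dripper",
  "image_url": "https://example.com/images/dripper.jpg",
  "brand_voice": "warm, craft-focused",
  "shopify_title": "Ceramic Pour-Over Coffee Dripper | Slow-Brew Filter Cone",
  "shopify_description": "Hand-glazed ceramic cone that brews a clean, balanced cup every morning.",
  "gmc_title": "Ceramic Pour-Over Coffee Dripper - Reusable Filter Cone",
  "gmc_description": "Durable ceramic pour-over dripper for 1-2 cups, dishwasher safe."
}
```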
by Dhruv from Saleshandy
🧠 How it works

This workflow automates QA review of Intercom support conversations by:

- Triggering on conversation.admin.closed events via a webhook
- Fetching full conversation data using the Intercom API
- Structuring and summarizing the conversation into a readable transcript
- Using GPT to evaluate:
  - Response time
  - Clarity
  - Tone & behavior
  - Urgency handling
  - Ownership & resolution
- Logging structured QA scores in a Google Sheet
- Providing coaching-style feedback if the rating is 3 or below

⚙️ Set up steps

- 🔐 Configure your Intercom and OpenAI credentials in n8n
- 📩 Set up the webhook in Intercom to post on conversation close
- 🧠 Use your OpenAI API key for the GPT-based nodes
- 🗃️ Connect your Google Sheet (or replace it with another data sink)
- ✅ Add your own filtering logic for spam/promotional tickets if needed

Note: This workflow contains sticky notes to explain each step inside the n8n canvas.
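As an illustration of what the GPT evaluation might log to the sheet, one row could look like the object below; the column names and the 1-5 scale are assumptions, not the template's exact schema:

```json
{
  "conversation_id": "123456789",
  "response_time": 4,
  "clarity": 5,
  "tone_behavior": 4,
  "urgency_handling": 2,
  "ownership_resolution": 3,
  "overall_rating": 3,
  "coaching_feedback": "Acknowledge the customer's urgency earlier and confirm ownership of the fix."
}
```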
by Zacharia Kimotho
What it does

This workflow scrapes the top 10 pages on the SERP and conducts an in-depth analysis of the keyword intent for each ranking keyword, saving the information to a Google Sheet for further analysis.

How does this workflow work?

- We add the keywords and country codes we want to monitor and research to a Google Sheet
- Run the system
- Scrape the top 10 pages
- Analyze the intents of the top 10 and update the Google Sheet

Technical Setup

- Make a copy of this Google Sheet
- Add your desired keywords to the Google Sheet
- Map the keyword and country code
- Update the Zone name to match your zone on Bright Data
- Run the scraper
- Upon successful scraping, we run an intent classifier to determine the intents for each ranking page and update the Google Sheet

Setting up the SERP Scraper in Bright Data

- On Bright Data, go to the Proxies & Scraping tab
- Under SERP API, create a new zone
- Give it a suitable name and description. The default is serp_api
- Add this to your account
- Add your credentials as a header credential
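For orientation, a SERP request from an n8n HTTP Request node might carry a body like the sketch below. This assumes Bright Data's api.brightdata.com/request endpoint with a Bearer token header; check Bright Data's SERP API docs for the exact shape, and keep the zone name in sync with the one created above:

```json
{
  "zone": "serp_api",
  "url": "https://www.google.com/search?q=standing+desk&gl=us&brd_json=1",
  "format": "raw"
}
```

The gl parameter carries the country code from the sheet, and the brd_json=1 flag asks Bright Data to return parsed JSON results instead of raw HTML.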
by Alex Huy
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Description

This n8n workflow automatically scrapes Airbnb listings from a specified location and saves the data to a Google Sheet. It performs pagination to collect listings across multiple pages, extracts detailed information for each property, and organizes the data in a structured format for easy analysis.

How it Works

The workflow operates through these high-level steps:

- Search Initialization: Starts with an Airbnb search for a specific location (London) with defined check-in/check-out dates and guest count
- Pagination Loop: Automatically processes multiple pages of search results using cursor-based pagination
- Data Extraction: Parses listing information including names, prices, ratings, reviews, and URLs
- Detail Enhancement: Fetches additional details for each listing (house rules, highlights, descriptions, amenities)
- Data Storage: Saves all collected data to a Google Sheet with proper formatting
- Loop Control: Continues until reaching the page limit (2 pages) or no more results are available

Setup Steps

Prerequisites

- n8n instance with MCP (Model Context Protocol) support
- Google Sheets API credentials configured
- Airbnb MCP client properly set up

Configuration Steps

- Configure MCP Client: Set up the Airbnb MCP client with your credential ID and ensure the client has access to the airbnb_search and airbnb_listing_details tools
- Google Sheets Setup: Create a Google Sheet with ID 15IOJquaQ8CBtFilmFTuW8UFijux10NwSVzStyNJ1MsA, configure Google Sheets OAuth2 credentials (ID: 6YhBlgb8cXMN3Ra2), and ensure the sheet has these column headers: id, name, url, price_per_night, total_price, price_details, beds_rooms, rating, reviews, badge, location, houseRules, highlights, description, amenities
- Search Parameters: Location: "London" (can be modified in the "Airbnb Search" node); Adults: 7; Children: 1; Check-in: "2025-08-14"; Check-out: "2025-08-17"; Page limit: 2 (can be adjusted in the "If1" condition node)

Execution

- Use the manual trigger "When clicking 'Execute workflow'" to start the process
- Monitor the workflow execution through the n8n interface
- Check the Google Sheet for populated data after completion

Key Features

- Automatic Pagination: Processes multiple pages without manual intervention
- Comprehensive Data: Extracts both basic listing info and detailed property information
- Error Handling: Includes JSON parsing error handling and data validation
- Batch Processing: Uses split batches for efficient processing of individual listings
- Real-time Updates: Appends new data to existing Google Sheet records

Output Data Structure

Each listing contains:

- Basic info: ID, name, URL, pricing details, room/bed count
- Ratings: Average rating and review count
- Location: Latitude and longitude coordinates
- Enhanced details: House rules, highlights, descriptions, amenities
- Metadata: Page number, check-in/out dates, badges
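To show how the columns above map to a single scraped listing, an appended row might look like this (all values are illustrative placeholders, not real Airbnb data):

```json
{
  "id": "12345678",
  "name": "Bright 2-bedroom flat near Hyde Park",
  "url": "https://www.airbnb.com/rooms/12345678",
  "price_per_night": "£210",
  "total_price": "£630",
  "price_details": "£630 for 3 nights",
  "beds_rooms": "4 beds · 2 bedrooms",
  "rating": 4.87,
  "reviews": 132,
  "badge": "Guest favourite",
  "location": "51.5074, -0.1278",
  "houseRules": "No smoking; no parties or events",
  "highlights": "Self check-in; great location",
  "description": "Spacious flat close to the park and the Tube.",
  "amenities": "Wifi, Kitchen, Washer"
}
```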