by iamvaar
⚠️ RUN the FIRST WORKFLOW ONLY ONCE (it converts your content into embedding format, saves it in the DB, and the data is then ready for RAG chat).

📌 Telegram Trigger
**Type:** telegramTrigger
**Purpose:** Waits for new Telegram messages to trigger the workflow.
**Note:** Currently disabled.

📄 Content for the Training
**Type:** googleDocs
**Purpose:** Fetches document content from Google Docs using its URL.
**Details:** Uses Service Account authentication.

✂️ Splitting into Chunks
**Type:** code
**Purpose:** Splits the fetched document text into smaller chunks (1,000 characters each) for processing; a minimal sketch of this logic follows this listing.
**Logic:** Loops over the text and slices it.

🧠 Embedding Uploaded Document
**Type:** httpRequest
**Purpose:** Calls the Together AI embedding API to get vector embeddings for each text chunk.
**Details:** Sends JSON with the model name and the chunk as input.

🛢 Save the embedding in DB
**Type:** supabase
**Purpose:** Saves each text chunk and its embedding vector into the Supabase embed table.

SECOND WORKFLOW EXPLANATION:

💬 When chat message received
**Type:** chatTrigger
**Purpose:** Starts the workflow when a user sends a chat message.
**Details:** Sends an initial greeting message to the user.

🧩 Embend User Message
**Type:** httpRequest
**Purpose:** Generates an embedding for the user's input message.
**Details:** Calls the Together AI embeddings API.

🔍 Search Embeddings
**Type:** httpRequest
**Purpose:** Searches the Supabase DB for the top 5 most similar text chunks based on the generated embedding.
**Details:** Calls the Supabase RPC function matchembeddings1.

📦 Aggregate
**Type:** aggregate
**Purpose:** Combines all retrieved text chunks into a single aggregated context for the LLM.

🧠 Basic LLM Chain
**Type:** chainLlm
**Purpose:** Passes the user's question plus the aggregated context to the LLM to generate a detailed answer.
**Details:** Contains a prompt instructing the LLM to answer only based on the context.

🤖 OpenRouter Chat Model
**Type:** lmChatOpenRouter
**Purpose:** Provides the AI language model that processes the prompt.
**Details:** Uses the qwen/qwen3-8b:free model via OpenRouter; you can substitute any model of your choice.
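The chunking logic in the Splitting into Chunks Code node could look like this minimal sketch (the `content` field name is an assumption; adapt it to the actual output of your Google Docs node):

```javascript
// n8n Code node: split the fetched document text into 1,000-character chunks.
// Assumes the previous node outputs the document text in a field named `content`.
const text = $input.first().json.content || '';
const chunkSize = 1000;

const chunks = [];
for (let i = 0; i < text.length; i += chunkSize) {
  chunks.push({ json: { chunk: text.slice(i, i + chunkSize) } });
}

// Each chunk becomes a separate item for the embedding HTTP request.
return chunks;
```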
by Harshil Agrawal
This workflow analyzes the sentiment of feedback provided by users and sends negative feedback to a Mattermost channel.

Typeform Trigger node: Whenever a user submits a response to the Typeform, the Typeform Trigger node triggers the workflow. The node returns the response that the user submitted in the form.

Google Cloud Natural Language node: This node analyzes the sentiment of the user's response and returns a score.

IF node: The IF node uses the score provided by the Google Cloud Natural Language node and checks whether the score is negative (smaller than 0). If the score is negative the result is true, otherwise false. A sketch of the equivalent check is shown below.

Mattermost node: If the score is negative, the IF node returns true and the true branch of the IF node is executed. We connect the Mattermost node to the true branch of the IF node, so whenever the sentiment score is negative a message is posted to a channel in Mattermost.

NoOp node: This node is optional; its absence won't change how the workflow functions.

This workflow can be used by Product Managers to analyze product feedback, or by HR to analyze employee feedback. You can even use it for sentiment analysis of Tweets: replace the Typeform Trigger node with the Twitter node.

**Note:** You will need a Trigger node or Start node to start the workflow. Instead of posting a message to Mattermost, you can save the results in a database, a Google Sheet, or Airtable: replace the Mattermost node with (or add after it) the node of your choice. You can learn to build this workflow on the documentation page of the Google Cloud Natural Language node.
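For reference, the IF node's condition is equivalent to this Code-node sketch (Google Cloud Natural Language scores range from -1.0 to 1.0; the `documentSentiment.score` field path is an assumption to verify against your node's actual output):

```javascript
// Keep only items whose sentiment score is negative,
// mirroring the IF node's "smaller than 0" check.
return $input.all().filter((item) => {
  const score = item.json.documentSentiment?.score ?? 0;
  return score < 0; // only negative feedback continues to Mattermost
});
```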
by Hubschrauber
Overview

This template describes a possible approach to handle a pseudo-callback/trigger from an independent, external process (initiated from a workflow) and combine the received input with the workflow execution that is already in progress. This requires the external system to pass through some context information (resumeUrl), but allows the "primary" workflow execution to continue with BOTH its own (previous-node) context AND the input received in the "secondary" trigger/process.

Primary Workflow Trigger/Execution

The workflow path from the primary trigger initiates some external, independent process and provides "context" which includes the value of $execution.resumeUrl. This execution then reaches a Wait node configured with Resume - On Webhook Call and stops until a call to resumeUrl is received.

External, Independent Process

The external, independent process could be anything like a Telegram conversation or a web service, as long as: it results in a single execution of the Secondary Workflow Trigger, and it can pass through the value of resumeUrl associated with the Primary Workflow Execution.

Secondary Workflow Trigger/Execution

The secondary workflow execution can start with any kind of trigger as long as its input can include the resumeUrl. To combine/rejoin the primary workflow execution, this execution passes along whatever it receives from its trigger input to the resume-webhook endpoint on the Wait node. A minimal sketch of that resume call is shown below.

Notes

**IMPORTANT:** The workflow IDs in the Set nodes marked **Update Me** have embedded references to the workflow IDs in the original system. They will need to be CHANGED to make this demo work.

**Note:** The Resume Other Workflow Execution node in the template uses the $env.WEBHOOK_URL configuration to convert to an internal "localhost" call in a Docker environment. This can be done differently.

**ALERT:** This pattern is NOT suitable for a workflow that handles multiple items, because the first workflow execution will only be waiting for one callback. The second workflow (not the second trigger in the first workflow) is just to demonstrate how the Independent, External Process needs to work.
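For illustration, the secondary execution's call to the Wait node's resume endpoint is a plain HTTP POST, roughly as in this sketch (the payload shape is an assumption — pass through whatever your trigger received):

```javascript
// Resume the waiting primary execution by POSTing the secondary trigger's data
// to the resumeUrl that the primary workflow handed to the external process.
// `resumeUrl` and `payload` are assumed to come from the secondary trigger's input.
async function resumePrimaryExecution(resumeUrl, payload) {
  const response = await fetch(resumeUrl, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  });
  if (!response.ok) {
    throw new Error(`Resume call failed: ${response.status}`);
  }
  // The primary execution continues with this payload as the Wait node's input.
  return response;
}
```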
by Davide
This workflow automates the process of converting images from JPG/PNG format to WEBP using the APYHub API. It retrieves image URLs from a Google Sheet, converts the images, and uploads the converted files to Google Drive. This workflow is a powerful tool for automating image conversion tasks, saving time and ensuring that images are efficiently converted and stored in the desired format.

Using WEBP images on a website provides several SEO benefits:

- Faster Loading Speed – WEBP files are smaller than JPG and PNG, reducing page load times and improving user experience.
- Better Core Web Vitals – Google prioritizes websites with good performance metrics like LCP (Largest Contentful Paint).
- Improved Mobile Performance – Smaller images consume less bandwidth, enhancing mobile usability.
- Higher Search Rankings – Faster sites tend to rank better on Google due to improved user experience.
- Reduced Server Load – Lighter images lower hosting and CDN costs while improving site efficiency.

Below is a breakdown of the workflow:

1. How It Works

The workflow is designed to convert images from JPG/PNG to WEBP format and manage the converted files:

- Manual Trigger: The workflow starts with a Manual Trigger node, which initiates the process when the user clicks "Test workflow."
- Set API Key: The Set API KEY node defines the API key required to access the APYHub API.
- Get Images: The Get Images node retrieves a list of image URLs from a Google Sheet. The sheet contains columns for the original image URL (FROM), the converted image URL (TO), and a status flag (DONE).
- Get Extension: The Get Extension node extracts the file extension (JPG, JPEG, or PNG) from the image URL and adds it to the JSON data (see the sketch after this list).
- Determine Image Type: The JPG or PNG? node checks the file extension and routes the workflow to the appropriate conversion node: JPG/JPEG routes to the From JPG to WEBP node; PNG routes to the PNG to WEBP node.
- Convert Image: The From JPG to WEBP and PNG to WEBP nodes send POST requests to the APYHub API to convert the images to WEBP format. The API returns the URL of the converted image.
- Update Google Sheet: The Update Sheet node updates the Google Sheet with the URL of the converted image and marks the row as done (DONE).
- Get Converted Image: The Get File Image node downloads the converted WEBP image from the URL provided by the APYHub API.
- Upload to Google Drive: The Upload Image node uploads the converted WEBP image to a specified folder in Google Drive.

2. Set Up Steps

To set up and use this workflow in n8n, follow these steps:

- APYHub API Key: Obtain an API key from APYHub and define it in the Set API KEY node.
- Google Sheets Integration: Set up Google Sheets credentials in n8n for the Get Images and Update Sheet nodes. Create a Google Sheet with columns for FROM (original image URL), TO (converted image URL), and DONE (status flag). Provide the Document ID and Sheet Name in the Get Images node.
- Google Drive Integration: Set up Google Drive credentials in n8n for the Upload Image node. Specify the folder ID in Google Drive where the converted images will be uploaded.
- Test the Workflow: Click the "Test workflow" button in n8n to trigger the workflow. The workflow will retrieve image URLs from the Google Sheet, convert the images to WEBP format using the APYHub API, update the Google Sheet with the converted image URLs, and upload the converted images to Google Drive.
- Optional Customization: Modify the workflow to include additional features, such as adding more image formats for conversion, sending notifications when the conversion is complete, or integrating with other storage services (e.g., Dropbox, AWS S3).
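The Get Extension step could be implemented with a Code node like this sketch (it assumes each item carries the source URL in a FROM field, matching the sheet column):

```javascript
// n8n Code node: extract the file extension from each image URL
// and add it to the item so the "JPG or PNG?" node can route on it.
return $input.all().map((item) => {
  const url = item.json.FROM || '';
  // Take the text after the last dot and strip any query string or fragment.
  const extension = url.split('.').pop().split(/[?#]/)[0].toLowerCase();
  return { json: { ...item.json, extension } };
});
```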
by Yaron Been
This workflow provides automated access to the Settyan Flash V2.0.0 Beta.4 AI model through the Replicate API. It saves you time by eliminating the need to manually interact with AI models and provides a seamless integration for generation tasks within your n8n automation workflows.

Overview

This workflow automatically handles the complete generation process using the Settyan Flash V2.0.0 Beta.4 model. It manages API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation.

Model Description: Advanced AI model for automated processing and generation tasks.

Key Capabilities

- Specialized AI model with unique capabilities
- Advanced processing and generation features
- Custom AI-powered automation tools

Tools Used

- **n8n**: The automation platform that orchestrates the workflow
- **Replicate API**: Access to the Settyan/flash-v2.0.0-beta.4 AI model
- **Settyan Flash V2.0.0 Beta.4**: The core AI model for generation
- **Built-in Error Handling**: Automatic retry logic and comprehensive error management

How to Install

1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Replicate API: Add your Replicate API token to the 'Set API Token' node
3. Customize Parameters: Adjust the model parameters in the 'Set Other Parameters' node
4. Test the Workflow: Run the workflow with your desired inputs
5. Integrate: Connect this workflow to your existing automation pipelines

A sketch of the underlying Replicate API call is shown at the end of this listing.

Use Cases

- Specialized Processing: Handle specific AI tasks and workflows
- Custom Automation: Implement unique business logic and processing
- Data Processing: Transform and analyze various types of data
- AI Integration: Add AI capabilities to existing systems and workflows

Connect with Me

- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Replicate API: https://replicate.com (Sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #aiprocessing #dataprocessing #machinelearning #artificialintelligence #aitools #automation #digitalart #contentcreation #productivity #innovation
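For reference, a minimal sketch of how a Replicate prediction is created and polled over HTTP (the token, version hash, and input fields are placeholders — check the model's page on Replicate for the real values):

```javascript
// Create a prediction on Replicate and poll until it reaches a terminal status.
// `token`, `versionHash`, and `input` are placeholders for your own values.
async function runPrediction(token, versionHash, input) {
  const create = await fetch('https://api.replicate.com/v1/predictions', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${token}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ version: versionHash, input }),
  });
  let prediction = await create.json();

  // Poll the prediction URL until Replicate reports success or failure.
  while (!['succeeded', 'failed', 'canceled'].includes(prediction.status)) {
    await new Promise((r) => setTimeout(r, 2000));
    const poll = await fetch(prediction.urls.get, {
      headers: { Authorization: `Bearer ${token}` },
    });
    prediction = await poll.json();
  }
  return prediction.output;
}
```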
by AI Native
This workflow automates the process of retrieving Hugging Face paper summaries, analyzing them with OpenAI, and storing the results in Notion. Here's a breakdown of how it works:

⏰ Scheduled Trigger: The flow runs automatically at 8 AM on weekdays.
📄 Fetching Paper Data: It fetches Hugging Face paper summaries using their API.
🔍 Data Check: Before processing, the workflow checks whether the paper already exists in Notion to avoid duplicates.
🤖 Content Analysis with OpenAI: If the paper is new, it extracts the summary and uses OpenAI to analyze the content.
📥 Store Results in Notion: After analysis, the summarized data is saved in Notion for easy reference.

⚙️ Set Up Steps for Automation

Follow these steps to set up this automated workflow with Hugging Face, OpenAI, and Notion integration:

🔑 Obtain API Tokens: You'll need the Notion and OpenAI API tokens to authenticate and connect these services with n8n.
🔗 Integration in n8n: Link Hugging Face, OpenAI, and Notion by configuring the appropriate nodes in n8n.
🔧 Configure Workflow Logic:
- Set up a cron trigger for automatic execution at 8 AM on weekdays.
- Use an HTTP request node to fetch Hugging Face paper data (see the sketch below).
- Add logic to check if the data already exists in Notion.
- Set up the OpenAI integration to analyze the paper's summary.
- Store the results in Notion for easy access and reference.
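The Hugging Face fetch could look like this sketch (it assumes the public daily-papers endpoint and response shape — verify both before relying on them):

```javascript
// Fetch the day's Hugging Face paper listings and keep just the fields
// the rest of the workflow needs (id, title, summary).
async function fetchDailyPapers() {
  const res = await fetch('https://huggingface.co/api/daily_papers');
  if (!res.ok) throw new Error(`Hugging Face API returned ${res.status}`);
  const papers = await res.json();
  return papers.map((entry) => ({
    id: entry.paper?.id,
    title: entry.paper?.title,
    summary: entry.paper?.summary,
  }));
}
```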
by bangank36
This workflow retrieves all Shopify Orders and saves them into a Google Sheets spreadsheet using the Shopify Admin REST API. It uses pagination to ensure all orders are collected efficiently.

I originally built this workflow for my own use and found it valuable for understanding how Shopify pagination works. Now, I'm sharing it to help others automate their order retrieval process.

How It Works

Instead of relying on the built-in Shopify node (Get Orders Many), this workflow leverages the HTTP Request node to fetch paginated chunks manually:

- Shopify uses cursor-based pagination (page_info) instead of traditional page numbers.
- Pagination data is stored in the response headers, so we need to enable Include Response Headers and Status in the HTTP Request node (a sketch of the header parsing follows the parameter list below).
- You can modify the limit parameter to control batch sizes and optimize for rate limits.
- This workflow can be run on demand or scheduled to keep your data up to date.

Parameters

You can adjust these parameters in the HTTP Request node:

- limit – The number of orders per request (default: 50, max: 250).
- fields – Comma-separated list of fields to retrieve.
- page_info – Used for pagination.

📌 Note: when you query paginated chunks with page_info, only the limit and fields parameters are allowed.

Credentials

- Shopify API Key – Required for authentication.
- Google Sheets API credentials – Needed to insert data into the spreadsheet.

💾 Clone the Google Sheets template here

Who Is This For?

- Shopify store owners who need to export all orders to Google Sheets.
- Users who want full control over API parameters for optimized queries.
- Anyone looking for a flexible and scalable Shopify data extraction solution.

Explore More Templates

👉 Check out my other n8n templates
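For illustration, extracting the next page_info cursor from Shopify's Link response header might look like this sketch (header access shown generically; in n8n, enable Include Response Headers and Status and read the header from the node's output):

```javascript
// Parse Shopify's cursor-based pagination from the Link response header.
// Returns the page_info cursor for the next page, or null when done.
function getNextPageInfo(linkHeader) {
  if (!linkHeader) return null;
  // The header looks like: <https://...orders.json?limit=50&page_info=abc>; rel="next"
  const next = linkHeader
    .split(',')
    .find((part) => part.includes('rel="next"'));
  if (!next) return null;
  const url = new URL(next.split(';')[0].trim().replace(/[<>]/g, ''));
  return url.searchParams.get('page_info');
}
```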
by David Olusola
How It Works – Data Deduplication in n8n

This tutorial demonstrates how to remove duplicate records from a dataset using JavaScript logic inside n8n's Code nodes. It simulates real-world data cleaning by generating sample user data with intentional duplicates (based on email addresses) and walks you through the process of deduplication step by step.

The process includes:

- Creating Sample Data with duplicates.
- Filtering Out Duplicates using filter() and findIndex() based on email (see the sketch after these steps).
- Displaying Cleaned Results with simple statistics for before-and-after comparison.

This is ideal for scenarios like CRM imports, ETL processes, and general data hygiene.

⚙️ Set-Up Steps

🔹 Step 1: Manual Trigger
Node: When clicking 'Test workflow'
Purpose: Initiates the workflow manually for testing.

🔹 Step 2: Generate Sample Data
Node: Create Sample Data (Code node)
What it does: Creates 6 users, including 2 intentional duplicates (by email). Outputs data as usersJson with metadata (totalCount, message). Mimics real-world messy datasets.

🔹 Step 3: Deduplicate the Data
Node: Deduplicate Users (Code node)
What it does: Parses usersJson, uses .filter() + .findIndex() to keep only the first instance of each email, logs total, unique, and removed counts, and outputs the clean user list as separate items.

🔹 Step 4: Display Results
Node: Display Results (Code node)
What it does: Outputs a structured summary (unique users, status, timestamp) and prepares results for review or downstream use.

📈 Sample Output

- Original count: 6 users
- Deduplicated count: 4 users
- Duplicates removed: 2 users

🎯 Learning Objectives

You'll learn how to:

- Use .filter() and .findIndex() in n8n Code nodes
- Clean JSON data within workflows
- Create simple, effective deduplication pipelines
- Output structured summaries for reporting or integration

🧠 Best Practices

- Validate input format (e.g., JSON schema)
- Handle null or missing fields gracefully
- Use logging for visibility
- Add error handling for production use
- Use pagination/chunking for large datasets
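The core of the Deduplicate Users Code node is a first-occurrence filter like this sketch (the usersJson field name follows the tutorial's own naming):

```javascript
// n8n Code node: keep only the first occurrence of each email address.
const users = JSON.parse($input.first().json.usersJson);

const unique = users.filter(
  (user, index) =>
    users.findIndex((u) => u.email === user.email) === index
);

console.log(
  `Total: ${users.length}, unique: ${unique.length}, removed: ${users.length - unique.length}`
);

// Emit each unique user as a separate n8n item.
return unique.map((user) => ({ json: user }));
```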
by Gulfiia
UX Interview Analysis with OpenAI: Transcribe, Summarize, and Export to Google Sheets!

About

Easily analyze and summarize UX interviews. Just upload your files to Google Drive and get the insights directly in Google Sheets.

How It Works

- The workflow is triggered manually.
- Upload the interview recordings in MP3 format to Google Drive (or modify the node "Filter by MP3" to support other formats — a sketch of that filter is shown below).
- OpenAI transcribes the audio.
- An AI agent generates a summary.
- The results are stored in Google Sheets.

How To Use

- Import the package into your n8n interface.
- Set up credentials for each node to access the required tools.
- Upload your interview files to Google Drive.
- Create a Google Sheet with the following columns: Persona, User Needs, Pain Points, New Feature Requests.
- Connect the Google Sheets node titled "Insert results to Google Sheets" to your created document.
- Start the workflow.

Requirements

OpenAI for transcription and summarization (you can replace it with Gemini if preferred).
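For example, a format filter equivalent to the "Filter by MP3" node could look like this Code-node sketch (the `name` field path is an assumption based on typical Google Drive file metadata):

```javascript
// n8n Code node: keep only files whose names end in an allowed extension.
// Extend this list to support formats other than MP3.
const allowed = ['.mp3']; // e.g. add '.m4a', '.wav' here

return $input.all().filter((item) => {
  const name = (item.json.name || '').toLowerCase();
  return allowed.some((ext) => name.endsWith(ext));
});
```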
by Harshil Agrawal
This workflow analyzes the sentiment of feedback provided by users and sends negative feedback to a Mattermost channel.

Typeform Trigger node: Whenever a user submits a response to the Typeform, the Typeform Trigger node triggers the workflow. The node returns the response that the user submitted in the form.

AWS Comprehend node: This node analyzes the sentiment of the user's response and returns a score.

IF node: The IF node uses the data provided by the AWS Comprehend node and checks whether the sentiment is negative. If the sentiment is negative the result is true, otherwise false. A sketch of the equivalent check is shown below.

Mattermost node: If the sentiment is negative, the IF node returns true and the true branch of the IF node is executed. We connect the Mattermost node to the true branch of the IF node, so whenever the sentiment analysis is negative a message is posted to a channel in Mattermost.

NoOp node: This node is optional; its absence won't change how the workflow functions.

This workflow can be used by Product Managers to analyze product feedback, or by HR to analyze employee feedback. You can even use it for sentiment analysis of Tweets: replace the Typeform Trigger node with the Twitter node.

Note: You will need a Trigger node or Start node to start the workflow. Instead of posting a message to Mattermost, you can save the results in a database, a Google Sheet, or Airtable: replace the Mattermost node with (or add after it) the node of your choice.
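For reference, AWS Comprehend returns a Sentiment label (POSITIVE, NEGATIVE, NEUTRAL, or MIXED) alongside the scores, so the IF node's check is equivalent to this Code-node sketch (treat the exact field path as an assumption to verify against your node's output):

```javascript
// Keep only items that AWS Comprehend classified as negative,
// mirroring the IF node's true branch.
return $input.all().filter(
  (item) => item.json.Sentiment === 'NEGATIVE'
);
```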
by Louis
Workflow Overview

This workflow automates the process of updating a Spotify playlist with tracks from a YouTube playlist, ensuring no duplicates are added.

Key Components

- Manual Trigger: Starts the workflow when you click 'Test workflow'.
- Spotify Integration: Retrieves tracks from a specified Spotify playlist.
- YouTube Integration: Fetches tracks from a designated YouTube playlist.
- Batch Processing: Processes tracks in batches to handle multiple items efficiently.
- Track Search: Searches for YouTube tracks on Spotify to find corresponding IDs.
- Comparison: Compares existing Spotify tracks with YouTube tracks to identify which ones to add (see the sketch below).
- Track Addition: Adds new Spotify tracks to the playlist that are not already included.

If you have any questions or need clarification, feel free to ask!
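The comparison step boils down to a set difference on track IDs, as in this sketch (the names existingTracks and candidateTracks are illustrative):

```javascript
// Determine which candidate tracks (found via Spotify search) are not yet
// in the playlist, comparing by Spotify track ID.
function tracksToAdd(existingTracks, candidateTracks) {
  const existingIds = new Set(existingTracks.map((t) => t.id));
  return candidateTracks.filter((t) => !existingIds.has(t.id));
}

// Example: only the track with id 'c' would be added.
const playlist = [{ id: 'a' }, { id: 'b' }];
const found = [{ id: 'b' }, { id: 'c' }];
console.log(tracksToAdd(playlist, found)); // [ { id: 'c' } ]
```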
by YungCEO
🤖 Discord AI Workflow: Your Automated Assistant! 🚀

🌟 Workflow Overview

Transforms your Discord server into an intelligent, responsive powerhouse of communication and automation!

🔧 Core Components

- 💬 AI-Powered Messaging
- 🤝 Multi-Channel Interaction
- 🧠 Smart Response Generation
- 🔗 Seamless Workflow Integration

🚦 Trigger Modes

1️⃣ Workflow Trigger
- 🔓 Activated by external workflows
- 📨 Processes incoming tasks
- 🌐 Supports complex automation scenarios

2️⃣ Chat Message Trigger
- 🗣️ Responds to direct Discord messages
- 🤔 Contextual understanding
- 🔍 Real-time interaction

🛠️ Key Features

- 🤖 AI-Driven Conversations
- 📊 Dynamic Message Handling
- 🔒 Secure Credential Management
- 🌈 Flexible Configuration

🚀 Use Cases

- 📢 Automated Announcements
- 🆘 Support Ticket Management
- 📝 Content Generation
- 🤝 Community Engagement

💡 Smart Capabilities

- 🧩 Modular Design
- 🔄 Seamless Data Flow
- 📝 Character Limit Management (see the sketch at the end of this listing)
- 🌐 Multi-Channel Support

🛡️ Security & Performance

- 🔐 OAuth Integration
- 🚧 Error Handling
- 📊 Performance Optimization
- 🛠️ Continuous Improvement

🎯 Workflow Magic

User Input ➡️ AI Processing ➡️ Smart Response ➡️ Discord Channel
🌟 🤖 💬 📨

🔍 Customization Playground

- 🎨 Personalize AI Responses
- 🔧 Adjust Interaction Rules
- 📐 Fine-Tune Workflow Behavior

🚧 Troubleshooting Toolkit

- 🕵️ Credential Verification
- 🔬 Permissions Check
- 📋 Comprehensive Logging
- 🆘 Error Handling Strategies

🌈 Future Possibilities

- 🤖 Advanced AI Integration
- 🚀 Expanded Interaction Modes
- 🧠 Machine Learning Enhancements
- 🌐 Ecosystem Expansion
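One concrete detail worth illustrating is Character Limit Management: Discord caps messages at 2,000 characters, so long AI responses need splitting before posting. A minimal sketch:

```javascript
// Split a long AI response into chunks that respect Discord's
// 2,000-character message limit, breaking on newlines where possible.
function splitForDiscord(text, limit = 2000) {
  const chunks = [];
  let remaining = text;
  while (remaining.length > limit) {
    // Prefer breaking at the last newline within the limit.
    let cut = remaining.lastIndexOf('\n', limit);
    if (cut <= 0) cut = limit; // no newline found: hard-cut at the limit
    chunks.push(remaining.slice(0, cut));
    remaining = remaining.slice(cut).replace(/^\n/, '');
  }
  chunks.push(remaining);
  return chunks;
}
```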