by Niklas Hatje
Use case

When working with multiple teams, bugs must get in front of the right team as quickly as possible to be resolved. Normally this includes manually grooming the new bugs that arrive in your ticketing system (in our case, Linear). We found this way too time-consuming. That's why we built this workflow.

What this workflow does

This workflow triggers every time a Linear issue is created or updated within a certain team. At n8n, we created one general team called Engineering where all bugs get added initially. The workflow then checks if the issue meets the criteria to be auto-moved to a specific team. In our case, that means the description is filled in, the issue has the bug label, and it's in the Triage state (a sketch of this check is shown below). The workflow then classifies the bug using OpenAI's GPT-4 model before updating the team property of the Linear issue. If the AI fails to classify a team, the workflow sends an alert to Slack.

Setup

- Add your Linear and OpenAI credentials
- Change the team in the Linear Trigger to match your needs
- Customize your teams and their areas of responsibility in the Set me up node. Please use the format Teamname. Also, make sure that the team names match the names in Linear exactly.
- Change the Slack channel in the Set me up node to your Slack channel of choice.

How to adjust it to your needs

- Play around with the context that you're giving to OpenAI to make sure the model has enough knowledge about your teams and their areas of responsibility
- Adjust the handling of AI failures to your needs

How to enhance this workflow

At n8n we use this workflow in combination with some others. E.g. we have the following on top:

- We're using an automation that enables everyone to add new bugs easily, with the right data, via a /bug command in Slack (check out this template if that's interesting to you)

This workflow was built using n8n version 1.30.0
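For orientation, the gating logic amounts to something like the following Code-node sketch. The field names (`data.labels`, `data.state.name`) assume Linear's webhook payload shape and are illustrative; the template itself implements this check with built-in n8n nodes:

```javascript
// Minimal sketch of the triage check, assuming a Linear webhook payload.
const issue = items[0].json.data ?? {};

const hasDescription = Boolean(issue.description && issue.description.trim());
const hasBugLabel = (issue.labels ?? []).some((label) => label.name === 'Bug');
const inTriage = issue.state?.name === 'Triage';

// Only issues meeting all three criteria continue to the AI classification step.
return hasDescription && hasBugLabel && inTriage ? items : [];
```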
by Deborah
How it works

This workflow shows how to set credentials dynamically using expressions. It accepts an API key via a form, then uses it in the NASA node to authenticate a request.

Setup steps

First, set up your NASA credential:

1. Create a new NASA credential.
2. Hover over API Key.
3. Toggle Expression on.
4. In the API Key field, enter {{ $json["Enter your NASA API key"] }}.

Then, test the workflow:

1. Get an API key from NASA.
2. Select Test workflow.
3. Enter your key using the form.

The workflow runs and sends you to the NASA picture of the day. For more information on expressions, refer to n8n documentation | Expressions.
by Nicolas Chourrout
This workflow automatically generates draft replies in Gmail. It's designed for anyone who manages a high volume of emails or often faces writer's block when crafting responses. Since it doesn't send the generated message directly, you're still in charge of editing and approving emails before they go out.

How It Works

1. Email Trigger: activates when new emails reach the Gmail inbox.
2. Assessment: uses OpenAI gpt-4o and a JSON parser to determine if a response is necessary (see the sketch below).
3. Reply Generation: crafts a reply with OpenAI GPT-4 Turbo.
4. Draft Integration: after converting the text to HTML, it places the draft into the Gmail thread as a reply to the first message.

Set Up Overview (~10 minutes)

OAuth configuration (follow the n8n instructions here):

- Set up Google OAuth in the Google Cloud console. Make sure to add the Gmail API with the modify scope.
- Add Google OAuth credentials in n8n. Make sure to add the n8n redirect URI to the Google Cloud Console consent screen settings.

OpenAI configuration: add your OpenAI API key in the credentials.

Tweaking the prompt: edit the system prompt in the "Generate email reply" node to suit your needs.

Detailed Walkthrough

Check out this blog post where I go into more detail on how I built this workflow. Reach out to me here if you need help building automations for your business.
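To illustrate the assessment step, the model can be asked for a strict JSON verdict that the parser then turns into a usable object. This is a sketch with assumed field names and prompt wording, not the template's exact prompt:

```javascript
// Hypothetical assessment contract: the model must answer in strict JSON.
const systemPrompt = `Decide whether the email below needs a reply.
Answer with JSON only, e.g. {"needs_reply": true, "reason": "direct question"}`;

// The JSON parser's job, in essence: turn the model's text into a typed object.
const decision = JSON.parse(modelOutput); // modelOutput is the LLM's raw text (placeholder)
if (!decision.needs_reply) {
  return []; // nothing to draft; the workflow stops here for this email
}
```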
by Mihai Farcas
How it works

1. The workflow starts by sending a request to a website to retrieve its HTML content.
2. It then parses the HTML, extracting the relevant information.
3. The extracted data is sorted and converted into a CSV file (see the sketch below).
4. The CSV file is attached to an email and sent to your specified address.
5. The data is simultaneously saved to both Google Sheets and Microsoft Excel for further analysis or use.

Set-up steps

- Change the website to scrape in the "Fetch website content" node.
- Configure Microsoft Azure credentials with Microsoft Graph permissions (required for the Save to Microsoft Excel 365 node).
- Configure Google Cloud credentials with access to the Google Drive, Google Sheets and Gmail APIs (the latter is required for the Send CSV via e-mail node).
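For reference, the CSV step boils down to flattening the extracted items into rows. A minimal Code-node sketch, assuming the parsed fields arrive as one item per record (the template itself uses n8n's built-in file conversion):

```javascript
// Build CSV text from the incoming items (one extracted record per item).
const rows = items.map((item) => item.json);
const headers = Object.keys(rows[0]);

const csv = [
  headers.join(','),
  // JSON.stringify doubles as a quick-and-dirty quote/escape for cell values.
  ...rows.map((row) => headers.map((h) => JSON.stringify(row[h] ?? '')).join(',')),
].join('\n');

// Hand the CSV text to the next node (e.g. to be written to a binary file).
return [{ json: { csv } }];
```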
by David Roberts
The built-in Gmail node doesn't yet support embedding images within the body of the email, but you can pull this off using the HTTP node, and this template shows you how.

Requirements

A Gmail account

How it works

The workflow downloads an image, converts it into the format that the Gmail API expects (base64), packages it into a multipart MIME email, and uses the HTTP node to send it. A sketch of the MIME assembly is shown below.
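To make the mechanics concrete, here's a rough sketch of the MIME assembly and the Gmail API call, assuming the image is already available as a base64 string (the address, subject, and Content-ID are placeholders):

```javascript
// Assemble a multipart/related message with the image referenced by Content-ID.
const boundary = 'n8n-demo-boundary';
const mime = [
  'To: recipient@example.com',
  'Subject: Inline image test',
  'MIME-Version: 1.0',
  `Content-Type: multipart/related; boundary="${boundary}"`,
  '',
  `--${boundary}`,
  'Content-Type: text/html; charset=UTF-8',
  '',
  '<p>Here it is:</p><img src="cid:inline-image-1">',
  `--${boundary}`,
  'Content-Type: image/png',
  'Content-Transfer-Encoding: base64',
  'Content-ID: <inline-image-1>',
  '',
  imageBase64, // the downloaded image as a base64 string (placeholder)
  `--${boundary}--`,
].join('\r\n');

// Gmail's messages.send endpoint expects the whole message base64url-encoded:
// POST https://gmail.googleapis.com/gmail/v1/users/me/messages/send  { "raw": ... }
const raw = Buffer.from(mime).toString('base64url');
```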
by Jimleuk
This n8n workflow demonstrates how to automate the often time-consuming form-filling tasks in the early stages of the tendering process: the Request for Proposal document, or "RFP". It does this by using a company's knowledge base to generate question-and-answer pairs with Large Language Models.

How it works

1. A buyer's RFP is submitted to the workflow as a digital document that can be parsed.
2. Our first AI agent scans and extracts all questions from the document into list form.
3. The supplier sets up an OpenAI assistant beforehand, preloaded with company brand, marketing and technical documents.
4. The workflow loops through each of the buyer's questions and poses these to the OpenAI assistant (see the sketch below).
5. The assistant's answers are captured until all questions are satisfied and are then exported into a new document for review.
6. A sales team member is then able to use this document to respond quickly to the RFP before their competitors.

Example Webhook Request

curl --location 'https://<n8n_webhook_url>' \
  --form 'id="RFP001"' \
  --form 'title="BlueChip Travel and StarBus Web Services"' \
  --form 'reply_to="jim@example.com"' \
  --form 'data=@"k9pnbALxX/RFP Questionnaire.pdf"'

Requirements

An OpenAI account to use AI services.

Customising the workflow

OpenAI assistants are only one approach to hosting a company knowledge base for AI to use. Exploring different solutions, such as building your own RAG-powered database, can sometimes yield better results in terms of cost and control over how the data is managed.
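For a feel of what the Q&A loop does under the hood, here's a rough sketch using v4-style calls from the OpenAI Node SDK. The template uses n8n's OpenAI nodes instead; `assistantId`, the polling interval, and the question list are illustrative:

```javascript
import OpenAI from 'openai';

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Pose each extracted question to the preloaded assistant in its own thread.
async function answerQuestions(assistantId, questions) {
  const answers = [];
  for (const question of questions) {
    // Create a thread with the question and start a run in one call.
    let run = await openai.beta.threads.createAndRun({
      assistant_id: assistantId,
      thread: { messages: [{ role: 'user', content: question }] },
    });
    // Poll until the run finishes.
    while (run.status === 'queued' || run.status === 'in_progress') {
      await new Promise((r) => setTimeout(r, 1000));
      run = await openai.beta.threads.runs.retrieve(run.thread_id, run.id);
    }
    // The newest message in the thread is the assistant's answer.
    const messages = await openai.beta.threads.messages.list(run.thread_id);
    answers.push({ question, answer: messages.data[0].content[0].text.value });
  }
  return answers;
}
```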
by Jimleuk
This n8n workflow demonstrates how to automate image captioning tasks using Gemini 1.5 Pro - a multimodal LLM which can accept and analyse images. This is a really simple example of how easy it is to build and leverage powerful AI models for your repetitive tasks.

How it works

1. For this demo, we'll import a public image from a popular stock photography website, Pexels.com, into our workflow using the HTTP Request node.
2. With multimodal LLMs, there is little to preprocess other than ensuring the image dimensions fit within the LLM's accepted limits. Though not essential, we'll resize the image using the Edit Image node to achieve faster processing.
3. The image is used as an input to the basic LLM node by defining a "user message" entry with the binary (data) type.
4. The LLM node has the Gemini 1.5 Pro language model attached, and we'll prompt it to generate a caption title and text appropriate for the image it sees.
5. Once generated, the caption text is positioned over the original image to complete the task. We can calculate the positioning relative to the number of characters produced using the Code node (see the sketch below).

An example of the combined image and caption can be found here: https://res.cloudinary.com/daglih2g8/image/upload/f_auto,q_auto/v1/n8n-workflows/l5xbb4ze4wyxwwefqmnc

Requirements

- Google Gemini API key.
- Access to Google Drive.

Customising the workflow

- Not using Google Gemini? n8n's basic LLM node supports the standard syntax for image content for models that support it - try using GPT-4o, Claude or LLaVA (via Ollama).
- Google Drive is only used for demonstration purposes. Feel free to swap this out for other triggers, such as webhooks, to fit your use case.
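The positioning step can be as simple as estimating the number of wrapped lines from the character count. A Code-node sketch with illustrative dimensions and font metrics, not the template's exact values:

```javascript
// Estimate where the caption block should sit on the image.
const caption = items[0].json.caption_text; // field name is illustrative
const imageWidth = 1024;
const imageHeight = 768;
const fontSize = 24;
const avgCharWidth = fontSize * 0.6; // rough average glyph width in px

const charsPerLine = Math.floor(imageWidth / avgCharWidth);
const lineCount = Math.ceil(caption.length / charsPerLine);

// Anchor the caption near the bottom, leaving room for every wrapped line.
const y = Math.max(imageHeight - lineCount * Math.round(fontSize * 1.4) - 24, 0);
return [{ json: { x: 24, y, lineCount } }];
```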
by Jimleuk
This n8n workflow demonstrates a simple approach to improve chat UX by staggering an AI agent's reply for users who send a sequence of partial messages in short bursts.

How it works

1. A Twilio webhook receives the user's messages, which are recorded in a message stack powered by Redis.
2. The execution is immediately paused for 5 seconds, and then another check is done against the message stack for the latest message. The purpose of this check is to let us know whether the user is still sending messages or is waiting for a reply.
3. The execution is aborted if the latest message on the stack differs from the incoming message, and continues if they are the same. In the latter case, the agent receives the buffered messages up to that point and is able to respond to them in a single reply (see the sketch below).

Requirements

- A Twilio account and an SMS-enabled phone number to receive messages.
- A Redis instance for the message stack.
- An OpenAI account for the language model.

Customising the workflow

- This workflow should work for other common messaging platforms such as WhatsApp and Telegram.
- 5 seconds too long or too short? Adjust the wait threshold to suit your customers.
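In plain code, the stack check boils down to something like this. A sketch using node-redis with illustrative key names and helpers (`senderNumber`, `incomingMessage` and `replyTo` are placeholders); the template itself uses n8n's Redis and Wait nodes:

```javascript
import { createClient } from 'redis';

const redis = createClient();
await redis.connect();

const key = `chat:${senderNumber}:stack`; // one stack per sender
await redis.rPush(key, incomingMessage);  // record the incoming message

await new Promise((r) => setTimeout(r, 5000)); // give the user ~5s to keep typing

const [latest] = await redis.lRange(key, -1, -1); // peek at the newest entry
if (latest === incomingMessage) {
  // No newer message arrived: respond to the whole burst in one reply.
  const buffered = await redis.lRange(key, 0, -1);
  await replyTo(senderNumber, buffered.join('\n')); // replyTo is a placeholder
  await redis.del(key);
}
// Otherwise do nothing - the execution handling the newer message will reply.
```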
by Yaron Been
Transform YouTube comments into actionable insights with automated AI analysis and professional email reports. This intelligent workflow monitors your Google Sheets for YouTube video IDs, fetches comments using the YouTube API, performs comprehensive AI sentiment analysis, and delivers formatted email reports with viewer insights - helping content creators understand their audience and improve engagement.

🚀 What It Does

- Smart Video Monitoring: watches Google Sheets for new YouTube video IDs marked as "Pending" and triggers automated analysis
- Complete Comment Collection: fetches up to 100 top comments per video using the YouTube API with relevance-based ordering (see the sketch at the end of this section)
- AI-Powered Analysis: uses GPT-4 to analyze comments for sentiment, themes, questions, feedback, and actionable insights
- Professional Email Reports: generates detailed HTML reports with statistics, sentiment breakdown, and improvement recommendations
- Automated Status Tracking: updates the spreadsheet status to prevent duplicate processing and maintain an organized workflow

🎯 Key Benefits

✅ Deep Audience Insights: understand what viewers really think about your content
✅ Save Hours of Manual Work: automated comment analysis vs reading hundreds of comments
✅ Improve Content Strategy: get actionable feedback for better video performance
✅ Track Sentiment Trends: monitor positive/negative feedback patterns
✅ Professional Reporting: receive formatted analysis reports via email
✅ Scalable Analysis: process multiple videos automatically

🏢 Perfect For

Content creators & YouTubers:
- Individual creators tracking audience engagement
- Educational channels analyzing learning feedback
- Entertainment creators understanding viewer preferences
- Business channels monitoring brand sentiment

Marketing & business applications:
- **Brand Monitoring**: track sentiment on branded content and partnerships
- **Audience Research**: understand viewer demographics and preferences
- **Content Optimization**: identify what resonates with your audience
- **Competitor Analysis**: analyze comments on competitor videos (where allowed)

⚙️ What's Included

- Complete Analytics Workflow: ready-to-deploy YouTube comment analysis system
- Google Sheets Integration: simple spreadsheet-based video management
- YouTube API Integration: automated comment fetching with proper authentication
- AI Analysis Engine: GPT-4 powered sentiment and insight generation
- Email Reporting System: professional HTML-formatted reports
- Status Management: automatic processing tracking and duplicate prevention

🔧 Setup Requirements

- **n8n Platform**: cloud or self-hosted instance
- **YouTube API Credentials**: Google Cloud Console API access
- **OpenAI API**: GPT-4 access for comment analysis
- **Google Sheets**: video ID management and status tracking
- **Gmail Account**: for receiving analysis reports

📊 Required Google Sheets Structure

| ID | Video Title  | YouTube Video ID | Status    |
|----|--------------|------------------|-----------|
| 1  | My Tutorial  | dQw4w9WgXcQ      | Pending   |
| 2  | Product Demo | abc123def456     | Mail Sent |
| 3  | Weekly Vlog  | xyz789uvw012     | Draft     |

Status Options: Draft → Pending → Mail Sent

📧 Sample Analysis Report

📺 YouTube Comments Analysis Report
Video: "How to Build Your First Website"

📊 Quick Statistics:
• Total Comments Analyzed: 87
• Average Likes per Comment: 3.2
• Total Replies: 156
• Sentiment Summary: Positive: 65%, Negative: 10%, Neutral: 25%

❓ Common Questions:
• "What hosting service do you recommend?"
• "Can I do this without coding experience?"
• "How much does domain registration cost?"

💡 Key Feedback Points:
• Tutorial pace is perfect for beginners
• More examples of finished websites requested
• Viewers want follow-up video on advanced features

🎯 Actionable Insights:
• Create hosting comparison video
• Add timestamps for different skill levels
• Consider beginner-friendly series expansion

🎨 Customization Options

- Analysis Depth: adjust AI prompts for different analysis focuses (engagement, education, entertainment)
- Comment Limits: modify the maximum comments processed (default: 100; AI analysis: 50)
- Report Recipients: send reports to multiple team members or clients
- Custom Metrics: add specific analysis criteria for your content niche
- Multi-Channel: process videos from multiple YouTube channels
- Scheduling: set up regular analysis of your latest videos

🏷️ Tags & Categories

#youtube-analytics #comment-analysis #content-creator-tools #ai-sentiment-analysis #video-insights #audience-research #youtube-api #content-optimization #social-media-analytics #creator-economy #video-marketing #engagement-analysis #content-strategy #ai-reporting #youtube-automation

💡 Use Case Examples

- Educational channel: analyze tutorial comments to identify confusing concepts and improve teaching methods
- Product reviews: monitor sentiment on review videos to understand customer satisfaction trends
- Entertainment creator: track audience reactions to different content formats and optimize future videos
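For orientation, the comment collection corresponds to a single YouTube Data API v3 call. A sketch assuming API-key auth, where `videoId` and `apiKey` come from your own setup:

```javascript
// Fetch up to 100 top-level comments, ordered by relevance.
const url = new URL('https://www.googleapis.com/youtube/v3/commentThreads');
url.search = new URLSearchParams({
  part: 'snippet',
  videoId,            // from the Google Sheets row marked "Pending"
  maxResults: '100',
  order: 'relevance',
  key: apiKey,
}).toString();

const response = await fetch(url);
const { items } = await response.json();

// Pull out the fields the AI analysis cares about.
const comments = items.map((thread) => ({
  text: thread.snippet.topLevelComment.snippet.textDisplay,
  likes: thread.snippet.topLevelComment.snippet.likeCount,
  replies: thread.snippet.totalReplyCount,
}));
```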
by Jay Hartley
What this workflow does

This workflow allows you to monitor multiple GitHub repos simultaneously without polling, thanks to Webhooks. It lets you programmatically add and delete repos on your watchlist to make management convenient.

Description

- Can monitor multiple repos simultaneously.
- Programmatically register or unregister repos from a list. No need for manual work.
- Webhook notification means no constant polling necessary.

Setup

1. Creating Credentials on GitHub

Generate a personal access token on GitHub by following these steps:
Right-hand side of page -> Settings -> scroll to bottom -> Developer Settings -> Personal Access Token -> Tokens (classic) -> Generate New Token
Give it these scopes:
- admin:repo_hook
- repo (if you want to use it for your own private repo)
If you need more help, see here: https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens

2. Setting Credentials in n8n

In Register Github Webhook:
- Authentication > Generic Credential Type
- Generic Auth Type > Header Auth
- Header Auth > Create New Credential with Name set to 'Authorization' and Value set to 'Bearer <your personal access token>'. (You can reuse this for Delete Github Webhook and Get Existing Webhooks.)
Now in Register Github Webhook, scroll down to Send Body > JSON and, inside the JSON, change the value of "url" to the webhook address given as the Production URL in the Webhook Trigger node (a sketch of the request this node sends is shown below).

3. Notification settings

In the third row, link up the Webhook Trigger to any API of your choice. Slack and Telegram are given as examples. You can also format the notification message as you wish.

Setup time: roughly 10 minutes.
Instructions video: https://vimeo.com/1013473758

Test

1. Register Webhooks

In Repos to Monitor, add any repo you want to monitor changes for. Disable Webhook Trigger, click Test Workflow, and if your GitHub credentials were set correctly, it will automatically register the webhooks. You can test this by running the single node Get Existing Webhook and confirming it outputs the repo addresses.

2. Handle Github Events

Now that you have registered the webhooks, re-enable Webhook Trigger and activate the workflow. Make a commit to any of the registered repos. Confirm that the notification went through. That's it!
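For reference, here is roughly what the Register Github Webhook node sends: a sketch of GitHub's create-hook endpoint, assuming the classic PAT from step 1 (`owner`, `repo`, and the URL variable are placeholders):

```javascript
// Register a push webhook on one repository.
const response = await fetch(`https://api.github.com/repos/${owner}/${repo}/hooks`, {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${personalAccessToken}`,
    Accept: 'application/vnd.github+json',
  },
  body: JSON.stringify({
    name: 'web',
    active: true,
    events: ['push'], // add more GitHub event types here if needed
    config: {
      url: productionWebhookUrl, // the Webhook Trigger's Production URL
      content_type: 'json',
    },
  }),
});
```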
by simonscrapes
What this workflow does

This flow uses an AI node to generate seed keywords to focus your SEO efforts on, based on your ideal customer profile. You can use these keywords to form part of your SEO strategy.

Outputs

A list of 20 seed keywords.

Setup

- Fill in the Set Ideal Customer Profile (ICP) node.
- Connect your credentials.
- Replace the Connect to your own database node with your own database.

Pre-requisites / Dependencies

- You know your ideal customer profile (ICP).
- An AI API account (either OpenAI or Anthropic recommended).

More templates and n8n workflows >>> @simonscrapes
by Aditya Gaur
Who is this template for?

This template can be used by any automator who wants to create a work item (incident, user story, or bug) in Azure DevOps whenever an alert is raised by their systems.

How it works

- Each time an alert is raised in a system (for example, Elastic raises an alert for a missing host or domain), the workflow reads the alert and creates a work item in Azure DevOps (see the sketch below).
- The workflow can be customized to send any required information to Azure DevOps.

Setup Instructions

- **Azure DevOps Organization and Project**: make sure you have access to an Azure DevOps organization and a project where the work item will be created.
- **Personal Access Token (PAT)**: you need a Personal Access Token with permissions to create work items. You can generate a PAT from the Azure DevOps user settings.
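As a rough sketch of what the workflow's HTTP call amounts to: Azure DevOps expects a JSON Patch body and PAT-based Basic auth. `org`, `project`, and the alert field values below are placeholders; the template performs this with an HTTP Request node:

```javascript
// Create a Bug work item from the alert data.
const pat = process.env.AZURE_DEVOPS_PAT;
const response = await fetch(
  `https://dev.azure.com/${org}/${project}/_apis/wit/workitems/$Bug?api-version=7.0`,
  {
    method: 'POST',
    headers: {
      // PATs are sent as Basic auth with an empty username.
      Authorization: `Basic ${Buffer.from(`:${pat}`).toString('base64')}`,
      'Content-Type': 'application/json-patch+json',
    },
    body: JSON.stringify([
      { op: 'add', path: '/fields/System.Title', value: alert.title },
      { op: 'add', path: '/fields/System.Description', value: alert.details },
    ]),
  }
);
```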