by Evoort Solutions
Automated YouTube to MP3 Conversion and Storage with Google Sheets

This automated workflow converts YouTube videos to MP3 using the YouTube to MP3 Downloader API. The converted MP3 files are uploaded to Google Drive, and all relevant conversion data, such as download links and file sizes, is logged in Google Sheets. Ideal for content creators and download enthusiasts, it brings efficiency and accuracy to YouTube-to-MP3 conversions.

Node-by-Node Explanation:

- **On form submission**: Triggers the workflow when a user submits a YouTube video URL for conversion.
- **HTTP Request**: Sends a POST request to the YouTube to MP3 Downloader API to initiate the conversion of the YouTube URL to MP3 format.
- **Google Drive**: Uploads the converted MP3 file to Google Drive for cloud storage.
- **Google Sheets (Initial Log)**: Logs initial details such as URL and status in Google Sheets before the conversion is complete.
- **Google Sheets (Final Log)**: After successful conversion, logs the download link, file size, and other relevant data in Google Sheets.
- **If Condition**: Lets the process proceed only if the conversion status is "done."
- **Wait**: Pauses the workflow until the conversion process is completed.
- **Code**: Converts the file size from bytes to megabytes (MB) for easier reference in Google Sheets.
- **Download MP3**: Downloads the MP3 file once the conversion is finished.

Problem Solved:

Converting YouTube videos to MP3 manually is time-consuming and tedious. The process involves multiple steps, such as downloading the video, extracting the audio, and organizing the files, which is a hassle if you do it frequently. Managing and tracking these files and their statuses can also become chaotic. This workflow automates the entire process:

- **Conversion automation**: No need for third-party apps or websites to handle YouTube-to-MP3 conversion.
- **Efficient tracking**: All conversion details (file size, download link, etc.) are logged in Google Sheets, keeping everything organized.
- **Cloud storage**: Stores converted MP3s directly in Google Drive, keeping files secure, easy to access, and well-managed.

By leveraging the YouTube to MP3 Downloader API, this workflow removes the manual steps, saving you time and effort while keeping everything organized.

Benefits of the Flow:

- **Time-Saving Automation**: Automatically converts YouTube videos to MP3 using the YouTube to MP3 Downloader API, eliminating manual conversion.
- **Data Logging**: Automatically logs essential conversion details (file size, download link, etc.) in Google Sheets for easy reference.
- **Cloud Storage Integration**: Converted MP3 files are uploaded directly to Google Drive for secure, cloud-based storage.
- **No Hassle**: Eliminates the need for third-party tools or manual tracking of conversions.

Use Cases:

- **Content Creators**: If you’re a YouTuber or podcast creator, you may need to convert and store multiple audio files for your content. This workflow automatically converts YouTube videos or podcasts to MP3 and saves them to Google Drive, all while keeping a detailed log in Google Sheets.
- **Educators and Trainers**: Teachers and trainers often use YouTube videos for educational purposes and may want to extract the audio (e.g., for podcasts or lectures). With this automation, they can easily convert YouTube content to MP3 for offline teaching or sharing with students.
- **Social Media Managers**: Social media managers working with audio content can quickly convert YouTube videos to MP3 and upload them to Google Drive for easy sharing with their team or posting on social platforms.
- **Music Enthusiasts**: Music lovers who want to save YouTube music videos or tracks as MP3s for personal use or offline listening benefit from this automated process, which makes it fast and easy to convert, store, and track MP3 files.
- **Content Archivists**: If you’re archiving online media or curating content libraries, this system enables quick conversion, storage, and cataloging of YouTube videos in MP3 format, with all relevant metadata stored in Google Sheets for easy management.

Create your free n8n account and set up the workflow in just a few minutes using the link below:
👉 Start Automating with n8n
Save time, stay consistent, and streamline your YouTube-to-MP3 conversions effortlessly!
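The Code node mentioned above converts the file size from bytes to megabytes before it is logged to Google Sheets. A minimal sketch of that logic follows; the `filesize` field name is an assumption, so adjust it to whatever the YouTube to MP3 Downloader API actually returns:

```javascript
// Sketch of the workflow's Code node: convert a file size reported in
// bytes to megabytes for easier reference in the Google Sheets log.
// The `filesize` field name is illustrative, not confirmed by the API.
function bytesToMegabytes(bytes) {
  // 1 MB = 1024 * 1024 bytes; round to two decimals for readability
  return Math.round((bytes / (1024 * 1024)) * 100) / 100;
}

// In an n8n Code node this would map over $input.all();
// here a plain array stands in for the incoming items.
const items = [{ filesize: 5242880 }, { filesize: 3145728 }];
const withMb = items.map((item) => ({
  ...item,
  filesize_mb: bytesToMegabytes(item.filesize),
}));
```

Inside n8n, each element would be wrapped as `{ json: { ... } }` before being returned from the Code node.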
by n8n Team
This is a workflow for a Telegram echo bot, useful for debugging and for learning about the Telegram platform. Add your Telegram bot credentials to both nodes. Activate the workflow. Send data to the bot (e.g. a message, a forwarded message, a sticker, an emoji, a voice note, a file, an image...). The second node fetches the incoming JSON object, formats it, and sends it back.
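The "format it and send back" step amounts to pretty-printing the incoming update so it is readable in the chat. A minimal sketch in plain JavaScript (in the workflow this would be an expression or Code node feeding the Telegram send node; the sample update is illustrative, as real updates carry many more fields):

```javascript
// Pretty-print an incoming Telegram update so it can be echoed back
// as a readable message for debugging.
function formatUpdate(update) {
  // An indent of 2 spaces makes nested fields easy to scan in the chat
  return JSON.stringify(update, null, 2);
}

// Minimal, illustrative Telegram update object
const update = { message: { chat: { id: 42 }, text: 'hello' } };
const echoText = formatUpdate(update);
```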
by Muh Resky Adiansyah
Lead-Routing-Engine-with-SLA-Auto-Reassignment

This repository contains an SLA-based lead routing workflow built in n8n, designed to ensure fast lead response, fair sales distribution, and controlled escalation without relying on a full CRM system. The workflow focuses on routing discipline and operational safety, not feature completeness.

What This Workflow Does

At a high level, the system:
1) Accepts new leads from a generic intake form
2) Assigns leads to sales reps using round-robin
3) Enforces a response SLA
4) Automatically re-routes uncontacted leads
5) Allows sales to mark leads as CONTACTED via Slack
6) Escalates to a manager once if the SLA is repeatedly violated

Architecture Overview

Core Components
1) n8n
2) Google Sheets
3) Slack

Primary Data Stores
1) sales_sheet (list of active sales reps)
2) lead_sheet (lead state and routing history)
3) routing_state_sheet (global routing + escalation flags)

End-to-End Flow

A) Lead Intake & Normalization
1) New leads enter via Form Trigger
2) Phone numbers are normalized to 62xxxxxxxx (Indonesia's International Direct Dialing code)
3) A unique lead_id is generated
4) The lead is initialized with: stage = NEW, route_count = 0

B) Initial Assignment (Round Robin)
1) Active sales reps are loaded from sales_sheet
2) The global last_index is read from routing_state_sheet
3) The lead is assigned to the next sales rep in sequence
4) Assignment metadata is stored: assigned sales rep, timestamps, route count
5) last_index is updated centrally

C) Slack Notification (New Lead)
1) The assigned sales rep receives a Slack message
2) The message includes a “Mark as CONTACTED” button
3) The SLA expectation is clearly communicated (1 hour by default)

D) SLA Monitoring (Scheduled)
1) A scheduled trigger runs every hour
2) The workflow scans for leads where stage = NEW and the SLA window has elapsed since the last assignment

E) SLA Re-Routing
For each qualifying lead:
1) The lead is reassigned to the next sales rep
2) The route count is incremented
3) Timestamps are updated
4) A Slack notification is sent to the new assignee
This process repeats until the lead is contacted or escalated.

F) Controlled Escalation
If route_count >= threshold (default: 10):
1) The workflow checks the escalation state (escalated_<lead_id>)
2) If not escalated yet: the manager is notified via Slack and the escalation flag is written
3) If already escalated: no further action is taken
Escalation is one-time per lead.

G) Stage Update via Slack (CONTACTED)
1) Sales marks a lead as CONTACTED via the Slack button
2) The incoming Slack action is validated: only the assigned sales rep is allowed, and only if the current stage is NEW
3) The update is idempotent
4) Unauthorized or stale actions receive Slack feedback
Once contacted:
1) SLA routing stops
2) The lead remains stable

Safeguards Built In
1) Ownership enforcement (only the assigned sales rep can update a lead)
2) Idempotent stage transitions (prevents duplicate or stale Slack actions)
3) One-time escalation (no notification spam)
4) Fail-fast behavior (missing sales data or malformed payloads halt execution early)

Google Sheets Schema

sales_sheet

| Column   | Description   |
| -------- | ------------- |
| name     | Sales name    |
| email    | Optional      |
| slack_id | Slack user ID |
| active   | ON / TRUE     |

lead_sheet

| Column         | Description                               |
| -------------- | ----------------------------------------- |
| lead_id        | Unique identifier                         |
| name           | Lead name                                 |
| phone          | Normalized phone                          |
| stage          | NEW / CONTACTED / QUALIFIED / CLOSED LOST |
| assigned_sales | Current owner                             |
| sales_slack_id | Slack ID                                  |
| route_count    | Number of re-routes                       |
| created_at     | Creation timestamp                        |
| assigned_at    | Last assignment                           |
| last_routed_at | Last SLA routing                          |
| contacted_at   | When marked contacted                     |

routing_state_sheet

| key                 | value                  |
| ------------------- | ---------------------- |
| last_index          | Last round-robin index |
| escalated_<lead_id> | Escalation timestamp   |

Limitations (By Design)
1) Google Sheets is not transactional
2) SLA enforcement is time-bucketed, not real-time
3) No concurrency locking across parallel runs
4) Slack is required for interaction
5) This is not a CRM, only a routing engine
These constraints are explicit and intentional.

When This Design Works Well
1) Small to mid-size teams
2) Human-response SLAs (minutes/hours)
3) Teams needing discipline, not heavy tooling
4) CRM-lite or pre-CRM environments

When to Migrate
Consider migrating if you need:
1) High-volume ingestion
2) Sub-minute SLA guarantees
3) Strong transactional consistency
4) Advanced analytics or forecasting

The routing logic itself is portable to SQL or CRM systems.
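The normalization and round-robin steps in sections A and B can be sketched as follows. This is a simplified illustration, not the workflow's exact code: in the real workflow `last_index` is read from and written back to `routing_state_sheet`, and the rep names and sample phone numbers below are invented:

```javascript
// Sketch of round-robin assignment (section B). `lastIndex` stands in
// for the value persisted in routing_state_sheet.
function nextAssignment(activeReps, lastIndex) {
  // Wrap around so assignment cycles fairly through the active reps
  const nextIndex = (lastIndex + 1) % activeReps.length;
  return { rep: activeReps[nextIndex], nextIndex };
}

// Sketch of phone normalization to the Indonesian 62xxxxxxxx form (section A).
function normalizePhone(raw) {
  const digits = raw.replace(/\D/g, '');       // strip spaces, dashes, "+"
  if (digits.startsWith('62')) return digits;  // already international
  if (digits.startsWith('0')) return '62' + digits.slice(1); // 08xx -> 628xx
  return '62' + digits;
}

const reps = ['Ana', 'Budi', 'Citra'];             // illustrative names
const first = nextAssignment(reps, -1);            // first ever assignment
const second = nextAssignment(reps, first.nextIndex);
```

After each assignment, the workflow would persist `nextIndex` back to `routing_state_sheet` so parallel form submissions continue the cycle.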
by Agent Circle
This n8n workflow demonstrates how to use this AI Agent to extract, process, and analyze YouTube video comments, helping you understand your audience beyond the view count.

Use cases are many: whether you're a YouTube creator exploring feedback, a social media manager fine-tuning engagement strategy, a brand team monitoring campaign sentiment, or a marketing agency conducting audits, this tool brings the audience's voice to the forefront with structured insights.

How It Works

1. The workflow starts when you manually click Test Workflow or Execute Workflow in n8n.
2. It collects all the rows marked as Ready in Column A of the Video URLs tab of your connected Google Sheet.
3. The tool first checks that the URLs are not empty, then loops through each valid video URL and sends a GET request to the YouTube API to fetch its comments.
4. It checks the response from the YouTube API. If the call is successful, the comment data is extracted and split into individual entries.
5. The tool then checks whether the video URL has any comments. If none are found, the video URL's status in Column A of the Video URLs tab is updated to Finished right away.
6. If comments are available, they are passed to the AI Agent - Analyze Sentiment Of Every Comment using the Google Gemini chat model, where each comment is analyzed and classified by sentiment: Positive, Neutral, or Negative.
7. Next, the analysis results are saved to the Results tab in your connected Google Sheet.
8. Finally, the original video URL's status in Column A of the Video URLs tab is updated to Finished, ensuring it won't be reprocessed in the loop.

How To Set Up

1. Download the working package and import it into your n8n interface.
2. Duplicate the YouTube Comment Analyzer Google Sheet template to your Google Sheets account.
3. Set up the necessary credentials for tool access and usability:
   - For Google Sheets access, ensure each node is connected to the correct tab of your Google Sheet template:
     - Node Get Video URLs → connected to the Video URLs tab
     - Node Insert Comment Data & Analysis → connected to the Results tab
     - Node Update Video Status → connected to the Video URLs tab
   - For YouTube access, connect to its API in the node HTTP Request - Get Comments.
   - For Google Gemini access, connect to its API in the node Google Gemini Chat Model.
4. Enter video URLs in Column B of the Video URLs tab in your connected Google Sheet and mark their status in Column A as Ready.
5. Click Test Workflow or Execute Workflow to run the process.
6. Check the Results tab of the connected Google Sheet template to view all collected comments along with their sentiment analysis.

Requirements

- Basic setup in Google Cloud Console (OAuth or API Key method enabled) with access to YouTube and Google Sheets enabled.
- API access to Google Gemini for sentiment analysis.

How To Customize

- By default, the workflow is triggered manually in n8n. However, you can automate the process by adding a Google Sheets trigger that monitors new entries in your connected Google Sheet template and starts the workflow automatically.
- In the AI Agent - Analyze Sentiment Of Every Comment node, you can change the AI chat model. By default, it uses Google Gemini, but you can easily replace it with any other compatible provider such as DeepSeek, Grok, etc.
- You can customize the sentiment categories and the instruction prompt for the AI Agent in the AI Agent - Analyze Sentiment Of Every Comment node to suit your needs; the Agent can then return sentiment results that align more closely with your intended use case.
- Feel free to integrate additional nodes (such as Telegram or Email) to notify you and your team whenever updates and analysis succeed or fail.

Need Help?

If you'd like this workflow customized, or if you're looking to build a tailored AI Agent for your own business, please feel free to reach out to Agent Circle. We're always here to support you and help bring your automation ideas to life.

Join our community on different platforms for assistance, inspiration, and tips from others:

- Website: https://www.agentcircle.ai/
- Etsy: https://www.etsy.com/shop/AgentCircle
- Gumroad: http://agentcircle.gumroad.com/
- Discord Global: https://discord.gg/d8SkCzKwnP
- FB Page Global: https://www.facebook.com/agentcircle/
- FB Group Global: https://www.facebook.com/groups/aiagentcircle/
- X: https://x.com/agent_circle
- YouTube: https://www.youtube.com/@agentcircle
- LinkedIn: https://www.linkedin.com/company/agentcircle
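As background on the GET request to the YouTube API described above: fetching a video's comments is typically done against the YouTube Data API v3 `commentThreads` endpoint. A sketch of how the request URL could be assembled from a video URL follows; the regex and parameter choices are illustrative assumptions, and the API key is a placeholder:

```javascript
// Extract the 11-character video ID from common YouTube URL shapes
// (watch?v=<id> and youtu.be/<id>). Other shapes would need more cases.
function extractVideoId(url) {
  const match = url.match(/(?:v=|youtu\.be\/)([A-Za-z0-9_-]{11})/);
  return match ? match[1] : null;
}

// Build a YouTube Data API v3 commentThreads request URL.
function commentThreadsUrl(videoId, apiKey) {
  const params = new URLSearchParams({
    part: 'snippet',
    videoId: videoId,
    maxResults: '100', // API maximum per page; pagination uses pageToken
    key: apiKey,
  });
  return `https://www.googleapis.com/youtube/v3/commentThreads?${params}`;
}

const id = extractVideoId('https://www.youtube.com/watch?v=dQw4w9WgXcQ');
const url = commentThreadsUrl(id, 'YOUR_API_KEY'); // placeholder key
```

In the workflow, the HTTP Request - Get Comments node performs the equivalent call; consult the official API reference for the response shape before parsing comments out of it.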
by Mutasem
Use case: Error workflows are an important part of running workflows in production. Get alerts for errors directly in your inbox. How to set up: Add your Gmail credentials. Add your target email. Add this error workflow to your other workflows (docs here).
by Jonathan
Task: Merge two datasets into one based on matching rules.

Why: A powerful capability of n8n is to easily branch out the workflow in order to process different datasets. Even more powerful is the ability to join them back together with SQL-like joining logic.

Main use cases:
- Appending data sets
- Keeping only new items
- Keeping only existing items
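The three merge modes can be sketched with plain arrays; in n8n this logic lives in the Merge node, and the `id` key and record values below are invented for illustration:

```javascript
// Sketch of SQL-like merge semantics on a matching key.
// Field names and values are illustrative only.
const existing = [{ id: 1, name: 'Ada' }, { id: 2, name: 'Grace' }];
const incoming = [{ id: 2, name: 'Grace' }, { id: 3, name: 'Alan' }];
const existingIds = new Set(existing.map((r) => r.id));

// Append: simply concatenate both datasets
const appended = [...existing, ...incoming];

// Keep only new items: incoming rows whose key is not yet known
const onlyNew = incoming.filter((r) => !existingIds.has(r.id));

// Keep only existing items: incoming rows whose key already exists
const onlyExisting = incoming.filter((r) => existingIds.has(r.id));
```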
by Jan Oberhauser
This workflow returns the current weather for a predefined or given city so that it can be displayed with bash-dash. By default, it returns the weather in Berlin if no city is defined. That default can be changed in the "Set City" node. Example usage: - weather london Example bash-dash config: commands[weather]="http://localhost:5678/webhook/weather"
by Masahiro Minami
Check email and create a new Nextcloud Deck card from each incoming email. Import the workflow. Update the Nextcloud URL with your target board ID and stack ID. Configure the IMAP credential and the Nextcloud user ID / password. The workflow starts on each new IMAP email and creates a new Nextcloud Deck card with the email subject and body. (Note that all emails will be marked as read.)
by Lorena
This workflow allows you to send a message in a Telegram chat via bash-dash. Example usage: - telegram I'll be late If you want to send a predefined message without typing it in the command line, you can replace the Text Expression in the Telegram node with a specific message. In this case, the dash command - telegram will send the predefined message to the chat. Example bash-dash config: commands[telegram]="http://localhost:5678/webhook/telegram"
by Tom
This workflow shows a no-code approach to creating an HTML table based on Google Sheets data. To run the workflow: Make sure you have a Google Sheet with a header row and some data in it. Grab your sheet ID and add it to the Google Sheets node. Activate the workflow or execute it manually. Visit the URL provided by the Webhook node in your browser (the production URL if the workflow is active, the test URL if it is executed manually).
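The template does this entirely with nodes, but conceptually the transformation from sheet rows to an HTML table looks like the sketch below. Column names and values are invented for illustration, and a real version should HTML-escape cell values:

```javascript
// Build an HTML table from an array of row objects, using the keys of
// the first row as the header row (mirroring the sheet's header row).
function rowsToHtmlTable(rows) {
  if (rows.length === 0) return '<table></table>';
  const headers = Object.keys(rows[0]);
  const head = '<tr>' + headers.map((h) => `<th>${h}</th>`).join('') + '</tr>';
  const body = rows
    .map((row) => '<tr>' + headers.map((h) => `<td>${row[h]}</td>`).join('') + '</tr>')
    .join('');
  return `<table>${head}${body}</table>`;
}

const table = rowsToHtmlTable([{ name: 'Ada', role: 'Engineer' }]);
```

In the workflow, the resulting HTML string is what the Webhook node serves back to the browser.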
by Harshil Agrawal
This workflow allows you to generate, retrieve, and download a report using the SecurityScorecard node. SecurityScorecard node: This node generates a full scorecard report. Based on your use case, you can generate other types of reports. SecurityScorecard1 node: This node fetches the latest report from SecurityScorecard. Toggle Return All to true to return all the reports. SecurityScorecard2 node: This node downloads the report fetched by the previous node. Based on your use case, you can either store this report in Dropbox, Google Drive, etc., or email it using the Gmail node, Send Email node, or the Microsoft Outlook node. You can replace the Start node with the Cron node to trigger the workflow at a regular interval.
by Luciano Gutierrez
🤖 AI-Powered Gmail MCP Server for n8n

Description

This n8n workflow template leverages an external AI Model Control Plane (MCP) Server to automate various Gmail tasks, such as composing emails, replying to threads, and handling follow-ups using dynamically generated content. It uses the native n8n Gmail nodes available from v1.88.0 onwards.

Who is this template for?

Ideal for developers, automation engineers, and power users on self-hosted n8n (v1.88.0+) who want to integrate artificial intelligence directly into their email workflows via a dedicated MCP Server, for enhanced control and customization over AI interactions.

What problem does this workflow solve?

- ⚙️ Reduces Manual Effort: Decreases the work involved in writing, sending, and following up on emails in Gmail.
- ✅ Consistency and Quality: Ensures standardized, professional responses free from typos by leveraging controlled AI prompts.
- 🔄 Complete Automation: Automates the entire email cycle: from the initial send, through waiting for a reply, to sending automated follow-ups based on AI logic.

Workflow Overview

This template provides a structured approach to integrating Gmail with an MCP Server:

- 📡 MCP Trigger (“MCP_GMAIL”): An n8n Webhook node that receives HTTP calls from your MCP Server. It standardizes the inputs (such as recipient, subject, and AI prompt) for all subsequent Gmail nodes. (You will need to configure your MCP Server to call this webhook URL.)
- 📤 SEND_EMAIL (Gmail Node v2.1): Sends new messages. The email body (message field) is typically populated with content generated from an AI prompt processed by your MCP Server and passed via the trigger.
- 🔄 REPLY_EMAIL (Gmail Node v2.1): Automatically replies to existing conversations (threads). It uses AI-generated content (via MCP) to formulate the reply based on the thread context. Requires a Message ID and/or Thread ID.
- 📥 GET_EMAIL (Gmail Node v2.1): Fetches data for a specific message (using its Message ID) for analysis, processing, or archiving. Useful for retrieving context before replying.
- ⏳ SEND_AND_WAIT (Gmail Node v2.1): Sends an email and pauses the workflow execution until a reply is received in that specific conversation (thread). This is crucial for building automated follow-up sequences. It then outputs data from the reply message.

*Note:* All Gmail nodes in this template use the native n8n *Gmail Tool*, integrated since v1.88.0. No additional installation of community nodes is required. See the official n8n documentation for more details on node configuration.

Prerequisites

Ensure you have the following before importing:

- 🚀 A self-hosted n8n instance running version 1.88.0 or higher.
- ☁️ A Google Cloud project with the Gmail API correctly enabled.
- 🔑 Gmail OAuth2 credentials configured in your n8n instance. Navigate to Settings > Credentials > New > Google > Gmail (OAuth2 API) to set this up if you haven't already.
- 🧠 Access to your MCP Server and an API key (or other authentication method) required to interact with it via HTTP requests.

How to Import and Configure

Follow these steps to get the template running:

1. In your n8n interface, navigate to Templates → Import from URL.
2. Paste the JSON link provided for this workflow template.
3. Configure the necessary credentials within n8n under Credentials:
   - Gmail OAuth2: Select the Google OAuth2 credential you previously configured that has access to the desired Gmail account.
   - MCP API Key: You'll likely need to configure credentials for interacting with your MCP Server. This might involve setting up a Header Auth credential in n8n with your MCP API key, or configuring the HTTP Request node within the workflow directly, depending on your MCP's authentication scheme. Link this credential where needed (e.g., in the trigger node if the MCP calls n8n with auth, or in HTTP Request nodes if n8n calls the MCP).
4. Activate the workflow: ensure the workflow toggle is set to "Active" in the top right corner.
5. Webhook URL: Copy the webhook URL from the "MCP_GMAIL" trigger node (test or production URL as needed) and configure your MCP Server to send requests to this URL.

Recommendation: Rename the nodes with clear, descriptive names relevant to your specific use case (e.g., ✨ Generate Sales Email Body via MCP, 📥 Fetch Customer Replies). Use the workflow notes (sticky notes on the canvas) to document specific logic, prompt details, or configuration choices for future reference.

Customization & Technical Guidance

Tailor the workflow to your specific needs:

- 🔍 Search Filters: In nodes that fetch emails (like GET_EMAIL, or a Gmail - Get Many node if you add one), refine the search using the Search field (standard Gmail search operators) or filter by Label Names to process specific emails (e.g., unread messages from a specific sender, or emails with a certain label).
- ✍️ AI Fine-Tuning (Prompt Engineering): The core of the AI integration happens in the prompts sent to your MCP Server. Modify these prompts (often constructed in function nodes, or directly in the trigger input expected from the MCP) in the message or body fields passed to the send/reply nodes. Adjust prompts to control:
  - Tone & Style: Formal, informal, empathetic, technical, etc.
  - Content & Format: Request bullet points, summaries, specific data extraction.
  - Dynamic Variables: Inject data from previous n8n nodes (e.g., customer name, order details, previous email content) into the prompt for context-aware generation. Example: `"Reply to the following email thread [{{ $json.thread_content }}] addressing the customer {{ $json.customer_name }} about their query..."`
- 🔗 Post-Response Actions: Extend the workflow after key actions, especially the SEND_AND_WAIT node. Add nodes to:
  - Log results to a database (MySQL, PostgreSQL, Airtable).
  - Update CRM records (HubSpot, Salesforce).
  - Send notifications (Slack, Discord, Telegram).
  - Trigger other n8n workflows.
- 🛡️ Error Handling: Implement robust error handling. Connect the red output pins (error output) of critical nodes (like Gmail nodes or HTTP Requests to the MCP) to an Error Trigger node. From there, you can:
  - Log detailed error information to a monitoring tool or spreadsheet.
  - Send failure notifications.
  - Implement retry logic (using loops or per-node retry settings).
  - Route to alternative paths or fallback workflows.
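The MCP_GMAIL trigger "standardizes the inputs (like recipient, subject, AI prompt)" and the template favors fail-fast behavior on bad input. A hedged sketch of what such a standardization step could look like in a Code node follows; the field names (`to`, `subject`, `prompt`, `threadId`) are assumptions, not a defined MCP payload schema:

```javascript
// Sketch of webhook-payload standardization: validate required fields,
// fail fast on malformed input, and trim what the Gmail nodes will use.
// Field names are illustrative assumptions.
function normalizePayload(body) {
  const required = ['to', 'subject', 'prompt'];
  const missing = required.filter((f) => !body || !body[f]);
  if (missing.length > 0) {
    // Fail fast: halt before any Gmail node runs on a bad request
    throw new Error(`Malformed payload, missing: ${missing.join(', ')}`);
  }
  return {
    to: String(body.to).trim(),
    subject: String(body.subject).trim(),
    prompt: String(body.prompt).trim(),
    threadId: body.threadId || null, // optional; only needed for replies
  };
}

const ok = normalizePayload({
  to: 'a@b.com',
  subject: 'Hi',
  prompt: 'Write a short greeting',
});
```

Downstream, `to` and `subject` would feed SEND_EMAIL, `prompt` would drive the AI-generated body, and a non-null `threadId` would route the request to REPLY_EMAIL instead.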