by phil
This workflow automates voice reminders for upcoming appointments by generating a professional audio message and sending it to clients via email with the voice file attached. It integrates Google Calendar to track appointments, ElevenLabs to generate high-quality voice messages, and Gmail to deliver them efficiently.

Who Needs Automated Voice Appointment Reminders?
This automated voice appointment reminder system is ideal for businesses that rely on scheduled appointments. It helps reduce no-shows, improve client engagement, and streamline communication.
- Medical Offices & Clinics – Ensure patients receive timely appointment reminders.
- Real Estate Agencies – Keep potential buyers and renters informed about property visits.
- Service-Based Businesses – Perfect for salons, consultants, therapists, and coaches.
- Legal & Financial Services – Help clients remember important meetings and consultations.

If your business depends on scheduled appointments, this workflow saves time and enhances client satisfaction.

🚀 Why Use This Workflow?
- Ensures clients receive timely reminders.
- Reduces appointment no-shows and scheduling issues.
- Automates the process with a personalized voice message.

Step-by-Step: How This Workflow Automates Voice Reminders
1. Trigger the Workflow – The system runs manually or on a schedule to check upcoming appointments in Google Calendar.
2. Retrieve Appointment Data – It fetches event details (client name, time, and location) from Google Calendar. The workflow uses the summary, start.dateTime, location, and attendees[0].email fields from Google Calendar to personalize and send the voice reminders.
3. Generate a Voice Reminder – Using ElevenLabs, the workflow converts the appointment details into a natural-sounding voice message (see the sketch at the end of this section).
4. Send via Email – The generated audio file is attached to an email and sent to the client as a reminder.

Customization: Tailor the Workflow to Your Business Needs
- Adjust Trigger Frequency – Modify the scheduling to run daily, hourly, or at specific intervals.
- Customize Voice Message Format – Change the script structure and voice tone to match your business needs.
- Change Notification Method – Instead of email, integrate SMS or WhatsApp for delivery.

🔑 Prerequisites
- **Google Calendar Access** – Ensure you have access to the calendar with scheduled appointments.
- ElevenLabs API Key – Required for generating voice messages (you can start for free).
- **Gmail API Access** – Needed for sending reminder emails.
- **n8n Setup** – The workflow runs on an n8n instance (self-hosted or cloud).

🚀 Step-by-Step Installation & Setup
1. **Set Up Google Calendar API** – Go to Google Cloud Console, create a new project, and enable the Google Calendar API. Generate OAuth 2.0 credentials and save them for n8n.
2. **Get an ElevenLabs API Key** – Sign up at ElevenLabs and retrieve your API key from the dashboard.
3. **Configure Gmail API** – Enable the Gmail API in Google Cloud Console, create OAuth credentials, and authorize your email address for sending.
4. **Deploy n8n & Install the Workflow** – Install n8n (Installation Guide), add the required Google Calendar, ElevenLabs, and Gmail nodes, import or build the workflow with the correct credentials, then test and fine-tune as needed.

⚠ Important: The LangChain Community node used in this workflow only works on self-hosted n8n instances. It is not compatible with n8n Cloud. Please ensure you are running a self-hosted instance before using this workflow.

Summary
This workflow ensures a professional and seamless experience for your clients, keeping them informed and engaged. 🚀🔊

Phil | Inforeole
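For reference, the Generate a Voice Reminder step boils down to a text-to-speech request like the following minimal Node.js sketch (Node 18+ with global fetch). It assumes the public ElevenLabs text-to-speech endpoint; the voice ID, model name, and field mapping are placeholders to adapt, and in the workflow itself this call is made by an HTTP Request node rather than code.

```javascript
// Hedged sketch of the ElevenLabs call behind "Generate a Voice Reminder".
// The voice ID and API key are placeholders; adapt the event fields to your calendar data.
async function generateReminderAudio(event) {
  // Fields taken from the Google Calendar event, as used by the workflow:
  // summary, start.dateTime, location, attendees[0].email
  const reminderText =
    `Hello, this is a reminder for your appointment "${event.summary}" ` +
    `on ${event.start.dateTime}` +
    (event.location ? `, at ${event.location}.` : '.');

  const res = await fetch(
    'https://api.elevenlabs.io/v1/text-to-speech/YOUR_VOICE_ID', // placeholder voice ID
    {
      method: 'POST',
      headers: {
        'xi-api-key': process.env.ELEVENLABS_API_KEY, // your ElevenLabs key
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({ text: reminderText, model_id: 'eleven_multilingual_v2' }),
    },
  );
  if (!res.ok) throw new Error(`ElevenLabs request failed: ${res.status}`);

  // The API responds with MP3 audio, which the workflow attaches to the Gmail message.
  return Buffer.from(await res.arrayBuffer());
}
```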
by Hendriekus
Find OAuth URIs with AI Llama

Overview
The AI agent identifies:
- Authorization URI
- Token URI
- Audience

Methodology
Confidence scoring is used to assess the trustworthiness of extracted data:
- Score range: 0 < x ≤ 1
- Score granularity: 0.01 increments

Model Details
Leverages the Wayfarer Large 70b Llama 3.3 model.

How it works
This template is designed to assist users in obtaining OAuth2 settings using AI-powered insights. It is ideal for developers, IT professionals, or anyone working with APIs that require OAuth2 authentication. By leveraging the AI agent, users can simplify the process of extracting and validating key details such as the authorization_url, token_url, and audience.

Set up instructions
1. Configuration Nodes
   - **Structured Output Node**: Parses the AI model's output using a predefined JSON schema. This ensures the data is structured for downstream processing.
   - **Code Node**: If the AI model's output does not match the required format, use the Code node to re-arrange and transform the data. An example snippet for this common scenario is sketched below.
2. AI Model Prompt
   The prompt for the AI model includes:
   - A detailed description of the structure and objectives of the query.
   - Flexibility for the model to improvise when accurate results cannot be determined.
3. Confidence Scoring
   The AI model assigns a confidence score (0 < x ≤ 1) to indicate the reliability of the extracted data. Scores are provided in increments of 0.01 for granularity.

Adaptability
Customize this template by:
- Updating the AI model prompt with details specific to your API or OAuth2 setup.
- Adjusting the JSON schema in the Structured Output node to match the data format.
- Modifying the Code node logic to suit the application's requirements.
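A minimal sketch of such a Code node snippet, assuming the target schema uses authorization_url, token_url, audience, and confidence fields; adjust the key fallbacks to whatever your model actually returns.

```javascript
// Normalize the model output into the shape expected by the Structured Output parser.
// Field names mirror the template's target schema; the fallbacks cover common key variants.
const raw = $input.first().json;

// The model sometimes nests its answer or uses slightly different keys.
const output = raw.output ?? raw.data ?? raw;

return [{
  json: {
    authorization_url: output.authorization_url ?? output.authorizationUri ?? null,
    token_url: output.token_url ?? output.tokenUri ?? null,
    audience: output.audience ?? null,
    // Clamp the confidence score into the documented range (0, 1], rounded to 0.01.
    confidence: Math.min(1, Math.max(0.01, Math.round((output.confidence ?? 0.5) * 100) / 100)),
  },
}];
```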
by Adam Bertram
An AI-powered chat assistant that analyzes Azure virtual machine activity and generates detailed timeline reports showing VM state changes, performance metrics, and operational events over time.

How It Works
The workflow starts with a chat trigger that accepts user queries about Azure VM analysis. A Google Gemini AI agent processes these requests and uses six specialized tools to gather comprehensive VM data from Azure APIs. The agent queries resource groups, retrieves VM configurations and instance views, pulls performance metrics (CPU, network, disk I/O), and collects activity log events. It then analyzes this data to create timeline reports showing what happened to VMs during specified periods, defaulting to the last 90 days unless the user specifies otherwise.

Prerequisites
To use this template, you'll need:
- n8n instance (cloud or self-hosted)
- Azure subscription with virtual machines
- Microsoft Azure Monitor OAuth2 API credentials
- Google Gemini API credentials
- Proper Azure permissions to read VM data and activity logs

Setup Instructions
1. Import the template into n8n.
2. Configure credentials:
   - Add Microsoft Azure Monitor OAuth2 API credentials with read permissions for VMs and activity logs.
   - Add Google Gemini API credentials.
3. Update workflow parameters:
   - Open the "Set Common Variables" node.
   - Replace <your azure subscription id here> with your actual Azure subscription ID.
4. Configure triggers:
   - The chat trigger automatically generates a webhook URL for receiving chat messages.
   - No additional trigger configuration is needed.
5. Test the setup to ensure it works.

Security Considerations
Use minimum required Azure permissions (Reader role on the subscription or resource groups). Store API credentials securely in the n8n credential store. The Azure Monitor API has rate limits, so avoid excessive concurrent requests. Chat sessions use session-based memory that persists during conversations but doesn't retain data between separate chat sessions.

Extending the Template
You can add more Azure monitoring tools like disk metrics, network security group logs, or Application Insights data. The AI agent can be enhanced with additional tools for Azure cost analysis, security recommendations, or automated remediation actions. You could also integrate with alerting systems or export reports to external storage or reporting platforms.
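For orientation, one of the agent's metrics tools corresponds to a request like the following sketch against the Azure Monitor REST API. The resource names, timespan, and raw bearer token are placeholders; in the workflow, the HTTP tool authenticates with the Azure Monitor OAuth2 credential instead.

```javascript
// Hedged sketch of an Azure Monitor metrics query for one VM (Percentage CPU, hourly averages).
async function getCpuMetrics(subscriptionId, resourceGroup, vmName, accessToken) {
  const resourceId =
    `/subscriptions/${subscriptionId}/resourceGroups/${resourceGroup}` +
    `/providers/Microsoft.Compute/virtualMachines/${vmName}`;

  const params = new URLSearchParams({
    'api-version': '2018-01-01',
    metricnames: 'Percentage CPU',
    timespan: '2024-01-01T00:00:00Z/2024-03-31T00:00:00Z', // e.g. the last 90 days
    interval: 'PT1H',
    aggregation: 'Average',
  });

  const res = await fetch(
    `https://management.azure.com${resourceId}/providers/microsoft.insights/metrics?${params}`,
    { headers: { Authorization: `Bearer ${accessToken}` } },
  );
  if (!res.ok) throw new Error(`Azure Monitor request failed: ${res.status}`);
  return res.json(); // time series the agent folds into its timeline report
}
```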
by Oneclick AI Squad
This automated n8n workflow monitors ingredient price changes from external APIs or manual sources, analyzes historical trends, and provides smart buying recommendations. The system tracks price fluctuations in a PostgreSQL database, generates actionable insights, and sends alerts via email and Slack to help restaurants optimize their purchasing decisions.

What is Price Trend Analysis?
Price trend analysis uses historical price data to identify patterns and predict optimal buying opportunities. The system analyzes price movements over time and generates recommendations on when to buy ingredients based on current trends and historical patterns.

Good to Know
- Price data accuracy depends on the reliability of external API sources.
- Historical data improves recommendation accuracy over time (recommended minimum 30 days).
- The PostgreSQL database provides robust data storage and complex trend analysis capabilities.
- Real-time alerts help capture optimal buying opportunities.
- The dashboard provides visual insights into price trends and recommendations.

How It Works
1. Daily Price Check – Triggers the workflow daily to monitor price changes.
2. Fetch API Prices – Retrieves the latest prices from an external ingredient pricing API.
3. Setup Database – Ensures database tables are ready before inserting new data.
4. Store Price Data – Saves current prices to the PostgreSQL database for tracking.
5. Calculate Trends – Analyzes historical prices to detect patterns and price movements.
6. Generate Recommendations – Suggests actions based on price trends (buy/wait/stock up).
7. Store Recommendations – Saves recommendations for future reporting.
8. Get Dashboard Data – Gathers the data needed for dashboard generation.
9. Generate Dashboard HTML – Builds an HTML dashboard to visualize insights.
10. Send Email Report – Emails the dashboard report to stakeholders.
11. Send Slack Alert – Sends key alerts or recommendations to Slack channels.

Database Structure
The workflow uses PostgreSQL with two main tables.

price_history – historical price tracking:
- id (Primary Key)
- ingredient (VARCHAR 100) – Name of the ingredient
- price (DECIMAL 10,2) – Current price value
- unit (VARCHAR 50) – Unit of measurement (kg, lbs, etc.)
- supplier (VARCHAR 100) – Source supplier name
- timestamp (TIMESTAMP) – When the price was recorded
- created_at (TIMESTAMP) – Record creation time

buying_recommendations – AI-generated buying suggestions:
- id (Primary Key)
- ingredient (VARCHAR 100) – Ingredient name
- current_price (DECIMAL 10,2) – Latest price
- price_change_percent (DECIMAL 5,2) – Percentage change from the previous price
- trend (VARCHAR 20) – Price trend direction (INCREASING/DECREASING/STABLE)
- recommendation (VARCHAR 50) – Buying action (BUY_NOW/WAIT/STOCK_UP)
- urgency (VARCHAR 20) – Urgency level (HIGH/MEDIUM/LOW)
- reason (TEXT) – Explanation for the recommendation
- generated_at (TIMESTAMP) – When the recommendation was created

Price Trend Analysis
The system analyzes historical price data over the last 30 days to calculate percentage changes, identify trends (INCREASING/DECREASING/STABLE), and generate actionable buying recommendations based on price patterns and movement history.
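As an illustration of the Calculate Trends and Generate Recommendations steps, the sketch below shows one way the logic can look in an n8n Code node. The 5% and 15% thresholds and the trend-to-recommendation mapping are assumptions, not the template's exact rules.

```javascript
// Sketch of trend calculation for one ingredient; items are price_history rows, newest first.
const rows = $input.all().map(item => item.json);

const current = rows[0];
const previous = rows[1] ?? current;
const changePercent = previous.price
  ? ((current.price - previous.price) / previous.price) * 100
  : 0;

let trend = 'STABLE';
if (changePercent > 5) trend = 'INCREASING';
else if (changePercent < -5) trend = 'DECREASING';

// Map the trend to a buying action and urgency level (assumed mapping).
const recommendation =
  trend === 'DECREASING' ? 'STOCK_UP' :
  trend === 'INCREASING' ? 'WAIT' : 'BUY_NOW';
const urgency = Math.abs(changePercent) > 15 ? 'HIGH' : Math.abs(changePercent) > 5 ? 'MEDIUM' : 'LOW';

return [{
  json: {
    ingredient: current.ingredient,
    current_price: current.price,
    price_change_percent: Math.round(changePercent * 100) / 100,
    trend,
    recommendation,
    urgency,
    reason: `Price moved ${changePercent.toFixed(1)}% vs. the previous reading (${trend.toLowerCase()}).`,
  },
}];
```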
How to Use
1. Import the workflow into n8n.
2. Configure PostgreSQL database connection credentials.
3. Set up external ingredient pricing API access.
4. Configure email credentials for dashboard reports.
5. Set up Slack webhook or bot credentials for alerts.
6. Run the Setup Database node to create the required tables and indexes.
7. Test with sample ingredient data to verify price tracking and recommendations.
8. Adjust trend analysis parameters based on your purchasing patterns.
9. Monitor recommendations and refine thresholds based on actual buying decisions.

Requirements
- PostgreSQL database access
- External ingredient pricing API credentials
- Email service credentials (Gmail, SMTP, etc.)
- Slack webhook URL or bot credentials
- Historical price data for initial trend analysis

Customizing This Workflow
Modify the Calculate Trends node to adjust the analysis period (currently 30 days) or add seasonal adjustments. Customize the recommendation logic to match your restaurant's buying patterns, budget constraints, or supplier agreements. Add additional data sources like weather forecasts or market reports for more sophisticated predictions.
by Jimleuk
This n8n template demonstrates an approach to image embeddings for the purpose of building a quick contextual image search. Use-cases could include a personal photo library, product recommendations, or searching through video footage.

How it works
- A photo is imported into the workflow via Google Drive.
- The photo is processed by the Edit Image node to extract colour information. This information forms part of the semantic metadata used to identify the image.
- The photo is also processed by a vision-capable model which analyses the image and returns a short description with semantic keywords.
- Both pieces of information about the image are combined with the image's metadata to form a document describing the image.
- This document is then inserted into the vector store as a text embedding associated with the image.
- From here, the user can query the vector store as they would any document, and the relevant image references and/or links should be returned.

Requirements
- Google account to download image files from Google Drive.
- OpenAI account for the vision-capable AI and embedding models.

Customise this workflow
Text summarisation is just one of many techniques to generate image embeddings. If the results are unsatisfactory, there are dedicated image embedding models such as Google's Vertex AI multimodal embeddings.
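As a sketch of the document-building step, the snippet below shows one way to combine the colour analysis, vision description, and file metadata into the text that gets embedded. Field names are illustrative and should be mapped to your actual node outputs.

```javascript
// Assemble the embedding document for one image; field names are assumptions.
const item = $input.first().json;

const embeddingDocument = [
  `Filename: ${item.fileName}`,
  `Source: ${item.webViewLink}`,            // Google Drive link kept as retrievable metadata
  `Dominant colours: ${item.dominantColours?.join(', ')}`,
  `Description: ${item.visionDescription}`, // short description from the vision model
  `Keywords: ${item.keywords?.join(', ')}`,
].join('\n');

// The vector store node embeds this text; the metadata lets search results
// point back to the original image.
return [{
  json: {
    pageContent: embeddingDocument,
    metadata: { fileId: item.fileId, link: item.webViewLink },
  },
}];
```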
by Tarek Mustafa
Who is this for?
Jira users who want to automate the generation of a Lessons Learned or Retrospective report after an Epic is Done.

What problem is this workflow solving? / use case
Lessons Learned / Retrospective reports are often omitted in Agile teams because they take time to write. With n8n and AI, this process can be automated.

What is this workflow doing?
- Triggers automatically upon an Epic reaching the "Done" status in Jira.
- Collects all related tasks and comments associated with the completed Epic.
- Intelligently filters the gathered data to provide the LLM with the most relevant information.
- Utilizes an LLM with a structured System Message to generate insightful reports.
- Delivers the finalized report directly to your specified Google Docs document.

Setup
- Create a Jira API key and follow the Credentials Setup in the Jira trigger node.
- Create credentials for Google Docs and paste your document ID into the node.

How to customize this workflow to your needs
Change the System Message in the AI Agent to fit your needs.
by Jimleuk
Note: This template only works for self-hosted n8n.

This n8n template demonstrates how to use the Langchain Code node to track token usage and cost for every LLM call. This is useful if your templates handle multiple clients or customers and you need a cheap and easy way to capture how much of your AI credits they are using.

How it works
- In our mock AI service, we're offering a data conversion API to convert resume PDFs into JSON documents.
- A form trigger allows for PDF upload, and the file is parsed using the Extract from File node.
- An Edit Fields node captures additional variables to send to our log.
- Next, we use the Information Extractor node to organise the resume data into the given JSON schema.
- The LLM subnode attached to the Information Extractor is a custom one built using the Langchain Code node.
- With this custom LLM subnode, we're able to capture the usage metadata using lifecycle hooks.
- A Google Sheets tool is also attached to the LLM subnode, allowing us to send the usage metadata to a Google Sheet.
- Finally, we demonstrate how you can aggregate from the Google Sheet to understand how many AI tokens, and how much cost, your clients are liable for.

Check out the example Client Usage Log: https://docs.google.com/spreadsheets/d/1AR5mrxz2S6PjAKVM0edNG-YVEc6zKL7aUxHxVcffnlw/edit?usp=sharing

How to use
- **SELF-HOSTED N8N ONLY** – the Langchain Code node is only available in the self-hosted version of n8n. It is not available in n8n Cloud.
- The LLM subnode can only be attached to non-"AI Agent" nodes: Basic LLM node, Information Extractor, Question & Answer Chain, Sentiment Analysis, Summarization Chain, and Text Classifier.

Requirements
- Self-hosted version of n8n
- OpenAI for the LLM
- Google Sheets to store usage metadata

Customising this template
- Bring the custom LLM subnode into your own templates! In many cases, it can be a drop-in replacement for the regular OpenAI subnode.
- Not using Google Sheets? Try other databases or an HTTP call to pipe usage data into your CRM.
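To illustrate the lifecycle-hook idea outside of n8n, here is a plain LangChain.js sketch: a callback on the chat model records token usage after every call. The pricing constants and the logUsage destination are assumptions for illustration, not the template's exact subnode code.

```javascript
import { ChatOpenAI } from '@langchain/openai';

const COST_PER_1K = { prompt: 0.0005, completion: 0.0015 }; // assumed per-1K-token pricing

async function logUsage(row) {
  // In the template this is a Google Sheets append; here it is just printed.
  console.log('usage log:', row);
}

const model = new ChatOpenAI({
  model: 'gpt-4o-mini',
  callbacks: [
    {
      // Lifecycle hook fired when the LLM call finishes; usage metadata rides along.
      async handleLLMEnd(output) {
        const usage = output.llmOutput?.tokenUsage ?? {};
        await logUsage({
          promptTokens: usage.promptTokens ?? 0,
          completionTokens: usage.completionTokens ?? 0,
          estimatedCost:
            ((usage.promptTokens ?? 0) / 1000) * COST_PER_1K.prompt +
            ((usage.completionTokens ?? 0) / 1000) * COST_PER_1K.completion,
          timestamp: new Date().toISOString(),
        });
      },
    },
  ],
});

// Any call through this model now produces a usage log entry as a side effect.
await model.invoke('Summarise this resume: ...');
```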
by Derek Cheung
Use case
This workflow enables a Telegram bot that can:
- Accept speech input in one of 55 supported languages.
- Automatically detect the language spoken and translate the speech to another language.
- Respond back with the translated speech output.

This allows users to communicate across language barriers by simply speaking to the bot, which handles the translation seamlessly.

How does it work?
Translation
In the translation step, the workflow converts the user's speech input to text and detects the language of the input text. If it's English, it will translate to French. If it's French, it will translate to English. To change the default translation languages, you can update the prompt in the AI node.

Output
In the output step, we provide the translated text back to the user, and speech output is generated in the translated language.

Setup steps
1. Obtain a Telegram API token: start a chat with the BotFather, enter /newbot, and reply with your new bot's display name and username. Copy the bot token and use it in the Telegram node credentials in n8n.
2. Update the Settings node to customize the desired languages.
3. Activate the flow.

Full list of supported languages:
by Radouane Driouich
Automatically Categorize Gmail Emails with GPT-4o-mini Multi-Label Analysis

Description
The "Automatically Categorize Gmail Emails with GPT-4o-mini Multi-Label Analysis" template is designed for professionals, business owners, entrepreneurs, and anyone struggling to manage a high volume of daily emails. It solves common inbox problems such as email overload, missed important messages, manual sorting inefficiencies, and unorganized inbox clutter. By using intelligent content analysis powered by GPT-4o-mini, this workflow automatically categorizes incoming Gmail messages with relevant labels, ensuring efficient email management and significantly boosting productivity.

Workflow Overview – How It Works
- **Email Detection**: Continuously monitors your Gmail inbox every minute to detect new incoming emails.
- **Content Extraction**: Retrieves key email components including sender details, subject line, and body content for analysis.
- **Intelligent Labeling**: Utilizes GPT-4o-mini AI to contextually analyze each email and assign 1-3 relevant labels based on your existing Gmail label structure.
- **Automatic Application**: Applies the selected labels directly to your emails, with robust error handling to ensure accuracy and reliability.

Key Benefits
- **Organized Inbox**: Automatically maintains inbox order and clarity.
- **Time-Saving**: Significantly reduces manual email management effort.
- **Customization**: Fully adaptable to specific labeling and organizational requirements.

Pre-conditions
Before using this template, ensure the following prerequisites are met:
- Active Gmail account with OAuth2 enabled.
- Active OpenAI account with a GPT-4o-mini API key.
- Clearly defined labels set up in your Gmail account (e.g., "Work", "Personal", "Urgent").

Setup Instructions
1. Connect Gmail Account – Authorize your Gmail account using OAuth2 (takes approximately 2-3 minutes).
2. Configure the OpenAI GPT-4o-mini API – Enter and validate your GPT-4o-mini API key to enable advanced email analysis.
3. Establish Gmail Labels – Ensure the necessary labels are created within Gmail. Examples include "Work", "Personal", and "Urgent".
4. Activate and Verify – Click the "Activate" button in n8n, then send a test email to your Gmail inbox to confirm that labels are applied correctly.

Customization Tips
- **Modify Gmail Labels**: Create and adapt labels to match your business or personal categorization strategy.
- **Adjust GPT-4o-mini Criteria**: Fine-tune the AI prompts to improve accuracy and relevance based on your unique email management needs.
- **Expand the Workflow**: Integrate additional conditions, actions, or external applications to further automate and optimize your email management processes.

Improve your daily workflow efficiency and achieve a clutter-free Gmail inbox by leveraging the power of GPT-4o-mini today.
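As a starting point, the prompt sent to GPT-4o-mini can be assembled along the lines below; the exact wording in the template may differ, the email field names are illustrative, and the label list is a placeholder for the labels fetched from your Gmail account.

```javascript
// Build the multi-label classification prompt from the extracted email fields.
const email = $input.first().json;
const labels = ['Work', 'Personal', 'Urgent']; // replace with labels fetched from Gmail

const prompt = [
  'You label emails. Choose between 1 and 3 labels from this list only:',
  labels.join(', '),
  '',
  `From: ${email.from}`,           // field names are illustrative — map to your Gmail node's output
  `Subject: ${email.subject}`,
  `Body: ${email.textPlain?.slice(0, 2000)}`, // truncate long bodies to keep token usage low
  '',
  'Respond with a JSON array of label names, e.g. ["Work","Urgent"].',
].join('\n');

return [{ json: { prompt, messageId: email.id } }];
```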
by Jimleuk
This n8n template demonstrates how you can leverage an existing support site search to power your support chatbots and agents.

Building a support chatbot need not be complicated! If building and indexing vector stores or duplicating data isn't necessarily your thing, an alternative implementation of the RAG approach is to leverage existing knowledge bases such as support portals. In this way, document management and maintenance of your support agent is significantly reduced.

Disclaimer: This template example uses AcuityScheduling's help center website but is not associated with, supported nor endorsed by the company.

How it works
- A simple AI agent is connected to a chat trigger to receive user queries.
- The AI agent is instructed to fetch information from the knowledge base via the attached custom workflow tool (aka the "knowledgebase tool").
- There is no step to replicate the entire support article database into a vector store. You may choose not to because of the time, cost and maintenance involved. Instead, the tool leverages the existing support portal's search API to retrieve knowledge-base articles.
- Finally, the search results are formatted before sending an aggregated response back to the agent.

How to use
Customise the subworkflow to work with your own support portal API and format accordingly. Try the following queries:
- How do I connect my iCloud to AcuityScheduling?
- How do I download past invoices for my Acuity account?

Requirements
- OpenAI for the LLM.
- If your organisation's APIs require authorisation, you may need to add custom credentials as necessary.

Customising this workflow
- Add additional tools to reach other parts of your internal knowledge base.
- Not using OpenAI? Feel free to swap, but ensure the LLM has tools/function-calling support.
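A sketch of what the knowledgebase tool can look like, assuming a hypothetical help-center search endpoint; swap in your own portal's search URL and response fields.

```javascript
// Query a support portal's search API and format the hits for the agent.
async function searchKnowledgeBase(query) {
  const res = await fetch(
    `https://help.example.com/api/v1/search?query=${encodeURIComponent(query)}`, // hypothetical endpoint
  );
  if (!res.ok) throw new Error(`Search request failed: ${res.status}`);
  const { results = [] } = await res.json();

  // Aggregate the top hits into a compact text block the agent can cite from.
  return results
    .slice(0, 5)
    .map((r, i) => `${i + 1}. ${r.title}\n${r.snippet}\nURL: ${r.url}`)
    .join('\n\n');
}

// Example: the agent passes the user's question straight through as the search query.
searchKnowledgeBase('How do I download past invoices for my account?')
  .then(console.log)
  .catch(console.error);
```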
by Ranjan Dailata
Who is this for?
This workflow is designed for HR professionals, employer branding teams, talent acquisition strategists, market researchers, and business intelligence analysts who want to monitor, understand, and act upon employee sentiment and company perception on Glassdoor. It's ideal for organizations that value real-time feedback, track employer brand perception, or need summarized insights for leadership reporting without sifting through thousands of raw reviews.

What problem is this workflow solving?
Manually reviewing and analyzing Glassdoor reviews is tedious, subjective, and not scalable, especially for larger companies or those with many subsidiaries. This workflow:
- Automates review collection by making a Glassdoor company request via the Bright Data Web Scraper API.
- Uses Google Gemini to summarize the content.
- Sends an actionable summary to HR dashboards, leadership teams, or alert systems via the webhook notification.

What this workflow does
- Makes an HTTP request to Glassdoor via the Bright Data Web Scraper API.
- Polls Bright Data for completion of the Glassdoor request.
- Downloads the Glassdoor response when a new snapshot is ready.
- Sends the prompt to Google Gemini for summarization.
- Delivers the summarized insights (strengths, weaknesses, sentiment, patterns) to a configured webhook or dashboard endpoint.

Setup
- Sign up at Bright Data.
- Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
- In n8n, configure a Header Auth account under Credentials (Generic Auth Type: Header Authentication). The Value field should be set to Bearer XXXXXXXXXXXXXX, where XXXXXXXXXXXXXX is replaced by your Web Unlocker token.
- A Google Gemini API key (or access through Vertex AI or a proxy).
- A webhook or endpoint to receive the summary (e.g., Slack, Notion, or a custom HR dashboard).

How to customize this workflow to your needs
- Change the summary focus by updating the summarization methods and prompts in the Summarization of Glassdoor Response node to extract specific insights: cultural feedback, leadership issues, compensation comments, exit motivation.
- Update the HTTP Request to Glassdoor node with the specific Glassdoor company information you are looking for.
- Format the output to produce a customized summary in Markdown or HTML for rich delivery.
- Integrate with HR systems such as BambooHR, Workday, or SAP SuccessFactors via API, or with Google Sheets or Airtable.
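The polling step can be sketched as follows. The endpoint paths follow Bright Data's dataset API conventions but are assumptions to verify against your account's documentation; the token is the one configured as Header Auth in n8n.

```javascript
// Poll until the Bright Data snapshot is ready, then download it for summarization.
async function waitForSnapshot(snapshotId, token, { intervalMs = 15000, maxTries = 40 } = {}) {
  const headers = { Authorization: `Bearer ${token}` };

  for (let attempt = 0; attempt < maxTries; attempt++) {
    const progress = await fetch(
      `https://api.brightdata.com/datasets/v3/progress/${snapshotId}`, // assumed progress endpoint
      { headers },
    ).then(r => r.json());

    if (progress.status === 'ready') {
      // Download the finished Glassdoor snapshot as JSON for the Gemini summarization step.
      return fetch(
        `https://api.brightdata.com/datasets/v3/snapshot/${snapshotId}?format=json`, // assumed download endpoint
        { headers },
      ).then(r => r.json());
    }
    await new Promise(resolve => setTimeout(resolve, intervalMs)); // wait before polling again
  }
  throw new Error(`Snapshot ${snapshotId} was not ready after ${maxTries} polls`);
}
```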
by Msaid Mohamed el hadi
Automated YouTube Leads: Turn Comments into Enriched Prospects

Workflow Overview
This n8n workflow is a powerful automation tool designed to change how businesses and marketers identify and qualify leads directly from YouTube video comments. By leveraging specialized Apify Actors and an intelligent AI agent, it transforms raw comment data into comprehensive lead profiles, saving valuable time and resources.

This workflow automatically:

Discovers & Scrapes Comments
- Monitors a Google Sheet for new YouTube video URLs.
- Automatically extracts all comments from the specified YouTube videos using a dedicated Apify Actor (a call sketch appears at the end of this description).
- Marks videos as scraped to avoid reprocessing.

Intelligent Lead Enrichment
- Retrieves unprocessed comments from Google Sheets.
- Activates an AI agent (powered by OpenRouter models) to research comment authors.
- Utilizes Google Search (via the Serper API) and specialized Apify scrapers (for website content and Instagram profiles) to find publicly available information such as social media links, bios, and potential contact details.
- Generates concise descriptions for each lead based on the gathered data.

Organized Data Storage
- Creates new entries in a dedicated Google Sheet for each new lead.
- Updates lead profiles with all discovered enriched data (email, social media, short bio, etc.).
- Marks comments as processed once their authors have been researched and enriched.

Key Benefits
- 🤖 Full Automation: Eliminates manual data collection and research, freeing up your team for strategic tasks.
- 💡 Smart Lead Enrichment: AI intelligently sifts through information to build rich, actionable lead profiles.
- ⏱️ Time-Saving: Instant, scalable lead generation without human intervention.
- 📈 Enhanced Lead Quality: Go beyond basic contact info with comprehensive social and professional context.
- 📊 Centralized Data: All leads are neatly organized in Google Sheets for easy access and integration.

Setup Requirements

n8n Installation
- Install n8n (cloud or self-hosted).
- Import the workflow configuration.
- Configure API credentials.
- Set up scheduling preferences for continuous operation.

Google Sheets Credentials
- A Google Cloud API key with access to Google Sheets.
- Set up OAuth2 authentication in n8n for read/write access to your "youtube leads" spreadsheet (containing the "videos", "comments", and "leads" sheets).

OpenRouter API Access
- Create an OpenRouter account.
- Generate an API key to access their chat models (e.g., google/gemini-2.5-flash-preview-05-20) for AI agent operations.

Apify API Access
- Create an Apify account and generate a personal API token. This token is used to run the following Apify Actors:
  - mohamedgb00714/youtube-video-comments (for comment extraction)
  - mohamedgb00714/fireScraper-AI-Website-Content-Markdown-Scraper (for website content extraction)
  - mohamedgb00714/instagram-full-profile-scraper (for Instagram profile details)

Serper API Key
- Sign up for an account on Serper.dev and obtain an API key for performing Google searches to find social media profiles and other information.

Potential Use Cases
- Content Creators: Identify highly engaged audience members for community building or direct outreach.
- Marketing Teams: Discover potential customers or influencers interacting with competitor content.
- Sales Professionals: Build targeted lead lists based on specific interests expressed in comments.
- Market Researchers: Analyze audience demographics and interests by enriching profiles of commenters on relevant videos.
- Recruiters: Find potential candidates based on their expertise or engagement in industry-specific discussions.

Future Enhancement Roadmap
- CRM Integration: Directly push enriched leads into popular CRM systems (e.g., HubSpot, Salesforce).
- Automated Outreach: Implement automated email or social media messaging for qualified leads.
- Sentiment Analysis: Analyze comment sentiment before enrichment to prioritize positive interactions.
- Multi-Platform Support: Expand comment extraction and lead enrichment to other platforms (e.g., TikTok, Facebook).
- Advanced Lead Scoring: Develop a scoring model based on engagement, profile completeness, and relevance.

Ethical Considerations
- Data Privacy: Ensure all collected data is publicly available and used in compliance with relevant privacy regulations (e.g., GDPR, CCPA).
- Platform Guidelines: Adhere strictly to YouTube's Terms of Service and Apify's usage policies.
- Transparency: If engaging with leads, be transparent about how their information was obtained (if applicable).
- No Spam: This tool is designed for lead identification, not for unsolicited mass messaging.

Technical Requirements
- n8n v1.0.0 or higher (recommended for the latest features and stability)
- Google Sheets API access
- OpenRouter API access
- Apify API access
- Serper API access
- Stable internet connection

Workflow Architecture
[YouTube Video URLs (Google Sheet)]
⬇️
[Schedule/Manual Trigger]
⬇️
[Extract Comments (Apify YouTube Scraper)]
⬇️
[Save Raw Comments (Google Sheet)]
⬇️
[AI Agent (OpenRouter) for Lead Research]
⬇️
[Google Search (Serper) & Web Scraping (Apify FireScraper/Instagram Scraper)]
⬇️
[Save Enriched Leads (Google Sheet)]
⬇️
[Mark Comments Processed (Google Sheet)]

Connect With Me
Exploring AI-Powered Lead Generation?
📧 Email: mohamedgb00714@gmail.com
💼 LinkedIn: Mohamed el Hadi Msaid

Transform your YouTube engagement into a powerful lead generation engine with intelligent, automated insights!
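For reference, the Extract Comments step mentioned in the overview corresponds to an Apify Actor run along these lines; the input field name is an assumption to check against the Actor's input schema, and in the workflow itself this is handled by the Apify integration or an HTTP Request node.

```javascript
// Run the YouTube comments Apify Actor synchronously and return the dataset items.
async function fetchYoutubeComments(videoUrl, apifyToken) {
  const actorId = 'mohamedgb00714~youtube-video-comments'; // "~" replaces "/" in Apify actor IDs

  const res = await fetch(
    `https://api.apify.com/v2/acts/${actorId}/run-sync-get-dataset-items?token=${apifyToken}`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ videoUrl }), // assumed input shape
    },
  );
  if (!res.ok) throw new Error(`Apify run failed: ${res.status}`);

  // Each item is one comment; the workflow appends these rows to the "comments" sheet.
  return res.json();
}
```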