by Rajeet Nair
## Overview
This workflow intelligently routes incoming user requests using AI-powered task classification. It determines whether a task is simple or complex, assigns a confidence score, and dynamically delegates execution to the appropriate agent. If the confidence score is too low, the workflow triggers a fallback email alert for manual review, ensuring reliability and preventing incorrect automation. This design improves response accuracy, enables scalable automation, and introduces human-in-the-loop safety for uncertain scenarios.

## How It Works
1. **Webhook Trigger**: Receives incoming user requests.
2. **Workflow Configuration**: Stores the user request and confidence threshold.
3. **Supervisor Agent**: Analyzes the request, classifies it as simple or complex, and returns a confidence score and reasoning.
4. **Structured Output Parser**: Ensures the classification follows a strict JSON format.
5. **Confidence Check (IF Node)**: Compares the confidence score with the threshold.
6. **Routing Logic**: If confidence is high, the task is passed to the Executor Agent, which selects the Simple Agent Tool for basic tasks or the Complex Agent Tool for advanced tasks.
7. **Agent Execution**: Each agent uses an OpenAI model to process the task.
8. **Fallback Handling**: If confidence is low, sends an email alert for human review.

## Setup Instructions
1. **OpenAI Credentials**: Add credentials for all OpenAI nodes: Supervisor, Executor, Simple Agent, and Complex Agent.
2. **Webhook Configuration**: Set the webhook path and connect it to your frontend or API source.
3. **Email Node Setup**: Configure sender and recipient email addresses. Use SMTP or a supported email service.
4. **Adjust Threshold**: Modify confidenceThreshold in the Set node if needed.
5. **Customize Prompts**: Update system messages in the Supervisor Agent, Executor Agent, and Simple/Complex Agents.

## Use Cases
- AI-powered task routing systems
- Customer support automation with fallback safety
- Intelligent chatbot orchestration
- Workflow automation with human-in-the-loop validation
- Multi-agent AI systems with decision control

## Requirements
- OpenAI API credentials
- Email (SMTP or service integration)
- n8n instance (cloud or self-hosted)

## Key Features
- AI-based task classification
- Confidence scoring for safe automation
- Dynamic agent routing
- Human fallback for low-confidence decisions
- Modular and scalable architecture

## Summary
A smart AI routing workflow that classifies tasks, routes them to specialized agents, and ensures reliability through confidence scoring and fallback alerts. Ideal for building safe, scalable automation systems in n8n.
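As a minimal sketch of the routing decision described above, here is how the confidence check and agent selection might look in an n8n Code node. The field names `task_type`, `confidence`, and `reasoning` are illustrative assumptions; match them to your Structured Output Parser schema.

```javascript
// Hypothetical parsed output from the Supervisor Agent's Structured Output Parser
const classification = {
  task_type: "complex",   // "simple" | "complex"
  confidence: 0.82,       // 0..1 score returned by the model
  reasoning: "Multi-step data transformation requested"
};

const confidenceThreshold = 0.7; // mirrors confidenceThreshold in the Set node

// High confidence goes to the Executor Agent; low confidence to the email fallback
function route(classification, threshold) {
  if (classification.confidence < threshold) {
    return "email_fallback"; // human-in-the-loop review
  }
  return classification.task_type === "simple" ? "simple_agent" : "complex_agent";
}

console.log(route(classification, confidenceThreshold)); // "complex_agent"
```

Keeping the threshold in a Set node (rather than hard-coding it in the IF node) is what makes it easy to tune without editing the routing logic.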
by Robert Breen
This n8n workflow template creates an efficient data analysis system that uses Google Gemini AI to interpret user questions about spreadsheet data and processes them through a specialized sub-workflow for optimized token usage and faster responses.

## What This Workflow Does
- **Smart Query Parsing**: Uses Gemini AI to understand natural language questions about your data
- **Efficient Processing**: Routes calculations through a dedicated sub-workflow to minimize token consumption
- **Structured Output**: Automatically identifies the column, aggregation type, and grouping levels from user queries
- **Multiple Aggregation Types**: Supports sum, average, count, count distinct, min, and max operations
- **Flexible Grouping**: Can aggregate data by single or multiple dimensions
- **Token Optimization**: Processes large datasets without overwhelming AI context limits

## Tools Used
- **Google Gemini Chat Model**: Natural language query understanding and response formatting
- **Google Sheets Tool**: Data access and column metadata extraction
- **Execute Workflow**: Sub-workflow processing for data calculations
- **Structured Output Parser**: Converts AI responses to actionable parameters
- **Memory Buffer Window**: Basic conversation context management
- **Switch Node**: Routes to the appropriate aggregation method
- **Summarize Nodes**: Perform various data aggregations

## 📋 MAIN WORKFLOW - Query Parser

### What This Workflow Does
The main workflow receives natural language questions from users and converts them into structured parameters that the sub-workflow can process. It uses Google Gemini AI to understand the intent and extract the necessary information.

### Prerequisites for Main Workflow
- Google Cloud Platform account with Gemini API access
- Google account with access to Google Sheets
- n8n instance (cloud or self-hosted)

### Main Workflow Setup Instructions

**1. Import the Main Workflow**
- Copy the main workflow JSON provided
- In your n8n instance, go to Workflows → Import from JSON
- Paste the JSON and click Import
- Save with the name "Gemini Data Query Parser"

**2. Set Up Google Gemini Connection**
- Go to Google AI Studio and sign in with your Google account
- Go to the Get API Key section
- Create a new API key or use an existing one, then copy it
- Configure in n8n: click on the Google Gemini Chat Model node, click Create New Credential, select Google PaLM API, paste your API key, and save the credential

**3. Set Up Google Sheets Connection for Main Workflow**
- Go to Google Cloud Console
- Create a new project or select an existing one
- Enable the Google Sheets API
- Create OAuth 2.0 Client ID credentials
- In n8n, click on the Get Column Info node, create a Google Sheets OAuth2 API credential, and complete the OAuth flow

**4. Configure Your Data Source**
- Option A: Use Sample Data. The workflow is pre-configured for the Sample Marketing Data sheet; make a copy to your Google Drive.
- Option B: Use Your Own Sheet. Update the Get Column Info node with your Sheet ID, ensure you have a "Columns" sheet for metadata, and update sheet references as needed.

**5. Set Up Workflow Trigger**
- Configure how you want to trigger this workflow (webhook, manual, etc.)
- The workflow will output structured JSON for the sub-workflow

## ⚙️ SUB-WORKFLOW - Data Processor

### What This Workflow Does
The sub-workflow receives structured parameters from the main workflow and performs the actual data calculations. It handles fetching data, routing to appropriate aggregation methods, and formatting results.

### Sub-Workflow Setup Instructions

**1. Import the Sub-Workflow**
- Create a new workflow in n8n
- Copy the sub-workflow JSON (embedded in the Execute Workflow node) and import it as a separate workflow
- Save with the name "Data Processing Sub-Workflow"

**2. Configure Google Sheets Connection for Sub-Workflow**
- Apply the same Google Sheets OAuth2 credential you created for the main workflow
- Update the Get Data node with your Sheet ID
- Ensure it points to your data sheet (e.g., "Data")

**3. Configure Google Gemini for Output Formatting**
- Apply the same Gemini API credential to the Google Gemini Chat Model1 node
- This handles final result formatting

**4. Link Workflows Together**
- In the main workflow, find the Execute Workflow - Summarize Data node
- Update the workflow reference to point to your sub-workflow
- Ensure the sub-workflow is set to accept execution from other workflows

### Sub-Workflow Components
- **When Executed by Another Workflow**: Trigger that receives parameters
- **Get Data**: Fetches all data from Google Sheets
- **Type of Aggregation**: Switch node that routes based on aggregation type
- **Multiple Summarize Nodes**: Handle different aggregation types (sum, avg, count, etc.)
- **Bring All Data Together**: Combines results from different aggregation paths
- **Write into Table Output**: Formats final results using Gemini AI

## Example Usage
Once both workflows are set up, you can ask questions like:

**Overall Metrics:**
- "Show total Spend ($)"
- "Show total Clicks"
- "Show average Conversions"

**Single Dimension:**
- "Show total Spend ($) by Channel"
- "Show total Clicks by Campaign"

**Two Dimensions:**
- "Show total Spend ($) by Channel and Campaign"
- "Show average Clicks by Channel and Campaign"

## Data Flow Between Workflows
- **Main Workflow**: User question → Gemini AI → Structured JSON output
- **Sub-Workflow**: Receives JSON → Fetches data → Performs calculations → Returns formatted table

## Contact Information
For support, customization, or questions about this template:
- **Email**: robert@ynteractive.com
- **LinkedIn**: Robert Breen

Need help implementing these workflows, want to remove limitations, or require custom modifications? Reach out for professional n8n automation services and AI integration support.
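To illustrate the kind of calculation the sub-workflow's Summarize nodes perform, here is a hedged sketch of a grouped-sum aggregation in plain JavaScript. The parameter shape `{column, aggregation, groupBy}` is an assumption about the structured JSON the parser emits; adapt it to the actual output of your Structured Output Parser.

```javascript
// Hypothetical structured parameters produced by the main workflow's parser
const params = { column: "Spend ($)", aggregation: "sum", groupBy: ["Channel"] };

// Sample rows as they might arrive from the Get Data (Google Sheets) node
const rows = [
  { Channel: "Email",  "Spend ($)": 100 },
  { Channel: "Search", "Spend ($)": 250 },
  { Channel: "Email",  "Spend ($)": 50 }
];

// Grouped-sum aggregation, one of the paths behind the Type of Aggregation switch
function groupedSum(rows, column, groupBy) {
  const totals = {};
  for (const row of rows) {
    const key = groupBy.map(g => row[g]).join(" / "); // supports multi-dimension grouping
    totals[key] = (totals[key] || 0) + Number(row[column]);
  }
  return totals;
}

console.log(groupedSum(rows, params.column, params.groupBy));
// { Email: 150, Search: 250 }
```

Running the arithmetic in a sub-workflow like this is what keeps the raw rows out of the model's context, which is the source of the token savings described above.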
by Margo Rey
# Generate and send MadKudu Account Brief into Outreach

This workflow generates an account brief tailored to your company using MadKudu MCP and OpenAI, and syncs it to a custom field in Outreach. It's for sales teams who want to give reps rich account context right inside Outreach, and to draft Outreach emails with the Outreach Revenue Agent based on the MadKudu account brief.

## ✨ Who it's for
- RevOps or GTM teams using MadKudu + Salesforce + Outreach
- Sales teams needing dynamic, AI-generated context for target accounts

## 🔧 How it works
1. **Select Accounts**: Use a Salesforce node to define which accounts to brief. Filter logic can be updated to match ICP or scoring rules (e.g., MadKudu Fit + LTB).
2. **Generate Brief with MadKudu MCP & AI**: MadKudu MCP provides the account brief instructions, researches recent company news online, and provides structured account context from your integrations connected to MadKudu plus external signals (firmographics, past opportunities, active contacts, job openings...). The AI agent (OpenAI model) turns this into a readable account brief.
3. **Send to Outreach**: Match the account in Outreach via domain, then update a custom field (e.g., custom49) with the brief text.

## 📋 How to set up
1. **Connect your Salesforce account**: Used to pull accounts that need a brief.
2. **Set your OpenAI credentials**: Required for the AI Agent to generate the brief.
3. **Create an n8n Variable named madkudu_api_key** to store your MadKudu API key, used for the MadKudu MCP tool. The AI Agent pulls the account brief instructions and all the context necessary to generate the briefs.
4. **Create an OAuth2 API credential** to connect your Outreach account, used to sync the brief to Outreach.
5. **Customize the Salesforce filter**: In the "Get accounts" node, define which accounts should get a brief (e.g., Fit > 90).
6. **Map your Outreach custom field**: Update the JSON body of the request with your actual custom field ID (e.g., custom49).

## 🔑 How to connect Outreach
1. In n8n, add a new OAuth2 API credential and copy the callback URL.
2. Go to the Outreach developer portal and click "Add" to create a new app.
3. In Feature selection, add Outreach API (OAuth).
4. In API Access (OAuth), set the redirect URI to the n8n callback URL.
5. Select the following scopes: accounts.read, accounts.write.
6. Save in Outreach.
7. Enter the Outreach Application ID into the n8n Client ID field and the Outreach Application Secret into the n8n Client Secret field.
8. Save in n8n and connect your Outreach account via OAuth.

## ✅ Requirements
- MadKudu account with access to an API key
- Salesforce OAuth
- Outreach admin permissions to create an app
- OpenAI API key

## 🛠 How to customize the workflow
- **Change the targeting logic**: Edit the Salesforce filter to control which accounts are eligible.
- **Rewrite the prompt**: Tweak the prompt in the AI Agent node to adjust the format, tone, or insights included in the brief.
- **Change the Outreach account field**: Update the Outreach field where the brief is synced if you're using a different custom field (e.g., custom48, custom32, etc.).
- **Use a different trigger**: Swap the manual trigger for a Schedule or Webhook trigger to automate the flow end-to-end.
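For reference, the Outreach API follows the JSON:API convention, so the HTTP Request node's PATCH body for the sync step looks roughly like the sketch below. The account ID and brief text are placeholders, and you should verify the exact attribute name (e.g., custom49) against your own Outreach field mapping.

```javascript
// Build the JSON body for PATCH https://api.outreach.io/api/v2/accounts/{id}
// (JSON:API shape used by Outreach; "custom49" is the example field from this template)
function buildOutreachBody(accountId, brief, customField = "custom49") {
  return {
    data: {
      type: "account",
      id: accountId,
      attributes: { [customField]: brief }
    }
  };
}

const body = buildOutreachBody(12345, "AI-generated account brief text...");
console.log(JSON.stringify(body));
```

Matching the account by domain first (step 3 above) is what supplies the `accountId` used here.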
by Mantaka Mahir
# Automate Google Classroom: Topics, Assignments & Student Tracking

Automate Google Classroom via the Google Classroom API to efficiently manage courses, topics, teachers, students, announcements, and coursework.

## Use Cases
- **Educational Institution Management**: Sync rosters, post weekly announcements, and generate submission reports automatically.
- **Remote Learning Coordination**: Batch-create assignments, track engagement, and auto-notify teachers on new submissions.
- **Training Program Automation**: Automate training modules, manage enrollments, and generate completion/compliance reports.

## Prerequisites
- n8n (cloud or self-hosted)
- Google Cloud Console access for OAuth setup
- Google Classroom API enabled
- **Google Gemini API key** (free) for the agent brain, or swap in any other LLM if preferred

## Setup Instructions

**Step 1: Google Cloud Project**
1. Create a new project in Google Cloud Console.
2. Enable the Google Classroom API.
3. Create OAuth 2.0 Client ID credentials.
4. Add your n8n OAuth callback URL as a redirect URI.
5. Note down the Client ID and Client Secret.

**Step 2: OAuth Setup in n8n**
1. In n8n, open HTTP Request Node → Authentication → Predefined Credential Type.
2. Select Google OAuth2 API.
3. Enter your Client ID and Client Secret.
4. Click Connect my account to complete authorization.
5. Test the connection.

**Step 3: Import & Configure Workflow**
1. Import this workflow template into n8n.
2. Link all Google Classroom nodes to your OAuth credential.
3. Configure the webhook if using external triggers.
4. Test each agent for API connectivity.

**Step 4: Customization**
You can customize each agent's prompt to your liking for optimal results, or copy and modify node code to expand functionality. All operations use HTTP Request nodes, so you can integrate more tools via the Google Classroom API documentation. This workflow provides a strong starting point for deeper automation and integration.

## Features
- **Course Topics**: List, create, update, or delete topics within a course.
- **Teacher & Student Management**: List, retrieve, and manage teachers and students programmatically.
- **Course Posts**: List posts, retrieve details and attachments, and access submission data.
- **Announcements**: List, create, update, or delete announcements across courses.
- **Courses**: List all courses, get detailed information, and view grading periods.
- **Coursework**: List, retrieve, or analyze coursework within any course.

## Notes
Once OAuth and the LLM connection are configured, this workflow automates all Google Classroom operations. Its modular structure lets you activate only what you need, saving API quota and improving performance.
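Since every operation here is a plain HTTP Request node, the URLs follow the Google Classroom REST pattern, where topics, students, teachers, announcements, and coursework are all sub-resources of a course. A small sketch of an endpoint builder (paths from the public API reference; the course ID is a placeholder):

```javascript
// Google Classroom REST base; each feature maps to a sub-resource of a course
const BASE = "https://classroom.googleapis.com/v1";

function classroomUrl(resource, courseId) {
  if (resource === "courses") return `${BASE}/courses`;
  // topics, students, teachers, announcements, courseWork all hang off a course
  return `${BASE}/courses/${encodeURIComponent(courseId)}/${resource}`;
}

console.log(classroomUrl("topics", "12345"));
// https://classroom.googleapis.com/v1/courses/12345/topics
```

GET on these URLs lists the resource, POST creates one, and PATCH/DELETE on `.../{resource}/{id}` update or remove it, which covers the list/create/update/delete operations described in the Features section.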
by iamvaar
Workflow explanation: Watch on YouTube

# Automated Missed Call Recovery with GoHighLevel + Twilio + Gemini

## Prerequisites for the HVAC n8n Workflow
Before setting up the workflow in n8n, ensure you have completed the following foundational steps:

- **Twilio Call Status Webhook**: Set the webhook of Sub-workflow 1 in the Twilio Voice section for "Call Status Changes".
- **GoHighLevel Custom Fields**: Create two custom fields in GoHighLevel (GHL): called phone number and call sid.
- **Twilio API Integration**: Ensure your Twilio API credentials are ready and configured in n8n.
- **GoHighLevel Developer App**: Create a free GoHighLevel Developer App with the following scopes: contacts.readonly, contacts.write, opportunities.readonly, opportunities.write, locations.readonly. Generate the Client ID and Secret within the Developer App, enter them into the n8n GHL OAuth credentials, copy the OAuth Redirect URL from n8n into the app's OAuth redirection settings, and complete the authentication process.
- **GoHighLevel Automation Workflow**: Create a workflow inside GHL that triggers when a "New Appointment is Created" and fires a POST webhook to the URL generated by Sub-workflow 3 in n8n.
- **GoHighLevel Pipeline Setup**: Create a pipeline in GHL named "Missed call to appointment" with the following 3 stages: SMS sent No Reply; Engaged | Appointment Link Sent; BOOKED.
- **Scheduling Link**: Note down your GoHighLevel scheduling link and keep it handy to insert into the Twilio SMS node.

## Workflow Breakdown
This n8n architecture is divided into three distinct sub-workflows. Here is the node-by-node explanation.

### Sub-Workflow 1: Automated Missed Call Follow-Up
**Goal**: Detect a missed call, log it in GoHighLevel, and immediately text the prospect.

- **When Webhook Received** (n8n-nodes-base.webhook): Acts as the entry point. It receives incoming POST call data from your telephony provider (Twilio) whenever a call status changes.
- **Filter Valid Call Statuses** (n8n-nodes-base.if): Evaluates the incoming webhook payload. It only allows the workflow to continue if the CallStatus contains busy, no-answer, or canceled.
- **Prepare Lead Data** (n8n-nodes-base.set): Cleans and maps the incoming JSON data. It extracts the caller's phone number, removes the + sign for clean formatting, grabs the called number and CallSid, and attaches specific tags like missed-call-lead.
- **Create Lead in HighLevel** (n8n-nodes-base.highLevel): Pushes the cleaned data into GHL to create a new contact. It maps the custom fields you created (called phone number and call sid) and assigns the hvac-inbound-missed tag.
- **Create Opportunity in HighLevel** (n8n-nodes-base.highLevel): Creates a pipeline opportunity for the newly generated lead. It names the opportunity dynamically (e.g., "Missed Call.... [Phone].... [Date/Time]").
- **Send SMS via Twilio** (n8n-nodes-base.twilio): Sends the initial outreach text message to the caller (e.g., "Hi, I believe you missed a call with us... Please state your issue directly here").
- **Update Opportunity Status** (n8n-nodes-base.highLevel): Updates the GHL opportunity stage to the first stage in your pipeline ("SMS sent No Reply") to track that the initial outreach has occurred.

### Sub-Workflow 2: AI-Powered SMS Lead Qualification & Booking
**Goal**: Process replies to the initial SMS, use AI to determine if it's a valid HVAC opportunity, and send a booking link.

- **When SMS Received** (n8n-nodes-base.twilioTrigger): Listens for incoming SMS messages on your Twilio number.
- **Check If Lead** (n8n-nodes-base.highLevel): Searches GHL to see if the sender's phone number already exists as a contact.
- **Check Pipeline State** (n8n-nodes-base.highLevel): Looks up the specific opportunity associated with this contact in the "Missed call to appointment" pipeline.
- **Lead Analyzer Agent** (@n8n/n8n-nodes-langchain.agent): The core AI brain of this sub-workflow. It consists of three integrated parts:
  - The Agent: Prompted to act as an HVAC Opportunity Finder. It evaluates the user's SMS context to determine if they need HVAC services and whether it's appropriate to send a booking link.
  - Gemini Chat Model: Uses Google's gemini-3.1-flash-lite-preview model to process the prompt and context.
  - Parse Structured Output: Forces the AI to return a clean JSON response (e.g., {"HVAC_oppurtunity?": "yes"}).
- **If HVAC Opportunity Found** (n8n-nodes-base.if): Checks the parsed JSON output from the AI. If the AI determined the answer is "yes" or "yeah", the workflow proceeds.
- **Send Response SMS** (n8n-nodes-base.twilio): Sends a text message containing your GHL scheduling link to prompt the prospect to book a visit.
- **Update Lead Opportunity** (n8n-nodes-base.highLevel): Moves the GHL opportunity stage forward to "Engaged | Appointment Link Sent".

### Sub-Workflow 3: GoHighLevel Appointment Sync & Pipeline Advancement
**Goal**: Finalize the pipeline sequence once the prospect actually books an appointment through your scheduling link.

- **When Appointment Booked** (n8n-nodes-base.webhook): Receives the payload triggered by the GHL automation workflow you created in the prerequisites (fired when an appointment is booked).
- **Check Lead SMS Origin** (n8n-nodes-base.highLevel): Queries GHL using the phone number from the appointment payload to ensure it matches the correct existing contact record.
- **Check Pipeline State1** (n8n-nodes-base.highLevel): Retrieves the current opportunity linked to this phone number that is sitting in the "Engaged" stage.
- **Update Contact in HighLevel** (n8n-nodes-base.highLevel): Fills in the missing data gaps. Since the initial missed call only gave you a phone number, this node uses the data submitted in the booking form to update the contact's first name, last name, and email address.
- **Update Opportunity in HighLevel** (n8n-nodes-base.highLevel): Moves the opportunity to its final stage: "BOOKED".
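A sketch of what the Filter Valid Call Statuses and Prepare Lead Data steps do to a Twilio status payload, written as it might appear in an n8n Code node. The payload fields mirror Twilio's standard status-callback parameters; the tag names come from this template.

```javascript
// Example Twilio status-callback payload (abbreviated)
const payload = { From: "+15551234567", To: "+15559876543", CallSid: "CA123abc", CallStatus: "no-answer" };

// Mirrors the Filter Valid Call Statuses + Prepare Lead Data nodes
function prepareLead(p) {
  const missedStatuses = ["busy", "no-answer", "canceled"];
  if (!missedStatuses.includes(p.CallStatus)) return null; // not a missed call, stop here
  return {
    phone: p.From.replace("+", ""),  // strip + for clean formatting
    calledPhoneNumber: p.To,         // GHL custom field: called phone number
    callSid: p.CallSid,              // GHL custom field: call sid
    tags: ["missed-call-lead", "hvac-inbound-missed"]
  };
}

console.log(prepareLead(payload).phone); // "15551234567"
```

The returned object is what the Create Lead in HighLevel node then maps onto the contact record and its two custom fields.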
by SpaGreen Creative
# Bulk WhatsApp Campaign Automation with Rapiwa API (Unofficial Integration)

## Who's it for
This n8n workflow lets you send bulk WhatsApp messages using your own number through the Rapiwa API, avoiding the high cost and limitations of the official WhatsApp API. It integrates seamlessly with Google Sheets, where you can manage your contacts and messages with ease. It is ideal for anyone who wants an easy-to-maintain bulk messaging solution using their own personal or business WhatsApp number. This solution is perfect for small businesses, marketers, or teams looking for a cost-effective way to manage WhatsApp communication at scale.

## How it Works / What It Does
1. Reads data from a Google Sheet where the Status column is marked as "pending".
2. Cleans each phone number (removes special characters, spaces, etc.).
3. Verifies whether the number is a valid WhatsApp user using the Rapiwa API.
4. If valid: sends the message via Rapiwa, then updates Status = sent and Verification = verified.
5. If invalid: skips message sending, then updates Status = not sent and Verification = unverified.
6. Waits for a few seconds (rate limiting), then loops to the next item.

The entire process is triggered automatically every 5 minutes.

## How to Set Up
1. **Duplicate the Sample Sheet**: Use this format.
2. **Fill Contacts**: Add columns like WhatsApp No, Name, Message, Image URL, and set Status = pending.
3. **Connect Google Sheets**: Authenticate and link the Google Sheets node inside n8n.
4. **Subscribe to Rapiwa**: Go to Rapiwa.com and get your API key.
5. **Paste API Key**: Use the HTTP Bearer token credential in n8n.
6. **Activate the Workflow**: Let n8n take care of the automation.

## Requirements
- Google Sheets API credentials
- Configured Google Sheet (template linked above)
- WhatsApp (personal or business)
- n8n instance with credentials set up

## How to Customize the Workflow
- **Add delay between messages**: Use the Wait node to introduce pauses (e.g., 5–10 seconds).
- **Change message format**: Modify the HTTP Request node to send media or templates.
- **Personalize content**: Include dynamic fields like Name, Image URL, etc.
- **Error handling**: Add IF or SET nodes to capture failed attempts, retry, or log errors.

## Workflow Highlights
- **Triggered every 5 minutes** using the Schedule Trigger node.
- **Filters messages** with Status = pending.
- **Cleans numbers** and **verifies WhatsApp existence** before sending.
- **Sends WhatsApp messages** via Rapiwa (unofficial API).
- **Updates Google Sheets** to mark Status = sent or not sent and Verification = verified/unverified.
- **Wait node** prevents rapid-fire sending that could lead to being flagged by WhatsApp.

## Setup in n8n

### 1. Connect Google Sheets
- Add a Google Sheets node
- Authenticate using your Google account
- Select the document and worksheet
- Use the filter: Status = pending

### 2. Loop Through Rows
- Use SplitInBatches or a Code node to process rows in small chunks (e.g., 5 rows)
- Add a Wait node to delay 5 seconds between messages

### 3. Send Message via HTTP Node
The "Send Message Using Rapiwa" node makes an HTTP POST request to the Rapiwa API endpoint https://app.rapiwa.com/api/send-message, using Bearer token authentication with your Rapiwa API key. When this node runs, it sends a WhatsApp message to the specified number with the given text and optional image. The Rapiwa API handles message delivery using your own WhatsApp number connected to their service.

**JSON Body**:

    {
      "number": "{{ $json['WhatsApp No'] }}",
      "message": "{{ $json['Message'] }}"
    }

## Sample Google Sheet Structure
A Google Sheet formatted like this sample:

| SL | WhatsApp No   | Name                | Message              | Image URL                                            | Verification | Status |
|----|---------------|---------------------|----------------------|------------------------------------------------------|--------------|--------|
| 1  | 8801322827799 | SpaGreen Creative   | This is Test Message | https://spagreen.sgp1.cdn.digitaloceanspaces.com/... | verified     | sent   |
| 2  | 8801725402187 | Abdul Mannan Zinnat | This is Test Message | https://spagreen.sgp1.cdn.digitaloceanspaces.com/... | verified     | sent   |

## Tips
- Modify the Limit node to increase/decrease messages per cycle.
- Adjust the Wait node to control how fast messages are sent (e.g., 5–10 s delay).
- Make sure WhatsApp numbers are properly formatted (e.g., 8801XXXXXXXXX, no +, no spaces).
- Store your Rapiwa API key securely using n8n credentials.
- Use publicly accessible image URLs if sending images.
- Always mark processed messages as "sent" to avoid duplicates.
- Use the Error workflow in n8n to catch failed sends for retry.
- Test with a small batch before going full-scale.
- Schedule the Trigger node for every 5 minutes to keep the automation running.

## Useful Links
- **Dashboard**: https://app.rapiwa.com
- **Official Website**: https://rapiwa.com
- **Documentation**: https://docs.rapiwa.com

## Support & Community
Need help setting up or customizing the workflow? Reach out here:
- WhatsApp: Chat with Support
- Discord: Join SpaGreen Server
- Facebook Group: SpaGreen Community
- Website: SpaGreen Creative
- Envato: SpaGreen Portfolio
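A hedged sketch of the number-cleaning step ("removes special characters, spaces, etc.") as it might look in an n8n Code node; the template's exact rules may differ, so treat this as one reasonable implementation.

```javascript
// Normalize a sheet value like "+880 1322-827799" to Rapiwa's expected "8801322827799"
function cleanWhatsAppNumber(raw) {
  return String(raw).replace(/\D/g, ""); // keep digits only: drops +, spaces, dashes
}

console.log(cleanWhatsAppNumber("+880 1322-827799")); // "8801322827799"
```

Cleaning before the verification call matters because Rapiwa's validity check and the send endpoint both expect the bare digits format shown in the sample sheet.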
by Robert Breen
This n8n workflow template creates an intelligent data analysis system that converts natural language questions into Google Sheets SQL queries using OpenAI's GPT-4o model. The system generates proper Google Sheets query URLs and executes them via HTTP requests for efficient data retrieval.

## What This Workflow Does
- **Natural Language to SQL**: Converts user questions into Google Sheets SQL syntax
- **Direct HTTP Queries**: Bypasses API limits by using Google Sheets' built-in query functionality
- **Column Letter Mapping**: Automatically maps column names to their corresponding letters (A, B, C, etc.)
- **Structured Query Generation**: Outputs properly formatted Google Sheets query URLs
- **Real-time Data Access**: Retrieves live data directly from Google Sheets
- **Memory Management**: Maintains conversation context for follow-up questions

## Tools Used
- **OpenAI Chat Model (GPT-4o)**: SQL query generation and natural language understanding
- **OpenAI Chat Model (GPT-4.1 Mini)**: Result formatting and table output
- **Google Sheets Tool**: Column metadata extraction and schema understanding
- **HTTP Request Node**: Direct data retrieval via the Google Sheets query API
- **Structured Output Parser**: Formats AI responses into executable queries
- **Memory Buffer Window**: Conversation history management
- **Chat Trigger**: Webhook-based conversation interface

## Step-by-Step Setup Instructions

### 1. Prerequisites
Before starting, ensure you have:
- An n8n instance (cloud or self-hosted)
- An OpenAI account with API access and billing set up
- A Google account with access to Google Sheets
- A target Google Sheet that is publicly accessible or shareable via link

### 2. Import the Workflow
- Copy the workflow JSON provided
- In your n8n instance, go to Workflows → Import from JSON
- Paste the JSON and click Import
- Save with a descriptive name like "Google Sheets SQL Query Generator"

### 3. Set Up OpenAI Connections
**Get an API key:**
- Go to OpenAI Platform and sign in or create an account
- Navigate to the API Keys section
- Click Create new secret key and copy the generated key
- Important: add billing information and credits to your OpenAI account

**Configure both OpenAI nodes:**
- OpenAI Chat Model1 (GPT-4o): click on the node, click Create New Credential, select OpenAI API, paste your API key, and save the credential
- OpenAI Chat Model2 (GPT-4.1 Mini): apply the same OpenAI API credential; this node handles result formatting

### 4. Set Up Google Sheets Connection
**Create OAuth2 credentials:**
- Go to Google Cloud Console
- Create a new project or select an existing one
- Enable the Google Sheets API
- Go to Credentials → Create Credentials → OAuth 2.0 Client IDs
- Set the application type to Web Application
- Add authorized redirect URIs (get these from the n8n credentials setup)
- Copy the Client ID and Client Secret

**Configure in n8n:**
- Click on the Get Column Info2 node
- Click Create New Credential and select Google Sheets OAuth2 API
- Enter your Client ID and Client Secret
- Complete the OAuth flow by clicking Connect my account and authorizing the required permissions

### 5. Prepare Your Google Sheet
Option A: Use the sample data sheet
- Access the pre-configured sheet: Sample Marketing Data
- Make a copy to your Google Drive
- Critical: set sharing to "Anyone with the link can view" so HTTP access works
- Copy the Sheet ID from the URL
- Update the Get Column Info2 node with your Sheet ID and column metadata sheet

### 6. Configure Sheet References
**Get Column Info2 node:**
- Set Document ID to your Google Sheet ID
- Set Sheet Name to your columns metadata sheet (e.g., "Columns")
- This provides the AI with column letter mappings

**HTTP Request node:**
- No configuration needed; it uses dynamic URLs from the AI agent
- Ensure your sheet has proper sharing permissions

### 7. Update the System Prompt (If Using a Custom Sheet)
If using your own Google Sheet, update the system prompt in the AI Agent3 node:
- Replace the URL in the system message with your Google Sheet URL
- Update the GID (sheet ID) to match your data sheet
- Keep the same query structure format

## Contact Information
For support, customization, or questions about this template:
- **Email**: robert@ynteractive.com
- **LinkedIn**: Robert Breen

Need help implementing this workflow, want to add security features, or require custom modifications? Reach out for professional n8n automation services and AI integration support.
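The "Google Sheets query URL" the agent emits is Google's Visualization (gviz) endpoint, which accepts a SQL-like `tq` parameter keyed by column letters; this is why the column letter mapping matters. A hedged sketch of how such a URL is assembled (the sheet ID and GID are placeholders):

```javascript
// Build a Google Visualization query URL for a link-shared sheet.
// Example query: total of column B grouped by column A.
function buildGvizUrl(sheetId, gid, tq) {
  return `https://docs.google.com/spreadsheets/d/${sheetId}/gviz/tq` +
         `?tqx=out:csv&gid=${gid}&tq=${encodeURIComponent(tq)}`;
}

const url = buildGvizUrl("SHEET_ID_HERE", "0", "SELECT A, SUM(B) GROUP BY A");
console.log(url);
```

Because this endpoint is served by Google Docs rather than the Sheets API, it works with a plain HTTP Request node and no API quota, which is the "bypasses API limits" point above; it does require the sheet to be link-viewable.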
by Davide
🤖🎵 This workflow automates the creation, storage, and cataloging of AI-generated music using the Eleven Music API, Google Sheets, and Google Drive.

## Key Advantages

✅ **Fully Automated Music Generation Pipeline**
Once started, the workflow automatically reads track parameters, generates music via the API, uploads the file, and updates your spreadsheet. No manual steps are needed after initialization.

✅ **Centralized Track Management**
A single Google Sheet acts as your project control center, letting you organize prompts, durations, and generated URLs. This avoids losing track of files and creates a ready-to-share catalog.

✅ **Seamless Integration with Google Services**
The workflow reads instructions from Google Sheets, saves the MP3 to Google Drive, and updates the same Sheet with the final link. This ensures everything stays synchronized and easy to access.

✅ **Scalable and Reliable Processing**
The loop-with-delay mechanism processes tracks sequentially, prevents API overload, and ensures stable execution. This is especially helpful when generating multiple long tracks.

✅ **Easy Customization**
Because the prompts and durations come from Google Sheets, you can edit prompts at any time, add more tracks without modifying the workflow, and clone the Sheet for different projects.

✅ **Ideal for Creators and Businesses**
This workflow is perfect for content creators generating background music, agencies designing custom soundtracks, businesses needing AI-generated audio assets, and automated production pipelines.

## How It Works
1. The workflow starts manually via the "Execute workflow" trigger.
2. It retrieves a list of music track requests from a Google Sheets spreadsheet containing track titles, text prompts, and duration specifications.
3. The system processes each track request individually through a batch loop.
4. For each track, it sends the text prompt and duration to the ElevenLabs Music API to generate studio-quality music.
5. The generated MP3 file (44100 Hz, 128 kbps) is automatically uploaded to a designated Google Drive folder.
6. Once uploaded, the workflow updates the original Google Sheet with the direct URL to the generated music file.
7. A 1-minute wait period between each track generation prevents API rate limiting.
8. The process continues until all track requests in the spreadsheet have been processed.

## Set Up Steps

**Prerequisites:**
- ElevenLabs paid account with Music API access enabled
- Google Sheets spreadsheet with these columns: TITLE, PROMPT, DURATION (ms), URL
- Google Drive folder for storing generated music files

**Configuration steps:**
1. **ElevenLabs API setup**: Enable Music Generation access in your ElevenLabs account, generate an API key from the ElevenLabs developer dashboard, and configure HTTP Header authentication in n8n with the name "xi-api-key" and your API key as the value.
2. **Google Sheets preparation**: Create or clone the music tracking spreadsheet with the required columns; fill in track titles, detailed text prompts, and durations in milliseconds (10,000-300,000 ms); configure Google Sheets OAuth credentials in n8n; and update the document ID in the Google Sheets nodes.
3. **Google Drive configuration**: Create a dedicated folder for music uploads, set up Google Drive OAuth credentials in n8n, and update the folder ID in the upload node.
4. **Workflow activation**: Ensure all API credentials are properly configured, test with a single track entry in the spreadsheet, verify that music generation, upload, and the spreadsheet update work correctly, then execute the workflow to process all pending track requests.

The workflow automatically names files with timestamp prefixes (song_yyyyMMdd) and handles the complete lifecycle from prompt to downloadable music file.

👉 Subscribe to my new YouTube channel. Here I'll share videos and Shorts with practical tutorials and FREE templates for n8n.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
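As a sketch of what the HTTP Request node sends per track, here is a hedged example of the request options. The endpoint path and body field names are assumptions based on ElevenLabs' music generation documentation and should be verified against the current docs; the "xi-api-key" header name comes from this template's setup step.

```javascript
// Build the HTTP request the workflow issues for one track.
// DURATION (ms) from the sheet must be between 10,000 and 300,000.
function buildMusicRequest(apiKey, prompt, durationMs) {
  if (durationMs < 10000 || durationMs > 300000) {
    throw new Error("DURATION (ms) must be between 10,000 and 300,000");
  }
  return {
    method: "POST",
    url: "https://api.elevenlabs.io/v1/music", // verify against current ElevenLabs docs
    headers: { "xi-api-key": apiKey, "Content-Type": "application/json" },
    body: JSON.stringify({ prompt, music_length_ms: durationMs })
  };
}

const req = buildMusicRequest("YOUR_API_KEY", "Calm lo-fi track with soft piano", 60000);
console.log(req.url);
```

Validating the duration before the call mirrors the 10,000-300,000 ms constraint on the DURATION (ms) column, so malformed rows fail fast instead of burning an API call.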
by Sayone Technologies
⭐ Google Review Sentiment Analysis & Slack Notification Workflow

This workflow automates the process of collecting Google Business Profile reviews 🏪, analyzing customer sentiment with Google Gemini 🤖✨, and sending structured reports to Slack 💬.

🔑 Key Advantages
- 📥 Fetches Google Business Profile reviews for a given business and time period
- 🧠 Runs sentiment analysis using Gemini AI
- 📊 Consolidates comments, ratings, and trends into a JSON-based summary
- 🧩 Restructures results into Slack Block Kit format for easy readability
- 🚀 Sends automated sentiment reports directly to a Slack channel

⚙️ Set Up Essentials You'll Need
- 🔑 Google Business Profile API access with project approval
- ✅ The Google Business Profile API service enabled
- 🔐 Gemini API credentials
- 💬 A Slack workspace and channel for receiving reports

🚀 How to Get Started
1. 🔧 Configure your Google Business Profile API and enable access
2. 👤 Set the owner name and 📍 location to fetch reviews
3. ⏳ Define the review time period using the Set Time Period node
4. 🔗 Connect your Slack account and select a channel for notifications
5. 🕒 Deploy and let the workflow run on schedule for automated insights
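The "restructure into Slack Block Kit" step above can be sketched as a small Code-node function. The summary shape (averageRating, totalReviews, sentiment, highlights) is a hypothetical example of what the Gemini step might return, not this workflow's exact schema; the block structure itself follows Slack's Block Kit format.

```javascript
// Convert a sentiment-summary object into a Slack Block Kit payload.
// The input field names are illustrative stand-ins for the Gemini output.
function toSlackBlocks(summary) {
  const blocks = [
    {
      type: 'header',
      text: { type: 'plain_text', text: '⭐ Review Sentiment Report' },
    },
    {
      type: 'section',
      text: {
        type: 'mrkdwn',
        text:
          `*Average rating:* ${summary.averageRating}\n` +
          `*Reviews analyzed:* ${summary.totalReviews}\n` +
          `*Overall sentiment:* ${summary.sentiment}`,
      },
    },
  ];
  // Append notable review highlights as a bulleted section, if present.
  if (summary.highlights && summary.highlights.length) {
    blocks.push({
      type: 'section',
      text: {
        type: 'mrkdwn',
        text: summary.highlights.map((h) => `• ${h}`).join('\n'),
      },
    });
  }
  return { blocks };
}
```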
by Yanagi Chinatsu
Who it's for

This workflow is perfect for space enthusiasts, community managers, and content creators who want to automatically share stunning, curated space imagery with their Slack communities. It's ideal for teams that enjoy a daily dose of scientific inspiration and visually engaging content without any manual effort.

What it does

This workflow automates the creation and posting of a daily space image gallery to Slack. Every day at a scheduled time, it fetches three distinct images from NASA's public APIs: one from the Mars Rover, one from the EPIC satellite observing Earth, and one from the extensive Image Library. For each image, the workflow uses an AI model to generate a unique and poetic caption, turning a simple image post into a more engaging piece of content. Finally, it combines the three images and their AI-generated captions into a single, well-formatted message and posts it to your designated Slack channel. As a bonus, it also saves a copy of the message to a Google Drive folder for archival purposes.

How to set up

1. Configure Variables: In the Workflow Configuration node, enter your NASA API Key in the nasaApiKey field and specify your target Slack channel name in the slackChannel field (e.g., general).
2. Connect Credentials: Add your credentials for the OpenAI Chat Model, Post to Slack, and Google Drive nodes.
3. Activate Workflow: Once your credentials and variables are set, save and activate the workflow.

Requirements

- A NASA API Key (free to generate)
- An OpenAI account and API key
- A Slack workspace with permission to post messages
- A Google Drive account

How to customize the workflow

- Adjust the Schedule: Change the trigger time or frequency in the Daily 10:00 - Start Poll node.
- Change AI Tone: Modify the system message in the AI Agent node to alter the style, tone, or language of the generated captions.
- Swap Image Sources: Update the URLs in the Fetch nodes to pull images from different NASA APIs or use different search queries.
- Add More Channels: Duplicate the Post to Slack node and modify it to send notifications to other services like Discord or Telegram.
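The three NASA fetches can be sketched as URL builders like the ones below. These are based on NASA's public API endpoints (api.nasa.gov and images-api.nasa.gov); the exact URLs and parameters your Fetch nodes use may differ, so treat them as illustrative.

```javascript
// Build the three NASA API request URLs used by the daily gallery.
// Endpoints follow NASA's public API docs; adjust to match your nodes.
function nasaUrls(apiKey, date /* 'YYYY-MM-DD' */, query) {
  return {
    marsRover:
      'https://api.nasa.gov/mars-photos/api/v1/rovers/curiosity/photos' +
      `?earth_date=${date}&api_key=${apiKey}`,
    epic: `https://api.nasa.gov/EPIC/api/natural/date/${date}?api_key=${apiKey}`,
    imageLibrary:
      `https://images-api.nasa.gov/search?q=${encodeURIComponent(query)}&media_type=image`,
  };
}
```

Keeping the key and date in one place makes it easy to swap image sources: only the URL templates change, not the downstream caption or Slack steps.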
by Adil Khan
This workflow bridges the gap between anonymous website traffic and on-chain wallet activity. It captures wallet connections via a webhook, enriches the data with real-time USD balances from the Zerion API, and syncs the results to Google Analytics 4, BigQuery, and Discord for immediate action. This directly helps Web3 marketing and growth teams identify high-value "whales" the moment they connect to your dApp, enabling real-time monitoring and advanced attribution analysis.

Video tutorial: https://youtu.be/2_wuTRzRpkg

How it works

1. Webhook Trigger: Receives the wallet address, GA Client ID, and Session ID from your website via GTM.
2. Zerion API Integration: Queries the real-time USD balance and per-chain distribution for the connected wallet.
3. Whale Filtering (Switch): Filters wallets against a USD threshold (e.g., >$50) to trigger high-priority alerts.
4. Dynamic Discord Alerts: Sends a formatted message to Discord with the total balance rounded to two decimals and a dynamic breakdown of assets across all active chains (Base, Ethereum, etc.).
5. GA4 Push: Sends wallet_usd_balance as a custom metric to GA4 via the Measurement Protocol to maintain session continuity.
6. BigQuery Archive: Records the wallet address, hashed ID, and USD balance in a secure table for SQL joins with raw GA4 data.

Prerequisites

- Zerion API Key: Required for fetching real-time balance and chain data.
- Discord Bot Token: Required to send automated whale alerts to your team server.
- Google Cloud Project: A project with BigQuery enabled and a JSON Service Account key for secure data insertion.
- GA4 Measurement Protocol API Secret: Required to push custom metrics back into active GA4 sessions.
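The Switch-node filtering and Discord formatting steps can be sketched as below. The $50 threshold matches the example in the description; the per-chain balances object is a simplified stand-in for Zerion's actual response shape.

```javascript
// Whale threshold from the workflow description (e.g., >$50).
const WHALE_THRESHOLD_USD = 50;

// Switch-node logic: route high-value wallets to the alert branch.
function isWhale(totalUsd, threshold = WHALE_THRESHOLD_USD) {
  return totalUsd > threshold;
}

// Discord alert with a 2-decimal total and a per-chain breakdown.
// `chains` is a simplified stand-in, e.g. { base: 120.5, ethereum: 30.25 }.
function formatDiscordAlert(wallet, totalUsd, chains) {
  const breakdown = Object.entries(chains)
    .filter(([, usd]) => usd > 0) // only active chains
    .map(([chain, usd]) => `${chain}: $${usd.toFixed(2)}`)
    .join(' | ');
  return `🐋 Whale connected: ${wallet} - total $${totalUsd.toFixed(2)} (${breakdown})`;
}
```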
by InfyOm Technologies
✅ What problem does this workflow solve?

Missed return pickups create logistics delays, extra follow-ups, and unhappy customers for e-commerce teams. This workflow automates return pickup reminders, ensuring customers are notified on the day of pickup via WhatsApp messages and automated voice calls, without any manual effort.

⚙️ What does this workflow do?

- Runs automatically on a daily schedule
- Reads return pickup data from Google Sheets
- Identifies customers with 📅 pickup date = today and ⏳ status = Pending
- Sends personalized WhatsApp reminders
- Places automated voice call reminders when required
- Updates reminder status in Google Sheets for clear tracking

🧠 How It Works – Step by Step

1. ⏰ Scheduled Trigger: The workflow starts at a fixed time every day (e.g., 9–10 AM) using a Schedule Trigger.
2. 📄 Read Pickup Data from Google Sheets: It fetches rows where **Pickup Date** = today and **Status** = Pending, so only relevant pickups are processed.
3. 🔁 Loop Through Pickups: Each matching row is processed individually to send customer-specific reminders.
4. ✍️ Generate Personalized Messages: Using a Code node, the workflow creates 📲 a WhatsApp text message and 📞 a voice message script. Messages include the customer name, product name, pickup address, return reason, and a pickup timing reminder.
5. 📲 Send WhatsApp Reminder: A personalized WhatsApp message is sent via Twilio, reminding the customer to keep the package ready.
6. 📞 Place Voice Call Reminder: If required, the workflow places an automated voice call using Twilio and reads out a clear pickup reminder using text-to-speech.
7. ✅ Update Pickup Status: Once notifications are sent, the workflow sets the Status column to "Reminder Sent" so the same pickup is not notified again.

📊 Sample Google Sheet Columns

| Order ID | Customer Name | Phone Number | Product | Pickup Date | Address | Return Reason | Status |
|----------|---------------|--------------|---------|-------------|---------|---------------|--------|

🔧 Integrations Used

- **Google Sheets** – Pickup data source and tracking
- **Twilio WhatsApp API** – Message delivery
- **Twilio Voice API** – Automated call reminders
- **n8n Schedule + Logic Nodes** – Automation orchestration

👤 Who can use this?

Perfect for:
- 🛒 E-commerce brands
- 📦 Reverse logistics teams
- 🚚 Delivery & pickup operations
- 🧑‍💼 Customer support teams

It also works well for service visits, deliveries, appointments, and field operations.

💡 Key Benefits

- ✅ Fewer missed pickups
- ✅ Improved customer compliance
- ✅ Reduced manual follow-ups
- ✅ Clear tracking in Google Sheets
- ✅ Scalable and fully automated

🚀 Ready to Use?

Just connect:
- ✅ Google Sheets with pickup data
- ✅ Twilio credentials (WhatsApp + Voice)
- ✅ Schedule trigger time
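The filtering and message-generation steps above can be sketched as a Code-node snippet. Column names match the sample sheet; the YYYY-MM-DD date format and the reminder wording are assumptions for illustration.

```javascript
// Step 2: keep only rows due today with a Pending status.
// Assumes Pickup Date is stored as 'YYYY-MM-DD'.
function isDueToday(row, today = new Date()) {
  const y = today.getFullYear();
  const m = String(today.getMonth() + 1).padStart(2, '0');
  const d = String(today.getDate()).padStart(2, '0');
  return row['Pickup Date'] === `${y}-${m}-${d}` && row.Status === 'Pending';
}

// Step 4: build the WhatsApp text and voice script from one sheet row.
// The wording here is illustrative, not the template's exact copy.
function buildMessages(row) {
  const whatsapp =
    `Hi ${row['Customer Name']}, your return pickup for "${row.Product}" ` +
    `is scheduled today at ${row.Address}. ` +
    `Please keep the package ready. Reason: ${row['Return Reason']}.`;
  const voiceScript =
    `Hello ${row['Customer Name']}. This is a reminder that your return ` +
    `pickup for ${row.Product} is scheduled for today. ` +
    `Please keep the package ready at your address.`;
  return { whatsapp, voiceScript };
}
```

Generating both texts in one helper keeps the WhatsApp and voice branches consistent: if a field name in the sheet changes, only this function needs updating.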