by Pawan
## Who's it for?

This template is perfect for educational institutions, coaching centers (like UPSC, GMAT, or specialized technical training), internal corporate knowledge bases, and SaaS companies that need to provide instant, accurate, source-grounded answers based on proprietary documents. It's designed for users who want to leverage Google Gemini's powerful reasoning while ensuring its answers are strictly factual and based only on their verified knowledge repository.

## How it works / What it does

This workflow establishes a Retrieval-Augmented Generation (RAG) pipeline to build a secure, fact-based AI Agent. It operates in two main phases:

1. **Knowledge Ingestion:** When a new document (e.g., a PDF, lecture notes, or a policy manual) is uploaded via a form or Google Drive, the Embeddings Google Gemini node converts the content into numerical vectors. These vectors are then stored in a secure MongoDB Atlas Vector Store, creating a private knowledge base.
2. **AI Query & Response:** A user asks a question via Telegram. The AI Agent uses the question to perform a semantic search on the MongoDB Vector Store, retrieving the most relevant, source-specific passages. It then feeds this retrieved context to the Google Gemini Chat Model to generate a precise, factual answer, which is sent back to the user on Telegram.

This process ensures the agent never "hallucinates" or falls back on general internet knowledge, making the responses accurate and trustworthy.

## Requirements

To use this template, you will need the following accounts and credentials:

- **n8n account**
- **Google Gemini API key:** for generating vector embeddings and powering the AI Agent.
- **MongoDB Atlas cluster:** a free-tier cluster is sufficient, configured with a Vector Search index.
- **Telegram bot:** a bot created via BotFather and a chat ID where the bot will listen for and send messages.
- **Google Drive credentials** (if using the Google Drive ingestion path).

## How to set up

1. **Set up MongoDB Atlas:** Create a free cluster and a database. Create a Vector Search index on your collection to enable efficient searching.
2. **Configure the ingestion path:** Set up the Webhook trigger for "On form submission" or connect your Google Drive credentials. Configure the Embeddings Google Gemini node with your API key. Connect the MongoDB Atlas Vector Store node with your database credentials, collection name, and index name.
3. **Configure the chat path:** Set up the Telegram Trigger with your bot token to listen for incoming messages. Configure the Google Gemini Chat Model with your API key. Connect the MongoDB Atlas Vector Store 1 node as a tool within the AI Agent, and make sure it points to the same vector store as the ingestion path.
4. **Final step:** Configure the Send a text message node with your **Telegram Bot Token** and the **Chat ID**.

## How to customize the workflow

- **Change the knowledge source:** Replace the Google Drive nodes with nodes for Notion, SharePoint, Zendesk, or another document source.
- **Change the chat platform:** Replace the Telegram nodes with a Slack, Discord, or WhatsApp Cloud trigger and response node.
- **Refine the agent's persona:** Open the AI Agent node and edit the System Instruction to give the bot a specific role (e.g., "You are a senior UPSC coach. Answer questions politely and cite sources.").

## 💡 Example Use Case

A UPSC/JEE/NEET coaching center uploads NCERT summaries and previous-year notes to Google Drive. Students ask questions in the Telegram group — the bot instantly replies with contextually accurate answers from the uploaded materials. The same agent can automatically generate daily quizzes or concise notes from this curated content.
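The Vector Search index from step 1 of the setup can be sketched as follows. This is a minimal sketch, assuming the default 768-dimensional Gemini embedding model (text-embedding-004) and a vector field named `embedding`; the index name, field path, and dimensions must match whatever your own ingestion path actually writes.

```javascript
// Atlas Vector Search index definition (sketch). Create it under
// "Search Indexes" on your collection in the Atlas UI, or pass the
// same JSON to mongosh's db.collection.createSearchIndex().
const indexDefinition = {
  name: "vector_index",        // must match the index name in the n8n node
  type: "vectorSearch",
  definition: {
    fields: [
      {
        type: "vector",
        path: "embedding",     // field the ingestion path writes vectors to
        numDimensions: 768,    // Gemini text-embedding-004 output size
        similarity: "cosine"
      }
    ]
  }
};

console.log(JSON.stringify(indexDefinition, null, 2));
```

The free tier supports Vector Search, so this works on an M0 cluster.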
by Rahul Joshi
## 📊 Description

Streamline sales prioritization by automatically identifying, scoring, and routing high-value leads from GoHighLevel CRM to your sales team. This workflow scores contacts daily, flags top prospects, alerts sales reps in Slack, logs data to Google Sheets, and schedules instant follow-ups in Google Calendar — ensuring no valuable lead slips through the cracks. 🚀📈

## What This Template Does

- Triggers daily at 8:00 AM to fetch all contacts from GoHighLevel CRM. ⏰
- Processes lead data and extracts key details from custom fields. 🧩
- Calculates lead scores using your predefined CRM field mappings. 🔢
- Filters out incomplete or invalid contacts to ensure clean data flow. 🧼
- Identifies high-value leads with a score above 80 for immediate attention. 🎯
- Sends real-time Slack alerts to sales teams with contact and lead score details. 💬
- Logs high-priority leads into a dedicated Google Sheet for tracking and analytics. 📊
- Creates automatic Google Calendar follow-up events within 1 hour of detection. 📅

## Key Benefits

✅ Automatically surfaces top leads for faster follow-up
✅ Keeps sales teams aligned through instant Slack alerts
✅ Eliminates manual data review and prioritization
✅ Centralizes performance tracking via Google Sheets
✅ Ensures consistent follow-up with Google Calendar scheduling
✅ Fully customizable lead score threshold and timing

## Features

- Daily scheduled trigger (8:00 AM)
- GoHighLevel CRM integration for contact retrieval
- Smart lead scoring via custom field mapping
- Conditional filtering for high-value leads
- Slack alert system for real-time engagement
- Google Sheets logging for transparency and analytics
- Auto-created Google Calendar events for follow-ups

## Requirements

- GoHighLevel API credentials with contact read permissions
- Slack bot token with chat:write access
- Google Sheets OAuth2 credentials
- Google Calendar OAuth2 credentials
- Defined custom fields for Lead Score and Assigned Representative in GoHighLevel

## Target Audience

- Sales and business development teams tracking high-value leads
- Marketing teams optimizing lead qualification and follow-up
- Agencies using GoHighLevel for CRM and lead management
- Operations teams centralizing sales activity and analytics

## Step-by-Step Setup Instructions

1. Connect your GoHighLevel OAuth2 credentials and ensure contact API access.
2. Replace the placeholder custom field IDs (Lead Score & Assigned Rep) in the Code node.
3. Add your Slack channel ID for team notifications.
4. Connect your Google Sheets document and replace its Sheet ID in the workflow.
5. Link Google Calendar for automatic follow-up event creation.
6. Adjust the lead score threshold (default: 80) if needed.
7. Run a manual test to verify data flow, then enable the daily trigger for automation.
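The Code-node logic described above (extract custom fields, filter invalid contacts, keep scores above the threshold) can be sketched as below. The field IDs are placeholders you must replace with your own GoHighLevel custom field IDs, and the contact shape is a simplified assumption, not the exact GoHighLevel payload.

```javascript
// Sketch of the scoring/filter step. Replace the two field IDs with
// the real custom field IDs from your GoHighLevel account (setup step 2).
const LEAD_SCORE_FIELD_ID = "abc123";   // hypothetical placeholder
const ASSIGNED_REP_FIELD_ID = "def456"; // hypothetical placeholder
const THRESHOLD = 80;                   // default from this template

function extractHighValueLeads(contacts) {
  return contacts
    .map((c) => {
      const fields = c.customFields || [];
      const score = Number(
        (fields.find((f) => f.id === LEAD_SCORE_FIELD_ID) || {}).value
      );
      const rep = (fields.find((f) => f.id === ASSIGNED_REP_FIELD_ID) || {}).value;
      return { name: c.name, email: c.email, score, rep };
    })
    // drop incomplete or invalid contacts
    .filter((l) => l.email && Number.isFinite(l.score))
    // keep only high-value leads
    .filter((l) => l.score > THRESHOLD);
}

const sample = [
  { name: "Ada", email: "ada@example.com", customFields: [{ id: "abc123", value: "92" }] },
  { name: "Bob", email: "", customFields: [{ id: "abc123", value: "95" }] },
  { name: "Cy",  email: "cy@example.com", customFields: [{ id: "abc123", value: "40" }] },
];
console.log(extractHighValueLeads(sample)); // only Ada passes both filters
```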
by Aryan Shinde
# Instagram Reel Downloader & Logger

Automate Instagram Reel downloads, storage, and activity logging.

## What does this workflow do?

- Handles incoming webhook requests (ideal for Instagram/Facebook API triggers).
- Validates the webhook via challenge-response and a custom verify token.
- Checks for messages from yourself (filtering out automated/self-triggered runs).
- Downloads Instagram Reels from URLs posted to the webhook.
- Uploads the reel to Google Drive and retrieves the download URL.
- Logs reel details (status, URL, and timestamp) to a Google Sheet for record-keeping.
- Notifies you on Telegram with the download details and Google Drive link.

## How does it work?

1. **Webhook:** Listens for new messages/events (custom webhook endpoint for Meta).
2. **Validation:** Confirms the webhook subscribe/challenge and verify token from the Meta API.
3. **Sender check:** Ignores messages unless they match your configured sender/recipient.
4. **Download reel:** Fetches the reel/attachment from Instagram using the received URLs.
5. **Timestamp gen:** Adds a precise timestamp and ISO-based unique ID to the activity log.
6. **Upload to Drive:** Saves the downloaded reel in a preset Google Drive folder.
7. **Log to Sheet:** Updates a Google Sheet with the reel's status, URL, and timestamp.
8. **Telegram alert:** Instantly notifies you when a new reel is downloaded and logged.

## What do I need to make this work?

- A registered webhook endpoint (from your Meta/Instagram app configuration).
- A Google Drive and Google Sheets account (OAuth2 connected to n8n).
- A Telegram bot and chat ID set up to receive download completion messages.
- The verify_token in your webhook event source must match the template's ('youtube-automation-n8n-token' by default).
- Your Drive/Sheet/bot credentials updated to match your n8n instance's environment.

## Why use this?

- Fully automates the collection and archival of Instagram Reels.
- Centralizes content download, backup, and activity records for your automation flows.
- Provides instant monitoring and archival of each event.

## Setup Tips

- Make sure your webhook path and Meta app configuration match (/n8n-template-insta-webhook).
- Double-check the Google credentials and the sheet's tab IDs/names.
- Replace the Telegram and Google connection credentials with your own, securely.
- Use this as a foundation for any Instagram/Facebook-based automations in n8n, and customize as your automation stack evolves!

This template saves time, automates digital content management, and notifies users in real time.
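The validation step above implements Meta's standard webhook verification handshake: Meta sends a GET request with `hub.mode`, `hub.verify_token`, and `hub.challenge` query parameters, and the endpoint must echo the challenge back only when the token matches. A minimal sketch, using the template's default token:

```javascript
// Meta webhook verification handshake, as handled by the Validation step.
// VERIFY_TOKEN is the template default; change it to match your Meta app config.
const VERIFY_TOKEN = "youtube-automation-n8n-token";

// Meta calls: GET <webhook>?hub.mode=subscribe&hub.verify_token=...&hub.challenge=...
// Echo hub.challenge back with a 200 only when the token matches; otherwise 403.
function verifyWebhook(query) {
  if (query["hub.mode"] === "subscribe" && query["hub.verify_token"] === VERIFY_TOKEN) {
    return { status: 200, body: query["hub.challenge"] };
  }
  return { status: 403, body: "Forbidden" };
}

console.log(verifyWebhook({
  "hub.mode": "subscribe",
  "hub.verify_token": "youtube-automation-n8n-token",
  "hub.challenge": "1158201444",
})); // → { status: 200, body: "1158201444" }
```

In the n8n template this logic lives in the webhook branch; the sketch just makes the contract explicit so you can see why a mismatched verify_token makes subscription fail.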
by Peliqan
## How it works

This template is an end-to-end demo of a chatbot that uses business data from multiple sources (e.g. Notion, Chargebee, HubSpot) with RAG + SQL. Peliqan.io is used as a "cache" of all business data: Peliqan uses one-click ELT to sync all your business data to its built-in data warehouse, allowing for fast and accurate RAG and "Text-to-SQL" queries.

- The workflow writes source data to Supabase as a vector store, for RAG searches by the chatbot. The source URL (e.g. the URL of a Notion page) is added as metadata.
- The AI Agent decides for each question whether to use RAG, Text-to-SQL, or a combination of both.
- Text-to-SQL is performed via the Peliqan node, added as a tool to the AI Agent. The user's natural-language question is converted into an SQL query by the AI Agent; the query is executed by Peliqan.io on the source data, and the result is interpreted by the AI Agent.
- RAG is typically used to answer knowledge questions, often on unstructured data (Notion pages, Google Drive, etc.). Text-to-SQL is typically used to answer analytical questions, for example "Show a list of customers with the number of open support tickets, and add customer revenue based on invoiced amounts."

## Preconditions

- You signed up for a Peliqan.io free trial account.
- You have one or more data sources, e.g. a CRM, ERP, accounting software, files, Notion, Google Drive, etc.

## Set up steps

1. Sign up for a free trial on peliqan.io: https://peliqan.io
2. Add one or more sources in Peliqan (e.g. Hubspot, Pipedrive...).
3. Copy your Peliqan API key under settings and use it here to add a Peliqan connection.
4. Run the "RAG" workflow to feed Supabase; change the name of the table in the Peliqan node "Get table data".
5. Update the list of tables and columns that can be used for SQL in the System Message of the AI Agent.

Visit https://peliqan.io/n8n for more information.

Disclaimer: This template contains a community node and therefore only works for n8n self-hosted users.
by Elay Guez
# 🔍 AI-Powered Web Research in Google Sheets with Bright Data

## 📋 Overview

Transform any Google Sheets cell into an intelligent web scraper! Type =BRIGHTDATA("cell", "search prompt") and get an AI-filtered result from any website in ~20 seconds.

What happens automatically:

1. AI optimizes your search query
2. Bright Data scrapes the web (bypasses bot detection)
3. AI analyzes and filters the result
4. Clean data is returned directly to your cell
5. Completes in <25 seconds

Cost: ~$0.02-0.05 per search | Time saved: 3-5 minutes per search

## 👥 Who's it for

- Market researchers needing competitive intelligence
- E-commerce teams tracking prices
- Sales teams doing lead prospecting
- SEO specialists gathering content research
- Real estate agents monitoring listings
- Anyone tired of manual copy-paste

## ⚙️ How it works

1. **Webhook Call** - the Google Sheets function sends a POST request
2. **Data Preparation** - organizes the input structure
3. **AI Query Optimization** - GPT-4.1 Mini refines the search query
4. **Web Scraping** - Bright Data fetches data while bypassing blocks
5. **AI Analysis** - GPT-4o Mini filters and summarizes the result
6. **Response** - plain text is returned to your cell
7. **Logging** - logs are updated for monitoring

## 🛠️ Setup Instructions

Time to deploy: 20 minutes

### Requirements

- n8n instance with a public URL
- Bright Data account + API key
- OpenAI API key
- Google account for Apps Script

### Part 1: n8n Workflow Setup

1. Import this template into your n8n instance.
2. Configure the Webhook node:
   - Copy your webhook URL: https://n8n.yourdomain.com/webhook/brightdata-search
   - Set authentication: Header Auth
   - Set the API key: 12312346 (or create your own)
3. Add OpenAI credentials to the AI nodes.
4. Configure Bright Data: add API credentials.
5. Configure the output language: manually edit the "Set Variables" node.
6. Test the workflow with a manual execution.
7. Activate the workflow.

### Part 2: Google Sheets Function

1. Open your Google Sheet → Extensions → Apps Script.
2. Paste the code below.
3. Update N8N_WEBHOOK_URL with your webhook URL.
4. Update API_KEY with your password.
5. Save (Ctrl+S / Cmd+S) - important!

```javascript
function BRIGHTDATA(prompt, source) {
  if (!prompt || prompt === "") {
    return "❌ Must enter prompt";
  }
  source = source || "google";

  // Update with YOUR webhook URL
  const N8N_WEBHOOK_URL = "https://your-n8n-domain.com/webhook/brightdata-search";
  // Update with YOUR password
  const API_KEY = "12312346";

  let spreadsheetId, sheetName, cellAddress;
  try {
    const sheet = SpreadsheetApp.getActiveSheet();
    const activeCell = sheet.getActiveCell();
    spreadsheetId = SpreadsheetApp.getActiveSpreadsheet().getId();
    sheetName = sheet.getName();
    cellAddress = activeCell.getA1Notation();
  } catch (e) {
    return "❌ Cannot identify cell";
  }

  const payload = {
    prompt: prompt,
    source: source.toLowerCase(),
    context: {
      spreadsheetId: spreadsheetId,
      sheetName: sheetName,
      cellAddress: cellAddress,
      timestamp: new Date().toISOString()
    }
  };

  const options = {
    method: "post",
    contentType: "application/json",
    payload: JSON.stringify(payload),
    muteHttpExceptions: true,
    headers: {
      "Accept": "text/plain",
      "key": API_KEY
    }
  };

  try {
    const response = UrlFetchApp.fetch(N8N_WEBHOOK_URL, options);
    const responseCode = response.getResponseCode();
    if (responseCode !== 200) {
      Logger.log("Error response: " + response.getContentText());
      return "❌ Error " + responseCode;
    }
    return response.getContentText();
  } catch (error) {
    Logger.log("Exception: " + error.toString());
    return "❌ Connection error: " + error.toString();
  }
}

function doGet(e) {
  return ContentService.createTextOutput(JSON.stringify({
    status: "alive",
    message: "Apps Script is running",
    timestamp: new Date().toISOString()
  })).setMimeType(ContentService.MimeType.JSON);
}
```
Finally, close the Apps Script editor.

## 💡 Usage Examples

- =BRIGHTDATA("C3", "What is the current price of the product?")
- =BRIGHTDATA("D30", "What is the size of this company?")
- =BRIGHTDATA("A4", "Is this company hiring developers?")

## 🎨 Customization

### Easy Tweaks

- **AI Models** - switch to GPT-4o for better optimization
- **Response Format** - modify the prompt for specific outputs
- **Speed** - optimize AI prompts to reduce time
- **Language** - change prompts for any language

### Advanced Options

- Implement rate limiting
- Add data validation
- Create an async mode for long queries
- Add Slack notifications

## 🚀 Pro Tips

- **Be Specific** - "What is the iPhone 15 Pro 256GB US price?" beats "What is the iPhone price?"
- **Speed Matters** - keep prompts concise (30-second timeout limit)
- **Monitor Costs** - track Bright Data usage
- **Debug** - check workflow logs for errors

## ⚠️ Important Notes

- **Timeout:** 30-second Google Sheets limit (aim for <20s)
- **Plain Text Only:** no JSON responses
- **Costs:** monitor Bright Data at console.brightdata.com
- **Security:** keep API keys secret
- **No Browser Storage:** don't use localStorage/sessionStorage

## 🔧 Troubleshooting

| Error | Solution |
|-------|----------|
| "Exceeded maximum execution time" | Optimize AI prompts or use async mode |
| "Could not fetch data" | Verify Bright Data credentials |
| Empty cell | Check n8n logs for AI parsing issues |
| Broken characters | Verify UTF-8 encoding in the webhook node |

## 📚 Resources

- Bright Data API Docs
- n8n Webhook Documentation
- Google Apps Script Reference

Built with ❤️ by Elay Guez
by takuma
This workflow automates reputation management for physical stores (restaurants, retail, clinics) by monitoring Google Maps reviews, analyzing them with AI, and drafting professional replies. It acts as a 24/7 customer support assistant, ensuring you never miss a negative review and saving hours of manual writing time.

## Who is this for?

- **Store Managers & Owners:** Keep track of customer sentiment without manually checking Google Maps every day.
- **Marketing Agencies:** Automate local SEO reporting and response drafting for multiple clients.
- **Customer Support Teams:** Get instant alerts for negative feedback to resolve issues quickly.

## How it works

1. **Schedule:** Runs every 24 hours (customizable) to fetch the latest data.
2. **Scrape:** Uses Apify to retrieve the latest reviews from a specific Google Maps URL.
3. **Filter:** Checks the Google Sheet database to identify only new reviews and avoid duplicates.
4. **AI Analysis:** An AI Agent (via OpenRouter/OpenAI) analyzes the review text to generate a short summary and draft a polite, context-aware reply based on the star rating (e.g., apologies for low stars, gratitude for high stars).
5. **Alert:** Sends a Slack notification. A low rating (<4 stars) alerts a specific channel (e.g., #customer-support) with a warning; a high rating alerts a general channel (e.g., #wins) to celebrate.
6. **Save:** Appends the review details, AI summary, and draft reply to the Google Sheet.

## Requirements

- **n8n:** Cloud or self-hosted (v1.0+).
- **Apify account:** To run the **Google Maps Reviews Scraper**.
- **Google Cloud Platform:** Google Sheets API enabled.
- **Slack workspace:** A webhook URL or OAuth connection.
- **OpenRouter (or OpenAI) API key:** For the LLM generation.

## How to set up

1. **Google Sheets:** Create a new sheet with the following headers in the first row: reviewId, publishedAt, reviewerName, stars, text, ai_summary, ai_reply, reviewUrl, output, publishedAt date.
2. **Configure credentials:** Set up your accounts for Google Sheets, Apify, Slack, and OpenRouter within n8n.
3. **Edit the "CONFIG" node:**
   - MAPS_URL: paste the full Google Maps link to your store.
   - SHEET_ID: paste the ID found in your Google Sheet URL.
   - SHOP_NAME: your store's name.
4. **Slack nodes:** Select the appropriate channels for positive and negative alerts.

## How to customize

- **Change the AI Persona:** Open the **AI Agent** node and modify the "System Message" to match your brand's tone of voice (e.g., casual, formal, or witty).
- **Adjust Alert Thresholds:** Edit the **If Rating < 4** node to change the criteria for what constitutes a "negative" review (e.g., strictly < 3 stars).
- **Multi-Store Support:** You can loop this workflow over a list of URLs to manage multiple locations in a single execution.
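The rating-based routing in the Alert step can be sketched as below. The channel names are the examples from this template, the threshold matches the "If Rating < 4" node, and the message formats are illustrative, not the exact n8n node output.

```javascript
// Sketch of the rating-based Slack routing. Lower the threshold to 3 if you
// only want strictly negative reviews to page the support channel.
const NEGATIVE_THRESHOLD = 4;

function routeReview(review) {
  const negative = review.stars < NEGATIVE_THRESHOLD;
  return {
    channel: negative ? "#customer-support" : "#wins",
    alert: negative
      ? `⚠️ ${review.stars}★ review from ${review.reviewerName}: "${review.text}"`
      : `🎉 ${review.stars}★ review from ${review.reviewerName}!`,
  };
}

console.log(routeReview({ stars: 2, reviewerName: "Sam", text: "Slow service" }).channel);
// → "#customer-support"
```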
by moosa
# Daily Tech & Startup Digest: Notion-Powered News Curation

## Description

This n8n workflow automates the curation of a daily tech and startup news digest from articles stored in a Notion database. It filters articles from the past 24 hours, refines them using keyword matching and LLM classification, aggregates them into a single Markdown digest with categorized summaries, and publishes the result as a Notion page. Designed for manual testing or daily scheduled runs, it includes sticky notes (as required by the n8n creator page) to document each step clearly. This original workflow is for educational purposes, showcasing Notion integration, AI classification, and Markdown-to-Notion conversion.

Data in Notion

## Workflow Overview

### Triggers

- **Manual Trigger:** Tests the workflow (When clicking 'Execute workflow').
- **Schedule Trigger:** Runs daily at 8 PM (Schedule Trigger, disabled by default).

### Article Filtering

- **Fetch Articles:** Queries the Notion database (Get many database pages) for articles from the last 24 hours using a date filter.
- **Keyword Filtering:** JavaScript code (Code in JavaScript) filters articles containing tech/startup keywords (e.g., "tech," "AI," "startup") in the title, summary, or full text.
- **LLM Classification:** Uses OpenAI's gpt-4.1-mini (OpenAI Chat Model) with a text classifier (Text Classifier) to categorize articles as "Tech/Startup" or "Other," keeping only relevant ones.

### Digest Creation

- **Aggregate Articles:** Combines filtered articles into a single object (Code in JavaScript1) for processing.
- **Generate Digest:** An AI agent (AI Agent) with OpenAI's gpt-4.1-mini (OpenAI Chat Model1) creates a Markdown digest with an intro paragraph, categorized article summaries (e.g., AI & Developer Tools, Startups & Funding), clickable links, and a closing note.

### Notion Publishing

- **Format for Notion:** JavaScript code (Code in JavaScript2) converts the Markdown digest into a Notion-compatible JSON payload, supporting headings, bulleted lists, and links, with a title like "Tech & Startup Daily Digest – YYYY-MM-DD".
- **Create Notion Page:** Sends the payload via HTTP request (HTTP Request) to the Notion API to create a new page.

## Credentials

Uses Notion API and OpenAI API credentials.

## Notes

- This workflow is for educational purposes, demonstrating Notion database querying, AI classification, and Markdown-to-Notion publishing.
- Enable and adjust the schedule trigger (e.g., 8 PM daily) for production use to create daily digests.
- Set up Notion and OpenAI API credentials in n8n before running.
- The date filter can be modified (e.g., hours instead of days) to adjust the article selection window.
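The "Format for Notion" step can be sketched as below. This is a minimal sketch that only handles `## ` headings, `- ` bullets, and plain paragraphs; the workflow's actual Code node also handles links, and the Notion block shapes shown follow the public Notion API block object format.

```javascript
// Convert a Markdown digest into Notion API block objects (simplified sketch).
function markdownToNotionBlocks(markdown) {
  const text = (content) => [{ type: "text", text: { content } }];
  return markdown.split("\n").filter(Boolean).map((line) => {
    if (line.startsWith("## ")) {
      return { object: "block", type: "heading_2", heading_2: { rich_text: text(line.slice(3)) } };
    }
    if (line.startsWith("- ")) {
      return {
        object: "block",
        type: "bulleted_list_item",
        bulleted_list_item: { rich_text: text(line.slice(2)) },
      };
    }
    return { object: "block", type: "paragraph", paragraph: { rich_text: text(line) } };
  });
}

const blocks = markdownToNotionBlocks("## AI & Developer Tools\n- New model released");
console.log(blocks.map((b) => b.type)); // → [ 'heading_2', 'bulleted_list_item' ]
```

The resulting array goes into the `children` field of the Notion pages-create request that the HTTP Request node sends.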
by Interlock GTM
## Summary

Turns a plain name + email into a fully enriched HubSpot contact by matching the person in Apollo, pulling their latest LinkedIn activity, summarising the findings with GPT-4o, and upserting the clean data into HubSpot.

## Key use-cases

- SDRs enriching inbound demo requests before routing
- RevOps teams keeping executive records fresh
- Marketers building highly segmented email audiences

## Inputs

| Field | Type | Example |
|-|-|-|
| name | string | "Jane Doe" |
| email | string | "jane@acme.com" |

## Required credentials

| Service | Node | Notes |
|-|-|-|
| Apollo.io API key | HTTP Request – "Enrich with Apollo" | Set in header x-api-key |
| RapidAPI key (Fresh-LinkedIn-Profile-Data) | "Get recent posts" | Header x-rapidapi-key |
| OpenAI | 3 LangChain nodes | Supply an API key; default model gpt-4o-mini |
| HubSpot OAuth2 | "Enrich in HubSpot" | Add/create any custom contact properties referenced |

## High-level flow

1. **Trigger** – runs when another workflow passes name & email.
2. **Clean** – a JS Code node normalises & deduplicates emails.
3. **Apollo match** – queries /people/match; skips if no person is found.
4. **LinkedIn fetch** – grabs up to 3 original posts from the last 30 days.
5. **AI summary chain** – OpenAI → structured/auto-fixing parsers; produces a strict JSON block with job title, location, summaries, etc.
6. **HubSpot upsert** – maps every key (plus five custom properties) into the contact record.

Sticky notes annotate the canvas; error-prone steps have retry logic.
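The "Clean" step of the flow above can be sketched as below. The listing doesn't show the actual Code node, so this is an assumed implementation: trim, lowercase, drop anything without an "@", and deduplicate. (Strictly, only the domain part of an address is case-insensitive, but lowercasing the whole address is the common CRM convention.)

```javascript
// Sketch of the "Clean" Code node: normalise and deduplicate incoming emails.
function cleanEmails(items) {
  const seen = new Set();
  const out = [];
  for (const item of items) {
    const email = String(item.email || "").trim().toLowerCase();
    if (!email.includes("@") || seen.has(email)) continue; // invalid or duplicate
    seen.add(email);
    out.push({ ...item, email });
  }
  return out;
}

console.log(cleanEmails([
  { name: "Jane Doe", email: " Jane@Acme.com " },
  { name: "Jane D.",  email: "jane@acme.com" }, // duplicate → dropped
  { name: "Nobody",   email: "not-an-email" },  // invalid → dropped
]));
// → [ { name: 'Jane Doe', email: 'jane@acme.com' } ]
```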
by Madame AI
# Product Review Analysis with BrowserAct & Gemini-Powered Recommendations

This n8n template demonstrates how to perform product review sentiment analysis and generate improvement recommendations using an AI Agent. This workflow is perfect for e-commerce store owners, product managers, or marketing teams who want to automate the process of collecting feedback and turning it into actionable insights.

## How it works

1. The workflow is triggered manually.
2. An HTTP Request node initiates a web scraping task with the BrowserAct API to collect product reviews.
3. A series of If and Wait nodes check the status of the scraping task. If the task is not yet complete, the workflow pauses and retries until it receives the full dataset.
4. An AI Agent node, powered by Google Gemini, processes the scraped review summaries. It analyzes the sentiment of each review and generates actionable improvement recommendations.
5. Finally, the workflow sends these detailed recommendations via a Telegram message and an email to the relevant stakeholders.

## Requirements

- **BrowserAct** API account for web scraping
- **BrowserAct** "Product Review Sentiment Analysis" template
- **Gemini** account for the AI Agent
- **Telegram** and **SMTP** credentials for sending messages

## Need help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

## Workflow guidance and showcase

How to INSTANTLY Get Product Improvement Ideas from Amazon Reviews | BrowserAct + n8n + Gemini
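The If/Wait polling pattern from step 3 of "How it works" can be expressed in plain JavaScript as below. `checkStatus` is a stand-in for the HTTP Request that queries the BrowserAct task status; the `status` values and response shape are assumptions for illustration, not the actual BrowserAct API contract.

```javascript
// Poll a task-status function until it completes, fails, or times out.
async function waitForTask(checkStatus, { intervalMs = 5000, maxTries = 60 } = {}) {
  for (let i = 0; i < maxTries; i++) {
    const result = await checkStatus();
    if (result.status === "completed") return result.data;
    if (result.status === "failed") throw new Error("Scraping task failed");
    await new Promise((r) => setTimeout(r, intervalMs)); // the Wait node
  }
  throw new Error("Timed out waiting for scraping task");
}

// Demo with a fake status checker that completes on the third poll.
let calls = 0;
waitForTask(
  async () => (++calls < 3 ? { status: "running" } : { status: "completed", data: ["review A"] }),
  { intervalMs: 1 }
).then((data) => console.log(data)); // → [ 'review A' ]
```

In n8n the loop is built from nodes rather than code, but the control flow is the same: check, branch on status, wait, repeat.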
by Vadim
## What it does

This workflow is an AI agent in the form of a Telegram bot. Its main purpose is to capture contact information and store it in a CRM. The agent supports multi-modal inputs and can extract contact details from text messages, voice recordings, and images (like photos of business cards). The bot guides the user through data collection via a natural conversation, asks clarifying questions for missing information, and summarizes the extracted data for confirmation before saving. It also checks for duplicate contacts by email and gives users the choice to either create a new contact or update an existing one.

For simplicity, this example uses a Google Sheets document to store collected contacts. It can easily be replaced by a real CRM like HubSpot, Pipedrive, Monday, etc.

## How to use the bot

- Send contact details via text or voice, or upload a photo of a business card.
- The bot will show the extracted information and ask questions when needed.
- Once the bot confirms saving of the current contact, you can send the next one.
- Use the /new command at any moment to discard the previous conversation and start from scratch.

## Requirements

- A Telegram bot access token
- Google Gemini API key
- Google Sheets credentials

## How to set up

1. Create a new Telegram bot (see the n8n docs and the Telegram Bot API docs for details).
2. Take the webhook URL from the Telegram Trigger node (WEBHOOK_URL) and your bot's access token (TOKEN) and run `curl -X POST "https://api.telegram.org/bot{TOKEN}/setWebhook?url={WEBHOOK_URL}"`
3. Create a new Google Sheets document with "Full name", "Email", "Phone", "Company", "Job title" and "Meeting notes" columns.
4. Configure parameters in the parameters node: set the ID of the Google Sheets document and the sheet name ("Sheet1" by default).
5. Configure Google Sheets credentials for the AI Agent's tools: Search for contact, Create new contact, and Update existing contact.
6. Add your Google Gemini API key for the models ("AI Agent", "Transcribe audio", "Analyze image" nodes).
by Trung Tran
# AI-Powered YouTube Auto-Tagging Workflow (SEO Automation)

Watch the demo video below:

> Supercharge your YouTube SEO with this AI-powered workflow that automatically generates and applies smart, SEO-friendly tags to your new videos every week. No more manual tagging, just better discoverability, improved reach, and consistent optimization. Plus, get instant Slack notifications so your team stays updated on every video's SEO boost.

## Who's it for

- YouTube creators, channel admins, and marketing teams who publish regularly and want consistent, SEO-friendly tags without manual effort.
- Agencies managing multiple channels who need an auditable, automated tagging process with Slack notifications.

## How it works / What it does

1. **Weekly Schedule Trigger** - runs the workflow once per week.
2. **Get all videos uploaded last week** - queries YouTube for videos uploaded by the channel in the past 7 days.
3. **Get video detail** - retrieves each video's title, description, and ID.
4. **YouTube Video Auto Tagging Agent (LLM)** - inputs: video.title, video.description, channelName. Uses an SEO-specialist system prompt to generate 15–20 relevant, comma-separated tags.
5. **Update video with AI-generated tags** - writes the tags back to the video via the YouTube Data API.
6. **Inform via Slack message** - posts a confirmation message (video title + ID + tags) to a chosen Slack channel for visibility.

## How to set up

1. **YouTube connection**
   - Create a Google Cloud project and enable YouTube Data API v3.
   - Configure an OAuth client (Web app / Desktop as required).
   - Authorize with the Google account that manages the channel.
   - In your automation platform, add the YouTube credential and grant scopes (see Requirements).
2. **Slack connection**
   - Create or use an existing Slack app/bot.
   - Install it to your workspace and capture the bot token.
   - Add the Slack credential in your automation platform.
3. **LLM / chat model**
   - Select your model (e.g., OpenAI GPT).
   - Paste the System Prompt (SEO expert) and the User Prompt template. Inputs: {{video_title}}, {{video_description}}, {{channel_name}}. Output: a comma-separated list of 15–20 tags (no #, no duplicates).
4. **Node configuration**
   - Weekly Schedule Trigger: choose day/time (e.g., Mondays 09:00 local).
   - Get all videos uploaded last week: date filter = now() - 7 days.
   - Get video detail: map each video ID from the previous node.
   - Agent node: map fields to the prompt variables.
   - Update video: map the agent's tag string to the YouTube tags field.
   - Slack message: The video "{{video_title}} - {{video_id}}" has been auto-tagged successfully. Tags: {{tags}}
5. **Test run**
   - Manually run the workflow with one recent video.
   - Verify the tags appear in YouTube Studio and the Slack message posts.

## Requirements

**APIs & Scopes**

- **YouTube Data API v3**
  - youtube.readonly (to list videos / details)
  - youtube or youtube.force-ssl (to update video metadata incl. tags)
- **Slack bot token scopes**
  - chat:write (post messages)
  - channels:read or groups:read if selecting channels dynamically (optional)

**Platform**

- Access to a chat/LLM provider (e.g., OpenAI).
- Outbound HTTPS allowed.

**Rate limits & quotas**

- YouTube updates consume quota; tag updates are write operations — avoid re-writing unchanged tags.
- Add basic throttling (e.g., 1–2 updates/sec) if you process many videos.

## How to customize the workflow

- **Schedule:** switch to daily, or run on publish events instead of weekly.
- **Filtering:** process only videos matching rules (e.g., title contains "tutorial", or missing tags).
- **Prompt tuning:**
  - Add brand keywords to always include (e.g., "WiseStack AI").
  - Constrain to a language (e.g., "Vietnamese tags only").
  - Enforce a max of 500 chars total for tags if you want a stricter cap.
- **Safety guardrails:**
  - Validate model output: split by comma, trim whitespace, dedupe, drop empty/over-long tags.
  - If the agent fails, fall back to a heuristic generator (title/keyword extraction).
- **Change log:** write a row per update to a sheet/DB (videoId, oldTags, newTags, timestamp, runId).
- **Human-in-the-loop:** send tags to Slack as buttons ("Apply / Edit / Skip") before updating YouTube.
- **Multi-channel support:** loop through a list of channel credentials and repeat the pipeline.
- **Notifications:** add error Slack messages for failed API calls; summarize weekly results.

Tip: Keep a small allow/deny list (e.g., banned terms, mandatory brand terms) and run a quick sanitizer right after the agent node to maintain consistency across your channel.
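The tag sanitizer suggested under "Safety guardrails" can be sketched as below: split the agent's comma-separated output, trim whitespace, strip any stray `#`, dedupe case-insensitively, drop empty or over-long tags, and stop once the total character budget is spent. The per-tag cap of 100 characters is an assumption; the 500-character total matches the cap suggested above.

```javascript
// Sanitize the LLM's comma-separated tag output before writing to YouTube.
const MAX_TAG_LENGTH = 100;  // assumed per-tag cap
const MAX_TOTAL_CHARS = 500; // total budget suggested in the prompt-tuning notes

function sanitizeTags(raw) {
  const seen = new Set();
  const tags = [];
  let total = 0;
  for (const part of raw.split(",")) {
    const tag = part.trim().replace(/^#/, "");     // trim and strip leading '#'
    const key = tag.toLowerCase();                 // dedupe case-insensitively
    if (!tag || tag.length > MAX_TAG_LENGTH || seen.has(key)) continue;
    if (total + tag.length > MAX_TOTAL_CHARS) break;
    seen.add(key);
    tags.push(tag);
    total += tag.length;
  }
  return tags;
}

console.log(sanitizeTags("n8n, automation , #SEO, seo, , youtube tips"));
// → [ 'n8n', 'automation', 'SEO', 'youtube tips' ]
```

Running this right after the agent node also makes the "avoid re-writing unchanged tags" optimization easy: compare the sanitized list against the video's current tags and skip the update when they match.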
by AppUnits AI
# Generate Invoices for Customers with Jotform, Xero and Slack

This workflow automates the entire process of receiving a product/service order, checking for or creating a customer in Xero, generating an invoice, emailing it, and notifying the sales team (for example, via Slack) — all triggered by a form submission (via Jotform).

## How It Works

1. **Receive submission:** Triggered when a user submits a form. Collects data like customer details, the selected product/service, etc.
2. **Check if the customer exists:** Searches Xero to determine whether the customer already exists.
   - ✅ If the customer exists: update the customer details.
   - ❌ If the customer doesn't exist: create a new customer in Xero.
3. **Create the invoice:** Generates a new invoice for the customer using the selected item.
4. **Send the invoice:** Automatically sends the invoice via email to the customer.
5. **Notify the team:** Notifies the sales team (for example, via Slack) about the new invoice.

## Who Can Benefit from This Workflow?

- Freelancers
- Service providers
- Consultants & coaches
- Small businesses
- E-commerce or custom product sellers

## Requirements

- Jotform webhook setup (more info here)
- Xero credentials (more info here)
- Product/service values in Jotform must exactly match your item Code in your Xero account
- Email setup: update the email node (Send email)
- LLM model credentials
- Slack credentials (more info here)