by Fahmi Fahreza
## Match Resumes to Jobs Automatically with Gemini AI and Decodo Scraping

Sign up for Decodo HERE for a discount.

This automation intelligently connects candidate profiles to job opportunities. It takes an intake form with a short summary, resume link, and optional LinkedIn profile, then enriches the data using Decodo and Gemini. The workflow analyzes skills, experience, and role relevance, ranks the top matches, and emails a polished HTML report directly to your inbox, saving hours of manual review and matching effort.

### Who's it for?
This template is designed for recruiters, hiring managers, and talent operations teams who handle large candidate volumes and want faster, more accurate shortlisting. It's also helpful for job seekers or career coaches who wish to identify high-fit openings automatically using structured AI analysis.

### How it works
1. Receive an intake form containing a candidate's resume, summary, and LinkedIn URL.
2. Parse and summarize the resume with Gemini for core skills and experience.
3. Enrich the data using Decodo scraping to gather extra profile details.
4. Merge insights and rank job matches from Decodo's job data.
5. Generate an HTML shortlist and email it automatically through Gmail.

### How to set up
1. Connect credentials for Gmail, Google Gemini, and Decodo.
2. Update the webhook path and test your form connection.
3. Customize variables such as location or role preferences.
4. Enable "Send as HTML" in the Gmail node for clean reports.
5. Publish as self-hosted if community nodes are included.
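For illustration, here is a minimal standalone Python sketch of the ranking step the Gemini node performs: it sends the candidate summary and a list of scraped jobs to the Gemini REST API and asks for a ranked shortlist. The model name, prompt wording, and output fields are assumptions, not the template's exact node configuration.

```python
import os
import requests

# Assumed model and endpoint; swap for whatever your Gemini credential uses
GEMINI_URL = (
    "https://generativelanguage.googleapis.com/v1beta/models/"
    "gemini-1.5-flash:generateContent"
)

def rank_jobs(resume_summary: str, jobs: list[dict]) -> str:
    """Ask Gemini to score each scraped job against the candidate summary and return a ranked shortlist."""
    prompt = (
        "You are a recruiting assistant. Rank the jobs below for this candidate and "
        "reply with JSON objects containing title, company, fit_score (0-100), and reason.\n\n"
        f"Candidate summary:\n{resume_summary}\n\nJobs:\n{jobs}"
    )
    resp = requests.post(
        GEMINI_URL,
        params={"key": os.environ["GEMINI_API_KEY"]},
        json={"contents": [{"parts": [{"text": prompt}]}]},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["candidates"][0]["content"]["parts"][0]["text"]
```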
by Rahul Joshi
## 📊 Description

Empower your workflows with an intelligent AI chat assistant that retrieves real-time context from Google Sheets and a Pinecone knowledge base using Retrieval-Augmented Generation (RAG). 🤖📂

This workflow processes chat messages, retrieves relevant contextual data, generates natural and context-aware responses via OpenAI GPT-5, and automatically logs every interaction in Google Sheets. It also auto-indexes new documents from Google Drive into Pinecone for knowledge expansion and emails weekly chat history summaries for review. 💬📊

### What This Template Does
1️⃣ **Chat Trigger** – Receives incoming chat messages through a webhook. 💬
2️⃣ **Data Enrichment** – Extracts topic, intent, and context from messages. 🧩
3️⃣ **Context Retrieval** – Fetches structured data from Google Sheets and semantic data from Pinecone. 🧠
4️⃣ **AI Response Generation** – Uses GPT-5 to produce relevant, human-like replies with contextual references. 🤖
5️⃣ **Conversation Logging** – Stores chat sessions with timestamps and detected intent in Google Sheets. 📋
6️⃣ **Knowledge Base Expansion** – Automatically indexes new files from Google Drive to Pinecone for future RAG responses. 📂
7️⃣ **Weekly Report** – Aggregates all chats and sends an email summary with logs attached via Gmail every Monday. 📧

### Key Benefits
✅ AI chat assistant with contextual accuracy using RAG
✅ Auto-updating knowledge base via Google Drive → Pinecone
✅ End-to-end chat tracking and audit-ready logging
✅ Weekly performance reports via Gmail
✅ Seamless integration with Google Workspace

### Features
- Retrieval-Augmented Generation (RAG) workflow
- Google Sheets integration for structured memory
- Pinecone vector database for semantic retrieval
- GPT-5 for context-aware conversation generation
- Google Drive watcher for automatic knowledge updates
- Gmail integration for weekly report delivery
- Built-in JSON validation and session memory

### Requirements
- OpenAI API key (GPT-4/GPT-5)
- Google Sheets OAuth2 credentials
- Pinecone API credentials
- Google Drive OAuth2 credentials
- Gmail OAuth2 credentials

Replace:
- PINECONE_INDEX with your index name
- GOOGLE_SHEET_ID with your log sheet ID
- GOOGLE_DRIVE_FOLDER_ID with the monitored folder

### Target Audience
- Support and helpdesk automation teams 💬
- AI chatbot developers enhancing context recall 🤖
- Businesses maintaining searchable conversation history 🏢
- Knowledge managers syncing chat + document intelligence 🧠

### Step-by-Step Setup Instructions
1️⃣ Connect your OpenAI, Pinecone, Google Sheets, Google Drive, and Gmail credentials.
2️⃣ Update the Pinecone index and Google Sheet ID in the nodes.
3️⃣ Set your chat webhook URL for real-time message input.
4️⃣ Add your Google Drive folder ID for automated knowledge ingestion.
5️⃣ Test the chat workflow with sample messages.
6️⃣ Enable the schedule trigger to email weekly chat logs automatically. ✅
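As a rough sketch of the retrieve-then-generate step outside n8n, the snippet below embeds the incoming message with OpenAI, pulls the closest chunks from the Pinecone index, and asks the chat model to answer from that context. The embedding model, the metadata field name (`text`), and the top_k value are assumptions; substitute whatever your index actually stores.

```python
import os
from openai import OpenAI
from pinecone import Pinecone

client = OpenAI()  # reads OPENAI_API_KEY from the environment
index = Pinecone(api_key=os.environ["PINECONE_API_KEY"]).Index("PINECONE_INDEX")  # your index name

def answer(question: str) -> str:
    # 1. Embed the incoming chat message (embedding model is an assumption)
    vec = client.embeddings.create(model="text-embedding-3-small", input=question).data[0].embedding
    # 2. Retrieve the closest knowledge-base chunks from Pinecone
    matches = index.query(vector=vec, top_k=5, include_metadata=True).matches
    context = "\n".join((m.metadata or {}).get("text", "") for m in matches)
    # 3. Generate a grounded reply; use whichever chat model your account offers
    reply = client.chat.completions.create(
        model="gpt-5",
        messages=[
            {"role": "system", "content": f"Answer using only this context:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return reply.choices[0].message.content
```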
by Toshiya Minami
### Who's it for
Teams building health/fitness apps, coaches running check-ins in chat, and anyone who needs quick, structured nutrition insights from food photos, without manual logging.

### What it does / How it works
This workflow accepts a food image (URL or Base64), uses a vision-capable LLM to infer likely ingredients and rough gram amounts, estimates per-ingredient calories, and returns a strict JSON summary with total calories and a short nutrition note. It normalizes different payloads (e.g., Telegram/LINE/Webhook) into a common format, handles transient errors with retries, and avoids hardcoded secrets by using credentials/env vars.

### Requirements
- Vision-capable LLM credentials (e.g., gpt-4o or equivalent)
- One input channel (Webhook, Telegram, or LINE)
- Environment variables for model name/temperature and optional request validation

### How to set up
1. Connect your input channel and enable the Webhook (copy the test URL).
2. Add LLM credentials and set LLM_MODEL and LLM_TEMPERATURE (e.g., 0.3).
3. Turn on the workflow, send a sample payload with imageUrl, and confirm the strict JSON output.
4. (Optional) Configure a reply node (Telegram/Slack or HTTP Response) and a logger (Google Sheets/Notion).

### How to customize the workflow
- **Outputs**: Add macros (protein/fat/carb) or micronutrient fields.
- **Units**: Convert portion descriptions (piece/slice) to grams with your own mapping.
- **Languages**: Toggle multilingual output (ja/en).
- **Policies**: Tighten validation (reject low-confidence parses) or add manual review steps.
- **Security**: Use signed/temporary URLs for private images; mask PII in logs.

### Data model (strict JSON)
```json
{
  "dishName": "string",
  "ingredients": [{ "name": "string", "amount": 0, "calories": 0 }],
  "totalCalories": 0,
  "nutritionEvaluation": "string"
}
```

### Notes
Rename all nodes clearly, include sticky notes explaining the setup, and never commit real IDs, tokens, or API keys.
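A minimal sketch of the vision call behind this workflow, using the OpenAI Python SDK and the same LLM_MODEL / LLM_TEMPERATURE environment variables: the prompt wording is illustrative, but the reply is constrained to the strict JSON shape shown above.

```python
import json
import os
from openai import OpenAI

client = OpenAI()  # API key comes from the environment, mirroring the credentials/env-var approach above

def analyze_food(image_url: str) -> dict:
    """Return the strict JSON described above for one food photo."""
    resp = client.chat.completions.create(
        model=os.getenv("LLM_MODEL", "gpt-4o"),
        temperature=float(os.getenv("LLM_TEMPERATURE", "0.3")),
        response_format={"type": "json_object"},
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": (
                    "List likely ingredients with gram estimates and per-ingredient calories. "
                    "Reply only with JSON: dishName, ingredients[{name, amount, calories}], "
                    "totalCalories, nutritionEvaluation."
                )},
                {"type": "image_url", "image_url": {"url": image_url}},
            ],
        }],
    )
    return json.loads(resp.choices[0].message.content)
```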
by Keith Uy
### What it's for
This is a base template for anyone developing a Slack bot AI agent. It accepts multiple input types (voice, picture, video, and text) and processes them with an AI model of your choosing to get you started. From here, you can connect any tools you see fit to the AI Agent in your n8n workflows.

NOTE: This build is specifically for integrating a Slack bot into a CHAT channel. If you want the Slack bot integrated into the whole workspace, you'll need to adjust some of the nodes and bot parameters.

### How it works
- **Input:** A Slack message mentioning the bot in a chat channel.
- **n8n processing:** A Switch node determines the message type: voice, picture, video, or text. (It currently uses OpenAI and Gemini to analyze voice/photo/video content, but feel free to swap these nodes for other models.)
- **AI Agent processing:** The LLM of your choosing examines the message and, based on the system prompt, generates an output.
- **Output:** The AI output is sent back as a Slack message.

### How to use

**1. Create your Slack bot and generate an access token**
This is the longest part of the guide; feel free to search YouTube for "How to install Slack AI agent" or something similar if it's hard to follow.
1. Sign in to the Slack website, then go to https://api.slack.com/apps/
2. Click "Create App" (top right corner).
3. Choose "From Scratch".
4. Enter the desired name of the app (bot) and the desired workspace.
5. Go to the "OAuth & Permissions" tab on the left side of the page.
6. Scroll to "Bot Token Scopes" and add these permissions (feel free to add others; these are just the ones needed for the automation to work):
   - app_mentions:read
   - channels:history
   - channels:join
   - channels:read
   - chat:write
   - files:read
   - links:read
   - links:write
7. Go to "Event Subscriptions" and paste your n8n webhook URL. (Find the webhook URL by clicking the Slack trigger node; there is a dropdown for the webhook URL at the very top.)
8. Go back to the "OAuth & Permissions" tab and install your bot to the Slack workspace (the green button under "Bot User OAuth Token"). Remember where this token is; you'll need it to create the n8n credentials.
9. Add the bot to your channel: in your channel, type "@[your bot name]". Slack will prompt you to add the bot to the channel.

Congrats, you've added the bot to your channel!

**2. Create credentials in n8n**
1. Open the Slack trigger node.
2. Click "Create credential".
3. Paste the access token. (If you followed the steps above, it's under "OAuth & Permissions"; copy the "Bot User OAuth Token" and paste it into the n8n access field.)
4. Save.

**3. Add the bot token to the HTTP Request nodes**
1. Open the HTTP Request nodes (the nodes under the "Download" note).
2. Scroll down and paste your bot access token under "Header Parameters". There is a placeholder "[Your bot access token goes here]".
   - **NOTE:** Replace everything, including the square brackets.
   - Do **not** remove "Bearer". Only replace the placeholder. The finalized Authorization value should be "Bearer [Your bot access token]", NOT "[Your bot access token]" alone.

**4. Change ALL Slack nodes to your Slack workspace and channel**
1. Open each node and change the workspace to your workspace.
2. Change the channel to your channel.
3. Do this for all Slack nodes.

**5. Create an LLM access token**
(This differs per LLM; search "[your LLM] API" in Google. You will need to create an account with the LLM platform.)
1. Buy credits to use the LLM API.
2. Generate an access token.
3. Paste the token into the LLM node.
4. Choose your model.

### Requirements
- Slack bot access token
- Google Gemini access token (for picture and video messages)
- OpenAI access token (for voice messages)
- LLM access token (your preference for the AI Agent)

### Customizing this workflow
- To personalize the AI output, adjust the system prompt (give context or directions on the AI's role).
- Add tools to the AI Agent to give it more utility beyond a personalized LLM (e.g., calendars, databases, etc.).
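To make the Authorization format in step 3 concrete, here is a small Python sketch of what those HTTP Request nodes do when they download a shared file from Slack: the token is sent as "Bearer " plus the Bot User OAuth Token, never the token alone. The environment variable name is just an example.

```python
import os
import requests

BOT_TOKEN = os.environ["SLACK_BOT_TOKEN"]  # the "Bot User OAuth Token" from OAuth & Permissions

def download_slack_file(url_private: str) -> bytes:
    """Fetch a voice/picture/video file shared in the channel, as the HTTP Request nodes do."""
    resp = requests.get(
        url_private,
        # Note the format: the word "Bearer", a space, then the token - never the token alone
        headers={"Authorization": f"Bearer {BOT_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.content
```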
by Hunyao
Upload a PDF and instantly get a neatly formatted Google Doc with all the readable text: no manual copy-paste, no messy line breaks.

### What this workflow does
- Accepts PDF uploads via a public form
- Sends the file to Mistral Cloud for high-accuracy OCR
- Detects and merges page images with their extracted text
- Cleans headers, footers, broken lines, and noise
- Creates a new Google Doc in your chosen Drive folder
- Writes the polished markdown text into the document

### What you need
- Mistral Cloud API key with OCR access
- Google Docs & Drive credentials connected in n8n
- Drive folder ID for new documents
- A PDF file to process (up to 100 MB)

### Setup
1. Import the workflow into n8n and activate credentials.
2. In "Trigger • Form Submission", copy the webhook URL and share it or embed it.
3. In "Create • Google Doc", replace the default folder ID with yours.
4. Fill out the Mistral API key under the Mistral Cloud API credentials.
5. Save and activate the workflow.
6. Visit the form, upload a PDF, name your future doc, and submit.
7. Open Drive to view your newly generated, clean Google Doc.

### Example use cases
- Convert annual reports into editable text for analysis.
- Extract readable content from scan-only invoices for bookkeeping.
- Turn magazine PDFs into draft blog posts.
- Digitize lecture handouts for quick search and annotation.
- Convert image-heavy landing pages / advertorials into editable text for AI to analyze structure and content.
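For reference, this is roughly what the Mistral OCR request looks like outside n8n. The endpoint, model id, and response fields follow Mistral's OCR API at the time of writing; treat them as assumptions and check the current docs before relying on them.

```python
import os
import requests

def ocr_pdf(pdf_url: str) -> str:
    """Rough sketch of the OCR call the workflow makes; verify endpoint/fields against Mistral's docs."""
    resp = requests.post(
        "https://api.mistral.ai/v1/ocr",
        headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
        json={
            "model": "mistral-ocr-latest",  # assumed model id
            "document": {"type": "document_url", "document_url": pdf_url},
        },
        timeout=120,
    )
    resp.raise_for_status()
    # Each page comes back as markdown; the workflow merges and cleans these before writing the Doc
    return "\n\n".join(page["markdown"] for page in resp.json()["pages"])
```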
by ScoutNow
## Automated Daily Competitor Tweet Summarizer with X API, GPT-5-Nano, and Gmail

Stay on top of your competition with this powerful n8n workflow that automatically fetches and summarizes your competitors' latest tweets every day. Using the official X (formerly Twitter) API and OpenAI's GPT-5-Nano model, this template extracts insights from public tweets and sends concise summaries directly to your Gmail inbox.

Ideal for marketing teams, product managers, PR professionals, and competitive intelligence analysts, this solution turns noisy social feeds into clear, actionable summaries, automated and customized to your needs.

### Features
- **Daily automation:** Fetches competitor tweets every 24 hours using the X API
- **AI summarization:** Uses GPT-5-Nano to highlight key insights and themes
- **Smart filtering:** Cleans and filters tweets for relevance before summarizing
- **Email delivery:** Sends summaries to Gmail (or your team's inbox)
- **Fully customizable:** Modify schedules, accounts, and integrations as needed

### Setup Instructions
1. Get API keys:
   - X API (Bearer Token) – from developer.x.com
   - OpenAI API key – from platform.openai.com
   - Gmail OAuth2 credentials (via Google Cloud Console)
2. Configure in n8n:
   - Import the workflow
   - Add credentials under the "Credentials" tab
   - Set target X usernames and the schedule
3. Customize delivery (optional):
   - Set the email subject and recipients
   - Add additional integrations (e.g., Slack, Notion, Sheets)

### How It Works
1. **Trigger:** A daily cron node initiates the workflow.
2. **Fetch User ID:** The workflow uses the X API to retrieve the user ID for the provided username. This step is necessary because the tweet retrieval endpoint requires a user ID, not a username.
3. **Fetch Tweets:** Using the extracted user ID, the workflow queries the X API for recent tweets from the selected account.
4. **Clean Data:** Filters out replies, retweets, and any irrelevant content to ensure only meaningful tweets are summarized.
5. **Summarize:** GPT-5-Nano processes the cleaned tweet content and generates a concise, insightful summary.
6. **Send Email:** The Gmail node sends the final summary to your inbox or chosen recipient.

### Use Cases
- Track competitor announcements and marketing messages
- Automate daily social media briefs for leadership
- Monitor trends in your industry effortlessly
- Keep your team aligned with market developments

### Requirements
- Valid X API credentials (Bearer token)
- OpenAI API key
- Gmail OAuth2 credentials
- Access to n8n (cloud or self-hosted)

### Delivery Options
While Gmail is the default, you can easily extend the workflow to integrate with:
- Slack
- Notion
- Google Sheets
- Webhooks
- Any supported n8n integration

Automate your competitive intelligence process and stay informed, without lifting a finger.
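The two X API calls in steps 2 and 3 look roughly like this in plain Python. The Bearer-token environment variable and the exclude filter (mirroring the "Clean Data" step) are illustrative choices, not the template's exact node settings.

```python
import os
import requests

BASE = "https://api.twitter.com/2"
HEADERS = {"Authorization": f"Bearer {os.environ['X_BEARER_TOKEN']}"}

def recent_tweets(username: str, max_results: int = 25) -> list[str]:
    """Resolve the user ID, then pull recent original tweets - the same two X API calls the workflow makes."""
    user = requests.get(f"{BASE}/users/by/username/{username}", headers=HEADERS, timeout=30)
    user.raise_for_status()
    user_id = user.json()["data"]["id"]

    tweets = requests.get(
        f"{BASE}/users/{user_id}/tweets",
        headers=HEADERS,
        params={"max_results": max_results, "exclude": "retweets,replies"},  # drop noise before summarizing
        timeout=30,
    )
    tweets.raise_for_status()
    return [t["text"] for t in tweets.json().get("data", [])]
```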
by DIGITAL BIZ TECH
## AI Website Scraper & Company Intelligence

### Description
This workflow automates the process of transforming any website URL into a structured, intelligent company profile. It's triggered by a form, allowing a user to submit a website and choose between a "basic" or "deep" scrape. The workflow extracts key information (mission, services, contacts, SEO keywords), stores it in a structured Supabase database, and archives a full JSON backup to Google Drive. It also features a secondary AI agent that automatically finds and saves competitors for each company, building a rich, interconnected database of company intelligence.

### Quick Implementation Steps
1. **Import the workflow:** Import the provided JSON file into your n8n instance.
2. **Install the custom community node:** You must install the community node from https://www.npmjs.com/package/n8n-nodes-crawl-and-scrape. Firecrawl n8n documentation: https://docs.firecrawl.dev/developer-guides/workflow-automation/n8n
3. **Install additional nodes:** n8n-nodes-crawl-and-scrape and n8n-nodes-mcp (Firecrawl MCP).
4. **Set up credentials:** Create credentials in n8n for the Firecrawl API, Supabase, Mistral AI, and Google Drive.
5. **Configure the API key (CRITICAL):** Open the Web Search tool node. Go to Parameters → Headers and replace the hardcoded Tavily AI API key with your own.
6. **Configure Supabase nodes:** Assign your Supabase credential to all Supabase nodes. Ensure table names (e.g., companies, competitors) match your schema.
7. **Configure Google Drive nodes:** Assign your Google Drive credential to the "Google Drive2" and "save to Google Drive1" nodes. Select the correct folder ID.
8. **Activate the workflow:** Turn on the workflow and open the webhook URL in the "On form submission" node to access the form.

### What It Does
1. **Form Trigger:** Captures user input: "Website URL" and "Scraping Type" (basic or deep).
2. **Scraping Router:** A Switch node routes the flow:
   - **Deep Scraping** → AI-based MCP Firecrawl agent.
   - **Basic Scraping** → Crawlee node.
3. **Deep Scraping (Firecrawl AI Agent):** Uses Firecrawl and Tavily Web Search. Extracts a detailed JSON profile: mission, services, contacts, SEO keywords, etc.
4. **Basic Scraping (Crawlee):** Uses the Crawl and Scrape node to collect raw text. A Mistral-based AI extractor structures the data into JSON.
5. **Data Storage:** Stores structured data in Supabase tables (companies, company_basicprofiles). Archives a full JSON backup to Google Drive.
6. **Automated Competitor Analysis:** Runs after a deep scrape. Uses Tavily web search to find competitors (e.g., from Crunchbase). Saves competitor data to Supabase, linked by company_id.

### Who's It For
- **Sales & Marketing Teams:** Enrich leads with deep company info.
- **Market Researchers:** Build structured, searchable company databases.
- **B2B Data Providers:** Automate company intelligence collection.
- **Developers:** Use as a base for RAG or enrichment pipelines.

### Requirements
- **n8n instance** (self-hosted or cloud)
- **Supabase account:** With tables like companies, competitors, social_links, etc.
- **Mistral AI API key**
- **Google Drive credentials**
- **Tavily AI API key**
- (Optional) Custom nodes: n8n-nodes-crawl-and-scrape

### How It Works (Flow Summary)
1. **Form Trigger:** Captures "Website URL" and "Scraping Type".
2. **Switch node:** deep → MCP Firecrawl (AI Agent); basic → Crawl and Scrape node.
3. **Scraping & extraction:** Deep path: Firecrawl → JSON structure. Basic path: Crawlee → Mistral extractor → JSON.
4. **Storage:** Save JSON to Supabase; archive in Google Drive.
5. **Competitor analysis (deep only):** Finds competitors via Tavily; saves to the Supabase competitors table.
6. **End:** Finishes with a No Operation node.
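The competitor-analysis step boils down to a Tavily search like the sketch below. The query wording and max_results are assumptions, and the request payload follows Tavily's documented search endpoint at the time of writing.

```python
import os
import requests

def find_competitors(company_name: str, max_results: int = 5) -> list[dict]:
    """Search the web for likely competitors, as the deep-scrape branch does via the Web Search tool."""
    resp = requests.post(
        "https://api.tavily.com/search",
        json={
            "api_key": os.environ["TAVILY_API_KEY"],
            "query": f"{company_name} competitors site:crunchbase.com OR site:g2.com",  # illustrative query
            "max_results": max_results,
        },
        timeout=30,
    )
    resp.raise_for_status()
    # Each result carries a title, url, and content snippet the AI agent can turn into competitor rows
    return resp.json().get("results", [])
```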
### How To Set Up
1. Import the workflow JSON.
2. Install the community nodes (especially n8n-nodes-crawl-and-scrape from npm).
3. Configure credentials (Supabase, Mistral AI, Google Drive).
4. Add your Tavily API key.
5. Connect the Supabase and Drive nodes properly.
6. Fix the disconnected "basic" path if needed.
7. Activate the workflow.
8. Test via the webhook form URL.

### How To Customize
- **Change LLMs:** Swap Mistral for OpenAI or Claude.
- **Edit scraper prompts:** Modify system prompts in the AI agent nodes.
- **Change extraction schema:** Update the JSON Schema in the extractor nodes.
- **Fix relational tables:** Add an Items node before Supabase inserts for arrays (social links, keywords).
- **Enhance automation:** Add email/Slack notifications, or replace the form trigger with a Google Sheets trigger.

### Add-ons
- **Automated trigger:** Run on new sheet rows.
- **Notifications:** Email or Slack alerts after completion.
- **RAG integration:** Use the Supabase database as a chatbot knowledge source.

### Use Case Examples
- **Sales lead enrichment:** Instantly get company + competitor data from a URL.
- **Market research:** Collect and compare companies in a niche.
- **B2B database creation:** Build a proprietary company dataset.

### Troubleshooting Guide
| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| Form Trigger 404 | Workflow not active | Activate the workflow |
| Web Search tool fails | Missing Tavily API key | Replace the placeholder key |
| Firecrawl / find competitor fails | Missing MCP node | Install n8n-nodes-mcp |
| Basic scrape does nothing | Switch node path disconnected | Reconnect the "basic" output |
| Supabase node error | Wrong table/column names | Match the schema exactly |

### Need Help or More Workflows?
Want to customize this workflow for your business or integrate it with your existing tools? Our team at Digital Biz Tech can tailor it precisely to your use case, from automation logic to AI-powered enhancements.

Contact: shilpa.raju@digitalbiz.tech
For more such offerings, visit us: https://www.digitalbiz.tech
by David Olusola
## 📧 Auto-Send AI Follow-Up Emails to Zoom Attendees

This workflow automatically emails personalized follow-ups to every Zoom meeting participant once the meeting ends.

### ⚙️ How It Works
1. **Zoom Webhook** → Captures the meeting.ended event + participant list.
2. **Normalize Data** → Extracts names, emails, and the transcript (if available).
3. **AI (GPT-4)** → Drafts short, professional follow-up emails.
4. **Gmail** → Sends a thank-you + recap email to each participant.

### 🛠️ Setup Steps

**1. Zoom App**
- Enable the meeting.ended event.
- Include participant email/name in the webhook payload.
- Paste the workflow webhook URL.

**2. Gmail**
- Connect Gmail OAuth in n8n.
- Emails are sent automatically per participant.

**3. OpenAI**
- Add your OpenAI API key.
- Uses GPT-4 for personalized drafting.

### 📊 Example Output
Email subject: Follow-Up: Marketing Strategy Session

Email body:

Hi Sarah,

Thank you for joining our Marketing Strategy Session today.

Key points we discussed:
- Campaign launch next Monday
- Budget allocation approved
- Need design assets by Thursday

Next steps: I'll follow up with the creative team and share the updated timeline.

Best,
David

⚡ With this workflow, every attendee feels valued and aligned after each meeting.
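The drafting step reduces to one chat-completion call per participant, roughly like this sketch; the prompt wording and function signature are illustrative, not the workflow's exact node configuration.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def draft_followup(name: str, topic: str, transcript: str) -> str:
    """Draft the short follow-up email the workflow sends to each attendee."""
    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": "Write a brief, professional meeting follow-up email."},
            {"role": "user", "content": (
                f"Attendee: {name}\nMeeting: {topic}\n"
                f"Transcript (may be empty):\n{transcript}\n\n"
                "Include a thank-you, 2-4 key points discussed, and next steps."
            )},
        ],
    )
    return resp.choices[0].message.content
```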
by Christian Lutz
### How it works
This workflow automates the delivery of personalized, AI-generated reports or roadmaps for new leads. When someone submits their information through a form, the workflow:
1. Captures and stores the lead data.
2. Uses an AI model to generate a customized report or roadmap.
3. Formats the output into a professional, email-ready HTML document.
4. Sends the report automatically to the lead via email.
5. Optionally sends internal notifications (e.g., via chat or email) for tracking and follow-up.

The process eliminates manual work and ensures every lead receives instant, high-quality output tailored to their input.

### Setup steps
1. **Webhook** – Connect your form or website to the webhook endpoint to receive lead data.
2. **Data Table** – Create or link a table to store incoming leads and track delivery status.
3. **AI Configuration** – Add your OpenAI (or compatible) API credentials and customize prompts for your desired output.
4. **Email Setup** – Configure SMTP credentials and define sender/recipient addresses for report delivery.
5. **Notifications** – Optionally connect a chat or messaging service (e.g., Telegram) for internal alerts.
6. **Activation** – Test the workflow, confirm the data flow and email delivery, then activate it for live use.

This workflow transforms manual lead engagement into a fully automated, AI-driven experience that delivers instant, personalized value to every new contact.
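As a rough stand-in for the email delivery step, this standard-library sketch wraps the AI-generated HTML report and sends it over SMTP. The environment variable names are placeholders for whatever your SMTP credential uses.

```python
import os
import smtplib
from email.mime.text import MIMEText

def send_report(to_addr: str, subject: str, html_report: str) -> None:
    """Send the formatted HTML report to the lead, as the email step does."""
    msg = MIMEText(html_report, "html")  # HTML body so the report renders as designed
    msg["Subject"] = subject
    msg["From"] = os.environ["SMTP_FROM"]
    msg["To"] = to_addr
    with smtplib.SMTP(os.environ["SMTP_HOST"], int(os.getenv("SMTP_PORT", "587"))) as server:
        server.starttls()
        server.login(os.environ["SMTP_USER"], os.environ["SMTP_PASSWORD"])
        server.send_message(msg)
```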
by Ranjan Dailata
### Who this is for
This workflow is designed for:
- Recruiters, talent intelligence teams, and HR tech builders automating resume ingestion.
- Developers and data engineers building ATS (Applicant Tracking Systems) or CRM data pipelines.
- AI and automation enthusiasts looking to extract structured JSON data from unstructured resume sources (PDFs, DOCs, HTML, or LinkedIn-like URLs).

### What problem this workflow solves
Resumes often arrive in different formats (PDF, DOCX, web profile, etc.) that are difficult to process automatically. Manually extracting fields like candidate name, contact info, skills, and experience wastes time and is prone to human error.

This workflow:
- Converts any unstructured resume into a structured JSON Resume format.
- Ensures the output aligns with the JSON Resume Schema.
- Saves the structured result to Google Sheets and local disk for easy tracking and integration with other tools.

### What this workflow does
The workflow automates the entire resume parsing pipeline:
1. **Trigger:** Starts manually with an Execute Workflow button.
2. **Input Setup:** A Set node defines the resume_url (e.g., a hosted resume link).
3. **Resume Content Extraction:** Sends the URL to the Thordata Universal API, which retrieves the web content, cleans HTML/CSS, and extracts structured text and metadata.
4. **Convert HTML → Markdown:** Converts the HTML content into Markdown to prepare for AI model parsing.
5. **JSON Resume Builder (AI Extraction):** Sends the Markdown to OpenAI GPT-4.1-mini, which extracts:
   - basics: name, email, phone, location
   - work: companies, roles, achievements
   - education: institutions, degrees, dates
   - skills, projects, certifications, languages, and more
   The output adheres to the JSON Resume Schema.
6. **Output Handling:** Saves the final structured resume locally to disk and appends it to a Google Sheet for analytics or visualization.

### Setup
Prerequisites:
- n8n instance (self-hosted or cloud)
- Credentials for:
  - Thordata Universal API (HTTP Bearer Token) – first-time users can sign up
  - OpenAI API key
  - Google Sheets OAuth2 integration

Steps:
1. Import the provided workflow JSON into n8n.
2. Configure your Thordata Universal API token under Credentials → HTTP Bearer Auth.
3. Connect your OpenAI account under Credentials → OpenAI API.
4. Link your Google Sheets account (used in the "Append or update row in sheet" node).
5. Replace the resume_url in the Set node with your own resume file or hosted link.
6. Execute the workflow.

### How to customize this workflow
- **Input sources:** Replace the Manual Trigger with a Webhook Trigger to accept resumes uploaded from your website, or a Google Drive / Dropbox trigger to process uploaded files automatically.
- **Output destinations:** Send results to Notion, Airtable, or Supabase via API nodes, or to Slack / email for recruiter notifications.
- **Language model options:** You can upgrade from gpt-4.1-mini to gpt-4.1 or a custom fine-tuned model for improved accuracy.

### Summary
Unstructured Resume Parser with Thordata Universal API + OpenAI GPT-4.1-mini automates the process of converting messy, unstructured resumes into clean, structured JSON data. It leverages Thordata's Universal API for document ingestion and preprocessing, then uses OpenAI GPT-4.1-mini to extract key fields such as name, contact details, skills, experience, education, and achievements with high accuracy.
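Step 5 amounts to a single structured-extraction call; here is a minimal sketch of it with the OpenAI Python SDK. The system prompt is an illustrative stand-in for the workflow's node prompt, and JSON output is enforced so the result can be saved and appended to the sheet as-is.

```python
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def markdown_to_json_resume(resume_markdown: str) -> dict:
    """Ask GPT-4.1-mini for a JSON Resume document built from the converted Markdown."""
    resp = client.chat.completions.create(
        model="gpt-4.1-mini",
        response_format={"type": "json_object"},
        messages=[
            {"role": "system", "content": (
                "Extract a JSON Resume (https://jsonresume.org/schema) from the resume below. "
                "Include basics, work, education, skills, projects, certifications, and languages."
            )},
            {"role": "user", "content": resume_markdown},
        ],
    )
    return json.loads(resp.choices[0].message.content)
```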
by Summer
## Website Leads to Voice Demo and Scheduling

Creator: Summer Chang

### AI Booking Agent Setup Guide

### Overview
This automation turns your website into an active booking agent. When someone fills out your form, it automatically:
1. Adds their information to Notion.
2. Researches their business from their website using AI.
3. Calls them immediately with a personalized pitch.
4. Updates Notion with the call results.

Total setup time: 30-45 minutes

### What You Need
Before starting, create accounts and gather these:
- n8n account (cloud or self-hosted)
- Notion account – the free plan works; duplicate my Notion template
- OpenRouter API key – get it from openrouter.ai
- Vapi account – get it from vapi.ai
  - Create an AI assistant
  - Set up a phone number
  - Copy your API key, Assistant ID, and Phone Number ID

### How It Works (The Complete Flow)
1. A visitor fills out the form on your website.
2. The form submission creates a new record in Notion with Status = "New".
3. The Notion Trigger detects the new record (checks every minute).
4. The main workflow executes:
   - Fetches the lead's website
   - AI analyzes their business
   - Updates Notion with the analysis
   - Makes a Vapi call with a personalized intro
5. The call happens between your AI agent and the lead.
6. When the call ends, Vapi sends a webhook to n8n.
7. The webhook workflow executes:
   - Fetches call details from Vapi
   - AI generates a call summary
   - Updates Notion with the results and recording
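The "Makes a Vapi call with a personalized intro" step corresponds to a call-creation request roughly like the one below. The endpoint and the assistantOverrides field follow Vapi's API docs at the time of writing; treat them, and the placeholder environment variable names, as assumptions to verify before use.

```python
import os
import requests

def start_demo_call(lead_phone: str, pitch: str) -> str:
    """Kick off the outbound Vapi call after the website research step."""
    resp = requests.post(
        "https://api.vapi.ai/call",
        headers={"Authorization": f"Bearer {os.environ['VAPI_API_KEY']}"},
        json={
            "assistantId": os.environ["VAPI_ASSISTANT_ID"],
            "phoneNumberId": os.environ["VAPI_PHONE_NUMBER_ID"],
            "customer": {"number": lead_phone},
            # Assumed override field: passes the personalized intro generated from the lead's website
            "assistantOverrides": {"firstMessage": pitch},
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["id"]  # call ID, later matched against the end-of-call webhook
```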
by WhySoSerious
What it is
This workflow listens for new tickets in HaloPSA via webhook, generates a professional AI-powered summary of the issue using Gemini (or another LLM), and posts it back into the ticket as a private note.

It's designed for MSPs using HaloPSA who want to reduce triage time and give engineers a clear head start on each support case.

⸻

✨ Features
• 🔔 Webhook trigger from HaloPSA on new ticket creation
• 🚧 Optional team filter (skip Sales or other queues)
• 📦 Extracts ticket subject, details, and ID
• 🧠 Builds a structured AI prompt with MSP context (NinjaOne, M365, CIPP)
• 🤖 Processes via Gemini or another LLM
• 📑 Cleans & parses JSON output (summary, next step, troubleshooting)
• 🧱 Generates a branded HTML private note (logo + styled sections)
• 🌐 Posts the note back into HaloPSA via API

⸻

🔧 Setup
• Webhook – Replace WEBHOOK_PATH and paste the generated Production URL into your HaloPSA webhook.
• Guard filter (optional) – Change teamName or teamId to skip tickets from specific queues.
• Branding – Replace YOUR_LOGO_URL and "Your MSP Brand" in the HTML note builder.
• HaloPSA API – In the HTTP node, replace YOUR_HALO_DOMAIN and add your Halo API token (Bearer auth).
• LLM credentials – Set your API key in the Gemini / OpenAI node credentials section.
• (Optional) Adjust the AI prompt with your own tools or processes.

⸻

✅ Requirements
• HaloPSA account with API enabled
• Gemini / OpenAI (or other LLM) API key
• SMTP (optional) if you want to extend with notifications

⸻

⚡ Workflow overview
`🔔 Webhook → 🚧 Guard → 📦 Extract Ticket → 🧠 Build AI Prompt → 🤖 AI Agent (Gemini) → 📑 Parse JSON → 🧱 Build HTML Note → 🌐 Post to HaloPSA`
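The 📑 Parse JSON step is essentially fence-stripping plus json.loads with safe defaults, as in the small sketch below. The key names (summary, next_step, troubleshooting) are assumed from the feature list above and may differ from the actual prompt's schema.

```python
import json
import re

def parse_ai_note(raw: str) -> dict:
    """Strip code fences from the model's reply and parse the JSON the HTML note builder expects."""
    cleaned = re.sub(r"^```(?:json)?\s*|\s*```$", "", raw.strip())
    data = json.loads(cleaned)
    # Fall back to empty strings so the note builder never breaks on a missing key
    return {
        "summary": data.get("summary", ""),
        "next_step": data.get("next_step", ""),
        "troubleshooting": data.get("troubleshooting", ""),
    }
```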