by kote2
**Overview**

This workflow lets you capture, store, and retrieve notes from LINE chats (both text and voice messages) and automatically send them to your Gmail inbox. By leveraging the Supabase vector database, you can not only store and recall your notes, but also repurpose them for idea generation, quiz creation, or hypothesis building.

**Key Features**
- Receive text and audio messages via LINE
- Transcribe audio messages automatically and save them in Supabase
- Trigger note storage with a specific keyword (default: "Diane")
- Automatically send the latest notes to your Gmail every morning at 7 AM
- Search and reuse your notes (e.g., generate ideas, quizzes, or insights)

**Requirements**
- Supabase account (free plan supported)
- LINE Messaging API channel setup (obtain your access token)
- Gmail authentication (OAuth2)

**Notes**
- Replace placeholders such as LINE_CHANNELACCESS_TOKEN, YOUR_USERID, and YOUR_EMAIL_ADDRESS with your own information.
- All credentials (OpenAI, Supabase, LINE, Gmail, etc.) must be configured securely in the Credentials section of n8n.
- You may customize the trigger keyword ("Diane") to any word you like.
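The keyword trigger described above can be pictured as a small check in an n8n Code node. This is a minimal sketch, not the template's actual code: the default keyword "Diane" comes from the description, but the function name and structure are assumptions.

```javascript
// Minimal sketch of the trigger-keyword check (assumed structure, not the
// template's actual Code node). A LINE text message is stored as a note only
// when it starts with the trigger keyword; the remainder becomes the note body.
const TRIGGER_KEYWORD = 'Diane'; // template default; customizable

function extractNote(messageText) {
  if (typeof messageText !== 'string') return null;
  const trimmed = messageText.trim();
  if (!trimmed.startsWith(TRIGGER_KEYWORD)) return null; // not a note command
  return trimmed.slice(TRIGGER_KEYWORD.length).trim();   // note body to store
}
```

Messages that do not start with the keyword pass through the chat untouched, which is why the keyword can be any word you like.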
by Rahul Joshi
**Description:**

Keep your API documentation accurate and reliable with this n8n automation template. The workflow automatically tests your FAQ content related to authentication and rate limits, evaluating each answer with Azure OpenAI GPT-4o-mini for completeness, edge-case coverage, and technical clarity. It logs all results to Google Sheets, scores FAQs from 0–10, and sends Slack alerts when low-quality answers are detected. Ideal for API teams, developer relations managers, and technical writers who want to maintain high-quality documentation with zero manual review effort.

✅ **What This Template Does (Step-by-Step)**

▶️ **Manual Trigger or On-Demand Run**
Start the evaluation anytime you update your FAQs, perfect for regression testing before documentation releases.

📖 **Fetch FAQ Q&A from Google Sheets**
Reads FAQ questions and answers from your designated test sheet (columns A:B). Each Q&A pair becomes a test case for AI evaluation.

🤖 **AI Evaluation via GPT-4o-mini**
Uses Azure OpenAI GPT-4o-mini to evaluate how well each FAQ covers critical aspects of API authentication and rate limiting. The AI provides a numeric score (0–10) and a short explanation.

🔍 **Parse & Format AI Results**
Extracts structured JSON data (Question, Score, Explanation, Timestamp) and prepares it for reporting and filtering.

💾 **Save Evaluation to Google Sheets**
Appends all results to a Results Sheet (A:D), creating a running history of FAQ quality audits.

⚠️ **Filter for Low-Scoring FAQs**
Identifies any FAQ with a score below 7, flagging it as needing review or rewrite.

🔔 **Send Slack Alerts for Weak Entries**
Posts an alert message in your chosen Slack channel, including:
- The question text
- Score received
- AI's explanation
- Link to the full results sheet

This ensures your documentation team can quickly address weak or incomplete FAQ answers.
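The parse-and-filter steps above reduce to stripping the model's fences and comparing the score against the threshold of 7. A hedged sketch: the field names (Question, Score, Explanation) come from the description, while the helper names and fence handling are assumptions.

```javascript
// Sketch of parsing the evaluator's JSON reply and flagging weak FAQs.
// Field names follow the template description; the helper names and the
// fence-stripping approach are illustrative assumptions.
function parseEvaluation(rawReply) {
  // Models often wrap JSON in Markdown code fences; strip them before parsing.
  const cleaned = String(rawReply).replace(/```(?:json)?/g, '').trim();
  return JSON.parse(cleaned);
}

function needsReview(evaluation, threshold = 7) {
  return evaluation.Score < threshold; // scores below 7 trigger a Slack alert
}
```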
🧠 **Key Features**
- 🤖 AI-powered FAQ quality scoring (0–10)
- 📊 Automated tracking of doc health over time
- 📥 Seamless Google Sheets integration for results storage
- ⚙️ Slack notifications for underperforming FAQs
- 🧩 Ideal for continuous documentation improvement

💼 **Use Cases**
- 📘 Validate FAQ accuracy before API documentation updates
- ⚡ Auto-test new FAQ sets during content refresh cycles
- 🧠 Ensure API rate limit and auth topics cover all edge cases
- 📢 Alert documentation owners about weak answers instantly

📦 **Required Integrations**
- Google Sheets API: for reading and storing FAQs and test results
- Azure OpenAI (GPT-4o-mini): for evaluating FAQ coverage and clarity
- Slack API: for sending quality alerts and notifications

🎯 **Why Use This Template?**
✅ Ensures API FAQ accuracy and completeness automatically
✅ Replaces tedious manual content reviews with AI scoring
✅ Builds an ongoing record of documentation improvements
✅ Keeps technical FAQs consistent, relevant, and developer-friendly
by Mychel Garzon
**AI-Powered CV Feedback & Fit Score**

This workflow uses AI to automatically analyze a candidate's CV against any job posting. It extracts key skills, requirements, and gaps, then generates a clear fit summary, recommendations, and optimization tips. Candidates also receive a structured email report, helping them improve their CV and focus on the right roles. No more guesswork: the workflow delivers objective, AI-powered career insights in minutes.

**Benefits**
• Automated CV analysis: instantly compare your CV with any job description.
• Clear recommendations: get a fit score (1–10) plus "Apply," "Consider," or "Not a fit."
• Actionable feedback: see missing skills and concrete optimization tips.
• Email reports: candidates receive a professional summary directly in their inbox.

**Target Audience**
• Job seekers
• Career coaches and recruiters
• HR teams evaluating candidate job alignment
• Tech bootcamps and training programs

**Required APIs**
• Google Gemini API (AI analysis)
• Email credentials (send candidate reports)

**Easy Customization**
• Fit score logic: adjust thresholds for "Apply," "Consider," and "Not a fit."
• Email templates: personalize branding, tone, or add follow-up resources.
• Delivery channels: add Slack, Teams, or WhatsApp nodes for real-time feedback.
• Language detection: extend to more languages by adding translation nodes.
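The fit-score thresholds mentioned under "Easy Customization" could look like this in a Code node. The 1–10 scale comes from the description; the specific cutoffs (8 and 5) are assumed defaults you would tune.

```javascript
// Illustrative mapping from the 1-10 fit score to a recommendation.
// The score range comes from the template; the cutoff values are assumed
// defaults, meant to be adjusted in the "fit score logic" customization step.
function recommendation(fitScore) {
  if (fitScore >= 8) return 'Apply';    // strong match
  if (fitScore >= 5) return 'Consider'; // partial match; optimize the CV first
  return 'Not a fit';                   // large gaps between CV and posting
}
```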
by NodeAlchemy
🧾 **Short Description**

An AI-powered customer support workflow that automatically triages, summarizes, classifies, and routes tickets to the right Slack and CRM queues. It sends personalized auto-replies, logs results to Google Sheets, and uses a dead-letter queue (DLQ) for failed cases.

⚙️ **How It Works**
- Trigger: captures messages from email or form submissions.
- AI Triage: summarizes and classifies issues, scores urgency, and suggests next steps.
- Routing: directs each ticket to a Slack or CRM queue based on type and priority.
- Logging: records summaries, urgency, and responses in Google Sheets.
- Auto-Reply: sends an acknowledgment email with ticket ID and SLA timeframe.
- Error Handling: failed triage or delivery attempts are logged in a DLQ.

🧩 **How to Use**
1. Configure triggers (email or webhook) and connect credentials for OpenAI, Slack, Gmail, and Google Sheets.
2. In Workflow Configuration, set:
   - Slack Channel IDs
   - CRM Type (HubSpot, Salesforce, or custom)
   - Google Sheet URL
   - SLA thresholds (e.g., 2h, 6h, 24h)
3. Test with a sample ticket and verify routing and summaries in Slack and Sheets.

🔑 **Requirements**
- OpenAI API key (GPT-4o-mini or newer)
- Slack OAuth credentials
- Google Sheets API access
- Gmail/SMTP credentials
- CRM API (HubSpot, Salesforce, or custom endpoint)

💡 **Customization Ideas**
- Add sentiment detection for customer tone.
- Localize responses for multilingual support.
- Extend DLQ logging to Notion or Airtable.
- Add escalation alerts for SLA breaches.
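The routing step can be pictured as a lookup from ticket type and urgency score to a queue. Everything below is a hypothetical sketch: the channel names and the urgency cutoff are assumptions, and the real IDs live in the Workflow Configuration node.

```javascript
// Hypothetical routing table: ticket type / urgency -> Slack channel.
// Channel names and the urgency cutoff are assumptions; the template keeps
// the real channel IDs in its Workflow Configuration node.
const CHANNELS = {
  billing: '#support-billing',
  technical: '#support-tech',
  escalations: '#support-escalations',
};

function routeTicket(ticket) {
  // High-urgency tickets jump straight to the escalation queue.
  if (ticket.urgencyScore >= 8) return CHANNELS.escalations;
  return CHANNELS[ticket.type] || CHANNELS.technical; // default queue
}
```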
by Samir Saci
**Tags:** Image Compression, Tinify API, TinyPNG, SEO Optimisation, E-commerce, Marketing

**Context**

Hi! I'm Samir Saci, a Supply Chain Engineer and Data Scientist based in Paris, and founder of LogiGreen. I built this workflow for an agency specialising in e-commerce to automate the daily compression of images stored in a Google Drive folder. This is particularly useful when managing large libraries of product photos, website assets, or marketing visuals that need to stay lightweight for SEO, website performance, or storage optimisation.

> Test this workflow with the free tier of the API!

📬 For business inquiries, you can find me on LinkedIn.

**Who is this template for?**

This template is designed for:
- **E-commerce managers** who need to keep product images optimised
- **Marketing teams** handling large volumes of visuals
- **Website owners** wanting automatic image compression for SEO
- **Anyone using Google Drive** to store images that gradually become too heavy

**What does this workflow do?**

This workflow acts as an automated image compressor and reporting system using Tinify, Google Drive, and Gmail. It:
- Runs every day at 08:00 using a Schedule Trigger
- Fetches all images from the Google Drive Input folder
- Downloads each file and sends it to the Tinify API for compression
- Downloads the optimised image and saves it to the Compressed folder
- Moves the original file to the Original Images archive
- Logs fileName, originalSize, compressedSize, imageId, outputUrl, and processingId into a Data Table
- After processing, retrieves all logs for the current batch
- Generates a clean HTML report summarising the compression results
- Sends the report via Gmail, including the total space saved

Here is an example from my personal folder:

Here is the report generated for these images:

P.S.: You can customise the report to match your company branding or visual identity.
🎥 **Tutorial**

A complete tutorial (with explanations of every node) is available on YouTube.

**Next Steps**

Before running the workflow, follow the sticky notes and configure the following:
- Get your Tinify API key for the free tier here: Get your key
- Replace the Google Drive folder IDs in: Input, Compressed, and Original Images
- Replace the Data Table reference with your own (fields required: fileName, originalSize, compressedSize, imageId, outputUrl, processingId)
- Add your Tinify API key in the HTTP Basic Auth credentials
- Set up your Gmail credentials and recipient email
- (Optional) Customise the HTML report in the Generate Report Code node
- (Optional) Adjust the daily schedule to your preferred time

Submitted: 18 November 2025
Template designed with n8n version 1.116.2
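The HTTP Basic Auth setup for Tinify uses the literal username `api` with your key as the password, which is Tinify's documented scheme. The sketch below only builds the header and endpoint URL, as an illustration of what the HTTP Request node sends; it does not perform the upload itself.

```javascript
// Sketch of the request the HTTP Request node makes to Tinify's shrink
// endpoint. Tinify authenticates with Basic auth: username "api", password =
// your API key. Only the header/URL construction is shown here.
const TINIFY_SHRINK_URL = 'https://api.tinify.com/shrink';

function tinifyAuthHeader(apiKey) {
  return 'Basic ' + Buffer.from('api:' + apiKey).toString('base64');
}

// In the workflow, the image bytes are sent as the POST body, and the
// compressed result is then downloaded from the URL Tinify returns.
```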
by Rahul Joshi
📘 **Description:**

This end-to-end automation transforms developer support emails into actionable FAQs and sentiment insights using Azure OpenAI GPT-4o, Gmail, Notion, Slack, and Google Sheets. It not only classifies and summarizes each email into a Notion knowledge base, but also detects sentiment and urgency, alerts the team on Slack for critical messages, and automatically replies to users with acknowledgment emails. Every failed or malformed payload is transparently logged in Google Sheets, ensuring zero message loss and full visibility into the AI pipeline. The result is a complete AI-driven customer support loop, from inbox to Notion and back to the sender.

⚙️ **What This Workflow Does (Step-by-Step)**

🟢 **Gmail Polling Trigger – Developer Support Inbox**
Continuously monitors the developer support Gmail inbox every minute for new messages. Extracts the subject, sender, and snippet to initiate AI analysis.

🔍 **Validate Email Payload (IF Node)**
Checks whether each incoming email contains valid message data (such as message ID and subject).
✅ True path: continues to AI analysis.
❌ False path: logs error details in Google Sheets for debugging.

🧠 **Configure GPT-4o Model (Azure OpenAI)**
Initializes GPT-4o as the reasoning model for semantic classification of developer support content.

🤖 **Analyze & Classify Developer Email (AI Agent)**
Interprets each email and produces structured JSON with:
- Problem summary
- FAQ category (e.g., API, Billing, UI)
- A 2–3 line solution
- An "is recurring" flag for common issues

🧹 **Parse & Clean AI JSON Output (Code Node)**
Removes Markdown code-fence formatting and safely parses GPT-4o's output into clean JSON. If parsing fails, the raw text and error message are sent to Google Sheets for review.

📘 **Save FAQ Entry to Notion Database**
Creates a new FAQ record inside Notion's "Release Notes" database. Stores the problem, category, and solution as searchable structured fields.

💬 **Announce New FAQ in Slack**
Posts a summary of the new FAQ in Slack with title, category, and answer preview. Includes a link to view the Notion record instantly for team visibility.

🧠 **Configure GPT-4o Model (Sentiment Analysis)**
Sets up another GPT-4o instance focused on understanding the tone, emotion, and urgency of each email.

❤️ **Analyze Email Sentiment & Urgency (AI Agent)**
Analyzes the email content to determine:
- Urgency: Low, Medium, High, Critical
- Sentiment: Positive, Neutral, Frustrated, Angry
- Whether an immediate response is required (Yes/No)
It also provides a short "reason" explaining the classification.

🧹 **Parse AI JSON Output – Sentiment Analysis**
Cleans and validates the JSON from the sentiment AI for consistent field names (urgency, sentiment, reason).

⚖️ **Filter Critical or High-Urgency Emails (IF Node)**
Checks whether urgency is High or Critical.
✅ True path: triggers escalation to Slack.
❌ False path: ends quietly to avoid unnecessary noise.

🚨 **Alert Team in Slack – Critical Issue**
Sends an immediate Slack alert with:
- Email snippet
- Detected urgency and sentiment
- Short justification (reason)
- CTA for urgent action
This ensures fast team response to high-priority issues.

📨 **Send Acknowledgment Email to Sender (Gmail Node)**
Automatically replies to the customer, confirming receipt and providing a short AI-generated solution summary. Thanks the user and links the response back to the knowledge base, creating a closed-loop support experience.

🪶 **Log Workflow Errors to Google Sheets**
Appends all failed validations, missing fields, and JSON parsing issues to the error log sheet. Provides a live audit trail for monitoring workflow health.
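The "Parse & Clean AI JSON Output" Code node described above typically boils down to stripping the model's Markdown fences and parsing defensively. A minimal sketch: the error-object shape is an assumption, while the behavior (return a failure record rather than crash, so it can be routed to the Google Sheets error log) follows the description.

```javascript
// Illustrative safe-parse helper for GPT-4o output. Strips Markdown fences,
// then parses; on failure it returns an error record instead of throwing, so
// the workflow can route the raw text to the error log sheet.
function safeParseAiJson(rawText) {
  const cleaned = String(rawText).replace(/```(?:json)?/g, '').trim();
  try {
    return { ok: true, data: JSON.parse(cleaned) };
  } catch (err) {
    return { ok: false, raw: rawText, error: err.message };
  }
}
```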
🧩 **Prerequisites**
- Gmail account with API access
- Azure OpenAI (GPT-4o) credentials
- Notion API integration (for the FAQ database)
- Slack API access (for team alerts)
- Google Sheets (for logging errors)

💡 **Key Benefits**
✅ Converts support emails into structured FAQs automatically
✅ Detects sentiment & urgency for faster triage
✅ Keeps the Notion knowledge base continuously updated
✅ Sends Slack alerts for critical issues instantly
✅ Maintains transparent error logs in Google Sheets

👥 **Perfect For**
- Developer Relations or Product Support teams
- SaaS companies managing large support volumes
- Teams using Gmail, Notion, and Slack for internal comms
- Startups automating customer response and knowledge creation
by Billy Christi
**Who is this for?**

This workflow is perfect for:
- Support teams and customer service departments managing Jira tickets
- Team leads and managers who need daily visibility into ticket resolution progress
- Organizations wanting to automate ticket reporting and communication
- IT departments seeking to streamline support ticket summarization and tracking

**What problem is this workflow solving?**

Manual ticket review and reporting is time-consuming and often lacks comprehensive analysis. This workflow solves those issues by:
- **Automating daily ticket analysis** by fetching, analyzing, and summarizing all tickets created each day
- **Providing intelligent summaries** using AI to extract key insights from ticket descriptions, comments, and resolutions
- **Streamlining communication** by automatically sending formatted daily reports to stakeholders
- **Saving time** by eliminating manual ticket review and report generation

**What this workflow does**

This workflow automatically fetches daily Jira tickets, analyzes them with AI, and sends comprehensive summaries via email to keep your team informed about support activities.
**Step by step:**
1. Schedule Trigger runs the workflow automatically at your chosen interval (or use the manual trigger for testing)
2. Set Project Key defines the Jira project to monitor (default: SUP project)
3. Get All Tickets fetches tickets created today from the specified project
4. Split Out extracts individual ticket data, including key, summary, and description
5. Loop Tickets processes each ticket individually through batch processing
6. Get Comments from Ticket retrieves all comments and conversations for complete context
7. Merge combines ticket data with associated comments for comprehensive analysis
8. Ticket Summarizer (AI Agent) uses OpenAI GPT-5 to generate professional summaries and proposed solutions
9. Set Output structures the AI analysis into a standardized JSON format
10. Aggregate collects all processed ticket summaries into a single dataset
11. Format Body creates a readable email format with direct Jira ticket links
12. Send Ticket Summaries delivers the daily report via Gmail

**How to set up**
1. Connect your Jira account by adding your Jira Software Cloud API credentials to the Jira nodes
2. Add your OpenAI API key to the OpenAI Chat Model node for AI-powered ticket analysis
3. Configure Gmail credentials for the Send Ticket Summaries node to deliver reports
4. Update the recipient email in the "Send Ticket Summaries" node to your desired recipient
5. Adjust the project key in the "Set Project Key" node to match your Jira project identifier
6. Configure the schedule trigger to run daily at your preferred time for automatic reporting
7. Customize the JQL query in the Jira nodes to filter tickets based on your specific requirements
8. Test the workflow using the manual trigger to ensure proper ticket fetching and AI analysis
9. Review the email formatting in the "Format Body" node and adjust as needed for your reporting style

**How to customize this workflow to your needs**
- **Modify AI prompts**: customize the ticket analysis prompt in the "Ticket Summarizer" node to focus on specific aspects like priority, resolution time, or customer impact
- **Adjust ticket filters**: change the JQL queries to filter by status, priority, assignee, or custom date ranges beyond "today"
- **Add more data points**: include additional ticket fields like priority, status, assignee, or custom fields in the analysis
- **Customize email format**: modify the "Format Body" node to change the report structure, add charts, or include additional formatting
- **Set up different schedules**: create multiple versions for different reporting frequencies (hourly, weekly, monthly)

Need help customizing? Contact me for consulting and support:
📧 billychartanto@gmail.com
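Customizing the JQL query usually means editing a string like the one below. `startOfDay()` is a standard Jira JQL function; the wrapper function itself is just an illustration of what the Jira node's filter might contain.

```javascript
// Builds the JQL used to fetch today's tickets for a project. startOfDay()
// is standard Jira JQL; the wrapper function is illustrative, not part of
// the template.
function dailyTicketJql(projectKey) {
  return `project = ${projectKey} AND created >= startOfDay() ORDER BY created DESC`;
}

// Swapping startOfDay() for startOfWeek(), or appending "AND priority = High",
// gives the custom date ranges and filters mentioned above.
```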
by Maxim Osipovs
This n8n workflow template implements an intelligent research paper monitoring system that automatically tracks new publications in ArXiv's Artificial Intelligence category, filters them for relevance using LLM-based analysis, generates structured summaries, and delivers a formatted email digest.

The system uses a three-stage pipeline architecture:
1. Automated paper retrieval from ArXiv's API
2. AI-powered relevance filtering and analysis via Google Gemini
3. Intelligent summarization with HTML formatting for clean email delivery

This eliminates the need to manually browse ArXiv daily while ensuring you only receive summaries of papers genuinely relevant to your research interests.

**What This Template Does (Step-by-Step)**
1. Runs on a configurable schedule (Tuesday-Friday) to fetch new papers from ArXiv's cs.AI category via their API.
2. Uses Google Gemini with structured output parsing to analyze each paper's relevance based on your defined criteria for "applicable AI research."
3. Generates concise, structured summaries for the three selected papers using a separate LLM call.
4. Aggregates the three relevant papers' data and summaries into a single, readable digest.

**Important Notes**
- The workflow only runs Tuesday through Friday, as ArXiv typically doesn't publish new papers on weekends.
- Customize the "Paper Relevance Analyzer" criteria to match your specific research interests.
- Adjust the similarity threshold and selection logic to control how many papers are included in each digest.

**Required Integrations:**
- ArXiv API (no authentication required)
- Google Gemini API (for relevance analysis and summarization)
- Email service (SMTP or an email provider like Gmail, SendGrid, etc.)
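The retrieval stage queries ArXiv's public export API. A sketch of the request URL the workflow's HTTP node would call: the endpoint and parameter names follow ArXiv's documented API, while the helper function and the result count are illustrative.

```javascript
// Builds the ArXiv export-API query URL for the newest papers in a category.
// Endpoint and parameter names follow ArXiv's public API; the wrapper
// function itself is an illustration.
function arxivQueryUrl(category, maxResults) {
  const params = new URLSearchParams({
    search_query: `cat:${category}`,   // e.g. cat:cs.AI
    sortBy: 'submittedDate',
    sortOrder: 'descending',
    max_results: String(maxResults),
  });
  return `http://export.arxiv.org/api/query?${params}`;
}
```

The response is an Atom XML feed, which the workflow then parses before handing each entry to the relevance analyzer.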
**Best For:**
- 🎓 Academic researchers tracking AI developments in their field
- 💼 ML practitioners and data scientists staying current with new techniques
- 🧠 AI enthusiasts who want curated, digestible research updates
- 🏢 Technical teams needing regular competitive intelligence on emerging approaches

**Key Benefits:**
✅ Automates daily ArXiv monitoring, saving 60+ minutes of manual research time
✅ Uses AI to pre-filter papers, reducing information overload by 80-90%
✅ Delivers structured, readable summaries instead of raw abstracts
✅ Fully customizable relevance criteria to match your specific interests
✅ Professional HTML formatting makes digests easy to scan and share
✅ Eliminates the risk of missing important papers in your field
by Shelly-Ann Davy
**Description**

Wake up gently. This elegant workflow runs every morning at 7 AM, picks one uplifting affirmation from a curated list, and delivers it to your inbox (with optional Telegram delivery). Zero code, zero secrets: just drop in your SMTP and Telegram credentials, edit the affirmations, and activate. Perfect for creators, homemakers, and entrepreneurs who crave intention and beauty before the day begins.

**How it works (high-level steps)**
1. Cron wakes the flow daily at 7 AM.
2. Set: Configuration stores your email, Telegram chat ID, and affirmations.
3. A Code node randomly selects one affirmation.
4. The Email node sends the message via SMTP.
5. An IF node decides whether to forward it to Telegram as well.

**Set-up time: 2–3 minutes**
- 30 s: add SMTP credential
- 30 s: add Telegram Bot credential (optional)
- 1 min: edit affirmations & email addresses
- 30 s: activate

**Detailed instructions**
All deep-dive steps live inside the yellow and white sticky notes on the canvas, so no extra docs are needed.

**Requirements**
- SMTP account (SendGrid, Gmail, etc.)
- Telegram Bot account (optional)

**Customisation tips**
- Change the Cron time or frequency
- Swap the affirmation list for quotes, verses, or mantras
- Add a Notion logger branch for journaling
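The random-selection Code node is a one-liner in spirit. A minimal sketch (the sample affirmation texts are placeholders; the real list lives in the Set: Configuration node):

```javascript
// Minimal sketch of the Code node that picks one affirmation at random.
// Sample texts are placeholders; the real list comes from the Set node.
function pickAffirmation(affirmations) {
  return affirmations[Math.floor(Math.random() * affirmations.length)];
}
```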
by masahiro hanawa
**Video Metadata Extraction & YouTube Auto-Upload with AI**

Automatically process video files, extract metadata, generate AI-optimized titles, descriptions, and tags, and upload directly to YouTube with proper categorization and thumbnail handling.

**Key Features**
- **YouTube Node Integration**: direct video upload to YouTube with full metadata
- **Binary Data Handling**: proper video and thumbnail binary processing
- **AI-Powered SEO Optimization**: generates engaging titles, descriptions, and tags
- **Video Metadata Extraction**: analyzes video properties (duration, resolution, codec)
- **Thumbnail Processing**: extracts or uploads custom thumbnails
- **Category Auto-Selection**: AI determines the optimal YouTube category

**How It Works**
1. Video Intake: receives the video file via webhook or cloud storage trigger
2. Metadata Extraction: analyzes the video file for technical properties
3. AI Content Generation: creates an SEO-optimized title, description, and tags
4. Thumbnail Processing: extracts a frame or uses the provided thumbnail
5. YouTube Upload: uploads the video with all metadata
6. Post-Upload Processing: retrieves the video ID and creates a playlist entry
7. Notification: sends a confirmation with the video URL

**Required Credentials**
- YouTube OAuth2 (for video upload)
- OpenAI API (for AI metadata generation)
- Google Drive or Dropbox (optional, for cloud storage triggers)
- Gmail (for notifications)
- Google Sheets (for tracking)

**Unique Features**
- Uses the YouTube node for direct video upload (rarely used in templates)
- **Binary data manipulation** for video and thumbnail handling
- **AI-generated SEO metadata** optimized for the YouTube algorithm
- **Category detection** using AI classification
- **Merge node** with chooseBranch for conditional flows

**Example Request**

```json
{
  "videoFile": "<binary data>",
  "projectName": "Product Demo 2024",
  "targetAudience": "developers",
  "language": "en",
  "thumbnailFile": "<binary data>",
  "playlistId": "PLxxxxxxxx",
  "publishTime": "2024-01-15T14:00:00Z"
}
```

**Supported Video Formats**
- MP4, MOV, AVI, MKV, WebM
- Maximum file size: 128 GB (YouTube limit)
- Recommended: MP4 with H.264 codec

**Output**

```json
{
  "videoId": "dQw4w9WgXcQ",
  "videoUrl": "https://youtu.be/dQw4w9WgXcQ",
  "title": "AI-Generated Title",
  "description": "SEO-optimized description...",
  "tags": ["tag1", "tag2", "tag3"],
  "category": "Science & Technology",
  "uploadStatus": "processed",
  "thumbnailUrl": "https://i.ytimg.com/vi/..."
}
```
by Asuka
**Who Is This For?**

This workflow is designed for space enthusiasts, science educators, journalists, fact-checkers, and researchers who want to stay informed about near-Earth asteroid threats while filtering out media sensationalism. It's also valuable for anyone studying how different regions cover space-related news.

**What It Does**

This workflow creates an automated planetary defense monitoring system that:
- Scans NASA's Near Earth Object database for potentially hazardous asteroids over a 7-day window
- Searches news coverage across three regions (US, Japan, EU) to compare media reporting
- Uses AI (GPT-4o-mini) to fact-check news claims against official NASA data
- Detects misinformation and measures media sensationalism levels
- Generates visual charts comparing actual threat levels vs. media panic
- Sends alerts through multiple channels (Slack, Discord, Email)
- Logs all alerts to Google Sheets for historical analysis

**How It Works**
1. Trigger: runs daily at 9 AM or on demand via webhook
2. NASA Data Fetch: retrieves the 7-day asteroid forecast from the NASA NeoWs API
3. Threat Analysis: identifies potentially hazardous asteroids and assigns alert levels (LOW/MEDIUM/HIGH)
4. News Search: searches news in the US, Japan, and EU using Apify's Google Search Scraper
5. AI Fact-Check: GPT-4o-mini compares news claims against NASA data, detecting misinformation
6. Visualization: generates gauge charts for threat level and media panic, plus a regional comparison bar chart
7. Multi-Channel Alerts: sends formatted reports to Slack, Discord, and Email, and logs to Google Sheets

**Set Up Steps**

Estimated time: 15–20 minutes
1. NASA API (required): get your free API key at api.nasa.gov
2. Apify (required): create an account and connect via OAuth
3. OpenAI (required): add your API key from platform.openai.com
4. Notification channels (choose at least one):
   - Slack: create an OAuth app and connect
   - Discord: create a webhook URL
   - Email: configure SMTP settings
5. Google Sheets (optional): create a sheet for logging with columns: Date, Alert Level, Hazardous Count, Threat Score, Media Panic Score, Misinformation Detected, Top Asteroid, Most Accurate Region

**Requirements**
- NASA API key (free)
- Apify account (free tier available)
- OpenAI API key (paid)
- At least one notification channel configured
- n8n version 1.0+

**How to Customize**
- **Change scan frequency**: modify the Schedule Trigger node
- **Add more regions**: edit the "Configure Regional Search" code node
- **Adjust alert thresholds**: modify the lunar distance threshold (currently 10) in "Analyze Asteroid Threats"
- **Disable channels**: simply remove connections to notification nodes you don't need
- **Customize messages**: edit the "Format Multi-Channel Messages" node
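The threat-analysis step assigns LOW/MEDIUM/HIGH alert levels using a lunar-distance cutoff of 10, per the customization note. The exact mapping below is an assumption; only the threshold value comes from the template.

```javascript
// Illustrative alert-level mapping. The lunar-distance threshold of 10 is the
// value named in the template's customization notes; the rest of the mapping
// (which level each case receives) is an assumed sketch.
const LUNAR_DISTANCE_THRESHOLD = 10;

function alertLevel(missDistanceLunar, isPotentiallyHazardous) {
  if (!isPotentiallyHazardous) return 'LOW';
  // A hazardous object passing within the threshold gets the top level.
  if (missDistanceLunar < LUNAR_DISTANCE_THRESHOLD) return 'HIGH';
  return 'MEDIUM';
}
```

Raising or lowering `LUNAR_DISTANCE_THRESHOLD` is the "Adjust alert thresholds" customization described above.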
by Rahul Joshi
📘 **Description:**

This workflow automates the creation, storage, and reporting of personalized sales collateral for booked leads using GPT-4o, Google Sheets, Google Drive, and Gmail. It pulls leads from a central sheet, filters the booked ones, generates AI-written sales materials (summary, one-pager, and proposal), uploads the output to Drive, updates the sheet with proposal links, and emails a consolidated HTML summary to the marketing inbox. It serves as a full-cycle, AI-powered outreach content generator that transforms structured lead data into ready-to-use collateral in minutes.

⚙️ **What This Workflow Does (Step-by-Step)**

▶️ **When Clicking 'Execute Workflow' (Manual Trigger)**
Starts the automation manually, fetching the latest lead records for batch processing.

📊 **Retrieve Lead Records from Google Sheets**
Pulls all lead details (company name, contact, email, booking status, etc.) from the outreach automation sheet used as the CRM base.

🧩 **Validate Lead Data Payload**
Checks each row for a valid email format.
✅ Valid entries proceed to the booking filter.
❌ Invalid ones are logged to an error sheet.

⚠️ **Log Invalid Leads to Google Sheets**
Stores incomplete or malformed lead data in a separate tab for cleanup without interrupting execution.

🎯 **Filter for Booked Leads**
Isolates leads marked as BOOKED, the confirmed clients eligible for personalized collateral generation.

⚙️ **Configure GPT-4o Model (Azure OpenAI)**
Initializes the GPT-4o model to generate tailored text content based on lead data (company, title, industry, etc.).

🧠 **Generate Sales Collateral (AI)**
Uses GPT-4o to produce three structured assets per lead:
1️⃣ Sales Summary: a concise 80-word follow-up note.
2️⃣ One-Pager: headline + three selling points + CTA.
3️⃣ Proposal Draft: introduction, scope, timeline, and next steps.
All outputs are returned as structured JSON for parsing.

🧹 **Parse AI JSON Output**
Cleans and normalizes GPT-4o responses, ensuring JSON integrity and consistency across all generated materials.
📄 **Convert Collateral into Text Reports**
Compiles each lead's collateral into a .txt report containing all three sections. Formatting uses clean dividers and labeled blocks for readability.

☁️ **Upload Sales Collateral to Google Drive**
Uploads each generated file to the "collatral data" Drive folder. Returns both view and download links for each report.

🔗 **Map Uploaded Files with Lead Data**
Cross-references uploaded files with corresponding leads using index mapping. Prepares structured data with Email, ProposalLink, and timestamps.

✅ **Update Lead Record with Proposal Link**
Updates the source Google Sheet, attaching each lead's proposal link for traceability and internal access.

🗂️ **Aggregate Uploaded File Metadata**
Compiles an HTML-formatted list of uploaded reports (file names and links). Calculates the total number of processed leads for the summary section.

✉️ **Generate Sales Summary Email (AI)**
Uses GPT-4o to create a clean HTML report section containing:
- Total booked leads processed
- A linked list of uploaded files
- A short insights paragraph summarizing sales activity

📧 **Send Sales Summary Email via Gmail**
Delivers the HTML report to the internal inbox (e.g., newscctv22@gmail.com) with the subject "Sales Collateral Summary." The email is formatted for Gmail/Outlook rendering and ready for forwarding to management.

🧩 **Prerequisites**
- Google Sheets and Drive OAuth setup (Techdome account)
- Azure OpenAI GPT-4o credentials
- Gmail integration for report delivery

💡 **Key Benefits**
✅ Eliminates manual collateral drafting for booked leads
✅ Auto-updates CRM sheets with proposal links
✅ Generates consistent, professional B2B materials in real time
✅ Provides an instant HTML summary for daily or weekly reporting
✅ Ensures full traceability of every proposal created

👥 **Perfect For**
- B2B marketing and pre-sales teams
- Agencies managing client acquisition pipelines
- Business development operations using Google Sheets as a CRM
- Teams seeking AI-driven, hands-off collateral generation and reporting
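The "Validate Lead Data Payload" step above is essentially an email-format test per row. A minimal sketch: the regex is a common lightweight pattern and an assumption, not the exact expression used by the template's IF node.

```javascript
// Lightweight email-format check, as the payload-validation step might do it.
// The regex is a common permissive pattern (something@something.tld) and an
// assumption, not the template's exact expression.
function isValidEmail(value) {
  return typeof value === 'string' && /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(value.trim());
}
```

Rows failing this check would take the ❌ path and be appended to the error sheet, while valid rows continue to the booking filter.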