by Baptiste Fort
📘 Workflow Documentation – Stock Market Daily Digest

👋 Introduction

Wake up to a clean, analyst-style stock digest in your inbox—top gainers/losers, a readable performance table, 3–5 insights, and upcoming events—no spreadsheets, no manual scraping, no copy-paste. This article explains, step by step, how to build a robust, daily, end-to-end automation that collects market data (Bright Data), waits until scraping is done, aggregates results, asks an AI model (OpenAI) to draft a styled HTML email, logs everything to Airtable, and finally sends the report via Gmail. You'll find a friendly but technical tour of every single node, so you can rebuild or adapt the same pipeline with confidence.

🎯 Who is this workflow for?

- **Investors & traders** who want a quick, readable daily summary.
- **Finance/Product teams** building data-driven alerts/digests.
- **Consultants & agencies** sending recurring client updates.
- **Automation builders** prototyping finance ops quickly.

🧰 Tools you'll need

- **Bright Data** — dataset triggers & snapshots for reliable web data.
- **OpenAI (GPT)** — to generate a professional HTML digest.
- **Airtable** — store daily rows for history, filters, dashboards.

Example Airtable Table: Daily Stocks

| Ticker | Company | Price | Change % | Sentiment | Date |
|--------|--------------------------|---------|----------|-----------|---------------------|
| AAPL | Apple Inc. | 225.80 | +1.4% | 🟢 Positive | 2025-09-18 09:00 |
| MSFT | Microsoft Corporation | 415.20 | -0.7% | 🔴 Negative | 2025-09-18 09:00 |
| NVDA | NVIDIA Corporation | 124.55 | +2.1% | 🟢 Positive | 2025-09-18 09:00 |
| TSLA | Tesla Inc. | 260.00 | -3.0% | 🔴 Negative | 2025-09-18 09:00 |
| META | Meta Platforms Inc. | 310.45 | +0.5% | 🟡 Neutral | 2025-09-18 09:00 |

- **Gmail** — deliver the final HTML email to stakeholders.
- **n8n** — the automation engine that orchestrates every step.

> Keep API keys in n8n Credentials (never hard-code secrets).
🗺️ Architecture at a glance

1. Schedule fires daily
2. Seed list of tickers
3. Split into one item per stock
4. Prepare keyword for scraping
5. Launch Bright Data job
6. Poll progress with a wait-loop
7. Fetch snapshot data
8. Aggregate for the AI
9. Generate HTML summary (GPT)
10. Save rows to Airtable
11. Send email via Gmail

⚙️ Step-by-step — Every node explained

⏰ Daily Run Trigger (Schedule Trigger)

**Purpose**: Start the automation at a precise time each day so nobody needs to push a button.

**Parameters**
- **Trigger Type**: Time Interval or Cron
- **Every X**: 1 Day (or your preferred cadence)
- **Timezone**: UTC (or your own)
- **Start Time**: optional (e.g., 09:00)

📝 Set Stock List (Set Node – SAMPLE DATA)

**Purpose**: Define the universe of stocks to monitor. This acts as the seed data for scraping.

**Parameters**
- **Values to Set**: Fixed JSON (array of objects)
- **Keep Only Set**: true
- **Fields per item**: ticker, name, market_cap (you may add sector, isin, etc.)

🔀 Split Stocks (Split Out)

**Purpose**: Turn the array into individual items so each ticker is processed independently (scraping, polling, results).

**Parameters**
- **Operation**: Split Out Items
- **Field to Split**: the array defined in the previous Set node

🏷 Prepare Stock Keyword (Set Node)

**Purpose**: Create a keyword field (typically equal to ticker) for Bright Data discovery.

**Parameters**
- **Values to Set**: Add Field
- **Field Name**: keyword
- **Value**: use an expression referencing the current item's ticker (e.g., `{{ $json.ticker }}`)

🕸 Bright Data Scraper (HTTP Request)

**Purpose**: Trigger the Bright Data dataset to start collecting information for the keyword. Returns a snapshot_id to poll later.

**Parameters**
- **Method**: POST
- **Endpoint**: https://api.brightdata.com/datasets/v1/trigger
- **Authentication**: Authorization: Bearer <token> (header)
- **Body Fields**:
  - dataset_id: your Bright Data dataset ID
  - discover_by: usually keyword
  - keyword: the value prepared above

> Add a retry/backoff policy on 429/5xx in node options.
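The launch-then-poll pattern that follows (trigger now, then Check → Wait → Check until the snapshot is ready) can be condensed into one helper to make the control flow explicit. This is a sketch only: in the workflow the loop is built from the Switch and Wait nodes, and `checkStatus`/`wait` stand in for the HTTP Request and Wait nodes.

```javascript
// Sketch of the Check → Wait → Check loop. `checkStatus` stands in for the
// GET /snapshots/{snapshot_id} request and `wait` for the Wait node.
function pollUntilReady(checkStatus, wait, maxAttempts = 20) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    if (checkStatus().status === 'ready') return attempt; // proceed to fetch results
    wait(); // e.g. pause 30 seconds between checks
  }
  throw new Error('Snapshot never became ready: alert and stop the run');
}

// Simulated Bright Data responses: two "running" polls, then "ready".
const statuses = [{ status: 'running' }, { status: 'running' }, { status: 'ready' }];
const attempts = pollUntilReady(() => statuses.shift(), () => {});
console.log(`ready after ${attempts} checks`); // → ready after 3 checks
```

Capping the number of attempts matters in the real workflow too: without a bound, a stuck snapshot would loop forever.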
🔄 Check Scraper Progress (HTTP Request)

**Purpose**: Poll Bright Data to see whether the snapshot is running or ready.

**Parameters**
- **Method**: GET
- **Endpoint**: https://api.brightdata.com/datasets/v1/snapshots/{snapshot_id}
- **Authentication**: Authorization: Bearer <token>
- **Expected Output**: a status field (running, ready)

⏳ Wait for Data (Wait Node)

**Purpose**: Pause between progress checks to avoid rate limits and give Bright Data time to finish.

**Parameters**
- **Mode**: Wait a fixed amount of time
- **Time**: e.g., 30 seconds (tune to your dataset size)

🔀 Scraper Status Switch (Switch Node)

**Purpose**: Route logic based on the polled status.

**Parameters**
- **Value to Check**: status
- **Rules**:
  - Equals running → go to Wait for Data (then re-check)
  - Equals ready → proceed to Fetch Scraper Results

> Loop pattern: Check → Wait → Check, until ready.

📥 Fetch Scraper Results (HTTP Request)

**Purpose**: Download the completed snapshot data once Bright Data marks it ready.

**Parameters**
- **Method**: GET
- **Endpoint**: https://api.brightdata.com/datasets/v1/snapshots/{snapshot_id}/data
- **Authentication**: Authorization: Bearer <token>
- **Query**: format=json
- **Output**: array of rows per ticker (price, change %, any fields your dataset yields)

> Normalize fields with a Set/Code node if needed.

📊 Aggregate Stock Data (Aggregate Node)

**Purpose**: Combine all individual items into one consolidated object so the AI can analyze the entire market snapshot.

**Parameters**
- **Mode**: Aggregate (merge to a single item)
- **Fields to Include**: ticker, name, price, change, sentiment (plus any extra fields captured)
- **Output**: one JSON item containing an array/map of the day's stocks

🤖 Generate Daily Summary (AI Node – OpenAI)

**Purpose**: Ask the model to convert raw data into a styled HTML email: headline, top movers, table, insights, and (optional) upcoming events.
**Parameters**
- **Model**: gpt-4.1
- **Input**: the aggregated JSON from the previous node
- **Prompt guidelines**:
  - Output HTML only with inline styles (email-safe)
  - Include a table (Ticker, Company, % Change with ↑/↓ & color, Market Cap, Sentiment icon)
  - Highlight top 2 gainers & 2 losers with short reasoning if present
  - Provide 3–5 insights (sector rotation, volatility, outliers)
  - Add upcoming events when available (earnings, launches, macro)
  - Footer: "Generated automatically by your AI-powered stock monitor"
- **Output field**: confirm the exact property that contains the HTML (e.g., output, message, text)

🗂 Save to Airtable (Airtable – Create Record)

**Purpose**: Log each item (or the roll-up) to Airtable for history, filtering, and dashboards.

**Parameters**
- **Operation**: Create Record
- **Base ID**: from your Airtable URL
- **Table**: e.g., Daily Stocks
- **Field Mapping**:
  - Ticker ← `{{ $json.ticker }}`
  - Company ← `{{ $json.name }}`
  - Price ← `{{ $json.price }}`
  - Change % ← `{{ $json.change }}`
  - Sentiment ← `{{ $json.sentiment }}`
  - Date ← `{{ $now.toISO() }}`

> Use a Single-Select for Sentiment (🟢 / 🟡 / 🔴) to build clean Airtable views.

📧 Send Report via Gmail (Gmail Node)

**Purpose**: Deliver the AI-generated HTML digest to your recipients.

**Parameters**
- **Operation**: Send Email
- **Send To**: one or more recipients (e.g., investor@domain.com)
- **Subject**: Daily Stock Market Digest – {{ $now.format("yyyy-MM-dd") }}
- **Message (HTML)**: reference the AI node's HTML property (e.g., `{{ $('Generate Daily Summary').first().json.output }}`)
- **Options**: set **Append Attribution** to false (keep the email clean)

> Test in Gmail, Outlook, and mobile to validate inline CSS.

🧪 Error handling & reliability tips

- **Backoff on Bright Data** — If scraping many tickers, increase **Wait** or batch requests.
- **Guard against empty results** — If a snapshot returns 0 rows, branch to a fallback email ("No data today").
- **AI guardrails** — Enforce "HTML-only" and skip missing sections gracefully.
- **Airtable normalization** — Strip %, cast numbers to float before insert.
- **Observability** — Add a final Slack/Email **On Fail** node with run ID and error message.

🧩 Customization ideas

- **Sector deep-dives**: add sector fields and a second AI paragraph on sector rotation.
- **CSV attachment**: generate & attach a CSV for power users.
- **Multiple lists**: run parallel branches for Tech, Healthcare, or regions.
- **Other asset classes**: Crypto, ETFs, Indices, FX.
- **Audience targeting**: different "To" lists and slightly different prompts per audience.

✅ Why this workflow is powerful

- **Hands-off** — the report simply shows up every day.
- **Analyst-grade** — clean HTML, top movers, tidy table, actionable insights.
- **Auditable** — rows archived in Airtable for history and dashboards.
- **Composable** — swap scrapers, LLMs, storage, or email service.
- **Scalable** — start with 10 tickers, grow to many lists using the same loop.

For advanced no-code & AI projects, see 0vni – Agence automatisation.
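As a concrete illustration of the Airtable normalization tip (strip %, cast numbers to float), here is a Code-node sketch for one scraped row; the field names are assumptions and should match your dataset:

```javascript
// Code-node sketch: strip "%" and cast strings to floats before the
// Airtable insert. Field names (price, change) are illustrative.
function normalizeRow(row) {
  return {
    ...row,
    price: parseFloat(row.price),
    change: parseFloat(String(row.change).replace('%', '')), // "+1.4%" → 1.4
  };
}

const clean = normalizeRow({ ticker: 'AAPL', price: '225.80', change: '+1.4%' });
console.log(clean.price, clean.change); // 225.8 1.4
```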
by PDF Vector
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Automated Academic Paper Monitoring

Stay updated with the latest research in your field. This bot monitors multiple academic databases for new papers matching your interests and sends personalized alerts.

Bot Features:
- Monitor keywords across multiple databases
- Filter by authors, journals, or institutions
- Daily/weekly digest emails
- Slack notifications for high-impact papers
- Automatic paper summarization

Workflow Components:
- Schedule: Run daily/weekly checks
- Search: Query latest papers across databases
- Filter: Apply custom criteria
- Summarize: Generate paper summaries
- Notify: Send alerts via email/Slack
- Archive: Store papers for future reference

Perfect For:
- Research groups tracking their field
- PhD students monitoring specific topics
- Labs following competitor publications
by Mihai Farcas
This n8n template demonstrates how to deploy an AI workflow in production while simultaneously running a robust, data-driven Evaluation Framework to ensure quality and optimize costs.

Use Cases

- Model Comparison: Quickly A/B test different LLM models (e.g., Gemini 3 Pro vs. Flash Lite) for speed and cost efficiency against your specific task.
- Prompt Regression: Ensure that tweaks to your system prompt do not introduce new errors or lower the accuracy of your lead categorization.
- Production Safety: Guarantee that test runs never trigger real-world actions like sending emails to a client or sales team.

Requirements

- A configured Gmail Trigger (or equivalent email trigger).
- A Google Gemini account for the LLM models.
- An n8n Data Table containing your "Golden Dataset" of test cases and ground truths.

How it Works

The workflow contains two distinct, parallel execution paths:

Production Path:
1. The Gmail Trigger monitors for new emails.
2. The email text is routed through the Sentiment Analysis node, which categorizes the lead as Positive, Neutral, or Negative.
3. Check if Evaluating nodes verify the current execution mode. If it is not an evaluation run (the Fail branch), the lead is routed to the corresponding Send Email node for action.

Evaluation Path:
1. The When fetching a dataset row trigger pulls test cases (input text and expected sentiment/ground truth) from an n8n Data Table.
2. Each test case loops through the same Sentiment Analysis node.
3. The Check if Evaluating nodes route this path to the Success branch, skipping the real email actions.
4. The Save Output node writes the model's prediction to the Data Table.
5. The Set Metrics node uses the Categorization metric to compare the prediction against the ground truth, returning a score (0 or 1) to measure accuracy.
Key Technical Details

- Model Switching: Multiple Google Gemini Chat Model nodes are connected via the Model input on the Sentiment Analysis node, allowing you to easily swap and compare models without changing the core logic.
- Edge Case Handling: The System Prompt Template in the Sentiment Analysis node is customized to handle tricky inputs, such as negative feedback about a competitor that should be classified as a Positive lead.
- Metrics: The workflow uses the built-in Categorization metric, which is ideal for classification tasks like sentiment analysis, to provide objective evidence of performance.
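As a rough sketch of what the Categorization metric computes: an exact-match comparison between prediction and ground truth, scored 0 or 1. The case/whitespace normalization shown here is an assumption; n8n's built-in implementation may differ.

```javascript
// Exact-match scoring (0 or 1) between prediction and ground truth,
// with simple case/whitespace normalization.
function categorizationScore(prediction, groundTruth) {
  return prediction.trim().toLowerCase() === groundTruth.trim().toLowerCase() ? 1 : 0;
}

// Aggregating per-row scores gives the accuracy over the Golden Dataset.
const rows = [
  { prediction: 'Positive', groundTruth: 'positive' },
  { prediction: 'Neutral', groundTruth: 'Negative' },
];
const accuracy =
  rows.reduce((sum, r) => sum + categorizationScore(r.prediction, r.groundTruth), 0) / rows.length;
console.log(accuracy); // 0.5
```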
by Jitesh Dugar
Transform your order fulfillment process with complete invoice automation. This workflow automatically generates professional PDF invoices from Jotform orders and delivers them to customers while keeping organized records.

What This Workflow Does

✅ Receives order data from Jotform Trigger
✅ Generates professional HTML invoice with your branding
✅ Converts to PDF using HTML to PDF conversion
✅ Saves invoice to Google Drive for record-keeping
✅ Emails PDF invoice to customer automatically

Workflow Steps

1. Jotform Trigger - Captures order data when a customer places an order
2. Format Invoice Data - Prepares and structures line item data
3. Generate HTML Invoice - Creates custom branded HTML invoice
4. Generate PDF Invoice - Converts HTML to professional PDF format
5. Download File PDF - Prepares PDF for distribution
6. Save to Google Drive - Archives invoice for your records
7. Email to Customer - Sends invoice PDF directly to customer's inbox

Requirements

- Jotform account with order form set up (sign up for free here)
- PDF generation API key (get yours at pdfmunk.com)
- Google Drive connection for storage
- Email service connection (Gmail, SMTP, etc.)

Benefits

- **Save time** - Eliminate manual invoice creation
- **Professional branding** - Customize invoice template to match your brand
- **Organized records** - All invoices automatically saved to Google Drive
- **Better customer experience** - Instant invoice delivery after order placement
- **Scalable** - Handles unlimited orders without additional work
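The Format Invoice Data step can be sketched as a small Code-node function. The Jotform field names used here are illustrative assumptions:

```javascript
// Code-node sketch: normalize Jotform line items and compute the total
// before injecting them into the HTML invoice template.
function formatInvoiceData(order) {
  const items = order.items.map((i) => ({
    description: i.name,
    qty: Number(i.quantity),
    unitPrice: Number(i.price),
    lineTotal: Number(i.quantity) * Number(i.price),
  }));
  const total = items.reduce((sum, i) => sum + i.lineTotal, 0);
  return { customer: order.customer, items, total: total.toFixed(2) };
}

const invoice = formatInvoiceData({
  customer: 'Jane Doe',
  items: [
    { name: 'Widget', quantity: '2', price: '9.99' },
    { name: 'Gadget', quantity: '1', price: '40.00' },
  ],
});
console.log(invoice.total); // "59.98"
```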
by Fabian Perez
This workflow automates your entire lead follow-up process across email, SMS, and WhatsApp. It starts on a schedule and pulls your latest leads from FollowUpBoss (FUB), checking when the workflow last ran. Each new contact is automatically validated — phone numbers and emails are cleaned, filtered, and checked for duplicates before sending any message.

Once validated, the system intelligently decides how to reach each lead:

- 💬 Email + SMS if all data looks good
- 📧 Email only if phone is invalid
- 📱 SMS/WhatsApp only if email is missing

Each message is personalized using data from the lead record, and everything is tracked back in your database for future reporting. This template helps agents, marketing teams, and CRM users run consistent follow-ups without missing a single contact. Whether you manage 10 or 10,000 leads, this flow scales effortlessly.

Tools used: FollowUpBoss, Gmail, Twilio/WhatsApp, n8n

(Tip: Replace your API keys and Gmail credentials before running.)
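The channel-routing decision described above can be sketched as a Code-node function. The validation rules shown are simplified placeholders, not the template's exact logic:

```javascript
// Sketch of the channel decision: Email + SMS, Email only, or SMS only,
// depending on which contact fields survive validation.
function chooseChannels(lead) {
  const hasEmail = /\S+@\S+\.\S+/.test(lead.email || '');
  const hasPhone = /^\+?\d{10,15}$/.test((lead.phone || '').replace(/[\s()-]/g, ''));
  if (hasEmail && hasPhone) return ['email', 'sms'];
  if (hasEmail) return ['email'];
  if (hasPhone) return ['sms'];
  return []; // nothing valid: skip the lead
}

console.log(chooseChannels({ email: 'lead@example.com', phone: '+1 (555) 123-4567' }));
// → [ 'email', 'sms' ]
```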
by Rahul Joshi
Description

This workflow automates employee retention analytics by combining candidate performance data with trait-level retention statistics. It scores candidates, validates data, and generates a polished Retention Digest HTML email using GPT (Azure OpenAI). Hiring managers receive structured insights weekly, highlighting top/weak traits, candidate scores, and actionable JD refinement tips.

What This Template Does (Step-by-Step)

⚡ **Manual Trigger** – Starts workflow execution on demand.

📑 **Candidate Data Fetch (Google Sheets – Hires Tracking)** – Pulls candidate-level details like name, role, traits, start date, and retention status.

📑 **Trait Summary Fetch (Google Sheets – Retention Summary)** – Fetches aggregated trait-level retention statistics, including hires, stayed, left, retention %, and weight adjustments.

🔀 **Merge Candidate + Trait Data** – Combines both datasets into a unified stream for scoring.

🧮 **Candidate Scoring & Data Normalization (Code Node)** – Cleans and standardizes data, builds a trait → weight map, calculates each candidate's Candidate_Score, and outputs normalized JSON.

✅ **Data Validation (If Node)** – Ensures both candidate and trait datasets are present. TRUE → continues to AI digest generation. FALSE → routes to error logging.

⚠️ **Error Handling Logic (Google Sheets – Error Log)** – Logs any failed or incomplete runs into a dedicated error sheet for auditing.

🧠 **AI Processing Backend (Azure OpenAI)** – Prepares candidate + trait data for GPT processing.
🤖 **Retention Digest Generator (LLM Chain)** – Uses GPT (gpt-4o-mini) to create a structured HTML Retention Digest, including:
- TL;DR summary
- Top Traits (positive retention)
- Weak Traits (negative retention)
- Candidate highlights (scores & retention status)
- 3 actionable JD refinement tips

📧 **Email Delivery (Gmail)** – Sends the digest directly to hiring managers as a styled HTML email with the subject: Retention Analysis Digest – Weekly Update

Prerequisites

- Google Sheets (Hires Tracking + Retention Summary + Error Log)
- Gmail API credentials
- Azure OpenAI access (gpt-4o-mini model)
- n8n instance (self-hosted or cloud)

Key Benefits

✅ Automates retention analytics & reporting
✅ Provides AI-powered insights in structured HTML
✅ Improves hiring strategy with trait-based scoring
✅ Reduces manual effort in weekly retention reviews
✅ Ensures reliability with error handling & validation

Perfect For

- HR & Recruitment teams monitoring post-hire retention
- Organizations optimizing job descriptions & hiring strategy
- Talent analytics teams needing automated, AI-driven insights
- Stakeholders requiring clear weekly digest emails
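The Candidate Scoring & Data Normalization step can be sketched as follows; the trait → weight map and the additive scoring formula are assumptions based on the description above:

```javascript
// Sketch: build a trait → weight map from the Retention Summary rows,
// then score each candidate as the sum of their trait weights.
function scoreCandidates(candidates, traitStats) {
  const weights = Object.fromEntries(traitStats.map((t) => [t.trait, t.weight]));
  return candidates.map((c) => ({
    ...c,
    Candidate_Score: c.traits.reduce((sum, t) => sum + (weights[t] ?? 0), 0),
  }));
}

const scored = scoreCandidates(
  [{ name: 'A. Patel', traits: ['Curiosity', 'Grit', 'Unknown Trait'] }],
  [
    { trait: 'Curiosity', weight: 2 },
    { trait: 'Grit', weight: 3 },
    { trait: 'Impatience', weight: -1 },
  ],
);
console.log(scored[0].Candidate_Score); // 5 (unknown traits score 0)
```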
by Avkash Kakdiya
How it works

This workflow automatically creates daily AI-generated videos for any niche. It generates a short script, converts it into a cinematic video prompt, and produces an 8-second video with Veo 3. The workflow waits for the video, downloads it, and sends it via Gmail with a ready-to-post social media description. You can customize the script prompt to match any industry or topic.

Step-by-step

1. Trigger the workflow
   - **Daily Trigger** – Starts the workflow automatically every day.
2. Generate content
   - **Generate Script** – Creates a short, engaging script for your chosen niche.
   - **Generate Veo3 Prompt** – Turns the script into a cinematic video prompt for Veo 3.
   - **Social Media Description** – Writes an SEO-friendly description for LinkedIn, Instagram, and YouTube.
3. Generate the video
   - **Create Video** – Sends the prompt to Veo 3 for video generation.
   - **Wait for Video** – Pauses until video processing is complete.
   - **Status** – Checks whether the video is ready.
   - **If** – Loops until the video is successfully generated.
4. Download and share
   - **Download Video** – Fetches the completed video file.
   - **Send a message** – Emails the video with the social media description attached.

Why use this?

- Create short, engaging videos in any niche automatically.
- Combine scriptwriting, video creation, and content delivery in one workflow.
- Save time by eliminating manual editing and waiting.
- Ensure consistent, professional social content for multiple platforms.
- Flexible for marketing, education, news, product updates, and more.
by Meak
Auto-Post Instagram Carousels from Google Sheets + Drive (Cloudinary + IG Graph)

This workflow checks your Google Sheet for "Carousel" posts to do, pulls images from a Drive folder, uploads them to Cloudinary, creates an Instagram carousel, publishes it, and marks the row as "Processed".

Benefits

- Hands-off posting from a simple Google Sheet queue
- Pulls all images from a Drive folder for each carousel
- Uses Cloudinary for fast, reliable hosting
- Posts via Instagram Graph API (official)
- Updates your Sheet status to "Processed" after publish

How It Works

1. Schedule Trigger runs every few minutes.
2. Get Execution for Carousel reads rows where Status = ToDo and Type = Carousel.
3. Get image list loads all files from the Drive folder in that row.
4. Download Image fetches each file from Drive.
5. Upload to Cloudinary stores the image and returns a public URL.
6. Setup for Instagram prepares access_token, ig_user_id, image_url, caption.
7. Create Media Container (Image) creates an IG container for each image.
8. Combine containers collects all container IDs.
9. Create Media Container (Carousel) builds one carousel with the children IDs + caption.
10. Publish Instagram Carousel publishes the carousel post.
11. Update Execute sets Status = Processed for that ExecuteId in Sheets.
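The "Combine containers" and "Create Media Container (Carousel)" steps map onto a single Instagram Graph API call, roughly like this (the Graph API version, user ID, and container IDs below are placeholders):

```javascript
// Sketch of the carousel container request: child container IDs are
// joined into the `children` parameter of the IG Graph API call.
function buildCarouselRequest(igUserId, containerIds, caption, accessToken) {
  return {
    method: 'POST',
    url: `https://graph.facebook.com/v19.0/${igUserId}/media`,
    body: {
      media_type: 'CAROUSEL',
      children: containerIds.join(','),
      caption,
      access_token: accessToken,
    },
  };
}

const req = buildCarouselRequest('17890000000000000', ['111', '222', '333'], 'My caption', '<token>');
console.log(req.body.children); // "111,222,333"
```

The publish step then posts the returned carousel container ID to the account's media_publish edge, which is what the final HTTP node does.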
Who Is This For

- Social media managers batching carousels
- Agencies posting client content on a schedule
- Creators who organize posts in Google Sheets

Setup

- Google Sheets: set the Spreadsheet and "Execute" tab (with ExecuteId, Folder, Expected content, Status, Type)
- Google Drive: put carousel images in the folder referenced by the row
- Cloudinary: set cloud name + upload_preset
- Instagram Graph: get ig_user_id and a valid access_token
- In n8n: paste tokens/IDs into the Set nodes and HTTP nodes as shown

Tips

- Keep image order in Drive as you want it to appear (or sort before upload)
- Caption comes from the Sheet field Expected content
- Make sure the IG account is a Business/Creator account connected to a Facebook Page
- Use high-res images; Cloudinary will optimize delivery
- Add error alerts (Slack/Email) if a step fails

ROI

- Save 2–4 hours/week on manual uploads
- Fewer posting mistakes (everything logged in Sheets)
- Scales easily to multiple brands and calendars

Strategy Insights

- Add a "Schedule_at" column and delay publishing until that time
- Write back the IG post ID to Sheets for tracking
- Extend to cross-post (e.g., Facebook Page) with the same media containers

Check Out My Channel

For more practical automation workflows for content teams, check out my YouTube channel where I share the exact systems I use to run social posting at scale.
by Davide
This workflow automates the creation of AI-generated viral selfie images with celebrities using Nano Banana Pro Edit via RunPod, generates engaging social media captions, and publishes the content to Instagram via Postiz. It starts with a form submission where the user provides an image URL, a custom prompt, and an aspect ratio.

| START | RESULT |
|-------|--------|
|       |        |

Key Advantages

1. ✅ **Full Automation, Zero Manual Effort** — From image generation to caption writing and publishing, the entire process is automated. This drastically reduces production time and eliminates repetitive manual tasks.
2. ✅ **Scalable Content Creation** — The workflow can handle unlimited submissions, making it ideal for creators, agencies, growth teams, and SaaS products offering AI-generated content.
3. ✅ **Consistent Viral Quality** — By using a dedicated AI content agent with strict guidelines, every post is optimized for engagement, consistent in tone and quality, and designed to maximize comments, shares, and saves.
4. ✅ **No Technical Skills Required for End Users** — The form-based entry point allows anyone to generate high-quality, celebrity-style content without understanding AI, APIs, or automation.
5. ✅ **Multi-Tool Integration in One Pipeline** — The workflow seamlessly connects AI image generation (RunPod), AI content intelligence (Google Gemini), asset storage (Google Drive), and social media distribution (Postiz).
6. ✅ **Brand-Safe and Platform-Native Output** — The captions are written to feel human and authentic, avoiding obvious AI language, overuse of emojis, and mentions of AI generation. This increases trust and platform compatibility.
7. ✅ **Perfect for Growth and Monetization** — This workflow is ideal for viral growth experiments, personal brand scaling, automated influencer-style content, and AI-powered SaaS or lead magnets.

How it works

After the form is submitted, the workflow:

- Sends the image and prompt to RunPod's Nano Banana Pro Edit API for AI image generation.
- Periodically checks the generation status until it is completed.
- Once the image is ready, it is downloaded and analyzed by Google Gemini to generate a viral-ready Instagram caption and hashtags.
- The final image is uploaded to Google Drive and to Postiz for social media publishing.
- The caption and image are combined and scheduled for posting on Instagram through the Postiz integration.

The process includes conditional logic, waiting intervals, and error handling to ensure reliable execution from input to publication.

Set up steps

To use this workflow in n8n:

1. Configure credentials:
   - Add RunPod API credentials under httpBearerAuth named "Runpods".
   - Set up Google Gemini (PaLM) API credentials for caption generation.
   - Add Postiz API credentials for social media posting.
   - Configure Google Drive OAuth2 credentials for image backup.
2. Prepare nodes:
   - Ensure the Form Trigger node is properly set up with the required fields: IMAGE_URL, PROMPT, and FORMAT.
   - Update the RunPod API endpoints in the "Generate selfie" and "Get status clip" nodes if needed.
   - Verify the Google Drive folder ID in the "Upload file" node.
   - Replace XXX in the "Upload to Social" node with a valid Postiz integration ID.
3. Test the flow:
   - Use the pinned test data in the "On form submission" node to simulate a form entry.
   - Activate the workflow and submit the form to trigger the process.
   - Monitor execution in n8n's workflow view to ensure all nodes run successfully.

👉 Subscribe to my new YouTube channel. Here I'll share videos and Shorts with practical tutorials and FREE templates for n8n.

Need help customizing? Contact me for consulting and support or add me on Linkedin.
by WeblineIndia
AI Nutrition Tracker with Webhook, OpenAI, Google Sheets & Slack Alerts

This workflow is an AI-powered nutrition and fitness tracking system built in n8n. It receives user input via webhook, analyzes food or activity using AI, updates daily calorie and protein intake in Google Sheets, and sends alerts via Slack when limits or risks are detected. It also resets daily metrics automatically using a scheduled trigger.

Quick Start Guide (Fast Implementation)

1. Connect your Google Sheets account with the required columns.
2. Configure OpenAI API credentials.
3. Set up Slack integration for alerts.
4. Deploy the Webhook node for receiving user input.
5. Enable the Schedule Trigger for daily reset.
6. Test by sending a sample message (food/activity) to the webhook.

What It Does

This workflow automates real-time nutrition tracking by processing user input such as food consumption or physical activities. When a user sends a message through a webhook, the system fetches their profile data from Google Sheets and validates their existence before continuing. The input is then analyzed using an AI model that classifies it as food or activity. Based on this classification, it estimates calories, protein intake, and calories burned, and identifies potential health risks. The processed data is combined with existing user metrics to maintain an updated daily summary.

Additionally, the workflow monitors calorie thresholds and risk patterns. If a user exceeds their daily calorie target or shows unhealthy behavior patterns, alerts are sent to Slack. A scheduled process resets all users' daily metrics, ensuring accurate tracking for each new day.
Who's It For

- Fitness app developers
- Health and wellness platforms
- Personal trainers and nutritionists
- Automation engineers building health tracking systems
- Businesses managing user diet and activity tracking

Requirements to Use This Workflow

- **n8n instance** (self-hosted or cloud)
- **Google Sheets account** with a structured sheet
- **OpenAI API credentials**
- **Slack workspace** with API access
- Basic understanding of webhook usage

Required Google Sheets Columns:

- Name
- Phone (used as unique identifier)
- Goal
- Daily_Calorie_Target
- Consumed_Calories
- Protein
- Last_Meal
- Status
- Risk_Count

How It Works & How To Set Up

Step 1: Configure Webhook
- **Node:** Receive User Input
- Set the webhook path (e.g., /diet-input). This endpoint will receive user messages (food/activity input).

Step 2: Connect Google Sheets
- **Node:** Fetch User Data
- Map the Phone column to the incoming webhook data (body.phone). Ensure the correct Sheet ID and Sheet Name.

Step 3: Validate User
- **Node:** Validate User Exists
- Ensures only valid users are processed.

Step 4: Configure AI Analysis
- **Node:** AI Nutrition Analyzer – uses an OpenAI model to detect input type (FOOD / ACTIVITY), estimate calories & protein, detect risks, and generate suggestions and replies.
- **Node:** AI Model (GPT) – ensure OpenAI credentials are properly connected.

Step 5: Process Data
- **Node:** Process Nutrition Data
- Updates total calories, protein intake, and risk count. Handles both food intake and activity adjustments.

Step 6: Monitor Thresholds
- **Node:** Check Calorie Threshold – sends a Slack alert if the calorie limit is exceeded.

Step 7: Risk Detection
- **Node:** Check Health Risk – sends an alert if any risk is detected.
- **Node:** Check Critical Risk – sends a critical alert if risk count >= 3.

Step 8: Respond to User
- **Node:** Send API Response – returns the AI-generated reply, calories consumed, and daily total.

Step 9: Update User Data
- **Node:** Update User Data – writes updated metrics back to Google Sheets.
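The Process Nutrition Data step can be sketched as a Code-node function. The field names mirror the sheet columns listed above, while the exact arithmetic is an assumption:

```javascript
// Sketch of the metric update: FOOD adds calories/protein, ACTIVITY
// subtracts burned calories; any flagged risk bumps Risk_Count.
function updateMetrics(user, analysis) {
  const updated = { ...user };
  if (analysis.type === 'FOOD') {
    updated.Consumed_Calories += analysis.calories;
    updated.Protein += analysis.protein;
    updated.Last_Meal = analysis.description;
  } else if (analysis.type === 'ACTIVITY') {
    updated.Consumed_Calories = Math.max(0, updated.Consumed_Calories - analysis.caloriesBurned);
  }
  if (analysis.risk) updated.Risk_Count += 1;
  return updated;
}

const afterMeal = updateMetrics(
  { Consumed_Calories: 1200, Protein: 40, Risk_Count: 0, Last_Meal: '' },
  { type: 'FOOD', calories: 550, protein: 30, description: 'chicken wrap', risk: false },
);
console.log(afterMeal.Consumed_Calories, afterMeal.Protein); // 1750 70
```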
Step 10: Daily Reset Automation
- **Node:** Daily Reset Trigger
- **Nodes:** Fetch All Users → Reset Daily Stats
- Resets: Calories = 0, Protein = 0, Risk Count = 0, Status = NEW_DAY.

How To Customize Nodes

- **AI Prompt (AI Nutrition Analyzer):** Modify calorie logic, risk definitions, or add new dietary rules.
- **Calorie Threshold:** Change comparison logic based on different fitness goals.
- **Slack Messages:** Customize alert messages and formatting.
- **Google Sheets Mapping:** Add more fields like carbs, fats, or water intake.
- **Webhook Input Structure:** Adjust the input format if integrating with mobile apps or chatbots.

Add-ons (Extend Functionality)

- Add WhatsApp/SMS notifications for user alerts.
- Integrate with mobile apps for real-time tracking.
- Add meal history logging.
- Include weekly/monthly analytics dashboards.
- Add AI meal recommendations based on user goals.
- Integrate wearable data (steps, workouts).

Use Case Examples

- **Fitness App Backend:** Automatically track user diet and activity in real time.
- **Personal Trainer Dashboard:** Monitor client calorie intake and risks.
- **Corporate Wellness Programs:** Track employee health metrics and send alerts.
- **Diet Coaching Automation:** Provide AI-based suggestions and feedback.
- **Health Monitoring Systems:** Detect unhealthy patterns and escalate alerts.

Troubleshooting Guide

| Issue | Possible Cause | Solution |
| :--- | :--- | :--- |
| Webhook not receiving data | Incorrect webhook URL | Verify webhook path and method |
| User not found | Phone mismatch in Sheets | Ensure phone format matches input |
| AI response parsing fails | Invalid JSON from AI | Adjust prompt to enforce strict JSON |
| Slack alerts not sent | Incorrect Slack credentials | Reconnect Slack API |
| Data not updating in Sheets | Column mapping issue | Verify matching column (Phone) |
| Calories not updating | Parsing or logic issue | Check code node calculations |
| Daily reset not working | Schedule trigger configuration | Verify interval settings |

Need Help?
If you need assistance setting up this workflow, customizing AI logic, or building advanced automation features, our n8n development team at WeblineIndia is here to help. We specialize in:

- n8n workflow automation
- AI-powered business solutions
- Custom integrations and scaling systems

📩 Reach out to WeblineIndia to accelerate your automation journey and build powerful, production-ready workflows tailored to your needs.
by Davide
🤝🖊️🤖 This workflow automates the process of retrieving meeting transcripts from Fireflies.ai, extracting and summarizing relevant content using Google Gemini, and sending or drafting well-formatted summaries and emails via Gmail.

Fireflies is an AI-powered meeting assistant that automatically records, transcribes, and summarizes meetings. It integrates with popular video conferencing tools like Zoom, Google Meet, and Microsoft Teams, helping teams capture key insights and action items without manual note-taking. This workflow automates meeting recap generation, from email detection to AI-powered summarization and delivery.

Key Benefits

- 💡 **Automated Insight Extraction**: Uses AI (OpenAI & Gemini) to extract and summarize key insights from meetings automatically.
- 📩 **Instant Client Communication**: Generates ready-to-send meeting summaries and drafts without human intervention.
- 📥 **Email Monitoring**: Listens to Gmail for specific meeting recap messages and reacts accordingly.
- 🔗 **Seamless Fireflies Integration**: Dynamically pulls transcript data.
- 🧠 **Dual AI Models**: Combines the strengths of OpenAI and Gemini for rich, contextual summaries in multiple formats.
- 🛠 **Modular Design**: Easily customizable and extensible for adding more destinations (e.g., Slack, Notion, CRM).
- 🧑‍💼 **Ideal for Teams & Consultants**: Great for sales teams, project managers, or consultants who handle multiple client meetings daily.

How It Works

- Trigger: The workflow starts with a Gmail Trigger node that monitors incoming emails with the subject "Your meeting recap". It checks for new emails every hour. Alternatively, it can be triggered manually using the "When clicking 'Execute workflow'" node for testing, or via Webhook.
- Email Processing: The "Get a message" node fetches the full email content. The "Set Meeting link" node extracts the meeting link from the email. The "Information Extractor" (powered by OpenAI) processes the email text to identify the meeting URL.
Transcript Retrieval: A Code node parses the meeting ID from the URL, and the "Get a transcript" node (Fireflies.ai integration) fetches the full meeting transcript using that ID.
Transcript Processing: The "Set sentences" and "Set summary" nodes extract structured data (sentences, short summary, overview) from the transcript, and the "Full transcript" node combines all transcript segments into a readable format.
AI Summarization & Email Generation: Google Gemini models analyze and summarize the transcript in Italian ("Expert Meeting transcripts") and generate a client-friendly recap ("Meeting summary expert"). The "Email writer" node combines the summaries into a cohesive email draft, and the Markdown to HTML nodes format the content for email readability.
Output: A "Draft email to client" node prepares the final recap, and two Gmail nodes ("Send Full meeting summary" and "Send a message1") dispatch the summaries to the specified recipient.
Set Up Steps
Configure Credentials: Ensure the following credentials are set up in n8n:
Fireflies.ai API (transcript retrieval)
Gmail OAuth2 (email triggering/sending)
OpenAI API (initial text extraction)
Google Gemini (PaLM) (summarization)
Adjust Nodes:
Update the "Gmail Trigger" node with the correct email filter (subject: Your meeting recap).
Replace YOUR_EMAIL in the Gmail send nodes with the recipient’s address.
Verify the Code nodes (e.g., meeting ID extraction) match your URL structure.
Deploy: Activate the workflow, then test using the Manual Trigger or wait for the Gmail trigger to execute automatically.
Optional Customization: Modify the Google Gemini prompts for different summary styles, or adjust the email templates in the final Gmail nodes.
Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
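As a companion to the transcript-processing step above, here is a sketch of how a Code node might join Fireflies sentences into the readable "Full transcript" format. The sentence shape (`speaker_name`, `text`) reflects Fireflies' GraphQL schema but should be verified against your actual API response.

```javascript
// Hedged sketch: merge Fireflies transcript sentences into readable text,
// grouping consecutive sentences by speaker. The { speaker_name, text }
// field names are an assumption — check your API response.
function fullTranscript(sentences) {
  let lastSpeaker = null;
  const parts = [];
  for (const s of sentences) {
    if (s.speaker_name !== lastSpeaker) {
      parts.push(`\n${s.speaker_name}:`); // new speaker heading
      lastSpeaker = s.speaker_name;
    }
    parts.push(s.text);
  }
  return parts.join(' ').trim();
}
```

The result is plain text with one line per speaker turn, which the downstream Gemini nodes can summarize directly.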
by Daniel Rosehill
This workflow provides a mechanism for turning AI-transcribed voice notes from Voicenotes AI into prompts for an AI agent. On the "collection" end of the workflow, we gather the output (with the recorded prompt) and do two things: 1) It is saved into NocoDB as a new row on a database table recording AI outputs and prompts. 2) The prompt gets sent to an AI agent and the output gets returned to the user's email.
Who Is It For?
If you like using voice AI tools to write detailed prompts for AI, this workflow removes the points of friction in getting from A to B!
How Does It Work?
Simply tag your voice note in Voicenotes with your preferred tag (I'm using 'prompt'). Then, provide the n8n webhook as the URL for a Voicenotes webhook that triggers whenever a new note is created with this tag (and this tag only). Now, whenever you wish to use a voice note as a prompt, just add the tag. This triggers the webhook which, in turn, triggers this workflow: the prompt is sent to an AI agent of your choosing (configured within the workflow), the output is saved into a database, and it is returned by email.
Note: The AI agent system prompt is written to define a structured output that produces Gmail-safe HTML, which is then injected into a template. You can use a Google Group to gather the output runs or just receive them at your main address (if you don't use Gmail, just swap in any other email node or your preferred delivery channel).
How To Set It Up
You'll need a Voicenotes account in order to use the service! Once you have one, create the tag and the webhook:
In n8n, create your webhook node and then provide its URL to Voicenotes.
Create a note, then assign it a new tag: "Prompts" (or as you prefer).
The webhook is matched to the tag.
Requirements
Voicenotes AI account
Customisation
The delivery mechanism can be customized to your preferences.
If you're not a Google user, substitute the template and sending mechanism for your preferred delivery provider. You could, for example, collect the outputs in a Slack channel or Telegram bot.
You may omit the NocoDB collector or substitute it for another wiki or knowledge-management platform such as Notion or Nuclino.
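To make the tag-matching step concrete, here is a minimal sketch of shaping a Voicenotes webhook payload into a NocoDB row. The payload field names (`title`, `transcript`, `tags`, `created_at`) and the NocoDB column names are assumptions for illustration; check them against your actual webhook payload and table schema.

```javascript
// Hedged sketch: filter an incoming Voicenotes webhook payload by tag and
// shape it into a NocoDB row. All field names here are assumptions —
// inspect a real webhook execution in n8n to confirm them.
function toNocoRow(payload, wantedTag = 'prompt') {
  const tags = (payload.tags || []).map(t => String(t).toLowerCase());
  if (!tags.includes(wantedTag)) return null; // ignore notes without the tag
  return {
    Title: payload.title || 'Untitled note',
    Prompt: payload.transcript || '',
    CreatedAt: payload.created_at || new Date().toISOString(),
  };
}
```

In practice the webhook should already fire only for the matched tag, but a guard like this keeps the workflow safe if the Voicenotes configuration changes.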