by Marián Današ
**Who’s this for 💼**
This template is designed for teams and developers who need to generate PDF documents automatically from HTML templates. It’s suitable for use cases such as invoices, confirmations, reports, certificates, or any custom document that needs to be created dynamically based on incoming data.

**What this workflow does ⚙️**
This workflow automates the full lifecycle of document generation, from request validation to delivery and storage. It is triggered by a POST webhook that receives structured JSON data describing the requested document and client information. Before generating the document, the workflow validates the client’s email address using Hunter Email Verification to prevent invalid or mistyped emails. If the email is valid, the workflow loads the appropriate HTML template from a Postgres database, fills it with the incoming data, and converts it into a PDF using PDF Generator API. Once the PDF is generated, it is sent to the client via Gmail, uploaded to Supabase Storage, and the transaction is recorded in the database for tracking and auditing purposes.

**How it works 🛠️**
- Receives a document generation request via a POST webhook.
- Validates the client’s email address using Hunter.
- Generates a PDF document from an HTML template using PDF Generator API.
- Sends the PDF via Gmail and uploads it to Supabase Storage.
- Stores a document generation record in the database.

**How to set up 🖇️**
Before activating the workflow, make sure all required services and connections are prepared and available in your n8n environment.
- Create a POST webhook endpoint that accepts structured JSON input.
- Add Hunter API credentials for email verification.
- Add PDF Generator API credentials for HTML-to-PDF conversion.
- Prepare a Postgres database with tables for HTML templates and document generation records.
- Set up Gmail or SMTP credentials for email delivery.
- Configure Supabase Storage for storing generated PDF files.

**Requirements ✅**
- PDF Generator API account
- Hunter account
- Postgres database
- Gmail or SMTP-compatible email provider
- Supabase project with Storage enabled

**How to customize the workflow 🤖**
This workflow can be adapted to different document generation scenarios by extending or modifying its existing steps:
- Add extra validation steps before document generation if required.
- Extend delivery options by sending the generated PDF to additional services or webhooks.
- Enhance security by adding document encryption or access control.
- Add support for additional document types by storing more HTML templates in the database.
- Modify the database schema or queries to store additional metadata related to generated documents.
- Adjust the data mapping logic in the Code node to match your input structure (a sketch of a possible payload and mapping follows below).
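The listing doesn't define a fixed payload schema, so the exact JSON fields depend on your templates. As a minimal sketch, assuming hypothetical fields such as documentType, client, and items, the webhook body and the Code node's data mapping could look like this:

```javascript
// Hypothetical webhook payload and mapping sketch for an n8n Code node.
// Field names (documentType, client, items) are illustrative assumptions;
// adjust them to whatever your own webhook contract actually defines.
const payload = {
  documentType: "invoice",            // used to pick the HTML template in Postgres
  client: { name: "Jane Doe", email: "jane@example.com" },
  items: [{ description: "Consulting", quantity: 2, unitPrice: 150 }],
};

// Flatten the payload into the placeholder values the HTML template expects.
const total = payload.items.reduce((sum, i) => sum + i.quantity * i.unitPrice, 0);

return [{
  json: {
    template_key: payload.documentType,
    client_name: payload.client.name,
    client_email: payload.client.email,
    total: total.toFixed(2),
  },
}];
```

In the real workflow this mapping step sits between the webhook and the PDF Generator API call, so the output keys should match the placeholders used in your stored HTML templates.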
by Rahul Joshi
**📊 Description**
Most learning newsletters send the same email to everyone. This workflow does the opposite — every subscriber gets a completely different email every single day, personalized to their topic interest, learning phase, and how far they've come in their journey. Built specifically for women's skill development programs, coaching platforms, and edtech creators who want to deliver real personalized learning at scale without manually writing hundreds of emails. You manage the content library, the AI handles the personalization, and the workflow handles the delivery — every morning at 9AM without you touching anything.

**What This Workflow Does**
- ⏰ Triggers every morning at 9AM automatically
- 📋 Fetches all subscribers from Google Sheets with their name, email, topic interest, and subscription date
- 🧮 Calculates each subscriber's day number, week number, and learning phase — Beginner, Intermediate, or Advanced — based on how long they've been subscribed (see the sketch below)
- 📚 Fetches your full content library and scores every lesson against each subscriber's topic interest and phase
- 🎯 Picks the best matching lesson per subscriber — falls back to day-index rotation if no strong match is found
- 🔁 Loops through each subscriber one at a time to ensure every email is individually generated
- 🤖 Sends each subscriber's profile and matched lesson to GPT-4o, which generates a fully personalized 3-paragraph lesson explanation, actionable task, key takeaways, and a motivational quote from a woman leader in their field
- 📧 Builds a beautiful branded HTML email and sends it via Gmail
- 📝 Logs every delivery to a SendLog sheet with date, name, lesson title, phase, category, and AI snippet

**Key Benefits**
- ✅ Every subscriber gets a unique email — no generic blasts
- ✅ Learning phase auto-advances as subscribers stay longer
- ✅ GPT-4o adapts lesson tone and depth to Beginner, Intermediate, or Advanced
- ✅ Motivational quotes always come from women leaders in the relevant field
- ✅ Full delivery log in Google Sheets for tracking and analytics
- ✅ Works for any skill category — coding, finance, leadership, marketing, and more

**Features**
- Cron-based daily trigger at 9AM
- Automatic learning phase calculation per subscriber
- Content scoring and matching engine
- Day-index fallback rotation for unmatched subscribers
- GPT-4o lesson personalization with phase-aware prompting
- Woman leader motivational quotes per field
- Branded HTML email template with inline CSS
- Dynamic subject line per subscriber
- Gmail delivery with individual personalization
- Full SendLog tracking in Google Sheets
- Loop-based processing — one subscriber at a time for accuracy

**Requirements**
- OpenAI API key (GPT-4o access)
- Google Sheets OAuth2 connection
- Gmail OAuth2 connection
- A configured Google Sheet with 3 sheets: Subscribers, ContentLibrary, SendLog

**Setup Steps**
1. Copy the Google Sheet template and grab your Sheet ID
2. Paste the Sheet ID into all Google Sheets nodes
3. Add your Google Sheets OAuth2 credentials
4. Add your OpenAI API key to the GPT-4o node
5. Add your Gmail OAuth2 credentials
6. Populate the Subscribers sheet with your learners
7. Populate the ContentLibrary sheet with your lessons — at least 5-10 per category
8. Run the workflow manually once to test with your first subscriber
9. Confirm the HTML email looks correct in your inbox

**Target Audience**
- 🎓 Women's skill development platforms and bootcamps
- 📧 Edtech creators running personalized learning newsletters
- 💼 Career coaches who want to deliver daily value to their community
- 🤖 Automation agencies building AI-powered email learning systems for clients
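The learning-phase calculation happens in a Code node; the listing doesn't publish its exact thresholds, so the 30-day and 90-day cutoffs and the subscribedAt field name below are assumptions. A minimal sketch of that step might look like this:

```javascript
// Sketch of a possible phase-calculation Code node.
// The day thresholds (30 / 90) and the field name "subscribedAt" are assumptions,
// not the template's actual values.
const subscriber = $json; // expects { name, email, topic, subscribedAt }

const msPerDay = 24 * 60 * 60 * 1000;
const dayNumber = Math.max(
  1,
  Math.floor((Date.now() - new Date(subscriber.subscribedAt).getTime()) / msPerDay) + 1
);
const weekNumber = Math.ceil(dayNumber / 7);

let phase = "Beginner";
if (dayNumber > 90) phase = "Advanced";
else if (dayNumber > 30) phase = "Intermediate";

return [{ json: { ...subscriber, dayNumber, weekNumber, phase } }];
```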
by Milo Bravo
**Automated Email Outreach: Telegram → Gmail → Sheets Dashboard**

**Who is this for?**
Solo founders, sales teams, and event organizers who need email outreach without expensive tools but want full control from Telegram.

**What problem is this workflow solving?**
Email campaigns are painful:
- Expensive tools ($50+/month)
- No mobile control
- Manual tracking
- Unsubscribe nightmares

This workflow gives you Zapier-level outreach for free from Telegram → Gmail → Sheets.

**What this workflow does**
- Telegram Control: /outreach command launches campaigns
- Smart Sending: Gmail + random delays (anti-spam)
- Real-time Tracking: open pixels + unsubscribe webhooks
- Sheets Dashboard: leads, logs, stats in one place
- Compliance: auto-unsubscribe + opt-out tracking

Full flow: Telegram Bot → Parse Command → Sheets Leads → Gmail Send → Pixel/Unsub Track → Update Dashboard (a command-parsing sketch is included below).

**Setup (7 minutes)**
1. Telegram: Create bot → get token → update chatId
2. Gmail: OAuth2 credential (any account)
3. Google Sheets: Create a sheet with tabs:
   - Dashboard (stats)
   - Leads (email, name, status)
   - Logs (sends, opens, unsubs)
4. Config: Update Sheet ID + webhook URLs
5. Test: /outreach cap:2 → verify sends

Telegram commands:

```text
/outreach sender:you@domain.com subject:"Event Invite" body:"Hi {{name}}..." cap:50
/status → Campaign stats
/stop → Pause sends
```

**How to customize to your needs**
Campaign types:

```text
Event invites → {{name}} for {{event}}
Sales outreach → {{company}} pricing inquiry
Newsletter → {{name}} weekly update
```

Scale up:
- Multiple senders (Gmail aliases)
- A/B testing (subject lines)
- Segmentation (lead status)
- CRM sync (HubSpot/Airtable)

Anti-spam:
- Random delays (30s-2m)
- HTML tracking pixel
- Auto-unsubscribe
- Send caps
- Bounce handling

ROI:
- $0/month (vs Zapier $50+, Mailchimp $20+)
- Telegram control (no desktop needed)
- 5-minute campaigns (vs hours of setup)
- Real-time dashboard (opens, unsubs, sends)
- GDPR compliant (auto-unsub)

Proven: Used for 5k+ event invites, 28% open rate.

Need help customizing? Contact me for consulting and support: LinkedIn / Message

Keywords: n8n email outreach, telegram automation, gmail campaigns, google sheets dashboard, no-code email marketing, sales outreach automation, event invite workflow.
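The Parse Command step isn't shown in the listing, but a minimal parsing sketch, assuming the key:value and key:"quoted value" syntax used in the commands above, could look like this:

```javascript
// Hypothetical command parser for the Telegram "Parse Command" step.
// Assumes arguments use key:value or key:"quoted value" syntax, e.g.
// /outreach sender:you@domain.com subject:"Event Invite" body:"Hi {{name}}..." cap:50
const text = $json.message?.text ?? "";

const params = {};
const re = /(\w+):"([^"]*)"|(\w+):(\S+)/g;
let m;
while ((m = re.exec(text)) !== null) {
  const key = m[1] ?? m[3];
  const value = m[2] ?? m[4];
  params[key] = value;
}

return [{
  json: {
    command: text.split(" ")[0],      // e.g. "/outreach", "/status", "/stop"
    sender: params.sender,
    subject: params.subject,
    body: params.body,
    cap: Number(params.cap ?? 50),    // default cap is an assumption
  },
}];
```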
by Yaron Been
Track companies adopting tools that complement yours and send AI-drafted co-marketing outreach emails to new adopters.

This workflow reads a list of complementary tools (with their PredictLeads technology IDs) from Google Sheets, discovers companies that recently adopted each tool via the PredictLeads Technology Detections API, compares against previously scanned domains to find new adopters, enriches each new company, and uses OpenAI to draft a personalized co-marketing partnership email.

**How it works:**
- Schedule trigger runs the workflow daily at 8 AM.
- Reads complementary tool names and PredictLeads tech IDs from Google Sheets.
- Loops through each tool and discovers recent technology adopters via PredictLeads.
- Reads previously scanned domains from a separate Google Sheets tab.
- Compares current detections against previous scans to identify new adopters only (a sketch of this dedup step follows below).
- Limits processing to 2 new companies per tool per run (adjustable).
- Enriches each new adopter with PredictLeads company data.
- Builds a structured prompt and sends it to OpenAI to draft a co-marketing email.
- Sends the email via Gmail.
- Logs the domain, tool name, and timestamp to the Previous Scan sheet to prevent duplicates.

**Setup:**
- Create a Google Sheet with two tabs:
  - "Complementary Tools" with columns: tool_name, tech_id (PredictLeads technology ID).
  - "Previous Scan" with columns: domain, tool_name, detected_at, email_sent.
- Connect your Gmail account (OAuth2) for sending outreach emails.
- Add your OpenAI API key in the Draft Co-Marketing Email HTTP Request node.
- Add your PredictLeads API credentials (X-Api-Key and X-Api-Token headers).

**Requirements:**
- Google Sheets OAuth2 credentials.
- Gmail OAuth2 credentials.
- OpenAI API account (uses gpt-4o-mini, ~$0.003-0.008 per call).
- PredictLeads API account (https://docs.predictleads.com).

**Notes:**
- The Limit node caps outreach at 2 companies per tool per run; adjust as needed.
- Technology IDs for the complementary tools can be found via the PredictLeads API.
- The Previous Scan tab prevents the same company from being contacted twice.
- PredictLeads Technology Detections and Company API docs: https://docs.predictleads.com
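The comparison step isn't shown in the listing; a minimal sketch of the new-adopter filter, assuming the detections carry a domain field and the Previous Scan rows expose domain and tool_name columns, might look like this:

```javascript
// Hypothetical dedup step: keep only detections whose domain has not already
// been logged for this tool in the "Previous Scan" sheet.
// The upstream node name "Read Previous Scan" is an assumption; use your own.
const detections = $input.all().map((item) => item.json);
const previous = $("Read Previous Scan").all().map((item) => item.json);

const seen = new Set(previous.map((row) => `${row.domain}|${row.tool_name}`));

const newAdopters = detections.filter(
  (d) => !seen.has(`${d.domain}|${d.tool_name}`)
);

return newAdopters.map((d) => ({ json: d }));
```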
by Ayush Singh
**How it works**
This workflow automatically generates a detailed AI-powered campaign performance report across Meta, Google and Microsoft Ads and emails it to your team every month. It reads campaign data from three tabs in a Google Sheet (Google, Meta, Microsoft), merges all rows, and passes them to a Code node that calculates KPIs and builds a structured prompt (a sketch of that KPI step follows below). Groq AI (Llama 3.3 70B) then analyzes the data and generates expert insights. A second Code node combines the KPIs and AI analysis into a full HTML email with platform tables, charts, benchmarks and recommendations — sent automatically via Gmail.

**Setup steps**
1. Create a Google Sheet with 3 tabs: Google, Meta, Microsoft
2. Paste your monthly ad exports into the matching tab
3. Connect your Google account in the 3 Sheets nodes and select the correct tab in each
4. Add your Groq API key in the HTTP Request node header
5. Connect your Gmail account in the Send node and set your recipient email
6. Activate — the workflow runs automatically on schedule
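The KPI Code node isn't included in the listing, and the column names below (platform, impressions, clicks, spend, conversions, revenue) are assumptions. A minimal sketch of that step could look like this:

```javascript
// Sketch of a possible KPI-calculation Code node. Column names are assumptions;
// map them to whatever headers your ad exports actually use.
const rows = $input.all().map((item) => item.json);

const byPlatform = {};
for (const row of rows) {
  const p = (byPlatform[row.platform] ??= {
    impressions: 0, clicks: 0, spend: 0, conversions: 0, revenue: 0,
  });
  p.impressions += Number(row.impressions || 0);
  p.clicks += Number(row.clicks || 0);
  p.spend += Number(row.spend || 0);
  p.conversions += Number(row.conversions || 0);
  p.revenue += Number(row.revenue || 0);
}

const kpis = Object.entries(byPlatform).map(([platform, t]) => ({
  platform,
  ctr: t.impressions ? (t.clicks / t.impressions) * 100 : 0, // click-through rate, %
  cpc: t.clicks ? t.spend / t.clicks : 0,                    // cost per click
  cpa: t.conversions ? t.spend / t.conversions : 0,          // cost per conversion
  roas: t.spend ? t.revenue / t.spend : 0,                   // return on ad spend
}));

return [{ json: { kpis } }];
```

The resulting kpis array can then be interpolated into the Groq prompt and reused by the second Code node when it renders the HTML email.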
by PollupAI
**Who’s it for**
This workflow is built for B2B SaaS and CX teams that are drowning in unstructured customer feedback across tools. It’s ideal for Customer Success, Product and Support leaders who want a light “voice of customer engine” without rebuilding their stack: Gmail for interactions, Slack for conversations, Pipedrive for notes and Zendesk for tickets, plus Notion for follow-up tasks.

**How it works / What it does**
The workflow runs on a schedule or manual trigger and first sets the CSM’s email address. It then uses an AI “Data agent” to pull recent customer signals from multiple sources: Gmail messages, Slack messages, Pipedrive notes and Zendesk tickets. A “Signals agent” compresses each piece of feedback into a concise, neutral summary, which is then grouped by topic via a “Clustering agent”. Each cluster gets a label, count and examples. Finally, an “Action agent” routes clusters based on their label (a routing sketch is included below):
- Create Zendesk tickets for product/performance issues
- Post to a dedicated Slack channel for billing / contract topics
- Create Notion tasks for sales-related feedback
- Send targeted Gmail messages to the CSM for high-risk or engagement-related items

**How to set up**
1. Import the workflow into n8n.
2. Connect credentials for Gmail, Slack, Pipedrive, Zendesk, Notion and OpenAI.
3. Update the CSM email in the “Set CSM email” node.
4. Adjust date filters, send-to addresses and Slack channel IDs as needed.
5. Enable the schedule trigger for weekly or daily digests.

**Requirements**
- Active accounts & credentials for: Gmail, Slack, Pipedrive, Zendesk and Notion
- OpenAI (or compatible) API key for the LLM node
- At least one Slack channel for posting feedback (e.g. #billing-feedback)

**How to customize the workflow**
- Change the time window or filters (sender, channel, query) for each data source.
- Edit the clustering and routing prompts to match your own categories and teams.
- Add new destinations (e.g. Jira, HubSpot) by connecting more tools to the Action agent.
- Modify thresholds (e.g. minimum count) before a cluster triggers an action.
- Localize labels and email copy to your team’s language and tone.
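The Action agent's routing rules live in its prompt, so the label names below are assumptions. A minimal sketch of equivalent routing logic, expressed as an n8n Code node instead of an agent prompt, might look like this:

```javascript
// Hypothetical routing helper mirroring the "Action agent" behaviour described above.
// The label names are assumptions; align them with your clustering prompt's output.
function routeCluster(cluster) {
  // cluster: { label, count, summary, examples }
  switch (cluster.label) {
    case "product_issue":
    case "performance":
      return { destination: "zendesk", action: "create_ticket" };
    case "billing":
    case "contract":
      return { destination: "slack", channel: "#billing-feedback" };
    case "sales":
      return { destination: "notion", action: "create_task" };
    case "churn_risk":
    case "engagement":
      return { destination: "gmail", recipient: "csm" };
    default:
      return { destination: "review_manually" };
  }
}

return $input.all().map((item) => ({
  json: { ...item.json, route: routeCluster(item.json) },
}));
```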
by Veena Pandian
**Who is this for?**
Founders, product managers, content strategists, indie hackers, and anyone who wants to automatically monitor tech industry trends across multiple sources — without manually browsing Hacker News and Product Hunt every day.

**What this workflow does**
This workflow scans public RSS feeds from Hacker News and Product Hunt daily, scores every item against configurable keyword groups (AI, SaaS, No-Code, Dev Tools, etc.), clusters the results into ranked themes, and delivers a prioritized intelligence report via Slack and email. All signals and themes are logged to Google Sheets for historical trend analysis.

**How it works**
1. Daily trigger fires on a configurable schedule (default: every 24 hours).
2. Fetches RSS feeds from Hacker News (posts with 50+ points) and Product Hunt in parallel.
3. Parses and normalizes all feed items — extracting titles, descriptions, URLs, and publish dates from RSS/Atom XML.
4. Scores each item against 7 weighted keyword groups. Title matches receive a bonus multiplier. Source weights (Hacker News 1.5x, Product Hunt 1.3x) amplify signals from higher-authority sources (see the scoring sketch below).
5. Clusters into themes — groups scored items by primary category, calculates theme strength using source diversity and volume bonuses, and classifies each as VERY_STRONG, STRONG, MODERATE, or WEAK.
6. Builds an intelligence report with theme rankings, top 10 signals, and action items for surging topics. Generated in both plain text (Slack) and HTML (email).
7. Delivers and logs — posts to Slack, sends the HTML email, and appends both individual signals and theme summaries to separate Google Sheet tabs.

**Setup steps**
1. Connect Google Sheets OAuth2 credentials and update the Sheet ID in both "Log Signals to Sheet" and "Log Themes to Sheet" nodes.
2. Create a Google Sheet with two tabs:
   - signal — headers: date, title, source, score, category, url
   - themes — headers: date, category, signal_level, theme_strength, item_count, sources, top_keywords
3. Connect Slack OAuth2 credentials and configure your target channel in the "Post Report to Slack" node.
4. Connect Gmail OAuth2 credentials and update YOUR_EMAIL@EXAMPLE.COM in the "Email Daily Report" node.
5. Activate the workflow.

**Requirements**
- n8n instance (self-hosted or cloud)
- Google Cloud project with Sheets API enabled
- Slack workspace with a bot configured
- Gmail account with OAuth2 credentials (or swap for SMTP)
- No API keys needed for RSS feeds — they are publicly accessible

**How to customize**
- **Add more RSS feeds** — duplicate a feed node (e.g., TechCrunch, Reddit, Lobsters), connect it to the Merge node as an additional input, and add the parsing logic in the "Parse All RSS Feeds" code node.
- **Edit keyword groups** — modify the keywordGroups object in the "Score and Classify Signals" node. Add your industry-specific keywords, adjust weights, and rename categories.
- **Adjust source weights** — change the weight multipliers in the parser node to reflect which sources you trust most.
- **Theme thresholds** — modify the strength cutoffs (30 = VERY_STRONG, 15 = STRONG, 8 = MODERATE) in the "Aggregate Signals into Themes" node.
- **Schedule** — change from daily to hourly for real-time monitoring, or weekly for a digest format.
- **Add AI analysis** — insert an LLM node after the report builder to generate strategic commentary on detected trends.
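The actual keywordGroups object, weights, and bonus values live inside the "Score and Classify Signals" node; the values below are illustrative. A minimal sketch of the scoring idea could look like this:

```javascript
// Sketch of the scoring idea described above. Keyword groups, weights, the
// title bonus, and source multipliers are illustrative assumptions, not the
// workflow's shipped values.
const keywordGroups = {
  AI: { weight: 3, terms: ["ai", "llm", "gpt", "machine learning"] },
  SaaS: { weight: 2, terms: ["saas", "subscription", "b2b"] },
  DevTools: { weight: 2, terms: ["sdk", "cli", "framework", "open source"] },
};
const sourceWeights = { "Hacker News": 1.5, "Product Hunt": 1.3 };
const TITLE_BONUS = 2; // assumed multiplier for title matches

function scoreItem(item) {
  const title = (item.title || "").toLowerCase();
  const body = (item.description || "").toLowerCase();
  let best = { category: null, score: 0 };

  for (const [category, group] of Object.entries(keywordGroups)) {
    let score = 0;
    for (const term of group.terms) {
      if (title.includes(term)) score += group.weight * TITLE_BONUS;
      else if (body.includes(term)) score += group.weight;
    }
    score *= sourceWeights[item.source] ?? 1;
    if (score > best.score) best = { category, score };
  }
  return { ...item, ...best };
}

return $input.all().map((i) => ({ json: scoreItem(i.json) }));
```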
by Samyotech
**What this workflow does**
This workflow implements a two-stage news automation system designed for reusable and topic-driven email delivery. News articles are continuously collected from multiple platforms using RSS feeds and stored in a vector database with semantic embeddings and category metadata. Instead of fetching news on demand, the workflow separates daily ingestion from weekly delivery. This allows the same news data to be reused across different topics, audiences, or delivery schedules. On a weekly basis, relevant articles are retrieved from the vector store based on defined areas of interest and item limits. The selected news is then processed by an AI agent, which converts the raw articles into a structured, email-ready format before sending the final content to users.

**How it works**
- News articles are collected daily from multiple RSS feeds
- Articles are categorized and stored in a vector database
- On a weekly trigger, topic preferences are evaluated
- Relevant articles are retrieved using vector-based search
- An AI agent formats the content for email delivery
- The email is sent to the user

**Setup**
To use this workflow, complete the following steps:
1. Add and configure your RSS feed sources
2. Connect a vector database and embedding model
3. Configure AI model credentials for content generation
4. Set up email service credentials
5. Define weekly scheduling and topic inputs
6. Test retrieval and email output

**Customization**
You can customize this workflow by:
- Adding or removing RSS feed sources
- Adjusting news categories or topic filters
- Changing the number of articles retrieved per topic
- Modifying the AI agent’s writing tone or structure
- Reusing the vector store for other content workflows
- Updating email frequency or delivery format

**Requirements**
- RSS feed URLs
- Vector database credentials
- AI model credentials
- Email service credentials
by Frederik Duchi
This n8n template demonstrates how to automatically create tasks (or, more generally, records) in Baserow based on template or blueprint tables. The first blueprint table is the master table that holds the general information about the template, for example a standard procedure to handle incidents. The second table is the details table that holds multiple records for the template. Each record in that table is a specific task that needs to be assigned to someone with a certain deadline. This makes it easy to streamline task creation for recurring processes.

Use cases are many:
- Project management (generate tasks for employees based on a project template)
- HR & onboarding (generate tasks for employee onboarding based on a template)
- Operations (create checklists for maintenance, audits, or recurring procedures)

**Good to know**
- The Baserow template for handling Standard Operating Procedures works perfectly as a base schema to try out this workflow.
- Authentication is done through a database token. Check the documentation on how to create such a token.
- Tasks are inserted using the HTTP Request node instead of a dedicated Baserow node. This supports batch import instead of importing records one by one.

**Requirements**
- Baserow account (cloud or self-hosted)
- A Baserow database with at least the following tables:
  - Assignee / employee table. Required to be able to assign someone to a task.
  - Master table with procedure or template information. Required to be able to select a certain template.
  - Details table with all the steps associated with a procedure or template. Required to convert each step into a specific task. A step must have a field Days to complete with the number of days to complete the step. This field is used to calculate the deadline.
  - Tasks table that contains the actual tasks with an assignee and deadline.

**How it works**
- **Trigger task creation (webhook)**: The automation starts when the webhook is triggered through a POST request. The request should contain an assignee, template, date and note in its body. The webhook sends a success or failure response once all steps are completed.
- **Configure settings and ids**: Stores the ids of the involved Baserow database and tables, together with the API credentials and the data from the webhook.
- **Get all template steps**: Gets all the steps from the template Details table that are associated with the id of the Master template table. For example: the master template can have a record about handling customer complaints, and the details table contains all the steps to handle this procedure.
- **Calculate deadlines for each step**: Prepares the input of the tasks by using the same property names as the fields of the Tasks table. Adjust these names, and add or remove fields, if your database structure requires it. The deadline of each step is calculated by adding the step's Days to complete value (from the template Details table) to the schedule date. For example, if the schedule_date property in the webhook is set to 2025-10-01 and Days to complete for the step is 3, the deadline will be 2025-10-04. A sketch of this calculation is included at the end of this description.
- **Avoid scheduling during the weekend**: The calculated deadline may fall on a Saturday or Sunday. This Code node moves those dates to the next Monday to avoid scheduling during the weekend.
- **Aggregate tasks for insert**: Aggregates the data from the previous nodes as an array in a property named items. This matches the Baserow API for inserting new records in batch.
- **Generate tasks in batch**: Calls the API endpoint /api/database/rows/table/{table_id}/batch/ to insert multiple records at once into the Tasks table. Check the Baserow API documentation for further details.
- **Success / Error response**: Sends a simple text response to indicate the success or failure of the record creation. This offers feedback when triggering the automation from a Baserow application, but can be replaced with a JSON response.

**How to use**
Call the Trigger task creation node with the required parameters through a POST request. This can be done from any web application. For example, the application builder in Baserow supports an action to send an HTTP request; the Procedure details page in the Standard Operating Procedures template demonstrates this action. The following information is required in the body of the request to create the actual tasks:

{
  "assignee_id": integer referring to the id of the assignee in the database,
  "template_id": integer referring to the id of the template or procedure in the master table,
  "schedule_date": the date the tasks need to start scheduling from,
  "note": text with an optional note about the tasks
}

- Set the corresponding ids in the Configure settings and ids node.
- Check the names of the properties in the Calculate deadlines for each step node. Make sure the names of those properties match the field names of your Tasks table.
- You can replace the text messages in the Success response and Failure response nodes with a more structured format if your application needs it.

**Customising this workflow**
- Add support for public holidays (e.g., using an external calendar API).
- Modify the task assignment logic (e.g., pre-assign tasks in the details table).
- Combine with notifications (email, Slack, etc.) to alert employees when new tasks are generated.
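The Code node source isn't included in this description, but a minimal sketch of the deadline calculation and weekend shift, assuming the field names described above (schedule_date in the webhook, Days to complete on each step), could look like this:

```javascript
// Hypothetical sketch of the "Calculate deadlines" and "Avoid weekends" logic.
// Field names follow the description above; adjust them to your own Baserow schema.
const { schedule_date, assignee_id, note } = $("Configure settings and ids").first().json;
const steps = $input.all().map((item) => item.json);

function addDays(date, days) {
  const d = new Date(date);
  d.setDate(d.getDate() + days);
  return d;
}

function shiftWeekendToMonday(date) {
  const day = date.getDay();           // 0 = Sunday, 6 = Saturday
  if (day === 6) return addDays(date, 2);
  if (day === 0) return addDays(date, 1);
  return date;
}

const tasks = steps.map((step) => {
  const deadline = shiftWeekendToMonday(
    addDays(schedule_date, step["Days to complete"] || 0)
  );
  return {
    Name: step.Name,                          // assumed task field names
    Assignee: [assignee_id],
    Deadline: deadline.toISOString().slice(0, 10),
    Note: note,
  };
});

// Wrap in { items: [...] } to match the Baserow batch-insert endpoint.
return [{ json: { items: tasks } }];
```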
by Rahul Joshi
**📊 Description**
The scoreboard shows you what happened. This workflow tells you why it happened. Every time an IPL match ends, this automation detects the completed result, fetches the full scorecard, and sends it to GPT-4o, which produces a detailed journalist-style post-match analysis — innings breakdowns, tactical decisions, key turning points, player of the match, and what the result means for both teams. Every Monday it also generates a weekly roundup digest covering all the week's matches in one beautifully designed email. Built for sports media companies, IPL fan platforms, cricket newsletters, and automation agencies who want to produce expert-level match analysis at scale without a dedicated editorial team.

**What This Workflow Does**
- ⏰ Polls CricAPI every 30 minutes for recently completed IPL matches
- 📋 Checks the Match Log sheet to avoid analyzing the same match twice
- 🏏 Detects new completed IPL matches and saves them to the Match Log
- 🧮 Computes both innings' run rates and builds a structured analytical prompt
- 🤖 Sends full match context to GPT-4o, which generates a complete post-match analysis
- 📧 Assembles the analysis into a branded HTML email and sends it immediately after the match
- 📝 Logs every analysis to the Analysis Log sheet with match name, winner, and player of the match
- 📊 Every Monday reads all analyses from the past 7 days and generates a weekly roundup
- 🤖 GPT-4o writes the weekly digest with match recaps, player of the week, and next week's preview
- 📧 Sends the weekly roundup as a branded HTML email every Monday at 9AM

**Key Benefits**
- ✅ Fully automatic — detects match completion and triggers analysis without manual input
- ✅ Duplicate prevention — never analyzes the same match twice
- ✅ GPT-4o writes like a cricket journalist, not a data report
- ✅ Two email formats — immediate post-match deep dive and weekly roundup digest
- ✅ Complete audit trail across two Google Sheets
- ✅ Falls back to any completed T20 when IPL is off-season, so testing always works
- ✅ Clean termination on both IF nodes — no dangling branches

**How It Works**
**SW1 — Match Completion Detector**
Every 30 minutes the workflow fetches all current and recent matches from CricAPI and reads the Match Log sheet. The Code node filters for completed IPL T20 matches by checking that the match name contains IPL or Indian Premier League, the match type is T20, and both matchStarted and matchEnded are true (a sketch of this filter and the run-rate computation follows below). It then compares every completed match against the set of already-analyzed match IDs in the Match Log. If a new unanalyzed match is found, it gets saved to the Match Log with analyzed set to false, and all scorecard data flows forward into the analysis engine. If no new match is found, an IF node stops the workflow cleanly.

**SW2 — Deep Dive Analyzer**
The match data flows directly from SW1 into the analysis prompt builder. The Code node computes run rates for both innings and assembles a structured prompt containing both innings' stats, the match result, and clear instructions for GPT-4o to act as a cricket journalist. GPT-4o returns a headline, a 3-4 sentence match summary, separate tactical breakdowns for each innings, three key moments that decided the match, an overall tactical assessment, player of the match with reasoning, and a one-sentence forward-looking note. The response is parsed and assembled into a branded HTML email with a dark blue header, score display, color-coded analysis sections, and a player of the match spotlight. The email is sent immediately, and both Google Sheets are updated to record that this match has been analyzed and the email has been sent.

**SW3 — Weekly Digest**
Every Monday at 9AM the workflow reads all rows from the Analysis Log and filters for entries from the past 7 days. If matches exist, GPT-4o generates a weekly roundup covering the week's headline, individual one-liner recaps for each match, player of the week, the biggest talking point or controversy, and a preview of the upcoming week. The response is assembled into a branded weekly roundup email and sent. If no matches were analyzed in the past 7 days, the workflow stops cleanly without sending a blank email.

**Features**
- 30-minute polling for match completion detection
- Dynamic IPL match filtering — no hardcoded IDs
- Duplicate prevention via Match Log sheet lookup
- Run-rate computation for both innings
- GPT-4o post-match analysis with 8 structured output fields
- Immediate post-match email delivery
- Weekly Monday digest with recaps, POTW, talking point, and preview
- Two branded HTML email templates with dark blue cricket theme
- Two Google Sheets for match tracking and analysis history
- IF nodes with No Operation fallbacks on both SW1 and SW3
- Fallback to any completed T20 for off-season testing

**Requirements**
- CricAPI account and API key — free tier at cricapi.com
- OpenAI API key (GPT-4o access)
- Google Sheets OAuth2 connection
- Gmail OAuth2 connection

**Setup Steps**
1. Sign up at cricapi.com and get your free API key
2. Create a Google Sheet called "IPL Post Match Analyzer" with 2 sheets — Match Log and Analysis Log
3. Add the correct column headers to both sheets
4. Paste your Sheet ID into all Google Sheets nodes
5. Connect Google Sheets OAuth2 credentials
6. Add your OpenAI API key to both OpenAI nodes
7. Add your Gmail OAuth2 credentials and set your email in both Gmail nodes
8. Activate the workflow — the system runs itself from here

**Target Audience**
- 📺 Sports media companies automating post-match editorial content
- 🏏 IPL cricket newsletters and fan platforms delivering expert analysis
- 🤖 Automation agencies building cricket intelligence products for media and franchise clients
- 📱 Fan apps that want to surface match analysis without hiring a commentary team
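The SW1 filter and run-rate computation aren't shown in the listing. A minimal sketch, assuming the usual CricAPI response shape (a data array with name, matchType, matchStarted, matchEnded, and a score array), could look like this; verify the field names against your own API responses:

```javascript
// Hypothetical sketch of the SW1 filter and SW2 run-rate step.
// CricAPI field names below are assumptions based on the public API shape.
// The upstream node name "Read Match Log" and its match_id column are also assumptions.
const matches = $json.data ?? [];
const analyzedIds = new Set(
  $("Read Match Log").all().map((item) => String(item.json.match_id))
);

const newIplMatches = matches.filter((m) =>
  /IPL|Indian Premier League/i.test(m.name ?? "") &&
  String(m.matchType).toLowerCase() === "t20" &&
  m.matchStarted === true &&
  m.matchEnded === true &&
  !analyzedIds.has(String(m.id))
);

const withRunRates = newIplMatches.map((m) => ({
  ...m,
  runRates: (m.score ?? []).map((innings) => ({
    inning: innings.inning,
    // Treats overs as a plain decimal; convert balls properly (e.g. 19.4 overs)
    // if you need exact run rates.
    runRate: innings.o ? +(innings.r / innings.o).toFixed(2) : 0,
  })),
}));

return withRunRates.map((m) => ({ json: m }));
```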
by Avkash Kakdiya
**How it works**
This workflow runs on a schedule and retrieves newly created HubSpot contacts from the past 24 hours. It processes each contact individually and generates a tailored marketing campaign using an AI model. The generated content is formatted into a clean HTML structure. Finally, a personalized email is sent to each contact with their campaign strategy.

**Step-by-step**
**Trigger and fetch contacts**
- Schedule Trigger – Starts the workflow at defined intervals.
- Search contacts – Retrieves contacts created in the last 24 hours from HubSpot.

**Process and generate campaign**
- Loop Over Contacts – Splits contacts into individual items for processing.
- AI Agent – Generates a personalized marketing campaign strategy.
- Groq Chat Model – Sub-node providing the AI model for content generation.
- Format AI's output – Converts AI text into an HTML-friendly format.
- Send a message – Sends the personalized email to each contact.

**Why use this?**
- Automates personalized outreach for every new lead
- Delivers instant value with AI-generated campaign strategies
- Reduces manual marketing effort and response time
- Improves engagement through highly tailored messaging
- Easily scalable and customizable for different business needs
by Yassin Zehar
**Description**
Automatically triage Product UAT feedback with AI, deduplicate it against your existing Notion backlog, create or update the right Notion item, and close the loop with the tester (Slack or email). This workflow standardizes incoming UAT feedback, runs AI classification (type, severity, summary, suggested title, confidence), searches Notion to prevent duplicates, and upserts the roadmap entry for product review. It then confirms receipt to the tester and returns a structured webhook response.

**Context**
Feature requests often arrive unstructured and get lost across channels. Product teams waste time re-triaging the same ideas, creating duplicates, and manually confirming receipt. This workflow ensures:
- Faster feature request triage
- Fewer duplicates in your roadmap/backlog
- Consistent structure for every feedback item
- Automatic tester acknowledgement
- Full traceability via webhook response

**Who is this for?**
- Product Managers running UAT or beta programs
- Product Ops teams managing a roadmap backlog
- Teams collecting feature requests via forms, Slack, or internal tools
- Anyone who wants AI speed with clean backlog hygiene

**Requirements**
- Webhook trigger (form / Slack / internal tool)
- OpenAI account (AI triage)
- Notion account (roadmap/backlog database)
- Slack and/or Gmail (tester notification)

**How it works**
1. Trigger: feedback received via webhook
2. Normalize & Clean: standardizes fields and cleans the message
3. AI Triage: returns structured JSON (type, severity, title, confidence…); an illustrative output shape is sketched below
4. Notion Dedupe & Upsert: search by suggested title → update if found, else create
5. Closed Loop: notify the tester (Slack or email) + webhook response payload

**What you get**
- One workflow to capture and structure feature requests
- Clean Notion backlog without duplicates
- Automatic tester confirmation
- Structured output for downstream automation

**About me**
I’m Yassin, a Product Manager scaling tech products with a data-driven mindset.
📬 Feel free to connect with me on LinkedIn
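The triage fields are listed above, but the exact JSON contract depends on your own prompt; the field names and values below are assumptions. An illustrative sketch of the triage output and the dedupe key derived from it:

```javascript
// Illustrative shape of the AI triage output and the Notion dedupe key.
// Field names mirror the listing (type, severity, summary, suggested title,
// confidence) but are not the workflow's guaranteed schema.
const triage = {
  type: "feature_request",        // e.g. feature_request | bug | ux_issue
  severity: "medium",             // low | medium | high | critical
  summary: "Tester wants bulk export of UAT results to CSV.",
  suggested_title: "Bulk CSV export for UAT results",
  confidence: 0.86,
};

// Dedupe idea: search Notion by a normalized version of the suggested title,
// update the page if a match is found, otherwise create a new backlog item.
const dedupeKey = triage.suggested_title.trim().toLowerCase();

return [{ json: { ...triage, dedupeKey } }];
```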