by Jamot
This n8n template automatically summarizes your WhatsApp group activity from the past week and generates a team report.

**Why use this?** Remote teams rely on chat for communication, but important discussions, decisions, and ideas get buried in message threads and forgotten by Monday. This workflow ensures nothing falls through the cracks.

**How it works**
- Runs every Monday at 6am to collect the previous week's group messages
- Groups conversations by participant and analyzes message threads
- AI summarizes each member's activity into a personal report
- Combines all individual reports into one comprehensive team overview
- Posts the final report back to your WhatsApp group to kick off the new week

**Setup requirements**
- WhatsApp via whapAround.pro (no Meta API needed)
- Gemini AI (or an alternative LLM of your choice)

**Best practices**
- Use one workflow per WhatsApp group for focused results
- Filter for specific team members if needed
- Customize the report tone to match your team culture
- Adjust the schedule if weekly reports don't suit your team's pace

**Customization ideas**
- Send reports via email instead of posting to busy groups
- Include project metrics alongside message summaries
- Connect to knowledge bases or ticket systems for additional context

Perfect for project managers who want to keep distributed teams aligned and ensure important conversations don't get lost in the chat noise.
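The "groups conversations by participant" step can be sketched in an n8n Code node. A minimal sketch: the field names (`sender`, `text`) are assumptions, so adapt them to whatever shape your WhatsApp provider returns.

```javascript
// Group a flat list of WhatsApp messages by participant so each
// member's thread can be summarized separately.
// Field names (sender, text) are illustrative assumptions.
function groupBySender(messages) {
  const threads = {};
  for (const msg of messages) {
    if (!threads[msg.sender]) threads[msg.sender] = [];
    threads[msg.sender].push(msg.text);
  }
  return threads;
}

// Example: three messages from two senders
const grouped = groupBySender([
  { sender: 'Alice', text: 'Shipping Friday' },
  { sender: 'Bob', text: 'Blocked on review' },
  { sender: 'Alice', text: 'Docs updated' },
]);
```

Each per-sender array then becomes the input for one individual AI summary before the reports are merged.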
by Hunyao
**What it does**
Automatically monitors multiple subreddits daily, identifies trending posts with high engagement, and delivers AI-powered summaries directly to your inbox. Never miss important discussions in your favorite communities again.

**Perfect for**
Investors tracking market sentiment, researchers monitoring industry discussions, content creators finding trending topics, or anyone wanting curated Reddit insights without endless scrolling.

**Apps used**
Reddit, OpenRouter (GPT-4o mini), Gmail

**How it works**
- Triggers daily at your chosen time across all specified subreddits
- Fetches hot posts from the last 24 hours with scores above 30 upvotes
- Sorts posts by engagement score to prioritize trending content
- Extracts post content plus top-level comments for full context
- Generates concise AI summaries for each high-value thread
- Compiles summaries into a clean HTML email digest
- Delivers the digest to your Gmail inbox with clickable Reddit links

**Setup**
Configure these three essential settings:
1. Schedule time: set your preferred daily delivery time in the Schedule Trigger node (currently 6 AM). Note: times display in your workflow timezone.
2. Topic and subreddits: in the "Set Topic, Subreddits and Email Address" node, replace the topic name (e.g., "Investing") and the subreddit array (e.g., ["investing", "stocks"]).
3. Email recipient: replace the Gmail address in the same node.

**Credentials**
Reddit OAuth2 for API access, OpenRouter API key for AI summaries, Gmail OAuth2 for email delivery.

If you have any questions about running the workflow, feel free to reach out on my YouTube channel: https://www.youtube.com/@lifeofhunyao
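The filter-and-sort step above (last 24 hours, more than 30 upvotes, highest score first) can be sketched as a small Code-node function. The field names follow Reddit's JSON API (`score`, `created_utc`); verify them against your Reddit node's actual output.

```javascript
// Keep posts from the last 24 hours with scores above 30 upvotes,
// then sort by score descending so the digest leads with the
// most-engaged threads.
function filterTrending(posts, nowUtc) {
  const dayAgo = nowUtc - 24 * 60 * 60; // Reddit timestamps are in seconds
  return posts
    .filter(p => p.score > 30 && p.created_utc >= dayAgo)
    .sort((a, b) => b.score - a.score);
}

const now = 1_700_000_000;
const trending = filterTrending([
  { title: 'old',  score: 500, created_utc: now - 90_000 }, // older than 24h
  { title: 'low',  score: 12,  created_utc: now - 3_600 },  // too few upvotes
  { title: 'mid',  score: 45,  created_utc: now - 7_200 },
  { title: 'top',  score: 120, created_utc: now - 1_800 },
], now);
```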
by Juan Carlos Cavero Gracia
This workflow automates batch video publishing prep from a Google Drive folder, with AI-generated, platform-specific copy and a simple approval queue in Google Sheets. Perfect for agencies, content creators, or teams.

**What this workflow does**
- **Fetches videos from a Google Drive folder:** You provide a folder ID and the workflow lists all files, keeping only .mp4.
- **Builds a simple publishing calendar:** You configure a start date, cadence (daily, 5/week, 3/week), timezone, and one publish hour shared across all selected platforms. The workflow creates a Schedule Date and Schedule DateTime for each video.
- **Analyzes each video with AI:** Gemini performs a structured analysis of the video to understand what happens in the content, key topics, tone, and audience intent.
- **Generates platform-specific social copy:** For each video, the AI creates unique text for TikTok, Instagram Reels, and YouTube Shorts. The prompts are language-aware: if the video is in English, the titles/descriptions are generated in English; if the video is in Spanish, they are generated in Spanish.
- **Saves everything to Google Sheets as drafts:** Each video becomes one row with titles, descriptions, hashtags/tags, a single shared Schedule DateTime, and Status = draft.
- **Auto-publishes approved rows (Flow 2):** Every hour, the workflow loads the sheet, filters rows where Status = approved, downloads the Drive file, schedules the video to the selected platforms, and updates Status = scheduled.

**Sheet structure**
The tracking sheet is designed as a clean approval queue. Recommended columns: Video ID, Video Name, Index, Status, Schedule Date, Schedule DateTime, TikTok Title, TikTok Description, TikTok Hashtags, Instagram Title, Instagram Description, Instagram Hashtags, YouTube Title, YouTube Description, YouTube Tags, Summary, Profile, Platforms, Created At.

**How approval works**
1. New rows start as draft.
2. You revise any copy directly in Sheets.
3. When ready, change Status to approved.
4. Flow 2 schedules the video and updates the row.
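The calendar builder described above can be sketched as follows. The cadence-to-spacing mapping here is a simplification (daily = every day; 5/week and 3/week skip weekends), so the actual workflow may space dates differently.

```javascript
// Build one shared Schedule Date + Schedule DateTime per video from a
// start date, a cadence, and a single publish hour.
// Cadence mapping is an illustrative assumption:
//   daily = every day, 5/week = weekdays, 3/week = every other weekday.
function buildSchedule(videos, startDate, cadence, publishHour) {
  const stepDays = { daily: 1, '5/week': 1, '3/week': 2 }[cadence] ?? 1;
  const out = [];
  const d = new Date(startDate + 'T00:00:00Z');
  for (const video of videos) {
    // Weekly cadences skip Saturday (6) and Sunday (0)
    while (cadence !== 'daily' && (d.getUTCDay() === 0 || d.getUTCDay() === 6)) {
      d.setUTCDate(d.getUTCDate() + 1);
    }
    const date = d.toISOString().slice(0, 10);
    out.push({
      video,
      scheduleDate: date,
      scheduleDateTime: `${date}T${String(publishHour).padStart(2, '0')}:00:00`,
    });
    d.setUTCDate(d.getUTCDate() + stepDays);
  }
  return out;
}

const plan = buildSchedule(['a.mp4', 'b.mp4'], '2024-06-03', 'daily', 9);
```

To add per-platform times later, this is the function you would extend to emit one datetime per platform instead of one shared value.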
**Requirements**
- **Google Drive** access
- **Google Sheets** access
- **Gemini** API key
- **Upload-Post** account with connected social profiles

**Installation & setup**
1. Create your tracking sheet: copy this sheet into your Drive: https://docs.google.com/spreadsheets/d/1cegJHxj7Kx4Tg8gMr3uixpzToNc62VEvuuz37iFvnRw/edit?usp=sharing
2. Connect credentials in n8n: Google Drive OAuth, Google Sheets OAuth, Gemini credentials, Upload-Post credentials.
3. Run the form and provide: Drive Folder ID, Profile Username, Platforms, Timezone, Start Date, Cadence, Publish Hour, Google Sheet ID.

If you want to explore the API used for publishing, these docs can help for custom extensions: social media API

**Ideal use cases**
- **Creators** batching Shorts/Reels/TikToks who want a single approval queue
- **Agencies** that need a simple, client-friendly review workflow
- **Teams** building internal content ops with predictable scheduling

**Notes**
This version keeps scheduling simple, with one Schedule DateTime per video shared across all selected platforms. If you later want per-platform times, extend the calendar builder to generate separate datetimes.
by Intuz
This n8n template from Intuz provides a complete, automated solution for an autonomous social media manager. The workflow uses an AI agent to intelligently generate unique, high-quality content, check for duplicates, and post on a consistent schedule, automating your entire Twitter presence.

**Who's this workflow for?**
- Social Media Managers
- Marketing Teams & Agencies
- Startup Founders & Solopreneurs
- Content Creators

**How it works**
1. **Runs on a schedule:** The workflow automatically starts at a set interval (e.g., every 6 hours), ensuring a consistent posting schedule.
2. **AI generates a new tweet:** An advanced AI Agent, powered by OpenAI, uses a detailed prompt to craft a new, engaging tweet. The prompt defines the tone, topics, character limits, and hashtags.
3. **Checks for duplicates:** Before finalizing the tweet, the AI Agent uses a tool to read a Google Sheet containing a log of all previously published posts, ensuring the new content is always unique.
4. **Posts to Twitter (X):** The final, unique tweet is automatically posted to your connected Twitter account.
5. **Logs the new post:** After posting, the workflow logs the new tweet back into the Google Sheet, updating the history for the next run. This completes the autonomous loop.

**Setup instructions**
- **Schedule your posts:** In the Start Workflow (Schedule Trigger) node, set the frequency you want the workflow to run (e.g., every 6 hours).
- **Connect OpenAI:** Add your OpenAI API key in the OpenAI Chat Model node. Customize the prompt in the AI Agent node to match your brand's voice, target keywords, and specific URLs.
- **Configure Google Sheets:** Connect your Google Sheets account. Create a sheet with two columns: Tweet Content and Status. In both the Get Data from Google Sheet and Add new Tweet to Google sheet nodes, select your credentials and specify the Document ID and Sheet Name.
- **Connect Twitter (X):** In the Create Tweet node, connect the Twitter account where you want to post.
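The duplicate check in step 3 can be sketched as a comparison against the sheet's history rows. Normalizing case and whitespace catches near-identical repeats; the `Tweet Content` column name matches the setup instructions, but the agent's actual tool logic may differ.

```javascript
// Check a candidate tweet against previously published posts logged
// in the Google Sheet. Normalization (lowercase, collapsed spaces)
// is an illustrative choice to catch trivially reworded repeats.
function isDuplicate(candidate, historyRows) {
  const norm = s => s.toLowerCase().replace(/\s+/g, ' ').trim();
  return historyRows.some(row => norm(row['Tweet Content']) === norm(candidate));
}

const history = [
  { 'Tweet Content': 'Automate everything with n8n! #automation', Status: 'posted' },
];
const dup = isDuplicate('Automate  everything with n8n!  #automation', history);
const fresh = isDuplicate('New post about AI agents', history);
```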
- **Activate workflow:** Save the workflow and toggle the "Active" switch to ON. Your AI social media manager is now live!

**Key requirements to use this template**
Before you start, please ensure you have the following accounts and assets ready:
- **An n8n instance:** An active n8n account (Cloud or self-hosted) where you can import and run this workflow.
- **OpenAI account:** An active OpenAI account with an API key. Billing must be enabled to use the language models for tweet generation.
- **Google account & sheet:** A Google account and a pre-made Google Sheet. The sheet must have two specific columns: Tweet Content and Status.
- **Twitter (X) developer account:** A Twitter (X) account with an approved Developer profile, plus an App created in the Developer Portal with the necessary permissions (v2 API access with write scopes) to post tweets automatically.

**Connect with us**
- Website: https://www.intuz.com/services
- Email: getstarted@intuz.com
- LinkedIn: https://www.linkedin.com/company/intuz
- Get Started: https://n8n.partnerlinks.io/intuz

For custom workflow automation, click here: Get Started
by Muhammad Nouman
**How it works**
This workflow turns a Google Drive folder into a fully automated YouTube publishing pipeline. Whenever a new video file is added to the folder, the workflow generates all YouTube metadata using AI, uploads the video to your YouTube channel, deletes the original file from Drive, sends a Telegram confirmation, and can optionally post to Instagram and Facebook using permanent system tokens.

High-level flow:
1. Detects new video uploads in a specific Google Drive folder.
2. Downloads the file and uses AI to generate a polished first-person YouTube description, an SEO-optimized YouTube title, and high-ranking YouTube tags.
3. Uploads the video to YouTube with the generated metadata.
4. Deletes the original Drive file after upload.
5. Sends a Telegram notification with video details.
6. (Optional) Posts to Instagram & Facebook using permanent system user tokens.

**Setup steps**
Setup usually takes a few minutes:
1. Add Google Drive OAuth2 credentials for the trigger and download/delete nodes.
2. Add your OpenAI (or Gemini) API credentials for title/description/tag generation.
3. Add YouTube OAuth2 credentials in the YouTube Upload node.
4. Add Facebook/Instagram Graph API credentials if enabling cross-posting.
5. Replace placeholder IDs (Drive folder ID, Page ID, IG media endpoint).
6. Review the sticky notes in the workflow; they contain setup guidance and token info.
7. Activate the Google Drive trigger to start automated uploads.
by Bhavy Shekhaliya
**Overview**
An AI-powered workflow that transforms any article URL into platform-optimized social media posts for LinkedIn, Twitter (X), and Reddit. It uses Mozilla Readability for content extraction, multi-agent AI with RAG over a database of viral LinkedIn posts, and interactive review forms for content refinement before auto-publishing.

Key capabilities:
- Extracts article content: title, author, text, images, metadata
- Generates LinkedIn posts using a 3-agent system with viral pattern matching
- Creates Twitter posts under 280 characters with article links
- Auto-posts to Reddit with AI-selected flairs
- Interactive review/regeneration workflow with feedback loops
- Auto-publishes with images or links to all platforms

**How it works**

*Stage 1: Article content extraction*
1. Form submission: the user enters an article URL (with basic auth protection).
2. URL validation: checks for a valid URL format.
3. Article scraping: an HTTP request fetches the HTML content.
4. Readability parsing: Mozilla Readability extracts clean article text (removing ads, navigation, etc.), title, author, excerpt, word count, site name, and the featured image (from og:image, twitter:image, or the first img tag).
5. Error handling: returns a user-friendly error if scraping fails.

*Stage 2: LinkedIn post generation (3-agent system)*
- **Agent 1: LinkedIn Post Strategist.** Input: the extracted article content (title, text, author, excerpt). RAG process: queries the Supabase vector database for similar viral LinkedIn posts. Analysis: identifies patterns, hooks, formatting, and engagement triggers. Output: strategic insights and viral content patterns.
- **Agent 2: LinkedIn Post Generator.** Input: the article content plus the strategist's insights. Process: creates a post using viral patterns from the database; the article URL must be included. Output: a draft LinkedIn post.
- **Agent 3: LinkedIn Post Formatter.** Input: the generated post. Process: removes extraneous content, applies Sans Serif Bold Unicode for emphasis (𝗯𝗼𝗹𝗱 𝘁𝗲𝘅𝘁), removes markdown/em dashes, and ensures clean formatting. Output: polished, ready-to-post LinkedIn content.

Review loop: the user sees the formatted post in a web form with two options, "Regenerate" or "Continue". On regenerate, the user provides feedback and the agent creates a new version, followed by a second review form with the same options. After two iterations or approval, the workflow proceeds to image selection.

*Stage 3: Image handling for LinkedIn*
1. Image preview: shows the extracted article image.
2. User choice: "Yes" downloads the image and posts text + image; "Continue without Image" posts text + article link preview.
3. Auto-publish: posts to LinkedIn in the selected format.

*Stage 4: Twitter (X) post generation* (runs in parallel with LinkedIn)
1. Twitter Agent: creates a tweet under 280 characters (including spaces) that must include the article URL, using GPT-4.1 or GPT-5 models.
2. Tweet review form: the user reviews the generated tweet.
3. Regeneration loop (if requested): the user provides feedback, the Re-generate Tweet Agent creates a new version, and a second review form follows.
4. Auto-tweet: posts with the article image attached.

*Stage 5: Reddit post automation* (runs in parallel with LinkedIn/Twitter)
1. Subreddit selection: the user picks from a dropdown (r/n8n, r/mcp, r/technews).
2. Flair retrieval: fetches the available flairs for the selected subreddit via the Reddit API.
3. AI flair selection: GPT-4o-mini analyzes the article title plus the available flairs and selects the most appropriate one.
4. Auto-post: submits a link post to Reddit with the title and the selected flair.

**How to use**

*Prerequisites: API credentials required*
- OpenAI API: GPT-4.1, GPT-5, GPT-5-mini, GPT-4o-mini access
- Supabase: vector database with a linkedin_post table (from the previous workflow)
- LinkedIn OAuth2: developer app with posting permissions
- Twitter OAuth2: developer account with tweet permissions
- Reddit OAuth2: app credentials with submit permissions
- Basic Auth: for form password protection

*Setup steps*
1. **Configure form access:** Open the "On Article Submission" node, set up basic auth credentials for form protection, and get the form URL from the webhook settings.
2. **Link the vector database:** Ensure the Supabase vector store contains viral LinkedIn posts (use the previous workflow to populate it), verify the "LinkedIn Post Vector Store" credentials, and check that the "Embedding" node has an OpenAI API key.
3. **Set up social media APIs:**
   - LinkedIn: configure the "Text + Image" and "Text + Link" nodes, update the person parameter with your LinkedIn profile ID, and add OAuth2 credentials.
   - Twitter: configure the "Tweet" and "Re-generated Tweet" nodes and add Twitter OAuth2 credentials.
   - Reddit: update the subreddit list in the "Reddit Form" dropdown (customize to your subreddits), configure the "Get Flair" and "Reddit Post" nodes with OAuth2, and update the subreddit name in the "Reddit Post" query parameters.
4. **Configure AI models:** Verify all OpenAI credentials in the language model nodes. Models used: GPT-4.1, GPT-5, GPT-5-mini (adjust based on your access).
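The Stage 4 constraints (under 280 characters, article URL present) can be checked before posting with a small validation function. A sketch using raw string length; note that Twitter actually counts every URL as 23 characters after t.co wrapping, so raw length is a conservative check.

```javascript
// Validate a generated tweet against the two Stage 4 rules:
// fits in 280 characters (including spaces) and contains the
// article URL. Raw length is used as a conservative proxy for
// Twitter's weighted character counting.
function validateTweet(text, articleUrl) {
  const errors = [];
  if (text.length > 280) errors.push('over 280 characters');
  if (!text.includes(articleUrl)) errors.push('missing article URL');
  return { ok: errors.length === 0, errors };
}

const result = validateTweet(
  'Great read on workflow automation: https://example.com/article',
  'https://example.com/article'
);
const tooLong = validateTweet('x'.repeat(300), 'https://example.com/article');
```

A failing result would route the item back into the regeneration loop instead of posting.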
by totoma
**Use cases**
Receive a newsletter featuring curated, contributor-friendly issues from your favorite repositories. By regularly reviewing active issues and new releases, you'll naturally develop stronger habits around open-source contribution as your brain starts recognizing these projects as important.

**How it works**
1. Collects the latest issues, comments, and recent commits using the GitHub API.
2. Uses an AI model to select up to three beginner-friendly issues worth contributing to.
3. Summarizes each issue, with contribution guidance and relevance insights, using Deepwiki MCP.
4. Converts the summaries into HTML and delivers them as an email newsletter.

**Requirements**
- GitHub Personal Access Token
- OpenRouter API Key
- Google App Password
- Make sure your target open-source project is indexed at https://deepwiki.com/{owner}/{repo} (e.g. https://deepwiki.com/vercel/next.js)

**How to use**
1. Update the "Load repo info" node with your target repository's owner and name (e.g. owner: vercel, repo: next.js).
2. Add your GitHub Personal Access Token to the credentials of the "Get Issues from GitHub" node.
3. Connect your OpenRouter API key to all models linked to the Agent node.
4. Add your Google App Password to the "Send Email" node credentials.
5. Enter the same email address (associated with the Google App Password) in both the "to email" and "from email" fields; the newsletter will be sent to this address.

**Customization**
- Adjust the maximum number of contributor-friendly issues retrieved in the "Get Top Fit Issues" node.
- Improve results by tuning the models connected to the Agent node.
- Refine the criteria for "contributor-friendliness" within the "IssueRank Agent" node.

**Cron setup**
Replace the manual trigger with a Schedule Trigger node or another scheduling-capable node. If you don't have an n8n Cloud account, use this alternative setup: fork the repository and follow the setup instructions.

**Troubleshooting**
If there is an issue with the AI model's response, modify the ai_model setting. (If you want to use a free model, search for models containing "free" and choose one of them.)
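The "Get Issues from GitHub" step corresponds to a standard GitHub REST API call. A sketch of the request it would build; the endpoint and headers are standard GitHub REST v3, though the exact query options the workflow uses may differ.

```javascript
// Build the GitHub REST request for a repository's open issues,
// as the "Get Issues from GitHub" node would. The state/sort/
// per_page choices here are illustrative defaults.
function issuesRequest(owner, repo, token) {
  return {
    url: `https://api.github.com/repos/${owner}/${repo}/issues?state=open&sort=updated&per_page=30`,
    headers: {
      Authorization: `Bearer ${token}`,
      Accept: 'application/vnd.github+json',
    },
  };
}

const req = issuesRequest('vercel', 'next.js', 'ghp_example_token');
```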
by Milan Vasarhelyi - SmoothWork
**Video introduction**
Want to automate your inbox or need a custom workflow? 📞 Book a Call | 💬 DM me on LinkedIn

**What this workflow does**
This automation eliminates the tedious task of manually entering receipt data by automatically processing receipt images uploaded to Google Drive. When you drop a new receipt into a monitored folder, the workflow extracts the vendor name, date, itemized purchases, and total amount using AI, logs everything to a Google Sheet, and sends you an email confirmation with a formatted summary.

**Key benefits**
- **Save time:** No more manual data entry from receipts
- **Reduce errors:** AI-powered extraction ensures accuracy
- **Stay organized:** All expense data automatically tracked in one spreadsheet
- **Get notified:** Instant email confirmation when receipts are processed

**Common use cases**
- Personal expense tracking and budgeting
- Small business accounting and bookkeeping
- Reimbursement documentation
- Tax preparation record-keeping

**Setup requirements**
Accounts needed: Google Drive (receipt storage), OpenAI (AI-powered data extraction), Google Sheets (data logging), Gmail (notifications).

Configuration steps:
1. Google Drive: Connect your account and select the folder where you'll upload receipts.
2. Google Sheets: Make a copy of the template spreadsheet (link in workflow notes) to your own account and update the Sheet ID in the workflow.
3. Email recipient: Change the notification email address to your own.
4. AI model: The workflow uses GPT-4-mini by default, but you can select a different OpenAI model based on your accuracy and cost preferences.

The AI Agent is configured to extract data in a strict JSON format with fields for vendor, date (converted to DD/MM/YYYY), itemized purchases, and total amount.
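The strict JSON shape and the DD/MM/YYYY date conversion mentioned above can be sketched as follows. The exact field names (`vendor`, `date`, `items`, `total`) are illustrative; check the AI Agent's output schema in the workflow.

```javascript
// Assemble the receipt record the workflow logs to Google Sheets,
// converting an ISO date (YYYY-MM-DD) to the DD/MM/YYYY format the
// template expects. Field names are illustrative assumptions.
function toReceiptRecord(vendor, isoDate, items, total) {
  const [y, m, d] = isoDate.split('-');
  return {
    vendor,
    date: `${d}/${m}/${y}`, // DD/MM/YYYY
    items,                  // e.g. [{ name, price }]
    total,
  };
}

const record = toReceiptRecord(
  'Acme Market', '2024-03-09',
  [{ name: 'Coffee', price: 4.5 }], 4.5
);
```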
by WeblineIndia
**(Retail) Customer Cleanup API → Supabase with notifications**

This workflow provides an API-first solution to validate, clean, deduplicate, and store customer data in Supabase. It ensures consistent customer records, prevents duplicates, and keeps both internal teams and customers informed through automated notifications.

The workflow acts as a backend customer intake API: it validates and normalizes incoming customer data, checks for existing users in Supabase, stores new customers safely, and returns clear API responses. Internal teams receive Slack and Telegram updates, while customers get an email confirmation on successful creation.

You receive:
- **Centralized customer data validation**
- **Automatic duplicate prevention**
- **Supabase-backed customer storage**
- **Real-time API responses**
- **Team notifications + user confirmation email**

Ideal for retail, e-commerce, and SaaS teams that want clean customer data without manual intervention.

**Quick start: implementation steps**
1. Import the provided n8n workflow JSON.
2. Configure Supabase credentials with read/write access.
3. Connect Slack, Telegram, and Gmail/SMTP credentials.
4. Copy the Webhook URL and use it as your customer intake API.
5. Activate the workflow: your customer API is live.

**What it does**
This workflow automates customer data intake and processing:
1. Receives customer data via a POST API call.
2. Cleans and normalizes names, email addresses, and phone numbers.
3. Validates required fields and formats using JavaScript.
4. Aggregates clear, field-specific validation errors.
5. Checks Supabase to prevent duplicate users.
6. Stores valid, new customers in Supabase.
7. Returns structured API responses (success, validation error, or duplicate).
8. Sends notifications to Slack and Telegram.
9. Emails the customer after successful account creation.

This ensures reliable, consistent customer records across systems.
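The clean-normalize-validate steps can be sketched as a single Code-node function that aggregates field-specific errors rather than failing on the first one. The rules shown (simple email regex, digits-only phone, minimum length) are illustrative; the workflow's actual validation node may be stricter.

```javascript
// Normalize an incoming customer payload, then validate it and
// collect per-field error messages. Rules here are illustrative
// assumptions, not the workflow's exact logic.
function validateCustomer(input) {
  const customer = {
    name: (input.name || '').trim().replace(/\s+/g, ' '),
    email: (input.email || '').trim().toLowerCase(),
    phone: (input.phone || '').replace(/[^\d+]/g, ''),
  };
  const errors = {};
  if (!customer.name) errors.name = 'Name is required';
  if (!/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(customer.email))
    errors.email = 'Invalid email format';
  if (customer.phone.replace(/\D/g, '').length < 7)
    errors.phone = 'Phone number too short';
  return { customer, errors, valid: Object.keys(errors).length === 0 };
}

const good = validateCustomer({
  name: '  Jane  Doe ', email: 'Jane@Example.com', phone: '+1 (555) 123-4567',
});
const bad = validateCustomer({ name: '', email: 'not-an-email', phone: '12' });
```

When `valid` is false, the aggregated `errors` object maps directly onto the workflow's 400 response body.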
**Who's it for**
This workflow is ideal for:
- Retail and e-commerce platforms
- CRM and customer data teams
- SaaS product teams
- Backend automation teams
- Marketing teams needing clean contact lists
- Developers building API-driven systems

**Requirements to use this workflow**
- **n8n instance** (cloud or self-hosted)
- **Supabase project** with a customers table and a service role key
- **Slack workspace** with API access
- **Telegram bot** + chat ID
- **Gmail or SMTP account** for user notifications

**How it works**
1. **API request:** The client sends customer data to the webhook endpoint.
2. **Validation & cleanup:** JavaScript validates and formats the data.
3. **Validation fail:** Returns a 400 response with clear error messages.
4. **Duplicate check:** Supabase is queried using the email address.
5. **Duplicate found:** Returns a 409 response without creating a record.
6. **Create customer:** The new customer is saved in Supabase.
7. **Success response:** The API confirms successful creation.
8. **Notifications:** Slack, Telegram, and the customer email are triggered.

**Setup steps**
1. Create a Supabase table with fields for customer data, validation status, and errors.
2. Add Supabase credentials to n8n.
3. Import the workflow JSON into n8n.
4. Configure the Webhook node and copy the API URL.
5. Review the validation logic if custom rules are required.
6. Configure Slack, Telegram, and email credentials.
7. Test using Postman with invalid, duplicate, and valid requests.
8. Activate the workflow.

**How to customize nodes**
- **Validation rules:** Update the JavaScript validation node to add country-specific phone rules or additional fields.
- **Duplicate logic:** Extend duplicate checks to include phone numbers or other identifiers.
- **Notifications:** Modify Slack and Telegram messages; add emojis, mentions, or execution metadata.
**Add-ons (optional enhancements)**
You can extend this workflow to:
- Add API key authentication
- Enable rate limiting
- Log failed attempts separately
- Support multi-country phone validation
- Add CRM or email marketing sync
- Implement soft deletes or upserts

**Use case examples**
1. **Customer Registration API:** Centralize customer creation for web and mobile apps.
2. **Data Hygiene Automation:** Prevent invalid or duplicate contacts in your database.
3. **Retail & CRM Integration:** Keep customer records consistent across systems.
4. **Marketing Readiness:** Ensure only clean, valid contacts enter campaigns.

**Troubleshooting guide**

| Issue | Possible Cause | Solution |
| --- | --- | --- |
| Validation always fails | Incorrect payload structure | Ensure data is sent in the request body |
| Duplicate user created | Duplicate check misconfigured | Verify Supabase filter conditions |
| No Slack alert | Invalid credentials | Reconnect the Slack API |
| No email sent | Gmail/SMTP not configured | Verify the sender account |
| API not responding | Webhook not active | Activate the workflow |

**Need help?**
If you need help customizing or extending this workflow (adding authentication, scaling for high traffic, integrating CRMs, or enhancing validation), the n8n automation team at WeblineIndia can assist you with production-ready automation and integration support. Contact us today.
by Cheng Siong Chin
**How it works**
This workflow automates real-time energy grid telemetry ingestion, compliance validation, and multi-channel reporting for grid operators, energy managers, and compliance teams.

Telemetry data arrives via webhook and is routed to a central Coordination Agent with persistent memory. Four specialised AI sub-agents operate in parallel:
- **Grid Signal Agent:** validates signals via the Telemetry Validation Tool and parses their structure
- **Compliance Agent:** checks readings against compliance history
- **Reporting Agent:** generates structured reports
- **Notification Agent:** triggers Slack alerts

Results flow into a Prepare Telemetry Storage node, then branch into three outputs: validated telemetry stored to a grid database, compliance alerts prepared and stored, and email reports dispatched. This eliminates manual grid monitoring, accelerates anomaly response, and maintains a continuous compliance audit trail across energy infrastructure.

**Setup steps**
1. Configure the webhook URL in the Grid Telemetry Webhook node.
2. Set AI model credentials (OpenAI/Anthropic) in all agent and model nodes.
3. Connect Slack credentials and the target channel to the Slack Notification Tool node.
4. Configure email credentials in the Send Report Email node.
5. Connect database/Google Sheets credentials.

**Prerequisites**
- Slack workspace and bot token
- Email account (SMTP or Gmail OAuth2)
- Database or Google Sheets for telemetry and alert storage

**Use cases**
- Real-time anomaly detection and alerting across smart grid sensor networks
- Automated regulatory compliance reporting for energy grid operators

**Customisation**
- Extend Compliance Agent thresholds to match regional grid standards
- Replace Slack with Teams or PagerDuty for incident escalation

**Benefits**
Eliminates manual telemetry review and processes grid events at machine speed.
by Veena Pandian
**Who is this for?**
E-commerce store owners, product managers, marketplace sellers, and pricing analysts who want to automatically track competitor pricing and get actionable alerts when their products are overpriced or underpriced relative to the market.

**What this workflow does**
This workflow runs daily to compare your product prices against live competitor prices from Google Shopping. It identifies pricing gaps, calculates suggested prices that protect your margins, sends instant Slack alerts for critical issues, logs everything to a historical price tracking sheet, and delivers a comprehensive daily summary via Slack and email.

**How it works**
1. A daily trigger fires on a configurable schedule (default: every 24 hours).
2. Reads your product catalog from a Google Sheet; column names are auto-detected regardless of naming convention.
3. Searches Google Shopping for each product using SearchAPI to find real-time competitor prices.
4. Analyzes the pricing gap: compares your price to the market average and classifies each product as UNDERPRICED, SLIGHTLY_UNDER, COMPETITIVE, SLIGHTLY_OVER, or OVERPRICED.
5. Suggests optimal prices based on market averages while maintaining a minimum margin above your cost.
6. Sends instant Slack alerts when a product hits critical or warning thresholds.
7. Logs all results to a price_log tab in your Google Sheet for trend analysis.
8. Sends a daily summary via Slack message and HTML email with a full breakdown of all products.

**Setup steps**
1. Create a Google Sheet with a products tab containing the columns product_name and my_price (required), and optionally sku and my_cost.
2. Add a second tab called price_log with the headers: date, product_name, sku, my_price, my_cost, margin_now, competitor_lowest, competitor_average, competitor_highest, competitor_count, gap_pct, signal, suggested_price, action.
3. Get a SearchAPI key from searchapi.io and set it as the n8n environment variable SEARCHAPI_KEY.
4. Connect Google Sheets OAuth2 credentials and update the sheet ID in both Sheets nodes.
5. Connect Slack OAuth2 credentials and configure your alert channel.
6. Connect Gmail OAuth2 credentials and update the recipient email address.

**Requirements**
- n8n instance (self-hosted or cloud)
- SearchAPI.io account and API key
- Google Cloud project with the Sheets API enabled
- Slack workspace with a bot configured
- Gmail account with OAuth2 credentials

**How to customize**
- **Pricing thresholds:** Adjust the 0.85, 0.95, 1.05, 1.15 multipliers in the "Analyze Pricing Gap" node to change sensitivity.
- **Minimum margin:** Change the 1.15 cost multiplier to set your floor margin (default: 15%).
- **Schedule:** Modify the trigger interval for more or less frequent checks.
- **Notifications:** Replace or supplement Slack/email with Telegram, Discord, Microsoft Teams, or webhooks.
- **Region:** Change the gl parameter in the search node from us to your target market's country code.
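The gap analysis described above can be sketched with the documented 0.85 / 0.95 / 1.05 / 1.15 multipliers and the 1.15 cost floor. A sketch, not the exact node logic; the boundary handling (which side each threshold falls on) is an assumption.

```javascript
// Classify a product against the competitor average and suggest a
// price that never drops below cost x 1.15 (the default 15% floor
// margin). Thresholds match the "Analyze Pricing Gap" multipliers;
// exact boundary behavior is an illustrative assumption.
function analyzePricing(myPrice, myCost, competitorAvg) {
  const ratio = myPrice / competitorAvg;
  let signal;
  if (ratio < 0.85) signal = 'UNDERPRICED';
  else if (ratio < 0.95) signal = 'SLIGHTLY_UNDER';
  else if (ratio <= 1.05) signal = 'COMPETITIVE';
  else if (ratio <= 1.15) signal = 'SLIGHTLY_OVER';
  else signal = 'OVERPRICED';
  const suggested = Math.max(competitorAvg, myCost * 1.15);
  return { signal, suggested_price: Math.round(suggested * 100) / 100 };
}

const over = analyzePricing(130, 60, 100);  // 30% above market average
const floor = analyzePricing(90, 95, 100);  // market average below cost floor
```

In the second example the market average (100) is below cost x 1.15 (109.25), so the margin floor wins over the market-based suggestion.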
by V3 Code Studio
🚀 Never miss a new lead again: get instant email alerts and stay ahead of every opportunity!

This workflow automatically notifies your team the moment a new lead is created in your CRM or via form submission. It keeps your sales, marketing, and support teams aligned, so no lead goes unnoticed and every customer feels heard right away.

✨ **How it works**
- ✅ **Capture:** A webhook receives new lead data from your CRM or online form.
- 🧩 **Clean:** The workflow filters and formats the data for clear presentation.
- 💌 **Compose:** Generates a beautiful HTML email with your branding and lead details.
- 📨 **Send:** Instantly emails your team or the lead using Gmail or SMTP.
- ⚡ **React fast:** Your team gets notified in seconds, no manual checks needed!

⚙️ **Setup steps**
1. 🔗 Add a POST webhook in your CRM or app that points to your n8n webhook URL.
2. 🏢 Update your company info (logo, name, website) in the configuration node.
3. 📧 Connect Gmail OAuth2 or your SMTP credentials to send branded emails instantly.

Compatible with Odoo, HubSpot, Zoho CRM, Salesforce, Pipedrive, Typeform, and any system that supports outgoing webhooks.
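The Clean and Compose steps can be sketched as one function that turns the raw webhook payload into HTML table rows for the email template. The escaping shown is basic, and the field names depend entirely on your CRM's payload.

```javascript
// Turn a raw lead payload into HTML <tr> rows for the branded email,
// dropping empty fields and escaping values. Field names come from
// whatever your CRM or form sends; the example keys are hypothetical.
function leadToRows(payload) {
  const esc = s => String(s)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
  return Object.entries(payload)
    .filter(([, v]) => v !== null && v !== undefined && v !== '')
    .map(([k, v]) => `<tr><td>${esc(k)}</td><td>${esc(v)}</td></tr>`)
    .join('');
}

const rows = leadToRows({
  name: 'Jane <Doe>',
  email: 'jane@example.com',
  notes: '', // empty fields are dropped from the email
});
```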