by Mohamed Abubakkar
**Description**
This workflow runs every Monday morning and automatically diagnoses the health of your SQL Server database across 4 dimensions — slow queries, missing indexes, index fragmentation, and server wait stats. It merges all results, scores each finding by severity, and sends a beautifully formatted HTML email report only when issues are found. Zero noise when everything is healthy.

**Who it's for**
SQL Server developers, DBAs, and backend engineers who manage production MSSQL databases and want weekly automated diagnostics delivered to their inbox, without manual querying or third-party monitoring tools.

**How it works**
1. Schedule Trigger fires every Monday at 7 AM.
2. 4 SQL nodes run in parallel — slow queries, missing indexes, index fragmentation, wait stats.
3. Merge node combines all results into a single stream.
4. Code node filters by field signature, scores severity (critical/warning), and builds the HTML email.
5. Send Email dispatches the report only when issues exist.

**⚙️ Setup required**
• Add MSSQL credential with VIEW SERVER STATE permission
• Add SMTP or Gmail credential for email
• Adjust severity thresholds in the Code node to match your workload
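The filter-and-score step can be sketched as follows. This is a minimal illustration, not the template's actual code: the field names (`avgElapsedMs`, `fragmentationPercent`, `equalityColumns`) and the threshold values are assumptions — adjust both to match the columns your SQL queries actually return and your workload.

```javascript
// Hypothetical severity-scoring logic for the Code node.
// Field names and thresholds are assumptions; tune to your workload.
const THRESHOLDS = {
  slowQueryMs: { warning: 1000, critical: 5000 },
  fragmentationPct: { warning: 30, critical: 70 },
};

// Identify which diagnostic query produced a row by its field signature.
function classify(row) {
  if ('avgElapsedMs' in row) return 'slow_query';
  if ('fragmentationPercent' in row) return 'fragmentation';
  if ('equalityColumns' in row) return 'missing_index';
  return 'wait_stats';
}

function score(row) {
  const kind = classify(row);
  if (kind === 'slow_query') {
    if (row.avgElapsedMs >= THRESHOLDS.slowQueryMs.critical) return 'critical';
    if (row.avgElapsedMs >= THRESHOLDS.slowQueryMs.warning) return 'warning';
  }
  if (kind === 'fragmentation') {
    if (row.fragmentationPercent >= THRESHOLDS.fragmentationPct.critical) return 'critical';
    if (row.fragmentationPercent >= THRESHOLDS.fragmentationPct.warning) return 'warning';
  }
  return null; // healthy: excluded from the report
}

const findings = [
  { avgElapsedMs: 7200, query: 'SELECT ...' },
  { fragmentationPercent: 45, index: 'IX_Orders_Date' },
].map(r => ({ ...r, kind: classify(r), severity: score(r) }))
 .filter(r => r.severity !== null);
```

Only rows that earn a severity survive the filter, which is what makes the "zero noise when healthy" behavior possible: an empty `findings` array means no email is sent.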
by Avkash Kakdiya
**How it works**
This workflow automates the complete employee leave approval process from submission to final resolution. Employees submit leave requests through a form, which are summarized professionally using AI and sent for approval via email. The workflow waits for the approver's response and then either sends an approval confirmation or schedules a clarification discussion automatically. All communication is handled consistently with no manual follow-ups required.

**Step-by-step**

**Step 1: Capture leave request, generate summary, and request approval**
- On form submission – Captures employee details, leave dates, reason, and task handover information.
- AI Agent – Generates a professional, manager-ready summary of the leave request.
- OpenAI Chat Model – Provides the language model used to generate the summary.
- Structured Output Parser – Extracts the email subject and HTML body from the AI response.
- Send message and wait for response – Emails the summary to the approver and pauses the workflow until approval or rejection.
- If – Routes the workflow based on the approval decision.

**Step 2: Notify employee or schedule discussion automatically**

Approved path:
- Send a message – Sends an official leave approval email to the employee.

Clarification or rejection path:
- Booking Agent – Determines the next business day and finds the first available 10-minute slot.
- OpenAI – Applies scheduling logic to select the earliest valid slot.
- Get Events – Fetches existing calendar events to avoid conflicts.
- Check Availability – Confirms free time within working hours.
- Output Parser – Extracts the final meeting start time.
- Send a message1 – Emails the employee with the scheduled discussion details.

**Why use this?**
- Eliminate manual approval follow-ups and email back-and-forth
- Ensure consistent, professional communication for every leave request
- Automatically handle both approvals and clarification scenarios
- Reduce manager effort with AI-generated summaries
- Schedule discussions without manual calendar coordination
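The clarification-path scheduling logic can be sketched in plain JavaScript. This is an illustration under stated assumptions, not the template's actual implementation: the working hours (09:00–17:00) and the helper names are invented for the example.

```javascript
// Hypothetical sketch of "next business day + first free 10-minute slot".
// Working hours (09:00-17:00) are an assumption; adjust to your policy.

function nextBusinessDay(from) {
  const d = new Date(from);
  do { d.setDate(d.getDate() + 1); } while (d.getDay() === 0 || d.getDay() === 6);
  d.setHours(9, 0, 0, 0); // assumed start of working hours
  return d;
}

// events: [{ start: Date, end: Date }] already fetched from the calendar
function firstFreeSlot(dayStart, events, slotMinutes = 10) {
  const dayEnd = new Date(dayStart);
  dayEnd.setHours(17, 0, 0, 0); // assumed end of working hours
  let cursor = new Date(dayStart);
  while (cursor.getTime() + slotMinutes * 60000 <= dayEnd.getTime()) {
    const slotEnd = new Date(cursor.getTime() + slotMinutes * 60000);
    const clash = events.find(e => e.start < slotEnd && e.end > cursor);
    if (!clash) return cursor;          // candidate slot is conflict-free
    cursor = new Date(clash.end);       // jump past the conflicting event
  }
  return null; // no free slot that day
}
```

In the workflow itself this decision is delegated to the Booking Agent and OpenAI nodes; the sketch only shows the deterministic core of the same idea (walk forward through the day, skipping past conflicts).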
by Avkash Kakdiya
**How it works**
This workflow runs on a daily schedule to monitor users currently on a trial plan. It fetches user data from MongoDB, calculates their trial stage, and assigns a trigger label such as Day 3, Day 7, Day 13, or Last Day. Based on this stage, the workflow sends a targeted email using Gmail. This ensures consistent engagement and improves trial-to-paid conversion without manual effort.

**Step-by-step**

**Fetch and filter trial users**
- Schedule Trigger – Runs the workflow daily at a fixed time.
- Find documents – Retrieves users with an active trial plan from MongoDB.
- Code in JavaScript – Calculates trial progress and assigns a trigger type (day_3, day_7, day_13, last_day).

**Send stage-based emails**
- Loop Over Items – Iterates through each filtered user.
- Switch – Routes users based on their trigger type.
- Send day 3 email – Sends an onboarding encouragement email.
- Send day 7 mail – Sends a mid-trial engagement email.
- Send day 13 mail – Sends a near-expiry reminder email.
- Send Last day email – Sends a final urgency email before the trial ends.
- Merge – Combines all branches and continues looping for remaining users.

**Why use this?**
- Automates user engagement throughout the trial lifecycle
- Improves activation and feature adoption with timely nudges
- Increases conversion rates with strategic email timing
- Eliminates manual tracking of trial users
- Scales easily for large SaaS user bases
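The trigger-type calculation in the Code node can be sketched like this. It is a minimal example under two assumptions not stated in the workflow: each MongoDB document carries a `trialStart` date, and the trial is 14 days long — adjust both to your schema.

```javascript
// Hypothetical trial-stage calculation; assumes a 14-day trial and a
// trialStart field on each user document.
const STAGES = { 3: 'day_3', 7: 'day_7', 13: 'day_13', 14: 'last_day' };

function trialStage(trialStart, now = new Date()) {
  const msPerDay = 24 * 60 * 60 * 1000;
  // Day 1 is the day the trial starts.
  const day = Math.floor((now - new Date(trialStart)) / msPerDay) + 1;
  return STAGES[day] ?? null; // null => no email today, user is filtered out
}
```

In the n8n Code node you would map this over `$input.all()`, attach the result as a `triggerType` field, and drop items where it is `null`; the Switch node then routes on that field.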
by Niclas Aunin
**LinkedIn Content Generation Workflow**

**Summary**
Automated workflow that transforms Notion content notes into publication-ready LinkedIn posts using Claude AI. It monitors a Notion database and generates multiple variations based on structured outlines, so the author can pick the one they like most.

**Use Cases**
- Automate LinkedIn content creation from a content planning database.
- Generate multiple post variations from a single outline.
- Maintain consistent voice and formatting across all posts.
- Scale content production while preserving quality.

**How It Works**
1. Trigger – Monitors the Notion "Content Plan" database hourly for updates.
2. Conditional Check – Verifies the "LinkedIn Post (Main)" tag and "Ready for Writing" status.
3. Main Post – Claude generates a single post from the project name and notes.
4. Outline Analysis – A parallel process creates 3 distinct post concepts with different angles.
5. Multi-Post Generation – Each outline becomes a complete LinkedIn post.
6. Save to Notion – All posts are automatically saved to the database.

**AI Setup**
- Claude Sonnet 4.5 (claude-sonnet-4-5-20250929)
- Main post: temperature 0.8 (creative)
- Multi-post: default temperature (consistent)

**How to Use**
1. Set up a content database in Notion, or link your existing one: use the field mapping outlined below, or update the field mapping in the n8n template.
2. Add content to Notion: project name (topic), notes (article content/key points), tag "LinkedIn Post (Main)", status "Ready for Writing".
3. The workflow triggers automatically (hourly check).
4. Retrieve posts from the Notion database.
5. Review and publish to LinkedIn.

**Requirements**
Credentials:
- Notion API (access to the Content Plan database)
- Anthropic API key
- OpenAI API key

Notion database required properties:
- Project name (text)
- Notes (rich text)
- Tags (multi-select with "LinkedIn Post (Main)")
- Status (select with "Ready for Writing")

**Notes**
- Posts optimized for an 1800-character limit
- Generates both single posts and multi-angle variations
by WeblineIndia
This n8n workflow automatically fetches all interview events scheduled for today from a specified Google Calendar and sends a personalized email to each interviewer. The email contains a formatted HTML table listing their interview details, including meeting times, Google Meet links, and attendees with their response status. This ensures all interviewers are informed daily at 8:00 AM IST without any manual coordination.

**Who's it for**
- **Interviewers** who want a quick morning packet instead of opening multiple calendar tabs.
- **Recruiters/coordinators** who need a reliable, zero-friction daily brief for interviewers.
- **Teams** that paste CV/notes links directly into calendar events (no file search required).

**How it works**
1. Cron triggers daily at 08:00 IST.
2. Google Calendar reads today's events from the Interviews calendar.
3. A Code step parses each event to identify interviewers, extract candidate details, the meeting link, and any CV: / Notes: links from the description, and builds a table to share via Gmail.
4. A grouping step compiles a per-interviewer list for the day.
5. Gmail sends one digest to each interviewer.

**How to set up**
1. Ensure all interviews are scheduled on the Google Calendar named Interviews and that interviewers are added as attendees.
2. Add CV: <url> and Notes: <url> in the event description when available.
3. Import the workflow and add credentials: Google Calendar (OAuth); SMTP/Gmail for sending the email digests.
4. Keep the default 08:00 IST schedule in the Cron node or adjust as needed.

**Requirements**
- Google Workspace account with access to the Interviews calendar.
- Gmail sender account for digests (App Password if using 2FA).
- n8n instance (cloud or self-hosted).

**Steps**
1. **Trigger Schedule** – Node: Schedule Trigger. Purpose: starts the workflow daily at 8:00 AM.
2. **Fetch Interview Events** – Node: Google Calendar (Fetch Interview Events). Purpose: retrieves all interview events from the configured calendar. Output: event details including summary, time, and organizer email.
3. **Group & Format Schedule** – Node: HTML Table (JavaScript Code node). Purpose: groups events by interviewer email and generates an HTML schedule table. Output fields: interviewer_email, subject, htmlContent.
4. **Send Personalized Emails** – Node: Gmail. Purpose: sends the formatted interview schedule to each interviewer's email address. Send To: dynamically set using ={{ $json.interviewer_email }}. Subject: "Interview Reminder". Body: ={{ $json.htmlContent }} (HTML).

**How to customize the workflow**
- **Parsing rules:** If your event titles follow a pattern (e.g., Onsite – {Candidate} – {Role}), tweak the regex in the Code node.
- **Attendee logic:** Refine how interviewers are detected (e.g., filter only accepted responses, or include tentative).
- **Digest format:** Adjust table columns/order, or add role/team labels from the title.
- **Schedule:** Duplicate the Cron for regional time zones or add a midday refresh.

**Add-ons to level up the workflow with additional nodes**
- **Reminder pings:** Add 10-minute pre-interview reminders via email or Slack/Teams.
- **Conflict alerts:** Flag if an interviewer is double-booked within a 15-minute window.
- **Feedback follow-up:** After the scheduled time, DM or email a standardized feedback form link.
- **Drive search (optional):** If you later want to auto-attach CVs, add a Google Drive search step (by candidate name) in a designated folder.

**Common troubleshooting points**
- **No events found:** Confirm the calendar name is Interviews and that events exist today.
- **Missing links:** If CV/notes links aren't in the email, add CV:/Notes: links to the event description.
- **Email not sent:** Verify SMTP credentials, from-address permissions, and any sending limits.
- **Time mismatch:** Confirm the workflow timezone and Calendar times are set to Asia/Kolkata (or adjust).

**A short note**
If you need help tailoring the parsing rules, adjusting the schedule, or troubleshooting any step, feel free to reach out; we will be happy to help.
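The grouping step above can be sketched as a small function. This is a hedged illustration, not the template's exact code: the event shape (`summary`, `start`, `meetLink`, `attendees`) and the `role: 'interviewer'` marker are assumptions — real Google Calendar events expose attendees differently, so adapt the detection to the fields your Calendar node returns.

```javascript
// Hypothetical sketch: group events by interviewer email and emit one
// digest item per interviewer (interviewer_email, subject, htmlContent).
function buildDigests(events) {
  const byInterviewer = {};
  for (const ev of events) {
    for (const a of ev.attendees ?? []) {
      if (a.role !== 'interviewer') continue; // assumed marker; adapt detection
      (byInterviewer[a.email] ??= []).push(ev);
    }
  }
  return Object.entries(byInterviewer).map(([email, evts]) => {
    const rows = evts
      .map(e => `<tr><td>${e.start}</td><td>${e.summary}</td><td><a href="${e.meetLink}">Join</a></td></tr>`)
      .join('');
    return {
      interviewer_email: email,
      subject: 'Interview Reminder',
      htmlContent: `<table><tr><th>Time</th><th>Interview</th><th>Link</th></tr>${rows}</table>`,
    };
  });
}
```

In an n8n Code node you would feed it the parsed events and return one item per digest, so the Gmail node can address each email via `={{ $json.interviewer_email }}`.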
by Dahiana
**YouTube Transcript Extractor**

This n8n template demonstrates how to extract transcripts from YouTube videos using two different approaches: automated Google Sheets monitoring and direct webhook API calls.

**Use cases:** Content creation, research, accessibility, meeting notes, content repurposing, SEO analysis, or building transcript databases for analysis.

**How it works**
- **Google Sheets Integration:** Monitor a sheet for new YouTube URLs and automatically extract transcripts.
- **Direct API Access:** Send YouTube URLs via webhook and get instant transcript responses.
- **Smart Parsing:** Extracts video ID from various YouTube URL formats (youtube.com, youtu.be, embed).
- **Rich Metadata:** Returns video title, channel, publish date, duration, and category alongside transcript.
- **Fallback Handling:** Gracefully handles videos without available transcripts.

**Two Workflow Paths**
1. Automated Sheet Processing: Add URLs to Google Sheet → Auto-extract → Save results to sheet.
2. Webhook API: Send POST request with video URL → Get instant transcript response.

**How to set up**
1. Replace "Dummy YouTube Transcript API" credentials with your YouTube Transcript API key.
2. Create your own Google Sheet with columns: "url" (input sheet) and "video title", "transcript" (results sheet).
3. Update Google Sheets credentials to connect your sheets.
4. Test each workflow path separately.
5. Customize the webhook path and authentication as needed.

**Requirements**
- YouTube Transcript API access (youtube-transcript.io or similar)
- Google Sheets API credentials (for automated workflow)
- n8n instance (cloud or self-hosted)
- YouTube videos

**How to customize**
- Modify transcript processing in the Code nodes
- Add additional metadata extraction
- Connect to other storage solutions (databases, CMS)
- Add text analysis or summarization steps
- Set up notifications for new transcripts
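The "Smart Parsing" step can be sketched with a few regexes covering the URL shapes the template mentions. This is a simplified illustration, not the template's exact code:

```javascript
// Extract the 11-character video ID from common YouTube URL formats:
// youtube.com/watch?v=ID, youtu.be/ID, youtube.com/embed/ID.
function extractVideoId(url) {
  const patterns = [
    /[?&]v=([A-Za-z0-9_-]{11})/,       // youtube.com/watch?v=ID
    /youtu\.be\/([A-Za-z0-9_-]{11})/,  // youtu.be/ID
    /embed\/([A-Za-z0-9_-]{11})/,      // youtube.com/embed/ID
  ];
  for (const re of patterns) {
    const m = url.match(re);
    if (m) return m[1];
  }
  return null; // not a recognized YouTube URL
}
```

Returning `null` for unrecognized URLs is what lets the fallback-handling branch respond gracefully instead of calling the transcript API with a bad ID.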
by Oneclick AI Squad
This n8n workflow monitors blood stock levels daily and sends alerts when availability is low. It fetches data from a Google Sheet, checks stock status, and notifies via WhatsApp.

**Key Features**
- **Daily Monitoring**: Checks blood stock every day.
- **Automated Alerts**: Sends notifications when stock is low.
- **Real-Time Updates**: Uses live data from Google Sheets.
- **Efficient Delivery**: Alerts sent instantly via WhatsApp.
- **Continuous Check**: Loops to ensure ongoing monitoring.

**Workflow Process**
1. Daily Check Blood Stock: Triggers the workflow daily.
2. Fetch Blood Stock: Reads data from a Google Sheet.
3. Get All Stock: Collects all available blood stock details.
4. Check Stock Availability: Analyzes stock levels against low thresholds.
5. Send Alert Message: Sends WhatsApp alerts if stock is low.

**Sheet Columns**
- **Blood Type**: Type of blood (e.g., A+, O-).
- **Quantity**: Current stock amount.
- **Threshold**: Minimum acceptable stock level.
- **Last Updated**: Date and time of last update.
- **Status**: Current status (e.g., Low, Sufficient).

**Setup Instructions**
1. **Import Workflow**: Add the workflow to n8n via the import option.
2. **Configure Sheet**: Set up a Google Sheet with blood stock data.
3. **Set Up WhatsApp**: Configure WhatsApp API credentials in n8n.
4. **Activate**: Save and enable the workflow.
5. **Test**: Simulate low stock to verify alerts.

**Requirements**
- **n8n Instance**: Hosted or cloud-based n8n setup.
- **Google Sheets**: Access with stock data.
- **WhatsApp API**: Integration for sending alerts.
- **Admin Access**: For monitoring and updates.

**Customization Options**
- **Adjust Threshold**: Change low stock limits.
- **Add Channels**: Include email or SMS alerts.
- **Update Frequency**: Modify daily trigger time.
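The "Check Stock Availability" step amounts to a per-row threshold comparison. A minimal sketch, assuming the rows carry the sheet columns above as `bloodType`, `quantity`, and `threshold` fields (Google Sheets returns cell values as strings, hence the `Number()` coercion):

```javascript
// Hypothetical sketch: keep only rows at or below their threshold and
// attach a ready-to-send WhatsApp message to each.
function lowStock(rows) {
  return rows
    .filter(r => Number(r.quantity) <= Number(r.threshold))
    .map(r => ({
      ...r,
      status: 'Low',
      message: `⚠️ Low stock: ${r.bloodType} has ${r.quantity} units (threshold ${r.threshold}).`,
    }));
}
```

Rows above their threshold produce no output items, so the WhatsApp node simply receives nothing when all stock is sufficient.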
by Ossian Madisson
This n8n template allows you to automatically upload all attached files from incoming emails to Google Drive, with optional filters on sender, receiver, and file types. This template is built to be customized for your specific needs. It has the core logic and n8n node-specific references sorted to work with dynamic file names throughout the workflow.

**Use cases**
- Store invoices in Google Drive
- Save recurring reports in Google Drive
- Post recurring reports to another n8n workflow for further processing
- Archive files to Google Drive by email
- Save all files received from a client in a dedicated Google Drive folder

**Good to know**
The workflow is designed to not use custom code, preferring built-in n8n nodes.

**How it works**
1. Trigger on incoming emails with attachments.
2. (Optional) Filter on sender/recipient.
3. Split all attachments of the email into separate items.
4. (Optional) Filter attachments based on file type.
5. (Optional) Treat attachments with different file types through different paths.
6. Upload each attachment to Google Drive.
7. Mark the email read and archive it after all attachments have been processed.
8. Notify in Slack how many attachments were processed in the execution.

**How to use**
1. Configure Google credentials (1, 2, 6).
2. Configure Slack credentials (7).
3. Configure or disable the sender/receiver filter (3).
4. Configure or disable the file type filter (4).
5. Configure or disable the file type paths (5).
6. Configure the destination folder (6).
7. Build on this to fit your use case.

Note: there's a similar template with the same basics but with fewer ready-made modifications and no loop that allows archiving the email and notifying Slack when done.
by Abdulrahman Alhalabi
**Arabic OCR Telegram Bot**

**How it Works**
1. Receive PDF Files – Users send PDF documents via Telegram to the bot.
2. OCR Processing – Mistral AI's OCR service extracts Arabic text from document pages.
3. Text Organization – Processes and formats extracted content with page numbers.
4. Create Google Doc – Generates a formatted document with all extracted text.
5. Deliver Results – Sends users a clickable link to their processed document.

**Set up Steps** (setup time: ~20 minutes)
1. Create Telegram Bot – Get a bot token from @BotFather on Telegram.
2. Configure APIs – Set up Mistral AI OCR and Google Docs API credentials.
3. Set Folder Permissions – Create a Google Drive folder for storing results.
4. Test Bot – Send a sample Arabic PDF to verify OCR accuracy.
5. Deploy Webhook – Activate the Telegram webhook for real-time processing.

Detailed API configuration and Arabic text handling notes are included as sticky notes within the workflow.

**What You'll Need**
- Telegram Bot Token (free from @BotFather)
- Mistral AI API key (OCR service)
- Google Docs/Drive API credentials
- Google Drive folder for document storage
- Sample Arabic PDF files for testing

**Key Features**
- Real-time progress updates (5-step process notifications)
- Automatic page numbering in Arabic
- Direct Google Docs integration
- Error handling for non-PDF files
by Oneclick AI Squad
This workflow is a fully automated AI-powered business intelligence agent. It accepts a research topic or company name via webhook, autonomously collects data from multiple live sources (web search, news feeds, financial APIs), runs a multi-stage Claude AI analysis pipeline, and delivers a structured professional business report — all without human intervention.

**What's the Goal?**
To eliminate the hours analysts spend manually gathering data, switching between tools, and writing reports. This workflow does it all in under 3 minutes:
- Collects live market and competitor data
- Pulls recent news and sentiment signals
- Runs deep AI analysis across all sources
- Generates a structured executive report with SWOT, risks, and opportunities
- Delivers the final report via email and saves it to Google Drive

**Why Does It Matter?**
Manual business research is slow, inconsistent, and expensive. This workflow:
- Saves 4-8 hours of analyst time per report
- Produces consistent, structured outputs every time
- Runs on a schedule or on-demand via API
- Scales to any number of topics or companies
- Integrates directly into your CRM, Slack, or email
- Generates billable deliverables for consulting agencies

**How It Works**

**Stage 1 — INTAKE**
1. Webhook receives a research request.
2. Set node normalizes all inputs and stores credentials.
3. IF node validates that the request has a valid topic.

**Stage 2 — DATA COLLECTION (parallel)**
Three HTTP Request nodes run simultaneously:
- Serper.dev fetches the top 10 Google results for the topic
- NewsAPI pulls the latest 10 news articles from the past 7 days
- Alpha Vantage fetches financial/market data if a ticker is provided

**Stage 3 — DATA PROCESSING**
A Code node merges and cleans all collected data. It extracts headlines, snippets, URLs, publication dates, sentiment signals, and key figures into a structured context object ready for AI analysis.
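The Stage 3 merge can be sketched as a single function that folds the three parallel responses into one context object. The input shapes below (`organic`, `articles`, `Global Quote`) mirror typical Serper/NewsAPI/Alpha Vantage payloads but are assumptions for illustration — verify them against the actual responses your HTTP Request nodes return:

```javascript
// Hypothetical sketch of the Stage 3 Code node. Field names are assumed
// from typical Serper.dev / NewsAPI / Alpha Vantage responses.
function buildContext(search, news, finance) {
  return {
    headlines: (news.articles ?? []).map(a => ({
      title: a.title,
      source: a.source?.name,
      publishedAt: a.publishedAt,
      url: a.url,
    })),
    webResults: (search.organic ?? []).map(r => ({
      title: r.title,
      snippet: r.snippet,
      url: r.link,
    })),
    market: finance?.['Global Quote'] ?? null, // null when no ticker was given
    collectedAt: new Date().toISOString(),
  };
}
```

The resulting object is what the three Claude passes consume: Pass 1 reads `headlines` and `webResults` for synthesis, and `market` feeds the financial angle of the SWOT when present.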
**Stage 4 — AI ANALYSIS (3-pass Claude pipeline)**
- Pass 1 — Research Synthesis: Claude reads all raw data and extracts key facts, trends, and signals
- Pass 2 — Strategic Analysis: Claude performs SWOT analysis, identifies risks and opportunities
- Pass 3 — Report Generation: Claude writes the final structured executive report in Markdown

**Stage 5 — OUTPUT & DELIVERY**
- Report is saved to Google Drive as a document.
- Summary is posted to Slack.
- Full report is emailed via SendGrid.
- All metadata is logged to Google Sheets.
- Webhook returns a JSON response.

**Configuration Requirements**
- ANTHROPIC_API_KEY — Claude AI (claude-sonnet-4-20250514)
- SERPER_API_KEY — Google Search results (serper.dev, free tier available)
- NEWSAPI_KEY — News articles (newsapi.org, free tier available)
- ALPHA_VANTAGE_KEY — Financial data (alphavantage.co, free tier available)
- SENDGRID_API_KEY — Email delivery
- SLACK_WEBHOOK_URL — Slack notifications
- GOOGLE_DRIVE_FOLDER_ID — Where to save reports
- GOOGLE_SHEET_ID — Report audit log

**Setup Guide**
1. Import this workflow into your n8n instance.
2. Open the Set Credentials node and replace all placeholder values with your real API keys.
3. Set your GOOGLE_SHEET_ID in the Log to Sheets node.
4. Set your GOOGLE_DRIVE_FOLDER_ID in the Save to Drive node.
5. Configure your Slack webhook URL in the Notify Slack node.
6. Activate the workflow or trigger it manually via POST.

**Sample Request**

POST /webhook/business-report

    {
      "topic": "OpenAI market position 2025",
      "company": "OpenAI",
      "ticker": "",
      "industry": "Artificial Intelligence",
      "reportType": "competitive_analysis",
      "recipientEmail": "analyst@yourcompany.com",
      "urgency": "standard"
    }

**Report Types Supported**
- competitive_analysis
- market_research
- industry_overview
- company_profile
- investment_brief
by Yassin Zehar
**Description**
This workflow sends a summary of your meeting minutes via Gmail, directly from the notes stored in your Google Sheet.

**Context**
Taking notes during meetings is important, but sharing them with the team can be time-consuming. This workflow makes it simple: just write down your meeting minutes in a Google Sheet, and n8n will automatically send them by email after each meeting.

**Who is this for?**
Perfect for anyone who:
- Uses Google Sheets to keep track of meeting notes.
- Wants to automatically share minutes with teammates or stakeholders.
- Values speed, productivity, and automation.

**Requirements**
- Google account.
- Google Sheets (with your meeting minutes). You will need to set up the required columns first: Topic, Status, Owner, Next Step.
- Gmail.

**How it works**
1. ⏰ The trigger starts after a new row is added in your Google Sheet.
2. 📑 The meeting minutes are retrieved from the sheet.
3. 📨 Gmail automatically sends the minutes to the configured recipients.

**Steps**
🗒️ Use the sticky notes in the n8n canvas to:
- Add your Google credentials (Sheets + Gmail).
- Define your sheet and recipient email addresses.
- Test the workflow to check that the minutes are sent.

You'll get this: an email containing your full meeting minutes, straight from your notes.

**Tutorial video**
Watch the YouTube tutorial video.

**About me**
I'm Yassin, an IT project manager and Agile & data specialist, scaling tech products with data-driven project management.
📬 Feel free to connect with me on LinkedIn.
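The email body can be sketched from the required columns (Topic, Status, Owner, Next Step). A minimal illustration, assuming each sheet row arrives as an object keyed by its column headers — the template's Gmail node may format the minutes differently:

```javascript
// Hypothetical sketch: render sheet rows as an HTML table for the email body.
function minutesToHtml(rows) {
  const body = rows
    .map(r => `<tr><td>${r.Topic}</td><td>${r.Status}</td><td>${r.Owner}</td><td>${r['Next Step']}</td></tr>`)
    .join('');
  return `<table border="1">` +
    `<tr><th>Topic</th><th>Status</th><th>Owner</th><th>Next Step</th></tr>` +
    body + `</table>`;
}
```

The returned string can be placed directly in the Gmail node's HTML message field via an expression.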
by Richard Besier
**📤 Search Products from Facebook Ads on Amazon**

Once connected, this automation scrapes Facebook ads from a specific Facebook Ad Library URL and searches for the same product on Amazon. This can be useful for Amazon FBA or dropshipping.

**🔨 Setup**
This workflow is connected to two Apify scrapers. Make sure to connect the two scrapers mentioned in the blue and orange boxes, with their specific API endpoints.

**👋 Need Help?**
If you need further help, or want a specific automation built for you, feel free to contact me via richard@advetica-systems.com.