by Avkash Kakdiya
## How it works
This workflow consolidates data from five systems — Google Sheets, PostgreSQL, MongoDB, Microsoft SQL Server, and Google Analytics — into a single master Google Sheet. It runs on a scheduled trigger three times a week. Each dataset is tagged with a unique source identifier before merging, ensuring data traceability. Finally, the merged dataset is cleaned, standardized, and written to the output Google Sheet for reporting and analysis.

## Step-by-step
1. **Trigger the workflow**
   - **Schedule Trigger** – Runs the workflow at set weekly intervals.
2. **Collect data from sources**
   - **Google Sheets Source** – Retrieves records from a specific sheet.
   - **PostgreSQL Source** – Extracts customer data from the database.
   - **MongoDB Source** – Pulls documents from the defined collection.
   - **Microsoft SQL Server** – Executes a SQL query and returns results.
   - **Google Analytics** – Captures user activity and engagement metrics.
3. **Tag each dataset**
   - **Add Sheets Source ID** – Marks data from Google Sheets.
   - **Add PostgreSQL Source ID** – Marks data from PostgreSQL.
   - **Add MongoDB Source ID** – Marks data from MongoDB.
   - **Add SQL Server Source ID** – Marks data from SQL Server.
   - **Add Analytics Source ID** – Marks data from Google Analytics.
4. **Merge and process**
   - **Merge** – Combines all tagged datasets into a single structure.
   - **Process Merged Data** – Cleans, aligns schemas, and standardizes key fields.
5. **Store consolidated output**
   - **Final Google Sheet** – Appends or updates the master sheet with the processed data.

## Why use this?
- Centralizes multiple data sources into a single, consistent dataset.
- Ensures data traceability by tagging each source.
- Reduces manual effort in data cleaning and consolidation.
- Provides a reliable reporting hub for business analysis.
- Enables scheduled, automated updates for up-to-date visibility.
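The tagging step can be sketched as a small n8n Code-node helper. The field name `source_id` is an assumption for illustration; the template's actual Set nodes may use a different name.

```javascript
// Sketch of an "Add ... Source ID" step (field name source_id is an assumption).
// Tagging every row before the Merge node keeps each record traceable to its origin.
function tagWithSource(rows, sourceId) {
  return rows.map(row => ({ ...row, source_id: sourceId }));
}

// Example: rows coming from the Google Sheets source
const sheetRows = [{ customer: "Acme Co", revenue: 1200 }];
const tagged = tagWithSource(sheetRows, "google_sheets");
```

After each branch is tagged this way, the Merge node can safely combine all five streams without losing track of where a row came from.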
by Oneclick AI Squad
This automated n8n workflow enables the creation and management of AWS RDS databases through email interactions. Users can send emails with commands such as "Create RDS" or "Delete RDS," including details like database engine, instance class, and credentials. The workflow parses the email, uses Terraform to execute the requested action on AWS RDS, updates a Google Sheet with the status, and sends a confirmation email.

## Fundamental Aspects
- **Gmail Trigger**: Initiates the workflow upon receiving a new email in Gmail.
- **Parse Email Content**: Analyzes the email body to extract the command (create or delete) and database details like region, identifier, engine, and credentials.
- **Manage RDS Instance**: Executes Terraform commands to create or delete the AWS RDS database instance based on the parsed details.
- **Wait For Data**: Pauses the workflow to allow time for the RDS operation to complete and data to become available.
- **Update Google Sheet**: Appends or updates the Google Sheet with the database instance details, status, and any relevant IDs.
- **Send Confirmation Email**: Formats and sends a response email confirming the action taken, including success/failure details.

## Setup Instructions
1. **Import the Workflow into n8n**: Download the workflow JSON and import it via the n8n interface.
2. **Configure API Credentials**:
   - Set up Gmail API credentials for email triggering and sending.
   - Configure AWS credentials with RDS management permissions.
   - Set up Google Sheets API credentials with read/write access.
   - Ensure Terraform is integrated or nodes are configured for Terraform execution.
3. **Prepare Google Sheet**: Create a sheet with columns for database identifier, engine, instance class, status, and other relevant fields.
4. **Run the Workflow**: Activate the Gmail trigger and test by sending an email with a create or delete command.
5. **Verify Responses**: Check the Google Sheet for updates and your email for confirmation messages.
6. **Adjust Parameters**: Fine-tune Terraform variables, email parsing logic, or wait times as needed.

## Columns For The Google Sheet
- **Database Identifier**: Unique identifier for the RDS instance (`var.db_identifier`).
- **Engine**: Database engine type, e.g., MySQL or PostgreSQL (`var.db_engine`).
- **Instance Class**: RDS instance class, e.g., db.t3.micro (`var.instance_class`).
- **Allocated Storage**: Storage size in GB, e.g., 20 (`var.allocated_storage`).
- **Region**: AWS region for the instance, e.g., us-east-1 (`var.aws_region`).
- **Username**: Database admin username, e.g., admin (`var.db_username`).
- **Password**: Database admin password, e.g., SecurePassword123 (`var.db_password`).
- **Status**: Current status of the RDS instance (e.g., creating, deleted).
- **Database Name**: Name or tag for the database, e.g., MyRDSDatabase (`var.db_name`).

## Technical Dependencies
- **Gmail API**: For receiving trigger emails and sending confirmations.
- **AWS RDS API**: For database management (via Terraform).
- **Google Sheets API**: For logging and updating database status.
- **Terraform**: For infrastructure-as-code management of RDS instances.
- **n8n**: For workflow automation and node integrations.

## Customization Possibilities
- **Support Additional Commands**: Extend to include update or snapshot operations for RDS instances.
- **Enhance Parsing**: Improve email content analysis with AI for better intent detection.
- **Add Database Engines**: Include support for more RDS engines like Oracle or SQL Server.
- **Integrate Monitoring**: Add nodes to monitor RDS performance and alert via email.
- **Customize Sheets**: Modify sheet columns or add visualizations for database metrics.
- **Security Enhancements**: Incorporate additional validation for sensitive credentials in emails.

Want a tailored workflow for your business? Our experts can craft it quickly. Contact our team.
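A minimal sketch of the Parse Email Content step, assuming a plain-text body with `key: value` lines after the command. The exact field names and email format the workflow expects may differ.

```javascript
// Hypothetical parser for command emails such as:
//   "Create RDS\nengine: mysql\ninstance_class: db.t3.micro\nregion: us-east-1"
function parseRdsEmail(body) {
  const command = /create\s+rds/i.test(body) ? "create"
                : /delete\s+rds/i.test(body) ? "delete"
                : null;
  const details = {};
  for (const line of body.split(/\r?\n/)) {
    const m = line.match(/^\s*([A-Za-z_]+)\s*:\s*(.+)$/); // key: value lines
    if (m) details[m[1].toLowerCase()] = m[2].trim();
  }
  return { command, details };
}

const parsed = parseRdsEmail("Create RDS\nengine: mysql\nregion: us-east-1");
```

The resulting `details` object maps naturally onto the Terraform variables listed in the sheet columns above (`var.db_engine`, `var.aws_region`, and so on).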
by AureusR
## Synchronize Excel or Google Sheets with Postgres (bi-directional)

### Who's it for
This workflow is perfect for companies that have always managed their operations in Excel or Google Sheets and want to gradually transition to a database or custom software. It ensures business continuity while modernizing data management.

### How it works / What it does
- **Trigger options** → Run the sync manually, on a schedule, or as part of another workflow.
- **Get data from Excel** → Reads rows from an Excel or Google Sheets table.
- **Sanitize data** → Cleans up formats (e.g., converting Excel serial dates into proper date strings).
- **Upsert into Postgres** → Inserts or updates rows in the database, ensuring no duplicates. For auto-mapping to work, the column names in Excel/Sheets and the DB must match exactly; if you want different names, map the columns manually in the Postgres node.
- **(Optional)** → Can be extended to push DB updates back to Excel, creating a true two-way sync.

This way, your team can continue working in Excel/Sheets while data is safely persisted in a database — ideal for scaling into dashboards, SaaS, or ERP systems later.

### How to set up
1. Import the workflow JSON into your n8n instance.
2. Connect your credentials: Microsoft Excel / Google Sheets OAuth2 and your Postgres database.
3. Point the Excel node to the right workbook, worksheet, and table.
4. Make sure column names match between the Excel sheet and the DB table (or map them manually if they differ).
5. Run manually or configure the schedule trigger for automated syncs.

### Requirements
- n8n self-hosted or cloud account.
- Either Microsoft Excel Online or Google Sheets access.
- Postgres database (or replace with MySQL, MariaDB, or any supported DB).

### How to customize the workflow
- Replace Excel with Google Sheets by swapping the node.
- Replace Postgres with any preferred database node.
- Add validation steps (e.g., check for missing emails or duplicate IDs).
- Extend with reporting workflows (e.g., sync DB data to BI dashboards).
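The "Sanitize data" step's date conversion can be sketched like this. Excel serials for modern dates count days from 1899-12-30 (the off-by-one quirk only affects dates before March 1900).

```javascript
// Convert an Excel serial date (days since 1899-12-30, valid for dates
// after Feb 1900) into an ISO date string for the Postgres upsert.
function excelSerialToIso(serial) {
  const epochMs = Date.UTC(1899, 11, 30); // JS months are 0-based: 11 = December
  return new Date(epochMs + serial * 86400000).toISOString().slice(0, 10);
}

const iso = excelSerialToIso(45292); // serial for 2024-01-01
```

Running this in a Code node before the Postgres node means `DATE` columns receive clean `YYYY-MM-DD` strings instead of raw serial numbers.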
Use this as a stepping stone to migrate from spreadsheets into software-driven processes.
by WeblineIndia
## WooCommerce Fraud Detection & Slack Alert Workflow
This workflow automatically monitors WooCommerce orders, evaluates them for fraud using multiple checks (address mismatch, high-value orders, suspicious emails, admin orders), calculates a fraud score, and sends alerts to Slack when risk is detected.

### Quick Implementation Steps
1. Import the workflow JSON into n8n
2. Configure WooCommerce API credentials
3. Set up Slack API credentials and channel
4. Adjust fraud rules (amount threshold, email regex, etc.)
5. Test with sample order data
6. Activate the workflow

### What It Does
This workflow automates fraud detection for WooCommerce orders by applying multiple validation checks and assigning a fraud score. It starts with scheduled execution, fetches order data, and prepares it for evaluation by extracting key details such as billing information, order value, and customer email.

Once the data is prepared, the workflow applies a series of fraud detection rules. These include checking whether billing and shipping details mismatch, identifying high-value orders, detecting suspicious or disposable email addresses, and verifying whether the order was created by an admin. Each condition contributes to the fraud score based on predefined logic.

Finally, all signals are merged and a fraud score is calculated. If the score crosses the defined threshold, a detailed alert is sent to a Slack channel with complete order and risk information, enabling quick manual review and action.

### Who's It For
- eCommerce store owners using WooCommerce
- Fraud prevention and risk management teams
- Operations teams handling order validation
- Developers building automation workflows in n8n
- Businesses wanting real-time fraud alerts in Slack

### Requirements to Use This Workflow
- n8n instance (self-hosted or cloud)
- WooCommerce store with API access enabled
- WooCommerce API credentials configured in n8n
- Slack workspace with API credentials
- Slack channel ID for sending alerts
- Basic understanding of n8n nodes and workflows

### How It Works & Setup Instructions
1. Import the workflow JSON into your n8n workspace.
2. Configure the Schedule Trigger node to define execution frequency.
3. Set up the WooCommerce node: add API credentials, ensure the correct store URL, and modify `orderId` if needed.
4. Configure the Slack node: connect Slack API credentials and select or update the target channel.
5. Review the Set nodes: ensure fields like email, total, and address are correctly mapped.
6. Validate the IF conditions: status check (pending/processing), address mismatch logic, high-value threshold (default: 500), and the email regex for disposable domains.
7. Review the Code node logic: fraud score calculation rules; adjust scoring weights if needed.
8. Test the workflow: use sample order data and verify the Slack alert output.
9. Activate the workflow for automatic execution.

### How To Customize Nodes
- **Check High Value (>500)** – Modify the threshold value to match your business needs.
- **Detect Disposable Email** – Update the regex pattern to include more domains.
- **Calculate Fraud Score (Code Node)** – Adjust the scoring logic, e.g. `if ($json.high_value_order) score += 3;`
- **Fraud Threshold Check** – Change the minimum score required to trigger alerts.
- **Slack Message Node** – Customize the alert message format and included fields.

### Add-ons (Enhancements)
- Store fraud results in a database (MySQL, MongoDB, etc.)
- Automatically cancel or hold suspicious orders via the WooCommerce API
- Send email alerts in addition to Slack notifications
- Add IP geolocation checks for advanced fraud detection
- Integrate with third-party fraud detection APIs
- Add risk categorization (Low / Medium / High)

### Use Case Examples
- Detect high-value fraudulent orders before fulfillment
- Identify mismatched shipping addresses for manual review
- Flag orders using disposable or temporary email addresses
- Monitor admin-created orders to reduce internal misuse risk
- Real-time fraud alerts for operations teams via Slack

> There can be many more use cases depending on how you extend and customize this workflow.

### Troubleshooting Guide
| Issue | Possible Cause | Solution |
| --- | --- | --- |
| No orders fetched | Incorrect WooCommerce credentials | Verify API keys and store URL |
| Slack message not sent | Wrong Slack credentials or channel ID | Reconnect Slack and check channel |
| Fraud score always 0 | Conditions not triggering | Verify IF node logic and data mapping |
| Email detection not working | Regex not matching | Update regex pattern |
| Workflow not running | Schedule trigger not configured | Set interval correctly and activate workflow |

### Need Help?
If you need assistance setting up this workflow, customizing fraud rules, or adding advanced features, our n8n workflow development team at WeblineIndia is here to help. We can help you:
- Customize this workflow for your business needs
- Integrate with external systems and APIs
- Build advanced fraud detection logic
- Create fully automated eCommerce workflows

👉 Reach out to WeblineIndia for expert support and tailored automation solutions.
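For reference, the Calculate Fraud Score Code-node logic might combine the four checks like this. The boolean field names and the weights other than the documented `score += 3` rule are illustrative assumptions; tune them to your risk appetite.

```javascript
// Sketch of the Calculate Fraud Score Code node (field names are assumptions).
function calculateFraudScore(order) {
  let score = 0;
  if (order.address_mismatch) score += 2; // billing vs shipping differ
  if (order.high_value_order) score += 3; // total above threshold (default 500)
  if (order.disposable_email) score += 3; // matched the disposable-domain regex
  if (order.created_by_admin) score += 1; // admin-created order
  return score;
}

const risky = calculateFraudScore({ high_value_order: true, disposable_email: true });
```

With this weighting, a Fraud Threshold Check of, say, 3 would alert on any single strong signal, while a threshold of 5 would require at least two signals to fire.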
by Cheng Siong Chin
## How It Works
This workflow automates academic research processing by routing queries through specialized AI models while maintaining contextual memory. Designed for researchers, faculty, and graduate students, it solves the challenge of managing multiple AI models for different research tasks while preserving conversation context across sessions.

The system accepts research queries via webhook, stores them in vector databases for semantic search, and intelligently routes requests to the appropriate AI model (OpenAI, Anthropic Claude, or NVIDIA NIM). Results are consolidated, formatted, and delivered via email with full citation tracking. The workflow maintains conversation history using Pinecone vector storage, enabling follow-up queries that reference previous interactions. This eliminates manual model switching, context loss, and repetitive credential management, streamlining research workflows from literature review to hypothesis generation.

## Setup Steps
1. Configure Pinecone credentials.
2. Add an OpenAI API key for GPT-4 access and embeddings.
3. Set up Anthropic Claude API credentials for advanced reasoning.
4. Configure an NVIDIA NIM API key for specialized academic models.
5. Connect Google Sheets for query logging and result tracking.
6. Set Gmail OAuth credentials for automated result delivery.
7. Configure the webhook URL for the query submission endpoint.

## Prerequisites
Active accounts and API keys for Pinecone and OpenAI.

## Use Cases
Literature review automation with semantic paper discovery.

## Customization
Modify the AI model selection logic for domain-specific optimization.

## Benefits
Reduces research processing time by 60% through automated routing.
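The model-selection logic could be sketched roughly as follows. The keywords and routing targets here are hypothetical, not the template's exact rules, and are the kind of logic the Customization section suggests adapting per domain.

```javascript
// Hypothetical intent-based routing: pick a model family per research task.
function routeModel(query) {
  if (/literature review|semantic|citation/i.test(query)) return "openai";
  if (/hypothesis|reasoning|critique/i.test(query)) return "anthropic";
  return "nvidia-nim"; // fallback: specialized academic models
}

const target = routeModel("Generate a hypothesis from these findings");
```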
by Pixcels Themes
## AI Assignment Grader with Automated Reporting

### Who's it for
This workflow is designed for educators, professors, academic institutions, coaching centers, and edtech platforms that want to automate the grading of written assignments or test papers. It's ideal for scenarios where consistent evaluation, detailed feedback, and structured result storage are required without manual effort.

### What it does / How it works
This workflow automates the end-to-end grading process for student assignments submitted as PDFs:
1. A student's test paper is uploaded via a webhook endpoint.
2. The workflow extracts text from the uploaded PDF file.
3. Student metadata (name, assignment title) is prepared and combined with the extracted answers.
4. A predefined answer script (model answers with a marking scheme) is loaded into the workflow.
5. An AI grading agent powered by Gemini compares the student's responses against the answer script. The AI:
   - Evaluates each question
   - Assigns marks based on correctness and completeness
   - Generates per-question feedback
   - Calculates total marks, percentage, and grade
6. The structured grading output is converted into an HTML grading report and a CSV file for records.
7. The final CSV grading report is automatically uploaded to Google Drive for storage and sharing.

All grading logic runs automatically once the test paper is submitted.

### Requirements
- Google Gemini (PaLM) API credentials
- Google Drive OAuth2 credentials
- A webhook endpoint configured in n8n
- PDF test papers submitted in a supported format
- A predefined answer script with marks per question

### How to set up
1. Connect your Google Gemini credentials in n8n.
2. Connect your Google Drive account and select the destination folder.
3. Enable and copy the webhook URL for test paper uploads.
4. Customize the Load Answer Script node with your assignment's correct answers and marking scheme.
5. (Optional) Adjust the grading instructions or output format in the AI Agent prompt.
6. Test the workflow by uploading a sample PDF assignment.

### How to customize the workflow
- Update the AI grading rubric to be stricter or more lenient.
- Modify the feedback style (short comments vs. detailed explanations).
- Change grading scales, total marks, or grade boundaries.
- Store results in additional systems (LMS, database, email notifications).
- Add plagiarism checks or similarity scoring before grading.
- Generate PDF reports instead of CSV/HTML if required.

This workflow enables fast, consistent, and scalable assignment grading while giving students clear, structured feedback and educators reliable records.
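The conversion of the AI's structured grading output into a CSV record could look roughly like this. The output shape (`student`, `questions`, per-question `marks`/`feedback`) is an assumption about what the grading agent returns.

```javascript
// Flatten the grader's structured result into CSV: one row per question.
function gradingToCsv(result) {
  const header = "student,question,marks,max_marks,feedback";
  const rows = result.questions.map(q =>
    [result.student, q.number, q.marks, q.maxMarks, `"${q.feedback}"`].join(","));
  return [header, ...rows].join("\n");
}

const csv = gradingToCsv({
  student: "Jane Doe",
  questions: [{ number: 1, marks: 4, maxMarks: 5, feedback: "Missed one step" }],
});
```

Quoting the feedback field guards against commas inside the AI's comments breaking the column layout.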
by Tristan V
## YouTube Video Transcript Summarizer — Discord Bot

> Paste a YouTube URL into a Discord channel and this workflow automatically extracts the transcript, uses an LLM to generate a concise summary, and stores everything in a database — all in seconds.

> **Self-hosted n8n only.** This workflow uses the Execute Command node to run yt-dlp inside the n8n container. This requires shell access, which is only available on self-hosted instances (Docker, VPS, etc.) — it will not work on n8n Cloud.

### Prerequisites
| Tool | Purpose |
|------|---------|
| Discord Bot | Listens for messages and sends replies |
| yt-dlp | Downloads subtitles and video metadata (must be installed in the n8n container) |
| Google Gemini API | Summarizes video transcripts (Gemini 2.5 Flash) |
| Supabase | Stores video data and run logs |

### Credentials
| Node | Credential Type | Notes |
|------|----------------|-------|
| Discord Trigger | Discord Bot Trigger | Bot token with Message Content Intent enabled |
| Discord Reply / Discord Not YouTube Reply / Discord Error Reply | Discord Bot | Same bot, used for sending messages |
| Message a model (Gemini) | Google Gemini (PaLM) API | API key from Google AI Studio |
| Save to Supabase / Log Run / Log Run Error | Supabase | Project URL + anon key |

### What It Does
When a user pastes a YouTube URL into a Discord channel, the workflow:
1. Detects the YouTube URL using RegEx (supports youtube.com, youtu.be, shorts, live)
2. Extracts the video's subtitles (English and Vietnamese) and metadata using yt-dlp
3. Cleans the raw VTT subtitle file into a plain-text transcript
4. Summarizes the transcript using an LLM (Gemini 2.5 Flash) into a TLDR + detailed summary (in the original language)
5. Stores the video metadata, full transcript, and AI summary in a Supabase database
6. Logs every run (success or error) to a separate runs table for tracking
7. Chunks long summaries into Discord-safe messages (≤2000 characters each)
8. Replies in Discord with the video title, stats, and the full summary

Non-YouTube messages get a friendly "not a YouTube link" reply. Errors are caught, classified, logged to the database, and reported back to Discord.

### How It Works

**Main Flow (Happy Path)**
```
Discord Trigger → Extract YouTube URL → Is YouTube URL?
 ├─ Yes → yt-dlp Get Metadata → Parse Metadata → Read Subtitle File → Parse Transcript
 │        → Message a model (Gemini) → Prepare Insert Data → Save to Supabase
 │        → Prepare Success Log → Log Run → Prepare Messages for Discord → Discord Reply
 └─ No  → Discord Not YouTube Reply
```

**Error Flow**
```
Error Trigger → Prepare Error Data → Log Run Error → Discord Error Reply
```

### Node Breakdown
| # | Node | Type | Description |
|---|------|------|-------------|
| 1 | Discord Trigger | Discord Bot Trigger | Fires on every message in the configured channel |
| 2 | Extract YouTube URL | Code | RegEx extracts video ID from message content |
| 3 | Is YouTube URL? | IF | Routes YouTube URLs to processing, others to rejection reply |
| 4 | yt-dlp Get Metadata | Execute Command | Downloads subtitles (.vtt, English/Vietnamese) and prints metadata JSON |
| 5 | Parse Metadata | Code | Extracts title, channel, views, duration via RegEx; decodes Unicode for multi-language support |
| 6 | Read Subtitle File | Execute Command | Dynamically finds and reads the .vtt file (continueOnFail enabled) |
| 7 | Parse Transcript | Code | Strips VTT timestamps/tags, deduplicates lines |
| 8 | Message a model | Google Gemini | Sends transcript to Gemini 2.5 Flash for TLDR + detailed summary (in original language) |
| 9 | Prepare Insert Data | Code | Merges summary with all metadata fields |
| 10 | Save to Supabase | Supabase | Inserts full record into videos table |
| 11 | Prepare Success Log | Code | Builds success run record |
| 12 | Log Run | Supabase | Inserts into runs table |
| 13 | Prepare Messages for Discord | Code | Chunks long summaries into Discord-safe messages (≤2000 chars) |
| 14 | Discord Reply | Discord | Posts summary preview to channel |
| 15 | Discord Not YouTube Reply | Discord | Replies when message isn't a YouTube link |
| 16 | Error Trigger | Error Trigger | Catches any unhandled node failure |
| 17 | Prepare Error Data | Code | Classifies error type and extracts context |
| 18 | Log Run Error | Supabase | Logs error to runs table |
| 19 | Discord Error Reply | Discord | Posts error message to channel |

### Setup Guide

**1. Discord Bot**
1. Go to the Discord Developer Portal
2. Create a new Application → Bot
3. Enable Message Content Intent under Privileged Intents
4. Copy the Bot Token
5. Invite the bot to your server with Send Messages + Read Messages permissions
6. In n8n, create a Discord Bot Trigger credential (for listening) and a Discord Bot credential (for sending replies)
7. Update the guild ID and channel ID in the Discord Trigger node and all Discord reply nodes

**2. yt-dlp**
yt-dlp must be installed in your n8n container. For Docker-based installs:

```
docker exec -it n8n apk add --no-cache python3 py3-pip
docker exec -it n8n pip3 install yt-dlp
```

Optional: place a cookies.txt file at /home/node/.n8n/cookies.txt to avoid age-gated or bot-detection issues.

**3. Google Gemini API**
1. Go to Google AI Studio
2. Click Create API Key and copy it
3. In n8n, click the Gemini node → Credential → Create New
4. Paste your API key and save

**4. Supabase**
1. Create a project at supabase.com
2. Go to Settings → API and copy the URL and anon key
3. In n8n, create a Supabase credential with your URL and API key
4. Run the SQL below in the Supabase SQL Editor to create the required tables

### Supabase SQL
```sql
-- Videos table: stores video metadata, transcript, and AI summary
CREATE TABLE videos (
  video_id      TEXT PRIMARY KEY,
  title         TEXT,
  channel       TEXT,
  upload_date   TEXT,
  duration      INT,
  view_count    INT,
  description   TEXT,
  transcript    TEXT,
  ai_summary    TEXT,
  thumbnail_url TEXT,
  channel_id    TEXT,
  date_added    TIMESTAMPTZ DEFAULT now()
);

-- Runs table: logs every workflow execution (success or error)
CREATE TABLE runs (
  video_id       TEXT PRIMARY KEY,
  process_status TEXT NOT NULL,
  error_type     TEXT,
  notes          TEXT,
  date_added     TIMESTAMPTZ DEFAULT now()
);
```
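As an illustration, the Extract YouTube URL Code node's regex could look like the sketch below, covering the watch, youtu.be, shorts, and live URL forms the workflow description lists. The actual node's pattern may differ.

```javascript
// Extract the 11-character video ID from any supported YouTube URL form.
function extractVideoId(text) {
  const m = text.match(
    /(?:youtube\.com\/(?:watch\?v=|shorts\/|live\/)|youtu\.be\/)([\w-]{11})/);
  return m ? m[1] : null; // null routes the message to the "not a YouTube link" reply
}
```

Returning `null` for non-matches gives the IF node a simple truthiness check to branch on.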
by Kshitij Matta
Stop paying for expensive plugins to recover valuable revenue from abandoned carts on your WooCommerce store.

### How It Works
1. When a product is added to a user's cart on your store, the workflow receives the cart contents via webhook, using the code provided in the red sticky note to fetch the required info.
2. It waits a specified time to give the user a chance to place an order.
3. It checks whether the order has been placed.
4. It builds the recovery email's HTML with dynamic information fetched from the previous nodes.
5. It sends the email to the user via the configured SMTP credentials.

### Setup Steps (20 minutes)
1. Set up your WooCommerce account credentials in n8n.
2. Set up the webhook in n8n and WooCommerce.
3. Add the provided code to functions.php, or as a PHP snippet via a plugin, on your website.
4. Customize the coupon code's phrase to your needs.
5. Customize the email's HTML code to your needs.

### Requirements
- **WooCommerce Store**: With REST API access enabled.
- **SMTP Credentials**: For sending recovery emails.

For any queries, you can ping me on X.
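The "has the order been placed?" check can be sketched by matching the cart owner's email against recent orders fetched from the WooCommerce REST API. The `billing.email` and `status` fields follow WooCommerce's order schema; the exact node logic in the template may differ.

```javascript
// Returns true if the customer completed a purchase, so no recovery email is needed.
function hasPlacedOrder(cartEmail, recentOrders) {
  return recentOrders.some(order =>
    order.billing?.email === cartEmail &&
    ["processing", "completed"].includes(order.status));
}

const skipEmail = hasPlacedOrder("shopper@example.com", [
  { status: "processing", billing: { email: "shopper@example.com" } },
]);
```

If this returns `false` after the wait period, the workflow proceeds to build and send the recovery email.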
by Jitesh Dugar
Transform your educational business with a fully automated mobile storefront. This workflow manages the entire student journey — from browsing course catalogues to secure payment processing and enrollment tracking — all within WhatsApp, by combining WATI, Razorpay, and Google Sheets.

## 🎯 What This Workflow Does
Turns WhatsApp into a 24/7 automated enrollment desk:
- **📝 Captures Student Intent** – Receives text commands like `courses`, `enroll`, or `pay` via the WATI Trigger from the student's phone.
- **🚦 Smart Message Routing** – A Switch node identifies the keyword to trigger the correct path:
  - `courses`: Displays the full course catalogue.
  - `enroll`: Shows specific course details and a payment CTA.
  - `pay`: Generates a unique Razorpay payment link.
  - `mystatus`: Fetches the student's personal enrollment history.
- **👁️ Dynamic Catalogue Generation** – Fetches live data from Google Sheets to build a formatted WhatsApp message with course codes, prices, and durations.
- **💳 Instant Payment Processing** – Integrates with the Razorpay API to create secure, short-URL payment links tailored to the specific course and student.
- **📊 Automated CRM Logging** – Logs every enrollment attempt as "Pending" in Google Sheets, capturing timestamps, phone numbers, and unique payment IDs.

## ✨ Key Features
- **White-Label Automation:** Sell courses under your own brand without needing a complex website or LMS.
- **Real-Time Status Tracking:** Students can instantly view their active and pending enrollments with the `mystatus` command.
- **Native Razorpay Integration:** Uses a clean REST API approach (HTTP Request) to generate payment links without requiring external SDKs.
- **Formatted Course Cards:** Automatically generates detailed summaries for each course, including instructor info and start dates.
- **Multi-Category Support:** Organizes your catalogue by subject (e.g., Programming, Marketing) for a professional user experience.

## 💼 Perfect For
- **Independent Tutors:** Selling recorded workshops or live sessions without manual billing.
- **Coaching Institutes:** Automating the registration process for high-volume course launches.
- **Skill-Based Bootcamps:** Providing a low-friction "chat-to-pay" experience for mobile users.
- **Corporate Trainers:** Tracking employee registrations for internal certification programs.

## 🔧 What You'll Need
### Required Integrations
- **WATI** – To handle WhatsApp message triggers and delivery.
- **Razorpay** – To generate unique payment links via REST API.
- **Google Sheets** – To manage your course database and enrollment logs.

### Optional Customizations
- **Payment Confirmation:** Set up a Razorpay webhook to automatically update enrollment status from "Pending" to "Enrolled" upon payment.
- **Automated Welcome:** Add a node to send a "Course Access Guide" PDF once the payment is verified.

## 🚀 Quick Start
1. **Import Template** – Copy the JSON and import it into your n8n instance.
2. **Set Credentials** – Connect your WATI, Razorpay (Basic Auth), and Google Sheets accounts.
3. **Configure Sheets** – Ensure your Google Sheet has headers for:
   - Courses tab: `name, code, category, price, duration, shortDesc, description, instructor, startDate`
   - Enrollments tab: `timestamp, phone, courseCode, courseName, amount, status, paymentlinkId, paymentlinkUrl`
4. **Test Browsing** – Send the word `courses` to your WATI number.
5. **Simulate Payment** – Send `pay <course_code>` to receive your first automated payment link.

## 🎨 Customization Options
- **Currency Setup:** Change the currency from INR to USD or EUR in the Razorpay Payload node.
- **Personalized Feedback:** Edit the Build Enrollment Status code to change how the student's history is displayed.
- **Custom CTA:** Modify the "Enroll Detail Card" to include links to your YouTube demo or LinkedIn profile.

## 📈 Expected Results
- 95% reduction in manual coordination for course registrations and link sharing.
- Faster conversions by allowing students to pay the moment they show interest.
- Organized data with every student interaction logged in a single spreadsheet.
- Professional image using automated, well-formatted WhatsApp cards and official payment links.

## 🏆 Use Cases
- **Upskilling Bootcamps** – A programming school sends the `courses` list to a leads group; students enroll and pay for "Python 101" entirely through the chat.
- **Skill Progress Tracking** – A student types `mystatus` to see which courses they have paid for and which enrollments are still pending.
- **Flash Sales** – Promote a course code on Instagram; when users message that code to your WhatsApp, the bot handles the sale 24/7.

## 💡 Pro Tips
- **Shorthand Commands:** The bot handles case-insensitive commands, so `PAY PY101` and `pay py101` work equally well.
- **Razorpay Test Mode:** Always test your payment links using Razorpay's Test Mode keys before going live to ensure the links generate correctly.
- **Clean Database:** The Build Enrollment Status node filters by phone number so students only see their own private history.

Ready to start enrolling students? Import this template and connect your Razorpay account to automate your sales today!
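For reference, the Razorpay Payload node might build a Payment Links request body along these lines. This is a sketch against Razorpay's Payment Links API (`POST /v1/payment_links`), where amounts are given in the currency's smallest unit; field mapping in the actual template may differ.

```javascript
// Build the request body for a Razorpay payment link (amount in paise for INR).
function buildPaymentLinkPayload(course, phone) {
  return {
    amount: course.price * 100,          // e.g., INR 499 -> 49900 paise
    currency: "INR",
    description: `Enrollment: ${course.name} (${course.code})`,
    customer: { contact: phone },
    notes: { course_code: course.code }, // handy for matching the Enrollments tab later
  };
}

const payload = buildPaymentLinkPayload(
  { name: "Python 101", code: "PY101", price: 499 }, "+919999999999");
```

Carrying the course code in `notes` makes it easy for an optional Razorpay webhook to flip the matching Enrollments row from "Pending" to "Enrolled" after payment.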
by vinci-king-01
This workflow processes raw meeting recordings or handwritten notes, automatically transcribes and summarizes them, and then distributes the concise summary to all meeting participants via Microsoft Teams while also creating an action-item task in ClickUp. The goal is to save time, keep everyone aligned, and ensure follow-up tasks are tracked in your project management workspace. Pre-conditions/Requirements Prerequisites n8n instance (self-hosted or n8n.cloud) ScrapeGraphAI community node installed Microsoft Teams tenant with permissions to create Incoming Webhooks or use Bot Framework ClickUp workspace and a target List to hold meeting action items Optional: OpenAI or any LLM API account for high-quality summarization Required Credentials Microsoft Teams Webhook URL** – to post summary messages ClickUp Personal Access Token** – to create tasks OpenAI API Key** (optional but recommended) – for AI-powered summarization ScrapeGraphAI API Key** – placeholder key to satisfy the template requirement Specific Setup Requirements | Item | Description | Example | |------|-------------|---------| | Teams Channel Webhook | Create an Incoming Webhook in the desired Teams channel and copy the URL | https://outlook.office.com/webhook/... | | ClickUp List ID | The numeric ID of the list where tasks will be created | 90123456 | | Summarization Model | The LLM model or API you prefer to use | gpt-3.5-turbo | How it works This workflow transcribes or parses meeting content, leverages an LLM to generate a concise summary and action items, then distributes the results to participants in Microsoft Teams and creates a follow-up task in ClickUp. Everything runs in a single automated flow triggered manually or on a schedule. Key Steps: Manual Trigger**: Start the workflow after a meeting ends. Sticky Note**: Provides on-canvas documentation for quick reference. Set Node – Upload Metadata**: Define meeting title, date, and participants. 
HTTP Request – Transcription**: Send audio/video file to a transcription service (e.g., Azure Speech-to-Text). Wait**: Pause until the transcription is complete. Code – Summarize**: Use OpenAI to summarize the transcript and extract action items. IF Node – Validate Output**: Ensure the summary exists; handle errors otherwise. Merge**: Combine summary with participant list. Microsoft Teams Node**: Send the summary to each participant or channel via webhook. ClickUp Node**: Create a task containing the summary and action items. Set up steps Setup Time: 10-15 minutes Create Teams Webhook: In Microsoft Teams, navigate to the target channel → Manage channel → Connectors → Incoming Webhook → give it a name (e.g., “MeetingBot”) and copy the generated URL. Generate ClickUp Personal Access Token: ClickUp → Settings → Apps → Generate Token → copy and store it securely. Get ClickUp List ID: Open the list in ClickUp and copy the numeric ID from the URL bar. Optional – Obtain OpenAI API Key: Sign in to OpenAI → API Keys → Create new secret key. Add Credentials in n8n: In n8n, go to Credentials → New → add Microsoft Teams, ClickUp, and OpenAI (Generic HTTP). Import Workflow: Paste the JSON workflow into n8n or use “Templates → Import”. Configure Nodes: In the Set Node: update meeting_title, date, and participants array. In HTTP Request: set the transcription service endpoint and authentication. In Code – Summarize: paste your OpenAI key or select credential. In Microsoft Teams Node: select the Teams credential and webhook URL. In ClickUp Node: select ClickUp credential and enter the List ID. Test: Click “Execute Workflow” on the Manual Trigger node. Verify that a message appears in Teams and a task is created in ClickUp. Node Descriptions Core Workflow Nodes: Manual Trigger** – Initiates the workflow manually or on a schedule. Sticky Note** – Documentation block outlining purpose and credential usage. Set** – Stores meeting metadata and participants list. 
HTTP Request** – Sends the meeting recording to a transcription service and fetches results.
Wait** – Holds the workflow until transcription is ready.
Code** – Summarizes the transcript and extracts action items via OpenAI.
IF** – Validates summarization success; branches on failure.
Merge** – Combines the summary text with participant emails/usernames.
Microsoft Teams** – Posts the summary to a Teams channel or direct messages.
ClickUp** – Creates a task containing the summary and action items.

Data Flow:

Manual Trigger → Set → HTTP Request → Wait → Code → IF → Merge → Microsoft Teams
Merge → ClickUp

Customization Examples

Change the summarization prompt:

```javascript
// Inside the Code node
const prompt = `
You are an expert meeting assistant.
Summarize the following transcript in under 150 words.
List action items in bullet points with owners.
Transcript:
${items[0].json.transcript}
`;
```

Send the summary as a PDF attachment:

```javascript
// Add a Convert & Save node before the Teams node:
// convert the markdown summary to PDF and attach it in the Teams node.
```

Data Output Format

The workflow outputs structured JSON data:

```json
{
  "meeting_title": "Q3 Strategy Sync",
  "date": "2024-05-10",
  "participants": ["john@corp.com", "jane@corp.com"],
  "summary": "We reviewed Q3 OKRs, decided to ...",
  "action_items": [
    { "owner": "John", "task": "Prepare budget draft", "due": "2024-05-20" },
    { "owner": "Jane", "task": "Compile market research", "due": "2024-05-25" }
  ],
  "clickup_task_id": "abcd1234",
  "teams_message_id": "msg7890"
}
```

Troubleshooting

Common Issues

Teams message not sent – Verify the Incoming Webhook URL and that the Teams node uses the correct credential.
ClickUp task missing – Ensure the List ID is correct and the ClickUp token has the tasks:write scope.
Empty summary – Check that the transcription text is populated and the OpenAI prompt is valid.

Performance Tips

Compress large audio/video files before sending them to the transcription service.
Use batching in the Teams node if the participant list exceeds 20, to avoid rate limits.
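To make the batching tip above concrete, here is a hypothetical helper (not part of the template itself) that a Code node placed before the Teams step could use to split a large participant list into webhook-sized chunks; the chunk size of 20 is an assumption you should tune to your tenant's limits:

```javascript
// Split a participant list into batches so each Teams webhook call
// targets at most `size` recipients (default 20, an assumed limit).
function chunkParticipants(participants, size = 20) {
  const batches = [];
  for (let i = 0; i < participants.length; i += size) {
    batches.push(participants.slice(i, i + size));
  }
  return batches;
}

// 45 placeholder addresses → batches of 20, 20, and 5
const everyone = Array.from({ length: 45 }, (_, i) => `user${i}@corp.com`);
const batches = chunkParticipants(everyone);
console.log(batches.map((b) => b.length)); // [ 20, 20, 5 ]
```

Each batch would then feed one Teams webhook call, keeping individual requests small and below rate limits.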
Pro Tips:

Schedule the workflow to auto-run 5 minutes after recurring meeting end times.
Customize the ClickUp task description template to include embedded links.
Add a “Send Email” node for participants not on Teams.
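The Code – Summarize → IF – Validate Output path described earlier can be sketched as follows. This is an illustrative parser, not the template's actual code: it assumes the model reply contains a summary paragraph followed by an "Action items:" bullet list in the form "- Owner: task":

```javascript
// Hypothetical parser for the Code – Summarize step: split a raw model
// reply into the structured fields the IF node validates downstream.
function parseSummaryReply(reply) {
  const [summaryPart, itemsPart = ''] = reply.split(/Action items:/i);
  const action_items = itemsPart
    .split('\n')
    .map((line) => line.replace(/^[-*]\s*/, '').trim())
    .filter(Boolean)
    .map((line) => {
      const [owner, ...task] = line.split(':');
      return { owner: owner.trim(), task: task.join(':').trim() };
    });
  return { summary: summaryPart.trim(), action_items };
}

const reply =
  'We reviewed Q3 OKRs and agreed on next steps.\n' +
  'Action items:\n- John: Prepare budget draft\n- Jane: Compile market research';
const parsed = parseSummaryReply(reply);
// parsed.summary is non-empty, so the IF node would take the success branch
console.log(JSON.stringify(parsed, null, 2));
```

If `summary` comes back empty, the IF node's failure branch can re-prompt or alert instead of posting to Teams.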
by Laiye-ADP
Workflow Introduction

Core Competence

Our invoice extraction workflow runs end-to-end automatically: Gmail invoice email screening → extraction of key fields from multi-format invoices → automatic archiving of results to Google Drive, replacing the repetitive manual labor of finance staff opening and keying in invoices.

Differentiated Advantages

High Accuracy: core-field extraction accuracy is 91%+, far exceeding that of similar products in the industry;
Table Extraction: for invoices containing tables, the line-extract technology delivers markedly better results;
Multi-format Compatibility: natively supports invoice formats such as PDF, images, Word, and Excel, with no conversion needed;
Lightweight Integration: integrates seamlessly with Gmail and Google Drive, out of the box.

Company Introduction

Laiye ADP (Agentic Document Processing) is a new-generation platform for end-to-end automated document processing, built on large language models and vision-language models combined with agent technology. For more information, visit the Laiye Technology website: https://laiye.com/

Use Cases

Multi-supplier integration: efficiently process invoices from multiple suppliers and automatically extract key invoice information for archiving;
Accounting firms batch-processing large volumes of invoice data: handle the increased invoice processing requirements brought by growth in the number of clients without adding staff;
Cross-border trade enterprises: for multi-language or complex-layout overseas invoices, understand the documents and extract the important data without manual setup or human processing;
Small and medium-sized technology enterprises: quickly identify key information such as invoice date, amount, and number from employee reimbursement invoices, and say goodbye to manual data entry.
How it works

Step 1: Complete Gmail authorization

Authorize your Google email account; the workflow automatically fetches attachments from it. To accurately obtain and identify invoice attachments, you can configure your own email filter, for example:

Emails with attachments and subjects containing keywords like "invoice";
Emails from suppliers;
Emails under a designated label.

Step 2: Automate document filtering

The automatic document filter is preconfigured for you and passes through only the documents that qualify for the automated processing flow; no manual operation is needed. Documents that do not meet the conditions are stored in a designated folder for quick retrieval during manual processing, so there is no need to sift through emails one by one.

The preset conditions for automatic processing are:

The attachment title contains one of the keywords: invoice, receipt, expenses, fee (any one is sufficient; to match other terms common in your business, modify the {{ $json.attachment_extension }} field assignment in the filter).
File size: < 50 MB.
File format: .jpeg, .jpg, .png, .bmp, .tiff, .pdf, .doc, .docx, .xls, .xlsx.

Step 3: Complete ADP permission configuration

Register for free at adp.laiye.com to receive 100 free calls per month.
Select the out-of-the-box Agent Application "Invoice/Receipt" card, and click the more menu [...] on the card to obtain your exclusive API Key, App Secret, and Access Key.
Fill the obtained keys into the configuration of the 【Laiye ADP HTTP Request Node】.

After ADP completes invoice extraction, it returns structured JSON data containing more than 10 text fields, such as "Invoice Name", "Invoice Number", and "Invoice Date", as well as complete invoice table fields, such as "Item Name", "Description", "Quantity", and "Unit Price".

Step 4: Complete Google Drive authorization

Files processed by ADP are automatically converted into binary data to ensure smooth import into Google Drive (Sheet).
Files not processed by ADP are saved in their original form to the [Untreated Document] folder; if all files were processed automatically, this folder is not created.
Extracted documents can be saved automatically to any folder you specify. By default, they are stored in My Drive; to use a custom folder, simply modify the Parent Folder setting.

Professional Community and Latest News

Follow us on LinkedIn for more updates: https://www.linkedin.com/company/laiye

We share the latest updates on Laiye ADP products;
You can share your successful invoice processing cases.

Problems & Support

If you encounter any issues during use, contact us for technical support at global_product@laiye.com. Please include the following in your message:

Describe the problem you encountered in as much detail as possible
Your current invoice processing volume and type
The specific supplier format or invoice layout you handle
Target accounting software or system integration
Any technical errors or issues with extraction accuracy
Your current manual processing workflow and pain points

Response time: within 24 hours on working days
Professional Areas: Invoice Processing Automation, Order Processing Automation, Receipt Processing Automation
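The Step 2 filter conditions can be sketched as a small predicate. This is an illustrative version, not the template's actual filter node, and it assumes the keyword, size, and format checks combine as AND conditions (with the keywords themselves OR-ed, as the text states):

```javascript
// Hypothetical Step 2 filter: a file qualifies for automatic processing
// only if its name contains one of the keywords, it is under 50 MB,
// and its extension is in the supported list.
const KEYWORDS = ['invoice', 'receipt', 'expenses', 'fee'];
const EXTENSIONS = [
  '.jpeg', '.jpg', '.png', '.bmp', '.tiff',
  '.pdf', '.doc', '.docx', '.xls', '.xlsx',
];
const MAX_BYTES = 50 * 1024 * 1024; // 50 MB

function shouldAutoProcess(file) {
  const name = file.name.toLowerCase();
  const ext = name.slice(name.lastIndexOf('.'));
  return (
    KEYWORDS.some((k) => name.includes(k)) &&
    file.size < MAX_BYTES &&
    EXTENSIONS.includes(ext)
  );
}

console.log(shouldAutoProcess({ name: 'Invoice_2024_03.pdf', size: 2000000 })); // true
console.log(shouldAutoProcess({ name: 'contract.pdf', size: 2000000 }));        // false (no keyword)
```

Files for which this returns false are the ones routed to the [Untreated Document] folder for manual handling.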
by David Olusola
🎥 Auto-Save Zoom Recordings to Google Drive + Log Meetings in Airtable

This workflow automatically saves Zoom meeting recordings to Google Drive and logs all important details into Airtable for easy tracking. Perfect for teams that want a searchable meeting archive.

⚙️ How It Works

Zoom Recording Webhook
Listens for recording.completed events from Zoom.
Captures metadata (Meeting ID, Topic, Host, File Type, File Size, etc.).

Normalize Recording Data
A Code node extracts the Zoom payload and formats it into clean JSON.

Download Recording
Uses an HTTP Request node to download the recording file.

Upload to Google Drive
Saves the recording into your chosen Google Drive folder.
Returns the file ID and share link.

Log Result
Combines the Zoom metadata with the Google Drive file info.

Save to Airtable
Logs all details into your Meeting Logs table: Meeting ID, Topic, Host, File Type, File Size, Google Drive Saved (Yes/No), Drive Link, Timestamp.

🛠️ Setup Steps

1. Zoom
Create a Zoom App and enable the recording.completed event.
Add the workflow's Webhook URL as your Zoom Event Subscription endpoint.

2. Google Drive
Connect OAuth in n8n.
Replace YOUR_FOLDER_ID with your destination Drive folder.

3. Airtable
Create a base with a Meeting Logs table.
Add columns: Meeting ID, Topic, Host, File Type, File Size, Google Drive Saved, Drive Link, Timestamp.
Replace YOUR_AIRTABLE_BASE_ID in the node.

📊 Example Airtable Output

| Meeting ID | Topic | Host | File Type | File Size | Google Drive Saved | Drive Link | Timestamp |
|------------|-------|------|-----------|-----------|--------------------|------------|-----------|
| 987654321 | Team Sync | host@email.com | MP4 | 104 MB | Yes | 🔗 Link | 2025-08-30 15:02:10 |

⚡ With this workflow, every Zoom recording is safely archived in Google Drive and logged in Airtable for quick search, reporting, and compliance tracking.
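The Normalize Recording Data step can be sketched as follows. This is an illustrative version of the Code node, not the template's actual code; the field paths follow Zoom's documented recording.completed event shape, and it assumes the first entry in recording_files is the file you want (adjust if your payload differs):

```javascript
// Hypothetical normalizer: pull the fields the Airtable log needs out
// of a Zoom recording.completed webhook payload.
function normalizeRecording(payload) {
  const obj = payload.payload.object;
  const file = obj.recording_files[0]; // assume first file (e.g. the MP4)
  return {
    meeting_id: obj.id,
    topic: obj.topic,
    host: obj.host_email,
    file_type: file.file_type,
    file_size_mb: Math.round(file.file_size / (1024 * 1024)),
    download_url: file.download_url,
    timestamp: new Date().toISOString(),
  };
}

// Minimal sample payload mirroring the example Airtable row above
const sample = {
  payload: {
    object: {
      id: 987654321,
      topic: 'Team Sync',
      host_email: 'host@email.com',
      recording_files: [
        { file_type: 'MP4', file_size: 109051904, download_url: 'https://zoom.us/rec/...' },
      ],
    },
  },
};
console.log(normalizeRecording(sample)); // file_size_mb → 104
```

The HTTP Request node then downloads `download_url`, and the remaining fields flow straight into the Google Drive and Airtable nodes.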