by Oneclick AI Squad
This enterprise-grade n8n workflow automates the Pharmaceutical Raw Material COA Verification & Vendor Quality Scoring System — from upload to final reporting — using AI-powered document extraction, specification matching, and dynamic vendor scoring. It processes Certificates of Analysis (COAs) to validate compliance, assign quality scores, generate approvals or CAPA requests, and notify stakeholders, ensuring regulatory adherence and vendor accountability with full audit trails and zero manual data entry.

Key Features

- **Webhook-triggered COA Upload** for seamless integration with file-sharing systems
- **AI Document Extraction** to parse test results and data from uploaded COAs
- **Automated Specification Analysis** matching against predefined quality standards
- **Weighted Vendor Scoring** based on compliance metrics and historical performance
- **Compliance Decision Engine** with approve/reject branching and CAPA flagging
- **Dynamic Certificate Generation** for approved materials, including digital signatures
- **Vendor Database Synchronization** to update scores and records in real time
- **Targeted Email Notifications** for QA, production, and executive teams
- **Executive Reporting Dashboard** with summaries, scores, and verification logs
- **Audit-Ready Logging** for all steps, deviations, and decisions

Workflow Process

| Step | Node | Description |
| ---- | ---- | ----------- |
| 1 | START: Upload COA | Webhook trigger receives the uploaded COA file for verification |
| 2 | EXTRACT: Parse COA | Extracts test results and data from the COA document using AI parsing |
| 3 | ANALYZE: Vendor Compliance | Compares extracted data against specifications and flags deviations |
| 4 | SCORE: Vendor Quality Rating | Calculates a weighted compliance score based on test results and history |
| 5 | DECISION: Compliance Route | Evaluates score/status and branches to the approve (green) or reject (red) path |
| 6 | APPROVED: Generate Approval Cert (Approved Path) | Creates a digital approval certificate for compliant materials |
| 7 | Update Vendor Database | Saves the verification record, score, and status to the vendor database |
| 8 | NOTIFY: Email Alert | Sends detailed notifications to QA/production teams |
| 9 | REPORT: Final Report | Generates an executive summary with COA scores and verifications |
| 10 | REJECT: Generate Rejection Report (Reject Path) | Produces a rejection report with deviation details |
| 11 | Request CAPA | Initiates the Corrective and Preventive Action (CAPA) process |
| 12 | PATH REJECTED | Terminates the rejected branch with an audit log entry |

Setup Instructions

1. Import Workflow
   - Open n8n → Workflows → Import from Clipboard
   - Paste the JSON workflow
2. Configure Credentials

| Integration | Details |
| ----------- | ------- |
| File Storage (e.g., Google Drive/AWS S3) | API key or OAuth for COA upload handling |
| AI Extraction (e.g., Claude or OCR tool) | API key for document parsing (e.g., claude-3-5-sonnet-20241022) |
| Database (e.g., PostgreSQL/Airtable) | Connection string for vendor records and specs |
| Email (SMTP/Gmail) | SMTP credentials or OAuth for notifications |

3. Update Database/Sheet IDs — ensure your database or Google Sheets include:
   - VendorDatabase for scores and history
   - Specifications for quality standards
4. Set Triggers
   - **Webhook:** /coa-verification (for real-time file uploads)
   - **Manual/Scheduled:** for batch processing if needed
5. Run a Test — use manual execution to confirm:
   - COA extraction and analysis
   - Score calculation and branching
   - Email notifications and report generation (use a sample COA file)

Database/Sheets Structure

VendorDatabase

| vendorId | coaId | score | complianceStatus | lastVerified | deviations | capaRequested |
| -------- | ----- | ----- | ---------------- | ------------ | ---------- | ------------- |
| VEND-123456 | COA-789012 | 92.5 | Approved | 2025-11-04T14:30:00Z | None | No |

Specifications

| materialType | testParam | specMin | specMax | weight |
| ------------ | --------- | ------- | ------- | ------ |
| API Excipient | Purity (%) | 98.0 | 102.0 | 0.4 |

System Requirements

| Requirement | Version/Access |
| ----------- | -------------- |
| n8n | v1.50+ (AI and database integrations supported) |
| AI Parsing API | claude-3-5-sonnet-20241022 or equivalent OCR |
| Database API | SQL connection or Google Sheets API |
| Email API | https://www.googleapis.com/auth/gmail or SMTP |
| File Storage | AWS S3 or Google Drive API access |

Optional Enhancements

- Integrate ERP systems (e.g., SAP) for direct material release
- Add regulatory export to PDF/CSV for FDA audits
- Implement historical trend analysis for vendor performance dashboards
- Add multi-language support for global COA extraction
- Connect Slack/Teams for real-time alerts beyond email
- Enable batch processing for high-volume uploads
- Add AI anomaly detection for predictive non-compliance flagging
- Build custom scoring models via integrated ML tools

Result: A fully automated quality assurance pipeline that verifies COAs, scores vendors, and drives compliance decisions — ensuring pharmaceutical safety and efficiency with AI precision and complete traceability.

Explore More AI Workflows: Get in touch with us for custom n8n automation!
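The weighted scoring step (step 4) can be sketched as a small n8n Code-node function. The field names mirror the example Specifications sheet above, but the pass/fail aggregation rule is an illustrative assumption, not the template's actual code:

```javascript
// Score a COA against the Specifications table: each test contributes its
// weight when the result falls inside [specMin, specMax], zero otherwise.
// The final score is the earned weight as a percentage of total weight.
function scoreCoa(results, specs) {
  let earned = 0;
  let total = 0;
  for (const spec of specs) {
    const value = results[spec.testParam];
    if (value === undefined) continue; // untested parameter: skip
    total += spec.weight;
    if (value >= spec.specMin && value <= spec.specMax) earned += spec.weight;
  }
  return total === 0 ? 0 : +(100 * earned / total).toFixed(1);
}

const specs = [
  { testParam: "Purity (%)", specMin: 98.0, specMax: 102.0, weight: 0.4 },
  { testParam: "Moisture (%)", specMin: 0, specMax: 0.5, weight: 0.6 },
];
console.log(scoreCoa({ "Purity (%)": 99.2, "Moisture (%)": 0.8 }, specs)); // 40
```

A score like this can then feed the DECISION node's approve/reject threshold.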
by Amit Mehta
Streamline Your Zoom Meetings with Secure, Automated Stripe Payments

This comprehensive workflow automates the entire process of setting up a paid online event, from scheduling a Zoom meeting and creating a Stripe payment link to tracking participants and sending confirmation emails.

How it Works

This workflow has two primary, distinct branches: Event Creation and Participant Registration.

Event Creation Flow (Triggered via Form):

1. An administrator submits details (title, price, date/time) via a form.
2. The workflow creates a new Zoom meeting with a unique password.
3. It creates a Stripe Product and a Payment Link.
4. A dedicated Google Sheet tab is created for tracking participants.
5. An email is sent to the event organizer with all the details, including the Zoom link, payment link, and participant list URL.

Participant Registration Flow (Triggered via Stripe Webhook):

1. A webhook is triggered when a Stripe payment is completed (checkout.session.completed).
2. The participant's details are added to the dedicated Google Sheet tab.
3. A confirmation email is sent to the participant with the Zoom link and password.
4. A notification email is sent to the event organizer about the new registration.

Use Cases

- **Webinar Sales**: Automate setup and registration for paid webinars.
- **Consulting/Coaching Sessions**: Streamline the booking and payment process for group coaching calls.
- **Online Classes**: Handle registration, payment, and access distribution for online courses or classes.

Setup Instructions

1. Credentials: Add credentials for:
   - Zoom: for creating the meeting.
   - Google: you need both Gmail and Google Sheets credentials.
   - Stripe: for creating products and handling payment webhooks.
2. Google Sheet: Create a new, blank Google Sheet to hold meeting and participant information.
3. Config Node: Fill the Config node with:
   - currency (e.g., EUR)
   - sheet_url (the URL of the Google Sheet you created)
   - teacher_email (the organizer/host's email)
Workflow Logic

The workflow splits into two logical parts handled by an If node:

Part A: Event Creation (Triggered by Creation Form)

1. Trigger: Creation Form (Form Trigger).
2. Check: if is creation flow (If) evaluates to true.
3. Zoom: Create Zoom meeting creates the session.
4. Stripe Product: Create Stripe Product creates a product and price in Stripe.
5. Stripe Link: Create payment link generates the public payment link, embedding Zoom and sheet metadata.
6. Google Sheet: Create participant list creates a new sheet tab for the event.
7. Email Host: Send email to teacher notifies the host of the successful setup.

Part B: Participant Registration (Triggered by On payment)

1. Trigger: On payment (Stripe Trigger - checkout.session.completed).
2. Format: Format participant extracts customer details.
3. Google Sheet: Add participant to list appends the new participant's info to the event's sheet.
4. Email Participant: Send confirmation to participant sends the Zoom access details.
5. Email Host: Notify teacher sends a registration alert.

Node Descriptions

| Node Name | Description |
|-----------|-------------|
| Creation Form | A form trigger used to input the event's required details (title, price, start date/time). |
| On payment | A Stripe trigger that listens for the checkout.session.completed event, indicating a successful payment. |
| Create Zoom meeting | Creates a new Zoom meeting, calculating the start time based on the form inputs. |
| Create Stripe Product | Posts to the Stripe API to create a new product and price based on the form data. |
| Create payment link | Creates a Stripe Payment Link, embedding Zoom meeting and Google Sheet ID metadata. |
| Create participant list | Creates a new tab (named dynamically) in the configured Google Sheet for event tracking. |
| Add participant to list | Appends a new row to the event's Google Sheet tab upon payment completion. |
| Send email to teacher / Notify teacher | Sends emails to the host/organizer for creation confirmation and new participant registration, respectively. |
| Send confirmation to participant | Sends the welcome email to the paying customer with the Zoom access details retrieved from the Stripe metadata. |

Customization Tips

- **Email Content**: Adapt the email contents in the Gmail nodes to fit your branding and tone.
- **Currency**: Change the currency in the Config node.
- **Zoom Password**: The password is set to a random 4-character string; you can modify the logic in the Create Zoom meeting node.
- **Stripe Price**: The price is sent to Stripe in the smallest currency unit (e.g., cents, * 100).

Suggested Sticky Notes for Workflow

- **Setup**: "Add your credentials [Zoom, Google, Stripe]. Note: for Google, you need to add Gmail and Google Sheets. Create a new Google Sheet. Keep this sheet blank for now. And fill the Config node."
- **Creation Form**: "Your journey to easy event management starts here. Click this node, copy the production URL, and keep it handy. It's your personal admin tool for quickly creating new meetings."
- **Customize**: "Feel free to adapt email contents to your needs."
- **Config**: "Set up your flow."

Required Files

- 2DT5BW5tOdy87AUl_Streamline_Your_Zoom_Meetings_with_Secure,_Automated_Stripe_Payments.json: the n8n workflow export file.
- A new, blank Google Sheet (URL configured in the Config node).

Testing Tips

- **Test Creation**: Run the Creation Form to trigger the Part A flow. Verify that a Zoom meeting and Stripe Payment Link are created, a new Google Sheet tab appears, and the host receives the setup email.
- **Test Registration**: Simulate a successful payment to the generated Stripe link to trigger the Part B flow. Verify that the participant is added to the Google Sheet, receives the confirmation email with Zoom details, and the host receives the notification.
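The Stripe Price tip above can be illustrated with a minimal sketch of the conversion a Code node might perform before calling the Stripe API; the helper name and the two-decimal assumption are mine, not the workflow's:

```javascript
// Convert a human-entered price (e.g. "19.99" EUR from the form) into the
// smallest currency unit that Stripe expects (cents for EUR/USD).
// Hypothetical helper; the actual workflow does this inline.
function toSmallestUnit(price) {
  const amount = Number(price);
  if (!Number.isFinite(amount) || amount < 0) {
    throw new Error(`Invalid price: ${price}`);
  }
  // Math.round avoids floating-point artifacts like 19.99 * 100 = 1998.9999...
  return Math.round(amount * 100);
}

console.log(toSmallestUnit("19.99")); // 1999
console.log(toSmallestUnit(5));       // 500
```

Note this assumes a two-decimal currency; zero-decimal currencies such as JPY would skip the * 100.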
Suggested Tags & Categories #Stripe #Zoom #Payment #E-commerce #GoogleSheets #Gmail #Automation #Webinar
by Spiritech Studio
This n8n template demonstrates how to automatically extract text content from PDF documents received via WhatsApp messages using OCR. It is designed for use cases where users submit documents through WhatsApp and the document content needs to be digitized for further processing — such as document analysis, AI-powered workflows, compliance checks, or data ingestion.

Good to know

- This workflow processes PDF documents only.
- OCR is handled using AWS Textract, which supports both scanned and digital PDFs.
- AWS Textract pricing depends on the number of pages processed. Refer to AWS Textract Pricing for up-to-date costs.
- An AWS S3 bucket is required as an intermediate storage layer for the PDF files.
- Processing time may vary depending on PDF size and number of pages.

How it works

1. The workflow is triggered when an incoming WhatsApp message containing a PDF document is received.
2. The PDF file is downloaded from WhatsApp's media endpoint using an HTTP Request node.
3. The downloaded PDF is uploaded to an AWS S3 bucket to make it accessible for OCR processing.
4. AWS Textract is invoked to analyze the PDF stored in S3 and extract all readable text content.
5. The Textract response is parsed and consolidated into a clean, ordered text output representing the PDF's content.

How to use

- The workflow can be triggered using a webhook connected to the WhatsApp Cloud API or any compatible WhatsApp integration.
- Ensure your AWS credentials have permission to upload to S3 and invoke Textract.
- Once active, simply send a PDF document via WhatsApp to start the extraction process automatically.

Requirements

- WhatsApp integration (e.g., WhatsApp Cloud API or provider webhook)
- AWS account with:
  - S3 bucket access
  - Textract permissions
- n8n instance with HTTP Request and AWS nodes configured

Customising this workflow

- Store extracted text in a database or document store.
- Pass the extracted content to an AI model for summarization, classification, or validation.
- Split output by pages or sections.
- Add file type validation or size limits.
- Extend the workflow to support additional document formats.
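The consolidation in step 5 can be sketched as follows. Textract returns a Blocks array whose BlockType "LINE" entries carry the recognized text; the page-grouping logic here is illustrative, not the template's exact code:

```javascript
// Consolidate an AWS Textract response into ordered plain text, one page
// at a time, joining LINE blocks with newlines and pages with blank lines.
function textractToText(response) {
  const pages = new Map();
  for (const block of response.Blocks || []) {
    if (block.BlockType !== "LINE") continue;
    const page = block.Page || 1;
    if (!pages.has(page)) pages.set(page, []);
    pages.get(page).push(block.Text);
  }
  return [...pages.keys()]
    .sort((a, b) => a - b)
    .map((p) => pages.get(p).join("\n"))
    .join("\n\n");
}

const sample = {
  Blocks: [
    { BlockType: "PAGE", Page: 1 },
    { BlockType: "LINE", Page: 1, Text: "Invoice #42" },
    { BlockType: "LINE", Page: 1, Text: "Total: $10" },
  ],
};
console.log(textractToText(sample)); // "Invoice #42\nTotal: $10"
```

For multi-page PDFs processed asynchronously, the same function can be applied to each page of results as Textract returns them.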
by Tristan V
YouTube Video Transcript Summarizer — Discord Bot

> Paste a YouTube URL into a Discord channel and this workflow automatically extracts the transcript, uses an LLM to generate a concise summary, and stores everything in a database — all in seconds.

> Self-hosted n8n only. This workflow uses the Execute Command node to run yt-dlp inside the n8n container. This requires shell access, which is only available on self-hosted instances (Docker, VPS, etc.) — it will not work on n8n Cloud.

Import this workflow into n8n

Prerequisites

| Tool | Purpose |
|------|---------|
| Discord Bot | Listens for messages and sends replies |
| yt-dlp | Downloads subtitles and video metadata (must be installed in the n8n container) |
| Google Gemini API | Summarizes video transcripts (Gemini 2.5 Flash) |
| Supabase | Stores video data and run logs |

Credentials

| Node | Credential Type | Notes |
|------|----------------|-------|
| Discord Trigger | Discord Bot Trigger | Bot token with Message Content Intent enabled |
| Discord Reply / Discord Not YouTube Reply / Discord Error Reply | Discord Bot | Same bot, used for sending messages |
| Message a model (Gemini) | Google Gemini (PaLM) API | API key from Google AI Studio |
| Save to Supabase / Log Run / Log Run Error | Supabase | Project URL + anon key |

What It Does

When a user pastes a YouTube URL into a Discord channel, the workflow:

1. Detects the YouTube URL using RegEx (supports youtube.com, youtu.be, shorts, live)
2. Extracts the video's subtitles (English and Vietnamese) and metadata using yt-dlp
3. Cleans the raw VTT subtitle file into a plain-text transcript
4. Summarizes the transcript using an LLM (Gemini 2.5 Flash) into a TLDR + detailed summary (in the original language)
5. Stores the video metadata, full transcript, and AI summary in a Supabase database
6. Logs every run (success or error) to a separate runs table for tracking
7. Chunks long summaries into Discord-safe messages (≤2000 characters each)
8. Replies in Discord with the video title, stats, and the full summary

Non-YouTube messages get a friendly "not a YouTube link" reply. Errors are caught, classified, logged to the database, and reported back to Discord.

How It Works

Main Flow (Happy Path)

    Discord Trigger → Extract YouTube URL → Is YouTube URL?
    ├─ Yes → yt-dlp Get Metadata → Parse Metadata → Read Subtitle File → Parse Transcript
    │        → Message a model (Gemini) → Prepare Insert Data → Save to Supabase
    │        → Prepare Success Log → Log Run → Prepare Messages for Discord → Discord Reply
    └─ No → Discord Not YouTube Reply

Error Flow

    Error Trigger → Prepare Error Data → Log Run Error → Discord Error Reply

Node Breakdown

| # | Node | Type | Description |
|---|------|------|-------------|
| 1 | Discord Trigger | Discord Bot Trigger | Fires on every message in the configured channel |
| 2 | Extract YouTube URL | Code | RegEx extracts the video ID from the message content |
| 3 | Is YouTube URL? | IF | Routes YouTube URLs to processing, others to the rejection reply |
| 4 | yt-dlp Get Metadata | Execute Command | Downloads subtitles (.vtt, English/Vietnamese) and prints metadata JSON |
| 5 | Parse Metadata | Code | Extracts title, channel, views, duration via RegEx; decodes Unicode for multi-language support |
| 6 | Read Subtitle File | Execute Command | Dynamically finds and reads the .vtt file (continueOnFail enabled) |
| 7 | Parse Transcript | Code | Strips VTT timestamps/tags, deduplicates lines |
| 8 | Message a model | Google Gemini | Sends the transcript to Gemini 2.5 Flash for a TLDR + detailed summary (in the original language) |
| 9 | Prepare Insert Data | Code | Merges the summary with all metadata fields |
| 10 | Save to Supabase | Supabase | Inserts the full record into the videos table |
| 11 | Prepare Success Log | Code | Builds the success run record |
| 12 | Log Run | Supabase | Inserts into the runs table |
| 13 | Prepare Messages for Discord | Code | Chunks long summaries into Discord-safe messages (≤2000 chars) |
| 14 | Discord Reply | Discord | Posts the summary preview to the channel |
| 15 | Discord Not YouTube Reply | Discord | Replies when the message isn't a YouTube link |
| 16 | Error Trigger | Error Trigger | Catches any unhandled node failure |
| 17 | Prepare Error Data | Code | Classifies the error type and extracts context |
| 18 | Log Run Error | Supabase | Logs the error to the runs table |
| 19 | Discord Error Reply | Discord | Posts the error message to the channel |

Setup Guide

1. Discord Bot
   - Go to the Discord Developer Portal
   - Create a new Application → Bot
   - Enable Message Content Intent under Privileged Intents
   - Copy the Bot Token
   - Invite the bot to your server with Send Messages + Read Messages permissions
   - In n8n, create a Discord Bot Trigger credential (for listening) and a Discord Bot credential (for sending replies)
   - Update the guild ID and channel ID in the Discord Trigger node and all Discord reply nodes

2. yt-dlp

   yt-dlp must be installed in your n8n container. For Docker-based installs:

       docker exec -it n8n apk add --no-cache python3 py3-pip
       docker exec -it n8n pip3 install yt-dlp

   Optional: place a cookies.txt file at /home/node/.n8n/cookies.txt to avoid age-gated or bot-detection issues.

3. Google Gemini API
   - Go to Google AI Studio
   - Click Create API Key and copy it
   - In n8n, click the Gemini node → Credential → Create New
   - Paste your API key and save

4. Supabase
   - Create a project at supabase.com
   - Go to Settings → API and copy the URL and anon key
   - In n8n, create a Supabase credential with your URL and API key
   - Run the SQL below in the Supabase SQL Editor to create the required tables

Supabase SQL

    -- Videos table: stores video metadata, transcript, and AI summary
    CREATE TABLE videos (
      video_id TEXT PRIMARY KEY,
      title TEXT,
      channel TEXT,
      upload_date TEXT,
      duration INT,
      view_count INT,
      description TEXT,
      transcript TEXT,
      ai_summary TEXT,
      thumbnail_url TEXT,
      channel_id TEXT,
      date_added TIMESTAMPTZ DEFAULT now()
    );

    -- Runs table: logs every workflow execution (success or error)
    CREATE TABLE runs (
      video_id TEXT PRIMARY KEY,
      process_status TEXT NOT NULL,
      error_type TEXT,
      notes TEXT,
      date_added TIMESTAMPTZ DEFAULT now()
    );
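The chunking step (node 13, "Prepare Messages for Discord") can be sketched as a small Code-node function; the exact splitting heuristic in the template may differ, but the idea is to stay under Discord's 2000-character message limit while breaking on newlines where possible:

```javascript
// Split a long summary into Discord-safe chunks of at most 2000 characters,
// preferring to break on newlines so lines stay intact.
function chunkForDiscord(text, limit = 2000) {
  const chunks = [];
  let rest = text;
  while (rest.length > limit) {
    // Break at the last newline before the limit, or hard-cut if none.
    let cut = rest.lastIndexOf("\n", limit);
    if (cut <= 0) cut = limit;
    chunks.push(rest.slice(0, cut));
    rest = rest.slice(cut).replace(/^\n/, "");
  }
  if (rest.length > 0) chunks.push(rest);
  return chunks;
}

const long = "line one\n".repeat(300); // ~2700 chars
const parts = chunkForDiscord(long);
console.log(parts.length, parts.every((p) => p.length <= 2000)); // 2 true
```

Each chunk is then posted as a separate Discord message in order.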
by Adam Gałęcki
How it works:

This workflow automates comprehensive SEO reporting by:

- Extracting keyword rankings and page performance from Google Search Console.
- Gathering organic reach metrics from Google Analytics.
- Analyzing internal and external article links.
- Tracking keyword position changes (gains and losses).
- Formatting and importing all data into Google Sheets reports.

Set up steps:

1. Connect Google Services: Authenticate Google Search Console, Google Analytics, and Google Sheets OAuth2 credentials.
2. Configure Source Sheet: Set up a data source Google Sheet with article URLs to analyze.
3. Set Report Sheet: Create or specify destination Google Sheets for reports.
4. Update Date Ranges: Modify date parameters in the GSC and GA nodes for your reporting period.
5. Customize Filters: Adjust keyword filters and row limits based on your needs.
6. Test Individual Sections: Each reporting section (keywords, pages, articles, position changes) can be tested independently.

The workflow includes separate flows for:

- Keyword ranking (top 1000).
- Page ranking analysis.
- Organic reach reporting.
- Internal article link analysis.
- External article link analysis.
- Position gain/loss tracking.
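The position gain/loss tracking can be sketched as a comparison of two ranking snapshots (for example, last period vs. this period from Google Search Console). The input shape and function name are illustrative; the template's Code nodes may structure this differently:

```javascript
// Compute keyword position gains and losses between two snapshots.
// Inputs: objects mapping keyword → average position.
function positionChanges(previous, current) {
  const changes = [];
  for (const [keyword, pos] of Object.entries(current)) {
    if (!(keyword in previous)) continue; // new keyword: no delta yet
    const delta = previous[keyword] - pos; // positive = moved up (gain)
    if (delta !== 0) changes.push({ keyword, from: previous[keyword], to: pos, delta });
  }
  // Biggest gains first
  return changes.sort((a, b) => b.delta - a.delta);
}

const prev = { "n8n tutorial": 12, "seo report": 5, "automation": 8 };
const curr = { "n8n tutorial": 7, "seo report": 9, "automation": 8 };
console.log(positionChanges(prev, curr));
// gains first: "n8n tutorial" (+5), then "seo report" (-4)
```

The resulting rows can be appended directly to the position-changes tab of the report sheet.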
by Kshitij Matta
Stop paying for expensive plugins to recover your valuable revenue from abandoned carts on your WooCommerce store.

How It Works

1. When a product is added to a user's cart on your store, the workflow fetches the cart contents via a webhook, using the code provided in the red sticky note to fetch the required info.
2. It waits for a specified time to allow the user to place an order.
3. It checks whether the order has been placed.
4. It creates the HTML email with dynamic information fetched from previous nodes.
5. It sends the email to the user via the configured SMTP credentials.

Setup Steps (20 minutes)

1. Set up your WooCommerce account credentials in n8n
2. Set up the webhook in n8n and WooCommerce
3. Add the provided code in functions.php or as a PHP snippet via a plugin on your website
4. Customize the coupon code's phrase according to your needs
5. Customize the email's HTML code according to your needs

Requirements

- **WooCommerce Store**: with REST API access enabled.
- **SMTP Credentials**: for sending recovery emails.

For any queries, you can ping me on X.
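The dynamic-HTML step can be sketched as a Code-node helper that turns cart data into the email body. The cart shape, field names, and coupon phrase are illustrative assumptions; adapt them to whatever your webhook actually sends:

```javascript
// Build the recovery email body from cart data (hypothetical fields).
function buildRecoveryEmail(cart, couponCode) {
  const rows = cart.items
    .map((i) => `<tr><td>${i.name}</td><td>${i.qty}</td><td>${i.total}</td></tr>`)
    .join("");
  return `
    <p>Hi ${cart.customerName}, you left these items in your cart:</p>
    <table>${rows}</table>
    <p>Use coupon <strong>${couponCode}</strong> at checkout to finish your order.</p>`;
}

const html = buildRecoveryEmail(
  { customerName: "Sam", items: [{ name: "Mug", qty: 2, total: "$18" }] },
  "COMEBACK10"
);
console.log(html.includes("COMEBACK10")); // true
```

The resulting string can be passed straight to the SMTP node as the HTML body.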
by vinci-king-01
This workflow processes raw meeting recordings or handwritten notes, automatically transcribes and summarizes them, and then distributes the concise summary to all meeting participants via Microsoft Teams while also creating an action-item task in ClickUp. The goal is to save time, keep everyone aligned, and ensure follow-up tasks are tracked in your project management workspace.

Pre-conditions/Requirements

Prerequisites

- n8n instance (self-hosted or n8n.cloud)
- ScrapeGraphAI community node installed
- Microsoft Teams tenant with permissions to create Incoming Webhooks or use Bot Framework
- ClickUp workspace and a target List to hold meeting action items
- Optional: OpenAI or any LLM API account for high-quality summarization

Required Credentials

- **Microsoft Teams Webhook URL** – to post summary messages
- **ClickUp Personal Access Token** – to create tasks
- **OpenAI API Key** (optional but recommended) – for AI-powered summarization
- **ScrapeGraphAI API Key** – placeholder key to satisfy the template requirement

Specific Setup Requirements

| Item | Description | Example |
|------|-------------|---------|
| Teams Channel Webhook | Create an Incoming Webhook in the desired Teams channel and copy the URL | https://outlook.office.com/webhook/... |
| ClickUp List ID | The numeric ID of the list where tasks will be created | 90123456 |
| Summarization Model | The LLM model or API you prefer to use | gpt-3.5-turbo |

How it works

This workflow transcribes or parses meeting content, leverages an LLM to generate a concise summary and action items, then distributes the results to participants in Microsoft Teams and creates a follow-up task in ClickUp. Everything runs in a single automated flow triggered manually or on a schedule.

Key Steps:

1. **Manual Trigger**: Start the workflow after a meeting ends.
2. **Sticky Note**: Provides on-canvas documentation for quick reference.
3. **Set Node – Upload Metadata**: Define meeting title, date, and participants.
4. **HTTP Request – Transcription**: Send the audio/video file to a transcription service (e.g., Azure Speech-to-Text).
5. **Wait**: Pause until the transcription is complete.
6. **Code – Summarize**: Use OpenAI to summarize the transcript and extract action items.
7. **IF Node – Validate Output**: Ensure the summary exists; handle errors otherwise.
8. **Merge**: Combine the summary with the participant list.
9. **Microsoft Teams Node**: Send the summary to each participant or channel via webhook.
10. **ClickUp Node**: Create a task containing the summary and action items.

Set up steps

Setup Time: 10-15 minutes

1. Create Teams Webhook: In Microsoft Teams, navigate to the target channel → Manage channel → Connectors → Incoming Webhook → give it a name (e.g., "MeetingBot") and copy the generated URL.
2. Generate ClickUp Personal Access Token: ClickUp → Settings → Apps → Generate Token → copy and store it securely.
3. Get ClickUp List ID: Open the list in ClickUp and copy the numeric ID from the URL bar.
4. Optional – Obtain OpenAI API Key: Sign in to OpenAI → API Keys → Create new secret key.
5. Add Credentials in n8n: In n8n, go to Credentials → New → add Microsoft Teams, ClickUp, and OpenAI (Generic HTTP).
6. Import Workflow: Paste the JSON workflow into n8n or use Templates → Import.
7. Configure Nodes:
   - In the Set node: update meeting_title, date, and the participants array.
   - In HTTP Request: set the transcription service endpoint and authentication.
   - In Code – Summarize: paste your OpenAI key or select a credential.
   - In the Microsoft Teams node: select the Teams credential and webhook URL.
   - In the ClickUp node: select the ClickUp credential and enter the List ID.
8. Test: Click "Execute Workflow" on the Manual Trigger node. Verify that a message appears in Teams and a task is created in ClickUp.

Node Descriptions

Core Workflow Nodes:

- **Manual Trigger** – Initiates the workflow manually or on a schedule.
- **Sticky Note** – Documentation block outlining purpose and credential usage.
- **Set** – Stores meeting metadata and the participants list.
- **HTTP Request** – Sends the meeting recording to a transcription service and fetches the results.
- **Wait** – Holds the workflow until the transcription is ready.
- **Code** – Summarizes the transcript and extracts action items via OpenAI.
- **IF** – Validates summarization success; branches on failure.
- **Merge** – Combines the summary text with participant emails/usernames.
- **Microsoft Teams** – Posts the summary to a Teams channel or direct messages.
- **ClickUp** – Creates a task containing the summary and action items.

Data Flow:

    Manual Trigger → Set → HTTP Request → Wait → Code → IF → Merge → Microsoft Teams
    Merge → ClickUp

Customization Examples

Change the summarization prompt:

    // Inside the Code node
    const prompt = `
    You are an expert meeting assistant.
    Summarize the following transcript in under 150 words.
    List action items in bullet points with owners.
    Transcript: ${items[0].json.transcript}
    `;

Send the summary as a PDF attachment:

    // Add a Convert & Save node before Teams
    // Convert the markdown summary to PDF and attach it in the Teams node

Data Output Format

The workflow outputs structured JSON data:

    {
      "meeting_title": "Q3 Strategy Sync",
      "date": "2024-05-10",
      "participants": ["john@corp.com", "jane@corp.com"],
      "summary": "We reviewed Q3 OKRs, decided to ...",
      "action_items": [
        { "owner": "John", "task": "Prepare budget draft", "due": "2024-05-20" },
        { "owner": "Jane", "task": "Compile market research", "due": "2024-05-25" }
      ],
      "clickup_task_id": "abcd1234",
      "teams_message_id": "msg7890"
    }

Troubleshooting

Common Issues

- Teams message not sent – Verify the Incoming Webhook URL and that the Teams node uses the correct credential.
- ClickUp task missing – Ensure the List ID is correct and the ClickUp token has the tasks:write scope.
- Empty summary – Check that the transcription text is populated and the OpenAI prompt is valid.

Performance Tips

- Compress large audio/video files before sending them to the transcription service.
- Use batching in the Teams node if the participant list is >20 to avoid rate limits.

Pro Tips:

- Schedule the workflow to auto-run 5 minutes after recurring meeting end times.
- Customize the ClickUp task description template to include embedded links.
- Add a "Send Email" node for participants not on Teams.
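The batching suggested in the performance tips can be sketched generically; the batch size of 20 is an assumption taken from the tip, so tune it to your tenant's rate limits:

```javascript
// Split a long participant list into groups before posting to Teams.
function batchParticipants(participants, size = 20) {
  const batches = [];
  for (let i = 0; i < participants.length; i += size) {
    batches.push(participants.slice(i, i + size));
  }
  return batches;
}

const people = Array.from({ length: 45 }, (_, i) => `user${i}@corp.com`);
console.log(batchParticipants(people).length); // 3 batches: 20 + 20 + 5
```

Each batch can then be sent in a separate Teams call, optionally with a short Wait node between calls.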
by Ranjan Dailata
This n8n workflow automates domain-level keyword ranking analysis and enriches raw SEO metrics with AI-generated summaries. It combines structured keyword data from SE Ranking with natural-language insights produced by OpenAI, turning complex SERP datasets into actionable SEO intelligence.

Who this is for

This workflow is designed for:

- SEO engineers and technical marketers
- Growth teams running programmatic SEO
- Agencies managing multi-domain keyword analysis
- Product teams building SEO analytics pipelines
- Developers using n8n for data enrichment and reporting

If you work with keyword data and need machine-readable output plus human-readable insights, this workflow is for you.

What this workflow does

- Accepts a target domain or URL, region, keyword type (organic/paid), and filters
- Fetches keyword ranking data from the SE Ranking Domain Keywords API
- Extracts metrics such as:
  - Keyword positions
  - Search volume & CPC
  - Competition & difficulty
  - SERP features & search intent
  - Traffic estimates
- Uses OpenAI GPT-4.1-mini to generate:
  - A comprehensive narrative summary
  - A concise abstract overview
- Merges raw data and AI insights into a single enriched dataset
- Exports the final output as structured JSON for downstream use

Setup

Prerequisites

- Active SE Ranking API access
- OpenAI API key with GPT-4.1-mini enabled
- Running n8n instance (self-hosted or cloud)
- Basic understanding of keyword ranking metrics

Configuration steps

1. If you are new to SE Ranking, sign up on seranking.com.
2. Import the workflow JSON into n8n.
3. Configure credentials:
   - SE Ranking, using HTTP Header Authentication. The header value should contain Token followed by a space and the SE Ranking API key.
   - OpenAI API (GPT-4.1-mini model).
4. Open the Set the Input Fields node and define:
   - target_site (domain or URL)
   - source (region, e.g. us)
   - type (organic or paid)
   - limit, filters, and requested columns
5. Verify the output as per the export and data handling steps, which:
   - Convert enriched SEO results into structured JSON output
   - Create binary data to support file-based exports
   - Convert processed data into CSV format for easy analysis
   - Insert or update records in Google Sheets for reporting
   - Ensure data consistency across all export destinations
   - Enable downstream automation, dashboards, and audits
6. Click Execute Workflow.

How to customize this workflow to your needs

You can easily adapt this workflow by:

- Switching between organic and paid keyword analysis
- Changing regions for international SEO tracking
- Modifying requested keyword columns and SERP filters
- Customizing the OpenAI prompt to generate:
  - SEO action items
  - Competitive insights
  - Executive summaries
- Replacing file export with:
  - Databases
  - Dashboards
  - Slack/Email alerts
  - Data warehouses

Summary

This n8n template delivers a production-ready SEO analytics pipeline that bridges structured SERP data with AI-powered interpretation. By combining SE Ranking's keyword intelligence with OpenAI-driven summarization, it enables faster insights, better reporting, and scalable SEO decision-making without manual analysis.
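The header authentication described in the configuration steps ("Token", a space, then the API key) can be sketched as a small request-builder. The endpoint path is left as a placeholder; only the header format is taken from the setup notes:

```javascript
// Build the SE Ranking request: query parameters plus the
// HTTP Header Auth value "Token <API key>".
function buildSeRankingRequest(apiKey, params) {
  const query = new URLSearchParams(params).toString();
  return {
    url: `https://api.seranking.com/...?${query}`, // placeholder endpoint
    headers: { Authorization: `Token ${apiKey}` },
  };
}

const req = buildSeRankingRequest("MY_KEY", { source: "us", type: "organic" });
console.log(req.headers.Authorization); // "Token MY_KEY"
```

In n8n, the equivalent is an HTTP Header Auth credential whose header value is the same "Token <key>" string.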
by David Olusola
🎥 Auto-Save Zoom Recordings to Google Drive + Log Meetings in Airtable

This workflow automatically saves Zoom meeting recordings to Google Drive and logs all important details into Airtable for easy tracking. Perfect for teams that want a searchable meeting archive.

⚙️ How It Works

1. Zoom Recording Webhook
   - Listens for recording.completed events from Zoom.
   - Captures metadata (Meeting ID, Topic, Host, File Type, File Size, etc.).
2. Normalize Recording Data
   - A Code node extracts and formats the Zoom payload into clean JSON.
3. Download Recording
   - Uses an HTTP Request to download the recording file.
4. Upload to Google Drive
   - Saves the recording into your chosen Google Drive folder.
   - Returns the file ID and share link.
5. Log Result
   - Combines Zoom metadata with the Google Drive file info.
6. Save to Airtable
   - Logs all details into your Meeting Logs table: Meeting ID, Topic, Host, File Type, File Size, Google Drive Saved (Yes/No), Drive Link, Timestamp.

🛠️ Setup Steps

1. Zoom
   - Create a Zoom App → enable the recording.completed event.
   - Add the workflow's Webhook URL as your Zoom Event Subscription endpoint.
2. Google Drive
   - Connect OAuth in n8n.
   - Replace YOUR_FOLDER_ID with your destination Drive folder.
3. Airtable
   - Create a base with a Meeting Logs table.
   - Add columns: Meeting ID, Topic, Host, File Type, File Size, Google Drive Saved, Drive Link, Timestamp.
   - Replace YOUR_AIRTABLE_BASE_ID in the node.

📊 Example Airtable Output

| Meeting ID | Topic | Host | File Type | File Size | Google Drive Saved | Drive Link | Timestamp |
|------------|-------|------|-----------|-----------|--------------------|------------|---------------------|
| 987654321 | Team Sync | host@email.com | MP4 | 104 MB | Yes | 🔗 Link | 2025-08-30 15:02:10 |

⚡ With this workflow, every Zoom recording is safely archived in Google Drive and logged in Airtable for quick search, reporting, and compliance tracking.
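The "Normalize Recording Data" step can be sketched as follows. The payload shape follows Zoom's recording.completed webhook documentation (payload.object with a recording_files array), but verify it against your own event samples before relying on it:

```javascript
// Flatten a Zoom recording.completed webhook body into the JSON the
// rest of the workflow uses. Takes the first recording file for brevity.
function normalizeRecording(body) {
  const meeting = body.payload.object;
  const file = meeting.recording_files[0];
  return {
    meetingId: String(meeting.id),
    topic: meeting.topic,
    host: meeting.host_email,
    fileType: file.file_type,
    fileSizeMb: +(file.file_size / (1024 * 1024)).toFixed(1),
    downloadUrl: file.download_url,
  };
}

const sample = {
  payload: {
    object: {
      id: 987654321,
      topic: "Team Sync",
      host_email: "host@email.com",
      recording_files: [
        { file_type: "MP4", file_size: 109051904, download_url: "https://zoom.us/rec/..." },
      ],
    },
  },
};
console.log(normalizeRecording(sample).fileSizeMb); // 104
```

A real meeting can produce several recording_files (MP4, M4A, transcript), so you may want to map over the array instead of taking the first entry.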
by Anna Bui
Automatically analyze n8n workflow errors with AI, create support tickets, and send detailed Slack notifications. Perfect for development teams and businesses that need intelligent error handling with automated support workflows. Never miss critical workflow failures again!

How it works
1. Error Trigger captures any workflow failure in your n8n instance
2. AI Debugger analyzes the error using structured reasoning to identify root causes
3. Clean Data transforms the AI analysis into organized, actionable information
4. Create Support Ticket automatically generates a detailed ticket in FreshDesk
5. Merge combines ticket data with the AI analysis for comprehensive reporting
6. Generate Slack Alert creates rich, formatted notifications with all context
7. Send to Team delivers instant alerts to your designated Slack channel

How to use
Replace the FreshDesk credentials with your helpdesk system's API
Configure the Slack channel for your team notifications
Customize the AI analysis prompts for your specific error types
Set this workflow up as the global error handler for all your critical workflows

Requirements
FreshDesk account (or compatible ticketing system)
Slack workspace with bot permissions
OpenAI API access for AI analysis
n8n Cloud or self-hosted with AI nodes enabled

Good to know
OpenAI API calls cost approximately $0.01-0.03 per error analysis
Works with any ticketing system that supports a REST API
Can be triggered by webhooks from external monitoring tools
Slack messages use rich formatting for mobile-friendly alerts

Need Help? Join the Discord or ask in the Forum! Happy Monitoring!
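The "Clean Data" step can be sketched as a defensive parse of the AI debugger's reply. This is an illustrative sketch only; the field names (rootCause, severity, suggestedFix) are assumptions, not the template's actual schema:

```javascript
// Sketch of the "Clean Data" step: flatten the AI debugger's reply into
// fields for the ticket and Slack message. Field names are illustrative.
function cleanAnalysis(raw) {
  let parsed;
  try {
    parsed = JSON.parse(raw);
  } catch {
    // LLMs sometimes wrap JSON in a markdown code fence; strip the
    // backticks and a leading "json" language tag, then retry.
    parsed = JSON.parse(raw.replace(/`/g, "").trim().replace(/^json\s*/i, ""));
  }
  return {
    rootCause: parsed.rootCause ?? "Unknown",
    severity: String(parsed.severity ?? "medium").toLowerCase(),
    suggestedFix: parsed.suggestedFix ?? "Manual review required",
  };
}

const fenced =
  "`".repeat(3) +
  'json\n{"rootCause":"HTTP 429 from upstream API","severity":"High"}\n' +
  "`".repeat(3);
const cleaned = cleanAnalysis(fenced);
```

Normalizing the output here keeps the downstream FreshDesk and Slack nodes simple, since they can rely on the same three fields always being present.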
by Shayan Ali Bakhsh
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Try It Out!
Automatically generate a LinkedIn carousel and upload it to LinkedIn.

Use case: LinkedIn content creation, specifically carousels, though it could be adapted for many other content types as well.

How it works
1. Runs automatically every day at 6:00 AM.
2. Gets the latest news from TechRadar.
3. Parses it into readable JSON.
4. AI decides which news item resonates with your profile.
5. The title and description of that item are used to generate the final LinkedIn carousel content. This step can also be triggered by a Form trigger.
6. After carousel generation, the content is handed to Post Nitro, which creates images for it and returns a PDF file.
7. The PDF file is uploaded to LinkedIn to obtain a file ID, which is used in the next step.
8. Finally, the post description is created and the carousel is posted to LinkedIn.

How to use
It runs every day at 6:00 AM automatically; just activate the workflow.
Alternatively, submit the form with a title and description. Note that the form inputs are not validated, so make sure they are correct. 😅

Requirements
Install the Post Nitro community node: @postnitro/n8n-nodes-postnitro-ai
The following API keys are needed:
Google Gemini (for Gemini 2.5 Flash usage): Google Gemini Key Docs
Post Nitro credentials (API key + Template ID + Brand ID): Post Nitro Docs
LinkedIn API key: LinkedIn API Docs

Need Help? Message the author on LinkedIn. Happy Automation!
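The "parse it into readable JSON" step can be sketched as a small mapping function. This is a hedged illustration; the input shape (title, link, contentSnippet) is an assumed feed-parser output, not necessarily what the TechRadar fetch in the template returns:

```javascript
// Sketch: trim fetched news items down to the fields the AI selection
// prompt needs. Input field names are assumptions about the feed format.
function parseNewsItems(items, limit = 5) {
  return items.slice(0, limit).map((item) => ({
    title: (item.title ?? "").trim(),
    description: (item.contentSnippet ?? "").trim(),
    url: item.link ?? "",
  }));
}

const news = parseNewsItems([
  { title: " New GPU announced ", link: "https://example.com/gpu", contentSnippet: "Details inside." },
  { title: "Other story", link: "https://example.com/other" },
]);
```

Passing only these trimmed fields to the model keeps the selection prompt short and the token cost low.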
by Sridevi Edupuganti
Try It Out!
Use n8n to extract medical test data from diagnostic reports uploaded to Google Drive, automatically detect abnormal values, and generate personalized health advice.

How it works
1. Upload a medical report (PDF or image) to a monitored Google Drive folder
2. Mistral AI extracts text using OCR while preserving document structure
3. GPT-4 parses the extracted text into structured JSON (patient info, test names, results, units, reference ranges)
4. All test results are saved to the "All Values" sheet in Google Sheets
5. JavaScript code compares each result against its reference range to detect abnormalities
6. For out-of-range values, GPT-4 generates personalized dietary, lifestyle, and exercise advice based on patient age and gender
7. Abnormal results with recommendations are saved to the "Out of Range Values" sheet

How to use
Set up Google Drive folder monitoring and a Google Sheet with two tabs: "All Values" and "Out of Range Values"
Configure API credentials for Google Drive, Mistral AI, and OpenAI (GPT-4)
Upload medical reports to your monitored folder
Review extracted data and personalized health advice in Google Sheets

Requirements
Google Drive and Sheets with OAuth2 authentication
Mistral AI API key for OCR
OpenAI API key (GPT-4 access required) for intelligent extraction and advice generation

Need Help?
See the detailed Read Me file at https://drive.google.com/file/d/1Wv7dfcBLsHZlPcy1QWPYk6XSyrS3H534/view?usp=sharing
Join the n8n community forum for support
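The reference-range comparison step can be sketched like this. It is a minimal sketch rather than the workflow's actual code: real lab ranges come in many formats ("< 5.0", "> 0.5", gender-specific ranges), so a production version needs to handle more than the plain "low-high" form shown here:

```javascript
// Sketch of the abnormality check: parse a "low-high" reference range
// and flag results outside it. Only the simple numeric range format is
// handled; other formats are marked for manual review.
function checkResult(test) {
  const match = /^\s*([\d.]+)\s*-\s*([\d.]+)\s*$/.exec(test.referenceRange);
  if (!match) return { ...test, status: "unparsed-range" };
  const value = parseFloat(test.result);
  const low = parseFloat(match[1]);
  const high = parseFloat(match[2]);
  const status = value < low ? "low" : value > high ? "high" : "normal";
  return { ...test, status };
}

const outOfRange = [
  { name: "Hemoglobin", result: "11.2", unit: "g/dL", referenceRange: "13.0-17.0" },
  { name: "Glucose", result: "95", unit: "mg/dL", referenceRange: "70-110" },
]
  .map(checkResult)
  .filter((t) => t.status !== "normal");
```

Only the filtered out-of-range items would then be sent on to GPT-4 for advice generation and to the "Out of Range Values" sheet.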