by Cheng Siong Chin
**Introduction**
Automate candidate evaluation from CV submission to interview booking. Perfect for HR teams and recruiters.

**How It Works**
Webhook receives CVs, extracts Airtable data, AI assesses qualifications, filters candidates, sends emails, schedules Google Calendar interviews, and updates records.

**Workflow Template**
Webhook → Airtable (Get Data) → AI Extract CV → AI Assessment → Filter Qualified → Generate Email → Send Email → Filter Interview Candidates → Schedule Calendar → Update Airtable → Slack Notification → Respond

**Workflow Steps**
- Receive & Store: Webhook receives CVs, saves to Airtable.
- Fetch & Download: Gets job criteria, downloads CVs.
- AI Assessment: Parses skills, scores candidates.
- Filter & Email: Routes qualified candidates, sends messages.
- Schedule & Update: Books interviews, updates Airtable.
- Notify: Alerts via Slack, confirms status.

**Setup Instructions**
- Webhook & Airtable: Set URL, create tables, add credentials.
- AI Configuration: Add OpenAI key, define schema, customize scoring.
- Communication: Connect Gmail, Calendar, and Slack.

**Prerequisites**
- Airtable account
- OpenAI API key
- Gmail and Google Calendar
- Slack workspace (optional)

**Customization**
- Multi-stage scheduling
- ATS integration (Greenhouse, Lever)

**Benefits**
- Reduces screening time by 90%
- Ensures uniform evaluation
- Cuts time-to-hire by 60%
by go-surfe
**🚀 Build Hyper-Targeted Prospecting Lists with Surfe & HubSpot**

This template automatically discovers companies that match your Ideal Customer Profile (ICP), finds the right people inside those companies, and enriches them — ready to drop straight into HubSpot. Launch the workflow, sit back, and get a clean list of validated prospects in minutes.

**1. ❓ What Problem Does This Solve?**

Sourcing prospects that truly fit your ICP is slow and repetitive. You jump between databases, copy domains, hunt down decision-makers, and then still have to enrich emails and phone numbers one by one. This workflow replaces all that manual effort:

- It queries Surfe's database for companies that match your exact industry, size, revenue and geography filters.
- It pulls the best-fit people inside each company and enriches them in bulk.
- It keeps only records with both a direct email and mobile phone, then syncs them to HubSpot automatically.

No spreadsheets, no copy-paste — just a fresh, qualified prospect list ready for outreach.

**2. 🧰 Prerequisites**

You'll need:
- A self-hosted or cloud instance of n8n
- A Surfe API Key
- A HubSpot Private App Token with contact read/write scopes
- A Gmail account (OAuth2) for the completion notification
- The workflow JSON file linked above (N8N_FLOW_2__Building_Prospecting_Lists.json)

**3. 📌 Search ICP Companies Configuration — Fine-Tune Your Targeting**

3.1 Editing the JSON

Every targeting rule lives inside the "🔍 Search ICP Companies" HTTP node. Open the node Search ICP Companies → Parameters tab → JSON Body to edit the filters.

| Filter | JSON path | What it does | Example |
| --- | --- | --- | --- |
| industries | filters.industries | Narrow to specific verticals (case-sensitive strings) | ["Software","Apps","SaaS"] |
| employeeCount.from / to | filters.employeeCount | Minimum / maximum headcount | 1 / 35 |
| countries | filters.countries | 2-letter ISO codes | ["FR","DE"] |
| revenues | filters.revenues | Annual revenue brackets | ["1-10M"] |
| limit | limit | Companies per run | 20 |

3.2 Where to find allowed values

Surfe exposes a "🗂 Get Filters" endpoint that returns every accepted value for:
- industries
- employeeCounts
- revenues
- countries (always ISO-2 codes)

You can hit it with a simple GET /v1/people/search/filters request or browse the interactive docs here: https://developers.surfe.com/public-008-people-filters

For company-level searches, the same enumerations apply.
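As a reference, a request body combining the example values from the table in 3.1 might look like the sketch below. Treat it as illustrative only; keep the key names that already exist in the node's JSON Body and confirm accepted values against the Surfe docs.

```json
{
  "filters": {
    "industries": ["Software", "Apps", "SaaS"],
    "employeeCount": { "from": 1, "to": 35 },
    "countries": ["FR", "DE"],
    "revenues": ["1-10M"]
  },
  "limit": 20
}
```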
**4. ⚙️ Setup Instructions**

4.1 🔐 Create Your Credentials in n8n

4.1.1 🚀 Surfe API
- In your Surfe dashboard → Use Surfe API → copy your API key
- Go to n8n → Credentials → Create Credential
- Choose Credential Type: Bearer Auth
- Name it something like SURFE API Key
- Paste your API key into the Bearer Token field
- Save

4.1.2 📧 Gmail OAuth2 API
- Go to n8n → Credentials
- Create new credentials of type Gmail OAuth2 API
- A pop-up window will appear where you can log in with the Google account linked to Gmail
- Make sure you grant email send permissions when prompted

4.1.3 🎯 HubSpot
🔓 Private App Token:
- Go to HubSpot → Settings → Integrations → Private Apps
- Create an app with scopes: crm.objects.contacts.read, crm.objects.contacts.write, crm.schemas.contacts.read
- Save the App Token
- Go to n8n → Credentials → Create Credential → HubSpot App Token
- Paste your App Token

✅ You are now all set for the credentials.

4.2 📥 Import and Configure the n8n Workflow
- Import the provided JSON workflow into n8n: create a new blank workflow, click the … on the top left, and choose Import from File.

4.2.1 🔗 Link Nodes to Your Credentials
In the workflow, link your newly created credentials to each node in this list:
- Surfe HTTP nodes: Authentication → Generic Credential Type, Generic Auth Type → Bearer Auth, Bearer Auth → select the credentials you created before
- Gmail node: Credentials to connect with → your Gmail account
- HubSpot node: Credentials to connect with → select your HubSpot credentials in the list

**5. 🔄 How This n8n Workflow Works**
1. Manual Trigger – Click Execute Workflow (or schedule it) to start.
2. Search ICP Companies – Surfe returns company domains that match your filter set.
3. Prepare JSON Payload with Company Domains – Formats the domain list for the next call.
4. Search People in Companies – Finds people inside each company.
5. Prepare JSON Payload Enrichment Request – Builds the bulk-enrichment request.
6. Surfe Bulk Enrichments API – Launches one enrichment job for the whole batch.
7. Wait + Polling loop – Checks job status every 3 seconds until it's COMPLETED.
8. Extract List of People – Pulls the enriched contacts from Surfe's response.
9. Filter: phone AND email – Keeps only fully reachable prospects (email and mobile).
10. HubSpot: Create or Update – Inserts/updates each contact in HubSpot.
11. Gmail – Sends you a "Your ICP prospecting enrichment is done" email.

**6. 🧩 Use Cases**
- **Weekly prospect list refresh** – Generate 50 perfectly-matched prospects every Monday morning.
- **Territory expansion** – Spin up a list of SMB software CEOs in a new country in minutes.
- **ABM prep** – Build multi-stakeholder buying-group lists for target accounts.
- **Campaign-specific lists** – Quickly assemble contacts for a limited-time product launch.

**7. 🛠 Customization Ideas**
- 🎯 Refine filters for people – Add seniorities or other filters in the Prepare JSON Payload with Company Domains node; see the Surfe Search People API docs: https://developers.surfe.com/public-009-search-people-v2
- ♻️ Deduplicate – Check HubSpot first to skip existing contacts.
- 🟢 Slack alert – Replace Gmail with a Slack notification.
- 📊 Reporting – Append enriched contacts to a Google Sheet for analytics.

**8. ✅ Summary**
Fire off the workflow, and n8n will find ICP-fit companies, pull key people, enrich direct contact data and drop everything into HubSpot — all on autopilot. Prospecting lists, done for you.
by iTzJok3r
**Overview**
Intelligent email-to-WhatsApp automation that monitors Gmail and Outlook accounts, uses Google Gemini AI to filter important emails, and forwards them to WhatsApp via Evolution API.

**Key Features**
- Multi-account support (Gmail + 2 Outlook accounts)
- AI-powered email classification with Google Gemini
- Automatic Arabic translation for foreign emails
- Approved sender whitelist
- Security email prioritization (2FA, activations, passwords)
- Spam and promotion filtering
- Automatic mark-as-read
- Link extraction from emails

**Setup Requirements**
Services needed:
- n8n instance
- Gmail account with API access
- Microsoft Outlook account(s)
- Google Gemini API key (free tier available)
- Evolution API (self-hosted WhatsApp API)

Credentials to add:
- Gmail OAuth2
- Microsoft Outlook OAuth2
- Google Gemini API
- Evolution API

**Configuration Steps**
1. Import the workflow
2. Add all credentials in n8n
3. Update WhatsApp numbers in all "Send" nodes (format: number@s.whatsapp.net)
4. Replace "YourInstanceName" with your Evolution API instance
5. Customize approved sender emails in the AI Agent system prompts
6. Test and activate

**How It Works**
The workflow monitors emails every minute, parses content with JavaScript, classifies importance with Google Gemini AI, extracts links, translates non-Arabic content, and sends formatted messages to WhatsApp.

**Use Cases**
Perfect for professionals needing instant mobile notifications for critical emails while filtering spam and promotions.
by Onur
**Template Description**

> Stop manually reading every CV and copy-pasting data into a spreadsheet. This workflow acts as an AI recruiting assistant, automating your entire initial screening process. It captures applications from a public form, uses AI to read and understand PDF CVs, structures the candidate data, saves it to Google Sheets, and notifies all parties.

This template is designed to save HR professionals and small business owners countless hours, ensuring no applicant is missed and all data is consistently structured and stored.

**🚀 What does this workflow do?**
- Provides a public web form for candidates to submit their name, email, experience, and PDF CV.
- Automatically reads the text content from the uploaded PDF CV.
- Uses an AI Agent (OpenAI) to intelligently parse the CV text, extracting key data like contact info, work experience, education, skills, and more.
- **Writes a concise summary** of the CV, perfect for quick screening by HR.
- **Checks for duplicate applications** based on the candidate's email address.
- **Saves all structured applicant data** into a new row in a Google Sheet, creating a powerful candidate database.
- Sends an automated confirmation email to the applicant.
- Sends a new application alert with the CV summary to the recruiter.

**🎯 Who is this for?**
- **HR Departments & Recruiters:** Streamline your hiring pipeline and build a structured candidate database.
- **Small Business Owners:** Manage job applications professionally without dedicated HR software.
- **Hiring Managers:** Quickly get a summarized overview of each candidate without reading the full CV initially.

**✨ Benefits**
- **Massive Time Savings:** Drastically reduces the time spent on manual CV screening and data entry.
- **Structured Candidate Data:** Turns every CV into a consistently formatted row in a spreadsheet, making it easy to compare candidates.
- **Never Miss an Applicant:** Every submission is logged, and you're instantly notified.
- **Improved Candidate Experience:** Applicants receive an immediate confirmation that their submission was successful.
- **AI-Powered Summaries:** Get a quick, AI-generated summary of each CV delivered to your inbox.

**⚙️ How it Works**
1. Form Submission: A candidate fills out the n8n form and uploads their CV.
2. PDF Extraction: The workflow extracts the raw text from the PDF file.
3. AI Analysis: The text is sent to OpenAI with a prompt to structure all key information (experience, skills, etc.) into a JSON format (a sketch of this structure is shown after the setup steps below).
4. Duplicate Check: The workflow checks your Google Sheet to see if the applicant's email already exists. If so, it stops.
5. Save to Database: If the applicant is new, their structured data is saved as a new row in Google Sheets.
6. Send Notifications: Two emails are sent simultaneously: a confirmation to the applicant and a notification with the CV summary to the recruiter.

**📋 n8n Nodes Used**
- Form Trigger
- Extract From File
- OpenAI
- Code (or JSON Parser)
- Google Sheets
- If
- Gmail

**🔑 Prerequisites**
- An active n8n instance.
- **OpenAI Account & API Key**.
- **Google Account** with access to Google Sheets and Gmail (OAuth2 Credentials).
- A **Google Sheet** prepared with columns to store the applicant data (e.g., name, email, experience, skills, cv_summary, etc.).

**🛠️ Setup**
1. Import the workflow into your n8n instance.
2. Configure Credentials: Connect your credentials for OpenAI and Google (for Sheets & Gmail) in their respective nodes.
3. Customize the Form: In the 1. Applicant Submits Form node, you can add or remove fields as needed.
4. Activate the workflow.
5. Once active, copy the Production URL from the Form Trigger node and share it to receive applications.
6. Set Your Email: In the 8b. Send Notification... (Gmail) node, change the "To" address to your own email address to receive alerts.
7. Link Your Google Sheet: In the 5. Check for Duplicate... and 7. Save Applicant Data... nodes, select your spreadsheet and sheet.
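For orientation, the structured record produced by the AI Analysis step (and written to the Google Sheet) could look roughly like the sketch below. The field names mirror the example columns mentioned above; everything else is hypothetical sample data, so align the keys with your own prompt and sheet columns.

```json
{
  "name": "Jane Doe",
  "email": "jane.doe@example.com",
  "experience": "5 years as a backend developer at an e-commerce company",
  "education": "BSc in Computer Science",
  "skills": ["Python", "SQL", "Docker"],
  "cv_summary": "Backend developer with five years of experience building data pipelines and internal tools."
}
```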
by Fayzul Noor
**Description**

This workflow is built for e-commerce store owners, customer support teams, and retail businesses who want to provide instant, intelligent email support without hiring additional staff. If you're tired of manually responding to customer inquiries, searching through product catalogs, and copying information into emails, this automation will transform your support process. It turns your inbox into a smart AI-powered support system that reads, understands, and responds to customer questions about your store products while you focus on growing your business.

**How it works / What it does**

This n8n automation completely transforms how you handle customer email inquiries using AI and Retrieval-Augmented Generation (RAG) technology. Here's a simple breakdown of how it works:

1. Monitor your Gmail inbox using the Gmail Trigger node, which checks every minute for new customer emails (excluding emails sent by you).
2. Assess if a reply is needed with an AI-powered classification system. The workflow uses GPT-4.1 with a structured JSON parser to determine whether incoming emails are genuine customer inquiries about your men's clothing store that require a response (an illustrative parser output is sketched at the end of this template).
3. Filter relevant emails through the If Needs Reply node, which only passes emails that need attention to the AI Agent, preventing unnecessary processing.
4. Generate intelligent responses using an AI Agent powered by GPT-4.1-nano. The agent uses a friendly, professional tone and starts each email with "Dear" and ends with "Best regards" to maintain proper email etiquette.
5. Search your knowledge base with a Vector Store RAG tool connected to Pinecone. The AI Agent queries your men's clothing product database using OpenAI embeddings to find accurate, up-to-date information about prices, features, and product details.
6. Send personalized replies automatically through the Gmail node, which responds directly to the original email thread with clear, concise, and empathetic answers to customer questions.

Once everything is set up, the system runs on autopilot and provides 24/7 customer support without any manual intervention.

**How to set up**

Follow these steps to get your AI-powered email support system running:

1. Import the JSON file into your n8n instance.
2. Add your API credentials:
   - Gmail OAuth2 credentials for reading and sending emails
   - OpenAI API key for the AI Agent and embeddings
   - Pinecone API credentials for vector storage
3. Set up your Pinecone vector database: create a Pinecone index, create a namespace, and upload your store data to the vector store.
4. Configure the Gmail Trigger node to monitor the correct email account.
5. Customize the AI Agent's system message to match your brand voice and support policies.
6. Activate the workflow to enable automatic monitoring and responses.

**Requirements**

Before running the workflow, make sure you have the following:

- An n8n account or instance (self-hosted or n8n Cloud)
- A Gmail account for receiving and sending customer emails
- OpenAI API access for the AI Agent and embeddings (GPT-4.1 and GPT-4.1-nano models)
- A Pinecone account with a configured vector database containing your product information
- Your store data and product catalog prepared and uploaded to Pinecone

**How to customize the workflow**

This workflow is flexible and can be customized to fit your business needs. Here's how you can tailor it:

- Adjust the response style by modifying the system message in the AI Agent node. You can make it more casual, formal, or brand-specific.
- Add response length controls by updating the system message instructions.
  Responses are currently kept short and concise; you can adjust this for more detailed explanations.
- Change the polling frequency in the Gmail Trigger node. The default is every minute, but you can adjust it to check more or less frequently based on your email volume.
- Filter specific types of emails by modifying the filters in the Gmail Trigger and "Assess if message needs a reply" nodes to handle specific subjects, senders, or keywords.
- Connect to different email platforms by replacing the Gmail nodes with other email services like Outlook, IMAP, or customer support platforms.
- Add human-in-the-loop approval by inserting a webhook or notification node before the Gmail reply node, allowing manual review before sending responses.
- Implement response tracking by adding database nodes to log all AI-generated responses for quality control and training purposes.
- Add multi-language support by incorporating translation nodes or configuring the AI Agent to detect and respond in the customer's language.
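As an illustration of the classification step described above, the structured JSON parser might constrain the model to a small object like the one below. The field names here are hypothetical; match them to the parser schema actually configured in the "Assess if message needs a reply" node.

```json
{
  "needs_reply": true,
  "reason": "Customer is asking about sizing and availability for a wool coat."
}
```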
by Ajay Yadav
**Lead Qualification & Follow‑up (Gemini)**

Automate lead intake, AI qualification, and next‑step outreach. Qualified leads get a scheduled meeting, Zoom details, an email confirmation, CRM update, and Mailchimp enrollment. Not‑qualified leads receive a follow‑up sequence, CRM update, and a 30‑day reminder.

**What this workflow does**
- AI qualifies leads as QUALIFIED or NOT QUALIFIED using Google Gemini.
- Supports two triggers: Webhook (wordpress-form) or n8n Form Trigger.
- QUALIFIED branch:
  - AI phone call via VAPI
  - Schedules Google Calendar event
  - Creates Zoom meeting
  - Sends confirmation email via Gmail
  - Adds to Mailchimp audience
  - Updates contact in HubSpot
- NOT QUALIFIED branch:
  - AI phone call via VAPI
  - Adds to Mailchimp audience
  - Sends follow‑up email via Gmail
  - Updates contact in HubSpot
  - Creates 30‑day follow‑up calendar event

**Apps and credentials required**
- Google Gemini (PaLM/Gemini API)
- Gmail
- HubSpot
- Zoom
- Google Calendar
- VAPI (for AI phone calls)
- Mailchimp

**Environment variables**
- MAILCHIMP_LIST_ID_QUALIFIED=your_mailchimp_list_id_for_qualified
- MAILCHIMP_LIST_ID_FOLLOWUP=your_mailchimp_list_id_for_followup

**Triggers supported**
- Webhook: path wordpress-form (POST)
- Form Trigger: built‑in n8n form

Use only one in production. Keep the other disabled.

**Expected input (fields)**
- name: string
- email: string
- message: string

If using the Webhook, send a JSON body with the fields above (see the sample payload at the end of this template).

**Setup**
1. Connect credentials:
   - Google Gemini (model: models/gemini-2.5-flash)
   - Gmail
   - HubSpot (OAuth)
   - Zoom
   - Google Calendar (select the target calendar)
   - VAPI (HTTP header auth: Bearer token)
2. Set env vars: MAILCHIMP_LIST_ID_QUALIFIED, MAILCHIMP_LIST_ID_FOLLOWUP
3. Choose your trigger:
   - Webhook: enable and use the provided URL for wordpress-form
   - Form Trigger: enable and publish the form
4. Review timing: adjust Wait nodes for your timezone and SLA.
5. Personalize messaging: edit Gmail subjects/bodies and the Zoom topic.
6. CRM and lists: confirm HubSpot properties and Mailchimp list IDs.

**How it works (at a glance)**
- Intake → AI classifies (QUALIFIED / NOT QUALIFIED)
- QUALIFIED: VAPI call → Schedule Calendar → Create Zoom → Add to Mailchimp (qualified) → Gmail confirmation → HubSpot update
- NOT QUALIFIED: VAPI call → Add to Mailchimp (follow‑up) → Gmail follow‑up → HubSpot update → 30‑day calendar event

**Test the workflow (before going live)**
1. Submit a test via your chosen trigger with name, email, message.
2. Confirm the AI decision at the "Lead Decision" node.
3. If QUALIFIED:
   - VAPI call executed
   - Calendar event created
   - Zoom meeting created (join URL available)
   - Mailchimp enrollment (qualified list)
   - Gmail confirmation sent
   - HubSpot contact created/updated
4. If NOT QUALIFIED:
   - VAPI call executed
   - Mailchimp enrollment (follow‑up list)
   - Gmail follow‑up sent
   - HubSpot updated
   - 30‑day calendar reminder created
5. Open any failing HTTP nodes and review response codes/messages.

**Go‑live checklist**
- All credentials connected (no warnings)
- MAILCHIMP_LIST_ID_QUALIFIED and MAILCHIMP_LIST_ID_FOLLOWUP set
- Timezone and delays validated
- Email copy approved
- Only one trigger enabled
- Final end‑to‑end test passed
- Toggle workflow Active

**Customization ideas**
- Add a Slack or Microsoft Teams notification on QUALIFIED
- Enrich leads (Clearbit, ZoomInfo, etc.) before the AI decision
- Swap Mailchimp for your ESP (Klaviyo, SendGrid Marketing)
- Add a second‑chance branch for ambiguous AI classifications
- Localize email copy by country or language

**Troubleshooting**
- Webhook receives no data: ensure the external form POSTs JSON to the n8n URL and network rules allow it.
- AI decision empty/garbled: verify Gemini credentials/model ID and input fields.
- Mailchimp errors: verify list IDs and that the email is valid.
- Gmail send fails: check OAuth scopes and daily limits.
- Zoom/Calendar issues: re‑connect OAuth; verify calendar access.
- HubSpot errors: confirm OAuth scopes and property mappings.

**Security and scopes**
- Gmail: send email
- Google Calendar: create events
- Zoom: create meetings
- HubSpot: read/write contacts
- Mailchimp: list membership
- VAPI: authenticated HTTP requests
- Gemini: model inference

Use least‑privilege for each integration.

**Limits and notes**
- Gmail and Mailchimp rate limits may apply during spikes.
- Zoom and Google Calendar API quotas apply for frequent scheduling.
- VAPI call timeouts are 30s by default; adjust as needed.

**Changelog**
- 2025‑09‑15: Initial public template with dual triggers, Gemini qualification, VAPI calls, scheduling, Mailchimp, Gmail, and HubSpot updates.
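For the end-to-end test mentioned above, a minimal JSON body POSTed to the wordpress-form webhook could look like the sample below (placeholder values; it assumes the three expected fields are sent at the top level of the body).

```json
{
  "name": "Test Lead",
  "email": "test.lead@example.com",
  "message": "Interested in a demo of your service next week."
}
```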
by Vigh Sandor
**Network Vulnerability Scanner (using Nmap as the engine) with Automated CVE Report**

**Workflow Overview**

This n8n workflow provides comprehensive network vulnerability scanning with automated CVE enrichment and professional report generation. It performs Nmap scans, queries the National Vulnerability Database (NVD) for CVE information, generates detailed HTML/PDF reports, and distributes them via Telegram and email.

**Key Features**
- **Automated Network Scanning**: Full Nmap service and version detection scan
- **CVE Enrichment**: Automatic vulnerability lookup using the NVD API
- **CVSS Scoring**: Vulnerability severity assessment with CVSS v3.1/v3.0 scores
- **Professional Reporting**: HTML reports with detailed findings and recommendations
- **PDF Generation**: Password-protected PDF reports using Prince XML
- **Multi-Channel Distribution**: Telegram and email delivery
- **Multiple Triggers**: Webhook API, web form, manual execution, scheduled scans
- **Rate Limiting**: Respects NVD API rate limits
- **Comprehensive Data**: Service detection, CPE matching, CVE details with references

**Use Cases**
- Regular security audits of network infrastructure
- Compliance scanning for vulnerability management
- Penetration testing reconnaissance phase
- Asset inventory with vulnerability context
- Continuous security monitoring
- Vulnerability assessment reporting for management
- DevSecOps integration for infrastructure testing

**Setup Instructions**

Prerequisites

Before setting up this workflow, ensure you have:

System requirements:
- n8n instance (self-hosted) with command execution capability
- Alpine Linux base image (or compatible Linux distribution)
- Minimum 2 GB RAM (4 GB recommended for large scans)
- 2 GB free disk space for dependencies
- Network access to scan targets
- Internet connectivity for the NVD API

Required knowledge:
- Basic networking concepts (IP addresses, ports, protocols)
- Understanding of CVE/CVSS vulnerability scoring
- Nmap scanning basics

External services:
- Telegram Bot (optional, for Telegram notifications)
- Email server / SMTP credentials (optional, for email reports)
- NVD API access (public, no API key required but rate-limited)

**Step 1: Understanding the Workflow Components**

Core dependencies:
- Nmap: Network scanner
  - Purpose: Port scanning, service detection, version identification
  - Usage: Performs TCP SYN scan with service/version detection
- nmap-helper: JSON conversion tool
  - Repository: https://github.com/net-shaper/nmap-helper
  - Purpose: Converts Nmap XML output to JSON format
- Prince XML: HTML to PDF converter
  - Website: https://www.princexml.com
  - Version: 16.1 (Alpine 3.20)
  - Purpose: Generates professional PDF reports from HTML
  - Features: Password protection, print-optimized formatting
- NVD API: Vulnerability database
  - Endpoint: https://services.nvd.nist.gov/rest/json/cves/2.0
  - Purpose: CVE information, CVSS scores, vulnerability descriptions
  - Rate limit: The public API allows a limited number of requests per minute
  - Documentation: https://nvd.nist.gov/developers

**Step 2: Telegram Bot Configuration (Optional)**

If you want to receive reports via Telegram:

Create a Telegram bot:
1. Open Telegram and search for @BotFather
2. Start a chat and send /newbot
3. Follow the prompts:
   - Bot name: Network Scanner Bot (or your choice)
   - Username: network_scanner_bot (must end with 'bot')
4. BotFather will provide:
   - Bot token: 123456789:ABCdefGHIjklMNOpqrsTUVwxyz (save this)
   - Bot URL: https://t.me/your_bot_username

Get your chat ID:
1. Start a chat with your new bot
2. Send any message to the bot
3. Visit: https://api.telegram.org/bot<YOUR_BOT_TOKEN>/getUpdates
4. Find your chat ID in the response
5. Save this chat ID (e.g., 123456789)
Alternative: group chat ID
- For sending to a group, add the bot to your group
- Send a message in the group
- Check the getUpdates URL
- Group chat IDs are negative: -1001234567890

Add credentials to n8n:
1. Navigate to Credentials in n8n
2. Click Add Credential
3. Select Telegram API
4. Fill in Access Token: your bot token from BotFather
5. Click Save
6. Test the connection if available

**Step 3: Email Configuration (Optional)**

If you want to receive reports via email:

Add SMTP credentials to n8n:
1. Navigate to Credentials in n8n
2. Click Add Credential
3. Select SMTP
4. Fill in:
   - Host: SMTP server address (e.g., smtp.gmail.com)
   - Port: SMTP port (587 for TLS, 465 for SSL, 25 for unencrypted)
   - User: your email username
   - Password: your email password or app password
   - Secure: enable for TLS/SSL
5. Click Save

Gmail users:
- Enable 2-factor authentication
- Generate an app-specific password: https://myaccount.google.com/apppasswords
- Use the app password in the n8n credential

**Step 4: Import and Configure Workflow**

Configure basic parameters

Locate the "1. Set Parameters" node:
- Click the node to open its settings
- Default configuration:
  - network: input from webhook/form/manual trigger
  - timestamp: auto-generated (format: yyyyMMdd_HHmmss)
  - report_password: Almafa123456 (change this!)

Change the report password:
- Edit the report_password assignment
- Set a strong password: 12+ characters, mixed case, numbers, symbols
- This password will protect the PDF report
- Save changes

**Step 5: Configure Notification Endpoints**

Telegram configuration

Locate the "14/a. Send Report in Telegram" node:
- Open the node settings
- Update fields:
  - Chat ID: replace -123456789012 with your actual chat ID
  - Credentials: select your Telegram credential
- Save changes
- Message customization:
  - Current: sends the PDF as a document attachment
  - Automatic filename: vulnerability_report_<timestamp>.pdf
  - No caption by default (add one if needed)

Email configuration

Locate the "14/b. Send Report in Email with SMTP" node:
- Open the node settings
- Update fields:
  - From Email: report.creator@example.com → your sender email
  - To Email: report.receiver@example.com → your recipient email
  - Subject: customize if needed (the default includes the network target)
  - Text: email body message
  - Credentials: select your SMTP credential
- Save changes
- Multiple recipients: change the toEmail field to a comma-separated list: admin@example.com, security@example.com, manager@example.com
- Add CC/BCC in the node options:
  - cc: carbon copy recipients
  - bcc: blind carbon copy recipients

**Step 6: Configure Triggers**

The workflow supports 4 trigger methods:

Trigger 1: Webhook API (Production)

Locate the "Webhook" node:
- Path: /vuln-scan
- Method: POST
- Response: immediate acknowledgment "Process started!"
- Async: the scan runs in the background

Trigger 2: Web Form (User-Friendly)

Locate the "On form submission" node:
- Path: /webhook-test/form/target
- Method: GET (form display), POST (form submit)
- Form title: "Add scan parameters"
- Field: network (required)
- Form URL: https://your-n8n-domain.com/webhook-test/form/target
- Users can open the form URL in a browser, enter the target network/IP, click submit, and receive a confirmation

Trigger 3: Manual Execution (Testing)

Locate the "Manual Trigger" node:
- Click to activate; it opens the workflow with the "Pre-Set-Target" node
- Default target: scanme.nmap.org (Nmap's official test server)
- To change the default target: open the "Pre-Set-Target" node, edit the network value, enter your test target, and save changes

Trigger 4: Scheduled Scans (Automated)

Locate the "Schedule Trigger" node:
- Default: daily at 1:00 AM
- Uses "Pre-Set-Target" for the network
- To change the schedule: open the node settings, modify the trigger time (Hour: 1, Minute: 0, Day of week: all days or select specific days), and save changes
- Schedule examples:
  - Every day at 3 AM: Hour: 3, Minute: 0
  - Weekly on Monday at 2 AM: Hour: 2, Day: Monday
  - Twice daily (8 AM, 8 PM): create two Schedule Trigger nodes

**Step 7: Test the Workflow**

Recommended test target

Use Nmap's official test server for initial testing:
- **Target**: scanme.nmap.org
- **Purpose**: Official Nmap testing server
- **Safe**: Designed for scanning practice
- **Permissions**: Public permission to scan

Important: Never scan targets without permission. Unauthorized scanning is illegal.

Manual test execution:
1. Open the workflow in the n8n editor
2. Click the Manual Trigger node to select it
3. Click the Execute Workflow button
4. The workflow will start with scanme.nmap.org as the target

Monitor execution

Watch nodes turn green as they complete:
- Need to Add Helper?: checks if nmap-helper is installed
- Add NMAP-HELPER: installs the helper (if needed, ~2-3 minutes)
- Optional Params Setter: sets scan parameters
- 2. Execute Nmap Scan: runs the scan (5-30 minutes depending on the target)
- 3. Parse NMAP JSON to Services: extracts services (~1 second)
- 5. CVE Enrichment Loop: queries the NVD API (1 second per service)
- 8-10. Report Generation: creates HTML/PDF reports (~5-10 seconds)
- 12. Convert to PDF: generates a password-protected PDF (~10 seconds)
- 14a/14b. Distribution: sends reports

Check outputs

Click nodes to view their outputs:
- **Parse NMAP JSON**: view discovered services
- **CVE Enrichment**: see vulnerabilities found
- **Prepare Report Structure**: check statistics
- **Read Report PDF**: download the report to verify

Verify distribution:
- Telegram: open the Telegram chat with your bot, check for the PDF document, then download and open it with the password
- Email: check your inbox for the report email, verify the subject line includes the target network, download the PDF attachment, and open it with the password

**How to Use**

Understanding the Scan Process

Initiating scans

Method 1: Webhook API
Use curl or any HTTP client and include a "network" parameter in the body of a POST request (an example body is shown below). The response is "Process started!" and the scan runs asynchronously; you'll receive results via the configured channels (Telegram/Email).
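For example, the POST body sent to the /vuln-scan webhook can be as simple as the following, using Nmap's official test target (this assumes the network parameter is sent at the top level of the JSON body):

```json
{
  "network": "scanme.nmap.org"
}
```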
Method 2: Web Form
1. Open the form URL in a browser: https://your-n8n.com/webhook-test/form/target
2. Fill in the form: network — enter the target (IP, range, domain)
3. Click Submit
4. Receive confirmation
5. Wait for report delivery

Advantages:
- No command line needed
- User-friendly interface
- Input validation
- Good for non-technical users

Method 3: Manual Execution
For testing or one-off scans:
1. Open the workflow in n8n
2. Edit the "Pre-Set-Target" node: change the network value to your target
3. Click the Manual Trigger node
4. Click Execute Workflow
5. Monitor progress in real-time

Advantages:
- See execution in real-time
- Debug issues immediately
- Test configuration changes
- View intermediate outputs

Method 4: Scheduled Scans
For regular, automated security audits:
1. Configure the "Schedule Trigger" node with the desired time
2. Configure the "Pre-Set-Target" node with the default target
3. Activate the workflow
4. Scans run automatically on schedule

Advantages:
- Automated security monitoring
- Regular compliance scans
- No manual intervention needed
- Consistent scheduling

**Scan Targets Explained**

Supported target formats:

Single IP Address:
- 192.168.1.100
- 10.0.0.50

CIDR Notation (Subnet):
- 192.168.1.0/24   # Scans 192.168.1.0-255 (254 hosts)
- 10.0.0.0/16      # Scans 10.0.0.0-255.255 (65534 hosts)
- 172.16.0.0/12    # Scans entire 172.16-31.x.x range

IP Range:
- 192.168.1.1-50            # Scans 192.168.1.1 to 192.168.1.50
- 10.0.0.1-10.0.0.100       # Scans across range

Multiple Targets:
- 192.168.1.1,192.168.1.2,192.168.1.3

Hostname/Domain:
- scanme.nmap.org
- example.com
- server.local

Choosing appropriate targets:
- Development/testing: use scanme.nmap.org (official test target) or your own isolated lab network; never scan the public internet without permission
- Internal networks: use CIDR notation for entire subnets, scan DMZ networks separately from internal, and consider network segmentation in the scan design

**Understanding Report Contents**

Report structure: the generated report includes:

1. Executive Summary:
- Total hosts discovered
- Total services identified
- Total vulnerabilities found
- Severity breakdown (Critical, High, Medium, Low, Info)
- Scan date and time
- Target network

2. Overall Statistics:
- Visual dashboard with key metrics
- Severity distribution chart
- Quick risk assessment

3. Detailed Findings by Host (for each discovered host):
- IP address
- Hostname (if resolved)
- List of open ports and services
- Service details: port number and protocol, service name (e.g., http, ssh, mysql), product (e.g., Apache, OpenSSH, MySQL), version (e.g., 2.4.41, 8.2p1, 5.7.33), CPE identifier

4. Vulnerability Details (for each vulnerable service):
- **CVE ID**: unique vulnerability identifier (e.g., CVE-2021-44228)
- **Severity**: CRITICAL / HIGH / MEDIUM / LOW / INFO
- **CVSS Score**: numerical score (0.0-10.0)
- **Published Date**: when the vulnerability was disclosed
- **Description**: detailed vulnerability explanation
- **References**: links to advisories, patches, exploits
5. Recommendations:
- Immediate actions (patch critical/high severity)
- Long-term improvements (security processes)
- Best practices

**Vulnerability Severity Levels**

CRITICAL (CVSS 9.0-10.0):
- Color: Red
- Characteristics: remote code execution, full system compromise
- Action: immediate patching required
- Examples: Log4Shell, EternalBlue, Heartbleed

HIGH (CVSS 7.0-8.9):
- Color: Orange
- Characteristics: significant security impact, data exposure
- Action: patch within days
- Examples: SQL injection, privilege escalation, authentication bypass

MEDIUM (CVSS 4.0-6.9):
- Color: Yellow
- Characteristics: moderate security impact
- Action: patch within weeks
- Examples: information disclosure, denial of service, XSS

LOW (CVSS 0.1-3.9):
- Color: Green
- Characteristics: minor security impact
- Action: patch during regular maintenance
- Examples: path disclosure, weak ciphers, verbose error messages

INFO (CVSS 0.0):
- Color: Blue
- Characteristics: no vulnerability found or informational
- Action: no action required, awareness only
- Examples: service version detected, no known CVEs

**Understanding CPE**

CPE (Common Platform Enumeration):
- Standard naming scheme for IT products
- Used for CVE lookup in the NVD database

Workflow CPE handling:
1. Nmap detects the service and version
2. Nmap provides a CPE (if in its database)
3. The workflow uses the CPE to query the NVD API
4. NVD returns CVEs associated with that CPE
- Special case: the nginx vendor is fixed from igor_sysoev to nginx

**Working with Reports**

Accessing the HTML report:
- Location: /tmp/vulnerability_report_<timestamp>.html
- Viewing: open it in a web browser directly from n8n, click the "11. Read Report for Output" node, download the HTML file, and open it locally in any browser
- Advantages: interactive (clickable links), searchable text, easy to edit/customize, smaller file size

Accessing the PDF report:
- Location: /tmp/vulnerability_report_<timestamp>.pdf
- Password: default is Almafa123456 (configured in "1. Set Parameters"); change it in the workflow before production use; required to open the PDF
- Opening the PDF: receive the PDF via Telegram or email, open it with a PDF reader (Adobe, Foxit, browser), enter the password when prompted, then view, print, or share
- Advantages: professional appearance, print-optimized formatting, password protection, portable (works anywhere), preserves formatting

**Report Customization**

Change the report title:
- Open the "8. Prepare Report Structure" node
- Find the metadata object
- Edit the title and subtitle fields

Customize styling:
- Open the "9. Generate HTML Report" node
- Modify the CSS in the <style> section
- Change colors, fonts, layout

Add a company logo:
- Edit the HTML generation code
- Add an `<img>` tag in the header section
- Include a base64-encoded logo or URL

Modify recommendations:
- Open the "9. Generate HTML Report" node
- Find the Recommendations section
- Edit the recommendation text

**Scanning Ethics and Legality**

Authorization is mandatory:
- Never scan networks without explicit written permission
- Unauthorized scanning is illegal in most jurisdictions
- It can result in criminal charges and civil liability

Scope definition:
- Document the approved scan scope
- Exclude out-of-scope systems
- Maintain scan authorization documents

Notification:
- Inform network administrators before scans
- Provide the scan window and source IPs
- Have emergency contact procedures

Safe targets for testing:
- scanme.nmap.org: official Nmap test server
- Your own isolated lab network
- Cloud instances you own
- Explicitly authorized environments

**Compliance Considerations**

PCI DSS:
- Quarterly internal vulnerability scans required
- Scan all system components
- Re-scan after significant changes
- Document scan results

HIPAA:
- Regular vulnerability assessments required
- Risk analysis and management
- Document remediation efforts

ISO 27001:
- Vulnerability management process
- Regular technical vulnerability scans
- Document procedures

NIST Cybersecurity Framework:
- Identify vulnerabilities (DE.CM-8)
- Maintain inventory
- Implement vulnerability management

**License and Credits**

Workflow:
- Created for n8n workflow automation
- Free for personal and commercial use
- Modify and distribute as needed
- No warranty provided

Dependencies:
- **Nmap**: GPL v2 - https://nmap.org
- **nmap-helper**: Open source - https://github.com/net-shaper/nmap-helper
- **Prince XML**: Commercial license required for production use - https://www.princexml.com
- **NVD API**: Public API by NIST - https://nvd.nist.gov

Third-party services:
- Telegram Bot API: https://core.telegram.org/bots/api
- SMTP: standard email protocol

**Support**

For Nmap issues:
- Documentation: https://nmap.org/book/
- Community: https://seclists.org/nmap-dev/

For NVD API issues:
- Status page: https://nvd.nist.gov
- Contact: https://nvd.nist.gov/general/contact

For Prince XML issues:
- Documentation: https://www.princexml.com/doc/
- Support: https://www.princexml.com/doc/help/

**Workflow Metadata**
- **External Dependencies**: Nmap, nmap-helper, Prince XML, NVD API
- **License**: Open for modification and commercial use

**Security Disclaimer**

This workflow is provided for legitimate security testing and vulnerability assessment purposes only. Users are solely responsible for ensuring they have proper authorization before scanning any network or system. Unauthorized network scanning is illegal and unethical. The authors assume no liability for misuse of this workflow or any damages resulting from its use. Always obtain written permission before conducting security assessments.
by Oneclick AI Squad
This n8n workflow automates the monitoring of warehouse inventory and sales velocity to predict demand, generate purchase orders automatically, send them to suppliers, and record all transactions in ERP and database systems. It uses AI-driven forecasting to ensure timely restocking while maintaining operational efficiency and minimizing stockouts or overstocking.

**Key Features**
- **Automated Scheduling:** Periodically checks inventory and sales data at defined intervals.
- **Real-Time Data Fetching:** Retrieves live warehouse stock levels and sales trends.
- **AI Demand Forecasting:** Uses OpenAI GPT to predict future demand based on sales velocity and stock trends.
- **Auto-Purchase Orders:** Automatically generates and sends purchase orders to suppliers.
- **ERP Integration:** Logs completed purchase orders into ERP systems like SAP, Oracle, or Netsuite.
- **Database Logging:** Saves purchase order details and forecast confidence data into SQL databases (PostgreSQL/MySQL).
- **Email Notifications:** Notifies relevant teams upon successful order creation and logging.
- **Modular Configuration:** Each node includes configuration notes and credentials setup instructions.

**Workflow Process**
1. Schedule Trigger
   - Runs every 6 hours to monitor stock and sales data.
   - The interval can be adjusted for higher or lower frequency checks.
2. Fetch Current Inventory Data
   - Retrieves live inventory levels from the warehouse API endpoint.
   - Requires API credentials and optional GET/POST method setup.
3. Fetch Sales Velocity
   - Pulls recent sales data for forecasting analysis.
   - Used later for AI-based trend prediction.
4. Merge Inventory & Sales Data
   - Combines inventory and sales datasets into a unified JSON structure.
   - Prepares data for AI model input.
5. AI Demand Forecasting
   - Sends merged data to OpenAI GPT for demand prediction.
   - Returns a demand score, reorder need, and confidence levels (an illustrative response shape is sketched at the end of this template).
6. Parse AI Response
   - Extracts and structures forecast results.
   - Combines AI data with the original inventory dataset.
7. Filter: Reorder Needed
   - Identifies items flagged for reorder based on AI output.
   - Passes only reorder-required products to the next steps.
8. Create Purchase Order
   - Automatically creates a PO document with item details, quantity, and supplier information.
   - Calculates total cost and applies forecast-based reorder logic.
9. Send PO to Supplier
   - Sends the generated purchase order to supplier API endpoints.
   - Includes response validation for order success/failure.
10. Log to ERP System
    - Records confirmed purchase orders into ERP platforms (SAP, Oracle, Netsuite).
    - Includes timestamps and forecast metrics.
11. Save to Database
    - Stores all PO data, supplier responses, and AI forecast metrics into PostgreSQL/MySQL tables.
    - Useful for long-term audit and analytics.
12. Send Notification Email
    - Sends summary emails upon PO creation and logging.
    - Includes PO ID, supplier, cost, and demand reasoning.

**Setup Instructions**
- **Schedule Trigger:** Adjust to your preferred interval (e.g., every 6 hours or once daily).
- **API Configuration:** Provide credentials in the Inventory, Sales, and Supplier nodes. Use Authorization headers or API keys as per your system.
- **AI Node (OpenAI):** Add your OpenAI API key in the credentials section. Modify the prompt if you wish to include additional forecasting parameters.
- **ERP Integration:** Replace placeholder URLs with your ERP system endpoints. Match fields like purchase order number, date, and cost.
- **Database Connection:** Configure credentials for PostgreSQL/MySQL in the Save to Database node. Ensure tables (purchase_orders) are created as per the schema provided in the sticky notes.
- **Email Notifications:** Set up SMTP credentials (e.g., Gmail, Outlook, or a custom mail server). Add recipients under the workflow notification settings.

**Industries That Benefit**
This automation is highly beneficial for:
- **Retail & E-commerce:** Predicts product demand and auto-orders from suppliers.
- **Manufacturing:** Ensures raw materials are restocked based on production cycles.
- **Pharmaceuticals:** Maintains optimum inventory for high-demand medicines.
- **FMCG & Supply Chain:** Balances fast-moving goods availability with minimal overstocking.
- **Automotive & Electronics:** Prevents delays due to missing components.

**Prerequisites**
- API access to inventory, sales, supplier, and ERP systems.
- Valid OpenAI API key for demand forecasting.
- SQL database (PostgreSQL/MySQL) for record storage.
- SMTP or mail server credentials for email notifications.
- n8n environment with the required nodes installed (HTTP, AI, Filter, Email, Database).

**Modification Options**
- Change forecast logic or thresholds for different industries.
- Integrate Slack/Teams for live notifications.
- Add an approval workflow before sending POs.
- Extend the AI prompt for seasonality or promotional trends.
- Add dashboard visualization using Grafana or Google Sheets.

Explore more AI workflows: get in touch with us to build industry-grade n8n automations with predictive intelligence.
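To make the forecasting step concrete, the AI node is expected to return a demand score, a reorder flag, and confidence levels for each item. An illustrative, entirely hypothetical shape that the "Parse AI Response" node could consume might be:

```json
{
  "sku": "ITEM-1042",
  "demand_score": 0.82,
  "reorder_needed": true,
  "recommended_quantity": 150,
  "confidence": 0.91,
  "reasoning": "Sales velocity is rising while on-hand stock covers less than two weeks of demand."
}
```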
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Stay ahead of market changes with this Automated Price Intelligence System! This workflow monitors e-commerce product prices 3x daily using advanced web scraping and AI analysis, tracking price changes, comparing against strategic thresholds, and sending intelligent alerts for competitive pricing opportunities. Perfect for e-commerce teams, retailers, and pricing strategists maintaining market competitiveness.

**What This Template Does**
- Triggers 3x daily (9 AM, 3 PM, 9 PM) for continuous price monitoring.
- Configures global settings for products, thresholds, and alert recipients.
- Parses product targets and maps price thresholds by category (premium/default/budget).
- Uses the Decodo scraper to extract real-time pricing data from e-commerce sites.
- Analyzes product pages with AI to extract structured price information.
- Logs all price data to Google Sheets for historical tracking and trend analysis.
- Evaluates prices against thresholds with AI-powered strategic recommendations.
- Classifies alerts as CRITICAL or STANDARD based on price drop severity.
- Sends targeted email alerts with strategic action recommendations.

**Key Benefits**
- Continuous price monitoring across multiple e-commerce platforms
- AI-powered strategic recommendations for pricing decisions
- Historical price tracking for trend analysis and forecasting
- Multi-tier alert system for critical vs. standard price changes
- Automated competitive intelligence without manual monitoring
- Configurable thresholds for different product categories

**Features**
- Triple-daily scheduling for comprehensive market coverage
- Multi-product monitoring with individual threshold configuration
- AI-powered price extraction and data structuring
- Real-time web scraping with Decodo integration
- Strategic alert classification (CRITICAL/STANDARD)
- Automated email notifications with actionable insights
- Google Sheets integration for data centralization
- Batch processing for efficient multi-product handling
- Quality assurance with auto-fixing output parsing

**Requirements**
- Decodo API credentials for web scraping
- OpenAI API credentials for AI analysis
- Google Sheets OAuth2 credentials with edit access
- Gmail OAuth2 credentials for email alerts
- Environment variables for configuration settings
- Product URLs with internal tracking IDs

**Target Audience**
- E-commerce and retail pricing teams
- Competitive intelligence analysts
- Pricing strategy and revenue optimization teams
- E-commerce marketing and sales teams
- Retail operations and category managers
- Digital agency e-commerce specialists

**Step-by-Step Setup Instructions**
1. Connect Decodo API credentials for reliable web scraping
2. Set up OpenAI credentials for AI price analysis and strategy recommendations
3. Configure Google Sheets for price history tracking and logging
4. Add Gmail credentials for critical and standard alert notifications
5. Define your product URLs with internal IDs and threshold types
6. Set price thresholds for premium, default, and budget categories
7. Configure alert recipients for different notification levels
8. Test with sample product URLs to verify data extraction and alerting
9. Activate for automated triple-daily price intelligence monitoring

Pro Tip: Use coupon code "YARON" for free Decodo credits to enhance your price intelligence capabilities!

This workflow ensures you never miss a pricing opportunity with automated monitoring, intelligent analysis, and strategic alerting!
by Rahul Joshi
Streamline the final stage of your content production workflow by automating publishing, formatting, metadata generation, and approval routing. This AI-powered subworkflow pulls optimized drafts from Google Sheets, enriches them with SEO metadata, converts them into publish-ready HTML, and delivers them via email and Slack for approval or distribution. Ideal for teams managing high-volume content pipelines with structured review processes. ✨📝🚀

**What This Template Does**
- Triggers via chat to start the content publishing process. 💬
- Fetches the latest optimized content draft from Google Sheets using a content ID. 📄
- Prepares metadata such as topic, intent, platform, and parameters. 🧩
- Uses an AI agent (GPT-4) to generate SEO metadata, an HTML-formatted article, tags, and structured publish data. 🤖
- Enforces a JSON structure to ensure consistent output formatting (an illustrative shape is sketched at the end of this template). 🧱
- Saves the publish-ready content (title, meta description, HTML, tags) back into Google Sheets for version tracking. 📊
- Sends the content to an approver via Gmail with a previewed HTML body. 📧
- Awaits approval and branches based on the decision. 🔀
- If approved, sends the final published content to the intended recipient via Gmail. 📨
- Sends a success confirmation message to Slack for team visibility. 📢

**Key Benefits**
✅ AI-generated SEO optimization, metadata, and HTML formatting
✅ Centralizes content versioning within Google Sheets
✅ Automates approval workflows and content delivery
✅ Ensures consistent output structure with JSON parsing
✅ Reduces manual formatting, editing, and routing tasks
✅ Delivers instant Slack notifications for team transparency

**Features**
- Chat-triggered publishing workflow
- Google Sheets content retrieval and storage
- AI-driven formatting, metadata generation, and HTML conversion
- Structured JSON enforcement for clean automation
- Gmail integration for approval and publishing
- Slack notifications for successful publication
- Short-term memory support for context persistence

**Requirements**
- Google Sheets OAuth2 credentials
- OpenAI API key (GPT-4 or GPT-4 mini)
- Gmail OAuth2 credentials for sending and receiving approval messages
- Slack API credentials with chat:write access
- Preconfigured Google Sheet containing optimized content drafts

**Target Audience**
- Content operations teams handling recurring content workflows
- SEO and marketing teams producing high-volume articles
- Agencies managing structured approval pipelines
- Automation specialists building content publishing systems
- Teams needing standardized, AI-enhanced HTML content

**Step-by-Step Setup Instructions**
1. Connect your Google Sheets OAuth2 credential and replace the sheet/document IDs. 🗂️
2. Add your OpenAI API key for the AI Publishing Agent. 🔑
3. Connect Gmail credentials for both approval and final publishing emails. 📧
4. Update all email addresses and Slack channel IDs with your own. ✏️
5. Modify metadata fields (topic, intent, platform) if needed. 🎯
6. Run the workflow with a sample content ID to verify the flow. 🔍
7. Enable and integrate it as a subworkflow inside your main content pipeline. 🚀
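As a reference for the enforced JSON structure, the AI Publishing Agent's output could look roughly like the sketch below. The keys mirror the publish-ready fields described above (title, meta description, HTML, tags); the values are illustrative only.

```json
{
  "title": "How to Automate Content Publishing with n8n",
  "meta_description": "A step-by-step guide to automating formatting, SEO metadata, and approvals for high-volume content pipelines.",
  "html": "<article><h1>How to Automate Content Publishing with n8n</h1><p>...</p></article>",
  "tags": ["automation", "content operations", "seo"]
}
```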
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Keep your SEO performance on track with this automated SEO Watchlist Monitor! This workflow combines AI-powered strategy analysis with real-time search ranking checks to track keyword positions, identify content gaps, and alert you to critical ranking drops. Perfect for marketing teams ensuring search visibility and competitive intelligence across platforms. 🚀🔍

**What This Template Does**
1️⃣ Triggers daily SEO intelligence checks to monitor keyword performance.
2️⃣ Configures target keywords, competitor domains, and geographic focus.
3️⃣ Validates the SEO configuration to ensure proper setup.
4️⃣ Uses AI to analyze keyword competitiveness and strategic opportunities.
5️⃣ Checks real-time search rankings using a Google Search scraper.
6️⃣ Detects critical ranking drops below position 10.
7️⃣ Saves SEO intelligence to Google Sheets for tracking.
8️⃣ Sends email alerts for urgent ranking issues.
9️⃣ Provides daily Slack summaries of SEO performance.

**Key Benefits**
✅ Monitors keyword rankings and competitor movements daily
✅ Identifies content gaps and strategic opportunities with AI analysis
✅ Alerts instantly to critical ranking drops for quick action
✅ Centralizes SEO intelligence in Google Sheets for team visibility
✅ Combines AI insights with real-time search data for comprehensive monitoring

**Features**
- Daily automated schedule for continuous monitoring
- AI-powered SEO strategy analysis and competitive intelligence
- Real-time search ranking checks using the Decodo scraper
- Critical alert system for ranking drops
- Google Sheets integration for data centralization
- Slack and Gmail notifications for team awareness
- Configuration validation and error logging
- Structured data parsing for consistent reporting

**Requirements**
- OpenAI API credentials for AI analysis
- Decodo API credentials for search scraping
- Google Sheets OAuth2 credentials with edit access
- Gmail OAuth2 credentials for email alerts
- Slack Bot Token with chat:write permission
- Environment variables for configuration settings

**Target Audience**
- SEO and digital marketing teams 🎯
- Content strategy and growth teams 📈
- Competitive intelligence professionals 🔍
- Marketing operations teams 🚀
- Agency account managers managing multiple clients 💼

**Step-by-Step Setup Instructions**
1️⃣ Connect OpenAI credentials for AI analysis capabilities
2️⃣ Set up Decodo API credentials for search scraping functionality
3️⃣ Configure Google Sheets with the required headers (Keyword, Rank, description, etc.)
4️⃣ Add Gmail and Slack credentials for alerting and notifications
5️⃣ Set your target keywords, competitors, and geographic focus in the configuration node
6️⃣ Configure the cron schedule for your daily monitoring frequency
7️⃣ Run once manually to verify all integrations and data flow
8️⃣ Activate for ongoing SEO performance tracking and alerting ✅
by Ramon David
This workflow manages subscription billing reminders and data updates via Telegram. It runs daily at 8:00 AM to check for upcoming due subscriptions, formats relevant information, and sends reminders to users. It also processes user messages for subscription management—adding, updating, or retrieving billing info—using AI-powered natural language understanding. Main outcomes include automated subscription tracking, timely reminders, and conversational interaction through Telegram, reducing manual tracking efforts and improving billing accuracy.

**Automation Benefits**

Time & Cost Savings
- Manual process: several hours per week spent managing subscriptions and reminders manually.
- Automated process: the workflow completes checks, reminders, and data updates in under a minute.
- Time savings: saves approximately 5 hours weekly, translating to significant productivity gains and cost reduction.
- ROI: automation pays for itself within the first month due to saved labor.
- Error reduction: minimized manual entry errors, ensuring accurate billing records and timely reminders.

Business Impact
- Solves the problem of manual subscription tracking and reminders.
- Scales effortlessly as the subscription list grows.
- Opens new opportunities for proactive customer engagement, personalized messaging, and integrated billing insights.

**Setup Guide**

Prerequisites
- Google Sheets account with a subscription data sheet.
- OpenAI API key with access to GPT-4.
- Telegram bot token with messaging permissions.
- Email SMTP setup if email reminders are used.

API Configuration
- Google Sheets: generate OAuth2 credentials, enable the Sheets API, and authorize access.
- OpenAI: create an API key, set the model to GPT-4, and test connectivity.
- Telegram: create a bot via BotFather, retrieve the token, and set the webhook URL.
- Webhook URL: use the provided URL in the Telegram bot settings.

Node-by-Node Setup
- OpenAI Chat Model: enter API credentials, select the GPT-4 model.
- Google Sheets: input the spreadsheet ID and sheet name, and ensure correct permissions.
- Telegram nodes: insert the chat ID, message parsing, and response formatting.
- Schedule Trigger: confirm the cron expression for daily execution.
- For AI nodes, test with sample messages to verify formatting and extraction.

Testing & Validation
- Run the workflow manually.
- Confirm data is retrieved, processed, and responses are sent.
- Verify subscription updates in Google Sheets.
- Check Telegram chats for correct message flow.

N8N Documentation References
- Google Sheets Node
- OpenAI Node
- Telegram Node
- Schedule Trigger

**Maintenance & Troubleshooting**

Regular Maintenance (Monthly)
- Check API credentials and renew tokens if expired.
- Monitor workflow logs for errors.
- Review Google Sheets data for consistency.
- Update API keys when new versions or permissions are granted.
- Verify currency conversion accuracy periodically.

Common Issues & Solutions
- Workflow not triggering: check schedule settings and webhook URLs.
- Data not updating: verify Google Sheets credentials and permissions.
- Incorrect responses: test AI prompt inputs and outputs.
- API failures: regenerate API keys or check quota limits.
- Reconfigure nodes if external APIs change.

Monitoring & Alerts
- Set up email or Slack alerts for failures.
- Regularly review execution logs.
- Track key metrics like successful runs, error rates, and response times.

Support & Escalation
- Check n8n logs first for errors.
- Export the workflow for support if needed.
- Use the n8n community forums for common issues.
- Contact API providers for account-specific problems.
- Emergency procedures: restart the workflow, regenerate tokens.
Updates & Improvements
- Review workflow performance quarterly.
- Optimize AI prompts for better accuracy.
- Back up workflow configurations before major changes.
- Incorporate user feedback for feature enhancements.