by Qandil
## What it does

A conversational AI agent that connects to WAFtester via MCP (Model Context Protocol) for interactive Web Application Firewall security testing. Type natural language requests — the agent picks the right tools, runs the tests, and explains the results.

## About WAFtester

WAFtester is an open-source CLI for testing Web Application Firewalls. It ships 27 MCP tools, 2,800+ attack payloads across 18 categories (SQLi, XSS, SSRF, SSTI, command injection, XXE, and more), detection signatures for 26 WAF vendors and 9 CDNs, and enterprise-grade assessment with F1/MCC scoring and letter grades (A+ through F).

GitHub: github.com/waftester/waftester
Docs: Installation | Examples | Commands

## Who it's for

- **Security engineers** running ad-hoc WAF assessments
- **Penetration testers** who want AI-assisted reconnaissance and bypass discovery
- **DevSecOps teams** validating WAF coverage before and after deployments
- **API security teams** testing OpenAPI/Swagger specs against WAF rules

## How it works

The workflow has four nodes:

1. **Chat Trigger** — Opens an n8n chat interface where you type requests in plain English
2. **AI Agent** — Receives your message, reasons about which tools to call, and orchestrates the testing workflow
3. **OpenAI Chat Model** — Provides the LLM reasoning layer (GPT-4o recommended; swappable for Anthropic, Ollama, etc.)
4. **WAFtester MCP** — Connects to the WAFtester server via SSE and exposes all 27 tools to the agent

The agent follows a standard WAF testing workflow:

1. `detect_waf` — Fingerprint the WAF vendor and CDN protecting the target
2. `discover` — Map the attack surface (endpoints, parameters, technologies) from robots.txt, sitemaps, JavaScript, and the Wayback Machine
3. `learn` — Generate a prioritized test plan based on discovery results
4. `scan` — Fire 2,800+ attack payloads and measure detection vs. bypass rates
5. `bypass` — Systematic mutation-matrix testing to find WAF evasion techniques
6. `assess` — Generate a formal security grade with F1, precision, MCC, and false positive rate

Long-running operations (`scan`, `assess`, `bypass`, `discover`, `discover_bypasses`, `event_crawl`, `scan_spec`) run asynchronously — the agent polls for results automatically.

## Key capabilities

| Capability | Details |
|---|---|
| WAF detection | Fingerprint 26 WAF vendors and 9 CDNs from response headers, cookies, and error pages |
| Payload scanning | 2,800+ payloads across 18 attack categories |
| Bypass discovery | Mutation matrix with 40+ tamper techniques to find WAF evasions |
| Enterprise assessment | F1 score, precision, MCC, false positive rate, and A+ through F grading |
| API spec testing | Validate, plan, and scan OpenAPI/Swagger/Postman specs |
| Headless crawling | Click-driven DOM crawling via headless browser for JS-rendered endpoints |
| Knowledge resources | 12 built-in resources covering WAF signatures, evasion techniques, OWASP mappings, and config defaults |

## Example prompts

- "What WAF is protecting https://example.com?"
- "Scan https://example.com for SQLi and XSS"
- "Find WAF bypasses for https://example.com"
- "Run a full security assessment of https://example.com"
- "Validate my API spec at https://example.com/openapi.json"
- "Discover the attack surface of https://example.com"

## How to set up

1. Start the WAFtester MCP server:

   ```
   docker run -p 8080:8080 ghcr.io/waftester/waftester:latest mcp --http :8080
   ```

2. Add OpenAI credentials in n8n: Settings → Credentials → New → OpenAI API
3. Select the credential in the OpenAI Chat Model node
4. Activate the workflow and open the chat interface

Alternatively, use the included docker-compose.yml to run both n8n and WAFtester together with `docker compose up -d`.
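For reference, a minimal docker-compose.yml for this pairing might look like the sketch below. This is an assumption-based illustration, not the file shipped with the template: the n8n image, port mappings, and service names are my guesses based on the `docker run` command above.

```yaml
# Hypothetical compose file: runs the WAFtester MCP server alongside n8n.
services:
  waftester:
    image: ghcr.io/waftester/waftester:latest
    command: mcp --http :8080
    ports:
      - "8080:8080"
  n8n:
    image: docker.n8n.io/n8nio/n8n
    ports:
      - "5678:5678"
    depends_on:
      - waftester
```

With both containers on the same compose network, the MCP node can reach the server at `http://waftester:8080` instead of `localhost`.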
## Requirements

| Requirement | Details |
|---|---|
| WAFtester MCP server | Docker image (ghcr.io/waftester/waftester:latest) or binary install for macOS, Linux, Windows |
| LLM API key | OpenAI (default), or swap the model node for Anthropic, Ollama, Azure OpenAI, or any LangChain-compatible provider |
| Authorization | Only test targets you have explicit written permission to test |

## Links

WAFtester website | GitHub repository | Installation guide | Full examples | Docker Hub
by Milo Bravo
# Abandoned Cart Recovery for Event Registration

## Who is this for?

Event organizers losing 80% of form starters who never finish registration and want automated follow-up emails triggered by abandonment beacons.

## What problem is this workflow solving?

Abandoned carts kill revenue:

- 80-90% drop-off after form open
- No visibility into who started
- Manual follow-up doesn't scale
- Lost leads = lost tickets

This workflow auto-recovers drop-offs with timed email sequences.

## What this workflow does

Beacon → Recovery Engine:

- **Trigger**: /reg-beacon pixel from the registration page
- **Profile Capture**: Email + page + timestamp
- **3-Email Sequence**: Day 1 / Day 3 / Day 7 nudges
- **Gemini Personalization**: Dynamic subject/body per visitor
- **Data Tables**: Tracks opens/clicks/conversions
- **Slack Alerts**: High-value abandons flagged

Main workflow required. This is a sub-workflow triggered by Event Registration + Auto-Enrichment.

## Setup (5 minutes)

1. **Data Tables**: reg_analytics, abandoned_carts
2. **Gemini**: Flash Lite API key
3. **Email**: SMTP credential
4. **Slack**: OAuth2 for alerts
5. **Config**: Update recovery copy templates

Fully configurable—no code changes needed.

## How to customize to your needs

- **Timing**: Adjust the Day 1/3/7 sequence
- **Copy**: Edit Gemini prompts (urgency, offers)
- **Segments**: VIP emails vs. standard
- **Channels**: Add SMS/WhatsApp
- **Escalation**: Phone calls for high-LTV leads

ROI:

- **20% recovery rate** (industry average)
- **$5k+ revenue** from 500 abandons
- **Zero manual tracking**

Need help customizing? Contact me for consulting and support: LinkedIn / Message

Keywords: abandoned cart recovery, event registration optimization, email automation, conversion recovery, event marketing workflows
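As a rough sketch of how the Day 1 / Day 3 / Day 7 branching could be expressed in an n8n Code node: the function name, stage labels, and input shape below are illustrative assumptions, not taken from the actual workflow.

```javascript
// Decide which recovery email (if any) is due for an abandoned registration.
// Stages day1/day3/day7 mirror the 3-email sequence described above.
function nextRecoveryStage(abandonedAt, alreadySent, now = new Date()) {
  const days = (now - new Date(abandonedAt)) / (1000 * 60 * 60 * 24);
  const schedule = [
    { stage: 'day1', afterDays: 1 },
    { stage: 'day3', afterDays: 3 },
    { stage: 'day7', afterDays: 7 },
  ];
  // Keep stages whose delay has elapsed and that haven't been sent yet,
  // then return the latest one (so a long-idle cart skips straight ahead).
  const due = schedule.filter(s => days >= s.afterDays && !alreadySent.includes(s.stage));
  return due.length ? due[due.length - 1].stage : null;
}
```

A Switch node downstream could then route `day1`/`day3`/`day7` to the matching email template, with `null` meaning "do nothing this run".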
by Feedspace
## What problem does it solve?

Manually copying contacts and testimonials from Feedspace to HubSpot is time-consuming and error-prone. This workflow automates the entire process, ensuring every testimonial is:

- Linked to the correct contact (creating new contacts if needed)
- Stored as a detailed, formatted note with all metadata
- Processed in real time as testimonials are submitted

## How It Works

1. **Receive Webhook** - Feedspace sends testimonial data via webhook when a new submission arrives
2. **Extract Data** - A Code node parses and normalizes the testimonial payload (handles text, video, and audio types)
3. **Validate Email** - An IF node checks whether the reviewer email exists (required for the HubSpot contact)
4. **Upsert Contact** - The HubSpot node creates a new contact or updates an existing one based on email
5. **Create Note** - An HTTP Request creates a detailed note associated with the contact
6. **Respond** - Returns a success/error response to Feedspace

## Setup Steps

### 1. Configure HubSpot Credentials

Create a HubSpot Private App with the following scopes:

- crm.objects.contacts.write
- crm.objects.contacts.read

Add the App Token to your n8n credentials.

### 2. Get the Webhook URL

- Open the Receive Testimonial webhook node
- Copy the Production URL

### 3. Configure Feedspace

- Go to your Feedspace dashboard
- Navigate to Integrations → Webhooks
- Add the n8n webhook URL
- Select New Testimonial as the trigger event

### 4. Activate the Workflow

- Toggle the workflow to Active
- Test by submitting a testimonial in Feedspace
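The Extract Data step above could be sketched as follows. Note that the payload field names (`review_type`, `reviewer`, `media_url`, `created_at`) are assumptions for illustration; check an actual webhook execution in n8n for the real keys.

```javascript
// Normalize a Feedspace testimonial payload for the HubSpot steps.
// Handles text, video, and audio review types with a common output shape.
function extractTestimonial(payload) {
  const type = payload.review_type || 'text'; // assumed: text | video | audio
  return {
    // Lowercase the email so HubSpot upserts match reliably.
    email: ((payload.reviewer && payload.reviewer.email) || '').trim().toLowerCase(),
    name: (payload.reviewer && payload.reviewer.name) || 'Unknown reviewer',
    type,
    // Text reviews carry the message inline; video/audio carry a media URL.
    content: type === 'text' ? (payload.text || '') : (payload.media_url || ''),
    submittedAt: payload.created_at || new Date().toISOString(),
  };
}
```

The empty-string email falls through the IF node's "email exists" check, which is what keeps contactless testimonials out of HubSpot.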
by Cheng Siong Chin
## How It Works

This workflow automates enterprise policy compliance monitoring using AI agents to ensure organizational adherence to regulatory and internal policies. Designed for compliance officers, legal teams, and risk managers, it solves the challenge of manually reviewing vast policy documents and execution logs for violations.

The system fetches policy records on a schedule, routes them to specialized AI agents (OpenAI for compliance assessment and escalation logic), validates outputs, and logs all actions for audit trails. Email notifications alert stakeholders when violations occur. By automating detection and escalation, organizations reduce compliance risks, accelerate response times, and maintain comprehensive audit documentation—critical for regulated industries like finance, healthcare, and manufacturing.

## Setup Steps

1. Connect the Schedule Trigger (set monitoring frequency: hourly/daily)
2. Configure the Fetch Policy Records node with your policy database/API credentials
3. Add your OpenAI API key to the Compliance Agent and Escalation Logic nodes
4. Connect the Email node with SMTP credentials for alert notifications
5. Link Final Execution Log to your audit storage system
6. Test the workflow with sample policy violations to verify routing logic

## Prerequisites

OpenAI API account with GPT-4 access; policy database/API access

## Use Cases

Financial services regulatory compliance (KYC/AML), healthcare HIPAA monitoring

## Customization

Modify AI prompts for industry-specific regulations; adjust routing thresholds for violation severity

## Benefits

Reduces compliance review time by 90%, eliminates human oversight gaps
by Gilbert Onyebuchi
Transform your inbox into an intelligent assistant! This workflow automatically reads incoming emails, generates personalized AI responses using Google Gemini, and sends professional replies—all on autopilot.

## How It Works

1. **Monitors Your Inbox** - IMAP trigger checks for new emails
2. **Smart Processing** - Filters and validates incoming messages
3. **AI Response** - Google Gemini generates contextual, professional replies with conversation memory
4. **Auto-Send** - SendGrid delivers your response instantly

## Perfect For

- B2B customer support automation
- Lead qualification and response
- FAQ handling
- After-hours email coverage
- Reducing response time from hours to seconds

## Quick Setup (5 minutes)

1. Add your IMAP email credentials
2. Get a free Google Gemini API key from aistudio.google.com
3. Configure SendGrid (or use any SMTP provider)
4. Customize the AI prompt for your business
5. Activate and let it run!

## Cost-Effective

- Google Gemini: free tier available
- SendGrid: free up to 100 emails/day
- Perfect for startups and small businesses

## Customizable

Easily modify the AI prompt, add business logic, integrate with your CRM, or adjust response timing. The workflow is fully documented and beginner-friendly.

Need help customizing this for your specific use case? Looking for a collaborator on your n8n automation project? Let's connect on LinkedIn: send me a connection. I'm always open to collaborating on innovative automation solutions! 🤝

#email-automation #ai-chatbot #customer-support #gemini #sendgrid #b2b
by Lucas Hideki
## How it works

- Any external system triggers a reminder via webhook with a tenant token — the workflow validates the token, fetches the tenant's channel config and message template from PostgreSQL, renders the message with event variables, and sends it immediately
- A schedule trigger runs every minute and queries events approaching their deadline window per tenant — idempotency via a reminders_sent table ensures the same reminder is never sent twice
- A built-in n8n form lets you register new tenants with their channel, message template, and timing rules — no external backend needed
- Every send attempt is logged to the database with status, message sent, and error details

## Set up steps

1. Add your PostgreSQL credentials to all Postgres nodes (~2 min)
2. Add your Telegram credentials to the Send Message node (~2 min)
3. Create the required tables using the SQL schema provided in the workflow sticky note (~10 min)
4. Register your first tenant at /form/multi-tenant-register
5. Send events via POST /webhook/multi-tenant-webhook with an x-tenant-token header
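One way the reminders_sent idempotency check might look inside a Code node is sketched below. The key format and field names are my assumptions; the template ships its own SQL schema in the sticky note, which is authoritative.

```javascript
// Build a deterministic dedupe key so the every-minute schedule trigger
// never sends the same reminder twice. Key parts are illustrative:
// tenant, event, and which reminder offset is firing.
function reminderKey(tenantId, eventId, offsetMinutes) {
  return `${tenantId}:${eventId}:${offsetMinutes}`;
}

// Given keys already present in reminders_sent, keep only the reminders
// that have not been sent yet.
function filterUnsent(reminders, sentKeys) {
  const sent = new Set(sentKeys);
  return reminders.filter(
    r => !sent.has(reminderKey(r.tenantId, r.eventId, r.offsetMinutes))
  );
}
```

In practice the same effect can be had purely in SQL with a unique constraint on (tenant, event, offset) and `INSERT ... ON CONFLICT DO NOTHING`; the JS version just makes the dedupe logic visible.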
by Cheng Siong Chin
## How It Works

This workflow automates end-to-end code repository governance scanning using a multi-agent AI orchestration system. Designed for engineering leads, DevSecOps teams, and CTOs, it replaces manual code audits with a structured, AI-driven compliance and security analysis pipeline.

The workflow begins by extracting repository metadata, which is passed to a Governance Orchestrator Agent coordinating four specialised sub-agents: Static Code Analysis, Architectural Compliance, CTO Report Generation, and Security Vulnerability Analysis. Outputs are consolidated into a Structured Governance Output, formatted as a final report, then routed by severity level. Critical findings trigger escalation alerts and are aggregated separately, while medium findings are handled independently. All paths converge to merge analysis results, enrich the final output, and deliver a board-ready governance report with full audit traceability.

## Setup Steps

1. Configure Extract Repository Metadata with your Git provider or repository API credentials.
2. Set severity thresholds in the Check Critical Issues Threshold node to match your governance policy.
3. Configure Prepare Escalation Alert with your notification channel.

## Prerequisites

- OpenAI or compatible LLM API credentials
- Git repository access (GitHub, GitLab, or Bitbucket API)
- Notification channel (Slack, email, or webhook)

## Use Cases

Automated pre-release security and compliance audits

## Customisation

Adjust severity thresholds to match internal risk frameworks

## Benefits

Eliminates manual code audit effort across engineering teams
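A minimal sketch of the severity routing described above, as it might appear in a Code node: the threshold value and finding shape are assumptions for illustration, not the template's actual configuration.

```javascript
// Route consolidated governance findings by severity.
// Findings at or above the critical threshold go to the escalation path,
// medium findings are handled independently, and the rest stay in the
// standard report, matching the three paths described above.
function routeFindings(findings, criticalThreshold = 8) {
  const routes = { escalate: [], medium: [], report: [] };
  for (const f of findings) {
    if (f.score >= criticalThreshold) routes.escalate.push(f);
    else if (f.severity === 'medium') routes.medium.push(f);
    else routes.report.push(f);
  }
  return routes;
}
```

Tuning `criticalThreshold` is the code-level analogue of adjusting the Check Critical Issues Threshold node to your internal risk framework.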
by Alex Berman
## Who is this for

Sales development reps, growth marketers, and recruiters who need to find verified business email addresses at scale from a list of contacts -- without manual lookups or guesswork.

## How it works

1. A Set node holds your list of contacts (first name, last name, and company domain).
2. An HTTP Request node POSTs the contacts to the ScraperCity email-finder API, which returns a runId.
3. A second Set node stores the runId for use in subsequent requests.
4. The workflow waits 30 seconds, then polls the ScraperCity status endpoint in a loop until the job status is SUCCEEDED.
5. Once complete, the results are downloaded via the ScraperCity download endpoint.
6. A Code node parses the response and formats each contact row.
7. Results are written to Google Sheets, giving you a clean, ready-to-use email list.

## How to set up

1. Create a ScraperCity account at scrapercity.com and copy your API key.
2. In n8n, create an HTTP Header Auth credential named ScraperCity API Key with header Authorization and value Bearer YOUR_KEY.
3. Set your Google Sheets document ID and sheet name in the Google Sheets node.
4. Update the contacts list in the Set Contact List node with your real contacts.

## Requirements

- ScraperCity account with email-finder credits
- Google Sheets OAuth2 credential configured in n8n

## How to customize the workflow

- Replace the manual contact list with a Google Sheets Get Rows node to process a dynamic list.
- Add a Slack or email notification node after the results are written to alert your team.
- Add a Filter node to keep only contacts where an email was successfully found.
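The wait-and-poll loop in steps 4-5 can be sketched as a small helper. The SUCCEEDED status value mirrors the description above, but the function itself, its retry limits, and the FAILED handling are illustrative assumptions rather than the template's actual nodes.

```javascript
// Poll a job-status function until it reports SUCCEEDED, fails, or times out.
// `getStatus` is injected so the loop can be tested without real HTTP calls;
// in the workflow this role is played by the HTTP Request + IF loop.
async function pollUntilSucceeded(getStatus, { intervalMs = 30000, maxAttempts = 20, sleep } = {}) {
  const wait = sleep || (ms => new Promise(resolve => setTimeout(resolve, ms)));
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const status = await getStatus();
    if (status === 'SUCCEEDED') return { done: true, attempts: attempt };
    if (status === 'FAILED') throw new Error('ScraperCity run failed');
    await wait(intervalMs); // the 30-second pause between status checks
  }
  throw new Error('Timed out waiting for run to finish');
}
```

Capping attempts matters in n8n: without a limit, a stuck job would keep an execution alive indefinitely.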
by Milo Bravo
# AI Event Feedback Analyzer with Google Forms, GPT, Slack & Docs

## Who is this for?

Event planners, webinar hosts, conference organizers, and marketers who collect attendee feedback and want instant, actionable insights without manual analysis.

## What problem is this workflow solving?

Post-event feedback analysis is slow and manual:

- Sorting 100s of forms takes hours
- Missing sentiment patterns or testimonials
- No real-time team alerts or historical logs

This workflow automates sentiment analysis + distribution across Slack and Google Docs.

## What this workflow does

- **Trigger**: Google Forms → Sheets webhook (works with Typeform too)
- **AI Analysis**: GPT-4o extracts sentiment score, key likes/improvements, and golden quotes
- **Slack Alert**: "#eventfeedback: 4.2/5 'Loved networking' → Action: More breaks"
- **Google Docs Log**: Appends "{{Event}} Feedback Summary" with bullets + NPS trends
- **Bonus**: 50+ responses → "Avg NPS 4.2 | Top 3 fixes ranked"

## Setup (3 minutes)

1. **Google Forms** → Sheets (native integration)
2. **Slack channel + OpenAI API key** (GPT-4o-mini recommended)
3. **Google Docs ID** (env var DOCS_ID)

Fully configurable—no code changes needed.

## How to customize to your needs

- **Forms**: Swap Google Forms for Typeform/Webhook
- **AI**: Adjust sentiment thresholds or add custom categories
- **Channels**: Add Teams/Email + multiple Slack rooms
- **Metrics**: Track NPS, CSAT, or custom scores
- **Scale**: Aggregate by event/date for multi-conference orgs

ROI:

- **30% faster feedback loops**
- **15% NPS uplift** (proven 500+ runs)
- **Zero manual analysis**

Need help customizing? Contact me for consulting and support: LinkedIn / Message

Keywords: event management, sentiment analysis, post-event feedback, conference feedback
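The 50+ response aggregation ("Avg NPS 4.2 | Top 3 fixes ranked") could be sketched like this. The score scale, field names, and rounding are assumptions for illustration, not the workflow's actual implementation.

```javascript
// Summarize a batch of analyzed feedback rows: average score and the
// three most frequently mentioned improvement areas.
function summarizeFeedback(rows) {
  const avg = rows.reduce((sum, r) => sum + r.score, 0) / rows.length;
  // Count how often each improvement suggestion appears across rows.
  const counts = {};
  for (const r of rows) {
    for (const fix of r.improvements || []) {
      counts[fix] = (counts[fix] || 0) + 1;
    }
  }
  const topFixes = Object.entries(counts)
    .sort((a, b) => b[1] - a[1])
    .slice(0, 3)
    .map(([fix]) => fix);
  return { avgScore: Math.round(avg * 10) / 10, topFixes };
}
```

The GPT step would supply `score` and `improvements` per response; this function only does the arithmetic for the "Avg NPS | Top 3 fixes" summary line.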
by Oneclick AI Squad
Generates SEO blog posts, ad copy, email sequences, and social captions using Claude AI.

## Setup

1. Add your ANTHROPIC_API_KEY in the Set Config node
2. Configure a Serper API key for SEO data (optional)
3. Set your WordPress, Mailchimp, and Airtable credentials
4. POST to the webhook URL

## Webhook Payload

```json
{
  "topic": "Best AI Tools 2025",
  "contentType": "blog_post",
  "keyword": "ai tools for marketing",
  "tone": "professional",
  "audience": "SMB marketers",
  "wordCount": 1500,
  "brand": "YourBrand",
  "clientEmail": "client@email.com"
}
```

## Content Types

- blog_post
- ad_copy
- email_sequence
- social_captions

## Flow

The simplified flow (9 active nodes):

Receive Brief → Set Config → Fetch SERP Data → Build Claude Prompt → Call Claude AI → Parse Claude Response → Save to Google Sheets → Send Delivery Email → Send Response

After importing, replace these 5 values in the Set Config node:

| Field | Replace with |
|---|---|
| YOUR_ANTHROPIC_API_KEY | Your Anthropic key |
| YOUR_SERPER_API_KEY | Your Serper.dev key (free tier works) |
| YOUR_SENDGRID_API_KEY | Your SendGrid key |
| YOUR_GOOGLE_SHEET_ID | Your Sheet ID from the URL |
| YOUR_GSHEETS_CREDENTIAL_ID | Set up Google Sheets OAuth in n8n credentials |

The workflow handles all 4 content types (blog_post, ad_copy, email_sequence, social_captions) through a single Claude call with type-specific prompts — no branching needed.
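The type-specific prompting behind the single Claude call might be sketched as below. The instruction strings are invented for illustration and will not match the template's actual prompts; only the four content-type keys come from the description above.

```javascript
// Map each supported contentType to a tailored instruction, then build
// one prompt string from the webhook brief for the single Claude call.
const TYPE_INSTRUCTIONS = {
  blog_post: 'Write an SEO-optimized blog post with H2/H3 sections.',
  ad_copy: 'Write 5 short ad variations with strong CTAs.',
  email_sequence: 'Write a 3-email nurture sequence.',
  social_captions: 'Write 10 platform-ready social captions.',
};

function buildClaudePrompt(brief) {
  const instruction = TYPE_INSTRUCTIONS[brief.contentType];
  if (!instruction) throw new Error(`Unsupported contentType: ${brief.contentType}`);
  return [
    instruction,
    `Topic: ${brief.topic}`,
    `Primary keyword: ${brief.keyword}`,
    `Tone: ${brief.tone} | Audience: ${brief.audience}`,
    `Target length: ~${brief.wordCount} words | Brand: ${brief.brand}`,
  ].join('\n');
}
```

Selecting the instruction by key, rather than branching the workflow, is what keeps the flow at 9 nodes for all 4 content types.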
by Niclas Aunin
This n8n workflow automatically generates professional product announcements and blog articles from your Notion content-planning database.

## Who's it for & Use Cases

Product marketers, content teams, product managers, and founders who want to:

- Automate product announcement creation from their Notion product backlog
- Generate SEO and AI Search/ALLMO-optimized blog articles with consistent structure and brand voice
- Maintain an up-to-date product changelog for products with frequent updates

## How It Works

### Phase 1: Notion Trigger & Validation

- The workflow monitors your Notion "Content Plan" database for page updates
- Validates that the entry is marked as ready for writing
- Checks that the content type is set to "Product" (filters out other content types)

### Phase 2: AI Outline Generation

GPT-5 Mini creates a structured outline based on:

- The project name from Notion
- The Notes field (context/instructions)
- Built-in SEO and ALLMO best practices

Output includes sections, subsections, and key talking points.

### Phase 3: Full Article Generation

Claude Sonnet 4.5 writes the complete product announcement using:

- The generated outline
- Project details from Notion
- An expert product-communications system prompt

The article follows a structured format: headline, summary, feature sections, FAQ, CTA, and SEO metadata.

### Phase 4: Google Docs Creation & Notion Update

- Creates a new Google Doc with your project name as the title
- Inserts the complete Markdown article into the document
- Updates the Notion page with the Google Docs link for instant access
- Marks the project as complete in Notion

## How to Set Up

1. Connect your Notion account and select your Content Plan database
2. Enter API credentials in the Claude and OpenAI nodes
3. Configure your Google Docs folder location
4. Customize the system prompts with your company description, target audience, and brand voice

## How to Expand

- Replace the Notion node with a product backlog tool of your choice
- Update and fine-tune the prompts

## Output Structure

- Full Markdown article with YAML front matter
- Structured sections: headline, summary, feature descriptions, additional improvements, FAQ, CTA
- SEO metadata included (title, meta description, slug, tags)
- Automatically saved to Google Docs with a link in Notion

## Requirements

API credentials:

- Anthropic API (Claude Sonnet 4.5)
- OpenAI API (GPT-5 Mini)

Connected services:

- Notion workspace with a Content Plan database
- Google Docs/Drive account

Notion database fields:

- Project name (title/text)
- Notes (text/description field)
- Google_Docs_Link (URL field)
- Status field to mark entries as ready (e.g., "Ready for Writing")
- Content Type field set to "Product"
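The Phase 1 validation gate could be sketched like this in a Code or IF node. The property names mirror the Notion fields listed in the requirements, but their exact keys in your trigger payload may differ, so treat this as an assumption-based sketch.

```javascript
// Decide whether a Notion "Content Plan" page should enter the
// article-generation pipeline (the Phase 1 checks).
function shouldProcess(page) {
  const status = page.properties?.Status;
  const contentType = page.properties?.['Content Type'];
  // Only pages marked ready AND typed "Product" proceed; other
  // content types are filtered out here.
  return status === 'Ready for Writing' && contentType === 'Product';
}
```

Gating early like this keeps non-product entries (and drafts) from consuming GPT-5 Mini and Claude tokens.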
by Milo Bravo
# Automated Contract Signing: Tally, Airtable & DocuSign

## Who is this for?

Businesses that manually prep and route DocuSign envelopes and want zero-touch contract signing from form submission.

## What problem is this workflow solving?

Contract chaos kills velocity:

- Manual DocuSign prep (30 min/envelope)
- Signer routing errors
- Data re-entry across tools
- No audit trail

This workflow auto-generates and routes DocuSign envelopes from Tally forms, and retrieves and updates Airtable records.

## What this workflow does

- Normalizes the Tally payload and looks up the service provider
- Routes smartly: both signers / primary only / secondary only
- Pre-fills DocuSign from Airtable data
- Tracks everything: status, signers, and timestamps in a dashboard

3 signing modes:

- **Both**: Dual-signer envelopes
- **Primary**: Client-only signing
- **Secondary**: Provider-only signing

## Setup (10 minutes)

1. Airtable: 3 tables (Contracts / Providers / Logs)
2. DocuSign: OAuth2 + 3 envelope templates
3. Tally: Form webhook → this workflow URL
4. Config: Replace BASE_ID / TABLE_IDs / ACCOUNT_ID
5. Test: Submit a Tally form → watch the DocuSign magic

Airtable schema:

- Contracts: ID, Client, Provider, Status, EnvelopeID, Signers
- Providers: Name, DocuSignEmail, Role
- Logs: Timestamp, Action, Details

## How to customize to your needs

Signing flows:

- Agency → Client NDA (Primary only)
- Partner → Mutual MSA (Both signers)
- Internal → Approval (Secondary only)

Scale up:

- **CRM Sync**: HubSpot/Salesforce status updates
- **Payments**: Stripe link post-signing
- **Multi-language**: Template per locale
- **Notifications**: Slack/Teams on completion

Pro features:

- Sequential signing order
- Void/correct envelopes
- Audit log dashboard
- Field validation

ROI:

- 30 min → 30 sec per contract
- $0 (vs. HelloSign $15+/mo)
- 100% tracked (no lost envelopes)
- Audit-ready (logs + timestamps)
- GDPR compliant (data mapping)

Proven: 2k+ contracts signed, 98% completion rate.
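The three-mode routing could be expressed as a small decision function. The mode names follow the description above, but the function itself and the recipient shape are illustrative assumptions, not the workflow's actual nodes.

```javascript
// Pick the DocuSign recipient list for an envelope based on the
// contract's signing mode and the Airtable provider lookup.
function routeSigners(mode, client, provider) {
  switch (mode) {
    case 'both':      // dual-signer envelope
      return [client, provider];
    case 'primary':   // client-only signing
      return [client];
    case 'secondary': // provider-only signing
      return [provider];
    default:
      throw new Error(`Unknown signing mode: ${mode}`);
  }
}
```

Throwing on unknown modes (rather than silently defaulting) is what prevents the signer-routing errors called out above.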
Need help customizing? Contact me for consulting and support: LinkedIn / Message

Keywords: DocuSign automation, contract signing automation, e-signature workflow, sales contract automation