by Rodrigue Gbadou
## How it works
This recruitment automation workflow transforms your hiring process from manual screening into intelligent candidate management. The system automatically collects CVs from multiple job boards and career platforms, then parses each submission with AI to extract key information: skills, experience level, educational background, and career progression patterns.

Once parsed, the workflow applies predictive scoring that evaluates each candidate against your specific job requirements and company culture criteria. This multi-dimensional analysis weighs technical skills alignment, experience relevance, cultural fit indicators, and career trajectory to generate a compatibility score per candidate (a sketch of such a score appears after the setup steps). Qualified candidates then move into automated interview scheduling, which coordinates availability across hiring managers, team members, and candidates while accounting for time zones and calendar conflicts. Finally, successful candidates enter a personalized onboarding workflow that adapts to their role, department, and experience level, ensuring smooth integration into your organization.

## Target audience and problem solved
This workflow is designed for HR departments, talent acquisition teams, and growing companies struggling with time-intensive recruitment processes. It specifically addresses manual CV screening, subjective candidate evaluation, scheduling conflicts, and inconsistent onboarding experiences. Organizations processing high volumes of applications, or seeking to reduce recruitment bias while maintaining quality standards, will benefit most from this automation.

## Set up steps
1. **Prerequisites**: Ensure you have API access to your chosen AI parsing service (OpenAI, Affinda, or equivalent), active accounts on target job boards, and administrative access to your calendar and ATS systems.
2. **Configure job board integrations**: Connect your LinkedIn Recruiter, Indeed, and Glassdoor accounts using their respective APIs. Set up webhook endpoints to automatically capture new CV submissions and configure filtering criteria based on job titles, locations, and basic qualifications.
3. **Establish AI parsing service**: Choose and configure your CV analysis provider (OpenAI for natural language processing, Affinda for specialized CV parsing, or an alternative service). Set up API credentials and define extraction templates for skills, experience, education, and custom fields relevant to your industry.
4. **Integrate calendar systems**: Connect Google Calendar, Outlook, or your preferred scheduling platform. Configure availability windows for all hiring team members, set interview duration templates, and establish buffer times between meetings.
5. **Synchronize ATS platform**: Link your Applicant Tracking System (Workday, BambooHR, Greenhouse, etc.) to ensure seamless candidate data flow. Map workflow fields to your ATS schema and establish status update triggers.
6. **Connect interview tools**: Integrate video conferencing platforms (Zoom, Microsoft Teams, Google Meet) for automatic meeting room creation and invitation distribution. Configure recording settings and waiting room preferences.
7. **Link HRMS for onboarding**: Connect your Human Resource Management System to trigger personalized onboarding sequences based on role type, department, and seniority level.
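The compatibility scoring described above can be pictured as a weighted sum over the dimensions the parser extracts. The sketch below shows how such a score might be computed in an n8n Code node (Run Once for All Items); the weights, field names (`skillMatch`, `experienceRelevance`, etc.), and cutoff are hypothetical placeholders, not part of the template itself.

```javascript
// Minimal sketch of a multi-dimensional compatibility score.
// All field names and weights are illustrative assumptions.
const weights = {
  skillMatch: 0.4,          // technical skills alignment (0..1)
  experienceRelevance: 0.3, // relevance of past roles (0..1)
  cultureFit: 0.2,          // cultural fit indicators (0..1)
  trajectory: 0.1,          // career progression signal (0..1)
};

return items.map((item) => {
  const c = item.json; // one parsed candidate per item
  const score = Object.entries(weights).reduce(
    (sum, [dim, w]) => sum + w * (c[dim] ?? 0),
    0
  );
  return {
    json: {
      ...c,
      compatibilityScore: Math.round(score * 100), // 0..100
      qualified: score >= 0.7, // example cutoff for interview scheduling
    },
  };
});
```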
## Key Features
- **Advanced CV analysis**: Leverages machine learning to automatically extract and categorize skills, experience, education, certifications, and career progression patterns with 95% accuracy
- **Multi-criteria scoring**: Implements customizable evaluation matrices considering technical skills, soft skills, experience relevance, cultural fit indicators, and growth potential
- **Intelligent scheduling**: Automatically coordinates complex interview schedules across multiple stakeholders, considering time zones, availability preferences, and interview type requirements
- **Precise candidate matching**: Generates compatibility percentages based on job requirements, team dynamics, and long-term career alignment factors
- **Accelerated recruitment cycle**: Reduces time-to-hire by up to 60% through automated screening, intelligent prioritization, and streamlined communication workflows
- **Collaborative evaluation**: Enables structured feedback collection from multiple interviewers with standardized scoring rubrics and consensus-building tools
- **Enhanced candidate experience**: Provides mobile-optimized interfaces for application tracking, interview scheduling, and communication throughout the recruitment journey
- **Continuous optimization**: Automatically tracks and analyzes recruitment metrics to continuously improve scoring algorithms and process efficiency

## Customization options
The workflow offers extensive customization, including adjustable scoring weights for different criteria, industry-specific skill taxonomies, custom interview formats, and role-based onboarding paths. Organizations can configure approval workflows, set up custom notification templates, and establish specific integration parameters to match their unique recruitment processes and company culture.

This automation solution transforms recruitment from a time-intensive manual process into a strategic, data-driven system that improves both hiring quality and candidate experience while significantly reducing administrative overhead.
by Jay Hartley
## What this template does
This workflow collects order data as it is produced, then sends a summary email of all orders at the end of every day, formatted as a table. It receives new orders via webhook and stores them in Airtable. At 7 PM every day, it sends a summary email with the day's orders in an HTML table (a sketch of the table-building logic appears after the test steps).

## Setup
Instructions video

1. Create a new table in Airtable with a field `time` of type date, `orderID` of type number, and `orderPrice` also of type number.
2. Create a new access token if you haven't already at https://airtable.com/create/tokens/new. Make sure to give the token the scopes `data.records:read`, `data.records:write`, `schema.bases:read` and access to whichever table you choose to store the orders. A pop-up window appears with the token. Use this token to make a Create New Credential > Access Token for Airtable in the **Store Order** and **Airtable Get Today's Orders** nodes.
3. Create access credentials for your Gmail as described here: https://developers.google.com/workspace/guides/create-credentials. Use the credentials from your client_secret.json in the **Send to Gmail** node.
4. In the **Store Order** node, change Base and Table to the database and table in your Airtable account you wish to use to store orders. Make sure to use these same values in the **Airtable Get Today's Orders** node.
5. Every time an order is created in your system, send a POST request to the Webhook from your order software. Each request must contain a single order with fields `orderID` and `orderPrice` (or edit **Set Order Fields** to select which incoming fields you wish to save).
6. Change the schedule time for sending the email from **Everyday at 7PM** to whichever time you choose.

## Test
1. Activate the workflow.
2. From the **Webhook** node, copy the Production URL.
3. Send the following curl request to the URL given to you:
   `curl -X POST -H "Content-Type: application/json" -d '{"orderID": 12345, "orderPrice": 99.99}' YOUR_URL_HERE`
4. It should say "Node executed successfully". Now check your Airtable and confirm the order was stored in the right place.
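For reference, the daily summary table can be assembled in a Code node (Run Once for All Items) along these lines. This is a minimal sketch assuming the Airtable fields described above (`orderID`, `orderPrice`); the template's own formatting node may differ.

```javascript
// Build an HTML table from the day's Airtable records.
// Assumes each incoming item has json.orderID and json.orderPrice.
const rows = items
  .map((item) => {
    const { orderID, orderPrice } = item.json;
    return `<tr><td>${orderID}</td><td>$${Number(orderPrice).toFixed(2)}</td></tr>`;
  })
  .join('');

const html = `
  <table border="1" cellpadding="4">
    <tr><th>Order ID</th><th>Price</th></tr>
    ${rows}
  </table>`;

return [{ json: { html } }];
```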
by Adam Bertram
An AI-powered chat assistant that analyzes Azure virtual machine activity and generates detailed timeline reports showing VM state changes, performance metrics, and operational events over time.

## How It Works
The workflow starts with a chat trigger that accepts user queries about Azure VM analysis. A Google Gemini AI agent processes these requests and uses six specialized tools to gather comprehensive VM data from Azure APIs. The agent queries resource groups, retrieves VM configurations and instance views, pulls performance metrics (CPU, network, disk I/O), and collects activity log events. It then analyzes this data to create timeline reports showing what happened to VMs during specified periods, defaulting to the last 90 days unless the user specifies otherwise.

## Prerequisites
To use this template, you'll need:
- n8n instance (cloud or self-hosted)
- Azure subscription with virtual machines
- Microsoft Azure Monitor OAuth2 API credentials
- Google Gemini API credentials
- Proper Azure permissions to read VM data and activity logs

## Setup Instructions
1. Import the template into n8n.
2. Configure credentials:
   - Add Microsoft Azure Monitor OAuth2 API credentials with read permissions for VMs and activity logs
   - Add Google Gemini API credentials
3. Update workflow parameters:
   - Open the "Set Common Variables" node
   - Replace `<your azure subscription id here>` with your actual Azure subscription ID
4. Configure triggers:
   - The chat trigger will automatically generate a webhook URL for receiving chat messages
   - No additional trigger configuration is needed
5. Test the setup to ensure it works.

## Security Considerations
Use the minimum required Azure permissions (Reader role on the subscription or resource groups). Store API credentials securely in the n8n credential store. The Azure Monitor API has rate limits, so avoid excessive concurrent requests. Chat sessions use session-based memory that persists during a conversation but doesn't retain data between separate chat sessions.

## Extending the Template
You can add more Azure monitoring tools such as disk metrics, network security group logs, or Application Insights data. The AI agent can be enhanced with additional tools for Azure cost analysis, security recommendations, or automated remediation actions. You could also integrate with alerting systems or export reports to external storage or reporting platforms. A sketch of the kind of call such tools make follows.
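If you add tools of your own, they will typically wrap Azure Monitor REST calls like the standalone Node.js sketch below (Node 18+, global fetch). The subscription ID, resource group, VM name, and token are placeholders; in the workflow itself these calls are made by the agent's HTTP tools using the Azure Monitor OAuth2 credential.

```javascript
// Sketch: fetch CPU metrics for one VM from the Azure Monitor REST API.
const subscriptionId = '<subscription-id>';
const resourceGroup = '<resource-group>';
const vmName = '<vm-name>';

const resourceId =
  `/subscriptions/${subscriptionId}/resourceGroups/${resourceGroup}` +
  `/providers/Microsoft.Compute/virtualMachines/${vmName}`;

const url =
  `https://management.azure.com${resourceId}` +
  `/providers/microsoft.insights/metrics` +
  `?api-version=2018-01-01` +
  `&metricnames=${encodeURIComponent('Percentage CPU')}` +
  `&timespan=2024-01-01T00:00:00Z/2024-01-08T00:00:00Z` +
  `&interval=PT1H`;

const res = await fetch(url, {
  headers: { Authorization: 'Bearer <access-token>' }, // placeholder token
});
const metrics = await res.json();
// Each metric has one or more timeseries of { timeStamp, average, ... } points.
console.log(metrics.value?.[0]?.timeseries ?? metrics);
```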
by explorium
## HubSpot Contact Enrichment with Explorium Template
Download the following JSON file and import it into a new n8n workflow: hubspot_flow.json

## Overview
This n8n workflow monitors your HubSpot instance for newly created contacts and automatically enriches them with additional contact information. When a contact is created, the workflow:

1. Detects the new contact via a HubSpot webhook trigger
2. Retrieves recent contact details from HubSpot
3. Matches the contact against Explorium's database using name, company, and email
4. Enriches the contact with professional emails and phone numbers
5. Updates the HubSpot contact record with the discovered information

This automation ensures your sales and marketing teams have complete contact information, improving outreach success rates and data quality.

## Key Features
- **Real-time Webhook Trigger**: Instantly processes new contacts as they're created
- **Intelligent Matching**: Uses multiple data points (name, company, email) for accurate matching
- **Comprehensive Enrichment**: Adds both professional and work emails, plus phone numbers
- **Batch Processing**: Efficiently handles multiple contacts to optimize API usage
- **Smart Data Mapping**: Intelligently maps multiple emails and phone numbers
- **Profile Enrichment**: Optional additional enrichment for deeper contact insights
- **Error Resilience**: Continues processing other contacts if some fail to match

## Prerequisites
Before setting up this workflow, ensure you have:
- n8n instance (self-hosted or cloud)
- HubSpot account with:
  - Developer API access (for webhooks)
  - Private App or OAuth2 app created
  - Contact object permissions (read/write)
- Explorium API credentials (Bearer token) - Get explorium api key
- Understanding of HubSpot contact properties

## HubSpot Requirements
### Required Contact Properties
The workflow uses these HubSpot contact properties:
- `firstname` - Contact's first name
- `lastname` - Contact's last name
- `company` - Associated company name
- `email` - Primary email (read and updated)
- `work_email` - Work email (updated by workflow)
- `phone` - Phone number (updated by workflow)

### API Access Setup
Create a Private App in HubSpot:
1. Navigate to Settings → Integrations → Private Apps
2. Create a new app with Contact read/write scopes
3. Copy the Access Token

Set up Webhooks (for Developer API):
1. Create an app in the HubSpot Developers portal
2. Configure a webhook for contact.creation events
3. Note the App ID and Developer API Key

### Custom Properties (Optional)
Consider creating custom properties for:
- Multiple email addresses
- Mobile vs. office phone numbers
- Data enrichment timestamps
- Match confidence scores

## Installation & Setup
### Step 1: Import the Workflow
1. Copy the workflow JSON from the template
2. In n8n: Navigate to Workflows → Add Workflow → Import from File
3. Paste the JSON and click Import

### Step 2: Configure HubSpot Developer API (Webhook)
1. Click on the HubSpot Trigger node
2. Under Credentials, click Create New
3. Enter your HubSpot Developer credentials:
   - App ID: From your HubSpot app
   - Developer API Key: From your developer account
   - Client Secret: From your app settings
4. Save as "HubSpot Developer account"

### Step 3: Configure HubSpot App Token
1. Click on the HubSpot Recently Created node
2. Under Credentials, click Create New (App Token)
3. Enter your Private App access token
4. Save as "HubSpot App Token account"
5. Apply the same credentials to the Update HubSpot node

### Step 4: Configure Explorium API Credentials
1. Click on the Explorium Match Prospects node
2. Under Credentials, click Create New (HTTP Header Auth)
3. Configure the authentication:
   - Name: Authorization
   - Value: Bearer YOUR_EXPLORIUM_API_TOKEN
4. Save as "Header Auth Connection"
5. Apply to all Explorium nodes:
   - Explorium Enrich Contacts Information
   - Explorium Enrich Profiles

### Step 5: Configure Webhook Subscription
In the HubSpot Developers portal:
1. Go to your app's webhook settings
2. Add a subscription for contact.creation events
3. Set the target URL from the HubSpot Trigger node
4. Activate the subscription

### Step 6: Activate the Workflow
1. Save the workflow
2. Toggle the Active switch to ON
3. The webhook is now listening for new contacts

## Node Descriptions
- **HubSpot Trigger**: Webhook that fires when new contacts are created
- **HubSpot Recently Created**: Fetches details of recently created contacts
- **Loop Over Items**: Processes contacts in batches of 6
- **Explorium Match Prospects**: Finds the matching person in Explorium's database
- **Filter**: Validates successful matches
- **Extract Prospect IDs**: Collects matched prospect identifiers
- **Enrich Contacts Information**: Fetches emails and phone numbers
- **Enrich Profiles**: Gets additional profile data (optional)
- **Merge**: Combines all enrichment results
- **Split Out**: Separates individual enriched records
- **Update HubSpot**: Updates the contact with the new information

## Data Mapping Logic
The workflow maps Explorium data to HubSpot properties:

| Explorium Data | HubSpot Property | Notes |
| --- | --- | --- |
| `professions_email` | `email` | Primary professional email |
| `emails[].address` | `work_email` | All email addresses joined |
| `phone_numbers[].phone_number` | `phone` | All phones joined with commas |
| `mobile_phone` | `phone` (fallback) | Used if no other phones found |

## Data Processing
The workflow handles complex data scenarios (a sketch of the join logic appears just before Troubleshooting):
- **Multiple emails**: Joins all discovered emails with commas
- **Phone numbers**: Combines all phone numbers into a single field
- **Missing data**: Uses "null" as a placeholder for empty fields
- **Name parsing**: Cleans sample data and special characters

## Usage & Operation
### Automatic Processing
Once activated:
1. Every new contact triggers the webhook immediately
2. The contact is enriched within seconds
3. The HubSpot record is updated automatically
4. The process repeats for each new contact

### Manual Testing
To test the workflow:
1. Use the pinned test data in the HubSpot Trigger node, or
2. Create a test contact in HubSpot
3. Monitor the execution in n8n
4. Verify the contact was updated in HubSpot

### Monitoring Performance
Track workflow health:
1. Go to Executions in n8n
2. Filter by this workflow
3. Monitor success rates
4. Review any failed executions
5. Check webhook delivery in HubSpot
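The email/phone join behavior described under Data Processing could be expressed in an n8n Code node (Run Once for All Items) roughly like this. The Explorium field names follow the mapping table above; treat the exact response shape as an assumption.

```javascript
// Sketch: flatten an Explorium enrichment response into HubSpot fields.
// Response shape (emails[], phone_numbers[], mobile_phone) is assumed
// from the mapping table above.
return items.map(({ json }) => {
  const d = json.data ?? {};

  const workEmails = (d.emails ?? [])
    .map((e) => e.address)
    .filter(Boolean)
    .join(', ');

  const phones = (d.phone_numbers ?? [])
    .map((p) => p.phone_number)
    .filter(Boolean)
    .join(', ');

  return {
    json: {
      email: d.professions_email ?? 'null',      // primary professional email
      work_email: workEmails || 'null',          // all addresses, comma-joined
      phone: phones || d.mobile_phone || 'null', // mobile as fallback
    },
  };
});
```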
## Troubleshooting
### Common Issues
**Webhook not triggering**
- Verify the webhook subscription is active in HubSpot
- Check the webhook URL is correct and accessible
- Ensure the workflow is activated in n8n
- Test webhook delivery in the HubSpot developers portal

**Contacts not matching**
- Verify the contact has firstname, lastname, and company
- Check for typos or abbreviations in company names
- Some individuals may not be in Explorium's database
- Email matching improves accuracy significantly

**Updates failing in HubSpot**
- Check the API token has contact write permissions
- Verify the property names exist in HubSpot
- Ensure rate limits haven't been exceeded
- Check for validation rules on properties

**Missing enrichment data**
- Not all prospects have all data types
- Phone numbers may be less available than emails
- Profile enrichment is optional and may not always return data

### Error Handling
Built-in error resilience:
- Failed matches don't block other contacts
- Each batch processes independently
- Partial enrichment is possible
- All errors are logged for review

### Debugging Tips
- **Check webhook logs**: HubSpot shows delivery attempts
- **Review executions**: n8n logs show detailed error messages
- **Test with pinned data**: Use the sample data for isolated testing
- **Verify API responses**: Check that the Explorium API returns the expected data

## Best Practices
### Data Quality
- **Complete contact records**: Ensure name and company are populated
- **Standardize company names**: Use official names, not abbreviations
- **Include existing emails**: Improves match accuracy
- **Regular data hygiene**: Clean up test and invalid contacts

### Performance Optimization
- **Batch size**: 6 is optimal for rate limits
- **Webhook reliability**: Monitor delivery success
- **API quotas**: Track usage in both platforms
- **Execution history**: Regularly clean old executions

### Compliance & Privacy
- **GDPR compliance**: Ensure a lawful basis for enrichment
- **Data minimization**: Only enrich necessary fields
- **Access controls**: Limit who can modify enriched data
- **Audit trail**: Document enrichment for compliance

## Customization Options
### Additional Enrichment
Extend with more Explorium data:
- Job titles and departments
- Social media profiles
- Professional experience
- Skills and interests
- Company information

### Enhanced Processing
Add workflow logic for:
- Lead scoring based on enrichment
- Routing based on data quality
- Notifications for high-value matches
- Custom field mapping

### Integration Extensions
Connect to other systems:
- Sync enriched data to a CRM
- Trigger marketing automation
- Update a data warehouse
- Send notifications to Slack

## API Considerations
### HubSpot Limits
- **API calls**: Monitor daily limits
- **Webhook payload**: Max 200 contacts per trigger
- **Rate limits**: 100 requests per 10 seconds
- **Property limits**: Max 1000 custom properties

### Explorium Limits
- **Match API**: Batched for efficiency
- **Enrichment calls**: Two parallel enrichments
- **Rate limits**: Based on your plan
- **Data freshness**: Real-time matching

## Architecture Considerations
This workflow integrates with:
- HubSpot workflows and automation
- Marketing campaigns and sequences
- Sales engagement tools
- Reporting and analytics
- Other enrichment services

## Security Best Practices
- **Webhook validation**: Verify requests are from HubSpot
- **Token security**: Rotate API tokens regularly
- **Access control**: Limit workflow modifications
- **Data encryption**: All API calls use HTTPS
- **Audit logging**: Track all enrichments

## Advanced Configuration
### Custom Field Mapping
Modify the Update HubSpot node to map to custom properties:

```json
// Example custom mapping
{
  "custom_mobile": "{{ $json.data.mobile_phone }}",
  "custom_linkedin": "{{ $json.data.linkedin_url }}",
  "enrichment_date": "{{ $now.toISO() }}"
}
```

### Conditional Processing
Add logic to process only certain contacts:
- Filter by contact source
- Check for specific properties
- Validate email domains
- Exclude test contacts

## Support Resources
For assistance:
- **n8n issues**: Check the n8n documentation and forums
- **HubSpot API**: Reference the HubSpot developers documentation
- **Explorium API**: Contact Explorium support
- **Webhook issues**: Use HubSpot webhook testing tools
by Pavel Zamorev
This n8n template automates the transformation of raw meeting notes into structured tasks and documents using GPT (or another model), syncing them to Notion and TickTick via a Telegram bot.

## Use Cases
- Automate note-taking and formatting for daily standups, brainstorming sessions, or client calls.
- Reduce cognitive load by eliminating manual tracking of ideas and tedious formatting.
- Convert discussions into actionable tasks instantly, with tasks in TickTick and structured notes in Notion.

## How It Works
1. **Capture Notes**: Send raw meeting notes to a Telegram bot.
2. **AI Processing**: The workflow sends the text to the AI, which:
   - Removes duplicates and extracts key points.
   - Formats content into structured Markdown notes for Notion.
   - Identifies tasks with deadlines (e.g., "- Prepare presentation (Responsible: John, Deadline: Friday)").
3. **Task Parsing**: Extracts task titles, removing metadata like "Responsible" and "Deadline" (a code sketch appears after the customization list below).
4. **Review & Edit**: The bot returns formatted notes and tasks for review in Telegram.
5. **Sync & Publish**: Notes are published to a Notion database; tasks are exported to TickTick via API.
6. **Confirmation**: A Telegram reaction (e.g., a 👍 emoji) confirms successful processing.

## Setup Instructions
1. **Set Up Telegram Bot**:
   - Create a Telegram bot via BotFather and obtain an API token.
   - Add the token to the "Telegram Trigger" and "Send-Edited-Notes" nodes under credentials (telegramApi).
2. **Configure OpenAI**:
   - Obtain an OpenAI API key and add it to the "Edit-Notes" node (openAiApi credentials).
   - Ensure the model is set to gpt-4.1-mini in the node parameters.
3. **Set Up Notion**:
   - Create a Notion database for notes (e.g., "Meetings").
   - Add the database ID to the "Create a Database Page" node (databaseId).
   - Configure Notion API credentials (notionApi) in the node.
4. **Set Up TickTick**:
   - Obtain a TickTick API key and add it to the "Create a Task" node (tickTickOAuth2Api credentials).
   - Specify your TickTick project ID in the node (projectId).
5. **Deploy Workflow**:
   - Ensure your n8n instance is self-hosted to support community nodes (TickTick, Notion).
   - Activate the workflow in n8n.
6. **Test**:
   - Send a test message to the Telegram bot (e.g., "Discussed project timeline. Tasks: - Prepare slides (Responsible: Alice, Deadline: Friday)").
   - Verify that notes appear in Notion, tasks in TickTick, and a 👍 reaction in Telegram.

## Configuration Examples
Telegram Trigger:

```json
{
  "parameters": {
    "updates": ["message"],
    "additionalFields": {}
  },
  "credentials": {
    "telegramApi": {
      "id": "your-telegram-api-id",
      "name": "meeting notes"
    }
  }
}
```

OpenAI Prompt (in the "Edit-Notes" node):

```
Analyze the quick meeting notes from {{ $json.message.text }}
Generate meeting notes and a task list in the following format:
Meeting Notes:
- [Note 1]
- [Note 2]

Tasks:
- [Task 1]
- [Task 2]
```

Notion Database Page:

```json
{
  "parameters": {
    "resource": "databasePage",
    "databaseId": "your-notion-database-id",
    "title": "MN {{ $now }}",
    "blockUi": {
      "blockValues": [
        {
          "textContent": "{{ $json.message.text }}"
        }
      ]
    }
  }
}
```

## Requirements
- An OpenAI API key (or another model).
- APIs: Pre-configured Notion and TickTick API credentials are required. The template includes setup guides.
- Setup: Uses community nodes, requiring a self-hosted n8n instance.

## Customizing This Workflow
- Replace the Telegram bot with a webhook or form for alternative inputs (e.g., mobile apps).
- Modify the OpenAI prompt in the "Edit-Notes" node to customize note and task formats.
- Add filters in the "Split Notes and Tasks" node to prioritize tasks (e.g., ++#urgent++).
- Integrate Google Calendar via an additional HTTP Request node to auto-set deadlines based on text (e.g., "by Friday").
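As a sketch of the task-parsing step described above, a Code node (Run Once for All Items) could strip the "Responsible"/"Deadline" metadata with a regular expression like this; the exact implementation in the template may differ.

```javascript
// Sketch: extract clean task titles from AI output lines like
// "- Prepare slides (Responsible: Alice, Deadline: Friday)"
const text = items[0].json.message?.text ?? '';

const tasks = text
  .split('\n')
  .filter((line) => line.trim().startsWith('- '))
  .map((line) =>
    line
      .replace(/^-\s*/, '')                                   // drop the bullet
      .replace(/\s*\((Responsible|Deadline)[^)]*\)\s*$/i, '') // drop metadata
      .trim()
  );

return tasks.map((title) => ({ json: { title } }));
```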
by Robert Breen
This n8n workflow reads emails from your Outlook inbox, drafts AI-powered replies using OpenAI, and routes them through the gotoHuman node for human approval before replying automatically.

## Key Features
- **Reads Outlook emails** from today only (excluding those from your own address).
- **AI-generated replies** crafted by OpenAI from the subject and body of each email.
- **Community node integration**: Uses the gotoHuman node for human review and approval of replies before sending.
- **Safe sending**: Only approved responses are automatically sent back via Outlook.
- **Expandable**: Can easily be modified to:
  - Send drafts instead of full replies
  - Include additional email filters
  - Trigger at intervals or via webhook

## Nodes Used
- Microsoft Outlook — fetch and reply to emails
- OpenAI — generates the smart reply text
- gotoHuman — human-in-the-loop approval system
- Loop Over Items, IF, Code, and Set nodes for processing logic
- Manual Trigger — for testing

## Setup Instructions
### 1. Connect APIs
**Outlook OAuth2**:
- Go to the Azure Portal
- Register an app
- Add the Mail.Read and Mail.Send scopes
- Set redirect URI: https://api.n8n.cloud/oauth2-credential/callback
- Paste the credentials into the n8n credential manager

**OpenAI API**:
- Create an account at OpenAI
- Create an API key
- Add it to n8n credentials

**gotoHuman API**:
- Go to https://gotoHuman.ai and sign in
- Create a review template (e.g., "Email Responses")
- Copy the Template ID and API key into n8n credentials

## Workflow Steps Overview
### 1. Trigger
Use the Manual Trigger to test, or schedule execution with a cron node.

### 2. Filter Emails from Today
A Code node outputs today's date in the proper yyyy-mm-dd format:

```javascript
const today = new Date();
today.setHours(0, 0, 0, 0);
return [{
  json: {
    searchQuery: `received:${today.toISOString().split('T')[0]}`
  }
}];
```

### 3. Search and Filter Outlook Messages
Uses the Outlook node with a search query like the following (update to your email):

```
received:2025-08-06 -from:rbreen@ynteractive.com
```

### 4. Generate AI Response
Text prompt to OpenAI:

```
subject: {{ $json.subject }}
body: {{ $json.body.content }}
```

System prompt:

> You are a personal assistant helping respond to emails. I am an AI automation expert specializing in helping small and medium-size businesses automate processes. Create a short response to the email. Sign the email as Robert Breen.

### 5. Review with gotoHuman
Submit the AI output for human approval using the gotoHuman node. The output schema should match the Review Template fields (e.g., "email", "OriginalEmail").

### 6. IF Node Decision
- If the status is approved, send the reply
- If not, return to the loop for revision or skip

## Customization Ideas
- Send only drafts by skipping the "reply" step and storing results.
- Schedule the workflow with a Cron trigger for full automation.
- Add label filters or subject keywords for advanced targeting.

## External Links
- gotoHuman Community Node
- OpenAI
- Microsoft Outlook API Setup

## Need More Help?
If you'd like help customizing this or building similar automations, reach out:

Robert Breen
AI & Automation Consultant
Website: https://ynteractive.com
Email: robert.j.breen@gmail.com
LinkedIn
by WeblineIndia
## Smart Document Parser for Invoices, Logs or Sensor Reports (PDF/Image to Google Sheets)
This n8n workflow automatically parses documents such as invoices, sensor logs, or structured PDFs/images (including scanned docs or CSVs), extracts key fields like totals, dates, and customer/vendor info using OCR and AI, and writes the structured output into Google Sheets.

## Who it's for
- Finance or ops teams automating invoice processing.
- SaaS platforms parsing uploaded reports or documents.
- Anyone needing a no-code backend for PDF/image/CSV document parsing.
- AI-powered data capture pipelines.

## How it works
1. A Webhook Trigger receives file uploads (/uploadDoc).
2. A Switch node checks the file type:
   - If image → use Tesseract OCR
   - If PDF → use the PDF parser
   - If CSV → extract as-is
3. The extracted text is passed to a Google Gemini or Gemini Flash AI model; the prompt extracts fields like invoice_id, total, customer_name, etc. (an example prompt appears at the end of this template).
4. The JSON string is parsed and cleaned.
5. Data is appended to Google Sheets using appendOrUpdate.

## How to set up
1. Create a Google Sheet with columns like: invoice_id, invoice_date, due_date, customer_name, vendor_name, subtotal, tax_total, total, currency
2. Connect:
   - Google Sheets OAuth
   - Google Gemini (PaLM API key) for LLM parsing
3. Deploy the webhook endpoint: /uploadDoc
4. Upload sample files (PDFs, images, CSVs) to test.
5. Review and map sheet columns in the Invoice Data node.

## Requirements
| Tool | Purpose |
| --- | --- |
| n8n | Automation framework |
| Google Sheets | To store structured output |
| Tesseract OCR | For scanned image text extraction |
| Google Gemini | For natural language parsing |

## How to customize
- Add extraction for line items using structured prompts.
- Change the prompt to extract sensor readings, log types, or custom keys.
- Add support for other file types (e.g., XLSX, DOCX).
- Add Slack/email notifications on success/failure.
- Swap Gemini for OpenAI or Hugging Face if preferred.

## Add-ons
- Save uploaded files to Google Drive or S3
- Add auth for secure uploads
- Use charting/dashboard nodes to visualize extracted data
- Integrate with billing/accounting software

## Use Case Examples
| Scenario | What Happens |
| --- | --- |
| Invoice Upload (PDF) | Extracts totals, customer, tax data into a Google Sheet |
| Scanned Receipt (Image) | OCR + LLM extracts structured data |
| Log File (CSV) | Parses and logs entries into Sheets |

## Common troubleshooting
| Issue | Possible Cause | Solution |
| --- | --- | --- |
| Webhook not triggered | URL or method mismatch | Use the correct POST URL /uploadDoc |
| Text is blank | OCR failed | Check image quality or Tesseract config |
| Gemini model not returning JSON | Prompt formatting issue | Ensure the prompt ends with a valid JSON schema |
| Sheet not updated | Invalid Sheet ID or tab | Double-check sheet credentials and tab name |

## Need Help?
Need help fine-tuning the Gemini prompt for better field accuracy? Want to extract full tables, multi-page invoices, or convert PDFs to JSON lines? Our automation team at WeblineIndia can help you extend this into a full-blown document automation pipeline.
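For reference, an extraction prompt of the kind described above might look like the following. The exact prompt shipped in the template may differ; the field list mirrors the sheet columns from the setup steps, and the `{{ $json.extractedText }}` expression is a placeholder for wherever your workflow stores the OCR/parser output.

```
You are an invoice parser. From the document text below, extract the
following fields and respond with ONLY a valid JSON object, no prose:

{
  "invoice_id": "",
  "invoice_date": "",
  "due_date": "",
  "customer_name": "",
  "vendor_name": "",
  "subtotal": 0,
  "tax_total": 0,
  "total": 0,
  "currency": ""
}

Use null for any field you cannot find. Document text:
{{ $json.extractedText }}
```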
by Niranjan G
## Automated GitHub Scanner for Exposed AWS IAM Keys

### Overview
This n8n workflow automatically scans GitHub for exposed AWS IAM access keys associated with your AWS account, helping security teams quickly identify and respond to potential security breaches. When compromised keys are found, the workflow generates detailed security reports and sends Slack notifications with actionable remediation steps.

### Key Features
- **Automated AWS IAM Key Scanning**: Regularly checks for exposed AWS access keys on GitHub
- **Real-time Security Alerts**: Sends immediate Slack notifications when compromised keys are detected
- **Comprehensive Security Reports**: Generates detailed reports with exposure information and risk assessment
- **Actionable Remediation Steps**: Provides clear instructions for securing compromised credentials
- **Continuous Monitoring**: Maintains ongoing surveillance of your AWS environment

### Workflow Steps
1. **List AWS Users**: Retrieves all users from your AWS account
2. **Split Users for Processing**: Processes each user individually
3. **Get User Access Keys**: Retrieves access keys for each user
4. **Filter Active Keys Only**: Focuses only on currently active access keys
5. **Search GitHub for Exposed Keys**: Scans GitHub repositories for exposed access keys (a sketch of this call appears at the end of this template)
6. **Aggregate Search Results**: Consolidates and deduplicates search findings
7. **Check For Compromised Keys**: Determines if any keys have been exposed
8. **Generate Security Report**: Creates detailed security reports for compromised keys
9. **Extract AWS Usernames**: Extracts usernames from the AWS response for notification
10. **Format Slack Alert**: Prepares comprehensive Slack notifications
11. **Send Slack Notification**: Delivers alerts with actionable information
12. **Continue Scanning**: Maintains the continuous monitoring cycle

### Setup Requirements
**Prerequisites**
- Active n8n instance
- AWS account with IAM permissions
- GitHub account/token for searching repositories
- Slack workspace for notifications

**Required Credentials**
- AWS credentials: IAM user with permissions to list users and access keys; Access Key ID and Secret Access Key
- GitHub credentials: Personal Access Token with search permissions
- Slack credentials: Webhook URL for your notification channel

### Configuration
1. **AWS**: Configure the "List AWS Users" node with your AWS credentials; ensure proper IAM permissions for listing users and access keys.
2. **GitHub**: Set up the "Search GitHub for Exposed Keys" node with your GitHub token; adjust search parameters if needed.
3. **Slack**: Configure the Slack node with your webhook URL; customize the notification format if desired.

### Usage
**Running the Workflow**
- Manual execution: Click "Execute Workflow" to run an immediate scan
- Scheduled execution: Set up a schedule to run periodic scans (daily or weekly recommended)

**Repository Compatibility**
This workflow is compatible with both public and private GitHub repositories to which you have access. It will scan all repositories you have permission to view based on your GitHub credentials.

**Handling Alerts**
When a compromised key is detected:
1. Review the Slack notification for details about the exposure
2. Follow the recommended remediation steps:
   - Deactivate the compromised key immediately
   - Create a new key if needed
   - Investigate the exposure source
   - Update any services using the compromised key

### Disclaimer
This workflow template is provided for reference purposes only, to demonstrate how to automate AWS IAM key exposure scanning.
Please note:
- The scanning process may produce false positives, as it only matches potential AWS access key patterns
- Always verify any reported exposures manually before taking action
- Disabling or deleting access keys without proper verification could have significant negative impacts on your environment
- Understand which systems and applications rely on identified access keys before deactivating them
- This template should be customized to fit your specific environment and security policies

IMPORTANT: Use this workflow with caution and only after thoroughly understanding your AWS environment. The authors of this template are not responsible for any disruptions or damages resulting from its use.

### Security Considerations
- This workflow requires access to sensitive AWS credentials
- Store all credentials securely within n8n
- Review and rotate access keys regularly

### Customization Options
- Adjust GitHub search parameters for more targeted scanning
- Customize the Slack notification format and content
- Modify security report generation for your specific needs
- Integrate with additional notification channels (email, MS Teams, etc.)

### Optional: Enabling Interactive Slack Buttons
The Slack Block Kit notification format supports interactive buttons that can be implemented if you want to perform actions directly from Slack:

- **Disable Key**: Can be configured to automatically disable the compromised AWS IAM access key
- **View Details**: Can be set up to show additional information about the exposure
- **Acknowledge**: Can be used to mark the alert as acknowledged

To make these buttons functional:

1. **Set up a Slack Socket Mode app**:
   - Create a Slack app in the Slack API Console
   - Enable Socket Mode and Interactive Components
   - Subscribe to the block_actions event to capture button clicks
2. **Create an n8n webhook endpoint**:
   - Add a new webhook node to receive Slack button click events
   - Create separate workflows for each button action
3. **Implement AWS key disabling**:
   - For the "Disable Key" button, create a workflow that uses the n8n HTTP Request node to call the AWS IAM UpdateAccessKey API
   - Example HTTP request that can be implemented in n8n:
     - Method: POST
     - URL: https://iam.amazonaws.com/
     - Query Parameters:
       - Action: UpdateAccessKey
       - AccessKeyId: AKIAIOSFODNN7EXAMPLE
       - Status: Inactive
       - UserName: {{$json.username}}
       - Version: 2010-05-08
4. **Update the Slack message format**:
   - Modify the Format Slack Alert node to include your webhook URL in the button action values
   - Add callback_id and action_id values to identify which button was clicked

This implementation allows for an immediate response to security incidents directly from the Slack interface, reducing response time and improving security posture.
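Returning to the core scanning step: the GitHub code search call behind "Search GitHub for Exposed Keys" looks roughly like this standalone Node.js sketch (Node 18+, global fetch). The token is a placeholder, and the key shown is AWS's documentation example; searching for the access key ID is safe since it is the public half of the credential pair.

```javascript
// Sketch: search GitHub code for an AWS access key ID.
const accessKeyId = 'AKIAIOSFODNN7EXAMPLE'; // AWS documentation example key

const res = await fetch(
  `https://api.github.com/search/code?q=${encodeURIComponent(`"${accessKeyId}"`)}`,
  {
    headers: {
      Accept: 'application/vnd.github+json',
      Authorization: 'Bearer <github-personal-access-token>', // placeholder
    },
  }
);

const { total_count, items: hits } = await res.json();
console.log(`${total_count} potential exposures found`);
for (const hit of hits ?? []) {
  console.log(`${hit.repository.full_name}: ${hit.html_url}`);
}
```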
by Danger
## How it Works
This meta-workflow intelligently scans all your active workflows in n8n, identifies those that contain Webhook nodes, and automatically generates a Swagger (OpenAPI) specification from them. The output Swagger document reflects all accessible endpoints from your Webhook nodes, making it easier to:

- Visualize your API structure
- Share your endpoints
- Integrate with tools like Postman or Swagger UI

## Enhanced Parameter Support
If you want the Swagger to reflect request parameters (e.g., query or body fields), you can annotate your Webhook nodes using the Note section. When configured properly, these annotations enrich your Swagger documentation with parameter names, types, and descriptions.

## Setup Steps
1. **Add the WebhookDocs to n8n**: Import the WebhookDocs JSON file into your n8n instance.
2. **Activate the WebhookDocs** (you can also use the test endpoint).
3. **Annotate Webhook nodes (optional but recommended)**: To enable parameter documentation, open the Note section of each Webhook node and add annotations in the following format (see the example below):

   ```
   //@body field_name string description
   //@query field_name string description
   ```
4. Open the page https://n8n.yourinstance.com/webhook/swagger
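For instance, a Webhook node serving a GET endpoint could carry a Note like this (the field names are made up for illustration):

```
//@query userId string ID of the user to look up
//@query includeOrders boolean Whether to embed the user's orders
```

Each annotated line then surfaces as a documented parameter (name, type, description) on that endpoint in the generated OpenAPI specification.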
by VKAPS IT
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## How it works
This workflow captures new lead information from a web form, enriches it with Apollo.io data, qualifies the lead using AI, and, if the lead is strong, automatically sends a personalized outreach email via Gmail and logs the result in Google Sheets.

## Key Features
- Lead form capture with validation
- Enrichment via the Apollo API
- Lead scoring using AI (LangChain + Groq)
- Dynamic email generation and sending via Gmail
- Logging leads with job title and organization into Google Sheets
- Conditional email sending (score ≥ 6 only; see the expression example at the end of this template)

## Set up steps
Estimated time: 15-20 minutes

1. Add your Apollo API key to the HTTP Header credential (never hardcode it!)
2. Connect your Gmail account for sending emails
3. Connect your Google Sheets account and set up the correct spreadsheet and sheet name
4. Enable LangChain/Groq credentials for lead scoring and AI-generated emails
5. Update the form endpoint to your live webhook if needed

## Sticky Notes
Add the following mandatory sticky notes inside your workflow:

- **FormTrigger node**: "Collects lead info via form. Ensure your form is connected to this endpoint."
- **HTTP Request node**: "Enrich lead using Apollo.io API. Add your API key via header-based authentication."
- **AI Agent (Lead Score)**: "Scores lead from 1-10 based on job title and industry match. Only leads with score ≥ 6 proceed."
- **AI Agent (Email Composer)**: "Generates a concise, polite email using lead's job title & company. Modify tone if needed."
- **Google Sheets Append**: "Logs enriched lead with job title, org, and LinkedIn URL. Customize sheet structure if needed."
- **Gmail node**: "Sends personalized outreach email if lead passes score threshold. Uses AI-generated content."

## Free or Paid?
Free: no paid API services are required (Apollo has a free tier).
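The score gate described above is an ordinary numeric comparison; as an n8n IF node condition it could be expressed with an expression along these lines (the `score` field name is an assumption about the Lead Score agent's output):

```
{{ $json.score >= 6 }}
```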
by explorium
## Explorium Prospects Search Chatbot Template
Download the following JSON file and import it into a new n8n workflow: mcp_to_prospects_to_csv.json

## Overview
This n8n workflow creates a chatbot that understands natural language requests for finding business prospects and automatically:

1. Interprets your query using AI (Claude Sonnet 3.7)
2. Converts it to proper Explorium API filters
3. Validates the API request structure
4. Fetches prospect data from Explorium
5. Exports results as a downloadable CSV file

Perfect for sales teams, recruiters, and business development professionals who need to quickly find and export targeted prospect lists without learning complex API syntax.

## Key Features
- **Natural Language Interface**: Simply describe who you're looking for in plain English
- **Smart Query Translation**: AI converts your request to valid API parameters
- **Built-in Validation**: Ensures API calls meet Explorium's requirements
- **Error Recovery**: Automatically retries with corrections if validation fails
- **Pagination Support**: Handles large result sets automatically
- **CSV Export**: Clean, formatted output ready for CRM import
- **Conversation Memory**: Maintains context for follow-up queries

## Example Queries
The chatbot understands queries like:

- "Find marketing directors at SaaS companies in New York with 50-200 employees"
- "Get me CTOs from fintech startups in California"
- "Show me sales managers at healthcare companies with revenue over $10M"
- "Find engineers at Microsoft with 3-5 years experience"
- "Get customer service leads from e-commerce companies in Europe"

## Prerequisites
Before setting up this workflow, ensure you have:

- n8n instance with the chat interface enabled
- Anthropic API key for Claude
- Explorium API credentials (Bearer token) - Get explorium api key
- Basic understanding of n8n chat workflows

## Supported Filters
The chatbot can search using these criteria:

### Company Filters
- **Size**: 1-10, 11-50, 51-200, 201-500, 501-1000, 1001-5000, 5001-10000, 10001+ employees
- **Revenue**: Ranges from $0-500K up to $10T+
- **Age**: 0-3, 3-6, 6-10, 10-20, 20+ years
- **Location**: Countries, regions, cities
- **Industry**: Google categories, NAICS codes, LinkedIn categories
- **Name**: Specific company names

### Prospect Filters
- **Job Level**: CXO, VP, Director, Manager, Senior, Entry, etc.
- **Department**: Sales, Marketing, Engineering, Finance, HR, etc.
- **Experience**: Total months and current role duration
- **Location**: Country and region codes
- **Contact Info**: Filter by email/phone availability

## Installation & Setup
### Step 1: Import the Workflow
1. Copy the workflow JSON from the template
2. In n8n: Workflows → Add Workflow → Import from File
3. Paste the JSON and click Import

### Step 2: Configure Anthropic Credentials
1. Click on the Anthropic Chat Model1 node
2. Under Credentials, click Create New
3. Add your Anthropic API key
4. Name: "Anthropic API"
5. Save the credentials

### Step 3: Configure Explorium Credentials
You'll need to set up Explorium credentials in two places.

For the MCP Client:
1. Click on the MCP Client node
2. Under Credentials, create a new Header Auth
3. Add your authentication header (usually Authorization: Bearer YOUR_TOKEN)
4. Save the credentials

For API calls:
1. Click on the Prospects API Call node
2. Use the same Header Auth credentials created above
3. Verify the API endpoint is correct

### Step 4: Activate the Workflow
1. Save the workflow
2. Click the Active toggle to enable it
3. The chat interface will now be available

### Step 5: Access the Chat Interface
1. Click on the When chat message received node
2. Copy the webhook URL
3. Access this URL in your browser to start chatting

## How It Works
### Workflow Architecture
1. **Chat Trigger**: Receives natural language queries from users
2. **Memory Buffer**: Maintains conversation context
3. **AI Agent**: Interprets queries and generates API parameters
4. **Validation**: Checks the API structure against Explorium requirements
5. **API Call**: Fetches prospect data with pagination
6. **Data Processing**: Formats results for CSV export
7. **File Conversion**: Creates a downloadable CSV file

### Processing Flow

```
User Query → AI Interpretation → Validation → API Call → CSV Export
                    ↑                 |
                    └── Error Correction Loop ──┘
```

### Validation Rules
The workflow validates that:

- Filter keys are allowed by the Explorium API
- Values match expected formats (e.g., valid country codes)
- Range filters have proper gte/lte values
- No duplicate values appear in arrays
- The required structure is maintained

## Usage Guide
### Basic Conversation Flow
1. Start with your query: "Find me VPs of Sales at software companies in the US"
2. The bot processes and responds:
   - Generates API filters (see the example payload below)
   - Validates the structure
   - Fetches data
   - Returns a CSV download link
3. Refine if needed: "Can you also include directors and filter for companies with 100+ employees?"
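To make the query translation concrete, the first conversation above might come out of the AI agent as a filter payload of roughly this shape. The key names and value formats here are illustrative guesses, not the authoritative Explorium schema; only the `size` parameter and `has_email` flag are mentioned elsewhere in this guide.

```json
{
  "filters": {
    "job_level": { "values": ["vp"] },
    "job_department": { "values": ["sales"] },
    "company_country_code": { "values": ["us"] },
    "has_email": { "value": true }
  },
  "size": 1000
}
```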
### Query Tips
- **Be specific**: Include job titles, departments, and company details
- **Use standard terms**: "CTO" instead of "Chief Technology Officer"
- **Specify locations**: Use country names or standard codes
- **Include size/revenue**: Helps narrow results effectively

### Advanced Queries
Combine multiple criteria: "Find engineering managers and senior engineers at B2B SaaS companies in New York and California with 50-500 employees and revenue over $5M who have been in their role for at least 1 year"

### Output Format
The CSV file includes:
- Prospect ID
- Name (first, last, full)
- Location (country, region, city)
- LinkedIn profile
- Experience summary
- Skills and interests
- Company details
- Job information
- Business ID

## Troubleshooting
### Common Issues
**"Validation failed" errors**
- Check that your query uses supported filter values
- Ensure location names are spelled correctly
- Verify company sizes/revenues match the allowed ranges

**No results returned**
- Broaden your search criteria
- Check whether the company exists in Explorium's database
- Verify filter combinations aren't too restrictive

**Chat not responding**
- Ensure the workflow is activated
- Check all credentials are properly configured
- Verify the webhook URL is accessible

**Large result sets timing out**
- Try adding more specific filters
- Limit results by location or company size
- Use the size parameter (max 10,000)

### Error Messages
The bot provides clear feedback:
- **Invalid filters**: Shows which filters aren't supported
- **Value errors**: Lists the correct options for each field
- **API failures**: Explains connection or authentication issues

## Performance Optimization
### Best Practices
1. **Start broad, then narrow**: Begin with basic criteria and add filters
2. **Use business IDs**: When targeting specific companies
3. **Limit by contact info**: Add has_email: true for actionable leads
4. **Batch by location**: Process regions separately for large searches

### API Limits
- Maximum 10,000 results per search
- Pagination handles up to 100 records per page
- Rate limits apply based on your Explorium subscription

## Customization Options
### Modify AI Behavior
Edit the AI Agent system message to:
- Change the response format
- Add custom filters
- Adjust interpretation logic
- Include additional instructions

### Extend Functionality
Add nodes to:
- Send results via email
- Import directly to a CRM
- Schedule recurring searches
- Create custom reports

### Integration Ideas
- Connect to Slack for team queries
- Add to CRM workflows
- Create lead scoring systems
- Build automated outreach campaigns

## Security Considerations
- API credentials are stored securely in n8n
- Chat sessions are isolated
- No prospect data is stored permanently
- CSV files are generated on demand

## Support Resources
For issues with:
- **n8n platform**: Check the n8n documentation
- **Explorium API**: Contact Explorium support
- **Anthropic/Claude**: Refer to the Anthropic docs
- **Workflow logic**: Review the node configurations
by Srinivasan KB
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## What is DIGIPIN?
DIGIPIN (Digital Pincode) is a 10-character alphanumeric code introduced by India Post. It maps any 3x3 meter square in India to a unique digital address. This helps precisely locate homes, shops, or landmarks, especially in areas where physical addresses are inconsistent or missing.

## What this workflow does
This workflow creates a fully offline DIGIPIN microservice using only JavaScript - no external APIs are used. You get two HTTP endpoints:

- GET /generate-digipin?lat={latitude}&lon={longitude} → returns a DIGIPIN
- GET /decode-digipin?digipin={code} → returns the latitude and longitude

You can plug this into any system to:

- Convert GPS coordinates to a DIGIPIN
- Convert a DIGIPIN back to coordinates

## How it works
1. An HTTP Webhook node receives the request
2. A JS Function node either encodes or decodes based on the input
3. The result is returned as a JSON response

All the logic is handled inside the workflow - no API keys, no external calls.

## Why use this
- Fast and lightweight
- Easily extendable: you can connect this to forms, CRMs, apps, or spreadsheets
- Ideal for field agents, address validation, logistics, or rural operations
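As a quick usage example, calls against the two endpoints would look like this. The hostname is a placeholder for your n8n instance, the coordinates are central New Delhi, and the returned code is left as a placeholder rather than a real DIGIPIN.

```bash
# Encode coordinates to a DIGIPIN
curl "https://your-n8n-instance.com/webhook/generate-digipin?lat=28.6139&lon=77.2090"
# => {"digipin": "..."}

# Decode a DIGIPIN back to coordinates
curl "https://your-n8n-instance.com/webhook/decode-digipin?digipin=<code-from-above>"
# => {"lat": 28.6139, "lon": 77.2090}
```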