by ist00dent
This n8n template allows you to perform real-time currency conversions by simply sending a webhook request. By integrating with the ExchangeRate.host API, you can get up-to-date exchange rates for over 170 world currencies, making it an incredibly useful tool for financial tracking, e-commerce, international business, and personal budgeting.

## How it works

1. **Receive Conversion Request Webhook**: This node acts as the entry point for the workflow, listening for incoming POST requests. It's configured to expect a JSON body containing:
   - `from`: The 3-letter ISO 4217 currency code for the source currency (e.g., USD, PHP).
   - `to`: The 3-letter ISO 4217 currency code for the target currency (e.g., EUR, JPY).
   - `amount`: The numeric value you want to convert.

   **Important**: The ExchangeRate.host API `access_key` is handled securely by n8n's credential system and should not be included in the webhook body or headers.
2. **Convert Currency**: This node makes an HTTP GET request to the ExchangeRate.host API (api.exchangerate.host). It dynamically constructs the URL using the `from`, `to`, and `amount` from the webhook body. Your API access key is securely retrieved from n8n's pre-configured credentials (HTTP Query Auth type) and automatically added as a query parameter (`access_key`). The API then performs the conversion and returns a JSON object with the conversion details.
3. **Respond with Converted Amount**: This node sends the full currency conversion result received from ExchangeRate.host back to the service that initiated the webhook.

## Who is it for?

This workflow is ideal for:

- **E-commerce Platforms**: Display prices in local currencies on the fly for international customers, convert incoming international payments to your local currency for accounting, and calculate shipping costs in different currencies.
- **Financial Tracking & Budgeting Apps**: Update personal or business budgets with converted values, track expenses incurred in foreign currencies, and automate portfolio value conversion for multi-currency investments.
- **International Business & Freelancers**: Generate invoices in a client's local currency based on your preferred currency, quickly estimate project costs or earnings in different currencies, and automate reconciliation of international transactions.
- **Travel Planning**: Convert travel expenses from one currency to another while abroad, or build simple tools to estimate costs for trips in different countries.
- **Data Analysis & Reporting**: Standardize financial data from various sources into a single currency for unified reporting, and build dashboards that display converted financial metrics.
- **Custom Integrations**: Connect to CRMs, accounting software, or internal tools to automate currency-related tasks, or build chatbots that can answer currency conversion queries.

## Data Structure

When you trigger the webhook, send a POST request with a JSON body structured as follows:

```json
{
  "from": "USD",
  "to": "PHP",
  "amount": 100
}
```

The workflow will return a JSON response similar to this (results will vary based on currencies and amount):

```json
{
  "date": "2025-06-03",
  "historical": false,
  "info": {
    "rate": 58.749501,
    "timestamp": 1717398188
  },
  "query": {
    "amount": 100,
    "from": "USD",
    "to": "PHP"
  },
  "result": 5874.9501,
  "success": true
}
```

## Setup Instructions

1. **Get an ExchangeRate.host Access Key**: Go to https://exchangerate.host/ and sign up for a free API key.
2. **Create an n8n Credential for ExchangeRate.host**:
   - In your n8n instance, go to Credentials.
   - Click "New Credential" and search for "HTTP Query Auth".
   - Set the Name (e.g., ExchangeRate.host API Key).
   - Set API Key to your ExchangeRate.host access key.
   - Set Parameter Name to `access_key`.
   - Set Parameter Position to Query.
   - Save the credential.
3. **Import Workflow**: In your n8n editor, click "Import from JSON" and paste the provided workflow JSON.
4. **Configure ExchangeRate.host API Node**:
   - Double-click the Convert Currency node.
   - Under "Authentication", select "Generic Credential Type".
   - Choose "HTTP Query Auth" as the Generic Auth Type.
   - Select the credential you created (e.g., "ExchangeRate.host API Key") from the dropdown.
5. **Configure Webhook Path**: Double-click the Receive Conversion Request Webhook node and, in the 'Path' field, set a unique and descriptive path (e.g., `/convert-currency`).
6. **Activate Workflow**: Save and activate the workflow.

## Tips

This workflow is a powerful starting point. Here's how you can make it even more robust and integrated:

- **Robust Error Handling**: Add an IF node after Convert Currency to check `{{ $json.success }}`. If false, branch to an Error Trigger node or send an alert (e.g., Slack, Email) with `{{ $json.error.info }}` to notify you of API issues or invalid inputs. Include a try/catch pattern to gracefully handle network issues or malformed responses.
- **Input Validation & Defaults**: Add a Function node after the webhook to validate that `from`, `to`, and `amount` are present and in the correct format; if not, return a clear error message to the caller. Set default `from` or `to` currencies if they are not provided in the webhook, making the API more flexible.
- **Logging & Auditing**: After a successful conversion, use a Google Sheets, Airtable, or database node (e.g., PostgreSQL, MongoDB) to log every conversion request, including the input currencies, amount, converted result, date, and possibly the calling IP (from the webhook headers). This is crucial for financial auditing and analysis.
- **Rate Limits & Caching**: If you anticipate many requests, be mindful of ExchangeRate.host's API rate limits. You can cache recent conversion results for a short period, reducing redundant API calls for common conversions. Alternatively, add a Wait node to space out requests if you're hitting limits.
- **Formatting & Rounding**: Use a Function node or Set node to format the result to a specific number of decimal places (e.g., `{{ $json.result.toFixed(2) }}`), and add currency symbols or full currency names to the output for better readability.
- **Alerting on Significant Changes**: Chain this workflow with a Cron or Schedule node to periodically fetch exchange rates for a pair you care about (e.g., USD to EUR). Use an IF node to compare the current rate with a previously stored rate; if the change exceeds a certain percentage, send an alert via Slack, Email, or Telegram to notify you of significant market shifts.
- **Integration with Payment Gateways**: For e-commerce, combine this with nodes for payment gateways (e.g., Stripe, PayPal) to automatically convert customer payments received in foreign currencies to your base currency before recording.
- **Multi-currency Pricing for Products**: Use this workflow in conjunction with your product database. When a user selects a different country/currency, trigger this webhook to dynamically convert product prices and display them instantly.
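Once the workflow is active, you can exercise it end-to-end with a short script. This is a minimal sketch assuming Node.js 18+ (run as an ES module); the URL is a placeholder, so substitute your own instance's webhook URL and the path you configured:

```javascript
// Minimal test call to the conversion webhook.
// The URL below is a placeholder - replace it with your n8n webhook URL and path.
const response = await fetch(
  "https://your-n8n-instance.example.com/webhook/convert-currency",
  {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ from: "USD", to: "PHP", amount: 100 }),
  }
);
console.log(await response.json()); // e.g. { ..., "result": 5874.9501, "success": true }
```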
by Ranjan Dailata
## Who this is for

The Automate Etsy Data Mining with Bright Data Scrape & Google Gemini workflow is designed for eCommerce analysts, product researchers, and AI developers seeking to extract actionable insights from Etsy listings at scale. It is ideal for:

- **eCommerce Entrepreneurs** - Researching product demand and competition.
- **Market Analysts** - Tracking pricing, reviews, and trends across Etsy categories.
- **Product Managers** - Identifying niche opportunities and design inspirations.
- **Data Scientists & AI Engineers** - Automating product intelligence pipelines.
- **Growth Hackers** - Leveraging Etsy insights to refine product-market fit.

## What problem is this workflow solving?

Manually browsing Etsy to analyze product listings, pricing, reviews, and seller activity is slow, inconsistent, and unscalable. Scraping Etsy requires unlocking JavaScript-heavy content and structuring noisy data for analysis. This workflow solves that by providing:

- Automated and scalable scraping of Etsy product listings using Bright Data's infrastructure.
- Fully paginated, structured Etsy product data extraction via the Google Gemini LLM.
- Faster decision-making for product research and competitive analysis through fully automated, paginated data extraction.

## What this workflow does

- Receives input: sets the Etsy URL for data extraction and analysis.
- Uses Bright Data's Web Unlocker to extract content from the relevant pages.
- Cleans and preprocesses the scraped content for readability.
- Sends the content to Google Gemini for structured data extraction and enrichment.
- Persists the enriched results to disk.
- Sends the response to a target system via a Webhook notification.

## Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). Set the Value field to `Bearer XXXXXXXXXXXXXX`, replacing `XXXXXXXXXXXXXX` with your Web Unlocker token.
4. Obtain a Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Set Etsy Search Query node with the target content URL and the Bright Data zone name.
6. Update the Webhook HTTP Request node with the webhook endpoint of your choice.

## How to customize this workflow to your needs

- **Input Sources**: Replace the static URL with dynamic input from Google Sheets, Webhook, or Airtable to research multiple niches.
- **Prompt Customization**: Adjust the Gemini prompts to extract specific insights, for example: list key features of the product, or summarize review themes.
- **Data Output Options**: Update the Webhook notification to save data to Google Sheets, Notion or Airtable, SQL/NoSQL databases, or Slack/Email.
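For orientation, here is a rough sketch of the kind of HTTP call the workflow's scraping step makes. The endpoint and payload shape follow Bright Data's Web Unlocker request API as documented at the time of writing; treat the zone name and token as placeholders and verify the exact form against your Bright Data dashboard:

```javascript
// Sketch of a Web Unlocker request (verify endpoint/payload in your Bright Data docs).
const response = await fetch("https://api.brightdata.com/request", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: "Bearer XXXXXXXXXXXXXX", // your Web Unlocker token
  },
  body: JSON.stringify({
    zone: "your_web_unlocker_zone",                      // the zone created in step 2
    url: "https://www.etsy.com/search?q=handmade+jewelry", // example target URL
    format: "raw",                                        // return the unlocked page HTML
  }),
});
const html = await response.text(); // raw HTML to clean and pass on to Gemini
```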
by mariskarthick
QuantumDefender AI is a next-generation intelligent cybersecurity assistant designed to harness the symbolic strength of quantum computing's promise alongside cutting-edge AI capabilities. This sophisticated agent empowers SOC analysts, red teamers, and security researchers with rapid threat investigation, operational automation, and intelligent command execution, all driven by GPT-4 and integrated tools, accessible through Telegram or any other medium.

## Key Features

- **Expert-Level Cybersecurity Research & Analysis**: Leverages powerful AI models to deliver clean, detailed, domain-specific insights across detection, remediation, and offensive security.
- **Command & Control**: Executes Linux shell commands, autonomous scripts, and system operations securely in isolated environments.
- **Real-Time Web Intelligence**: Utilizes the integrated Langsearch API to provide timely internet research with contextual relevance.
- **Calendar & Scheduling Automation**: Manages Google Calendar events, or any similar application (create, update, delete, retrieve), dynamically from chat.
- **Multi-Tool Orchestration**: Combines calculator functions, internet searches, command execution, and messaging for comprehensive operational support.
- **Telegram-native Chatbot**: Delivers an adaptive, memory-informed, and interactive conversational experience with immediate typing indicators and high responsiveness.
- **Conversation & Session Management**: Maintains context-aware, session-based memory to enable smooth, multi-turn dialogues with individual users. Sends "typing..." indicators during processing to ensure an interactive, user-friendly chat experience. Operates exclusively within Telegram, delivering rich, timely responses and leveraging all Telegram bot capabilities.
- **Execution Intelligence & Safety**: Fully autonomous in deciding which tools to invoke, how frequently, and in what sequence to fulfill user requests comprehensively and responsibly. Operates within a secure temporary folder environment to contain all command executions safely and avoid persistent or harmful side effects. Enforces strict safety protocols to avoid running malicious or destructive commands, maintaining ethical standards and compliance.

## Use Cases

- Cybersecurity researchers and operators seeking an intelligent assistant to accelerate investigations and automate routine tasks.
- Red team professionals requiring on-the-fly command execution and information gathering integrated with tactical chat interactions.
- SOC teams aiming to augment their alert triage and incident handling workflows with AI-powered analysis and action.
- Anyone looking for a robust multi-tool AI chatbot integrated with real-world operational capabilities.

## Setup Requirements

- OpenAI API key for GPT-4.1-nano language processing.
- Telegram Bot API credentials with proper webhook setup to receive and respond to messages.
- Google OAuth credentials for Calendar integration, if calendar features are used.
- SSH access credentials for executing commands on remote hosts, if remote execution is enabled.
- Internet connectivity for the Langsearch web search API.

## Customization & Extensibility

The workflow is built modularly with n8n's flexible node system. Users can extend it by adding more tools, integrating other services (ticketing, threat intel, scanning tools), or modifying interaction logic to suit specialized operational needs and environments.

Created by Mariskarthick M
Senior Security Analyst | Detection Engineer | Threat Hunter | Open-Source Enthusiast
by Amit Mehta
## How it Works

This workflow automates the complete newsletter management process from content creation to client delivery, using Google Sheets, AI content generation, Google Drive, and Gmail. Whether you're a content creator, marketing agency, or small business owner, this workflow helps you automate newsletter creation and manage client communications with built-in approval workflows, all triggered from a simple spreadsheet.

## Use Case

Ideal for:

- **Marketing Teams** streamlining newsletter distribution
- **Agencies** managing multiple client newsletters
- **Content Creators** automating regular communications
- **Small Businesses** maintaining customer engagement

## Setup Instructions

### 1. Upload the Spreadsheet

- File name: Newsletter_Management
- Sheet structure:

| ID | Topic | Client Name | Client Email | Status | Created Date | Send Date |
|----|-------|-------------|--------------|--------|--------------|-----------|

- Add newsletter topics and set their Status as Pending.

### 2. Configure Google Sheets Nodes

Connect your Google account to:

- Get topic from newsletter sheet
- Pick records to send email to client
- Get Client email address
- Update Status as Generated
- Update status as Sent

### 3. Add API Credentials

- **OpenAI API Key** - for AI content generation
- **Google Drive Access** - for document storage
- **Gmail Account** - for sending newsletters and notifications

### 4. Activate the Workflow

Once live, the workflow will:

- Manual Path: Generate newsletter content from pending topics
- Scheduled Path: Send approved newsletters to clients automatically
- Track status updates throughout the entire process
- Store generated content in Google Drive
- Send admin notifications and client emails

## Workflow Logic

### Main Workflow (Content Generation)

1. Trigger: Manual activation for newsletter creation
2. Retrieve: Pending topics from Google Sheets
3. Validate: Status confirmation (Pending only)
4. Generate: AI-powered HTML newsletter content
5. Store: Upload to Google Drive
6. Notify: Send completion email to admin
7. Update: Mark status as "Generated"

### Scheduled Workflow (Client Distribution)

1. Trigger: Schedule-based activation
2. Retrieve: Approved newsletters from Google Sheets
3. Validate: Status confirmation (Approved only)
4. Lookup: Client email addresses
5. Loop: Process multiple recipients
6. Send: Personalized newsletters via Gmail
7. Update: Mark status as "Sent"

## Node Descriptions

| Node Name | Description |
|-----------|-------------|
| When clicking 'Test workflow' | Manual trigger to start newsletter generation |
| Get topic from newsletter sheet | Retrieves pending newsletter topics from Google Sheets |
| Validate Status as Pending | Checks whether status is 'Pending' for processing |
| Create HTML for Newsletter | AI-powered content generation using OpenAI |
| Prepare Data to create word doc | Formats generated content for document creation (see the sketch at the end of this template) |
| Upload doc to google drive | Stores completed newsletters in Google Drive |
| Send an email to admin | Notifies administrators of completion |
| Update Status as Generated | Marks processed items as 'Generated' |
| Schedule Trigger | Automated trigger for client email distribution |
| Pick records to send email to client | Retrieves approved newsletters for sending |
| Validate Status as Approved | Ensures only approved content is processed |
| Get Client email address | Fetches client contact information |
| Loop Over Items | Processes multiple newsletter recipients |
| Send email to client | Delivers personalized newsletters via Gmail |
| Update status as Sent | Marks newsletters as successfully delivered |

## Customization Tips

- Modify AI prompts for different content styles and tones
- Add Slack notifications instead of or alongside Gmail
- Export to different formats (PDF, Word, etc.)
- Schedule multiple sending times for different client segments
- Add approval workflows with webhook triggers
- Integrate with CRM systems for client management

## Suggested Sticky Notes for Workflow

| Node/Section | Sticky Note Content |
|--------------|---------------------|
| Manual Trigger | "Click to start newsletter generation process" |
| AI Content Generation | "Customize prompts here for different newsletter styles" |
| Google Drive Upload | "Organized storage - change folder structure as needed" |
| Gmail Admin Notification | "Update admin email addresses and notification templates" |
| Schedule Trigger | "Set optimal sending times for your audience" |
| Client Email Loop | "Handles bulk sending - monitors for delivery errors" |
| Status Updates | "Maintains audit trail - prevents duplicate processing" |

## Required Files

| File Name | Purpose |
|-----------|---------|
| Newsletter_Management.xlsx | Google Sheet to manage topics, clients, and status tracking |
| Client_Database.xlsx | Client contact information and preferences |
| Newsletter_Workflow.json | Main n8n workflow export for this automation |

## Testing Tips

1. Add one test topic with status = Pending and run the manual trigger
2. Verify AI content generation produces quality HTML
3. Check Google Drive upload and folder organization
4. Test admin email delivery and formatting
5. Add a test client with a valid email for the scheduled workflow
6. Monitor workflow logs for API responses and errors
7. Confirm status updates occur at each step

## Suggested Tags & Categories

#Newsletter #EmailMarketing #ContentGeneration #ClientCommunication #Automation #GoogleWorkspace #AIContent #MarketingAutomation #WorkflowManagement #BusinessProcess

## Prerequisites

- Google Workspace account (Sheets, Drive, Gmail)
- OpenAI API account with GPT-4 access
- n8n instance (Cloud or self-hosted)
- Basic understanding of Google Sheets and email marketing

## Expected Performance

- **Setup Time**: 30-45 minutes
- **Monthly Executions**: 100-500 (varies by newsletter frequency)
- **Processing Time**: 2-5 minutes per newsletter
- **Scalability**: Handles 100+ clients efficiently

## Important Notes

- Ensure proper Google API permissions are configured
- Monitor OpenAI API usage and rate limits
- Set up error handling for failed email deliveries
- Regularly back up your Google Sheets data
- Test thoroughly before production deployment

## Advanced Features

- **Approval Workflows**: Add manual approval steps between generation and sending
- **A/B Testing**: Create multiple versions and track performance
- **Analytics Integration**: Connect with Google Analytics for tracking
- **Multi-language Support**: Generate content in different languages
- **Dynamic Personalization**: Use client data for personalized content
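As referenced in the node table above, here is a hedged sketch of what a Code node like Prepare Data to create word doc might contain. The field names (`Topic`, `generatedHtml`, `fileName`) are illustrative assumptions, not the template's exact properties:

```javascript
// Illustrative n8n Code node: shape the AI output for the Drive upload step.
// Field names below are assumptions - align them with your own Sheet/AI node output.
const items = $input.all().map((item) => {
  const topic = item.json.Topic ?? "Untitled";
  const html = item.json.generatedHtml ?? ""; // hypothetical AI output field
  return {
    json: {
      fileName: `Newsletter - ${topic} - ${new Date().toISOString().slice(0, 10)}`,
      html,
    },
  };
});
return items;
```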
by ist00dent
This n8n workflow provides a simple yet powerful utility to convert Unix timestamps (seconds since epoch) into the universally recognized ISO 8601 date and time format. This is crucial for harmonizing date data across different systems, databases, and applications.

## How it works

1. **Receive Timestamp Webhook**: This node acts as the entry point, listening for incoming POST requests. It expects a JSON body containing a single property, `timestamp`, which should be a Unix timestamp in seconds (e.g., 1678886400).
2. **Convert to ISO 8601**: This node takes the timestamp received from the webhook. Since JavaScript's Date object typically uses milliseconds, it multiplies the Unix timestamp by 1000. It then uses `new Date(...).toISOString()` to convert this into an ISO 8601 formatted string (e.g., 2023-03-15T00:00:00.000Z) and assigns it to a new property called `convertedTime`.
3. **Respond with Converted Time**: This node sends the `convertedTime` property back as the response to the original webhook caller.

## Who is it for?

This workflow is extremely useful for:

- **Developers & Integrators**: When working with APIs or databases that return dates as Unix timestamps, and you need to display them in a human-readable or standardized format in your applications or dashboards.
- **Data Analysts & Scientists**: For cleaning and transforming raw timestamp data from logs, event streams, or legacy systems into a consistent format for analysis.
- **System Administrators**: For debugging logs where timestamps are often in Unix format.
- **Anyone Managing Data Imports/Exports**: Ensuring date compatibility when moving data between different platforms.
- **Automators**: As a building block in larger workflows where incoming data has Unix timestamps that need to be normalized before further processing (e.g., adding to a spreadsheet, sending in an email, or performing date calculations).

## Data Structure

When you trigger the webhook, send a POST request with a JSON body structured as follows:

```json
{
  "timestamp": 1678886400
}
```

The workflow will return a JSON response similar to this:

```json
{
  "convertedTime": "2023-03-15T00:00:00.000Z"
}
```

## Setup Instructions

1. **Import Workflow**: In your n8n editor, click "Import from JSON" and paste the provided workflow JSON.
2. **Configure Webhook Path**: Double-click the Receive Timestamp Webhook node and, in the 'Path' field, set a unique and descriptive path (e.g., `/convert-timestamp` or `/unix-to-iso`).
3. **Activate Workflow**: Save and activate the workflow.

## Tips

This simple conversion workflow can be drastically enhanced and leveraged in many ways:

- **Dynamic Output Formats**:
  - Upgrade: Modify the Convert to ISO 8601 node (or add a Function node after it) to accept an optional `format` parameter in the webhook.
  - Leverage: Allow users to request formats like MM/DD/YYYY HH:mm:ss, YYYY-MM-DD, DD-MM-YYYY, or just the time, making the output directly usable in various contexts without further processing.
  - Example using a Function node:

```javascript
const date = new Date($json.timestamp * 1000);
const format = $json.format || 'iso'; // Default to ISO
let output;
switch (format.toLowerCase()) {
  case 'iso':
    output = date.toISOString();
    break;
  case 'locale': // e.g., "3/15/2023, 12:00:00 AM UTC"
    output = date.toLocaleString('en-US', { timeZone: 'UTC' });
    break;
  case 'dateonly': // e.g., "2023-03-15"
    output = date.toISOString().split('T')[0];
    break;
  case 'timeonly': // e.g., "00:00:00 UTC"
    output = date.toLocaleTimeString('en-US', { timeZone: 'UTC', hour12: false });
    break;
  default:
    output = date.toISOString(); // Fallback
}
return [{ json: { convertedTime: output } }];
```

- **Timezone Conversion**:
  - Upgrade: Combine this with the Time Zone Converter workflow (or integrate moment-timezone.js if using a Code node on a self-hosted instance). Accept an optional `targetTimeZone` parameter in the webhook.
  - Leverage: Convert the Unix timestamp directly into a human-readable date and time in a specific target timezone, which is incredibly valuable for global scheduling or reporting.
- **Error Handling and Input Validation**:
  - Upgrade: Add an IF node after the Receive Timestamp Webhook. Check if `isNaN($json.body.timestamp)` or if `typeof $json.body.timestamp !== 'number'`.
  - Leverage: If the input timestamp is invalid, branch to a Respond to Webhook node that returns a clear error message (e.g., "Invalid timestamp provided. Please provide a numeric Unix timestamp in seconds."). This makes your API more robust. See the validation sketch at the end of this section.
- **Reverse Conversion (ISO to Unix)**:
  - Upgrade: Create a separate workflow, or add another branch to this one, to convert an ISO 8601 string back to a Unix timestamp. This provides a complete conversion utility.
  - Example Set node value: `={{ new Date($json.body.isoString).getTime() / 1000 }}`
- **Integration with Data Pipelines**:
  - Upgrade: Use this workflow as a microservice in larger ETL (Extract, Transform, Load) pipelines.
  - Leverage: If you're pulling data from a source that provides Unix timestamps (e.g., a logging system, IoT device, certain databases), send that data through this workflow to normalize the dates before loading them into your analytics database, CRM, or data warehouse.
- **Automated Reporting**:
  - Upgrade: If you have a system that generates reports with Unix timestamps, trigger this webhook for each timestamp.
  - Leverage: Produce reports with human-readable dates for better readability and decision-making for non-technical stakeholders.

This workflow is a cornerstone for any automation involving diverse date and time data. By implementing the suggested upgrades, you can transform it from a basic converter into a highly flexible and reliable date-time processing hub.
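As a concrete starting point for the validation tip above, a Code node placed right after the webhook might look like this. It is a minimal sketch assuming the webhook body is exposed at `$json.body`; adjust the property path to match your webhook node's output:

```javascript
// Minimal input validation for the incoming webhook payload.
// Assumes the webhook body is available at $json.body (adjust if needed).
const ts = $json.body?.timestamp;
if (typeof ts !== 'number' || isNaN(ts)) {
  return [{
    json: {
      error: 'Invalid timestamp provided. Please provide a numeric Unix timestamp in seconds.',
    },
  }];
}
return [{ json: { timestamp: ts, convertedTime: new Date(ts * 1000).toISOString() } }];
```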
by Angel Menendez
## Who is this for?

This workflow is perfect for HR teams, recruiters, and hiring platforms that need to automate the extraction of key candidate details, like name, email, skills, and education, from resume files submitted in various formats.

## What problem does this solve?

Manually reviewing and extracting structured data from resumes is time-consuming and error-prone. This automation eliminates that bottleneck, standardizing candidate data for seamless integration into CRMs, applicant tracking systems, or Google Sheets.

## What this workflow does

This n8n template listens for uploaded resume files, detects their format (PDF, DOC, TXT, CSV, etc.), and automatically extracts the raw text using n8n's built-in file extraction tools. The extracted text is then parsed using an OpenAI-powered agent that returns structured fields such as:

- Full Name
- Email Address
- Skill Keywords
- Education Details

Optionally, you can push the structured output to Google Sheets (node included, currently disabled).

## Setup

1. Clone this workflow into your n8n instance.
2. Enable the When chat message received trigger if using n8n chat.
3. Provide your OpenAI credentials and enable the LangChain Agent node.
4. (Optional) Connect Google Sheets by authenticating with your Google account and filling in your target document and sheet.

Watch the setup and demo video here: https://youtu.be/2SUPiNmLWdA

## How to customize

- Modify the OpenAI system message to extract different fields (e.g., phone number, LinkedIn).
- Replace the Google Sheets node with a webhook to push results to your ATS.
- Add filters to limit accepted file types or max file size.

> Note: This template is designed to be secure. It uses credentials stored in the n8n credential manager; no hardcoded secrets required.
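To illustrate the kind of structured output the agent returns, here is one plausible shape. The exact schema is defined by the workflow's system message, so treat these field names as illustrative assumptions:

```json
{
  "full_name": "Jane Doe",
  "email": "jane.doe@example.com",
  "skills": ["Python", "SQL", "Data Visualization"],
  "education": [
    { "degree": "B.Sc. Computer Science", "institution": "Example University", "year": 2021 }
  ]
}
```

Keeping the output as strict JSON like this makes the downstream Google Sheets or webhook step a simple field-to-column mapping.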
by Don Jayamaha Jr
A short-term technical analysis agent for 15-minute candles on Binance Spot Market pairs. It calculates and interprets key trading indicators (RSI, MACD, BBANDS, ADX, SMA/EMA) and returns structured summaries, optimized for Telegram or downstream AI trading agents. This tool is designed to be triggered by another workflow (such as the Binance SM Financial Analyst Tool or Binance Quant AI Agent) and is not intended for standalone use.

## Key Features

- Uses 15-minute kline data (last 100 candles)
- Calculates: RSI, MACD, Bollinger Bands, SMA/EMA, ADX
- Interprets numeric data using GPT-4.1-mini
- Outputs concise, formatted analysis like:
  - RSI: 72 → Overbought
  - MACD: Cross Up
  - BB: Expanding
  - ADX: 34 → Strong Trend

## AI Agent Purpose

> You are a short-term analysis tool for spotting volatility, early breakouts, and scalping setups.

Used by higher agents to determine:

- Entry/exit precision
- Momentum shifts
- Scalping opportunities

## How it Works

1. Triggered externally by another workflow.
2. Accepts input:

   ```json
   {
     "message": "BTCUSDT",
     "sessionId": "123456789"
   }
   ```

3. Sends a POST request to the backend endpoint: https://treasurium.app.n8n.cloud/webhook/15m-indicators
4. Fetches the last 100 candles and calculates indicators.
5. Passes the data to GPT for interpretation.
6. Returns a summary with indicator tags for human readability.

## Dependencies

This tool is triggered by:

- Binance SM Financial Analyst Tool
- Binance Spot Market Quant AI Agent

## Setup Instructions

1. Import into your n8n instance.
2. Make sure the /15m-indicators webhook is active and calculates indicators correctly.
3. Connect your OpenAI GPT-4.1-mini credentials.
4. Trigger from an upstream agent with a Binance symbol and session ID.
5. Ensure all external calls (to Binance and the webhook) are working.

## Example Use Cases

| Use Case | Result |
| --- | --- |
| Short-term trade decision for ETHUSDT | Receives 15m signal indicators summary |
| Input from Financial Analyst Tool | Returns real-time volatility snapshot |
| Telegram bot asks for "DOGE update" | Returns momentum indicators in 15m view |

Watch Tutorial:

## Licensing & Attribution

© 2025 Treasurium Capital Limited Company. Architecture, prompts, and trade report structure are IP-protected. No unauthorized rebranding or resale permitted.

For support: Don Jayamaha - LinkedIn
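For readers who want to sanity-check the indicator math against the tool's output, here is a compact, independent RSI reference using Wilder's smoothing over closing prices. The `/15m-indicators` backend may compute it differently; this is only a sketch:

```javascript
// 14-period RSI (Wilder's smoothing) over an array of closing prices.
// Independent reference sketch - not the backend's actual implementation.
function rsi(closes, period = 14) {
  let avgGain = 0, avgLoss = 0;
  // Seed averages from the first `period` changes.
  for (let i = 1; i <= period; i++) {
    const change = closes[i] - closes[i - 1];
    if (change > 0) avgGain += change; else avgLoss -= change;
  }
  avgGain /= period;
  avgLoss /= period;
  // Wilder smoothing over the remaining candles.
  for (let i = period + 1; i < closes.length; i++) {
    const change = closes[i] - closes[i - 1];
    avgGain = (avgGain * (period - 1) + Math.max(change, 0)) / period;
    avgLoss = (avgLoss * (period - 1) + Math.max(-change, 0)) / period;
  }
  if (avgLoss === 0) return 100; // no losses: maximally overbought reading
  return 100 - 100 / (1 + avgGain / avgLoss);
}
```

Feeding the 100 fetched closes into a function like this should land close to the RSI value the summary reports (e.g., "RSI: 72 → Overbought").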
by explorium
# Explorium Prospects Search Chatbot Template

Download the following json file and import it to a new n8n workflow: mcp_to_prospects_to_csv.json

## Overview

This n8n workflow creates a chatbot that understands natural language requests for finding business prospects and automatically:

1. Interprets your query using AI (Claude Sonnet 3.7)
2. Converts it to proper Explorium API filters
3. Validates the API request structure
4. Fetches prospect data from Explorium
5. Exports results as a downloadable CSV file

Perfect for sales teams, recruiters, and business development professionals who need to quickly find and export targeted prospect lists without learning complex API syntax.

## Key Features

- **Natural Language Interface**: Simply describe who you're looking for in plain English
- **Smart Query Translation**: AI converts your request to valid API parameters
- **Built-in Validation**: Ensures API calls meet Explorium's requirements
- **Error Recovery**: Automatically retries with corrections if validation fails
- **Pagination Support**: Handles large result sets automatically
- **CSV Export**: Clean, formatted output ready for CRM import
- **Conversation Memory**: Maintains context for follow-up queries

## Example Queries

The chatbot understands queries like:

- "Find marketing directors at SaaS companies in New York with 50-200 employees"
- "Get me CTOs from fintech startups in California"
- "Show me sales managers at healthcare companies with revenue over $10M"
- "Find engineers at Microsoft with 3-5 years experience"
- "Get customer service leads from e-commerce companies in Europe"

## Prerequisites

Before setting up this workflow, ensure you have:

- n8n instance with chat interface enabled
- Anthropic API key for Claude
- Explorium API credentials (Bearer token) - Get explorium api key
- Basic understanding of n8n chat workflows

## Supported Filters

The chatbot can search using these criteria:

### Company Filters

- **Size**: 1-10, 11-50, 51-200, 201-500, 501-1000, 1001-5000, 5001-10000, 10001+ employees
- **Revenue**: Ranges from $0-500K up to $10T+
- **Age**: 0-3, 3-6, 6-10, 10-20, 20+ years
- **Location**: Countries, regions, cities
- **Industry**: Google categories, NAICS codes, LinkedIn categories
- **Name**: Specific company names

### Prospect Filters

- **Job Level**: CXO, VP, Director, Manager, Senior, Entry, etc.
- **Department**: Sales, Marketing, Engineering, Finance, HR, etc.
- **Experience**: Total months and current role duration
- **Location**: Country and region codes
- **Contact Info**: Filter by email/phone availability

## Installation & Setup

### Step 1: Import the Workflow

1. Copy the workflow JSON from the template
2. In n8n: Workflows → Add Workflow → Import from File
3. Paste the JSON and click Import

### Step 2: Configure Anthropic Credentials

1. Click on the Anthropic Chat Model1 node
2. Under Credentials, click Create New
3. Add your Anthropic API key
4. Name: "Anthropic API"
5. Save credentials

### Step 3: Configure Explorium Credentials

You'll need to set up Explorium credentials in two places.

For MCP Client:

1. Click on the MCP Client node
2. Under Credentials, create new Header Auth
3. Add your authentication header (usually Authorization: Bearer YOUR_TOKEN)
4. Save credentials

For API Calls:

1. Click on the Prospects API Call node
2. Use the same Header Auth credentials created above
3. Verify the API endpoint is correct

### Step 4: Activate the Workflow

1. Save the workflow
2. Click the Active toggle to enable it
3. The chat interface will now be available

### Step 5: Access the Chat Interface

1. Click on the When chat message received node
2. Copy the webhook URL
3. Access this URL in your browser to start chatting

## How It Works

### Workflow Architecture

1. **Chat Trigger**: Receives natural language queries from users
2. **Memory Buffer**: Maintains conversation context
3. **AI Agent**: Interprets queries and generates API parameters
4. **Validation**: Checks API structure against Explorium requirements
5. **API Call**: Fetches prospect data with pagination
6. **Data Processing**: Formats results for CSV export
7. **File Conversion**: Creates downloadable CSV file

### Processing Flow

```
User Query → AI Interpretation → Validation → API Call → CSV Export
                   ↑                 │
                   └── Error Correction Loop ──┘
```

### Validation Rules

The workflow validates:

- Filter keys are allowed by the Explorium API
- Values match expected formats (e.g., valid country codes)
- Range filters have proper gte/lte values
- No duplicate values in arrays
- Required structure is maintained

## Usage Guide

### Basic Conversation Flow

1. Start with your query: "Find me VPs of Sales at software companies in the US"
2. Bot processes and responds: it generates API filters, validates the structure, fetches data, and returns a CSV download link
3. Refine if needed: "Can you also include directors and filter for companies with 100+ employees?"
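To make the query-to-filter translation concrete, a request like the one above might be converted into a payload roughly like this. The key names and values here are illustrative assumptions; the exact filter vocabulary comes from Explorium's API documentation and the validation step above:

```json
{
  "filters": {
    "company_size": { "values": ["101-200", "201-500"] },
    "company_country": { "values": ["us"] },
    "job_level": { "values": ["vp", "director"] },
    "job_department": { "values": ["sales"] },
    "has_email": { "value": true }
  },
  "size": 1000
}
```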
### Query Tips

- **Be specific**: Include job titles, departments, company details
- **Use standard terms**: "CTO" instead of "Chief Technology Officer"
- **Specify locations**: Use country names or standard codes
- **Include size/revenue**: Helps narrow results effectively

### Advanced Queries

Combine multiple criteria: "Find engineering managers and senior engineers at B2B SaaS companies in New York and California with 50-500 employees and revenue over $5M who have been in their role for at least 1 year"

### Output Format

The CSV file includes:

- Prospect ID
- Name (first, last, full)
- Location (country, region, city)
- LinkedIn profile
- Experience summary
- Skills and interests
- Company details
- Job information
- Business ID

## Troubleshooting

### Common Issues

**"Validation failed" errors**

- Check that your query uses supported filter values
- Ensure location names are spelled correctly
- Verify company sizes/revenues match allowed ranges

**No results returned**

- Broaden your search criteria
- Check if the company exists in Explorium's database
- Verify filter combinations aren't too restrictive

**Chat not responding**

- Ensure the workflow is activated
- Check all credentials are properly configured
- Verify the webhook URL is accessible

**Large result sets timing out**

- Try adding more specific filters
- Limit results by location or company size
- Use the size parameter (max 10,000)

### Error Messages

The bot provides clear feedback:

- **Invalid filters**: Shows which filters aren't supported
- **Value errors**: Lists correct options for each field
- **API failures**: Explains connection or authentication issues

## Performance Optimization

### Best Practices

1. Start broad, then narrow: Begin with basic criteria and add filters
2. Use business IDs: When targeting specific companies
3. Limit by contact info: Add has_email: true for actionable leads
4. Batch by location: Process regions separately for large searches

### API Limits

- Maximum 10,000 results per search
- Pagination handles up to 100 records per page
- Rate limits apply based on your Explorium subscription

## Customization Options

### Modify AI Behavior

Edit the AI Agent system message to:

- Change response format
- Add custom filters
- Adjust interpretation logic
- Include additional instructions

### Extend Functionality

Add nodes to:

- Send results via email
- Import directly to CRM
- Schedule recurring searches
- Create custom reports

### Integration Ideas

- Connect to Slack for team queries
- Add to CRM workflows
- Create lead scoring systems
- Build automated outreach campaigns

## Security Considerations

- API credentials are stored securely in n8n
- Chat sessions are isolated
- No prospect data is stored permanently
- CSV files are generated on-demand

## Support Resources

For issues with:

- **n8n platform**: Check n8n documentation
- **Explorium API**: Contact Explorium support
- **Anthropic/Claude**: Refer to Anthropic docs
- **Workflow logic**: Review node configurations
by InfoGrab
This is a chatbot that responds in public channels through Slack slash commands. I explain it in more detail in the YouTube video, but the video is only available in Korean.

## How it works

When you invoke the created slash command in Slack, the request arrives at the webhook. The Switch node then branches according to each slash command. Here, a slash command called `/ask` is connected to the chatbot, and the chatbot generates answers to the questions asked. The final node responds to the channel.

## Set up steps

1. Create a Slack app.
2. Add the `chat:write` permission in Slack OAuth & Permissions > Scopes.
3. Create a command in the Slack Slash Commands menu and enter the n8n Webhook node's URL.
4. Finish creating the slash command.
5. Enter the created command in the Switch node.
by explorium
# HubSpot Contact Enrichment with Explorium Template

Download the following json file and import it to a new n8n workflow: hubspot_flow.json

## Overview

This n8n workflow monitors your HubSpot instance for newly created contacts and automatically enriches them with additional contact information. When a contact is created, the workflow:

1. Detects the new contact via HubSpot webhook trigger
2. Retrieves recent contact details from HubSpot
3. Matches the contact against Explorium's database using name, company, and email
4. Enriches the contact with professional emails and phone numbers
5. Updates the HubSpot contact record with discovered information

This automation ensures your sales and marketing teams have complete contact information, improving outreach success rates and data quality.

## Key Features

- **Real-time Webhook Trigger**: Instantly processes new contacts as they're created
- **Intelligent Matching**: Uses multiple data points (name, company, email) for accurate matching
- **Comprehensive Enrichment**: Adds both professional and work emails, plus phone numbers
- **Batch Processing**: Efficiently handles multiple contacts to optimize API usage
- **Smart Data Mapping**: Intelligently maps multiple emails and phone numbers
- **Profile Enrichment**: Optional additional enrichment for deeper contact insights
- **Error Resilience**: Continues processing other contacts if some fail to match

## Prerequisites

Before setting up this workflow, ensure you have:

- n8n instance (self-hosted or cloud)
- HubSpot account with:
  - Developer API access (for webhooks)
  - Private App or OAuth2 app created
  - Contact object permissions (read/write)
- Explorium API credentials (Bearer token) - Get explorium api key
- Understanding of HubSpot contact properties

## HubSpot Requirements

### Required Contact Properties

The workflow uses these HubSpot contact properties:

- firstname - Contact's first name
- lastname - Contact's last name
- company - Associated company name
- email - Primary email (read and updated)
- work_email - Work email (updated by workflow)
- phone - Phone number (updated by workflow)

### API Access Setup

1. Create a Private App in HubSpot:
   - Navigate to Settings → Integrations → Private Apps
   - Create a new app with Contact read/write scopes
   - Copy the Access Token
2. Set up Webhooks (for Developer API):
   - Create an app in the HubSpot Developers portal
   - Configure a webhook for contact.creation events
   - Note the App ID and Developer API Key

### Custom Properties (Optional)

Consider creating custom properties for:

- Multiple email addresses
- Mobile vs. office phone numbers
- Data enrichment timestamps
- Match confidence scores

## Installation & Setup

### Step 1: Import the Workflow

1. Copy the workflow JSON from the template
2. In n8n: Navigate to Workflows → Add Workflow → Import from File
3. Paste the JSON and click Import

### Step 2: Configure HubSpot Developer API (Webhook)

1. Click on the HubSpot Trigger node
2. Under Credentials, click Create New
3. Enter your HubSpot Developer credentials:
   - App ID: From your HubSpot app
   - Developer API Key: From your developer account
   - Client Secret: From your app settings
4. Save as "HubSpot Developer account"

### Step 3: Configure HubSpot App Token

1. Click on the HubSpot Recently Created node
2. Under Credentials, click Create New (App Token)
3. Enter your Private App access token
4. Save as "HubSpot App Token account"
5. Apply the same credentials to the Update HubSpot node

### Step 4: Configure Explorium API Credentials

1. Click on the Explorium Match Prospects node
2. Under Credentials, click Create New (HTTP Header Auth)
3. Configure the authentication:
   - Name: Authorization
   - Value: Bearer YOUR_EXPLORIUM_API_TOKEN
4. Save as "Header Auth Connection"
5. Apply to all Explorium nodes:
   - Explorium Enrich Contacts Information
   - Explorium Enrich Profiles

### Step 5: Configure Webhook Subscription

In the HubSpot Developers portal:

1. Go to your app's webhook settings
2. Add a subscription for contact.creation events
3. Set the target URL from the HubSpot Trigger node
4. Activate the subscription

### Step 6: Activate the Workflow

1. Save the workflow
2. Toggle the Active switch to ON
3. The webhook is now listening for new contacts

## Node Descriptions

- **HubSpot Trigger**: Webhook that fires when new contacts are created
- **HubSpot Recently Created**: Fetches details of recently created contacts
- **Loop Over Items**: Processes contacts in batches of 6
- **Explorium Match Prospects**: Finds the matching person in Explorium's database
- **Filter**: Validates successful matches
- **Extract Prospect IDs**: Collects matched prospect identifiers
- **Enrich Contacts Information**: Fetches emails and phone numbers
- **Enrich Profiles**: Gets additional profile data (optional)
- **Merge**: Combines all enrichment results
- **Split Out**: Separates individual enriched records
- **Update HubSpot**: Updates the contact with new information

## Data Mapping Logic

The workflow maps Explorium data to HubSpot properties:

| Explorium Data | HubSpot Property | Notes |
| --- | --- | --- |
| professions_email | email | Primary professional email |
| emails[].address | work_email | All email addresses joined |
| phone_numbers[].phone_number | phone | All phones joined with commas |
| mobile_phone | phone (fallback) | Used if no other phones found |

### Data Processing

The workflow handles complex data scenarios:

- **Multiple emails**: Joins all discovered emails with commas
- **Phone numbers**: Combines all phone numbers into a single field
- **Missing data**: Uses "null" as a placeholder for empty fields
- **Name parsing**: Cleans sample data and special characters

## Usage & Operation

### Automatic Processing

Once activated:

1. Every new contact triggers the webhook immediately
2. The contact is enriched within seconds
3. The HubSpot record is updated automatically
4. The process repeats for each new contact

### Manual Testing

To test the workflow:

1. Use the pinned test data in the HubSpot Trigger node, or
2. Create a test contact in HubSpot
3. Monitor the execution in n8n
4. Verify the contact was updated in HubSpot

### Monitoring Performance

Track workflow health:

1. Go to Executions in n8n
2. Filter by this workflow
3. Monitor success rates
4. Review any failed executions
5. Check webhook delivery in HubSpot
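As a rough illustration of the data processing described above, a Code node shaping one Explorium enrichment record for the HubSpot update might look like this. The field names follow the mapping table; treat them as assumptions rather than the template's exact expressions:

```javascript
// Illustrative shaping of an Explorium enrichment record for the HubSpot update.
// Field names follow the mapping table above - verify against real API responses.
const data = $json.data ?? {};
const workEmails = (data.emails ?? []).map((e) => e.address).join(', ') || 'null';
const phones = (data.phone_numbers ?? []).map((p) => p.phone_number).join(', ')
  || data.mobile_phone   // fallback when no other phones were found
  || 'null';             // "null" placeholder for empty fields, as described above
return [{
  json: {
    email: data.professions_email ?? 'null',
    work_email: workEmails,
    phone: phones,
  },
}];
```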
## Troubleshooting

### Common Issues

**Webhook not triggering**

- Verify the webhook subscription is active in HubSpot
- Check the webhook URL is correct and accessible
- Ensure the workflow is activated in n8n
- Test webhook delivery in the HubSpot developers portal

**Contacts not matching**

- Verify the contact has firstname, lastname, and company
- Check for typos or abbreviations in company names
- Some individuals may not be in Explorium's database
- Email matching improves accuracy significantly

**Updates failing in HubSpot**

- Check the API token has contact write permissions
- Verify property names exist in HubSpot
- Ensure rate limits haven't been exceeded
- Check for validation rules on properties

**Missing enrichment data**

- Not all prospects have all data types
- Phone numbers may be less available than emails
- Profile enrichment is optional and may not always return data

### Error Handling

Built-in error resilience:

- Failed matches don't block other contacts
- Each batch processes independently
- Partial enrichment is possible
- All errors are logged for review

### Debugging Tips

- **Check webhook logs**: HubSpot shows delivery attempts
- **Review executions**: n8n logs show detailed error messages
- **Test with pinned data**: Use the sample data for isolated testing
- **Verify API responses**: Check the Explorium API returns expected data

## Best Practices

### Data Quality

1. Complete contact records: Ensure name and company are populated
2. Standardize company names: Use official names, not abbreviations
3. Include existing emails: Improves match accuracy
4. Regular data hygiene: Clean up test and invalid contacts

### Performance Optimization

- Batch size: 6 is optimal for rate limits
- Webhook reliability: Monitor delivery success
- API quotas: Track usage in both platforms
- Execution history: Regularly clean old executions

### Compliance & Privacy

- GDPR compliance: Ensure a lawful basis for enrichment
- Data minimization: Only enrich necessary fields
- Access controls: Limit who can modify enriched data
- Audit trail: Document enrichment for compliance

## Customization Options

### Additional Enrichment

Extend with more Explorium data:

- Job titles and departments
- Social media profiles
- Professional experience
- Skills and interests
- Company information

### Enhanced Processing

Add workflow logic for:

- Lead scoring based on enrichment
- Routing based on data quality
- Notifications for high-value matches
- Custom field mapping

### Integration Extensions

Connect to other systems:

- Sync enriched data to CRM
- Trigger marketing automation
- Update data warehouse
- Send notifications to Slack

## API Considerations

### HubSpot Limits

- **API calls**: Monitor daily limits
- **Webhook payload**: Max 200 contacts per trigger
- **Rate limits**: 100 requests per 10 seconds
- **Property limits**: Max 1000 custom properties

### Explorium Limits

- **Match API**: Batched for efficiency
- **Enrichment calls**: Two parallel enrichments
- **Rate limits**: Based on your plan
- **Data freshness**: Real-time matching

## Architecture Considerations

This workflow integrates with:

- HubSpot workflows and automation
- Marketing campaigns and sequences
- Sales engagement tools
- Reporting and analytics
- Other enrichment services

## Security Best Practices

- **Webhook validation**: Verify requests are from HubSpot
- **Token security**: Rotate API tokens regularly
- **Access control**: Limit workflow modifications
- **Data encryption**: All API calls use HTTPS
- **Audit logging**: Track all enrichments

## Advanced Configuration

### Custom Field Mapping

Modify the Update HubSpot node to map to custom properties:

```javascript
// Example custom mapping
{
  "custom_mobile": "{{ $json.data.mobile_phone }}",
  "custom_linkedin": "{{ $json.data.linkedin_url }}",
  "enrichment_date": "{{ $now.toISO() }}"
}
```

### Conditional Processing

Add logic to process only certain contacts:

- Filter by contact source
- Check for specific properties
- Validate email domains
- Exclude test contacts

## Support Resources

For assistance:

- **n8n issues**: Check n8n documentation and forums
- **HubSpot API**: Reference HubSpot developers documentation
- **Explorium API**: Contact Explorium support
- **Webhook issues**: Use HubSpot webhook testing tools
by Agentick AI
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This n8n template automates candidate outreach, call transcription, and structured feedback capture for HR teams and recruiters. It triggers on a new candidate row added in a Google Sheet, initiates a call using Vapi.ai, processes the transcript using Google Gemini, extracts key information like CTC, experience, and notice period, and then updates the same Google Sheet with parsed insights. This is ideal for recruiters or HR teams conducting high-volume candidate outreach who want to scale initial data collection using automated voice bots and AI transcription analysis.

## How it works

1. **Trigger**: Listens for new rows added to a Google Sheet (e.g., a new candidate lead).
2. **Call Initiation**: Uses Vapi.ai to make a phone call to the candidate using an assistant bot.
3. **Transcript Retrieval**: After the call, fetches the conversation transcript from the Vapi API.
4. **AI Transcript Analysis**: Google Gemini parses the transcript and extracts structured fields like:
   - Work experience
   - Current & expected CTC
   - Notice period & negotiability
   - Work preferences and location
5. **Data Mapping**: Extracted insights are mapped to structured JSON fields.
6. **Google Sheet Update**: The same row in the source Sheet is updated with the collected information.

## Use Cases

- Pre-screening calls for job applicants
- Collecting missing candidate information asynchronously
- Replacing manual HR data entry with AI-powered automation
- Smart CRM updates from voice interactions

## Requirements

Before you run this workflow, ensure the following:

- Google account with access to the Google Sheets API
- Vapi.ai account with an assistant ID, a phone number ID, and an active API key
- Google Gemini API (via PaLM) enabled
- n8n version 1.40.0 or later with the relevant credentials configured

## How to use

1. Import the workflow into n8n.
2. Set up your credentials for:
   - Google Sheets Trigger
   - Google Sheets
   - Vapi.ai (add Bearer token)
   - Google Gemini
3. Replace the placeholder values in:
   - Assistant ID
   - Phone number ID
   - Google Sheet ID and tab
4. Start the workflow and add a row to the Google Sheet.
5. Wait for the automated call and let the AI extract and populate the data.

## Customising this workflow

- Replace Google Gemini with OpenAI or Claude if preferred.
- Add sentiment analysis on the transcript using an LLM.
- Modify the Sheet column structure to add additional fields.
- Add a filter node to skip candidates with incomplete phone numbers.
- Use a Webhook trigger instead of Google Sheets to integrate with job portals or an ATS.
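As an illustration of the data-mapping step, the structured JSON written back to the Sheet might look like the example below. Field names are indicative only; match them to your own Sheet columns and Gemini prompt:

```json
{
  "work_experience_years": 4,
  "current_ctc": "12 LPA",
  "expected_ctc": "16 LPA",
  "notice_period_days": 30,
  "notice_period_negotiable": true,
  "work_preference": "Hybrid",
  "preferred_location": "Bengaluru"
}
```

Prompting Gemini to return strict JSON in a fixed shape like this keeps the Sheet-update node a simple one-to-one column mapping.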
by CustomJS
# n8n Workflow: Invoice PDF Generator

This n8n workflow captures invoice data and generates a PDF invoice, ready to be sent or saved. It uses a webhook to trigger the process, preprocesses the invoice data, and converts it to a PDF using HTML and custom styling. It relies on the @custom-js/n8n-nodes-pdf-toolkit community package.

## Features

- **Webhook Trigger**: Receives incoming data, including invoice details.
- **Preprocessing**: Transforms the invoice data into HTML format.
- **HTML to PDF Conversion**: Converts the preprocessed HTML into a styled PDF document.
- **Response**: Sends the generated PDF back to the webhook response.

## Notice

Community nodes can only be installed on self-hosted instances of n8n.

## Requirements

- **Self-hosted** n8n instance
- A CustomJS API key
- **Invoice data** for PDF generation

## Workflow Steps

1. **Webhook Trigger**: Accepts incoming data (e.g., invoice number, recipient details, itemized list). This data is passed to the next node for processing.
2. **Set Data Node**: Configures initial values for the invoice, including the recipient, sender, invoice number, and the items on the invoice. The invoice details include information like description, unit price, and quantity.
3. **Preprocess Node**: Processes the raw data to format it correctly for HTML. This includes splitting addresses and converting the items into an HTML table format.
4. **HTML to PDF Conversion**: Converts the generated HTML into a PDF document. The HTML includes a header, a detailed invoice table, and a footer with contact information.
5. **Respond to Webhook**: Returns the generated PDF as a response to the initial webhook request.

## Setup Guide

### 1. Configure CustomJS API

1. Sign up at CustomJS.
2. Retrieve your API key from the profile page.
3. Add your API key as n8n credentials.

### 2. Design Workflow

1. **Create a Webhook**: Set up a webhook to trigger the workflow when invoice data is received.
2. **Prepare Data**: Ensure the incoming request contains fields like "Invoice No", "Bill To", "From", and "Details" (a list of items with price and quantity).
3. **Customize the HTML**: The HTML template for the invoice includes custom styling to give the invoice a professional look.
4. **Convert to PDF**: The HTML to PDF node is configured with the data generated from the preprocessing step to convert the invoice HTML to PDF format.

## Example Invoice Data

```json
{
  "Invoice No": "1",
  "Bill To": "John Doe\n1234 Elm St, Apt 567\nCity, Country, 12345",
  "From": "ABC Corporation\n789 Business Ave\nCity, Country, 67890",
  "Details": [
    { "description": "Web Hosting", "price": 150, "qty": 2 },
    { "description": "Domain", "price": 15, "qty": 5 }
  ],
  "Email": "support@mycompany.com"
}
```

## Result

The generated invoice is returned as a PDF file.
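For orientation, the Preprocess node's item-to-HTML conversion could look roughly like this. It is a sketch written against the example payload above, not the node's exact code:

```javascript
// Sketch of the preprocessing step: turn the "Details" array into HTML table rows.
// Based on the example payload above; the template's actual node may differ.
const { Details = [] } = $json;
const rows = Details.map(({ description, price, qty }) => {
  const total = (price * qty).toFixed(2);
  return `<tr><td>${description}</td><td>${qty}</td>` +
         `<td>$${price.toFixed(2)}</td><td>$${total}</td></tr>`;
}).join('\n');
const grandTotal = Details
  .reduce((sum, { price, qty }) => sum + price * qty, 0)
  .toFixed(2);
return [{ json: { rows, grandTotal } }]; // interpolated into the HTML template
```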