by Stathis Askaridis
Integrate Xero with FileMaker using Webhooks

Workflow Description
This n8n workflow automates the integration between Xero and FileMaker, allowing seamless data transfer between the two platforms. By listening for webhooks from Xero (e.g., new invoices, payments, or contacts), the workflow ensures that data is automatically sent to and recorded in a FileMaker database.

Who is This For?
This workflow template is ideal for:
- Accountants who need a streamlined process to sync financial data between Xero and FileMaker.
- Business Owners looking to automate data entry and improve accuracy across their systems.
- Developers building solutions for clients that require integration between accounting software and databases.
- Operations Teams focused on minimizing manual work and improving efficiency.

Key Steps
1. Xero Webhook Trigger: the workflow starts by capturing events from Xero via a webhook.
2. Data Processing: transforms and maps the incoming data to match FileMaker's required format.
3. FileMaker Node: uses the FileMaker node to create or update records directly in the FileMaker database.
4. Logging & Error Handling: tracks successful entries and manages any errors with automated alerts.

Setup Instructions
1. Set Up the Xero Webhook: create a webhook in Xero and point it to your n8n webhook node URL. Configure the types of events that trigger the workflow (e.g., new invoices or payments). Xero will then send test calls to confirm that your endpoint performs the required signature (hash) validation, as sketched below.
2. Connect the FileMaker Node: set up your FileMaker node with the appropriate credentials and database configuration, then map the fields between the incoming Xero data and your FileMaker database structure.
3. Customize Data Processing: adjust data transformations as needed to ensure compatibility with your FileMaker schema.
4. Test and Deploy: run the workflow with sample data to confirm everything functions correctly, and monitor the execution log to verify data transfer and make adjustments as needed.
5. Error Handling Configuration: configure error-handling nodes or alerts to notify you of any issues during data processing.

Benefits
This setup enables real-time data synchronization between Xero and FileMaker, reducing manual data entry and improving overall operational efficiency.
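The "hash control" Xero tests is its webhook signature check: Xero sends a base64-encoded HMAC-SHA256 of the raw request payload in the x-xero-signature header, computed with your webhook signing key, and expects HTTP 200 only when the signatures match. A minimal sketch of that check in an n8n Code node follows; the environment variable name and the way the raw body and headers are exposed are assumptions that depend on your Webhook node settings (enable the raw body option):

```javascript
// n8n Code node: validate the Xero webhook signature ("Intent To Receive" check).
// Assumes the built-in crypto module is allowed in your Code node settings.
const crypto = require('crypto');

const signingKey = process.env.XERO_WEBHOOK_KEY;        // your Xero webhook signing key (assumed env var)
const rawBody = $json.body;                             // raw payload string from the Webhook node
const received = $json.headers['x-xero-signature'];     // signature sent by Xero

const expected = crypto
  .createHmac('sha256', signingKey)
  .update(rawBody)
  .digest('base64');

// Respond 200 only when the signatures match, otherwise 401.
const valid = expected === received;
return [{ json: { valid, statusCode: valid ? 200 : 401 } }];
```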
by TreyDong
How it works
• Automatically detects when new pages are created in your Notion workspace
• Uses AI to generate contextually relevant icons based on page titles for perfect visual representation
• Fetches random high-quality cover images from Unsplash to add visual appeal to each page
• Seamlessly integrates with your existing Notion workflow without manual intervention

Set up steps
• Connect your Notion workspace using API credentials - takes about 5 minutes to configure
• Set up AI service integration for intelligent icon generation based on page titles
• Configure Unsplash API access for random cover image fetching
• Configure webhook triggers to monitor new page creation events
• Test the workflow with a sample page to ensure proper functionality
• Keep detailed setup instructions and troubleshooting tips in the workflow notes for future reference

This template helps streamline your Notion workspace by automatically beautifying new pages with AI-generated icons and stunning Unsplash covers, saving you time while maintaining a visually appealing and professional appearance across your knowledge base.
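Under the hood, the final update step amounts to one call to Notion's page-update endpoint that sets both the icon and the cover. A hedged sketch is below; the page ID, the emoji chosen by the AI step, and the Unsplash URL come from earlier nodes, and the token variable name is an assumption:

```javascript
// Set the AI-chosen icon and the Unsplash cover on the newly created Notion page.
const res = await fetch(`https://api.notion.com/v1/pages/${pageId}`, {
  method: 'PATCH',
  headers: {
    'Authorization': `Bearer ${process.env.NOTION_TOKEN}`,
    'Notion-Version': '2022-06-28',
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    icon: { type: 'emoji', emoji: chosenEmoji },               // e.g. "📚" from the AI step
    cover: { type: 'external', external: { url: coverUrl } },  // Unsplash image URL
  }),
});
if (!res.ok) throw new Error(`Notion update failed: ${res.status}`);
```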
by Ranjan Dailata
Notice
Community nodes can only be installed on self-hosted instances of n8n.

Who this is for?
The Search Engine Intelligence Extractor is a powerful n8n automation that leverages Bright Data's MCP-based AI agents to simulate human-like searches across Google, Bing, and Yandex, and then distills clean, structured insights using Google Gemini. This workflow is tailored for:
- SEO analysts researching competitors or market trends
- Market researchers needing real-time search visibility
- Journalists & content writers gathering contextual insights
- AI developers creating intelligent assistants
- Digital marketers tracking brand mentions or news

What problem is this workflow solving?
Traditional scraping of search engines is often blocked, cluttered, or filled with irrelevant information, and manually analyzing and cleaning this data is time-consuming. This workflow solves the problem by:
- Simulating real user search behavior via the Bright Data MCP-based AI agent
- Performing multi-platform search (Google, Bing, Yandex) in one unified flow
- Extracting clean, human-readable results (stripping ads, navigation, etc.)
- Structuring the content using the Google Gemini LLM
- Automating delivery via Webhook or saving to disk

What this workflow does
- Input Fields node: accepts the search query, the action to perform (for example "Perform a Google search"; replace Google with Bing, Yandex, etc. for other search providers), and the Webhook notification URL, as illustrated below.
- Bright Data MCP agent execution: triggers Bright Data's intelligent search agent and handles search navigation, result loading, and pagination.
- Human Readable Data Extractor: cleanses the HTML, removes ads, footers, and irrelevant links, and produces a readable narrative of the results.
- Final output handling: saves the processed response to disk and sends the structured data to a Webhook for real-time use.

Pre-conditions
- Knowledge of the Model Context Protocol (MCP) is essential. Please read this blog post: model-context-protocol
- You need a Bright Data account and the setup described in the Setup section below.
- You need a Google Gemini API key. Visit Google AI Studio.
- You need to install the Bright Data MCP Server @brightdata/mcp
- You need to install n8n-nodes-mcp

Setup
- Set up n8n locally with MCP servers by following n8n-nodes-mcp.
- Install the Bright Data MCP Server @brightdata/mcp on your local machine.
- Sign up at Bright Data.
- Create a Web Unlocker proxy zone called mcp_unlocker in the Bright Data control panel: navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
- In n8n, configure the Google Gemini(PaLM) Api account with your Google Gemini API key (or access through Vertex AI or a proxy).
- In n8n, configure the MCP Client (STDIO) credentials to connect with the Bright Data MCP Server. Make sure to copy your Bright Data API token into the Environments textbox as API_TOKEN=<your-token>

How to customize this workflow to your needs
- Add scheduled execution: add a Cron trigger to run this workflow on a set schedule (e.g., daily/weekly keyword tracking).
- Push results to custom destinations: connect the output to Google Sheets (for analytics or dashboards), PostgreSQL or MySQL databases (for structured storage), Notion or Airtable (for content pipelines), or Slack or email (for alerting teams).
- Customize Webhook notifications: update the Webhook URL in the notification node to push processed results to external APIs, CRMs, or real-time dashboards.
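For clarity, the Input Fields node ends up carrying three values. A representative payload is shown below; the field names are illustrative, so match them to the names used in your copy of the workflow:

```javascript
// Illustrative input for the workflow's first node (names are examples, not fixed).
const input = {
  search_query: "best project management tools 2025",
  action: "Perform a Google search",            // swap "Google" for "Bing" or "Yandex"
  webhook_notification_url: "https://example.com/webhooks/search-results",
};
```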
by Vijayeta Sinel
Automate document translation and ensure translation accuracy using Straker Verify, Google Drive and Slack.

How it works
A workflow step is set up to "watch" a Google Drive folder. When your team members place new files in this folder, they are downloaded. Straker Verify then translates them and provides a quality score. Once Straker Verify has completed this, the job info is fetched, the translation is saved to an output folder and you are notified via Slack.

What problem does this solve?
When using AI to translate documents, you have no idea about the quality and accuracy of the output. This template answers the question "How good is my translation?" so you have a high level of confidence before you publish.

Who is this for?
This workflow template is designed for businesses needing translation and localization of documents such as text docs, presentations, web pages, transcripts, video subtitles and others. Use it to build workflows that localize your content at scale while maintaining translation quality, accuracy and compliance.

Set up instructions
Connect to Straker Verify and obtain your API key:
1. Sign up/log in: visit https://verify.straker.ai/ to create an account or log in.
2. Navigate to API keys: go to Verify → Settings → API Keys.
3. Copy your key: find and copy your API key.
4. Add the key to n8n: in n8n, go to Settings → Credentials → Straker Verify.

Set up credentials for the Straker Verify node:
1. Open n8n and go to "Credentials" from the left sidebar.
2. Search for "Straker Verify" and select "Straker Verify API" from the dropdown.
3. Paste the copied access token from the Verify app and save it.

Assign credentials to the node:
1. Go to your workflow and open the Straker Verify node.
2. Select the "Straker Verify API" credentials you just created from the dropdown (Credentials to connect with), and save.
✅ You are now ready to use the workflow.

Workflow steps
Step 1: Initiate workflow – upload files to Google Drive. Upload one or more files to the designated Google Drive folder. This triggers the "New File in Google Drive" node in n8n.
Step 2: Verify token balance. The workflow checks your token balance via "Get Current User Balance" and "User Has Enough Tokens". Not enough tokens? You will receive a Slack message asking you to top up and re-upload your files.
Step 3: Select workflow. The system fetches available workflows via "Fetch All Users Workflows". No matching workflow found? You'll be notified via Slack.
Step 4: Create project in Straker Verify. The following steps are handled automatically: fetch language/project options, download files from Google Drive, and create a new project using "Create Straker Project". ✅ You'll receive a Slack confirmation: "Project created – ID: xxxxxx"
Step 5: Process completion & file return. Once files are processed by Straker, the "Incoming Translation Result" webhook is triggered. The workflow downloads the processed files ("Get File Content from Straker") and uploads them to Google Drive ("Upload File to Google Drive").
Step 6: Confirmation – workflow complete. You'll receive a final Slack message: "Workflow complete – Files are now available in Google Drive"
by Ranjan Dailata
Notice
Community nodes can only be installed on self-hosted instances of n8n.

Who this is for?
This workflow template enables intelligent data extraction from ProductHunt using Bright Data's Model Context Protocol (MCP) and processes search results with Google Gemini. It is designed for individuals and teams who need automated, intelligent discovery and analysis of new tech products. It's especially valuable for:
- Startup analysts & VC researchers
- Growth hackers & marketers
- Recruiters & tech scouts
- Product managers & innovation teams
- AI & automation enthusiasts

What problem is this workflow solving?
Traditional product discovery on ProductHunt is constrained by limited descriptions and requires repeated manual validation through web searches. Manually extracting and enriching this data is slow, repetitive, and error-prone. This workflow solves the problem by:
- Extracting real-time ProductHunt data using Bright Data's MCP infrastructure to mimic real-user behavior and avoid blocks.
- Performing contextual Google searches for a specific ProductHunt product to gather use cases, reviews, and related information.
- Structuring results with the Google Gemini LLM to provide human-readable insights and reduce noise.
- Delivering results seamlessly by saving output to disk, updating Google Sheets, and sending Webhook alerts.

What this workflow does
- Input Field node: define the ProductHunt category and the search term(s) you want to target. This drives the extraction and search operations.
- Agent Operation node: the agent performs two major tasks. First, it extracts trending products from ProductHunt using Bright Data MCP. Second, it runs a contextual Google search for each product to gather deeper context, including reviews, competitor mentions, and real-world usage examples.
- LLM node (Google Gemini): analyzes and summarizes the extracted web content, removes noise (ads, menus, etc.), and structures the content into bullet points, insights, or JSON objects.

Pre-conditions
- Knowledge of the Model Context Protocol (MCP) is essential. Please read this blog post: model-context-protocol
- You need a Bright Data account and the setup described in the Setup section below.
- You need a Google Gemini API key. Visit Google AI Studio.
- You need to install the Bright Data MCP Server @brightdata/mcp
- You need to install n8n-nodes-mcp

Setup
- Set up n8n locally with MCP servers by following n8n-nodes-mcp.
- Install the Bright Data MCP Server @brightdata/mcp on your local machine.
- Sign up at Bright Data.
- Create a Web Unlocker proxy zone called mcp_unlocker in the Bright Data control panel: navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
- In n8n, configure the Google Gemini(PaLM) Api account with your Google Gemini API key (or access through Vertex AI or a proxy).
- In n8n, configure the MCP Client (STDIO) credentials to connect with the Bright Data MCP Server. Make sure to copy your Bright Data API token into the Environments textbox as API_TOKEN=<your-token>

How to customize this workflow to your needs
This workflow is flexible and modular, allowing you to adapt it for various research, product discovery, or trend analysis use cases. Below are the key customization points and how to modify them.
- Define your target products or topics: change the input parameter to a specific ProductHunt category, tag, or keyword (e.g., "AI tools", "SaaS", "DevOps").
- Change output destinations:
  - Save to disk: change the file format (.json, .csv, .md) or directory path.
  - Google Sheet: modify the sheet name and structure (columns like Product, Summary, Link).
  - Webhook notification: point to a Slack/Discord/CRM/Webhook URL with payload mapping.
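If you route results to a Google Sheet, a small Code node can flatten the structured Gemini output into one row per product. A minimal sketch follows; the shape of the LLM output (an array with name, summary, and url fields) is an assumption, so align the property names with whatever your output parser actually emits:

```javascript
// n8n Code node: map the structured LLM output to rows for the Google Sheets node.
// The `products` shape is assumed for illustration — adapt it to your Gemini parser.
const products = $json.products ?? [];

return products.map((p) => ({
  json: {
    Product: p.name,
    Summary: p.summary,
    Link: p.url,
  },
}));
```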
by explorium
Explorium Event-Triggered Outreach
This n8n, agent-based workflow automates outbound prospecting by monitoring Explorium event data (e.g. product launches, new office openings, new investments and more), researching companies, identifying key contacts, and generating tailored sales emails using the Explorium MCP server.

Template Workflow Overview

Node 1: Webhook Trigger
Purpose: listens for real-time product launch events pushed from Explorium's webhook system.
How it works:
- Explorium sends HTTP POST requests containing event data.
- The webhook payload includes company name, business ID, domain, product name, and event type.
Pay attention: a product launch is just one example. You can easily enroll in many more meaningful events. To learn about events and how to enroll in them, visit the events documentation.

Node 2: Company Research Agent
Agent type: Tools Agent
Purpose: enrich company data after an event occurs.
How it works:
- Uses Explorium MCP via the MCP Client tool to gather additional company data.
- Uses Anthropic Claude (Chat Model) to process and interpret company information for downstream personalization.

Node 3: Employee Data Retrieval
Purpose: retrieve prospect-level data for targeting.
How it works:
- Uses an HTTP Request node to call Explorium's fetch_prospects endpoint.
- Filters prospects by company business_id, departments (Product, R&D, etc.), and seniority levels (owner, cxo, vp, director, senior, manager, partner, etc.).
- Pay attention: follow the fetch prospects documentation for the full list of filters and best practices.
- Limits results to the top 5 relevant employees.
- Code nodes handle the filtering logic, cleaning the API response, and formatting data for downstream agents (see the sketch after this overview).

Node 4: Conditional Branch – Prospect Data Check
If node: checks whether prospect data was successfully retrieved.
Logic: if prospects are found → personalized emails per person; if no prospects → fall back to a company-level general email.

Node 5A: Email Writer #1 (No Prospect Data)
Agent type: Tools Agent
Purpose: write a generic outbound email using only company-level research and event info.
Powered by: Anthropic Chat Model

Node 5B: Loop Over Prospects → Email Writer #2 (Personalized)
Agent type: Tools Agent
Purpose: write a highly personalized email for each identified employee.
How it works:
- Loops through each individual prospect.
- Passes company research + employee data to the LLM agent.
- Generates customized emails referencing the prospect's title & department, the product launch, and a role-relevant Explorium value proposition.

Node 6: Slack Notifications
Purpose: posts completed emails to an internal Slack channel for review or testing before final deployment.
Future state: can be swapped with an email sequencing platform in production.
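The prospect post-processing in Node 3 illustrates what those Code nodes do. A minimal sketch, assuming the fetch_prospects response exposes a list of prospect records; the exact property names below are assumptions, so check the Explorium response you actually receive:

```javascript
// n8n Code node: clean the fetch_prospects response and keep the top 5 prospects.
// The response shape (`data`, `full_name`, `job_title`, ...) is assumed for illustration.
const prospects = $json.data ?? [];

const cleaned = prospects
  .filter((p) => p.email && p.job_title)          // drop records missing key fields
  .map((p) => ({
    name: p.full_name,
    title: p.job_title,
    department: p.department,
    seniority: p.seniority,
    email: p.email,
  }))
  .slice(0, 5);                                   // limit to the top 5 relevant employees

return cleaned.map((prospect) => ({ json: prospect }));
```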
Setup Requirements

Explorium API access
- MCP Client credentials for company enrichment and prospect fetching
- A registered webhook for event listening
- Get an Explorium API key

n8n configuration
- Secure environment variables for API keys & the webhook secret
- Code nodes configured for JSON transformation, filtering & signature validation

Customization Options

Personalization logic
- Update LLM prompt instructions to reflect ICP priorities
- Modify email templates based on role, department, or tenure logic
- Adjust fallback behavior when prospect data is unavailable

API request tuning
- Adjust page_size for the number of prospects retrieved
- Fine-tune seniority and department filters to match evolving targeting

Future expansion
- Swap Slack notifications for outbound email automation
- Integrate call task assignment directly into the CRM
- Introduce an engagement scoring feedback loop (opens, clicks, replies)

Troubleshooting Tips
- Validate webhook signature matching to prevent unauthorized requests (see the sketch below)
- Ensure the correct business_id is passed to the prospect fetching endpoint
- Confirm business enrichment returns sufficient data for the company researcher agents
- Review agent LLM responses for correct output structure and parsing consistency
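The signature validation mentioned above typically lives in a Code node right after the webhook trigger. The sketch below is a generic HMAC check only: the header name, algorithm, encoding, and secret variable are all assumptions, so replace them with whatever Explorium's webhook documentation specifies:

```javascript
// n8n Code node: generic HMAC-style webhook signature check (scheme assumed, not Explorium-specific).
const crypto = require('crypto');

const secret = process.env.EXPLORIUM_WEBHOOK_SECRET;      // assumed env var holding the shared secret
const rawBody = JSON.stringify($json.body);               // ideally use the raw request body instead
const received = $json.headers['x-explorium-signature'];  // assumed header name

const expected = crypto.createHmac('sha256', secret).update(rawBody).digest('hex');

if (expected !== received) {
  throw new Error('Webhook signature mismatch — rejecting request.');
}
return [{ json: $json.body }];
```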
by ist00dent
This n8n template allows you to perform real-time currency conversions by simply sending a webhook request. By integrating with the ExchangeRate.host API, you can get up-to-date exchange rates for over 170 world currencies, making it an incredibly useful tool for financial tracking, e-commerce, international business, and personal budgeting.

🔧 How it works
1. Receive Conversion Request Webhook: this node acts as the entry point for the workflow, listening for incoming POST requests. It's configured to expect a JSON body containing:
   - from: the 3-letter ISO 4217 currency code for the source currency (e.g., USD, PHP).
   - to: the 3-letter ISO 4217 currency code for the target currency (e.g., EUR, JPY).
   - amount: the numeric value you want to convert.
   Important: the ExchangeRate.host access_key is handled securely by n8n's credential system and should not be included in the webhook body or headers.
2. Convert Currency: this node makes an HTTP GET request to the ExchangeRate.host API (api.exchangerate.host). It dynamically constructs the URL using the from, to, and amount values from the webhook body. Your API access key is securely retrieved from n8n's pre-configured credentials (HTTP Query Auth type) and automatically added as a query parameter (access_key). The API then performs the conversion and returns a JSON object with the conversion details.
3. Respond with Converted Amount: this node sends the full currency conversion result received from ExchangeRate.host back to the service that initiated the webhook.

👤 Who is it for?
This workflow is ideal for:
- E-commerce platforms: display prices in local currencies on the fly for international customers, convert incoming international payments to your local currency for accounting, and calculate shipping costs in different currencies.
- Financial tracking & budgeting apps: update personal or business budgets with converted values, track expenses incurred in foreign currencies, and automate portfolio value conversion for multi-currency investments.
- International business & freelancers: generate invoices in a client's local currency based on your preferred currency, quickly estimate project costs or earnings in different currencies, and automate reconciliation of international transactions.
- Travel planning: convert travel expenses from one currency to another while abroad, or build simple tools to estimate trip costs in different countries.
- Data analysis & reporting: standardize financial data from various sources into a single currency for unified reporting, and build dashboards that display converted financial metrics.
- Custom integrations: connect to CRMs, accounting software, or internal tools to automate currency-related tasks, or build chatbots that can answer currency conversion queries.

📑 Data Structure
When you trigger the webhook, send a POST request with a JSON body structured as follows:

{ "from": "USD", "to": "PHP", "amount": 100 }

The workflow will return a JSON response similar to this (results will vary based on currencies and amount):

{ "date": "2025-06-03", "historical": false, "info": { "rate": 58.749501, "timestamp": 1717398188 }, "query": { "amount": 100, "from": "USD", "to": "PHP" }, "result": 5874.9501, "success": true }

⚙️ Setup Instructions
1. Get an ExchangeRate.host access key: go to https://exchangerate.host/ and sign up for a free API key.
2. Create an n8n credential for ExchangeRate.host:
   - In your n8n instance, go to Credentials.
   - Click "New Credential" and search for "HTTP Query Auth".
   - Set the Name (e.g., ExchangeRate.host API Key).
   - Set API Key to your ExchangeRate.host access key.
   - Set Parameter Name to access_key.
   - Set Parameter Position to Query.
   - Save the credential.
3. Import the workflow: in your n8n editor, click "Import from JSON" and paste the provided workflow JSON.
4. Configure the ExchangeRate.host API node:
   - Double-click the Convert Currency node.
   - Under "Authentication", select "Generic Credential Type".
   - Choose "HTTP Query Auth" as the Generic Auth Type.
   - Select the credential you created (e.g., "ExchangeRate.host API Key") from the dropdown.
5. Configure the webhook path: double-click the Receive Conversion Request Webhook node and, in the 'Path' field, set a unique and descriptive path (e.g., /convert-currency).
6. Activate the workflow: save and activate the workflow.

📝 Tips
This workflow is a powerful starting point. Here's how you can make it even more robust and integrated:
- Robust error handling: add an IF node after Convert Currency to check {{ $json.success }}. If false, branch to an Error Trigger node or send an alert (e.g., Slack, email) with {{ $json.error.info }} to notify you of API issues or invalid inputs. Include a Try/Catch block to gracefully handle network issues or malformed responses.
- Input validation & defaults: add a Function node after the webhook to validate that from, to, and amount are present and in the correct format; if not, return a clear error message to the user (see the sketch below). Set default from or to currencies if they are not provided in the webhook, making the API more flexible.
- Logging & auditing: after a successful conversion, use a Google Sheets, Airtable, or database node (e.g., PostgreSQL, MongoDB) to log every conversion request, including the input currencies, amount, converted result, date, and possibly the calling IP (from the webhook headers). This is crucial for financial auditing and analysis.
- Rate limits & caching: if you anticipate many requests, be mindful of ExchangeRate.host's API rate limits. You can introduce a Cache node to store recent conversion results for a short period, reducing redundant API calls for common conversions. Alternatively, add a Delay node to space out requests if you're hitting limits.
- Format & rounding: use a Function node or Set node to format the result to a specific number of decimal places (e.g., {{ $json.result.toFixed(2) }}). Add currency symbols or full currency names to the output for better readability.
- Alerting on significant changes: chain this workflow with a Cron or Schedule node to periodically fetch exchange rates for a pair you care about (e.g., USD to EUR). Use an IF node to compare the current rate with a previously stored rate. If the change exceeds a certain percentage, send an alert via Slack, email, or Telegram to notify you of significant market shifts.
- Integration with payment gateways: for e-commerce, combine this with payment gateway nodes (e.g., Stripe, PayPal) to automatically convert customer payments received in foreign currencies to your base currency before recording.
- Multi-currency pricing for products: use this workflow in conjunction with your product database. When a user selects a different country/currency, trigger this webhook to dynamically convert product prices and display them instantly.
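As a starting point for the input-validation tip above, a Function/Code node placed right after the webhook might look like this sketch. It only checks presence and basic format; extend it with whatever defaults you want to apply:

```javascript
// n8n Code node: validate the incoming conversion request before calling the API.
const body = $json.body ?? {};
const from = String(body.from ?? '').toUpperCase();
const to = String(body.to ?? '').toUpperCase();
const amount = Number(body.amount);

if (!/^[A-Z]{3}$/.test(from) || !/^[A-Z]{3}$/.test(to)) {
  throw new Error('from and to must be 3-letter ISO 4217 currency codes, e.g. "USD".');
}
if (!Number.isFinite(amount) || amount <= 0) {
  throw new Error('amount must be a positive number.');
}

// Pass the validated values on to the Convert Currency node.
return [{ json: { from, to, amount } }];
```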
by Vincent Belmehel
Purpose
This workflow automatically creates a subscriber in a given Beehiiv publication when a new opt-in is registered in a given Systeme.io sales funnel.

Good to know: the integration with Systeme.io is done at the sales funnel level, not at the account level. If you have several sales funnels, you can use the same workflow several times.

Quick Setup
- Configure your sales funnel in Systeme.io to create and trigger a webhook after an opt-in.
- Open the "On New Systeme.io Optin" node to find the webhook URL needed to configure your sales funnel on Systeme.io.
- Configure the "Configure Workflow" node:
  - Add your Beehiiv publication ID.
  - If you know the subscriber's first and last name and want to send it to Beehiiv, configure the custom field names for first and last name.
  - Add one or more email addresses to which to send alert notifications in the event of a problem (separated by commas).
- If you have not already done so:
  - Connect your Beehiiv account in the "Create New Beehiiv Subscriber" node.
  - Connect your Gmail account in the "Send Email Alert (Beehiiv API error)" node.

How It Works
- As soon as a new opt-in is registered on your sales funnel, Systeme.io triggers the workflow (via a webhook).
- Only requests actually coming from Systeme.io are considered (their IP addresses are whitelisted for security reasons — see the sketch below).
- A new subscriber is added to your Beehiiv publication (via an API call).
- If available in Systeme.io, UTM tags (utm_source, utm_medium and utm_campaign) are transferred to Beehiiv to correctly track where your subscribers are coming from.
- If an error occurs during the Beehiiv API call, an alert notification is sent to you (via email).

Requirements
- A Systeme.io account
- A Beehiiv account with an active publication
- A Gmail account

Benefits
- Automate & scale your email marketing efforts seamlessly
- No more manual tasks to keep your subscriber list always up-to-date
- Focus on creating a newsletter that stands out, not on the technical side

Check Out My Other Templates
👉 https://n8n.io/creators/belmehel/
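The IP whitelisting step can be done in a small Code or IF node right after the webhook. A minimal sketch follows; the IP list is a placeholder (use the addresses published by Systeme.io), and the header that carries the client IP depends on how your n8n instance sits behind a reverse proxy:

```javascript
// n8n Code node: only let requests from Systeme.io's published IPs through.
const ALLOWED_IPS = ['203.0.113.10', '203.0.113.11'];     // placeholder — replace with Systeme.io's real IPs

// Depending on your deployment, the client IP may arrive in x-forwarded-for.
const forwarded = $json.headers['x-forwarded-for'] ?? '';
const clientIp = forwarded.split(',')[0].trim();

if (!ALLOWED_IPS.includes(clientIp)) {
  throw new Error(`Rejected webhook call from unexpected IP: ${clientIp}`);
}

return [{ json: $json.body }];                             // pass the opt-in payload downstream
```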
by Louis Chan
How it works
Transform medical documents into structured data using Google Gemini AI with enterprise-grade accuracy.
- Classifies document types (receipts, prescriptions, lab reports, clinical notes)
- Extracts text with 95%+ accuracy using advanced OCR
- Structures data according to medical taxonomy standards
- Supports multiple languages (English, Chinese, auto-detect)
- Tracks processing costs and quality metrics automatically

Set up steps
Prerequisites: Google Gemini API key (get it from Google AI Studio)
Quick setup:
1. Import this workflow template
2. Configure Google Gemini API credentials in n8n
3. Test with a sample medical document URL
4. Deploy your webhook endpoint

Usage
Send a POST request to your webhook:

{ "image_url": "https://example.com/medical-receipt.jpg", "expected_type": "financial", "language_hint": "auto" }

Get a structured response:

{ "success": true, "result": { "documentType": "financial", "metadata": { "providerName": "Dr. Smith Clinic", "createdDate": "2025-01-06", "currency": "USD" }, "content": { "amount": 150.00, "services": [...] }, "quality_metrics": { "overall_confidence": 0.95 } } }

Use cases
Healthcare organizations
- Medical billing automation: process receipts and invoices automatically
- Insurance claim processing: extract data from claim documents
- Clinical documentation: digitize patient records and notes
- Data standardization: consistent structured output format

System integrators
- EMR integration: connect with existing healthcare systems
- Workflow automation: reduce manual data entry by 90%
- Multi-language support: handle international medical documents
- Quality assurance: built-in confidence scoring and validation

Supported Document Types
- Financial: medical receipts, bills, insurance claims, invoices
- Clinical: medical charts, progress notes, consultation reports
- Prescription: prescriptions, medication lists, pharmacy records
- Administrative: referrals, authorizations, patient registration
- Diagnostic: lab reports, test results, screening reports
- Legal: medical certificates, documentation forms
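For a quick test once the webhook is deployed, you can call it from any HTTP client using the request body shown above. A sketch with fetch; the URL is a placeholder for your own n8n webhook endpoint:

```javascript
// Send one document to the deployed webhook and print the structured result.
const response = await fetch('https://your-n8n-instance.example.com/webhook/medical-docs', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    image_url: 'https://example.com/medical-receipt.jpg',
    expected_type: 'financial',
    language_hint: 'auto',
  }),
});

const data = await response.json();
console.log(data.result?.documentType, data.result?.quality_metrics?.overall_confidence);
```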
by Mark Shcherbakov
Video Guide
I prepared a detailed video guide that shows the whole process of building a call analyzer.

Who is this for?
This workflow is ideal for sales teams, customer support managers, and online education services that conduct follow-up calls with clients. It's designed for those who want to leverage AI to gain deeper insights into client needs and upsell opportunities from recorded calls.

What problem does this workflow solve?
Many follow-up sales calls lack structured analysis, making it challenging to identify client needs, gauge interest levels, or uncover upsell opportunities. This workflow enables automated call transcription and AI-driven analysis to generate actionable insights, helping teams improve sales performance, refine client communication, and streamline upselling strategies.

What this workflow does
This workflow transcribes and analyzes sales calls using AssemblyAI, OpenAI, and Supabase to store structured data. Recorded calls are processed as follows:
1. Transcribe the call with AssemblyAI: converts audio into text with speaker labels for clarity.
2. Analyze the transcription with OpenAI: using a predefined JSON schema, OpenAI analyzes the transcription to extract metrics like client intent, interest score, upsell opportunities, and more.
3. Store and access results in Supabase: stores both transcription and analysis data in a Supabase database for further use and display in interfaces.

Setup

Preparation
- Create accounts: set up accounts for n8n, Supabase, AssemblyAI, and OpenAI.
- Get a call link: upload audio files to public Supabase storage or Dropbox to generate a direct link for transcription.
- Prepare artifacts for OpenAI:
  - Define metrics: identify the business metrics you want to track from call analysis, such as client needs, interest score, and upsell potential.
  - Generate a JSON schema: use GPT to design a JSON schema for structuring OpenAI's responses, enabling efficient storage, analysis, and display.
  - Create an analysis prompt: write a detailed prompt for GPT to analyze calls based on your metrics and JSON schema.

Scenario 1: Transcribe Call with AssemblyAI
Set up the request (see the sketch below):
- Header authentication: set Authorization with your AssemblyAI API key.
- URL: POST to https://api.assemblyai.com/v2/transcript/
- Parameters:
  - audio_url: direct URL of the audio file.
  - webhook_url: URL for an n8n webhook to receive the transcription result.
- Additional settings:
  - speaker_labels (true/false): enables speaker diarization.
  - speakers_expected: specify the expected number of speakers.
  - language_code: set the language (default: en_us).

Scenario 2: Process Transcription with OpenAI
- Webhook configuration: set up a POST webhook to receive AssemblyAI's transcription data.
- Get the transcription:
  - Header authentication: set Authorization with your AssemblyAI API key.
  - URL: GET https://api.assemblyai.com/v2/transcript/<transcript_id>
- Send to OpenAI:
  - URL: POST to https://api.openai.com/v1/chat/completions
  - Header authentication: set Authorization with your OpenAI API key.
  - Body parameters:
    - Model: use gpt-4o-2024-08-06 for JSON Schema support, or gpt-4o-mini for a less costly option.
    - Messages: system contains the main analysis prompt; user contains the combined speakers' utterances to analyze, in text format.
    - Response format: type json_schema, with your JSON schema for structured responses.
- Save results in Supabase:
  - Operation: create a new record.
  - Table name: demo_calls.
  - Fields:
    - Input: transcription text, audio URL, and transcription ID.
    - Output: parsed JSON response from OpenAI's analysis.
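The Scenario 1 request, expressed as a single HTTP call for reference (in the workflow this is done with an n8n HTTP Request node; the audio and webhook URLs below are placeholders):

```javascript
// Submit a transcription job to AssemblyAI; the result is delivered to the n8n webhook.
const res = await fetch('https://api.assemblyai.com/v2/transcript', {
  method: 'POST',
  headers: {
    'Authorization': process.env.ASSEMBLYAI_API_KEY,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    audio_url: 'https://your-bucket.supabase.co/storage/v1/object/public/calls/call-001.mp3',
    webhook_url: 'https://your-n8n-instance.example.com/webhook/assemblyai-result',
    speaker_labels: true,        // enable speaker diarization
    speakers_expected: 2,
    language_code: 'en_us',
  }),
});

const { id } = await res.json();  // transcript ID, later fetched via GET /v2/transcript/<id>
console.log('Transcript job created:', id);
```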
by Rodrigue Gbadou
How it works
This comprehensive recruitment automation workflow transforms your hiring process from manual screening to intelligent candidate management. The system begins by automatically collecting CVs from multiple job boards and career platforms, immediately parsing each submission with AI to extract key information including skills, experience levels, educational background, and career progression patterns.

Once parsed, the workflow employs predictive scoring algorithms that evaluate each candidate against your specific job requirements and company culture criteria. This multi-dimensional analysis considers technical skills alignment, experience relevance, cultural fit indicators, and career trajectory patterns to generate compatibility scores.

The system then moves qualified candidates into automated interview scheduling, coordinating availability across hiring managers, team members, and candidates while accounting for time zones and calendar conflicts. Finally, successful candidates enter a personalized onboarding workflow that adapts to their role, department, and experience level, ensuring smooth integration into your organization.

Target audience and problem solved
This workflow is designed for HR departments, talent acquisition teams, and growing companies struggling with time-intensive recruitment processes. It specifically addresses the challenges of manual CV screening, subjective candidate evaluation, scheduling conflicts, and inconsistent onboarding experiences. Organizations processing high volumes of applications or seeking to reduce recruitment bias while maintaining quality standards will benefit most from this automation.

Set up steps
1. Prerequisites: ensure you have API access to your chosen AI parsing service (OpenAI, Affinda, or equivalent), active accounts on target job boards, and administrative access to your calendar and ATS systems.
2. Configure job board integrations: connect your LinkedIn Recruiter, Indeed, and Glassdoor accounts using their respective APIs. Set up webhook endpoints to automatically capture new CV submissions and configure filtering criteria based on job titles, locations, and basic qualifications.
3. Establish the AI parsing service: choose and configure your CV analysis provider (OpenAI for natural language processing, Affinda for specialized CV parsing, or alternative services). Set up API credentials and define extraction templates for skills, experience, education, and custom fields relevant to your industry.
4. Integrate calendar systems: connect Google Calendar, Outlook, or your preferred scheduling platform. Configure availability windows for all hiring team members, set interview duration templates, and establish buffer times between meetings.
5. Synchronize your ATS platform: link your Applicant Tracking System (Workday, BambooHR, Greenhouse, etc.) to ensure seamless candidate data flow. Map workflow fields to your ATS schema and establish status update triggers.
6. Connect interview tools: integrate video conferencing platforms (Zoom, Microsoft Teams, Google Meet) for automatic meeting room creation and invitation distribution. Configure recording settings and waiting room preferences.
7. Link your HRMS for onboarding: connect your Human Resource Management System to trigger personalized onboarding sequences based on role type, department, and seniority level.
Key Features
- 🧠 Advanced CV analysis: leverages machine learning to automatically extract and categorize skills, experience, education, certifications, and career progression patterns with 95% accuracy
- 📊 Multi-criteria scoring: implements customizable evaluation matrices considering technical skills, soft skills, experience relevance, cultural fit indicators, and growth potential
- 📅 Intelligent scheduling: automatically coordinates complex interview schedules across multiple stakeholders, considering time zones, availability preferences, and interview type requirements
- 🎯 Precise candidate matching: generates compatibility percentages based on job requirements, team dynamics, and long-term career alignment factors
- ⚡ Accelerated recruitment cycle: reduces time-to-hire by up to 60% through automated screening, intelligent prioritization, and streamlined communication workflows
- 👥 Collaborative evaluation: enables structured feedback collection from multiple interviewers with standardized scoring rubrics and consensus-building tools
- 📱 Enhanced candidate experience: provides mobile-optimized interfaces for application tracking, interview scheduling, and communication throughout the recruitment journey
- 🔄 Continuous optimization: automatically tracks and analyzes recruitment metrics to continuously improve scoring algorithms and process efficiency

Customization options
The workflow offers extensive customization capabilities, including adjustable scoring weights for different criteria, industry-specific skill taxonomies, custom interview formats, and role-based onboarding paths. Organizations can configure approval workflows, set up custom notification templates, and establish specific integration parameters to match their unique recruitment processes and company culture.

This automation solution transforms recruitment from a time-intensive manual process into a strategic, data-driven system that improves both hiring quality and candidate experience while significantly reducing administrative overhead.
by Srinivasan KB
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

What is DIGIPIN?
DIGIPIN (Digital Pincode) is a 10-character alphanumeric code introduced by India Post. It maps any 3x3 meter square in India to a unique digital address. This helps precisely locate homes, shops, or landmarks, especially in areas where physical addresses are inconsistent or missing.

What this workflow does
This workflow creates a fully offline DIGIPIN microservice using only JavaScript - no external APIs are used. You get two HTTP endpoints:
- GET /generate-digipin?lat={latitude}&lon={longitude} → returns a DIGIPIN
- GET /decode-digipin?digipin={code} → returns the latitude and longitude

You can plug this into any system to:
- Convert GPS coordinates to a DIGIPIN
- Convert a DIGIPIN back to coordinates

How it works
- An HTTP Webhook node receives the request
- A JS Function node either encodes or decodes based on the input (a simplified sketch of the idea follows below)
- The result is returned as a JSON response
All the logic is handled inside the workflow - no API keys, no external calls.

Why use this
- Fast and lightweight
- Easily extendable: you can connect this to forms, CRMs, apps, or spreadsheets
- Ideal for field agents, address validation, logistics, or rural operations
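To make the encoding idea concrete, here is a deliberately simplified sketch of hierarchical grid subdivision: the country's bounding box is split into a 4x4 grid, one symbol is emitted per level, and the selected cell becomes the new bounding box, repeated ten times. The symbol table and bounding box below are placeholders and this is not the official India Post algorithm — the actual Function node in the workflow implements the published DIGIPIN specification. Decoding simply reverses the process, narrowing the box symbol by symbol and returning the cell centre:

```javascript
// Simplified illustration only: encode lat/lon into a 10-symbol code by recursive
// 4x4 grid subdivision. SYMBOLS and BOUNDS are placeholders, NOT the official values.
const SYMBOLS = 'ABCDEFGHJKLMNPQR'.split('');                               // 16 placeholder symbols
const BOUNDS = { minLat: 2.5, maxLat: 38.5, minLon: 63.5, maxLon: 99.5 };   // placeholder bounding box

function encode(lat, lon, levels = 10) {
  let { minLat, maxLat, minLon, maxLon } = BOUNDS;
  let code = '';
  for (let i = 0; i < levels; i++) {
    const latStep = (maxLat - minLat) / 4;
    const lonStep = (maxLon - minLon) / 4;
    const row = Math.min(3, Math.floor((lat - minLat) / latStep));
    const col = Math.min(3, Math.floor((lon - minLon) / lonStep));
    code += SYMBOLS[row * 4 + col];
    // Zoom into the selected cell for the next, finer level.
    minLat += row * latStep; maxLat = minLat + latStep;
    minLon += col * lonStep; maxLon = minLon + lonStep;
  }
  return code;
}

console.log(encode(28.6139, 77.2090));   // example: New Delhi coordinates
```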