by Brian Coyle
## Description

Candidate Engagement | Resume Screening | AI Voice Interviews | Applicant Insights

This intelligent n8n workflow automates extracting and scoring resumes received through a company career page, populating a Notion database with AI insights, where the recruiter or hiring manager can automatically invite the applicant to an instant interview with an ElevenLabs AI voice agent. After the agent conducts the behavior-based interview, the workflow scores the overall interview against customizable evaluation criteria and updates the Notion database with AI insights about the applicant.

AI-powered resume screening and a voice AI that interviews like a recruiter, with AI insights in a Notion dashboard.

## Who is this for?

HR teams, recruiters, and talent acquisition professionals. This workflow is ideal for anyone looking for a foundational, extensible framework to automate early-stage recruiting. Whether you're exploring AI for the first time or scaling automation across your hiring process, this template provides a base for screening, interviewing, and tracking candidates, powered entirely by n8n, ElevenLabs, Notion, and LLM integrations.

Be sure to consult state and country regulations with respect to AI compliance, AI bias audits, AI risk assessments, and disclosure requirements.

## What problem is this workflow solving?

Manually screening resumes and conducting initial interviews slows down hiring. This template automates:

- Resume assessment against the job description.
- Scheduling first- and second-round interviews.
- First-round AI-led behavioral interviews with an AI scoring assessment.
- Centralized tracking of AI assessments in Notion.

## What this does

This customizable tool, configured to manage 3 requisitions in parallel, automates the application process, the resume screen, and first-round behavioral interviews.

### Pre-screen applicants with AI

Immediately screens and scores the applicant's resume against the job description.
The AI Agent generates a score and an AI assessment, adding both to the applicant's profile in Notion. Notion automatically notifies the hiring manager when a resume receives a score of 8 or higher.

### Voice AI that interviews like a recruiter

The AI voice agent adapts probing questions based on the applicant's responses and intelligently dives deeper into skills and experience, assessing answers against a scoring rubric for each question.

### AI applicant insights in Notion

Get detailed post-interview AI analysis, including interview recordings and question-by-question scoring breakdowns, to help identify who you should advance to the next stage in the process. AI insight is provided in the Notion ATS dashboard, with drag-and-drop to advance top candidates to the next interview stage.

## How it works

Link to Notion Template

- **Notion Career Page:** published to the web; can be integrated with your preferred job board posting system.
- **Notion Job Posting:** gateway for applicants to apply to active requisitions with a 'Click to Apply' button.
- **Application Form:** an n8n webform embedded into the Notion job posting captures applicant information and routes it for AI processing.

### AI Agent evaluates resume against job description

The AI Agent evaluates the resume against the job description stored in Notion and scores the applicant on a scale of 1 to 10, providing a rationale for the score.

### Creates ATS record in Notion with assessment and score

The workflow creates the applicant record in the Notion ATS, where recruiters and hiring managers see applicants in a filtered view sorted by the AI-generated resume score. Users can automatically advance applicants to the next step in the process (the AI conversation interview) with drag-and-drop functionality.

### Invites applicant to an instant AI interview

Dragging the applicant to the AI Interview step in the Notion ATS dashboard triggers a Notion automation that sends the applicant an email with a link to the ElevenLabs conversation AI agent.
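To illustrate the "score of 8 or higher" routing described above, here is a plain-JavaScript sketch of the kind of logic an n8n Code node could apply after the AI Agent returns its assessment. The field names are hypothetical, not the template's actual schema:

```javascript
// Hypothetical sketch: shape the AI resume assessment into the record
// written to Notion, flagging applicants at or above the notification
// threshold. Field names are illustrative.
function routeApplicant(assessment) {
  const score = Number(assessment.resume_score);
  return {
    name: assessment.full_name,
    score,
    rationale: assessment.rationale,
    notifyHiringManager: score >= 8, // triggers the Notion notification
  };
}
```

In the template itself, the notification is handled by Notion once the record lands in the ATS database; this sketch only shows the threshold decision.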
The AI conversation agent is provided with instructions on how to conduct the behavior-based interview, including probing questions, for the specific role.

### AI conversation agent behavior-based interview

The email link resolves to an ElevenLabs AI conversation agent that has been instructed to interview applicants using pre-defined interview questions, a scoring rubric, the job description, and the company profile. The ElevenLabs agent assesses the applicant on a scale of 1 to 5 for each interview question and provides an overall assessment of the interview based on established evaluation criteria.

Click to hear the AI voice agent in action. Example:

- **Role:** IT Support Analyst
- **Mark:** ElevenLabs AI agent instructed to interview applicants for the specific role
- **Gemini:** Google AI coached to answer questions as an IT Support Analyst being interviewed

### Updates Notion record with interview assessment and score

All results, including the conversation transcript, interview scores, and the rationale for the assessment, are automatically added back to the applicant's profile in Notion, where the hiring manager can validate the AI assessment by skimming through the embedded audio file.

- **AI Interview Overall Score:** 1 to 5, based on responses to all questions and probes. The AI agent confirms that it was able to evaluate the interview using the assigned rubric.
- **AI Interview Criteria Score:** Success/Failure, based on responses to individual interview questions.

### Invites applicant to second interview with Hiring Manager

Dragging the applicant to the 'Hiring Manager Interview' step in the Notion ATS dashboard triggers a Notion automation that sends an email with a link to the Hiring Manager's calendar scheduling solution.
## Configuration and Set Up

### Accounts & API Keys

You will need accounts and credentials for:

- n8n (hosted or self-hosted)
- ElevenLabs (for the AI conversation agent)
- Gemini (for LLM model access)
- Google Drive (to back up applicant data)
- Calendly (to automate interview scheduling)
- Gmail (to send automated emails)

### Data / Documents to implement

- Job descriptions for each role
- Interview questions for each role
- Evaluation criteria for each interview question

### Notion Set Up

Customize your Notion Career Page. Link to the free Notion template that enables the workflow:

**Update the Notion job description database**

- Update the job description(s) for each role.
- Add interview questions to the job description database page in Notion.
- Add evaluation criteria to the job description database page in Notion.
- Edit each 'Click to Apply' button in the job description template so it resolves to the corresponding n8n 'Application Form' webform production URL (detail provided below).

**Notion Applicant Tracker**

In the Applicant Tracker database, update the position titles (tab headings) in the custom database view so they reflect the title of the position you are posting. Edit the filter for each tab so it matches the position title.

**Notion Email Automation**

Update the Notion automation templates used to invite applicants to the AI interview and the Hiring Manager interview. Note: trigger the email automation by dragging the applicant profile to the next Applicant Comm Status in the Applicant Tracker.

- **AI Interview invite template:** revise the position title to reflect the role you are posting; include the link to your conversation AI agent for that role in the email body. Note: each unique role uses an ElevenLabs AI conversation agent designed for that role.
- **Hiring Manager Interview invite template:** revise the position title to reflect the role you are posting; include the link to your Calendly page (or a similar scheduling solution) to automate interview scheduling.
### n8n Configuration

**Workflow 1**

- **Application Forms (3 nodes, one for each job):** update the n8n form title and description to match the job description you configured in Notion. Confirm that the Job Code in the Applicant Form node matches the Job Code in Notion for that position. Edit the Form Response to customize the message displayed after the applicant clicks submit.
- **Upload CV - Google Drive:** authenticate your Google Drive account and select the folder that will be used to store resumes.
- **Get Job Description - Notion:** authenticate your Notion account and select your Career Page from the list of databases that contain your job descriptions.
- **Applicant Data Backup - Google Sheet:** create a Google Sheet where you will track applicant data for AI compliance reporting requirements. Open the node in n8n and use the field names in the node as the Google Sheet column headings.

**Workflow 2**

- **Elevenlabs Web Hook (Node 1):** edit the Web Hook POST node and copy the production URL displayed in the node. This URL is entered into the ElevenLabs AI conversation agent post-call webhook described below.
- **AI Agent:** authenticate your LLM model (Gemini in this example) and add your Notion database as a tool to pull the evaluation_criteria hosted in Notion for the specific role.
- **Extract Audio:** create an ElevenLabs API key for your conversation agent and enter that key as a JSON header in the Extract Audio node.
- **Upload Audio to Drive - Google Drive:** authenticate your Google Drive account and select the folder that will be used to store the audio file.

### Elevenlabs Configuration

- Create an ElevenLabs account.
- Create a conversation AI agent.
- **Add First Message and System Prompt:** design the 'First Message' and 'System Prompt' that guide the AI agent conducting the interview. Tip: provide an instruction that limits the number of probes per interview question.
- **Knowledge Base:** upload your role-specific interview questions and job description, using the same text that is stored in your Notion Career Page for the role.
You can also add a document about your company and instruct the ElevenLabs agent to answer questions about culture, strategy, and company growth.

- **Analysis: Evaluation Criteria:** add your evaluation criteria (under 2,000 characters) for each interview question / competency.
- **Analysis: Data Collection:** add the following elements, using the exact character strings shown below.
  - `phone_number_AI_screen`: "capture applicant's phone number provided at the start of the conversation and share this as a string, integers only."
  - `full_name`: "capture applicant's full name provided at the start of the conversation."
- **Advanced: Max Duration:** set the maximum duration for the interview in seconds. The AI agent will time out at the max duration.
- **Conversation AI Widget:** customize your AI conversation agent landing page, including the position title and company name.
- **AI Conversation Agent URL:** copy the AI conversation agent URL and add it to the Notion email template triggered by the AI Interview email automation. Use a distinct AI agent URL for each distinct job description.
- **Enable the ElevenLabs post-call webhook for your conversation agent:** log into your ElevenLabs account, go to Conversational AI Settings, and click on Post-Call Webhook. This is where you enter the production URL from the n8n Web Hook node (Workflow 2). This sends the AI voice agent output to your n8n workflow, which feeds back into your Notion dashboard.
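As a rough illustration of what Workflow 2 does with the post-call webhook, here is a plain-JavaScript sketch of pulling out the fields this template cares about. The payload shape shown here is an assumption; check the actual body your ElevenLabs agent posts before relying on any of these paths:

```javascript
// Hypothetical sketch of an n8n Code node parsing an ElevenLabs
// post-call webhook body. Every field path below is an assumption
// to be verified against your agent's real payload.
function extractInterviewResults(payload) {
  const analysis = payload.analysis || {};
  const collected = analysis.data_collection_results || {};
  return {
    conversationId: payload.conversation_id,
    phone: collected.phone_number_AI_screen && collected.phone_number_AI_screen.value,
    fullName: collected.full_name && collected.full_name.value,
    overallAssessment: analysis.transcript_summary,
    callSuccessful: analysis.call_successful,
  };
}
```

The `phone_number_AI_screen` and `full_name` keys mirror the data-collection elements configured above, which is why the workflow asks for the exact character strings.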
by Peter Zendzian
This n8n template demonstrates how to automate comprehensive web research using multiple AI models to find, analyze, and extract insights from authoritative sources.

Use cases are many: try automating competitive analysis research, finding the latest regulatory guidance from official sources, gathering authoritative content for reports, or conducting market research on industry developments!

## Good to know

- Each research query typically costs $0.08-$0.34, depending on the number of sources found and processed. The workflow includes smart filtering to minimize unnecessary API calls.
- The workflow requires multiple AI services and may need additional setup time compared to simpler templates.
- Qdrant storage is optional and can be removed without affecting performance.

## How it works

- Your research question gets transformed into optimized Google search queries that target authoritative sources while filtering out low-quality sites.
- Apify's RAG Web Browser scrapes the content and converts pages to clean markdown format.
- Claude Sonnet 4 evaluates each article for relevance and quality before full processing.
- Articles that pass the filter get analyzed in parallel: one pipeline creates focused summaries while another extracts specific claims and evidence.
- GPT-4.1 Mini ranks all findings and presents the top 3 most valuable insights and summaries.
- All processed content gets stored in your Qdrant vector database to prevent duplicate processing and enable future reference.

## How to use

The manual trigger node is used as an example, but feel free to replace it with other triggers such as webhooks, form submissions, or scheduled research. You can modify the configuration variables in the Set node to customize Qdrant URLs, collection names, and quality thresholds for your specific needs.
## Requirements

- OpenAI API account for GPT-4.1 Mini (query optimization, summarization, ranking)
- Anthropic API account for Claude Sonnet 4 (content filtering)
- Apify account for web scraping capabilities
- Qdrant vector database instance (local or cloud)
- Ollama with the nomic-embed-text model for embeddings

## Customizing this workflow

Web research automation can be adapted for many specialized use cases. Try focusing on specific domains like legal research (targeting .gov and .edu sites), medical research (PubMed and health authorities), or financial analysis (SEC filings and analyst reports).
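The domain-focused customization above can be sketched as a small query-building helper. This is not the template's actual query-optimization prompt, just an illustration of how `site:` operators restrict a Google search to authoritative domains:

```javascript
// Sketch: scope a research question to specific domains using Google's
// site: operator, as suggested for legal/medical/financial research.
function buildScopedQuery(question, domains = []) {
  const scope = domains.map((d) => `site:${d}`).join(" OR ");
  return scope ? `${question} (${scope})` : question;
}

// buildScopedQuery("data privacy rules", [".gov", ".edu"])
// → 'data privacy rules (site:.gov OR site:.edu)'
```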
by Ventsislav Minev
# Google Drive Duplicate File Manager 🧹📁

**Purpose:** Automate the process of finding and managing duplicate files in your Google Drive.

## Who's it for?

- Individuals and teams aiming to streamline their Google Drive.
- Anyone tired of manual duplicate file cleanup.

## What it solves

- Saves storage space 💾.
- Reduces file confusion 😕➡️🙂.
- Automates tedious cleanup tasks 🤖.

## How it works

1. **Trigger:** monitors a Google Drive folder for new files.
2. **Configuration:** sets rules for keeping and handling duplicates.
3. **Find Duplicates:** identifies duplicate files based on their content (md5Checksum).
4. **Action:** either moves duplicates to trash or renames them.

## Setup Guide

- **Google Drive Trigger ⏰:** set up the trigger to watch a specific folder or your entire drive (use caution with the root folder! ⚠️). Configure the polling interval (default: every 15 minutes).
- **Config Node ⚙️:**
  - `keep`: choose whether to keep the "first" or "last" uploaded file (default: "last").
  - `action`: select "trash" to delete duplicates or "flag" to rename them with a "DUPLICATE-" prefix (default: "flag").
  - `owner` & `folder`: taken from the trigger; only change if needed.

## Key Considerations

- **Google Drive API limits:** be mindful of API usage.
- **Folder scope:** the workflow handles one folder depth by default. WARNING: if configured to work with the root folder, all files in all sub-directories are processed, so use this option with caution, since the workflow might trash or rename important files.
- **Google Apps:** Google Docs are ignored, since they are not actual binary files and their content can't be compared.

Enjoy your clean Google Drive! ✨
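The duplicate-detection step above can be sketched in plain JavaScript. This is an illustration of the logic, not the workflow's actual node code; it assumes Drive file objects with `id`, `name`, `md5Checksum`, and `createdTime` fields, as returned by the Google Drive API:

```javascript
// Sketch: group files by md5Checksum, keep the "first" or "last" upload,
// and mark every other copy for the chosen action ("trash" or "flag").
function findDuplicates(files, keep = "last", action = "flag") {
  const groups = {};
  for (const f of files) {
    if (!f.md5Checksum) continue; // Google Docs have no checksum — ignored
    if (!groups[f.md5Checksum]) groups[f.md5Checksum] = [];
    groups[f.md5Checksum].push(f);
  }
  const actions = [];
  for (const group of Object.values(groups)) {
    if (group.length < 2) continue;
    const sorted = [...group].sort(
      (a, b) => new Date(a.createdTime) - new Date(b.createdTime)
    );
    const keeper = keep === "first" ? sorted[0] : sorted[sorted.length - 1];
    for (const f of sorted) {
      if (f.id !== keeper.id) actions.push({ id: f.id, name: f.name, action });
    }
  }
  return actions;
}
```

Comparing `md5Checksum` values means duplicates are detected by content, regardless of filename.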
by Jimleuk
If you have a shared or personal drive location with a high frequency of files created by humans, it can become difficult to organise. This may not matter... until you need to search for something! This n8n workflow works with the local filesystem to target the messy folder and categorise as well as organise its files into subdirectories automatically.

## Disclaimer

Unfortunately, due to the intended use-case, this workflow will not work on n8n Cloud; a self-hosted version of n8n is required.

## How it works

- Uses the local file trigger to activate once a new file is introduced to the directory.
- The new file's filename and filetype are analysed using AI to determine the best location to move the file. The AI assesses the current subdirectories so as not to create duplicates. If a relevant subdirectory is not found, a new subdirectory is suggested.
- Finally, an Execute Command node uses the AI's suggestions to move the new file into the correct location.

## Requirements

- Self-hosted version of n8n. The nodes used in this workflow only work in the self-hosted version.
- If you are using Docker, you must create a bind mount to a host directory.
- Mistral.ai account for the LLM model.

## Customise this workflow

- If the frequency of file creation is high enough, you may not want the trigger to activate on every new file created event. Switch to a timer to avoid concurrency issues.
- Want to go fully local? A version of this workflow is available which uses Ollama instead. You can download this template here: https://drive.google.com/file/d/1iqJ_zCGussXpfaUBYGrN5opziEFAEQMu/view?usp=sharing
by n8n Team
This workflow pushes Stripe charges to HubSpot contacts. It uses the Stripe API to get all charges and the HubSpot API to update the contacts. The workflow will create a new HubSpot property to store the total amount charged. If the property already exists, it will update the property.

## Prerequisites

- Stripe credentials.
- HubSpot credentials.

## How it works

- On a schedule, check whether the property exists in HubSpot; if it doesn't, create it. The default schedule is once a day at midnight.
- Once the property is ascertained, the first Stripe node gets all charges.
- Once the charges are returned, the second Stripe node gets extra customer information.
- Once the customer information is returned, the Merge data node merges the customer information with the charges so that the next node, Aggregate totals, can calculate the total amount charged per contact.
- Once we have the total amount charged per contact, the Create or update customer node creates a new HubSpot contact if it doesn't exist, or updates the contact if it does, with the total amount charged.
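The "Aggregate totals" step can be sketched as follows. The `customer_email` field name is an assumption for illustration (the workflow merges customer data onto each charge first); note that Stripe reports amounts in the smallest currency unit, so cents are divided by 100:

```javascript
// Sketch: sum Stripe charge amounts per customer email. Stripe's
// `amount` is in the smallest currency unit (e.g. cents), hence /100.
// The customer_email field is assumed to come from the Merge step.
function totalChargedPerContact(charges) {
  const totals = {};
  for (const c of charges) {
    const email = c.customer_email;
    if (!email) continue; // skip charges with no matched customer
    totals[email] = (totals[email] || 0) + c.amount / 100;
  }
  return totals;
}
```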
by Billy Christi
## Who is this for?

This workflow is ideal for:

- **HR professionals** and **recruiters** who want to automate and enhance the hiring process
- **Organizations** seeking AI-driven, consistent, and data-backed candidate evaluations
- **Hiring managers** using Airtable as their recruitment database

## What problem is this workflow solving?

Screening candidates manually is time-consuming, inconsistent, and difficult to scale. This workflow solves that by:

- **Automating resume intake and AI evaluation**
- **Matching candidates to job postings dynamically**
- **Generating standardized suitability reports**
- **Notifying HR only when candidates meet the criteria**
- **Storing all applications in a structured Airtable database**

## What this workflow does

This workflow builds an end-to-end AI-powered hiring pipeline using Airtable, OpenAI, and Google Drive. Here's how it works:

1. Accept candidate applications via a public web form, including resume upload (PDF only)
2. Extract text from uploaded resumes for processing
3. Store resumes in Google Drive and generate shareable links
4. Match the application to a job posting stored in Airtable
5. Use AI to evaluate candidates (via OpenAI GPT-4) against job descriptions and requirements
6. Generate suitability results, including:
   - Match percentage
   - Screening status: Suitable, Not Suitable, Under Review
   - Detailed notes
7. Combine AI output and files into one data object
8. Create a new candidate record in Airtable with all application data
9. Automatically notify HR via Gmail if a candidate is marked "Suitable"

## Setup

1. View & copy the Airtable base here: 👉 Candidate Screening – Airtable Base Template
2. Set up a Google Drive folder
3. Connect your OpenAI API key for the AI agent model
4. Connect your Gmail account for email notifications
5. Deploy the public-facing form to start receiving applications
6. Test the workflow using a sample job and resume

## How to customize this workflow to your needs

- **Expand file support:** allow DOC or DOCX uploads by adding format conversion nodes
- **Add multi-recipient email alerts:** extend the Gmail node for multiple HR recipients
- **Handle "Under Review" differently:** add additional logic to notify or flag these candidates
- **Send rejection emails automatically:** extend the IF branch for "Not Suitable" candidates
- **Schedule interviews:** integrate with Google Calendar or Calendly APIs
- **Add Slack notifications:** send alerts to team channels for real-time updates
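Step 7 above (combining the AI output and file links into one object) could be sketched like this. Every field name here is illustrative, not the template's exact Airtable schema:

```javascript
// Hypothetical sketch: merge the AI evaluation and the Drive resume
// link into the single record written to Airtable. The notifyHR flag
// mirrors the "Suitable" branch that triggers the Gmail alert.
function buildCandidateRecord(application, aiResult, resumeUrl) {
  return {
    Name: application.name,
    Email: application.email,
    Position: application.jobTitle,
    "Match Percentage": aiResult.matchPercentage,
    "Screening Status": aiResult.status, // Suitable | Not Suitable | Under Review
    Notes: aiResult.notes,
    "Resume Link": resumeUrl,
    notifyHR: aiResult.status === "Suitable",
  };
}
```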
by Audun
## Who is this for?

- Security professionals
- Developers
- Individuals interested in data breach awareness

## Use Case

- Automated monitoring for new breaches
- Proactive identity protection
- Demonstration of a simple cache mechanism

## What this workflow does

- Checks the Have I Been Pwned API every 15 minutes for the latest breaches.
- Compares new breach data against previously notified breaches.
- Demonstrates a simple cache mechanism to track previously seen breaches.

## How the cache functionality works

- **Read from cache:** retrieves the last known breach from cache.json to avoid redundant alerts for the same breach.
- **Compare against current breach:** the workflow checks whether the latest fetched breach differs from the cached one.
- **Update the cache:** if a new breach is detected, it updates cache.json with the latest breach data.

## Setup instructions

- The endpoint used in this workflow does not require an API key.
- Add your desired alert mechanism in the red box attached to the New breach node.

## How to customize this workflow to your needs

**Modify notification settings:** tailor where alerts are sent (email, Slack, etc.). Add the desired node after the New breach node; this node contains all the data from the breach, so it is easily available.

You can choose from a variety of n8n nodes to send alerts when a new breach is detected. Below are a few common options you might consider adding after the New breach node:

- **Email Node**
  - What it does: sends an email notification to one or more recipients.
  - Use case: great for simple alerts to your inbox or a team distribution list.
  - Customization: you can include breach details in the subject or body of the email, using data from the New breach node.
- **Slack Node**
  - What it does: sends a message to a Slack channel or user.
  - Use case: perfect for real-time alerts to your team in Slack.
  - Customization: you can post breach details directly in a channel or DM. You can also format the message (bold, code blocks, etc.).
- **Microsoft Teams Node**
  - What it does: sends a message to a Teams channel.
  - Use case: for organizations that use Microsoft Teams for communication.
  - Customization: similar to Slack; you can customize the message content and include all relevant breach information.
- **Discord Node**
  - What it does: sends an alert message to a Discord channel.
  - Use case: useful for teams or communities that coordinate via Discord.
  - Customization: add formatted messages with breach details for easy viewing.
- **Telegram Node**
  - What it does: sends messages to a Telegram chat or group.
  - Use case: good for mobile notifications and fast alerts.
  - Customization: you can include breach summaries or detailed information, and even use bots to automate this.
- **Webhook Node (as a sender)**
  - What it does: sends breach data to another service via a webhook.
  - Use case: if you have an external system or app that handles alerts, you can push the data directly to it.
  - Customization: send JSON payloads with detailed breach information to trigger actions in other systems.
- **SMS Nodes (like Twilio)**
  - What it does: sends an SMS notification to one or more phone numbers.
  - Use case: for urgent alerts that need to be seen immediately.
  - Customization: keep messages concise, including key breach details like the time, type of breach, and affected system.

**Adjust check frequency:** change the interval in the Schedule Trigger node (e.g., hourly or daily).
by n8n Team
This workflow automatically sends Zendesk tickets to Pipedrive contact owners and makes them the ticket assignees. The automation is triggered every 5 minutes, with Zendesk checking for and collecting new tickets, which are then individually assigned to a Pipedrive contact owner.

## Prerequisites

- Pipedrive account and Pipedrive credentials
- Zendesk account and Zendesk credentials

Note: the Pipedrive and Zendesk accounts need to be created by the same person / with the same email.

## How it works

1. The Cron node triggers the workflow every 5 minutes.
2. The Zendesk node collects all the tickets received after the last execution timestamp.
3. A Set node passes only the requester's email and ID on to the Merge node.
4. The Merge by key node merges both inputs together: the tickets and their contact emails.
5. The Pipedrive node then searches for the requester.
6. An HTTP Request node gets the owner information for the Pipedrive contact.
7. Set nodes keep only the requester owner's email and the agent's email and ID.
8. The Merge by key node merges the information and adds the contact owner to the ticket data.
9. The Zendesk node changes the assignee to the Pipedrive contact owner, or adds a note if the requester is not found.
10. The Function Item node sets the new last-execution timestamp.
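The incremental-collection pattern above (steps 2 and 10) can be sketched in plain JavaScript. Field names match Zendesk's ticket shape (`created_at`), but the function itself is an illustration, not the workflow's node code:

```javascript
// Sketch: keep only tickets created after the previous run's timestamp,
// and produce the timestamp to store for the next run.
function selectNewTickets(tickets, lastExecution, now = new Date()) {
  const cutoff = new Date(lastExecution);
  const fresh = tickets.filter((t) => new Date(t.created_at) > cutoff);
  return { fresh, nextLastExecution: now.toISOString() };
}
```

Storing the timestamp only after a successful run means a failed execution will simply re-collect the same tickets next time, rather than silently skipping them.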
by Oskar
This workflow uses AI to analyze the content of every new message in Gmail and then assigns specific labels according to the context of the email.

The default configuration of the workflow includes 3 labels:

- "Partnership": email about sponsored content or cooperation,
- "Inquiry": email about products or services,
- "Notification": email that doesn't require a response.

You can add or edit labels and descriptions according to your use case.

🎬 See this workflow in action in my YouTube video about automating Gmail.

## How it works

- The Gmail trigger polls every minute for new messages (you can change the trigger interval according to your needs).
- The email content is then downloaded and forwarded to an AI chain. 💡 The prompt in the AI chain node includes instructions for applying labels according to the email content; change the label names and instructions to fit your use case.
- Next, the workflow retrieves all labels from the Gmail account and compares them with the label names returned from the AI chain.
- Label IDs are aggregated and applied to the processed email messages. ⚠️ Label names in the Gmail account and in the workflow (prompt, JSON schema) must be identical.

## Set up steps

1. Set credentials for Gmail and OpenAI.
2. Add labels to your Gmail account (e.g. "Partnership", "Inquiry" and "Notification").
3. Change the prompt in the AI chain node (update the list of label names and instructions).
4. Change the list of available labels in the JSON schema in the parser node.
5. Optionally: change the polling interval in the Gmail trigger (by default, the interval is 1 minute).

If you like this workflow, please subscribe to my YouTube channel and/or my newsletter.
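The label-matching step above, comparing AI-returned names against the labels fetched from Gmail and collecting their IDs, can be sketched like this (Gmail label objects carry `id` and `name`; exact name matching is why the names must be identical):

```javascript
// Sketch: resolve the AI chain's label names to Gmail label IDs.
// Names that don't exist in the account are silently dropped, which
// is why the workflow warns that names must match exactly.
function resolveLabelIds(aiLabels, gmailLabels) {
  const byName = new Map(gmailLabels.map((l) => [l.name, l.id]));
  return aiLabels
    .filter((name) => byName.has(name))
    .map((name) => byName.get(name));
}
```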
by Mario
## Purpose

This workflow creates a versioned backup of an entire Clockify workspace, split up into monthly reports.

## How it works

- This backup routine runs daily by default.
- The Clockify reports API endpoint is used to get all data from the workspace, based on time entries.
- A report file is retrieved for every month, starting with the current one and going back 3 months in total by default.
- If changes happened to any report during a day, it is updated in GitHub.

## Prerequisites

- Create a private GitHub repository.
- Create credentials for both Clockify and GitHub (make sure to grant permissions for read and write operations).

## Setup

1. Clone the workflow and select the corresponding credentials.
2. Follow the instructions given in the yellow sticky notes.
3. Activate the workflow.
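Generating the month ranges to request from the reports API (current month plus the previous ones) could be sketched as follows. This is an illustration of the date arithmetic, not the workflow's actual node code:

```javascript
// Sketch: build start/end dates for the current month and the months
// before it, as ISO date strings, to request one report per month.
function monthRanges(monthsBack = 3, now = new Date()) {
  const ranges = [];
  for (let i = 0; i < monthsBack; i++) {
    const start = new Date(Date.UTC(now.getUTCFullYear(), now.getUTCMonth() - i, 1));
    // Day 0 of the following month is the last day of this month.
    const end = new Date(Date.UTC(now.getUTCFullYear(), now.getUTCMonth() - i + 1, 0));
    ranges.push({
      start: start.toISOString().slice(0, 10),
      end: end.toISOString().slice(0, 10),
    });
  }
  return ranges;
}
```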
by Mario
## Purpose

This workflow ensures that executions of scheduled workflows do not overlap when they take longer than expected.

## How it works

- This is a separate workflow which monitors the execution of the main workflow.
- It stores a flag in Redis (a key dynamically named after the workflow ID) which indicates whether the main workflow is running or idle.
- It only calls the main workflow if the last execution has finished.

## Setup

1. Update the credentials to suit your Redis instance.
2. Replace the Schedule Trigger of your main workflow with an Execute Workflow Trigger.
3. Copy the workflow ID from the URL.
4. Paste the workflow ID into the Execute Workflow node of this workflow.
5. Configure the Schedule Trigger node.
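The guard logic can be sketched as follows, assuming a generic async Redis client with `get`/`set`; the key name is illustrative, but naming it after the workflow ID (as the workflow does) lets one Redis instance guard several workflows:

```javascript
// Sketch of the overlap guard: skip the run if the flag says "running",
// otherwise set the flag, run the main workflow, and release the flag
// even if the run fails.
async function runIfIdle(redis, workflowId, runMainWorkflow) {
  const key = `workflow-lock:${workflowId}`; // hypothetical key scheme
  const status = await redis.get(key);
  if (status === "running") return false; // previous execution still going
  await redis.set(key, "running");
  try {
    await runMainWorkflow();
  } finally {
    await redis.set(key, "idle"); // release the flag on success or failure
  }
  return true;
}
```

Note this simple get-then-set has a small race window; a production variant might use an atomic `SET key value NX` with an expiry so a crashed run can't hold the lock forever.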
by David Ashby
# 🛠️ E-goi Tool MCP Server

Complete MCP server exposing all E-goi Tool operations to AI agents. Zero configuration needed: all 4 operations are pre-built.

## ⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

## 🔧 How it Works

- **MCP Trigger:** serves as your server endpoint for AI agent requests
- **Tool Nodes:** pre-configured for every E-goi Tool operation
- **AI Expressions:** automatically populate parameters via $fromAI() placeholders
- **Native Integration:** uses the official n8n E-goi Tool node with full error handling

## 📋 Available Operations (4 total)

Every possible E-goi Tool operation is included:

📇 Contact (4 operations)
- Create a member
- Get a member
- Get many members
- Update a member

## 🤖 AI Integration

**Parameter handling:** AI agents automatically provide values for:
- Resource IDs and identifiers
- Search queries and filters
- Content and data payloads
- Configuration options

**Response format:** native E-goi Tool API responses with the full data structure.

**Error handling:** built-in n8n error management and retry logic.

## 💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
- **Claude Desktop:** add the MCP server URL to its configuration
- **Custom AI apps:** use the MCP URL as a tool endpoint
- **Other n8n workflows:** call MCP tools from any workflow
- **API integration:** make direct HTTP calls to MCP endpoints

## ✨ Benefits

- **Complete coverage:** every E-goi Tool operation available
- **Zero setup:** no parameter mapping or configuration needed
- **AI-ready:** built-in $fromAI() expressions for all parameters
- **Production ready:** native n8n error handling and logging
- **Extensible:** easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.