by Avkash Kakdiya
## How it works

This workflow enriches and personalizes your lead profiles by integrating HubSpot contact data, scraping social media information, and using AI to generate tailored outreach emails. It streamlines the process from contact capture to sending a personalized email, all automatically. The system fetches new or updated HubSpot contacts, verifies and enriches their Twitter/LinkedIn data via Phantombuster, merges the profile and engagement insights, and finally generates a customized email ready for outreach.

## Step-by-step

**1. Trigger & Input**
- **HubSpot Contact Webhook:** Fires when a contact is created or updated in HubSpot.
- **Fetch Contact:** Pulls the full contact details (email, name, company, and social profiles).
- **Update Google Sheet:** Logs Twitter/LinkedIn usernames and marks their tracking status.

**2. Validation**
- **Validate Twitter/LinkedIn Exists:** Checks that the contact has a valid social profile before proceeding to scraping.

**3. Social Media Scraping (via Phantombuster)**
- **Launch Profile Scraper & Launch Tweet Scraper:** Trigger Phantombuster agents to fetch profile details and recent tweets.
- **Wait Nodes:** Ensure scraping completes (30-60 seconds).
- **Fetch Profile/Tweet Results:** Retrieves output files from Phantombuster.
- **Extract URL:** Parses the job output to extract the downloadable .json or .csv data file link.

**4. Data Download & Parsing**
- **Download Profile/Tweet Data:** Downloads the scraped JSON files.
- **Parse JSON:** Converts the raw file into structured data for processing.

**5. Data Structuring & Merging**
- **Format Profile Fields:** Maps stats like bio, followers, verified status, likes, etc.
- **Format Tweet Fields:** Captures tweet data and associates it with the lead's email.
- **Merge Data Streams:** Combines the tweet and profile datasets.
- **Combine All Data:** Produces a single, clean object containing all relevant lead details.

**6. AI Email Generation & Delivery**
- **Generate Personalized Email:** Feeds the merged data into OpenAI GPT (via LangChain) to craft a custom HTML email using your brand details.
- **Parse Email Content:** Cleans the AI output into structured subject and body fields.
- **Sends Email:** Automatically delivers the personalized email to the lead via Gmail.

## Benefits

- **Automated Lead Enrichment:** Combines CRM and real-time social media data with zero manual research.
- **Personalized Outreach at Scale:** AI crafts unique, relevant emails for each contact.
- **Improved Engagement Rates:** Targeted messages based on actual social activity and profile details.
- **Seamless Integration:** Works directly with HubSpot, Google Sheets, Gmail, and Phantombuster.
- **Time & Effort Savings:** Replaces hours of manual lookup and email drafting with an end-to-end automated flow.
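The "Combine All Data" step above can be sketched as a small n8n Code-node function. This is a hypothetical illustration: the field names (`bio`, `followers`, `recentTweets`, etc.) are assumptions, not the template's exact schema.

```javascript
// Hypothetical sketch of the "Combine All Data" step: merge contact,
// profile, and tweet data into one clean lead object keyed by email.
// All field names here are assumptions for illustration.
function combineLeadData(contact, profile, tweets) {
  return {
    email: contact.email,
    name: contact.name,
    company: contact.company,
    bio: profile.bio ?? '',
    followers: profile.followers ?? 0,
    verified: Boolean(profile.verified),
    recentTweets: tweets
      .filter((t) => t.email === contact.email) // keep tweets matched to this lead
      .map((t) => ({ text: t.text, likes: t.likes ?? 0 })),
  };
}

const lead = combineLeadData(
  { email: 'jane@acme.com', name: 'Jane', company: 'Acme' },
  { bio: 'CTO at Acme', followers: 1200, verified: true },
  [{ email: 'jane@acme.com', text: 'Shipping v2 today!', likes: 31 }]
);
console.log(lead.recentTweets.length); // 1
```

The single merged object is what makes the downstream AI prompt simple: the email-generation node only has to reference one input item.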
by Pawan
This template sets up a scheduled automation that scrapes the latest news from The Hindu website, uses a Google Gemini AI Agent to filter and analyze the content for relevance to competitive exams like the UPSC Civil Services Examination (CSE) syllabus, and compiles a structured daily digest directly into a Google Sheet. It saves hours of manual reading and note-taking by providing concise summaries, subject categorization, and explicit UPSC importance notes.

## Who's it for

This workflow is essential for:
- **UPSC/CSE Aspirants** who require a curated, focused, and systematic daily current affairs digest.
- **Coaching Institutes** aiming to instantly generate structured, high-quality study material for their students.
- **Educators and Content Creators** focused on Governance, Economy, International Relations, and Science & Technology.

## How it works / What it does

This workflow runs automatically every morning (scheduled for 7 AM by default) to generate a ready-to-study current affairs document.

1. **Scraping:** The Schedule Trigger fires an HTTP Request to fetch the latest news links from The Hindu's front page.
2. **Data Curation:** The HTML and Code in JavaScript nodes work together to extract and pair every article URL with its title.
3. **Content Retrieval:** For each identified link, a second HTTP Request node fetches the entire article body.
4. **AI Analysis and Filtering:** The AI Agent uses a detailed prompt and the Google Gemini Chat Model to perform two critical tasks:
   - **Filter:** It discards all irrelevant articles (e.g., sports results, local crime) and keeps only the 5-6 most important UPSC-relevant pieces (Polity, Economy, IR, etc.).
   - **Analyze:** For the selected articles, it generates a Brief Summary, identifies the Main Subject, and clearly articulates Why it is Important for the UPSC Exam.
5. **Storage:** The AI Agent calls the integrated Google Sheets Tool to automatically append the structured, analyzed data into your designated Google Sheet, creating your daily ready-made notes.
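The "Data Curation" step can be sketched as a Code-node function that pairs URLs with titles and discards non-article links. This is an assumed simplification: the real node's input shape and filtering rules may differ.

```javascript
// Sketch of the curation step: pair each extracted URL with its title and
// keep only plausible article links. Input arrays are assumed to come from
// the preceding HTML node; this is not the template's exact code.
function pairArticles(urls, titles) {
  return urls
    .map((url, i) => ({ url, title: (titles[i] || '').trim() }))
    // keep only real article links that have a usable title
    .filter((a) => a.title.length > 0 && a.url.startsWith('https://www.thehindu.com/'));
}

const articles = pairArticles(
  ['https://www.thehindu.com/news/national/example-article/', 'https://example.com/ad'],
  ['  Cabinet clears new policy ', 'Sponsored']
);
console.log(articles.length); // 1
```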
## Requirements

To deploy this workflow, you need:
- **n8n Account** (Cloud or self-hosted).
- **Google Gemini API Key:** For connecting the Google Gemini Chat Model and powering the AI Agent.
- **Google Sheets Credentials:** For reading/writing the final compiled digest.
- **Target Google Sheet:** A spreadsheet with the following columns: Date, URL, Subject, Brief Summary, and What is Important.

## How to set up

1. **Credentials Setup:** Connect your Google Gemini and Google Sheets accounts via the n8n Credentials Manager.
2. **Google Sheet Linking:** In the **Append row in sheet** and **Append row in sheet in Google Sheets1** nodes, replace the placeholder IDs and GIDs with the actual ID and sheet name of your dedicated UPSC notes spreadsheet.
3. **Scheduling:** Adjust the time in the **Schedule Trigger: Daily at 7 AM** node if you want the daily analysis to run at a different hour.
4. **AI Customization (Optional):** Refine the System Message in the **AI Agent: Filter & Analyze UPSC News** node to focus the analysis on specific exam phases (e.g., Prelims only) or to adjust the priority of subjects.
by Emir Belkahia
This workflow helps Customer Success Managers and other customer success professionals quickly gather intelligence on clients or prospects by analyzing their recent LinkedIn activity via a simple Slack command.

## Who's it for

CSMs, Account Managers, and Sales professionals who need fast, structured insights about a person's LinkedIn presence before a call, meeting, or outreach.

## What it does (and doesn't do)

It DOES:
- Fetch recent LinkedIn posts from any profile
- Analyze posting frequency and cadence patterns
- Identify top themes and focus areas
- Extract recent highlights with context
- Generate a clean HTML report sent via email

It DOESN'T:
- Access private/non-public LinkedIn content
- Provide real-time updates (it's a snapshot)
- Replace in-depth manual research when that is needed

Think of it as your personal LinkedIn research assistant that turns a name into actionable intelligence in under a minute.

## How it works

1. **Slack command:** Type `/check-linkedin [Full Name]` in Slack
2. **Name validation:** AI verifies you provided a full name (not just "John")
3. **Profile discovery:** Finds the correct LinkedIn profile via Apify
4. **Content scraping:** Pulls their recent posts (last 20)
5. **AI analysis:** GPT-4.1 analyzes posting patterns, topics, and highlights
6. **Report generation:** Creates a formatted HTML email report
7. **Email delivery:** Sends the intelligence brief to your inbox

## Set up steps

Setup time: ~15 minutes

1. Create or use your existing Slack app and add a Slash Command (this can be done at https://api.slack.com/apps)
2. Configure the webhook URL in your Slack app
3. Connect credentials: Slack OAuth, Apify API, OpenAI API, Gmail OAuth
4. Update the email recipient in the "Send report via Email" node
5. Test with a known LinkedIn profile

## Requirements

- Slack workspace (with app installation permissions)
- Apify account with credits
- OpenAI API key (GPT-4.1 access)
- Gmail account
- Apify actors: LinkedIn Profile Finder, LinkedIn Post Scraper

## Cost estimation

~$0.05-0.09 per profile check; you could research 11-20 people for $1.
**Cost Disclaimer:** The costs displayed above are indicative only and may vary significantly depending on which Apify actors you select. Some actors incur monthly charges; for example, one of the two actors used in this workflow costs $35/month, so I recommend using it only when there's a clear business need. For cost optimization, consider switching to alternative actors that deliver similar or simpler functionality at a lower cost. If you plan to use this workflow extensively, I strongly suggest performing a budget assessment and evaluating other actor options to maximize cost efficiency.

The workflow uses GPT-4.1-mini for lightweight classification and GPT-4.1 for the heavy analysis to balance quality and cost.

## Known Limitations

- **Common names have limited accuracy:** Very common names (e.g., "John Smith") often fail to identify the correct person. An improved version could accept a company name in the slash command as an additional input to narrow down results and improve first-try matching accuracy.

## Pro tips

- **Check before important meetings:** Run this 15-30 minutes before a call. The email report gives you conversation starters and context about what they care about.
- **Batch your research:** If you have multiple clients or prospects, queue them up. Just remember each lookup costs ~$0.05-0.09.
- **Watch your Apify credits:** The LinkedIn scrapers are the main cost driver. Monitor your Apify usage if you're doing high volume.
- **Don't spam the same profile:** LinkedIn may rate-limit. Space out repeat checks on the same person by at least a few hours.
- **Review the "Quick Scan" section first:** The email report starts with key stats and top focus areas, perfect for a 30-second pre-call prep.
## What to do after the workflow runs

1. **Check your email:** The report arrives in 30-90 seconds
2. **Review the report:** Latest post date, cadence, and top themes
3. **Read the Recent Activity Summary:** A high-level overview of their content
4. **Dive into the Detailed Analysis:** Two main topics with keywords and rationale
5. **Use it strategically:**
   - Reference their recent posts in your outreach
   - Ask about topics they're clearly passionate about
   - Tailor your pitch to their demonstrated interests
   - Avoid generic "saw you on LinkedIn" messages

## Questions or Feedback?

Email: emir.belkahia@gmail.com
LinkedIn: linkedin.com/in/emirbelkahia
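The name-validation step described above (rejecting single names like "John") can be sketched with a simple rule. Note this is an assumed, rule-based simplification for illustration; the template itself uses an AI check.

```javascript
// Minimal sketch of full-name validation for the slash command input.
// The template delegates this to GPT-4.1-mini; this rule-based version
// is an assumption used only to show the idea.
function isFullName(input) {
  const parts = input.trim().split(/\s+/);
  // require at least a first and last name, each with 2+ letters
  return parts.length >= 2 && parts.every((p) => /^[\p{L}'-]{2,}$/u.test(p));
}

console.log(isFullName('John'));       // false: first name only
console.log(isFullName('John Smith')); // true
```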
by TOMOMITSU ASANO
# Intelligent Invoice Processing with AI Classification and XML Export

## Summary

Automated invoice processing pipeline that extracts data from PDF invoices, uses an AI Agent for intelligent expense categorization, generates XML for accounting systems, and routes high-value invoices for approval.

## Detailed Description

A comprehensive accounts payable automation workflow that monitors for new PDF invoices, extracts text content, uses AI to classify expenses and detect anomalies, converts results to XML for accounting system integration, and implements approval workflows for high-value or unusual invoices.

## Key Features

- **PDF Text Extraction:** Extract from File node parses invoice PDFs automatically
- **AI-Powered Classification:** AI Agent categorizes expenses, suggests GL codes, detects anomalies
- **XML Export:** Converts structured data to accounting-compatible XML format
- **Approval Workflow:** Routes invoices over $5,000 or with low confidence for human review
- **Multi-Trigger Support:** Google Drive monitoring or manual webhook upload
- **Comprehensive Logging:** Archives all processed invoices to Google Sheets

## Use Cases

- Accounts payable automation
- Expense report processing
- Vendor invoice management
- Financial document digitization
- Audit trail generation

## Required Credentials

- Google Drive OAuth (for the PDF source folder)
- OpenAI API key
- Slack Bot Token
- Gmail OAuth
- Google Sheets OAuth

Node Count: 24 (19 functional + 5 sticky notes)

## Unique Aspects

- Uses the Extract from File node for PDF text extraction (rarely used)
- Uses the XML node for JSON-to-XML conversion (very rare)
- Uses the AI Agent node for intelligent classification
- Uses the Google Drive Trigger for file monitoring
- Implements an approval workflow with conditional routing
- **Webhook response** mode for API integration

## Workflow Architecture

```
[Google Drive Trigger]   [Manual Webhook]
          |                     |
          +----------+----------+
                     |
                     v
           [Filter PDF Files]
                     |
                     v
         [Download Invoice PDF]
                     |
                     v
          [Extract PDF Text]
                     |
                     v
       [Parse Invoice Data] (Code)
                     |
                     v
      [AI Invoice Classifier] <-- [OpenAI Chat Model]
                     |
                     v
      [Parse AI Classification]
                     |
                     v
           [Convert to XML]
                     |
                     v
         [Format XML Output]
                     |
                     v
         [Needs Approval?] (If)
              /         \
       Yes (>$5000)   No (Auto)
            |             |
    [Email Approval]  [Slack Notify]
            |             |
            +------+------+
                   |
                   v
     [Archive to Google Sheets]
                   |
                   v
        [Respond to Webhook]
```

## Configuration Guide

1. **Google Drive:** Set the folder ID to monitor in the Drive Trigger node
2. **Approval Threshold:** Default $5,000; adjust in the "Needs Approval?" node
3. **Email Recipients:** Configure finance-approvers@example.com
4. **Slack Channel:** Set #finance-notifications for updates
5. **GL Codes:** AI suggests codes; customize in the AI prompt if needed
6. **Google Sheets:** Configure the document for the invoice archive
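The "Needs Approval?" routing can be sketched as follows. The $5,000 threshold comes from the description; the field names (`total`, `confidence`) and the 0.8 confidence cutoff are assumptions for illustration.

```javascript
// Hypothetical sketch of the "Needs Approval?" If-node logic: route to
// human review when the amount exceeds the threshold or the AI's
// classification confidence is low. Field names and the confidence
// cutoff are assumptions, not the template's exact values.
const APPROVAL_THRESHOLD = 5000; // USD, adjustable in the If node
const MIN_CONFIDENCE = 0.8;      // assumed cutoff for "low confidence"

function needsApproval(invoice) {
  return invoice.total > APPROVAL_THRESHOLD || invoice.confidence < MIN_CONFIDENCE;
}

console.log(needsApproval({ total: 7200, confidence: 0.95 })); // true: over threshold
console.log(needsApproval({ total: 1200, confidence: 0.55 })); // true: low confidence
console.log(needsApproval({ total: 1200, confidence: 0.95 })); // false: auto-approve
```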
by Yusuke
## Overview

Discover and analyze the most valuable community-built n8n workflows on GitHub. This automation searches public repositories, analyzes JSON workflows using AI, and saves a ranked report to Google Sheets, including summaries, use cases, difficulty, stars, node count, and repository links.

## How It Works

1. **Search GitHub Code API:** Queries for `extension:json n8n` and splits the results
2. **Fetch & Parse:** Downloads each candidate file's raw JSON and safely parses it
3. **Extract Metadata:** Detects AI-powered flows and collects key node information
4. **AI Analysis:** Evaluates the top N workflows (description, use case, difficulty)
5. **Merge Insights:** Combines the AI analysis with GitHub data
6. **Save to Google Sheets:** Appends or updates rows keyed by workflow name

## Setup Instructions (5-10 min)

1. Open the Config node and set:
   - `search_query`, e.g. `"openai" extension:json n8n`
   - `max_results`: number of results to fetch (1-100)
   - `ai_analysis_top`: number of workflows analyzed with AI
   - `SPREADSHEET_ID`, `SHEET_NAME`: Google Sheets target
2. Add a GitHub PAT via an HTTP Header Credential: `Authorization: Bearer <YOUR_TOKEN>`
3. Connect an OpenAI Credential to the OpenAI Chat Model
4. Connect Google Sheets (OAuth2) to Save to Google Sheets
5. (Optional) Enable the Schedule Trigger to run weekly for automatic updates

> Tip: If you need to show literal brackets, use backticks like `<example>` (no HTML entities needed).

## Use Cases

1. **Trend Tracking for AI Automations**
   - **Goal:** Identify the fastest-growing AI-powered n8n workflows on GitHub.
   - **Output:** A list sorted by stars and AI detection, updated weekly.
2. **Internal Workflow Benchmarking**
   - **Goal:** Compare your organization's workflows against top public examples.
   - **Output:** Difficulty, node count, and AI usage metrics in Google Sheets.
3. **Market Research for Automation Agencies**
   - **Goal:** Discover trending integrations and tool combinations (e.g., OpenAI + Slack).
   - **Output:** Data-driven insights for client projects and content planning.
## Notes & Best Practices

- No hardcoded secrets: use n8n Credentials
- Works with self-hosted or cloud n8n
- Start small (`max_results = 10`) before scaling
- Use the "AI Powered" and "Stars" columns in Sheets to identify top templates
- Uses only Markdown sticky notes, so no HTML formatting is required

## Resources

- **GitHub (template JSON):** github-workflow-finder-ai.json
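The "Fetch & Parse" and "Extract Metadata" steps can be sketched together. The AI-detection heuristic here (looking for LangChain node types) is an assumption about how detection could work, not the template's exact code.

```javascript
// Sketch of safe parsing plus metadata extraction for one candidate file.
// The langchain node-type check is an assumed heuristic for detecting
// AI-powered workflows.
function analyzeWorkflowJson(raw) {
  let wf;
  try {
    wf = JSON.parse(raw);
  } catch {
    return null; // skip files that are not valid JSON
  }
  if (!Array.isArray(wf.nodes)) return null; // skip JSON that is not an n8n workflow
  return {
    name: wf.name ?? '(unnamed)',
    nodeCount: wf.nodes.length,
    aiPowered: wf.nodes.some((n) => String(n.type).includes('langchain')),
  };
}

const meta = analyzeWorkflowJson(JSON.stringify({
  name: 'Demo',
  nodes: [{ type: '@n8n/n8n-nodes-langchain.agent' }, { type: 'n8n-nodes-base.set' }],
}));
console.log(meta); // { name: 'Demo', nodeCount: 2, aiPowered: true }
```

Returning `null` for unparsable or non-workflow files is what keeps a broad GitHub code search from polluting the final sheet.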
by explorium
# Research Agent - Automated Sales Meeting Intelligence

This n8n workflow automatically prepares comprehensive sales research briefs every morning for your upcoming meetings by analyzing both the companies you're meeting with and the individual attendees. The workflow connects to your calendar, identifies external meetings, enriches companies and contacts with deep intelligence from Explorium, and delivers personalized research reports, giving your sales team everything they need for informed, confident conversations.

## Demo

Template Demo

## Credentials Required

To use this workflow, set up the following credentials in your n8n environment:

**Google Calendar (or Outlook)**
- **Type:** OAuth2
- **Used for:** Reading daily meeting schedules and identifying external attendees
- Alternative: Microsoft Outlook Calendar
- Get credentials at the Google Cloud Console

**Explorium API**
- **Type:** Generic Header Auth
- **Header:** Authorization
- **Value:** Bearer YOUR_API_KEY
- **Used for:** Business/prospect matching, firmographic enrichment, professional profiles, LinkedIn posts, website changes, competitive intelligence
- Get your API key at the Explorium Dashboard

**Explorium MCP**
- **Type:** HTTP Header Auth
- **Used for:** Real-time company intelligence and supplemental research for AI agents
- Connect to: https://mcp.explorium.ai/mcp

**Anthropic API**
- **Type:** API Key
- **Used for:** AI-powered company and attendee research analysis
- Get your API key at the Anthropic Console

**Slack (or preferred output)**
- **Type:** OAuth2
- **Used for:** Delivering research briefs
- Alternative options: Google Docs, Email, Microsoft Teams, CRM updates

Go to Settings → Credentials, create these credentials, and assign them in the respective nodes before running the workflow.

## Workflow Overview

### Node 1: Schedule Trigger

Automatically runs the workflow on a recurring schedule.
- **Type:** Schedule Trigger
- **Default:** Every morning before business hours
- **Customizable:** Set to any interval (hourly, daily, weekly) or specific times

Alternative trigger options:
- **Manual Trigger:** On-demand execution
- **Webhook:** Triggered by calendar events or CRM updates

### Node 2: Get many events

Retrieves meetings from your connected calendar.

- **Calendar Source:** Google Calendar (or Outlook)
- **Authentication:** OAuth2
- **Time Range:** Current day + 18 hours (configurable via timeMax)
- **Returns:** All calendar events with attendee information, meeting titles, times, and descriptions

### Node 3: Filter for External Meetings

Identifies meetings with external participants and filters out internal-only meetings.

Filtering logic:
- Extracts attendee email domains
- Excludes your company domain (e.g., 'explorium.ai')
- Excludes calendar system addresses (e.g., 'resource.calendar.google.com')
- Only passes events with at least one external attendee

**Important setup note:** Replace 'explorium.ai' in the code node with your company domain to properly filter internal meetings.

Output:
- Events with external participants only
- `external_attendees`: Array of external contact emails
- `company_domains`: Unique list of external company domains per meeting
- `external_attendee_count`: Number of external participants

## Company Research Pipeline

### Node 4: Loop Over Items

Iterates through each meeting with external attendees for company research.

### Node 5: Extract External Company Domains

Creates a deduplicated list of all external company domains from the current meeting.

### Node 6: Explorium API: Match Business

Matches company domains to Explorium's business entity database.

- **Method:** POST
- **Endpoint:** /v1/businesses/match
- **Authentication:** Header Auth (Bearer token)
- **Returns:** `business_id` (unique Explorium identifier) and `matched_businesses` (array of matches with confidence scores, company name, and basic info)

### Node 7: If

Validates that a business match was found before proceeding to enrichment.
- **Condition:** `business_id` is not empty
- **If True:** Proceed to the parallel enrichment nodes
- **If False:** Skip to the next company in the loop

### Nodes 8-9: Parallel Company Enrichment

**Node 8: Explorium API: Business Enrich**
- **Endpoints:** /v1/businesses/firmographics/enrich, /v1/businesses/technographics/enrich
- **Enrichment Types:** firmographics, technographics
- **Returns:** Company name, description, website, industry, employees, revenue, headquarters location, ticker symbol, LinkedIn profile, logo, full tech stack, nested tech stack by category, BI & analytics tools, sales tools, marketing tools

**Node 9: Explorium API: Fetch Business Events**
- **Endpoint:** /v1/businesses/events/fetch
- **Event Types:** New funding rounds, new investments, mergers & acquisitions, new products, new partnerships
- **Date Range:** September 1, 2025 - November 4, 2025
- **Returns:** Recent business milestones and financial events

### Node 10: Merge

Combines the enrichment responses and events data into a single data object.

### Node 11: Cleans Merge Data Output

Transforms the merged enrichment data into a structured format for AI analysis.

### Node 12: Company Research Agent

AI agent (Claude Sonnet 4) that analyzes company data to generate actionable sales intelligence.

Input: Structured company profile with all enrichment data

Analysis focus:
- Company overview and business context
- Recent website changes and strategic shifts
- Tech stack and product focus areas
- Potential pain points and challenges
- How Explorium's capabilities align with their needs
- Timely conversation starters based on recent activity

Connected to Explorium MCP: can pull additional real-time intelligence if needed to create a more detailed analysis.

### Node 13: Create Company Research Output

Formats the AI analysis into a readable, shareable research brief.

## Attendee Research Pipeline

### Node 14: Create List of All External Attendees

Compiles all unique external attendee emails across all meetings.

### Node 15: Loop Over Items2

Iterates through each external attendee for individual enrichment.
### Node 16: Extract External Company Domains1

Extracts the company domain from each attendee's email.

### Node 17: Explorium API: Match Business1

Matches the attendee's company domain to get a `business_id` for prospect matching.

- **Method:** POST
- **Endpoint:** /v1/businesses/match
- **Purpose:** Link the attendee to their company

### Node 18: Explorium API: Match Prospect

Matches the attendee's email to Explorium's professional profile database.

- **Method:** POST
- **Endpoint:** /v1/prospects/match
- **Authentication:** Header Auth (Bearer token)
- **Returns:** `prospect_id`, the unique professional profile identifier

### Node 19: If1

Validates that a prospect match was found.

- **Condition:** `prospect_id` is not empty
- **If True:** Proceed to prospect enrichment
- **If False:** Skip to the next attendee

### Node 20: Explorium API: Prospect Enrich

Enriches the matched prospect using multiple Explorium endpoints.

- **Enrichment Types:** contacts, profiles, linkedin_posts
- **Endpoints:** /v1/prospects/contacts/enrich, /v1/prospects/profiles/enrich, /v1/prospects/linkedin_posts/enrich
- **Returns:**
  - **Contacts:** Professional email, email status, all emails, mobile phone, all phone numbers
  - **Profiles:** Full professional history, current role, skills, education, company information, experience timeline, job titles and seniority
  - **LinkedIn Posts:** Recent LinkedIn activity, post content, engagement metrics, professional interests and thought leadership

### Node 21: Cleans Enrichment Outputs

Structures the prospect data for AI analysis.

### Node 22: Attendee Research Agent

AI agent (Claude Sonnet 4) that analyzes prospect data to generate personalized conversation intelligence.
Input: Structured professional profile with activity data

Analysis focus:
- Career background and progression
- Current role and responsibilities
- Recent LinkedIn activity themes and interests
- Potential pain points in their role
- Relevant Explorium capabilities for their needs
- Personal connection points (education, interests, previous companies)
- Opening conversation starters

Connected to Explorium MCP: can gather additional company or market context if needed.

### Node 23: Create Attendee Research Output

Formats the attendee analysis into a readable brief with clear sections.

### Node 24: Merge2

Combines the company research output with attendee information for final assembly.

### Node 25: Loop Over Items1

Manages the final loop that combines company and attendee research for output.

### Node 26: Send a message (Slack)

Delivers the combined research briefs to a specified Slack channel or user.

Alternative output options:
- **Google Docs:** Create a formatted document per meeting
- **Email:** Send to the meeting organizer or sales rep
- **Microsoft Teams:** Post to channels or DMs
- **CRM:** Update opportunity/account records with research
- **PDF:** Generate downloadable research reports

## Workflow Flow Summary

1. **Schedule:** The workflow runs automatically every morning
2. **Fetch Calendar:** Pull today's meetings from Google Calendar/Outlook
3. **Filter:** Identify meetings with external attendees only
4. **Extract Companies:** Get unique company domains from external attendees
5. **Extract Attendees:** Compile a list of all external contacts

Company research path:
1. **Match Companies:** Identify businesses in the Explorium database
2. **Enrich (Parallel):** Pull firmographics, website changes, competitive landscape, events, and challenges
3. **Merge & Clean:** Combine and structure company data
4. **AI Analysis:** Generate a company research brief with insights and talking points
5. **Format:** Create readable company research output

Attendee research path:
1. **Match Prospects:** Link attendees to professional profiles
2. **Enrich (Parallel):** Pull profiles, job changes, and LinkedIn activity
3. **Merge & Clean:** Combine and structure prospect data
4. **AI Analysis:** Generate attendee research with background and approach
5. **Format:** Create readable attendee research output

Delivery:
1. **Combine:** Merge company and attendee research for each meeting
2. **Send:** Deliver the complete research briefs to Slack or your preferred platform

This workflow eliminates manual pre-meeting research by automatically preparing comprehensive intelligence on both companies and individuals, giving sales teams the context and confidence they need for every conversation.

## Customization Options

**Calendar Integration.** Works with multiple calendar platforms:
- **Google Calendar:** Full OAuth2 integration
- **Microsoft Outlook:** Calendar API support
- **CalDAV:** Generic calendar protocol support

**Trigger Flexibility.** Adjust when research runs:
- **Morning Routine:** Default daily at 7 AM
- **On-Demand:** Manual trigger for specific meetings
- **Continuous:** Hourly checks for new meetings

**Enrichment Depth.** Add or remove enrichment endpoints:
- **Company:** Technographics, funding history, news mentions, hiring signals
- **Prospects:** Contact information, social profiles, company changes
- **Customizable:** Select only the data you need to optimize speed and costs

**Research Scope.** Configure what gets researched:
- **All External Meetings:** Default behavior
- **Filtered by Keywords:** Only meetings with specific titles
- **By Attendee Count:** Only meetings with X+ external attendees
- **By Calendar:** Specific calendars only

**Output Destinations.** Deliver research to your preferred platform:
- **Messaging:** Slack, Microsoft Teams, Discord
- **Documents:** Google Docs, Notion, Confluence
- **Email:** Gmail, Outlook, custom SMTP
- **CRM:** Salesforce, HubSpot (update account notes)
- **Project Management:** Asana, Monday.com, ClickUp

**AI Model Options.** Swap AI providers based on needs:
- Default: Anthropic Claude (Sonnet 4)
- Alternatives: OpenAI GPT-4, Google Gemini

## Setup Notes

- **Domain Configuration:** Replace 'explorium.ai' in the Filter for External Meetings code node with your company domain
- **Calendar Connection:** Ensure OAuth2 credentials have calendar read permissions
- **Explorium Credentials:** Both the API key and MCP credentials must be configured
- **Output Timing:** The schedule trigger should run with enough lead time before the first meetings
- **Rate Limits:** Adjust loop batch sizes if you hit API rate limits during enrichment
- **Slack Configuration:** Select the destination channel or user for research delivery
- **Data Privacy:** Research is based on publicly available professional information and company data

This workflow acts as your automated sales researcher, preparing detailed intelligence reports every morning so your team walks into every meeting informed, prepared, and ready to have meaningful conversations that drive business forward.
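The Filter for External Meetings code node described above can be sketched as follows. The event shape (attendees with an `email` field) follows Google Calendar's API; the template's exact code may differ.

```javascript
// Sketch of the "Filter for External Meetings" Code node: keep only events
// with at least one attendee outside your domain, and attach the output
// fields the description lists. The event shape is an assumption based on
// the Google Calendar API.
const COMPANY_DOMAIN = 'explorium.ai'; // replace with your own domain

function filterExternalMeetings(events) {
  const internal = [COMPANY_DOMAIN, 'resource.calendar.google.com'];
  return events
    .map((event) => {
      const external = (event.attendees || [])
        .map((a) => a.email)
        .filter((e) => e && !internal.includes(e.split('@')[1]));
      return {
        ...event,
        external_attendees: external,
        company_domains: [...new Set(external.map((e) => e.split('@')[1]))],
        external_attendee_count: external.length,
      };
    })
    .filter((event) => event.external_attendee_count > 0);
}

const result = filterExternalMeetings([
  { summary: 'Internal sync', attendees: [{ email: 'me@explorium.ai' }] },
  { summary: 'Acme demo', attendees: [{ email: 'me@explorium.ai' }, { email: 'cto@acme.com' }] },
]);
console.log(result.length);             // 1
console.log(result[0].company_domains); // ['acme.com']
```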
by Rahul Joshi
## Description

Process new resumes from Google Drive, extract structured candidate data with AI, save it to Google Sheets, and auto-create a ClickUp hiring task. Gain a centralized, searchable candidate database and instant task kickoff with no manual data entry.

## What This Template Does

- Watches a Google Drive folder for new resume PDFs and triggers the workflow.
- Downloads the file and converts the PDF to clean, readable text.
- Analyzes the resume text with an AI Resume Analyzer to extract structured candidate info (name, email, phone, experience, skills, education).
- Cleans and validates the AI JSON output for reliability.
- Appends or updates a candidate row in Google Sheets and creates a ClickUp hiring task.

## Key Benefits

- Save hours with end-to-end, hands-off resume processing.
- Never miss a candidate: every upload triggers automatically.
- Keep a single source of truth in Sheets, always up to date.
- Kickstart hiring instantly with auto-created ClickUp tasks.
- Works with varied resume formats using AI extraction.

## Features

- Google Drive "Watch for New Resumes" trigger (every minute).
- PDF-to-text extraction optimized for text-based PDFs.
- AI-powered resume parsing into standardized JSON fields.
- JSON cleanup and validation for safe storage.
- Google Sheets append-or-update for a central candidate database.
- ClickUp task creation with candidate-specific titles and assignment.

## Requirements

- n8n instance (cloud or self-hosted); n8n version 1.106.3 or higher recommended.
- Google Drive access to a dedicated resumes folder (PDF resumes recommended).
- Google Sheets credential with edit access to the candidate database sheet.
- ClickUp workspace/project access to create hiring tasks.
- AI service credentials for the Resume Analyzer step (add in n8n Credentials).

## Target Audience

- HR and Talent Acquisition teams needing faster screening.
- Recruiters and staffing agencies handling high volumes.
- Startups and ops teams standardizing candidate intake.
- No-code/low-code builders automating hiring workflows.

## Step-by-Step Setup Instructions

1. Connect Google Drive, Google Sheets, ClickUp, and your AI service in n8n Credentials.
2. Set the Google Drive "watched" folder (e.g., Resume_store).
3. Import the workflow, assign credentials to all nodes, and map your Sheets columns.
4. Adjust the ClickUp task details (title pattern, assignee, list).
5. Run once with a sample PDF to test, then enable scheduling (every 1 minute).
6. Optionally rename the email/task nodes for clarity (e.g., "Create Hiring Task in ClickUp").
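The "cleans and validates the AI JSON output" step can be sketched as a Code-node function. This is a hypothetical illustration: the field names and fallbacks are assumptions based on the description, not the template's exact code.

```javascript
// Hypothetical sketch of the JSON cleanup/validation step: strip markdown
// fences the model may wrap around its output, parse it, and normalize
// fields with safe defaults. Field names are assumptions.
function cleanCandidateJson(aiOutput) {
  const stripped = aiOutput.replace(/^```(?:json)?\s*/i, '').replace(/\s*```$/, '');
  let data;
  try {
    data = JSON.parse(stripped);
  } catch {
    return null; // flag for manual review instead of writing bad data
  }
  return {
    name: String(data.name ?? '').trim(),
    email: /\S+@\S+\.\S+/.test(data.email ?? '') ? data.email : '',
    phone: String(data.phone ?? '').trim(),
    skills: Array.isArray(data.skills) ? data.skills : [],
  };
}

const row = cleanCandidateJson('```json\n{"name":"Asha Rao","email":"asha@mail.com","skills":["SQL"]}\n```');
console.log(row.name); // 'Asha Rao'
```

Returning `null` on a parse failure (rather than throwing) lets the workflow route bad extractions to review instead of writing a corrupt row to the sheet.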
by usamaahmed
# HR Resume Screening Workflow: Smart Hiring on Autopilot

## Overview

This workflow builds an AI-powered resume screening system inside n8n. It begins with Gmail and Form triggers that capture incoming resumes, then uploads each file to Google Drive for storage. The resume is downloaded and converted into plain text, where two branches run in parallel: one extracts structured contact details, and the other uses an AI agent to summarize education, job history, and skills while assigning a suitability score. A cleanup step normalizes the data before merging both outputs, and the final candidate record is saved into Google Sheets and Airtable, giving recruiters a centralized dashboard to identify top talent quickly and consistently.

## Prerequisites

To run this workflow successfully, you'll need:
- **Gmail OAuth:** to read incoming resumes.
- **Google Drive OAuth:** to upload and download resume files.
- **Google Sheets OAuth:** to save structured candidate records.
- **Airtable Personal Access Token:** for dashboards and record-keeping.
- **OpenAI / OpenRouter API Key:** to run the AI summarizer and evaluator.

## Setup Instructions

1. **Import the Workflow:** Clone or import the workflow into your n8n instance.
2. **Add Credentials:** Go to n8n → Credentials and connect Gmail, Google Drive, Google Sheets, Airtable, and OpenRouter/OpenAI.
3. **Configure Key Nodes:**
   - **Gmail Trigger:** Update `filters.q` with the job title you are hiring for (e.g., "Senior Software Engineer").
   - **Google Drive Upload:** Set the `folderId` where resumes will be stored.
   - **Google Sheets Node:** Link to your HR spreadsheet (e.g., "Candidates 2025").
   - **Airtable Node:** Select the correct base and table schema for candidate records.
4. **Test the Workflow:** Send a test resume (via email or form) and check Google Sheets and Airtable for structured candidate data.
5. **Go Live:** Enable the workflow. It will now run continuously and process new resumes as they arrive.
## End-to-End Workflow Walkthrough

### Section 1: Entry & Intake

Nodes:
- **Gmail Trigger:** Polls the inbox every minute, captures job application emails, and downloads resume attachments (CV0, CV1, ...).
- **Form Trigger:** Alternate entry for resumes submitted via a careers page or job portal.

Quick understanding: Think of this section as the front desk of recruitment. Resumes arrive by email or online form, and the system immediately grabs them for processing.

### Section 2: File Management

Nodes:
- **Upload File (Google Drive):** Saves the incoming resume into a structured Google Drive folder, naming it after the applicant.
- **Download File (Google Drive):** Retrieves the stored resume file for further processing.
- **Extract from File:** Converts the resume (PDF/DOC) into plain text so the AI and extractors can work with it.

Quick understanding: This is your digital filing room. Every resume is safely stored, then converted into a readable format for the hiring system.

### Section 3: AI Processing (Parallel Analysis)

Nodes:
- **Information Extractor:** Pulls structured contact information (candidate name, email, and phone number) using regex validation and schema rules.
- **AI Agent (LangChain + OpenRouter):** Reads the full CV and outputs:
  - Educational Qualifications
  - Job History
  - Skills Set
  - Candidate Evaluation Score (1-10)
  - Justification for the score

Quick understanding: Imagine having two assistants working in parallel: one quickly extracts basic contact info, while the other deeply reviews the CV and gives an evaluation.

### Section 4: Data Cleanup & Merging

Nodes:
- **Edit Fields:** Standardizes the AI Agent's output into a consistent field (`output`).
- **Code (JS Parsing & Cleanup):** Converts the AI's free-text summary into clean JSON fields (education, jobHistory, skills, score, justification).
- **Merge:** Combines the structured contact info with the AI's evaluation into a single candidate record.
Quick Understanding: this is like the data cleaning and reporting team, making sure all details are neat, structured, and merged into one complete candidate profile.

Section 5: Persistence & Dashboards
Nodes:
- Google Sheets (Append Row): saves candidate details into a Google Sheet for quick team access.
- Airtable (Create Record): stores the same structured data in Airtable, enabling dashboards, analytics, and ATS-like tracking.
Quick Understanding: think of this as your HR dashboard and database. Every candidate record is logged in both Google Sheets and Airtable, ready for filtering, reporting, or further action.

Workflow Overview Table:

| Section | Key Roles / Nodes | Model / Service | Purpose | Benefit |
| --- | --- | --- | --- | --- |
| Entry & Intake | Gmail Trigger, Form Trigger | Gmail API / Webhook | Capture resumes from email or forms | Resumes collected instantly from multiple sources |
| File Management | Google Drive Upload, Google Drive Download, Extract from File | Google Drive + n8n Extract | Store resumes & convert to plain text | Centralized storage + text extraction for processing |
| AI Processing | Information Extractor, AI Agent (LangChain + OpenRouter) | Regex + OpenRouter AI (gpt-oss-20b, free) | Extract contact info + AI CV analysis | Candidate details + score + justification generated automatically |
| Data Cleanup & Merge | Edit Fields, Code (JS Parsing & Cleanup), Merge | n8n native + Regex Parsing | Standardize and merge outputs | Clean, structured JSON record with all candidate info |
| Persistence Layer | Google Sheets Append Row, Airtable Create Record | Google Sheets + Airtable APIs | Store structured candidate data | HR dashboards & ATS-ready records for easy review and analytics |
| Execution Flow | All connected | Gmail + Drive + Sheets + Airtable + AI | End-to-end automation | Automated resume → structured record → recruiter dashboards |

Workflow Output Overview: Each candidate's data is
standardized into the following fields:
- Candidate Name
- Candidate Email
- Contact Number
- Educational Qualifications
- Job History
- Skills Set
- AI Score (1–10)
- Justification

Example (Google Sheet row):

Benefits of This Workflow at a Glance:
- **Lightning-Fast Screening**: processes hundreds of resumes in minutes instead of hours.
- **AI-Powered Evaluation**: automatically summarizes candidate education, work history, and skills, and gives a suitability score (1–10) with justification.
- **Centralized Storage**: every resume is securely saved in Google Drive for easy access and record-keeping.
- **Data-Ready Outputs**: structured candidate profiles go straight into Google Sheets and Airtable, ready for dashboards and analytics.
- **Consistency & Fairness**: standardized AI scoring ensures every candidate is evaluated on the same criteria, reducing human bias.
- **Flexible Intake**: works with both Gmail (email applications) and Form submissions (job portals or career pages).
- **Recruiter Productivity Boost**: frees HR teams from manual extraction and data entry, letting them focus on interviewing and hiring the best talent.

Practical HR Use Case: "Screen resumes for a Senior Software Engineer role and shortlist top candidates."
1. Gmail Trigger: captures incoming job applications with CVs attached.
2. Google Drive: stores resumes for record-keeping.
3. Extract from File: converts CVs into plain text.
4. Information Extractor: pulls candidate name, email, and phone number.
5. AI Agent: summarizes education, job history, and skills, and assigns a suitability score (1–10).
6. Code & Merge: cleans and combines outputs into a structured candidate profile.
7. Google Sheets: logs candidate data for quick HR review.
8. Airtable: builds dashboards to filter and identify top-scoring candidates.

Result: HR instantly sees structured candidate records, filters by score, and focuses interviews on the best talent.
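The "Code (JS Parsing & Cleanup)" step described above can be sketched as follows. This is a minimal illustration that assumes the AI Agent returns labeled plain-text sections; the labels and sample text are assumptions for demonstration, not copied from the actual workflow:

```javascript
// Sketch of the JS parsing/cleanup step. Assumes the AI Agent emits
// labeled sections in free text; the exact labels are illustrative.
function parseEvaluation(text) {
  // Capture everything after "Label:" up to the next "Label:" line or the end.
  const grab = (label) => {
    const re = new RegExp(label + ':\\s*([\\s\\S]*?)(?=\\n[A-Z][\\w ]*:|$)');
    const match = text.match(re);
    return match ? match[1].trim() : '';
  };
  return {
    education: grab('Education'),
    jobHistory: grab('Job History'),
    skills: grab('Skills'),
    score: Number(grab('Score')) || null,
    justification: grab('Justification'),
  };
}

// Example AI output in the assumed label format:
const sample = [
  'Education: BSc Computer Science, 2019',
  'Job History: 4 years backend development at Acme Corp',
  'Skills: Node.js, SQL, Docker',
  'Score: 8',
  'Justification: Strong backend experience matching the role',
].join('\n');

const record = parseEvaluation(sample);
```

In a real n8n Code node, this logic would read the incoming item (e.g. `$input.first().json.output`) and end with `return [{ json: record }]` so the Merge node and the Google Sheets/Airtable steps receive structured fields.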
by Yaron Been
This workflow provides automated access to the Settyan Flash V2.0.0 Beta.0 AI model through the Replicate API. It saves you time by eliminating the need to interact with AI models manually, and it integrates generation tasks seamlessly into your n8n automation workflows.

Overview:
This workflow automatically handles the complete generation process using the Settyan Flash V2.0.0 Beta.0 model. It manages API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation.

Model Description: Advanced AI model for automated processing and generation tasks.

Key Capabilities:
- Specialized AI model with unique capabilities
- Advanced processing and generation features
- Custom AI-powered automation tools

Tools Used:
- **n8n**: the automation platform that orchestrates the workflow
- **Replicate API**: access to the Settyan/flash-v2.0.0-beta.0 AI model
- **Settyan Flash V2.0.0 Beta.0**: the core AI model for generation tasks
- **Built-in Error Handling**: automatic retry logic and comprehensive error management

How to Install:
1. Import the Workflow: download the .json file and import it into your n8n instance.
2. Configure Replicate API: add your Replicate API token to the 'Set API Token' node.
3. Customize Parameters: adjust the model parameters in the 'Set Other Parameters' node.
4. Test the Workflow: run the workflow with your desired inputs.
5. Integrate: connect this workflow to your existing automation pipelines.

Use Cases:
- **Specialized Processing**: handle specific AI tasks and workflows
- **Custom Automation**: implement unique business logic and processing
- **Data Processing**: transform and analyze various types of data
- **AI Integration**: add AI capabilities to existing systems and workflows

Connect with Me:
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Replicate API: https://replicate.com (sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #aiprocessing #dataprocessing #machinelearning #artificialintelligence #aitools #digitalart #contentcreation #productivity #innovation
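Under the hood, workflows like this create a prediction against the public Replicate REST API and poll it until it settles. A minimal JavaScript sketch is shown below; the model version hash and input shape are placeholders, and the workflow's actual HTTP nodes may be configured differently:

```javascript
// Hedged sketch of the Replicate call such a workflow wraps: building the
// request for POST /v1/predictions. The version hash and input fields are
// placeholders; real values come from the model's page on replicate.com.
function buildPredictionRequest(token, version, input) {
  return {
    url: 'https://api.replicate.com/v1/predictions',
    options: {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${token}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({ version, input }),
    },
  };
}

// Usage (performs a real network call, so it is shown commented out):
// const { url, options } = buildPredictionRequest(
//   process.env.REPLICATE_API_TOKEN,
//   '<model-version-hash>',
//   { prompt: 'example input' }
// );
// const prediction = await fetch(url, options).then((r) => r.json());
// // Then poll prediction.urls.get with the same Authorization header until
// // prediction.status becomes 'succeeded' or 'failed'; this polling loop is
// // what the workflow's built-in wait/retry logic automates for you.
```

Separating request construction from the network call keeps the testable part pure, which is also roughly how the workflow splits its 'Set API Token' / 'Set Other Parameters' nodes from the HTTP request node.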
by Yaron Been
This workflow provides automated access to the 0Xdino Cyberrealistic Pony V125 AI model through the Replicate API. It saves you time by eliminating the need to interact with AI models manually, and it integrates generation tasks seamlessly into your n8n automation workflows.

Overview:
This workflow automatically handles the complete generation process using the 0Xdino Cyberrealistic Pony V125 model. It manages API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation.

Model Description: Advanced AI model for automated processing and generation tasks.

Key Capabilities:
- Specialized AI model with unique capabilities
- Advanced processing and generation features
- Custom AI-powered automation tools

Tools Used:
- **n8n**: the automation platform that orchestrates the workflow
- **Replicate API**: access to the 0Xdino/cyberrealistic-pony-v125 AI model
- **0Xdino Cyberrealistic Pony V125**: the core AI model for generation tasks
- **Built-in Error Handling**: automatic retry logic and comprehensive error management

How to Install:
1. Import the Workflow: download the .json file and import it into your n8n instance.
2. Configure Replicate API: add your Replicate API token to the 'Set API Token' node.
3. Customize Parameters: adjust the model parameters in the 'Set Other Parameters' node.
4. Test the Workflow: run the workflow with your desired inputs.
5. Integrate: connect this workflow to your existing automation pipelines.

Use Cases:
- **Specialized Processing**: handle specific AI tasks and workflows
- **Custom Automation**: implement unique business logic and processing
- **Data Processing**: transform and analyze various types of data
- **AI Integration**: add AI capabilities to existing systems and workflows

Connect with Me:
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Replicate API: https://replicate.com (sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #aiprocessing #dataprocessing #machinelearning #artificialintelligence #aitools #digitalart #contentcreation #productivity #innovation
by Yaron Been
This workflow provides automated access to the Settyan Flash V2.0.0 Beta.10 AI model through the Replicate API. It saves you time by eliminating the need to interact with AI models manually, and it integrates generation tasks seamlessly into your n8n automation workflows.

Overview:
This workflow automatically handles the complete generation process using the Settyan Flash V2.0.0 Beta.10 model. It manages API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation.

Model Description: Advanced AI model for automated processing and generation tasks.

Key Capabilities:
- Specialized AI model with unique capabilities
- Advanced processing and generation features
- Custom AI-powered automation tools

Tools Used:
- **n8n**: the automation platform that orchestrates the workflow
- **Replicate API**: access to the Settyan/flash-v2.0.0-beta.10 AI model
- **Settyan Flash V2.0.0 Beta.10**: the core AI model for generation tasks
- **Built-in Error Handling**: automatic retry logic and comprehensive error management

How to Install:
1. Import the Workflow: download the .json file and import it into your n8n instance.
2. Configure Replicate API: add your Replicate API token to the 'Set API Token' node.
3. Customize Parameters: adjust the model parameters in the 'Set Other Parameters' node.
4. Test the Workflow: run the workflow with your desired inputs.
5. Integrate: connect this workflow to your existing automation pipelines.

Use Cases:
- **Specialized Processing**: handle specific AI tasks and workflows
- **Custom Automation**: implement unique business logic and processing
- **Data Processing**: transform and analyze various types of data
- **AI Integration**: add AI capabilities to existing systems and workflows

Connect with Me:
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Replicate API: https://replicate.com (sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #aiprocessing #dataprocessing #machinelearning #artificialintelligence #aitools #digitalart #contentcreation #productivity #innovation
by Yaron Been
This workflow provides automated access to the Settyan Flash V2.0.2 Beta.10 AI model through the Replicate API. It saves you time by eliminating the need to interact with AI models manually, and it integrates generation tasks seamlessly into your n8n automation workflows.

Overview:
This workflow automatically handles the complete generation process using the Settyan Flash V2.0.2 Beta.10 model. It manages API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation.

Model Description: Advanced AI model for automated processing and generation tasks.

Key Capabilities:
- Specialized AI model with unique capabilities
- Advanced processing and generation features
- Custom AI-powered automation tools

Tools Used:
- **n8n**: the automation platform that orchestrates the workflow
- **Replicate API**: access to the Settyan/flash-v2.0.2-beta.10 AI model
- **Settyan Flash V2.0.2 Beta.10**: the core AI model for generation tasks
- **Built-in Error Handling**: automatic retry logic and comprehensive error management

How to Install:
1. Import the Workflow: download the .json file and import it into your n8n instance.
2. Configure Replicate API: add your Replicate API token to the 'Set API Token' node.
3. Customize Parameters: adjust the model parameters in the 'Set Other Parameters' node.
4. Test the Workflow: run the workflow with your desired inputs.
5. Integrate: connect this workflow to your existing automation pipelines.

Use Cases:
- **Specialized Processing**: handle specific AI tasks and workflows
- **Custom Automation**: implement unique business logic and processing
- **Data Processing**: transform and analyze various types of data
- **AI Integration**: add AI capabilities to existing systems and workflows

Connect with Me:
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Replicate API: https://replicate.com (sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #aiprocessing #dataprocessing #machinelearning #artificialintelligence #aitools #digitalart #contentcreation #productivity #innovation