by Kevin Meneses
## What this workflow does

This workflow automatically generates a daily stock market email digest, combining price movements and recent financial news into a clean, actionable report. Instead of manually checking charts and news every morning, this workflow does it for you.

It combines:
- Market data from EODHD APIs
- Financial news with sentiment analysis
- Smart processing using JavaScript (no raw data overload)
- ✉️ Automated email delivery via Gmail

Only relevant insights reach your inbox.

## How it works (overview)

1. The workflow runs on a schedule (daily at 7 AM) or manually
2. It reads your stock watchlist from Google Sheets
3. It fetches:
   - Historical price data (last 14 days)
   - Latest news (last 7 days)
4. JavaScript processes the data:
   - Calculates daily % change
   - Filters recent news
   - Assigns sentiment (positive / neutral / negative)
5. A clean HTML email is generated
6. The digest is sent automatically via Gmail

👉 The result: a complete market snapshot in seconds

## How to configure it

**EODHD APIs (Market Data)**
- Add your API key in the ⚙️ Config node
- Used for: price data, financial news, sentiment signals

**Google Sheets**
- Add your credentials
- Create a column named `ticker`
- Add one stock per row (e.g. AAPL, TSLA, MSFT)

**Gmail**
- Add your Gmail credentials
- Set your recipient email

**Schedule**
- Default: runs daily at 7 AM
- You can adjust it easily in the Schedule node

## Why this workflow is powerful

Most people check multiple sites every morning, waste time jumping between charts and news, and miss important market moves. This workflow:
✔ Automates everything
✔ Centralizes data
✔ Adds context with sentiment analysis
✔ Saves hours every week

## 📊 Data powered by EODHD APIs

This workflow uses EODHD APIs to retrieve historical stock prices, financial news, and sentiment indicators.

👉 If you want to build your own financial automations, dashboards, or trading workflows, get started here (10% discount): https://eodhd.com/pricing-special-10?via=kmg&ref1=Meneses

## Final result

You get a daily email like this:
- Top movers (price changes)
- Curated news per stock
- Sentiment insights
- Clean, readable format

All generated automatically.
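The daily %-change step in the Code node can be sketched as follows. The data shape (`{ date, close }` objects, sorted oldest-first) mirrors a typical end-of-day price response, but the exact field names from EODHD are an assumption here, not taken from the workflow itself:

```javascript
// Sketch of the Code node's daily %-change calculation.
// Assumes bars is an array of { date, close } sorted oldest-first
// (illustrative field names, not the workflow's actual schema).
function dailyChange(bars) {
  if (bars.length < 2) return null;
  const prev = bars[bars.length - 2].close;
  const last = bars[bars.length - 1].close;
  return ((last - prev) / prev) * 100; // percent change vs. prior close
}

const bars = [
  { date: '2024-05-01', close: 170.0 },
  { date: '2024-05-02', close: 173.4 },
];
console.log(dailyChange(bars).toFixed(2) + '%'); // 2.00%
```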
by Rully Saputra
Sign up for Decodo — get better pricing here

## Overview

This workflow automatically collects the latest AI research papers using Decodo, extracts and summarizes PDFs with AI, stores insights in Google Sheets, and notifies users via Telegram. It turns complex academic research into structured, readable knowledge with zero manual effort.

## Who's this for

This template is ideal for:
- AI researchers and ML engineers tracking new papers
- Founders and product teams monitoring AI trends
- Content writers and analysts creating research-based content
- Educators, students, and newsletter creators
- Anyone who wants automated, summarized research without reading full papers

## How it works / What it does

1. A schedule trigger starts the workflow automatically
2. Decodo fetches the latest AI research listings from arXiv reliably and at scale
3. Article titles and PDF links are extracted and structured
4. Each paper PDF is downloaded and converted to text
5. An AI summarization chain generates concise, human-readable summaries
6. Results are saved to Google Sheets as a research database
7. A Telegram message notifies users when new summaries are available

## How to set up

1. Add your Decodo API credentials (required)
2. Connect your OpenAI / ChatGPT-compatible model for summarization
3. Connect Google Sheets and choose your target spreadsheet
4. Add your Telegram bot credentials and chat ID
5. Adjust the schedule trigger if needed, then activate the workflow

## Requirements

- n8n (self-hosted required due to community node usage)
- Decodo community node (web extraction)
- OpenAI or compatible AI model credentials
- Google Sheets account
- Telegram bot access

⚠️ Disclaimer: This workflow uses a community node and is supported on self-hosted n8n only.

## How to customize the workflow

- Change the arXiv category to track different research domains
- Modify the AI prompt to adjust summary length or tone
- Replace Google Sheets with another database or knowledge base
- Disable Telegram notifications if not needed
- Extend the workflow for SEO blogs, newsletters, or RAG pipelines
by Yaron Been
## Overview

Watch target companies for C-level and VP hiring signals, then send AI-personalized outreach emails when leadership roles are posted. This workflow reads a list of target company domains from Google Sheets, checks each one for leadership-level job openings via the PredictLeads Job Openings API, enriches matching companies with additional company data, and uses OpenAI to generate a personalized outreach email referencing the specific leadership hire. The email is sent automatically through Gmail.

## How it works

1. A schedule trigger runs the workflow daily at 8:00 AM.
2. The workflow reads target account domains from Google Sheets.
3. It loops through each company and fetches job openings from PredictLeads.
4. It filters for leadership roles such as CRO, CMO, CTO, VP, Head of, Chief, and Director.
5. If leadership roles are found, it enriches the company with PredictLeads company data such as industry, size, and location.
6. It builds a structured prompt combining company context and the detected leadership roles.
7. It sends the prompt to OpenAI to generate a personalized outreach email.
8. It sends the AI-generated email through Gmail with a tailored subject line.
9. It loops back to process the next company.

## Setup

1. Create a Google Sheet with these columns: `domain`, `company_name`
2. Connect your Gmail account using OAuth2 for sending outreach emails.
3. Add your OpenAI API key in the Generate Outreach Email HTTP Request node.
4. Add your PredictLeads API credentials using the X-Api-Key and X-Api-Token headers.

## Requirements

- Google Sheets OAuth2 credentials
- Gmail OAuth2 credentials
- OpenAI API account using gpt-4o-mini
- PredictLeads API account: https://docs.predictleads.com

## Notes

- The leadership role filter uses regex matching for roles such as CRO, CMO, CTO, VP, Vice President, Head of, and Chief. You can customize this as needed.
- The AI prompt instructs OpenAI to write concise emails with a maximum of 150 words, referencing the specific leadership hire.
- PredictLeads Job Openings and Company API docs: https://docs.predictleads.com
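The leadership filter described in the notes can be sketched like this; the pattern below mirrors the listed titles, but the workflow's actual regex may differ:

```javascript
// Illustrative leadership-role filter (the workflow's exact regex may differ).
const LEADERSHIP = /\b(CRO|CMO|CTO|VP|Vice President|Head of|Chief|Director)\b/i;

const openings = [
  { title: 'VP of Marketing' },
  { title: 'Software Engineer II' },
  { title: 'Chief Revenue Officer' },
];

// Keep only openings whose title matches a leadership pattern.
const leadershipRoles = openings.filter(o => LEADERSHIP.test(o.title));
console.log(leadershipRoles.map(o => o.title).join(', ')); // VP of Marketing, Chief Revenue Officer
```

Extending the filter is just a matter of adding alternatives to the regex (e.g. `|COO|CFO`).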
by oka hironobu
## Who is this for

Template creators who want to validate their n8n workflows against the official Creator Hub approval criteria before submitting. Useful for both new and verified creators looking to reduce rejection rates.

## What this workflow does

This workflow scrapes the latest approval guidelines from four n8n Creator Hub Notion pages, generates a structured pass/fail checklist using Gemini AI, then reviews your uploaded workflow JSON against every criterion. The results are delivered as a formatted HTML email report with a score and specific fixes.

## Setup

1. Add a Google Gemini (PaLM) API credential for criteria generation, file upload, and review.
2. Add a Gmail OAuth2 credential for sending the results email.
3. Activate the workflow and open the generated form URL.

## Requirements

- Google Gemini API key (used for three separate calls: criteria generation, file uploads, and final review)
- Gmail account with OAuth2 access enabled

## How to customize

- Change the Gemini model in the "Google Gemini for criteria generation" and "Review workflow against criteria" nodes.
- Edit the review prompt in the "Review workflow against criteria" node to adjust scoring weight or add custom checks.
- Replace the Gmail node with another email service or a Discord/Slack webhook for different delivery methods.

## Important disclaimer

This workflow provides AI-generated feedback based on n8n Creator Hub guidelines available as of February 2026. The review results are not a guarantee of approval or rejection — actual decisions are made by the n8n review team. Guidelines and criteria may change over time. Always check the latest official information on the n8n Creator Hub before submitting your template.
by yuta tokumitsu
AI-Powered Japanese Social Media Content Generator with Quality Control

## 🎯 Who's it for

Marketing teams and social media managers in Japan who want to automate content creation while maintaining high quality standards and cultural appropriateness. Perfect for businesses that need consistent Japanese-language social media presence with built-in compliance checks.

## 📝 What it does

This workflow creates an intelligent content generation system that:
- Generates culturally-aware Japanese Twitter posts using GPT-4
- Automatically scores content quality across 5 dimensions (engagement, SEO, brand voice, readability, CTA)
- Performs sentiment analysis and risk detection for controversial topics
- Routes content intelligently: auto-posts high-quality/low-risk content, flags medium-risk content for approval, and rejects high-risk content
- Includes an auto-improvement loop that refines content up to 3 times if quality scores are below 70
- Provides weekly performance analytics and recommendations

## 🔧 How it works

**Daily Content Generation Flow:**
1. Schedule trigger runs weekday mornings at 9 AM
2. Fetches Japanese cultural context (seasons, holidays, business events)
3. Analyzes brand voice from the past 30 days of posts
4. Generates 3 Twitter post variations with GPT-4
5. Each post is scored on quality metrics (100-point scale)
6. Low-scoring content enters the auto-improvement loop
7. Risk analysis checks for controversy, cultural sensitivity, and sentiment
8. Decision routing: auto-approve and post, send for manual approval, or reject

**Approval Workflow:**
- Pending posts trigger approval emails
- A webhook receives approval/rejection/edit actions
- Approved posts are published to Twitter and archived in Notion

**Weekly Analytics:**
- A Monday morning trigger analyzes the past week's performance
- GPT-4 generates an insights report with best practices
- An email is sent to the team with recommendations

## ⚙️ Requirements

APIs & Credentials:
- OpenAI API (GPT-4 access)
- Twitter API v2 with OAuth 2.0
- Notion API (database for content storage)
- Email sending service (SMTP or SendGrid)

Setup:
1. Create a Notion database with columns: Content, Hashtags, Quality Score, Risk Level, Status, Engagement
2. Configure OpenAI API credentials with HTTP Header Auth
3. Set up Twitter OAuth 2.0 credentials
4. Configure the email service for approval notifications

## 🎨 How to customize

**Adjust Quality Thresholds:**
- Modify the quality scoring criteria in the "AI Quality Scoring" node
- Change the auto-approval threshold (currently 70+ points)

**Content Generation:**
- Edit GPT-4 prompts in the "Generate Content with GPT-4" node to match your brand tone
- Adjust temperature settings for more/less creative content
- Modify the number of posts generated per run

**Risk Detection:**
- Customize risk factors in the "Sentiment & Risk Analysis" node
- Add industry-specific compliance checks

**Brand Voice Learning:**
- Adjust the lookback period in "Get Past 30 Days Posts" (currently 30 days)
- Modify the brand voice analysis logic in the "Analyze Brand Voice" node

**Scheduling:**
- Change cron expressions for daily content generation and weekly reports
- Add additional triggers for special campaigns

## ⚠️ Important Notes

- This workflow uses Japanese-language prompts; modify the system prompts if using it for other languages
- Ensure compliance with Twitter's API rate limits and automation policies
- Review auto-posted content regularly to validate AI quality assessments
- The workflow stores all generated content in Notion for audit trails
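The 100-point quality gate can be sketched as follows. Equal weighting of the five dimensions is an assumption for illustration; the workflow's actual scoring logic lives in the "AI Quality Scoring" node:

```javascript
// Sketch of the quality gate: average five 0-100 dimension scores and
// compare against the auto-approval threshold. Equal weights are an
// assumption; the real node may weight dimensions differently.
function qualityScore(scores) {
  const dims = ['engagement', 'seo', 'brandVoice', 'readability', 'cta'];
  const total = dims.reduce((sum, d) => sum + scores[d], 0);
  return total / dims.length;
}

const post = { engagement: 80, seo: 65, brandVoice: 75, readability: 90, cta: 70 };
const score = qualityScore(post);
console.log(score, score >= 70 ? 'auto-approve path' : 'improvement loop'); // 76 auto-approve path
```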
by Oneclick AI Squad
This automated n8n workflow processes student applications on a scheduled basis, validates data, updates databases, and sends welcome communications to students and guardians.

## Main Components

- **Trigger at Every Day 7 am** - Scheduled trigger that runs the workflow daily
- **Read Student Data** - Reads pending applications from Excel/database
- **Validate Application Data** - Checks data completeness and format
- **Process Application Data** - Processes validated applications
- **Update Student Database** - Updates records in the student database
- **Prepare Welcome Email** - Creates personalized welcome messages
- **Send Email** - Sends welcome emails to students/guardians
- **Success Response** - Confirms successful processing
- **Error Response** - Handles any processing errors

## Essential Prerequisites

- Excel file with student applications (student_applications.xlsx)
- Database access for student records
- SMTP server credentials for sending emails
- File storage access for reading application data

## Required Excel File Structure (student_applications.xlsx)

Application ID | First Name | Last Name | Email | Phone | Program Interest | Grade Level | School | Guardian Name | Guardian Phone | Application Date | Status | Notes

## Expected Input Data Format

```json
{
  "firstName": "John",
  "lastName": "Doe",
  "email": "john.doe@example.com",
  "phone": "+1234567890",
  "program": "Computer Science",
  "gradeLevel": "10th Grade",
  "school": "City High School",
  "guardianName": "Jane Doe",
  "guardianPhone": "+1234567891"
}
```

## Key Features

- ⏰ **Scheduled Processing:** Runs daily at 7 AM automatically
- 📊 **Data Validation:** Ensures application completeness
- 💾 **Database Updates:** Maintains student records
- 📧 **Auto Emails:** Sends welcome messages
- ❌ **Error Handling:** Manages processing failures

## Quick Setup

1. Import the workflow JSON into n8n
2. Configure the schedule trigger (default: 7 AM daily)
3. Set the Excel file path in the "Read Student Data" node
4. Configure the database connection in the "Update Student Database" node
5. Add SMTP settings in the "Send Email" node
6. Test with sample data
7. Activate the workflow

## Parameters to Configure

- `excel_file_path`: Path to the student applications file
- `database_connection`: Student database credentials
- `smtp_host`: Email server address
- `smtp_user`: Email username
- `smtp_password`: Email password
- `admin_email`: Administrator notification email
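The "Validate Application Data" step can be sketched like this, using the field names from the expected input format above; the exact required fields and checks in the workflow may differ:

```javascript
// Minimal sketch of application validation: required-field presence plus
// a simple email-shape check. Which fields are mandatory is an assumption.
const REQUIRED = ['firstName', 'lastName', 'email', 'phone',
                  'program', 'gradeLevel', 'guardianName'];

function validateApplication(app) {
  const missing = REQUIRED.filter(f => !app[f] || String(app[f]).trim() === '');
  const emailOk = /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(app.email || '');
  return { valid: missing.length === 0 && emailOk, missing };
}

const result = validateApplication({
  firstName: 'John', lastName: 'Doe', email: 'john.doe@example.com',
  phone: '+1234567890', program: 'Computer Science',
  gradeLevel: '10th Grade', guardianName: 'Jane Doe',
});
console.log(result); // { valid: true, missing: [] }
```

Invalid rows would then be routed to the Error Response branch rather than the database update.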
by Kumar Shivam
The AI-Powered Shopify SEO Content Automation is an enterprise-grade workflow that transforms product content creation for e-commerce stores. This sophisticated multi-agent system integrates GPT-4o, Claude Sonnet 4, Claude 3.5, Perplexity AI, and Haloscan keyword research to generate SEO-optimized product descriptions, metafields, and meta descriptions with zero manual intervention and built-in cannibalization prevention. To see the demo, connect via my profile.

## 💡 Key Advantages

- 🎯 **Multi-Agent AI Orchestration** - A central Orchestrator manages complex workflows with specialized agents for descriptions, metafields, and SEO, each optimized for specific content types.
- 🔍 **Advanced Keyword Research & Cannibalization Prevention** - Integrates the Haloscan API for premium keyword discovery and SERP overlap analysis to prevent keyword cannibalization across your product catalog.
- 📊 **Enterprise SEO Optimization** - Specialized for e-commerce with semantic alignment, TF-IDF optimization, and compliance with industry regulations and best practices.
- 🧠 **Intelligent Content Strategy** - Perplexity AI provides market intelligence, search intent analysis, and trending keyword discovery for data-driven content decisions.
- 🏗️ **Comprehensive Content Generation** - Creates product descriptions, 6 specialized metafields, SEO meta descriptions, and rich text formatting for complete Shopify integration.
- 📋 **Automated Workflow Management** - Airtable integration tracks content creation status, manages keyword databases, and provides centralized workflow control.

## ⚙️ How It Works

1. **Content Type Selection** - A form-based trigger allows selection of content types: create_product_description, create_product_meta, or create_product_seo.
2. **Product Data Collection** - Retrieves comprehensive product information from Shopify and Airtable, including titles, descriptions, handles, and vendor details.
3. **Premium Keyword Discovery**
   - The Haloscan API analyzes product titles for keyword opportunities
   - Extracts search metrics, competitor keywords, and SERP data
   - Perplexity provides market intelligence and search intent analysis
4. **SEO Compliance Checking**
   - Performs SERP overlap analysis to identify existing rankings
   - Filters keywords to prevent cannibalization
   - Updates Airtable with curated keyword lists
   - Generates actionable SEO content strategies
5. **Multi-Agent Content Generation**
   - Product Description Agent (Claude Sonnet 4): generates SEO-optimized product descriptions with verified facts, implements a strict HTML structure with proper heading hierarchy, and ensures compliance with e-commerce regulations and best practices
   - Meta Fields Agent (Claude Sonnet 4): creates 6 specialized metafields (ingredients, recommendations, nutritional values, warnings, short descriptions, and client arguments), enforces strict formatting rules and regulatory compliance, and generates clean HTML compatible with Shopify themes
   - SEO Fields Agent (Claude Sonnet 4): produces optimized meta descriptions for search engines, integrates keyword research data for maximum organic visibility, and applies current-year SEO best practices and anti-keyword-stuffing techniques
6. **Shopify Integration & Updates**
   - Updates product descriptions via the Shopify API
   - Uploads metafields using GraphQL mutations
   - Converts HTML to Shopify Rich Text format
   - Tracks completion status in Airtable

## 🛠️ Setup Steps

**Core Integrations**
- Shopify Access Token - for product data retrieval and content updates
- OpenRouter API - for GPT-4o and Claude model access
- Haloscan API - for keyword research and SERP analysis
- Perplexity API - for market intelligence and content strategy
- Airtable OAuth - for workflow management and keyword tracking

**Agent Configuration**
- Orchestrator Agent - central workflow management with routing logic
- Product Description Agent - SEO content generation with fact verification
- Meta Fields Agent - structured metafield creation with compliance rules
- SEO Fields Agent - meta description optimization with keyword integration
- Premium Keyword Discovery - automated keyword research and analysis
- SEO Compliance Checker - cannibalization prevention and strategy generation

**Workflow Tools**
- MCP Server Integration - Airtable data management
- HTTP Request Tools - Haloscan API communication
- Structured Output Parsers - data validation and formatting
- Memory Buffer Windows - conversation context management
- Rich Text Converters - Shopify-compatible content formatting

## 🎯 Workflow Capabilities

**Product Description Generation**
- Length Control: 150-300 words with hard limits
- SEO Structure: optimized heading hierarchy and keyword placement
- Fact Verification: zero-invention policy with source validation
- Brand Compliance: controlled brand mentions and positioning

**Metafield Creation**
- 6 Specialized Fields: arguments, ingredients, recommendations, nutrition, warnings, descriptions
- HTML Formatting: clean structure with allowed tags only
- Regulatory Compliance: industry-specific warnings and disclaimers
- Dynamic Content: adapts to different product categories automatically

**Advanced SEO Features**
- Keyword Research: automated discovery with search volume analysis
- Cannibalization Prevention: SERP overlap detection and filtering
- Meta Optimization: character-limited descriptions with CTR focus
- Content Strategy: AI-generated SEO roadmaps based on market data

## 🔐 Credentials Required

- Shopify Access Token - product management and content publishing
- OpenRouter API Key - multi-model AI access (GPT-4o, Claude variants)
- Haloscan API Key - keyword research and SERP analysis
- Perplexity API Key - market intelligence and content strategy
- Airtable OAuth - database management and workflow tracking

## 👤 Ideal For

- E-commerce Teams scaling content creation across hundreds of products
- SEO Specialists implementing advanced cannibalization prevention strategies
- Shopify Store Owners seeking enterprise-level content automation
- Marketing Agencies building scalable, multi-client SEO workflows
- Product Managers requiring compliance-focused content generation

## 💬 Advanced Features

- **Multi-Language Ready** - The workflow architecture supports easy extension to multiple markets and languages with minimal configuration changes.
- **Compliance Framework** - Built-in regulatory compliance checking ensures content meets industry standards and legal requirements.
- **Scalable Architecture** - The modular design allows adding new content types, AI models, or integration points without workflow restructuring.
- **Error Handling & Retries** - Comprehensive error management with automatic retries and fallback mechanisms ensures reliable content generation.

💡 Pro Tip: This workflow represents a complete SEO content factory that can process hundreds of products daily while maintaining quality, compliance, and search engine optimization standards.
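The listing does not publish its exact SERP-overlap metric; a common way to flag cannibalization risk is Jaccard similarity over the top-ranking URL sets of two keywords (high overlap suggests the same search intent). A sketch of that idea:

```javascript
// Jaccard similarity of two keywords' top-ranking URL sets.
// This is an illustrative metric, not the workflow's published formula.
function serpOverlap(urlsA, urlsB) {
  const a = new Set(urlsA), b = new Set(urlsB);
  const inter = [...a].filter(u => b.has(u)).length;
  const union = new Set([...a, ...b]).size;
  return inter / union; // 0 = no overlap, 1 = identical SERPs
}

const kw1 = ['site1.com/p', 'site2.com/q', 'site3.com/r'];
const kw2 = ['site1.com/p', 'site2.com/q', 'site4.com/s'];
console.log(serpOverlap(kw1, kw2).toFixed(2)); // 0.50
```

Keywords whose overlap with an already-targeted keyword exceeds some threshold would then be filtered out before content generation.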
by takuma
## Who is this for

This template is perfect for:
- **Market Researchers** tracking industry trends
- **Tech Teams** wanting to stay updated on specific technologies (e.g., "AI", "Cybersecurity")
- **Content Creators** looking for curated news topics
- **Busy Professionals** who need a high-signal, low-noise news digest

## What it does

1. **Fetches News:** Pulls daily articles via NewsAPI based on your chosen keyword (default: "technology").
2. **AI Filtering:** Uses an AI Agent (via OpenRouter) to filter out low-quality or irrelevant clickbait.
3. **Daily Digest (Slack):** Summarizes the top 3 articles in English, translates the summaries to Japanese using DeepL (optional), and posts both versions to a Slack channel.
4. **Data Archiving (Sheets):** Extracts structured data (Title, Author, Summary, URL) and saves it to Google Sheets.
5. **Weekly Trend Report:** Every Monday, it reads the past week's data from Google Sheets and uses AI to generate a high-level trend report and strategic insights.

## How to set up

1. **Configure Credentials:** You will need API keys/auth for NewsAPI, OpenRouter (or OpenAI), DeepL, Google Sheets, and Slack.
2. **Set up the Google Sheet:** Create a sheet with the following headers in the first row: title, author, summary, url.
3. **Map the Sheet:** In the "Append row in sheet" and "Read sheet (weekly)" nodes, select your file and map the columns.
4. **Define the Keyword:** Open the "Set Keyword" node and change chatInput to the topic you want to track (e.g., "Crypto", "SaaS", "Climate Change").
5. **Slack Setup:** Select your desired channel in the Slack nodes.

## Requirements

- n8n (self-hosted or Cloud)
- NewsAPI key (free tier available)
- OpenRouter (or any LangChain-compatible chat model like OpenAI)
- DeepL API key (for translation)
- Google Sheets account
- Slack workspace

## How to customize

- **Change the Language:** Remove the DeepL node if you only want English, or change the target language code.
- **Adjust the Prompt:** Modify the "AI Agent (Filter)" system message to change how strict the news filtering is.
- **Change the Schedule:** Adjust the Cron nodes to run at your preferred time (currently set to daily 8 AM and weekly Monday 9 AM).
by Yuvraj Singh
## Purpose

This solution enables you to manage all your Notion and Todoist tasks from different workspaces, as well as your calendar events, in a single place. It is a two-way sync with partial support for recurring tasks.

## How it works

- The realtime sync consists of two workflows, both triggered by a registered webhook from either Notion or Todoist.
- To avoid overwrites by late-arriving webhook calls, the current task is retrieved from both sides every time.
- Redis is used to prevent endless loops, since an update in one system triggers another webhook call again. Using the ID of the task, the trigger is locked for 80 seconds.
- Depending on the detected changes, the other side is updated accordingly. Generally, Notion is treated as the main source.
- Using an "Obsolete" status, it is guaranteed that tasks never get deleted entirely by accident.
- The Todoist ID is stored in the Notion task, so the two stay linked together.
- An additional full-sync workflow runs daily to fix inconsistencies, if any occurred, since webhooks cannot be trusted entirely.
- Since Todoist requires a more complex setup, a tiny workflow helps with activating the webhook. Another tiny workflow helps generate a global config, which is used by all workflows for mapping purposes.

## Mapping (Notion >> Todoist)

- Name: Task Name
- Priority: Priority (1: do first, 2: urgent, 3: important, 4: unset)
- Due: Date
- Status: Section (Done: completed, Obsolete: deleted)
- <page_link>: Description (read-only)
- Todoist ID: <task_id>

## Current limitations

- Changes to the same task cannot be made simultaneously in both systems within a 15-20 second time frame.
- Subtasks are not linked automatically to their parent yet.
- Task names do not support URLs yet.

## Credentials

Set up credentials for Notion (access token), Todoist (access token), and Redis by following the videos:
- Todoist: follow this video to obtain the API token. Todoist Credentials.mp4
- Notion: follow this video to get the Notion Integration Secret.
- Redis: follow this video to set up Redis.

## Setup

The setup involves quite a lot of steps, yet many of them can be automated for business-internal purposes. Just follow the video or do the following steps:
1. Set up credentials for Notion (access token), Todoist (access token), and Redis. You can also create empty credentials and populate them later during further setup.
2. Clone this workflow by clicking the "Use workflow" button and then choosing your n8n instance; otherwise you will need to map the credentials of many nodes.
3. Follow the instructions described within the bundle of sticky notes on the top left of the workflow.

## How to use

You can apply changes (create, update, delete) to tasks in both Notion and Todoist, which then get synced over within a couple of seconds (handled by the differential realtime sync). The daily full sync resolves possible discrepancies in Todoist.

This workflow incorporates ideas and techniques inspired by Mario (https://n8n.io/creators/octionic/), whose expertise with specific nodes helped shape parts of this automation. Significant enhancements and customizations have been made to deliver a unique and improved solution.
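The Redis loop guard described above (lock a task ID for 80 seconds so the echo webhook is ignored) can be sketched in-memory like this. The real workflow uses Redis, presumably with `SET`-if-not-exists plus an expiry; the Map here only models those semantics:

```javascript
// In-memory model of the loop guard. If the key already exists (set when
// the other side was updated), the trigger is skipped, breaking the
// Notion <-> Todoist webhook loop.
const locks = new Map(); // taskId -> expiry timestamp (ms)

function acquireLock(taskId, ttlSeconds = 80, now = Date.now()) {
  const expiry = locks.get(taskId);
  if (expiry && expiry > now) return false; // still locked: suppress trigger
  locks.set(taskId, now + ttlSeconds * 1000);
  return true;
}

console.log(acquireLock('task-42')); // true  - first webhook proceeds
console.log(acquireLock('task-42')); // false - echo webhook is suppressed
```

This also explains the documented limitation: a genuine edit on the other side within the lock window is suppressed too, which is what the daily full sync catches.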
by moosa
This workflow monitors product prices from BooksToScrape and sends alerts to a Discord channel via webhook when competitors' prices are lower than our prices.

## 🧩 Nodes Used

- Schedule (for daily or any required schedule)
- If nodes (to check whether checked or unchecked data exists)
- HTTP Request (for fetching the product page)
- Extract HTML (for extracting the product price)
- Code (to clean and extract just the price number)
- Discord Webhook (to send Discord alerts)
- Sheets (extract and update)

## 🚀 How to Use

1. Replace the Discord webhook URL with your own.
2. Customize the scraping URL if you're monitoring a different site (see the sheet I used).
3. Run the workflow manually or on a schedule.

## ⚠️ Important

- Do not use this for commercial scraping without permission.
- Ensure the site allows scraping (this example is for learning only).
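The price-cleaning Code node can be sketched like this; BooksToScrape renders prices as strings such as "£51.77", and `ourPrice` below is a hypothetical value standing in for the price pulled from the sheet:

```javascript
// Strip currency symbols and keep only the numeric price.
function cleanPrice(raw) {
  const match = String(raw).match(/\d+(?:\.\d+)?/);
  return match ? parseFloat(match[0]) : null;
}

console.log(cleanPrice('£51.77'));        // 51.77
console.log(cleanPrice('Price: £13.99')); // 13.99

// Hypothetical comparison against our own price from the sheet:
const ourPrice = 55.0;
if (cleanPrice('£51.77') < ourPrice) {
  console.log('competitor is cheaper: send Discord alert');
}
```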
by Hugo Le Poole
## Who is this for?

Agencies, consultants, and service providers who conduct discovery calls and need to quickly turn conversations into professional proposals.

## What it does

This workflow transforms meeting transcripts into complete, professional quotes using a sophisticated multi-agent AI architecture. It handles the entire quote lifecycle: from transcript analysis to client signature and onboarding.

## How it works

1. **Trigger:** Google Drive detects a new VTT/transcript file in a designated folder
2. **Extraction:** The transcript is cleaned and parsed, then matched with calendar data to identify the client
3. **AI Analysis:** A main orchestrator agent analyzes the call and delegates tasks to specialized sub-agents:
   - SOW Agent: generates problems, solutions, and action items
   - Pricing Agent: creates competitive pricing based on the service catalog and market research
4. **Document Creation:** The PandaDoc API creates the quote with all tokens populated
5. **Review & Approval:** The quote is sent to Slack for human review with approve/reject buttons
6. **Delivery:** Approved quotes are sent via Gmail with custom HTML templates
7. **Post-Signature:** A webhook triggers the CRM update and welcome email upon signature

## Key Features

- Multi-agent architecture with specialized AI agents
- Automatic pricing calculation with 80%+ margin targeting
- Market research integration via the Perplexity API
- Human-in-the-loop approval via Slack
- Professional HTML email templates
- CRM integration (Notion) for client status tracking

## Requirements

- Google Drive account (for transcript storage)
- Google Calendar (for meeting context)
- PandaDoc account (for quote generation)
- OpenRouter API (for LLM access - Claude/GPT models)
- Perplexity API (for market research)
- Slack workspace (for the approval workflow)
- Gmail account (for client communication)
- Notion database (for CRM)

## Setup Instructions

1. Configure the Google Drive trigger folder
2. Set up the PandaDoc template with required tokens
3. Add API credentials for OpenRouter and Perplexity
4. Connect the Slack workspace for approval notifications
5. Configure Gmail for outbound emails
6. Set up the Notion CRM database with required properties
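The 80%+ margin targeting reduces to a simple floor-price formula: for a target margin m, the minimum price is cost / (1 - m). The Pricing Agent's actual prompt-driven logic is richer (service catalog, market research), so this is only the arithmetic core:

```javascript
// Minimum quote price for a target gross margin, using integer percent
// to keep the arithmetic exact: price = cost * 100 / (100 - marginPct).
function minPriceForMargin(cost, marginPct = 80) {
  return (cost * 100) / (100 - marginPct);
}

console.log(minPriceForMargin(1000)); // 5000 - quote at least 5000 for 1000 of delivery cost
```

A common mistake is to use markup (cost × 1.8) instead; that yields only a ~44% margin, not 80%.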
by Yaron Been
## Description

This workflow automatically screens job applicants, scores them against role requirements, sends personalized emails (interview invitation or polite rejection), and logs everything to a Google Sheet. It replaces hours of manual resume triage per hiring cycle.

## Overview

A candidate submits an application through a Tally form with their name, email, role, experience, and motivation. Two AI agents process the application: one extracts and normalizes the fields, the other scores the candidate 1-10 against the role requirements. An IF node routes the applicant based on score: 7 or above triggers an interview invitation email, below 7 triggers a personalized rejection email. Both paths log the applicant with their score, grade, evaluation, and status to a Google Sheet for the hiring manager.

## Tools Used

- **n8n**: The automation platform that orchestrates the workflow.
- **Tally**: Free form builder used as the job application form. Create forms at tally.so.
- **OpenAI (GPT-5.4)**: Powers two AI agents for applicant data extraction and candidate scoring/evaluation.
- **Gmail**: Sends personalized interview invitations and rejection emails automatically.
- **Google Sheets**: Logs every applicant with their score, grade, AI evaluation, and shortlisted/rejected status.

## How it works

1. A candidate submits a Tally application form with their full name, email, role applied for, years of experience, experience summary, and motivation statement.
2. Agent 1 extracts and normalizes all applicant fields from the raw Tally payload.
3. Agent 2 scores the candidate 1-10 based on relevance of experience and quality of motivation, assigns a grade (A-D), identifies key strengths and concerns, and writes a 2-sentence evaluation for the hiring manager.
4. An IF node checks the score: 7 or above routes to the shortlist path, below 7 routes to the rejection path.
5. Shortlisted candidates receive a personalized interview invitation email via Gmail.
6. Rejected candidates receive a polite, personalized rejection email via Gmail.
7. Both paths set a status (Shortlisted or Rejected) and log the applicant to a Google Sheet with all details.

## How to Install

1. **Import the Workflow:** Download the .json file and import it into your n8n instance.
2. **Update Job Requirements (Critical First Step):** Open the Score Candidate agent node and update the system prompt with the actual job requirements, must-have skills, and evaluation criteria for the open role. Without this, the AI scores against generic criteria.
3. **Configure Tally:** Add your Tally API key in n8n credentials (get it from Tally > Settings > Integrations > API). Select your Tally form in the Tally Trigger dropdown.
4. **Configure OpenAI:** Add your OpenAI API key in n8n credentials.
5. **Configure Gmail:** Add Gmail OAuth2 credentials using the Google account that will send the emails. Edit the email templates in the Send Interview Invitation and Send Rejection Email nodes if needed.
6. **Configure Google Sheets:** Add Google Sheets OAuth2 credentials.
7. **Create a Tally Form:** Create a job application form in Tally with these fields:
   - Full Name (short text)
   - Email (email field)
   - Role Applied For (short text or dropdown)
   - Years of Experience (number)
   - Experience Summary (long text)
   - Why do you want this role? (long text)
8. **Set Up Google Sheets:** Create a spreadsheet with a tab named exactly `applications` and these column headers in Row 1 (all lowercase, copy-paste ready):

   | timestamp | name | email | role | score | grade | evaluation | status |
   |-----------|------|-------|------|-------|-------|------------|--------|

9. **Connect the Sheet:** Paste your spreadsheet ID into the Log to Applications Sheet node.
10. **Test:** Submit a test application through your Tally form to verify the full flow: scoring, email delivery, and sheet logging.

## Use Cases

- **HR Teams**: Automate first-pass screening for high-volume roles and save 2-3 hours per hiring cycle.
- **Startup Founders**: Screen applicants without a dedicated recruiter.
- **Recruiting Agencies**: Triage candidate submissions across multiple open roles.
- **Hiring Managers**: Get an AI evaluation with score and grade before reviewing any application manually.
- **Internal Mobility Programs**: Let employees apply for internal roles with automated, fair scoring.

## Notes

- The score threshold (7) is set in the IF node. Adjust it higher for stricter screening or lower for more inclusive shortlisting.
- Edit the email templates in both Gmail nodes to match your company's tone and include relevant links (calendar scheduling, office address, etc.).
- GPT-5.4 costs approximately $2.50 per million input tokens and $15 per million output tokens. A typical applicant screening uses 2 API calls, costing approximately $0.005-0.01 per applicant.
- For multiple open roles, create separate Tally forms per role and duplicate this workflow, customizing Agent 2's system prompt for each role's requirements. Tally's free plan has no form limit, so this scales without extra cost.
- The workflow sends emails from whatever Gmail account you connect. Make sure it's the appropriate hiring/recruitment account.

## Connect with Me

- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/

#n8n #automation #tally #openai #gpt5 #recruitment #hiring #hrautomation #gmail #googlesheets #applicantscreening #airecruitment #n8nworkflow #workflow #nocode #hiringautomation #candidatescoring #talentacquisition #hrtech #jobapplication #tallyforms
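The IF-node routing reduces to a simple threshold check, sketched here with the documented default of 7:

```javascript
// Score-based routing: 7 or above -> shortlist path, below 7 -> rejection path.
// THRESHOLD mirrors the default in the IF node and is meant to be tuned.
const THRESHOLD = 7;

function route(candidate) {
  return candidate.score >= THRESHOLD ? 'Shortlisted' : 'Rejected';
}

console.log(route({ name: 'A. Candidate', score: 8 })); // Shortlisted
console.log(route({ name: 'B. Candidate', score: 5 })); // Rejected
```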