by Dev Dutta
## Geopolitics Breaking News Alert System

**Workflow Name:** Geopolitics Breaking News Alert System
**Author:** Devjothi Dutta
**Category:** Productivity, News & Media, AI/Machine Learning
**Complexity:** Medium
**Setup Time:** 45-60 minutes

### Description

An intelligent geopolitical monitoring system that filters 200+ daily news articles down to only the critical breaking news that matters to you. This workflow uses smart keyword filtering and AI-powered scoring to eliminate noise, reduce AI costs, and deliver only high-priority geopolitical alerts to Telegram.

**The Problem:** Traditional news monitoring is overwhelming: hundreds of articles per hour, 95% irrelevant to your region of interest, no urgency prioritization, and critical breaking news buried in noise.

**The Solution:** This workflow combines dual-layer filtering (primary + secondary keywords) with AI scoring to distinguish actual breaking news from general coverage. By filtering first and scoring second, it cuts AI API costs by 80-90% while ensuring you never miss critical geopolitical developments. Switch between monitoring India, China, the Middle East, Russia-Ukraine, or any region by simply changing a configuration file.

Perfect for government analysts, corporate security teams, investment research firms, news organizations, or anyone who needs to stay informed about geopolitical developments without information overload.

### Who's it for

**Government & Defense Analysts:**
- Monitor specific regions for military actions, diplomatic developments, and security threats
- Filter by mission-critical keywords to eliminate irrelevant news
- AI scoring identifies genuine breaking news vs. routine coverage
- Reduce analyst workload by 90% through intelligent automation

**Corporate Security & Risk Teams:**
- Track geopolitical risks affecting global supply chains and operations
- Custom keyword filters for industry-specific concerns (e.g., "semiconductor", "tariff", "sanctions")
- Real-time alerts for events impacting business continuity
- Cost-efficient monitoring with minimal AI API usage

**Investment Research Firms:**
- Monitor emerging-market geopolitical risks affecting portfolio companies
- AI scoring differentiates market-moving events from background noise
- Configurable alert thresholds based on investment strategy (conservative vs. aggressive)
- Track multiple regions simultaneously with different configs

**News Organizations & Journalists:**
- Monitor breaking geopolitical developments for editorial coverage
- Filter by urgency to prioritize assignment-desk resources
- Aggregate multiple international news sources in one place
- Extend alerts to newsroom Slack channels or email

### Key Features

- **Smart Dual-Layer Filtering** - Primary keywords ensure regional relevance; secondary keywords filter by event type (military, diplomatic, economic)
- **AI-Powered Urgency Scoring** - GPT-4o-mini scores articles 1-10 on geopolitical urgency, distinguishing breaking news from routine coverage
- **Cost-Efficient Design** - Filter-first, score-second approach reduces AI API calls by 80-90% (only ~5 articles analyzed out of 200)
- **Multi-Region Support** - Monitor India, China, the Middle East, Russia-Ukraine, or any region by switching config files
- **Multi-Source RSS Aggregation** - Combines 6 international news sources (NYT, BBC, Al Jazeera, SCMP, regional feeds)
- **Duplicate Detection** - Persistent storage prevents re-analyzing the same articles across executions
- **Consolidated Alerts** - Single Telegram message with all breaking news, grouped by urgency score
- **Flexible Scheduling** - Configure the trigger interval to your needs (15 min for active conflicts, 3 hr for routine monitoring)
- **Config-Driven Architecture** - All filters, keywords, and scoring rules live in a Google Drive JSON file
- **Production Ready** - Tested end-to-end with real-world India and China configurations
- **Scalable Design** - Run multiple regional configs in parallel; extend delivery to Slack/WhatsApp/Email

### Requirements

**Required Services:**
- **n8n (version 1.0+)** - Workflow automation platform. Free tier: n8n Cloud or self-hosted Docker. Required feature: Data Tables (for duplicate tracking)
- **OpenAI API (GPT-4o-mini)** - AI scoring engine. Cost: ~$0.10/day at 30-min intervals. Free tier: $5 credit for new accounts
- **Telegram Bot** - Alert delivery. Free: create a bot via @BotFather on Telegram; get your chat ID via @userinfobot
- **Google Drive** - Config file storage. Free: any Google account. Used for publicly shared JSON config files

**Required Credentials:**
- **OpenAI API Key** - Get from platform.openai.com (GPT-4o-mini access)
- **Telegram Bot Token** - Create a bot via @BotFather and copy the token
- **n8n Data Table** - Built-in n8n feature (no external credential)

**Optional:**
- Slack Webhook URL (for extending alerts to Slack)
- SMTP credentials (for email alerts)
- Twilio account (for WhatsApp/SMS alerts)

### What's Included

- Complete n8n workflow JSON (ready to import)
- Complete setup guide with detailed Data Table configuration and troubleshooting
- Technical architecture documentation
- Use cases and customization guide
- 4 pre-built regional configs (India, China, Middle East, Russia-Ukraine)

### Quick Start

Full setup takes 45-60 minutes. For detailed step-by-step instructions, see SETUP_GUIDE.md.

1. Create an n8n Data Table (`analyzed_articles` with 2 columns)
2. Upload the config to Google Drive (choose a region, share publicly, get the file ID)
3. Import the workflow (22 nodes ready to configure)
4. Configure nodes:
   - Update the Google Drive config URL with your file ID
   - Update the 6 RSS Feed URLs for your region
   - Link the 3 Data Table nodes to the `analyzed_articles` table
5. Add credentials (OpenAI API, Telegram Bot)
6. Set the schedule (15 min to daily, based on monitoring needs)
7. Test the workflow (verify filtering, scoring, and alerts work)
8. Activate (the workflow then runs automatically on schedule)

**Quick Start Result:**
- 200+ articles processed → 5-7 filtered → 3-5 scored → 1-3 alerts sent
- Telegram receives a consolidated breaking-news message
- Workflow runs every 30 min (or your chosen interval)
- Total monthly cost: $3-5 (OpenAI API only)

Need help? See SETUP_GUIDE.md for complete instructions with screenshots and troubleshooting.

### Workflow Stats

- **Nodes:** 22
- **Complexity:** Medium
- **Execution Time:** ~30-60 seconds per run
- **Monthly Cost:** $3-5 (OpenAI API usage only)
- **Maintenance:** Minimal (update RSS feeds if sources change)
- **Scalability:** Handles 200+ articles per execution; easily scales to 10+ RSS feeds

### Customization Options

- **Add more regions:** Create new config JSON files for North Korea, Taiwan, Africa, Latin America, etc.
- **Multi-channel alerts:** Extend to Slack, WhatsApp, Email, Discord, Microsoft Teams, SMS
- **Severity-based routing:** Send critical alerts (score 9-10) via SMS, others to Telegram
- **Custom scoring models:** Switch between GPT-4o-mini, GPT-4o, and Claude via config
- **Exclude keywords:** Add an "exclude_keywords" array to filter out sports, entertainment, weather
- **Alert digest mode:** Aggregate alerts into daily/weekly summary emails instead of real-time
- **Dashboard integration:** Connect to Grafana or Metabase for visual trend analysis
- **Webhook triggers:** Use workflow output to trigger other n8n workflows or external systems
- **Custom RSS feeds:** Add industry-specific or regional news sources
- **Adjust alert threshold:** Change from score >= 6 to higher or lower based on notification preferences

### How it Works

1. **Schedule Trigger (Configurable):** Runs at your configured interval (15 min, 30 min, 1 hr, 3 hr, daily, etc.). Trigger frequency depends on the use case: active conflicts need more frequent monitoring.
2. **Config Loading:** An HTTP Request node fetches the JSON config from Google Drive. The config contains keywords, scoring rules, the AI role, the alert threshold, and the Telegram chat ID.
3. **RSS Aggregation:** 6 RSS Feed nodes fetch articles from international news sources; a Merge node combines all feeds (~200 articles per execution); an RSS Cleanup node strips HTML and normalizes each article to 5 fields (60-75% size reduction).
4. **Smart Filtering (Cost Optimization Layer 1):** A Dynamic Filter checks PRIMARY keywords (geographic/entity: "india", "modi", "delhi") and SECONDARY keywords (event type: "military", "conflict", "trade deal"). Both conditions are required: an article must mention at least one primary AND one secondary keyword. Result: 200 articles reduced to ~5-7 relevant ones (95% reduction). Why this matters: it eliminates noise BEFORE expensive AI scoring.
5. **Duplicate Detection (Cost Optimization Layer 2):** Queries the Data Table for previously analyzed article links and filters out articles already scored in the last 7 days. Result: 5-7 filtered articles reduced to 3-5 new ones. Why this matters: it prevents redundant AI API calls (saves 80% on repeat articles).
6. **Dynamic AI Prompt Generation:** A Code node builds the system prompt from `config.ai_role` and `config.scoring_criteria`, instructing the AI: "You are a geopolitical analyst for [REGION]. Score articles 1-10..." It includes a scoring rubric: 9-10 = Military Action, 7-8 = Trade/Economic, etc.
7. **AI Urgency Scoring (Breaking News Detection):** The Breaking News Analyzer (GPT-4o-mini) evaluates geopolitical urgency on a 1-10 scale, distinguishing genuine breaking news from routine coverage. It returns score, category, reasoning, and `should_alert` (true/false based on the threshold). Cost: $0.002 per article, with only 3-5 articles scored per execution.
8. **Alert Decision:** An IF node checks `should_alert === true` (score >= `config.alert_threshold`). Only high-priority alerts proceed to Telegram; articles below the threshold are logged but not sent.
9. **Alert Aggregation:** Consolidates multiple breaking-news alerts into a single Telegram message, grouped by urgency score with color-coded emojis (red for 9-10, orange for 7-8, yellow for 6-7), including score, category, title, and link for each alert.
10. **Telegram Delivery:** Sends the consolidated alert to the configured Telegram chat, using HTML formatting for bold text and clickable links. The chat ID is loaded dynamically from the config (different regions → different chats).

### Pro Tips

- **Start with a higher threshold:** Begin with `alert_threshold = 7` to avoid alert fatigue; lower it to 6 after tuning keywords
- **Regional RSS matters:** Use region-specific news sources for better coverage (e.g., Times of India for India, not just BBC/NYT)
- **Test keywords first:** Run the workflow manually with "Test Workflow" to verify keyword filtering before activating the schedule
- **Monitor AI costs:** Check the OpenAI usage dashboard after the first week to confirm the ~$0.10/day cost estimate
- **Tune secondary keywords:** Add domain-specific terms to secondary keywords (e.g., "semiconductor" for tech supply-chain monitoring)
- **Use separate configs for critical regions:** Clone the workflow for high-priority regions instead of switching configs manually
- **Schedule based on time zones:** Align execution intervals with business hours in the monitored region (e.g., 9 AM-6 PM IST for India)
- **Clear duplicates for testing:** Manually clear the `analyzed_articles` Data Table when testing new configs for fresh results
- **Back up working configs:** Export and version-control config files before making major keyword changes
- **Consider alert fatigue:** Score 9-10 events are rare (0-1 per day); score 6-8 events are common (2-5 per day). Set your threshold accordingly.

### Related Workflows

- **Multi-Region Geopolitics Dashboard** - Combine multiple regional configs into a single monitoring dashboard
- **Geopolitical Risk Scoring for Portfolios** - Integrate with stock portfolio data to assess investment risk
- **Automated Geopolitical Intelligence Reports** - Generate daily/weekly PDF reports from breaking-news data
- **Conflict Escalation Tracker** - Track score trends over time to detect escalating tensions
- **Supply Chain Risk Alerting** - Focus on trade/sanctions news affecting global supply chains

### Support & Feedback

For questions, issues, or feature requests:
- **GitHub:** n8n-geopolitics-breaking-news-alert repository
- **n8n Community Forum:** Tag @devdutta
- **Email:** devjothi@gmail.com

### License

MIT License - free to use, modify, and distribute.

If you find this workflow useful, please share your feedback and star the workflow!
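The "both layers required" rule at the heart of the dual-layer filter can be sketched as a small Code-node-style function. The keyword lists and article fields below are illustrative examples, not the template's actual config:

```javascript
// Sketch of the dual-layer keyword filter (Cost Optimization Layer 1).
// Keywords and field names here are illustrative, not the shipped config.
const config = {
  primary_keywords: ["india", "modi", "delhi"],
  secondary_keywords: ["military", "conflict", "trade deal"],
};

function isRelevant(article, cfg) {
  const text = `${article.title} ${article.description}`.toLowerCase();
  const hasPrimary = cfg.primary_keywords.some((k) => text.includes(k));
  const hasSecondary = cfg.secondary_keywords.some((k) => text.includes(k));
  return hasPrimary && hasSecondary; // both layers must match
}

const articles = [
  { title: "India signs new trade deal with EU", description: "" },
  { title: "India cricket team wins series", description: "" },
  { title: "Military exercise in the Pacific", description: "" },
];
// Only the first article mentions a primary AND a secondary keyword.
const relevant = articles.filter((a) => isRelevant(a, config));
```

This is why the filter is cheap: only articles passing both layers ever reach the paid AI-scoring step.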
by Cheng Siong Chin
### How It Works
This workflow automates contract governance auditing by deploying a multi-agent AI system that validates contracts, assesses risk, checks compliance, and routes alerts based on risk level. Designed for legal, procurement, and compliance teams, it eliminates manual contract-review bottlenecks and ensures timely escalation of high-risk issues.

A schedule trigger initiates the workflow, simulating contract audit data input. A Contract Validation Agent performs initial validation via OpenAI, then passes results to a Governance Orchestration Agent, which delegates to Risk Assessment and Compliance Checker sub-agents. Risk scores are routed by level (low, medium, or high), triggering appropriate notifications via Slack or email escalation before the audit trail is logged.

### Setup Steps
1. Set the schedule trigger interval to match audit frequency requirements.
2. Add OpenAI API credentials to all OpenAI Chat Model nodes.
3. Configure Slack credentials and set the target channel for risk notifications.
4. Add Gmail/SMTP credentials to the Send Email Escalation node.
5. Define risk thresholds in the Route by Risk Level rules node.

### Prerequisites
- Slack workspace with bot token
- Gmail or SMTP credentials
- Basic n8n workflow knowledge

### Use Cases
- Automated periodic contract risk auditing for procurement teams
- Compliance-breach detection with instant escalation to legal

### Customization
Replace the simulated data with a live contract database or webhook input.

### Benefits
Eliminates manual contract review with scheduled AI auditing.
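The Route by Risk Level rules node can be thought of as a simple threshold map. The sketch below is illustrative; the actual thresholds are whatever you define in step 5 of the setup:

```javascript
// Illustrative version of the "Route by Risk Level" rules: map a numeric
// risk score onto a notification path. Thresholds are assumptions to be
// tuned in the rules node, not the template's defaults.
function routeByRisk(score) {
  if (score >= 8) return "email-escalation"; // high risk: escalate to legal
  if (score >= 5) return "slack";            // medium risk: notify the channel
  return "audit-log";                        // low risk: log the audit trail only
}
```

Keeping the thresholds in one place makes it easy to tighten or relax escalation policy without touching the agents.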
by moosa
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

### Overview
Automate your email management with this n8n workflow that fetches, summarizes, and shares critical emails from your Gmail inbox. Designed for busy professionals, it runs daily to extract important emails from the past 24 hours, summarizes key details (such as credentials, OTPs, deadlines, and action items), converts the summary into a PDF, and sends it to your Discord channel for quick access.

### Key Features
- **Scheduled Automation**: Triggers daily at 8 PM to process emails from the last 24 hours.
- **Gmail Integration**: Retrieves emails labeled "INBOX" and "IMPORTANT" using secure OAuth2 authentication (no hardcoded API keys).
- **Smart Email Parsing**: Extracts essential fields (subject, sender, and plain text) while cleaning up URLs, extra lines, and formatting for clarity.
- **AI-Powered Summarization**: Uses OpenAI's GPT-4.1-mini to create concise plain-text and markdown summaries, highlighting urgent actions with "[Action Required]".
- **PDF Conversion**: Converts the markdown summary into a professional PDF using the PDF.co API.
- **Discord Notifications**: Shares the PDF via a Discord webhook for seamless team communication.

### Why Use This Workflow?
- Save time by automating email triage and focusing on what matters.
- Stay organized with clear, actionable summaries delivered to Discord.
- Securely handle sensitive data with proper credential management.
- Perfect for teams, freelancers, or anyone managing high email volumes.

### Setup Instructions
1. Configure Gmail OAuth2 credentials for secure access.
2. Set up PDF.co API and Discord webhook credentials.
3. Customize the schedule or filters as needed.
4. Activate and let the workflow handle your daily email summaries!
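The email-parsing step can be pictured as a small text-cleanup function run before summarization. The regexes below are assumptions for illustration, not the workflow's exact cleanup rules:

```javascript
// Illustrative sketch of the "Smart Email Parsing" cleanup: strip URLs and
// collapse extra blank lines before the text goes to the AI summarizer.
function cleanEmailBody(text) {
  return text
    .replace(/https?:\/\/\S+/g, "") // drop links
    .replace(/[ \t]{2,}/g, " ")     // collapse runs of spaces/tabs
    .replace(/\n{3,}/g, "\n\n")     // collapse extra blank lines
    .trim();
}

const raw = "Your OTP is 4821.\n\n\n\nVerify here: https://example.com/verify?x=1";
const cleaned = cleanEmailBody(raw);
```

Shorter, link-free bodies also reduce the token count billed by the summarization model.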
by Yurie Ino
## Employee Onboarding Automation with Multi-System Provisioning

### What this workflow does
This workflow automates the end-to-end employee onboarding process by provisioning new hires across multiple internal systems and delivering a personalized welcome experience. Upon receiving new employee data via a webhook or form submission, it creates user accounts in Google Workspace, invites the employee to Slack, sets up a Notion onboarding page, generates an AI-powered welcome package, and notifies relevant stakeholders. All onboarding activities are logged for tracking and audit purposes.

This template helps HR and People Operations teams reduce manual work, ensure consistency, and deliver a smooth onboarding experience from day one.

### How it works
1. **Employee data intake**: Triggered by a webhook or form submission; collects employee details such as name, email, department, role, start date, and manager.
2. **Data preparation**: Generates a company email address, assigns a unique onboarding ID, and standardizes employee information for downstream systems.
3. **Parallel account provisioning**: Creates a Google Workspace user account, sends an invitation to the Slack workspace, and creates a dedicated Notion onboarding page. These steps run in parallel to minimize onboarding time.
4. **Provisioning result compilation**: Consolidates account-creation statuses into a single onboarding summary object.
5. **AI-powered welcome package**: Generates a personalized welcome message, a suggested first-week schedule, and practical tips for success in the role, formatted for email delivery.
6. **Notifications & communication**: Sends a welcome email to the employee (if a personal email is provided), notifies HR or People Ops via Slack, and logs onboarding details to Google Sheets.
7. **Webhook response**: Returns a structured JSON response confirming onboarding initiation.

### Setup requirements
Before activating this workflow, ensure the following are configured:
- Enable the webhook endpoint and connect it to your form or HR system.
- Configure Google Workspace Admin API access.
- Set up Slack workspace permissions for user invitations.
- Define a parent Notion page for onboarding content.
- Prepare Google Sheets for onboarding logs.
- Customize email templates, departments, and org unit paths as needed.

### Required credentials
This workflow requires the following credentials to be configured in n8n:
- **Google API** (Google Workspace user provisioning)
- **Slack** (workspace invitations and HR notifications)
- **Notion** (onboarding page creation)
- **OpenAI** (AI-generated welcome content)
- **Gmail** (sending welcome emails)
- **Google Sheets** (onboarding tracking and logs)

### Customization ideas
- Add role-based access provisioning (VPN, GitHub, Jira, etc.).
- Delay account creation until a specific start date.
- Generate localized onboarding content by region or language.
- Integrate with HRIS tools such as BambooHR or Workday.
- Add approval steps for managers or IT before provisioning.

### Who this is for
- HR & People Operations teams
- IT & identity management teams
- Startups and scaling organizations
- Companies seeking consistent, automated onboarding

This template provides a scalable, repeatable onboarding foundation that connects HR systems, IT provisioning, and AI-driven communication into a single automated workflow.
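The data-preparation step (company email plus unique onboarding ID) can be sketched as follows. The field names and the `example.com` domain are assumptions, not the template's actual schema:

```javascript
// Hypothetical sketch of the "Data preparation" step: derive a company
// email address and a unique onboarding ID from the intake payload.
// Field names and the domain are illustrative assumptions.
function prepareEmployee(input, domain) {
  const slug = (s) => s.trim().toLowerCase().replace(/[^a-z]/g, "");
  return {
    ...input,
    companyEmail: `${slug(input.firstName)}.${slug(input.lastName)}@${domain}`,
    onboardingId: `ONB-${Date.now()}-${Math.random().toString(36).slice(2, 8)}`,
  };
}

const emp = prepareEmployee(
  { firstName: "Ada", lastName: "Lovelace", department: "Engineering" },
  "example.com"
);
```

Standardizing the record here means every downstream node (Workspace, Slack, Notion, Sheets) receives the same normalized shape.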
by Cheng Siong Chin
### How It Works
This workflow automates inventory management and customer engagement for e-commerce businesses and retail operations that manage multiple product categories. It solves the critical challenge of maintaining optimal stock levels while personalizing customer communications across order fulfillment, product recommendations, and support interactions.

The system processes webhook-triggered data across four parallel streams (orders, reviews, inventory, social media), applies AI-powered analysis for sentiment extraction, pricing optimization, promotion targeting, and demand forecasting, then distributes personalized communications through email campaigns and Slack/Microsoft Teams notifications. This eliminates manual inventory tracking, reduces stockouts, and delivers data-driven customer engagement.

### Setup Steps
1. Configure webhook URLs for orders, reviews, inventory systems, and social media platforms
2. Add AI model API credentials (OpenAI/Anthropic) for sentiment and pricing analysis
3. Connect the CRM database for customer profile management and segmentation
4. Set up an email service (Gmail/SendGrid) with campaign templates for personalized communications
5. Integrate Slack workspace or Microsoft Teams channels for internal inventory alerts

### Prerequisites
- Active e-commerce platform with webhook support
- AI service API keys

### Use Cases
- Multi-channel retailers optimizing stock across locations
- Subscription box services

### Customization
- Adjust AI prompts for industry-specific sentiment rules
- Modify inventory thresholds for restocking alerts

### Benefits
Reduces inventory management overhead by 70% and prevents stockouts through predictive forecasting.
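The "modify inventory thresholds" customization boils down to a comparison against a reorder point. A minimal sketch, with hypothetical field names and threshold values:

```javascript
// Sketch of the inventory-stream restocking check: flag any item at or
// below its reorder point. Field names and values are illustrative.
function needsRestock(item) {
  return item.stock <= item.reorderPoint;
}

const inventory = [
  { sku: "TSHIRT-M", stock: 4, reorderPoint: 10 },
  { sku: "MUG-01", stock: 55, reorderPoint: 20 },
];
const restockAlerts = inventory.filter(needsRestock);
```

Items that pass the check would feed the Slack/Teams alert branch of the workflow.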
by Elay Guez
## AI-Powered HR Candidate Evaluation Agent with LinkedIn Data Enrichment (CSV/XLSX)

### Overview
Transform your manual hiring process into an intelligent evaluation system that saves 15-20 minutes per candidate! This workflow automates the entire candidate-assessment pipeline, from CSV/XLSX upload to AI-powered scoring with LinkedIn insights.

When you upload a candidate list, this workflow automatically:
- Converts your file into a formatted Google Sheet with RTL support
- Researches each candidate's recent LinkedIn posts via Apify
- Evaluates candidates using GPT-4.1 with context-aware scoring (0-100)
- Generates professional Hebrew explanations for each score
- Auto-sorts by score and applies professional formatting
- Sends error alerts to keep everything running smoothly

Cost per candidate: ~$0.05 | Time saved: 15-20 minutes each

### Who's it for?
- HR teams drowning in candidate applications
- Recruitment agencies needing consistent evaluation criteria
- Hiring managers seeking data-driven candidate insights
- Companies looking to scale their team
- Anyone tired of manual spreadsheet juggling

### How it works
1. **Form submission** triggers with a CSV/XLSX upload
2. **Google Drive** stores the file and creates a new Sheet
3. **Data extraction** processes the file content
4. **AI Agent** loops through each candidate:
   - Fetches up to 3 recent LinkedIn posts via Apify
   - Analyzes qualifications against job requirements
   - Generates an evaluation score and Hebrew explanation
5. **Sheet formatting** applies filters, sorting, and styling
6. **Error handling** notifies the admin of any issues

### Setup Instructions
Time to deploy: 15 minutes

Requirements:
- Google account (Drive + Sheets access)
- OpenAI API key (GPT-4.1 access)
- Apify API key (for LinkedIn scraping)
- Gmail account (for error notifications)

Step-by-step:
1. Import this template into your n8n instance
2. Configure Google credentials: connect Google Drive OAuth2 and Google Sheets OAuth2
3. Add your OpenAI API key to the GPT-4.1 node
4. Set up Apify credentials for LinkedIn scraping
5. Configure Gmail for error alerts (update the email in the "Send a message" node)
6. Update the folder IDs in the Google Drive nodes to point at your folders
7. Test with a sample CSV containing 2-3 candidates
8. Activate and share the form URL with your team!

### Input File Format
Your CSV/XLSX should include these columns (header labels are in Hebrew in the template):
- First name
- Last name
- LinkedIn URL
- Your custom evaluation questions

### Customization Options
Easy tweaks:
- **Scoring criteria**: Modify the AI agent's system message
- **Language**: Switch from Hebrew to any language
- **Scoring rubric**: Adjust the 50/25/15/10 weighting
- **LinkedIn posts**: Change from 3 posts to more or fewer
- **Sheet styling**: Customize colors and formatting

Advanced modifications:
- Add integration with your ATS (Greenhouse, Lever, etc.)
- Connect to Slack for real-time notifications
- Add multiple evaluation agents for different roles
- Implement multi-language support
- Add candidate email automation

### Pro Tips
- **Better LinkedIn data**: Ensure candidates provide complete LinkedIn URLs (not just usernames)
- **Consistent scoring**: Run batches of similar roles together for normalized scoring
- **Cost optimization**: Adjust Apify settings to fetch only essential data
- **Scale smartly**: Process batches of at least 10-20 candidates for optimal performance

### Important Notes
- LinkedIn scraping respects Apify's rate limits
- Scores are relative within each batch; don't compare across different job roles
- The workflow handles both CSV and XLSX formats automatically
- Error notifications help you catch issues before they cascade

### Expected Results
After implementation, expect:
- Data-driven evaluation across candidates
- Professional explanations for hiring decisions
- Happy recruiters who can focus on human connection

Built with ❤️ by Elay Guez
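The 50/25/15/10 weighting mentioned in the customization options is a standard weighted-sum rubric. The component names below are assumptions (the template's actual criteria live in the agent's system message); each sub-score is on a 0-100 scale:

```javascript
// Sketch of a 50/25/15/10 weighted scoring rubric. Component names are
// hypothetical; sub-scores are assumed to be on a 0-100 scale.
const WEIGHTS = { experience: 0.5, skills: 0.25, posts: 0.15, culture: 0.1 };

function weightedScore(parts) {
  const raw = Object.entries(WEIGHTS).reduce(
    (sum, [key, w]) => sum + (parts[key] ?? 0) * w,
    0
  );
  return Math.round(raw); // final 0-100 score
}

weightedScore({ experience: 80, skills: 60, posts: 40, culture: 100 });
// 80*0.5 + 60*0.25 + 40*0.15 + 100*0.1 ≈ 71
```

Adjusting the rubric is then a one-line change to `WEIGHTS` rather than a rewrite of the prompt's scoring logic.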
by Julian Kaiser
## n8n Forum Job Aggregator - AI-Powered Email Digest

### Overview
Automate your n8n community job-board monitoring with this intelligent workflow that scrapes, analyzes, and delivers opportunities straight to your inbox. Perfect for freelancers, agencies, and developers who want to stay on top of n8n automation projects without manual checking.

### How It Works
1. Scrapes the n8n community job board to find new postings from the last 7 days
2. Extracts key metadata, including job titles, descriptions, posting dates, and client details
3. Analyzes each listing using OpenRouter AI to generate concise summaries of project requirements and client needs
4. Delivers a professionally formatted email digest with all opportunities organized and ready for review

### Prerequisites
- **OpenRouter API Key**: Sign up at OpenRouter.ai to access AI summarization capabilities
- **SMTP Email Account**: Gmail, Outlook, or any SMTP-compatible email service

### Setup Steps
Time estimate: 5-10 minutes

1. **Configure OpenRouter credentials**: Add your OpenRouter API key in the n8n credentials manager. Recommended model: GPT-3.5-turbo or Claude for cost-effective summaries.
2. **Set up SMTP email**: Configure the sender email address, add recipient email(s) for digest delivery, and test the connection to ensure delivery.
3. **Customize the date range (optional)**: Default is the last 7 days of job postings; adjust the date-filter node to match your preferred frequency.
4. **Test & refine**: Run a test execution, review the email formatting and AI summary quality, and customize the HTML template styling to your preferences.

### Customization Options
- **Scheduling**: Set up cron triggers (daily, weekly, or custom intervals)
- **Filtering**: Add keyword filters for specific technologies or project types
- **AI prompts**: Modify the summarization prompt to extract different insights
- **Email design**: Customize HTML/CSS styling in the email template node

### Example Use Cases
- **Freelance developers**: Never miss relevant n8n automation opportunities
- **Agencies**: Monitor market demand and competitor activity
- **Job seekers**: Track n8n-related positions and consulting gigs
- **Market research**: Analyze trends in automation project requests

### Example Output
Each email digest includes:
- Job title and posting date
- AI-generated summary (e.g., "Client needs workflow automation for Shopify order processing with Slack notifications")
- Direct link to the original posting
- Organized by recency
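The date-filter node's "last 7 days" logic is a simple cutoff comparison. A minimal sketch, with hypothetical post fields:

```javascript
// Illustrative version of the date-filter node: keep only forum postings
// from the last N days. Field names are assumptions.
function withinDays(posts, days, now = new Date()) {
  const cutoff = now.getTime() - days * 24 * 60 * 60 * 1000;
  return posts.filter((p) => new Date(p.createdAt).getTime() >= cutoff);
}

const posts = [
  { title: "Shopify automation gig", createdAt: "2024-06-08T12:00:00Z" },
  { title: "Old listing", createdAt: "2024-05-01T09:00:00Z" },
];
const recent = withinDays(posts, 7, new Date("2024-06-10T00:00:00Z"));
```

Changing the digest frequency is then just a matter of passing a different `days` value from the schedule settings.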
by ็ฆๅฃฝไธ่ฒด
### Who is this for?
This template is designed for B2B sales teams, recruiters, and business development professionals who want to identify sales opportunities by monitoring hiring signals from target companies. It's particularly useful for:
- Sales teams selling HR tech, recruitment software, or staffing services
- Consultancies offering technical talent or project-based work
- Any B2B company that uses "intent data" from job postings to time their outreach

### What this workflow does
This workflow automates the entire process of monitoring job postings and converting hiring signals into actionable sales leads:
1. **Daily job scraping**: Automatically scrapes job postings from Google Jobs, LinkedIn, and Indeed for your target companies using Apify actors
2. **Data normalization**: Standardizes job data from multiple sources into a unified format
3. **Keyword filtering**: Filters jobs based on your target keywords to identify relevant opportunities
4. **AI-powered analysis**: Uses GPT-4o to analyze each qualified job posting and generate:
   - Inferred pain points from the hiring signal
   - Strategic sales approach angles
   - Urgency scoring (1-10)
   - Ready-to-send cold email drafts
5. **Slack notifications**: Sends real-time alerts with AI insights to your sales channel
6. **Weekly reports**: Generates comprehensive trend-analysis reports every Monday with AI-powered insights

### Setup
1. **Google Sheets**: Create a spreadsheet with 4 sheets:
   - Target Companies (columns: Company Name, Target Keywords, My Solution)
   - Raw Jobs (for all scraped jobs)
   - Qualified Leads (for filtered opportunities)
   - Weekly Reports (for trend analysis)
2. **Apify**: Set up accounts and get Actor IDs for the Google Jobs Scraper, LinkedIn Jobs Scraper, and Indeed Scraper
3. **Credentials**: Connect your Google Sheets, Slack, Gmail, OpenAI, and Apify credentials
4. **Configuration**: Update the placeholder values in the workflow with your specific IDs and channel names

### Requirements
- n8n instance (self-hosted or cloud)
- Apify account with credits
- OpenAI API key (GPT-4o access)
- Google Sheets access
- Slack workspace (optional, for notifications)
- Gmail account (optional, for email reports)

### Customization
- Adjust `maxJobsPerSource` and `daysToCheck` in the Configuration node
- Modify AI prompts to match your sales style and language preferences
- Add or remove job sources based on your needs
- Customize the Slack message format and notification triggers
by Cheng Siong Chin
### Introduction
Automate peer-review assignment and grading with AI-powered evaluation, designed for educators managing collaborative assessments efficiently.

### How It Works
A webhook receives assignments and distributes them; AI generates review rubrics; reviewers are emailed; responses are collected; scores are calculated; results are stored; reports are emailed; dashboards are updated; and analytics are posted to Slack.

### Workflow Template
Webhook → Store Assignment → Distribute → Generate Review Rubric → Notify Slack → Email Reviewers → Prepare Response → Calculate Score → Store Results → Check Status → Generate Report → Email Report → Update Dashboard → Analytics → Post to Slack → Respond to Webhook

### Workflow Steps
1. **Receive & Store**: Webhook captures assignments and stores the data.
2. **Distribute & Generate**: Assigns peer reviewers; AI creates rubrics.
3. **Notify & Email**: Alerts via Slack and sends review requests.
4. **Collect & Score**: Gathers responses and calculates peer scores.
5. **Report & Update**: Generates reports, emails results, and updates the dashboard.
6. **Analyze & Alert**: Posts analytics to Slack and confirms completion.

### Setup Instructions
1. **Webhook & Storage**: Configure the endpoint and set up the database.
2. **AI Configuration**: Add your OpenAI key and customize the rubric prompts.
3. **Communication**: Connect Gmail and Slack credentials.
4. **Dashboard**: Link your analytics platform and configure metrics.

### Prerequisites
- OpenAI API key
- Gmail account
- Slack workspace
- Database or storage system
- Dashboard tool

### Use Cases
- University peer-review assignments
- Corporate training evaluations
- Research paper assessments

### Customization
- Multi-round review cycles
- Custom scoring algorithms

### Benefits
- Eliminates manual distribution
- Ensures consistent evaluation
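The Calculate Score step can be sketched as averaging the scores each peer reviewer returns. The review object shape is an assumption; a custom scoring algorithm (one of the customization options) would replace this function:

```javascript
// Illustrative "Calculate Score" step: average the rubric scores returned
// by each peer reviewer, rounded to one decimal. Shapes are assumptions.
function calculatePeerScore(reviews) {
  if (reviews.length === 0) return null; // no responses collected yet
  const total = reviews.reduce((sum, r) => sum + r.score, 0);
  return Math.round((total / reviews.length) * 10) / 10;
}

calculatePeerScore([{ score: 8 }, { score: 7 }, { score: 9 }]); // 8
```

Returning `null` for an empty review set lets the Check Status node distinguish "not yet reviewed" from a genuine zero score.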
by Yaron Been
Stay ahead in your job search with this Automated Job Intelligence System! This workflow scans company career pages daily for new job listings, uses AI to analyze job relevance and seniority levels, and sends personalized email alerts for high-priority opportunities while maintaining a comprehensive job database. Perfect for job seekers, recruiters, and career coaches tracking ideal opportunities across target companies.

### What This Template Does
1. Triggers daily at 9 AM to scan for new job opportunities.
2. Retrieves company URLs from a Google Sheets database.
3. Uses the Decodo scraper to extract job listings from career pages.
4. Analyzes job data with AI to identify company names and positions.
5. Converts job data into individual listing items for processing.
6. Compares new jobs against the existing database to filter duplicates.
7. Uses AI to assign relevance scores, seniority levels, and tech-stack analysis.
8. Filters high-relevance jobs (score > 8/10) for priority alerts.
9. Stores all new jobs in Google Sheets for historical tracking.
10. Sends personalized email alerts for high-relevance opportunities.

### Key Benefits
- Automated daily scanning of target company career pages
- AI-powered relevance scoring and job matching
- Duplicate prevention to avoid redundant notifications
- Comprehensive job database for tracking and analysis
- Personalized alerts for high-priority opportunities
- Time-saving automation for job-search activities

### Features
- Daily automated scheduling for consistent monitoring
- AI-powered job extraction and data structuring
- Decodo web scraping for reliable career-page access
- Intelligent relevance scoring and seniority analysis
- Duplicate detection and filtering
- Google Sheets integration for job tracking
- Personalized email alerts for premium opportunities
- Multi-company monitoring capability
- Historical job-data maintenance

### Requirements
- Decodo API credentials for web scraping
- OpenAI API credentials for AI analysis
- Google Sheets OAuth2 credentials with edit access
- Gmail OAuth2 credentials for email alerts
- Slack bot token for error notifications (optional)
- Environment variables for configuration settings

### Target Audience
- Job seekers and career changers
- Recruitment and talent-acquisition teams
- Career coaches and placement agencies
- HR professionals monitoring competitor hiring
- Technology professionals tracking market opportunities
- University career services and placement offices

### Step-by-Step Setup Instructions
1. Connect Decodo API credentials for career-page scraping
2. Set up OpenAI credentials for job relevance analysis
3. Configure Google Sheets with company URLs and job-tracking sheets
4. Add Gmail credentials for personalized job alerts
5. Optional: set up Slack for error notifications
6. Populate company URLs in your Google Sheets database
7. Test with sample companies to verify job extraction and analysis
8. Customize relevance thresholds for your job preferences
9. Activate for daily automated job-intelligence monitoring

Pro tip: Use coupon code "YARON" to get 23K requests for testing (in Decodo).

This workflow transforms your job search with automated intelligence, smart filtering, and personalized opportunity alerts!
by WeblineIndia
AI-Powered Lead Qualification using Zoho CRM, People Data Labs, and Google Gemini

This workflow automatically checks Zoho CRM every 5 minutes for newly created leads, enriches each lead using People Data Labs, evaluates its quality using Google Gemini (LLM Chain), and updates the lead status in Zoho CRM as Qualified or Not Qualified. Qualified leads trigger an automated Gmail notification to the sales team.

Quick Start Setup
1. Add Zoho CRM OAuth credentials.
2. Add your People Data Labs API key.
3. Add your Google Gemini (PaLM) LLM API credentials.
4. Add Gmail OAuth credentials and set your recipient email.
5. Activate the workflow.
6. Create a test lead and verify enrichment → scoring → update → email.

What It Does
This workflow serves as an automated, AI-driven lead qualification engine. Every 5 minutes, it fetches leads from Zoho CRM, filters out the newly created ones, enriches them using People Data Labs, and scores them via Google Gemini. Based on the AI-generated score, the workflow updates the lead status and optionally sends an email notification.

Who's It For
- Sales teams using Zoho CRM
- SDR/marketing automation teams
- Agencies performing automated lead pre-qualification
- Businesses with high inbound lead volume
- Anyone wanting AI-powered CRM automation via n8n

Requirements
- n8n instance
- Zoho CRM OAuth credentials
- People Data Labs API key (you can use another enrichment service; modify accordingly)
- Google Gemini API credentials
- Gmail OAuth credentials
- Zoho fields: Company, Email, First_Name, Last_Name, Created_Time, Lead_Status

How It Works & Setup Steps
Step 1: Run every 5 minutes via Schedule Trigger - Triggers the workflow and computes a timestamp window.
Step 2: Fetch Zoho leads - Retrieves all leads from Zoho CRM.
Step 3: Filter newly created leads - Compares Created_Time with the timestamp from the previous run.
Step 4: Extract website field - Extracts the website field, which is used to look up the company via the People Data Labs enrichment API.
Step 5: Enrich via People Data Labs - Adds company size, industry, founding year, etc.
Step 6: Score using Google Gemini - The LLM produces a JSON response: summary, score, factors.
Step 7: Update CRM status - If score > 80 → Qualified; else → Not Qualified.
Step 8: Send Gmail notification - Sends a Gmail notification to the sales team, only for Qualified leads, informing them that the lead has been marked as Qualified.

Customization
- Adjust the score threshold in the IF node
- Edit email recipients in the Gmail node
- Modify the AI prompt in the LLM Chain
- Change the third-party enrichment API, if required
- Modify PDL parameters
- Change Zoho CRM fields

Add-Ons
- Slack notifications
- Google Sheets logging
- Auto-create Deals in Zoho
- Add CRM notes
- Owner reassignment
- Tagging

Use Case Examples
- Automated B2B lead scoring
- High-intent lead notifications
- CRM hygiene automation
- Enriched lead analytics
- SDR productivity boost

Troubleshooting Guide

| Issue | Cause | Solution |
|-------|--------|----------|
| No new leads detected | Timestamp mismatch | Validate "Compute Last Check" logic |
| No enrichment data | Empty website / invalid API key | Check PDL credentials & website value |
| AI output invalid | Prompt overwritten | Restore original prompt |
| CRM update fails | Wrong leadId mapping | Confirm Zoho lead ID |
| Gmail errors | OAuth expired | Reconnect Gmail credentials |
| No qualified leads | Score too strict | Lower IF threshold |

Need Help? WeblineIndia can help with workflow customization, advanced automations, CRM integrations and AI-driven business processes. Contact us for expert assistance.
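Two of the steps above reduce to small, self-contained pieces of logic: the timestamp window from Steps 1 and 3 (keep only leads whose `Created_Time` falls after the previous run) and the 80-point threshold from Steps 6-7. A minimal sketch of both, assuming Gemini returns `{summary, score, factors}` as described; the helper names and sample data are illustrative, not the workflow's actual node code:

```javascript
// Steps 1 + 3 (sketch): keep only leads created since the last run.
function filterNewLeads(leads, lastCheckIso, now) {
  const lastCheck = new Date(lastCheckIso);
  return leads.filter((lead) => {
    const created = new Date(lead.Created_Time);
    return created > lastCheck && created <= now;
  });
}

// Steps 6 + 7 (sketch): parse the LLM's JSON and apply the IF-node threshold.
function qualify(llmResponse, threshold = 80) {
  const { score } = JSON.parse(llmResponse);
  return score > threshold ? "Qualified" : "Not Qualified";
}

const now = new Date("2024-05-01T10:00:00Z");
const lastCheck = "2024-05-01T09:55:00Z"; // five minutes before "now"
const leads = [
  { Email: "old@example.com", Created_Time: "2024-05-01T09:30:00Z" },
  { Email: "new@example.com", Created_Time: "2024-05-01T09:58:00Z" },
];
const fresh = filterNewLeads(leads, lastCheck, now);

const hot = JSON.stringify({ summary: "Large fintech", score: 92, factors: ["size"] });
const cold = JSON.stringify({ summary: "Tiny shop", score: 35, factors: ["size"] });
```

If the "No new leads detected" issue from the troubleshooting table appears, it is exactly this window comparison that has drifted: a stale `lastCheck` timestamp silently excludes everything.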
by Pinecone
Try it out

This n8n workflow template lets you chat with your Google Drive documents (.docx, .json, .md, .txt, .pdf) using OpenAI and the Pinecone vector database. It retrieves relevant context from your files in real time so you can get accurate, context-aware answers about your proprietary data, without the need to train your own LLM.

Not interested in chunking and embedding your own data or figuring out which search method to use? Try our n8n quickstart for Pinecone Assistant here, or check out the full workflow to chat with your Google Drive documents here.

Prerequisites
- A Pinecone account
- A GCP project with the Google Drive API enabled and configured
- An OpenAI account and API key
- A Cohere account and API key

Setup
1. Create a Pinecone index in the Pinecone Console here:
   - Name your index n8n-dense-index
   - Select OpenAI's text-embedding-3-small
   - Set the Dimension to 1536
   - Leave everything else as default
   - If you use a different index name, update the related nodes to reflect this change
2. Use the Connect to Pinecone button to authenticate to Pinecone, or, if you self-host n8n, create a Pinecone credential and add your Pinecone API key directly.
3. Set up your Google Drive OAuth2 API, OpenAI, and Cohere credentials in n8n.
4. Download these files and add them to a Drive folder named n8n-pinecone-demo in the root of your My Drive:
   - https://docs.pinecone.io/release-notes/2022.md
   - https://docs.pinecone.io/release-notes/2023.md
   - https://docs.pinecone.io/release-notes/2024.md
   - https://docs.pinecone.io/release-notes/2025.md
   - https://docs.pinecone.io/release-notes/2026.md
5. Activate the workflow, or test it with a manual execution, to ingest the documents.
6. Enter chat prompts to chat with the Pinecone release notes, for example:
   - What support does Pinecone have for MCP?
   - When was fetch by metadata released?

Ideas for customizing this workflow
- Use your own data and adjust the chunking strategy.
- Update the AI Agent System Message to reflect how the Pinecone Vector Store Tool will be used. Be sure to include info on what data can be retrieved using that tool.
- Update the Pinecone Vector Store Tool Description to reflect what data you are storing in the Pinecone index.

Need help? You can find help by asking in the Pinecone Discord community or filing an issue on this repo.
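Under the hood, a dense index like n8n-dense-index ranks stored chunks by vector similarity between the question's embedding and each chunk's embedding. A purely local illustration of that ranking step with tiny mock vectors (real embeddings from text-embedding-3-small have 1536 dimensions, matching the index's Dimension setting; the chunk IDs here are made up):

```javascript
// Cosine similarity: the ranking signal a dense vector index uses.
function cosine(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Mock 3-dimensional "embeddings" standing in for 1536-dimensional ones.
const chunks = [
  { id: "2024-notes", vector: [0.9, 0.1, 0.0] },
  { id: "2022-notes", vector: [0.0, 0.2, 0.9] },
];
const query = [1.0, 0.0, 0.0];

// Rank chunks by similarity to the query, best first.
const ranked = chunks
  .map((c) => ({ id: c.id, score: cosine(query, c.vector) }))
  .sort((x, y) => y.score - x.score);
```

The top-ranked chunks are what the AI Agent receives as retrieved context, which is why the chunking strategy and the tool description both matter: they determine what ends up in these vectors and when the agent chooses to query them.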