by Dariusz Koryto
# Google Drive to FTP Transfer Workflow - Setup Guide

## Overview

This n8n workflow automatically transfers files from Google Drive to an FTP server on a scheduled basis. It includes comprehensive logging, email notifications, and error handling.

## Features

- **Automated Scheduling**: Runs every 6 hours (customizable)
- **Manual Trigger**: Webhook endpoint for on-demand transfers
- **File Filtering**: Supports specific file types and size limits
- **Comprehensive Logging**: Detailed transfer reports saved to Google Drive
- **Email Notifications**: HTML reports sent after each run
- **Error Handling**: Graceful handling of failed transfers
- **Batch Processing**: Files processed individually to prevent rate limits

## Prerequisites

Before setting up this workflow, ensure you have:

- An n8n instance running (self-hosted or cloud)
- A Google Drive account with files to transfer
- An FTP server with upload permissions
- An email service for sending reports (SMTP)

## Step-by-Step Setup Instructions

### 1. Google Drive API Setup

#### 1.1 Create Google Cloud Project

1. Go to Google Cloud Console.
2. Create a new project or select an existing one.
3. Enable the Google Drive API: navigate to "APIs & Services" → "Library", search for "Google Drive API", and click "Enable".

#### 1.2 Create OAuth2 Credentials

1. Go to "APIs & Services" → "Credentials".
2. Click "Create Credentials" → "OAuth client ID".
3. Configure the consent screen if prompted.
4. Choose "Web application" as the application type.
5. Add your n8n instance URL to the authorized redirect URIs: `https://your-n8n-instance.com/rest/oauth2-credential/callback`
6. Note down the Client ID and Client Secret.

#### 1.3 Configure n8n Credential

1. In n8n, go to "Credentials" → "Add Credential".
2. Select "Google Drive OAuth2 API".
3. Enter your Client ID and Client Secret.
4. Complete the OAuth flow by clicking "Connect my account".
5. Set the credential ID as: `your-google-drive-credentials-id`
### 2. FTP Server Setup

#### 2.1 FTP Server Requirements

- Ensure the FTP server is accessible from your n8n instance.
- Verify you have upload permissions.
- Note the server details: host/IP address, port (usually 21 for FTP), username and password, and the destination directory path.

#### 2.2 Configure n8n FTP Credential

1. In n8n, go to "Credentials" → "Add Credential".
2. Select "FTP".
3. Enter your FTP server details:
   - Host: `your-ftp-server.com`
   - Port: 21 (or your custom port)
   - Username: `your-ftp-username`
   - Password: `your-ftp-password`
4. Set the credential ID as: `your-ftp-credentials-id`

### 3. Email Setup (SMTP)

#### 3.1 Choose Email Provider

Configure SMTP settings for one of these providers:

- **Gmail**: `smtp.gmail.com`, port 587, use an App Password
- **Outlook**: `smtp-mail.outlook.com`, port 587
- **Custom SMTP**: Your organization's SMTP server

#### 3.2 Configure n8n Email Credential

1. In n8n, go to "Credentials" → "Add Credential".
2. Select "SMTP".
3. Enter your SMTP details:
   - Host: `smtp.gmail.com` (or your provider)
   - Port: 587
   - Security: STARTTLS
   - Username: `your-email@example.com`
   - Password: `your-app-password`
4. Set the credential ID as: `your-email-credentials-id`
### 4. Workflow Configuration

#### 4.1 Import Workflow

1. Copy the workflow JSON from the artifact above.
2. In n8n, click "Import from JSON".
3. Paste the workflow JSON and import.

#### 4.2 Update Credential References

- Google Drive nodes: verify the credential ID matches `your-google-drive-credentials-id`
- FTP node: verify the credential ID matches `your-ftp-credentials-id`
- Email node: verify the credential ID matches `your-email-credentials-id`

#### 4.3 Customize Parameters

FTP server settings ("Upload to FTP" node):

```json
{
  "host": "your-ftp-server.com",     // Replace with your FTP host
  "username": "your-ftp-username",   // Replace with your FTP username
  "password": "your-ftp-password",   // Replace with your FTP password
  "path": "/remote/directory/{{ $json.validFiles[$json.batchIndex].name }}", // Update destination path
  "port": 21                         // Change if using a different port
}
```

Email settings ("Send Report Email" node):

```json
{
  "sendTo": "admin@yourcompany.com", // Replace with your email address
  "subject": "Google Drive to FTP File Transfer - Report"
}
```

File filter settings ("Filter & Validate Files" node) — in the JavaScript code, update these settings:

```javascript
const transferNotes = {
  settings: {
    maxFileSizeMB: 50,               // Change maximum file size
    allowedExtensions: [             // Add/remove allowed file types
      '.pdf', '.doc', '.docx', '.txt', '.jpg', '.png', '.zip', '.xlsx'
    ],
    autoDeleteAfterTransfer: false,  // Set to true to delete from Drive after transfer
    verifyTransfer: true             // Keep true for verification
  }
};
```

Google Drive notes storage ("Upload Notes to Drive" node):

```json
{
  "parents": {
    "parentId": "your-notes-folder-id" // Replace with the actual folder ID from Google Drive
  }
}
```
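To see how these settings are applied at run time, here is a minimal sketch of the validation the "Filter & Validate Files" node performs. The `isTransferable` helper and the `name`/`size` field names are illustrative assumptions, not the template's exact code; adjust them to match the output of your Google Drive node.

```javascript
// Sketch of the filtering step, assuming each Drive item exposes
// `name` and `size` (in bytes). Hypothetical helper, not the node's code.
const settings = {
  maxFileSizeMB: 50,
  allowedExtensions: ['.pdf', '.doc', '.docx', '.txt', '.jpg', '.png', '.zip', '.xlsx'],
};

function isTransferable(file, { maxFileSizeMB, allowedExtensions }) {
  const ext = file.name.slice(file.name.lastIndexOf('.')).toLowerCase();
  const sizeMB = Number(file.size) / (1024 * 1024);
  return allowedExtensions.includes(ext) && sizeMB <= maxFileSizeMB;
}

// Example input items and the resulting valid-file list
const files = [
  { name: 'report.pdf', size: 2 * 1024 * 1024 },
  { name: 'video.mp4', size: 10 * 1024 * 1024 },   // extension not allowed
  { name: 'archive.zip', size: 80 * 1024 * 1024 }, // exceeds maxFileSizeMB
];
const validFiles = files.filter((f) => isTransferable(f, settings));
```

Only `report.pdf` survives: the `.mp4` fails the extension check and the 80 MB archive exceeds the size limit.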
### 5. Schedule Configuration

#### 5.1 Modify Schedule Trigger

In the "Schedule Trigger" node, adjust the interval:

```json
{
  "rule": {
    "interval": [
      {
        "field": "hours",
        "hoursInterval": 6 // Change to the desired interval (hours)
      }
    ]
  }
}
```

Alternative schedule options:

- **Daily**: `"field": "days", "daysInterval": 1`
- **Weekly**: `"field": "weeks", "weeksInterval": 1`
- **Custom cron**: Use a cron expression for complex schedules

#### 5.2 Webhook Configuration

The webhook trigger is available at:

```
POST https://your-n8n-instance.com/webhook/webhook-transfer-status
```

Use this for manual triggers or external integrations.

### 6. Testing and Validation

#### 6.1 Test Connections

1. Test Google Drive: run the "Get Drive Files" node manually.
2. Test FTP: upload a test file using the "Upload to FTP" node.
3. Test Email: send a test email using the "Send Report Email" node.

#### 6.2 Run a Test Transfer

1. Activate the workflow.
2. Click "Execute Workflow" to run it manually.
3. Monitor execution in the workflow editor.
4. Check for any error messages or failed nodes.

#### 6.3 Verify Results

- **FTP Server**: Confirm files appear in the destination directory
- **Email**: Check that you receive the transfer report
- **Google Drive**: Verify transfer notes are saved to the specified folder

### 7. Monitoring and Maintenance

#### 7.1 Workflow Monitoring

- **Execution History**: Review past runs in the n8n interface
- **Error Logs**: Check failed executions for issues
- **Performance**: Monitor execution times and resource usage

#### 7.2 Regular Maintenance

- **Credential Renewal**: Google OAuth tokens may need periodic renewal
- **Storage Cleanup**: Consider archiving old transfer notes
- **Performance Tuning**: Adjust batch sizes or schedules based on usage
### 8. Troubleshooting

#### 8.1 Common Issues

Google Drive authentication errors:
- Verify OAuth2 credentials are correctly configured
- Check that the Google Drive API is enabled
- Ensure the redirect URI matches your n8n instance URL

FTP connection failures:
- Verify FTP server credentials and connectivity
- Check that firewall settings allow FTP connections
- Confirm the destination directory exists and has write permissions

Email delivery issues:
- Verify SMTP credentials and server settings
- Check whether your email provider requires app-specific passwords
- Ensure the sender email is authorized

File transfer failures:
- Check file size limits in the filter settings
- Verify the allowed file extensions include your file types
- Monitor FTP server disk space

#### 8.2 Debug Mode

Enable debugging by:

- Adding `console.log` statements in code nodes
- Using "Execute Workflow" with step-by-step execution
- Checking node outputs for data validation

### 9. Advanced Customizations

#### 9.1 Additional File Filters

Add custom filtering logic in the "Filter & Validate Files" node:

```javascript
// Example: Filter by modification date (last 7 days)
const isRecentFile = new Date(file.modifiedTime) > new Date(Date.now() - 7 * 24 * 60 * 60 * 1000);

// Example: Filter by folder location
const isInSpecificFolder = file.parents && file.parents.includes('specific-folder-id');
```

#### 9.2 Enhanced Reporting

Customize the email report template in the "Send Report Email" node:

```
📊 File Transfer Report Summary
Date: {{ new Date().toLocaleString('en-US') }}
Success Rate: {{ Math.round((successfulTransfers / totalFiles) * 100) }}%
```

#### 9.3 Integration with Other Services

Add nodes to integrate with:

- **Slack**: Send notifications to team channels
- **Discord**: Post updates to Discord servers
- **Webhook**: Trigger other workflows or systems
- **Database**: Log transfers to MySQL, PostgreSQL, etc.
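The two custom-filter snippets in section 9.1 operate on a single `file` variable; wrapped as functions they can be applied to every item the node receives. This is a sketch under the assumption that items follow the Google Drive API file shape (`modifiedTime` as an ISO string, `parents` as an array of folder IDs):

```javascript
// Runnable form of the date and folder filters from section 9.1.
// Field names follow the Google Drive API file resource; adapt as needed.
const SEVEN_DAYS_MS = 7 * 24 * 60 * 60 * 1000;

function isRecentFile(file, now = Date.now()) {
  // Keep files modified within the last 7 days
  return new Date(file.modifiedTime).getTime() > now - SEVEN_DAYS_MS;
}

function isInSpecificFolder(file, folderId) {
  // Keep files whose parent list contains the given folder ID
  return Array.isArray(file.parents) && file.parents.includes(folderId);
}
```

Both predicates can be chained inside the node's existing `filter` call alongside the size and extension checks.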
### 10. Security Considerations

#### 10.1 Credential Security

- Use environment variables for sensitive data
- Regularly rotate FTP and email passwords
- Implement least-privilege access for service accounts

#### 10.2 Network Security

- Use SFTP instead of FTP when possible
- Use VPN connections for sensitive transfers
- Monitor network traffic for unusual patterns

#### 10.3 Data Privacy

- Ensure compliance with data protection regulations
- Implement data retention policies for transfer logs
- Consider encryption for sensitive file transfers

## Support and Resources

### Documentation Links

- n8n Documentation
- Google Drive API Documentation
- n8n Community Forum

### Getting Help

If you encounter issues:

1. Check the troubleshooting section above.
2. Review n8n execution logs for error details.
3. Search the n8n community forum for similar issues.
4. Create a support ticket with detailed error information.

**Note**: Replace all placeholder values (URLs, credentials, IDs) with your actual configuration before running the workflow.
by Genzi
## Description

A comprehensive real estate chatbot automation system that handles customer inquiries, property searches, and appointment scheduling through intelligent conversation flows and email processing.

## How it works

This template creates an end-to-end real estate automation system that handles customer inquiries from initial contact through appointment booking.

### 1. Customer Entry Point
- A webhook receives customer messages from the chat interface
- Link detection checks whether the customer shared property URLs
- Smart routing: if a property link is found, fetch details immediately; otherwise proceed to chat

### 2. AI Content Processing
- Content filter (priority): blocks non-real-estate queries upfront
- Information extraction: scans messages for personal details and property requirements
- Human handoff detection: identifies requests for live agent assistance

### 3. Data Collection Phase
- Sequential gathering: personal info (name → phone → email), then property needs
- Smart validation: phone format, email structure, budget parsing
- No redundancy: never asks for information already provided
- PostgreSQL storage: saves customer data and conversation memory

### 4. Property Search & Matching
- A database query filters properties by type, location, budget, and availability
- Image enhancement: fetches property photos from media storage
- Results ranking: returns the top 5 matches sorted by price

### 5. AI Response Generation
- GPT-4 formatting creates engaging, professional property listings
- Visual enhancement: includes property images and key details
- Personalized tone: acknowledges customer preferences

### 6. Appointment Automation
- Gmail monitoring: checks for appointment confirmations every hour
- Calendar integration: creates, updates, and deletes appointments automatically
- Smart scheduling: checks availability and suggests alternatives for conflicts
- Email responses: sends confirmations and follow-ups

## Intelligence Features

### Context Awareness
- Remembers conversation history across sessions
- Builds a complete customer profile progressively
- Maintains property preferences throughout the interaction

### Smart Extraction
- Recognizes property types: HDB, Condo, Apartment
- Parses locations and MRT preferences automatically
- Handles various budget formats (SGD 2,500, $2500, etc.)
- Identifies timeline requirements and citizenship status

### Professional Handoffs
- Detects human agent requests with keyword matching
- Collects complete customer context before transfer
- Sends structured handoff emails with all requirements
- Ensures a smooth transition to live agents

## Technical Components

### AI Models
- **OpenAI GPT-4**: Main conversation handling and response formatting
- **GPT-4 Mini**: Appointment processing and email management
- **LangChain Memory**: Conversation context retention

### Database Integration
- **PostgreSQL**: Customer data, property listings, conversation history
- Property search with multi-criteria filtering
- Media storage integration for property images

### Communication Channels
- **Webhook API**: Primary chat interface
- **Gmail integration**: Appointment confirmations and notifications
- **Google Calendar**: Automated scheduling and availability checking

## Setup Requirements

1. Configure the database: PostgreSQL with property and customer tables.
2. Set up integrations: Gmail, Google Calendar, OpenAI API.
3. Customize prompts: adjust AI responses for your brand.
4. Test the workflow: verify end-to-end functionality.
5. Monitor performance: track conversation success rates.

The system is designed to handle the complete customer journey from initial inquiry to scheduled property viewing, with intelligent automation reducing manual work while maintaining high service quality.
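The Smart Extraction step mentions handling budget formats such as "SGD 2,500" and "$2500". A hypothetical sketch of that normalization (the function name and regex are illustrative, not the template's actual code):

```javascript
// Hypothetical budget parser: strips thousands separators and an optional
// currency prefix, returning a number or null when no amount is present.
function parseBudget(text) {
  const match = text.replace(/,/g, '').match(/(?:SGD|S?\$)?\s*(\d+(?:\.\d+)?)/i);
  return match ? Number(match[1]) : null;
}
```

This keeps the stored budget uniform regardless of how the customer typed it, which makes the later multi-criteria property query straightforward.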
by AppStoneLab Technologies LLP
# AI-Powered Hiring Pipeline: Auto-Screen CVs, Score Candidates & Send Interview Invites

Stop manually reading every CV. This workflow watches your inbox, extracts CV text using Mistral OCR, scores every candidate against your job description using Google Gemini AI, and automatically routes them: shortlisted candidates get a professional interview invite, rejected ones get a polite decline, and HR receives a full AI summary with the CV attached. All hands-free.

## Who Is This For?

- **HR teams and recruiters** at startups or growing companies who receive a high volume of CV emails
- **Technical hiring managers** who want AI-assisted pre-screening before spending time on interviews
- **Solo founders** who are hiring but don't have a dedicated recruiter
- **No-code automation builders** looking for a production-ready hiring automation template

## What Problem Does This Solve?

Manually reviewing CVs is time-consuming, inconsistent, and expensive. This workflow eliminates the bottleneck by automatically:

- Extracting CV text from PDF attachments (including scanned documents) via Mistral OCR
- Evaluating every candidate against your specific job description using Gemini AI
- Routing candidates and sending the right email to the right person, instantly

You focus on interviewing. The pipeline handles everything else.

## Key Features

- 📥 **Email-triggered**: fires automatically when a CV arrives in your inbox, no manual steps
- 📄 **Mistral OCR**: works on both digitally created and scanned/image-based PDF CVs
- 🤖 **Gemini AI scoring**: returns a 0–100 score, shortlist/reject decision, candidate summary, and key skills
- 🔀 **Smart routing**: shortlisted and rejected candidates are handled differently in the same workflow
- 📧 **3 HTML email templates**: HR notification (with CV attached), interview invite, and polite decline
- ⚙️ **Binary passthrough**: the original CV PDF is preserved and forwarded to HR's email as an attachment
- 📋 **Sticky note documentation**: every node is documented inside the workflow canvas

## How It Works (Step-by-Step)

1. 📥 **Watch Inbox**: the IMAP trigger fires when a new email arrives with a CV attachment.
2. 📄 **OCR Extraction**: Mistral's mistral-ocr-latest model reads the CV and outputs clean structured text.
3. 🤖 **AI Scoring**: Google Gemini evaluates the CV against your job description and returns structured JSON with the score, decision, candidate name, a 3–4 sentence summary, and the top 5 skills.
4. ⚙️ **Parse & Route**: a Code node cleans Gemini's response, extracts the candidate email from the IMAP from field, and passes the binary CV forward.
5. 🔀 **IF Decision**: routes shortlisted candidates to the true branch and rejected ones to the false branch.
6. 📧 **HR Email**: HR receives a branded email with the AI score, candidate summary, key skills, and the original CV attached.
7. 📧 **Interview Invite**: the shortlisted candidate receives a professional invitation with a scheduling link and a "What to Expect" section.
8. 📧 **Polite Decline**: the rejected candidate receives a warm, empathetic decline with a link to your careers page.

## 🛠️ Setup Instructions

### Step 1 - Credentials Required

You need to set up 4 credentials in n8n:

| Credential | Node Used | Where to Get It |
|---|---|---|
| IMAP account | Email Trigger | Your email provider settings (Gmail: use an App Password) |
| Mistral Cloud API | OCR Extraction | Mistral AI Studio → API Keys |
| Google Gemini (PaLM) API | AI Scoring | Google AI Studio → Get API Key |
| SMTP account | All 3 email nodes | Your email provider's SMTP settings |

> 💡 Gmail users: enable 2FA and generate an App Password for both IMAP and SMTP. Use `imap.gmail.com:993` and `smtp.gmail.com:587`.

### Step 2 - Update Email Addresses

In all 3 Send Email nodes, replace the placeholder emails:

- `fromEmail` → your sending address (e.g. hr@yourcompany.com)
- `toEmail` in the HR node → your HR team's inbox
- The candidate email fields are already dynamic (`{{ $json.candidate_email }}`)

### Step 3 - Add Your Job Description

Open the 🤖 AI Score CV (Gemini) node and replace the JOB DESCRIPTION: section in the prompt with your actual role requirements. The current template uses an AI Engineer JD from AppStoneLab as a working example.

### Step 4 - Add Your Interview Scheduling Link

In the 📧 Send Interview Invite to Candidate node, find `YOUR_CALENDLY_OR_CAL_LINK_HERE` in the HTML and replace it with your actual booking link (Calendly, Cal.com, TidyCal, etc.).

## How to Customize for Your Use Case

| What to Change | Where | Example |
|---|---|---|
| Job description | Gemini node prompt | Swap in your own role requirements |
| Scoring threshold | IF node condition | Change "shortlisted" to score-based logic, e.g. `score >= 70` |
| Company name & branding | All 3 HTML email templates | Replace "AppStoneLab Technologies" with your company |
| Careers page URL | Decline email HTML | Replace appstonelab.com/career with your URL |
| AI model | Gemini node | Switch to gemini-3-flash-preview or gemini-3.1-pro-preview for different speed/quality |
| Watched mailbox | IMAP trigger | Change INBOX to a dedicated folder like INBOX.careers |
| Interview questions | Invite email HTML | Add/edit the "What to Expect" section steps |

## API Keys - Quick Links

- **Mistral AI** → Mistral AI Studio. The free tier includes OCR. Pricing: $1 per 1,000 pages for mistral-ocr-latest.
- **Google Gemini** → Google AI Studio. A free tier is available; gemini-3-flash-preview is fast and cheap for production.
- **Gmail App Password** → Google App Passwords
- **n8n IMAP docs** → docs.n8n.io/integrations/core-nodes/n8n-nodes-base.emailimap
- **n8n SMTP docs** → docs.n8n.io/integrations/core-nodes/n8n-nodes-base.sendemail

## Important Notes

- The IMAP Format field must be set to Resolved (not Simple); this is required for binary attachment data to flow correctly through the workflow.
- The Code node carries the binary CV attachment forward from the IMAP trigger to the HR email node. If you add new nodes between them, make sure binary passthrough is preserved.
- Mistral OCR works on both text-based and scanned/image PDFs, making it more reliable than n8n's built-in Extract from File node.
- The workflow uses the `from.value[0].address` path to extract the candidate's email from the IMAP trigger output; this is the correct path for the Resolved format.

## 💬 Questions or Issues?

Drop a comment on this template or reach out on the n8n community forum. Happy to help you adapt this for your specific hiring use case.
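The "Scoring threshold" customization in the table above swaps the IF node's decision-string check for a numeric comparison. A minimal sketch of that logic, assuming the `score` field from Gemini's JSON output (the threshold value and function name are illustrative):

```javascript
// Score-based routing instead of matching the "shortlisted" string.
// 70 is an example cutoff; tune it per role.
const SHORTLIST_THRESHOLD = 70;

function route(candidate) {
  return candidate.score >= SHORTLIST_THRESHOLD ? 'shortlisted' : 'rejected';
}
```

A numeric threshold makes the cutoff explicit and auditable, rather than depending on the model choosing the exact decision wording.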
by OwenLee
📚 In the social and behavioral sciences (e.g., psychology, sociology, economics, management), researchers and students often need to normalize academic paper metadata and extract variables before any literature review or meta-analysis.

🧩 This workflow automates the busywork. Using an LLM, it processes CSV/XLSX/XLS files (exported from WoS, Scopus, EndNote, Zotero, or your own spreadsheets) into normalized metadata and extracted variables, and writes a neat table to Google Sheets.

🔗 Example Google Sheet: click me

## 👥 Who is this for?

- 🎓 Undergraduate and graduate students or researchers in soft-science fields (psychology, sociology, economics, business)
- ⏱️ People who don't have time to read full papers and need quick overviews
- 📊 Anyone who wants to automate academic paper metadata normalization and variable extraction to speed up a literature review

## ⚙️ How it works

1. 📤 Upload an academic paper file (CSV/XLSX/XLS) in chat.
2. 📑 The workflow creates a Google Sheets spreadsheet with two tabs: Checkpoint and FinalResult.
3. 🔎 A structured-output LLM normalizes core metadata (title, abstract, authors, publication date, source) from the uploaded file and writes it to Checkpoint; 📧 a Gmail notification is sent when finished.
4. 🧪 A second structured-output LLM uses that metadata to extract variables (Independent Variable, Dependent Variable) and writes them to FinalResult; 📧 you'll get a second Gmail notification when done.

## 🛠️ How to set up

### 🔑 Credentials

- **Google Sheets OAuth2** (read/write)
- **Gmail OAuth2** (send notifications)
- **Google Gemini (or any LLM you prefer)**

### 🚀 Quick start

1. Connect Google Sheets, Gmail, and Gemini (or your LLM) credentials.
2. Open File Upload Trigger → upload your CSV/XLSX/XLS file and type a name in chat (used as the Google Sheets spreadsheet title).
3. Watch your inbox for status emails and open the Google Sheets spreadsheet to review Checkpoint and FinalResult.

## 🎛 Customization

- 🗂️ Journal lists: edit the Journal Rank Classifier code node to add or remove titles. The default list is for business/management journals; swap it for a list from your own field.
- 🔔 Notifications: replace Gmail with Slack, Teams, or any channel you prefer.
- 🧠 LLM outputs: need different metadata or extracted data? Edit the LLM's system prompt and Structured Output Parser.

## 📝 Note

- 📝 Make sure your file includes abstracts. If the academic paper data you upload doesn't contain an abstract, the extracted results will be far less useful.
- 🧩 CSV yields no items? Encoding mismatches can break the workflow. If this happens, convert the CSV to .xls or .xlsx and try again.

## 📩 Help

Contact: owenlzyxg@gmail.com
by Mohamed Abubakkar
## Workflow Overview

This workflow is an AI-powered business intelligence agent designed for founders and business owners. It automatically collects key business metrics, calculates performance KPIs, applies decision logic, uses AI reasoning, and sends clear, actionable notifications, without dashboards or manual reports.

## Key Features

- ✅ Aggregates multiple data sources (MSSQL, Google Analytics, Google Sheets)
- ✅ Calculates critical KPIs: ROAS, CAC, revenue & user growth
- ✅ Applies rule-based decision logic for business risk and opportunity detection
- ✅ AI-powered reasoning: summarizes insights and recommends actions
- ✅ Multi-channel notifications: Email, WhatsApp, Slack, Telegram
- ✅ Fully automated daily execution via Cron trigger
- ✅ Enterprise-ready: error handling, structured data, KPI validation

## Setup & Requirements

- API access to data sources (MSSQL, Google Analytics, Google Sheets)
- OpenAI or Google Gemini API for AI reasoning
- Messaging integration: Gmail, Twilio (WhatsApp), Slack, Telegram

## Workflow Flow

1. Cron Trigger: runs daily at a chosen time
2. Data Collection: revenue, users, marketing spend, website analytics
3. Merge Node: combines all data sources
4. Function Node: consolidates everything into a single JSON object
5. KPI Calculation: calculates ROAS, CAC, and growth rates
6. Business Logic Engine: identifies risks and opportunities
7. AI Reasoning Agent: summarizes insights, suggests actions
8. Notification Formatter: builds a founder-friendly message
9. Notification Delivery: sends via WhatsApp, Email, Slack, or Telegram

## Example Data

The data below is gathered from the different channels:

```json
{ "revenue": 4290, "registeredUsers": 20, "totalUsers": 3, "adSpend": 800 }
```

Rule-based logic is then applied to detect potential risks or opportunities:

```json
{ "ROAS": 5.36, "CAC": 40, "agentStatus": "normal", "agentPriority": "low", "insights": ["Marketing campaigns are performing very well"] }
```

## Workflow Highlights

- Fully automated: runs daily without human intervention
- Integrates multiple business data sources
- Converts raw data into KPIs for actionable insight
- Applies both rule-based logic and AI reasoning
- Generates concise, human-friendly notifications
- Sends notifications to different channels
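The KPI figures in the example can be reproduced with a small Function-node sketch. Note the CAC formula here (ad spend divided by registered users) is an assumption inferred from the sample numbers, not necessarily the workflow's exact definition:

```javascript
// Sketch of the KPI Calculation step. ROAS = revenue / ad spend;
// CAC = ad spend / registered users (assumed from the sample data).
function calculateKpis({ revenue, adSpend, registeredUsers }) {
  return {
    ROAS: Math.round((revenue / adSpend) * 100) / 100,
    CAC: Math.round((adSpend / registeredUsers) * 100) / 100,
  };
}

// Using the example data from the different channels
const kpis = calculateKpis({ revenue: 4290, adSpend: 800, registeredUsers: 20 });
// 4290 / 800 = 5.36 (rounded), 800 / 20 = 40
```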
by SOLOVIEVA ANNA
# Audio Email to Japanese Transcript with AI Summary & Multi-Channel Notification

## Overview

This workflow turns audio attachments you receive by Gmail into Japanese transcripts and structured AI summaries, then saves everything to Google Drive and Google Sheets while notifying you via Gmail and Slack. Every time an email with a voice recording arrives, the audio is stored in a dated folder, fully transcribed in Japanese, summarized into clear meeting-style points, and logged so you can quickly review and search later.

## Who this is for

- People who get voice memos or meeting recordings as email attachments
- Teams that want clear Japanese transcripts plus action-item summaries from calls
- Anyone who wants audio notes automatically archived and searchable in Drive/Sheets

## How it works

1. **Trigger: new Gmail with audio attachment.** A Gmail Trigger watches your inbox, downloads attachments for each new email, and passes them into the workflow.
2. **Split & filter attachments.** A Code node splits the email into one item per attachment and normalizes the binary data to `binary.data`. A Filter node keeps only audio files (mp3, wav, m4a, ogg) and discards everything else.
3. **Create a date-based Drive folder & upload the audio.** A Code node builds a YYYY/MM folder path from the current date. A Google Drive node creates that folder (if it doesn't exist) under your chosen parent folder. A Merge node combines folder info with file info, and the audio file is uploaded into that folder so all recordings are organized by year/month.
4. **Transcribe audio to Japanese text.** An HTTP Request node calls the OpenAI Audio Transcriptions API (gpt-4o-transcribe) with the audio file. The prompt tells the model to produce a verbatim Japanese transcript (no summarization, no guessing), returned as plain text.
5. **Generate a structured AI summary.** The transcript is sent to an OpenAI Chat node (gpt-4o), which outputs JSON with:
   - `title`: a short Japanese title for the recording
   - `points`: key discussion points (array)
   - `decisions`: decisions made (array)
   - `actionItems`: action items with owner/deadline (array)
   A Set node then formats this JSON into a Markdown summary (`summaryContent`) with sections for 要点 / 決定事項 / アクションアイテム.
6. **Save transcript & summary files to Drive.** The transcript text is converted into a .txt file and uploaded to the same YYYY/MM folder. The Markdown summary is converted into a .md file (e.g. xxx_summary.md) and uploaded as well. Each file is then shared in Drive so you have accessible web links to both transcript and summary.
7. **Log to Google Sheets.** A Code node collects the email subject, file name, full transcript, formatted summary, and Drive links into one JSON object. A Google Sheets node appends a new row with timestamp, subject, summary, transcript, and link so you get a running log of all processed audio.
8. **Notify via Gmail & Slack.** Finally, the workflow sends a Gmail message back to the original sender with the meeting summary and links, and posts a Slack notification in your chosen channel, including subject, file name, summary text, and Drive link.

## How to set up

1. Connect your Gmail, Google Drive, Google Sheets, Slack, and OpenAI credentials in the respective nodes.
2. In the Gmail Trigger, narrow the scope if needed (e.g. a specific label, sender, or inbox).
3. In the Drive nodes, set the parent folder where you want the YYYY/MM subfolders to be created.
4. In the Google Sheets node, point to your own spreadsheet and sheet name.
5. In the Slack node, select the channel where notifications should be posted.
6. Make sure your OpenAI credentials have access to both the audio transcription and chat endpoints.

## Customization ideas

- Filter by sender, subject keyword, or label so only certain emails are processed.
- Change the folder structure (e.g. ProjectName/YYYY/MM or YYYY/MM/DD) in the folder-path Code node.
- Adjust the transcription prompt (e.g. allow light punctuation clean-up, or use another language).
- Modify the summary format or add extra fields (e.g. meeting participants, project name) in the AI prompt and Markdown template.
- Send notifications to other tools: add branches for Notion, LINE, Teams, or additional Slack channels.
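The folder-path Code node's core logic can be sketched in a few lines. This is a minimal version assuming the default YYYY/MM layout; the function name is illustrative, and you would extend the template string for variants like ProjectName/YYYY/MM:

```javascript
// Build a "YYYY/MM" folder path from a date, zero-padding the month.
// UTC is used here for determinism; switch to local getters if your
// recordings should be grouped by local time.
function buildFolderPath(date = new Date()) {
  const year = date.getUTCFullYear();
  const month = String(date.getUTCMonth() + 1).padStart(2, '0');
  return `${year}/${month}`;
}
```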
by Rahul Joshi
## Description

This workflow evaluates newly added CVs for Diversity, Equity, and Inclusion (DEI) eligibility. It automatically ingests CVs from Google Drive, extracts key fields, analyzes them with Azure OpenAI, logs structured DEI outcomes in Google Sheets, and sends a concise DEI-focused summary email to the hiring manager. The entire flow prioritizes consistent, auditable DEI checks and controlled logic paths.

## What This Template Does

- Watches Google Drive for new CV files to trigger DEI evaluation.
- Downloads and extracts text/structured fields from PDF CVs.
- Assesses DEI eligibility using Azure OpenAI, following defined criteria and prompts.
- Appends DEI results (eligible/not eligible, rationale, confidence) to Google Sheets for tracking.
- Generates and sends a DEI-focused summary email to the hiring manager for review.

## Key Benefits

- Standardized DEI screening to support equitable hiring decisions.
- Centralized, structured logging in Sheets for transparency and audits.
- Automated DEI summaries for faster, consistent manager review.
- Reliable routing with true/false logic to enforce DEI evaluation steps.

## Features

- Google Drive trigger (fileCreated) for CV intake tied to DEI checks.
- PDF extraction mapped to fields relevant for DEI evaluation.
- Azure OpenAI Chat Model prompts tuned for DEI criteria and rationale.
- Google Sheets append with eligibility status, notes, and timestamps.
- Email node that delivers DEI summaries and next-step guidance.
- Logic branching (true/false) to control DEI evaluation and notifications.

## Requirements

- n8n instance (cloud or self-hosted).
- Google Drive access to the CV intake folder.
- Google Sheets access for DEI results logging.
- Azure OpenAI access and configured prompts reflecting DEI criteria.
- Email node credentials to send DEI summaries to managers.

## Step-by-Step Setup Instructions

1. Connect Google Drive and select the CV folder for the fileCreated trigger.
2. Configure the Download CV and Extract From PDF nodes to capture the fields needed for DEI checks.
3. Add Azure OpenAI credentials and set DEI-specific prompts (criteria, rationale, confidence).
4. Connect Google Sheets and select the target sheet; map columns for status, rationale, and timestamps.
5. Configure the Email to Manager node with a DEI-focused subject and template.
6. Test with sample CVs, verify sheet entries and email content, then enable the workflow.

## DEI-Focused Best Practices

- Clarify DEI criteria and document them in your prompt and sheet schema.
- Avoid including sensitive PII in emails; store only the fields necessary for DEI decisions.
- Use n8n Credentials; never hardcode API keys or private data.
- Maintain an audit trail (timestamps, model version, prompt version, decision rationale).
- Periodically review prompts and sheet schema to align with policy updates.
by Julian Kaiser
Automatically Scrape Make.com Job Board with GPT-5-mini Summaries & Email Digest Overview Who is this for? Make.com consultants, automation specialists, and freelancers who want to catch new client opportunities without manually checking the forum. What problem does it solve? Scrolling through forum posts to find jobs wastes time. This automation finds new postings, uses AI to summarize what clients need, and emails you a clean digest. How it works: Runs on schedule → scrapes the Make.com professional services forum → filters jobs from last 7 days → AI summarizes each posting → sends formatted email digest. Use Cases Freelancers: Get daily job alerts without forum browsing, respond to opportunities faster Agencies: Keep sales teams informed of potential clients needing Make.com expertise Job Seekers: Track contract and full-time positions requiring Make.com skills Detailed Workflow Scraping: HTTP module pulls HTML from the Make.com forum job board Parsing: Extracts job titles, dates, authors, and thread links Filtering: Only jobs posted within last 7 days pass through (configurable) AI Processing: GPT-5-mini analyzes each post to extract: Project type Key requirements Complexity level Budget/timeline (if mentioned) Email Generation: Aggregates summaries into organized HTML email with direct links Delivery: Sends via SMTP to your inbox Setup Steps Time: ~10 minutes Requirements: OpenRouter API key (get one here) SMTP credentials (Gmail, SendGrid, etc.) Steps: Import template Add OpenRouter API key in "OpenRouter Chat Model" node Configure SMTP settings in "Send email" node Update recipient email address Set schedule (recommended: daily at 8 AM) Run test to verify Customization Tips Change date range: Modify filter from 7 days to X days: {{now - X days}} Keyword filtering: Add filter module to only show jobs mentioning "API", "Shopify", etc. 
AI detail level: Edit prompt for shorter/longer summaries Multiple recipients: Add comma-separated emails in Send Email node Different AI model: Switch to Gemini or Claude in OpenRouter settings Team notifications: Add Slack/Discord webhook instead of email
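The configurable "last 7 days" filter described above could be sketched in a Code node roughly as follows — assuming each scraped item carries a `postedAt` ISO date string (the field name is an assumption; adjust it to whatever your parsing step produces). `DAYS` is the window you would change to widen or narrow the date range.

```javascript
// Sketch of the freshness filter: keep only jobs posted within the
// last DAYS days. `postedAt` is an assumed field name.
const DAYS = 7;

function isRecent(postedAt, now = new Date()) {
  const ageMs = now - new Date(postedAt);
  return ageMs >= 0 && ageMs <= DAYS * 24 * 60 * 60 * 1000;
}

// In an n8n Code node you might apply it like:
// return items.filter(i => isRecent(i.json.postedAt));
```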
by Paul Karrmann
LinkedIn Inbox Triage (Gmail Label to Notion + Slack) This n8n template demonstrates how to use AI to triage LinkedIn emails in your Gmail inbox, so you only see the messages worth your time. It filters out automated noise, scores sales likelihood, drafts quick replies for real conversations, stores everything in Notion, and sends you a Slack DM for items you should answer quickly. Good to know This workflow sends email content to an LLM. Do not use it with sensitive mailboxes unless you are comfortable with that. Cost depends on your model choice and token usage. The body is currently limited to 4000 characters to control spend. If you want a shorter run window, adjust the receivedAfter filter. How it works Runs on a daily schedule. Pulls emails from Gmail using a label you define (example: LinkedIn). Applies two filters: Keeps only invitations and messages Removes common automated notifications Fetches the full email body for better classification. Sends the message to an AI agent that returns strict structured JSON: action (reply_quick, review, ignore, block) relevancy_score (0 to 100) sales_likelihood (0 to 1) summary optional reply_draft Applies a quality gate to keep high signal messages. Writes the output to a Notion database as a ticket. Sends a Slack DM only for items marked reply_quick. How to use Create a Gmail label that captures LinkedIn emails, then add the label id to the Gmail node. Create a Notion database with fields matching the Notion node mapping. Connect your OpenAI, Gmail, Notion, and Slack credentials in n8n. Run once manually to verify mapping, then enable the workflow. Requirements Gmail account OpenAI API credentials (or compatible model node) Notion database Slack account Customising this workflow Make it more aggressive by increasing the sales threshold or raising the relevancy cutoff. Add more filter phrases for your own LinkedIn email language. Swap Slack DM for a channel post, or send a daily digest instead of per message. 
Add a redaction step before the AI node if you want to remove signatures or quoted replies.
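The quality gate described above — keeping only high-signal messages from the agent's structured JSON (`action`, `relevancy_score`, `sales_likelihood`) — could look roughly like this. The threshold values are illustrative defaults, not the template's actual numbers; raise or lower them as the customisation note suggests.

```javascript
// Sketch of the quality gate over the agent's strict JSON output.
// Thresholds (60, 0.4) are assumed defaults — tune to taste.
const ACTIONS = ['reply_quick', 'review', 'ignore', 'block'];

function passesGate(result, minRelevancy = 60, minSales = 0.4) {
  if (!ACTIONS.includes(result.action)) return false;          // reject malformed output
  if (result.action === 'ignore' || result.action === 'block') return false;
  // keep if either signal clears its threshold
  return result.relevancy_score >= minRelevancy
      || result.sales_likelihood >= minSales;
}
```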
by Rahul Joshi
Description Automate your weekly cross-platform social media analytics workflow with AI-powered insights. 📊🤖 This system retrieves real-time Twitter (X) and Facebook data, validates and merges the metrics, formats them via custom JavaScript, generates a visual HTML summary with GPT-4o, stores structured analytics in Notion, and broadcasts key results through Gmail and Slack — all in one seamless flow. Perfect for marketing, social media, and growth teams tracking weekly engagement trends. 🚀💬 What This Template Does 1️⃣ Starts on manual execution to fetch the latest performance data. 🕹️ 2️⃣ Collects live metrics from both Twitter (X API) and Facebook Graph API. 🐦📘 3️⃣ Merges API responses into one unified dataset for analysis. 🧩 4️⃣ Validates data completeness before processing; logs missing or invalid data to Google Sheets. 🔍 5️⃣ Uses JavaScript to normalize data into clean JSON structures for AI analysis. 💻 6️⃣ Leverages Azure OpenAI GPT-4o to generate a professional HTML analytics report. 🧠📈 7️⃣ Updates Notion’s “Growth Chart” database with historical metrics for record-keeping. 🗂️ 8️⃣ Sends the HTML report via Gmail to the marketing or analytics team. 📧 9️⃣ Posts a summarized Slack message highlighting key insights and platform comparisons. 💬 Key Benefits ✅ Eliminates manual social media reporting with full automation. ✅ Ensures clean, validated data before report generation. ✅ Delivers visually engaging HTML performance summaries. ✅ Centralizes analytics storage in Notion for trend tracking. ✅ Keeps teams aligned with instant Slack and Gmail updates. Features Dual-platform analytics integration (Twitter X + Facebook Graph). Custom JavaScript node for data normalization and mapping. GPT-4o model integration for HTML report generation. Real-time error logging to Google Sheets for transparency. Notion database update for structured performance tracking. Slack notifications with emoji-rich summaries and insights. 
Gmail automation for formatted weekly performance emails. Fully modular — easy to scale to other social platforms. Requirements Twitter OAuth2 API credentials for fetching X metrics. Facebook Graph API credentials for retrieving page data. Azure OpenAI credentials for GPT-4o AI report generation. Notion API credentials with write access to “Growth Chart.” Slack Bot Token with chat:write permission for updates. Google Sheets OAuth2 credentials for error logs. Gmail OAuth2 credentials to send HTML reports. Environment Variables TWITTER_API_KEY FACEBOOK_GRAPH_TOKEN AZURE_OPENAI_KEY NOTION_GROWTH_DB_ID SLACK_ALERT_CHANNEL_ID GOOGLE_SHEET_ERROR_LOG_ID GMAIL_MARKETING_RECIPIENTS Target Audience 📈 Marketing and growth teams analyzing engagement trends. 💡 Social media managers tracking cross-channel performance. 🧠 Data and insights teams needing AI-based summaries. 💬 Brand strategists and content teams monitoring audience health. 🧾 Agencies and operations teams automating weekly reporting. Step-by-Step Setup Instructions 1️⃣ Connect all required API credentials (Twitter, Facebook, Azure OpenAI, Notion, Gmail, Slack, Sheets). 2️⃣ Replace the username and page IDs in the HTTP Request nodes for your brand handles. 3️⃣ Verify the JavaScript node output structure for correct field mapping. 4️⃣ Configure the Azure GPT-4o prompt with your preferred tone and formatting. 5️⃣ Link your Notion database and confirm property names match (followers, likes, username). 6️⃣ Add recipient email(s) in the Gmail node. 7️⃣ Specify your Slack channel ID for automated alerts. 8️⃣ Test run the workflow manually to validate end-to-end execution. 9️⃣ Activate or schedule the workflow for regular weekly reporting. ✅
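The JavaScript normalization step (point 5️⃣ above) could be sketched as below: the two raw API payloads are mapped onto one uniform shape before the GPT-4o prompt. The field paths (`public_metrics.followers_count`, `fan_count`, etc.) are assumptions based on common Twitter/X v2 and Facebook Graph response shapes — verify them against your actual HTTP Request node output.

```javascript
// Illustrative normalization: map raw platform payloads onto one
// unified metric shape for the AI report step. Field paths are
// assumptions — match them to your real API responses.
function normalizeMetrics(twitter, facebook) {
  return [
    {
      platform: 'twitter',
      followers: twitter.public_metrics?.followers_count ?? 0,
      engagements: twitter.public_metrics?.like_count ?? 0,
    },
    {
      platform: 'facebook',
      followers: facebook.fan_count ?? 0,
      engagements: facebook.engagement?.count ?? 0,
    },
  ];
}
```

Keeping both platforms in one array with identical keys is what makes the downstream prompt and the Notion mapping platform-agnostic — adding another network is just one more object in the array.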
by Nirav Gajera
🏠 AI Real Estate Lead Qualifier — Typeform to Airtable with Smart Email Routing

Automatically qualify property leads, score them with AI, save to Airtable, and send personalised emails — all in seconds.

📖 Description
Every time a prospect submits your Typeform property inquiry, this workflow kicks in automatically. It extracts their details, runs them through an AI lead scoring engine, saves the record to Airtable, and sends a personalised email — a priority response for hot leads, a nurture email for everyone else. No manual review. No missed leads. No delayed follow-ups. This is built for real estate agencies, leasing companies, and property managers who receive inquiries through Typeform and want instant, intelligent responses without hiring extra staff.

✨ Key Features
**Instant lead capture** — triggers the moment Typeform receives a submission
**AI lead scoring** — Google Gemini classifies every lead as High, Medium, or Low automatically
**Smart email routing** — High leads get a priority response, others get a nurture sequence
**Airtable CRM sync** — every lead saved with full details + AI assessment
**Robust AI parsing** — 4-layer fallback system ensures AI output is always usable
**Personalised emails** — name, property type, location, budget pulled into every email automatically
**No-code maintenance** — update scoring rules in the prompt, email copy in the nodes

🔄 How It Works
Prospect submits Typeform inquiry
↓ Webhook receives the form payload
↓ Extract Typeform Fields → name, email, phone, property type, purpose, location, budget, requirements
↓ AI Lead Qualifier (Google Gemini) → lead_score: High / Medium / Low, plus intent, timeline, notes
↓ Parse AI Output (4-layer fallback)
↓ Save to Airtable CRM
↓ High Lead?
(IF check)
✅ YES → Priority Email (contact within 2 hours)
❌ NO → Nurture Email (contact within 1-2 days)

🤖 AI Lead Scoring Rules
The AI automatically classifies leads based on these criteria:

| Score | Criteria |
| :--- | :--- |
| 🔴 High | Has budget + specific location + clear purpose (investment or near-term buying) |
| 🟡 Medium | Partial information — needs follow-up to qualify further |
| 🟢 Low | Vague inquiry, missing budget or location, early exploration |

The AI also extracts:
**Intent** — what the lead is trying to achieve
**Timeline** — how urgently they need a property
**Notes** — assessment summary for your sales team

📧 Email Templates

High Lead Email
Subject: 🏠 [Name], we have properties matching your needs!
Dark blue header — premium feel
Full inquiry summary (property, purpose, location, budget, timeline)
AI assessment notes included
Promise: senior agent contacts within 2 hours

Nurture Email (Medium / Low)
Subject: Thanks for reaching out, [Name]!
Grey header — warm and professional
Inquiry summary (property, location, budget)
Preparation tips while they wait
Promise: contact within 1-2 business days

🛠 Setup Requirements

1. Typeform Setup
Create a Typeform with these fields and note their field ref IDs from the Typeform API:

| Field | Type | Required |
| :--- | :--- | :---: |
| Full Name | Short text | ✅ |
| Phone Number | Phone | ✅ |
| Email Address | Email | ✅ |
| Property Type | Multiple choice | ✅ |
| Purpose | Multiple choice | ✅ |
| Preferred Location | Short text | ✅ |
| Budget Range | Multiple choice | ✅ |
| Requirements | Long text | ✅ |
| Consent | Yes/No | Optional |

Connect the Typeform webhook: Typeform Dashboard → Connect → Webhooks. URL: your n8n webhook URL (the Production URL from the Webhook node). Method: POST.

2. Update Field Refs
In the Extract Typeform Fields Code node, update the REF_* constants to match your actual Typeform field reference IDs:
const REF_NAME = 'your-field-ref-here';
const REF_EMAIL = 'your-field-ref-here';
const REF_PHONE = 'your-field-ref-here';
// ... etc
Find your field refs from the Typeform API or by logging a test webhook payload.

3. Airtable Setup
Create an Airtable base with a table containing these fields:

| Field Name | Field Type |
| :--- | :--- |
| Full Name | Single line text |
| Email Address | Email |
| Mobile Phone Number | Phone |
| Property Type | Single line text |
| Purpose | Single line text |
| Preferred Location | Single line text |
| Budget Range | Single line text |
| Requirements | Long text |
| Submit Date | Single line text |
| Lead Score | Single line text |
| Intent | Single line text |
| Timeline | Single line text |
| Notes | Long text |

Update the Save to Airtable node with your Base ID and Table ID.

4. Credentials Required

| Credential | Used for | Free? |
| :--- | :--- | :--- |
| Google Gemini (PaLM) API | AI lead scoring | Free tier available |
| Airtable Personal Access Token | CRM save | Free |
| SMTP | Sending emails | Depends on provider |

5. Email Configuration
In both email nodes, update fromEmail (your sending address) and the SMTP credential (your email provider details). Recommended SMTP providers for testing: Mailtrap (sandbox), Gmail, SendGrid.

⚙️ Workflow Nodes

| Node | Type | Purpose |
| :--- | :--- | :--- |
| Webhook | Webhook | Receives Typeform POST payload |
| Extract Typeform Fields | Code | Parses answers by field ref ID |
| AI Lead Qualifier | AI Agent | Scores lead using Gemini |
| Google Gemini Chat Model | LLM | AI model for scoring |
| Parse AI Output | Code | Extracts JSON with 4-layer fallback |
| Save to Airtable | Airtable | Creates CRM record |
| High Lead? | IF | Routes by lead score |
| Priority Email | Email Send | Sends to High leads |
| Nurture Email | Email Send | Sends to Medium/Low leads |

🔧 Customisation
Change scoring criteria: Edit the prompt in the AI Lead Qualifier node. Add your own rules — e.g. score higher if the budget exceeds a threshold, or if the purpose is investment.
Add more email tiers: Add a third IF branch for Low leads with a different nurture sequence, or add a Slack/WhatsApp alert for High leads.
Use a different form: The Extract Typeform Fields node uses Typeform's field.ref system. Replace it with a Google Forms or Jotform parser by adjusting how you read the incoming webhook body.
Change the AI model: Replace the Google Gemini node with Claude, OpenAI, or any other LLM Chat Model node — no other changes needed.
Add lead deduplication: Before saving to Airtable, add a search step to check whether the email already exists, then skip or update instead of creating a duplicate.

🛡 Robustness Features
The Parse AI Output node uses a 4-layer fallback to ensure the workflow never breaks due to AI formatting issues:
Direct JSON parse — if Gemini returns clean JSON
Strip markdown fences — if Gemini wraps the JSON in code fences
Regex JSON extraction — if there's extra text around the JSON
Field-by-field regex — if the JSON is malformed, extracts each field individually
Default fallback — if all else fails, sets lead_score: Medium
This means the workflow continues and saves the record even if the AI returns unexpected output.

📊 Sample Airtable Record
Full Name: Sarah Johnson
Email Address: sarah@example.com
Mobile Phone: +91 9876543210
Property Type: 2BHK Apartment
Purpose: Investment
Preferred Location: Bandra, Mumbai
Budget Range: ₹80L - ₹1.2Cr
Requirements: Parking, gym, sea view
Submit Date: 2026-03-18T10:30:00Z
Lead Score: High
Intent: Serious buyer looking for investment property with strong rental yield
Timeline: Within 3 months
Notes: High-intent lead with clear budget and location. Has specific amenity requirements.
Recommend immediate callback.

📦 Requirements Summary
n8n (cloud or self-hosted)
Typeform account (any paid plan for webhooks)
Airtable account (free tier works)
Google AI Studio account for Gemini API key (free tier available)
SMTP email account

💡 Enhancement Ideas
**WhatsApp alert** — send an instant WhatsApp message to the sales team for every High lead via Twilio
**Calendar booking link** — include a Calendly link in the High lead email
**Lead follow-up reminder** — add a Wait node + reminder email if no reply in 48 hours
**Slack notification** — ping your team channel instantly for High leads
**Lead scoring dashboard** — connect Airtable to a dashboard tool for weekly reports

Built with n8n · Google Gemini AI · Typeform · Airtable · SMTP
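The layered parsing approach described in the Robustness Features section could be sketched roughly as follows. This is an illustrative re-implementation, not the template's actual Parse AI Output code — the fence-stripping and extraction patterns are assumptions, and the field-by-field layer is simplified to just the score.

```javascript
// Condensed sketch of a layered JSON-parsing fallback:
// 1) clean parse, 2) strip markdown fences, 3) extract the first
// {...} block, 4) field-by-field regex, 5) safe default.
function parseLeadScore(raw) {
  const attempts = [
    raw,
    raw.replace(/```(?:json)?/g, '').trim(),   // layer 2: strip fences
    (raw.match(/\{[\s\S]*\}/) || [''])[0],     // layer 3: extract {...}
  ];
  for (const text of attempts) {
    try {
      const parsed = JSON.parse(text);
      if (parsed && parsed.lead_score) return parsed;
    } catch (e) { /* fall through to the next layer */ }
  }
  // layer 4: field-by-field regex (simplified to the score only)
  const m = raw.match(/lead_score['"]?\s*[:=]\s*['"]?(High|Medium|Low)/i);
  if (m) return { lead_score: m[1] };
  return { lead_score: 'Medium' };             // layer 5: default
}
```

The key design point is that each layer is strictly more permissive than the last, so well-formed output takes the fast path and only genuinely malformed output reaches the regex and default layers.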
by Oneclick AI Squad
Simplify financial oversight with this automated n8n workflow. Triggered daily, it fetches cash flow and expense data from a Google Sheet, analyzes inflows and outflows, validates records, and generates a comprehensive daily report. The workflow sends multi-channel notifications via email and Slack, ensuring finance professionals stay updated with real-time financial insights. 💸📧

Key Features
Daily automation keeps cash flow tracking current.
Analyzes inflows and outflows for actionable insights.
Multi-channel alerts enhance team visibility.
Logs maintain a detailed record in Google Sheets.

Workflow Process
The **Every Day** node triggers a daily check at a set time.
**Get Cash Flow Data** retrieves financial data from a Google Sheet.
**Analyze Inflows & Outflows** processes the data to identify trends and totals.
**Validate Records** ensures all entries are complete and accurate.
If records are valid, it branches to **Send Email Daily Report** (to finance team members) and **Send Slack Alert** (to notify the team instantly).
**Logs to Sheet** appends the summary data to a Google Sheet for tracking.

Setup Instructions
Import the workflow into n8n and configure Google Sheets OAuth2 for data access.
Set the daily trigger time (e.g., 9:00 AM IST) in the "Every Day" node.
Test the workflow by adding sample cash flow data and verifying reports.
Adjust analysis parameters as needed for specific financial metrics.

Prerequisites
Google Sheets OAuth2 credentials
Gmail API key for email reports
Slack Bot Token (with chat:write permission)
Structured financial data in a Google Sheet

Google Sheet Structure: Create a sheet with these columns: Date, Cash Inflow, Cash Outflow, Category, Notes, Updated At

Modification Options
Customize the "Analyze Inflows & Outflows" node to include custom financial ratios.
Adjust the "Validate Records" filter to flag anomalies or missing data.
Modify email and Slack templates with branded formatting.
Integrate with accounting tools (e.g., Xero) for live data feeds.
Set different trigger times to align with your financial review schedule.
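The "Analyze Inflows & Outflows" step above could be sketched in a Code node along these lines — a minimal sketch assuming the sheet's column names match the structure described (Cash Inflow, Cash Outflow); adapt the keys if your sheet differs.

```javascript
// Rough sketch of the inflow/outflow analysis over sheet rows.
// Column names follow the documented sheet structure (assumed).
function analyzeCashFlow(rows) {
  let inflow = 0, outflow = 0;
  for (const row of rows) {
    inflow += Number(row['Cash Inflow']) || 0;   // blank cells count as 0
    outflow += Number(row['Cash Outflow']) || 0;
  }
  return { inflow, outflow, net: inflow - outflow };
}
```

The returned totals can feed both the email report template and the Slack message, and extending the function with custom ratios (as the Modification Options suggest) only means adding fields to the returned object.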