by Kirill Khatkevich
This workflow transforms your Meta Ads creatives into a rich dataset of actionable insights. It's designed for data-driven marketers, performance agencies, and analysts who want to move beyond basic metrics and understand the specific visual and textual elements that drive ad performance. By automatically analyzing every video and image with Google's AI (the Video Intelligence and Vision APIs), it systematically deconstructs your creatives into labeled data, ready for correlation with campaign results.

Use Case

You know some ads perform better than others, but do you know why? Is it the presence of a person, a specific object, the on-screen text, or the spoken words in a video? Answering these questions manually is nearly impossible at scale. This workflow automates the deep analysis process, allowing you to:

- **Automate Creative Analysis:** Stop guessing and start making data-backed decisions about your creative strategy.
- **Uncover Hidden Performance Drivers:** Identify which objects, themes, text, or spoken phrases correlate with higher engagement and conversions.
- **Build a Structured Creative Database:** Create a detailed, searchable log of every element within your ads for long-term analysis and trend-spotting.
- **Save Countless Hours:** Eliminate the tedious manual process of watching, tagging, and logging creative assets.

How it Works

The workflow is triggered on a schedule and follows a clear, structured path:

1. **Configuration & Ad Ingestion:** The workflow begins on a schedule (e.g., weekly on Monday at 10 AM). It fetches all active ads from a specific Meta Ads campaign, which you define in the Set Campaign ID node.
2. **Intelligent Branching (Video vs. Image):** An IF node inspects each creative to determine its type. **Video creatives** are routed to the Google Video Intelligence API pipeline; **image creatives** are routed to the Google Vision API pipeline.
3. **The Video Analysis Pipeline:** For each video, the workflow gets a direct source URL, downloads the file, and converts it to a Base64 string. It then initiates an asynchronous analysis job in the Google Video Intelligence API, requesting LABEL_DETECTION, SPEECH_TRANSCRIPTION, and TEXT_DETECTION. A loop with a wait timer periodically checks the job status until the analysis is complete. Finally, a Code node parses the complex JSON response, structuring the annotations (like detected objects with timestamps or full speech transcripts) into clean rows.
4. **The Image Analysis Pipeline:** For each image, the file is downloaded, converted to Base64, and sent to the Google Vision API. It requests a wide range of features, including label, text, logo, and object detection. A Code node parses the response and formats the annotations into a standardized structure.
5. **Data Logging & Robust Error Handling:** All successfully analyzed data from both pipelines is appended to a primary Google Sheet. The workflow is built to be resilient: if an error occurs (e.g., a video fails to be processed by the API, or an image URL is missing), a detailed error report is logged to a separate errors sheet, ensuring no data is lost and problems are easy to track.

Setup Instructions

To use this template, you need to configure a few key nodes:

1. **Credentials:** Connect your Meta Ads account and your Google account. The Google account needs access to Google Sheets and must have the Google Cloud Vision API and Google Cloud Video Intelligence API enabled in your GCP project.
2. **The Set Campaign ID node:** This is the primary configuration step. Open this Set node and replace the placeholder value with the ID of the Meta Ads campaign you want to analyze.
3. **Google Sheets nodes:** You need to configure two Google Sheets nodes:
   - **Add Segments data:** Select your spreadsheet and the specific sheet where you want to save the successful analysis results. Ensure your sheet has the following headers: campaign_id, ad_id, creative_id, video_id, file_name, image_url, source, annotation_type, label_or_text, category, full_transcript, confidence, start_time_s, end_time_s, language_code, processed_at_utc.
   - **Add errors:** Select your spreadsheet and the sheet you want to use for logging errors (e.g., a sheet named "errors"). Ensure this sheet has headers like: error_type, error_message, campaign_id, ad_id, creative_id, file_name, processed_at_utc.
4. **Activate the workflow:** Set your desired frequency in the Run Weekly on Monday at 10 AM (Schedule Trigger) node, then save and activate the workflow.

Further Ideas & Customization

This workflow provides the "what" inside your creatives. The next step is to connect it to performance.

- **Build a Performance Analysis Workflow:** Create a second workflow that reads this Google Sheet, fetches performance data (spend, clicks, conversions) for each ad_id from the Meta Ads API, and merges the two datasets. This will let you see which labels correlate with the best performance.
- **Create Dashboards:** Use the structured data in your Google Sheet as a source for a Looker Studio or Tableau dashboard to visualize creative trends.
- **Incorporate Generative AI:** Add a final step that sends the combined performance and annotation data to an LLM to automatically generate qualitative summaries and recommendations for each creative.
- **Add Notifications:** Use the Slack or Email nodes to send a summary after each run, reporting how many creatives were analyzed and whether any errors occurred.
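The parsing step in the video pipeline (step 3 of "How it Works") can be sketched as a plain function. This is a hypothetical sketch, not the template's exact code: the response field names follow the public Video Intelligence annotateVideo REST shape, the row keys match the sheet headers listed above, and the `meta` argument is an assumption.

```javascript
// Hypothetical sketch of the video-pipeline parsing Code node.
// Turns a LABEL_DETECTION response into one row per label segment,
// using the same column names as the "Add Segments data" sheet.
function parseLabelAnnotations(response, meta) {
  const rows = [];
  const result = (response.annotationResults || [])[0] || {};
  for (const label of result.segmentLabelAnnotations || []) {
    for (const seg of label.segments || []) {
      rows.push({
        campaign_id: meta.campaign_id,
        ad_id: meta.ad_id,
        source: 'video',
        annotation_type: 'LABEL_DETECTION',
        label_or_text: (label.entity || {}).description || '',
        category: ((label.categoryEntities || [])[0] || {}).description || '',
        confidence: seg.confidence,
        // REST time offsets arrive as strings like "2.5s"; parseFloat drops the unit
        start_time_s: parseFloat((seg.segment || {}).startTimeOffset || '0'),
        end_time_s: parseFloat((seg.segment || {}).endTimeOffset || '0'),
        processed_at_utc: new Date().toISOString(),
      });
    }
  }
  return rows;
}
```

The same pattern extends to textAnnotations and speechTranscriptions, each mapped onto the same row shape.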
by Cheng Siong Chin
Introduction

Automate price monitoring for e-commerce competitors—ideal for retailers, analysts, and pricing teams. Scrapes competitor sites, extracts pricing/stock data via AI, detects changes, and sends instant alerts for dynamic pricing strategies.

How It Works

Scrapes competitor URLs via Firecrawl and Apify, extracts data with AI, detects price/stock changes, logs to Google Sheets, and sends Telegram alerts.

Workflow Template

Trigger → Scrape URL → AI Extract → Parse → Merge Historical → Detect Changes → Update Sheets + Send Telegram Alert

Workflow Steps

- Trigger & Scrape → Manual/scheduled trigger → Firecrawl + Apify fetch competitor data
- AI Processing → Claude extracts product details → Parses and structures data
- Change Detection → Reads historical prices → Merges with current data → Identifies updates
- Output → Logs alerts to Sheets → Updates historical data → Sends Telegram notification

Setup Instructions

1. Firecrawl API: Get key from dashboard → Add to n8n
2. Apify API: Get key from console → Add to n8n → Configure actors
3. AI Model (Claude/OpenAI): Get API key → Add to n8n
4. Google Sheets OAuth2: Create OAuth2 in Google Cloud Console → Authorize in n8n → Enable API
5. Telegram Bot: Create via BotFather → Get token & chat ID → Add to n8n
6. Spreadsheet Setup: Create Sheet with required columns → Copy ID → Paste in workflow

Prerequisites

Self-hosted n8n, Firecrawl account, Apify account, Claude/OpenAI API key, Google account (Sheets OAuth2), Telegram bot

Customization

Add more URLs, adjust scraping intervals, change detection thresholds, switch to Slack/email alerts, integrate databases

Benefits

Saves 2+ hours daily, real-time tracking, automated alerts, historical analysis, multi-source scraping
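The Detect Changes step amounts to a diff between the freshly scraped rows and the historical rows read back from Sheets. A minimal sketch follows; the field names `url`, `price`, and `inStock` are assumptions, so match them to your spreadsheet columns.

```javascript
// Illustrative change detection: compare the current scrape against historical data
// and return only the items worth alerting on.
function detectChanges(current, historical) {
  const prevByUrl = new Map(historical.map((h) => [h.url, h]));
  const changes = [];
  for (const item of current) {
    const prev = prevByUrl.get(item.url);
    if (!prev) {
      changes.push({ ...item, changeType: 'new_product' });
    } else if (item.price !== prev.price) {
      changes.push({ ...item, changeType: 'price_change', oldPrice: prev.price });
    } else if (item.inStock !== prev.inStock) {
      changes.push({ ...item, changeType: 'stock_change' });
    }
  }
  return changes;
}
```

A "change detection threshold" (see Customization) would slot in here, e.g. only flagging price moves above a chosen percentage.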
by Jitesh Dugar
👤 Who's it for

This workflow is designed for employees who need to submit expense claims for business trips. It automates the process of extracting data from receipts/invoices, logging it to a Google Sheet, and notifying the finance team via email.

Ideal users:
- Employees submitting business trip expense claims
- HR or Admins reviewing travel-related reimbursements
- Finance teams responsible for processing claims

⚙️ How it works / What it does

1. An employee submits a form with trip information (name, department, purpose, dates) and uploads one or more receipts/invoices (PDF).
2. Uploaded files are saved to Google Drive for record-keeping.
3. Each PDF is passed to a DocClaim Assistant agent, which uses GPT-4o and a structured parser to extract structured invoice data.
4. The data is transformed and formatted into a standard JSON structure.
5. Two parallel paths are followed:
   - Invoice records are appended to a Google Sheet for centralized tracking.
   - A detailed HTML email summarizing the trip and expenses is generated and sent to the finance department for claim processing.

🛠 How to set up

1. Create a form to capture: Employee Name, Department, Trip Purpose, From Date / To Date, and a Receipt/Invoice File Upload (multiple PDFs).
2. Configure the file upload node to store files in a specific Google Drive folder.
3. Set up the DocClaim Agent using GPT-4o (or any LLM with document analysis capability) and an output parser for standardizing extracted receipt data (e.g., vendor, total, tax, date).
4. Transform the extracted data into a structured claim record (Code node).
5. Path 1: Save records to a Google Sheet (one row per expense).
6. Path 2: Format the employee + claim data into a dynamic HTML email, then use the Send Email node to notify the finance department (e.g., finance@yourcompany.com).

✅ Requirements

- Jotform account with an expense form set up (sign up for free)
- n8n running with access to:
  - Google Drive API (for file uploads)
  - Google Sheets API (for logging expenses)
  - Email node (SMTP or Gmail for sending)
  - GPT-4o or an equivalent LLM with document parsing ability
- PDF invoices with clear formatting
- Shared Google Sheet for claim tracking
- Optional: shared inbox for the finance team

🧩 How to customize the workflow

- **Add approval steps**: route the email to a manager before finance
- **Attach original PDFs**: include uploaded files in the email as attachments
- **Localize for other languages**: adapt form labels, email content, or parser prompts
- **Sync to an ERP or accounting system**: replace the Google Sheet with QuickBooks, Xero, etc.
- **Set limits/validation**: enforce a maximum claim per trip or required fields before submission
- **Auto-tag expenses**: add categories (e.g., travel, accommodation) for better reporting
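The Code node in step 4 ("Transform the extracted data into a structured claim record") might look like the sketch below. All field names are assumptions based on the parser output described above (vendor, total, tax, date); align them with your own form and parser.

```javascript
// Hypothetical transform: merge the form data with one parsed invoice
// into a single claim record, one row per expense.
function toClaimRecord(employee, invoice) {
  return {
    employeeName: employee.name,
    department: employee.department,
    tripPurpose: employee.purpose,
    fromDate: employee.fromDate,
    toDate: employee.toDate,
    vendor: invoice.vendor || 'Unknown',
    invoiceDate: invoice.date || '',
    total: Number(invoice.total) || 0, // coerce "220.50"-style strings from the parser
    tax: Number(invoice.tax) || 0,
    submittedAt: new Date().toISOString(),
  };
}
```

Both paths (the Sheets append and the HTML email) can then consume the same record.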
by DigiMetaLab
How it works:

1. **Daily Trigger:** Every morning at 8 AM, the workflow is automatically triggered.
2. **Fetch Trending Topics:** The workflow collects trending topics from external sources, such as news RSS feeds and popular Reddit posts. These trends are merged and summarized to provide up-to-date context for content generation.
3. **Read Active Campaigns:** The workflow reads all rows from the "Active Campaigns" Google Sheet, but only processes campaigns with a status of "active" to avoid generating content for paused or inactive campaigns.
4. **Enrich Campaigns with Trends:** Each active campaign is enriched with the latest trending topics, so the generated content can reference current events or popular themes.
5. **AI Content Generation:** For each enriched campaign, Groq AI generates:
   - An engaging post caption tailored to the platform and target audience
   - Creative direction with visual suggestions
   - Relevant hashtags (5-10)
   - A best-posting-time recommendation for the platform
6. **Quality Scoring:** The workflow calculates a quality score for each generated content idea, considering factors like caption length, hashtag count, and creative direction.
7. **Append to Google Sheets:** The generated content ideas, along with their quality scores and other details, are appended to the "Daily Content Plan" Google Sheet for record-keeping and team collaboration.
8. **Schedule in Google Calendar:** For each campaign, an event is created in Google Calendar with the content details and recommended posting time, ensuring the team is reminded to review or publish the content.
9. **Daily Email Summary (Optional):** At the end of the process, a summary email can be sent to the team, including statistics such as the number of campaigns processed, the average quality score, and a platform breakdown.

Set up steps:

1. **Prepare your Google Sheets:** Create a sheet named "Active Campaigns" with columns: Project Name, Theme, Target Audience, Platform, and Status (to mark campaigns as active/inactive). Create another sheet named "Daily Content Plan" with columns for Project Name, Date, Platform, Caption, Creative Direction, Hashtags, and any other details you want to track.
2. **Connect Google services to n8n:** In n8n, set up and authenticate your Google Sheets and Google Calendar credentials. You can find authentication information in the n8n documentation for Google credentials.
3. **Add a Cron node:** Drag in a Cron node and set it to trigger every day at 8:00 AM.
4. **Read campaigns from Google Sheets:** Add a Google Sheets node, set the operation to "Read Rows", and select your "Active Campaigns" sheet. (Optional) Use a Filter or IF node to process only rows where Status is "active".
5. **(Optional) Fetch trending topics:** If you want to enrich your content with trending topics, add nodes to fetch data from RSS feeds, Reddit, or other sources.
6. **Process each campaign:** Use a SplitInBatches node to process each campaign row individually.
7. **Generate content ideas with Groq AI:** Add a Groq AI node (or an OpenAI node if Groq is not available). Configure the prompt to generate a content idea using the campaign's theme, target audience, and platform. You can reference fields from the Google Sheets node using expressions like $("Google Sheets").item.json['Theme'].
8. **Append results to "Daily Content Plan":** Add another Google Sheets node, set the operation to "Append", select your "Daily Content Plan" sheet, and map the generated content fields to the appropriate columns.
9. **Schedule events in Google Calendar:** Add a Google Calendar node and set the operation to "Create Event". Use the project name and content idea for the event title and description, and set the event time as needed.
10. **(Optional) Send a daily summary email:** Add an Email node to send a summary of the day's content plan to your team.
11. **Test the workflow:** Run the workflow manually to ensure all steps work as expected. Check that new content ideas appear in the "Daily Content Plan" sheet and that events are created in Google Calendar.
12. **Activate the workflow:** Once you've confirmed everything works, activate the workflow so it runs automatically every morning.
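The quality-scoring step is described only in outline. One plausible implementation, using the factors named above (caption length, hashtag count, creative direction) but with entirely assumed weights and thresholds, is:

```javascript
// Illustrative quality score (0-100). The factors mirror the description above;
// the weights and ranges are assumptions, so tune them to your own standards.
function scoreContent(idea) {
  let score = 0;
  const len = (idea.caption || '').length;
  if (len >= 50 && len <= 300) score += 40;      // caption in a useful length range
  else if (len > 0) score += 20;                 // present, but too short or too long
  const tags = idea.hashtags || [];
  if (tags.length >= 5 && tags.length <= 10) score += 30;  // matches the 5-10 target
  else if (tags.length > 0) score += 15;
  if ((idea.creativeDirection || '').length > 20) score += 30; // non-trivial direction
  return score;
}
```

A Code node running something like this would emit the score that later lands in the "Daily Content Plan" sheet.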
by Oneclick AI Squad
This automated n8n workflow transforms uploaded radiology images into professional, patient-friendly PDF reports. It uses AI-powered image analysis to interpret medical scans, simplify technical terms, and produce clear explanations. The reports are formatted, converted to PDF, stored in a database, and sent directly to patients via email, ensuring both accuracy and accessibility.

🏥 Workflow Overview:

Simple process flow: 1. Upload Image → 2. AI Analysis → 3. Generate Report → 4. Send to Patient

🔧 How It Works:

1. **Webhook Trigger** - Receives image uploads via POST request
2. **Extract Image Data** - Processes patient info and image data
3. **AI Image Analysis** - Uses GPT-4 Vision to analyze the radiology image
4. **Process Analysis** - Structures the AI response into readable sections
5. **Generate PDF Report** - Creates a clean, well-formatted HTML report
6. **Convert to PDF** - Converts the HTML to a downloadable PDF
7. **Save to Database** - Logs all reports in Google Sheets
8. **Email Patient** - Sends the report via email
9. **Return Response** - Confirms successful processing

📊 Key Features:

- **AI-Powered Analysis** using GPT-4 Vision
- **Patient-Friendly Language** (no medical jargon)
- **Professional PDF Reports** with clear sections
- **Email Delivery** with the report attached
- **Database Logging** for record keeping
- **Simple Webhook Interface** for easy integration

🚀 Usage Example:

Send a POST request to the webhook with:

{ "patient_name": "John Smith", "patient_id": "P12345", "scan_type": "X-Ray", "body_part": "Chest", "image_url": "https://example.com/xray.jpg", "doctor_name": "Dr. Johnson", "patient_email": "john@email.com" }

⚙️ Required Setup:

- OpenAI API - for GPT-4 Vision image analysis
- PDF conversion service - HTML-to-PDF converter
- Gmail account - for sending reports
- Google Sheets - for logging reports (replace YOUR_REPORTS_SHEET_ID with your actual sheet ID)

Want a tailored workflow for your business? Our experts can craft it quickly. Contact our team.
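The "Process Analysis" step could split the model's narrative into the report's sections along these lines. The heading names below are assumptions; align them with whatever section headings your GPT-4 Vision prompt asks the model to emit.

```javascript
// Hypothetical structuring step: split the AI narrative into named report sections.
// Assumes the prompt asks the model to use "Findings", "Explanation", and
// "Recommendations" headings, one per line.
function structureAnalysis(text) {
  const sections = { findings: '', explanation: '', recommendations: '' };
  let current = 'findings';
  for (const line of text.split('\n')) {
    const h = line.trim().toLowerCase();
    if (h.startsWith('findings')) { current = 'findings'; continue; }
    if (h.startsWith('explanation')) { current = 'explanation'; continue; }
    if (h.startsWith('recommendations')) { current = 'recommendations'; continue; }
    sections[current] += (sections[current] ? '\n' : '') + line;
  }
  return sections;
}
```

The downstream HTML template can then render each section under its own patient-friendly heading.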
by Hans Wilhelm Radam
📌 Title (SEO-Friendly)

Automate Facebook Messenger orders to Google Sheets and Google Calendar

Introduction

This workflow automates Facebook Messenger order management by connecting your Facebook Page with Google Sheets and Google Calendar. It's designed to help small businesses save time, reduce errors, and streamline order-taking. Every time a customer messages your page, they receive a structured order form, their responses are parsed, and the details are saved directly to Google Sheets. The same workflow also creates a Google Calendar event, ensuring you never miss a delivery or pickup schedule.

Who's It For

- **Small businesses** selling products through Facebook Messenger.
- **Entrepreneurs** who want to eliminate manual order-taking.
- **Teams** that need a centralized order tracker (Google Sheets) and automatic reminders (Google Calendar).

How It Works

1. Listen to incoming messages on Facebook Messenger.
2. Send an automated greeting and order form to the customer.
3. Parse their responses (items, quantity, payment method, etc.).
4. Save order details into Google Sheets for easy tracking.
5. Create a matching Google Calendar event for the order date/time.
6. Send a confirmation message and an optional upsell suggestion.

Requirements

- **Facebook Page** with Messenger enabled.
- **Meta for Developers account** to create a Facebook App and generate a Page Access Token.
- **Google Sheets** account with a spreadsheet containing the following columns: Date, Customer Name, Order Details, Payment Method, Order Status, Notes.
- **Google Calendar** account for order scheduling.
- **n8n instance** (cloud or self-hosted).

💡 Security Best Practice: Store your Page Access Token and Google credentials in n8n Credentials (not hardcoded in nodes).

Setup Instructions

1. **Facebook Messenger connection:** Go to Meta for Developers, create a Messenger App, and generate a Page Access Token. Copy the Webhook URL from your n8n Webhook Trigger node, then add and verify the webhook URL in your Facebook Page settings.
2. **Google Sheets setup:** Create a new spreadsheet named Messenger Orders with the columns Date, Customer Name, Order Details, Payment Method, Order Status, Notes. Share the sheet with the Google account connected in n8n.
3. **Google Calendar setup:** Connect your Google Calendar credentials in n8n and select the calendar where orders should be added.
4. **Import & configure the workflow:** Download this workflow template, replace the placeholders ({{YOUR_PAGE_ACCESS_TOKEN}}, {{YOUR_GOOGLE_SHEET_ID}}, etc.), and test by sending a message to your Facebook Page.

Customization

- **Personalize messages** in the Messenger node (greeting, upsell suggestions).
- Add extra fields such as delivery address or contact number to both the form and the Google Sheet.
- Extend the workflow by adding Telegram, Email, or SMS notifications for customers or staff.
- Use Filter nodes to route VIP orders or high-value purchases to a separate workflow.

⚡ Final Flow: Facebook Messenger → Order Form → Google Sheets → Google Calendar → Customer Confirmation

💬 Call to Action: Clone this workflow, connect your accounts, and start automating your Messenger orders in minutes!
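The "Parse their responses" step could work like the sketch below if the order form asks customers to reply in "Label: value" lines. The labels here are assumptions; match them to the form your Messenger node actually sends.

```javascript
// Illustrative parser for a "Label: value" style Messenger reply.
function parseOrderReply(text) {
  const order = {};
  for (const line of text.split('\n')) {
    const idx = line.indexOf(':');
    if (idx === -1) continue; // skip lines that are not "Label: value"
    const key = line.slice(0, idx).trim().toLowerCase();
    const value = line.slice(idx + 1).trim();
    if (key === 'name') order.customerName = value;
    else if (key === 'order') order.orderDetails = value;
    else if (key === 'payment') order.paymentMethod = value;
    else if (key === 'notes') order.notes = value;
  }
  return order;
}
```

The resulting object maps directly onto the sheet columns (Customer Name, Order Details, Payment Method, Notes).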
by Mohammad Abubakar
This n8n template demonstrates how to capture website leads via a webhook, validate the data, optionally enrich it, store it in your CRM (HubSpot) or a simple Google Sheet, and instantly notify your team via email and Slack. It is ideal for agencies, freelancers, SaaS founders, and small sales teams who want every lead recorded and followed up automatically within seconds.

Good to know

- The workflow supports two storage options: HubSpot or Google Sheets (choose one branch).
- Enrichment (Clearbit/Hunter) is optional and can be disabled with a single toggle/IF branch.
- Consider adding anti-spam (honeypot/captcha) if your form gets abused.

How it works

1. **Webhook receives the lead:** Your website form sends a POST request to the Webhook URL with lead fields (name, email, message, etc.).
2. **Validation & normalization:** The workflow trims and normalizes fields (like lowercasing the email) and checks required fields. If invalid, it returns an error response to the website.
3. **Optional enrichment (Clearbit/Hunter):** If enrichment is enabled, the workflow calls an enrichment API and merges the results into the lead object (industry, company size, domain, etc.). If enrichment fails, the workflow continues (it doesn't block lead capture).
4. **Save lead to CRM (choose one):**
   - **HubSpot branch**: find the contact by email → create or update the contact record
   - **Google Sheets branch**: look up the row by email → update if found → otherwise append a new row
5. **Instant notifications:** Sends an email to your team and posts a Slack message to a channel, optionally including a CRM/Sheet link.
6. **Success response to the website:** Returns a 200 success response.

How to use?

1. Import the workflow into n8n.
2. Configure the Webhook node and copy the production URL into your website form's submit action.
3. Choose your storage path: enable the HubSpot nodes OR the Google Sheets nodes.
4. Add credentials: Slack credential; (optional) HubSpot / Google Sheets; (optional) Clearbit/Hunter keys in the HTTP Request node.
5. Send a test lead from your website and confirm:
   - Lead saved correctly
   - Email received
   - Slack notification posted
   - Website receives a 200 response

Requirements

- An n8n instance (cloud or self-hosted)
- One of: HubSpot account (for CRM storage), or Google account + Google Sheets (for spreadsheet storage)
- Slack workspace + Slack credentials
- Optional: Clearbit/Hunter account for enrichment
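The validation & normalization step could be implemented along these lines. The required fields and the deliberately simple email regex are assumptions; extend both to match your form.

```javascript
// Sketch of lead validation and normalization: trim fields, lowercase the email,
// and collect human-readable errors for the webhook's error response.
function validateLead(body) {
  const name = (body.name || '').trim();
  const email = (body.email || '').trim().toLowerCase();
  const message = (body.message || '').trim();
  const errors = [];
  if (!name) errors.push('name is required');
  if (!/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email)) errors.push('a valid email is required');
  return { valid: errors.length === 0, errors, lead: { name, email, message } };
}
```

An IF node can branch on `valid`: true continues to enrichment/storage, false returns the `errors` array to the website.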
by Marco Venturi
How it works

This workflow sources news from news websites. The information is then passed to an LLM, which processes the article's content. An editor approves or rejects the article. If accepted, the article is first published on the WordPress site and then on the LinkedIn page.

Setup Instructions

1. Credentials

You'll need to add credentials for the following services in your n8n instance:

- **News API**: A credential for your chosen news provider.
- **LLM**: Your API key for the LLM you want to use.
- **Google OAuth**: For both Gmail and Google Sheets.
- **WordPress OAuth2**: To publish articles via the API. See the WordPress Developer Docs.
- **LinkedIn OAuth2**: To share the post on a company page.

2. Node Configuration

Don't forget to:

- **Fetch News (HTTP Request)**: Set the query parameters (keywords, language, etc.) for your news source.
- **Basic LLM Chain**: Review and customize the prompt to match your desired tone, language, and style.
- **Approval request (Gmail)**: Set your email address in the Send To field.
- **HTTP Request WP - Push article**: Replace <site_Id> in the URL with your WordPress Site ID.
- **getImageId (Code Node)**: Update the array with your image IDs from the WordPress Media Library.
- **Create a post (LinkedIn)**: Enter your LinkedIn Organization ID.
- **Append row in sheet (Google Sheets)**: Select your Google Sheet file and the target sheet.
- **All Email Nodes**: Make sure the Send To field contains your email address.
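The getImageId Code node is only described, not shown. One plausible shape, with placeholder media IDs, is the sketch below; this is an assumption about the node's internals, not the template's exact code.

```javascript
// Hypothetical getImageId node: pick a random featured-image ID from a
// hand-maintained list of WordPress Media Library IDs (placeholders below).
const imageIds = [101, 102, 103]; // replace with your own Media Library IDs
function getImageId() {
  return imageIds[Math.floor(Math.random() * imageIds.length)];
}
```

The returned ID would then be set as the post's featured_media value in the WP push request.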
by Avkash Kakdiya
How it works

This workflow automates the handling of new lead responses received in Gmail. It captures emails with a specific label, analyzes the message using AI to determine sentiment, intent, urgency, next action, and priority, and then decides whether follow-up is needed. If required, it creates tasks in HubSpot, notifies the sales team via Slack, and logs all details into Google Sheets for tracking.

Step-by-step

1. **Trigger on New Lead Email:** The workflow starts whenever a new email with a defined Gmail label arrives, capturing the sender's email, subject, message snippet, and timestamp.
2. **Normalize Email Data:** Standardizes Gmail fields into structured values: leadEmail (sender's address), subject (email subject), message (email content snippet), source (Gmail), and receivedAt (timestamp).
3. **AI-Powered Lead Analysis:** Uses OpenAI to analyze the lead's message and extract:
   - Sentiment (Positive / Neutral / Negative)
   - Intent (Interested, Not Interested, Needs Info, Ready to Buy, Objection)
   - Urgency (High / Medium / Low)
   - Next Action (Call, Email, Demo, Quote, No Action)
   - Summary (1-2 sentence description)
   - Priority (Hot / Warm / Cold)
   Parsed results are merged with the original email data, and two flags are added: needsFollowUp (true/false) and isHighPriority (true/false).
4. **Decision: Needs Follow-Up?** If the AI suggests a follow-up action, the workflow continues; otherwise, the process stops here.
5. **Create HubSpot Task:** Automatically creates a HubSpot CRM task for the sales team, including the email subject, body, and lead details.
6. **Notify Sales Team on Slack:** Sends a formatted message to Slack with key lead insights: summary, lead email, priority, urgency, and date of analysis.
7. **Log Lead Data to Google Sheets:** Appends structured data to Google Sheets for record-keeping, storing all fields: Email, Date, Subject, Message, Sentiment, Intent, Urgency, Next Action, Summary, and Priority.

Why use this?

- Automates lead triage directly from Gmail.
- Saves time by using AI-powered analysis instead of manual review.
- Ensures no potential lead is missed by logging everything to Google Sheets.
- Provides instant sales team alerts on high-priority leads.
- Integrates seamlessly with HubSpot CRM for structured follow-up.
- Keeps your sales pipeline efficient, organized, and proactive.
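The two flags added after the AI analysis can be derived directly from the parsed fields. A minimal sketch, assuming the field values listed in the step-by-step above (the exact threshold logic is an assumption):

```javascript
// Illustrative flag derivation from the parsed AI analysis.
function deriveFlags(analysis) {
  return {
    // any concrete next action (Call, Email, Demo, Quote) means follow-up is needed
    needsFollowUp: analysis.nextAction !== 'No Action',
    // "Hot" priority or "High" urgency marks the lead as high priority
    isHighPriority: analysis.priority === 'Hot' || analysis.urgency === 'High',
  };
}
```

The follow-up decision node then only has to check `needsFollowUp`.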
by Shelly-Ann Davy
Who's it for

Women creators, homemakers-turned-entrepreneurs, and feminine lifestyle brands who want a graceful, low-lift way to keep an eye on competitor content and spark weekly ideas.

What it does

On a weekly schedule, this workflow crawls your competitor URLs with Firecrawl (HTTP Request), summarizes each page with OpenAI, brainstorms carousel/pin ideas with Gemini, appends results to Google Sheets (Date, URL, Title, Summary, Ideas), and sends you a single email digest (optional Telegram alert). It includes basic error notifications and a setup-friendly config node.

Requirements

- **HTTP credentials** for Firecrawl, OpenAI, and Gemini (no keys in nodes)
- **Google Sheets** OAuth credential
- A Sheets document with a target sheet/range (e.g., Digest!A:F)
- (Optional) Telegram bot + chat ID

How to set up

1. Open Set: Configuration (edit me) and fill in: competitorUrls (one per line), sheetsSpreadsheetId, sheetsRange, ownerEmail, emailTo, geminiModel, openaiModel.
2. Attach credentials to the HTTP and Sheets nodes.
3. Test by switching Cron to Every minute, then revert to weekly.

How it works

Cron → Firecrawl (per URL) → Normalize → OpenAI (summary) + Gemini (ideas) → Merge → Compile Row → Google Sheets append → Build one digest → Email (+ optional Telegram).

How to customize

- Add/remove competitors or change the weekly send time.
- Tweak the OpenAI/Gemini prompts for your brand voice.
- Expand columns in Sheets (e.g., category, tone, CTA).
- Swap email/Telegram for Slack/Notion, or add persistent logs.
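The Compile Row step maps each crawled page plus the two model outputs onto the sheet's columns. A sketch, using the column names from the description above (the input shapes are assumptions):

```javascript
// Illustrative "Compile Row" step: one Google Sheets row per crawled competitor URL.
function compileRow(page, summary, ideas) {
  return {
    Date: new Date().toISOString().slice(0, 10), // YYYY-MM-DD
    URL: page.url,
    Title: page.title || '',
    Summary: summary,        // OpenAI output
    Ideas: ideas.join('; '), // Gemini output, flattened for one cell
  };
}
```

The "Build one digest" step would then concatenate these rows into the single weekly email.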
by Nirav Gajera
🤖 AI Resume Screener — Google Forms to Automated Shortlisting & Email Automatically score every job application with AI, update your tracking sheet, and send personalised emails to candidates — all without human review. 📖 Description This workflow automates your entire first-round resume screening process. When a candidate submits your Google Form application, the workflow triggers, extracts their details, scores them against the job description using AI, writes the results back to your Google Sheet, and sends the right email — a congratulations to shortlisted candidates, a polite rejection to others. Your HR team only reviews candidates the AI has already scored 7 or above. Built for companies hiring across multiple roles simultaneously, HR teams wanting to reduce screening time, and startups with no dedicated recruiting staff. ✨ Key Features Google Sheets Trigger** — fires automatically when a new application row is added Duplicate prevention** — tracks processed rows with a Processed column, never scores the same application twice Multi-role support** — built-in Job Description map for 5 roles, easily extendable AI scoring 1–10** — Google Gemini scores each candidate against the JD with strengths, weaknesses, recommendation Direct sheet write via API** — writes score + grade + status to exact row using Google Sheets API Smart email routing** — score ≥ 7 → shortlisted (HR alert + candidate email), score < 7 → rejection email Rate limit protection** — Wait node between HR alert and candidate email prevents SMTP throttling Robust AI parsing** — 3-layer fallback ensures AI output always produces a usable result 🔄 How It Works New Google Form submission ↓ Google Sheets Trigger (rowAdded, polls every minute) ↓ Read All Rows → Filter Unprocessed Rows (col_18 / Processed ≠ 1) ↓ Extract Fields + Load Job Description → name, email, phone, position, experience, skills, cover note → loads matching JD from built-in map ↓ AI Resume Scorer (Google Gemini) → score: 1–10 → 
grade: Strong Yes / Maybe / No → strengths, weaknesses, recommendation, summary ↓ Parse AI Output (3-layer fallback) ↓ Build Sheet Request → Write to Sheet via API → writes score, grade, strengths, weaknesses, recommendation, summary, status, timestamp, Processed=1 ↓ Score ≥ 7? ✅ YES → Alert Email to HR + Wait 10s → Shortlist Email to Candidate ❌ NO → Rejection Email to Candidate 🤖 AI Scoring System The AI scores each candidate from 1 to 10 against the job description: | Score | Grade | Action | | :---: | :--- | :--- | | 8–10 | Strong Yes | Shortlisted ✅ | | 7 | Strong Yes / Maybe | Shortlisted ✅ | | 5–6 | Maybe | Rejected ❌ | | 1–4 | No | Rejected ❌ | For each application the AI also provides: Strengths** — what the candidate does well vs the JD Weaknesses** — gaps or concerns Recommendation** — e.g. "Invite for technical interview" Summary** — 2–3 sentence overall assessment 📋 Built-in Job Descriptions Five roles are pre-configured and auto-matched by position name: | Role | Key Requirements | | :--- | :--- | | Frontend Developer | 2+ yrs React/Vue, HTML/CSS/JS, REST APIs, Git | | Backend Developer | 2+ yrs Node.js/Python/Java, PostgreSQL/MongoDB, Docker | | UI/UX Designer | 2+ yrs UI/UX, Figma or Adobe XD, portfolio | | Project Manager | 3+ yrs PM, PMP/Scrum preferred, Jira/Asana | | Digital Marketing Executive | 2+ yrs digital marketing, Google Ads, Meta Ads, GA4 | Add more roles by editing the JD_MAP object in the Extract Fields + Load JD node. 📧 Email Templates Shortlist Email (to candidate) Subject: Congratulations [Name] - You have been shortlisted! 
Navy blue header Score table (score/10, grade, position, experience, skills) Strengths highlighted in green AI assessment in blue Promise: HR contacts within 2 business days HR Alert Email (internal) Subject: Strong Candidate - [Name] for [Position] Amber header — urgent feel Full candidate details + complete AI breakdown Strengths, weaknesses, recommendation, summary all included Rejection Email (to candidate) Subject: Your application for [Position] - Update Grey header — professional and neutral Polite decline with encouragement No score or feedback shared 🛠 Setup Requirements 1. Google Form Create a Google Form with these fields: | Field | Type | | :--- | :--- | | Full Name | Short answer | | Email Address | Short answer | | Phone Number | Short answer | | Select the Position You Are Applying For | Dropdown | | Years of Experience | Short answer | | Relevant Skills | Long answer | | Cover Note | Long answer | Link the form to a Google Sheet (Responses tab). 2. Google Sheet — Response Sheet Columns The form creates columns A–H automatically. Add these columns manually for AI results: | Col | Header | Filled by | | :---: | :--- | :--- | | A | Timestamp | Google Forms | | B | Select the Position... | Google Forms | | C | Full Name | Google Forms | | D | Email Address | Google Forms | | E | Phone Number | Google Forms | | F | Years of Experience | Google Forms | | G | Relevant Skills | Google Forms | | H | Upload CV | Google Forms | | I | Cover Note | Google Forms | | J | AI Score | This workflow | | K | Grade | This workflow | | L | Strengths | This workflow | | M | Weaknesses | This workflow | | N | Recommendation | This workflow | | O | Summary | This workflow | | P | Status | This workflow | | Q | Processed At | This workflow | | R | Processed | This workflow (1 = done) | > Important: The workflow uses column R (Processed) to track which rows have been scored. Rows where R = 1 are skipped on subsequent triggers. 3. 
3. Credentials Required

| Credential | Used for | Free? |
| :--- | :--- | :--- |
| Google Sheets Trigger OAuth2 | Detects new rows | Free |
| Google Sheets OAuth2 | Read rows + write results | Free |
| Google Gemini (PaLM) API | AI resume scoring | Free tier available |
| SMTP | Sending emails | Depends on provider |

4. Update Sheet ID
In the Read All Rows and Write to Sheet via API nodes, replace the Sheet ID with your own:
https://docs.google.com/spreadsheets/d/YOUR_SHEET_ID/...

5. Update Email Addresses
In all three email nodes, update:
fromEmail — your HR sending address
HR alert toEmail — your internal HR inbox
SMTP credential — your email provider

⚙️ Workflow Nodes

| Node | Type | Purpose |
| :--- | :--- | :--- |
| Google Sheets Trigger | Trigger | Fires on new row (polls every minute) |
| Read All Rows | Google Sheets | Reads all form responses |
| Filter Unprocessed Rows | Code | Skips rows where Processed = 1 |
| Extract Fields + Load JD | Code | Extracts candidate data + matches JD |
| AI Resume Scorer | AI Agent | Scores candidate vs JD using Gemini |
| Google Gemini Chat Model | LLM | AI model for scoring |
| Parse AI Output | Code | Parses JSON with 3-layer fallback |
| Build Sheet Request | Code | Prepares API write request for exact row |
| Write to Sheet via API | HTTP Request | Writes results to columns J–R |
| Score ≥ 7? | IF | Routes by score |
| Alert Email - HR Team | Email Send | Notifies HR of strong candidates |
| Wait 10s | Wait | Prevents SMTP rate limiting |
| Shortlist Email - Candidate | Email Send | Congratulations to shortlisted |
| Rejection Email - Candidate | Email Send | Polite rejection to others |

🔑 Key Technical Details
**Why an HTTP Request for the sheet write instead of the Google Sheets node?**
The standard Google Sheets node cannot reliably write to a specific row by row number.
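As a minimal sketch of the alternative the workflow takes, a Code node can build a row-targeted Sheets API write like the one below. The helper name, placeholder spreadsheet ID, and result fields are illustrative assumptions; only the `Form responses 1!J{row}:R{row}` range format comes from the template itself.

```javascript
// Sketch: build an exact-row write request against the Sheets API.
// sheetId and the result fields are placeholders, not the node's actual code.
function buildSheetWriteRequest(sheetId, rowNum, result) {
  // Target columns J..R on this candidate's exact row
  const range = `Form responses 1!J${rowNum}:R${rowNum}`;
  return {
    method: 'PUT',
    url: `https://sheets.googleapis.com/v4/spreadsheets/${sheetId}` +
         `/values/${encodeURIComponent(range)}?valueInputOption=RAW`,
    body: {
      range,
      majorDimension: 'ROWS',
      values: [[
        result.score, result.grade, result.strengths, result.weaknesses,
        result.recommendation, result.summary, result.status,
        new Date().toISOString(), 1, // Processed = 1
      ]],
    },
  };
}

const req = buildSheetWriteRequest('YOUR_SHEET_ID', 12, {
  score: 8, grade: 'Strong Yes', strengths: '...', weaknesses: '...',
  recommendation: 'Invite for technical interview', summary: '...',
  status: 'Shortlisted',
});
```

Because the range pins the row number explicitly, two candidates submitting at the same time cannot overwrite each other's results.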
Using the Sheets API PUT values/{range} endpoint writes to the exact cell range Form responses 1!J{rowNum}:R{rowNum} — always the correct row, regardless of concurrent submissions.

**Why Read All Rows then Filter instead of reading one row?**
The Google Sheets Trigger fires on rowAdded but may not pass the exact new-row data reliably. Reading all rows and filtering by Processed ≠ 1 is more reliable and handles backlog processing too.

**The Processed column (R)**
Set to 1 after scoring completes. This prevents re-processing if the trigger fires again before the workflow finishes, or if old unprocessed rows exist.

**3-layer AI output fallback:**
1. Direct JSON parse
2. Strip markdown fences, then parse
3. Regex-extract the JSON block
Default values if all layers fail (score: 5, grade: Maybe)

📊 Sample Sheet Output After Scoring
Col J (AI Score): 8
Col K (Grade): Strong Yes
Col L (Strengths): 3+ yrs React experience; Strong JavaScript skills; Relevant API work
Col M (Weaknesses): No Docker experience mentioned; No TypeScript explicitly listed
Col N (Recommendation): Invite for technical interview
Col O (Summary): Strong frontend candidate with solid React/Vue background. Experience aligns well with role requirements. Minor gaps in DevOps knowledge.
Col P (Status): Shortlisted
Col Q (Processed At): 2026-03-18T10:30:00.000Z
Col R (Processed): 1

🔧 Customisation
Add more roles: Edit the JD_MAP object in Extract Fields + Load JD:

```
const JD_MAP = {
  'Your New Role': 'ROLE: ... REQUIREMENTS: ...',
  // add as many as needed
};
```

Change the score threshold: In the Score ≥ 7? IF node, change rightValue: 7 to any number. Setting it to 8 raises the bar (only Strong Yes candidates are shortlisted).
Add a second HR alert channel: After the HR Alert Email, add a Slack or Telegram node to ping your team instantly for high-scoring candidates.
Add calendar booking link: Include a Calendly link in the shortlist email so candidates can self-schedule interviews immediately.
Extend with CV parsing: Add a Google Drive node before the AI scorer to download the CV attachment, extract its text, and include it in the AI prompt for more accurate scoring.

⚠️ Important Notes
The Google Sheets Trigger polls every minute — there may be up to a 1-minute delay after form submission before scoring begins.
The Processed column (column R) must exist in your sheet before activating the workflow.
Trailing spaces in Google Forms column headers are common — the key.trim() normalization in the code handles this automatically.
The Wait node between the HR alert and the shortlist email is intentional — it prevents hitting SMTP rate limits when processing multiple candidates.

💡 Enhancement Ideas
**CV download + parse** — download the Google Drive CV attachment and include its content in AI scoring
**Calendly integration** — auto-create an interview slot and include the booking link in the shortlist email
**Slack alert** — instant Slack message to the hiring manager for score ≥ 9 candidates
**Airtable sync** — mirror all candidate data to an Airtable CRM for pipeline tracking
**Telegram bot** — let HR check candidate status by texting a bot with the candidate's email
**Multi-stage pipeline** — add a second workflow that triggers when HR marks a candidate for a second round

📦 Requirements Summary
n8n (cloud or self-hosted)
Google Workspace account (Forms + Sheets)
Google AI Studio account for a Gemini API key (free tier available)
SMTP email account (Gmail, SendGrid, Mailtrap etc.)

Built with n8n · Google Gemini AI · Google Forms · Google Sheets · SMTP
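The 3-layer output fallback described under Key Technical Details could be sketched in a Code node roughly as follows. This is a non-authoritative sketch: the function name and the exact default fields mirror columns J–O but are assumptions about the node's real implementation.

```javascript
// Sketch of the Parse AI Output fallback chain:
// layer 1: direct JSON parse; layer 2: strip markdown fences then parse;
// layer 3: regex-extract a JSON block; otherwise safe defaults.
function parseAiOutput(raw) {
  const defaults = {
    score: 5, grade: 'Maybe', strengths: '', weaknesses: '',
    recommendation: '', summary: '',
  };
  const layers = [
    () => JSON.parse(raw),                                   // layer 1
    () => JSON.parse(raw.replace(/`{3}(?:json)?/g, '').trim()), // layer 2
    () => JSON.parse((raw.match(/\{[\s\S]*\}/) || [''])[0]), // layer 3
  ];
  for (const layer of layers) {
    try {
      return { ...defaults, ...layer() };
    } catch (e) {
      // fall through to the next layer
    }
  }
  return defaults; // all layers failed: score 5, grade Maybe
}
```

Gemini often wraps JSON replies in fenced code blocks or surrounding prose, which is exactly what layers 2 and 3 recover from.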
by vinci-king-01
Scheduled Backup Automation – Mailgun & Box
This workflow automatically schedules, packages, and uploads backups of your databases, files, or configuration exports to Box cloud storage, then sends a completion email via Mailgun. It is ideal for small-to-medium businesses or solo developers who want hands-off, verifiable backups without writing custom scripts.

Pre-conditions/Requirements
Prerequisites
n8n instance (self-hosted or n8n.cloud)
Box account with a folder dedicated to backups
Mailgun account & verified domain
Access to the target database/server you intend to back up
Basic knowledge of environment variables to store secrets

Required Credentials
**Box OAuth2** – For uploading the backup file(s)
**Mailgun API Key** – For sending backup status notifications
**(Optional) Database Credentials** – Only if the backup includes a DB dump triggered from inside n8n

Specific Setup Requirements

| Variable | Example | Purpose |
|-------------------------|------------------------------------------|---------------------------------------|
| BOX_FOLDER_ID | 1234567890 | ID of the Box folder that stores backups |
| MAILGUN_DOMAIN | mg.example.com | Mailgun domain used for sending email |
| MAILGUN_FROM | Backups <backup@mg.example.com> | “From” address in status emails |
| NOTIFY_EMAIL | admin@example.com | Recipient of backup status emails |

How it works
Key Steps:
**Webhook (Scheduler Trigger)**: Triggers the workflow on a CRON schedule or external call.
**Code (DB/File Dump)**: Executes bash or Node.js commands to create a tar/zip or SQL dump.
**Move Binary Data**: Converts the created file into n8n binary format.
**Set**: Attaches metadata (timestamp, file name).
**Split In Batches** *(optional)*: Splits multiple backup files for sequential uploads.
**Box Node**: Uploads each backup file into the specified Box folder.
**HTTP Request (Verify Upload)**: Calls the Box API to confirm upload success.
**If**: Branches on success vs failure.
**Mailgun Node**: Sends confirmation or error report email.
**Sticky Notes**: Provide inline documentation inside the workflow canvas.

Set up steps
Setup Time: 15-20 minutes
Clone or import the workflow JSON into your n8n instance.
Create credentials:
Box OAuth2: paste Client ID, Client Secret, perform OAuth handshake.
Mailgun API: add Private API key and domain.
Update environment variables (BOX_FOLDER_ID, MAILGUN_DOMAIN, etc.) or edit the relevant Set node.
Modify the Code node to run your specific backup command, e.g.:

```bash
pg_dump -U $DB_USER -h $DB_HOST $DB_NAME > /tmp/db_backup.sql
tar -czf /tmp/full_backup_{{new Date().toISOString()}}.tar.gz /etc/nginx /var/www /tmp/db_backup.sql
```

Set the CRON schedule inside the Webhook node (or replace it with a Cron node) to your desired frequency (daily, weekly, etc.).
Execute once manually to verify the Box upload and email notification.
Enable the workflow.

Node Descriptions
Core Workflow Nodes:
**Webhook / Cron** – Acts as the time-based trigger for backups.
**Code** – Creates the actual backup archive (tar, zip, SQL dump).
**Move Binary Data** – Moves the generated file into a binary property.
**Set** – Adds filename and timestamp metadata for Box.
**Split In Batches** – Handles multiple files when necessary.
**Box** – Uploads the backup file to Box.
**HTTP Request** – Optional re-check to ensure the file exists in Box.
**If** – Routes the flow based on success or error.
**Mailgun** – Sends success/failure notifications.
**Sticky Note** – Explains credential handling and customization points.
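As an illustration of the Set step, a small Code-node helper could build the filename and timestamp metadata before the Box upload. This is only a sketch: the `full_backup_` prefix follows the sample output, and the sortable `YYYYMMDD_HHmm` stamp follows the Pro Tips suggestion; the function itself is not part of the template.

```javascript
// Sketch: build sortable backup-file metadata for the Set node.
// Naming scheme is an assumption based on the sample output and Pro Tips.
function buildBackupMetadata(now = new Date()) {
  const pad = (n) => String(n).padStart(2, '0');
  // Sortable UTC stamp: YYYYMMDD_HHmm
  const stamp =
    `${now.getUTCFullYear()}${pad(now.getUTCMonth() + 1)}${pad(now.getUTCDate())}` +
    `_${pad(now.getUTCHours())}${pad(now.getUTCMinutes())}`;
  return {
    fileName: `full_backup_${stamp}.tar.gz`,
    timestamp: now.toISOString(), // ISO timestamp for the status email
  };
}

const meta = buildBackupMetadata(new Date(Date.UTC(2023, 9, 31, 0, 5)));
// meta.fileName → "full_backup_20231031_0005.tar.gz"
```

Plain lexicographic sorting of these filenames then matches chronological order, which makes retention cleanup and manual browsing in Box straightforward.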
Data Flow:
Webhook/Cron → Code → Move Binary Data → Set → Split In Batches → Box → HTTP Request → If → Mailgun

Customization Examples
Add Retention Policy (Auto-delete old backups)

```js
// In a Code node before upload
const retentionDays = 30;
const cutoff = Date.now() - retentionDays * 24 * 60 * 60 * 1000;
items = items.filter(item => {
  // keep only files modified within the retention window
  return new Date(item.json.modifiedAt).getTime() > cutoff;
});
return items;
```

Parallel Upload to S3

```js
// Duplicate the Box node and replace it with an AWS S3 node
// Use a Merge node to combine results before the HTTP Request verification
```

Data Output Format
The workflow outputs structured JSON data:

```json
{
  "fileName": "full_backup_2023-10-31T00-00-00Z.tar.gz",
  "boxFileId": "9876543210",
  "uploadStatus": "success",
  "timestamp": "2023-10-31T00:05:12Z",
  "emailNotification": "sent"
}
```

Troubleshooting
Common Issues
“Invalid Box Folder ID” – Verify BOX_FOLDER_ID and ensure the OAuth user has write permissions.
Mailgun 401 Unauthorized – Check that you used the Private API key and that the domain is verified.
Backup file too large – Enable chunked upload in the Box node or increase client_max_body_size on your reverse proxy.

Performance Tips
Compress backups with gzip or zstd to reduce upload time.
Run the database dump on the same host as n8n to avoid network overhead.

Pro Tips:
Store secrets as environment variables and reference them in Code nodes (process.env.MY_SECRET).
Version backup filenames with timestamps (YYYYMMDD_HHmm) for easy sorting.
Use n8n’s built-in execution logging to audit backup history.

This is a community workflow template provided “as-is” without warranty. Adapt and test in a safe environment before using in production.