by Yanagi Chinatsu
**Who's it for?**
This template is perfect for teams, communities, or anyone who wants a fun, automated way to remember and celebrate birthdays. If you love space and want to make someone's special day a little more cosmic, this is for you. It's a great way to boost morale and ensure no one's birthday is forgotten.

**How it works / What it does**
The workflow runs automatically every morning. It reads a list of birthdays from your Google Sheet and checks if any match the current date (see the sketch after this description). If a match is found, it fetches the latest Earth image from NASA's EPIC API and uses an AI model to generate a unique, space-themed birthday greeting. Finally, it sends a personalized HTML email to the birthday person and posts a celebratory message in your designated Slack channel.

**Requirements**
- A Google Sheet with columns for Name, Email, and Birthday (in a format like YYYY-MM-DD).
- Credentials for Google Sheets, Gmail, OpenAI, and Slack configured in your n8n instance.

**How to set up**
1. Configure Credentials: Add your API credentials for Google (for both Sheets and Gmail), OpenAI, and Slack in the respective nodes.
2. Set Up Google Sheet: In the "Get Birthday Roster" node, enter your Google Sheet ID and the name of the sheet where the birthday list is stored.
3. Configure Slack: In the "Post Slack Notification" node, select the channel where you want birthday announcements to be posted.
4. Activate Workflow: Save and activate the workflow. It will now run automatically every day!

**How to customize the workflow**
- Change the schedule: Modify the CRON expression in the "Every Morning at 7:00" node to change when the workflow runs.
- Adjust the AI prompt: Edit the system message in the "Generate Space Birthday Message" node to alter the tone, length, or style of the AI-generated greetings.
- Customize the email design: Modify the HTML in the "Send Birthday Email" node to change the look and feel of the birthday message.
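The birthday-matching step would typically live in an n8n Code node between the Google Sheets read and the NASA/AI branch. Below is a minimal sketch, assuming the sheet columns listed above (Name, Email, Birthday as YYYY-MM-DD); the exact node wiring in the template may differ.

```javascript
// Minimal sketch (n8n Code node, "Run Once for All Items"): keep only rows whose
// Birthday month-day matches today's date. Column names follow the sheet above.
const today = new Date();
const mmdd = `${String(today.getMonth() + 1).padStart(2, '0')}-${String(today.getDate()).padStart(2, '0')}`;

return $input.all().filter(item => {
  const birthday = String(item.json.Birthday || '').trim(); // e.g. "1990-04-17"
  return birthday.slice(5) === mmdd;                        // compare the "MM-DD" part
});
```

Rows that survive this filter continue to the EPIC image fetch and greeting generation; an empty result simply ends the run for the day.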
by Cheng Siong Chin
**Introduction**
Automates Singapore COE price tracking, predicts trends using AI, and recommends optimal car purchase timing. Scrapes LTA data biweekly, analyzes historical trends, forecasts the next 6 bidding rounds, and sends alerts when buying windows appear—saving time and identifying cost-saving opportunities.

**How it Works**
Biweekly trigger scrapes LTA COE data → processes historical trends → AI predicts 6-month prices → compares current vs forecast → generates buy/wait recommendations → alerts sent via Gmail or Telegram.

**Setup Steps**
1. Add NVIDIA/OpenAI API credentials in n8n
2. Connect Google Sheets for data storage
3. Authenticate Gmail/Telegram for notifications
4. Schedule trigger for Wednesdays 8PM SGT
5. Configure alert thresholds in conditional nodes

**Workflow**
Schedule Trigger → HTTP Request (Scrape LTA) → Data Processing → Google Sheets (Store) → AI Prediction → Analysis Engine → Conditional Logic → Gmail/Telegram Notification

**Workflow Steps**
- Scraping: Extract COE prices from OneMotoring
- Processing: Calculate moving averages, volatility, seasonal trends (see the sketch after this description)
- Storage: Save to Google Sheets with timestamps
- Prediction: AI forecasts next 6 bidding rounds
- Analysis: Compare current vs predicted prices, generate recommendation
- Notification: Alerts via email/Telegram

**Prerequisites**
NVIDIA/OpenAI API key, Google account (Sheets), Gmail/Telegram for notifications, basic COE category knowledge

**Use Cases**
First-time buyers monitoring price dips, fleet managers timing bulk purchases

**Customization**
Add economic indicators, integrate car loan calculators, track parallel-imported car prices

**Benefits**
Saves hours of manual monitoring, captures 10–15% price dips, provides data-driven purchase timing (potential $5K–$15K savings)
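To make the "Data Processing / Analysis Engine" step concrete, here is a minimal sketch of a moving-average comparison that could drive the buy/wait recommendation. The field name `premium`, the 6-round window, and the 5% threshold are illustrative assumptions, not values taken from the template.

```javascript
// n8n Code node sketch: compare the latest COE premium against a short moving
// average and emit a simple recommendation. Items are assumed oldest → newest.
const prices = $input.all().map(i => Number(i.json.premium)); // e.g. COE premium in SGD

const window = 6; // roughly 3 months of biweekly bidding rounds
const recent = prices.slice(-window);
const movingAvg = recent.reduce((a, b) => a + b, 0) / recent.length;

const current = prices[prices.length - 1];
const deviation = (current - movingAvg) / movingAvg;

return [{
  json: {
    current,
    movingAvg: Math.round(movingAvg),
    deviationPct: +(deviation * 100).toFixed(1),
    recommendation: deviation <= -0.05 ? 'BUY window' : 'WAIT', // illustrative threshold
  },
}];
```

In the real template the AI prediction node would sit alongside this, and the conditional nodes would use the configured alert thresholds instead of the hard-coded 5%.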
by Moka Ouchi
**Who is it for**
This template is perfect for content creators, space enthusiasts, educators, and organizations who want to automatically generate and distribute engaging weekly space newsletters. It's especially useful for those who want to stay updated on space events without manual content creation.

**How it works**
This workflow automatically generates a comprehensive space newsletter every week by:
1. Fetching the latest data from multiple NASA APIs (APOD, DONKI, NeoWS) — see the sketch after this description
2. Using AI to create an engaging newsletter from the raw data
3. Creating a formatted document with the content
4. Converting it to PDF format
5. Distributing via email and archiving to your preferred platform
6. Sending success notifications to your team

**Setup steps**
1. Get NASA API Key: Register at https://api.nasa.gov to get your API key (free)
2. Configure OpenAI: Add your OpenAI credentials for content generation
3. Set up Google Workspace: Connect Google Docs, Drive, and Gmail accounts
4. Optional integrations: Configure Notion for archiving and Slack for notifications
5. Customize schedule: Adjust the weekly trigger time to your preference

**Requirements**
- NASA API key (free at api.nasa.gov)
- OpenAI API access
- Google Workspace account (Docs, Drive, Gmail)
- Optional: Notion and Slack accounts

**How to customize**
- Modify the AI prompt to change the newsletter style and tone
- Adjust the date range for fetching space data
- Add or remove NASA API endpoints based on your interests
- Customize email recipients and notification channels
- Enable/disable the Notion archiving feature as needed
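For orientation, here is a small sketch of the NASA endpoints the fetch step works against, written as a Code node that just builds the request URLs you would place in HTTP Request nodes. The date range is illustrative; endpoint paths should be verified against the current api.nasa.gov documentation.

```javascript
// n8n Code node sketch: assemble the three NASA API URLs used by the newsletter.
// DEMO_KEY works for light testing; use your own key for a weekly schedule.
const apiKey = 'DEMO_KEY';
const start = '2025-01-01'; // illustrative — normally computed for the past week
const end = '2025-01-07';

return [{
  json: {
    // Astronomy Picture of the Day
    apod: `https://api.nasa.gov/planetary/apod?api_key=${apiKey}`,
    // DONKI space-weather notifications for the date window
    donki: `https://api.nasa.gov/DONKI/notifications?startDate=${start}&endDate=${end}&type=all&api_key=${apiKey}`,
    // Near-Earth objects for a date range (NeoWs allows at most 7 days per request)
    neows: `https://api.nasa.gov/neo/rest/v1/feed?start_date=${start}&end_date=${end}&api_key=${apiKey}`,
  },
}];
```

Adding or removing endpoints (the customization option above) amounts to editing this URL list and the corresponding HTTP Request nodes.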
by Jitesh Dugar
Transform procurement from manual chaos to intelligent automation - AI-powered supplier selection analyzes urgency, cost, and delivery requirements to recommend optimal vendors, then automatically generates professional POs, manages approval workflows, and tracks delivery while maintaining complete audit trails.

**What This Workflow Does**
Revolutionizes purchase order management with AI-driven supplier optimization and automated procurement workflows:
- **Webhook-Triggered Generation** - Automatically creates POs from inventory systems, manual requests, or threshold alerts
- **Smart Data Validation** - Verifies item details, quantities, pricing, and calculates totals with tax and shipping
- **AI Supplier Selection** - OpenAI agent analyzes order requirements and recommends the optimal supplier based on multiple factors
- **Intelligent Analysis** - AI considers urgency level, total value, item categories, delivery requirements, and cost optimization
- **Multi-Supplier Database** - Maintains supplier profiles with contact details, payment terms, delivery times, and specializations
- **Approval Workflow** - Routes high-value orders (>$5000) for management approval before supplier notification
- **Professional PO Generation** - Creates beautifully formatted purchase orders with company branding and complete details
- **AI Insights Display** - Shows supplier selection reasoning, cost optimization notes, and alternative supplier recommendations
- **PDF Conversion** - Transforms HTML into print-ready, professional-quality purchase order documents
- **Automated Email Distribution** - Sends POs directly to selected suppliers with all necessary attachments
- **Google Drive Archival** - Automatically saves POs to organized folders with searchable filenames
- **Procurement System Logging** - Records complete PO details, supplier info, and status in a centralized system
- **Delivery Tracking** - Monitors order status from placement through delivery confirmation
- **Slack Team Notifications** - Real-time alerts to the procurement team with PO details and AI recommendations
- **Urgency Classification** - Prioritizes orders based on urgency (urgent, normal), affecting supplier selection
- **Cost Optimization** - AI identifies opportunities for savings or faster delivery based on requirements

**Key Features**
- **AI-Powered Supplier Matching**: Machine learning analyzes order characteristics and recommends the best supplier from the database based on delivery speed, cost, and specialization
- **Intelligent Trade-Off Analysis**: AI balances cost vs delivery time vs supplier capabilities to find the optimal choice for specific order requirements
- **Automatic PO Numbering**: Generates unique sequential purchase order numbers in the format PO-YYYYMM-#### for tracking and reference (a sketch appears at the end of this template)
- **Approval Threshold Management**: Configurable dollar thresholds trigger approval workflows for high-value purchases requiring management authorization
- **Multi-Criteria Supplier Selection**: Considers urgency level, order value, item categories, delivery requirements, and historical performance
- **Supplier Specialization Matching**: Routes technology orders to tech suppliers, construction materials to building suppliers, etc.
- **Cost vs Speed Optimization**: AI recommends premium suppliers for urgent orders and budget suppliers for standard delivery timelines
- **Alternative Supplier Suggestions**: Provides backup supplier recommendations in case the primary choice is unavailable
- **Real-Time Pricing Calculations**: Automatically computes line items, subtotals, taxes, shipping, and grand totals
- **Payment Terms Automation**: Pulls supplier-specific payment terms (Net 30, Net 45, etc.) from the supplier database
- **Shipping Address Management**: Maintains multiple delivery locations with automatic address population
- **Special Instructions Field**: Captures custom requirements, delivery notes, or handling instructions for suppliers
- **Item Catalog Integration**: Supports product codes, descriptions, quantities, and unit pricing for accurate ordering
- **Audit Trail Generation**: Complete activity log tracking PO creation, approvals, supplier notification, and delivery
- **Status Tracking System**: Monitors the PO lifecycle from creation through delivery confirmation with real-time updates
- **Multi-Department Support**: Tracks the requesting department for budget allocation and accountability

**Perfect For**
- **Retail Stores** - Automated inventory reordering when stock reaches threshold levels
- **Manufacturing Companies** - Raw material procurement with delivery scheduling for production planning
- **Restaurant Chains** - Food and supplies ordering with vendor rotation and cost optimization
- **IT Departments** - Equipment purchasing with approval workflows for technology investments
- **Construction Companies** - Materials procurement with urgency-based supplier selection for project timelines
- **Healthcare Facilities** - Medical supplies ordering with compliance tracking and vendor management
- **Educational Institutions** - Procurement for facilities, supplies, and equipment across departments
- **E-commerce Businesses** - Inventory replenishment with AI-optimized supplier selection for margins
- **Hospitality Industry** - Supplies procurement for hotels and resorts with cost control
- **Government Agencies** - Compliant procurement workflows with approval chains and audit trails

**What You Will Need**

Required Integrations:
- **OpenAI API** - AI agent for intelligent supplier selection and optimization (API key required)
- **HTML to PDF API** - PDF conversion service for professional PO documents (approximately 1-5 cents per PO)
- **Gmail or SMTP** - Email delivery for sending POs to suppliers and approval requests
- **Google Drive** - Cloud storage for PO archival and compliance documentation

Optional Integrations:
- **Slack Webhook** - Procurement team notifications with PO details and AI insights
- **Procurement Software** - ERP/procurement system API for automatic logging and tracking
- **Inventory Management** - Connect to inventory systems for automated reorder triggers
- **Accounting Software** - QuickBooks or Xero integration for expense tracking and reconciliation
- **Supplier Portal** - Direct integration with supplier order management systems
- **Approval Software** - Connect to approval management platforms for workflow automation

**Quick Start**
1. Import Template - Copy the JSON workflow and import it into your n8n instance
2. Configure OpenAI - Add OpenAI API credentials for the AI supplier selection agent
3. Setup PDF Service - Add HTML to PDF API credentials in the HTML to PDF node
4. Configure Gmail - Connect Gmail OAuth2 credentials and update the sender email
5. Connect Google Drive - Add Google Drive OAuth2 credentials and set the folder ID for PO archival
6. Customize Company Info - Edit company data with your company name, address, and contact details
7. Update Supplier Database - Modify supplier information in the enrichment node with actual vendor details
8. Set Approval Threshold - Adjust the dollar amount requiring management approval ($5000 default)
9. Configure Email Templates - Customize the supplier email and approval request messages
10. Add Slack Webhook - Configure the Slack notification URL for procurement team alerts
11. Test AI Agent - Submit a sample order to verify the AI supplier selection logic
12. Test Complete Workflow - Run an end-to-end test with real PO data to verify all integrations

**Customization Options**
- **Supplier Scoring Algorithm** - Adjust AI weighting for cost vs delivery speed vs quality factors
- **Multi-Location Support** - Add multiple shipping addresses for different facilities or warehouses
- **Budget Tracking** - Integrate departmental budgets with automatic budget consumption tracking
- **Volume Discounts** - Configure automatic discount calculations based on order quantities
- **Contract Compliance** - Enforce existing vendor contracts and preferred supplier agreements
- **Multi-Currency Support** - Handle international suppliers with currency conversion and forex rates
- **RFQ Generation** - Extend the workflow to generate requests for quotes for new items
- **Delivery Scheduling** - Integrate a calendar for scheduled deliveries and receiving coordination
- **Quality Tracking** - Add supplier performance scoring based on delivery time and quality
- **Return Management** - Create return authorization workflows for defective items
- **Recurring Orders** - Automate standing orders with scheduled generation
- **Inventory Forecasting** - AI predicts reorder points based on historical consumption patterns
- **Supplier Negotiation** - Track pricing history and flag opportunities for renegotiation
- **Compliance Documentation** - Attach required certifications, insurance, or regulatory documents
- **Multi-Approver Chains** - Configure complex approval hierarchies for different dollar thresholds

**Expected Results**
- **90% time savings** - Reduce PO creation from 30 minutes to 3 minutes per order
- **50% faster supplier selection** - AI recommends the optimal vendor instantly vs manual research
- **Elimination of stockouts** - Automated reordering prevents inventory shortages
- **20-30% cost savings** - AI optimization identifies better pricing and supplier options
- **100% approval compliance** - No high-value orders bypass required approvals
- **Zero lost POs** - Complete digital trail with automatic archival
- **Improved supplier relationships** - Professional, consistent POs with clear requirements
- **Faster order processing** - Suppliers receive clear POs immediately, enabling faster fulfillment
- **Better delivery predictability** - AI matches urgency to supplier capabilities, reducing delays
- **Reduced procurement overhead** - Automation eliminates manual data entry and follow-up

**Pro Tips**
- **Train AI with Historical Data** - Feed past successful orders to improve AI supplier recommendations
- **Maintain Supplier Performance Scores** - Track delivery times and quality to enhance AI selection accuracy
- **Set Smart Thresholds** - Adjust approval amounts based on department budgets and risk tolerance
- **Use Urgency Levels Strategically** - Reserve the "urgent" classification for true emergencies to optimize costs
- **Monitor AI Recommendations** - Review AI reasoning regularly to validate supplier selection logic
- **Integrate Inventory Triggers** - Connect to inventory systems for automatic PO generation at reorder points
- **Establish Preferred Vendors** - Flag preferred suppliers in the database for the AI to prioritize when suitable
- **Document Special Requirements** - Use the special instructions field consistently for better supplier compliance
- **Track Cost Trends** - Export PO data to analyze spending patterns and negotiation opportunities
- **Review Alternative Suppliers** - Keep the AI's alternative recommendations as backups when the primary is unavailable
- **Schedule Recurring Orders** - Set up automated triggers for regular supply needs
- **Centralize Receiving** - Use consistent ship-to addresses to simplify delivery coordination
- **Archive Systematically** - Organize Drive folders by fiscal year, department, or supplier
- **Test Approval Workflow** - Verify approval routing works before deploying to production
- **Communicate AI Benefits** - Help the procurement team understand AI recommendations to build trust

**Business Impact Metrics**
Track these key metrics to measure workflow success:
- **PO Generation Time** - Average minutes from request to supplier notification (target: under 5 minutes)
- **Supplier Selection Accuracy** - Percentage of AI recommendations that meet delivery and cost expectations (target: 90%+)
- **Approval Workflow Speed** - Average hours for high-value PO approvals (target: under 4 hours)
- **Stockout Prevention** - Reduction in inventory shortages due to faster PO processing
- **Cost Savings** - Percentage reduction in procurement costs from AI optimization (typical: 15-25%)
- **Order Accuracy** - Reduction in PO errors requiring correction or cancellation
- **Supplier On-Time Delivery** - Improvement in delivery performance from better supplier matching
- **Procurement Productivity** - Number of POs processed per procurement staff member
- **Budget Compliance** - Percentage of POs staying within approved departmental budgets
- **Audit Readiness** - Time required to produce PO documentation for audits (target: under 5 minutes)

**Template Compatibility**
- Compatible with n8n version 1.0 and above
- Requires OpenAI API access for AI agent functionality
- Works with n8n Cloud and self-hosted instances
- Requires an HTML to PDF API service subscription
- No coding required for basic setup
- Fully customizable supplier database and selection criteria
- Integrates with major procurement and ERP systems via API
- Supports unlimited suppliers and product categories
- Scales to handle thousands of POs monthly

Ready to transform your procurement process? Import this template and start generating intelligent purchase orders with AI-powered supplier selection, automated approval workflows, and complete procurement tracking - eliminating manual processes, preventing stockouts, and optimizing costs across your entire supply chain!
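As referenced under Key Features, here is a minimal sketch of how the PO numbering (PO-YYYYMM-####), pricing totals, and $5000 approval threshold could be computed in a single Code node. The order item shape, the 8% tax rate, and the `sequence` field are illustrative assumptions, not values from the template.

```javascript
// n8n Code node sketch: derive PO number, totals, and the approval flag.
const order = $input.first().json; // assumed: { department, urgency, sequence, shipping, items: [{ code, description, qty, unitPrice }] }

// PO-YYYYMM-#### — the running sequence number is assumed to come from an earlier lookup
const now = new Date();
const yyyymm = `${now.getFullYear()}${String(now.getMonth() + 1).padStart(2, '0')}`;
const poNumber = `PO-${yyyymm}-${String(order.sequence || 1).padStart(4, '0')}`;

const subtotal = order.items.reduce((sum, it) => sum + it.qty * it.unitPrice, 0);
const tax = subtotal * 0.08;          // illustrative tax rate
const shipping = order.shipping ?? 0;
const grandTotal = subtotal + tax + shipping;

return [{
  json: {
    ...order,
    poNumber,
    subtotal, tax, shipping, grandTotal,
    requiresApproval: grandTotal > 5000, // routes the order to the approval branch
  },
}];
```

The `requiresApproval` flag is what the approval-threshold step would branch on; adjusting the default threshold (Quick Start step 8) means changing that single comparison.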
by WeblineIndia
**Android Feature Flag Cleanup Bot (GitLab + LaunchDarkly)**

This n8n automation detects unused ("dead") feature flags in an Android Kotlin/Java codebase by comparing your GitLab repository code against LaunchDarkly's feature flag list. It logs results in Google Sheets, creates Jira tickets for cleanup, and sends Slack alerts automatically.

**Who's it for**
- Android engineering teams using Kotlin/Java.
- Teams managing feature flags in LaunchDarkly.
- DevOps/QA teams wanting to reduce technical debt from stale flags.

**How it works**
1. Weekly Trigger runs the process.
2. GitLab Node fetches repository code.
3. Regex Extraction finds all feature flags in code (see the sketch after this description).
4. LaunchDarkly API retrieves all configured flags.
5. Comparison Logic marks flags as "dead" if unused in code and archived or off in production.
6. Google Sheets stores flagged results.
7. Jira creates a ticket for each dead flag.
8. Slack notifies the team.

**How to set up**
1. Import the JSON into n8n.
2. Connect credentials for: GitLab OAuth2, Google Sheets, Jira, Slack webhook URL.
3. Update: GitLab repo details in the GitLab node, the LaunchDarkly API key in the HTTP Request node, the Google Sheet ID in the Google Sheets node, the Jira project & issue type in the Jira node, and the Slack message formatting in the Slack node.
4. Activate the workflow.

**Requirements**
- n8n (self-hosted or cloud)
- GitLab repository with Kotlin/Java code
- LaunchDarkly account + API token
- Google Sheets API access
- Jira API access
- Slack incoming webhook

**How to customize**
- Change the regex pattern in the "Detect flags" node if your flag naming convention differs.
- Adjust the dead-flag logic in the "Find dead flags" node (e.g., treat the test env separately).
- Modify the Slack message to include more details (e.g., the description from LaunchDarkly).
- Add email notifications for broader distribution.

**Add-ons**
- **Email Alerts** via Gmail/SMTP.
- **GitHub / GitLab MR** to remove dead flags automatically.
- **Confluence Integration** to document flag cleanup history.

**Use Case Examples**
- Weekly automated cleanup alerts for large engineering teams.
- Maintaining clean feature flag lists in high-traffic apps.
- Compliance-driven projects requiring flag lifecycle tracking.

**Common troubleshooting**

| Issue | Possible Cause | Solution |
| --- | --- | --- |
| Workflow fails at GitLab node | Invalid repo path or missing OAuth scope | Update repo path & check GitLab OAuth permissions |
| LaunchDarkly API request returns 401 | Invalid or expired API key | Generate a new API key in LaunchDarkly & update node |
| Google Sheets node fails | Wrong Sheet ID or missing sharing permissions | Confirm Sheet ID and share with connected Google account |
| Jira ticket not created | Missing required fields | Set project key, issue type, and summary in Jira node |
| Slack alert not sent | Webhook URL invalid or revoked | Regenerate Slack webhook and update in node |

**Need Help?**
If you'd like, we can help set up and customize this workflow for your exact repo, flag rules and team notification preferences — including regex adjustments, extra reporting or adding automatic cleanup PRs. Contact our n8n automation team at WeblineIndia.
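The regex-extraction step referenced above could look roughly like the sketch below, assuming flags are referenced as string keys passed to LaunchDarkly SDK variation calls. The regex, the variation method names covered, and the file filtering are illustrative; adapt them to your own naming convention as noted in the customization section.

```javascript
// n8n Code node sketch: pull flag keys out of Kotlin/Java sources, e.g.
//   ldClient.boolVariation("new-checkout-flow", false)
const files = $input.all(); // assumed shape: { json: { path, content } } from the GitLab node

const flagPattern = /(?:boolVariation|stringVariation|intVariation|jsonValueVariation)\(\s*"([A-Za-z0-9._-]+)"/g;
const found = new Set();

for (const file of files) {
  if (!/\.(kt|java)$/.test(file.json.path || '')) continue; // only Kotlin/Java sources
  for (const match of String(file.json.content || '').matchAll(flagPattern)) {
    found.add(match[1]);
  }
}

return [...found].map(key => ({ json: { flagKey: key } }));
```

The resulting `flagKey` items feed the comparison step, which cross-checks them against the flag list returned by the LaunchDarkly API.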
by Oneclick AI Squad
Streamline your post-event analysis with this smart n8n workflow. Triggered by a simple webhook, it instantly gathers attendee and engagement data from your event platform, calculates key metrics, and uses AI to generate a polished, professional report. The final summary is emailed to stakeholders and saved securely in a database — all without manual effort. Perfect for conferences, webinars, and corporate events. 📧📈

**Key Features**
- **Webhook triggered** – Starts instantly via HTTP POST request
- **Multi-source data collection** – Fetches attendees & engagement metrics
- **Advanced analytics** – Calculates attendance rates, engagement scores, top sessions
- **AI-powered insights** – Uses GPT-4 to generate professional reports
- **Auto-email delivery** – Sends report to stakeholders
- **Database archiving** – Saves reports to PostgreSQL

**What it Analyzes**
- Attendance rates & check-ins
- Average session time
- Engagement scores (polls, Q&A, networking)
- Top performing sessions
- Attendee breakdown (by role & company)
- AI-generated insights & recommendations

**Workflow Process**
1. The Webhook Trigger node starts the workflow when an HTTP POST request is received with event details.
2. **Get Attendees (GET)** pulls the list of registered and checked-in participants from your event system.
3. **Get Engagement Metrics (GET)** retrieves interaction data like poll responses, Q&A activity, and session views.
4. **Process Metrics** calculates key stats: attendance rate, average session duration, engagement score, and ranks top sessions (see the sketch after this description).
5. **AI Generate Report** uses GPT-4 to create a clear, professional summary with insights and recommendations based on the data.
6. **AI Agent** coordinates data flow and prepares the final report structure using chat model and memory tools.
7. **Save to Database (Insert)** stores the full report and raw metrics in PostgreSQL for future reference.
8. **Send Report Email** automatically emails the AI-generated report to the specified recipient.
9. **Send Response** returns a confirmation back to the triggering system via webhook.

**Setup Instructions**
1. Import this JSON into n8n
2. Configure credentials: Event API (for GET requests), OpenAI (GPT-4), SMTP (for email delivery), PostgreSQL (for data storage)
3. Trigger via webhook with event data
4. Receive a comprehensive report via email within minutes!

**Prerequisites**
- Event platform with REST API (for attendee & engagement data)
- OpenAI API key (GPT-4 access)
- SMTP server credentials (Gmail, SendGrid, etc.)
- PostgreSQL database with write access

**Example Webhook Payload**
{ "eventId": "evt_123", "eventName": "Tech Summit 2025", "eventDate": "2025-10-29", "email": "manager@company.com" }

**Modification Options**
- Add custom metrics in the Process Metrics node (e.g., NPS score, feedback sentiment)
- Change the AI tone in AI Generate Report (formal, executive summary, or creative)
- Modify the email template in Send Report Email with your branding
- Connect to different data sources by updating the GET nodes
- Add a Slack or Teams notification after Send Report Email

Ready to automate your event reporting? Get in touch with us for custom n8n workflows!
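The Process Metrics node referenced above might look like the following sketch. The field names on the attendee and engagement items (`checkedIn`, `sessionMinutes`, `pollResponses`, `questionsAsked`) and the scoring weights are assumptions — map them to whatever your event platform's API actually returns.

```javascript
// n8n Code node sketch for "Process Metrics": attendance rate, average session
// time, and a simple weighted engagement score.
const attendees = $('Get Attendees').all().map(i => i.json);
const engagement = $('Get Engagement Metrics').all().map(i => i.json);

const registered = attendees.length;
const checkedIn = attendees.filter(a => a.checkedIn).length;
const attendanceRate = registered ? +(100 * checkedIn / registered).toFixed(1) : 0;

const avgSessionMinutes = engagement.length
  ? +(engagement.reduce((s, e) => s + (e.sessionMinutes || 0), 0) / engagement.length).toFixed(1)
  : 0;

// Illustrative weighted score out of 100: polls count double, questions triple
const totals = engagement.reduce((acc, e) => ({
  polls: acc.polls + (e.pollResponses || 0),
  qa: acc.qa + (e.questionsAsked || 0),
}), { polls: 0, qa: 0 });
const engagementScore = Math.min(100, totals.polls * 2 + totals.qa * 3);

return [{ json: { registered, checkedIn, attendanceRate, avgSessionMinutes, engagementScore } }];
```

These computed fields are what the AI Generate Report step would summarize into the final email and database record.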
by Avkash Kakdiya
**How it works**
This workflow monitors Meta Ads and Google Ads campaigns on a daily schedule to detect performance drops. It fetches yesterday's campaign data, standardizes metrics, and calculates CTR and ROAS against fixed benchmarks. Campaigns that fall below thresholds are flagged automatically. Alerts are then sent across multiple channels and all results are logged in Google Sheets for tracking.

**Step-by-step**

**Step 1: Fetch ad performance data**
- Schedule Trigger (Daily Ad Check2) – Runs the Meta Ads performance check at the scheduled time.
- HTTP Request (Fetch Meta Ads Data) – Retrieves campaign metrics from the Meta Ads API.
- Set (Set Benchmarks) – Normalizes Meta Ads data and assigns platform details.
- Schedule Trigger (Daily Ad Check3) – Runs the Google Ads performance check at a separate scheduled time.
- Google Ads (Get many campaigns) – Fetches campaign performance data from Google Ads.
- Set (Set Benchmarks4) – Normalizes Google Ads data and assigns platform details.

**Step 2: Detect performance drops**
- Code (Detect Performance Drop) – Calculates CTR and ROAS, compares them with predefined benchmarks, and flags drops (see the sketch after this description).
- If – Filters only campaigns where a performance drop is detected.
- Split In Batches (Loop Over Items) – Processes each affected campaign individually.

**Step 3: Alert and log results**
- WhatsApp (Send message1) – Sends instant WhatsApp alerts for critical visibility.
- Slack (Send a message) – Posts detailed alerts to a Slack channel.
- Gmail (Send a message4) – Sends email notifications with full campaign metrics.
- Code (Code in JavaScript) – Recombines campaign data after notifications.
- Wait (Wait1) – Ensures alert delivery before logging.
- Google Sheets (your-google-sheets-name) – Appends or updates campaign records for reporting and audit history.

**Why use this?**
- Catch CTR and ROAS drops before ad spend is wasted.
- Monitor Meta Ads and Google Ads in a single automated flow.
- Notify teams instantly via WhatsApp, Slack, and Email.
- Keep a centralized performance log for analysis and reporting.
- Reduce manual checks with consistent daily monitoring.
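Here is a minimal sketch of what the "Detect Performance Drop" Code node computes. The input field names and the benchmark numbers are illustrative placeholders; the template ships with its own fixed benchmarks set in the Set nodes.

```javascript
// n8n Code node sketch: compute CTR and ROAS per campaign and flag drops.
const BENCHMARKS = { ctr: 1.0 /* % */, roas: 2.0 }; // illustrative thresholds

return $input.all().map(item => {
  const c = item.json; // assumed: { platform, campaignName, impressions, clicks, spend, revenue }
  const ctr = c.impressions ? (100 * c.clicks) / c.impressions : 0;
  const roas = c.spend ? c.revenue / c.spend : 0;

  return {
    json: {
      ...c,
      ctr: +ctr.toFixed(2),
      roas: +roas.toFixed(2),
      performanceDrop: ctr < BENCHMARKS.ctr || roas < BENCHMARKS.roas, // consumed by the If node
    },
  };
});
```

The downstream If node simply checks `performanceDrop`, so tightening or loosening the benchmarks is a one-line change here (or in the Set Benchmarks nodes).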
by Jitesh Dugar
**Automated AI-Powered Testimonial Processing & Social Media Workflow**

**Overview:**
This comprehensive workflow automates the entire testimonial collection and publishing process, from submission to social-media-ready content. It uses AI to enhance testimonials, generates beautiful branded cards, and implements an approval system before posting.

**Key Features:**
- ✅ Webhook-based submission - Accept testimonials via API
- 🤖 AI Enhancement - GPT-4 polishes grammar while maintaining authenticity
- 🎨 Automated Design - Generates professional 800x600px testimonial cards
- ☁️ Cloud Storage - Uploads to Google Drive with organized naming
- 📊 Database Logging - Tracks all testimonials in Google Sheets
- 🔔 Team Notifications - Slack alerts for new and approved testimonials
- ✅ Approval Workflow - Manual review before social media posting
- 🔄 Scheduled Checker - Auto-detects approved testimonials every 5 minutes

**Workflow Steps:**

Main Flow (Testimonial Processing):
1. Receives testimonial via webhook (POST request)
2. Validates and cleans data (name, testimonial, photo, email)
3. Enhances testimonial using GPT-4 Turbo
4. Generates HTML template with customer details
5. Converts HTML to PNG image (800x600px)
6. Uploads image to Google Drive
7. Logs all data to Google Sheets with "Pending Approval" status
8. Sends Slack notification to review team

Approval Flow (Scheduled Check):
1. Runs every 5 minutes automatically
2. Checks Google Sheets for approved testimonials
3. Filters testimonials not yet posted
4. Sends ready-to-post Slack notification with formatted text
5. Marks testimonial as processed in database

**Use Cases:**
- SaaS companies collecting customer feedback
- Marketing agencies managing client testimonials
- E-commerce businesses showcasing reviews
- Course creators featuring student success stories
- Any business automating social proof collection

**What Makes This Workflow Special:**
- **Zero manual design work** - Beautiful cards generated automatically
- **AI-powered quality** - Professional grammar enhancement
- **Audit trail** - Complete tracking in Google Sheets
- **Approval control** - Review before publishing
- **Duplicate prevention** - Smart matching by Drive ID
- **Flexible input** - Accepts multiple field name variations (see the sketch after this description)

🔧 **Required Integrations:**
- OpenAI API (GPT-4 Turbo) - AI testimonial enhancement
- HTML/CSS to Image API - Screenshot generation
- Google Drive OAuth2 - Image storage
- Google Sheets OAuth2 - Database management
- Slack API - Team notifications

📋 **Prerequisites:**
- n8n instance (self-hosted or cloud)
- OpenAI API key (https://platform.openai.com)
- HTML/CSS to Image account (https://htmlcsstoimg.com) - Free tier available
- Google Cloud project with Drive & Sheets API enabled
- Slack workspace with app permissions

🚀 **Setup Instructions:**

1. Import Workflow
   - Download the JSON file
   - Import into your n8n instance
   - Replace placeholder credentials (see below)

2. Configure Credentials — add these credentials in n8n:
   - **OpenAI API** - Your API key
   - **htmlcsstoimgApi** - User ID and API key
   - **Google Drive OAuth2** - Configure OAuth app
   - **Google Sheets OAuth2** - Same Google Cloud project
   - **Slack API** - Create Slack app with chat:write scope

3. Update Configuration — replace in the JSON:
   - **Google Drive Folder ID** - Your testimonial storage folder
   - **Google Sheets ID** - Your database spreadsheet
   - **Slack Channel ID** - Your notification channel
4. Test the Workflow — send a POST request to your webhook URL:
   { "name": "Sarah Johnson", "designation": "Marketing Director", "photo_url": "https://i.pravatar.cc/400?img=5", "testimonial_text": "Working with this team was amazing!", "email": "sample@gmail.com" }

📊 **Google Sheets Setup:**
Create a Google Sheet with these columns: Timestamp, Name, Designation, Original Testimonial, Testimonial (Enhanced), Image Link, Drive ID, Status, Email, Original Length, Enhanced Source, Posted to Social, Posted At

🎨 **Customization Options:**
- Modify the AI prompt for different enhancement styles
- Change HTML template colors/design
- Add more validation rules
- Integrate with Twitter/LinkedIn APIs for auto-posting
- Add email notifications instead of Slack
- Include a rating/score system
- Add custom approval fields

🆘 **Troubleshooting:**
- Webhook not receiving data: Check the webhook URL is correct, verify the HTTP method is POST, ensure Content-Type is application/json
- AI enhancement failing: Verify the OpenAI API key is valid, check API usage limits, ensure sufficient credits
- Image not generating: Confirm htmlcsstoimg credentials are correct, check the HTML template has no errors, verify you haven't exceeded the free tier limit
- Google Drive upload failing: Re-authenticate the OAuth2 connection, check the folder ID is correct, verify folder permissions

🏆 **Perfect For:**
Marketing teams, customer success teams, product managers, social media managers, growth hackers, agency owners

⚖️ **License:**
Free to use and modify for personal and commercial projects.
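Since the workflow advertises flexible input ("accepts multiple field name variations"), the validate-and-clean step might look like this sketch. The alias lists are illustrative — only the field names shown in the sample payload above are confirmed by the template.

```javascript
// n8n Code node sketch: normalize incoming webhook fields before AI enhancement.
const body = $input.first().json.body || $input.first().json;

// Return the first non-empty value among a set of possible field names
const pick = (...keys) => keys.map(k => body[k]).find(v => v != null && v !== '');

const name = pick('name', 'customer_name', 'full_name');
const testimonial = pick('testimonial_text', 'testimonial', 'message', 'review');
const photoUrl = pick('photo_url', 'photo', 'avatar_url');
const email = pick('email', 'customer_email');
const designation = pick('designation', 'title', 'role') || '';

if (!name || !testimonial) {
  throw new Error('Missing required fields: name and testimonial text');
}

return [{ json: { name, designation, testimonial, photoUrl, email, status: 'Pending Approval' } }];
```

Everything downstream (GPT-4 enhancement, card rendering, Sheets logging) can then rely on one consistent field set regardless of how the submission was shaped.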
by OwenLee
📚 In the social and behavioral sciences (e.g., psychology, sociology, economics, management), researchers and students often need to normalize academic paper metadata and extract variables before any literature review or meta-analysis.

🧩 This workflow automates the busywork. Using an LLM, it processes CSV/XLSX/XLS files (exported from WoS, Scopus, EndNote, Zotero, or your own spreadsheets) into normalized metadata and extracted variables, and writes a neat table to Google Sheets.

🔗 Example Google Sheet: click me

**👥 Who is this for?**
- 🎓 Undergraduate and graduate students or researchers in soft-science fields (psychology, sociology, economics, business)
- ⏱️ People who don't have time to read full papers and need quick overviews
- 📊 Anyone who wants to automate academic paper metadata normalization and variable extraction to speed up a literature review

**⚙️ How it works**
1. 📤 Upload an academic paper file (CSV/XLSX/XLS) in chat.
2. 📑 The workflow creates a Google Sheets spreadsheet with two tabs: Checkpoint and FinalResult.
3. 🔎 A structured-output LLM normalizes core metadata (title, abstract, authors, publication date, source) from the uploaded file and writes it to Checkpoint; 📧 a Gmail notification is sent when finished.
4. 🧪 A second structured-output LLM uses the metadata above to extract variables (Independent Variable, Dependent Variable) and writes them to FinalResult; 📧 you'll get a second Gmail notification when done.

**🛠️ How to set up**

🔑 Credentials:
- **Google Sheets OAuth2** (read/write)
- **Gmail OAuth2** (send notifications)
- **Google Gemini (or any LLM you prefer)**

🚀 Quick start:
1. Connect Google Sheets, Gmail, and Gemini (or your LLM) credentials.
2. Open File Upload Trigger → upload your CSV/XLSX/XLS file and type a name in chat (used as the Google Sheets spreadsheet title).
3. Watch your inbox for status emails and open the Google Sheets spreadsheet to review Checkpoint and FinalResult.

**🎛 Customization**
- 🗂️ Journal lists: Edit the Journal Rank Classifier code node to add/remove titles. The default list is for business/management journals—swap it for a list from your own field.
- 🔔 Notifications: Replace Gmail with Slack, Teams, or any channel you prefer.
- 🧠 LLM outputs: Need different metadata or extracted data? Edit the LLM's system prompt and Structured Output Parser.

**📝 Note**
- 📝 Make sure your file includes abstracts. If the academic paper data you upload doesn't contain an abstract, the extracted results will be far less useful (a simple pre-check sketch follows this description).
- 🧩 CSV yields no items? Encoding mismatches can break the workflow. If this happens, convert the CSV to .xls or .xlsx and try again.

**📩 Help**
Contact: owenlzyxg@gmail.com
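If you want to guard against the missing-abstract problem mentioned in the note, an optional pre-check Code node could filter rows before they reach the LLM. The column names here are assumptions — Web of Science and Scopus exports label the abstract column differently (e.g. "AB" vs "Abstract").

```javascript
// n8n Code node sketch: drop rows without an abstract and report how many were skipped.
const rows = $input.all();

const withAbstract = rows.filter(r => {
  const abstract = r.json.Abstract ?? r.json.abstract ?? r.json.AB ?? '';
  return String(abstract).trim().length > 0;
});

const skipped = rows.length - withAbstract.length;
if (skipped > 0) {
  console.log(`Skipping ${skipped} row(s) without abstracts; extraction quality would suffer.`);
}

return withAbstract;
```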
by Oneclick AI Squad
Simplify financial oversight with this automated n8n workflow. Triggered daily, it fetches cash flow and expense data from a Google Sheet, analyzes inflows and outflows, validates records, and generates a comprehensive daily report. The workflow sends multi-channel notifications via email and Slack, ensuring finance professionals stay updated with real-time financial insights. 💸📧

**Key Features**
- Daily automation keeps cash flow tracking current.
- Analyzes inflows and outflows for actionable insights.
- Multi-channel alerts enhance team visibility.
- Logs maintain a detailed record in Google Sheets.

**Workflow Process**
1. The Every Day node triggers a daily check at a set time.
2. **Get Cash Flow Data** retrieves financial data from a Google Sheet.
3. **Analyze Inflows & Outflows** processes the data to identify trends and totals (see the sketch after this description).
4. **Validate Records** ensures all entries are complete and accurate.
5. If records are valid, it branches to Send Email Daily Report (to finance team members) and Send Slack Alert (to notify the team instantly).
6. **Logs to Sheet** appends the summary data to a Google Sheet for tracking.

**Setup Instructions**
1. Import the workflow into n8n and configure Google Sheets OAuth2 for data access.
2. Set the daily trigger time (e.g., 9:00 AM IST) in the "Every Day" node.
3. Test the workflow by adding sample cash flow data and verifying reports.
4. Adjust analysis parameters as needed for specific financial metrics.

**Prerequisites**
- Google Sheets OAuth2 credentials
- Gmail API key for email reports
- Slack Bot Token (with chat:write permissions)
- Structured financial data in a Google Sheet

**Google Sheet Structure**
Create a sheet with columns: Date, Cash Inflow, Cash Outflow, Category, Notes, Updated At

**Modification Options**
- Customize the "Analyze Inflows & Outflows" node to include custom financial ratios.
- Adjust the "Validate Records" filter to flag anomalies or missing data.
- Modify email and Slack templates with branded formatting.
- Integrate with accounting tools (e.g., Xero) for live data feeds.
- Set different trigger times to align with your financial review schedule.

Discover more workflows – Get in touch with us
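A minimal sketch of the "Analyze Inflows & Outflows" step, using the sheet columns listed above (Date, Cash Inflow, Cash Outflow, Category); the top-categories logic is an illustrative addition rather than part of the template.

```javascript
// n8n Code node sketch: totals, net cash flow, and top spending categories.
const rows = $input.all().map(i => i.json);

const totalInflow = rows.reduce((s, r) => s + Number(r['Cash Inflow'] || 0), 0);
const totalOutflow = rows.reduce((s, r) => s + Number(r['Cash Outflow'] || 0), 0);

// Aggregate outflow by category to surface the biggest spend buckets
const byCategory = {};
for (const r of rows) {
  const cat = r.Category || 'Uncategorized';
  byCategory[cat] = (byCategory[cat] || 0) + Number(r['Cash Outflow'] || 0);
}
const topCategories = Object.entries(byCategory)
  .sort((a, b) => b[1] - a[1])
  .slice(0, 3)
  .map(([category, spend]) => ({ category, spend }));

return [{ json: { totalInflow, totalOutflow, netCashFlow: totalInflow - totalOutflow, topCategories } }];
```

These summary fields are what the email report, Slack alert, and log sheet would consume downstream.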
by Yaron Been
Stay ahead in your job search with this Automated Job Intelligence System! This workflow scans company career pages daily for new job listings, uses AI to analyze job relevance and seniority levels, and sends personalized email alerts for high-priority opportunities while maintaining a comprehensive job database. Perfect for job seekers, recruiters, and career coaches tracking ideal opportunities across target companies.

**What This Template Does**
1. Triggers daily at 9 AM to scan for new job opportunities.
2. Retrieves company URLs from a Google Sheets database.
3. Uses the Decodo scraper to extract job listings from career pages.
4. Analyzes job data with AI to identify company names and positions.
5. Converts job data into individual listing items for processing.
6. Compares new jobs against the existing database to filter duplicates.
7. Uses AI to assign relevance scores, seniority levels, and tech stack analysis.
8. Filters high-relevance jobs (score >8/10) for priority alerts (see the sketch after this description).
9. Stores all new jobs in Google Sheets for historical tracking.
10. Sends personalized email alerts for high-relevance opportunities.

**Key Benefits**
- Automated daily scanning of target company career pages
- AI-powered relevance scoring and job matching
- Duplicate prevention to avoid redundant notifications
- Comprehensive job database for tracking and analysis
- Personalized alerts for high-priority opportunities
- Time-saving automation for job search activities

**Features**
- Daily automated scheduling for consistent monitoring
- AI-powered job extraction and data structuring
- Decodo web scraping for reliable career page access
- Intelligent relevance scoring and seniority analysis
- Duplicate detection and filtering
- Google Sheets integration for job tracking
- Personalized email alerts for premium opportunities
- Multi-company monitoring capability
- Historical job data maintenance

**Requirements**
- Decodo API credentials for web scraping
- OpenAI API credentials for AI analysis
- Google Sheets OAuth2 credentials with edit access
- Gmail OAuth2 credentials for email alerts
- Slack Bot Token for error notifications (optional)
- Environment variables for configuration settings

**Target Audience**
- Job seekers and career changers
- Recruitment and talent acquisition teams
- Career coaches and placement agencies
- HR professionals monitoring competitor hiring
- Technology professionals tracking market opportunities
- University career services and placement offices

**Step-by-Step Setup Instructions**
1. Connect Decodo API credentials for career page scraping
2. Set up OpenAI credentials for job relevance analysis
3. Configure Google Sheets with company URLs and job tracking sheets
4. Add Gmail credentials for personalized job alerts
5. Optional: Set up Slack for error notifications
6. Populate company URLs in your Google Sheets database
7. Test with sample companies to verify job extraction and analysis
8. Customize relevance thresholds for your job preferences
9. Activate for daily automated job intelligence monitoring

Pro Tip: Use coupon code "YARON" to get 23K requests for testing (in Decodo)

This workflow transforms your job search with automated intelligence, smart filtering, and personalized opportunity alerts!
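The duplicate filter and the >8/10 relevance gate referenced above could be combined in a single Code node like this sketch. The node name `Get Existing Jobs`, the dedup key (company + title), and the `relevanceScore` field added by the AI step are assumptions for illustration.

```javascript
// n8n Code node sketch: drop jobs already in the sheet, then keep only
// high-relevance listings for the priority email alert.
const existing = new Set(
  $('Get Existing Jobs').all().map(i => `${i.json.company}|${i.json.title}`.toLowerCase())
);

const newJobs = $input.all().filter(item => {
  const key = `${item.json.company}|${item.json.title}`.toLowerCase();
  return !existing.has(key);
});

// Only roles scored above 8/10 by the AI analysis trigger an alert
return newJobs.filter(item => Number(item.json.relevanceScore) > 8);
```

Lowering the threshold (setup step 8) widens the alerts; the full `newJobs` set would still be appended to the tracking sheet in a parallel branch.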
by explorium
**Outbound Agent - AI-Powered Lead Generation with Natural Language Prospecting**

This n8n workflow transforms natural language queries into targeted B2B prospecting campaigns by combining Explorium's data intelligence with AI-powered research and personalized email generation. Simply describe your ideal customer profile in plain English, and the workflow automatically finds prospects, enriches their data, researches them, and creates personalized email drafts.

**DEMO:** Template Demo

**Credentials Required**
To use this workflow, set up the following credentials in your n8n environment:

Anthropic API
- **Type:** API Key
- **Used for:** AI Agent query interpretation, email research, and email writing
- Get your API key at Anthropic Console

Explorium API
- **Type:** Generic Header Auth
- **Header:** Authorization
- **Value:** Bearer YOUR_API_KEY
- **Used for:** Prospect matching, contact enrichment, professional profiles, and MCP research
- Get your API key at Explorium Dashboard

Explorium MCP
- **Type:** HTTP Header Auth
- **Used for:** Real-time company and prospect intelligence research
- Connect to: https://mcp.explorium.ai/mcp

Gmail
- **Type:** OAuth2
- **Used for:** Creating email drafts
- Alternative options: Outlook, Mailchimp, SendGrid, Lemlist

Go to Settings → Credentials, create these credentials, and assign them in the respective nodes before running the workflow.

**Workflow Overview**

**Node 1: When chat message received**
This node creates an interactive chat interface where users can describe their prospecting criteria in natural language.
- **Type:** Chat Trigger
- **Purpose:** Accept natural language queries like "Get 5 marketing leaders at fintech startups who joined in the past year and have valid contact information"
- **Example Prompts:** "Find SaaS executives in New York with 50-200 employees", "Get marketing directors at healthcare companies", "Show me VPs at fintech startups with recent funding"

**Node 2: Chat or Refinement**
This code node manages the conversation flow, handling both initial user queries and validation error feedback.
- **Function:** Routes either the original chat input or validation error messages to the AI Agent
- **Dynamic Input:** Combines chatInput and errorInput fields
- **Purpose:** Creates a feedback loop for validation error correction

**Node 3: AI Agent**
The core intelligence node that interprets natural language and generates structured API calls.

Functionality:
- Interprets user intent from natural language queries
- Maps concepts to Explorium API filters (job levels, departments, company size, revenue, location, etc.)
- Generates valid JSON requests with precise filter criteria
- Handles off-topic queries with helpful guidance
- Connected to the MCP Client for real-time filter specifications

AI Components:
- **Anthropic Chat Model:** Claude Sonnet 4 for query interpretation
- **Simple Memory:** Maintains conversation context (100 message window)
- **Output Parser:** Structured JSON output with schema validation
- **MCP Client:** Connected to https://mcp.explorium.ai/mcp for Explorium specifications

System Instructions:
- Expert in converting natural language to Explorium API filters
- Can revise previous responses based on validation errors
- Strict adherence to allowed filter values and formats
- Default settings: mode: "full", size: 10000, page_size: 100, has_email: true

**Node 4: API Call Validation**
This code node validates the AI-generated API request against Explorium's filter specifications. (A sketch of this check appears after the Workflow Flow Summary below.)
Validation Checks:
- Filter key validity (only allowed filters from the approved list)
- Value format correctness (enums, ranges, country codes)
- No duplicate values in arrays
- Proper range structure for experience fields (total_experience_months, current_role_months)
- Required field presence

Allowed Filters:
- country_code, region_country_code, company_country_code, company_region_country_code
- company_size, company_revenue, company_age, number_of_locations
- google_category, naics_category, linkedin_category, company_name
- city_region_country, website_keywords
- has_email, has_phone_number
- job_level, job_department, job_title
- business_id, total_experience_months, current_role_months

Output:
- isValid: Boolean validation status
- validationErrors: Array of specific error messages

**Node 5: Is API Call Valid?**
Conditional routing node that determines the next step based on validation results.
- **If Valid:** Proceed to Explorium API: Fetch Prospects
- **If Invalid:** Route to Validation Prompter for correction

**Node 6: Validation Prompter**
Generates detailed error feedback for the AI Agent when validation fails. This creates a self-correcting loop where the AI learns from validation errors and regenerates compliant requests by routing back to Node 2 (Chat or Refinement).

**Node 7: Explorium API: Fetch Prospects**
Makes the validated API call to Explorium's prospect database.
- **Method:** POST
- **Endpoint:** /v1/prospects/fetch
- **Authentication:** Header Auth (Bearer token)
- **Input:** JSON with filters, mode, size, page_size, page
- **Returns:** Array of matched prospects with prospect IDs based on filter criteria

**Node 8: Pull Prospect IDs**
Extracts prospect IDs from the fetch response for bulk enrichment.
- **Input:** Full fetch response with prospect data
- **Output:** Array of prospect_id values formatted for the enrichment API

**Node 9: Explorium API: Contact Enrichment**
Single enrichment node that enhances prospect data with both contact and profile information.
- **Method:** POST
- **Endpoint:** /v1/prospects/enrich
- **Enrichment Types:** contacts, profiles
- **Authentication:** Header Auth (Bearer token)
- **Input:** Array of prospect IDs from Node 8
- **Returns:**
  - Contacts: professional emails (current, verified), phone numbers (mobile, work), email validation status, all available email addresses
  - Profiles: full professional history, current role details, company information, skills and expertise, education background, experience timeline, job titles and seniority levels

**Node 10: Clean Output Data**
Transforms and structures the enriched data for downstream processing.

**Node 11: Loop Over Items**
Iterates through each prospect to generate individualized research and emails.
- **Batch Size:** 1 (processes prospects one at a time)
- **Purpose:** Enable personalized research and email generation for each prospect
- **Loop Control:** Processes until all prospects are complete

**Node 12: Research Email**
AI-powered research agent that investigates each prospect using Explorium MCP.

Input Data:
- Prospect name, job title, company name, company website
- LinkedIn URL, job department, skills

Research Focus:
- Company automation tool usage (n8n, Zapier, Make, HubSpot, Salesforce)
- Data enrichment practices
- Tech stack and infrastructure (Snowflake, Segment, etc.)
- Recent company activity and initiatives
- Pain points related to B2B data (outdated CRM data, manual enrichment, static workflows)
- Public content (speaking engagements, blog posts, thought leadership)

AI Components:
- **Anthropic Chat Model1:** Claude Sonnet 4 for research
- **Simple Memory1:** Maintains research context
- **Explorium MCP1:** Connected to https://mcp.explorium.ai/mcp for real-time intelligence

Output:
- Structured JSON with research findings including automation tools, pain points, and personalization notes

**Node 13: Email Writer**
Generates personalized cold email drafts based on research findings.

Input Data:
- Contact info from Loop Over Items
- Current experience and skills
- Research findings from the Research Email agent
- Company data (name, website)

AI Components:
- **Anthropic Chat Model3:** Claude Sonnet 4 for email writing
- **Structured Output Parser:** Enforces a JSON schema with email, subject, message fields

Output Schema:
- email: Selected prospect email address (professional preferred)
- subject: Compelling, personalized subject line
- message: HTML formatted email body

**Node 14: Create a draft (Gmail)**
Creates email drafts in Gmail for review before sending.
- **Resource:** Draft
- **Subject:** From Email Writer output
- **Message:** HTML formatted email body
- **Send To:** Selected prospect email address
- **Authentication:** Gmail OAuth2
- After creation, loops back to Node 11 (Loop Over Items) to process the next prospect

Alternative Output Options:
- **Outlook:** Create drafts in Microsoft Outlook
- **Mailchimp:** Add to an email campaign
- **SendGrid:** Queue for sending
- **Lemlist:** Add to a cold email sequence

**Workflow Flow Summary**
1. Input: User describes target prospects in natural language via the chat interface
2. Interpret: AI Agent converts the query to structured Explorium API filters using MCP
3. Validate: API call validation ensures filter compliance
4. Refine: If invalid, an error feedback loop helps the AI correct the request
5. Fetch: Retrieve matching prospect IDs from the Explorium database
6. Enrich: Parallel bulk enrichment of contact details and professional profiles
7. Clean: Transform and structure enriched data
8. Loop: Process each prospect individually
9. Research: AI agent uses Explorium MCP to gather company and prospect intelligence
10. Write: Generate a personalized email based on research
11. Draft: Create reviewable email drafts in the preferred platform

This workflow eliminates manual prospecting work by combining natural language processing, intelligent data enrichment, automated research, and personalized email generation—taking you from "I need marketing leaders at fintech companies" to personalized, research-backed email drafts in minutes.
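As referenced in Node 4, here is a minimal sketch of the filter validation step. The allowed-filter list mirrors the one documented above; the error message wording and the assumed shape of the AI Agent's output (`{ filters, mode, size, ... }`) are illustrative.

```javascript
// n8n Code node sketch: check the AI-generated request against the allowed filters.
const ALLOWED_FILTERS = new Set([
  'country_code', 'region_country_code', 'company_country_code', 'company_region_country_code',
  'company_size', 'company_revenue', 'company_age', 'number_of_locations',
  'google_category', 'naics_category', 'linkedin_category', 'company_name',
  'city_region_country', 'website_keywords',
  'has_email', 'has_phone_number',
  'job_level', 'job_department', 'job_title',
  'business_id', 'total_experience_months', 'current_role_months',
]);

const request = $input.first().json; // assumed AI Agent output: { filters: {...}, mode, size, page_size }
const validationErrors = [];

for (const [key, value] of Object.entries(request.filters || {})) {
  if (!ALLOWED_FILTERS.has(key)) {
    validationErrors.push(`Unknown filter: ${key}`);
  }
  // No duplicate values allowed in array filters
  if (Array.isArray(value) && new Set(value).size !== value.length) {
    validationErrors.push(`Duplicate values in filter: ${key}`);
  }
  // Experience filters must be range objects, e.g. { gte: 12, lte: 60 }
  if (['total_experience_months', 'current_role_months'].includes(key) &&
      (typeof value !== 'object' || Array.isArray(value) || value === null)) {
    validationErrors.push(`Filter ${key} must be a range object`);
  }
}

return [{ json: { ...request, isValid: validationErrors.length === 0, validationErrors } }];
```

Node 5 branches on `isValid`, and `validationErrors` is what the Validation Prompter feeds back to the AI Agent for self-correction.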
**Customization Options**

Flexible Triggers — the chat interface can be replaced with:
- Scheduled runs for recurring prospecting
- Webhook triggers from CRM updates
- Manual execution for ad-hoc campaigns

Scalable Enrichment — adjust enrichment depth by:
- Adding more Explorium API endpoints (technographics, funding, news)
- Configuring prospect batch sizes
- Customizing data cleaning logic

Output Destinations — route emails to your preferred platform:
- **Email Platforms:** Gmail, Outlook, SendGrid, Mailchimp
- **Sales Tools:** Lemlist, Outreach, SalesLoft
- **CRM Integration:** Salesforce, HubSpot (create leads with research)
- **Collaboration:** Slack notifications, Google Docs reports

AI Model Flexibility — swap AI providers based on your needs:
- Default: Anthropic Claude (Sonnet 4)
- Alternatives: OpenAI GPT-4, Google Gemini

**Setup Notes**
- Domain Filtering: The workflow prioritizes professional emails—customize the email selection logic in the Clean Output Data node
- MCP Configuration: Explorium MCP requires Header Auth setup—ensure credentials are properly configured
- Rate Limits: Adjust the Loop Over Items batch size if hitting API rate limits
- Memory Context: Simple Memory maintains conversation history—increase the window length for longer sessions
- Validation: The AI self-corrects through validation loops—monitor early runs to ensure filter accuracy

This workflow represents a complete AI-powered sales development representative (SDR) that handles prospecting, research, and personalized outreach with minimal human intervention.