by JKingma
🛍️ Automated Product Description Generation for Adobe Commerce (Magento 2)

Description

This n8n template demonstrates how to automatically generate product descriptions for items in Adobe Commerce (Magento 2) that are missing one. The workflow retrieves product data, converts raw attribute values (like numeric IDs) into human-readable labels, and passes the enriched product data to an LLM (Azure OpenAI by default). The LLM generates a compelling description, which is then saved back to Magento using the API. This ensures all products have professional descriptions without manual writing effort.

Use cases include:
- Auto-generating missing descriptions for catalog completeness
- Creating consistent descriptions across large product datasets
- Reducing manual workload for content managers
- Tailoring descriptions for SEO and customer readability

Good to know
- All attribute options are resolved to human-readable labels before being sent to the LLM.
- The flow uses Azure OpenAI, but you can replace it with OpenAI, Anthropic, Gemini, or other LLM providers.
- The LLM prompt can be customised to adjust tone, length, SEO focus, or a specific brand style.
- Works out of the box with Adobe Commerce (Magento 2) APIs, but can be adapted for other e-commerce systems.

How it works

1. **Get Product from Magento**
   - Retrieves a product that has no description.
   - Collects all product attributes.
2. **Generate Description with LLM**
   - Resolves attribute option IDs into human-readable values (e.g. color_id = 23 → "Red").
   - Passes the readable product attributes to an Azure OpenAI model.
   - The LLM creates a clear, engaging product description.
   - The prompt can be customised (e.g. SEO-optimised, short catalog text, or marketing style).
3. **Save Description in Magento**
   - Updates the product via the Magento API with the generated description.
   - Ensures product data is enriched and visible in the webshop immediately.

How to use

1. Configure your Magento 2 API credentials in n8n.
2. Replace the Azure OpenAI node with another provider if needed.
3. Adjust the prompt to match your brand's tone of voice.
4. Run the workflow to automatically process products missing descriptions.

Requirements
- ✅ n8n instance (self-hosted or cloud)
- ✅ Adobe Commerce (Magento 2) instance with API access
- ✅ Azure OpenAI (or other LLM provider) credentials
- (Optional) Prompt customisations for SEO or brand voice

Customising this workflow

This workflow can be adapted for:
- **Other attributes**: Include or exclude attributes (e.g. only color & size for apparel).
- **Different LLMs**: Swap Azure OpenAI for OpenAI, Anthropic, Gemini, or any supported n8n AI node.
- **Prompt tuning**: Adjust instructions to generate shorter, longer, or SEO-rich descriptions.
- **Selective updates**: Target only specific categories (e.g. electronics, fashion).
- **Multi-language support**: Generate product descriptions in multiple languages for international shops.
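The attribute-resolution step described above (e.g. color_id = 23 → "Red") can be sketched in a few lines. This is an illustrative sketch, not the template's actual Code-node logic: the `resolve_attributes` helper and its field names are hypothetical, and in practice the option labels would be fetched from Magento's attribute APIs.

```python
def resolve_attributes(product, option_labels):
    """Replace raw attribute option IDs with human-readable labels.

    product: dict of attribute_code -> raw value (IDs or plain text)
    option_labels: dict of attribute_code -> {option_id (str): label}
    Attributes without a label map are passed through unchanged.
    """
    resolved = {}
    for code, value in product.items():
        labels = option_labels.get(code)
        if labels is not None:
            # Option IDs may arrive as ints or strings; normalise before lookup,
            # and fall back to the raw value if no label is known.
            resolved[code] = labels.get(str(value), value)
        else:
            resolved[code] = value
    return resolved
```

The resolved dictionary is what gets serialised into the LLM prompt, so the model sees "Red" rather than an opaque numeric ID.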
by Automate With Marc
Viral Marketing Reel & Autopost with Sora2 + Blotato

Create funny, ultra-realistic marketing reels on autopilot using n8n, Sora2, Blotato, and OpenAI. This beginner-friendly template generates a comedic video prompt, creates a 12-second Sora2 video, writes a caption, and auto-posts to Instagram/TikTok — all on a schedule.

🎥 Watch the full step-by-step tutorial: https://www.youtube.com/watch?v=lKZknEzhivo

What this template does

This workflow automates an entire short-form content production pipeline:
- **Scheduled Trigger**: Runs automatically at your chosen time (e.g., every evening at 7PM).
- **AI "Video Prompt Agent"**: Creates a cinematic, funny, 12-second Sora2 text-to-video prompt designed to promote a product (default: Sally's Coffee).
- **Insert Row (Data Table)**: Logs each generated video prompt for tracking, reuse, or inspiration.
- **Sora2 (via Wavespeed)**: Sends a POST request to generate a video, waits 30 seconds, then polls the prediction endpoint until the video is completed.
- **Blotato Integration**: Uploads the finished video to your connected social account(s) and automatically publishes or schedules the post.
- **Caption Generator**: Uses an AI agent to create an Instagram/TikTok-ready caption with relevant hashtags.

This turns n8n into a hands-free comedic marketing engine that writes, creates, and posts content for you.

Why it's useful
- Create daily or weekly marketing reels without filming, editing, or writing scripts.
- Experiment with new comedic formats, hooks, and product placements in seconds.
- Perfect for small businesses, agencies, creators, and social media managers.
- Demonstrates how to combine AI agents, Sora2, polling, and external posting services inside one workflow.

Requirements

Before running this template, configure:
- OpenAI API key (for the prompt agent & caption model)
- Wavespeed / Sora2 API credentials
- Blotato account connected to Instagram/TikTok (for posting)
- n8n Data Table (optional, or replace with your own)

⚠️ All credentials must be added manually after import. No real credentials are included in the template.

How it works

1. **Schedule Trigger**: Runs at a fixed time or interval.
2. **Video Prompt Agent (LangChain Agent)**: Generates a cinematic, realistic comedic video idea. Built with a detailed system prompt that ensures brand integration (e.g., Sally's Coffee) happens naturally.
3. **Insert Row (Data Table)**: Logs each generated prompt so future videos can be referenced or reused.
4. **Sora2 POST Request**: Sends the generated prompt to Sora2 via Wavespeed's /text-to-video endpoint.
5. **Wait 30s + GET Sora2 Result**: Polls the result until data.status === "completed"; continues looping while the status is still "processing".
6. **Upload Media (Blotato)**: Uploads the finished video file.
7. **Caption Generator**: Creates a funny, platform-ready Instagram/TikTok caption with hashtags.
8. **Create Post (Blotato)**: Publishes (or schedules) the video + caption.

Setup Instructions (Step-by-Step)

1. Import the template into n8n.
2. Open Video Prompt Agent → review or customize the brand name, style, and humor tone.
3. Add your OpenAI API credentials (for prompt generation and caption generation).
4. Add your Wavespeed/Sora2 credentials to the POST and GET nodes.
5. Connect your Blotato credential for uploading and posting.
6. (Optional) Replace the Data Table ID with your own table.
7. Adjust the Schedule Trigger time to your desired posting schedule.
8. Run once manually to confirm: the prompt is generated, the video is created, the caption is written, and the video uploads successfully.
9. Enable the workflow → your daily/weekly comedic autoposter is live.

Customization Ideas
- Change the brand from Sally's Coffee to any business, product, or influencer brand.
- Modify the prompt agent to enforce specific camera styles, settings, or comedic tones.
- Swap posting destinations: Blotato supports multiple networks—configure IG/TikTok/Facebook/YouTube Shorts.
- Add approval steps: insert a Slack/Telegram "approve before posting" step.
- Add analytics logging: store video URLs, captions, and AI cost estimates.

Troubleshooting
- **Sora video stuck in processing**: Increase the wait time or add another polling loop.
- **Upload fails**: Ensure the media URL exists and the Blotato account has posting permissions.
- **Caption empty**: Reconnect the OpenAI credential or check model availability.
- **Posting fails**: Confirm your Blotato API key is valid and linked to a connected account.

Category: Marketing, AI Video, Social Media Automation
Difficulty: Beginner–Intermediate
Core Nodes: LangChain Agent, HTTP Request, Wait, Data Table, Blotato, OpenAI
Includes: System prompts, polling logic, caption generator, posting workflow
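The wait-and-poll step can be sketched as a generic polling loop. This is a minimal illustration of the pattern only: `poll_until_complete` and the injected `fetch_status` callable are hypothetical names, and in the workflow the equivalent logic is built from the Wait node, the GET request, and an IF node checking data.status.

```python
import time

def poll_until_complete(fetch_status, wait_seconds=30, max_attempts=10):
    """Poll a prediction endpoint until the job reports 'completed'.

    fetch_status: callable returning the latest status payload as a dict
                  (e.g. a GET to the Wavespeed prediction endpoint).
    """
    for _ in range(max_attempts):
        result = fetch_status()
        if result.get("status") == "completed":
            return result            # payload includes the finished video URL
        time.sleep(wait_seconds)     # mirrors the workflow's "Wait 30s" node
    raise TimeoutError("video generation did not complete in time")
```

Capping the attempt count (rather than looping forever) is the same fix suggested under Troubleshooting for videos stuck in "processing".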
by Kendra McClanahan
Champion Migration Tracker

Automatically detect when your champion contacts change companies and respond with intelligent, personalized AI outreach before your competitors do.

THE PROBLEM

When champions move to new companies, sales teams lose track and miss high-value opportunities. Manual LinkedIn monitoring doesn't scale, and by the time you notice, the relationship has gone cold.

THE SOLUTION

This workflow automates champion migration tracking end-to-end, combining Explorium's data intelligence with Claude AI agents to maintain relationships and prioritize opportunities.

HOW IT WORKS

1. Automated Job Change Detection
- Uses Explorium person enrichment to detect when champions move companies
- Eliminates manual LinkedIn monitoring
- Triggers immediately when employment changes

2. Intelligent Company Enrichment
- Enriches new companies with Explorium data: firmographics, funding, tech stack, hiring velocity
- Checks if the company already exists in your CRM (Customer vs. Prospect)
- Identifies open opportunities and account owners

3. Multi-Dimensional Opportunity Scoring (0-100)
- **ICP Fit (40%)**: Company size, funding stage, revenue, tech stack alignment
- **Relationship Strength (40%)**: Past deals influenced, relationship warmth, CRM status
- **Timing (20%)**: Days at new company, recent funding/acquisition signals
- Results in a Hot/Warm/Cold priority classification

4. Smart Routing by Context
- **Customers**: Notify the account manager with a congratulations message
- **Hot Prospects (75+ score)**: Draft detailed strategic outreach for rep review

5. AI-Powered Personalization
- Claude AI agents generate contextually relevant emails
- References the past relationship, deals influenced, and company intelligence
- Adapts tone and content based on opportunity priority and CRM status

DEMO SETUP (Google Sheets)

This demo uses Google Sheets for simplicity. For production use, replace it with your actual CRM:
- Salesforce
- HubSpot
- Pipedrive
- Any CRM with an n8n integration

Important Fields to Consider:

Champions:
- champion_id, name, email, company, title, last_checked_date
- relationship_strength (Hot/Warm/Cold), last_contact_date, deals_influenced
- relationship_notes, isChampion (TRUE/FALSE), linkedin_url, explorium_prospect_id

Companies:
- company_ID, companyName, domain, relationship_type (Customer/Prospect/None)
- open_opportunity (TRUE/FALSE), opportunity_stage, account_owner, account_owner_email
- contractValue, notes, ExploriumBusinessID

REQUIRED CREDENTIALS
- **Anthropic API Key**: Powers Claude AI agents for email generation
- **Explorium API Key**: Provides person and company enrichment data
- **Google Sheets or your CRM (production)**: Data source and logging

SETUP INSTRUCTIONS
1. Connect credentials in n8n Settings → Credentials.
2. Update data sources: replace the Google Sheets nodes with your CRM nodes (or create demo sheets with the structure above).
3. Configure scoring: adjust the ICP scoring criteria in the "Score Company" node to match your ideal customer profile.
4. Test with sample data: run with 2-3 test champions to verify routing and email generation.
5. Schedule the trigger: set it to run daily or weekly based on your needs.

CUSTOMIZATION TIPS
- **Scoring Weights**: Adjust the 40/40/20 weighting in the scoring node to prioritize what matters most to your business
- **Tech Stack Matching**: Update the relevantTech array with tools your champions likely use
- **Email Tone**: Modify the Claude prompts to match your brand voice (formal, casual, technical, etc.)
- **Routing Logic**: Add additional branches for specific scenarios (e.g., churned customers, enterprise accounts)
- **Agentic Experience**: Consider adding an agent that sends the email for Cold prospects automatically
- **Integrations**: Add Slack notifications, CRM updates, or calendar booking links to the output

BUSINESS VALUE
- **Prevent Revenue Leakage**: Never lose track of champion relationships
- **Prioritize Intelligently**: Focus on opportunities with the highest potential
- **Scale Relationship Building**: Automate what used to require manual effort
- **Act Before Competitors**: Reach out while champions are still settling into new roles
- **Data-Driven Decisions**: Quantifiable scores replace gut feelings

USE CASES
- **Sales Teams**: Re-engage champions at new prospect companies
- **Customer Success**: Track champions who move to existing accounts
- **Account-Based Marketing**: Identify high-fit accounts through champion networks
- **Revenue Operations**: Automate champion tracking at scale

NOTES
- **Production Recommendation**: Replace Google Sheets with your production CRM for real-time data
- **Privacy**: All API keys are credential-referenced (not hardcoded) for security
- **Explorium Credits**: Person + company enrichment uses ~2-3 credits per champion
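The 40/40/20 weighting can be sketched as a simple function. The Hot cutoff of 75 comes from the description above; the Warm/Cold boundary of 50 and the function name are assumptions for illustration, not the template's actual "Score Company" node code.

```python
def score_opportunity(icp_fit, relationship, timing):
    """Combine sub-scores (each 0-100) using the 40/40/20 weighting.

    Returns (rounded total score, priority classification).
    Hot threshold of 75 matches the 'Hot Prospects (75+ score)' routing;
    the Warm/Cold boundary of 50 is an assumed example value.
    """
    total = 0.40 * icp_fit + 0.40 * relationship + 0.20 * timing
    if total >= 75:
        priority = "Hot"
    elif total >= 50:
        priority = "Warm"
    else:
        priority = "Cold"
    return round(total), priority
```

Tuning the three coefficients is exactly what the "Scoring Weights" customization tip refers to.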
by Omer Fayyaz
This n8n template implements a Calendly Booking Link Generator that creates single-use, personalized booking links, logs them to Google Sheets, and optionally notifies a Slack channel.

Who's it for

This template is designed for teams and businesses that send Calendly links proactively and want to generate trackable, single-use booking links on demand. It's perfect for:
- **Sales and SDR teams** sending 1:1 outreach and needing unique booking links per prospect
- **Customer success and support teams** who want prefilled, one-click rescheduling or follow-up links
- **Marketing and growth teams** that want UTM-tagged booking links for campaigns
- **Ops/RevOps** who need a central log of every generated link for tracking and reporting

How it works / What it does

This workflow turns a simple HTTP request into a fully configured single-use Calendly booking link:

1. Webhook Trigger (POST)
- Receives a JSON payload with recipient details: name, email, optional event_type_uri, optional utm_source

2. Configuration & Input Normalization
- Set Configuration extracts and normalizes: recipient_name, recipient_email, requested_event_type (can be empty), and utm_source (defaults to "n8n" if not provided)

3. Calendly API – User & Event Types
- Get Current User calls GET /users/me using Calendly OAuth2 to get the current user URI
- Extract User stores user_uri and user_name
- Get Event Types calls GET /event_types?user={user_uri}&active=true to fetch active event types
- Select Event Type uses requested_event_type if provided, otherwise selects the first active event type, and stores the event type URI, name, and duration (minutes)

4. Create Calendly Single-Use Scheduling Link
- Create Single-Use Link calls POST /scheduling_links with:
  - owner: selected event type URI
  - owner_type: "EventType"
  - max_event_count: 1 (single use)

5. Build Personalized Booking URL
- Build Personalized Link reads the base booking_url from Calendly and appends query parameters to prefill: name (encoded), email (encoded), utm_source
- Stores: base_booking_url, personalized_booking_url, recipient_name, recipient_email, event_type_name, event_duration, link_created_at (ISO timestamp)

6. Optional Logging and Notifications
- Log to Google Sheets (optional but preconfigured): appends each generated link to a "Generated Links" sheet with columns Recipient Name, Recipient Email, Event Type, Duration (min), Booking URL, Created At, Status
- Notify via Slack (optional): posts a nicely formatted Slack message with the recipient name & email, event name & duration, and a clickable booking link

7. API Response to Caller
- Respond to Webhook returns a structured JSON response: success, booking_url (personalized), base_url, recipient object, event object (name + duration), created_at, and an expires explanation ("Single-use or 90 days")

The result is an API-style service you can call from any system to generate trackable, single-use Calendly links.

How to set up

1. Calendly OAuth2 setup
- Go to calendly.com/integrations or developer.calendly.com
- Create an OAuth2 application (or use an existing one)
- In n8n, create Calendly OAuth2 credentials: add the client ID, client secret, and redirect URL as required by Calendly, then connect your Calendly user account
- In the workflow, make sure all Calendly HTTP Request nodes use your Calendly OAuth2 credential

2. Webhook Trigger configuration
- Open the Webhook Trigger node and confirm:
  - HTTP Method: POST
  - Path: generate-calendly-link
  - Response Mode: Response Node (points to Respond to Webhook)
- Copy the Production URL from the node once the workflow is active
- Use this URL as the endpoint for your CRM, outbound tool, or any system that needs to request links

Expected request body:

    {
      "name": "John Doe",
      "email": "john@example.com",
      "event_type_uri": "optional",
      "utm_source": "optional"
    }

If event_type_uri is not provided, the workflow automatically uses the first active event type for the current Calendly user.

3. Google Sheets setup (optional but recommended)
- Create a Google Sheet for tracking links
- Add a sheet/tab named e.g. "Generated Links"
- Set the header row to: Recipient Name, Recipient Email, Event Type, Duration (min), Booking URL, Created At, Status
- In n8n: create Google Sheets OAuth2 credentials, open the Log to Google Sheets node, and update documentId → your spreadsheet ID and sheetName → your tab name (e.g. "Generated Links")

4. Slack notification setup (optional)
- Create a Slack app at api.slack.com
- Add Bot Token scopes (for basic posting): chat:write, plus channels:read (or groups:read if posting to private channels)
- Install the app to your workspace and get the Bot User OAuth Token
- In n8n: create a Slack API credential using the bot token, open the Notify via Slack node, select your credential, and set select: channel and channelId to your desired channel (e.g. #sales or #booking-links)

5. Test the workflow end-to-end
- Activate the workflow
- Use Postman, curl, or another system to POST to the webhook URL, e.g.:

    {
      "name": "Test User",
      "email": "test@example.com"
    }

- Verify that:
  - The HTTP response contains a valid booking_url
  - A new row is added to your Google Sheet (if configured)
  - A Slack notification is posted (if configured)

Requirements
- **Calendly account** with at least one **active event type**
- **n8n instance** (cloud or self-hosted) with public access for the webhook
- **Calendly OAuth2 credentials** configured in n8n
- (Optional) Google Sheets account and OAuth2 credentials
- (Optional) Slack workspace with permissions to install a bot and post to channels

How to customize the workflow

Input & validation
- Update the Set Configuration node to enforce required fields (e.g. fail if email is missing) or add more optional parameters (e.g. utm_campaign, utm_medium, language)
- Add an IF node after the Webhook Trigger for stricter validation and custom error responses

Event type selection logic
- In Select Event Type, change the fallback selection rule (e.g. pick the longest or shortest duration event)
- Add logic to map a custom field (like event_key) to specific event type URIs

Link parameters & tracking
- In Build Personalized Link, add additional query parameters (e.g. utm_campaign, source, segment)
- Remove or rename existing parameters if needed
- If you don't want prefilled name/email, remove those query parameters and just keep the tracking fields

Google Sheets logging
- Extend the Log to Google Sheets mapping to include utm_source or other marketing attributes, the sales owner, campaign name, pipeline stage, or any additional fields you compute in previous nodes

Slack notification formatting
- In Notify via Slack, adjust the message text to your team's tone
- Add emojis or @mentions for certain event types
- Include utm_source or other metadata for debugging and tracking

Key features
- **Single-use Calendly links**: each generated link is limited to one booking (or expires after ~90 days)
- **Prefilled recipient details**: name and email are embedded in the URL, making it frictionless to book
- **Webhook-first design**: easily call this from CRMs, outreach tools, or any external system
- **Central link logging**: every link is stored in Google Sheets for auditing and reporting
- **Optional Slack alerts**: keep sales/support teams notified when new links are generated
- **Safe error handling**: HTTP nodes are configured with continueRegularOutput to avoid hard workflow failures

Example scenarios

Scenario 1: Sales outreach
A CRM workflow triggers when a lead moves to "Meeting Requested". It calls this n8n webhook with the lead's name and email. The workflow generates a single-use Calendly link, logs it to Sheets, and posts to Slack. The CRM sends an email to the lead with the personalized booking link.

Scenario 2: Automated follow-up link
A support ticket is resolved and the system wants to offer a follow-up call. It calls the webhook with name, email, and a dedicated event_type_uri for "Follow-up Call".
The generated link is logged and returned via API, then included in an automated email.

Scenario 3: Campaign tracking
A marketing automation tool triggers this webhook for each contact in a campaign, passing utm_source (e.g. q1-outbound). The workflow adds utm_source to the link and logs it in Google Sheets. Later, you can analyze which campaigns generated the most completed bookings from single-use links.

This template gives you a reliable, reusable Calendly link generation service that plugs into any part of your stack, while keeping tracking, logging, and team visibility fully automated.
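The Build Personalized Link step boils down to URL-encoding the prefill and tracking parameters onto the base booking_url returned by Calendly. A minimal sketch follows; the function name is hypothetical, and in the workflow this is done with an expression or Code node rather than a standalone script.

```python
from urllib.parse import urlencode

def build_personalized_link(base_booking_url, name, email, utm_source="n8n"):
    """Append prefill (name, email) and tracking (utm_source) parameters.

    urlencode handles the escaping, so spaces and '@' are safe in the URL.
    The "n8n" default mirrors the Set Configuration fallback.
    """
    params = urlencode({"name": name, "email": email, "utm_source": utm_source})
    return f"{base_booking_url}?{params}"
```

Extra tracking fields (utm_campaign, segment, etc.) would just be additional keys in the dict passed to urlencode.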
by Abdullah Alshiekh
🧩 What Problem Does It Solve?

In real estate, inquiries come from many sources and often require immediate, personalized attention. Brokers waste significant time manually:
- **Qualifying leads**: Determining if a prospect's budget, neighborhood, and needs match available inventory.
- **Searching listings**: Cross-referencing customer criteria against a large, static database.
- **Data entry**: Moving contact details and search summaries into a CRM like Zoho.
- **Initial follow-up**: Sending an email to confirm the submission and schedule the next step.

🛠️ How to Configure It

Jotform & CRM Setup
- **Jotform Trigger**: Replace the placeholder with your specific Jotform ID.
- **Zoho CRM**: Replace the placeholder TEMPLATED_COMPANY_NAME with your actual company name.
- **Gmail**: Replace the placeholder Calendly link YOUR_CALENDLY_LINK in the Send a message node with your real estate consultant's booking link.

Database & AI Setup
- **Google Sheets**: Replace YOUR_GOOGLE_SHEET_DOCUMENT_ID and YOUR_SHEET_GID_OR_NAME in both Google Sheets nodes. Your listings must be structured with columns matching the AI prompt (e.g., bedrooms, rent, neighborhoods).
- **AI Models**: Ensure your Google Gemini API key is linked to the Google Gemini Chat Model node.
- **AI Agent Prompt**: The included prompt contains the exact matching and scoring rules for the AI. You can edit this prompt to refine how the AI prioritizes factors like supplier_rating or neighborhood proximity.

🧠 Use Case Examples
- **Small Startups**: Collect high-quality leads. New inquiries must be quickly logged for sales follow-up, but manual entry is slow.
- **B2B Sales**: High-value lead enrichment. Prioritize leads that match specific product requirements and budget tiers.
- **Travel/Hospitality**: Personalized itinerary matching. Quickly match customer preferences (e.g., dates, group size, activity level) to available packages.
- **E-commerce**: Manual product recommendation. Sales teams manually recommend expensive, configurable items (e.g., furniture, specialized equipment).

If you need any help, get in touch.
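As a rough illustration of the kind of matching rules the AI agent prompt encodes, a hard-coded filter over the listings sheet might look like the sketch below. The field names (rent, bedrooms, neighborhood) follow the columns mentioned above, but the helper itself is hypothetical; the template delegates this logic, including softer scoring factors like supplier_rating, to the Gemini-powered agent.

```python
def match_listings(listings, budget, bedrooms, neighborhoods):
    """Filter listings against a lead's hard criteria, cheapest first.

    listings: list of dicts with 'rent', 'bedrooms', 'neighborhood' keys
    neighborhoods: set of acceptable neighborhood names
    """
    matches = [
        listing for listing in listings
        if listing["rent"] <= budget
        and listing["bedrooms"] >= bedrooms
        and listing["neighborhood"] in neighborhoods
    ]
    return sorted(matches, key=lambda listing: listing["rent"])
```

The advantage of the AI agent over a rigid filter like this is that it can trade off near-misses (e.g., a slightly over-budget unit in the exact neighborhood) instead of discarding them outright.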
by Samir Saci
Tags: Image Compression, Tinify API, TinyPNG, SEO Optimisation, E-commerce, Marketing

Context

Hi! I'm Samir Saci, a Supply Chain Engineer and Data Scientist based in Paris, and the founder of LogiGreen. I built this workflow for an agency specialising in e-commerce to automate the daily compression of their images stored in a Google Drive folder. This is particularly useful when managing large libraries of product photos, website assets, or marketing visuals that need to stay lightweight for SEO, website performance, or storage optimisation.

> Test this workflow with the free tier of the API!

📬 For business inquiries, you can find me on LinkedIn.

Who is this template for?

This template is designed for:
- **E-commerce managers** who need to keep product images optimised
- **Marketing teams** handling large volumes of visuals
- **Website owners** wanting automatic image compression for SEO
- **Anyone using Google Drive** to store images that gradually become too heavy

What does this workflow do?

This workflow acts as an automated image compressor and reporting system using Tinify, Google Drive, and Gmail:
1. Runs every day at 08:00 using a Schedule Trigger
2. Fetches all images from the Google Drive Input folder
3. Downloads each file and sends it to the Tinify API for compression
4. Downloads the optimised image and saves it to the Compressed folder
5. Moves the original file to the Original Images archive
6. Logs fileName, originalSize, compressedSize, imageId, outputUrl, and processingId into a Data Table
7. After processing, retrieves all logs for the current batch
8. Generates a clean HTML report summarising the compression results
9. Sends the report via Gmail, including the total space saved

Here is an example from my personal folder:

Here is the report generated for these images:

P.S.: You can customise the report to match your company branding or visual identity.
🎥 Tutorial

A complete tutorial (with explanations of every node) is available on YouTube.

Next Steps

Before running the workflow, follow the sticky notes and configure the following:
1. Get your Tinify API key for the free tier here: Get your key
2. Replace the Google Drive folder IDs in: Input, Compressed, and Original Images
3. Replace the Data Table reference with your own (fields required: fileName, originalSize, compressedSize, imageId, outputUrl, processingId)
4. Add your Tinify API key in the HTTP Basic Auth credentials
5. Set up your Gmail credentials and recipient email
6. (Optional) Customise the HTML report in the Generate Report Code node
7. (Optional) Adjust the daily schedule to your preferred time

Submitted: 18 November 2025
Template designed with n8n version 1.116.2
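The "total space saved" figure in the email report is essentially an aggregation over the Data Table logs. A minimal sketch, assuming the log fields listed above (originalSize and compressedSize in bytes); the helper name and output keys are illustrative, not the template's actual Generate Report Code-node output:

```python
def summarize_batch(logs):
    """Aggregate per-file compression logs into report totals.

    logs: list of dicts with 'originalSize' and 'compressedSize' (bytes).
    Returns file count, total bytes saved, and percentage saved.
    """
    total_original = sum(entry["originalSize"] for entry in logs)
    total_compressed = sum(entry["compressedSize"] for entry in logs)
    saved = total_original - total_compressed
    # Guard against an empty batch to avoid division by zero.
    pct = round(100 * saved / total_original, 1) if total_original else 0.0
    return {"files": len(logs), "bytesSaved": saved, "percentSaved": pct}
```

The same totals can then be interpolated into the HTML report template before it is sent via Gmail.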
by Jitesh Dugar
Revolutionize university admissions with intelligent AI-driven application evaluation that analyzes student profiles, calculates eligibility scores, and automatically routes decisions - saving 2.5 hours per application and reducing decision time from weeks to hours. 🎯 What This Workflow Does Transforms your admissions process from manual application review to intelligent automation: 📝 Captures Applications - Jotform intake with student info, GPA, test scores, essay, extracurriculars 🤖 AI Holistic Evaluation - OpenAI analyzes academic strength, essay quality, extracurriculars, and fit 🎯 Intelligent Scoring - Evaluates students using 40% academics, 25% extracurriculars, 20% essay, 15% fit (0-100 scale) 🚦 Smart Routing - Automatically routes based on AI evaluation: Auto-Accept (95-100)**: Acceptance letter with scholarship details → Admin alert → Database Interview Required (70-94)**: Interview invitation with scheduling link → Admin alert → Database Reject (<70)**: Respectful rejection with improvement suggestions → Database 💰 Scholarship Automation - Calculates merit scholarships ($5k-$20k+) based on eligibility score 📊 Analytics Tracking - All applications logged to Google Sheets for admissions insights ✨ Key Features AI Holistic Evaluation: Comprehensive analysis weighing academics, extracurriculars, essays, and institutional fit Intelligent Scoring System: 0-100 eligibility score with automated categorization and scholarship determination Structured Output: Consistent JSON schema with academic strength, admission likelihood, and decision reasoning Automated Communication: Personalized acceptance, interview, and rejection letters for every applicant Fallback Scoring: Manual GPA/SAT scoring if AI fails - ensures zero downtime Admin Alerts: Instant email notifications for exceptional high-scoring applicants (95+) Comprehensive Analytics: Track acceptance rates, average scores, scholarship distribution, and applicant demographics Customizable Criteria: Easy prompt 
editing to match your institution's values and requirements 💼 Perfect For Universities & Colleges: Processing 500+ undergraduate applications per semester Graduate Programs: Screening master's and PhD applications with consistent evaluation Private Institutions: Scaling admissions without expanding admissions staff Community Colleges: Handling high-volume transfer and new student applications International Offices: Evaluating global applicants 24/7 across all timezones Scholarship Committees: Identifying merit scholarship candidates automatically 🔧 What You'll Need Required Integrations Jotform - Application form with student data collection (free tier works) Create your form for free on Jotform using this link Create your application form with fields: Name, Email, Phone, GPA, SAT Score, Major, Essay, Extracurriculars OpenAI API - GPT-4o-mini for cost-effective AI evaluation (~$0.01-0.05 per application) Gmail - Automated applicant communication (acceptance, interview, rejection letters) Google Sheets - Application database and admissions analytics Optional Integrations Slack - Real-time alerts for exceptional applicants Calendar APIs - Automated interview scheduling Student Information System (SIS) - Push accepted students to enrollment system Document Analysis Tools - OCR for transcript verification 🚀 Quick Start Import Template - Copy JSON and import into n8n (requires LangChain support) Create Jotform - Use provided field structure (Name, Email, GPA, SAT, Major, Essay, etc.) 
Add API Keys - OpenAI, Jotform, Gmail OAuth2, Google Sheets
Customize AI Prompt - Edit admissions criteria with your university's specific requirements and values
Set Score Thresholds - Adjust auto-accept (95+), interview (70-94), reject (<70) cutoffs if needed
Personalize Emails - Update templates with your university branding, dates, and contact info
Create Google Sheet - Set up columns: id, Name, Email, GPA, SAT Score, Major, Essay, Extracurriculars
Test & Deploy - Submit a test application with pinned data and verify all nodes execute correctly

🎨 Customization Options
Adjust Evaluation Weights: Change academics (40%), extracurriculars (25%), essay (20%), fit (15%) percentages
Multiple Programs: Clone the workflow for different majors with unique evaluation criteria
Add Document Analysis: Integrate OCR for transcript and recommendation letter verification
Interview Scheduling: Connect Google Calendar or Calendly for automated booking
SIS Integration: Push accepted students directly to Banner, Ellucian, or PeopleSoft
Waitlist Management: Add conditional routing for borderline scores (65-69)
Diversity Tracking: Include demographic fields and bias detection in AI evaluation
Financial Aid Integration: Automatically calculate need-based aid eligibility alongside merit scholarships

📈 Expected Results
90% reduction in manual application review time (from 2.5 hours to 15 minutes per application)
24-48 hour decision turnaround vs. the 4-6 week traditional process
40% higher yield rate - faster responses increase enrollment commitment
100% consistency - every applicant evaluated with identical criteria
Zero missed applications - automated tracking ensures no application falls through the cracks
Data-driven admissions - comprehensive analytics on applicant pools and acceptance patterns
Better applicant experience - professional, timely communication regardless of decision
Defensible decisions - documented scoring criteria for accreditation and compliance

🏆 Use Cases
Large Public Universities: Screen 5,000+ applications per semester, identify the top 20% for auto-admit, route borderline cases to committee review.
Selective Private Colleges: Evaluate 500+ highly competitive applications, calculate merit scholarships automatically, schedule interviews with top candidates.
Graduate Programs: Process master's and PhD applications with research-experience weighting, flag candidates for faculty review, automate fellowship awards.
Community Colleges: Handle high-volume open enrollment while identifying honors program candidates and scholarship recipients instantly.
International Admissions: Evaluate global applicants 24/7, account for different GPA scales and testing systems, respond same-day regardless of timezone.
Rolling Admissions: Provide instant decisions for early applicants, fill classes strategically, optimize scholarship budget allocation.

💡 Pro Tips
Calibrate Your AI: After 100+ applications, refine evaluation criteria based on enrolled-student success
A/B Test Thresholds: Experiment with score cutoffs (e.g., 93 vs. 95 for auto-admit) to optimize yield
Build Waitlist Pipeline: Keep 70-84 score candidates engaged for spring enrollment or next year
Track Source Effectiveness: Add UTM parameters to measure which recruiting channels deliver the best students
Committee Review: Route 85-94 scores to a human admissions committee for final review
Bias Audits: Quarterly review of AI decisions by demographic group to ensure fairness
Parent Communication: Add parent/guardian emails for admitted students under 18
Financial Aid Coordination: Sync scholarship awards with the financial aid office for packaging

🎓 Learning Resources
This workflow demonstrates:
**AI Agents with structured output** - LangChain integration for consistent JSON responses
**Multi-stage conditional routing** - IF nodes for three-tier decision logic
**Holistic evaluation** - Weighted scoring across multiple dimensions
**Automated communication** - HTML email templates with dynamic content
**Real-time notifications** - Admin alerts for high-value applicants
**Analytics and data logging** - Google Sheets integration for reporting
**Fallback mechanisms** - Manual scoring when AI is unavailable
Perfect for learning advanced n8n automation patterns in educational technology!

🔐 Compliance & Ethics
FERPA Compliance: Protects student data with secure credential handling
Fair Admissions: Documented criteria eliminate unconscious bias
Human Oversight: Committee review option for borderline cases
Transparency: Applicants can request evaluation criteria
Appeals Process: Structured workflow for decision reconsideration
Data Retention: Configurable Google Sheets retention policies

📊 What Gets Tracked
Application submission date and time
Complete student profile (GPA, test scores, major, essay, activities)
AI eligibility score (0-100) and decision category
Academic strength rating (excellent/strong/average)
Scholarship eligibility and amount ($0-$20,000+)
Admission likelihood (high/medium/low)
Decision outcome (accepted/interview/rejected)
Email delivery status and open rates
Time from application to decision

Ready to transform your admissions process? Import this template and start evaluating applications intelligently in under 1 hour. Questions or customization needs? The workflow includes detailed sticky notes explaining each section and comprehensive fallback logic for reliability.
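The weighted evaluation and three-tier routing described above can be sketched as a small n8n Code-node function. The component field names and example scores below are illustrative assumptions, not the template's actual schema:

```javascript
// Hypothetical sketch of the weighted-score logic an n8n Code node could use.
// Component scores are assumed to be 0-100; field names are placeholders.
const WEIGHTS = { academics: 0.40, extracurriculars: 0.25, essay: 0.20, fit: 0.15 };

function eligibilityScore(components) {
  // Weighted sum across the four evaluation dimensions.
  return Object.entries(WEIGHTS)
    .reduce((sum, [key, w]) => sum + w * (components[key] ?? 0), 0);
}

function decision(score) {
  // Three-tier routing mirroring the 95+ / 70-94 / <70 cutoffs above.
  if (score >= 95) return 'accepted';
  if (score >= 70) return 'interview';
  return 'rejected';
}

const score = eligibilityScore({ academics: 98, extracurriculars: 95, essay: 90, fit: 92 });
console.log(score.toFixed(1), decision(score));
```

Changing the weights or cutoffs here is all the "Adjust Evaluation Weights" and "Set Score Thresholds" customizations amount to.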
by vinci-king-01
Certification Requirement Tracker with Rocket.Chat and GitLab

⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow automatically monitors the websites of certification bodies and industry associations, detects changes in certification requirements, commits the updated information to a GitLab repository, and notifies a Rocket.Chat channel. Ideal for professionals and compliance teams who must stay ahead of annual updates and renewal deadlines.

Pre-conditions/Requirements

Prerequisites
Running n8n instance (self-hosted or n8n.cloud)
ScrapeGraphAI community node installed and active
Rocket.Chat workspace (self-hosted or cloud)
GitLab account and repository for documentation
Publicly reachable URL for incoming webhooks (use n8n tunnel, Ngrok, or a reverse proxy)

Required Credentials
**ScrapeGraphAI API Key** – Enables scraping of certification pages
**Rocket.Chat Access Token & Server URL** – To post update messages
**GitLab Personal Access Token** – With api and write_repository scopes

Specific Setup Requirements

| Item | Example Value | Notes |
| --- | --- | --- |
| GitLab Repo | gitlab.com/company/cert-tracker | Markdown files will be committed here |
| Rocket.Chat Channel | #certification-updates | Receives update alerts |
| Certification Source URLs file | /data/sourceList.json in the repository | List of URLs to scrape |

How it works
Key Steps:
**Webhook Trigger**: Fires on a scheduled HTTP call (e.g., via cron) or manual trigger.
**Code (Prepare Source List)**: Reads/constructs a list of certification URLs to scrape.
**ScrapeGraphAI**: Fetches HTML content and extracts requirement sections.
**Merge**: Combines newly scraped data with the last committed snapshot.
**IF Node**: Determines if a change occurred (hash/length comparison).
**GitLab**: Creates a branch, commits updated Markdown/JSON files, and opens an MR (optional).
**Rocket.Chat**: Posts a message summarizing changes and linking to the GitLab diff.
**Respond to Webhook**: Returns a JSON summary to the requester (useful for monitoring or chained automations).

Set up steps
Setup Time: 20-30 minutes
Install Community Node: In the n8n UI, go to Settings → Community Nodes and install @n8n/community-node-scrapegraphai.
Create Credentials:
a. ScrapeGraphAI – paste your API key.
b. Rocket.Chat – create a personal access token (Personal Access Tokens → New Token) and configure credentials.
c. GitLab – create a PAT with api + write_repository scopes and add it to n8n.
Clone the Template: Import this workflow JSON into your n8n instance.
Edit StickyNote: Replace placeholder URLs with actual certification-source URLs or point to a repo file.
Configure GitLab Node: Set your repository, default branch, and commit message template.
Configure Rocket.Chat Node: Select credential, channel, and message template (markdown supported).
Expose Webhook: If self-hosting, enable the n8n tunnel or configure a reverse proxy to make the webhook public.
Test Run: Trigger the workflow manually; verify the GitLab commit/MR and Rocket.Chat notification.
Automate: Schedule an external cron (or the n8n Cron node) to POST to the webhook yearly, quarterly, or monthly as needed.

Node Descriptions

Core Workflow Nodes:
**stickyNote** – Human-readable instructions/documentation embedded in the flow.
**webhook** – Entry point; accepts POST /cert-tracker requests.
**code (Prepare Source List)** – Generates an array of URLs; can pull from GitLab or an environment variable.
**scrapegraphAi** – Scrapes each URL and extracts certification requirement sections using CSS/XPath selectors.
**merge (by key)** – Joins new data with the previous snapshot for change detection.
**if (Changes?)** – Branches logic based on whether differences exist.
**gitlab** – Creates/updates files and opens merge requests containing new requirements.
**rocketchat** – Sends a formatted update to the designated channel.
**respondToWebhook** – Returns 200 OK with a JSON summary.

Data Flow:
webhook → code → scrapegraphAi → merge → if
if (true) → gitlab → rocketchat
if (false) → respondToWebhook

Customization Examples

Change Scraping Frequency (replace the external cron with an n8n Cron node):

```json
{
  "nodes": [
    {
      "name": "Cron",
      "type": "n8n-nodes-base.cron",
      "parameters": {
        "schedule": { "hour": "0", "minute": "0", "dayOfMonth": "1" }
      }
    }
  ]
}
```

Extend Notification Message:

```javascript
// Rocket.Chat node → Message field
const diffUrl = $json["gitlab_diff_url"];
const count = $json["changes_count"];
return `:bell: ${count} Certification Requirement Update(s)\n\nView diff: ${diffUrl}`;
```

Data Output Format

The workflow outputs structured JSON data:

```json
{
  "timestamp": "2024-05-15T12:00:00Z",
  "changesDetected": true,
  "changesCount": 3,
  "gitlab_commit_sha": "a1b2c3d4",
  "gitlab_diff_url": "https://gitlab.com/company/cert-tracker/-/merge_requests/42",
  "notifiedChannel": "#certification-updates"
}
```

Troubleshooting Common Issues
ScrapeGraphAI returns empty results – Verify your CSS/XPath selectors and API key quota.
GitLab commit fails (401 Unauthorized) – Ensure the PAT has api and write_repository scopes and is not expired.

Performance Tips
Limit the number of pages scraped per run to avoid API rate limits.
Cache last-scraped HTML in an S3 bucket or database to reduce redundant requests.

Pro Tips:
Use GitLab CI to auto-deploy the documentation site whenever new certification files are merged.
Enable Rocket.Chat threading to keep discussions organized per update. Tag stakeholders in Rocket.Chat messages with @cert-team for instant visibility.
by Shelly-Ann Davy
Description
Wake up gently. This elegant workflow runs every morning at 7 AM, picks one uplifting affirmation from a curated list, and delivers it to your inbox (with optional Telegram). Zero code, zero secrets—just drop in your SMTP and Telegram credentials, edit the affirmations, and activate. Perfect for creators, homemakers, and entrepreneurs who crave intention and beauty before the day begins.

How it works (high-level steps)
Cron wakes the flow daily at 7 AM.
Set: Configuration stores your email, Telegram chat ID, and affirmations.
Code node randomly selects one affirmation.
Email node sends the message via SMTP.
IF node decides whether to forward it to Telegram as well.

Set-up time: 2–3 minutes
30 s: add SMTP credential
30 s: add Telegram Bot credential (optional)
1 min: edit affirmations & email addresses
30 s: activate

Detailed instructions
All deep-dive steps live inside the yellow and white sticky notes on the canvas—no extra docs needed.

Requirements
SMTP account (SendGrid, Gmail, etc.)
Telegram Bot account (optional)

Customisation tips
Change Cron time or frequency
Swap affirmation list for quotes, verses, or mantras
Add Notion logger branch for journaling
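The Code node's random pick can be sketched in a few lines. The affirmations below are placeholders (yours live in the Set: Configuration node, whose field layout is an assumption here):

```javascript
// Minimal sketch of the random-selection Code node.
const affirmations = [
  'Today I choose calm over chaos.',
  'I am building a life I love.',
  'Small steps forward still count.',
];

// Pick one affirmation uniformly at random.
const affirmation = affirmations[Math.floor(Math.random() * affirmations.length)];
console.log(affirmation);
```

Swapping the array contents for quotes, verses, or mantras is the whole customisation surface.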
by Intuz
This n8n template from Intuz provides a complete solution to automate on-demand lead generation. It acts as a powerful scraping agent that takes a simple chat query, scours both Google Search and Google Maps for relevant businesses, scrapes their websites for contact details, and compiles an enriched lead list directly in Google Sheets.

Who's this workflow for?
Sales Development Representatives (SDRs)
Local Marketing Agencies
Business Development Teams
Freelancers & Consultants
Market Researchers

How it works
1. Start with a Chat Query: The user initiates the workflow by typing a search query (e.g., "dentists in New York") into a chat interface.
2. Multi-Source Search: The workflow queries both the Google Custom Search API (for web results across multiple pages) and scrapes Google Maps (for local businesses) to gather a broad list of potential leads.
3. Deep Dive Website Scraping: For each unique business website found, the workflow visits the URL to scrape the raw HTML content of the page.
4. Intelligent Contact Extraction: Using custom code, it then parses the scraped website content to find and extract valuable contact information such as email addresses, phone numbers, and social media links.
5. Deduplicate and Log to Sheets: Before saving, the workflow checks your Google Sheet to ensure the lead doesn't already exist. All unique, newly enriched leads are then appended as clean rows to your sheet, along with the original search query for tracking.

Key Requirements to Use This Template
1. n8n Instance & Required Nodes: An active n8n account (Cloud or self-hosted). This workflow uses the official n8n LangChain integration (@n8n/n8n-nodes-langchain) for the chat trigger. If you are using a self-hosted version of n8n, please ensure this package is installed.
2. Google Custom Search API: A Google Cloud Project with the "Custom Search API" enabled. You will need an API Key for this service.
You must also create a Programmable Search Engine and get its Search engine ID (cx). This tells Google what to search (e.g., the whole web).
3. Google Sheets Account: A Google account and a pre-made Google Sheet with columns for Business Name, Primary Email, Contact Number, URL, Description, Socials, and Search Query.

Setup Instructions
1. Configure the Chat Trigger: In the "When chat message received" node, you can find the Direct URL or Embed code to use the chat interface.
2. Set Up Google Custom Search API (Crucial Step): Go to the "Custom Google Search API" (HTTP Request) node. Under "Query Parameters", you must replace the placeholder values for key (with your API Key) and cx (with your Search Engine ID).
3. Configure Google Sheets: In all Google Sheets nodes (Append row in sheet, Get row(s) in sheet, etc.), connect your Google Sheets credentials. Select your target spreadsheet (Document ID) and the specific sheet (Sheet Name) where you want to store the leads.
4. Activate the Workflow: Save the workflow and toggle the "Active" switch to ON. Open the chat URL and enter a search query to start generating leads.

Connect with us
Website: https://www.intuz.com/services
Email: getstarted@intuz.com
LinkedIn: https://www.linkedin.com/company/intuz
Get Started: https://n8n.partnerlinks.io/intuz
For Custom Workflow Automation Click here- Get Started
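The "Intelligent Contact Extraction" step can be sketched with a few regexes over the scraped HTML. These patterns are simplified illustrations, not the template's actual code, and the sample markup is invented:

```javascript
// Illustrative sketch: pull emails, phone numbers, and social links out of raw HTML.
function extractContacts(html) {
  const emails = [...new Set(html.match(/[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/g) ?? [])];
  const phones = [...new Set(html.match(/\+?\d[\d\s().-]{7,}\d/g) ?? [])];
  const socials = [...new Set(
    html.match(/https?:\/\/(?:www\.)?(?:linkedin|facebook|instagram|twitter|x)\.com\/[^\s"'<>]+/g) ?? []
  )];
  return { emails, phones, socials };
}

const sample = '<a href="mailto:info@example-dental.com">Email</a> Call +1 (212) 555-0147 ' +
  '<a href="https://www.linkedin.com/company/example-dental">LinkedIn</a>';
console.log(extractContacts(sample));
```

Real pages need more care (obfuscated emails, international phone formats), but the Set-based deduplication shown here mirrors the workflow's dedupe-before-append behavior.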
by Onur
🔍 Extract Competitor SERP Rankings from Google Search to Sheets with Scrape.do

This template requires a self-hosted n8n instance to run.

A complete n8n automation that extracts competitor data from Google search results for specific keywords and target countries using the Scrape.do SERP API, and saves structured results into Google Sheets for SEO, competitive analysis, and market research.

📋 Overview
This workflow provides a lightweight competitor analysis solution that identifies ranking websites for chosen keywords across different countries. Ideal for SEO specialists, content strategists, and digital marketers who need structured SERP insights without manual effort.

Who is this for?
SEO professionals tracking keyword competitors
Digital marketers conducting market analysis
Content strategists planning based on SERP insights
Business analysts researching competitor positioning
Agencies automating SEO reporting

What problem does this workflow solve?
Eliminates manual SERP scraping
Processes multiple keywords across countries
Extracts structured data (position, title, URL, description)
Automates saving results into Google Sheets
Ensures a repeatable & consistent methodology

⚙️ What this workflow does
Manual Trigger → Starts the workflow manually
Get Keywords from Sheet → Reads keywords + target countries from a Google Sheet
URL Encode Keywords → Converts keywords into URL-safe format
Process Keywords in Batches → Handles multiple keywords sequentially to avoid rate limits
Fetch Google Search Results → Calls the Scrape.do SERP API to retrieve raw HTML of Google SERPs
Extract Competitor Data from HTML → Parses HTML into structured competitor data (top 10 results)
Append Results to Sheet → Writes structured SERP results into a Google Sheet

📊 Output Data Points

| Field | Description | Example |
| --- | --- | --- |
| Keyword | Original search term | digital marketing services |
| Target Country | 2-letter ISO code of target region | US |
| position | Ranking position in search results | 1 |
| websiteTitle | Page title from SERP result | Digital Marketing Software & Tools |
| websiteUrl | Extracted website URL | https://www.hubspot.com/marketing |
| websiteDescription | Snippet/description from search results | Grow your business with HubSpot’s tools… |

⚙️ Setup

Prerequisites
n8n instance (self-hosted)
Google account with Sheets access
**Scrape.do** account with **SERP API token**

Google Sheet Structure
This workflow uses one Google Sheet with two tabs:

Input Tab: "Keywords"

| Column | Type | Description | Example |
| --- | --- | --- | --- |
| Keyword | Text | Search query | digital marketing |
| Target Country | Text | 2-letter ISO code | US |

Output Tab: "Results"

| Column | Type | Description | Example |
| --- | --- | --- | --- |
| Keyword | Text | Original search term | digital marketing |
| position | Number | SERP ranking | 1 |
| websiteTitle | Text | Title of the page | Digital Marketing Software & Tools |
| websiteUrl | URL | Website/page URL | https://www.hubspot.com/marketing |
| websiteDescription | Text | Snippet text | Grow your business with HubSpot’s tools |

🛠 Step-by-Step Setup
Import Workflow: Copy the JSON → n8n → Workflows → + Add → Import from JSON
Configure **Scrape.do API**:
Endpoint: https://api.scrape.do/
Parameter: token=YOUR_SCRAPEDO_TOKEN
Add render=true for full HTML rendering
Configure Google Sheets:
Create a sheet with two tabs: Keywords (input), Results (output)
Set up Google Sheets OAuth2 credentials in n8n
Replace placeholders: YOUR_GOOGLE_SHEET_ID and YOUR_GOOGLE_SHEETS_CREDENTIAL_ID
Run & Test:
Add test data in the Keywords tab
Execute the workflow → Check results in the Results tab

🧰 How to Customize
**Add more fields**: Extend the HTML parsing logic in the “Extract Competitor Data” node to capture extra data (e.g., domain, sitelinks).
**Filtering**: Exclude domains or results with custom rules.
**Batch Size**: Adjust “Process Keywords in Batches” for speed vs. rate limits.
**Rate Limiting**: Insert a **Wait** node (e.g., 10–30 seconds) if API rate limits apply.
**Multi-Sheet Output**: Save per-country or per-keyword results into separate tabs.

📊 Use Cases
**SEO Competitor Analysis**: Identify top-ranking sites for target keywords
**Market Research**: See how SERPs differ by region
**Content Strategy**: Analyze titles & descriptions of competitor pages
**Agency Reporting**: Automate competitor SERP snapshots for clients

📈 Performance & Limits
**Single Keyword**: ~10–20 seconds (depends on **Scrape.do** response)
**Batch of 10**: 3–5 minutes typical
**Large Sets (50+)**: 20–40 minutes depending on API credits & batching
**API Calls**: 1 **Scrape.do** request per keyword
**Reliability**: 95%+ extraction success, 98%+ data accuracy

🧩 Troubleshooting
**API error** → Check YOUR_SCRAPEDO_TOKEN and API credits
**No keywords loaded** → Verify the Google Sheet ID & tab name = Keywords
**Permission denied** → Re-authenticate Google Sheets OAuth2 in n8n
**Empty results** → Check parsing logic and verify search term validity
**Workflow stops early** → Ensure the batching loop (SplitInBatches) is properly connected

🤝 Support & Community
n8n Forum: https://community.n8n.io
n8n Docs: https://docs.n8n.io
Scrape.do Dashboard: https://dashboard.scrape.do

🎯 Final Notes
This workflow provides a repeatable foundation for extracting competitor SERP rankings with Scrape.do and saving them to Google Sheets. You can extend it with filtering, richer parsing, or integration with reporting dashboards to create a fully automated SEO intelligence pipeline.
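The "URL Encode Keywords" step and the Scrape.do request it feeds can be sketched as follows. The Google query parameters (q, gl) are standard, but treat the exact Scrape.do parameter set beyond token, url, and render as an assumption to verify against your dashboard:

```javascript
// Illustrative sketch of building one SERP request per keyword/country row.
const token = 'YOUR_SCRAPEDO_TOKEN'; // placeholder, as in the setup steps above

function buildSerpRequest(keyword, countryCode) {
  // Encode the keyword so spaces/special characters survive as a query string,
  // then wrap the whole Google URL for the scraping API.
  const googleUrl = `https://www.google.com/search?q=${encodeURIComponent(keyword)}&gl=${countryCode.toLowerCase()}`;
  return `https://api.scrape.do/?token=${token}&url=${encodeURIComponent(googleUrl)}&render=true`;
}

console.log(buildSerpRequest('digital marketing services', 'US'));
```

Note the double encoding: the keyword is encoded into the Google URL, and the Google URL is encoded again as the url parameter, which is why raw string concatenation without encodeURIComponent breaks multi-word keywords.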
by WeblineIndia
Webhook from IoT Devices → Jira Maintenance Ticket → Slack Factory Alert

This workflow automates predictive maintenance by receiving IoT machine-failure webhooks, creating Jira maintenance tickets, checking technician availability in Slack and sending the alert to the correct Slack channel. If an active technician is available, the system notifies the designated technician channel; if not, it escalates automatically to your chosen emergency/escalation channel.

⚡ Quick Implementation: Start Using in 10 Seconds
Import the workflow JSON into n8n.
Add Slack API credentials (with all required scopes).
Add Jira Cloud credentials.
Select Slack channels for technician alerts and emergency/escalation alerts.
Deploy the webhook URL to your IoT device.
Run a test event.

What It Does
This workflow implements a real-time predictive maintenance automation loop. An IoT device sends machine data — such as temperature, vibration and timestamps — to an n8n webhook whenever a potential failure is detected. The workflow immediately evaluates whether the values exceed a defined safety threshold. If a failure condition is detected, a Jira maintenance ticket is automatically created with all relevant machine information.

The workflow then gathers all technicians from your selected Slack channel and checks each technician’s presence status in real time. A built-in decision engine chooses the first available technician. If someone is active, the workflow sends a maintenance alert to your technician channel. If no technicians are available, the workflow escalates the alert to your chosen emergency channel to avoid operational downtime.

This eliminates manual monitoring, accelerates response times and ensures no incident goes unnoticed — even if the team is unavailable.
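The threshold check the IF node performs on the incoming payload can be sketched as below. The limits (85 °C, 7 mm/s) are made-up example values, not the template's defaults; tune them to your machines:

```javascript
// Illustrative sketch of the safety-threshold check on the webhook payload.
const TEMP_LIMIT = 85;      // °C (example value)
const VIBRATION_LIMIT = 7;  // mm/s (example value)

function isFailure(payload) {
  // Failure if either reading exceeds its safety threshold (OR logic).
  return payload.temperature > TEMP_LIMIT || payload.vibration > VIBRATION_LIMIT;
}

const sample = { machineId: 'M-104', temperature: 92.5, vibration: 4.2, timestamp: '2024-05-15T03:12:00Z' };
console.log(isFailure(sample)); // true
```

Switching the || to && (OR to AND), or adding humidity/RPM/pressure comparisons, is how the IF node customizations described later are applied.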
Who’s It For
This workflow is ideal for:
Manufacturing factories
Industrial automation setups
IoT monitoring systems
Warehouse operations
Maintenance & facility management teams
Companies using Jira + Slack
Organizations implementing predictive maintenance or automated escalation workflows

Requirements to Use This Workflow
You will need:
An n8n instance (Cloud or Self-hosted)
Slack App with the scopes: users:read, users:read.presence, channels:read, chat:write
Jira Cloud credentials (email + API token)
Slack channels of your choice for technician alerts and emergency/escalation alerts
IoT device capable of POST webhook calls
Machine payload must include: machineId, temperature, vibration, timestamp

How It Works & How To Set Up

🔧 High-Level Workflow Logic
IoT Webhook receives machine data.
IF Condition checks whether values exceed safety thresholds.
Jira Ticket is created with machine details if a failure is detected.
Slack Channel Members are fetched from your selected technician channel.
Loop Through Technicians to check real-time presence.
Code Node determines the first available (active) technician, or fallback mode if none is available.
IF Condition checks technician availability.
Slack Notification is sent to your chosen technician channel if someone is available, or to your chosen emergency/escalation channel if no one is online.

🛠 Step-by-Step Setup Instructions
Import Workflow: n8n → Workflows → Import from File → Select JSON.
Configure Slack: Add required scopes (users:read, users:read.presence, channels:read, chat:write) and reconnect credentials.
Select Slack Channels: Choose any Slack channels you want for technician notifications and emergency alerts—no fixed naming is required.
Configure Jira: Add credentials, select project and issue type, and set priority mapping if needed.
Deploy Webhook: Copy the n8n webhook URL and configure your IoT device to POST machine data.
Test System: Send a test payload to ensure Jira tickets are created and Slack notifications route correctly based on technician availability.

This setup allows real-time monitoring, automated ticket creation and flexible escalation — reducing manual intervention and ensuring fast maintenance response.

How To Customize Nodes
Webhook Node: Add security tokens, change the webhook path, add a response message
IF Node (Threshold Logic): Lower/raise the temperature threshold, change OR to AND, add more conditions (humidity, RPM, pressure)
Jira Node: Customize fields like summary and labels, or assign issues based on technician availability
Slack Presence Node: Add DND checks, treat "away" as "available" during night shift, combine multiple channels
Code Node: Randomly rotate technicians, pick the technician with the lowest alert count, keep a history log

Add-Ons
SMS fallback notifications (Twilio)
WhatsApp alerts
Telegram alerts
Notify supervisors via email
Store machine failures in Google Sheets
Push metrics into PowerBI
Auto-close Jira tickets after machine values return to normal
Create a daily maintenance report

Use Case Examples
Overheating Machine Alert – Detect spikes and notify a technician instantly.
Vibration Pattern Anomaly Detection – Trigger early maintenance before a full breakdown.
Multi-Shift Technician Coverage – Automatically switch to emergency mode when no technician is online.
Factory Night-Shift Automation – Night alerts escalate automatically without manual verification.
Warehouse Robotics Malfunction – Sends instant Slack + Jira alerts when robots overheat or jam.
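The decision engine's "first available technician, else escalate" logic can be sketched as below. The item shape ({ userId, presence }) and channel names are assumptions about the upstream Slack nodes, not the template's exact schema:

```javascript
// Hypothetical sketch of the routing decision made by the Code node.
function routeAlert(technicians, techChannel, emergencyChannel) {
  // Pick the first technician whose Slack presence is "active".
  const available = technicians.find((t) => t.presence === 'active');
  if (available) {
    return { channel: techChannel, mention: `<@${available.userId}>`, escalated: false };
  }
  // Nobody online: escalate so the incident is not missed.
  return { channel: emergencyChannel, mention: '@here', escalated: true };
}

const result = routeAlert(
  [{ userId: 'U111', presence: 'away' }, { userId: 'U222', presence: 'active' }],
  '#maintenance-alerts',
  '#factory-emergency'
);
console.log(result);
```

Swapping `find` for a rotation index or a lowest-alert-count sort implements the Code-node customizations listed above.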
Troubleshooting Guide

| Issue | Possible Cause | Solution |
| --- | --- | --- |
| Webhook returns no data | Wrong endpoint or method | Use POST + correct URL |
| Slack presence returns error | Missing Slack scopes | Add users:read.presence |
| Jira ticket not created | Invalid project key or credentials | Reconfigure Jira API credentials |
| All technicians show offline | Wrong channel or IDs | Ensure correct channel members |
| Emergency alert not triggered | Code node returning incorrect logic | Test code with all technicians set to "away" |
| Slack message fails | Wrong channel ID | Replace with correct Slack channel |

Need Help?
If you need help customizing this workflow, adding new automation features, connecting additional systems or building enterprise IoT maintenance solutions, our n8n automation development team at WeblineIndia can help. We can assist with:
Workflow setup
Advanced alert logic
Integrating SMS / WhatsApp / Voice alerts
Custom escalation rules
Industrial IoT integration
Reach out anytime for support or enhancements.