by Masahiro Hanawa
**Video Metadata Extraction & YouTube Auto-Upload with AI**

Automatically process video files, extract metadata, generate AI-optimized titles/descriptions/tags, and upload directly to YouTube with proper categorization and thumbnail handling.

**Key Features**
- **YouTube Node Integration**: Direct video upload to YouTube with full metadata
- **Binary Data Handling**: Proper video and thumbnail binary processing
- **AI-Powered SEO Optimization**: Generates engaging titles, descriptions, and tags
- **Video Metadata Extraction**: Analyzes video properties (duration, resolution, codec)
- **Thumbnail Processing**: Extracts or uploads custom thumbnails
- **Category Auto-Selection**: AI determines the optimal YouTube category

**How It Works**
1. Video Intake: Receives the video file via webhook or cloud storage trigger
2. Metadata Extraction: Analyzes the video file for technical properties
3. AI Content Generation: Creates an SEO-optimized title, description, and tags
4. Thumbnail Processing: Extracts a frame or uses a provided thumbnail
5. YouTube Upload: Uploads the video with all metadata
6. Post-Upload Processing: Retrieves the video ID and creates a playlist entry
7. Notification: Sends a confirmation with the video URL

**Required Credentials**
- YouTube OAuth2 (for video upload)
- OpenAI API (for AI metadata generation)
- Google Drive or Dropbox (optional, for cloud storage triggers)
- Gmail (for notifications)
- Google Sheets (for tracking)

**Unique Features**
- Uses the **YouTube node** for direct video upload (rarely used in templates)
- **Binary data manipulation** for video and thumbnail handling
- **AI-generated SEO metadata** optimized for the YouTube algorithm
- **Category detection** using AI classification
- **Merge node** with chooseBranch for conditional flows

**Example Request**

```json
{
  "videoFile": "<binary data>",
  "projectName": "Product Demo 2024",
  "targetAudience": "developers",
  "language": "en",
  "thumbnailFile": "<binary data>",
  "playlistId": "PLxxxxxxxx",
  "publishTime": "2024-01-15T14:00:00Z"
}
```

**Supported Video Formats**
- MP4, MOV, AVI, MKV, WebM
- Maximum file size: 128GB (YouTube limit)
- Recommended: MP4 with H.264 codec

**Output**

```json
{
  "videoId": "dQw4w9WgXcQ",
  "videoUrl": "https://youtu.be/dQw4w9WgXcQ",
  "title": "AI-Generated Title",
  "description": "SEO-optimized description...",
  "tags": ["tag1", "tag2", "tag3"],
  "category": "Science & Technology",
  "uploadStatus": "processed",
  "thumbnailUrl": "https://i.ytimg.com/vi/..."
}
```
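As a companion to the AI Content Generation step, here is a minimal sketch of a Code-node-style function that clamps AI-generated metadata to YouTube's commonly cited limits (title up to 100 characters, description up to 5,000, combined tag length up to 500). The field names and exact limits here are illustrative assumptions, not the template's actual node code.

```javascript
// Illustrative sketch: clamp AI-generated metadata before the YouTube upload node.
// Limits below are assumptions based on commonly cited YouTube constraints.
function clampYouTubeMetadata(meta) {
  const title = meta.title.slice(0, 100);          // assumed 100-char title limit
  const description = meta.description.slice(0, 5000); // assumed 5,000-char limit
  const tags = [];
  let tagChars = 0;
  for (const tag of meta.tags || []) {
    if (tagChars + tag.length > 500) break; // stop before exceeding the assumed tag budget
    tags.push(tag);
    tagChars += tag.length;
  }
  return { title, description, tags };
}

// Example: an over-long AI title is truncated, short tags all fit the budget.
const clamped = clampYouTubeMetadata({
  title: "A".repeat(120),
  description: "SEO-optimized description...",
  tags: ["tag1", "tag2", "tag3"],
});
```

In an n8n Code node, the same function would simply run over `$json` and return the clamped object for the YouTube node to consume.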
by Jitesh Dugar
Revolutionize university admissions with intelligent AI-driven application evaluation that analyzes student profiles, calculates eligibility scores, and automatically routes decisions - saving 2.5 hours per application and reducing decision time from weeks to hours.

**🎯 What This Workflow Does**

Transforms your admissions process from manual application review to intelligent automation:
- 📝 Captures Applications - Jotform intake with student info, GPA, test scores, essay, extracurriculars
- 🤖 AI Holistic Evaluation - OpenAI analyzes academic strength, essay quality, extracurriculars, and fit
- 🎯 Intelligent Scoring - Evaluates students using 40% academics, 25% extracurriculars, 20% essay, 15% fit (0-100 scale)
- 🚦 Smart Routing - Automatically routes based on the AI evaluation:
  - **Auto-Accept (95-100)**: Acceptance letter with scholarship details → Admin alert → Database
  - **Interview Required (70-94)**: Interview invitation with scheduling link → Admin alert → Database
  - **Reject (<70)**: Respectful rejection with improvement suggestions → Database
- 💰 Scholarship Automation - Calculates merit scholarships ($5k-$20k+) based on the eligibility score
- 📊 Analytics Tracking - All applications logged to Google Sheets for admissions insights

**✨ Key Features**
- AI Holistic Evaluation: Comprehensive analysis weighing academics, extracurriculars, essays, and institutional fit
- Intelligent Scoring System: 0-100 eligibility score with automated categorization and scholarship determination
- Structured Output: Consistent JSON schema with academic strength, admission likelihood, and decision reasoning
- Automated Communication: Personalized acceptance, interview, and rejection letters for every applicant
- Fallback Scoring: Manual GPA/SAT scoring if the AI fails, ensuring zero downtime
- Admin Alerts: Instant email notifications for exceptional high-scoring applicants (95+)
- Comprehensive Analytics: Track acceptance rates, average scores, scholarship distribution, and applicant demographics
- Customizable Criteria: Easy prompt editing to match your institution's values and requirements

**💼 Perfect For**
- Universities & Colleges: Processing 500+ undergraduate applications per semester
- Graduate Programs: Screening master's and PhD applications with consistent evaluation
- Private Institutions: Scaling admissions without expanding admissions staff
- Community Colleges: Handling high-volume transfer and new student applications
- International Offices: Evaluating global applicants 24/7 across all timezones
- Scholarship Committees: Identifying merit scholarship candidates automatically

**🔧 What You'll Need**

Required Integrations
- Jotform - Application form with student data collection (free tier works). Create your form for free on Jotform using this link, with fields: Name, Email, Phone, GPA, SAT Score, Major, Essay, Extracurriculars
- OpenAI API - GPT-4o-mini for cost-effective AI evaluation (~$0.01-0.05 per application)
- Gmail - Automated applicant communication (acceptance, interview, rejection letters)
- Google Sheets - Application database and admissions analytics

Optional Integrations
- Slack - Real-time alerts for exceptional applicants
- Calendar APIs - Automated interview scheduling
- Student Information System (SIS) - Push accepted students to the enrollment system
- Document Analysis Tools - OCR for transcript verification

**🚀 Quick Start**
1. Import Template - Copy the JSON and import it into n8n (requires LangChain support)
2. Create Jotform - Use the provided field structure (Name, Email, GPA, SAT, Major, Essay, etc.)
3. Add API Keys - OpenAI, Jotform, Gmail OAuth2, Google Sheets
4. Customize AI Prompt - Edit the admissions criteria with your university's specific requirements and values
5. Set Score Thresholds - Adjust the auto-accept (95+), interview (70-94), and reject (<70) cutoffs if needed
6. Personalize Emails - Update templates with your university branding, dates, and contact info
7. Create Google Sheet - Set up columns: id, Name, Email, GPA, SAT Score, Major, Essay, Extracurriculars
8. Test & Deploy - Submit a test application with pinned data and verify all nodes execute correctly

**🎨 Customization Options**
- Adjust Evaluation Weights: Change the academics (40%), extracurriculars (25%), essay (20%), and fit (15%) percentages
- Multiple Programs: Clone the workflow for different majors with unique evaluation criteria
- Add Document Analysis: Integrate OCR for transcript and recommendation letter verification
- Interview Scheduling: Connect Google Calendar or Calendly for automated booking
- SIS Integration: Push accepted students directly to Banner, Ellucian, or PeopleSoft
- Waitlist Management: Add conditional routing for borderline scores (65-69)
- Diversity Tracking: Include demographic fields and bias detection in the AI evaluation
- Financial Aid Integration: Automatically calculate need-based aid eligibility alongside merit scholarships

**📈 Expected Results**
- 90% reduction in manual application review time (from 2.5 hours to 15 minutes per application)
- 24-48 hour decision turnaround vs. a 4-6 week traditional process
- 40% higher yield rate - faster responses increase enrollment commitment
- 100% consistency - every applicant evaluated with identical criteria
- Zero missed applications - automated tracking ensures no application falls through the cracks
- Data-driven admissions - comprehensive analytics on applicant pools and acceptance patterns
- Better applicant experience - professional, timely communication regardless of decision
- Defensible decisions - documented scoring criteria for accreditation and compliance

**🏆 Use Cases**
- Large Public Universities: Screen 5,000+ applications per semester, identify the top 20% for auto-admit, route borderline cases to committee review.
- Selective Private Colleges: Evaluate 500+ highly competitive applications, calculate merit scholarships automatically, schedule interviews with top candidates.
- Graduate Programs: Process master's and PhD applications with research experience weighting, flag candidates for faculty review, automate fellowship awards.
- Community Colleges: Handle high-volume open enrollment while identifying honors program candidates and scholarship recipients instantly.
- International Admissions: Evaluate global applicants 24/7, account for different GPA scales and testing systems, respond same-day regardless of timezone.
- Rolling Admissions: Provide instant decisions for early applicants, fill classes strategically, optimize scholarship budget allocation.

**💡 Pro Tips**
- Calibrate Your AI: After 100+ applications, refine the evaluation criteria based on enrolled student success
- A/B Test Thresholds: Experiment with score cutoffs (e.g., 93 vs. 95 for auto-admit) to optimize yield
- Build a Waitlist Pipeline: Keep 70-84 score candidates engaged for spring enrollment or next year
- Track Source Effectiveness: Add UTM parameters to measure which recruiting channels deliver the best students
- Committee Review: Route 85-94 scores to a human admissions committee for final review
- Bias Audits: Quarterly review of AI decisions by demographic group to ensure fairness
- Parent Communication: Add parent/guardian emails for admitted students under 18
- Financial Aid Coordination: Sync scholarship awards with the financial aid office for packaging

**🎓 Learning Resources**

This workflow demonstrates:
- **AI Agents with structured output** - LangChain integration for consistent JSON responses
- **Multi-stage conditional routing** - IF nodes for three-tier decision logic
- **Holistic evaluation** - Weighted scoring across multiple dimensions
- **Automated communication** - HTML email templates with dynamic content
- **Real-time notifications** - Admin alerts for high-value applicants
- **Analytics and data logging** - Google Sheets integration for reporting
- **Fallback mechanisms** - Manual scoring when the AI is unavailable

Perfect for learning advanced n8n automation patterns in educational technology!

**🔐 Compliance & Ethics**
- FERPA Compliance: Protects student data with secure credential handling
- Fair Admissions: Documented criteria eliminate unconscious bias
- Human Oversight: Committee review option for borderline cases
- Transparency: Applicants can request the evaluation criteria
- Appeals Process: Structured workflow for decision reconsideration
- Data Retention: Configurable Google Sheets retention policies

**📊 What Gets Tracked**
- Application submission date and time
- Complete student profile (GPA, test scores, major, essay, activities)
- AI eligibility score (0-100) and decision category
- Academic strength rating (excellent/strong/average)
- Scholarship eligibility and amount ($0-$20,000+)
- Admission likelihood (high/medium/low)
- Decision outcome (accepted/interview/rejected)
- Email delivery status and open rates
- Time from application to decision

Ready to transform your admissions process? Import this template and start evaluating applications intelligently in under 1 hour.

Questions or customization needs? The workflow includes detailed sticky notes explaining each section and comprehensive fallback logic for reliability.
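The weighted scoring and three-tier routing described above can be sketched as a small Code-node function. This is an illustrative sketch of the stated weights (40/25/20/15) and thresholds (95+/70-94/<70), not the template's actual node code; the sub-scores are assumed to arrive from the AI evaluation on a 0-100 scale.

```javascript
// Illustrative sketch of the eligibility scoring and routing logic.
// Sub-scores (0-100 each) are assumed to come from the AI evaluation.
function eligibilityScore({ academics, extracurriculars, essay, fit }) {
  // Weights from the template: 40% academics, 25% extracurriculars,
  // 20% essay, 15% institutional fit.
  return Math.round(
    academics * 0.40 + extracurriculars * 0.25 + essay * 0.20 + fit * 0.15
  );
}

function routeDecision(score) {
  if (score >= 95) return "auto-accept"; // acceptance letter + admin alert
  if (score >= 70) return "interview";   // interview invitation
  return "reject";                       // respectful rejection
}

// Example applicant: 92*0.40 + 80*0.25 + 75*0.20 + 70*0.15 = 82.3 -> 82
const score = eligibilityScore({ academics: 92, extracurriculars: 80, essay: 75, fit: 70 });
```

Adjusting the cutoffs in `routeDecision` is exactly the "Set Score Thresholds" customization step described above.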
by Parag Javale
Turn a simple email workflow into a LinkedIn content machine. Generate post ideas, draft full posts, and auto-publish to LinkedIn, all controlled by replying to emails.

**📌 Purpose**

Automate your LinkedIn posting pipeline using AI + email approvals:
1. Generate 10 scroll-stopping post ideas tailored to your niche and audience.
2. Approve your favorite by replying to the email with a number.
3. Receive 3 AI-written drafts for the chosen idea.
4. Pick your favorite draft via email reply.
5. The selected post gets auto-published to LinkedIn ✅.
6. All steps are logged in Google Sheets.

**🔗 Apps Used**
- **Google Gemini** → generates ideas & drafts
- **Gmail** → email-based approval workflow
- **Google Sheets** → tracks ideas, drafts, and published posts
- **LinkedIn API** → posts directly to your company or personal account

**✨ Highlights**
- 📬 Email-based approval → no dashboards, just reply with a number
- 📝 10 AI-generated content ideas + 3 full drafts per topic
- 🔄 End-to-end tracking in Google Sheets (ideas → drafts → published)
- ⚡ Auto-posting directly to LinkedIn
- ✅ Final confirmation email with preview

**👤 Best For**
- Startup founders
- Agencies managing multiple clients’ LinkedIn
- Solopreneurs & creators who want consistent posting

**🛠️ Workflow Overview**

```mermaid
flowchart TB
    A["Manual Trigger"] --> B["AI Agent - Generate 10 Ideas"]
    B --> C["Code - Parse JSON + Correlation ID"]
    C --> D["Google Sheets - Append Ideas"]
    D --> E["Gmail - Send Ideas Email"]
    E --> F["Gmail Trigger - Await Reply"]
    F --> G["Code1 - Extract Reply Number"]
    G --> H["Google Sheets - Fetch Row"]
    H --> I{"Switch Stage"}
    I -- Ideas --> J["AI Agent - Generate 3 Drafts"]
    J --> K["Code3 - Parse Drafts"]
    K --> L["Google Sheets - Update Drafts"]
    L --> M["Gmail - Send Drafts Email"]
    I -- Drafts --> N["Code4 - Select Final Draft"]
    N --> O["LinkedIn - Publish Post"]
    O --> P["Google Sheets - Update Posted"]
    P --> Q["Gmail - Send Confirmation"]
```
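The "Code1 - Extract Reply Number" step in the flowchart could be implemented along these lines. This is a hypothetical sketch, assuming a plain-text reply where the chosen number appears above a typical quoted-reply separator; it is not the template's actual node code.

```javascript
// Hypothetical sketch of extracting the approval number from an email reply.
// Assumes the reply text sits above a standard "On ... wrote:" quote marker.
function extractReplyNumber(body, max = 10) {
  // Keep only the text above the quoted original message, if a marker exists.
  const replyPart = body.split(/^On .+wrote:$/m)[0];
  const match = replyPart.match(/\b([0-9]+)\b/);
  if (!match) return null;
  const n = parseInt(match[1], 10);
  // Only accept numbers in the valid idea/draft range (1..max).
  return n >= 1 && n <= max ? n : null;
}
```

In the workflow this would feed the "Google Sheets - Fetch Row" node; returning `null` lets you branch to a "please reply with a number" reminder email instead.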
by dongou
Fetch user-specific research papers from arXiv on a daily schedule, process and structure the data, and create or update entries in a Notion database, with support for data delivery.

- **Paper Topic**: single query keyword
- **Update Frequency**: daily updates, with fewer than 20 entries expected per day
- **Tools**:
  - Platform: n8n, for end-to-end workflow configuration
  - AI Model: Gemini-2.5-Flash, for daily paper summarization and data processing
  - Database: Notion, with two tables — Daily Paper Summary and Paper Details
  - Messaging: Feishu (IM bot notifications), Gmail (email notifications)

**1. Data Retrieval**

arXiv provides a public API that allows users to query research papers by topic or by predefined categories (see the arXiv API User Manual).

Key notes:
- Response format: the API returns data as a typical Atom response.
- Timezone & update frequency: the arXiv submission process operates on a 24-hour cycle. Newly submitted articles become available in the API only at midnight after they have been processed, and feeds are updated daily at midnight Eastern Standard Time (EST). Therefore, a single request per day is sufficient.
- Request limits: the maximum number of results per call (max_results) is 30,000. Results must be retrieved in slices of at most 2,000 at a time, using the max_results and start query parameters.
- Time format: the expected format is [YYYYMMDDTTTT+TO+YYYYMMDDTTTT], where TTTT is given in 24-hour time to the minute, in GMT.

Scheduled task:
- **Execution Frequency**: daily
- **Execution Time**: 6:00 AM
- **Time Parameter Handling (JS)**: according to arXiv's update rules, the scheduled task should query the previous day's (T-1) submittedDate data.

**2. Data Extraction**

Data cleaning rules (convert to standard JSON):
- Remove header: keep only the 【entry】【/entry】 blocks representing paper items.
- Single item: each 【entry】【/entry】 represents one paper.

Field processing rules:
- 【id】【/id】 ➡️ id: extract the content. Example: 【id】http://arxiv.org/abs/2409.06062v1【/id】 → http://arxiv.org/abs/2409.06062v1
- 【updated】【/updated】 ➡️ updated: convert the timestamp to yyyy-mm-dd hh:mm:ss
- 【published】【/published】 ➡️ published: convert the timestamp to yyyy-mm-dd hh:mm:ss
- 【title】【/title】 ➡️ title: extract the text content
- 【summary】【/summary】 ➡️ summary: keep the text, remove line breaks
- 【author】【/author】 ➡️ author: combine all authors into an array, e.g. ["Ernest Pusateri", "Anmol Walia"] (for a Notion multi-select field)
- 【arxiv:comment】【/arxiv:comment】 ➡️ ignore / discard
- 【link type="text/html"】 ➡️ html_url: extract the URL
- 【link type="application/pdf"】 ➡️ pdf_url: extract the URL
- 【arxiv:primary_category term="cs.CL"】 ➡️ primary_category: extract the term value
- 【category】 ➡️ category: merge all 【category】 values into an array, e.g. ["eess.AS", "cs.SD"] (for a Notion multi-select field)
- Add empty fields: github, huggingface

**3. Data Processing**

Analyze and summarize the paper data using AI, then standardize the output as JSON:
- Single-paper basic information analysis and enhancement
- Daily paper summary and multilingual translation

**4. Data Storage: Notion**

- Create a corresponding database in Notion with the same predefined field names.
- In Notion, create an integration under Integrations, grant it access to the database, and obtain the corresponding Secret Key.
- Use the Notion "Create a database page" node to configure the field mapping and store the data.

Notes:
- **"Create a database page"** only adds new entries; existing data will not be updated.
- The updated and published timestamps of arXiv papers are in UTC.
- Notion single-select and multi-select fields only accept arrays; they do not automatically parse comma-separated strings, so format them as proper arrays.
- Notion does not accept null values, which causes a 400 error.

**5. Data Delivery**

Set up two channels for message delivery, EMAIL and IM, and define the message format and content.
**Email: Gmail**
- Gmail OAuth 2.0 – official documentation: "Configure your OAuth consent screen"
- Steps:
  1. Enable the Gmail API
  2. Create the OAuth consent screen
  3. Create OAuth client credentials
  4. Audience: add Test users while the app is in Testing status
- Message format: HTML (Model: OpenAI GPT, used to design an HTML email template)

**IM: Feishu (LARK)**
- Reference docs: "Bots in groups" / "Use bots in groups"
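The T-1 time-parameter handling described in section 1 might look like this in an n8n Code node. This is a sketch under the stated arXiv rules (query the full previous UTC day, timestamps in GMT to the minute), not the template's exact script.

```javascript
// Sketch of the T-1 submittedDate window for the daily arXiv query.
// Format per the arXiv API: [YYYYMMDDTTTT+TO+YYYYMMDDTTTT], TTTT in GMT.
function arxivDateWindow(now = new Date()) {
  const fmt = (d, hhmm) => {
    const y = d.getUTCFullYear();
    const m = String(d.getUTCMonth() + 1).padStart(2, "0");
    const day = String(d.getUTCDate()).padStart(2, "0");
    return `${y}${m}${day}${hhmm}`;
  };
  // Previous UTC day (T-1), from 00:00 to 23:59.
  const yesterday = new Date(now.getTime() - 24 * 60 * 60 * 1000);
  return `submittedDate:[${fmt(yesterday, "0000")}+TO+${fmt(yesterday, "2359")}]`;
}

// Example: run at 2024-01-15 06:00 UTC -> query the full day of 2024-01-14.
const searchWindow = arxivDateWindow(new Date(Date.UTC(2024, 0, 15, 6, 0)));
```

The returned string would be appended to the search_query parameter of the HTTP Request node that calls the arXiv export API.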
by kote2
**Overview**

This workflow lets you capture, store, and retrieve notes from LINE chats — both text and voice messages — and automatically send them to your Gmail inbox. By leveraging the Supabase Vector Database, you can not only store and recall your notes, but also repurpose them for idea generation, quiz creation, or hypothesis building.

**Key Features**
- Receive text and audio messages via LINE
- Transcribe audio messages automatically and save them in Supabase
- Trigger note storage with a specific keyword (default: “Diane”)
- Automatically send the latest notes to your Gmail every morning at 7 AM
- Search and reuse your notes (e.g., generate ideas, quizzes, or insights)

**Requirements**
- Supabase account (free plan supported)
- LINE Messaging API channel setup (obtain your access token)
- Gmail authentication (OAuth2)

**Notes**
- Replace placeholders such as LINE_CHANNELACCESS_TOKEN, YOUR_USERID, and YOUR_EMAIL_ADDRESS with your own information.
- All credentials (OpenAI, Supabase, LINE, Gmail, etc.) must be configured securely in the Credentials section of n8n.
- You may customize the trigger keyword (“Diane”) to any word you like.
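The trigger-keyword check could be sketched as below. The default keyword "Diane" comes from the template; the prefix-matching rule and case-insensitivity are illustrative assumptions, not the workflow's actual logic.

```javascript
// Hypothetical sketch: decide whether an incoming LINE message should be
// stored as a note, based on the configurable trigger keyword.
function isNoteTrigger(text, keyword = "Diane") {
  return (
    typeof text === "string" &&
    text.trim().toLowerCase().startsWith(keyword.toLowerCase())
  );
}
```

Swapping the `keyword` default is the one-line equivalent of the "customize the trigger keyword" note above.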
by WeblineIndia
**Multi-Asset Daily Market Snapshot**

This workflow fully automates the creation of a daily multi-asset market report. It retrieves live pricing data for specified indices, forex pairs, and commodities using the TwelveData API, manages rate limits safely, and feeds the normalized data into a Groq-powered AI (Llama-3). The AI generates a professional, institutional-grade market summary, which is then automatically logged in Google Sheets and emailed to your inbox.

**Quick Implementation Steps**
1. Import the Workflow: Upload the JSON file into your n8n workspace.
2. Add Your Keys: In the Environment Config node, paste your TwelveData API key.
3. Connect Accounts: Authenticate your Google Sheets, Gmail, and Groq API credentials in their respective nodes.
4. Prepare the Sheet: Create a Google Sheet with two tabs ("Sheet1" for reports, "Error logs" for failures) matching the column headers defined in the Google Sheets nodes.
5. Execute: Click "Test Workflow" (or trigger it manually) to fetch data and receive your daily snapshot.

**What It Does**

This workflow acts as an automated quantitative analyst. It starts by establishing your target asset watchlists across three categories: Indices (e.g., SPY, QQQ), Forex (e.g., EUR/USD), and Commodities (e.g., Gold, Oil). It breaks these lists down and carefully queues them up to fetch daily pricing from the TwelveData API. To ensure it doesn't overwhelm the API and get blocked, it uses a batching system with a built-in 15-second throttle.

As data flows in, the workflow actively monitors for errors. If an API call fails or hits a hard limit, it instantly logs the failure details into an "Error logs" Google Sheet and sends an emergency failure alert via Gmail. Successfully fetched data is normalized into a clean format, calculating daily percentage changes and basic bullish/bearish trends. Finally, the cleaned dataset is passed to a Llama-3 AI agent via Groq.
Instructed to act as a macro strategist, the AI parses the numbers to generate a structured snapshot including a market summary, key movers, risk sentiment, and an actionable outlook. A custom script safely extracts these exact sections, logs the complete report into your main Google Sheet for historical tracking, and delivers the final formatted text straight to your Gmail.

**Who’s It For**

This automation is ideal for day traders, macro analysts, portfolio managers, and financial newsletter writers. It is exceptionally useful for anyone who spends the first hour of their morning manually checking tickers and writing up market summaries to share with a team or clients.

**Requirements to Use This Workflow**
- An active n8n instance.
- A TwelveData API key for fetching real-time/daily financial market data.
- A Groq API account to utilize the Llama-3 language model.
- A Google Workspace account to authenticate both the Google Sheets and Gmail nodes.
- A Google Sheet pre-configured with the exact column headers expected by the workflow.

**How It Works & How To Set Up**

1. Configure the Environment: Open the Environment Config node and input your twelve_api_key. The base API URL is already set up for you.
2. Define Your Watchlist: Open the Set Market Assets node. Here you will see comma-separated lists for Indices, Forex, and Commodities. You can replace these ticker symbols with any standard symbols supported by TwelveData.
3. API Throttling & Fetching: The Rate Controlled Queue node processes one ticker at a time, followed by a 15-second API Throttle wait. This ensures you comply with free-tier or basic API limits. The Fetch Asset Prices node automatically constructs the HTTP request for each symbol.
4. Error Handling Pipeline: The Validate API Response node checks whether the TwelveData response status is "ok". If not, the workflow branches downward, logging the specific error code and symbol to your Google Sheet and sending a localized Gmail alert before continuing the loop.
5. AI Processing & Normalization: Once all loops finish, the Normalize Market Data node calculates the price changes. The AI Market Insights node then feeds this to Groq. You must select your Groq credentials in the attached Insights model node.
6. Final Delivery: The Parse AI Output node uses regex to strictly format the AI's response. Finally, the Log Daily Market Report node saves the record and the Send Today's market summary node emails you the results.

**How To Customize Nodes**
- **Wait Node (API Throttle)**: If you have a premium TwelveData API plan with higher rate limits, you can reduce or remove the 15-second wait time to make the workflow run much faster.
- **AI Market Insights**: You can edit the System Message to change the AI's output tone or request additional sections (like "Crypto Outlook") if you add crypto tickers to your watchlist.
- **Send Today's market summary**: The current email format is plain text. You can change the "Email Type" to HTML and use CSS to design a beautiful, branded daily newsletter.

**Add-ons**
- **Scheduled Trigger**: Replace the "Manual Trigger" with a "Schedule Trigger" to automatically run this workflow every weekday at 4:30 PM EST, right after the market closes.
- **Slack Integration**: Swap out the Gmail nodes for Slack nodes to deliver the daily snapshot directly into your company's #finance or #trading channel.
- **Multi-Model Routing**: Add an IF node before the AI to route the data to OpenAI's GPT-4o if Groq's API is temporarily unavailable, ensuring 100% uptime for your daily reports.

**Use Case Examples**
- Morning Prep for Day Traders: Run the workflow an hour before the opening bell to get a quick digest of how forex and commodities behaved overnight.
- Automated Client Newsletters: Feed the final generated text into an email marketing tool (like Mailchimp or ActiveCampaign) to automatically send a daily brief to your subscribers.
- Internal Risk Management: Use the report to give a firm's trading desk a standardized baseline of macro risk sentiment before the trading day begins.
- Historical Sentiment Tracking: By logging the AI's daily "Risk Sentiment" into Google Sheets, you can eventually build charts comparing AI sentiment against actual market performance over time.
- API Health Monitoring: The built-in failure alert branch acts as an excellent template for monitoring the uptime and health of third-party financial APIs.

**Troubleshooting Guide**

| Issue | Possible Cause | Solution |
| :--- | :--- | :--- |
| Workflow times out or runs endlessly | The Split In Batches loop is not closing properly, or the Wait node is set too high for server timeout limits. | Ensure n8n's execution timeout variable is high enough to accommodate 15 seconds per ticker. Reduce the watchlist size if necessary. |
| "Unknown API Error" in logs | TwelveData rate limit exceeded or an invalid ticker symbol was requested. | Check the Set Market Assets node to ensure all symbols are correctly spelled. Verify your TwelveData API quota. |
| Parse AI Output node fails (regex error) | The Groq AI ignored formatting instructions and did not output the exact expected headers (e.g., "Market Summary:"). | Adjust the AI Market Insights prompt to be even stricter, or lower the "Temperature" setting on the Groq model node to reduce creative variability. |
| Google Sheets node throws a 400 error | Column headers in your Google Sheet do not match the schema in the workflow. | Open the Google Sheets node and verify that columns like execution_date, market_summary, and key_movers exist exactly as written in row 1 of your sheet. |

**Need Help?**

Stuck on setting up your API keys, or looking to expand this workflow to include dozens of new asset classes and complex HTML formatting? If you need advanced customizations, API troubleshooting, or want to build a completely tailored financial automation architecture, WeblineIndia is here to help.
Reach out to our n8n team of integration experts to take your automated business processes to the next level!
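The normalization step described in the setup (daily percent change plus a simple bullish/bearish flag) could be sketched as below. The quote field names (close, previous_close) are assumptions based on typical TwelveData quote responses, not the workflow's exact schema.

```javascript
// Hypothetical sketch of the "Normalize Market Data" step: compute the daily
// percent change from TwelveData-style quote fields and tag a basic trend.
function normalizeQuote(symbol, quote) {
  const close = parseFloat(quote.close);
  const prev = parseFloat(quote.previous_close);
  const changePct = ((close - prev) / prev) * 100;
  return {
    symbol,
    close,
    change_pct: Math.round(changePct * 100) / 100, // round to 2 decimals
    trend: changePct >= 0 ? "bullish" : "bearish",
  };
}

// Example: SPY closing at 502.50 after a 500.00 previous close -> +0.5%.
const row = normalizeQuote("SPY", { close: "502.50", previous_close: "500.00" });
```

Rows like this would then be aggregated and handed to the AI Market Insights node as the dataset for the macro-strategist prompt.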
by Mychel Garzon
**AI-Powered CV Feedback & Fit Score**

This workflow uses AI to automatically analyze a candidate’s CV against any job posting. It extracts key skills, requirements, and gaps, then generates a clear fit summary, recommendations, and optimization tips. Candidates also receive a structured email report, helping them improve their CV and focus on the right roles. No more guesswork: the workflow delivers objective, AI-powered career insights in minutes.

**Benefits**
• Automated CV analysis: Instantly compare your CV with any job description.
• Clear recommendations: Get a fit score (1–10) plus “Apply,” “Consider,” or “Not a fit.”
• Actionable feedback: See missing skills and concrete optimization tips.
• Email reports: Candidates receive a professional summary directly in their inbox.

**Target Audience**
• Job seekers
• Career coaches and recruiters
• HR teams evaluating candidate job alignment
• Tech bootcamps and training programs

**Required APIs**
• Google Gemini API (AI analysis)
• Email credentials (send candidate reports)

**Easy Customization**
• Fit score logic: Adjust thresholds for “Apply,” “Consider,” and “Not a fit.”
• Email templates: Personalize branding, tone, or add follow-up resources.
• Delivery channels: Add Slack, Teams, or WhatsApp nodes for real-time feedback.
• Language detection: Extend to more languages by adding translation nodes.
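A minimal sketch of the fit-score logic mentioned under Easy Customization. The 8/5 cutoffs here are illustrative assumptions you would tune, not the template's actual thresholds.

```javascript
// Hypothetical mapping of the 1-10 fit score to the three recommendations.
// Cutoffs (>=8 Apply, >=5 Consider) are assumptions for illustration.
function fitRecommendation(score) {
  if (score >= 8) return "Apply";
  if (score >= 5) return "Consider";
  return "Not a fit";
}
```

Changing these two cutoffs is all the "Adjust thresholds" customization amounts to in practice.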
by Adam Goodyer
**Video Digestion Workflow — n8n Template**

**Description**

**How it works**

This workflow takes any YouTube video URL and automatically extracts a rich, structured analysis — including transcript, key visual moments, video metadata, SEO keywords, and content section breakdowns. It's designed as the foundation layer for content repurposing, feeding its output into downstream workflows for creating Shorts, LinkedIn posts, Twitter threads, blog articles, email newsletters, and more.

The pipeline:
1. YouTube URL Input — A simple form trigger accepts any YouTube video URL.
2. Video Download (Apify) — Downloads the video file at 720p via the Apify YouTube Video Downloader actor.
3. Transcript Extraction (Apify) — Pulls the full transcript with timestamps from YouTube using the Apify YouTube Video Transcript actor. No audio processing needed — fast and reliable.
4. Data Consolidation — A Code node merges both Apify outputs into a single structured object containing: video URL, transcript text, timestamped segments, and video metadata (title, description, duration, channel info, like/comment counts, thumbnail, publish date).
5. Visual Analysis (Google Gemini Pro) — Sends the actual video to Gemini's video analysis endpoint, which watches the entire video and identifies key B-roll moments with precise timestamps, app detection, and webcam overlay awareness. It categorises clips as clean screen recordings vs. webcam overlays vs. talking head segments.
6. Key Action Parsing — Filters and categorises the Gemini output into usable clips, removing talking-head-only segments and incomplete data. Outputs chronologically sorted clips with cropping metadata for downstream video editing.
7. AI Section Analysis (OpenAI) — Sends the transcript + key moments to OpenAI with structured output (JSON schema) to generate: video summary, one-liner, main argument, target audience, content style, tone, key takeaways, problems addressed, tools mentioned, frameworks explained, suggested titles, and SEO keywords.
8. Output — The final structured payload is ready to pass to any downstream workflow (e.g., Shorts creation, social media posting, blog generation).

**Setup guide**

Required accounts & API keys: you'll need API credentials for the following services.

| Service | What it does | Sign up |
|---------|-------------|---------|
| Apify | YouTube video downloading + transcript extraction | https://apify.com |
| Google AI Studio (Gemini) | Video analysis — watches the video and detects key visual moments | https://aistudio.google.com |
| OpenAI | Structured content analysis with JSON schema output | https://platform.openai.com |

Required Apify actors: add these two Apify actors to your account.
- YouTube Video Downloader by epctex — https://apify.com/epctex/youtube-video-downloader
- YouTube Video Transcript by starvibe — https://apify.com/starvibe/youtube-video-transcript

n8n credentials to configure:
- **Apify API** — Add your Apify API token in n8n credentials
- **Google Gemini** — Add your Google AI Studio API key in n8n credentials
- **OpenAI** — Add your OpenAI API key in n8n credentials

Steps:
1. Import the workflow into n8n
2. Configure all three credential sets (Apify, Gemini, OpenAI)
3. Ensure both Apify actors are added to your Apify account
4. Activate the workflow
5. Open the form trigger URL and paste any YouTube video URL
6. The workflow outputs a comprehensive JSON payload ready for downstream workflows

**What you can build with the output**

The structured output from this workflow is designed to be piped into other workflows.
Some ideas:
- **YouTube Shorts creation** — Use the key moments + timestamps to auto-clip and render short-form content
- **LinkedIn carousel posts** — Pull key takeaways and section summaries
- **Twitter/X threads** — Convert section breakdowns into threaded posts
- **Blog articles** — Use the full transcript + structure as a draft foundation
- **Email newsletters** — Summarise the video for your subscriber list
- **SEO-optimised descriptions** — Auto-generate YouTube descriptions with keywords

**Nodes used**
- Form Trigger (n8n built-in)
- Apify (x2 — video download + transcript)
- Code (x2 — data consolidation + key action parsing)
- Google Gemini (video analysis)
- OpenAI (structured content analysis with JSON schema)
- Edit Fields (data mapping)
- Execute Workflow (optional — calls downstream Shorts creation workflow)

Built by @adamfreelances — The Anti-Guru Technical Educator. Real workflows, real implementation, no fluff.
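The Key Action Parsing step (pipeline step 6) might be sketched as follows. The clip shape (startSec, endSec, kind) is an assumed illustration, not Gemini's actual output schema or the template's Code-node implementation.

```javascript
// Illustrative sketch of key-action parsing: drop talking-head-only or
// incomplete clips, then sort the survivors chronologically.
function parseKeyActions(clips) {
  return clips
    .filter(c => c.kind !== "talking_head") // keep only B-roll-worthy clips
    .filter(c => Number.isFinite(c.startSec) && Number.isFinite(c.endSec)) // drop incomplete data
    .sort((a, b) => a.startSec - b.startSec); // chronological order
}

const usable = parseKeyActions([
  { startSec: 90, endSec: 95, kind: "screen_recording" },
  { startSec: 10, endSec: 20, kind: "talking_head" },
  { startSec: 30, endSec: 40, kind: "webcam_overlay" },
  { startSec: 50, kind: "screen_recording" }, // incomplete: no endSec
]);
// usable: two clips, the 30s webcam overlay followed by the 90s screen recording
```

Cropping metadata for the downstream editor would simply ride along as extra properties on each clip object.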
by Jitesh Dugar
**Who’s it for**

This template is designed for anyone who wants to use Telegram as a personal AI assistant hub. If you often juggle tasks, emails, calendars, and expenses across multiple tools, this workflow consolidates everything into one seamless AI-powered agent.

**What it does**

Jarvis listens to your Telegram messages (text or audio) and processes them with OpenAI. Based on your request, it can:

- ✅ Manage tasks (create, complete, or delete)
- 📅 Handle calendar events (schedule, reschedule, or check availability)
- 📧 Send, draft, or fetch emails with Gmail
- 👥 Retrieve Google Contacts
- 💵 Log and track expenses

All responses are returned directly to Telegram, giving you a unified command center.

**How to set up**

1. Clone this template into your n8n workspace.
2. Connect your accounts (Telegram, Gmail, Google Calendar, Contacts, etc.).
3. Add your OpenAI API key in the Credentials section.
4. Test by sending a Telegram message like “Create a meeting tomorrow at 3pm”, “Add expense $50 for lunch”, or “Draft a reply with a project proposal to that email from Steve”.

**Requirements**

- n8n instance (cloud or self-hosted)
- Telegram Bot API credentials
- Gmail, Google Calendar, and Google Contacts credentials (optional, if using those features)
- OpenAI API key
- ElevenLabs API key (optional, if you need audio note support)

**How to customize**

- Swap Gmail with another email provider by replacing the Gmail MCP node.
- Add additional MCP integrations (e.g., Notion, Slack, CRM tools).
- Adjust memory length to control how much context Jarvis remembers.

With this template, you can transform Telegram into your all-in-one AI assistant, simplifying workflows and saving hours every week.
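Under the hood, returning a response to Telegram is a single Bot API `sendMessage` call, which the workflow's final Telegram node performs for you. A minimal sketch of the same call, assuming placeholder bot token and chat ID:

```python
TELEGRAM_API = "https://api.telegram.org"

def build_send_message(bot_token, chat_id, text):
    """Build the Telegram Bot API sendMessage request the final node issues.
    bot_token and chat_id are placeholders from your own bot setup."""
    url = f"{TELEGRAM_API}/bot{bot_token}/sendMessage"
    params = {"chat_id": chat_id, "text": text}
    return url, params

# e.g. the confirmation Jarvis sends after creating a calendar event
url, params = build_send_message("123:ABC", 42, "✅ Meeting created for tomorrow at 3pm")
```

In practice you would POST `params` to `url` (or let the n8n Telegram node handle authentication and delivery).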
by Rahul Joshi
**📘 Description**

This workflow automates the employee onboarding process by creating Jira accounts, generating Notion onboarding checklists, crafting AI-generated welcome messages, and sending personalized welcome emails — all automatically. It provides a complete hands-free onboarding experience for HR and IT teams by connecting Jira, Notion, Google Sheets, Gmail, and Azure OpenAI. Failures (like Jira account creation errors) are logged to Google Sheets to ensure full transparency and no missed onboardings.

**⚙️ What This Workflow Does (Step-by-Step)**

**🟢 When Clicking “Execute Workflow”**
Manually triggers the entire onboarding automation. Useful for testing or initiating onboarding on demand for a new hire.

**👤 Define New Hire Profile Data**
Structures all essential employee information into a clean dataset including name, email, start date, buddy, and access links (Slack, GitHub, Jira, Notion). Acts as the single source of truth for all downstream systems, ensuring consistent, error-free onboarding data.

**🎫 Create Jira User Account**
Automatically creates a Jira account for the new employee using REST API calls. Includes email, display name, username, and product access (Jira Software). Removes the need for manual admin setup and ensures immediate access to project boards.

**✅ Validate Jira Account Creation Success**
Checks whether the Jira API response contains a valid accountId. If successful, onboarding continues; if not, the error is logged to Google Sheets. Ensures downstream steps don’t run if Jira setup fails.

**📊 Log Jira Provisioning Failures to Error Sheet**
Appends any account creation errors (duplicate emails, invalid permissions, or API issues) to an error log sheet in Google Sheets. Helps HR/IT monitor issues and resolve them manually. Guarantees no silent onboarding failures.

**📋 Generate Notion Onboarding Checklist**
Creates a personalized Notion page titled “{Name} - Onboarding Checklist” that includes:
- Welcome message
- Access links (Slack, GitHub, Jira)
- Assigned buddy details
- Start date and status
- Optionally, embedded videos or docs

Gives each new hire a structured hub to manage onboarding tasks independently.

**🤖 AI-Generated Welcome Message Creator**
Uses GPT-4o (Azure OpenAI) to craft a friendly, motivational welcome message for the new employee. Incorporates name, buddy, and access details with emojis and a warm tone. Ensures every message feels human and engaging, not robotic.

**🧠 GPT-4o Language Model Configuration**
Configures the AI assistant persona for personalized onboarding messages. Ensures tone consistency, friendliness, and empathy across all communications.

**🔗 Consolidate Onboarding Data Streams**
Merges data from Jira, Notion, and AI message generation into a single payload. This ensures the final email contains every onboarding element: access links, checklist URL, and the AI-generated message.

**📧 Format Comprehensive Welcome Email**
Generates a complete HTML-formatted email with:
- Personalized greeting
- AI-generated welcome message
- Clickable links (Jira, Notion, Slack, GitHub)
- Buddy info and start date

Designed for mobile responsiveness and branded presentation.

**📬 Send Welcome Email to New Hire**
Sends the final welcome email to the employee’s inbox with the subject “Welcome to Techdome, {Name}! 🎉”. Includes all essential access information, links, and team introductions, ensuring the new hire starts strong on Day 1.
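The Jira provisioning and validation steps above correspond to Jira Cloud's create-user endpoint, `POST /rest/api/3/user`. A minimal sketch, assuming Jira Cloud with admin email + API token auth (the site URL and credentials below are placeholders, and the exact payload fields accepted depend on your Jira edition and settings):

```python
import base64
import json
from urllib import request as urlreq

JIRA_BASE = "https://your-domain.atlassian.net"  # placeholder site URL

def build_user_payload(email):
    """Request body for POST /rest/api/3/user on Jira Cloud.
    The 'products' list grants Jira Software access, as the workflow does."""
    return {"emailAddress": email, "products": ["jira-software"]}

def create_jira_user(email, admin_email, api_token):
    body = json.dumps(build_user_payload(email)).encode()
    token = base64.b64encode(f"{admin_email}:{api_token}".encode()).decode()
    req = urlreq.Request(
        f"{JIRA_BASE}/rest/api/3/user",
        data=body,
        headers={"Content-Type": "application/json",
                 "Authorization": f"Basic {token}"},
        method="POST",
    )
    with urlreq.urlopen(req) as resp:
        data = json.load(resp)
    # The workflow validates success by checking for an accountId;
    # on failure, it appends the error to the Google Sheets error log instead.
    if "accountId" not in data:
        raise RuntimeError(f"Jira provisioning failed: {data}")
    return data["accountId"]
```

In the workflow itself this is an HTTP Request node with Jira Admin credentials, followed by an IF node on `accountId`.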
**🧩 Prerequisites**

- Jira Admin API credentials
- Notion API integration
- Gmail OAuth2 credentials
- Azure OpenAI (GPT-4o) access
- Google Sheets document for logging errors

**💡 Key Benefits**

- ✅ Fully automated new hire onboarding
- ✅ AI-generated personalized communications
- ✅ Real-time error logging for IT transparency
- ✅ Seamless integration across Jira, Notion, and Gmail
- ✅ Professional first-day experience with zero manual work

**👥 Perfect For**

- HR teams managing multiple onboardings
- IT admins automating access provisioning
- Startups scaling employee onboarding
- Organizations using the Jira + Notion + Gmail stack
by Oneclick AI Squad
This automated n8n workflow streamlines real estate marketing by combining voice campaigns and email outreach with AI-powered lead generation. The system monitors real estate offers, generates personalized promotional content using AI, creates targeted email campaigns, and manages lead follow-up through automated voice calls and CRM integration.

**Good to Know**

- Integrates voice campaign automation with email marketing for multi-channel outreach
- Uses the Llama 3.2 AI model for generating personalized promotional content
- Automatically syncs lead data with CRM systems for comprehensive tracking
- Includes delay mechanisms to ensure proper data synchronization
- Supports both email and voice-based lead nurturing strategies

**How It Works**

- **Watch Real Estate Offer** - Monitors incoming real estate listings and opportunities to trigger marketing campaigns
- **Get Client Contact List** - Fetches targeted client information and contact details from CRM or database systems
- **Generate Promo Content with Llama** - Uses AI to create personalized marketing content based on property details and client preferences
- **Trigger Voice Campaign via VAPI** - Initiates automated voice calls to prospects using personalized messaging
- **Create Personalized Email Template** - Generates custom HTML email templates with property information and promotional content
- **Email Promo to Clients (Gmail)** - Sends targeted email campaigns to segmented client lists through Gmail integration
- **Delay to Sync Data** - Ensures proper data synchronization between systems before processing leads
- **Receive Lead Data from VAPI** - Captures lead information and responses from voice campaign interactions
- **Save Lead to CRM Sheet** - Logs all lead data and campaign results to a spreadsheet for tracking and analysis
- **Send Acknowledgment to VAPI** - Confirms successful lead processing and maintains system synchronization

**How to Use**

1. Import the workflow into n8n
2. Configure VAPI credentials for voice campaign automation
3. Set up the Gmail API for email marketing integration
4. Connect a CRM or Google Sheets for lead management
5. Configure Llama 3.2 AI model access
6. Test with sample real estate data
7. Monitor campaign performance and lead conversion rates

**Requirements**

- VAPI account for voice campaigns
- Gmail API credentials
- Llama 3.2 AI model access
- Google Sheets or CRM integration
- Real estate data source

**Customizing This Workflow**

- Adjust AI prompts for different property types or market segments
- Modify email templates for various campaign styles
- Configure voice campaign scripts based on the target audience
- Set up custom lead scoring and qualification criteria
- Integrate additional CRM systems or marketing platforms
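The "Create Personalized Email Template" step is essentially string interpolation over property and client fields. A minimal sketch of what that Code node could do; the HTML markup and every field name here are illustrative, not the template's actual output:

```python
from string import Template

# Illustrative HTML email body; placeholders are hypothetical field names.
PROMO_HTML = Template("""\
<html><body>
  <h1>New listing: $title</h1>
  <p>Hi $client_name,</p>
  <p>$ai_promo_copy</p>
  <p><b>$price</b> &middot; $bedrooms bd &middot; $city</p>
  <a href="$listing_url">View the property</a>
</body></html>""")

def render_promo_email(client, offer, ai_copy):
    """Merge CRM client data and offer details with the Llama-generated copy."""
    return PROMO_HTML.substitute(
        client_name=client["name"],
        title=offer["title"],
        price=offer["price"],
        bedrooms=offer["bedrooms"],
        city=offer["city"],
        listing_url=offer["url"],
        ai_promo_copy=ai_copy,
    )

html = render_promo_email(
    {"name": "Priya"},
    {"title": "Sunny 2BR Condo", "price": "$450,000", "bedrooms": 2,
     "city": "Austin", "url": "https://example.com/listing/123"},
    "A bright corner unit minutes from downtown.",
)
```

The rendered `html` is then passed to the Gmail node as the message body.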
by Abdul Mir
**Overview**

Stop spending hours formatting proposals. This workflow turns a short post-call form into a high-converting, fully personalized PandaDoc proposal, plus it updates your CRM and drafts the follow-up email for you.

After a sales call, just fill out a 3-minute form summarizing key pain points, solutions pitched, and the price. The workflow uses AI to generate polished proposal copy, then builds a PandaDoc draft using dynamic data mapped into the JSON body (which you can fully customize per business). It also updates the lead record in ClickUp with the proposal link, company name, and quote, then creates an email draft in Gmail, ready to send.

**Who’s it for**

- Freelancers and consultants sending service proposals
- Agencies closing deals over sales calls
- Sales reps who want to automate proposal follow-up
- Teams using ClickUp as their lightweight CRM

**How it works**

1. After a call, fill out a short form with client details, pitch notes, and price
2. AI generates professional proposal copy based on form input
3. Proposal is formatted and sent to PandaDoc via HTTP request
4. ClickUp lead is updated with: Company Name, Proposal URL, Quote/price
5. A Gmail draft is created using the proposal link and a thank-you message

**Example use case**

> You hop off a call, fill out:
> - Prospect: Shopify agency
> - Pain: No lead gen system
> - Solution: Automated cold outreach
> - Price: $2,500/month
>
> 3 minutes later: the PandaDoc proposal is ready, the CRM is updated, and your email draft is waiting to be sent.

**How to set up**

1. Replace the form with your preferred tool (e.g. Tally, Typeform)
2. Connect the PandaDoc API and structure your proposal template
3. Customize the JSON body inside the HTTP request to match your business
4. Link your ClickUp space and custom fields
5. Connect Gmail (or another email tool) for the final follow-up draft

**Requirements**

- Form tool for capturing sales call notes
- OpenAI or LLM key for generating proposal copy
- PandaDoc API access
- ClickUp custom fields set up for lead tracking
- Gmail integration

**How to customize**

- Customize your PandaDoc proposal fields in the JSON body of the HTTP node
- Replace ClickUp with another CRM like HubSpot or Notion
- Adjust the AI tone (casual, premium, corporate) for proposal writing
- Add Slack or Telegram alerts when the draft is ready
- Add PDF generation or an auto-send email step
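The JSON body sent to PandaDoc targets its create-document-from-template endpoint (`POST https://api.pandadoc.com/public/v1/documents`). A hedged sketch of what the HTTP node's body might look like; the `template_uuid` is a placeholder and the token names must match the tokens you defined in your own PandaDoc template:

```python
import json

def build_pandadoc_body(client, proposal_copy, price):
    """Illustrative request body for PandaDoc document creation.
    template_uuid and token names are placeholders; map them to your template."""
    return {
        "name": f"Proposal - {client['company']}",
        "template_uuid": "YOUR_TEMPLATE_UUID",  # placeholder
        "recipients": [{
            "email": client["email"],
            "first_name": client["first_name"],
            "last_name": client["last_name"],
            "role": "Client",
        }],
        "tokens": [
            {"name": "Proposal.Copy", "value": proposal_copy},   # hypothetical token
            {"name": "Proposal.Price", "value": price},          # hypothetical token
        ],
    }

body = build_pandadoc_body(
    {"company": "Acme Shopify Agency", "email": "jane@example.com",
     "first_name": "Jane", "last_name": "Doe"},
    "We will build an automated cold outreach system...",
    "$2,500/month",
)
print(json.dumps(body, indent=2))
```

In n8n, paste the equivalent JSON (with expressions pulling from the form and AI nodes) into the HTTP Request node's body and authenticate with your PandaDoc API key.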