by Khair Ahammed
Meet Troy, your intelligent personal assistant that seamlessly manages your Google Calendar and Tasks through Telegram. This workflow combines AI-powered natural language processing with MCP (Model Context Protocol) integration to provide a conversational interface for scheduling meetings, managing tasks, and organizing your digital life.

## Key Features

### 📅 Smart Calendar Management
- Create single and recurring events with conflict detection
- Support for multiple attendees (1-2 attendee variants)
- Automatic time zone handling (Bangladesh Standard Time)
- Weekly recurring event scheduling
- Event retrieval, updates, and deletion

### ✅ Task Management
- Create, update, and delete tasks in Google Tasks
- Mark tasks as completed
- Retrieve task lists with completion status
- Task repositioning and organization
- Parent-child task relationships

### 🤖 Intelligent Processing
- Natural language understanding for scheduling requests
- Automatic conflict detection before event creation
- Context-aware responses with conversation memory
- Error handling with fallback messages

### 📱 Telegram Interface
- Real-time chat interaction
- Simple commands and natural language
- Instant confirmations and updates
- Error notifications

## Workflow Components

**Core Architecture:**
- Telegram Trigger for user messages
- AI Agent with GPT-4o-mini processing
- MCP Client Tools for Google services
- Conversation memory for context
- Error handling with backup responses

**MCP Integrations:**
- Google Calendar MCP Server (6 specialized tools)
- Google Tasks MCP Server (5 task operations)
- Custom HTTP tool for advanced task positioning

## Use Cases

**Calendar Scenarios:**
- "Schedule a meeting tomorrow at 3 PM with john@example.com"
- "Set up weekly team standup every Monday at 10 AM"
- "Check my calendar for conflicts this afternoon"
- "Delete the meeting with ID xyz123"

**Task Management:**
- "Add a task to buy groceries"
- "Mark the project report task as completed"
- "Update my presentation task due date to Friday"
- "Show me all pending tasks"

## Setup Requirements

**Required Credentials:**
- Google Calendar OAuth2
- Google Tasks OAuth2
- OpenAI API key
- Telegram Bot token

**MCP Configuration:**
- Two MCP server endpoints for Google services
- Proper webhook configurations
- SSL-enabled n8n instance for MCP triggers

## Business Benefits
- **Productivity:** Voice-to-action task and calendar management
- **Efficiency:** Eliminate app switching with chat interface
- **Intelligence:** AI prevents scheduling conflicts automatically
- **Accessibility:** Simple Telegram commands for complex operations

## Technical Specifications

**Components:**
- 1 Telegram trigger
- 1 AI Agent with memory
- 2 MCP triggers (Calendar & Tasks)
- 13 Google service tools
- Error handling flows

**Response Time:** Sub-second for most operations
**Memory:** Session-based conversation context
**Timezone:** Automatic Bangladesh Standard Time conversion

This personal assistant transforms how you interact with Google services, making scheduling and task management as simple as sending a text message to Troy on Telegram.

**Tags:** personal-assistant, mcp-integration, google-calendar, google-tasks, telegram-bot, ai-agent, productivity
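The conflict detection the agent performs before creating an event boils down to an interval-overlap test against existing calendar entries. A minimal Python sketch of that logic (the workflow's actual check runs inside the Calendar MCP server, so this is illustrative, not the template's code):

```python
from datetime import datetime

def has_conflict(proposed_start, proposed_end, existing_events):
    """Return True if the proposed slot overlaps any existing event.

    existing_events is a list of (start, end) datetime pairs, e.g. pulled
    from a calendar event list. Two intervals overlap exactly when each
    one starts before the other ends.
    """
    return any(start < proposed_end and proposed_start < end
               for start, end in existing_events)
```

Back-to-back events (one ending exactly when the next starts) are deliberately not treated as conflicts by this test.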
by AFK Crypto
## Try It Out!

The AI Investment Research Assistant (Discord Summary Bot) transforms your Discord server into a professional-grade AI-driven crypto intelligence center. Running automatically every morning, it gathers real-time news, sentiment, and market data from multiple trusted sources — including NewsAPI, CryptoCompare, and CoinGecko — covering the most influential digital assets like BTC, ETH, SOL, BNB, and ADA.

An AI Research Analyst Agent then processes this data using advanced reasoning and summarization to deliver a structured Market Intelligence Briefing. Each report distills key market events, sentiment shifts, price movements, and analyst-grade insights, all formatted into a visually clean and actionable message that posts directly to your Discord channel.

Whether you're a fund manager, community owner, or analyst, this workflow helps you stay informed about market drivers — without manually browsing dozens of news sites or data dashboards.

## Detailed Use Cases
- **Crypto Research Teams:** Automate daily market briefings across key assets.
- **Investment Communities:** Provide daily insights and sentiment overviews directly on Discord.
- **Trading Desks:** Quickly review summarized market shifts and performance leaders.
- **DAOs or Fund Analysts:** Centralize institutional-style crypto intelligence into your server.

## How It Works
1. **Daily Trigger (Schedule Node)** – Activates each morning to begin data collection.
2. **News Aggregation Layer** – Uses NewsAPI (and optionally CryptoPanic or GDELT) to fetch the latest crypto headlines and event coverage.
3. **Market & Sentiment Fetch** – Collects market metrics via CoinGecko or CryptoCompare, including:
   - 24-hour price change
   - Market cap trend
   - Social sentiment or Fear & Greed index
4. **AI Research Analyst (LLM Agent)** – Processes and synthesizes all data into a cohesive insight report containing:
   - 🧠 Executive Summary
   - 📊 Top Gainers & Losers
   - 💬 Sentiment Overview
   - 🔍 Analyst Take / Actionable Insight
5. **Formatting Layer (Code Node)** – Converts the analysis into a Discord-ready structure.
6. **Discord Posting Node** – Publishes the final Market Intelligence Briefing to a specified Discord channel.

## Setup and Customization
1. Import this workflow into your n8n workspace.
2. Configure credentials:
   - **NewsAPI Key** – For crypto and blockchain news.
   - **CoinGecko / CryptoCompare API Key** – For real-time asset data.
   - **LLM Credential** – OpenAI, Gemini, or Anthropic.
   - **Discord Webhook URL or Bot Token** – To post updates.
3. Customize the tracked assets in the News and Market nodes (BTC, ETH, SOL, BNB, ADA, etc.).
4. Set your local timezone for report delivery.
5. Deploy and activate — your server will receive automated morning briefings.

## Output Format

Each daily report includes:

📰 AI Market Intelligence Briefing
📅 Date: October 16, 2025
💰 Top Movers: BTC +2.3%, SOL +1.9%, ETH -0.8%
💬 Sentiment: Moderately Bullish
🔍 Analyst Take: Accumulation signals forming in mid-cap layer-1s.
📈 Outlook: Positive bias, with ETH showing strong support near $2,400.

Compact yet rich in insight, this format ensures quick readability and fast decision-making for traders and investors.

## (Optional) Extend This Workflow
- **Portfolio-Specific Insights:** Fetch your wallet holdings from AFK Crypto or Zapper APIs for personalized reports.
- **Interactive Commands:** Add /compare or /analyze commands for Discord users.
- **Multi-Language Summaries:** Auto-translate for international communities.
- **Historical Data Logging:** Store briefings in Notion or Google Sheets.
- **Weekly Recaps:** Summarize all daily reports into a long-form analysis.
## Requirements
- **n8n Instance** (with HTTP Request, AI Agent, and Discord nodes enabled)
- **NewsAPI Key**
- **CoinGecko / CryptoCompare API Key**
- **LLM Credential** (OpenAI / Gemini / Anthropic)
- **Discord Bot Token or Webhook URL**

## APIs Used
- GET https://newsapi.org/v2/everything?q=crypto OR bitcoin OR ethereum OR defi OR nft&language=en&sortBy=publishedAt&pageSize=10
- GET https://api.coingecko.com/api/v3/simple/price?ids=bitcoin,ethereum,solana&vs_currencies=usd&include_market_cap=true&include_24hr_change=true
- (Optional) GET https://cryptopanic.com/api/v1/posts/?auth_token=YOUR_TOKEN&kind=news
- (Optional) GET https://api.gdeltproject.org/api/v2/doc/doc?query=crypto&format=json

## Summary

The AI Investment Research Assistant (Discord Summary Bot) is your personal AI research analyst — delivering concise, data-backed crypto briefings directly to Discord. It intelligently combines news aggregation, sentiment analysis, and AI reasoning to create actionable market intelligence each morning. Ideal for crypto traders, funds, or educational communities seeking a reliable daily edge — this workflow replaces hours of manual research with one automated, professional-grade summary.

Our Website: https://afkcrypto.com/
Check our blogs: https://www.afkcrypto.com/blog
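The Formatting Layer (Code node) essentially turns the fetched price data into the briefing text shown above. A Python sketch of that step, using a response shaped like CoinGecko's /simple/price output (the asset list and symbol map are illustrative assumptions):

```python
# Maps CoinGecko ids to ticker symbols; extend for BNB, ADA, etc.
SYMBOLS = {"bitcoin": "BTC", "ethereum": "ETH", "solana": "SOL"}

def format_briefing(date, prices, sentiment, take):
    """Build the Discord-ready briefing text.

    prices mirrors the /simple/price response, e.g.
    {"bitcoin": {"usd": 67000, "usd_24h_change": 2.3}, ...}
    """
    # Sort assets by 24h change, biggest movers first
    movers = sorted(prices.items(),
                    key=lambda kv: kv[1]["usd_24h_change"], reverse=True)
    mover_line = ", ".join(
        f"{SYMBOLS.get(coin, coin.upper())} {data['usd_24h_change']:+.1f}%"
        for coin, data in movers
    )
    return "\n".join([
        "📰 AI Market Intelligence Briefing",
        f"📅 Date: {date}",
        f"💰 Top Movers: {mover_line}",
        f"💬 Sentiment: {sentiment}",
        f"🔍 Analyst Take: {take}",
    ])
```

The resulting string can be posted as-is via a Discord webhook's `content` field.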
by Hugo
## 🤖 n8n AI Workflow Dashboard Template

### Overview

This template collects execution data from your AI workflows and generates an interactive dashboard for easy monitoring. It's compatible with any AI Agent or RAG workflow in n8n.

### Main Objectives

**💾 Collect Execution Data**
- Track messages, tokens used (prompt/completion), session IDs, model names, and compute costs
- Designed to plug into any AI agent or RAG workflow in n8n

**📊 Generate an Interactive Dashboard**
- Visualize KPIs like total messages, unique sessions, tokens used, and costs
- Display daily charts, including stacked bars for prompt vs completion tokens
- Monitor AI activity, analyze usage, and track costs at a glance

### ✨ Key Features

**💬 Conversation Data Collection**

Messages sent to the AI agent are recorded with:
- sessionId
- chatInput
- output
- promptTokens, completionTokens, totalTokens
- globalCost and modelName

This allows detailed tracking of AI interactions across sessions.

**💰 Model Pricing Management**
- A sub-workflow with a Set node provides token prices for LLMs
- Data is stored in the Model price table for cost calculations

**🗄️ Data Storage via n8n Data Tables**

Two tables need to be created:

Model price:

```json
{
  "id": 20,
  "createdAt": "2025-10-11T12:16:47.338Z",
  "updatedAt": "2025-10-11T12:16:47.338Z",
  "name": "claude-4.5-sonnet",
  "promptTokensPrice": 0.000003,
  "completionTokensPrice": 0.000015
}
```

Messages:

```json
[
  {
    "id": 20,
    "createdAt": "2025-10-11T15:28:00.358Z",
    "updatedAt": "2025-10-11T15:31:28.112Z",
    "sessionId": "c297cdd4-7026-43f8-b409-11eb943a2518",
    "action": "sendMessage",
    "output": "Hey! \nHow's it going?",
    "chatInput": "yo",
    "completionTokens": 6,
    "promptTokens": 139,
    "totalTokens": 145,
    "globalCost": null,
    "modelName": "gpt-4.1-mini",
    "executionId": 245
  }
]
```

These tables store conversation data and pricing info to feed the dashboard and calculations.
### 📈 Interactive Dashboard
- **KPIs Generated**: total messages, unique sessions, total/average tokens, total/average cost
- **💸 Charts Included**: daily messages, tokens used per day (prompt vs completion, stacked bar)
- Provides a visual summary of AI workflow performance

### ⚙️ Installation & Setup

Follow these steps to set up and run the workflow in n8n:

1. **Import the Workflow**
   - Download or copy the JSON workflow and import it into n8n.
2. **Create the Data Tables**
   - **Model price table**: stores token prices per model
   - **Messages table**: stores messages generated by the AI agent
3. **Configure the Webhook**
   - The workflow is triggered via a webhook
   - Use the webhook URL to send conversation data
4. **Set Up the Pricing Sub-workflow**
   - Automatically generates price data for the models used
   - Connect it to your main workflow to enrich cost calculations
5. **Dashboard Visualization**
   - The workflow returns HTML code rendering the dashboard
   - View it in a browser or embed it in your interface

🌐 Once configured, your workflow tracks AI usage and costs in real time, providing a live dashboard for quick insights.

### 🔧 Adaptability
- The template is modular and can be adapted to any AI agent or RAG workflow
- KPIs, charts, colors, and metrics can be customized in the HTML rendering
- Ideal for monitoring, cost tracking, and reporting AI workflow performance
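The cost enrichment joins each Messages row with its Model price row and multiplies token counts by per-token prices. A Python sketch of that calculation, using the field names from the tables above:

```python
def message_cost(prompt_tokens, completion_tokens, price):
    """Compute globalCost for one message from its Model price record.

    price mirrors a Model price table row, e.g.
    {"promptTokensPrice": 0.000003, "completionTokensPrice": 0.000015}.
    """
    return (prompt_tokens * price["promptTokensPrice"]
            + completion_tokens * price["completionTokensPrice"])
```

With the sample rows above (139 prompt and 6 completion tokens at Claude 4.5 Sonnet prices), this yields about $0.0005 for the message.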
by Paul Roussel
Automated workflow that generates custom AI image backgrounds from text prompts using Gemini's Nano Banana (native image generation), removes video backgrounds, and composites videos on AI-generated scenes. Create any background you can imagine without needing stock images.

## How it works
- **Describe background**: Provide a video URL and a text prompt describing the desired background scene (e.g., "modern office with city skyline at golden hour")
- **AI generates image**: Gemini creates a background image from your prompt in ~10-20 seconds
- **Upload to Drive**: The generated background is saved to Google Drive and made publicly accessible
- **Remove & composite**: The video background is removed and the video is composited on the AI-generated scene with a centered template
- **Save final video**: The completed video is uploaded to Google Drive with a shareable link

## Set up steps

⏱️ Total setup time: ~5 minutes
- **Get Gemini API Key (~1 min)**: Visit https://aistudio.google.com/apikey, create a new API key, and add it to n8n Settings → Variables as GEMINI_KEY
- **Get VideoBGRemover API Key (~2 min)**: Visit https://videobgremover.com/n8n, sign up, and add the key to n8n as VIDEOBGREMOVER_KEY
- **Connect Google Drive (~2 min)**: Click the "Save Background Image to Drive" node, click "Connect", and authorize n8n

## Use cases
- Marketing videos with custom branded environments tailored to your message
- Product demos with unique AI-generated backgrounds that match your product aesthetic
- Social media content with creative scenes you can't find in stock libraries
- AI avatars placed in AI-generated worlds
- Presentations with custom backgrounds generated for specific topics
- A/B testing different background variations for the same video

## Pricing
- Gemini: ~$0.03 per generated image
- VideoBGRemover: $0.50-$2.00 per minute of video
- Total: ~$0.53-$2.03 per video

**Triggers**: Webhook (for automation) or Manual (for testing)
**Processing time**: Typically 5-7 minutes total

**Prompt tips**: Be descriptive and specific.
Instead of "office," try: "A modern minimalist office with floor-to-ceiling windows overlooking a city skyline at golden hour. Warm sunlight, polished concrete floors, sleek wooden desks, green plants."
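When using the webhook trigger, the request body carries the video URL and the background prompt. A minimal Python sketch of building that payload; the field names here are assumptions for illustration, so match them to what the Webhook node actually expects:

```python
import json

# Hypothetical request body for the webhook trigger (field names assumed).
payload = {
    "video_url": "https://example.com/uploads/talking-head.mp4",
    "background_prompt": ("A modern minimalist office with floor-to-ceiling "
                          "windows overlooking a city skyline at golden hour"),
}
body = json.dumps(payload)
```

POST this body with a `Content-Type: application/json` header to the workflow's webhook URL.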
by Nikan Noorafkan
## 🧠 Google Ads Monthly Performance Optimization (Channable + Google Ads + Relevance AI)

### 🚀 Overview

This workflow automatically analyzes your Google Ads performance every month, identifies top-performing themes and categories, and regenerates optimized ad copy using Relevance AI — powered by insights from your Channable product feed. It then saves the improved ads to Google Sheets for review and sends a detailed performance report to your Slack workspace.

Ideal for marketing teams who want to automate ad optimization at scale with zero manual intervention.

### 🔗 Integrations Used
- **Google Ads** → Fetch campaign and ad performance metrics using GAQL.
- **Relevance AI** → Analyze performance data and regenerate ad copy using AI agents and tools.
- **Channable** → Pull updated product feeds for ad refresh cycles.
- **Google Sheets** → Save optimized ad copy for review and documentation.
- **Slack** → Send a 30-day performance report to your marketing team.

### 🧩 Workflow Summary

| Step | Node | Description |
| ---- | ---- | ----------- |
| 1 | Monthly Schedule Trigger | Runs automatically on the 1st of each month to review the last 30 days of data. |
| 2 | Get Google Ads Performance Data | Fetches ad metrics via GAQL query (impressions, clicks, CTR, etc.). |
| 3 | Calculate Performance Metrics | Groups results by ad group and theme to find top/bottom performers. |
| 4 | AI Performance Analysis (Relevance AI) | Generates human-readable insights and improvement suggestions. |
| 5 | Update Knowledge Base (Relevance AI) | Saves new insights for future ad copy training. |
| 6 | Get Updated Product Feed (Channable) | Retrieves the latest catalog items for ad regeneration. |
| 7 | Split Into Batches | Splits the feed into groups of 50 to avoid API rate limits. |
| 8 | Regenerate Ad Copy with Insights (Relevance AI) | Rewrites ad copy with the latest product and performance data. |
| 9 | Save Optimized Ads to Sheets | Writes output to your "Optimized Ads" Google Sheet. |
| 10 | Generate Performance Report | Summarizes the AI analysis, CTR trends, and key insights. |
| 11 | Email Performance Report (Slack) | Sends the report directly to your Slack channel/team. |

### 🧰 Requirements

Before running the workflow, make sure you have:
- A Google Ads account with API access and OAuth2 credentials.
- A Relevance AI project (with one Agent and one Tool set up).
- A Channable account with API key and project feed.
- A Google Sheets document for saving results.
- A Slack webhook URL for sending performance summaries.

### ⚙️ Environment Variables

Add these environment variables to your n8n instance (via .env or the UI):

| Variable | Description |
| -------- | ----------- |
| GOOGLE_ADS_API_VERSION | API version (e.g., v17). |
| GOOGLE_ADS_CUSTOMER_ID | Your Google Ads customer ID. |
| RELEVANCE_AI_API_URL | Base Relevance AI API URL (e.g., https://api.relevanceai.com/v1). |
| RELEVANCE_AGENT_PERFORMANCE_ID | ID of your Relevance AI Agent for performance analysis. |
| RELEVANCE_KNOWLEDGE_SOURCE_ID | Knowledge base or dataset ID used to store insights. |
| RELEVANCE_TOOL_AD_COPY_ID | Relevance AI tool ID for generating ad copy. |
| CHANNABLE_API_URL | Channable API endpoint (e.g., https://api.channable.com/v1). |
| CHANNABLE_COMPANY_ID | Your Channable company ID. |
| CHANNABLE_PROJECT_ID | Your Channable project ID. |
| FEED_ID | The feed ID for product data. |
| GOOGLE_SHEET_ID | ID of your Google Sheet to store optimized ads. |
| SLACK_WEBHOOK_URL | Slack Incoming Webhook URL for sending reports. |

### 🔐 Credentials Setup in n8n

| Credential | Type | Usage |
| ---------- | ---- | ----- |
| Google Ads OAuth2 API | OAuth2 | Authenticates your Ads API queries. |
| HTTP Header Auth (Relevance AI & Channable) | Header | Uses your API key as `Authorization: Bearer <key>`. |
| Google Sheets OAuth2 API | OAuth2 | Writes optimized ads to Sheets. |
| Slack Webhook | Webhook | Sends monthly reports to your team channel. |

### 🧠 Example AI Insight Output

```json
{
  "insights": [
    "Ad groups using 'vegan' and 'organic' messaging achieved +23% CTR.",
    "'Budget' keyword ads underperformed (-15% CTR).",
    "Campaigns featuring 'new' or 'bestseller' tags showed higher conversion rates."
  ],
  "recommendations": [
    "Increase ad spend for top-performing 'vegan' and 'premium' categories.",
    "Revise copy for 'budget' and 'sale' ads with low CTR."
  ]
}
```

### 📊 Output Example (Google Sheet)

| Product | Category | Old Headline | New Headline | CTR Change | Theme |
| ------- | -------- | ------------ | ------------ | ---------- | ----- |
| Organic Protein Bar | Snacks | "Healthy Energy Anytime" | "Organic Protein Bar — 100% Natural Fuel" | +12% | Organic |
| Eco Face Cream | Skincare | "Gentle Hydration" | "Vegan Face Cream — Clean, Natural Moisture" | +17% | Vegan |

### 📤 Automation Flow
1. Runs automatically on the first of every month (cron: 0 0 1 * *).
2. Fetch Ads Data → Analyze & Learn → Generate New Ads → Save & Notify.
3. Every iteration updates the AI's knowledge base — improving your campaigns progressively.

### ⚡ Scalability
- The flow is batch-optimized (50 items per request).
- Works for large ad accounts with up to 10,000 ad records.
- AI analysis & regeneration steps are asynchronous-safe (timeouts extended).
- Perfect for agencies managing multiple ad accounts — simply duplicate and update the environment variables per client.

### 🧩 Best Use Cases
- Monthly ad creative optimization for eCommerce stores.
- Marketing automation for Google Ads campaign scaling.
- Continuous-learning ad systems powered by Relevance AI insights.
- Agencies automating ad copy refresh cycles across clients.
### 💬 Slack Report Example

30-Day Performance Optimization Report
Date: 2025-10-01
Analysis Period: Last 30 days
Ads Analyzed: 842

Top Performing Themes
- Vegan: 5.2% CTR (34 ads)
- Premium: 4.9% CTR (28 ads)

Underperforming Themes
- Budget: 1.8% CTR (12 ads)

AI Insights
- "Vegan" and "Premium" themes outperform baseline by +22% CTR.
- "Budget" ads underperform due to lack of value framing.

Next Optimization Cycle: 2025-11-01

### 🛠️ Maintenance Tips
- Update your GAQL query occasionally to include new metrics or segments.
- Refresh Relevance AI tokens every 90 days (if required).
- Review generated ads in Google Sheets before pushing them live.
- Test webhook and OAuth connections after major n8n updates.

### 🧩 Import Instructions
1. Open n8n → Workflows → Import from File / JSON.
2. Paste this workflow JSON or upload it.
3. Add all required environment variables and credentials.
4. Execute the first run manually to validate connections.
5. Once verified, enable scheduling for automatic monthly runs.

### 🧾 Credits

Developed for AI-driven marketing teams leveraging Google Ads, Channable, and Relevance AI to achieve continuous ad improvement — fully automated via n8n.
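The Calculate Performance Metrics step aggregates clicks and impressions per theme before ranking top and bottom performers. A Python sketch of that grouping (the field names are illustrative, not the node's literal schema):

```python
from collections import defaultdict

def ctr_by_theme(ads):
    """Group ad rows by theme and compute click-through rate per theme.

    ads is a list of dicts with "theme", "clicks", and "impressions" keys,
    roughly what the GAQL results look like after mapping.
    """
    totals = defaultdict(lambda: {"clicks": 0, "impressions": 0})
    for ad in ads:
        totals[ad["theme"]]["clicks"] += ad["clicks"]
        totals[ad["theme"]]["impressions"] += ad["impressions"]
    # CTR = clicks / impressions, guarding against empty themes
    return {
        theme: t["clicks"] / t["impressions"] if t["impressions"] else 0.0
        for theme, t in totals.items()
    }
```

Sorting the resulting dict by value immediately gives the "Top Performing" and "Underperforming" theme lists used in the Slack report.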
by Rahul Joshi
## Description

Automatically extract a structured skill matrix from PDF resumes in a Google Drive folder and store the results in Google Sheets. Uses Azure OpenAI (GPT-4o-mini) to analyze predefined tech stacks and filter for relevant proficiency. Fast, consistent insights ready for review. 🔍📊

## What This Template Does
- Fetches all resumes from a designated Google Drive folder ("Resume_store"). 🗂️
- Downloads each resume file securely via the Google Drive API. ⬇️
- Extracts text from PDF files for analysis. 📄➡️📝
- Analyzes skills with Azure OpenAI (GPT-4o-mini), rating each 1-5 and estimating years of experience. 🤖
- Parses and filters to include only skills with proficiency > 2, then updates Google Sheets ("Resume store" → "Sheet2"). ✅

## Key Benefits
- Saves hours on manual resume screening. ⏱️
- Produces a consistent, structured skill matrix. 📐
- Focuses on intermediate-to-expert skills for faster shortlisting. 🎯
- Centralizes candidate data in Google Sheets for easy sharing. 🗃️

## Features
- Predefined tech stack focus: React, Node.js, Angular, Python, Java, SQL, Docker, Kubernetes, AWS, Azure, GCP, HTML, CSS, JavaScript. 🧰
- Proficiency scoring (1-5) and estimated years of experience. 📈
- PDF-to-text extraction for robust parsing. 🧾
- JSON parsing with error handling for invalid outputs. 🛡️
- Manual Trigger to run on demand. ▶️

## Requirements
- n8n instance (cloud or self-hosted).
- Google Drive access with credentials to the "Resume_store" folder.
- Google Sheets access to the "Resume store" spreadsheet and "Sheet2" tab.
- Azure OpenAI with GPT-4o-mini deployed and connected via secure credentials.
- PDF text extraction enabled within n8n.

## Target Audience
- HR and Talent Acquisition teams. 👥
- Recruiters and staffing agencies. 🧑‍💼
- Operations teams managing hiring pipelines. 🧭
- Tech hiring managers seeking consistent skill insights. 💡

## Step-by-Step Setup Instructions
1. Place candidate resumes (PDF) into Google Drive → "Resume_store".
2. In n8n, add Google Drive and Google Sheets credentials and authorize access.
3. In n8n, add Azure OpenAI credentials (GPT-4o-mini deployment).
4. Import the workflow, assign credentials to each node, and confirm the folder/sheet names.
5. Run the Manual Trigger to execute the flow and verify the data in "Resume store" → "Sheet2".
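The "JSON parsing with error handling" step has to tolerate model replies that are wrapped in markdown fences or are not valid JSON at all, then apply the proficiency > 2 filter. A Python sketch of that logic (the actual workflow does this in an n8n Code node; key names are assumed for illustration):

```python
import json
import re

def parse_skill_matrix(llm_output, min_proficiency=2):
    """Parse the model's JSON reply and keep skills above the threshold.

    Strips markdown code fences if present; returns [] on invalid JSON,
    mirroring the template's error handling for bad model outputs.
    """
    text = re.sub(r"^```(?:json)?|```$", "", llm_output.strip(), flags=re.M).strip()
    try:
        skills = json.loads(text)
    except json.JSONDecodeError:
        return []  # invalid output: skip this resume rather than crash
    return [s for s in skills if s.get("proficiency", 0) > min_proficiency]
```

Each surviving entry maps straight onto a Google Sheets row (skill, proficiency, estimated years).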
by Daniel Shashko
This workflow automates the creation of user-generated-content-style product videos by combining Gemini's image generation with OpenAI's SORA 2 video generation. It accepts webhook requests with product descriptions, generates images and videos, stores them in Google Drive, and logs all outputs to Google Sheets for easy tracking.

## Main Use Cases
- Automate product video creation for e-commerce catalogs and social media.
- Generate UGC-style content at scale without manual design work.
- Create engaging video content from simple text prompts for marketing campaigns.
- Build a centralized library of product videos with automated tracking and storage.

## How it works

The workflow operates as a webhook-triggered process, organized into these stages:

1. **Webhook Trigger & Input**
   - Accepts POST requests to the /create-ugc-video endpoint.
   - Required payload includes: product prompt, video prompt, Gemini API key, and OpenAI API key.
2. **Image Generation (Gemini)**
   - Sends the product prompt to Google's Gemini 2.5 Flash Image model.
   - Generates a product image based on the description provided.
3. **Data Extraction**
   - A Code node extracts the base64 image data from Gemini's response.
   - Preserves all prompts and API keys for subsequent steps.
4. **Video Generation (SORA 2)**
   - Sends the video prompt to OpenAI's SORA 2 API.
   - Initiates video generation with specifications: 720x1280 resolution, 8 seconds duration.
   - Returns a video generation job ID for polling.
5. **Video Status Polling**
   - Continuously checks video generation status via the OpenAI API.
   - If status is "completed": proceeds to download.
   - If status is still processing: waits 1 minute and retries (polling loop).
6. **Video Download & Storage**
   - Downloads the completed video file from OpenAI.
   - Uploads the MP4 file to Google Drive (root folder).
   - Generates a shareable Google Drive link.
7. **Logging to Google Sheets**
   - Records all generation details in a tracking spreadsheet: product description, video URL (Google Drive link), generation status, and timestamp.

## Summary Flow

Webhook Request → Generate Product Image (Gemini) → Extract Image Data → Generate Video (SORA 2) → Poll Status
- If complete: Download Video → Upload to Google Drive → Log to Google Sheets → Return Response
- If not complete: Wait 1 Minute → Poll Status Again

## Benefits
- Fully automated video creation pipeline from text to finished product.
- Scalable solution for generating multiple product videos on demand.
- Combines cutting-edge AI models (Gemini + SORA 2) for high-quality output.
- Centralized storage in Google Drive with automatic logging in Google Sheets.
- Flexible webhook interface allows integration with any application or service.
- Retry mechanism ensures videos are captured even with longer processing times.

Created by Daniel Shashko
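The Poll Status / Wait cycle is a standard job-polling loop. A generic Python sketch of the same control flow, where `check_status` stands in for the OpenAI status request (this is an illustration of the pattern, not the workflow's node code):

```python
import time

def poll_until_complete(check_status, interval_s=60, max_attempts=15):
    """Poll a job until it completes, mirroring the Wait/IF polling loop.

    check_status() returns a dict like {"status": "completed", ...};
    any status other than "completed" or "failed" means still processing.
    """
    for _ in range(max_attempts):
        job = check_status()
        if job["status"] == "completed":
            return job
        if job["status"] == "failed":
            raise RuntimeError(f"video generation failed: {job}")
        time.sleep(interval_s)  # the n8n Wait node uses 1 minute
    raise TimeoutError("video not ready after max_attempts polls")
```

Capping the attempts (here 15, i.e. roughly 15 minutes) avoids an execution that loops forever if a job stalls.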
by Amine ARAGRAG
This n8n template automates the collection and enrichment of Product Hunt posts using AI and Google Sheets. It fetches new tools daily, translates content, categorizes them intelligently, and saves everything into a structured spreadsheet—ideal for building directories, research dashboards, newsletters, or competitive intelligence assets.

## Good to know
- Sticky notes inside the workflow explain each functional block and required configurations.
- Uses cursor-based pagination to safely fetch Product Hunt data.
- The AI agent handles translation, documentation generation, tech extraction, and function-area classification.
- Category translations are synced with a Google Sheets dictionary to avoid duplicates.
- All enriched entries are stored in a clean "Tools" sheet for easy filtering or reporting.

## How it works
1. A schedule trigger starts the workflow daily.
2. Product Hunt posts are retrieved via GraphQL and processed in batches.
3. A code node restructures each product into a consistent schema.
4. The workflow checks if a product already exists in Google Sheets.
5. For new items, the AI agent generates metadata, translations, and documentation.
6. Categories are matched or added to a Google Sheets dictionary.
7. The final enriched product entry is appended or updated in the spreadsheet.
8. Pagination continues until no next page remains.

## How to use
- Connect Product Hunt OAuth2, Google Sheets, and OpenAI credentials.
- Adjust the schedule trigger to your preferred frequency.
- Optionally expand the enrichment fields (tags, scoring, custom classifications).
- Replace the trigger with a webhook or manual trigger if needed.

## Requirements
- Product Hunt OAuth2 credentials
- Google Sheets account
- OpenAI (or compatible) API access

## Customising this workflow
- Add Slack or Discord notifications for new tools.
- Push enriched data to Airtable, Notion, or a database.
- Extend AI enrichment with summaries or SEO fields.
- Use the Google Sheet as a backend for dashboards or frontend applications.
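The cursor-based pagination loop follows the usual GraphQL pattern: request a page, collect its nodes, and pass the end cursor back until no next page remains. A Python sketch of that loop, assuming a Relay-style `pageInfo` shape (verify the exact field names against Product Hunt's GraphQL schema):

```python
def fetch_all_posts(fetch_page):
    """Collect every post via cursor pagination.

    fetch_page(cursor) performs one GraphQL request and returns a page:
    {"nodes": [...], "pageInfo": {"hasNextPage": bool, "endCursor": str}}
    """
    posts, cursor = [], None  # None cursor = first page
    while True:
        page = fetch_page(cursor)
        posts.extend(page["nodes"])
        info = page["pageInfo"]
        if not info["hasNextPage"]:
            return posts
        cursor = info["endCursor"]  # resume after the last item seen
```

In the workflow this loop is unrolled across nodes: an IF node checks `hasNextPage` and feeds `endCursor` back into the GraphQL request.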
by Oneclick AI Squad
Automatically creates complete videos from a text prompt—script, voiceover, stock footage, and subtitles all assembled and ready.

## How it works

Send a video topic via webhook (e.g., "Create a 60-second video about morning exercise"). The workflow uses OpenAI to generate a structured script with scenes, converts text to natural-sounding speech, searches Pexels for matching B-roll footage, and downloads everything. Finally, it merges audio with video, generates SRT subtitles, and prepares all components for final assembly.

The workflow handles parallel processing—while generating the voiceover, it simultaneously searches and downloads stock footage to save time.

## Setup steps
1. Add OpenAI credentials for script generation and text-to-speech
2. Get a free Pexels API key from pexels.com/api for stock footage access
3. Connect Google Drive for storing the final video output
4. Install FFmpeg (optional) for automated video assembly, or manually combine the components
5. Test the webhook by sending a POST request with your video topic

## Input format

```json
{
  "prompt": "Your video topic here",
  "duration": 60,
  "style": "motivational"
}
```

## What you get
- ✅ AI-generated script broken into scenes
- ✅ Professional voiceover audio (MP3)
- ✅ Downloaded stock footage clips (MP4)
- ✅ Timed subtitles file (SRT)
- ✅ All components ready for final editing

**Note:** The final video assembly requires FFmpeg or a video editor. All components are prepared and organized by scene number for easy manual editing if needed.
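SRT generation from the scene timings is mostly a formatting exercise: each subtitle block is an index, a `HH:MM:SS,mmm --> HH:MM:SS,mmm` range, and the text. A Python sketch of that step (the scene dict keys are assumptions, matched to a per-scene start/end in seconds):

```python
def to_timestamp(seconds):
    """Format seconds as an SRT timestamp, e.g. 4.5 -> 00:00:04,500."""
    ms = int(round(seconds * 1000))
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def scenes_to_srt(scenes):
    """scenes: list of {"start": float, "end": float, "text": str}."""
    blocks = []
    for i, sc in enumerate(scenes, start=1):
        blocks.append(
            f"{i}\n{to_timestamp(sc['start'])} --> {to_timestamp(sc['end'])}\n{sc['text']}\n"
        )
    return "\n".join(blocks)
```

The resulting file can be burned in or muxed with FFmpeg during final assembly.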
by Yasir
## 🧠 Workflow Overview — AI-Powered Jobs Scraper & Relevancy Evaluator

This workflow automates the process of finding highly relevant job listings based on a user's resume, career preferences, and custom filters. It scrapes fresh job data, evaluates relevance using OpenAI GPT models, and automatically appends the results to your Google Sheet tracker — while skipping any jobs already in your sheet, so you don't have to worry about duplicates.

Perfect for recruiters, job seekers, or virtual assistants who want to automate job research and filtering.

## ⚙️ What the Workflow Does
1. Takes user input through a form — including resume, preferences, target score, and Google Sheet link.
2. Fetches job listings via an Apify LinkedIn Jobs API actor.
3. Filters and deduplicates results (removes duplicates and blacklisted companies).
4. Evaluates job relevancy using GPT-4o-mini, scoring each job (0-100) against the user's resume & preferences.
5. Applies a relevancy threshold to keep only top-matching jobs.
6. Checks your Google Sheet for existing jobs and prevents duplicates.
7. Appends new, relevant jobs directly into your provided Google Sheet.

## 📋 What You'll Get
- A personal Job Scraper Form (public URL you can share or embed).
- Automatic job collection & filtering based on your inputs.
- **Relevance scoring** (0-100) for each job using your resume and preferences.
- A real-time job-tracking Google Sheet that includes:
  - Job Title
  - Company Name & Profile
  - Job URLs
  - Location, Salary, HR Contact (if available)
  - Relevancy Score

## 🪄 Setup Instructions

### 1. Required Accounts
You'll need:
- ✅ n8n account (self-hosted or Cloud)
- ✅ Google account (for Sheets integration)
- ✅ OpenAI account (for GPT API access)
- ✅ Apify account (to fetch job data)

### 2. Connect Credentials
In your n8n instance, go to Credentials → Add New:
- **Google Sheets OAuth2 API** – Connect your Google account.
- **OpenAI API** – Add your OpenAI API key.
- **Apify API** – Replace `<your_apify_api>` with your Apify API key.
**Set up the Apify API:**
1. Get your Apify API key: visit https://console.apify.com/settings/integrations and copy your API key.
2. Rent the required Apify actor before running this workflow: go to https://console.apify.com/actors/BHzefUZlZRKWxkTck/input and click "Rent Actor". Once rented, it can be used by your Apify account to fetch job listings.

### 3. Set Up Your Google Sheet
1. Make a copy of this template: 📄 Google Sheet Template
2. Enable edit access for anyone with the link.
3. Copy your sheet's URL — you'll provide this when submitting the workflow form.

### 4. Deploy & Run
1. Import this workflow (jobs_scraper.json) into your n8n workspace.
2. Activate the workflow.
3. Visit your form trigger endpoint (e.g. https://your-n8n-domain/webhook/jobs-scraper).
4. Fill out the form with:
   - Job title(s)
   - Location
   - Contract type, experience level, working mode, date posted
   - Target relevancy score
   - Google Sheet link
   - Resume text
   - Job preferences or ranking criteria
5. Submit — within minutes, new high-relevance job listings will appear in your Google Sheet automatically.

## 🧩 Example Use Cases
- Automate daily job scraping for clients or yourself.
- Filter jobs by AI-based relevance instead of keywords.
- Build a smart job board or job alert system.
- Support a career agency offering done-for-you job search services.

## 💡 Tips
- Adjust the "Target Relevancy Score" (e.g., 70-85) to control how strict the filtering is.
- You can add your own blacklisted companies in the Filter & Dedup Jobs node.
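The Filter & Dedup Jobs step combines three checks: skip URLs already in the sheet, skip blacklisted companies, and drop jobs below the target relevancy score. A Python sketch of that combined filter (field names like `url`, `company`, and `score` are assumptions standing in for the actor's output schema):

```python
def filter_jobs(jobs, existing_urls, blacklist, min_score):
    """Drop duplicates, blacklisted companies, and low-relevancy jobs."""
    seen = set(existing_urls)  # seeded from the Google Sheet's job URLs
    banned = {b.lower() for b in blacklist}
    kept = []
    for job in jobs:
        if job["url"] in seen:
            continue  # already tracked (or already kept in this batch)
        if job["company"].lower() in banned:
            continue
        if job["score"] < min_score:
            continue  # below the user's target relevancy score
        seen.add(job["url"])
        kept.append(job)
    return kept
```

Only the surviving rows are appended to the sheet, which is what keeps the tracker duplicate-free across runs.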
by Lidia
## Who's it for

Teams who want to automatically generate structured meeting minutes from uploaded transcripts and instantly share them in Slack. Perfect for startups, project teams, or any company that collects meeting transcripts in Google Drive.

## How it works / What it does

This workflow automatically turns raw meeting transcripts into well-structured minutes in Markdown and posts them to Slack:

1. **Google Drive Trigger** – Watches a specific folder. Any new transcript file added will start the workflow.
2. **Download File** – Grabs the transcript.
3. **Prep Transcript** – Converts the file into plain text and passes the transcript downstream.
4. **Message a Model** – Sends the transcript to OpenAI GPT for summarization using a structured system prompt (action items, decisions, N/A placeholders).
5. **Make Minutes** – Formats GPT's response into a Markdown file.
6. **Slack: Send a message** – Posts a Slack message announcing the auto-generated minutes.
7. **Slack: Upload a file** – Uploads the full Markdown minutes file into the chosen Slack channel.

End result: your Slack channel always has clear, standardized minutes right after a meeting.

## How to set up

1. **Google Drive**
   - Create a folder where you'll drop transcript files.
   - Configure the folder ID in the Google Drive Trigger node.
2. **OpenAI**
   - Add your OpenAI API credentials in the Message a Model node.
   - Select a supported GPT model (e.g., gpt-4o-mini or gpt-4).
3. **Slack**
   - Connect your Slack account and set the target channel ID in the Slack nodes.
4. Run the workflow and drop a transcript file into Drive. Minutes will appear in Slack automatically.

## Requirements
- Google Drive account (for transcript upload)
- OpenAI API key (for text summarization)
- Slack workspace (for message posting and file upload)

## How to customize the workflow
- **Change summary structure**: Adjust the system prompt inside Message a Model (e.g., shorter summaries, a language other than English).
- **Different output format**: Modify Make Minutes to output plain text, PDF, or HTML instead of Markdown.
New destinations**: Add more nodes to send minutes to email, Notion, or Confluence in parallel. Multiple triggers**: Replace Google Drive trigger with Webhook if you want to integrate with Zoom or MS Teams transcript exports. Good to know OpenAI API calls are billed separately. See OpenAI pricing. Files must be text-based (.txt or .md). For PDFs or docs, add a conversion step before summarization. Slack requires the bot user to be a member of the target channel, otherwise you’ll see a not_in_channel error.
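The Make Minutes step above can be sketched as a small formatting function. This is a hedged illustration, assuming the model is prompted to return a structured summary with `title`, `summary`, `decisions`, and `actionItems` fields; the actual node's prompt and response schema may differ:

```javascript
// Sketch of a "Make Minutes" style Code node: turn a structured summary
// into Markdown, filling empty sections with "N/A" placeholders.
// The input shape (title, summary, decisions, actionItems) is an assumption.
function makeMinutes({ title, summary, decisions, actionItems }) {
  const list = (items) =>
    items && items.length ? items.map((i) => `- ${i}`).join("\n") : "- N/A";
  return [
    `# Minutes: ${title || "Untitled meeting"}`,
    "",
    "## Summary",
    summary || "N/A",
    "",
    "## Decisions",
    list(decisions),
    "",
    "## Action Items",
    list(actionItems),
  ].join("\n");
}

const md = makeMinutes({
  title: "Sprint planning",
  summary: "Planned sprint 12 scope.",
  decisions: ["Ship beta Friday"],
  actionItems: [], // empty section is rendered as "- N/A"
});
```

Changing this function (or its real-node equivalent) is where you would switch the output to plain text or HTML instead of Markdown.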
by Rapiwa
Who is this for?
This workflow is designed for online store owners, customer-success teams, and marketing operators who want to automatically verify customers' WhatsApp numbers and deliver order updates or invoice links via WhatsApp. It is built around the WooCommerce Trigger (order.updated) but is easily adaptable to Shopify or other platforms that provide billing and line_items in the trigger payload.

What this Workflow Does / Key Features
- Listens for WooCommerce order events (example: order.updated) via a Webhook or a WooCommerce trigger.
- Filters only orders with status "completed" and maps the payload into a normalized object: { data: { customer, products, invoice_link } } using the Order Completed check Code node.
- Iterates over line items using SplitInBatches to control throughput.
- Cleans phone numbers (Clean WhatsApp Number Code node) by removing all non-digit characters.
- Verifies whether the cleaned phone number is registered on WhatsApp using Rapiwa's verify endpoint (POST https://app.rapiwa.com/api/verify-whatsapp).
- If verified, sends a templated WhatsApp message via Rapiwa (POST https://app.rapiwa.com/api/send-message).
- Appends an audit row to a "Verified & Sent" Google Sheet for successful sends, or to an "Unverified & Not Sent" sheet for unverified numbers.
- Uses Wait and batching to throttle requests and avoid API rate limits.

Requirements
- HTTP Bearer credential for Rapiwa (example name in flow: Rapiwa Bearer Auth).
- WooCommerce API credential for the trigger (example: WooCommerce (get customer)).
- Running n8n instance with these nodes: WooCommerce Trigger, Code, SplitInBatches, HTTP Request, IF, Google Sheets, Wait.
- Rapiwa account and a valid Bearer token.
- Google account with Sheets access and OAuth2 credentials configured in n8n.
- WooCommerce store (or any trigger source) that provides billing and line_items in the payload.
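The number-cleaning step described above is a one-line normalization. A minimal sketch of what the Clean WhatsApp Number Code node does (the helper name is illustrative; the non-digit stripping matches the description above):

```javascript
// Sketch of the "Clean WhatsApp Number" step: strip every non-digit
// character so the number is ready for Rapiwa's verify endpoint.
function cleanWhatsappNumber(raw) {
  return String(raw).replace(/\D/g, "");
}

// A typical WooCommerce billing phone value, with spaces, "+" and dashes:
const cleaned = cleanWhatsappNumber("+880 1322-827799");
```

Note that this keeps the country code digits (880...) but drops the leading `+`, matching the number format shown in the sample sheet rows below.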
How to Use — step-by-step Setup

1) Credentials
- Rapiwa: Create an HTTP Bearer credential in n8n and paste your token (flow example name: Rapiwa Bearer Auth).
- Google Sheets: Add an OAuth2 credential (flow example name: Google Sheets).
- WooCommerce: Add the WooCommerce API credential or configure a Webhook on your store.

3) Configure Google Sheets
- The exported flow uses spreadsheet ID 1S3RtGt5xxxxxxxXmQi_s (sheet gid=0) as an example. Replace it with your spreadsheet ID and sheet gid.
- Ensure your sheet column headers exactly match the mapping keys listed below (case and trailing spaces must match, or be corrected in the mapping).

5) Verify HTTP Request nodes
- Verify endpoint: POST https://app.rapiwa.com/api/verify-whatsapp — sends { number } (uses the HTTP Bearer credential).
- Send endpoint: POST https://app.rapiwa.com/api/send-message — sends number, message_type=text, and a templated message that uses fields from the Clean WhatsApp Number output.

Google Sheet Column Structure
The Google Sheets nodes in the flow append rows with these column keys.
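The two Rapiwa HTTP Request nodes send simple JSON bodies. A hedged sketch of the payloads (endpoints and fields are taken from the description above; the header shape is the standard Bearer pattern, and the token is a placeholder, not a real value):

```javascript
// Sketch of the payloads the two Rapiwa HTTP Request nodes send.
// The token placeholder stands in for an n8n HTTP Bearer credential;
// never hard-code a real token.
function buildVerifyRequest(number) {
  return {
    method: "POST",
    url: "https://app.rapiwa.com/api/verify-whatsapp",
    headers: { Authorization: "Bearer <RAPIWA_TOKEN>" },
    body: { number },
  };
}

function buildSendRequest(number, message) {
  return {
    method: "POST",
    url: "https://app.rapiwa.com/api/send-message",
    headers: { Authorization: "Bearer <RAPIWA_TOKEN>" },
    body: { number, message_type: "text", message },
  };
}

const verify = buildVerifyRequest("8801322827799");
const send = buildSendRequest("8801322827799", "Your order is completed.");
```

In the actual flow these bodies are assembled by the HTTP Request nodes themselves, with the number and message fields filled from the Clean WhatsApp Number output.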
Make sure the spreadsheet headers match. A Google Sheet formatted like this ➤ sample

| Name | Number | Email | Address | Product Title | Product ID | Total Price | Invoice Link | Delivery Status | Validity | Status |
|------|--------|-------|---------|---------------|------------|-------------|--------------|-----------------|----------|--------|
| Abdul Mannan | 8801322827799 | contact@spagreen.net | mirpur dohs | Air force 1 Fossil 1:1 - 44 | 238 | BDT 5500.00 | Invoice link | completed | verified | sent |
| Abdul Mannan | 8801322827799 | contact@spagreen.net | mirpur dohs h#1168 rd#10 av#10 mirpur dohs dhaka | Air force 1 Fossil 1:1 - 44 | 238 | BDT 5500.00 | Invoice link | completed | unverified | not sent |

Important Notes
- Do not hard-code API keys or tokens; always use n8n credentials.
- Google Sheets column header names must match the mapping keys used in the nodes. Trailing spaces are a common accidental problem — trim them in the spreadsheet or adjust the mapping.
- The IF node in the exported flow compares against the string "true". If the verify endpoint returns boolean true/false, convert it to a string or boolean consistently before the IF.
- Message templates in the flow reference $('Clean WhatsApp Number').item.json.data.products[0] — update the templates if you need multiple-product support.

Useful Links
- **Dashboard:** https://app.rapiwa.com
- **Official Website:** https://rapiwa.com
- **Documentation:** https://docs.rapiwa.com

Support & Help
- **WhatsApp**: Chat on WhatsApp
- **Discord**: SpaGreen Community
- **Facebook Group**: SpaGreen Support
- **Website**: https://spagreen.net
- **Developer Portfolio**: Codecanyon SpaGreen
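The IF-node caveat above can be handled with a small normalization before the comparison. A sketch, assuming the verify response exposes a `verified`-style field (the exact response key is not documented here, so the name is an assumption):

```javascript
// Sketch: normalize a verification result to the string "true"/"false"
// so the exported flow's IF node (which compares to the string "true")
// behaves the same whether the API returns a boolean or a string.
// The response key name ("verified") is an illustrative assumption.
function normalizeVerified(response) {
  const v = response.verified;
  return String(v === true || v === "true");
}

const fromBoolean = normalizeVerified({ verified: true });    // boolean from API
const fromString  = normalizeVerified({ verified: "true" });  // string from API
const negative    = normalizeVerified({ verified: false });
```

Dropping this into a Code node between the verify HTTP Request and the IF node keeps the string comparison in the exported flow working regardless of the API's return type.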