by Đỗ Thành Nguyên
**Automated Facebook Page Story Video Publisher (Google Drive → Facebook → Google Sheet)**

This workflow is an automated solution for publishing video content from Google Drive to your Facebook Page Stories, while using Google Sheets as a posting queue manager.

**What This Workflow Does**

This automation orchestrates a complete multi-step process for uploading and publishing videos to Facebook Stories:

- **Queue Management:** Every 2 hours and 30 minutes, the workflow checks a Google Sheet (Get Row Sheet node) to find the first video whose Stories column is empty — meaning it hasn't been posted yet.
- **Conditional Execution:** An If node confirms that the video's File ID exists before proceeding.
- **Video Retrieval:** Using the File ID, the workflow downloads the video from Google Drive (Google Drive node) and calculates its binary size (Set to the total size in bytes node).
- **Facebook 3-Step Upload:** It performs the Facebook Graph API's three-step upload process through HTTP Request nodes (see the sketch at the end of this listing):
  - Step 1 – Initialize Session: Starts an upload session and retrieves the upload_url and video_id.
  - Step 2 – Upload File: Uploads the binary video data to the provided upload_url.
  - Step 3 – Publish Video: Finalizes and publishes the uploaded video as a Facebook Story.
- **Status Update:** Once completed, the workflow updates the same row in Google Sheets (Update upload status in sheet node) using the row_number to mark the video as processed.

**Prerequisites (What You Need Before Running)**

1. **n8n Instance**
   > Recommended: Self-hosted via tino.vn/vps-n8n?affid=388 — use code VPSN8N for up to 39% off.
2. **Google Services**
   - **Google Drive Credentials:** OAuth2 credentials for Google Drive to let n8n download video files.
   - **Google Sheets Credentials:** OAuth2 credentials for Google Sheets to read the posting queue and update statuses.
   - **Google Sheet:** A spreadsheet (ID: 1RnE5O06l7W6TLCLKkwEH5Oyl-EZ3OE-Uc3OWFbDohYI) containing:
     - File ID — the video's unique ID in Google Drive.
     - Stories — posting status column (leave empty for pending videos).
     - row_number — used for updating the correct row after posting.
3. **Facebook Setup**
   - **Page ID:** Your Facebook Page ID (currently hardcoded as 115432036514099 in the info node).
   - **Access Token:** A Page Access Token with permissions such as pages_manage_posts and pages_read_engagement. This token is hardcoded in the info node and again in the *Step 3. Post video* node.

**Usage Guide and Implementation Notes**

**How to Use**

1. **Queue Videos:** Add video entries to your Google Sheet. Each entry must include a valid Google Drive File ID. Leave the Stories column empty for videos that haven't been posted.
2. **Activate:** Save and activate the workflow. The Schedule Trigger will automatically handle new uploads every 2 hours and 30 minutes.

**Implementation Notes**

- ⚠️ **Token Security:** Hardcoding your Access Token inside the info node is **not recommended**. Tokens expire and expose your Page to risk if leaked. 👉 Action: Replace the static token with a secure credential setup that supports token rotation.
- **Loop Efficiency:** The **"false"** output of the If node currently loops back to the Get Row Sheet node, which creates unnecessary cycles if no videos are found. 👉 Action: Disconnect that branch so the workflow stops gracefully when no unposted videos remain.
- **Status Updates:** To prevent re-posting the same video, the final Update upload status in sheet node must update the **Stories** column (e.g., write "POSTED"). 👉 Action: Add this mapping explicitly to your Google Sheets node.
- **Automated File ID Sync:** This workflow assumes that the Google Sheet already contains valid File IDs. 👉 You can build a secondary workflow (using Schedule Trigger1 → Search files and folders → Append or update row in sheet) to automatically populate new video File IDs from your Google Drive.

**✅ Result**

Once active, this workflow automatically pulls pending videos from your Google Sheet, uploads them to Facebook Stories, and marks them as posted — all without manual intervention.
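For reference, here is a minimal TypeScript sketch of the three Graph API calls behind the HTTP Request nodes. It assumes the current video_stories endpoint shape; the page ID, token, and API version are placeholders, so verify them against Facebook's Video Stories documentation before relying on this.

```typescript
// Hypothetical standalone sketch of the three HTTP Request node calls.
// PAGE_ID, TOKEN, and the API version are placeholders.
const GRAPH = "https://graph.facebook.com/v19.0";
const PAGE_ID = "YOUR_PAGE_ID";
const TOKEN = "YOUR_PAGE_ACCESS_TOKEN";

async function publishStory(video: Buffer): Promise<string> {
  // Step 1 - Initialize Session: returns upload_url and video_id.
  const start = await fetch(`${GRAPH}/${PAGE_ID}/video_stories`, {
    method: "POST",
    body: new URLSearchParams({ upload_phase: "start", access_token: TOKEN }),
  }).then((r) => r.json());

  // Step 2 - Upload File: send the raw bytes to the session's upload_url.
  await fetch(start.upload_url, {
    method: "POST",
    headers: {
      Authorization: `OAuth ${TOKEN}`,
      offset: "0",
      file_size: String(video.byteLength), // the size the workflow computes
    },
    body: video,
  });

  // Step 3 - Publish Video: finalize the session as a Page Story.
  await fetch(`${GRAPH}/${PAGE_ID}/video_stories`, {
    method: "POST",
    body: new URLSearchParams({
      upload_phase: "finish",
      video_id: start.video_id,
      access_token: TOKEN,
    }),
  });

  return start.video_id;
}
```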
by Asfandyar Malik
**Who's it for**

For HR professionals, recruiters, and hiring managers who want to automate the initial CV screening and candidate evaluation process. This workflow helps teams efficiently assess applicants based on submitted answers and resume data — saving hours of manual review and ensuring fair, consistent scoring.

**How it works**

This workflow automates CV screening using Google Drive, Google Sheets, and Gemini AI. When a candidate submits a form with their answers and CV, the file is uploaded to Drive, converted from PDF to plain text, and merged with the form data. Gemini AI then analyzes both inputs, comparing skills, experience, and responses to generate consistency, job-fit, and final scores (see the example output shape after this listing). Finally, the results are parsed, saved to Google Sheets, and automatically sorted by score, providing a ranked list of candidates for easy review.

**How to set up**

1. Connect your Google Drive and Google Sheets credentials in n8n.
2. Configure your Form Trigger to capture candidate answers and CV uploads.
3. Set up the Extract from File node to parse PDF files into text.
4. Add your Gemini AI credentials securely using n8n's credential system (no hardcoded keys).
5. Execute the workflow once to verify that CVs are uploaded, analyzed, and ranked in the connected Google Sheet.

**Requirements**

- n8n account (cloud or self-hosted).
- Google Drive and Google Sheets integrations.
- Gemini AI (Chat Model) API credentials.
- A connected form (e.g., Typeform, n8n Form Trigger).

**How to customize**

- Modify the AI prompt to align with your company's job criteria or evaluation style.
- Add more scoring categories (e.g., education, technical skills, experience).
- Change the output destination — send results to Airtable, Notion, or Slack.
- Enhance it with dashboards or extra nodes for reporting and analytics.

**⚠️ Disclaimer**

This workflow uses Gemini AI, which may require self-hosting for community node compatibility. Ensure that no personal or sensitive candidate data is shared externally when using AI services.
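To keep Gemini's scoring parseable, it helps to pin the prompt to a fixed JSON shape. A hypothetical sketch in TypeScript; the field names and score ranges are illustrative, not part of this template:

```typescript
// Hypothetical shape for the parsed Gemini output written to Google Sheets.
// Field names and ranges are illustrative; match them to your prompt.
interface CandidateScore {
  candidateName: string;
  consistencyScore: number; // 0-100: do the form answers match the CV?
  jobFitScore: number;      // 0-100: skills and experience vs. the job criteria
  finalScore: number;       // combined score used for ranking in the sheet
  summary: string;          // short rationale for human reviewers
}

const example: CandidateScore = {
  candidateName: "Jane Doe",
  consistencyScore: 88,
  jobFitScore: 74,
  finalScore: 81,
  summary: "Strong backend experience; form answers consistent with the CV.",
};
```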
by Anir Agram
**📸🍽️ Telegram Food Photo → 🤖 Gemini Vision AI → 📊 Nutrition Data → 📄 Google Sheets + 🗂️ Drive**

**What this workflow does**

- 📸 Snap and send a photo of your meal via Telegram
- 🧠 Gemini Vision AI analyzes the image and estimates calories, protein, carbs, and fats
- 🤖 AI Agent structures the data with meal name, description, and timestamp (see the sample output after this listing)
- 📄 Auto-logs nutrition data to Google Sheets for tracking
- 🗂️ Saves original meal photos to Google Drive with timestamped filenames
- 💬 Sends instant Telegram reply with full nutrition breakdown

**Why it's useful**

- ⚡ Track nutrition in seconds — no manual entry or food databases
- 📊 Build a complete meal history with photos and macros in one place
- 🎯 AI estimates portion sizes and hidden ingredients (oils, sauces)
- 🏋️ Perfect for fitness tracking, meal prep, or health monitoring
- 📱 Works entirely through Telegram — no extra apps needed

**How it works**

1. 📲 Telegram Trigger → receives meal photo
2. 🗂️ Google Drive → saves image with timestamp
3. 🔎 Gemini Vision → analyzes food, estimates portions and macros
4. 🤖 AI Agent → structures output (meal name, calories, protein, carbs, fats)
5. 📄 Google Sheets → appends row with all nutrition data
6. 💬 Telegram Reply → confirms with full breakdown

**What you'll need**

- 🤖 Telegram Bot token
- 🧠 Google Gemini API key (includes Vision capabilities)
- 🔐 Google OAuth for Sheets + Drive
- 📊 Google Sheet with columns: Meal_Name, Date, Meal_description, Calories, Proteins, Carbs, Fats

**Setup steps**

1. 🔗 Connect credentials: Telegram, Google Gemini, Google Sheets, Google Drive
2. 📄 Create a Google Sheet with the nutrition columns (see format above)
3. 🗂️ Create a Google Drive folder for meal photos
4. 🧭 Update the sheet ID and Drive folder ID in the workflow
5. 🧪 Test: send a meal photo via Telegram and check the Sheet + Drive

**Customization ideas**

- 📈 Daily summary: add a scheduled workflow to calculate daily totals
- 🎯 Goal tracking: set IF conditions to alert when over/under calorie targets
- 📊 Charts: connect to Data Studio/Looker for visual progress tracking
- 🏃 Fitness integration: sync with MyFitnessPal or other fitness apps

**Who it's for**

- 🏋️ Fitness enthusiasts tracking macros without manual logging
- 🥗 Meal preppers analyzing portion sizes and nutrition
- 💪 Athletes monitoring calorie and protein intake
- 🩺 Health-conscious individuals building meal history
- 👨‍🍳 Nutritionists collecting client food data

**Quick Setup Guide — Before You Start**

What you need:

- 🔗 Telegram Bot (create via @BotFather)
- 🧠 Google Gemini API key with Vision enabled (get it here)
- 🔐 Google account for Sheets and Drive access
- 📊 Basic spreadsheet to track your meals

Want help customizing? 📧 anirpoke@gmail.com 🔗 LinkedIn
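For illustration, a row produced by the AI Agent might look like the following TypeScript object, keyed to the sheet columns above. All values are made-up examples; your prompt and column mapping determine the real shape.

```typescript
// Hypothetical row produced by the AI Agent, keyed to the sheet columns.
// Values are illustrative, not real analysis output.
const mealRow = {
  Meal_Name: "Grilled chicken bowl",
  Date: "2024-05-12T12:41:00Z", // same timestamp used in the Drive filename
  Meal_description: "Chicken breast, rice, avocado, olive oil drizzle",
  Calories: 620,
  Proteins: 48, // grams
  Carbs: 55,    // grams
  Fats: 22,     // grams
};
```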
by Babish Shrestha
**🚀 Build Your Own Knowledge Chatbot Using Google Drive**

Create a smart chatbot that answers questions using your Google Drive PDFs — perfect for support, internal docs, education, or research.

**🛠️ Quick Setup Guide**

**Step 1: Prerequisites**

- n8n instance (cloud or self-hosted)
- Google Drive account (with PDFs)
- Supabase account (vector database)
- OpenAI API key
- PostgreSQL database (for chat memory); otherwise remove that node

**Step 2: Supabase Setup**

1. Create a Supabase account (it's free).
2. Create a project.
3. Copy the SQL below and paste it into the Supabase SQL editor (a verification snippet follows this listing):

```sql
-- Enable the pgvector extension to work with embedding vectors
create extension vector;

-- Create a table to store your documents
create table documents (
  id bigserial primary key,
  content text, -- corresponds to Document.pageContent
  metadata jsonb, -- corresponds to Document.metadata
  embedding vector(1536) -- 1536 works for OpenAI embeddings, change if needed
);

-- Create a function to search for documents
create function match_documents (
  query_embedding vector(1536),
  match_count int default null,
  filter jsonb DEFAULT '{}'
) returns table (
  id bigint,
  content text,
  metadata jsonb,
  similarity float
)
language plpgsql
as $$
#variable_conflict use_column
begin
  return query
  select
    id,
    content,
    metadata,
    1 - (documents.embedding <=> query_embedding) as similarity
  from documents
  where metadata @> filter
  order by documents.embedding <=> query_embedding
  limit match_count;
end;
$$;
```

**Step 3: Import & Configure n8n Workflow**

1. Import this template into n8n.
2. Add credentials: OpenAI API key, Google Drive OAuth2, Supabase URL & service key, PostgreSQL connection.
3. Set your Google Drive folder ID in the triggers.

**Step 4: Test & Use**

- Add a PDF to your Drive folder → check Supabase for new entries.
- Start the workflow and chat → ask questions about your documents, e.g., "What can you help me with?"
- Multi-turn chat → context is maintained per user.

**⚡ Features**

- Auto-syncs new/updated PDFs from Google Drive
- Extracts, chunks, and vectorizes text
- Finds relevant info and answers questions
- Maintains chat history per user

**📝 Troubleshooting**

- Check folder permissions & IDs if no docs are found
- Verify API keys & Supabase setup for errors
- Ensure PostgreSQL is connected for chat memory

Tags: RAG, Chatbot, Google Drive, Supabase, OpenAI, n8n
Setup Time: ~20 minutes
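Once the SQL above has run, you can sanity-check retrieval outside n8n by calling the function directly. A minimal supabase-js sketch, assuming you already have a 1536-dimension query embedding (e.g., from OpenAI); the project URL and key are placeholders:

```typescript
// Quick check that match_documents works, using supabase-js.
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
  "https://YOUR-PROJECT.supabase.co", // placeholder project URL
  "YOUR_SERVICE_ROLE_KEY"             // placeholder service key
);

async function search(queryEmbedding: number[]) {
  const { data, error } = await supabase.rpc("match_documents", {
    query_embedding: queryEmbedding, // must match vector(1536)
    match_count: 5,                  // top-5 most similar chunks
    filter: {},                      // optional JSONB metadata filter
  });
  if (error) throw error;
  return data; // [{ id, content, metadata, similarity }, ...]
}
```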
by Yasser Sami
**Outlook & Notion Knowledge Base Builder for AI Agents**

This n8n template lets you automatically build and maintain an AI-ready knowledge base from Outlook emails and Notion pages. It stores both sources in a Pinecone vector database so your AI agent can reference them when generating answers, extracting information, or drafting context-aware emails.

**Who's it for**

- Teams using Outlook for client or customer communications.
- Businesses maintaining documentation or notes in Notion.
- Developers or AI builders who want to create custom knowledge bases for intelligent agents.

**How it works / What it does**

- Outlook Integration: When you move an email or thread to a specific folder (e.g., "knowledgebase"), the workflow triggers automatically. It captures the entire conversation, removes duplicates, and stores it in Pinecone under a namespace called "emails". The AI agent later uses these stored emails to learn tone, structure, and context for drafting responses.
- Notion Integration: When a page is added to a designated Notion database, the workflow triggers. The page content is extracted and embedded into Pinecone under the "knowledgebase" namespace. This allows your agent to use your Notion pages as factual reference material when answering questions.
- AI Agent Setup: Uses two Pinecone tools — one to reference Notion content (knowledge base) and another for emails (style & tone); a query sketch follows this listing. Equipped with Cohere embeddings for multilingual support and OpenAI for conversation or drafting tasks. When queried, the agent retrieves relevant knowledge from both namespaces to produce accurate, context-rich replies.

**How to set up**

1. Import this template into your n8n account.
2. Connect your Outlook account and specify which folder to watch (e.g., "knowledgebase").
3. Connect your Notion account and select your target database.
4. Connect your Pinecone, Cohere, and OpenAI credentials.
5. Activate the workflow — new Notion pages and Outlook threads will automatically update your knowledge base.

**Requirements**

- n8n account.
- Microsoft Outlook account.
- Notion account and database.
- Pinecone account for vector storage.
- Cohere API key for embeddings.
- OpenAI API key for the AI model.

**How to customize the workflow**

- **Namespaces:** Rename or expand namespaces (e.g., "sales_emails", "internal_docs") to organize knowledge types.
- **Embeddings Model:** Replace Cohere with OpenAI embeddings if preferred.
- **Agent Behavior:** Adjust the agent's system message to change its tone or purpose (e.g., "act as a sales assistant").
- **Extra Sources:** Extend this workflow to include PDFs, websites, or Slack messages.
- **Sync Schedule:** Modify trigger intervals to control how frequently updates are captured.

This workflow automatically builds a living, AI-powered knowledge base from your Outlook and Notion data — perfect for intelligent support, research, or content generation.
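As mentioned above, the agent's two retrieval tools map to two Pinecone namespaces. A hypothetical TypeScript sketch of the equivalent direct queries; the index name and topK values are illustrative:

```typescript
// Query the two namespaces separately: "knowledgebase" for facts,
// "emails" for tone and style examples. API key and index name are placeholders.
import { Pinecone } from "@pinecone-database/pinecone";

const pc = new Pinecone({ apiKey: "YOUR_PINECONE_API_KEY" });
const index = pc.index("your-index-name");

async function retrieve(queryVector: number[]) {
  const facts = await index.namespace("knowledgebase").query({
    vector: queryVector,
    topK: 4,
    includeMetadata: true,
  });
  const styleExamples = await index.namespace("emails").query({
    vector: queryVector,
    topK: 2,
    includeMetadata: true,
  });
  return { facts: facts.matches, styleExamples: styleExamples.matches };
}
```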
by tanaypant
This workflow is the second of three. You can find the other workflows here:

- Incident Response Workflow - Part 1
- Incident Response Workflow - Part 2
- Incident Response Workflow - Part 3

We have the following nodes in the workflow:

- Webhook node: This trigger node listens for the event fired when the Acknowledge button is clicked.
- PagerDuty node: This node changes the status of the incident report from 'Triggered' to 'Acknowledged' in PagerDuty (see the API sketch below).
- Mattermost node: This node publishes a message in the auxiliary channel saying that the status of the incident report has been changed to Acknowledged.
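For context, the status change the PagerDuty node performs corresponds to a single REST API call. A hedged sketch; the incident ID, token, and From email are placeholders, and the exact request the node builds may differ:

```typescript
// Sketch of the REST call behind the PagerDuty node.
async function acknowledgeIncident(incidentId: string) {
  const res = await fetch(`https://api.pagerduty.com/incidents/${incidentId}`, {
    method: "PUT",
    headers: {
      "Content-Type": "application/json",
      Accept: "application/vnd.pagerduty+json;version=2",
      Authorization: "Token token=YOUR_PD_API_TOKEN",
      From: "oncall@example.com", // PagerDuty requires a valid user email here
    },
    body: JSON.stringify({
      incident: { type: "incident_reference", status: "acknowledged" },
    }),
  });
  if (!res.ok) throw new Error(`PagerDuty returned ${res.status}`);
  return res.json();
}
```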
by Dataki
**What is this workflow?**

This n8n template automates the process of adding an AI-generated summary at the top of your WordPress posts. It retrieves, processes, and updates your posts dynamically, ensuring efficiency and flexibility without relying on a heavy WordPress plugin.

*Example of AI Summary Section*

**How It Works**

1. Triggers → Runs on a scheduled interval or via a webhook when a new post is published.
2. Retrieves posts → Fetches content from WordPress and converts HTML to Markdown for AI processing.
3. AI Summary Generation → Uses OpenAI to create a concise summary.
4. Post Update → Inserts the summary at the top of the post while keeping the original excerpt intact (a REST sketch follows this listing).
5. Data Logging & Notifications → Saves processed posts to Google Sheets and notifies a Slack channel.

**Why use this workflow?**

- ✅ No need for a WordPress plugin → Keeps your site lightweight.
- ✅ Highly flexible → Easily connect with Google Sheets, Slack, or other services.
- ✅ Customizable → Adapt AI prompts, formatting, and integrations to your needs.
- ✅ Smart filtering → Ensures posts are not reprocessed unnecessarily.

💡 Check the detailed sticky notes for setup instructions and customization options!
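The post-update step boils down to one WordPress REST call. A minimal sketch, assuming application-password authentication and that the original post HTML was fetched earlier in the workflow; the site URL, credentials, and the ai-summary wrapper class are all placeholders:

```typescript
// Prepend the AI summary to a post via the WordPress REST API.
async function prependSummary(postId: number, summaryHtml: string, originalHtml: string) {
  const site = "https://your-site.example";
  const auth = Buffer.from("wp_user:application_password").toString("base64");

  const res = await fetch(`${site}/wp-json/wp/v2/posts/${postId}`, {
    method: "POST", // the posts endpoint also accepts PUT/PATCH for updates
    headers: {
      "Content-Type": "application/json",
      Authorization: `Basic ${auth}`,
    },
    body: JSON.stringify({
      // Summary block first; the original post body stays untouched below it.
      content: `<div class="ai-summary">${summaryHtml}</div>\n${originalHtml}`,
    }),
  });
  if (!res.ok) throw new Error(`WordPress returned ${res.status}`);
  return res.json();
}
```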
by omid dev
**How It Works**

This n8n template automates the process of tracking design changes in Figma and updating the relevant Jira issues. The template is triggered when a new version is created in Figma via a custom plugin. Once the version is committed, the plugin sends the design details to an n8n workflow using a webhook. The workflow then performs the following actions:

1. Fetches the Jira issue based on the issue link provided from Figma.
2. Adds the design changes as a comment on the Jira issue.
3. Updates the status of the Jira issue based on the provided task status (e.g., "In Progress", "Done").

This streamlines the workflow, reducing the need for manual updates and ensuring that both the design team and developers have the latest design changes and task statuses in sync.

**How to Use It**

1. Set up the Figma plugin:
   - Install the Figma Commit Plugin from GitHub.
   - In the plugin, fill out the version name, design link, Jira issue link, and the task status.
   - Commit the changes in Figma, which will trigger the webhook (an example payload shape follows this listing).
2. Set up the n8n workflow:
   - Import this template into your n8n instance.
   - Connect the Figma Trigger node to capture version updates from Figma.
   - Configure the Jira nodes to retrieve the issue and update the status/comment based on the data sent from the plugin.
3. Automate: Once the version is committed in Figma, the workflow will automatically update the Jira issue and keep both your Figma design and Jira tasks in sync!

By integrating Figma, Jira, and n8n through this template, you'll eliminate manual steps, making collaboration between design and development teams more efficient.
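Since the plugin is custom, the webhook payload shape is up to you. A hypothetical example of what the n8n Webhook node might receive; every field name here is illustrative, so check the plugin source for the real schema:

```typescript
// Hypothetical webhook payload from the Figma Commit Plugin.
const examplePayload = {
  versionName: "v2.3 Checkout redesign",
  designLink: "https://www.figma.com/file/FILE_KEY/Checkout?node-id=12%3A34",
  jiraIssueLink: "https://yourteam.atlassian.net/browse/PROJ-123",
  taskStatus: "In Progress", // e.g., "In Progress" or "Done"
  committedAt: "2024-05-12T10:15:00Z",
};
```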
by Antonis Logothetis
**Multi-functional Discord Bot with Llama AI, Image Generation, and Knowledge Base Integration 🤖🎨🧠**

**Overview 🔍**

This workflow creates a Discord bot that can:

- Monitor Discord messages from specific users 👀
- Process different media types (images, audio, text) 🔎
- Analyze images using AI 🖼️
- Transcribe audio files 🎤
- Generate responses using Llama AI 🦙
- Create images from text prompts using Gemini AI 🎨

**Prerequisites ✅**

- n8n automation platform 💻
- API keys for Discord, Groq, Google/Gemini, and SerpAPI 🔑
- Ollama setup for the Llama language model 🧠

**Main Workflow Components 🛠️**

Message Monitoring System 📨

- Set up a Discord receiver to monitor messages in your server 💬
- Add a filter to only process messages from specific users 🔍
- Create a wait timer to control how often the bot checks for new messages ⏱️

Media Type Detection 🔄

- Create a system that detects what kind of content was shared (a routing sketch follows this listing):
  - Audio files (by checking for waveform data) 🎵
  - Images (by checking content type) 🖼️
  - Text (default if no media detected) 💬
- Add special detection for image creation commands 🎭

Image Processing 🖼️

- Fetch the image from Discord 📥
- Convert the image to a format the AI can understand 🔄
- Send the image to Groq for analysis 🔍
- Return the AI's description back to Discord 📤

Audio Processing 🎵

- Fetch the audio file from Discord 📥
- Send it to Groq's audio transcription service 🎤
- Process the transcribed text with the AI assistant 🧠
- Return the response to Discord 📤

Text Processing 💬

- Send the text to an AI agent powered by Llama 🦙
- Connect the agent to memory to maintain conversation context 🧠
- Add knowledge tools like Wikipedia and search capabilities 🔍
- Return the AI's response to Discord, with optional text-to-speech 🔊

Image Generation 🎨

- Process the user's image creation request ✏️
- Use an AI agent to refine the prompt for better results ✨
- Send the enhanced prompt to Gemini for image generation 🖌️
- Extract the generated image and post it to Discord 📤

**Connecting the Components 🔗**

- Set up routing between components based on content type 🔀
- Ensure all processes loop back to the message monitoring system ♻️
- Add wait timers between operations to avoid rate limits ⏱️

**Testing Tips 🐛**

- Test each type of content separately 🧪
- Verify API connections and authentication 🔐
- Check that responses are appropriate and timely ⏰

**Optimization Suggestions ⚡**

- Adjust wait times based on your usage patterns ⏱️
- Add more specific filters for message detection 🔍
- Consider implementing caching for frequent requests 💾
- Monitor performance and adjust as needed 📈

This Discord bot combines multiple AI services into a seamless experience, allowing users to interact with various AI capabilities through simple Discord messages. The modular design makes it easy to expand or modify specific features as needed! 🚀
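A sketch of the routing logic described above, as it might appear in an n8n Code node. The waveform and content_type fields follow Discord's attachment schema for voice messages and images; the !imagine command prefix is an assumption, not part of the template:

```typescript
// Route an incoming Discord message to the right processing branch.
type MediaKind = "audio" | "image" | "image_command" | "text";

function detectMediaKind(message: any): MediaKind {
  const attachment = message.attachments?.[0];
  // Voice messages carry waveform data on the attachment.
  if (attachment?.waveform) return "audio";
  // Images are identified by the attachment's content type.
  if (attachment?.content_type?.startsWith("image/")) return "image";
  // Special prefix for image generation requests (prefix is illustrative).
  if (typeof message.content === "string" && message.content.startsWith("!imagine")) {
    return "image_command";
  }
  return "text"; // default when no media is detected
}
```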
by Davide
This workflow automates the creation of exam questions (both open-ended and multiple-choice) from educational content stored in Google Docs, using AI-powered analysis and vector database retrieval. It saves educators hours of manual work while ensuring high-quality, curriculum-aligned assessments. Let me know if you'd like help adapting it for specific subjects!

**Use Cases**

- **Educators:** Rapidly generate quizzes, midterms, or flashcards.
- **E-learning platforms:** Automate question banks for courses.
- **Corporate training:** Create assessments for employee onboarding.

**Technical Requirements**

- **APIs:** Google Gemini, OpenAI, Qdrant, Google Workspace.
- **n8n Nodes:** LangChain, Google Sheets/Docs, HTTP requests, code blocks.

This workflow combines AI efficiency with human-curated quality, making it a powerful tool for modern education and training.

**Advantages of This Workflow**

- ✅ Fully Automated Exam Generation: From document to fully formatted quiz content with no manual intervention.
- ✅ Supports Comprehension and Critical Thinking: Questions are designed to go beyond factual recall, including inference and application.
- ✅ Uses AI and RAG for Accuracy: Ensures that answers are grounded in the document content, reducing hallucination.
- ✅ Seamless Google Integration: Pulls content from Google Docs and writes outputs to Google Sheets.
- ✅ Scalable for Any Subject: Works with any article or content domain as input.
- ✅ Modular and Customizable: Can be easily adapted to generate different question types or to use other LLMs or storage systems.

**How It Works**

1. Document Ingestion: The workflow starts by fetching an educational document (e.g., textbook chapter, lecture notes) from Google Docs and converts it to Markdown for structured processing.
2. AI Processing: Splits the text into chunks and generates vector embeddings (via OpenAI) for semantic analysis, then stores the embeddings in Qdrant (vector database) for retrieval.
3. Question Generation:
   - Open-ended questions: Google Gemini AI creates 10 critical-thinking questions.
   - Multiple-choice questions: Generates 10 MCQs (1 correct + 3 plausible distractors) using RAG to validate answers against the vector DB (see the sample output shape after this listing).
4. Answer Validation:
   - For open questions: Retrieves context-aware answers from the vector store.
   - For MCQs: Ensures distractors are incorrect but believable via AI cross-checking.
5. Output: Saves questions/answers to Google Sheets in two tabs:
   - Open questions: Question + AI-generated answer.
   - Closed questions: MCQ + options + correct answer.

**Set Up Steps**

1. Prerequisites:
   - APIs/Accounts: Google Workspace (Docs + Sheets), OpenAI (for embeddings), Google Gemini (for question generation), Qdrant (vector DB, self-hosted or cloud).
   - n8n Nodes: Ensure LangChain, Google Sheets/Docs, and HTTP Request nodes are installed.
2. Configure Connections: Link credentials for Google Docs/Sheets (OAuth2), OpenAI (API key), Google Gemini (API key), and Qdrant (URL + API key).
3. Customize Input: Replace the default Google Doc ID in the "Get Doc" node with your source document, and adjust chunk size/overlap (Token Splitter node) for optimal text processing.
4. Tweak Question Generation: Modify the prompts in the "Open questions" node (adjust criteria such as difficulty and question types) and the "Closed questions" node (edit MCQ formatting rules).
5. Output Settings: Update the Google Sheet ID in the "Write open" and "Write closed" nodes, and map the columns in Google Sheets to match the question/answer formats.
6. Run & Automate: Trigger manually ("Test workflow") or schedule periodic runs (e.g., for updated content).

**Need help customizing?**
Contact me for consulting and support, or add me on LinkedIn.
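For clarity, the "Closed questions" output written to Google Sheets can be modeled like this. A hypothetical TypeScript shape; the example question is illustrative:

```typescript
// Hypothetical output shape for one MCQ, matching the "Closed questions" tab
// layout (question + options + correct answer).
interface MultipleChoiceQuestion {
  question: string;
  options: [string, string, string, string]; // 1 correct + 3 plausible distractors
  correctAnswer: string;                     // cross-checked against the Qdrant context
}

const example: MultipleChoiceQuestion = {
  question: "What is the main purpose of chunking text before embedding?",
  options: [
    "To fit semantic units within the embedding model's context limits",
    "To compress the document for storage",
    "To translate the document into Markdown",
    "To remove stop words from the text",
  ],
  correctAnswer: "To fit semantic units within the embedding model's context limits",
};
```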
by Saverflow AI
**🚀 LinkedIn Comments to Leads Extractor & Enricher (Apify) → Google Sheets / CSV**

**Overview**

Automate LinkedIn lead generation by scraping comments from targeted posts and enriching profiles with detailed data. This n8n workflow automatically extracts leads from LinkedIn post comments using Apify's powerful scrapers (no LinkedIn login required), enriches the data with additional profile information, and exports everything to Google Sheets or CSV format.

**✨ Key Features**

- 🔍 No Login Required: Scrape LinkedIn data without sharing credentials
- 💰 Cost-Effective: First 1,000 comments are free with Apify
- 📊 Data Enrichment: Enhance basic comment data with full profile details
- 📈 Export Options: Choose between Google Sheets or CSV output
- 🎯 Targeted Scraping: Focus on specific posts for quality leads

**🛠️ Apify Scrapers Used**

1. LinkedIn Post Comments Scraper
   - **Tool:** LinkedIn Post Comments, Replies, Engagements Scraper | No Cookies
   - **Pricing:** $5.00 per 1,000 results
   - **Function:** Extracts all comments and engagement data from specified LinkedIn posts
2. LinkedIn Profile Batch Scraper
   - **Tool:** LinkedIn Profile Details Batch Scraper (No Cookies Required)
   - **Pricing:** $5.00 per 1,000 results
   - **Function:** Enriches scraped profiles with detailed information

> 💡 Free Tier: Apify provides 1,000 free scraped comments to get you started!

**📋 Prerequisites**

Required API credentials:

1. Apify Token
   - Add your APIFY_TOKEN to the workflow credentials
   - Get your token from the Apify Console
2. Google Sheets Credentials (if using Sheets export)
   - Configure OAuth credentials for Google Sheets integration
   - Follow n8n's Google Sheets setup guide

**🔄 Workflow Process**

Default Mode: Form-Based Execution

1. Manual Trigger → Launches the workflow
2. Form Submission → User-friendly form for inputting LinkedIn post URLs
3. Comment Scraping → Apify extracts all comments from the specified posts (see the API sketch after this listing)
4. Profile Enrichment → Additional profile data gathered for each commenter
5. Data Processing → Creates a unique, enriched lead list
6. Google Sheets Export → Automatically populates your spreadsheet

Result: You'll be redirected to a Google Sheets document containing all enriched leads.

Alternative Mode: CSV Export

For users preferring CSV output:

- Disable: Form trigger nodes
- Enable: Manual trigger node
- Disable: Google Sheets export nodes
- Enable: CSV download nodes
- Configure: Add post IDs/URLs in the "Set manual fields" node
- Execute: Run the workflow and download the CSV from the CSV node

**📊 Output Data Structure**

Your exported data will include:

- **Basic Info:** Name, headline, location
- **Profile Details:** Company, position, industry
- **Engagement Data:** Comment content, engagement metrics
- **Contact Info:** Available profile links and connections
- **Enriched Data:** Additional profile insights from Apify

**💡 Pro Tips**

- **Quality over Quantity:** Target posts with high-quality, relevant engagement
- **Monitor Costs:** Track your Apify usage to stay within budget
- **Data Hygiene:** Regularly clean and deduplicate your lead lists
- **Compliance:** Ensure your scraping activities comply with LinkedIn's terms of service

**🆘 Troubleshooting**

Common issues:

- **Authentication Errors:** Verify your Apify token is correctly configured
- **Empty Results:** Check that your LinkedIn post URLs are valid and public
- **Export Failures:** Ensure Google Sheets credentials are properly set up

Need help? Contact Saverflow.ai for support and custom workflow development.
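The scraping step is equivalent to running an Apify actor via its API. A hedged sketch; the actor ID and input fields are placeholders, so check the actor's input schema in the Apify Console before using it:

```typescript
// Run an Apify actor synchronously and get its dataset items back.
async function scrapeComments(postUrl: string) {
  const actorId = "username~linkedin-post-comments-scraper"; // placeholder actor ID
  const res = await fetch(
    `https://api.apify.com/v2/acts/${actorId}/run-sync-get-dataset-items?token=${process.env.APIFY_TOKEN}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ postUrls: [postUrl] }), // input shape is actor-specific
    }
  );
  if (!res.ok) throw new Error(`Apify returned ${res.status}`);
  return res.json(); // array of comment/engagement records
}
```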
by Luciano Gutierrez
**Google Calendar AI Agent with Dynamic Scheduling**

Version: 1.0.0
n8n Version: 1.88.0+
Author: Koresolucoes
License: MIT

**Description**

An AI-powered workflow to automate Google Calendar operations using dynamic parameters and MCP (Model Control Plane) integration. Enables event creation, availability checks, updates, and deletions with timezone-aware scheduling.

**Key Features:**

- 📅 Full Calendar CRUD: Create, read, update, and delete events in Google Calendar.
- ⏰ Availability Checks: Verify time slots using the AVALIABILITY_CALENDAR node with timezone support (e.g., America/Sao_Paulo).
- 🤖 AI-Driven Parameters: Use $fromAI() to inject dynamic values like Start_Time, End_Time, and Description (see the expression examples below).
- 🔗 MCP Integration: Connects to an MCP server for centralized AI agent control.

**Use Cases**

- Automated Scheduling: Book appointments based on AI-recommended time slots.
- Meeting Coordination: Sync calendar events with CRM/task management systems.
- Resource Management: Check room/equipment availability before event creation.

**Instructions**

1. Import Template: Go to n8n > Templates > Import from File and upload this workflow.
2. Configure Credentials: Add Google Calendar OAuth2 credentials under Settings > Credentials. Ensure the calendar ID matches your target (e.g., the ODONTOLOGIA group calendar).
3. Set Up Dynamic Parameters: Use $fromAI('Parameter_Name') in nodes like CREATE_CALENDAR to inject AI-generated values (e.g., event descriptions).
4. Activate & Test: Enable the workflow and send test requests to the webhook path /mcp/:tool/calendar.

**Tags**

Google Calendar, Automation, MCP, AI Agent, Scheduling, CRUD

**License**

This template is licensed under the MIT License.

**Notes:**

- Extend multi-tenancy by adding :userId to the webhook path (e.g., /mcp/:userId/calendar).
- For timezone accuracy, always specify options.timezone in availability checks.
- Refer to n8n's Google Calendar docs for advanced field mappings.
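For reference, $fromAI() expressions inside a node field look like the following; the parameter names and descriptions here are examples rather than fixed values:

```
{{ $fromAI('Start_Time', 'Event start in ISO 8601 with timezone', 'string') }}
{{ $fromAI('End_Time', 'Event end in ISO 8601 with timezone', 'string') }}
{{ $fromAI('Description', 'Short summary of the appointment', 'string') }}
```

At run time, the AI agent fills each placeholder with a value matching the description, which is how the same CREATE_CALENDAR node can serve many different scheduling requests.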