by Simeon
Google Calendar MCP - Context-Aware Calendar Operations

This n8n template implements an MCP (Model Context Protocol)-compliant module for managing Google Calendar events in a context-aware, conflict-free manner.

What It Does
This MCP enables structured interaction with Google Calendar based on context and intent, ensuring reliable, reusable operations with awareness of existing data and state.

Core Capabilities
- **Context-aware event creation**: prevents overlaps by validating time availability before creating new events.
- **Gap validation**: checks whether a time range is busy or free, enabling smarter scheduling decisions.
- **Conditional updates**: only updates events after confirming their existence and current state.
- **Safe deletion**: removes events following MCP principles of validation and traceability.

How to Use
To use this MCP in your context-aware systems:
1. Deploy the template in your n8n instance.
2. Locate the Server node in the workflow; it exposes a Server-Sent Events (SSE) URL.
3. Copy that SSE URL.
4. Use that URL as the entry point for your MCP client or orchestrator.

This URL acts as the communication bridge, allowing you to interact with the MCP-compliant Google Calendar logic using standard MCP semantics.
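In practice you paste the SSE URL into your MCP client's server configuration, but as a rough sketch of what sits behind it, a client can subscribe to the stream directly. The URL below is a placeholder for the one copied from your Server node, and this assumes a runtime that provides `EventSource` (browsers, or Node 22+):

```javascript
// Minimal sketch: subscribing to the MCP server's SSE endpoint.
// The URL is a placeholder; copy the real one from the Server node in n8n.
const es = new EventSource("https://your-n8n-instance.example.com/mcp/google-calendar/sse");

es.onmessage = (event) => {
  // MCP messages arrive as JSON payloads on the stream.
  console.log("MCP event:", JSON.parse(event.data));
};

es.onerror = (err) => {
  console.error("SSE connection error:", err);
};
```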
by Julien DEL RIO
This template is inspired by "Save your workflows into a GitHub repository" by hikerspath and "Back Up Your n8n Workflows To Github" by jon-n8n.

Basic
Retrieves all workflows from an n8n instance and saves them to a GitLab project. If a workflow already exists, only the changes are saved.

Flow
What the workflow does:
1. Sets custom parameters
2. Gets the workflows
3. Iterates through each workflow one by one
4. Gets the file from GitLab, if it exists
5. Compares the files as objects (not as strings)
6. Returns a status for the workflow
7. Creates, edits, or ignores the file depending on the status
8. Returns a list of statuses for each workflow

Configuration
Select a credential in each GitLab node. Edit the data in the "Globals" node:
- repo.owner: slug of the user or team owning the repo
- repo.name: slug of the repository
- repo.branch: branch to commit on
- repo.path: path from the root of the repository; should end with /

Comments
- An error in a GitLab node will not stop the run, but will list the current workflow as an error in the results.
- Some fields are ignored when determining whether there are changes:
  - updatedAt: should be ignored if only ignored fields were changed
  - globals: runtime information, no need to track its changes
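As a rough sketch of the comparison step (step 5), a Code node can strip the ignored fields and deep-compare what remains, key order aside. The input field names `instanceWorkflow` and `gitlabFile` are assumptions for illustration:

```javascript
// Minimal sketch of the object comparison, assuming hypothetical input
// fields `instanceWorkflow` (from n8n) and `gitlabFile` (decoded from GitLab).
function stripIgnoredFields(workflow) {
  const { updatedAt, globals, ...rest } = workflow; // fields ignored per the description
  return rest;
}

function sortKeys(value) {
  if (Array.isArray(value)) return value.map(sortKeys);
  if (value && typeof value === "object") {
    return Object.keys(value)
      .sort()
      .reduce((acc, key) => ({ ...acc, [key]: sortKeys(value[key]) }), {});
  }
  return value;
}

const a = stripIgnoredFields($json.instanceWorkflow);
const b = stripIgnoredFields($json.gitlabFile);

// Key-order-insensitive deep comparison, i.e. "as objects, not as strings".
const same = JSON.stringify(sortKeys(a)) === JSON.stringify(sortKeys(b));

return [{ json: { status: same ? "same" : "different" } }];
```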
by Łukasz
What Is This?
This automation simulates the Scrum Master role in daily meetings. Essentially, it is an AI Scrum Master drawing on different sources of data: an intelligent support system for Scrum Masters that leverages data from Asana, Slack, and direct developer responses for comprehensive sprint status analysis and identification of areas requiring intervention. As such, it is useful for Scrum Masters (of course), but also for the Scrum Team, the Product Owner, and possibly the Business Owner.

Who Is It For?
This automation is designed for Agile teams to support the Scrum Master role by collecting and analyzing data from various sources to identify potential impediments and support the team in sprint delivery.

How Does It Work?
The workflow has four main data entry points, launched either on-click or on workdays:
1. Collecting project section information from Asana. The automation retrieves the project structure, available sections, and their organization, allowing the AI to understand the team's work context.
2. Getting recently modified tasks in the Asana project. The system tracks changes in tasks, their status, assignments, and updates to detect potential delays or issues.
3. Obtaining communication from the team's Slack channel. The flow collects data about recent conversations, discussion threads, and team communication to identify warning signals or areas requiring attention.
4. Directly collecting responses from developers about the current sprint: their progress, impediments, concerns, and support needs.

All collected data is passed to an AI model that analyzes it within the Scrum methodology context and identifies:
- Potential impediments in sprint delivery
- Areas requiring Scrum Master intervention
- Recommendations for team support
- Warning signals regarding Sprint Goal achievement

The output is pushed to a Slack channel, so it can potentially be used by another iteration of the same flow via the Slack channel history.

Requirements
- Asana OAuth credentials
- OpenAI (or an alternative AI) for processing the data
- A Slack app with the proper permissions: channels:history, chat:write, groups:history, im:history, mpim:history, users.profile:write, users:write

Configuration
- Set up the "Asana Project and Slack Channel" node. Provide the Asana project ID and Slack channel ID (optional).
- Set up the "Get Scrum Master Answers" node. It contains the daily questions/answers that are sent to the channel.

Alternative Use
You can remove the whole "Ask Users Daily ScrumMaster Questions" part if you don't want to run it similarly to daily Scrum standups. In that case, the whole flow essentially becomes a static analyzer of project status based on Slack and Asana.

Extensions and Customizations
There are many ways to extend this automation depending on team needs. For example, you can add integration with additional project management tools, implement different notification schemes based on detected issue criticality, or adjust data collection frequency to match the team's work rhythm. Adding a new data source for the AI to analyze is fairly easy: just add another branch of data and push it into the AI prompt (see the sketch at the end of this template).

Disclaimers and Notes
The whole automation rests on one important assumption: the project runs on a single Slack channel and a single Asana board. Of course this can be extended, but that is beyond the currently designed scope. This automation represents a proof of concept and should not replace an actual Scrum Master.
The Scrum Master role extends far beyond data collection and analysis; it requires a deep understanding of team dynamics, business context, and interpersonal skills. As Scrum.org emphasizes, the Scrum Master doesn't need to be present during the Daily Scrum: their role is to ensure the meeting happens, but the developers are responsible for conducting it. Mindlessly executing daily questions without proper context analysis can lead to situations where the Scrum Master becomes a team manager instead of a facilitator of self-organization. A real Scrum Master analyzes much more data than what's collected by automation; they observe team dynamics, understand business context, identify deeper root causes of problems, and support the team in developing self-organization skills. AI can be a valuable support tool, but it cannot replace the human intuition, empathy, and experience essential in this role. The automation should be treated as a tool supporting the team's work, providing additional insights and helping identify areas requiring attention, but always under the supervision and interpretation of an experienced Scrum practitioner.
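For those extending the flow, here is a minimal sketch of how the data branches might be merged into a single AI prompt inside a Code node; all input field names are hypothetical:

```javascript
// Minimal sketch of merging data branches into one AI prompt.
// All input field names here are assumptions; adapt them to your branches.
const sections = $json.asanaSections ?? [];
const tasks = $json.recentTasks ?? [];
const slackMessages = $json.slackHistory ?? [];
const devAnswers = $json.developerAnswers ?? [];

const prompt = [
  "You are acting as a Scrum Master. Analyze the sprint status below.",
  `Asana sections:\n${JSON.stringify(sections, null, 2)}`,
  `Recently modified tasks:\n${JSON.stringify(tasks, null, 2)}`,
  `Slack discussion:\n${JSON.stringify(slackMessages, null, 2)}`,
  `Developer answers:\n${JSON.stringify(devAnswers, null, 2)}`,
  "Identify impediments, areas needing intervention, recommendations, and Sprint Goal risks.",
].join("\n\n");

return [{ json: { prompt } }];
```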
by tanaypant
This workflow automatically queries a Postgres database to find outlier readings for which SMS notifications have not been sent. This is Workflow 2 in the blog tutorial "Database activity monitoring and alerting".

Prerequisites
- A Postgres database set up, and its credentials
- A Twilio account and credentials

Nodes
- Cron node: triggers the workflow every minute, so the database is queried at regular intervals.
- Postgres nodes: extract values from, and update values in, the database.
- Twilio node: sends an alert SMS about the outlier reading to a specified phone number.
- Set node: sets the notification value to true.
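As an illustration, the two Postgres nodes might run queries along these lines, assuming a hypothetical `readings` table with `is_outlier` and `notification_sent` columns (adjust to your schema):

```javascript
// Hypothetical queries for the two Postgres nodes.
const selectOutliers = `
  SELECT id, sensor_id, value, recorded_at
  FROM readings
  WHERE is_outlier = true
    AND notification_sent = false;
`;

// Run after the Twilio node succeeds, so alerts are never sent twice.
const markNotified = `
  UPDATE readings
  SET notification_sent = true
  WHERE id = $1;
`;
```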
by n8n Team
This n8n workflow automates the handling of security detections from CrowdStrike, streamlining incident response and notification processes.

The workflow is triggered daily at midnight by the Schedule Trigger node. It begins by fetching recent security detections from CrowdStrike using an HTTP Request node. The response is then split into individual detections for further processing. Each detection is enriched by querying the CrowdStrike API for detailed information using another HTTP Request node. The workflow then processes these detections sequentially using the Split In Batches node.

Next, it looks up behavioral information associated with each detection in VirusTotal using two HTTP Request nodes. One node queries VirusTotal based on SHA256 values, and the other based on IOC (Indicator of Compromise) values. The workflow includes a 1-second pause using the Wait node to prevent rate limiting when making requests to the VirusTotal API.

Subsequently, the workflow sets fields with relevant details from both CrowdStrike and VirusTotal, including detection links, confidence scores, filenames, usernames, and more. These details are concatenated using an Item Lists node for each detection. The final step involves creating Jira issues for each detection, including summaries with CrowdStrike alert severity and hostnames, as well as descriptions that incorporate information from CrowdStrike and VirusTotal. Information about each issue is then sent via a Slack message to a Slack user.

Potential issues during setup include configuring the Schedule Trigger node for the correct time zone and handling potential rate limiting from the VirusTotal API, which could lead to throttled requests. Additionally, the note about a possible typo in the URL of the VirusTotal nodes should be addressed to ensure correct API calls. The Jira node may need to be replaced with the latest version for compatibility. Properly configuring API credentials and handling errors that may occur during API requests are essential for smooth workflow operation. Careful testing with sample data is recommended to validate the workflow's functionality and ensure it aligns with your organization's security incident response processes.
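As an illustration of the SHA256 enrichment step with its rate-limit pause, here is a sketch using the VirusTotal v3 file lookup; the way hashes arrive as input is an assumption:

```javascript
// Minimal sketch of the SHA256 lookup against VirusTotal (API v3),
// with a 1-second pause between calls to avoid throttling.
const headers = { "x-apikey": "YOUR_VIRUSTOTAL_API_KEY" }; // replace with your key

async function lookupHash(sha256) {
  const res = await fetch(`https://www.virustotal.com/api/v3/files/${sha256}`, { headers });
  if (!res.ok) throw new Error(`VirusTotal lookup failed: ${res.status}`);
  return res.json();
}

const results = [];
for (const item of items) {
  // `sha256` as the input field name is an assumption.
  results.push({ json: await lookupHash(item.json.sha256) });
  await new Promise((resolve) => setTimeout(resolve, 1000)); // rate-limit pause
}

return results;
```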
by Harshil Agrawal
This workflow allows you to get analytics for a website and store them in Airtable. In this workflow, we get the analytics for sessions grouped by country. Based on your use case, you can select different Dimensions and set different Metrics.

You can use the Cron node or the Interval node to trigger the workflow at a particular interval and fetch the analytics data regularly. Based on your use case, you might want to store the data returned by Google Analytics in a database or a Google Sheet; replace the Airtable node with the appropriate node.
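For reference, a report request grouping sessions by country might look like the sketch below; this assumes the GA4 Data API (`runReport`), while older Universal Analytics setups would use `ga:country` / `ga:sessions` instead:

```javascript
// Sketch of a GA4 Data API runReport request body;
// swap the dimensions and metrics for your use case.
const reportRequest = {
  dimensions: [{ name: "country" }],
  metrics: [{ name: "sessions" }],
  dateRanges: [{ startDate: "30daysAgo", endDate: "today" }],
};
```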
by Łukasz
What Is This?
This workflow is a comprehensive solution for automating website audits and optimizations, leveraging advanced technologies to boost SEO effectiveness and overall site performance.

Who Is It For?
Designed for SEO specialists, digital marketers, webmasters, and content teams, this workflow empowers anyone responsible for website performance to automate and scale their audit processes. Agencies managing multiple client sites, in-house SEO teams aiming to save time on routine checks, and developers seeking to integrate data-driven insights into their deployment pipelines will all find this solution invaluable. By combining your site's sitemap with Google Search Console and Google Analytics data, then applying AI-powered analysis, the workflow continuously uncovers actionable recommendations to boost search visibility, improve user engagement, and accelerate page performance. Whether you manage a single blog or oversee a sprawling e-commerce platform, this automated pipeline delivers precise, prioritized SEO improvements without manual data wrangling.

How Does It Work?
This end-to-end site analysis automation consists of five main stages:
1. URL Discovery: processes the sitemap.xml using HTTP Request and XML nodes to extract all site URLs (see the sketch below).
2. Search Console Performance Analysis: uses the Google Search Console API to fetch detailed metrics for each page, including search position, clicks, impressions, and CTR.
3. Analytics Data Collection: connects to the Google Analytics API to automatically retrieve traffic metrics such as pageviews, average session duration, bounce rate, and conversions.
4. AI Data Processing: employs OpenAI models to perform in-depth analysis of the collected data. The AI engine merges insights from all sources, identifies patterns, and produces detailed optimization recommendations. The AI analyzes the website itself as well. Consider testing different models; I recommend at least trying out o4-mini.
5. Recommendation Generation: creates tailored suggestions for each page, in the form of an HTML table that is sent to your email.

How To Set It Up?
Accounts: an active n8n account or instance, API keys for Google Search Console and Google Analytics, and an OpenAI access token.
Enabled Google APIs: you will need at least the following:
- Google Search Console API
- Google Analytics Admin API
- Google Analytics Data API
Scheduling: the workflow can run manually for ad hoc audits or be scheduled (daily, weekly) for continuous site monitoring.
Testing: two nodes are optional: "Sort for testing purposes" and "Limit for testing purposes". Together they randomly select a few items from the sitemap, so you don't need to run hundreds of sitemap.xml items at once; you can run a small random batch first.
Globals: there is a node called "Globals - CHANGE ME!". You need to set the following variables there:
- sitemap_url: self-explanatory
- search_console_selector: for example "sc-domain:sailingbyte.com", but it can be a URL as well, depending on how you set up your Search Console
- analysis_start_date and analysis_end_date: date range for analytics; by default, the last 30 days
- analytics_selector_id: the ID of your Google Analytics property. It is a large integer you can find in the Analytics URL, preceded by the letter "p" (your number replaces the X's): https://analytics.google.com/analytics/web/#/pXXXXXXXXX/reports/intelligenthome
- report_receiver: the email address that will receive the report

What's More?
That's actually it.
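As a rough sketch of the URL Discovery stage: the actual workflow uses HTTP Request and XML nodes, but the same effect in a single Code node would look roughly like this (the URL is a placeholder for your sitemap_url value):

```javascript
// Rough sketch of stage 1 in one Code node; the regex stands in for the XML node.
const res = await fetch("https://example.com/sitemap.xml"); // placeholder URL
const xml = await res.text();

// Pull every <loc> entry out of the sitemap.
const urls = [...xml.matchAll(/<loc>\s*(.*?)\s*<\/loc>/g)].map((m) => m[1]);

return urls.map((url) => ({ json: { url } }));
```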
I hope this automation makes improving your website much easier! Visit my profile for other automations for businesses. And if you are looking for dedicated software development, do not hesitate to reach out!
by Raz Hadas
Stay ahead of the market with this powerful, automated workflow that performs real-time sentiment analysis on stock market news. By leveraging the advanced capabilities of Google Gemini, this solution provides you with actionable insights to make informed investment decisions.

This workflow is designed for investors, traders, and financial analysts who want to automate the process of monitoring news and gauging market sentiment for specific stocks. It seamlessly integrates with Google Sheets for input and output, making it easy to track a portfolio of stocks.

Key Features & Benefits
- **Automated Daily Analysis**: the workflow is triggered daily, providing you with fresh sentiment analysis just in time for the market open.
- **Dynamic Stock Tracking**: easily manage your list of tracked stocks from a simple Google Sheet.
- **AI-Powered Insights**: utilizes Google Gemini's sophisticated language model to analyze news content for its potential impact on stock prices, including a sentiment score and a detailed rationale.
- **Comprehensive News Aggregation**: fetches the latest news articles from EODHD for each of your specified stock tickers.
- **Error Handling & Validation**: includes built-in checks for invalid stock tickers and formats the AI output for reliable data logging.
- **Centralized Reporting**: automatically logs the sentiment score, rationale, and date into a Google Sheet for easy tracking and historical analysis.

How It Works
This workflow follows a systematic process to deliver automated sentiment analysis:
1. Scheduled Trigger: the workflow begins each day at a specified time.
2. Fetch Stock Tickers: it reads a list of stock tickers from your designated Google Sheet.
3. Loop and Fetch News: for each ticker, it retrieves the latest news articles using the EODHD API.
4. AI Sentiment Analysis: the collected news articles are passed to a Google Gemini-powered AI agent. The agent is prompted to act as a stock sentiment analyzer, evaluating the news and generating:
   - A sentiment score from -1 (strong negative) to 1 (strong positive).
   - A detailed rationale explaining the basis for the score.
5. Data Formatting & Validation: the AI's output is parsed and validated to ensure it is in the correct JSON format (see the sketch below).
6. Log to Google Sheets: the final sentiment score and rationale are appended to your Google Sheet, alongside the corresponding stock ticker and the current date.

Nodes Used
- Schedule Trigger
- Google Sheets
- SplitInBatches
- HttpRequest (EODHD)
- If
- Code (JavaScript)
- AI Agent (LangChain)
- Google Gemini Chat Model

This workflow is a valuable tool for anyone looking to harness the power of AI for financial market analysis. Deploy this automated solution to save time, gain a competitive edge, and make more data-driven trading decisions.
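A minimal sketch of the validation step (step 5) in a Code node; the input field names `output` and `ticker` are assumptions to adapt to your AI Agent's actual output:

```javascript
// Parse the AI output and enforce the expected { score, rationale } shape.
let parsed;
try {
  parsed = JSON.parse($json.output);
} catch (err) {
  throw new Error(`AI output was not valid JSON: ${err.message}`);
}

if (typeof parsed.score !== "number" || parsed.score < -1 || parsed.score > 1) {
  throw new Error("Sentiment score must be a number in [-1, 1]");
}

return [{
  json: {
    ticker: $json.ticker,
    score: parsed.score,
    rationale: parsed.rationale,
    date: new Date().toISOString().slice(0, 10),
  },
}];
```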
by Rahul Joshi
Description
Automatically track SDK releases from GitHub, compare documentation freshness in Notion, and send Slack alerts when docs lag behind. This workflow ensures documentation stays in sync with releases, improves visibility, and reduces version drift across teams.

What This Template Does
- Step 1: Listens to GitHub repository events to detect new SDK releases.
- Step 2: Fetches release metadata, including version, tag, and publish date.
- Step 3: Logs release data into Google Sheets for record-keeping and analysis.
- Step 4: Retrieves FAQ or documentation data from Notion.
- Step 5: Merges GitHub and Notion data to calculate documentation drift (see the sketch below).
- Step 6: Flags SDKs whose documentation is over 30 days out of date.
- Step 7: Sends detailed Slack alerts to notify the responsible teams.

Key Benefits
- Keeps SDK documentation aligned with product releases
- Prevents outdated information from reaching users
- Provides centralized release tracking in Google Sheets
- Sends real-time Slack alerts for overdue updates
- Strengthens DevRel and developer experience operations

Features
- GitHub release trigger for real-time monitoring
- Google Sheets logging for tracking and auditing
- Notion database integration for documentation comparison
- Automated drift calculation (days since last update)
- Slack notifications for overdue documentation

Requirements
- GitHub OAuth2 credentials
- Notion API credentials
- Google Sheets OAuth2 credentials
- Slack Bot token with chat:write permissions

Target Audience
- Developer Relations (DevRel) and SDK engineering teams
- Product documentation and technical writing teams
- Project managers tracking SDK and doc release parity

Step-by-Step Setup Instructions
1. Connect your GitHub account and select your SDK repository.
2. Replace YOUR_GOOGLE_SHEET_ID and YOUR_SHEET_GID with your tracking spreadsheet.
3. Add your Notion FAQ database ID.
4. Configure your Slack channel ID for alerts.
5. Run once manually to validate the setup, then enable automation.
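A minimal sketch of the drift calculation (Step 5) in a Code node; the field names `releaseDate` (from GitHub) and `docLastEdited` (from Notion) are assumptions to adapt to your merge output:

```javascript
// Days the documentation lags behind the latest release, flagged at >30.
const MS_PER_DAY = 24 * 60 * 60 * 1000;
const release = new Date($json.releaseDate);
const docEdit = new Date($json.docLastEdited);

const driftDays = Math.floor((release - docEdit) / MS_PER_DAY);

return [{ json: { ...$json, driftDays, overdue: driftDays > 30 } }];
```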
by Raz Hadas
Description
Transform your investment strategy with a fully automated, AI-driven trading bot. This workflow bridges the gap between AI-powered market insights and real-world trading by executing buy and sell orders directly through the Alpaca paper trading API.

Designed to work in tandem with the Automated Stock Sentiment Analysis workflow, this solution takes the top-performing stocks based on daily news sentiment and automatically rebalances your portfolio. It's perfect for algorithmic traders, data-driven investors, and n8n enthusiasts who want to see their AI analysis translate into tangible actions, all while maintaining a comprehensive log of every transaction in Google Sheets.

Key Features & Benefits
- **Automated Trading Execution**: automatically places buy and sell orders on the Alpaca paper trading platform without manual intervention.
- **Sentiment-Driven Decisions**: leverages the output from the sentiment analysis workflow to make informed decisions, selling positions with waning sentiment and buying into those with high positive sentiment.
- **Dynamic Portfolio Rebalancing**: intelligently calculates which positions to close and how to allocate the resulting funds into new, high-potential stocks.
- **Paper Trading Ready**: safely test and refine your trading strategies in a risk-free environment using Alpaca's paper trading API.
- **Daily Performance Tracking**: automatically logs your account equity and daily percentage change to a Google Sheet, giving you a clear view of your portfolio's performance.
- **Detailed Trade Logging**: every buy and sell order is meticulously recorded in a Google Sheet for easy review and historical analysis.
- **Scheduled and Autonomous**: the entire process runs on a daily schedule, making it a "set and forget" solution for systematic trading.

How It Works
This workflow executes a sophisticated, automated trading strategy in a few key stages:
1. Daily Kick-off & Snapshot: the workflow triggers on a daily schedule, first fetching your current Alpaca account balance and logging it to a Google Sheet to track daily performance.
2. Strategy Formulation: it then reads the daily sentiment scores produced by the accompanying "Stock Sentiment Analysis" workflow. A Code node filters these results to identify the four stocks with the highest positive sentiment.
3. The Decision Engine: the core of the workflow is a custom Code node that acts as the trading brain (see the sketch below). It:
   - Retrieves your currently open positions from Alpaca.
   - Compares your holdings against the day's top four sentiment stocks.
   - Generates a "sell list" of positions you hold that are no longer in the top four.
   - Generates a "buy list" of top-sentiment stocks that you don't yet own.
   - Calculates the total cash value from the "sell list" and determines the exact notional value to invest in each stock on the "buy list".
4. Trade Execution: the workflow first iterates through the "sell list" and executes a DELETE request to Alpaca for each, closing the positions. A Wait node pauses the workflow for two minutes to ensure the sell orders are filled and the account balance is updated. It then iterates through the "buy list", executing POST requests to Alpaca to purchase the new assets with the calculated funds.
5. Record Keeping: all executed orders (both buys and sells) are merged and logged in a dedicated Google Sheet, giving you a permanent and detailed transaction history.
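A minimal sketch of the decision engine Code node follows. The input field names (`positions` from Alpaca's positions endpoint, `topFour` from the sentiment filter) are assumptions:

```javascript
// Compare holdings against the day's top sentiment picks and size the buys.
const positions = $json.positions ?? []; // open positions from GET /v2/positions
const topFour = $json.topFour ?? [];     // e.g. ["AAPL", "MSFT", "NVDA", "AMZN"]

const held = positions.map((p) => p.symbol);
const sellList = held.filter((sym) => !topFour.includes(sym));
const buyList = topFour.filter((sym) => !held.includes(sym));

// Cash freed by the sells, split evenly across the buys.
const freedCash = positions
  .filter((p) => sellList.includes(p.symbol))
  .reduce((sum, p) => sum + Number(p.market_value), 0);
const notionalPerBuy = buyList.length > 0 ? freedCash / buyList.length : 0;

return [{ json: { sellList, buyList, notionalPerBuy } }];
```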
Nodes Used
- Schedule Trigger
- HttpRequest (Alpaca API)
- Google Sheets
- Code (JavaScript)
- SplitOut
- Wait
- Merge

This workflow is the perfect next step for anyone looking to take their AI analysis to the next level. Take the emotion out of your trading and let this bot systematically execute your data-driven strategy.
by Jonathan | NEX
Are you drowning in a sea of security notifications? Do your analysts spend more time sifting through low-level logs than investigating real threats? This workflow transforms n8n into an autonomous SOC (Security Operations Center) Analyst, tackling alert fatigue head-on.

Leveraging the NixGuard Security RAG connector, this workflow automates the entire alert triage process. It ingests raw security events (from sources like Wazuh, your SIEM, or EDR), uses AI to analyze and assign a priority, and then intelligently routes the alert to the correct Slack channel.

How It Works
1. Ingest & Filter: the workflow runs on a schedule, fetching all recent security alerts. It first performs basic filtering to isolate events that meet a minimum severity threshold (e.g., level 7+).
2. AI Analysis & Prioritization: the aggregated high-severity alerts are sent to the AI with a specific prompt, asking it to analyze the situation and return a structured JSON object containing a single overall priority (Critical, High, Info) and a concise summary.
3. Intelligent Routing: a Switch node reads the AI-assigned priority and routes the notification to the appropriate destination (see the sketch below). Critical alerts go to your #security-incident-response channel, high-priority alerts to #security-investigations, and informational ones to #security-logs.

Key Features & Benefits
- **Eliminate Alert Fatigue**: drastically reduce the noise by having AI pre-process and categorize alerts before they hit your team.
- **Automate SOC Tier 1 Triage**: free up your human analysts from repetitive triage tasks so they can focus on high-value investigation and threat hunting.
- **Faster Incident Response**: route critical alerts to the right people in real time, cutting down on crucial response time.
- **Consistent Prioritization**: use AI to ensure a consistent, unbiased approach to alert prioritization, 24/7.
- **Smart Routing Logic**: go beyond simple keyword matching. The Switch node ensures alerts are delivered to the team best equipped to handle them, based on AI-assessed severity.

Who Is This For?
- **SOC Analysts & Security Engineers** looking to automate alert triage and incident response workflows.
- **SecOps and DevOps Teams** who want to build a more efficient, automated security operations pipeline.
- **IT Managers and Directors** aiming to improve their team's efficiency and reduce the risk of missing critical alerts.
- Anyone using Wazuh, a SIEM, or other security tools that generate a high volume of alerts.

Stop manually triaging alerts. Install this workflow to build your own AI-powered security automation platform and let your team focus on what matters most. Don't have the main workflow yet? Get it HERE!

Learn more about NixGuard: thenex.world
Get started with a free security subscription: thenex.world/security/subscribe

Tags / Keywords: AI, Security, SOC, Automation, Triage, Alerting, Cybersecurity, Wazuh, SIEM, Slack, Incident Response, Alert Fatigue, SecOps, Generative AI, LLM, NixGuard, Routing
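A minimal sketch of the routing step, assuming the AI returns a JSON string shaped like `{ "priority": "Critical" | "High" | "Info", "summary": "..." }`; the input field name `aiResponse` is an assumption:

```javascript
// Map the AI-assigned priority to the Slack channels named above.
const { priority, summary } = JSON.parse($json.aiResponse);

const channelFor = {
  Critical: "#security-incident-response",
  High: "#security-investigations",
  Info: "#security-logs",
};

return [{ json: { channel: channelFor[priority] ?? "#security-logs", text: summary } }];
```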
by Evoort Solutions
AI Text Summarizer with Google Sheets Logging
Summarize large blocks of text into concise outputs using the Text Summarizer AI API and automatically log the results in Google Sheets. This automation is ideal for content creators, marketers, researchers, and teams who need efficient summarization and record-keeping without writing a single line of code.

Features
| Feature | Description |
|---|---|
| Web Form Trigger | Collects title, content, mode (Paragraph/Bullet), and length preferences. |
| API Integration | Sends content to the Text Summarizer AI API via RapidAPI. |
| Conditional Logic | Routes success and error cases appropriately. |
| Google Sheets Logging | Stores summaries or error messages in Google Sheets. |
| Error Handling | Captures failed summaries and ensures no data is lost. |

Problem Solved
Manually summarizing long-form content is time-consuming and repetitive, and storing the output in structured logs (like Google Sheets) adds an extra layer of manual effort. This workflow solves that by:
- Automating AI-driven summarization using the Text Summarizer AI API.
- Storing results (or fallback errors) directly in a Google Sheet.
- Ensuring no request goes unlogged, even on API failure.

Use Cases
- Blog writers: quickly convert articles into summaries for social captions.
- Students: break down textbook or lecture content.
- Knowledge management: turn raw meeting notes into concise summaries.
- SEO teams: use bullet-point output for schema markup or meta descriptions.

Nodes in the Flow
| Node Name | Purpose |
|---|---|
| On form submission | Captures user input via form (title, content, mode, length). |
| Mapping | Formats the input to match the Text Summarizer AI API specs. |
| HTTP Request | Sends a POST request to the summarization API on RapidAPI. |
| If | Validates whether a summary was returned. |
| Wait | Adds a short delay before writing to the spreadsheet (success path). |
| Google Sheets | Appends summary data to the Google Sheet. |
| Wait1 | Adds a delay for the error-handling path. |
| Google Sheets1 | Logs the failure with an "Error occurred" message. |

Benefits
- AI-powered: uses the Text Summarizer AI API for fast and contextual summaries.
- Organized logs: Google Sheets integration ensures easy tracking and auditing.
- Reliable: captures all submissions, including failed ones.
- Customizable: easily adapt inputs or connect other tools like Notion, Slack, or Airtable.

Requirements
- n8n account (Cloud or Self-Hosted)
- Access to the Text Summarizer AI API
- Google account (for Sheets integration)
- API key from RapidAPI (used in the HTTP Request node)

Workflow Overview
1. User submits text and preferences (mode, length) via a form.
2. The workflow triggers and transforms the input.
3. The formatted data is sent to the Text Summarizer AI API.
4. If a valid summary is returned, it is logged in Google Sheets.
5. If the API fails, the error message is logged instead.

Import Instructions
1. Open n8n and import the workflow JSON.
2. Replace the x-rapidapi-key in the HTTP node with your personal RapidAPI key.
3. Configure your Google Sheets credentials.
4. Deploy and test.
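For orientation, the HTTP Request step expressed as code might look like the sketch below. The host, path, and body field names are placeholders, not the actual API contract; copy the exact values from the Text Summarizer AI listing on RapidAPI:

```javascript
// Sketch of the summarization call; all endpoint details are placeholders.
const res = await fetch("https://text-summarizer-ai.example-rapidapi-host.com/summarize", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "x-rapidapi-key": "YOUR_RAPIDAPI_KEY", // replace with your personal key
  },
  body: JSON.stringify({
    text: $json.content,
    mode: $json.mode,     // "Paragraph" or "Bullet"
    length: $json.length,
  }),
});

const data = await res.json();
return [{ json: { title: $json.title, summary: data.summary ?? null } }];
```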
Suggested Extensions
- Notion database logging
- Slack/Discord notifications for each summary
- CSV download or Airtable sync

Tags
ai, summarization, text-processing, rapidapi, google-sheets, automation, markdown, n8n, Text Summarizer AI API

Create your free n8n account and set up the workflow in just a few minutes using the link below:
Start Automating with n8n