by Harshil Agrawal
This workflow allows you to receive updates about the position of the ISS and add them to a table in TimescaleDB. Cron node: The Cron node triggers the workflow every minute. You can configure the schedule based on your use case. HTTP Request node: This node makes an HTTP request to an API that returns the position of the ISS. Based on your use case you may want to fetch data from a different URL; enter the URL in the URL field. Set node: In the Set node we set the information that we need in the workflow. Since we only need the timestamp, latitude, and longitude, we set these in the node. If you need other information, you can add it here as well. TimescaleDB node: This node stores the information in a table named iss. You can use a different table as well.
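As a rough illustration of the Set step, the snippet below keeps only the three fields the TimescaleDB node needs. It is a minimal sketch for an n8n Code node and assumes the public wheretheiss.at response shape (the exact API is not named above), so adjust the field names to whatever your endpoint returns:

```javascript
// Minimal sketch for an n8n Code node ("Run Once for All Items").
// Assumes a response like https://api.wheretheiss.at/v1/satellites/25544,
// which exposes timestamp, latitude, and longitude fields.
const results = [];

for (const item of $input.all()) {
  const position = item.json;
  results.push({
    json: {
      timestamp: position.timestamp, // Unix timestamp from the API
      latitude: position.latitude,
      longitude: position.longitude,
    },
  });
}

return results;
```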
by Shashikanth
Source code: I maintain this workflow here.
Usage Guide
This workflow backs up all workflows as JSON files named in the [workflow_name].json format.
Steps
Create GitHub Repository: Skip this step if using an existing repository.
Add GitHub Credentials: In Credentials, add the GitHub credential for the repository owner.
Download and Import Workflow: Import this workflow into n8n.
Set Global Values: In the Globals node, set the following:
repo.owner: GitHub username of the repository owner.
repo.name: Name of the repository for backups.
repo.path: Path to the folder within the repository where workflows will be saved.
Configure GitHub Nodes: Edit each GitHub node in the workflow to use the added credentials.
Workflow Logic
Each workflow run handles files based on their status (see the sketch below):
New Workflow: If a workflow is new, create a new file in the repository.
Unchanged Workflow: If the workflow is unchanged, skip to the next item.
Changed Workflow: If a workflow has changes, update the corresponding file in the repository.
Current Limitations / Needs Work
Name Change of Workflows: If a workflow is renamed in n8n, the old file remains in the repository.
Deleted Workflows: Deleted workflows in n8n are not removed from the repository.
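To make the per-file decision concrete, here is a hedged sketch of that logic in n8n Code-node JavaScript. It is not the template's actual node; the input field names (workflow, existingFileContent) are assumptions used only for illustration:

```javascript
// Conceptual sketch: decide whether a workflow file is new, unchanged, or
// changed by comparing the exported workflow JSON with the file currently
// stored in the GitHub repository. Input shape is assumed.
const workflowJson = JSON.stringify($json.workflow, null, 2);   // exported workflow (assumed field)
const existingFile = $json.existingFileContent;                 // null/undefined if the file does not exist yet

let status;
if (existingFile === null || existingFile === undefined) {
  status = 'new';        // create a new [workflow_name].json file
} else if (existingFile === workflowJson) {
  status = 'unchanged';  // skip to the next item
} else {
  status = 'changed';    // update the existing file
}

return [{ json: { ...$json, status } }];
```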
by ConvertAPI
Who is this for? For developers and organizations that need to convert XLSX files to PDF. What problem is this workflow solving? The file format conversion problem. What this workflow does Downloads the XLSX file from the web. Converts the XLSX file to PDF. Stores the PDF file in the local file system. How to customize this workflow to your needs Open the HTTP Request node. Adjust the URL parameter (all endpoints can be found here). Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret. Optionally, additional Body Parameters can be added for the converter.
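For orientation, the conversion boils down to a single HTTP call. The sketch below shows it in plain JavaScript rather than the HTTP Request node's settings; the endpoint path and parameter names are my reading of the ConvertAPI docs and should be verified there, and YOUR_SECRET plus the sample XLSX URL are placeholders:

```javascript
// Hedged sketch of the conversion call the HTTP Request node performs.
// Verify the endpoint and parameters against the ConvertAPI documentation.
const response = await fetch(
  'https://v2.convertapi.com/convert/xlsx/to/pdf?Secret=YOUR_SECRET', // YOUR_SECRET is a placeholder
  {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      Parameters: [
        // The source XLSX is passed by URL, matching the "download from the web" step.
        { Name: 'File', FileValue: { Url: 'https://example.com/sample.xlsx' } },
        { Name: 'StoreFile', Value: true },
      ],
    }),
  }
);

const result = await response.json();
console.log(result); // contains the generated PDF file data/links to store locally
```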
by Muzaffer AKYIL
Docker Registry Cleanup Template
This template is designed to automatically clean up old image tags in the Docker registry and perform garbage collection.
Features
List all images in the registry
Preserve the last 10 tags for each image (the latest tag is always preserved)
Delete old tags
Email notification for successful or cancelled cleanup runs
Registry garbage collection automation
Failure notification in error conditions
Prerequisites
Docker Registry v2 API access
Basic Authentication credentials
SMTP email settings (for notifications)
SSH node installed on n8n (for garbage collection)
Installation
1. Credentials
Add the following credentials in n8n:
HTTP Basic Auth**: For Registry access
SSH Private Key**: For the garbage collection command
Email SMTP**: For notifications
2. Set Variables
Replace your-registry-url with your actual registry URL on all nodes, e.g. "url": "https://your.registry.com/v2/_catalog".
Customisation
Retention Policy: Set the number of tags to be retained by changing the slice(0, 10) value in the Identify Tags to Remove node (see the sketch below)
Schedule: Change the frequency of operation at the Trigger node
Notification Content: Customise email templates according to your needs
Notes
Test the DELETE operations in a test environment before running against production
Make sure that the registry is not in read-only mode
The registry may need to be put into maintenance mode for garbage collection
Step Details:
Retrieving image information:** The workflow starts by fetching a list of images and their associated tags from the Docker registry.
Filtering and sorting:** The retrieved tags are then filtered and sorted based on specific criteria, such as creation date and tag name.
Deleting old tags:** The workflow identifies old or unused tags and attempts to delete them from the registry.
Sending notifications:** The workflow sends email notifications to inform the user about the status of the cleanup process, including any errors or successes.
Executing additional cleanup tasks:** Finally, the workflow executes an SSH command on the Docker registry server to perform additional cleanup tasks, such as garbage collection.
TL;DR
In summary, this n8n template provides a robust and automated solution for managing and cleaning up Docker registries. By regularly running this workflow, users can ensure that their registry remains organized and efficient, and avoid running out of storage space.
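To illustrate the retention rule, here is a hedged sketch of what the Identify Tags to Remove step does conceptually. The input shape and field names are assumptions (the node's actual code is not shown above); the slice value matches the default of keeping 10 tags:

```javascript
// Hedged sketch of the "Identify Tags to Remove" logic (assumed input shape).
// Keeps the newest 10 tags per image, always preserves "latest", and returns
// the rest as deletion candidates.
const KEEP = 10; // change this to adjust the retention policy (the slice(0, 10) value)

const results = [];

for (const item of $input.all()) {
  const { image, tags } = item.json; // assumed shape: { image: 'name', tags: [{ name, created }, ...] }

  const sorted = [...tags].sort(
    (a, b) => new Date(b.created) - new Date(a.created) // newest first
  );

  const toDelete = sorted
    .slice(KEEP)                          // everything beyond the newest 10
    .filter((t) => t.name !== 'latest');  // never delete the latest tag

  for (const tag of toDelete) {
    results.push({ json: { image, tag: tag.name } });
  }
}

return results;
```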
by AiAgent
What It Does This intelligent workflow simplifies the complex task of determining whether a website is legitimate or potentially a scam. By simply submitting a URL through a form, the system initiates a multi-agent evaluation process. Four dedicated AI agents—each powered by GPT-4o and connected to SerpAPI—analyze different dimensions of the website: domain and technical details, search engine signals, product and pricing patterns, and on-site content analysis. Their findings are then passed to a fifth AI agent, the Analyzer, powered by GPT-4o mini, which consolidates the data, scores the site on a scale of 1–10 for scam likelihood, and presents the findings in a clear, structured format for the user. Who It's For This workflow is ideal for anyone who needs to quickly and reliably assess the trustworthiness of a website. Whether you're a consumer double-checking a store before making a purchase, a small business owner validating supplier sites, a cybersecurity analyst conducting threat assessments, or a developer building fraud detection into your platform — this tool offers fast, AI-powered insights without the need for manual research or technical expertise. It's designed for both individuals and teams who value accurate, scalable scam detection. How It Works The process begins with a simple form submission where the user enters the URL of the website they want to investigate. Once submitted, the workflow activates four specialized AI agents—each powered by GPT-4o and connected to SerpAPI—to independently analyze the site from different angles: Agent 1 examines domain age, SSL certificates, and TLD trustworthiness. Agent 2 reviews search engine results, forum mentions, and public scam reports. Agent 3 analyzes product pricing patterns and brand authenticity. Agent 4 assesses on-site content quality, grammar, legitimacy of claims, and presence of business info. Each agent returns its findings, which are then aggregated and passed to a fifth AI agent—the Analyzer. This final agent, powered by GPT-4o mini, evaluates all the input, assigns a scam likelihood score from 1 to 10, and compiles a neatly formatted summary with organized insights and a disclaimer for context. Set Up You will need to obtain an OpenAI API key from platform.openai.com/api-keys. After you obtain this OpenAI API key you will need to connect it to the OpenAI Chat Model for all of the Tools agents (Analyzer, Domain & Technical Details, Search Engine Signals, Product & Pricing Patterns, and Content Analysis Tools Agents). You will now need to fund your OpenAI account. GPT-4o costs ~$0.01 to run the workflow. Next you will need to create a SerpAPI account at https://serpapi.com/users/sign_up. After you create an account you will need to obtain a SerpAPI key. You will then need to use this key to connect to the SerpAPI tool for each of the tools agents (Domain & Technical Details, Search Engine Signals, Product & Pricing Patterns, and Content Analysis Tools Agents). Tip: SerpAPI will allow you to run 100 free searches each month. This workflow uses ~5-15 SerpAPI searches per run. If you would like to utilize the workflow more than that each month, create multiple SerpAPI accounts and have an API key for each account. When you utilize all 100 free searches for an account, switch to the API key for another account within the workflow. Disclaimer This tool is designed to assist in evaluating the potential risk of websites using AI-generated insights.
The scam likelihood score and analysis provided are based on publicly available information and should not be considered a definitive or authoritative assessment. This tool does not guarantee the accuracy, safety, or legitimacy of any website. Users should perform their own due diligence and use independent judgment before engaging with any site. N8N, OpenAI, its affiliates, and the creators of this workflow are not responsible for any loss, damages, or consequences arising from the use of this tool or the actions taken based on its results.
by Raz Hadas
Description Transform your investment strategy with a fully automated, AI-driven trading bot. This workflow bridges the gap between AI-powered market insights and real-world trading by executing buy and sell orders directly through the Alpaca paper trading API. Designed to work in tandem with the Automated Stock Sentiment Analysis workflow, this solution takes the top-performing stocks based on daily news sentiment and automatically rebalances your portfolio. It's perfect for algorithmic traders, data-driven investors, and n8n enthusiasts who want to see their AI analysis translate into tangible actions, all while maintaining a comprehensive log of every transaction in Google Sheets. Key Features & Benefits Automated Trading Execution:** Automatically places buy and sell orders on the Alpaca paper trading platform without manual intervention. Sentiment-Driven Decisions:** Leverages the output from the sentiment analysis workflow to make informed decisions, selling positions with waning sentiment and buying into those with high positive sentiment. Dynamic Portfolio Rebalancing:** Intelligently calculates which positions to close and how to allocate the resulting funds into new, high-potential stocks. Paper Trading Ready:** Safely test and refine your trading strategies in a risk-free environment using Alpaca's paper trading API. Daily Performance Tracking:** Automatically logs your account equity and daily percentage change to a Google Sheet, giving you a clear view of your portfolio's performance. Detailed Trade Logging:** Every buy and sell order is meticulously recorded in a Google Sheet for easy review and historical analysis. Scheduled and Autonomous:** The entire process runs on a daily schedule, making it a "set and forget" solution for systematic trading. How It Works This workflow executes a sophisticated, automated trading strategy in a few key stages: Daily Kick-off & Snapshot: The workflow triggers on a daily schedule, first fetching your current Alpaca account balance and logging it to a Google Sheet to track daily performance. Strategy Formulation: It then reads the daily sentiment scores produced by the accompanying "Stock Sentiment Analysis" workflow. A Code node filters these results to identify the top four stocks with the highest positive sentiment. The Decision Engine: The core of the workflow is a custom Code node that acts as the trading brain. It: Retrieves your currently open positions from Alpaca. Compares your holdings against the day's top four sentiment stocks. Generates a "sell list" of positions you hold that are no longer in the top four. Generates a "buy list" of top-sentiment stocks that you don't yet own. Calculates the total cash value from the "sell list" and determines the exact notional value to invest in each stock on the "buy list." Trade Execution: The workflow first iterates through the "sell list" and executes a DELETE request to Alpaca for each, closing the positions. A Wait node pauses the workflow for two minutes to ensure the sell orders are filled and the account balance is updated. It then iterates through the "buy list," executing POST requests to Alpaca to purchase the new assets with the calculated funds. Record Keeping: All executed orders (both buys and sells) are merged and logged in a dedicated Google Sheet, giving you a permanent and detailed transaction history. 
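As a concrete picture of the Decision Engine described above, the snippet below sketches the rebalancing logic in n8n Code-node JavaScript. It is a simplified illustration, not the template's exact code, and the input field names (topSentimentSymbols, openPositions) are assumptions:

```javascript
// Simplified sketch of the "Decision Engine" Code node (assumed input shapes).
const topSymbols = $json.topSentimentSymbols;   // e.g. ['AAPL', 'MSFT', 'NVDA', 'AMZN']
const positions = $json.openPositions;          // e.g. [{ symbol: 'TSLA', market_value: '2500.00' }, ...]

const held = positions.map((p) => p.symbol);

// Sell anything we hold that is no longer in the top four.
const sellList = positions.filter((p) => !topSymbols.includes(p.symbol));

// Buy any top-four symbol we do not hold yet.
const buyList = topSymbols.filter((s) => !held.includes(s));

// Split the freed-up cash evenly across the new buys (notional orders).
const freedCash = sellList.reduce((sum, p) => sum + Number(p.market_value), 0);
const notionalPerBuy = buyList.length ? freedCash / buyList.length : 0;

return [{
  json: {
    sell: sellList.map((p) => p.symbol),
    buy: buyList.map((s) => ({ symbol: s, notional: notionalPerBuy.toFixed(2) })),
  },
}];
```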
Nodes Used: Schedule Trigger, HTTP Request (Alpaca API), Google Sheets, Code (JavaScript), SplitOut, Wait, Merge. This workflow is the perfect next step for anyone looking to take their AI analysis to the next level. Take the emotion out of your trading and let this bot systematically execute your data-driven strategy.
by Evoort Solutions
TikTok Transcript Generator
Overview
This automated workflow extracts transcripts from TikTok videos by reading video URLs from a Google Sheet, calling the API via TikTok Transcript Generator, cleaning the subtitle data, and updating the sheet with transcripts. It efficiently handles batches, errors, and rate limits to provide a seamless transcription process.
Key Features
Batch processing:** Reads and processes multiple TikTok video URLs from Google Sheets.
Automatic transcript generation:** Uses the TikTok Transcript Generator API on RapidAPI.
Clean subtitle output:** Removes timestamps and headers for clear transcripts.
Error handling:** Marks videos with no available transcript.
Rate limiting:** Implements wait times to avoid API throttling on RapidAPI.
Seamless Google Sheets integration:** Updates the same sheet with transcript results and statuses.
API Used
TikTok Transcript Generator API**
Google Sheet Columns
| Column Name | Description |
|----------------|-----------------------------------------|
| Video Url | URL of the TikTok video to transcribe |
| Transcript | Generated transcript text (updated by workflow) |
| Generated Date | Date when the transcript was generated (YYYY-MM-DD) |
Workflow Nodes Explanation
| Node Name | Type | Purpose |
|--------------------------|-----------------------|-------------------------------------------------------------------|
| When clicking ‘Execute workflow’ | Manual Trigger | Manually starts the entire transcription workflow. |
| Google Sheets2 | Google Sheets (Read) | Reads TikTok video URLs and transcript data from Google Sheets. |
| Loop Over Items | Split In Batches | Processes rows in smaller batches to control execution speed. |
| If | Conditional Check | Filters videos needing transcription (URL present, transcript empty). |
| HTTP Request | HTTP Request | Calls the TikTok Transcript Generator API on RapidAPI to fetch transcripts. |
| If1 | Conditional Check | Checks for valid API responses (handles 404 errors). |
| Code | Code (JavaScript) | Cleans and formats raw subtitle text by removing timestamps (see the sketch below). |
| Google Sheets | Google Sheets (Update)| Updates the sheet with cleaned transcripts and generation dates. |
| Google Sheets1 | Google Sheets (Update)| Updates sheet with “No transcription available” message on error. |
| Wait | Wait | Adds delay between batches to avoid API rate limits on RapidAPI. |
Challenges Resolved
Manual Transcription Effort:** Eliminates the need to manually transcribe TikTok videos, saving time and reducing errors.
API Rate Limits:** Introduces batching and wait periods to avoid exceeding API usage limits on RapidAPI, ensuring smooth execution.
Incomplete or Missing Data:** Filters out videos already transcribed and handles missing transcripts gracefully by logging appropriate messages.
Data Formatting Issues:** Cleans raw subtitle data to provide readable, timestamp-free transcripts.
Data Synchronization:** Updates transcripts back into the same Google Sheet row, maintaining data consistency and ease of access.
Use Cases
Content creators wanting to transcribe TikTok videos automatically.
Social media analysts extracting text data for research.
Automation enthusiasts integrating transcript generation into workflows.
How to Use
Prepare a Google Sheet with the columns: Video Url, Transcript, and Generated Date.
Connect your Google Sheets account in the workflow.
Enter your RapidAPI key for the TikTok Transcript Generator API.
Execute the workflow to generate transcripts.
View transcripts and generated dates directly in your Google Sheet. Try this workflow to automate your TikTok video transcriptions efficiently! Create your free n8n account and set up the workflow in just a few minutes using the link below: 👉 Start Automating with n8n. Save time and stay consistent, effortlessly!
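For reference, the cleanup performed by the Code node can be sketched as follows. The exact payload returned by the RapidAPI endpoint is not shown above, so the field name (subtitles) and the WebVTT/SRT-style format are assumptions; adapt the parsing to the real response:

```javascript
// Hedged sketch of the Code node that cleans raw subtitles. Assumes a
// WebVTT/SRT-style payload with a header, cue numbers, and timestamp lines.
const raw = $json.subtitles || '';   // assumed field name for the raw subtitle text

const transcript = raw
  .split('\n')
  .filter((line) =>
    line.trim() !== '' &&
    line.trim() !== 'WEBVTT' &&                         // drop the header
    !/^\d+$/.test(line.trim()) &&                       // drop cue numbers
    !/\d{2}:\d{2}:\d{2}[.,]\d{3} --> /.test(line)       // drop timestamp lines
  )
  .map((line) => line.trim())
  .join(' ');

return [{
  json: {
    transcript,
    generatedDate: new Date().toISOString().slice(0, 10), // YYYY-MM-DD for the Generated Date column
  },
}];
```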
by Milan Vasarhelyi - SmoothWork
Video Introduction Want to automate your inbox or need a custom workflow? 📞 Book a Call | 💬 DM me on Linkedin Get Alerted When Your Website Is Down – n8n as an Uptime Robot Alternative If you manage a website (or client sites), getting notified when your site goes down is critical. But you’ve probably experienced alert fatigue if you’ve ever received a flood of downtime emails for every tiny outage. These short hiccups are common, and most monitoring tools don’t let you filter them out. Here’s how you can set up website downtime notifications with n8n - no extra subscriptions, and no more false positives spamming your inbox. For your most important sites, you can even get direct text alerts to your phone. We’ll use n8n, a powerful automation tool with thousands of templates and integrations. You can run it in the cloud or even self-host it for free. Why Use n8n for Website Monitoring? Uptime Robot and similar services have limitations or get expensive fast if you monitor several sites. n8n gives you full control - you choose when and how you get notified, set your own timing and thresholds, and you aren’t stuck with default alert logic. Plus, n8n can automate much more than just uptime monitoring: use it to handle other business workflows too. Quick Start: Free n8n Website Monitoring Template Use the template to get started fast. Log in or sign up for n8n Cloud, or self-host for free if you prefer. Configure your schedule (default: hourly) and list the sites you want to monitor. Key setting: wait seconds. Recommended: 300 seconds (5 minutes). If your site goes down, the workflow waits before sending an alert—no notification for a 1-2 minute outage, only real, persistent problems. How to Test & Use Activate the workflow. Toggle it on in n8n and it’ll check your sites automatically. Test instantly: Add a non-existent URL and run the workflow. After the wait time, you’ll get an email alert for that URL. Notifications stay organized: Alerts go straight to your inbox (see my AI email labelling template if you want color-coded organization). Get Critical Alerts on Your Phone (Telegram) Email is great, but for your most important sites you want instant notifications on your phone. Best option: Use Telegram. n8n can send you messages via a Telegram bot—easy to set up, free, and works in seconds. How: Create a Telegram bot via BotFather (instructions in the template). Add your bot token and chat ID to n8n. Now, you’ll get downtime alerts right on Telegram—no missed notifications, no extra costs. FAQ Can I monitor unlimited sites?** Yes, just list them all in the config. What if downtime is just a few seconds?** The default 5-minute wait filters these out. Do I need a paid n8n plan?** No - self-hosting is free, and the workflow works on free plans. Why not SMS or WhatsApp?** Telegram bots are fastest, easiest, and don’t require extra API setup or subscriptions.
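Conceptually, the template's check-wait-recheck logic looks like the sketch below. The real workflow implements this with HTTP Request, Wait, and notification nodes; the URLs here are examples, and the 300-second grace period matches the recommended setting:

```javascript
// Conceptual sketch of the monitoring logic described above.
const SITES = ['https://example.com', 'https://example.org'];
const WAIT_SECONDS = 300; // recommended 5-minute grace period

const sleep = (s) => new Promise((resolve) => setTimeout(resolve, s * 1000));

async function isDown(url) {
  try {
    const res = await fetch(url, { method: 'GET' });
    return !res.ok; // treat non-2xx responses as down
  } catch {
    return true;    // network errors count as down
  }
}

async function checkSites() {
  for (const url of SITES) {
    if (await isDown(url)) {
      await sleep(WAIT_SECONDS);          // filter out short hiccups
      if (await isDown(url)) {
        console.log(`ALERT: ${url} is still down after ${WAIT_SECONDS}s`);
        // here the workflow would send the email / Telegram notification
      }
    }
  }
}

checkSites();
```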
by Jonathan | NEX
Are you drowning in a sea of security notifications? Do your analysts spend more time sifting through low-level logs than investigating real threats? This workflow transforms n8n into an autonomous SOC (Security Operations Center) Analyst, tackling alert fatigue head-on. Leveraging the NixGuard Security RAG connector, this workflow automates the entire alert triage process. It ingests raw security events (from sources like Wazuh, your SIEM, or EDR), uses AI to analyze and assign a priority, and then intelligently routes the alert to the correct Slack channel. How It Works: Ingest & Filter: The workflow runs on a schedule, fetching all recent security alerts. It first performs a basic filtering to isolate events that meet a minimum severity threshold (e.g., level 7+). AI Analysis & Prioritization: The aggregated high-severity alerts are then sent to the AI with a specific prompt, asking it to analyze the situation and return a structured JSON object containing a single, overall priority (Critical, High, Info) and a concise summary. Intelligent Routing: A Switch node reads the AI-assigned priority and routes the notification to the appropriate destination. Critical alerts go to your #security-incident-response channel, high-priority alerts to #security-investigations, and informational ones to #security-logs. Key Features & Benefits: Eliminate Alert Fatigue:** Drastically reduce the noise by having AI pre-process and categorize alerts before they hit your team. Automate SOC Tier 1 Triage:** Free up your human analysts from repetitive triage tasks so they can focus on high-value investigation and threat hunting. Faster Incident Response:** Route critical alerts to the right people in real-time, cutting down on crucial response time. Consistent Prioritization:** Use AI to ensure a consistent, unbiased approach to alert prioritization, 24/7. Smart Routing Logic:** Go beyond simple keyword matching. The Switch node ensures alerts are delivered to the team best equipped to handle them based on AI-assessed severity. Who is this for? SOC Analysts & Security Engineers** looking to automate alert triage and incident response workflows. SecOps and DevOps Teams** who want to build a more efficient, automated security operations pipeline. IT Managers and Directors** aiming to improve their team's efficiency and reduce the risk of missing critical alerts. Anyone using Wazuh, a SIEM, or other security tools that generate a high volume of alerts. Stop manually triaging alerts. Install this workflow to build your own AI-powered security automation platform and let your team focus on what matters most. Don't have the main workflow yet? Get it HERE! 🔗 Learn more about NixGuard: thenex.world 🔗 Get started with a free security subscription: thenex.world/security/subscribe Tags / Keywords: AI, Security, SOC, Automation, Triage, Alerting, Cybersecurity, Wazuh, SIEM, Slack, Incident Response, Alert Fatigue, SecOps, Generative AI, LLM, NixGuard, Routing
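To make the triage pipeline concrete, here is a hedged sketch of the severity filter, the structured verdict the AI is asked to return, and the routing applied to it. Field names follow typical Wazuh alert output and are assumptions; the actual workflow uses the NixGuard connector plus Filter/Switch nodes rather than a single Code node:

```javascript
// Hedged sketch of the triage steps described above (assumed field names).
const MIN_LEVEL = 7;

// 1) Keep only high-severity events before sending them to the AI.
const highSeverity = $input.all().filter((item) => item.json.rule?.level >= MIN_LEVEL);

// 2) Example of the structured JSON the AI is prompted to return.
const aiVerdict = {
  priority: 'Critical', // one of: Critical, High, Info
  summary: 'Multiple failed SSH logins followed by a successful root login.',
};

// 3) Routing the Switch node applies to the AI-assigned priority.
const channelByPriority = {
  Critical: '#security-incident-response',
  High: '#security-investigations',
  Info: '#security-logs',
};

return [{
  json: {
    alertCount: highSeverity.length,
    priority: aiVerdict.priority,
    summary: aiVerdict.summary,
    slackChannel: channelByPriority[aiVerdict.priority] ?? '#security-logs',
  },
}];
```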
by Davide
This workflow allows you to send SMS messages globally using an API, without needing a physical phone number. 1. How It Works It consists of three main nodes: Manual Trigger**: The workflow starts when you click the "Test workflow" button in n8n. Set SMS Data**: This node defines the SMS message content and the recipient's phone number (including the international prefix). Send SMS**: This node sends the SMS using the ClickSend API. It uses HTTP Basic Authentication with your ClickSend credentials and sends a POST request to the ClickSend API endpoint with the message and recipient details. The workflow is simple and efficient, making it easy to automate SMS sending for various use cases, such as notifications, alerts, or marketing campaigns. 2. Set Up Steps To set up and use this workflow in n8n, follow these steps: Register on ClickSend: Go to ClickSend and create an account. Obtain your API Key and take advantage of the 2 € free credits provided. Set Up Basic Authentication in n8n: In the "Send SMS" node, configure the HTTP Basic Auth credentials: Username: Use the username you registered with on ClickSend. Password: Use the API Key provided by ClickSend. Configure the SMS Data: In the "Set SMS data" node, define: The message content (e.g., "Hi, this is my first message"). The recipient's phone number, including the international prefix (e.g., +39xxxxxxxx). Test the Workflow: Click the "Test workflow" button in n8n to trigger the workflow. The workflow will send the SMS using the ClickSend API, and you should receive the message on the specified phone number. Optional Customization: You can modify the workflow to dynamically set the message content or recipient phone number using data from other nodes or external sources. This workflow is a quick and efficient way to send SMS messages programmatically. Need help customizing? Contact me for consulting and support or add me on LinkedIn.
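For reference, the request issued by the Send SMS node looks roughly like this. It reflects my understanding of ClickSend's v3 SMS API, so verify the endpoint and body shape against the ClickSend docs; USERNAME, API_KEY, and the phone number are placeholders:

```javascript
// Hedged sketch of the request behind the "Send SMS" node.
const auth = Buffer.from('USERNAME:API_KEY').toString('base64'); // HTTP Basic Auth: username + API key

const response = await fetch('https://rest.clicksend.com/v3/sms/send', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Basic ${auth}`,
  },
  body: JSON.stringify({
    messages: [
      {
        body: 'Hi, this is my first message',
        to: '+39xxxxxxxx', // recipient with international prefix
      },
    ],
  }),
});

console.log(await response.json()); // delivery status and message details
```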
by Mario
Template to get your public IP address and push it to Namecheap's Dynamic DNS per subdomain. Open "yourdomain.com": insert your domain and your Namecheap DDNS password. Open "subdomains": replace the entries with your own subdomains. Execute the workflow. Have fun!
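The whole loop can be sketched in a few lines of JavaScript. This is an illustration, not the template's nodes: the IP-lookup service is an example, the Namecheap DDNS host and query parameters should be checked against Namecheap's dynamic DNS documentation, and DDNS_PASSWORD plus the subdomains are placeholders:

```javascript
// Hedged sketch: look up the current public IP, then call Namecheap's
// Dynamic DNS update URL once per subdomain.
const DOMAIN = 'yourdomain.com';
const DDNS_PASSWORD = 'your-namecheap-ddns-password'; // placeholder
const SUBDOMAINS = ['www', 'home'];                   // example subdomains

// Get the current public IP (example lookup service).
const ip = (await (await fetch('https://api.ipify.org?format=json')).json()).ip;

// Push it to Namecheap DDNS for each subdomain.
for (const host of SUBDOMAINS) {
  const url =
    `https://dynamicdns.park-your-domain.com/update` +
    `?host=${host}&domain=${DOMAIN}&password=${DDNS_PASSWORD}&ip=${ip}`;
  const res = await fetch(url);
  console.log(host, await res.text()); // Namecheap returns an XML status response
}
```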
by Weilun
🔄 n8n Workflow: Check and Update n8n Version
This workflow automatically checks if the local n8n version is outdated and, if so, creates a file to signal an update is needed.
🖥️ Working Environment
Operating System:** Ubuntu 24.04
n8n Installation:** Docker container
📁 Project Directory Structure
n8n/
├── check_update.txt
├── check-update.sh
├── compose.yml
├── update_n8n.cron
🧾 File Descriptions
check_update.txt
Contains a single word:
true: Update is needed
false: No update required
check-update.sh
#!/bin/bash
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
if grep -q "true" /home/sysadmin/n8n/check_update.txt; then
  # Place your update logic here
  echo "Update needed - please insert update logic"
  echo false > /home/sysadmin/n8n/check_update.txt
fi
Purpose:
Checks the contents of check_update.txt
If it contains true, executes the update logic (currently a placeholder)
Resets check_update.txt to false once the update logic has run
update_n8n.cron
SHELL=/bin/sh
10 5 * * * /bin/sh /home/sysadmin/n8n/check-update.sh
Purpose:
Runs the check-update.sh script daily at 5:10 AM
Uses /bin/sh as the shell environment
🧩 n8n Workflow Breakdown
1. Schedule Trigger 🕓
Purpose:** Triggers the workflow every day at 5:00 AM
Node Type:** Schedule Trigger
2. Get the latest n8n version 🌐
Purpose:** Fetches the latest version of n8n from npm
Endpoint:** https://registry.npmjs.org/n8n/latest
Node Type:** HTTP Request
3. Get Local n8n version 🖥️
Purpose:** Retrieves the currently running n8n version
Endpoint:** http://192.168.100.18:5678/rest/settings
Node Type:** HTTP Request
4. If 🔍
Purpose:** Compares the local and latest versions
Condition:** If not equal → update is needed
5. SSH1 🧾
Purpose:** Writes the result to a file on the host via SSH
Logic:** echo "{{ $('If').params.conditions ? 'false' : 'true' }}" > check_update.txt
Effect: Updates check_update.txt with "true" if an update is needed, "false" otherwise.
🛠️ Setting up Crontab on Ubuntu
1. Register the cron job with: crontab update_n8n.cron
2. Verify that your cron job is registered: crontab -l
✅ Result
5:00 AM** – n8n workflow checks versions and writes the result to check_update.txt
5:10 AM** – Cron runs check-update.sh to respond to the update flag