by Harshil Agrawal
This example workflow allows you to create, update, and retrieve a document in Google Cloud Firestore. The workflow uses the Set node to set the data; however, you might receive data from a different source. Add the node that receives the data before the Set node, and configure the values you want to insert into a document in the Set node. Also, update the Columns/attributes field in the Google Cloud Firestore node.
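If you prefer a Code node over the Set node for shaping the document, a minimal sketch might look like the following. The field names (`name`, `email`, `createdAt`) are placeholders, not fields from the template: use whatever attributes your document needs and mirror them in the Firestore node's Columns/attributes field.

```javascript
// Hypothetical example: shaping incoming data into the fields the
// Google Cloud Firestore node will write. All field names here are
// assumptions for illustration.
function buildDocumentFields(input) {
  return {
    name: input.fullName ?? "unknown",
    email: (input.email ?? "").toLowerCase(),
    createdAt: new Date(input.createdAt ?? Date.now()).toISOString(),
  };
}

// Example input as it might arrive from a previous node
const item = { fullName: "Jane Doe", email: "Jane@Example.com", createdAt: "2024-01-15T10:00:00Z" };
const fields = buildDocumentFields(item);
```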
by Don Jayamaha Jr
A professional-grade AI automation system for spot market trading insights on Binance. It analyzes multi-timeframe technical indicators, live price/order data, and crypto sentiment, then delivers fully formatted Telegram-style trading reports.

🎥 Watch Tutorial:

🧩 Required Workflows

You must install and activate all of the following workflows for the system to function correctly:

| ✅ Workflow Name | 📌 Function Description |
| --- | --- |
| Binance Spot Market Quant AI Agent | Final AI orchestrator. Parses the user prompt and generates Telegram-ready reports. |
| Binance SM Financial Analyst Tool | Calls indicator tools and price/order data tools. Synthesizes structured inputs. |
| Binance SM News and Sentiment Analyst Webhook Tool | Analyzes crypto sentiment, gives a summary and headlines via POST webhook. |
| Binance SM Price/24hrStats/OrderBook/Kline Tool | Pulls price, order book, 24h stats, and OHLCV klines for 15m–1d. |
| Binance SM 15min Indicators Tool | Calculates 15m RSI, MACD, BBANDS, ADX, SMA/EMA from Binance kline data. |
| Binance SM 1hour Indicators Tool | Same as above but for the 1h timeframe. |
| Binance SM 4hour Indicators Tool | Same as above but for the 4h timeframe. |
| Binance SM 1day Indicators Tool | Same as above but for the 1d timeframe. |
| Binance SM Indicators Webhook Tool | Technical backend. Handles all webhook logic for each timeframe tool. |
⚙️ Installation Instructions

Step 1: Import Workflows
- Open your n8n Editor UI
- Import each workflow JSON file one by one
- Activate them or ensure they're called via Execute Workflow

Step 2: Set Credentials
- **OpenAI API Key** (GPT-4o recommended)
- **Binance endpoints** are public (no auth required)

Step 3: Configure Webhook Endpoints
- Deploy Binance SM Indicators Webhook Tool
- Ensure the following paths are reachable: /webhook/15m, /webhook/1h, /webhook/4h, /webhook/1d

Step 4: Telegram Integration
- Create a Telegram bot using @BotFather
- Add your Telegram API token to n8n credentials
- Replace the Telegram ID placeholder with your own

Step 5: Final Trigger
- Trigger the Binance Spot Market Quant AI Agent manually or from Telegram
- The agent:
  - Extracts the trading pair (e.g. BTCUSDT)
  - Calls all tools for market data and sentiment
  - Generates a clean, HTML-formatted Telegram report

💬 Telegram Report Output Format

BTCUSDT Market Report

Spot Strategy
• Action: Buy
• Entry: $63,800 | SL: $61,200 | TP: $66,500
• Rationale: MACD Crossover (1h), RSI Rebound from Oversold (15m), Sentiment: Bullish

Leverage Strategy
• Position: Long 3x
• Entry: $63,800
• SL/TP zones same as above

News Sentiment: Slightly Bullish
• "Bitcoin rallies as ETF inflows surge" – CoinDesk
• "Whales accumulate BTC at key support" – NewsBTC

🧠 System Overview

[Telegram Trigger]
→ [Session + Auth Logic]
→ [Binance Spot Market Quant AI Agent]
→ [Financial Analyst Tool + News Tool]
→ [All Technical Indicator Tools (15m, 1h, 4h, 1d)]
→ [OrderBook/Price/Kline Fetcher]
→ [GPT-4o Reasoning]
→ [Split & Send Message to Telegram]

🧾 Licensing & Attribution

© 2025 Treasurium Capital Limited Company. Architecture, prompts, and trade report structure are IP-protected. No unauthorized rebranding or resale permitted.

🔗 For support: LinkedIn – Don Jayamaha
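To give a feel for what the timeframe indicator tools compute, here is a minimal sketch of a simple moving average over Binance kline data. Per Binance's public market data documentation, `GET /api/v3/klines` returns arrays in which index 4 is the close price as a string; the actual indicator formulas inside the workflows may differ.

```javascript
// Sketch only: a simple moving average over the last `period` closes,
// given klines in Binance's array format (close price at index 4).
function sma(klines, period) {
  const closes = klines.slice(-period).map((k) => parseFloat(k[4]));
  return closes.reduce((a, b) => a + b, 0) / closes.length;
}

// Minimal fake klines: only the close field matters for this sketch
const klines = [[0, 0, 0, 0, "100"], [0, 0, 0, 0, "102"], [0, 0, 0, 0, "104"], [0, 0, 0, 0, "106"]];
sma(klines, 3); // average of 102, 104, 106 → 104
```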
by Harshil Agrawal
Based on your use case, you might want to trigger a workflow when new data gets added to your database. This workflow sends a message to Mattermost when new data gets added to Google Sheets. The Interval node triggers the workflow every 45 minutes; you can modify the timing based on your use case, or use the Cron node to trigger the workflow instead. If you wish to fetch new Tweets from Twitter, replace the Google Sheets node with the respective node and update the Function node accordingly.
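The "new data" check in the Function node can be sketched as follows: compare the rows fetched from Google Sheets against the IDs already seen on a previous run and pass through only the new ones. The `id` column name is an assumption for illustration; adapt it to your sheet.

```javascript
// Sketch of a "detect new rows" Function node. `seenIds` would come
// from workflow static data or a previous run; `id` is a hypothetical
// column name.
function newRows(rows, seenIds) {
  const seen = new Set(seenIds);
  return rows.filter((row) => !seen.has(row.id));
}

const rows = [{ id: 1, msg: "a" }, { id: 2, msg: "b" }, { id: 3, msg: "c" }];
newRows(rows, [1, 2]); // only the row with id 3 is new
```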
by Harshil Agrawal
This workflow allows you to get analytics for a website and store them in Airtable. In this workflow, we get the analytics for sessions grouped by country. Based on your use case, you can select different Dimensions and set different Metrics. You can use the Cron node or the Interval node to trigger the workflow at a particular interval and fetch the analytics data regularly. Based on your use case, you might want to store the data returned by Google Analytics in a database or a Google Sheet instead; replace the Airtable node with the appropriate node.
by Harshil Agrawal
This workflow allows you to receive updates about the position of the ISS and add them to a table in TimescaleDB.

Cron node: triggers the workflow every minute. You can configure the time based on your use case.

HTTP Request node: makes an HTTP request to an API that returns the position of the ISS. Based on your use case, you may want to fetch data from a different URL; enter the URL in the URL field.

Set node: sets the information that we need in the workflow. Since we only need the timestamp, latitude, and longitude, we set those here. If you need other information, you can set it in this node.

TimescaleDB node: stores the information in a table named iss. You can use a different table as well.
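The Set-node step can be sketched as a small transform: from the full API response, keep only the timestamp, latitude, and longitude. The field names below follow the "Where the ISS at?" API (api.wheretheiss.at); if your HTTP Request node points at a different ISS API, adjust them accordingly.

```javascript
// Sketch of the Set node's job: keep only the three fields the
// TimescaleDB table needs. Field names are an assumption based on the
// wheretheiss.at response shape.
function pickIssFields(response) {
  const { timestamp, latitude, longitude } = response;
  return { timestamp, latitude, longitude };
}

const sample = { name: "iss", id: 25544, latitude: 50.11, longitude: -118.3, altitude: 420, timestamp: 1700000000 };
pickIssFields(sample); // { timestamp: 1700000000, latitude: 50.11, longitude: -118.3 }
```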
by Shashikanth
Source code: I maintain this workflow here.

Usage Guide

This workflow backs up all workflows as JSON files named in the [workflow_name].json format.

Steps
1. Create GitHub Repository: skip this step if using an existing repository.
2. Add GitHub Credentials: in Credentials, add the GitHub credential for the repository owner.
3. Download and Import Workflow: import this workflow into n8n.
4. Set Global Values: in the Globals node, set the following:
   - repo.owner: GitHub username of the repository owner.
   - repo.name: Name of the repository for backups.
   - repo.path: Path to the folder within the repository where workflows will be saved.
5. Configure GitHub Nodes: edit each GitHub node in the workflow to use the added credentials.

Workflow Logic

Each workflow run handles files based on their status:
- New workflow: create a new file in the repository.
- Unchanged workflow: skip to the next item.
- Changed workflow: update the corresponding file in the repository.

Current Limitations / Needs Work
- Renamed workflows: if a workflow is renamed in n8n, the old file remains in the repository.
- Deleted workflows: workflows deleted in n8n are not removed from the repository.
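The per-workflow decision in the Workflow Logic section can be condensed into a few lines. The real workflow does this with If and GitHub nodes; this is only a sketch of the branching, comparing the exported workflow JSON against the file already in the repository.

```javascript
// Sketch of the create/skip/update decision: `repoFileContent` is
// null when the file does not exist in the repository yet.
function decideAction(workflowJson, repoFileContent) {
  if (repoFileContent === null) return "create";       // new workflow
  if (repoFileContent === workflowJson) return "skip"; // unchanged
  return "update";                                     // changed
}

decideAction('{"name":"wf"}', null);             // "create"
decideAction('{"name":"wf"}', '{"name":"wf"}');  // "skip"
decideAction('{"name":"wf2"}', '{"name":"wf"}'); // "update"
```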
by Evoort Solutions
AI-Powered Product Research & SEO Content Automation

Skip the guesswork and manual effort: this n8n flow automates the entire process of researching your product's online competition and generating high-quality SEO content. Whether you're launching a new product or optimizing existing listings, this workflow leverages real-time web data and AI-driven copywriting to deliver:

- 📈 Search-optimized metadata (Title, Description, Keywords)
- 🛍️ Engaging product descriptions tailored for marketing
- 📊 Auto-organized output ready for use in your content or e-commerce platform

All of this happens with just one product title input!

🧠 How It Works
- A user submits a product title via a form.
- The workflow uses Google Custom Search to gather real-time competitor content based on that title.
- Titles, snippets, and keywords are extracted from the search results.
- This information is sent to a language model (Google Gemini via LangChain) to generate:
  - SEO-optimized metadata (Title, Description, Keywords)
  - A compelling product description tailored for marketing
- The AI-generated content is then parsed and organized into two categories: SEO data and product content.
- The structured output is saved automatically into a connected Google Sheet for easy access or further automation.

🛠️ What Problems Does This Solve?

Manual competitor research and writing SEO content from scratch can be:
- **Time-consuming**
- **Inconsistent in quality**
- **Not optimized for search engines**
- **Hard to scale for multiple products**

This workflow automates the entire research + writing + structuring process.

✅ Key Benefits
- **Instant Content Creation**: Generate polished SEO content in seconds.
- **Competitor-Aware**: Pulls in real-time data from the web for relevant, market-aligned content.
- **Scalable**: Easily repeat the process for multiple product titles with minimal effort.
- **Data Centralization**: Stores everything in Google Sheets, great for collaboration or syncing with other tools.
- **Customizable**: Easily extend or modify the workflow to include translations, publishing, or social media automation.

⚙️ Set-Up Steps
- Connect the Google Custom Search API with a valid API key and search engine ID (CX).
- Connect and configure Google Gemini or LangChain with access credentials.
- Provide access to a Google Sheet with columns for storing SEO and product data.
- Estimated setup time: ~15–25 minutes depending on API access and sheet setup.

🚀 Let's Get You Started with Automating Your SEO Content!

Create your free n8n account and set up the workflow in just a few minutes using the link below:

👉 Start Automating with n8n

Save time, stay consistent, and scale your product content effortlessly!
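The extraction step described above can be sketched as follows. Google's Custom Search JSON API returns an `items` array with `title` and `snippet` fields; the keyword derivation here is a simplification I'm adding for illustration, not the template's exact logic.

```javascript
// Sketch: pull titles and snippets out of a Custom Search response and
// derive a rough, deduplicated keyword list for the AI prompt.
function extractCompetitorData(searchResponse) {
  const items = searchResponse.items ?? [];
  const titles = items.map((i) => i.title);
  const snippets = items.map((i) => i.snippet);
  const keywords = [...new Set(
    titles.join(" ").toLowerCase().split(/\W+/).filter((w) => w.length > 3)
  )];
  return { titles, snippets, keywords };
}

const resp = { items: [
  { title: "Best Widget 2024", snippet: "Top rated widget..." },
  { title: "Widget Buying Guide", snippet: "How to choose..." },
] };
extractCompetitorData(resp);
```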
by Ludwig
How It Works:
• Scrapes company review data from Glassdoor using ScrapingBee.
• Extracts demographic-based ratings using AI-powered text analysis.
• Calculates workplace disparities with statistical measures like z-scores, effect sizes, and p-values.
• Generates visualizations (scatter plots, bar charts) to highlight patterns of discrimination or bias.

Example Visualizations:

Set Up Steps:
Estimated time: ~20 minutes.
• Replace ScrapingBee and OpenAI credentials with your own.
• Input the company name you want to analyze (best results with large U.S.-based organizations).
• Run the workflow and review the AI-generated insights and visual reports.

This workflow empowers users to identify potential workplace discrimination trends, helping advocate for greater equity and accountability.

Additional Credit: Wes Medford, for algorithms and inspiration.
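For readers unfamiliar with the statistics named above, here are illustrative versions of two of them. These are the standard formulas (a z-score for one group's mean rating against the overall distribution, and Cohen's d as an effect size between two groups), not code lifted from the workflow itself.

```javascript
// z-score: how many standard deviations a group's mean rating sits
// from the overall mean.
function zScore(groupMean, overallMean, overallStd) {
  return (groupMean - overallMean) / overallStd;
}

// Cohen's d with a pooled standard deviation (equal-n simplification).
function cohensD(mean1, mean2, std1, std2) {
  const pooled = Math.sqrt((std1 ** 2 + std2 ** 2) / 2);
  return (mean1 - mean2) / pooled;
}

zScore(3.2, 3.8, 0.4);       // ≈ -1.5: group rates the company well below average
cohensD(3.2, 3.8, 0.5, 0.5); // ≈ -1.2: a large effect size
```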
by ConvertAPI
Who is this for?
For developers and organizations that need to convert XLSX files to PDF.

What problem is this workflow solving?
The file format conversion problem.

What this workflow does
- Downloads the XLSX file from the web.
- Converts the XLSX file to PDF.
- Stores the PDF file in the local file system.

How to customize this workflow to your needs
- Open the HTTP Request node.
- Adjust the URL parameter (all endpoints can be found here).
- Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret.
- Optionally, additional Body Parameters can be added for the converter.
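As a sketch of what the HTTP Request node assembles: ConvertAPI's documented endpoint pattern is `https://v2.convertapi.com/convert/<from>/to/<to>` with the secret passed as a query parameter. Verify the exact path and parameter name against the current ConvertAPI reference before relying on this.

```javascript
// Sketch of the request URL the HTTP Request node would send; the
// endpoint shape is based on ConvertAPI's documented pattern.
function buildConvertUrl(from, to, secret) {
  return `https://v2.convertapi.com/convert/${from}/to/${to}?Secret=${encodeURIComponent(secret)}`;
}

buildConvertUrl("xlsx", "pdf", "my-secret");
// → "https://v2.convertapi.com/convert/xlsx/to/pdf?Secret=my-secret"
```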
by Muzaffer AKYIL
Docker Registry Cleanup Template

This template is designed to automatically clean up old image tags in the Docker registry and perform garbage collection.

Features
- List all images in the registry
- Preserve the last 10 tags for each image (the latest tag is always preserved)
- Delete old tags
- Email notification on successful completion
- Registry garbage collection automation
- Failure notification in error conditions

Prerequisites
- Docker Registry v2 API access
- Basic Authentication credentials
- SMTP email settings (for notifications)
- SSH node installed on n8n (for garbage collection)

Installation

1. Credentials
Add the following credentials in n8n:
- **HTTP Basic Auth**: for Registry access
- **SSH Private Key**: for the garbage collection command
- **Email SMTP**: for notifications

2. Set Variables
Replace your-registry-url with your actual registry URL on all nodes, e.g. "url": "https://your.registry.com/v2/_catalog".

Customisation
- Retention Policy: set the number of tags to be retained by changing the slice(0, 10) value in the Identify Tags to Remove node
- Schedule: change the frequency of operation at the Trigger node
- Notification Content: customise email templates according to your needs

Notes
- Check DELETE operations in a test environment before running
- Make sure that the registry is not in read-only mode
- The registry may need to be put into maintenance mode for garbage collection

Step Details
- **Retrieving image information:** The workflow starts by fetching a list of images and their associated tags from the Docker registry.
- **Filtering and sorting:** The retrieved tags are then filtered and sorted based on specific criteria, such as creation date and tag name.
- **Deleting old tags:** The workflow identifies old or unused tags and attempts to delete them from the registry.
- **Sending notifications:** The workflow sends email notifications to inform the user about the status of the cleanup process, including any errors or successes.
- **Executing additional cleanup tasks:** Finally, the workflow executes an SSH command on the Docker registry server to perform additional cleanup tasks, such as garbage collection.

TL;DR
In summary, this n8n template provides a robust and automated solution for managing and cleaning up Docker registries. By regularly running this workflow, users can ensure that their registry remains organized and efficient, and avoid running out of storage space.
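The retention policy mentioned under Customisation can be condensed into a few lines: keep `latest` plus the 10 newest tags and delete the rest. This sketch assumes tags arrive sorted newest-first; the actual "Identify Tags to Remove" node may sort them differently.

```javascript
// Sketch of the retention logic: the `keep` parameter corresponds to
// the slice(0, 10) value you would change in the node.
function tagsToRemove(tags, keep = 10) {
  const candidates = tags.filter((t) => t !== "latest"); // always preserve latest
  return candidates.slice(keep); // everything beyond the newest `keep` tags
}

const tags = ["latest", "v13", "v12", "v11", "v10", "v9", "v8", "v7",
              "v6", "v5", "v4", "v3", "v2", "v1"];
tagsToRemove(tags); // ["v3", "v2", "v1"]
```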
by Łukasz
What Is This?
This workflow is a comprehensive solution for automating website audits and optimizations, leveraging advanced technologies to boost SEO effectiveness and overall site performance.

Who Is It For?
Designed for SEO specialists, digital marketers, webmasters, and content teams, this workflow empowers anyone responsible for website performance to automate and scale their audit processes. Agencies managing multiple client sites, in-house SEO teams aiming to save time on routine checks, and developers seeking to integrate data-driven insights into their deployment pipelines will all find this solution invaluable. By combining your site's sitemap with Google Search Console and Google Analytics data, then applying AI-powered analysis, the workflow continuously uncovers actionable recommendations to boost search visibility, improve user engagement, and accelerate page performance. Whether you manage a single blog or oversee a sprawling e-commerce platform, this automated pipeline delivers precise, prioritized SEO improvements without manual data wrangling.

How Does It Work?
This end-to-end site analysis automation consists of five main stages:

1. URL Discovery
Processes the sitemap.xml using HTTP Request and XML nodes to extract all site URLs.

2. Search Console Performance Analysis
Uses the Google Search Console API to fetch detailed metrics for each page, including search position, clicks, impressions, and CTR.

3. Analytics Data Collection
Connects to the Google Analytics API to automatically retrieve traffic metrics such as pageviews, average session duration, bounce rate, and conversions.

4. AI Data Processing
Employs OpenAI models to perform in-depth analysis of the collected data. The artificial intelligence engine merges insights from all sources, identifies patterns, and produces detailed optimization recommendations. The AI analyses the website itself as well. Consider testing different models; I recommend at least trying out o4-mini.

5.
Recommendation Generation
Creates tailored suggestions for each page, in the form of an HTML table that is sent to your email.

How To Set It Up?

Accounts: an active n8n account or instance, API keys for Google Search Console and Google Analytics, and an OpenAI access token.

Enabled Google APIs: you will need at least the following scopes:
- Google Search Console API
- Google Analytics Admin API
- Google Analytics Data API

Scheduling: the workflow can run manually for ad hoc audits or be scheduled (daily, weekly) for continuous site monitoring.

Testing: there are two optional nodes, "Sort for testing purposes" and "Limit for testing purposes". Together they randomly select items from the sitemap and limit them to a few, so you don't need to run hundreds of sitemap.xml items at once; you can run just a random batch first.

Globals: there is a node called "Globals- CHANGE ME!". You need to set the proper variables there:
- sitemap_url: self-explanatory
- search_console_selector: for example "sc-domain:sailingbyte.com", but it can be a URL as well, depending on how you set up your Search Console
- analysis_start_date and analysis_end_date: date range for analytics, by default the last 30 days
- analytics_selector_id: the ID of your Google Analytics property. It is a large integer; you can find it in the Analytics URL preceded by the letter "p" (your number is where the X's are): https://analytics.google.com/analytics/web/#/pXXXXXXXXX/reports/intelligenthome
- report_receiver: the email address that will receive the report

What's More?
That's actually it. I hope this automation will make improving your website much easier! Visit my profile for other automations for businesses. And if you are looking for dedicated software development, do not hesitate to reach out!
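Stage 1 (URL Discovery) can be sketched as a small extraction step. The workflow itself uses HTTP Request and XML nodes; the regex over `<loc>` tags shown here is a simplification that works for flat, single-level sitemaps.

```javascript
// Sketch of URL discovery: pull every <loc> entry out of a
// sitemap.xml body. Real sitemaps may be nested sitemap indexes,
// which this simplification does not handle.
function extractUrls(sitemapXml) {
  return [...sitemapXml.matchAll(/<loc>(.*?)<\/loc>/g)].map((m) => m[1].trim());
}

const xml = `<?xml version="1.0"?>
<urlset><url><loc>https://example.com/</loc></url>
<url><loc>https://example.com/blog</loc></url></urlset>`;
extractUrls(xml); // ["https://example.com/", "https://example.com/blog"]
```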
by Raz Hadas
Stay ahead of the market with this powerful, automated workflow that performs real-time sentiment analysis on stock market news. By leveraging the advanced capabilities of Google Gemini, this solution provides you with actionable insights to make informed investment decisions.

This workflow is designed for investors, traders, and financial analysts who want to automate the process of monitoring news and gauging market sentiment for specific stocks. It seamlessly integrates with Google Sheets for input and output, making it easy to track a portfolio of stocks.

Key Features & Benefits
- **Automated Daily Analysis:** The workflow is triggered daily, providing you with fresh sentiment analysis just in time for the market open.
- **Dynamic Stock Tracking:** Easily manage your list of tracked stocks from a simple Google Sheet.
- **AI-Powered Insights:** Utilizes Google Gemini's sophisticated language model to analyze news content for its potential impact on stock prices, including a sentiment score and a detailed rationale.
- **Comprehensive News Aggregation:** Fetches the latest news articles from EODHD for each of your specified stock tickers.
- **Error Handling & Validation:** Includes built-in checks for invalid stock tickers and formats the AI output for reliable data logging.
- **Centralized Reporting:** Automatically logs the sentiment score, rationale, and date into a Google Sheet for easy tracking and historical analysis.

How It Works
This workflow follows a systematic process to deliver automated sentiment analysis:
- **Scheduled Trigger:** The workflow begins each day at a specified time.
- **Fetch Stock Tickers:** It reads a list of stock tickers from your designated Google Sheet.
- **Loop and Fetch News:** For each ticker, it retrieves the latest news articles using the EODHD API.
- **AI Sentiment Analysis:** The collected news articles are then passed to a Google Gemini-powered AI agent.
The agent is prompted to act as a stock sentiment analyzer, evaluating the news and generating:
  - A sentiment score from -1 (strong negative) to 1 (strong positive).
  - A detailed rationale explaining the basis for the score.
- **Data Formatting & Validation:** The AI's output is parsed and validated to ensure it is in the correct JSON format.
- **Log to Google Sheets:** The final sentiment score and rationale are appended to your Google Sheet, alongside the corresponding stock ticker and the current date.

Nodes Used
- Schedule Trigger
- Google Sheets
- SplitInBatches
- HTTP Request (EODHD)
- If
- Code (JavaScript)
- AI Agent (LangChain)
- Google Gemini Chat Model

This workflow is a valuable tool for anyone looking to harness the power of AI for financial market analysis. Deploy this automated solution to save time, gain a competitive edge, and make more data-driven trading decisions.
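The Data Formatting & Validation step can be sketched as follows: parse the model's reply, confirm the expected JSON shape, and clamp the score to the documented [-1, 1] range. The field names `score` and `rationale` are assumptions about the prompt's output format, not taken from the workflow JSON.

```javascript
// Sketch of AI-output validation for the Code node. Returns null on
// malformed output so a downstream If node can route it to error
// handling.
function parseSentiment(raw) {
  let parsed;
  try {
    parsed = JSON.parse(raw);
  } catch {
    return null; // not valid JSON
  }
  if (typeof parsed.score !== "number" || typeof parsed.rationale !== "string") return null;
  return { score: Math.max(-1, Math.min(1, parsed.score)), rationale: parsed.rationale };
}

parseSentiment('{"score": 0.6, "rationale": "ETF inflows positive"}');
parseSentiment("not json"); // null
```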